884 results for Litmanen, Tapio: The struggle over risk


Relevance: 100.00%

Abstract:

Elevated plasma cholesterol, high blood pressure and cigarette smoking are three major risk factors for coronary heart disease. Within the framework of Switzerland's participation in the multicenter study MONICA (MONItoring of trends and determinants in CArdiovascular disease), proposed by the WHO, a first risk factor survey was conducted in a representative sample of the population (25-74 years) of two reporting units (cantons of Vaud and Fribourg, canton of Tessin). A high blood cholesterol level (>6.7 mmol/l) is the most common risk factor for coronary heart disease in the studied population. Among men, about 13% have elevated blood pressure; among women, the proportion is about one in ten. These proportions increase with age and are slightly higher in Tessin. Cigarette smoking is still a common behaviour: between the ages of 25 and 45, one third of the population (both men and women) smokes cigarettes regularly.
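As a rough illustration of how such prevalence figures are derived, the sketch below classifies hypothetical survey records against the cholesterol threshold quoted in the abstract; the records and field layout are invented and are not MONICA data.

```python
# Minimal sketch (hypothetical data): classify survey records against the
# cholesterol threshold named in the abstract and estimate prevalences.

CHOLESTEROL_CUTOFF = 6.7   # mmol/l, "high blood cholesterol" threshold from the abstract

def prevalence(flags):
    """Share of participants flagged with a given risk factor."""
    return sum(flags) / len(flags)

# Toy records: (total cholesterol in mmol/l, daily smoker yes/no)
participants = [(7.1, False), (5.9, True), (6.9, False), (5.2, False), (8.0, True)]

high_chol = [chol > CHOLESTEROL_CUTOFF for chol, _ in participants]
smokers = [smoker for _, smoker in participants]

print(f"High cholesterol: {prevalence(high_chol):.0%}")
print(f"Current smokers:  {prevalence(smokers):.0%}")
```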

Relevance: 100.00%

Abstract:

PURPOSE: To understand the reasons for differences in the delineation of target volumes between physicians. MATERIAL AND METHODS: 18 Swiss radio-oncology centers were invited to delineate volumes for one prostate and one head-and-neck case. In addition, a questionnaire was sent out to evaluate differences in the definition of the volumes (GTV [gross tumor volume], CTV [clinical target volume], PTV [planning target volume]), the various estimated margins, and the nodes at risk. The coherence between the margins drawn and those stated by the centers was calculated. The questionnaire also included a nonspecific series of questions regarding planning methods at each institution. RESULTS: Fairly large differences between centers were seen in the drawn volumes for both cases, as well as in the definition of the volumes. The correlation between drawn and stated margins was fair in the prostate case and poor in the head-and-neck case. The questionnaire revealed important differences in planning methods between centers. CONCLUSION: These large differences could be explained by (1) variable knowledge/interpretation of the ICRU definitions, (2) variable interpretations of the potential microscopic extent, (3) difficulties in GTV identification, (4) differences in concept, and (5) incoherence between theory (i.e., stated margins) and practice (i.e., drawn margins).
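One way to picture the "coherence between drawn and stated margins" is to back out an effective isotropic margin from the volumes a center actually contoured and compare it with the margin the center reports. The sketch below does this under a crude spherical-volume assumption; the volumes and the 8 mm stated margin are hypothetical, not values from the study.

```python
# Minimal sketch (assumed spherical volumes): infer the margin actually "drawn"
# between two delineated volumes from their sizes and compare it with the
# margin the center states (e.g., CTV -> PTV). Purely illustrative; real
# delineations are not spheres and coherence was assessed per structure.
import math

def equivalent_radius_mm(volume_cc: float) -> float:
    """Radius (mm) of a sphere with the given volume (cm^3)."""
    return (3.0 * volume_cc * 1000.0 / (4.0 * math.pi)) ** (1.0 / 3.0)

def drawn_margin_mm(inner_volume_cc: float, outer_volume_cc: float) -> float:
    """Isotropic margin implied by the growth from the inner to the outer volume."""
    return equivalent_radius_mm(outer_volume_cc) - equivalent_radius_mm(inner_volume_cc)

# Hypothetical prostate case: a 60 cc CTV expanded to a 110 cc PTV,
# while the questionnaire states an 8 mm CTV-to-PTV margin.
stated_mm = 8.0
drawn_mm = drawn_margin_mm(60.0, 110.0)
print(f"stated {stated_mm:.1f} mm vs drawn {drawn_mm:.1f} mm "
      f"(difference {abs(drawn_mm - stated_mm):.1f} mm)")
```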

Relevance: 100.00%

Abstract:

Sex-dependent selection often leads to spectacularly different phenotypes in males and females. In species in which sexual dimorphism is not complete, it is unclear which benefits females and males derive from displaying a trait that is typical of the other sex. In barn owls (Tyto alba), females exhibit on average larger black eumelanic spots than males but members of the two sexes display this trait in the same range of possible values. In a 12-year study, we show that selection exerted on spot size directly or on genetically correlated traits strongly favoured females with large spots and weakly favoured males with small spots. Intense directional selection on females caused an increase in spot diameter in the population over the study period. This increase is due to a change in the autosomal genes underlying the expression of eumelanic spots but not of sex-linked genes. Female-like males produced more daughters than sons, while male-like females produced more sons than daughters when mated to a small-spotted male. These sex ratio biases appear adaptive because sons of male-like females and daughters of female-like males had above-average survival. This demonstrates that selection exerted against individuals displaying a trait that is typical of the other sex promoted the evolution of specific life history strategies that enhance their fitness. This may explain why in many organisms sexual dimorphism is often not complete.

Relevance: 100.00%

Abstract:

What genotype should the scientist specify for conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? When the scientist answers this question, he or she makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering this question for one locus. This framework combines the probability distribution describing the uncertainty over the possible genotypes of the trace donor with a loss function describing the scientist's preferences concerning false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) the case of observing one peak for allele x_i on a single electropherogram, and (2) the case of observing one peak for allele x_i on one replicate, and a pair of peaks for alleles x_i and x_j, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values at which the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not focus on a single threshold value for making this decision for all alleles, all loci and in all laboratories.
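A minimal sketch of the decision rule described above: choose the single-locus genotype designation that minimizes expected loss. The candidate designations, posterior probabilities and loss values below are invented for illustration; in the paper the probabilities are derived from peak heights and drop-out models.

```python
# Expected-loss minimization over candidate genotype designations (one locus).
# Values are illustrative only, not the paper's.

# Posterior probability of the trace donor's true genotype, e.g. after
# observing a single peak for allele x_i (Q stands for any other allele).
posterior = {"x_i,x_i": 0.7, "x_i,Q": 0.3}

def loss(designation: str, truth: str) -> float:
    """Loss of announcing a designation when the true genotype is `truth`.

    A designation that would exclude the true donor from the database search
    (false exclusion) is penalized more heavily here than one that is merely
    over-inclusive; the relative weights encode the scientist's preferences.
    """
    if designation == truth:
        return 0.0
    if designation == "x_i,x_i" and truth == "x_i,Q":
        return 10.0   # false exclusion of a heterozygous donor
    return 1.0        # over-inclusive / less specific designation

candidates = ["x_i,x_i", "x_i,Q"]
expected = {d: sum(p * loss(d, t) for t, p in posterior.items()) for d in candidates}
best = min(expected, key=expected.get)
print(expected, "->", best)
```

With these illustrative numbers the less specific designation wins; raising the posterior probability of the homozygote past a threshold flips the choice, which is exactly the kind of peak-height-dependent turning point the abstract refers to.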

Relevance: 100.00%

Abstract:

Objectives: This review briefly presents the epidemiology and risk factors of gout, with a focus on recent advances. Methods: Key papers for inclusion were identified by a PubMed search, and articles were selected according to their relevance for the topic, based on the authors' judgment. Results and conclusions: Gout therapy has remained largely unchanged for the last 50 years, but we have recently seen the approval of another gout treatment, the xanthine oxidase inhibitor febuxostat, and several new drugs are now in the late stages of clinical testing. Together with our enhanced understanding of the pathophysiology of the inflammatory process involved, we are entering a new era in the treatment of gout.

Relevance: 100.00%

Abstract:

OBJECTIVE: To assess the prevalence of problem gambling in a population of youths in Switzerland and to determine its association with other potentially addictive behaviours. METHODS: Cross-sectional survey including 1,102 participants in the first and second year of post-compulsory education, reporting gambling, socio-demographics, internet use and substance use. For three categories of gambling (nongambler, nonproblem gambler, and at-risk/problem gambler), socio-demographic and addiction data were compared using bivariate analysis. All significant variables were included in a multinomial logistic regression using nongamblers as the reference category. RESULTS: The prevalence of gamblers was 37.48% (n = 413), with 31.94% (n = 352) nonproblem gamblers and 5.54% (n = 61) at-risk/problem gamblers. At the bivariate level, gambling severity was higher among adults (over 18 years), males, vocational students, participants not living with both parents and youths with a low socio-economic status. Gambling was also associated with the four addictive behaviours studied. At the multivariate level, the risk of nonproblem gambling was increased in males, older youths, vocational students, participants of Swiss origin and alcohol misusers. The risk of at-risk/problem gambling was higher for males, older youths, alcohol misusers, participants not living with both parents and problem internet users. CONCLUSIONS: One third of the youths in our sample had gambled in the previous year, and gambling was associated with other addictive behaviours. Clinicians should screen their adolescent patients for gambling habits, especially if other addictive behaviours are present. Additionally, gambling should be included in prevention campaigns together with other addictive behaviours.
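A minimal sketch of the multivariate analysis described above: a multinomial logistic regression with nongamblers (coded 0) as the reference category. The predictors and the simulated data are illustrative only; they are not the study's variables or coefficients.

```python
# Multinomial logit with category 0 (nongambler) as the reference, on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

# Hypothetical predictors: male (0/1), age in years, alcohol misuse (0/1).
X = np.column_stack([
    rng.integers(0, 2, n),       # male
    rng.uniform(15, 25, n),      # age
    rng.integers(0, 2, n),       # alcohol misuse
])

# Outcome: 0 = nongambler, 1 = nonproblem gambler, 2 = at-risk/problem gambler,
# simulated with an arbitrary dependence on the predictors.
logits = np.column_stack([
    np.zeros(n),
    -4 + 0.5 * X[:, 0] + 0.15 * X[:, 1] + 0.4 * X[:, 2],
    -8 + 0.9 * X[:, 0] + 0.25 * X[:, 1] + 0.8 * X[:, 2],
])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in probs])

model = sm.MNLogit(y, sm.add_constant(X)).fit(disp=False)
print(np.exp(model.params))   # relative-risk ratios versus the nongambler category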

Relevance: 100.00%

Abstract:

Breast cancer is one of the most common cancers, affecting one in eight women during their lifetime. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to efficiently reduce both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with late effects in long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship.

In contrast to late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator has been thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in the reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10 × 10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in reconstructed dose could rise to a factor of 10. It was concluded that such a model could be used for conventional treatments using large open fields only.

The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared with older two-dimensional radiotherapy techniques. MC doses were also compared with those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly outside the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs.

Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distribution. Absolute risks varied substantially, as did the ratio of risks between treatment techniques, reflecting the large uncertainties involved with current risk models.
Despite all these uncertainties, the hybrid IMRT technique investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
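To make the last step concrete, the sketch below applies a simple linear dose-risk coefficient to out-of-field organ doses for two techniques. The coefficients and doses are invented for illustration and are not values from this work, which compared several published risk models.

```python
# Minimal sketch: apply a dose-risk model to out-of-field organ doses to
# compare treatment techniques. All numbers are hypothetical.

# Excess absolute risk coefficients (cases per 10,000 person-years per Gy).
risk_per_gy = {"contralateral breast": 9.0, "ipsilateral lung": 7.5}

# Hypothetical mean out-of-field organ doses (Gy) for two techniques.
doses = {
    "wedge-based 3D-CRT": {"contralateral breast": 0.8, "ipsilateral lung": 1.1},
    "hybrid IMRT":        {"contralateral breast": 0.5, "ipsilateral lung": 0.9},
}

for technique, organ_doses in doses.items():
    total = sum(risk_per_gy[organ] * d for organ, d in organ_doses.items())
    print(f"{technique}: {total:.1f} excess cases per 10,000 person-years")
```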

Relevance: 100.00%

Abstract:

This paper challenges the prevailing view of the neutrality of the labour income share to labour demand, and investigates its impact on the evolution of employment. Whilst maintaining the assumption of a unitary long-run elasticity of wages with respect to productivity, we demonstrate that productivity growth affects the labour share in the long run due to frictional growth (that is, the interplay of wage dynamics and productivity growth). In the light of this result, we consider a stylised labour demand equation and show that the labour share is a driving force of employment. We substantiate our analytical exposition by providing empirical models of wage setting and employment equations for France, Germany, Italy, Japan, Spain, the UK, and the US over the 1960-2008 period. Our findings show that the time-varying labour share of these countries has significantly influenced their employment trajectories across decades. This indicates that the evolution of the labour income share (or, equivalently, the wage-productivity gap) deserves the attention of policy makers.
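A stylised sketch of the mechanism, in notation assumed here rather than taken from the paper: a wage-setting rule with a unitary long-run elasticity with respect to productivity, the (log) labour income share, and a labour demand equation in which that share matters.

```latex
% Stylised equations (assumed notation, not the paper's exact specification).
% w_t: log wage, pr_t: log labour productivity, ls_t: log labour income share,
% n_t: log employment, x_t and z_t: other controls.
\begin{align}
  w_t  &= \alpha\, w_{t-1} + (1-\alpha)\, pr_t + \beta' x_t
        &&\text{wage setting, unit long-run elasticity w.r.t.\ productivity}\\
  ls_t &= w_t - pr_t
        &&\text{labour income share (the wage-productivity gap)}\\
  n_t  &= \gamma\, n_{t-1} - \delta\, ls_t + \lambda' z_t
        &&\text{stylised labour demand: the labour share drives employment}
\end{align}
```

If productivity grows at a constant rate g and wages adjust sluggishly (α > 0), the first equation settles, ignoring the controls, on ls = -αg/(1-α): the labour share depends on productivity growth even though the long-run elasticity of wages with respect to productivity is one. This is one way to read the "frictional growth" channel described in the abstract.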

Relevance: 100.00%

Abstract:

This paper studies the relationship between investor protection, entrepreneurial risk taking and income inequality. In the presence of market frictions, better protection makes investors more willing to take on entrepreneurial risk when lending to firms, thereby improving the degree of risk sharing between financiers and entrepreneurs. On the other hand, by increasing risk sharing, investor protection also induces more firms to undertake risky projects. By increasing entrepreneurial risk taking, it raises income dispersion. By reducing the risk faced by entrepreneurs, it reduces income volatility. As a result, investor protection raises income inequality to the extent that it fosters risk taking, while reducing inequality for a given level of risk taking. Empirical evidence from a panel of forty-five countries spanning the period 1976-2000 supports the predictions of the model.

Relevance: 100.00%

Abstract:

BACKGROUND: Cost-effective means of assessing the levels of risk factors in the population have to be defined in order to monitor these factors over time and across populations. This study analyzes the difference in population estimates of the mean level of body mass index (BMI) and the prevalence of overweight between a health examination survey and a telephone survey. METHODS: The study compares the results of two health surveys, one by telephone (N=820) and the other by physical examination (N=1318). The two surveys, based on independent random samples of the population, were carried out over the same period (1992-1993) in the same population (canton of Vaud, Switzerland). RESULTS: Overall participation rates were 67% and 53% for the health interview survey (HIS) and the health examination survey (HES), respectively. In the HIS, the reporting rate was over 98% for weight and height values. Self-reported weight was on average lower than measured weight, by 2.2 kg in men and 3.5 kg in women, while self-reported height was on average greater than measured height, by 1.2 cm in men and 1.9 cm in women. As a result, in comparison with the HES, the HIS led to substantially lower mean levels of BMI and to a reduction of the prevalence of obesity (BMI > 30 kg/m²) by more than half. These differences are larger for women than for men. CONCLUSION: The two surveys were based on different sampling procedures. However, this difference in design is unlikely to explain the systematic bias observed between self-reported and measured values for height and weight. This bias compromises the overall validity of BMI assessment from telephone surveys.
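A worked example of how the reporting biases quoted above propagate into BMI and obesity classification; the measured values below are hypothetical, while the bias figures are those reported for women in the abstract.

```python
# Self-reported weight is lower and self-reported height greater than the
# measured values, which pulls the telephone-survey BMI downwards.

def bmi(weight_kg: float, height_cm: float) -> float:
    return weight_kg / (height_cm / 100.0) ** 2

# Hypothetical measured values for a woman in the survey population.
measured_w, measured_h = 80.0, 161.0          # kg, cm
reported_w = measured_w - 3.5                 # women under-report weight by ~3.5 kg
reported_h = measured_h + 1.9                 # women over-report height by ~1.9 cm

print(f"measured BMI : {bmi(measured_w, measured_h):.1f} kg/m^2")
print(f"reported BMI : {bmi(reported_w, reported_h):.1f} kg/m^2")
print(f"obese (>30)?  measured={bmi(measured_w, measured_h) > 30}, "
      f"reported={bmi(reported_w, reported_h) > 30}")
```

In this example the measured BMI is just above the obesity cut-off while the self-reported BMI falls below it, which is the mechanism behind the halving of the obesity prevalence seen in the telephone survey.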

Relevance: 100.00%

Abstract:

BACKGROUND: Denosumab is a fully human monoclonal antibody to the receptor activator of nuclear factor-kappaB ligand (RANKL) that blocks its binding to RANK, inhibiting the development and activity of osteoclasts, decreasing bone resorption, and increasing bone density. Given its unique actions, denosumab may be useful in the treatment of osteoporosis. METHODS: We enrolled 7868 women between the ages of 60 and 90 years who had a bone mineral density T score of less than -2.5 but not less than -4.0 at the lumbar spine or total hip. Subjects were randomly assigned to receive either 60 mg of denosumab or placebo subcutaneously every 6 months for 36 months. The primary end point was new vertebral fracture. Secondary end points included nonvertebral and hip fractures. RESULTS: As compared with placebo, denosumab reduced the risk of new radiographic vertebral fracture, with a cumulative incidence of 2.3% in the denosumab group, versus 7.2% in the placebo group (risk ratio, 0.32; 95% confidence interval [CI], 0.26 to 0.41; P<0.001)--a relative decrease of 68%. Denosumab reduced the risk of hip fracture, with a cumulative incidence of 0.7% in the denosumab group, versus 1.2% in the placebo group (hazard ratio, 0.60; 95% CI, 0.37 to 0.97; P=0.04)--a relative decrease of 40%. Denosumab also reduced the risk of nonvertebral fracture, with a cumulative incidence of 6.5% in the denosumab group, versus 8.0% in the placebo group (hazard ratio, 0.80; 95% CI, 0.67 to 0.95; P=0.01)--a relative decrease of 20%. There was no increase in the risk of cancer, infection, cardiovascular disease, delayed fracture healing, or hypocalcemia, and there were no cases of osteonecrosis of the jaw and no adverse reactions to the injection of denosumab. CONCLUSIONS: Denosumab given subcutaneously twice yearly for 36 months was associated with a reduction in the risk of vertebral, nonvertebral, and hip fractures in women with osteoporosis. (ClinicalTrials.gov number, NCT00089791.)
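The arithmetic behind the "relative decrease" figures quoted above: the relative risk reduction is one minus the reported risk (or hazard) ratio. The sketch below reuses the incidences and ratios from the abstract; note that the hazard ratios come from the time-to-event analysis and therefore differ slightly from the simple quotient of cumulative incidences, and confidence intervals are not reproduced here.

```python
# Relative decrease = 1 - risk (or hazard) ratio, using the figures quoted above.

def relative_decrease(ratio: float) -> float:
    return 1.0 - ratio

endpoints = {
    # endpoint: (denosumab %, placebo %, reported risk/hazard ratio)
    "new vertebral fracture": (2.3, 7.2, 0.32),
    "hip fracture":           (0.7, 1.2, 0.60),
    "nonvertebral fracture":  (6.5, 8.0, 0.80),
}

for name, (treated, placebo, ratio) in endpoints.items():
    print(f"{name}: {treated}% vs {placebo}% "
          f"-> relative decrease {relative_decrease(ratio):.0%}")
```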

Relevance: 100.00%

Abstract:

Microautophagy involves direct invagination and fission of the vacuolar/lysosomal membrane under nutrient limitation. In Saccharomyces cerevisiae, microautophagic uptake of soluble cytosolic proteins occurs via an autophagic tube, a highly specialized vacuolar membrane invagination. At the tip of an autophagic tube, vesicles (autophagic bodies) pinch off into the vacuolar lumen for degradation. Formation of autophagic tubes is topologically equivalent to other budding processes directed away from the cytosolic environment, e.g., the invagination of multivesicular endosomes, retroviral budding, piecemeal microautophagy of the nucleus and micropexophagy. This clearly distinguishes microautophagy from other membrane fission events that follow budding toward the cytosol; such processes are implicated in transport between organelles like the plasma membrane, the endoplasmic reticulum (ER), and the Golgi. For many years, microautophagy could only be characterized microscopically. Recent studies have made it possible to study the process in vitro and have identified the first molecules involved in microautophagy.

Relevance: 100.00%

Abstract:

A new report published by the Institute of Public Health in Ireland (IPH), released on Monday 9 July 2007, predicts a 26% increase in diabetes in Northern Ireland and a 37% increase in the Republic over the ten-year period 2005-2015. The new report, entitled Making Diabetes Count: What does the future hold?, is the second such report from its authors, the Irish Diabetes Prevalence Working Group.

Relevance: 100.00%

Abstract:

The global malaria situation has scarcely improved in the last 100 years, despite major advances in our knowledge of the basic biology, epidemiology and clinical basis of the disease. Effective malaria control, leading to a significant decrease in the morbidity and mortality attributable to malaria, will require a multidisciplinary approach. New tools - drugs, vaccines and insecticides - are needed, but there is also much to be gained by better use of existing tools: using drugs in combination in order to slow the development of drug resistance, targeting resources to areas of greatest need, using geographic information systems to map the populations at risk, and using more sophisticated marketing techniques to distribute bed nets and insecticides. Sustainable malaria control may require the deployment of a highly effective vaccine, but there is much that can be done in the meantime to reduce the burden of disease.

Relevance: 100.00%

Abstract:

SETTING: An ambulatory paediatric clinic in Lausanne, Switzerland, a country with a significant proportion of tuberculosis (TB) cases among immigrants. AIM: To assess the factors associated with positive tuberculin skin tests (TST) among children examined during a health check-up or during TB contact tracing, notably the influence of BCG (Bacille Calmette-Guérin) vaccination and a history of TB contact. METHOD: A descriptive study of children who had a TST (2 units RT23) between November 2002 and April 2004. Age, sex, history of TB contact, BCG vaccination status, country of origin and birth outside Switzerland were recorded. RESULTS: Of 234 children, 176 (75%) had a reaction equal to zero and 31 (13%) tested positive (> 10 mm). In a linear regression model, the size of the TST varied significantly according to the history of TB contact, age, TB incidence in the country of origin and BCG vaccination status, but not according to sex or birth in or outside Switzerland. In a logistic regression model including all the recorded variables, age (odds ratio [OR] = 1.21, 95% CI 1.08; 1.35), a history of TB contact (OR = 7.31, 95% CI 2.23; 24) and the incidence of TB in the country of origin (OR = 1.01, 95% CI 1.00; 1.02) were significantly associated with a positive TST, whereas sex (OR = 1.18, 95% CI 0.50; 2.78) and BCG vaccination status (OR = 2.97, 95% CI 0.91; 9.72) were not. CONCLUSIONS: TB incidence in the country of origin, BCG vaccination and age influence the TST reaction (size or proportion of TST ≥ 10 mm). However, the most obvious risk factor for a positive TST is a history of contact with TB.
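A minimal sketch of how odds ratios and their confidence intervals, as quoted above, are read off a logistic regression: OR = exp(beta) and 95% CI = exp(beta ± 1.96·SE). The coefficient and standard error below are hypothetical, chosen only so the output resembles the reported OR for a history of TB contact.

```python
# Converting a logistic-regression coefficient into an odds ratio with a 95% CI.
import math

beta, se = math.log(7.31), 0.61   # hypothetical log-odds coefficient and standard error

or_point = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(f"OR = {or_point:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}")
```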