Abstract:
A growing body of longitudinal studies suggests that low self-esteem is a risk factor for depression. However, it is unclear whether other characteristics of self-esteem, besides its level, explain incremental or even greater variance in subsequent depression. We examined the prospective effects of self-esteem level, instability (i.e., the degree of variability in self-esteem across short periods), and contingency (i.e., the degree to which self-esteem fluctuates in response to self-relevant events) on depressive symptoms in 1 overarching model, using data from 2 longitudinal studies. In Study 1, 372 adults were assessed at 2 waves over 6 months, including 40 daily diary assessments at Wave 1. In Study 2, 235 young adults were assessed at 2 waves over 6 weeks, including about 6 daily diary assessments at each wave. Self-esteem contingency was measured by self-report and by a statistical index based on the diary data (capturing event-related fluctuations in self-esteem). In both studies, self-esteem level, but not self-esteem contingency, predicted subsequent depressive symptoms. Self-esteem instability predicted subsequent depressive symptoms in Study 2 only, with a smaller effect size than self-esteem level. Also, level, instability, and contingency of self-esteem did not interact in the prediction of depressive symptoms. Moreover, the effect of self-esteem level held when controlling for neuroticism and for all other Big Five personality traits. Thus, the findings provide converging evidence for a vulnerability effect of self-esteem level, tentative evidence for a smaller vulnerability effect of self-esteem instability, and no evidence for a vulnerability effect of self-esteem contingency.
Abstract:
The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underscoring the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially depending on the approach used for risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which translates into inconsistent results on the potential higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches investigated.
Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy) as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
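As a concrete illustration of why model choice matters, the linear and linear-exponential (organ-specific non-linear) dose-risk forms discussed above can be compared on the same mean organ doses. A minimal sketch; the doses and coefficients below are illustrative assumptions, not values from the study:

```python
import math

def linear_risk(dose_gy, beta):
    """Linear no-threshold model: risk proportional to mean organ dose."""
    return beta * dose_gy

def linear_exponential_risk(dose_gy, beta, alpha):
    """Linear-exponential model: linear induction term attenuated by
    exponential cell kill, which flattens risk at high dose."""
    return beta * dose_gy * math.exp(-alpha * dose_gy)

# Hypothetical mean organ doses (Gy) for two techniques -- illustrative only.
doses = {"open tangents": 8.0, "hybrid IMRT": 5.0}
beta, alpha = 0.002, 0.08  # assumed coefficients, not fitted values

for name, d in doses.items():
    print(f"{name}: linear {linear_risk(d, beta):.4f}, "
          f"lin-exp {linear_exponential_risk(d, beta, alpha):.4f}")
```

Even in this toy setting, the risk *ratio* between the two techniques differs between the two models, which is the source of the inconsistency the abstract describes.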
Abstract:
PURPOSE To systematically appraise whether anti-infective protocols are effective in preventing biologic implant complications and implant loss after a mean observation period ≥ 10 years after loading. MATERIALS AND METHODS An electronic search of Medline via PubMed and Embase via Ovid databases, complemented by manual search, was conducted up to October 31, 2012. Studies were included provided that they were published in English, German, French, or Italian, and conducted on ≥ 20 partially and fully edentulous patients with dental implants and regular (≥ 1×/year) supportive periodontal therapy (SPT) over a mean observation period ≥ 10 years. Assessment of the identified studies and data extraction were performed independently by two reviewers. Authors were contacted if required. Collected data were reported by descriptive methods. RESULTS The initial electronic search identified 994 titles from Medline via PubMed and 531 titles from Embase via Ovid. After elimination of duplicate titles and exclusion of 60 full-text articles, 143 articles were analyzed, resulting in 15 studies eligible for qualitative analysis. The implant survival rate ranged from 85.7% to 99.2% after a mean observation period ≥ 10 years. One comparative study assessed the effects of regular SPT on the occurrence of biologic complications and implant loss. Overall, regular diagnosis and implementation of anti-infective therapeutic protocols were effective in the management of biological complications and prevention of implant loss. Residual probing depths at the end of active periodontal therapy and development of reinfection during SPT represented a significant risk for the onset of peri-implantitis and implant loss. Comparative studies indicated that implant survival and success rates were lower in periodontally compromised vs noncompromised patients.
CONCLUSIONS In order to achieve high long-term survival and success rates of dental implants and their restorations, enrollment in regular SPT including anti-infective preventive measures should be implemented. Therapy of peri-implant mucositis should be considered as a preventive measure for the onset of peri-implantitis. Completion of active periodontal therapy should precede implant placement in periodontally compromised patients.
Abstract:
BACKGROUND Data on the association between subclinical thyroid dysfunction and fractures conflict. PURPOSE To assess the risk for hip and nonspine fractures associated with subclinical thyroid dysfunction among prospective cohorts. DATA SOURCES Search of MEDLINE and EMBASE (1946 to 16 March 2014) and reference lists of retrieved articles without language restriction. STUDY SELECTION Two physicians screened and identified prospective cohorts that measured thyroid function and followed participants to assess fracture outcomes. DATA EXTRACTION One reviewer extracted data using a standardized protocol, and another verified data. Both reviewers independently assessed methodological quality of the studies. DATA SYNTHESIS The 7 population-based cohorts of heterogeneous quality included 50,245 participants with 1966 hip and 3281 nonspine fractures. In random-effects models that included the 5 higher-quality studies, the pooled adjusted hazard ratios (HRs) of participants with subclinical hyperthyroidism versus euthyroidism were 1.38 (95% CI, 0.92 to 2.07) for hip fractures and 1.20 (CI, 0.83 to 1.72) for nonspine fractures without statistical heterogeneity (P = 0.82 and 0.52, respectively; I2 = 0%). Pooled estimates for the 7 cohorts were 1.26 (CI, 0.96 to 1.65) for hip fractures and 1.16 (CI, 0.95 to 1.42) for nonspine fractures. When thyroxine recipients were excluded, the HRs for participants with subclinical hyperthyroidism were 2.16 (CI, 0.87 to 5.37) for hip fractures and 1.43 (CI, 0.73 to 2.78) for nonspine fractures. For participants with subclinical hypothyroidism, HRs from higher-quality studies were 1.12 (CI, 0.83 to 1.51) for hip fractures and 1.04 (CI, 0.76 to 1.42) for nonspine fractures (P for heterogeneity = 0.69 and 0.88, respectively; I2 = 0%). LIMITATIONS Selective reporting cannot be excluded. Adjustment for potential common confounders varied and was not adequately done across all studies.
CONCLUSION Subclinical hyperthyroidism might be associated with an increased risk for hip and nonspine fractures, but additional large, high-quality studies are needed. PRIMARY FUNDING SOURCE Swiss National Science Foundation.
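The pooled hazard ratios quoted in this abstract come from random-effects meta-analysis. The standard DerSimonian-Laird procedure can be sketched briefly: study standard errors are recovered from the reported 95% CIs, a between-study variance tau² is estimated from Cochran's Q, and studies are re-weighted accordingly. The study-level numbers in the example are illustrative, not the cohorts analysed above:

```python
import math

def pool_random_effects(hrs, ci_lows, ci_highs):
    """DerSimonian-Laird random-effects pooling of hazard ratios.

    Log-HRs and their standard errors are recovered from the 95% CIs
    (SE = (ln(upper) - ln(lower)) / (2 * 1.96))."""
    y = [math.log(h) for h in hrs]
    se = [(math.log(u) - math.log(l)) / (2 * 1.96)
          for l, u in zip(ci_lows, ci_highs)]
    w = [1 / s**2 for s in se]                            # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe)**2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]                # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I-squared (%)
    hr = math.exp(y_re)
    ci = (math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re))
    return hr, ci, i2

# Illustrative study-level HRs with 95% CI bounds (made-up numbers)
hr, ci, i2 = pool_random_effects(
    [1.4, 1.1, 1.6], [0.9, 0.7, 1.0], [2.2, 1.7, 2.6])
print(f"pooled HR {hr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), I2 = {i2:.0f}%")
```

When the studies are homogeneous, Q falls below its degrees of freedom, tau² is truncated to zero, and the result coincides with the fixed-effect pool, which is why the abstract can report I² = 0% alongside its pooled HRs.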
Abstract:
BACKGROUND Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with women not using HC. METHODS AND FINDINGS Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15-49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random-effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies, including 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24-1.83) for DMPA use, 1.24 (95% CI 0.84-1.82) for NET-EN use, and 1.03 (95% CI 0.88-1.20) for COC use. Between-study heterogeneity was mild (I2 < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23-1.67) and NET-EN use (aHR 1.32, 95% CI 1.08-1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99-1.50; for NET-EN use 0.67, 95% CI 0.47-0.96; and for COC use 0.91, 95% CI 0.73-1.41) compared to those at higher risk of bias (P for interaction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC-HIV relationship.
CONCLUSIONS This IPD meta-analysis found no evidence that COC or NET-EN use increases women's risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe and effective contraceptive options for women at high HIV risk. A randomized controlled trial would provide more definitive evidence about the effects of hormonal contraception, particularly DMPA, on HIV risk.
Abstract:
Low-grade gliomas (LGGs) are a group of primary brain tumours usually encountered in young patient populations. These tumours represent a difficult challenge because many patients survive a decade or more and may be at a higher risk for treatment-related complications. Specifically, radiation therapy is known to have a relevant effect on survival but in many cases it can be deferred to avoid side effects while maintaining its beneficial effect. However, a subset of LGGs manifests more aggressive clinical behaviour and requires earlier intervention. Moreover, the effectiveness of radiotherapy depends on the tumour characteristics. Recently Pallud et al. (2012. Neuro-Oncology, 14: , 1-10) studied patients with LGGs treated with radiation therapy as a first-line therapy and obtained the counterintuitive result that tumours with a fast response to the therapy had a worse prognosis than those responding late. In this paper, we construct a mathematical model describing the basic facts of glioma progression and response to radiotherapy. The model provides also an explanation to the observations of Pallud et al. Using the model, we propose radiation fractionation schemes that might be therapeutically useful by helping to evaluate tumour malignancy while at the same time reducing the toxicity associated to the treatment.
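The qualitative mechanism such models typically invoke is that radiation-damaged cells die only when they attempt mitosis, so faster-proliferating (more malignant) tumours both shrink faster under therapy and regrow sooner. This can be sketched with a toy two-compartment model; these are not the published equations, and all rates are illustrative:

```python
import math

def simulate(rho, n_fractions=30, dose=1.8, alpha=0.05, days=600):
    """Toy two-compartment glioma model (illustrative, not the paper's).

    v: viable cells, proliferating at rate rho (per day).
    d: lethally damaged cells, which die at rate rho when attempting
       division, so clearance tracks proliferation speed.
    Each daily fraction moves a fraction (1 - SF) of viable cells into
    the damaged compartment, with survival SF = exp(-alpha * dose)."""
    sf = math.exp(-alpha * dose)
    v, d = 1.0, 0.0
    sizes = []
    for t in range(days):
        if t < n_fractions:                 # one fraction per day
            v, d = v * sf, d + v * (1 - sf)
        v += rho * v                        # proliferation (Euler step, dt = 1 day)
        d -= rho * d                        # mitotic death of damaged cells
        sizes.append(v + d)
    return sizes

slow = simulate(rho=0.002)   # low-proliferation tumour
fast = simulate(rho=0.010)   # high-proliferation tumour
# The fast-proliferating tumour reaches its volume nadir sooner but
# regrows earlier -- the counterintuitive association discussed above.
```

Under these assumed rates, the fast tumour's quicker radiological response is a marker of faster proliferation and hence earlier regrowth, matching the direction of the clinical observation.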
Abstract:
Subclinical thyroid dysfunction has been associated with coronary heart disease, but the risk of stroke is unclear. Our aim is to combine the evidence on the association between subclinical thyroid dysfunction and the risk of stroke in prospective cohort studies. We searched Medline (OvidSP), Embase, Web of Science, PubMed Publisher, Cochrane and Google Scholar from inception to November 2013 using a cohort filter, but without language restriction or other limitations. Reference lists of articles were searched. Two independent reviewers screened articles according to pre-specified criteria and selected prospective cohort studies with baseline thyroid function measurements and assessment of stroke outcomes. Data were derived using a standardized data extraction form. Quality was assessed according to previously defined quality indicators by two independent reviewers. We pooled the outcomes using a random-effects model. Of 2,274 articles screened, six cohort studies, including 11,309 participants with 665 stroke events, met the criteria. Four of six studies provided information on subclinical hyperthyroidism including a total of 6,029 participants and five on subclinical hypothyroidism (n = 10,118). The pooled hazard ratio (HR) was 1.08 (95% CI 0.87-1.34) for subclinical hypothyroidism (I2 of 0%) and 1.17 (95% CI 0.54-2.56) for subclinical hyperthyroidism (I2 of 67%) compared to euthyroidism. Subgroup analyses yielded similar results. Our systematic review provides no evidence supporting an increased risk for stroke associated with subclinical thyroid dysfunction. However, the available literature is insufficient and larger datasets are needed to perform extended analyses. Also, there were insufficient events to exclude clinically significant risk from subclinical hyperthyroidism, and more data are required for subgroup analyses.
Abstract:
BACKGROUND AND AIMS Limited data from large cohorts are available on tumor necrosis factor (TNF) antagonist (infliximab, adalimumab, certolizumab pegol) switches over time. We aimed to evaluate the prevalence of switching from one TNF antagonist to another and to identify associated risk factors. METHODS Data from the Swiss Inflammatory Bowel Diseases Cohort Study (SIBDCS) were analyzed. RESULTS Of 1731 patients included in the SIBDCS (956 with Crohn's disease [CD] and 775 with ulcerative colitis [UC]), 347 CD patients (36.3%) and 129 UC patients (16.6%) were treated with at least one TNF antagonist. A total of 53/347 (15.3%) CD patients (median disease duration 9 years) and 20/129 (15.5%) UC patients (median disease duration 7 years) needed to switch to a second and/or a third TNF antagonist. Median treatment duration was longest for the first TNF antagonist used (CD 25 months; UC 14 months), followed by the second (CD 13 months; UC 4 months) and third TNF antagonist (CD 11 months; UC 15 months). Primary nonresponse, loss of response and side effects were the major reasons to stop and/or switch TNF antagonist therapy. A low body mass index, a short diagnostic delay and extraintestinal manifestations at inclusion were identified as risk factors for a switch of the first TNF antagonist within 24 months of its use in CD patients. CONCLUSION Switching of the TNF antagonist over time is a common issue. The median treatment duration with a specific TNF antagonist diminishes with an increasing number of TNF antagonists used.
Abstract:
Conventional risk assessments for crop protection chemicals compare the potential for causing toxicity (hazard identification) to anticipated exposure. New regulatory approaches have been proposed that would exclude exposure assessment and just focus on hazard identification based on endocrine disruption. This review comprises a critical analysis of hazard, focusing on the relative sensitivity of endocrine and non-endocrine endpoints, using a class of crop protection chemicals, the azole fungicides. These were selected because they are widely used on important crops (e.g. grains) and thereby can contact target and non-target plants and enter the food chain of humans and wildlife. Inhibition of lanosterol 14α-demethylase (CYP51) mediates the antifungal effect. Inhibition of other CYPs, such as aromatase (CYP19), can lead to numerous toxicological effects, which are also evident from high dose human exposures to therapeutic azoles. Because of its widespread use and substantial database, epoxiconazole was selected as a representative azole fungicide. Our critical analysis concluded that anticipated human exposure to epoxiconazole would yield a margin of safety of at least three orders of magnitude for reproductive effects observed in laboratory rodent studies that are postulated to be endocrine-driven (i.e. fetal resorptions). The most sensitive ecological species is the aquatic plant Lemna (duckweed), for which the margin of safety is less protective than for human health. For humans and wildlife, endocrine disruption is not the most sensitive endpoint. It is concluded that conventional risk assessment, considering anticipated exposure levels, will be protective of both human and ecological health. Although the toxic mechanisms of other azole compounds may be similar, large differences in potency will require a case-by-case risk assessment.
Abstract:
AIMS: To investigate pathways through which momentary negative affect and depressive symptoms affect risk of lapse during smoking cessation attempts. DESIGN: Ecological momentary assessment was carried out during 2 weeks after an unassisted smoking cessation attempt. A 3-month follow-up measured smoking frequency. SETTING: Data were collected via mobile devices in German-speaking Switzerland. PARTICIPANTS: A total of 242 individuals (age 20-40, 67% men) reported 7112 observations. MEASUREMENTS: Online surveys assessed baseline depressive symptoms and nicotine dependence. Real-time data were collected on negative affect, physical withdrawal symptoms, urge to smoke, abstinence-related self-efficacy and lapses. FINDINGS: A two-level structural equation model suggested that on the situational level, negative affect increased the urge to smoke and decreased self-efficacy (β = 0.20 and β = -0.12, respectively), but had no direct effect on lapse risk. A higher urge to smoke (β = 0.09) and lower self-efficacy (β = -0.11) were confirmed as situational antecedents of lapses. Depressive symptoms at baseline were a strong predictor of a person's average negative affect (β = 0.35; all P < 0.001). However, the baseline characteristics influenced smoking frequency 3 months later only indirectly, through the influence of average states on the number of lapses during the quit attempt. CONCLUSIONS: Controlling for nicotine dependence, higher depressive symptoms at baseline were strongly associated with a worse longer-term outcome. Negative affect experienced during the quit attempt was the only pathway through which baseline depressive symptoms were associated with reduced self-efficacy and increased urges to smoke, which in turn increased the probability of lapses.
Abstract:
Species adapted to cold-climatic mountain environments are expected to face a high risk of range contractions, if not local extinctions, under climate change. Yet, the populations of many endothermic species may not be primarily affected by physiological constraints, but indirectly by climate-induced changes of habitat characteristics. In mountain forests, where vertebrate species largely depend on vegetation composition and structure, deteriorating habitat suitability may thus be mitigated or even compensated by habitat management aiming at compositional and structural enhancement. We tested this possibility using four cold-adapted bird species with complementary habitat requirements as model organisms. Based on species data and environmental information collected in 300 1-km² grid cells distributed across four mountain ranges in central Europe, we investigated (1) how species’ occurrence is explained by climate, landscape, and vegetation, (2) to what extent climate change and climate-induced vegetation changes will affect habitat suitability, and (3) whether these changes could be compensated by adaptive habitat management. Species presence was modelled as a function of climate, landscape and vegetation variables under current climate; moreover, vegetation-climate relationships were assessed. The models were extrapolated to the climatic conditions of 2050, assuming the moderate IPCC scenario A1B, and changes in species’ occurrence probability were quantified. Finally, we assessed the maximum increase in occurrence probability that could be achieved by modifying one or multiple vegetation variables under altered climate conditions. Climate variables contributed significantly to explaining species occurrence, and expected climatic changes, as well as climate-induced vegetation trends, decreased the occurrence probability of all four species, particularly at the low-altitudinal margins of their distribution.
These effects could be partly compensated by modifying single vegetation factors, but full compensation would only be achieved if several factors were changed in concert. The results illustrate the possibilities and limitations of adaptive species conservation management under climate change.
Abstract:
OBJECTIVES The aim of this study was to identify common risk factors for patient-reported medical errors across countries. In country-level analyses, differences in risks associated with error between health care systems were investigated. The joint effects of risks on error-reporting probability were modelled for hypothetical patients with different health care utilization patterns. DESIGN Data from the Commonwealth Fund's 2010 International Survey of the General Public's Views of their Health Care System's Performance in 11 Countries. SETTING Representative population samples of 11 countries were surveyed (total sample = 19,738 adults). Utilization of health care, coordination of care problems and reported errors were assessed. Regression analyses were conducted to identify risk factors for patients' reports of medical, medication and laboratory errors across countries and in country-specific models. RESULTS Error was reported by 11.2% of patients, but with marked differences between countries (range: 5.4-17.0%). Poor coordination of care was reported by 27.3%. The risk of patient-reported error was determined mainly by health care utilization: emergency care (OR = 1.7, P < 0.001), hospitalization (OR = 1.6, P < 0.001) and the number of providers involved (OR for three doctors = 2.0, P < 0.001) were important predictors. Poor care coordination was the single most important risk factor for reporting error (OR = 3.9, P < 0.001). Country-specific models yielded common and country-specific predictors for self-reported error. For high utilizers of care, the probability that errors are reported rises to 0.68. CONCLUSIONS Safety remains a global challenge affecting many patients throughout the world. Large variability exists in the frequency of patient-reported error across countries. Learning from others' errors is not only essential within countries but may also prove a promising strategy internationally.
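The joint-effects modelling described above works on the odds scale: a baseline probability is converted to odds, multiplied by the odds ratio of each risk factor that applies to a hypothetical patient, and converted back to a probability. A minimal sketch using the ORs quoted in the abstract; the baseline rate is an assumption, and multiplying the ORs presumes they come from the same adjusted model:

```python
def predicted_probability(base_rate, odds_ratios):
    """Combine a baseline probability with odds ratios for a
    hypothetical patient's risk profile (logistic-model logic:
    odds multiply, then convert back to a probability)."""
    odds = base_rate / (1 - base_rate)
    for or_ in odds_ratios:
        odds *= or_
    return odds / (1 + odds)

# ORs from the abstract; the 5% baseline reporting rate is illustrative.
base = 0.05
high_utilizer = [1.7, 1.6, 2.0, 3.9]  # emergency care, hospitalization,
                                      # three doctors, poor coordination
p = predicted_probability(base, high_utilizer)
print(f"predicted reporting probability: {p:.2f}")
```

With a different (and unknown) baseline rate and covariate set, this kind of calculation is what yields figures such as the 0.68 reported for high utilizers.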
Abstract:
PURPOSE OF REVIEW: Obesity and gastroesophageal reflux disease (GERD) are two prevalent conditions with an important impact on health resource utilization around the world. Obesity is a known risk factor in the pathogenesis of GERD. When conservative measures fail, bariatric surgery remains the only option to lose weight and correct obesity-related comorbidities. The influence of bariatric surgery on GERD depends on which bariatric intervention is used. RECENT FINDINGS: Recent studies indicate that laparoscopic gastric banding and laparoscopic sleeve gastrectomy have little influence on preexisting GERD symptoms and findings, but some patients may develop GERD after laparoscopic sleeve gastrectomy. A number of studies have documented that laparoscopic Roux-en-Y gastric bypass improves GERD symptoms and findings, making it the preferred procedure for morbidly obese patients with concomitant GERD. SUMMARY: Current findings provide good arguments for searching for and treating GERD in patients scheduled to undergo bariatric surgery. The presence of GERD might represent a relative contraindication for sleeve gastrectomy or gastric banding or both. Gastric bypass might be the procedure of choice in morbidly obese patients with GERD symptoms or findings or both.
Abstract:
1. Predation is a prime force of natural selection. Vulnerability to predation is typically highest early in life, hence antipredator defences should already be effective shortly after birth. Such early defences may be innate, transmitted through non-genetic parental effects or acquired through early experience. 2. To understand potential joint effects of these sources of antipredator defences on phenotypic expression, they should be manipulated within the same experiment. We investigated innate, parental and individual experience effects within a single experiment. Females of the African cichlid Simochromis pleurospilus were exposed to the offspring predator Ctenochromis horei or a benign species until spawning. Eggs and larvae were hand-reared, and larvae were then exposed to odour cues signalling the presence or absence of predators in a split-brood design. 3. Shortly after independence of maternal care, S. pleurospilus undergo a habitat shift from a deeper, adult habitat to a shallow juvenile habitat, a phase where young are thought to be particularly exposed to predation risk. Thus, maternal effects induced by offspring predators present in the adult habitat should take effect mainly shortly after independence, whereas own experience and innate antipredator responses should shape behaviour and life history of S. pleurospilus during the later juvenile period. 4. We found that the manipulated environmental components independently affected different offspring traits. (i) Offspring of predator-exposed mothers grew faster during the first month of life and were thus larger at termination of maternal care, when the young migrate from the adult to the juvenile habitat. (ii) The offspring’s own experience shortly after hatching exerted lasting effects on predator avoidance behaviour. (iii) Finally, our results suggest that S. pleurospilus possess a genetically inherited ability to distinguish dangerous from benign species. 5. In S. pleurospilus, maternal effects were limited to a short but critical time window, when young undergo a niche shift. Instead, the young's own environmental sampling of predation risk, combined with an innate predisposition to correctly identify predators, appears to prepare them best for the environment in which they grow up as juveniles.
Abstract:
Flavonoid-rich dark chocolate consumption benefits cardiovascular health, but the underlying mechanisms are elusive. We investigated the acute effect of dark chocolate on the reactivity of prothrombotic measures to psychosocial stress. Healthy men aged 20-50 years (mean ± SD: 35.7 ± 8.8) were assigned to a single serving of either 50 g of flavonoid-rich dark chocolate (n=31) or 50 g of optically identical flavonoid-free placebo chocolate (n=34). Two hours after chocolate consumption, both groups underwent an acute standardised psychosocial stress task combining public speaking and mental arithmetic. We determined plasma levels of four stress-responsive prothrombotic measures (i.e., fibrinogen, clotting factor VIII activity, von Willebrand Factor antigen, fibrin D-dimer) prior to chocolate consumption, immediately before and after stress, and at 10 minutes and 20 minutes after stress cessation. We also measured the flavonoid epicatechin, and the catecholamines epinephrine and norepinephrine in plasma. The dark chocolate group showed a significantly attenuated stress reactivity of the hypercoagulability marker D-dimer (F=3.87, p=0.017) relative to the placebo chocolate group. Moreover, the blunted D-dimer stress reactivity was related to higher plasma levels of the flavonoid epicatechin assessed before stress (F=3.32, p=0.031) but not to stress-induced changes in catecholamines (p's=0.35). There were no significant group differences in the other coagulation measures (p's≥0.87). Adjustments for covariates did not alter these findings. In conclusion, our findings indicate that a single consumption of flavonoid-rich dark chocolate blunted the acute prothrombotic response to psychosocial stress, thereby perhaps mitigating the risk of acute coronary syndromes triggered by emotional stress.