133 results for Risk Impact
Abstract:
This paper examines the impact of disastrous and ‘ordinary’ floods on human societies in what is now Austria. The focus is on urban areas and their neighbourhoods. Examining institutional sources such as accounts of the bridge masters, charters, statutes and official petitions, it can be shown that city communities were well acquainted with this permanent risk: an office was established for the restoration of bridges and the maintenance of water defences, and large depots of timber and water pipes ensured that the reconstruction of bridges and the water-supply system could begin immediately after the floods had subsided. Carpenters and similar groups gained 10 to 20 per cent of their income from the repair of bridges and other flood damage. The construction of houses in endangered zones was adapted so that they could withstand worst-case events. Thus, we may describe the communities living along the central European rivers as ‘cultures of flood management’. This special knowledge vanished, however, from the mid-nineteenth century onwards, when river regulation gave people a false sense of security.
Abstract:
Oral contraceptives containing synthetic oestrogens have been used successfully for birth control for > 40 years and are currently prescribed to > 100 million women worldwide. Several new progestins have been introduced, and the third generation of progestins has now been available for two decades. Oral contraceptives are prescribed over prolonged periods and therefore have a substantial impact on hormonal, metabolic and plasmatic functions. Oral contraceptives increase the risk of venous thrombosis and pulmonary embolism, particularly when combined with confounding factors such as genetic predisposition, smoking, hypertension or obesity. The risk of developing coronary artery disease is also increased in users with cardiovascular risk factors. This article discusses mechanistic and clinical issues and reviews the need for novel approaches that target these considerable side effects in order to reduce cardiovascular morbidity in women using oral contraceptives.
Abstract:
Background Few studies have monitored late presentation (LP) of HIV infection over the European continent, including Eastern Europe. Study objectives were to explore the impact of LP on AIDS and mortality. Methods and Findings LP was defined in Collaboration of Observational HIV Epidemiological Research Europe (COHERE) as HIV diagnosis with a CD4 count <350/mm3 or an AIDS diagnosis within 6 months of HIV diagnosis among persons presenting for care between 1 January 2000 and 30 June 2011. Logistic regression was used to identify factors associated with LP and Poisson regression to explore the impact on AIDS/death. 84,524 individuals from 23 cohorts in 35 countries contributed data; 45,488 were LP (53.8%). LP was highest in heterosexual males (66.1%), Southern European countries (57.0%), and persons originating from Africa (65.1%). LP decreased from 57.3% in 2000 to 51.7% in 2010/2011 (adjusted odds ratio [aOR] 0.96; 95% CI 0.95–0.97). LP decreased over time in both Central and Northern Europe among homosexual men, and male and female heterosexuals, but increased over time for female heterosexuals and male intravenous drug users (IDUs) from Southern Europe and in male and female IDUs from Eastern Europe. 8,187 AIDS/deaths occurred during 327,003 person-years of follow-up. In the first year after HIV diagnosis, LP was associated with over a 13-fold increased incidence of AIDS/death in Southern Europe (adjusted incidence rate ratio [aIRR] 13.02; 95% CI 8.19–20.70) and over a 6-fold increased rate in Eastern Europe (aIRR 6.64; 95% CI 3.55–12.43). Conclusions LP has decreased over time across Europe, but remains a significant issue in the region in all HIV exposure groups. LP increased in male IDUs and female heterosexuals from Southern Europe and IDUs in Eastern Europe. LP was associated with an increased rate of AIDS/deaths, particularly in the first year after HIV diagnosis, with significant variation across Europe. Earlier and more widespread testing, timely referrals after testing positive, and improved retention in care strategies are required to further reduce the incidence of LP.
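The Poisson-regression step described above can be sketched compactly: in a rate model with person-years as exposure, the adjusted incidence rate ratio (aIRR) for late presentation is the exponential of the LP coefficient. The snippet below (Python/statsmodels) shows this mechanic with invented counts and person-years; it is not the COHERE analysis, which adjusted for many more covariates.

```python
# Sketch of a Poisson rate model for AIDS/death: the IRR for late presentation
# is exp(coefficient on lp). All counts and person-years below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

d = pd.DataFrame({
    "lp":           [0, 1, 0, 1],            # late presenter (1) vs. not (0)
    "first_year":   [0, 0, 1, 1],            # first year after HIV diagnosis
    "events":       [120, 310, 95, 840],     # AIDS/death events (invented)
    "person_years": [40e3, 35e3, 9e3, 8e3],  # follow-up time (invented)
})

X = sm.add_constant(d[["lp", "first_year"]])
fit = sm.GLM(d["events"], X,
             family=sm.families.Poisson(),
             exposure=d["person_years"]).fit()

print(f"IRR for late presentation: {np.exp(fit.params['lp']):.2f}")
```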
Abstract:
We assessed the impact of antiviral prophylaxis and preemptive therapy on the incidence and outcomes of cytomegalovirus (CMV) disease in a nationwide prospective cohort of solid organ transplant recipients. Risk factors associated with CMV disease and graft failure-free survival were analyzed using Cox regression models. One thousand two hundred thirty-nine patients transplanted from May 2008 until March 2011 were included; 466 (38%) patients received CMV prophylaxis and 522 (42%) patients were managed preemptively. Overall incidence of CMV disease was 6.05% and was linked to CMV serostatus (D+/R− vs. R+, hazard ratio [HR] 5.36 [95% CI 3.14–9.14], p < 0.001). No difference in the incidence of CMV disease was observed in patients receiving antiviral prophylaxis as compared to the preemptive approach (HR 1.16 [95% CI 0.63–2.17], p = 0.63). CMV disease was not associated with a lower graft failure-free survival (HR 1.27 [95% CI 0.64–2.53], p = 0.50). Nevertheless, patients followed by the preemptive approach had an inferior graft failure-free survival after a median of 1.05 years of follow-up (HR 1.63 [95% CI 1.01–2.64], p = 0.044). The incidence of CMV disease in this cohort was low and not influenced by the preventive strategy used. However, patients on CMV prophylaxis were more likely to be free from graft failure.
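The hazard ratios quoted above come from Cox regression models; a minimal sketch of such a model, using the lifelines package on an entirely hypothetical toy data set, is shown below (the study's models naturally adjusted for additional covariates).

```python
# Minimal sketch of a Cox proportional-hazards model for time to CMV disease.
# exp(coef) gives the hazard ratio, e.g. for D+/R- serostatus, analogous to
# the HRs reported in the abstract. The data frame is invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_to_event_or_censor": [120, 365, 90, 400, 30, 365, 200, 365, 60, 300],
    "cmv_disease":             [1,   0,   1,  0,   1,  0,   1,   0,   1,  0],  # 1 = CMV disease
    "dpos_rneg":               [1,   0,   1,  1,   0,  0,   1,   0,   1,  1],  # D+/R- serostatus
    "prophylaxis":             [0,   1,   0,  1,   1,  0,   0,   1,   1,  0],  # antiviral prophylaxis
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_event_or_censor", event_col="cmv_disease")
cph.print_summary()   # hazard ratios = exp(coef), with 95% confidence intervals
```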
Abstract:
Background HIV prevalence, as well as the incidence of zoonotic parasitic diseases such as cystic echinococcosis, has increased in the Kyrgyz Republic due to fundamental socio-economic changes after the breakdown of the Soviet Union. The possible impact on morbidity and mortality of Toxoplasma gondii infection, whether as congenital toxoplasmosis or as an opportunistic infection in the emerging AIDS pandemic, has not been reported from Kyrgyzstan. Methodology/Principal Findings We screened 1,061 rural and 899 urban people to determine the seroprevalence of T. gondii infection in 2 representative but epidemiologically distinct populations in Kyrgyzstan. The rural population was from a typical agricultural district where sheep husbandry is a major occupation. The urban population was selected in collaboration with several diagnostic laboratories in Bishkek, the largest city in Kyrgyzstan. We designed a questionnaire that was administered to all rural subjects so that a risk-factor analysis could be undertaken. The samples from the urban population were anonymous, and only data on age and gender were available. Estimates of putative cases of congenital and AIDS-related toxoplasmosis in the whole country were made from the results of the serology. Specific antibodies (IgG) against Triton X-100-extracted antigens of T. gondii tachyzoites from in vitro cultures were determined by ELISA. Overall seroprevalence of infection with T. gondii in people living in rural vs. urban areas was 6.2% (95% CI: 4.8–7.8) (adjusted seroprevalence based on census figures 5.1%, 95% CI 3.9–6.5) and 19.0% (95% CI: 16.5–21.7) (adjusted 16.4%, 95% CI 14.1–19.3), respectively, without significant gender-specific differences. Seroprevalence increased with age. Independently, low social status increased the risk of Toxoplasma seropositivity, while owning larger numbers of sheep decreased it. Water supply, consumption of unpasteurized milk products or undercooked meat, and cat ownership had no significant influence on the risk of seropositivity. Conclusions We present a first seroprevalence analysis of human T. gondii infection in the Kyrgyz Republic. Based on these data, we estimate that 173 (95% CI 136–216) Kyrgyz children will be born annually to mothers who seroconverted to toxoplasmosis during pregnancy. In addition, between 350 and 1,000 HIV-infected persons are currently estimated to be seropositive for toxoplasmosis. Taken together, this suggests a substantial impact of congenital and AIDS-related symptomatic toxoplasmosis on morbidity and mortality in Kyrgyzstan.
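A confidence interval for a crude seroprevalence such as those reported above can be obtained from a binomial proportion; the short sketch below (Python/statsmodels) shows the idea with back-calculated, approximate counts used purely for illustration (the adjustment to census figures is not reproduced here).

```python
# Illustration only: Wilson 95% CI for a crude seroprevalence. The positive
# counts are approximate back-calculations from the reported percentages
# (6.2% of 1,061 rural and 19.0% of 899 urban participants), not raw study data.
from statsmodels.stats.proportion import proportion_confint

samples = {"rural": (66, 1061), "urban": (171, 899)}   # (seropositive, tested)

for site, (pos, n) in samples.items():
    low, high = proportion_confint(pos, n, alpha=0.05, method="wilson")
    print(f"{site}: {pos / n:.1%} (95% CI {low:.1%}-{high:.1%})")
```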
Abstract:
Research on endocrine disruption in fish has been dominated by studies on estrogen-active compounds that act as mimics of the natural estrogen, 17β-estradiol (E2), and generally exert their biological actions by binding to and activating estrogen receptors (ERs). Estrogens play central roles in reproductive physiology and regulate (female) sexual differentiation. In line with this, most adverse effects reported for fish exposed to environmental estrogens relate to sexual differentiation and reproduction. E2, however, utilizes a variety of signaling mechanisms and has multifaceted functions and targets, so the toxicological and ecological effects of environmental estrogens in fish extend beyond those associated with reproduction. This review first describes the diversity of estrogen receptor signaling in fish, including both genomic and non-genomic mechanisms and receptor crosstalk. It then considers the range of non-reproductive physiological processes in fish known to be responsive to estrogens, including sensory systems, the brain, the immune system, growth (specifically through the growth hormone/insulin-like growth factor system) and osmoregulation. The diversity in estrogen responses between fish species is then addressed, framed within evolutionary and ecological contexts, and we assess its relevance for toxicological sensitivity as well as ecological vulnerability. The diversity of estrogen actions raises the question of whether current risk assessment strategies, which focus on reproductive endpoints and only a few model fish species, are protective of the wider potential health effects of estrogens. The available, although limited, evidence nevertheless suggests that quantitative environmental threshold concentrations for environmental protection derived from reproductive tests with model fish species are protective for non-reproductive effects as well. The diversity of estrogen actions across divergent physiological systems, however, may lead to an underestimation of impacts on fish populations, because effects are generally assessed for one functional process at a time, which may underrepresent the collective impact across physiological processes.
Abstract:
BACKGROUND: The burden of enterococcal infections has increased over the last decades, with vancomycin-resistant enterococci (VRE) being a major health problem. Solid organ transplantation is considered a risk factor. However, little is known about the relevance of enterococci in solid organ transplant recipients in areas with a low VRE prevalence. METHODS: We examined the epidemiology of enterococcal events in patients followed in the Swiss Transplant Cohort Study between May 2008 and September 2011 and analyzed risk factors for infection, aminopenicillin resistance, treatment, and outcome. RESULTS: Of the 1234 patients, 255 (20.7%) suffered from 392 enterococcal events (185 [47.2%] infections, 205 [52.3%] colonizations, and 2 events with missing clinical information). Only 2 isolates were VRE. The highest infection rates were found early after liver transplantation (0.24/person-year), with Enterococcus faecium accounting for 58.6% of isolates. The highest colonization rates were documented in lung transplant recipients (0.33/person-year), with 46.5% E. faecium. Age, prophylaxis with a beta-lactam antibiotic, and liver transplantation were significantly associated with infection. Previous antibiotic treatment, intensive care unit stay, and lung transplantation were associated with aminopenicillin resistance. Only 4/205 (2%) colonization events led to an infection. Adequate treatment did not affect microbiological clearance rates. Overall mortality was 8%; no deaths were attributable to enterococcal events. CONCLUSIONS: Enterococcal colonizations and infections are frequent in transplant recipients. Progression from colonization to infection is rare; therefore, antibiotic treatment should be used restrictively in colonization. No increased mortality due to enterococcal infection was noted.
Abstract:
Over the last couple of decades, the UK has experienced a substantial increase in the incidence and geographical spread of bovine tuberculosis (TB), in particular since the epidemic of foot-and-mouth disease (FMD) in 2001. The initiation of the Randomized Badger Culling Trial (RBCT) in 1998 in south-west England provided an opportunity for an in-depth collection of questionnaire data (covering farming practices, herd management and husbandry, trading and wildlife activity) from herds having experienced a TB breakdown between 1998 and early 2006 and from randomly selected control herds, both within and outside the RBCT (the so-called TB99 and CCS2005 case-control studies). The data collated were split into four separate and comparable substudies relating to either the pre-FMD or post-FMD period, which are brought together and discussed here for the first time. The findings suggest that the risk factors associated with TB breakdowns may have changed. Higher Mycobacterium bovis prevalence in badgers following the FMD epidemic may have contributed to the identification of badger presence on a farm as a prominent TB risk factor only post-FMD. The strong emergence of contact/trading TB risk factors post-FMD suggests that the purchasing and movement of cattle, which took place to restock FMD-affected areas after 2001, may have exacerbated the TB problem. Post-FMD analyses also highlighted the potential impact of environmental factors on TB risk. Although no unique and universal solution exists to reduce the transmission of TB to and among British cattle, there is evidence to suggest that applying the broad principles of biosecurity on farms reduces the risk of infection. However, with trading remaining an important route of local and long-distance TB transmission, improvements in the detection of infected animals during pre- and post-movement testing should further reduce the geographical spread of the disease.
Abstract:
Background. Although tenofovir (TDF) use has increased as part of first-line antiretroviral therapy (ART) across sub-Saharan Africa, renal outcomes among patients receiving TDF remain poorly understood. We assessed changes in renal function and mortality in patients starting TDF- or non-TDF-containing ART in Lusaka, Zambia. Methods. We included patients aged ≥16 years who started ART from 2007 onward with documented baseline weight and serum creatinine. Renal dysfunction was categorized as mild (eGFR 60-89 mL/min), moderate (30-59 mL/min) or severe (<30 mL/min) using the CKD-EPI formula. Differences in eGFR during ART were analyzed using linear mixed-effect models, the odds of developing a moderate or severe eGFR decrease using logistic regression, and mortality using competing-risk regression. Results. We included 62,230 adults, of whom 38,716 (62%) initiated a TDF-based regimen. The proportion with moderate or severe renal dysfunction at baseline was lower in the TDF group than in the non-TDF group (1.9% vs. 4.0%). Among patients with no or mild renal dysfunction, those on TDF were more likely to develop a moderate (adjusted OR: 3.11; 95% CI: 2.52-3.87) or severe eGFR decrease (adjusted OR: 2.43; 95% CI: 1.80-3.28), although the incidence of such episodes was low. Among patients with moderate or severe renal dysfunction at baseline, renal function improved independently of ART regimen, and mortality was similar in both treatment groups. Conclusions. TDF use did not attenuate renal function recovery or increase mortality in patients with renal dysfunction. Further studies are needed to determine the role of routine renal function monitoring before and during ART use in Africa.
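For readers unfamiliar with the CKD-EPI formula used above, the sketch below implements the 2009 CKD-EPI creatinine equation together with the eGFR categories from this abstract; whether the study applied the race coefficient or a different equation variant is not stated in the abstract, so that detail is an assumption.

```python
# Illustrative sketch of the 2009 CKD-EPI creatinine equation and the eGFR
# categories used in the abstract (mild 60-89, moderate 30-59, severe <30
# mL/min/1.73 m^2). The study's exact implementation is not described here.

def ckd_epi_egfr(scr_mg_dl: float, age: int, female: bool, black: bool = False) -> float:
    """Estimated GFR (mL/min/1.73 m^2), 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

def renal_category(egfr: float) -> str:
    if egfr >= 90:
        return "normal"
    if egfr >= 60:
        return "mild"
    if egfr >= 30:
        return "moderate"
    return "severe"

# Example (hypothetical patient): 45-year-old woman, serum creatinine 1.4 mg/dL
egfr = ckd_epi_egfr(1.4, 45, female=True)
print(round(egfr, 1), renal_category(egfr))
```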
Abstract:
BACKGROUND Empirical research has illustrated an association between study size and relative treatment effects, but conclusions about the association of study size with risk of bias items have been inconsistent. Small studies generally give imprecisely estimated treatment effects, and study variance can serve as a surrogate for study size. METHODS We conducted a network meta-epidemiological study analyzing 32 networks comprising 613 randomized controlled trials, and used Bayesian network meta-analysis and meta-regression models to evaluate the impact of trial characteristics and study variance on the results of network meta-analysis. We examined changes in relative effects and in between-studies variation in network meta-regression models as a function of the variance of the observed effect size and of indicators for the adequacy of each risk of bias item. Adjustment was performed both within and across networks, allowing for between-networks variability. RESULTS Imprecise studies with large variances tended to exaggerate the effects of the active or new intervention in the majority of networks, with a ratio of odds ratios of 1.83 (95% CI: 1.09-3.32). Inappropriate or unclear conduct of random sequence generation and allocation concealment, as well as lack of blinding of patients and outcome assessors, did not materially affect the summary results. Imprecise studies also appeared to be more prone to inadequate conduct. CONCLUSIONS Compared to more precise studies, studies with large variance may give substantially different answers that alter the results of network meta-analyses for dichotomous outcomes.
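A greatly simplified, pairwise analogue of the idea tested above is a meta-regression of each trial's log odds ratio on its variance (the surrogate for study size): a positive slope indicates that imprecise trials report larger effects. The sketch below uses invented trial data and does not reproduce the paper's Bayesian network meta-regression across 32 networks.

```python
# Simplified small-study-effects check: weighted regression of log OR on its
# variance, weighting by inverse variance. Trial data below are invented.
import numpy as np
import statsmodels.api as sm

log_or = np.array([0.10, 0.25, 0.05, 0.60, 0.90, 0.15])   # hypothetical trials
var    = np.array([0.02, 0.05, 0.01, 0.30, 0.45, 0.03])   # variance of log OR

X = sm.add_constant(var)
fit = sm.WLS(log_or, X, weights=1.0 / var).fit()
print(f"slope on variance: {fit.params[1]:.2f}")   # > 0 suggests small-study effects
```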
Abstract:
This longitudinal study investigated whether cybervictimisation is an additional risk factor for depressive symptoms, over and above traditional victimisation, in adolescents. Furthermore, it explored whether certain coping strategies moderate the impact of cybervictimisation on depressive symptoms. A total of 765 Swiss seventh graders (mean age at time-point 1 (t1) = 13.18 years) reported twice within six months on the frequency of traditional victimisation, cybervictimisation, and depressive symptoms. At time-point 2 (t2), students also completed a questionnaire on coping strategies in response to a hypothetical cyberbullying scenario. Analyses showed that both traditional and cybervictimisation were associated with higher levels of depressive symptoms. Cybervictimisation also predicted increases in depressive symptoms over time. Regarding coping strategies, helpless reactions were positively associated with depressive symptoms. Moreover, support seeking from peers and family showed a significant buffering effect: cybervictims who recommended seeking close support showed lower levels of depressive symptoms at t2. In contrast, cybervictims recommending assertive coping strategies showed higher levels of depressive symptoms at t2.
Abstract:
Objectives: HIV 'treatment as prevention' (TasP) describes early treatment of HIV-infected patients intended to reduce viral load and transmission. Crucial assumptions for estimating TasP's effectiveness are the underlying estimates of transmission risk. We aimed to determine the transmission risk during primary infection and the relation of HIV transmission risk to viral load. Design: A systematic review and meta-analysis. Methods: We searched the PubMed and Embase databases for studies that established a relationship between viral load and transmission risk, or between primary infection and transmission risk, in serodiscordant couples. We analysed assumptions about the relationship between viral load and transmission risk, and between the duration of primary infection and transmission risk. Results: We found 36 eligible articles, based on six different study populations. Studies consistently found that larger viral loads lead to higher HIV transmission rates, but assumptions about the shape of this increase varied from exponential increase to saturation. The assumed duration of primary infection ranged from 1.5 to 12 months; for each additional month, the log10 transmission rate ratio between primary and asymptomatic infection decreased by 0.40. Conclusion: Assumptions and estimates of the relationship between viral load and transmission risk, and of the relationship between primary infection and transmission risk, vary substantially, and predictions of TasP's effectiveness should take this uncertainty into account.
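The reported slope can be read as a scaling rule: each extra month of assumed primary-infection duration divides the primary-to-asymptomatic rate ratio by 10^0.40 ≈ 2.5. The short snippet below illustrates this scaling with a hypothetical baseline rate ratio (the value 26 is invented for illustration, not taken from the review).

```python
# Worked illustration of the reported meta-regression slope: each additional
# month of assumed primary-infection duration lowers log10(rate ratio) by 0.40.
rr_at_3_months = 26.0   # hypothetical rate ratio at a 3-month assumed duration
for months in range(3, 9):
    rr = rr_at_3_months / 10 ** (0.40 * (months - 3))
    print(f"{months} months: RR ≈ {rr:.1f}")
```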
Abstract:
Working memory is crucial for meeting the challenges of daily life and performing academic tasks, such as reading or arithmetic. Very preterm born children are at risk of low working memory capacity. The aim of this study was to examine the visuospatial working memory network of school-aged preterm children and to determine the effect of age and performance on the neural working memory network. Working memory was assessed in 41 very preterm born children and 36 term born controls (aged 7–12 years) using functional magnetic resonance imaging (fMRI) and neuropsychological assessment. While preterm children and controls showed equal working memory performance, preterm children showed less involvement of the right middle frontal gyrus, but higher fMRI activation in superior frontal regions than controls. The younger and low-performing preterm children presented an atypical working memory network whereas the older high-performing preterm children recruited a working memory network similar to the controls. Results suggest that younger and low-performing preterm children show signs of less neural efficiency in frontal brain areas. With increasing age and performance, compensational mechanisms seem to occur, so that in preterm children, the typical visuospatial working memory network is established by the age of 12 years.
Abstract:
Background and Purpose—The question of whether cerebral microbleeds (CMBs) visible on MRI in acute stroke increase the risk of intracerebral hemorrhage (ICH) or worsen outcome after thrombolysis is unresolved. The aim of this study was to analyze the impact of CMBs detected with pretreatment susceptibility-weighted MRI on ICH occurrence and outcome. Methods—From 2010 to 2013 we treated 724 patients with intravenous thrombolysis, endovascular therapy, or intravenous thrombolysis followed by endovascular therapy. A total of 392 of the 724 patients were examined with susceptibility-weighted MRI before treatment. CMBs were rated retrospectively. Multivariable regression analysis was used to determine the impact of CMBs on ICH and outcome. Results—Of the 392 patients, 174 were treated with intravenous thrombolysis, 150 with endovascular therapy, and 68 with intravenous thrombolysis followed by endovascular therapy. CMBs were detected in 79 (20.2%) patients. Symptomatic ICH occurred in 21 (5.4%) patients and asymptomatic ICH in 75 (19.1%) patients; of these bleedings, 61 (15.6%) were within and 35 (8.9%) outside the infarct. Neither the presence of CMBs nor their burden, predominant location, or presumed pathogenesis influenced the risk of symptomatic or asymptomatic ICH. A higher CMB burden marginally increased the risk of ICH outside the infarct (P=0.048; odds ratio, 1.004; 95% confidence interval, 1.000–1.008). Conclusions—CMBs detected on pretreatment susceptibility-weighted MRI did not increase the risk of ICH or worsen outcome, even when CMB burden, predominant location, or presumed pathogenesis was considered. There was only a small increase in the risk of ICH outside the infarct with increasing CMB burden, which does not advise against thrombolysis in such patients.
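As a schematic of the multivariable analysis mentioned above, the sketch below fits a logistic regression of ICH occurrence on CMB count with statsmodels, where exp(coefficient) is the odds ratio per additional CMB; the data are randomly generated and the clinical covariates the study adjusted for are omitted.

```python
# Schematic only: logistic regression of ICH on CMB burden. exp(coefficient)
# is the odds ratio per additional CMB; all rows below are randomly generated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "cmb_count": rng.integers(0, 15, size=200),   # hypothetical CMB burden
    "ich":       rng.integers(0, 2, size=200),    # hypothetical ICH outcome (0/1)
})

X = sm.add_constant(df[["cmb_count"]])
fit = sm.Logit(df["ich"], X).fit(disp=False)
print(f"OR per additional CMB: {np.exp(fit.params['cmb_count']):.3f}")
```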
Abstract:
BACKGROUND The extent of hypoperfusion is an important prognostic factor in acute ischemic stroke. Previous studies have postulated that the extent of prominent cortical veins (PCV) on susceptibility-weighted imaging (SWI) reflects the extent of hypoperfusion. Our aim was to investigate whether there is an association between PCV and the grade of leptomeningeal arterial collateralization in acute ischemic stroke. In addition, we analyzed the correlation between SWI and perfusion-MRI findings. METHODS 33 patients with acute ischemic stroke due to a thromboembolic M1-segment occlusion underwent MRI followed by digital subtraction angiography (DSA) and were subdivided into two groups, with very good to good and moderate to no leptomeningeal collaterals, according to the DSA. The extent of PCV on SWI, of diffusion restriction (DR) on diffusion-weighted imaging (DWI), and of prolonged mean transit time (MTT) on perfusion imaging were graded according to the Alberta Stroke Program Early CT Score (ASPECTS). The National Institutes of Health Stroke Scale (NIHSS) scores at admission and the time between symptom onset and MRI were documented. RESULTS 20 patients showed very good to good and 13 patients poor to no collateralization. The PCV-ASPECTS was significantly higher for cases with good leptomeningeal collaterals than for those with poor leptomeningeal collaterals (mean 4.1 versus 2.69; p=0.039). The MTT-ASPECTS was significantly lower than the PCV-ASPECTS in all 33 patients (mean 1.0 versus 3.5; p<0.00). CONCLUSIONS In our small study, the grade of leptomeningeal collateralization correlated with the extent of PCV on SWI in acute ischemic stroke, owing to the deoxyhemoglobin to oxyhemoglobin ratio. Consequently, extensive PCV correlate with poor leptomeningeal collateralization, while less pronounced PCV correlate with good leptomeningeal collateralization. Furthermore, SWI is a very helpful tool for detecting tissue at risk but cannot replace PWI, since MTT detects significantly more ill-perfused areas than SWI, especially in well-collateralized subjects.