871 results for Risk of forest inventory


Relevance:

100.00%

Publisher:

Abstract:

Subclinical thyroid dysfunction has been associated with coronary heart disease, but the risk of stroke is unclear. Our aim was to combine the evidence on the association between subclinical thyroid dysfunction and the risk of stroke in prospective cohort studies. We searched Medline (OvidSP), Embase, Web of Science, PubMed Publisher, Cochrane and Google Scholar from inception to November 2013 using a cohort filter, but without language restriction or other limitations. Reference lists of articles were also searched. Two independent reviewers screened articles according to pre-specified criteria and selected prospective cohort studies with baseline thyroid function measurements and assessment of stroke outcomes. Data were derived using a standardized data extraction form. Quality was assessed according to previously defined quality indicators by two independent reviewers. We pooled the outcomes using a random-effects model. Of 2,274 articles screened, six cohort studies, including 11,309 participants with 665 stroke events, met the criteria. Four of the six studies provided information on subclinical hyperthyroidism (6,029 participants in total) and five on subclinical hypothyroidism (n = 10,118). The pooled hazard ratio (HR) was 1.08 (95% CI 0.87-1.34) for subclinical hypothyroidism (I² = 0%) and 1.17 (95% CI 0.54-2.56) for subclinical hyperthyroidism (I² = 67%) compared with euthyroidism. Subgroup analyses yielded similar results. Our systematic review provides no evidence of an increased risk of stroke associated with subclinical thyroid dysfunction. However, the available literature is limited and larger datasets are needed for extended analyses. There were also too few events to exclude a clinically significant risk from subclinical hyperthyroidism, and more data are required for subgroup analyses.
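The pooled estimates above come from a random-effects model. As a sketch of how that pooling step works, the following implements DerSimonian-Laird pooling of log hazard ratios (illustrative only; the study inputs below are hypothetical and the authors' actual software is not stated):

```python
import math

def pool_random_effects(estimates):
    """DerSimonian-Laird random-effects pooling of hazard ratios.

    estimates: list of (hr, ci_low, ci_high) tuples, one per study.
    Returns the pooled HR, its 95% CI, and the I^2 heterogeneity statistic.
    """
    # Work on the log scale; standard error from the 95% CI width.
    y = [math.log(hr) for hr, lo, hi in estimates]
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for hr, lo, hi in estimates]
    w = [1 / s**2 for s in se]                      # fixed-effect weights

    # Cochran's Q and the between-study variance tau^2.
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed)**2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights incorporate tau^2.
    w_re = [1 / (s**2 + tau2) for s in se]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

    hr = math.exp(y_re)
    ci = (math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re))
    return hr, ci, i2
```

With the per-study hazard ratios and confidence intervals as input, this returns the pooled HR, its 95% CI and I², the quantities reported in the abstract.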


BACKGROUND Early identification of patients at risk of developing persistent low back pain (LBP) is crucial. OBJECTIVE The aim of this study was to identify, in patients with a new episode of LBP, the time point at which those at risk of developing persistent LBP can best be identified. METHODS Prospective cohort study of 315 patients presenting to a health practitioner with a first episode of acute LBP. The primary outcome measure was functional limitation. Patients were assessed at baseline, three, six and twelve weeks, and six months, with factors of maladaptive cognition examined as potential predictors. Multivariate logistic regression analysis was performed for all time points. RESULTS The best time point at which to predict the development of persistent LBP at six months was the twelve-week follow-up (sensitivity 78%; overall predictive value 90%). Cognitions assessed at the first visit to a health practitioner were not predictive. CONCLUSIONS Maladaptive cognitions at twelve weeks appear to be suitable predictors of a transition from acute to persistent LBP. As early as three weeks after patients present to a health practitioner with acute LBP, cognitions might influence the development of persistent LBP. Cognitive-behavioural interventions should therefore be considered as early adjuvant LBP treatment in patients at risk of developing persistent LBP.
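The reported sensitivity (78%) and overall predictive value (90%) are standard 2x2-table quantities. A minimal sketch, with made-up counts chosen only to illustrate the arithmetic (not the study's actual confusion table):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and overall accuracy from a 2x2 table of
    predicted vs. observed persistent LBP."""
    sensitivity = tp / (tp + fn)                  # flagged cases / all true cases
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)    # "overall predictive value"
    return sensitivity, specificity, accuracy

# Hypothetical counts (not from the study) that happen to reproduce
# 78% sensitivity and 90% overall accuracy:
sens, spec, acc = screening_metrics(tp=78, fp=10, fn=22, tn=210)
```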


OBJECTIVES To evaluate the impact of preoperative sepsis on the risk of postoperative arterial and venous thromboses. DESIGN Prospective cohort study using the National Surgical Quality Improvement Program database of the American College of Surgeons (ACS-NSQIP). SETTING Inpatient and outpatient procedures in 374 hospitals of all types across the United States, 2005-2012. PARTICIPANTS 2,305,380 adults who underwent surgical procedures. MAIN OUTCOME MEASURES Arterial thrombosis (myocardial infarction or stroke) and venous thrombosis (deep venous thrombosis or pulmonary embolism) in the 30 days after surgery. RESULTS Among all surgical procedures, patients with preoperative systemic inflammatory response syndrome or any sepsis had three times the odds of an arterial or venous postoperative thrombosis (odds ratio 3.1, 95% confidence interval 3.0 to 3.1). The adjusted odds ratios were 2.7 (2.5 to 2.8) for arterial thrombosis and 3.3 (3.2 to 3.4) for venous thrombosis. The adjusted odds ratios for thrombosis were 2.5 (2.4 to 2.6) in patients with systemic inflammatory response syndrome, 3.3 (3.1 to 3.4) in patients with sepsis, and 5.7 (5.4 to 6.1) in patients with severe sepsis, compared with patients without any systemic inflammation. In patients with preoperative sepsis, both emergency and elective surgical procedures carried a twofold increase in the odds of thrombosis. CONCLUSIONS Preoperative sepsis is an important independent risk factor for both arterial and venous thromboses. The risk of thrombosis increases with the severity of the inflammatory response and is elevated in both emergency and elective surgical procedures. Suspicion of thrombosis should be higher in patients with sepsis who undergo surgery.


PURPOSE Rapid assessment and intervention are important for the prognosis of acutely ill patients admitted to the emergency department (ED). The aim of this study was to prospectively develop and validate a model predicting the risk of in-hospital death based on all information available at the time of ED admission, and to compare its discriminative performance with the non-systematic risk estimate of the triaging first health-care provider. METHODS Prospective cohort analysis based on a multivariable logistic regression for the probability of death. RESULTS A total of 8,607 consecutive admissions of 7,680 patients admitted to the ED of a tertiary care hospital were analysed. The most frequent APACHE II diagnostic categories at the time of admission were neurological (2,052; 24%), trauma (1,522; 18%), infection [1,328; 15%; including sepsis (357; 4.1%), severe sepsis (249; 2.9%) and septic shock (27; 0.3%)], cardiovascular (1,022; 12%), gastrointestinal (848; 10%) and respiratory (449; 5%). The predictors in the final model were age, prolonged capillary refill time, blood pressure, mechanical ventilation, oxygen saturation index, Glasgow coma score and APACHE II diagnostic category. The model showed good discriminative ability, with an area under the receiver operating characteristic curve of 0.92, and good internal validity. The model performed significantly better than non-systematic triaging of the patient. CONCLUSIONS The prediction model can facilitate the identification of ED patients at higher mortality risk. It performs better than a non-systematic assessment and may enable more rapid identification and treatment of patients at risk of an unfavourable outcome.
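The discriminative ability reported above (area under the ROC curve of 0.92) has a simple rank-based interpretation, independent of the authors' model. A small sketch:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve computed as the Mann-Whitney statistic:
    the probability that a randomly chosen positive case (death) receives
    a higher predicted risk than a randomly chosen negative case
    (survivor), counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.92 thus means the model ranks a patient who died above a survivor 92% of the time.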


BACKGROUND & AIMS Subtle inter-patient genetic variation and environmental factors combine to determine disease progression in non-alcoholic fatty liver disease (NAFLD). Carriage of the PNPLA3 rs738409 c.444C>G minor allele (encoding the I148M variant) has been robustly associated with advanced NAFLD. Although most hepatocellular carcinoma (HCC) is related to chronic viral hepatitis or alcoholic liver disease, the incidence of NAFLD-related HCC is increasing. We examined whether rs738409 C>G was associated with HCC risk in patients with NAFLD. METHODS PNPLA3 rs738409 genotype was determined by allelic discrimination in 100 European Caucasians with NAFLD-related HCC and 275 controls with histologically characterised NAFLD. RESULTS Genotype frequencies differed significantly between NAFLD-HCC cases (CC=28, CG=43, GG=29) and NAFLD controls (CC=125, CG=117, GG=33) (p=0.0001). In multivariate analysis adjusted for age, gender, diabetes, BMI and presence of cirrhosis, carriage of each copy of the rs738409 minor (G) allele conferred an additive risk for HCC (adjusted OR 2.26 [95% CI 1.23-4.14], p=0.0082), with GG homozygotes exhibiting a 5-fold increased risk over CC ([95% CI 1.47-17.29], p=0.01). When compared with the UK general population (1958 British Birth Cohort, n=1,476), the risk effect was more pronounced (GC vs. CC: unadjusted OR 2.52 [1.55-4.10], p=0.0002; GG vs. CC: OR 12.19 [6.89-21.58], p<0.0001). CONCLUSIONS Carriage of the PNPLA3 rs738409 C>G polymorphism is associated not only with greater risk of progressive steatohepatitis and fibrosis but also with HCC. If validated, these findings suggest that PNPLA3 genotyping could contribute to multi-factorial patient risk stratification, identifying those to whom HCC surveillance may be targeted.
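For illustration, a crude unadjusted within-cohort odds ratio can be read straight off the genotype counts reported above (this is not the adjusted OR from the paper's multivariate model):

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    return (a * d) / (b * c)

# GG vs. CC within the cohort, using the genotype counts reported above
# (cases: GG=29, CC=28; NAFLD controls: GG=33, CC=125):
or_gg_vs_cc = odds_ratio(29, 28, 33, 125)  # (29*125)/(28*33), about 3.9
```

The adjusted estimate in the paper (a 5-fold increase for GG vs. CC) differs because it conditions on age, gender, diabetes, BMI and cirrhosis.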


BACKGROUND & AIMS Pegylated interferon-based treatment is still the backbone of current hepatitis C therapy and is associated with bone marrow suppression and an increased risk of infections. The aim of this retrospective cohort study was to assess the risk of infections during interferon-based treatment among patients with chronic HCV infection and advanced hepatic fibrosis, and its relation to treatment-induced neutropenia. METHODS This cohort study included all consecutive patients with chronic HCV infection and biopsy-proven bridging fibrosis or cirrhosis (Ishak 4-6) who started treatment between 1990 and 2003 in five large hepatology units in Europe and Canada. Neutrophil counts of 500-749/μL were classified as moderate neutropenia and counts below 500/μL as severe neutropenia. RESULTS This study included 723 interferon-based treatments administered to 490 patients. In total, 113 infections were reported during 88 (12%) treatments, of which 24 (21%) were considered severe. Only one patient was found to have moderate neutropenia and three patients severe neutropenia at the visit before the infection. Three hundred and twelve (99.7%) visits with moderate neutropenia and 44 (93.6%) visits with severe neutropenia were not followed by an infection. Multivariable analysis showed that cirrhosis (OR 2.85, 95% CI 1.38-5.90, p=0.005) and severe neutropenia at the previous visit (OR 5.42, 95% CI 1.34-22.0, p=0.018) were associated with the occurrence of infection, while moderate neutropenia was not. In the subgroup of patients treated with PegIFN, severe neutropenia was not significantly associated with infection (OR 1.63, 95% CI 0.19-14.2, p=0.660). CONCLUSIONS In this large cohort of patients with bridging fibrosis and cirrhosis, infections during interferon-based therapy were generally mild. Severe interferon-induced neutropenia occurred rarely but was associated with on-treatment infection. Moderate neutropenia was not associated with infection, suggesting that current dose-reduction guidelines might be too strict.
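The neutropenia cut-offs used in the study translate directly into code; a trivial sketch:

```python
def neutropenia_grade(neutrophils_per_ul):
    """Grade an absolute neutrophil count (cells/μL) using the study's
    cut-offs: 500-749/μL is moderate neutropenia, below 500/μL severe."""
    if neutrophils_per_ul < 500:
        return "severe"
    if neutrophils_per_ul < 750:
        return "moderate"
    return "none"
```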


BACKGROUND Patients with HIV exposed to the antiretroviral drug abacavir may have an increased risk of cardiovascular disease (CVD). There is concern that this association arises from channelling bias. Even if exposure is a risk, it is not clear how that risk changes as exposure accumulates. METHODS We assessed the effect of exposure to abacavir on the risk of CVD events in the Swiss HIV Cohort Study. We used a new marginal structural Cox model to estimate the effect of abacavir as a flexible function of past exposures while accounting for risk factors that potentially lie on a causal pathway between exposure to abacavir and CVD. RESULTS 11,856 patients were followed for a median of 6.6 years; 365 patients had a CVD event (4.6 events per 1,000 patient-years). In a conventional Cox model, recent, but not cumulative, exposure to abacavir increased the risk of a CVD event. In the new marginal structural Cox model, continued exposure to abacavir during the past four years increased the risk of a CVD event (hazard ratio 2.06, 95% confidence interval 1.43-2.98). The estimated function for the effect of past exposures suggests that exposure during the past 6 to 36 months caused the greatest increase in risk. CONCLUSIONS Abacavir increases the risk of a CVD event: the effect of exposure is not immediate; rather, the risk increases as exposure accumulates over the past few years. This gradual increase in risk is not consistent with a rapidly acting mechanism such as acute inflammation.
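The weighted-cumulative-exposure idea behind "a flexible function of past exposures" can be sketched in toy form. The weight function below is a made-up stand-in (the paper estimates the actual exposure-response function from the data); it merely mimics the finding that exposure 6-36 months ago mattered most:

```python
def weighted_cumulative_exposure(exposure_by_month, weight):
    """Sum past monthly exposure (1 = on abacavir that month, 0 = off),
    each month scaled by a weight depending on how long ago it occurred.

    exposure_by_month is ordered from the most recent month backwards."""
    return sum(weight(months_ago) * exposed
               for months_ago, exposed in enumerate(exposure_by_month))

def example_weight(months_ago):
    """Hypothetical illustrative weight: full weight for exposure
    6-36 months ago, half weight otherwise."""
    return 1.0 if 6 <= months_ago < 36 else 0.5
```

In a marginal structural model this summary would enter the Cox regression as a time-updated covariate, with inverse-probability weights handling the mediating risk factors.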


Epixylic bryophytes are important components of forest vegetation but are currently endangered by increasing wood harvest and intensive forest management. In this paper we present a study of the relationship between forest management, deadwood abundance, deadwood attributes and the species richness of epixylic bryophytes on 30 plots comprising three forest types (managed coniferous, managed deciduous and unmanaged deciduous forests) in three regions of Germany. Additionally, we analyzed the relationships between deadwood attributes (wood species, decay, deadwood type, size) and the bryophytes on individual deadwood items (n = 799) and calculated species interaction networks of wood species and bryophytes. Overall, the species richness of epixylic bryophytes was positively related to deadwood abundance and diversity. Mean deadwood abundance was lowest in unmanaged forests (9.7 m³ ha⁻¹), compared with 15.0 m³ ha⁻¹ in managed deciduous and 25.1 m³ ha⁻¹ in managed coniferous forests. Accordingly, epixylic bryophyte species richness per plot increased from 7 species per 400 m² in unmanaged forests to 10 in managed deciduous and 16 in managed coniferous forests. The interaction network provided evidence of the importance of tree-species diversity for bryophyte diversity and of the relevance of particular wood species for rare bryophytes.

Generally, the results demonstrate a considerable lack of deadwood in all forest types, even in unmanaged forests. The species richness of epixylic bryophytes was strongly limited by the available substrate within the observed range of deadwood abundance, which reached only 60 m³ ha⁻¹. Altogether, this suggests a pressing need to increase both the abundance and the diversity of deadwood in forests.


Childhood leukaemia (CL) may have an infectious cause, and population mixing may therefore increase the risk of CL. We aimed to determine whether CL was associated with population mixing in Switzerland. We followed children aged <16 years in the Swiss National Cohort 1990-2008 and linked CL cases from the Swiss Childhood Cancer Registry to the cohort. We calculated adjusted hazard ratios (HRs) for all CL, CL at age <5 years and acute lymphoblastic leukaemia (ALL) for three measures of population mixing (population growth, in-migration and diversity of origin), stratified by degree of urbanisation. Measures of population mixing were calculated for all municipalities for the 5-year period preceding the 1990 and 2000 censuses. Analyses were based on 2,128,012 children, of whom 536 developed CL. HRs comparing the highest with the lowest quintile of population growth were 1.11 [95% confidence interval (CI) 0.65-1.89] in rural and 0.59 (95% CI 0.43-0.81) in urban municipalities (interaction: p = 0.271). Results were similar for ALL and for CL at age <5 years. For the level of in-migration, there was evidence of an inverse association with ALL: HRs comparing the highest with the lowest quintile were 0.60 (95% CI 0.41-0.87) in urban and 0.61 (95% CI 0.30-1.21) in rural settings. There was little evidence of an association with diversity of origin. This nationwide cohort study of the association between CL and population growth, in-migration and diversity of origin provides little support for the population mixing hypothesis.


We developed a model to calculate a quantitative risk score for individual aquaculture sites. The score indicates the risk of a site being infected with a specific fish pathogen (viral haemorrhagic septicaemia virus (VHSV), infectious haematopoietic necrosis virus, or koi herpesvirus) and is intended to be used for risk-ranking sites to support surveillance for the demonstration of zone or member-state freedom from these pathogens. The inputs to the model include a range of quantitative and qualitative estimates of risk factors organised into five risk themes: (1) live fish and egg movements; (2) exposure via water; (3) on-site processing; (4) short-distance mechanical transmission; (5) distance-independent mechanical transmission. The calculated risk score for an individual aquaculture site is a value between zero and one and is intended to indicate the risk of a site relative to that of other sites (thereby allowing ranking). The model was applied to evaluate 76 rainbow trout farms in three countries (42 from England, 32 from Italy and 2 from Switzerland) with the aim of establishing their risk of being infected with VHSV. Risk scores for farms in England and Italy showed great variation, clearly enabling ranking. Scores ranged from 0.002 to 0.254 (mean 0.080) in England and from 0.011 to 0.778 (mean 0.130) in Italy, reflecting the diversity of the infection status of farms in these countries. Requirements for broader application of the model are discussed. Cost-efficient collection of farm data is important to realise the benefits of a risk-based approach.
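The aggregation of the five risk themes into a single 0-1 score might be sketched as a weighted average; the theme weights below are hypothetical placeholders, not the published model's values:

```python
# Hypothetical theme weights for illustration; the published model's actual
# parameterisation is not reproduced here.
THEME_WEIGHTS = {
    "live_fish_and_egg_movements": 0.35,
    "exposure_via_water": 0.30,
    "on_site_processing": 0.10,
    "short_distance_mechanical": 0.15,
    "distance_independent_mechanical": 0.10,
}

def site_risk_score(theme_scores):
    """Combine per-theme risk estimates (each in [0, 1]) into a single
    score in [0, 1], used only for ranking sites relative to each other."""
    assert set(theme_scores) == set(THEME_WEIGHTS)
    return sum(THEME_WEIGHTS[t] * s for t, s in theme_scores.items())
```

Because the weights sum to one and each theme score lies in [0, 1], the combined score stays in [0, 1], matching the model's stated output range.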


BACKGROUND Temporary increases in plasma HIV RNA ('blips') are common in HIV patients on combination antiretroviral therapy (cART). Blips above 500 copies/mL have been associated with subsequent viral rebound. It is not clear whether this relationship still holds when measurements are made using newer, more sensitive assays. METHODS We selected antiretroviral-naive patients who then recorded one or more episodes of viral suppression on cART with HIV RNA measurements made using more sensitive assays (lower limit of detection below 50 copies/mL). We estimated the association in these episodes between blip magnitude and the time to viral rebound. RESULTS 4,094 patients recorded a first episode of viral suppression on cART with the more sensitive assays; 1,672 patients recorded at least one subsequent suppression episode. Most suppression episodes (87%) were recorded with TaqMan version 1 or 2 assays. Of the 2,035 blips recorded, 84%, 12% and 4% were of low (50-199 copies/mL), medium (200-499 copies/mL) and high (500-999 copies/mL) magnitude, respectively. The risk of viral rebound increased with blip magnitude, with hazard ratios of 1.20 (95% CI 0.89-1.61), 1.42 (95% CI 0.96-2.19) and 1.93 (95% CI 1.24-3.01) for low-, medium- and high-magnitude blips respectively, corresponding to a hazard ratio of 1.09 (95% CI 1.03-1.15) per 100 copies/mL of HIV RNA. CONCLUSIONS With the more sensitive assays now commonly used for monitoring patients, blips above 200 copies/mL are increasingly likely to lead to viral rebound and should prompt a discussion about adherence.
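The per-100-copies hazard ratio of 1.09 implies, under an assumed log-linear (multiplicative) dose-response, a simple conversion from blip magnitude to rebound risk:

```python
def blip_hazard_ratio(blip_copies_per_ml, hr_per_100=1.09):
    """Hazard ratio of viral rebound implied by a blip of a given
    magnitude, assuming the reported per-100-copies/mL hazard ratio
    applies multiplicatively (a log-linear dose-response assumption)."""
    return hr_per_100 ** (blip_copies_per_ml / 100)
```

For example, a 500 copies/mL blip would imply roughly 1.09^5, about 1.54, versus the 1.93 estimated directly for high-magnitude blips; the gap is a reminder that the log-linear assumption is only an approximation.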