178 results for Risk of forest inventory
Abstract:
Forest management is known to influence the species diversity of various taxa, but inconsistent or even contrasting effects are reported for arthropods. Regional differences in management, as well as differences in regional species pools, might be responsible for these inconsistencies, but inter-regionally replicated studies that account for regional variability are rare. We investigated the effect of forest type on the abundance, diversity, community structure and composition of two important ground-dwelling beetle families, Carabidae and Staphylinidae, in 149 forest stands distributed over three regions in Germany. In particular, we focused on recent forestry history, stand age and dominant tree species, in addition to a number of environmental descriptors. Overall, management effects on beetle communities were small and mainly mediated by structural habitat parameters such as forest canopy cover or the plant diversity of forest stands. The general response of both beetle taxa to forest management was similar in all regions: abundance and species richness of beetles were higher in older than in younger stands, and species richness was lower in unmanaged than in managed stands. The abundance ratio of forest species to open-habitat species differed between regions, but generally increased from young to old stands, from coniferous to deciduous stands and from managed to unmanaged stands. The response of both beetle families to dominant tree species varied among regions, and staphylinid richness varied in its response to recent forestry history. Our results suggest that current forest management practices change the composition of ground-dwelling beetle communities mainly by favoring generalists and open-habitat species. To protect important forest beetle communities, and thus the ecosystem functions and services they provide, we suggest sheltering the remaining ancient forests and developing near-to-nature management strategies that prolong rotation periods and increase the structural diversity of managed forests. Possible geographic variation in the response of beetle communities needs to be considered in conservation-oriented forest management strategies.
Abstract:
Amyotrophic lateral sclerosis (ALS) has been associated with exposures in so-called 'electrical occupations'. It is unclear whether this possible link is explained by exposure to extremely low-frequency magnetic fields (ELF-MF) or by electrical shocks. We evaluated ALS mortality in 2000-2008 and exposure to ELF-MF and electrical shocks in the Swiss National Cohort, using job exposure matrices for occupations recorded at the 1990 and 2000 censuses. We compared 2.2 million workers with high or medium vs. low exposure to ELF-MF and electrical shocks using Cox proportional hazards models. Mortality from ALS was higher in people who had medium or high ELF-MF exposure in both censuses (HR 1.55, 95% CI 1.11-2.15), but closer to unity for electrical shocks (HR 1.17, 95% CI 0.83-1.65). When both exposures were included in the same model, the HR for ELF-MF changed little (HR 1.56), but the HR for electrical shocks was attenuated to 0.97. In conclusion, there was an association between exposure to ELF-MF and mortality from ALS among workers with a higher likelihood of long-term exposure.
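The exposure comparison above is a standard survival analysis. Below is a minimal, hypothetical sketch of such a mutually adjusted Cox proportional hazards model in Python with lifelines; the data file and column names (followup_years, als_death, elf_mf_high, shock_high) are assumptions for illustration, not the study's actual variables.

```python
# Illustrative Cox proportional hazards sketch (not the study's actual code).
# Assumed columns: followup_years (time), als_death (0/1 event),
# elf_mf_high and shock_high (0/1 exposure indicators from a job exposure matrix).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical analysis file

cph = CoxPHFitter()
# Mutually adjusted model: both exposures entered together, as in the abstract.
cph.fit(df[["followup_years", "als_death", "elf_mf_high", "shock_high"]],
        duration_col="followup_years", event_col="als_death")
cph.print_summary()  # hazard ratios = exp(coef) with 95% CIs
```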
Abstract:
OBJECTIVES It is still debated whether pre-existing minority drug-resistant HIV-1 variants (MVs) affect the virological outcomes of first-line NNRTI-containing ART. METHODS This Europe-wide case-control study included ART-naive subjects infected with drug-susceptible HIV-1 as revealed by population sequencing, who achieved virological suppression on first-line ART including one NNRTI. Cases experienced virological failure; controls were subjects from the same cohort whose viraemia remained suppressed at a matched time since initiation of ART. Blinded, centralized 454 pyrosequencing with parallel bioinformatic analysis in two laboratories was used to identify MVs in the 1%-25% frequency range. ORs of virological failure according to MV detection were estimated by logistic regression. RESULTS Two hundred and sixty samples (76 cases and 184 controls), mostly subtype B (73.5%), were used for the analysis. Identical MVs were detected in the two laboratories. Pre-existing MVs were harboured by 31.6% of cases and 16.8% of controls. Detection of at least one MV versus no MVs was associated with an increased risk of virological failure (OR = 2.75, 95% CI = 1.35-5.60, P = 0.005); similar associations were observed for at least one NRTI MV versus no NRTI MVs (OR = 2.27, 95% CI = 0.76-6.77, P = 0.140) and at least one NNRTI MV versus no NNRTI MVs (OR = 2.41, 95% CI = 1.12-5.18, P = 0.024). A dose-effect relationship between virological failure and mutational load was found. CONCLUSIONS Pre-existing MVs more than double the risk of virological failure on first-line NNRTI-based ART.
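The odds ratios above come from logistic regression of virological failure on minority-variant detection. A sketch of that kind of estimate using statsmodels is given below; the input file and column names (any_MV, failure) are hypothetical, and an unconditional fit is shown for simplicity even though the study used matched controls.

```python
# Sketch of a case-control odds ratio via logistic regression; column names are
# hypothetical (any_MV = 1 if at least one minority variant detected,
# failure = 1 for cases, 0 for controls).
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("mv_case_control.csv")      # hypothetical input
X = sm.add_constant(df[["any_MV"]])           # add intercept
fit = sm.Logit(df["failure"], X).fit(disp=0)

or_est = np.exp(fit.params["any_MV"])         # odds ratio = exp(coefficient)
ci_low, ci_high = np.exp(fit.conf_int().loc["any_MV"])
print(f"OR = {or_est:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```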
Abstract:
BACKGROUND: Many studies have been conducted to define risk factors for the transmission of bovine paratuberculosis, mostly in countries with large herds. Little is known about the epidemiology in infected Swiss herds and the risk factors important for transmission in smaller herds. We therefore assessed, in 17 infected herds (10 dairy, 7 beef), the presence of known factors that might favor the spread of paratuberculosis and could be related to the animal-level prevalence of fecal shedding of Mycobacterium avium subsp. paratuberculosis. Additionally, the herd managers' level of knowledge about the disease was assessed. In a case-control study with 4 matched negative control herds per infected herd, the association of potential risk factors with the infection status of the herd was investigated. RESULTS: Exposure of the young stock to the feces of older animals was frequently observed in both infected and control herds. The farmers' knowledge about paratuberculosis was very limited, even in infected herds. An overall animal-level prevalence of fecal shedding of Mycobacterium avium subsp. paratuberculosis of 6.1% was found in infected herds; shedders younger than 2 years of age were found in 46.2% of the herds in which the young stock was available for testing. Several factors related to contamination of the heifer area with cows' feces and to the management of the calving area were significantly associated with the within-herd prevalence. Animal purchase was associated with a positive herd infection status (OR = 7.25, p = 0.004). CONCLUSIONS: Numerous risk factors favoring the spread of Mycobacterium avium subsp. paratuberculosis from adult animals to the young stock were observed in infected Swiss dairy and beef herds; these factors may be amenable to improvement in order to control the disease. Important factors were contamination of the heifer and calving areas, which was associated with higher within-herd prevalence of fecal shedding. Farmers' awareness of paratuberculosis was very low, even in infected herds. Animal purchase was significantly associated with the probability of a herd being infected and is thus the most important factor for the control of the spread of disease between farms.
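The herd-level association reported above (OR = 7.25 for animal purchase) is the kind of quantity obtained from a 2×2 case-control table. The sketch below shows that arithmetic with a Woolf-type confidence interval; the counts are placeholders chosen only to be consistent with 17 infected and 68 control herds and to land near the reported OR, not the study's actual data.

```python
# Odds ratio from a 2x2 case-control table with a Woolf (log-based) 95% CI.
# Placeholder counts, not the study's data.
import math

a, b = 13, 21   # herds with animal purchase: infected, control
c, d = 4, 47    # herds without animal purchase: infected, control

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```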
Abstract:
BACKGROUND The Cochrane risk of bias (RoB) tool has been widely embraced by the systematic review community, but several studies have reported that its reliability is low. We aim to investigate whether training of raters, including objective and standardized instructions on how to assess risk of bias, can improve the reliability of this tool. We describe the methods that will be used in this investigation and present an intensive standardized training package for risk of bias assessment that could be used by contributors to the Cochrane Collaboration and other reviewers. METHODS/DESIGN This is a pilot study. We will first perform a systematic literature review to identify randomized clinical trials (RCTs) that will be used for risk of bias assessment. Using the identified RCTs, we will then conduct a randomized experiment in which raters are allocated to two different training schemes: minimal training and intensive standardized training. We will calculate the chance-corrected weighted Kappa with 95% confidence intervals to quantify within- and between-group agreement for each domain of the risk of bias tool. To calculate between-group Kappa agreement, we will use risk of bias assessments from pairs of raters after resolution of disagreements. Between-group Kappa agreement will quantify the agreement between the risk of bias assessments of raters in the training groups and those of experienced raters. To compare the agreement of raters under different training conditions, we will calculate differences between Kappa values with 95% confidence intervals. DISCUSSION This study will investigate whether the reliability of the risk of bias tool can be improved by training raters with standardized instructions for risk of bias assessment. One group of inexperienced raters will receive intensive training on risk of bias assessment and the other will receive minimal training. By including a control group with minimal training, we will attempt to mimic what many review authors commonly have to do, that is, conduct risk of bias assessments of RCTs without much formal training or standardized instructions. If our results indicate that intensive standardized training does improve the reliability of the RoB tool, our study is likely to help improve the quality of risk of bias assessments, which are a central component of evidence synthesis.
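Since the protocol centres on chance-corrected weighted Kappa, a short illustration of how such an agreement statistic can be computed is given below. It uses scikit-learn's cohen_kappa_score with linear weights and a simple bootstrap confidence interval; the ratings, the three-level coding and the choice of weighting scheme are assumptions for illustration only, not the protocol's specification.

```python
# Chance-corrected weighted kappa between two raters' risk-of-bias judgments
# for one domain, with a simple bootstrap 95% CI. Ratings are hypothetical.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# 0 = low risk, 1 = unclear, 2 = high risk (ordinal, hence a weighted kappa)
rater_a = np.array([0, 0, 1, 2, 1, 0, 2, 1, 0, 2, 1, 1])
rater_b = np.array([0, 1, 1, 2, 1, 0, 2, 2, 0, 1, 1, 1])

kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")

rng = np.random.default_rng(0)
boot = []
for _ in range(2000):
    idx = rng.integers(0, len(rater_a), len(rater_a))
    boot.append(cohen_kappa_score(rater_a[idx], rater_b[idx], weights="linear"))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"weighted kappa = {kappa:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```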
Abstract:
Twenty-nine parent- and alkyl-polycyclic aromatic hydrocarbons (PAHs), 15 oxygenated PAHs (OPAHs), 11 nitrated PAHs (NPAHs) and 4 azaarenes (AZAs) in both the gaseous and particulate phases, as well as the particulate-bound carbon fractions (organic carbon, elemental carbon, char, and soot), were extracted and analyzed in ambient air sampled in March and September 2012 at an urban site in Xi'an, central China. The average concentrations (gaseous + particulate) of the 29 PAHs, 15 OPAHs, 11 NPAHs and 4 AZAs were 1267.0±307.5, 113.8±46.1, 11.8±4.8 and 26.5±11.8 ng m⁻³ in March and 784.7±165.1, 67.2±9.8, 9.0±1.5 and 21.6±5.1 ng m⁻³ in September, respectively. Concentrations of the 29 PAHs, 15 OPAHs and 11 NPAHs in particulates were significantly correlated with those of the carbon fractions (OC, EC, char and soot). Both absorption into the organic matter of particles and adsorption onto the particle surface were important for PAHs and OPAHs in both sampling periods, with more absorption occurring in September, whereas absorption was always the most important process for NPAHs. The total carcinogenic risk of the PAHs plus the NPAHs was higher in March. Gaseous compounds, which were not considered in most previous studies, contributed 29% and 44% of the total health risk in March and September, respectively.
Abstract:
BACKGROUND AND PURPOSE To assess the association between lesion location and the risk of aspiration, and to establish predictors of transient versus extended risk of aspiration after supratentorial ischemic stroke. METHODS Atlas-based localization analysis was performed in consecutive patients with MRI-proven first-time acute supratentorial ischemic stroke. Standardized swallowing assessment was carried out within 8±18 hours and 7.8±1.2 days after admission. RESULTS In a prospective, longitudinal analysis, 34 of 94 patients (36%) were classified as having acute risk of aspiration, which was extended (≥7 days) or transient (<7 days) in 17 cases each. There were no between-group differences in age, sex, cause of stroke, risk factors, prestroke disability, lesion side, or the degree of age-related white-matter changes. Correcting for stroke volume and National Institutes of Health Stroke Scale score in a multiple logistic regression model, significant adjusted odds ratios in favor of acute risk of aspiration were demonstrated for the internal capsule (adjusted odds ratio, 6.2; P<0.002) and the insular cortex (adjusted odds ratio, 4.8; P<0.003). In a multivariate model of extended versus transient risk of aspiration, a combined lesion of the frontal operculum and the insular cortex was the only significant independent predictor of poor recovery (adjusted odds ratio, 33.8; P<0.008). CONCLUSIONS Lesions of the insular cortex and the internal capsule are significantly associated with acute risk of aspiration after stroke. Combined ischemic infarctions of the frontal operculum and the insular cortex are likely to cause extended risk of aspiration in stroke patients, whereas the risk of aspiration tends to be transient in subcortical stroke.
Abstract:
BACKGROUND Polypharmacy, defined as the concomitant use of multiple medications, is very common in the elderly and may trigger drug-drug interactions and increase the risk of falls in patients receiving vitamin K antagonists. OBJECTIVE To examine whether polypharmacy increases the risk of bleeding in elderly patients who receive vitamin K antagonists for acute venous thromboembolism (VTE). DESIGN Prospective cohort study. PARTICIPANTS In a multicenter Swiss cohort, we studied 830 patients aged ≥65 years with VTE. MAIN MEASURES We defined polypharmacy as the prescription of more than four different drugs. We assessed the association between polypharmacy and the time to a first major bleeding and to a first clinically relevant non-major bleeding, accounting for the competing risk of death. We adjusted for known bleeding risk factors (age, gender, pulmonary embolism, active cancer, arterial hypertension, cardiac disease, cerebrovascular disease, chronic liver and renal disease, diabetes mellitus, history of major bleeding, recent surgery, anemia, thrombocytopenia) and for periods of vitamin K antagonist treatment as a time-varying covariate. KEY RESULTS Overall, 413 (49.8%) patients had polypharmacy. The mean follow-up duration was 17.8 months. Patients with polypharmacy had a significantly higher incidence of major bleeding (9.0 vs. 4.1 events/100 patient-years; incidence rate ratio [IRR] 2.18, 95% confidence interval [CI] 1.32-3.68) and of clinically relevant non-major bleeding (14.8 vs. 8.0 events/100 patient-years; IRR 1.85, 95% CI 1.27-2.71) than patients without polypharmacy. After adjustment, polypharmacy was significantly associated with major bleeding (sub-hazard ratio [SHR] 1.83, 95% CI 1.03-3.25) and with clinically relevant non-major bleeding (SHR 1.60, 95% CI 1.06-2.42). CONCLUSIONS Polypharmacy is associated with an increased risk of both major and clinically relevant non-major bleeding in elderly patients receiving vitamin K antagonists for VTE.
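The incidence rates and rate ratio quoted above follow directly from event counts and person-time. The sketch below reproduces that arithmetic with hypothetical counts chosen only so that the rates land near the reported 9.0 and 4.1 events per 100 patient-years; the Wald-type confidence interval is purely illustrative and not the method used in the study.

```python
# Incidence rates per 100 patient-years and the incidence rate ratio (IRR)
# with a Wald 95% CI. Event counts and person-time are illustrative only.
import math

events_poly, pyears_poly = 49, 545   # polypharmacy group (hypothetical)
events_none, pyears_none = 25, 610   # no-polypharmacy group (hypothetical)

rate_poly = 100 * events_poly / pyears_poly
rate_none = 100 * events_none / pyears_none
irr = rate_poly / rate_none

se_log_irr = math.sqrt(1 / events_poly + 1 / events_none)
lo = math.exp(math.log(irr) - 1.96 * se_log_irr)
hi = math.exp(math.log(irr) + 1.96 * se_log_irr)
print(f"{rate_poly:.1f} vs {rate_none:.1f} events/100 patient-years; "
      f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```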
Abstract:
OBJECTIVE Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS We conducted a prospective cohort study of 991 patients ≥65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were the time to a first major bleeding episode and the time to a first clinically relevant nonmajor bleeding episode. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and for anticoagulation as a time-varying covariate. RESULTS Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION In elderly patients who receive anticoagulants for VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason to withhold anticoagulation beyond 3 months should be decided based on patient preferences and the risk of VTE recurrence.
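The analysis above treats death as a competing event. As a minimal illustration of that idea (the descriptive Aalen-Johansen estimator, not the competing risk regression model itself), the following self-contained sketch computes the cumulative incidence of bleeding with death as a competing risk; all data in the example are hypothetical.

```python
import numpy as np

def cumulative_incidence(time, event, cause=1):
    """Aalen-Johansen cumulative incidence of `cause` (e.g. 1 = bleeding),
    treating other nonzero event codes (e.g. 2 = death) as competing events
    and 0 as censoring. A didactic sketch, not the study's regression model."""
    time, event = np.asarray(time, float), np.asarray(event)
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, cif = 1.0, 0.0               # all-cause survival just before t; CIF so far
    times, cifs = [], []
    for t in np.unique(time):
        at_risk = np.sum(time >= t)
        d_any = np.sum((time == t) & (event != 0))
        d_cause = np.sum((time == t) & (event == cause))
        cif += surv * d_cause / at_risk
        surv *= 1.0 - d_any / at_risk
        times.append(t)
        cifs.append(cif)
    return np.array(times), np.array(cifs)

# Hypothetical follow-up (months) and outcomes: 0 censored, 1 bleeding, 2 death
t = [3, 5, 5, 8, 12, 14, 16, 16, 20, 24]
e = [0, 1, 2, 1, 0, 2, 1, 0, 0, 1]
print(cumulative_incidence(t, e, cause=1))
```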
Abstract:
BACKGROUND Although the possibility of bleeding during anticoagulant treatment may limit patients from taking part in physical activity, the association between physical activity and anticoagulation-related bleeding is uncertain. OBJECTIVES To determine whether physical activity is associated with bleeding in elderly patients taking anticoagulants. PATIENTS/METHODS In a prospective multicenter cohort study of 988 patients aged ≥65 years receiving anticoagulants for venous thromboembolism, we assessed patients' self-reported physical activity level. The primary outcome was the time to a first major bleeding, defined as fatal bleeding, symptomatic bleeding in a critical site, or bleeding causing a fall in hemoglobin or leading to transfusions. The secondary outcome was the time to a first clinically relevant non-major bleeding. We examined the association between physical activity level and the time to a first bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and for anticoagulation as a time-varying covariate. RESULTS During a mean follow-up of 22 months, patients with a low, moderate, and high physical activity level had an incidence of major bleeding of 11.6, 6.3, and 3.1 events per 100 patient-years, and an incidence of clinically relevant non-major bleeding of 14.0, 10.3, and 7.7 events per 100 patient-years, respectively. A high physical activity level was significantly associated with a lower risk of major bleeding (adjusted sub-hazard ratio 0.40, 95% CI 0.22-0.72). There was no association between physical activity and non-major bleeding. CONCLUSIONS A high level of physical activity is associated with a decreased risk of major bleeding in elderly patients receiving anticoagulant therapy.
Abstract:
CONTEXT Subclinical hypothyroidism has been associated with increased risk of coronary heart disease (CHD), particularly with thyrotropin levels of 10.0 mIU/L or greater. The measurement of thyroid antibodies helps predict the progression to overt hypothyroidism, but it is unclear whether thyroid autoimmunity independently affects CHD risk. OBJECTIVE The objective of the study was to compare the CHD risk of subclinical hypothyroidism with and without thyroid peroxidase antibodies (TPOAbs). DATA SOURCES AND STUDY SELECTION A MEDLINE and EMBASE search from 1950 to 2011 was conducted for prospective cohorts, reporting baseline thyroid function, antibodies, and CHD outcomes. DATA EXTRACTION Individual data of 38 274 participants from six cohorts for CHD mortality followed up for 460 333 person-years and 33 394 participants from four cohorts for CHD events. DATA SYNTHESIS Among 38 274 adults (median age 55 y, 63% women), 1691 (4.4%) had subclinical hypothyroidism, of whom 775 (45.8%) had positive TPOAbs. During follow-up, 1436 participants died of CHD and 3285 had CHD events. Compared with euthyroid individuals, age- and gender-adjusted risks of CHD mortality in subclinical hypothyroidism were similar among individuals with and without TPOAbs [hazard ratio (HR) 1.15, 95% confidence interval (CI) 0.87-1.53 vs HR 1.26, CI 1.01-1.58, P for interaction = .62], as were risks of CHD events (HR 1.16, CI 0.87-1.56 vs HR 1.26, CI 1.02-1.56, P for interaction = .65). Risks of CHD mortality and events increased with higher thyrotropin, but within each stratum, risks did not differ by TPOAb status. CONCLUSIONS CHD risk associated with subclinical hypothyroidism did not differ by TPOAb status, suggesting that biomarkers of thyroid autoimmunity do not add independent prognostic information for CHD outcomes.
Abstract:
BACKGROUND The early repolarization (ER) pattern is associated with an increased risk of arrhythmogenic sudden death. However, strategies for risk stratification of patients with the ER pattern are not fully defined. OBJECTIVES This study sought to determine the role of electrophysiology studies (EPS) in risk stratification of patients with ER syndrome. METHODS In a multicenter study, 81 patients with ER syndrome (age 36 ± 13 years, 60 males) and aborted sudden death due to ventricular fibrillation (VF) were included. EPS were performed following the index VF episode using a standard protocol. Inducibility was defined by the provocation of sustained VF. Patients were followed up by serial implantable cardioverter-defibrillator interrogations. RESULTS Despite a recent history of aborted sudden death, VF was inducible in only 18 of 81 (22%) patients. During follow-up of 7.0 ± 4.9 years, 6 of 18 (33%) patients with inducible VF during EPS experienced VF recurrences, whereas 21 of 63 (33%) patients who were noninducible experienced recurrent VF (p = 0.93). VF storm occurred in 3 patients from the inducible VF group and in 4 patients in the noninducible group. VF inducibility was not associated with maximum J-wave amplitude (VF inducible vs. VF noninducible; 0.23 ± 0.11 mV vs. 0.21 ± 0.11 mV; p = 0.42) or J-wave distribution (inferior, odds ratio [OR]: 0.96 [95% confidence interval (CI): 0.33 to 2.81]; p = 0.95; lateral, OR: 1.57 [95% CI: 0.35 to 7.04]; p = 0.56; inferior and lateral, OR: 0.83 [95% CI: 0.27 to 2.55]; p = 0.74), which have previously been demonstrated to predict outcome in patients with an ER pattern. CONCLUSIONS Our findings indicate that current programmed stimulation protocols do not enhance risk stratification in ER syndrome.
Abstract:
BACKGROUND Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with women not using HC. METHODS AND FINDINGS Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15-49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies, including 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24-1.83) for DMPA use, 1.24 (95% CI 0.84-1.82) for NET-EN use, and 1.03 (95% CI 0.88-1.20) for COC use. Between-study heterogeneity was mild (I2 < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23-1.67) and NET-EN use (aHR 1.32, 95% CI 1.08-1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99-1.50; for NET-EN use 0.67, 95% CI 0.47-0.96; and for COC use 0.91, 95% CI 0.73-1.41) compared to those at higher risk of bias (pinteraction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC-HIV relationship. CONCLUSIONS This IPD meta-analysis found no evidence that COC or NET-EN use increases women's risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe and effective contraceptive options for women at high HIV risk. A randomized controlled trial would provide more definitive evidence about the effects of hormonal contraception, particularly DMPA, on HIV risk.
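The pooled hazard ratios above come from a two-stage random-effects meta-analysis. The sketch below shows only the second stage, pooling study-specific log hazard ratios with a DerSimonian-Laird estimate of between-study variance; the study-level hazard ratios and confidence limits in the example are placeholders, not values from the included studies.

```python
# Second stage of a two-stage random-effects meta-analysis: pool study-specific
# log hazard ratios with the DerSimonian-Laird estimator of between-study
# variance (tau^2). The hazard ratios and CI limits below are placeholders.
import numpy as np

hr = np.array([1.6, 1.3, 1.8, 1.2, 1.5])     # study-specific aHRs (hypothetical)
ci_hi = np.array([2.4, 2.1, 3.0, 2.0, 2.3])  # upper 95% limits (hypothetical)

y = np.log(hr)                               # effects on the log scale
se = (np.log(ci_hi) - y) / 1.96              # back out standard errors
w = 1 / se**2                                # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of tau^2
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
df = len(y) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)

w_re = 1 / (se**2 + tau2)                    # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
i2 = max(0.0, (Q - df) / Q) * 100            # between-study heterogeneity

print(f"pooled HR = {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f}), "
      f"I^2 = {i2:.0f}%")
```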
Abstract:
In this study, we assess the climate mitigation potential from afforestation in a mountainous snow-rich region (Switzerland) with strongly varying environmental conditions. Using radiative forcing calculations, we quantify both the carbon sequestration potential and the effect of albedo change at high resolution. We calculate the albedo radiative forcing based on remotely sensed data sets of albedo, global radiation and snow cover. Carbon sequestration is estimated from changes in carbon stocks based on national inventories. We first estimate the spatial pattern of radiative forcing (RF) across Switzerland assuming homogeneous transitions from open land to forest. This highlights where forest expansion still exhibits climatic benefits when including the radiative forcing of albedo change. Second, given that forest expansion is currently the dominant land-use change process in the Swiss Alps, we calculate the radiative forcing that occurred between 1985 and 1997. Our results show that the net RF of forest expansion ranges from −24 W m⁻² at low elevations of the northern Prealps to 2 W m⁻² at high elevations of the Central Alps. The albedo RF increases with increasing altitude, which offsets the CO₂ RF at high elevations with long snow-covered periods, high global radiation and low carbon sequestration. Albedo RF is particularly relevant during transitions from open land to open forest but not in later stages of forest development. Between 1985 and 1997, when overall forest expansion in Switzerland was approximately 4%, the albedo RF offset the CO₂ RF by an average of 40%. We conclude that the albedo RF should be considered at an appropriately high resolution when estimating the climatic effect of forestation in temperate mountainous regions.
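The offsetting effect of albedo described above can be illustrated with a highly simplified local calculation: converting snow-covered open land to darker forest increases absorbed shortwave radiation roughly in proportion to the albedo change and the incoming radiation. The sketch below uses assumed albedos, snow fraction, radiation and atmospheric transmission values for illustration only; it is not the remote-sensing-based radiative forcing methodology of the study.

```python
# Back-of-envelope local albedo forcing for an open land -> forest transition.
# All numbers are illustrative assumptions, not values from the study.

sw_down = 160.0          # mean annual incoming shortwave at the surface, W m^-2
f_atm = 0.85             # assumed fraction of the surface signal reaching the top of atmosphere
snow_frac = 0.45         # assumed fraction of the year with snow cover (high elevation)

# Assumed surface albedos for snow-covered and snow-free conditions
alb_open_snow, alb_open_bare = 0.70, 0.20
alb_forest_snow, alb_forest_bare = 0.30, 0.10

alb_open = snow_frac * alb_open_snow + (1 - snow_frac) * alb_open_bare
alb_forest = snow_frac * alb_forest_snow + (1 - snow_frac) * alb_forest_bare

# Positive forcing: the darker forest absorbs more shortwave than open land,
# and the effect grows with snow fraction and incoming radiation.
albedo_rf = sw_down * f_atm * (alb_open - alb_forest)
print(f"local albedo radiative forcing ≈ +{albedo_rf:.1f} W m^-2")
```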