871 results for Risk of forest inventory
Abstract:
Southern Switzerland is a fire-prone area where fire has to be considered a natural environmental factor. In the past decades, fire frequency has tended to increase owing to changes in landscape management. The most common type of fire is surface fire, which normally breaks out during the vegetation rest period. This type of fire usually shows a short residence time (rapid spread), low to medium fire intensity and limited size. South-facing slopes are particularly fire-prone, so that very high fire frequencies are possible: under these conditions passively resistant species and post-fire resprouting species are favoured, usually reducing the surviving species to a few fire-adapted sprouters. Evergreen broadleaves are extremely sensitive to repeated fires. A simulation of the potential vegetation of southern Switzerland under changed climatic conditions showed that the potential spreading area of forests rich in evergreen broad-leaved species coincides with the most fire-prone area of the region. In the future, therefore, wildfires could play an important regulating role: most probably they will not stop the large-scale laurophyllisation of the thermophilous forests of southern Switzerland, but at sites with high fire frequency the vegetation shift could be slowed or even prevented by fire disturbance.
Abstract:
BACKGROUND & AIMS Non-selective beta-blockers (NSBB) are used in patients with cirrhosis and oesophageal varices. Experimental data suggest that NSBB inhibit angiogenesis and reduce bacterial translocation, which may prevent hepatocellular carcinoma (HCC). We therefore assessed the effect of NSBB on HCC by performing a systematic review with meta-analyses of randomized trials. METHODS Electronic and manual searches were combined. Authors were contacted for unpublished data. Included trials assessed NSBB for patients with cirrhosis; the control group could receive any intervention other than NSBB. Fixed and random effects meta-analyses were performed with I² as a measure of heterogeneity. Subgroup, sensitivity, regression and sequential analyses were performed to evaluate heterogeneity, bias and the robustness of the results after adjusting for multiple testing. RESULTS Twenty-three randomized trials on 2618 patients with cirrhosis were included, of which 12 reported HCC incidence and 23 reported HCC mortality. The mean duration of follow-up was 26 months (range 8-82). In total, 47 of 694 patients randomized to NSBB developed HCC vs 65 of 697 controls (risk difference -0.026; 95% CI -0.052 to -0.001; number needed to treat 38 patients). There was no heterogeneity (I² = 7%) or evidence of small study effects (Egger's P = 0.402). The result was not confirmed in sequential analysis, which suggested that 3719 patients would be needed to achieve the required information size. NSBB did not reduce HCC-related mortality (RD -0.011; 95% CI -0.040 to 0.017). CONCLUSIONS Non-selective beta-blockers may prevent HCC in patients with cirrhosis.
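The risk difference and number needed to treat quoted above follow directly from the pooled event counts. The short Python sketch below reproduces that arithmetic for a single unweighted comparison (with a Wald confidence interval); it will differ slightly from the trial-weighted meta-analytic estimate reported in the abstract.

```python
# Unweighted risk difference, Wald 95% CI and NNT from the pooled event counts;
# a rough check only, not the trial-weighted meta-analysis.
from math import sqrt

events_nsbb, n_nsbb = 47, 694      # HCC cases / patients randomized to NSBB
events_ctrl, n_ctrl = 65, 697      # HCC cases / patients in control arms

risk_nsbb = events_nsbb / n_nsbb
risk_ctrl = events_ctrl / n_ctrl
rd = risk_nsbb - risk_ctrl                      # risk difference

se = sqrt(risk_nsbb * (1 - risk_nsbb) / n_nsbb +
          risk_ctrl * (1 - risk_ctrl) / n_ctrl)
ci = (rd - 1.96 * se, rd + 1.96 * se)

nnt = 1 / abs(rd)                               # number needed to treat

print(f"RD = {rd:.3f}, 95% CI {ci[0]:.3f} to {ci[1]:.3f}, NNT ≈ {nnt:.0f}")
```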
Abstract:
BACKGROUND Children born preterm or with a small size for gestational age are at increased risk for childhood asthma. OBJECTIVE We sought to assess the hypothesis that these associations are explained by reduced airway patency. METHODS We used individual participant data of 24,938 children from 24 birth cohorts to examine and meta-analyze the associations of gestational age, size for gestational age, and infant weight gain with childhood lung function and asthma (age range, 3.9-19.1 years). We then explored whether these lung function outcomes mediated the associations of early growth characteristics with childhood asthma. RESULTS Children born with a younger gestational age had a lower FEV1, FEV1/forced vital capacity (FVC) ratio, and forced expiratory flow after exhaling 75% of vital capacity (FEF75), whereas those born with a smaller size for gestational age at birth had a lower FEV1 but higher FEV1/FVC ratio (P < .05). Greater infant weight gain was associated with higher FEV1 but lower FEV1/FVC ratio and FEF75 in childhood (P < .05). All associations were present across the full range and independent of other early-life growth characteristics. Preterm birth, low birth weight, and greater infant weight gain were associated with an increased risk of childhood asthma (pooled odds ratio, 1.34 [95% CI, 1.15-1.57], 1.32 [95% CI, 1.07-1.62], and 1.27 [95% CI, 1.21-1.34], respectively). Mediation analyses suggested that FEV1, FEV1/FVC ratio, and FEF75 might explain 7% (95% CI, 2% to 10%) to 45% (95% CI, 15% to 81%) of the associations between early growth characteristics and asthma. CONCLUSIONS Younger gestational age, smaller size for gestational age, and greater infant weight gain were associated with childhood lung function across their full ranges. These associations explain the risk of childhood asthma to a substantial extent.
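The "percentage of the association explained" reported by a mediation analysis is conventionally the indirect effect divided by the total effect, often computed on the log-odds scale by the difference method. The sketch below illustrates that calculation with invented coefficients; it is not the cohort data or the authors' exact estimator.

```python
# Proportion mediated by the difference method on the log-odds scale:
# (total effect - direct effect) / total effect.
# Coefficients below are illustrative placeholders, not study estimates.
import numpy as np

total_log_or = np.log(1.34)    # e.g. preterm birth -> asthma, total effect
direct_log_or = np.log(1.25)   # effect remaining after adjusting for lung function

indirect_log_or = total_log_or - direct_log_or
proportion_mediated = indirect_log_or / total_log_or

print(f"Proportion mediated ≈ {proportion_mediated:.0%}")
```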
The Risk of Seizure After Surgery for Unruptured Intracranial Aneurysms: A Prospective Cohort Study.
Abstract:
BACKGROUND We aimed to identify a group of patients with a low risk of seizure after surgery for unruptured intracranial aneurysms (UIA). OBJECTIVE To determine the risk of seizure after discharge from surgery for UIA. METHODS A consecutive, prospectively collected cohort database was interrogated for all surgical UIA cases. There were 726 cases of UIA (excluding cases proximal to the superior cerebellar artery on the vertebrobasilar system) identified and analyzed. Cox proportional hazards regression models and Kaplan-Meier life table analyses were generated to assess risk factors. RESULTS Preoperative seizure history and complication of aneurysm repair were the only risk factors found to be significant. The risk of first seizure after discharge from hospital following surgery for patients with no preoperative seizures, no treated middle cerebral artery aneurysm, and no postoperative complications (leading to a modified Rankin Scale score >1) was <0.1% and 1.1% at 12 months and 7 years, respectively. The risk for those with preoperative seizures was 17.3% and 66% at 12 months and 7 years, respectively. The risk for those with either complications (leading to a modified Rankin Scale score >1) from surgery or a treated middle cerebral artery aneurysm was 1.4% and 6.8% at 12 months and 7 years, respectively. These differences among the 3 Kaplan-Meier curves were significant (log-rank P < .001). CONCLUSION The risk of seizures after discharge from hospital following surgery for UIA is very low when there is no preexisting history of seizures. If this result can be supported by other series, guidelines that restrict returning to driving because of the risk of postoperative seizures should be reconsidered. ABBREVIATIONS: MCA, middle cerebral artery; mRS, modified Rankin Scale; UIA, unruptured intracranial aneurysms.
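A minimal sketch of the survival comparison this abstract describes, using the `lifelines` package (an assumption; the authors do not name their software): fit a Kaplan-Meier curve per risk group and compare the groups with a log-rank test. The toy data frame stands in for the real cohort.

```python
# Kaplan-Meier curves per seizure-risk group plus a multigroup log-rank test;
# the data frame below is a toy stand-in for the surgical UIA cohort.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.DataFrame({
    "years_to_seizure": [7.0, 0.8, 3.2, 7.0, 1.5, 7.0],   # follow-up time
    "seizure":          [0,   1,   1,   0,   1,   0],     # 1 = first seizure
    "risk_group":       ["low", "high", "intermediate",
                         "low", "high", "intermediate"],
})

for name, grp in df.groupby("risk_group"):
    KaplanMeierFitter().fit(grp["years_to_seizure"], grp["seizure"], label=name)

result = multivariate_logrank_test(df["years_to_seizure"],
                                   df["risk_group"], df["seizure"])
print(result.p_value)
```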
Abstract:
A missense variant (c.1637C>T, T546M) in ABCC11 encoding the MRP8 (multidrug resistance protein 8), a transporter of 5-fluorodeoxyuridine monophosphate, has been associated with an increased risk of 5-fluorouracil-related severe leukopenia. To validate this association, we investigated the impact of the ABCC11 variants c.1637C>T, c.538G>A and c.395+1087C>T on the risk of early-onset fluoropyrimidine-related toxicity in 514 cancer patients. The ABCC11 variant c.1637C>T was strongly associated with severe leukopenia in patients carrying risk variants in DPYD, encoding the key fluoropyrimidine-metabolizing enzyme dihydropyrimidine dehydrogenase (odds ratio (OR): 71.0; 95% confidence interval (CI): 2.5-2004.8; P(c.1637C>T × DPYD) = 0.013). In contrast, in patients without DPYD risk variants, no association with leukopenia (OR: 0.95; 95% CI: 0.34-2.6) or overall fluoropyrimidine-related toxicity (OR: 1.02; 95% CI: 0.5-2.1) was observed. Our study thus suggests that c.1637C>T affects fluoropyrimidine toxicity to leukocytes particularly in patients with high drug exposure, for example, because of reduced fluoropyrimidine catabolism.
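For reference, an odds ratio and its Wald confidence interval can be computed from a 2×2 table of variant carriers versus non-carriers against toxicity; the sketch below shows that calculation with placeholder counts, not the study's actual data.

```python
# Odds ratio with a Wald 95% CI from a 2x2 table (variant carriers vs
# non-carriers against severe leukopenia); counts are placeholders only.
from math import exp, log, sqrt

a, b = 5, 10     # carriers:      with / without severe leukopenia
c, d = 3, 96     # non-carriers:  with / without severe leukopenia

odds_ratio = (a * d) / (b * c)
se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)
ci = (exp(log(odds_ratio) - 1.96 * se_log_or),
      exp(log(odds_ratio) + 1.96 * se_log_or))

print(f"OR = {odds_ratio:.1f}, 95% CI {ci[0]:.1f}-{ci[1]:.1f}")
```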
Abstract:
BACKGROUND Antiretroviral therapy (ART) initiation is now recommended irrespective of CD4 count. However, data on the relationship between CD4 count at ART initiation and loss to follow-up (LTFU) are limited and conflicting. METHODS We conducted a cohort analysis including all adults initiating ART (2008-2012) at three public sector sites in South Africa. LTFU was defined as no visit in the 6 months before database closure. The Kaplan-Meier estimator and Cox proportional hazards models were used to examine the relationship between CD4 count at ART initiation and 24-month LTFU. Final models were adjusted for demographics, year of ART initiation and programme expansion, and corrected for unascertained mortality. RESULTS Among 17 038 patients, the median CD4 count at initiation increased from 119 cells/μL (IQR 54-180) in 2008 to 257 cells/μL (IQR 175-318) in 2012. In unadjusted models, observed LTFU was associated with both CD4 counts <100 cells/μL and CD4 counts ≥300 cells/μL. After adjustment, patients with CD4 counts ≥300 cells/μL were 1.35 (95% CI 1.12 to 1.63) times as likely to be LTFU after 24 months as those with a CD4 count of 150-199 cells/μL. This increased risk for patients with CD4 counts ≥300 cells/μL was largest in the first 3 months on treatment. Correction for unascertained deaths attenuated the association between CD4 counts <100 cells/μL and LTFU, while the association between CD4 counts ≥300 cells/μL and LTFU persisted. CONCLUSIONS Patients initiating ART at higher CD4 counts may be at increased risk of LTFU. With programmes initiating patients at higher CD4 counts, models of ART delivery need to be reoriented to support long-term retention.
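A minimal sketch of the kind of adjusted Cox model described here, using the `lifelines` package (the authors' actual software and model specification are not stated); the column names, rows and covariates are illustrative only.

```python
# Toy Cox proportional hazards model relating baseline CD4 category to
# loss to follow-up, adjusted for age; data and columns are illustrative.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months_followed": [24, 3, 18, 24, 6, 12, 24, 9],
    "ltfu":            [0,  1, 1,  0,  1, 0,  1,  0],   # 1 = lost to follow-up
    "cd4_ge_300":      [0,  1, 0,  0,  1, 1,  0,  1],   # baseline CD4 >= 300
    "age":             [34, 28, 41, 37, 30, 45, 39, 33],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="ltfu")
print(cph.hazard_ratios_)   # exp(coef) for each covariate
```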
Abstract:
The Carrabassett Valley Sanitary District in Carrabassett Valley, Maine has utilized both a forest spray irrigation system and a Snowfluent™ system for the treatment of their wastewater effluent. This study was designed to evaluate potential changes in soil properties after approximately 20 years of treatment in the forested spray irrigation site and three years of treatment in the field Snowfluent™ site. In addition, grass yield and composition were evaluated on the field study sites. After treatment with effluent or Snowfluent™, soils showed an increase in exchangeable Ca, Mg, Na, and K, base saturation, and pH. While most constituents were higher in treated soils, available P was lower in treated soils compared to the controls. This difference was attributed to higher rates of P mineralization from soil organic matter due to an irrigation effect of the treatment, depleting available P pools despite the P addition with the treatment. Most of the differences due to treatment were greatest at the surface and diminished with depth. Depth patterns in soil properties mostly reflected the decreasing influence of organic matter and its decomposition products with depth, as evidenced by significantly higher total C in the surface compared to lower horizons. There were decreasing concentrations of total N, and exchangeable or extractable Ca, Mg, Na, K, Mn, Zn, and P with depth. In addition, base saturation decreased with depth, driven primarily by declining exchangeable Ca and Mg. Irrigation with Snowfluent™ altered the chemical composition of the grass on the site. All element concentrations were significantly higher in the grass foliage except for Ca. The differences were attributed to the additional nutrients and moisture derived from the Snowfluent™. The use of forest spray irrigation and Snowfluent™ as a wastewater treatment strategy appears to work well. The soil and vegetation were able to retain most of the applied nutrients, and the soils do not appear to be moving toward saturation. Vegetation management may be a key tool for managing nutrient accumulation on the grass sites as the system ages.
Abstract:
The discoveries of the BRCA1 and BRCA2 genes have made it possible for women from families with hereditary breast/ovarian cancer to determine whether they carry cancer-predisposing genetic mutations. Women with germline mutations have significantly higher probabilities of developing both cancers than the general population. Since the presence of a BRCA1 or BRCA2 mutation does not guarantee future cancer development, the appropriate course of action remains uncertain for these women. Prophylactic mastectomy and oophorectomy remain controversial since the underlying premise for surgical intervention is based more upon reduction in the estimated risk of cancer than on actual evidence of clinical benefit. Issues incorporated in a woman's decision-making process include quality of life without breasts or ovaries, attitudes toward possible surgical morbidity, and the remaining risk of developing breast/ovarian cancer despite prophylactic surgery. The incorporation of patient preferences into decision analysis models can determine the quality-adjusted survival of different prophylactic approaches to breast/ovarian cancer prevention. Monte Carlo simulation was conducted on 4 separate decision models representing prophylactic oophorectomy, prophylactic mastectomy, prophylactic oophorectomy/mastectomy and screening. The use of 3 separate preference assessment methods across different populations of women allows researchers to determine how quality-adjusted survival varies according to clinical strategy, method of preference assessment and the population from which preferences are assessed.
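A highly simplified sketch of the Monte Carlo approach described above: simulate many hypothetical women under a given strategy, weight each simulated year by a utility (preference) value, and average the results to obtain quality-adjusted survival. Every probability and utility below is invented for illustration; the actual decision models and preference weights are far more detailed.

```python
# Toy Monte Carlo estimate of quality-adjusted survival for one prophylactic
# strategy; annual cancer risk and utility values are invented placeholders.
import random

random.seed(0)

def simulate_one_woman(annual_cancer_risk=0.01, utility_well=0.95,
                       utility_cancer=0.60, horizon_years=40):
    qalys, has_cancer = 0.0, False
    for _ in range(horizon_years):
        if not has_cancer and random.random() < annual_cancer_risk:
            has_cancer = True
        qalys += utility_cancer if has_cancer else utility_well
    return qalys

mean_qalys = sum(simulate_one_woman() for _ in range(10_000)) / 10_000
print(f"Mean quality-adjusted survival ≈ {mean_qalys:.1f} QALYs")
```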
Abstract:
A retrospective cohort study was conducted among 1542 patients diagnosed with CLL between 1970 and 2001 at the M. D. Anderson Cancer Center (MDACC). Changes in clinical characteristics and the impact of CLL on life expectancy were assessed across three decades (1970–2001), and the role of clinical factors in the prognosis of CLL was evaluated among patients diagnosed between 1985 and 2001 using Kaplan-Meier and Cox proportional hazards methods. Among 1485 CLL patients diagnosed from 1970 to 2001, patients in the recent cohort (1985–2001) were diagnosed at a younger age and an earlier stage compared to the earliest cohort (1970–1984). There was a 44% reduction in mortality among patients diagnosed in 1985–1995 compared to those diagnosed in 1970–1984 after adjusting for age, sex and Rai stage among patients who ever received treatment. There was an overall loss of 11 years of life expectancy (5 years for stage 0) among the 1485 patients compared with the expected life expectancy based on the age-, sex- and race-matched US general population, with a 43% decrease in the 10-year survival rate. Abnormal cytogenetics was associated with shorter progression-free (PF) survival after adjusting for age, sex, Rai stage and beta-2 microglobulin (beta-2M), whereas older age, abnormal cytogenetics and a higher beta-2M level were adverse predictors for overall survival. No increased overall risk of second cancer was observed; however, patients who received treatment for CLL had an elevated risk of developing acute myeloid leukemia (AML) and Hodgkin's disease (HD). Two out of three patients who developed AML had been treated with alkylating agents. In conclusion, CLL patients had improved survival over time. The identification of clinical predictors of PF/overall survival has important clinical significance. Close surveillance for the development of second cancers is critical to improve the quality of life of long-term survivors.
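The loss of life expectancy and the decrease in 10-year survival are both simple comparisons against an age-, sex- and race-matched general population; the sketch below illustrates that arithmetic with placeholder values rather than the MDACC cohort figures.

```python
# Matched-population comparison: decrease in 10-year survival relative to an
# age/sex/race-matched US general population. Values are placeholders.
expected_10yr_survival = 0.80    # matched general population (placeholder)
observed_10yr_survival = 0.46    # CLL cohort (placeholder)

relative_decrease = 1 - observed_10yr_survival / expected_10yr_survival
print(f"Decrease in 10-year survival ≈ {relative_decrease:.0%}")
```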
Abstract:
Background. The rise in survival rates, along with more detailed follow-up using sophisticated imaging studies, among non-small cell lung cancer (NSCLC) patients has led to an increased risk of second primary tumors (SPT) among these cases. Population- and hospital-based studies of lung cancer patients treated between 1974 and 1996 have found an increasing risk over time for the development of all cancers following treatment of NSCLC. During this time the primary modalities for treatment were surgery alone, radiation alone, surgery and post-operative radiation therapy, or combinations of chemotherapy and radiation (sequentially or concurrently). There is limited information in the literature about the impact of treatment modalities on the development of second primary tumors in these patients. Purpose. To investigate the impact of treatment modalities on the risk of second primary tumors in patients receiving treatment with curative intent for non-metastatic (Stage I–III) non-small cell lung cancer (NSCLC). Methods. The hospital records of 1,095 NSCLC patients who were diagnosed between 1980 and 2001 and received treatment with curative intent at M.D. Anderson Cancer Center with surgery alone, radiation alone (with a minimum total radiation dose of at least 45 Gy), surgery and post-operative radiation therapy, radiation therapy in combination with chemotherapy, or surgery in combination with chemotherapy and radiation were retrospectively reviewed. A second primary malignancy was defined as any tumor histologically different from the initial cancer, or of another anatomic location, or a tumor of the same location and histology as the initial tumor with an interval between cancers of at least five years. Only primary tumors occurring after treatment for NSCLC qualified as second primary tumors for this study. Results. The incidence of second primary tumors was 3.3%/year, and the rate increased over time following treatment. The type of NSCLC treatment was not found to have a striking effect upon SPT development. Increased rates were observed in the radiation-only and chemotherapy-plus-radiation treatment groups, but these increases did not exceed expected random variation. Higher radiation treatment dose, patient age and weight loss prior to index NSCLC treatment were associated with higher rates of SPT development.
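An annual incidence rate like the 3.3%/year above is typically events divided by total person-years of follow-up; the sketch below shows that calculation with placeholder counts, not the study's data.

```python
# Crude incidence rate of second primary tumors per person-year of follow-up;
# the counts below are placeholders, not the cohort's actual values.
events = 33                 # second primary tumors observed
person_years = 1000.0       # total follow-up accumulated by the cohort

rate_per_person_year = events / person_years
print(f"Incidence ≈ {rate_per_person_year:.1%} per person-year")
```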
Abstract:
Purpose. The central concepts in pressure ulcer risk are exposure to external pressure caused by inactivity and tissue tolerance to pressure, a factor closely related to blood flow. Inactivity measures are effective in predicting pressure ulcer risk. The purpose of the study was to evaluate whether a physiological measure of skin blood flow improves pressure ulcer risk prediction. Skin temperature regularity and self-similarity, as proxy measures of blood flow not previously described, may be undefined pressure ulcer risk factors. The specific aims were to determine whether a sample of nursing facility residents at high risk of pressure ulcers, classified using the Braden Scale for Pressure Sore Risk©, differ from a sample of low-risk residents according to (1) exposure to external pressure as measured by resident activity, (2) tissue tolerance to external pressure as measured by skin temperature, and (3) skin temperature fluctuations and recovery in response to a commonly occurring stressor, bathing; and additionally whether (4) scores on the Braden Scale mobility subscale are related to entropy and the spectral exponent. Methods. A two-group observational time-series design was used to describe activity and skin temperature regularity and self-similarity, calculating entropy and the spectral exponent (the latter using detrended fluctuation analysis). Twenty nursing facility residents wore activity and skin temperature monitors for one week. One bathing episode was observed as a commonly occurring stressor for skin temperature. Results. Skin temperature multiscale entropy (MSE), F(1, 17) = 5.55, p = .031, the skin temperature spectral exponent, F(1, 17) = 6.19, p = .023, and the activity mean MSE, F(1, 18) = 4.52, p = .048, differentiated the risk groups. The change in skin temperature entropy during bathing was significant, t(16) = 2.55, p = .021 (95% CI, 0.04-0.40). Multiscale entropy for skin temperature was lowest in those who developed pressure ulcers, F(1, 18) = 35.14, p < .001. Conclusions. This study supports the tissue tolerance component of the Braden and Bergstrom conceptual framework and shows differences in skin temperature multiscale entropy between pressure ulcer risk categories, pressure ulcer outcomes, and during a commonly occurring stressor.
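Detrended fluctuation analysis, one of the two signal measures named above, integrates the series, removes a local linear trend within windows of increasing size, and takes the slope of log fluctuation versus log window size as the scaling (spectral) exponent. The NumPy sketch below is a minimal version of that procedure on a random placeholder series, not the study's skin-temperature data or code.

```python
# Minimal detrended fluctuation analysis (DFA): integrate the series, detrend
# within windows, and fit log F(n) vs log n to estimate the scaling exponent.
# The random series is a placeholder for a skin-temperature record.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4096)                  # placeholder signal

y = np.cumsum(x - x.mean())                # integrated (profile) series
scales = np.array([16, 32, 64, 128, 256])
fluct = []
for n in scales:
    segments = y[: len(y) // n * n].reshape(-1, n)
    t = np.arange(n)
    rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
           for seg in segments]
    fluct.append(np.mean(rms))

alpha = np.polyfit(np.log(scales), np.log(fluct), 1)[0]
print(f"DFA scaling exponent ≈ {alpha:.2f}")   # ~0.5 for uncorrelated noise
```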
Abstract:
Existing literature examining the association between occupation and asthma has not been adequately powered to address this question in the food preparation or food service industries. Few studies have addressed the possible link between occupational exposure to cooking fumes and asthma. This secondary analysis of cohort study data aimed to investigate the association between adult-onset asthma and exposure to (a) cooking fumes at work or (b) longest-held employment in food preparation or food service (e.g. waiters and waitresses, food preparation workers, non-restaurant food servers). Participants were drawn from a cohort of Mexican-American women residing in Houston, TX, recruited between July 2001 and June 2007. This analysis used Cox proportional-hazards regression to estimate the hazard ratio of adult-onset asthma given the exposures of interest, adjusting for age, BMI, smoking status, acculturation, and birthplace. We found a strong association between adult-onset asthma and occupational exposure to cooking fumes (hazard ratio [HR] = 1.77; 95% confidence interval [CI], 1.15, 2.72), especially in participants whose longest-held occupation was not in the food-related industry (HR = 2.12; 95% CI, 1.21, 3.60). In conclusion, adult-onset asthma is a serious public health concern for food industry workers.
Abstract:
Several studies have examined the association between high glycemic index (GI) and glycemic load (GL) diets and the risk for coronary heart disease (CHD). However, most of these studies were conducted primarily on white populations. The primary aim of this study was to examine whether high GI and GL diets are associated with increased risk of developing CHD in whites and African Americans, non-diabetics and diabetics, and within stratifications of body mass index (BMI) and hypertension (HTN). Baseline and 17-year follow-up data from the ARIC (Atherosclerosis Risk in Communities) study were used. The study population (n = 13,051) consisted of 74% whites, 26% African Americans, 89% non-diabetics, 11% diabetics, 43% male and 57% female, aged 44 to 66 years at baseline. Data from the ARIC food frequency questionnaire at baseline were analyzed to provide GI and GL indices for each subject. Increases of 25 and 30 units for GI and GL, respectively, were used to describe relationships with incident CHD risk. Hazard ratios adjusted for propensity score, with 95% confidence intervals (CI), were used to assess associations. During 17 years of follow-up (1987 to 2004), 1,683 cases of CHD were recorded. Glycemic index was associated with a 2.12-fold (95% CI: 1.05, 4.30) increase in incident CHD risk for all African Americans, and GL was associated with a 1.14-fold (95% CI: 1.04, 1.25) increase in CHD risk for all whites. In addition, GL was also an important CHD risk factor for white non-diabetics (HR = 1.59; 95% CI: 1.33, 1.90). Furthermore, within the stratum of BMI 23.0 to 29.9 in non-diabetics, GI was associated with an increased hazard ratio of 11.99 (95% CI: 2.31, 62.18) for CHD in African Americans, and GL was associated with a 1.23-fold (95% CI: 1.08, 1.39) increase in CHD risk in whites. Body mass index modified the effect of GI and GL on CHD risk in all whites and white non-diabetics. For HTN, both systolic blood pressure and diastolic blood pressure modified the effect of GI and GL on CHD risk in all whites and African Americans, white and African American non-diabetics, and white diabetics. Further studies should examine other factors that could influence the effects of GI and GL on CHD risk, including dietary factors, physical activity, and diet-gene interactions.
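Dietary glycemic load is conventionally computed from a food-frequency questionnaire as GI × available carbohydrate (g) per serving / 100, summed over foods and servings, with overall dietary GI as the carbohydrate-weighted mean of the food GIs. The sketch below shows that arithmetic with rough illustrative food values, not the ARIC questionnaire data.

```python
# Daily glycemic load (GL) and carbohydrate-weighted dietary glycemic index (GI)
# from a few example foods; values are rough illustrative figures only.
foods = [
    # (glycemic index, available carbohydrate per serving in g, servings/day)
    (72, 26, 1.0),   # e.g. white bread
    (55, 27, 0.5),   # e.g. brown rice
    (40, 15, 2.0),   # e.g. apple
]

daily_gl = sum(gi * carbs * servings / 100 for gi, carbs, servings in foods)
dietary_gi = (sum(gi * carbs * servings for gi, carbs, servings in foods)
              / sum(carbs * servings for gi, carbs, servings in foods))

print(f"Daily GL ≈ {daily_gl:.0f}; overall dietary GI ≈ {dietary_gi:.0f}")
```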