961 results for Risk identification
Abstract:
Over the last forty years, applying dendrogeomorphology to palaeoflood analysis has improved estimates of the frequency and magnitude of past floods worldwide. This paper reviews the main results obtained by applying dendrogeomorphology to flood research in several case studies in Central Spain. These dendrogeomorphological studies focused on the following topics: (1) anatomical analysis to understand the physiological response of trees to flood damage and improve sampling efficiency; (2) compiling robust flood chronologies in ungauged mountain streams; (3) determining flow depth and estimating flood discharge using two-dimensional hydraulic modelling, and comparing them with other palaeostage indicators; (4) calibrating hydraulic model parameters (i.e. Manning roughness); and (5) implementing stochastic-based cost–benefit analysis to select optimal mitigation measures. The progress made in these areas is presented with suggestions for further research to improve the applicability of dendrogeomorphology to palaeoflood studies. Further developments will include new methods for better identification of the causes of specific types of flood damage to trees (e.g. tilted trees) or stable isotope analysis of tree rings to identify the climatic conditions associated with periods of increasing flood magnitude or frequency.
Abstract:
BACKGROUND: Women at increased risk of breast cancer (BC) are not widely accepting of chemopreventive interventions, and ethnic minorities are underrepresented in related trials. Furthermore, there is no validated instrument to assess the health-seeking behavior of these women with respect to these interventions. METHODS: Using constructs from the Health Belief Model, the authors developed and refined, based on pilot data, the Breast Cancer Risk Reduction Health Belief (BCRRHB) scale in a population of 265 women at increased risk of BC who were largely medically underserved, of low socioeconomic status, and members of ethnic minorities. Construct validity was assessed using principal components analysis with oblique rotation to extract factors and to generate and interpret summary scales. Internal consistency was determined using Cronbach alpha coefficients. RESULTS: Test-retest reliability for the pilot and final data was r = 0.85. Principal components analysis yielded 16 components that explained 64% of the total variance, with communalities ranging from 0.50 to 0.75. Cronbach alpha coefficients for the extracted factors ranged from 0.45 to 0.77. CONCLUSIONS: Evidence suggests that the BCRRHB yields reliable and valid data that allow for the identification of barriers and enhancing factors associated with the use of breast cancer chemoprevention in the study population. These findings allow treatment plans and intervention strategies to be tailored to the individual. Future research is needed to validate the scale for use in other female populations. Cancer 2009. (c) 2009 American Cancer Society.
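The internal-consistency statistic reported above, Cronbach's alpha, can be computed directly from item-level responses. A minimal pure-Python sketch, using hypothetical item data rather than the BCRRHB responses:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    items: one list of scores per item, each with one entry per respondent.
    (Hypothetical data below; not the study's item responses.)"""
    k = len(items)
    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var_sum = sum(var(item) for item in items)
    n = len(items[0])
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Two perfectly correlated items give alpha = 1.0
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which puts the reported 0.45 to 0.77 range at the low end for some factors.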
Abstract:
BACKGROUND: Prediction studies in subjects at Clinical High Risk (CHR) for psychosis are hampered by a high proportion of uncertain outcomes. We therefore investigated whether quantitative EEG (QEEG) parameters can contribute to an improved identification of CHR subjects with a later conversion to psychosis. METHODS: This investigation was a project within the European Prediction of Psychosis Study (EPOS), a prospective multicenter, naturalistic field study with an 18-month follow-up period. QEEG spectral power and alpha peak frequencies (APF) were determined in 113 CHR subjects. The primary outcome measure was conversion to psychosis. RESULTS: Cox regression yielded a model including frontal theta (HR = 1.82; 95% CI 1.00-3.32) and delta (HR = 2.60; 95% CI 1.30-5.20) power, and occipital-parietal APF (HR = 0.52; 95% CI 0.35-0.80) as predictors of conversion to psychosis. The resulting equation enabled the development of a prognostic index with three risk classes (hazard rates 0.057 to 0.81). CONCLUSIONS: Power in the theta and delta ranges and APF contribute to the short-term prediction of psychosis and enable a further stratification of risk in CHR samples. Combined with (other) clinical ratings, EEG parameters may therefore be a useful tool for individualized risk estimation and, consequently, targeted prevention.
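A Cox model's prognostic index is simply the linear predictor: each covariate multiplied by its coefficient, which is ln(HR). An illustrative sketch using the hazard ratios reported above; the covariate scaling is an assumption (the study's actual standardization is not stated here), so this shows the form of the index, not its exact values:

```python
import math

def prognostic_index(theta, delta, apf,
                     hr_theta=1.82, hr_delta=2.60, hr_apf=0.52):
    """Cox linear predictor: sum of ln(HR) * covariate value.
    HRs are the ones reported in the abstract; covariate scaling is assumed."""
    return (math.log(hr_theta) * theta
            + math.log(hr_delta) * delta
            + math.log(hr_apf) * apf)
```

Subjects would then be binned into the three risk classes by thresholding this index; higher theta/delta power raises it, while a higher alpha peak frequency lowers it (HR below 1).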
Abstract:
Conventional risk assessments for crop protection chemicals compare the potential for causing toxicity (hazard identification) to anticipated exposure. New regulatory approaches have been proposed that would exclude exposure assessment and just focus on hazard identification based on endocrine disruption. This review comprises a critical analysis of hazard, focusing on the relative sensitivity of endocrine and non-endocrine endpoints, using a class of crop protection chemicals, the azole fungicides. These were selected because they are widely used on important crops (e.g. grains) and thereby can contact target and non-target plants and enter the food chain of humans and wildlife. Inhibition of lanosterol 14α-demethylase (CYP51) mediates the antifungal effect. Inhibition of other CYPs, such as aromatase (CYP19), can lead to numerous toxicological effects, which are also evident from high dose human exposures to therapeutic azoles. Because of its widespread use and substantial database, epoxiconazole was selected as a representative azole fungicide. Our critical analysis concluded that anticipated human exposure to epoxiconazole would yield a margin of safety of at least three orders of magnitude for reproductive effects observed in laboratory rodent studies that are postulated to be endocrine-driven (i.e. fetal resorptions). The most sensitive ecological species is the aquatic plant Lemna (duckweed), for which the margin of safety is less protective than for human health. For humans and wildlife, endocrine disruption is not the most sensitive endpoint. It is concluded that conventional risk assessment, considering anticipated exposure levels, will be protective of both human and ecological health. Although the toxic mechanisms of other azole compounds may be similar, large differences in potency will require a case-by-case risk assessment.
Abstract:
We investigated the clinical relevance of dihydropyrimidine dehydrogenase gene (DPYD) variants to predict severe early-onset fluoropyrimidine (FP) toxicity, in particular of a recently discovered haplotype hapB3 and a linked deep intronic splice site mutation c.1129-5923C>G. Selected regions of DPYD were sequenced in prospectively collected germline DNA of 500 patients receiving FP-based chemotherapy. Associations of DPYD variants and haplotypes with hematologic, gastrointestinal, infectious, and dermatologic toxicity in therapy cycles 1-2 and resulting FP-dose interventions (dose reduction, therapy delay or cessation) were analyzed, accounting for clinical and demographic covariates. Fifteen additional cases with toxicity-related therapy delay or cessation were retrospectively examined for risk variants. The association of c.1129-5923C>G/hapB3 (4.6% carrier frequency) with severe toxicity was replicated in an independent prospective cohort. Overall, c.1129-5923C>G/hapB3 carriers showed a relative risk of 3.74 (95% CI = 2.30-6.09, p = 2 × 10⁻⁵) for severe toxicity (grades 3-5). Of 31 risk variant carriers (c.1129-5923C>G/hapB3, c.1679T>G, c.1905+1G>A or c.2846A>T), 11 (all with c.1129-5923C>G/hapB3) experienced severe toxicity (15% of 72 cases, RR = 2.73, 95% CI = 1.61-4.63, p = 5 × 10⁻⁶), and 16 carriers (55%) required FP-dose interventions. Seven of the 15 (47%) retrospective cases carried a risk variant. The c.1129-5923C>G/hapB3 variant is a major contributor to severe early-onset FP toxicity in Caucasian patients. This variant may substantially improve the identification of patients at risk of FP toxicity compared to the established DPYD risk variants (c.1905+1G>A, c.1679T>G and c.2846A>T). Pre-therapeutic DPYD testing may prevent 20-30% of life-threatening or lethal episodes of FP toxicity in Caucasian patients.
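A relative risk like the 3.74 (95% CI 2.30-6.09) above is derived from a 2×2 table of toxicity counts in variant carriers versus non-carriers, with the confidence interval from the standard log-normal approximation. A sketch with made-up counts (the study's full cell counts are not reported in the abstract):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR and 95% CI for exposed (a events, b non-events)
    vs unexposed (c events, d non-events), log-normal approximation.
    Counts below are hypothetical, not the DPYD study's."""
    p_exposed = a / (a + b)
    p_unexposed = c / (c + d)
    rr = p_exposed / p_unexposed
    se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

rr, lo, hi = relative_risk(10, 10, 5, 45)  # 50% vs 10% event rates -> RR 5.0
```

The same machinery applies to the carrier/non-carrier comparison reported above; an RR whose CI excludes 1 indicates a statistically significant excess risk.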
Abstract:
BACKGROUND: Early identification of patients at risk of developing persistent low back pain (LBP) is crucial. OBJECTIVE: The aim of this study was to identify, in patients with a new episode of LBP, the time point at which those at risk of developing persistent LBP can best be identified. METHODS: Prospective cohort study of 315 patients presenting to a health practitioner with a first episode of acute LBP. The primary outcome measure was functional limitation. Patients were assessed at baseline, three, six and twelve weeks, and six months, with factors of maladaptive cognition examined as potential predictors. Multivariate logistic regression analysis was performed for all time points. RESULTS: The best time point to predict the development of persistent LBP at six months was the twelve-week follow-up (sensitivity 78%; overall predictive value 90%). Cognitions assessed at the first visit to a health practitioner were not predictive. CONCLUSIONS: Maladaptive cognitions at twelve weeks appear to be suitable predictors of a transition from acute to persistent LBP. As early as three weeks after patients present to a health practitioner with acute LBP, cognitions might influence the development of persistent LBP. Therefore, cognitive-behavioral interventions should be considered as early adjuvant LBP treatment in patients at risk of developing persistent LBP.
Abstract:
Babesia are tick-borne parasites that are increasingly considered as a threat to animal and public health. We aimed to assess the role of European free-ranging wild ruminants as maintenance mammalian hosts for Babesia species and to determine risk factors for infection. EDTA blood was collected from 222 roe deer (Capreolus c. capreolus), 231 red deer (Cervus e. elaphus), 267 Alpine chamois (Rupicapra r. rupicapra) and 264 Alpine ibex (Capra i. ibex) from all over Switzerland and analysed by PCR with pan-Babesia primers targeting the 18S rRNA gene, primers specific for B. capreoli and Babesia sp. EU1, and by sequencing. Babesia species, including B. divergens, B. capreoli, Babesia sp. EU1, Babesia sp. CH1 and B. motasi, were detected in 10.7% of all samples. Five individuals were co-infected with two Babesia species. Infection with specific Babesia varied widely between host species. Cervidae were significantly more infected with Babesia spp. than Caprinae. Babesia capreoli and Babesia sp. EU1 were mostly found in roe deer (prevalences 17.1% and 7.7%, respectively) and B. divergens and Babesia sp. CH1 only in red deer. Factors significantly associated with infection were low altitude and young age. Identification of Babesia sp. CH1 in red deer, co-infection with multiple Babesia species and infection of wild Caprinae with B. motasi and Babesia sp. EU1 are novel findings. We propose wild Caprinae as spillover or accidental hosts for Babesia species but wild Cervidae as mammalian reservoir hosts for B. capreoli, possibly Babesia sp. EU1 and Babesia sp. CH1, whereas their role regarding B. divergens is more elusive.
Abstract:
PURPOSE: Rapid assessment and intervention are important for the prognosis of acutely ill patients admitted to the emergency department (ED). The aim of this study was to prospectively develop and validate a model predicting the risk of in-hospital death based on all information available at the time of ED admission, and to compare its discriminative performance with a non-systematic risk estimate by the triaging first health-care provider. METHODS: Prospective cohort analysis based on a multivariable logistic regression for the probability of death. RESULTS: A total of 8,607 consecutive admissions of 7,680 patients admitted to the ED of a tertiary care hospital were analysed. The most frequent APACHE II diagnostic categories at the time of admission were neurological (2,052; 24%), trauma (1,522; 18%), infection categories (1,328; 15%; including sepsis (357; 4.1%), severe sepsis (249; 2.9%) and septic shock (27; 0.3%)), cardiovascular (1,022; 12%), gastrointestinal (848; 10%) and respiratory (449; 5%). The predictors of the final model were age, prolonged capillary refill time, blood pressure, mechanical ventilation, oxygen saturation index, Glasgow coma score and APACHE II diagnostic category. The model showed good discriminative ability, with an area under the receiver operating characteristic curve of 0.92, and good internal validity. The model performed significantly better than non-systematic triaging of the patient. CONCLUSIONS: The use of the prediction model can facilitate the identification of ED patients with higher mortality risk. The model performs better than a non-systematic assessment and may facilitate more rapid identification and commencement of treatment of patients at risk of an unfavourable outcome.
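The reported AUC of 0.92 has a simple probabilistic reading: it is the chance that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor (the Mann-Whitney formulation, with ties counting half). A small illustrative sketch, with toy scores rather than the study's predictions:

```python
def auc(pos_scores, neg_scores):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    positive case (e.g. in-hospital death) outranks a negative case.
    Ties contribute 0.5. Scores below are toy values."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Perfectly separated scores -> AUC 1.0; identical scores -> AUC 0.5
perfect = auc([0.9, 0.8], [0.1, 0.2])
chance = auc([0.5], [0.5])
```

An AUC of 0.5 is chance-level discrimination, so 0.92 indicates the model ranks deaths above survivors in the large majority of random pairs.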
Abstract:
The main goals of this study were to identify the alpine torrent catchments that are sensitive to climatic changes and to assess the robustness of the methods used to elaborate flood and debris-flow hazard zone maps against specific effects of climate change. A procedure was developed for the identification and localization of torrent catchments in which climate scenarios will modify the hazard situation. In two case studies, the impacts of a potential increase in precipitation intensities on the delimited hazard zones were studied. Identifying and localizing the torrent and river catchments where unfavourable changes in the hazard situation occur could eliminate speculative and unnecessary measures against the impacts of climate change, such as a general enlargement of hazard zones or a general over-dimensioning of protection structures for the whole territory. The results showed a high spatial variability in the sensitivity of catchments to climate change. In sensitive catchments, sediment management in alpine torrents will face future challenges due to a higher rate of sediment removal from retention basins. The case studies showed a remarkable increase in the areas affected by floods and debris flows when possible future precipitation intensities were considered in hazard mapping. However, the calculated increase in the extent of future hazard zones lay within the uncertainty of the methods used today for the delimitation of hazard zones. Thus, considering the uncertainties inherent in the methods for elaborating hazard zone maps in the torrent and river catchments sensitive to climate change would provide a useful instrument for taking potential future climate conditions into account. The study demonstrated that weak points in protection structures will become more important in future risk management activities.
Abstract:
Fish, like mammals, can be affected by neoplastic proliferations. As yet, there are only a small number of studies reporting the occurrence of tumours in koi carp Cyprinus carpio koi, and only sporadic reports on the nature of the tumours or on risk factors associated with their development. Between 2008 and 2012, koi with abdominal swelling were examined pathologically: neoplastic lesions were diagnosed and classified histologically. We evaluated possible risk factors for the development of these internal neoplasms in koi carp in Switzerland, using an online 2-part questionnaire sent to fish keepers with koi affected by internal tumours and to fish keepers who had not previously reported any affected koi. Part 1 addressed all participants and focused on general information about koi husbandry and pond technical data; Part 2 addressed participants that had one or several cases of koi with internal tumours between 2008 and 2012, and consisted of specific questions about the affected koi. A total of 112 internal tumours were reported by the 353 koi keepers participating in the survey. Analysis of the obtained data revealed that tumour occurrence was significantly associated with the location (indoors vs. outdoors) and volume of the pond, the frequency of water changes, the origin of the koi, the number of koi kept in a pond and the use of certain pond disinfectant/medication products. Our results contribute to the identification of possible risk factors, which in turn could help to establish prophylactic measures to reduce the occurrence of internal neoplasms in koi.
Abstract:
Astronauts performing extravehicular activities (EVA) are at risk for occupational hazards due to a hypobaric environment, in particular decompression sickness (DCS). DCS results from nitrogen gas bubble formation in body tissues and venous blood. Denitrogenation achieved through lengthy staged decompression protocols has been the mainstay of DCS prevention in space. Because of the greater number and duration of EVAs scheduled for construction and maintenance of the International Space Station, more efficient alternatives that accomplish missions without compromising astronaut safety are desirable. This multi-center, multi-phase study (NASA Prebreathe Reduction Protocol study, or PRP) was designed to identify a shorter denitrogenation protocol that can be implemented before an EVA, based on the combination of adynamia and exercise-enhanced oxygen prebreathe. Human volunteers recruited at three sites (Texas, North Carolina and Canada) underwent three different combinations (“PRP phases”) of intense and light exercise prior to decompression in an altitude chamber. The outcome variables were detection of venous gas embolism (VGE) by precordial Doppler ultrasound and clinical manifestations of DCS. Independent variables included age, gender, body mass index, peak oxygen consumption, peak heart rate, and PRP phase. Data analysis was performed both by pooling results from all study sites and by examining each site separately. Ten percent of the subjects developed DCS and 20% showed evidence of high-grade VGE. No cases of DCS occurred in one particular PRP phase, which used a combination of dual-cycle ergometry (10 minutes at 75% of VO2 peak) plus 24 minutes of light EVA exercise (p = 0.04). No significant effects were found for the remaining independent variables on the occurrence of DCS. High-grade VGE showed a strong correlation with subsequent development of DCS (sensitivity 88.2%; specificity 87.2%). In the presence of high-grade VGE, the relative risk for DCS ranged from 7.52 to 35.0. In summary, a good safety level can be achieved with exercise-enhanced oxygen denitrogenation that can be generalized to the astronaut population. Exercise is beneficial in preventing DCS if a specific schedule is followed, with an individualized VO2 prescription that provides a safety level that can then be applied to space operations. Furthermore, Doppler detection of VGE is a useful clinical tool for prediction of altitude DCS. Given the small number of high-grade VGE episodes, the identification of a high-probability DCS situation based on the presence of high-grade VGE seems justified in astronauts.
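Sensitivity and specificity figures like the 88.2% and 87.2% above come straight from the 2×2 table of high-grade VGE status against DCS outcome. A sketch with hypothetical cell counts chosen only to reproduce similar proportions (the study's actual counts are not given in the abstract):

```python
def screening_stats(tp, fn, fp, tn):
    """Standard 2x2 screening-test summaries.
    tp/fn: DCS cases with/without high-grade VGE;
    fp/tn: non-cases with/without high-grade VGE.
    All counts here are hypothetical."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)  # positive predictive value
    return sensitivity, specificity, ppv

# 15/17 cases flagged (sens ~88.2%), 109/125 non-cases clear (spec 87.2%)
sens, spec, ppv = screening_stats(tp=15, fn=2, fp=16, tn=109)
```

With a low DCS base rate, even good sensitivity and specificity yield a modest positive predictive value, which is why the abstract frames high-grade VGE as a risk flag rather than a diagnosis.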
Abstract:
Following up genetic linkage studies to identify the underlying susceptibility gene(s) for complex disease traits is an arduous yet biologically and clinically important task. Complex traits, such as hypertension, are considered polygenic, with many genes influencing risk, each with small effects. Chromosome 2 has been consistently identified as a genomic region with genetic linkage evidence suggesting that one or more loci contribute to blood pressure levels and hypertension status. Using combined positional candidate gene methods, the Family Blood Pressure Program has concentrated efforts on investigating this region of chromosome 2 in an effort to identify underlying candidate hypertension susceptibility gene(s). Initial informatics efforts identified the boundaries of the region and the known genes within it. A total of 82 polymorphic sites in eight positional candidate genes were genotyped in a large hypothesis-generating sample consisting of 1640 African Americans, 1339 whites, and 1616 Mexican Americans. To adjust for multiple comparisons, resampling-based false discovery adjustment was applied, extending traditional resampling methods to sibship samples. Following this adjustment, SLC4A5, a sodium bicarbonate transporter, was identified as a primary candidate gene for hypertension. Polymorphisms in SLC4A5 were subsequently genotyped and analyzed for validation in two populations of African Americans (N = 461; N = 778) and two of whites (N = 550; N = 967). Again, SNPs within SLC4A5 were significantly associated with blood pressure levels and hypertension status. While not identifying a single causal DNA sequence variation that is significantly associated with blood pressure levels and hypertension status across all samples, the results further implicate SLC4A5 as a candidate hypertension susceptibility gene, validating previous evidence for one or more genes on chromosome 2 that influence hypertension-related phenotypes in the population at large. The methodology and results reported provide a case study of one approach for following up the results of genetic linkage analyses to identify genes influencing complex traits.
Abstract:
A retrospective cohort study was conducted among 1542 patients diagnosed with CLL between 1970 and 2001 at the M. D. Anderson Cancer Center (MDACC). Changes in clinical characteristics and the impact of CLL on life expectancy were assessed across three decades (1970–2001), and the role of clinical factors in the prognosis of CLL was evaluated among patients diagnosed between 1985 and 2001 using Kaplan-Meier and Cox proportional hazards methods. Among 1485 CLL patients diagnosed from 1970 to 2001, patients in the recent cohort (1985–2001) were diagnosed at a younger age and an earlier stage compared with the earliest cohort (1970–1984). There was a 44% reduction in mortality among patients diagnosed in 1985–1995 compared with those diagnosed in 1970–1984 after adjusting for age, sex and Rai stage among patients who ever received treatment. There was an overall 11-year (5 years for stage 0) loss of life expectancy among the 1485 patients compared with the expected life expectancy of the age-, sex- and race-matched US general population, with a 43% decrease in the 10-year survival rate. Abnormal cytogenetics was associated with shorter progression-free (PF) survival after adjusting for age, sex, Rai stage and beta-2 microglobulin (beta-2M), whereas older age, abnormal cytogenetics and a higher beta-2M level were adverse predictors of overall survival. No increased risk of second cancer overall was observed; however, patients who received treatment for CLL had an elevated risk of developing AML and HD. Two out of three patients who developed AML had been treated with alkylating agents. In conclusion, CLL patients had improved survival over time. The identification of clinical predictors of PF/overall survival has important clinical significance. Close surveillance for the development of second cancers is critical to improve the quality of life of long-term survivors.
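The Kaplan-Meier method used above estimates survival as a running product over observed event times, with censored patients leaving the risk set without contributing an event. A compact pure-Python sketch of the product-limit estimator, on toy data rather than the MDACC cohort:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns (time, survival) pairs at each event time. Toy data only."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        # Group all subjects sharing this time point
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            leaving += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk   # step down at event times only
            curve.append((t, surv))
        at_risk -= leaving
    return curve

# Deaths at t=1 and t=2, censoring at t=3
curve = kaplan_meier([1, 2, 3], [1, 1, 0])
```

The Cox model mentioned alongside it then relates covariates (age, Rai stage, beta-2M, cytogenetics) to the hazard underlying such curves.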
Abstract:
Apolipoprotein E (ApoE) plays a major role in the metabolism of high-density and low-density lipoproteins (HDL and LDL). Its common protein isoforms (E2, E3, E4) are risk factors for coronary artery disease (CAD) and explain between 16 and 23% of the inter-individual variation in plasma apoE levels. Linkage analysis has been completed for plasma apoE levels in the GENOA study (Genetic Epidemiology Network of Atherosclerosis). After stratification of the population by lipoprotein levels and body mass index (BMI) to create more homogeneity with regard to the biological context for apoE levels, Hispanic families showed significant linkage on chromosome 17q for two strata (LOD = 2.93 at 104 cM for a low-cholesterol group; LOD = 3.04 at 111 cM for a low-cholesterol, high-HDLC group). Replication of the 17q linkage was observed for apoB and apoE levels in the unstratified Hispanic and African-American populations, and for apoE levels in African-American families. Replication of this 17q linkage in different populations and strata provides strong support for the presence of gene(s) in this region with significant roles in determining inter-individual variation in plasma apoE levels. Through a positional and functional candidate gene approach, ten genes were identified in the 17q-linked region, and 62 polymorphisms in these genes were genotyped in the GENOA families. Association analysis was performed with FBAT, GEE, and variance-component-based tests, followed by conditional linkage analysis. Association studies with partial coverage of TagSNPs in the gene coding for apolipoprotein H (APOH) were performed, and significant results were found for 2 SNPs (APOH_20951 and APOH_05407) in the Hispanic low-cholesterol stratum, accounting for 3.49% of the inter-individual variation in plasma apoE levels. Among the other candidate genes, we identified a haplotype block in the ACE1 gene that contains two major haplotypes associated with apoE levels as well as total cholesterol, apoB and LDLC levels in the unstratified Hispanic population. Identifying the genes responsible for the remaining 60% of inter-individual variation in plasma apoE levels will yield new insights into the genetic interactions involved in lipid metabolism and a more precise understanding of the risk factors leading to CAD.
Abstract:
To identify genetic susceptibility loci for severe diabetic retinopathy, 286 Mexican-Americans with type 2 diabetes from Starr County, Texas completed detailed physical and ophthalmologic examinations, including fundus photography for diabetic retinopathy grading. The 103 individuals with moderate-to-severe non-proliferative diabetic retinopathy or proliferative diabetic retinopathy were defined as cases for this study. DNA samples extracted from study subjects were genotyped using the Affymetrix GeneChip® Human Mapping 100K Set, which includes 116,204 single nucleotide polymorphisms (SNPs) across the whole genome. Single-marker allelic tests and 2- to 8-SNP sliding-window Haplotype Trend Regression implemented in HelixTree™ were first performed with these direct genotypes to identify genes/regions contributing to the risk of severe diabetic retinopathy. An additional 1,885,781 HapMap Phase II SNPs were imputed from the direct genotypes to expand the genomic coverage for a more detailed exploration of genetic susceptibility to diabetic retinopathy. The average estimated allelic dosages and the imputed genotypes with the highest posterior probabilities were subsequently analyzed for associations using logistic regression and Fisher's exact allelic tests, respectively. To move beyond these SNP-based approaches, 104,572 directly genotyped and 333,375 well-imputed SNPs were used to construct genetic distance matrices based on 262 retinopathy candidate genes and their 112 related biological pathways. Multivariate distance matrix regression was then used to test hypotheses with genes and pathways as the units of inference in the context of susceptibility to diabetic retinopathy. This study provides a framework for genome-wide association analyses and implicated several genes involved in the regulation of oxidative stress, inflammatory processes, histidine metabolism, and pancreatic cancer pathways associated with severe diabetic retinopathy. Many of these loci have not previously been implicated in either diabetic retinopathy or diabetes. In summary, CDC73, IL12RB2, and SULF1 had the best evidence as candidates to influence diabetic retinopathy, possibly through novel biological mechanisms related to the VEGF-mediated signaling pathway or inflammatory processes. While this study uncovered some genes for diabetic retinopathy, a comprehensive picture of the genetic architecture of diabetic retinopathy has not yet been achieved. Once fully understood, the genetics and biology of diabetic retinopathy will contribute to better strategies for diagnosis, treatment and prevention of this disease.