894 results for Risk based Maintenance
Abstract:
This paper evaluates whether the Swiss monitoring programme for foreign substances in animal products fulfils basic epidemiological quality requirements, and identifies possible sources of bias in the selection of samples. The sampling was analysed over a 4-year period (2002-05). The sampling frame in 37 participating abattoirs covered 51% of all slaughtered pigs, 73% of calves, 68% of beef cattle and 36% of cows. The analysis revealed that some sub-populations, as defined by region of origin, were statistically over-represented while others were under-represented. Although the programme is in accordance with European Union requirements, it contained some relevant bias. Patterns of under-sampled regions characterized by differences in management type were identified. This could lead to an underestimate of the number of contaminated animals within the programme. Although the current sampling was stratified and partially risk-based, its efficiency could be improved by adopting a more targeted approach.
Abstract:
Early-onset neonatal sepsis due to Group B streptococci (GBS) is responsible for severe morbidity and mortality in newborns. While different preventive strategies to identify women at risk are being recommended, the optimal strategy depends on the incidence of GBS sepsis and on the prevalence of anogenital GBS colonization. We therefore aimed to assess GBS prevalence and its consequences for different prevention strategies. We analyzed 1316 pregnant women between March 2005 and September 2006 at our institution. The prevalence of GBS colonization was determined by selective cultures of anogenital smears, and the presence of risk factors was analyzed. In addition, the direct costs of screening and intrapartum antibiotic prophylaxis were estimated for different preventive strategies. The prevalence of GBS colonization was 21%. At least one maternal intrapartum risk factor was present in 37%. The direct costs of the different prevention strategies were estimated as follows: risk-based, 18,500 CHF/1000 live births; screening-based, 50,110 CHF/1000 live births; combined screening- and risk-based, 43,495 CHF/1000 live births. Strategies to prevent GBS sepsis in newborns are necessary. Given our colonization prevalence of 21% and the intrapartum risk profile of the women, the screening-based approach appears superior to a risk-based approach.
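The cost comparison above scales linearly with the number of live births. A minimal sketch using the per-1000-births figures reported in the abstract (the annual birth figure is a hypothetical placeholder, not from the study):

```python
# Direct costs per 1000 live births (CHF), as reported in the abstract.
costs_per_1000 = {
    "risk-based": 18_500,
    "screening-based": 50_110,
    "combined screening- and risk-based": 43_495,
}

annual_live_births = 80_000  # hypothetical annual figure, for illustration only

# Rank strategies from cheapest to most expensive and project annual cost.
for strategy, cost in sorted(costs_per_1000.items(), key=lambda kv: kv[1]):
    annual_cost = cost * annual_live_births / 1000
    print(f"{strategy}: CHF {annual_cost:,.0f} per year")
```

Note that the abstract nonetheless favours the screening-based approach on clinical grounds; cost per prevented case, not raw cost, would decide between them.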
Evaluation of control and surveillance strategies for classical swine fever using a simulation model
Abstract:
Classical swine fever (CSF) outbreaks can cause enormous losses in naïve pig populations. How best to minimize the economic damage and the number of culled animals caused by CSF is therefore an important research area. The baseline CSF control strategy in the European Union and Switzerland consists of culling all animals in infected herds, movement restrictions for animals, material and people within a given distance of the infected herd, and epidemiological tracing of transmission contacts. Additional disease control measures such as pre-emptive culling or vaccination have been recommended based on the results of several simulation models; however, these models were parameterized for areas with high animal densities. The objective of this study was to explore whether pre-emptive culling and emergency vaccination should also be recommended in low- to moderate-density areas such as Switzerland. Additionally, we studied the influence of initial outbreak conditions on outbreak severity to improve the efficiency of disease prevention and surveillance. A spatial, stochastic, individual-animal-based simulation model using all registered Swiss pig premises in 2009 (n=9770) was implemented to quantify these relationships. The model simulates within-herd and between-herd transmission (direct and indirect contacts and local area spread). By varying four parameters, (a) control measures, (b) index herd type (breeding, fattening, weaning or mixed herd), (c) detection delay for secondary cases during an outbreak and (d) contact tracing probability, 112 distinct scenarios were simulated. To assess the impact of scenarios on outbreak severity, daily transmission rates were compared between scenarios. Compared with the baseline strategy (stamping out and movement restrictions), vaccination and pre-emptive culling reduced neither outbreak size nor duration.
Outbreaks starting in a herd with weaning piglets or fattening pigs caused higher losses in terms of the number of culled premises, and lasted longer, than those starting in the two other index herd types. Similarly, larger transmission rates were estimated for outbreaks with these index herd types. A longer detection delay resulted in more culled premises and longer outbreak duration, and better contact tracing increased the number of short outbreaks. Based on the simulation results, the baseline control strategy appears sufficient to control CSF in areas of low to medium animal density. Early detection of outbreaks is crucial, and risk-based surveillance should focus on weaning piglet and fattening pig premises.
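The central mechanism here, that a longer detection delay yields larger outbreaks, can be illustrated with a toy stochastic between-herd model. This is a minimal sketch with made-up parameters (herd count, transmission probability), not the authors' spatial premises-level model:

```python
import random

def outbreak_size(n_herds=200, beta=0.2, detection_delay=5, rng=None):
    """Toy between-herd outbreak: each infected herd infects one random
    susceptible herd per day with probability beta, and is stamped out
    detection_delay days after becoming infected."""
    rng = rng or random.Random(0)
    susceptible = set(range(1, n_herds))
    infected = {0: 0}               # herd id -> day of infection
    culled = 0
    for day in range(1, 2 * 365):
        for herd, day0 in list(infected.items()):
            if day - day0 >= detection_delay:   # detected -> culled
                del infected[herd]
                culled += 1
        if not infected:
            break
        for _ in range(len(infected)):          # one contact draw per infected herd
            if susceptible and rng.random() < beta:
                newly = rng.choice(sorted(susceptible))
                susceptible.remove(newly)
                infected[newly] = day
    return culled + len(infected)

def mean_size(delay, runs=300):
    return sum(outbreak_size(detection_delay=delay,
                             rng=random.Random(seed)) for seed in range(runs)) / runs

# A longer detection delay gives each infected herd more infectious days,
# so mean outbreak size grows.
print(mean_size(5), mean_size(10))
```

With these illustrative values the effective reproduction number is roughly beta times the infectious period, so doubling the detection delay pushes the process from near-critical to clearly supercritical, mirroring the abstract's conclusion that early detection is crucial.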
Abstract:
Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food-borne hazards in pork in order to assess the risk these hazards pose to the health of local consumers and to make recommendations for improving food safety. A secondary objective was to evaluate the utility of risk-based approaches to food safety in an informal food system. We investigated samples from pigs and pork sourced at slaughter in urban and rural environments, and at retail, to assess a selection of food-borne hazards. In addition, consumer exposure was characterized using information about hygiene and practices related to handling and preparing pork. A qualitative hazard identification, exposure assessment and hazard characterization for three representative hazards or hazard proxies, namely Enterobacteriaceae, T. solium cysticercosis and antibiotic residues, is presented. Several important potential food-borne pathogens are reported for the first time, including Listeria spp. and Brucella suis. This descriptive pilot study is the first risk-based assessment of food safety in Nagaland. We also characterise possible interventions to be addressed by policy makers, and supply data to inform future risk assessments.
Abstract:
There is growing evidence for the development of posttraumatic stress symptoms as a consequence of acute cardiac events. Acute coronary syndrome (ACS) patients experience a range of acute cardiac symptoms, and these may cluster together in specific patterns. The objectives of this study were to establish distinct symptom clusters in ACS patients, and to investigate whether the experience of different types of symptom clusters is associated with posttraumatic symptom intensity at six months. ACS patients were interviewed in hospital within 48 h of admission; 294 patients provided information on symptoms before hospitalisation, and cluster analysis was used to identify patterns. Posttraumatic stress symptoms were assessed in 156 patients at six months. Three symptom clusters were identified: pain symptoms, diffuse symptoms and symptoms of dyspnea. In multiple regression analyses adjusting for sociodemographic, clinical and psychological factors, the pain symptoms cluster (β = .153, P = .044) emerged as a significant predictor of posttraumatic symptom severity at six months. A marginally significant association was observed between symptoms of dyspnea and reduced intrusive symptoms at six months (β = -.156, P = .061). The findings suggest that acute ACS symptoms occur in distinct clusters, which may have distinctive effects on the intensity of subsequent posttraumatic symptoms. Since posttraumatic stress is associated with adverse outcomes, identifying patients at risk based on their symptom experience during ACS may be useful in targeting interventions.
Abstract:
BACKGROUND Acetabular fractures and the surgical interventions used to treat them can result in nerve injuries. To date, only small case studies have tried to explore the frequency of nerve injuries and their association with patient and treatment characteristics. High-quality data on the risk of traumatic and iatrogenic nerve lesions and their epidemiology in relation to different fracture types and surgical approaches are lacking. QUESTIONS/PURPOSES The purpose of this study was to determine (1) the proportion of patients who develop nerve injuries after acetabular fracture; (2) which fracture type(s) are associated with increased nerve injury risk; and (3) which surgical approach was associated with the highest proportion of patients developing nerve injuries, using data from the German Pelvic Trauma Registry. Two secondary aims were (4) to assess the relationship between hospital volume and nerve injury; and (5) to assess internal data validity. METHODS Between March 2001 and June 2012, 2236 patients with acetabular fractures were entered into a prospectively maintained registry from 29 hospitals; of those, 2073 (92.7%) had complete records on the endpoints of interest in this retrospective study and were analyzed. The neurological status of these patients was captured at admission and at discharge. A total of 1395 of 2073 (67%) patients underwent surgery, and the proportions of intervention-related and other hospital-acquired nerve injuries were obtained. The overall proportion of patients developing nerve injuries, risk based on fracture type, and risk by surgical approach were analyzed. RESULTS The proportion of patients diagnosed with nerve injuries was 4% (76 of 2073) at hospital admission and 7% (134 of 2073) at discharge.
Patients with fractures of the "posterior wall" (relative risk [RR], 2.0; 95% confidence interval [CI], 1.4-2.8; p=0.001), "posterior column and posterior wall" (RR, 2.9; CI, 1.6-5.0; p=0.002), and "transverse + posterior wall" (RR, 2.1; CI, 1.3-3.5; p=0.010) were more likely to have nerve injuries at hospital discharge. The proportions of patients with intervention-related nerve injuries and with other hospital-acquired nerve injuries were both 2% (24 of 1395 and 46 of 2073, respectively). Both were associated with the Kocher-Langenbeck approach (RR, 3.0; CI, 1.4-6.2; p=0.006; and RR, 2.4; CI, 1.4-4.3; p=0.004, respectively). CONCLUSIONS Acetabular fractures involving the posterior wall were most commonly accompanied by nerve injuries. The data also suggest that the Kocher-Langenbeck approach is associated with a higher risk of perioperative nerve injuries. Trauma surgeons should be aware of common nerve injuries, particularly in posterior wall fractures. The results of the study should help provide patients with more exact information on the risk of perioperative nerve injuries in acetabular fractures. LEVEL OF EVIDENCE Level III, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.
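The relative risks quoted above follow the standard cohort formula with a Wald interval on the log scale. A minimal sketch with hypothetical counts (the registry's underlying 2x2 tables are not given in the abstract):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR of the outcome for exposed (a events out of a+b) versus
    unexposed (c events out of c+d), with a Wald 95% CI on the log scale."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts for illustration: 40 nerve injuries among 400
# posterior-wall fractures vs 50 among 1000 other fractures.
rr, lo, hi = relative_risk(40, 360, 50, 950)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR = 2.00 (95% CI 1.34-2.98)
```

An interval excluding 1.0, as here and in the registry's posterior-wall estimates, indicates a statistically elevated risk in the exposed group.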
Abstract:
The purpose of this dissertation was to explore and describe the factors that influence the safer sex choices of African-American college women. The pandemic of HIV and the prevalence of other sexually transmitted diseases have disproportionately affected African-American females. As young women enter college they are faced with a myriad of choices. Unprotected sexual exploration is one choice that can lead to deadly consequences. This dissertation explores, through in-depth interviews, the factors associated with the decision to practice or not practice safe sex. The first study describes the factors associated with increased sexual risk taking among African-American college women. Sexual risk taking, or sex without a condom, was found to be more likely when issues of self or partner pleasure were raised. Participants were also likely to have sexual intercourse without a condom if they desired a long-term relationship with their partner. The second study examined safe sex decision-making processes among a group of African-American college women. Women were found to employ both emotional and philosophical strategies to determine their safe sex behavior. These strategies range from assessing a partner's physical capabilities and appearance to the length of the dating relationship. The third study explores the association between knowledge and risk perception as predictors of safer sex behaviors. Knowledge of HIV/AIDS and other STDs was not found to be a determinant of safer sex behavior. Perception of personal risk was also not highly correlated with consistent safer sex behavior. These studies demonstrate the need for risk-based safer sex education and intervention programs. The current climate of knowledge-based program development ensures that women will continue to predicate their decision to practice safer sex on their limited perception and understanding of the risks associated with unprotected sexual behavior. Further study into the emotional and philosophical determinants of sexual behavior is necessary for the realistic design of applicable and meaningful interventions.
Abstract:
This chapter provides a detailed discussion of the evidence on housing and mortgage lending discrimination, as well as the potential impacts of such discrimination on minority outcomes such as homeownership and neighborhood environment. The paper begins by discussing conceptual issues surrounding empirical analyses of discrimination, including explanations for why discrimination takes place, definitions of different forms of discrimination, and the appropriate interpretation of observed racial and ethnic differences in treatment or outcomes. Next, the paper reviews evidence on housing market discrimination, starting with evidence of segregation and price differences in the housing market, followed by direct evidence of discrimination by real estate agents in paired-testing studies. Finally, mortgage market discrimination and barriers in access to mortgage credit are discussed. This discussion begins with an assessment of the role credit barriers play in explaining racial and ethnic differences in homeownership, and follows with discussions of analyses of underwriting and the price of credit based on administrative and private-sector data sources, including analyses of the subprime market. The paper concludes that housing discrimination has declined, especially in the market for owner-occupied housing, and does not appear to play a large role in limiting the neighborhood choices of minority households or the concentration of minorities in central cities. On the other hand, the patterns of racial centralization and the lower homeownership rates of African-Americans appear to be related to each other, and lower minority homeownership rates are in part attributable to barriers in the market for mortgage credit. The paper presents considerable evidence of racial and ethnic differences in mortgage underwriting, as well as additional evidence suggesting these differences may be attributable to differential provision of coaching, assistance, and support by loan officers.
So far, innovation in loan products, the shift towards risk-based pricing, and the growth of the subprime market have not mitigated the role credit barriers play in explaining racial and ethnic differences in homeownership. Further, the growth of the subprime lending industry appears to have segmented the mortgage market geographically, increasing the costs of relying on local or neighborhood sources of mortgage credit and affecting the integrity of many low-income minority neighborhoods through increased foreclosure rates.
Abstract:
My dissertation focuses on developing methods for detecting gene-gene/gene-environment interactions and imprinting effects for human complex diseases and quantitative traits. It includes three sections: (1) generalizing the coding technique of the Natural and Orthogonal Interaction (NOIA) model, originally developed for gene-gene (GxG) interaction, to reduced models; (2) developing a novel statistical approach for modeling gene-environment (GxE) interactions influencing disease risk; and (3) developing a statistical approach for modeling genetic variants displaying parent-of-origin effects (POEs), such as imprinting. In the past decade, genetic researchers have identified a large number of causal variants for human genetic diseases and traits by single-locus analysis, and interaction has now become a major topic in the effort to map the complex network of multiple genes and environmental exposures contributing to the outcome. Epistasis, also known as gene-gene interaction, is the departure from additive genetic effects of several genes on a trait, meaning that the same alleles of one gene can display different genetic effects under different genetic backgrounds. In this study, we propose to implement the NOIA model for association studies with interaction for human complex traits and diseases. We compare the performance of the new statistical models we developed with the usual functional model by both simulation study and real data analysis. Both simulation and real data analysis revealed higher power of the NOIA GxG interaction model for detecting both main genetic effects and interaction effects. Through application to a melanoma dataset, we confirmed the previously identified significant regions for melanoma risk at 15q13.1, 16q24.3 and 9p21.3. We also identified potential interactions with these significant regions that contribute to melanoma risk.
Based on the NOIA model, we developed a novel statistical approach that allows us to model effects from a genetic factor and a binary environmental exposure that jointly influence disease risk. Both simulation and real data analyses revealed higher power of the NOIA model for detecting both main genetic effects and interaction effects for both quantitative and binary traits. We also found that estimates of the parameters from logistic regression for binary traits are no longer statistically uncorrelated under the alternative model when there is an association. Applying our novel approach to a lung cancer dataset, we confirmed that four SNPs in the 5p15 and 15q25 regions are significantly associated with lung cancer risk in the Caucasian population: rs2736100, rs402710, rs16969968 and rs8034191. We also validated that rs16969968 and rs8034191 in the 15q25 region interact significantly with smoking in the Caucasian population. Our approach identified a potential interaction of SNP rs2256543 in 6p21 with smoking contributing to lung cancer risk. Genetic imprinting is the best-known cause of parent-of-origin effects (POEs), whereby a gene is differentially expressed depending on the parental origin of the same alleles. Genetic imprinting affects several human disorders, including diabetes, breast cancer, alcoholism, and obesity, and has been shown to be important for normal embryonic development in mammals. Traditional association approaches ignore this important genetic phenomenon. In this study, we propose a NOIA framework for single-locus association studies that estimates both main allelic effects and POEs. We develop statistical (Stat-POE) and functional (Func-POE) models, and demonstrate conditions for orthogonality of the Stat-POE model. We conducted simulations for both quantitative and qualitative traits to evaluate the performance of the statistical and functional models with different levels of POEs.
Our results showed that the newly proposed Stat-POE model, which ensures orthogonality of the variance components when Hardy-Weinberg Equilibrium (HWE) holds or when the minor and major allele frequencies are equal, had greater power for detecting the main allelic additive effect than the Func-POE model, which codes according to allelic substitutions, for both quantitative and qualitative traits. The power for detecting the POE was the same for the Stat-POE and Func-POE models under HWE for quantitative traits.
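The orthogonality property that makes the statistical NOIA parameterization attractive can be pictured at a single locus. The sketch below follows the published single-locus statistical coding (Álvarez-Castro and Carlborg's formulation, reproduced here from memory, with illustrative genotype frequencies): the additive and dominance covariates have zero mean and zero covariance under the observed genotype frequencies, so the effect estimates decouple.

```python
def noia_statistical_codes(p_aa, p_ab, p_bb):
    """Statistical NOIA additive/dominance codes for genotypes (aa, ab, bb)
    with observed genotype frequencies p_aa, p_ab, p_bb (summing to 1)."""
    mean_allele_count = p_ab + 2 * p_bb
    # Additive code: centered count of the 'b' allele.
    additive = [g - mean_allele_count for g in (0, 1, 2)]
    # Dominance code, scaled so it is orthogonal to the additive code.
    denom = p_aa + p_bb - (p_aa - p_bb) ** 2
    dominance = [-2 * p_ab * p_bb / denom,
                  4 * p_aa * p_bb / denom,
                 -2 * p_aa * p_ab / denom]
    return additive, dominance

freqs = (0.5, 0.3, 0.2)           # illustrative frequencies, not in HWE
add, dom = noia_statistical_codes(*freqs)

# Orthogonality check: zero means and zero covariance under freqs.
mean_d = sum(f * d for f, d in zip(freqs, dom))
cov_ad = sum(f * a * d for f, a, d in zip(freqs, add, dom))
print(mean_d, cov_ad)
```

Both printed values are zero up to floating-point error, which is why, in a regression on these codes, the additive estimate does not shift when the dominance term is added or dropped.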
Abstract:
Documented risks of physical activity include reduced bone mineral density at high activity volume, and sudden cardiac death among adults and adolescents. Further illumination of these risks is needed to inform future public health guidelines. The present research seeks to 1) quantify the association between physical activity and bone mineral density (BMD) across a broad range of activity volume, 2) assess the utility of an existing pre-screening questionnaire among US adults, and 3) determine whether pre-screening risk stratification by questionnaire predicts referral to a physician among Texas adolescents. Among 9,468 adults 20 years of age or older in the National Health and Nutrition Examination Survey (NHANES) 2007-2010, linear regression analyses revealed generally higher BMD at the lumbar spine and proximal femur with greater reported activity volume. Only lumbar BMD in women was unassociated with activity volume. Among men, BMD was similar at activity beyond four times the minimum volume recommended in the Physical Activity Guidelines. These results suggest that the range of activity reported by US adults is not associated with low BMD at either site. The American Heart Association / American College of Sports Medicine Preparticipation Questionnaire (AAPQ) was applied to 6,661 adults 40 years of age or older from NHANES 2001-2004 by using NHANES responses to complete AAPQ items. Following AAPQ referral criteria, 95.5% of women and 93.5% of men would be referred to a physician before exercise initiation, suggesting little utility for the AAPQ among adults aged 40 years or older. Unnecessary referral before exercise initiation may present a barrier to exercise adoption and may strain an already stressed healthcare infrastructure. Among 3,181 athletes in the Texas Adolescent Athlete Heart Screening Registry, 55.2% of boys and 62.2% of girls were classified as high-risk based on questionnaire answers.
Using sex-stratified contingency table analyses, risk categories were not significantly associated with referral to a physician based on electrocardiogram or echocardiogram, nor were they associated with confirmed diagnoses on follow-up. Additional research is needed to identify which symptoms are most closely related to sudden cardiac death, and to determine the best methods for rapid and reliable assessment. In conclusion, this research suggests that the volume of activity reported by US adults is not associated with low BMD at two clinically relevant sites, casts doubt on the utility of two existing cardiac screening tools, and raises concern about barriers to activity erected through ineffective screening. These findings augment existing research in this area and may inform revisions to the Physical Activity Guidelines regarding risk mitigation.
Abstract:
Brownfield rehabilitation is an essential step for sustainable land-use planning and management in the European Union. In brownfield regeneration processes, legacy contamination plays a significant role: first, persistent contaminants in soil or groundwater extend the existing hazards and risks well into the future; and second, problems from historical contamination are often more difficult to manage than contamination caused by new activities. Due to the complexity associated with managing brownfield site rehabilitation, Decision Support Systems (DSSs) have been developed to support problem holders and stakeholders in the decision-making process encompassing all phases of the rehabilitation. This paper presents a comparative study between two DSSs, namely SADA (Spatial Analysis and Decision Assistance) and DESYRE (Decision Support System for the Requalification of Contaminated Sites), with the main objective of showing the benefits of using DSSs to introduce and process data and then to disseminate results to the different stakeholders involved in the decision-making process. For this purpose, a former car manufacturing plant located in the Brasov area, Central Romania, contaminated chiefly by heavy metals and total petroleum hydrocarbons, was selected as a case study to which the two examined DSSs were applied. The major results presented here concern the analysis of the functionalities of the two DSSs in order to identify similarities, differences and complementarities and, thus, to provide an indication of the most suitable integration options.
Abstract:
Monilinia spp. (M. laxa, M. fructigena y M. fructicola) causa marchitez en brotes y flores, chancros en ramas y podredumbre de la fruta de hueso provocando pérdidas económicas importantes en años con climatología favorable para el desarrollo de la enfermedad, particularmente en variedades tardías de melocotonero y nectarino. En estos huéspedes en España, hasta el momento, la especie predominante es M. laxa y, en menor proporción, M. fructigena. La reciente introducción en Europa de la especie de cuarentena M. fructicola hace necesaria una detección e identificación rápida de cada una de las especies. Además, hay diversos aspectos de la etiología y epidemiología de la enfermedad que no se conocen en las condiciones de cultivo españolas. En un primer objetivo de esta Tesis se ha abordado la detección e identificación de las especies de Monilinia spp. causantes de podredumbre parda. El estudio de las bases epidemiológicas para el control de la enfermedad constituye el fin del segundo objetivo. Para la detección del género Monilinia en material vegetal por PCR, diferenciándolo de otros hongos presentes en la superficie del melocotonero, se diseñaron una pareja de cebadores siguiendo un análisis del ADN ribosomal. La discriminación entre especies de Monilinia se consiguió utilizando marcadores SCAR (región amplificada de secuencia caracterizada), obtenidos después de un estudio de marcadores polimórficos de ADN amplificados al azar (RAPDs). También fue diseñado un control interno de amplificación (CI) basado en la utilización de un plásmido con secuencias de los cebadores diferenciadores del género, para ser utilizado en el protocolo de diagnóstico de la podredumbre parda con el fin de reconocer falsos negativos debidos a la inhibición de PCR por componentes celulares del material vegetal. Se disponía de un kit comercial que permitía distinguir Monilinia de otros géneros y M. 
fructicola del resto de especies mediante anticuerpos monoclonales utilizando la técnica DAS-ELISA. En esta Tesis se probaron diferentes fuentes de material como micelio ó conidias procedentes de cultivos en APD, o el micelio de la superficie de frutas o de momias frescas, como formas de antígeno. Los resultados obtenidos con ELISA se compararon con la identificación por métodos morfológico-culturales y por PCR con los cebadores desarrollados en esta Tesis. Los resultados demostraron la posibilidad de una detección temprana en frutas frescas por este método, realzando las posibilidades de una diagnosis temprana para una prevención más eficaz de M. fructicola en fruta de hueso. El estudio epidemiológico de la enfermedad comenzó con la determinación de las principales fuentes de inóculo primario y su importancia relativa en melocotoneros y nectarinos del valle del Ebro. Para ello se muestrearon 9 huertos durante los años 2003 a 2005 recogiendo todas las momias, frutos abortados, gomas, chancros, y brotes necróticos en los árboles. También se recogieron brotes aparentemente sanos y muestras de material vegetal situados en el suelo. En estas muestras se determinó la presencia de Monilinia spp. Los resultados mostraron que la fuente principal de inóculo son las momias que se quedan en los árboles en las que la supervivencia del hongo tras el invierno es muy alta. También son fuentes de inóculo las momias del suelo y los brotes necróticos. De aquí se deriva que una recomendación importante para los agricultores es que deben eliminar este material de los huertos. Un aspecto no estudiado en melocotonero o nectarino en España es la posible relación que puede darse entre la incidencia de infecciones latentes en los frutos inmaduros a lo largo del cultivo y la incidencia de podredumbre en los frutos en el momento de la recolección y en postcosecha. Esta relación se había observado previamente en otros frutales de hueso infectados con M. 
fructicola en diversos países del mundo. Para estudiar esta relación se realizaron ensayos en cinco huertos comerciales de melocotonero y nectarino situados en el Valle del Ebro en cuatro estados fenológicos durante los años 2000-2002. No se observaron infecciones latentes en botón rosa, dándose la máxima incidencia en precosecha, aunque en algunos huertos se daba otro pico en el endurecimiento del embrión. La especie prevaleciente fue M. laxa. Se obtuvo una correlación positiva significativa entre la incidencia de infecciones latentes y la incidencia de podredumbre en postcosecha. Se desarrolló también un modelo de predicción de las infecciones latentes en función de la temperatura (T) y el periodo de humectación (W). Este modelo indicaba que T y W explicaban el 83% de la variación en la incidencia de infecciones latentes causadas por Monilinia spp. Por debajo de 8ºC no se predecían latentes, necesitándose más de 22h de W para predecir la ocurrencia de latentes con T = 8ºC, mientras que solo se necesitaban 5h de W a 25ºC. Se hicieron también ensayos en condiciones controladas para determinar la relación entre la incidencia de las infecciones latentes, las condiciones ambientales (T y W), la concentración de inóculo del patógeno (I) y el estado de desarrollo del huésped (S) y para validar el modelo de predicción desarrollado con los experimentos de campo. Estos ensayos se llevaron cabo con flores y frutos de nectarino procedentes de un huerto comercial en seis estados fenológicos en los años 2004 y 2005, demostrándose que la incidencia de podredumbre en postcosecha y de infecciones latentes estaba afectada por T, W, I y S. En los frutos se producían infecciones latentes cuando la T no era adecuada para el desarrollo de podredumbre. Una vez desarrollado el embrión eran necesarias más de 4-5h de W al día y un inóculo superior a 104 conidias ml-1 para que se desarrollase o podredumbre o infección latente. 
La ecuación del modelo obtenido con los datos de campo era capaz de predecir los datos observados en estos experimentos. Para evaluar el efecto del inóculo de Monilinia spp. en la incidencia de infecciones latentes y de podredumbre de frutos en postcosecha se hicieron 11 experimentos en huertos comerciales de melocotonero y nectarino del Valle del Ebro durante 2002 a 2005. Se observó una correlación positiva entre los números de conidias de Monilinia spp. en la superficie de los frutos y la incidencia de infecciones latentes De los estudios anteriores se deducen otras dos recomendaciones importantes para los agricultores: las estrategias de control deben tener en cuenta las infecciones latentes y estimar el riesgo potencial de las mismas basándose en la T y W. Deben tener también en cuenta la concentración de esporas de Monilinia spp. en la superficie de los frutos para disminuir el riesgo de podredumbre parda. El conocimiento de la estructura poblacional de los patógenos sienta las bases para establecer métodos más eficaces de manejo de las enfermedades. Por ello en esta Tesis se ha estudiado el grado de diversidad genética entre distintas poblaciones de M. laxa en diferentes localidades españolas utilizando 144 marcadores RAPDs (59 polimórficos y 85 monomórficos) y 21 aislados. El análisis de la estructura de la población reveló que la diversidad genética dentro de las subpoblaciones (huertos) (HS) representaba el 97% de la diversidad genética (HT), mientras que la diversidad genética entre subpoblaciones (DST) sólo representaba un 3% del total de esta diversidad. La magnitud relativa de la diferenciación génica entre subpoblaciones (GST) alcanzaba 0,032 y el número estimado de migrantes por generación (Nm) fue de 15,1. Los resultados obtenidos en los dendrogramas estaban de acuerdo con el análisis de diversidad génica. Las agrupaciones obtenidas eran independientes del huerto de procedencia, año o huésped. 
The Thesis discusses the relative importance of the different evolutionary forces acting on M. laxa populations. Finally, a survey was carried out in different peach and nectarine orchards of the Ebro Valley to determine whether isolates resistant to benzimidazole and dicarboximide fungicides are present; these fungicides are habitually used to control brown rot and carry a high risk of resistance development in pathogen populations. The analysis of 114 M. laxa isolates with the fungicides benomyl (benzimidazole, 1 µg a.i. ml⁻¹) and iprodione (dicarboximide, 5 µg a.i. ml⁻¹) showed that none was resistant at the doses tested. Monilinia spp. (M. laxa, M. fructigena and M. fructicola) cause bud and flower wilt, branch cankers and stone fruit rot, giving rise to important economic losses in years with favourable environmental conditions; this is particularly significant in late varieties of peach and nectarine. At present, M. laxa is the major species on peach and nectarine in Spain, followed by M. fructigena in a smaller proportion. The recent introduction into Europe of the quarantine organism M. fructicola makes the detection and identification of each of these species necessary. In addition, several aspects of the etiology and epidemiology of the disease are not well known under Spanish conditions. The first goal of this Thesis was the detection and identification of the Monilinia spp. causing brown rot; the study of the epidemiological basis for disease control was the second objective. A pair of primers for PCR detection was designed from the ribosomal DNA sequence in order to detect Monilinia spp. in plant material and to discriminate it from other fungi colonizing the peach tree surface. Discrimination among Monilinia spp. was achieved by means of SCAR (Sequence Characterized Amplified Region) markers obtained from a random amplified polymorphic DNA (RAPD) study.
An internal control (IC) for the PCR, based on a mimic plasmid built on the Monilinia-specific primers, was constructed for use in the brown rot diagnosis protocol in order to avoid false negatives caused by PCR inhibition from residual plant material. A commercial kit based on DAS-ELISA and monoclonal antibodies was successfully tested to distinguish Monilinia from other fungal genera and M. fructicola from the other Monilinia species. Different materials, such as mycelium or conidia from PDA cultures and mycelium from the surface of fresh fruits or mummies, were tested as antigens in this Thesis. The ELISA results were compared with classical identification by morphological methods and with PCR using the primers developed in this Thesis. The results demonstrated that this method allows early detection in fresh fruit, enabling more effective prevention of M. fructicola in stone fruit. The epidemiological study of the disease began by determining the main sources of primary inoculum and their relative importance in peach and nectarine trees in the Ebro Valley. Nine orchards were evaluated from 2003 to 2005, collecting all mummies, aborted fruits, gum exudates, cankers and necrotic buds from the trees. Apparently healthy buds and plant material lying on the ground were also collected, and the presence of Monilinia spp. in these samples was determined. The results showed that the main inoculum sources are the mummies that remain in the trees, where survival of the fungus over winter is very high. Mummies on the ground and necrotic buds are also sources of inoculum. Consequently, an important recommendation for growers is to remove this material from the orchards. An issue not well studied in peach or nectarine in Spain is the possible relationship between the incidence of latent infections in immature fruits and the incidence of fruit rot at harvest and postharvest.
This relationship had previously been shown in other stone fruit trees infected with M. fructicola in different countries around the world. To study it, experiments were run in five commercial peach and nectarine orchards located in the Ebro Valley at four phenological stages from 2000 to 2002. Latent infections were not observed at the pink-bud stage; the maximum incidence occurred at pre-harvest, although in some orchards another peak occurred at embryo hardening. The most prevalent species was M. laxa. A significant positive correlation was obtained between the incidence of latent infections and the incidence of postharvest rot. A model predicting latent infections from temperature (T) and wetness duration (W) was also developed. This model showed that T and W explained 83% of the variation in the incidence of latent infections caused by Monilinia spp. Below 8ºC no latent infection was predicted; more than 22 h of W were needed to predict latent infection at T = 8ºC, whereas at 25ºC just 5 h of W were enough. Tests under controlled conditions were also performed to determine the relationship among latent infection incidence, environmental conditions (T and W), inoculum concentration of the pathogen (I) and developmental stage of the host (S), and to validate the prediction model developed from the field experiments. These tests used nectarine flowers and fruits from a commercial orchard at six phenological stages in 2004 and 2005, showing that the incidence of postharvest rot and of latent infections was affected by T, W, I and S. In fruits, latent infections took place when T was not suitable for rot development. Once the embryo had developed, more than 4-5 h of W per day and a higher inoculum (10⁴ conidia ml⁻¹) were necessary for rot or latent infection to develop. The equation of the model obtained with the field data was able to predict the data observed in these experiments.
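The fitted model equation itself is not reproduced in the abstract, so the following is only an illustrative stand-in, not the authors' equation: it linearly interpolates the minimum wetness duration between the two reported anchor points (more than 22 h at 8ºC, about 5 h at 25ºC), with no latent infection predicted below 8ºC.

```python
def wetness_threshold_hours(temp_c):
    """Approximate wetness duration (h) needed for latent infection at a
    given temperature, interpolated between the two values reported in the
    abstract: 22 h at 8 degC and 5 h at 25 degC.  Illustrative only; the
    actual fitted T-W response surface is in the Thesis, not shown here."""
    if temp_c < 8.0:
        return float("inf")  # below 8 degC no latent infection is predicted
    t = min(temp_c, 25.0)
    # straight line through (8 degC, 22 h) and (25 degC, 5 h)
    return 22.0 + (t - 8.0) * (5.0 - 22.0) / (25.0 - 8.0)

def latent_infection_predicted(temp_c, wetness_h):
    """True if the observed wetness period exceeds the threshold."""
    return wetness_h > wetness_threshold_hours(temp_c)
```

A real implementation would use the fitted equation from the field experiments rather than this straight-line approximation.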
To evaluate the effect of Monilinia spp. inoculum on the incidence of latent infections and of postharvest fruit rot, 11 experiments were performed in commercial peach and nectarine orchards of the Ebro Valley from 2002 to 2005. A positive correlation was observed between the number of Monilinia spp. conidia on the fruit surface and the incidence of latent infections. From these studies, two further important recommendations for growers can be deduced: control strategies must consider latent infections and estimate their potential risk based on T and W, and the concentration of Monilinia spp. spores on the fruit surface must also be taken into account to reduce the risk of brown rot. Knowledge of the population structure of pathogens lays the foundations for more effective disease management methods. For that reason, the degree of genetic diversity among M. laxa populations from different Spanish locations was studied in this Thesis, using 144 RAPD markers (59 polymorphic and 85 monomorphic) on 21 fungal isolates. Analysis of the population structure revealed that the genetic diversity within subpopulations (orchards) (HS) represented 97% of the total genetic diversity (HT), whereas the genetic diversity between subpopulations (DST) represented only 3% of the total. The relative magnitude of gene differentiation between subpopulations (GST) reached 0.032, and the estimated number of migrants per generation (Nm) was 15.1. The dendrograms obtained were in agreement with the gene diversity analysis, and the resulting groupings were independent of the orchard of origin, year or host. The Thesis discusses the relative importance of the different evolutionary forces acting on M. laxa populations.
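The reported partition follows Nei's gene diversity statistics, and the Nm figure is consistent with the island-model estimate for a haploid organism, Nm = (1 − GST)/(2·GST). A short sketch (HT is normalized to 1.0 here because the abstract reports only proportions; the 97%/3% split in the text is the rounded form of GST = 0.032):

```python
def nei_partition(h_t, h_s):
    """Partition Nei's total gene diversity H_T into the within-subpopulation
    component H_S and the between-subpopulation component D_ST = H_T - H_S,
    then derive the relative differentiation G_ST = D_ST / H_T and the
    island-model migrant estimate Nm = (1 - G_ST) / (2 * G_ST) for a
    haploid fungus such as M. laxa."""
    d_st = h_t - h_s
    g_st = d_st / h_t
    nm = (1.0 - g_st) / (2.0 * g_st)
    return d_st, g_st, nm

# With H_S = 96.8% of H_T (i.e. G_ST = 0.032 as reported):
d_st, g_st, nm = nei_partition(h_t=1.0, h_s=0.968)
print(round(g_st, 3), round(nm, 1))  # 0.032 15.1
```

This reproduces the abstract's Nm of 15.1, suggesting substantial gene flow between orchards, in line with the orchard-independent dendrogram groupings.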
Finally, a survey of isolates from different peach and nectarine orchards of the Ebro Valley was carried out to determine whether resistance exists to fungicides of the benzimidazole and dicarboximide groups, which are habitually used to control brown rot and carry a high risk of resistance development in pathogen populations. The analysis of 114 M. laxa isolates with the fungicides benomyl (benzimidazole, 1 µg a.i. ml⁻¹) and iprodione (dicarboximide, 5 µg a.i. ml⁻¹) showed that none was resistant at the doses evaluated.
Resumo:
Sustaining irrigated agriculture to meet food production needs while maintaining aquatic ecosystems is at the heart of many policy debates in various parts of the world, especially in arid and semi-arid areas. Researchers and practitioners are increasingly calling for integrated approaches, and policy-makers are progressively supporting the inclusion of ecological and social aspects in water management programs. This paper contributes to this policy debate by providing an integrated economic-hydrologic modeling framework that captures the socio-economic and environmental effects of various policy initiatives and climate variability. This modeling integration includes a risk-based economic optimization model and a hydrologic water management simulation model, both specified for the Middle Guadiana basin, a vulnerable drought-prone agro-ecological area with highly regulated river systems in southwest Spain. Two key water policy interventions were investigated: the implementation of minimum environmental flows (supported by the European Water Framework Directive, EU WFD), and a reduction in the legal amount of water delivered for irrigation (a planned measure included in the new Guadiana River Basin Management Plan, GRBMP, still under discussion). Results indicate that current patterns of excessive water use for irrigation in the basin may put environmental flow demands at risk, jeopardizing the WFD's goal of restoring the "good ecological status" of water bodies by 2015. Conflicts between environmental and agricultural water uses will be exacerbated during prolonged dry episodes, and particularly in summer low-flow periods, when crop irrigation water requirements increase substantially. Securing minimum stream flows would entail a substantial reduction in irrigation water use for rice cultivation, which might affect the profitability and economic viability of small rice-growing farms located upstream in the river.
The new GRBMP could contribute to balancing competing water demands in the basin and to increasing economic water productivity, but it might not be sufficient to ensure the provision of environmental flows as required by the WFD. A thorough revision of the basin's water-use concession system for irrigation seems to be needed to bring the GRBMP in line with the WFD objectives. Furthermore, the study illustrates that social, economic, institutional and technological factors, in addition to bio-physical conditions, are important issues to consider when designing and developing water management strategies. The research initiative presented in this paper demonstrates that hydro-economic models can explicitly integrate all these issues, constituting a valuable tool that could assist policy-makers in implementing sustainable irrigation policies.
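The paper's coupled economic-hydrologic model is far richer than anything that fits here, but the central trade-off it analyses, reserving a minimum environmental flow before allocating the remainder to crops of different economic water productivity, can be sketched with a toy allocator. All crop names and numbers below are illustrative assumptions, not values from the Guadiana study.

```python
def allocate_irrigation(inflow, env_flow_min, demands):
    """Toy allocation: reserve the minimum environmental flow first, then
    serve crop demands in descending order of economic water productivity
    (profit per unit of water).  'demands' is a list of
    (crop, water_demand, profit_per_unit) tuples; units are arbitrary."""
    available = max(inflow - env_flow_min, 0.0)
    allocation, profit = {}, 0.0
    for crop, demand, unit_profit in sorted(
            demands, key=lambda d: d[2], reverse=True):
        served = min(demand, available)  # serve as much as water allows
        allocation[crop] = served
        profit += served * unit_profit
        available -= served
    return allocation, profit

# Hypothetical figures: 100 units of inflow, a 30-unit environmental flow,
# rice as the thirstiest and least water-productive crop.
alloc, profit = allocate_irrigation(
    inflow=100.0, env_flow_min=30.0,
    demands=[("rice", 60.0, 0.5), ("maize", 30.0, 0.8)])
# maize is fully served; rice is curtailed first
```

Under these assumed numbers the low-productivity, water-intensive crop (rice) is the first use to be curtailed, mirroring the abstract's finding that securing minimum stream flows mainly reduces irrigation water use for rice.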
Resumo:
Drought affects all sectors of society, and its frequency and intensity are expected to increase due to climate change. Its management poses important challenges for the future. The risk-based approach, which promotes a proactive response, is identified as an appropriate management framework that is beginning to consolidate internationally. However, studies are needed on the characteristics of drought management under this approach and its practical implications. This thesis evaluates several elements relevant to drought management from different perspectives, with special emphasis on the social component of drought. Five studies were developed for this research: (1) an analysis of the emergency laws approved during the 2005-2008 drought in Spain; (2) a study of farmers' perception of drought at the local level; (3) an assessment of the management characteristics and approach in six case studies at the European level; (4) a systematic analysis of studies quantifying drought vulnerability at the global level; and (5) an analysis of drought impacts based on a European database. The studies show the importance of institutional capacity as a factor that promotes and facilitates the adoption of the risk-based approach. At the same time, the lack of vulnerability studies, limited knowledge of impacts and a weak culture of post-drought evaluation stand out as important constraints on exploiting the knowledge generated in managing an event. The study of the drought laws reveals inconsistencies between how the drought problem is defined and the solutions proposed, as well as the use of a securitization discourse to pursue objectives beyond drought management.
The perception study identifies the existence of different drought problems and perceptions, and shows that irrigators mainly use impacts to identify and characterize the severity of an event, which differs from the definitions prevailing at other management levels. This highlights the importance of considering the diversity of definitions and perceptions, so that management is better adjusted to the needs of different sectors and groups. The analysis of drought management in six case studies at the European level identified different degrees of adoption of the risk-based approach in practice. The analytical framework established, based on six dimensions of analysis and 21 criteria, proved to be a useful tool for diagnosing which elements work and which need to be improved in relation to drought risk management. The systematic analysis of vulnerability studies revealed the heterogeneity of the conceptual frameworks used, as well as weaknesses in the vulnerability factors usually included, in many cases arising from the lack of data. The systematic collection of information on drought impacts revealed the scarcity of information on the subject at the European level and the importance of information management. The impacts database developed has great potential as an exploratory tool indicating the type of impacts drought produces in each region, but it still faces challenges regarding its content, management process and practical usefulness. There are important limitations linked to the access to and availability of relevant information and data related to drought management and all its components.
Participation, management levels, the sectoral perspective and the relationships among the risk management components considered are critical aspects that need to be improved in the future. Taken together, the five articles present concrete examples that help to better understand drought management and that may be useful for policy-makers, managers and users. ABSTRACT Drought affects all sectors, and its frequency and intensity are expected to increase due to climate change. Drought management will pose significant challenges in the future. A drought risk management approach promotes a proactive response, and it is starting to consolidate internationally. However, studies are still needed on the characteristics of drought risk management and its practical implications. This thesis provides an evaluation of various relevant aspects of drought management from different perspectives, with special emphasis on the social component of droughts. For the purpose of this research, five studies have been carried out: (1) an analysis of the emergency laws adopted during the 2005-2008 drought in Spain; (2) a study of farmers' perception of drought at a local level; (3) an assessment of the characteristics of and issues in drought management in six case studies across Europe; (4) a systematic analysis of drought vulnerability assessments; and (5) an analysis of drought impacts from a European text-based impacts database. The results show the importance of institutional capacity as a factor that promotes and facilitates the adoption of a risk-based approach.
In contrast, the following issues are identified as the main obstacles to taking advantage of the lessons learnt: (1) the lack of vulnerability studies, (2) limited knowledge about impacts and (3) the limited availability of post-drought assessments. The drought emergency laws evidence inconsistencies between how the drought problem is defined and the measures proposed as solutions. Moreover, a securitization discourse is used to pursue goals beyond drought management. The study of farmers' perception identifies the existence of several definitions of drought and highlights the importance of impacts in defining and characterizing the severity of an event. This definition, however, differs from the ones used at other institutional and management levels, which underlines the importance of considering the diversity of definitions and perceptions in order to better tailor drought management to the needs of different sectors and stakeholders. The analysis of drought management in six case studies across Europe shows different levels of adoption of the risk-based approach in practice. The analytical framework proposed, based on six dimensions and 21 criteria, has proven to be a useful tool for diagnosing the elements that work and those that need to be improved in relation to drought risk management. The systematic analysis of vulnerability assessment studies demonstrates the heterogeneity of the conceptual frameworks used; driven by the lack of relevant data, the studies also point out significant weaknesses in the vulnerability factors that are typically included. The heterogeneity of the impact data collected at the European level to build the European Drought Impact Reports Database (EDII) highlights the importance of information management. The database has great potential as an exploratory tool and provides useful, indicative information on the type of impacts that occurred in a particular region.
However, it still presents some challenges regarding its content, the process of data collection and management, and its practical usefulness. There are significant limitations associated with the access to and availability of relevant information and data related to drought management and its components. The following critical aspects have been identified as areas for improvement in the near future: participation, levels of drought management, the sectoral perspective and an in-depth assessment of the relationships between the components of drought risk management. Taken together, the five articles presented in this dissertation provide concrete examples of drought management evaluation that help to better understand drought management from a risk-based perspective, which can be useful for policy-makers, managers and users.
Resumo:
This Guideline is an official statement of the European Society of Gastrointestinal Endoscopy (ESGE). It addresses the diagnosis and management of nonvariceal upper gastrointestinal hemorrhage (NVUGIH). Main Recommendations MR1. ESGE recommends immediate assessment of hemodynamic status in patients who present with acute upper gastrointestinal hemorrhage (UGIH), with prompt intravascular volume replacement initially using crystalloid fluids if hemodynamic instability exists (strong recommendation, moderate quality evidence). MR2. ESGE recommends a restrictive red blood cell transfusion strategy that aims for a target hemoglobin between 7 g/dL and 9 g/dL. A higher target hemoglobin should be considered in patients with significant co-morbidity (e. g., ischemic cardiovascular disease) (strong recommendation, moderate quality evidence). MR3. ESGE recommends the use of the Glasgow-Blatchford Score (GBS) for pre-endoscopy risk stratification. Outpatients determined to be at very low risk, based upon a GBS score of 0 - 1, do not require early endoscopy nor hospital admission. Discharged patients should be informed of the risk of recurrent bleeding and be advised to maintain contact with the discharging hospital (strong recommendation, moderate quality evidence). MR4. ESGE recommends initiating high dose intravenous proton pump inhibitors (PPI), intravenous bolus followed by continuous infusion (80 mg then 8 mg/hour), in patients presenting with acute UGIH awaiting upper endoscopy. However, PPI infusion should not delay the performance of early endoscopy (strong recommendation, high quality evidence). MR5. ESGE does not recommend the routine use of nasogastric or orogastric aspiration/lavage in patients presenting with acute UGIH (strong recommendation, moderate quality evidence). MR6. 
ESGE recommends intravenous erythromycin (single dose, 250 mg given 30 - 120 minutes prior to upper gastrointestinal [GI] endoscopy) in patients with clinically severe or ongoing active UGIH. In selected patients, pre-endoscopic infusion of erythromycin significantly improves endoscopic visualization, reduces the need for second-look endoscopy, decreases the number of units of blood transfused, and reduces duration of hospital stay (strong recommendation, high quality evidence). MR7. Following hemodynamic resuscitation, ESGE recommends early (≤ 24 hours) upper GI endoscopy. Very early (< 12 hours) upper GI endoscopy may be considered in patients with high risk clinical features, namely: hemodynamic instability (tachycardia, hypotension) that persists despite ongoing attempts at volume resuscitation; in-hospital bloody emesis/nasogastric aspirate; or contraindication to the interruption of anticoagulation (strong recommendation, moderate quality evidence). MR8. ESGE recommends that peptic ulcers with spurting or oozing bleeding (Forrest classification Ia and Ib, respectively) or with a nonbleeding visible vessel (Forrest classification IIa) receive endoscopic hemostasis because these lesions are at high risk for persistent bleeding or rebleeding (strong recommendation, high quality evidence). MR9. ESGE recommends that peptic ulcers with an adherent clot (Forrest classification IIb) be considered for endoscopic clot removal. Once the clot is removed, any identified underlying active bleeding (Forrest classification Ia or Ib) or nonbleeding visible vessel (Forrest classification IIa) should receive endoscopic hemostasis (weak recommendation, moderate quality evidence). MR10. In patients with peptic ulcers having a flat pigmented spot (Forrest classification IIc) or clean base (Forrest classification III), ESGE does not recommend endoscopic hemostasis as these stigmata present a low risk of recurrent bleeding. 
In selected clinical settings, these patients may be discharged to home on standard PPI therapy, e. g., oral PPI once-daily (strong recommendation, moderate quality evidence). MR11. ESGE recommends that epinephrine injection therapy not be used as endoscopic monotherapy. If used, it should be combined with a second endoscopic hemostasis modality (strong recommendation, high quality evidence). MR12. ESGE recommends PPI therapy for patients who receive endoscopic hemostasis and for patients with adherent clot not receiving endoscopic hemostasis. PPI therapy should be high dose and administered as an intravenous bolus followed by continuous infusion (80 mg then 8 mg/hour) for 72 hours post endoscopy (strong recommendation, high quality evidence). MR13. ESGE does not recommend routine second-look endoscopy as part of the management of nonvariceal upper gastrointestinal hemorrhage (NVUGIH). However, in patients with clinical evidence of rebleeding following successful initial endoscopic hemostasis, ESGE recommends repeat upper endoscopy with hemostasis if indicated. In the case of failure of this second attempt at hemostasis, transcatheter angiographic embolization (TAE) or surgery should be considered (strong recommendation, high quality evidence). MR14. In patients with NVUGIH secondary to peptic ulcer, ESGE recommends investigating for the presence of Helicobacter pylori in the acute setting with initiation of appropriate antibiotic therapy when H. pylori is detected. Re-testing for H. pylori should be performed in those patients with a negative test in the acute setting. Documentation of successful H. pylori eradication is recommended (strong recommendation, high quality evidence). MR15. In patients receiving low dose aspirin for secondary cardiovascular prophylaxis who develop peptic ulcer bleeding, ESGE recommends aspirin be resumed immediately following index endoscopy if the risk of rebleeding is low (e. g., FIIc, FIII). 
In patients with high risk peptic ulcer (FIa, FIb, FIIa, FIIb), early reintroduction of aspirin by day 3 after index endoscopy is recommended, provided that adequate hemostasis has been established (strong recommendation, moderate quality evidence).
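The triage rule in MR3 and the lesion-management rules in MR8-MR10 are essentially decision tables, and can be paraphrased as code. The strings below are a shorthand of the guideline text, not a substitute for it, and the helper names are this sketch's own.

```python
def pre_endoscopy_triage(gbs):
    """MR3: a Glasgow-Blatchford Score of 0-1 identifies very-low-risk
    outpatients who require neither early endoscopy nor admission."""
    if gbs <= 1:
        return "outpatient, no early endoscopy"
    return "admit for early endoscopy"

def forrest_management(forrest_class):
    """MR8-MR10: map a Forrest class to the recommended management."""
    fc = forrest_class.upper()
    if fc in ("IA", "IB", "IIA"):   # spurting, oozing, nonbleeding visible vessel
        return "endoscopic hemostasis"
    if fc == "IIB":                 # adherent clot
        return "consider clot removal; treat any exposed Ia/Ib/IIa lesion"
    if fc in ("IIC", "III"):        # flat pigmented spot, clean base
        return "no endoscopic hemostasis; standard PPI therapy"
    raise ValueError(f"unknown Forrest class: {forrest_class}")
```

For example, a Forrest Ib (oozing) ulcer maps to endoscopic hemostasis, while a clean-base (III) ulcer maps to PPI therapy alone, matching the low rebleeding risk noted in MR10.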