19 results for risk assessments

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 70.00%

Abstract:

Conventional risk assessments for crop protection chemicals compare the potential for causing toxicity (hazard identification) to anticipated exposure. New regulatory approaches have been proposed that would exclude exposure assessment and focus solely on hazard identification based on endocrine disruption. This review comprises a critical analysis of hazard, focusing on the relative sensitivity of endocrine and non-endocrine endpoints, using a class of crop protection chemicals, the azole fungicides. These were selected because they are widely used on important crops (e.g. grains) and can thereby come into contact with target and non-target plants and enter the food chain of humans and wildlife. Inhibition of lanosterol 14α-demethylase (CYP51) mediates the antifungal effect. Inhibition of other CYPs, such as aromatase (CYP19), can lead to numerous toxicological effects, which are also evident from high-dose human exposure to therapeutic azoles. Because of its widespread use and substantial database, epoxiconazole was selected as a representative azole fungicide. Our critical analysis concluded that anticipated human exposure to epoxiconazole would yield a margin of safety of at least three orders of magnitude for reproductive effects observed in laboratory rodent studies that are postulated to be endocrine-driven (i.e. fetal resorptions). The most sensitive ecological species is the aquatic plant Lemna (duckweed), for which the margin of safety is less protective than for human health. For humans and wildlife, endocrine disruption is not the most sensitive endpoint. It is concluded that conventional risk assessment, considering anticipated exposure levels, will be protective of both human and ecological health. Although the toxic mechanisms of other azole compounds may be similar, large differences in potency will require a case-by-case risk assessment.
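
The conventional comparison described above boils down to a margin of exposure (or margin of safety): the ratio of the most sensitive toxicological point of departure to the anticipated exposure. A minimal sketch of that arithmetic, using purely hypothetical numbers (the review reports only that the resulting margin for epoxiconazole exceeds three orders of magnitude):

```python
# Hypothetical margin-of-exposure calculation; the values below are
# illustrative placeholders, not figures taken from the review.
def margin_of_exposure(point_of_departure: float, anticipated_exposure: float) -> float:
    """Ratio of a toxicological point of departure (e.g. a NOAEL, in mg/kg bw/day)
    to the estimated daily exposure (same units); larger means a wider safety margin."""
    return point_of_departure / anticipated_exposure

noael = 5.0        # mg/kg bw/day, hypothetical rodent NOAEL for the sensitive endpoint
exposure = 0.001   # mg/kg bw/day, hypothetical anticipated dietary exposure

print(f"margin of exposure = {margin_of_exposure(noael, exposure):,.0f}")  # 5,000
```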

Relevance: 60.00%

Abstract:

1. Pollinating insects provide crucial and economically important ecosystem services to crops and wild plants, but pollinators, particularly bees, are globally declining as a result of various driving factors, including the prevalent use of pesticides for crop protection. Sublethal pesticide exposure negatively impacts numerous pollinator life-history traits, but its influence on reproductive success remains largely unknown. Such information is pivotal, however, to our understanding of the long-term effects on population dynamics. 2. We investigated the influence of field-realistic trace residues of the routinely used neonicotinoid insecticides thiamethoxam and clothianidin in nectar substitutes on the entire lifetime fitness performance of the red mason bee Osmia bicornis. 3. We show that chronic, dietary neonicotinoid exposure has severe detrimental effects on solitary bee reproductive output. Neonicotinoids did not affect adult bee mortality; however, monitoring of fully controlled experimental populations revealed that sublethal exposure resulted in almost 50% reduced total offspring production and a significantly male-biased offspring sex ratio. 4. Our data add to the accumulating evidence indicating that sublethal neonicotinoid effects on non-Apis pollinators are expressed most strongly in a rather complex, fitness-related context. Consequently, to fully mitigate long-term impacts on pollinator population dynamics, present pesticide risk assessments need to be expanded to include whole life-cycle fitness estimates, as demonstrated in the present study using O. bicornis as a model.

Relevance: 60.00%

Abstract:

Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork meat production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food-borne hazards in pork in order to assess the risk these hazards pose to the health of local consumers and to make recommendations for improving food safety. A secondary objective was to evaluate the utility of risk-based approaches to food safety in an informal food system. We investigated samples from pigs and pork sourced at slaughter in urban and rural environments, and at retail, to assess a selection of food-borne hazards. In addition, consumer exposure was characterized using information about hygiene and practices related to handling and preparing pork. A qualitative hazard characterization, exposure assessment and risk characterization for three representative hazards or hazard proxies, namely Enterobacteriaceae, T. solium cysticercosis and antibiotic residues, is presented. Several important potential food-borne pathogens are reported for the first time, including Listeria spp. and Brucella suis. This descriptive pilot study is the first risk-based assessment of food safety in Nagaland. We also characterise possible interventions to be addressed by policy makers, and supply data to inform future risk assessments.

Relevance: 60.00%

Abstract:

Bovine spongiform encephalopathy (BSE), popularly known as 'mad cow disease', led to an epidemic in Europe that peaked in the mid-1990s. Its impact on developing countries such as Nigeria has not been fully established, as reliable information on livestock and disease surveillance has been difficult to obtain. The BSE risk to Nigeria's cattle population currently remains undetermined, which has resulted in international trade restrictions on commodities from the cattle population. This is mainly because of a lack of updated BSE risk assessments and disease surveillance data. To evaluate the feasibility of BSE surveillance in Nigeria, we carried out a pilot study targeting cattle presented for emergency or casualty slaughter. In total, 1551 cattle of local breeds, aged 24 months and above, were clinically examined. Ataxia, recumbency and other neurological signs were the principal selection criteria. A total of 96 cattle (6.2%) presented clinical signs consistent with suspected BSE. The caudal brainstem tissues of these animals were collected post-mortem and analysed for the disease-specific form of the prion protein using a rapid test approved by the World Organisation for Animal Health (OIE). None of the samples were positive for BSE. Although our findings do not exclude the presence of BSE in Nigeria, they do demonstrate that targeted sampling of clinically suspected BSE cases is feasible in developing countries. In addition, these findings point to the possibility of implementing clinical monitoring schemes for BSE and potentially other diseases with grave economic and public health consequences.

Relevance: 60.00%

Abstract:

In this work we propose the adoption of a statistical framework used in the evaluation of forensic evidence as a tool for evaluating and presenting circumstantial "evidence" of a disease outbreak from syndromic surveillance. The basic idea is to exploit the predicted distributions of reported cases to calculate the ratio of the likelihood of observing n cases given an ongoing outbreak over the likelihood of observing n cases given no outbreak. The likelihood ratio defines the Value of Evidence (V). Using Bayes' rule, the prior odds for an ongoing outbreak are multiplied by V to obtain the posterior odds. This approach was applied to time series on the number of horses showing clinical respiratory symptoms or neurological symptoms. The separation between prior beliefs about the probability of an outbreak and the strength of evidence from syndromic surveillance offers a transparent reasoning process suitable for supporting decision makers. The value of evidence can be translated into a verbal statement, as often done in forensics or used for the production of risk maps. Furthermore, a Bayesian approach offers seamless integration of data from syndromic surveillance with results from predictive modeling and with information from other sources such as disease introduction risk assessments.
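
The core quantity is the likelihood ratio defined above. A minimal sketch, assuming purely for illustration that the predicted number of reported cases follows a Poisson distribution under each hypothesis (the abstract does not specify which predictive distribution was used):

```python
from scipy.stats import poisson

def value_of_evidence(n_cases: int, mu_outbreak: float, mu_no_outbreak: float) -> float:
    """V = P(n cases | ongoing outbreak) / P(n cases | no outbreak)."""
    return poisson.pmf(n_cases, mu_outbreak) / poisson.pmf(n_cases, mu_no_outbreak)

# Hypothetical expected daily counts of horses with respiratory signs.
V = value_of_evidence(n_cases=12, mu_outbreak=10.0, mu_no_outbreak=4.0)

prior_odds = 0.01 / 0.99           # prior belief that an outbreak is ongoing
posterior_odds = prior_odds * V    # Bayes' rule in odds form
print(f"V = {V:.1f}, posterior odds = {posterior_odds:.3f}")
```

As in forensic practice, V > 1 shifts belief towards an ongoing outbreak and V < 1 against it, while the prior odds remain the decision maker's separate responsibility.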

Relevance: 60.00%

Abstract:

The recent Q fever epidemic in the Netherlands raised concerns about the potential risk of outbreaks in other European countries. In Switzerland, the prevalence of Q fever in animals and humans has not been studied in recent years. In this study, we describe the current situation with respect to Coxiella (C.) burnetii infections in small ruminants and humans in Switzerland, as a basis for future epidemiological investigations and public health risk assessments. Specific objectives of this cross-sectional study were to (i) estimate the seroprevalence of C. burnetii in sheep and goats, (ii) quantify the amount of bacteria shed during abortion and (iii) analyse temporal trends in human C. burnetii infections. The seroprevalence of C. burnetii in small ruminants was determined by commercial ELISA from a representative sample of 100 sheep flocks and 72 goat herds. Herd-level seroprevalence was 5.0% (95% CI: 1.6-11.3) for sheep and 11.1% (95% CI: 4.9-20.7) for goats. Animal-level seroprevalence was 1.8% (95% CI: 0.8-3.4) for sheep and 3.4% (95% CI: 1.7-6.0) for goats. The quantification of C. burnetii in 97 ovine and caprine abortion samples by real-time PCR indicated shedding of >10^4 bacteria/g in 13.4% of all samples tested. To our knowledge, this is the first study reporting C. burnetii quantities in a large number of small ruminant abortion samples. Annual human Q fever serology data were provided by five major Swiss laboratories. Overall, seroprevalence in humans ranged between 1.7% and 3.5% from 2007 to 2011, and no temporal trends were observed. Interestingly, the two laboratories with significantly higher seroprevalences are located in the regions with the largest goat populations as well as, for one laboratory, with the highest livestock density in Switzerland. However, a direct link between animal and human infection data could not be established in this study.
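
The herd-level figures above are simple seroprevalence estimates with 95% confidence intervals (e.g. 5 of 100 sheep flocks seropositive). A minimal sketch of how such an interval can be reproduced with an exact (Clopper-Pearson) binomial method; the abstract does not state which interval method was actually used, so this is an illustration only:

```python
from scipy.stats import beta

def clopper_pearson(k: int, n: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact binomial confidence interval for k positives out of n tested."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

lo, hi = clopper_pearson(5, 100)            # 5 seropositive sheep flocks out of 100
print(f"5.0% (95% CI: {lo:.1%}-{hi:.1%})")  # approximately 1.6%-11.3%
```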

Relevance: 60.00%

Abstract:

Queen health is crucial to colony survival of social bees. Recently, queen failure has been proposed to be a major driver of managed honey bee colony losses, yet few data exist concerning effects of environmental stressors on queens. Here we demonstrate for the first time that exposure to field-realistic concentrations of neonicotinoid pesticides during development can severely affect queens of western honey bees (Apis mellifera). In pesticide-exposed queens, reproductive anatomy (ovaries) and physiology (spermathecal-stored sperm quality and quantity), rather than flight behaviour, were compromised and likely corresponded to reduced queen success (alive and producing worker offspring). This study highlights the detriments of neonicotinoids to queens of environmentally and economically important social bees, and further strengthens the need for stringent risk assessments to safeguard biodiversity and ecosystem services that are vulnerable to these substances.

Relevance: 30.00%

Abstract:

Tajikistan is judged to be highly vulnerable to risk, including food insecurity and climate change risks. By some vulnerability measures it is the most vulnerable among all 28 countries in the World Bank's Europe and Central Asia Region (ECA) (World Bank 2009). The rural population, with its relatively high incidence of poverty, is particularly vulnerable. The Pilot Program for Climate Resilience (PPCR) in Tajikistan (2011) provided an opportunity to conduct a farm-level survey with the objective of assessing various dimensions of the rural population's vulnerability to risk and their perception of constraints to farming operations and livelihoods. The survey is accordingly referred to as the 2011 PPCR survey.

The rural population in Tajikistan is highly agrarian, with about 50% of family income deriving from agriculture (see Figure 4.1; also LSMS 2007 – own calculations). Tajikistan's agriculture basically consists of two groups of producers: small household plots – the successors of Soviet "private agriculture" – and dehkan (or "peasant") farms – new family farming structures that began to be created under relevant legislation passed after 1992 (Lerman and Sedik, 2008). The household plots manage 20% of arable land and produce 65% of gross agricultural output (GAO). Dehkan farms manage 65% of arable land and produce close to 30% of GAO. The remaining 15% of arable land is held by agricultural enterprises – the rapidly shrinking sector of corporate farms that succeeded the Soviet kolkhozes and sovkhozes and today produces less than 10% of GAO (TajStat 2011).

The survey conducted in May 2011 focused on dehkan farms, as budgetary constraints precluded the inclusion of household plots. A total of 142 dehkan farms were surveyed in face-to-face interviews. They were sampled from 17 districts across all four regions – Sughd, Khatlon, RRP, and GBAO. The districts were selected so as to represent different agro-climatic zones, different vulnerability zones (based on the World Bank (2011) vulnerability assessment), and different food-insecurity zones (based on WFP/IPC assessments). Within each district, 3-4 jamoats were chosen at random and 2-3 farms were selected in each jamoat from lists provided by the jamoat administration, so as to maximize variability in farm characteristics. The sample design by region/district is presented in Table A, which also shows the agro-climatic zone and the food security phase for each district. The sample districts are superimposed on a map of food security phases based on the IPC April 2011 assessment.
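
The survey design above is a two-stage draw within purposively selected districts: jamoats chosen at random within each district, then a handful of farms per jamoat. A minimal sketch of that structure with an entirely made-up sampling frame; in the actual survey, farms were picked from jamoat lists so as to maximize variability rather than strictly at random, so the random draw here is a simplification:

```python
import random

# Hypothetical sampling frame: district -> jamoat -> candidate farm IDs.
frame = {
    "District_A": {"Jamoat_1": ["f01", "f02", "f03", "f04"],
                   "Jamoat_2": ["f05", "f06", "f07"],
                   "Jamoat_3": ["f08", "f09", "f10"],
                   "Jamoat_4": ["f11", "f12"]},
}

def draw_sample(frame, n_jamoats=3, n_farms=2, seed=2011):
    rng = random.Random(seed)
    sample = []
    for district, jamoats in frame.items():
        # Stage 1: random jamoats within the district.
        for jamoat in rng.sample(sorted(jamoats), min(n_jamoats, len(jamoats))):
            # Stage 2: a small number of farms within each selected jamoat.
            farms = rng.sample(jamoats[jamoat], min(n_farms, len(jamoats[jamoat])))
            sample.extend((district, jamoat, farm) for farm in farms)
    return sample

print(draw_sample(frame))   # e.g. 3 jamoats x 2 farms from District_A
```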

Relevance: 30.00%

Abstract:

Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating an adverse-effect NOEC (no observed effect concentration) or adverse-effect EC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). With current uncertainties, biomarker-derived NOEC or EC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to increasingly use biomarker data in environmental decision-making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines). While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action such that the environmental risk assessment for EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also considering natural variability) to help differentiate between statistically detectable versus biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision-making needs to move forward, and it is proposed that the "biomarkers as signposts" approach is a pragmatic way forward in the current risk assessment of EDCs.
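
In routine regulatory practice a PNEC is typically obtained by dividing the most sensitive adverse-effect end point (e.g. a chronic NOEC) by an assessment factor; the factor of 10 below is the value conventionally applied when chronic data are available for several trophic levels, and is an assumption for illustration rather than anything stated in the review:

```python
# Hypothetical PNEC derivation; the NOEC and the assessment factor are placeholders.
def derive_pnec(noec_ug_per_l: float, assessment_factor: float = 10.0) -> float:
    """Predicted no effect concentration = adverse-effect NOEC / assessment factor."""
    return noec_ug_per_l / assessment_factor

chronic_noec = 2.0   # µg/L, hypothetical adverse-effect NOEC from a fish chronic test
print(f"PNEC = {derive_pnec(chronic_noec):.2f} µg/L")   # 0.20 µg/L
```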

Relevance: 30.00%

Abstract:

BACKGROUND: Single-nucleotide polymorphisms in genes involved in lipoprotein and adipocyte metabolism may explain why dyslipidemia and lipoatrophy occur in some but not all antiretroviral therapy (ART)-treated individuals. METHODS: We evaluated the contribution of APOC3 -482C→T, -455T→C, and 3238C→G; the ε2 and ε4 alleles of APOE; and TNF -238G→A to dyslipidemia and lipoatrophy by longitudinally modeling >2600 lipid determinations and 2328 lipoatrophy assessments in 329 ART-treated patients during a median follow-up period of 3.4 years. RESULTS: In human immunodeficiency virus (HIV)-infected individuals, the effects of variant alleles of APOE on plasma cholesterol and triglyceride levels and of APOC3 on plasma triglyceride levels were comparable to those reported in the general population. However, when treated with ritonavir, individuals with unfavorable genotypes of APOC3 and APOE were at risk of extreme hypertriglyceridemia. They had median plasma triglyceride levels of 7.33 mmol/L, compared with 3.08 mmol/L in the absence of ART. The net effect of the APOE*APOC3*ritonavir interaction was an increase in plasma triglyceride levels of 2.23 mmol/L. No association between TNF -238G→A and lipoatrophy was observed. CONCLUSIONS: Variant alleles of APOE and APOC3 contribute to an unfavorable lipid profile in patients with HIV. Interactions between genotypes and ART can lead to severe hyperlipidemia. Genetic analysis may identify patients at high risk for severe ritonavir-associated hypertriglyceridemia.
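
The kind of longitudinal analysis summarised above can be sketched as a mixed-effects model with a genotype-by-ritonavir interaction term; everything below (data, effect sizes, variable names) is simulated purely to show the structure, not to reproduce the study's results:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_patients, n_visits = 50, 6

# Simulated long-format data: repeated triglyceride measurements per patient,
# an unfavourable-genotype indicator and a time-varying ritonavir indicator.
df = pd.DataFrame({
    "patient":   np.repeat(np.arange(n_patients), n_visits),
    "genotype":  np.repeat(rng.integers(0, 2, n_patients), n_visits),
    "ritonavir": rng.integers(0, 2, n_patients * n_visits),
})
df["tg"] = (3.0 + 0.5 * df["genotype"] + 0.8 * df["ritonavir"]
            + 2.2 * df["genotype"] * df["ritonavir"]      # built-in interaction effect
            + rng.normal(0, 1.0, len(df)))

# Random-intercept model with a genotype x ritonavir interaction.
fit = smf.mixedlm("tg ~ genotype * ritonavir", df, groups=df["patient"]).fit()
print(fit.summary())
```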

Relevance: 30.00%

Abstract:

The European Water Framework Directive (WFD) requires a status assessment of all water bodies. If that status is found to be deteriorated, the WFD urges the identification of its potential causes so that appropriate management measures can be suggested. The instrument of investigative monitoring allows for such identification, provided that appropriate tools are available to link the observed effects to causative stressors, while unravelling confounding factors. In this chapter, the state of the art of status and causal pathway assessment is described for the major stressors responsible for the deterioration of European water bodies, i.e. toxicity, acidification, salinisation, eutrophication and oxygen depletion, parasites and pathogens, invasive alien species, hydromorphological degradation, changing water levels as well as sediments and suspended matter. For each stressor, an extensive description of the potential effects on the ecological status is given. Secondly, stressor-specific abiotic and biotic indicators are described that allow for a first indication of probable causes, based on the assessment of available monitoring data. Subsequently, more advanced tools for site-specific confirmation of the stressors at hand are discussed. Finally, the local status assessments are put into perspective with respect to the risk for downstream stretches, in order to prioritise stressors and select appropriate measures for mitigating the risks they pose.

Relevance: 30.00%

Abstract:

Risk behaviors such as substance use or deviance are often limited to the early stages of the life course. Whereas the onset of risk behavior is well studied, less is currently known about the decline and timing of cessation of risk behaviors across different domains during young adulthood. The prevalence and longitudinal developmental patterning of alcohol use, drinking to the point of drunkenness, smoking, cannabis use, deviance, and HIV-related sexual risk behavior were compared in a Swiss community sample (N = 2,843). Using a longitudinal cohort-sequential approach to link multiple assessments, with three waves of data for each individual, the study covered the ages of 16 to 29 years. Although smoking had a higher prevalence, both smoking and drinking to the point of drunkenness followed an inverted U-shaped curve. Alcohol consumption was also best described by a quadratic model, though it remained largely stable at a high level through the late 20s. Sexual risk behavior increased slowly from age 16 to age 22 and then remained largely stable. In contrast, cannabis use and deviance declined linearly from age 16 to age 29. Young men were at higher risk for all behaviors than were young women, but apart from deviance, the patterning over time was similar for both sexes. Results about the timing of increase and decline, as well as differences between risk behaviors, may inform tailored prevention programs during the transition from late adolescence to adulthood.