56 results for Risk based Maintenance
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Traditionally, the routine artificial digestion test is applied to assess the presence of Trichinella larvae in pigs. However, this diagnostic method has a low sensitivity compared with serological tests. The results of artificial digestion testing in Switzerland were evaluated over a period of 15 years to determine when freedom from infection could be confirmed on the basis of these data. Freedom was defined as a 95% probability that the prevalence of infection was below 0.0001%. Freedom was demonstrated after 12 years at the latest. A new risk-based surveillance approach based on serology was then developed. Risk-based surveillance was also assessed over 15 years, starting in 2010. It was shown that, by using this design, the sample size could be reduced by at least a factor of 4 compared with the traditional testing regimen, without lowering the level of confidence in the Trichinella-free status of the pig population.
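The freedom criterion above maps onto standard freedom-from-disease arithmetic: with design prevalence p and test sensitivity Se, the probability that n samples all test negative is (1 - p·Se)^n. A minimal sketch follows; the sensitivity value is an illustrative assumption, and the published analysis accumulates evidence over multiple years rather than sizing a single survey.

```python
import math

def freedom_sample_size(design_prev, sensitivity, confidence=0.95):
    """Samples needed for >= `confidence` probability of detecting at least
    one positive, assuming perfect specificity and a large population."""
    p_pos = design_prev * sensitivity  # chance a single sample tests positive
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_pos))

# Design prevalence from the abstract: 0.0001% = 1e-6.
# A digestion-test sensitivity of 0.4 is an illustrative assumption.
print(freedom_sample_size(1e-6, 0.4))  # ~7.5 million samples for one survey
```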
Abstract:
In all European Union countries, chemical residues must be routinely monitored in meat. Good farming and veterinary practice can prevent the contamination of meat with pharmaceutical substances, so random sampling detects few drug residues. An alternative approach is to target monitoring at farms suspected of treating their animals with antimicrobials. The objective of this project was to assess, using a stochastic model, the efficiency of these two sampling strategies. The model integrated data on Swiss livestock as well as expert opinion and results from studies conducted in Switzerland. Risk-based sampling increased detection efficiency by up to 100%, depending on the prevalence of contaminated herds. Sensitivity analysis of the model showed how strongly risk-based sampling depends on the accuracy of its prior assumptions. The resources freed by changing from random to risk-based sampling should be transferred to improving the quality of prior information.
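A toy Monte Carlo comparison can illustrate the efficiency gain from targeting; all prevalence and targeting parameters below are invented for illustration and are not those of the Swiss model.

```python
import random

random.seed(42)

N_HERDS = 10_000
PREV = 0.01      # assumed prevalence of contaminated herds
FLAG_SE = 0.6    # assumed chance the risk indicator flags a contaminated herd
FLAG_FP = 0.1    # assumed chance it wrongly flags a clean herd

contaminated = [random.random() < PREV for _ in range(N_HERDS)]
flagged = [i for i in range(N_HERDS)
           if random.random() < (FLAG_SE if contaminated[i] else FLAG_FP)]

n = 300  # sampling budget
random_hits = sum(contaminated[i] for i in random.sample(range(N_HERDS), n))
targeted_hits = sum(contaminated[i] for i in random.sample(flagged, n))
print(f"random sampling: {random_hits} positives; "
      f"risk-based sampling: {targeted_hits} positives")
```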
Abstract:
Swiss aquaculture farms were assessed according to their risk of acquiring or spreading viral haemorrhagic septicaemia (VHS) and infectious haematopoietic necrosis (IHN). Risk factors for the introduction and spread of VHS and IHN were defined and assessed using published data and expert opinions. Among the 357 aquaculture farms identified in Switzerland, 49.3% were categorised as high risk, 49.0% as medium risk and 1.7% as low risk. According to the new Directive 2006/88/EC for aquaculture of the European Union, the frequency of farm inspections must be derived from their risk levels. A sensitivity analysis showed that water supply and fish movements strongly influenced the output of the risk assessment for the introduction of VHS and IHN; fish movements also strongly influenced the output for the spread of these diseases.
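A minimal sketch of how such a categorisation could work, assuming a weighted sum of factor scores; the factor names, weights and category boundaries here are illustrative, whereas the published assessment derived its factors from literature data and expert opinion.

```python
# Hypothetical factor weights, for illustration only.
WEIGHTS = {"water_supply": 0.4, "fish_movements": 0.4, "species_held": 0.2}

def categorise(factor_scores):
    """Map a weighted sum of factor scores (each in [0, 1]) to a category."""
    score = sum(WEIGHTS[f] * factor_scores[f] for f in WEIGHTS)
    if score >= 0.6:
        return "high"
    if score >= 0.3:
        return "medium"
    return "low"

print(categorise({"water_supply": 0.9, "fish_movements": 0.8, "species_held": 0.5}))
# -> "high" (0.36 + 0.32 + 0.10 = 0.78)
```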
Abstract:
BACKGROUND: This study focused on the descriptive analysis of cattle movements and of farm-level parameters derived from cattle movements, which are considered generically suitable for risk-based surveillance systems in Switzerland for diseases where animal movements constitute an important risk pathway. METHODS: A framework was developed to select farms for surveillance based on a risk score summarizing five parameters. The proposed framework was validated using data from the bovine viral diarrhoea (BVD) surveillance programme in 2013. RESULTS: A cumulative score was calculated per farm, including the following parameters: the maximum monthly ingoing contact chain (in 2012), the average number of animals per incoming movement, the use of mixed alpine pastures and the number of weeks in 2012 in which a farm had movements registered. The final score for a farm depended on the distribution of the parameters. Different cut-offs (the 50th, 90th, 95th and 99th percentiles) were explored. The final scores ranged between 0 and 5. Validation of the scores against results from the 2013 BVD surveillance programme gave promising results when the cut-off for each of the five selected farm-level criteria was set at the 50th percentile. Restricting testing to farms with a score ≥ 2 would have detected the same number of BVD-positive farms as testing all farms, i.e., the outcome of the 2013 surveillance programme could have been achieved with a smaller survey. CONCLUSIONS: The seasonality and time dependency of the activity of single farms in the networks require a careful assessment of the time period used to determine the farm-level criteria. However, the selection of farms for risk-based surveillance can be optimized with the proposed scoring system, which was validated using data from the BVD eradication programme. The proposed method is a promising framework for selecting farms according to the risk of infection based on animal movements.
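The scoring rule lends itself to a compact sketch: compute the 50th-percentile cut-off of each movement parameter across farms, count per farm how many parameters exceed their cut-off, and test only farms scoring at least 2. Parameter names and the synthetic data below are illustrative.

```python
import random
import statistics

random.seed(0)

# Parameter names are illustrative; the paper uses, among others, the maximum
# monthly ingoing contact chain, animals per incoming movement, mixed alpine
# pasture use and the number of weeks with registered movements.
PARAMS = ["ingoing_contact_chain", "animals_per_move", "alpine_pasture_use",
          "weeks_with_moves", "n_incoming_moves"]

farms = [{p: random.random() for p in PARAMS} for _ in range(200)]  # synthetic

# 50th-percentile cut-off per parameter (index 49 of the 99 cut points).
cutoffs = {p: statistics.quantiles([f[p] for f in farms], n=100)[49]
           for p in PARAMS}
scores = [sum(f[p] > cutoffs[p] for p in PARAMS) for f in farms]  # 0..5

selected = [i for i, s in enumerate(scores) if s >= 2]  # test only score >= 2
print(f"{len(selected)} of {len(farms)} farms selected for testing")
```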
Abstract:
The parasite Echinococcus multilocularis was first detected in the Netherlands in 1996, and repeated studies have shown that the parasite subsequently spread within the local fox population in the province of Limburg. It was not possible to quantify the human risk of alveolar echinococcosis because the relationship between the amount of parasite eggs in the environment and the probability of infection in humans was unknown. Here, we used the spread of the parasite in the Netherlands as a predictor, together with recently published historical records of the epidemiology of alveolar echinococcosis in Switzerland, to achieve a relative quantification of the risk. Based on these analyses, the human risk in Limburg was simulated, and up to three human cases are predicted by 2018. We conclude that the epidemiology of alveolar echinococcosis in the Netherlands might have changed from a period of negligible risk in the past to a period of increasing risk in the forthcoming years.
Abstract:
Switzerland implemented risk-based monitoring of Swiss dairy products in 2002, based on a risk assessment (RA) that considered the probability of exceeding a microbiological limit value set by law. A new RA was launched in 2007 to review and further develop the previous assessment, and to make recommendations for future risk-based monitoring according to current risks. The resulting qualitative RA was designed to ascertain the risk to human health from the consumption of Swiss dairy products. The products and microbial hazards to be considered in the RA were determined on the basis of a risk profile. The hazards included Campylobacter spp., Listeria monocytogenes, Salmonella spp., Shiga toxin-producing Escherichia coli, coagulase-positive staphylococci and Staphylococcus aureus enterotoxin. The release assessment considered the prevalence of the hazards in bulk milk samples, the influence of the process parameters on the microorganisms, and the influence of the type of dairy. The exposure assessment was linked to the production volume. An overall probability was estimated by combining the probabilities of release and exposure for each combination of hazard, dairy product and type of dairy. This overall probability represents the likelihood of a product from a certain type of dairy exceeding the microbiological limit value and being passed on to the consumer. The consequences could not be fully assessed due to a lack of detailed information on the number of disease cases caused by the consumption of dairy products. The results were expressed as a ranking of overall probabilities. Finally, recommendations were given for the design of the risk-based monitoring programme and for filling the identified data gaps. The aims of this work were (i) to present the qualitative RA approach for Swiss dairy products, which could be adapted to other settings, and (ii) to discuss the opportunities and limitations of the qualitative method.
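A minimal sketch of one way release and exposure levels can be combined into an overall qualitative probability; the combination rule shown (take the lower of the two levels) is an illustrative convention, not the matrix used in the published RA.

```python
LEVELS = ["negligible", "low", "medium", "high"]  # ordered qualitative scale

def overall_probability(release, exposure):
    """Combine release and exposure levels by taking the lower of the two:
    a product-like, illustrative rule (the published RA defines its own)."""
    return LEVELS[min(LEVELS.index(release), LEVELS.index(exposure))]

# e.g. a hazard released at high probability but with low exposure volume
print(overall_probability("high", "low"))  # -> "low"
```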
Abstract:
BACKGROUND: Patients with chemotherapy-related neutropenia and fever are usually hospitalized and treated with empirical intravenous broad-spectrum antibiotic regimens. Early diagnosis of sepsis in children with febrile neutropenia remains difficult due to non-specific clinical and laboratory signs of infection. We aimed to analyze whether IL-6 and IL-8 could define a group of patients at low risk of septicemia. METHODS: A prospective study was performed to assess the potential value of IL-6, IL-8 and C-reactive protein serum levels for predicting severe bacterial infection or bacteremia in febrile neutropenic children with cancer during chemotherapy. Statistical tests used: the Friedman test, Wilcoxon test, Kruskal-Wallis H test, Mann-Whitney U test and receiver operating characteristic (ROC) analysis. RESULTS: The analysis of cytokine levels measured at the onset of fever indicated that IL-6 and IL-8 are useful for defining a possible group of patients at low risk of sepsis. In predicting bacteremia or severe bacterial infection, IL-6 was the best predictor, with the optimum IL-6 cut-off level of 42 pg/ml showing high sensitivity (90%) and specificity (85%). CONCLUSION: These findings may have clinical implications for risk-based antimicrobial treatment strategies.
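A common way to derive such a cut-off from ROC analysis is to maximise Youden's J (sensitivity + specificity - 1); the abstract does not state which criterion the authors used, so the sketch below is a generic illustration with made-up data.

```python
def youden_cutoff(levels, infected):
    """Return the cut-off maximising sensitivity + specificity - 1.
    levels: marker values (e.g. IL-6 in pg/ml); infected: True/False labels.
    Assumes both classes are present."""
    best = (-1.0, None)
    for cut in sorted(set(levels)):
        tp = sum(v >= cut and y for v, y in zip(levels, infected))
        fn = sum(v < cut and y for v, y in zip(levels, infected))
        tn = sum(v < cut and not y for v, y in zip(levels, infected))
        fp = sum(v >= cut and not y for v, y in zip(levels, infected))
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best[0]:
            best = (j, cut)
    return best[1], best[0]

il6 = [8, 12, 20, 30, 45, 55, 80, 150]  # made-up marker levels (pg/ml)
bacteremia = [False, False, False, False, True, True, True, True]
print(youden_cutoff(il6, bacteremia))  # -> (45, 1.0)
```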
Abstract:
Starting with an overview of losses due to mountain hazards in the Russian Federation and the European Alps, the question is raised why a substantial number of events are still recorded despite considerable efforts in hazard mitigation and risk reduction. The main reason for this paradox lies in the absence of a dynamic, risk-based approach, and it is shown that these dynamics have different roots: firstly, neglecting climate change and systems dynamics, the development of hazard scenarios is based on the static approach of design events; secondly, due to economic development and population dynamics, the exposed elements at risk are subject to spatial and temporal changes. These issues are discussed with respect to temporal and spatial demands. As a result, it is shown how risk is dynamic on both long-term and short-term scales, which has to be acknowledged in the risk concept if that concept is targeted at the sustainable development of mountain regions. A conceptual model that can be used for dynamic risk assessment is presented, and different management strategies illustrate how this model may be put into practice. Furthermore, the interconnectedness and interaction between hazard and risk are addressed in order to enhance prevention, the level of protection and the degree of preparedness.
Abstract:
BACKGROUND/AIMS: Several countries are working to adapt clinical trial regulations so that the approval process is aligned with the level of risk for trial participants. The optimal framework for categorizing clinical trials according to risk remains unclear, however. In January 2014, Switzerland became the first European country to adopt a risk-based categorization procedure. We assessed how accurately and consistently clinical trials are categorized using two different approaches: an approach using the criteria set forth in the new law (concept) or an intuitive approach (ad hoc). METHODS: This was a randomized controlled trial with a method-comparison study nested in each arm. We used clinical trial protocols approved by eight Swiss ethics committees between 2010 and 2011. Protocols were randomly assigned to be categorized into one of three risk categories using either the concept or the ad hoc approach. Each protocol was independently categorized by the trial's sponsor, a group of experts and the approving ethics committee. The primary outcome was the difference in categorization agreement between the expert group and sponsors across arms. Linear weighted kappa was used to quantify agreement, with the difference between kappas being the primary effect measure. RESULTS: We included 142 of 231 protocols in the final analysis (concept = 78; ad hoc = 64). Raw agreement between the expert group and sponsors was 0.74 in the concept arm and 0.78 in the ad hoc arm. Chance-corrected agreement was higher in the ad hoc arm (kappa: 0.34; 95% confidence interval = 0.10-0.58) than in the concept arm (0.27; 0.06-0.50), but the difference was not significant (p = 0.67). LIMITATIONS: The main limitation was the large number of protocols excluded from the analysis, mostly because they did not fit the clinical trial definition of the new law. CONCLUSION: A structured risk categorization approach was not better than an ad hoc approach. Laws introducing risk-based approaches should provide guidelines, examples and templates to ensure correct application.
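Linear weighted kappa penalises disagreements by their distance on the ordinal scale. A self-contained sketch for the three-category case, with made-up ratings, looks like this:

```python
def linear_weighted_kappa(rater_a, rater_b, k=3):
    """Linearly weighted kappa for two raters over ordinal categories 0..k-1."""
    n = len(rater_a)
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[a][b] += 1 / n
    pa = [sum(row) for row in obs]                             # rater A margins
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater B margins
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * pa[i] * pb[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

# Risk categories coded 0/1/2; the ratings are made up for illustration.
sponsor = [0, 1, 2, 1, 0, 2, 1, 0]
experts = [0, 2, 2, 1, 0, 1, 1, 1]
print(round(linear_weighted_kappa(sponsor, experts), 2))  # -> 0.54
```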
Abstract:
The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted by modelling the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yielded a value for the fatality risk within the observed time frame for the studied road segment. Both the long-term and the short-term fatality risk due to snow avalanches were compared with the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents, whereas the analysis of short-term avalanche-induced fatality risk revealed risk peaks 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the sum of precipitation within three days is a simplified model; further research is therefore needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may serve as a conceptual step towards risk-based decision-making in risk management.
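The core calculation reduces to expected fatalities per scenario: avalanche probability times persons at risk (derived from traffic density) times the conditional probability of death. The sketch below uses invented parameter values; note how a high hazard level with low traffic and a low hazard level with high traffic can produce the same risk peak.

```python
def scenario_fatality_risk(p_avalanche, vehicles_per_hour, occupants_per_vehicle,
                           exposure_fraction, p_death_if_hit):
    """Expected fatalities for one hazard scenario on a road segment:
    avalanche probability x persons at risk x conditional death probability.
    All argument values used below are illustrative."""
    persons_at_risk = vehicles_per_hour * occupants_per_vehicle * exposure_fraction
    return p_avalanche * persons_at_risk * p_death_if_hit

# high hazard level, low traffic density ...
print(scenario_fatality_risk(0.10, 50, 1.6, 0.02, 0.2))   # ~0.032
# ... and low hazard level, high traffic density give the same risk
print(scenario_fatality_risk(0.01, 500, 1.6, 0.02, 0.2))  # ~0.032
```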
Abstract:
We developed a model to calculate a quantitative risk score for individual aquaculture sites. The score indicates the risk of a site being infected with a specific fish pathogen (viral haemorrhagic septicaemia virus (VHSV), infectious haematopoietic necrosis virus or koi herpesvirus) and is intended to be used for risk-ranking sites to support surveillance for demonstrating zone or member state freedom from these pathogens. The inputs to the model include a range of quantitative and qualitative estimates of risk factors organised into five risk themes: (1) live fish and egg movements; (2) exposure via water; (3) on-site processing; (4) short-distance mechanical transmission; and (5) distance-independent mechanical transmission. The calculated risk score for an individual aquaculture site is a value between zero and one and indicates the risk of a site relative to that of other sites (thereby allowing ranking). The model was applied to evaluate 76 rainbow trout farms in three countries (42 from England, 32 from Italy and 2 from Switzerland) with the aim of establishing their risk of being infected with VHSV. Risk scores for farms in England and Italy showed great variation, clearly enabling ranking. Scores ranged from 0.002 to 0.254 (mean 0.080) in England and from 0.011 to 0.778 (mean 0.130) in Italy, reflecting the diversity of the infection status of farms in these countries. Requirements for broader application of the model are discussed. Cost-efficient farm data collection is important to realise the benefits of a risk-based approach.
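A minimal sketch of a theme-weighted site score in [0, 1], suitable for ranking; the theme weights here are invented, whereas the published model builds each theme from quantitative and qualitative risk-factor estimates.

```python
# The five risk themes from the abstract; the weights are illustrative.
THEME_WEIGHTS = {"live_fish_and_egg_movements": 0.35, "exposure_via_water": 0.25,
                 "on_site_processing": 0.15, "short_distance_mechanical": 0.15,
                 "distance_independent_mechanical": 0.10}

def site_score(theme_scores):
    """Weighted mean of per-theme scores in [0, 1]; higher = riskier."""
    return sum(THEME_WEIGHTS[t] * theme_scores[t] for t in THEME_WEIGHTS)

farms = {"farm_A": dict.fromkeys(THEME_WEIGHTS, 0.2),
         "farm_B": dict.fromkeys(THEME_WEIGHTS, 0.6)}
ranked = sorted(farms, key=lambda f: site_score(farms[f]), reverse=True)
print(ranked)  # riskiest first: ['farm_B', 'farm_A']
```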
Abstract:
We used meat-inspection data collected over a period of three years in Switzerland to evaluate slaughterhouse-level, farm-level and animal-level factors that may be associated with whole carcass condemnation (WCC) in cattle after slaughter. The objective of this study was to identify WCC risk factors so that they can be communicated to, and managed by, the slaughter industry and veterinary services. There were three main predictors of the risk of WCC at meat inspection: the slaughtered animal's sex, its age and the size of the slaughterhouse in which it was processed. WCC for injuries and significant weight loss (visible welfare indicators) was almost exclusive to smaller slaughterhouses. Cattle exhibiting clinical syndromes that are not externally visible (e.g. pneumonia lesions) and that are associated with the fattening of cattle end up in larger slaughterhouses. For this reason, it is important for animal health surveillance to collect data from both types of slaughterhouse. Other important risk factors for WCC were the on-farm mortality rate and the number of cattle on the farm of origin. This study highlights that the risk factors for WCC are as complex as the production system itself, with risk factors interacting with one another in ways that are sometimes difficult to interpret biologically. Risk-based surveillance aimed at farms with recurring health problems (e.g. a history of above-average condemnation rates) may be more appropriate than the selection of higher-risk animals arriving at slaughter. In Switzerland, the introduction of a benchmarking system providing farmers with feedback on condemnation reasons and on their performance compared with the national or regional average could be a first step towards improving herd management and financial returns for producers.
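The benchmarking idea in the conclusion can be sketched as a simple rule that flags farms whose condemnation rate exceeds a multiple of the national average; the factor of 2 and the data below are illustrative.

```python
def flag_farms(condemned, slaughtered, factor=2.0):
    """Flag farms whose WCC rate exceeds `factor` times the national average
    (the factor of 2 is an illustrative choice)."""
    national = sum(condemned.values()) / sum(slaughtered.values())
    return [farm for farm in condemned
            if condemned[farm] / slaughtered[farm] > factor * national]

condemned = {"farm_1": 4, "farm_2": 0, "farm_3": 9}    # made-up counts
slaughtered = {"farm_1": 120, "farm_2": 150, "farm_3": 110}
print(flag_farms(condemned, slaughtered))  # -> ['farm_3']
```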
Abstract:
The WHO fracture risk assessment tool FRAX® is a computer-based algorithm that provides models for the assessment of fracture probability in men and women. The approach uses easily obtained clinical risk factors (CRFs) to estimate the 10-year probability of a major osteoporotic fracture (hip, clinical spine, humerus or wrist fracture) and the 10-year probability of a hip fracture. The estimate can be used alone or with femoral neck bone mineral density (BMD) to enhance fracture risk prediction. FRAX® is the only risk engine that takes into account the hazard of death as well as that of fracture. The probability of fracture is calculated in men and women from age, body mass index and dichotomized variables comprising a prior fragility fracture, parental history of hip fracture, current tobacco smoking, ever long-term use of oral glucocorticoids, rheumatoid arthritis, other causes of secondary osteoporosis, and alcohol consumption of 3 or more units daily. The relationship between risk factors and fracture probability was constructed using information from nine population-based cohorts from around the world. The CRFs for fracture had been identified through a series of meta-analyses as providing independent information on fracture risk. The FRAX® algorithm was validated in 11 independent cohorts with in excess of 1 million patient-years, including the Swiss SEMOF cohort. Since fracture risk varies markedly between regions of the world, FRAX® models need to be calibrated to those countries where the epidemiology of fracture and death is known. Models are currently available for 31 countries. The Swiss-specific FRAX® model was developed very soon after the first release of FRAX® in 2008 and was published in 2009, using Swiss epidemiological data and integrating the fracture risk and death hazard of our country. Two FRAX®-based approaches may be used to derive intervention thresholds, and both have recently been investigated in the Swiss setting. In the first approach, the guideline that individuals with a fracture probability equal to or exceeding that of women with a prior fragility fracture should be considered for treatment is translated into thresholds using 10-year fracture probabilities. In that case the threshold is age-dependent and increases from 16% at the age of 60 years to 40% at the age of 80 years. The second approach is based on cost-effectiveness. Using a FRAX®-based intervention threshold of 15% for both women and men aged 50 years and older should permit cost-effective access to therapy for patients at high fracture probability in our country and thereby contribute to further reducing the growing burden of osteoporotic fractures.
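The two threshold approaches can be sketched directly from the figures in the abstract; linear interpolation between the 16% (age 60) and 40% (age 80) anchor points is an assumption, since the paper tabulates the age-dependent thresholds.

```python
def age_dependent_threshold(age):
    """Treatment threshold (%, 10-year major fracture probability) rising
    from 16% at age 60 to 40% at age 80; linear interpolation between the
    published anchor points is an assumption."""
    t = min(max((age - 60) / 20, 0.0), 1.0)
    return 16 + t * (40 - 16)

FIXED_THRESHOLD = 15  # cost-effectiveness threshold (%) from the abstract

def consider_treatment(frax_probability, age, approach="age_dependent"):
    cut = (age_dependent_threshold(age) if approach == "age_dependent"
           else FIXED_THRESHOLD)
    return frax_probability >= cut

print(age_dependent_threshold(70))                   # 28.0
print(consider_treatment(20, 70))                    # False (threshold 28%)
print(consider_treatment(20, 70, approach="fixed"))  # True (threshold 15%)
```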