56 results for Risk based Maintenance
Abstract:
BACKGROUND: In order to optimise the cost-effectiveness of active surveillance to substantiate freedom from disease, a new approach using targeted sampling of farms was developed and applied, using the examples of infectious bovine rhinotracheitis (IBR) and enzootic bovine leucosis (EBL) in Switzerland. Relevant risk factors (RF) for the introduction of IBR and EBL into Swiss cattle farms were identified, and their relative risks were defined based on a literature review and expert opinion. A quantitative model based on the scenario tree method was subsequently used to calculate the required sample size of a targeted sampling approach (TS) for a given sensitivity. We compared this sample size with that of a stratified random sample (sRS) with regard to efficiency. RESULTS: The required sample sizes to substantiate disease freedom were 1,241 farms for IBR and 1,750 farms for EBL to detect 0.2% herd prevalence with 99% sensitivity. Using a conventional sRS, the required sample sizes were 2,259 farms for IBR and 2,243 for EBL. Even considering the additional administrative expenses required for planning the TS, the risk-based approach was more cost-effective than an sRS (a 40% reduction in the full survey costs for IBR and 8% for EBL) due to the considerable reduction in sample size. CONCLUSIONS: As the model depends on RF selected through literature review and was parameterised with values estimated by experts, it is subject to some degree of uncertainty. Nevertheless, this approach provides the veterinary authorities with a promising tool for future cost-effective sampling designs.
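For intuition on the orders of magnitude above, the classical freedom-from-disease formula gives the random-sample size needed to detect at least one positive herd with a given survey sensitivity. The sketch below is a minimal illustration of that formula, assuming an effectively infinite population and a configurable herd-level test sensitivity; the published figures come from the full scenario-tree model, which additionally accounts for risk-based weighting.

```python
import math

def freedom_sample_size(design_prevalence, target_sensitivity, herd_sensitivity=1.0):
    """Herds to sample so that, at the design prevalence, at least one
    positive herd is detected with probability target_sensitivity."""
    p_detect = design_prevalence * herd_sensitivity  # P(a sampled herd tests positive)
    return math.ceil(math.log(1.0 - target_sensitivity) / math.log(1.0 - p_detect))

# 0.2% herd-level design prevalence, 99% survey sensitivity, perfect herd test:
print(freedom_sample_size(0.002, 0.99))  # -> 2301, the same order as the sRS figures
```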
Abstract:
This paper evaluates whether the Swiss monitoring programme for foreign substances in animal products fulfils basic epidemiological quality requirements, and identifies possible sources of bias in the selection of samples. The sampling was analysed over a 4-year period (2002-05). The sampling frame in 37 participating abattoirs covered 51% of all slaughtered pigs, 73% of calves, 68% of beef and 36% of cows. The analysis revealed that some sub-populations, as defined by the region of origin, were statistically over-represented while others were under-represented. Thus the programme, although in accordance with European Union requirements, contained some relevant bias. Patterns of under-sampled regions, characterized by differences in management type, were identified. This could lead to an underestimate of the number of contaminated animals within the programme. Although the current sampling was stratified and partially risk-based, its efficiency could be improved by adopting a more targeted approach.
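Over- or under-representation of regions, as described above, can be checked with a goodness-of-fit test comparing the samples actually taken against the sampling frame. A minimal sketch with purely hypothetical counts (not the study's data):

```python
from scipy.stats import chisquare

# Hypothetical counts, purely illustrative: pigs slaughtered per region
# (the sampling frame) vs. samples actually taken in each region.
slaughtered = [52000, 31000, 12000, 5000]  # regions A, B, C, D
sampled     = [620,   300,   90,    40]

total = sum(sampled)
expected = [total * s / sum(slaughtered) for s in slaughtered]
stat, p = chisquare(sampled, f_exp=expected)
print(f"chi2 = {stat:.1f}, p = {p:.3g}")   # small p: regions over-/under-represented
```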
Abstract:
Early onset neonatal sepsis due to Group B streptococci (GBS) is responsible for severe morbidity and mortality in newborns. While different preventive strategies to identify women at risk are recommended, the optimal strategy depends on the incidence of GBS sepsis and on the prevalence of anogenital GBS colonization. We therefore aimed to assess the GBS colonization prevalence and its consequences for different prevention strategies. We analyzed 1316 pregnant women between March 2005 and September 2006 at our institution. The prevalence of GBS colonization was determined by selective cultures of anogenital smears. The presence of risk factors was analyzed. In addition, the direct costs of screening and intrapartum antibiotic prophylaxis were estimated for different preventive strategies. The prevalence of GBS colonization was 21%. At least one maternal intrapartum risk factor was present in 37%. The direct costs of the different prevention strategies were estimated as follows: risk-based: 18,500 CHF/1000 live births; screening-based: 50,110 CHF/1000 live births; combined screening- and risk-based: 43,495 CHF/1000 live births. Strategies to prevent GBS sepsis in newborns are necessary. Given our colonization prevalence of 21% and the intrapartum risk profile of the women, the screening-based approach seems superior to a risk-based approach.
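The cost figures above follow from simple per-strategy arithmetic: the number of women screened and/or given prophylaxis, multiplied by unit costs. A back-of-envelope sketch with hypothetical unit costs (the study's actual unit costs are not given in the abstract):

```python
# Back-of-envelope cost comparison per 1,000 live births. Prevalences are from
# the abstract; the unit costs are hypothetical placeholders, not the study's.
N = 1000
p_colonized = 0.21   # anogenital GBS colonization prevalence
p_risk      = 0.37   # at least one intrapartum risk factor present
cost_screen = 30.0   # CHF per culture-based screen (assumed)
cost_iap    = 50.0   # CHF per course of intrapartum prophylaxis (assumed)

risk_based      = N * p_risk * cost_iap                          # treat on risk factors
screening_based = N * cost_screen + N * p_colonized * cost_iap   # screen all, treat carriers
print(f"risk-based: {risk_based:,.0f} CHF  screening-based: {screening_based:,.0f} CHF")
# With these placeholders the risk-based arm happens to reproduce the 18,500 CHF above.
```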
Evaluation of control and surveillance strategies for classical swine fever using a simulation model
Abstract:
Classical swine fever (CSF) outbreaks can cause enormous losses in naïve pig populations. How best to minimize the economic damage and the number of culled animals caused by CSF is therefore an important research area. The baseline CSF control strategy in the European Union and Switzerland consists of culling all animals in infected herds, movement restrictions for animals, material and people within a given distance of the infected herd, and epidemiological tracing of transmission contacts. Additional disease control measures such as pre-emptive culling or vaccination have been recommended based on the results of several simulation models; however, these models were parameterized for areas with high animal densities. The objective of this study was to explore whether pre-emptive culling and emergency vaccination should also be recommended in low- to moderate-density areas such as Switzerland. Additionally, we studied the influence of initial outbreak conditions on outbreak severity to improve the efficiency of disease prevention and surveillance. A spatial, stochastic, individual-animal-based simulation model using all registered Swiss pig premises in 2009 (n=9770) was implemented to quantify these relationships. The model simulates within-herd and between-herd transmission (direct and indirect contacts and local area spread). By varying four parameters, namely (a) control measures, (b) index herd type (breeding, fattening, weaning or mixed herd), (c) detection delay for secondary cases during an outbreak and (d) contact tracing probability, 112 distinct scenarios were simulated. To assess the impact of scenarios on outbreak severity, daily transmission rates were compared between scenarios. Compared with the baseline strategy (stamping out and movement restrictions), vaccination and pre-emptive culling reduced neither outbreak size nor duration. Outbreaks starting in a herd with weaning piglets or fattening pigs caused higher losses in terms of the number of culled premises and lasted longer than those starting in the two other index herd types. Similarly, larger transmission rates were estimated for outbreaks starting in these index herd types. A longer detection delay resulted in more culled premises and longer outbreaks, and better contact tracing increased the number of short outbreaks. Based on the simulation results, the baseline control strategies seem sufficient to control CSF in areas of low to medium animal density. Early detection of outbreaks is crucial, and risk-based surveillance should be focused on weaning piglet and fattening pig premises.
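As a rough illustration of the kind of model described (and not a reproduction of the study's spatial, individual-animal-based model), the sketch below simulates stochastic between-herd spread with stamping out after a detection delay; all parameter values are invented for illustration.

```python
import random

# Minimal stochastic between-herd outbreak sketch: each day an infected herd
# contacts 20 random herds, infecting susceptible ones with probability beta;
# infected herds are detected and stamped out once the detection delay elapses.
def simulate(n_herds=9770, beta=0.01, detection_delay=7, days=365, seed=1):
    random.seed(seed)
    status = ["S"] * n_herds   # S = susceptible, I = infected, C = culled
    infected_on = {0: 0}       # herd index -> day of infection
    status[0] = "I"            # index herd
    for day in range(1, days + 1):
        for h, t0 in list(infected_on.items()):
            if day - t0 >= detection_delay:   # detected: stamp out
                status[h] = "C"
                del infected_on[h]
                continue
            for other in random.sample(range(n_herds), 20):
                if status[other] == "S" and random.random() < beta:
                    status[other] = "I"
                    infected_on[other] = day
        if not infected_on:    # no infectious herds left: outbreak over
            break
    return status.count("C"), day

culled, duration = simulate()
print(f"culled premises: {culled}, outbreak duration: {duration} days")
```

Varying `detection_delay` in such a sketch shows the effect reported above: longer delays produce more culled premises and longer outbreaks.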
Abstract:
Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork meat production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food-borne hazards in pork in order to assess the risk deriving from these hazards to the health of the local consumers and to make recommendations for improving food safety. A secondary objective was to evaluate the utility of risk-based approaches to food safety in an informal food system. We investigated samples from pigs and pork sourced at slaughter, in urban and rural environments, and at retail, to assess a selection of food-borne hazards. In addition, consumer exposure was characterized using information about hygiene and practices related to handling and preparing pork. A qualitative hazard characterization, exposure assessment and risk characterization for three representative hazards or hazard proxies, namely Enterobacteriaceae, T. solium cysticercosis and antibiotic residues, is presented. Several important potential food-borne pathogens are reported for the first time, including Listeria spp. and Brucella suis. This descriptive pilot study is the first risk-based assessment of food safety in Nagaland. We also characterise possible interventions to be addressed by policy makers, and supply data to inform future risk assessments.
Abstract:
There is growing evidence for the development of posttraumatic stress symptoms as a consequence of acute cardiac events. Acute coronary syndrome (ACS) patients experience a range of acute cardiac symptoms, and these may cluster together in specific patterns. The objectives of this study were to establish distinct symptom clusters in ACS patients, and to investigate whether the experience of different types of symptom clusters is associated with posttraumatic symptom intensity at six months. ACS patients were interviewed in hospital within 48 h of admission; 294 patients provided information on symptoms before hospitalisation, and cluster analysis was used to identify patterns. Posttraumatic stress symptoms were assessed in 156 patients at six months. Three symptom clusters were identified: pain symptoms, diffuse symptoms and symptoms of dyspnea. In multiple regression analyses adjusting for sociodemographic, clinical and psychological factors, the pain symptoms cluster (β = .153, P = .044) emerged as a significant predictor of posttraumatic symptom severity at six months. A marginally significant association was observed between symptoms of dyspnea and reduced intrusive symptoms at six months (β = -.156, P = .061). The findings suggest that acute ACS symptoms occur in distinct clusters, which may have distinctive effects on the intensity of subsequent posttraumatic symptoms. Since posttraumatic stress is associated with adverse outcomes, identifying patients at risk based on their symptom experience during ACS may be useful in targeting interventions.
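A minimal sketch of this two-stage analysis (cluster the in-hospital symptom profiles, then regress the six-month score on cluster membership), using synthetic stand-in data and k-means in place of whatever clustering algorithm the study actually used:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Synthetic stand-in data: 294 patients x 8 binary symptom indicators.
X = rng.integers(0, 2, size=(294, 8)).astype(float)
ptsd = rng.normal(20, 8, size=294)    # 6-month posttraumatic symptom score

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
dummies = np.eye(3)[clusters][:, 1:]  # dummy-code clusters, cluster 0 as reference
model = LinearRegression().fit(dummies, ptsd)
print("cluster effects vs. reference cluster:", model.coef_)
```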
Abstract:
BACKGROUND Acetabular fractures and the surgical interventions used to treat them can result in nerve injuries. To date, only small case studies have tried to explore the frequency of nerve injuries and their association with patient and treatment characteristics. High-quality data on the risk of traumatic and iatrogenic nerve lesions and their epidemiology in relation to different fracture types and surgical approaches are lacking. QUESTIONS/PURPOSES The purpose of this study was to determine (1) the proportion of patients who develop nerve injuries after acetabular fracture; (2) which fracture type(s) are associated with increased nerve injury risk; and (3) which surgical approach was associated with the highest proportion of patients developing nerve injuries, using data from the German Pelvic Trauma Registry. Two secondary aims were (4) to assess the relationship between hospital volume and nerve injury; and (5) to assess internal data validity. METHODS Between March 2001 and June 2012, 2236 patients with acetabular fractures were entered into a prospectively maintained registry from 29 hospitals; of those, 2073 (92.7%) had complete records on the endpoints of interest in this retrospective study and were analyzed. The neurological status of these patients was captured at admission and at discharge. A total of 1395 of 2073 (67%) patients underwent surgery, and the proportions of intervention-related and other hospital-acquired nerve injuries were obtained. The overall proportion of patients developing nerve injuries, risk by fracture type, and risk by surgical approach were analyzed. RESULTS The proportion of patients diagnosed with nerve injuries was 4% (76 of 2073) at hospital admission and 7% (134 of 2073) at discharge. Patients with fractures of the "posterior wall" (relative risk [RR], 2.0; 95% confidence interval [CI], 1.4-2.8; p=0.001), "posterior column and posterior wall" (RR, 2.9; CI, 1.6-5.0; p=0.002), and "transverse + posterior wall" (RR, 2.1; CI, 1.3-3.5; p=0.010) types were more likely to have nerve injuries at hospital discharge. The proportions of patients with intervention-related nerve injuries and with other hospital-acquired nerve injuries were both 2% (24 of 1395 and 46 of 2073, respectively), and both were associated with the Kocher-Langenbeck approach (RR, 3.0; CI, 1.4-6.2; p=0.006; and RR, 2.4; CI, 1.4-4.3; p=0.004, respectively). CONCLUSIONS Acetabular fractures involving the posterior wall were most commonly accompanied by nerve injuries. The data also suggest that the Kocher-Langenbeck approach to the pelvic ring is associated with a higher risk of perioperative nerve injuries. Trauma surgeons should be aware of common nerve injuries, particularly in posterior wall fractures. The results of the study should help provide patients with more exact information on the risk of perioperative nerve injuries in acetabular fractures. LEVEL OF EVIDENCE Level III, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.
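The relative risks quoted above are simple ratios of event proportions, with 95% confidence intervals typically computed on the log scale. A small self-contained sketch with hypothetical counts (not the registry's data):

```python
import math

def relative_risk(a, n1, b, n2):
    """RR of exposed (a events / n1) vs. unexposed (b events / n2),
    with a 95% confidence interval computed on the log scale."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

# Hypothetical counts: nerve injuries in posterior-wall fractures vs. all others.
print(relative_risk(a=40, n1=500, b=94, n2=1573))
```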
Abstract:
PURPOSE: The mandibular implant overdenture is a popular treatment modality and is well documented in the literature. Follow-up studies with a long observation period are difficult to perform due to the increasing age of patients. The present data summarize a long-term clinical observation of patients with implant overdentures. MATERIALS AND METHODS: Between 1984 and 1997, edentulous patients were consecutively admitted for treatment with an implant overdenture. The dentures were connected to the implants by means of bars or ball anchors. Regular maintenance was provided, with at least one or two scheduled visits per year. Recall attendance and reasons for dropout were analyzed based on the specific history of each patient. Denture maintenance service, relining, repair, and fabrication of new dentures were identified, and complications with the retention devices were specified separately. RESULTS: In the period from 1984 to 2008, 147 patients with a total of 314 implants had completed a follow-up period of >10 years. One hundred one patients were still available in 2008, while 46 patients were not reexamined for various reasons. Compliance was high, with a regular recall attendance of >90%. More than 80% of dentures remained in continuous service. Although major prosthetic maintenance was rather low in relation to the long observation period, visits to a dental hygienist and a dentist resulted in annual visit rates of 1.5 and 2.4, respectively. If new dentures became necessary, these were made in student courses, which increased the treatment time and the number of appointments needed. Complications with the retention devices consisted mostly of the mounting of new female retainers, the repair of bars, and the changing of ball anchors. The average number of events and the rate of prosthetic service were significantly higher with ball anchors than with bars. Twenty-two patients changed from ball anchors to bars; 9 patients switched from a clip bar to a rigid U-shaped bar. CONCLUSIONS: This long-term follow-up study demonstrates that implant overdentures are a favorable solution for edentulous patients when regular maintenance is provided. In spite of the specific circumstances of an aging population, it is possible to provide long-term care, resulting in a good prognosis and low risk for this treatment modality. For various reasons the dropout rate can be considerable in elderly patients, and prosthetic service must be provided regularly.
Abstract:
To date, few risk factors for childhood acute lymphoblastic leukemia (ALL) have been confirmed, and the scientific literature is full of controversial "evidence." We examined whether family characteristics, particularly maternal and paternal age and the number of older siblings, were risk factors for childhood ALL.
Abstract:
Darunavir was designed for activity against HIV resistant to other protease inhibitors (PIs). We assessed the efficacy and tolerability of darunavir, and risk factors for virological failure, in treatment-experienced patients seen in clinical practice.
Abstract:
Obesity is a multifactorial trait and an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology underlying obesity and to identify genetic variations and/or nutrition-related factors that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject, a total of 63 factors were measured, describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake of calories and cholesterol. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used to analyse the available data. These corresponded to (i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and (ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) that combines genetic algorithms and the popular back-propagation training algorithm.
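For orientation, the sketch below sets up the same kind of classification task (63 inputs, binary overweight label) with a plain back-propagation-trained feed-forward network on synthetic data; the paper's PDM and GA training variants are not reproduced here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
# Synthetic stand-in for the study's data: 2300 subjects x 63 inputs
# (genetic variants, gender, nutrition variables); label = overweight (BMI > 25).
X = rng.normal(size=(2300, 63))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=2300)) > 0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
# Multi-layer feed-forward ANN trained with back-propagation.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```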
Abstract:
Ultrasound detection of sub-clinical atherosclerosis (ATS) may help identify individuals at high cardiovascular risk. Most studies have evaluated intima-media thickness (IMT) at the carotid level. We compared the relationships between the main cardiovascular risk factors (CVRF) and five indicators of ATS (IMT, mean and maximal plaque thickness, mean and maximal plaque area) at both the carotid and femoral levels. Ultrasound was performed on 496 participants aged 45-64 years randomly selected from the general population of the Republic of Seychelles. 73.4% of participants had ≥1 plaque (IMT thickening ≥1.2 mm) at the carotid level and 67.5% at the femoral level. The variance (adjusted R²) contributed by age, sex and CVRF (smoking, LDL-cholesterol, HDL-cholesterol, blood pressure, diabetes) in predicting any of the ATS markers was larger at the femoral than at the carotid level. At both the carotid and femoral levels, the association between CVRF and ATS was stronger for plaque-based markers than for IMT. Our findings show that the associations between CVRF and ATS markers were stronger at the femoral than at the carotid level, and with plaque-based markers rather than IMT. Pending comparison of these markers using harder cardiovascular endpoints, our findings suggest that markers based on plaque morphology assessed at the femoral artery level might be useful cardiovascular risk predictors.
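The comparison above rests on the adjusted R² of ordinary least-squares fits of each ATS marker on the CVRF. A minimal sketch of that computation on synthetic stand-in data (the noisier marker plays the role of IMT, the cleaner one of a plaque-based marker):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def adjusted_r2(X, y):
    """Adjusted R^2 of an OLS fit: 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    n, p = X.shape
    r2 = LinearRegression().fit(X, y).score(X, y)
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

rng = np.random.default_rng(0)
# Hypothetical CVRF matrix (age, sex, smoking, LDL, HDL, BP, diabetes) for
# 496 participants, and two synthetic ATS markers with different noise levels.
X = rng.normal(size=(496, 7))
imt    = X @ rng.normal(size=7) + rng.normal(scale=2.0, size=496)  # noisier marker
plaque = X @ rng.normal(size=7) + rng.normal(scale=0.5, size=496)

print(f"adjusted R2, IMT-like marker:    {adjusted_r2(X, imt):.2f}")
print(f"adjusted R2, plaque-like marker: {adjusted_r2(X, plaque):.2f}")
```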