14 results for estimation risk
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Bovine tuberculosis (bTB) caused by Mycobacterium bovis or M. caprae has recently (re-)emerged in livestock and wildlife in all countries bordering Switzerland (CH) and the Principality of Liechtenstein (FL). Comprehensive data for Swiss and Liechtenstein wildlife have so far been unavailable, although two native species, wild boar (Sus scrofa) and red deer (Cervus elaphus elaphus), act as bTB reservoirs elsewhere in continental Europe. Our aims were (1) to assess the occurrence of bTB in these wild ungulates in CH/FL and to reinforce scanning surveillance in all wild mammals; and (2) to evaluate the risk of future bTB reservoir formation in wild boar and red deer in CH/FL. Tissue samples collected from 2009 to 2011 from 434 hunted red deer and wild boar and from eight diseased ungulates with tuberculosis-like lesions were tested by direct real-time PCR and culture to detect mycobacteria of the Mycobacterium tuberculosis complex (MTBC). Identification of suspicious colonies was attempted by real-time PCR, genotyping and spoligotyping. Information on risk factors for bTB maintenance within wildlife populations was retrieved from the literature, and the situation regarding the identified factors was assessed for our study areas. Mycobacteria of the MTBC were detected in six of 165 wild boar (3.6%; 95% CI: 1.4-7.8) but in none of the 269 red deer (0%; 0-1.4). M. microti was identified in two MTBC-positive wild boar, while species identification remained unsuccessful in four cases. The main risk factors for bTB maintenance worldwide, including various causes of aggregation often resulting from intensive wildlife management, are largely absent in CH and FL. In conclusion, M. bovis and M. caprae were not detected, but we report MTBC mycobacteria in Swiss wild boar for the first time. Present conditions seem unfavorable for reservoir emergence; nevertheless, increasing wild ungulate populations and offal consumption may represent a risk.
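The abstract does not name the interval method, but the reported CIs are consistent with the exact (Clopper-Pearson) binomial interval. A minimal Python sketch under that assumption reproduces the figures:

from scipy.stats import beta

def clopper_pearson(k: int, n: int, alpha: float = 0.05):
    """Exact two-sided binomial CI for k positives out of n samples."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

for k, n, label in [(6, 165, "wild boar"), (0, 269, "red deer")]:
    lo, hi = clopper_pearson(k, n)
    print(f"{label}: {k}/{n} = {100 * k / n:.1f}% "
          f"(95% CI: {100 * lo:.1f}-{100 * hi:.1f})")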
Abstract:
Agricultural workers are exposed to various risks, including chemical agents, noise, and many other factors. One of the most characteristic and least known risk factors is the microclimatic conditions encountered in the different phases of work (in the field, in the greenhouse, etc.). A typical condition is thermal stress due to high temperatures during harvesting operations in open fields or in greenhouses. In Italy, harvesting is carried out for many hours during the day, mainly in the summer, with temperatures often higher than 30 degrees C. According to ISO 7243, these conditions can be considered dangerous for workers' health. The aim of this study is to assess the risks of exposure to microclimatic conditions (heat) for fruit and vegetable harvesters in central Italy by applying methods established by international standards. In order to estimate the risk for workers, the air temperature, radiant temperature, and air speed were measured using instruments conforming to ISO 7726. These thermodynamic parameters and two more subjective parameters, clothing and the metabolic heat production rate related to the worker's physical activity, were used to calculate the predicted heat strain (PHS) for the exposed workers in conformity with ISO 7933. Environmental and subjective parameters were also measured for greenhouse workers, according to ISO 7243, in order to calculate the wet-bulb globe temperature (WBGT). The results show a slight risk for workers during manual harvesting in the field. On the other hand, the data collected in the greenhouses show that the risk for workers must not be underestimated. The results of the study show that, for manual harvesting work in climates similar to central Italy's, it is essential to provide plenty of drinking water and acclimatization for the workers in order to reduce health risks. Moreover, the study emphasizes that the possible health risks for greenhouse workers increase from April through July.
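For reference, ISO 7243 defines the WBGT index as a fixed weighted sum of the natural wet-bulb, globe and (outdoors, under solar load) dry-bulb air temperatures. A short sketch with illustrative readings:

def wbgt_indoor(t_nwb: float, t_g: float) -> float:
    """WBGT without solar load: natural wet-bulb and globe temperature (degC)."""
    return 0.7 * t_nwb + 0.3 * t_g

def wbgt_outdoor(t_nwb: float, t_g: float, t_a: float) -> float:
    """WBGT with solar load: adds the dry-bulb air temperature (degC)."""
    return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_a

# Hypothetical hot-greenhouse readings, not measurements from the study:
print(f"WBGT = {wbgt_indoor(t_nwb=26.0, t_g=35.0):.1f} degC")  # 28.7 degC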
Abstract:
Background: Guidelines for the prevention of coronary heart disease (CHD) recommend the use of Framingham-based risk scores, which were developed in white middle-aged populations. It remains unclear whether and how CHD risk prediction might be improved among older adults. We aimed to compare the prognostic performance of the Framingham risk score (FRS), directly and after recalibration, with refit functions derived from the present cohort, as well as to assess the utility of adding other routinely available risk parameters to the FRS. Methods: Among 2193 black and white older adults (mean age, 73.5 years) without pre-existing cardiovascular disease from the Health ABC cohort, we examined adjudicated CHD events, defined as incident myocardial infarction, CHD death, and hospitalization for angina or coronary revascularization. Results: During 8-year follow-up, 351 participants experienced CHD events. The FRS discriminated poorly between persons who did and did not experience CHD events (C-index: 0.577 in women; 0.583 in men) and underestimated absolute risk by 51% in women and 8% in men. Recalibration of the FRS improved absolute risk prediction, particularly for women. For both genders, refitting the functions substantially improved absolute risk prediction, with discrimination similar to the FRS. Results did not differ between whites and blacks. The addition of lifestyle variables, waist circumference and creatinine did not improve risk prediction beyond the risk factors of the FRS. Conclusions: The FRS underestimates CHD risk in older adults, particularly in women, although traditional risk factors remain the best predictors of CHD. Re-estimated risk functions using these factors improve the accuracy of absolute risk estimation.
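The C-index reported above is the probability that a randomly chosen person who experienced an event was assigned a higher predicted risk than one who did not. A small sketch with made-up data, not the Health ABC cohort:

from itertools import product

def c_index(risk, event):
    """Concordance for binary outcomes; ties in predicted risk count as 0.5."""
    pairs = [(r1, r0)
             for (r1, e1), (r0, e0) in product(zip(risk, event), repeat=2)
             if e1 == 1 and e0 == 0]
    if not pairs:
        return float("nan")
    return sum(1.0 if r1 > r0 else 0.5 if r1 == r0 else 0.0
               for r1, r0 in pairs) / len(pairs)

# Hypothetical predicted risks and observed CHD events:
print(c_index(risk=[0.12, 0.30, 0.08, 0.25], event=[0, 1, 0, 1]))  # 1.0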
Abstract:
The development of a clinical decision tree based on knowledge about risks and reported outcomes of therapy is a necessity for the successful planning and outcome of periodontal therapy. This requires well-founded knowledge of the disease entity and a broad knowledge of how different risk conditions contribute to periodontitis. The infectious etiology, a complex immune response, and the influence of a large number of co-factors are challenging conditions in clinical periodontal risk assessment. The difficult relationship between independent and dependent risk conditions, paired with limited information on periodontitis prevalence, adds to the difficulties of periodontal risk assessment. The current information on periodontitis risk attributed to smoking habits, socio-economic conditions, general health and subjects' self-perception of health is not comprehensive, and this contributes to the limited success of periodontal risk assessment. New models for risk analysis have been advocated; their utility for periodontal risk estimation and prognosis should be tested. The present review addresses several of these issues associated with periodontal risk assessment.
Abstract:
Pulse wave velocity (PWV) is a surrogate of arterial stiffness and represents a non-invasive marker of cardiovascular risk. The non-invasive measurement of PWV requires tracking the arrival time of pressure pulses recorded in vivo, commonly referred to as pulse arrival time (PAT). In the state of the art, PAT is estimated by identifying a characteristic point of the pressure pulse waveform. This paper demonstrates that in ambulatory scenarios, where signal-to-noise ratios are below 10 dB, the repeatability of PAT measurements based on characteristic-point identification degrades drastically. Hence, we introduce a novel family of PAT estimators based on parametric modeling of the anacrotic phase of a pressure pulse. In particular, we propose a parametric PAT estimator (TANH) that correlates highly with the Complior® characteristic point D1 (CC = 0.99), increases noise robustness, and reduces the number of heartbeats required to obtain reliable PAT measurements by a factor of five.
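The abstract does not give the estimator's exact parameterization; the following sketch only illustrates the general idea, fitting an assumed hyperbolic-tangent model to a simulated noisy upstroke and reading the arrival time from a fitted parameter:

import numpy as np
from scipy.optimize import curve_fit

def tanh_model(t, baseline, amplitude, t0, width):
    # Assumed parametric form for the anacrotic (rising) phase.
    return baseline + amplitude * np.tanh((t - t0) / width)

# Synthetic noisy upstroke; the SNR is deliberately low, as in ambulatory use.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.3, 300)                      # seconds
clean = tanh_model(t, 1.0, 0.5, 0.15, 0.02)
noisy = clean + rng.normal(0.0, 0.15, t.size)

popt, _ = curve_fit(tanh_model, t, noisy, p0=[1.0, 0.5, 0.1, 0.05])
print(f"estimated arrival time t0 = {popt[2] * 1e3:.1f} ms")  # near 150 ms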
Abstract:
BACKGROUND: The estimation of physiologic ability and surgical stress (E-PASS) has been used to produce a numerical estimate of expected mortality and morbidity after elective gastrointestinal surgery. The aim of this study was to validate E-PASS in a selected cohort of patients requiring liver resections (LR). METHODS: In this retrospective study, the E-PASS predictor equations for morbidity and mortality were applied to prospective data from 243 patients requiring LR. Observed rates were compared with predicted rates using Fisher's exact test. The discriminative capability of E-PASS was evaluated using receiver operating characteristic (ROC) curve analysis. RESULTS: The observed and predicted overall mortality rates were both 3.3%, and the morbidity rates were 31.3% and 26.9%, respectively. There was a significant difference in the comprehensive risk scores of deceased and surviving patients (p = 0.043). However, the scores for patients with or without complications were not significantly different (p = 0.120). Subsequent ROC curve analysis revealed a poor predictive accuracy for morbidity. CONCLUSIONS: The E-PASS score seems to predict mortality effectively in this specific group of patients but is a poor predictor of complications. A new, modified logistic regression model might be required for LR in order to better predict postoperative outcomes.
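A sketch of the observed-vs-predicted comparison named above, using Fisher's exact test; the counts are illustrative (roughly 31.3% and 26.9% of 243), as the study's exact contingency table is not given in the abstract:

from scipy.stats import fisher_exact

n = 243
observed_morbidity, predicted_morbidity = 76, 65   # illustrative counts
table = [[observed_morbidity, n - observed_morbidity],
         [predicted_morbidity, n - predicted_morbidity]]
odds_ratio, p = fisher_exact(table)
print(f"Fisher's exact test: p = {p:.3f}")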
Abstract:
BACKGROUND: Reduced bone mineral density (BMD) is common in adults infected with human immunodeficiency virus (HIV). The roles of proximal renal tubular dysfunction (PRTD) and alterations in bone metabolism in HIV-related low BMD are incompletely understood. METHODS: We quantified BMD (dual-energy x-ray absorptiometry), blood and urinary markers of bone metabolism and renal function, and risk factors for low BMD (hip or spine T score of -1 or less) in an ambulatory care setting. We determined factors associated with low BMD and calculated 10-year fracture risks using the World Health Organization FRAX equation. RESULTS: We studied 153 adults (98% men; median age, 48 years; median body mass index, 24.5; 67 [44%] receiving tenofovir, 81 [53%] receiving a boosted protease inhibitor [PI]). Sixty-five participants (42%) had low BMD, and 11 (7%) had PRTD. PI therapy was associated with low BMD in multivariable analysis (odds ratio, 2.69; 95% confidence interval, 1.09-6.63). Tenofovir use was associated with increased osteoblast and osteoclast activity (P ≤ .002). The mean estimated 10-year risks were 1.2% for hip fracture and 5.4% for any major osteoporotic fracture. CONCLUSIONS: In this mostly male population, low BMD was significantly associated with PI therapy. Tenofovir recipients showed evidence of increased bone turnover. Measurement of BMD and estimation of fracture risk may be warranted in treated HIV-infected adults.
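For reference, the T score used above to define low BMD expresses a patient's BMD in standard deviations from a young-adult reference mean. A minimal sketch; the reference values are illustrative, not the study's:

def t_score(bmd: float, ref_mean: float, ref_sd: float) -> float:
    """BMD relative to a young-adult reference, in SD units."""
    return (bmd - ref_mean) / ref_sd

# Hypothetical lumbar spine measurement and reference (g/cm^2):
t = t_score(bmd=0.92, ref_mean=1.05, ref_sd=0.11)
print(f"T score = {t:.1f}; low BMD = {t <= -1.0}")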
Abstract:
BACKGROUND: In recent years, the occurrence and relevance of Mycoplasma hyopneumoniae infections in suckling pigs have been examined in several studies. Whereas most of these studies focused solely on prevalence estimation within different age groups, follow-up of infected piglets, or assessment of pathological findings, none included a detailed analysis of individual and environmental risk factors. Therefore, the aim of the present study was to investigate the frequency of M. hyopneumoniae infections in suckling pigs of endemically infected herds and to identify individual risk factors potentially influencing the infection status of suckling pigs at the age of weaning. RESULTS: The animal-level prevalence of M. hyopneumoniae infections in suckling pigs examined in three conventional pig breeding herds was 3.6% (41/1127) at the time of weaning. A prevalence of 1.2% was found in the same pigs at the end of their nursery period. A multivariable Poisson regression model showed that the incidence rate ratio (IRR) for suckling pigs was significantly lower than 1 when teeth grinding was conducted (IRR: 0.10). Moreover, high temperatures in the piglet nest during the first two weeks of life (occasionally >40°C) were associated with a decreased probability of infection (IRR: 0.23-0.40). In contrast, the application of PCV2 vaccines to piglets was associated with an increased infection risk (IRR: 9.72). CONCLUSIONS: Since single infected piglets are thought to act as initiators of the transmission of this pathogen in nursery and fattening pigs, eliminating the risk factors described in this study should help to reduce the incidence rate of M. hyopneumoniae infections and might thereby contribute to a reduced probability of high prevalences in older pigs.
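The IRRs above are the exponentiated coefficients of a Poisson regression. A sketch on simulated data (not the study's herd data), assuming statsmodels is available:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
teeth_grinding = rng.integers(0, 2, n)
pcv2_vaccine = rng.integers(0, 2, n)
# Simulate infection counts with one protective and one risk-increasing factor:
rate = np.exp(-3.0 - 2.3 * teeth_grinding + 2.0 * pcv2_vaccine)
infected = rng.poisson(rate)

X = sm.add_constant(np.column_stack([teeth_grinding, pcv2_vaccine]))
fit = sm.GLM(infected, X, family=sm.families.Poisson()).fit()
print(np.exp(fit.params[1:]))  # IRRs for the two simulated risk factors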
Abstract:
BACKGROUND: The role of surgery for patients with metastatic esophagogastric adenocarcinoma (EGC) is not defined. The purpose of this study was to define selection criteria for patients who may benefit from resection following systemic chemotherapy. METHODS: From 1987 to 2007, 160 patients presenting with synchronous metastatic EGC (cT3/4 cNany cM0/1, finally pM1) were treated with chemotherapy followed by resection of the primary tumor and metastases. Clinical and histopathological data and the site and number of metastases were analyzed. A prognostic score was established and validated in a second cohort from another academic center (n = 32). RESULTS: The median survival (MS) in cohort 1 was 13.6 months. Significant prognostic factors were grading (p = 0.046), ypT category (p = 0.001), ypN category (p = 0.011), R category (p = 0.015), lymphangiosis (p = 0.021), and clinical (p = 0.004) and histopathological response (p = 0.006), but not the localization or number of metastases. Summing grading (G1/2: 0 points; G3/4: 1 point), clinical response (responder: 0; nonresponder: 1) and R category (complete resection: 0; R1: 1; R2: 2) defines two groups of patients with significantly different survival (p = 0.001): a low-risk group (score 0/1, n = 22; MS 35.3 months, 3-year survival 47.6%) and a high-risk group (score 2/3/4, n = 126; MS 12.0 months, 3-year survival 14.2%). The score showed a strong trend in the validation cohort (p = 0.063): low-risk group, MS not reached, 3-year survival 57.1%; high-risk group, MS 19.9 months, 3-year survival 6.7%. CONCLUSION: We observed long-term survival after resection of metastatic EGC. A simple clinical score may help to identify a subgroup of patients with a high chance of benefiting from resection. However, the accurate estimation of achieving a complete resection, which is an integral element of the score, remains challenging.
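The three-item score described above is simple enough to state as code. A minimal sketch following the abstract's point assignments (complete resection encoded here as "R0"):

def egc_score(grading: str, responder: bool, r_category: str) -> int:
    """Grading (G1/2: 0, G3/4: 1) + clinical response (responder: 0,
    nonresponder: 1) + R category (complete/R0: 0, R1: 1, R2: 2)."""
    points = 0 if grading in ("G1", "G2") else 1
    points += 0 if responder else 1
    points += {"R0": 0, "R1": 1, "R2": 2}[r_category]
    return points

def risk_group(score: int) -> str:
    return "low risk (score 0/1)" if score <= 1 else "high risk (score 2/3/4)"

print(risk_group(egc_score("G3", responder=True, r_category="R0")))  # low risk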
Abstract:
OBJECTIVE: Cognitive impairments are regarded as a core component of schizophrenia. However, the cognitive dimension of psychosis is hardly considered by ultra-high-risk (UHR) criteria. We therefore studied whether the combination of symptomatic UHR criteria and the basic symptom criterion "cognitive disturbances" (COGDIS) is superior in predicting first-episode psychosis. METHOD: In a naturalistic 48-month follow-up study, the conversion rate to first-episode psychosis was studied in 246 outpatients of an early detection of psychosis service (FETZ); the association of conversion with the combined versus separate use of UHR criteria and COGDIS was compared. RESULTS: Patients who met both UHR criteria and COGDIS (n=127) at baseline had a significantly higher risk of conversion (hr=0.66 at month 48) and a shorter time to conversion than patients who met only UHR criteria (n=37; hr=0.28) or only COGDIS (n=30; hr=0.23). Furthermore, the risk of conversion was higher for the combined criteria than for UHR criteria (n=164; hr=0.56 at month 48) or COGDIS (n=158; hr=0.56 at month 48) considered irrespective of each other. CONCLUSIONS: Our findings support the merits of considering both COGDIS and UHR criteria in the early detection of persons at high risk of developing a first psychotic episode within 48 months. Applying both sets of criteria improves sensitivity and individual risk estimation, and may thereby support the development of stage-targeted interventions. Moreover, since the combined approach enables the identification of considerably more homogeneous at-risk samples, it should support both preventive and basic research.
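The conversion risks quoted "at month 48" are cumulative rates of the kind read off a survival curve (one minus Kaplan-Meier survival). A sketch on simulated data, assuming the lifelines package; these are not the FETZ data:

import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
months = rng.uniform(1, 48, 100)      # time to conversion or censoring
converted = rng.integers(0, 2, 100)   # 1 = converted to psychosis

kmf = KaplanMeierFitter().fit(months, event_observed=converted)
risk_48 = 1.0 - kmf.predict(48.0)     # cumulative conversion risk at month 48
print(f"estimated conversion risk at month 48: {risk_48:.2f}")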
Abstract:
BACKGROUND: Prediction studies in subjects at Clinical High Risk (CHR) for psychosis are hampered by a high proportion of uncertain outcomes. We therefore investigated whether quantitative EEG (QEEG) parameters can contribute to an improved identification of CHR subjects with a later conversion to psychosis. METHODS: This investigation was a project within the European Prediction of Psychosis Study (EPOS), a prospective multicenter, naturalistic field study with an 18-month follow-up period. QEEG spectral power and alpha peak frequencies (APF) were determined in 113 CHR subjects. The primary outcome measure was conversion to psychosis. RESULTS: Cox regression yielded a model including frontal theta (HR=1.82; 95% CI 1.00-3.32) and delta (HR=2.60; 95% CI 1.30-5.20) power and occipital-parietal APF (HR=0.52; 95% CI 0.35-0.80) as predictors of conversion to psychosis. The resulting equation enabled the development of a prognostic index with three risk classes (hazard rates 0.057 to 0.81). CONCLUSIONS: Power in the theta and delta ranges and APF contribute to the short-term prediction of psychosis and enable a further stratification of risk in CHR samples. Combined with (other) clinical ratings, EEG parameters may therefore be a useful tool for individualized risk estimation and, consequently, targeted prevention.
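A Cox model's prognostic index is its linear predictor: the sum of log hazard ratios times predictor values. A sketch using the HRs reported above; the cut-points for the three risk classes are not given in the abstract, so the thresholds below are placeholders:

import math

HRS = {"frontal_theta": 1.82, "frontal_delta": 2.60,
       "occipital_parietal_apf": 0.52}

def prognostic_index(values: dict) -> float:
    """values: predictor name -> standardized value for one subject."""
    return sum(math.log(HRS[name]) * x for name, x in values.items())

def risk_class(pi: float) -> str:
    # Placeholder cut-points; EPOS derived its own from the fitted model.
    return "low" if pi < -0.5 else "intermediate" if pi < 0.5 else "high"

pi = prognostic_index({"frontal_theta": 1.0, "frontal_delta": 0.5,
                       "occipital_parietal_apf": -1.0})
print(f"PI = {pi:.2f} -> {risk_class(pi)} risk")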
Abstract:
Global environmental change includes changes in a wide range of global-scale phenomena, which are expected to affect a number of physical processes as well as the vulnerability of the communities that will experience their impact. Decision-makers need tools that enable them to assess the losses caused by such processes under different future scenarios and to design risk reduction strategies. In this paper, a tool is presented that can be used by a range of end-users (e.g. local authorities and decision-makers) for the assessment of the monetary loss from future landslide events, with a particular focus on torrential processes. The toolbox includes three functions: a) enhancement of the post-event damage data collection process, b) assessment of the monetary loss from future events, and c) continuous updating and improvement of an existing vulnerability curve by adding data from recent events. All functions of the tool are demonstrated through examples of its application.
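Function b) rests on combining a vulnerability curve (degree of loss as a function of process intensity) with the value of exposed elements. A sketch with a toy curve and hypothetical buildings, not the tool's calibrated data:

def vulnerability(intensity: float) -> float:
    """Degree of loss in [0, 1] for a given process intensity (e.g. deposit
    height in m); illustrative power-law curve, not the paper's."""
    return min(1.0, max(0.0, 0.25 * intensity ** 1.5))

def monetary_loss(buildings: list[tuple[float, float]]) -> float:
    """buildings: (reconstruction value in EUR, local intensity in m)."""
    return sum(value * vulnerability(intensity) for value, intensity in buildings)

exposed = [(450_000, 0.8), (300_000, 1.5), (520_000, 0.2)]  # hypothetical
print(f"expected loss: EUR {monetary_loss(exposed):,.0f}")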
Abstract:
The comparison of radiotherapy techniques with regard to secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of the different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity-modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underlining the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment than the absolute risk. We found that this ratio could also vary substantially across the different approaches to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which translates into inconsistent results on the potentially higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the approach used. Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (the lungs and contralateral breast in the case of breast radiotherapy), as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
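The first approach above, an organ-specific linear model, amounts to multiplying each organ's mean dose by a risk coefficient and summing. A sketch with placeholder coefficients and doses (not the ICRP or BEIR VII values, and not the study's dosimetry):

RISK_PER_GY = {"lung": 0.010, "contralateral_breast": 0.008, "thyroid": 0.002}

def secondary_cancer_risk(mean_organ_dose_gy: dict) -> float:
    """Total absolute risk (fraction) from per-organ mean doses (Gy)."""
    return sum(RISK_PER_GY[organ] * d for organ, d in mean_organ_dose_gy.items())

open_tangents = {"lung": 1.8, "contralateral_breast": 0.9, "thyroid": 0.1}
hybrid_imrt = {"lung": 1.2, "contralateral_breast": 0.7, "thyroid": 0.1}
r1, r2 = secondary_cancer_risk(open_tangents), secondary_cancer_risk(hybrid_imrt)
print(f"risk ratio (open tangents / hybrid IMRT): {r1 / r2:.2f}")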
Abstract:
Tick-borne encephalitis (TBE) is one of the most dangerous human neurological infections occurring in Europe and northern parts of Asia, with thousands of cases and millions vaccinated against it. The risk of TBE can be assessed through analyses of samples taken from wildlife or from animals in close contact with humans; dogs have been shown to be a good sentinel species for such studies. Serological assays for the diagnosis of TBE in dogs are mainly based on purified and inactivated TBEV antigens. Here we describe a novel dog anti-TBEV IgG monoclonal antibody (MAb)-capture assay, based on TBEV prME subviral particles expressed in mammalian cells from a Semliki Forest virus (SFV) replicon, as well as an IgG immunofluorescence assay (IFA) based on Vero E6 cells transfected with the same SFV replicon. We further demonstrate their use in a small-scale TBEV seroprevalence study of dogs representing different regions of Finland. Altogether, 148 dog serum samples were tested by the novel assays, and the results were compared to those obtained with a commercial IgG enzyme immunoassay (EIA), a hemagglutination inhibition test, and an IgG IFA with TBEV-infected cells. Compared to the reference tests, the sensitivities of the developed assays were 90-100% and the specificities of both assays were 100%. Analysis of the dog serum samples showed a seroprevalence of 40% on the Åland Islands and 6% in the Southwestern archipelago of Finland. In conclusion, a specific and sensitive EIA and IFA for the detection of IgG antibodies in canine sera were developed. Based on these assays, the seroprevalence of IgG antibodies in dogs from different regions of Finland was assessed and shown to parallel the known human disease burden, as the Southwestern archipelago and the Åland Islands in particular had considerable dog TBEV antibody prevalence and represent areas with a high risk of TBE for humans.
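The sensitivities and specificities quoted above follow the standard definitions against a reference test. A minimal sketch with hypothetical counts, not the study's 148-sample data:

def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(tp=27, fn=1, tn=120, fp=0)
print(f"sensitivity {100 * sens:.0f}%, specificity {100 * spec:.0f}%")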