78 results for Risk controlling strategies
Abstract:
The sudden independence of Kyrgyzstan from the Soviet Union in 1991 led to a total rupture of industrial and agricultural production. Based on empirical data, this study seeks to identify key land use transformation processes since the late 1980s, their impact on people's livelihoods and the implications for natural resources in the communes of Tosh Bulak and Saz, located in the Sokuluk River Basin on the northern slope of the Kyrgyz Range. Using the concept of the sustainable livelihood approach as an analytical framework, three different livelihood strategies were identified: (1) An accumulation strategy applied by wealthy households, for whom renting and/or buying of land is a key element; they are the only household category capable of venturing into rain-fed agriculture. (2) A preserving strategy involving mainly intermediate households who are not able to buy or rent additional agricultural land; very often they are forced to return their land to the commune or sell it to wealthier households. (3) A coping strategy including mainly poor households consisting of elderly pensioners or headed by single mothers; owing to their limited labour and economic power, agricultural production is very low and hardly covers subsistence needs; pensions and social allowances form the backbone of these livelihoods. Ecological assessments have shown that the forage productivity of remote high mountain pastures has increased by 5 to 22 per cent since 1978. At the same time, forage productivity on pre-mountain and mountain pastures close to villages has generally decreased by 1 to 34 per cent. It appears that the main avenues for households to increase their wealth lie in the agricultural sector, by controlling more, and mainly irrigated, land as well as by increasing livestock. The losers in this process are thus those households unable to keep or exploit their arable land or to benefit from new agricultural land. Ensuring access to land for the poor is therefore imperative in order to combat rural poverty and socio-economic disparities in rural Kyrgyzstan.
Abstract:
Mass screening for osteoporosis using DXA measurements at the spine and hip is presently not recommended by health authorities. Instead, risk factor questionnaires and peripheral bone measurements may facilitate the selection of women eligible for axial bone densitometry. The aim of this study was to validate a case-finding strategy for postmenopausal women who would benefit most from subsequent DXA measurement, using phalangeal radiographic absorptiometry (RA) alone or in combination with risk factors in a general practice setting. The sensitivity and specificity of this strategy in detecting osteoporosis (T-score ≤ -2.5 at the spine and/or the hip) were compared with those of the current reimbursement criteria for DXA measurements in Switzerland. Four hundred and twenty-three postmenopausal women with one or more risk factors for osteoporosis were recruited by 90 primary care physicians, who also performed the phalangeal RA measurements. All women underwent subsequent DXA measurement of the spine and the hip at the Osteoporosis Policlinic of the University Hospital of Berne. They were allocated to one of two groups depending on whether or not they met the Swiss reimbursement conditions for DXA measurement. Logistic regression models were used to predict the likelihood of osteoporosis versus no osteoporosis and to derive ROC curves for the various strategies. Differences in the areas under the ROC curves (AUC) were tested for significance. In women not meeting the reimbursement criteria, RA achieved a significantly larger AUC (0.81; 95% CI 0.72-0.89) than the risk factors age, height and weight (0.71; 95% CI 0.62-0.80). Furthermore, in this study, RA provided better sensitivity and specificity in identifying women with underlying osteoporosis than the currently accepted criteria for reimbursement of DXA measurement. In the Swiss setting, RA is a valid case-finding tool for patients with risk factors for osteoporosis, especially for those who do not qualify for DXA reimbursement.
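The strategy comparison described above rests on fitting logistic models and comparing the areas under their ROC curves. Below is a minimal sketch of that workflow on synthetic data; the variable names (ra_tscore, age, height, weight) and all values are illustrative assumptions, not the study's dataset.

```python
# Sketch: compare AUCs of two case-finding models (RA alone vs. age/height/weight)
# on synthetic data. Illustrative only; not the study's data or code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 400
age = rng.normal(65, 8, n)            # years (synthetic)
height = rng.normal(160, 7, n)        # cm (synthetic)
weight = rng.normal(65, 12, n)        # kg (synthetic)
ra_tscore = rng.normal(-1.5, 1.2, n)  # phalangeal RA T-score (synthetic)

# Synthetic outcome: axial osteoporosis more likely with lower RA T-score,
# higher age and lower weight.
logit = -1.0 - 1.2 * ra_tscore + 0.03 * (age - 65) - 0.03 * (weight - 65)
osteoporosis = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def auc_of(features):
    """Fit a logistic model and return its (in-sample) area under the ROC curve."""
    model = LogisticRegression().fit(features, osteoporosis)
    return roc_auc_score(osteoporosis, model.predict_proba(features)[:, 1])

auc_ra = auc_of(ra_tscore.reshape(-1, 1))
auc_rf = auc_of(np.column_stack([age, height, weight]))
print(f"AUC, RA alone: {auc_ra:.2f}; AUC, age/height/weight: {auc_rf:.2f}")
```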
Abstract:
Animal and early clinical studies of gene therapy for tissue ischaemia suggested that this approach might benefit patients with coronary artery disease not amenable to traditional revascularization. This enthusiasm was tempered by the subsequent disappointing results of randomized clinical trials, which led researchers to develop strategies using progenitor cells as an alternative means of improving collateral function. However, the recent publication of several randomized clinical trials reporting either negative or only weakly positive results with this approach has raised questions about its effectiveness. Several factors need to be considered in explaining the discordance between the positive studies of such treatments in animals and the disappointing results seen in randomized patient trials. Aside from the practical issues of arteriogenic therapies, such as effective delivery, vascular remodelling is an extraordinarily complex process, and the administration of a single agent or cell in the hope that it would lead to lasting physiological effects may be far too simplistic an approach. In addition, evidence now suggests that many of the traditional cardiovascular risk factors, such as age and hypercholesterolaemia, may impair the host response not only to ischaemia but, critically, also to treatment. This review discusses the evidence and mechanisms for these observations and highlights future directions that might be taken in an effort to provide more effective therapies.
Abstract:
BACKGROUND: Patients with chemotherapy-related neutropenia and fever are usually hospitalized and treated with empirical intravenous broad-spectrum antibiotic regimens. Early diagnosis of sepsis in children with febrile neutropenia remains difficult because clinical and laboratory signs of infection are non-specific. We aimed to analyze whether IL-6 and IL-8 could define a group of patients at low risk of septicemia. METHODS: A prospective study was performed to assess the potential value of IL-6, IL-8 and C-reactive protein serum levels for predicting severe bacterial infection or bacteremia in febrile neutropenic children with cancer during chemotherapy. Statistical tests used: Friedman test, Wilcoxon test, Kruskal-Wallis H test, Mann-Whitney U test and receiver operating characteristic (ROC) analysis. RESULTS: The analysis of cytokine levels measured at the onset of fever indicated that IL-6 and IL-8 are useful for defining a group of patients at low risk of sepsis. In predicting bacteremia or severe bacterial infection, IL-6 was the best predictor, with an optimum cut-off level of 42 pg/ml showing high sensitivity (90%) and specificity (85%). CONCLUSION: These findings may have clinical implications for risk-based antimicrobial treatment strategies.
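For readers unfamiliar with how such a cut-off is obtained, here is a hedged sketch of picking a threshold from a ROC curve via the Youden index, on synthetic IL-6 values; none of the numbers below are the study's measurements.

```python
# Illustrative selection of an IL-6 cut-off from a ROC curve (Youden index).
# Synthetic values only; the study's measurements are not reproduced here.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
il6_uncomplicated = rng.lognormal(mean=2.5, sigma=0.8, size=120)  # pg/ml, synthetic
il6_bacteremia = rng.lognormal(mean=4.2, sigma=0.8, size=40)      # pg/ml, synthetic
il6 = np.concatenate([il6_uncomplicated, il6_bacteremia])
label = np.concatenate([np.zeros(120), np.ones(40)])   # 1 = bacteremia/severe infection

fpr, tpr, thresholds = roc_curve(label, il6)
youden = tpr - fpr                      # sensitivity + specificity - 1
best = int(np.argmax(youden))
print(f"cut-off {thresholds[best]:.0f} pg/ml: "
      f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```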
Abstract:
BACKGROUND: there is inadequate evidence to support currently formulated NHS strategies to achieve health promotion and preventative care in older people through broad-based screening and assessment in primary care. The most extensively evaluated delivery instrument for this purpose is Health Risk Appraisal (HRA). This article describes a trial using HRA to evaluate the effect on health behaviour and preventative-care uptake in older people in NHS primary care. METHODS: a randomised controlled trial was undertaken in three London primary care group practices. Functionally independent community-dwelling patients older than 65 years (n = 2,503) received a self-administered Health Risk Appraisal for Older Persons (HRA-O) questionnaire leading to computer-generated individualised written feedback to participants and general practitioners (GPs), integrated into practice information-technology (IT) systems. All primary care staff received training in preventative health in older people. The main outcome measures were self-reported health behaviour and preventative care uptake at 1-year follow-up. RESULTS: of 2,503 individuals randomised, 2,006 respondents (80.1%) (intervention, n = 940, control n = 1,066) were available for analysis. Intervention group respondents reported slightly higher pneumococcal vaccination uptake and equivocal improvement in physical activity levels compared with controls. No significant differences were observed for any other categories of health behaviour or preventative care measures at 1-year follow-up. CONCLUSIONS: HRA-O implemented in this way resulted in minimal improvement of health behaviour or uptake of preventative care measures in older people. Supplementary reinforcement involving contact by health professionals with patients over and above routine clinical encounters may be a prerequisite to the effectiveness of IT-based delivery systems for health promotion in older people.
Abstract:
To determine anticoagulation strategies in OPCAB, a questionnaire survey among 750 European cardio-thoracic surgeons was performed. Questions addressed the volume of OPCAB procedures performed, intra- and perioperative heparinization and antiplatelet therapy, as well as perioperative management. A total of 325 (43.7%) questionnaires were returned and validated. Perioperative protocols for the administration of antiplatelet agents differed among the responding surgeons. Perioperative prophylaxis of thrombosis (low or high molecular weight heparin) is performed by 78%. Intraoperative heparin dosages range from 70 U/kg to 500 U/kg, and 60% of respondents prefer a low-dose regimen (≤ 150 U/kg). Correspondingly, the lowest activated clotting time (ACT) accepted during surgery is 200 s for 24%, 250 s for 18% and 300 s for 26% of surgeons. Protamine is used by 91% of respondents, and 52% perform a 1:1 reversal. A cell saver and antifibrinolytics are used by 70% and 40%, respectively. Interestingly, 56% of respondents think that bleeding in OPCAB patients is not reduced compared with on-pump CABG. In addition, 34% of respondents believe there is an increased risk of early graft occlusion following OPCAB. This survey demonstrates widely differing intra- and perioperative anticoagulation strategies for OPCAB procedures among European surgeons.
Abstract:
OBJECTIVE: To investigate HIV-related immunodeficiency as a risk factor for hepatocellular carcinoma (HCC) among persons infected with HIV, while controlling for the effect of frequent coinfection with hepatitis C and B viruses. DESIGN: A case-control study nested in the Swiss HIV Cohort Study. METHODS: Twenty-six HCC patients were identified in the Swiss HIV Cohort Study or through linkage with Swiss Cancer Registries, and were individually matched to 251 controls according to Swiss HIV Cohort Study centre, sex, HIV-transmission category, age and year at enrollment. Odds ratios and corresponding confidence intervals were estimated by conditional logistic regression. RESULTS: All HCC patients were positive for hepatitis B surface antigen or antibodies against hepatitis C virus. HCC patients included 14 injection drug users (three positive for hepatitis B surface antigen and 13 for antibodies against hepatitis C virus) and 12 men having sex with men/heterosexual/other (11 positive for hepatitis B surface antigen, three for antibodies against hepatitis C virus), revealing a strong relationship between HIV transmission route and hepatitis viral type. Latest CD4+ cell count [Odds ratio (OR) per 100 cells/μl decrease = 1.33, 95% confidence interval (CI) 1.06-1.68] and CD4+ cell count percentage (OR per 10% decrease = 1.65, 95% CI 1.01-2.71) were significantly associated with HCC. The effects of CD4+ cell count were concentrated among men having sex with men/heterosexual/other rather than injecting drug users. Highly active antiretroviral therapy use was not significantly associated with HCC risk (OR for ever versus never = 0.59, 95% CI 0.18-1.91). CONCLUSION: Lower CD4+ cell counts increased the risk for HCC among persons infected with HIV, an effect that was particularly evident for hepatitis B virus-related HCC arising in non-injecting drug users.
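A note on reading the "per 100 cells/μl" odds ratio: under the log-linear form of the conditional logistic model it compounds multiplicatively with the size of the decrease, so, taking the reported point estimate at face value, a 300 cells/μl lower count corresponds to roughly

\[ \mathrm{OR}_{300\,\text{cells}/\mu\text{l}} \approx 1.33^{3} \approx 2.35. \]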
Abstract:
The development of coronary vasculopathy is the main determinant of long-term survival in cardiac transplantation. The identification of risk factors therefore seems necessary in order to identify possible treatment strategies. Ninety-five of 397 patients undergoing orthotopic cardiac transplantation from 10/1985 to 10/1992 were evaluated retrospectively on the basis of perioperative and postoperative variables, including age, sex, diagnosis, previous operations, renal function, cholesterol levels, dosage of immunosuppressive drugs (cyclosporin A, azathioprine, steroids), incidence of rejection, and treatment with calcium channel blockers at 3, 6, 12, and 18 months postoperatively. Coronary vasculopathy was assessed by annual angiography at 1 and 2 years postoperatively. After univariate analysis, data were evaluated by stepwise multiple logistic regression analysis. Coronary vasculopathy was found in 15 patients (16%) at 1 year and in 23 patients (24%) at 2 years. On multivariate analysis, previous operations and the incidence of rejection were identified as significant risk factors (P < 0.05), whereas the underlying diagnosis had borderline significance (P = 0.058) for the development of graft coronary vasculopathy. In contrast, all other variables were not significant in the subset of patients investigated. We therefore conclude that the development of coronary vasculopathy in cardiac transplant patients depends mainly on the rejection process itself, aside from patient-dependent factors. Therapeutic measures, such as the administration of calcium channel blockers and the regulation of lipid disorders, may therefore only reduce the progression of native atherosclerotic disease in the post-transplant setting.
Abstract:
QUESTIONS UNDER STUDY: To assess whether the prevalence of HIV-positive tests among clients at five anonymous testing sites in Switzerland had increased since the end of the 1990s, and to ascertain whether there had been any concurrent change in the proportions of associated risk factors. METHODS: Baseline characteristics were analysed, by groups of years, over the eleven consecutive years of data collected from the testing sites. Numbers of HIV-positive tests were presented as prevalence per 1000 tests performed within each category. Multivariable analyses, stratified by African nationality and by risk group (heterosexuals or men who have sex with men, MSM), were performed, controlling simultaneously for a series of variables. Odds ratios (ORs) were reported together with their 95% confidence intervals (CI). P values were calculated from likelihood ratio tests. RESULTS: There was an increase in the prevalence of positive tests among African heterosexuals between 1996-1999 and 2004-2006, rising from 54.2 to 86.4/1000 in females and from 5.6 to 25.2/1000 in males. The proportion of MSM who knew that one or more of their sexual partners was infected with HIV increased from 2% to 17%, and the proportion who reported having more than five sexual partners in the preceding two years increased from 44% to 51%. CONCLUSIONS: Surveillance data from anonymous testing sites continue to provide useful information on the changing epidemiology of HIV and thus inform public health strategies against HIV.
Abstract:
Early-onset neonatal sepsis due to Group B streptococci (GBS) is responsible for severe morbidity and mortality in newborns. While different preventive strategies to identify women at risk are being recommended, the optimal strategy depends on the incidence of GBS sepsis and on the prevalence of anogenital GBS colonization. We therefore aimed to assess the prevalence of GBS colonization and its consequences for different prevention strategies. We analyzed 1316 pregnant women between March 2005 and September 2006 at our institution. The prevalence of GBS colonization was determined by selective cultures of anogenital smears, and the presence of risk factors was recorded. In addition, the direct costs of screening and intrapartum antibiotic prophylaxis were estimated for different preventive strategies. The prevalence of GBS colonization was 21%, and at least one maternal intrapartum risk factor was present in 37%. The direct costs of the different prevention strategies were estimated as follows: risk-based, 18,500 CHF/1000 live births; screening-based, 50,110 CHF/1000 live births; combined screening- and risk-based, 43,495 CHF/1000 live births. Strategies to prevent GBS sepsis in newborns are necessary. Given our colonization prevalence of 21% and the intrapartum risk profile of the women, the screening-based approach appears superior to a risk-based approach.
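The cost figures above follow from simple per-strategy arithmetic: screening costs for every woman screened plus prophylaxis costs for every woman who qualifies for intrapartum antibiotics. Below is a rough sketch with hypothetical unit costs (the abstract does not report them); only the 21% colonisation prevalence and 37% risk-factor rate are taken from the text, and the combined strategy is omitted because it requires further assumptions about women with unknown culture status.

```python
# Back-of-the-envelope cost comparison per 1000 live births for two GBS prevention
# strategies. Unit costs are hypothetical placeholders, not the study's figures;
# only the 21% colonisation prevalence and 37% risk-factor rate come from the abstract.
N = 1000                   # live births
p_colonised = 0.21         # anogenital GBS colonisation prevalence (from the abstract)
p_risk_factor = 0.37       # any intrapartum risk factor present (from the abstract)
cost_screen = 30.0         # CHF per antenatal GBS culture (hypothetical)
cost_prophylaxis = 40.0    # CHF per course of intrapartum antibiotics (hypothetical)

risk_based = N * p_risk_factor * cost_prophylaxis                        # no screening, treat by risk factors
screening_based = N * cost_screen + N * p_colonised * cost_prophylaxis   # screen all, treat carriers

print(f"risk-based: {risk_based:.0f} CHF per 1000 live births")
print(f"screening-based: {screening_based:.0f} CHF per 1000 live births")
```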
Abstract:
OBJECTIVE: To investigate a large outbreak of scabies in an intensive care unit of a university hospital and an affiliated rehabilitation center, and to establish effective control measures to prevent further transmission. DESIGN: Outbreak investigation. SETTING: The intensive care unit of a 750-bed university hospital and an affiliated 92-bed rehabilitation center. METHODS: All exposed individuals were screened by a senior staff dermatologist. Scabies was diagnosed on the basis of (1) identification of mites by skin scraping, (2) identification of mites by dermoscopy, or (3) identification of typical burrows on clinical examination in patients without a history of prior treatment. During a follow-up period of 6 months, the attack rate was calculated as the number of symptomatic individuals divided by the total number of exposed individuals. INTERVENTIONS: All exposed healthcare workers (HCWs) and their household members underwent preemptive treatment. Initially, the most effective drug registered in Switzerland (ie, topical lindane) was prescribed, but this was switched to topical permethrin or systemic ivermectin as the outbreak progressed. Individuals with any signs or symptoms of scabies underwent dermatological examination. RESULTS: Within 7 months, 19 cases of scabies were diagnosed, 6 of them in children (mean age, 3.1 years), after exposure to the index patient, who had HIV infection and crusted scabies. A total of 1,640 exposed individuals underwent preemptive treatment. The highest attack rate, 26%-32%, was observed among HCWs involved in the care of the index patient. A too-restricted definition of individuals at risk, noncompliance with treatment, and the limited effectiveness of lindane likely led to treatment failure, relapse, and reinfestation within families. CONCLUSIONS: Crusted scabies resulted in high attack rates among HCWs and household contacts. Timely institution of hygienic precautions with close monitoring and widespread, simultaneous scabicide treatment of all exposed individuals are essential for control of an outbreak.
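As a worked instance of the attack-rate definition used in the methods (the counts below are purely illustrative, not the outbreak's figures):

\[ \text{attack rate} = \frac{\text{symptomatic exposed individuals}}{\text{all exposed individuals}}, \qquad \text{e.g. } \frac{8}{30} \approx 27\%. \]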
Abstract:
This report, on The Potential of Mode of Action (MoA) Information Derived from Non-testing and Screening Methodologies to Support Informed Hazard Assessment, resulted from a workshop organised within OSIRIS (Optimised Strategies for Risk Assessment of Industrial Chemicals through Integration of Non-test and Test Information), a project partly funded by the EU Commission within the Sixth Framework Programme. The workshop was held in Liverpool, UK, on 30 October 2008, with 35 attendees. The goal of the OSIRIS project is to develop integrated testing strategies (ITS) fit for use in the REACH system that would enable a significant increase in the use of non-testing information for regulatory decision making, and thus minimise the need for animal testing. One way to improve the evaluation of chemicals may be through categorisation by mechanism or mode of toxic action. Defining such groups can enhance read-across possibilities and priority setting for particular toxic modes or for the chemical structures responsible for them. Overall, this may result in a reduction of in vivo testing on organisms, by combining available data on mode of action with a focus on the potentially most toxic groups. In this report, the possibilities of a mechanistic approach to assist in and guide ITS are explored, and the differences between the human health and environmental areas are summarised.
Abstract:
Starting with an overview of losses due to mountain hazards in the Russian Federation and the European Alps, the question is raised why a substantial number of damaging events are still recorded despite considerable efforts in hazard mitigation and risk reduction. The main reason for this paradox lies in the absence of a dynamic, risk-based approach, and it is shown that these dynamics have different roots: firstly, neglecting climate change and systems dynamics, the development of hazard scenarios is based on the static approach of design events; secondly, owing to economic development and population dynamics, the exposed elements at risk are subject to spatial and temporal changes. These issues are discussed with respect to temporal and spatial demands. As a result, it is shown how risk is dynamic on both long-term and short-term scales, which has to be acknowledged in the risk concept if this concept is targeted at a sustainable development of mountain regions. A conceptual model is presented that can be used for dynamic risk assessment, and different management strategies are used to show how this model may be converted into practice. Furthermore, the interconnectedness and interaction between hazard and risk are addressed in order to enhance prevention, the level of protection and the degree of preparedness.
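One common way to formalise the risk dynamics described above is to make all three risk components explicitly time-dependent; this is a generic formulation from the natural-hazards literature, not necessarily the authors' specific conceptual model:

\[ R(t) = \sum_{i} p_i(t) \sum_{j} A_j(t)\, v_{ij}(t), \]

where \(p_i(t)\) is the occurrence probability of hazard scenario \(i\), \(A_j(t)\) the value of exposed element at risk \(j\), and \(v_{ij}(t)\) its vulnerability to that scenario. Long-term change (climate, settlement growth) enters through all three terms, whereas short-term change, such as the daily presence of people and vehicles, mainly affects \(A_j(t)\).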
Abstract:
We investigated the distribution of commensal staphylococcal species and determined the prevalence of multi-drug resistance in healthy cats and dogs. Risk factors associated with the carriage of multi-drug resistant strains were explored. Isolates from 256 dogs and 277 cats were identified at the species level using matrix-assisted laser desorption ionisation-time of flight mass spectrometry. The diversity of coagulase-negative staphylococci (CNS) was high, with 22 species in dogs and 24 in cats. Multi-drug resistance was frequent (17%) and not always associated with the presence of the mecA gene. A stay in a veterinary clinic in the previous year was associated with an increased risk of colonisation by multi-drug resistant staphylococci (OR = 2.4, 95% CI 1.1-5.2, likelihood ratio test p = 0.04). When identifying efficient control strategies against antibiotic resistance, the presence of resistance mechanisms other than methicillin resistance and the possible role of CNS in the spread of resistance determinants should be considered.
Abstract:
In all European Union countries, chemical residues are required to be routinely monitored in meat. Good farming and veterinary practice can prevent the contamination of meat with pharmaceutical substances, resulting in a low detection of drug residues through random sampling. An alternative approach is to target-monitor farms suspected of treating their animals with antimicrobials. The objective of this project was to assess, using a stochastic model, the efficiency of these two sampling strategies. The model integrated data on Swiss livestock as well as expert opinion and results from studies conducted in Switzerland. Risk-based sampling showed an increase in detection efficiency of up to 100% depending on the prevalence of contaminated herds. Sensitivity analysis of this model showed the importance of the accuracy of prior assumptions for conducting risk-based sampling. The resources gained by changing from random to risk-based sampling should be transferred to improving the quality of prior information.
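To illustrate the kind of comparison such a stochastic model makes, here is a minimal Monte Carlo sketch contrasting random with risk-based sampling of herds; every parameter is a hypothetical placeholder, not an input of the study's model.

```python
# Monte Carlo sketch: random vs. risk-based sampling for detecting contaminated herds.
# All parameters are hypothetical placeholders, not the study's inputs.
import numpy as np

rng = np.random.default_rng(2)
n_herds = 10_000
prevalence = 0.02           # fraction of herds with violative residues (hypothetical)
n_samples = 200             # herds sampled per monitoring round (hypothetical)
p_flag_true = 0.7           # chance a contaminated herd is flagged "suspect" (hypothetical)
p_flag_false = 0.05         # chance a clean herd is wrongly flagged (hypothetical)

contaminated = rng.random(n_herds) < prevalence
suspect = np.where(contaminated,
                   rng.random(n_herds) < p_flag_true,
                   rng.random(n_herds) < p_flag_false)

def detections(order):
    """Contaminated herds among the first n_samples herds visited in the given order."""
    return int(contaminated[order[:n_samples]].sum())

random_order = rng.permutation(n_herds)
risk_order = np.argsort(~suspect, kind="stable")   # suspect herds first, then the rest
print(f"random sampling detects {detections(random_order)} contaminated herds; "
      f"risk-based sampling detects {detections(risk_order)}")
```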