908 results for "Willingness to pay for risk reduction"
Abstract:
Free fatty acids (FFAs) have been shown to produce alterations in heart rate variability (HRV) in healthy and diabetic individuals. Changes in HRV have been described in septic patients and in those with hyperglycemia and elevated plasma FFA levels. We studied whether sepsis-induced heart damage and HRV alteration are associated with plasma FFA levels in patients. Thirty-one patients with sepsis were included. The patients were divided into two groups: survivors (n = 12) and nonsurvivors (n = 19). The following associations were investigated: (a) troponin I elevation and HRV reduction and (b) clinical evolution and HRV index, plasma troponin, and plasma FFA levels. Initial measurements of C-reactive protein and Acute Physiology and Chronic Health Evaluation severity scores were similar in both groups. Overall, an increase in plasma troponin level was related to increased mortality risk. From the first day of the study, the nonsurvivor group presented a reduced left ventricular stroke work systolic index and a reduced low-frequency (LF) component, one of the HRV indexes. The correlation coefficient for LF values and troponin was r² = 0.75 (P < 0.05). All patients presented elevated plasma FFA levels on the first day of the study (5.11 ± 0.53 mg/mL), and this elevation was even greater in the nonsurvivor group compared with the survivors (6.88 ± 0.13 vs. 3.85 ± 0.48 mg/mL, respectively; P < 0.05). Cardiac damage was confirmed by measurement of plasma troponin I and histological analysis. Heart dysfunction in nonsurvivor patients was indicated by the left ventricular stroke work systolic index and the HRV index. A relationship was found between plasma FFA levels, the LFnu index, troponin levels, and histological changes. Plasma FFA levels emerged as a possible cause of heart damage in sepsis.
Abstract:
Background: Around 15% of patients die or become dependent after cerebral vein and dural sinus thrombosis (CVT). Method: We used the International Study on Cerebral Vein and Dural Sinus Thrombosis (ISCVT) sample (624 patients, with a median follow-up time of 478 days) to develop a Cox proportional hazards regression model to predict outcome, dichotomised by a modified Rankin Scale score > 2. From the model hazard ratios, a risk score was derived and a cut-off point selected. The model and the score were tested in 2 validation samples: (1) the prospective Cerebral Venous Thrombosis Portuguese Collaborative Study Group (VENOPORT) sample with 91 patients; (2) a sample of 169 consecutive CVT patients admitted to 5 ISCVT centres after the end of the ISCVT recruitment period. Sensitivity, specificity, c statistics and overall efficiency to predict outcome at 6 months were calculated. Results: The model (hazard ratios: malignancy 4.53; coma 4.19; thrombosis of the deep venous system 3.03; mental status disturbance 2.18; male gender 1.60; intracranial haemorrhage 1.42) had overall efficiencies of 85.1, 84.4 and 90.0% in the derivation sample and validation samples 1 and 2, respectively. Using the risk score (range from 0 to 9) with a cut-off of ≥3 points, overall efficiency was 85.4, 84.4 and 90.1% in the derivation sample and validation samples 1 and 2, respectively. Sensitivity and specificity in the combined samples were 96.1 and 13.6%, respectively. Conclusions: The CVT risk score has a good estimated overall rate of correct classifications in both validation samples, but its specificity is low. It can be used to avoid unnecessary or dangerous interventions in low-risk patients, and may help to identify high-risk CVT patients. Copyright (C) 2009 S. Karger AG, Basel
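As an illustration of how such a score is applied, the Python sketch below computes an integer risk score from the six predictors named above and dichotomises it at the stated cut-off of ≥3 points. The per-predictor point values and the helper names cvt_risk_score and high_risk are assumptions chosen only so the total spans the reported 0-9 range; they are not the weights published with the ISCVT model.

# Illustrative sketch only: point values are assumed so the score spans 0-9;
# they are NOT the published ISCVT weights.
def cvt_risk_score(malignancy, coma, deep_venous_thrombosis,
                   mental_status_disturbance, male_sex,
                   intracranial_haemorrhage):
    """Return an integer score from six boolean predictors."""
    return (2 * malignancy + 2 * coma + 2 * deep_venous_thrombosis
            + 1 * mental_status_disturbance + 1 * male_sex
            + 1 * intracranial_haemorrhage)

def high_risk(score, cutoff=3):
    """Dichotomise at the cut-off of >= 3 points quoted in the abstract."""
    return score >= cutoff

# Example: a comatose male patient with deep venous system thrombosis.
score = cvt_risk_score(False, True, True, False, True, False)
print(score, high_risk(score))  # -> 5 True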
Abstract:
Heart failure (HF) incidence in diabetes, in both the presence and absence of CHD, is rising. Prospective population-based studies can help describe the relationship between HbA1c, a measure of glycaemic control, and HF risk. We studied the incidence of HF hospitalisation or death among 1,827 participants in the Atherosclerosis Risk in Communities (ARIC) study with diabetes and no evidence of HF at baseline. Cox proportional hazards models included age, sex, race, education, health insurance status, alcohol consumption, BMI and WHR, and major CHD risk factors (BP level and medications, LDL- and HDL-cholesterol levels, and smoking). In this population of persons with diabetes, crude HF incidence rates per 1,000 person-years were lower in the absence of CHD (incidence rate 15.5 for CHD-negative vs 56.4 for CHD-positive, p < 0.001). The adjusted HR of HF for each 1% higher HbA1c was 1.17 (95% CI 1.11-1.25) for the non-CHD group and 1.20 (95% CI 1.04-1.40) for the CHD group. When the analysis was limited to HF cases that occurred in the absence of prevalent or incident CHD (during follow-up), the adjusted HR remained 1.20 (95% CI 1.11-1.29). These data suggest that HbA1c is an independent risk factor for incident HF in persons with diabetes, with and without CHD. Long-term clinical trials of tight glycaemic control should quantify the impact of different treatment regimens on HF risk reduction.
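As a reading aid, and as an implication of the proportional-hazards model rather than a result reported in the abstract, a hazard ratio of 1.17 per 1% higher HbA1c scales multiplicatively with the size of the difference, assuming the log-linear relationship holds across that range:

HR(Δ HbA1c) = 1.17^Δ, so, for example, Δ = 3 percentage points gives 1.17³ ≈ 1.60, i.e. roughly a 60% higher hazard relative to the reference.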
The structure of middle management remuneration packages: An application to Australian mine managers
Abstract:
This paper investigates the composition of remuneration packages for middle managers and relates the structure of remuneration contracts to firm-specific attributes. A statutorily defined position in a single industry is studied as an example of middle management. This allows us to control for differences in task complexity across managers and industry-induced factors that could determine differences in remuneration contracts. Higher-risk firms are expected to pay their mine managers a greater proportion of variable salaries and market and/or accounting-based compensation than low-risk firms. Results indicate that high-risk firms pay a higher proportion of variable salaries and more compensation based on market and/or accounting performance.
Abstract:
OBJECTIVES We developed a prognostic strategy for quantifying the long-term risk of coronary heart disease (CHD) events in survivors of acute coronary syndromes (ACS). BACKGROUND Strategies for quantifying long-term risk of CHD events have generally been confined to primary prevention settings. The Long-term Intervention with Pravastatin in Ischemic Disease (LIPID) study, which demonstrated that pravastatin reduces CHD events in ACS survivors with a broad range of cholesterol levels, enabled assessment of long-term prognosis in a secondary prevention setting. METHODS Based on outcomes in 8,557 patients in the LIPID study, a multivariate risk factor model was developed for prediction of CHD death or nonfatal myocardial infarction. Prognostic indexes were developed based on the model, and low-, medium-, high- and very high-risk groups were defined by categorizing the prognostic indexes. RESULTS In addition to pravastatin treatment, the independently significant risk factors included: total and high density lipoprotein cholesterol, age, gender, smoking status, qualifying ACS, prior coronary revascularization, diabetes mellitus, hypertension and prior stroke. Pravastatin reduced coronary event rates in each risk level, and the relative risk reduction did not vary significantly between risk levels. The predicted five-year coronary event rates ranged from 5% to 19% for those assigned pravastatin and from 6.4% to 23.6% for those assigned placebo. CONCLUSIONS Long-term prognosis of ACS survivors varied substantially according to conventional risk factor profile. Pravastatin reduced coronary risk within all risk levels; however, absolute risk remained high in treated patients with unfavorable profiles. Our risk stratification strategy enables identification of ACS survivors who remain at very high risk despite statin therapy. (J Am Coll Cardiol 2001;38:56-63) (C) 2001 by the American College of Cardiology.
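The abstract does not report the fitted coefficients, but the construction it describes (a linear prognostic index from a multivariate model, then categorisation into low-, medium-, high- and very high-risk groups) can be sketched in Python as below. The coefficient values, cut points and names (ASSUMED_BETAS, prognostic_index, risk_group) are illustrative assumptions, not the LIPID estimates.

# Illustrative sketch of a prognostic index and risk grouping of the kind
# described above. All coefficients and cut points are assumed for
# illustration; they are NOT the values estimated in the LIPID study.
ASSUMED_BETAS = {               # assumed log hazard ratios per unit covariate
    "total_chol_mmol_l": 0.20,
    "hdl_chol_mmol_l": -0.50,
    "age_decades": 0.35,
    "current_smoker": 0.45,
    "diabetes": 0.55,
    "hypertension": 0.30,
    "prior_stroke": 0.60,
}

def prognostic_index(covariates):
    """Linear predictor: sum of beta * x over the included risk factors."""
    return sum(ASSUMED_BETAS[name] * value for name, value in covariates.items())

def risk_group(index, cuts=(1.0, 2.0, 3.0)):
    """Categorise the index into low / medium / high / very high risk."""
    for cut, label in zip(cuts, ("low", "medium", "high")):
        if index < cut:
            return label
    return "very high"

patient = {"total_chol_mmol_l": 6.0, "hdl_chol_mmol_l": 1.0, "age_decades": 6.5,
           "current_smoker": 1, "diabetes": 0, "hypertension": 1, "prior_stroke": 0}
pi = prognostic_index(patient)
print(pi, risk_group(pi))  # -> an index of about 3.7, classified "very high"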
Abstract:
Functional knowledge of the physiological basis of crop adaptation to stress is a prerequisite for exploiting specific adaptation to stress environments in breeding programs. This paper presents an analysis of yield components for pearl millet, to explain the specific adaptation of local landraces to stress environments in Rajasthan, India. Six genotypes, ranging from high-tillering traditional landraces to low-tillering open-pollinated modern cultivars, were grown in 20 experiments, covering a range of non-stress and drought stress patterns. In each experiment, yield components (panicle number, grain number, 100-grain mass) were measured separately for main shoots, basal tillers, and nodal tillers. Under optimum conditions, landraces had a significantly lower grain yield than the cultivars, but no significant differences were observed at yield levels around 1 t ha(-1). This genotype x environment interaction for grain yield was due to a difference in yield strategy: landraces aimed at minimising the risk of a crop failure under stress conditions, whereas modern cultivars aimed at maximising yield potential under optimum conditions. A key aspect of the adaptation of landraces was the small size of the main shoot panicle, as it minimised (1) the loss of productive tillers during stem elongation; (2) the delay in anthesis if mid-season drought occurs; and (3) the reduction in panicle productivity of the basal tillers under stress. In addition, a low investment in structural panicle weight, relative to vegetative crop growth rate, promoted the production of nodal tillers, providing a mechanism to compensate for reduced basal tiller productivity if stress occurred around anthesis. A low maximum 100-grain mass also ensured individual grain mass was little affected by environmental conditions. The strategy of the high-tillering landraces carries a yield penalty under optimum conditions, but is expected to minimise the risk of a crop failure, particularly if mid-season drought stress occurs. The yield architecture of low-tillering varieties, by contrast, will be suited to end-of-season drought stress, provided anthesis is early. Application of the above adaptation mechanisms in a breeding program could enable the identification of plant types that match the prevalent stress patterns in the target environments. (C) 2003 E.J. van Oosterom. Published by Elsevier Science B.V. All rights reserved.
Abstract:
Introduction: The purpose of this review is to gather and analyse current research publications to evaluate Sinogram-Affirmed Iterative Reconstruction (SAFIRE). The aim of this review is to investigate whether this algorithm is capable of reducing the dose delivered during CT imaging while maintaining image quality. Recent research shows that children have a greater risk per unit dose, due to increased radiosensitivity and longer life expectancies, which makes it particularly important to reduce the radiation dose received by children. Discussion: Recent publications suggest that SAFIRE is capable of reducing image noise in CT images, thereby creating the potential to reduce dose. Some publications suggest a decrease in dose of up to 64% compared to filtered back projection can be accomplished without a change in image quality. However, the literature suggests that using a higher SAFIRE strength may alter the image texture, creating an overly ‘smoothed’ image that lacks contrast. Some literature reports that SAFIRE decreases low-contrast detectability as well as spatial resolution. Publications tend to agree that SAFIRE strength three is optimal for an acceptable level of visual image quality, but more research is required. The importance of striking a balance between dose reduction and image quality is stressed. In this literature review most of the publications were completed using adults or phantoms, and a distinct lack of literature for paediatric patients is noted. Conclusion: It is necessary to find an optimal way to balance dose reduction and image quality. More research relating to SAFIRE and paediatric patients is required to fully investigate the dose reduction potential in this population, for a range of different SAFIRE strengths.
Abstract:
High loads of fungi have been reported in different types of waste management plants. This study aimed to assess fungal contamination in one waste-sorting plant before and after cleaning procedures in order to analyze their effectiveness. Air samples of 50 L were collected by an impaction method, while surface samples, taken at the same time, were collected by the swabbing method and subjected to further macro- and microscopic observation. In addition, we collected air samples of 250 L using the Coriolis μ impinger air sampler (Bertin Technologies) at a 300 L/min airflow rate in order to perform real-time quantitative PCR (qPCR) amplification of genes from specific fungal species, namely the Aspergillus fumigatus and Aspergillus flavus complexes, as well as the Stachybotrys chartarum species. Fungal quantification in the air ranged from 180 to 5,280 CFU m⁻³ before cleaning and from 220 to 2,460 CFU m⁻³ after cleaning procedures. Surfaces presented results that ranged from 29 × 10⁴ to 109 × 10⁴ CFU m⁻² before cleaning and from 11 × 10⁴ to 89 × 10⁴ CFU m⁻² after cleaning. No statistically significant differences in fungal load were detected between samples taken before and after the cleaning procedures. Toxigenic strains from the A. flavus complex and S. chartarum were not detected by qPCR. Conversely, the A. fumigatus species was successfully detected by qPCR and, interestingly, was amplified in two samples in which no detection by conventional methods was observed. Overall, these results reveal the inefficacy of the cleaning procedures and show that it is important to determine fungal burden in order to carry out risk assessment.
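For context, the airborne concentrations above follow from the standard conversion used for volumetric samplers (a generic formula, not one stated in the abstract): CFU m⁻³ = colony count × 1,000 / volume sampled in litres. With the 50 L impaction samples this is the colony count × 20, so, for example, 9 colonies would correspond to 180 CFU m⁻³ and 264 colonies to 5,280 CFU m⁻³ (ignoring any positive-hole correction the sampler may require).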
Abstract:
It is well recognized that professional musicians are at risk of hearing damage due to exposure to high sound pressure levels during music playing. However, it is important to recognize that musicians’ exposure may start early in the course of their training, as students in the classroom and at home. Studies regarding the sound exposure of music students and their hearing disorders are scarce and do not take into account important influencing variables. Therefore, this study aimed to describe the sound level exposures of music students across different music styles and classes, and according to the instrument played. Further, this investigation attempted to analyze students’ perceptions regarding exposure to loud music and the consequent health risks, as well as to characterize their preventive behaviors. The results showed that music students are exposed to high sound levels in the course of their academic activity. This exposure is increased by practice outside the school and other external activities. Differences were found between music styles, instruments, and classes. Tinnitus, hyperacusis, diplacusis, and sound distortion were reported by the students. However, students were not entirely aware of the health risks related to exposure to high sound pressure levels. These findings reflect the importance of starting intervention with respect to noise risk reduction at an early stage, when musicians are commencing their activity as students.
Abstract:
Aim of the paper: The purpose of this paper is to examine human resources management (HRM) practices in small firms and to improve the understanding of the relationship between such practices and business growth. This exploratory study is based on the resource-based view of the firm and on empirical work carried out in two small firms, relating HRM practices to the firms’ results. Contribution to the literature: This is an in-depth study of HRM practices and their impact on performance growth in micro firms, isolating and controlling for most of the contextual and internal variables considered in the literature that relate HRM to growth. Firm growth analysis was broadened by the use of several dependent variables: employment growth and operational and financial performance growth. Some hypotheses are suggested for further research on identifying HRM practices in small businesses and their relation to firm growth. Methodology: A case study methodology was used to study two firms. The techniques used to collect data were semi-structured interviews with the owner and all the employees, unstructured observation at the firms’ facilities (over two days), entrepreneur profile definition (survey answer) and document data collection (on demographic characterization and performance results). Data were analyzed through content analysis, with categories derived from the interview protocols and the literature. Results and implications: Results revealed that, despite similarities in the firms’ organizational characteristics, they differ significantly in the owners’ motivation to grow, HRM practices, and organizational performance and growth. Future studies should pay special attention to the owner’s willingness to grow, the firms’ years of experience in business, the staff’s years of experience in their field of work, and turnover. HRM practices in micro/small firms should be better defined and characterized. The external image of management posture in relation to longitudinal financial results and growth should also be explored.
Abstract:
Anti-Toxoplasma IgG avidity was determined in 168 serum samples from IgG- and IgM-positive pregnant women at various times during pregnancy, in order to evaluate its predictive value, in a single sample, for the risk of mother-to-child transmission, taking the limitations of conventional serology into account. Neonatal IgM was considered the serologic marker of transmission. Fluorometric tests for IgG, IgM (immunocapture) and IgG avidity were performed. Fifty-one of the 128 pregnant women tested gave birth in the hospital, and neonatal IgM was obtained. The results showed 32 (62.75%) pregnant women with high avidity and IgM indexes between 0.6 and 2.4, and no infected newborns. Nineteen (37.25%) had low or inconclusive avidity and IgM indexes between 0.6 and 11.9, with five infected newborns and one stillbirth. In two infected newborns and the stillbirth, maternal IgM indexes were low, and in one infected newborn the only maternal parameter that suggested fetal risk was IgG avidity. In the present study, IgG avidity performed on single samples from IgM-positive pregnant women helped to determine the risk of transmission at any time during pregnancy, especially when the indexes of the two tests were analysed with respect to gestational age. In developing countries with a high prevalence of infection, this model may be less expensive than following up susceptible mothers until childbirth with monthly serology, and it creates a new perspective for the diagnosis of congenital toxoplasmosis.
Abstract:
Reducing low-density lipoprotein cholesterol (LDL-C) levels using statins is associated with significant reductions in cardiovascular (CV) events in a wide range of patient populations. Although statins are generally considered to be safe, recent studies suggest they are associated with an increased risk of developing Type 2 diabetes (T2D). This led the US Food and Drug Administration (FDA) to change their labelling requirements for statins to include a warning about the possibility of increased blood sugar and HbA1c levels and the European Medicines Agency (EMA) to issue guidance on a small increased risk of T2D with the statin class. This review examines the evidence leading to these claims and provides practical guidance for primary care physicians on the use of statins in people with or at risk of developing T2D. Overall, evidence suggests that the benefits of statins for the reduction of CV risk far outweigh the risk of developing T2D, especially in individuals with higher CV risk. To reduce the risk of developing T2D, physicians should assess all patients for T2D risk prior to starting statin therapy, educate patients about their risks, and encourage risk-reduction through lifestyle changes. Whether some statins are more diabetogenic than others requires further study. Statin-treated patients at high risk of developing T2D should regularly be monitored for changes in blood glucose or HbA1c levels, and the risk of conversion from pre-diabetes to T2D should be reduced by intensifying lifestyle changes. Should a patient develop T2D during statin treatment, physicians should continue with statin therapy and manage T2D in accordance with relevant national guidelines.
Abstract:
Dissertation submitted to obtain the degree of Doctor (PhD) in Industrial Engineering
Abstract:
INTRODUCTION: Urinary tract infections (UTIs) among transplant recipients are usually caused by gram-negative microorganisms and are associated with a high incidence of morbidity and mortality. The aim of this study was to evaluate the risk factors associated with the acquisition of UTIs during the first year after renal transplantation. METHODS: We report a single-center retrospective cohort study of 99 renal transplant patients followed for the first year after surgery. A UTI episode was defined as a urine culture showing bacterial growth, together with leucocyturia, in patients presenting with urinary symptoms. The absence of infection (including asymptomatic bacteriuria) was defined as the absence of symptoms, whether the urine culture was negative or showed bacterial growth with any number of colonies. RESULTS: Ninety-nine patients were included in the study. During the study, 1,847 urine cultures were collected, and 320 (17.3%) tested positive for bacterial growth. Twenty-six (26.2%) patients developed a UTI. The microorganisms most frequently isolated from patients with UTIs were Klebsiella pneumoniae (36%), with 33% of the strains resistant to carbapenems, followed by Escherichia coli (20%). There were no deaths or graft losses associated with UTI episodes. CONCLUSIONS: Among the UTI risk factors studied, the only one associated with a higher incidence of infection was female sex. Moreover, the identification of drug-resistant strains is worrisome, as these infections have become widespread globally and represent a challenge in the control and management of infections, especially in solid organ transplantation.
Abstract:
INTRODUCTION: Despite multidrug therapy, leprosy remains a public health issue. The intradermal Bacillus Calmette-Guérin (BCG) vaccine, the Mitsuda test (lepromin skin test), and anti-phenolic glycolipid I (PGL-I) serology are widely used in leprosy studies and have shown great epidemiological value. METHODS: This longitudinal study evaluated the relative risks and benefits of these three tools by comparing results observed in household contacts (HHCs) of leprosy patients who developed leprosy with those of HHCs who did not, in a population of 2,992 individuals monitored over a 10-year period. RESULTS: Seventy-five (2.5%) new leprosy cases were diagnosed, including 28 (0.9%) co-prevalent cases. Therefore, for the risk-benefit assessment, 47 (1.6%) HHCs were considered truly diagnosed during follow-up. The comparison between healthy and affected contacts demonstrated that BCG vaccination not only increased protection, but booster doses increased it further, reaching a 95% relative risk (RR) reduction when having two or more scars was compared with having no scars [RR, 0.0459; 95% confidence interval (CI), 0.006-0.338]. Similarly, Mitsuda reactions >7 mm in induration conferred 7-fold greater protection against disease development compared to reactions of 0-3 mm (RR, 0.1446; 95% CI, 0.0566-0.3696). In contrast, anti-PGL-I ELISA seropositivity indicated a 5-fold increase in RR for disease outcome (RR, 5.688; 95% CI, 3.2412-9.9824). The combined effect of no BCG scars, a Mitsuda reaction of <7 mm, and anti-PGL-I seropositivity increased the risk of leprosy onset 8-fold (RR, 8.109; 95% CI, 5.1167-12.8511). CONCLUSIONS: The adoption of these combined assays may strengthen measures for leprosy control strategies.
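As a check on the figures above (simple arithmetic on the reported estimates, not additional study data): the quoted 95% reduction follows directly from the reported RR, since relative risk reduction = 1 − RR = 1 − 0.0459 ≈ 0.954, i.e. roughly 95%; likewise, the protective Mitsuda result corresponds to 1/0.1446 ≈ 6.9, the approximately 7-fold protection quoted.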