33 results for Value at risk
Abstract:
OBJECTIVE: To assess the prognostic value of Technetium-99m-labeled single-photon emission computed tomography (SPECT) in the follow-up of patients who had undergone their first myocardial revascularization. METHODS: We carried out a retrospective study of 280 revascularized patients undergoing myocardial scintigraphy under stress (exercise or pharmacological stress with dipyridamole) and at rest according to a 2-day protocol. A set of clinical, stress electrocardiographic, and scintigraphic variables was assessed. Cardiac events were classified as "major" (death, infarction, unstable angina) and "any" (major event, coronary angioplasty, or new myocardial revascularization surgery). RESULTS: Thirty-six major events occurred: 3 deaths, 11 infarctions, and 22 episodes of unstable angina. In addition to the major events, 22 angioplasties and 7 new surgeries occurred, resulting in a total of 65 events. The sensitivity of scintigraphy in predicting a major event or any event was 55% and 58%, respectively, with negative predictive values of 90% and 83%. Diabetes mellitus, inconclusive stress electrocardiography, and scintigraphic visualization of left ventricular enlargement were significant variables for the occurrence of a major event. On multivariate analysis, abnormal myocardial scintigraphy was a predictor of any event. CONCLUSION: Myocardial perfusion tomography with Technetium-99m may be used to identify high-risk patients after their first myocardial revascularization surgery.
Abstract:
OBJECTIVE: To determine the immediate behavior and the prognostic value, in terms of late survival, of serum troponin I measurement in patients undergoing myocardial revascularization surgery with extracorporeal circulation. METHODS: We studied 88 randomly selected patients, 65 (73.8%) male, who underwent myocardial revascularization surgery with extracorporeal circulation. Troponin measurements were performed in the preoperative period, immediately after intensive care unit admission, and on the first and second postoperative days. Values below 0.1 nanogram per milliliter (ng/mL) were considered normal. The cutoff points for late prognostic assessment were 0.5 ng/mL, 1 ng/mL, 2.5 ng/mL, and 5 ng/mL. RESULTS: Serum troponin I levels were elevated on the first postoperative day, suggesting the occurrence of specific myocardial damage. Patients with a poor prognosis could be identified, because serum levels above 2.5 ng/mL and 5 ng/mL in the postoperative period were associated with mortality rates of 33% and 50%, respectively, within a maximum follow-up of 6 months. CONCLUSION: Troponin I values around 2.5 ng/mL in the postoperative period should call attention to the need for more aggressive diagnostic or therapeutic measures.
Abstract:
Background: The TIMI score for ST-segment elevation myocardial infarction (STEMI) was created and validated specifically for this clinical scenario, whereas the GRACE score is generic to any type of acute coronary syndrome. Objective: To identify which of the TIMI and GRACE scores has the better prognostic performance in patients with STEMI. Methods: We included 152 individuals consecutively admitted for STEMI. The TIMI and GRACE scores were tested for their discriminatory ability (C-statistic) and calibration (Hosmer-Lemeshow) in relation to hospital death. Results: The TIMI score showed an even distribution of patients across the low, intermediate, and high-risk ranges (39%, 27%, and 34%, respectively), as opposed to the GRACE score, which concentrated patients at low risk (80%, 13%, and 7%, respectively). Case fatality was 11%. The C-statistic of the TIMI score was 0.87 (95% CI = 0.76 to 0.98), similar to that of GRACE (0.87; 95% CI = 0.75 to 0.99); p = 0.71. The TIMI score showed satisfactory calibration (χ² = 1.4; p = 0.92), markedly better than that of the GRACE score (χ² = 14; p = 0.08). This calibration is reflected in the expected incidence ranges for low, intermediate, and high risk according to the TIMI score (0%, 4.9%, and 25%, respectively), in contrast to GRACE (2.4%, 25%, and 73%), whose expected incidence in the intermediate range was inappropriate. Conclusion: Although the scores show similar discriminatory capacity for hospital death, the TIMI score had better calibration than GRACE. These findings need to be validated in populations of different risk profiles.
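The discriminatory ability compared above, the C-statistic, is simply the probability that a patient who died was assigned a higher risk score than a patient who survived. A minimal sketch of its computation, using hypothetical scores (not data from the study):

```python
from itertools import product

def c_statistic(scores_event, scores_no_event):
    """Concordance probability: fraction of (event, no-event) patient
    pairs in which the event patient has the higher risk score;
    ties count as half-concordant."""
    pairs = list(product(scores_event, scores_no_event))
    concordant = sum(1.0 if e > n else 0.5 if e == n else 0.0
                     for e, n in pairs)
    return concordant / len(pairs)

# Hypothetical risk scores: 5 in-hospital deaths vs. 10 survivors
deaths = [9, 7, 8, 6, 9]
survivors = [3, 4, 2, 5, 6, 1, 3, 2, 4, 5]
print(c_statistic(deaths, survivors))  # 0.99
```

A value of 0.5 means chance-level discrimination; the 0.87 reported for both scores indicates good separation of deaths from survivors, which is why the decision between them rested on calibration.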
Abstract:
Background: The QRS-T angle correlates with prognosis in patients with heart failure and coronary artery disease, reflected by an increase in mortality proportional to the increase in the difference between the axes of the QRS complex and the T wave in the frontal plane. The value of this correlation in patients with Chagas heart disease is currently unknown. Objective: To determine the correlation between the QRS-T angle and the risk of induction of ventricular tachycardia/ventricular fibrillation (VT/VF) during electrophysiological study (EPS) in patients with Chagas disease. Methods: Case-control study at a tertiary center. Patients without induction of VT/VF on EPS were used as controls. The QRS-T angle was categorized as normal (0-105º), borderline (105-135º), or abnormal (135-180º). Differences between groups were analyzed with the t test or Mann-Whitney test for continuous variables and with Fisher's exact test for categorical variables. P values < 0.05 were considered significant. Results: Of 116 patients undergoing EPS, 37.9% were excluded due to incomplete information/inactive records or because the QRS-T angle could not be correctly calculated (presence of left bundle branch block or atrial fibrillation). Of the 72 patients included in the study, 31 had VT/VF induced on EPS. Among these, the QRS-T angle was normal in 41.9%, borderline in 12.9%, and abnormal in 45.2%. Among patients without induction of VT/VF on EPS, the QRS-T angle was normal in 63.4%, borderline in 14.6%, and abnormal in 17.1% (p = 0.04). Compared with patients with a normal QRS-T angle, those with an abnormal angle had a fourfold higher risk of inducible VT/VF on EPS [odds ratio (OR) 4; confidence interval (CI) 1.298-12.325; p = 0.028].
After adjustment for other variables such as age, ejection fraction (EF), and QRS duration, there was a trend for the abnormal QRS-T angle to identify patients at increased risk of inducible VT/VF during EPS (OR 3.95; CI 0.99-15.82; p = 0.052). The EF also emerged as a predictor of VT/VF induction: for each one-point increase in EF, there was a 4% reduction in the rate of sustained ventricular arrhythmia on EPS. Conclusions: Changes in the QRS-T angle and decreases in EF were associated with an increased risk of induction of VT/VF on EPS.
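The unadjusted odds ratio above can be reproduced from a 2×2 table. The counts below are reconstructed from the reported percentages (abnormal vs. normal angle in the induced and non-induced groups) and are an assumption for illustration, not tabulated data from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a = abnormal angle with VT/VF induced, b = abnormal
    without induction, c = normal with VT/VF induced, d = normal without.
    Returns the odds ratio and its Woolf (log-scale) 95% CI."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(orr) - z * se)
    hi = math.exp(math.log(orr) + z * se)
    return orr, lo, hi

# Reconstructed counts: 45.2% of 31 ≈ 14 and 41.9% of 31 ≈ 13 (induced);
# 17.1% of 41 ≈ 7 and 63.4% of 41 ≈ 26 (not induced).
orr, lo, hi = odds_ratio_ci(14, 7, 13, 26)
print(f"OR = {orr:.1f}, 95% CI {lo:.3f}-{hi:.3f}")
```

With these counts the computation returns the study's reported figures (OR 4, CI roughly 1.298-12.325), which suggests the reconstruction is consistent.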
Abstract:
Background: Cardiovascular diseases (CVDs) are the leading cause of death worldwide. Knowledge of cardiovascular risk factors (CVRFs) in young adults, and of their modification over time, supports measures that change risk and prevent CVDs. Objectives: To determine the presence of CVRFs, and their changes, in different health care professionals over a period of 20 years. Methods: All students of the medicine, nursing, nutrition, dentistry, and pharmacy departments of the Federal University of Goiás who agreed to participate in this study were evaluated when they started their degree courses and 20 years afterward. Questionnaires on CVRFs [systemic arterial hypertension (SAH), diabetes mellitus, dyslipidemia, family history of early CVD, smoking, alcohol consumption, and sedentary lifestyle] were administered. Cholesterol levels, blood sugar levels, blood pressure, weight, height, and body mass index were determined. The Kolmogorov-Smirnov test was used to evaluate distribution, the chi-square test to compare courses and sexes, and the McNemar test to compare CVRFs. The significance level was set at p < 0.05. Results: The first stage of the study included 281 individuals (91% of all the students), of whom 62.9% were women; the mean age was 19.7 years. In the second stage, 215 subjects were reassessed (76% of the initial sample), of whom 59.07% were women; the mean age was 39.8 years. The sample mostly consisted of medical students (with a predominance of men), followed by nursing, nutrition, and pharmacy students, with a predominance of women (p < 0.05). Excessive weight gain, SAH, and dyslipidemia were observed among physicians and dentists (p < 0.05). Excessive weight gain, SAH, and a reduction in sedentary lifestyle (p < 0.05) were observed among pharmacists. Among nurses, there was an increase in excessive weight and alcohol consumption (p < 0.05).
Finally, nutritionists showed an increase in dyslipidemia (p < 0.05). Conclusion: In general, there was an unfavorable progression of CVRFs in the population under study, despite its adequate specialized knowledge of these risk factors.
Abstract:
Background: Heart surgery has developed with increasing patient complexity. Objective: To assess the use of resources and real costs stratified by risk factors of patients submitted to surgical cardiac procedures, and to compare them with the values reimbursed by the Brazilian Unified Health System (SUS). Methods: All cardiac surgery procedures performed between January and July 2013 in a tertiary referral center were analyzed. Demographic and clinical data allowed the calculation of the value reimbursed by the Brazilian SUS. Patients were stratified into low, intermediate, and high-risk categories according to the EuroSCORE. Clinical outcomes, use of resources, and costs (real costs versus SUS reimbursement) were compared between the established risk groups. Results: Postoperative mortality rates of the low, intermediate, and high-risk EuroSCORE strata showed a significant linear positive correlation (3.8%, 10%, and 25%; p < 0.0001), as did the occurrence of any postoperative complication (13.7%, 20.7%, and 30.8%, respectively; p = 0.006). Accordingly, length of stay increased from 20.9 days to 24.8 and 29.2 days (p < 0.001). The real cost rose in parallel with resource use across EuroSCORE risk strata (R$ 27,116.00 ± R$ 13,928.00 versus R$ 34,854.00 ± R$ 27,814.00 versus R$ 43,234.00 ± R$ 26,009.00, respectively; p < 0.001). SUS reimbursement also increased (R$ 14,306.00 ± R$ 4,571.00 versus R$ 16,217.00 ± R$ 7,298.00 versus R$ 19,548.00 ± R$ 935.00; p < 0.001). However, as the EuroSCORE increased, there was a significant difference (p < 0.0001) between the slope of the real cost increase and that of the SUS reimbursement across risk strata. Conclusion: A higher EuroSCORE was related to higher postoperative mortality, complications, length of stay, and costs. Although SUS reimbursement increased according to risk, it was not proportional to real costs.
Abstract:
Background: Pulmonary hypertension is associated with poor prognosis in heart failure. However, non-invasive diagnosis is still challenging in clinical practice. Objective: We sought to assess the prognostic utility of non-invasive estimation of pulmonary vascular resistance (PVR) by cardiovascular magnetic resonance to predict adverse cardiovascular outcomes in heart failure with reduced ejection fraction (HFrEF). Methods: Prospective registry of patients with left ventricular ejection fraction (LVEF) < 40% recently admitted for decompensated heart failure over three years. PVR was calculated from the right ventricular ejection fraction and the average velocity of the pulmonary artery estimated during cardiac magnetic resonance. Readmission for heart failure and all-cause mortality were considered adverse events at follow-up. Results: 105 patients (mean LVEF 26.0 ± 7.7%, ischemic etiology 43%) were included. Patients with adverse events at long-term follow-up had higher values of PVR (6.93 ± 1.9 vs. 4.6 ± 1.7 estimated Wood units (eWu), p < 0.001). In multivariate Cox regression analysis, PVR ≥ 5 eWu (cutoff value according to the ROC curve) was independently associated with an increased risk of adverse events at 9 months of follow-up (HR 2.98; 95% CI 1.12-7.88; p < 0.03). Conclusions: In patients with HFrEF, PVR ≥ 5.0 eWu is associated with significantly worse clinical outcome at follow-up. Non-invasive estimation of PVR by cardiac magnetic resonance might be useful for risk stratification in HFrEF, irrespective of etiology, presence of late gadolinium enhancement, or LVEF.
Abstract:
Sepsis is a major challenge in medicine. It is a common and frequently fatal infectious condition. Its incidence continues to increase, with unacceptably high mortality rates, despite the use of specific antibiotics, aggressive operative intervention, nutritional support, and anti-inflammatory therapies. Typically, septic patients exhibit a high degree of heterogeneity due to variables such as age, weight, gender, the presence of secondary disease, the state of the immune system, and the severity of the infection. There is an urgent need for biomarkers and reliable measurements that can be applied to the risk stratification of septic patients and that would easily identify those at the highest risk of a poor outcome. Such markers would be of fundamental importance for decision-making in early intervention therapy and for the design of sepsis clinical trials. In the present work, we review current biomarkers of sepsis severity, especially the use of cytokines as biomarkers with an important pathophysiological role.
Abstract:
Chagas heart disease (CHD) results from infection with the protozoan parasite Trypanosoma cruzi and is the leading cause of infectious myocarditis worldwide. It poses a substantial public health burden due to high morbidity and mortality. CHD is also the most serious and frequent manifestation of chronic Chagas disease and appears in 20-40% of infected individuals between 10-30 years after the original acute infection. In recent decades, numerous clinical and experimental investigations have shown that a low-grade but incessant parasitism, along with an accompanying immunological response [either parasite-driven (most likely) or autoimmune-mediated], plays an important role in producing myocardial damage in CHD. At the same time, primary neuronal damage and microvascular dysfunction have been described as ancillary pathogenic mechanisms. Conduction system disturbances, atrial and ventricular arrhythmias, congestive heart failure, systemic and pulmonary thromboembolism and sudden cardiac death are the most common clinical manifestations of chronic Chagas cardiomyopathy. Management of CHD aims to relieve symptoms, identify markers of unfavourable prognosis and treat those individuals at increased risk of disease progression or death. This article reviews the pathophysiology of myocardial damage, discusses the value of current risk stratification models and proposes an algorithm to guide mortality risk assessment and therapeutic decision-making in patients with CHD.
Abstract:
This study investigated the rate of human papillomavirus (HPV) persistence, associated risk factors, and predictors of cytological alteration outcomes in a cohort of human immunodeficiency virus-infected pregnant women over an 18-month period. HPV was typed through L1 gene sequencing in cervical smears collected during gestation and at 12 months after delivery. Outcomes were defined as non-persistence (clearance of HPV in the second sample), re-infection (detection of different HPV types in the two samples), and type-specific HPV persistence (the same HPV type found in both samples). An unfavourable cytological outcome was considered when the second exam showed progression to squamous intraepithelial lesion or high-grade squamous intraepithelial lesion. Ninety patients were studied. HPV DNA persistence occurred in 50% of the cases, comprising type-specific persistence (30%) or re-infection (20%). A low CD4+ T-cell count at entry was a risk factor for type-specific, re-infection, or HPV DNA persistence. The odds ratio (OR) was almost three times higher in the type-specific group than in the re-infection group (OR = 2.8; 95% confidence interval: 0.43-22.79). Our findings show that bona fide (type-specific) HPV persistence is a stronger predictor of the development of cytological abnormalities, highlighting the need for HPV typing, as opposed to HPV DNA testing alone, in the clinical setting.
Abstract:
The State of Santa Catarina, Brazil, has agricultural and livestock activities, such as pig farming, that are responsible for adding large amounts of phosphorus (P) to soils. However, a method is required to evaluate the environmental risk of these high soil P levels. One possible method for evaluating the environmental risk of P fertilization, whether organic or mineral, is to establish threshold levels of soil available P, measured by Mehlich-1 extractions, below which there is not a high risk of P transfer from the soil to surface waters. However, the Mehlich-1 extractant is sensitive to soil clay content, and that factor should be considered when establishing such P-thresholds. The objective of this study was to determine P-thresholds using the Mehlich-1 extractant for soils with different clay contents in the State of Santa Catarina, Brazil. Soil from the B-horizon of an Oxisol with 800 g kg-1 clay was mixed with different amounts of sand to prepare artificial soils with 200, 400, 600, and 800 g kg-1 clay. The artificial soils were incubated for 30 days with moisture content at 80 % of field capacity to stabilize their physicochemical properties, followed by an additional 30-day incubation after liming to raise the pH(H2O) to 6.0. Soil P sorption curves were produced, and the maximum sorption (Pmax) was determined using the Langmuir model for each soil texture evaluated. Based on the Pmax values, seven rates of P were added to four replicates of each soil, which were then incubated for an additional 20 days. Following incubation, available P contents (P-Mehlich-1) and P dissolved in the soil solution (P-water) were determined. A change-point value (the P-Mehlich-1 value above which P-water starts increasing sharply) was calculated through the use of segmented equations. The maximum level of P that a soil might safely adsorb (P-threshold) was defined as 80 % of the change-point value to maintain a margin for environmental safety.
The P-threshold value, in mg dm-3, was dependent on the soil clay content according to the model P-threshold = 40 + Clay, where the soil clay content is expressed as a percentage. The model was tested in 82 diverse soil samples from the State of Santa Catarina and was able to distinguish samples with high and low environmental risk.
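As a minimal sketch, the reported model can be applied as a screening rule; the sample values below are illustrative, not from the study's data set:

```python
def p_threshold(clay_pct):
    """Environmental P-threshold (mg/dm3) of Mehlich-1 extractable P,
    from the reported model: P-threshold = 40 + clay content (%)."""
    return 40 + clay_pct

def high_risk(p_mehlich1, clay_pct):
    """True when measured Mehlich-1 P exceeds the clay-adjusted
    threshold, i.e., when P transfer to surface waters becomes likely."""
    return p_mehlich1 > p_threshold(clay_pct)

# Illustrative samples: a 40 % clay soil tolerates up to 80 mg/dm3
print(p_threshold(40))     # 80
print(high_risk(95, 40))   # True
print(high_risk(60, 60))   # False (threshold is 100 mg/dm3)
```

Because the threshold rises with clay content, the same Mehlich-1 value can flag a sandy soil as high risk while remaining safe in a clayey one.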
Abstract:
Objective: Longitudinal study with B-mode and Doppler ultrasonography of maternal kidneys and liver in low-risk pregnancy, to establish and quantify normality parameters, correlating them with physiological changes. Materials and Methods: Twenty-five pregnant women were selected and assessed, each undergoing four examinations: in the first, second, and third trimesters and postpartum. Results: Findings during pregnancy were the following: increased renal volume; pyelocaliceal dilatation with an incidence of 45.4% in the right kidney and 9% in the left kidney; and nephrolithiasis in 18.1% of right kidneys and 13.6% of left kidneys. With pyelocaliceal dilatation, mean resistivity index values were 0.68 for renal arteries, 0.66 for segmental arteries, 0.64 for interlobar arteries, and 0.64 for arcuate arteries; without pyelocaliceal dilatation, they were 0.67, 0.64, 0.63, and 0.61, respectively. Portal vein flow velocities were higher in pregnancy, with a mean maximum velocity of 28.9 cm/s, versus 22.6 cm/s postpartum. The waveform pattern of the right hepatic vein presented changes persisting into the postpartum period in 31.8% of the patients. Cholelithiasis was observed in 18.1% of the patients. Conclusion: Alterations in renal volume, pyelocaliceal dilatation, nephrolithiasis, cholelithiasis, changes in portal vein flow velocity, and alterations in the waveform pattern of the right hepatic vein proved to be significant.
Abstract:
Techniques for evaluating the risks arising from the uncertainties inherent in agricultural activity should accompany planning studies. Risk analysis should be carried out by simulation, using techniques such as the Monte Carlo method. This study was carried out to develop a computer program, called P-RISCO, for applying risk simulations to linear programming models, to apply it to a case study, and to compare the results with those of the @RISK program. In the risk analysis, it was observed that the mean of the output variable, total net present value (U), was considerably lower than the maximum U value obtained from the linear programming model. It was also verified that the enterprise faces a considerable risk of water shortage in April, which does not occur for the cropping pattern obtained by minimizing the irrigation requirement in April over the four years. The scenario analysis indicated that the sale price of the passion fruit crop exerts a strong influence on the financial performance of the enterprise. The comparative analysis verified the equivalence of the P-RISCO and @RISK programs in executing the risk simulation for the scenario considered.
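A risk simulation of the kind performed by P-RISCO and @RISK can be sketched in a few lines: sample the uncertain inputs, propagate them through a cash-flow model, and summarize the resulting distribution of net present value. All distributions and parameter values below are illustrative assumptions, not figures from the study:

```python
import random
import statistics

def simulate_npv(n_sims=10_000, seed=42):
    """Monte Carlo sketch: sample an uncertain sale price and yield,
    discount a 4-year cash flow, and summarize the NPV distribution.
    Each simulation draws one price/yield pair applied to all years."""
    random.seed(seed)
    rate = 0.10            # annual discount rate (assumed)
    cost = 50_000.0        # annual production cost (assumed)
    npvs = []
    for _ in range(n_sims):
        price = random.triangular(0.8, 1.6, 1.1)  # sale price per kg
        yield_kg = random.gauss(60_000, 8_000)    # annual yield, kg
        cash = price * yield_kg - cost            # annual net cash flow
        npv = sum(cash / (1 + rate) ** t for t in range(1, 5))
        npvs.append(npv)
    return statistics.mean(npvs), statistics.stdev(npvs)

mean_npv, sd_npv = simulate_npv()
print(f"mean NPV = {mean_npv:,.0f}, sd = {sd_npv:,.0f}")
```

The gap between the simulated mean NPV and the deterministic optimum from the linear programming model is precisely the effect the study describes: the optimum ignores the downside of the input distributions.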
Abstract:
A study was conducted to evaluate the predictive diagnostic value of different copper (Cu) parameters as indicators of average daily gain (ADG) in growing calves. The effects in calves of Cu supplementation of the cows during the last third of gestation were also evaluated. Five supplementation trials, with a total of 300 calves, were carried out. Two groups of 30 calves were randomly assigned in each trial: one group was parenterally supplemented (SG) and the other was not supplemented (NSG). Trials began when calves were three months old and ended at weaning. At each sampling, calves were weighed and blood was taken to determine Cu concentrations in plasma, whole blood (WB), and red cells (RC), as well as the packed cell volume (PCV). Liver samples from six animals of each group were taken at both the beginning and the end of each trial. In two trials, the mothers of the SG received Cu supplementation during the last third of gestation. Four of the five trials exhibited low ADG in the NSGs. In these groups, plasma Cu concentration decreased rapidly before low ADG was detected, which occurred when values remained below 25 µg/dL. The decrease in RC Cu concentration was considerably slower, with WB in an intermediate position. PCV in the SGs was higher than in the NSGs in all trials. Cow supplementation was insufficient to generate liver storage able to last beyond 3 months of age in the calves. These data could be useful for predicting the risk of low ADG in grazing calves.
Abstract:
Increased proteinuria is recognized as a risk predictor for all-cause and cardiovascular mortality in diabetic patients; however, no study has evaluated these relationships in Brazilian patients. The aim of this study was to investigate the prognostic value of gross proteinuria for all-cause and cardiovascular mortality and for cardiovascular morbidity in a cohort of 471 type 2 diabetic individuals followed for up to 7 years. Several clinical, laboratory, and electrocardiographic variables were obtained at baseline. The relative risks for all-cause, cardiovascular, and cardiac mortality, and for cardiovascular and cardiac events, associated with the presence of overt proteinuria (>0.5 g/24 h) were assessed by Kaplan-Meier survival curves and by a multivariate Cox regression model. During a median follow-up of 57 months (range 2-84 months), 121 patients (25.7%) died, 44 from cardiovascular and 30 from cardiac causes, and 106 fatal or non-fatal cardiovascular events occurred. Gross proteinuria was an independent risk predictor of all-cause, cardiovascular, and cardiac mortality and of cardiovascular morbidity, with adjusted relative risks ranging from 1.96 to 4.38 for the different endpoints. This increased risk remained significant after exclusion of patients with prior cardiovascular disease at baseline from the multivariate analysis. In conclusion, gross proteinuria was a strong predictor of all-cause, cardiovascular, and cardiac mortality and also of cardiovascular morbidity in a Brazilian cohort of type 2 diabetic patients. Intervention studies are necessary to determine whether the reduction of proteinuria can decrease the morbidity and mortality of type 2 diabetes in Brazil.
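The Kaplan-Meier estimator used above can be sketched in plain Python: at each death time, multiply the running survival probability by the fraction of at-risk patients who survive it, while censored patients simply leave the risk set. The follow-up times below are hypothetical, chosen only to span the study's 2-84 month range:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate. times: follow-up in months;
    events: 1 = death observed, 0 = censored. Returns (time, S(t))
    at each time where at least one death occurred."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, steps = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        group = [e for tt, e in data if tt == t]  # subjects at time t
        deaths = sum(group)
        if deaths:
            surv *= 1 - deaths / n_at_risk
            steps.append((t, surv))
        n_at_risk -= len(group)  # deaths and censored both leave
        i += len(group)
    return steps

# Hypothetical follow-up (months) for six patients; 0 marks censoring
for t, s in kaplan_meier([6, 12, 12, 30, 57, 84], [1, 1, 0, 1, 0, 0]):
    print(f"S({t}) = {s:.3f}")
```

Comparing such curves between the proteinuria groups, and then adjusting for baseline covariates with a Cox model, is the standard two-step analysis the study describes.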