932 results for Predictive factors of hospitalization
Abstract:
This study assesses the factors of artificial environments (houses and peridomestic areas) associated with Triatoma sordida occurrence. Manual searches for triatomines were performed in 136 domiciliary units (DUs) in two rural localities of Central-West Brazil. For each DU, 32 structural, 23 biotic and 28 management variables were recorded. Multiple logistic regression analysis was performed to identify statistically significant variables associated with the occurrence of T. sordida in the study areas. A total of 1,057 specimens of T. sordida (99% in peridomiciles, mainly chicken coops) were collected from 63 DUs (infestation: 47%; density: ~8 specimens/DU; crowding: ~17 specimens/infested DU; colonisation: 81%). Only six (0.6%) of the 945 specimens examined were infected with Trypanosoma cruzi. The final adjusted logistic regression model indicated that the probability of T. sordida occurrence was higher in DUs with wooden chicken coops, more than 30 animals in wooden corrals, wood piles, and a food storeroom. The results show the persistence of T. sordida in peridomestic habitats in rural localities of Central-West Brazil. However, the low intradomestic colonisation and minimal triatomine infection rates observed indicate that T. sordida has low potential to sustain high rates of T. cruzi transmission to residents of these localities.
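The entomological indices quoted above follow directly from the counts reported (136 DUs searched, 63 infested, 1,057 specimens). A minimal arithmetic sketch that recomputes them; variable names are illustrative:

```python
# Entomological indices for Triatoma sordida, recomputed from the counts in the abstract.
dus_searched = 136
dus_infested = 63
specimens = 1057

infestation = 100 * dus_infested / dus_searched   # % of DUs with triatomines
density = specimens / dus_searched                 # specimens per DU searched
crowding = specimens / dus_infested                # specimens per infested DU

print(f"infestation: {infestation:.0f}%")  # ~46%; the abstract reports 47%
print(f"density:     {density:.1f}")       # ~7.8; reported as ~8
print(f"crowding:    {crowding:.1f}")      # ~16.8; reported as ~17
```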
Abstract:
BACKGROUND: The prognosis of status epilepticus (SE) depends on its cause, but it is uncertain whether SE is an independent outcome predictor for a given etiology. Cerebral anoxia is a relatively homogeneous severe encephalopathy. Postanoxic SE is associated with nearly 100% mortality in this setting; however, it is still unclear whether this is a severity marker of the underlying encephalopathy or an independent factor influencing outcome. The goal of this study was to assess whether postanoxic SE is independently associated with mortality after cerebral anoxia. METHODS: This was a retrospective observational study of consecutive comatose survivors of cardiac arrest, including subjects treated with hypothermia. In the subgroup with EEG recordings during the first days of hospitalization, univariate and multivariate analyses were applied to potential determinants of in-hospital mortality, including the following variables: age, gender, type and duration of cardiac arrest, occurrence of circulatory shock, use of therapeutic hypothermia, and electrographic SE. RESULTS: Of 166 postanoxic patients, 107 (64%) had an EEG (median latency from admission, 2 days); in this group, therapeutic hypothermia was administered in 59%. Death occurred in 71 (67%) patients. Postanoxic SE was associated with mortality regardless of the type of acute cardiac rhythm and the administration of hypothermic treatment. CONCLUSION: In this hospital-based cohort, postanoxic status epilepticus (SE) appears to be independently related to death in cardiac arrest survivors, suggesting that SE may confer a poor prognosis for a given etiology. Confirmation of these results in a prospective assessment is needed.
Abstract:
Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators that evaluate methods for global gene expression analysis.
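A minimal sketch of the MAQC-II-style workflow of building a classifier on training data and assessing it on an untouched external set. The data are synthetic, and the choice of logistic regression and of the Matthews correlation coefficient as the metric are illustrative assumptions, not the teams' actual pipelines:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import matthews_corrcoef

rng = np.random.default_rng(0)

# Synthetic "microarray" data: 200 training and 100 external-validation samples,
# 500 genes, binary endpoint loosely driven by the first 10 genes.
def make_set(n):
    X = rng.normal(size=(n, 500))
    y = (X[:, :10].sum(axis=1) + rng.normal(size=n)) > 0
    return X, y.astype(int)

X_train, y_train = make_set(200)
X_valid, y_valid = make_set(100)   # never touched during model building

model = LogisticRegression(penalty="l2", C=0.1, max_iter=1000)
model.fit(X_train, y_train)

# Evaluate only on the held-out set, mimicking blinded external validation.
print("MCC on external set:", matthews_corrcoef(y_valid, model.predict(X_valid)))
```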
Abstract:
BACKGROUND & AIMS Hy's Law, which states that hepatocellular drug-induced liver injury (DILI) with jaundice indicates a serious reaction, is used widely to determine risk for acute liver failure (ALF). We aimed to optimize the definition of Hy's Law and to develop a model for predicting ALF in patients with DILI. METHODS We collected data from 771 patients with DILI (805 episodes) from the Spanish DILI registry, from April 1994 through August 2012. We analyzed data collected at DILI recognition and at the time of peak levels of alanine aminotransferase (ALT) and total bilirubin (TBL). RESULTS Of the 771 patients with DILI, 32 developed ALF. Hepatocellular injury, female sex, high levels of TBL, and a high ratio of aspartate aminotransferase (AST):ALT were independent risk factors for ALF. We compared 3 ways to use Hy's Law to predict which patients would develop ALF; all included TBL greater than 2-fold the upper limit of normal (×ULN) and either ALT level greater than 3 × ULN, a ratio (R) value (ALT × ULN/alkaline phosphatase × ULN) of 5 or greater, or a new ratio (nR) value (ALT or AST, whichever produced the highest ×ULN/ alkaline phosphatase × ULN value) of 5 or greater. At recognition of DILI, the R- and nR-based models identified patients who developed ALF with 67% and 63% specificity, respectively, whereas use of only ALT level identified them with 44% specificity. However, the level of ALT and the nR model each identified patients who developed ALF with 90% sensitivity, whereas the R criteria identified them with 83% sensitivity. An equal number of patients who did and did not develop ALF had alkaline phosphatase levels greater than 2 × ULN. An algorithm based on AST level greater than 17.3 × ULN, TBL greater than 6.6 × ULN, and AST:ALT greater than 1.5 identified patients who developed ALF with 82% specificity and 80% sensitivity. CONCLUSIONS When applied at DILI recognition, the nR criteria for Hy's Law provides the best balance of sensitivity and specificity whereas our new composite algorithm provides additional specificity in predicting the ultimate development of ALF.
Abstract:
Background Coronary microvascular dysfunction (CMD) is associated with cardiovascular events in type 2 diabetes mellitus (T2DM). Optimal glycaemic control does not always preclude future events. We sought to assess the effect of the current HbA1c target on coronary microcirculatory function and to identify predictive factors for CMD in T2DM patients. Methods We studied 100 patients with T2DM and 214 patients without T2DM, all with a history of chest pain, non-obstructive angiograms and a direct assessment of the coronary blood flow increase in response to adenosine and acetylcholine coronary infusion, for evaluation of endothelium-independent and endothelium-dependent CMD. Patients with T2DM were categorized as having optimal (HbA1c < 7%) vs. suboptimal (HbA1c ≥ 7%) glycaemic control at the time of catheterization. Results Baseline characteristics and coronary endothelial function parameters differed significantly between T2DM patients and the control group. The prevalence of endothelium-independent CMD (29.8 vs. 39.6%, p = 0.40) and endothelium-dependent CMD (61.7 vs. 62.2%, p = 1.00) was similar in patients with optimal vs. suboptimal glycaemic control. Age (OR 1.10; 95% CI 1.04-1.18; p < 0.001) and female gender (OR 3.87; 95% CI 1.45-11.4; p < 0.01) were significantly associated with endothelium-independent CMD, whereas glomerular filtration rate (OR 0.97; 95% CI 0.95-0.99; p < 0.05) was significantly associated with endothelium-dependent CMD. Optimal glycaemic control was not associated with endothelium-independent (OR 0.60; 95% CI 0.23-1.46; p = 0.26) or endothelium-dependent CMD (OR 0.99; 95% CI 0.43-2.24; p = 0.98). Conclusions The current HbA1c target does not predict better coronary microcirculatory function in T2DM patients. The appropriate strategy for prevention of CMD in T2DM patients remains to be addressed. Keywords: Endothelial dysfunction; Diabetes mellitus; Coronary microcirculation
Abstract:
BACKGROUND & AIMS: A fast-track program is a multimodal approach for patients undergoing colonic surgery that combines stringent regimens of perioperative care (fluid restriction, optimized analgesia, forced mobilization, and early oral feeding) to reduce perioperative morbidity, hospital stay, and cost. We investigated the impact of a fast-track protocol on postoperative morbidity in patients after open colonic surgery. METHODS: A randomized trial of patients in 4 teaching hospitals in Switzerland included 156 patients undergoing elective open colonic surgery who were assigned to either a fast-track program or standard care. The primary end point was the 30-day complication rate. Secondary end points were severity of complications, hospital stay, and compliance with the fast-track protocol. RESULTS: The fast-track protocol significantly decreased the number of complications (16 of 76 in the fast-track group vs 37 of 75 in the standard care group; P = .0014), resulting in shorter hospital stays (median, 5 days; range, 2-30 vs 9 days, respectively; range, 6-30; P < .0001). There was a trend toward less severe complications in the fast-track group. A multiple logistic regression analysis revealed fluid administration greater than the restriction limits (odds ratio, 4.198; 95% confidence interval, 1.7-10.366; P = .002) and a nonfunctioning epidural analgesia (odds ratio, 3.365; 95% confidence interval, 1.367-8.283; P = .008) as independent predictors of postoperative complications. CONCLUSIONS: The fast-track program reduces the rate of postoperative complications and length of hospital stay and should be considered as standard care. Fluid restriction and an effective epidural analgesia are the key factors that determine outcome of the fast-track program.
Abstract:
Percutaneous transluminal renal angioplasty (PTRA) is an invasive technique that is costly and involves the risk of complications and renal failure. The ability of PTRA to reduce the need for antihypertensive drugs has been demonstrated. A potentially greater benefit, which nevertheless remains to be proven, is deferral of the need for chronic dialysis. The aim of the ANPARIA study was to assess the appropriateness of PTRA with respect to its impact on the evolution of renal function. A standardized expert panel method was used to assess the appropriateness of medical treatment alone or medical treatment with revascularization in various clinical situations. The choice of revascularization by either PTRA or surgery was examined for each clinical situation. The analysis was based on a detailed literature review and on systematically elicited expert opinion, obtained during a two-round modified Delphi process. The study provides detailed responses on the appropriateness of PTRA for 1848 distinct clinical scenarios. Depending on the major clinical presentation, the appropriateness of revascularization varied from 32% to 75% of individual scenarios (overall 48%); revascularization was rated uncertain in 41% of scenarios overall. When revascularization was appropriate, PTRA was favored over surgery in 94% of the scenarios, except in certain cases of aortic atheroma where surgery was the preferred choice. Kidney size > 7 cm, absence of coexisting disease, acute renal failure, a high degree of stenosis (≥ 70%), and absence of multiple arteries were identified as predictive of favorable appropriateness ratings. Situations such as cardiac failure with pulmonary edema or acute thrombosis of the renal artery were defined as indications for PTRA. This study identified clinical situations in which PTRA or surgery is appropriate for renal artery disease. We built a decision tree which can be used via the Internet: the ANPARIA software (http://www.chu-clermontferrand.fr/anparia/). In numerous clinical situations, uncertainty remains as to whether PTRA prevents deterioration of renal function.
Abstract:
BACKGROUND: The Pulmonary Embolism Severity Index (PESI) estimates the risk of 30-day mortality in patients with acute pulmonary embolism (PE). We constructed a simplified version of the PESI. METHODS: The study retrospectively developed a simplified PESI clinical prediction rule for estimating the risk of 30-day mortality in a derivation cohort of Spanish outpatients. Simplified and original PESI performances were compared in the derivation cohort. The simplified PESI underwent retrospective external validation in an independent multinational cohort (Registro Informatizado de la Enfermedad Tromboembólica [RIETE] cohort) of outpatients. RESULTS: In the derivation data set, univariate logistic regression of the original 11 PESI variables led to the removal of variables that did not reach statistical significance and subsequently produced the simplified PESI that contained the variables of age, cancer, chronic cardiopulmonary disease, heart rate, systolic blood pressure, and oxyhemoglobin saturation levels. The prognostic accuracy of the original and simplified PESI scores did not differ (area under the curve, 0.75 [95% confidence interval (CI), 0.69-0.80]). The 305 of 995 patients (30.7%) who were classified as low risk by the simplified PESI had a 30-day mortality of 1.0% (95% CI, 0.0%-2.1%) compared with 10.9% (8.5%-13.2%) in the high-risk group. In the RIETE validation cohort, 2569 of 7106 patients (36.2%) who were classified as low risk by the simplified PESI had a 30-day mortality of 1.1% (95% CI, 0.7%-1.5%) compared with 8.9% (8.1%-9.8%) in the high-risk group. CONCLUSION: The simplified PESI has similar prognostic accuracy and clinical utility and greater ease of use compared with the original PESI.
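The simplified PESI can be implemented as a one-point-per-item score, with any point placing the patient in the high-risk group. A minimal sketch; the abstract names the six variables but not their cutoffs, so the thresholds below (age > 80 years, heart rate ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%) are the commonly cited ones from the published rule and are assumptions here:

```python
def simplified_pesi(age, has_cancer, chronic_cardiopulm, heart_rate, sbp, o2_sat):
    """Return (score, risk class) for the simplified PESI.

    One point each for: age > 80 y, history of cancer, chronic cardiopulmonary
    disease, heart rate >= 110/min, systolic BP < 100 mm Hg, O2 saturation < 90%.
    Cutoff values are commonly cited ones and should be checked against the source.
    """
    score = sum([
        age > 80,
        bool(has_cancer),
        bool(chronic_cardiopulm),
        heart_rate >= 110,
        sbp < 100,
        o2_sat < 90,
    ])
    return score, ("low risk" if score == 0 else "high risk")

# Example: 72-year-old with no comorbidity and normal vital signs -> low risk.
print(simplified_pesi(72, False, False, 88, 125, 96))
```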
Abstract:
Osteoporotic hip fractures increase dramatically with age and are responsible for considerable morbidity and mortality. Several treatments to prevent the occurrence of hip fracture have been validated in large randomized trials, and the current challenge is to improve the identification of individuals at high risk of fracture who would benefit from therapeutic or preventive intervention. We have performed an exhaustive literature review on hip fracture predictors, focusing primarily on clinical risk factors, dual X-ray absorptiometry (DXA), quantitative ultrasound, and bone markers. This review is based on original articles and meta-analyses. We have selected studies that aim both to predict the risk of hip fracture and to discriminate individuals with or without fracture. We have included only postmenopausal women in our review; for studies involving both men and women, only results concerning women have been considered. Regarding clinical factors, only prospective studies have been taken into account. Predictive factors have been used as stand-alone tools to predict hip fracture, sequentially through successive selection processes, or in combination as risk scores. There is still much debate as to whether the combination of these various parameters, as risk scores or as sequential or concurrent combinations, could help to better predict hip fracture, and there are conflicting results on whether such combinations provide improvement over each method alone. Sequential combination of bone mineral density and ultrasound parameters might be cost-effective compared with DXA alone, because fewer bone mineral density measurements are required; however, use of multiple techniques may increase costs. One problem that precludes comparison of most published studies is that they report either relative risk, absolute risk, or sensitivity and specificity. The absolute risk of individuals, given their risk factors and bone assessment results, would be a more appropriate basis for decision-making than relative risk. Currently, a group appointed by the World Health Organization and led by Professor John Kanis is working on such a model. It will then be possible to further assess the best choice of threshold to optimize the number of women needed to screen for each country and each treatment.
Abstract:
BACKGROUND: Guidelines for the prevention of coronary heart disease (CHD) recommend use of Framingham-based risk scores that were developed in white middle-aged populations. It remains unclear whether and how CHD risk prediction might be improved among older adults. We aimed to compare the prognostic performance of the Framingham risk score (FRS), directly and after recalibration, with refit functions derived from the present cohort, and to assess the utility of adding other routinely available risk parameters to the FRS. METHODS: Among 2193 black and white older adults (mean age, 73.5 years) without pre-existing cardiovascular disease from the Health ABC cohort, we examined adjudicated CHD events, defined as incident myocardial infarction, CHD death, and hospitalization for angina or coronary revascularization. RESULTS: During 8-year follow-up, 351 participants experienced CHD events. The FRS discriminated poorly between persons who did and did not experience CHD events (C-index: 0.577 in women; 0.583 in men) and underestimated absolute risk by 51% in women and 8% in men. Recalibration of the FRS improved absolute risk prediction, particularly for women. For both genders, refitting the functions substantially improved absolute risk prediction, with discrimination similar to the FRS. Results did not differ between whites and blacks. The addition of lifestyle variables, waist circumference, and creatinine did not improve risk prediction beyond the risk factors of the FRS. CONCLUSIONS: The FRS underestimates CHD risk in older adults, particularly in women, although traditional risk factors remain the best predictors of CHD. Re-estimated risk functions using these factors improve the accuracy of absolute risk estimation.
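Recalibration of a Cox-type risk function is typically done by keeping the published coefficients but replacing the baseline survival and mean covariate values with those observed in the new cohort. A minimal sketch of that idea; the coefficients, means, and baseline survival below are placeholders, not the Framingham or Health ABC values:

```python
import numpy as np

def cox_risk(x, beta, x_mean, s0):
    """10-year event risk from a Cox-type score: 1 - S0 ** exp(beta . (x - x_mean))."""
    lp = np.dot(beta, np.asarray(x, dtype=float) - np.asarray(x_mean, dtype=float))
    return 1.0 - s0 ** np.exp(lp)

beta = np.array([0.05, 0.02, 0.01, 0.6])   # "published" coefficients (placeholders)

# Original calibration inputs vs. values re-estimated in the local older cohort:
# covariate means and 10-year baseline survival.
x_mean_orig, s0_orig = np.array([50, 130, 200, 0.2]), 0.95
x_mean_local, s0_local = np.array([73, 135, 190, 0.1]), 0.88

patient = [75, 150, 220, 1]                 # age, SBP, cholesterol, smoker (illustrative)

print("original:    ", round(cox_risk(patient, beta, x_mean_orig, s0_orig), 3))
print("recalibrated:", round(cox_risk(patient, beta, x_mean_local, s0_local), 3))
```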
Abstract:
ECG criteria for left ventricular hypertrophy (LVH) have been elaborated and calibrated almost exclusively in white populations. Because several interethnic differences in ECG characteristics have been found, the applicability of these criteria to African individuals remains to be demonstrated. We therefore investigated the performance of classic ECG criteria for LVH detection in an African population. Digitized 12-lead ECG tracings were obtained from 334 African individuals randomly selected from the general population of the Republic of Seychelles (Indian Ocean). Left ventricular mass (LVM) was calculated with M-mode echocardiography and indexed to body height. LVH was defined by taking the 95th percentile of body height-indexed LVM values in a reference subgroup. In the entire study sample, 16 men and 15 women (prevalence 9.3%) were finally declared to have LVH, of whom 9 belonged to the reference subgroup. Sensitivity, specificity, accuracy, and positive and negative predictive values for LVH were calculated for 9 classic ECG criteria, and receiver operating characteristic curves were computed. We also generated a new composite time-voltage criterion with stepwise multiple linear regression: weighted time-voltage criterion = (0.2366·R(aVL) + 0.0551·R(V5) + 0.0785·S(V3) + 0.2993·T(V1)) × QRS duration. The Sokolow-Lyon criterion reached the highest sensitivity (61%) and the R(aVL) voltage criterion the highest specificity (97%) when evaluated at their traditional partition values. However, at a fixed specificity of 95%, the sensitivity of these 10 criteria ranged from 16% to 32%. The best accuracy was obtained with the R(aVL) voltage criterion and the new composite time-voltage criterion (89% for both). Positive and negative predictive values varied considerably depending on the concomitant presence of 3 clinical risk factors for LVH (hypertension, age ≥ 50 years, overweight). Median positive and negative predictive values of the 10 ECG criteria were 15% and 95%, respectively, for subjects with none or 1 of these risk factors, compared with 63% and 76% for subjects with all of them. In conclusion, the performance of classic ECG criteria for LVH detection was largely disparate and appeared to be lower in this population of East African origin than in white subjects. A newly generated composite time-voltage criterion might provide improved performance. The predictive value of ECG criteria for LVH was considerably enhanced by integrating information on concomitant clinical risk factors for LVH.
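The new composite criterion is a weighted sum of four wave amplitudes multiplied by QRS duration. A minimal sketch of how it might be computed; the units (amplitudes in mV, QRS duration in ms) are an assumption, and the well-known Sokolow-Lyon index is included only for comparison:

```python
def weighted_time_voltage(r_avl, r_v5, s_v3, t_v1, qrs_ms):
    """Composite time-voltage criterion from the abstract.

    Amplitudes in mV, QRS duration in ms (the unit choice is an assumption).
    """
    return (0.2366 * r_avl + 0.0551 * r_v5 + 0.0785 * s_v3 + 0.2993 * t_v1) * qrs_ms

def sokolow_lyon(s_v1, r_v5, r_v6):
    """Classic Sokolow-Lyon voltage index: S in V1 plus the larger R in V5/V6 (mV)."""
    return s_v1 + max(r_v5, r_v6)

# Illustrative measurements from one tracing (values are made up).
print(weighted_time_voltage(r_avl=0.9, r_v5=1.8, s_v3=1.2, t_v1=0.3, qrs_ms=96))
print(sokolow_lyon(s_v1=1.4, r_v5=2.1, r_v6=1.8))   # >= 3.5 mV is the traditional cutoff
```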
Abstract:
PURPOSE: Intraoperative quality assessment of the arteriovenous fistula for hemodialysis is an essential process to limit early failure due to technical problems or inadequate vascular quality. This step is not clearly defined in the literature, and no recommendations exist. METHODS: We selected published articles related to the topic of intraoperative quality control of vascular access for hemodialysis. RESULTS: Intraoperative blood flow measurements, greater than 120 ml/min in autologous fistulas and less than 320 ml/min in arteriovenous grafts, have been described as predictive factors for early failure. CONCLUSIONS: Blood flow measurement should be performed after construction of the anastomosis. When blood flow is limited, fistulography is an essential step to assess patency.
Abstract:
BACKGROUND & AIMS: The host immune response during the chronic phase of hepatitis C virus infection varies among individuals; some patients have a no interferon (IFN) response in the liver, whereas others have full activation IFN-stimulated genes (ISGs). Preactivation of this endogenous IFN system is associated with nonresponse to pegylated IFN-α (pegIFN-α) and ribavirin. Genome-wide association studies have associated allelic variants near the IL28B (IFNλ3) gene with treatment response. We investigated whether IL28B genotype determines the constitutive expression of ISGs in the liver and compared the abilities of ISG levels and IL28B genotype to predict treatment outcome. METHODS: We genotyped 109 patients with chronic hepatitis C for IL28B allelic variants and quantified the hepatic expression of ISGs and of IL28B. Decision tree ensembles, in the form of a random forest classifier, were used to calculate the relative predictive power of these different variables in a multivariate analysis. RESULTS: The minor IL28B allele was significantly associated with increased expression of ISG. However, stratification of the patients according to treatment response revealed increased ISG expression in nonresponders, irrespective of IL28B genotype. Multivariate analysis of ISG expression, IL28B genotype, and several other factors associated with response to therapy identified ISG expression as the best predictor of treatment response. CONCLUSIONS: IL28B genotype and hepatic expression of ISGs are independent predictors of response to treatment with pegIFN-α and ribavirin in patients with chronic hepatitis C. The most accurate prediction of response was obtained with a 4-gene classifier comprising IFI27, ISG15, RSAD2, and HTATIP2.
Abstract:
Background: Age is frequently discussed as a negative host factor for achieving a sustained virological response (SVR) to antiviral hepatitis C therapy. However, elderly patients often show relevant fibrosis or cirrhosis, which is a known negative predictive factor, making it difficult to interpret age as an independent predictive factor. Methods: Within the framework of the Swiss hepatitis C cohort (SCCS), we collected data from 545 antiviral hepatitis C therapies, including data from 67 hepatitis C patients ≥ 60 y who had been treated with PEG-interferon and ribavirin. We analyzed host factors (age, gender, fibrosis, haemoglobin, depression, earlier hepatitis C treatment), viral factors (genotype, viral load) and treatment course (early virological response, end-of-treatment response, SVR). Generalised estimating equations (GEE) regression modelling was used for the primary end point (SVR), with age ≥ 60 y vs. < 60 y as the independent variable and gender, presence of cirrhosis, genotype, earlier treatment and viral load as confounders. SVR was analysed in young and elderly patients after matching for these confounders. Additionally, classification tree analysis was performed in elderly patients using these confounders. Results: Overall SVR in the 545 patients was 55%. In genotype 1/4, SVR was 42.9% in 259 patients < 60 y and 26.1% in 46 patients ≥ 60 y. In genotype 2/3, SVR was 74.4% in 215 patients < 60 y and 84% in 25 patients ≥ 60 y. However, the GEE model showed that age had no influence on achieving SVR (odds ratio 0.91). Confounders influenced SVR as known from previous studies (cirrhosis, genotype 1/4, previous treatment and viral load > 600'000 IU/ml as negative predictive factors). When young and elderly patients were matched (analysis in 59 elderly patients), SVR did not differ between these patient groups (54.2% and 55.9%, respectively; p = 0.795, binomial test). The classification tree-derived best criterion for SVR in elderly patients was genotype, with no further criteria relevant for predicting SVR in genotype 2/3. In patients with genotype 1/4, further criteria were the presence of cirrhosis and a low viral load < 600'000 IU/ml in non-cirrhotic patients. Conclusions: Age is not a relevant predictive factor for achieving SVR once confounders are taken into account. In terms of the effectiveness of antiviral therapy, age does not play a major role and should not be regarded as a relevant negative predictive factor. Since life expectancy in Switzerland at age 60 is more than 22 y, hepatitis C therapy is reasonable in elderly patients with known relevant fibrosis or cirrhosis, because interferon-based hepatitis C therapy improves survival and reduces carcinogenesis.
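A minimal sketch of a GEE logistic model of SVR with the confounders named above. The data are synthetic, and the variable coding, the exchangeable working correlation, and grouping by patient are assumptions, not the SCCS analysis itself:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 545

df = pd.DataFrame({
    "patient_id": rng.integers(0, 450, n),     # some patients contribute >1 therapy
    "age_60_plus": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "cirrhosis": rng.integers(0, 2, n),
    "genotype_1_4": rng.integers(0, 2, n),
    "prior_treatment": rng.integers(0, 2, n),
    "high_viral_load": rng.integers(0, 2, n),  # > 600'000 IU/ml
})
lp = 0.8 - 1.0 * df["genotype_1_4"] - 0.7 * df["cirrhosis"] - 0.5 * df["high_viral_load"]
df["svr"] = (rng.random(n) < 1 / (1 + np.exp(-lp))).astype(int)

model = smf.gee(
    "svr ~ age_60_plus + male + cirrhosis + genotype_1_4 + prior_treatment + high_viral_load",
    groups="patient_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
res = model.fit()
print(np.exp(res.params))   # odds ratios for SVR
```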
Abstract:
Aging adults represent the fastest growing population segment in many countries. Physiological and metabolic changes during aging may alter how aging adults respond biologically to pollutants. In a controlled human toxicokinetic study (exposure chamber; 12 m³), aging volunteers (n=10; >58 years) were exposed to propylene glycol monomethyl ether (PGME, CAS no. 107-98-2) at 50 ppm for 6 h. The dose-dependent renal excretion of oxidative metabolites and of conjugated and free PGME could potentially be altered by age. AIMS: (1) compare PGME toxicokinetic profiles between aging and young volunteers (20-25 years) and between genders; (2) test the predictive power of a compartmental toxicokinetic (TK) model developed for aging persons against the urinary PGME concentrations found in this study. METHODS: Urine samples were collected before, during, and after the exposure. Urinary PGME was quantified by capillary GC/FID. RESULTS: Differences in urinary PGME profiles were noted between age groups but not between genders. Metabolic parameters had to be changed to fit the age-adjusted TK model to the experimental results, implying a slower enzymatic pathway in the aging volunteers. For an appropriate exposure assessment, total urinary PGME should be quantified. CONCLUSION: Age is a factor that should be considered when biological limit values are developed.
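A minimal sketch of a one-compartment toxicokinetic model of the kind mentioned above: zero-order uptake during the 6-h inhalation exposure, first-order metabolism, and first-order urinary excretion of unchanged PGME. All parameter values and the model structure are placeholders for illustration, not the study's calibrated model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameters (illustrative only).
UPTAKE_RATE = 10.0   # mg/h absorbed during exposure
K_MET = 0.4          # 1/h, first-order metabolism
K_URINE = 0.05       # 1/h, first-order urinary excretion of unchanged PGME
EXPOSURE_H = 6.0

def tk_model(t, y):
    body, urine = y
    uptake = UPTAKE_RATE if t <= EXPOSURE_H else 0.0
    d_body = uptake - (K_MET + K_URINE) * body
    d_urine = K_URINE * body            # cumulative unchanged PGME in urine
    return [d_body, d_urine]

sol = solve_ivp(tk_model, (0.0, 24.0), [0.0, 0.0],
                t_eval=np.linspace(0, 24, 49), max_step=0.25)

for t, urine in zip(sol.t[::8], sol.y[1][::8]):
    print(f"t = {t:4.1f} h  cumulative urinary PGME ~ {urine:5.2f} mg")
```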