84 results for REGRESSION-MODEL
Abstract:
The human body is composed of a huge number of cells acting together in a concerted manner. The current understanding is that proteins perform most of the activities necessary to keep a cell alive. The DNA, on the other hand, stores in the genome the information on how to produce the different proteins. Regulating gene transcription is the first important step that can thus affect the life of a cell, modify its functions and its responses to the environment. Regulation is a complex operation that involves specialized proteins, the transcription factors. Transcription factors (TFs) can bind to DNA and activate the processes leading to the expression of genes into new proteins. Errors in this process may lead to diseases. In particular, some transcription factors have been associated with a lethal pathological state, commonly known as cancer, characterized by uncontrolled cellular proliferation, invasiveness of healthy tissues and abnormal responses to stimuli. Understanding cancer-related regulatory programs is a difficult task, often involving several TFs interacting together and influencing each other's activity. This thesis presents new computational methodologies to study gene regulation. In addition, we present applications of our methods to the understanding of cancer-related regulatory programs. The understanding of transcriptional regulation is a major challenge. We address this difficult question by combining computational approaches with large collections of heterogeneous experimental data. In detail, we design signal-processing tools to recover transcription factor binding sites on the DNA from genome-wide surveys such as chromatin immunoprecipitation assays on tiling arrays (ChIP-chip). We then use the localization of TF binding to explain the expression levels of regulated genes. In this way we identify a regulatory synergy between two TFs, the oncogene C-MYC and SP1. C-MYC and SP1 bind preferentially at promoters, and when SP1 binds next to C-MYC on the DNA, the nearby gene is strongly expressed. The association between the two TFs at promoters is reflected in the conservation of their binding sites across mammals and in the permissive underlying chromatin states; it represents an important control mechanism involved in cellular proliferation and, thereby, in cancer. Secondly, we identify the characteristics of the target genes of the TF estrogen receptor alpha (hERa) and we study the influence of hERa in regulating transcription. Upon estrogen signaling, hERa binds to DNA to regulate transcription of its targets in concert with its co-factors. To overcome the scarcity of experimental data about the binding sites of other TFs that may interact with hERa, we conduct in silico analysis of the sequences underlying the ChIP sites using the collection of position weight matrices (PWMs) of hERa partners, the TFs FOXA1 and SP1. We combine ChIP-chip and ChIP-paired-end-diTag (ChIP-PET) data about hERa binding on DNA with the sequence information to explain gene expression levels in a large collection of cancer tissue samples and in studies of the response of cells to estrogen. We confirm that hERa binding sites are distributed throughout the genome. However, we distinguish between binding sites near promoters and binding sites along the transcripts. The first group shows weak binding of hERa and a high occurrence of SP1 motifs, in particular near estrogen-responsive genes.
The second group shows strong binding of hERa and a significant correlation between the number of binding sites along a gene and the strength of gene induction in the presence of estrogen. Some binding sites of the second group also show the presence of FOXA1, but the role of this TF still needs to be investigated. Different mechanisms have been proposed to explain hERa-mediated induction of gene expression. Our work supports the model of hERa activating gene expression from distal binding sites by interacting with promoter-bound TFs, such as SP1. hERa has been associated with survival rates of breast cancer patients, though explanatory models are still incomplete; this result is important for better understanding how hERa can control gene expression. Thirdly, we address the difficult question of regulatory network inference. We tackle this problem by analyzing time series of biological measurements such as quantification of mRNA levels or protein concentrations. Our approach uses well-established penalized linear regression models in which we impose sparseness on the connectivity of the regulatory network. We extend this method by enforcing the coherence of the regulatory dependencies: a TF must behave coherently as an activator or as a repressor on all its targets. This requirement is implemented as constraints on the signs of the regressed coefficients in the penalized linear regression model. Our approach is better at reconstructing meaningful biological networks than previous methods based on penalized regression. The method is tested on the DREAM2 challenge of reconstructing a five-gene/TF regulatory network, obtaining the best performance in the "undirected signed excitatory" category. Thus, these bioinformatics methods, which are reliable, interpretable and fast enough to cover large biological datasets, have enabled us to better understand gene regulation in humans.
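To make the sign-coherence constraint concrete, here is a minimal Python sketch (not the thesis code): a penalized linear regression for one hypothetical target gene in which each TF's coefficient is forced to agree with an assumed activator/repressor sign. The data, the penalty weight `lam` and the `signs` vector are all invented for illustration; in the full network setting the same sign would be shared by a TF across all of its targets.

```python
# Minimal sketch, assuming synthetic data: penalized regression for one target
# gene with a sign constraint per TF (+1 = assumed activator, -1 = repressor).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_samples, n_tfs = 40, 5
X = rng.normal(size=(n_samples, n_tfs))               # TF activity time series
true_coef = np.array([1.5, 0.0, -0.8, 0.0, 0.6])
y = X @ true_coef + 0.1 * rng.normal(size=n_samples)  # target gene expression

signs = np.array([+1, +1, -1, -1, +1])   # assumed coherent role of each TF
beta = cp.Variable(n_tfs)
lam = 0.1                                # L1 penalty weight (sparsity)
objective = cp.Minimize(cp.sum_squares(X @ beta - y) + lam * cp.norm1(beta))
constraints = [cp.multiply(signs, beta) >= 0]         # sign coherence
cp.Problem(objective, constraints).solve()
print(np.round(beta.value, 3))           # sparse, sign-consistent coefficients
```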
Abstract:
Ventilator-associated pneumonia (VAP) affects mortality, morbidity and the cost of critical care. Reliable risk estimation might improve end-of-life decisions, resource allocation and outcome. Several scoring systems for survival prediction have been established and optimised over the last decades. Recently, new biomarkers have gained interest in the prognostic field. We assessed whether midregional pro-atrial natriuretic peptide (MR-proANP) and procalcitonin (PCT) improve the predictive value of the Simplified Acute Physiologic Score (SAPS) II and the Sequential Organ Failure Assessment (SOFA) in VAP. Specified end-points of a prospective multinational trial including 101 patients with VAP were analysed. Death <28 days after VAP onset was the primary end-point. MR-proANP and PCT were elevated at the onset of VAP in nonsurvivors compared with survivors (p = 0.003 and p = 0.017, respectively), and their slopes of decline differed significantly (p = 0.018 and p = 0.039, respectively). Patients in the highest MR-proANP quartile at VAP onset were at increased risk of death (log-rank p = 0.013). In a logistic regression model, MR-proANP was identified as the best predictor of survival. Adding MR-proANP and PCT to SAPS II and SOFA improved their predictive properties (area under the curve 0.895 and 0.880). We conclude that the combination of two biomarkers, MR-proANP and PCT, improves the survival prediction of clinical severity scores in VAP.
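As a rough illustration of the modelling step described above (and not the trial's actual analysis), the sketch below fits two logistic regression models on simulated data, one with the severity score alone and one adding the two biomarkers, and compares their areas under the ROC curve. Variable names such as `saps2`, `mr_proanp` and `pct` simply mirror the abstract; the data are synthetic.

```python
# Illustrative sketch on simulated data: compare AUC of "severity score only"
# vs "score + biomarkers" logistic models for 28-day mortality.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 101
df = pd.DataFrame({
    "saps2": rng.normal(45, 15, n),        # severity score (synthetic)
    "mr_proanp": rng.lognormal(5, 1, n),   # biomarker 1 (synthetic)
    "pct": rng.lognormal(0, 1, n),         # biomarker 2 (synthetic)
})
true_logit = -4 + 0.05 * df["saps2"] + 0.004 * df["mr_proanp"]
df["death_28d"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

base = sm.Logit(df["death_28d"], sm.add_constant(df[["saps2"]])).fit(disp=0)
full = sm.Logit(df["death_28d"],
                sm.add_constant(df[["saps2", "mr_proanp", "pct"]])).fit(disp=0)
auc_base = roc_auc_score(df["death_28d"],
                         base.predict(sm.add_constant(df[["saps2"]])))
auc_full = roc_auc_score(df["death_28d"],
                         full.predict(sm.add_constant(df[["saps2", "mr_proanp", "pct"]])))
print(f"score only:         AUC = {auc_base:.3f}")
print(f"score + biomarkers: AUC = {auc_full:.3f}")
```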
Abstract:
BACKGROUND: Artemisinin-based combination therapy (ACT) has been promoted as a means to reduce malaria transmission owing to its ability to kill both the asexual blood stages of malaria parasites, which sustain infections over long periods, and the immature derived sexual stages responsible for infecting mosquitoes and onward transmission. Early studies reported a temporal association between ACT introduction and reduced malaria transmission in a number of ecological settings. However, these reports have come from areas with low to moderate malaria transmission, have been confounded by the presence of other interventions or environmental changes that may have reduced malaria transmission, and have not included a comparison group without ACT. This report presents results from the first large-scale observational study to assess the impact of case management with ACT on population-level measures of malaria endemicity in an area with intense transmission, where the benefits of effective infection clearance might be compromised by frequent and repeated re-infection. METHODS: A pre-post observational study with a non-randomized comparison group was conducted at two sites in Tanzania. Both sites used sulphadoxine-pyrimethamine (SP) monotherapy as a first-line anti-malarial from mid-2001 through 2002. In 2003, the ACT, artesunate (AS) co-administered with SP (AS + SP), was introduced in all fixed health facilities in the intervention site, including both public and registered non-governmental facilities. Population-level prevalence of Plasmodium falciparum asexual parasitaemia and gametocytaemia was assessed using light microscopy from samples collected during representative household surveys in 2001, 2002, 2004, 2005 and 2006. FINDINGS: Among 37,309 observations included in the analysis, annual asexual parasitaemia prevalence in persons of all ages ranged from 11% to 28% and gametocytaemia prevalence ranged from <1% to 2% between the two sites and across the five survey years. A multivariable logistic regression model was fitted to adjust for age, socioeconomic status, bed net use and rainfall. In the presence of consistently high coverage and efficacy of SP monotherapy and AS + SP in the comparison and intervention areas, the introduction of ACT in the intervention site was associated with a modest reduction in the adjusted asexual parasitaemia prevalence of 5 percentage points, or 23% (p < 0.0001), relative to the comparison site. Gametocytaemia prevalence did not differ significantly (p = 0.30). INTERPRETATION: The introduction of ACT at fixed health facilities only modestly reduced asexual parasitaemia prevalence. ACT is effective for treatment of uncomplicated malaria and should have a substantial public health impact on morbidity and mortality, but it is unlikely to reduce malaria transmission substantially in much of sub-Saharan Africa, where individuals are rapidly re-infected.
Abstract:
PURPOSE: This longitudinal study aimed to compare heart rate variability (HRV) in elite athletes identified as being either in a 'fatigue' or in a 'no-fatigue' state under 'real-life' conditions. METHODS: 57 elite Nordic skiers were surveyed over 4 years. R-R intervals were recorded supine (SU) and standing (ST). The fatigue state was assessed with a validated questionnaire. A multilevel linear regression model was used to analyze the relationships between heart rate (HR) and HRV descriptors [total spectral power (TP), power in the low-frequency (LF) and high-frequency (HF) ranges expressed in ms² and normalized units (nu)] and the fatigue vs. no-fatigue status. Variables not distributed normally were transformed by taking their common logarithm (log10). RESULTS: 172 trials were identified as being in a 'fatigue' state and 891 as in a 'no-fatigue' state. All supine HR and HRV parameters (beta ± SE) were significantly different (P<0.0001) between 'fatigue' and 'no-fatigue': HRSU (+6.27 ± 0.61 bpm), logTPSU (-0.36 ± 0.04), logLFSU (-0.27 ± 0.04), logHFSU (-0.46 ± 0.05), logLF/HFSU (+0.19 ± 0.03), HFSU(nu) (-9.55 ± 1.33). Differences were also significant (P<0.0001) in standing: HRST (+8.83 ± 0.89), logTPST (-0.28 ± 0.03), logLFST (-0.29 ± 0.03), logHFST (-0.32 ± 0.04). Also, the intra-individual variance of HRV parameters was larger (P<0.05) in the 'fatigue' state (logTPSU: 0.26 vs. 0.07, logLFSU: 0.28 vs. 0.11, logHFSU: 0.32 vs. 0.08, logTPST: 0.13 vs. 0.07, logLFST: 0.16 vs. 0.07, logHFST: 0.25 vs. 0.14). CONCLUSION: HRV was significantly lower in 'fatigue' vs. 'no-fatigue' but was accompanied by a larger intra-individual variance of HRV parameters in 'fatigue'. This broader intra-individual variance might encompass different changes from the no-fatigue state, possibly reflecting different fatigue-induced alterations of the HRV pattern.
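A minimal sketch of the multilevel idea, assuming a random intercept per athlete and fatigue state as a fixed effect on a log-transformed HRV index; the data, group sizes and effect sizes are simulated and only loosely echo the abstract.

```python
# Minimal sketch of a multilevel (mixed-effects) linear model on simulated data:
# random intercept per athlete, fatigue state as the fixed effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
athletes = np.repeat(np.arange(57), 18)             # ~18 trials per skier
athlete_effect = rng.normal(0, 0.2, 57)[athletes]   # between-athlete variation
fatigue = rng.binomial(1, 0.16, athletes.size)      # ~16% 'fatigue' trials
log_tp = 3.5 + athlete_effect - 0.36 * fatigue + rng.normal(0, 0.3, athletes.size)
df = pd.DataFrame({"athlete": athletes, "fatigue": fatigue, "log_tp": log_tp})

model = smf.mixedlm("log_tp ~ fatigue", data=df, groups=df["athlete"])
result = model.fit()
print(result.summary())   # fixed-effect estimate for 'fatigue' (beta +/- SE)
```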
Abstract:
BACKGROUND: Present combination antiretroviral therapy (cART) alone does not cure HIV infection and requires lifelong drug treatment. The potential role of HIV therapeutic vaccines as part of an HIV cure is under consideration. Our aim was to assess the efficacy, safety, and immunogenicity of Vacc-4x, a peptide-based HIV-1 therapeutic vaccine targeting conserved domains on p24(Gag), in adults infected with HIV-1. METHODS: Between July, 2008, and June, 2010, we did a multinational double-blind, randomised, phase 2 study comparing Vacc-4x with placebo. Participants were adults infected with HIV-1 who were aged 18-55 years and virologically suppressed on cART (viral load <50 copies per mL) with CD4 cell counts of 400 × 10⁶ cells per L or greater. The trial was done at 18 sites in Germany, Italy, Spain, the UK, and the USA. Participants were randomly assigned (2:1) to Vacc-4x or placebo. Group allocation was masked from participants and investigators. Four primary immunisations, weekly for 4 weeks, containing Vacc-4x (or placebo) were given intradermally after administration of adjuvant. Booster immunisations were given at weeks 16 and 18. At week 28, cART was interrupted for up to 24 weeks. The coprimary endpoints were cART resumption and changes in CD4 counts during treatment interruption. Analyses were by modified intention to treat: all participants who received one intervention. Furthermore, safety, viral load, and immunogenicity (as measured by ELISPOT and proliferation assays) were assessed. The 52 week follow-up period was completed in June, 2011. For the coprimary endpoints the proportion of participants who met the criteria for cART resumption was analysed with a logistic regression model with the treatment effect being assessed in a model including country as a covariate. This study is registered with ClinicalTrials.gov, number NCT00659789. FINDINGS: 174 individuals were screened; because of slow recruitment, enrolment stopped with 136 of a planned 345 participants and 93 were randomly assigned to receive Vacc-4x and 43 to receive placebo. There were no differences between the two groups for the primary efficacy endpoints in those participants who stopped cART at week 28. Of the participants who resumed cART, 30 (34%) were in the Vacc-4x group and 11 (29%) in the placebo group, and percentage changes in CD4 counts were not significant (mean treatment difference -5·71, 95% CI -13·01 to 1·59). However, a significant difference in viral load was noted for the Vacc-4x group both at week 48 (median 23 100 copies per mL Vacc-4x vs 71 800 copies per mL placebo; p=0·025) and week 52 (median 19 550 copies per mL vs 51 000 copies per mL; p=0·041). One serious adverse event, exacerbation of multiple sclerosis, was reported as possibly related to study treatment. Vacc-4x was immunogenic, inducing proliferative responses in both CD4 and CD8 T-cell populations. INTERPRETATION: The proportion of participants resuming cART before end of study and change in CD4 counts during the treatment interruption showed no benefit of vaccination. Vacc-4x was safe, well tolerated, immunogenic, seemed to contribute to a viral-load setpoint reduction after cART interruption, and might be worth consideration in future HIV-cure investigative strategies. FUNDING: Norwegian Research Council GLOBVAC Program and Bionor Pharma ASA.
Abstract:
OBJECTIVE(S): To investigate the relationship between detection of HIV drug resistance by 2 years from starting antiretroviral therapy and the subsequent risk of progression to AIDS and death. DESIGN: Virological failure was defined as experiencing two consecutive viral loads of more than 400 copies/ml in the time window between 0.5 and 2 years from starting antiretroviral therapy (baseline). Patients were grouped according to evidence of virological failure and whether International AIDS Society resistance mutations to one, two or three drug classes were detected in the time window. METHODS: Standard survival analysis using Kaplan-Meier curves and a Cox proportional hazards regression model with time-fixed covariates defined at baseline was employed. RESULTS: We studied 8229 patients in EuroSIDA who started antiretroviral therapy and who had at least 2 years of clinical follow-up. We observed 829 AIDS events and 571 deaths during 38,814 person-years of follow-up, resulting in an overall incidence of new AIDS and death of 3.6 per 100 person-years of follow-up [95% confidence interval (CI): 3.4-3.8]. By 96 months from baseline, the proportion of patients with a new AIDS diagnosis or death was 20.3% (95% CI: 17.7-22.9) in patients with no evidence of virological failure and 53% (39.3-66.7) in those with virological failure and mutations to three drug classes (P = 0.0001). An almost two-fold difference in risk was confirmed in the multivariable analysis (adjusted relative hazard = 1.8, 95% CI: 1.2-2.7, P = 0.005). CONCLUSION: Although this study shows an association between the detection of resistance at failure and the risk of clinical progression, further research is needed to clarify whether resistance reflects poor adherence or directly increases the risk of clinical events via exhaustion of drug options.
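The two analysis steps named in the methods, Kaplan-Meier curves and a Cox proportional hazards model with time-fixed baseline covariates, can be sketched as follows on synthetic data; the column names (`resistance_classes`, `baseline_cd4`) and the simulated hazards are illustrative assumptions, not the EuroSIDA data.

```python
# Sketch on synthetic data: overall Kaplan-Meier curve and a Cox proportional
# hazards model with time-fixed baseline covariates.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(4)
n = 500
resistance_classes = rng.integers(0, 4, n)   # 0 = no failure, 1-3 drug classes
baseline_cd4 = rng.normal(350, 120, n)
hazard = 0.03 * np.exp(0.25 * resistance_classes - 0.002 * (baseline_cd4 - 350))
time_to_event = rng.exponential(1 / hazard)
df = pd.DataFrame({
    "time": np.minimum(time_to_event, 8.0),          # censor at 8 years
    "event": (time_to_event < 8.0).astype(int),
    "resistance_classes": resistance_classes,
    "baseline_cd4": baseline_cd4,
})

kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])      # overall survival curve

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")  # other columns = covariates
cph.print_summary()   # hazard ratios for resistance classes and baseline CD4
```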
Abstract:
BACKGROUND: Cytomegalovirus (CMV) and human herpesvirus-6 and -7 (HHV-6 and -7) are beta-herpesviruses that commonly reactivate and have been proposed to trigger acute rejection and chronic allograft injury. We assessed the contribution of these viruses to the development of bronchiolitis obliterans syndrome (BOS) after lung transplantation. METHODS: Quantitative real-time polymerase chain reaction of bronchoalveolar lavage samples was performed for CMV, HHV-6 and -7 in a prospective cohort of lung transplant recipients. A time-dependent Cox regression analysis was used to correlate the risk of BOS and acute rejection in patients with and without beta-herpesvirus infection. RESULTS: Ninety-three patients were included in the study over a period of 3 years. A total of 581 bronchoalveolar lavage samples were obtained. Sixty-one patients (65.6%) had at least one positive result for one of the beta-herpesviruses: 48 patients (51.6%) for CMV and 19 patients (20.4%) for both HHV-6 and -7. Median peak viral load was 3419 copies/mL for CMV, 258 copies/mL for HHV-6, and 665 copies/mL for HHV-7. Acute rejection (≥ grade 2) occurred in 46.2% and BOS (≥ stage 1) in 19.4% of the patients. In the Cox regression model, the relative risk of acute rejection or BOS was not increased in patients with any beta-herpesvirus reactivation. Acute rejection was the only risk factor independently associated with BOS. CONCLUSIONS: In lung transplant recipients receiving prolonged antiviral prophylaxis, reactivation of beta-herpesviruses within the allograft was common. However, despite high viral loads in many patients, virus replication was not associated with the development of rejection or BOS.
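A time-dependent Cox analysis differs from the standard one in that covariates can change during follow-up; a hedged sketch using long-format (start, stop) intervals is shown below, with herpesvirus reactivation switching on mid-follow-up. The tiny hand-made dataset and column names are purely illustrative.

```python
# Hedged sketch: time-dependent Cox model with a covariate (virus reactivation)
# that switches on during follow-up, coded as (start, stop) intervals in months.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame([
    # id, start, stop, virus_positive, bos (event at end of interval)
    (1, 0, 6, 0, 0), (1, 6, 18, 1, 1),   # reactivation at month 6, BOS at 18
    (2, 0, 24, 0, 0),                    # never positive, censored at 24
    (3, 0, 3, 0, 0), (3, 3, 30, 1, 0),   # reactivation, no BOS
    (4, 0, 12, 0, 1),                    # BOS without reactivation
    (5, 0, 9, 0, 0), (5, 9, 20, 1, 1),   # reactivation at month 9, BOS at 20
    (6, 0, 28, 0, 0),                    # never positive, censored at 28
], columns=["id", "start", "stop", "virus_positive", "bos"])

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="bos",
        start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio for the time-varying virus covariate
```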
Abstract:
The objectives of this study were to develop a computerized method to screen for potentially avoidable hospital readmissions using routinely collected data, together with a prediction model to adjust rates for case mix. We studied hospital information system data from a random sample of 3,474 inpatients discharged alive in 1997 from a university hospital, and the medical records of those (1,115) readmitted within 1 year. The gold standard was set on the basis of the hospital data and medical records: all readmissions were classified as foreseen readmissions, unforeseen readmissions for a new condition, or unforeseen readmissions for a previously known condition. The latter category was submitted to a systematic medical record review to identify the main cause of readmission. Potentially avoidable readmissions were defined as a subgroup of unforeseen readmissions for a previously known condition occurring within an appropriate interval, set to maximize the chance of detecting avoidable readmissions. The computerized screening algorithm was strictly based on routine statistics: diagnosis and procedure codes and admission mode. The prediction was based on a Poisson regression model. There were 454 (13.1%) unforeseen readmissions for a previously known condition within 1 year. Fifty-nine readmissions (1.7%) were judged avoidable, most of them occurring within 1 month, which was the interval used to define potentially avoidable readmissions (n = 174, 5.0%). The intra-sample sensitivity and specificity of the screening algorithm both reached approximately 96%. A higher risk of potentially avoidable readmission was associated with previous hospitalizations, a high comorbidity index, and a long length of stay; a lower risk was associated with surgery and delivery. The model offers satisfactory predictive performance and good medical plausibility. The proposed measure could be used as an indicator of inpatient care outcome. However, the instrument should be validated using other sets of data from various hospitals.
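To illustrate the case-mix adjustment idea (not the study's actual model), the sketch below fits a Poisson regression of potentially avoidable readmissions on a few discharge characteristics and compares observed with model-expected counts; all variable names and data are simulated.

```python
# Illustrative sketch on simulated discharges: Poisson regression of potentially
# avoidable readmissions on case-mix variables, then observed vs expected counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 3474
df = pd.DataFrame({
    "prior_admissions": rng.poisson(0.8, n),
    "comorbidity_index": rng.poisson(1.5, n),
    "length_of_stay": rng.gamma(2.0, 3.0, n),
    "surgery": rng.binomial(1, 0.35, n),
})
rate = np.exp(-3.6 + 0.25 * df["prior_admissions"] + 0.15 * df["comorbidity_index"]
              + 0.02 * df["length_of_stay"] - 0.4 * df["surgery"])
df["avoidable_readmission"] = rng.poisson(rate)

model = smf.glm("avoidable_readmission ~ prior_admissions + comorbidity_index"
                " + length_of_stay + surgery",
                data=df, family=sm.families.Poisson()).fit()
observed = df["avoidable_readmission"].sum()
expected = model.predict(df).sum()
print(model.summary())
print(f"observed/expected ratio: {observed / expected:.2f}")
```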
Abstract:
Background: Modelling epidemiological knowledge in validated clinical scores is a practical means of integrating evidence-based medicine (EBM) into usual care. Existing scores for cardiovascular disease have largely been developed in emergency settings, and few in primary care. Such a tool is needed for general practitioners (GPs) to evaluate the probability of ischemic heart disease (IHD) in patients with non-traumatic chest pain. Objective: To develop a predictive model to use as a clinical score for detecting IHD in patients with non-traumatic chest pain in primary care. Methods: We performed a post-hoc secondary analysis of data from an observational study including 672 patients with chest pain, of whom 85 had IHD diagnosed by their GP during the year following inclusion. The best-subset method was used to select 8 predictive variables from univariate analysis, which were fitted in a multivariate logistic regression model to define the score. Reliability of the model was assessed using the split-group method. Results: Significant predictors were: age (0-3 points), gender (1 point), having at least one cardiovascular risk factor (hypertension, dyslipidemia, diabetes, smoking, family history of CVD; 3 points), personal history of cardiovascular disease (1 point), duration of chest pain from 1 to 60 minutes (2 points), substernal chest pain (1 point), pain increasing with exertion (1 point) and absence of tenderness at palpation (1 point). The area under the ROC curve for the score was 0.95 (95% CI 0.93-0.97). Patients were categorised into three groups: low risk of IHD (score under 6; n = 360), moderate risk of IHD (score from 6 to 8; n = 187) and high risk of IHD (score from 9 to 13; n = 125). The prevalence of IHD in each group was 0%, 6.7% and 58.5%, respectively. Reliability of the model seems satisfactory, as the model developed from the derivation set accurately predicted (p = 0.948) the number of patients in each group in the validation set. Conclusion: This clinical score, based only on history and physical examination, can be an important tool for general practitioners in predicting ischemic heart disease in patients complaining of chest pain. A score below 6 points (seen in more than half of our population) can obviate complementary exams (ECG, laboratory tests) for selected patients because of the very low risk of IHD. A score of 6 points or above warrants investigation to detect or rule out IHD. Further external validation is required in ambulatory settings.
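The general recipe of turning logistic regression coefficients into an integer point score and checking its discrimination can be sketched as follows; the predictors, effect sizes and the rounding rule (smallest coefficient = 1 point) are assumptions for illustration, not the published score.

```python
# Sketch on synthetic data: fit a logistic model, convert coefficients into
# integer points (smallest coefficient = 1 point), then check discrimination.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 672
X = pd.DataFrame({
    "age_band": rng.integers(0, 4, n),        # 0-3, as in the published score
    "male": rng.binomial(1, 0.5, n),
    "cv_risk_factor": rng.binomial(1, 0.6, n),
    "exertional_pain": rng.binomial(1, 0.3, n),
})
true_logit = (-5 + 1.0 * X["age_band"] + 0.5 * X["male"]
              + 1.4 * X["cv_risk_factor"] + 0.6 * X["exertional_pain"])
ihd = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

fit = sm.Logit(ihd, sm.add_constant(X)).fit(disp=0)
coefs = fit.params.drop("const")
points = (coefs / coefs.abs().min()).round().astype(int)
score = (X * points).sum(axis=1)
print(points.to_dict())                            # points per predictor
print(f"AUC of the point score: {roc_auc_score(ihd, score):.2f}")
```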
Abstract:
BACKGROUND: The clinical course of HIV-1 infection is highly variable among individuals, at least in part as a result of genetic polymorphisms in the host. Toll-like receptors (TLRs) have a key role in innate immunity, and mutations in the genes encoding these receptors have been associated with increased or decreased susceptibility to infections. OBJECTIVES: To determine whether single-nucleotide polymorphisms (SNPs) in TLR2-4 and TLR7-9 influenced the natural course of HIV-1 infection. METHODS: Twenty-eight SNPs in TLRs were analysed in HAART-naive HIV-positive patients from the Swiss HIV Cohort Study. The SNPs were detected using Sequenom technology. Haplotypes were inferred using an expectation-maximization algorithm. The CD4 T cell decline was calculated using least-squares regression. Patients with a rapid CD4 cell decline, below the 15th percentile, were defined as rapid progressors. The risk of rapid progression associated with SNPs was estimated using a logistic regression model. Other candidate risk factors included age, sex and risk group (heterosexual, homosexual and intravenous drug use). RESULTS: Two SNPs in TLR9 (1635A/G and +1174G/A) in linkage disequilibrium were associated with the rapid progressor phenotype: for 1635A/G, odds ratio (OR) 3.9 [95% confidence interval (CI) 1.7-9.2] for GA versus AA and OR 4.7 (95% CI 1.9-12.0) for GG versus AA (P = 0.0008). CONCLUSION: Rapid progression of HIV-1 infection was associated with TLR9 polymorphisms. Because of its potential implications for intervention strategies and vaccine development, additional epidemiological and experimental studies are needed to confirm this association.
Abstract:
Background: Visual analog scales (VAS) are used to assess readiness-to-change constructs, which are often considered critical for change. Objective: We studied whether 3 constructs - readiness to change, importance of changing and confidence in ability to change - predict risk status 6 months later in 20-year-old men with either or both of two behaviors: risky drinking and smoking. Methods: 577 participants in a brief intervention randomized trial were assessed at baseline and 6 months later on alcohol and tobacco consumption and with three 1-10 VAS (readiness, importance, confidence) for each behavior. For each behavior, we used one regression model for each construct. Models controlled for receipt of a brief intervention and used the lowest level (1-4) in each construct as the reference group (vs medium (5-7) and high (8-10) levels). Results: Among the 475 risky drinkers, mean (SD) readiness, importance and confidence to change drinking were 4.0 (3.1), 2.8 (2.2) and 7.2 (3.0). Readiness was not associated with being alcohol-risk free 6 months later (OR 1.3 [0.7; 2.2] and 1.4 [0.8; 2.6] for medium and high readiness). High importance and high confidence were associated with being risk free (OR 0.9 [0.5; 1.8] and 2.9 [1.2; 7.5] for medium and high importance; 2.1 [1.0; 4.8] and 2.8 [1.5; 5.6] for medium and high confidence). Among the 320 smokers, mean readiness, importance and confidence to change smoking were 4.6 (2.6), 5.3 (2.6) and 5.9 (2.6). Neither readiness nor importance was associated with being smoking free (OR 2.1 [0.9; 4.7] and 2.1 [0.8; 5.8] for medium and high readiness; 1.4 [0.6; 3.4] and 2.1 [0.8; 5.4] for medium and high importance). High confidence was associated with being smoking free (OR 2.2 [0.8; 6.6] and 3.4 [1.2; 9.8] for medium and high confidence). Conclusions: For drinking and smoking, high confidence in ability to change was associated - with similar magnitude - with a favorable outcome. This points to the value of confidence as an important predictor of successful change.
Abstract:
BACKGROUND: Lower ambulatory performance with aging may be related to a reduced oxidative capacity within skeletal muscle. This study examined the associations of skeletal muscle mitochondrial capacity and efficiency with walking performance in a group of older adults. METHODS: Thirty-seven older adults (mean age 78 years; 21 men and 16 women) completed an aerobic capacity (VO2 peak) test and measurement of preferred walking speed over 400 m. Maximal coupled (State 3; St3) mitochondrial respiration was determined by high-resolution respirometry in saponin-permeabilized myofibers obtained from percutaneous biopsies of the vastus lateralis (n = 22). Maximal phosphorylation capacity (ATPmax) of the vastus lateralis was determined in vivo by 31P magnetic resonance spectroscopy (n = 30). Quadriceps contractile volume was determined by magnetic resonance imaging. Mitochondrial efficiency (max ATP production/max O2 consumption) was characterized using ATPmax per St3 respiration (ATPmax/St3). RESULTS: In vitro St3 respiration was significantly correlated with in vivo ATPmax (r = .47, p = .004). Total oxidative capacity of the quadriceps (St3*quadriceps contractile volume) was a determinant of VO2 peak (r = .33, p = .006). ATPmax (r = .158, p = .03) and VO2 peak (r = .475, p < .0001) were correlated with preferred walking speed. Inclusion of both ATPmax/St3 and VO2 peak in a multiple linear regression model improved the prediction of preferred walking speed (r = .647, p < .0001), suggesting that mitochondrial efficiency is an important determinant of preferred walking speed. CONCLUSIONS: Lower mitochondrial capacity and efficiency were both associated with slower walking speed within a group of older participants with a wide range of function. In addition to aerobic capacity, lower mitochondrial capacity and efficiency likely play roles in slowing gait speed with age.
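A small sketch of the regression comparison reported above: preferred walking speed regressed on aerobic capacity alone and then with a mitochondrial efficiency term added, taking the multiple correlation r as the square root of R² of each fit. Data and variable names are simulated.

```python
# Sketch on simulated data: does adding a mitochondrial efficiency term to
# aerobic capacity improve prediction of preferred walking speed? r = sqrt(R^2).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 30
df = pd.DataFrame({
    "vo2_peak": rng.normal(20, 4, n),            # mL/kg/min (synthetic)
    "atpmax_per_st3": rng.normal(1.0, 0.2, n),   # mitochondrial efficiency
})
df["walk_speed"] = (0.4 + 0.03 * df["vo2_peak"] + 0.5 * df["atpmax_per_st3"]
                    + rng.normal(0, 0.1, n))

m1 = smf.ols("walk_speed ~ vo2_peak", data=df).fit()
m2 = smf.ols("walk_speed ~ vo2_peak + atpmax_per_st3", data=df).fit()
print(f"r (VO2 peak alone):        {np.sqrt(m1.rsquared):.3f}")
print(f"r (VO2 peak + efficiency): {np.sqrt(m2.rsquared):.3f}")
```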
Abstract:
The long-term outcome of idiopathic steroid-resistant nephrotic syndrome was retrospectively studied in 78 children in eight centers over the past 20 years. Median age at onset was 4.4 years (1.1-15.0 years) and the gender ratio was 1.4. The median follow-up period was 7.7 years (1.0-19.7 years). The disease was initially steroid-unresponsive in 45 patients (58%) and became unresponsive later in 33 (42%). The main therapeutic strategies included administration of ciclosporine (CsA) alone (n = 29; 37%) and CsA + mycophenolate mofetil (n = 18; 23%). The actuarial patient survival rate after 15 years was 97%. Renal survival rates after 5, 10 and 15 years were 75%, 58% and 53%, respectively. An age at onset of nephrotic syndrome (NS) > 10 years was the only independent predictor of end-stage renal disease (ESRD) in a multivariate analysis using a Cox regression model (P < 0.001). Twenty patients (26%) received transplants; ten showed recurrence of the NS: seven within 2 days, one within 2 weeks, and two within 3-5 months. Seven patients lost their grafts, four from recurrence. Owing to better management, kidney survival in idiopathic steroid-resistant nephrotic syndrome (SRNS) has improved during the past 20 years. Further prospective controlled trials will delineate the potential benefit of new immunosuppressive treatments.
Abstract:
BACKGROUND AND PURPOSE: Hyperglycemia after stroke is associated with larger infarct volume and poorer functional outcome. In an animal stroke model, the association between serum glucose and infarct volume is described by a U-shaped curve with a nadir ≈7 mmol/L. However, a similar curve has never been reported in human studies. The objective of the present study was to investigate the association between serum glucose levels and functional outcome in patients with acute ischemic stroke. METHODS: We analyzed 1446 consecutive patients with acute ischemic stroke. Serum glucose was measured on admission at the emergency department together with multiple other metabolic, clinical, and radiological parameters. The National Institutes of Health Stroke Scale (NIHSS) score was recorded at 24 hours, and the Rankin score was recorded at 3 and 12 months. The association between serum glucose and favorable outcome (Rankin score ≤2) was explored in univariate and multivariate analysis. The model was further analyzed in a robust regression model based on fractional polynomial functions (powers from -2 to 2). RESULTS: Serum glucose is independently correlated with functional outcome at 12 months (OR, 1.15; P=0.01). Other predictors of outcome include admission NIHSS score (OR, 1.18; P<0.001), age (OR, 1.06; P<0.001), prestroke Rankin score (OR, 20.8; P=0.004), and leukoaraiosis (OR, 2.21; P=0.016). Using these factors in multiple logistic regression analysis, the area under the receiver-operator characteristic curve is 0.869. The association between serum glucose and the Rankin score at 12 months is described by a J-shaped curve with a nadir of 5 mmol/L. Glucose values between 3.7 and 7.3 mmol/L are associated with favorable outcome. A similar curve was generated for the association of glucose with 24-hour NIHSS score, for which glucose values between 4.0 and 7.2 mmol/L are associated with an NIHSS score <7. DISCUSSION: Both hypoglycemia and hyperglycemia are dangerous in acute ischemic stroke, as shown by a J-shaped association between serum glucose and 24-hour and 12-month outcome. Initial serum glucose values between 3.7 and 7.3 mmol/L are associated with favorable outcome.
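The fractional polynomial approach mentioned in the methods can be sketched as follows: a logistic model for favorable outcome with two power transforms of glucose, choosing the pair of powers (from -2 to 2, with 0 denoting log) that minimizes the AIC. The simulated J-shaped data and the restriction to distinct power pairs are simplifying assumptions, not the study's exact procedure.

```python
# Sketch on simulated data: second-degree fractional polynomial logistic model
# for favorable outcome, selecting the power pair (0 denotes log) by AIC.
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
glucose = rng.uniform(3, 15, 1446)
# Simulated J-shaped relation with best outcomes near 5 mmol/L
true_logit = 1.0 - 1.2 * np.log(glucose / 5.0) ** 2
favorable = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

def fp_term(x, p):
    return np.log(x) if p == 0 else x ** p

powers = [-2, -1, -0.5, 0, 0.5, 1, 2]
best = None
for p1, p2 in itertools.combinations(powers, 2):      # distinct power pairs
    design = sm.add_constant(np.column_stack([fp_term(glucose, p1),
                                              fp_term(glucose, p2)]))
    try:
        fit = sm.Logit(favorable, design).fit(disp=0)
    except Exception:
        continue                                      # skip ill-conditioned fits
    if best is None or fit.aic < best[0]:
        best = (fit.aic, (p1, p2))
print("selected powers:", best[1], "AIC:", round(best[0], 1))
```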
Abstract:
OBJECTIVES: For certain major operations, inpatient mortality risk is lower in high-volume hospitals than in low-volume hospitals. Extending the analysis to a broader range of interventions and outcomes is necessary before adopting policies based on minimum volume thresholds. METHODS: Using the United States 2004 Nationwide Inpatient Sample, we assessed the effect of intervention-specific and overall hospital volume on surgical complications, potentially avoidable reoperations, and deaths across 1.4 million interventions in 353 hospitals. Outcome variations across hospitals were analyzed through a 3-level hierarchical logistic regression model (patients, surgical interventions, and hospitals), which took into account interventions on multiple organs, 144 intervention categories, and structural hospital characteristics. Discriminative performance and calibration were good. RESULTS: Hospitals with more experience in a given intervention had similar reoperation rates but lower mortality and complication rates: odds ratios per volume decile, 0.93 and 0.97, respectively. However, the benefit was limited to heart surgery and a small number of other operations. Risks were higher in hospitals that performed more interventions overall: the odds ratio per 1000 interventions was approximately 1.02 for each event. Even after adjustment for specific volume, mortality varied substantially across both high- and low-volume hospitals. CONCLUSION: Although the link between specific volume and certain inpatient outcomes suggests that specialization might help improve surgical safety, the variable magnitude of this link and the heterogeneity of the hospital effect do not support the systematic use of volume-based referrals. It may be more efficient to monitor risk-adjusted postoperative outcomes and to investigate facilities with worse than expected outcomes.
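A hedged two-level approximation of the hierarchical model described above: a logistic regression for in-hospital death with a random intercept per hospital and intervention-specific volume (in deciles) as a fixed effect, fitted by variational Bayes in statsmodels. The study's 3-level model (patients, interventions, hospitals) is richer; the data and names here are synthetic.

```python
# Two-level approximation on synthetic data: logistic model for in-hospital
# death with a random intercept per hospital and volume decile as fixed effect.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(8)
n_hospitals, per_hospital = 40, 200
hospital = np.repeat(np.arange(n_hospitals), per_hospital)
hosp_effect = rng.normal(0, 0.3, n_hospitals)[hospital]     # hospital intercepts
volume_decile = np.repeat(rng.integers(1, 11, n_hospitals), per_hospital)
true_logit = -3.0 - 0.07 * (volume_decile - 5) + hosp_effect
death = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))
df = pd.DataFrame({"death": death, "hospital": hospital,
                   "volume_decile": volume_decile})

model = BinomialBayesMixedGLM.from_formula(
    "death ~ volume_decile", {"hospital": "0 + C(hospital)"}, df)
result = model.fit_vb()      # variational Bayes fit
print(result.summary())      # fixed effect for volume and hospital-level variance
```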