Abstract:
Fibromuscular dysplasia (FMD) is a rare, nonatherosclerotic arterial disease for which the molecular basis is unknown. We comprehensively studied 47 subjects with FMD by physical examination, spine magnetic resonance imaging, bone densitometry, and brain magnetic resonance angiography. Inflammatory biomarkers in plasma and transforming growth factor β (TGF-β) cytokines in patient-derived dermal fibroblasts were measured by ELISA. Arterial pathology other than medial fibrodysplasia with multifocal stenosis included cerebral aneurysm, found in 12.8% of subjects. Extra-arterial pathology included low bone density (P<0.001); early onset degenerative spine disease (95.7%); increased incidence of Chiari I malformation (6.4%) and dural ectasia (42.6%); and physical examination findings of a mild connective tissue dysplasia (95.7%). Screening for mutations causing known genetically mediated arteriopathies was unrevealing. We found elevated plasma TGF-β1 (P=0.009), TGF-β2 (P=0.004) and additional inflammatory markers, and increased TGF-β1 (P=0.0009) and TGF-β2 (P=0.0001) secretion in dermal fibroblast cell lines from subjects with FMD compared with age- and gender-matched controls. Detailed phenotyping of patients with FMD demonstrates that FMD is a systemic disease sharing features with the spectrum of genetic syndromes that involve altered TGF-β signaling, and identifies TGF-β as a candidate biomarker of FMD.
Abstract:
The QT interval, an electrocardiographic measure reflecting myocardial repolarization, is a heritable trait. QT prolongation is a risk factor for ventricular arrhythmias and sudden cardiac death (SCD) and could indicate the presence of the potentially lethal mendelian long-QT syndrome (LQTS). Using a genome-wide association and replication study in up to 100,000 individuals, we identified 35 common variant loci associated with QT interval that collectively explain ∼8-10% of QT-interval variation and highlight the importance of calcium regulation in myocardial repolarization. Rare variant analysis of 6 new QT interval-associated loci in 298 unrelated probands with LQTS identified coding variants not found in controls but of uncertain causality and therefore requiring validation. Several newly identified loci encode proteins that physically interact with other recognized repolarization proteins. Our integration of common variant association, expression and orthogonal protein-protein interaction screens provides new insights into cardiac electrophysiology and identifies new candidate genes for ventricular arrhythmias, LQTS and SCD.
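Note: the ∼8-10% figure refers to variance explained under an additive genetic model, conventionally computed per locus as 2p(1-p)β² relative to the phenotypic variance. The abstract gives no per-locus effect sizes, so the sketch below uses entirely hypothetical allele frequencies, effect estimates, and phenotypic SD, purely to illustrate the arithmetic behind such a figure.

```python
# Illustration of variance explained by common variants under an additive model:
#   var_explained_i = 2 * p_i * (1 - p_i) * beta_i**2, summed over loci and
#   divided by the total phenotypic variance of the QT interval.
# All allele frequencies, per-allele effects (ms), and the phenotypic SD below
# are made-up numbers, not values from the study.

loci = [
    {"maf": 0.36, "beta_ms": 3.5},   # hypothetical locus 1
    {"maf": 0.12, "beta_ms": 2.1},   # hypothetical locus 2
    {"maf": 0.45, "beta_ms": 1.4},   # hypothetical locus 3
]
qt_phenotypic_sd_ms = 18.0           # assumed SD of the QT interval in ms

explained = sum(2 * l["maf"] * (1 - l["maf"]) * l["beta_ms"] ** 2 for l in loci)
fraction = explained / qt_phenotypic_sd_ms ** 2
print(f"Variance explained by the example loci: {fraction:.1%}")
```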
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ monitoring or 6- or 12-monthly viral load monitoring. We compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure, and assumed that the rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced time on failing ART, improved immunological response and increased switching to second-line ART.
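Note: the abstract summarises a cohort-parameterized simulation model without specifying its structure. Purely to illustrate why monitoring frequency and test sensitivity drive time spent on failing ART, here is a minimal discrete-time sketch; the monthly failure probability, visit intervals, and per-visit detection probabilities are assumptions, not the study's parameters.

```python
import random

# Illustrative discrete-time simulation of time spent on failing first-line ART
# under two monitoring strategies. All rates and intervals are assumptions.

MONTHS = 60                 # 5-year horizon
P_FAIL_PER_MONTH = 0.01     # assumed monthly probability of virological failure
DETECTION_INTERVAL = {"viral_load_12m": 12, "cd4_6m": 6}   # months between visits
P_DETECT_AT_VISIT = {"viral_load_12m": 0.95, "cd4_6m": 0.30}  # assumed sensitivity

def simulate_child(strategy, rng):
    """Return months spent on failing ART before a switch to second-line ART."""
    failing_since = None
    for month in range(1, MONTHS + 1):
        if failing_since is None and rng.random() < P_FAIL_PER_MONTH:
            failing_since = month
        is_visit = month % DETECTION_INTERVAL[strategy] == 0
        if failing_since is not None and is_visit:
            if rng.random() < P_DETECT_AT_VISIT[strategy]:
                return month - failing_since   # failure detected, child switched
    # failure never detected (or never occurred) within the horizon
    return (MONTHS - failing_since) if failing_since is not None else 0

rng = random.Random(1)
for strategy in ("cd4_6m", "viral_load_12m"):
    months_failing = [simulate_child(strategy, rng) for _ in range(20000)]
    print(strategy, "mean months on failing ART:",
          round(sum(months_failing) / len(months_failing), 2))
```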
Abstract:
BACKGROUND CONTEXT The nerve root sedimentation sign in transverse magnetic resonance imaging has been shown to discriminate well between selected patients with and without lumbar spinal stenosis (LSS), but the performance of this new test, when used in a broader patient population, is not yet known. PURPOSE To evaluate the clinical performance of the nerve root sedimentation sign in detecting central LSS above L5 and to determine its potential significance for treatment decisions. STUDY DESIGN Retrospective cohort study. PATIENT SAMPLE One hundred eighteen consecutive patients with suspected LSS (52% women, median age 62 years) with a median follow-up of 24 months. OUTCOME MEASURES Oswestry disability index (ODI) and back and leg pain relief. METHODS We performed a clinical test validation study to assess the clinical performance of the sign by measuring its association with health outcomes. Subjects were patients referred to our orthopedic spine unit from 2004 to 2007 before the sign had been described. Based on clinical and radiological diagnostics, patients had been treated with decompression surgery or nonsurgical treatment. Changes in the ODI and pain from baseline to 24-month follow-up were compared between sedimentation sign positives and negatives in both treatment groups. RESULTS Sixty-nine patients underwent surgery. Average baseline ODI in the surgical group was 54.7%, and the sign was positive in 39 patients (mean ODI improvement 29.0 points) and negative in 30 (ODI improvement 28.4), with no statistically significant difference in ODI and pain improvement between groups. In the 49 patients of the nonsurgical group, mean baseline ODI was 42.4%; the sign was positive in 18 (ODI improvement 0.6) and negative in 31 (ODI improvement 17.7). A positive sign was associated with smaller improvements in ODI and back pain than a negative sign (both p<.01 on t test). CONCLUSIONS In patients commonly treated with decompression surgery, the sedimentation sign does not appear to predict surgical outcome. In nonsurgically treated patients, a positive sign is associated with more limited improvement. In these cases, surgery might be effective, but this needs investigation in prospective randomized trials (Australian New Zealand Clinical Trial Registry, number ACTRN12610000567022).
Abstract:
BACKGROUND: Infliximab (IFX) has been used for over a decade worldwide. Less is known about the natural history of IFX use beyond a few years and which patients are more likely to sustain benefits. METHODS: Patients with Crohn's disease (CD) exposed to IFX from Massachusetts General Hospital, Boston, Saint-Antoine Hospital, Paris, and the Swiss IBD Cohort Study were identified through retrospective and prospective data collection, complemented by chart abstraction of electronic medical records. We compared long-term users of infliximab (LTUI; >5 yr of treatment) with non-LTUI patients to identify prognostic factors. RESULTS: We pooled data on 1014 patients with CD from 3 different databases, of whom 250 were defined as LTUI. The comparison group comprised 290 patients with CD who discontinued IFX: 48 for primary nonresponse, 95 for loss of response, and 147 for adverse events. Factors associated with LTUI were colonic involvement and a younger age at the start of IFX. The prevalence of active smokers and obese patients differed markedly, but inversely, between the American and European centers but did not impact outcome. The discontinuation rate was stable at around 3% to 6% each year from years 3 to 10. CONCLUSIONS: Young age at the start of IFX and colonic CD are factors associated with beneficial long-term use of IFX. After 5 years of IFX, there is still a 3% to 5% annual discontinuation rate. Several factors associated with a good initial response, such as being a nonsmoker and shorter disease duration at IFX initiation, do not seem to be associated with a longer-term response.
Abstract:
BACKGROUND & AIMS Pegylated interferon is still the backbone of hepatitis C treatment and may cause thrombocytopenia, leading to dose reductions, early discontinuation, and eventually worse clinical outcome. We assessed associations between interferon-induced thrombocytopenia and bleeding complications, interferon dose reductions, early treatment discontinuation, as well as SVR and long-term clinical outcome. METHODS All consecutive patients with chronic HCV infection and biopsy-proven advanced hepatic fibrosis (Ishak 4-6) who initiated interferon-based therapy between 1990 and 2003 in 5 large hepatology units in Europe and Canada were included. RESULTS Overall, 859 treatments were administered to 546 patients. Baseline platelet counts (in 10⁹/L) were normal (≥150) in 394 (46%) treatments; thrombocytopenia was moderate (75-149) in 324 (38%) and severe (<75) in 53 (6%) treatments. Thrombocytopenia-induced interferon dose reductions occurred in 3 (1%), 46 (16%), and 15 (30%) treatments, respectively (p<0.001); interferon was discontinued due to thrombocytopenia in 1 (<1%), 8 (3%), and 8 (16%) treatments, respectively (p<0.001). In total, 104 bleeding events were reported during 53 treatments. Only two severe bleeding complications occurred. Multivariate analysis showed that cirrhosis and a platelet count below 50 were associated with on-treatment bleeding. Among thrombocytopenic patients, those attaining SVR had a lower occurrence of liver failure (p<0.001), hepatocellular carcinoma (p<0.001), liver-related death or liver transplantation (p<0.001), and all-cause mortality (p=0.001) compared with patients without SVR. CONCLUSIONS Even in thrombocytopenic patients with chronic HCV infection and advanced hepatic fibrosis, on-treatment bleeding events are generally mild. SVR was associated with a marked reduction in cirrhosis-related morbidity and mortality, especially in patients with baseline thrombocytopenia.
Abstract:
OBJECTIVE Reliable tools to predict long-term outcome among patients with well compensated advanced liver disease due to chronic HCV infection are lacking. DESIGN Risk scores for mortality and for cirrhosis-related complications were constructed with Cox regression analysis in a derivation cohort and evaluated in a validation cohort, both including patients with chronic HCV infection and advanced fibrosis. RESULTS In the derivation cohort, 100/405 patients died during a median 8.1 (IQR 5.7-11.1) years of follow-up. Multivariate Cox analyses showed age (HR=1.06, 95% CI 1.04 to 1.09, p<0.001), male sex (HR=1.91, 95% CI 1.10 to 3.29, p=0.021), platelet count (HR=0.91, 95% CI 0.87 to 0.95, p<0.001) and log10 aspartate aminotransferase/alanine aminotransferase ratio (HR=1.30, 95% CI 1.12 to 1.51, p=0.001) were independently associated with mortality (C statistic=0.78, 95% CI 0.72 to 0.83). In the validation cohort, 58/296 patients with cirrhosis died during a median of 6.6 (IQR 4.4-9.0) years. Among patients with estimated 5-year mortality risks <5%, 5-10% and >10%, the observed 5-year mortality rates in the derivation cohort and validation cohort were 0.9% (95% CI 0.0 to 2.7) and 2.6% (95% CI 0.0 to 6.1), 8.1% (95% CI 1.8 to 14.4) and 8.0% (95% CI 1.3 to 14.7), 21.8% (95% CI 13.2 to 30.4) and 20.9% (95% CI 13.6 to 28.1), respectively (C statistic in validation cohort = 0.76, 95% CI 0.69 to 0.83). The risk score for cirrhosis-related complications also incorporated HCV genotype (C statistic = 0.80, 95% CI 0.76 to 0.83 in the derivation cohort; and 0.74, 95% CI 0.68 to 0.79 in the validation cohort). CONCLUSIONS Prognosis of patients with chronic HCV infection and compensated advanced liver disease can be accurately assessed with risk scores including readily available objective clinical parameters.
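Note: in a Cox model the reported hazard ratios correspond to coefficients ln(HR) that are summed into a linear predictor and then combined with a baseline survival function to yield an absolute risk. The abstract does not state the covariate increments or the baseline survival, so the published score cannot be reproduced from it; the sketch below only illustrates the arithmetic, and the assumed units and example patient are hypothetical.

```python
import math

# Illustrative assembly of a Cox-model linear predictor from the hazard ratios
# quoted in the abstract. The increment per covariate and the baseline survival
# are assumptions, so the resulting number is not the published risk score.

hazard_ratios = {
    "age_years": 1.06,        # per year of age (assumed increment)
    "male_sex": 1.91,         # male vs female
    "platelets": 0.91,        # assumed per 10 x 10^9/L increase
    "log10_ast_alt": 1.30,    # assumed per unit of log10(AST/ALT)
}
coefficients = {name: math.log(hr) for name, hr in hazard_ratios.items()}

def linear_predictor(age_years, male, platelets_10e9_per_l, log10_ast_alt_ratio):
    """Sum ln(HR) coefficients times covariate values (uncentred)."""
    return (coefficients["age_years"] * age_years
            + coefficients["male_sex"] * (1 if male else 0)
            + coefficients["platelets"] * (platelets_10e9_per_l / 10)
            + coefficients["log10_ast_alt"] * log10_ast_alt_ratio)

# Hypothetical patient; a real score would centre the predictor and apply a
# baseline survival function to obtain the absolute 5-year mortality risk.
lp = linear_predictor(age_years=58, male=True, platelets_10e9_per_l=110,
                      log10_ast_alt_ratio=0.1)
print(f"Linear predictor (uncentred, illustrative): {lp:.2f}")
```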
Abstract:
BACKGROUND & AIMS Pegylated interferon-based treatment is still the backbone of current hepatitis C therapy and is associated with bone marrow suppression and an increased risk of infections. The aim of this retrospective cohort study was to assess the risk of infections during interferon-based treatment among patients with chronic HCV infection and advanced hepatic fibrosis, and its relation to treatment-induced neutropenia. METHODS This cohort study included all consecutive patients with chronic HCV infection and biopsy-proven bridging fibrosis or cirrhosis (Ishak 4-6) who started treatment between 1990 and 2003 in five large hepatology units in Europe and Canada. Neutrophil counts of 500-749/μL and below 500/μL were considered moderate and severe neutropenia, respectively. RESULTS This study included 723 interferon-based treatments, administered to 490 patients. In total, 113 infections were reported during 88 (12%) treatments, of which 24 (21%) were considered severe. Only one patient was found to have moderate neutropenia and three patients were found to have severe neutropenia at the visit before the infection. Three hundred and twelve (99.7%) visits with moderate neutropenia and 44 (93.6%) visits with severe neutropenia were not followed by an infection. Multivariable analysis showed that cirrhosis (OR 2.85, 95% CI 1.38-5.90, p=0.005) and severe neutropenia at the previous visit (OR 5.42, 95% CI 1.34-22.0, p=0.018) were associated with the occurrence of infection, while moderate neutropenia was not. Among the subgroup of patients treated with PegIFN, severe neutropenia was not significantly associated with infection (OR 1.63, 95% CI 0.19-14.2, p=0.660). CONCLUSIONS In this large cohort of patients with bridging fibrosis and cirrhosis, infections during interferon-based therapy were generally mild. Severe interferon-induced neutropenia rarely occurred, but was associated with on-treatment infection. Moderate neutropenia was not associated with infection, suggesting that current dose reduction guidelines might be too strict.
Abstract:
BACKGROUND Combination antiretroviral therapy (ART) suppresses viral replication in HIV-infected children. The growth of virologically suppressed children on ART has not been well documented. We aimed to develop dynamic reference curves for weight-for-age z-scores (WAZ) and height-for-age z-scores (HAZ). RESULTS A total of 4,876 children were followed for 7,407 person-years. Analyses were stratified by baseline z-scores and age, which were the most important predictors of growth response. The youngest children showed the most pronounced initial increases in weight and height, but catch-up growth stagnated after 1-2 years. Three years after starting ART, WAZ ranged from -2.2 (95% prediction interval -5.6 to 0.8) in children with baseline age ≥5 years and z-score <-3 to 0.0 (-2.7 to 2.4) in children with baseline age <2 years and WAZ ≥-1. For HAZ the corresponding range was -2.3 (-4.9 to 0.3) in children with baseline age ≥5 years and z-score <-3 to 0.3 (-3.1 to 3.4) in children with baseline age 2-5 years and HAZ ≥-1. CONCLUSIONS We have developed an online tool to calculate reference trajectories in fully suppressed children. The web application could help to define 'optimal' growth response and identify children with treatment failure.
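Note: weight-for-age and height-for-age z-scores are conventionally derived from growth-reference LMS parameters (Cole's method). The abstract does not specify the reference standard or the prediction model behind the web application, so the following is only a generic z-score sketch with invented L, M, S values.

```python
import math

# Generic LMS z-score computation as used for growth references such as WAZ/HAZ:
#   z = ((x / M)**L - 1) / (L * S)   for L != 0
#   z = ln(x / M) / S                for L == 0
# The L, M, S values below are invented for illustration; real references
# tabulate them by age and sex, and this is not the study's prediction model.

def lms_z_score(x, L, M, S):
    """Convert a measurement x to a z-score given LMS reference parameters."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical reference entry for weight at a given age and sex (not real values).
L_ref, M_ref, S_ref = -0.35, 14.5, 0.12
weight_kg = 11.8
print(f"WAZ (illustrative): {lms_z_score(weight_kg, L_ref, M_ref, S_ref):.2f}")
```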
Abstract:
Denosumab reduced the incidence of new fractures in postmenopausal women with osteoporosis by 68% at the spine and 40% at the hip over 36 months compared with placebo in the FREEDOM study. This efficacy was supported by improvements from baseline at 36 months in vertebral strength (18.2%) in axial compression and in femoral strength (8.6%) in a sideways fall configuration, estimated in newtons by an established voxel-based finite element (FE) methodology. Since FE analyses rely on the choice of meshes, material properties, and boundary conditions, the aim of this study was to independently confirm and compare the effects of denosumab on vertebral and femoral strength during the FREEDOM trial using an alternative smooth FE methodology. Unlike the previous FE study, effects on femoral strength in a physiological stance configuration were also examined. QCT data for the proximal femur and two lumbar vertebrae were analyzed by the smooth FE methodology at baseline, 12, 24, and 36 months for 51 treated (denosumab) and 47 control (placebo) subjects. QCT images were segmented and converted into smooth FE models to compute bone strength. L1 and L2 vertebral bodies were virtually loaded in axial compression and the proximal femora in both fall and stance configurations. Denosumab increased vertebral body strength by 10.8%, 14.0%, and 17.4% from baseline at 12, 24, and 36 months, respectively (p < 0.0001). Denosumab also increased femoral strength in the fall configuration by 4.3%, 5.1%, and 7.2% from baseline at 12, 24, and 36 months, respectively (p < 0.0001). Similar improvements were observed in the stance configuration, with increases of 4.2%, 5.2%, and 5.2% from baseline (p ≤ 0.0007). Differences between the increasing strengths with denosumab and the decreasing strengths with placebo were significant starting at 12 months (vertebral and femoral fall) or 24 months (femoral stance). Using an alternative smooth FE methodology, we confirmed the significant improvements in vertebral body and proximal femur strength previously observed with denosumab. Estimated increases in strength with denosumab and decreases with placebo were highly consistent between both FE techniques.
Abstract:
Calcium channel blockers (CCBs) are prescribed to patients with Marfan syndrome for prophylaxis against aortic aneurysm progression, despite limited evidence for their efficacy and safety in the disorder. Unexpectedly, Marfan mice treated with CCBs show accelerated aneurysm expansion, rupture, and premature lethality. This effect is both extracellular signal-regulated kinase (ERK1/2) dependent and angiotensin-II type 1 receptor (AT1R) dependent. We have identified protein kinase C beta (PKCβ) as a critical mediator of this pathway and demonstrate that the PKCβ inhibitor enzastaurin, and the clinically available anti-hypertensive agent hydralazine, both normalize aortic growth in Marfan mice, in association with reduced PKCβ and ERK1/2 activation. Furthermore, patients with Marfan syndrome and other forms of inherited thoracic aortic aneurysm taking CCBs display increased risk of aortic dissection and need for aortic surgery, compared to patients on other antihypertensive agents.
Abstract:
The metabolic network of a cell represents the catabolic and anabolic reactions that interconvert small molecules (metabolites) through the activity of enzymes, transporters and non-catalyzed chemical reactions. Our understanding of individual metabolic networks is increasing as we learn more about the enzymes that are active in particular cells under particular conditions and as technologies advance to allow detailed measurements of the cellular metabolome. Metabolic network databases are of increasing importance in allowing us to contextualise data sets emerging from transcriptomic, proteomic and metabolomic experiments. Here we present a dynamic database, TrypanoCyc (http://www.metexplore.fr/trypanocyc/), which describes the generic and condition-specific metabolic network of Trypanosoma brucei, a parasitic protozoan responsible for human and animal African trypanosomiasis. In addition to enabling navigation through the BioCyc-based TrypanoCyc interface, we have also implemented a network-based representation of the information through MetExplore, yielding a novel environment in which to visualise the metabolism of this important parasite.