273 results for Generalized failure rate
Abstract:
Purpose The third-generation nonsteroidal aromatase inhibitors (AIs) are increasingly used as adjuvant and first-line advanced therapy for postmenopausal, hormone receptor-positive (HR+) breast cancer. Because many patients subsequently experience progression or relapse, it is important to identify agents with efficacy after AI failure. Materials and Methods Evaluation of Faslodex versus Exemestane Clinical Trial (EFECT) is a randomized, double-blind, placebo-controlled, multicenter phase III trial of fulvestrant versus exemestane in postmenopausal women with HR+ advanced breast cancer (ABC) progressing or recurring after a nonsteroidal AI. The primary end point was time to progression (TTP). A fulvestrant loading-dose (LD) regimen was used: 500 mg intramuscularly on day 0, 250 mg on days 14 and 28, and 250 mg every 28 days thereafter. Exemestane 25 mg orally was administered once daily. Results A total of 693 women were randomly assigned to fulvestrant (n = 351) or exemestane (n = 342). Approximately 60% of patients had received at least two prior endocrine therapies. Median TTP was 3.7 months in both groups (hazard ratio = 0.963; 95% CI, 0.819 to 1.133; P = .6531). The overall response rate (7.4% v 6.7%; P = .736) and clinical benefit rate (32.2% v 31.5%; P = .853) were similar between fulvestrant and exemestane, respectively. Median duration of clinical benefit was 9.3 and 8.3 months, respectively. Both treatments were well tolerated, with no significant differences in the incidence of adverse events or quality of life. Pharmacokinetic data confirm that steady state was reached within 1 month with the LD schedule of fulvestrant. Conclusion Fulvestrant LD and exemestane are equally active and well tolerated in a meaningful proportion of postmenopausal women with ABC who have experienced progression or recurrence during treatment with a nonsteroidal AI.
Abstract:
PURPOSE: To evaluate retrospectively the midterm and long-term results of percutaneous endovascular treatment of venous outflow obstruction after pediatric liver transplantation. MATERIALS AND METHODS: During a 9-year period, 18 children with obstruction of a hepatic vein (HV) or inferior vena cava (IVC) anastomosis underwent percutaneous transluminal angioplasty (PTA) with balloon dilation, or stent placement in case of PTA failure, after liver transplantation. Patients' body weights ranged from 7.7 kg to 42.6 kg (mean, 18.8 kg +/- 9). Potential predictors of patency were compared between balloon dilation and stent placement groups. RESULTS: Forty-two procedures were performed (range, 1-11 per patient; mean, 2). Technical and initial clinical success were achieved in all cases. Major complications included one case of pulmonary artery stent embolization and one case of hemothorax. Three children (25%) with HV obstruction were treated with PTA and nine (75%) were treated with stent placement. Three children with IVC obstruction (75%) were treated with PTA and one (25%) was treated with a stent. There were two children with simultaneous obstruction at the HV and IVC; one was treated with PTA and the other with a stent. Cases of isolated HV stenosis had a higher probability of patency with balloon-expandable stent treatment than with balloon dilation (P < .05). Follow-up time ranged from 7 days to 9 years (mean, 42 months +/- 31), and the primary assisted patency rate was 100% when stent placement was performed among the first three procedures. CONCLUSIONS: In cases of venous outflow obstruction resulting from HV and/or IVC lesions after pediatric liver transplantation, percutaneous endovascular treatment with balloon dilation or stent placement is a safe and effective alternative treatment that results in long-term patency.
Abstract:
Many lines of evidence indicate that theta rhythm, a prominent neural oscillatory mode found in the mammalian hippocampus, plays a key role in the acquisition, processing, and retrieval of memories. However, a predictive neurophysiological feature of the baseline theta rhythm that correlates with the learning rate across different animals has yet to be identified. Here we show that the mean theta rhythm speed observed during baseline periods of immobility has a strong positive correlation with the rate at which rats learn an operant task. This relationship is observed across rats, during both quiet waking (r=0.82; p<0.01) and paradoxical sleep (r=0.83; p<0.01), suggesting that the basal theta frequency relates to basic neurological processes that are important in the acquisition of operant behavior. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
Polymorphisms of chemokine and chemokine-receptor genes have been shown to influence the rate of progression to AIDS; however, their influence on the response to HAART remains unclear. We investigated the frequency of the SDF-1-3'A, CCR2-64I, CCR5-D32 and CCR5-Promoter-59029-A/G polymorphisms in Brazilian HIV-1-infected and uninfected individuals, and their influence on CD4+ T-cell evolution in HIV-1-infected individuals before and during HAART. Polymorphism detection was done in a cross-sectional study of 200 HIV-1-infected and 82 uninfected individuals. The rate of CD4+ T-cell increase or decrease was studied in a cohort of 155 HIV-1-infected individuals before and after HAART. Polymorphisms were determined by PCR associated with RFLP. HIV-1-infected and uninfected subjects showed, respectively, frequencies of 0.193 and 0.220 for SDF-1-3'A, 0.140 and 0.110 for CCR2-V64I, 0.038 and 0.055 for CCR5-D32, and 0.442 and 0.390 for CCR5-P-59029-A/G. HIV-1-infected subjects carrying one, two or three of these four polymorphisms showed better CD4+ T-cell recovery than HIV-1-infected subjects carrying the four wild-type alleles (+2.7, +1.6, +3.5, and -0.9 lymphocytes/mu l/month, respectively). Logistic regression analysis showed that the CCR5-D32/CCR2-V64I association was a predictor of a positive CD4+ T-cell slope after HAART. The distribution of polymorphisms did not differ between HIV-1-infected and uninfected individuals, but differed from that of more homogeneous ethnic groups, probably reflecting the miscegenation of the Brazilian population. We add further evidence of the role of these polymorphisms by showing that CD4 gain was influenced by carriage of one or more of the polymorphisms studied here. These results highlight the possibility that these genetic traits can be useful to identify patients at risk for faster progression to AIDS or therapeutic failure.
Abstract:
Background: The 6-minute walk test (6MWT) is a well-known instrument for assessing the functional capacity of a variety of groups, including the obese. It is a simple, low-cost and easily applied method to objectively assess the level of exercise capacity. The aim of the present study was to study the functional capacity of a severely obese population before and after bariatric surgery. Methods: A total of 51 patients were studied. Of the 51 patients, 86.2% were women, and the mean age was 40.9 +/- 9.2 years. All 51 patients were evaluated preoperatively and 49 were evaluated 7-12 months postoperatively. The initial body mass index was 51.1 +/- 9.2 kg/m(2), and the final body mass index was 28.2 +/- 8.1 kg/m(2). All patients underwent Roux-en-Y gastric bypass. The 6MWT was performed in a hospital corridor, with patients attempting to cover as much distance as they could, walking back and forth for as long as possible within 6 minutes at their regular pace. The total distance, Borg Scale of perceived exhaustion, modified Borg dyspnea scale for shortness of breath, and physical complaints at the end of the test were recorded. In addition, the heart rate and respiratory frequency were assessed before and after the test. Results: The tolerance was good, and no injuries occurred at either evaluation. The patients' mean distance for the 6MWT was 381.9 +/- 49.3 m before surgery and 467.8 +/- 40.3 m after surgery (p < .0001). Similar results were observed for the other parameters assessed. Conclusion: The 6MWT provided useful information about the functional status of the obese patients undergoing bariatric surgery. A simple, safe, and powerful method to assess functional capacity of severely obese patients, the 6MWT is an objective test that might replace the conventional treadmill test for these types of patients. (Surg Obes Relat Dis 2009;5:540-543.) (C) 2009 American Society for Metabolic and Bariatric Surgery. All rights reserved.
Abstract:
Background: Parenteral nutrition (PN) is used to control the nutritional state after severe intestinal resections. Whenever possible, enteral nutrition (EN) is used to promote intestinal rehabilitation and reduce PN dependency. Our aim was to verify whether EN + oral intake (OI) in severe short bowel syndrome (SBS) surgical adult patients can maintain adequate nutritional status in the long term. Methods: This longitudinal retrospective study included 10 patients followed for 7 postoperative years. Body mass index (BMI), percentage of involuntary loss of usual body weight (UWL), free fat mass (FFM) and fat mass (FM) composition assessed by bioelectric impedance, and laboratory tests were evaluated at 6, 12, 24, 36, 48, 60, 72, and 84 months after surgery. Energy and protein offered by home PN (HPN) and, in the long term, by home EN (HEN) + OI were evaluated at the same periods. The statistical model of generalized estimating equations with p < 0.05 was used. Results: With long-term EN + OI there was a progressive increase in UWL and a decrease in BMI, FFM, and FM (p < 0.05). PN weaning was possible in eight patients. Infection due to central venous catheter (CVC) contamination was the most common complication (1.2 CVC episodes/patient/year). There was an increase in the energy and protein supply provided by HEN + OI (p < 0.05). All patients survived for at least 2 years, seven for 5 years and six for 7 years of follow-up. Conclusions: In the long term, SBS surgical adult patients fed with HEN + OI could not maintain adequate nutritional status, with loss of FM and FFM. (Nutr Hosp. 2011;26:834-842) DOI:10.3305/nh.2011.26.4.5153
Abstract:
Hypercalcaemia in patients with HIV infection is usually associated with specific conditions such as lymphoma and granulomatous diseases. We describe a case of severe hypercalcaemia, consequent to vitamin D intoxication and with secondary renal failure, in an HIV-infected patient using tenofovir. Serum creatinine and calcium returned to near-normal levels after vitamin D discontinuation and administration of saline and furosemide. Some aspects of drug-induced nephropathy are discussed.
Abstract:
Background-Endocardial fibrous tissue (FT) deposition is a hallmark of endomyocardial fibrosis (EMF). Echocardiography is a first-line and the standard technique for the diagnosis of this disease. Although late gadolinium enhancement (LGE) cardiovascular magnetic resonance (CMR) allows FT characterization, its role in the diagnosis and prognosis of EMF has not been investigated. Methods and Results-Thirty-six patients (29 women; age, 54 +/- 12 years) with EMF diagnosis after clinical evaluation and comprehensive 2-dimensional Doppler echocardiography underwent cine-CMR for assessing ventricular volumes, ejection fraction and mass, and LGE-CMR for FT characterization and quantification. Indexed FT volume (FT/body surface area) was calculated after planimetry of the 8 to 12 slices obtained in the short-axis view at end-diastole (mL/m(2)). Surgical resection of FT was performed in 16 patients. In all patients, areas of LGE were confined to the endocardium, frequently as a continuous streak from the inflow tract extending to the apex, where it was usually most prominent. There was a relation between increased FT/body surface area and worse New York Heart Association functional class and with increased probability of surgery (P<0.05). The histopathologic examination of resected FT showed typical features of EMF with extensive endocardial fibrous thickening, proliferation of small vessels, and scarce inflammatory infiltrate. In multivariate analysis, the patients with FT/body surface area >19 mL/m(2) had an increased mortality rate, with a relative risk of 10.8. Conclusions-Our study provides evidence that LGE-CMR is useful in the diagnosis and prognosis of EMF through quantification of the typical pattern of FT deposition. (Circ Cardiovasc Imaging. 2011;4:304-311.)
Abstract:
Objective: To describe a new FOXL2 gene mutation in a woman with sporadic blepharophimosis-ptosis-epicanthus inversus syndrome (BPES) and hypergonadotropic hypogonadism. Design: Case report. Setting: University medical center. Patient(s): A 28-year-old woman. Intervention(s): Clinical evaluation, hormone assays, gene mutation research. Main Outcome Measure(s): FOXL2 gene mutation. Result(s): The patient with hypergonadotropic hypogonadism was diagnosed with BPES due to a new FOXL2 gene mutation. Conclusion(s): Blepharophimosis-ptosis-epicanthus inversus syndrome is a rare disorder associated with premature ovarian failure (POF). The syndrome is an autosomal dominant trait that causes eyelid malformations and POF in affected women. Mutations in FOXL2 gene, located in chromosome 3, are related to the development of BPES with POF (BPES type I) or without POF (BPES type II). This report demonstrates a previously undescribed de novo mutation in the FOXL2 gene-a thymidine deletion, c. 627delT (g. 864delT)-in a woman with a sporadic case of BPES and POF. This mutation leads to truncated protein production that is related to a BPES type I phenotype. This report shows the importance of family history and genetic analysis in the evaluation of patients with POF and corroborates the relationship between mutations on the FOXL2 gene and ovarian insufficiency. (Fertil Steril (R) 2010; 93: 1006.e3-e6. (C) 2010 by American Society for Reproductive Medicine.)
Abstract:
Objective To determine the accuracy of first-trimester detection of single umbilical artery (SUA). Methods The number of vessels in the umbilical cord was examined in a prospective cohort of 779 singleton, low-risk, unselected pregnancies, in the first (11-13 weeks) and second (17-24 weeks) trimesters, using both power and color Doppler, and after delivery, by placental histopathologic examination. Concordance of first- and second-trimester findings with postnatal diagnoses was assessed by calculating kappa coefficients. Results There was moderate concordance between the findings in the first trimester and the postnatal diagnoses (kappa = 0.52) and high concordance (kappa = 0.89) for the second-trimester scan. Sensitivity, specificity, and positive and negative predictive values for the findings in the first trimester were 57.1, 98.9, 50.0 and 99.2%, and for the second trimester were 86.6, 99.9, 92.9 and 99.7%. Conclusion The sensitivity and positive predictive value of the first-trimester scan for identifying an isolated SUA in a prospective unselected population were poor. Diagnosis of isolated SUA, as well as a definitive judgment about the presence of associated anomalies, would still require a scan in the second trimester. Copyright (C) 2011 John Wiley & Sons, Ltd.
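The accuracy measures and kappa coefficients reported in this abstract all derive from a 2x2 table of scan finding versus postnatal diagnosis. A minimal sketch of those computations follows; the counts used in the comments are hypothetical illustrations, not the study's data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV (as percentages)
    from the cells of a 2x2 test-vs-reference table."""
    sens = 100.0 * tp / (tp + fn)   # true positives among all diseased
    spec = 100.0 * tn / (tn + fp)   # true negatives among all non-diseased
    ppv = 100.0 * tp / (tp + fp)    # probability disease given positive test
    npv = 100.0 * tn / (tn + fn)    # probability no disease given negative test
    return sens, spec, ppv, npv

def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa: agreement beyond chance between two binary ratings
    (here, prenatal finding vs postnatal diagnosis)."""
    n = tp + fp + fn + tn
    observed = (tp + tn) / n
    # Chance agreement expected from the marginal totals of each rating.
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical example: 8 true positives, 2 false positives,
# 2 false negatives, 88 true negatives.
sens, spec, ppv, npv = diagnostic_metrics(8, 2, 2, 88)
kappa = cohens_kappa(8, 2, 2, 88)
```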
Abstract:
Although cigarette smoking and alcohol consumption increase risk for head and neck cancers, there have been few attempts to model risks quantitatively and to formally evaluate cancer site-specific risks. The authors pooled data from 15 case-control studies and modeled the excess odds ratio (EOR) to assess risk by total exposure (pack-years and drink-years) and its modification by exposure rate (cigarettes/day and drinks/day). The smoking analysis included 1,761 laryngeal, 2,453 pharyngeal, and 1,990 oral cavity cancers, and the alcohol analysis included 2,551 laryngeal, 3,693 pharyngeal, and 3,116 oral cavity cancers, with over 8,000 controls. Above 15 cigarettes/day, the EOR/pack-year decreased with increasing cigarettes/day, suggesting that greater cigarettes/day for a shorter duration was less deleterious than fewer cigarettes/day for a longer duration. Estimates of EOR/pack-year were homogeneous across sites, while the effects of cigarettes/day varied, indicating that the greater laryngeal cancer risk derived from differential cigarettes/day effects and not pack-years. EOR/drink-year estimates increased through 10 drinks/day, suggesting that greater drinks/day for a shorter duration was more deleterious than fewer drinks/day for a longer duration. Above 10 drinks/day, data were limited. EOR/drink-year estimates varied by site, while drinks/day effects were homogeneous, indicating that the greater pharyngeal/oral cavity cancer risk with alcohol consumption derived from the differential effects of drink-years and not drinks/day.
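The linear excess odds ratio model described in this abstract takes the form OR = 1 + beta x total exposure, with beta (the EOR per unit exposure) allowed to vary with exposure rate. A minimal sketch, with hypothetical coefficient values rather than the pooled-analysis estimates:

```python
def linear_eor(total_exposure, beta, rate_modifier=1.0):
    """Linear excess odds ratio model: OR = 1 + beta * modifier * exposure.

    total_exposure: cumulative dose, e.g. pack-years or drink-years.
    beta: EOR per unit of exposure (hypothetical value, not a study estimate).
    rate_modifier: multiplier capturing exposure-rate modification, e.g. a
    value below 1 models the attenuated EOR/pack-year seen above
    15 cigarettes/day in the abstract.
    """
    return 1.0 + beta * rate_modifier * total_exposure

# Hypothetical: 30 pack-years at a beta of 0.05 EOR/pack-year gives OR = 2.5;
# halving the rate modifier attenuates the same cumulative dose to OR = 1.75.
or_full = linear_eor(30, 0.05)
or_attenuated = linear_eor(30, 0.05, rate_modifier=0.5)
```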
Abstract:
Objectives: To analyze mortality rates of children with severe sepsis and septic shock in relation to time-sensitive fluid resuscitation and treatments received, and to define barriers to the implementation of the American College of Critical Care Medicine/Pediatric Advanced Life Support guidelines in a pediatric intensive care unit in a developing country. Methods: Retrospective chart review and prospective analysis of septic shock treatment in a pediatric intensive care unit of a tertiary care teaching hospital. Ninety patients with severe sepsis or septic shock admitted between July 2002 and June 2003 were included in this study. Results: Of the 90 patients, 83% had septic shock and 17% had severe sepsis; 80 patients had preexisting severe chronic diseases. Patients with septic shock who received less than a 20-mL/kg dose of resuscitation fluid in the first hour of treatment had a mortality rate of 73%, whereas patients who received more than a 40-mL/kg dose in the first hour of treatment had a mortality rate of 33% (P < 0.05). Patients treated less than 30 minutes after diagnosis of severe sepsis and septic shock had a significantly lower mortality rate (40%) than patients treated more than 60 minutes after diagnosis (P < 0.05). Controlling for the risk of mortality, early fluid resuscitation was associated with a 3-fold reduction in the odds of death (odds ratio, 0.33; 95% confidence interval, 0.13-0.85). The most important barriers to adequate severe sepsis and septic shock treatment were lack of adequate vascular access, lack of recognition of early shock, shortage of health care providers, and nonuse of goals and treatment protocols. Conclusions: The mortality rate was higher for older children, for those who received less than 40 mL/kg in the first hour, and for those whose treatment was not initiated in the first 30 minutes after the diagnosis of septic shock.
The acknowledgment of existing barriers to a timely fluid administration and the establishment of objectives to overcome these barriers may lead to a more successful implementation of the American College of Critical Care Medicine guidelines and reduced mortality rates for children with septic shock in the developing world.
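An adjusted odds ratio with a 95% confidence interval, such as the 0.33 (0.13-0.85) reported above, comes from a regression model, but the unadjusted version can be sketched from a 2x2 table. A minimal illustration with hypothetical counts (not the study's data), using a Wald interval on the log odds ratio:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald confidence interval.

    a, b: deaths and survivors among the exposed (e.g. early resuscitation);
    c, d: deaths and survivors among the unexposed.
    z = 1.96 gives a 95% interval.
    """
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) is the root of the summed reciprocal cells.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical table: 10 deaths / 20 survivors among early-resuscitated
# patients versus 30 deaths / 20 survivors among the rest.
or_, lower, upper = odds_ratio_ci(10, 20, 30, 20)
```

An interval that excludes 1.0, as here, corresponds to a statistically significant association at the chosen level.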
Abstract:
TRAPS is the most common of the autosomal dominant periodic fever syndromes. It is caused by mutations in the TNFRSF1A gene, which encodes the type 1 TNF receptor (TNFR1). We describe here a Brazilian patient with TRAPS associated with a novel TNFRSF1A de novo mutation and the response to anti-TNF therapy. The patient is a 9-year-old girl with recurrent fevers since the age of 3 years, usually lasting 3 to 7 days, and recurring every other week. These episodes are associated with mild abdominal pain, nausea, vomiting and generalized myalgia. Recurrent conjunctivitis and erysipelas-like skin lesions in the lower limbs also occur. Laboratory studies show persistent normocytic normochromic anemia, thrombocytosis, and elevated erythrocyte sedimentation rate and C-reactive protein. IgD levels are normal. Mutational screening of TNFRSF1A revealed the association of a novel C30F mutation with the common R92Q low-penetrance mutation. The R92Q mutation is seen in 5% of the general population and is associated with an atypical inflammatory phenotype. The patient had a very good response to etanercept, with cessation of fever and normalization of inflammatory markers. Our report expands the spectrum of TNFRSF1A mutations associated with TRAPS, adding further evidence for possible additive effects of the low-penetrance R92Q mutation and cysteine residue mutations, and confirms etanercept as an efficacious treatment alternative.
Abstract:
Animal and human studies indicate that cannabidiol (CBD), a major constituent of cannabis, has anxiolytic properties. However, no study to date has investigated the effects of this compound on human pathological anxiety and its underlying brain mechanisms. The aim of the present study was to investigate this in patients with generalized social anxiety disorder (SAD) using functional neuroimaging. Regional cerebral blood flow (rCBF) at rest was measured twice using (99m)Tc-ECD SPECT in 10 treatment-naive patients with SAD. In the first session, subjects were given an oral dose of CBD (400 mg) or placebo, in a double-blind procedure. In the second session, the same procedure was performed using the drug that had not been administered in the previous session. Within-subject between-condition rCBF comparisons were performed using statistical parametric mapping. Relative to placebo, CBD was associated with significantly decreased subjective anxiety (p < 0.001), reduced ECD uptake in the left parahippocampal gyrus, hippocampus, and inferior temporal gyrus (p < 0.001, uncorrected), and increased ECD uptake in the right posterior cingulate gyrus (p < 0.001, uncorrected). These results suggest that CBD reduces anxiety in SAD and that this is related to its effects on activity in limbic and paralimbic brain areas.
Muscle sympathetic nervous activity in depressed patients before and after treatment with sertraline
Abstract:
Background Sympathetic hyperactivity is one of the mechanisms involved in the increased cardiovascular risk associated with depression, and there is evidence that antidepressants decrease sympathetic activity. Objectives We tested the following two hypotheses: patients with major depressive disorder with high scores of depressive symptoms (HMDD) have augmented muscle sympathetic nervous system activity (MSNA) at rest and during mental stress compared with patients with major depressive disorder with low scores of depressive symptoms (LMDD) and controls; sertraline decreases MSNA in depressed patients. Methods Ten HMDD, nine LMDD and 11 body weight-matched controls were studied. MSNA was directly measured from the peroneal nerve using microneurography for 3 min at rest and 4 min during the Stroop color word test. For the LMDD and HMDD groups, the tests were repeated after treatment with sertraline (103.3 +/- 40 mg). Results Resting MSNA was significantly higher in the HMDD [29.1 bursts/min (SE 2.9)] compared with LMDD [19.9 (1.6)] and controls [22.2 (2.0)] groups (P=0.026 and 0.046, respectively). There was a significant positive correlation between resting MSNA and severity of depression. MSNA increased significantly and similarly during stress in all the studied groups. Sertraline significantly decreased resting MSNA in the LMDD group and MSNA during mental stress in the LMDD and HMDD groups. Sertraline significantly decreased resting heart rate and heart rate response to mental stress in the HMDD group. Conclusion Moderate-to-severe depression is associated with increased MSNA. Sertraline treatment reduces MSNA at rest and during mental challenge in depressed patients, which may have prognostic implications in this group. J Hypertens 27:2429-2436 (c) 2009 Wolters Kluwer Health | Lippincott Williams & Wilkins.