965 results for "interval-censored data"
Abstract:
The purpose of this study was to evaluate outcomes such as success of the initial therapy, failure of outpatient treatment, and death during outpatient intravenous antimicrobial therapy in patients with febrile neutropenia (FN) and hematological malignancies. In addition, clinical and laboratory data and the Multinational Association for Supportive Care in Cancer (MASCC) index were compared with failure of outpatient treatment and death. In a retrospective study, we evaluated FN events following chemotherapy that were treated initially with cefepime, with or without teicoplanin, replaced by levofloxacin after 48 h of defervescence in patients in good general condition and with ANC > 500/mm³. Of the 178 FN episodes that occurred in 126 patients, we observed success of the initial therapy in 63.5% of the events, failure of outpatient treatment in 20.8%, and death in 6.2%. The success rate of oral levofloxacin after defervescence was 99% (95 out of 96). Using multivariate analysis, significant risk factors for failure of outpatient treatment were smoking (odds ratio (OR) 3.14, confidence interval (CI) 1.14-8.66; p = 0.027) and serum creatinine levels > 1.2 mg/dL (OR 7.97, CI 2.19-28.95; p = 0.002). With regard to death, the risk factor found was oxygen saturation by pulse oximetry < 95% (OR 5.8, CI 1.50-22.56; p = 0.011). Using the MASCC index, 165 events were classified as low risk and 13 as high risk. Failure of outpatient treatment was reported in seven (53.8%) high-risk and 30 (18.2%) low-risk episodes (p = 0.006). In addition, death occurred in seven (4.2%) low-risk and four (30.8%) high-risk events (p = 0.004). Our results show that the MASCC index was able to identify high-risk patients. In addition, non-smoking, serum creatinine levels ≤ 1.2 mg/dL, and oxygen saturation by pulse oximetry ≥ 95% were protective factors.
Abstract:
Ninety-one consecutive systemic lupus erythematosus (SLE) patients (American College of Rheumatology criteria) with a history of cutaneous vasculitis were compared to 163 SLE controls without this clinical manifestation from July to December 2007 in order to determine the possible clinical and serological associations of this manifestation. Data were obtained from an ongoing electronic database protocol, and autoantibodies to double-stranded DNA, Sm, RNP, Ro/SS-A, La/SS-B, and cardiolipin, as well as ribosomal P protein antibody (anti-P), were detected by standard techniques. Exclusion criteria were the presence of antiphospholipid syndrome or antibodies, Sjögren syndrome, and a history of thrombosis. The mean age (38.5 +/- 11.5 vs. 37.8 +/- 11.6 years, p = 0.635), disease duration (12.5 +/- 7.8 vs. 11.8 +/- 7.9 years, p = 0.501), and frequency of white race (71.4% vs. 70.5%, p = 0.872) and female sex (96.8% vs. 93.7%, p = 0.272) were comparable in both groups. The vasculitis group had a higher frequency of malar rash (97.9% vs. 87.4%, p = 0.004), photosensitivity (91.4% vs. 81.6%, p = 0.030), and Raynaud phenomenon (RP; 27.7% vs. 7.5%, p < 0.001), whereas all other clinical manifestations, including renal and central nervous system involvement, were similar to the control group. Laboratory data revealed that only anti-P (35.1% vs. 12.1%, p < 0.001) was more frequent in patients with vasculitis. In a multivariate logistic regression model, cutaneous vasculitis was associated with the presence of RP (OR = 3.70; 95% confidence interval [CI] = 1.73-8.00) and anti-P (OR = 3.42; 95% CI = 1.76-6.66). In summary, SLE cutaneous vasculitis characterizes a subgroup of patients with more RP and anti-P antibodies but without a higher frequency of renal and central nervous system involvement.
Abstract:
Background: This study compared the 10-year follow-up of percutaneous coronary intervention (PCI), coronary artery bypass graft surgery (CABG), and medical treatment (MT) in patients with multivessel coronary artery disease, stable angina, and preserved ventricular function. Methods and Results: The primary end points were overall mortality, Q-wave myocardial infarction, or refractory angina that required revascularization. All data were analyzed according to the intention-to-treat principle. At a single institution, 611 patients were randomly assigned to CABG (n = 203), PCI (n = 205), or MT (n = 203). The 10-year survival rates were 74.9% with CABG, 75.1% with PCI, and 69% with MT (P = 0.089). The 10-year rates of myocardial infarction were 10.3% with CABG, 13.3% with PCI, and 20.7% with MT (P < 0.010). The 10-year rates of additional revascularization were 7.4% with CABG, 41.9% with PCI, and 39.4% with MT (P < 0.001). Relative to the composite end point, Cox regression analysis showed a higher incidence of primary events with MT than with CABG (hazard ratio 2.35, 95% confidence interval 1.78 to 3.11) and with PCI than with CABG (hazard ratio 1.85, 95% confidence interval 1.39 to 2.47). Furthermore, the 10-year rates of freedom from angina were 64% with CABG, 59% with PCI, and 43% with MT (P < 0.001). Conclusions: Compared with CABG, MT was associated with a significantly higher incidence of subsequent myocardial infarction, a higher rate of additional revascularization, a higher incidence of cardiac death, and consequently a 2.29-fold increased risk of combined events. PCI was associated with an increased need for further revascularization, a higher incidence of myocardial infarction, and a 1.46-fold increased risk of combined events compared with CABG. Additionally, CABG was better than MT at eliminating anginal symptoms.
Abstract:
Exercise training has an important role in the prevention and treatment of hypertension, but its effects on the early metabolic and hemodynamic abnormalities observed in normotensive offspring of hypertensive parents (FH+) have not been studied. We compared high-intensity aerobic interval training (AIT) and moderate-intensity continuous exercise training (CMT) with regard to hemodynamic, metabolic and hormonal variables in FH+ subjects. Forty-four healthy FH+ women (25.0+/-4.4 years), randomized to control (ConFH+) or to a three-times-per-week equal-volume AIT (80-90% of VO2max) or CMT (50-60% of VO2max) regimen, and 15 healthy women with normotensive parents (ConFH-; 25.3+/-3.1 years) had their hemodynamic, metabolic and hormonal variables analyzed at baseline and after 16 weeks of follow-up. Ambulatory blood pressure (ABP), glucose and cholesterol levels were similar among all groups, but the FH+ groups showed higher insulin, insulin sensitivity, carotid-femoral pulse wave velocity (PWV), norepinephrine and endothelin-1 (ET-1) levels and lower nitrite/nitrate (NOx) levels than ConFH- subjects. AIT and CMT were equally effective in improving ABP (P<0.05), insulin and insulin sensitivity (P<0.001); however, AIT was superior in improving cardiorespiratory fitness (15 vs. 8%; P<0.05), PWV (P<0.01), and BP, norepinephrine, ET-1 and NOx responses to exercise (P<0.05). Exercise intensity was an important factor in improving cardiorespiratory fitness and reversing the hemodynamic, metabolic and hormonal alterations involved in the pathophysiology of hypertension. These findings may have important implications for exercise training programs used for the prevention of inherited hypertensive disorders. Hypertension Research (2010) 33, 836-843; doi:10.1038/hr.2010.72; published online 7 May 2010
Abstract:
Exercise is an effective intervention for treating hypertension and arterial stiffness, but little is known about which exercise modality is the most effective in reducing arterial stiffness and blood pressure in hypertensive subjects. Our purpose was to evaluate the effect of continuous vs. interval exercise training on arterial stiffness and blood pressure in hypertensive patients. Sixty-five patients with hypertension were randomized to 16 weeks of continuous exercise training (n=26), interval training (n=26) or a sedentary routine (n=13). The training was conducted in two 40-min sessions a week. Assessment of arterial stiffness by carotid-femoral pulse wave velocity (PWV) measurement and 24-h ambulatory blood pressure monitoring (ABPM) was performed before and after the 16 weeks of training. At the end of the study, ABPM blood pressure had declined significantly only in the subjects with higher basal values and was independent of training modality. PWV had declined significantly only after interval training, from 9.44 +/- 0.91 to 8.90 +/- 0.96 m/s, P=0.009 (continuous: from 10.15 +/- 1.66 to 9.98 +/- 1.81 m/s, P = NS; control: from 10.23 +/- 1.82 to 10.53 +/- 1.97 m/s, P = NS). Continuous and interval exercise training were beneficial for blood pressure control, but only interval training reduced arterial stiffness in treated hypertensive subjects. Hypertension Research (2010) 33, 627-632; doi:10.1038/hr.2010.42; published online 9 April 2010
Abstract:
Background: Despite antihypertensive therapy, it is difficult to maintain optimal systemic blood pressure (BP) values in hypertensive patients (HPT). Exercise may reduce BP in untreated HPT. However, evidence regarding its effect during long-term antihypertensive therapy is lacking. Our purpose was to evaluate the acute effects of 40-minute continuous (CE) or interval exercise (IE) on cycle ergometers on BP in long-term treated HPT. Methods: Fifty-two treated HPT were randomized to CE (n=26) or IE (n=26) protocols. CE was performed at 60% of reserve heart rate (HR). IE alternated consecutively 2 min at 50% of reserve HR with 1 min at 80%. Two 24-h ambulatory BP monitoring sessions were performed after exercise (postexercise) or after a non-exercise control period (control), in random order. Results: CE reduced mean 24-h systolic (S) BP (2.6 +/- 6.6 mm Hg, p = 0.05) and diastolic (D) BP (2.3 +/- 4.6, p = 0.01), and nighttime SBP (4.8 +/- 6.4, p < 0.001) and DBP (4.6 +/- 5.2 mm Hg, p = 0.001). IE reduced 24-h SBP (2.8 +/- 6.5, p = 0.03) and nighttime SBP (3.4 +/- 7.2, p = 0.02), and tended to reduce nighttime DBP (p = 0.06). Greater reductions occurred at higher BP levels. The percentage of normal ambulatory BP values increased after CE (24-h: 42% to 54%; daytime: 42% to 61%; nighttime: 61% to 69%) and IE (24-h: 31% to 46%; daytime: 54% to 61%; nighttime: 46% to 69%). Conclusion: CE and IE reduced ambulatory BP in treated HPT, increasing the number of patients reaching normal ambulatory BP values. These effects suggest that continuous and interval aerobic exercise may have a role in BP management in treated HPT. (c) 2008 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Background: Chagas' disease is the illness caused by the protozoan Trypanosoma cruzi, and it is still endemic in Latin America. Heart transplantation is a therapeutic option for patients with end-stage Chagas' cardiomyopathy. Nevertheless, reactivation may occur after transplantation, leading to higher morbidity and graft dysfunction. This study aimed to identify risk factors for Chagas' disease reactivation episodes. Methods: This investigation is a retrospective cohort study of all Chagas' disease heart transplant recipients from September 1985 through September 2004. Clinical, microbiologic and histopathologic data were reviewed. Statistical analysis was performed with SPSS (version 13) software. Results: Sixty-four (21.9%) patients with chronic Chagas' disease underwent heart transplantation during the study period. Seventeen patients (26.5%) had at least one episode of Chagas' disease reactivation, and univariate analysis identified the number of rejection episodes (p = 0.013) and the development of neoplasms (p = 0.040) as factors associated with Chagas' disease reactivation episodes. Multivariate analysis showed that the number of rejection episodes (hazard ratio = 1.31; 95% confidence interval [CI]: 1.06 to 1.62; p = 0.011), neoplasms (hazard ratio = 5.07; 95% CI: 1.49 to 17.20; p = 0.009) and use of mycophenolate mofetil (hazard ratio = 3.14; 95% CI: 1.00 to 9.84; p = 0.049) are independent determinants of reactivation after transplantation. Age (p = 0.88), male gender (p = 0.15), presence of rejection (p = 0.17), cytomegalovirus infection (p = 0.79) and mortality after hospital discharge (p = 0.15) showed no statistically significant differences. Conclusions: Our data suggest that events resulting in a greater immunosuppression status contribute to Chagas' disease reactivation episodes after heart transplantation and should alert physicians to make an early diagnosis and perform pre-emptive therapy. Although reactivation led to a high rate of morbidity, a low mortality risk was observed.
Abstract:
Background: The effect of prearrest left ventricular ejection fraction (LVEF) on outcome after cardiac arrest is unknown. Methods and Results: During a 26-month period, Utstein-style data were prospectively collected on 800 consecutive inpatient adult index cardiac arrests in an observational, single-center study at a tertiary cardiac care hospital. Prearrest echocardiograms were performed on 613 patients (77%) at 11 +/- 14 days before the cardiac arrest. Outcomes among patients with normal or nearly normal prearrest LVEF (≥45%) were compared with those of patients with moderate or severe dysfunction (LVEF < 45%) by chi-square and logistic regression analyses. Survival to discharge was 19% in patients with normal or nearly normal LVEF compared with 8% in those with moderate or severe dysfunction (adjusted odds ratio, 4.8; 95% confidence interval, 2.3 to 9.9; P < 0.001) but did not differ with regard to sustained return of spontaneous circulation (59% versus 56%; P = 0.468) or 24-hour survival (39% versus 36%; P = 0.550). Postarrest echocardiograms were performed on 84 patients within 72 hours after the index cardiac arrest; the LVEF decreased 25% in those with normal or nearly normal prearrest LVEF (60 +/- 9% to 45 +/- 14%; P < 0.001) and decreased 26% in those with moderate or severe dysfunction (31 +/- 7% to 23 +/- 6%; P < 0.001). For all patients, prearrest beta-blocker treatment was associated with higher survival to discharge (33% versus 8%; adjusted odds ratio, 3.9; 95% confidence interval, 1.8 to 8.2; P < 0.001). Conclusions: Moderate and severe prearrest left ventricular systolic dysfunction was associated with substantially lower rates of survival to hospital discharge compared with normal or nearly normal function.
Wavelet correlation between subjects: A time-scale data driven analysis for brain mapping using fMRI
Abstract:
Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach which predicts the changes in the observed BOLD response based on an expected hemodynamic response function (HRF). When the task is cognitively complex, or in cases of disease, variations in HRF shape and/or delay may reduce the reliability of results. A novel exploratory method for fMRI data, which attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts or other confounding factors, is introduced in this paper. This new method is based on the fusion of correlation analysis and the discrete wavelet transform, to identify similarities in the time course of the BOLD signal across a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized human face pictures expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time-domain intersubject correlation analysis. (C) 2010 Elsevier B.V. All rights reserved.
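The abstract gives no implementation details, but the core idea of combining the discrete wavelet transform with intersubject correlation can be sketched in a few lines. This is a minimal illustration, not the authors' method: the Haar transform, the function names, and the toy signals are all assumptions made for the example.

```python
import numpy as np

def haar_dwt(x, levels):
    """Haar discrete wavelet transform; returns detail coefficients
    per scale, from fine to coarse."""
    details = []
    approx = np.asarray(x, dtype=float)
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))  # detail coefficients
        approx = (even + odd) / np.sqrt(2.0)         # approximation for next scale
    return details

def scalewise_correlation(x, y, levels):
    """Pearson correlation between two signals, computed scale by scale
    on their wavelet detail coefficients."""
    return [float(np.corrcoef(a, b)[0, 1])
            for a, b in zip(haar_dwt(x, levels), haar_dwt(y, levels))]

# Toy data: two "subjects" share a slow stimulus-locked component plus
# independent noise, mimicking intersubject BOLD similarity.
rng = np.random.default_rng(0)
t = np.arange(256)
common = np.sin(2 * np.pi * t / 64)
subj1 = common + 0.5 * rng.standard_normal(t.size)
subj2 = common + 0.5 * rng.standard_normal(t.size)

r = scalewise_correlation(subj1, subj2, levels=4)
# Coarse scales, which capture the shared slow component, correlate more
# strongly than the noise-dominated fine scales.
```

In practice one would use a dedicated wavelet library (e.g. PyWavelets) and a proper significance test across subjects; the hand-rolled Haar transform here only keeps the sketch dependency-free.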
Abstract:
The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid the temporal information loss incurred during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI) and, in turn, may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping.
By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
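The flavor of cluster-level Granger analysis can be conveyed with a simplified sketch: extract a few PCA eigen-time series per ROI, then compare the residuals of a restricted lagged model (the target ROI's own past) against a full model that adds the source ROI's past. This is an assumption-laden stand-in, not the paper's method — the actual CGA uses partial canonical correlation and bootstrap significance testing, which are omitted here, and all names (`top_pcs`, `cluster_granger_index`) are hypothetical.

```python
import numpy as np

def top_pcs(roi, k=2):
    """First k principal-component time series of an ROI
    (array of shape time x voxels)."""
    X = roi - roi.mean(axis=0)
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] * S[:k]  # PC scores, shape (time, k)

def lagged_rss(target, predictors, lag=1):
    """Residual sum of squares of a least-squares model predicting
    target(t) from predictors(t-1), ..., predictors(t-lag)."""
    T = target.shape[0]
    rows = [np.concatenate([predictors[t - l] for l in range(1, lag + 1)])
            for t in range(lag, T)]
    X = np.column_stack([np.ones(T - lag), np.asarray(rows)])
    Y = target[lag:]
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return float(((Y - X @ beta) ** 2).sum())

def cluster_granger_index(roi_a, roi_b, k=2, lag=1):
    """Log RSS ratio of the restricted model (B's own past) vs. the full
    model (B's past plus A's past); larger values suggest A -> B."""
    A, B = top_pcs(roi_a, k), top_pcs(roi_b, k)
    restricted = lagged_rss(B, B, lag)
    full = lagged_rss(B, np.hstack([B, A]), lag)
    return float(np.log(restricted / full))

# Toy data: a latent white-noise source reaches ROI B one step after ROI A,
# so A should Granger-cause B but not the reverse.
rng = np.random.default_rng(1)
s = rng.standard_normal(201)
a_sig, b_sig = s[1:], s[:-1]              # b(t) equals a(t-1)
roi_a = a_sig[:, None] + 0.1 * rng.standard_normal((200, 5))
roi_b = b_sig[:, None] + 0.1 * rng.standard_normal((200, 5))

idx_ab = cluster_granger_index(roi_a, roi_b)
idx_ba = cluster_granger_index(roi_b, roi_a)
# idx_ab comes out clearly larger than idx_ba.
```

Because the full model nests the restricted one, the index is always nonnegative; in a real analysis its significance would be judged against a bootstrap null distribution, as the abstract describes.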
Abstract:
The objective of this study was to evaluate the long-term outcomes of a single institution, Hospital Sírio-Libanês in São Paulo, Brazil, regarding the treatment of peritoneal carcinomatosis. Between October 2002 and October 2006, 46 consecutive patients were treated with radical cytoreduction and hyperthermic peritoneal chemotherapy. There were 21 patients with peritoneal surface malignancy (PSM) from colorectal origin (among whom 8 had an appendiceal primary), 15 with ovarian carcinomas, 2 with primary peritoneal mesotheliomas, and 8 with other cancers. The median age was 49 years (range 18-77 years). All patients were followed for a median of 20 months. Demographic data, tumor histology, the peritoneal carcinomatosis index (PCI), operative procedures (extension of resection, lymphadenectomy), and hyperthermic intraperitoneal chemotherapy (HIPEC) characteristics (drugs, temperature, duration) were prospectively recorded. Perioperative mortality and morbidity and the long-term outcome were assessed. Complete cytoreduction was achieved in 45 patients. The median PCI was 11, and the mean operating time was 17 h. There were no procedure-related deaths, but major morbidity was observed in 52% and included fistulas, abscesses, and hematologic complications. The overall Kaplan-Meier 4-year estimated survival was 56%. Among patients with PSM from colorectal carcinoma, the estimated 3-year survival was 70%. Nine (42%) patients had a recurrence, three with peritoneal disease. The median disease-free-interval was 16 months. The ovarian cancer patients had an estimated 4-year survival rate of 75% and median disease-free survival duration of 21 months. Cytoreductive surgery with HIPEC may improve survival of selected patients with peritoneal carcinomatosis, with acceptable morbidity.
Abstract:
Functional magnetic resonance imaging (fMRI) is currently one of the most widely used methods for studying human brain function in vivo. Although many different approaches to fMRI analysis are available, the most widely used methods employ so-called "mass-univariate" modeling of responses in a voxel-by-voxel fashion to construct activation maps. However, it is well known that many brain processes involve networks of interacting regions, and for this reason multivariate analyses might seem to be attractive alternatives to univariate approaches. The current paper focuses on one multivariate application of statistical learning theory: statistical discrimination maps (SDM) based on support vector machines, and seeks to establish some possible interpretations when the results differ from those of univariate approaches. In fact, when there are changes not only in the activation level between two conditions but also in functional connectivity, SDM seems more informative. We addressed this question using both simulations and applications to real data. We have shown that the combined use of univariate approaches and SDM yields significant new insights into brain activations not available using univariate methods alone. In an application to visual working memory fMRI data, we demonstrated that the interaction among brain regions plays a role in SDM's power to detect discriminative voxels. (C) 2008 Elsevier B.V. All rights reserved.
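The basic mechanics behind an SVM-based discrimination map — train a linear classifier on voxel patterns from two conditions and read the per-voxel weights as a map — can be sketched as follows. This is an illustrative toy under stated assumptions, not the paper's pipeline: the training routine, the data layout, and the names (`train_linear_svm`, `sdm`) are all invented for the example, and a real analysis would use a tested SVM library with cross-validation.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """Linear SVM trained by subgradient descent on the L2-regularized
    hinge loss. X: (samples, voxels); y: labels in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                      # margin-violating samples
        grad_w = lam * w - (X[viol] * y[viol][:, None]).sum(axis=0) / n
        grad_b = -y[viol].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy "fMRI" patterns: 40 trials x 20 voxels; condition A (+1) activates
# voxels 0 and 1, every other voxel is pure noise.
rng = np.random.default_rng(2)
X = rng.standard_normal((40, 20))
y = np.concatenate([np.ones(20), -np.ones(20)])
X[:20, :2] += 2.0

w, b = train_linear_svm(X, y)
sdm = np.abs(w)                     # discrimination map: |weight| per voxel
acc = float((np.sign(X @ w + b) == y).mean())
```

The informative voxels receive the largest absolute weights, which is exactly the property that lets the weight vector be displayed as a discrimination map; the abstract's point is that such weights can also reflect connectivity differences, not just activation-level differences.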
Abstract:
Background: The optimal interval between neoadjuvant chemoradiation therapy (CRT) and surgery in the treatment of patients with distal rectal cancer is controversial. The purpose of this study is to evaluate whether this interval has an impact on survival. Methods and Materials: Patients who underwent surgery after CRT were retrospectively reviewed. Patients with a sustained complete clinical response (cCR) 1 year after CRT were excluded from this study. Clinical and pathologic characteristics and overall and disease-free survival were compared between patients undergoing surgery 12 weeks or less from CRT and patients undergoing surgery longer than 12 weeks from CRT completion, and between patients with a surgery delay caused by a suspected cCR and those with a delay for other reasons. Results: Two hundred fifty patients underwent surgery, and 48.4% had CRT-to-surgery intervals of 12 weeks or less. There were no statistical differences in overall survival (86% vs. 81.6%) or disease-free survival rates (56.5% and 58.9%) between patients according to interval (<=12 vs. >12 weeks). Patients with intervals of 12 weeks or less had significantly higher rates of Stage III disease (34% vs. 20%; p = 0.009). The delay in surgery was caused by a suspected cCR in 23 patients (interval, 48 +/- 10.3 weeks). Five-year overall and disease-free survival rates for this subset were 84.9% and 51.6%, not significantly different from the remaining group (84%; p = 0.96 and 57.8%; p = 0.76, respectively). Conclusions: Delaying surgery for the evaluation of tumor response after neoadjuvant CRT is safe and does not negatively affect survival. These results support the hypothesis that shorter intervals may interrupt ongoing tumor necrosis. (C) 2008 Elsevier Inc.
Abstract:
Both hysterectomy and tubal sterilisation offer significant protection from ovarian cancer, and the risk of cardiovascular disease in women is lowered after hysterectomy. Since little is known about the accuracy of women's self-reports of these procedures, we assessed their reliability and validity using data obtained in a case-control study of ovarian cancer. There was 100 per cent repeatability for both positive and negative histories of hysterectomy and tubal sterilisation among a small sample of women on reinterview. Verification of surgery was sought against surgeons' or medical records, or if these were unavailable, from randomly selected current general practitioners for 51 cases and 155 controls reporting a hysterectomy and 73 cases and 137 controls reporting a tubal sterilisation. Validation rate for self-reported hysterectomy against medical reports (32 cases, 96 controls) was 96 per cent (95 per cent confidence interval (CI) 91 to 99) and for tubal sterilisation (32 cases, 77 controls) it was 88 per cent (CI 81 to 93), which is likely to be an underestimate. Although findings are based on small numbers of women for whom medical reports could be ascertained, they are consistent with other findings that suggest women have good recall of past histories of hysterectomy and tubal sterilisation; this allows long-term effects of these procedures to be studied with reasonable accuracy from self-reports.
Abstract:
Mitochondrial DNA (mtDNA) population data for forensic purposes are still scarce for some populations, which may limit the evaluation of forensic evidence, especially when the rarity of a haplotype needs to be determined in a database search. In order to improve the collection of mtDNA lineages from the Iberian and South American subcontinents, we here report the results of a collaborative study involving nine laboratories from the Spanish and Portuguese Speaking Working Group of the International Society for Forensic Genetics (GHEP-ISFG) and EMPOP. The individual laboratories contributed population data that were generated throughout the past 10 years but in the majority of cases had not been made available to the scientific community. A total of 1019 haplotypes from Iberia (Basque Country, 2 general Spanish populations, 2 North and 1 Central Portugal populations) and Latin America (3 populations from São Paulo) were collected, reviewed and harmonized according to defined EMPOP criteria. The majority of the data ambiguities found during the reviewing process (41 in total) were transcription errors, confirming that documentation is still the most error-prone stage in reporting mtDNA population data, especially when performed manually. This GHEP-EMPOP collaboration has significantly improved the quality of the individual mtDNA datasets and adds mtDNA population data as a valuable resource to the EMPOP database (www.empop.org). (C) 2010 Elsevier Ireland Ltd. All rights reserved.