969 results for Interval Data
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring without routine virological monitoring for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value, in the presence of immunological and clinical data, for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age <15 years, and continuous HAART for ≥6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load >5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte values, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness. To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
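The time-varying-predictor setup is the methodological core here: each repeated laboratory value is attached to the follow-up interval it covers. Below is a minimal sketch of such an analysis, assuming the Python lifelines library and invented toy data; it is not the authors' actual analysis pipeline.

```python
# Minimal sketch of a Cox model with time-varying laboratory predictors,
# using the Python "lifelines" library (not the authors' actual software).
# Data are in "long" format: one row per child per interval between lab
# visits; covariate values hold over [start, stop). Toy data for illustration.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.DataFrame({
    "id":         [1, 1, 2, 2, 3, 4, 4, 5, 6],
    "start":      [0, 6, 0, 6, 0, 0, 6, 0, 0],   # months on HAART
    "stop":       [6, 12, 6, 10, 7, 6, 14, 9, 5],
    "vl_gt_5000": [0, 1, 0, 1, 1, 0, 0, 0, 1],   # most recent VL >5000 c/mL
    "hemoglobin": [11.0, 10.2, 12.3, 11.0, 9.8, 11.5, 11.7, 10.6, 12.0],
    "event":      [0, 1, 0, 1, 0, 0, 1, 1, 0],   # new WHO stage 3/4 event
})

# CD4-based immunological status, age, and BMI would enter as extra columns.
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratios with 95% confidence intervals
```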
Abstract:
Pulmonary hypertension is an important cause of morbidity and mortality in patients with mitral stenosis who undergo cardiac surgery, especially in the postoperative period. The aim of this study was to test, in a randomized controlled study, the hypothesis that inhaled nitric oxide (iNO) would improve the hemodynamics and short-term clinical outcomes of patients with mitral stenosis and severe pulmonary hypertension who undergo cardiac surgery. Twenty-nine patients (4 men, 25 women; mean age 46 ± 2 years) were randomly allocated to receive iNO (n = 14) or oxygen (n = 15) for 48 hours immediately after surgery. Hemodynamic data, the use of vasoactive drugs, duration of stay, and short-term complications were assessed. No differences in baseline characteristics were observed between the groups. After 24 and 48 hours, patients receiving iNO had a significantly greater increase in cardiac index compared to patients receiving oxygen (p < 0.0001). Pulmonary vascular resistance was also reduced significantly more in patients receiving iNO versus oxygen (-117 dyne·s·cm(-5), 95% confidence interval 34 to 200, vs 40 dyne·s·cm(-5), 95% confidence interval 34 to 100, p = 0.005) at 48 hours. Patients in the iNO group used fewer systemic vasoactive drugs (mean 2.1 ± 0.14 vs 2.6 ± 0.16, p = 0.046) and had a shorter intensive care unit stay (median 2 days, interquartile range 0.25, vs median 3 days, interquartile range 7, p = 0.02). In conclusion, iNO immediately after surgery in patients with mitral stenosis and severe pulmonary hypertension improves hemodynamics and may have short-term clinical benefits. (Am J Cardiol 2011;107:1040-1045)
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of a Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for following vital signs and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can also cause dangerous user desensitization. The objective of this study was to describe the development of software that integrates intraoperative monitoring parameters to generate "smart alerts" that can support decision making, as well as indicate possible diagnoses and treatments. Methods: A system was designed that allows flexible definition of alerts, combining the individual alarms of the monitored parameters into a more elaborate alert. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system was tested with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of the monitored parameters, checking for the occurrence of the smart alerts defined in the user interface. Results: With this system, a potential 92% reduction in alarms was observed. In most situations that did not generate smart alerts, the individual alarms did not represent risk to the patient. Conclusions: Software implementation can integrate monitored data and generate information, such as possible diagnoses or interventions. A substantial potential reduction in the number of alarms during surgery was observed. The information displayed by the system can often be more useful than analysis of isolated parameters.
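To make the idea concrete, a composite "smart alert" can be expressed as a rule over several simultaneously monitored parameters rather than as independent per-parameter alarms. The sketch below is illustrative only; the parameter names, thresholds, and example rules are assumptions, not the rules implemented by the authors.

```python
# Illustrative sketch of composite "smart alerts": individual thresholds on
# monitored parameters are combined into one clinically meaningful alert.
# Thresholds and rules here are assumptions, not the authors' actual rules.
from dataclasses import dataclass

@dataclass
class Sample:
    """One time point of intraoperative monitoring data."""
    heart_rate: int   # beats/min
    systolic_bp: int  # mm Hg
    spo2: int         # %

def possible_hypovolemia(s: Sample) -> bool:
    # Fire only when hypotension AND tachycardia coincide, instead of
    # raising two separate (and individually less informative) alarms.
    return s.systolic_bp < 90 and s.heart_rate > 110

def possible_hypoxemia(s: Sample) -> bool:
    return s.spo2 < 92

RULES = {
    "possible hypovolemia - consider volume status": possible_hypovolemia,
    "possible hypoxemia - check airway/FiO2": possible_hypoxemia,
}

def evaluate(sample: Sample) -> list[str]:
    """Return the smart alerts triggered by one monitoring sample."""
    return [msg for msg, rule in RULES.items() if rule(sample)]

print(evaluate(Sample(heart_rate=118, systolic_bp=84, spo2=97)))
# ['possible hypovolemia - consider volume status']
```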
Abstract:
Objective: To illustrate methodological issues involved in estimating dietary trends in populations using data obtained from various sources in Australia in the 1980s and 1990s. Methods: Estimates of absolute and relative change in consumption of selected food items were calculated using national data published annually on the national food supply for 1982-83 to 1992-93 and responses to food frequency questions in two population-based risk factor surveys in 1983 and 1994 in the Hunter Region of New South Wales, Australia. The validity of the estimated food quantities obtained from these inexpensive sources at the beginning of the period was assessed by comparison with data from a national dietary survey conducted in 1983 using 24-h recall. Results: Trend estimates from the food supply data and the risk factor survey data were in good agreement for increases in consumption of fresh fruit, vegetables and breakfast food and decreases in butter, margarine, sugar and alcohol. Estimates for trends in milk, egg and bread consumption, however, were inconsistent. Conclusions: Both data sources can be used for monitoring progress towards national nutrition goals based on selected food items, provided that some limitations are recognized. While data collection methods should be consistent over time, they also need to allow for changes in the food supply (for example, the introduction of new varieties such as low-fat dairy products). From time to time, the trends derived from these inexpensive data sources should be compared with more detailed and quantitative estimates of dietary intake.
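The quantities being compared are simple: absolute change is the difference between the end and start estimates, while relative change scales that difference by the starting value, which is what makes trends comparable across sources that report in different units. A small sketch with invented numbers:

```python
# Absolute vs. relative change in consumption between two time points, as
# used when comparing trend estimates from different data sources.
# The example quantities are invented for illustration.
def absolute_change(q_start: float, q_end: float) -> float:
    return q_end - q_start

def relative_change(q_start: float, q_end: float) -> float:
    """Percentage change relative to the starting value."""
    return 100.0 * (q_end - q_start) / q_start

# Two sources may disagree on absolute quantities yet agree on the trend:
supply = relative_change(80, 92)    # food supply data, kg/head/year
survey = relative_change(2.1, 2.4)  # survey data, servings/day
print(round(supply, 1), round(survey, 1))  # 15.0 vs 14.3 -> similar trend
```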
Abstract:
In this study, blood serum trace elements and biochemical and hematological parameters were measured to assess the health status of an elderly population residing in the city of São Paulo, SP, Brazil. More than 93% of the studied individuals presented most serum trace element concentrations and hematological and biochemical values within the reference ranges used in clinical laboratories. However, the percentage of the elderly presenting recommended low-density lipoprotein (LDL) cholesterol concentrations was low (70%). The study indicated a positive correlation between the concentrations of Zn and LDL cholesterol (p < 0.06).
Abstract:
Data on headache prevalence and characteristics among elderly people are scarce. The aim was to carry out a cross-sectional study to determine the 1-year prevalence of tension-type and migraine headaches in people >65 years old in the city of São Paulo, Brazil. All 1615 people living in the study catchment area who agreed to participate answered a questionnaire based on the International Headache Society criteria. The prevalence (mean and 95% confidence interval) of any type of headache in the last year was 45.6% (43.2, 48.0). The prevalence of tension-type headache in the last year was 33.1% (30.8, 35.4): 28.1% (24.6, 31.6) for men and 36.4% (33.4, 39.4) for women; for migraine headaches, the prevalence in the last year was 10.6% (9.1, 12.1): 5.1% (3.4, 6.8) for men and 14.1% (11.9, 16.3) for women. One-year prevalence rates of headaches, and especially of migraine headaches, are very high among the elderly in Brazil.
Abstract:
The purpose of this study was to evaluate outcomes such as success of the initial therapy, failure of outpatient treatment, and death during outpatient intravenous antimicrobial therapy in patients with febrile neutropenia (FN) and hematological malignancies. In addition, clinical and laboratory data and the Multinational Association for Supportive Care in Cancer (MASCC) index were compared with failure of outpatient treatment and death. In a retrospective study, we evaluated FN episodes following chemotherapy that were treated initially with cefepime, with or without teicoplanin, replaced by levofloxacin after 48 h of defervescence in patients in good general condition and with ANC > 500/mm(3). Of the 178 FN episodes that occurred in 126 patients, we observed success of the initial therapy in 63.5% of the events, failure of outpatient treatment in 20.8%, and death in 6.2%. The success rate of oral levofloxacin after defervescence was 99% (95 out of 96). Using multivariate analysis, significant risks for failure of outpatient treatment were smoking (odds ratio (OR) 3.14, confidence interval (CI) 1.14-8.66; p = 0.027) and serum creatinine levels > 1.2 mg/dL (OR 7.97, CI 2.19-28.95; p = 0.002). With regard to death, the risk found was oxygen saturation by pulse oximetry < 95% (OR 5.8, CI 1.50-22.56; p = 0.011). Using the MASCC index, 165 events were classified as low risk and 13 as high risk. Failure of outpatient treatment was reported in seven (53.8%) high-risk and 30 (18.2%) low-risk episodes (p = 0.006). In addition, death occurred in seven (4.2%) low-risk and four (30.8%) high-risk events (p = 0.004). Our results show that the MASCC index was able to identify patients at high risk. In addition, non-smoking, serum creatinine levels ≤1.2 mg/dL, and oxygen saturation by pulse oximetry ≥95% were protective factors.
Abstract:
Ninety-one consecutive systemic lupus erythematosus (SLE) patients (American College of Rheumatology criteria) with a history of cutaneous vasculitis were compared with 163 SLE controls without this clinical manifestation, from July to December 2007, in order to determine possible clinical and serological associations of this manifestation. Data were obtained from an ongoing electronic database protocol, and autoantibodies to double-stranded DNA, Sm, RNP, Ro/SS-A, and La/SS-B, as well as anticardiolipin and anti-ribosomal P protein (anti-P) antibodies, were detected by standard techniques. Exclusion criteria were the presence of antiphospholipid syndrome or antibodies, Sjögren syndrome, and a history of thrombosis. The mean age (38.5 ± 11.5 vs. 37.8 ± 11.6 years, p = 0.635), disease duration (12.5 ± 7.8 vs. 11.8 ± 7.9 years, p = 0.501), and frequency of white race (71.4% vs. 70.5%, p = 0.872) and female sex (96.8% vs. 93.7%, p = 0.272) were comparable in both groups. The vasculitis group had a higher frequency of malar rash (97.9% vs. 87.4%, p = 0.004), photosensitivity (91.4% vs. 81.6%, p = 0.030), and Raynaud phenomenon (RP; 27.7% vs. 7.5%, p < 0.001), whereas all other clinical manifestations, including renal and central nervous system involvement, were similar to the control group. Laboratory data revealed that only anti-P (35.1% vs. 12.1%, p < 0.001) was more frequent in patients with vasculitis. In a multivariate logistic regression model, cutaneous vasculitis was associated with the presence of RP (OR = 3.70; 95% confidence interval [CI] = 1.73-8.00) and anti-P (OR = 3.42; 95% CI = 1.76-6.66). In summary, SLE cutaneous vasculitis characterizes a subgroup of patients with more RP and anti-P antibodies but not a higher frequency of renal and central nervous system involvement.
Abstract:
Background: This study compared the 10-year follow-up of percutaneous coronary intervention (PCI), coronary artery bypass surgery (CABG), and medical treatment (MT) in patients with multivessel coronary artery disease, stable angina, and preserved ventricular function. Methods and Results: The primary end points were overall mortality, Q-wave myocardial infarction, or refractory angina that required revascularization. All data were analyzed according to the intention-to-treat principle. At a single institution, 611 patients were randomly assigned to CABG (n = 203), PCI (n = 205), or MT (n = 203). The 10-year survival rates were 74.9% with CABG, 75.1% with PCI, and 69% with MT (P = 0.089). The 10-year rates of myocardial infarction were 10.3% with CABG, 13.3% with PCI, and 20.7% with MT (P < 0.010). The 10-year rates of additional revascularization were 7.4% with CABG, 41.9% with PCI, and 39.4% with MT (P < 0.001). Relative to the composite end point, Cox regression analysis showed a higher incidence of primary events with MT than with CABG (hazard ratio 2.35, 95% confidence interval 1.78 to 3.11) and with PCI than with CABG (hazard ratio 1.85, 95% confidence interval 1.39 to 2.47). Furthermore, the 10-year rates of freedom from angina were 64% with CABG, 59% with PCI, and 43% with MT (P < 0.001). Conclusions: Compared with CABG, MT was associated with a significantly higher incidence of subsequent myocardial infarction, a higher rate of additional revascularization, a higher incidence of cardiac death, and consequently a 2.29-fold increased risk of combined events. PCI was associated with an increased need for further revascularization, a higher incidence of myocardial infarction, and a 1.46-fold increased risk of combined events compared with CABG. Additionally, CABG was better than MT at eliminating anginal symptoms.
Abstract:
Exercise training has an important role in the prevention and treatment of hypertension, but its effects on the early metabolic and hemodynamic abnormalities observed in normotensive offspring of hypertensive parents (FH+) have not been studied. We compared high-intensity aerobic interval training (AIT) and moderate-intensity continuous exercise training (CMT) with regard to hemodynamic, metabolic and hormonal variables in FH+ subjects. Forty-four healthy FH+ women (25.0 ± 4.4 years) randomized to control (ConFH+) or to a three-times-per-week equal-volume AIT (80-90% of VO2max) or CMT (50-60% of VO2max) regimen, and 15 healthy women with normotensive parents (ConFH-; 25.3 ± 3.1 years), had their hemodynamic, metabolic and hormonal variables analyzed at baseline and after 16 weeks of follow-up. Ambulatory blood pressure (ABP), glucose and cholesterol levels were similar among all groups, but the FH+ groups showed higher insulin, insulin sensitivity, carotid-femoral pulse wave velocity (PWV), norepinephrine and endothelin-1 (ET-1) levels and lower nitrite/nitrate (NOx) levels than ConFH- subjects. AIT and CMT were equally effective in improving ABP (P<0.05) and insulin and insulin sensitivity (P<0.001); however, AIT was superior in improving cardiorespiratory fitness (15 vs. 8%; P<0.05), PWV (P<0.01), and the BP, norepinephrine, ET-1 and NOx responses to exercise (P<0.05). Exercise intensity was an important factor in improving cardiorespiratory fitness and reversing the hemodynamic, metabolic and hormonal alterations involved in the pathophysiology of hypertension. These findings may have important implications for exercise training programs used for the prevention of inherited hypertensive disorder. Hypertension Research (2010) 33, 836-843; doi:10.1038/hr.2010.72; published online 7 May 2010
Abstract:
Exercise is an effective intervention for treating hypertension and arterial stiffness, but little is known about which exercise modality is the most effective in reducing arterial stiffness and blood pressure in hypertensive subjects. Our purpose was to evaluate the effect of continuous vs. interval exercise training on arterial stiffness and blood pressure in hypertensive patients. Sixty-five patients with hypertension were randomized to 16 weeks of continuous exercise training (n=26), interval training (n=26) or a sedentary routine (n=13). The training was conducted in two 40-min sessions a week. Assessment of arterial stiffness by carotid-femoral pulse wave velocity (PWV) measurement and 24-h ambulatory blood pressure monitoring (ABPM) was performed before and after the 16 weeks of training. At the end of the study, ABPM blood pressure had declined significantly only in the subjects with higher basal values and was independent of training modality. PWV had declined significantly only after interval training, from 9.44 ± 0.91 to 8.90 ± 0.96 m/s, P=0.009 (continuous: from 10.15 ± 1.66 to 9.98 ± 1.81 m/s, P = NS; control: from 10.23 ± 1.82 to 10.53 ± 1.97 m/s, P = NS). Continuous and interval exercise training were beneficial for blood pressure control, but only interval training reduced arterial stiffness in treated hypertensive subjects. Hypertension Research (2010) 33, 627-632; doi:10.1038/hr.2010.42; published online 9 April 2010
Abstract:
Background: Despite antihypertensive therapy, it is difficult to maintain optimal systemic blood pressure (BP) values in hypertensive patients (HPT). Exercise may reduce BP in untreated HPT, but evidence regarding its effect during long-term antihypertensive therapy is lacking. Our purpose was to evaluate the acute effects of 40 minutes of continuous (CE) or interval exercise (IE) on cycle ergometers on the BP of long-term treated HPT. Methods: Fifty-two treated HPT were randomized to CE (n=26) or IE (n=26) protocols. CE was performed at 60% of reserve heart rate (HR). IE alternated consecutively 2 min at 50% of reserve HR with 1 min at 80%. Two 24-h ambulatory BP monitoring sessions were performed after exercise (postexercise) or after a non-exercise control period (control), in random order. Results: CE reduced mean 24-h systolic (S) BP (2.6 ± 6.6 mm Hg, p = 0.05) and diastolic (D) BP (2.3 ± 4.6, p = 0.01), and nighttime SBP (4.8 ± 6.4, p < 0.001) and DBP (4.6 ± 5.2 mm Hg, p = 0.001). IE reduced 24-h SBP (2.8 ± 6.5, p = 0.03) and nighttime SBP (3.4 ± 7.2, p = 0.02), and tended to reduce nighttime DBP (p = 0.06). Greater reductions occurred at higher BP levels. The percentage of normal ambulatory BP values increased after CE (24-h: 42% to 54%; daytime: 42% to 61%; nighttime: 61% to 69%) and IE (24-h: 31% to 46%; daytime: 54% to 61%; nighttime: 46% to 69%). Conclusion: CE and IE reduced ambulatory BP in treated HPT, increasing the number of patients reaching normal ambulatory BP values. These effects suggest that continuous and interval aerobic exercise may have a role in BP management in treated HPT.
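Prescriptions expressed as a percentage of reserve HR map to absolute targets via the Karvonen formula: target = resting HR + fraction × (maximal HR − resting HR). A small sketch with an invented example patient, assuming that standard formula is what underlies the protocol:

```python
# Target heart rates for the continuous (CE) and interval (IE) protocols,
# assuming the standard Karvonen (heart rate reserve) formula.
# The example resting/maximal HRs are invented for illustration.
def karvonen(hr_rest: int, hr_max: int, fraction: float) -> float:
    """Target HR at a given fraction of heart rate reserve."""
    return hr_rest + fraction * (hr_max - hr_rest)

hr_rest, hr_max = 70, 160  # hypothetical treated hypertensive patient

print(f"CE target (60%):      {karvonen(hr_rest, hr_max, 0.60):.0f} bpm")
print(f"IE low  (50%, 2 min): {karvonen(hr_rest, hr_max, 0.50):.0f} bpm")
print(f"IE high (80%, 1 min): {karvonen(hr_rest, hr_max, 0.80):.0f} bpm")
```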
Abstract:
Background: Chagas' disease is the illness caused by the protozoan Trypanosoma cruzi, and it is still endemic in Latin America. Heart transplantation is a therapeutic option for patients with end-stage Chagas' cardiomyopathy. Nevertheless, reactivation may occur after transplantation, leading to higher morbidity and graft dysfunction. This study aimed to identify risk factors for Chagas' disease reactivation episodes. Methods: This investigation is a retrospective cohort study of all Chagas' disease heart transplant recipients from September 1985 through September 2004. Clinical, microbiologic and histopathologic data were reviewed. Statistical analysis was performed with SPSS (version 13) software. Results: Sixty-four (21.9%) patients with chronic Chagas' disease underwent heart transplantation during the study period. Seventeen patients (26.5%) had at least one episode of Chagas' disease reactivation, and univariate analysis identified the number of rejection episodes (p = 0.013) and the development of neoplasms (p = 0.040) as factors associated with Chagas' disease reactivation episodes. Multivariate analysis showed that the number of rejection episodes (hazard ratio = 1.31; 95% confidence interval [CI]: 1.06 to 1.62; p = 0.011), neoplasms (hazard ratio = 5.07; 95% CI: 1.49 to 17.20; p = 0.009) and use of mycophenolate mofetil (hazard ratio = 3.14; 95% CI: 1.00 to 9.84; p = 0.049) are independent determinants of reactivation after transplantation. Age (p = 0.88), male gender (p = 0.15), presence of rejection (p = 0.17), cytomegalovirus infection (p = 0.79) and mortality after hospital discharge (p = 0.15) showed no statistically significant difference. Conclusions: Our data suggest that events resulting in greater immunosuppression contribute to Chagas' disease reactivation episodes after heart transplantation and should alert physicians to make an early diagnosis and perform pre-emptive therapy. Although reactivation led to a high rate of morbidity, a low mortality risk was observed.
Abstract:
Background: The effect of prearrest left ventricular ejection fraction (LVEF) on outcome after cardiac arrest is unknown. Methods and Results: During a 26-month period, Utstein-style data were prospectively collected on 800 consecutive inpatient adult index cardiac arrests in an observational, single-center study at a tertiary cardiac care hospital. Prearrest echocardiograms were performed on 613 patients (77%) at 11 ± 14 days before the cardiac arrest. Outcomes among patients with normal or nearly normal prearrest LVEF (≥45%) were compared with those of patients with moderate or severe dysfunction (LVEF < 45%) by chi-square and logistic regression analyses. Survival to discharge was 19% in patients with normal or nearly normal LVEF compared with 8% in those with moderate or severe dysfunction (adjusted odds ratio, 4.8; 95% confidence interval, 2.3 to 9.9; P < 0.001) but did not differ with regard to sustained return of spontaneous circulation (59% versus 56%; P = 0.468) or 24-hour survival (39% versus 36%; P = 0.550). Postarrest echocardiograms were performed on 84 patients within 72 hours after the index cardiac arrest; the LVEF decreased 25% in those with normal or nearly normal prearrest LVEF (60 ± 9% to 45 ± 14%; P < 0.001) and decreased 26% in those with moderate or severe dysfunction (31 ± 7% to 23 ± 6%, P < 0.001). For all patients, prearrest beta-blocker treatment was associated with higher survival to discharge (33% versus 8%; adjusted odds ratio, 3.9; 95% confidence interval, 1.8 to 8.2; P < 0.001). Conclusions: Moderate and severe prearrest left ventricular systolic dysfunction was associated with substantially lower rates of survival to hospital discharge compared with normal or nearly normal function.
Wavelet correlation between subjects: A time-scale data-driven analysis for brain mapping using fMRI
Abstract:
Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach that predicts the changes in the observed BOLD response based on an expected hemodynamic response function (HRF). When the task is cognitively complex, or in the presence of disease, variations in the shape and/or delay of the response may reduce the reliability of the results. This paper introduces a novel exploratory method for fMRI data that attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts and other confounding factors. The method is based on the fusion of correlation analysis and the discrete wavelet transform, identifying similarities in the time course of the BOLD signal across a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized human face pictures expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time-domain intersubject correlation analysis.
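The core computation (decompose each subject's BOLD time course with a discrete wavelet transform, then correlate the coefficients between subjects scale by scale) can be sketched as follows. This uses PyWavelets on synthetic data and is a simplification of the authors' actual procedure, not their implementation:

```python
# Sketch of scale-wise intersubject wavelet correlation: DWT each subject's
# BOLD time course, then correlate the coefficients between every pair of
# subjects at each scale. Synthetic data; a simplification of the method.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n_subjects, n_timepoints, level = 8, 256, 4

# Synthetic BOLD series: a shared stimulus-locked component plus noise.
shared = np.sin(2 * np.pi * np.arange(n_timepoints) / 32)
bold = shared + 0.8 * rng.standard_normal((n_subjects, n_timepoints))

# Multilevel DWT per subject: wavedec returns [cA4, cD4, cD3, cD2, cD1].
coeffs = [pywt.wavedec(ts, "db4", level=level) for ts in bold]

for scale in range(level + 1):
    # Coefficients at this scale for all subjects: (n_subjects, n_coeffs).
    c = np.vstack([subj[scale] for subj in coeffs])
    r = np.corrcoef(c)  # pairwise intersubject correlation matrix
    mean_r = r[np.triu_indices(n_subjects, k=1)].mean()
    print(f"scale {scale}: mean intersubject r = {mean_r:+.2f}")
```

Scales at which the stimulus-locked component dominates will show high mean intersubject correlation, while noise-dominated scales will not; this is the time-scale separation the method exploits.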