970 results for Multivariate data
Abstract:
Unauthorized access to digital content is a serious threat to international security and informatics. We propose an offline oblivious data distribution framework that preserves the sender's security and the receiver's privacy using tamper-proof smart cards. This framework provides persistent content protection from digital piracy and promises private content consumption.
Abstract:
Background: Depth of tumor invasion (T-category) and the number of metastatic lymph nodes (N-category) are the most important prognostic factors in patients with gastric cancer. Recently, the ratio between metastatic and dissected lymph nodes (N-ratio) has been proposed as an additional prognostic factor. The aim of this study was to evaluate the impact of N-ratio and its interaction with N-category as a prognostic factor in gastric cancer. Methods: This was a retrospective study in which we reviewed the clinical and pathological data of 165 patients who had undergone curative surgery at our institution over a 9-year period. The exclusion criteria included metastases, gastric stump tumors, and gastrectomy with fewer than 15 lymph nodes dissected. Results: The median age of the patients was 63 years and most were male. Total gastrectomy was the most common procedure, and 92.1% of the patients had a D2-lymphadenectomy. Their 5-year overall survival was 57.7%. On univariate analysis, T-category, N-category, extended gastrectomy, and N-ratio were prognostic factors for overall and disease-free survival. Within TNM staging, N1 patients classified as NR1 had a 5-year survival of 75.5%, whereas only 33% of the NR2 group survived 5 years. In the multivariate analysis, the interaction between N-category and N-ratio was an independent prognostic factor. Conclusion: Our findings confirm the role of N-ratio as a prognostic factor for survival in patients with gastric cancer surgically treated with at least 15 lymph nodes dissected. The interaction between N-category and N-ratio is a better predictor of survival than lymph node metastasis staging alone. (C) 2010 Elsevier Ltd. All rights reserved.
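The N-ratio described in this abstract is simply the count of metastatic lymph nodes divided by the count of dissected lymph nodes. A minimal sketch of the computation follows; note that the abstract does not define the NR1/NR2 cutoffs, so the category thresholds below are illustrative assumptions only.

```python
def n_ratio(metastatic_nodes: int, dissected_nodes: int) -> float:
    """N-ratio: metastatic lymph nodes / dissected lymph nodes."""
    if dissected_nodes < 15:
        # The study excluded gastrectomies with fewer than 15 dissected nodes.
        raise ValueError("fewer than 15 dissected lymph nodes")
    return metastatic_nodes / dissected_nodes

def n_ratio_category(ratio: float) -> str:
    """Illustrative NR grouping; the cutoffs are assumptions,
    not taken from the abstract."""
    if ratio == 0:
        return "NR0"
    if ratio <= 0.25:
        return "NR1"
    return "NR2"
```

For example, 5 metastatic nodes out of 20 dissected gives an N-ratio of 0.25.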
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of a Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for following vital signs and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can cause dangerous user desensitization. The objective of this study was to describe the development of specific software to integrate intraoperative monitoring parameters, generating "smart alerts" that can support decision making and indicate possible diagnoses and treatment. Methods: A system was designed that allows flexibility in the definition of alerts, combining individual alarms of the monitored parameters to generate a more elaborate alert system. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system was tested with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of monitored parameters, verifying the occurrence of smart alerts defined in the user interface. Results: With this system, a potential 92% reduction in alarms was observed. In most situations that did not generate alerts, the individual alarms did not represent risk to the patient. Conclusions: Software implementation can integrate monitored data and generate information, such as possible diagnoses or interventions. A substantial potential reduction in the number of alarms during surgery was observed. The information displayed by the system can often be more useful than analysis of isolated parameters.
Abstract:
The classification rules of linear discriminant analysis are defined by the true mean vectors and the common covariance matrix of the populations from which the data come. Because these true parameters are generally unknown, they are commonly estimated by the sample mean vector and covariance matrix of the data in a training sample randomly drawn from each population. However, these sample statistics are notoriously susceptible to contamination by outliers, a problem compounded by the fact that the outliers may be invisible to conventional diagnostics. High-breakdown estimation is a procedure designed to remove this cause for concern by producing estimates that are immune to serious distortion by a minority of outliers, regardless of their severity. In this article we motivate and develop a high-breakdown criterion for linear discriminant analysis and give an algorithm for its implementation. The procedure is intended to supplement rather than replace the usual sample-moment methodology of discriminant analysis either by providing indications that the dataset is not seriously affected by outliers (supporting the usual analysis) or by identifying apparently aberrant points and giving resistant estimators that are not affected by them.
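The plug-in classification rule the abstract refers to can be sketched with sample moments; a high-breakdown variant would substitute robust location and scatter estimates (e.g., the minimum covariance determinant) at the marked step. This is a minimal illustration with assumed two-class data, not the authors' algorithm.

```python
import numpy as np

def lda_fit(groups):
    """Plug-in LDA: per-class sample means and a pooled covariance matrix.
    A high-breakdown variant would replace these sample moments with
    robust estimates (e.g., MCD) at this step."""
    means = [g.mean(axis=0) for g in groups]
    n = sum(len(g) for g in groups)
    k = len(groups)
    pooled = sum((len(g) - 1) * np.cov(g, rowvar=False) for g in groups) / (n - k)
    precision = np.linalg.inv(pooled)
    return means, precision

def lda_classify(x, means, precision):
    """Assign x to the class with the largest linear discriminant score
    (equal priors assumed for simplicity)."""
    scores = [x @ precision @ m - 0.5 * m @ precision @ m for m in means]
    return int(np.argmax(scores))
```

With two well-separated clusters, a point near the first class mean is assigned to class 0 and a point near the second to class 1; contaminating the training sample with a few extreme outliers is exactly what distorts the sample moments and motivates the robust substitution.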
Abstract:
Objective: To illustrate methodological issues involved in estimating dietary trends in populations using data obtained from various sources in Australia in the 1980s and 1990s. Methods: Estimates of absolute and relative change in consumption of selected food items were calculated using national data published annually on the national food supply for 1982-83 to 1992-93 and responses to food frequency questions in two population based risk factor surveys in 1983 and 1994 in the Hunter Region of New South Wales, Australia. The validity of estimated food quantities obtained from these inexpensive sources at the beginning of the period was assessed by comparison with data from a national dietary survey conducted in 1983 using 24 h recall. Results: Trend estimates from the food supply data and risk factor survey data were in good agreement for increases in consumption of fresh fruit, vegetables and breakfast food and decreases in butter, margarine, sugar and alcohol. Estimates for trends in milk, eggs and bread consumption, however, were inconsistent. Conclusions: Both data sources can be used for monitoring progress towards national nutrition goals based on selected food items provided that some limitations are recognized. While data collection methods should be consistent over time they also need to allow for changes in the food supply (for example the introduction of new varieties such as low-fat dairy products). From time to time the trends derived from these inexpensive data sources should be compared with data derived from more detailed and quantitative estimates of dietary intake.
Abstract:
In this study, blood serum trace elements and biochemical and hematological parameters were obtained to assess the health status of an elderly population residing in São Paulo city, SP, Brazil. The results showed that more than 93% of the studied individuals presented most serum trace element concentrations and hematological and biochemical data within the reference values used in clinical laboratories. However, the percentage of elderly subjects presenting recommended low-density lipoprotein (LDL) cholesterol concentrations was low (70%). The study indicated a positive correlation between the concentrations of Zn and LDL-cholesterol (p < 0.06).
Abstract:
Objective. To determine pregnancy outcome and fetal loss risk factors in patients with juvenile systemic lupus erythematosus (JSLE). Methods. A total of 315 female patients with JSLE followed in 12 Brazilian pediatric rheumatology centers were consecutively selected. Menarche was observed in 298 (94.6%) patients. Patients' medical records were reviewed for pregnancy outcomes and demographic, clinical, and therapeutic data. Results. A total of 24 unplanned pregnancies occurred in 298 (8%) patients. The outcomes were 5 (21%) early fetal losses (prior to 16 wks gestation), 18 (75%) live births, and 1 (4%) death due to preeclampsia and premature birth. The frequencies of active diffuse proliferative glomerulonephritis, proteinuria >= 0.5 g/day, and arterial hypertension at the beginning of pregnancy were higher in pregnancies resulting in fetal losses than in live births [60% vs 5% (p = 0.02), 60% vs 5% (p = 0.02), 60% vs 5% (p = 0.02), respectively]. JSLE pregnancies with fetal losses had a significantly higher mean SLE Disease Activity Index 2000 (SLEDAI-2K) at the start of pregnancy compared with those with live births (9.40 +/- 7.47 vs 3.94 +/- 6.00; p = 0.049). Four pregnancies were inadvertently exposed to intravenous cyclophosphamide therapy for renal involvement despite contraceptive prescriptions, resulting in fetal loss in 3 (p = 0.02). In multivariate analysis, only intravenous cyclophosphamide use at the start of pregnancy (OR 25.50, 95% CI 1.72-377.93, p = 0.019) remained an independent risk factor for fetal loss. Conclusion. We identified immunosuppressive therapy as the major contributing factor for fetal loss in JSLE, reinforcing the importance of contraception.
Abstract:
The purpose of this study was to evaluate outcomes such as success of the initial therapy, failure of outpatient treatment, and death during outpatient intravenous antimicrobial therapy in patients with febrile neutropenia (FN) and hematological malignancies. In addition, clinical and laboratory data and the Multinational Association for Supportive Care in Cancer (MASCC) index were compared with failure of outpatient treatment and death. In a retrospective study, we evaluated FN episodes following chemotherapy that were treated initially with cefepime, with or without teicoplanin, replaced by levofloxacin after 48 h of defervescence in patients in good general condition with ANC > 500/mm(3). Of the 178 FN episodes that occurred in 126 patients, we observed success of the initial therapy in 63.5% of the events, failure of outpatient treatment in 20.8%, and death in 6.2%. The success rate of oral levofloxacin after defervescence was 99% (95 out of 96). Using multivariate analysis, significant risks of failure of outpatient treatment were found to be smoking (odds ratio (OR) 3.14, confidence interval (CI) 1.14-8.66; p = 0.027) and serum creatinine levels > 1.2 mg/dL (OR 7.97, CI 2.19-28.95; p = 0.002). With regard to death, the risk found was oxygen saturation by pulse oximetry < 95% (OR 5.8, CI 1.50-22.56; p = 0.011). Using the MASCC index, 165 events were classified as low risk and 13 as high risk. Failure of outpatient treatment was reported in seven (53.8%) high-risk and 30 (18.2%) low-risk episodes (p = 0.006). In addition, death occurred in seven (4.2%) low-risk and four (30.8%) high-risk events (p = 0.004). Our results show that the MASCC index was able to identify high-risk patients. In addition, non-smoking, serum creatinine levels <= 1.2 mg/dL, and oxygen saturation by pulse oximetry >= 95% were protective factors.
Abstract:
Ninety-one consecutive systemic lupus erythematosus (SLE) patients (American College of Rheumatology criteria) with a history of cutaneous vasculitis were compared to 163 SLE controls without this clinical manifestation from July to December 2007 in order to determine the possible clinical and serological associations of this manifestation. Data were obtained in an ongoing electronic database protocol, and autoantibodies to anti-double-stranded DNA, anti-Sm, anti-RNP, anti-Ro/SS-A, anti-La/SS-B, and anticardiolipin and ribosomal P protein antibody (anti-P) were detected by standard techniques. Exclusion criteria were the presence of anti-phospholipid syndrome or antibodies, Sjogren syndrome, and a history of thrombosis. The mean age (38.5 +/- 11.5 vs. 37.8 +/- 11.6 years, p = 0.635), disease duration (12.5 +/- 7.8 vs. 11.8 +/- 7.9 years, p = 0.501), and frequency of white race (71.4% vs. 70.5%, p = 0.872) and female sex (96.8% vs. 93.7%, p = 0.272) were comparable in both groups. The vasculitis group had a higher frequency of malar rash (97.9% vs. 87.4%, p = 0.004), photosensitivity (91.4% vs. 81.6%, p = 0.030), and Raynaud phenomenon (RP; 27.7% vs. 7.5%, p < 0.001), whereas all other clinical manifestations, including renal and central nervous system involvement, were similar to the control group. Laboratory data revealed that only anti-P (35.1% vs. 12.1%, p < 0.001) was more frequent in patients with vasculitis. In a multivariate logistic regression model, cutaneous vasculitis was associated with the presence of RP (OR = 3.70; 95% confidence interval [CI] = 1.73-8.00) and anti-P (OR = 3.42; 95% CI = 1.76-6.66). In summary, SLE cutaneous vasculitis characterizes a subgroup of patients with more RP and anti-P antibodies but not accompanied by a higher frequency of renal and central nervous system involvement.
Abstract:
Objective: To document outcome and to investigate patterns of physical and psychosocial recovery in the first year following severe traumatic brain injury (TBI) in an Australian patient sample. Design: A longitudinal prospective study of a cohort of patients, with data collection at 3, 6, 9, and 12 months post injury. Setting: A head injury rehabilitation unit in a large metropolitan public hospital. Patients: A sample of 55 patients selected from 120 consecutive admissions with severe TBI. Patients who were more than 3 months post injury on admission, who remained confused, or who had severe communication deficits or a previous neurologic disorder were excluded. Interventions: All subjects participated in a multidisciplinary inpatient rehabilitation program, followed by varied participation in outpatient rehabilitation and community-based services. Main Outcome Measures: The Sickness Impact Profile (SIP) provided physical, psychosocial, and total dysfunction scores at each follow-up. Outcome at 1 year was measured by the Disability Rating Scale. Results: Multivariate analysis of variance indicated that the linear trend of recovery over time was less for psychosocial dysfunction than for physical dysfunction (F(1,51) = 5.87, P < .02). One year post injury, 22% of subjects had returned to their previous level of employability, and 42% were able to live independently. Conclusions: Recovery from TBI in this Australian sample followed a pattern similar to that observed in other countries, with psychosocial dysfunction being more persistent. Self-report measures such as the SIP in TBI research are limited by problems of diminished self-awareness.
Abstract:
Background Left atrial volume indexed (LAVI) has been reported as a predictor of cardiovascular events. We sought to determine the prognostic value of LAVI for predicting the outcome of patients who underwent dobutamine stress echocardiography (DSE) for known or suspected coronary artery disease (CAD). Methods From January 2000 to July 2005, we studied 981 patients who underwent DSE and off-line measurements of LAVI. The value of DSE over clinical and LAVI data was examined using a stepwise log-rank test. Results During a median follow-up of 24 months, 56 (6%) events occurred. By univariate analysis, predictors of events were male sex, diabetes mellitus, previous myocardial infarction, left ventricular ejection fraction (LVEF), left atrial diameter indexed, LAVI, and abnormal DSE. By multivariate analysis, independent predictors were LVEF (relative risk [RR] = 0.98, 95% CI 0.95-1.00), LAVI (RR = 1.04, 95% CI 1.02-1.05), and abnormal DSE (RR = 2.70, 95% CI 1.28-5.69). In an incremental multivariate model, LAVI was additional to clinical data for predicting events (chi(2) 36.8, P < .001). The addition of DSE to clinical and LAVI yielded incremental information (chi(2) 55.3, P < .001). The 3-year event-free survival in patients with normal DSE and LAVI <= 33 mL/m(2) was 96%; with abnormal DSE and LAVI <= 33 mL/m(2), 91%; with normal DSE and LAVI >34 mL/m(2), 83%; and with abnormal DSE and LAVI >34 mL/m(2) 51%. Conclusion Left atrial volume indexed provides independent prognostic information in patients who underwent DSE for known or suspected CAD. Among patients with normal DSE, those with larger LAVI had worse outcome, and among patients with abnormal DSE, LAVI was still predictive. (Am Heart J 2008; 156:1110-6.)
Abstract:
Background: Chagas' disease is the illness caused by the protozoan Trypanosoma cruzi, and it is still endemic in Latin America. Heart transplantation is a therapeutic option for patients with end-stage Chagas' cardiomyopathy. Nevertheless, reactivation may occur after transplantation, leading to higher morbidity and graft dysfunction. This study aimed to identify risk factors for Chagas' disease reactivation episodes. Methods: This investigation is a retrospective cohort study of all Chagas' disease heart transplant recipients from September 1985 through September 2004. Clinical, microbiologic, and histopathologic data were reviewed. Statistical analysis was performed with SPSS (version 13) software. Results: Sixty-four (21.9%) patients with chronic Chagas' disease underwent heart transplantation during the study period. Seventeen patients (26.5%) had at least one episode of Chagas' disease reactivation, and univariate analysis identified number of rejection episodes (p = 0.013) and development of neoplasms (p = 0.040) as factors associated with Chagas' disease reactivation episodes. Multivariate analysis showed that number of rejection episodes (hazard ratio = 1.31; 95% confidence interval [CI]: 1.06 to 1.62; p = 0.011), neoplasms (hazard ratio = 5.07; 95% CI: 1.49 to 17.20; p = 0.009), and use of mycophenolate mofetil (hazard ratio = 3.14; 95% CI: 1.00 to 9.84; p = 0.049) are independent determinants of reactivation after transplantation. Age (p = 0.88), male gender (p = 0.15), presence of rejection (p = 0.17), cytomegalovirus infection (p = 0.79), and mortality after hospital discharge (p = 0.15) showed no statistically significant difference. Conclusions: Our data suggest that events resulting in greater immunosuppression contribute to Chagas' disease reactivation episodes after heart transplantation and should alert physicians to make an early diagnosis and perform pre-emptive therapy. Although reactivation led to a high rate of morbidity, a low mortality risk was observed.
Wavelet correlation between subjects: A time-scale data driven analysis for brain mapping using fMRI
Abstract:
Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach which predicts the changes in the observed BOLD response based on an expected hemodynamic response function (HRF). When the task is cognitively complex, or in cases of disease, variations in shape and/or delay may reduce the reliability of results. A novel exploratory method for fMRI data, which attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts or other confounding factors, is introduced in this paper. This new method is based on the fusion of correlation analysis and the discrete wavelet transform to identify similarities in the time course of the BOLD signal in a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized human face pictures expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time-domain intersubject correlation analysis. (C) 2010 Elsevier B.V. All rights reserved.
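The core idea of the abstract above, correlating subjects' BOLD time courses scale by scale in the wavelet domain rather than in the time domain, can be sketched as follows. This toy version uses a single-level Haar transform for self-containment and is not the authors' implementation.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient arrays."""
    s = np.asarray(signal, dtype=float)
    evens, odds = s[0::2], s[1::2]
    approx = (evens + odds) / np.sqrt(2.0)   # coarse scale
    detail = (evens - odds) / np.sqrt(2.0)   # fine scale
    return approx, detail

def wavelet_correlation(x, y):
    """Intersubject correlation in the wavelet domain:
    one Pearson correlation per scale for two BOLD time courses."""
    ax, dx = haar_dwt(x)
    ay, dy = haar_dwt(y)
    r_approx = np.corrcoef(ax, ay)[0, 1]
    r_detail = np.corrcoef(dx, dy)[0, 1]
    return r_approx, r_detail
```

Stimulus-locked signal shared across subjects shows up as high correlation at the scales matching the paradigm's timing, while subject-specific artifacts tend to decorrelate, which is the discrimination the abstract describes.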
Abstract:
Resting state functional magnetic resonance imaging (fMRI) reveals a distinct network of correlated brain function representing a default mode state of the human brain. The underlying structural basis of this functional connectivity pattern is still widely unexplored. We combined fractional anisotropy measures of fiber tract integrity derived from diffusion tensor imaging (DTI) and resting state fMRI data obtained at 3 Tesla from 20 healthy elderly subjects (56 to 83 years of age) to determine the white matter microstructure underlying default mode connectivity. We hypothesized that the functional connectivity between the posterior cingulate and hippocampus from resting state fMRI data would be associated with the white matter microstructure in the cingulate bundle and in fiber tracts connecting the posterior cingulate gyrus with the lateral temporal lobes, medial temporal lobes, and precuneus. This was demonstrated at the p < 0.001 level using a voxel-based multivariate analysis of covariance (MANCOVA) approach. In addition, we used a data-driven technique of joint independent component analysis (ICA) that uncovers spatial patterns that are linked across modalities. It revealed a pattern of white matter tracts, including the cingulate bundle and associated fiber tracts, resembling the findings from the hypothesis-driven analysis, and this pattern was linked to the pattern of default mode network (DMN) connectivity in the resting state fMRI data. Our findings support the notion that the functional connectivity between the posterior cingulate and hippocampus, and the functional connectivity across the entire DMN, is based on distinct patterns of anatomical connectivity within the cerebral white matter. (C) 2009 Elsevier Inc. All rights reserved.