968 results for Multivariate regression
Abstract:
BACKGROUND Cytomegalovirus (CMV) is associated with an increased risk of cardiac allograft vasculopathy (CAV), the major limiting factor for long-term survival after heart transplantation (HTx). The purpose of this study was to evaluate the impact of CMV infection during long-term follow-up after HTx. METHODS A retrospective, single-centre study analyzed 226 HTx recipients (mean age 45 ± 13 years, 78% men) who underwent transplantation between January 1988 and December 2000. The incidence of and risk factors for CMV infection during the first year after transplantation were studied. Risk factors for CAV were included in an analysis of CAV-free survival within 10 years post-transplant. The effect of CMV infection on the grade of CAV was also analyzed. RESULTS Survival to 10 years post-transplant was higher in patients with no CMV infection (69%) compared with patients with CMV disease (55%; p = 0.018) or asymptomatic CMV infection (54%; p = 0.053). CAV-free survival time was longer in patients with no CMV infection (6.7 years; 95% CI, 6.0-7.4) compared with CMV disease (4.2 years; CI, 3.2-5.2; p < 0.001) or asymptomatic CMV infection (5.4 years; CI, 4.3-6.4; p = 0.013). In univariate analysis, recipient age, donor age, coronary artery disease (CAD), asymptomatic CMV infection and CMV disease were significantly associated with CAV-free survival. In multivariate regression analysis, CMV disease, asymptomatic CMV infection, CAD and donor age remained independent predictors of CAV-free survival at 10 years post-transplant. CONCLUSIONS CAV-free survival was significantly reduced in patients with CMV disease and asymptomatic CMV infection compared to patients without CMV infection. These findings highlight the importance of close monitoring of CMV viral load and of appropriate therapeutic strategies for preventing asymptomatic CMV infection.
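The multivariate survival analysis described above is typically a Cox proportional hazards model. A minimal sketch with the Python lifelines library, assuming a hypothetical per-recipient data frame (the file and column names are invented for illustration):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical data: one row per HTx recipient.
# cav_free_years = time to CAV or censoring; cav = 1 if CAV occurred;
# cmv_disease / cmv_asymptomatic = indicators vs. no CMV infection.
df = pd.read_csv("htx_recipients.csv")

cph = CoxPHFitter()
cph.fit(
    df[["cav_free_years", "cav", "cmv_disease", "cmv_asymptomatic",
        "recipient_age", "donor_age", "cad"]],
    duration_col="cav_free_years",
    event_col="cav",
)
cph.print_summary()  # hazard ratios with 95% CIs for each predictor
```

Predictors whose confidence intervals exclude a hazard ratio of 1 would correspond to the independent predictors reported above.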
Abstract:
PURPOSE The open surgical management of unstable pelvic ring injuries remains controversial compared with percutaneous techniques in terms of surgical-site morbidity, especially in older patients. We therefore assessed the impact of age on the outcome following fixation of unstable pelvic ring injuries through the modified Stoppa approach. METHODS Out of a consecutive series of 92 patients eligible for the study, 63 patients (mean age 50 years, range 19-78) were evaluated [accuracy of reduction, complications, failures, Majeed-Score, Oswestry Disability Questionnaire (ODI), Mainz Pain Staging System (MPSS)] at a mean follow-up of 3.3 years (range 1.0-7.9). Logistic multivariate regression analysis was performed to assess the outcome in relation to increasing patient age and/or Injury Severity Score (ISS). RESULTS In 36 of the 63 patients, an "anatomic" reduction was achieved. Ten postoperative complications occurred in eight patients. In five patients, failure of fixation was noted at the anterior and/or posterior pelvic ring. In 49 patients, an "excellent" or "good" Majeed-Score was obtained; the mean ODI was 14% (range 0-76%); 50 patients reported either no or only minor chronic pelvic pain (MPSS). Only an increasing ISS conferred an increased likelihood of a non-anatomical reduction, a "poor" or "fair" Majeed-Score, or an ODI >20%. CONCLUSIONS Increasing age did not affect the analysed parameters. Open reduction and internal fixation of the anterior pelvic ring through a modified Stoppa approach in unstable pelvic ring injuries did not result in an unfavourable outcome with increasing age.
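A logistic multivariate regression of the kind used here can be sketched with statsmodels; the data frame, file name, and column names below are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per patient. poor_outcome = 1 for a
# non-anatomical reduction, a "poor"/"fair" Majeed-Score, or an ODI > 20%.
df = pd.read_csv("pelvic_ring.csv")

res = smf.logit("poor_outcome ~ age + iss", data=df).fit()
print(res.summary())
print(np.exp(res.params))  # odds ratios per year of age and per ISS point
```

An odds ratio near 1 for age, with a confidence interval crossing 1, would match the finding that age did not influence outcome, while an elevated odds ratio for ISS would match the reported ISS effect.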
Abstract:
The purpose of this study was to determine the prevalence and possible etiological factors of erosive tooth wear and wedge-shaped defects in Swiss Army recruits and to compare the findings with those of an analogous study conducted in 1996. In 2006, 621 recruits between 18 and 25 years of age (1996: 417 recruits; ages 19 to 25) were examined for erosive tooth wear and wedge-shaped defects. Additional data were acquired using a questionnaire about personal details, education, the subjective condition of the dentition, oral hygiene, eating and drinking habits, medications used, and general medical problems. In 2006, 60.1% of those examined exhibited occlusal erosive tooth wear not involving the dentin (1996: 82.0%) and 23.0% involving the dentin (1996: 30.7%). Vestibular erosive tooth wear without dentin involvement was seen in 7.7% in 2006 vs. 14.4% in 1996. Vestibular erosive tooth wear with dentin involvement was rare in both years (0.5%). Oral erosive tooth wear lacking exposed dentin was also rare in both years, although more teeth were affected in 2006 (2.1%) than in 1996 (0.7%). The examinations in 2006 found one or more initial wedge-shaped lesions in 8.5% of the recruits, while 20.4% of the study participants exhibited such lesions in 1996. In 1996, 53% consumed acidic foods and beverages more than 5 times/day; in 2006, 83.9% did so. In neither study did multivariate regression analyses show any significant correlations between the occurrence and location of erosive tooth wear or wedge-shaped defects and various other parameters, e.g., eating and hygiene habits or dentin hypersensitivity. Despite a significant increase in the consumption of acidic products between 1996 and 2006, the latter study found both less erosive tooth wear and fewer wedge-shaped defects (i.e., fewer non-carious lesions overall).
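The 1996-vs-2006 prevalence comparisons above are two-proportion tests; a minimal sketch with statsmodels, with the counts reconstructed approximately from the reported percentages and sample sizes:

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Occlusal erosion without dentin involvement: 60.1% of 621 recruits in
# 2006 vs. 82.0% of 417 in 1996 (counts reconstructed, hence approximate).
affected = np.array([round(0.601 * 621), round(0.820 * 417)])
examined = np.array([621, 417])

stat, pval = proportions_ztest(affected, examined)
print(f"z = {stat:.2f}, p = {pval:.4g}")
```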
Abstract:
OBJECTIVE The aim of this study was to examine the prevalence of nutritional risk and its association with multiple adverse clinical outcomes in a large cohort of acutely ill medical inpatients from a Swiss tertiary care hospital. METHODS We prospectively followed consecutive adult medical inpatients for 30 days. Multivariate regression models were used to investigate the association of the initial Nutritional Risk Score (NRS 2002) with mortality, impairment in activities of daily living (Barthel Index <95 points), hospital length of stay, hospital readmission rates, and quality of life (QoL; adapted from the EQ-5D); all parameters were measured at 30 days. RESULTS Of 3186 patients (mean age 71 years, 44.7% women), 887 (27.8%) were at risk for malnutrition with an NRS ≥3 points. We found strong associations (odds ratio/hazard ratio [OR/HR], 95% confidence interval [CI]) between nutritional risk and mortality (OR/HR, 7.82; 95% CI, 6.04-10.12), impaired Barthel Index (OR/HR, 2.56; 95% CI, 2.12-3.09), time to hospital discharge (OR/HR, 0.48; 95% CI, 0.43-0.52), hospital readmission (OR/HR, 1.46; 95% CI, 1.08-1.97), and all five dimensions of the QoL measures. The associations remained significant after adjustment for sociodemographic characteristics, comorbidities, and medical diagnoses. Results were robust in subgroup analysis, with evidence of effect modification (P for interaction < 0.05) based on age and main diagnosis groups. CONCLUSION Nutritional risk is common in acutely ill medical inpatients and is associated with increased medical resource use, adverse clinical outcomes, and impairments in functional ability and QoL. Randomized trials are needed to evaluate evidence-based preventive and treatment strategies focusing on nutritional factors to improve outcomes in these high-risk patients.
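The adjusted odds ratios reported above come from multivariate logistic models; a minimal statsmodels sketch, with hypothetical file and column names, showing how ORs and 95% CIs are extracted from a fitted model:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("inpatients.csv")  # hypothetical file/column names

# 30-day mortality vs. nutritional risk (NRS 2002 >= 3), adjusted for
# sociodemographics and comorbidity, mirroring the adjusted models above.
res = smf.logit("died_30d ~ nrs_risk + age + female + comorbidity_index",
                data=df).fit()

or_table = np.exp(res.conf_int())
or_table.columns = ["2.5%", "97.5%"]
or_table["OR"] = np.exp(res.params)
print(or_table)  # adjusted odds ratios with 95% CIs
```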
Abstract:
AIM To associate the dimension of the facial bone wall with clinical, radiological, and patient-centered outcomes at least 10 years after immediate implant placement with simultaneous guided bone regeneration in a retrospective study. MATERIAL AND METHODS The primary endpoint was the distance from the implant shoulder (IS) to the first bone-to-implant contact (IS-BIC10y). Secondary endpoints included the facial bone thickness (BT10y) 2, 4, and 6 mm apical to the IS, and the implant position. At baseline, the horizontal defect width (HDWBL) from the implant surface to the alveolar wall was recorded. At recall, the distance from the IS to the mucosal margin (IS-MM10y), the degree of soft tissue coverage of the mesial and distal aspects of the implants (PISm10y, PISd10y; Papilla Index), pocket probing depth (PPD10y), and patient-centered outcomes were determined. The width of the keratinized mucosa (KM) and the Full-Mouth Plaque and Bleeding Scores (FMPS, FMBS) were available for both time points. RESULTS Of the 20 patients who underwent immediate implant placement with simultaneous guided bone regeneration and transmucosal healing, nine males and eight females with a median age of 62 years (range 42-84) were followed up for a median period of 10.5 years (range 10.1-11.5). The 10-year implant survival rate was 100%. Multivariate regression analysis revealed a correlation of IS-BIC10y, controlled for age and gender, with four parameters: HDWBL (P = 0.03), the change in KM from baseline (KMBL-10y; P = 0.02), BT10y at 4 mm (P = 0.01), and BT10y at 6 mm (P = 0.01). CONCLUSION Within the conditions of the present study, the horizontal defect width was the main indicator for the vertical dimension of the facial bone. The facial bone dimension was further associated with a reduction in the width of the keratinized mucosa and the dimension of the buccal bone.
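The "controlled for age and gender" analysis corresponds to a multivariable linear regression; a sketch with statsmodels, where the file and column names are hypothetical stand-ins for IS-BIC10y and HDWBL above:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("implants_10y.csv")  # hypothetical; one row per implant

# IS-BIC at 10 years regressed on baseline horizontal defect width,
# controlling for age and gender, as in the multivariate analysis above.
res = smf.ols("is_bic_10y ~ hdw_bl + age + C(gender)", data=df).fit()
print(res.summary())  # coefficient on hdw_bl: mm of IS-BIC per mm of HDW
```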
Abstract:
It is estimated that more than half the U.S. adult population is overweight or obese as classified by a body mass index of 25.0–29.9 or ≥30 kg/m², respectively. Since current treatment approaches for the long-term maintenance of weight loss are lacking, the National Institutes of Health state that an effective approach may be to focus on weight gain prevention. There is a limited body of literature describing how adults maintain a stable weight as they age. It is hypothesized that weight stability is the result of a balance between energy consumption and energy expenditure as influenced by diet, lifestyle, behavior, genetics and environment. The purpose of this research was to examine the dietary intake and behaviors, lifestyle habits, and risk factors for weight change that predict weight stability in a cohort of 2101 men and 389 women aged 20 to 87 years in the Aerobics Center Longitudinal Study, regardless of body weight at baseline. At baseline, participants completed a maximal exercise treadmill test to determine cardiorespiratory fitness; a medical history questionnaire, which included self-reported measures of weight, dietary behaviors, lifestyle habits, and risk factors for weight change; a three-day diet record; and a mail-back version of the medical history questionnaire in 1990 or 1995. All analyses were performed separately for men and women. Results from multivariate regression analyses indicated that the strongest predictor of follow-up weight for men and women was previous weight, accounting for 87.0% and 81.9% of the variance, respectively. Age, length of follow-up and eating habits were also significant predictors of follow-up weight in men, though these variables explained only an additional 3% of the variance. For women, length of follow-up and currently being on a diet were significantly associated with follow-up weight, but these variables explained only an additional 2% of the variance. Understanding the factors that influence weight change has tremendous public health importance for developing effective methods to prevent weight gain. Since previous weight was the strongest predictor of follow-up weight, preventing initial weight gain by maintaining a stable weight may be the most effective method to combat the increasing prevalence of overweight and obesity.
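The variance-explained figures above suggest a hierarchical regression: fit previous weight alone, then add the remaining predictors and compare R². A sketch with statsmodels, assuming a hypothetical data file and column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

men = pd.read_csv("acls_men.csv")  # hypothetical; analyses run per sex

base = smf.ols("weight_followup ~ weight_baseline", data=men).fit()
full = smf.ols("weight_followup ~ weight_baseline + age + followup_years"
               " + eating_habits_score", data=men).fit()

print(f"R2, baseline weight only : {base.rsquared:.3f}")  # ~0.87 for men
print(f"R2, with added predictors: {full.rsquared:.3f}")
print(f"incremental R2           : {full.rsquared - base.rsquared:.3f}")
```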
Abstract:
According to the Joint United Nations Programme on HIV/AIDS (UNAIDS, 2008), in 2007 about 67% of all HIV-infected patients in the world were in Sub-Saharan Africa, with 35% of new infections and 38% of AIDS deaths occurring in Southern Africa. Globally, the number of children younger than 15 years of age infected with HIV increased from 1.6 million in 2001 to 2.0 million in 2007, and almost 90% of these were in Sub-Saharan Africa (UNAIDS, 2008). Both clinical and laboratory monitoring of children on Highly Active Anti-Retroviral Therapy (HAART) are important and necessary to optimize outcomes. Laboratory monitoring of HIV viral load and genotype resistance testing, which are important in patient follow-up to optimize treatment success, are both generally expensive and beyond the healthcare budgets of most developing countries. This is especially true for the impoverished Sub-Saharan African nations. It is therefore important to identify those factors that are associated with virologic failure in HIV-infected Sub-Saharan African children. This will inform practitioners in these countries so that they can predict which patients are more likely to develop virologic failure and therefore target the limited laboratory monitoring budgets towards these at-risk patients. The objective of this study was to examine the factors associated with virologic failure in HIV-infected children taking Highly Active Anti-Retroviral Therapy in Botswana, a developing Sub-Saharan African country. We examined these factors in a case-control study using the medical records of HIV-infected children and adolescents on HAART at the Botswana-Baylor Children's Clinical Center of Excellence (BBCCCOE) in Gaborone, Botswana. Univariate and multivariate regression analyses were performed to identify predictors of virologic failure in these children. The study population comprised 197 cases (those with virologic failure) and 544 controls (those with virologic success), with ages ranging from 3 months to 16 years at baseline. Poor adherence (pill count <95% on at least 3 consecutive occasions) was the strongest independent predictor of virologic failure (adjusted OR = 269.97, 95% CI = 104.13 to 699.92; P < 0.001). Other independent predictors of virologic failure were: first-line NNRTI regimen with nevirapine (OR = 2.99, 95% CI = 1.19 to 7.54; P = 0.020), baseline HIV-1 viral load >750,000 copies/ml (OR = 2.57, 95% CI = 1.47 to 8.63; P = 0.005), positive history of PMTCT (OR = 11.65, 95% CI = 3.04 to 44.57; P < 0.001), multiple caregivers (≥3) (OR = 2.56, 95% CI = 1.06 to 6.19; P = 0.036), and residence in a village (OR = 2.85, 95% CI = 1.36 to 5.97; P = 0.005). The results of this study may help to improve virologic outcomes and reduce the costs of caring for HIV-infected children in resource-limited settings. Keywords: virologic failure, Highly Active Anti-Retroviral Therapy, Sub-Saharan Africa, children, adherence.
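With a case-control design, an unadjusted odds ratio can be read off a 2x2 table before any multivariate modeling; a statsmodels sketch with invented counts (chosen only so the margins match the 197 cases and 544 controls):

```python
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

# Hypothetical 2x2 layout: rows = poor adherence (yes/no),
# columns = virologic failure (cases) vs. success (controls).
table = np.array([[120,  10],
                  [ 77, 534]])

t22 = Table2x2(table)
print(f"unadjusted OR = {t22.oddsratio:.1f}")
print("95% CI =", t22.oddsratio_confint())
```

The adjusted ORs quoted in the abstract would then come from a multivariate logistic model with all predictors entered together.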
Abstract:
Purpose. A descriptive analysis of glioma patients by race was carried out in order to better elucidate potential differences between races in demographics, treatment, characteristics, prognosis and survival. Patients and Methods. The cohort comprised 1,967 patients ≥18 years of age diagnosed with glioma and seen between July 2000 and September 2006 at The University of Texas M.D. Anderson Cancer Center (UTMDACC). Data were collated from the UTMDACC Patient History Database (PHDB) and the UTMDACC Tumor Registry Database (TRDB). Chi-square analysis, uni-/multivariate Cox proportional hazards modeling and survival analysis were used to analyze differences by race. Results. Demographic, treatment and histologic differences exist between races. Though risk differences were seen between races, race was not found to be a significant predictor in multivariate regression analysis after accounting for age, surgery, chemotherapy, radiation and tumor type, stratified by WHO tumor grade. Age was the most consistent predictor of risk for death. Overall survival by race was significantly different (p = 0.0049) only in low-grade gliomas after adjustment for age, although the survival differences were very slight. Conclusion. Among this cohort of glioma patients, age was the strongest predictor of survival. It is likely that survival is influenced more by age, time to treatment, tumor grade and surgical expertise than by racial differences. However, age at diagnosis, gender ratios, histology and history of cancer differed significantly between races, and genetic differences to this effect cannot be excluded.
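The multivariate analysis "stratified by WHO tumor grade" maps naturally onto a stratified Cox model; a lifelines sketch, with hypothetical file and column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("glioma.csv")  # hypothetical columns below

cph = CoxPHFitter()
cph.fit(
    df[["os_months", "died", "race_black", "race_hispanic", "age",
        "surgery", "chemotherapy", "radiation", "who_grade"]],
    duration_col="os_months",
    event_col="died",
    strata=["who_grade"],  # separate baseline hazard per WHO tumor grade
)
cph.print_summary()  # race terms test any residual effect after adjustment
```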
Abstract:
Major objectives within Healthy People 2010 include improving hypertension and mental health management in the American population. Both mental health issues and hypertension exist in the military, which may decrease the health status of military personnel and diminish the ability to complete assigned missions. Some cases may be incompatible with military service even with optimum treatment. In the interest of maintaining a fit fighting force, the Department of Defense regularly conducts a survey of health-related behaviors among active duty military personnel. The 2005 DoD Survey was conducted to obtain information regarding health and behavioral readiness among active duty military personnel and to assess progress toward selected Healthy People 2010 objectives. This study is a cross-sectional prevalence design examining the association of hypertension treatment with mental health issues (either treatment or perceived need for treatment) within the military population sampled in the 2005 DoD Survey. There were 16,946 military personnel in the final cross-sectional sample, representing 1.3 million active duty service members. The question is whether there is a significant association between the self-reported occurrence of hypertension and the self-reported occurrence of mental health issues in the 2005 DoD Survey. In addition to these variables, the analysis examined the contribution of various sociodemographic, occupational, and behavioral covariates. An analysis of the demographic composition of the study variables was followed by logistic analysis, comparing outcome variables with each of the independent variables. Following univariate regression analysis, multivariate regression was performed, adjusting for those variables with an unadjusted alpha level of 0.25 or less. All of the mental health related indicators were associated with hypertension treatment, and the same relationship was maintained after multivariate adjustment. The covariates remaining significant (p < 0.05) in the final model included gender, age, race/ethnicity and obesity. There is a need to recognize and treat co-morbid medical diagnoses among mental health patients and to improve quality of life outcomes, whether in the military population or the general population. Optimum health of the individual can be facilitated through the discovery of treatable cases, to minimize disruptions of military missions and even allow for continued military service.
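The two-stage strategy described (univariate screening at alpha ≤ 0.25, then multivariate adjustment) can be sketched as a simple loop in statsmodels; the file name and all covariate names are invented:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dod_survey_2005.csv")  # hypothetical file/column names

candidates = ["age", "female", "obese", "smoker", "heavy_drinker",
              "enlisted", "deployed"]    # invented 0/1 or numeric covariates
kept = []
for var in candidates:
    uni = smf.logit(f"htn_treated ~ {var}", data=df).fit(disp=0)
    if uni.pvalues[var] <= 0.25:         # screening threshold from the text
        kept.append(var)

full = smf.logit("htn_treated ~ mh_issue + " + " + ".join(kept),
                 data=df).fit(disp=0)
print(full.summary())                    # adjusted association of interest
```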
Abstract:
Racial differences in heart failure with preserved ejection fraction (HFpEF) have rarely been studied in an ambulatory, financially "equal access" cohort, although the majority of such patients are treated as outpatients. Retrospective data were collected from 2,526 patients (2,240 whites, 286 African Americans) with HFpEF treated at 153 VA clinics, as part of the VA External Peer Review Program (EPRP), between October 2000 and September 2002. Kaplan-Meier curves (stratified by race) were created for time to first heart failure (HF) hospitalization, all-cause hospitalization and death, and multivariate Cox proportional hazards regression models were constructed to evaluate the effect of race on these outcomes. African American patients were younger (67.7 ± 11.3 vs. 71.2 ± 9.8 years; p < 0.001) and had a lower prevalence of atrial fibrillation (24.5% vs. 37%; p < 0.001) and chronic obstructive pulmonary disease (23.4% vs. 36.9%; p < 0.001), but had higher blood pressure (systolic blood pressure >120 mm Hg: 77.6% vs. 67.8%; p < 0.01), glomerular filtration rate (67.9 ± 31.0 vs. 61.6 ± 22.6 mL/min/1.73 m²; p < 0.001) and prevalence of anemia (56.6% vs. 41.7%; p < 0.001) compared with whites. African Americans were found to have a higher risk-adjusted rate of HF hospitalization (HR 1.52, 95% CI 1.10-2.11; p = 0.01), with no difference in risk-adjusted all-cause hospitalization (p = 0.80) or death (p = 0.21). In the financially "equal access" setting of the VA, among ambulatory patients with HFpEF, African Americans have similar rates of mortality and all-cause hospitalization but an increased risk of HF hospitalization compared with whites.
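The Kaplan-Meier comparison by race can be sketched with lifelines; the file and column names are hypothetical, and the log-rank test shown is the unadjusted analogue of the risk-adjusted Cox models described:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("hfpef.csv")  # hypothetical column names below
aa = df[df.race == "African American"]
wh = df[df.race == "White"]

kmf = KaplanMeierFitter()
kmf.fit(aa.days_to_hf_hosp, aa.hf_hosp, label="African American")
ax = kmf.plot_survival_function()
kmf.fit(wh.days_to_hf_hosp, wh.hf_hosp, label="White")
kmf.plot_survival_function(ax=ax)

res = logrank_test(aa.days_to_hf_hosp, wh.days_to_hf_hosp,
                   event_observed_A=aa.hf_hosp, event_observed_B=wh.hf_hosp)
print(res.p_value)  # unadjusted; Cox models add the risk adjustment
```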
Abstract:
The ascertainment and analysis of adverse reactions to investigational agents presents a significant challenge because of the infrequency of these events, their subjective nature and the low priority of safety evaluations in many clinical trials. A one-year review of antibiotic trials published in medical journals demonstrates the lack of standards in identifying and reporting these potentially fatal conditions. This review also illustrates the low probability of observing and detecting rare events in typical clinical trials, which include fewer than 300 subjects. Uniform standards for ascertainment and reporting are suggested, which include operational definitions of study subjects. Meta-analysis of selected antibiotic trials using multivariate regression analysis indicates that meaningful conclusions may be drawn from data from multiple studies which are pooled in a scientifically rigorous manner.
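Rigorous pooling across trials usually means inverse-variance weighting of per-study effect estimates; a minimal fixed-effect sketch in NumPy with invented log odds ratios and variances:

```python
import numpy as np

# Hypothetical per-trial log odds ratios for an adverse reaction and their
# variances; real trials would each contribute one (yi, vi) pair.
yi = np.array([0.41, 0.15, 0.68, 0.22])
vi = np.array([0.12, 0.20, 0.35, 0.18])

w = 1.0 / vi                          # inverse-variance weights
pooled = np.sum(w * yi) / np.sum(w)   # fixed-effect pooled log OR
se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(np.exp([pooled, lo, hi]))       # pooled OR with 95% CI
```

Pooling narrows the confidence interval relative to any single small trial, which is exactly what makes rare adverse events detectable across studies.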
Abstract:
An investigation was undertaken to determine the chemical characterization of inhalable particulate matter in the Houston area, with special emphasis on source identification and apportionment of outdoor and indoor atmospheric aerosols using multivariate statistical analyses. Fine (<2.5 µm) particle aerosol samples were collected by means of dichotomous samplers at two fixed-site ambient monitoring stations (Clear Lake and Sunnyside) and one mobile monitoring van in the Houston area during June-October 1981 as part of the Houston Asthma Study. The mobile van allowed particulate sampling to take place both inside and outside of twelve homes. The samples, collected over 12-h periods on a 7 AM-7 PM and 7 PM-7 AM (CDT) schedule, were analyzed for mass, trace elements, and two anions. Mass was determined gravimetrically. An energy-dispersive X-ray fluorescence (XRF) spectrometer was used for determination of elemental composition. Ion chromatography (IC) was used to determine sulfate and nitrate. Average chemical compositions of fine aerosol at each site are presented. Sulfate was found to be the largest single component of the fine-fraction mass, comprising approximately 30% of the fine mass outdoors and 12% indoors. Principal components analysis (PCA) was applied to identify sources of aerosols and to assess the role of meteorological factors in the variation of the particulate samples. The results suggested that meteorological parameters were not associated with the sources of the aerosol samples collected at these Houston sites. Source factor contributions to fine mass were calculated using a combination of PCA and stepwise multivariate regression analysis. Much of the total fine mass was apparently contributed by sulfate-related aerosols, which on average accounted for 56% of the outdoor ambient fine particulate matter and 26% of the indoor fine particulate matter. The characterization of indoor aerosol in residential environments was compared with the results for outdoor aerosols. Much of the indoor aerosol may be due to outdoor sources, but there may be important contributions from common indoor sources in the home environment such as smoking and gas cooking.
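The PCA-plus-regression apportionment described can be sketched with scikit-learn; the arrays here are random placeholders standing in for the element/anion concentrations and the gravimetric fine mass:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                   # placeholder: 12 species per sample
y = X[:, :3].sum(axis=1) + rng.normal(size=200)  # placeholder fine mass

Xz = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize before PCA
pca = PCA(n_components=5)
scores = pca.fit_transform(Xz)                   # factor scores per sample
print(pca.components_.round(2))                  # loadings suggest source profiles

reg = LinearRegression().fit(scores, y)          # regress fine mass on factor scores
print(reg.coef_.round(2), round(reg.intercept_, 2))
```

The regression coefficients scale each factor's score into a mass contribution, which is how the sulfate-related factor's 56% (outdoor) and 26% (indoor) shares would be derived.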
Abstract:
Trauma and severe head injuries are important issues because they are prevalent, because they occur predominantly in the young, and because variations in clinical management may matter. Trauma is the leading cause of death for those under age 40. The focus of this head injury study is to determine whether variations in time from the scene of the accident to a trauma center hospital make a difference in patient outcomes. A trauma registry is maintained in the Houston-Galveston area and includes all patients admitted to any one of three trauma center hospitals with mild or severe head injuries. A study cohort, derived from the Registry, includes 254 severe head injury cases from 1980 with a Glasgow Coma Score of 8 or less. Multiple influences relate to patient outcomes from severe head injury. Two primary variables and four confounding variables are identified: time to emergency room, time to intubation, patient age, severity of injury, type of injury, and mode of transport to the emergency room. Regression analysis, analysis of variance, and chi-square analysis were the principal statistical methods utilized. Analysis indicates that within an urban setting, with a four-hour time span, variations in time to emergency room do not provide any strong influence or predictive value for patient outcome. However, the data suggest that longer time periods have a negative influence on outcomes. Age is influential only when the older group (55-64) is included. Mode of transport (helicopter or ambulance) did not indicate any significant difference in outcome. In a multivariate regression model, outcomes are influenced primarily by severity of injury and age, which together explain 36% (R²) of the variance. Inclusion of time to emergency room, time to intubation, transport mode, and type of injury adds only 4% (R²) of additional explained variation in patient outcome. The research concludes that since the group most at risk of head trauma is the young adult male involved in automobile/motorcycle accidents, more may be gained by modifying driving habits and other preventive measures. Continuous clinical and evaluative research is required to provide updated clinical wisdom in patient management and trauma treatment protocols. A National Institute of Trauma may be required to develop a national public policy and evaluate the many medical, behavioral and social changes required to cope with the country's number 3 killer and the primary killer of young adults.
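The 36% vs. 4% R² comparison is a nested-model question; a statsmodels sketch with hypothetical file and column names, using an F-test for the block of added predictors:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("head_injury.csv")  # hypothetical column names below

reduced = smf.ols("outcome ~ severity + age", data=df).fit()
full = smf.ols("outcome ~ severity + age + time_to_er + time_to_intubation"
               " + C(transport) + C(injury_type)", data=df).fit()

print(reduced.rsquared, full.rsquared)  # ~0.36 vs. ~0.40 in the study
print(anova_lm(reduced, full))          # F-test for the added 4% of R²
```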
Abstract:
Interannual environmental variability in Peru is dominated by the El Niño Southern Oscillation (ENSO). The most dramatic changes are associated with the warm El Niño (EN) phase (opposite the cold La Niña phase), which disrupts the normal coastal upwelling and affects the dynamics of many coastal marine and terrestrial resources. This study presents a trophic model for Sechura Bay, located at the northern extension of the Peruvian upwelling system, where ENSO-induced environmental variability is most extreme. Using an initial steady-state model for the year 1996, we explore the dynamics of the ecosystem through the year 2003 (including the strong EN of 1997/98 and the weaker EN of 2002/03). Based on support from the literature, we force the biomass of several non-trophically-mediated 'drivers' (e.g. Scallops, Benthic detritivores, Octopus, and Littoral fish) to observe whether the fit between historical and simulated changes (by the trophic model) is improved. The results indicate that the Sechura Bay ecosystem is a relatively inefficient system from a community energetics point of view, likely due to the periodic perturbations of ENSO. A combination of high system productivity and low-trophic-level target species of invertebrates (i.e. scallops) and fish (i.e. anchoveta) results in high catches and an efficient fishery. The importance of environmental drivers is suggested by the relatively small improvement in the fit of the simulation when trophic drivers are added to the remaining functional groups' dynamics. An additional multivariate regression model is presented for the scallop Argopecten purpuratus, which demonstrates significant effects of both spawning stock size and riverine discharge-mediated mortality on catch levels. These results are discussed in the context of the appropriateness of trophodynamic modeling in relatively open systems, and of how management strategies may be focused given the highly environmentally influenced marine resources of the region.
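The regression described for Argopecten purpuratus has the form catch ~ spawning stock + discharge-mediated mortality; a toy least-squares sketch in plain NumPy with invented annual values:

```python
import numpy as np

# Placeholder annual series: scallop catch, spawning stock biomass, and
# river discharge (proxy for discharge-mediated mortality). All invented.
catch = np.array([12.0, 35.0, 4.0, 2.5, 18.0, 40.0, 55.0, 30.0])
ssb = np.array([3.0, 8.0, 1.5, 1.0, 5.0, 9.0, 12.0, 7.0])
discharge = np.array([0.4, 0.2, 2.5, 3.0, 0.6, 0.3, 0.2, 0.5])

X = np.column_stack([np.ones_like(ssb), ssb, discharge])
coef, res, rank, sv = np.linalg.lstsq(X, catch, rcond=None)
print(coef)  # intercept, effect of spawning stock, effect of discharge
```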
Abstract:
Visual traces of iron reduction and oxidation are linked to the redox status of soils and have been used to characterise the quality of agricultural soils. We tested whether this feature could also be used to explain the spatial pattern of the natural vegetation of tidal habitats. If so, an easy assessment of the effect of rising sea level on tidal ecosystems would be possible. Our study was conducted at the salt marshes of the northern lagoon of Venice, which are strongly threatened by erosion and rising sea level and are part of the World Heritage site 'Venice and its Lagoon'. We analysed the abundance of plant species at 255 sampling points along a land-sea gradient. In addition, we surveyed the redox morphology (presence/absence of red iron oxide mottles in the greyish topsoil horizons) of the soils and the presence of disturbances. We used indicator species analysis, correlation trees and multivariate regression trees to analyse the relations between soil properties and plant species distribution. Plant species with known sensitivity to anaerobic conditions (e.g. Halimione portulacoides) were identified as indicators for oxic soils (showing iron oxide mottles within a greyish soil matrix). Plant species that tolerate a low redox potential (e.g. Spartina maritima) were identified as indicators for anoxic soils (greyish matrix without oxide mottles). Correlation trees and multivariate regression trees indicate the dominant role of the redox morphology of the soils in plant species distribution. In addition, the distance from the mainland and the presence of disturbances were identified as tree-splitting variables. The small-scale variation of oxygen availability plays a key role in the biodiversity of salt marsh ecosystems. Our results suggest that the redox morphology of salt marsh soils indicates the plant availability of oxygen. Thus, the consideration of this indicator may enable an understanding of the heterogeneity of biological processes in oxygen-limited systems and may be a sensitive and easy-to-use tool to assess human impacts on salt marsh ecosystems.
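A multivariate regression tree predicts a whole species-abundance matrix from environmental predictors at once; a scikit-learn sketch on random placeholder data shaped like the 255 sampling points described:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
# Placeholders: 255 sampling points x 3 predictors and 5 species abundances.
X = np.column_stack([rng.integers(0, 2, 255),    # iron oxide mottles present?
                     rng.uniform(0, 5000, 255),  # distance from mainland (m)
                     rng.integers(0, 2, 255)])   # disturbance present?
Y = rng.poisson(3.0, size=(255, 5))              # multivariate response

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20).fit(X, Y)
print(tree.feature_importances_)  # which variable splits the communities most
```

With the real survey data, a dominant importance for the redox-morphology predictor would mirror the tree-splitting result reported above.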