60 results for Multiple Additive Regression Trees (MART)
Abstract:
The development of coronary vasculopathy is the main determinant of long-term survival in cardiac transplantation. The identification of risk factors, therefore, seems necessary in order to identify possible treatment strategies. Ninety-five out of 397 patients undergoing orthotopic cardiac transplantation from 10/1985 to 10/1992 were evaluated retrospectively on the basis of perioperative and postoperative variables including age, sex, diagnosis, previous operations, renal function, cholesterol levels, dosage of immunosuppressive drugs (cyclosporin A, azathioprine, steroids), incidence of rejection, and treatment with calcium channel blockers at 3, 6, 12, and 18 months postoperatively. Coronary vasculopathy was assessed by annual angiography at 1 and 2 years postoperatively. After univariate analysis, data were evaluated by stepwise multiple logistic regression analysis. Coronary vasculopathy was detected in 15 patients (16%) at 1 year and in 23 patients (24%) at 2 years. On multivariate analysis, previous operations and the incidence of rejections were identified as significant risk factors (P < 0.05), whereas the underlying diagnosis had borderline significance (P = 0.058) for the development of graft coronary vasculopathy. In contrast, all other variables were not significant in our subset of patients investigated. We therefore conclude that the development of coronary vasculopathy in cardiac transplant patients mainly depends on the rejection process itself, aside from patient-dependent factors. Therapeutic measures, such as the administration of calcium channel blockers and regulation of lipid disorders, may therefore only reduce the progress of native atherosclerotic disease in the posttransplant setting.
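As an illustration of the stepwise multiple logistic regression described above, the following is a minimal Python sketch (not the authors' code), assuming a hypothetical per-patient table with a binary vasculopathy outcome and candidate predictors; backward elimination on p-values stands in for whatever stepwise criterion was actually used.

    import pandas as pd
    import statsmodels.api as sm

    def backward_stepwise_logit(df, outcome, candidates, alpha=0.05):
        """Backward elimination: drop the least significant predictor until
        every remaining predictor has p < alpha."""
        kept = list(candidates)
        while kept:
            X = sm.add_constant(df[kept])
            fit = sm.Logit(df[outcome], X).fit(disp=0)
            pvals = fit.pvalues.drop("const")
            worst = pvals.idxmax()
            if pvals[worst] < alpha:
                return fit
            kept.remove(worst)
        return None

    # Hypothetical data frame with one row per transplant recipient.
    # df = pd.read_csv("transplant_cohort.csv")
    # model = backward_stepwise_logit(
    #     df, outcome="vasculopathy",
    #     candidates=["previous_operations", "rejection_episodes",
    #                 "diagnosis_ischemic", "age", "cholesterol"])
    # print(model.summary())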
Abstract:
BACKGROUND: Renal resistance index, a predictor of kidney allograft function and patient survival, seems to depend on renal and peripheral vascular compliance and resistance. Asymmetric dimethylarginine (ADMA) is an endogenous inhibitor of nitric oxide synthase and therefore influences vascular resistance. STUDY DESIGN: We investigated the relationship between renal resistance index, ADMA, and risk factors for cardiovascular diseases and kidney function in a cross-sectional study. SETTING & PARTICIPANTS: 200 stable renal allograft recipients (133 men and 67 women with a mean age of 52.8 years). PREDICTORS: Serum ADMA concentration, pulse pressure, estimated glomerular filtration rate and recipient age. OUTCOME: Renal resistance index. MEASUREMENTS: Renal resistance index measured by color-coded duplex ultrasound, serum ADMA concentration measured by liquid chromatography-tandem mass spectrometry, estimated glomerular filtration rate (Nankivell equation), arterial stiffness measured by digital volume pulse, Framingham and other cardiovascular risk factors, and evaluation of concomitant antihypertensive and immunosuppressive medication. RESULTS: Mean serum ADMA concentration was 0.72 +/- 0.21 (+/-SD) micromol/L and mean renal resistance index was 0.71 +/- 0.07. Multiple stepwise regression analysis showed that recipient age (P < 0.001), pulse pressure (P < 0.001), diabetes (P < 0.01) and ADMA concentration (P < 0.01) were independently associated with resistance index. ADMA concentrations were correlated with estimated glomerular filtration rate (P < 0.01). LIMITATIONS: The cross-sectional nature of this study precludes cause-effect conclusions. CONCLUSIONS: In addition to established cardiovascular risk factors, ADMA appears to be a relevant determinant of renal resistance index and allograft function and deserves consideration in prospective outcome trials in renal transplantation.
Abstract:
Recent coccoliths from 74 surface sediment samples recovered from the southeastern Pacific off Chile were examined quantitatively to investigate modern regional gradients of sea surface productivity and temperature. All findings are based on coccolith accumulation rates. Therefore, an approach was designed to estimate recent sedimentation rates based on 210Pb and bulk chemistry analyses of the same set of surface samples. Highest total coccolith accumulation rates were found off north-central Chile, where seasonal upwelling takes place. Based on a multiple linear regression between calculated coccolith accumulation rates and World Ocean Atlas derived sea surface temperatures, a calibration model to reconstruct annual average temperatures of the uppermost 75 m of the water column is provided. The model was cross-validated, and the SST estimates were compared with observed SSTs and with SST estimates based on diatoms and planktonic foraminifera, showing a good correlation.
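The calibration and cross-validation steps described above can be sketched as follows; this is a minimal Python illustration with stand-in data, assuming a matrix of per-taxon coccolith accumulation rates per sample and World Ocean Atlas annual mean temperatures of the upper 75 m as the calibration target (leave-one-out cross-validation is used here as one plausible scheme).

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    # Stand-in inputs: rows = 74 surface-sediment samples,
    # columns = coccolith accumulation rates per taxon; y = WOA-derived SST.
    rng = np.random.default_rng(0)
    X = rng.lognormal(size=(74, 6))
    y = 12 + X @ np.array([0.5, -0.3, 0.8, 0.1, -0.2, 0.4]) + rng.normal(0, 0.5, 74)

    model = LinearRegression()
    y_cv = cross_val_predict(model, X, y, cv=LeaveOneOut())  # cross-validated SST estimates
    rmse = np.sqrt(np.mean((y_cv - y) ** 2))
    r = np.corrcoef(y_cv, y)[0, 1]
    print(f"LOO-CV RMSE = {rmse:.2f} degC, r = {r:.2f}")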
Abstract:
Due to highly erodible volcanic soils and a harsh climate, livestock grazing in Iceland has led to serious soil erosion on about 40% of the country's surface. Over the last 100 years, various revegetation and restoration measures were taken on large areas distributed all over Iceland in an attempt to counteract this problem. The present research aimed to develop models for estimating percent vegetation cover (VC) and aboveground biomass (AGB) based on satellite data, as this would make it possible to assess and monitor the effectiveness of restoration measures over large areas at a fairly low cost. Models were developed based on 203 vegetation cover samples and 114 aboveground biomass samples distributed over five SPOT satellite datasets. All satellite datasets were atmospherically corrected, and digital numbers were converted into ground reflectance. Then a selection of vegetation indices (VIs) was calculated, followed by simple and multiple linear regression analysis of the relations between the field data and the calculated VIs. Best results were achieved using multiple linear regression models for both %VC and AGB. The model calibration and validation results showed that R2 and RMSE values for most VIs do not vary very much. For percent VC, R2 values range between 0.789 and 0.822, leading to RMSEs ranging between 15.89% and 16.72%. For AGB, R2 values for low-biomass areas (AGB < 800 g/m2) range between 0.607 and 0.650, leading to RMSEs ranging between 126.08 g/m2 and 136.38 g/m2. The AGB model developed for all areas, including those with high biomass coverage (AGB > 800 g/m2), achieved R2 values between 0.487 and 0.510, resulting in RMSEs ranging from 234 g/m2 to 259.20 g/m2. The models predicting percent VC generally overestimate observed low percent VC and slightly underestimate observed high percent VC. The estimation models for AGB behave in a similar way, but over- and underestimation are much more pronounced. These results show that it is possible to estimate percent VC with high accuracy based on various VIs derived from SPOT satellite data. AGB of restoration areas with low-biomass values of up to 800 g/m2 can likewise be estimated with high accuracy based on various VIs derived from SPOT satellite data, whereas in the case of high biomass coverage, estimation accuracy decreases with increasing biomass values. Accordingly, percent VC can be estimated with high accuracy anywhere in Iceland, whereas AGB is much more difficult to estimate, particularly for areas with high-AGB variability.
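A minimal sketch of the workflow described above (vegetation index calculation followed by linear regression against field data) is given below; the reflectance values, index choices (NDVI and SAVI), and simulated sample values are illustrative only and are not taken from the study.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error, r2_score

    # Stand-in ground reflectance for two SPOT bands and field %VC samples.
    rng = np.random.default_rng(4)
    red = rng.uniform(0.05, 0.30, 203)
    nir = rng.uniform(0.10, 0.50, 203)
    ndvi = (nir - red) / (nir + red)
    savi = 1.5 * (nir - red) / (nir + red + 0.5)   # soil-adjusted vegetation index
    vc = np.clip(20 + 80 * ndvi + rng.normal(0, 8, 203), 0, 100)

    X = np.column_stack([ndvi, savi])
    model = LinearRegression().fit(X, vc)
    pred = model.predict(X)
    print(f"R2 = {r2_score(vc, pred):.3f}, "
          f"RMSE = {np.sqrt(mean_squared_error(vc, pred)):.2f} %VC")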
Abstract:
The aims of this study were to assess and compare the methodological quality of Cochrane and non-Cochrane systematic reviews (SRs) published in leading orthodontic journals and the Cochrane Database of Systematic Reviews (CDSR) using AMSTAR and to compare the prevalence of meta-analysis in both review types. A literature search to identify SRs was undertaken, consisting of hand-searching five major orthodontic journals [American Journal of Orthodontics and Dentofacial Orthopedics, Angle Orthodontist, European Journal of Orthodontics, Journal of Orthodontics, and Orthodontics and Craniofacial Research (February 2002 to July 2011)] and the Cochrane Database of Systematic Reviews (January 2000 to July 2011). Methodological quality of the included reviews was gauged using the AMSTAR tool involving 11 key methodological criteria with a score of 0 or 1 given for each criterion. A cumulative grade was given for the paper overall (0-11); an overall score of 4 or less represented poor methodological quality, 5-8 was considered fair and 9 or greater was deemed to be good. In total, 109 SRs were identified in the five major journals and on the CDSR. Of these, 26 (23.9%) were in the CDSR. The mean overall AMSTAR score was 6.2 with 21.1% of reviews satisfying 9 or more of the 11 criteria; a similar prevalence of poor reviews (22%) was also noted. Multiple linear regression indicated that reviews published in the CDSR (P < 0.01) and those involving meta-analysis (β = 0.50; 95% confidence interval, 0.72 to 2.07; P < 0.001) showed greater concordance with AMSTAR.
Abstract:
OBJECTIVE To investigate the evolution of delirium in nursing home (NH) residents and its possible predictors. DESIGN Post-hoc analysis of a prospective cohort assessment. SETTING Ninety NHs in Switzerland. PARTICIPANTS A total of 14,771 NH residents. MEASUREMENTS The Resident Assessment Instrument Minimum Data Set and the Nursing Home Confusion Assessment Method were used to assess subsyndromal and full delirium in NH residents over follow-up; discrete Markov chain modeling was used to describe long-term trajectories, and multiple logistic regression analyses were used to determine predictors of the trajectories. RESULTS We identified four major types of delirium time courses in NHs. Increasing severity of cognitive impairment and of depressive symptoms at the initial assessment predicted the different delirium time courses. CONCLUSION More pronounced cognitive impairment and depressive symptoms at the initial assessment are associated with different subsequent evolutions of delirium. The presence and evolution of delirium in the first year after NH admission predicted the subsequent course of delirium until death.
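The discrete Markov chain modeling mentioned above can be illustrated with a short sketch; the state coding (0 = no delirium, 1 = subsyndromal, 2 = full delirium) and the example sequences are hypothetical and not taken from the data set.

    import numpy as np

    def transition_matrix(sequences, n_states=3):
        """Estimate a first-order Markov transition matrix from per-resident
        state sequences (0 = none, 1 = subsyndromal, 2 = full delirium)."""
        counts = np.zeros((n_states, n_states))
        for seq in sequences:
            for a, b in zip(seq[:-1], seq[1:]):
                counts[a, b] += 1
        return counts / counts.sum(axis=1, keepdims=True)

    # Hypothetical follow-up assessments for three residents:
    sequences = [[0, 0, 1, 1, 2], [1, 2, 2, 2], [0, 0, 0, 1]]
    P = transition_matrix(sequences)
    print(np.round(P, 2))   # row i = probability of moving from state i to each state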
Abstract:
Purpose: There is evidence that depressed mood and perception of pain are related in patients with chronic illness. However, how individual resources such as self-efficacy and social support play a role in this association remains unclear. The aim of this study was to investigate the influence of both variables as either moderator or mediator. Method: In a longitudinal study, 274 injured workers (M = 43.24 years) were investigated. Data were collected on sociodemographics, depressed mood, pain, social support, and self-efficacy at three months post-injury, and on depressed mood one year post-injury. Results: Hierarchical multiple linear regression analyses revealed that pain (β = 0.14; p < 0.01) and social support (β = -0.18; p < 0.001) were significant predictors of depressed mood. Self-efficacy moderated the relationship between pain (β = -0.12; p < 0.05) and depressed mood after one year. Lower self-efficacy in combination with pain had a stronger impact on depressed mood than higher self-efficacy in combination with pain. Social support did not moderate the association. Conclusions: Self-efficacy for managing pain is important in the development of depressed mood. According to the results of this study, we suggest that the detection of low social support and low self-efficacy might be important in the long-term rehabilitation process. Implications for Rehabilitation: Risk for depressed mood one year after an accident is high: one in five workers report depressed mood. Protective factors for depressed mood in injured workers need to be considered in rehabilitation. Focusing on resources like social support and self-efficacy could be protective against depressed mood. The early detection of low social support and low self-efficacy might be important in long-term rehabilitation processes.
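The moderation test reported above (self-efficacy moderating the pain-depressed mood relationship) amounts to an interaction term in a hierarchical regression; the sketch below uses made-up scores and variable names purely to show the structure of such a model.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Made-up data: one row per injured worker.
    df = pd.DataFrame({
        "depressed_t2":   [4, 7, 3, 9, 5, 6, 2, 8],   # depressed mood, one year post-injury
        "pain":           [2, 6, 1, 7, 3, 5, 1, 6],
        "self_efficacy":  [8, 5, 9, 2, 4, 7, 9, 3],
        "social_support": [6, 4, 7, 3, 6, 5, 8, 2],
    })

    # Center the predictors before forming the interaction term.
    for col in ["pain", "self_efficacy"]:
        df[col + "_c"] = df[col] - df[col].mean()

    model = smf.ols(
        "depressed_t2 ~ pain_c * self_efficacy_c + social_support", data=df).fit()
    print(model.params)   # pain_c:self_efficacy_c carries the moderation effect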
Abstract:
OBJECTIVES Evidence is increasing that cognitive failure may be used to screen for drivers at risk. Until now, most studies have relied on driving learners. This exploratory pilot study examines self-reported cognitive failure in driving beginners and errors during real driving as observed by driving instructors. METHODS Forty-two driving learners of 14 driving instructors filled out a work-related cognitive failure questionnaire. Driving instructors observed driving errors during the next driving lesson. In a multiple linear regression analysis, driving errors were regressed on cognitive failure, with the number of driving lessons controlled for as an estimate of driving experience. RESULTS Higher cognitive failure predicted more driving errors (p < .01) when age, gender and driving experience were controlled for in the analysis. CONCLUSIONS Cognitive failure was significantly associated with observed driving errors. Systematic research on cognitive failure in driving beginners is recommended.
Abstract:
OBJECTIVE To determine the frequency of and risk factors for complications associated with casts in horses. DESIGN Multicenter retrospective case series. ANIMALS 398 horses with a half-limb or full-limb cast treated at 1 of 4 hospitals. PROCEDURES Data collected from medical records included age, breed, sex, injury, limb affected, time from injury to hospital admission, surgical procedure performed, type of cast (bandage cast [BC; fiberglass tape applied over a bandage] or traditional cast [TC; fiberglass tape applied over polyurethane resin-impregnated foam]), limb position in cast (flexed, neutral, or extended), and complications. Risk factors for cast complications were identified via multiple logistic regression. RESULTS Cast complications were detected in 197 of 398 (49%) horses (18/53 [34%] horses with a BC and 179/345 [52%] horses with a TC). Of the 197 horses with complications, 152 (77%) had clinical signs of complications prior to cast removal; the most common clinical signs were increased lameness severity and visibly detectable soft tissue damage. Cast sores were the most common complication (179/398 [45%] horses). Casts broke for 20 (5%) horses. Three (0.8%) horses developed a bone fracture attributable to casting. Median time to detection of complications was 12 days and 8 days for horses with TCs and BCs, respectively. Complications developed in 71%, 48%, and 47% of horses with the casted limb in a flexed, neutral, and extended position, respectively. For horses with TCs, hospital, limb position in the cast, and sex were significant risk factors for development of cast complications. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that 49% of horses with a cast developed cast complications.
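For readers unfamiliar with the multiple logistic regression used to identify risk factors, the sketch below shows how categorical predictors such as limb position, sex, and hospital could enter such a model; the data are simulated and the variable names and risk levels are illustrative only.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated case records: one row per horse with a traditional cast.
    rng = np.random.default_rng(5)
    n = 345
    df = pd.DataFrame({
        "position": rng.choice(["flexed", "neutral", "extended"], n),
        "sex":      rng.choice(["male", "female"], n),
        "hospital": rng.choice(["A", "B", "C", "D"], n),
    })
    base_risk = {"flexed": 0.70, "neutral": 0.48, "extended": 0.47}
    df["complication"] = rng.binomial(1, df["position"].map(base_risk).to_numpy())

    fit = smf.logit("complication ~ C(position) + C(sex) + C(hospital)",
                    data=df).fit(disp=0)
    print(np.exp(fit.params))   # odds ratios for each risk-factor level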
Abstract:
Objective: To assess the relationship among Type D personality, self-efficacy, and medication adherence in patients with coronary heart disease. Methods: The study design was prospective and observational. Type D personality, self-efficacy for illness management behaviors, and medication adherence were measured 3 weeks after hospitalization for acute coronary syndrome in 165 patients (mean [standard deviation] age = 61.62 [10.61] years, 16% women). Self-reported medication adherence was measured 6 months later in 118 of these patients. Multiple linear regression and mediation analyses were used to address the study research questions. Results: Using the original categorical classification, 30% of patients with acute coronary syndrome were classified as having Type D personality. Categorically defined patients with Type D personality had significantly poorer medication adherence at 6 months (r = -0.29, p < .01). Negative affectivity (NA; r = -0.25, p = .01) and social inhibition (r = -0.19, p = .04), the components of Type D personality, were associated with medication adherence 6 months after discharge in bivariate analyses. There was no evidence for the interaction of NA and social inhibition, that is, Type D personality, in the prediction of medication adherence 6 months after discharge in multivariate analysis. The observed association between NA and medication adherence 6 months after discharge could be partly explained by indirect effects through self-efficacy in mediation analysis (coefficient = -0.012; 95% bias-corrected and accelerated confidence interval = -0.036 to -0.001). Conclusions: The present data suggest the primacy of NA over the Type D personality construct in predicting medication adherence. Lower levels of self-efficacy may be a mediator between higher levels of NA and poor adherence to medication in patients with coronary heart disease.
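The mediation result above (an indirect effect of NA on adherence through self-efficacy) can be illustrated with a bootstrap of the indirect effect a x b; the sketch below uses simulated standardized scores and a plain percentile bootstrap rather than the bias-corrected and accelerated interval reported in the study.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 118
    # Simulated standardized scores: negative affectivity (x),
    # self-efficacy (m), 6-month medication adherence (y).
    x = rng.normal(size=n)
    m = -0.4 * x + rng.normal(size=n)
    y = 0.3 * m - 0.1 * x + rng.normal(size=n)

    def indirect_effect(x, m, y):
        a = np.polyfit(x, m, 1)[0]                    # path a: x -> m
        X = np.column_stack([np.ones_like(x), m, x])
        b = np.linalg.lstsq(X, y, rcond=None)[0][1]   # path b: m -> y, controlling for x
        return a * b

    boot = np.array([
        indirect_effect(x[idx], m[idx], y[idx])
        for idx in (rng.integers(0, n, n) for _ in range(5000))
    ])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")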
Abstract:
Background: Few studies have examined the 20% of individuals who never experience an episode of low back pain (LBP). To date, no investigation has been undertaken that examines a group who claim to have never experienced LBP in their lifetime in comparison to two population-based case–control groups with and without momentary LBP. This study investigates whether LBP-resilient workers between 50 and 65 years had better general health, demonstrated more positive health behaviour and were better able to achieve routine activities compared with both case–control groups. Methods: Forty-two LBP-resilient participants completed the same pain assessment questionnaire as a population-based LBP sample from a nationwide, large-scale cross-sectional survey in Switzerland. The LBP-resilient participants were pairwise compared to the propensity score-matched case controls by exploring differences in demographic and work characteristics, and by calculating odds ratios (ORs) and effect sizes. A discriminant analysis explored group differences, while the multiple logistic regression analysis specified single indicators which accounted for group differences. Results: LBP-resilient participants were healthier than the case controls with momentary LBP and achieved routine activities more easily. Compared to controls without momentary LBP, LBP-resilient participants had a higher vitality, a lower workload, a healthier attitude towards health and behaved more healthily by drinking less alcohol. Conclusions: Given the demonstrated difference between LBP-resilient participants and controls without momentary LBP, the question arises as to what additional knowledge can be attained. Three underlying traits seem to be relevant for LBP-resilient participants: personality, favourable work conditions and subjective attitudes/attributions towards health. These rationales have to be considered with respect to LBP prevention.
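The propensity score matching mentioned above pairs each LBP-resilient participant with the survey respondent who has the closest estimated probability of group membership given a set of covariates; the sketch below shows one common variant (logistic propensity model, 1:1 nearest-neighbour matching with replacement) using simulated data and illustrative covariates.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    # Simulated pooled data: 'resilient' marks the 42 LBP-resilient workers.
    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "resilient": np.r_[np.ones(42, int), np.zeros(400, int)],
        "age":       rng.uniform(50, 65, 442),
        "male":      rng.integers(0, 2, 442),
        "workload":  rng.normal(0, 1, 442),
    })

    covs = ["age", "male", "workload"]
    model = LogisticRegression(max_iter=1000).fit(df[covs], df["resilient"])
    df["ps"] = model.predict_proba(df[covs])[:, 1]    # propensity score

    cases, pool = df[df.resilient == 1], df[df.resilient == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(pool[["ps"]])
    _, idx = nn.kneighbors(cases[["ps"]])
    matched_controls = pool.iloc[idx.ravel()]         # matching with replacement
    print(cases[covs].mean(), matched_controls[covs].mean(), sep="\n")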
Abstract:
OBJECTIVE: Occupational low back pain (LBP) is considered to be the most expensive form of work disability, with the socioeconomic costs of persistent LBP exceeding the costs of acute and subacute LBP by far. This makes the early identification of patients at risk of developing persistent LBP essential, especially in working populations. The aim of the study was to evaluate both risk factors (for the development of persistent LBP) and protective factors (preventing the development of persistent LBP) in the same cohort. PARTICIPANTS: An inception cohort of 315 patients with acute to subacute or with recurrent LBP was recruited from 14 health practitioners (twelve general practitioners and two physiotherapists) across New Zealand. METHODS: Patients with persistent LBP at six-month follow-up were compared to patients with non-persistent LBP looking at occupational, psychological, biomedical and demographic/lifestyle predictors at baseline using multiple logistic regression analyses. All significant variables from the different domains were combined into a single predictive model. RESULTS: A final two-predictor model with an overall predictive value of 78% included social support at work (OR 0.67; 95%CI 0.45 to 0.99) and somatization (OR 1.08; 95%CI 1.01 to 1.15). CONCLUSIONS: Social support at work should be considered as a resource preventing the development of persistent LBP whereas somatization should be considered as a risk factor for the development of persistent LBP. Further studies are needed to determine if addressing these factors in workplace interventions for patients suffering from acute, subacute or recurrent LBP prevents subsequent development of persistent LBP.
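To make the final model concrete, the sketch below fits a two-predictor logistic regression on simulated baseline data and derives odds ratios with 95% confidence intervals and an overall predictive value; the predictor names follow the abstract, while the data, scales, and classification cut-off are illustrative assumptions.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Simulated baseline data: work social support, somatization, and
    # persistent LBP status at six-month follow-up (1 = persistent).
    rng = np.random.default_rng(3)
    n = 315
    df = pd.DataFrame({
        "social_support": rng.normal(10, 3, n),
        "somatization":   rng.normal(20, 7, n),
    })
    logit_p = -1.5 - 0.4 * (df["social_support"] - 10) + 0.08 * (df["somatization"] - 20)
    df["persistent"] = rng.binomial(1, (1 / (1 + np.exp(-logit_p))).to_numpy())

    X = sm.add_constant(df[["social_support", "somatization"]])
    fit = sm.Logit(df["persistent"], X).fit(disp=0)

    odds_ratios = np.exp(fit.params)      # OR per one-unit increase
    or_ci = np.exp(fit.conf_int())        # 95% CI on the OR scale
    accuracy = ((fit.predict(X) > 0.5) == df["persistent"]).mean()
    print(odds_ratios, or_ci, f"overall predictive value = {accuracy:.0%}", sep="\n")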
Abstract:
Although accumulating evidence indicates that local intraspecific density-dependent effects are not as rare in species-rich communities as previously suspected, there are still very few detailed and systematic neighborhood analyses of species-rich communities. Here, we provide such an analysis with the overall goal of quantifying the relative importance of inter- and intraspecific interaction strength in a primary, lowland dipterocarp forest located at Danum, Sabah, Malaysia. Using data on 10 abundant overstory dipterocarp species from two 4-ha permanent plots, we evaluated the effects of neighbors on the absolute growth rate of focal trees (from 1986 to 1996) over increasing neighborhood radii (from 1 to 20 m) with multiple regressions. Only trees 10 cm to < 100 cm girth at breast height in 1986 were considered as focal trees. Among neighborhood models with one neighbor term, models including only conspecific larger trees performed best in five out of 10 species. Negative effects of conspecific larger neighbors were most apparent in large overstory species such as those of the genus Shorea. However, neighborhood models with separate terms and radii for heterospecific and conspecific neighbors accounted for more variability in absolute growth rates than did neighborhood models with one neighbor term. The conspecific term was significant for nine out of 10 species. Moreover, in five out of 10 species, trees without conspecific neighbors had significantly higher absolute growth rates than trees with conspecific neighbors. Averaged over the 10 species, trees without conspecific neighbors grew 32.4 cm2 in basal area from 1986 to 1996, whereas trees with conspecific neighbors only grew 14.7 cm2 in basal area, although there was no difference in initial basal area between trees in the two groups. Averaged across the six species of the genus Shorea, negative effects of conspecific larger trees were significantly stronger than for heterospecific larger neighbors. Thus, high local densities within neighborhoods of 20 m may lead to strong negative intraspecific and, hence, density-dependent effects even in species-rich communities with low overall densities at larger spatial scales. We conjecture that the strength of conspecific effects may be correlated with the degree of host specificity of ectomycorrhizae.
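The neighborhood analysis described above can be sketched as follows; the stem-map file, column names, and the use of summed neighbor basal area as the competition term are assumptions for illustration, not the authors' exact specification.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from scipy.spatial import cKDTree

    def neighbor_terms(stems, radius):
        """Per focal tree: summed basal area of conspecific and heterospecific
        neighbors within `radius` metres (the focal tree itself is excluded)."""
        coords = stems[["x", "y"]].to_numpy()
        species = stems["species"].to_numpy()
        ba = stems["ba"].to_numpy()
        tree = cKDTree(coords)
        con, het = np.zeros(len(stems)), np.zeros(len(stems))
        for i in range(len(stems)):
            for j in tree.query_ball_point(coords[i], radius):
                if j == i:
                    continue
                if species[j] == species[i]:
                    con[i] += ba[j]
                else:
                    het[i] += ba[j]
        return con, het

    # Hypothetical stem map of one 4-ha plot: coordinates (m), species code,
    # 1986 basal area (cm2), and 1986-1996 absolute basal-area growth (cm2).
    # stems = pd.read_csv("danum_plot1.csv")
    # for radius in range(1, 21):
    #     con, het = neighbor_terms(stems, radius)
    #     X = sm.add_constant(pd.DataFrame({"ba0": stems["ba"], "con": con, "het": het}))
    #     print(radius, round(sm.OLS(stems["growth"], X).fit().rsquared, 3))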
Abstract:
AIM Several surveys evaluate different retention approaches among orthodontists, but none exist for general dentists. The primary aim of this survey was to record the preferred fixed retainer designs and retention protocols amongst general dentists and orthodontists in Switzerland. A secondary aim was to investigate whether retention patterns were associated with parameters such as gender, university of graduation, time in practice, and specialist status. METHODS An anonymized questionnaire was distributed to general dentists (n = 401) and orthodontists (n = 398) practicing in the German-speaking part of Switzerland. A total of 768 questionnaires could be delivered; 562 (73.2%) were returned and evaluated. Descriptive statistics were performed and responses to questions of interest were converted to binary outcomes and analyzed using multiple logistic regression. Any associations between the answers and gender, university of graduation (Swiss or foreign), years in practice, and specialist status (orthodontist/general dentist) were assessed. RESULTS Almost all responding orthodontists (98.0%) and nearly a third of general dentists (29.6%) reported bonding fixed retainers regularly. The answers were not associated with the practitioner's gender. The university of graduation and number of years in practice had a moderate impact on the responses. The answers were mostly influenced by specialist status. CONCLUSION Graduation school, years in practice, and specialist status influence retention protocol, and evidence-based guidelines for fixed retention should be issued to minimize these effects. Based on the observation that bonding and maintenance of retainers are also performed by general dentists, these guidelines should be taught in dental school and not during post-graduate training.
Abstract:
BACKGROUND The main goal of this study was to assess frequency, clinical correlates, and independent predictors of fatigue in a homogeneous cohort of well-defined glioblastoma patients at baseline prior to combined radio-chemotherapy. METHODS We prospectively included 65 glioblastoma patients at postsurgical baseline and assessed fatigue, sleepiness, mean bedtimes, mood disturbances, and clinical characteristics such as clinical performance status, presenting symptomatology, details on neurosurgical procedure, and tumor location and diameter as well as pharmacological treatment including antiepileptic drugs, antidepressants, and use of corticosteroids. Data on fatigue and sleepiness were measured with the Fatigue Severity Scale and the Epworth Sleepiness Scale, respectively, and compared with 130 age- and sex-matched healthy controls. RESULTS We observed a significant correlation between fatigue and sleepiness scores in both patients (r = 0.26; P = .04) and controls (r = 0.36; P < .001). Fatigue was more common in glioblastoma patients than in healthy controls (48% vs 11%; P < .001), whereas the frequency of sleepiness was not (22% vs 19%; P = .43). Female sex was associated with increased fatigue frequency among glioblastoma patients but not among control participants. Multiple linear regression analyses identified depression, left-sided tumor location, and female sex as strongest associates of baseline fatigue severity. CONCLUSIONS Our findings indicate that glioblastoma patients are frequently affected by fatigue at baseline, suggesting that factors other than those related to radio- or chemotherapy have significant impact, particularly depression and tumor localization.