995 results for multivariate analysis
Abstract:
BACKGROUND: Previous cross-sectional studies report that cognitive impairment is associated with poor psychosocial functioning in euthymic bipolar patients. There is a lack of long-term studies to determine the course of cognitive impairment and its impact on functional outcome. METHODS: A total of 54 subjects were assessed at baseline and 6 years later; 28 had DSM-IV TR bipolar I or II disorder (recruited, at baseline, from a Lithium Clinic Program) and 26 were healthy matched controls. They were all assessed with a cognitive battery tapping into the main cognitive domains (executive function, attention, processing speed, verbal memory and visual memory) twice over a 6-year follow-up period. All patients were euthymic (Hamilton Rating Scale for Depression score lower than 8 and Young Mania Rating Scale score lower than 6) for at least 3 months before both evaluations. At the end of follow-up, psychosocial functioning was also evaluated by means of the Functioning Assessment Short Test. RESULTS: Repeated-measures multivariate analysis of covariance showed main effects of group in the executive, inhibition, processing speed, and verbal memory domains (p<0.04). Among the clinical factors, only longer illness duration was significantly related to slow processing (p=0.01), whereas strong relationships were observed between cognitive impairment over time and poorer psychosocial functioning (p<0.05). CONCLUSIONS: Executive functioning, inhibition, processing speed and verbal memory were impaired in euthymic bipolar out-patients. Although cognitive deficits remained stable on average throughout the follow-up, they had enduring negative effects on the psychosocial adaptation of patients.
Abstract:
The aim of this study is to evaluate the risk and the results of surgical treatment for perforated peptic ulcer (PPU), to compare them over time, and to determine the current optimal surgical treatment. In a retrospective study, the charts of all patients admitted for PPU between January 1976 and October 1991 were reviewed. The features believed to be of importance in the outcome were assessed for statistical analysis. A comparison was made between three periods of the study (1976-1980, 1981-1985, 1986-1991). A total of 247 patients were included. Mortality was 11.7% (29/247). Factors associated with an increased mortality were: shock on admission (p = 0.01), age (p < 0.001), severe associated medical illnesses (p < 0.001) and the form of treatment (p < 0.01). After multivariate analysis, only shock on admission and associated disease remained significant. Chronic peptic ulcer disease occurred in 76% of the patients. Comparing the periods showed that age, associated illnesses, the percentage of acute or subacute ulcers, mortality, and the number of patients all increased over time. The main determinant of surgical treatment for PPU is the patient and his/her general state. Because of the high frequency of chronic peptic ulcer disease, we believe that the gold standard in the treatment of PPU remains definitive surgery. However, in the presence of more than one risk factor, suture and patch are probably safer.
Abstract:
Introduction: The prevalence of multimorbidity (MM) in hospitalized patients is increasing and is recognized as an important factor that may modify treatment strategies and increase the length of stay. Little is currently known about the prevalence of MM in the general population and whether measured or self-reported diseases differ in the outpatient setting compared to hospitalized patients. The objective of the study was, therefore, to assess the prevalence of self-reported and measured MM in a representative sample of the general population aged 35-75 years in Switzerland. Method: Data were obtained from the population-based CoLaus Study: 3712 participants (1965 women, 50±9 years). MM was defined as presenting >=2 morbidities according to a list of 27 items (either measured or self-reported data, according to Barret et al.) or a Functional Comorbidity Index (FCI) (18 items, measured only). Results: The prevalence of MM according to these three definitions is summarized in Table 1. For all definitions, the prevalence of MM was higher in women, elderly participants, those with lower education levels, Swiss nationals, former smokers and obese participants. The prevalence of MM when measured data were used was significantly higher than when self-reported data were used (p<0.001). Multivariate analysis confirmed most of these associations, except that no difference was found for educational level and for overweight participants. Conclusion: The prevalence of MM is high in the general population, ranging from 13.8% to 50.3% even in the younger age group. The prevalence is higher in women, and increases with age and weight. The prevalence varies considerably according to the definition and is lower when using self-reported compared to measured data.
Abstract:
AIM: To assess the predictors of a significant decrease or cessation of substance use (SU) in a treated epidemiological cohort of first-episode psychosis (FEP) patients. METHOD: Participants were FEP patients of the Early Psychosis Prevention and Intervention Centre in Australia. Patients' medical files were reviewed using a standardized file audit. Data on 432 patients with FEP and baseline co-morbid substance use disorder (SUD) were available for analysis. Predictors of reduction/cessation of SU at follow-up were examined using logistic regression analyses. RESULTS: In univariate analyses, a reduction/cessation of SU was predicted by baseline measures reflecting higher education, employment, accommodation with others, cannabis use disorder (CUD) only (rather than poly-SUDs), better global functioning and better premorbid social and occupational functioning, later age at onset of psychosis, and a diagnosis of non-affective psychosis. In multivariate analysis, CUD alone and better premorbid social and occupational functioning remained significant predictors. CONCLUSIONS: Addressing SUDs and social and occupational goals in people with FEP may offer opportunities to prevent SUDs becoming more severe or entrenched. Further longitudinal research on recovery from SU and FEP is needed to disentangle directions of influence and identify key targets for intervention.
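The predictor screening described above rests on logistic regression, whose coefficients (on the log-odds scale) exponentiate to odds ratios. As a minimal sketch of the technique, not the study's actual model, here is a Newton-Raphson fit on simulated data; the single predictor and all numbers are hypothetical.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit a logistic regression by Newton-Raphson.

    X: (n, p) design matrix (include a column of ones for the intercept).
    y: (n,) binary outcome, e.g. 1 = reduced/ceased substance use.
    Returns the coefficient vector beta; exp(beta) gives odds ratios.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted probabilities
        W = p * (1.0 - p)                     # IRLS weights
        grad = X.T @ (y - p)                  # score vector
        hess = X.T @ (X * W[:, None])         # observed information matrix
        beta += np.linalg.solve(hess, grad)   # Newton step
    return beta

# Hypothetical toy data: one continuous predictor (think: a premorbid
# functioning score), true intercept -0.5 and slope 1.2.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
logit = -0.5 + 1.2 * x
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-logit))).astype(float)
X = np.column_stack([np.ones(500), x])
beta = fit_logistic(X, y)
print(np.round(beta, 2))  # estimates should land near (-0.5, 1.2)
```

With n = 500 the recovered coefficients sit close to the generating values; in practice one would also report standard errors from the inverse of the information matrix.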
Abstract:
BACKGROUND & AIMS: The prognostic value of the different causes of renal failure in cirrhosis is not well established. This study investigated the predictive value of the cause of renal failure in cirrhosis. METHODS: Five hundred sixty-two consecutive patients with cirrhosis and renal failure (as defined by serum creatinine ≥1.5 mg/dL on 2 successive determinations within 48 hours) hospitalized over a 6-year period in a single institution were included in a prospective study. The cause of renal failure was classified into 4 groups: renal failure associated with bacterial infections, renal failure associated with volume depletion, hepatorenal syndrome (HRS), and parenchymal nephropathy. The primary end point was survival at 3 months. RESULTS: Four hundred sixty-three patients (82.4%) had renal failure that could be classified in 1 of the 4 groups. The most frequent was renal failure associated with infections (213 cases; 46%), followed by hypovolemia-associated renal failure (149; 32%), HRS (60; 13%), and parenchymal nephropathy (41; 9%). The remaining patients had a combination of causes or miscellaneous conditions. Prognosis was markedly different according to cause of renal failure, the 3-month probability of survival being 73% for parenchymal nephropathy, 46% for hypovolemia-associated renal failure, 31% for renal failure associated with infections, and 15% for HRS (P < .0005). In a multivariate analysis adjusted for potentially confounding variables, cause of renal failure was independently associated with prognosis, together with MELD score, serum sodium, and hepatic encephalopathy at the time of diagnosis of renal failure. CONCLUSIONS: A simple classification of patients with cirrhosis according to cause of renal failure is useful in the assessment of prognosis and may help in decision making in liver transplantation.
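Three-month survival probabilities like those quoted per cause of renal failure are conventionally Kaplan-Meier estimates, which handle censored follow-up. As a sketch under stated assumptions, here is a from-scratch estimator applied to a hypothetical ten-patient toy cohort (not the study's data):

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival estimate.

    time:  follow-up times (here: days, capped at 90 for 3-month survival).
    event: 1 if death was observed at that time, 0 if censored.
    Returns (distinct death times, survival probability just after each).
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    s, times, surv = 1.0, [], []
    for t in np.unique(time):                        # unique() returns sorted times
        d = int(np.sum((time == t) & (event == 1)))  # deaths at t
        n = int(np.sum(time >= t))                   # at risk just before t
        if d > 0:
            s *= 1.0 - d / n                         # multiply conditional survival
            times.append(t)
            surv.append(s)
    return np.array(times), np.array(surv)

# Hypothetical toy cohort: follow-up in days, 1 = died, 0 = censored.
t = [5, 12, 20, 30, 30, 45, 60, 75, 90, 90]
e = [1,  1,  0,  1,  1,  0,  1,  0,  0,  0]
times, surv = kaplan_meier(t, e)
print(times, np.round(surv, 3))  # final value ≈ 0.429: estimated 3-month survival
```

Comparing such curves across the four etiologic groups (e.g. by a log-rank test) is the usual next step before the adjusted multivariate analysis.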
Abstract:
In soccer, dead-ball moves are those in which the ball is returned to play from a stationary position following an interruption of play. The aim of this study was to analyse the effectiveness of one such dead-ball move, namely corner kicks, and to identify the key variables that determine the success of a shot or header following a corner, thereby enabling a model of successful corner kicks to be proposed. We recorded 554 corner kicks performed during the 2010 World Cup in South Africa and carried out a univariate, bivariate and multivariate analysis of the data. The results indicated that corners were of limited effectiveness in terms of the success of subsequent shots or headers. The analysis also revealed a series of variables that were significantly related to one another, and this enabled us to propose an explanatory model. Although this model had limited explanatory power, it nonetheless helps to understand the execution of corner kicks in practical terms.
Abstract:
Background: The possible additional risk of infection in patients receiving induction with both basiliximab (Ba) and thymoglobulin (Th) is unclear. We assessed the 1-year incidence of infectious complications in 3 groups of kidney transplant recipients according to the type of induction therapy received. Methods: We compared the incidence of infection at 1 year in 3 groups of patients at our institution: first transplant recipients received Ba 20 mg at days 0 and 4 (Group Ba); in case of retransplantation or if PRA was >20%, patients received Th 1 mg/kg for 3-5 days (Group Th); in case of delayed graft function (DGF), Ba was discontinued and Th was initiated (Group Ba+Th), or Th was prolonged in Group Th. Kaplan-Meier curves were used to calculate the incidence of infection. A Cox analysis was used to identify risk factors for the development of infection. Results: Over 5 years, 170 consecutive kidney transplants were performed: n=113 in Group Ba, n=39 in Group Th and n=18 in Group Ba+Th. As expected, more patients in Group Th received a second transplant (p<0.001). No differences in CMV serostatus were observed between groups (p=0.9). Incidences of CMV infection, CMV disease, BK viremia, BK nephropathy and urinary tract infection (UTI) are shown in Table 1.

Table 1. Incidences of infection according to type of induction
                 Group Ba (n=113)   Group Th (n=38)   Group Ba+Th (n=18)
CMV infection    31 (27%)           20 (51%)          8 (44%)
CMV disease      7 (6%)             4 (10%)           0
BK viremia       11 (8%)            5 (13%)           4 (22%)
BK nephropathy   5 (4%)             1 (2%)            2 (11%)
UTI              43 (38%)           23 (59%)          6 (33%)

In a multivariate model taking into account CMV serostatus, age, pretransplant dialysis, type of organ transplanted, number of transplants and type of induction, Group Ba carried a lower risk of CMV infection (OR 0.45, p=0.006) and UTI (OR 0.6, p=0.05), but there were no differences in CMV disease (p=0.38). There was a trend towards a higher incidence of BK viremia, but not nephropathy, in Group Ba+Th (OR 2.2, p=0.23). There were no significant differences in kidney function or graft loss at 1 year between groups. Conclusion: By multivariate analysis, we observed a lower risk of CMV infection and UTI in patients receiving Ba. Group Ba+Th had a risk of infection similar to that of the group receiving Th alone. Larger studies are needed to clarify whether combining Ba+Th in the setting of DGF may increase the risk of infectious complications, in particular BK infection.
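The adjusted OR of 0.45 quoted above comes from the multivariate Cox model; the crude (unadjusted) contrast can be checked directly from the Table 1 counts. A minimal sketch, using the reported CMV-infection counts for Group Ba (31/113) and Group Th (20/39, per the text's group size):

```python
import math

# 2x2 table for CMV infection, Group Ba vs Group Th, from the reported counts.
a, b = 31, 113 - 31   # Ba: infected, not infected
c, d = 20, 39 - 20    # Th: infected, not infected

odds_ratio = (a * d) / (b * c)
# Woolf (log-OR) standard error and 95% confidence interval
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"crude OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The crude OR (about 0.36) is in the same direction as, but not identical to, the adjusted estimate, which is expected once serostatus, age and the other covariates are accounted for.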
Abstract:
OBJECTIVES: Patients with inflammatory bowel disease (IBD) have a high resource consumption, with considerable costs for the healthcare system. In a system with sparse resources, treatment is influenced not only by clinical judgement but also by resource consumption. We aimed to determine the resource consumption of IBD patients and to identify its significant predictors. MATERIALS AND METHODS: Data from the prospective Swiss Inflammatory Bowel Disease Cohort Study were analysed for the resource consumption endpoints hospitalization and outpatient consultations at enrolment [1187 patients; 41.1% ulcerative colitis (UC), 58.9% Crohn's disease (CD)] and at 1-year follow-up (794 patients). Predictors of interest were chosen through an expert panel and a review of the relevant literature. Logistic regressions were used for binary endpoints, and negative binomial regressions and zero-inflated Poisson regressions were used for count data. RESULTS: For CD, fistula, use of biologics and disease activity were significant predictors for hospitalization days (all P-values <0.001); age, sex, steroid therapy and biologics were significant predictors for the number of outpatient visits (P=0.0368, 0.023, 0.0002, 0.0003, respectively). For UC, biologics, C-reactive protein, smoking cessation, age and sex were significantly predictive for hospitalization days (P=0.0167, 0.0003, 0.0003, 0.0076 and 0.0175, respectively); disease activity and immunosuppressive therapy predicted the number of outpatient visits (P=0.0009 and 0.0017, respectively). The results of the multivariate regressions are shown in detail. CONCLUSION: Several highly significant clinical predictors for resource consumption in IBD were identified that might be considered in medical decision-making. In terms of resource consumption and its predictors, CD and UC behave differently.
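Negative binomial (rather than plain Poisson) regression is the natural choice here because healthcare-use counts are typically overdispersed: their variance exceeds their mean, violating the Poisson assumption. A hedged sketch of the diagnostic, on simulated (not study) data, using the fact that a gamma-Poisson mixture is exactly a negative binomial:

```python
import numpy as np

# Simulate hypothetical outpatient-visit counts with patient-level rate
# heterogeneity: gamma-distributed rates fed into a Poisson draw produce
# negative-binomial (overdispersed) counts.
rng = np.random.default_rng(1)
lam = rng.gamma(shape=2.0, scale=1.5, size=2000)  # patient-level visit rates
visits = rng.poisson(lam)

m, v = visits.mean(), visits.var()
dispersion = v / m
print(f"mean={m:.2f}, variance={v:.2f}, dispersion={dispersion:.2f}")
# A dispersion ratio well above 1 indicates overdispersion; a Poisson model
# would understate standard errors, so a negative binomial fits better.
```

For a Poisson process the ratio would sit near 1; the gamma mixing here pushes it to roughly (1 + scale) ≈ 2.5, the situation the abstract's count models are designed for.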
Abstract:
PURPOSE: Positron emission tomography with (18)F-fluorodeoxyglucose (FDG-PET) was used to evaluate treatment response in patients with gastrointestinal stromal tumors (GIST) after administration of sunitinib, a multitargeted tyrosine kinase inhibitor, after imatinib failure. PATIENTS AND METHODS: Tumor metabolism was assessed with FDG-PET before and after the first 4 weeks of sunitinib therapy in 23 patients who received one to 12 cycles of sunitinib therapy (4 weeks of 50 mg/d, 2 weeks off). Treatment response was expressed as the percent change in maximal standardized uptake values (SUV). The primary end point of time to tumor progression was compared with early PET results on the basis of traditional Response Evaluation Criteria in Solid Tumors (RECIST) criteria. RESULTS: Progression-free survival (PFS) was correlated with early FDG-PET metabolic response (P < .0001). Using -25% and +25% thresholds for SUV variations from baseline, early FDG-PET response was stratified in metabolic partial response, metabolically stable disease, or metabolically progressive disease; median PFS rates were 29, 16, and 4 weeks, respectively. Similarly, when a single FDG-PET positive/negative was considered after 4 weeks of sunitinib, the median PFS was 29 weeks for SUVs less than 8 g/mL versus 4 weeks for SUVs of 8 g/mL or greater (P < .0001). None of the patients with metabolically progressive disease subsequently responded according to RECIST criteria. Multivariate analysis showed shorter PFS in patients who had higher residual SUVs (P < .0001), primary resistance to imatinib (P = .024), or nongastric GIST (P = .002), regardless of the mutational status of the KIT and PDGFRA genes. CONCLUSION: Week 4 FDG-PET is useful for early assessment of treatment response and for the prediction of clinical outcome. Thus, it offers opportunities to individualize and optimize patient therapy.
Abstract:
OBJECTIVE: Critically ill patients are at high risk of malnutrition. Insufficient nutritional support remains a widespread problem despite guidelines. The aim of this study was to measure the clinical impact of a two-step interdisciplinary quality nutrition program. DESIGN: Prospective interventional study over three periods (A, baseline; B and C, intervention periods). SETTING: Mixed intensive care unit within a university hospital. PATIENTS: Five hundred seventy-two patients (age 59 ± 17 yrs) requiring >72 hrs of intensive care unit treatment. INTERVENTION: Two-step quality program: 1) bottom-up implementation of a feeding guideline; and 2) additional presence of an intensive care unit dietitian. The nutrition protocol was based on the European guidelines. MEASUREMENTS AND MAIN RESULTS: Anthropometric data, intensive care unit severity scores, energy delivery, cumulated energy balance (daily, day 7, and discharge), feeding route (enteral, parenteral, combined, none, oral), length of intensive care unit and hospital stay, and mortality were collected. Altogether 5800 intensive care unit days were analyzed. Patients in period A were healthier, with lower Simplified Acute Physiologic Scale scores and a lower proportion of "rapidly fatal" McCabe scores. Energy delivery and balance increased gradually: the impact was particularly marked on the cumulated energy deficit at day 7, which improved from -5870 kcal to -3950 kcal (p < .001). Feeding technique changed significantly, with a progressive increase in days with nutrition therapy (A: 59% of days, B: 69%, C: 71%, p < .001); use of enteral nutrition increased from A to B (stable in C), and days on combined and parenteral nutrition increased progressively. Oral energy intakes were low (mean: 385 kcal/day, 6 kcal/kg/day). Hospital mortality increased with severity of condition in periods B and C. CONCLUSION: A bottom-up protocol improved nutritional support. The presence of the intensive care unit dietitian brought significant additional improvement, related to the early introduction and route of feeding, and achieved an overall better early energy balance.
Abstract:
OBJECTIVE: To determine the incidence and risk factors of electrical seizures and other electrical epileptic activity using continuous EEG (cEEG) in patients with acute stroke. METHODS: One hundred consecutive patients with acute stroke admitted to our stroke unit underwent cEEG using 10 electrodes. In addition to electrical seizures, repetitive focal sharp waves (RSHWs), repetitive focal spikes (RSPs), and periodic lateralized epileptic discharges (PLEDs) were recorded. RESULTS: In the 100 patients, cEEG was recorded for a mean duration of 17 hours 34 minutes (range 1 hour 12 minutes to 37 hours 10 minutes). Epileptic activity occurred in 17 patients and consisted of RSHWs in seven, RSPs in seven, and PLEDs in three. Electrical seizures occurred in two patients. On univariate Cox regression analysis, predictors for electrical epileptic activity were stroke severity (high score on the National Institutes of Health Stroke Scale) (hazard ratio [HR] 1.12; p = 0.002), cortical involvement (HR 5.71; p = 0.021), and thrombolysis (HR 3.27; p = 0.040). Age, sex, stroke type, use of EEG-modifying medication, and cardiovascular risk factors were not predictors of electrical epileptic activity. On multivariate analysis, stroke severity was the only independent predictor (HR 1.09; p = 0.016). CONCLUSION: In patients with acute stroke, electrical epileptic activity occurs more frequently than previously suspected.
Abstract:
Background: Though commercial production of polychlorinated biphenyls was banned in the United States in 1977, exposure continues due to their environmental persistence. Several studies have examined the association between environmental polychlorinated biphenyl exposure and modulations of the secondary sex ratio, with conflicting results. Objective: Our objective was to evaluate the association between maternal preconceptional occupational polychlorinated biphenyl exposure and the secondary sex ratio. Methods: We examined primipara singleton births of 2595 women who worked in three capacitor plants for at least one year during the period polychlorinated biphenyls were used. Cumulative estimated maternal occupational polychlorinated biphenyl exposure at the time of the infant's conception was calculated from plant-specific job exposure matrices. A logistic regression analysis was used to evaluate the association between maternal polychlorinated biphenyl exposure and male sex at birth (yes/no). Results: Maternal body mass index at age 20, smoking status, and race did not vary between those occupationally exposed and those unexposed before the child's conception. Polychlorinated biphenyl-exposed mothers were, however, more likely to have used oral contraceptives and to have been older at the birth of their first child than non-occupationally exposed women. Among 1506 infants liveborn to polychlorinated biphenyl-exposed primiparous women, 49.8% were male, compared to 49.9% among those not exposed (n = 1089). Multivariate analyses controlling for mother's age and year of birth found no significant association between the odds of a male birth and mother's cumulative estimated polychlorinated biphenyl exposure to the time of conception. Conclusions: Based on these data, we find no evidence of an altered sex ratio among children born to primiparous polychlorinated biphenyl-exposed female workers.
Abstract:
The objective of this work was to evaluate the biochemical composition of six berry types belonging to the Fragaria, Rubus, Vaccinium and Ribes genera. Fruit samples were collected in triplicate (50 fruits each) from 18 different species or cultivars of the mentioned genera over three years (2008 to 2010). Contents of individual sugars, organic acids, flavonols, and phenolic acids were determined by high performance liquid chromatography (HPLC) analysis, while total phenolics (TPC) and total antioxidant capacity (TAC) were determined by spectrophotometry. Principal component analysis (PCA) and hierarchical cluster analysis (CA) were performed to evaluate the differences in fruit biochemical profiles. The highest contents of bioactive components were found in Ribes nigrum and in Fragaria vesca, Rubus plicatus, and Vaccinium myrtillus. PCA and CA were able to partially discriminate between berries on the basis of their biochemical composition. Individual and total sugars, myricetin, ellagic acid, TPC and TAC had the highest impact on the biochemical composition of the berry fruits. CA separated blackberry, raspberry, and blueberry as distinct groups, while strawberry, black currant and red currant were not classified into specific groups. There is large variability both between and within the different types of berries. Metabolite fingerprinting of the evaluated berries showed unique biochemical profiles and specific combinations of bioactive compound contents.
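PCA of the kind used above projects samples onto the directions of greatest variance in their trait space. A minimal SVD-based sketch on hypothetical data (the six samples and four traits below are illustrative, not the study's measurements):

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via SVD of the column-centred data.

    X: (samples, features) matrix, e.g. berry samples x biochemical traits
    (sugars, organic acids, flavonols, TPC, TAC ...).
    Returns (component scores, explained-variance ratios).
    """
    Xc = X - X.mean(axis=0)                      # centre each trait
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T            # project onto top components
    evr = (S ** 2) / np.sum(S ** 2)              # variance explained per component
    return scores, evr[:n_components]

# Hypothetical toy matrix: 6 samples x 4 traits (values invented for illustration)
X = np.array([
    [10.2, 1.1, 0.30, 5.1],
    [11.0, 1.0, 0.28, 5.3],
    [ 4.1, 3.2, 0.90, 9.8],
    [ 4.5, 3.0, 0.95, 9.5],
    [ 7.8, 2.0, 0.60, 7.2],
    [ 8.1, 2.1, 0.55, 7.0],
])
scores, evr = pca(X)
print(np.round(evr, 3))  # first component dominates this strongly correlated toy data
```

In practice traits on very different scales are usually standardized (divided by their standard deviation) before the SVD, so that high-magnitude compounds such as sugars do not dominate the components by units alone.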