78 results for Modeling Rapport Using Hidden Markov Models
Abstract:
Some studies of patients with acute myocardial infarction have reported that hyperglycaemia at admission may be associated with a worse outcome. This study sought to evaluate the association of blood glucose at admission with the outcome of unselected patients with acute coronary syndrome (ACS). Using the Acute Myocardial Infarction and unstable angina in Switzerland (AMIS Plus) registry, ACS patients were stratified according to their blood glucose on admission: group 1: 2.80-6.99 mmol/L, group 2: 7.00-11.09 mmol/L and group 3: >11.10 mmol/L. Odds ratios for in-hospital mortality were calculated using logistic regression models. Of 2,786 patients, 73% were male and 21% were known to have diabetes. In-hospital mortality increased from 3% in group 1 to 7% in group 2 and to 15% in group 3. Higher glucose levels were associated with larger enzymatic infarct sizes (p<0.001) and had a weak negative correlation with angiographic or echographic left ventricular ejection fraction. High admission glycaemia in ACS patients remains a significant independent predictor of in-hospital mortality (adjusted OR 1.08; 95% confidence interval [CI] 1.05-1.14; p<0.001) per mmol/L. The OR for in-hospital mortality was 1.04 (95% CI 0.99-1.10; p=0.140) per mmol/L for patients with diabetes but 1.21 (95% CI 1.12-1.30; p<0.001) per mmol/L for non-diabetic patients. In conclusion, an elevated glucose level on admission in ACS patients is a significant independent predictor of in-hospital mortality and is even more important for patients who do not have known diabetes.
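A per-mmol/L odds ratio from a logistic model compounds multiplicatively across a glucose difference. A minimal sketch of this arithmetic, using the ORs reported above (the 5 mmol/L spread is an illustrative value, not one from the registry):

```python
def compounded_or(per_unit_or: float, delta_units: float) -> float:
    """Odds ratio implied by a per-unit OR over a difference of delta_units."""
    return per_unit_or ** delta_units

# Adjusted OR 1.08 per mmol/L (all patients) vs. 1.21 per mmol/L (non-diabetic):
# over an illustrative 5 mmol/L difference in admission glucose, the odds of
# in-hospital death scale by about 1.47x overall but 2.59x in non-diabetics.
overall = compounded_or(1.08, 5)
non_diabetic = compounded_or(1.21, 5)
```

This is why the abstract can call admission glycaemia "even more important" in patients without known diabetes: the same glucose difference implies a much larger shift in odds.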
Abstract:
Abnormal morphology of the hip has been associated with primary osteoarthrosis. We evaluated the morphology of 464 consecutive hips contralateral to hips treated by THA. We excluded all hips with known diagnoses leading to secondary osteoarthritis and all hips with advanced arthrosis to eliminate the effect of arthritic remodeling on the morphologic measurements. Of the remaining 119 hips, 25 were in patients aged 60 years or older who had no or mild arthrosis (Tönnis Grade 0 or 1) and 94 hips had Tönnis Grade 2 osteoarthrosis. We quantified morphologic parameters on plain radiographs and CT images and simulated range of motion using virtual bone models from the CT data. The nonarthritic hips had fewer pathomorphologic findings. High alpha angles and high lateral center edge angles were strongly associated with the presence of arthritis; decreased internal and external rotation in 90 degrees flexion showed lesser correlation. The data confirm previous observations that abnormal hip morphology predates arthrosis and is not secondary to the osteoarthritic process. Hips at risk for developing arthrosis resulting from pathomorphologic changes may potentially be identified at the cessation of growth, long before the development of osteoarthrosis.
Abstract:
OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. The quality of the data was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients) and data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50,060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19,320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) such sites relied on software intended for personal or small business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and weekly hours spent by a clerk on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). 
Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory for many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
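Logit-model odds ratios like the OR 0.95 per weekly clerk-hour act on the odds scale, so they compound over hours and must be mapped back to a probability to be interpreted. A minimal sketch (the 50% baseline probability is hypothetical, not a figure from the survey):

```python
def apply_or(baseline_p: float, odds_ratio: float) -> float:
    """Probability obtained by multiplying the baseline odds by an odds ratio."""
    odds = baseline_p / (1.0 - baseline_p) * odds_ratio
    return odds / (1.0 + odds)

# OR 0.95 per weekly clerk-hour per 100 patients on ART (from the abstract):
# ten hours multiply the odds of excess missing data by 0.95 ** 10 ~ 0.60,
# e.g. lowering a hypothetical 50% baseline probability to roughly 37%.
p_after_ten_hours = apply_or(0.50, 0.95 ** 10)
```

The cumulative effect of ten clerk-hours (odds factor ~0.60) is comparable in size to the reported effect of data-management training (OR 0.58).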
Abstract:
PURPOSE OF REVIEW: Intensive care medicine consumes a high share of healthcare costs, and there is growing pressure to use the scarce resources efficiently. Accordingly, organizational issues and quality management have become an important focus of interest in recent years. Here, we will review current concepts of how outcome data can be used to identify areas requiring action. RECENT FINDINGS: Using recently established models of outcome assessment, wide variability between individual ICUs is found, both with respect to outcome and resource use. Such variability implies that there are large differences in patient care processes not only within the ICU but also in pre-ICU and post-ICU care. Indeed, measures to improve the patient process in the ICU (including care of the critically ill, patient safety, and management of the ICU) have been presented in a number of recently published papers. SUMMARY: Outcome assessment models provide an important framework for benchmarking. They may help the individual ICU to spot appropriate fields of action, plan and initiate quality improvement projects, and monitor the consequences of such activity.
Abstract:
BACKGROUND In many resource-limited settings, monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS In all, 44,829 patients (median age: 35 years, 58% female, median CD4 count at cART initiation: 116 cells/mm³) were followed up for a median of 1.9 years, with 3706 deaths. Mean CD4 count slopes per week ranged from 1.4 [95% confidence interval (CI): 1.2 to 1.6] cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter with >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS The absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, CD4 count slope and HIV RNA provide additional independent prognostic information.
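The study estimates CD4 slopes with linear mixed models; as a simplified stand-in, the slope for a single patient can be sketched as an ordinary least-squares fit over serial counts. The visit weeks and counts below are hypothetical illustration, not study data:

```python
def cd4_slope(weeks, cd4_counts):
    """Ordinary least-squares slope (cells/mm^3 per week) of serial CD4 counts."""
    n = len(weeks)
    mean_x = sum(weeks) / n
    mean_y = sum(cd4_counts) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, cd4_counts))
    den = sum((x - mean_x) ** 2 for x in weeks)
    return num / den

# Hypothetical patient with suppressed HIV RNA: counts rising by roughly the
# 1.4 cells/mm^3 per week reported in the abstract for <400 copies/mL.
slope = cd4_slope([0, 8, 16, 26], [116, 127, 138, 152])
```

A mixed model additionally pools information across patients and accounts for repeated measures, so per-patient fits like this one are only a rough approximation of the published estimates.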
Abstract:
On Swiss rabbit breeding farms, group-housed does are usually kept singly for 12 days around parturition to avoid pseudogravidity, double litters and deleterious fighting for nests. After this isolation phase there is usually an integration of new group members. Here we studied whether keeping the group composition stable would reduce agonistic interactions, stress levels and injuries when regrouping after the isolation phase. Does were kept in 12 pens containing 8 rabbits each. In two trials, with a total of 24 groups, the group composition before and after the 12-day isolation period remained the same (treatment: stable, S) in 12 groups. In the other 12 groups two or three does were replaced after the isolation phase by unfamiliar does (treatment: mixed, M). Does of S-groups had been housed together for one reproduction cycle. One day before and on days 2, 4 and 6 after regrouping, data on lesions, stress levels (faecal corticosterone metabolites, FCM) and agonistic interactions were collected and statistically analysed using mixed effects models. Lesion scores and the frequency of agonistic interactions were highest on day 2 after regrouping and decreased thereafter in both groups. There was a trend towards more lesions in M-groups compared to S-groups. After regrouping, FCM levels were increased in M-groups, but not in S-groups. Furthermore, there was a significant interaction of treatment and experimental day on agonistic interactions: the frequency of biting and boxing increased more in M-groups than in S-groups. These findings indicate that group stability had an effect on agonistic interactions, stress and lesions.
Abstract:
INTRODUCTION Our objective was to investigate potential associations between maxillary sinus floor extension and inclination of maxillary second premolars and second molars in patients with Class II Division 1 malocclusion whose orthodontic treatment included maxillary first molar extractions. METHODS The records of 37 patients (18 boys, 19 girls; mean age, 13.2 years; SD, 1.62 years) treated between 1998 and 2004 by 1 orthodontist with full Begg appliances were used in this study. Inclusion criteria were white patients with Class II Division 1 malocclusion, sagittal overjet of ≥4 mm, treatment plan including extraction of the maxillary first permanent molars, no missing teeth, and no agenesis. Maxillary posterior tooth inclination and lower maxillary sinus area in relation to the palatal plane were measured on lateral cephalograms at 3 time points: at the start and end of treatment, and on average 2.5 years posttreatment. Data were analyzed for the second premolar and second molar inclinations by using mixed linear models. RESULTS The analysis showed that the second molar inclination angle decreased by 7° after orthodontic treatment, compared with pretreatment values, and by 11.5° at the latest follow-up, compared with pretreatment. There was evidence that maxillary sinus volume was negatively correlated with second molar inclination angle; the greater the volume, the smaller the inclination angle. For premolars, inclination increased by 15.4° after orthodontic treatment compared with pretreatment, and by 8.1° at the latest follow-up compared with baseline. The volume of the maxillary sinus was not associated with premolar inclination. CONCLUSIONS We found evidence of an association between maxillary second molar inclination and surface area of the lower sinus in patients treated with maxillary first molar extractions. 
Clinicians who undertake such an extraction scheme in Class II patients should be aware of this potential association and consider appropriate biomechanics to control root uprighting.
Abstract:
BACKGROUND Results of epidemiological studies linking census with mortality records may be affected by unlinked deaths and changes in cause of death classification. We examined these issues in the Swiss National Cohort (SNC). METHODS The SNC is a longitudinal study of the entire Swiss population, based on the 1990 (6.8 million persons) and 2000 (7.3 million persons) censuses. Among 1,053,393 deaths recorded from 1991 to 2007, 5.4% could not be linked using stringent probabilistic linkage. We included the unlinked deaths using pragmatic linkages and compared mortality rates for selected causes with official mortality rates. We also examined the impact of the 1995 change in cause of death coding from version 8 (with some additional rules) to version 10 of the International Classification of Diseases (ICD), using Poisson regression models with restricted cubic splines. Finally, we compared results from Cox models including and excluding unlinked deaths for the association of education, marital status, and nationality with selected causes of death. RESULTS SNC mortality rates underestimated all-cause mortality by 9.6% (range 2.4%-17.9%) in the 85+ population. Underestimation was less pronounced in years nearer the censuses and in the 75-84 age group. After including 99.7% of unlinked deaths, annual all-cause SNC mortality rates closely reflected official rates (relative difference between -1.4% and +1.8%). A sudden decrease in breast (21% less, 95% confidence interval: 12%-28%) and prostate (16% less, 95% confidence interval: 7%-23%) cancer mortality rates in the 85+ population between 1994 and 1995 coincided with the change in cause of death coding policy. For suicide in males almost no change was observed. Hazard ratios were only negligibly affected by including the unlinked deaths. 
CONCLUSIONS Unlinked deaths bias analyses of absolute mortality rates downwards but have little effect on relative mortality. To describe time trends of cause-specific mortality in the SNC, accounting for the unlinked deaths and for the possible effect of change in death certificate coding was necessary.
Abstract:
OBJECTIVES Femoroacetabular impingement is proposed to cause early osteoarthritis (OA) in the non-dysplastic hip. We previously reported on the prevalence of femoral deformities in a young asymptomatic male population. The aim of this study was to determine the prevalence of both femoral and acetabular types of impingement in young females. METHODS We conducted a population-based cross-sectional study of asymptomatic young females. All participants completed a set of questionnaires and underwent clinical examination of the hip. A random sample was subsequently invited to undergo magnetic resonance imaging (MRI) of the hip. All MRIs were read for cam-type deformities, increased acetabular depth, labral lesions, and impingement pits. The prevalences of cam-type deformities and increased acetabular depth were estimated, and relationships between deformities and signs of joint damage were examined using logistic regression models. RESULTS The study included 283 subjects; 80 asymptomatic females with a mean age of 19.3 years attended MRI. Fifteen showed some evidence of cam-type deformities, but none were scored as definite. The overall prevalence was therefore 0% [95% confidence interval (95% CI) 0-5%]. The prevalence of increased acetabular depth was 10% (95% CI 5-19%). No association was found between increased acetabular depth and decreased internal rotation of the hip. Increased acetabular depth was not associated with signs of labral damage. CONCLUSIONS Definite cam-type deformities in women are rare compared to men, whereas the prevalence of increased acetabular depth is higher, suggesting that femoroacetabular impingement has different gender-related biomechanical mechanisms.
Abstract:
According to Bandura (1997), efficacy beliefs are a primary determinant of motivation. Still, very little is known about the processes through which people integrate situational factors to form efficacy beliefs (Myers & Feltz, 2007). The aim of this study was to gain insight into the cognitive construction of subjective group-efficacy beliefs; only with a sound understanding of those processes is there a sufficient base to derive psychological interventions aimed at group-efficacy beliefs. According to cognitive theories (e.g., Miller, Galanter, & Pribram, 1973), individual group-efficacy beliefs can be seen as the result of a comparison between the demands of a group task and the resources of the performing group. At the center of this comparison are internally represented structures of the group task and plans to perform it. The empirical plausibility of this notion was tested using functional measurement theory (Anderson, 1981). Twenty-three students (M = 23.30 years; SD = 3.39; 35% female) of the University of Bern repeatedly judged the efficacy of groups in different group tasks. Each group consisted of the subject and one or two fictive group members, whose task-relevant abilities were manipulated across three levels (low, medium, high). Data obtained from multiple full factorial designs were structured with individuals as second-level units and analyzed using mixed linear models. The task-relevant abilities of group members, specified as fixed factors, all had highly significant effects on subjects' group-efficacy judgments. The effect sizes of the ability factors proved to depend on the respective abilities' importance in a given task. In additive tasks (Steiner, 1972) group resources were integrated in a linear fashion, whereas significant interactions between factors were obtained in interdependent tasks. 
The results also showed that people take into account other group members' efficacy beliefs when forming their own group-efficacy beliefs. The results support the notion that personal group-efficacy beliefs are obtained by comparing the demands of a task with the performing group's resources. Psychological factors such as other team members' efficacy beliefs are thereby considered task-relevant resources and affect subjective group-efficacy beliefs. This latter finding underlines the adequacy of multidimensional measures. While the validity of collective efficacy measures is usually estimated by how well they predict performance, the results of this study allow for an internal validity criterion. It is concluded that Information Integration Theory holds potential to further our understanding of people's cognitive functioning in sport-relevant situations.
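The linear integration found for additive tasks can be sketched as a weighted sum of member abilities. The 0-1 ability scale, the equal default weights, and the example ratings below are hypothetical illustration, not the study's measurement model:

```python
def additive_group_efficacy(abilities, weights=None):
    """Additive-integration prediction of a judged group-efficacy belief:
    a weighted average of members' task-relevant abilities (0-1 scale)."""
    if weights is None:
        weights = [1.0] * len(abilities)  # equal task importance by default
    return sum(w * a for w, a in zip(weights, abilities)) / sum(weights)

# Subject rates own ability 0.6; two fictive partners are manipulated to
# low (0.2) and high (0.8) ability, as in the factorial design.
judgment = additive_group_efficacy([0.6, 0.2, 0.8])
```

In interdependent tasks the abstract reports interactions between ability factors, so a purely additive rule like this would no longer fit; the sketch applies only to the additive-task condition.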
Abstract:
Urea transporters (UTs) belonging to the solute carrier 14 (SLC14) family comprise two genes with a total of eight isoforms in mammals, UT-A1 to -A6 encoded by SLC14A2 and UT-B1 to -B2 encoded by SLC14A1. Recent efforts have been directed toward understanding the molecular and cellular mechanisms involved in the regulation of UTs using transgenic mouse models and heterologous expression systems, leading to important new insights. Urea uptake by UT-A1 and UT-A3 in the kidney inner medullary collecting duct and by UT-B1 in the descending vasa recta for the countercurrent exchange system are chiefly responsible for medullary urea accumulation in the urinary concentration process. Vasopressin, an antidiuretic hormone, regulates UT-A isoforms via the phosphorylation and trafficking of the glycosylated transporters to the plasma membrane that occurs to maintain equilibrium with the exocytosis and ubiquitin-proteasome degradation pathways. UT-B isoforms are also important in several cellular functions, including urea nitrogen salvaging in the colon, nitric oxide pathway modulation in the hippocampus, and the normal cardiac conduction system. In addition, genomic linkage studies have revealed potential additional roles for SLC14A1 and SLC14A2 in hypertension and bladder carcinogenesis. The precise role of UT-A2 and presence of the urea recycling pathway in normal kidney are issues to be further explored. This review provides an update of these advances and their implications for our current understanding of the SLC14 UTs.
Abstract:
The early phase of psychotherapy has been regarded as a sensitive period in the unfolding of psychotherapy leading to positive outcomes. However, there is disagreement about the degree to which early (especially relationship-related) session experiences predict outcome over and above initial levels of distress and early response to treatment. The goal of the present study was to simultaneously examine posttreatment outcome as a function of (a) intake symptom and interpersonal distress as well as early change in well-being and symptoms, (b) the patient's early session experiences, (c) the therapist's early session experiences/interventions, and (d) their interactions. The data of 430 psychotherapy completers treated by 151 therapists were analyzed using hierarchical linear models. Results indicate that early positive intra- and interpersonal session experiences, as reported by patients and therapists after the sessions, explained 58% of the variance of a composite outcome measure, taking intake distress and early response into account. All predictors (other than problem-activating therapist interventions) contributed to later treatment outcomes when entered as single predictors. However, the multi-predictor analyses indicated that interpersonal distress at intake as well as the early interpersonal session experiences of patients and therapists remained robust predictors of outcome. The findings underscore that early in therapy, therapists (and their supervisors) need to understand and monitor multiple interconnected components simultaneously.
Abstract:
BACKGROUND Open radical cystectomy (ORC) is associated with substantial blood loss and a high incidence of perioperative blood transfusions. Strategies to reduce blood loss and blood transfusion are warranted. OBJECTIVE To determine whether continuous norepinephrine administration combined with intraoperative restrictive hydration with Ringer's maleate solution can reduce blood loss and the need for blood transfusion. DESIGN, SETTING, AND PARTICIPANTS This was a double-blind, randomised, parallel-group, single-centre trial including 166 consecutive patients undergoing ORC with urinary diversion (UD). Exclusion criteria were severe hepatic or renal dysfunction, congestive heart failure, and contraindications to epidural analgesia. INTERVENTION Patients were randomly allocated to continuous norepinephrine administration (starting at 2 μg/kg per hour) combined with Ringer's maleate solution at 1 ml/kg per hour until the bladder was removed and 3 ml/kg per hour thereafter (norepinephrine/low-volume group), or to 6 ml/kg per hour of Ringer's maleate solution throughout surgery (control group). OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Intraoperative blood loss and the percentage of patients requiring blood transfusions perioperatively were assessed. Data were analysed using nonparametric statistical models. RESULTS AND LIMITATIONS Total median blood loss was 800 ml (range: 300-1700) in the norepinephrine/low-volume group versus 1200 ml (range: 400-2800) in the control group (p<0.0001). In the norepinephrine/low-volume group, 27 of 83 patients (33%) required an average of 1.8 U (±0.8) of packed red blood cells (PRBCs). In the control group, 50 of 83 patients (60%) required an average of 2.9 U (±2.1) of PRBCs during hospitalisation (relative risk: 0.54; 95% confidence interval [CI], 0.38-0.77; p=0.0006). The absolute reduction in transfusion rate throughout hospitalisation was 28% (95% CI, 12-45). 
In this study, surgery was performed by three high-volume surgeons using a standardised technique, so whether these results are reproducible in other centres remains to be shown. CONCLUSIONS Continuous norepinephrine administration combined with restrictive hydration significantly reduces intraoperative blood loss, the rate of blood transfusions, and the number of PRBC units required per patient undergoing ORC with UD.
Abstract:
Background Non-adherence is one of the strongest predictors of therapeutic failure in HIV-positive patients. Virologic failure with subsequent emergence of resistance reduces future treatment options and long-term clinical success. Methods Prospective observational cohort study including patients starting a new class of antiretroviral therapy (ART) between 2003 and 2010. Participants were naïve to the ART class and completed ≥1 adherence questionnaire prior to resistance testing. Outcomes were development of any IAS-USA, class-specific, or M184V mutations. Associations between adherence and resistance were estimated using logistic regression models stratified by ART class. Results Of 314 included individuals, 162 started an NNRTI and 152 a PI/r regimen. Adherence was similar between groups, with 85% reporting adherence ≥95%. The number of new mutations increased with increasing non-adherence. In the NNRTI group, multivariable models indicated a significant linear association in the odds of developing IAS-USA (odds ratio (OR) 1.66, 95% confidence interval (CI): 1.04-2.67) or class-specific (OR 1.65, 95% CI: 1.00-2.70) mutations. Levels of drug resistance were considerably lower in the PI/r group, and adherence was only significantly associated with M184V mutations (OR 8.38, 95% CI: 1.26-55.70). Adherence was significantly associated with HIV RNA in PI/r but not NNRTI regimens. Conclusion Therapies containing PI/r appear more forgiving of incomplete adherence than NNRTI regimens, which allow higher levels of resistance even with adherence above 95%. However, in failing PI/r regimens good adherence may prevent accumulation of further resistance mutations and therefore help to preserve future drug options. In contrast, adherence levels have little impact on NNRTI treatments once the first mutations have emerged.
Abstract:
We assessed the impact of antiviral prophylaxis and preemptive therapy on the incidence and outcomes of cytomegalovirus (CMV) disease in a nationwide prospective cohort of solid organ transplant recipients. Risk factors associated with CMV disease and graft failure-free survival were analyzed using Cox regression models. One thousand two hundred thirty-nine patients transplanted from May 2008 until March 2011 were included; 466 (38%) patients received CMV prophylaxis and 522 (42%) patients were managed preemptively. Overall incidence of CMV disease was 6.05% and was linked to CMV serostatus (D+/R− vs. R+, hazard ratio [HR] 5.36 [95% CI 3.14–9.14], p < 0.001). No difference in the incidence of CMV disease was observed in patients receiving antiviral prophylaxis as compared to the preemptive approach (HR 1.16 [95% CI 0.63–2.17], p = 0.63). CMV disease was not associated with a lower graft failure-free survival (HR 1.27 [95% CI 0.64–2.53], p = 0.50). Nevertheless, patients followed by the preemptive approach had an inferior graft failure-free survival after a median of 1.05 years of follow-up (HR 1.63 [95% CI 1.01–2.64], p = 0.044). The incidence of CMV disease in this cohort was low and not influenced by the preventive strategy used. However, patients on CMV prophylaxis were more likely to be free from graft failure.
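Graft failure-free survival of the kind compared here is typically summarized with a Kaplan-Meier estimator before hazard ratios are estimated in Cox models. A minimal sketch on hypothetical follow-up data (times and event indicators below are invented, not cohort data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times: follow-up time per patient; events: 1 = graft failure, 0 = censored.
    Returns (time, survival probability) pairs at each observed event time.
    """
    curve = []
    surv = 1.0
    for t in sorted(set(times)):
        # deaths/failures at t, among everyone still at risk at t
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)
        if d > 0:
            surv *= 1.0 - d / n
            curve.append((t, surv))
    return curve

# Four hypothetical patients: failures at t=1, 2, 4; one censored at t=3.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

A Cox model then compares the hazard underlying such curves between groups (here, prophylaxis vs. preemptive management) while adjusting for covariates such as CMV serostatus.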