32 results for Post-occupancy management
Abstract:
BACKGROUND Fractures of the mandible (lower jaw) are a common occurrence and usually related to interpersonal violence or road traffic accidents. Mandibular fractures may be treated using open (surgical) and closed (non-surgical) techniques. Fracture sites are immobilized with intermaxillary fixation (IMF) or other external or internal devices (i.e. plates and screws) to allow bone healing. Various techniques have been used; however, uncertainty exists with respect to the specific indications for each approach. OBJECTIVES The objective of this review is to provide reliable evidence of the effects of any interventions, either open (surgical) or closed (non-surgical), that can be used in the management of mandibular fractures, excluding the condyles, in adult patients. SEARCH METHODS We searched the following electronic databases: the Cochrane Oral Health Group's Trials Register (to 28 February 2013), the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2013, Issue 1), MEDLINE via OVID (1950 to 28 February 2013), EMBASE via OVID (1980 to 28 February 2013), metaRegister of Controlled Trials (to 7 April 2013), ClinicalTrials.gov (to 7 April 2013) and the WHO International Clinical Trials Registry Platform (to 7 April 2013). The reference lists of all trials identified were checked for further studies. There were no restrictions regarding language or date of publication. SELECTION CRITERIA Randomised controlled trials evaluating the management of mandibular fractures without condylar involvement. Any studies that compared different treatment approaches were included. DATA COLLECTION AND ANALYSIS At least two review authors independently assessed trial quality and extracted data. Results were to be expressed using random-effects models, with mean differences for continuous outcomes and risk ratios for dichotomous outcomes, with 95% confidence intervals. Heterogeneity was to be investigated, including both clinical and methodological factors.
MAIN RESULTS Twelve studies, assessed as being at high (six) or unclear (six) risk of bias, comprising 689 participants (830 fractures), were included. Interventions examined different plate materials and morphology; use of one or two lag screws; microplate versus miniplate; early and delayed mobilization; eyelet wires versus Rapid IMF™; and the management of angle fractures with intraoral access alone or combined with a transbuccal approach. Patient-oriented outcomes were largely ignored and post-operative pain scores were inadequately reported. Unfortunately, only one or two trials with small sample sizes were conducted for each comparison and outcome. Our results and conclusions should therefore be interpreted with caution. We were able to pool the results for two comparisons assessing one outcome. Pooled data from two studies comparing two miniplates versus one miniplate revealed no significant difference in the risk of post-operative infection at the surgical site (risk ratio (RR) 1.32, 95% CI 0.41 to 4.22, P = 0.64, I² = 0%). Similarly, no difference in post-operative infection between the use of two 3-dimensional (3D) and standard (2D) miniplates was determined (RR 1.26, 95% CI 0.19 to 8.13, P = 0.81, I² = 27%). The included studies involved a small number of participants with a low number of events. AUTHORS' CONCLUSIONS This review illustrates that there is currently inadequate evidence to support the effectiveness of a single approach in the management of mandibular fractures without condylar involvement. The lack of high quality evidence may be explained by clinical diversity, variability in the assessment tools used and difficulty in grading outcomes with existing measurement tools. Until high level evidence is available, treatment decisions should continue to be based on the clinician's prior experience and the individual circumstances.
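The pooled risk ratios and I² values quoted above follow standard 2x2-table arithmetic. As a minimal illustrative sketch (the trial counts below are hypothetical, not the review's data, and the function names are my own), a fixed-effect inverse-variance pooling of log risk ratios can be computed as follows; the review prespecified random-effects models, which coincide with the fixed-effect estimate when I² = 0%, as in the first comparison:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Risk ratio with a Wald 95% CI computed on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for a 2x2 table
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    return (rr,
            math.exp(math.log(rr) - 1.96 * se),
            math.exp(math.log(rr) + 1.96 * se))

def pooled_rr_inverse_variance(studies):
    """Fixed-effect inverse-variance pooling of log risk ratios.
    studies: list of (events_a, n_a, events_b, n_b) tuples."""
    weights, log_rrs = [], []
    for ea, na, eb, nb in studies:
        rr, _, _ = risk_ratio(ea, na, eb, nb)
        se = math.sqrt(1/ea - 1/na + 1/eb - 1/nb)
        weights.append(1 / se**2)          # weight = 1 / variance
        log_rrs.append(math.log(rr))
    pooled_log = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# Hypothetical counts for two small trials (not the review's data)
studies = [(3, 50, 2, 48), (2, 40, 2, 42)]
print(pooled_rr_inverse_variance(studies))
```

With only one or two small trials per comparison, as in the review, the confidence interval around such a pooled estimate remains wide, which is why no comparison reached significance.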
Abstract:
The benefits companies achieve by implementing an ERP system vary considerably. Many companies need to adapt their ERP integration solution in the post-implementation stage, yet even after completing such a typically very complex integration project, benefits do not necessarily emerge. A misfit between the organization and the IS, especially with respect to cross-functional team collaboration, could explain these divergences. Using an initial theoretical framework, we conducted a single case study to explore the team-oriented perceptions in a post-implementation ERP integration project. To analyze the benefits and the influences in greater depth, we disentangled the integration benefits into their constituent parts (process, system and information quality). Our findings show that post-implementation ERP integration changes are not always perceived as beneficial by the involved teams and that cross-functional collaboration has an important influence.
Abstract:
BACKGROUND Enterococci are an important cause of central venous catheter (CVC)-associated bloodstream infections (CA-BSI). It is unclear whether CVC removal is necessary to successfully manage enterococcal CA-BSI. METHODS A 12-month retrospective cohort study of adults with enterococcal CA-BSI was conducted at a tertiary care hospital; clinical, microbiological and outcome data were collected. RESULTS A total of 111 patients had an enterococcal CA-BSI. The median age was 58.2 years (range 21 to 94 years). There were 45 (40.5%) infections caused by Enterococcus faecalis (among which 10 [22%] were vancomycin resistant), 61 (55%) by Enterococcus faecium (57 [93%] vancomycin resistant) and five (4.5%) by other Enterococcus species. Patients were treated with linezolid (n=51 [46%]), vancomycin (n=37 [33%]), daptomycin (n=11 [10%]), ampicillin (n=2 [2%]) or quinupristin/dalfopristin (n=2 [2%]); seven (6%) patients did not receive adequate enterococcal treatment. Additionally, 24 (22%) patients received adjunctive gentamicin treatment. The CVC was retained in 29 (26.1%) patients. Patients with removed CVCs showed lower rates of in-hospital mortality (15 [18.3%] versus 11 [37.9%]; P=0.03), but similar rates of recurrent bacteremia (nine [11.0%] versus two [7.0%]; P=0.7) and a similar post-BSI length of hospital stay (median [range]) (11.1 [1.7 to 63.1] days versus 9.3 [1.9 to 31.8] days; P=0.3). Catheter retention was an independent predictor of mortality (OR 3.34 [95% CI 1.21 to 9.26]). CONCLUSIONS To the authors' knowledge, the present article describes the largest enterococcal CA-BSI series to date. Mortality was increased among patients who had their catheter retained. Additional prospective studies are necessary to determine the optimal management of enterococcal CA-BSI.
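The unadjusted association between catheter retention and mortality can be reconstructed from the counts in this abstract (11 of 29 deaths with a retained CVC versus 15 of 82 with a removed CVC). Below is a minimal sketch of an odds ratio with a Wald 95% confidence interval; the function name is my own. The resulting unadjusted OR (about 2.7) is lower than the reported 3.34 because the latter is adjusted for covariates:

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a/b = events/non-events in the exposed group,
    c/d = events/non-events in the unexposed group."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    return (orr,
            math.exp(math.log(orr) - 1.96 * se),
            math.exp(math.log(orr) + 1.96 * se))

# Retained CVC: 11 died, 18 survived; removed CVC: 15 died, 67 survived
orr, lo, hi = odds_ratio(11, 18, 15, 67)
print(f"OR = {orr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Because the lower confidence bound sits just above 1, the crude association is consistent with the adjusted finding that retention independently predicted mortality.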
Abstract:
OBJECTIVE The aim of the study was to describe (a) the symptom experience of women with vulvar intraepithelial neoplasia and vulvar cancer (vulvar neoplasia) during the first week after hospital discharge, and (b) the associations between age, type of disease, stage of disease, the extent of surgical treatment and symptom experience. METHODS This cross-sectional study was conducted in eight hospitals in Germany and Switzerland (Clinical Trial ID: NCT01300663). Symptom experience after surgical treatment in women with vulvar neoplasia was measured with our newly developed WOMAN-PRO instrument. Outpatients (n=65) rated 31 items. We used descriptive statistics and regression analysis. RESULTS The average number of symptoms reported per patient was 20.2 (SD 5.77), with a range of 5 to 31 symptoms. The three most prevalent wound-related symptoms were 'swelling' (n=56), 'drainage' (n=54) and 'pain' (n=52). The three most prevalent difficulties in daily life were 'sitting' (n=63), 'wearing clothes' (n=56) and 'carrying out my daily activities' (n=51). 'Tiredness' (n=62), 'insecurity' (n=54) and 'feeling that my body has changed' (n=50) were the three most prevalent psychosocial symptoms/issues. The most distressing symptoms were 'sitting' (mean 2.03, SD 0.88), 'open spot (e.g. opening of skin or suture)' (mean 1.91, SD 0.93) and 'carrying out my daily activities' (mean 1.86, SD 0.87), which were on average reported as 'quite a bit' distressing. Negative associations were found between psychosocial symptom experience and age. CONCLUSIONS WOMAN-PRO data showed high symptom prevalence and distress, calling for comprehensive symptom assessment, and may allow identification of relevant areas in symptom management.
Abstract:
Intentional weight loss among overweight and obese adults (body mass index ≥ 25 kg/m2) is associated with numerous health benefits, but weight loss maintenance (WLM) following participation in weight management programming has proven to be elusive. Many individuals attempting to lose weight, especially women, join formal programs, but these programs vary widely in focus, as do postprogram weight regain results. We surveyed 2,106 former participants in a community-based, insurance-sponsored weight management program in the United States to identify the pre-, during- and post-intervention behavioral and psychosocial factors that lead to successful WLM. Of 835 survey respondents (39.6% response rate), 450 met the criteria for inclusion in this study. Logistic regression analyses suggest that interventionists should assess and discuss weight loss and behavior change perceptions early in a program. However, in developing maintenance plans later in a program, attention should shift to behaviors such as weekly weighing, limiting snacking in the evening, limiting portion sizes, and being physically active every day.
Abstract:
INTRODUCTION Rates of both TB/HIV co-infection and multi-drug-resistant (MDR) TB are increasing in Eastern Europe (EE). Data on the clinical management of TB/HIV co-infected patients are scarce. Our aim was to study the clinical characteristics of TB/HIV patients in Europe and Latin America (LA) at TB diagnosis, identify factors associated with MDR-TB and assess the activity of initial TB treatment regimens given the results of drug-susceptibility tests (DST). MATERIAL AND METHODS We enrolled 1413 TB/HIV patients from 62 clinics in 19 countries in EE, Western Europe (WE), Southern Europe (SE) and LA from January 2011 to December 2013. Among patients who completed DST within the first month of TB therapy, we linked initial TB treatment regimens to the DST results and calculated the distribution of patients receiving 0, 1, 2, 3 and ≥4 active drugs in each region. Risk factors for MDR-TB were identified in logistic regression models. RESULTS Significant differences were observed between EE (n=844), WE (n=152), SE (n=164) and LA (n=253) for use of combination antiretroviral therapy (cART) at TB diagnosis (17%, 40%, 44% and 35%, p<0.0001), a definite TB diagnosis (culture and/or PCR positive for Mycobacterium tuberculosis; 47%, 71%, 72% and 40%, p<0.0001) and MDR-TB prevalence (34%, 3%, 3% and 11%, p<0.0001 among those with DST results). A history of injecting drug use [adjusted OR (aOR) = 2.03, 95% CI 1.00-4.09], prior TB treatment (aOR = 3.42, 95% CI 1.88-6.22) and living in EE (aOR = 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. For the 569 patients with available DST results, the initial TB treatment contained ≥3 active drugs in 64% of patients in EE compared with 90-94% of patients in the other regions (Figure 1a). Had the patients received the standard regimen [Rifampicin, Isoniazid, Pyrazinamide, Ethambutol (RHZE)] as initial therapy, the corresponding proportions would have been 64% vs. 86-97%, respectively (Figure 1b).
CONCLUSIONS In EE, TB/HIV patients had lower exposure to cART, less often a definite TB diagnosis and more often MDR-TB compared with other parts of Europe and LA. Initial TB therapy in EE was sub-optimal, with less than two-thirds of patients receiving at least three active drugs, and improved compliance with standard RHZE treatment does not seem to be the solution. Improved management of TB/HIV patients requires routine use of DST, initial TB therapy according to prevailing resistance patterns and more widespread use of cART.
Territory Occupancy and Parental Quality as Proxies for Spatial Prioritization of Conservation Areas
Abstract:
In order to maximize their fitness, individuals aim at choosing territories offering the most appropriate combination of resources. As population size fluctuates in time, the frequency of breeding territory occupancy reflects territory quality. We investigated the relationships between the frequency of territory occupancy (2002–2009) and habitat characteristics, prey abundance, reproductive success and parental traits in hoopoes Upupa epops L., with the objective of defining proxies for the delineation of conservation priority areas. We predicted that the distribution of phenotypes is despotic and sought phenotypic characteristics expressing dominance. Our findings support the hypothesis of a despotic distribution. Territory selection was non-random: frequently occupied territories were settled earlier in the season and yielded higher annual reproductive success, but the frequency of territory occupancy could not be related to any habitat characteristic. Males found in frequently occupied territories showed traits expressing dominance (i.e. larger body size and mass, and older age). In contrast, morphological traits of females were not related to the frequency of territory occupancy, suggesting that territory selection and maintenance were essentially a male's task. Settlement time in spring, reproductive success achieved in a given territory, as well as phenotypic traits and age of male territory holders reflected territory quality, providing good proxies for assessing priority areas for conservation management.
Abstract:
BACKGROUND After cardiac surgery with cardiopulmonary bypass (CPB), acquired coagulopathy often leads to post-CPB bleeding. Though multifactorial in origin, this coagulopathy is often aggravated by deficient fibrinogen levels. OBJECTIVE To assess whether laboratory and thrombelastometric testing on CPB can predict plasma fibrinogen immediately after CPB weaning. PATIENTS / METHODS This prospective study in 110 patients undergoing major cardiovascular surgery at risk of post-CPB bleeding compares fibrinogen level (Clauss method) and function (fibrin-specific thrombelastometry) in order to study the predictability of their course early after termination of CPB. Linear regression analysis and receiver operating characteristics were used to determine correlations and predictive accuracy. RESULTS Quantitative estimation of post-CPB Clauss fibrinogen from on-CPB fibrinogen was feasible with small bias (+0.19 g/l), but with poor precision and a percentage of error >30%. A clinically useful alternative approach was developed by using on-CPB A10 to predict a Clauss fibrinogen range of interest instead of a discrete level. An on-CPB A10 ≤10 mm identified patients with a post-CPB Clauss fibrinogen of ≤1.5 g/l with a sensitivity of 0.99 and a positive predictive value of 0.60; it also identified those without a post-CPB Clauss fibrinogen <2.0 g/l with a specificity of 0.83. CONCLUSIONS When measured on CPB prior to weaning, a FIBTEM A10 ≤10 mm is an early alert for post-CPB fibrinogen levels below or within the substitution range (1.5-2.0 g/l) recommended in case of post-CPB coagulopathic bleeding. This helps to minimize the delay to data-based hemostatic management after weaning from CPB.
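The sensitivity, specificity and positive predictive value reported for the A10 ≤10 mm cutoff are simple ratios from a 2x2 classification of predicted versus observed fibrinogen status. A minimal sketch, using hypothetical counts rather than the study's data (the function name is my own):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value (PPV)
    for a binary test, e.g. on-CPB FIBTEM A10 <= 10 mm predicting
    post-CPB Clauss fibrinogen <= 1.5 g/l."""
    sensitivity = tp / (tp + fn)   # true positives among all with the condition
    specificity = tn / (tn + fp)   # true negatives among all without it
    ppv = tp / (tp + fp)           # true positives among all test-positives
    return sensitivity, specificity, ppv

# Hypothetical counts, not the study's data
sens, spec, ppv = diagnostic_metrics(tp=30, fp=20, fn=1, tn=59)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, PPV={ppv:.2f}")
```

A high sensitivity with a moderate PPV, as reported in the abstract, is the typical profile of a screening cutoff chosen to miss as few low-fibrinogen patients as possible at the cost of some false alarms.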
Abstract:
The overarching objective of this dissertation is to uncover why and how individually experienced fits and misfits translate into different outcomes of user behavior and satisfaction and whether these individual fit/misfit outcomes are in line with organizational intent. In search of patterns and possible archetype users in the context of ES PIPs, this dissertation is the first study that specifically links the theoretical concepts of the aggregated individual fit experiences with the individual and organizational outcome of these experiences (i.e. behavioral reaction, user satisfaction, and alignment with organizational intent). The case study’s findings provide preliminary support for four archetype users characterized by specific fit/misfit experience-outcome patterns.
Abstract:
OBJECTIVES Rates of TB/HIV coinfection and multi-drug resistant (MDR)-TB are increasing in Eastern Europe (EE). We aimed to study clinical characteristics, factors associated with MDR-TB and the predicted activity of empiric anti-TB treatment at the time of TB diagnosis among TB/HIV coinfected patients in EE, Western Europe (WE) and Latin America (LA). DESIGN AND METHODS Between January 1, 2011, and December 31, 2013, 1413 TB/HIV patients (62 clinics in 19 countries in EE, WE, Southern Europe (SE), and LA) were enrolled. RESULTS Significant differences were observed between EE (N = 844), WE (N = 152), SE (N = 164), and LA (N = 253) in the proportion of patients with a definite TB diagnosis (47%, 71%, 72% and 40%, p<0.0001), MDR-TB (40%, 5%, 3% and 15%, p<0.0001), and use of combination antiretroviral therapy (cART) (17%, 40%, 44% and 35%, p<0.0001). Injecting drug use (adjusted OR (aOR) = 2.03, 95% CI 1.00-4.09), prior anti-TB treatment (aOR = 3.42, 95% CI 1.88-6.22), and living in EE (aOR = 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. Among 585 patients with drug susceptibility test (DST) results, the empiric (i.e. without knowledge of the DST results) anti-TB treatment included ≥3 active drugs in 66% of participants in EE compared with 90-96% in the other regions (p<0.0001). CONCLUSIONS In EE, TB/HIV patients were less likely to receive a definite TB diagnosis, more likely to harbour MDR-TB and commonly received empiric anti-TB treatment with reduced activity. Improved management of TB/HIV patients in EE requires better access to TB diagnostics including DSTs, empiric anti-TB therapy directed at both susceptible and MDR-TB, and more widespread use of cART.
Abstract:
A one-year-old healthy sheep received an implant stenting the mural ('posterior') leaflet of the mitral valve. The experiment was authorized by the Cantonal Ethical Committee. The surgery was performed on the open, beating heart during cardiopulmonary bypass (CPB). Anaesthesia was based on isoflurane with mechanical intermittent positive pressure ventilation (IPPV) of the lungs, combined with intercostal nerve blocks and intravenous fentanyl and lidocaine. Marked cardiovascular depression occurred towards the end of the CPB period and required high doses of dopamine, dobutamine, lidocaine and ephedrine to allow weaning off the CPB pump. Moreover, severe pulmonary dysfunction developed when IPPV was re-initiated after CPB. Hypoxaemia persisted throughout the recovery from general anaesthesia. Multiple organ failure developed gradually over the three postoperative days, leading to euthanasia of the animal. As described in this case, marked lung injury associated with some degree of failure of other vital organs may occur in sheep after CPB. Intraoperative cardiorespiratory complications during weaning may indicate the development of a 'post-pump syndrome'.
Abstract:
AIMS Our aim was to report on a survey initiated by the European Association of Percutaneous Cardiovascular Interventions (EAPCI) collecting the opinion of the cardiology community on the invasive management of acute coronary syndrome (ACS), before and after the MATRIX trial presentation at the American College of Cardiology (ACC) 2015 Scientific Sessions. METHODS AND RESULTS A web-based survey was distributed to all individuals registered on the EuroIntervention mailing list (n=15,200). A total of 572 and 763 physicians responded to the pre- and post-ACC survey, respectively. The radial approach emerged as the preferable access site for ACS patients undergoing invasive management with roughly every other responder interpreting the evidence for mortality benefit as definitive and calling for a guidelines upgrade to class I. The most frequently preferred anticoagulant in ACS patients remains unfractionated heparin (UFH), due to higher costs and greater perceived thrombotic risks associated with bivalirudin. However, more than a quarter of participants declared the use of bivalirudin would increase after MATRIX. CONCLUSIONS The MATRIX trial reinforced the evidence for a causal association between bleeding and mortality and triggered consensus on the superiority of the radial versus femoral approach. The belief that bivalirudin mitigates bleeding risk is common, but UFH still remains the preferred anticoagulant based on lower costs and thrombotic risks.
Abstract:
OBJECTIVE Endoscopic lung volume reduction (ELVR) with valves has been shown to benefit COPD patients with severe emphysema. However, a major complication is pneumothorax, typically occurring soon after valve implantation, with severe consequences if not managed promptly. Based on the knowledge that strain activity is related to a higher risk of pneumothorax, we asked whether modifying post-operative medical care to include a strict short-term limitation of strain activity is associated with a lower incidence of pneumothorax. METHODS Seventy-two (72) emphysematous patients without collateral ventilation were treated with bronchial valves and included in the study. Thirty-two (32) patients received standard post-implantation medical management (Standard Medical Care (SMC)), and 40 patients received modified medical care that included additional bed rest for 48 hours and cough suppression, as needed (Modified Medical Care (MMC)). RESULTS The baseline characteristics were similar for the two groups, except that there were more males in the SMC cohort. Overall, ten pneumothoraces occurred up to four days after ELVR: eight in the SMC cohort and only two in the MMC cohort (p=0.02). Complicated pneumothoraces and pneumothoraces after upper lobe treatment were significantly less frequent in MMC (p=0.02). Major clinical outcomes showed no significant differences between the two cohorts. CONCLUSIONS In conclusion, modifying post-operative medical care to include bed rest for 48 hours after ELVR and cough suppression, if needed, might reduce the incidence of pneumothoraces. Prospective randomized studies with larger numbers of well-matched patients are needed to confirm the data.
Abstract:
OBJECTIVES To improve malnutrition awareness and management in our department of general internal medicine; to assess patients' nutritional risk; and to evaluate whether an online educational program leads to an increase in basic knowledge and more frequent nutritional therapies. METHODS A prospective pre-post intervention study at a university department of general internal medicine was conducted. Nutritional screening using the Nutritional Risk Score 2002 (NRS 2002) was performed, and prescriptions of nutritional therapies were assessed. The intervention included an online learning program and a pocket card for all residents, who had to fill in a multiple-choice question (MCQ) test about basic nutritional knowledge before and after the intervention. RESULTS A total of 342 patients were included in the preintervention phase and 300 in the postintervention phase. In the preintervention phase, 54.1% were at nutritional risk (NRS 2002 ≥3) compared with 61.7% in the postintervention phase. There was no increase in the prescription of nutritional therapies (18.7% versus 17.0%). Forty-nine and 41 residents (response rates 58% and 48%) filled in the MCQ test before and after the intervention, respectively. The mean percentage of correct answers was 55.6% and 59.4%, respectively (not significant). Fifty of 84 residents completed the online program. The residents who participated in the whole program scored higher on the second MCQ test (63% versus 55% correct answers, P = 0.031). CONCLUSIONS Despite a high proportion of malnourished patients, the nutritional intervention, as assessed by nutritional prescriptions, is insufficient. Moreover, the simple Internet-based educational program and the NRS 2002 pocket cards did not improve either malnutrition awareness or nutritional treatment. More sophisticated educational systems to fight malnutrition are necessary.