Abstract:
BACKGROUND: In high-income countries, viral load is routinely measured to detect failure of antiretroviral therapy (ART) and guide switching to second-line ART. Viral load monitoring is not generally available in resource-limited settings. We examined switching from nonnucleoside reverse transcriptase inhibitor (NNRTI)-based first-line regimens to protease inhibitor-based regimens in Africa, South America and Asia. DESIGN AND METHODS: Multicohort study of 17 ART programmes. All sites monitored CD4 cell count and had access to second-line ART; 10 sites also monitored viral load. We compared times to switching and CD4 cell counts at switching, and obtained adjusted hazard ratios for switching (aHRs) with 95% confidence intervals (CIs) from random-effects Weibull models. RESULTS: A total of 20 113 patients, including 6369 (31.7%) patients from 10 programmes with access to viral load monitoring, were analysed; 576 patients (2.9%) switched. Low CD4 cell counts at ART initiation were associated with switching in all programmes. Median time to switching was 16.3 months [interquartile range (IQR) 10.1-26.6] in programmes with viral load monitoring and 21.8 months (IQR 14.0-21.8) in programmes without viral load monitoring (P < 0.001). Median CD4 cell counts at switching were 161 cells/microl (IQR 77-265) in programmes with viral load monitoring and 102 cells/microl (44-181) in programmes without viral load monitoring (P < 0.001). Switching was more common in programmes with viral load monitoring during months 7-18 after starting ART (aHR 1.38; 95% CI 0.97-1.98), similar during months 19-30 (aHR 0.97; 95% CI 0.58-1.60) and less common during months 31-42 (aHR 0.29; 95% CI 0.11-0.79). CONCLUSION: In resource-limited settings, switching to second-line regimens tends to occur earlier and at higher CD4 cell counts in ART programmes with viral load monitoring compared with programmes without viral load monitoring.
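The comparison above rests on a Weibull proportional-hazards model, in which a covariate multiplies the baseline hazard by a hazard ratio exp(β). The sketch below illustrates that relationship in plain Python; the shape/scale values are toy numbers, and the 1.38 ratio merely echoes the months 7-18 aHR reported above — this is not a reanalysis of the study data.

```python
import math

def weibull_hazard(t, shape, scale):
    # Baseline Weibull hazard: h(t) = (k / lam) * (t / lam) ** (k - 1)
    return (shape / scale) * (t / scale) ** (shape - 1)

def hazard_with_covariate(t, shape, scale, beta, x):
    # Proportional hazards: covariate x scales the baseline hazard by exp(beta * x)
    return weibull_hazard(t, shape, scale) * math.exp(beta * x)

# Toy illustration: beta chosen so the hazard ratio equals 1.38;
# shape=1.5 and scale=24 (months) are arbitrary example parameters
beta = math.log(1.38)
hr = hazard_with_covariate(12, 1.5, 24, beta, 1) / weibull_hazard(12, 1.5, 24)
print(round(hr, 2))  # 1.38
```

Note that under proportional hazards the ratio is the same at every time point t, which is why a single aHR can summarize each follow-up window.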
Abstract:
BACKGROUND: The purpose of the study was to investigate allogeneic blood transfusion (ABT) and preoperative anemia as risk factors for surgical site infection (SSI). STUDY DESIGN AND METHODS: A prospective, observational cohort of 5873 consecutive general surgical procedures at Basel University Hospital was analyzed to determine the relationship between perioperative ABT and preoperative anemia and the incidence of SSI. ABT was defined as transfusion of leukoreduced red blood cells during surgery, and anemia as a hemoglobin concentration of less than 120 g/L before surgery. Surgical wounds and resulting infections were assessed according to Centers for Disease Control standards. RESULTS: The overall SSI rate was 4.8% (284 of 5873). In univariable logistic regression analyses, perioperative ABT (crude odds ratio [OR], 2.93; 95% confidence interval [CI], 2.1 to 4.0; p < 0.001) and preoperative anemia (crude OR, 1.32; 95% CI, 1.0 to 1.7; p = 0.037) were significantly associated with increased odds of SSI. After adjusting for 13 characteristics of the patient and the procedure in multivariable analyses, associations were substantially reduced for ABT (OR, 1.25; 95% CI, 0.8 to 1.9; p = 0.310 for 1-2 blood units; OR, 1.07; 95% CI, 0.6 to 2.0; p = 0.817 for ≥3 blood units) and anemia (OR, 0.91; 95% CI, 0.7 to 1.2; p = 0.530). Duration of surgery was the main confounding variable. CONCLUSION: Our findings point to important confounding factors and strengthen existing doubts about leukoreduced ABT during general surgery and preoperative anemia as risk factors for SSI.
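The crude odds ratios above come from univariable logistic regression, but a crude OR and its Woolf 95% CI can be computed directly from a 2×2 table on the log-odds scale. A minimal sketch, using hypothetical counts — the study's raw table is not given in the abstract:

```python
import math

def crude_odds_ratio(a, b, c, d):
    """2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Returns (OR, lower, upper) with a Woolf 95% CI on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, lower, upper

# Hypothetical counts (NOT the study's data; the abstract reports OR 2.93, CI 2.1-4.0)
or_, lower, upper = crude_odds_ratio(20, 80, 100, 800)
print(round(or_, 2))  # 2.0
```

The multivariable adjustment that shrank these estimates cannot be reproduced from the abstract alone, since it requires the 13 patient- and procedure-level covariates.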
Abstract:
BACKGROUND: Patients with apparent complete recovery from thrombotic thrombocytopenic purpura (TTP) often complain of problems with memory, concentration, and fatigue. STUDY DESIGN AND METHODS: Twenty-four patients who were enrolled in the Oklahoma TTP-HUS Registry for their initial episode of TTP, 1995-2006, and who had ADAMTS13 activity of less than 10 percent were evaluated for a broad range of cognitive functions 0.1 to 10.6 years (median, 4.0 years) after their most recent episode. At the time of their evaluation, they had normal physical and Mini-Mental State Examinations and no evidence of TTP. RESULTS: The patients, as a group, performed significantly worse on 4 of the 11 cognitive domains tested than standardized US data from neurologically normal individuals adjusted for age, sex, and education (p < 0.05). These four domains measured complex attention and concentration skills, information processing speed, rapid language generation, and rote memorization. Twenty-one (88%) patients performed below expectations on at least 1 of the 11 domains. No clear patterns were observed between cognitive test results and patients' characteristics or features of the preceding TTP, including age, occurrence of severe neurologic abnormalities, multiple episodes, and interval from an acute episode. CONCLUSION: Patients who have recovered from TTP may have persistent cognitive abnormalities. The abnormalities observed in these patients are characteristic of disorders associated with diffuse subcortical microvascular disease. Studies of larger patient groups will be required to confirm these preliminary observations and to determine patient characteristics that may contribute to persistent cognitive abnormalities.
Abstract:
BACKGROUND: Reports of deterioration and death after platelet (PLT) transfusions in patients with thrombotic thrombocytopenic purpura (TTP) have led to recommendations that they should not be given except for life-threatening hemorrhage. STUDY DESIGN AND METHODS: Published reports of PLT transfusions in patients with TTP were systematically reviewed and data from the Oklahoma TTP-HUS Registry, an inception cohort of 382 consecutive patients, 1989 through 2007, were analyzed. RESULTS: A systematic review identified 34 publications describing outcomes of patients with TTP after PLT transfusions: 9 articles attributed complications to PLT transfusions, 4 suggested that they may be safe, and 21 articles did not comment about a relation between PLT transfusions and outcomes. Fifty-four consecutive patients from the Oklahoma TTP-HUS Registry were prospectively analyzed. ADAMTS13 activity was less than 10 percent in 47 patients; also included were 7 patients whose activity was not measured but who may have been deficient. Thirty-three (61%) patients received PLT transfusions. The frequency of death was not different between the two groups (p = 0.971): 8 (24%) patients who received PLT transfusions died (thrombosis, 5; hemorrhage, 1; sepsis, 2) and 5 (24%) patients who did not receive PLT transfusions died (thrombosis, 4; hemorrhage, 1). The frequency of severe neurologic events was also not different (p = 0.190): 17 (52%) patients who received PLT transfusions (in 5 of these 17 patients, neurologic events only occurred before PLT transfusions) and 7 (33%) patients who did not receive PLT transfusions. CONCLUSION: Evidence for harm from PLT transfusions in patients with TTP is uncertain.
Abstract:
OBJECTIVE: To determine the characteristics of asthma (A) and allergic rhinitis (AR) among asthma patients in primary care practice. RESEARCH DESIGN AND METHODS: Primary care physicians, pulmonologists, and allergologists were asked to recruit consecutive asthma patients with or without allergic rhinitis from their daily practice. Cross-sectional data on symptoms, severity, treatment and the impact on quality of life of A and AR were recorded and examined using descriptive statistics. Patients with and without AR were then compared. RESULTS: 1244 asthma patients were included by 211 physicians. Asthma was controlled in 19%, partially controlled in 27% and not controlled in 54%. Asthma treatment was generally based on inhaled corticosteroids (ICS) with or without long-acting beta-2 agonists (78%). A leukotriene receptor antagonist (LTRA) was used by 46% of the patients. Overall, 950 (76%) asthma patients had AR (A + AR) and 294 (24%) did not (A - AR). Compared to patients with A - AR, A + AR patients were generally younger (mean age ± standard deviation: 42 ± 16 vs. 50 ± 19 years, p < 0.001) and fewer used ICS (75% vs. 88%, p < 0.001). LTRA usage was similar in both groups (46% vs. 48%). Asthma was uncontrolled in 53% of A + AR and 57% of A - AR patients. Allergic rhinitis was treated with a mean of 1.9 specific AR medications: antihistamines (77%), nasal steroids (66%) and/or vasoconstrictors (38%), and/or LTRA (42%). Rhinorrhoea, nasal obstruction, or nasal itching were the most frequently reported AR symptoms, and the greatest reported degree of impairment was in daily activities/sports (55%). CONCLUSIONS: Allergic rhinitis was more common among younger asthma patients and increased the burden of symptoms and the need for additional medication, but was associated with slightly better asthma control. However, most asthma patients remained suboptimally controlled regardless of concomitant AR.
Abstract:
OBJECTIVE Little information is available on the early course of hypertension in type 1 diabetes. The aim of our study, therefore, was to document circadian blood pressure profiles in patients with a diabetes duration of up to 20 years and to relate daytime and nighttime blood pressure to duration of diabetes, BMI, insulin therapy, and HbA1c. RESEARCH DESIGN AND METHODS Ambulatory profiles of 24-h blood pressure were recorded in 354 pediatric patients with type 1 diabetes (age 14.6 ± 4.2 years, duration of diabetes 5.6 ± 5.0 years, follow-up for up to 9 years). A total of 1,011 profiles were available for analysis from patients not receiving antihypertensive medication. RESULTS Although daytime mean systolic pressure was significantly elevated in diabetic subjects (+3.1 mmHg; P < 0.0001), daytime diastolic pressure was not different from the height- and sex-adjusted normal range (+0.1 mmHg, NS). In contrast, both systolic and diastolic nighttime values were clearly elevated (+7.2 and +4.2 mmHg; P < 0.0001), and nocturnal dipping was reduced (P < 0.0001). Systolic blood pressure was related to overweight in all patients, while diastolic blood pressure was related to metabolic control in young adults. Blood pressure variability was significantly lower in girls compared with boys (P < 0.01). During follow-up, no increase of blood pressure was noted; however, diastolic nocturnal dipping decreased significantly (P < 0.03). Mean daytime blood pressure was significantly related to office blood pressure (r = +0.54 for systolic and r = +0.40 for diastolic pressure); however, hypertension was confirmed by ambulatory blood pressure measurement in only 32% of patients with elevated office blood pressure. CONCLUSIONS During the early course of type 1 diabetes, daytime blood pressure is higher compared with that of healthy control subjects. The elevation of nocturnal values is even more pronounced and nocturnal dipping is reduced.
The frequency of white-coat hypertension is high among adolescents with diabetes, and ambulatory blood pressure monitoring avoids unnecessary antihypertensive treatment.
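Nocturnal dipping, referenced above, is conventionally the percentage fall in mean blood pressure from day to night, with a fall below roughly 10% commonly labelled "non-dipping". A minimal sketch with made-up readings — the 10% threshold is a common convention, not a value stated in the abstract:

```python
def dipping_percent(day_mean, night_mean):
    # Nocturnal dip: percentage fall from daytime to nighttime mean BP
    return 100.0 * (day_mean - night_mean) / day_mean

# Hypothetical systolic means in mmHg (not study data)
dip = dipping_percent(120.0, 112.0)
print(round(dip, 1))  # 6.7
is_reduced = dip < 10.0  # "non-dipping" by the usual convention
```

The same calculation applies separately to systolic and diastolic means, which is why the abstract can report a decline specifically in diastolic nocturnal dipping.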
Abstract:
BACKGROUND Heart failure with preserved ejection fraction (HFpEF) is remarkably common in elderly people with highly prevalent comorbid conditions. Despite its increasing prevalence, there is no evidence-based effective therapy for HFpEF. We sought to evaluate whether inspiratory muscle training (IMT) improves exercise capacity, as well as left ventricular diastolic function, biomarker profile and quality of life (QoL), in patients with advanced HFpEF and non-reduced maximal inspiratory pressure (MIP). DESIGN AND METHODS A total of 26 patients with HFpEF (median (interquartile range) age, peak exercise oxygen uptake (peak VO2) and left ventricular ejection fraction of 73 years (66-76), 10 ml/min/kg (7.6-10.5) and 72% (65-77), respectively) were randomized to receive a 12-week programme of IMT plus standard care vs. standard care alone. The primary endpoint was change in cardiopulmonary exercise parameters and in the distance walked in 6 minutes (6MWT). Secondary endpoints were changes in QoL, echocardiographic parameters of diastolic function, and prognostic biomarkers. RESULTS The IMT group significantly improved MIP (p < 0.001), peak VO2 (p < 0.001), oxygen uptake at the anaerobic threshold (p = 0.001), ventilatory efficiency (p = 0.007), metabolic equivalents (p < 0.001), 6MWT (p < 0.001), and QoL (p = 0.037) compared with the control group. No changes in diastolic function parameters or biomarker levels were observed between the groups. CONCLUSIONS In HFpEF patients with low aerobic capacity and non-reduced MIP, IMT was associated with marked improvement in exercise capacity and QoL.
Abstract:
Software-maintenance offshore outsourcing (SMOO) projects have been plagued by tedious knowledge transfer during the service transition to the vendor. Vendor engineers risk being overstrained by the high amounts of novel information, resulting in extra costs that may erode the business case behind offshoring. Although stakeholders may wish to avoid these extra costs by implementing appropriate knowledge transfer practices, little is known about how effective knowledge transfer can be designed and managed in light of the high cognitive loads in SMOO transitions. The dissertation at hand addresses this research gap by presenting and integrating four studies. The studies draw on cognitive load theory, attributional theory, and control theory, and they apply qualitative, quantitative, and simulation methods to qualitative data from eight in-depth longitudinal cases. The results suggest that the choice of appropriate learning tasks may be more central to knowledge transfer than the amount of information shared with vendor engineers. Moreover, because vendor staff may not be able, and may not dare, to effectively self-manage learning tasks during early transition, client-driven controls may be initially required and subsequently faded out. Collectively, the results call for people-based rather than codification-based knowledge management strategies in at least moderately specific and complex software environments.
Abstract:
Susceptibility of different restorative materials to toothbrush abrasion and coffee staining
Objective: The aim of this study was to evaluate the susceptibility of different restorative materials to surface alterations after an aging simulation. Methods: Specimens (n=15 per material) of five different restorative materials (CER: ceramic/Vita Mark II; EMP: composite/Empress Direct; LAV: CAD/CAM composite/Lava Ultimate; COM: prefabricated composite/Componeer; VEN: prefabricated composite/Venear) were produced. Whereas CER was glazed, EMP and LAV were polished with silicon polishers, and COM and VEN were left untreated. Mean roughness (Ra and Rz) and colorimetric parameters (L*a*b*), expressed as colour change (ΔE), were measured. The specimens underwent an artificial aging procedure. After baseline measurements (M1), the specimens were successively immersed for 24 hours in coffee (M2), abraded in a toothbrushing simulator (M3), immersed in coffee (M4), abraded (M5) and repeatedly abraded (M6). After each aging procedure (M2-M6), surface roughness and colorimetric parameters were recorded. Differences between the materials regarding Ra/Rz and ΔE were analysed with a nonparametric ANOVA analysis. The level of significance was set at α=0.05. Results: The lowest roughness values were obtained for CER. A significant increase in Ra was detected for EMP, COM and VEN compared to CER. The Ra/Rz values were found to be highly significantly different for the materials and measuring times (M) (p<0.0001). Regarding ΔE, most alterations were found for EMP and COM, whereas CER and LAV remained mostly stable. The ΔE values were significantly different for the materials and M (p<0.0001). Conclusion: The ceramic and the CAD/CAM composite were the most stable materials with regard to roughness and colour change and the only materials that resulted in Ra values below 0.2 μm (the clinically relevant threshold).
Venears and Componeers were more inert than the direct composite material and thus might be an alternative for extensive restorations in the aesthetic zone.
Abstract:
BACKGROUND Cytomegalovirus (CMV) retinitis is a major cause of visual impairment and blindness among patients with uncontrolled HIV infection. Whereas polymorphisms in interferon-lambda 3 (IFNL3, previously named IL28B) strongly influence the clinical course of hepatitis C, few studies have examined the role of such polymorphisms in infections due to viruses other than hepatitis C virus. OBJECTIVES To analyze the association of the newly identified IFNL3/4 variant rs368234815 with susceptibility to CMV-associated retinitis in a cohort of HIV-infected patients. DESIGN AND METHODS This retrospective longitudinal study included 4884 white patients from the Swiss HIV Cohort Study, among whom 1134 were at risk of developing CMV retinitis (CD4 nadir <100 cells/μl and positive CMV serology). The association of CMV-associated retinitis with rs368234815 was assessed by cumulative incidence curves and multivariate Cox regression models, using the estimated date of HIV infection as the starting point, with censoring at death and/or loss to follow-up. RESULTS A total of 40 of the 1134 patients at risk developed CMV retinitis. The minor allele of rs368234815 was associated with a higher risk of CMV retinitis (log-rank test P = 0.007, recessive mode of inheritance). The association remained significant in a multivariate Cox regression model (hazard ratio 2.31, 95% confidence interval 1.09-4.92, P = 0.03) after adjustment for CD4 nadir and slope, HAART and HIV-risk groups. CONCLUSION We report for the first time an association between an IFNL3/4 polymorphism and susceptibility to AIDS-related CMV retinitis. IFNL3/4 may influence immunity against viruses other than HCV.
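The hazard ratio and confidence interval above come from a Cox model, where HR = exp(β) and the 95% CI is exp(β ± 1.96·SE) on the log-hazard scale. A small sketch of that back-transformation; the standard error of 0.385 is a hypothetical value chosen for illustration, not one reported by the study:

```python
import math

def hr_with_ci(beta, se, z=1.96):
    # Convert a Cox log-hazard coefficient and its SE into an HR with a 95% CI
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative only: beta chosen so the HR matches the reported 2.31;
# se=0.385 is an assumed value that reproduces a CI close to 1.09-4.92
hr, lower, upper = hr_with_ci(math.log(2.31), 0.385)
print(round(hr, 2))  # 2.31
```

Working on the log scale like this is also why the reported CI is asymmetric around the point estimate.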
Abstract:
BACKGROUND Dimethyl sulfoxide (DMSO) is essential for the preservation of liquid nitrogen-frozen stem cells, but is associated with toxicity in the transplant recipient. STUDY DESIGN AND METHODS In this prospective noninterventional study, we describe the use of DMSO in 64 European Blood and Marrow Transplant Group centers undertaking autologous transplantation on patients with myeloma and lymphoma and analyze side effects after return of DMSO-preserved stem cells. RESULTS While the majority of centers continue to use 10% DMSO, a significant proportion either use lower concentrations, mostly 5 or 7.5%, or wash cells before infusion (some for selected patients only). In contrast, the median dose of DMSO given (20 mL) was much less than the upper limit set by the same institutions (70 mL). In an accompanying statistical analysis of side effects noted after return of DMSO-preserved stem cells, we show that patients in the highest quartile of DMSO received (mL and mL/kg body weight) had significantly more side effects attributed to DMSO, although this effect was not observed if DMSO was calculated as mL/min. Dividing the myeloma and lymphoma patients each into two equal groups by age, we were able to confirm this result in all but young myeloma patients, in whom an inversion of the odds ratio was seen, possibly related to the higher dose of melphalan received by young myeloma patients. CONCLUSION We suggest that better standardization of the preservation method, with reduced DMSO concentration and attention to the dose of DMSO received by patients, could help reduce the toxicity and morbidity of the transplant procedure.
Abstract:
Background information: During the late 1970s and the early 1980s, West Germany witnessed a reversal of gender differences in educational attainment, as females began to outperform males. Purpose: The main objective was to analyse which processes were behind the reversal of gender differences in educational attainment after 1945. The theoretical reflections and empirical evidence presented for the US context by DiPrete and Buchmann (Gender-specific trends in the value of education and the emerging gender gap in college completion, Demography 43: 1–24, 2006) and Buchmann, DiPrete, and McDaniel (Gender inequalities in education, Annual Review of Sociology 34: 319–37, 2008) are considered and applied to the West German context. It is suggested that the reversal of gender differences is a consequence of the change in female educational decisions, which are mainly related to labour market opportunities and not, as sometimes assumed, a consequence of a 'boys' crisis'. Sample: Several databases, such as the German General Social Survey, the German Socio-economic Panel and the German Life History Study, are employed for the longitudinal analysis of the educational and occupational careers of birth cohorts born in the twentieth century. Design and methods: Changing patterns of eligibility for university studies are analysed for successive birth cohorts and gender. Binary logistic regressions are employed for the statistical modelling of the individuals' achievement, educational decision and likelihood for social mobility – reporting average marginal effects (AME). Results: The empirical results suggest that women's better school achievement, being constant across cohorts, does not contribute to the explanation of the reversal of gender differences in higher education attainment, but the increase of benefits for higher education explains the changing educational decisions of women regarding their transition to higher education.
Conclusions: The outperformance of females compared with males in higher education might have been initiated by several social changes, including the expansion of public employment, the growing demand for highly qualified female workers in welfare and service areas, the increasing returns of women's increased education and training, and the improved opportunities for combining family and work outside the home. The historical data show that, in terms of (married) women's increased labour market opportunities and female life-cycle labour force participation, the rising rates of women's enrolment in higher education were – among other reasons – partly explained by their rising access to service class positions across birth cohorts, and the rise of their educational returns in terms of wages and long-term employment.
Abstract:
OBJECTIVES In Europe and elsewhere, health inequalities among HIV-positive individuals are of concern. We investigated late HIV diagnosis and late initiation of combination antiretroviral therapy (cART) by educational level, a proxy of socioeconomic position. DESIGN AND METHODS We used data from nine HIV cohorts within COHERE in Austria, France, Greece, Italy, Spain and Switzerland, collecting data on level of education in categories of the UNESCO International Standard Classification of Education: non-completed basic, basic, secondary and tertiary education. We included individuals diagnosed with HIV between 1996 and 2011, aged at least 16 years, with known educational level and at least one CD4 cell count within 6 months of HIV diagnosis. We examined trends by educational level in presentation with advanced HIV disease (AHD) (CD4 <200 cells/μl or AIDS within 6 months) using logistic regression, and the distribution of CD4 cell count at cART initiation, overall and among presenters without AHD, using median regression. RESULTS Among 15 414 individuals, 52, 45, 37, and 31% with non-completed basic, basic, secondary and tertiary education, respectively, presented with AHD (P trend <0.001). Compared to patients with tertiary education, adjusted odds ratios of AHD were 1.72 (95% confidence interval 1.48-2.00) for non-completed basic, 1.39 (1.24-1.56) for basic and 1.20 (1.08-1.34) for secondary education (P < 0.001). In unadjusted and adjusted analyses, median CD4 cell count at cART initiation was lower with poorer educational level. CONCLUSIONS Socioeconomic inequalities in delayed HIV diagnosis and initiation of cART are present in European countries with universal healthcare systems, and individuals with lower educational level do not benefit equally from timely cART initiation.
Abstract:
BACKGROUND Delayed-onset muscle soreness (DOMS) is a common symptom in people participating in exercise, sport, or recreational physical activities. Several remedies have been proposed to prevent and alleviate DOMS. DESIGN AND METHODS A five-arm randomized controlled study was conducted to examine the effects of acupuncture on eccentric exercise-induced DOMS of the biceps brachii muscle. Participants were recruited through convenience sampling of students and the general public. Participants were randomly allocated to needle acupuncture, laser acupuncture, sham needle acupuncture, sham laser acupuncture, or no intervention. Outcome measures included pressure pain threshold (PPT), pain intensity (visual analog scale), and maximum isometric voluntary force. RESULTS Delayed-onset muscle soreness was induced in 60 participants (22 females, age 23.6 ± 2.8 years, weight 66.1 ± 9.6 kg, and height 171.6 ± 7.9 cm). Neither verum nor sham interventions significantly improved outcomes within 72 hours when compared with the no-treatment control (P > 0.05). CONCLUSIONS Acupuncture was not effective in the treatment of DOMS. From a mechanistic point of view, these results have implications for further studies: (1) considering the high-threshold mechanosensitive nociceptors of the muscle, the cutoff for PPT (5 kg/cm²) chosen to avoid bruising might have led to ceiling effects; (2) the traditional acupuncture regimen, targeting muscle pain, might have been inappropriate as the DOMS mechanisms seem limited to the muscular unit and its innervation. Therefore, a regionally based regimen including intensified intramuscular needling (dry needling) should be tested in future studies, using a higher cutoff for PPT to avoid ceiling effects.
Abstract:
OBJECTIVES Rates of TB/HIV coinfection and multidrug-resistant (MDR) TB are increasing in Eastern Europe (EE). We aimed to study clinical characteristics, factors associated with MDR-TB and the predicted activity of empiric anti-TB treatment at the time of TB diagnosis among TB/HIV coinfected patients in EE, Western Europe (WE) and Latin America (LA). DESIGN AND METHODS Between January 1, 2011, and December 31, 2013, 1413 TB/HIV patients (62 clinics in 19 countries in EE, WE, Southern Europe (SE), and LA) were enrolled. RESULTS Significant differences were observed between EE (N = 844), WE (N = 152), SE (N = 164), and LA (N = 253) in the proportion of patients with a definite TB diagnosis (47%, 71%, 72% and 40%, p<0.0001), MDR-TB (40%, 5%, 3% and 15%, p<0.0001), and use of combination antiretroviral therapy (cART) (17%, 40%, 44% and 35%, p<0.0001). Injecting drug use (adjusted OR (aOR) 2.03, 95% CI 1.00-4.09), prior anti-TB treatment (aOR 3.42, 95% CI 1.88-6.22), and living in EE (aOR 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. Among 585 patients with drug susceptibility test (DST) results, the empiric (i.e. without knowledge of the DST results) anti-TB treatment included ≥3 active drugs in 66% of participants in EE compared with 90-96% in the other regions (p<0.0001). CONCLUSIONS In EE, TB/HIV patients were less likely to receive a definite TB diagnosis, more likely to harbour MDR-TB and commonly received empiric anti-TB treatment with reduced activity. Improving the management of TB/HIV patients in EE requires better access to TB diagnostics, including DSTs, empiric anti-TB therapy directed at both susceptible and MDR-TB, and more widespread use of cART.