93 results for "change of generation"
Abstract:
OBJECTIVE: To determine the effects of cognitive-behavioral stress management (CBSM) training on clinical and psychosocial markers in HIV-infected persons. METHODS: A randomized controlled trial of 104 HIV-infected persons taking combination antiretroviral therapy (cART) in four HIV outpatient clinics, measuring HIV-1 surrogate markers, adherence to therapy and well-being 12 months after 12 group sessions of 2-h CBSM training. RESULTS: Intent-to-treat analyses showed no effects on HIV-1 surrogate markers in the CBSM group compared with the control group: HIV-1 RNA < 50 copies/ml in 81.1% [95% confidence interval (CI), 68.0-90.6] and 74.5% (95% CI, 60.4-85.7), respectively (P = 0.34), and mean CD4 cell change from baseline of 53.0 cells/microl (95% CI, 4.1-101.8) and 15.5 cells/microl (95% CI, -34.3 to 65.4), respectively (P = 0.29). Self-reported adherence to therapy did not differ between groups at baseline (P = 0.53) or at 12 months post-intervention (P = 0.47). Significant benefits of CBSM over no intervention were observed in mean change of quality of life scores: physical health 2.9 (95% CI, 0.7-5.1) and -0.2 (95% CI, -2.1 to 1.8), respectively (P = 0.05); mental health 4.8 (95% CI, 1.8-7.3) and -0.5 (95% CI, -3.3 to 2.2), respectively (P = 0.02); anxiety -2.1 (95% CI, -3.6 to -1.0) and 0.3 (95% CI, -0.7 to 1.4), respectively (P = 0.002); and depression -2.1 (95% CI, -3.2 to -0.9) and 0.02 (95% CI, -1.0 to 1.1), respectively (P = 0.001). Alleviation of depression and anxiety symptoms was most pronounced among participants with high psychological distress at baseline. CONCLUSION: CBSM training of HIV-infected persons taking cART does not improve clinical outcome but has lasting effects on quality of life and psychological well-being.
Abstract:
Background: Medical students do not accurately self-assess their competence. However, little is known about their awareness of changes in competence over time. The aim of this study was to evaluate whether students are aware of their progress. Summary of work: Twenty-two fourth-year medical students underwent self- and expert-assessments of their clinical skills in musculoskeletal medicine in an OSCE-like station (4-point Likert scale) at the beginning (t0) and end (t1) of their eight-week clerkship in internal medicine. Thirteen students were assigned to the intervention, a 6 x 1-hour practical examination course; nine took part in the regular clinical clerkship activities only and served as controls. Summary of results/Conclusions: The intervention students significantly improved their skills (from 2.78 ± 0.36 to 3.30 ± 0.36, p<0.05) in contrast to the control students (from 3.11 ± 0.58 to 2.83 ± 0.49, n.s.). At t0, 19 of the 22 students underestimated their competence; at t1, 21 of 22 did. Correlations between the change of self- and expert-assessment were r=0.43, p<0.05 (all), r=0.47, n.s. (control) and r=-0.12, n.s. (intervention), respectively. Take-home message: Medical students who improve their clinical skills through an interactive course in addition to their regular clerkship activities are not aware of their progress.
Abstract:
OBJECTIVES: To evaluate the relationship between T1 after intravenous contrast administration (T1(Gd)) and the delta relaxation rate DeltaR1 = 1/T1(Gd) - 1/T1(0) in the delayed gadolinium-enhanced MRI of cartilage (dGEMRIC) evaluation of cartilage repair tissue. MATERIALS AND METHODS: Thirty single MR examinations from 30 patients after matrix-associated autologous chondrocyte transplantation of the knee joint with different postoperative intervals were examined using an 8-channel knee coil at 3T. T1 mapping using a 3D GRE sequence with a 35/10 degrees flip angle excitation pulse combination was performed before and after contrast administration (dGEMRIC technique). T1 postcontrast (T1(Gd)) and DeltaR1 (the relative index of pre- and postcontrast R1 values) were calculated for repair tissue and for the weight-bearing, normal-appearing control cartilage. For evaluation of the different postoperative intervals, MR exams were subdivided into 3 groups (up to 12 months, 12-24 months, more than 24 months). For statistical analysis, Spearman correlation coefficients were calculated. RESULTS: The mean value for T1 postcontrast was 427 +/- 159 ms and for DeltaR1 1.85 +/- 1.0; in reference cartilage, 636 +/- 181 ms for T1 postcontrast and 0.83 +/- 0.5 for DeltaR1. The correlation coefficients between T1(Gd) and DeltaR1 were highly significant for repair tissue (0.969) as well as normal reference cartilage (0.928) in total, and for the reparative cartilage in the early, middle, and late postoperative intervals after surgery (R values: -0.986, -0.970, and -0.978, respectively). Using either T1(Gd) or DeltaR1, the 2 metrics led to similar conclusions regarding the time course of change of repair tissue and control tissue, namely that highly significant (P < 0.01) differences between cartilage repair tissue and reference cartilage were found for all follow-up groups. Additionally, for both metrics, highly significant differences (P < 0.01) between early follow-up and the 2 later postoperative groups were found for cartilage repair tissue. No statistical differences were found between the 2 later follow-up groups of reparative cartilage for either T1(Gd) or DeltaR1. CONCLUSION: The high correlation between T1(Gd) and DeltaR1 and the comparable conclusions reached using either metric imply that T1 mapping before intravenous administration of MR contrast agent is not necessary for the evaluation of repair tissue. This helps to reduce costs and patient inconvenience, simplifies the examination procedure, and makes dGEMRIC more attractive for follow-up of patients after cartilage repair surgery.
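The dGEMRIC relation above, DeltaR1 = 1/T1(Gd) - 1/T1(0), is simple to compute; a minimal sketch follows, where the post-contrast value is the reported repair-tissue mean and the pre-contrast T1(0) is a purely illustrative assumption, not a study value.

```python
# Minimal sketch of the dGEMRIC index DeltaR1 = 1/T1(Gd) - 1/T1(0).
# The pre-contrast T1(0) below is an assumed placeholder, not from the study.

def delta_r1(t1_post_ms: float, t1_pre_ms: float) -> float:
    """Return DeltaR1 in 1/s from post- and pre-contrast T1 values given in ms."""
    return 1.0 / (t1_post_ms / 1000.0) - 1.0 / (t1_pre_ms / 1000.0)

# Reported mean post-contrast T1 for repair tissue: 427 ms; assumed T1(0): 1200 ms.
print(f"DeltaR1 = {delta_r1(427.0, 1200.0):.2f} 1/s")
```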
Abstract:
GOALS OF WORK: In patients with locally advanced esophageal cancer, only those responding to the treatment ultimately benefit from preoperative chemoradiation. We investigated whether changes in subjective dysphagia or eating restrictions after two cycles of induction chemotherapy can predict the histopathological tumor response observed after chemoradiation. In addition, we examined general long-term quality of life (QoL) and, in particular, eating restrictions after esophagectomy. MATERIALS AND METHODS: Patients with resectable, locally advanced squamous cell carcinoma or adenocarcinoma of the esophagus were treated with two cycles of chemotherapy followed by chemoradiation and surgery. They were asked to complete the EORTC oesophageal-specific QoL module (EORTC QLQ-OES24) and linear analogue self-assessment QoL indicators before and during neoadjuvant therapy and quarterly until 1 year postoperatively. A median change of at least eight points was considered clinically meaningful. MAIN RESULTS: Clinically meaningful improvements in the median scores for dysphagia and eating restrictions were found during induction chemotherapy. These improvements were not associated with the histopathological response observed after chemoradiation, but they enhanced treatment compliance. Postoperatively, dysphagia scores remained low at 1 year, while eating restrictions persisted more frequently in patients with extended transthoracic resection compared to those with limited transhiatal resection. CONCLUSIONS: The improvement of dysphagia and eating restrictions after induction chemotherapy did not predict the tumor response observed after chemoradiation. One year after esophagectomy, dysphagia was a minor problem and global QoL was rather good. Eating restrictions persisted depending on the surgical technique used.
Abstract:
BACKGROUND AND PURPOSE: In order to use a single implant with one treatment plan in fractionated high-dose-rate brachytherapy (HDR-B), applicator position shifts must be corrected prior to each fraction. The authors investigated the use of gold markers for X-ray-based setup and position control between the single fractions. PATIENTS AND METHODS: Caudad-cephalad movement of the applicators prior to each HDR-B fraction was determined on radiographs using two to three gold markers, which had been inserted into the prostate as an intraprostatic reference, and one to two radiopaque-labeled reference applicators. 35 prostate cancer patients, treated by HDR-B as a monotherapy between 10/2003 and 06/2006 with four fractions of 9.5 Gy each, were analyzed. Toxicity was scored according to the CTCAE score, version 3.0. Median follow-up was 3 years. RESULTS: The mean change of applicator positions compared to baseline varied substantially between HDR-B fractions, being 1.4 mm before fraction 1 (range, -4 to 2 mm), -13.1 mm before fraction 2 (range, -36 to 0 mm), -4.1 mm before fraction 3 (range, -21 to 9 mm), and -2.6 mm before fraction 4 (range, -16 to 9 mm). The original position of the applicators could be readjusted easily prior to each fraction in every patient. In 18 patients (51%), the applicators were readjusted by > 10 mm at least once; however, acute or late grade >= 2 genitourinary toxicity was not increased (p = 1.0) in these patients. CONCLUSION: Caudad position shifts of up to 36 mm were observed. Gold markers represent a valuable tool to ensure setup accuracy and precise dose delivery in fractionated HDR-B monotherapy of prostate cancer.
Abstract:
In Europe and the United States, the recreational use of gamma-hydroxybutyric acid (GHB) at dance clubs and "rave" parties has increased substantially. In addition, GHB is used to assist in the commission of sexual assaults. The aim of this controlled clinical study was to acquire pharmacokinetic profiles, detection times, and excretion rates in human subjects. Eight GHB-naïve volunteers were administered a single 25-mg/kg body weight oral dose of GHB, and plasma, urine, and oral fluid specimens were analyzed by gas chromatography-mass spectrometry (GC-MS). Liquid-liquid extraction was performed after acid conversion of GHB to gamma-butyrolactone. Limits of quantitation of 0.1 (oral fluid), 0.2 (urine), and 0.5 microg/mL (plasma) could be achieved in the selected ion monitoring mode. GHB plasma peaks of 39.4 +/- 25.2 microg/mL (mean +/- SEM) occurred 20-45 min after administration. The terminal plasma elimination half-life was 30.4 +/- 2.45 min, the distribution volume 52.7 +/- 15.0 L, and the total clearance 1228 +/- 233 mL/min. In oral fluid, GHB could be detected for up to 360 min, with peak concentrations of 203 +/- 92.4 microg/mL in the 10-min samples. In urine, 200 +/- 71.8 and 230 +/- 86.3 microg/mL were the highest GHB levels, measured at 30 and 60 min, respectively. Only 1.2 +/- 0.2% of the dose was excreted, resulting in a detection window of 720 min. Common side effects were confusion, sleepiness, and dizziness; euphoria and changes of vital functions were not observed. GHB is extensively metabolized and rapidly eliminated in urine and oral fluid. Consequently, samples should be collected as soon as possible after ingestion.
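The reported kinetic parameters hang together under a simple one-compartment assumption (CL = Vd x ln 2 / t1/2); a quick back-of-the-envelope check, which yields roughly 1.2 L/min, close to the reported total clearance of 1228 mL/min:

```python
# Back-of-the-envelope check assuming one-compartment kinetics
# (not the study's own analysis): CL = Vd * ln(2) / t_half, reported means.
import math

t_half_min = 30.4                         # terminal elimination half-life (min)
vd_l = 52.7                               # distribution volume (L)

k_el = math.log(2) / t_half_min           # elimination rate constant (1/min)
cl_ml_min = k_el * vd_l * 1000.0          # total clearance (mL/min)
print(f"CL ~= {cl_ml_min:.0f} mL/min")    # ~1200 mL/min
```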
Abstract:
Colonization with more than one distinct strain of the same species, also termed cocolonization, is a prerequisite for horizontal gene transfer between pneumococcal strains that may lead to a change of the capsular serotype. Capsule switch has become an important issue since the introduction of conjugated pneumococcal polysaccharide vaccines. There is, however, a lack of techniques to detect multiple colonization by S. pneumoniae strains directly in nasopharyngeal samples. Two hundred eighty-seven nasopharyngeal swabs collected during the prevaccine era within a nationwide surveillance program were analyzed by a novel technique for the detection of cocolonization, based on PCR amplification of a noncoding region adjacent to the pneumolysin gene (plyNCR) and restriction fragment length polymorphism (RFLP) analysis. The number of strains and their relative abundance in cocolonized samples were determined by terminal RFLP. The pneumococcal carriage rate found by PCR was 51.6%, compared to 40.0% found by culture. Cocolonization was present in 9.5% (10/105) of samples, most (9/10) of which contained two strains in a ratio between 1:1 and 17:1. Five of the 10 cocolonized samples showed combinations of vaccine types only (n = 2) or combinations of nonvaccine types only (n = 3). Carriers of multiple pneumococcal strains had received recent antibiotic treatment more often than those colonized with a single strain (33% versus 9%, P = 0.025). This new technique allows for the rapid and economical study of pneumococcal cocolonization in nasopharyngeal swabs. It will be valuable for the surveillance of S. pneumoniae epidemiology under vaccine selection pressure.
Abstract:
BACKGROUND Acute cardiogenic shock after myocardial infarction is associated with high in-hospital mortality attributable to persistent low cardiac output. The Impella-EUROSHOCK registry evaluates the safety and efficacy of the Impella 2.5 percutaneous left-ventricular assist device in patients with cardiogenic shock after acute myocardial infarction. METHODS AND RESULTS This multicenter registry retrospectively included 120 patients (63.6±12.2 years; 81.7% male) with cardiogenic shock from acute myocardial infarction receiving temporary circulatory support with the Impella 2.5 percutaneous left-ventricular assist device. The primary end point was mortality at 30 days. The secondary end points were the change of plasma lactate after the institution of hemodynamic support, the rate of early major adverse cardiac and cerebrovascular events, and long-term survival. Thirty-day mortality was 64.2% in the study population. After Impella 2.5 implantation, lactate levels decreased from 5.8±5.0 mmol/L to 4.7±5.4 mmol/L (P=0.28) and 2.5±2.6 mmol/L (P=0.023) at 24 and 48 hours, respectively. Early major adverse cardiac and cerebrovascular events were reported in 18 (15%) patients. Major bleeding at the vascular access site, hemolysis, and pericardial tamponade occurred in 34 (28.6%), 9 (7.5%), and 2 (1.7%) patients, respectively. Age > 65 years and a lactate level > 3.8 mmol/L at admission were identified as predictors of 30-day mortality. After 317±526 days of follow-up, survival was 28.3%. CONCLUSIONS In patients with acute cardiogenic shock from acute myocardial infarction, Impella 2.5 treatment is feasible and results in a reduction of lactate levels, suggesting improved organ perfusion. However, 30-day mortality remains high in these patients. This likely reflects the last-resort character of Impella 2.5 use in selected patients with a poor hemodynamic profile and a greater imminent risk of death. Carefully conducted randomized controlled trials are necessary to evaluate the efficacy of Impella 2.5 support in this high-risk patient group.
Abstract:
INTRODUCTION Stable reconstruction of proximal femoral (PF) fractures is especially challenging due to the peculiarity of the injury patterns and the high load-bearing requirement. Since its introduction in 2007, the PF locking compression plate (LCP) 4.5/5.0 has improved osteosynthesis for intertrochanteric and subtrochanteric fractures of the femur. This study reports our early results with this implant. METHODS Between January 2008 and June 2010, 19 of 52 patients (12 males, 7 females; mean age 59 years, range 19-96 years) presenting with fractures of the trochanteric region were treated at the authors' level 1 trauma centre with open reduction and internal fixation using the PF-LCP. Postoperatively, partial weight bearing was allowed for all 19 patients. Follow-up included a thorough clinical and radiological evaluation at 1.5, 3, 6, 12, 24, 36 and 48 months. Failure analysis was based on conventional radiological and clinical assessment regarding the type of fracture, postoperative repositioning, secondary fracture dislocation in relation to the fracture constellation, and postoperative clinical function (Merle d'Aubigné score). RESULTS In 18 patients, surgery achieved adequate reduction and stable fixation without intra-operative complications. In one patient, an ad latus displacement was observed on postoperative X-rays. At the 3-month follow-up, four patients presented with secondary varus collapse, and at the 6-month follow-up, two patients had cut-outs of the proximal fragment, with one patient having implant failure due to a broken proximal screw. Revision surgery was performed in eight patients: one patient received a change of one screw, three patients underwent reosteosynthesis with implantation of a condylar plate, and one patient underwent hardware removal with secondary implantation of a total hip prosthesis. Eight patients suffered from persistent trochanteric pain, and three patients underwent hardware removal. CONCLUSIONS Early results for PF-LCP osteosynthesis show major complications in 7 of 19 patients, requiring reosteosynthesis or prosthesis implantation due to secondary loss of reduction, or hardware removal. Further studies are required to evaluate the limitations of this device.
Abstract:
BACKGROUND Assessment of the pre-test probability of pulmonary embolism (PE) and prognostic stratification are two widely recommended steps in the management of patients with suspected PE. Some items of the Geneva prediction rule may have prognostic value. We analyzed whether the initial probability assessed by the Geneva rule was associated with the outcome of patients with PE. METHODS In a post-hoc analysis of a multicenter trial including 1,693 patients with suspected PE, the all-cause death and readmission rates during the 3-month follow-up of patients with confirmed PE were analyzed. The PE probability group was prospectively assessed by the revised Geneva score (RGS). Similar analyses were made with the a posteriori-calculated simplified Geneva score (SGS). RESULTS PE was confirmed in 357 patients, and 21 (5.9%) died during the 3-month follow-up. The mortality rate differed significantly across the initial RGS groups, as it did across the SGS groups. For the RGS, mortality increased from 0% (95% confidence interval [CI]: 0-5.4%) in the low-probability group to 14.3% (95% CI: 6.3-28.2%) in the high-probability group, and for the SGS, from 0% (95% CI: 0-5.4%) to 17.9% (95% CI: 7.4-36%). Readmission occurred in 58 of the 352 patients with complete information on readmission (16.5%). No significant difference in readmission rate was found among the RGS or SGS groups. CONCLUSIONS Returning to the initial PE probability evaluation may help clinicians predict 3-month mortality in patients with confirmed PE. (ClinicalTrials.gov: NCT00117169).
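An observed mortality of 0% still carries a non-trivial upper confidence bound, as in the intervals quoted above; a small illustration of how an exact (Clopper-Pearson) interval produces this, with the group size chosen purely as an assumption to land near an upper bound of 5.4%:

```python
# Illustrative sketch (not from the paper): exact (Clopper-Pearson) 95% CI for
# a proportion. The group size n = 66 below is an assumption for illustration.
from scipy.stats import beta

def clopper_pearson(k: int, n: int, alpha: float = 0.05):
    """Exact binomial confidence interval for k events out of n."""
    lo = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lo, hi

lo, hi = clopper_pearson(0, 66)   # hypothetical low-probability group with no deaths
print(f"0/66 deaths -> 95% CI: {100*lo:.1f}% to {100*hi:.1f}%")  # ~0% to ~5.4%
```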
Abstract:
BACKGROUND Follicular variant of papillary thyroid carcinoma (FVPTC) shares features of papillary (PTC) and follicular (FTC) thyroid carcinomas on a clinical, morphological, and genetic level. MicroRNA (miRNA) deregulation has been extensively studied in PTCs and FTCs. However, very limited information is available for FVPTC. The aim of this study was to assess miRNA expression in FVPTC with the most comprehensive miRNA array panel and to correlate it with the clinicopathological data. METHODS Forty-four papillary thyroid carcinomas (17 FVPTC, 27 classic PTC) and eight normal thyroid tissue samples were analyzed for expression of 748 miRNAs using Human Microarray Assays on the ABI 7900 platform (Life Technologies, Carlsbad, CA). In addition, an independent set of 61 tumor and normal samples was studied for expression of the novel miRNA markers detected in this study. RESULTS Overall, the miRNA expression profiles demonstrated similar trends between FVPTC and classic PTC. Fourteen miRNAs were deregulated in FVPTC with a fold change of more than five (up or down), including miRNAs known to be upregulated in PTC (miR-146b-3p, -146-5p, -221, -222 and miR-222-5p) and novel miRNAs (miR-375, -551b, -181-2-3p, -99b-3p). However, the levels of miRNA expression differed between these tumor types, and some miRNAs were uniquely dysregulated in FVPTC, allowing separation of these tumors in unsupervised hierarchical clustering analysis. Upregulation of the novel miR-375 was confirmed in a large independent set of follicular cell-derived neoplasms and benign nodules and demonstrated upregulation specific to PTC. Two miRNAs (miR-181a-2-3p, miR-99b-3p) were associated with an adverse outcome in FVPTC patients by Kaplan-Meier (p < 0.05) and multivariate Cox regression (p < 0.05) analyses. CONCLUSIONS Despite the high similarity in miRNA expression between FVPTC and classic PTC, several miRNAs were uniquely expressed in each tumor type, supporting their histopathologic differences. The highly upregulated miRNA identified in this study (miR-375) can serve as a novel marker of papillary thyroid carcinoma, and miR-181a-2-3p and miR-99b-3p can predict relapse-free survival in patients with FVPTC, thus potentially providing important diagnostic and predictive value.
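A minimal sketch of the kind of fold-change screen described above (more than five-fold up- or down-regulation between tumor and normal samples); the miRNA names come from the abstract, but the expression values and array layout are invented for illustration and are not the authors' data or pipeline:

```python
# Toy fold-change filter mirroring the >5-fold (up/down) threshold above.
# Expression values are made up; units are arbitrary.
import numpy as np

mirnas = ["miR-146b-3p", "miR-375", "miR-99b-3p"]
tumor = np.array([[120.0, 60.0, 2.0], [140.0, 55.0, 1.5]])   # samples x miRNAs
normal = np.array([[10.0, 9.0, 2.2], [12.0, 11.0, 1.8]])

fold_change = tumor.mean(axis=0) / normal.mean(axis=0)
selected = [(m, round(fc, 2)) for m, fc in zip(mirnas, fold_change)
            if fc > 5 or fc < 1 / 5]
print(selected)   # miRNAs deregulated by more than five-fold in either direction
```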
Abstract:
Traditionally, critical swimming speed has been defined as the speed at which a fish can no longer propel itself forward and is exhausted. To gain a better understanding of the metabolic processes at work during a U(crit) swim test, and of those that lead to fatigue, we developed a method using in vivo (31)P-NMR spectroscopy in combination with a Brett-type swim tunnel. Our data showed that a metabolic transition point is reached when the fish change from using steady-state aerobic metabolism to non-steady-state anaerobic metabolism, as indicated by a significant increase in inorganic phosphate levels from 0.3+/-0.3 to 9.5+/-3.4 micromol g(-1) and a drop in intracellular pH from 7.48+/-0.03 to 6.81+/-0.05 in muscle. This coincides with the point at which the fish change gait from subcarangiform swimming to kick-and-glide bursts. As the number of kicks increased, so too did the Pi concentration, and the pH(i) dropped; both changes were maximal at U(crit). A significant decline in the magnitude of the Gibbs free energy change of ATP hydrolysis, from -55.6+/-1.4 to -49.8+/-0.7 kJ mol(-1), is argued to have been involved in fatigue. This confirms earlier findings that the traditional definition of U(crit), unlike other critical points that are typically marked by a transition from aerobic to anaerobic metabolism, is the point of complete exhaustion of both aerobic and anaerobic resources.
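The free energy figure above is conventionally derived from metabolite levels via dG = dG0' + RT ln([ADP][Pi]/[ATP]); a rough sketch of that relation with made-up concentrations, chosen only to show why rising Pi and ADP make the energy yield less negative (none of these inputs are values from the study):

```python
# Illustrative sketch (not the authors' calculation): free energy of ATP
# hydrolysis, dG = dG0' + R*T*ln([ADP][Pi]/[ATP]). All inputs are assumptions.
import math

R = 8.314e-3          # gas constant, kJ mol^-1 K^-1
T = 288.15            # assumed muscle temperature, K (15 degrees C)
DG0 = -32.0           # approximate standard free energy of ATP hydrolysis, kJ mol^-1

def dg_atp(adp_m: float, pi_m: float, atp_m: float) -> float:
    """Gibbs free energy change of ATP hydrolysis (kJ/mol) for molar metabolite levels."""
    return DG0 + R * T * math.log((adp_m * pi_m) / atp_m)

# Hypothetical 'rest' vs. 'near-U(crit)' metabolite levels (mol L^-1):
print(f"rest:    {dg_atp(10e-6, 1e-3, 6e-3):.1f} kJ/mol")    # strongly negative
print(f"fatigue: {dg_atp(100e-6, 20e-3, 5e-3):.1f} kJ/mol")  # less negative as Pi and ADP rise
```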
Abstract:
Rapid changes in atmospheric methane (CH4), temperature and precipitation are documented by Greenland ice core data both for glacial times (the so-called Dansgaard-Oeschger (D-O) events) and for a cooling event in the early Holocene (the 8.2 kyr event). The onsets of D-O warm events are paralleled by abrupt increases in CH4 of up to 250 ppb within a few decades. Conversely, the 8.2 kyr event is accompanied by an intermittent decrease in CH4 of about 80 ppb over 150 yr. The abrupt CH4 changes are thought to originate mainly from source emission variations in tropical and boreal wet ecosystems, but complex, process-oriented bottom-up model estimates of the changes in these ecosystems during rapid climate changes are still missing. Here we present simulations of CH4 emissions from northern peatlands with the LPJ-Bern dynamic global vegetation model. The model represents CH4 production and oxidation in soils and transport by ebullition, through plant aerenchyma, and by diffusion. Parameters are tuned to represent site emission data as well as inversion-based estimates of northern wetland emissions. The model is forced with climate input data from freshwater hosing experiments with the NCAR CSM1.4 climate model to simulate an abrupt cooling event. A concentration reduction of ~10 ppb is simulated per kelvin change of mean northern hemispheric surface temperature in peatlands. Peatland emissions are equally sensitive to changes in temperature and in precipitation. If the simulated changes are taken as an analogy to the 8.2 kyr event, boreal peatland emissions alone could explain only 23 ppb of the 80 ppb decline in atmospheric methane concentration. This points to a significant contribution of source changes in low-latitude and tropical wetlands to this event.
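The closing arithmetic is essentially a linear scaling of the simulated sensitivity; a back-of-the-envelope sketch follows, in which the peatland cooling is a hypothetical assumption (the real forcing comes from the hosing experiments, and precipitation changes matter as well):

```python
# Back-of-the-envelope scaling, not the LPJ-Bern simulation itself.
# The temperature anomaly below is an assumed illustrative value.
sensitivity_ppb_per_k = 10.0    # simulated CH4 reduction per K of NH peatland cooling
delta_t_k = -2.0                # hypothetical mean cooling over peatlands (assumption)
observed_drop_ppb = 80.0        # CH4 decrease recorded for the 8.2 kyr event

simulated_drop_ppb = abs(delta_t_k) * sensitivity_ppb_per_k
share = 100.0 * simulated_drop_ppb / observed_drop_ppb
print(f"Boreal peatlands alone: ~{simulated_drop_ppb:.0f} ppb "
      f"of the {observed_drop_ppb:.0f} ppb decline ({share:.0f}%)")
```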
Abstract:
Little is known about the influence of different stressors on fine motor skills, the concentration of testosterone (T), and their interaction in adolescents. Therefore, 62 high school students aged 14–15 years were randomly assigned to two experimental groups (exercise, psychosocial stress) and a control group. Exercise stress was induced at 65–75% of the maximum heart rate by running for 15 minutes (n = 24). Psychosocial stress was generated by an intelligence test (HAWIK-IV), which was uncontrollable and characterized by social-evaluative threat to the students (n = 21). The control group took part in a regular school lesson of the same duration (n = 28). Saliva was collected after a normal school lesson (pre-test) as well as after the intervention/control period (post-test) and was analyzed for testosterone. Fine motor skills were assessed pre- and post-intervention using a manual dexterity test (Flower Trail) from the Movement Assessment Battery for Children-2. A repeated-measures ANCOVA including gender as a covariate revealed a significant group-by-test interaction, indicating an increase in manual dexterity only for the psychosocial stress group. Correlation analysis of all students showed that the change of testosterone from pre- to post-test was directly linked (r = 2.31, p = .01) to the changes in manual dexterity performance. Participants showing high increases in testosterone from pre- to post-test made fewer mistakes in the fine motor skills task. Findings suggest that manual dexterity increases when psychosocial stress is induced and that improvement of manual dexterity performance corresponds with the increase of testosterone.
Abstract:
AIM: To determine the feasibility of evaluating surgically induced hepatocyte damage using gadoxetate disodium (Gd-EOB-DTPA) as a marker for viable hepatocytes at magnetic resonance imaging (MRI) after liver resection. MATERIAL AND METHODS: Fifteen patients were prospectively enrolled in this institutional review board-approved study prior to elective liver resection, after giving informed consent. Three-Tesla MRI was performed 3-7 days after surgery. Three-dimensional (3D) T1-weighted volumetric interpolated breath-hold gradient echo (VIBE) sequences covering the liver were acquired before and 20 min after Gd-EOB-DTPA administration. The signal-to-noise ratio (SNR) was used to compare the uptake of Gd-EOB-DTPA in healthy liver tissue and in liver tissue adjacent to the resection border, applying a paired Student's t-test. Correlations with potential influencing factors (blood loss, duration of intervention, age, pre-existing liver diseases, postoperative change of resection surface) were calculated using Pearson's correlation coefficient. RESULTS: Before Gd-EOB-DTPA administration, the SNR did not differ significantly (p = 0.052) between healthy liver tissue adjacent to untouched liver borders [59.55 ± 25.46 (SD)] and the liver tissue compartment close to the resection surface (63.31 ± 27.24). During the hepatocyte-specific phase, the surgical site showed a significantly (p = 0.04) lower SNR (69.44 ± 24.23) compared to the healthy site (78.45 ± 27.71). Dynamic analyses revealed a significantly lower increase (p = 0.008) in signal intensity in the resection border compartment compared to the healthy tissue. CONCLUSION: Gd-EOB-DTPA-enhanced MRI may have the potential to be an effective non-invasive tool for detecting hepatocyte damage after liver resection.
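The abstract does not spell out how SNR was derived; a minimal sketch of one common approach (mean ROI signal divided by the standard deviation of background noise), using synthetic arrays purely as placeholders rather than study data:

```python
# Minimal SNR sketch (assumed convention, not necessarily the authors' method):
# SNR = mean signal in a tissue ROI / standard deviation of background noise.
import numpy as np

def snr(roi: np.ndarray, background: np.ndarray) -> float:
    """Signal-to-noise ratio: mean ROI signal over background noise SD."""
    return float(roi.mean() / background.std(ddof=1))

rng = np.random.default_rng(0)
roi_signal = rng.normal(loc=78.0, scale=5.0, size=200)   # hypothetical liver ROI values
background = rng.normal(loc=0.0, scale=1.0, size=200)    # hypothetical air/noise ROI
print(f"SNR ~= {snr(roi_signal, background):.1f}")
```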