Abstract:
The trabecular bone score (TBS) is an index of bone microarchitectural texture calculated from anteroposterior dual-energy X-ray absorptiometry (DXA) scans of the lumbar spine (LS) that predicts fracture risk, independent of bone mineral density (BMD). The aim of this study was to compare the effects of yearly intravenous zoledronate (ZOL) versus placebo (PLB) on LS BMD and TBS in postmenopausal women with osteoporosis. Changes in TBS were assessed in the subset of 107 patients recruited at the Department of Osteoporosis of the University Hospital of Berne, Switzerland, who were included in the HORIZON trial. All subjects received adequate calcium and vitamin D3. In these patients, randomly assigned to either ZOL (n = 54) or PLB (n = 53) for 3 years, BMD was measured by DXA and TBS assessed by TBS iNsight (v1.9) at baseline and 6, 12, 24, and 36 months after treatment initiation. Baseline characteristics (mean ± SD) were similar between groups in terms of age (76.8 ± 5.0 years), body mass index (BMI, 24.5 ± 3.6 kg/m²), and TBS (1.178 ± 0.1), but differed in LS T-score (ZOL −2.9 ± 1.5 versus PLB −2.1 ± 1.5). Changes in LS BMD were significantly greater with ZOL than with PLB at all time points (p < 0.0001 for all), reaching +9.58% versus +1.38% at month 36. Change in TBS was significantly greater with ZOL than with PLB as of month 24, reaching +1.41% versus −0.49% at month 36 (p = 0.031). LS BMD and TBS were weakly correlated (r = 0.20) and there were no correlations between changes in BMD and TBS from baseline at any visit. In postmenopausal women with osteoporosis, once-yearly intravenous ZOL therapy significantly increased LS BMD relative to PLB over 3 years, and TBS as of 2 years.
Abstract:
AIMS High-density lipoprotein (HDL) cholesterol is a strong predictor of cardiovascular mortality. This work aimed to investigate whether the presence of coronary artery disease (CAD) impacts on its predictive value. METHODS AND RESULTS We studied 3141 participants (2191 males, 950 females) of the LUdwigshafen RIsk and Cardiovascular health (LURIC) study. They had a mean ± standard deviation age of 62.6 ± 10.6 years, body mass index of 27.5 ± 4.1 kg/m², and HDL cholesterol of 38.9 ± 10.8 mg/dL. The cohort consisted of 699 people without CAD, 1515 patients with stable CAD, and 927 patients with unstable CAD. The participants were prospectively followed for cardiovascular mortality over a median (inter-quartile range) period of 9.9 (8.7-10.7) years. A total of 590 participants died from cardiovascular diseases. High-density lipoprotein cholesterol by tertiles was inversely related to cardiovascular mortality in the entire cohort (P = 0.009). There was significant interaction between HDL cholesterol and CAD in predicting the outcome (P = 0.007). In stratified analyses, HDL cholesterol was strongly associated with cardiovascular mortality in people without CAD [3rd vs. 1st tertile: HR (95% CI) = 0.37 (0.18-0.74), P = 0.005], but not in patients with stable [3rd vs. 1st tertile: HR (95% CI) = 0.81 (0.61-1.09), P = 0.159] and unstable [3rd vs. 1st tertile: HR (95% CI) = 0.91 (0.59-1.41), P = 0.675] CAD. These results were replicated by analyses in 3413 participants of the AtheroGene cohort and 5738 participants of the ESTHER cohort, and by a meta-analysis comprising all three cohorts. CONCLUSION The inverse relationship of HDL cholesterol with cardiovascular mortality is weakened in patients with CAD. The usefulness of considering HDL cholesterol for cardiovascular risk stratification seems limited in such patients.
Abstract:
OBJECTIVE To validate use of stress MRI for evaluation of stifle joints of dogs with an intact or deficient cranial cruciate ligament (CrCL). SAMPLE 10 cadaveric stifle joints from 10 dogs. PROCEDURES A custom-made limb-holding device and a pulley system linked to a paw plate were used to apply axial compression across the stifle joint and induce cranial tibial translation with the joint in various degrees of flexion. By use of sagittal proton density-weighted MRI, CrCL-intact and deficient stifle joints were evaluated under conditions of loading stress simulating the tibial compression test or the cranial drawer test. Medial and lateral femorotibial subluxation following CrCL transection measured under a simulated tibial compression test and a cranial drawer test were compared. RESULTS By use of tibial compression test MRI, the mean ± SD cranial tibial translations in the medial and lateral compartments were 9.6 ± 3.7 mm and 10 ± 4.1 mm, respectively. By use of cranial drawer test MRI, the mean ± SD cranial tibial translations in the medial and lateral compartments were 8.3 ± 3.3 mm and 9.5 ± 3.5 mm, respectively. No significant difference in femorotibial subluxation was found between stress MRI techniques. Femorotibial subluxation elicited by use of the cranial drawer test was greater in the lateral than in the medial compartment. CONCLUSIONS AND CLINICAL RELEVANCE Both stress techniques induced stifle joint subluxation following CrCL transection that was measurable by use of MRI, suggesting that both methods may be further evaluated for clinical use.
Abstract:
BACKGROUND Previous studies indicate increased prevalences of suicidal ideation, suicide attempts, and completed suicide in Huntington's disease (HD) compared with the general population. This study investigates correlates and predictors of suicidal ideation in HD. METHODS The study cohort consisted of 2106 HD mutation carriers, all participating in the REGISTRY study of the European Huntington's Disease Network. Of the 1937 participants without suicidal ideation at baseline, 945 had one or more follow-up measurements. Participants were assessed for suicidal ideation by the behavioural subscale of the Unified Huntington's Disease Rating Scale (UHDRS). Correlates of suicidal ideation were analyzed using logistic regression analysis and predictors were analyzed using Cox regression analysis. RESULTS At baseline, 169 (8.0%) mutation carriers endorsed suicidal ideation. Disease duration (odds ratio [OR] = 0.96; 95% confidence interval [CI]: 0.9-1.0), anxiety (OR = 2.14; 95% CI: 1.4-3.3), aggression (OR = 2.41; 95% CI: 1.5-3.8), a previous suicide attempt (OR = 3.95; 95% CI: 2.4-6.6), and a depressed mood (OR = 13.71; 95% CI: 6.7-28.0) were independently correlated with suicidal ideation at baseline. The 4-year cumulative incidence of suicidal ideation was 9.9%. Longitudinally, the presence of a depressed mood (hazard ratio [HR] = 2.05; 95% CI: 1.1-4.0) and use of benzodiazepines (HR = 2.44; 95% CI: 1.2-5.0) at baseline were independent predictors of incident suicidal ideation, whereas a previous suicide attempt was not predictive. LIMITATIONS As suicidal ideation was assessed by only one item, and participants were a selection of all HD mutation carriers, the prevalence of suicidal ideation was likely underestimated. CONCLUSIONS Suicidal ideation occurs frequently in HD. Assessment of suicidal ideation is a priority in mutation carriers with a depressed mood and in those using benzodiazepines.
Abstract:
Introduction: A rather little-known form of mental training is practice in lucid dreams (Erlacher, Stumbrys & Schredl, 2011-2012). In a lucid dream, the dreamer is aware that he or she is dreaming and can thereby control the ongoing dream content. Earlier studies showed that it is possible to practise motor tasks in a lucid dream in order to improve waking performance (Erlacher & Schredl, 2010). However, little is known about the prevalence of lucid dreamers in sport. Method: The sample comprised 840 German (G: 483 male, 357 female) and 1323 Japanese (J: 1000 male, 323 female) athletes. The mean age was 20.4 years (G: 21.6, J: 19.7). Participants were recruited from a range of sports, from team sports (e.g., basketball) to individual sports (e.g., track and field), and completed a questionnaire on sport, sleep, and dreaming. The athletes had been active in their sport for an average of 9.1 years (G: 11.1, J: 7.9) and trained about 14.4 hours per week (G: 11.1, J: 16.7). The questionnaire assessed lucid dream frequency on an 8-point scale (with a definition provided to ensure a clear understanding of lucid dreams), the use of lucid dreams for sport (e.g., for training), and, if this was affirmed, whether improvements in athletic performance were experienced. Results: 47% (G: 57%, J: 41%) of the athletes reported having experienced at least one lucid dream, 20% (G: 24%, J: 18%) were frequent lucid dreamers (one or more lucid dreams per month), and 9% (G: 9%, J: 9%) used lucid dreams for their sport; of these, the majority reported that lucid dream training improved their athletic performance in the waking state. Discussion: About half of the athletes know lucid dreaming from their own experience, one fifth are frequent lucid dreamers, and about one in ten athletes uses lucid dreams for their sport. For the German sample, the prevalence rate among athletes is similar to that in the general population.
For the Japanese sample no representative population data are available; based on the questionnaire data presented here, however, cultural differences appear to play a minor role. References: Erlacher, D. & Schredl, M. (2010). Practicing a motor task in a lucid dream enhances subsequent performance: A pilot study. The Sport Psychologist, 24(2), 157-167. Erlacher, D., Stumbrys, T. & Schredl, M. (2011-2012). Frequency of lucid dreams and lucid dream practice in German athletes. Imagination, Cognition and Personality, 31(3), 237-246.
Abstract:
PURPOSE The objectives of this systematic review are (1) to quantitatively estimate the esthetic outcomes of implants placed in postextraction sites, and (2) to evaluate the influence of simultaneous bone augmentation procedures on these outcomes. MATERIALS AND METHODS Electronic and manual searches of the dental literature were performed to collect information on esthetic outcomes based on objective criteria with implants placed after extraction of maxillary anterior and premolar teeth. All levels of evidence were accepted (case series studies required a minimum of 5 cases). RESULTS From 1,686 titles, 114 full-text articles were evaluated and 50 records included for data extraction. The included studies reported on single-tooth implants adjacent to natural teeth, with no studies on multiple missing teeth identified (6 randomized controlled trials, 6 cohort studies, 5 cross-sectional studies, and 33 case series studies). Considerable heterogeneity in study design was found. A meta-analysis of controlled studies was not possible. The available evidence suggests that esthetic outcomes, determined by esthetic indices (predominantly the pink esthetic score) and positional changes of the peri-implant mucosa, may be achieved for single-tooth implants placed after tooth extraction. Immediate (type 1) implant placement, however, is associated with a greater variability in outcomes and a higher frequency of recession of > 1 mm of the midfacial mucosa (eight studies; range 9% to 41% and median 26% of sites, 1 to 3 years after placement) compared to early (type 2 and type 3) implant placement (2 studies; no sites with recession > 1 mm). In two retrospective studies of immediate (type 1) implant placement with bone graft, the facial bone wall was not detectable on cone beam CT in 36% and 57% of sites. These sites had more recession of the midfacial mucosa compared to sites with detectable facial bone. 
Two studies of early implant placement (types 2 and 3) combined with simultaneous bone augmentation with GBR (contour augmentation) demonstrated a high frequency (above 90%) of facial bone wall visible on CBCT. Recent studies of immediate (type 1) placement imposed specific selection criteria, including thick tissue biotype and an intact facial socket wall, to reduce esthetic risk. There were no specific selection criteria for early (type 2 and type 3) implant placement. CONCLUSIONS Acceptable esthetic outcomes may be achieved with implants placed after extraction of teeth in the maxillary anterior and premolar areas of the dentition. Recession of the midfacial mucosa is a risk with immediate (type 1) placement. Further research is needed to investigate the most suitable biomaterials to reconstruct the facial bone and the relationship between long-term mucosal stability and presence/absence of the facial bone, the thickness of the facial bone, and the position of the facial bone crest.
Abstract:
BACKGROUND AND AIMS We investigated the association between significant liver fibrosis, determined by the AST-to-platelet ratio index (APRI), and all-cause mortality among HIV-infected patients prescribed antiretroviral therapy (ART) in Zambia. METHODS Among HIV-infected adults who initiated ART, we categorized baseline APRI scores according to established thresholds for significant hepatic fibrosis (APRI ≥1.5) and cirrhosis (APRI ≥2.0). Using multivariable logistic regression we identified risk factors for elevated APRI, including demographic characteristics, body mass index (BMI), HIV clinical and immunologic status, and tuberculosis. In the subset tested for hepatitis B surface antigen (HBsAg), we investigated the association of hepatitis B virus co-infection with APRI score. Using Kaplan-Meier analysis and Cox proportional hazards regression we determined the association of elevated APRI with death during ART. RESULTS Among 20,308 adults in the analysis cohort, 1,027 (5.1%) had significant liver fibrosis at ART initiation, including 616 (3.0%) with cirrhosis. Risk factors for significant fibrosis or cirrhosis included male sex, BMI <18, WHO clinical stage 3 or 4, CD4+ count <200 cells/mm³, and tuberculosis. Among the 237 (1.2%) who were tested, HBsAg-positive patients had four times the odds (adjusted odds ratio, 4.15; 95% CI, 1.71-10.04) of significant fibrosis compared with HBsAg-negative patients. Both significant fibrosis (adjusted hazard ratio 1.41; 95% CI, 1.21-1.64) and cirrhosis (adjusted hazard ratio 1.57; 95% CI, 1.31-1.89) were associated with increased all-cause mortality. CONCLUSION Liver fibrosis may be a risk factor for mortality during ART among HIV-infected individuals in Africa. APRI is an inexpensive and potentially useful test for liver fibrosis in resource-constrained settings.
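The APRI score used in the abstract above is a simple bedside calculation. As a minimal sketch (the formula is the standard published definition; the function names and example values are illustrative only, and the cut-offs of 1.5 and 2.0 are those cited in the abstract):

```python
def apri(ast_iu_l: float, ast_uln_iu_l: float, platelets_1e9_l: float) -> float:
    """AST-to-platelet ratio index:
    (AST / upper limit of normal for AST) / platelet count (10^9/L) * 100."""
    return (ast_iu_l / ast_uln_iu_l) / platelets_1e9_l * 100


def fibrosis_category(score: float) -> str:
    """Categorize an APRI score using the thresholds cited in the abstract."""
    if score >= 2.0:
        return "cirrhosis"
    if score >= 1.5:
        return "significant fibrosis"
    return "below threshold"
```

For example, an AST of 80 IU/L with an upper limit of normal of 40 IU/L and platelets of 100 × 10⁹/L gives APRI = 2.0, which falls in the cirrhosis category.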
Abstract:
AIM The optimal duration of dual antiplatelet therapy (DAPT) following the use of new generation drug-eluting stents is unknown. METHODS AND RESULTS The association between DAPT interruption and the rates of stent thrombosis (ST) and cardiac death/target-vessel myocardial infarction (CD/TVMI) in patients receiving a Resolute zotarolimus-eluting stent (R-ZES) was analysed in 4896 patients from the pooled RESOLUTE clinical programme. Daily acetylsalicylate (ASA) and a thienopyridine for 6-12 months were prescribed. A DAPT interruption was defined as any interruption of ASA and/or a thienopyridine of >1 day; long interruptions were >14 days. Three groups were analysed: no interruption, interruption during the first month, and >1-12 months. There were 1069 (21.83%) patients with a DAPT interruption and 3827 patients with no interruption. Among the 166 patients in the 1-month interruption group, 6 definite/probable ST events occurred (3.61%; all long DAPT interruptions), and among the 903 patients in the >1-12 months (60% occurred between 6 and 12 months) interruption group, 1 ST event occurred (0.11%; 2-day DAPT interruption). Among patients with no DAPT interruption, 32 ST events occurred (0.84%). Rates of CD/TVMI were 6.84% in the 1-month long interruption group, 1.41% in the >1-12 months long interruption group, and 4.08% in patients on continuous DAPT. CONCLUSION In a pooled population of patients receiving an R-ZES, DAPT interruptions within 1 month are associated with a high risk of adverse outcomes. Dual antiplatelet therapy interruptions between 1 and 12 months were associated with low rates of ST and adverse cardiac outcomes. Randomized clinical trials are needed to determine whether early temporary or permanent interruption of DAPT is truly safe. CLINICALTRIALS.GOV IDENTIFIERS NCT00617084; NCT00726453; NCT00752128; NCT00927940.
Abstract:
QUESTIONS UNDER STUDY To improve the response to deteriorating patients during their hospital stay, the University Hospital Bern has introduced a Medical Emergency Team (MET). The aim of this retrospective cohort study was to review the factors preceding MET calls, patient characteristics, process parameters, and their correlation with patient outcomes since the introduction of the team. METHODS Data on patient characteristics, parameters related to MET activation and intervention, and patient outcomes were evaluated. A Vital Sign Score (VSS), defined as the number of abnormal vital signs, was calculated from the physiological parameters recorded before and during each MET event and correlated with hospital outcomes. RESULTS A total of 1,628 MET calls in 1,317 patients occurred; 262 (19.9%) of patients with MET calls during their hospital stay died. The VSS before the MET event (odds ratio [OR] 1.78, 95% confidence interval [CI] 1.50-2.13; AUROC 0.63; all p <0.0001) and during the MET call (OR 1.60, 95% CI 1.41-1.83; AUROC 0.62; all p <0.0001) was significantly correlated with patient outcomes. A significant increase in MET calls from 5.2 to 16.5 per 1000 hospital admissions (p <0.0001) and a decrease in cardiac arrest calls in the MET perimeter from 1.6 in 2008 to 0.8 per 1000 admissions (p = 0.014) were observed during the study period. CONCLUSIONS The VSS is a significant predictor of mortality in patients assessed by the MET. Increasing MET utilisation coincided with a decrease in cardiac arrest calls in the MET perimeter.
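The Vital Sign Score described above is a count of abnormal vital signs. A minimal sketch, assuming hypothetical normal ranges (the abstract does not state which vital signs were scored or the exact abnormality criteria used by the MET):

```python
# Hypothetical normal ranges for illustration only; the study's actual
# abnormality criteria are not given in the abstract.
NORMAL_RANGES = {
    "heart_rate": (40, 130),   # beats/min
    "resp_rate": (8, 30),      # breaths/min
    "systolic_bp": (90, 200),  # mm Hg
    "spo2": (90, 100),         # %
}


def vital_sign_score(observations: dict) -> int:
    """Count the recorded vital signs falling outside their normal range
    (the abstract defines the VSS as the sum of vital-sign abnormalities)."""
    score = 0
    for name, (low, high) in NORMAL_RANGES.items():
        if name in observations and not (low <= observations[name] <= high):
            score += 1
    return score
```

A patient with a heart rate of 150/min and a systolic blood pressure of 85 mm Hg, but normal respiratory rate and oxygen saturation, would score 2 under these illustrative ranges.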
Abstract:
The heat of summer 2003 in Western and Central Europe was claimed to be unprecedented since the Middle Ages on the basis of grape harvest data (GHD) and late wood maximum density (MXD) data from trees in the Alps. This paper shows that the authors of these studies overlooked the fact that the heat and drought in Switzerland in 1540 likely exceeded the amplitude of the previous hottest summer of 2003, because the persistent temperature and precipitation anomaly in that year, described in an abundant and coherent body of documentary evidence, severely affected the reliability of GHD and tree-rings as proxy indicators for temperature estimates. Spring–summer (AMJJ) temperature anomalies of 4.7 °C to 6.8 °C, significantly higher than in 2003, were estimated for 1540 from a new long Swiss GHD series (1444 to 2011). During the climax of the heat wave in early August the grapes desiccated on the vine, which caused many vine-growers to interrupt or postpone the harvest, despite full grape maturity, until after the next spell of rain. Likewise, the leaves of many trees withered and fell to the ground under extreme drought stress, as would usually be expected in late autumn. It remains to be determined by further research whether and how far this result, obtained from local analyses, can be spatially extrapolated. Based on the temperature estimates for Switzerland and a large body of coherent qualitative documentary evidence about the outstanding heat and drought of 1540, AMJJ temperatures were likely more extreme in neighbouring regions of Western and Central Europe than in 2003. Considering the significance of soil moisture deficits for record-breaking heat waves, these results still need to be validated with estimated seasonal precipitation. It is concluded that biological proxy data may not properly reveal record-breaking heat and drought events.
Such assessments thus need to be complemented with the critical study of contemporary evidence from documentary sources which provide coherent and detailed data about weather extremes and related impacts on human, ecological and social systems.
Abstract:
Insulin and glucagon are glucoregulatory hormones that contribute to glucose homeostasis. Plasma insulin is elevated during normoglycemia or hyperglycemia and acts as a suppressor of glucagon secretion. We have investigated if and how insulin and glucose contribute to the regulation of glucagon secretion through long term (48 h) elevated insulin concentrations during simultaneous hypoglycemia or euglycemia in mid-lactating dairy cows. Nineteen Holstein dairy cows were randomly assigned to 3 treatment groups: an intravenous insulin infusion (HypoG, n = 5) to decrease plasma glucose concentrations (2.5 mmol/L), a hyperinsulinemic-euglycemic clamp to study effects of insulin at simultaneously normal glucose concentrations (EuG, n = 6) and a 0.9% saline infusion (NaCl, n = 8). Plasma glucose was measured at 5-min intervals, and insulin and glucose infusion rates were adjusted accordingly. Area under the curve of hourly glucose, insulin, and glucagon concentrations on day 2 of infusion was evaluated by analysis of variance with treatments as fixed effect. Insulin infusion caused an increase of plasma insulin area under the curve (AUC)/h in HypoG (41.9 ± 8.1 mU/L) and EuG (57.8 ± 7.8 mU/L) compared with NaCl (13.9 ± 1.1 mU/L; P < 0.01). Induced hyperinsulinemia caused a decline of plasma glucose AUC/h to 2.3 ± 0.1 mmol/L in HypoG (P < 0.01), whereas plasma glucose AUC/h remained unchanged in EuG (3.8 ± 0.2 mmol/L) and NaCl (4.1 ± 0.1 mmol/L). Plasma glucagon AUC/h was lower in EuG (84.0 ± 6.3 pg/mL; P < 0.05) and elevated in HypoG (129.0 ± 7.0 pg/mL; P < 0.01) as compared with NaCl (106.1 ± 5.4 pg/mL). The results show that intravenous insulin infusion induces elevated glucagon concentrations during hypoglycemia, although the same insulin infusion reduces glucagon concentrations at simultaneously normal glucose concentrations. Thus, insulin does not generally have an inhibitory effect on glucagon concentrations. 
When glucose is low while insulin is high, glucagon is upregulated to increase glucose availability. Insulin and glucose are therefore conjoint regulators of glucagon concentrations in dairy cows, and plasma glucose status determines whether glucagon concentrations increase or decrease. This regulatory effect may be important for the maintenance of glucose homeostasis when insulin secretion is upregulated by factors other than high glucose, such as high plasma lipid and protein concentrations at simultaneously low glucose.
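The AUC/h measure used in the abstract above normalizes the area under the hourly concentration curve by the observation span, giving a time-weighted mean concentration. A minimal sketch, assuming the common trapezoidal rule (the exact integration method is not stated in the abstract):

```python
def auc_per_hour(times_h: list, values: list) -> float:
    """Trapezoidal area under the curve divided by the observation span
    in hours, i.e., a time-weighted mean of the measured concentrations."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (values[i] + values[i - 1]) / 2.0
    return auc / (times_h[-1] - times_h[0])
```

For evenly spaced samples of a constant concentration, AUC/h equals that concentration, which is why the values above read directly in mU/L, mmol/L, or pg/mL.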
Abstract:
PURPOSE To systematically evaluate the dependence of intravoxel incoherent motion (IVIM) parameters on the b-value threshold separating the perfusion and diffusion compartments, and to implement and test an algorithm for the standardized computation of this threshold. METHODS Diffusion-weighted images of the upper abdomen were acquired at 3 Tesla in eleven healthy male volunteers with 10 different b-values and in two healthy male volunteers with 16 different b-values. Region-of-interest IVIM analysis was applied to the abdominal organs and skeletal muscle with a systematic increase of the b-value threshold for computing pseudodiffusion D*, perfusion fraction Fp, diffusion coefficient D, and the sum of squared residuals of the biexponential IVIM fit. RESULTS IVIM parameters strongly depended on the choice of the b-value threshold. The proposed algorithm successfully provided optimal b-value thresholds with the smallest residuals for all evaluated organs [s/mm²]: e.g., right liver lobe 20, spleen 20, right renal cortex 150, skeletal muscle 150. Mean D* [10⁻³ mm²/s], Fp [%], and D [10⁻³ mm²/s] values (± standard deviation) were: right liver lobe: 88.7 ± 42.5, 22.6 ± 7.4, 0.73 ± 0.12; right renal cortex: 11.5 ± 1.8, 18.3 ± 2.9, 1.68 ± 0.05; spleen: 41.9 ± 57.9, 8.2 ± 3.4, 0.69 ± 0.07; skeletal muscle: 21.7 ± 19.0, 7.4 ± 3.0, 1.36 ± 0.04. CONCLUSION IVIM parameters strongly depend upon the choice of the b-value threshold used for computation. The proposed algorithm may be used as a robust approach for IVIM analysis without organ-specific adaptation. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
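The threshold-sweep algorithm evaluated above can be illustrated with a segmented bi-exponential fit. The sketch below is not the authors' implementation: for each candidate threshold, D is first estimated from a log-linear fit of the high-b signal, Fp and D* are then fitted with D held fixed, and the candidate with the smallest sum of squared residuals over all b-values is kept. (On noiseless synthetic data the highest threshold tends to minimize the residuals; on real, noisy data lower thresholds can be optimal, as reported for the liver and spleen above.)

```python
import numpy as np
from scipy.optimize import curve_fit


def ivim(b, f, dstar, d):
    """Bi-exponential IVIM signal model, normalized to S(0) = 1."""
    return f * np.exp(-b * dstar) + (1 - f) * np.exp(-b * d)


def fit_ivim_segmented(b, s, thresholds):
    """Sweep candidate b-value thresholds and return the fit with the
    smallest sum of squared residuals (SSR) over all b-values."""
    best = None
    for bt in thresholds:
        high_b = b >= bt
        # Step 1: diffusion coefficient D from a log-linear fit at high b,
        # where the perfusion compartment has largely decayed.
        slope, _ = np.polyfit(b[high_b], np.log(s[high_b]), 1)
        d = -slope
        # Step 2: perfusion fraction Fp and pseudodiffusion D* with D fixed.
        (f, dstar), _ = curve_fit(
            lambda bb, f, dstar: ivim(bb, f, dstar, d),
            b, s, p0=(0.1, 0.05), bounds=([0.0, 0.0], [1.0, 1.0]))
        ssr = float(np.sum((s - ivim(b, f, dstar, d)) ** 2))
        if best is None or ssr < best["ssr"]:
            best = {"threshold": bt, "Fp": f, "Dstar": dstar, "D": d, "ssr": ssr}
    return best
```

With b in s/mm², D and D* come out in mm²/s; multiplying by 10³ gives the units used in the abstract.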
Abstract:
OBJECTIVE Vitamin D₃ status is reported to correlate negatively with insulin production and insulin sensitivity in patients with type 2 diabetes mellitus (T2DM). However, few placebo-controlled intervention data are available. We aimed to assess the effect of a large dose of parenteral D₃ on glycosylated haemoglobin (HbA₁c) and estimates of insulin action (homeostasis model assessment of insulin resistance: HOMA-IR) in patients with stable T2DM. MATERIALS AND METHODS We performed a prospective, randomised, double-blind, placebo-controlled pilot study at a single university care setting in Switzerland. Fifty-five patients of both genders with T2DM of more than 10 years' duration were enrolled and randomised to either 300,000 IU D₃ or placebo, intramuscularly. The primary endpoint was the intergroup difference in HbA₁c levels. Secondary endpoints were: changes in insulin sensitivity, albuminuria, calcium/phosphate metabolism, activity of the renin-aldosterone axis, and changes in 24-hour ambulatory blood pressure values. RESULTS After 6 months of D₃ supply, there was a significant intergroup difference in the change in HbA₁c levels (relative change [mean ± standard deviation] +2.9% ± 1.5% in the D₃ group vs +6.9% ± 2.1% in the placebo group, p = 0.041). HOMA-IR decreased by 12.8% ± 5.6% in the D₃ group and increased by 10% ± 5.4% in the placebo group (intergroup difference, p = 0.032). Twenty-four-hour urinary albumin excretion decreased in the D₃ group from 200 ± 41 to 126 ± 39 (p = 0.021). There was no significant intergroup difference for the other secondary endpoints. CONCLUSIONS D₃ improved insulin sensitivity (based on HOMA-IR) and positively affected the course of HbA₁c compared with placebo in patients with T2DM.
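HOMA-IR, the insulin-resistance estimate reported above, is derived from fasting glucose and insulin. A minimal sketch of the standard HOMA1-IR formula (the abstract does not state which HOMA variant was used):

```python
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uu_ml: float) -> float:
    """HOMA1-IR: fasting glucose (mmol/L) x fasting insulin (microU/mL) / 22.5.
    Higher values indicate greater insulin resistance."""
    return fasting_glucose_mmol_l * fasting_insulin_uu_ml / 22.5
```

For example, fasting glucose of 5.0 mmol/L with fasting insulin of 9.0 µU/mL gives HOMA-IR = 2.0.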
Abstract:
BACKGROUND Guidelines recommend that health care personnel (HCP) wear gloves for all interactions with patients on contact precautions. We aimed to assess hand hygiene (HH) compliance during contact precautions before and after eliminating mandatory glove use. METHODS We assessed HH compliance of HCP in the care of patients on contact precautions in 50 series before (2009) and 6 months after (2012) eliminating mandatory glove use and compared these results with hospital-wide HH compliance. RESULTS We assessed 426 HH indications before and 492 indications after the policy change. Compared with 2009, we observed a significantly higher HH compliance in the care of patients on contact precautions in 2012 (52%; 95% confidence interval [CI], 47-57 vs 85%; 95% CI, 82-88; P < .001). During the same period, hospital-wide HH compliance also increased, from 63% (95% CI, 61-65) to 81% (95% CI, 80-83) (P < .001). However, the relative improvement (RI) of HH compliance during contact precautions was significantly higher than the hospital-wide relative improvement (RI, 1.6; 95% CI, 1.49-1.81 vs 1.29; 95% CI, 1.25-1.34), with a relative improvement ratio of 1.27 (95% CI, 1.15-1.41). CONCLUSION Eliminating mandatory glove use in the care of patients on contact precautions increased HH compliance in our institution, particularly before invasive procedures and before patient contacts. Further studies on the effect on pathogen transmission are needed before revisiting the current official guidelines on the topic.
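The relative-improvement arithmetic above can be reproduced from the point estimates (the confidence intervals come from the study's statistical models, not from this calculation):

```python
def relative_improvement(after_pct: float, before_pct: float) -> float:
    """Ratio of compliance after the policy change to compliance before."""
    return after_pct / before_pct

ri_contact = relative_improvement(85, 52)   # contact precautions: ~1.6
ri_hospital = relative_improvement(81, 63)  # hospital-wide: ~1.29
ri_ratio = ri_contact / ri_hospital         # relative improvement ratio: ~1.27
```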
Abstract:
BACKGROUND The most recommended NRTI combinations as first-line antiretroviral treatment for HIV-1 infection in resource-rich settings are tenofovir/emtricitabine, abacavir/lamivudine, tenofovir/lamivudine and zidovudine/lamivudine. Efficacy studies of these combinations also considering pill numbers, dosing frequencies and ethnicities are rare. METHODS We included patients starting first-line combination ART (cART) with or switching from first-line cART without treatment failure to tenofovir/emtricitabine, abacavir/lamivudine, tenofovir/lamivudine and zidovudine/lamivudine plus efavirenz or nevirapine. Cox proportional hazards regression was used to investigate the effect of the different NRTI combinations on two primary outcomes: virological failure (VF) and emergence of NRTI resistance. Additionally, we performed a pill burden analysis and adjusted the model for pill number and dosing frequency. RESULTS Failure events per treated patient for the four NRTI combinations were as follows: 19/1858 (tenofovir/emtricitabine), 9/387 (abacavir/lamivudine), 11/344 (tenofovir/lamivudine) and 45/1244 (zidovudine/lamivudine). Compared with tenofovir/emtricitabine, abacavir/lamivudine had an adjusted HR for having VF of 2.01 (95% CI 0.86-4.55), tenofovir/lamivudine 2.89 (1.22-6.88) and zidovudine/lamivudine 2.28 (1.01-5.14), whereas for the emergence of NRTI resistance abacavir/lamivudine had an HR of 1.17 (0.11-12.2), tenofovir/lamivudine 11.3 (2.34-55.3) and zidovudine/lamivudine 4.02 (0.78-20.7). Differences among regimens disappeared when models were additionally adjusted for pill burden. However, non-white patients compared with white patients and higher pill number per day were associated with increased risks of VF and emergence of NRTI resistance: HR of non-white ethnicity for VF was 2.85 (1.64-4.96) and for NRTI resistance 3.54 (1.20-10.4); HR of pill burden for VF was 1.41 (1.01-1.96) and for NRTI resistance 1.72 (0.97-3.02). 
CONCLUSIONS Although VF and the emergence of resistance were very low in the population studied, tenofovir/emtricitabine appears to be superior to abacavir/lamivudine, tenofovir/lamivudine and zidovudine/lamivudine. However, it is unclear whether these differences are due to the substances as such or to the association of tenofovir/emtricitabine regimens with a lower pill burden.