931 results for Quality of care
Abstract:
Point-of-care (POC) tests offer potentially substantial benefits for the management of infectious diseases, mainly by shortening the time to result and by making the test available at the bedside or at remote care centres. Commercial POC tests are already widely available for the diagnosis of bacterial and viral infections and for parasitic diseases, including malaria. Infectious diseases specialists and clinical microbiologists should be aware of the indications and limitations of each rapid test, so that they can use them appropriately and correctly interpret their results. The clinical applications and performance of the most relevant and commonly used POC tests are reviewed. Some of these tests exhibit insufficient sensitivity, and should therefore be coupled to confirmatory tests when the results are negative (e.g. Streptococcus pyogenes rapid antigen detection test), whereas the results of others need to be confirmed when positive (e.g. malaria). New molecular-based tests exhibit better sensitivity and specificity than former immunochromatographic assays (e.g. Streptococcus agalactiae detection). In the coming years, further evolution of POC tests may lead to new diagnostic approaches, such as panel testing, targeting not just a single pathogen, but all possible agents suspected in a specific clinical setting. To reach this goal, the development of serology-based and/or molecular-based microarrays/multiplexed tests will be needed. The availability of modern technology and new microfluidic devices will provide clinical microbiologists with the opportunity to be back at the bedside, proposing a large variety of POC tests that will allow quicker diagnosis and improved patient care.
Abstract:
State-wide class-size reduction (CSR) policies have typically failed to produce large achievement gains. One explanation is that the introduction of such policies forces schools to hire relatively low-quality teachers. This paper uses data from an anonymous state to explore whether teacher quality suffered from the introduction of CSR. We find that it did, but not nearly enough to explain the small achievement effects of CSR. The combined fall in achievement due to hiring lower-quality teachers and more inexperienced teachers is small relative to the unrealized gains. Furthermore, between-school differences in the quality of incoming teachers cannot explain the poor estimated CSR performance from previous quasi-experimental treatment-control comparisons.
Abstract:
ABSTRACT Objectives: Patients with failed back surgery syndrome (FBSS) and chronic neuropathic pain experience levels of health-related quality of life (HRQoL) that are considerably lower than those reported in other areas of chronic pain. The aim of this article was to quantify the extent to which reductions in (leg and back) pain and disability over time translate into improvements in generic HRQoL as measured by the EuroQol-5D and SF-36 instruments. Methods: Using data from the multinational Prospective, Randomized, Controlled, Multicenter Study of Patients with Failed Back Surgery Syndrome trial, we explore the relationship between generic HRQoL, assessed using two instruments often used in clinical trials (the SF-36 and EuroQol-5D), and disease-specific outcome measures (the Oswestry disability index [ODI] and leg and back pain visual analog scales [VAS]) in neuropathic patients with FBSS. Results: In our sample of 100 FBSS patients, generic HRQoL was moderately associated with the ODI (correlation coefficients: -0.462 to -0.638) and mildly associated with leg pain VAS (correlation coefficients: -0.165 to -0.436). The multilevel regression analysis results indicate that functional ability (as measured by the ODI) is significantly associated with HRQoL, regardless of the generic HRQoL instrument used. Changes over time in leg pain, on the other hand, were significantly associated with changes in the EuroQol-5D and physical component summary scores, but not with the mental component summary score. Conclusions: Reduction in leg pain and functional disability is statistically significantly associated with improvements in generic HRQoL. This is the first study to investigate the longitudinal relationship between generic and disease-specific HRQoL in neuropathic pain patients with FBSS using multinational data.
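Purely as an illustration of the kind of cross-sectional association reported above (not the study's own analysis), correlations between disease-specific scores and generic HRQoL indices can be computed from a patient-level table; the file and column names (odi, leg_vas, eq5d, sf36_pcs) are hypothetical placeholders.

```python
import pandas as pd

# Hypothetical patient-level table with one row per FBSS patient.
df = pd.read_csv("fbss_patients.csv")

# Pearson correlations of disease-specific measures with generic HRQoL scores.
# Negative coefficients are expected: higher ODI/VAS means worse status,
# while higher EQ-5D / SF-36 PCS means better HRQoL.
print(df[["odi", "leg_vas"]].corrwith(df["eq5d"]))
print(df[["odi", "leg_vas"]].corrwith(df["sf36_pcs"]))
```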
Abstract:
Purpose: To evaluate the extent of quality of life (QoL) associated adverse events (AEs) following PRECISION TACE with DC Bead compared with conventional transarterial chemoembolisation (cTACE). Methods and Materials: 201 intermediate HCC patients were treated with DC Bead (PRECISION TACE) or conventional TACE (cTACE) with doxorubicin in the PRECISION V clinical study. 93 patients were treated with DC Bead and 108 patients with cTACE every 2 months and followed up for 6 months. AEs were classified according to the Southwest Oncology Group criteria. QoL-associated AEs were defined as alopecia, constipation, nausea, vomiting, pyrexia, chills, asthenia, fatigue, and headache. Results: The biggest difference in QoL-associated AEs was for alopecia: 2 patients (2.2%) for DC Bead versus 21 patients (19.4%) for cTACE. For other clinical symptoms, constipation (n=10, 10.8% vs. n=13, 12.0%), vomiting (n=10, 10.8% vs. n=14, 13.0%), pyrexia (n=16, 17.2% vs. n=26, 24.1%), chills (n=1, 1.1% vs. n=5, 4.6%), and headache (n=2, 2.2% vs. n=8, 7.4%) showed a lower incidence in the DC Bead group versus cTACE. Nausea (DC Bead: n=15, 16.1% vs. cTACE: n=15, 13.9%) and fatigue (DC Bead: n=13, 14.0% vs. cTACE: n=6, 5.6%) were lower for cTACE. The total dose of doxorubicin was on average 35% higher in the DC Bead group. Conclusion: Although patients in the DC Bead group received a higher doxorubicin dose, fewer QoL-associated AEs were reported for this group. Alopecia, the most obvious outward sign of toxicity, was reported in only 2.2% of DC Bead patients, compared with 19.4% of cTACE patients. Thus, PRECISION TACE with DC Bead reduces QoL-associated adverse events.
Abstract:
BACKGROUND: Risk factors for early mortality after pulmonary embolism (PE) are widely known. However, it is uncertain which factors are associated with early readmission after PE. We sought to identify predictors of readmission after an admission for PE. METHODS: We studied 14 426 patient discharges with a primary diagnosis of PE from 186 acute care hospitals in Pennsylvania from January 1, 2000, to November 30, 2002. The outcome was readmission within 30 days of presentation for PE. We used a discrete proportional odds model to study the association between time to readmission and patient factors (age, sex, race, insurance, discharge status, and severity of illness), thrombolysis, and hospital characteristics (region, teaching status, and number of beds). RESULTS: Overall, 2064 patient discharges (14.3%) resulted in a readmission within 30 days of presentation for PE. The most common reasons for readmission were venous thromboembolism (21.9%), cancer (10.8%), pneumonia (5.2%), and bleeding (5.0%). In multivariable analysis, African American race (odds ratio [OR], 1.19; 95% confidence interval [CI], 1.02-1.38), Medicaid insurance (OR, 1.54; 95% CI, 1.31-1.81), discharge home with supplemental care (OR, 1.40; 95% CI, 1.27-1.54), leaving the hospital against medical advice (OR, 2.84; 95% CI, 1.80-4.48), and severity of illness were independently associated with readmission; readmission also varied by hospital region. CONCLUSIONS: Early readmission after PE is common. African American race, Medicaid insurance, severity of illness, discharge status, and hospital region are significantly associated with readmission. The high readmission rates for venous thromboembolism and bleeding suggest that readmission may be linked to suboptimal quality of care in the management of PE.
Abstract:
Summary report: Introduction: The Internet is an important source of information on mental health. Bipolar disorder is commonly associated with disability, comorbidities, poor insight and poor treatment compliance. The burden of the illness, through its depressive and manic episodes, may lead people (whether or not a diagnosis of bipolar disorder has already been made), as well as their families, to search for information on the Internet. It is therefore important that websites dealing with the subject contain high-quality, evidence-based information. Objective: to assess the quality of information available on the Internet about bipolar disorder and to identify quality indicators. Method: two keywords, "bipolar disorder" and "manic depressive illness", were entered into the most commonly used Internet search engines. Websites were evaluated with a standard form designed to rate sites on the basis of authorship (private, university, company, etc.), presentation, interactivity, readability and content quality. The "Health On the Net" (HON) quality label and the DISCERN tool were used to verify their usefulness as quality indicators. Results: of the 80 sites identified, 34 were included. Based on the outcome measures, the content quality of the sites was found to be good. The content quality of websites dealing with bipolar disorder was significantly explained by readability, accountability and interactivity, as well as by a global score. Conclusions: overall, the content of the websites on bipolar disorder included in the study was of good quality.
Abstract:
ABSTRACT: A significant share of deliveries are performed by Cesarean section (C-section) in Europe and in many developed and developing countries. The aims of this thesis are to highlight the non-medical, especially economic and financial, incentives that explain the use of C-sections, as well as the medical consequences of C-sections for women's health, in relation to other factors of obstetrical care quality such as hospital concentration. These diagnoses enable us to identify ways of improving obstetrical care quality in France. Our analysis focuses on two countries, France and Switzerland. In the first part of the thesis, we show the influence of two non-medical factors on C-section use, namely the hospital payment system on the one hand and obstetricians' behaviour, especially their demand for leisure, on the other hand. Using French data for the year 2003, we show firstly that the fee-for-service payment system of private for-profit hospitals induces a higher probability of using a C-section. Obstetricians also play a preeminent role in the decision to use a C-section, as the probability of a C-section rises with the number of obstetricians. We then focus on a French reform introduced in 2004 to investigate the impact of the prospective payment system on obstetric practice. We show that the rise in the C-section rate between 2003 and 2006 is mainly caused by changes in hospital and patient characteristics; obstetricians' practices vary little for patients with the same risk code. In the meantime, however, the number of women coded as high-risk rises. This can be caused by improvements in the quality of coding, with obstetricians choosing codes that better match the real health state of their patients. Yet it can also indicate that obstetricians change their coding practices to justify the use of certain procedures, such as C-section, with no regard to the health state of patients. Financial factors are not the only non-medical factors that can influence the resort to C-section. Using Shelton Brown III's identification strategy, we focus on the potential impact of obstetricians' leisure preference on the use of C-section. We use the distributions of delivery days and hours and the types of C-section (planned or emergency) to show that obstetricians' demand for leisure has a significant impact on the resort to C-section, but only in emergency situations. The second part of the thesis deals with ways to improve obstetric care quality. We use Swiss and French data to study, on the one hand, the impact of C-section on patients' probability of having an obstetric complication and, on the other hand, the influence of hospital concentration on the quality of obstetric care. We find the same results as earlier medical studies regarding the risk of obstetric complications entailed by C-section. These results show that women ought to be better informed of the medical consequences of C-section and that slowing the growth of C-section use should be a priority of public health policy. We finally focus on another determinant of obstetric care quality, namely hospital market concentration. We investigate the impact of hospital concentration on health care quality, measured by the HCUP indicator, by integrating the Herfindahl-Hirschman index into our model.
We find that hospital concentration has a negative impact on obstetric care quality, which undermines today's policy of hospital closings in France. JEL classification: I12; I18. Keywords: Hospital; C-section; Payment System; Counterfactual Estimation; Quality of Care.
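For readers unfamiliar with the concentration measure used above, the Herfindahl-Hirschman index is the sum of squared market shares; a minimal sketch (with made-up delivery counts, not the thesis's data) is:

```python
import pandas as pd

# Hypothetical local birth markets: deliveries per hospital, not real data.
births = pd.DataFrame({
    "market":     ["A", "A", "A", "B", "B"],
    "hospital":   ["h1", "h2", "h3", "h4", "h5"],
    "deliveries": [500, 300, 200, 900, 100],
})

def hhi(counts):
    """Herfindahl-Hirschman index: sum of squared shares (0 < HHI <= 1; 1 = monopoly)."""
    shares = counts / counts.sum()
    return (shares ** 2).sum()

print(births.groupby("market")["deliveries"].apply(hhi))
# market A: 0.5**2 + 0.3**2 + 0.2**2 = 0.38 ; market B: 0.9**2 + 0.1**2 = 0.82
```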
Abstract:
Background: While quality of life (QoL) is a well-recognised outcome measure of Crohn disease (CD) activity, its influence on other outcome measures, including exacerbation of CD is poorly understood. If QoL measures were to be associated with intestinal inflammatory activity, they might be useful for early detection of subclinical flares. Aims: We hypothesised that low QoL might be associated with subsequent CD flares. Methods: A cohort of 318 adult CD patients was observed for 1 year after assessment of baseline characteristics. Data were collected in Swiss university hospitals, regional hospitals and private practices. At inclusion, patients completed the Inflammatory Bowel Disease QoL Questionnaire (gastrointestinal QoL; range: 32 to 224 points) and the Short Form-36 Health Survey (general QoL; range: 35 to 145 points). During follow up, flares were recorded. Binary logistic regression was performed to estimate the relation between QoL and the odds of subsequent flares. Results: A twofold decrease in the odds of flares (99% CI: 1.1; 4.0) per standard deviation of gastrointestinal QoL and a threefold decrease (99% CI: 1.5; 6.2) per standard deviation of general QoL were observed. Conclusions: The close association between QoL and subsequent flares suggests that QoL measures might be useful in detecting upcoming flares before they become clinically apparent.
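As a hedged sketch of how an "odds of flare per standard deviation of QoL" estimate like the one above can be obtained (the authors' code and variable names are not given; flare is assumed to be coded 0/1 and ibdq/sf36 are hypothetical column names):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical one-row-per-patient table: flare (0/1 during follow-up),
# baseline gastrointestinal QoL (ibdq) and general QoL (sf36).
df = pd.read_csv("cd_cohort.csv")

# z-score the QoL instruments so the logit coefficient is "per standard deviation".
for col in ["ibdq", "sf36"]:
    df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()

fit = smf.logit("flare ~ ibdq_z", data=df).fit()
print(np.exp(fit.params["ibdq_z"]))                    # odds ratio per SD of gastrointestinal QoL
print(np.exp(fit.conf_int(alpha=0.01).loc["ibdq_z"]))  # 99% CI, as reported in the abstract
```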
Abstract:
Objective: The main objective of this study was to assess mother-child patterns of interaction in relation to later quality of attachment in a group of children with an orofacial cleft compared with children without a cleft. Design: Families were contacted when the child was 2 months old for a direct assessment of mother-child interaction and then at 12 months for a direct assessment of the child's attachment. Data concerning socioeconomic information and posttraumatic stress symptoms in mothers were collected at the first appointment. Participants: Forty families of children with a cleft and 45 families of children without a cleft were included in the study. Families were recruited at birth in the University Hospital of Lausanne. Results: Results showed that children with a cleft were more difficult and less cooperative during interaction with their mother at 2 months of age compared with children without a cleft. No significant differences were found in mothers or in dyadic interactive styles. Concerning the child's attachment at 12 months old, no differences were found in attachment security. However, secure children with a cleft were significantly more avoidant with their mother during the reunion episodes than secure children without a cleft. Conclusion: Despite the facial disfigurement and the stress engendered by treatment during the first months of the infant's life, children with a cleft and their mothers are doing as well as families without a cleft with regard to the mothers' mental health, mother-child relationships, and later quality of attachment. A potential contributor to this absence of difference may be the pluridisciplinary support that families of children with a cleft benefit from in Lausanne.
Abstract:
This paper presents an initial challenge to tackle the ever so "tricky" points encountered when dealing with energy accounting, and thereafter illustrates how such a system of accounting can be used when assessing the metabolic changes in societies. The paper is divided into four main sections. The first three present a general discussion of the main issues encountered when conducting energy analyses. The last section then combines this heuristic approach with its quantitative formalization for the analysis of possible energy scenarios. Section one covers the broader issue of how to account for the relevant categories used when accounting for Joules of energy, emphasizing the clear distinction between Primary Energy Sources (PES), the physical exploited entities that are used to derive usable energy forms (energy carriers), and Energy Carriers (EC), the actual useful energy that is transmitted for the appropriate end uses within a society. Section two sheds light on the concept of Energy Return on Investment (EROI). Here, it is emphasized that there must already be a certain amount of energy carriers available to be able to extract/exploit Primary Energy Sources and thereafter generate a net supply of energy carriers; it is pointed out that the current trend of intense energy supply has only been possible because of the great use of, and dependence on, fossil energy. Section three follows up on the discussion of EROI, indicating that a single numeric indicator such as an output/input ratio is not sufficient for assessing the performance of energetic systems. Rather, what is underlined is an integrated approach that incorporates (i) how big the net supply of Joules of EC can be, given an amount of extracted PES (the external constraints); (ii) how much EC needs to be invested to extract an amount of PES; and (iii) the power level that it takes for both processes to succeed. Section four ultimately puts the theoretical concepts into play, assessing how the metabolic performance of societies can be accounted for within this analytical framework.
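A minimal numerical sketch of the PES/EC accounting and the EROI ratio discussed above, using made-up figures rather than any values from the paper:

```python
# Made-up example: a society invests energy carriers (EC) to exploit primary
# energy sources (PES) and obtains a gross flow of EC in return.
gross_ec_delivered = 500.0  # EC obtained from exploiting PES (e.g. PJ/year) -- assumed value
ec_invested        = 50.0   # EC spent on extraction, refining and delivery  -- assumed value

eroi   = gross_ec_delivered / ec_invested   # the output/input ratio discussed in section two
net_ec = gross_ec_delivered - ec_invested   # what is actually left for end uses in society

print(f"EROI = {eroi:.1f}:1, net supply = {net_ec:.0f} PJ/year")
# Section three's point: the same ratio can hide very different net supplies
# and power levels, so EROI alone does not characterise an energy system.
```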
Abstract:
Sixty-nine entire male pigs of different halothane genotypes (homozygous halothane-positive, nn, n=36; homozygous halothane-negative, NN, n=33) were fed a supplement of magnesium sulphate (Mg) and/or L-tryptophan (Trp) in the diet for 5 days before slaughter. Animals were housed individually and were submitted to stressful ante mortem conditions (mixed in the lorry according to treatments and transported for 1 hour on rough roads). Individual feed intake was recorded during the 5-d treatment. At the abattoir, pig behaviour was assessed in the raceway to the stunning system and during the stunning period by exposure to CO2. Muscle pH, colour, water holding capacity, texture and cathepsin activities were determined to assess meat quality. The number of pigs with an individual feed intake lower than 2 kg/d differed significantly among diets (P<0.05; Control: 8.7%; Mg&Trp: 43.5%; Trp: 17.4%), and these pigs were considered to have inadequate supplement intake. During the ante mortem period, 15.2% of the pigs included in the experiment died; this percentage decreased to 8.7% among pigs with a feed intake > 2 kg/day, all of them stress-sensitive (nn) pigs. In general, no differences were observed in the behaviour of pigs along the corridor leading to the stunning system or inside the CO2 stunning system. During the stunning procedure, the Trp diet showed shorter periods of muscular excitation than the control and Mg&Trp diets. The combination of a stressful ante mortem treatment and Mg&Trp supplementation led to carcasses with a high incidence of severe skin lesions. Different meat quality results were found when considering all pigs or only those with adequate supplement intake. In the latter case, Trp increased pH45 in the Longissimus thoracis (LT) muscle (6.15 vs. 5.96 for the Control diet; P<0.05), and the higher pH at 24 h (Trp: 5.59 vs. C: 5.47) led to a higher incidence of dark, firm and dry (DFD) traits in the SM muscle (P<0.05). Genotype negatively affected all the meat quality traits. Seventy-five percent of LT and 60.0% of SM muscles from nn pigs were classified as pale, soft and exudative (PSE), while none of the NN pigs showed these traits (P<0.0001). No significant differences were found between genotypes in the incidence of DFD meat. Due to the negative effects observed in the Mg&Trp group on feed intake and carcass quality, the use of a mixture of magnesium sulphate and tryptophan is not recommended.
Abstract:
BACKGROUND: Increasing the appropriateness of use of upper gastrointestinal (GI) endoscopy is important to improve quality of care while at the same time containing costs. This study explored whether detailed explicit appropriateness criteria significantly improve the diagnostic yield of upper GI endoscopy. METHODS: Consecutive patients referred for upper GI endoscopy at 6 centers (1 university hospital, 2 district hospitals, 3 gastroenterology practices) were prospectively included over a 6-month period. After controlling for disease presentation and patient characteristics, the relationship between the appropriateness of upper GI endoscopy, as assessed by explicit Swiss criteria developed by the RAND/UCLA panel method, and the presence of relevant endoscopic lesions was analyzed. RESULTS: A total of 2088 patients (60% outpatients, 57% men) were included. Analysis was restricted to the 1681 patients referred for diagnostic upper GI endoscopy. Forty-six percent of upper GI endoscopies were judged to be appropriate, 15% uncertain, and 39% inappropriate by the explicit criteria. No cancer was found in upper GI endoscopies judged to be inappropriate. Upper GI endoscopies judged appropriate or uncertain yielded significantly more relevant lesions (60%) than did those judged to be inappropriate (37%; odds ratio 2.6: 95% CI [2.2, 3.2]). In multivariate analyses, the diagnostic yield of upper GI endoscopy was significantly influenced by appropriateness, patient gender and age, treatment setting, and symptoms. CONCLUSIONS: Upper GI endoscopies performed for appropriate indications resulted in detecting significantly more clinically relevant lesions than did those performed for inappropriate indications. In addition, no upper GI endoscopy that resulted in a diagnosis of cancer was judged to be inappropriate. The use of such criteria improves patient selection for upper GI endoscopy and can thus contribute to efforts aimed at enhancing the quality and efficiency of care. (Gastrointest Endosc 2000;52:333-41).
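The reported odds ratio can be roughly reproduced from the quoted yields alone; a quick back-of-the-envelope check (not taken from the paper, which reports an adjusted estimate):

```python
# Crude odds ratio implied by the yields quoted above.
p_appropriate   = 0.60  # relevant lesions when the endoscopy was judged appropriate/uncertain
p_inappropriate = 0.37  # relevant lesions when the endoscopy was judged inappropriate

odds_ratio = (p_appropriate / (1 - p_appropriate)) / (p_inappropriate / (1 - p_inappropriate))
print(round(odds_ratio, 2))  # ~2.55, consistent with the reported OR of 2.6
```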
Abstract:
AIMS: In patients with alcohol dependence, health-related quality of life (QOL) is reduced compared with that of a normal healthy population. The objective of the current analysis was to describe the evolution of health-related QOL in adults with alcohol dependence during a 24-month period after initial assessment for alcohol-related treatment in a routine practice setting, and its relation to drinking pattern, evaluated across clusters based on the predominant pattern of alcohol use and set against the influence of baseline variables. METHODS: The Medical Outcomes Study 36-Item Short-Form Survey (MOS-SF-36) was used to measure QOL at baseline and quarterly for 2 years among participants in CONTROL, a prospective observational study of patients initiating treatment for alcohol dependence. The sample consisted of 160 adults with alcohol dependence (65.6% males) with a mean (SD) age of 45.6 (12.0) years. Alcohol use data were collected using TimeLine Follow-Back. Based on the participants' reported alcohol use, three clusters were identified: 52 (32.5%) mostly abstainers, 64 (40.0%) mostly moderate drinkers and 44 (27.5%) mostly heavy drinkers. Mixed-effect linear regression analysis was used to identify factors potentially associated with the mental and physical summary MOS-SF-36 scores at each time point. RESULTS: The mean (SD) MOS-SF-36 mental component summary score (range 0-100, norm 50) was 35.7 (13.6) at baseline [mostly abstainers: 40.4 (14.6); mostly moderate drinkers: 35.6 (12.4); mostly heavy drinkers: 30.1 (12.1)]. The score improved to 43.1 (13.4) at 3 months [mostly abstainers: 47.4 (12.3); mostly moderate drinkers: 44.2 (12.7); mostly heavy drinkers: 35.1 (12.9)], to 47.3 (11.4) at 12 months [mostly abstainers: 51.7 (9.7); mostly moderate drinkers: 44.8 (11.9); mostly heavy drinkers: 44.1 (11.3)], and to 46.6 (11.1) at 24 months [mostly abstainers: 49.2 (11.6); mostly moderate drinkers: 45.7 (11.9); mostly heavy drinkers: 43.7 (8.8)]. Mixed-effect linear regression multivariate analyses indicated a significant association between a lower 2-year follow-up MOS-SF-36 mental score and being a mostly heavy drinker (-6.97, P < 0.001) or mostly moderate drinker (-3.34 points, P = 0.018) [compared with mostly abstainers], being female (-3.73, P = 0.004), and having a baseline Beck Inventory scale score ≥8 (-6.54, P < 0.001). The mean (SD) MOS-SF-36 physical component summary score was 48.8 (10.6) at baseline, remained stable over the follow-up and did not differ across the three clusters. Mixed-effect linear regression univariate analyses found that the average 2-year follow-up MOS-SF-36 physical score was increased (compared with mostly abstainers) in mostly heavy drinkers (+4.44, P = 0.007); no other variables tested influenced the MOS-SF-36 physical score. CONCLUSION: Among individuals with alcohol dependence, a rapid improvement was seen in the mental dimension of QOL following treatment initiation, which was maintained over 24 months. Improvement was associated with the pattern of alcohol use, becoming close to the general population norm in patients classified as mostly abstainers, improving substantially in mostly moderate drinkers and improving only slightly in mostly heavy drinkers. The physical dimension of QOL was generally in the normal range but was not associated with drinking patterns.
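A hedged sketch of the kind of mixed-effect model described above, fitted on a long-format table of quarterly MOS-SF-36 mental summary scores with a random intercept per participant; the file, column names and coding (mcs, cluster, female, beck_ge8, months, participant_id) are hypothetical placeholders, not the study's variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per participant per quarterly assessment.
long = pd.read_csv("control_quarterly.csv")

# Mental component summary score regressed on drinking cluster (mostly abstainers
# as reference), sex, baseline Beck score >= 8 and time, with a random intercept
# for each participant to account for repeated measures.
model = smf.mixedlm(
    "mcs ~ C(cluster, Treatment('abstainer')) + female + beck_ge8 + months",
    data=long,
    groups=long["participant_id"],
)
print(model.fit().summary())
```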
Abstract:
CONTEXT: New trial data and drug regimens that have become available in the last 2 years warrant an update to guidelines for antiretroviral therapy (ART) in human immunodeficiency virus (HIV)-infected adults in resource-rich settings. OBJECTIVE: To provide current recommendations for the treatment of adult HIV infection with ART and use of laboratory-monitoring tools. Guidelines include when to start therapy and with what drugs, monitoring for response and toxic effects, special considerations in therapy, and managing antiretroviral failure. DATA SOURCES, STUDY SELECTION, AND DATA EXTRACTION: Data that had been published or presented in abstract form at scientific conferences in the past 2 years were systematically searched and reviewed by an International Antiviral Society-USA panel. The panel reviewed available evidence and formed recommendations by full panel consensus. DATA SYNTHESIS: Treatment is recommended for all adults with HIV infection; the strength of the recommendation and the quality of the evidence increase with decreasing CD4 cell count and the presence of certain concurrent conditions. Recommended initial regimens include 2 nucleoside reverse transcriptase inhibitors (tenofovir/emtricitabine or abacavir/lamivudine) plus a nonnucleoside reverse transcriptase inhibitor (efavirenz), a ritonavir-boosted protease inhibitor (atazanavir or darunavir), or an integrase strand transfer inhibitor (raltegravir). Alternatives in each class are recommended for patients with or at risk of certain concurrent conditions. CD4 cell count and HIV-1 RNA level should be monitored, as should engagement in care, ART adherence, HIV drug resistance, and quality-of-care indicators. Reasons for regimen switching include virologic, immunologic, or clinical failure and drug toxicity or intolerance. Confirmed treatment failure should be addressed promptly and multiple factors considered. CONCLUSION: New recommendations for HIV patient care include offering ART to all patients regardless of CD4 cell count, changes in therapeutic options, and modifications in the timing and choice of ART in the setting of opportunistic illnesses such as cryptococcal disease and tuberculosis.
Abstract:
OBJECTIVES: Beyond its well-documented association with depressive symptoms across the lifespan, at an individual level, quality of life may be determined by multiple factors: psychosocial characteristics, current physical health and long-term personality traits. METHOD: Quality of life was assessed in two distinct community-based age groups (89 young adults aged 36.2 ± 6.3 and 92 older adults aged 70.4 ± 5.5 years), each group equally including adults with and without acute depressive symptoms. Regression models were applied to explore the association between quality of life assessed with the World Health Organization Quality of Life - Bref (WHOQOL-Bref) and depression severity, education, social support, physical illness, as well as personality dimensions as defined by the Five-Factor Model. RESULTS: In young age, higher quality of life was uniquely associated with lower severity of depressive symptoms. In contrast, in old age, higher quality of life was related to both lower levels of depressive mood and of physical illness. In this age group, a positive association was also found between quality of life and higher levels of Openness to experience and Agreeableness personality dimensions. CONCLUSION: Our data indicated that, in contrast to young cohorts, where acute depression is the main determinant of poor quality of life, physical illness and personality dimensions represent additional independent predictors of this variable in old age. This observation points to the need for concomitant consideration of physical and psychological determinants of quality of life in old age.