853 results for Illicit Drug Use
Abstract:
Tobacco use is positively associated with severity of symptoms along the schizophrenia spectrum. Accordingly, it could be argued that neuropsychological performance, formerly thought to be modulated by schizotypy, is actually modulated by drug use or by an interaction of drug use and schizotypy. We tested whether habitual cigarette smokers, as compared with non-smokers, would show a neuropsychological profile similar to that observed along the schizophrenia spectrum and, if so, whether smoking status or nicotine dependence would be a more significant modulator of behavior than schizotypy. Because hemispheric dominance has been found to be attenuated along the schizophrenia spectrum, 40 right-handed male students (20 non-smokers) performed a lateralized left-hemisphere dominant task (lexical decision) and a right-hemisphere dominant task (facial decision). All individuals completed self-report measures of schizotypy and nicotine dependence. Schizotypy predicted laterality in addition to smoking status: while positive schizotypy (Unusual Experiences) was unrelated to hemispheric performance, Cognitive Disorganization predicted reduced left-hemisphere dominant language functions. These latter findings suggest that Cognitive Disorganization should be regarded separately as a potentially important mediator of thought disorganization and language processing. Additionally, increasing nicotine dependence among smokers predicted a rightward shift of function in both tasks, supporting the role of the right hemisphere in compulsive/impulsive behavior.
Abstract:
Increasingly, patients with unhealthy alcohol and other drug use are being seen in primary care and other non-specialty addiction settings. Primary care providers are well positioned to screen, assess, and treat patients with alcohol and other drug use because this use, and substance use disorders, may contribute to a host of medical and mental health harms. We sought to identify and examine important recent advances in addiction medicine in the medical literature that have implications for the care of patients in primary care or other generalist settings. To accomplish this aim, we selected articles in the field of addiction medicine, critically appraised and summarized the manuscripts, and highlighted their implications for generalist practice. During an initial review, we identified articles through an electronic Medline search (limited to human studies and in English) using search terms for alcohol and other drugs of abuse published from January 2010 to January 2012. After this initial review, we searched for other literature in web-based or journal resources for potential articles of interest. From the list of articles identified in these initial reviews, each of the six authors independently selected articles for more intensive review and identified the ones they found to have a potential impact on generalist practice. The identified articles were then ranked by the number of authors who selected each article. Through a consensus process over 4 meetings, the authors reached agreement on the articles with implications for practice for generalist clinicians that warranted inclusion for discussion. 
The authors then grouped the articles into five categories: 1) screening and brief interventions in outpatient settings, 2) identification and management of substance use among inpatients, 3) medical complications of substance use, 4) use of pharmacotherapy for addiction treatment in primary care and its complications, and 5) integration of addiction treatment and medical care. The authors discuss each selected article's merits, limitations, conclusions, and implications for advancing addiction screening, assessment, and treatment in generalist physician practice environments.
Abstract:
BACKGROUND: There is an ongoing debate as to whether combined antiretroviral treatment (cART) during pregnancy is an independent risk factor for prematurity in HIV-1-infected women. OBJECTIVE: The aim of the study was to examine (1) crude effects of different ART regimens on prematurity, (2) the association between duration of cART and duration of pregnancy, and (3) the role of possibly confounding risk factors for prematurity. METHOD: We analysed data from 1180 pregnancies prospectively collected by the Swiss Mother and Child HIV Cohort Study (MoCHiV) and the Swiss HIV Cohort Study (SHCS). RESULTS: Odds ratios for prematurity in women receiving mono/dual therapy and cART were 1.8 [95% confidence interval (CI) 0.85-3.6] and 2.5 (95% CI 1.4-4.3) compared with women not receiving ART during pregnancy (P=0.004). In a subgroup of 365 pregnancies with comprehensive information on maternal clinical, demographic and lifestyle characteristics, there was no indication that maternal viral load, age, ethnicity or history of injecting drug use affected prematurity rates associated with the use of cART. Duration of cART before delivery was also not associated with duration of pregnancy. CONCLUSION: Our study indicates that confounding by maternal risk factors or duration of cART exposure is not a likely explanation for the effects of ART on prematurity in HIV-1-infected women.
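The odds ratios with 95% confidence intervals reported in abstracts like the one above are typically Wald-type estimates from a 2x2 table. A minimal sketch of that calculation, using hypothetical counts rather than the cohort's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = outcome/no outcome among exposed, c/d = among unexposed."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: 30/70 with/without the outcome among exposed,
# 15/85 among unexposed (illustration only)
print(odds_ratio_ci(30, 70, 15, 85))
```

Note that a CI crossing 1.0 (as for the mono/dual therapy group above, 0.85-3.6) indicates the association is not statistically significant at the 5% level.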
Abstract:
There has been relatively little change over recent decades in the methods used in research on self-reported delinquency. Face-to-face interviews and self-administered interviews in the classroom are still the predominant alternatives envisaged. New methods have been brought into the picture by recent computer technology, the Internet, and the increasing availability of computer equipment and Internet access in schools. In the autumn of 2004, a controlled experiment was conducted with 1,203 students in Lausanne (Switzerland), in which "paper-and-pencil" questionnaires were compared with computer-assisted interviews through the Internet. The experiment included a test of two different definitions of the (same) reference period. After the introductory question ("Did you ever..."), students were asked how many times they had done it (or experienced it), if ever, "over the last 12 months" or "since the October 2003 vacation". Few significant differences were found between the results obtained by the two methods, or for the two definitions of the reference period, in the answers concerning victimisation, self-reported delinquency, drug use, and failure to respond (missing data). Students were found to be more motivated to respond through the Internet, to take less time to fill out the questionnaire, and to be apparently more confident of privacy, while school principals were less reluctant to allow classes to be interviewed through the Internet. The Internet method also involves considerable cost reductions, which is a critical advantage if self-reported delinquency surveys are to become a routinely applied method of evaluation, particularly in countries with limited resources. On balance, the Internet may be instrumental in making research on self-reported delinquency far more feasible in situations where limited resources have so far prevented its implementation.
Abstract:
OBJECTIVES: In 2002, the canton of Fribourg, Switzerland, implemented a coordinated pharmaceutical care service in nursing homes to promote rational drug use. In the context of this service, a project was conducted to develop recommendations for the pharmacological management of behavioral and psychological symptoms of dementia (BPSD) in nursing home residents. DESIGN AND METHODS: Selected evidence-based guidelines and meta-analysis sources related to the management of depression, insomnia, and agitation in dementia patients were systematically searched and evaluated. Evidence and controversies regarding the pharmacological treatment of the most common BPSD symptoms were reviewed, and treatment algorithms were developed. RESULTS: Ten evidence-based guidelines and meta-analyses for BPSD management were identified, with none specifically addressing issues related to nursing home residents. Based on this literature, recommendations were developed for the practice of pharmacological management of depression, sleep disturbances, and agitation in nursing home residents. For depression, SSRIs are considered the first choice if an antidepressant is required. No clear evidence has been found for sleep disturbances; the underlying conditions need to be investigated closely before the introduction of any drug therapy. Many drugs have been investigated for the treatment of agitation, and if necessary, antipsychotics could be used, although they have significant side effects. Several areas of uncertainty were identified, such as the current controversy about typical and atypical antipsychotic use or the appropriateness of cholinesterase inhibitors for controlling agitation. Treatment algorithms were presented to general practitioners, pharmacists, and medical directors of nursing homes in the canton of Fribourg, and will now be implemented progressively, using educational sessions, pharmaceutical counseling, and monitoring. 
CONCLUSION: Based on existing evidence-based studies, recommendations were developed for the pharmacological management of depression, sleep disturbances, and agitation in nursing home residents. Whether these algorithms, implemented through pharmaceutical care services, will improve psychotropic drug prescribing and prevent drug-related problems in nursing home residents remains to be studied.
Abstract:
OBJECTIVES: The impact of diagnostic delay (the period from the appearance of first symptoms to diagnosis) on the clinical course of Crohn's disease (CD) is unknown. We examined whether the length of diagnostic delay affects disease outcomes. METHODS: Data from the Swiss IBD cohort study were analyzed. Patients were recruited from university centers (68%), regional hospitals (14%), and private practices (18%). The frequencies of bowel stenoses, internal fistulas, perianal fistulas, and CD-related surgery (intestinal and perianal) were analyzed. RESULTS: A total of 905 CD patients (53.4% female, median age at diagnosis 26 years (IQR 20-36)) were stratified into four groups according to the quartiles of diagnostic delay (0-3, 4-9, 10-24, and ≥25 months, respectively). Median diagnostic delay was 9 months (IQR 3-24). The frequency of immunomodulator and/or anti-tumor necrosis factor drug use did not differ among the four groups. The length of diagnostic delay was positively correlated with the occurrence of bowel stenosis (odds ratio (OR) 1.76, P=0.011 for delay of ≥25 months) and intestinal surgery (OR 1.76, P=0.014 for delay of 10-24 months and OR 2.03, P=0.003 for delay of ≥25 months). Disease duration was positively associated, and non-ileal disease location negatively associated, with bowel stenosis (OR 1.07, P<0.001, and OR 0.41, P=0.005, respectively) and intestinal surgery (OR 1.14, P<0.001, and OR 0.23, P<0.001, respectively). CONCLUSIONS: The length of diagnostic delay is correlated with an increased risk of bowel stenosis and CD-related intestinal surgery. Efforts should be undertaken to shorten the diagnostic delay.
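The quartile-based stratification used in the study above (cut points at roughly 3, 9, and 24 months) can be sketched with the standard library; the delay values below are invented for illustration, not the cohort's data:

```python
import statistics

# Hypothetical diagnostic delays in months (not the cohort's data)
delays = [1, 2, 3, 3, 5, 8, 9, 9, 12, 18, 24, 26, 30, 40, 60, 75]

# Quartile cut points used to define four exposure strata
q1, q2, q3 = statistics.quantiles(delays, n=4)

def delay_group(months):
    """Assign a delay to one of the four quartile-based strata."""
    if months <= q1:
        return "Q1"
    if months <= q2:
        return "Q2"
    if months <= q3:
        return "Q3"
    return "Q4"

groups = [delay_group(d) for d in delays]
print(statistics.median(delays), (q1, q2, q3))
```

By construction the second cut point equals the median, and each stratum holds about a quarter of the patients, which is what makes quartile strata convenient for comparing outcome frequencies.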
Abstract:
BACKGROUND: Prognostic models and nomograms were recently developed to predict survival of patients with newly diagnosed glioblastoma multiforme (GBM) (1). To improve predictions, models should be updated with the most recent patient and disease information, and nomograms predicting patient outcome at the time of disease progression are required. METHODS: Baseline information from 299 patients with recurrent GBM recruited in 8 phase I or II trials of the EORTC Brain Tumor Group was used to evaluate clinical parameters as prognosticators of patient outcome. Univariate (log-rank) and multivariate (Cox model) analyses were performed to assess the ability of patient characteristics (age, sex, performance status [WHO PS], and MRC neurological deficit scale), disease history (prior treatments, time since last treatment or initial diagnosis, and administration of steroids or antiepileptics), and disease characteristics (tumor size and number of lesions) to predict progression-free survival (PFS) and overall survival (OS). The bootstrap technique was used for internal validation of the models. Nomograms were computed to provide predictions for individual patients. RESULTS: Poor PS and more than 1 lesion had a significant prognostic impact on both PFS and OS. Antiepileptic drug use was significantly associated with worse PFS. Larger tumors (split by the median of the largest tumor diameter, >42.5 mm) and steroid use were associated with shorter OS. Age, sex, neurologic deficit, prior therapies, and time since last therapy or initial diagnosis did not show independent prognostic value for PFS or OS. CONCLUSIONS: This analysis confirms that PS, but not age, is a major prognostic factor for PFS and OS. Multiple or large tumors and the need to administer steroids significantly increase the risk of progression and death. Nomograms at recurrence could be used to obtain accurate predictions for the design of new targeted therapy trials or retrospective analyses. (1. T. Gorlia et al., Nomograms for predicting survival of patients with newly diagnosed glioblastoma. Lancet Oncol 9 (1): 29-38, 2008.)
Abstract:
Prominent doping cases in certain sports have recently raised public awareness of doping and reinforced the perception that doping is widespread. Efforts to deal with doping in sport have intensified in recent years, yet the general public believes that the 'cheaters' are ahead of the testers. Therefore, there is an urgent need to change the antidoping strategy. For example, the increase in the number of individual drug tests conducted between 2005 and 2012 was approximately 90 000 and equivalent to an increase of about 50%, yet the number of adverse analytical findings remained broadly the same. There is also a strikingly different prevalence of doping substances and methods in sports such as a 0.03% prevalence of anabolic steroids in football compared to 0.4% in the overall WADA statistics. Future efforts in the fight against doping should therefore be more heavily based on preventative strategies such as education and on the analysis of data and forensic intelligence and also on the experiences of relevant stakeholders such as the national antidoping organisations, the laboratories, athletes or team physicians and related biomedical support staff. This strategy is essential to instigate the change needed to more effectively fight doping in sport.
Abstract:
BACKGROUND: The clinical course of HIV-1 infection is highly variable among individuals, at least in part as a result of genetic polymorphisms in the host. Toll-like receptors (TLRs) have a key role in innate immunity and mutations in the genes encoding these receptors have been associated with increased or decreased susceptibility to infections. OBJECTIVES: To determine whether single-nucleotide polymorphisms (SNPs) in TLR2-4 and TLR7-9 influenced the natural course of HIV-1 infection. METHODS: Twenty-eight SNPs in TLRs were analysed in HAART-naive HIV-positive patients from the Swiss HIV Cohort Study. The SNPs were detected using Sequenom technology. Haplotypes were inferred using an expectation-maximization algorithm. The CD4 T cell decline was calculated using a least-squares regression. Patients with a rapid CD4 cell decline, less than the 15th percentile, were defined as rapid progressors. The risk of rapid progression associated with SNPs was estimated using a logistic regression model. Other candidate risk factors included age, sex and risk groups (heterosexual, homosexual and intravenous drug use). RESULTS: Two SNPs in TLR9 (1635A/G and +1174G/A) in linkage disequilibrium were associated with the rapid progressor phenotype: for 1635A/G, odds ratio (OR), 3.9 [95% confidence interval (CI), 1.7-9.2] for GA versus AA and OR, 4.7 (95% CI, 1.9-12.0) for GG versus AA (P = 0.0008). CONCLUSION: Rapid progression of HIV-1 infection was associated with TLR9 polymorphisms. Because of its potential implications for intervention strategies and vaccine developments, additional epidemiological and experimental studies are needed to confirm this association.
Abstract:
Despite improvements in health care, the incidence of infective endocarditis has not decreased over the past decades. This apparent paradox is explained by a progressive evolution in risk factors; while classic predisposing conditions such as rheumatic heart disease have been all but eradicated, new risk factors for infective endocarditis have emerged. These include intravenous drug use, sclerotic valve disease in elderly patients, use of prosthetic valves, and nosocomial disease. Newly identified pathogens that are difficult to cultivate (e.g. Bartonella spp. and Tropheryma whipplei) are present in selected individuals, and resistant organisms are challenging conventional antimicrobial therapy. Keeping up with these changes depends on a comprehensive approach, allying understanding of the pathogenesis of disease with the development of new drugs for infective endocarditis. Infection by staphylococci and streptococci is being dissected at the molecular level. New ideas for antimicrobial agents are being developed. These novel insights should help redefine preventive and therapeutic strategies against infective endocarditis.
Abstract:
BACKGROUND: Hepatitis C virus (HCV) infection is associated with decreased health-related quality of life (HRQOL). Although HCV has been suggested to directly impair neuropsychiatric functions, other factors may also play a role. PATIENTS AND METHODS: In this cross-sectional study, we assessed the impact of various host-, disease- and virus-related factors on HRQOL in a large, unselected population of anti-HCV-positive subjects. All individuals (n = 1736) enrolled in the Swiss Hepatitis C Cohort Study (SCCS) were asked to complete the Short Form 36 (SF-36) and the Hospital Anxiety and Depression Scale (HADS). RESULTS: 833 patients (48%) returned the questionnaires. Survey participants had significantly worse scores on both assessment instruments compared with the general population. By multivariable analysis, reduced HRQOL (mental and physical summary scores of the SF-36) was independently associated with income. In addition, a low physical summary score was associated with age and diabetes, whereas a low mental summary score was associated with intravenous drug use. HADS anxiety and depression scores were independently associated with income and intravenous drug use. In addition, the HADS depression score was associated with diabetes. None of the SF-36 or HADS scores correlated with either the presence or the level of serum HCV RNA. In particular, SF-36 and HADS scores were comparable in 555 HCV RNA-positive and 262 HCV RNA-negative individuals. CONCLUSIONS: Anti-HCV-positive subjects have decreased HRQOL compared with controls. The magnitude of this decrease was clinically important for the SF-36 vitality score. Host and environmental factors, rather than viral factors, seem to affect HRQOL.
Abstract:
OBJECTIVES: Polypharmacy is one of the main management issues in public health policies because of its financial impact and the increasing number of people involved. We characterised the polymedicated population according to demographic and therapeutic profile, together with the cost to the public healthcare system. DESIGN: Cross-sectional study. SETTING: Primary healthcare in Barcelona Health Region, Catalonia, Spain (5 105 551 registered inhabitants). PARTICIPANTS: All insured polymedicated patients. Polymedicated patients were defined as those with a consumption of ≥16 drugs/month. MAIN OUTCOME MEASURES: The study variables related to age, gender and medication intake were obtained from the 2008 census and from records of prescriptions dispensed in pharmacies and charged to the public health system. RESULTS: There were 36 880 polymedicated patients (women: 64.2%; average age: 74.5±10.9 years). The total number of prescriptions billed in 2008 was 2 266 830 (2 272 920 total package units). The most polymedicated group (up to 40% of the total prescriptions) was patients between 75 and 84 years old. The average number of prescriptions billed monthly per patient was 32±2, with an average cost of 452.7±27.5. The total cost of those prescriptions corresponded to 2% of the drug expenditure in Catalonia. The groups N, C, A, R and M represented 71.4% of the total number of drug package units dispensed to polymedicated patients. Great variability was found between the medication profiles of men and women, and between age groups; the greatest discrepancies were found in paediatric patients (5-14 years) and the elderly (≥65 years). CONCLUSIONS: This study provides essential information for taking steps towards rational drug use and a structured approach to the polymedicated population in primary healthcare.
Abstract:
BACKGROUND: We conducted a retrospective analysis of administration of nonoccupational HIV post-exposure prophylaxis (nPEP) in a single centre where tracing and testing of the source of exposure were carried out systematically over a 10-year period. METHODS: Files of all nPEP requests between 1998 and 2007 were reviewed. Characteristics of the exposed and source patients, the type of exposure, and clinical and serological outcomes were analysed. RESULTS: nPEP requests increased by 850% over 10 years. Among 910 events, 58% were heterosexual exposures, 15% homosexual exposures, 6% sexual assaults and 20% nonsexual exposures. In 208 events (23%), the source was reported to be HIV positive. In the remaining cases, active source tracing enabled 298 HIV tests to be performed (42%) and identified 11 HIV infections (3.7%). nPEP could be avoided or interrupted in 31% of the 910 events because the source tested negative. Of 710 patients who started nPEP, 396 (56%) reported side effects, among whom 39 (5%) had to interrupt treatment. There were two HIV seroconversions, and neither was attributed to nPEP failure. CONCLUSIONS: nPEP requests increased over time. HIV testing of the source person avoided nPEP in 31% of events and was therefore paramount in the management of potential HIV exposures. Furthermore, it allowed active screening of populations potentially at risk of undiagnosed HIV infection, as shown by the increased HIV prevalence in these groups (3.7%) compared with a prevalence of 0.3% in Switzerland as a whole.
Abstract:
Marijuana is the most widely used illicit drug; however, its effects on the cognitive functions underlying safe driving remain mostly unexplored. Our goal was to evaluate the impact of cannabis on the driving ability of occasional smokers by investigating changes in the brain network involved in a tracking task. The subject characteristics, the percentage of Δ(9)-tetrahydrocannabinol in the joint, and the inhaled dose were in accordance with real-life conditions. Thirty-one male volunteers were enrolled in this study, which included clinical and toxicological assessments together with functional magnetic resonance imaging (fMRI) of the brain and measurements of psychomotor skills. The fMRI paradigm was based on a visuo-motor tracking task, alternating active tracking blocks with passive tracking viewing and a rest condition. We show that cannabis smoking, even at low Δ(9)-tetrahydrocannabinol blood concentrations, decreases psychomotor skills and alters the activity of the brain networks involved in cognition. The relative decrease of the Blood Oxygen Level Dependent (BOLD) response after cannabis smoking in the anterior insula, dorsomedial thalamus, and striatum compared with placebo smoking suggests an alteration of the network involved in saliency detection. In addition, the decrease of the BOLD response in the right superior parietal cortex and in the dorsolateral prefrontal cortex indicates the involvement of the executive control network known to operate once the saliencies are identified. Furthermore, cannabis increases activity in the rostral anterior cingulate cortex and ventromedial prefrontal cortices, suggesting an increase in self-oriented mental activity. Subjects are more attracted by intrapersonal stimuli ("self") and fail to attend to task performance, leading to an insufficient allocation of task-oriented resources and to sub-optimal performance. These effects correlate with the subjective feeling of confusion rather than with the blood level of Δ(9)-tetrahydrocannabinol.
These findings bolster the zero-tolerance policy adopted in several countries that prohibits the presence of any amount of drugs in blood while driving.
Abstract:
OBJECTIVES: Persons from sub-Saharan Africa (SSA) are increasingly enrolled in the Swiss HIV Cohort Study (SHCS). Cohorts from other European countries have shown higher rates of viral failure among their SSA participants. We analyzed long-term outcomes of SSA versus North Western European participants. DESIGN: We analyzed data from the SHCS, a nationwide prospective cohort study of HIV-infected adults at 7 sites in Switzerland. METHODS: SSA and North Western European participants were included if their first treatment combination consisted of at least 3 antiretroviral drugs (cART), if they had at least 1 follow-up visit, did not report active injecting drug use, and did not start cART with CD4 counts >200 cells per microliter during pregnancy. Early viral response, CD4 cell recovery, viral failure, adherence, discontinuation from the SHCS, new AIDS-defining events, and survival were analyzed using linear regression and Cox proportional hazards models. RESULTS: The proportion of participants from SSA within the SHCS increased from 2.6% (<1995) to 20.8% (2005-2009). Of 4656 included participants, 808 (17.4%) were from SSA. Early viral response (6 months) and the rate of viral failure in an intent-to-stay-on-cART approach were similar. However, SSA participants had a higher risk of viral failure on cART (adjusted hazard ratio: 2.03, 95% confidence interval: 1.50 to 2.75). Self-reported adherence was inferior among SSA participants. There was no increase in AIDS-defining events or mortality among SSA participants. CONCLUSIONS: Increased attention must be given to factors negatively influencing adherence to cART in participants from SSA to guarantee equal longer-term results on cART.