843 results for Drugs abuse
Abstract:
A fluorimetric microassay that uses a redox dye to determine the viability of the flagellate Trichomonas vaginalis has been optimised to provide a more sensitive method to evaluate potential trichomonacidal compounds. Resazurin has been used in recent years to test drugs against different parasites, including trichomonadid protozoa; however, the reproducibility of these resazurin-based methods in our laboratory has been limited because the flagellate culture medium spontaneously reduces the resazurin. The objective of this work was to refine the fluorimetric microassay method previously developed by other research groups to reduce the fluorescence background generated by the media and increase the sensitivity of the screening assay. The experimental conditions, time of incubation, resazurin concentration and media used in the microtitre plates were adjusted. Different drug sensitivity studies against T. vaginalis were developed using the 5-nitroimidazole reference drugs, new 5-nitroindazolinones and 5-nitroindazole synthetic derivatives. Haemocytometer count results were compared with the resazurin assay using a 10% solution of 3 mM resazurin dissolved in phosphate buffered saline with glucose (1 mg/mL). The fluorimetric assay and the haemocytometer counts resulted in similar percentages of trichomonacidal activity in all the experiments, demonstrating that the fluorimetric microtitre assay has the necessary accuracy for high-throughput screening of new drugs against T. vaginalis.
Abstract:
The endocannabinoid system (ECS) has been implicated in many physiological functions, including the regulation of appetite, food intake and energy balance, a crucial involvement in brain reward systems and a role in psychophysiological homeostasis (anxiety and stress responses). We first introduce this important regulatory system and chronicle what is known concerning the signal transduction pathways activated upon the binding of endogenous cannabinoid ligands to the Gi/o-coupled CB1 cannabinoid receptor, as well as its interactions with other hormones and neuromodulators that can modify endocannabinoid signaling in the brain. Anorexia nervosa (AN) and bulimia nervosa (BN) are severe and disabling psychiatric disorders, characterized by profound eating and weight alterations and body image disturbances. Since endocannabinoids modulate eating behavior, it is plausible that endocannabinoid genes contribute to the biological vulnerability to these diseases. We present and discuss data suggesting impaired endocannabinoid signaling in these eating disorders, including associations of polymorphisms in endocannabinoid-system genes and altered CB1-receptor expression in AN and BN. We then discuss recent findings that may provide new avenues for the identification of therapeutic strategies based on the endocannabinoid system. Given its role as a reward-related system, the endocannabinoid system is not only the target of cannabis but also interacts with other drugs of abuse. Conversely, the ECS may itself be a potential target for the treatment of drug abuse and addiction. Within this framework, we focus on the enzymatic machinery involved in endocannabinoid inactivation (notably fatty acid amide hydrolase, or FAAH) as a particularly interesting potential target.
Since a deregulated endocannabinoid system may also be related to the depression, anxiety and pain symptomatology accompanying drug-withdrawal states, this area is also relevant for exploring adjuvant treatments that could improve these adverse emotional reactions.
Abstract:
Animal studies point to an involvement of the endocannabinoid system in executive functions. In humans, several studies have suggested an association between acute or chronic use of exogenous cannabinoids (Δ9-tetrahydrocannabinol) and executive impairments. However, to date, no published reports establish the relationship between endocannabinoids, as biomarkers of the cannabinoid neurotransmission system, and executive functioning in humans. The aim of the present study was to explore the association between circulating plasma levels of the endocannabinoids N-arachidonoylethanolamine (AEA) and 2-arachidonoylglycerol (2-AG) and executive functions (decision making, response inhibition and cognitive flexibility) in healthy subjects. One hundred and fifty-seven subjects were included and assessed with the Wisconsin Card Sorting Test, the Stroop Color and Word Test, and the Iowa Gambling Task. All participants were female, aged between 18 and 60 years, and spoke Spanish as their first language. Results showed a negative correlation between 2-AG and cognitive flexibility performance (r = -.37; p < .05). A positive correlation was found between AEA concentrations and both cognitive flexibility (r = .59; p < .05) and decision making performance (r = .23; p < .05). There was no significant correlation between either 2-AG (r = -.17) or AEA (r = -.08) concentrations and response inhibition. These results show, in humans, a relevant modulatory role of the endocannabinoid system in prefrontal-dependent cognitive functioning. The present study might have significant implications for the underlying executive alterations described in some psychiatric disorders currently associated with endocannabinoid deregulation (namely drug abuse/dependence, depression, obesity and eating disorders). Understanding the neurobiology of their dysexecutive profile might contribute to the development of new treatments and pharmacological approaches.
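The associations above are reported as Pearson correlation coefficients (e.g., r = .59 between AEA and cognitive flexibility). As a minimal sketch of how such a coefficient is computed, the following uses hypothetical data, not the study's measurements:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))  # covariance term
    sxx = sum((a - mx) ** 2 for a in x)                    # variance of x
    syy = sum((b - my) ** 2 for b in y)                    # variance of y
    return sxy / math.sqrt(sxx * syy)

# Hypothetical illustration: perfectly linearly related data gives r = 1.0
x = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]
print(round(pearson_r(x, y), 2))  # prints 1.0
```

A coefficient near ±1 indicates a strong linear relation, near 0 none; the study's significance thresholds (p < .05) additionally depend on sample size, which the sketch above does not cover.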
Abstract:
Non-steroidal anti-inflammatory drugs (NSAIDs) are the drugs most frequently involved in hypersensitivity drug reactions. Histamine is released in the allergic response to NSAIDs and is responsible for some of the clinical symptoms. The aim of this study was to analyze the clinical association of functional polymorphisms in the genes coding for enzymes involved in histamine homeostasis with the hypersensitivity response to NSAIDs. We studied a cohort of 442 unrelated Caucasian patients with hypersensitivity to NSAIDs. Patients who experienced three or more episodes with two or more different NSAIDs were included. If this requirement was not met, the diagnosis was established by challenge testing. A total of 414 healthy unrelated controls, ethnically matched with the patients and from the same geographic area, were recruited. Analyses of the SNPs rs17740607, rs2073440, rs1801105, rs2052129, rs10156191, rs1049742 and rs1049793 in the HDC, HNMT and DAO genes were carried out by means of TaqMan assays. The detrimental DAO 16 Met allele (rs10156191), which causes decreased metabolic capacity, was overrepresented among patients with crossed-hypersensitivity to NSAIDs, with an OR = 1.7 (95% CI = 1.3-2.1; Pc = 0.0003) and a gene-dose effect (P = 0.0001). The association was replicated in two populations from different geographic areas (Pc = 0.008 and Pc = 0.004, respectively). CONCLUSIONS AND IMPLICATIONS: The DAO polymorphism rs10156191, which causes impaired metabolism of circulating histamine, is associated with the clinical response in crossed-hypersensitivity to NSAIDs and could be used as a biomarker of response.
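The reported effect size (OR = 1.7, 95% CI 1.3-2.1) is an odds ratio from a case-control comparison. A minimal sketch of how an odds ratio and its Wald 95% confidence interval are derived from a 2×2 table follows; the counts are made up for illustration and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = carriers among cases,     b = non-carriers among cases,
    c = carriers among controls,  d = non-carriers among controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: carriers are relatively twice as common among cases
or_, lo, hi = odds_ratio_ci(20, 10, 10, 10)
print(round(or_, 1))  # prints 2.0
```

Note that the study's Pc values are corrected p-values; the sketch covers only the interval estimate, not multiple-testing correction or the gene-dose (trend) test.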
Abstract:
The trypanosomatid cytoskeleton is responsible for the parasite's shape and is modulated throughout the different stages of the parasite's life cycle. When parasites are exposed to media with reduced osmolarity, they initially swell, but subsequently undergo a compensatory shrinking referred to as regulatory volume decrease (RVD). We studied the effects of anti-microtubule (Mt) drugs on the proliferation of Leishmania mexicana promastigotes and their capacity to undergo RVD. All of the drugs tested exerted antiproliferative effects of varying magnitudes [ansamitocin P3 (AP3) > trifluoperazine > taxol > rhizoxin > chlorpromazine]. No direct relationship was found between antiproliferative drug treatment and RVD. Similarly, Mt stability was not affected by drug treatment. Ansamitocin P3, which is effective at nanomolar concentrations, blocked amastigote-promastigote differentiation and was the only drug that impeded RVD, as measured by light dispersion. AP3 induced cells with two kinetoplasts (Kt) and one nucleus that had numerous flagella-associated Kts throughout the cell. These results suggest that the dramatic morphological changes induced by AP3 alter the spatial organisation and directionality of the Mts that are necessary for the parasite's hypotonic stress-induced shape change, as well as its recovery.
Abstract:
Mice infected with Schistosoma mansoni were treated with oxamniquine, praziquantel or artesunate during the pre-patent phase, with the aim of observing schistogram alterations. Half of the animals were perfused five days post-treatment for counting and classification of immature worms based on pre-established morphological criteria (schistogram); the remaining animals were evaluated 42 or 100 days after infection, when perfusion of the portal system was performed for collection and counting of adult worms and an oogram. Oxamniquine and artesunate treatment administered at the pre-postural phase caused a significant reduction in the number of immature and adult worms. However, there was little reduction with praziquantel when used at a dose of 400 mg/kg for treatments administered 14, 15, 21 or 23 days post-infection. Artesunate was responsible for significant alterations in the development of young worms, as well as for a higher number of worms presenting intestinal damage. Immature adult worms were detected in mice treated with artesunate or oxamniquine at the pre-patent phase of infection and recovered by perfusion 100 days after infection. The schistogram proved to be a very useful tool for the experimental evaluation of the activity of antischistosomal drugs and a good model for identifying the stages most sensitive to drugs.
Abstract:
Nonimmediate drug hypersensitivity reactions (DHRs) are difficult to manage in daily clinical practice, mainly owing to their heterogeneous clinical manifestations and the lack of selective biological markers. In vitro methods are necessary to establish a diagnosis, especially given the low sensitivity of skin tests and the inherent risks of drug provocation testing. In vitro evaluation of nonimmediate DHRs must include approaches that can be applied during the different phases of the reaction. During the acute phase, monitoring markers in both skin and peripheral blood helps to discriminate between immediate and nonimmediate DHRs with cutaneous responses and to distinguish between reactions that, although they present similar clinical symptoms, are produced by different immunological mechanisms and therefore have different treatments and prognoses. During the resolution phase, in vitro testing is used to detect the response of T cells to drug stimulation; however, this approach has certain limitations, such as the lack of validated studies assessing sensitivity. Moreover, in vitro tests indicate an immune response that is not always related to a DHR. In this review, members of the Immunology and Drug Allergy Committee of the Spanish Society of Allergy and Clinical Immunology (SEAIC) provide an overview of the most widely used in vitro tests for evaluating nonimmediate DHRs.
Abstract:
The treatment of essential hypertension is based essentially on the prescription of four major classes of antihypertensive drugs, i.e. blockers of the renin-angiotensin system, calcium channel blockers, diuretics and beta-blockers. In recent years, very few new drug therapies for hypertension have become available. It is therefore crucial for physicians to optimize antihypertensive therapy with the drugs available on the market. For each of the classes of antihypertensive drugs, questions have recently been raised: are angiotensin-converting enzyme (ACE) inhibitors superior to angiotensin II receptor blockers (ARBs)? Is it possible to reduce the incidence of peripheral oedema with calcium antagonists? Is hydrochlorothiazide really the right diuretic to use in combination therapies? The purpose of this review is to discuss these questions in the light of the most recent clinical studies and meta-analyses. These suggest that ACE inhibitors and ARBs are equivalent, except for the better tolerability profile of ARBs; that third-generation calcium channel blockers make it possible to reduce the incidence of peripheral oedema; and that chlorthalidone is certainly more effective than hydrochlorothiazide in preventing cardiovascular events in hypertension. Finally, studies suggest that drug adherence and long-term persistence with therapy are among the major issues in the current management of essential hypertension.
Abstract:
Since 2004, cannabis has been prohibited by the World Anti-Doping Agency for all sports competitions. In the years since then, about half of all positive doping cases in Switzerland have been related to cannabis consumption. In doping urine analysis, the target analyte is 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THC-COOH), with a cutoff of 15 ng/mL. However, the wide urinary detection window of this long-term metabolite of Δ9-tetrahydrocannabinol (THC) does not allow conclusions to be drawn regarding the time of consumption or the impact on physical performance. The purpose of the present study on light cannabis smokers was to evaluate target analytes with shorter urinary excretion times. Twelve male volunteers smoked a cannabis cigarette standardized to 70 mg THC per cigarette. Plasma and urine were collected for up to 8 h and 11 days, respectively. Total THC, 11-hydroxy-Δ9-tetrahydrocannabinol (THC-OH), and THC-COOH were determined after hydrolysis followed by solid-phase extraction and gas chromatography/mass spectrometry. The limits of quantitation were 0.1-1.0 ng/mL. Eight puffs delivered a mean THC dose of 45 mg. Plasma levels of total THC, THC-OH, and THC-COOH were measured in the ranges 0.2-59.1, 0.1-3.9, and 0.4-16.4 ng/mL, respectively. Peak concentrations were observed at 5, 5-20, and 20-180 min. Urine levels were measured in the ranges 0.1-1.3, 0.1-14.4, and 0.5-38.2 ng/mL, peaking at 2, 2, and 6-24 h, respectively. The times of the last detectable levels were 2-8, 6-96, and 48-120 h. Besides high to very high THC-COOH levels (245 ± 1,111 ng/mL), THC (3 ± 8 ng/mL) and THC-OH (51 ± 246 ng/mL) were found in 65% and 98% of cannabis-positive athletes' urine samples, respectively. In conclusion, in addition to THC-COOH, the pharmacologically active THC and THC-OH should be used as target analytes for doping urine analysis.
In the case of light cannabis use, this may allow more recent consumption, which probably influences performance during competition, to be estimated. However, it is not possible to discriminate the intent of cannabis use, i.e., recreational versus doping. Additionally, pharmacokinetic data from female volunteers are needed to interpret cannabis-positive doping cases in female athletes.
Abstract:
Addiction to major drugs of abuse, such as cocaine, has recently been linked to alterations in adult neurogenesis in the hippocampus. The endogenous cannabinoid system modulates this proliferative response, as demonstrated by the finding that pharmacological activation or blockade of cannabinoid CB1 and CB2 receptors modulates both neurogenesis and cell death in the brain. In the present study, we evaluated whether the endogenous cannabinoid system affects cocaine-induced alterations in cell proliferation. To this end, we examined whether pharmacological blockade of either CB1 (rimonabant, 3 mg/kg) or CB2 receptors (AM630, 3 mg/kg) would affect cell proliferation [cells were labeled with 5-bromo-2'-deoxyuridine (BrdU)] in the subventricular zone (SVZ) of the lateral ventricle and the dentate subgranular zone (SGZ). Additionally, we measured cell apoptosis (monitored by the expression of cleaved caspase-3) and glial activation [by analyzing the expression of glial fibrillary acidic protein (GFAP) and Iba-1] in the striatum and hippocampus during acute and repeated (4-day) cocaine administration (20 mg/kg). The results showed that acute cocaine exposure decreased the number of BrdU-immunoreactive (ir) cells in the SVZ and SGZ. In contrast, repeated cocaine exposure reduced the number of BrdU-ir cells only in the SVZ. Both acute and repeated cocaine exposure increased the number of cleaved caspase-3-, GFAP- and Iba1-ir cells in the hippocampus, and this effect was counteracted by AM630 or rimonabant, which increased the number of BrdU-, GFAP- and Iba1-ir cells in the hippocampus. These results indicate that the changes in neurogenic, apoptotic and gliotic processes produced by repeated cocaine administration were normalized by pharmacological blockade of CB1 and CB2.
The restorative effects of cannabinoid receptor blockade on hippocampal cell proliferation were associated with the prevention of the induction of conditioned locomotion but not with the prevention of cocaine-induced sensitization.
Abstract:
Purpose. To survey the management of patients with neovascular age-related macular degeneration (nvAMD) in Spain. Methods. An observational retrospective multicenter study was conducted. The variables analyzed were sociodemographic characteristics, foveal and macular thickness, visual acuity (VA), type of treatment, number of injections, and the initial administration of a loading dose of an antiangiogenic drug. Results. 208 patients were followed up for an average of 23.4 months. During the first and second years, patients received a mean of 4.5 ± 1.8 and 1.6 ± 2.1 injections of antiangiogenic drugs, and 5.4 ± 2.8 and 3.6 ± 2.2 follow-up visits were performed, respectively. The greatest improvement in VA was observed at 3 months of follow-up, followed by a decline in the response that stabilized above baseline values until the end of the study. Patients who received an initial loading dose presented greater VA gains than those who did not. Conclusions. Our results suggest the need for a more standardized approach to the management and diagnosis of nvAMD treated with VEGF inhibitors. To achieve the visual outcomes reported in pivotal trials, early diagnosis, a proactive approach (more treatment visits than follow-up visits), and close monitoring might be the key to successfully managing nvAMD.
Abstract:
BACKGROUND: In Switzerland, intravenous drug use (IDU) accounts for 80% of newly acquired hepatitis C virus (HCV) infections. Early HCV treatment has the potential to interrupt the transmission chain and reduce morbidity/mortality due to decompensated liver cirrhosis and hepatocellular carcinoma. Nevertheless, patients in drug substitution programs are often insufficiently screened and treated. OBJECTIVE/METHODS: With the aim of improving HCV management in IDUs, we conducted a cross-sectional chart review in three opioid substitution programs in St. Gallen (125 methadone and 71 heroin recipients). Results were compared with another heroin substitution program in Bern (202 patients) and with SCCS/SHCS data. RESULTS: Among the methadone/heroin recipients in St. Gallen, the diagnostic workup of HCV was better than expected: HCV/HIV status was unknown in only 1% (2/196), HCV RNA testing was not performed in 9% (13/146) of anti-HCV-positives, and the genotype was missing in 15% (12/78) of HCV RNA-positives. In those without spontaneous clearance (two-thirds), HCV treatment uptake was 23% (21/91) (HIV-negative: 29% (20/68); HIV-positive: 4% (1/23)), which was lower than among methadone/heroin recipients and particularly non-IDUs within the SCCS/SHCS, but higher than in the mainly psychiatrically focused heroin substitution program in Bern (8%). Sustained virological response (SVR) rates were comparable in all settings (overall: 50%; genotype 1: 35-40%; genotype 3: two-thirds). In St. Gallen, the median delay from the estimated date of infection (start of IDU) to first diagnosis was 10 years, and from diagnosis to treatment another 7.5 years. CONCLUSIONS: Future efforts need to focus on earlier HCV diagnosis and improved treatment uptake among patients in drug substitution programs, particularly if patients are HIV-co-infected. New potent drugs might facilitate the decision to initiate treatment.
Abstract:
Background: Current guidelines underline the limitations of existing instruments to assess fitness to drive and the poor adaptability of batteries of neuropsychological tests to primary care settings. Aims: To provide a free, reliable, transparent computer-based instrument capable of detecting the effects of age or drugs on visual processing and cognitive functions. Methods: Relying on systematic reviews of neuropsychological tests and driving performance, we conceived four new computerized tasks measuring visual processing (Task 1), movement attention shift (Task 2), executive response, alerting and orientation gain (Task 3), and spatial memory (Task 4). We then planned five studies to test MedDrive's reliability and validity. Study 1 defined instructions and learning functions, collecting data from 105 senior drivers attending an automobile club course. Study 2 assessed concurrent validity for detecting mild cognitive impairment (MCI) against the useful field of view (UFOV) test in 120 new senior drivers. Study 3 collected data from 200 healthy drivers aged 20-90 years to model normal age-related cognitive decline. Study 4 measured MedDrive's reliability by having 21 healthy volunteers repeat the tests five times. Study 5 tested MedDrive's responsiveness to alcohol in a randomised, double-blind, placebo-controlled, crossover, dose-response validation trial including 20 young healthy volunteers. Results: The instructions were well understood and accepted by all senior drivers. Measures of visual processing (Task 1) performed better than the UFOV in detecting MCI (ROC AUC 0.770 vs. 0.620; p = 0.048). MedDrive was capable of explaining 43.4% of changes occurring with natural cognitive decline. In young healthy drivers, learning effects became negligible from the third session onwards for all tasks except dual tasking (ICC = 0.769). All measures except alerting and orientation gain were affected by blood alcohol concentration.
Finally, MedDrive was able to explain 29.3% of the variance in swerving on the driving simulator. Discussion and conclusions: MedDrive shows improved performance compared with existing computerized neuropsychological tasks. It shows promising results for both clinical and research purposes.
Abstract:
PURPOSE: The perioperative treatment of patients on dual antiplatelet therapy after myocardial infarction, cerebrovascular event or coronary stent implantation represents an increasingly frequent issue for urologists and anesthesiologists. We assess the current scientific evidence and propose strategies for the treatment of these patients. MATERIALS AND METHODS: A MEDLINE and PubMed search was conducted for articles related to antiplatelet therapy after myocardial infarction, coronary stenting and cerebrovascular events, as well as the use of aspirin and/or clopidogrel in the context of surgery. RESULTS: Early discontinuation of antiplatelet therapy for secondary prevention is associated with a high risk of coronary thrombosis, which is further increased by the hypercoagulable state induced by surgery. Aspirin has recently been recommended as lifelong therapy. Clopidogrel is mandatory for 6 weeks after myocardial infarction and bare-metal stent implantation, and for 12 months after drug-eluting stent implantation. Surgery must be postponed beyond these waiting periods or performed with the patient on dual antiplatelet therapy, because withdrawal increases the risk of postoperative myocardial infarction, stent thrombosis or death 5- to 10-fold. The shorter the interval between revascularization and surgery, the greater the risk of adverse cardiac events. The risk of surgical hemorrhage is increased by approximately 20% with aspirin and 50% with clopidogrel. CONCLUSIONS: The risk of coronary thrombosis when antiplatelet agents are withdrawn before surgery is generally higher than the risk of surgical hemorrhage when antiplatelet agents are maintained. However, this issue has not yet been sufficiently evaluated in urological patients, and in many instances of urological surgery the risk of bleeding can be dangerous.
A thorough dialogue among surgeon, cardiologist and anesthesiologist is essential to determine all risk factors and define the best possible strategy for each patient.
Abstract:
Adolescence, defined as a transition phase toward autonomy and independence, is a natural time of learning and adjustment, particularly in the setting of long-term goals and personal aspirations. It is also a period of heightened sensation seeking, including risk taking and reckless behaviors, which is a major cause of morbidity and mortality among teenagers. Recent observations suggest that a relative immaturity of frontal cortical neural systems may underlie the adolescent propensity for uninhibited risk taking and hazardous behaviors. However, converging preclinical and clinical studies do not support a simple model of frontal cortical immaturity, and there is substantial evidence that adolescents engage in dangerous activities, including drug abuse, despite knowing and understanding the risks involved. Therefore, the current consensus is that much brain development during adolescence occurs in brain regions and systems critically involved in the perception and evaluation of risk and reward, leading to important changes in social and affective processing. Hence, rather than naive, immature and vulnerable, the adolescent brain, particularly the prefrontal cortex, should be considered prewired for expecting novel experiences. In this perspective, thrill seeking may not represent a danger but rather a window of opportunity permitting the development of cognitive control through multiple experiences. However, if the maturation of brain systems implicated in self-regulation is contextually dependent, it is important to understand which experiences matter most. In particular, it is essential to unveil the mechanisms by which recurrent adverse episodes of stress or unrestricted access to drugs can shape the adolescent brain and potentially trigger life-long maladaptive responses.