943 results for intravenous drug users
Abstract:
Objectives - Treatment of established status epilepticus (SE) requires immediate intravenous anticonvulsant therapy. Currently used first-line drugs may cause potentially hazardous side effects. We aimed to assess the efficacy and safety of intravenous lacosamide (LCM) in SE after failure of standard treatment. Methods - We retrospectively analyzed 39 patients (21 women, 18 men, median age 62 years) from the hospital databases of five neurological departments in Germany, Austria and Switzerland between September 2008 and January 2010 who were admitted in SE and received at least one dose of intravenous LCM. Results - Types of SE were generalized convulsive (n = 6), complex partial (n = 17) and simple partial (n = 16). LCM was administered after failure of benzodiazepines or other standard drugs in all but one case. Median bolus dose of LCM was 400 mg (range 200-400 mg), which was administered at 40-80 mg/min in those patients for whom the infusion rate was documented. SE stopped after LCM in 17 patients, while 22 patients needed further anticonvulsant treatment. The success rate in patients receiving LCM as the first or second drug was 3/5, as the third drug 11/19, and as the fourth or later drug 3/15. In five subjects, SE could not be terminated at all. No serious adverse events attributed to LCM were documented. Conclusions - Intravenous LCM may be an alternative treatment for established SE after failure of standard therapy, or when standard agents are considered unsuitable.
Abstract:
Background: The objective of this study was to determine if mental health and substance use diagnoses were equally detected in frequent users (FUs) compared to infrequent users (IUs) of emergency departments (EDs). Methods: In a sample of 399 adult patients (>= 18 years old) admitted to a teaching hospital ED, we compared the mental health and substance use disorder diagnoses established clinically and recorded in the medical files by the ED physicians with data obtained in face-to-face research interviews using the Primary Care Evaluation of Mental Disorders (PRIME-MD) and the Alcohol, Smoking and Substance Involvement Screening Test (ASSIST). Between November 2009 and June 2010, 226 FUs (>4 visits within a year) who attended the ED were included, and 173 IUs (<= 4 visits within a year) were randomly selected from a pool of identified patients to comprise the comparison group. Results: For mental health disorders identified by the PRIME-MD, FUs were more likely than IUs to have an anxiety (34 vs. 16%, Chi2(1) = 16.74, p <0.001), depressive (47 vs. 25%, Chi2(1) = 19.11, p <0.001) or posttraumatic stress (PTSD) disorder (11 vs. 5%, Chi2(1) = 4.87, p = 0.027). Only 3/76 FUs (4%) with an anxiety disorder, 16/104 FUs (15%) with a depressive disorder and none of the 24 FUs with PTSD were detected by the ED medical staff. None of the 27 IUs with an anxiety disorder, 6/43 IUs (14%) with a depressive disorder and none of the 8 IUs with PTSD were detected. For substance use disorders identified by the ASSIST, FUs were more at risk than IUs for alcohol (24 vs. 7%, Chi2(1) = 21.12, p <0.001) and drug abuse/dependence (36 vs. 25%, Chi2(1) = 5.52, p = 0.019). Of the FUs, 14/54 (26%) using alcohol and 8/81 (10%) using drugs were detected by the ED physicians. Of the IUs, 5/12 (41%) using alcohol and none of the 43 using drugs were detected.
Overall, there was no significant difference in the rate of detection of mental health and substance use disorders between FUs and IUs (Fisher's Exact Test: anxiety, p = 0.567; depression, p = 1.000; PTSD, p = 1.000; alcohol, p = 0.517; and drugs, p = 0.053). Conclusions: While the prevalence of mental health and substance use disorders was higher among FUs, the rates of detection were not significantly different for FUs vs. IUs. However, it may be that drug disorders among FUs were more likely to be detected.
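The group comparisons above are Pearson chi-square tests on 2x2 contingency tables. As a minimal sketch of that computation (the counts are reconstructed from the reported percentages and denominators, e.g. 76/226 FUs vs. 27/173 IUs with an anxiety disorder, so the statistic only approximates the published Chi2(1) = 16.74):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Closed-form identity for 2x2 tables:
    # chi2 = n * (ad - bc)^2 / (row and column total products)
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Anxiety disorder, FUs vs. IUs (reconstructed counts):
chi2 = chi2_2x2(76, 226 - 76, 27, 173 - 27)  # close to the reported 16.74
```

The small discrepancy from the published value comes from rounding the percentages back to integer counts, not from the formula.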
Abstract:
Rationale: Treatment of status epilepticus (SE) usually requires intravenous anticonvulsant therapy. Although there are established drugs of first choice for its treatment, potentially hazardous side effects of these agents are not uncommon. Lacosamide (LCM) is a novel anticonvulsant drug that is available as an infusion solution. LCM could be an alternative for treatment of SE when the standard drugs fail or should be avoided. Methods: We retrospectively identified patients from the hospital databases of two German and one Swiss neurological departments (University Hospital Marburg, Klinikum Osnabrueck, University Hospital Lausanne) between September 1st 2008 and May 22nd 2009 who were admitted because of SE and received at least one dose of intravenous LCM for treatment of SE. Results: Seventeen patients (11 female, 6 male) were identified. Median age was 71 years. Three patients suffered from generalized convulsive SE, 8 patients had significant reduction of awareness with or without subtle motor symptoms, and 6 patients had a simple focal status without relevant reduction of awareness. Etiology was acute symptomatic in 5 patients, remote symptomatic without pre-existing epilepsy in 6 patients, remote symptomatic with pre-existing epilepsy in 5 patients, and unknown in 1 patient. LCM was administered after failure of first-line therapy in all cases. The first LCM bolus was 400 mg in 13 patients and 200 mg in 4 patients. LCM administration stopped SE in 7 patients. In 2 of them, LCM was administered immediately after benzodiazepine administration, in the others after failure of benzodiazepines and other first-line and/or second-line drugs. In 3 patients, SE was terminated by other anticonvulsants such as phenytoin, phenobarbital or oxcarbazepine. In 5 patients, SE could only be terminated by intubation and application of high-dose midazolam, propofol and/or thiopental. In 2 patients, SE could not be terminated in spite of high doses of barbiturates.
There was no serious adverse event documented that could possibly be attributed to LCM. Conclusions: Intravenous LCM may be an alternative treatment for SE after failure of benzodiazepines and other established drugs, or when such agents are considered unsuitable.
Abstract:
This article tries to reconcile economic-industrial policy with health policy in dealing with biomedical innovation and welfare state sustainability. Better health accounts for an increasingly large proportion of welfare improvements. An explanation is given of the welfare losses arising from the fact that industrial and health policy tend to ignore each other. Drug prices that reflect relative effectiveness send the right signal to the industry, rewarding innovation with an impact on quantity and quality of life, and to the buyers of health care services. The level of public reimbursement of drugs indicates the social willingness to pay of the different national health systems, not only through inclusion in, or rejection from, the basket of covered services, but especially by establishing the proportion of the price that is to be financed publicly. Reference pricing for therapeutic equivalents, as the upper limit of the social willingness to pay, and two-tiered co-payments for users (avoidable and inversely related to the incremental effectiveness of the drug) are deemed appropriate for those countries concerned with both increasing their productivity and maintaining their welfare state. Profits drive R&D but not its location. There is no intrinsic contradiction between high productivity and a consolidated National Health Service (welfare state), as the Nordic European countries demonstrate every day.
Abstract:
OBJECTIVE: The aim of this study was to evaluate a French language version of the Adolescent Drug Abuse Diagnosis (ADAD) instrument in a Swiss sample of adolescent illicit drug and/or alcohol users. PARTICIPANTS AND SETTING: The participants in the study were 102 French-speaking adolescents aged 13-19 years who met the criteria of illicit drug or alcohol use (at least one substance--except tobacco--once a week during the last 3 months). They were recruited in hospitals, institutions and leisure places. PROCEDURE: The ADAD was administered individually by trained psychologists. It was integrated into a broader protocol including alcohol and drug abuse DSM-IV diagnoses, the BDI-13 (Beck Depression Inventory), life events and treatment trajectories. RESULTS: The ADAD appears to show good inter-rater reliability; the subscales showed good internal coherence and the correlations between the composite scores and the severity ratings were moderate to high. Finally, the results confirmed good concurrent validity for three out of eight ADAD dimensions. CONCLUSIONS: The French language version of the ADAD appears to be an adequate instrument for assessing drug use and associated problems in adolescents. Despite its complexity, the instrument has acceptable validity, reliability and usefulness criteria, enabling international and transcultural comparisons.
Abstract:
OBJECTIVES: To compare the use of co-medication, the potential drug-drug interactions (PDDIs) and the effect on antiretroviral therapy (ART) tolerability and efficacy in HIV-infected individuals according to age, ≥ 50 years or <50 years. METHODS: All ART-treated participants were prospectively included once during a follow-up visit of the Swiss HIV Cohort Study. Information on any current medication was obtained by participant self-report and medical prescription history. The complete treatment was subsequently screened for PDDIs using a customized version of the Liverpool drug interaction database. RESULTS: Drug prescriptions were analysed for 1497 HIV-infected individuals: 477 age ≥ 50 and 1020 age <50. Older patients were more likely to receive one or more co-medications compared with younger patients (82% versus 61%; P < 0.001) and thus had more frequent PDDIs (51% versus 35%; P < 0.001). Furthermore, older patients tended to use a higher number of co-medications and certain therapeutic drug classes more often, such as cardiovascular drugs (53% versus 19%; P < 0.001), gastrointestinal medications (10% versus 6%; P = 0.004) and hormonal agents (6% versus 3%; P = 0.04). PDDIs with ART occurred mainly with cardiovascular drugs (27%), CNS agents (22%) and methadone (6%) in older patients and with CNS agents (27%), methadone (15%) and cardiovascular drugs (11%) in younger patients. The response to ART did not differ between the two groups. CONCLUSIONS: The risk for PDDIs with ART increased in older patients who take more drugs than their younger HIV-infected counterparts. However, medication use in older and younger patients did not differ in terms of effect on antiretroviral tolerability and response.
Abstract:
BACKGROUND: Increasingly, patients receiving methadone treatment are found in low threshold facilities (LTF), which provide needle exchange programmes in Switzerland. This paper identifies the characteristics of LTF attendees receiving methadone treatment (MT) compared with other LTF attendees (non-MT). METHODS: A national cross-sectional survey was conducted in 2006 over five consecutive days in all LTF (n=25). Attendees were given an anonymous questionnaire collecting information on socio-demographic indicators, drug consumption, injection, methadone treatment, and self-reported HIV and HCV status. Univariate analysis and logistic regression were performed to compare MT to non-MT. The response rate was 66% (n=1128). RESULTS: MT comprised 57.6% of the sample. In multivariate analysis, factors associated with being on MT were older age (OR: 1.38), being female (OR: 1.60), having one's own accommodation (OR: 1.56), receiving public assistance (OR: 2.29), lifetime injecting (OR: 2.26), HIV-positive status (OR: 2.00), and having consumed cocaine during the past month (OR: 1.37); MT were less likely to have consumed heroin in the past month (OR: 0.76, not significant) and were less likely to visit the LTF on a daily basis (OR: 0.59). The number of injections during the past week was not associated with MT. CONCLUSIONS: More LTF attendees were in the MT group, bringing to light an underappreciated LTF clientele with specific needs. The MT group's consumption profile may reflect therapeutic failure or deficits in treatment quality; it is necessary to acknowledge this and to strengthen LTF personnel's awareness of the potential needs of MT attendees so that therapeutic goals can be met.
Abstract:
Background and objective: Patients in the ICU often receive many intravenous (iv) drugs at the same time. Even with three-lumen central venous catheters, the administration of more than one drug in the same iv line (IVL) is frequently necessary. The objective of this study was to observe how nurses managed to administer these many medications and to evaluate the proportion of two-drug associations (TDA) that are compatible or incompatible, based on known compatibility data. Design: Observational prospective study over 4 consecutive months. All patients simultaneously receiving more than one drug in the same IVL (Y-site injection or mixed in the same container) were included. For each patient, all iv drugs were recorded, as well as concentration, infusion solution, location on the IVL system, and time, rate and duration of administration. For each association of two or more drugs, each drug was checked for compatibility with every other. Compatibilities between these pairs of drugs were assessed using published data (mainly Trissel LA. Handbook on Injectable Drugs and Trissel's Tables of Physical Compatibility) and visual tests performed in our quality control laboratory. Setting: A 34-bed adult ICU of a university hospital. Main outcome measures: Percentage of compatibilities and incompatibilities between drugs administered in the same IVL. Results: We observed 1,913 associations of drugs administered together in the same IVL, 783 involving only two drugs. The average number of drugs per IVL was 3.1 ± 0.8 (range: 2-9). 83.2% of the drugs were given by continuous infusion, 14.3% by intermittent infusion and 2.5% as a bolus. The observed associations yielded 8,421 pairs (71.7% drug-drug and 28.3% drug-solute). According to literature data, 80.2% of the associations were considered compatible and 4.4% incompatible; 15.4% were not interpretable because of differences between local practices and the conditions described in the literature (drug concentration, solute, etc.) or because of a lack of data. After laboratory tests performed on the most used drugs (furosemide, KH2PO4, morphine HCl, etc.), the proportion of compatible TDA rose to 85.7%, the incompatible proportion stood at 4.6% and only 9.7% remained unknown or not interpretable. Conclusions: Nurses managed the administration of iv medications quite well, as fewer than 5% of the observed TDA were considered incompatible. However, the 10% of TDA with unavailable compatibility data should also have been avoided, since the consequences of their concomitant administration cannot be predicted. For practical reasons, drugs were analysed only in pairs, which constitutes the main limitation of this work. Since the average number of drugs in one association was three, laboratory tests are currently being performed to evaluate some of the most frequently observed three-drug associations.
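Checking compatibility "only in pairs", as the authors note, makes the workload combinatorial but tractable: an association of k drugs in one line requires C(k, 2) two-by-two checks. A small illustration of that count (generic values, not the study's raw data):

```python
from math import comb

def pairwise_checks(n_items):
    """Number of two-by-two compatibility checks needed for
    n_items drugs/solutes running through the same IV line."""
    return comb(n_items, 2)

# An association of 2 drugs needs 1 check, 3 drugs need 3 checks,
# and 9 drugs (the observed maximum per line) need 36:
checks = [pairwise_checks(k) for k in (2, 3, 9)]  # [1, 3, 36]
```

This is why the pairwise simplification scales, and also why it is a limitation: a three-drug mixture can be incompatible even when all three of its pairs are individually compatible.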
Abstract:
Background and Purpose - Demographic changes will result in a rapid increase of patients age >= 90 years (nonagenarians), but little is known about outcomes in these patients after intravenous thrombolysis (IVT) for acute ischemic stroke. We aimed to assess safety and functional outcome in nonagenarians treated with IVT and to compare the outcomes with those of patients age 80 to 89 years (octogenarians). Methods - We analyzed prospectively collected data of 284 consecutive stroke patients age >= 80 years treated with IVT in 7 Swiss stroke units. Presenting characteristics, favorable outcome (modified Rankin scale [mRS] 0 or 1), mortality at 3 months, and symptomatic intracranial hemorrhage (SICH) using the National Institute of Neurological Disorders and Stroke (NINDS) and Safe Implementation of Thrombolysis in Stroke-Monitoring Study (SITS-MOST) criteria were compared between nonagenarians and octogenarians. Results - As compared with octogenarians (n=238; mean age, 83 years), nonagenarians (n=46; mean age, 92 years) were more often women (70% versus 54%; P=0.046) and had lower systolic blood pressure (161 mm Hg versus 172 mm Hg; P=0.035). Patients age >= 90 years less often had a favorable outcome and had a higher incidence of mortality than did patients age 80 to 89 years (14.3% versus 30.2%; P=0.034; and 45.2% versus 22.1%; P=0.002; respectively), while more nonagenarians than octogenarians experienced a SICH (SICH per NINDS criteria, 13.3% versus 5.9%; P=0.106; SICH per SITS-MOST criteria, 13.3% versus 4.7%; P=0.037). Multivariate adjustment identified age >= 90 years as an independent predictor of mortality (P=0.017). Conclusions - Our study suggests less favorable outcomes in nonagenarians as compared with octogenarians after IVT for ischemic stroke, and it demands careful selection for treatment, unless randomized controlled trials yield more evidence for IVT in very old stroke patients. (Stroke. 2011; 42: 1967-1970.)
Abstract:
BACKGROUND AND PURPOSE: Onset-to-reperfusion time (ORT) has recently emerged as an essential prognostic factor in acute ischemic stroke therapy. Although favorable outcome is associated with reduced ORT, it remains unclear whether intracranial bleeding depends on ORT. We therefore sought to determine whether ORT influenced the risk and volume of intracerebral hemorrhage (ICH) after combined intravenous and intra-arterial therapy. METHODS: Based on our prospective registry, we included 157 consecutive acute ischemic stroke patients successfully recanalized with combined intravenous and intra-arterial therapy between April 2007 and October 2011. The primary outcome was any ICH within 24 hours posttreatment. Secondary outcomes included occurrence of symptomatic ICH (sICH) and ICH volume measured with the ABC/2 method. RESULTS: Any ICH occurred in 26% of the study sample (n=33). sICH occurred in 5.5% (n=7). Median ICH volume was 0.8 mL. ORT was increased in patients with ICH (median=260 minutes; interquartile range=230-306) compared with patients without ICH (median=226 minutes; interquartile range=200-281; P=0.008). In the setting of sICH, ORT reached a median of 300 minutes (interquartile range=276-401; P=0.004). The difference remained significant after adjustment for potential confounding factors (adjusted P=0.045 for ICH; adjusted P=0.002 for sICH). There was no correlation between ICH volume and ORT (r=0.16; P=0.33). CONCLUSIONS: ORT influences the rate but not the volume of ICH and appears to be a critical predictor of symptomatic hemorrhage after successful combined intravenous and intra-arterial therapy. To minimize the risk of bleeding, revascularization should be achieved within 4.5 hours of stroke onset.
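The ABC/2 method used here for ICH volume is the standard bedside ellipsoid approximation from CT: A and B are the largest perpendicular hemorrhage diameters on the index slice, and C is the vertical extent (number of slices showing hemorrhage times slice thickness). A minimal sketch (the example measurements are illustrative, not taken from the registry):

```python
def abc_over_2(a_cm, b_cm, c_cm):
    """ABC/2 ellipsoid approximation of hematoma volume in mL.
    a_cm: largest hemorrhage diameter on the index CT slice (cm)
    b_cm: largest diameter perpendicular to A on the same slice (cm)
    c_cm: vertical extent = slices with hemorrhage x slice thickness (cm)
    1 cm^3 corresponds to 1 mL."""
    return a_cm * b_cm * c_cm / 2.0

# Illustrative small bleed, 2.0 x 1.0 x 0.8 cm:
vol_ml = abc_over_2(2.0, 1.0, 0.8)  # 0.8 mL, on the order of the reported median
```

The formula is simply half the bounding box, which approximates the volume of an ellipsoid (pi/6 ≈ 0.52 of the box).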
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs. In addition, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools would possibly not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability and automated report generation.
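The "Bayesian a posteriori dosage adaptation" that the benchmark scores can be sketched as a maximum a posteriori (MAP) estimate of an individual PK parameter, balancing a population prior against one measured concentration. The one-compartment model, priors and numbers below are illustrative assumptions only, not taken from any of the evaluated programs (MwPharm, TCIWorks, etc.), which use far richer models:

```python
import math

def map_clearance(c_obs, dose, tau, v, cl_prior, cl_sd, obs_sd):
    """Grid-search MAP estimate of clearance CL (L/h) from one
    steady-state trough, assuming a one-compartment IV bolus model:
        C_trough = (dose / V) * exp(-k*tau) / (1 - exp(-k*tau)),  k = CL / V
    Prior on CL and residual error are both taken as normal; the 1/2
    factor in the log densities is dropped (it does not change the argmax)."""
    best_cl, best_lp = None, -math.inf
    for i in range(1, 400):
        cl = i * 0.05                       # grid from 0.05 to 19.95 L/h
        k = cl / v
        c_pred = dose / v * math.exp(-k * tau) / (1 - math.exp(-k * tau))
        # log prior (normal on CL) + log likelihood (normal residual)
        lp = -((cl - cl_prior) / cl_sd) ** 2 - ((c_obs - c_pred) / obs_sd) ** 2
        if lp > best_lp:
            best_cl, best_lp = cl, lp
    return best_cl

# Hypothetical case: 500 mg every 12 h, V = 30 L, measured trough 1.2 mg/L,
# population prior CL ~ N(4.0, 1.5) L/h, residual SD 0.3 mg/L.
cl_hat = map_clearance(c_obs=1.2, dose=500, tau=12, v=30,
                       cl_prior=4.0, cl_sd=1.5, obs_sd=0.3)
# A new maintenance dose for a target trough is then scaled from cl_hat.
```

Because the measured trough is lower than the prior predicts, the posterior estimate lands between the population value and the concentration-only fit, which is exactly the shrinkage behaviour these tools rely on when only sparse samples are available.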
Abstract:
Azoles are widely used in antifungal therapy in medicine. Resistance to azoles can occur in Candida albicans principally by overexpression of multidrug transporter gene CDR1, CDR2, or MDR1 or by overexpression of ERG11, which encodes the azole target. The expression of these genes is controlled by the transcription factors (TFs) TAC1 (involved in the control of CDR1 and CDR2), MRR1 (involved in the control of MDR1), and UPC2 (involved in the control of ERG11). Several gain-of-function (GOF) mutations are present in hyperactive alleles of these TFs, resulting in the overexpression of target genes. While these mutations are beneficial to C. albicans survival in the presence of the antifungal drugs, their effects could potentially alter the fitness and virulence of C. albicans in the absence of the selective drug pressure. In this work, the effect of GOF mutations on C. albicans virulence was addressed in a systemic model of intravenous infection by mouse survival and kidney fungal burden assays. We engineered a set of strains with identical genetic backgrounds in which hyperactive alleles were reintroduced in one or two copies at their genomic loci. The results obtained showed that neither TAC1 nor MRR1 GOF mutations had a significant effect on C. albicans virulence. In contrast, the presence of two hyperactive UPC2 alleles in C. albicans resulted in a significant decrease in virulence, correlating with diminished kidney colonization compared to that by the wild type. In agreement with the effect on virulence, the decreased fitness of an isolate with UPC2 hyperactive alleles was observed in competition experiments with the wild type in vivo but not in vitro. Interestingly, UPC2 hyperactivity delayed filamentation of C. albicans after phagocytosis by murine macrophages, which may at least partially explain the virulence defects. Combining the UPC2 GOF mutation with another hyperactive TF did not compensate for the negative effect of UPC2 on virulence. 
In conclusion, among the major TFs involved in azole resistance, only UPC2 had a negative impact on virulence and fitness, which may therefore have consequences for the epidemiology of antifungal resistance.
Abstract:
The hypothesis was tested that oral antibiotic treatment in children with acute pyelonephritis and scintigraphy-documented lesions is as efficacious as sequential intravenous/oral therapy with respect to the incidence of renal scarring. A randomised multi-centre trial was conducted in 365 children aged 6 months to 16 years with bacterial growth in cultures from urine collected by catheter. The children were assigned to receive either oral ceftibuten (9 mg/kg once daily) for 14 days or intravenous ceftriaxone (50 mg/kg once daily) for 3 days followed by oral ceftibuten for 11 days. Only patients with lesions detected on acute-phase dimercaptosuccinic acid (DMSA) scintigraphy underwent follow-up scintigraphy. Efficacy was evaluated by the rate of renal scarring after 6 months on follow-up scintigraphy. Of 219 children with lesions on acute-phase scintigraphy, 152 completed the study; 80 (72 females, median age 2.2 years) were given ceftibuten and 72 (62 females, median age 1.6 years) were given ceftriaxone/ceftibuten. Patients in the intravenous/oral group had significantly higher C-reactive protein (CRP) concentrations at baseline and larger lesion(s) on acute-phase scintigraphy. Follow-up scintigraphy showed renal scarring in 21/80 children treated with ceftibuten and 33/72 treated with ceftriaxone/ceftibuten (p = 0.01). However, after adjustment for the confounding variables (CRP and size of acute-phase lesion), no significant difference was observed for renal scarring between the two groups (p = 0.2). Renal scarring correlated with the extent of the acute-phase lesion (r = 0.60, p < 0.0001) and the grade of vesico-ureteric reflux (r = 0.31, p = 0.03), and was more frequent in refluxing renal units (p = 0.04). The majority of patients, i.e. 44 in the oral group and 47 in the intravenous/oral group, were managed as out-patients. Side effects were not observed.
From this study, we can conclude that once-daily oral ceftibuten for 14 days yielded comparable results to sequential ceftriaxone/ceftibuten treatment in children aged 6 months to 16 years with DMSA-documented acute pyelonephritis and it allowed out-patient management in the majority of these children.
Abstract:
The major problems associated with the use of corticosteroids for the treatment of ocular diseases are their poor intraocular penetration to the posterior segment when administered locally and their secondary side effects when given systemically. To circumvent these problems, more efficient methods and techniques of local delivery are being developed. The purposes of this study were: (1) to investigate the pharmacokinetics of intraocular penetration of hemisuccinate methylprednisolone (HMP) after its delivery using the transscleral Coulomb controlled iontophoresis (CCI) system applied to the eye or after intravenous (i.v.) injection in the rabbit, (2) to test the safety of the CCI system for the treated eyes and (3) to compare the pharmacokinetic profiles of HMP intraocular distribution after CCI delivery to i.v. injection. For each parameter evaluated, six rabbit eyes were used. For the CCI system, two concentrations of HMP (62.5 and 150 mg/ml), various intensities of current and durations of treatment were analyzed. In rabbits serving as controls, the HMP was infused in the CCI device but without applied electric current. For the i.v. delivery, HMP at 10 mg/kg as a 62.5 mg/ml solution was used. The rabbits were observed clinically for evidence of ocular toxicity. At various time points after the administration of drug, rabbits were killed and intraocular fluids and tissues were sampled for methylprednisolone (MP) concentrations by high-pressure liquid chromatography (HPLC). Histological examinations were performed on six eyes of each group. Among groups that received CCI, the concentrations of MP increased in all ocular tissues and fluids in relation to the intensities of current used (0.4, 1.0 and 2.0 mA/0.5 cm²) and its duration (4 and 10 min). Sustained and highest levels of MP were achieved in the choroid and the retina of rabbit eyes treated with the highest current and 10 min duration of CCI.
No clinical toxicity or histological lesions were observed following CCI. Negligible amounts of MP were found in ocular tissues in the CCI control group without application of current. Compared to i.v. administration, CCI achieved higher and more sustained tissue concentrations with negligible systemic absorption. These data demonstrate that high levels of MP can be safely achieved in intraocular tissues and fluids of the rabbit eye using CCI. With this system, intraocular tissue levels of MP are higher than those achieved after i.v. injection. Furthermore, if needed, the drug levels achieved with CCI can be modulated as a function of current intensity and duration of treatment. CCI could therefore be used as an alternative method for the delivery of high levels of MP to the intraocular tissues of both the anterior and posterior segments.