13 results for Adverse Drug Reaction Reporting Systems
at Université de Lausanne, Switzerland
Abstract:
BACKGROUND/RATIONALE: Patient safety is a major concern in healthcare systems worldwide. Although most safety research has been conducted in the inpatient setting, evidence indicates that medical errors and adverse events are a threat to patients in the primary care setting as well. Since information about the frequency and outcomes of safety incidents in primary care is required, the goals of this study are to describe the type, frequency, seasonal and regional distribution of medication incidents in primary care in Switzerland and to elucidate possible risk factors for medication incidents. METHODS AND ANALYSIS: STUDY DESIGN AND SETTING: We will conduct a prospective surveillance study to identify cases of medication incidents among primary care patients in Switzerland over the course of the year 2015. PARTICIPANTS: Patients undergoing drug treatment by 167 general practitioners or paediatricians reporting to the Swiss Federal Sentinel Reporting System. INCLUSION CRITERIA: Any erroneous event, as defined by the physician, related to the medication process and interfering with normal treatment course. EXCLUSION CRITERIA: Lack of treatment effect, adverse drug reactions or drug-drug or drug-disease interactions without detectable treatment error. PRIMARY OUTCOME: Medication incidents. RISK FACTORS: Age, gender, polymedication, morbidity, care dependency, hospitalisation. STATISTICAL ANALYSIS: Descriptive statistics to assess type, frequency, seasonal and regional distribution of medication incidents and logistic regression to assess their association with potential risk factors. Estimated sample size: 500 medication incidents. LIMITATIONS: We will take into account under-reporting and selective reporting, among others, as potential sources of bias or imprecision when interpreting the results. ETHICS AND DISSEMINATION: No formal request was necessary because of fully anonymised data.
The results will be published in a peer-reviewed journal. TRIAL REGISTRATION NUMBER: NCT0229537.
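The planned analysis pairs descriptive statistics with logistic regression on risk factors such as age, gender and polymedication. As a rough illustration of the modelling step only — the covariate, threshold and data below are invented for the sketch and are not taken from the study:

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=500):
    """Single-covariate logistic regression fitted by batch gradient descent.
    Returns (intercept, slope) minimizing the Bernoulli negative log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y          # gradient of the negative log-likelihood
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Hypothetical data: number of concurrent drugs vs. whether an incident occurred.
drugs = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
incident = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
b0, b1 = fit_logistic(drugs, incident)
odds_ratio = math.exp(b1)   # odds ratio per additional drug; > 1 for this data
```

In the real study the model would include all listed risk factors jointly; this sketch only shows the shape of the association being estimated.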
Abstract:
Reduced renal function has been reported with tenofovir disoproxil fumarate (TDF). It is not clear whether TDF co-administered with a boosted protease inhibitor (PI) leads to a greater decline in renal function than TDF co-administered with a non-nucleoside reverse transcriptase inhibitor (NNRTI). Methods: We selected all antiretroviral therapy-naive patients in the Swiss HIV Cohort Study (SHCS) with calibrated or corrected serum creatinine measurements starting antiretroviral therapy with TDF and either efavirenz (EFV) or the ritonavir-boosted PIs lopinavir (LPV/r) or atazanavir (ATV/r). As a measure of renal function, we used the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation to estimate the glomerular filtration rate (eGFR). We calculated the difference in eGFR over time between two therapies using a marginal model for repeated measures. In weighted analyses, observations were weighted by the product of their treatment and censoring weights to adjust for differences both in the sort of patients starting each therapy and in the sort of patients remaining on each therapy over time. Results: By March 2011, 940 patients with at least one creatinine measurement on a first therapy with either TDF and EFV (n=484), TDF and LPV/r (n=269) or TDF and ATV/r (n=187) had been followed for a median of 1.7, 1.2 and 1.3 years, respectively. Table 1 shows the difference in average eGFR over time since starting cART for two marginal models. The first model was not adjusted for potential confounders; the second model used weights to adjust for confounders. The results suggest a greater decline in renal function during the first 6 months if TDF is used with a PI rather than with an NNRTI, but no further difference between these therapies after the first 6 months.
TDF and ATV/r may lead to a greater decline in the first 6 months than TDF and LPV/r. Conclusions: TDF co-administered with a boosted PI leads to a greater decline in renal function over the first 6 months of therapy than TDF co-administered with an NNRTI; this decline may be worse with ATV/r than with LPV/r.
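The renal endpoint above is the eGFR from the CKD-EPI creatinine equation. As a minimal sketch, the 2009 CKD-EPI formula can be written as follows; the coefficients are the published 2009 values (including the race term in use at the time), and this is not code from the SHCS analysis:

```python
def ckd_epi_egfr(scr_mg_dl, age_years, female, black=False):
    """Estimate GFR (mL/min/1.73 m^2) with the 2009 CKD-EPI creatinine equation:
    eGFR = 141 * min(Scr/k, 1)^a * max(Scr/k, 1)^-1.209 * 0.993^age
           * 1.018 [if female] * 1.159 [if black, per the 2009 formulation]."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age_years
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr
```

For example, a 40-year-old man with a serum creatinine of 0.9 mg/dL gets an eGFR of roughly 106, and doubling the creatinine drops the estimate sharply, which is the decline the marginal models track over time.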
Abstract:
Background: Switzerland was the first country to approve certolizumab pegol (Cimzia, CZP) for the treatment of patients with moderate to severe Crohn's disease (CD), in September 2007. This phase IV study aimed to evaluate the efficacy and safety of CZP in a Swiss multicenter cohort of practice-based patients. Methods: Baseline and Week 6 evaluation questionnaires were sent to all Swiss gastroenterologists in hospitals and private practices. Disease activity was assessed with the Harvey-Bradshaw Index (HBI) and adverse events were evaluated according to WHO guidelines. Results: Fifty patients (31 women, 19 men) were included; 56% had complicated disease (stricture or fistula) and 52% had undergone prior CD-related surgery. All patients had prior exposure to systemic steroids, 96% to immunomodulators, 78% to infliximab, and 50% to adalimumab. A significant decrease in HBI was observed at Week 6 (versus Week 0) following induction therapy with CZP 400 mg subcutaneously at Weeks 0, 2, and 4 (12.6 +/- 4.7 at Week 0 versus 6.2 +/- 4.4 at Week 6, P < 0.001). Response and remission rates at Week 6 were 54% and 40%, respectively. Among 11 CD patients with fistulizing disease, 8 achieved a 50% fistula response (P = 0.021). The frequency of adverse drug reactions attributed to CZP was 6%. CZP was continued in 80% of patients beyond Week 6. Conclusions: In a population of CD patients with complicated disease behavior, CZP induced a response and remission in 54% and 40% of patients, respectively. This series provides the first evidence of the effectiveness of CZP in perianal fistulizing CD.
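Disease activity above is scored with the Harvey-Bradshaw Index, a simple sum of five items. A sketch (item ranges per the published index; the remission cut-off HBI ≤ 4 in the code is the commonly used one, assumed here rather than stated in the abstract):

```python
def harvey_bradshaw_index(well_being, abdominal_pain, liquid_stools,
                          abdominal_mass, complications):
    """Harvey-Bradshaw Index = sum of five items:
    general well-being (0-4), abdominal pain (0-3), liquid stools per day (count),
    abdominal mass (0-3), complications (1 point each, e.g. fistula, abscess)."""
    return well_being + abdominal_pain + liquid_stools + abdominal_mass + complications

baseline = harvey_bradshaw_index(3, 2, 5, 1, 2)   # active disease at Week 0
week6 = harvey_bradshaw_index(1, 0, 2, 0, 1)      # after CZP induction
in_remission = week6 <= 4                         # common remission cut-off (assumption)
```

The patient values are invented; the point is only that the index is an unweighted sum, which is what makes it practical for questionnaire-based cohorts like this one.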
Abstract:
As part of the evaluation of the Confederation's measures to reduce drug-related problems, a review of available data on drug use and drug-related problems in Switzerland has been conducted. Sources of data included: population surveys (adults and teenagers), surveys among drug users, health statistics (drug-related and AIDS-related deaths, HIV case reporting, drug treatments) and police statistics (denunciations for consumption). The aim of reducing the number of dependent hard-drug users has been achieved where heroin is concerned. In particular, there seems to have been a decrease in the number of people becoming addicted to this substance. For all other illegal substances, especially cannabis, the trend is towards increased use, as in many European countries. As regards dependent drug users, especially injecting drug users, progress has been made in the areas of harm reduction and treatment coverage. This epidemiological assessment can be used in the ongoing discussions about the revision of the law governing narcotics and will serve as a baseline for future follow-up of the situation.
When the Line is Crossed...: Paths to Control and Sanction Behaviour Necessitating a State Reaction
Abstract:
The article presents a special form of European comparative synopsis. Case examples have been chosen ranging from administrative or minor (criminal) offences to increasingly serious offences and offenders. In this way it can be demonstrated comparatively how the criminal justice systems studied handle specific cases and whether they do so in similar or different ways.
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be assessed against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is attracting growing interest and should improve further, especially in terms of information system interfacing, user-friendliness, data storage capability and report generation.
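The a posteriori Bayesian adjustment these tools perform can be illustrated with a deliberately tiny maximum-a-posteriori (MAP) estimator for a one-compartment IV model with a single steady-state trough. The PK model, parameter values and grid search below are illustrative assumptions for the sketch, not the method of any program reviewed:

```python
import math

def map_clearance(conc_obs, dose, tau, cl_pop, omega, sigma, v=50.0, grid=200):
    """MAP estimate of clearance from one steady-state trough concentration.
    One-compartment repeated IV bolus; prior ln(CL) ~ N(ln(cl_pop), omega^2);
    residual error ln(C_obs) ~ N(ln(C_pred), sigma^2). Grid search for clarity."""
    best_cl, best_lp = cl_pop, -math.inf
    for i in range(1, grid + 1):
        cl = cl_pop * (0.2 + 3.0 * i / grid)   # search 0.2x..3.2x population CL
        ke = cl / v
        # Steady-state trough: C = (D/V) * e^(-ke*tau) / (1 - e^(-ke*tau))
        c_pred = (dose / v) * math.exp(-ke * tau) / (1 - math.exp(-ke * tau))
        lp = (-((math.log(cl) - math.log(cl_pop)) ** 2) / (2 * omega ** 2)
              - ((math.log(conc_obs) - math.log(c_pred)) ** 2) / (2 * sigma ** 2))
        if lp > best_lp:
            best_cl, best_lp = cl, lp
    return best_cl
```

With a hypothetical 1000 mg dose every 12 h and a population clearance of 5 L/h, an observed trough above the population prediction pulls the MAP clearance below 5, and the individualized maintenance dose would shrink proportionally; real TDM software does the same thing with full population models and proper optimizers.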
Abstract:
Intravitreal administration has been in wide use for 20 years and has been shown to improve the treatment of diseases of the posterior segment of the eye of infectious origin or of edematous maculopathies. This route of administration achieves high drug concentrations in the vitreous and avoids the problems resulting from systemic administration. However, two basic problems limit the use of intravitreal therapy. Many drugs are rapidly cleared from the vitreous humor; therefore, repeated injections are necessary to reach and maintain effective therapy. Repeated intravitreal injections increase the risk of endophthalmitis, damage to the lens and retinal detachment. Moreover, some drugs provoke local toxicity at their effective dose, inducing side effects and possible retinal lesions. In this context, the development and use of new drug delivery systems for intravitreal administration are necessary to treat chronic ocular diseases. Among them, particulate systems such as liposomes have been widely studied. Liposomes are easily injectable and make it possible to reduce the toxicity and increase the residence time of several drugs in the eye. They are also able to protect poorly stable molecules, such as peptides and nucleic acids, from in vivo degradation. Some promising results have been obtained for the treatment of cytomegalovirus-induced retinitis in humans and, more recently, for the treatment of uveitis in animals. Finally, the fate of liposomes in ocular tissues and fluids after injection into the vitreous, and their elimination routes, are becoming better understood.
Abstract:
BACKGROUND: Recommendations for statin use for primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy using three scoring algorithms currently used in Europe. METHODS: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Switzerland. We compared the 10-year CHD risk using three scoring schemes, i.e., the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology (ESC) guidelines. With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD >20%; with SCORE, as a 10-year risk of fatal CVD ≥5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increased statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. RESULTS: Participants classified at high risk (both genders) were 5.8% according to FRS and 3.0% according to PROCAM, whereas the European risk SCORE classified 12.5% at high risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because the ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III but moderate with ESC.
Using a population perspective, full compliance with ATP III guidelines would reduce up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). CONCLUSIONS: Full compliance with guidelines for statin therapy would result in substantial health benefits, but proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring systems and corresponding guidelines used for estimating CHD risk in Europe.
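The population-impact arithmetic behind figures like the 17.9% can be sketched as follows. Only the 27% statin success proportion and the 24,310 expected deaths come from the abstract; the share of expected deaths occurring among newly treated adults is a hypothetical input chosen for illustration:

```python
def deaths_averted(total_chd_deaths, share_of_deaths_in_newly_treated,
                   statin_success=0.27):
    """CHD deaths potentially averted over 10 years if every eligible adult
    moved from no statin to statin therapy (success proportion per the abstract)."""
    return total_chd_deaths * share_of_deaths_in_newly_treated * statin_success

# Illustrative: if ~66% of expected deaths occurred among adults newly treated
# under ATP III (hypothetical share), averted deaths approach 17.9% of 24,310.
averted = deaths_averted(24_310, 0.66)
```

The structure makes clear why no guideline can avert more than 27% of expected deaths: the success proportion caps the benefit, and the eligible share scales it down from there.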
Abstract:
Adolescence, defined as a transition phase toward autonomy and independence, is a natural time of learning and adjustment, particularly in the setting of long-term goals and personal aspirations. It is also a period of heightened sensation seeking, including risk taking and reckless behaviors, which are a major cause of morbidity and mortality among teenagers. Recent observations suggest that a relative immaturity of frontal cortical neural systems may underlie the adolescent propensity for uninhibited risk taking and hazardous behaviors. However, converging preclinical and clinical studies do not support a simple model of frontal cortical immaturity, and there is substantial evidence that adolescents engage in dangerous activities, including drug abuse, despite knowing and understanding the risks involved. Therefore, a current consensus holds that much brain development during adolescence occurs in brain regions and systems that are critically involved in the perception and evaluation of risk and reward, leading to important changes in social and affective processing. Hence, rather than naive, immature and vulnerable, the adolescent brain, particularly the prefrontal cortex, should be considered as prewired for expecting novel experiences. In this perspective, thrill seeking may represent not a danger but a window of opportunity permitting the development of cognitive control through multiple experiences. However, if the maturation of brain systems implicated in self-regulation is contextually dependent, it is important to understand which experiences matter most. In particular, it is essential to unveil the mechanisms by which recurrent adverse episodes of stress or unrestricted access to drugs can shape the adolescent brain and potentially trigger life-long maladaptive responses.
Abstract:
The hypocretins, also known as orexins, are two neuropeptides now commonly described as critical components in maintaining and regulating the stability of arousal. Several lines of evidence have raised the hypothesis that hypocretin-producing neurons are part of the circuitries that mediate the hypothalamic response to acute stress. Intracerebral administration of hypocretin leads to a dose-related reinstatement of drug- and food-seeking behaviors. Furthermore, stress-induced reinstatement can be blocked with hypocretin receptor 1 antagonism. These results, together with recent data showing that hypocretin is critically involved in cocaine sensitization through the recruitment of NMDA receptors in the ventral tegmental area, strongly suggest that activation of hypocretin neurons plays a critical role in the development of the addiction process. The activity of hypocretin neurons may affect addictive behavior by contributing to brain sensitization or by modulating the brain reward system. Hypocretinergic cells, in coordination with brain stress systems, may lead to a vulnerable state that facilitates the resumption of drug-seeking behavior. Hence, the hypocretinergic system is a new drug target that may be used to prevent relapse into drug seeking.
Abstract:
Repeated antimalarial treatment for febrile episodes and self-treatment are common in malaria-endemic areas. The intake of antimalarials prior to participating in an in vivo study may alter treatment outcome and affect the interpretation of both efficacy and safety outcomes. We report the findings from baseline plasma sampling of malaria patients prior to inclusion in an in vivo study in Tanzania and discuss the implications of residual concentrations of antimalarials in this setting. In an in vivo study conducted in a rural area of Tanzania in 2008, baseline plasma samples from patients reporting no antimalarial intake within the last 28 days were screened for the presence of 14 antimalarials (parent drugs or metabolites) using liquid chromatography-tandem mass spectrometry. Among the 148 patients enrolled, 110 (74.3%) had at least one antimalarial in their plasma: 80 (54.1%) had lumefantrine above the lower limit of calibration (LLC = 4 ng/mL), 7 (4.7%) desbutyl-lumefantrine (4 ng/mL), 77 (52.0%) sulfadoxine (0.5 ng/mL), 15 (10.1%) pyrimethamine (0.5 ng/mL), 16 (10.8%) quinine (2.5 ng/mL), and none chloroquine (2.5 ng/mL). The proportion of patients with detectable antimalarial drug levels prior to enrollment in the study is worrying. Indeed, artemether-lumefantrine was supposed to be available only at government health facilities. Although sulfadoxine-pyrimethamine is only recommended for intermittent preventive treatment in pregnancy (IPTp), it was still widely used in public and private health facilities and sold in drug shops. Self-reporting of previous drug intake is unreliable, and screening for the presence of antimalarial drug levels should thus be considered in future in vivo studies to allow accurate assessment of treatment outcome. Furthermore, persisting sub-therapeutic drug levels of antimalarials in a population could promote the spread of drug resistance.
Knowledge of drug pressure in a given population is important for monitoring the implementation of standard treatment policy.
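The screening decision above reduces to comparing each measured baseline concentration against the assay's lower limit of calibration; a minimal sketch, with LLC values as listed in the abstract:

```python
# Lower limits of calibration (ng/mL) for the screened antimalarials,
# as reported in the abstract.
LLC_NG_ML = {
    "lumefantrine": 4.0,
    "desbutyl-lumefantrine": 4.0,
    "sulfadoxine": 0.5,
    "pyrimethamine": 0.5,
    "quinine": 2.5,
    "chloroquine": 2.5,
}

def residual_antimalarials(plasma_ng_ml):
    """Return drugs whose baseline plasma concentration reaches the LLC,
    i.e. evidence of recent intake despite self-reported abstinence."""
    return sorted(d for d, c in plasma_ng_ml.items() if c >= LLC_NG_ML[d])

# Hypothetical patient sample: lumefantrine and quinine flagged, sulfadoxine below LLC.
flagged = residual_antimalarials({"lumefantrine": 12.0, "sulfadoxine": 0.3, "quinine": 3.1})
```

In practice the cut-offs are assay-specific, but the logic is this simple thresholding applied per drug, which is why the authors argue it is feasible to add to future in vivo studies.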
Abstract:
OBJECTIVE: Recent pharmacologic studies in our laboratory have suggested that the spinal neuropeptide Y (NPY) Y1 receptor contributes to pain inhibition and to the analgesic effects of NPY. To rule out off-target effects, the present study used Y1-receptor-deficient (-/-) mice to further explore the contribution of Y1 receptors to pain modulation. METHODS AND RESULTS: Y1(-/-) mice exhibited reduced latency in the hotplate test of acute pain and a longer-lasting heat allodynia in the complete Freund's adjuvant (CFA) model of inflammatory pain. Y1 deletion did not change CFA-induced inflammation. Upon targeting the spinal NPY systems with intrathecal drug delivery, NPY reduced tactile and heat allodynia in the CFA model and the partial sciatic nerve ligation model of neuropathic pain. Importantly, we show for the first time that NPY does not exert these anti-allodynic effects in Y1(-/-) mice. Furthermore, in nerve-injured CD1 mice, concomitant injection of the potent Y1 antagonist BIBO3304 prevented the anti-allodynic actions of NPY. Neither NPY nor BIBO3304 altered performance on the Rotorod test, arguing against an indirect effect of motor function. CONCLUSION: The Y1 receptor contributes to pain inhibition and to the analgesic effects of NPY.
Abstract:
BACKGROUND: Digoxin intoxication results in predominantly digestive, cardiac and neurological symptoms. This case is outstanding in that the intoxication occurred in a nonagenarian and induced severe, extensively documented visual symptoms as well as dysphagia and proprioceptive illusions. Moreover, it went undiagnosed for a whole month despite close medical follow-up, illustrating the difficulty in recognizing drug-induced effects in a polymorbid patient. CASE PRESENTATION: Digoxin 0.25 mg qd for atrial fibrillation was prescribed to a 91-year-old woman with an estimated creatinine clearance of 18 ml/min. Over the following 2-3 weeks she developed nausea, vomiting and dysphagia, snowy and blurry vision, photopsia, dyschromatopsia, aggravated pre-existing formed visual hallucinations and proprioceptive illusions. She saw her family doctor twice and visited the eye clinic once until, 1 month after starting digoxin, she was admitted to the emergency room. Intoxication was confirmed by a serum digoxin level of 5.7 ng/ml (reference range 0.8-2 ng/ml). After stopping digoxin, general symptoms resolved in a few days, but visual complaints persisted. Examination by the ophthalmologist revealed decreased visual acuity in both eyes, 4/10 in the right eye (OD) and 5/10 in the left eye (OS), decreased color vision as demonstrated by a score of 1/13 in both eyes (OU) on Ishihara pseudoisochromatic plates, OS cataract, and dry age-related macular degeneration (ARMD). Computerized static perimetry showed non-specific diffuse alterations suggestive of either bilateral retinopathy or optic neuropathy. Full-field electroretinography (ERG) disclosed moderate diffuse rod and cone dysfunction and multifocal ERG revealed central loss of function OU. Visual symptoms progressively improved over the next 2 months, but multifocal ERG did not. The patient was finally discharged home after a 5 week hospital stay. 
CONCLUSION: This case is a reminder of a complication of digoxin treatment that should be considered by any treating physician. If digoxin is prescribed to a vulnerable patient, close monitoring is mandatory. In general, when facing a new health problem in a polymorbid patient, it is crucial to elicit a complete history, with all recent drug changes and detailed complaints, and to include an adverse drug reaction in the differential diagnosis.
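The "estimated creatinine clearance of 18 ml/min" that should have prompted dose reduction is typically a Cockcroft-Gault bedside estimate (the abstract does not say which formula was used). A sketch, with a hypothetical weight and serum creatinine chosen only to land near that range for a 91-year-old woman:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Creatinine clearance (mL/min) by the Cockcroft-Gault formula:
    CrCl = (140 - age) * weight / (72 * Scr), times 0.85 for women."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Hypothetical values for a 91-year-old woman (weight and creatinine assumed):
crcl = cockcroft_gault(91, 55, 1.6, female=True)   # roughly 20 mL/min
```

The age term in the numerator is what makes very elderly patients like this one so vulnerable: even a near-normal serum creatinine can hide a clearance low enough to accumulate a renally eliminated drug such as digoxin.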