981 results for "Drug control"
Abstract:
BACKGROUND: Up to 5% of patients presenting to the emergency department (ED) four or more times within a 12-month period represent 21% of total ED visits. In this study we sought to characterize social and medical vulnerability factors of ED frequent users (FUs) and to explore whether these factors hold simultaneously. METHODS: We performed a case-control study at Lausanne University Hospital, Switzerland. Patients over 18 years presenting to the ED at least once within the study period (April 2008 to March 2009) were included. FUs were defined as patients with four or more ED visits within the previous 12 months. Outcome data were extracted from medical records of the first ED attendance within the study period. Outcomes included basic demographics and social variables, ED admission diagnosis, somatic and psychiatric days hospitalized over 12 months, and having a primary care physician. We calculated the percentage of FUs and non-FUs having at least one social and one medical vulnerability factor. The four chosen social factors were: unemployed and/or dependent on government welfare; institutionalized and/or without fixed residence; separated, divorced or widowed; and under guardianship. The four medical vulnerability factors were: ≥6 somatic days hospitalized, ≥1 psychiatric day hospitalized, ≥5 clinical departments used (all three factors measured over 12 months), and ED admission diagnosis of alcohol and/or drug abuse. Univariate and multivariate logistic regression analyses allowed comparison of two random samples of 354 FUs and 354 non-FUs (statistical power 0.9, alpha 0.05 for all outcomes except gender, country of birth, and insurance type). RESULTS: FUs accounted for 7.7% of ED patients and 24.9% of ED visits. Univariate logistic regression showed that FUs were older (mean age 49.8 vs. 45.2 yrs, p=0.003), more often separated and/or divorced (17.5% vs. 13.9%, p=0.029) or widowed (13.8% vs. 8.8%, p=0.029), and either unemployed or dependent on government welfare (31.3% vs. 13.3%, p<0.001), compared to non-FUs. FUs accumulated more days hospitalized over 12 months (mean number of somatic days per patient 1.0 vs. 0.3, p<0.001; mean number of psychiatric days per patient 0.12 vs. 0.03, p<0.001). The two groups were similar regarding gender distribution (females 51.7% vs. 48.3%). The multivariate logistic regression model was based on the six most significant factors identified by univariate analysis. The model showed that FUs had more social problems, as they were more likely to be institutionalized or not have a fixed residence (OR 4.62; 95% CI, 1.65 to 12.93), and to be unemployed or dependent on government welfare (OR 2.03; 95% CI, 1.31 to 3.14) compared to non-FUs. FUs were more likely to need medical care, as indicated by involvement of ≥5 clinical departments over 12 months (OR 6.2; 95% CI, 3.74 to 10.15), having an ED admission diagnosis of substance abuse (OR 3.23; 95% CI, 1.23 to 8.46) and having a primary care physician (OR 1.70; 95% CI, 1.13 to 2.56); however, they were less likely to present with an admission diagnosis of injury (OR 0.64; 95% CI, 0.40 to 1.00) compared to non-FUs. FUs were more likely to combine at least one social with one medical vulnerability factor (38.4% vs. 12.1%, OR 7.74; 95% CI 5.03 to 11.93). CONCLUSIONS: FUs were more likely than non-FUs to have social and medical vulnerability factors and to have multiple factors in combination.
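The group comparisons above rest on standard 2x2 odds-ratio arithmetic. As a minimal sketch (not the study's actual analysis code), the following computes an unadjusted odds ratio with a Wald 95% CI from hypothetical counts approximating the reported 38.4% vs. 12.1% combined-vulnerability rates in the two samples of 354; note that the unadjusted value differs from the adjusted OR of 7.74 from the multivariate model.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a/b = cases with/without the factor, c/d = controls with/without it."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: ~38.4% of 354 FUs vs. ~12.1% of 354 non-FUs
# had at least one social plus one medical vulnerability factor.
print(odds_ratio_ci(136, 218, 43, 311))  # ≈ (4.51, 3.07, 6.63)
```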
Treatment intensification and risk factor control: toward more clinically relevant quality measures.
Abstract:
BACKGROUND: Intensification of pharmacotherapy in persons with poorly controlled chronic conditions has been proposed as a clinically meaningful process measure of quality. OBJECTIVE: To validate measures of treatment intensification by evaluating their associations with subsequent control in hypertension, hyperlipidemia, and diabetes mellitus across 35 medical facility populations in Kaiser Permanente, Northern California. DESIGN: Hierarchical analyses of associations of improvements in facility-level treatment intensification rates from 2001 to 2003 with patient-level risk factor levels at the end of 2003. PATIENTS: Members (515,072 and 626,130; age >20 years) with hypertension, hyperlipidemia, and/or diabetes mellitus in 2001 and 2003, respectively. MEASUREMENTS: Treatment intensification for each risk factor defined as an increase in number of drug classes prescribed, of dosage for at least 1 drug, or switching to a drug from another class within 3 months of observed poor risk factor control. RESULTS: Facility-level improvements in treatment intensification rates between 2001 and 2003 were strongly associated with greater likelihood of being in control at the end of 2003 (P ≤ 0.05 for each risk factor) after adjustment for patient- and facility-level covariates. Compared with facility rankings based solely on control, addition of percentages of poorly controlled patients who received treatment intensification changed 2003 rankings substantially: 14%, 51%, and 29% of the facilities changed ranks by 5 or more positions for hypertension, hyperlipidemia, and diabetes, respectively. CONCLUSIONS: Treatment intensification is tightly linked to improved control. Thus, it deserves consideration as a process measure for motivating quality improvement and possibly for measuring clinical performance.
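The intensification definition used above (more drug classes, a dose increase for at least one drug, or a switch to a drug from another class) is straightforward to operationalize. A minimal sketch, with a simplified regimen representation (drug class → dose) and hypothetical drug names; the study's actual measurement logic is not published in this abstract:

```python
def is_intensified(before: dict, after: dict) -> bool:
    """Classify a regimen change as treatment intensification:
    a new drug class appears (covers both adding a class and switching
    to a drug from another class), or a continuing drug's dose increases.
    Regimens are dicts mapping drug class -> daily dose."""
    new_class = any(d not in before for d in after)
    dose_up = any(after[d] > dose for d, dose in before.items() if d in after)
    return new_class or dose_up

# Hypothetical examples:
print(is_intensified({"ACE inhibitor": 10}, {"ACE inhibitor": 20}))   # dose up
print(is_intensified({"ACE inhibitor": 10}, {"ARB": 50}))             # class switch
```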
Abstract:
A variant upstream of human leukocyte antigen C (HLA-C) shows the most significant genome-wide effect on HIV control in European Americans and is also associated with the level of HLA-C expression. We characterized the differential cell surface expression levels of all common HLA-C allotypes and tested directly for effects of HLA-C expression on outcomes of HIV infection in 5243 individuals. Increasing HLA-C expression was associated with protection against multiple outcomes independently of individual HLA allelic effects in both African and European Americans, regardless of their distinct HLA-C frequencies and linkage relationships with HLA-B and HLA-A. Higher HLA-C expression was correlated with increased likelihood of cytotoxic T lymphocyte responses and frequency of viral escape mutation. In contrast, high HLA-C expression had a deleterious effect in Crohn's disease, suggesting a broader influence of HLA expression levels in human disease.
Abstract:
BACKGROUND: Assessment of the proportion of patients with well controlled cardiovascular risk factors underestimates the proportion of patients receiving high quality of care. Evaluating whether physicians respond appropriately to poor risk factor control gives a different picture of quality of care. We assessed physician response to poor cardiovascular risk factor control, as well as markers of potential overtreatment, in Switzerland, a country with universal healthcare coverage but without systematic quality monitoring, annual report cards on quality of care, or financial incentives to improve quality. METHODS: We performed a retrospective cohort study of 1002 randomly selected patients aged 50-80 years from four university primary care settings in Switzerland. For hypertension, dyslipidemia and diabetes mellitus, we first measured proportions in control, then assessed therapy modifications among those in poor control. "Appropriate clinical action" was defined as a therapy modification, or return to control without therapy modification, within 12 months among patients with baseline poor control. Potential overtreatment of these conditions was defined as intensive treatment among low-risk patients with optimal target values. RESULTS: 20% of patients with hypertension, 41% with dyslipidemia and 36% with diabetes mellitus were in control at baseline. When appropriate clinical action in response to poor control was integrated into measuring quality of care, 52 to 55% had appropriate quality of care. Over 12 months, therapy was modified for 61% of patients with baseline poor control for hypertension, 33% for dyslipidemia, and 85% for diabetes mellitus. Increases in number of drug classes (28-51%) and in drug doses (10-61%) were the most common therapy modifications. Patients with target organ damage and higher baseline values were more likely to have appropriate clinical action.
We found low rates of potential overtreatment: 2% for hypertension, 3% for diabetes mellitus and 3-6% for dyslipidemia. CONCLUSIONS: In primary care, evaluating whether physicians respond appropriately to poor risk factor control, in addition to assessing proportions in control, provides a broader view of quality of care than control proportions alone. Such measures could be more clinically relevant and acceptable to physicians than simply reporting levels of control.
Abstract:
Background: As imatinib pharmacokinetics are highly variable, plasma levels differ widely between patients under the same dosage. Retrospective studies in chronic myeloid leukemia (CML) patients showed significant correlations between low levels and suboptimal response, and between high levels and poor tolerability. Monitoring of plasma levels is thus increasingly advised, targeting trough concentrations of 1000 μg/L and above. Objectives: Our study was launched to assess the clinical usefulness of systematic imatinib TDM in CML patients. The present preliminary evaluation questions the appropriateness of dosage adjustment following plasma level measurement to reach the recommended trough level, while allowing an interval of 4-24 h after the last drug intake for blood sampling. Methods: Initial blood samples from the first 9 patients in the intervention arm were obtained 4-25 h after the last dose. Trough levels in 7 patients were predicted to be far from the target (6 <750 μg/L, and 1 >1500 μg/L with poor tolerance), based on a Bayesian approach using a population pharmacokinetic model. Individual dosage adjustments were undertaken in 5 patients, who had a control measurement 1-4 weeks after the dosage change. Trough levels observed at recheck were compared with the earlier model-based predictions. Results: Before dosage adjustment, observed concentrations extrapolated to trough ranged from 359 to 1832 μg/L (median 710; mean 804, CV 53%) in the 9 patients. After dosage adjustment they were predicted to lie between 720 and 1090 μg/L (median 878; mean 872, CV 13%). Observed levels of the 5 recheck measurements extrapolated to trough actually ranged from 710 to 1069 μg/L (median 1015; mean 950, CV 16%), with absolute differences of 21 to 241 μg/L from the model-based predictions (median 175; mean 157, CV 52%). Differences between observed and predicted trough levels were larger when the interval between last drug intake and sampling was very short (~4 h).
Conclusion: These preliminary results suggest that TDM of imatinib with Bayesian interpretation can bring trough levels closer to 1000 μg/L (CV decreasing from 53% to 16%). While this may simplify blood collection in daily practice, as samples do not have to be drawn exactly at trough, the longest possible interval after the last drug intake remains preferable. This encourages evaluation of the clinical benefit of a routine TDM intervention in CML patients, which is the aim of the randomized Swiss I-COME study.
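Extrapolating an observed level to the trough is the step that frees sampling from an exact trough time. A deliberately simplified sketch, assuming mono-exponential first-order elimination and an ~18 h imatinib half-life (assumed values; the study itself used a full Bayesian population-PK model, not this naive decay formula):

```python
import math

def extrapolate_trough(c_obs, t_obs, tau=24.0, half_life=18.0):
    """Extrapolate an observed level (c_obs, µg/L) drawn t_obs hours after
    the last dose to the trough at tau hours, assuming the sample lies in
    the elimination phase with first-order (mono-exponential) decay."""
    ke = math.log(2) / half_life          # elimination rate constant (1/h)
    return c_obs * math.exp(-ke * (tau - t_obs))

# A sample drawn 6 h post-dose, 18 h (one assumed half-life) before trough,
# halves by the 24 h trough time:
print(extrapolate_trough(1000.0, 6.0))    # ≈ 500 µg/L
```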
Abstract:
Understanding the complexity of cancer depends on an elucidation of the underlying regulatory networks, at the cellular and intercellular levels and in their temporal dimension. This Opinion article focuses on the multilevel crosstalk between the Notch pathway and the p53 and p63 pathways. These two coordinated signalling modules are at the interface of external damaging signals and control of stem cell potential and differentiation. Positive or negative reciprocal regulation of the two pathways can vary with cell type and cancer stage. Therefore, selective or combined targeting of the two pathways could improve the efficacy and reduce the toxicity of cancer therapies.
Abstract:
Anyone traveling to the United States from countries other than New Zealand will be surprised by the prevalence of health-related advertisements on television, including ads for drugs. Typically, these TV ads follow a pattern: an ad for a burger at only 99 cents, followed by one for a proton-pump inhibitor, then an ad for healthy home-cooked food delivered directly to your home and an ad for a home-based abdominal workout DVD, followed by an ad for a lipid-lowering drug. There are, however, nuances. After 8 pm, the visitor might encounter an ad for the "little blue pill." This sequence sometimes includes an ad featuring antihistamines for allergic rhinitis in spring and one promoting antidepressants in the winter. Such direct-to-consumer advertising (DTCA) of prescription drugs is usual business in the United States and New Zealand but is prohibited in the rest of the world. Why? Because DTCA of prescription drugs has pros and cons (discussed elsewhere,1-3 including in JGIM4) that are balanced differently in different countries. Constitutional factors, such as the First Amendment protections on speech, including commercial speech, in the United States,5 as well as patient and population safety considerations, which all differ across countries, modulate reactions to DTCA. Additionally, the lack of robust data on the impact of DTCA on prescription drug use adds to the confusion. Evidence, though limited, suggests that DTCA increases drug sales. However, whether the increase in sales corrects existing underuse or encourages over/misuse is not clear.
Abstract:
Drug combinations can improve angiostatic cancer treatment efficacy and enable the reduction of side effects and drug resistance. Combining drugs is non-trivial due to the high number of possibilities. We applied a feedback system control (FSC) technique with a population-based stochastic search algorithm to navigate through the large parametric space of nine angiostatic drugs at four concentrations to identify optimal low-dose drug combinations. This implied an iterative approach of in vitro testing of endothelial cell viability and algorithm-based analysis. The optimal synergistic drug combination, containing erlotinib, BEZ-235 and RAPTA-C, was reached in a small number of iterations. Final drug combinations showed enhanced endothelial cell specificity and synergistically inhibited proliferation (p < 0.001), but not migration of endothelial cells, and forced enhanced numbers of endothelial cells to undergo apoptosis (p < 0.01). Successful translation of this drug combination was achieved in two preclinical in vivo tumor models. Tumor growth was inhibited synergistically and significantly (p < 0.05 and p < 0.01, respectively) using reduced drug doses as compared to optimal single-drug concentrations. At the applied conditions, single-drug monotherapies had no or negligible activity in these models. We suggest that FSC can be used for rapid identification of effective, reduced dose, multi-drug combinations for the treatment of cancer and other diseases.
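The FSC loop described above is essentially a stochastic search through the 4^9 combination space, guided by an experimental readout. A toy, single-candidate hill-climb sketch with a made-up surrogate objective (the real loop is population-based and scores combinations with an in vitro endothelial-cell viability assay, neither of which is reproduced here):

```python
import random

random.seed(0)

N_DRUGS, LEVELS = 9, 4   # nine drugs, four concentration levels (0 = not used)

def efficacy(combo):
    """Toy surrogate for the experimental readout: reward a moderate total
    dose and penalize top-level concentrations (a stand-in for side effects).
    Purely illustrative; not the study's assay."""
    total = sum(combo)
    return -abs(total - 8) - 0.5 * sum(c == 3 for c in combo)

def fsc_search(iterations=200):
    """Iteratively perturb one drug's concentration level and keep the
    candidate whenever the (surrogate) readout improves."""
    best = [random.randrange(LEVELS) for _ in range(N_DRUGS)]
    for _ in range(iterations):
        cand = best[:]
        i = random.randrange(N_DRUGS)
        cand[i] = random.randrange(LEVELS)   # mutate one drug's level
        if efficacy(cand) > efficacy(best):
            best = cand
    return best

print(fsc_search())
```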
Abstract:
Background: Vancomycin is a cornerstone antibiotic for the management of severe Gram-positive infections. However, high doses of vancomycin are associated with a risk of nephrotoxicity. This study aimed to evaluate the relationship between the evolution of vancomycin trough concentration and the occurrence of nephrotoxicity, and to identify risk factors for both vancomycin-associated nephrotoxicity and vancomycin overexposure. Methods: A total of 1240 patient records from our hospital therapeutic drug monitoring database between 2007 and 2011 were screened and grouped according to predefined criteria for vancomycin overexposure (one or more trough levels ≥ 20 mg/L) and treatment-related nephrotoxicity (rise of serum creatinine by ≥ 50% over baseline). A representative sample of 150 cases was selected for in-depth analysis. Weighted logistic regression analyses were used to test associations between vancomycin overexposure, nephrotoxicity and other predictors of interest. Results: Patients with high trough concentrations were more likely to develop nephrotoxicity (odds ratio: 4.12; p <0.001). Specific risk factors, notably concomitant nephrotoxic treatments and comorbid conditions (heart failure), independently increased the risk of either nephrotoxicity or vancomycin overexposure. Finally, the temporal relationship between variations in vancomycin trough concentrations and creatinine levels was consistent with circular causality, with changes in vancomycin somewhat preceding changes in creatinine. Conclusion: Our results confirm the substantial nephrotoxic potential of vancomycin and indicate that its use deserves thorough individualization in conditions likely to increase drug exposure, together with reactive dosage adjustment based on therapeutic drug monitoring.
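The two predefined outcome criteria above are simple threshold rules and can be expressed directly in code. A minimal sketch (function and variable names are illustrative, not from the study's database):

```python
def classify_case(troughs_mg_l, creat_baseline, creat_followups):
    """Apply the study's predefined criteria to one patient record:
    overexposure = any vancomycin trough >= 20 mg/L;
    nephrotoxicity = any serum creatinine rising >= 50% over baseline."""
    overexposed = any(c >= 20.0 for c in troughs_mg_l)
    nephrotoxic = any(c >= 1.5 * creat_baseline for c in creat_followups)
    return overexposed, nephrotoxic

# Hypothetical patient: one trough of 22 mg/L, creatinine stable.
print(classify_case([15.0, 22.0], 80.0, [80.0, 95.0]))  # (True, False)
```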
Abstract:
BACKGROUND: Pregnant women with asthma need to take medication during pregnancy. OBJECTIVE: We sought to identify whether there is an increased risk of specific congenital anomalies after exposure to antiasthma medication in the first trimester of pregnancy. METHODS: We performed a population-based case-malformed control study testing signals identified in a literature review. Odds ratios (ORs) of exposure to the main groups of asthma medication were calculated for each of the 10 signal anomalies compared with registrations with nonchromosomal, nonsignal anomalies as control registrations. In addition, exploratory analyses were done for each nonsignal anomaly. The data set included 76,249 registrations of congenital anomalies from 13 EUROmediCAT registries. RESULTS: Cleft palate (OR, 1.63; 95% CI, 1.05-2.52) and gastroschisis (OR, 1.89; 95% CI, 1.12-3.20) had significantly increased odds of exposure to first-trimester use of inhaled β2-agonists compared with nonchromosomal control registrations. Odds of exposure to salbutamol were similar. Nonsignificant ORs of exposure to inhaled β2-agonists were found for spina bifida, cleft lip, anal atresia, severe congenital heart defects in general, or tetralogy of Fallot. None of the 4 literature signals of exposure to inhaled steroids were confirmed (cleft palate, cleft lip, anal atresia, and hypospadias). Exploratory analyses found an association between renal dysplasia and exposure to the combination of long-acting β2-agonists and inhaled corticosteroids (OR, 3.95; 95% CI, 1.99-7.85). CONCLUSIONS: The study confirmed increased odds of first-trimester exposure to inhaled β2-agonists for cleft palate and gastroschisis and found a potential new signal for renal dysplasia associated with combined long-acting β2-agonists and inhaled corticosteroids. Use of inhaled corticosteroids during the first trimester of pregnancy seems to be safe in relation to the risk for a range of specific major congenital anomalies.
Abstract:
The use of electronic control devices (ECD), such as the Taser®, has increased in Europe over the past decade. However, scientific data concerning the potential health impact of ECD usage remains limited. We reviewed the scientific literature in order to evaluate the safety, mortality, and morbidity associated with ECD use. Exposure of a healthy individual to a single ECD electroshock can be considered generally safe. Complications can, however, occur if the patient is subject to multiple electroshocks, if the patient has significant medical comorbidities, or when exposure is associated with drug abuse or agitated delirium.
The broad spectrum of potential complications associated with ECD exposure includes direct trauma caused by the ECD electrodes, injuries caused by the transient paralysis-induced fall, and cardiovascular events. An ECD-exposed patient requires careful examination during which traumatic injuries are actively sought out.
Abstract:
[Summary]
2. Roles of quality control in the pharmaceutical and biopharmaceutical industries
2.1. Pharmaceutical industry
2.2. Biopharmaceutical industry
2.3. Policy and regulatory
2.3.1. The US Food and Drug Administration (FDA)
2.3.2. The European Medicines Agency (EMEA)
2.3.3. The Japanese Ministry of Health, Labour and Welfare (MHLW)
2.3.4. The Swiss Agency for Therapeutic Products (Swissmedic)
2.3.5. The International Conference on Harmonization (ICH)
3. Types of testing
3.1. Microbiological purity tests
3.2. Physicochemical tests
3.3. Critical-to-quality steps
3.3.1. API starting materials and excipients
3.3.2. Intermediates
3.3.3. APIs (drug substances) and final drug product
3.3.4. Primary and secondary packaging materials for drug products
4. Manufacturing cost and quality control
4.1.1. Pharmaceutical manufacturing cost breakdown
4.1.2. Biopharmaceutical manufacturing cost breakdown
4.2. Batch failure / rejection / rework / recalls
5. Future trends in the quality control of pharmaceuticals and biopharmaceuticals
5.1. Rapid and real-time testing
5.1.1. Physicochemical testing
5.1.2. Rapid microbiology methods
Abstract:
This work describes the development and validation of a dissolution test for 50 mg losartan potassium capsules using HPLC and UV spectrophotometry. A 2^4 full factorial design was carried out to optimize the dissolution conditions; potassium phosphate buffer pH 6.8 as the dissolution medium, the basket apparatus, a stirring speed of 50 rpm, and a run time of 30 min were considered adequate. Both the dissolution procedure and the analytical methods were validated, and statistical analysis showed no significant differences between HPLC and spectrophotometry. Since there is no official monograph, this dissolution test could be applied in routine quality control.
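A 2^4 full factorial design simply crosses two levels of each of four factors into 16 runs. A sketch with hypothetical factor levels (the abstract reports only the optimized settings, not the full level grid that was tested):

```python
from itertools import product

# Hypothetical two-level settings for the four dissolution factors
# (illustrative values; only the chosen optimum is given in the abstract):
factors = {
    "medium_pH": (4.5, 6.8),
    "apparatus": ("basket", "paddle"),
    "speed_rpm": (50, 100),
    "time_min": (30, 45),
}

# All 2^4 = 16 experimental runs of the full factorial design:
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 16
```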
Abstract:
Sensitive and selective spectrophotometric and spectrofluorimetric methods have been developed for the determination of some drugs, such as pramipexole, nebivolol, carvedilol, and eletriptan, which commonly contain a secondary amino group. The methods were developed via derivatization of the secondary amino group with 7-chloro-4-nitrobenzofurazan in borate buffer, where a yellow-colored reaction product was obtained and measured spectrophotometrically or spectrofluorimetrically. Linear concentration ranges were 2.0 to 250 μg mL-1 and 0.1 to 3.0 μg mL-1 for the spectrophotometric and spectrofluorimetric methods, respectively. The described methods can be easily applied by quality control laboratories in routine analyses of these drugs in pharmaceutical preparations.