854 results for drug dose increase
Abstract:
To assess drug-related problems in patients with liver cirrhosis by investigating the prevalence of inadequately dosed drugs and their association with adverse drug reactions (ADRs) and hospitalizations.
Abstract:
The purpose of this study was to evaluate the effects of high doses of injected opioids prescribed as maintenance treatment for intravenous drug users. This was accomplished via a randomised, double-blind crossover study at an outpatient clinic in Bern, Switzerland. The subjects were 39 patients with a long history of intravenous opioid use and persistent abuse despite treatment; they were randomly allocated to two groups. Group A started on controlled injection of graduated doses of morphine up to a satisfying individual dose and was then switched, under double-blind conditions, to heroin on a randomly determined day between weeks three and four; this group then received heroin for the remaining two to three weeks of the study. Group B started on heroin and was then switched to morphine in the same manner. Equipotent solutions of 3% morphine and 2% heroin were administered. The main outcome measures were clinical observations, structured interviews and self-reports of subjective experiences to assess the effects of the drugs. In 16 cases, the study had to be discontinued owing to severe morphine-induced histamine reactions; thirteen participants in Group B presented these adverse reactions on the day of the switch-over. Full data were thus obtainable for only 17 participants. Average daily doses were 491 mg for heroin and 597 mg for morphine. The findings indicate that heroin produced significantly less itching, flushing, urticaria and pain/nausea. A negative correlation between dose and euphoria was observed for both heroin and morphine. The authors concluded that, as heroin produces fewer side effects, it is preferable to morphine for high-dose maintenance prescription. The perceived euphoric effects are limited for both substances.
Abstract:
RATIONALE: Olanzapine is an atypical antipsychotic drug with a more favourable safety profile than typical antipsychotics, whose topographic quantitative electroencephalogram (QEEG) profile has hitherto been unknown. OBJECTIVES: We investigated electrical brain activity (QEEG and cognitive event-related potentials, ERPs) in healthy subjects who received olanzapine. METHODS: Vigilance-controlled, 19-channel EEG and ERPs in an auditory odd-ball paradigm were recorded before and 3 h, 6 h and 9 h after administration of a single dose of either placebo or olanzapine (2.5 mg and 5 mg) in ten healthy subjects. QEEG was analysed by spectral analysis and evaluated in nine frequency bands. For the P300 component of the odd-ball ERP, amplitude and latency were analysed. Statistical effects were tested using a repeated-measures analysis of variance. RESULTS: For the interaction between time and treatment, significant effects were observed in the theta, alpha-2, beta-2 and beta-4 frequency bands. The amplitude of activity in the theta band increased most markedly 6 h after administration of 5 mg olanzapine. A pronounced decrease of alpha-2 activity could be observed, especially 9 h after administration of 5 mg olanzapine. In most beta frequency bands, and most markedly in the beta-4 band, a dose-dependent decrease of activity beginning 6 h after drug administration was demonstrated. Topographic effects could be observed for the beta-2 band (occipital decrease) and, as a tendency, for the alpha-2 band (frontal increase and occipital decrease), both indicating a frontal shift of brain electrical activity. There were no significant changes in P300 amplitude or latency after drug administration. CONCLUSION: QEEG alterations after olanzapine administration were similar to the EEG effects of other atypical antipsychotic drugs, such as clozapine.
The increase of theta activity is comparable to the frequency distribution observed for thymoleptics or antipsychotics for which treatment-emergent somnolence is commonly observed, whereas the decrease of beta activity after olanzapine administration is not characteristic of these drugs. There were no clear signs of increased cerebral excitability after single-dose administration of 2.5 mg or 5 mg olanzapine in healthy controls.
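The spectral analysis described above reduces, in essence, to estimating power inside fixed frequency bands. The following is a minimal sketch of that idea; the sampling rate, band edges, and the plain periodogram estimator are illustrative assumptions, not the study's exact nine-band definitions or analysis pipeline:

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean periodogram power of a 1-D signal (sampled at fs Hz)
    within the half-open frequency band [lo, hi) in Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

# Illustration: a synthetic 6 Hz rhythm shows up in the theta band
# (4-8 Hz) but not in the beta band (13-30 Hz).
fs = 250                            # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1.0 / fs)       # 4 s of signal
eeg = np.sin(2 * np.pi * 6 * t)     # synthetic 6 Hz oscillation
theta = band_power(eeg, fs, (4, 8))
beta = band_power(eeg, fs, (13, 30))
```

In practice, a windowed estimator such as Welch's method would be preferred over the raw periodogram, and band power would be computed per channel and per recording epoch before statistical testing.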
Abstract:
The toxicity of long-term immunosuppressive therapy has become a major concern in long-term follow-up of heart transplant recipients. In this respect the quality of renal function is undoubtedly linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 µg/L in many centers without direct evidence for the necessity of such high levels while using triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection as well as overall mortality between groups of patients with high (250 to 350 µg/L) and low (150 to 250 µg/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; aged 44 ± 12 years; mean follow-up, 1,122 ± 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 µg/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < 130 µmol/L, n = 234; group II, ≥ 130 µmol/L, n = 98). The overall 5-year survival excluding the early 30-day mortality was 92% (group I, 216/232) and 91% (group II, 89/98), with 75% of the mortality due to chronic rejection. The rate of rejection for the entire follow-up period was similar in both groups (first year: group I, 3.2 ± 2.6 rejections/patient/year; group II, 3.6 ± 2.7 rejections/patient/year; p = not significant). (ABSTRACT TRUNCATED AT 250 WORDS)
Abstract:
The considerable search for synergistic agents in cancer research is motivated by the therapeutic benefits achieved by combining anti-cancer agents. Synergistic agents make it possible to reduce dosage while maintaining or enhancing a desired effect. Other favorable outcomes of synergy include reduced toxicity and minimized or delayed drug resistance. Dose-response assessment and drug-drug interaction analysis play an important part in the drug discovery process; however, these analyses are often done poorly. This dissertation is an effort to notably improve dose-response assessment and drug-drug interaction analysis. The most commonly used method in published analyses is the Median-Effect Principle/Combination Index method (Chou and Talalay, 1984). This method is inefficient because it ignores important sources of variation inherent in dose-response data and discards data points that do not fit the Median-Effect Principle. Previous work has shown that the conventional method yields a high rate of false positives (Boik, Boik and Newman, 2008; Hennessey, Rosner, Bast and Chen, 2010) and, in some cases, low power to detect synergy. There is a great need to improve the current methodology. We developed a Bayesian framework for dose-response modeling and drug-drug interaction analysis. First, we developed a hierarchical meta-regression dose-response model that accounts for various sources of variation and uncertainty and allows one to incorporate knowledge from prior studies into the current analysis, thus offering more efficient and reliable inference. Second, for cases in which parametric dose-response models do not fit the data, we developed a practical and flexible nonparametric regression method for meta-analysis of independently repeated dose-response experiments.
Third, and lastly, we developed a method, based on Loewe additivity, that allows one to quantitatively assess the interaction between two agents combined at a fixed dose ratio. The proposed method provides a comprehensive and honest accounting of uncertainty in drug interaction assessment. Extensive simulation studies show that the novel methodology improves the screening process for effective/synergistic agents and reduces the incidence of type I error. We consider an ovarian cancer cell line study that investigates the combined effect of DNA methylation inhibitors and histone deacetylation inhibitors in human ovarian cancer cell lines. The hypothesis is that the combination of a DNA methylation inhibitor and a histone deacetylation inhibitor will enhance antiproliferative activity in human ovarian cancer cell lines compared with treatment with each inhibitor alone. By applying the proposed Bayesian methodology, in vitro synergy was declared for the DNA methylation inhibitor 5-AZA-2'-deoxycytidine combined with either of the histone deacetylation inhibitors, suberoylanilide hydroxamic acid or trichostatin A, in the cell lines HEY and SKOV3. This suggests potential new epigenetic therapies for growth inhibition of ovarian cancer cells.
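For reference, the conventional Median-Effect Principle/Combination Index calculation that this dissertation critiques can be sketched in a few lines. This is a simplified illustration of the Chou-Talalay construction only, not the dissertation's Bayesian method; all parameter values below are made up for the example:

```python
def median_effect_dose(fa, Dm, m):
    """Dose producing fractional effect fa under the median-effect
    equation fa / (1 - fa) = (D / Dm) ** m (Chou and Talalay, 1984),
    where Dm is the median-effect dose and m the slope parameter."""
    return Dm * (fa / (1.0 - fa)) ** (1.0 / m)

def combination_index(d1, d2, fa, Dm1, m1, Dm2, m2):
    """Combination index for doses (d1, d2) given together and jointly
    producing effect fa: CI < 1 suggests synergy, CI = 1 is Loewe
    additivity, CI > 1 suggests antagonism."""
    D1 = median_effect_dose(fa, Dm1, m1)  # dose of drug 1 alone for effect fa
    D2 = median_effect_dose(fa, Dm2, m2)  # dose of drug 2 alone for effect fa
    return d1 / D1 + d2 / D2

# If half of each drug's single-agent median-effect dose together
# already produces a 50% effect, CI = 0.5 + 0.5 = 1 (exact additivity).
ci = combination_index(0.5, 0.5, 0.5, Dm1=1.0, m1=1.0, Dm2=1.0, m2=1.0)
```

Note how the point-estimate CI carries no measure of uncertainty; that omission is precisely what motivates the Bayesian treatment described in the abstract.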
Abstract:
Modulation of tumor hypoxia to increase bioreductive drug antitumor activity was investigated. The antivascular agent 5,6-dimethylxanthenone acetic acid (DMXAA) was used in combination studies with the bioreductive drugs tirapazamine (TPZ) and mitomycin C (MMC). Blood perfusion studies with DMXAA showed a maximal reduction of 66% in tumor blood flow 4 hours post drug administration. This tumor-specific decrease in perfusion was also found to be dose-dependent, with 25 and 30 mg/kg DMXAA yielding greater than 50% reduction in tumor blood flow. Increases in antitumor activity with combination therapy (bioreductive drugs + DMXAA) were significant over individual therapies, suggesting an increased activity due to increased hypoxia induced by DMXAA. Combination studies yielded the following significant tumor growth delays over control: MMC (5 mg/kg) + DMXAA (25 mg/kg) = 20 days, MMC (2.5 mg/kg) + DMXAA (25 mg/kg) = 8 days, TPZ (21.4 mg/kg) + DMXAA (17.5 mg/kg) = 4 days. The mechanism of interaction of these drugs was investigated by measuring metabolite production and DNA damage. Real-time microdialysis studies indicated maximal metabolite production at 20-30 minutes post injection for individual and combination therapies. DNA double-strand breaks induced by TPZ ± DMXAA (20 minutes post injection) were analyzed by pulsed-field gel electrophoresis (PFGE). Southern blot analyses and quantification showed TPZ-induced DNA double-strand breaks, but this effect was not evident in combination studies with DMXAA. Based on these data, combination studies of TPZ + DMXAA showed increased antitumor activity over individual drug therapies. The mechanism of this increased activity, however, does not appear to be an increase in TPZ bioreduction at this time point.
Abstract:
Purpose. Drug users are a large group among those at highest risk of contracting hepatitis B virus (HBV) infection. This study sought to identify predictors of HBV vaccine acceptance and compliance in a cohort of current drug users in Houston, Texas. Perceived severity of HBV, perceived risk of HBV, perceived peer support for HBV vaccination, and perceived benefits of HBV vaccination were also examined to assess their relationship to HBV vaccine compliance. Methods. A randomized intervention study was conducted in a cohort of current drug users in Houston, Texas. Participants were recruited by community outreach workers from two urban neighborhoods in Houston known for high drug use. Participants were randomized to a standard vaccine schedule group or an accelerated vaccine schedule group. Participants were also randomized to either a standard behavioral intervention group or an enhanced behavioral intervention group designed to increase HBV vaccine acceptance and compliance. Baseline visits included an interview covering demographic factors, drug and sexual behaviors, and HBV beliefs; participants then received the first dose of the HBV vaccine and one of the behavioral interventions. Results. Of 1,643 screened participants, 77% accepted the HBV vaccine. Participants aged ≥50 were twice as likely to accept the vaccine. African Americans and less frequent drug users were also significantly more likely to accept the vaccine. Of the 1,259 participants who enrolled in the study, 75% were compliant with the HBV vaccine. Predictors of compliance were race, housing status, and alcohol use. Speedball users were found to be 74% less likely to be compliant with the HBV vaccine. None of the behavioral constructs assessed was found to significantly predict HBV compliance. However, additional analyses found significant changes in mean scores of the behavioral concepts when measured at six-month follow-up. Conclusion.
Results from this study indicate that when offered a free vaccine, a large percentage of the drug-user community will be compliant with the vaccine series. The behavioral cognitions commonly used in HBV compliance research need to be extended to accurately fit this cohort. Vaccine interventions also need to focus on reaching the homeless segment of drug users and the speedball users.
Abstract:
Despite the availability of hepatitis B vaccine for over two decades, drug users and other high-risk adult populations have experienced low vaccine coverage. Poor compliance has limited efforts to reduce transmission of hepatitis B infection in this population. Evidence suggests that the immunological response in drug users is impaired compared with the general population, in terms of both lower seroprotection rates and lower antibody levels. The current study investigated the effectiveness of the multi-dose hepatitis B vaccine and compared the effect of the standard and accelerated vaccine schedules in a not-in-treatment, drug-using adult population in the city of Houston, USA. A population of drug users from two communities in Houston, susceptible to hepatitis B, was recruited by outreach workers and referral methodology. Subjects were randomized either to the standard hepatitis B vaccine schedule (0, 1, 6 months) or to an accelerated schedule (0, 1, 2 months). Antibody levels were measured by laboratory analysis at various time points. The participants were followed for two years and seroconversion rates were calculated to determine immune response. A four-percentage-point difference in overall compliance was observed between the standard (73%) and accelerated (77%) schedules. Logistic regression analyses showed that drug users living on the streets were twice as likely not to complete all three vaccine doses (p=0.028), and current speedball use was also associated with non-completion (p=0.002). Completion of all three vaccinations in the multivariate analysis was also correlated with older age. Drug users on the accelerated schedule were 26% more likely to achieve completion, although this factor was only marginally significant (p=0.085). A cumulative adequate protective response was achieved by 65% of the HBV-susceptible subgroup by 12 months and was identical for the standard and accelerated schedules.
An excess protective response (≥100 mIU/mL) occurred with greater frequency at the later time point for the standard schedule (36% at 12 months compared with 14% at six months), while the greater proportion of excess protective responses for the accelerated schedule occurred earlier (34% at 6 months compared with 18% at 12 months). Seroconversion at the adequate protective response level of 10 mIU/mL was reached by the accelerated schedule group at a quicker rate (62% vs. 49%), and with a higher mean titer (104.8 vs. 64.3 mIU/mL), when measured at six months. Multivariate analyses indicated a 63% increased risk of non-response with older age and confirmed an accelerating decline in immune response to vaccination manifesting after 40 years of age (p=0.001). Injecting more than daily was also strongly associated with the risk of non-response (p=0.016). The substantial increase in the seroprotection rate at six months may be worth the trade-off against the faster decline in antibody titer, and the accelerated schedule is recommended for enhancing compliance and seroconversion. Utilization of the accelerated schedule, with the primary objective of increasing compliance and seroconversion rates during the six months after the first dose, may confer early protective immunity and reduce the HBV vulnerability of drug users who continue, or have recently initiated, high-risk drug use and sexual behaviors.
Abstract:
Background: For most cytotoxic and biologic anti-cancer agents, the response rate of the drug is commonly assumed to be non-decreasing with increasing dose. However, an increasing dose does not always produce an appreciable increase in the response rate; this may be especially true at high doses of a biologic agent. Therefore, in a phase II trial the investigators may be interested in testing the anti-tumor activity of a drug at more than one dose (often two), instead of only at the maximum tolerated dose (MTD). This way, when the lower dose appears equally effective, it can be recommended for further confirmatory testing in a phase III trial in light of potential long-term toxicity and cost considerations. A common approach to designing such a phase II trial has been to use an independent (e.g., Simon's two-stage) design at each dose, ignoring the prior knowledge that the response probabilities at the different doses are ordered. However, failure to account for this ordering constraint in estimating the response probabilities may result in an inefficient design. In this dissertation, we developed extensions of Simon's optimal and minimax two-stage designs, including both frequentist and Bayesian methods, for two doses under the assumption of ordered response rates between doses. Methods: Optimal and minimax two-stage designs are proposed for phase II clinical trials in settings where the true response rates at two dose levels are ordered. We borrow strength between doses using isotonic regression and control the joint and/or marginal error probabilities. Bayesian two-stage designs are also proposed under a stochastic ordering constraint. Results: Compared with Simon's designs, when controlling the power and type I error at the same levels, the proposed frequentist and Bayesian designs reduce the maximum and expected sample sizes. Most of the proposed designs also increase the probability of early termination when the true response rates are poor.
Conclusion: The proposed frequentist and Bayesian designs are superior to Simon's designs in terms of operating characteristics (expected sample size, and probability of early termination when the response rates are poor). Thus, the proposed designs lead to more cost-efficient and ethical trials, and may consequently improve and expedite the drug discovery process. The proposed designs may be extended to designs for multiple-group trials and drug combination trials.
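The operating characteristics mentioned above (probability of early termination and expected sample size) have simple closed forms for a single-dose Simon two-stage design, and they are the quantities the proposed extensions improve. A minimal sketch follows; the specific parameters r1/n1 = 0/9, r/n = 2/17 are a commonly tabulated optimal design for p0 = 0.05 vs. p1 = 0.25 (α = 0.05, β = 0.20), used here purely for illustration:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def simon_oc(r1, n1, r, n, p):
    """Operating characteristics of Simon's two-stage design at true
    response rate p: declare the drug not promising if at most r1
    responses occur in the first n1 patients (early termination), or
    at most r responses occur in all n patients."""
    pet = binom_cdf(r1, n1, p)              # prob. of early termination
    en = n1 + (1 - pet) * (n - n1)          # expected sample size
    fail = pet + sum(                       # prob. of declaring "not promising"
        comb(n1, x) * p**x * (1 - p)**(n1 - x) * binom_cdf(r - x, n - n1, p)
        for x in range(r1 + 1, min(r, n1) + 1)
    )
    return pet, en, fail

pet0, en0, fail0 = simon_oc(0, 9, 2, 17, 0.05)  # under the null rate p0
_, _, fail1 = simon_oc(0, 9, 2, 17, 0.25)       # under the alternative p1
```

Here 1 - fail0 is the type I error and 1 - fail1 is the power; the dissertation's extensions aim to shrink en and raise pet at poor response rates while holding these error probabilities, jointly across two ordered doses.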
Abstract:
The nucleus accumbens is considered a critical target of the action of drugs of abuse. In this nucleus a "shell" and a "core" have been distinguished on the basis of anatomical and histochemical criteria. The present study investigated the effect in freely moving rats of intravenous cocaine, amphetamine, and morphine on extracellular dopamine concentrations in the nucleus accumbens shell and core by means of microdialysis with vertically implanted concentric probes. Doses selected were in the range of those known to sustain drug self-administration in rats. Morphine, at 0.2 and 0.4 mg/kg, and cocaine, at 0.5 mg/kg, increased extracellular dopamine selectively in the shell. Higher doses of cocaine (1.0 mg/kg) and the lowest dose of amphetamine tested (0.125 mg/kg) increased extracellular dopamine both in the shell and in the core, but the effect was significantly more pronounced in the shell than in the core. Only the highest dose of amphetamine (0.250 mg/kg) increased extracellular dopamine in the shell and in the core to a similar extent. The present results provide in vivo neurochemical evidence for a functional compartmentation within the nucleus accumbens and for a preferential effect of psychostimulants and morphine in the shell of the nucleus accumbens at doses known to sustain intravenous drug self-administration.
Abstract:
Background: A sharp reduction in heroin supply in Australia in 2001 was followed by a large but transient increase in cocaine use among injecting drug users (IDU) in Sydney. This paper assesses whether the increase in cocaine use among IDU was accompanied by increased rates of violent crime, as occurred in the United States in the 1980s. Specifically, the paper examines the impact of increased cocaine use among Sydney IDU upon police incidents of robbery with a weapon, assault and homicide. Methods: Data on cocaine use among IDU were obtained from the Illicit Drug Reporting System (IDRS). Monthly NSW Police incident data on arrests for cocaine possession/use, robbery offences, homicides, and assaults were obtained from the Bureau of Crime Statistics and Research. Time series analysis was conducted on the police data series where possible. Semi-structured interviews were conducted with representatives from law enforcement and health agencies about the impacts of cocaine use on crime and policing. Results: There was a significant increase in cocaine use and cocaine possession offences in the months immediately following the reduction in heroin supply. There was also a significant increase in incidents of robbery involving weapons. There were no increases in offences involving firearms, homicides or reported assaults. Conclusion: The increased use of cocaine among injecting drug users following the heroin shortage led to increases in violent crime. Other states and territories that also experienced the heroin shortage but did not show any increase in cocaine use did not report any increase in violent crime. The violent crimes committed did not involve guns, most likely because of Australia's stringent gun laws, in contrast to the experience of American cities that have seen high rates of cocaine use and violent crime.
Abstract:
Disturbances in electrolyte homeostasis are a frequent adverse side effect of the administration of aminoglycoside antibiotics such as gentamicin, and of the antineoplastic agent cis-platinum. The aims of this work were to further elucidate the site(s) and mechanism(s) by which these drugs may produce disturbances in the renal reabsorption of calcium and magnesium. These investigations were undertaken using a range of in vivo and in vitro techniques and models. Initially, a series of in vivo studies was conducted to delineate aspects of the acute and chronic effects of both drugs on renal electrolyte handling and to select and evaluate an appropriate animal model; subsequent investigations were focused on gentamicin. In a study of the acute and chronic effects of cis-platinum administration, there were pronounced acute changes in a variety of indices of nephrotoxic injury, including electrolyte excretion. Most effects resolved, but there were chronic increases in the urinary excretion of calcium and magnesium. The renal response of three strains of rat (Fischer 344, Sprague-Dawley (SD), and Wistar) to a range of doses of gentamicin was also investigated. Drug administration produced substantially different responses between strains, in particular marked differences in calcium and magnesium excretion. The results suggested that the SD rat was an appropriately sensitive strain for use in further investigations. Acute infusion of gentamicin in the anaesthetised SD rat produced rapid, substantial increases in the fractional excretion of calcium and magnesium, while sodium and potassium output were unaffected, confirming previous results of similar experiments using F344 rats. Studies using lithium clearance measurements in the anaesthetised SD rat were undertaken to investigate the effects of gentamicin on proximal tubular calcium reabsorption.
Lithium clearance was unaffected by acute gentamicin infusion, suggesting that the site of acute gentamicin-induced hypercalciuria may not be located in the proximal tubule. Inhibition of Ca2+ ATPase activity was investigated as a potential mechanism by which calcium reabsorption could be affected after aminoglycoside administration. In vitro, both Ca2+ ATPase and Na+/K+ ATPase activity could be similarly inhibited by the presence of aminoglycosides, in a dose-related manner. Whilst inhibition of Na+/K+ ATPase could be demonstrated biochemically after in vivo administration of gentamicin, there were no concurrent effects on Ca2+ ATPase activity, suggesting that inhibition of Ca2+ ATPase activity is unlikely to be a primary mechanism of aminoglycoside-induced reductions of calcium reabsorption. Histochemical studies could not discern inhibition of either Na+/K+ ATPase or Ca2+ ATPase activity after in vivo administration of gentamicin. Selection of renal cell lines for further investigative in vitro studies on the mechanisms of altered cation reabsorption was considered using MTT (3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide) and Neutral Red cytotoxicity assays. The ability of LLC-PK1 and LLC-RK1 cell lines to correctly rank a series of nephrotoxic compounds according to their known nephrotoxic potency in vivo was studied. Using these cell lines grown on semi-permeable inserts, alterations in the paracellular transport of 45Ca were investigated as a possible mechanism by which gentamicin could alter calcium reabsorption in vivo. Short-term exposure (1 h) of LLC-RK1 cells to gentamicin, via both cell surfaces, resulted in a reduction in paracellular permeability to both transepithelial 3H-mannitol and 45Ca fluxes. When LLC-RK1 cells were exposed via the apical surface only, similar dose-related reductions were seen to those observed when cells were exposed to the drug from both sides.
Short-term basal exposure to gentamicin appeared to contribute less to the observed reductions in 3H-mannitol and 45Ca fluxes. Experiments investigating transepithelial movement of 45Ca and 3H-mannitol on LLC-PK1 cells after acute gentamicin exposure were inconclusive. Longer exposure (48 h) to gentamicin caused an increase in the permeability of the monolayer and a consequent increase in transepithelial 45Ca flux in the LLC-RK1 cell line; increases in permeability of LLC-PK1 cells to 45Ca and 3H-mannitol were not apparent under the same conditions. The site and mechanism at which gentamicin, in particular, alters calcium reabsorption cannot be definitively described from these studies. However, indirect evidence from lithium clearance studies suggests that the site of the lesion is unlikely to be located in the proximal tubule. The mechanism by which gentamicin exposure alters calcium reabsorption may be by reducing paracellular permeability to calcium rather than by altering active calcium transport processes.