Abstract:
Background: Because imatinib pharmacokinetics are highly variable, plasma levels differ widely between patients on the same dosage. Retrospective studies in chronic myeloid leukemia (CML) patients showed significant correlations between low levels and suboptimal response, and between high levels and poor tolerability. Monitoring of plasma levels is thus increasingly advised, targeting trough concentrations of 1000 μg/L and above. Objectives: Our study was launched to assess the clinical usefulness of systematic imatinib TDM in CML patients. This preliminary evaluation examines whether dosage adjustment guided by plasma level measurement can reach the recommended trough level while allowing an interval of 4-24 h after the last drug intake for blood sampling. Methods: Initial blood samples from the first 9 patients in the intervention arm were obtained 4-25 h after the last dose. Based on a Bayesian approach using a population pharmacokinetic model, trough levels in 7 patients were predicted to deviate significantly from the target (6 <750 μg/L, and 1 >1500 μg/L with poor tolerance). Individual dosage adjustments were undertaken in 5 patients, who had a control measurement 1-4 weeks after the dosage change. Predicted trough levels were compared with the earlier model-based extrapolations. Results: Before dosage adjustment, observed concentrations extrapolated to trough ranged from 359 to 1832 μg/L (median 710; mean 804; CV 53%) in the 9 patients. After dosage adjustment, they were predicted to fall between 720 and 1090 μg/L (median 878; mean 872; CV 13%). Observed levels of the 5 recheck measurements, extrapolated to trough, actually ranged from 710 to 1069 μg/L (median 1015; mean 950; CV 16%), with absolute differences of 21 to 241 μg/L from the model-based predictions (median 175; mean 157; CV 52%). Differences between observed and predicted trough levels were larger when the interval between last drug intake and sampling was very short (~4 h).
Conclusion: These preliminary results suggest that TDM of imatinib with a Bayesian interpretation can bring trough levels closer to 1000 μg/L (CV decreasing from 53% to 16%). While this may simplify blood collection in daily practice, since samples need not be drawn exactly at trough, the largest possible interval after the last drug intake remains preferable. These findings encourage evaluation of the clinical benefit of a routine TDM intervention in CML patients, which is the aim of the randomized Swiss I-COME study.
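The extrapolation of a concentration drawn 4-24 h post-dose back to the 24-hour trough can be sketched with a simple mono-exponential decline; the half-life below is an illustrative assumption, not the study's population pharmacokinetic model, which fits individual parameters by Bayesian estimation.

```python
import math

def extrapolate_to_trough(c_obs, t_obs, tau=24.0, half_life=18.0):
    """Extrapolate an observed imatinib level (ug/L), sampled t_obs hours
    after the last dose, to the trough at tau hours, assuming simple
    mono-exponential decline (one-compartment sketch; half_life is an
    illustrative value, not a fitted individual parameter)."""
    ke = math.log(2) / half_life              # elimination rate constant (1/h)
    return c_obs * math.exp(-ke * (tau - t_obs))

# A level of 1200 ug/L drawn 6 h post-dose projects to a lower 24-h trough:
trough = extrapolate_to_trough(1200.0, 6.0)   # one half-life later -> 600 ug/L
```

This also illustrates why very short post-dose intervals (~4 h) are less reliable: the longer the extrapolation span, the more any model misfit is amplified.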
Abstract:
PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult due to extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how gradual TDM introduction altered empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria. Carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics (109 received carbapenems). Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with daily dose (r=0.7). Similar numbers of inappropriate meropenem trough levels (30.4%) were below and above the upper limit. Real-time TDM introduction increased the empirical dose of imipenem/cilastatin, but not meropenem. CONCLUSIONS: Real-time carbapenem TDM availability significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.
Abstract:
OBJECTIVE: We present a prospective study of a cyclosporin microemulsion used to treat idiopathic nephrotic syndrome in ten children with normal renal function who had cyclosporin trough levels between 50 and 150 ng/ml and achieved complete remission with cyclosporin. The aim was to compare the pharmacokinetic parameters of cyclosporin in idiopathic nephrotic syndrome during remission and relapse of the nephrotic state. METHOD: The pharmacokinetic profile of cyclosporin was evaluated with the 12-hour area under the time-concentration curve (AUC0-12) using seven time-point samples. This procedure was performed on each patient during remission and relapse with the same cyclosporin dose in mg/kg/day. The 12-hour area under the time-concentration curve was calculated using the trapezoidal rule. All of the pharmacokinetic parameters and the abbreviated 4-hour area under the time-concentration curve were correlated with the 12-hour area under the time-concentration curve. ClinicalTrials.gov: NCT01616446. RESULTS: There were no significant differences in any pharmacokinetic parameters of cyclosporin between remission and relapse, even when the data were normalized by dose. The best correlation with the 12-hour area under the time-concentration curve was the 4-hour area under the time-concentration curve, in both remission and relapse, followed by the 2-hour post-dose level (C2) in both disease states. CONCLUSIONS: These data indicate that the same parameters used for cyclosporin therapeutic monitoring estimated during the nephrotic state can also be used during remission. Larger controlled studies are needed to confirm these findings.
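The linear trapezoidal rule used for the AUC0-12 calculation can be sketched as follows; the seven-point profile below is a hypothetical example, not patient data from the study.

```python
def auc_trapezoidal(times, concs):
    """Area under the time-concentration curve by the linear trapezoidal
    rule: each interval contributes its width times the mean of the two
    bounding concentrations."""
    if len(times) != len(concs) or len(times) < 2:
        raise ValueError("need matched time/concentration pairs")
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))

# Hypothetical seven-point cyclosporin profile (hours, ng/ml) over 12 h:
times = [0, 1, 2, 4, 6, 8, 12]
concs = [120, 600, 900, 500, 300, 200, 130]
auc_0_12 = auc_trapezoidal(times, concs)   # ng*h/ml
```

The abbreviated AUC0-4 used for correlation in the study is the same computation restricted to the first four hours of samples.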
Abstract:
The toxicity of long-term immunosuppressive therapy has become a major concern in long-term follow-up of heart transplant recipients. In this respect the quality of renal function is undoubtedly linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 micrograms/L in many centers without direct evidence for the necessity of such high levels while using triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection as well as overall mortality between groups of patients with high (250 to 350 micrograms/L) and low (150 to 250 micrograms/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; aged 44 +/- 12 years; mean follow-up, 1,122 +/- 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 micrograms/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < 130 μmol/L, n = 234; group II, > or = 130 μmol/L, n = 98). The overall 5-year survival excluding the early 30-day mortality was 92% (group I, 216/232) and 91% (group II, 89/98) with 75% of the mortality due to chronic rejection. The rate of rejection for the entire follow-up period was similar in both groups (first year: group I, 3.2 +/- 2.6 rejections/patient/year; group II, 3.6 +/- 2.7 rejections/patient/year; p = not significant).(ABSTRACT TRUNCATED AT 250 WORDS)
Abstract:
The posters presented at the 6th International Immunoglobulin Symposium covered a wide range of fields and included both basic science and clinical research. From the abstracts accepted for poster presentation, 12 abstracts were selected for oral presentations in three parallel sessions on immunodeficiencies, autoimmunity and basic research. The immunodeficiency presentations dealt with novel, rare class-switch recombination (CSR) deficiencies, attenuation of adverse events following IVIg treatment, association of immunoglobulin (Ig)G trough levels and protection against acute infection in patients with X-linked agammaglobulinaemia (XLA) and common variable immunodeficiency (CVID), and the reduction of class-switched memory B cells in patients with specific antibody deficiency (SAD). The impact of intravenous immunoglobulin on fetal alloimmune thrombocytopenia, pregnancy and postpartum-related relapses in multiple sclerosis and refractory myositis, as well as experiences with subcutaneous immunoglobulin in patients with multi-focal motor neuropathy, were the topics presented in the autoimmunity session. The interaction of dendritic cell (DC)-SIGN and alpha2,6-sialylated IgG Fc and its impact on human DCs, the enrichment of sialylated IgG in plasma-derived IgG, as well as prion surveillance and monitoring of anti-measles titres in immunoglobulin products, were covered in the basic science session. In summary, the presentations illustrated the breadth of immunoglobulin therapy usage and highlighted the progress that is being made in diverse areas of basic and clinical research, extending our understanding of the mechanisms of immunoglobulin action and contributing to improved patient care.
Abstract:
OBJECTIVE: To examine a once daily dosing regimen of netilmicin in critically ill neonates and children. DESIGN AND SETTING: Open, prospective study on 81 antibiotic courses in 77 critically ill neonates and children, hospitalized in a multidisciplinary pediatric/neonatal intensive care unit. For combined empiric therapy (aminoglycoside and beta-lactam), netilmicin was given intravenously over 5 min once every 24 h. The dose ranged from 3.5-6 mg/kg, mainly depending upon gestational and postnatal age. Peak levels were determined by immunoassay 30 min after the second dose and trough levels 1 h before the third and fifth dose or after adaptation of dosing. RESULTS: All peak levels (n = 28) were clearly above 12 μmol/l (mean 22, range 13-41 μmol/l). Eighty-nine trough levels were within desired limits (< 4 μmol/l) and 11 (11%) above 4 μmol/l, mostly in conjunction with impaired renal function. CONCLUSIONS: Optimal peak and trough levels of netilmicin can be achieved by once daily dosing, adapted to gestational/postnatal age and renal function.
Abstract:
Objectives: To determine HIV-1 RNA in cerebrospinal fluid (CSF) of successfully treated patients and to evaluate if combination antiretroviral treatments with higher central nervous system penetration-effectiveness (CPE) achieve better CSF viral suppression. Methods: Viral loads (VLs) and drug concentrations of lopinavir, atazanavir, and efavirenz were measured in plasma and CSF. The CPE was calculated using 2 different methods. Results: The authors analyzed 87 CSF samples of 60 patients. In 4 CSF samples, HIV-1 RNA was detectable with 43–82 copies per milliliter. Median CPE in patients with detectable CSF VL was significantly lower compared with individuals with undetectable VL: CPE of 1.0 (range, 1.0–1.5) versus 2.3 (range, 1.0–3.5) using the method of 2008 (P = 0.011) and CPE of 6 (range, 6–8) versus 8 (range, 5–12) using the method of 2010 (P = 0.022). The extrapolated CSF trough levels for atazanavir (n = 12) were clearly above the 50% inhibitory concentration (IC50) in only 25% of samples; both patients on atazanavir/ritonavir with detectable CSF HIV-1 RNA had trough levels in the range of the presumed IC50. The extrapolated CSF trough level for lopinavir (n = 42) and efavirenz (n = 18) were above the IC50 in 98% and 78%, respectively, of samples, including the patients with detectable CSF HIV-1 RNA. Conclusions: This study suggests that treatment regimens with high intracerebral efficacy reflected by a high CPE score are essential to achieve CSF HIV-1 RNA suppression. The CPE score including all drug components was a better predictor for treatment failure in the CSF than the sole concentrations of protease inhibitor or nonnucleoside reverse transcriptase inhibitor in plasma or CSF.
Abstract:
OBJECTIVE: The aetiology of Crohn's disease (CD) has been related to nucleotide-binding oligomerisation domain containing 2 (NOD2) and ATG16L1 gene variants. The observation of bacterial DNA translocation in patients with CD led us to hypothesise that this process may be facilitated in patients with NOD2/ATG16L1-variant genotypes, affecting the efficacy of anti-tumour necrosis factor (TNF) therapies. DESIGN: 179 patients with Crohn's disease were included. CD-related NOD2 and ATG16L1 variants were genotyped. Phagocytic and bactericidal activities were evaluated in blood neutrophils. Bacterial DNA, TNFα, IFNγ, IL-12p40, free serum infliximab/adalimumab levels and antidrug antibodies were measured. RESULTS: Bacterial DNA was found in 44% of patients with active disease versus 23% of patients with remitting disease (p=0.01). A NOD2-variant or ATG16L1-variant genotype was associated with bacterial DNA presence (OR 4.8; 95% CI 1.1 to 13.2; p=0.001; and OR 2.4; 95% CI 1.4 to 4.7; p=0.01, respectively). This OR was 12.6 (95% CI 4.2 to 37.8; p=0.001) for patients with a double-variant genotype. Bacterial DNA was associated with disease activity (OR 2.6; 95% CI 1.3 to 5.4; p=0.005). Single and double-gene variants were not associated with disease activity (p=0.19). Patients with a NOD2-variant genotype showed decreased phagocytic and bactericidal activities in blood neutrophils, increased TNFα levels in response to bacterial DNA and decreased trough levels of free anti-TNFα. The proportion of patients on an intensified biological therapy was significantly higher in the NOD2-variant groups. CONCLUSIONS: Our results characterise a subgroup of patients with CD who may require a more aggressive therapy to reduce the extent of inflammation and the risk of relapse.
Abstract:
The consumption of immunoglobulins (Ig) is increasing due to better recognition of antibody deficiencies, an aging population, and new indications. This review aims to examine the various dosing regimens and research developments in the established and in some of the relevant off-label indications in Europe. The background to the current regulatory settings in Europe is provided as a backdrop for the latest developments in primary and secondary immunodeficiencies and in immunomodulatory indications. In these heterogeneous areas, clinical trials encompassing different routes of administration, varying intervals, and infusion rates are paving the way toward more individualized therapy regimens. In primary antibody deficiencies, adjustments in dosing and intervals will depend on the clinical presentation, effective IgG trough levels and IgG metabolism. Ideally, individual pharmacokinetic profiles in conjunction with the clinical phenotype could lead to highly tailored treatment. In practice, incremental dosage increases are necessary to titrate the optimal dose for more severely ill patients. Higher intravenous doses in these patients also have beneficial immunomodulatory effects beyond mere IgG replacement. Better understanding of the pharmacokinetics of Ig therapy is leading to a move away from simplistic "per kg" dosing. Defective antibody production is common in many secondary immunodeficiencies irrespective of whether the causative factor was lymphoid malignancies (established indications), certain autoimmune disorders, immunosuppressive agents, or biologics. This antibody failure, as shown by test immunization, may be amenable to treatment with replacement Ig therapy. In certain immunomodulatory settings [e.g., idiopathic thrombocytopenic purpura (ITP)], selection of patients for Ig therapy may be enhanced by relevant biomarkers in order to exclude non-responders and thus obtain higher response rates. 
In this review, the developments in dosing of therapeutic immunoglobulins have been limited to high and some medium priority indications such as ITP, Kawasaki disease, Guillain-Barré syndrome, chronic inflammatory demyelinating polyradiculoneuropathy, myasthenia gravis, multifocal motor neuropathy, fetal alloimmune thrombocytopenia, fetal hemolytic anemia, and dermatological diseases.
Abstract:
The endogenous clock that drives circadian rhythms is thought to communicate temporal information within the cell via cycling downstream transcripts. A transcript encoding a glycine-rich RNA-binding protein, Atgrp7, in Arabidopsis thaliana undergoes circadian oscillations with peak levels in the evening. The AtGRP7 protein also cycles with a time delay so that Atgrp7 transcript levels decline when the AtGRP7 protein accumulates to high levels. After AtGRP7 protein concentration has fallen to trough levels, Atgrp7 transcript starts to reaccumulate. Overexpression of AtGRP7 in transgenic Arabidopsis plants severely depresses cycling of the endogenous Atgrp7 transcript. These data establish both transcript and protein as components of a negative feedback circuit capable of generating a stable oscillation. AtGRP7 overexpression also depresses the oscillation of the circadian-regulated transcript encoding the related RNA-binding protein AtGRP8 but does not affect the oscillation of transcripts such as cab or catalase mRNAs. We propose that the AtGRP7 autoregulatory loop represents a “slave” oscillator in Arabidopsis that receives temporal information from a central “master” oscillator, conserves the rhythmicity by negative feedback, and transduces it to the output pathway by regulating a subset of clock-controlled transcripts.
Abstract:
BACKGROUND Nocardiosis is a rare, life-threatening opportunistic infection, affecting 0.04% to 3.5% of patients after solid organ transplantation (SOT). The aim of this study was to identify risk factors for Nocardia infection after SOT and to describe the presentation of nocardiosis in these patients. METHODS We performed a retrospective case-control study of adult patients diagnosed with nocardiosis after SOT between 2000 and 2014 in 36 European (France, Belgium, Switzerland, Netherlands, Spain) centers. Two control subjects per case were matched by institution, transplant date and transplanted organ. A multivariable analysis was performed using conditional logistic regression to identify risk factors for nocardiosis. RESULTS One hundred and seventeen cases of nocardiosis and 234 control patients were included. Nocardiosis occurred at a median of 17.5 [range 2-244] months after transplantation. In multivariable analysis, high calcineurin inhibitor trough levels in the month before diagnosis (OR=6.11 [2.58-14.51]), use of tacrolimus (OR=2.65 [1.17-6.00]) and corticosteroid dose (OR=1.12 [1.03-1.22]) at the time of diagnosis, patient age (OR=1.04 [1.02-1.07]) and length of stay in intensive care unit after SOT (OR=1.04 [1.00-1.09]) were independently associated with development of nocardiosis; low-dose cotrimoxazole prophylaxis was not found to prevent nocardiosis. Nocardia farcinica was more frequently associated with brain, skin and subcutaneous tissue infections than were other Nocardia species. Among the 30 cases with central nervous system nocardiosis, 13 (43.3%) had no neurological symptoms. CONCLUSIONS We identified five risk factors for nocardiosis after SOT. Low-dose cotrimoxazole was not found to prevent Nocardia infection. These findings may help improve management of transplant recipients.
Abstract:
Patient outcomes in transplantation would improve if dosing of immunosuppressive agents was individualized. The aim of this study is to develop a population pharmacokinetic model of tacrolimus in adult liver transplant recipients and test this model in individualizing therapy. Population analysis was performed on data from 68 patients. Estimates were sought for apparent clearance (CL/F) and apparent volume of distribution (V/F) using the nonlinear mixed effects model program (NONMEM). Factors screened for influence on these parameters were weight, age, sex, transplant type, biliary reconstructive procedure, postoperative day, days of therapy, liver function test results, creatinine clearance, hematocrit, corticosteroid dose, and interacting drugs. The predictive performance of the developed model was evaluated through Bayesian forecasting in an independent cohort of 36 patients. No linear correlation existed between tacrolimus dosage and trough concentration (r(2) = 0.005). Mean individual Bayesian estimates for CL/F and V/F were 26.5 +/- 8.2 (SD) L/hr and 399 +/- 185 L, respectively. CL/F was greater in patients with normal liver function. V/F increased with patient weight. CL/F decreased with increasing hematocrit. Based on the derived model, a 70-kg patient with an aspartate aminotransferase (AST) level less than 70 U/L would require a tacrolimus dose of 4.7 mg twice daily to achieve a steady-state trough concentration of 10 ng/mL. A 50-kg patient with an AST level greater than 70 U/L would require a dose of 2.6 mg. Marked interindividual variability (43% to 93%) and residual random error (3.3 ng/mL) were observed. Predictions made using the final model were reasonably nonbiased (0.56 ng/mL), but imprecise (4.8 ng/mL). Pharmacokinetic information obtained will assist in tacrolimus dosing; however, further investigation into reasons for the pharmacokinetic variability of tacrolimus is required.
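The link between an apparent clearance estimate and a dose recommendation can be sketched from the standard steady-state relationship Css,avg = (dose/τ)/(CL/F); this is a simplified illustration, not the study's final covariate model, which also adjusts CL/F and V/F for liver function, weight, and hematocrit.

```python
def css_average(dose_mg, tau_h, cl_f_l_per_h):
    """Average steady-state concentration (ng/mL) for an oral regimen:
    Css,avg = (dose / tau) / (CL/F). Simplified sketch only; a trough
    is lower than this average, and real dosing uses Bayesian
    individual estimates rather than the population mean."""
    dose_ng = dose_mg * 1e6                 # mg -> ng
    cl_ml_per_h = cl_f_l_per_h * 1000.0     # L/h -> mL/h
    return (dose_ng / tau_h) / cl_ml_per_h

# With the reported mean CL/F of 26.5 L/h, 4.7 mg twice daily gives an
# average level of roughly 14.8 ng/mL, consistent with a 10 ng/mL trough:
css = css_average(4.7, 12.0, 26.5)
```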
Abstract:
The aim of this study was to determine the most informative sampling time(s) providing a precise prediction of tacrolimus area under the concentration-time curve (AUC). Fifty-four concentration-time profiles of tacrolimus from 31 adult liver transplant recipients were analyzed. Each profile contained 5 tacrolimus whole-blood concentrations (predose and 1, 2, 4, and 6 or 8 hours postdose), measured using liquid chromatography-tandem mass spectrometry. The concentration at 6 hours was interpolated for each profile, and 54 values of AUC(0-6) were calculated using the trapezoidal rule. The best sampling times were then determined using limited sampling strategies and sensitivity analysis. Linear mixed-effects modeling was performed to estimate regression coefficients of equations incorporating each concentration-time point (C0, C1, C2, C4, interpolated C5, and interpolated C6) as a predictor of AUC(0-6). Predictive performance was evaluated by assessment of the mean error (ME) and root mean square error (RMSE). Limited sampling strategy (LSS) equations with C2, C4, and C5 provided similar results for prediction of AUC(0-6) (R-2 = 0.869, 0.844, and 0.832, respectively). These 3 time points were superior to C0 in the prediction of AUC. The ME was similar for all time points; the RMSE was smallest for C2, C4, and C5. The highest sensitivity index was determined to be 4.9 hours postdose at steady state, suggesting that this time point provides the most information about the AUC(0-12). The results from limited sampling strategies and sensitivity analysis supported the use of a single blood sample at 5 hours postdose as a predictor of both AUC(0-6) and AUC(0-12). A jackknife procedure was used to evaluate the predictive performance of the model, and this demonstrated that collecting a sample at 5 hours after dosing could be considered as the optimal sampling time for predicting AUC(0-6).
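A single-point limited sampling strategy equation of the form AUC = a + b * C(t) can be fitted by ordinary least squares, mirroring the linear regression approach described; the C5/AUC0-6 pairs below are synthetic values for illustration, not data from the study.

```python
import numpy as np

def fit_lss(concs_at_t, aucs):
    """Fit AUC = a + b * C(t) by ordinary least squares, returning
    (intercept a, slope b). Sketch of a single-point limited sampling
    strategy; the study used linear mixed-effects modeling."""
    X = np.column_stack([np.ones_like(concs_at_t), concs_at_t])
    coef, *_ = np.linalg.lstsq(X, aucs, rcond=None)
    return coef

# Synthetic 5-hour tacrolimus levels (ng/mL) and AUC0-6 values (ng*h/mL):
c5 = np.array([6.0, 8.0, 10.0, 12.0, 14.0])
auc = np.array([62.0, 81.0, 99.0, 121.0, 140.0])
a, b = fit_lss(c5, auc)
pred = a + b * 9.0   # AUC0-6 predicted from one 5-hour post-dose sample
```

The predictive performance of such an equation is then judged, as in the study, by the mean error (bias) and root mean square error (precision) of its predictions against the full trapezoidal AUC.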