46 results for warfarin dosing algorithms
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
BACKGROUND Despite substantial evidence supporting a pharmacogenetic approach to warfarin therapy in adults, evidence on the importance of genetics in warfarin therapy in children is limited, particularly for clinical outcomes. We assessed the contribution of CYP2C9/VKORC1/CYP4F2 genotypes and variation in other genes involved in vitamin K and coagulation pathways to warfarin dose and related clinical outcomes in children. PROCEDURE Clinical and genetic data for 93 children (age ≤ 18 years) who received warfarin therapy were obtained. DNA was genotyped for 93 selected single nucleotide polymorphisms using a custom assay. RESULTS With a median age of 4.8 years, our cohort included more young children than most previous studies. Overall, 76.3% of dose variability was explained by weight, indication, VKORC1-1639G/A and CYP2C9*2/*3, with genotypes accounting for 21.1% of variability. There was a strong correlation (R² = 0.68; P < 0.001) between actual and predicted warfarin dose using a pediatric genotype-based dosing model. VKORC1 genotype had a significant impact on time to therapeutic international normalized ratio (INR) (P = 0.047) and time to over-anticoagulation (INR > 4; P = 0.024) during the initiation of therapy. CYP2C9*3 carriers were also at increased risk of major bleeding while receiving warfarin (adjusted OR = 11.28). An additional variant in CYP2C9 (rs7089580) was significantly associated with warfarin dose (P = 0.020) in a multivariate clinical and genetic model. CONCLUSIONS This study confirms the importance of VKORC1/CYP2C9 genotypes for warfarin dosing in a young pediatric cohort and demonstrates an impact of genetic factors on clinical outcomes in children. Furthermore, we identified an additional variant in CYP2C9 of potential relevance for warfarin dosing in children.
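To make the modeling approach concrete, here is a minimal sketch of a genotype-based dosing model of the kind described above: a linear regression on weight, indication, and VKORC1/CYP2C9 genotype. All coefficients, and the choice of Fontan circulation as the example indication, are hypothetical illustrations, not the study's fitted values.

```python
# Hypothetical genotype-based warfarin dosing model: a linear regression on
# weight, indication, and VKORC1/CYP2C9 genotypes. Coefficients are made up
# for illustration and are NOT the fitted values from the study.

def predicted_weekly_dose_mg(weight_kg: float,
                             vkorc1_a_alleles: int,    # copies of -1639A (0-2)
                             cyp2c9_var_alleles: int,  # copies of *2/*3 (0-2)
                             fontan: bool) -> float:
    """Toy linear model: dose rises with weight, falls with variant alleles."""
    dose = 1.5                        # hypothetical intercept (mg/week)
    dose += 0.07 * weight_kg          # larger children need more drug
    dose -= 0.8 * vkorc1_a_alleles    # VKORC1 -1639A lowers dose requirement
    dose -= 0.6 * cyp2c9_var_alleles  # CYP2C9*2/*3 slows warfarin clearance
    dose -= 0.5 if fontan else 0.0    # hypothetical indication effect
    return max(dose, 0.5)             # floor at a minimal dose

print(predicted_weekly_dose_mg(weight_kg=18, vkorc1_a_alleles=1,
                               cyp2c9_var_alleles=0, fontan=False))
```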
Abstract:
Erythropoietin (EPO) deficiency and iron deficiency are well-recognized causes of anemia in patients with limited renal function or end-stage renal disease. The concomitant impairment of red blood cell (RBC) survival has been largely neglected. Properties of the uremic environment such as inflammation, increased oxidative stress and uremic toxins appear to be responsible for the premature changes in the RBC membrane and cytoskeleton. The exposure of antigenic sites and the breakdown of phosphatidylserine asymmetry promote RBC phagocytosis. Although the individual response to treatment with erythropoiesis-stimulating agents (ESA) depends on both the RBC lifespan and the production rate, uniform dosing algorithms do not account for this. The clinical use of mathematical models predicting ESA-induced changes in hematocrit might be greatly improved once independent estimates of RBC production rate and/or lifespan become available, making the concomitant estimation of both parameters unnecessary. Since heme breakdown by the heme oxygenase pathway produces carbon monoxide (CO), which is exhaled, a simple CO breath test has been used to calculate hemoglobin turnover and therefore RBC survival and lifespan. Future research is needed to validate and implement this method in patients with kidney failure. This will yield new insights into RBC kinetics in renal patients. Eventually, these findings are expected to improve our understanding of the variability of the hemoglobin response to ESA.
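As a rough illustration of the CO breath test logic: heme oxygenase releases one CO molecule per heme catabolized, so endogenous CO production tracks hemoglobin turnover, and lifespan follows from dividing the circulating heme pool by that turnover. The sketch below uses illustrative placeholder values, not measured data.

```python
# Back-of-the-envelope RBC lifespan estimate from a CO breath test.
# One CO molecule is released per heme degraded, so CO production measures
# hemoglobin turnover. All input values are illustrative placeholders.

MOLAR_MASS_HB = 64500      # g/mol, hemoglobin tetramer
HEMES_PER_HB = 4           # one CO per heme degraded

blood_volume_l = 5.0       # assumed blood volume
hb_g_per_l = 150.0         # assumed hemoglobin concentration
co_umol_per_day = 380.0    # hypothetical CO production attributed to RBCs

total_heme_umol = (blood_volume_l * hb_g_per_l / MOLAR_MASS_HB
                   * HEMES_PER_HB * 1e6)
lifespan_days = total_heme_umol / co_umol_per_day
print(f"Estimated RBC lifespan: {lifespan_days:.0f} days")  # ~122 days
```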
Abstract:
The management of anemia in patients with chronic renal failure has greatly improved since recombinant human erythropoietin became available in the late 1980s, leading to a considerable reduction in mortality and morbidity and to an improvement in quality of life. The findings from recent controlled clinical outcome trials have resulted in a rather narrow, generally accepted therapeutic hematocrit target range. However, currently available dosing algorithms do not permit achievement and maintenance of target values within the therapeutic range in many patients. One possible explanation for this failure is that most algorithms do not account for the finite erythrocyte lifespan. The purpose of this article is to underline the essential role played by the erythrocyte lifespan in the erythropoietic response to recombinant human erythropoietin and to encourage the integration of this concept into the future development of computer-assisted decision support systems.
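A toy simulation can show why ignoring the finite lifespan misleads a dosing algorithm: cells produced on a given day leave the circulation one lifespan later, so the red-cell mass plateaus under constant stimulation rather than rising indefinitely. The production schedule and lifespan below are illustrative assumptions.

```python
# Minimal sketch: circulating RBC "mass" with a finite lifespan. Cells made
# on day t are removed on day t + lifespan; a model without this removal
# term would predict an ever-rising hematocrit. Numbers are illustrative.

from collections import deque

def simulate_rbc_mass(production, lifespan_days, days):
    """Track circulating RBC mass given a per-day production schedule."""
    cohorts = deque()          # one entry per day: cells made that day
    mass, history = 0.0, []
    for t in range(days):
        p = production(t)
        cohorts.append(p)
        mass += p
        if len(cohorts) > lifespan_days:   # cells born one lifespan ago die
            mass -= cohorts.popleft()
        history.append(mass)
    return history

# Constant ESA-stimulated production: mass plateaus once production
# balances senescence, instead of rising without bound.
trace = simulate_rbc_mass(lambda t: 1.0, lifespan_days=60, days=200)
print(trace[30], trace[100], trace[199])   # 31.0, then plateau at 60.0
```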
Abstract:
Prospective validation of two algorithms for the initiation of phenprocoumon treatment.
Abstract:
OBJECTIVE To systematically review evidence on genetic variants influencing outcomes during warfarin therapy and provide practice recommendations addressing the key questions: (1) Should genetic testing be performed in patients with an indication for warfarin therapy to improve achievement of stable anticoagulation and reduce adverse effects? (2) Are there subgroups of patients who may benefit more from genetic testing than others? (3) How should patients with an indication for warfarin therapy be managed based on their genetic test results? METHODS A systematic literature search was performed for VKORC1 and CYP2C9 and their association with warfarin therapy. Evidence was critically appraised, and clinical practice recommendations were developed based on expert group consensus. RESULTS Testing of VKORC1 (-1639G>A), CYP2C9*2, and CYP2C9*3 should be considered for all patients, including pediatric patients, within the first 2 weeks of therapy or after a bleeding event. Testing for CYP2C9*5, *6, *8, or *11 and CYP4F2 (V433M) is currently not recommended. Testing should also be considered for all patients who are at increased risk of bleeding complications, who consistently show out-of-range international normalized ratios (INRs), or who suffer adverse events while receiving warfarin. Genotyping results should be interpreted using a pharmacogenetic dosing algorithm to estimate the required dose. SIGNIFICANCE This review provides the latest update on genetic markers for warfarin therapy and clinical practice recommendations as a basis for informed decision making regarding the use of genotype-guided dosing in patients with an indication for warfarin therapy, and it identifies knowledge gaps to guide future research.
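To illustrate how genotype results might feed such a decision, the sketch below maps a combined VKORC1/CYP2C9 result to a qualitative dose-requirement category via a lookup table. The table entries are illustrative placeholders, not the expert group's actual recommendations, which would come from a validated pharmacogenetic dosing algorithm.

```python
# Hedged sketch of genotype-to-guidance lookup. The categories and genotype
# combinations below are illustrative, NOT the consensus recommendations.

SENSITIVITY = {
    # (VKORC1 -1639 genotype, CYP2C9 diplotype) -> expected dose requirement
    ("GG", "*1/*1"): "standard",
    ("GA", "*1/*1"): "reduced",
    ("AA", "*1/*1"): "low",
    ("GG", "*1/*3"): "reduced",
    ("GA", "*1/*3"): "low",
    ("AA", "*3/*3"): "very low; consider alternative anticoagulant",
}

def dose_category(vkorc1: str, cyp2c9: str) -> str:
    """Fall back to a clinical algorithm when the combination is unlisted."""
    return SENSITIVITY.get((vkorc1, cyp2c9), "no guidance; use clinical algorithm")

print(dose_category("GA", "*1/*3"))   # -> "low"
```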
Abstract:
Carnosine (β-alanyl-L-histidine) is found in high concentrations in skeletal muscle, and chronic β-alanine (BA) supplementation can increase carnosine content. This placebo-controlled, double-blind study compared two different 8-week BA dosing regimens on the time course of muscle carnosine loading and an 8-week washout, yielding a BA dose-response study with serial muscle carnosine assessments throughout. Thirty-one young males were randomized into three BA dosing groups: (1) high-low: 3.2 g BA/day for 4 weeks, followed by 1.6 g BA/day for 4 weeks; (2) low-low: 1.6 g BA/day for 8 weeks; and (3) placebo. Muscle carnosine in the tibialis anterior (TA) and gastrocnemius (GA) muscles was measured by 1H-MRS at weeks 0, 2, 4, 8, 12 and 16. Flushing symptoms and blood clinical chemistry changes were trivial in all three groups, and there were no muscle carnosine changes in the placebo group. During the first 4 weeks, the increase for high-low (TA 2.04 mmol/kgww, GA 1.75 mmol/kgww) was roughly twofold greater than for low-low (TA 1.12 mmol/kgww, GA 0.80 mmol/kgww). 1.6 g BA/day significantly increased muscle carnosine within 2 weeks and induced continued rises in already augmented muscle carnosine stores (weeks 4-8, high-low regimen). The dose-response showed a carnosine increase of 2.01 mmol/kgww per 100 g of BA consumed, which depended only on the total accumulated BA consumed (within a daily intake range of 1.6-3.2 g BA/day). Washout rates were gradual (0.18 and 0.43 mmol/kgww/week; ~2%/week). In summary, the absolute increase in muscle carnosine depends only on the total BA consumed and not on baseline muscle carnosine, the muscle type, or the daily amount of supplemented BA.
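The reported slope lends itself to a worked example: at 2.01 mmol/kgww per 100 g of BA, any schedule delivering the same cumulative dose predicts the same gain. The calculation below assumes strict linearity within the studied intake range.

```python
# Worked example of the reported dose-response: +2.01 mmol/kgww of muscle
# carnosine per 100 g of total β-alanine consumed, independent of the daily
# dose within 1.6-3.2 g/day. Linearity within that range is assumed.

SLOPE = 2.01 / 100.0   # mmol/kgww per gram of BA consumed

def carnosine_gain(daily_dose_g: float, days: int) -> float:
    return SLOPE * daily_dose_g * days

# Same total BA, different schedules -> same predicted gain:
print(carnosine_gain(3.2, 28))   # 3.2 g/day for 4 weeks: ~1.8 mmol/kgww
print(carnosine_gain(1.6, 56))   # 1.6 g/day for 8 weeks: ~1.8 mmol/kgww
```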
Abstract:
Background Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have shown that a patient's antibody reaction in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score, Innogenetics) provides information on the duration of infection. Here, we sought to further investigate the diagnostic specificity of various Inno-Lia algorithms and to identify factors affecting it. Methods Plasma samples of 714 selected patients of the Swiss HIV Cohort Study, infected for longer than 12 months and representing all viral clades and stages of chronic HIV-1 infection, were tested blindly by Inno-Lia and classified as either incident (up to 12 m) or older infection by 24 different algorithms. Of the total, 524 patients received HAART, 308 had HIV-1 RNA below 50 copies/mL, and 620 were infected by an HIV-1 non-B clade. Using logistic regression analysis, we evaluated factors that might affect the specificity of these algorithms. Results HIV-1 RNA <50 copies/mL was associated with significantly lower reactivity to all five HIV-1 antigens of the Inno-Lia and impaired the specificity of most algorithms. Among the 412 patients who were either untreated or had HIV-1 RNA ≥50 copies/mL despite HAART, the median specificity of the algorithms was 96.5% (range 92.0-100%). The only factor that significantly promoted false-incident results in this group was age, with false-incident results increasing by a few percent per additional year. HIV-1 clade, HIV-1 RNA, CD4 percentage, sex, disease stage, and testing modalities had no significant effect. Results were similar among the 190 untreated patients. Conclusions The specificity of most Inno-Lia algorithms was high and was not affected by HIV-1 variability, advanced disease, or other factors that promote false-recent results in other STARHS. Specificity should be good in any group of untreated HIV-1 patients.
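As a minimal illustration of the specificity calculation used here: every patient in this cohort was infected for longer than 12 months, so any "incident" call is a false-incident result, and specificity is simply the fraction classified as "older". The counts below are made up for illustration.

```python
# Specificity among patients known to be infected >12 months: any "incident"
# classification is a false positive. The call counts are hypothetical.

def specificity(calls: list[str]) -> float:
    """Fraction of long-infected patients correctly classified as 'older'."""
    older = sum(1 for c in calls if c == "older")
    return older / len(calls)

calls = ["older"] * 689 + ["incident"] * 25   # hypothetical 714-patient cohort
print(f"specificity = {specificity(calls):.1%}")   # 96.5%
```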
Optimizing human in vivo dosing and delivery of β-alanine supplements for muscle carnosine synthesis
Abstract:
Interest in the effects of carnosine on cellular metabolism is rapidly expanding. The first study to demonstrate in humans that chronic β-alanine (BA) supplementation (~3-6 g BA/day for ~4 weeks) can significantly augment muscle carnosine concentrations (>50%) was only recently published. BA supplementation is potentially poised for application beyond the niche exercise and performance-enhancement field and into other, more clinical populations. When examining all BA supplementation studies that directly measure muscle carnosine (n=8), there is a significant linear correlation between the total grams of BA consumed (across daily intakes of 1.6-6.4 g BA/day) and both the relative and absolute increases in muscle carnosine. Supporting this, a recent dose-response study demonstrated a large linear dependency (R²=0.921) on the total grams of BA consumed over 8 weeks. Neither the pre-supplementation baseline carnosine nor the individual subject's body weight (from 65 to 90 kg) appears to affect subsequent carnosine synthesis from BA consumption. Once muscle carnosine is augmented, the washout is very slow (~2%/week). Recently, a slow-release BA tablet has been developed that shows a smaller peak plasma BA concentration and a delayed time to peak, with no difference in the area under the curve compared with pure BA in solution. Furthermore, this slow-release profile resulted in reduced urinary BA loss and improved retention while eliciting minimal paraesthesia symptoms. However, our understanding of how to optimize in vivo delivery and dosing of BA is still in its infancy. Thus, this review will clarify our current knowledge of BA supplementation to augment muscle carnosine and highlight future research questions on the regulatory points of control for muscle carnosine synthesis.
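The reported washout of ~2%/week implies, if the decay is first-order, a half-life of the carnosine surplus of roughly 35 weeks. A quick check under that assumption:

```python
# First-order washout at ~2%/week implies a long half-life of the carnosine
# surplus. Assumes the 2%/week rate holds throughout the washout period.

import math

rate_per_week = 0.02
half_life_weeks = math.log(2) / rate_per_week
print(f"Half-life of the carnosine surplus: ~{half_life_weeks:.0f} weeks")  # ~35
```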
Abstract:
There is no consensus regarding the optimal dosing of high-dose methotrexate (HDMTX) in patients with primary CNS lymphoma. Our aim was to develop a convenient dosing algorithm to target AUC(MTX) in the range between 1000 and 1100 µmol·l⁻¹·h.
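As a generic illustration of AUC-targeted dosing (not the authors' published algorithm): under linear pharmacokinetics, AUC scales proportionally with dose, so the next dose can be rescaled toward the midpoint of the target range.

```python
# Generic AUC-targeted dose adjustment under the linear-PK assumption.
# This is an illustration of the idea, not the study's actual algorithm.

TARGET_AUC = (1000 + 1100) / 2   # µmol·l⁻¹·h, midpoint of the target range

def next_dose(current_dose_g: float, observed_auc: float) -> float:
    """Rescale the dose proportionally toward the target AUC midpoint."""
    return current_dose_g * TARGET_AUC / observed_auc

print(next_dose(current_dose_g=3.5, observed_auc=820))   # ~4.48 g
```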
Abstract:
Background Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have previously demonstrated that a patient's antibody reaction pattern in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score) provides information on the duration of infection, which is unaffected by clinical, immunological and viral variables. In this report we set out to determine the diagnostic performance of Inno-Lia algorithms for identifying incident infections in patients with known duration of infection, and we evaluated the algorithms in annual cohorts of HIV notifications. Methods Diagnostic sensitivity was determined in 527 treatment-naive patients infected for up to 12 months. Specificity was determined in 740 patients infected for longer than 12 months. Plasma was tested by Inno-Lia and classified as either incident (≤12 m) or older infection by 26 different algorithms. Incident infection rates (IIR) were calculated based on the diagnostic sensitivity and specificity of each algorithm and the rule that the total number of incident results is the sum of true-incident and false-incident results, both of which can be calculated from the predetermined sensitivity and specificity. Results The 10 best algorithms had a mean raw sensitivity of 59.4% and a mean specificity of 95.1%. Adjustment for the overrepresentation of patients in the first quarter-year of infection further reduced the sensitivity; in the preferred model, the mean adjusted sensitivity was 37.4%. Application of the 10 best algorithms to four annual cohorts of HIV-1 notifications totalling 2595 patients yielded a mean IIR of 0.35 in 2005/6 (baseline) and of 0.45, 0.42 and 0.35 in 2008, 2009 and 2010, respectively. The increase between baseline and 2008 and the ensuing decreases were highly significant. Other adjustment models yielded different absolute IIRs, although the relative changes between the cohorts were identical for all models. Conclusions The method can be used to compare IIRs across annual cohorts of HIV notifications. The use of several different algorithms in combination, each with its own sensitivity and specificity for detecting incident infection, is advisable, as this reduces the impact of individual imperfections stemming primarily from relatively low sensitivities and sampling bias.
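The correction rule described in the methods can be written out explicitly: among N tested patients with I true incident infections, the observed incident count is O = sens·I + (1 − spec)·(N − I), which can be solved for I. The sketch below uses the mean sensitivity and specificity reported above together with made-up counts.

```python
# Back-calculating the true incident count from raw test results:
# O = sens*I + (1-spec)*(N-I)  =>  I = (O - (1-spec)*N) / (sens - (1-spec)).
# Sensitivity/specificity are the means reported above; the counts are
# illustrative, not figures from the study.

def true_incident(observed: int, n: int, sens: float, spec: float) -> float:
    fp_rate = 1.0 - spec
    return (observed - fp_rate * n) / (sens - fp_rate)

# e.g. 120 raw incident calls among 600 notifications:
print(true_incident(observed=120, n=600, sens=0.594, spec=0.951))  # ~166
```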
Abstract:
The early detection of subjects with probable Alzheimer's disease (AD) is crucial for the effective application of treatment strategies. Here we explored the ability of a multitude of linear and non-linear classification algorithms to discriminate between the electroencephalograms (EEGs) of patients with varying degrees of AD and their age-matched control subjects. Absolute and relative spectral power, the distribution of spectral power, and measures of spatial synchronization were calculated from recordings of resting, eyes-closed continuous EEGs of 45 healthy controls, 116 patients with mild AD and 81 patients with moderate AD, recruited in two different centers (Stockholm, New York). The applied classification algorithms were: principal component linear discriminant analysis (PC LDA), partial least squares LDA (PLS LDA), principal component logistic regression (PC LR), partial least squares logistic regression (PLS LR), bagging, random forest, support vector machines (SVM) and feed-forward neural networks. Based on 10-fold cross-validation runs, it could be demonstrated that even though modern computer-intensive classification algorithms such as random forests, SVM and neural networks show a slight superiority, more classical classification algorithms performed nearly equally well. Using random forest classification, a considerable sensitivity of up to 85% and a specificity of 78% were reached for the discrimination of even only mild AD patients from controls, whereas for the comparison of moderate AD vs. controls, SVM and neural networks achieved values of 89% and 88% for sensitivity and specificity, respectively. Such remarkable performance demonstrates the value of these classification algorithms for clinical diagnostics.
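For concreteness, here is a minimal scikit-learn sketch of two of the compared pipelines, PC LDA and random forest, each scored with 10-fold cross-validation. Random numbers stand in for the spectral-power and synchronization features computed from the EEGs, so the accuracies printed are not the study's results.

```python
# Two of the compared classification pipelines, sketched with scikit-learn:
# PCA followed by LDA, and a random forest, each with 10-fold CV. Random
# features replace the real EEG measures; results are illustrative only.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(161, 40))        # 45 controls + 116 mild AD, 40 features
y = np.array([0] * 45 + [1] * 116)    # 0 = control, 1 = mild AD

pc_lda = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
forest = RandomForestClassifier(n_estimators=200, random_state=0)

for name, clf in [("PC LDA", pc_lda), ("random forest", forest)]:
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name}: mean 10-fold accuracy = {scores.mean():.2f}")
```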