999 results for Eliciting dose (ED)
Abstract:
Aim: To compare a less intensive regimen based on high-dose imatinib (IM) to an intensive IM/HyperCVAD regimen in adults with Ph+ ALL, in terms of early response and outcome after stem cell transplantation (SCT). Methods: Patients aged 18-60 years with previously untreated Ph+ ALL not evolving from chronic myeloid leukemia were eligible if they had no contraindication to chemotherapy and SCT (ClinicalTrials.gov ID, NCT00327678). After a steroid prephase allowing Ph and/or BCR-ABL diagnosis, cycle 1 differed between randomization arms. In arm A (IM-based), IM was given at 800 mg on day 1-28, combined with vincristine (2 mg, day 1, 8, 15, 22) and dexamethasone (40 mg, day 1-2, 8-9, 15-16, and 22-23) only. In arm B (IM/HyperCVAD), IM was given at 800 mg on day 1-14, combined with adriamycin (50 mg/m2, day 4), cyclophosphamide (300 mg/m2/12h, day 1, 2, 3), vincristine (2 mg, day 4 and 11), and dexamethasone (40 mg, day 1-4 and 11-14). All patients received a cycle 2 combining high-dose methotrexate (1 g/m2, day 1) and AraC (3 g/m2/12h, day 2 and 3) with IM at 800 mg on day 1-14, regardless of response. Four intrathecal infusions were given during this induction/consolidation period. Minimal residual disease (MRD) was centrally evaluated by real-time quantitative PCR (RQ-PCR) after cycle 1 (MRD1) and cycle 2 (MRD2). Major MRD response was defined as a BCR-ABL/ABL ratio <0.1%. All patients were then to receive allogeneic SCT from a matched related or unrelated donor or, in the absence of a donor, autologous SCT if a major MRD2 response had been achieved. IM/chemotherapy maintenance was planned after autologous SCT. In the absence of SCT, patients received alternating cycles 1 (as in arm B) and cycles 2 followed by maintenance, as in the published IM/HyperCVAD regimen. The primary objective was non-inferiority of arm A in terms of major MRD2 response rate. Secondary objectives were CR rate, SCT rate, treatment- and transplant-related mortality, relapse-free (RFS), event-free (EFS) and overall (OS) survival.
Results: Among the 270 patients randomized between May 2006 and August 2011, 265 patients were evaluable for this analysis (133 arm A, 132 arm B; median age, 47 years; median follow-up, 40 months). Main patient characteristics were well balanced between both arms. Due to higher induction mortality in arm B (nine versus one death; P=0.01), the CR rate was higher in the less intensive arm A (98% versus 89% after cycle 1 and 98% versus 91% after cycle 2; P=0.003 and P=0.006, respectively). A total of 213 and 205 patients were evaluated for bone marrow MRD1 and MRD2, respectively. The rates of patients reaching major MRD response and undetectable MRD were 45% (44% arm A, 46% arm B; P=0.79) and 10% (in both arms) at MRD1 and 66% (68% arm A, 63.5% arm B; P=0.56) and 25% (28% arm A, 22% arm B; P=0.33) at MRD2, respectively. The non-inferiority primary endpoint was thus demonstrated (P=0.002). Overall, EFS was estimated at 42% (95% CI, 35-49) and OS at 51% (95% CI, 44-57) at 3 years, with no difference between arms A and B (46% versus 38% and 53% versus 49%; P=0.25 and 0.61, respectively). Of the 251 CR patients, 157 (80 arm A, 77 arm B) and 34 (17 in both arms) received allogeneic and autologous SCT in first CR, respectively. Allogeneic transplant-related mortality was similar in both arms (31.5% versus 22% at 3 years; P=0.51). Of the 157 allografted patients, 133 had MRD2 evaluation and 89 had MRD2 <0.1%. In these patients, MRD2 did not significantly influence post-transplant RFS and OS, either when tested with the 0.1% cutoff or as a continuous log covariate. Of the 34 autografted patients, 31 had MRD2 evaluation and, according to the protocol, 28 had MRD2 <0.1%. When restricting the comparison to patients achieving a major MRD2 response, and with the current follow-up, a trend for better results was observed after autologous as compared to allogeneic SCT (RFS, 63% versus 49.5% and OS, 69% versus 58% at 3 years; P=0.35 and P=0.08, respectively).
Conclusions: In adults, the use of tyrosine kinase inhibitors (TKIs) has markedly improved the results of Ph+ ALL therapy, which are now close to those observed in Ph-negative ALL. We demonstrated here that chemotherapy intensity may be safely reduced when associated with high-dose IM. We will further explore this TKI-based strategy using nilotinib prior to SCT in our next GRAAPH-2013 trial. The trend towards a better outcome after autologous compared to allogeneic SCT observed in MRD responders validates MRD as an important early surrogate endpoint for treatment stratification and new drug investigation in this disease.
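The major-MRD-response cutoff used in the trial (BCR-ABL/ABL ratio <0.1%) is a simple threshold rule on RQ-PCR transcript counts. A minimal sketch in Python; the function names and copy-number inputs are illustrative, not from the study protocol:

```python
def bcr_abl_ratio_percent(bcr_abl_copies: float, abl_copies: float) -> float:
    """BCR-ABL/ABL transcript ratio expressed as a percentage."""
    if abl_copies <= 0:
        raise ValueError("ABL control transcript count must be positive")
    return 100.0 * bcr_abl_copies / abl_copies

def is_major_mrd_response(bcr_abl_copies: float, abl_copies: float) -> bool:
    """Major MRD response per the trial definition: ratio below 0.1%."""
    return bcr_abl_ratio_percent(bcr_abl_copies, abl_copies) < 0.1
```

For example, 50 BCR-ABL copies against 100,000 ABL control copies gives a 0.05% ratio, which would count as a major MRD response; 500 copies (0.5%) would not.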
Abstract:
A dose-response strategy may not only allow investigation of the impact of foods and nutrients on human health but may also reveal differences in the response of individuals to food ingestion based on their metabolic health status. In a randomized crossover study, we challenged 19 normal-weight (BMI: 20-25 kg/m²) and 18 obese (BMI: >30 kg/m²) men with 500, 1000, and 1500 kcal of a high-fat (HF) meal (60.5% energy from fat). Blood was taken at baseline and up to 6 h postprandially and analyzed for a range of metabolic, inflammatory, and hormonal variables, including plasma glucose, lipids, and C-reactive protein and serum insulin, glucagon-like peptide-1, interleukin-6 (IL-6), and endotoxin. Insulin was the only variable that could differentiate the postprandial response of normal-weight and obese participants at each of the 3 caloric doses. A significant response of the inflammatory marker IL-6 was only observed in the obese group after ingestion of the HF meal containing 1500 kcal [net incremental AUC (iAUC) = 22.9 ± 6.8 pg/mL × 6 h, P = 0.002]. Furthermore, the net iAUC for triglycerides significantly increased from the 1000 to the 1500 kcal meal in the obese group (5.0 ± 0.5 mmol/L × 6 h vs. 6.0 ± 0.5 mmol/L × 6 h; P = 0.015) but not in the normal-weight group (4.3 ± 0.5 mmol/L × 6 h vs. 4.8 ± 0.5 mmol/L × 6 h; P = 0.31). We propose that caloric dose-response studies may contribute to a better understanding of the metabolic impact of food on the human organism. This study was registered at clinicaltrials.gov as NCT01446068.
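The net incremental AUC (iAUC) reported above subtracts the baseline (fasting) value and integrates the remaining response over the 6-h sampling window. A minimal trapezoidal-rule sketch; the abstract does not specify the exact numerical method, so the trapezoidal integration and baseline choice here are assumptions:

```python
def net_iauc(times_h, values, baseline=None):
    """Net incremental AUC: trapezoidal integral of (value - baseline)
    over the sampling times; negative increments are included, which is
    what makes it the 'net' rather than the positive-only iAUC."""
    if len(times_h) != len(values) or len(times_h) < 2:
        raise ValueError("need matched time/value series of length >= 2")
    if baseline is None:
        baseline = values[0]  # fasting (t = 0) sample as baseline
    incr = [v - baseline for v in values]
    return sum((incr[i] + incr[i + 1]) / 2.0 * (times_h[i + 1] - times_h[i])
               for i in range(len(times_h) - 1))
```

For a triglyceride series of 1.0, 2.0, 3.0, 1.0 mmol/L sampled at 0, 2, 4, and 6 h, this yields 6.0 mmol/L × 6 h.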
'Toxic' and 'Nontoxic': confirming critical terminology concepts and context for clear communication
Abstract:
If 'the dose makes the poison', and if the context of an exposure to a hazard shapes the risk as much as the innate character of the hazard itself, then what is 'toxic' and what is 'nontoxic'? This article is intended to help readers and communicators: anticipate that concepts such as 'toxic' and 'nontoxic' may have different meanings to different stakeholders in different contexts of general use, commerce, science, and the law; recognize specific situations in which terms and related information could potentially be misperceived or misinterpreted; evaluate the relevance, reliability, and other attributes of information for a given situation; control actions, assumptions, interpretations, conclusions, and decisions to avoid flaws and achieve a desired outcome; and confirm that the desired outcome has been achieved. To meet those objectives, we provide some examples of differing toxicology terminology concepts and contexts; a comprehensive decision-making framework for understanding and managing risk; along with a communication and education message and audience-planning matrix to support the involvement of all relevant stakeholders; a set of CLEAR-communication assessment criteria for use by both readers and communicators; example flaws in decision-making; a suite of three tools to assign relevance vs reliability, align know vs show, and refine perception vs reality aspects of information; and four steps to foster effective community involvement and support. The framework and supporting process are generally applicable to meeting any objective.
Abstract:
Currently, the criteria used in southern Brazil to define the nitrogen (N) rate to be applied to maize are the soil organic matter content, the expected crop yield, and the characteristics of the preceding crop. Despite its high carbon:nitrogen (C/N) ratio, black oat is the winter cover-crop species most widely grown before summer cash crops. This trait can lead to immobilization of soil N, N deficiency in the maize plant, and reduced grain yield. Although the preceding species are taken into account when determining the N rate, little progress has been made on the best timing for topdressed N application in maize. To evaluate the most suitable timing for the first topdressed N application in maize grown after winter species with contrasting C/N ratios, a pot experiment was carried out in a greenhouse in Porto Alegre, RS, Brazil. Undisturbed columns of a soil classified as a typic dystrophic Argissolo Vermelho (Ultisol) were used. Treatments consisted of four winter soil-cover systems (black oat, common vetch, oilseed radish, and fallow) and three topdressed N management schemes (N applied at the V3 or V5 stage, and no topdressed N). The experiment followed a completely randomized design in a 4 x 3 factorial arrangement with three replicates. Analysis of variance was performed with the F-test, and means were compared by Tukey's test (p < 0.05). Black oat, with its high C/N ratio, showed low rates of mineralization and N release from its residues, whereas common vetch and oilseed radish had narrower C/N ratios, which stimulated these processes.
With black oat as the crop preceding maize, soil mineral N and total N in the maize plant decreased regardless of the timing of topdressed N application, reducing early plant development. Following common vetch and oilseed radish, the relative leaf chlorophyll content and maize dry matter production, evaluated at the V7 stage, were higher than after black oat, regardless of the timing of the first topdressed N application. The data showed that it is possible to delay the first topdressed N application in maize from the V3 to the V5 stage when maize is grown after winter species with a low C/N ratio.
Abstract:
Brazilian soils, especially those of the cerrado, are highly weathered and poor in some cationic micronutrients in the soil solution. Management practices such as green manuring can favor the diffusive flux (DF) and the availability of these nutrients to plants. This study assessed whether incorporating green manures into the soil, at different rates and times, changes the DF and the ionic transport form of the micronutrients Zn, Cu, Fe, and Mn in the soil. Two plant residues widely grown as green manure, pigeon pea (Cajanus cajan) and millet (Pennisetum americanum), were incorporated for different periods (0, 15, 25, 35, 45, and 55 days) and at different rates (0, 9, 18, and 36 t ha-1) into a clayey Latossolo Vermelho (Oxisol) under laboratory conditions. DF was evaluated with anion-exchange (positively charged) and cation-exchange (negatively charged) resin sheets incubated in contact with the soil in diffusion chambers for 15 days. The results showed that the DF of Cu and Fe increased with increasing rates of plant material, mainly at the beginning of the incubation period, and that the flux of these two micronutrients was greater toward the anion-exchange resin than toward the cation-exchange resin, possibly because their transport in the soil depends more on the formation of organometallic complexes with a net negative charge. For Zn and Mn, in contrast, the diffusive flux was greater toward the cation-exchange resin. Longer incubation times favored the diffusive flux of Mn and Zn and reduced that of Cu and Fe.
Abstract:
Some methadone maintenance treatment (MMT) programs prescribe inadequate daily methadone doses. Patients complain of withdrawal symptoms and continue illicit opioid use, yet practitioners are reluctant to increase doses above certain arbitrary thresholds. Serum methadone levels (SMLs) may guide practitioners' dosing decisions, especially for those patients who have low SMLs despite higher methadone doses. Such variation is due in part to the complexities of methadone metabolism. The medication itself is a racemic (50:50) mixture of 2 enantiomers: an active "R" form and an essentially inactive "S" form. Methadone is metabolized primarily in the liver, by up to five cytochrome P450 isoforms, and individual differences in enzyme activity help explain the wide range of active R-enantiomer concentrations in patients given identical doses of racemic methadone. Most clinical research studies have used methadone doses of less than 100 mg/day (mg/d) and have not reported corresponding SMLs. New research suggests that doses ranging from 120 mg/d to more than 700 mg/d, with correspondingly higher SMLs, may be optimal for many patients. Each patient presents a unique clinical challenge, and there is no way of prescribing a single best methadone dose to achieve a specific blood level as a "gold standard" for all patients. Clinical signs and patient-reported symptoms of abstinence syndrome, and continuing illicit opioid use, are effective indicators of dose inadequacy. There does not appear to be a maximum daily dose limit when determining what is adequately "enough" methadone in MMT.
Abstract:
OBJECTIVE: We investigated whether the oral administration of a low dose (75 µg) of midazolam, a CYP3A probe, can be used to measure in vivo CYP3A activity. METHODS: Plasma concentrations of midazolam, 1'OH-midazolam and 4'OH-midazolam were measured after the oral administration of 7.5 mg and 75 µg midazolam in 13 healthy subjects without medication, in four subjects pretreated for 2 days with ketoconazole (200 mg b.i.d.), a CYP3A inhibitor, and in four subjects pretreated for 4 days with rifampicin (450 mg q.d.), a CYP3A inducer. RESULTS: After oral administration of 75 µg midazolam, the 30-min total (unconjugated + conjugated) 1'OH-midazolam/midazolam ratios measured in the groups without co-medication, with ketoconazole and with rifampicin were (mean ± SD): 6.23 ± 2.61, 0.79 ± 0.39 and 56.1 ± 12.4, respectively. No side effects were reported by the subjects taking this low dose of midazolam. Good correlations were observed between the 30-min total 1'OH-midazolam/midazolam ratio and midazolam clearance in the group without co-medication (r² = 0.64, P < 0.001) and in the three groups taken together (r² = 0.91, P < 0.0001). Good correlations were also observed between midazolam plasma levels and midazolam clearance, measured between 1.5 h and 4 h. CONCLUSION: A low oral dose of midazolam can be used to phenotype CYP3A, either by the determination of total 1'OH-midazolam/midazolam ratios at 30 min or by the determination of midazolam plasma levels between 1.5 h and 4 h after its administration.
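Phenotyping by metabolic ratio is simply the concentration quotient at the 30-min sample, interpreted against reference groups. A sketch in Python; the inhibited/induced cutoffs below are chosen for illustration only (the study reports group means of 0.79, 6.23, and 56.1, not classification thresholds):

```python
def mr_30min(oh_midazolam: float, midazolam: float) -> float:
    """Total 1'OH-midazolam / midazolam concentration ratio at 30 min."""
    if midazolam <= 0:
        raise ValueError("midazolam concentration must be positive")
    return oh_midazolam / midazolam

def cyp3a_phenotype(ratio: float, low: float = 2.0, high: float = 15.0) -> str:
    """Illustrative interpretation: a ratio below `low` suggests inhibited
    CYP3A (ketoconazole-group mean 0.79), above `high` suggests induced
    CYP3A (rifampicin-group mean 56.1). Thresholds are hypothetical."""
    if ratio < low:
        return "inhibited"
    if ratio > high:
        return "induced"
    return "normal"
```

With these hypothetical cutoffs, the three reported group means fall into the inhibited, normal, and induced categories, respectively.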
Abstract:
OBJECTIVES: To investigate the effect of low-dose aspirin administered in the morning or evening on the rate of discontinuation of prolonged-release nicotinic acid (Niaspan) due to flushing in patients at elevated cardiovascular risk. RESEARCH DESIGN AND METHODS: This was an observational, non-interventional study in patients at elevated cardiovascular risk due to cardiovascular disease or type 2 diabetes. Patients received prolonged-release nicotinic acid and aspirin under the usual care of their physician for 15 weeks. MAIN OUTCOME MEASURES: The main outcome measure was the rate of treatment discontinuation for flushing. Other adverse drug reactions (ADRs) were also recorded, and lipid parameters were measured. RESULTS: The patient population included 539 subjects (70% male); 36% had type 2 diabetes, 80% had prior cardiovascular disease, and 37% had a family history of cardiovascular disease. The rate of treatment discontinuation due to flushing did not differ (p = 0.3375) between the morning aspirin group (10.6%) and the evening aspirin group (13.8%). The overall incidence of flushing was 57%. Most flushes were of mild or moderate severity, and both their frequency and intensity decreased over time. ADRs unrelated to flushing occurred in 6.6% of the morning aspirin group and 7.4% of the evening aspirin group. HDL-cholesterol increased by +21.3% in the overall population, together with moderate improvements in other lipid parameters. CONCLUSIONS: Flushing was the most common ADR with prolonged-release nicotinic acid treatment, as expected. The timing of aspirin administration did not influence the rate of treatment discontinuations for flushing. Marked increases in HDL-cholesterol were observed.
Abstract:
BACKGROUND: A primary goal of clinical pharmacology is to understand the factors that determine the dose-effect relationship and to use this knowledge to individualize drug dose. METHODS: A principle-based criterion is proposed for deciding among alternative individualization methods. RESULTS: Safe and effective variability defines the maximum acceptable population variability in drug concentration around the population average. CONCLUSIONS: A decision on whether patient covariates alone are sufficient, or whether therapeutic drug monitoring in combination with target concentration intervention is needed, can be made by comparing the remaining population variability after a particular dosing method with the safe and effective variability.
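The decision criterion described in the conclusions reduces to comparing the residual population variability under a candidate dosing method against the safe and effective variability (SEV). A minimal sketch; expressing both quantities as a coefficient of variation (CV%) and the function name are illustrative assumptions, not the authors' notation:

```python
def individualization_method(remaining_cv_percent: float,
                             safe_effective_cv_percent: float) -> str:
    """Choose a dose-individualization strategy: if covariate-based dosing
    leaves the population variability in drug concentration within the SEV,
    covariates alone suffice; otherwise therapeutic drug monitoring with
    target concentration intervention (TDM + TCI) is indicated."""
    if remaining_cv_percent <= safe_effective_cv_percent:
        return "covariates alone sufficient"
    return "TDM with target concentration intervention"
```

For example, if covariate-based dosing leaves 25% CV and the SEV is 30%, covariates suffice; a remaining 45% CV against the same SEV would call for TDM with TCI.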
Abstract:
Pollution of air, water and soil by industrial chemicals presents a potential health risk to humans. Such chemicals can enter the human body by three routes, namely inhalation, dermal absorption, and ingestion, and in special cases by injection (needle sticks, bites, cuts, etc.). In the workplace, pulmonary and dermal absorption are the main routes of entry, but poor personal hygiene and work habits can result in ingestion that contributes to the dose. Air monitoring provides reliable information on inhalation exposure, and patches can be used to estimate dermal exposure. Local adverse effects, such as skin and eye irritation, or nose and lung irritation, are closely related to the external exposure. Systemic adverse effects, on the other hand, are related to the absorbed amount (dose), or to the level of the pollutant or its metabolite in the target organ. Human biological monitoring is becoming a powerful tool for scientists and policy makers to assess and manage the risk of exposure to chemicals both in the general population and in the workplace. This chapter will focus on the occupational environment, keeping in mind that biological monitoring in humans is a highly topical issue in public health policy, in environmental medicine, and in science in general.
Abstract:
The pharmacokinetics (PK) of efavirenz (EFV) is characterized by marked interpatient variability that correlates with its pharmacodynamics (PD). In vitro-in vivo extrapolation (IVIVE) is a "bottom-up" approach that combines drug data with system information to predict PK and PD. The aim of this study was to simulate EFV PK and PD after dose reductions. At the standard dose, the simulated probability was 80% for viral suppression and 28% for central nervous system (CNS) toxicity. After a dose reduction to 400 mg, the probabilities of viral suppression were reduced to 69, 75, and 82%, and those of CNS toxicity were 21, 24, and 29% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. With reduction of the dose to 200 mg, the probabilities of viral suppression decreased to 54, 62, and 72% and those of CNS toxicity decreased to 13, 18, and 20% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. These findings indicate how dose reductions might be applied in patients with favorable genetic characteristics.