904 results for DOSING ALGORITHMS


Relevance: 20.00%

Publisher:

Abstract:

This study aimed at identifying clinical factors for predicting hematologic toxicity after radioimmunotherapy with (90)Y-ibritumomab tiuxetan or (131)I-tositumomab in clinical practice. Hematologic data were available from 14 non-Hodgkin lymphoma patients treated with (90)Y-ibritumomab tiuxetan and 18 who received (131)I-tositumomab. The percentage of baseline at nadir, the percentage of baseline at 4 wk after nadir, and the time to nadir were selected as the toxicity indicators for both platelets and neutrophils. Multiple linear regression analysis was performed to identify significant predictors (P < 0.05) of each indicator. For both platelets and neutrophils, pooled and separate analyses of the (90)Y-ibritumomab tiuxetan and (131)I-tositumomab data yielded the time elapsed since the last chemotherapy as the only significant predictor of the percentage of baseline at nadir. The extent of bone marrow involvement was not a significant factor in this study, possibly because of the short time elapsed since the last chemotherapy in the 7 patients with bone marrow involvement. Because both treatments were designed to deliver a comparable bone marrow dose, this factor also was not significant. None of the 14 factors considered was predictive of the time to nadir. The R(2) value for the model predicting percentage of baseline at nadir was 0.60 for platelets and 0.40 for neutrophils. This model predicted the platelet and neutrophil toxicity grade to within ±1 for 28 and 30 of the 32 patients, respectively. For the 7 patients predicted to have grade I thrombocytopenia, 6 of whom had actual grade I-II, the dose might be increased to improve treatment efficacy. The elapsed time since the last chemotherapy can be used to predict hematologic toxicity and customize the current dosing method in radioimmunotherapy.
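The single-predictor model described above can be sketched with ordinary least squares. The data below are hypothetical, invented purely to illustrate the fitting and R^2 computation; they are not the study's patient data.

```python
import numpy as np

# Hypothetical data: months since last chemotherapy (x) and platelet
# count at nadir as a percentage of baseline (y) for a few patients.
months_since_chemo = np.array([3.0, 6.0, 9.0, 12.0, 18.0, 24.0, 36.0])
pct_baseline_nadir = np.array([12.0, 20.0, 31.0, 38.0, 55.0, 61.0, 80.0])

# Fit y = b0 + b1 * x by ordinary least squares.
X = np.column_stack([np.ones_like(months_since_chemo), months_since_chemo])
coef, residuals, rank, _ = np.linalg.lstsq(X, pct_baseline_nadir, rcond=None)

# R^2 measures how much variance the single predictor explains.
pred = X @ coef
ss_res = np.sum((pct_baseline_nadir - pred) ** 2)
ss_tot = np.sum((pct_baseline_nadir - pct_baseline_nadir.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"intercept={coef[0]:.1f}, slope={coef[1]:.2f} %/month, R^2={r2:.2f}")
```

With more patients, the same design matrix simply gains a column per additional candidate predictor, which is how the study screened its 14 factors.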


In this paper, we develop numerical algorithms with low storage and operation requirements for the computation of invariant tori in Hamiltonian systems (exact symplectic maps and Hamiltonian vector fields). The algorithms are based on the parameterization method and closely follow the proof of the KAM theorem given in [LGJV05] and [FLS07]. They essentially consist in solving, by a Newton method, a functional equation satisfied by the invariant tori. Using some geometric identities, a Newton step can be performed with little storage and few operations. In this paper we focus on the numerical aspects of the algorithms (speed, storage, and stability) and refer to the papers above for the rigorous results. We show how to compute efficiently both maximal invariant tori and whiskered tori, together with the associated invariant stable and unstable manifolds of whiskered tori. Moreover, we present fast algorithms for the iteration of quasi-periodic cocycles and the computation of the invariant bundles, a preliminary step in the computation of invariant whiskered tori. Since quasi-periodic cocycles appear in other contexts, this section may be of independent interest. The numerical methods presented here allow computing, in a unified way, both primary and secondary invariant KAM tori; secondary tori are invariant tori that can be contracted to a periodic orbit. We present some preliminary results showing that the methods are indeed implementable and fast, and postpone optimized implementations and results on the breakdown of invariant tori to a future paper.
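The fast iteration of quasi-periodic cocycles mentioned above rests on the cocycle composition identity M_{n+m}(theta) = M_m(theta + n*omega) M_n(theta), which allows doubling the number of iterates per step. A minimal sketch verifying that identity for a hypothetical SL(2,R) cocycle (the matrix family and frequency below are illustrative, not from the paper):

```python
import numpy as np

def cocycle(A, theta, n, omega):
    """Naive product M_n(theta) = A(theta+(n-1)w) ... A(theta+w) A(theta)."""
    M = np.eye(2)
    for k in range(n):
        M = A(theta + k * omega) @ M
    return M

# Hypothetical matrix family over the circle (Schrodinger-like cocycle).
def A(theta):
    v = 0.3 * np.cos(theta)
    return np.array([[v, -1.0], [1.0, 0.0]])

omega = 2 * np.pi * (np.sqrt(5) - 1) / 2   # golden-mean frequency
theta0 = 0.1

# Composition identity behind the doubling trick:
#   M_{n+m}(theta) = M_m(theta + n*omega) @ M_n(theta)
n, m = 7, 9
lhs = cocycle(A, theta0, n + m, omega)
rhs = cocycle(A, theta0 + n * omega, m, omega) @ cocycle(A, theta0, n, omega)
assert np.allclose(lhs, rhs)
```

The actual speedup in the paper comes from representing M_n on a Fourier grid, so that the shifted factor M_n(theta + n*omega) is obtained by a phase shift of the coefficients rather than recomputation.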


Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, the limited temporal and financial resources, as well as the high intraclass variance can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership, and the user is then asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active learning algorithms: committee, large margin, and posterior probability-based. For each of them, the most recent advances in the remote sensing community are discussed and some heuristics are detailed and tested. Several challenging remote sensing scenarios are considered, including very high spatial resolution and hyperspectral image classification. Finally, guidelines for choosing a suitable architecture are provided for new and/or inexperienced users.
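Of the three families, the posterior probability-based heuristic is the simplest to sketch: rank unlabeled pixels by how unconfident the classifier is and query the top of the ranking. The probabilities below are hypothetical classifier outputs, and "least confident" is only one of several such heuristics.

```python
import numpy as np

def least_confident(proba, batch_size):
    """Rank unlabeled samples by a posterior-probability heuristic:
    the smaller the maximum class probability, the more uncertain the
    sample, and the more informative a label for it should be."""
    uncertainty = 1.0 - proba.max(axis=1)
    return np.argsort(uncertainty)[::-1][:batch_size]

# Hypothetical class posteriors for 5 unlabeled pixels over 3 classes.
proba = np.array([
    [0.98, 0.01, 0.01],   # confident
    [0.40, 0.35, 0.25],   # very uncertain
    [0.70, 0.20, 0.10],
    [0.34, 0.33, 0.33],   # most uncertain
    [0.90, 0.05, 0.05],
])

query = least_confident(proba, batch_size=2)
print(query)   # indices 3 and 1: the two least-confident pixels
```

In the iterative loop, the queried pixels are labeled by the user, added to the training set, the model is retrained, and the ranking is recomputed.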


OBJECTIVE: To calculate the variable costs involved in the process of delivering erythropoiesis stimulating agents (ESA) in European dialysis practices. METHODS: A conceptual model was developed to classify the processes and sub-processes followed in the pharmacy (ordering from the supplier; receiving, storing, and delivering ESA to the dialysis unit), the dialysis unit (dose determination, ordering, receipt, registration, storage, and administration) and the waste disposal unit. Time and material costs were recorded. Labour costs were derived from actual local wages, while material costs came from the facilities' accounting records. Activities associated with ESA administration were listed and each activity was evaluated to determine whether dosing frequency affected the amount of resources required. RESULTS: A total of 21 centres in 8 European countries supplied data, with a mean of 142 patients per hospital (range 42-648). Patients received various ESA regimens (thrice-weekly, twice-weekly, once-weekly, once every 2 weeks and once-monthly). For ESA administered every 2 weeks, the mean costs per patient per year for each process, and the estimated obtainable percentage reduction in costs, were: pharmacy labour (10.1 euro, 39%); dialysis unit labour (66.0 euro, 65%); dialysis unit materials (4.11 euro, 61%) and waste unit materials (0.43 euro, 49%). LIMITATION: Impact on financial costs was not measured. CONCLUSION: ESA administration has quantifiable labour and material costs which are affected by dosing frequency.
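The reported component costs can be combined into a per-patient annual total. The arithmetic below uses the every-2-weeks figures from the abstract; reading each reported percentage as a reduction applicable to its own component is an assumption made for illustration.

```python
# Per-patient annual process costs (EUR) for every-2-weeks ESA dosing,
# paired with the obtainable percentage reduction reported in the study.
costs = {
    "pharmacy labour":         (10.10, 0.39),
    "dialysis unit labour":    (66.00, 0.65),
    "dialysis unit materials": (4.11, 0.61),
    "waste unit materials":    (0.43, 0.49),
}

total = sum(c for c, _ in costs.values())
reduced = sum(c * (1.0 - r) for c, r in costs.values())

print(f"total: {total:.2f} EUR/patient/year")
print(f"after obtainable reductions: {reduced:.2f} EUR/patient/year")
```

Dialysis-unit labour dominates the total, which is why its 65% obtainable reduction drives most of the overall saving.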


"See the abstract at the beginning of the document in the attached file."


In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently, Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree, then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
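The critical activity mentioned above is where the occupation-ratio recursion of the hard-core model on the d-regular tree loses its unique fixed point. A minimal sketch of that recursion and the standard threshold formula (the iteration count and starting point are arbitrary choices for illustration):

```python
# Occupation-ratio recursion for the hard-core model on the infinite
# d-regular tree: the ratio R satisfies R = lam / (1 + R)**(d - 1).
# Below the critical activity lam_c(d) = (d-1)**(d-1) / (d-2)**d the
# fixed point is unique (Gibbs-measure uniqueness) and iteration converges.

def critical_activity(d):
    return (d - 1) ** (d - 1) / (d - 2) ** d

def tree_ratio(lam, d, iters=200):
    R = lam
    for _ in range(iters):
        R = lam / (1.0 + R) ** (d - 1)
    return R

d = 3
lam_c = critical_activity(d)          # = 4.0 for d = 3
R = tree_ratio(0.5 * lam_c, d)        # activity safely below critical
# Fixed-point check: R should satisfy the recursion to high accuracy.
print(f"lam_c({d}) = {lam_c}, R = {R:.6f}")
```

Weitz's scheme exploits the decay of correlations that holds in this subcritical regime; above lam_c the iteration instead settles into oscillation between two values, reflecting non-uniqueness.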


Objectives: Gentamicin is among the most commonly prescribed antibiotics in newborns, but large interindividual variability in exposure levels exists. Based on a population pharmacokinetic analysis of a cohort of unselected neonates, we aimed to validate current dosing recommendations from a recent reference guideline (Neofax®). Methods: From 3039 concentrations collected in 994 preterm (median gestational age 32.3 weeks, range 24.2-36.5) and 455 term newborns, treated at the University Hospital of Lausanne between 2006 and 2011, a population pharmacokinetic analysis was performed with NONMEM®. Model-based simulations were used to assess the ability of dosing regimens to bring concentrations within the targets: trough ≤ 1 mg/L and peak ~8 mg/L. Results: A two-compartment model best characterized gentamicin pharmacokinetics. Model parameters are presented in the table. Body weight, gestational age and postnatal age positively influence clearance, which decreases under dopamine administration. Body weight and gestational age influence the distribution volume. Model-based simulations confirm that preterm infants need doses above 4 mg/kg and extended dosing intervals, up to 48 hours for very preterm newborns, whereas most term newborns would achieve adequate exposure with 4 mg/kg q. 24 h. More than 90% of neonates would achieve trough concentrations below 2 mg/L and peaks above 6 mg/L following the most recent guidelines. Conclusions: Simulated gentamicin exposure is in good accordance with recent dosing recommendations for target concentration achievement.
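The effect of extending the dosing interval can be illustrated with a deliberately simplified one-compartment IV-bolus model (the study fitted a two-compartment model, and the clearance and volume values below are hypothetical, not the study's estimates):

```python
import numpy as np

def steady_concentrations(dose_mg_per_kg, interval_h, cl, v, n_doses=10):
    """Superpose IV-bolus doses: C(t) = (Dose/V) * exp(-k*t), k = CL/V.
    Returns (peak, trough) around the n-th dose."""
    k = cl / v
    c0 = dose_mg_per_kg / v
    # Accumulation after n doses, via the geometric series of residuals.
    acc = (1 - np.exp(-n_doses * k * interval_h)) / (1 - np.exp(-k * interval_h))
    peak = c0 * acc
    trough = peak * np.exp(-k * interval_h)
    return peak, trough

# Hypothetical neonatal values: CL ~ 0.05 L/h/kg, V ~ 0.5 L/kg.
peak24, trough24 = steady_concentrations(4.0, 24.0, cl=0.05, v=0.5)
peak48, trough48 = steady_concentrations(4.0, 48.0, cl=0.05, v=0.5)
print(f"q24h: peak {peak24:.1f}, trough {trough24:.2f} mg/L")
print(f"q48h: peak {peak48:.1f}, trough {trough48:.2f} mg/L")
```

Even in this toy model, doubling the interval lowers the trough by an order of magnitude while barely changing the peak, which mirrors the rationale for extended intervals in very preterm newborns with reduced clearance.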


OBJECTIVES: We have reported previously that 80 mg valsartan and 50 mg losartan provide less receptor blockade than 150 mg irbesartan in normotensive subjects. In this study we investigated the importance of drug dosing in mediating these differences by comparing the AT(1)-receptor blockade induced by 3 doses of valsartan with that obtained with 3 other antagonists at given doses. METHODS: Valsartan (80, 160, and 320 mg), 50 mg losartan, 150 mg irbesartan, and 8 mg candesartan were administered to 24 healthy subjects in a randomized, open-label, 3-period crossover study. All doses were given once daily for 8 days. The angiotensin II receptor blockade was assessed with two techniques, the reactive rise in plasma renin activity and an in vitro radioreceptor binding assay that quantified the displacement of angiotensin II by the blocking agents. Measurements were obtained before and 4 and 24 hours after drug intake on days 1 and 8. RESULTS: At 4 and 24 hours, valsartan induced a dose-dependent "blockade" of AT(1) receptors. Compared with other antagonists, 80 mg valsartan and 50 mg losartan had a comparable profile. The 160-mg and 320-mg doses of valsartan blocked AT(1) receptors at 4 hours by 80%, which was similar to the effect of 150 mg irbesartan. At trough, however, the valsartan-induced blockade was slightly less than that obtained with irbesartan. With use of plasma renin activity as a marker of receptor blockade, on day 8, 160 mg valsartan was equivalent to 150 mg irbesartan and 8 mg candesartan. CONCLUSIONS: These results show that the differences in angiotensin II receptor blockade observed with the various AT(1) antagonists are explained mainly by differences in dosing. When 160-mg or 320-mg doses were investigated, the effects of valsartan hardly differed from those obtained with recommended doses of irbesartan and candesartan.
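The finding that 160 mg and 320 mg valsartan produce similar blockade is characteristic of a saturating dose-response curve. A sketch with a sigmoid Emax model, where the ED50, Emax and Hill slope are illustrative values chosen for the example, not parameters estimated in the study:

```python
# Hypothetical sigmoid-Emax dose-response: percent AT(1)-receptor
# blockade as a function of dose. ED50 and the Hill slope are
# illustrative only.
def blockade_pct(dose_mg, ed50=60.0, emax=100.0, hill=1.0):
    return emax * dose_mg ** hill / (ed50 ** hill + dose_mg ** hill)

for dose in (80, 160, 320):
    print(f"valsartan {dose} mg -> {blockade_pct(dose):.0f}% blockade")
```

The diminishing increments at higher doses are the point: once doses sit on the flat part of the curve, different AT(1) antagonists become hard to distinguish, consistent with the study's conclusion that the observed differences are mainly explained by dosing.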


The paper presents an approach for mapping of precipitation data. The main goal is to perform spatial predictions and simulations of precipitation fields using geostatistical methods (ordinary kriging, kriging with external drift) as well as machine learning algorithms (neural networks). More practically, the objective is to reproduce simultaneously both the spatial patterns and the extreme values. This objective is best reached by models integrating geostatistics and machine learning algorithms. To demonstrate how such models work, two case studies have been considered: first, a 2-day accumulation of heavy precipitation and second, a 6-day accumulation of extreme orographic precipitation. The first example is used to compare the performance of two optimization algorithms (conjugate gradients and Levenberg-Marquardt) of a neural network for the reproduction of extreme values. Hybrid models, which combine geostatistical and machine learning algorithms, are also treated in this context. The second dataset is used to analyze the contribution of radar Doppler imagery when used as external drift or as input in the models (kriging with external drift and neural networks). Model assessment is carried out by comparing independent validation errors as well as analyzing data patterns.
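Of the methods named above, ordinary kriging is the most self-contained to sketch. The implementation below uses an exponential covariance model with illustrative parameters, and the gauge locations and precipitation values are hypothetical:

```python
import numpy as np

def ordinary_kriging(xy, z, xy_new, range_=1.0, sill=1.0):
    """Ordinary kriging with an exponential covariance model.
    The variogram parameters (range, sill) are illustrative."""
    def cov(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return sill * np.exp(-d / range_)

    n = len(xy)
    # Kriging system: [[C, 1], [1^T, 0]] [w; mu] = [c0; 1]
    # (the extra row enforces that the weights sum to one).
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = cov(xy, xy)
    K[n, n] = 0.0
    rhs = np.ones((n + 1, xy_new.shape[0]))
    rhs[:n, :] = cov(xy, xy_new)
    w = np.linalg.solve(K, rhs)
    return w[:n, :].T @ z

# Hypothetical rain-gauge readings: (x, y) location, precipitation in mm.
xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([10.0, 30.0, 20.0, 60.0])

# Kriging interpolates exactly at the gauges and smooths in between.
grid = np.array([[0.0, 0.0], [0.5, 0.5]])
print(ordinary_kriging(xy, z, grid))
```

Kriging with external drift extends this system with covariate columns (e.g. the radar imagery), and the hybrid models in the paper instead feed such covariates, or kriged residuals, into a neural network.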