899 results for warfarin dosing algorithms


Relevance:

20.00%

Publisher:

Abstract:

In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
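For context (standard notation of the hard-core model, not necessarily the paper's own), the partition function and the critical activity on the infinite d-regular tree read

Z_G(\lambda) = \sum_{I \in \mathcal{I}(G)} \lambda^{|I|},
\qquad
\lambda_c(d) = \frac{(d-1)^{d-1}}{(d-2)^{d}},

where \mathcal{I}(G) denotes the set of independent sets of G and \lambda > 0 is the activity; the abstract's "critical activity" is \lambda_c(d), up to which uniqueness of the Gibbs measure on the infinite d-regular tree holds.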

Relevance:

20.00%

Publisher:

Abstract:

Objectives: Gentamicin is among the most commonly prescribed antibiotics in newborns, but large interindividual variability in exposure levels exists. Based on a population pharmacokinetic analysis of a cohort of unselected neonates, we aimed to validate current dosing recommendations from a recent reference guideline (Neofax®). Methods: From 3039 concentrations collected in 994 preterm (median gestational age 32.3 weeks, range 24.2-36.5) and 455 term newborns treated at the University Hospital of Lausanne between 2006 and 2011, a population pharmacokinetic analysis was performed with NONMEM®. Model-based simulations were used to assess the ability of dosing regimens to bring concentrations into the target ranges: trough ≤ 1 mg/L and peak ~8 mg/L. Results: A two-compartment model best characterized gentamicin pharmacokinetics. Model parameters are presented in the table. Body weight, gestational age and postnatal age positively influence clearance, which decreases under dopamine administration. Body weight and gestational age influence the distribution volume. Model-based simulations confirm that preterm infants need doses above 4 mg/kg and extended dosing intervals, up to 48 hours for very preterm newborns, whereas most term newborns would achieve adequate exposure with 4 mg/kg every 24 hours. More than 90% of neonates would achieve trough concentrations below 2 mg/L and peaks above 6 mg/L under the most recent guidelines. Conclusions: Simulated gentamicin exposure shows good agreement with recent dosing recommendations in terms of target concentration achievement.
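As a rough illustration of what such model-based simulations look like, the sketch below uses a simplified ONE-compartment IV-bolus model with invented parameter values (not the study's two-compartment NONMEM model or its population estimates) to check a 4 mg/kg every-24-hours regimen against the stated targets:

import numpy as np

def simulate_concentrations(dose_mg_per_kg, interval_h, cl_l_per_kg_h=0.05,
                            v_l_per_kg=0.5, n_doses=5, dt=0.1):
    """Superpose mono-exponential decays from repeated IV bolus doses."""
    k = cl_l_per_kg_h / v_l_per_kg            # elimination rate constant (1/h)
    t = np.arange(0.0, n_doses * interval_h, dt)
    conc = np.zeros_like(t)
    for i in range(n_doses):
        after = t >= i * interval_h
        conc[after] += (dose_mg_per_kg / v_l_per_kg) * np.exp(-k * (t[after] - i * interval_h))
    return t, conc

t, c = simulate_concentrations(dose_mg_per_kg=4.0, interval_h=24.0)
last_interval = t >= 4 * 24.0                 # inspect the final dosing interval only
print("peak   (target ~8 mg/L):", round(c[last_interval].max(), 2))
print("trough (target <=1 mg/L):", round(c[last_interval][-1], 2))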

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: We have reported previously that 80 mg valsartan and 50 mg losartan provide less receptor blockade than 150 mg irbesartan in normotensive subjects. In this study we investigated the importance of drug dosing in mediating these differences by comparing the AT(1)-receptor blockade induced by 3 doses of valsartan with that obtained with 3 other antagonists at given doses. METHODS: Valsartan (80, 160, and 320 mg), 50 mg losartan, 150 mg irbesartan, and 8 mg candesartan were administered to 24 healthy subjects in a randomized, open-label, 3-period crossover study. All doses were given once daily for 8 days. The angiotensin II receptor blockade was assessed with two techniques, the reactive rise in plasma renin activity and an in vitro radioreceptor binding assay that quantified the displacement of angiotensin II by the blocking agents. Measurements were obtained before and 4 and 24 hours after drug intake on days 1 and 8. RESULTS: At 4 and 24 hours, valsartan induced a dose-dependent "blockade" of AT(1) receptors. Compared with other antagonists, 80 mg valsartan and 50 mg losartan had a comparable profile. The 160-mg and 320-mg doses of valsartan blocked AT(1) receptors at 4 hours by 80%, which was similar to the effect of 150 mg irbesartan. At trough, however, the valsartan-induced blockade was slightly less than that obtained with irbesartan. With use of plasma renin activity as a marker of receptor blockade, on day 8, 160 mg valsartan was equivalent to 150 mg irbesartan and 8 mg candesartan. CONCLUSIONS: These results show that the differences in angiotensin II receptor blockade observed with the various AT(1) antagonists are explained mainly by differences in dosing. When 160-mg or 320-mg doses were investigated, the effects of valsartan hardly differed from those obtained with recommended doses of irbesartan and candesartan.

Relevance:

20.00%

Publisher:

Abstract:

The paper presents an approach for the mapping of precipitation data. The main goal is to perform spatial predictions and simulations of precipitation fields using geostatistical methods (ordinary kriging, kriging with external drift) as well as machine learning algorithms (neural networks). More practically, the objective is to reproduce simultaneously both the spatial patterns and the extreme values. This objective is best reached by models integrating geostatistics and machine learning algorithms. To demonstrate how such models work, two case studies have been considered: first, a 2-day accumulation of heavy precipitation and, second, a 6-day accumulation of extreme orographic precipitation. The first example is used to compare the performance of two neural-network optimization algorithms (conjugate gradients and Levenberg-Marquardt) in reproducing extreme values. Hybrid models, which combine geostatistical and machine learning algorithms, are also treated in this context. The second dataset is used to analyze the contribution of radar Doppler imagery when used as external drift or as input to the models (kriging with external drift and neural networks). Model assessment is carried out by comparing independent validation errors as well as by analyzing data patterns.
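One common way to combine the two families of methods mentioned above is residual kriging of a neural-network trend. The abstract does not specify which hybrid is used, so the sketch below, with synthetic station data and made-up variogram parameters, is only illustrative:

import numpy as np
from sklearn.neural_network import MLPRegressor

# Neural network models the large-scale trend; ordinary kriging interpolates its residuals.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(200, 2))                 # station coordinates (km)
z = 5 + 0.1 * xy[:, 0] + rng.gamma(2.0, 3.0, 200)       # synthetic precipitation (mm)

net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
net.fit(xy, z)
residuals = z - net.predict(xy)

def variogram(h, nugget=0.5, sill=8.0, vrange=30.0):
    """Exponential variogram model (toy parameters)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / vrange))

def ordinary_kriging(obs_xy, obs_val, query_xy):
    n = len(obs_xy)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(np.linalg.norm(obs_xy[:, None] - obs_xy[None, :], axis=2))
    np.fill_diagonal(A[:n, :n], 0.0)                    # gamma(0) = 0
    A[-1, -1] = 0.0
    preds = []
    for q in query_xy:
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(obs_xy - q, axis=1))
        w = np.linalg.solve(A, b)[:n]                   # kriging weights (sum to 1)
        preds.append(w @ obs_val)
    return np.array(preds)

grid = np.array([[25.0, 75.0], [60.0, 40.0]])           # two query locations
print(net.predict(grid) + ordinary_kriging(xy, residuals, grid))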

Relevance:

20.00%

Publisher:

Abstract:

This paper presents general problems and approaches for spatial data analysis using machine learning algorithms. Machine learning is a very powerful approach to adaptive data analysis, modelling and visualisation. The key feature of machine learning algorithms is that they learn from empirical data and can be used in cases when the modelled environmental phenomena are hidden, nonlinear, noisy and highly variable in space and in time. Most machine learning algorithms are universal and adaptive modelling tools developed to solve the basic problems of learning from data: classification/pattern recognition, regression/mapping and probability density modelling. In the present report some of the widely used machine learning algorithms, namely artificial neural networks (ANN) of different architectures and Support Vector Machines (SVM), are adapted to the analysis and modelling of geo-spatial data. Machine learning algorithms have an important advantage over traditional models of spatial statistics when problems are considered in high-dimensional geo-feature spaces, i.e. when the dimension of the space exceeds 5. Such features are usually generated, for example, from digital elevation models, remote sensing images, etc. An important extension of the models concerns taking into account real-space constraints such as geomorphology, networks, and other natural structures. Recent developments in semi-supervised learning can improve the modelling of environmental phenomena by taking geo-manifolds into account. An important part of the study deals with the analysis of relevant variables and model inputs. This problem is approached using different nonlinear feature selection/feature extraction tools. To demonstrate the application of machine learning algorithms, several interesting case studies are considered: digital soil mapping using SVM; automatic mapping of soil and water system pollution using ANN; natural hazard risk analysis (avalanches, landslides); and assessment of renewable resources (wind fields) with SVM and ANN models. The dimensionality of the spaces considered varies from 2 to more than 30. Figures 1, 2 and 3 demonstrate some results of the studies and their outputs. Finally, the results of environmental mapping are discussed and compared with traditional geostatistical models.
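A minimal sketch of the generic SVM-based mapping workflow described above, using synthetic geo-features (coordinates, a toy elevation value and a distance-to-source feature) rather than the real soil, pollution or wind datasets behind the case studies:

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 300
coords = rng.uniform(0, 50, size=(n, 2))                      # x, y in km
elevation = 400 + 20 * np.sin(coords[:, 0] / 5.0)             # toy DEM-derived feature
dist_to_source = np.linalg.norm(coords - [25.0, 25.0], axis=1)
features = np.column_stack([coords, elevation, dist_to_source])
target = 10 * np.exp(-dist_to_source / 15.0) + 0.01 * elevation + rng.normal(0, 0.3, n)

# Support vector regression in the geo-feature space, with standardised inputs.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(features[:200], target[:200])                       # 200 "stations" for training
rmse = np.sqrt(np.mean((model.predict(features[200:]) - target[200:]) ** 2))
print("independent validation RMSE:", round(rmse, 3))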

Relevance:

20.00%

Publisher:

Abstract:

An iPad application serving as a repository of content related to the teaching of computer science courses.

Relevance:

20.00%

Publisher:

Abstract:

To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan and organ-specific dose distributions were determined with MC and the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from the out-of-field head scatter fluence (>40%), which is not adequately modeled by the TPS. Based on MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and the hybrid IMRT techniques, which yielded similar out-of-field doses. In sharp contrast to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared to conventional techniques and may offer the optimal plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).

Relevance:

20.00%

Publisher:

Abstract:

In this paper a novel methodology aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption is introduced. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact and offering minimum failure probabilities at the same time are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
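The abstract does not detail the proposed algorithms, but a generic two-step routing idea of this kind can be sketched as follows (invented topology, link costs and failure probabilities; networkx is used only to enumerate candidate paths):

import itertools
import networkx as nx

G = nx.Graph()
G.add_edge("A", "B", cost=1, p_fail=0.010)
G.add_edge("B", "D", cost=1, p_fail=0.020)
G.add_edge("A", "C", cost=2, p_fail=0.001)
G.add_edge("C", "D", cost=2, p_fail=0.002)

def path_failure_probability(G, path):
    """Probability that at least one link on the path fails (independent links)."""
    p_ok = 1.0
    for u, v in zip(path, path[1:]):
        p_ok *= 1.0 - G[u][v]["p_fail"]
    return 1.0 - p_ok

def path_cost(G, path):
    return sum(G[u][v]["cost"] for u, v in zip(path, path[1:]))

# Step 1: enumerate a few candidate paths ordered by resource cost.
candidates = list(itertools.islice(nx.shortest_simple_paths(G, "A", "D", weight="cost"), 3))

# Step 2: among the candidates, minimise failure probability, breaking ties by cost.
best = min(candidates, key=lambda p: (path_failure_probability(G, p), path_cost(G, p)))
print(best, round(path_failure_probability(G, best), 4))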

Relevance:

20.00%

Publisher:

Abstract:

IP-based networks still do not offer the degree of reliability required by new multimedia services; achieving such reliability will be crucial to the success or failure of the next Internet generation. Most existing schemes for QoS routing do not take into consideration parameters concerning the quality of protection, such as packet loss or restoration time. In this paper, we define a new paradigm to develop protection strategies for building reliable MPLS networks, based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure. Having mathematically formulated these components, we point out the most relevant ones. Experimental results demonstrate the benefits of the NPD when it is used to enhance current QoS routing algorithms to offer a certain degree of protection.
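The actual NPD formulation is not given in the abstract; purely as an illustration of how an a priori FSD term and an a posteriori FID term could be combined into one path-ranking score, consider the toy helper below (weights and values are arbitrary and do NOT reproduce the paper's formulation):

def network_protection_degree(fsd, fid, alpha=0.5):
    """Toy score in [0, 1]; lower is better.
    fsd: estimated failure probability of the path (a priori)
    fid: normalised impact if the path fails (a posteriori)"""
    return alpha * fsd + (1.0 - alpha) * fid

paths = {"working": (0.03, 0.60), "backup-1": (0.01, 0.45), "backup-2": (0.02, 0.20)}
ranked = sorted(paths, key=lambda name: network_protection_degree(*paths[name]))
print(ranked)          # candidate paths ordered from best to worst protected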

Relevance:

20.00%

Publisher:

Abstract:

In image segmentation, clustering algorithms are very popular because they are intuitive and, in some cases, easy to implement. For instance, the k-means is one of the most used in the literature, and many authors successfully compare their new proposals with the results achieved by the k-means. However, it is well known that clustering-based image segmentation has many problems. For instance, the number of regions of the image has to be known a priori, and different initial seed placements (initial clusters) can produce different segmentation results. Most of these algorithms can be slightly improved by considering the coordinates of the image as features in the clustering process (to take spatial region information into account). In this paper we propose a significant improvement of clustering algorithms for image segmentation. The method is qualitatively and quantitatively evaluated over a set of synthetic and real images, and compared with classical clustering approaches. Results demonstrate the validity of this new approach.
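For reference, the classical baseline discussed above (k-means with pixel coordinates added as features) can be sketched in a few lines; the image is a random placeholder and the number of regions k is assumed known, which is exactly one of the limitations pointed out:

import numpy as np
from sklearn.cluster import KMeans

# Each pixel is described by its colour AND its (row, col) position, with coordinates
# scaled to [0, 1] so they are commensurate with the colour channels.
image = np.random.default_rng(0).random((64, 64, 3))    # stand-in RGB image in [0, 1]
rows, cols = np.indices(image.shape[:2])
features = np.column_stack([
    image.reshape(-1, 3),
    rows.reshape(-1, 1) / image.shape[0],
    cols.reshape(-1, 1) / image.shape[1],
])

k = 4                                                   # number of regions, fixed a priori
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
segmentation = labels.reshape(image.shape[:2])          # per-pixel region labels
print(segmentation.shape, np.unique(segmentation))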

Relevance:

20.00%

Publisher:

Abstract:

This letter presents a comparison between three Fourier-based motion compensation (MoCo) algorithms for airborne synthetic aperture radar (SAR) systems. These algorithms circumvent the limitations of conventional MoCo, namely the assumption of a reference height and the beam-center approximation. All these approaches rely on the inherent time–frequency relation in SAR systems but exploit it differently, with the consequent differences in accuracy and computational burden. After a brief overview of the three approaches, the performance of each algorithm is analyzed with respect to azimuthal topography accommodation, angle accommodation, and maximum frequency of track deviations with which the algorithm can cope. Also, an analysis of the computational complexity is presented. Quantitative results are shown using real data acquired by the Experimental SAR system of the German Aerospace Center (DLR).
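The time–frequency relation these algorithms exploit is, in its standard form (stated here for context; the letter's exact notation may differ),

f_a = \frac{2 v}{\lambda}\,\sin\theta,

where f_a is the instantaneous azimuth (Doppler) frequency, v the platform velocity, \lambda the carrier wavelength, and \theta the instantaneous squint angle to the target; this mapping between azimuth frequency and look angle is what allows angle- and topography-dependent phase corrections to be applied in the azimuth frequency domain.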

Relevance:

20.00%

Publisher:

Abstract:

Given the significant impact the use of glucocorticoids can have on fracture risk independent of bone density, their use has been incorporated as one of the clinical risk factors for calculating the 10-year fracture risk in the World Health Organization's Fracture Risk Assessment Tool (FRAX®). Like the other clinical risk factors, the use of glucocorticoids is included as a dichotomous variable, with use of steroids defined as past or present exposure, for 3 months or more, to a daily dose of 5 mg or more of prednisolone or equivalent. The purpose of this report is to give clinicians guidance on adjustments that should be made to the 10-year risk based on the dose, duration of use and mode of delivery of glucocorticoid preparations. A subcommittee of the joint International Society for Clinical Densitometry and International Osteoporosis Foundation Position Development Conference presented its findings to an expert panel and the following recommendations were selected. 1) There is a dose relationship between glucocorticoid use of greater than 3 months and fracture risk. The average dose exposure captured within FRAX® is likely to be a prednisone dose of 2.5-7.5 mg/day or its equivalent. Fracture probability is under-estimated when the prednisone dose is greater than 7.5 mg/day and over-estimated when it is less than 2.5 mg/day. 2) Frequent intermittent use of higher doses of glucocorticoids increases fracture risk. Because of the variability in dose and dosing schedule, quantification of this risk is not possible. 3) High-dose inhaled glucocorticoids may be a risk factor for fracture. FRAX® may underestimate fracture probability in users of high-dose inhaled glucocorticoids. 4) Appropriate glucocorticoid replacement in individuals with adrenal insufficiency has not been found to increase fracture risk. In such patients, use of glucocorticoids should not be included in FRAX® calculations.
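Recommendation 1 amounts to a dose-dependent correction of the FRAX output. The helper below encodes only the qualitative direction of that adjustment; the actual multiplicative factors are not stated in this abstract and must be taken from the full recommendations, so they are left as caller-supplied parameters:

def frax_glucocorticoid_adjustment(ten_year_risk_pct, prednisone_mg_per_day,
                                   low_dose_factor=None, high_dose_factor=None):
    """Return (adjusted 10-year probability, note). The published low/high-dose factors
    are not given here and must be supplied by the caller."""
    if prednisone_mg_per_day < 2.5:
        if low_dose_factor is None:
            return ten_year_risk_pct, "FRAX over-estimates risk: apply the published low-dose factor"
        return ten_year_risk_pct * low_dose_factor, "adjusted downward"
    if prednisone_mg_per_day > 7.5:
        if high_dose_factor is None:
            return ten_year_risk_pct, "FRAX under-estimates risk: apply the published high-dose factor"
        return ten_year_risk_pct * high_dose_factor, "adjusted upward"
    return ten_year_risk_pct, "within the average exposure assumed by FRAX"

print(frax_glucocorticoid_adjustment(12.0, 10.0))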