191 results for Predictive Modelling


Abstract:

Reliable quantification of the macromolecule signals in short echo-time H-1 MRS spectra is particularly important at high magnetic fields for accurate quantification of metabolite concentrations (the neurochemical profile), due to the effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two quantification approaches that take the contribution of macromolecules into account in the quantification step. H-1 spectra were acquired on a 14.1 T/26 cm horizontal scanner on five rats using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters and either the spectrum of macromolecules measured in vivo using an inversion recovery technique, or a baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules but, in accordance with previous studies using Subtract-QUEST, could not completely reproduce all features of the in vivo spectrum of macromolecules at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.
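
The quantification step described above amounts to fitting the measured spectrum as a linear combination of metabolite basis spectra plus a single macromolecule term. A minimal sketch of that idea is given below; it is not LCModel itself, and the non-negative least-squares fit, array shapes and function name are illustrative assumptions.

    # Minimal sketch of linear-combination fitting (illustrative, not LCModel):
    # the measured spectrum is modelled as a non-negative weighted sum of
    # simulated metabolite basis spectra plus one macromolecule component,
    # which can be either a measured in vivo MM spectrum or a spline baseline.
    import numpy as np
    from scipy.optimize import nnls

    def fit_neurochemical_profile(spectrum, metabolite_basis, macromolecule):
        """spectrum: (n_points,); metabolite_basis: (n_points, n_metabolites);
        macromolecule: (n_points,) measured in vivo or spline-simulated."""
        design = np.column_stack([metabolite_basis, macromolecule])
        weights, misfit = nnls(design, spectrum)
        return weights[:-1], weights[-1], misfit  # metabolite amplitudes, MM scale, residual norm

Swapping a spline-simulated baseline for the measured in vivo macromolecule spectrum in the last argument is, in essence, the comparison made in the study.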

Abstract:

Attrition in longitudinal studies can lead to biased results. The study was motivated by the unexpected observation that alcohol consumption decreased despite increased availability, which may be due to sample attrition of heavy drinkers. Several imputation methods have been proposed, but rarely compared, in longitudinal studies of alcohol consumption. The imputation of consumption-level measurements is computationally particularly challenging because alcohol consumption is a semi-continuous variable (dichotomous drinking status and continuous volume among drinkers) and the data in the continuous part are non-normal. Data come from a longitudinal study in Denmark with four waves (2003-2006) and 1771 individuals at baseline. Five techniques for missing data were compared: last value carried forward (LVCF) was used as a single imputation method, while Hotdeck, Heckman modelling, multivariate imputation by chained equations (MICE), and a Bayesian approach were used as multiple imputation methods. Predictive mean matching was used to account for non-normality: instead of imputing regression estimates, "real" observed values from similar cases were imputed. The methods were also compared by means of a simulated dataset. The simulation showed that the Bayesian approach yielded the least biased estimates for imputation. The finding of no increase in consumption levels despite higher availability remained unaltered. Copyright (C) 2011 John Wiley & Sons, Ltd.
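
The predictive mean matching step can be illustrated for a single continuous variable: a regression fitted on complete cases provides predicted means, and each missing value is replaced by the observed value of a nearby 'donor' case rather than by the regression estimate itself. The sketch below is a simplified single-imputation version (the study embedded the idea in multiple-imputation procedures such as MICE); the function name, donor count and use of ordinary least squares are assumptions.

    # Simplified predictive mean matching for one continuous variable
    # (e.g. drinking volume among drinkers); illustrative only.
    import numpy as np

    def pmm_impute(X, y, k=5, seed=None):
        """X: (n, p) covariates; y: (n,) outcome with np.nan for missing values.
        Returns a copy of y with missing entries filled by observed donor values."""
        rng = np.random.default_rng(seed)
        obs = ~np.isnan(y)
        Xd = np.column_stack([np.ones(len(y)), X])            # add intercept
        beta, *_ = np.linalg.lstsq(Xd[obs], y[obs], rcond=None)
        pred = Xd @ beta                                       # predicted means, all cases
        y_imp = y.copy()
        for i in np.flatnonzero(~obs):
            dist = np.abs(pred[obs] - pred[i])                 # distance in predicted-mean space
            donors = np.argsort(dist)[:k]                      # k closest observed cases
            y_imp[i] = rng.choice(y[obs][donors])              # impute a real observed value
        return y_imp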

Abstract:

In this paper, a phenomenologically motivated, magneto-mechanically coupled, finite strain elastic framework for simulating the curing process of polymers in the presence of a magnetic load is proposed. This approach is in line with previous work by Hossain and co-workers on a finite strain modelling framework for purely mechanical polymer curing (Hossain et al., 2009b). The proposed thermodynamically consistent approach is independent of any particular free energy function that may be used for modelling the fully-cured magneto-sensitive polymer, i.e. any phenomenological or micromechanically inspired free energy can be inserted into the main modelling framework. For the fabrication of magneto-sensitive polymers, micron-size ferromagnetic particles are mixed with the liquid matrix material in the uncured stage. The particles align in a preferred direction when a magnetic field is applied during the curing process. The polymer curing process is a complex (visco)elastic process that transforms a fluid into a solid over time. Such a transformation process is modelled by an appropriate constitutive relation that takes into account the temporal evolution of the material parameters appearing in a particular energy function. For demonstration in this work, a frequently used energy function is chosen, namely the classical Mooney-Rivlin free energy enhanced by coupling terms. Several representative numerical examples demonstrate the capability of our approach to correctly capture common features of polymers undergoing curing in the presence of a coupled magneto-mechanical load.
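
As an illustration of the kind of energy function meant here, one plausible form (a sketch under common magneto-elastic conventions, not necessarily the exact function used in the paper) combines the Mooney-Rivlin terms with standard coupling invariants and lets the material parameters evolve with cure time t:

    \Psi(\mathbf{C}, \mathbb{B}, t) = c_1(t)\,(I_1 - 3) + c_2(t)\,(I_2 - 3)
        + \alpha(t)\, I_4 + \beta(t)\, I_5,
    \qquad I_4 = \mathbb{B}\cdot\mathbb{B}, \quad I_5 = \mathbb{B}\cdot(\mathbf{C}\,\mathbb{B}),

where C is the right Cauchy-Green tensor, B the referential magnetic induction, I_1 and I_2 the usual invariants of C, and the curing process enters through the temporal evolution of c_1(t), c_2(t), alpha(t) and beta(t) (for example, saturating functions of cure time).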

Abstract:

Background: The imatinib trough plasma concentration (C(min)) correlates with clinical response in cancer patients. Therapeutic drug monitoring (TDM) of plasma C(min) is therefore suggested. In practice, however, blood sampling for TDM is often not performed at trough. The corresponding measurement is thus only remotely informative about C(min) exposure. Objectives: The objectives of this study were to improve the interpretation of randomly measured concentrations by using a Bayesian approach for the prediction of C(min), incorporating correlation between pharmacokinetic parameters, and to compare the predictive performance of this method with alternative approaches, by comparing predictions with actual measured trough levels, and with predictions obtained by a reference method, respectively. Methods: A Bayesian maximum a posteriori (MAP) estimation method accounting for correlation (MAP-ρ) between pharmacokinetic parameters was developed on the basis of a population pharmacokinetic model, which was validated on external data. Thirty-one paired random and trough levels, observed in gastrointestinal stromal tumour patients, were then used for the evaluation of the Bayesian MAP-ρ method: individual C(min) predictions, derived from single random observations, were compared with actual measured trough levels for assessment of predictive performance (accuracy and precision). The method was also compared with alternative approaches: classical Bayesian MAP estimation assuming uncorrelated pharmacokinetic parameters, linear extrapolation along the typical elimination constant of imatinib, and non-linear mixed-effects modelling (NONMEM) first-order conditional estimation (FOCE) with interaction. Predictions of all methods were finally compared with 'best-possible' predictions obtained by a reference method (NONMEM FOCE, using both random and trough observations for individual C(min) prediction). Results: The developed Bayesian MAP-ρ method accounting for correlation between pharmacokinetic parameters allowed unbiased prediction of imatinib C(min) with a precision of ±30.7%. This predictive performance was similar for the alternative methods that were applied. The range of relative prediction errors was, however, smallest for the Bayesian MAP-ρ method and largest for the linear extrapolation method. When compared with the reference method, predictive performance was comparable for all methods. The time interval between random and trough sampling did not influence the precision of Bayesian MAP-ρ predictions. Conclusion: Clinical interpretation of randomly measured imatinib plasma concentrations can be assisted by Bayesian TDM. Classical Bayesian MAP estimation can be applied even without consideration of the correlation between pharmacokinetic parameters. Individual C(min) predictions are expected to vary less through Bayesian TDM than through linear extrapolation. Bayesian TDM could be developed in the future for other targeted anticancer drugs and for the prediction of other pharmacokinetic parameters that have been correlated with clinical outcomes.
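
The core of the MAP-ρ idea can be sketched as a penalised fit in which the individual's log-parameters receive a multivariate normal prior whose covariance matrix carries the correlations. The sketch below uses a one-compartment oral-absorption model and placeholder names purely for illustration; the published population model for imatinib is different and none of the quantities come from the study.

    # Illustrative MAP estimation with correlated random effects ('MAP-rho' idea):
    # the prior covariance omega includes off-diagonal terms, so shrinkage of one
    # parameter informs the others. Not the published imatinib model.
    import numpy as np
    from scipy.optimize import minimize

    def conc_1cpt_oral(t, cl, v, ka, dose):
        ke = cl / v
        return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

    def map_estimate(t_obs, c_obs, dose, theta_pop, omega, sigma2):
        """theta_pop: population CL, V, ka; omega: 3x3 covariance of the individual
        log-parameter deviations (correlation lives in its off-diagonal terms);
        sigma2: residual error variance."""
        omega_inv = np.linalg.inv(omega)

        def neg_log_posterior(eta):
            cl, v, ka = theta_pop * np.exp(eta)            # individual parameters
            pred = conc_1cpt_oral(t_obs, cl, v, ka, dose)
            data_term = np.sum((c_obs - pred) ** 2) / sigma2
            prior_term = eta @ omega_inv @ eta             # correlated prior penalty
            return data_term + prior_term

        fit = minimize(neg_log_posterior, x0=np.zeros(3), method="Nelder-Mead")
        return theta_pop * np.exp(fit.x)                   # MAP individual CL, V, ka

With the individual parameters in hand, the randomly timed observation can be extrapolated to the trough time to give the predicted C(min); setting the off-diagonal entries of omega to zero recovers the classical uncorrelated MAP estimate used as a comparator in the study.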

Abstract:

AIM: Hyperglycaemia is now a recognized predictive factor of morbidity and mortality after coronary artery bypass grafting (CABG). For this reason, we aimed to evaluate the postoperative management of glucose control in patients undergoing cardiovascular surgery, and to assess the impact of glucose levels on in-hospital mortality and morbidity. METHODS: This was a retrospective study investigating the association between postoperative blood glucose and outcomes, including death, post-surgical complications, and length of stay in the intensive care unit (ICU) and in hospital. RESULTS: A total of 642 consecutive patients were enrolled into the study after cardiovascular surgery (CABG, carotid endarterectomy and bypass in the lower limbs). Patients' mean age was 68 ± 10 years, and 74% were male. In-hospital mortality was 5% in diabetic patients vs 2% in non-diabetic patients (OR: 1.66, P=0.076). Having blood glucose levels in the upper quartile range (≥8.8 mmol/L) on postoperative day 1 was independently associated with death (OR: 10.16, P=0.0002), infectious complications (OR: 1.76, P=0.04) and prolonged ICU stay (OR: 3.10, P<0.0001). Patients presenting with three or more hypoglycaemic episodes (<4.1 mmol/L) had increased rates of mortality (OR: 9.08, P<0.0001) and complications (OR: 8.57, P<0.0001). CONCLUSION: Glucose levels greater than 8.8 mmol/L on postoperative day 1 and having three or more hypoglycaemic episodes in the postoperative period were predictive of mortality and morbidity among patients undergoing cardiovascular surgery. This suggests that a multidisciplinary approach may be able to achieve better postoperative blood glucose control.

Abstract:

The impact of curative radiotherapy depends mainly on the total dose delivered homogeneously in the targeted volume. Nevertheless, the dose delivered to the surrounding healthy tissues may reduce the therapeutic ratio of many radiation treatments. Within a given population treated in one center with the same technique, individual radiosensitivity clearly exists, particularly in terms of late side effects, which are in principle non-reversible. This review details the different radiobiological approaches that have been developed to better understand the mechanisms of radiation-induced late effects. We also present the possibilities of clinical use of predictive assays in the near future.

Abstract:

Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out, incorporating information about debris flow initiation probability (spatial and temporal), and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for the runout zone delimitation. Firstly, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and the analysis of two sets of aerial photographs for the temporal probability estimation. Afterwards, the hazard initiation map was used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). Estimation of the debris flow magnitude was neglected, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model, with a 10 m resolution, was used together with land use, geology and debris flow hazard initiation maps as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Afterwards, runout areas were calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates the information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. Limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach to debris flow hazard analysis, combining automatic detection of the source areas with a simple assessment of debris flow spreading, provided results for subsequent hazard and risk studies. However, for the validation and transferability of the parameters and results to other study areas, more testing is needed.
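
The two spreading ingredients mentioned above, multiple flow direction and an energy-based stopping rule, can be sketched as follows. This is a simplified illustration of the general technique, not the actual Flow-R implementation, and the exponent, cell size and travel angle are placeholder values.

    # Simplified runout-spreading ingredients (illustrative, not Flow-R itself).
    import numpy as np

    def mfd_weights(dem, i, j, cell_size=10.0, exponent=4.0):
        """Holmgren-type multiple flow direction weights for an interior cell (i, j):
        flow is passed to each lower neighbour in proportion to slope**exponent."""
        weights = np.zeros((3, 3))
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if di == 0 and dj == 0:
                    continue
                dist = cell_size * np.hypot(di, dj)
                slope = (dem[i, j] - dem[i + di, j + dj]) / dist
                if slope > 0:                              # only downslope neighbours
                    weights[di + 1, dj + 1] = slope ** exponent
        total = weights.sum()
        return weights / total if total > 0 else weights

    def remaining_energy(energy, drop, travel_dist, travel_angle_deg=11.0):
        """Constant travel-angle (friction-line) rule, in height units: propagation
        from a source cell stops once the remaining energy falls to zero."""
        return max(0.0, energy + drop - travel_dist * np.tan(np.radians(travel_angle_deg)))

Iterating these two steps outward from each source cell, and keeping cells that still receive a positive weight and positive remaining energy, delimits the potential runout area.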

Abstract:

The paper describes how to integrate audience measurement and site visibility, the two main research approaches in outdoor advertising, into a single concept. Details are given on how GPS is used on a large scale in Switzerland for mobility analysis and audience measurement. Furthermore, the development of a software solution is introduced that allows the integration of all mobility data and poster location information. Finally, a model and its results are presented for the calculation of the coverage of individual poster campaigns and of the number of contacts generated by each billboard.
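
For the two campaign metrics mentioned, coverage and contacts per billboard, a toy calculation could look like the sketch below. It assumes the GPS trips have already been intersected with the poster visibility zones, so that each record is a single (person, poster) contact; all names are hypothetical.

    # Illustrative campaign metrics from pre-computed contact events.
    from collections import Counter

    def campaign_metrics(contacts, population_size, campaign_posters):
        """contacts: iterable of (person_id, poster_id) contact events."""
        in_campaign = [(p, b) for p, b in contacts if b in campaign_posters]
        reached = {p for p, _ in in_campaign}
        coverage = len(reached) / population_size             # share of people with >= 1 contact
        contacts_per_poster = Counter(b for _, b in in_campaign)
        return coverage, contacts_per_poster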

Abstract:

Research has demonstrated that landscape or watershed scale processes can influence instream aquatic ecosystems through the delivery of fine sediment, solutes and organic matter. Testing such impacts upon populations of organisms (i.e. at the catchment scale) has not proven straightforward, and differences have emerged in the conclusions reached. This is (1) partly because different studies have focused upon different scales of enquiry, but also (2) because the emphasis upon upstream land cover has rarely addressed the extent to which such land covers are hydrologically connected, and hence able to deliver diffuse pollution, to the drainage network. However, there is a third issue. In order to develop suitable hydrological models, we need to conceptualise the process cascade. To do this, we need to know what matters to the organism being impacted by the hydrological system, such that we can identify which processes need to be modelled. Acquiring such knowledge is not easy, especially for organisms like fish that might occupy very different locations in the river over relatively short periods of time. However, and inevitably, hydrological modellers have started by building up piecemeal the aspects of the problem that we think matter to fish. Herein, we report two developments: (a) for the case of sediment-associated diffuse pollution from agriculture, a risk-based modelling framework, SCIMAP, has been developed, which is distinct because it has an explicit focus upon hydrological connectivity; and (b) we use spatially distributed ecological data to infer the processes and the associated process parameters that matter to salmonid fry. We apply the model to spatially distributed salmon and fry data from the River Eden, Cumbria, England. The analysis shows, quite surprisingly, that arable land covers are relatively unimportant as drivers of fry abundance. What matters most is intensive pasture, a land cover that could be associated with a number of stressors on salmonid fry (e.g. pesticides, fine sediment) and which allows us to identify a series of risky field locations, where this land cover is readily connected to the river system by overland flow. (C) 2010 Elsevier B.V. All rights reserved.
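
The emphasis on hydrological connectivity can be made concrete with a small sketch in the spirit of the framework described (not the actual SCIMAP code): a cell's land-cover risk only counts if it remains hydrologically connected all the way to the channel, approximated here by a threshold on the minimum wetness index along its downslope flow path. The threshold value, array layout and function name are assumptions.

    # Connectivity-weighted risk along one flow path (illustrative, not SCIMAP).
    import numpy as np

    def delivered_risk(landcover_risk, wetness, threshold=6.5):
        """Arrays are ordered along one flow path from hillslope (index 0) to the
        channel (last index). A cell only delivers its land-cover risk if every cell
        between it and the channel is wet enough to keep it connected."""
        downstream_min = np.minimum.accumulate(np.asarray(wetness)[::-1])[::-1]
        connected = downstream_min >= threshold            # crude connectivity test
        return float(np.sum(np.asarray(landcover_risk)[connected]))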

Abstract:

To assess the effectiveness of a multidisciplinary evaluation and referral process in a prospective cohort of general hospital patients with alcohol dependence. Alcohol-dependent patients were identified in the wards of the general hospital and its primary care center. They were evaluated and then referred to treatment by a multidisciplinary team; patients who agreed to participate in this cohort study were consecutively included and followed for 6 months. Patients who were not included were lost to follow-up, whereas all included patients were assessed at the time of inclusion and 2 and 6 months later by a research psychologist, in order to collect standardized baseline patient characteristics, salient process features and patient outcomes (defined as treatment adherence and abstinence). Multidisciplinary evaluation and therapeutic referral was feasible and effective, with a success rate of 43% for treatment adherence and 28% for abstinence at 6 months. Among patient characteristics, predictors of success were an age over 45, not living alone, being employed and being motivated to treatment (RAATE-A score < 18), whereas successful process characteristics included detoxification of the patient at the time of referral and a full multidisciplinary referral meeting. This multidisciplinary model of evaluation and referral of alcohol-dependent patients in a general hospital had a satisfactory level of effectiveness. The identified predictors of success and failure make it possible to identify subsets of patients for whom new strategies of motivation and treatment referral should be designed.

Abstract:

OBJECTIVE: To compare the pharmacokinetic and pharmacodynamic characteristics of angiotensin II receptor antagonists as a therapeutic class. DESIGN: Population pharmacokinetic-pharmacodynamic modelling study. METHODS: Data from 14 phase I studies with 10 different drugs were analysed. A common population pharmacokinetic model (two compartments, mixed zero- and first-order absorption, two metabolite compartments) was applied to the 2685 drug and 900 metabolite concentration measurements. A standard nonlinear mixed-effects modelling approach was used to estimate the drug-specific parameters and their variabilities. Similarly, a pharmacodynamic model was applied to the 7360 effect measurements, i.e. the decrease in the peak blood pressure response to intravenous angiotensin challenge recorded by finger photoplethysmography. The concentration of drug and metabolite in an effect compartment was assumed to translate into receptor blockade [maximum effect (Emax) model with first-order link]. RESULTS: A general pharmacokinetic-pharmacodynamic (PK-PD) model for angiotensin antagonism in healthy individuals was successfully built for the 10 drugs studied. Representatives of this class show distinct pharmacokinetic and pharmacodynamic profiles. Their effects on blood pressure are dose-dependent, but the time course of the effect varies between the drugs. CONCLUSIONS: The characterisation of PK-PD relationships for these drugs gives the opportunity to optimise therapeutic regimens and to suggest dosage adjustments in specific conditions. Such a model can be used to further refine the use of this class of drugs.
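
The pharmacodynamic half of such a model, an effect compartment linked to plasma concentration by a first-order rate constant and an Emax relation for receptor blockade, can be sketched as below. The Euler integration, function name and any parameter values are illustrative assumptions, not the study's estimates.

    # Effect-compartment / Emax sketch (illustrative parameterisation).
    import numpy as np

    def effect_time_course(t, cp, ke0, emax, ec50):
        """t, cp: time grid and plasma concentrations (same length).
        Returns the predicted inhibition of the blood-pressure response."""
        ce = np.zeros_like(cp, dtype=float)
        for k in range(1, len(t)):                       # first-order link, Euler step
            dt = t[k] - t[k - 1]
            ce[k] = ce[k - 1] + ke0 * (cp[k - 1] - ce[k - 1]) * dt
        return emax * ce / (ec50 + ce)                   # Emax model at the effect site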