886 results for "Multivariate measurement model"


Relevance: 30.00%

Abstract:

Brain functions, such as learning, orchestrating locomotion, memory recall, and processing information, all require glucose as a source of energy. During these functions, the glucose concentration decreases as the glucose is consumed by brain cells. By measuring this drop in concentration, it is possible to determine which parts of the brain are used during specific functions and, consequently, how much energy the brain requires to complete the function. One way to measure in vivo brain glucose levels is with a microdialysis probe. The drawback of this analytical procedure, as with many steady-state fluid flow systems, is that the probe fluid will not reach equilibrium with the brain fluid. Therefore, the brain concentration is inferred by taking samples at multiple inlet glucose concentrations and finding a point of convergence. The goal of this thesis is to create a three-dimensional, time-dependent, finite element representation of the brain-probe system in COMSOL 4.2 that describes the diffusion and convection of glucose. Once validated with experimental results, this model can then be used to test parameters that experiments cannot access. When simulations were run using published values for physical constants (i.e., diffusivities, density and viscosity), the resulting model glucose concentrations were within the error of the experimental data, verifying that the model is an accurate representation of the physical system. In addition to accurately describing the experimental brain-probe system, the model I created is able to show the validity of zero-net-flux for a given experiment. A useful discovery is that the slope of the zero-net-flux line depends on perfusate flow rate and diffusion coefficients, but is independent of brain glucose concentration. The model was simplified with the realization that the perfusate is at thermal equilibrium with the brain throughout the active region of the probe, which allowed the assumption that all model parameters are temperature independent. The time to steady state for the probe is approximately one minute. However, the signal degrades in the exit tubing due to Taylor dispersion, on the order of two minutes for two meters of tubing. Given an analytical instrument requiring a five-μL aliquot, the smallest brain process measurable with this system is 13 minutes.
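
As a hedged illustration of the zero-net-flux analysis mentioned above (not code from the thesis, and with hypothetical numbers): the outlet-minus-inlet glucose difference is regressed against the inlet concentration, and the x-intercept estimates the brain extracellular concentration.

    import numpy as np

    # Hypothetical perfusion experiment: inlet (perfusate) and measured outlet glucose, in mM.
    c_in = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    c_out = np.array([0.45, 0.78, 1.10, 1.72, 2.95])

    # Net gain (+) or loss (-) of glucose across the probe membrane.
    net_change = c_out - c_in
    slope, intercept = np.polyfit(c_in, net_change, 1)

    # The inlet concentration with zero net flux estimates brain ECF glucose; per the
    # abstract, the slope depends on flow rate and diffusivities, not on this value.
    c_brain = -intercept / slope
    print(f"Estimated brain glucose: {c_brain:.2f} mM (slope {slope:.2f})")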

Relevance: 30.00%

Abstract:

Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a 'Simple Committee' technique that used averaged predictions from a set of 10 input spaces pre-selected using the training data, and a 'Minimum Variance Committee' technique in which the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. This latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (Best Combination Technique), the Simple Committee Technique and the Minimum Variance Committee Technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or in the calibration of the underlying GT-Power model.
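
The two committee ideas lend themselves to a short sketch. The following is a schematic illustration with random stand-in predictions; the array shapes, the number of input spaces and the pre-selected set are all hypothetical, not the authors' implementation.

    import numpy as np

    n_spaces, n_points = 25, 40
    rng = np.random.default_rng(0)
    # preds[m, s, i]: prediction of model m (regression, neural net, k-NN)
    # in transformed input space s for test point i; random stand-in values here.
    preds = rng.normal(size=(3, n_spaces, n_points))

    # 'Simple Committee': average predictions over a fixed set of pre-selected spaces.
    preselected = np.arange(10)                      # e.g. the 10 best spaces on training data
    simple_committee = preds[:, preselected, :].mean(axis=(0, 1))

    # 'Minimum Variance Committee': for each point, use the space where the three
    # modeling methods disagree the least, then average the three methods there.
    disagreement = preds.var(axis=0)                 # shape (n_spaces, n_points)
    best_space = disagreement.argmin(axis=0)         # per-point choice of input space
    min_var_committee = preds[:, best_space, np.arange(n_points)].mean(axis=0)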

Relevance: 30.00%

Abstract:

A new physics-based technique for correcting inhomogeneities present in sub-daily temperature records is proposed. The approach accounts for changes in the sensor-shield characteristics that affect the energy balance, which in turn depends on ambient weather conditions (radiation, wind). An empirical model is formulated that reflects the main atmospheric processes and can be used in the correction step of a homogenization procedure. The model accounts for the short- and long-wave radiation fluxes (including a snow cover component for the albedo calculation) of a measurement system, such as a radiation shield. One part of the flux is further modulated by ventilation. The model requires only cloud cover and wind speed for each day, but detailed site-specific information is necessary. The final model has three free parameters, one of which is a constant offset. The three parameters can be determined, e.g., using the mean offsets for three observation times. The model is developed using the example of the change from the Wild screen to the Stevenson screen in the temperature record of Basel, Switzerland, in 1966. It is evaluated based on parallel measurements with both systems during a sub-period at this location, which were discovered in the course of this work. The model can be used in the correction step of homogenization to distribute a known mean step size to every single measurement, thus providing a reasonable alternative correction procedure for high-resolution historical climate series. It also constitutes an error model, which may be applied, e.g., in data assimilation approaches.
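
A heavily simplified, hypothetical sketch of this kind of correction model is given below. The actual formulation in the paper differs, but the structure described above is retained: a radiation-driven error term damped by ventilation plus a constant offset, driven only by daily cloud cover and wind speed, with three free parameters.

    import numpy as np

    def correction(cloud_cover, wind_speed, a, b, c):
        """Temperature correction (K) for one observation.

        cloud_cover: fraction 0..1; wind_speed: m/s.
        a: radiation sensitivity, b: ventilation damping, c: constant offset
        (hypothetical parameterization, not the paper's equations).
        """
        radiative_error = a * (1.0 - cloud_cover)   # larger error under clear skies
        ventilation = 1.0 + b * wind_speed          # wind reduces the radiative error
        return radiative_error / ventilation + c

    # The three free parameters could be fixed, as the abstract suggests, by matching
    # the mean offsets at three observation times (three equations, three unknowns),
    # or by least squares against parallel measurements.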

Relevance: 30.00%

Abstract:

This paper presents a measurement of the W⁺W⁻ production cross section in pp collisions at √s = 7 TeV. The leptonic decay channels are analyzed using data corresponding to an integrated luminosity of 4.6 fb⁻¹ collected with the ATLAS detector at the Large Hadron Collider. The W⁺W⁻ production cross section σ(pp → W⁺W⁻ + X) is measured to be 51.9 ± 2.0 (stat.) ± 3.9 (syst.) ± 2.0 (lumi.) pb, compatible with the Standard Model prediction of 44.7 +2.1 -1.9 pb. A measurement of the normalized fiducial cross section as a function of the leading lepton transverse momentum is also presented. The reconstructed transverse momentum distribution of the leading lepton is used to extract limits on anomalous WWZ and WWγ couplings.
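
As a quick back-of-the-envelope check, not part of the analysis itself, the quoted uncertainty components can be combined in quadrature and compared with the difference from the Standard Model prediction:

    import math

    stat, syst, lumi = 2.0, 3.9, 2.0                       # pb
    total_exp = math.sqrt(stat**2 + syst**2 + lumi**2)     # about 4.8 pb

    measured, predicted, theory_unc = 51.9, 44.7, 2.1      # pb (upper theory error, relevant side)
    pull = (measured - predicted) / math.sqrt(total_exp**2 + theory_unc**2)
    # Roughly a 1.4 sigma deviation, consistent with "compatible" as stated above.
    print(f"total experimental uncertainty: {total_exp:.1f} pb, deviation: {pull:.1f} sigma")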

Relevance: 30.00%

Abstract:

Triggered event-related functional magnetic resonance imaging requires sparse intervals of temporally resolved functional data acquisitions whose initiation corresponds to the occurrence of an event, typically an epileptic spike in the electroencephalographic trace. However, conventional fMRI time series are greatly affected by non-steady-state magnetization effects, which obscure initial blood oxygen level-dependent (BOLD) signals. Here, conventional echo-planar imaging and a post-processing solution based on principal component analysis were employed to remove the dominant eigenimages of the time series, thereby filtering out the global signal changes induced by magnetization decay and recovering BOLD signals starting with the first functional volume. This approach was compared with a physical solution using radiofrequency preparation, which nullifies magnetization effects. As an application of the method, the detectability of the initial transient BOLD response in the auditory cortex, elicited by the onset of acoustic scanner noise, was used to demonstrate that post-processing-based removal of magnetization effects allows the detection of brain activity patterns identical to those obtained with radiofrequency preparation. Using the auditory responses as an ideal experimental model of triggered brain activity, our results suggest that reducing the initial magnetization effects by removing a few principal components from fMRI data may be useful in the analysis of triggered event-related echo-planar time series. The implications of this study are discussed with particular attention to the remaining technical limitations and the additional neurophysiological issues raised by triggered acquisition.
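
A minimal sketch of the post-processing step described above, assuming the time series is arranged as a volumes-by-voxels matrix (an illustration, not the authors' pipeline):

    import numpy as np

    def remove_leading_components(data, n_remove=2):
        """data: array of shape (n_volumes, n_voxels); returns the filtered series."""
        mean = data.mean(axis=0)
        centered = data - mean
        # Left singular vectors are temporal components; right singular vectors are eigenimages.
        u, s, vt = np.linalg.svd(centered, full_matrices=False)
        s_filtered = s.copy()
        s_filtered[:n_remove] = 0.0              # zero out the dominant eigenimages
        return u @ np.diag(s_filtered) @ vt + mean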

Relevance: 30.00%

Abstract:

Transient inflammation is known to alter visceral sensory function and frequently precedes the onset of symptoms in a subgroup of patients with irritable bowel syndrome (IBS). The duration and severity of the initial inflammatory stimulus appear to be risk factors for the manifestation of symptoms. Therefore, we aimed to characterize the dose-dependent effects of trinitrobenzenesulfonic acid (TNBS)/ethanol on: (1) the colonic mucosa, (2) cytokine release and (3) visceral sensory function in a rat model. Acute inflammation was induced in male Lewis rats by a single administration of various doses of TNBS/ethanol (a total of 0.8, 0.4 or 0.2 ml) in test animals or saline in controls. Assessment of the visceromotor response (VMR) to colorectal distension, histological evaluation of the severity of inflammation, and measurement of pro-inflammatory cytokine levels (IL-2, IL-6) by enzyme-linked immunosorbent assay (ELISA) were performed 2 h and 3, 14, 28, 31 and 42 days after induction. Increased serum IL-2 and IL-6 levels were evident prior to mucosal lesions 2 h after induction of colitis and persisted for up to 14 days (p<0.05 vs. saline), although no histological signs of inflammation were detected at 14 days. In the acute phase, the VMR was significantly increased only after 0.8 ml and 0.4 ml TNBS/ethanol (p<0.05 vs. saline). After 28 days, distension-evoked responses were persistently elevated (p<0.05 vs. saline) in 0.8 and 0.4 ml TNBS/ethanol-treated rats. In the 0.2 ml TNBS/ethanol group, the VMR was enhanced only after repeated visceral stimulation. Visceral hyperalgesia thus occurs after a transient colitis; moreover, even a mild, acute but asymptomatic colitis can induce long-lasting visceral hyperalgesia in the presence of additional stimuli.

Relevance: 30.00%

Abstract:

We propose a new method for fitting proportional hazards models with error-prone covariates. Regression coefficients are estimated by solving an estimating equation that is the average of the partial likelihood scores based on imputed true covariates. For the purpose of imputation, a linear spline model is assumed for the baseline hazard. We discuss the consistency and asymptotic normality of the resulting estimators and propose a stochastic approximation scheme to obtain the estimates. The algorithm is easy to implement and reduces to the ordinary Cox partial likelihood approach when the measurement error has a degenerate distribution. Simulations indicate high efficiency and robustness. We consider the special case where error-prone replicates are available for the unobserved true covariates. As expected, increasing the number of replicates for the unobserved covariates increases efficiency and reduces bias. We illustrate the practical utility of the proposed method with an Eastern Cooperative Oncology Group clinical trial in which a genetic marker, c-myc expression level, is subject to measurement error.
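
The estimating-equation idea can be sketched for a single covariate with simulated data. The imputation step below is a crude stand-in (the paper imputes from a model involving a linear spline baseline hazard), so treat this as an illustration only, not the authors' procedure.

    import numpy as np
    from scipy.optimize import brentq

    rng = np.random.default_rng(1)
    n, k_imputations = 200, 20
    x_true = rng.normal(size=n)
    w = x_true + rng.normal(scale=0.5, size=n)          # error-prone observed covariate
    time = rng.exponential(scale=np.exp(-0.7 * x_true)) # event times with log hazard ratio 0.7
    event = rng.random(n) < 0.8                         # some independent censoring

    def cox_score(beta, x, time, event):
        """Cox partial-likelihood score U(beta) for a single covariate (no ties)."""
        order = np.argsort(-time)                       # largest times first, so risk sets are cumulative
        x, time, event = x[order], time[order], event[order]
        risk = np.exp(beta * x)
        cum_risk = np.cumsum(risk)
        cum_xrisk = np.cumsum(x * risk)
        return np.sum(event * (x - cum_xrisk / cum_risk))

    # Crude imputation stand-in: draw plausible true covariates around the observed w.
    imputations = w[None, :] + rng.normal(scale=0.3, size=(k_imputations, n))

    def averaged_score(beta):
        return np.mean([cox_score(beta, xi, time, event) for xi in imputations])

    beta_hat = brentq(averaged_score, -5.0, 5.0)        # solve the averaged score equation
    print(f"estimated log hazard ratio: {beta_hat:.2f}")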

Relevance: 30.00%

Abstract:

We describe a method for evaluating an ensemble of predictive models given a sample of observations comprising the model predictions and the outcome event measured with error. Our formulation allows us to simultaneously estimate the measurement error parameters, the true outcome (the gold standard) and a relative weighting of the predictive scores. We describe the conditions necessary to estimate the gold standard and for these estimates to be calibrated, and we detail how our approach is related to, but distinct from, standard model combination techniques. We apply our approach to data from a study evaluating a collection of BRCA1/BRCA2 gene mutation prediction scores. In this example, genotype is measured with error by one or more genetic assays. We estimate the true genotype for each individual in the dataset, the operating characteristics of the commonly used genotyping procedures and a relative weighting of the scores. Finally, we compare the scores against the gold standard genotype and find that Mendelian scores are, on average, the more refined and better calibrated of those considered, and that the comparison is sensitive to measurement error in the gold standard.
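
A deliberately simplified illustration of the latent gold-standard idea (ignoring the prediction-score weighting that the paper also estimates): an EM fit of a latent-class model in which the true genotype is unobserved and three conditionally independent assays measure it with error. The data and error rates below are simulated.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 1000
    z = rng.random(n) < 0.2                              # latent true genotype, hidden from the fit
    sens_true, fpr_true = [0.95, 0.90, 0.85], [0.05, 0.02, 0.08]
    assay = np.column_stack([
        np.where(z, rng.random(n) < s, rng.random(n) < f)
        for s, f in zip(sens_true, fpr_true)
    ]).astype(float)

    # EM for prevalence, per-assay sensitivity and false-positive rate.
    prev, sens, fpr = 0.5, np.full(3, 0.8), np.full(3, 0.2)
    for _ in range(200):
        like1 = prev * np.prod(sens**assay * (1 - sens)**(1 - assay), axis=1)
        like0 = (1 - prev) * np.prod(fpr**assay * (1 - fpr)**(1 - assay), axis=1)
        post = like1 / (like1 + like0)                   # E-step: posterior P(mutation | assays)
        prev = post.mean()                               # M-step updates
        sens = (post[:, None] * assay).sum(axis=0) / post.sum()
        fpr = ((1 - post)[:, None] * assay).sum(axis=0) / (1 - post).sum()

    print(f"prevalence {prev:.2f}, sensitivities {sens.round(2)}, false-positive rates {fpr.round(2)}")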

Relevance: 30.00%

Abstract:

Many seemingly disparate approaches to marginal modeling have been developed in recent years. We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to those of the copula-based models proposed herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed effects estimation and interpretation in the analysis of correlated binary data. Moreover, we propose a nomenclature and a set of model relationships that substantially elucidate the complex area of marginalized models for binary data. A diverse collection of didactic mathematical and numerical examples is given to illustrate the concepts.
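
A small sketch of the latent-threshold (Gaussian copula) construction that the abstract refers to, for a single pair of binary outcomes; the marginal probabilities and latent correlation below are hypothetical inputs, not results from the paper.

    import numpy as np
    from scipy.stats import norm, multivariate_normal

    def joint_prob(y1, y2, p1, p2, rho):
        """P(Y1=y1, Y2=y2) with marginals p1, p2 and latent (copula) correlation rho.

        Construction: Y_j = 1 exactly when a latent standard normal variable falls
        below the threshold Phi^{-1}(p_j), which reproduces the marginal probability.
        """
        t1, t2 = norm.ppf(p1), norm.ppf(p2)
        p11 = multivariate_normal.cdf([t1, t2], mean=[0, 0],
                                      cov=[[1, rho], [rho, 1]])   # P(Y1=1, Y2=1)
        table = {(1, 1): p11, (1, 0): p1 - p11,
                 (0, 1): p2 - p11, (0, 0): 1 - p1 - p2 + p11}
        return table[(y1, y2)]

    # Marginal fixed effects enter only through p1 and p2 (e.g. p_j = Phi(x_j' beta)),
    # so beta keeps its population-averaged interpretation while rho captures association.
    print(joint_prob(1, 1, p1=0.3, p2=0.6, rho=0.4))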

Relevance: 30.00%

Abstract:

The purpose of this study is to develop statistical methodology to facilitate indirect estimation of the concentrations of antiretroviral drugs and viral loads in the prostate gland and the seminal vesicle. Differences in antiretroviral drug concentrations between these organs may lead to suboptimal concentrations in one gland compared with the other. Suboptimal levels of antiretroviral drugs will not fully suppress the virus in that gland, leaving a source of sexually transmissible virus and increasing the chance of selecting for drug-resistant virus. This information may be useful in selecting an antiretroviral drug regimen that achieves optimal concentrations in most glands of the male genital tract. Using fractionally collected semen ejaculates, Lundquist (1949) measured levels of surrogate markers in each fraction that are uniquely produced by specific male accessory glands. To determine the original glandular concentrations of the surrogate markers, Lundquist solved a simultaneous series of linear equations. This method has several limitations: in particular, it does not yield a unique solution, it does not address measurement error, and it disregards inter-subject variability in the parameters. To overcome these limitations, we developed a mechanistic latent variable model based on the physiology of the male genital tract and the surrogate markers. We employ a Bayesian approach and perform a sensitivity analysis with regard to the distributional assumptions on the random effects and priors. The model and the Bayesian approach are validated on experimental data in which the concentration of a drug should be (biologically) differentially distributed between the two glands. In this example, the Bayesian model-based conclusions are found to be robust to model specification, and this hierarchical approach leads to more scientifically valid conclusions than the original methodology. In particular, unlike existing methods, the proposed model-based approach was not affected by a common form of outliers.
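
For context, the classical linear-equations approach that the abstract improves upon can be sketched as a least-squares solve with hypothetical numbers: each ejaculate fraction is modeled as a volume-weighted mixture of glandular fluids, with the mixing fractions inferred from the gland-specific surrogate markers.

    import numpy as np

    # Rows: collected fractions; columns: volume fraction contributed by
    # (prostate, seminal vesicle), as estimated from the surrogate markers.
    mixing = np.array([[0.80, 0.20],
                       [0.45, 0.55],
                       [0.15, 0.85],
                       [0.05, 0.95]])
    drug_in_fractions = np.array([7.9, 5.0, 2.4, 1.6])   # measured drug levels, hypothetical units

    glandular, *_ = np.linalg.lstsq(mixing, drug_in_fractions, rcond=None)
    print(f"prostate ~ {glandular[0]:.1f}, seminal vesicle ~ {glandular[1]:.1f}")
    # The Bayesian latent-variable model described above replaces this point solve with a
    # hierarchical model that handles measurement error, inter-subject variability and outliers.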

Relevance: 30.00%

Abstract:

Vitamin C (L-ascorbic acid) is an essential micronutrient that serves as an antioxidant and as a cofactor in many enzymatic reactions. Intestinal absorption and renal reabsorption of the vitamin are mediated by the epithelial apical L-ascorbic acid cotransporter SVCT1 (SLC23A1). We explored the molecular mechanisms of SVCT1-mediated L-ascorbic acid transport using radiotracer and voltage-clamp techniques in RNA-injected Xenopus oocytes. L-ascorbic acid transport was saturable (K0.5 ≈ 70 μM), temperature dependent (Q10 ≈ 5), and energized by the Na+ electrochemical potential gradient. We obtained a Na+:L-ascorbic acid coupling ratio of 2:1 from simultaneous measurement of currents and fluxes. L-ascorbic acid and Na+ saturation kinetics as a function of cosubstrate concentrations revealed a simultaneous transport mechanism in which binding is ordered Na+, L-ascorbic acid, Na+. In the absence of L-ascorbic acid, SVCT1 mediated pre-steady-state currents that decayed with time constants of 3-15 ms. The transients were described by single Boltzmann distributions. At 100 mM Na+, the maximal charge translocation (Qmax) was approximately 25 nC, around a midpoint (V0.5) of -9 mV, with an apparent valence of approximately -1. Qmax was conserved upon progressive removal of Na+, whereas V0.5 shifted to more hyperpolarized potentials. Model simulation predicted that the pre-steady-state current predominantly results from an ion-well effect on binding of the first Na+ partway within the membrane electric field. We present a transport model for SVCT1 that will provide a framework for investigating the impact of specific mutations and polymorphisms in SLC23A1 and help us better understand the contribution of SVCT1 to vitamin C metabolism in health and disease.
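
The single Boltzmann description of the pre-steady-state charge movement can be written down directly. The sketch below uses the rounded values quoted above (Qmax ≈ 25 nC, V0.5 ≈ -9 mV, apparent valence ≈ -1) in a standard Boltzmann form whose sign convention may differ from the paper's.

    import numpy as np

    F, R, T = 96485.0, 8.314, 293.0          # C/mol, J/(mol K), K
    Qmax, V_half, z = 25e-9, -9e-3, -1.0     # ~25 nC, -9 mV, apparent valence ~ -1

    def boltzmann_charge(v):
        """Charge moved (C) at membrane potential v (V): Q(V) = Qmax / (1 + exp(zF(V - V0.5)/RT))."""
        return Qmax / (1.0 + np.exp(z * F * (v - V_half) / (R * T)))

    v = np.linspace(-0.15, 0.10, 6)          # -150 mV to +100 mV
    for vi, q in zip(v, boltzmann_charge(v)):
        print(f"{vi*1e3:6.0f} mV -> {q*1e9:5.1f} nC")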