36 results for approximation error


Relevance: 20.00%

Publisher:

Abstract:

Summary points:

- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or an outcome variable (disease).
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null.
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant.
- Increasing the sample size will help minimise the impact of measurement error in an outcome variable, but will only make estimates more precisely wrong when the error is in an exposure variable.
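A minimal simulation sketch (not from the article; sample sizes and variances are illustrative) of the first three points: classical error added to the exposure attenuates the fitted slope, while the same error added to the outcome leaves the slope unbiased but noisier.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_slope, n_sims = 500, 1.0, 2000
slopes_err_in_x, slopes_err_in_y = [], []

for _ in range(n_sims):
    x = rng.normal(0.0, 1.0, n)                      # true exposure
    y = true_slope * x + rng.normal(0.0, 1.0, n)     # true outcome
    err = rng.normal(0.0, 1.0, n)                    # random measurement error

    # Error in the exposure: regress y on the mismeasured exposure (x + err)
    slopes_err_in_x.append(np.polyfit(x + err, y, 1)[0])
    # Error in the outcome: regress the mismeasured outcome (y + err) on x
    slopes_err_in_y.append(np.polyfit(x, y + err, 1)[0])

print("error in exposure: mean slope %.2f (biased towards 0), SD %.3f"
      % (np.mean(slopes_err_in_x), np.std(slopes_err_in_x)))
print("error in outcome:  mean slope %.2f (unbiased, but noisier), SD %.3f"
      % (np.mean(slopes_err_in_y), np.std(slopes_err_in_y)))
```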

Relevance: 20.00%

Publisher:

Abstract:

A new strategy for incremental building of multilayer feedforward neural networks is proposed in the context of approximating functions from R^p to R^q using noisy data. A stopping criterion based on the properties of the noise is also proposed. Experiments on both artificial and real data are performed, and two alternatives of the proposed construction strategy are compared.
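A hedged sketch of one possible incremental construction strategy under stated assumptions (known noise variance, a single growing hidden layer); it is not necessarily the authors' algorithm, and the target function is invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
noise_var = 0.05                                    # assumed known property of the noise
X = rng.uniform(-1.0, 1.0, size=(400, 2))           # inputs in R^p (here p = 2)
y = np.sin(np.pi * X[:, 0]) * X[:, 1] + rng.normal(0.0, noise_var ** 0.5, 400)

for n_hidden in range(1, 31):                       # grow the network one unit at a time
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=2000, random_state=0)
    net.fit(X, y)
    mse = np.mean((net.predict(X) - y) ** 2)
    print(f"{n_hidden:2d} hidden units: training MSE = {mse:.4f}")
    if mse <= noise_var:                            # noise-based stopping criterion
        print("stopping: residual error has reached the noise level")
        break
```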

Relevance: 20.00%

Publisher:

Abstract:

An epidemic model is formulated by a reaction-diffusion system where the spatial pattern formation is driven by cross-diffusion. The reaction terms describe the local dynamics of susceptible and infected species, whereas the diffusion terms account for the spatial distribution dynamics. For both self-diffusion and cross-diffusion, nonlinear constitutive assumptions are suggested. To simulate the pattern formation two finite volume formulations are proposed, which employ a conservative and a non-conservative discretization, respectively. An efficient simulation is obtained by a fully adaptive multiresolution strategy. Numerical examples illustrate the impact of the cross-diffusion on the pattern formation.
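A minimal sketch of a conservative finite-volume update for a 1-D susceptible/infected system with an assumed cross-diffusion term; the constant coefficients, reaction terms, and 1-D periodic grid are illustrative simplifications of the nonlinear constitutive assumptions and the 2-D adaptive multiresolution scheme described above.

```python
import numpy as np

def fv_step(S, I, dx, dt, dS=1.0, dI=0.1, dSI=0.5, beta=2.0, gamma=1.0):
    """One explicit, conservative finite-volume update on a periodic 1-D grid."""
    def flux(u, D):
        # diffusive flux -D * du/dx evaluated at the right face of each cell
        return -D * (np.roll(u, -1) - u) / dx

    # self-diffusion of both species, plus cross-diffusion of S driven by gradients of I
    FS = flux(S, dS) + flux(I, dSI)
    FI = flux(I, dI)

    # local SI dynamics (assumed reaction terms)
    react_S = -beta * S * I
    react_I = beta * S * I - gamma * I

    # conservative update: each cell changes by the net flux across its two faces
    S_new = S - dt / dx * (FS - np.roll(FS, 1)) + dt * react_S
    I_new = I - dt / dx * (FI - np.roll(FI, 1)) + dt * react_I
    return S_new, I_new

# usage: 200 cells, a small infected bump in the middle of a susceptible field
x = np.linspace(0.0, 1.0, 200, endpoint=False)
S, I = np.ones_like(x), 0.1 * np.exp(-200.0 * (x - 0.5) ** 2)
for _ in range(500):
    S, I = fv_step(S, I, dx=x[1] - x[0], dt=5e-6)
```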

Relevance: 20.00%

Publisher:

Abstract:

Optimal behavior relies on flexible adaptation to environmental requirements, notably based on the detection of errors. The impact of error detection on subsequent behavior typically manifests as a slowing of reaction times (RTs) following errors. Precisely how errors impact the processing of subsequent stimuli and in turn shape behavior remains unresolved. To address these questions, we used an auditory spatial go/no-go task in which continual feedback informed participants of whether they were too slow. We contrasted auditory-evoked potentials to left-lateralized go and right no-go stimuli as a function of performance on the preceding go stimuli, generating a 2 × 2 design with "preceding performance" (fast hit [FH], slow hit [SH]) and stimulus type (go, no-go) as within-subject factors. SHs were followed by SHs on subsequent trials more often than FHs were, supporting our assumption that SHs engaged effects similar to errors. Electrophysiologically, auditory-evoked potentials modulated topographically as a function of preceding performance 80-110 msec post-stimulus onset and then as a function of stimulus type at 110-140 msec, indicative of changes in the underlying brain networks. Source estimations revealed stronger activity of prefrontal regions to stimuli after successful than after error trials, followed by a stronger response of parietal areas to no-go than to go stimuli. We interpret these results in terms of a shift from a fast, automatic to a slow, controlled form of inhibitory control induced by the detection of errors, manifesting during low-level integration of task-relevant features of subsequent stimuli, which in turn influences response speed.

Relevance: 20.00%

Publisher:

Abstract:

In groundwater applications, Monte Carlo methods are employed to model the uncertainty on geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and a large number of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; then, the uncertainty is estimated from the exact responses that are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations that are considered to estimate the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised that both employ the difference between approximate and exact medoid solutions, but differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which the single realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the MsFV results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques to select a subset of realizations.
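A hedged sketch of the Local Error Model idea in a simplified setting: realizations are clustered on their approximate responses, the exact solver is run only for the medoid of each cluster, and each medoid's error corrects all members of its cluster. The k-means clustering, helper name, and array shapes are assumptions for illustration, not the DKM implementation itself.

```python
import numpy as np
from sklearn.cluster import KMeans

def local_error_correction(approx, exact_solver, n_clusters=5, seed=0):
    """approx:       (n_realizations, n_times) approximate responses, e.g. breakthrough curves
    exact_solver:    callable returning the exact response for a given realization index"""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(approx)

    corrected = approx.copy()
    for c in range(n_clusters):
        members = np.where(km.labels_ == c)[0]
        # medoid: the member whose approximate response is closest to the cluster centre
        medoid = members[np.argmin(
            np.linalg.norm(approx[members] - km.cluster_centers_[c], axis=1))]
        error = exact_solver(medoid) - approx[medoid]   # exact model run only for the medoid
        corrected[members] += error                     # intra-cluster bias correction
    return corrected
```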

Relevance: 20.00%

Publisher:

Abstract:

The multiscale finite-volume (MSFV) method is designed to reduce the computational cost of elliptic and parabolic problems with highly heterogeneous anisotropic coefficients. The reduction is achieved by splitting the original global problem into a set of local problems (with approximate local boundary conditions) coupled by a coarse global problem. It has been shown recently that the numerical errors in MSFV results can be reduced systematically with an iterative procedure that provides a conservative velocity field after any iteration step. The iterative MSFV (i-MSFV) method can be obtained with an improved (smoothed) multiscale solution to enhance the localization conditions, with a Krylov subspace method [e.g., the generalized-minimal-residual (GMRES) algorithm] preconditioned by the MSFV system, or with a combination of both. In a multiphase-flow system, a balance between accuracy and computational efficiency should be achieved by finding a minimum number of i-MSFV iterations (on pressure), which is necessary to achieve the desired accuracy in the saturation solution. In this work, we extend the i-MSFV method to sequential implicit simulation of time-dependent problems. To control the error of the coupled saturation/pressure system, we analyze the transport error caused by an approximate velocity field. We then propose an error-control strategy on the basis of the residual of the pressure equation. At the beginning of simulation, the pressure solution is iterated until a specified accuracy is achieved. To minimize the number of iterations in a multiphase-flow problem, the solution at the previous timestep is used to improve the localization assumption at the current timestep. Additional iterations are used only when the residual becomes larger than a specified threshold value. Numerical results show that only a few iterations on average are necessary to improve the MSFV results significantly, even for very challenging problems. Therefore, the proposed adaptive strategy yields efficient and accurate simulation of multiphase flow in heterogeneous porous media.
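A schematic sketch of the adaptive error-control loop described above, with the i-MSFV iteration and the pressure-residual evaluation left as assumed callables (the actual multiscale operators are not reproduced here): the previous timestep's pressure is reused as the localization guess and extra iterations are spent only while the residual exceeds the threshold.

```python
import numpy as np

def adaptive_imsfv_timestep(p_prev, msfv_iterate, residual_norm, tol=1e-3, max_iter=50):
    """p_prev:        pressure from the previous timestep (reused as localization guess)
    msfv_iterate:     callable performing one i-MSFV iteration, p -> p_new
    residual_norm:    callable returning the norm of the pressure-equation residual"""
    p = np.array(p_prev, copy=True)
    iterations = 0
    while residual_norm(p) > tol and iterations < max_iter:
        p = msfv_iterate(p)          # one smoothing / GMRES-preconditioned iteration
        iterations += 1              # extra work only while the residual is too large
    return p, iterations
```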

Relevance: 20.00%

Publisher:

Abstract:

When researchers introduce a new test they have to demonstrate that it is valid, using unbiased designs and suitable statistical procedures. In this article we use Monte Carlo analyses to highlight how incorrect statistical procedures (i.e., stepwise regression, extreme scores analyses) or ignoring regression assumptions (e.g., heteroscedasticity) contribute to wrong validity estimates. Beyond these demonstrations, and as an example, we re-examined the results reported by Warwick, Nettelbeck, and Ward (2010) concerning the validity of the Ability Emotional Intelligence Measure (AEIM). Warwick et al. used the wrong statistical procedures to conclude that the AEIM was incrementally valid beyond intelligence and personality traits in predicting various outcomes. In our re-analysis, we found that the reliability-corrected multiple correlation of their measures with personality and intelligence was up to .69. Using robust statistical procedures and appropriate controls, we also found that the AEIM did not predict incremental variance in GPA, stress, loneliness, or well-being, demonstrating the importance of testing validity rather than merely looking for it.
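A small Monte Carlo sketch (ours, not the authors' re-analysis) of one of the pitfalls mentioned above: forward stepwise selection applied to purely random predictors still retains "significant" variables and reports a non-trivial R-squared.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_predictors = 50, 30
X = rng.normal(size=(n, n_predictors))    # purely random "predictors"
y = rng.normal(size=n)                    # outcome unrelated to all of them

def fit_rss(columns):
    """Least-squares fit of y on the given design matrix; returns the residual sum of squares."""
    beta, *_ = np.linalg.lstsq(columns, y, rcond=None)
    return np.sum((y - columns @ beta) ** 2)

selected = []
while True:
    base = np.column_stack([np.ones(n)] + [X[:, k] for k in selected])
    best_p, best_j = 1.0, None
    for j in range(n_predictors):
        if j in selected:
            continue
        full = np.column_stack([base, X[:, j]])
        rss0, rss1 = fit_rss(base), fit_rss(full)
        # partial F-test for adding predictor j to the current model
        f = (rss0 - rss1) / (rss1 / (n - full.shape[1]))
        p = 1.0 - stats.f.cdf(f, 1, n - full.shape[1])
        if p < best_p:
            best_p, best_j = p, j
    if best_j is not None and best_p < 0.05:   # classic stepwise entry criterion
        selected.append(best_j)
    else:
        break

final = np.column_stack([np.ones(n)] + [X[:, k] for k in selected])
r2 = 1.0 - fit_rss(final) / np.sum((y - y.mean()) ** 2)
print(f"retained {len(selected)} 'significant' noise predictors, R^2 = {r2:.2f}")
```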

Relevance: 20.00%

Publisher:

Abstract:

This study aimed to use a plantar pressure insole for estimating the three-dimensional ground reaction force (GRF) as well as the frictional torque (T(F)) during walking. Eleven subjects, six healthy and five patients with ankle disease, participated in the study, wearing pressure insoles during several walking trials on a force-plate. The plantar pressure distribution was analyzed, and 10 principal components of 24 regional pressure values together with the stance time percentage (STP) were considered for GRF and T(F) estimation. Both linear and non-linear approximators were used for estimating the GRF and T(F), based on two learning strategies using intra-subject and inter-subject data. The RMS error and the correlation coefficient between the estimated and the actual patterns obtained from the force-plate were calculated. Our results showed better performance for the non-linear approximation, especially when the STP was included as an input. The lowest errors were observed for the vertical force (4%) and the anterior-posterior force (7.3%), while the medial-lateral force (11.3%) and frictional torque (14.7%) had higher errors. The results obtained for the patients showed higher errors; nevertheless, when data from the same patient were used for learning, the results improved and, in general, only slight differences from healthy subjects were observed. In conclusion, this study showed that an ambulatory pressure insole with data normalization, an optimal choice of inputs, and a well-trained nonlinear mapping function can efficiently estimate the three-dimensional ground reaction force and frictional torque over consecutive gait cycles without requiring a force-plate.
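A hedged sketch of the estimation pipeline with synthetic placeholder data and assumed array shapes: 24 regional pressure values are reduced to 10 principal components, the stance time percentage is appended, and a nonlinear approximator maps the inputs to the three GRF components.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples = 2000
pressures = rng.random((n_samples, 24))          # 24 regional pressure values (placeholder)
stp = rng.random((n_samples, 1))                 # stance time percentage, 0-1 (placeholder)
grf = rng.normal(size=(n_samples, 3))            # force-plate GRF components (placeholder)

pca = PCA(n_components=10).fit(pressures)        # 10 principal components of the pressures
inputs = np.hstack([pca.transform(pressures), stp])

model = MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000, random_state=0)
model.fit(inputs[:1500], grf[:1500])             # intra-subject style training split

pred = model.predict(inputs[1500:])
rms = np.sqrt(np.mean((pred - grf[1500:]) ** 2, axis=0))
print("RMS error per GRF component:", rms)
```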

Relevance: 20.00%

Publisher:

Abstract:

Acoustic waveform inversions are an increasingly popular tool for extracting subsurface information from seismic data. They are computationally much more efficient than elastic inversions. Naturally, an inherent disadvantage is that any elastic effects present in the recorded data are ignored in acoustic inversions. We investigate the extent to which elastic effects influence seismic crosshole data. Our numerical modeling studies reveal that in the presence of high contrast interfaces, at which P-to-S conversions occur, elastic effects can dominate the seismic sections, even for experiments involving pressure sources and pressure receivers. Comparisons of waveform inversion results using a purely acoustic algorithm on synthetic data that is either acoustic or elastic, show that subsurface models comprising small low-to-medium contrast (≤30%) structures can be successfully resolved in the acoustic approximation. However, in the presence of extended high-contrast anomalous bodies, P-to-S-conversions may substantially degrade the quality of the tomographic images. In particular, extended low-velocity zones are difficult to image. Likewise, relatively small low-velocity features are unresolved, even when advanced a priori information is included. One option for mitigating elastic effects is data windowing, which suppresses later arriving seismic arrivals, such as shear waves. Our tests of this approach found it to be inappropriate because elastic effects are also included in earlier arriving wavetrains. Furthermore, data windowing removes later arriving P-wave phases that may provide critical constraints on the tomograms. Finally, we investigated the extent to which acoustic inversions of elastic data are useful for time-lapse analyses of high contrast engineered structures, for which accurate reconstruction of the subsurface structure is not as critical as imaging differential changes between sequential experiments. Based on a realistic scenario for monitoring a radioactive waste repository, we demonstrated that acoustic inversions of elastic data yield substantial distortions of the tomograms and also unreliable information on trends in the velocity changes.

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE: To review, retrospectively, the possible causes of sub- or intertrochanteric fractures after screw fixation of intracapsular fractures of the proximal femur. METHODS: Eighty-four patients with an intracapsular fracture of the proximal femur were operated on between 1995 and 1998 using three cannulated 6.25 mm screws. The screws were inserted in a triangular configuration, one screw in the upper part of the femoral neck and two screws in the inferior part. Between 1999 and 2001, we used two screws proximally and one screw distally. RESULTS: In the first series, two patients died within one week after the operation. Sixty-four fractures healed without problems. Four patients developed an atrophic non-union; avascular necrosis of the femoral head was found in 11 patients. Three patients (3.6%) suffered a sub- and/or intertrochanteric fracture after a mean postoperative time of 30 days, in one case without obvious trauma. In all three cases surgical revision was necessary. Between 1999 and 2001 we did not observe any fracture after screw fixation. CONCLUSION: Two screws in the inferior part of the femoral neck create a stress riser in the subtrochanteric region, potentially inducing a fracture in the weakened bone. For internal fixation of a proximal intracapsular femoral fracture, only one screw should be inserted in the inferior part of the neck.

Relevance: 20.00%

Publisher:

Abstract:

Real-time glycemia is a cornerstone of metabolic research, particularly when performing oral glucose tolerance tests (OGTT) or glucose clamps. From 1965 to 2009, the gold-standard device for real-time plasma glucose assessment was the Beckman glucose analyzer 2 (Beckman Instruments, Fullerton, CA), whose technology couples a glucose oxidase enzymatic assay with oxygen sensors. Since its discontinuation in 2009, researchers have been left with few choices that utilize glucose oxidase technology. The first is the YSI 2300 (Yellow Springs Instruments Corp., Yellow Springs, OH), known to be as accurate as the Beckman(1). The YSI has been used extensively in clinical research studies and is used to validate other glucose monitoring devices(2). The major drawback of the YSI is that it is relatively slow and requires high maintenance. The Analox GM9 (Analox Instruments, London), more recent and faster, is increasingly used in clinical research(3) as well as in basic sciences(4) (e.g. 23 papers in Diabetes and 21 in Diabetologia).

Relevance: 20.00%

Publisher:

Abstract:

Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
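A hedged sketch of the proposed workflow in a simplified form, with ordinary PCA standing in for FPCA on discretized curves and a linear map as the machine-learning step; the function and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def fit_error_model(proxy_curves, exact_curves, n_components=4):
    """proxy_curves, exact_curves: (n_learning, n_times) responses on the learning set,
    for which both the proxy and the exact solver have been run."""
    pca_proxy = PCA(n_components=n_components).fit(proxy_curves)
    pca_exact = PCA(n_components=n_components).fit(exact_curves)

    # machine-learning step: map proxy scores to exact scores on the learning set
    reg = LinearRegression().fit(pca_proxy.transform(proxy_curves),
                                 pca_exact.transform(exact_curves))

    def predict_exact(new_proxy_curves):
        # predict the exact response of any new realization from its proxy curve alone
        scores = reg.predict(pca_proxy.transform(new_proxy_curves))
        return pca_exact.inverse_transform(scores)   # back from scores to curves

    return predict_exact
```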