817 results for Tracking error
Abstract:
In groundwater applications, Monte Carlo methods are employed to model the uncertainty in geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and a large number of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered to estimate the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised; both employ the difference between approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which a single realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the multiscale finite-volume (MsFV) results.
The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques to select a subset of realizations.
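The two error models can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' code: it assumes scalar approximate responses `y_approx`, cluster `labels` and medoid indices `medoid_idx` produced by a DKM-style clustering, and exact responses `y_exact_medoids` computed at the medoids only.

```python
import numpy as np

def dkm_local_error_model(y_approx, labels, medoid_idx, y_exact_medoids):
    """Local Error Model: every realization inherits the error of its
    own cluster's medoid (exact minus approximate medoid response)."""
    errors = y_exact_medoids - y_approx[medoid_idx]
    return y_approx + errors[labels]

def dkm_global_error_model(y_approx, medoid_idx, y_exact_medoids):
    """Global Error Model: medoid errors are interpolated linearly in the
    approximate-response coordinate, regardless of cluster membership."""
    errors = y_exact_medoids - y_approx[medoid_idx]
    order = np.argsort(y_approx[medoid_idx])        # sort medoids for interp
    x = y_approx[medoid_idx][order]
    e = errors[order]
    return y_approx + np.interp(y_approx, x, e)     # clamped outside medoid range
```

If the approximate model carries a constant bias, both corrections recover the exact responses at every realization, not just at the medoids.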
Abstract:
One of the key challenges in the field of nanoparticle (NP) analysis is producing reliable and reproducible characterisation data for nanomaterials. This study examines reproducibility using a relatively new, but rapidly adopted, technique, Nanoparticle Tracking Analysis (NTA), on a range of particle sizes and materials in several different media. It describes the protocol development and presents both the data and the analysis of results obtained from 12 laboratories, mostly based in Europe, that are primarily QualityNano members. QualityNano is an EU FP7-funded Research Infrastructure that integrates 28 European analytical and experimental facilities in nanotechnology, medicine and natural sciences with the goal of developing and implementing best practice and quality in all aspects of nanosafety assessment. This study looks at both the development of the protocol and how it leads to highly reproducible results amongst participants. In this study, the parameter being measured is the modal particle size.
Abstract:
The multiscale finite-volume (MSFV) method is designed to reduce the computational cost of elliptic and parabolic problems with highly heterogeneous anisotropic coefficients. The reduction is achieved by splitting the original global problem into a set of local problems (with approximate local boundary conditions) coupled by a coarse global problem. It has been shown recently that the numerical errors in MSFV results can be reduced systematically with an iterative procedure that provides a conservative velocity field after any iteration step. The iterative MSFV (i-MSFV) method can be obtained with an improved (smoothed) multiscale solution to enhance the localization conditions, with a Krylov subspace method [e.g., the generalized-minimal-residual (GMRES) algorithm] preconditioned by the MSFV system, or with a combination of both. In a multiphase-flow system, a balance between accuracy and computational efficiency should be achieved by finding a minimum number of i-MSFV iterations (on pressure), which is necessary to achieve the desired accuracy in the saturation solution. In this work, we extend the i-MSFV method to sequential implicit simulation of time-dependent problems. To control the error of the coupled saturation/pressure system, we analyze the transport error caused by an approximate velocity field. We then propose an error-control strategy on the basis of the residual of the pressure equation. At the beginning of simulation, the pressure solution is iterated until a specified accuracy is achieved. To minimize the number of iterations in a multiphase-flow problem, the solution at the previous timestep is used to improve the localization assumption at the current timestep. Additional iterations are used only when the residual becomes larger than a specified threshold value. Numerical results show that only a few iterations on average are necessary to improve the MSFV results significantly, even for very challenging problems. 
Therefore, the proposed adaptive strategy yields efficient and accurate simulation of multiphase flow in heterogeneous porous media.
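The residual-based error-control loop can be illustrated with a stand-in smoother. The sketch below is not the i-MSFV algorithm itself: it substitutes a damped Jacobi iteration for the multiscale pressure update, but it reproduces the adaptive stopping rule described above (iterate only while the pressure residual exceeds a threshold, and reuse the previous timestep's solution as the initial guess).

```python
import numpy as np

def adaptive_pressure_solve(A, b, p0, tol, max_iter=5000):
    """Iterate on the pressure system only while the residual exceeds tol.

    Damped Jacobi stands in for the i-MSFV pressure update; the
    residual test is the adaptive error-control criterion.
    """
    p = p0.copy()
    D = np.diag(A)                  # diagonal of the pressure matrix
    r = b - A @ p
    it = 0
    while np.linalg.norm(r) > tol and it < max_iter:
        p += 0.8 * r / D            # damped Jacobi correction
        r = b - A @ p
        it += 1
    return p, it
```

Because the previous solution is passed in as `p0`, a timestep whose initial residual is already below the threshold costs zero iterations, which is how the strategy keeps the average iteration count low.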
Abstract:
In this paper we present a new method to track bone movements in stereoscopic X-ray image series of the knee joint. The method is based on two different X-ray image sets: a rotational series of acquisitions of the still subject knee that will allow the tomographic reconstruction of the three-dimensional volume (model), and a stereoscopic image series of orthogonal projections as the subject performs movements. Tracking the movements of bones throughout the stereoscopic image series means to determine, for each frame, the best pose of every moving element (bone) previously identified in the 3D reconstructed model. The quality of a pose is reflected in the similarity between its simulated projections and the actual radiographs. We use direct Fourier reconstruction to approximate the three-dimensional volume of the knee joint. Then, to avoid the expensive computation of digitally rendered radiographs (DRR) for pose recovery, we reformulate the tracking problem in the Fourier domain. Under the hypothesis of parallel X-ray beams, we use the central-slice-projection theorem to replace the heavy 2D-to-3D registration of projections in the signal domain by efficient slice-to-volume registration in the Fourier domain. Focusing on rotational movements, the translation-relevant phase information can be discarded and we only consider scalar Fourier amplitudes. The core of our motion tracking algorithm can be implemented as a classical frame-wise slice-to-volume registration task. Preliminary results on both synthetic and real images confirm the validity of our approach.
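The central-slice (projection-slice) theorem that underlies this reformulation is easy to verify numerically in the axis-aligned, parallel-beam case: the 1-D Fourier transform of a projection equals the corresponding central line of the object's 2-D Fourier transform. The snippet below uses a random image as a stand-in for a reconstructed slice; rotated projections would instead sample an oblique central line, which is what makes the slice-to-volume registration work.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))        # stand-in for one tomographic slice

# Parallel-beam projection along the vertical axis (one simulated radiograph line)
proj = img.sum(axis=0)

# Central-slice theorem: 1-D FFT of the projection == central row of the 2-D FFT
assert np.allclose(np.fft.fft(proj), np.fft.fft2(img)[0, :])
```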
Abstract:
Neuronal oscillations are an important aspect of EEG recordings. These oscillations are thought to be involved in several cognitive mechanisms. For instance, oscillatory activity is considered a key component of the top-down control of perception. However, measuring this activity and its influence requires precise extraction of frequency components. This processing is not straightforward. In particular, difficulties in extracting oscillations arise from their time-varying characteristics. Moreover, when phase information is needed, it is of the utmost importance to extract narrow-band signals. This paper presents a novel method using adaptive filters for tracking and extracting these time-varying oscillations. The scheme is designed to maximize the oscillatory behavior at the output of the adaptive filter. It is thus capable of tracking an oscillation and describing its temporal evolution even during low-amplitude time segments. Moreover, the method can be extended to track several oscillations simultaneously and to use multiple signals. These two extensions are particularly relevant in the framework of EEG data processing, where oscillations are active at the same time in different frequency bands and signals are recorded with multiple sensors. The presented tracking scheme is first tested with synthetic signals in order to highlight its capabilities. It is then applied to data recorded during a visual shape discrimination experiment to assess its usefulness during EEG processing and in detecting functionally relevant changes. This method is an interesting additional processing step, providing information complementary to classical time-frequency analyses and improving the detection and analysis of cross-frequency couplings.
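A classical way to realize this kind of adaptive oscillation tracker is an LMS adaptive line enhancer, which predicts the current sample from delayed past samples so that only narrow-band (oscillatory) content survives at the filter output. The sketch below is a generic illustration of that idea, not the authors' specific scheme; the filter order, delay, step size, and synthetic signal are all arbitrary choices.

```python
import numpy as np

def adaptive_line_enhancer(x, order=16, delay=1, mu=0.01):
    """LMS adaptive line enhancer: predict x[n] from delayed past samples.

    Broadband noise is unpredictable across the delay, so the filter
    output retains mostly the narrow-band oscillatory component.
    """
    w = np.zeros(order)
    y = np.zeros_like(x)
    for n in range(order + delay, len(x)):
        u = x[n - delay - order:n - delay][::-1]   # delayed regressor
        y[n] = w @ u                               # enhanced (predicted) sample
        e = x[n] - y[n]                            # prediction error
        w += 2 * mu * e * u                        # LMS weight update
    return y

# Demo on a synthetic noisy oscillation (arbitrary parameters)
rng = np.random.default_rng(0)
n = np.arange(4000)
clean = np.sin(2 * np.pi * 0.05 * n)
noisy = clean + 0.5 * rng.standard_normal(n.size)
enhanced = adaptive_line_enhancer(noisy)
```

After convergence the output spectrum is dominated by the tracked oscillation, even though the filter never sees the clean signal.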
Abstract:
Graph produced by the Office of Drug Control Policy showing the tracking of Meth Labs in Iowa from 2008 to 2010.
Abstract:
Pseudoephedrine (PSE) is a common medicine used to treat colds and allergies. It is also a common ingredient, or precursor, used to manufacture methamphetamine, an illegal Schedule II drug under Iowa law. Prior to 2005, pseudoephedrine could be purchased over-the-counter, in any amount. Since PSE is the one ingredient needed in all methods of meth manufacturing, it was readily available to meth cooks.
Abstract:
Three-dimensional imaging and quantification of myocardial function are essential steps in the evaluation of cardiac disease. We propose a tagged magnetic resonance imaging methodology called zHARP that encodes and automatically tracks myocardial displacement in three dimensions. Unlike other motion encoding techniques, zHARP encodes both in-plane and through-plane motion in a single image plane without affecting the acquisition speed. Postprocessing unravels this encoding in order to directly track the 3-D displacement of every point within the image plane throughout an entire image sequence. Experimental results include a phantom validation experiment, which compares zHARP to phase contrast imaging, and an in vivo study of a normal human volunteer. Results demonstrate that the simultaneous extraction of in-plane and through-plane displacements from tagged images is feasible.
Abstract:
We present a heuristic method for learning error-correcting output code (ECOC) matrices based on a hierarchical partition of the class space that maximizes a discriminative criterion. To achieve this goal, optimal codeword separation is sacrificed in favor of maximum class discrimination within the partitions. The hierarchical partition set is created using a binary tree. As a result, a compact matrix with high discrimination power is obtained. Our method is validated using the UCI database and applied to a real problem, the classification of traffic sign images.
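ECOC decoding itself reduces to a nearest-codeword search over the matrix rows. The sketch below is a generic illustration, not the paper's learned matrices: it uses the exhaustive 7-column code for four classes (one column per non-trivial binary partition), whose minimum pairwise Hamming distance of 4 lets it absorb one erroneous binary-classifier output.

```python
import numpy as np

def ecoc_decode(codebook, predictions):
    """Assign each sample the class whose codeword is nearest in
    Hamming distance to its vector of binary-classifier outputs."""
    # predictions: (n_samples, n_binary_problems) with entries in {-1, +1}
    d = np.sum(codebook[None, :, :] != predictions[:, None, :], axis=2)
    return np.argmin(d, axis=1)

# Exhaustive code for 4 classes: columns are all 7 non-trivial binary splits;
# any two rows differ in 4 positions, so single bit errors are corrected.
codebook = np.array([
    [+1, -1, -1, -1, +1, +1, +1],   # class 0
    [-1, +1, -1, -1, +1, -1, -1],   # class 1
    [-1, -1, +1, -1, -1, +1, -1],   # class 2
    [-1, -1, -1, +1, -1, -1, +1],   # class 3
])
```

A learned hierarchical-partition matrix would simply replace `codebook`; the decoding step is unchanged.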
Abstract:
A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOCs). Given a multiclass problem, the ECOC technique designs a code word for each class, where each position of the code identifies the membership of the class in a given binary problem. A classification decision is obtained by assigning the label of the class with the closest code. One of the main requirements of the ECOC design is that the base classifier be capable of splitting each subgroup of classes in each binary problem. However, we cannot guarantee that a linear classifier can model convex regions. Furthermore, nonlinear classifiers also fail to manage some types of surfaces. In this paper, we present a novel strategy to model multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields better performance when class overlap or the distribution of the training objects conceals the decision boundaries for the base classifier. The results are even more significant when one has a sufficiently large training set.
Non-contact assessment of waist circumference: will tape measurements become progressively obsolete?
Abstract:
Waist circumference (WC) is a key variable to assess in health management, as it is a proxy of abdominal fat mass and a surrogate marker of cardiometabolic disease risk, including the metabolic syndrome. Recently, a portable non-contact device calculating WC (ViScan) has been developed, which allows the tracking of WC independently of inter-investigator error. We compared WC values obtained with this device with WC measured by a simple non-stretchable tape in 74 adults of varying body mass indices (range 17-39 kg/m²). The correlation between the two methods was very high (r=0.97, P<0.0001) and the reproducibility (precision) assessed with a rigid phantom was excellent (<1 cm, coefficient of variation <1%). The instrument constitutes a potentially valuable tool for longitudinal surveys and comparative international studies, which require simple but precise measurements of WC in order to track the effect of subtle changes on various health outcomes.
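The agreement statistics quoted above (Pearson r and the coefficient of variation) are straightforward to compute. The helper functions below are a generic illustration, not the study's data or code.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def coefficient_of_variation(x):
    """Coefficient of variation in percent (population SD / mean * 100)."""
    n = len(x)
    m = sum(x) / n
    sd = math.sqrt(sum((a - m) ** 2 for a in x) / n)
    return sd / m * 100.0
```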
Abstract:
DOC Research Director Lettie Prell recently compiled the calendar year 2012 data for offender releases from prison to community supervision in Iowa. Analyses such as these help the Iowa Corrections system identify where reentry resources are most needed, what offender programming is most in demand, and which culturally-sensitive supervision and culturally-specific programming is prescribed.
Abstract:
The Data Warehouse Replacement Project includes updated County Health Snapshot reports. The snapshots provide an overview of key health indicators for local communities. They contain county-level measures organized into eight categories: asthma, cancer, health behaviors and outcomes, heart disease and other chronic conditions, infectious disease, mortality and injury prevention, population statistics, and reproductive outcomes. The updated county health snapshots include almost forty new chronic disease indicators in addition to all of the indicators in the existing snapshots. There will be two different reports available in the Data Warehouse replacement system. One of the reports is a multi-year county health snapshot. This report has a format similar to the previous county health snapshot report in the current Data Warehouse. It will display multiple years of data for a single county: the most current year available and the two years prior. The state values for the current year will also be included.