998 results for SPECKLE MODEL ESTIMATOR


Relevance:

100.00%

Publisher:

Abstract:

Surgical treatment of abdominal aortic aneurysms is increasingly being replaced by endovascular aneurysm repair (EVAR) using stent-grafts (SGs). However, the efficacy of this less invasive approach is compromised by the incidence of persistent flow into the aneurysm, known as endoleak, which can lead to aneurysm rupture if undetected. Long-term annual surveillance by computed tomography (CT) is therefore required, which increases the cost of the EVAR procedure and exposes the patient to ionizing radiation and a nephrotoxic contrast agent. The mechanism of aneurysm rupture secondary to endoleak is related to an aneurysm sac pressure close to systemic pressure. There is a relationship between sac shrinkage or expansion and sac pressurization. Residual pressurization of the abdominal aortic aneurysm induces pulsation and blood circulation inside the sac, preventing sac thrombosis and aneurysm healing. Non-invasive vascular elastography (NIVE) using the Lagrangian Speckle Model Estimator (LSME) could become a complementary imaging technique for following aneurysms after endovascular repair. NIVE can provide important information on the organization of the thrombus in the aneurysm sac and on the detection of endoleaks. Characterizing thrombus organization was not possible in a previous NIVE study, one limitation of which was the absence of CT examination as a gold standard for the diagnosis of endoleaks.
We sought to apply and optimize the NIVE technique for the follow-up of abdominal aortic aneurysms (AAA) after EVAR with stent-grafts in a canine model, with the aim of detecting and characterizing endoleaks and thrombus organization. SGs were implanted in a group of 18 dogs with an aneurysm created in the abdominal aorta. Type I endoleaks were created in 4 aneurysms and type II endoleaks in 13 aneurysms, while one aneurysm had no endoleak. Doppler ultrasound (DUS) and NIVE examinations were performed before and then at 1 week, 1 month, 3 months and 6 months after EVAR. Angiography, CT and macroscopic sectioning were performed at the time of sacrifice. Strain values were computed using the LSME algorithm. Regions of endoleak, fresh (non-organized) thrombus and solid (organized) thrombus were identified and segmented by comparing the CT and macroscopic findings. Strain values in areas of endoleak, fresh thrombus and organized thrombus were compared. Strain values were significantly different between endoleak areas and areas of fresh or organized thrombus, and between areas of fresh and organized thrombus. All endoleaks were clearly characterized by the elastography examinations. No correlation was found between strain values and endoleak type, sac pressure, endoleak size or aneurysm size.

Relevance:

100.00%

Publisher:

Abstract:

Estimation of a population size by means of capture-recapture techniques is an important problem occurring in many areas of the life and social sciences. We consider the frequencies-of-frequencies situation, where a count variable is used to summarize how often a unit has been identified in the target population of interest. The distribution of this count variable is zero-truncated, since zero identifications do not occur in the sample. As an application we consider the surveillance of scrapie in Great Britain. In this case study, holdings with scrapie that are not identified (zero counts) do not enter the surveillance database. The count variable of interest is the number of scrapie cases per holding. For count distributions a common model is the Poisson distribution and, to adjust for potential heterogeneity, a discrete mixture of Poisson distributions is used. Mixtures of Poissons usually provide an excellent fit, as will be demonstrated in the application of interest. However, as has recently been demonstrated, mixtures also suffer from the so-called boundary problem, resulting in overestimation of population size. It is suggested here to select the mixture model on the basis of the Bayesian Information Criterion. This strategy is further refined by employing a bagging procedure leading to a series of estimates of population size. Using the median of this series, highly influential size estimates are avoided. In limited simulation studies it is shown that the procedure leads to estimates with remarkably small bias.
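As a concrete illustration of the frequencies-of-frequencies setup, the basic single-component version of such an estimator fits a zero-truncated Poisson to the observed counts and applies a Horvitz-Thompson correction. This is a minimal sketch with hypothetical counts, not the paper's mixture/BIC/bagging procedure:

```python
import math

def fit_truncated_poisson(counts, tol=1e-10):
    """MLE of the Poisson rate from zero-truncated counts.

    Solves mean(counts) = lam / (1 - exp(-lam)) by bisection;
    the left side exceeds lam, so the root lies below the sample mean."""
    xbar = sum(counts) / len(counts)
    lo, hi = 1e-9, xbar
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid / (1 - math.exp(-mid)) < xbar:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def population_size(counts):
    """Horvitz-Thompson estimate N = n / (1 - P(zero count))."""
    lam = fit_truncated_poisson(counts)
    return len(counts) / (1 - math.exp(-lam))

# hypothetical data: scrapie cases per identified holding
counts = [1, 1, 1, 2, 1, 3, 1, 2, 1, 1]
print(round(population_size(counts), 1))
```

The mixture refinement in the paper replaces the single Poisson with a discrete mixture fitted by EM, but the truncation correction has the same form.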

Relevance:

100.00%

Publisher:

Abstract:

Intravascular ultrasound (IVUS) phantoms are important for calibrating and evaluating many IVUS image-processing tasks. However, phantom generation is never the primary focus of related works; hence it is rarely covered in depth, and it usually relies on more than one platform, which may not be accessible to investigators. Therefore, we present a framework for creating representative IVUS phantoms, for different intraluminal pressures, based on the finite element method and Field II. First, a coronary cross-section model is selected. Second, the coronary regions are identified so that material properties can be applied. Third, the corresponding mesh is generated. Fourth, the intraluminal force is applied and the deformation is computed. Finally, the speckle noise is incorporated. The framework was tested with respect to IVUS contrast, noise and strains. The outcomes are in line with related studies and expected values. Moreover, the framework toolbox is freely accessible and fully implemented in a single platform. (c) 2012 World Federation for Ultrasound in Medicine & Biology.
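The last step, incorporating speckle, can be illustrated with the standard fully developed speckle model: unit-mean multiplicative exponential noise on an intensity image. This is a simplified stand-in for what a simulator such as Field II produces; the tissue map and region values are hypothetical:

```python
import numpy as np

def add_speckle(image, rng=None):
    """Multiply each pixel by unit-mean exponential noise, the classic
    fully developed speckle model for intensity images."""
    rng = np.random.default_rng(0) if rng is None else rng
    return image * rng.exponential(scale=1.0, size=image.shape)

# toy "vessel cross-section" with two echogenicity regions
tissue = np.ones((64, 64))
tissue[16:48, 16:48] = 4.0          # brighter inclusion (e.g., plaque)
speckled = add_speckle(tissue)

# fully developed speckle has contrast (std/mean) close to 1 per region
region = speckled[16:48, 16:48]
print(round(float(region.std() / region.mean()), 2))
```

The std/mean ratio near 1 is the usual sanity check that the noise is consistent with fully developed speckle.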

Relevance:

90.00%

Publisher:

Abstract:

Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of constructing the nonparametric mixture model estimator are reviewed and set into perspective. Construction of the maximum likelihood estimator of the mixing distribution is done for any number of components, up to the global nonparametric maximum likelihood bound, using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered, with some generalisations of Zelterman's estimator. All computations are done with CAMCR, special-purpose software developed for population size estimation with mixture models. Several examples and data sets are discussed and the estimators are illustrated. Problems in using the mixture model-based estimators are highlighted.
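The Chao and Zelterman estimators mentioned above have simple closed forms based on f1 and f2, the numbers of units observed exactly once and exactly twice. A sketch with hypothetical capture counts:

```python
import math
from collections import Counter

def chao_estimate(counts):
    """Chao's lower-bound estimator: N = n + f1^2 / (2 * f2)."""
    f = Counter(counts)
    return len(counts) + f[1] ** 2 / (2 * f[2])

def zelterman_estimate(counts):
    """Zelterman's estimator: lam = 2 * f2 / f1, N = n / (1 - exp(-lam))."""
    f = Counter(counts)
    lam = 2 * f[2] / f[1]
    return len(counts) / (1 - math.exp(-lam))

# hypothetical identification counts for 70 observed units
counts = [1] * 40 + [2] * 20 + [3] * 8 + [4] * 2
print(round(chao_estimate(counts)))       # → 110
print(round(zelterman_estimate(counts)))  # → 111
```

Both use only the low counts, which makes them robust to misspecification of the upper tail; the mixture approach in the abstract instead models the full count distribution.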


Relevance:

80.00%

Publisher:

Abstract:

This thesis presents a novel framework for state estimation in the context of robotic grasping and manipulation. The overall estimation approach is based on fusing various visual cues for manipulator tracking, namely appearance and feature-based, shape-based, and silhouette-based visual cues. Similarly, a framework is developed to fuse the above visual cues, but also kinesthetic cues such as force-torque and tactile measurements, for in-hand object pose estimation. The cues are extracted from multiple sensor modalities and are fused in a variety of Kalman filters.
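The cue fusion can be illustrated with the scalar Kalman measurement update, applied once per cue; the thesis fuses full multi-dimensional visual and kinesthetic cues, and the variances and measurements below are hypothetical:

```python
def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update: fuse the current estimate
    (mean x, variance P) with a measurement z of variance R."""
    K = P / (P + R)                      # Kalman gain
    return x + K * (z - x), (1 - K) * P

# prior belief about one coordinate of an object's pose
x, P = 0.0, 1.0
# sequentially fuse a visual cue (accurate) and a tactile cue (coarser)
x, P = kalman_update(x, P, z=0.9, R=0.1)
x, P = kalman_update(x, P, z=0.5, R=0.5)
print(round(x, 3), round(P, 3))          # → 0.769 0.077
```

Each fused cue shrinks the posterior variance, which is why combining heterogeneous sensor modalities in one filter outperforms any single cue.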

A hybrid estimator is developed to estimate both continuous states (robot and object states) and discrete states, called contact modes, which specify how each finger contacts a particular object surface. A static multiple-model estimator is used to compute and maintain the probability of each contact mode. The thesis also develops an estimation framework for estimating model parameters associated with object grasping. Dual and joint state-parameter estimation is explored for estimating a grasped object's mass and center of mass. Experimental results demonstrate simultaneous object localization and center-of-mass estimation.

Dual-arm estimation is developed for two-arm robotic manipulation tasks. Two types of filters are explored: the first is an augmented filter that contains both arms in the state vector, while the second runs two filters in parallel, one for each arm. The two frameworks and their performance are compared in a dual-arm task of removing a wheel from a hub.

This thesis also presents a new method for action selection involving touch. This next-best-touch method selects the available action expected to gain the most information about the object being interacted with. The algorithm employs information theory to compute an information gain metric based on a probabilistic belief suitable for the task. An estimation framework is used to maintain this belief over time. Kinesthetic measurements, such as contact and tactile measurements, are used to update the state belief after every interactive action. Simulation and experimental results are demonstrated using next best touch for object localization, specifically of a door handle on a door. The next-best-touch theory is then extended to model parameter determination. Since many objects within a particular object category share the same rough shape, principal component analysis may be used to parametrize the object mesh models. These parameters can be estimated using the action selection technique, which selects the touching action that best localizes the object and estimates these parameters. Simulation results are then presented for localizing a screwdriver and determining one of its parameters.
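The information gain metric can be sketched for a discrete belief: for each candidate touch, the expected entropy reduction is computed by averaging the posterior entropy over the possible observations. The two-hypothesis belief and the likelihood tables below are hypothetical stand-ins for the task-specific observation models:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def expected_info_gain(belief, likelihoods):
    """Expected entropy reduction of a belief after one action.

    likelihoods[o][h] = P(observation o | hypothesis h)."""
    h0 = entropy(belief)
    gain = 0.0
    for like in likelihoods:
        p_obs = sum(l * b for l, b in zip(like, belief))
        if p_obs == 0:
            continue
        posterior = [l * b / p_obs for l, b in zip(like, belief)]
        gain += p_obs * (h0 - entropy(posterior))
    return gain

# two candidate touches over a 2-hypothesis belief (hypothetical numbers)
belief = [0.5, 0.5]
touch_a = [[0.9, 0.1], [0.1, 0.9]]   # discriminative contact outcome
touch_b = [[0.6, 0.4], [0.4, 0.6]]   # weakly informative outcome
print(expected_info_gain(belief, touch_a) >
      expected_info_gain(belief, touch_b))   # → True
```

The next best touch is then simply the arg-max of this gain over the available actions.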

Lastly, the next best touch theory is further extended to model classes. Instead of estimating parameters, object class determination is incorporated into the information gain metric calculation. The best touching action is selected in order to best discern between the possible model classes. Simulation results are presented to validate the theory.

Relevance:

80.00%

Publisher:

Abstract:

Mixed models may be defined with or without reference to sampling, and can be used to predict realized random effects, as when estimating the latent values of study subjects measured with response error. When the model is specified without reference to sampling, a simple mixed model includes two random variables, one stemming from an exchangeable distribution of latent values of study subjects and the other, from the study subjects` response error distributions. Positive probabilities are assigned to both potentially realizable responses and artificial responses that are not potentially realizable, resulting in artificial latent values. In contrast, finite population mixed models represent the two-stage process of sampling subjects and measuring their responses, where positive probabilities are only assigned to potentially realizable responses. A comparison of the estimators over the same potentially realizable responses indicates that the optimal linear mixed model estimator (the usual best linear unbiased predictor, BLUP) is often (but not always) more accurate than the comparable finite population mixed model estimator (the FPMM BLUP). We examine a simple example and provide the basis for a broader discussion of the role of conditioning, sampling, and model assumptions in developing inference.
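For the simple mixed model described, the usual BLUP of a subject's latent value shrinks the subject mean toward the grand mean by a reliability ratio. A minimal sketch with hypothetical variance components and means:

```python
def blup(subject_mean, n_obs, grand_mean, var_latent, var_error):
    """BLUP of a subject's latent value in a simple mixed model:
    shrink the subject mean toward the grand mean by the reliability
    ratio lam = var_latent / (var_latent + var_error / n_obs)."""
    lam = var_latent / (var_latent + var_error / n_obs)
    return grand_mean + lam * (subject_mean - grand_mean)

# hypothetical numbers: a subject measured 4 times with response error
print(blup(subject_mean=12.0, n_obs=4, grand_mean=10.0,
           var_latent=3.0, var_error=4.0))   # → 11.5
```

The finite population mixed model estimator (FPMM BLUP) discussed in the abstract differs precisely in how these variance components and the conditioning on the realized sample are defined.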

Relevance:

40.00%

Publisher:

Abstract:

We present a method of image-speckle contrast for measuring, without precalibration, the root-mean-square roughness and the lateral correlation length of random surfaces with Gaussian correlation. We use a simplified model of the speckle fields produced by a weakly scattering object in the theoretical analysis. The explicit mathematical relation shows that the saturation value of the image-speckle contrast at a large aperture radius determines the roughness, while the variation of the contrast with the aperture radius determines the lateral correlation length. In the experiments, we fabricated random surface samples with Gaussian correlation. The square of the image-speckle contrast is measured versus the radius of the aperture in a 4f system, and the roughness and the lateral correlation length are extracted by fitting the theoretical result to the experimental data. Comparison of the measurements with those from an atomic force microscope shows that our method has satisfactory accuracy. (C) 2002 Optical Society of America.
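The fitting step can be sketched with a hypothetical saturating model for the squared contrast versus aperture radius. The functional form below is illustrative only; in the paper, the explicit relation ties the saturation level to the roughness and the rise rate to the correlation length:

```python
import numpy as np

def model(r, c_inf, r0):
    # hypothetical form: squared contrast rises with aperture radius r
    # and saturates at c_inf, with characteristic radius r0
    return c_inf * (1 - np.exp(-(r / r0) ** 2))

# synthetic "measurements" generated from the model with known parameters
rng = np.random.default_rng(1)
r = np.linspace(0.2, 5.0, 25)
data = model(r, 0.8, 1.5) + rng.normal(0, 0.01, r.size)

# brute-force least-squares fit over a parameter grid
grid_c = np.linspace(0.5, 1.1, 61)
grid_r0 = np.linspace(0.5, 3.0, 126)
best = min((float(np.sum((model(r, c, q) - data) ** 2)), c, q)
           for c in grid_c for q in grid_r0)
print(round(float(best[1]), 2), round(float(best[2]), 2))
```

With the true parameters (0.8, 1.5) and small noise, the recovered pair lands at or near the generating values, mirroring how the physical parameters are extracted from measured contrast curves.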

Relevance:

40.00%

Publisher:

Abstract:

This correspondence introduces a new orthogonal forward regression (OFR) model identification algorithm that uses D-optimality for model structure selection and is based on M-estimation of the parameters. The M-estimator is a classical robust parameter estimation technique for handling bad data conditions such as outliers. Computationally, the M-estimator can be derived using an iteratively reweighted least squares (IRLS) algorithm. D-optimality is a model structure robustness criterion from experimental design that tackles ill-conditioning in the model structure. OFR, often based on the modified Gram-Schmidt procedure, is an efficient method that incorporates structure selection and parameter estimation simultaneously. The basic idea of the proposed approach is to incorporate an IRLS inner loop into the modified Gram-Schmidt procedure. In this manner, the OFR algorithm for parsimonious model structure determination is extended to bad data conditions, with improved performance via the derivation of parameter M-estimators with inherent robustness to outliers. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm.
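The IRLS derivation of an M-estimator can be sketched in isolation; this is a generic robust linear regression with Huber weights, not the paper's OFR/Gram-Schmidt integration, and the data are hypothetical:

```python
import numpy as np

def huber_weights(res, k=1.345):
    """Huber M-estimator weights: 1 inside [-k, k], k/|r| outside."""
    a = np.abs(res)
    return np.where(a <= k, 1.0, k / np.maximum(a, 1e-12))

def irls(X, y, iters=50):
    """M-estimation of regression parameters by iteratively reweighted
    least squares with Huber weights on MAD-scaled residuals."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    for _ in range(iters):
        res = y - X @ beta
        scale = np.median(np.abs(res)) / 0.6745 + 1e-12  # robust scale
        w = huber_weights(res / scale)
        Xw = X * w[:, None]                        # weighted design
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ y) # weighted normal eqs
    return beta

# line y = 1 + 2x with one gross outlier
x = np.arange(10.0)
y = 1 + 2 * x
y[9] += 50.0                                      # bad datum
X = np.column_stack([np.ones_like(x), x])
print(np.round(irls(X, y), 2))
```

The downweighting of the outlier recovers the intercept and slope of the uncontaminated line, which plain least squares would miss badly.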

Relevance:

40.00%

Publisher:

Abstract:

In most studies on beef cattle longevity, only the cows reaching a given number of calvings by a specific age are considered in the analyses. With the aim of evaluating all cows with a productive life in the herds, taking into consideration the different forms of management on each farm, it was proposed to measure cow longevity from age at last calving (ALC), that is, the most recent calving registered in the files. The objective was to characterize this trait in order to study the longevity of Nellore cattle, using Kaplan-Meier estimators and the Cox model. The covariables and class effects considered in the models were age at first calving (AFC), year and season of birth of the cow, and farm. The variable studied (ALC) was classified as presenting complete information (uncensored = 1) or incomplete information (censored = 0), using as criterion the difference between the date of each cow's last calving and the date of the latest calving at its farm. If this difference was >36 months, the cow was considered to have failed; if not, the cow was censored, indicating that future calving remained possible. The records of 11 791 animals from 22 farms within the Nellore Breed Genetic Improvement Program ('Nellore Brazil') were used. In the estimation process using the Kaplan-Meier model, AFC was classified into three age groups. In individual analyses, the log-rank and Wilcoxon tests in the Kaplan-Meier model showed that all covariables and class effects had significant effects (P < 0.05) on ALC. In the analysis considering all covariables and class effects, using the Wald test in the Cox model, only the season of birth of the cow was not significant for ALC (P > 0.05). This analysis indicated that each month added to AFC diminished the risk of the cow's failure in the herd by 2%. Nonetheless, this does not imply that animals with a younger AFC were less profitable. Cows with greater numbers of calvings were more precocious than those with fewer calvings. Copyright © The Animal Consortium 2012.
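The Kaplan-Meier estimator used here has a simple direct implementation: at each observed failure time, the survival curve is multiplied by one minus the fraction failing among those still at risk, with censored cows leaving the risk set without a drop. A sketch with hypothetical ALC-style data (times in months):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up times;
    events: 1 if the failure (here, culling) was observed, 0 if censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, points = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:                       # curve drops only at failures
            surv *= 1 - deaths / n_at_risk
            points.append((t, surv))
        n_at_risk -= at_t                # censored cows also leave risk set
    return points

times  = [60, 72, 72, 84, 90, 96, 96, 108]
events = [1,  1,  0,  1,  0,  1,  1,  0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

The Cox model in the abstract then relates the hazard underlying such a curve to covariables such as AFC and farm.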

Relevance:

40.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: Primary: 62M10, 62J02, 62F12, 62M05, 62P05, 62P10; secondary: 60G46, 60F15.

Relevance:

40.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62J12, 62F35

Relevance:

30.00%

Publisher:

Abstract:

Designing and estimating civil concrete structures is a complex process which many practitioners tie to manual or semi-manual 2D design workflows and believe cannot be further improved by automated, interacting design-estimating processes. This paper presents a feasibility study for the development of an automated estimator for concrete bridge design. The study offers a value proposition: an efficient automated model-based estimator can add value to the whole bridge design-estimating process, i.e., reducing estimation errors, shortening the time to a successful estimate, and increasing the benefit of cost estimation compared with current practice. This is followed by a description of what constitutes an efficient automated model-based estimator and how it should be used. Finally, the process of model-based estimating is compared with current practice to highlight the value embedded in the automated processes.

Relevance:

30.00%

Publisher:

Abstract:

Biased estimation can reduce the mean squared error (MSE) of an estimator. The question of interest is how biased estimation affects model selection. In this paper, we introduce biased estimation to a range of model selection criteria. Specifically, we analyze the performance of the minimum description length (MDL) criterion based on biased and unbiased estimation, and compare it against modern model selection criteria such as Kay's conditional model order estimator (CME), the bootstrap, and the more recently proposed hook-and-loop resampling-based model selection. The advantages and limitations of the considered techniques are discussed. The results indicate that, in some cases, biased estimators can slightly improve the selection of the correct model. We also give an example for which the CME with an unbiased estimator fails, but regains its power when a biased estimator is used.
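That a biased estimator can reduce MSE is easy to demonstrate by simulation: shrinking the sample mean toward zero introduces a small bias but buys a larger variance reduction. A minimal sketch with hypothetical parameters, not the paper's MDL setting:

```python
import random

random.seed(7)

def mse(estimator, mu=1.0, sigma=3.0, n=10, trials=20000):
    """Monte Carlo mean squared error of an estimator of the mean mu."""
    err = 0.0
    for _ in range(trials):
        sample = [random.gauss(mu, sigma) for _ in range(n)]
        err += (estimator(sample) - mu) ** 2
    return err / trials

mean = lambda s: sum(s) / len(s)     # unbiased: MSE = sigma^2 / n = 0.9
shrunk = lambda s: 0.8 * mean(s)     # biased: MSE = 0.64*0.9 + 0.04 = 0.616
print(mse(mean) > mse(shrunk))       # → True
```

The same bias-variance trade-off is what lets biased parameter estimates shift the penalized likelihoods that model selection criteria compare.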