945 results for Density-based Scanning Algorithm


Relevance: 100.00%

Abstract:

The registration of pre-operative volumetric datasets to intra-operative two-dimensional images provides an improved way of verifying patient position and medical instrument location. In applications from orthopedics to neurosurgery, it has great value in maintaining up-to-date information about changes due to intervention. We propose a mutual information-based registration algorithm to establish the proper alignment. For optimization purposes, we compare the performance of the non-gradient Powell method and two slightly different versions of a stochastic gradient ascent strategy: one using a sparsely sampled histogramming approach and the other Parzen windowing to carry out probability density approximation. Our main contribution lies in adapting the stochastic approximation scheme successfully applied in 3D-3D registration problems to the 2D-3D scenario, which obviates the need for the generation of full DRRs at each iteration of pose optimization. This yields considerable savings in computational expense. We also introduce a new probability density estimator for image intensities via sparse histogramming, derive gradient estimates for the density measures required by the maximization procedure, and introduce the framework for a multiresolution strategy to the problem. Registration results are presented on fluoroscopy and CT datasets of a plastic pelvis and a real skull, and on high-resolution CT-derived simulated datasets of a real skull, a plastic skull, a plastic pelvis and a plastic lumbar spine segment.
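
The record does not include the estimator itself; below is a minimal, illustrative Python sketch of a Parzen-window mutual information estimate over two small random pixel samples, in the spirit of the stochastic approximation the abstract describes. Intensities are assumed scaled to [0, 1]; all names and parameters are hypothetical, not the authors' implementation.

```python
import numpy as np

def parzen_mi(a, b, sigma=0.1, n=200, seed=0):
    """Mutual information between two same-shape intensity images a and b,
    estimated with Gaussian Parzen windows over small random samples
    (no full histogram or full DRR is needed per evaluation)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(a.size, n, replace=False)   # "sample" set for the density
    jdx = rng.choice(a.size, n, replace=False)   # "test" set for the entropies
    sa, sb = a.ravel()[idx], b.ravel()[idx]
    ta, tb = a.ravel()[jdx], b.ravel()[jdx]

    def kern(t, s):
        # 1D Gaussian kernel values between each test and sample point.
        d = (t[:, None] - s[None, :]) / sigma
        return np.exp(-0.5 * d * d) / (sigma * np.sqrt(2.0 * np.pi))

    ka, kb = kern(ta, sa), kern(tb, sb)
    h_a = -np.mean(np.log(ka.mean(axis=1)))          # marginal entropy H(A)
    h_b = -np.mean(np.log(kb.mean(axis=1)))          # marginal entropy H(B)
    h_ab = -np.mean(np.log((ka * kb).mean(axis=1)))  # joint entropy H(A,B)
    return h_a + h_b - h_ab                          # MI(A;B)
```

In a 2D-3D setting, b would be the fixed fluoroscopic image and a a sparsely sampled projection of the volume at the current pose estimate.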

Relevance: 100.00%

Abstract:

This paper describes a navigation system for autonomous underwater vehicles (AUVs) in partially structured environments, such as dams, harbors, marinas or marine platforms. A mechanical scanning imaging sonar is used to obtain information about the location of planar structures present in such environments. A modified version of the Hough transform has been developed to extract line features, together with their uncertainty, from the continuous sonar data flow. The information obtained is incorporated into a feature-based SLAM algorithm running an Extended Kalman Filter (EKF). Simultaneously, the AUV's position estimate is provided to the feature extraction algorithm to correct the distortions that the vehicle motion produces in the acoustic images. Experiments carried out in a marina on the Costa Brava (Spain) with the Ictineu AUV show the viability of the proposed approach.
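
The paper's modified Hough transform (with uncertainty propagation over the continuous data flow) is not reproduced in the record; the sketch below shows a standard (rho, theta) Hough accumulator for extracting lines from 2D sonar returns in Python. Binning resolutions and the voting scheme are illustrative assumptions.

```python
import numpy as np

def hough_lines(points, rho_res=0.1, theta_res=np.deg2rad(1.0),
                rho_max=50.0, top_k=3):
    """Vote point returns (x, y) into a (rho, theta) accumulator and return
    the strongest line candidates as (rho, theta, votes); a line satisfies
    x*cos(theta) + y*sin(theta) = rho."""
    thetas = np.arange(0.0, np.pi, theta_res)
    rhos = np.arange(-rho_max, rho_max + rho_res, rho_res)
    acc = np.zeros((len(rhos), len(thetas)), dtype=int)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        r = np.clip(((rho + rho_max) / rho_res).astype(int), 0, len(rhos) - 1)
        acc[r, np.arange(len(thetas))] += 1          # one vote per theta bin
    best = np.argsort(acc, axis=None)[::-1][:top_k]  # peaks = detected lines
    ri, ti = np.unravel_index(best, acc.shape)
    return [(rhos[i], thetas[j], acc[i, j]) for i, j in zip(ri, ti)]
```

The peak spread in the accumulator is what a SLAM front end could turn into the line-feature uncertainty the abstract mentions.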

Relevance: 100.00%

Abstract:

In this paper, a fuzzy Markov random field (FMRF) model is used to segment land objects into tree, grass, building, and road regions by fusing remotely sensed LIDAR data and co-registered color bands, i.e., a scanned aerial color (RGB) photo and a near-infrared (NIR) photo. An FMRF model is defined as a Markov random field (MRF) model in a fuzzy domain. Three optimization algorithms for the FMRF model, i.e., Lagrange multiplier (LM), iterated conditional modes (ICM), and simulated annealing (SA), are compared with respect to computational cost and segmentation accuracy. The results show that the FMRF model-based ICM algorithm balances computational cost and segmentation accuracy in land-cover segmentation from LIDAR data and co-registered bands.
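
ICM itself is a short loop; here is a minimal Python sketch of crisp ICM with a Potts smoothness prior on a 4-connected grid. The fuzzy-domain extension of the paper is not reproduced; the `unary` costs, `beta`, and the neighbourhood are illustrative assumptions.

```python
import numpy as np

def icm_segment(unary, beta=1.0, n_iter=5):
    """Iterated conditional modes on a 4-connected pixel grid.
    unary[i, j, k] is the data cost of label k at pixel (i, j);
    beta weighs a Potts smoothness prior. Returns the label map."""
    h, w, K = unary.shape
    labels = unary.argmin(axis=2)                    # start from the data term
    for _ in range(n_iter):
        for i in range(h):
            for j in range(w):
                cost = unary[i, j].astype(float)
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        # Potts prior: penalise disagreeing with a neighbour.
                        cost += beta * (np.arange(K) != labels[ni, nj])
                labels[i, j] = cost.argmin()         # conditional mode update
    return labels
```

ICM's greedy per-pixel updates explain the trade-off reported here: far cheaper than simulated annealing, at the cost of converging to a local optimum.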

Relevance: 100.00%

Abstract:

A hybridised and Knowledge-based Evolutionary Algorithm (KEA) is applied to multi-criterion minimum spanning tree problems. Hybridisation is used across its three phases. In the first phase a deterministic single-objective optimization algorithm finds the extreme points of the Pareto front. In the second phase a K-best approach finds the first neighbours of the extreme points, which serve as an elitist parent population for an evolutionary algorithm in the third phase. A knowledge-based mutation operator is applied in each generation to reproduce individuals that are at least as good as the unique parent. The advantages of KEA over previous algorithms include its speed (making it applicable to large real-world problems), its scalability to more than two criteria, and its ability to find both the supported and unsupported optimal solutions.
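
The first phase can be illustrated directly: one deterministic single-criterion MST per objective gives the extreme points of the Pareto front. The Python sketch below scalarises vector edge costs and runs Kruskal's algorithm once per criterion; data structures are assumed, and it shows only that phase, not KEA itself.

```python
import numpy as np

def kruskal_mst(n_nodes, edges, w):
    """Kruskal's MST on edges (u, v, cost_vector), ranked by the
    scalarised cost w . cost_vector; returns the chosen edges."""
    parent = list(range(n_nodes))
    def find(x):                                   # union-find with halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for u, v, c in sorted(edges, key=lambda e: float(np.dot(w, e[2]))):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v, c))
    return tree

def pareto_extremes(n_nodes, edges, n_criteria):
    """Phase 1: the cost vectors of the single-criterion MSTs are the
    extreme points of the Pareto front."""
    return [np.sum([c for _, _, c in
                    kruskal_mst(n_nodes, edges, np.eye(n_criteria)[k])], axis=0)
            for k in range(n_criteria)]
```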

Relevance: 100.00%

Abstract:

This paper discusses how numerical gradient estimation methods may be used to reduce the computational demands of a class of multidimensional clustering algorithms. The study is motivated by the recognition that several current point-density-based cluster identification algorithms could benefit from a reduction in computational demand if approximate a-priori estimates of the cluster centres present in a given data set could be supplied as starting conditions for these algorithms. In this presentation, the algorithm shown to benefit from the technique is the Mean-Tracking (M-T) cluster algorithm, but the results obtained from the gradient estimation approach may also be applied to other clustering algorithms and their related disciplines.
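
One common way to turn density-gradient estimates into a-priori centre estimates is a few ascent steps on a kernel density estimate; the Python sketch below uses a mean-shift-style normalised gradient. Kernel, bandwidth and step count are assumptions, not the paper's method.

```python
import numpy as np

def density_gradient(x, data, h=0.5):
    """Normalised ascent direction of a Gaussian KDE at x (mean-shift
    style): the vector from x to the kernel-weighted mean of data."""
    d = (x[None, :] - data) / h
    k = np.exp(-0.5 * np.sum(d * d, axis=1))       # kernel weights
    return -h * (d * k[:, None]).sum(axis=0) / (k.sum() + 1e-12)

def seed_centres(data, n_seeds=5, steps=30, lr=1.0, seed=0):
    """Gradient-ascent on the estimated point density from random starts;
    the converged points approximate cluster centres and can be supplied
    to a clustering algorithm as starting conditions."""
    rng = np.random.default_rng(seed)
    seeds = data[rng.choice(len(data), n_seeds, replace=False)].copy()
    for _ in range(steps):
        for s in seeds:                            # rows are views: in-place
            s += lr * density_gradient(s, data)
    return seeds
```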

Relevance: 100.00%

Abstract:

Anycast communication is proposed in IPv6; it is designed to support server replication by allowing applications to select, and communicate with, the “best” server among the replicas according to some performance or policy criteria. Early anycast research focused on the network layer. In this paper we pay more attention to application-layer anycasting, because the application layer offers more flexibility and scalability. We first describe the application-layer anycast model, and then summarize previous work on application-layer anycasting, especially the periodic probing algorithms for updating the database of the anycast resolver. We then present our algorithm, the requirement-based probing algorithm, which is efficient and practical. Finally, we analyse the algorithms using queuing theory and the statistical characteristics of Internet traffic. The results show that the requirement-based probing algorithm performs better not only in the average waiting time over all anycast queries, but also in the average time used for a single anycast query.
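
A toy Python sketch of the contrast with periodic probing: the resolver below probes the replicas only when a query arrives and its cached answer is stale, rather than on a fixed timer. The class, TTL and probe function are hypothetical illustrations of the idea, not the paper's algorithm.

```python
import random
import time

class AnycastResolver:
    """Requirement-based probing, simplified: probe on demand when a query
    arrives and the cached 'best' server has expired."""
    def __init__(self, servers, probe, ttl=5.0):
        self.servers, self.probe, self.ttl = servers, probe, ttl
        self.best, self.stamp = None, 0.0

    def resolve(self):
        now = time.time()
        if self.best is None or now - self.stamp > self.ttl:
            # Probe only when a requirement (query) exists and cache is stale.
            rtts = {s: self.probe(s) for s in self.servers}
            self.best, self.stamp = min(rtts, key=rtts.get), now
        return self.best

# Hypothetical probe: a measured round-trip time in milliseconds.
resolver = AnycastResolver(["s1", "s2", "s3"],
                           probe=lambda s: random.uniform(5, 50))
print(resolver.resolve())
```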

Relevance: 100.00%

Abstract:

In this paper, we propose a novel model for web-based database systems based on the multicast and anycast protocols. In the model, we design a middleware, castway, which sits between the database server and the Web server. Each castway node in the distributed system operates independently as both a multicast node and an anycast node. The proposed mechanism can balance the workload among the distributed database servers and offers the “best” server to serve a query. Three algorithms are employed in the model: the requirement-based probing algorithm for anycast routing, the atomic multicast update algorithm for database synchronization, and the job deviation algorithm for system workload balance. Simulations and experiments show that the proposed model works well.
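
Of the three algorithms, requirement-based probing is sketched above; below is a toy Python illustration of the job-deviation idea only: move queued jobs off servers whose load exceeds the mean. The threshold and queue representation are assumptions, not the paper's algorithm.

```python
def deviate_jobs(queues, threshold=1.2):
    """Toy job-deviation step: if a server's queue exceeds threshold * mean
    load, shift its excess jobs to the currently least-loaded server."""
    mean = sum(len(q) for q in queues.values()) / len(queues)
    for name, q in queues.items():
        while q and len(q) > threshold * mean:
            target = min(queues, key=lambda s: len(queues[s]))
            if target == name:        # already the least loaded: stop
                break
            queues[target].append(q.pop())
    return queues

print(deviate_jobs({"db1": list(range(9)), "db2": [1], "db3": []}))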

Relevance: 100.00%

Abstract:

This study contributes to work in baggage handling system (BHS) control, specifically dynamic bag routing. Although studies in BHS agent-based control have examined the need for intelligent control, there has been no effort to explore the dynamic routing problem. As such, this study provides additional insight into how agents can learn to route in a BHS. This study describes a BHS status-based routing algorithm that applies learning methods to select the criteria on which routing decisions are based. Although numerous studies have identified the need for dynamic routing, little analytical attention has been paid to intelligent agents that learn routing tables, as opposed to the manual creation of routing rules. We address this issue by demonstrating the ability of agents to learn how to route based on bag status, a robust method that is able to function in a variety of different BHS designs.
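
The abstract does not name the learning method; a common way to learn a routing table from observed outcomes is tabular Q-learning, sketched below in Python with an assumed encoding of bag status as the state and the outgoing link as the action.

```python
import random
from collections import defaultdict

class BagRouter:
    """Toy status-based routing agent for one BHS node: learns
    Q(status, link) from transit outcomes instead of hand-written rules."""
    def __init__(self, links, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = defaultdict(float)
        self.links, self.alpha, self.gamma, self.eps = links, alpha, gamma, eps

    def choose(self, status):
        if random.random() < self.eps:                 # explore occasionally
            return random.choice(self.links)
        return max(self.links, key=lambda a: self.q[(status, a)])

    def update(self, status, link, reward, next_status):
        # Standard Q-learning step; reward could be negative transit time.
        best_next = max(self.q[(next_status, a)] for a in self.links)
        td = reward + self.gamma * best_next - self.q[(status, link)]
        self.q[(status, link)] += self.alpha * td
```

Because the table is keyed by status rather than by a fixed layout, the same agent logic can be dropped into different BHS designs, which is the robustness the abstract claims.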

Relevance: 100.00%

Abstract:

This paper introduces a new non-parametric method for uncertainty quantification through the construction of prediction intervals (PIs). The method takes the left and right end points of the type-reduced set of an interval type-2 fuzzy logic system (IT2FLS) model as the lower and upper bounds of a PI. No assumption is made regarding the data distribution, behaviour, or patterns when developing the intervals. A training method is proposed to link the confidence level (CL) concept of PIs to the intervals generated by IT2FLS models. The new PI-based training algorithm not only ensures that PIs constructed using IT2FLS models satisfy the CL requirements, but also reduces the widths of the PIs and generates practically informative PIs. Proper adjustment of the IT2FLS parameters is performed through the minimization of a PI-based objective function; a metaheuristic method is applied to minimize this non-linear, non-differentiable cost function. The performance of the proposed method is examined on seven synthetic and real-world benchmark case studies with homogeneous and heterogeneous noise. The results indicate that the proposed method is capable of generating high-quality PIs. Comparative studies also show that its performance is equal to or better than that of traditional neural network-based methods for constructing PIs in more than 90% of cases, with the superiority more evident for data with heterogeneous noise.
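
The paper's exact objective is not given in the record; a widely used form of PI-based cost combines empirical coverage (PICP) with a normalised average width, as in the Python sketch below. The hinge penalty and its weight `eta` are assumptions; such a non-differentiable cost is exactly what a metaheuristic can minimize.

```python
import numpy as np

def pi_objective(y, lower, upper, cl=0.90, eta=10.0):
    """Prediction-interval cost: penalise intervals whose empirical
    coverage falls below the nominal confidence level, and reward
    narrow intervals (width normalised by the target range)."""
    covered = (y >= lower) & (y <= upper)
    picp = covered.mean()                              # empirical coverage
    width = np.mean(upper - lower) / (y.max() - y.min())
    penalty = max(0.0, cl - picp)                      # only if under-covering
    return width + eta * penalty
```

Here `lower` and `upper` would be the left and right end points of the type-reduced sets produced by the IT2FLS on the training targets `y`.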

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

Array seismology is a useful tool for performing detailed investigations of the Earth's interior. By exploiting the coherence properties of the wavefield, seismic arrays can extract directivity information and increase the ratio of coherent signal amplitude to incoherent noise amplitude. The Double Beam Method (DBM), developed by Krüger et al. (1993, 1996), is one possible application, allowing a refined seismic investigation of the crust and mantle using seismic arrays. The DBM is based on a combination of source and receiver arrays, leading to a further improvement of the signal-to-noise ratio by reducing the error in the location of coherent phases. Previous DBM work has addressed mantle and core/mantle resolution (Krüger et al., 1993; Scherbaum et al., 1997; Krüger et al., 2001). An implementation of the DBM is presented at large 2D scale (an Italian dataset for the Mw = 9.3 Sumatra earthquake) and at 3D crustal scale as proposed by Rietbrock & Scherbaum (1999), applying the revised version of the Source Scanning Algorithm (SSA; Kao & Shan, 2004). In the 2D application, the propagation of the rupture front in time is computed. In the 3D application, the study area (20 x 20 x 33 km³), the dataset and the source-receiver configurations are those of the KTB-1994 seismic experiment (Jost et al., 1998). We used 60 short-period seismic stations (200-Hz sampling rate, 1-Hz sensors) arranged in 9 small arrays deployed in 2 concentric rings of about 1 km (A-arrays) and 5 km (B-array) radius. The coherence values of the scattering points are computed in the crustal volume over a finite time window along all array stations, given the hypothesized origin time and source location. The resulting images can be seen as a (relative) joint log-likelihood that any point in the subsurface has contributed to the full set of observed seismograms.
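
The coherence measure in such scanning schemes is typically a stacked-beam semblance; the Python sketch below evaluates it for one candidate scatter point given predicted travel-time delays per station. It illustrates beam coherence only, with an assumed rectangular window, not the DBM/SSA implementation.

```python
import numpy as np

def semblance(traces, delays, dt, win=20):
    """Coherence ('brightness') of one candidate scatter point: shift each
    trace by its predicted travel time and measure the semblance of the
    aligned window. Returns ~1 for coherent arrivals, ~1/n for noise."""
    n, m = traces.shape
    aligned = np.zeros((n, win))
    for i in range(n):
        k = int(round(delays[i] / dt))          # delay in samples
        seg = traces[i, k:k + win]
        aligned[i, :len(seg)] = seg
    num = np.sum(aligned.sum(axis=0) ** 2)      # energy of the stacked beam
    den = n * np.sum(aligned ** 2) + 1e-12      # total energy of the traces
    return num / den
```

Scanning this value over a grid of candidate locations and origin times yields the brightness volume whose maxima the abstract interprets as likely sources.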

Relevance: 100.00%

Abstract:

The alveolated structure of the pulmonary acinus plays a vital role in gas exchange function. Three-dimensional (3D) analysis of the parenchymal region is fundamental to understanding this structure-function relationship, but only a limited number of attempts have been made in the past because of technical limitations. In this study, we developed a new image processing methodology based on finite element (FE) analysis for accurate 3D structural reconstruction of the gas exchange regions of the lung. Stereologically well-characterized rat lung samples (Pediatr Res 53: 72-80, 2003) were imaged using high-resolution synchrotron-radiation-based X-ray tomographic microscopy. A stack of 1,024 images (each slice 1024 x 1024 pixels) with a resolution of 1.4 μm³ per voxel was generated. For the development of the FE algorithm, a region of interest (ROI) containing approximately 7.5 million voxels was extracted as a working subunit. 3D FEs were created overlaying the voxel map using a grid-based hexahedral algorithm. A proper threshold value for appropriate segmentation was iteratively determined so that the calculated volume density of tissue matched the stereologically determined value (Pediatr Res 53: 72-80, 2003). The resulting 3D FEs are ready to be used for 3D structural analysis as well as for subsequent FE computational analyses such as fluid dynamics and skeletonization.
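
The iterative threshold determination lends itself to a one-function sketch: bisect the grey-level threshold until the segmented tissue volume fraction matches the stereological target. The Python below assumes (as is usual) that the fraction decreases monotonically with the threshold; the tolerance is illustrative, not the paper's procedure.

```python
import numpy as np

def match_threshold(volume, target_density, tol=1e-4):
    """Bisect the grey-level threshold so that the fraction of voxels
    classified as tissue matches a stereologically determined target."""
    lo, hi = float(volume.min()), float(volume.max())
    span = hi - lo
    while hi - lo > tol * span:
        mid = 0.5 * (lo + hi)
        density = np.mean(volume >= mid)   # current tissue volume fraction
        if density > target_density:
            lo = mid                       # too much tissue: raise threshold
        else:
            hi = mid                       # too little tissue: lower it
    return 0.5 * (lo + hi)
```

Voxels at or above the returned threshold would then be meshed into hexahedral elements by the grid-based algorithm the abstract describes.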

Relevance: 100.00%

Abstract:

This tutorial review article is intended to provide general guidance to readers interested in learning about the methodologies used to obtain accurate electron density maps of molecules and crystalline solids, from theory or from experiment, and to carry out a sensible interpretation of the results for chemical, biochemical or materials science applications. The review mainly focuses on X-ray diffraction techniques and the refinement of experimental models, in particular multipolar models. Neutron diffraction, which was widely used in the past to fix accurate positions of atoms, is now used for more specific purposes. The review illustrates three principal analyses of the experimental or theoretical electron density, based on quantum chemical, semi-empirical or empirical interpretation schemes: the quantum theory of atoms in molecules, the semi-classical evaluation of interaction energies, and Hirshfeld analysis. In particular, it is shown that a simple topological analysis based on a partition of the electron density cannot alone reveal the whole nature of chemical bonding; more information based on the pair density is necessary. A connection between quantum mechanics and observable quantities is given in order to provide the physical grounds for explaining the observations and justifying the interpretations.
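
As a purely illustrative aside on the topological analysis mentioned here: QTAIM starts from the critical points of the density, where the gradient vanishes. A minimal numerical Python sketch on a gridded density follows; the grid representation and tolerance are assumptions.

```python
import numpy as np

def critical_points(rho, spacing=1.0, tol=1e-3):
    """Candidate critical points of a density sampled on a 3D grid:
    grid points where |grad rho| is numerically near zero, the starting
    point of a QTAIM-style topological analysis."""
    gx, gy, gz = np.gradient(rho, spacing)
    gnorm = np.sqrt(gx**2 + gy**2 + gz**2)
    return np.argwhere(gnorm < tol)        # (i, j, k) indices of candidates
```

Classifying the candidates into nuclear, bond, ring and cage points would then use the signature of the Hessian of rho at each location; as the review stresses, this topology alone does not exhaust the description of bonding.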

Relevance: 100.00%

Abstract:

Time-based indoor localization has been investigated for several years, but the accuracy of existing solutions is limited by several factors, e.g., imperfect synchronization, signal bandwidth and the indoor environment. In this paper, we compare two time-based localization algorithms for narrow-band signals: multilateration and fingerprinting. First, we develop a new Linear Least Squares (LLS) algorithm for Differential Time Difference Of Arrival (DTDOA). Second, since fingerprinting is among the most successful approaches used for indoor localization and typically relies on collecting signal strength measurements over the area of interest, we propose an alternative that constructs fingerprints from fine-grained time information of the radio signal. We offer comprehensive analytical discussions of the feasibility of the approaches, backed up by evaluations in a software-defined-radio-based IEEE 802.15.4 testbed. Our work contributes to research on localization with narrow-band signals. The results show that our proposed DTDOA-based LLS algorithm clearly improves localization accuracy compared to the traditional TDOA-based LLS algorithm, although the accuracy remains limited by the complex indoor environment. Furthermore, we show that time-based fingerprinting is a promising alternative to power-based fingerprinting.
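
The DTDOA formulation itself is not in the record; the Python sketch below shows the standard TDOA linearisation that such LLS algorithms build on: introduce the distance to a reference anchor as an extra unknown, which makes the hyperbolic equations linear. The anchor layout and noise-free measurements are illustrative.

```python
import numpy as np

def tdoa_lls(anchors, tdoa, c=3e8):
    """Linear least-squares TDOA positioning. With range differences
    d_i = c * tdoa_i = r_i - r_0 to reference anchor 0, each anchor i
    gives the linear equation
        2*(a_i - a_0) . x + 2*d_i*r_0 = |a_i|^2 - |a_0|^2 - d_i^2,
    solved for [x, r_0]."""
    a0 = anchors[0]
    d = c * np.asarray(tdoa)
    A, b = [], []
    for ai, di in zip(anchors[1:], d):
        A.append(np.append(2 * (ai - a0), 2 * di))
        b.append(np.dot(ai, ai) - np.dot(a0, a0) - di**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol[:-1]                       # position estimate; sol[-1] is r_0

# Noise-free check: 4 anchors, source at (2, 3).
anchors = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])
src = np.array([2., 3.])
r = np.linalg.norm(anchors - src, axis=1)
print(tdoa_lls(anchors, (r[1:] - r[0]) / 3e8))   # -> approximately [2, 3]
```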

Relevance: 100.00%

Abstract:

SOMS is a general surrogate-based multistart algorithm, used in combination with any local optimizer to find global optima of computationally expensive functions with multiple local minima. SOMS differs from previous multistart methods in that a surrogate approximation is used by the multistart algorithm to help reduce the number of function evaluations needed to identify the most promising points from which to start each nonlinear programming local search. SOMS's numerical results are compared with four well-known methods: Multi-Level Single Linkage (MLSL), MATLAB's MultiStart, MATLAB's GlobalSearch, and GLOBAL. In addition, we propose a class of wavy test functions that mimic the wavy nature of objective functions arising in many black-box simulations. Extensive comparisons of the algorithms on the wavy test functions and on earlier standard global-optimization test functions are carried out on a total of 19 different test problems. The numerical results indicate that SOMS performs favorably in comparison to the alternative methods and does especially well on wavy functions when the number of function evaluations allowed is limited.
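
A minimal Python sketch of the surrogate-screening idea, under assumptions of mine (a cubic RBF interpolant without polynomial tail, ranking candidates purely by predicted value): only the candidates the surrogate rates as promising would trigger an expensive local search.

```python
import numpy as np

def rbf_surrogate(X, y):
    """Fit a cubic RBF interpolant s(x) = sum_i w_i * |x - x_i|^3 through
    points X (n, dim) already evaluated by the expensive objective y."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    w = np.linalg.solve(d**3 + 1e-10 * np.eye(len(X)), y)  # small ridge
    return lambda x: (np.linalg.norm(x - X, axis=1) ** 3) @ w

def pick_starts(X, y, candidates, k=3):
    """Rank candidate start points (an array) by surrogate-predicted value
    and keep the k most promising ones for local searches."""
    s = rbf_surrogate(X, y)
    return candidates[np.argsort([s(c) for c in candidates])[:k]]
```

In a full multistart loop, each local search's evaluations would be appended to (X, y), so the surrogate sharpens as the run proceeds.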