773 results for minimization


Relevance:

10.00%

Publisher:

Abstract:

Context. Convergent point (CP) search methods are important tools for studying the kinematic properties of open clusters and young associations whose members share the same spatial motion. Aims. We present a new CP search strategy based on proper motion data. We test the new algorithm on synthetic data and compare it with previous versions of the CP search method. As an illustration and validation of the new method we also present an application to the Hyades open cluster and a comparison with independent results. Methods. The new algorithm rests on the idea of representing the stellar proper motions by great circles over the celestial sphere and visualizing their intersections as the CP of the moving group. The new strategy combines a maximum-likelihood analysis for simultaneously determining the CP and selecting the most likely group members and a minimization procedure that returns a refined CP position and its uncertainties. The method allows one to correct for internal motions within the group and takes into account that the stars in the group lie at different distances. Results. Based on Monte Carlo simulations, we find that the new CP search method in many cases returns a more precise solution than its previous versions. The new method is able to find and eliminate more field stars in the sample and is not biased towards distant stars. The CP solution for the Hyades open cluster is in excellent agreement with previous determinations.
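
As a hedged illustration of the great-circle idea described in this abstract (not the authors' actual algorithm, which also includes the maximum-likelihood member selection), the sketch below represents each star's proper-motion direction as a great circle whose pole is the cross product of the star's position vector and its tangential motion direction; the convergent point is then estimated as the direction most nearly orthogonal to all poles, via the smallest eigenvector of the sum of pole outer products. All function names and the toy data are hypothetical.

```python
# Minimal sketch: estimate a convergent point (CP) from proper motions by
# intersecting great circles on the celestial sphere.
import numpy as np

def radec_to_unit(ra, dec):
    """Unit position vector from right ascension/declination in radians."""
    return np.array([np.cos(dec) * np.cos(ra),
                     np.cos(dec) * np.sin(ra),
                     np.sin(dec)])

def pm_direction(ra, dec, pm_ra_cosdec, pm_dec):
    """Unit tangent vector on the sphere along the proper-motion direction."""
    east = np.array([-np.sin(ra), np.cos(ra), 0.0])
    north = np.array([-np.sin(dec) * np.cos(ra),
                      -np.sin(dec) * np.sin(ra),
                      np.cos(dec)])
    t = pm_ra_cosdec * east + pm_dec * north
    return t / np.linalg.norm(t)

def convergent_point(stars):
    """Least-squares CP: the direction most orthogonal to all great-circle poles.

    Each star defines a great circle through its position along its proper
    motion; the circle's pole is (position x tangent).  Minimizing
    sum_i (n_i . c)^2 with |c| = 1 gives the eigenvector of sum_i n_i n_i^T
    with the smallest eigenvalue.
    """
    M = np.zeros((3, 3))
    for ra, dec, pmra, pmdec in stars:
        p = radec_to_unit(ra, dec)
        t = pm_direction(ra, dec, pmra, pmdec)
        n = np.cross(p, t)               # pole of the star's great circle
        M += np.outer(n, n)
    w, v = np.linalg.eigh(M)
    c = v[:, 0]                          # smallest-eigenvalue eigenvector
    # (the antipodal direction -c is an equally valid intersection direction)
    ra_cp = np.arctan2(c[1], c[0]) % (2 * np.pi)
    dec_cp = np.arcsin(np.clip(c[2], -1.0, 1.0))
    return ra_cp, dec_cp

# Toy usage with hypothetical (ra, dec, pm_ra*cos(dec), pm_dec) tuples [rad, mas/yr]:
stars = [(1.10, 0.28, 100.0, 25.0), (1.15, 0.30, 95.0, 20.0), (1.05, 0.26, 105.0, 28.0)]
print(convergent_point(stars))
```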

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE: Hypertension is a major issue in public health, and the financial costs associated with hypertension continue to increase. Cost-effectiveness studies focusing on antihypertensive drug combinations, however, have been scarce. The cost-effectiveness ratios of the traditional treatment (hydrochlorothiazide and atenolol) and the current treatment (losartan and amlodipine) were evaluated in patients with grade 1 or 2 hypertension (HT1-2). For patients with grade 3 hypertension (HT3), a third drug was added to the treatment combinations: enalapril was added to the traditional treatment, and hydrochlorothiazide was added to the current treatment. METHODS: Hypertension treatment costs were estimated on the basis of the purchase prices of the antihypertensive medications, and effectiveness was measured as the reduction in systolic blood pressure and diastolic blood pressure (in mm Hg) at the end of a 12-month study period. RESULTS: When the purchase price of the brand-name medication was used to calculate the cost, the traditional treatment presented a lower cost-effectiveness ratio [US$/mm Hg] than the current treatment in the HT1-2 group. In the HT3 group, however, there was no difference in cost-effectiveness ratio between the traditional treatment and the current treatment. The cost-effectiveness ratio differences between the treatment regimens maintained the same pattern when the purchase price of the lower-cost medication was used. CONCLUSIONS: We conclude that the traditional treatment is more cost-effective (US$/mm Hg) than the current treatment in the HT1-2 group. There was no difference in cost-effectiveness between the traditional treatment and the current treatment for the HT3 group.
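
A trivial hedged sketch of the cost-effectiveness ratio used in this study, i.e. treatment cost divided by blood-pressure reduction (US$/mm Hg); the figures below are placeholders, not the study's data.

```python
# Hedged sketch: cost-effectiveness ratio in US$ per mm Hg of blood-pressure reduction.
def cost_effectiveness_ratio(annual_cost_usd, bp_reduction_mmhg):
    """Lower is better: dollars spent per mm Hg reduced over the 12-month period."""
    return annual_cost_usd / bp_reduction_mmhg

# Placeholder numbers for illustration only (not the study's results):
traditional = cost_effectiveness_ratio(annual_cost_usd=120.0, bp_reduction_mmhg=15.0)
current = cost_effectiveness_ratio(annual_cost_usd=300.0, bp_reduction_mmhg=16.0)
print(traditional, current)
```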

Relevance:

10.00%

Publisher:

Abstract:

Bilayer segmentation of live video in uncontrolled environments is an essential task for home applications in which the original background of the scene must be replaced, as in video chats or traditional videoconferencing. The main challenge in such conditions is to overcome the difficulties of problem situations (e.g., illumination changes, distracting events such as elements moving in the background, and camera shake) that may occur while the video is being captured. This paper presents a survey of segmentation methods for background substitution applications, describes the main concepts, and identifies events that may cause errors. Our analysis shows that robust methods rely on specific devices (multiple cameras or sensors that generate depth maps) to aid the process. In order to achieve the same results using conventional devices (monocular video cameras), most current research relies on energy minimization frameworks in which temporal and spatial information is probabilistically combined with color and contrast information.
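
As a hedged, generic illustration of the energy-minimization frameworks mentioned above (not a method from any specific surveyed paper), the sketch below evaluates a standard bilayer-segmentation energy: a per-pixel color term plus a contrast-modulated smoothness term between 4-connected neighbors. The Gaussian color models and parameters are hypothetical; a real system would also add the temporal term and minimize the energy, e.g. with graph cuts.

```python
# Minimal sketch: evaluate a color + contrast energy for a binary (bilayer) labeling.
import numpy as np

def color_unary(img, mean_fg, mean_bg, sigma=0.1):
    """-log of isotropic Gaussian color likelihoods for foreground/background."""
    d_fg = np.sum((img - mean_fg) ** 2, axis=-1) / (2 * sigma ** 2)
    d_bg = np.sum((img - mean_bg) ** 2, axis=-1) / (2 * sigma ** 2)
    return d_fg, d_bg  # unary costs for label 1 (foreground) and label 0 (background)

def bilayer_energy(img, labels, mean_fg, mean_bg, lam=2.0, beta=10.0):
    d_fg, d_bg = color_unary(img, mean_fg, mean_bg)
    unary = np.where(labels == 1, d_fg, d_bg).sum()

    pairwise = 0.0
    for axis in (0, 1):  # 4-connected neighbors: down and right
        diff_lab = np.diff(labels, axis=axis) != 0
        diff_col = np.sum(np.diff(img, axis=axis) ** 2, axis=-1)
        # Contrast term: label changes are cheap across strong color edges.
        pairwise += np.sum(diff_lab * np.exp(-beta * diff_col))
    return unary + lam * pairwise

# Toy usage with a hypothetical 4x4 RGB frame and labeling:
rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))
labels = (img[..., 0] > 0.5).astype(int)
print(bilayer_energy(img, labels, mean_fg=np.array([0.8, 0.2, 0.2]),
                     mean_bg=np.array([0.2, 0.2, 0.8])))
```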

Relevance:

10.00%

Publisher:

Abstract:

The accuracy of ranging measurements depends critically on knowledge of the time delays that signals undergo when retransmitted by a remote transponder, as well as of delays due to propagation effects. A new method determines these delays for every single pulsed signal transmission. It utilizes four ground-based reference stations, synchronized in time and installed at well-known geodetic coordinates, and a repeater in space carried by a satellite, balloon, aircraft, or similar platform. A signal transmitted by one of the reference bases is retransmitted by the transponder and received back by the four bases, producing four ranging measurements that are processed to uniquely determine the time delays incurred in each retransmission. A minimization function is derived by comparing the repeater positions obtained from at least two groups of three reference bases; it yields the signal transit time at the repeater and the propagation delays, and hence the correct repeater position. The method is applicable to transponder platform positioning and navigation, time synchronization of remote clocks, and target location. The algorithm has been demonstrated by simulations adopting a practical example with the transponder carried by an aircraft moving over bases on the ground.
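
A hedged numerical sketch of the geometric idea only (not the authors' two-group comparison): with four bases at known coordinates and four measured path lengths for base-1 → repeater → base-i, one can solve jointly for the repeater position and the lumped retransmission delay (expressed here as an equivalent path length) by nonlinear least squares. Geometry, noise level, and names are hypothetical.

```python
# Hedged sketch: solve for repeater position and retransmission delay from
# four ranging measurements (transmitting base -> repeater -> receiving base).
import numpy as np
from scipy.optimize import least_squares

bases = np.array([[0.0, 0.0, 0.0],        # transmitting/receiving base (hypothetical)
                  [40e3, 0.0, 0.0],
                  [0.0, 50e3, 0.0],
                  [35e3, 45e3, 0.0]])      # coordinates in meters

def predicted_ranges(params, bases):
    """Total path length for each base, with the delay as extra path (meters)."""
    pos, delay_m = params[:3], params[3]
    up = np.linalg.norm(pos - bases[0])              # transmitter -> repeater
    down = np.linalg.norm(pos - bases, axis=1)       # repeater -> each base
    return up + down + delay_m

# Simulate measurements for a hypothetical airborne repeater and delay.
true_params = np.array([20e3, 25e3, 10e3, 150.0])
measured = predicted_ranges(true_params, bases) + np.random.default_rng(1).normal(0, 0.5, 4)

# Four equations, four unknowns (x, y, z, delay), solved by least squares.
fit = least_squares(lambda p: predicted_ranges(p, bases) - measured,
                    x0=[10e3, 10e3, 5e3, 0.0])
print(fit.x)
```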

Relevance:

10.00%

Publisher:

Abstract:

The clustering problem consists in finding patterns in a data set in order to divide it into clusters with high within-cluster similarity. This paper studies a problem, here called the MMD problem, which aims at finding a clustering with a predefined number of clusters that minimizes the largest within-cluster distance (diameter) over all clusters. There are two main objectives in this paper: to propose heuristics for the MMD problem and to evaluate how well the best proposed heuristic's results agree with the real classification of some data sets. Regarding the first objective, the experimental results indicate good performance of the best proposed heuristic, which outperformed the Complete Linkage algorithm (the most widely used method in the literature for this problem). Nevertheless, regarding the agreement of the results with the real classification of the data sets, the proposed heuristic achieved better-quality results than the C-Means algorithm, but worse than Complete Linkage.
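
A hedged sketch of the MMD objective described above: given a labeling with a fixed number of clusters, compute each cluster's diameter (largest pairwise distance) and take the maximum. Function and variable names are illustrative, not from the paper.

```python
# Minimal sketch: the MMD objective = largest within-cluster diameter.
import numpy as np

def max_diameter(X, labels):
    """Largest pairwise distance found inside any cluster."""
    worst = 0.0
    for c in np.unique(labels):
        pts = X[labels == c]
        if len(pts) < 2:
            continue
        # Pairwise Euclidean distances within the cluster (fine for small clusters).
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        worst = max(worst, d.max())
    return worst

# Toy usage: two well-separated blobs, correctly vs. badly labeled.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(5, 0.5, (20, 2))])
good = np.repeat([0, 1], 20)
bad = np.tile([0, 1], 20)
print(max_diameter(X, good), max_diameter(X, bad))  # the good labeling has the smaller value
```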

Relevance:

10.00%

Publisher:

Abstract:

The principles and guidelines of the Brazilian Unified Health System (Sistema Único de Saúde, SUS) impose a care structure based on public policy networks that, combined with the adopted financing model, leads to market failures. This creates barriers to the management of the public health system and to the achievement of the SUS objectives. The institutional characteristics and the heterogeneity of the actors, together with the existence of different health care networks, generate analytical complexity in the study of the global dynamics of the SUS network. There are limitations to the use of quantitative methods based on static analysis of retrospective data from the public health system. We therefore propose approaching the SUS as a complex system, using an innovative quantitative methodology based on computational simulation. This article analyzes the challenges and potential of combining cellular automata modeling with agent-based modeling to simulate the evolution of the SUS service network. Such an approach should allow a better understanding of the organization, heterogeneity, and structural dynamics of the SUS service network and make it possible to minimize the effects of market failures in the Brazilian health system.
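
Purely as an illustrative toy of the hybrid approach discussed (a cellular automaton coupled with agents), and emphatically not a model of the SUS: a lattice of service capacities is updated by a local rule that reacts to demand generated by moving agents. Every rule, parameter, and name below is hypothetical.

```python
# Illustrative toy only: a cellular-automaton grid of service capacities coupled
# with simple agents that generate demand.
import numpy as np

rng = np.random.default_rng(0)
size, steps, n_agents = 20, 50, 200
capacity = rng.integers(0, 3, (size, size)).astype(float)   # CA cell state
agents = rng.integers(0, size, (n_agents, 2))                # agent positions

for _ in range(steps):
    # Agents move randomly; each adds one unit of demand to its cell.
    agents = (agents + rng.integers(-1, 2, agents.shape)) % size
    demand = np.zeros((size, size))
    np.add.at(demand, (agents[:, 0], agents[:, 1]), 1.0)

    # CA rule (hypothetical): unmet demand in the neighborhood raises capacity,
    # while idle capacity slowly decays.
    unmet = np.maximum(demand - capacity, 0.0)
    neighborhood = sum(np.roll(unmet, s, axis=a) for a in (0, 1) for s in (-1, 1)) + unmet
    capacity += 0.1 * neighborhood - 0.02 * np.maximum(capacity - demand, 0.0)

print(capacity.mean(), capacity.max())
```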

Relevance:

10.00%

Publisher:

Abstract:

Small-scale fluid flow systems have been studied for various applications, such as chemical reagent dosing and cooling devices for compact electronic components. This work presents the complete development cycle of an optimized heat sink designed using the Topology Optimization Method (TOM) for best performance, including the minimization of the pressure drop in the fluid flow and the maximization of heat dissipation effects, aiming at small-scale applications. The TOM is applied to a design domain to obtain an optimized channel topology, according to a given multi-objective function that combines pressure drop minimization and heat transfer maximization. The Stokes flow hypothesis is adopted. Moreover, both conduction and forced convection effects are included in the steady-state heat transfer model. The topology optimization procedure combines the Finite Element Method (to carry out the physical analysis) with Sequential Linear Programming (as the optimization algorithm). Two-dimensional topology optimization results of channel layouts obtained for a heat sink design are presented as an example to illustrate the design methodology. 3D computational simulations and prototype manufacturing have been carried out to validate the proposed design methodology.
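
A hedged sketch of the kind of weighted multi-objective function mentioned above, combining a normalized pressure-drop objective with a normalized thermal objective. The normalization constants, weights, and names are illustrative rather than the paper's implementation, and the FEM analysis that would supply the two raw objective values is omitted.

```python
# Hedged sketch: combining pressure-drop minimization and heat-transfer
# maximization into one scalar objective, as in weighted-sum topology
# optimization formulations.
def multiobjective(pressure_drop, thermal_compliance,
                   pressure_drop_ref, thermal_compliance_ref, w=0.5):
    """Weighted sum of normalized objectives; both terms are minimized.

    Minimizing thermal compliance here stands in for maximizing heat dissipation.
    Reference values (e.g. from the initial design) make the terms comparable.
    """
    f_flow = pressure_drop / pressure_drop_ref
    f_heat = thermal_compliance / thermal_compliance_ref
    return w * f_flow + (1.0 - w) * f_heat

# Illustrative numbers only: a candidate design compared against the initial design.
print(multiobjective(pressure_drop=80.0, thermal_compliance=0.9,
                     pressure_drop_ref=100.0, thermal_compliance_ref=1.0, w=0.4))
```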

Relevance:

10.00%

Publisher:

Abstract:

[EN] This article describes an implementation of the optical flow estimation method introduced by Zach, Pock and Bischof. This method is based on the minimization of a functional containing a data term using the L1 norm and a regularization term using the total variation of the flow. The main feature of this formulation is that it allows discontinuities in the flow field, while being more robust to noise than the classical approach. The algorithm is an efficient numerical scheme that solves a relaxed version of the problem by alternating minimization.
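
To make the alternating scheme concrete, here is a hedged sketch of one of its two sub-steps: the pointwise thresholding that handles the L1 data term for a fixed flow, with the total-variation sub-step (e.g. Chambolle's dual iteration) and the warping loop omitted. Here I0 is the first image, I1w the second image warped by the flow u0, and I1wx/I1wy its warped gradients; array names and parameter values are illustrative.

```python
# Hedged sketch: the pointwise L1 thresholding step of the TV-L1 scheme.
import numpy as np

def l1_threshold_step(I0, I1w, I1wx, I1wy, u1, u2, u01, u02, lam, theta, eps=1e-9):
    # Linearized residual rho(u) around the flow u0 used for warping.
    rho = I1w + I1wx * (u1 - u01) + I1wy * (u2 - u02) - I0
    grad_sq = I1wx ** 2 + I1wy ** 2 + eps
    lt = lam * theta

    v1, v2 = u1.copy(), u2.copy()
    case1 = rho < -lt * grad_sq           # residual strongly negative
    case2 = rho > lt * grad_sq            # residual strongly positive
    case3 = ~(case1 | case2)              # small residual: project onto rho = 0

    v1[case1] += lt * I1wx[case1]
    v2[case1] += lt * I1wy[case1]
    v1[case2] -= lt * I1wx[case2]
    v2[case2] -= lt * I1wy[case2]
    v1[case3] -= (rho[case3] / grad_sq[case3]) * I1wx[case3]
    v2[case3] -= (rho[case3] / grad_sq[case3]) * I1wy[case3]
    return v1, v2

# Toy usage on random arrays, purely to show the shapes involved:
rng = np.random.default_rng(0)
shape = (8, 8)
I0, I1w, I1wx, I1wy = (rng.random(shape) for _ in range(4))
u1 = np.zeros(shape); u2 = np.zeros(shape)
v1, v2 = l1_threshold_step(I0, I1w, I1wx, I1wy, u1, u2, u1, u2, lam=0.15, theta=0.3)
```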

Relevance:

10.00%

Publisher:

Abstract:

[EN] The seminal work of Horn and Schunck [8] is the first variational method for optical flow estimation. It introduced a novel framework in which the optical flow is computed as the solution of a minimization problem. From the assumption that pixel intensities do not change over time, the optical flow constraint equation is derived. This equation relates the optical flow to the derivatives of the image. There are infinitely many vector fields that satisfy the optical flow constraint, so the problem is ill-posed. To overcome this, Horn and Schunck introduced an additional regularity condition that restricts the possible solutions. Their method minimizes both the optical flow constraint and the magnitude of the variations of the flow field, producing smooth vector fields. One limitation of this method is that, typically, it can only estimate small motions. In the presence of large displacements, it fails when the gradient of the image is not smooth enough. In this work, we describe an implementation of the original Horn and Schunck method and also introduce a multi-scale strategy in order to deal with larger displacements. For this multi-scale strategy, we create a pyramidal structure of downsampled images and replace the optical flow constraint equation with a nonlinear formulation. In order to tackle this nonlinear formulation, we linearize it and solve the method iteratively at each scale. In this sense, there are two common approaches: one that computes the motion increment during the iterations, and the one we follow, which computes the full flow during the iterations. The solutions are incrementally refined over the scales. This pyramidal structure is a standard tool in many optical flow methods.
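
As a hedged, single-scale sketch of the method just described (the pyramid and the nonlinear warping are omitted), the classical Horn-Schunck fixed-point update is u ← ū - Ix(Ix ū + Iy v̄ + It)/(α² + Ix² + Iy²), and analogously for v, where ū, v̄ are local averages of the flow. The derivative scheme, averaging kernel, and parameter values below are illustrative choices, not the reference implementation.

```python
# Hedged single-scale sketch of the Horn-Schunck iterations (no pyramid/warping).
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(I0, I1, alpha=15.0, n_iter=100):
    I0 = I0.astype(float); I1 = I1.astype(float)
    Iy, Ix = np.gradient(I0)     # spatial derivatives (a simple choice)
    It = I1 - I0                 # temporal derivative

    avg_kernel = np.array([[1/12, 1/6, 1/12],
                           [1/6,  0.0, 1/6],
                           [1/12, 1/6, 1/12]])
    u = np.zeros_like(I0); v = np.zeros_like(I0)
    for _ in range(n_iter):
        u_avg = convolve(u, avg_kernel, mode='nearest')
        v_avg = convolve(v, avg_kernel, mode='nearest')
        common = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * common
        v = v_avg - Iy * common
    return u, v

# Toy usage: a small synthetic translation of a random pattern.
rng = np.random.default_rng(0)
I0 = rng.random((32, 32))
I1 = np.roll(I0, shift=1, axis=1)   # shift one pixel along the column axis
u, v = horn_schunck(I0, I1)
print(u.mean(), v.mean())
```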

Relevance:

10.00%

Publisher:

Abstract:

[EN] We present in this paper a variational approach to accurately estimate, simultaneously, the velocity field and its derivatives directly from PIV image sequences. Our method differs from other techniques presented in the literature in that the energy minimization used to estimate the particle motion depends on a second-order Taylor development of the flow. In this way, we are not only able to compute the motion vector field, but we also obtain an accurate estimation of its derivatives. Hence, we avoid the use of numerical schemes to compute the derivatives from the estimated flow, which usually lead to a numerical amplification of the inherent uncertainty of the estimated flow. The performance of our approach is illustrated with the estimation of the motion vector field and the vorticity on both synthetic and real PIV datasets.
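
A hedged sketch of the local motion model implied by a second-order Taylor development of the flow: the displacement at x0 + dx is predicted from the flow, its Jacobian, and its second derivatives at x0. This only illustrates the parametrization; the variational estimation of these quantities from PIV images is not shown, and all names and values are illustrative.

```python
# Hedged sketch: second-order Taylor model of the flow around a point x0.
# w(x0 + dx) ~= w(x0) + J(x0) dx + 0.5 * [dx^T Hu dx, dx^T Hv dx]
import numpy as np

def taylor_flow(dx, w0, J, Hu, Hv):
    """Predict the displacement at x0 + dx from the flow value w0 (2,),
    its Jacobian J (2x2), and the Hessians Hu, Hv (2x2) of each component."""
    dx = np.asarray(dx, dtype=float)
    second = 0.5 * np.array([dx @ Hu @ dx, dx @ Hv @ dx])
    return w0 + J @ dx + second

# Illustrative values only; the Jacobian's antisymmetric part gives the vorticity.
w0 = np.array([1.0, 0.5])
J = np.array([[0.1, -0.3],
              [0.4,  0.2]])
Hu = np.zeros((2, 2)); Hv = np.zeros((2, 2))
print(taylor_flow([1.0, 0.0], w0, J, Hu, Hv))
print("vorticity ~", J[1, 0] - J[0, 1])   # dv/dx - du/dy
```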

Relevance:

10.00%

Publisher:

Abstract:

[EN] In this paper we present a variational technique for the reconstruction of 3D cylindrical surfaces. Roughly speaking, by a cylindrical surface we mean a surface that can be parameterized using the projection on a cylinder in terms of two coordinates, representing the displacement and the angle in a cylindrical coordinate system, respectively. The starting point for our method is a set of different views of a cylindrical surface, as well as a precomputed disparity map estimation between pairs of images. The proposed variational technique is based on an energy minimization in which we balance, on the one hand, the regularity of the cylindrical function given by the distance of the surface points to the cylinder axis and, on the other hand, the distance between the projection of the surface points on the images and the location expected from the precomputed disparity map estimation between pairs of images. One interesting advantage of this approach is that we regularize the 3D surface by means of a bi-dimensional minimization problem. We show some experimental results for large stereo sequences.
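
A hedged sketch of the bi-dimensional minimization idea mentioned above, with the data term simplified to a distance to an observed radius map r_obs(l, θ) (the paper's term instead uses reprojection against the precomputed disparity maps): gradient descent on E(r) = μ/2 Σ(r - r_obs)² + λ/2 Σ|∇r|², with periodic boundary conditions in the angular coordinate. All parameters and names are illustrative.

```python
# Hedged sketch: regularize a cylindrical radius map r(l, theta) by gradient
# descent on a quadratic data + smoothness energy (simplified data term).
import numpy as np

def regularize_radius(r_obs, lam=1.0, mu=1.0, dt=0.1, n_iter=200):
    r = r_obs.copy()
    for _ in range(n_iter):
        # 5-point Laplacian: periodic in theta (axis 1), replicated in l (axis 0).
        up = np.vstack([r[:1], r[:-1]])
        down = np.vstack([r[1:], r[-1:]])
        left = np.roll(r, 1, axis=1)
        right = np.roll(r, -1, axis=1)
        lap = up + down + left + right - 4 * r
        grad_E = mu * (r - r_obs) - lam * lap
        r -= dt * grad_E
    return r

# Toy usage: a noisy cylinder of radius 1 (hypothetical data) on an (l, theta) grid.
rng = np.random.default_rng(0)
r_obs = 1.0 + 0.05 * rng.standard_normal((40, 64))
r_smooth = regularize_radius(r_obs)
print(r_obs.std(), r_smooth.std())   # the regularized map is smoother
```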

Relevance:

10.00%

Publisher:

Abstract:

[EN] We present an energy-based approach to estimate a dense disparity map from a set of two weakly calibrated stereoscopic images while preserving the discontinuities resulting from image boundaries. We first derive a simplified expression for the disparity that allows us to estimate it from a stereo pair of images using an energy minimization approach. We assume that the epipolar geometry is known and include this information in the energy model. Discontinuities are preserved by means of a regularization term based on the Nagel-Enkelmann operator. We investigate the associated Euler-Lagrange equation of the energy functional and approach the solution of the underlying partial differential equation (PDE) using a gradient descent method. The resulting parabolic problem has a unique solution. In order to reduce the risk of being trapped in irrelevant local minima during the iterations, we use a focusing strategy based on a linear scale-space. Experimental results on both synthetic and real images are presented to illustrate the capabilities of this PDE- and scale-space-based method.
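
A hedged sketch of only the scale-space focusing idea mentioned above: the images are smoothed with a Gaussian whose standard deviation decreases over the outer iterations, so the early minimization steps see a simplified problem. The disparity solver itself (gradient descent on the Nagel-Enkelmann-regularized energy) is represented by a placeholder; names and the sigma schedule are illustrative.

```python
# Hedged sketch: linear scale-space focusing strategy for an energy-based
# disparity estimator; the inner solver is a placeholder.
import numpy as np
from scipy.ndimage import gaussian_filter

def refine_disparity(left, right, disparity_init):
    """Placeholder for the gradient-descent minimization of the disparity energy."""
    return disparity_init  # a real implementation would iterate the PDE here

def focusing_strategy(left, right, sigmas=(4.0, 2.0, 1.0, 0.0)):
    disparity = np.zeros_like(left, dtype=float)
    for sigma in sigmas:                       # coarse (large sigma) -> fine
        L = gaussian_filter(left, sigma) if sigma > 0 else left
        R = gaussian_filter(right, sigma) if sigma > 0 else right
        disparity = refine_disparity(L, R, disparity)
    return disparity

rng = np.random.default_rng(0)
left = rng.random((32, 32)); right = np.roll(left, 2, axis=1)
print(focusing_strategy(left, right).shape)
```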

Relevance:

10.00%

Publisher:

Abstract:

[EN] In this paper we present a method for the regularization of 3D cylindrical surfaces. By a cylindrical surface we mean a 3D surface that can be expressed as a map S(l, µ) → R³, where (l, µ) represents a cylindrical parametrization of the 3D surface. We build an initial cylindrical parametrization of the surface and propose a new method to regularize it. This method takes into account the information supplied by the disparity maps computed between pairs of images to constrain the regularization of the set of 3D points. We propose a model based on an energy composed of two terms: an attachment term that minimizes the difference between the image coordinates and the disparity maps, and a second term that enables regularization by means of anisotropic diffusion. One interesting advantage of this approach is that we regularize the 3D surface by solving a bi-dimensional minimization problem.

Relevance:

10.00%

Publisher:

Abstract:

[EN] In recent years we have developed several methods for 3D reconstruction. We began with the problem of reconstructing a 3D scene from a stereoscopic pair of images, developing methods based on energy functionals that produce dense disparity maps while preserving discontinuities at image boundaries. We then moved to the problem of reconstructing a 3D scene from multiple views (more than two). The method for multiple-view reconstruction relies on the method for stereoscopic reconstruction: for every pair of consecutive images we estimate a disparity map, and then we apply a robust method that searches for good correspondences through the sequence of images. Recently we have proposed several methods for 3D surface regularization. This is a post-processing step necessary for smoothing the final surface, which could be affected by noise or mismatched correspondences. These regularization methods are interesting because they use information from the reconstruction process and not only from the 3D surface. We have tackled all of these problems with an energy minimization approach: we investigate the associated Euler-Lagrange equation of the energy functional, and we approach the solution of the underlying partial differential equation (PDE) using a gradient descent method.

Relevance:

10.00%

Publisher:

Abstract:

[EN] This paper proposes the incorporation of engineering knowledge through both (a) advanced state-of-the-art preference-handling decision-making tools integrated in multiobjective evolutionary algorithms and (b) engineering-knowledge-based variance reduction simulation as enhancing tools for the robust optimum design of structural frames, taking into consideration uncertainties in the design variables. The simultaneous minimization of the constrained weight (adding the structural weight and the average distribution of constraint violations) on the one hand, and of the standard deviation of the distribution of constraint violation on the other, is handled with multiobjective-optimization-based evolutionary computation in two different multiobjective algorithms. The optimum design values of the deterministic structural problem in question are proposed as a reference point (the aspiration level) in reference-point-based evolutionary multiobjective algorithms (here g-dominance is used). Results including
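
A hedged sketch of the g-dominance preference relation referenced above, following our understanding of Molina et al.'s definition for minimization objectives: solutions are flagged by whether they dominate the reference point g or are dominated by it, and the flag takes precedence over ordinary Pareto dominance. The objective values and the reference point below are illustrative only.

```python
# Hedged sketch of the g-dominance relation (all objectives minimized).
import numpy as np

def flag(f, g):
    """1 if f dominates the reference point g or is dominated by it, else 0."""
    f, g = np.asarray(f, float), np.asarray(g, float)
    return 1 if np.all(f <= g) or np.all(f >= g) else 0

def pareto_dominates(fa, fb):
    fa, fb = np.asarray(fa, float), np.asarray(fb, float)
    return np.all(fa <= fb) and np.any(fa < fb)

def g_dominates(fa, fb, g):
    """fa g-dominates fb: flags are compared first, then Pareto dominance."""
    if flag(fa, g) != flag(fb, g):
        return flag(fa, g) > flag(fb, g)
    return pareto_dominates(fa, fb)

# Illustrative objectives: (constrained weight, std of constraint violation),
# with a hypothetical aspiration level g.
g = (100.0, 2.0)
print(g_dominates((95.0, 1.5), (105.0, 3.0), g))   # both flagged; Pareto decides -> True
print(g_dominates((90.0, 5.0), (105.0, 3.0), g))   # (90, 5) is unflagged, so it loses -> False
```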