997 results for Randomized algorithm
Abstract:
This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at the low-level stage, where the algorithms deal with a large amount of data. In a motion estimation algorithm, correspondences between two images have to be solved at the low level. In underwater imaging, normalised correlation can be a solution in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reducing the computation time. Taking into consideration the complexity of the normalised correlation criterion, a new approach using a parallel organisation of every processor in the architecture is proposed.
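As a rough illustration of the normalised correlation criterion the abstract refers to (the parallel architecture itself is not sketched here), the following minimal Python example computes zero-mean normalised cross-correlation between image patches; function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def normalised_correlation(patch_a, patch_b):
    """Zero-mean normalised cross-correlation between two equally sized patches.

    The zero-mean, unit-variance form makes the score largely insensitive to
    non-uniform illumination, which is why it suits underwater imagery.
    """
    a = patch_a.astype(np.float64) - patch_a.mean()
    b = patch_b.astype(np.float64) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return 0.0
    return float((a * b).sum() / denom)

def best_match(template, search_image):
    """Exhaustive search for `template` inside `search_image`; each candidate
    position is independent, i.e. the part a data-parallel scheme could split
    across processors."""
    th, tw = template.shape
    best_score, best_pos = -1.0, (0, 0)
    for y in range(search_image.shape[0] - th + 1):
        for x in range(search_image.shape[1] - tw + 1):
            score = normalised_correlation(template, search_image[y:y + th, x:x + tw])
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```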
Abstract:
This paper proposes a pose-based algorithm to solve the full SLAM problem for an autonomous underwater vehicle (AUV) navigating in an unknown and possibly unstructured environment. The technique combines probabilistic scan matching, using range scans gathered from a mechanical scanning imaging sonar (MSIS), with the robot dead-reckoning displacements estimated from a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method uses two extended Kalman filters (EKF). The first estimates the local path travelled by the robot while grabbing the scan, as well as its uncertainty, and provides position estimates for correcting the distortions that the vehicle motion produces in the acoustic images. The second is an augmented-state EKF that estimates and keeps the poses of the registered scans. The raw data from the sensors are processed and fused online. No prior structural information or initial pose is assumed. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, showing the viability of the proposed approach.
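The abstract does not give the filter equations, but both filters follow the standard EKF predict/update cycle. The Python sketch below is a generic version of that cycle under assumed placeholder motion and measurement models (f, h and their Jacobians), not the paper's specific formulation.

```python
import numpy as np

def ekf_predict(x, P, f, F_jac, Q):
    """EKF prediction: propagate state x and covariance P through motion model f
    (here this would be driven by DVL/MRU dead reckoning)."""
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, h, H_jac, R):
    """EKF update: correct the prediction with a measurement z (e.g. a
    scan-matching result between registered sonar scans)."""
    H = H_jac(x_pred)
    y = z - h(x_pred)                      # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x_pred + K @ y
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P
```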
Abstract:
The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the cost in terms of calculation and storage requirements, they propose the use of the multinomial distribution function. This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements, which in turn makes a simple deconvolution process possible. Moreover, under certain conditions, additional improvements may be achieved.
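The abstract does not specify the traffic model, but the usual setting for this kind of CAC calculation is classes of on/off sources. Under that assumption, the Python sketch below computes the distribution of instantaneous bandwidth directly from the binomial/multinomial formula for each class and then combines classes, instead of convolving connection by connection; all names and figures are illustrative, not from the paper.

```python
from math import comb
from collections import defaultdict

def class_bandwidth_pmf(n_sources, p_active, rate):
    """Probability mass of the instantaneous bandwidth demanded by one traffic
    class: k of n_sources on/off sources active, each contributing `rate`.
    Evaluated directly from the binomial formula rather than by repeated
    per-connection convolution."""
    return {k * rate: comb(n_sources, k) * p_active**k * (1 - p_active)**(n_sources - k)
            for k in range(n_sources + 1)}

def aggregate_pmf(classes):
    """Combine the per-class distributions (one combination step per class,
    not per connection), which is where the saving in calculation comes from."""
    agg = {0.0: 1.0}
    for n, p, r in classes:
        nxt = defaultdict(float)
        for bw1, pr1 in agg.items():
            for bw2, pr2 in class_bandwidth_pmf(n, p, r).items():
                nxt[bw1 + bw2] += pr1 * pr2
        agg = dict(nxt)
    return agg

# Hypothetical example: 10 sources at 2 Mb/s with activity 0.3, 5 at 10 Mb/s with 0.1
pmf = aggregate_pmf([(10, 0.3, 2.0), (5, 0.1, 10.0)])
overflow = sum(p for bw, p in pmf.items() if bw > 45.0)  # P(demand exceeds an assumed 45 Mb/s link)
```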
Abstract:
The aim of traffic engineering is to optimise network resource utilisation. Although several works on minimising network resource utilisation have been published, few have focused on the LSR label space. This paper proposes an algorithm that uses MPLS label stack features in order to reduce the number of labels used in LSP forwarding. Some tunnelling methods and their MPLS implementation drawbacks are also discussed. The algorithm described sets up the NHLFE tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the algorithm achieves a large reduction factor in the label space. The work presented here applies to both types of connections: P2MP and P2P.
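The algorithm itself is not reproduced in the abstract; the toy Python sketch below only illustrates why label stacking saves label space: interior LSRs along a tunnel switch on the outer label alone, so many LSPs can share a single label there. The stack operations and label values are hypothetical.

```python
# Toy illustration (not the paper's algorithm) of MPLS label stacking.

def push(stack, label):       # enter a tunnel: outer label goes on top
    return [label] + stack

def pop(stack):               # leave the tunnel (or penultimate-hop pop)
    return stack[1:]

def swap(stack, new_label):   # normal LSR forwarding on the top label only
    return [new_label] + stack[1:]

# Three LSPs entering the same tunnel share the single outer label 100 at every
# interior LSR; without stacking, each LSP would consume its own label there.
lsps = {"lsp_a": [17], "lsp_b": [18], "lsp_c": [19]}
in_tunnel = {name: push(stack, 100) for name, stack in lsps.items()}
print(in_tunnel)  # {'lsp_a': [100, 17], 'lsp_b': [100, 18], 'lsp_c': [100, 19]}
```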
Abstract:
In computer graphics, global illumination algorithms take into account not only the light that comes directly from the sources but also the light interreflections. This kind of algorithm produces very realistic images, but at a high computational cost, especially when dealing with complex environments. Parallel computation has been successfully applied to such algorithms in order to make it possible to compute highly realistic images in a reasonable time. We introduce here a speculation-based parallel solution for a global illumination algorithm in the context of radiosity, in which we have taken advantage of the hierarchical nature of such an algorithm.
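For context, the classical radiosity equation this line of work builds on is B_i = E_i + rho_i * sum_j F_ij B_j. The short Python sketch below solves it with a plain Jacobi-style gathering iteration; it is only a baseline illustration, not the hierarchical, speculation-based parallel scheme the paper proposes.

```python
import numpy as np

def solve_radiosity(emission, reflectance, form_factors, iters=100):
    """Jacobi-style gathering iteration for the radiosity equation
    B_i = E_i + rho_i * sum_j F_ij * B_j (interreflections between patches).

    Each row of the gathering step is independent, which is what makes the
    problem amenable to parallel (and, in hierarchical variants, speculative)
    evaluation.
    """
    B = emission.copy()
    for _ in range(iters):
        B = emission + reflectance * (form_factors @ B)
    return B
```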
Abstract:
Introduction: neuropathic pain is a condition of considerable prevalence and socio-economic impact in the Latin American population; clinical evidence suggests that calcium-channel ligands and the lidocaine patch can successfully treat peripheral, localised neuropathic pain. Methods: an observational, retrospective cost-effectiveness economic evaluation was carried out with data extracted from the clinical records of patients seen at the pain clinic of the IPS. The primary effectiveness variable was pain improvement measured on a visual analogue scale. Results: 94 patients were studied, treated with gabapentin (G) 21, pregabalin (P) 24, gabapentin + lidocaine (G/L) 24 and pregabalin + lidocaine (P/L) 25; the costs associated with treatment were COP$114,070,835, COP$105,855,920, COP$88,717,481 and COP$89,854,712 respectively, and the number of patients with significant pain improvement was 8, 10, 9 and 21 respectively. The ICER of G/L with respect to G was COP$-25,353,354. The ICER of P/L with respect to P was COP$-1,454,655. Conclusions: adding the lidocaine patch to regular therapy (P/L) represented a reduction in the consumption of health resources such as co-analgesic medication, rescue analgesics and drugs to control adverse reactions, as well as consultations with health professionals. Each patient managed with P/L represents a saving of COP$1,454,655 compared with management with the anticonvulsant alone; in the case of G/L this saving is COP$25,353,354 compared with G alone.
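The reported ICERs follow from dividing the difference in total treatment cost by the difference in the number of patients with significant improvement. The short Python check below reproduces the abstract's figures under that reading; it is a worked illustration, not the authors' analysis code.

```python
def icer(cost_new, cost_ref, effect_new, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per additional unit of
    effect (here, per additional patient with significant pain improvement)."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Figures reported in the abstract (total cost in COP, patients improved)
print(icer(88_717_481, 114_070_835, 9, 8))    # G/L vs G  -> -25,353,354
print(icer(89_854_712, 105_855_920, 21, 10))  # P/L vs P  -> about -1,454,655 (rounded in the abstract)
```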
Abstract:
Introduction: Clostridium difficile infection is one of the most frequent causes of nosocomial diarrhoea, with high morbidity and mortality and an exponentially increasing incidence. In the United States it doubled, from 261 cases per 100,000 in 1993 to 546 per 100,000 in 2003 [2], and similar data were found in Canada, with a 4.5-fold increase, from 35.6 cases per 100,000 in 1991 to 156.3 cases per 100,000 in 2004 [3]. Several associated factors have been described. Materials and Methods: This is a descriptive case-series study that evaluated patients diagnosed with C. difficile infection, and its associated factors, at a university hospital between February 2010 and September 2011. Results: 31 patients were included; the mean age was 58 years, with a range of 18 to 93 years, and 19 (61%) were women and 12 (39%) men. The factor most frequently found to be associated with C. difficile infection was the use of proton-pump inhibitors, at 54.84% (n=17). No HIV-positive patients or patients diagnosed with inflammatory bowel disease were found. No patient presented complications associated with the infection, and there was no mortality. Conclusion: The most frequent associated factor was the use of antimicrobials in the fifteen days before onset of symptoms, in 74% of patients, which is consistent with the world literature.
Abstract:
We carried out a randomized controlled trial in Bogotá, the recipient of Colombia's highest number of internally displaced people (IDP), to assess whether the use of SMS to communicate eligibility for social benefits fosters the welfare of victimized internal refugees. Only a fraction of IDP are eligible for benefits. We inform eligibility via SMS to a random half of the IDP households who are, and estimate the Local Average Treatment Effect of the text message on knowledge of the benefits available to the displaced population. We show that while, on average, treated households know their rights better than controls, a more disaggregated analysis suggests that awareness varies across benefits. The intervention was overall successful in empowering IDP, and the use of SMS should be widened as a social policy instrument. However, our results suggest that text messages should be complemented with other communication strategies, yet to be evaluated.
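The abstract does not say how the Local Average Treatment Effect was computed; a standard way to obtain it in a design like this is the Wald/instrumental-variables estimator, sketched below in Python. The variable names (outcome, take-up, random assignment) are assumptions for illustration, not the study's actual data.

```python
import numpy as np

def wald_late(y, d, z):
    """Wald / IV estimate of the Local Average Treatment Effect: the
    intention-to-treat effect on the outcome divided by the first-stage effect
    on actual treatment take-up, using random assignment z as the instrument.

    y: outcome (e.g. a knowledge-of-benefits score), d: actually received/read
    the SMS, z: randomly assigned to receive the SMS. Equal-length arrays.
    """
    y, d, z = map(np.asarray, (y, d, z))
    itt = y[z == 1].mean() - y[z == 0].mean()          # reduced form
    first_stage = d[z == 1].mean() - d[z == 0].mean()  # compliance difference
    return itt / first_stage
```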
Abstract:
Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
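The quantity the abstract keys its parameters on, the entropy of the diffusion tensor's eigenvalues, can be computed as below. This Python sketch illustrates that single step under the usual Shannon-entropy reading; it is not the tracking algorithm itself.

```python
import numpy as np

def eigenvalue_entropy(tensor):
    """Shannon entropy of the normalised eigenvalues of a 3x3 diffusion tensor.

    Low entropy means one direction dominates (high anisotropy, confident
    tracking step); high entropy means nearly isotropic diffusion (uncertain
    direction), which is what lets the parameters adapt at each point.
    """
    eigvals = np.linalg.eigvalsh(tensor)
    eigvals = np.clip(eigvals, 1e-12, None)   # guard against zero/negative values
    p = eigvals / eigvals.sum()
    return float(-(p * np.log(p)).sum())
```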
Abstract:
This paper discusses auditory brainstem response (ABR) testing for infants.
Abstract:
This paper describes the results of an investigation which examined the efficacy of a feedback equalization algorithm incorporated into the Central Institute for the Deaf Wearable Digital Hearing Aid. The study examined whether the feedback equalization would allow for greater usable gains when subjects listened to soft speech signals, and if so, whether or not this would improve speech intelligibility.
Abstract:
An improved algorithm for the generation of gridded window brightness temperatures is presented. The primary data source is the International Satellite Cloud Climatology Project, level B3 data, covering the period from July 1983 to the present. The algorithm takes window brightness temperatures from multiple satellites, both geostationary and polar orbiting, which have already been navigated and normalized radiometrically to the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer, and generates 3-hourly global images on a 0.5 degrees by 0.5 degrees latitude-longitude grid. The gridding uses a hierarchical scheme based on spherical kernel estimators. As part of the gridding procedure, the geostationary data are corrected for limb effects using a simple empirical correction to the radiances, from which the corrected temperatures are computed. This is in addition to the application of satellite zenith angle weighting to downweight limb pixels relative to nearer-nadir pixels. The polar orbiter data are windowed on the target time with temporal weighting to account for the noncontemporaneous nature of the data. Large regions of missing data are interpolated from adjacent processed images using a form of motion-compensated interpolation based on the estimation of motion vectors using a hierarchical block matching scheme. Examples are shown of the various stages in the process. Also shown are examples of the usefulness of this type of data in GCM validation.
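The paper's gridding uses hierarchical spherical kernel estimators; the much simpler Python sketch below only illustrates the zenith-angle weighting idea with plain weighted binning onto a 0.5-degree grid. The weight function and all names are assumptions for illustration, not the algorithm described in the abstract.

```python
import numpy as np

def zenith_weight(zenith_deg, power=2.0):
    """Illustrative weight that downweights limb pixels (large satellite zenith
    angle) relative to near-nadir pixels."""
    return np.cos(np.radians(zenith_deg)) ** power

def grid_brightness_temps(lat, lon, tb, zenith_deg, res=0.5):
    """Weighted binning of window brightness temperatures onto a res-degree
    latitude-longitude grid (a stand-in for the hierarchical spherical-kernel
    gridding in the paper)."""
    lat, lon, tb = map(np.asarray, (lat, lon, tb))
    w = zenith_weight(np.asarray(zenith_deg))
    ny, nx = int(180 / res), int(360 / res)
    iy = np.clip(((lat + 90.0) / res).astype(int), 0, ny - 1)
    ix = np.clip(((lon + 180.0) / res).astype(int), 0, nx - 1)
    num = np.zeros((ny, nx))
    den = np.zeros((ny, nx))
    np.add.at(num, (iy, ix), w * tb)
    np.add.at(den, (iy, ix), w)
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(den > 0, num / den, np.nan)
```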
Abstract:
Modern methods of spawning new technological motifs are not appropriate when it is desired to realize artificial life as an actual real world entity unto itself (Pattee 1995; Brooks 2006; Chalmers 1995). Many fundamental aspects of such a machine are absent in common methods, which generally lack methodologies of construction. In this paper we mix classical and modern studies in order to attempt to realize an artificial life form from first principles. A model of an algorithm is introduced, its methodology of construction is presented, and the fundamental source from which it sprang is discussed.