823 results for BP algorithm
Abstract:
Image segmentation of natural scenes is a major problem in machine vision. This paper presents a new approach to image segmentation based on the integration of edge and region information. The approach begins by detecting the main contours of the scene, which are later used to guide a concurrent set of growing processes. A prior analysis of the seed pixels permits the homogeneity criterion to be adjusted to the region's characteristics during the growing process. Since the high variability of regions in outdoor scenes makes the classical homogeneity criteria useless, a new homogeneity criterion based on clustering analysis and convex hull construction is proposed. Experimental results have proven the reliability of the proposed approach.
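The growing step can be illustrated with a minimal seeded-region-growing sketch in Python. The running-mean intensity threshold below is only a stand-in for the paper's clustering/convex-hull homogeneity criterion, and all names are illustrative.

from collections import deque
import numpy as np

def region_grow(image, seed, tol=10.0):
    # Grow a region from `seed`, accepting 4-connected neighbours whose
    # intensity stays within `tol` of the running region mean (a simple
    # placeholder for the paper's homogeneity criterion).
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(image[seed]), 1
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(image[ny, nx]) - total / count) <= tol:
                    mask[ny, nx] = True
                    total += float(image[ny, nx])
                    count += 1
                    queue.append((ny, nx))
    return mask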
Abstract:
This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at the low level, where the algorithms deal with a great number of data. In a motion estimation algorithm, correspondences between two images have to be solved at this low level. In underwater imaging, normalised correlation can be a solution in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reducing the computation time. Taking into consideration the complexity of the normalised correlation criterion, a new approach based on the parallel organisation of every processor in the architecture is proposed.
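Normalised correlation itself is standard; a Python sketch of the zero-mean form and the exhaustive matching loop it implies is shown below. Each candidate window is scored independently, which is what makes the problem amenable to a parallel architecture. Function names are illustrative, not the paper's.

import numpy as np

def normalised_correlation(patch, window):
    # Zero-mean normalised cross-correlation between equal-size patches;
    # invariant to affine illumination changes, hence its appeal under
    # non-uniform underwater lighting.
    a = patch - patch.mean()
    b = window - window.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def best_match(patch, image):
    # Exhaustive search for the displacement maximising the NCC score.
    ph, pw = patch.shape
    ih, iw = image.shape
    best, best_pos = -2.0, (0, 0)
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            score = normalised_correlation(patch, image[y:y + ph, x:x + pw])
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best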
Abstract:
This paper proposes a pose-based algorithm to solve the full SLAM problem for an autonomous underwater vehicle (AUV) navigating in an unknown and possibly unstructured environment. The technique incorporates probabilistic scan matching with range scans gathered from a mechanical scanning imaging sonar (MSIS) and the robot's dead-reckoning displacements estimated from a Doppler velocity log (DVL) and a motion reference unit (MRU). The proposed method uses two extended Kalman filters (EKFs). The first estimates the local path travelled by the robot while grabbing the scan, as well as its uncertainty, and provides position estimates for correcting the distortions that the vehicle motion produces in the acoustic images. The second is an augmented-state EKF that estimates and keeps the poses of the registered scans. The raw data from the sensors are processed and fused online. No prior structural information or initial pose is assumed. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, showing the viability of the proposed approach.
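Both filters rest on the standard EKF predict/update equations; a generic Python sketch, with the motion and observation models f, h and their Jacobians F, H left abstract, is given below. This is textbook EKF machinery, not the paper's specific formulation.

import numpy as np

def ekf_predict(x, P, f, F, Q):
    # Propagate state x and covariance P through motion model f with
    # Jacobian F and process-noise covariance Q.
    return f(x), F @ P @ F.T + Q

def ekf_update(x, P, z, h, H, R):
    # Correct (x, P) with measurement z, observation model h, Jacobian H
    # and measurement-noise covariance R.
    y = z - h(x)                      # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P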
Abstract:
The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the cost in terms of calculation and storage requirements, they propose the use of the multinomial distribution function. This permits direct computation of the probabilities associated with the instantaneous bandwidth requirements, which in turn makes a simple deconvolution process possible. Moreover, under certain conditions, additional improvements may be achieved.
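The idea can be sketched directly: for n i.i.d. sources, each in one of k states with given probabilities and per-state bandwidths, the multinomial pmf yields the probability of every aggregate bandwidth level without repeated convolutions. The Python below assumes homogeneous sources and illustrative parameters; the paper's exact formulation may differ.

from math import factorial, prod

def compositions(n, k):
    # All k-tuples of non-negative integers summing to n.
    if k == 1:
        yield (n,)
        return
    for i in range(n + 1):
        for rest in compositions(n - i, k - 1):
            yield (i,) + rest

def bandwidth_distribution(n, state_probs, state_bw):
    # P(aggregate bandwidth = b) for n i.i.d. sources via the multinomial
    # pmf: n! * prod(p_i^c_i / c_i!) over state occupation counts c.
    dist = {}
    for counts in compositions(n, len(state_probs)):
        pmf = factorial(n) * prod(p ** c / factorial(c)
                                  for p, c in zip(state_probs, counts))
        bw = sum(c * b for c, b in zip(counts, state_bw))
        dist[bw] = dist.get(bw, 0.0) + pmf
    return dist

# Example: 10 on/off sources, each active with probability 0.3 at 64 kbit/s.
print(bandwidth_distribution(10, [0.7, 0.3], [0, 64]))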
Abstract:
The aim of traffic engineering is to optimise network resource utilisation. Although several works on minimising network resource utilisation have been published, few have focused on the LSR label space. This paper proposes an algorithm that uses MPLS label stack features to reduce the number of labels used in LSP forwarding. Some tunnelling methods and the drawbacks of their MPLS implementations are also discussed. The algorithm described sets up the NHLFE tables in each LSR, creating asymmetric tunnels when possible. Experimental results show that the algorithm achieves a large reduction factor in the label space. The work presented here applies to both types of connections: P2MP and P2P.
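The label-stack mechanism behind such tunnels can be caricatured with toy NHLFE tables in Python: LSPs entering a shared segment get a common outer label pushed on top, so intermediate LSRs consume a single label for the whole bundle. The entries and semantics below are deliberately simplified and hypothetical (a real push entry would typically also swap the inner label first).

# Hypothetical per-LSR NHLFE tables (label spaces are per LSR):
# incoming top label -> (operation, outgoing label, next hop).
NHLFE = {
    "LSR-A": {30: ("push", 17, "LSR-B"),    # tunnel entry: stack outer label 17
              31: ("push", 17, "LSR-B")},   # second LSP reuses the same outer label
    "LSR-B": {17: ("swap", 17, "LSR-C")},   # shared segment: one label, many LSPs
    "LSR-C": {17: ("pop", None, "LSR-D")},  # tunnel exit: reveal the inner label
}

def forward(lsr, stack):
    # Apply one NHLFE operation to a label stack (top of stack is last).
    op, out, hop = NHLFE[lsr][stack[-1]]
    if op == "swap":
        stack[-1] = out
    elif op == "push":
        stack.append(out)
    else:  # pop
        stack.pop()
    return stack, hop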
Abstract:
In computer graphics, global illumination algorithms take into account not only the light that comes directly from the sources but also the light interreflections. This kind of algorithm produces very realistic images, but at a high computational cost, especially when dealing with complex environments. Parallel computation has been successfully applied to such algorithms in order to make it possible to compute highly realistic images in a reasonable time. We introduce here a speculation-based parallel solution for a global illumination algorithm in the context of radiosity, in which we have taken advantage of the hierarchical nature of such an algorithm.
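The radiosity context can be summarised by the linear system B = E + diag(rho) F B, with emissions E, reflectivities rho and form factors F; a minimal Jacobi-style gathering solver is sketched below in Python. The hierarchical subdivision and the speculative parallelism that constitute the paper's contribution are not modelled here.

import numpy as np

def solve_radiosity(E, rho, F, iters=100):
    # Iterate B <- E + rho * (F @ B), the gathering form of the classical
    # radiosity system; converges when the reflectivities keep the spectral
    # radius of diag(rho) @ F below one.
    B = E.copy()
    for _ in range(iters):
        B = E + rho * (F @ B)
    return B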
Abstract:
Introduction: Neuropathic pain is a condition of considerable prevalence and socio-economic impact in the Latin American population; clinical evidence suggests that calcium channel ligands and the lidocaine patch can successfully treat peripheral, localized neuropathic pain. Methodology: An observational, retrospective cost-effectiveness economic evaluation was carried out with data extracted from the clinical records of patients treated at the pain clinic of the IPS. The primary effectiveness variable was pain improvement measured on a visual analogue scale. Results: 94 patients were studied, treated with gabapentin (G, 21), pregabalin (P, 24), gabapentin + lidocaine (G/L, 24) and pregabalin + lidocaine (P/L, 25); the costs associated with treatment were COP$114,070,835, COP$105,855,920, COP$88,717,481 and COP$89,854,712 respectively, and the numbers of patients with significant pain improvement were 8, 10, 9 and 21 respectively. The ICER of G/L with respect to G was COP$ -25,353,354. The ICER of P/L with respect to P was COP$ -1,454,655. Conclusions: The addition of the lidocaine patch to regular therapy with P/L represented a reduction in the consumption of health resources such as co-analgesic drugs, rescue analgesics and drugs to control adverse reactions, as well as consultations with health professionals. Each patient managed with P/L represents a saving of COP$1,454,655 compared with management with the anticonvulsant alone; in the case of G/L this saving is COP$25,353,354 versus G alone.
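The reported ICERs follow directly from the abstract's own costs and responder counts; a quick arithmetic check in Python (figures in COP):

cost = {"G": 114_070_835, "P": 105_855_920, "G/L": 88_717_481, "P/L": 89_854_712}
improved = {"G": 8, "P": 10, "G/L": 9, "P/L": 21}

def icer(a, b):
    # Incremental cost-effectiveness ratio of strategy a versus b:
    # cost difference per additional patient with significant pain relief.
    return (cost[a] - cost[b]) / (improved[a] - improved[b])

print(icer("G/L", "G"))   # -25353354.0   -> COP$ -25,353,354
print(icer("P/L", "P"))   # -1454655.27...-> COP$ -1,454,655 (rounded)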
Abstract:
Objective: To describe the factors related to the decision to perform surgical management in patients with hydronephrosis secondary to ureteropelvic junction obstruction in the Pediatric Urology service of a level-IV institution. Materials and Methods: A retrospective, descriptive study was conducted. 100 patients with an antenatal diagnosis of hydronephrosis were selected by convenience sampling; 37 underwent surgical management for ureteropelvic junction obstruction (UPJO) between 2009 and 2012. The factors leading to this decision were evaluated. Results: Patients with a postnatal diagnosis of UPJO represented 37% of the population. The indication for surgical management was calyceal dilation (SFU grade 3) in 13 patients (35.1%), deterioration of renal function in 21 patients (56.8%) and recurrent urinary tract infection in the remainder (8.1%). A 30% progression in the severity of dilation was found in the postnatal period: 9 patients (24% of the sample) had SFU grades 3 and 4 in the prenatal period, versus 20 (54%) in the postnatal period, who underwent surgical management. Among the patients with precise percentage-variation data from renal scintigraphy (16% of the sample), a 50% variation in the deterioration of renal function was found. Conclusion: In the group of Colombian patients studied from the outpatient clinic of the pediatric urology service, the decision for surgical management in patients with UPJO was found to be consistent with the world literature, the determining factors being the presence of calyceal dilation and deterioration of renal function on DMSA scintigraphy.
Abstract:
Introduction: Celiac disease (CD) is an autoimmune intestinal disease triggered by the ingestion of gluten. Given the lack of information on the presence of CD in Latin America (LA), we investigated the prevalence of the disease in this region through a systematic review of the literature and a meta-analysis. Methods and results: This work was carried out in two phases. The first was a cross-sectional study of 300 Colombian individuals. The second was a systematic review and meta-regression following the PRISMA guidelines. Our results reveal an absence of anti-tissue transglutaminase (tTG) and IgA anti-endomysium (EMA) antibodies in the Colombian population. In the systematic review, 72 articles met the selection criteria; the estimated prevalence of CD in LA was 0.46% to 0.64%, while the prevalence in first-degree relatives was 5.5% to 5.6%, and in patients with type 1 diabetes mellitus it was 4.6% to 8.7%. Conclusion: Our study shows that the prevalence of CD in healthy LA individuals is similar to that reported in the European population.
Abstract:
Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
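A plausible reading of the entropy criterion is the Shannon entropy of the normalised tensor eigenvalues, which is near 0 in strongly anisotropic (single-fibre) voxels and near log 3 in isotropic ones; the paper's exact definition may differ. A Python sketch:

import numpy as np

def eigenvalue_entropy(tensor):
    # Shannon entropy of the normalised eigenvalues of a 3x3 symmetric
    # diffusion tensor; a candidate driver for per-voxel parameter choice.
    lam = np.linalg.eigvalsh(tensor)
    lam = np.clip(lam, 1e-12, None)   # guard against numerical negatives
    p = lam / lam.sum()
    return float(-(p * np.log(p)).sum())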
Abstract:
This paper discusses the auditory brainstem response (ABR) testing for infants.
Abstract:
This paper describes the results of an investigation which examined the efficacy of a feedback equalization algorithm incorporated into the Central Institute for the Deaf Wearable Digital Hearing Aid. The study examined whether the feedback equalization would allow for greater usable gains when subjects listened to soft speech signals, and if so, whether or not this would improve speech intelligibility.
Abstract:
An improved algorithm for the generation of gridded window brightness temperatures is presented. The primary data source is the International Satellite Cloud Climatology Project level B3 data, covering the period from July 1983 to the present. The algorithm takes window brightness temperatures from multiple satellites, both geostationary and polar orbiting, which have already been navigated and normalized radiometrically to the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer, and generates 3-hourly global images on a 0.5 degree by 0.5 degree latitude-longitude grid. The gridding uses a hierarchical scheme based on spherical kernel estimators. As part of the gridding procedure, the geostationary data are corrected for limb effects using a simple empirical correction to the radiances, from which the corrected temperatures are computed. This is in addition to the application of satellite zenith angle weighting to downweight limb pixels in preference to nearer-nadir pixels. The polar orbiter data are windowed on the target time with temporal weighting to account for the noncontemporaneous nature of the data. Large regions of missing data are interpolated from adjacent processed images using a form of motion-compensated interpolation based on the estimation of motion vectors with a hierarchical block matching scheme. Examples are shown of the various stages in the process, as well as of the usefulness of this type of data in GCM validation.
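The combination of a kernel estimator with zenith-angle weighting can be sketched for a single grid cell as below; the Gaussian kernel, its width and the cos(zenith) factor are illustrative choices, not the paper's exact hierarchical spherical kernel estimator.

import numpy as np

def angular_distance(lat1, lon1, lat2, lon2):
    # Great-circle angular distance in degrees (haversine formula).
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlat, dlon = p2 - p1, np.radians(lon2 - lon1)
    a = np.sin(dlat / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
    return np.degrees(2 * np.arcsin(np.sqrt(a)))

def grid_cell_estimate(lats, lons, temps, zenith, cell_lat, cell_lon, sigma=0.5):
    # Kernel-weighted temperature at one grid cell: Gaussian in angular
    # distance, times a cos(zenith) factor that downweights limb pixels
    # in favour of near-nadir ones.
    d = angular_distance(lats, lons, cell_lat, cell_lon)
    w = np.exp(-0.5 * (d / sigma) ** 2) * np.cos(np.radians(zenith))
    return float((w * temps).sum() / w.sum())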
Abstract:
Modern methods of spawning new technological motifs are not appropriate when it is desired to realize artificial life as an actual real-world entity unto itself (Pattee 1995; Brooks 2006; Chalmers 1995). Many fundamental aspects of such a machine are absent from common methods, which generally lack methodologies of construction. In this paper we mix classical and modern studies in an attempt to realize an artificial life form from first principles. A model of an algorithm is introduced, its methodology of construction is presented, and the fundamental source from which it sprang is discussed.
Abstract:
An algorithm is presented for the generation of molecular models of defective graphene fragments containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which are then subjected to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities is then converted to a molecular structure and subjected to geometry optimization using a standard molecular modeling package to generate final atomic coordinates. On the basis of molecular mechanics with minimization, this automated method can generate structures that conform to user-supplied criteria and avoid the potential bias associated with the manual building of structures. One application of the algorithm is the generation of structures for evaluating the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
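The first stage (random points, Delaunay triangulation, Voronoi dual, ring-size statistics) maps directly onto scipy.spatial; the Python sketch below covers only that stage, leaving out the iterative refinement and the final geometry optimization.

import numpy as np
from scipy.spatial import Delaunay, Voronoi

rng = np.random.default_rng(0)
points = rng.uniform(0.0, 10.0, size=(50, 2))   # initial random 2D array

tri = Delaunay(points)    # triangulation of the point set
vor = Voronoi(points)     # its dual: one polygon per input point

# Ring-size statistics: the vertex count of each finite Voronoi cell is a
# candidate ring size (6 = hexagon; 5 and 7 = defects) before refinement.
sizes = [len(vor.regions[r]) for r in vor.point_region
         if -1 not in vor.regions[r] and len(vor.regions[r]) > 0]
print(sorted(set(sizes)))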