812 results for bilateral filtering
Abstract:
OBJECTIVE: This study aimed to assess the prevalence of, and risk factors for, mild and high-frequency bilateral sensorineural hearing loss in a UK population of children at age 11 years. DESIGN: Prospective birth cohort study. STUDY SAMPLE: Repeat hearing thresholds were measured in 5032 children, as part of the Avon Longitudinal Study of Parents and Children (ALSPAC), at ages 7, 9, and 11 years. Pregnancy, birth, and early medical history were obtained prospectively through parental questionnaires and medical records. RESULTS: Twenty children had mild and seven had high-frequency bilateral sensorineural hearing loss, giving a combined prevalence of 0.5% (95% CI 0.4-0.8%). These children were more likely than the rest of the study sample to have been admitted to hospital at 6-18 months (OR 2.7, 95% CI 1.00-7.30). Parents of these children were more likely to have suspected a hearing problem when the children were 3 years old (OR 2.4, 95% CI 1.05-5.60). CONCLUSIONS: This is the first UK prospective cohort study to investigate the prevalence of mild and high-frequency hearing loss. The study, which has the advantages of a large sample size and repeat hearing measures over a four-year period, reports lower prevalence values than US cross-sectional studies.
Abstract:
Background In recent years new models of intraocular lenses are appearing on the market to reduce requirements for additional optical correction. The purpose of this study is to assess visual outcomes following bilateral cataract surgery and implantation of a FineVision® trifocal intraocular lens (IOL). Methods Prospective, nonrandomized, observational study. Vision was assessed in 44 eyes of 22 patients (mean age 68.4 ± 5.5 years) before and 3 months after surgery. Aberrations were determined using the Topcon KR-1W wave-front analyzer. LogMAR visual acuity was measured at distance (corrected distance visual acuity, CDVA 4 m), intermediate (distance corrected intermediate visual acuity, DCIVA 60 cm) and near (distance corrected near visual acuity, DCNVA 40 cm). The Pelli-Robson letter chart and the CSV-1000 test were used to estimate contrast sensitivity (CS). Defocus curve testing was performed in photopic and mesopic conditions. Adverse photic phenomena were assessed using the Halo v1.0 program. Results Mean aberration values for a mesopic pupil diameter were: total HOA RMS: 0.41 ± 0.30 μm, coma: 0.32 ± 0.22 μm and spherical aberration: 0.21 ± 0.20 μm. Binocular logMAR measurements were: CDVA −0.05 ± 0.05, DCIVA 0.15 ± 0.10, and DCNVA 0.06 ± 0.10. Mean Pelli-Robson CS was 1.40 ± 0.14 log units. Mean CSV-1000 CS values for the 4 frequencies examined (A: 3 cycles/degree (cpd), B: 6 cpd, C: 12 cpd, D: 18 cpd) were 1.64 ± 0.14, 1.77 ± 0.18, 1.44 ± 0.24 and 0.98 ± 0.24 log units, respectively. Significant differences were observed in defocus curves for photopic and mesopic conditions (p < 0.0001). A mean disturbance index of 0.28 ± 0.22 was obtained. Conclusions Bilateral FineVision IOL implantation achieved a full range of adequate vision, satisfactory contrast sensitivity, and a lack of significant adverse photic phenomena. Trial registration EudraCT Clinical Trials Registry Number: 2014-003266-2.
Abstract:
The conjugate gradient method is the most popular optimization method for solving large systems of linear equations. In a system identification problem, for example, where a very large impulse response is involved, it is necessary to apply a particular strategy that diminishes the delay while improving the convergence time. In this paper we propose a new scheme which combines frequency-domain adaptive filtering with a conjugate gradient technique in order to solve a high-order multichannel adaptive filter, while being delayless and guaranteeing a very short convergence time.
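The paper's delayless frequency-domain scheme is not spelled out in the abstract. As a reference point only, a minimal textbook conjugate-gradient solver for a symmetric positive-definite system Ax = b (toy data, not the paper's multichannel filter) might look like:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Iteratively solve A x = b for symmetric positive-definite A."""
    n = len(b)
    if max_iter is None:
        max_iter = n          # CG converges in at most n steps in exact arithmetic
    x = np.zeros(n)
    r = b - A @ x             # residual
    p = r.copy()              # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # optimal step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return x

# Toy 2x2 SPD system (invented for illustration)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

For the very long impulse responses the abstract mentions, each `A @ p` product is what the frequency-domain formulation replaces with FFT-based fast convolution.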
Abstract:
NO FUNDING
Abstract:
The aim of this monograph is to analyse the transformation of the bilateral Colombia–United States relationship in security and defence matters during the period 2002–2014, and how that transformation may influence the formulation of Colombian foreign policy. The foreign policies of Álvaro Uribe Vélez and of the current president, Juan Manuel Santos, are analysed. This is carried out through two theories of International Relations, subaltern realism and neoclassical realism, which help explain why Colombian foreign policy changed.
Abstract:
With the Millennium Declaration, in which the member states of the UN committed themselves to working towards the fulfilment of the Millennium Development Goals, the Goals became the main agenda for development, with the overcoming of poverty in all its dimensions at its centre. In this regard, both the Colombian government and the United Nations Development Programme have built action plans with commitments and responsibilities for the cooperating parties in order to achieve the targets established for the first MDG. The purpose of this research is thus to answer the following question: what factors influence the results of cooperation between an international organisation and a state? It does so through a case study of the international cooperation between the United Nations Development Programme (UNDP) and the Colombian state in pursuit of the first Millennium Development Goal: to eradicate extreme poverty and hunger. The initial hypothesis of the research was that the results of the cooperation between the UNDP and the Colombian state with respect to the first Millennium Development Goal responded mainly to the following factors: (a) the Colombian state's ownership of the Goals; (b) the programmes developed by the UNDP within Colombian territory; and (c) the lack of policy coordination between Colombia's territorial entities and the state. However, on concluding the research it was determined that the lack of policy coordination between these entities is not a main factor; instead, the condition of inequality in Colombia emerges as the third factor that influenced the results.
Abstract:
Perceptual aliasing makes topological navigation a difficult task. In this paper we present a general approach for topological SLAM (simultaneous localisation and mapping) which does not require motion or odometry information but only a sequence of noisy measurements from visited places. We propose a particle filtering technique for topological SLAM which relies on a method for disambiguating places that appear indistinguishable, using neighbourhood information extracted from the sequence of observations. The algorithm aims to induce a small topological map which is consistent with the observations and simultaneously estimate the location of the robot. The proposed approach is evaluated using a data set of sonar measurements from an indoor environment which contains several similar places. It is demonstrated that our approach is capable of dealing with severe ambiguities, and that it infers a map which is small in terms of vertices and consistent with the sequence of observations.
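The paper's disambiguation method is not detailed in the abstract. As a generic illustration of the particle-filtering machinery it builds on, here is a minimal reweight-and-resample step over hypothesized places; the place signatures, noise model, and all data are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_update(particles, weights, observation, expected_obs, noise_std):
    """One particle-filter step: reweight each place hypothesis by the
    observation likelihood, then resample to concentrate on likely places.

    particles    : array of hypothesized place indices
    weights      : current particle weights
    observation  : scalar measurement received at the current (unknown) place
    expected_obs : expected_obs[i] is the noise-free measurement at place i
    """
    # Gaussian likelihood of the observation under each particle's hypothesis
    lik = np.exp(-0.5 * ((observation - expected_obs[particles]) / noise_std) ** 2)
    weights = weights * lik
    weights /= weights.sum()
    # Resampling duplicates likely hypotheses and drops unlikely ones
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Three places with distinct (invented) sonar signatures; the true place is 2
expected_obs = np.array([1.0, 5.0, 9.0])
particles = rng.integers(0, 3, size=500)           # uniform prior over places
weights = np.full(500, 1.0 / 500)
for z in [8.8, 9.2, 9.1]:                          # repeated noisy readings near 9.0
    particles, weights = pf_update(particles, weights, z, expected_obs, 1.0)
most_likely = np.bincount(particles).argmax()
```

Perceptual aliasing is precisely the case where two rows of `expected_obs` are (nearly) equal, which is why the paper falls back on neighbourhood information rather than single observations.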
Abstract:
Experience plays an important role in building management. “How often will this asset need repair?” or “How much time is this repair going to take?” are the types of questions that project and facility managers face daily in planning activities. Failure or success in developing good schedules, budgets and other project management tasks depends on the project manager's ability to obtain reliable information with which to answer these questions. Young practitioners tend to rely on information that is based on regional averages and provided by publishing companies, in contrast to experienced project managers, who tend to rely heavily on personal experience. Another aspect of building management is that many practitioners are seeking to improve available scheduling algorithms, estimating spreadsheets and other project management tools. Such “micro-scale” levels of research are important in providing the required tools for the project manager's tasks. However, even with such tools, low-quality input information will produce inaccurate schedules and budgets as output. Thus, it is also important to take a broader approach to research at a more “macro-scale”. Recent trends show that the Architectural, Engineering and Construction (AEC) industry is experiencing explosive growth in its capability to generate and collect data. A great deal of valuable knowledge can be obtained from the appropriate use of these data, and the need has therefore arisen to analyse this increasing amount of available data. Data mining can be applied as a powerful tool to extract relevant and useful information from this sea of data. Knowledge Discovery in Databases (KDD) and Data Mining (DM) are tools that allow the identification of valid, useful, and previously unknown patterns, so that large amounts of project data may be analysed.
These technologies combine techniques from machine learning, artificial intelligence, pattern recognition, statistics, databases, and visualization to automatically extract concepts, interrelationships, and patterns of interest from large databases. The project involves the development of a prototype tool to support facility managers, building owners and designers. This final report presents the AIMM™ prototype system and documents which data mining techniques can be applied, the results of their application, and the benefits gained from the system. The AIMM™ system is capable of searching for useful patterns of knowledge and correlations within existing building maintenance data to support decision making about future maintenance operations. The application of the AIMM™ prototype system to building models and their maintenance data (supplied by industry partners) utilises various data mining algorithms, and the maintenance data is analysed using interactive visual tools. The application of the AIMM™ prototype system to improving maintenance management and the building life cycle includes: (i) data preparation and cleaning, (ii) integrating meaningful domain attributes, (iii) performing extensive data mining experiments involving visual analysis (using stacked histograms), classification and clustering techniques, and associative rule mining algorithms such as “Apriori”, and (iv) filtering and refining data mining results, including the potential implications of these results for improving maintenance management. Maintenance data for a variety of asset types were selected for demonstration, with the aim of discovering meaningful patterns to assist facility managers in strategic planning and provide a knowledge base to help shape future requirements and design briefing. Utilising the prototype system developed here, positive and interesting results regarding patterns and structures of the data have been obtained.
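The abstract names Apriori for associative rule mining. A minimal frequent-itemset version of Apriori (support-threshold mining only, on invented maintenance-style data; the subset-pruning optimisation is omitted for brevity) can be sketched as:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return all itemsets whose support (fraction of transactions
    containing the itemset) is at least min_support."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    # Frequent 1-itemsets
    items = {i for t in transactions for i in t}
    current = [s for i in sorted(items)
               if support(s := frozenset([i])) >= min_support]
    frequent = {s: support(s) for s in current}

    k = 2
    while current:
        # Candidates: unions of frequent (k-1)-itemsets that have size k.
        # We check each candidate's support directly rather than pruning
        # by infrequent subsets, which is correct but less efficient.
        candidates = {a | b for a, b in combinations(current, 2) if len(a | b) == k}
        current = [c for c in candidates if support(c) >= min_support]
        frequent.update({s: support(s) for s in current})
        k += 1
    return frequent

# Toy "work orders": which components were repaired together
orders = [{"pump", "seal"}, {"pump", "seal", "motor"},
          {"pump", "motor"}, {"seal"}]
freq = apriori(orders, min_support=0.5)
```

Here `{"pump", "seal"}` is frequent (it appears in 2 of 4 orders), while `{"seal", "motor"}` is not; association rules such as pump → seal are then read off the frequent itemsets.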
Abstract:
This report presents the demonstration of a software-agent prototype system for improving maintenance management [AIMM], including: • Developing and implementing a user-focused approach for mining the maintenance data of buildings. • Refining the development of a multi-agent system for data mining in virtual environments (Active Worlds) by developing and implementing a filtering agent for the results obtained from applying data mining techniques to the maintenance data. • Integrating the filtering agent within the multi-agent system in an interactive networked multi-user 3D virtual environment. • Populating maintenance data and discovering new rules of knowledge.
Abstract:
Experience plays an important role in building management. “How often will this asset need repair?” or “How much time is this repair going to take?” are the types of questions that project and facility managers face daily in planning activities. Failure or success in developing good schedules, budgets and other project management tasks depends on the project manager's ability to obtain reliable information with which to answer these questions. Young practitioners tend to rely on information that is based on regional averages and provided by publishing companies, in contrast to experienced project managers, who tend to rely heavily on personal experience. Another aspect of building management is that many practitioners are seeking to improve available scheduling algorithms, estimating spreadsheets and other project management tools. Such “micro-scale” levels of research are important in providing the required tools for the project manager's tasks. However, even with such tools, low-quality input information will produce inaccurate schedules and budgets as output. Thus, it is also important to take a broader approach to research at a more “macro-scale”. Recent trends show that the Architectural, Engineering and Construction (AEC) industry is experiencing explosive growth in its capability to generate and collect data. A great deal of valuable knowledge can be obtained from the appropriate use of these data, and the need has therefore arisen to analyse this increasing amount of available data. Data mining can be applied as a powerful tool to extract relevant and useful information from this sea of data. Knowledge Discovery in Databases (KDD) and Data Mining (DM) are tools that allow the identification of valid, useful, and previously unknown patterns, so that large amounts of project data may be analysed.
These technologies combine techniques from machine learning, artificial intelligence, pattern recognition, statistics, databases, and visualization to automatically extract concepts, interrelationships, and patterns of interest from large databases. The project involves the development of a prototype tool to support facility managers, building owners and designers. This industry-focused report presents the AIMM™ prototype system and documents which data mining techniques can be applied, the results of their application, and the benefits gained from the system. The AIMM™ system is capable of searching for useful patterns of knowledge and correlations within existing building maintenance data to support decision making about future maintenance operations. The application of the AIMM™ prototype system to building models and their maintenance data (supplied by industry partners) utilises various data mining algorithms, and the maintenance data is analysed using interactive visual tools. The application of the AIMM™ prototype system to improving maintenance management and the building life cycle includes: (i) data preparation and cleaning, (ii) integrating meaningful domain attributes, (iii) performing extensive data mining experiments involving visual analysis (using stacked histograms), classification and clustering techniques, and associative rule mining algorithms such as “Apriori”, and (iv) filtering and refining data mining results, including the potential implications of these results for improving maintenance management. Maintenance data for a variety of asset types were selected for demonstration, with the aim of discovering meaningful patterns to assist facility managers in strategic planning and provide a knowledge base to help shape future requirements and design briefing. Utilising the prototype system developed here, positive and interesting results regarding patterns and structures of the data have been obtained.
Abstract:
To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to build incrementally a map of its surroundings and to localize itself in the map simultaneously. The aim of this research project is to develop a SLAM system suitable for self-propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is the panoramic view of all the landmarks in the scene. Placing landmarks in a lawn field to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers. The common approach of existing bearing-only SLAM methods relies on a motion model for predicting the robot’s pose and a sensor model for updating the pose. In the motion model, the error on the estimates of object positions is accumulated due mainly to wheel slippage. Quantifying accurately the uncertainty of object positions is a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of landmark position should be uniform along the observed bearing. Existing methods that approximate the PDF with a Gaussian estimation do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address the above problems. The main novel contributions of this thesis are: 1. A bearing-only SLAM method not requiring odometry. The proposed method relies solely on the sensor model (landmark bearings only) without relying on the motion model (odometry). The uncertainty of the estimated landmark positions depends on the vision error only, instead of the combination of both odometry and vision errors. 2. The transformation of the spatial uncertainty of objects. 
This thesis introduces a novel method for translating the spatial uncertainty of objects estimated from a moving frame attached to the robot into the global frame attached to the static landmarks in the environment. 3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and the marginal probability on range is constrained to be uniform. Compared to a PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can be easily adopted in a probabilistic framework, such as a particle filtering system. The main advantages of our proposed bearing-only SLAM system are its lower production cost and flexibility of use. The proposed system can be adopted in other domestic robots as well, such as vacuum cleaners or robotic toys, when the terrain is essentially 2D.
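The improved PDF is only characterised in the abstract, but its key property, a uniform marginal on range along the observed bearing, can be illustrated by drawing landmark particles in polar coordinates. All parameters below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_landmark_particles(bearing, sigma_bearing, r_min, r_max, n):
    """Draw landmark-position particles in polar coordinates:
    range uniform on [r_min, r_max], since a bearing-only sensor carries
    no range information, and bearing Gaussian around the observed value,
    reflecting the vision error only. Returns n (x, y) points."""
    r = rng.uniform(r_min, r_max, n)
    theta = rng.normal(bearing, sigma_bearing, n)
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

# One observed bearing of 45 degrees with a small vision error (invented values)
pts = sample_landmark_particles(bearing=np.pi / 4, sigma_bearing=0.02,
                                r_min=0.5, r_max=10.0, n=2000)
```

The resulting particle cloud is a thin wedge along the observed bearing, unlike a single Gaussian, which would wrongly concentrate probability at one range.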
Abstract:
The explosive growth of the World Wide Web and the emergence of ecommerce are the two major factors that have led to the development of recommender systems (Resnick and Varian, 1997). The main task of recommender systems is to learn from users and recommend items (e.g. information, products or books) that match the users’ personal preferences. Recommender systems have been an active research area for more than a decade, and many different techniques and systems with distinct strengths have been developed to generate better quality recommendations. One of the main factors that affects a recommender’s recommendation quality is the amount of information resources available to it. The main feature of recommender systems is their ability to make personalised recommendations for different individuals. However, many ecommerce sites find it difficult to obtain sufficient knowledge about their users, and hence the recommendations they provide are often poor and not personalised. This information insufficiency problem is commonly referred to as the cold-start problem. Most existing research on recommender systems focuses on developing techniques to better utilise the available information resources to achieve better recommendation quality. However, while the amount of available data and information remains insufficient, these techniques can provide only limited improvements to the overall recommendation quality. In this thesis, a novel and intuitive approach towards improving recommendation quality and alleviating the cold-start problem is attempted: enriching the information resources. It can easily be observed that when there is a sufficient information and knowledge base to support recommendation making, even the simplest recommender systems can outperform sophisticated ones with limited information resources. 
Two possible strategies are suggested in this thesis to achieve the proposed information enrichment for recommenders: • The first strategy suggests that information resources can be enriched by considering other information or data facets. Specifically, a taxonomy-based recommender, Hybrid Taxonomy Recommender (HTR), is presented in this thesis. HTR exploits the relationship between users’ taxonomic preferences and item preferences from the combination of the widely available product taxonomic information and the existing user rating data, and it then utilises this taxonomic preference to item preference relation to generate high quality recommendations. • The second strategy suggests that information resources can be enriched simply by obtaining information resources from other parties. In this thesis, a distributed recommender framework, Ecommerce-oriented Distributed Recommender System (EDRS), is proposed. The proposed EDRS allows multiple recommenders from different parties (i.e. organisations or ecommerce sites) to share recommendations and information resources with each other in order to improve their recommendation quality. Based on the results obtained from the experiments conducted in this thesis, the proposed systems and techniques have achieved great improvement in both making quality recommendations and alleviating the cold-start problem.
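HTR's taxonomy machinery is not reproduced in the abstract. As a baseline for the rating-data recommenders it enriches, a minimal user-based collaborative filter (cosine similarity over co-rated items; the rating matrix and all names are invented) might look like:

```python
import numpy as np

def recommend(ratings, user):
    """Recommend the unrated item with the highest predicted score, where
    scores are other users' ratings weighted by their cosine similarity
    to `user`. ratings: users x items matrix, 0 meaning 'unrated'."""
    sims = np.zeros(len(ratings))
    for u, row in enumerate(ratings):
        if u == user:
            continue
        mask = (row > 0) & (ratings[user] > 0)   # co-rated items only
        if not mask.any():
            continue                              # a cold-start user: no overlap
        a, b = ratings[user][mask], row[mask]
        sims[u] = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    scores = {}
    for item in range(ratings.shape[1]):
        if ratings[user][item] > 0:
            continue                              # already rated
        raters = ratings[:, item] > 0
        if not raters.any():
            continue
        scores[item] = (sims[raters] @ ratings[raters, item]) / (sims[raters].sum() + 1e-9)
    return max(scores, key=scores.get)

# Rows = users, columns = items; user 0 has not rated items 1 and 2
R = np.array([
    [5, 0, 0, 1],
    [5, 5, 4, 0],
    [1, 0, 5, 4],
])
best = recommend(R, user=0)
```

The `no overlap` branch is exactly where such a filter fails for cold-start users, which is the gap the thesis's information-enrichment strategies (taxonomic preferences in HTR, shared resources in EDRS) aim to fill.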