927 results for feature inspection method


Relevance:

30.00%

Publisher:

Abstract:

In this work, a method that synchronizes two video sequences is proposed. Unlike previous methods, which require correspondences between features tracked in the two sequences and/or that the cameras be static or jointly moving, the proposed approach imposes none of these constraints. It works when the cameras move independently, even if different features are tracked in the two sequences. The assumptions underlying the proposed strategy are that the intrinsic parameters of the cameras are known and that two rigid objects, with independent motions in the scene, are visible in both sequences. The relative motion between these objects is used as a clue for the synchronization. The extrinsic parameters of the cameras are assumed to be unknown. A new synchronization algorithm for static or jointly moving cameras that see (possibly) different parts of a common rigidly moving object is also proposed. Proof-of-concept experiments that illustrate the performance of these methods are presented, as well as a comparison with a state-of-the-art approach.
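The core idea of using the relative motion between two objects as a synchronization clue can be sketched as follows. This is an illustrative toy, not the authors' algorithm: each sequence independently yields a camera-invariant scalar signature of the relative motion (here, a hypothetical relative rotation angle per frame), and the temporal offset is recovered by sliding one signature over the other.

```python
# Hypothetical sketch: recover a constant frame offset between two sequences
# by aligning a camera-independent signature of the relative motion between
# the two rigid objects (e.g. their relative rotation angle per frame).
def best_offset(sig_a, sig_b, max_shift):
    """Return the shift of sig_b that best aligns it with sig_a (mean SSD score)."""
    best, best_err = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        pairs = [(sig_a[i], sig_b[i + shift])
                 for i in range(len(sig_a))
                 if 0 <= i + shift < len(sig_b)]
        if len(pairs) < 2:
            continue
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best, best_err = shift, err
    return best

# Toy check: sequence B observes the same relative motion delayed by 3 frames.
angle_a = [0.1 * t for t in range(50)]
angle_b = [0.1 * (t - 3) for t in range(50)]
print(best_offset(angle_a, angle_b, 10))  # -> 3
```

Because the signature depends only on the relative pose of the two objects, it is unaffected by independent camera motion, which is what removes the static/jointly-moving constraint.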

Relevância:

30.00% 30.00%

Publicador:

Resumo:

This paper shows that countries characterized by a financial accelerator mechanism may reverse the usual finding of the literature: flexible exchange rate regimes do a worse job of insulating open economies from external shocks. I obtain this result with a calibrated small open economy model that endogenizes foreign interest rates by linking them to the banking sector's foreign currency leverage. This relationship makes exchange rate policy more important than under the usual exogeneity assumption. I find empirical support for this prediction using the Local Projections method. Finally, a second-order approximation to the model yields larger welfare losses under flexible regimes.
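The Local Projections method mentioned above estimates an impulse response by running one direct regression per horizon, rather than iterating a fitted model forward. A minimal sketch of that idea, with toy data rather than the paper's dataset:

```python
# Local Projections sketch: regress the outcome at horizon h directly on the
# shock at time t, one OLS per horizon; the slope traces the impulse response.
def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def local_projection_irf(shock, outcome, max_h):
    return [ols_slope(shock[:len(shock) - h], outcome[h:])
            for h in range(max_h + 1)]

# Toy data: each shock raises the outcome with a decaying pattern.
shock = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
outcome = [0.0] * len(shock)
for t, s in enumerate(shock):
    for h, coef in enumerate([1.0, 0.5, 0.25]):
        if t + h < len(outcome):
            outcome[t + h] += coef * s

irf = local_projection_irf(shock, outcome, 2)
print(irf)  # decaying response across horizons
```

In the paper's setting the regressions would also carry controls and lags; the point of the sketch is only the one-regression-per-horizon structure.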

Relevância:

30.00% 30.00%

Publicador:

Resumo:

In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences of genotype frequencies between disease and healthy groups or among different population groups. However, testing a great number of SNPs simultaneously raises a multiple-testing problem and yields false-positive results. Although this problem can be handled with approaches such as Bonferroni correction, permutation testing and false discovery rates, patterns of joint effects of several genes, each with a weak effect, may remain undetected. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for the millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset among big data sets where the number of feature SNPs far exceeds the number of observations. In this study, we take two steps to achieve this goal. First, we selected 1000 SNPs with an effective filter method; then we performed a feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck method, wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis in terms of classification performance. Finally, we performed a chi-square test to examine the relationship between each SNP and disease from another point of view.
In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through linear discriminant analysis (LDA) is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that exhaustive search of a small subset with one SNP, two SNPs or a 3-SNP subset based on the best 100 composite 2-SNPs can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always increase the performance of SNP subsets. Although sequential forward floating selection can be applied to prevent the nesting effect of forward selection, it does not always outperform the latter, due to overfitting from observing more complex subset states. Our results also indicate that HMSS as a criterion to evaluate the classification ability of a function can be used on imbalanced data without modifying the original dataset, in contrast with classification accuracy. Our four studies suggest that the sequential information bottleneck (sIB), a new unsupervised technique, can be adopted to predict the outcome, and that its ability to detect the target status is superior to traditional LDA in this study. From our results, the best test probability-HMSS for predicting CVD, stroke, CAD and psoriasis through sIB is 0.59406, 0.641815, 0.645315 and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls reaches 0.708999, 0.863216, 0.639918 and 0.850275, respectively, in the four studies if the test accuracy among cases is required to be not less than 0.4. On the other hand, the highest test accuracy of sIB for diagnosing disease among cases reaches 0.748644, 0.789916, 0.705701 and 0.749436, respectively, in the four studies if the test accuracy among controls is required to be at least 0.4.
A further genome-wide association study through the chi-square test shows no significant SNPs detected at the cut-off level 9.09451E-08 in the Framingham Heart Study of CVD. Study results in WTCCC detect only two significant SNPs associated with CAD. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease through the chi-square test at the cut-off value 1.11E-07. Although our classification methods can achieve high accuracy in the study, complete descriptions of those classification results (95% confidence intervals or statistical tests of differences) require more cost-effective methods or an efficient computing system, neither of which can currently be accomplished in our genome-wide study. We should also note that the purpose of this study is to identify subsets of SNPs with high prediction ability, and that SNPs with good discriminant power are not necessarily causal markers for the disease.
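The HMSS criterion used throughout this study is simply the harmonic mean of sensitivity and specificity, which is why it remains informative on imbalanced case/control data where raw accuracy is inflated by the majority class. A minimal sketch:

```python
# HMSS: harmonic mean of sensitivity and specificity.  Unlike accuracy,
# it is not inflated by the majority class in imbalanced data.
def hmss(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return 2 * sens * spec / (sens + spec) if sens + spec else 0.0

# 90 controls, 10 cases: a classifier that always predicts "control"
# scores 90% accuracy but HMSS = 0.
y_true = [0] * 90 + [1] * 10
y_pred = [0] * 100
print(hmss(y_true, y_pred))  # -> 0.0
```

This illustrates why HMSS can serve as a selection criterion without resampling or reweighting the original dataset.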


Objectives. The purpose of this paper is to conduct a literature review of research relating to foodborne illness, food inspection policy, and restaurants in the United States. Aim 1: To convey the public health importance of studying restaurant food inspection policies and suggest that more research is needed in this field. Aim 2: To conduct a systematic review of recent literature pertaining to this subject so that future researchers can understand: (1) public perception and expectations of restaurant food inspection policies; (2) arguments in favor of a grade card policy; and, conversely, (3) reasons why inspection policies may not work.

Data/methods. This paper utilizes a systematic review format to review articles relating to food inspections and restaurants in the U.S. Eight articles were reviewed.

Results. The data from the literature provide no conclusive answer as to how, when, and by what method inspection policies should be carried out. The authors do, however, put forward varying solutions to the problem of foodborne illness outbreaks in restaurants. These solutions include the implementation of grade cards in restaurants and, conversely, a complete overhaul of the inspection policy system.

Discussion. The literature on foodborne disease, food inspection policy, and restaurants in the U.S. is limited and varied. But from the research that is available, two schools of thought emerge. The first calls for the implementation of a grade card system, while the second proposes a reassessment and possible overhaul of the food inspection policy system. It is still unclear which of these methods would best slow the increase in foodborne disease transmission in the U.S.

Conclusion. To arrive at solutions to the problem of foodborne disease transmission as it relates to restaurants in this country, we may need to look at literature from other countries and, subsequently, begin incremental changes in the way inspection policies are developed and enforced.


The nearly continuous recovery of 0.5 km of generally fresh, layer 3 gabbroic rocks at Hole 735B, especially near the bottom of the section, presents scientists with an unusual opportunity to study the detailed elastic properties of the lower oceanic crust. Extending the shipboard compressional-wave and density measurements made at room pressure, Vp and Vs were measured at pressures from 20 to 200 MPa using the pulse transmission method. All of the rocks exhibit significant increases in velocity with increasing pressure up to about 150 MPa, a feature attributed to the closing of microcrack porosity. Measured velocities reflect the mineralogical makeup and microstructures acquired during the tectonic history of Hole 735B. Most of the undeformed and unaltered gabbros are approximately 65:35 plagioclase/clinopyroxene rocks plus olivine or oxide minerals, and the observed densities and velocities are fully consistent with the Voigt-Reuss-Hill (VRH) averages of the component minerals and their proportions. Depending on their olivine content, the predominant olivine gabbros at 200 MPa have average Vp = 7.1 ± 0.2 km/s, Vs = 3.9 ± 0.1 km/s, and grain densities of 2.95 ± 0.5 g/cm3. The less abundant iron-titanium (Fe-Ti) oxide gabbros average Vp = 6.75 ± 0.15 km/s, Vs = 3.70 ± 0.1 km/s, and grain densities of 3.22 ± 0.05 g/cm3, reflecting the higher densities and lower velocities of oxide minerals compared to olivine. About 30% of the core is plastically deformed, and the densities and directionally averaged velocities of these shear-zone tectonites are generally consistent with those of the gabbros, their protoliths.
Three sets of observations indicate that the shear-zone metagabbros are elastically anisotropic: (1) directional variations in Vp, both vertical and horizontal and with respect to foliation and lineation; (2) discrepancies among Vp values for the horizontal cores and the VRH averages of the component minerals and their mineral proportions, suggesting preferred crystallographic orientations of anisotropic minerals; and (3) variations of Vs of up to 7%, with polarization directions parallel and perpendicular to foliation. Optical inspection of thin sections of the same samples indicates that plagioclase feldspar, clinopyroxene, and amphibole typically display crystallographic-preferred orientations, and this, plus the elastic anisotropy of these minerals, suggests that preferred orientations are responsible for much of the observed anisotropy, particularly at high pressure. Alteration tends to be localized to brittle faults and brecciated zones, and typical alteration minerals are amphibole and secondary plagioclase, which do not significantly change the velocity-density relationships.
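The Voigt-Reuss-Hill averaging used above brackets the effective modulus of a mineral aggregate between the iso-strain (Voigt) and iso-stress (Reuss) bounds and takes their mean. A minimal sketch, using rough illustrative moduli and densities for plagioclase and clinopyroxene rather than the paper's data:

```python
import math

# Voigt-Reuss-Hill sketch: average the elastic moduli of the mineral mix,
# then predict Vp.  Moduli (GPa) and densities (g/cm3) below are rough
# illustrative values, not the study's measurements.
def vrh(fractions, moduli):
    voigt = sum(f * m for f, m in zip(fractions, moduli))          # iso-strain bound
    reuss = 1.0 / sum(f / m for f, m in zip(fractions, moduli))    # iso-stress bound
    return 0.5 * (voigt + reuss)

fracs = [0.65, 0.35]                 # 65:35 plagioclase/clinopyroxene
K = vrh(fracs, [76.0, 113.0])        # bulk moduli
G = vrh(fracs, [34.0, 67.0])         # shear moduli
rho = (0.65 * 2.68 + 0.35 * 3.30) * 1000.0   # kg/m3, modal fractions as volume fractions

vp = math.sqrt((K + 4.0 * G / 3.0) * 1e9 / rho) / 1000.0  # km/s
print(round(vp, 2))  # ~7.1 km/s, in the range reported for the olivine gabbros
```

Even with these coarse inputs the predicted Vp lands near the measured 7.1 ± 0.2 km/s average, which is the consistency the paper reports.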


The aim of this work is to develop software that allows the inspection of spur gears manufactured in the sub-millimeter range. The measurements are made using a digital optical machine and analyzed with proprietary software implemented in Matlab®, which is able to handle images captured with the digital optical machine. The software allows evaluation of the profile and pitch deviations as established in the ISO/TR 10064-1:1992 standard.
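The pitch-deviation part of such an evaluation can be sketched as follows. This is a hypothetical illustration in the spirit of ISO/TR 10064-1, not the Matlab® implementation: from the measured angular positions of the teeth, compute each single pitch deviation (measured pitch minus nominal pitch) and the cumulative deviation.

```python
# Hypothetical sketch of pitch-deviation evaluation: given measured angular
# tooth positions (degrees), compute single pitch deviations (measured pitch
# minus the nominal pitch 360/z) and their running cumulative sum.
def pitch_deviations(tooth_angles_deg):
    z = len(tooth_angles_deg)
    nominal = 360.0 / z
    single = []
    for i in range(z):
        pitch = (tooth_angles_deg[(i + 1) % z] - tooth_angles_deg[i]) % 360.0
        single.append(pitch - nominal)
    cumulative, total = [], 0.0
    for d in single:
        total += d
        cumulative.append(total)
    return single, cumulative

# Toy 8-tooth gear with one tooth displaced by +0.05 degrees.
angles = [i * 45.0 for i in range(8)]
angles[3] += 0.05
single, cumulative = pitch_deviations(angles)
print(round(max(abs(d) for d in single), 4))  # -> 0.05
```

A displaced tooth shows up as a +/- pair of single pitch deviations, and the cumulative deviation closes back to zero over one revolution, which is a useful sanity check on the measurement.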


This paper proposes a method for the identification of different partial discharge (PD) sources through the analysis of a collection of PD signals acquired with a PD measurement system. This method, robust and sensitive enough to cope with noisy data and external interferences, combines the characterization of each signal in the collection with a clustering procedure, the CLARA algorithm. Several features are proposed for the characterization of the signals; the wavelet variances, the frequency estimated with the Prony method, and the energy are the most relevant for the performance of the clustering procedure. The result of the unsupervised classification is a set of clusters, each containing the signals that are more similar to each other than to those in other clusters. The analysis of the classification results permits both the identification of different PD sources and the discrimination between original PD signals, reflections, noise and external interferences. The methods and graphical tools detailed in this paper have been coded and published as a contributed package of the R environment under a GNU/GPL license.
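CLARA is a sampling-based variant of k-medoids clustering. A minimal k-medoids sketch of the grouping step, with toy two-dimensional stand-ins for the per-signal feature vectors (the real features would be the wavelet variances, Prony frequency and energy):

```python
import itertools

# Minimal exhaustive k-medoids sketch (CLARA applies k-medoids to samples of
# the data).  Each point is a toy feature vector standing in for one signal.
def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def k_medoids(points, k):
    best_meds, best_cost = None, float("inf")
    for meds in itertools.combinations(range(len(points)), k):
        cost = sum(min(dist(p, points[m]) for m in meds) for p in points)
        if cost < best_cost:
            best_meds, best_cost = meds, cost
    labels = [min(best_meds, key=lambda m: dist(p, points[m])) for p in points]
    return best_meds, labels

# Two simulated PD sources: (frequency, energy) pairs, low-freq/high-energy
# versus high-freq/low-energy.
signals = [(0.10, 1.00), (0.12, 0.90), (0.11, 1.10),
           (0.90, 0.10), (0.85, 0.15), (0.95, 0.05)]
meds, labels = k_medoids(signals, 2)
print(labels)
```

The exhaustive search is only viable for toy sizes; CLARA's contribution is precisely making the medoid search scale by clustering repeated samples of the collection.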


A metrological confirmation process must be designed and implemented to ensure that the metrological characteristics of the measurement system meet the metrological requirements of the measurement process. The aim of this paper is to present an alternative to the traditional metrological requirements on the relationship between tolerance and measurement uncertainty for developing such confirmation processes. The proposed approach considers a given inspection task of the measurement process within the manufacturing system, and it is based on the Index of Contamination of the Capability (ICC). The metrological confirmation process is then developed taking into account producer risks and economic considerations on this index. As a consequence, depending on the capability of the manufacturing process, the measurement system will or will not be in an adequate state of metrological confirmation for the measurement process.
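The underlying mechanism can be illustrated numerically. The sketch below is not the paper's ICC definition, which is not reproduced here; it only shows the relation it builds on: measurement uncertainty adds in quadrature to the process spread, so the observed capability index understates the true one, and a hypothetical "contamination" ratio can quantify that loss.

```python
import math

# Illustration only (hypothetical index, not the paper's ICC): measurement
# uncertainty u inflates the observed spread, sigma_obs^2 = sigma_p^2 + u^2,
# so the observed capability tol/(6*sigma_obs) is below the true capability.
def observed_capability(tol, sigma_process, u_measure):
    sigma_obs = math.sqrt(sigma_process ** 2 + u_measure ** 2)
    return tol / (6 * sigma_obs)

def contamination(tol, sigma_process, u_measure):
    true_cp = tol / (6 * sigma_process)
    return 1 - observed_capability(tol, sigma_process, u_measure) / true_cp

# A capable process (true Cp = 2) measured with u equal to half of sigma:
c = contamination(1.2, 0.1, 0.05)
print(round(c, 3))  # -> 0.106
```

The more capable the manufacturing process relative to the measurement uncertainty, the smaller this contamination, which is the intuition behind making confirmation depend on process capability.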


This paper studies feature subset selection in classification using a multiobjective estimation of distribution algorithm. We consider six functions, namely area under the ROC curve, sensitivity, specificity, precision, F1 measure and Brier score, for the evaluation of feature subsets and as the objectives of the problem. One characteristic of these objective functions is the existence of noise in their values, which should be appropriately handled during optimization. Our proposed algorithm consists of two major techniques specially designed for the feature subset selection problem. The first is a solution ranking method based on interval values to handle the noise in the objectives of this problem. The second is a model estimation method for learning a joint probabilistic model of objectives and variables, which is used to generate new solutions and advance through the search space. To simplify model estimation, L1-regularized regression is used to select a subset of problem variables before model learning. The proposed algorithm is compared with a well-known ranking method for interval-valued objectives and a standard multiobjective genetic algorithm. In particular, the effects of the two new techniques are experimentally investigated. The experimental results show that the proposed algorithm obtains comparable or better performance on the tested datasets.
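The flavor of interval-valued comparison under noisy objectives can be sketched simply. This is an illustration of the general idea, not the paper's ranking method: each solution's noisy objective is resampled, summarized as a confidence interval, and one solution is preferred only when its interval lies wholly above the other's.

```python
import random, statistics

# Sketch: summarize a noisy objective as a mean +/- z * standard-error
# interval and prefer a solution only when the intervals do not overlap.
def interval(samples, z=1.96):
    m = statistics.mean(samples)
    se = statistics.stdev(samples) / len(samples) ** 0.5
    return (m - z * se, m + z * se)

def clearly_better(a, b):
    return a[0] > b[1]  # interval a lies wholly above interval b

random.seed(0)
# Three hypothetical feature subsets with noisy AUC evaluations.
auc_a = [0.80 + random.gauss(0, 0.03) for _ in range(30)]
auc_b = [0.70 + random.gauss(0, 0.03) for _ in range(30)]
auc_c = [0.80 + random.gauss(0, 0.03) for _ in range(30)]

separated = clearly_better(interval(auc_a), interval(auc_b))    # distinct quality
overlapping = clearly_better(interval(auc_a), interval(auc_c))  # same quality
print(separated, overlapping)
```

Treating overlapping intervals as incomparable is what prevents noise alone from promoting one solution over an equally good one.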


Rapid microbiological detection methods are becoming essential tools for quality control in the biotechnological area, as in the food, pharmaceutical and biochemical industries. In this scenario, the objective of this doctoral thesis is to develop a rapid microorganism inspection technique based on ultrasound. The hypothesis is that the combination of an ultrasonic measuring device with a liquid medium specially designed to produce and trap bubbles can constitute the basis of a sensitive and rapid detection method for microbial contamination. The proposed technique is effective for catalase-positive bacteria and is based on the catalase-induced hydrolysis of hydrogen peroxide, whose physical consequence is an increasingly bubbly liquid medium. Such a medium has been studied and modeled from the point of view of ultrasonic propagation, and the properties deduced from the enzyme kinetics analysis have been used to evaluate the method as a microbial inspection technique. In this thesis, theoretical and experimental aspects of hydrogen peroxide hydrolysis were analyzed in order to quantitatively describe and understand the detection of catalase-positive microorganisms by means of ultrasonic measurements. More concretely, the experiments performed show how the oxygen produced in the form of bubbles is trapped using a new agar-based gel medium specially designed for this application. Ultrasonic attenuation and backscattering were measured in this medium with a pulse-echo technique along the hydrogen peroxide hydrolysis process. Catalase enzymatic activity was detected down to 0.001 units/ml. Moreover, this study shows that, by means of the proposed method, microbial detection can be achieved down to 10^5 cells/ml in a short time period, of the order of a few minutes. These results represent a significant improvement of three orders of magnitude compared with other ultrasonic detection methods for microorganisms. In addition, the sensitivity reached is competitive with modern rapid microbiological methods such as ATP detection by bioluminescence. Above all, this work points out a way to proceed for developing new rapid microbial detection techniques based on ultrasound.
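The pulse-echo attenuation measurement at the heart of the technique can be sketched as follows. This is an illustrative computation, not the thesis' acquisition code: the echo amplitude through the bubbly gel is compared with a reference echo, with the two-way path doubling the sample thickness.

```python
import math

# Pulse-echo attenuation sketch: alpha = 20*log10(A_ref/A_echo) / (2*d),
# in dB/cm, where the factor 2 accounts for the two-way travel path.
def attenuation_db_per_cm(a_ref, a_echo, d_cm):
    return 20.0 * math.log10(a_ref / a_echo) / (2.0 * d_cm)

# As hydrolysis proceeds, trapped oxygen bubbles scatter more energy, the
# echo amplitude drops, and the apparent attenuation grows.
vals = [attenuation_db_per_cm(1.0, a, 1.0) for a in (1.0, 0.5, 0.1)]
print([round(v, 2) for v in vals])  # -> [0.0, 3.01, 10.0]
```

Tracking this attenuation over time is what turns bubble accumulation, and hence catalase activity, into a measurable signal.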


Traumatic brain injury (TBI) [1] is defined as an acute event that causes damage to areas of the brain. TBI may result in a significant impairment of an individual's physical, cognitive and psychosocial functioning. The main consequence of TBI is a dramatic change in the individual's daily life, involving a profound disruption of the family, a loss of future income capacity and an increase in lifetime cost. One of the main challenges of TBI neuroimaging is to develop robust automated image analysis methods to detect signatures of TBI, such as hyper-intensity areas and changes in image contrast and in brain shape. The final goal of this research is to develop a method that identifies altered brain structures by automatically detecting landmarks on the image where the signal changes, and to provide comprehensive information about them to the clinician. These landmarks identify injured structures by co-registering the patient's image with an atlas where landmarks have been previously detected. The research work has been initiated by identifying brain structures in healthy subjects to validate the proposed method. Later, this method will be used to identify modified structures in TBI imaging studies.
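The landmark co-registration step can be sketched in its simplest form. This is a hypothetical 2D rigid (rotation plus translation) fit with known landmark correspondences, a far simpler model than a full atlas registration, shown only to illustrate how matched landmarks determine the transform:

```python
import math

# Hypothetical sketch: least-squares 2D rigid registration of matched
# landmarks (2D Kabsch): center both sets, recover the rotation angle from
# the cross-covariance terms, then recover the translation.
def rigid_register_2d(atlas_pts, patient_pts):
    n = len(atlas_pts)
    cax = sum(p[0] for p in atlas_pts) / n
    cay = sum(p[1] for p in atlas_pts) / n
    cpx = sum(p[0] for p in patient_pts) / n
    cpy = sum(p[1] for p in patient_pts) / n
    sxx = syy = sxy = syx = 0.0
    for (ax, ay), (px, py) in zip(atlas_pts, patient_pts):
        ax, ay, px, py = ax - cax, ay - cay, px - cpx, py - cpy
        sxx += ax * px; syy += ay * py; sxy += ax * py; syx += ay * px
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    return theta, (cpx - (c * cax - s * cay), cpy - (s * cax + c * cay))

# Toy check: "patient" landmarks are atlas landmarks rotated 30 deg, shifted.
atlas = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 2.0)]
th = math.radians(30)
patient = [(math.cos(th) * x - math.sin(th) * y + 5.0,
            math.sin(th) * x + math.cos(th) * y - 2.0) for x, y in atlas]
theta, t = rigid_register_2d(atlas, patient)
print(round(math.degrees(theta), 3))  # -> 30.0
```

In practice brain registration is 3D and usually non-rigid, but the principle is the same: landmarks matched between patient and atlas constrain the transform, and structures that fail to align flag candidate injuries.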



This paper discusses the target localization problem in wireless visual sensor networks. Additive noise and measurement errors affect the accuracy of target localization when the visual nodes are equipped with low-resolution cameras. With the goal of improving the accuracy of target localization without prior knowledge of the target, each node extracts multiple feature points from images to represent the target at the sensor node level. A statistical method is presented to match the most correlated feature point pair in order to merge the position information of different sensor nodes at the base station. In addition, for the case in which more than one target exists in the field of interest, a scheme for locating multiple targets is provided. Simulation results show that the proposed method performs well in improving the accuracy of locating a single target or multiple targets. Results also show that the proposed method achieves a better trade-off between camera node usage and localization accuracy.
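The fusion step at the base station can be illustrated with an assumed geometry, not the paper's algorithm: once a feature point pair is matched across two nodes, each node contributes a bearing ray toward the target, and the position estimate is the intersection of the rays.

```python
# Illustrative fusion sketch (assumed geometry): each camera node provides a
# bearing ray to the matched feature point; the base station estimates the
# target position as the crossing point of two rays.
def intersect_rays(p1, d1, p2, d2):
    # Solve p1 + t1*d1 = p2 + t2*d2 for the 2D crossing point (Cramer's rule).
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two nodes at known positions observe a target at (3, 4).
node1, node2 = (0.0, 0.0), (10.0, 0.0)
bearing1 = (3.0, 4.0)          # direction from node1 toward the target
bearing2 = (3.0 - 10.0, 4.0)   # direction from node2 toward the target
est = intersect_rays(node1, bearing1, node2, bearing2)
print(est)  # -> (3.0, 4.0)
```

With noisy low-resolution bearings the rays no longer intersect exactly, which is why merging information from multiple matched feature points, as the paper proposes, improves the estimate.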


Paper presented at the XI Workshop of Physical Agents, Valencia, Spain, 9-10 September 2010.


Several recent works deal with 3D data in mobile robotic problems, e.g., mapping. The data come from sensors (time-of-flight cameras, Kinect or 3D lasers) that provide a huge amount of unorganized 3D data. In this paper we detail an efficient approach to building complete 3D models using a soft computing method, the Growing Neural Gas (GNG). As neural models deal easily with noise, imprecision, uncertainty and partial data, GNG provides better results than other approaches. The GNG obtained is then applied to a sequence. We present a comprehensive study of the GNG parameters to ensure the best result at the lowest time cost. From the GNG structure, we propose to calculate planar patches, obtaining a fast method to compute the movement performed by a mobile robot by means of a 3D model registration algorithm. Final results of 3D mapping are also shown.
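The core of GNG is a competitive adaptation rule that pulls network nodes toward the raw 3D samples, which is what makes it tolerant of noisy, unorganized point clouds. A minimal sketch of that rule alone (node insertion and edge aging, which the full algorithm also performs, are omitted; the step sizes are typical illustrative values):

```python
import random

# Minimal GNG adaptation sketch: for each input sample, the closest node
# moves toward it by eps_b and its topological neighbours by eps_n.
def adapt(nodes, edges, sample, eps_b=0.2, eps_n=0.006):
    winner = min(range(len(nodes)),
                 key=lambda i: sum((a - b) ** 2 for a, b in zip(nodes[i], sample)))
    def move(i, eps):
        nodes[i] = tuple(a + eps * (b - a) for a, b in zip(nodes[i], sample))
    move(winner, eps_b)
    for i, j in edges:
        if winner == i:
            move(j, eps_n)
        elif winner == j:
            move(i, eps_n)

random.seed(1)
# Noisy samples from the plane z = 0; two connected nodes start off-plane.
nodes = [(0.0, 0.0, 1.0), (1.0, 1.0, -1.0)]
edges = [(0, 1)]
for _ in range(2000):
    sample = (random.uniform(0, 1), random.uniform(0, 1), random.gauss(0, 0.01))
    adapt(nodes, edges, sample)
print([round(n[2], 2) for n in nodes])  # both nodes settle near the plane
```

Once the network has settled onto the surface, each node and its neighbours summarize a local patch of points, which is the structure the paper exploits to extract planar patches for registration.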