970 results for Automatic Peak Detection
Abstract:
This thesis proposes design methods and test tools for optical systems intended for industrial environments, where not only precision and reliability but also ease of use are important. The approach has been conceived to be as general as possible, although the present work focuses on the design of a portable device for automatic identification applications, because this doctorate has been funded by Datalogic Scanning Group s.r.l., a world-class producer of barcode readers. The main functional components of the complete device are the electro-optical imaging, illumination and pattern generator systems. For the electro-optical imaging system, a characterization tool and an analysis tool have been developed to check whether the desired performance has been achieved. Moreover, two design tools for optimizing the imaging system have been implemented. The first optimizes just the core of the system, the optical part, improving its performance while ignoring all other contributions and generating a good starting point for the optimization of the whole complex system. The second tool optimizes the system with a model as close to reality as possible, including optics, electronics and detection. For the illumination and pattern generator systems, two tools have been implemented. The first allows the design of free-form lenses described by an arbitrary analytical function and excited by an incoherent source, and can provide custom illumination conditions for all kinds of applications. The second is a new method to design Diffractive Optical Elements excited by a coherent source for large pattern angles, using the Iterative Fourier Transform Algorithm. The design tools have been validated, wherever possible, by comparing the performance of the designed systems with that of fabricated prototypes; in the remaining cases, simulations have been used.
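The Iterative Fourier Transform Algorithm named above is not detailed in the abstract; the following minimal sketch shows the conventional Gerchberg-Saxton-style loop that IFTA variants build on, with the grid size, target pattern and iteration count as illustrative assumptions. Note that the single-FFT propagator used here is paraxial, so it becomes inaccurate at the large pattern angles the thesis targets; the sketch only shows the conventional starting point.

```python
import numpy as np

def ifta(target_amplitude, n_iter=100):
    """Gerchberg-Saxton-style IFTA sketch: find a phase-only DOE whose
    far field (modeled here by a single FFT) approximates the target."""
    rng = np.random.default_rng(0)
    # Start from the target amplitude with a random phase to avoid stagnation.
    field = target_amplitude * np.exp(2j * np.pi * rng.random(target_amplitude.shape))
    for _ in range(n_iter):
        # Back-propagate to the DOE plane and keep only the phase
        # (constraint: phase-only element under uniform illumination).
        doe_phase = np.angle(np.fft.ifft2(field))
        # Forward-propagate the phase-only element ...
        far = np.fft.fft2(np.exp(1j * doe_phase))
        # ... and re-impose the desired far-field amplitude.
        field = target_amplitude * np.exp(1j * np.angle(far))
    return doe_phase

# Example: a cross-shaped target pattern on a 256 x 256 grid.
target = np.zeros((256, 256))
target[128, :] = target[:, 128] = 1.0
phase_profile = ifta(target)
```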
From fall-risk assessment to fall detection: inertial sensors in the clinical routine and daily life
Abstract:
Falls are caused by complex interactions between multiple risk factors, which may be modified by age, disease and environment. A variety of methods and tools for fall-risk assessment have been proposed, but none is universally accepted, and existing tools are generally not capable of providing a quantitative, predictive assessment of fall risk. Objective, cost-effective and clinically applicable methods are needed to enable quantitative assessment of fall risk on a subject-specific basis. Objectively tracking fall risk could provide timely feedback about the effectiveness of administered interventions, enabling intervention strategies to be modified or changed if found to be ineffective. Moreover, some of the fundamental factors leading to falls, and what actually happens during a fall, remain unclear. Objectively documented and measured falls are needed to improve knowledge of falls, in order to develop more effective prevention strategies and prolong independent living. In the last decade, several research groups have developed sensor-based automatic or semi-automatic fall-risk assessment tools using wearable inertial sensors; the same approach may also serve to detect falls. At present, i) fall-risk assessment studies based on inertial sensors, even if promising, lack a biomechanical model-based approach which could provide accurate and more detailed measurements of interest (e.g., joint moments and forces), and ii) the amount of published data on real-world falls of older people is minimal, since most authors have used simulations with healthy volunteers as a surrogate for real-world falls. With these limitations in mind, this thesis aims i) to propose a novel method for the kinematic and dynamic evaluation of functional motor tasks, often used in clinics for fall-risk evaluation, through a body sensor network and a biomechanical approach, and ii) to define guidelines for a fall detection algorithm based on a real-world fall database.
Abstract:
This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously, while fulfilling the tasks for which they have been deployed in the first place. Many of these tasks require a deep analysis of the language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures, with the help of linguistic information encoded in a grammar G and a lexicon L: G + L + C → S (1) The idea underlying intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon: G + L + S → L' (2) Moreover, the thesis claims that a system can only be considered intelligent if it does not just make maximum use of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. One of the central elements of this work is therefore the formulation of a set of criteria for intelligent lexical acquisition systems, subsumed under one paradigm: the Learn-Alpha design rule. The thesis describes the design and quality of a prototype for such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments, in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora. To name four major challenges of constructing such a system: a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity management system; b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment; c) the reliability of these entries depends on the system's decision on whether it has seen 'enough' input; and d) general properties of language might render some lexical features indeterminable if the system tries to acquire them with too high a precision. The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon. This work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars and learning by unification. Then the Learn-Alpha design rule is postulated. The second chapter outlines the theory underlying Learn-Alpha and introduces the notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha.
The fourth chapter presents the design and results of a bootstrapping experiment conducted on this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, and selection of prepositions and sentential complements, among others. The thesis concludes with a summary of the findings, motivation for further improvements, and proposals for future research on the automatic induction of lexical features.
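Formulas (1) and (2) suggest an iterative acquire-and-revise loop. The sketch below is a purely hypothetical Python rendering of that loop; the callables parse, generalize and revise, and their signatures, are illustrative assumptions, not the thesis's ANALYZE-LEARN-REDUCE framework.

```python
def lexical_acquisition_loop(grammar, lexicon, corpus, parse, generalize, revise):
    """Hypothetical sketch of formulas (1) and (2):
    G + L + C -> S, then G + L + S -> L'."""
    for utterance in corpus:
        # (1) Deep analysis: map the utterance to linguistically
        # motivated structures using the grammar and current lexicon.
        structures = parse(grammar, lexicon, utterance)
        # (2) Exploit the structures to produce an improved lexicon:
        # generalize observed lexical facts into concise entries ...
        lexicon = generalize(lexicon, structures)
        # ... and, per the Learn-Alpha requirement, revise entries
        # that turn out to have been falsely acquired.
        lexicon = revise(lexicon, structures)
    return lexicon
```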
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. Three practical tools are presented. Two of them are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third is an efficient algorithm for data segmentation implemented as part of Insight. Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining data from different sources at runtime, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support have led to additional applications of the software, of which two examples are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations. The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
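The abstract does not spell out how genesis, lysis, merging and splitting events are localized; a common overlap-based scheme for tracking labeled features across consecutive time steps (an illustrative sketch, not the thesis's algorithm) looks like this:

```python
import numpy as np

def track_events(labels_t0, labels_t1):
    """Classify feature events between two consecutive labeled
    segmentations (0 = background) by counting voxel overlaps."""
    overlaps = {}  # (feature_t0, feature_t1) -> shared voxel count
    mask = (labels_t0 > 0) & (labels_t1 > 0)
    for a, b in zip(labels_t0[mask], labels_t1[mask]):
        overlaps[(a, b)] = overlaps.get((a, b), 0) + 1
    succ = {}  # feature at t0 -> overlapping features at t1
    pred = {}  # feature at t1 -> overlapping features at t0
    for (a, b) in overlaps:
        succ.setdefault(a, set()).add(b)
        pred.setdefault(b, set()).add(a)
    events = []
    for a in set(labels_t0[labels_t0 > 0]):
        if a not in succ:
            events.append(("lysis", a))           # feature disappears
        elif len(succ[a]) > 1:
            events.append(("split", a, succ[a]))  # feature splits
    for b in set(labels_t1[labels_t1 > 0]):
        if b not in pred:
            events.append(("genesis", b))         # feature appears
        elif len(pred[b]) > 1:
            events.append(("merge", pred[b], b))  # features merge
    return events
```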
Abstract:
Autism Spectrum Disorders (ASDs) are a set of neurodevelopmental disorders and represent a significant public health problem. Currently, ASDs are not diagnosed before the 2nd year of life, but early identification would be crucial, as early interventions are much more effective than specific therapies starting in later childhood. To this aim, cheap and contactless automatic approaches have recently aroused great clinical interest. Among them, the cry and the movements of the newborn, both involving the central nervous system, have been proposed as possible indicators of neurological disorders. This PhD work is a first step towards solving this challenging problem. An integrated system is presented enabling the recording of audio (crying) and video (movements) data of the newborn, their automatic analysis with innovative techniques for the extraction of clinically relevant parameters, and their classification with data mining techniques. New robust algorithms were developed for the selection of the voiced parts of the cry signal, the estimation of acoustic parameters based on the wavelet transform, and the analysis of the infant’s general movements (GMs) through a new body model for segmentation and 2D reconstruction. A thorough review of the state of the art on these topics shows that no studies exist concerning normative ranges for newborn infant cry in the first 6 months of life, nor concerning the correlation between cry and movements. Using the new automatic methods, a population of control infants (“low-risk”, LR) was compared to a group of “high-risk” (HR) infants, i.e. siblings of children already diagnosed with ASD. A subset of LR infants clinically diagnosed with Typical Development (TD) was also compared with one infant affected by ASD. The results show that the selected acoustic parameters allow good differentiation between the two groups, a result that opens new diagnostic and therapeutic perspectives.
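The abstract mentions algorithms for selecting the voiced parts of the cry signal without describing them; a classical baseline for such selection (sketched below with illustrative frame length and threshold values, not the thesis's actual method) combines short-time energy with the zero-crossing rate:

```python
import numpy as np

def voiced_frames(signal, sr, frame_ms=25, energy_q=0.5, zcr_max=0.25):
    """Flag frames as voiced when the short-time energy is high and
    the zero-crossing rate is low (both thresholds are illustrative)."""
    n = int(sr * frame_ms / 1000)
    frames = signal[: len(signal) // n * n].reshape(-1, n)
    energy = (frames ** 2).mean(axis=1)                      # short-time energy
    zcr = (np.diff(np.sign(frames), axis=1) != 0).mean(axis=1)  # zero crossings
    return (energy > np.quantile(energy, energy_q)) & (zcr < zcr_max)
```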
Abstract:
Navigated ultrasound (US) imaging is used for the intra-operative acquisition of 3D image data during image-guided surgery. The presented approach includes the design of a compact and easy-to-use US calibration device and its integration into a software application for navigated liver surgery. User interaction during the calibration process is minimized through automatic detection of the calibration process, followed by automatic image segmentation, calculation of the calibration transform, and validation of the obtained result. This leads to a fast, interaction-free and fully automatic calibration procedure enabling intra-operative
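The abstract mentions calculation of the calibration transform without detailing the algorithm; one common building block in such calibration pipelines (shown below purely as an illustration, not the paper's method) is point-based rigid registration, i.e. the least-squares rotation and translation between corresponding 3D points:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch/Umeyama without scale)
    mapping Nx3 points src onto corresponding points dst: dst ~ R @ src + t."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force a proper rotation with det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t
```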
Abstract:
A new approach for the determination of free and total valproic acid in small (140 μL) samples of human plasma, based on capillary electrophoresis with contactless conductivity detection, is proposed. A dispersive liquid-liquid microextraction technique was employed to remove biological matrices prior to instrumental analysis. Free valproic acid was determined by isolating it from the protein-bound fraction by ultrafiltration under centrifugation of a 100 μL sample; the filtrate was acidified to turn the valproic acid into its protonated, neutral form and then extracted. Total valproic acid was determined by acidifying 40 μL of untreated plasma to release the protein-bound valproic acid prior to extraction. A solution consisting of 10 mM histidine, 10 mM 3-(N-morpholino)propanesulfonic acid and 10 μM hexadecyltrimethylammonium bromide at pH 6.5 was used as the background electrolyte for the electrophoretic separation. The method showed good linearity in the range of 0.4-300 μg/mL, with a correlation coefficient of 0.9996. The limit of detection was 0.08 μg/mL, and the reproducibility of the peak area was excellent (RSD = 0.7-3.5%, n = 3, for the concentration range from 1 to 150 μg/mL). The results for the free and total valproic acid concentrations in human plasma were comparable to those obtained with a standard immunoassay, with correlation coefficients of 0.9847 for free and 0.9521 for total valproic acid.
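The reported linearity (r = 0.9996 over 0.4-300 μg/mL) and limit of detection come from a calibration curve; the sketch below shows the standard computation with made-up example data, using the common ICH-style convention LOD = 3.3·σ/slope, which may differ from the authors' definition:

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs. peak area.
conc = np.array([0.4, 1, 5, 25, 75, 150, 300])
area = np.array([0.21, 0.52, 2.6, 13.1, 38.9, 77.5, 155.8])

slope, intercept = np.polyfit(conc, area, 1)               # linear fit
r = np.corrcoef(conc, area)[0, 1]                          # linearity check
resid_sd = np.std(area - (slope * conc + intercept), ddof=2)
lod = 3.3 * resid_sd / slope                               # ICH-style LOD

print(f"r = {r:.4f}, LOD ~ {lod:.2f} ug/mL")
```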
Abstract:
Milk cortisol concentration was determined under routine management conditions on 4 farms with an auto-tandem milking parlor and 8 farms with 1 of 2 automatic milking systems (AMS). One of the AMS was a partially forced (AMSp) system, and the other was a free cow traffic (AMSf) system. Milk samples were collected for all the cows on a given farm (20 to 54 cows) for at least 1 d. Behavioral observations were made during the milking process for a subset of 16 to 20 cows per farm. Milk cortisol concentration was evaluated by milking system, time of day, behavior during milking, daily milk yield, and somatic cell count using linear mixed-effects models. Milk cortisol did not differ between systems (AMSp: 1.15 ± 0.07; AMSf: 1.02 ± 0.12; auto-tandem parlor: 1.01 ± 0.16 nmol/L). Cortisol concentrations were lower in evening than in morning milkings (1.01 ± 0.12 vs. 1.24 ± 0.13 nmol/L). The daily periodicity of cortisol concentration was characterized by an early morning peak and a late afternoon elevation in AMSp. A bimodal pattern was not evident in AMSf. Finally, milk cortisol decreased by a factor of 0.915 in milking parlors, by 0.998 in AMSp, and increased by a factor of 1.161 in AMSf for each unit of ln(somatic cell count/1,000). We conclude that milking cows in milking parlors or AMS does not result in relevant stress differences as measured by milk cortisol concentrations. The biological relevance of the difference regarding the daily periodicity of milk cortisol concentrations observed between the AMSp and AMSf needs further investigation.
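The analysis relies on linear mixed-effects models with farm-level grouping; a minimal sketch of such a model in Python's statsmodels is shown below. The variable names, the file layout and the random-effects structure are assumptions, not the paper's exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame, one row per milking, with columns:
# ln_cortisol, system (AMSp/AMSf/parlor), daytime (am/pm),
# ln_scc (= ln(SCC/1000)), milk_yield, farm
df = pd.read_csv("milk_cortisol.csv")  # assumed file layout

# Fixed effects for system, time of day, yield and ln(SCC/1000);
# random intercept per farm (a nested cow effect could be added).
model = smf.mixedlm(
    "ln_cortisol ~ system + daytime + ln_scc + milk_yield",
    data=df,
    groups=df["farm"],
)
print(model.fit().summary())
```

On the log scale, an estimated coefficient β for ln_scc corresponds to the multiplicative factors quoted in the abstract via exp(β).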
Abstract:
OBJECTIVE: Smuggling dissolved drugs, especially cocaine, in bottled liquids is an ongoing problem at borders. Common fluoroscopy of packages at the border cannot detect contaminated liquids. The objective of our study was to develop an MDCT screening method to detect cocaine-containing vessels hidden among uncontaminated ones in a shipment. MATERIALS AND METHODS: Studies were performed on three wine bottles containing cocaine solutions that were confiscated at the Swiss border. Reference values were obtained by scans of different sorts of commercially available wine and aqueous solutions of dissolved sugar. All bottles were scanned using MDCT, and data evaluation was performed by measuring the mean peak of Hounsfield units. To verify the method, simulated testing was performed. RESULTS: Measuring the mean peak of Hounsfield units enables the detection of dissolved cocaine in wine bottles in a noninvasive and rapid fashion. Increasing opacity corresponds well with the concentration of dissolved cocaine. Simulated testing showed that it is possible to distinguish between cocaine-contaminated and uncontaminated wine bottles. CONCLUSION: The described method is an efficacious screening method to detect cocaine-contaminated bottles hidden among untreated bottles in cargo. The noninvasive examination of cargo allows a questionable delivery to be tracked without arousing the suspicion of the smugglers.
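The screening criterion is the mean peak of Hounsfield units within each bottle; a simple way to compute such a statistic from the voxels of a bottle region (sketched below; the ROI handling, bin width and histogram-peak definition are assumptions) is:

```python
import numpy as np

def mean_peak_hu(hu_values, bin_width=2.0):
    """Estimate the histogram peak of Hounsfield units inside a
    bottle ROI: find the modal bin and average the values in it."""
    hu = np.asarray(hu_values, dtype=float)
    edges = np.arange(hu.min(), hu.max() + bin_width, bin_width)
    counts, edges = np.histogram(hu, bins=edges)
    i = counts.argmax()
    in_peak = (hu >= edges[i]) & (hu < edges[i + 1])
    return hu[in_peak].mean()

# Dissolved cocaine raises the liquid's opacity, so a bottle whose
# mean peak HU exceeds the reference range for plain wine is flagged.
```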
Abstract:
The accuracy of Global Positioning System (GPS) time series is degraded by the presence of offsets. To assess the effectiveness of methods that detect and remove these offsets, we designed and managed the Detection of Offsets in GPS Experiment. We simulated time series that mimicked realistic GPS data consisting of a velocity component, offsets, and white and flicker (1/f spectrum) noise, combined in an additive model. The data set was made available to the GPS analysis community without revealing the offsets, and several groups conducted blind tests with a range of detection approaches. The results show that, at present, manual methods (where offsets are hand-picked) almost always give better results than automated or semi-automated methods (two automated methods give velocity biases quite similar to those of the best manual solutions). For instance, the 5th-95th percentile range in velocity bias for automated approaches is equal to 4.2 mm/yr (most commonly ±0.4 mm/yr from the truth), whereas it is equal to 1.8 mm/yr for the manual solutions (most commonly 0.2 mm/yr from the truth). The magnitude of offsets detectable by manual solutions is smaller than for automated solutions, with the smallest detectable offset for the best manual and automatic solutions equal to 5 mm and 8 mm, respectively. Assuming the simulated time series noise levels are representative of real GPS time series, geophysical interpretation of individual site velocities lower than 0.2–0.4 mm/yr is therefore certainly not robust, and a limit nearer 1 mm/yr would be a more conservative choice. Further work to improve offset detection in GPS coordinate time series is required before we can routinely interpret sub-mm/yr velocities for single GPS stations.
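The simulated series combine a linear velocity, step offsets, and white plus flicker noise in an additive model; the sketch below generates such a series. All amplitudes and the FFT-based 1/f synthesis are illustrative assumptions, not the experiment's exact recipe.

```python
import numpy as np

rng = np.random.default_rng(42)

def flicker_noise(n, sigma=1.0):
    """Approximate flicker noise by shaping white noise in the
    frequency domain with 1/sqrt(f), giving a 1/f power spectrum."""
    f = np.fft.rfftfreq(n, d=1.0)
    spec = np.fft.rfft(rng.standard_normal(n))
    spec[1:] /= np.sqrt(f[1:])
    x = np.fft.irfft(spec, n)
    return sigma * x / x.std()

n_days = 3650                            # ten years of daily solutions
t = np.arange(n_days) / 365.25           # time in years
series = (
    2.0 * t                              # velocity component: 2 mm/yr
    + 6.0 * (t > 4.1)                    # a 6 mm offset after ~4 years
    + rng.standard_normal(n_days)        # white noise, 1 mm
    + flicker_noise(n_days, sigma=2.0)   # flicker noise, 2 mm
)
```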
Abstract:
Background Atrial fibrillation (AF) is common and may have severe consequences. Continuous long-term electrocardiogram (ECG) recording is widely used for AF screening. Recently, commercial ECG analysis software was launched that automatically detects AF in long-term ECGs, and it has been claimed that such tools offer reliable AF screening and save time for ECG analysis. However, this had not been investigated in a real-life patient cohort. Objective To investigate the performance of automatic software-based screening for AF in long-term ECGs. Methods Two independent physicians manually screened 22,601 hours of continuous long-term ECGs from 150 patients for AF. Presence, number, and duration of AF episodes were registered. Subsequently, the recordings were screened for AF by an established ECG analysis software (Pathfinder SL), and its performance was validated against the thorough manual analysis (gold standard). Results Sensitivity and specificity for AF detection were 98.5% (95% confidence interval 91.72%–99.96%) and 80.21% (95% confidence interval 70.83%–87.64%), respectively. Software-based AF detection was inferior to manual analysis by physicians (P < .0001). Median AF duration was underestimated (19.4 hours vs 22.1 hours; P < .001) and median number of AF episodes was overestimated (32 episodes vs 2 episodes; P < .001) by the software. In comparison to extensive quantitative manual ECG analysis, software-based analysis saved time (2 minutes vs 19 minutes; P < .001). Conclusion Owing to its high sensitivity and ability to save time, software-based ECG analysis may be used as a screening tool for AF. An additional manual confirmatory analysis may be required to reduce the number of false-positive findings.
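The reported sensitivity and specificity come with 95% confidence intervals; the sketch below shows the standard computation from a 2x2 confusion table with exact (Clopper-Pearson) intervals. The counts are chosen to roughly reproduce the abstract's percentages but are otherwise made up.

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided binomial confidence interval for k successes in n."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Hypothetical patient counts: software vs. gold-standard manual analysis.
tp, fn = 65, 1      # AF present: detected / missed
tn, fp = 77, 19     # AF absent: correctly negative / false alarms

sens, sens_ci = tp / (tp + fn), clopper_pearson(tp, tp + fn)
spec, spec_ci = tn / (tn + fp), clopper_pearson(tn, tn + fp)
print(f"sensitivity {sens:.3f} CI {sens_ci}, specificity {spec:.3f} CI {spec_ci}")
```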
Abstract:
Motivated by the reported dearth of debris discs around M stars, we use survival models to study the occurrence of planetesimal discs around them. These survival models describe a planetesimal disc with a small number of parameters, determine whether it may survive a series of dynamical processes, and compute the associated infrared excess. For the Wide-field Infrared Survey Explorer (WISE) satellite, we demonstrate that the dearth of debris discs around M stars may be attributed to the small semimajor axes generally probed if either (1) the dust grains behave like blackbodies emitting at a peak wavelength coincident with the observed one, or (2) the grains are hotter than predicted by their blackbody temperatures and emit at peak wavelengths shorter than the observed one. At these small distances from the M star, planetesimals are unlikely to survive or persist for time-scales of 300 Myr or longer if the disc is too massive. Conversely, our survival models allow for the existence of a large population of low-mass debris discs that are too faint to be detected with current instruments. We gain further confidence in our interpretation by demonstrating the ability to compute infrared excesses for Sun-like stars that are broadly consistent with reported values in the literature. However, our interpretation becomes less clear and large infrared excesses are allowed if instead (3) the dust grains are hotter than blackbody and predominantly emit at the observed wavelength, or (4) the grains are blackbody in nature and emit at peak wavelengths longer than the observed one. Both scenarios imply that the parent planetesimals reside at larger distances from the star than inferred if the dust grains behaved like blackbodies. In all scenarios, we show that the infrared excesses detected at 22 μm (via WISE) and 70 μm (via Spitzer) from AU Mic are easily reconciled with its young age (12 Myr). Conversely, the existence of the old debris disc (2–8 Gyr) around GJ 581 is explained by the large semimajor axes probed by the Herschel PACS instrument. We elucidate the conditions under which stellar wind drag may be neglected when considering dust populations around M stars. The WISE satellite should be capable of detecting debris discs around young M stars with ages ∼10 Myr.
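The scenarios hinge on the relation between a grain's temperature, its peak emission wavelength, and its distance from the star; for blackbody grains the standard relations (radiative-equilibrium temperature and Wien displacement) are sketched below, with the roughly AU Mic-like luminosity and semimajor axis as illustrative inputs:

```python
import numpy as np

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
B_WIEN = 2.898e-3     # Wien displacement constant, m K
L_SUN = 3.828e26      # solar luminosity, W
AU = 1.496e11         # astronomical unit, m

def blackbody_dust_temperature(l_star, a_au):
    """Equilibrium temperature of a blackbody grain at semimajor axis
    a (in au) around a star of luminosity l_star (in L_sun):
    T = (L / (16 pi sigma a^2))^(1/4)."""
    a = a_au * AU
    return (l_star * L_SUN / (16 * np.pi * SIGMA * a**2)) ** 0.25

# Illustrative M-dwarf values (roughly AU Mic-like, L ~ 0.09 L_sun):
T = blackbody_dust_temperature(0.09, 10.0)      # grains at 10 au
peak_um = B_WIEN / T * 1e6                      # Wien peak, microns
print(f"T ~ {T:.0f} K, peak wavelength ~ {peak_um:.0f} um")  # ~48 K, ~60 um
```

Grains hotter than these blackbody estimates (scenarios 2 and 3) shift the Wien peak to shorter wavelengths at the same distance, which is what makes the inferred planetesimal locations degenerate.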
Abstract:
Ophthalmologists typically acquire different image modalities to diagnose eye pathologies, e.g., fundus photography, Optical Coherence Tomography (OCT), Computed Tomography (CT) and Magnetic Resonance Imaging (MRI). These images are often complementary and express the same pathologies in different ways; some pathologies are only visible in a particular modality. Thus, it is beneficial for the ophthalmologist to have these modalities fused into a single patient-specific model. The goal of the presented article is the fusion of fundus photography with segmented MRI volumes, which adds information to the MRI that was not visible before, such as vessels and the macula. The article’s contributions include automatic detection of the optic disc, the fovea and the optic axis, and an automatic segmentation of the vitreous humor of the eye.
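The article's detection pipeline is not described in the abstract; purely as an illustration, a naive classical baseline for optic disc localization exploits the fact that the disc is usually the brightest compact region of a fundus image (heavy Gaussian smoothing, then argmax; the channel choice and smoothing scale are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def locate_optic_disc(fundus_rgb, sigma=25):
    """Naive optic disc localization: smooth heavily and take the
    brightest pixel of the green channel. Returns (row, col)."""
    green = fundus_rgb[..., 1].astype(float)   # green channel: good disc contrast
    smooth = gaussian_filter(green, sigma=sigma)
    return np.unravel_index(np.argmax(smooth), smooth.shape)
```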
Abstract:
OBJECTIVE In contrast to conventional breast imaging techniques, one major diagnostic benefit of breast magnetic resonance imaging (MRI) is the simultaneous acquisition of morphologic and dynamic enhancement characteristics, which are based on angiogenesis and therefore provide insights into tumor pathophysiology. The aim of this investigation was to intraindividually compare 2 macrocyclic MRI contrast agents, with low risk for nephrogenic systemic fibrosis, in the morphologic and dynamic characterization of histologically verified mass breast lesions, analyzed by blinded human evaluation and a fully automatic computer-assisted diagnosis (CAD) technique. MATERIALS AND METHODS Institutional review board approval and patient informed consent were obtained. In this prospective, single-center study, 45 women with 51 histopathologically verified (41 malignant, 10 benign) mass lesions underwent 2 identical examinations at 1.5 T (mean time interval, 2.1 days) with 0.1 mmol/kg doses of gadoteric acid and gadobutrol. All magnetic resonance images were visually evaluated by 2 experienced, blinded breast radiologists in consensus and by an automatic CAD system; the morphologic and dynamic characterization as well as the final human classification of lesions were performed based on the categories of the Breast Imaging Reporting and Data System (BI-RADS) MRI atlas. Lesions were also classified by the CAD system, which defines their probability of malignancy (morpho-dynamic index; 0%-100%). Imaging results were correlated with histopathology as the gold standard. RESULTS The CAD system coded 49 of 51 lesions with both gadoteric acid and gadobutrol (detection rate, 96.1%); the initial signal increase was significantly higher for gadobutrol than for gadoteric acid for all lesions and for the malignant coded lesions (P < 0.05). Gadoteric acid resulted in more postinitial washout curves and fewer continuous increases for all and for the malignant lesions compared with gadobutrol (CAD hot spot regions, P < 0.05). Morphologically, the margins of the malignancies differed between the 2 agents, with gadobutrol demonstrating more spiculated and fewer smooth margins (P < 0.05). Lesion classifications by the human observers and by the morpho-dynamic index, compared with the histopathologic results, did not differ significantly between gadoteric acid and gadobutrol. CONCLUSIONS Macrocyclic contrast media can be reliably used for breast dynamic contrast-enhanced MRI. However, gadoteric acid and gadobutrol differed in some aspects of the dynamic and morphologic characterization of histologically verified breast lesions in an intraindividual comparison. Besides the standardization of technical parameters and imaging evaluation of breast MRI, standardization of the applied contrast medium seems to be important to obtain the most comparable MRI interpretations.
Abstract:
Aviation security strongly depends on screeners' performance in the detection of threat objects in x-ray images of passenger bags. We examined for the first time the effects of stress and stress-induced cortisol increases on the detection of hidden weapons in an x-ray baggage screening task. We randomly assigned 48 participants to either a stress or a nonstress group. The stress group was exposed to a standardized psychosocial stress test (TSST). Before and after stress/nonstress, participants had to detect threat objects in a computer-based object recognition test (X-ray ORT), and salivary cortisol was measured repeatedly. Cortisol increases in reaction to psychosocial stress induction, but not to the nonstress condition, independently impaired x-ray detection performance. Our results suggest that stress-induced cortisol increases at peak reactivity impair x-ray screening performance.