919 results for Optical pattern recognition Data processing
Abstract:
Mounted on the sides of two widely separated spacecraft, the two Heliospheric Imager (HI) instruments onboard NASA’s STEREO mission view, for the first time, the space between the Sun and Earth. These instruments are wide-angle visible-light imagers that incorporate sufficient baffling to eliminate scattered light to the extent that the passage of solar coronal mass ejections (CMEs) through the heliosphere can be detected. Each HI instrument comprises two cameras, HI-1 and HI-2, which have 20° and 70° fields of view and are off-pointed from the Sun direction by 14.0° and 53.7°, respectively, with their optical axes aligned in the ecliptic plane. This arrangement provides coverage over solar elongation angles from 4.0° to 88.7° at the viewpoints of the two spacecraft, thereby allowing the observation of Earth-directed CMEs along the Sun–Earth line to the vicinity of the Earth and beyond. Given the two separated platforms, this also presents the first opportunity to view the structure and evolution of CMEs in three dimensions. The STEREO spacecraft were launched from Cape Canaveral Air Force Base in late October 2006, and the HI instruments have been performing scientific observations since early 2007. The design, development, manufacture, and calibration of these unique instruments are reviewed in this paper. Mission operations, including the initial commissioning phase and the science operations phase, are described. Data processing and analysis procedures are briefly discussed, and ground-test results and in-orbit observations are used to demonstrate that the performance of the instruments meets the original scientific requirements.
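The quoted elongation coverage follows directly from each camera's field of view and off-point angle (centre ± half the field of view). A quick arithmetic check, using only the figures given in the abstract:

```python
# Elongation coverage of the two HI cameras (all values from the abstract).
cameras = {
    "HI-1": {"fov_deg": 20.0, "offpoint_deg": 14.0},
    "HI-2": {"fov_deg": 70.0, "offpoint_deg": 53.7},
}

ranges = {}
for name, c in cameras.items():
    half = c["fov_deg"] / 2.0
    ranges[name] = (c["offpoint_deg"] - half, c["offpoint_deg"] + half)

# HI-1 spans 4.0-24.0 deg and HI-2 spans 18.7-88.7 deg, so the two fields
# overlap and together cover 4.0-88.7 deg of elongation.
combined = (min(r[0] for r in ranges.values()),
            max(r[1] for r in ranges.values()))
print(ranges, combined)
```

The overlap between the two ranges (18.7°-24.0°) is what lets a CME be followed continuously from one camera's field into the other's.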
Abstract:
An algorithm for tracking multiple feature positions in a dynamic image sequence is presented. This is achieved using a combination of two trajectory-based methods, with the resulting hybrid algorithm exhibiting the advantages of both. An optimizing exchange algorithm is described which enables short feature paths to be tracked without prior knowledge of the motion being studied. The resulting partial trajectories are then used to initialize a fast predictor algorithm which is capable of rapidly tracking multiple feature paths. As this predictor algorithm becomes tuned to the feature positions being tracked, it is shown how the location of occluded or poorly detected features can be predicted. The results of applying this tracking algorithm to data obtained from real-world scenes are then presented.
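A minimal sketch of the predictor idea described above, assuming simple constant-velocity extrapolation from the last two points of each partial trajectory (the paper's actual predictor is adaptively tuned, and all names here are illustrative):

```python
# Illustrative multi-feature tracking step: each track predicts its next
# position and claims the nearest detection; if no detection lies close
# enough (occlusion or poor detection), the prediction itself is used.
# Each track must already hold at least two points, e.g. from an
# initialisation stage such as the paper's optimizing exchange algorithm.

def predict(track):
    """Constant-velocity extrapolation from the last two observed points."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

def step(tracks, detections, max_dist=5.0):
    """Extend every track by one frame using the available detections."""
    remaining = list(detections)
    for track in tracks:
        px, py = predict(track)
        if remaining:
            best = min(remaining,
                       key=lambda d: (d[0] - px) ** 2 + (d[1] - py) ** 2)
            if (best[0] - px) ** 2 + (best[1] - py) ** 2 <= max_dist ** 2:
                track.append(best)
                remaining.remove(best)  # each detection feeds one track
                continue
        track.append((px, py))  # occluded: fall back on the prediction
    return tracks
```

The fallback branch is the abstract's point about predicting the location of occluded or poorly detected features: once the predictor is initialised, a missing detection does not break the trajectory.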
Abstract:
The dynamics of inter-regional communication within the brain during cognitive processing – referred to as functional connectivity – are investigated as a control feature for a brain–computer interface. EMDPL is used to map phase synchronization levels between all channel-pair combinations in the EEG. This results in complex networks of channel connectivity at all time–frequency locations. The mean clustering coefficient is then used as a descriptive feature encapsulating information about inter-channel connectivity. Hidden Markov models are applied to characterize and classify the dynamics of the resulting complex networks. Highly accurate classification is achieved when this technique is applied to EEG recorded during real and imagined single finger taps. These results are compared to traditional features used in the classification of a finger-tap BCI, demonstrating that functional connectivity dynamics provide additional information and improved BCI control accuracies.
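The single descriptive feature named above, the mean clustering coefficient, can be sketched for a binary channel-connectivity network as follows (illustrative code only; in the paper the networks come from thresholded EMDPL phase-synchronization maps, and the function names are assumptions):

```python
# Mean clustering coefficient of an undirected binary network given as an
# adjacency matrix (list of lists of 0/1). For each node, the clustering
# coefficient is the fraction of its neighbour pairs that are themselves
# connected; nodes with fewer than two neighbours contribute 0.

def clustering_coefficient(adj, i):
    """Local clustering coefficient of node i."""
    nbrs = [j for j, a in enumerate(adj[i]) if a and j != i]
    k = len(nbrs)
    if k < 2:
        return 0.0
    # Count edges among the neighbours of i (each pair once).
    links = sum(adj[u][v] for x, u in enumerate(nbrs) for v in nbrs[x + 1:])
    return 2.0 * links / (k * (k - 1))

def mean_clustering(adj):
    """Network-level feature: average of the local coefficients."""
    n = len(adj)
    return sum(clustering_coefficient(adj, i) for i in range(n)) / n
```

Computed at every time–frequency location, this scalar gives the feature sequence that the hidden Markov models then classify.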
Abstract:
In a world where massive amounts of data are recorded on a large scale, we need data mining technologies to gain knowledge from the data in a reasonable time. The Top Down Induction of Decision Trees (TDIDT) algorithm is a very widely used technology for predicting the classification of newly recorded data. However, alternative technologies have been derived that often produce better rules but do not scale well on large datasets. One such alternative to TDIDT is the PrismTCS algorithm. PrismTCS performs particularly well on noisy data but does not scale well on large datasets. In this paper we introduce Prism and investigate its scaling behaviour. We describe how we improved the scalability of the serial version of Prism and investigate its limitations. We then describe our work to overcome these limitations by developing a framework to parallelise algorithms of the Prism family and similar algorithms. We also present the scale-up results of a first prototype implementation.
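Unlike TDIDT, the Prism family induces modular rules directly by "separate and conquer": terms are added to a rule one at a time, each chosen to maximise the rule's accuracy on the still-covered examples. A minimal, illustrative sketch of inducing one rule for a target class (real Prism/PrismTCS add rule ordering and pruning; the data layout and names here are assumptions):

```python
# Greedy induction of a single Prism-style rule for rows matching
# row[target] == cls. Rows are dicts of categorical attribute values.

def induce_rule(rows, target, cls):
    """Add attribute=value terms until the covered set is pure."""
    rule, covered = {}, rows
    attrs = [a for a in rows[0] if a != target]
    while any(r[target] != cls for r in covered):
        best, best_p, best_cov = None, -1.0, []
        # Try every unused attribute=value term on the covered rows.
        for a in (a for a in attrs if a not in rule):
            for v in {r[a] for r in covered}:
                cov = [r for r in covered if r[a] == v]
                p = sum(r[target] == cls for r in cov) / len(cov)
                if p > best_p:
                    best, best_p, best_cov = (a, v), p, cov
        if best is None:  # no attributes left to specialise on
            break
        rule[best[0]] = best[1]
        covered = best_cov
    return rule
```

The full algorithm repeats this, removing the rows a finished rule covers, until the class is exhausted; it is this repeated scanning of the dataset that motivates the paper's parallelisation framework.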
Abstract:
Analysis of human behaviour through visual information has been a highly active research topic in the computer vision community. This was previously achieved via images from a conventional camera, but recently depth sensors have made a new type of data available. This survey starts by explaining the advantages of depth imagery, then describes the new sensors that are available to obtain it. In particular, the Microsoft Kinect has made high-resolution real-time depth cheaply available. The main published research on the use of depth imagery for analysing human activity is reviewed. Much of the existing work focuses on body part detection and pose estimation. A growing research area addresses the recognition of human actions. The publicly available datasets that include depth imagery are listed, as are the software libraries that can acquire it from a sensor. This survey concludes by summarising the current state of work on this topic, and pointing out promising future research directions.
Abstract:
This chapter introduces the latest practices and technologies in the interactive interpretation of environmental data. With environmental data becoming ever larger, more diverse and more complex, there is a need for a new generation of tools that provides new capabilities over and above those of the standard workhorses of science. These new tools aid the scientist in discovering interesting new features (and also problems) in large datasets by allowing the data to be explored interactively using simple, intuitive graphical tools. In this way, new discoveries are made that are commonly missed by automated batch data processing. This chapter discusses the characteristics of environmental science data, common current practice in data analysis and the supporting tools and infrastructure. New approaches are introduced and illustrated from the points of view of both the end user and the underlying technology. We conclude by speculating as to future developments in the field and what must be achieved to fulfil this vision.
Abstract:
This study evaluates model-simulated dust aerosols over North Africa and the North Atlantic from five global models that participated in the Aerosol Comparison between Observations and Models phase II model experiments. The model results are compared with satellite aerosol optical depth (AOD) data from Moderate Resolution Imaging Spectroradiometer (MODIS), Multiangle Imaging Spectroradiometer (MISR), and Sea-viewing Wide Field-of-view Sensor, dust optical depth (DOD) derived from MODIS and MISR, AOD and coarse-mode AOD (as a proxy of DOD) from ground-based Aerosol Robotic Network Sun photometer measurements, and dust vertical distributions/centroid height from Cloud Aerosol Lidar with Orthogonal Polarization and Atmospheric Infrared Sounder satellite AOD retrievals. We examine the following quantities of AOD and DOD: (1) the magnitudes over land and over ocean in our study domain, (2) the longitudinal gradient from the dust source region over North Africa to the western North Atlantic, (3) seasonal variations at different locations, and (4) the dust vertical profile shape and the AOD centroid height (altitude above or below which half of the AOD is located). The different satellite data show consistent features in most of these aspects; however, the models display large diversity in all of them, with significant differences among the models and between models and observations. By examining dust emission, removal, and mass extinction efficiency in the five models, we also find remarkable differences among the models that all contribute to the discrepancies of model-simulated dust amount and distribution. This study highlights the challenges in simulating the dust physical and optical processes, even in the best known dust environment, and stresses the need for observable quantities to constrain the model processes.
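The AOD centroid height defined above (the altitude above or below which half of the column AOD lies) can be computed from a discrete extinction profile. A simple illustrative sketch, assuming level heights in km and extinction coefficients at those levels (the function name and the within-layer linear interpolation are assumptions, not the study's retrieval method):

```python
# Centroid height of a vertical extinction profile: the altitude at which
# the cumulative (trapezoidal) optical depth reaches half the column total.

def centroid_height(z_km, ext):
    """Return the altitude splitting the column AOD into two equal halves."""
    # Optical depth of each layer between adjacent levels.
    layers = [(ext[i] + ext[i + 1]) / 2.0 * (z_km[i + 1] - z_km[i])
              for i in range(len(z_km) - 1)]
    total = sum(layers)
    half, cum = total / 2.0, 0.0
    for i, layer in enumerate(layers):
        if cum + layer >= half:
            # Interpolate linearly within the layer containing the median.
            frac = (half - cum) / layer
            return z_km[i] + frac * (z_km[i + 1] - z_km[i])
        cum += layer
    return z_km[-1]
```

Comparing this single number across models and lidar retrievals is what allows the study to contrast dust vertical placement without matching full profile shapes.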
Abstract:
In recent years, there has been increasing interest in the adoption of emerging ubiquitous sensor network (USN) technologies for instrumentation within a variety of sustainability systems. USN is emerging as a sensing paradigm that the sustainability management field is newly considering as an alternative to traditional tethered monitoring systems. Researchers have been discovering that USN is an exciting technology that should not be viewed simply as a substitute for traditional tethered monitoring systems. In this study, we investigate how a movement monitoring system for a complex building is developed as a research environment for USN and related decision-supportive technologies. To address the apparent danger of building movement, agent-mediated communication concepts have been designed to autonomously manage large volumes of exchanged information. We additionally detail the design of the proposed system, including its principles, data processing algorithms, system architecture, and user interface specifics. Results of the test and case study demonstrate the effectiveness of the USN-based data acquisition system for real-time monitoring of movement operations.
Abstract:
Three coupled knowledge transfer partnerships used pattern recognition techniques to produce an e-procurement system which, the National Audit Office reports, could save the National Health Service £500m per annum. An extension to the system, GreenInsight, allows the environmental impact of procurements to be assessed and savings made. Both systems require suitable products to be discovered and equivalent products recognised, for which classification is a key component. This paper describes the innovative work done for product classification, feature selection and reducing the impact of mislabelled data.
Abstract:
Environment monitoring applications using Wireless Sensor Networks (WSNs) have received considerable attention in recent years. In much of this research, tasks such as sensor data processing, decision making about environment states and events, and emergency message sending are performed by a remote server. A proposed cross-layer protocol for two different applications, in which the reliability of delivered data, delay, and network lifetime need to be considered, has been simulated, and the results are presented in this paper. A WSN designed for the proposed applications needs efficient MAC and routing protocols to guarantee the reliability of the data delivered from source nodes to the sink. A cross-layer protocol based on the design given in [1] has been extended and simulated for the proposed applications, with new features such as route discovery algorithms added. Simulation results show that the proposed cross-layer protocol can conserve energy for nodes and provide the required performance in terms of network lifetime, delay, and reliability.
Abstract:
Dendritic cells (DC) can produce Th-polarizing cytokines and direct the class of the adaptive immune response. Microbial stimuli, cytokines, chemokines, and T cell-derived signals all have been shown to trigger cytokine synthesis by DC, but it remains unclear whether these signals are functionally equivalent and whether they determine the nature of the cytokine produced or simply initiate a preprogrammed pattern of cytokine production, which may be DC subtype specific. Here, we demonstrate that microbial and T cell-derived stimuli can synergize to induce production of high levels of IL-12 p70 or IL-10 by individual murine DC subsets but that the choice of cytokine is dictated by the microbial pattern recognition receptor engaged. We show that bacterial components such as CpG-containing DNA or extracts from Mycobacterium tuberculosis predispose CD8alpha(+) and CD8alpha(-)CD4(-) DC to make IL-12 p70. In contrast, exposure of CD8alpha(+), CD4(+) and CD8alpha(-)CD4(-) DC to heat-killed yeasts leads to production of IL-10. In both cases, secretion of high levels of cytokine requires a second signal from T cells, which can be replaced by CD40 ligand. Consistent with their differential effects on cytokine production, extracts from M. tuberculosis promote IL-12 production primarily via Toll-like receptor 2 and an MyD88-dependent pathway, whereas heat-killed yeasts activate DC via a Toll-like receptor 2-, MyD88-, and Toll/IL-1R domain containing protein-independent pathway. These results show that T cell feedback amplifies innate signals for cytokine production by DC and suggest that pattern recognition rather than ontogeny determines the production of cytokines by individual DC subsets.
Abstract:
Acute kidney injury (AKI) is an important clinical syndrome characterized by abnormalities in the hydroelectrolytic balance. Because of the high rates of morbidity and mortality (from 15% to 60%) associated with AKI, the study of its pathophysiology is critical in the search for clinical targets and therapeutic strategies. Severe sepsis is the major cause of AKI. The host response to sepsis involves an inflammatory response, whereby the pathogen is initially sensed by innate immune receptors (pattern recognition receptors [PRRs]). When it persists, this immune response leads to secretion of proinflammatory products that induce organ dysfunction such as renal failure and consequently increased mortality. Moreover, the injured tissue releases molecules resulting from extracellular matrix degradation or dying cells that function as alarmins, which are recognized by PRRs in the absence of pathogens in a second wave of injury. Toll-like receptors (TLRs) and NOD-like receptors (NLRs) are the best characterized PRRs. They are expressed in many cell types and throughout the nephron. Their activation leads to translocation of nuclear factors and synthesis of proinflammatory cytokines and chemokines. TLR signaling primes the cells for a robust inflammatory response dependent on NLRs; the interaction of TLRs and NLRs gives rise to the multiprotein complex known as the inflammasome, which in turn activates secretion of mature interleukin 1 beta and interleukin 18. Experimental data show that innate immune receptors, the inflammasome components, and proinflammatory cytokines play crucial roles not only in sepsis but also in sepsis-induced organ dysfunction, especially in the kidneys. In this review, we discuss the significance of the innate immune receptors in the development of acute renal injury secondary to sepsis.
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. Because predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods in the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
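One of the most common graphical methods such surveys cover is the ROC curve, which plots the trade-off between false-positive rate and true-positive rate as the decision threshold varies. A small illustrative sketch of building its points (the function name and data are assumptions, and tied scores are not grouped, as a full implementation would do):

```python
# Build ROC points by sweeping the decision threshold downwards over the
# classifier scores. Each detection adds either a true positive (label 1)
# or a false positive (label 0), moving the curve up or right.

def roc_points(scores, labels):
    """Return (fpr, tpr) pairs, one per example, starting at (0, 0)."""
    pos = sum(labels)
    neg = len(labels) - pos
    pts = [(0.0, 0.0)]
    tp = fp = 0
    for s, y in sorted(zip(scores, labels), reverse=True):
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts
```

The curve itself, rather than a scalar summary of it, is what lets an analyst see at which operating points one classifier dominates another.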