903 results for Data-driven Methods


Relevance:

100.00%

Publisher:

Abstract:

We address the important bioinformatics problem of predicting protein function from a protein's primary sequence. We consider the functional classification of G-Protein-Coupled Receptors (GPCRs), whose functions are specified in a class hierarchy. We tackle this task using a novel top-down hierarchical classification system where, for each node in the class hierarchy, the predictor attributes to be used in that node and the classifier to be applied to the selected attributes are chosen in a data-driven manner. Compared with a previous hierarchical classification system selecting classifiers only, our new system significantly reduced processing time without significantly sacrificing predictive accuracy.
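The abstract does not spell out the selection procedure beyond calling it data-driven; the following is a minimal sketch, assuming a generic per-node search over candidate attribute subsets and classifiers via cross-validation. The candidate lists and the select_for_node helper are illustrative assumptions, not the authors' system.

```python
# Hypothetical sketch: for each node of a class hierarchy, choose an attribute
# subset size and a classifier by cross-validation on that node's training data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import Pipeline

CANDIDATE_CLASSIFIERS = [GaussianNB(), DecisionTreeClassifier(max_depth=10)]
CANDIDATE_K = [20, 50, 100]          # candidate numbers of attributes to keep

def select_for_node(X, y):
    """Pick the (attribute count, classifier) pair with the best CV accuracy."""
    best_pipe, best_score = None, -np.inf
    for k in CANDIDATE_K:
        for clf in CANDIDATE_CLASSIFIERS:
            pipe = Pipeline([("select", SelectKBest(f_classif, k=min(k, X.shape[1]))),
                             ("clf", clf)])
            score = cross_val_score(pipe, X, y, cv=3).mean()
            if score > best_score:
                best_pipe, best_score = pipe, score
    return best_pipe.fit(X, y)       # refit the winning pipeline on all node data
```

Top-down prediction would then walk the hierarchy from the root, applying each visited node's fitted pipeline to route a protein to the next level.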

Relevance:

100.00%

Publisher:

Abstract:

AMS Subj. Classification: 62P10, 62H30, 68T01

Relevance:

100.00%

Publisher:

Abstract:

A comprehensive, broadly accepted vegetation classification is important for ecosystem management, particularly for planning and monitoring. South Florida vegetation classification systems that are currently in use were largely arrived at subjectively and intuitively with the involvement of experienced botanical observers and ecologists, but with little support in terms of quantitative field data. The need to develop a field data-driven classification of South Florida vegetation that builds on the ecological organization has been recognized by the National Park Service and vegetation practitioners in the region. The present work, funded by the National Park Service Inventory and Monitoring Program - South Florida/Caribbean Network (SFCN), covers the first stage of a larger project whose goal is to apply extant vegetation data to test, and revise as necessary, an existing, widely used classification (Rutchey et al. 2006). The objectives of the first phase of the project were (1) to identify useful existing datasets, (2) to collect these data and compile them into a geodatabase, (3) to conduct an initial classification analysis of marsh sites, and (4) to design a strategy for augmenting existing information from poorly represented landscapes in order to develop a more comprehensive south Florida classification.

Relevance:

100.00%

Publisher:

Abstract:

There is a growing societal need to address the increasing prevalence of behavioral health issues, such as obesity, alcohol or drug use, and general lack of treatment adherence for a variety of health problems. The statistics, worldwide and in the USA, are daunting. Excessive alcohol use is the third leading preventable cause of death in the United States (with 79,000 deaths annually), and is responsible for a wide range of health and social problems. On the positive side, though, these behavioral health issues (and associated possible diseases) can often be prevented with relatively simple lifestyle changes, such as losing weight through diet and/or physical exercise, or learning how to reduce alcohol consumption. Medicine has therefore started to move toward finding ways of preventively promoting wellness, rather than solely treating already established illness. Evidence-based, patient-centered Brief Motivational Interviewing (BMI) interventions have been found particularly effective in helping people find intrinsic motivation to change problem behaviors after short counseling sessions, and to maintain healthy lifestyles over the long term. Lack of locally available personnel well-trained in BMI, however, often limits access to successful interventions for people in need. To fill this accessibility gap, Computer-Based Interventions (CBIs) have started to emerge. Success of CBIs, however, critically relies on ensuring engagement and retention of CBI users so that they remain motivated to use these systems and come back to use them over the long term as necessary. Because of their text-only interfaces, current CBIs can only express limited empathy and rapport, which are among the most important factors in successful health interventions. Fortunately, in the last decade, computer science research has progressed in the design of simulated human characters with anthropomorphic communicative abilities. Virtual characters interact using humans' innate communication modalities, such as facial expressions, body language, speech, and natural language understanding. By advancing research in Artificial Intelligence (AI), we can improve the ability of artificial agents to help us solve CBI problems. To facilitate successful communication and social interaction between artificial agents and human partners, it is essential that aspects of human social behavior, especially empathy and rapport, be considered when designing human-computer interfaces. Hence, the goal of the present dissertation is to provide a computational model of rapport to enhance an artificial agent's social behavior, and to provide an experimental tool for the psychological theories shaping the model. Parts of this thesis were already published in [LYL+12, AYL12, AL13, ALYR13, LAYR13, YALR13, ALY14].

Relevance:

100.00%

Publisher:

Abstract:

Head motion during a Positron Emission Tomography (PET) brain scan can considerably degrade image quality. External motion-tracking devices have proven successful in minimizing this effect, but the associated time, maintenance, and workflow changes inhibit their widespread clinical use. List-mode PET acquisition allows for the retroactive analysis of coincidence events on any time scale throughout a scan, and therefore potentially offers a data-driven motion detection and characterization technique. An algorithm was developed to parse list-mode data, divide the full acquisition into short scan intervals, and calculate the line-of-response (LOR) midpoint average for each interval. These LOR midpoint averages, known as “radioactivity centroids,” were presumed to represent the center of the radioactivity distribution in the scanner, and it was thought that changes in this metric over time would correspond to intra-scan motion.

Several scans of the 3D Hoffman brain phantom were taken on a GE Discovery IQ PET/CT scanner to test the ability of the radioactivity centroid to indicate intra-scan motion. Each scan incrementally surveyed motion in a different degree of freedom (2 translational and 2 rotational). The radioactivity centroids calculated from these scans correlated linearly with phantom positions/orientations. Centroid measurements over 1-second intervals performed on scans with ~1 mCi of activity in the center of the field of view had standard deviations of 0.026 cm in the x- and y-dimensions and 0.020 cm in the z-dimension, demonstrating high precision and repeatability of this metric. Radioactivity centroids are thus shown to successfully represent discrete motions on the submillimeter scale. It is also shown that while the radioactivity centroid can precisely indicate the amount of motion during an acquisition, it fails to distinguish what type of motion occurred.
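As a minimal illustration of the centroid metric described above, the sketch below bins list-mode coincidence events into fixed intervals and averages the LOR midpoints per bin; the array layout and argument names are assumptions, since list-mode formats are vendor-specific.

```python
import numpy as np

def radioactivity_centroids(event_times, p1, p2, interval_s=1.0):
    """Average LOR midpoints ("radioactivity centroids") per time interval.

    event_times : (N,) event timestamps in seconds
    p1, p2      : (N, 3) endpoints of each event's line of response (cm)
    Returns an (M, 3) array with one centroid per interval (NaN if empty).
    """
    midpoints = 0.5 * (p1 + p2)                       # LOR midpoint of each event
    bins = np.floor(event_times / interval_s).astype(int)
    n_bins = int(bins.max()) + 1
    centroids = np.full((n_bins, 3), np.nan)
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            centroids[b] = midpoints[mask].mean(axis=0)
    return centroids

# A drift in the centroid trace over successive intervals would flag intra-scan motion.
```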

Relevance:

100.00%

Publisher:

Abstract:

The creation of Causal Loop Diagrams (CLDs) is a major phase in the System Dynamics (SD) life-cycle, since the created CLDs express dependencies and feedback in the system under study, as well as guide modellers in building meaningful simulation models. The creation of CLDs is still subject to the modeller's domain expertise (mental model) and her ability to abstract the system, because of the strong dependency on semantic knowledge. Since the beginning of SD, system data sources (written and numerical models) have been sparse, limited, and imperfect, and thus of little benefit to the whole modelling process. However, in recent years we have seen an explosion in generated data, especially in all business-related domains that are analysed via Business Dynamics (BD). In this paper, we introduce a systematic, tool-supported CLD creation approach, which analyses and utilises available disparate data sources within the business domain. We demonstrate the application of our methodology on a given business use-case and evaluate the resulting CLD. Finally, we propose directions for future research to further automate CLD creation and increase confidence in the generated CLDs.
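The paper does not detail its analysis algorithms; purely as an illustration of mining numerical business data for candidate causal links that a modeller might review, the sketch below scores lagged pairwise correlations and keeps the strongest ones. The lag range and threshold are arbitrary assumptions, and correlation alone cannot establish causality or loop polarity.

```python
import numpy as np

def candidate_links(data, names, max_lag=4, threshold=0.7):
    """Return (cause, effect, lag, corr) tuples worth showing to a modeller.

    data  : (T, V) array of V business time series over T periods
    names : list of V variable names
    """
    links = []
    T, V = data.shape
    for i in range(V):
        for j in range(V):
            if i == j:
                continue
            for lag in range(1, max_lag + 1):
                x, y = data[:-lag, i], data[lag:, j]   # series i leads series j by `lag`
                r = np.corrcoef(x, y)[0, 1]
                if abs(r) >= threshold:
                    links.append((names[i], names[j], lag, round(float(r), 2)))
    return links
```

Such candidate edges would only seed the modeller's discussion of dependencies and feedback, not replace it.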

Relevance:

100.00%

Publisher:

Abstract:

Microturbines are among the most successfully commercialized distributed energy resources, especially when they are used for combined heat and power generation. However, the interrelated thermal and electrical system dynamic behaviors have not been fully investigated. This is technically challenging due to the complex thermo-fluid-mechanical energy conversion processes, which introduce multiple time-scale dynamics and strong nonlinearity into the analysis. To tackle this problem, this paper proposes a simplified model which can predict the coupled thermal and electric output dynamics of microturbines. Considering the time-scale difference of the various dynamic processes occurring within microturbines, the electromechanical subsystem is treated as a fast quasi-linear process, while the thermo-mechanical subsystem is treated as a slow process with high nonlinearity. A three-stage subspace identification method is utilized to capture the dominant dynamics and predict the electric power output. For the thermo-mechanical process, a radial basis function model trained by the particle swarm optimization method is employed to handle the strong nonlinear characteristics. Experimental tests on a Capstone C30 microturbine show that the proposed modeling method captures the system dynamics well and produces good predictions of the coupled thermal and electric outputs in various operating modes.
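The abstract names a radial basis function model trained by particle swarm optimization for the thermo-mechanical path; as a simplified stand-in, the sketch below fits a Gaussian RBF model with k-means centres and least-squares output weights (the PSO training and the three-stage subspace identification are omitted). All names and parameters are illustrative assumptions, not the paper's design.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

class RBFModel:
    """Gaussian RBF regressor: k-means centres plus linear least-squares weights."""
    def __init__(self, n_centers=20, width=1.0):
        self.n_centers, self.width = n_centers, width

    def _design(self, X):
        # Squared distances from every sample to every centre -> Gaussian activations.
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.width ** 2))

    def fit(self, X, y):
        self.centers, _ = kmeans2(X, self.n_centers, minit="++", seed=0)
        Phi = self._design(X)
        self.w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return self

    def predict(self, X):
        return self._design(X) @ self.w
```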

Relevance:

100.00%

Publisher:

Abstract:

Over the past few decades, work on infrared sensor applications has progressed considerably worldwide. A difficulty remains, however: objects are often not clear enough, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing, and non-destructive testing, among other technologies. This thesis addresses infrared image enhancement techniques from two angles: the processing of a single infrared image in the hybrid space-frequency domain, and the fusion of infrared and visible images using the non-subsampled contourlet transform (NSCT). Image fusion can be regarded as a continuation of single infrared image enhancement, in that it combines infrared and visible images into one image that represents and enhances all the useful information and features of the source images, since a single image cannot contain all the relevant or available information because of the limitations of any single imaging sensor. We survey the development of infrared image enhancement techniques, then focus on single infrared image enhancement and propose a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method, which yields higher image quality and improves human visual perception. The infrared and visible image fusion techniques build on an accurate registration of the source images acquired by different sensors. The SURF-RANSAC algorithm is applied for registration throughout this research, which yields very accurately registered images and greater benefits for the fusion processing. For infrared and visible image fusion, a series of advanced and efficient approaches are proposed. A standard multi-channel NSCT-based fusion method is presented as a baseline for the subsequent proposed fusion approaches. A joint fusion approach involving the Adaptive-Gaussian NSCT and the Wavelet Transform (WT) is proposed, which produces fusion results better than those obtained with general non-adaptive methods. An NSCT-based fusion approach employing compressed sensing (CS) and total variation (TV) on sparsely sampled coefficients, with accurate reconstruction of the fused coefficients, is proposed; it achieves much better fusion results through pre-enhancement of the infrared image and by reducing redundant information in the fused coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients within the fusion process, which leads to better results more quickly and efficiently.
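NSCT implementations are not widely standardized; as a much simpler analogue of the transform-domain fusion idea discussed in the thesis, the sketch below fuses two registered images with a plain wavelet decomposition (PyWavelets), averaging approximation coefficients and keeping the larger-magnitude detail coefficient at each position. It is only a generic illustration, not the Adaptive-Gaussian NSCT or CS/TV methods proposed in the thesis.

```python
import numpy as np
import pywt

def fuse_wavelet(ir, vis, wavelet="db2", level=3):
    """Fuse two registered, equally sized grayscale images (infrared and visible)."""
    c_ir = pywt.wavedec2(ir.astype(float), wavelet, level=level)
    c_vis = pywt.wavedec2(vis.astype(float), wavelet, level=level)
    fused = [0.5 * (c_ir[0] + c_vis[0])]                      # average approximation band
    for d_ir, d_vis in zip(c_ir[1:], c_vis[1:]):
        # Max-absolute rule on each (horizontal, vertical, diagonal) detail band.
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(d_ir, d_vis)))
    return pywt.waverec2(fused, wavelet)
```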

Relevance:

100.00%

Publisher:

Abstract:

Background: The impact of cancer upon children, teenagers and young people can be profound. Research has been undertaken to explore the impacts upon children, teenagers and young people with cancer, but little is known about how researchers can ‘best’ engage with this group to explore their experiences. This review paper provides an overview of the utility of data collection methods employed when undertaking research with children, teenagers and young people. A systematic review of relevant databases was undertaken utilising the search terms ‘young people’, ‘young adult’, ‘adolescent’ and ‘data collection methods’. The full text of the papers that were deemed eligible from the title and abstract were accessed and, following discussion within the research team, thirty papers were included. Findings: Due to the heterogeneity in the scope of the papers identified, the following data collection methods were included in the results. Three of the papers identified provided an overview of data collection methods utilised with this population, and the remaining twenty-seven papers covered the following data collection methods: digital technologies; art-based research; comparing ‘paper and pencil’ research with web-based technologies; the use of games; the use of a specific communication tool; questionnaires and interviews; focus groups; and telephone interviews/questionnaires. The strengths and limitations of the range of data collection methods included are discussed, drawing upon such issues as the appropriateness of particular methods for particular age groups, or the most appropriate method to employ when exploring a particularly sensitive topic area. Conclusions: A number of data collection methods have been utilised to undertake research with children, teenagers and young adults. This review provides a summary of the currently available evidence and an overview of the strengths and limitations of the data collection methods employed.

Relevance:

100.00%

Publisher:

Abstract:

Power system engineers face a double challenge: to operate electric power systems within narrow stability and security margins, and to maintain high reliability. There is an acute need to better understand the dynamic nature of power systems in order to be prepared for critical situations as they arise. Innovative measurement tools, such as phasor measurement units, can capture not only the slow variation of the voltages and currents but also the underlying oscillations in a power system. Such dynamic data accessibility provides a strong motivation and a useful tool to explore dynamic data-driven applications in power systems. To fulfill this goal, this dissertation focuses on the following three areas: developing accurate dynamic load models and updating variable parameters based on the measurement data, applying advanced nonlinear filtering concepts and technologies to real-time identification of power system models, and addressing computational issues by implementing the balanced truncation method. By obtaining more realistic system models, together with timely updated parameters and consideration of stochastic influences, we can have an accurate portrait of the ongoing phenomena in an electrical power system. Hence we can further improve state estimation, stability analysis and real-time operation.
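Balanced truncation is a standard model-order reduction; the sketch below is a minimal square-root implementation for a stable, controllable and observable continuous-time model (A, B, C) using SciPy's Lyapunov solver. It is offered as a generic illustration of the method named above, not the dissertation's implementation.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Reduce a stable LTI system dx/dt = A x + B u, y = C x to order r."""
    Wc = solve_continuous_lyapunov(A, -B @ B.T)     # controllability Gramian
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)   # observability Gramian
    Lc = cholesky(Wc, lower=True)                   # Wc = Lc Lc^T
    Lo = cholesky(Wo, lower=True)                   # Wo = Lo Lo^T
    U, s, Vt = svd(Lo.T @ Lc)                       # s holds the Hankel singular values
    S = np.diag(s[:r] ** -0.5)
    T = Lc @ Vt[:r].T @ S                           # right projection (n x r)
    Tinv = S @ U[:, :r].T @ Lo.T                    # left projection (r x n)
    return Tinv @ A @ T, Tinv @ B, C @ T            # reduced (Ar, Br, Cr)
```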

Relevance:

100.00%

Publisher:

Abstract:

Sharpening is a powerful image transformation because sharp edges can bring out image details. Sharpness is achieved by increasing local contrast and reducing edge widths. We present a method that enhances the sharpness of images and thereby their perceptual quality. Most existing enhancement techniques require user input to improve the perception of the scene in a manner most pleasing to the particular user. Our goal of image enhancement is to improve the perception of sharpness in digital images for human viewers. We consider two parameters in order to exaggerate the differences between local intensities; the two parameters exploit local contrast and the widths of edges. We start from the assumption that color, texture, or objects of focus such as faces affect the human perception of photographs. When human raters are presented with a collection of images of differing sharpness and asked to rank them according to perceived sharpness, the results show a statistical consensus among the raters. We introduce a ramp enhancement technique that modifies the optimal overshoot in the ramp for different region contrasts as well as the new ramp width. Optimal parameter values are searched for and applied to regions according to the criteria mentioned above. In this way, we aim to enhance digital images automatically to create pleasing output for common users.
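The ramp enhancement technique is not specified in enough detail to reproduce; as a loosely related illustration of adding controlled overshoot at edges to raise local contrast, the sketch below applies classic unsharp masking. The parameters are arbitrary, and this is not the authors' method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=0.8):
    """Boost local contrast by adding back a scaled high-pass residual.

    image  : 2-D grayscale array with values in [0, 1]
    sigma  : blur scale that defines what counts as local detail
    amount : strength of the overshoot introduced near edges
    """
    blurred = gaussian_filter(image.astype(float), sigma=sigma)
    sharpened = image + amount * (image - blurred)   # overshoot on either side of edges
    return np.clip(sharpened, 0.0, 1.0)
```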

Relevance:

100.00%

Publisher:

Abstract:

The goal of this study is to provide a framework for future researchers to understand and use the FARSITE wildfire-forecasting model with data assimilation. Current wildfire models lack the ability to provide accurate predictions of fire front position faster than real time. When FARSITE is coupled with a recursive ensemble filter, the data assimilation forecast method improves. The scope includes an explanation of the standalone FARSITE application, technical details on FARSITE integration with a parallel program coupler called OpenPALM, and a demonstration of the FARSITE-Ensemble Kalman Filter software using the FireFlux I experiment by Craig Clements. The results show that the fire front forecast is better with the proposed data-driven methodology than with the standalone FARSITE model.
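As a self-contained illustration of the recursive ensemble filtering idea mentioned above, the sketch below implements a generic stochastic Ensemble Kalman Filter analysis step that pulls an ensemble of forecast states (for example, discretized fire-front marker positions) toward observations. The variable names and the linear observation operator are assumptions, not details of the FARSITE-OpenPALM coupling.

```python
import numpy as np

def enkf_analysis(X_f, y_obs, H, R, rng=np.random.default_rng(0)):
    """Stochastic EnKF analysis step.

    X_f   : (n, N) forecast ensemble, N members of an n-dimensional state
    y_obs : (m,)   observation vector
    H     : (m, n) linear observation operator
    R     : (m, m) observation error covariance
    """
    n, N = X_f.shape
    A = X_f - X_f.mean(axis=1, keepdims=True)          # state anomalies
    HA = H @ A                                          # observation-space anomalies
    P_yy = HA @ HA.T / (N - 1) + R                      # innovation covariance
    P_xy = A @ HA.T / (N - 1)                           # cross covariance
    K = P_xy @ np.linalg.inv(P_yy)                      # Kalman gain
    # Perturb the observations so the analysis ensemble keeps a consistent spread.
    Y = y_obs[:, None] + rng.multivariate_normal(np.zeros(len(y_obs)), R, size=N).T
    return X_f + K @ (Y - H @ X_f)                      # analysis ensemble
```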