162 results for Task-Oriented Methodology


Relevance:

20.00%

Publisher:

Abstract:

From a theoretical perspective, an extension to the Full Range Leadership Theory (FRLT) seems needed. In this paper, we explain why instrumental leadership, a class of leadership that includes leader behaviors focusing on task and strategic aspects that are neither values- nor exchange-oriented, can fulfill this extension. Instrumental leadership is composed of four factors: environmental monitoring, strategy formulation and implementation, path-goal facilitation, and outcome monitoring; these aspects of leadership are currently not included in any of the FRLT's nine leadership scales (as measured by the MLQ, the Multifactor Leadership Questionnaire). We present results from two empirical studies using very large samples from a wide array of countries (N > 3,000) to examine the factorial, discriminant, and criterion-related validity of the instrumental leadership scales. We find support for a four-factor instrumental leadership model, which explains incremental variance in leader outcomes over and above transactional and transformational leadership.
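The incremental-validity analysis described above, instrumental scales explaining variance over and above transactional and transformational leadership, can be sketched as a hierarchical regression comparing R-squared values. This is a minimal illustration with synthetic data; all variable names, effect sizes, and the seed are invented, not taken from the studies:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit of y on X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 500
transformational = rng.normal(size=n)
transactional = rng.normal(size=n)
instrumental = rng.normal(size=(n, 4))      # four hypothetical factor scores
outcome = (0.4 * transformational + 0.2 * transactional
           + instrumental @ [0.3, 0.2, 0.2, 0.1] + rng.normal(size=n))

# Step 1: baseline model; Step 2: add the four instrumental scales.
r2_base = r_squared(np.column_stack([transformational, transactional]), outcome)
r2_full = r_squared(np.column_stack([transformational, transactional,
                                     instrumental]), outcome)
print(f"Delta R^2 from instrumental scales: {r2_full - r2_base:.3f}")
```

In the actual studies the predictors would be MLQ scale scores and the outcome a leader-effectiveness criterion; the logic of the delta-R-squared comparison is the same.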

Relevance:

20.00%

Publisher:

Abstract:

Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
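The contrast the abstract draws, a single-value regression estimate versus multiple realizations that support probability statements, can be illustrated with a toy example. The sketch below uses Nadaraya-Watson kernel regression (the estimator underlying general regression neural networks) for the point prediction, and bootstrap resampling as a simple stand-in for the geostatistical stochastic simulations the paper discusses; the locations, values, bandwidth, and threshold are all synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical measurement locations and contamination values (arbitrary units).
pts = rng.uniform(0, 10, size=(200, 2))
vals = np.exp(-((pts - 5.0) ** 2).sum(axis=1) / 8.0) + 0.1 * rng.normal(size=200)

def kernel_predict(x0, pts, vals, sigma=1.0):
    """Nadaraya-Watson kernel regression: a single-value spatial estimate."""
    w = np.exp(-((pts - x0) ** 2).sum(axis=1) / (2 * sigma ** 2))
    return (w * vals).sum() / w.sum()

# Multiple "realizations" via bootstrap: resample the data, re-estimate, and
# use the spread of estimates for a probabilistic statement -- here, the
# probability that the value at x0 exceeds a decision threshold.
x0 = np.array([4.0, 6.0])
reals = []
for _ in range(500):
    idx = rng.integers(0, len(vals), size=len(vals))
    reals.append(kernel_predict(x0, pts[idx], vals[idx]))
reals = np.array(reals)
p_exceed = (reals > 0.5).mean()
print(f"estimate = {reals.mean():.3f}, P(value > 0.5) = {p_exceed:.2f}")
```

A regression model alone would stop at the single estimate; the ensemble of realizations is what enables the risk mapping the paper argues for.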

Relevance:

20.00%

Publisher:

Abstract:

The Agenda 21 for the Geneva region is the result of a broad consultation process including all local actors. Article 12 stipulates that "the State facilitates possible synergies between economic activities in order to minimize their environmental impacts", thus opening the way for Industrial Ecology (IE) and Industrial Symbiosis (IS). An Advisory Board for Industrial Ecology and Industrial Symbiosis implementation was established in 2002, involving the relevant government agencies. Regulatory and technical conditions for IS are studied in the Swiss context. The results reveal that the Swiss law on waste does not hinder by-product exchanges. Methodological and technical factors, including geographic, qualitative, quantitative, and economic aspects, are detailed. The competition with waste operators in a highly developed recycling system is also tackled. The IS project develops an empirical and systematic method for detecting and implementing by-product synergies between industrial actors disseminated throughout the Geneva region. A database management tool for the treatment of input-output analysis data and GIS tools for detecting potential industrial partners are constantly improved. Potential symbioses for 17 flows (including energy, water, and material flows) are currently being studied for implementation.
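The core of by-product synergy detection, matching one actor's output flows against another's input flows, can be sketched in a few lines. The companies and flows below are entirely hypothetical; the actual project works from a database of audited input-output data:

```python
# Hypothetical flow inventory: company -> input and output flow sets.
inventory = {
    "FoundryA":    {"in": {"scrap steel", "water"}, "out": {"waste heat", "slag"}},
    "GreenhouseB": {"in": {"waste heat", "CO2"},    "out": {"biomass"}},
    "CementC":     {"in": {"slag", "water"},        "out": {"cement"}},
}

def detect_synergies(inventory):
    """List (supplier, flow, consumer) triples where an output matches an input."""
    synergies = []
    for supplier, f_s in inventory.items():
        for consumer, f_c in inventory.items():
            if supplier == consumer:
                continue
            for flow in f_s["out"] & f_c["in"]:
                synergies.append((supplier, flow, consumer))
    return sorted(synergies)

for supplier, flow, consumer in detect_synergies(inventory):
    print(f"{supplier} --[{flow}]--> {consumer}")
```

In practice each candidate match would then be screened against the geographic, qualitative, quantitative, and economic factors the abstract mentions before being pursued.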

Relevance:

20.00%

Publisher:

Abstract:

We investigated the influences of odor exposure on performance and on breathing measures. The task was composed of tracking, short-term memory, and peripheral reaction parts. During rest or while performing the task, 12 participants were exposed to 4 different odors in 2 intensities. The higher intensity of the malodors induced a short-term decrement in mean inspiration flow (Vi/Ti) after stimulus onset and impaired performance in the short-term memory task, as compared with control trials; no effect was found for the positively judged odors. The study suggests that a distractor as simple as a bad smell may pull a person off task, however briefly, and may result in a detriment to performance. Actual or potential applications of this research involve designing or securing tasks in such a way that a brief withdrawal of attention does not have fatal consequences.
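Mean inspiration flow (Vi/Ti) is inspired volume divided by inspiration time. As a minimal sketch of how it could be computed from a sampled respiratory flow signal; the signal here is a synthetic sine-wave breath, not the study's data:

```python
import numpy as np

# Hypothetical respiratory flow signal (L/s) sampled at 100 Hz; positive flow
# is taken as inspiration. One synthetic 4-second breath cycle.
fs = 100.0
t = np.arange(0, 4.0, 1 / fs)
flow = 0.5 * np.sin(2 * np.pi * 0.25 * t)

insp = flow > 0                       # inspiration samples
Ti = insp.sum() / fs                  # inspiration time (s)
Vi = flow[insp].sum() / fs            # inspired volume (L), rectangle rule
print(f"Vi/Ti = {Vi / Ti:.3f} L/s")
```

A transient drop in this quantity after stimulus onset is the short-term respiratory effect the study reports for the higher-intensity malodors.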

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION: Inhibitory control refers to our ability to suppress ongoing motor, affective or cognitive processes and mostly depends on a fronto-basal brain network. Inhibitory control deficits participate in the emergence of several prominent psychiatric conditions, including attention deficit/hyperactivity disorder and addiction. The rehabilitation of these pathologies might therefore benefit from training-based behavioral interventions aiming at improving inhibitory control proficiency and normalizing the underlying neurophysiological mechanisms. The development of an efficient inhibitory control training regimen first requires determining the effects of practicing inhibition tasks. METHODS: We addressed this question by contrasting behavioral performance and electrical neuroimaging analyses of event-related potentials (ERPs) recorded from humans at the beginning versus the end of 1 h of practice on a stop-signal task (SST) involving the withholding of responses when a stop signal was presented during a speeded auditory discrimination task. RESULTS: Practicing a short SST improved behavioral performance. Electrophysiologically, ERPs differed topographically at 200 msec post-stimulus onset, indicative of the engagement of a distinct brain network with learning. Source estimations localized this effect within the inferior frontal gyrus, the pre-supplementary motor area and the basal ganglia. CONCLUSION: Our collective results indicate that behavioral and brain responses during an inhibitory control task are subject to fast plastic changes and provide evidence that high-order fronto-basal executive networks can be modified by practicing an SST.
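The standard behavioral index derived from an SST is the stop-signal reaction time (SSRT). As one illustration (not necessarily the authors' exact analysis), the widely used integration method estimates SSRT from the go-RT distribution, the stop-signal delay, and the probability of responding on stop trials; all numbers below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: go reaction times (ms), a stop-signal delay (SSD, ms),
# and the proportion of stop trials on which the response was not withheld.
go_rt = rng.normal(450, 80, size=200)
ssd = 200.0
p_respond = 0.5

def ssrt_integration(go_rt, ssd, p_respond):
    """Integration method: SSRT = the p_respond quantile of the go-RT
    distribution minus the stop-signal delay."""
    return np.quantile(go_rt, p_respond) - ssd

print(f"SSRT ~= {ssrt_integration(go_rt, ssd, p_respond):.0f} ms")
```

A shortening of this estimate from the start to the end of practice would be one way to quantify the behavioral improvement the study reports.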

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The annotation of protein post-translational modifications (PTMs) is an important task of UniProtKB curators and, with continuing improvements in experimental methodology, an ever-greater number of articles are being published on this topic. To help curators cope with this growing body of information we have developed a system which extracts information from the scientific literature for the most frequently annotated PTMs in UniProtKB. RESULTS: The procedure uses a pattern-matching and rule-based approach to extract sentences with information on the type and site of modification. A ranked list of protein candidates for the modification is also provided. For PTM extraction, precision varies from 57% to 94%, and recall from 75% to 95%, according to the type of modification. The procedure was used to track new publications on PTMs and to recover potential supporting evidence for phosphorylation sites annotated on the basis of large-scale proteomics experiments. CONCLUSIONS: The information retrieval and extraction method we have developed in this study forms the basis of a simple tool for the manual curation of protein post-translational modifications in UniProtKB/Swiss-Prot. Our work demonstrates that even simple text-mining tools can be effectively adapted for database curation tasks, provided that a thorough understanding of the working process and requirements is first obtained. This system can be accessed at http://eagl.unige.ch/PTM/.
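A pattern-matching extraction step of the kind described can be sketched with a regular expression that captures a modification type and a residue-position site from a sentence. The pattern and example sentences below are illustrative only, and far simpler than the hand-crafted rules of the actual system:

```python
import re

# Illustrative pattern: a modification keyword followed (possibly after some
# words) by "at/of/on" and a residue-number site such as "Ser-473".
PATTERN = re.compile(
    r"(?P<type>phosphorylat\w+|acetylat\w+|methylat\w+)\s+"
    r".*?(?:at|of|on)\s+(?P<site>(?:Ser|Thr|Tyr|Lys)-?\d+)",
    re.IGNORECASE,
)

sentences = [
    "AKT1 is phosphorylated at Ser-473 upon growth factor stimulation.",
    "Acetylation of Lys-120 modulates p53 DNA binding.",
    "The protein localizes to the nucleus.",     # no PTM mention
]

for s in sentences:
    m = PATTERN.search(s)
    if m:
        print(f"type={m.group('type').lower()}, site={m.group('site')}")
```

Measuring how often such patterns fire correctly (precision) versus how many true mentions they find (recall) is exactly the per-modification evaluation the abstract reports.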

Relevance:

20.00%

Publisher:

Abstract:

The present prospective study, with a five-year follow-up, presents an extensive psychiatric and educational assessment of an adolescent population (N = 30) in the age range 14-20, suffering from several psychiatric disorders though able to follow a normal academic program. The residential settings where the study took place provide both psychiatric and schooling facilities. In this environment, what is the effectiveness of long-term hospitalization? Are there any criteria for predicting results? After discharge, can social adjustment difficulties be prevented? The assessment instruments are described and the results of one preliminary study are presented. The current data seem to confirm the impact of special treatment facilities combining schooling and psychiatric settings on the long-term outcome of adolescents.

Relevance:

20.00%

Publisher:

Abstract:

Anticoagulants are a mainstay of cardiovascular therapy, and parenteral anticoagulants have widespread use in cardiology, especially in acute situations. Parenteral anticoagulants include unfractionated heparin, low-molecular-weight heparins, the synthetic pentasaccharides fondaparinux, idraparinux and idrabiotaparinux, and parenteral direct thrombin inhibitors. The several shortcomings of unfractionated heparin and of low-molecular-weight heparins have prompted the development of the other newer agents. Here we review the mechanisms of action, pharmacological properties and side effects of parenteral anticoagulants used in the management of coronary heart disease treated with or without percutaneous coronary interventions, cardioversion for atrial fibrillation, and prosthetic heart valves and valve repair. Using an evidence-based approach, we describe the results of completed clinical trials, highlight ongoing research with currently available agents, and recommend therapeutic options for specific heart diseases.

Relevance:

20.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

The thesis is devoted to the analysis, modeling, and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it is mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust, and efficient modeling tools. They can find solutions to classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to be implemented as predictive engines in decision support systems, for purposes of environmental data mining including pattern recognition, modeling, and prediction, as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.

The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation.

Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it helps to detect the presence of spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties.

An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.

The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland, and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil, and water pollution by radionuclides and heavy metals; the classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with care taken to make the software user friendly and easy to use.
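The k-NN approach to exploratory spatial data analysis mentioned in the abstract can be illustrated with a leave-one-out cross-validation curve over k, a simple diagnostic of spatial structure. The coordinates, values, and choice of k values below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical spatial data set: 2-D coordinates and a measured variable.
pts = rng.uniform(0, 1, size=(300, 2))
vals = np.sin(3 * pts[:, 0]) + np.cos(3 * pts[:, 1]) + 0.1 * rng.normal(size=300)

def knn_predict(x0, pts, vals, k=5):
    """k-nearest-neighbours estimate: mean of the k closest measurements."""
    d = np.linalg.norm(pts - x0, axis=1)
    return vals[np.argsort(d)[:k]].mean()

# Leave-one-out cross-validation error as a function of k: the shape of this
# curve (and the best k) is a quick indicator of spatial continuity vs noise.
for k in (1, 3, 5, 10):
    errs = [(vals[i] - knn_predict(pts[i], np.delete(pts, i, axis=0),
                                   np.delete(vals, i), k)) ** 2
            for i in range(len(vals))]
    print(f"k={k:2d}  LOO MSE = {np.mean(errs):.4f}")
```

A GRNN replaces the hard k-neighbour cutoff with a Gaussian kernel weighting of all points, with the kernel bandwidth tuned by the same cross-validation idea.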

Relevance:

20.00%

Publisher:

Abstract:

This chapter describes the profile of the HIA, provides insight into the process and gives an example of how political decisions may be made on behalf of a concerned population through an HIA approach. [Introduction p. 284]