851 results for "competitive intelligence"


Relevance: 20.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies. The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional geo-feature spaces composed of geographical coordinates and additional relevant spatially referenced features ("geo-features"). They are well suited to serve as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modeling and prediction, as well as automatic data mapping. Their efficiency is comparable to that of geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multi-layer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is the first and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to detect spatial patterns describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a currently hot topic, the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation, with a user-friendly interface, were developed as well.
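As a rough illustration of the GRNN proposed here for automatic mapping: the model is essentially Nadaraya-Watson kernel regression, where each prediction is a kernel-weighted average of the observations governed by a single bandwidth parameter. The NumPy sketch below uses synthetic coordinates and is not the thesis's Machine Learning Office implementation; in an automatic-mapping setting the bandwidth `sigma` would typically be tuned by cross-validation, which is what makes the method suitable for unattended use.

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma=1.0):
    """General Regression Neural Network (Nadaraya-Watson kernel regression).

    train_xy : (n, d) array of training coordinates (and optional geo-features)
    train_z  : (n,) array of observed values
    query_xy : (m, d) array of prediction locations
    sigma    : Gaussian kernel bandwidth (smoothing parameter)
    """
    # Squared Euclidean distances between every query point and every training point
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel weights
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # Kernel-weighted average of the training values (small constant avoids division by zero)
    return (w @ train_z) / (w.sum(axis=1) + 1e-12)

# Toy usage: interpolate a smooth field from 200 scattered observations onto a grid
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(200, 2))
z = np.sin(xy[:, 0]) + 0.5 * np.cos(xy[:, 1]) + rng.normal(0, 0.05, 200)
grid = np.column_stack([g.ravel() for g in np.meshgrid(np.linspace(0, 10, 50),
                                                       np.linspace(0, 10, 50))])
z_hat = grnn_predict(xy, z, grid, sigma=0.8)
print(z_hat.shape)  # (2500,)
```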

Relevance: 20.00%

Abstract:

Medicine counterfeiting is a crime that has increased in recent years and now involves the whole world. Health and economic repercussions have led pharmaceutical industries and agencies to develop many measures to protect genuine medicines and to differentiate them from counterfeits. Detecting counterfeits is chemically relatively simple for specialists, but much more information can be gained from the analyses in a forensic intelligence perspective. Analytical data can feed criminal investigation and law enforcement by detecting and helping to understand the criminal phenomenon. Profiling seizures using chemical and packaging data is a powerful way to detect organised production and industrialised forms of criminality, and is the focus of this paper. Thirty-three seizures of a commonly counterfeited type of capsule were studied. The results of the packaging and chemical analyses were gathered in an organised database. Strong links were found between the seizures at the different production steps, indicating the presence of a main counterfeiting network dominating the market. The interpretation of these links together with circumstantial data provided information about the production and distribution of counterfeits coming from this network. This forensic intelligence perspective can be generalised to other types of products. It may be the only reliable approach to understanding the organised crime phenomenon behind counterfeiting and to enabling efficient strategic and operational decision making in an attempt to dismantle counterfeit networks.

Relevance: 20.00%

Abstract:

In the last 15 years, a new psychological construct has emerged in the field of psychology: Emotional Intelligence. Some models of Emotional Intelligence bear resemblance to aspects of one of the core constructs of Adlerian Psychology: Social Interest. The authors investigated whether the two constructs are also empirically related and assessed their capacity to predict psychiatric symptoms and antisocial behavior. Results indicate that Social Interest and Emotional Intelligence are empirically distinct constructs; Social Interest was negatively correlated with aspects of antisocial attitudes (but not with antisocial behavior). Social Interest also failed to predict symptoms of psychological distress. Emotional Intelligence, in contrast, was a better predictor of mental health problems than Social Interest. The results are discussed with regard to the validity of Social Interest measurement.

Relevance: 20.00%

Abstract:

Over the last 20 years, Business Intelligence has gone from being a whim of a few CIOs who could afford to set aside budget for it, to being an established reality in many large companies, or an urgent need for those that have not yet implemented such a system. The first part of this document, "A Study of Business Intelligence", introduces the concept from the ground up, explaining the key theoretical notions needed to understand this kind of solution; it then covers the technological components, from the information extraction and integration processes to how information should be structured to facilitate analysis. Finally, it reviews the different types of applications available on the market as well as the most current trends in this field. The second part of the document focuses on the implementation of a dashboard for analysing a company's sales: the different phases of the project are identified and the identified requirements are described in detail. Lastly, it presents the dashboard developed with Xcelsius technology, which allows the result to be exported to Flash and viewed in any web browser.
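As a rough illustration of the kind of dimensional structuring for analysis that the first part of the document describes, the sketch below aggregates a hypothetical sales fact table by month and region with pandas; the column names and figures are invented for the example and are not taken from the project itself.

```python
import pandas as pd

# Hypothetical sales fact table, as it might come out of the extraction/integration step
sales = pd.DataFrame({
    "date":    pd.to_datetime(["2024-01-05", "2024-01-17", "2024-02-03", "2024-02-20"]),
    "product": ["A", "B", "A", "B"],
    "region":  ["North", "North", "South", "South"],
    "units":   [120, 80, 95, 60],
    "revenue": [2400.0, 2000.0, 1900.0, 1500.0],
})

# Dimensional aggregation: monthly units and revenue by region,
# the typical input of a sales dashboard
monthly_by_region = (
    sales
    .assign(month=sales["date"].dt.to_period("M"))
    .groupby(["month", "region"], as_index=False)[["units", "revenue"]]
    .sum()
)
print(monthly_by_region)
```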

Relevance: 20.00%

Abstract:

It is unclear how physical attributes influence tennis-specific performance in teenage players. The aims of this study were (a) to examine the relationships between speed, explosive power, leg stiffness, and muscular strength of the upper and lower limbs; and (b) to determine to what extent these physical qualities relate to tournament play performance in a group of competitive teenage tennis players. A total of 12 male players aged 13.6 ± 1.4 years performed a series of physical tests: 5-m, 10-m, and 20-m sprints; squat jump (SJ); countermovement jump (CMJ); drop jump (DJ); multi-rebound jumps; and maximum voluntary contraction of isometric grip strength and plantar flexion on the dominant and nondominant sides. Speed (r = 0.69, 0.63, and 0.74 for the 5-, 10-, and 20-m sprints, respectively), vertical power abilities (r = -0.71, -0.80, and -0.66 for SJ, CMJ, and DJ, respectively), and maximal strength on the dominant side (r = -0.67 and -0.73 for handgrip and plantar flexion, respectively) were significantly correlated with tennis performance. However, strength on the nondominant side (r = -0.29 and -0.42 for handgrip and plantar flexion) and leg stiffness (r = -0.15) were not correlated with the performance ranking of the players. It seems that physical attributes have a strong influence on tennis performance in this age group and that an important asymmetry is already present. By regularly monitoring such physical abilities during puberty, the conditioning coach can adjust the program to compensate for imbalances. This would in turn minimize the risk of injury during this critical period.
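For illustration only, the snippet below shows how a correlation between one physical test (20-m sprint time) and tournament ranking could be computed with SciPy; the values are hypothetical and the study's exact statistical procedure is not reproduced here.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data for 12 players: 20-m sprint time (s) and tournament ranking (1 = best)
sprint_20m = np.array([3.55, 3.42, 3.61, 3.70, 3.48, 3.66,
                       3.58, 3.75, 3.50, 3.63, 3.45, 3.69])
ranking = np.array([5, 2, 8, 11, 3, 9, 6, 12, 4, 7, 1, 10])

# Pearson correlation between sprint time and ranking
r, p = pearsonr(sprint_20m, ranking)
print(f"r = {r:.2f}, p = {p:.3f}")  # positive r: slower players tend to rank worse
```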

Relevance: 20.00%

Abstract:

It is now well accepted that cellular responses to materials in a biological medium largely reflect the adsorbed biomolecular layer rather than the material itself. Here, we study by molecular dynamics simulations the competitive adsorption of proteins on a surface (the Vroman effect), i.e. the non-monotonic behavior of the amount of protein adsorbed on a surface in contact with plasma as a function of contact time and plasma concentration. We find a complex behavior, with regimes during which small and large proteins are not necessarily competing with each other but are both competing with other proteins in solution ("cooperative" adsorption). We show how the Vroman effect can be understood, controlled and inverted.

Relevance: 20.00%

Abstract:

Memory is essential to adjust behaviour according to past experience. In societies where animals interact on numerous occasions, memory of previous social interactions may help optimise investment in competition. How long information about conspecifics' resource-holding potential and motivation to compete is retained depends on how fast the value of this information fades, but also on the costs and benefits of retaining it. Information retention has never been investigated in the context of interactions prevailing within the family, and more specifically sibling competition. In the absence of parents, barn owl (Tyto alba) nestlings vocally compete for priority of access to the next indivisible food item brought by a parent. The finding that owlets eavesdrop on vocal interactions between siblings to adjust their investment in vocalization once competing with them suggests that they memorize siblings' vocal interactions. Playback experiments showed that owlets take into account siblings' past vocal performance, which signals hunger, for at least 15 min, but only if the performance was witnessed during a sufficiently long period of time (30 min). Moreover, using natural vocal exchanges in another set of individuals, we showed that sibling signalling was no longer taken into account after a few minutes. This suggests that young barn owls need to continuously display their motivation to trigger siblings' withdrawal from the current competition. Repeating a vocal display may ensure its honesty. Studying the extent to which individuals retain past information is important to understand how they adjust their competitive investment over resources.

Relevance: 20.00%

Abstract:

A firm that wishes to launch a new product onto the market faces the difficult task of deciding what the best moment for the launch is. Timing may also be critical when a firm plans to adopt new processes or intends to head for new markets. The critical question the firm needs to tackle is whether it will try to reach the so-called first-mover advantage by acting earlier than its rivals. The first-mover position may reward the entrant with various opportunities to gain competitive advantage over later movers. However, there are also great risks involved in early market entry, and sometimes the very first entrant fails even before the followers enter the market. The follower, on the other hand, may be able to free-ride on the earlier entrants' investments and benefit as the uncertainties that characterize new markets subside. According to the current understanding, the occurrence of entry order advantages depends not only on the mechanisms and attributes in the firm's environment that provide the initial opportunities, but also on the firm's ability to capitalize on these advantage opportunities. This study contributes to this discussion by analyzing the linkages between the asset base of the firm, the characteristics of the operating environment, and the firm's entry timing orientation. To shed light on the relationship between entry timing strategy and competitive advantage, this study utilizes the concept of entry timing orientation. The rationale for choosing this type of approach arises from the inability of previously employed research tools to reach the underlying factors that result in entry timing advantage. The work consists of an introductory theoretical discussion on entry timing advantages and of four research publications. The empirical findings support the understanding that entry timing advantage is related to the characteristics of the firm's operating environment but may also be related to firm-specific factors. This in turn suggests that some of the traditional ways of detecting and measuring first-mover advantage - which to some extent ignore these dimensions - may be outdated.

Relevance: 20.00%

Abstract:

False identity documents represent a serious threat through their production and use in organized crime and by terrorist organizations. The present-day fight against this criminal problem and the associated threats to national security does not appropriately address the organized nature of this criminal activity: each fraudulent document is treated on its own during the investigation and the judicial process, which causes linkage blindness and restricts analysis capacity. Given the drawbacks of this case-by-case approach, this article proposes an original model in which false identity documents are used to inform a systematic forensic intelligence process. The process aims to detect links, patterns, and tendencies among false identity documents in order to support strategic and tactical decision making, thus sustaining a proactive intelligence-led approach to fighting identity document fraud and the associated organized criminality. This article formalizes both the model and the process, using practical applications to illustrate its capabilities. The model has general applicability and can be transposed to other fields of forensic science facing similar difficulties.

Relevance: 20.00%

Abstract:

The objective of this work was to develop, validate, and compare 190 artificial intelligence-based models for predicting the body mass of chicks from 2 to 21 days of age subjected to thermal challenges of different durations and intensities. The experiment was conducted inside four climate-controlled wind tunnels using 210 chicks. A database containing 840 data records (from 2- to 21-day-old chicks) - with the variables dry-bulb air temperature, duration of thermal stress (days), chick age (days), and daily body mass - was used for the training, validation, and testing of models based on artificial neural networks (ANNs) and neuro-fuzzy networks (NFNs). The ANNs were the most accurate in predicting the body mass of chicks from 2 to 21 days of age from these input variables, with an R² of 0.9993 and a standard error of 4.62 g. The ANNs enable the simulation of different scenarios, which can assist managerial decision-making, and they can be embedded in heating control systems.
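A minimal sketch of the kind of ANN regression described, assuming synthetic stand-in data: a small scikit-learn MLP maps dry-bulb temperature, thermal-stress duration and chick age to body mass. This is not the authors' model or software; it only illustrates the input/output structure of the problem.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the 840-record database:
# columns are dry-bulb temperature (°C), thermal-stress duration (days), chick age (days)
rng = np.random.default_rng(42)
X = np.column_stack([
    rng.uniform(20, 36, 840),   # dry-bulb air temperature
    rng.integers(1, 4, 840),    # duration of thermal stress
    rng.integers(2, 22, 840),   # chick age
])
# Synthetic body mass (g): grows with age, penalised by heat stress
y = 40 + 35 * X[:, 2] - 1.5 * np.clip(X[:, 0] - 30, 0, None) * X[:, 1] + rng.normal(0, 5, 840)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small MLP regressor with scaled inputs and one hidden layer
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print(f"R² on held-out data: {model.score(X_test, y_test):.4f}")
```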

Relevance: 20.00%

Abstract:

This paper proposes the use of an autonomous assistant mobile robot to monitor the environmental conditions of a large indoor area and to develop an ambient intelligence application. The mobile robot uses single high-performance embedded sensors to collect and geo-reference environmental information such as ambient temperature, air velocity and orientation, and gas concentration. The data collected by the assistant mobile robot are analyzed in order to detect unusual measurements or discrepancies and to derive focused corrective ambient actions. The paper shows an example of the measurements performed in a research facility, which enabled the detection and localization of an uncomfortable temperature profile inside one of the facility's offices. The ambient intelligence application was developed by performing localized ambient measurements, which were analyzed in order to propose ambient actuations to correct the uncomfortable temperature profile.
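As an illustration of the analysis described, the sketch below geo-references hypothetical temperature readings taken along the robot's path and flags locations whose temperature deviates strongly from the area mean; the sensor fields, positions and threshold are invented for the example and do not reproduce the paper's implementation.

```python
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class Reading:
    x: float            # robot position (m)
    y: float
    temperature: float  # °C
    gas_ppm: float      # example of an additional geo-referenced variable

# Hypothetical geo-referenced readings gathered along the robot's path
readings = [
    Reading(1.0, 2.0, 21.5, 410),
    Reading(3.5, 2.0, 21.8, 415),
    Reading(6.0, 2.1, 27.9, 420),   # unusually warm spot
    Reading(8.5, 2.2, 22.0, 412),
]

def flag_discrepancies(readings, z_threshold=1.5):
    """Flag readings whose temperature deviates strongly from the area mean."""
    temps = [r.temperature for r in readings]
    mu, sigma = mean(temps), pstdev(temps) or 1e-9
    return [r for r in readings if abs(r.temperature - mu) / sigma > z_threshold]

# Report locations where a corrective ambient action might be proposed
for r in flag_discrepancies(readings):
    print(f"Check location ({r.x:.1f}, {r.y:.1f}): {r.temperature} °C")
```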