836 results for Framework Model
Abstract:
Sex determination is often seen as a dichotomous process: individual sex is assumed to be determined either by genetic factors (genotypic sex determination, GSD) or by environmental factors (environmental sex determination, ESD), most often temperature (temperature-dependent sex determination, TSD). We endorse an alternative view, which sees GSD and TSD as the ends of a continuum. The two effects interact a priori, because temperature can affect gene expression at any step along the sex-determination cascade. We propose to define sex-determination systems at the population (rather than individual) level, via the proportion of variance in phenotypic sex stemming from genetic versus environmental factors, and we formalize this concept in a quantitative-genetics framework. Sex is seen as a threshold trait underlain by a liability factor, and reaction norms allow modeling interactions between genotypic and temperature effects (seen as the necessary consequences of thermodynamic constraints on the underlying physiological processes). As this formalization shows, temperature changes (due, e.g., to climate change or range expansions) are expected to provoke turnovers in sex-determination mechanisms by inducing large-scale sex reversal and thereby sex-ratio selection for alternative sex-determining genes. The frequency of turnovers and the prevalence of homomorphic sex chromosomes in cold-blooded vertebrates might thus relate directly to the temperature dependence of sex-determination mechanisms.
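The threshold-trait formalization can be illustrated with a minimal simulation (a hypothetical sketch: the liability slope, threshold, and noise values are invented for illustration and are not taken from the paper):

```python
import random

random.seed(1)

def pheno_sex(genotype_effect, temperature, slope=0.15, threshold=0.0):
    """Sex as a threshold trait: liability = genotypic effect
    + temperature reaction norm + residual noise; male above threshold."""
    liability = genotype_effect + slope * (temperature - 25.0) + random.gauss(0, 0.3)
    return "male" if liability > threshold else "female"

def sex_ratio(genotype_effect, temperature, n=10_000):
    """Population-level proportion of males at a given temperature."""
    males = sum(pheno_sex(genotype_effect, temperature) == "male" for _ in range(n))
    return males / n

# A GSD-like population: a large genotypic effect dominates the liability,
# so the sex ratio is insensitive to temperature.
print(sex_ratio(genotype_effect=1.0, temperature=25.0))
# A TSD-like population: no genotypic effect, the reaction norm dominates.
print(sex_ratio(genotype_effect=0.0, temperature=28.0))
```

With a strong genotypic effect the ratio stays near 1 regardless of temperature, while with no genotypic effect temperature alone drives it, matching the continuum view above.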
Abstract:
This paper presents a new nonparametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In previous work, we demonstrated that the STN position can be predicted from the positions of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on state-of-the-art targeting methods while reducing computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting expert's variability. Moreover, the computational time of our registration method is much lower, a worthwhile improvement from a clinical point of view.
Abstract:
Identifying the geographic distribution of populations is a basic yet crucial step in many fundamental and applied ecological projects, as it provides key information on which many subsequent analyses depend. However, this task is often costly and time-consuming, especially where rare species are concerned and where most sampling designs generally prove inefficient. At the same time, rare species are those for which distribution data are most needed for their conservation to be effective. To enhance fieldwork sampling, model-based sampling (MBS) uses predictions from species distribution models: when looking for the species in areas of high habitat suitability, the chances of finding them should be higher. We thoroughly tested the efficiency of MBS by conducting an extensive survey in the Swiss Alps, assessing the detection rate of three rare and five common plant species. For each species, habitat suitability maps were produced following an ensemble modeling framework combining two spatial resolutions and two modeling techniques. We tested the efficiency of MBS and the accuracy of our models by sampling 240 sites in the field (30 sites × 8 species). Across all species, the MBS approach proved effective. In particular, the MBS design led to the discovery of six sites of presence of one rare plant, increasing the chances of finding this species from 0% to 50%. For common species, MBS doubled the rate of new population discoveries compared to random sampling. Habitat suitability maps derived from the combination of four individual modeling methods predicted the species' distributions well, and more accurately than the individual models. In conclusion, using MBS for fieldwork could efficiently help increase our knowledge of rare species' distributions. More generally, we recommend using habitat suitability models to support conservation plans.
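The core MBS step, ranking candidate field sites by ensemble habitat suitability, can be sketched as follows (a hypothetical toy example; the model names and suitability values are illustrative, not from the study):

```python
def ensemble(pred_a, pred_b):
    """Combine two models' per-cell suitability predictions by averaging."""
    return [(a + b) / 2 for a, b in zip(pred_a, pred_b)]

def model_based_sample(suitability, n_sites):
    """Return the indices of the n_sites cells with highest suitability."""
    ranked = sorted(range(len(suitability)),
                    key=lambda i: suitability[i], reverse=True)
    return ranked[:n_sites]

glm = [0.1, 0.8, 0.4, 0.9, 0.2]   # e.g., regression-model predictions per cell
gbm = [0.2, 0.7, 0.5, 0.8, 0.1]   # e.g., boosted-trees predictions per cell
sites = model_based_sample(ensemble(glm, gbm), n_sites=2)
print(sites)  # cells 3 and 1 have the highest mean suitability
```

Field crews then visit the top-ranked cells first, which is what raises detection rates relative to random sampling.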
Abstract:
Multiscale methods solve local fine-scale problems and couple them through a global coarse problem. Although these techniques are usually employed for problems in which the fine-scale processes are described by Darcy's law, they can also be applied to pore-scale simulations and used as a mathematical framework for hybrid methods that couple the Darcy and pore scales. In this work, we consider a pore-scale description of fine-scale processes. The Navier-Stokes equations are numerically solved in the pore geometry to compute the velocity field and obtain generalized permeabilities. In the case of two-phase flow, the dynamics of the phase interface is described by the volume-of-fluid method with the continuum surface force model. The multiscale finite-volume (MsFV) method is employed to construct an algorithm that couples a Darcy macro-scale description with a pore-scale description at the fine scale. The hybrid simulation results presented are in good agreement with the fine-scale reference solutions. As the reconstruction of fine-scale details can be done adaptively, the presented method offers a flexible framework for hybrid modeling.
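The macro-scale side of such a hybrid coupling reduces to Darcy's law once the pore-scale simulation has supplied a permeability; a minimal illustration (the values are illustrative, not from the paper):

```python
def darcy_flux(permeability_m2, viscosity_pa_s, dp_pa, length_m):
    """Volumetric flux (m/s) from Darcy's law: q = -(k / mu) * dP/dx."""
    return -(permeability_m2 / viscosity_pa_s) * (dp_pa / length_m)

# Illustrative numbers: k = 1 darcy-scale permeability, water-like viscosity,
# a 1 bar pressure drop over 1 m.
q = darcy_flux(permeability_m2=1e-12, viscosity_pa_s=1e-3,
               dp_pa=-1e5, length_m=1.0)
print(q)  # positive flux down the pressure gradient
```

In the hybrid scheme, the generalized permeabilities fed into this macro-scale relation come from the Navier-Stokes solutions in the pore geometry.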
Abstract:
With over 68,000 miles of gravel roads in Iowa and the importance of these roads within the farm-to-market transportation system, proper water management is critical for maintaining the integrity of the roadway materials. The build-up of water within the aggregate subbase can lead to frost boils and ultimately potholes forming at the road surface. The aggregate subbase and subgrade soils under these gravel roads are produced with material opportunistically chosen from local sources near the site, and the compositions of these sublayers are often far from ideal in terms of water drainage, with the full effects of this shortcut not well understood. The primary objective of this project was to provide a physically based model for evaluating the drainability of potential subbase and subgrade materials for gravel roads in Iowa. The Richards equation provided the appropriate framework to study the transient unsaturated flow that usually occurs through the subbase and subgrade of a gravel road. From this analysis, we identified the saturated hydraulic conductivity, Ks, as a key parameter driving the time to drain of subgrade soils found in Iowa, and thus a good proxy variable for assessing roadway drainability. Using Ks derived from soil texture, we were able to identify potential problem areas in terms of roadway drainage. We found a threshold Ks of 15 cm/day that determines whether the roadway will drain efficiently, based on the requirement that the time to drain, Td, of the surface roadway layer not exceed a 2-hr limit. Two of the three most abundant textures (loam and silty clay loam), which cover nearly 60% of the state of Iowa, were found to have average Td values greater than the 2-hr limit.
With such a large percentage of the state at risk of boil formation due to soils with relatively low saturated hydraulic conductivity values, it seems pertinent to propose alternative design and/or maintenance practices to limit expensive repair work in Iowa. The addition of drain tiles or French mattresses may help address drainage problems. However, before pursuing this recommendation, a comprehensive cost-benefit analysis is needed.
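As a back-of-envelope illustration of the Ks threshold (a deliberately simplified proxy, not the study's Richards-equation solution; the layer thickness and drainable porosity are assumed values chosen so that Ks = 15 cm/day sits exactly at the 2-hr limit):

```python
def time_to_drain_hours(ks_cm_per_day, layer_cm=25.0, drainable_porosity=0.05):
    """Crude proxy: time to drain = drainable water depth / Ks.
    layer_cm and drainable_porosity are illustrative assumptions."""
    drainable_water_cm = drainable_porosity * layer_cm
    return 24.0 * drainable_water_cm / ks_cm_per_day

def drains_efficiently(ks_cm_per_day, limit_hours=2.0):
    """Apply the 2-hr time-to-drain criterion from the study."""
    return time_to_drain_hours(ks_cm_per_day) <= limit_hours

print(drains_efficiently(15.0))  # at the reported threshold: passes the limit
print(drains_efficiently(5.0))   # a low-Ks soil: fails the 2-hr limit
```

Under these assumptions, any soil with Ks below 15 cm/day exceeds the 2-hr limit, which is the behavior the threshold in the abstract describes.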
Abstract:
This project consists of developing a persistence framework to bridge the relational data model and the object-oriented data model. The application we implement to test our framework is a web adaptation of a piece of financial software used for treasury management.
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e., row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated workshops on astronomical data analysis techniques.
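The map/reduce pattern for hypercube generation can be sketched in a few lines (a hypothetical example: the field names and bin size are illustrative, and a real deployment would run the same two functions on Hadoop rather than in-process):

```python
from collections import defaultdict

def mapper(star, bin_size=10.0):
    """Emit one (cell, 1) pair per record: the (ra, dec) bin it falls in."""
    key = (int(star["ra"] // bin_size), int(star["dec"] // bin_size))
    return key, 1

def reducer(pairs):
    """Sum the counts emitted for each hypercube cell."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

stars = [{"ra": 12.3, "dec": 45.6},
         {"ra": 14.1, "dec": 41.0},
         {"ra": 101.0, "dec": -5.0}]
hypercube = reducer(mapper(s) for s in stars)
print(hypercube)  # {(1, 4): 2, (10, -1): 1}
```

Adding axes (parallax, magnitude, etc.) only lengthens the key tuple, which is why the same framework generalizes from histograms to higher-dimensional hypercubes.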
Abstract:
EXECUTIVE SUMMARY: Evaluating the Information Security posture of an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately this is ineffective, because it does not take into consideration the necessity of a global and systemic multidimensional approach to Information Security evaluation. At the same time, the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; this is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. This dissertation is divided into three parts. Part One, Information Security Evaluation Issues, consists of four chapters. Chapter 1 introduces the purpose of this research and the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed. We then introduce the baseline attributes of our model and set out the expected results of evaluations according to it. Chapter 2 focuses on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in a holistic and baseline Information Security Program are defined. Based on this, the most common roots of trust in Information Security are identified. Chapter 3 focuses on an analysis of the difference and the relationship between the concepts of Information Risk and Security Management.
Comparing these two concepts allows us to identify the most relevant elements to be included within our evaluation model, while clearly situating these two notions within a defined framework is of the utmost importance for the results that will be obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed in order to provide an assurance-related platform as well as three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers. The operation of the model is then discussed: assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two, Implementation of the Information Security Assurance Assessment Model (ISAAM) according to the Information Security Domains, consists of four chapters. This is the section where our evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational, Functional, Human, and Legal dimensions. Each Information Security dimension is discussed in a separate chapter. For each dimension, the following two-phase evaluation path is followed. The first phase concerns the identification of the elements which will constitute the basis of the evaluation: identification of the key elements within the dimension; identification of the Focus Areas for each dimension, consisting of the security issues identified for that dimension; and identification of the Specific Factors for each dimension, consisting of the security measures or controls addressing those security issues. The second phase concerns the evaluation of each Information Security dimension by: implementing the evaluation model, based on the elements identified in the first phase, and identifying the security tasks, processes, procedures, and actions that should have been performed by the organization to reach the desired level of protection; and proposing a maturity model for each dimension as a basis for reliance on security. For each dimension we propose a generic maturity model that could be used by any organization to define its own security requirements. Part Three of this dissertation contains the final remarks, supporting resources and annexes. With reference to the objectives of our thesis, the final remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. The supporting resources comprise the bibliographic sources that were used to elaborate and justify our approach. The annexes include the relevant topics identified in the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise, combined in order to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that the Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources of information in order to provide a generic model able to be implemented in all kinds of organizations.
The value added by our evaluation model is that it is easy to implement and operate and answers concrete needs in terms of reliance upon an efficient and dynamic evaluation tool through a coherent evaluation system. On that basis, our model could be implemented internally within organizations, allowing them to better govern their Information Security.
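The weakest-link principle underlying the model can be stated in one line (a hypothetical sketch; the dimension names follow the thesis, but the scoring scale and values are invented for illustration):

```python
def overall_security(dimension_scores):
    """Weakest-link aggregation: the overall posture is bounded by the
    lowest-scoring security dimension (scores here are maturity levels 0-5)."""
    return min(dimension_scores.values())

scores = {"organizational": 4, "functional": 3, "human": 2, "legal": 4}
print(overall_security(scores))  # the human dimension caps the whole posture
```

This is why evaluating dimensions independently is insufficient: raising any score other than the minimum leaves the aggregate unchanged.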
Abstract:
Despite the important benefits for firms of commercial initiatives on the Internet, e-commerce is still an emerging distribution channel, even in developed countries. Thus, more needs to be known about the mechanisms affecting its development. A large number of works have studied firms' e-commerce adoption from technological, intraorganizational, institutional, or other specific perspectives, but there is a need for adequately tested integrative frameworks. Hence, this work proposes and tests a model of firms' business-to-consumer (B2C) e-commerce adoption that is founded on a holistic vision of the phenomenon. With this integrative approach, the authors analyze the joint influence of environmental, technological, and organizational factors; moreover, they evaluate this effect over time. Using various representative Spanish data sets covering the period 1996-2005, the findings demonstrate the suitability of the holistic framework. Likewise, some lessons are learned from the analysis of the key building blocks. In particular, the current study provides evidence for the debate about the effect of competitive pressure, since the findings show that competitive pressure disincentivizes e-commerce adoption in the long term. The results also show that the development or enrichment of consumers' consumption patterns, the technological readiness of the market forces, the firm's global scope, and its competences in innovation continuously favor e-commerce adoption.
Abstract:
In this paper, the theory of hidden Markov models (HMM) is applied to the problem of blind (without training sequences) channel estimation and data detection. Within an HMM framework, the Baum-Welch (BW) identification algorithm is frequently used to obtain maximum-likelihood (ML) estimates of the corresponding model. However, such a procedure assumes the model (i.e., the channel response) to be static throughout the observation sequence. By introducing a parametric model for time-varying channel responses, a version of the algorithm which is more appropriate for mobile channels [time-dependent Baum-Welch (TDBW)] is derived. To compare algorithm behavior, a set of computer simulations for a GSM scenario is provided. Results indicate that, in comparison to other Baum-Welch (BW) versions of the algorithm, the TDBW approach attains a remarkable enhancement in performance. For that purpose, only a moderate increase in computational complexity is needed.
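For context, the forward recursion on which Baum-Welch estimation builds can be sketched as follows (a generic textbook HMM example; the probabilities are illustrative and unrelated to the paper's GSM simulations):

```python
def forward(init, trans, emit, obs):
    """Forward algorithm: returns P(obs) under the HMM.
    init[i]: initial state probabilities; trans[i][j]: transition
    probabilities; emit[i][o]: emission probabilities for symbol o."""
    n = len(init)
    alpha = [init[i] * emit[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(n)) * emit[j][o]
                 for j in range(n)]
    return sum(alpha)

init = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit = [[0.9, 0.1], [0.2, 0.8]]   # two states, two observable symbols (0/1)
print(forward(init, trans, emit, [0, 1, 0]))
```

Baum-Welch iterates expectations built from these forward (and matching backward) quantities to re-estimate `trans` and `emit`; the TDBW variant described above replaces the static channel model with a time-parametrized one.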
Abstract:
The software development industry is constantly evolving. The rise of agile methodologies in the late 1990s, and new development tools and technologies, require growing attention from everybody working in this industry. Organizations have, however, had a mixture of various processes and different process languages, since a standard software development process language has not been available. A promising process meta-model called the Software & Systems Process Engineering Meta-Model (SPEM) 2.0 has been released recently. It is applied by tools such as Eclipse Process Framework Composer, which is designed for implementing and maintaining processes and method content. Its aim is to support a broad variety of project types and development styles. This thesis presents the concepts of software processes, models, traditional and agile approaches, method engineering, and software process improvement. Some of the best-known methodologies (RUP, OpenUP, OpenMethod, XP and Scrum) are also introduced, with a comparison provided between them. The main focus is on the Eclipse Process Framework and SPEM 2.0: their capabilities, usage and modeling. As a proof of concept, I present a case study of modeling OpenMethod with EPF Composer and SPEM 2.0. The results show that the new meta-model and tool make it possible to easily manage method content, publish versions with customized content, and connect project tools (such as MS Project) with the process content. Software process modeling also acts as a process improvement activity.
Abstract:
The aim of this study was to determine how attractive a business opportunity mobile games offer as an advertising platform. The study was conducted as a case study. It began by defining the business model, followed by a general overview of the Finnish mobile game market. Value chain, value network, and market analyses were then used to identify the opportunities and limitations of the business model. The study used a theoretical framework based on Hamel's business model, Porter's value chain, and Allee's value network. The study concluded that advertising in mobile games offers a business opportunity with no barriers to its implementation. However, the Finnish mobile game market is fragmented, which means the studied "advertising management platform" business model incurs excessively high integration costs. The large number of game vendors also reduces the model's effectiveness.
Abstract:
In this article, the author provides a framework to guide research in emotional intelligence. Studies conducted up to the present bear on a conception of emotional intelligence as pertaining to the domain of consciousness and investigate the construct with a correlational approach. As an alternative, the author explores processes underlying emotional intelligence, introducing the distinction between conscious and automatic processing as a potential source of variability in emotionally intelligent behavior. Empirical literature is reviewed to support the central hypothesis that individual differences in emotional intelligence may be best understood by considering the way individuals automatically process emotional stimuli. Providing directions for research, the author encourages the integration of experimental investigation of processes underlying emotional intelligence with correlational analysis of individual differences and fosters the exploration of the automaticity component of emotional intelligence.
Integration in strategic alliances: a conceptual framework of IT use in marketing as an NPD key factor
Abstract:
In a knowledge-based economy, product innovation is considered a key factor in determining a company's competitiveness, productivity, and growth. However, companies' experience demonstrates the need for a new model of product innovation management: one grounded in marketing, in which cooperation and the intensive use of information and communication technologies (ICT) are especially important. In recent years, the marketing literature has analyzed the role of cooperation in the success of the innovation process. Until now, however, few studies have examined the role that ICT use in marketing plays in the success of new product development (NPD). This is a curious omission, given that the new competitive environment is defined by an economy and society based primarily on the intensive use of ICT and knowledge. The aim of this work is to investigate the role that ICT use in marketing plays in the new product development process, as an element that reinforces the integration of agents in the project, favoring the establishment of relationships aimed at cooperation and the acquisition of market intelligence useful in the NPD process. The study of a sample of 2,038 companies from all sectors of economic activity in Catalonia allows us to test our initial hypotheses and establish a profile of the innovative company based on the strong relationships among innovation, ICT use in marketing, and integration. Two ideas stand out in our analysis. First, intensive ICT use in marketing makes a company more innovative, as it perceives that such use helps overcome barriers to innovation and accelerates processes, which become more efficient.
Second, increasing ICT use in marketing raises the company's predisposition to integrate particular agents from its business environment into the innovation development process and to collaborate with them, thereby improving the degree to which the new product is adapted to market demands.