791 results for Concerns Based Adoption Model CBAM
Abstract:
Perceived discrimination is associated with increased engagement in unhealthy behaviors. We propose an identity-based pathway to explain this link. Drawing on an identity-based motivation model of health behaviors (Oyserman, Fryberg, & Yoder, 2007), we propose that perceptions of discrimination lead individuals to engage in ingroup-prototypical behaviors in the service of validating their identity and creating a sense of ingroup belonging. To the extent that people perceive unhealthy behaviors as ingroup-prototypical, perceived discrimination may thus increase motivation to engage in unhealthy behaviors. We describe our theoretical model and two studies that demonstrate initial support for some paths in this model. In Study 1, African American participants who reflected on racial discrimination were more likely to endorse unhealthy ingroup-prototypical behavior as self-characteristic than those who reflected on a neutral event. In Study 2, among African American participants who perceived unhealthy behaviors to be ingroup-prototypical, discrimination predicted greater endorsement of unhealthy behaviors as self-characteristic as compared to a control condition. These effects held both with and without controlling for body mass index (BMI) and income. Broader implications of this model for how discrimination adversely affects health-related decisions are discussed.
Abstract:
The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models, which are applied to problem solving. The Dendritic Cell Algorithm (DCA)[2] is an AIS algorithm developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environment called AnyLogic, where the immune entities in the DCA are represented by intelligent agents. If this model can be successfully implemented, it makes it possible to implement more complicated and adaptive AIS models in the agent-based simulation environment.
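As a rough illustration of the DCA's signal-fusion idea, the sketch below combines PAMP, danger and safe signals for each exposure of a data item and classifies by the fraction of "mature" outcomes (the mature context antigen value, MCAV). The signal weights and threshold are illustrative assumptions, not the published DCA parameterization.

```python
# Minimal sketch of Dendritic Cell Algorithm-style signal fusion.
# Weights below are illustrative, not the published parameterization.

def cell_output(pamp, danger, safe):
    """Combine input signals into (costimulation, mature, semi-mature) outputs."""
    csm  = 2.0 * pamp + 1.0 * danger + 2.0 * safe   # costimulatory signal
    mat  = 2.0 * pamp + 1.0 * danger - 3.0 * safe   # 'mature' (anomalous) context
    semi = 3.0 * safe                               # 'semi-mature' (normal) context
    return csm, mat, semi

def classify(samples, threshold=0.5):
    """Label an antigen anomalous if most exposures end in the mature context."""
    mature_votes = 0
    for pamp, danger, safe in samples:
        _, mat, semi = cell_output(pamp, danger, safe)
        if mat > semi:
            mature_votes += 1
    mcav = mature_votes / len(samples)   # mature context antigen value
    return ("anomalous" if mcav > threshold else "normal"), mcav

label, mcav = classify([(1.0, 0.8, 0.1), (0.9, 0.7, 0.0), (0.0, 0.1, 1.0)])
```

In the real DCA each cell samples multiple antigens with its own migration threshold; this sketch keeps only the signal-aggregation and MCAV voting steps.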
Abstract:
Database schemas, in many organizations, are considered one of the critical assets to be protected. From database schemas, it is not only possible to infer the information being collected but also the way organizations manage their businesses and/or activities. One of the ways to disclose database schemas is through Create, Read, Update and Delete (CRUD) expressions. In fact, their use can follow strict security rules or be exploited by malicious users. In the first case, users are required to master database schemas. This can be critical when applications that access the database directly, which we call database interface applications (DIA), are developed by third-party organizations via outsourcing. In the second case, users can partially or totally disclose database schemas by following malicious algorithms based on CRUD expressions. To overcome this vulnerability, we propose a new technique where CRUD expressions can no longer be directly manipulated by DIAs. Whenever a DIA starts up, the associated database server generates a random codified token for each CRUD expression and sends it to the DIA; the DIA thereafter presents the token so that the database server can execute the corresponding CRUD expression. In order to validate our proposal, we present a conceptual architectural model and a proof of concept.
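The token indirection described above can be sketched as follows. The class name, token format, and flow are illustrative assumptions, not the paper's implementation: the point is only that the client holds opaque tokens while the server keeps the token-to-expression mapping.

```python
# Conceptual sketch of token-based CRUD indirection: at DIA start-up the
# server registers each CRUD expression under a random token, and only
# tokens ever cross the wire. Names are illustrative.

import secrets

class CrudTokenBroker:
    def __init__(self, crud_expressions):
        # token -> CRUD expression; tokens are opaque to the client
        self.by_token = {secrets.token_hex(16): sql for sql in crud_expressions}

    def tokens(self):
        """Handed to the DIA at start-up instead of the raw expressions."""
        return list(self.by_token)

    def execute(self, token, params):
        """Server side: resolve the token; reject anything unregistered."""
        sql = self.by_token.get(token)
        if sql is None:
            raise PermissionError("unknown CRUD token")
        return sql, params   # placeholder for real statement execution

broker = CrudTokenBroker(["SELECT name FROM clients WHERE id = ?"])
tok = broker.tokens()[0]
sql, params = broker.execute(tok, (42,))
```

Because the DIA never sees the SQL text, even a compromised client can only replay registered operations, not probe the schema with arbitrary expressions.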
Abstract:
This paper discusses the results and propositions of organizational knowledge management research conducted in the period 2001-2007. This longitudinal study had the unique goal of investigating and analyzing "Knowledge Management" (KM) processes effectively implemented in world-class organizations. The main objective was to investigate and analyze the conceptions, motivations, practices, metrics and results of KM processes implemented in different industries. The first set of studies involved 20 world-class cases reported in the literature and served as a basis for a theoretical framework entitled "KM Integrative Conceptual Mapping Proposition". This theoretical proposal was then tested in a qualitative study in three large organizations in Brazil. The results of the qualitative study validated the mapping proposition and left questions for new research concerning the implementation of a knowledge-based organizational model strategy.
Abstract:
In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is the best one for modelling human reactive behaviour in the retail sector. In order to study the output accuracy in both models, we have carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment was carried out using a large UK department store as a case study. We had to determine an efficient implementation of management policy in the store’s fitting room using DES and ABS. Overall, we have found that both simulation models were a good representation of the real system when modelling human reactive behaviour.
Abstract:
2016
Abstract:
This study aims to evaluate whether the innovation-adopter profile changes the relationship between Perceived Value and Purchase Intention for mobile media devices (smartphones, tablets, ultrabooks and e-book readers). It is a quantitative study that explores the structural relationship between the variables through Structural Equation Modeling (SEM). The proposed research model was developed on the basis of Innovation Diffusion Theory (TDI), the Unified Theory of Acceptance and Use of Technology (UTAUT), the Technology Acceptance Model (TAM), the Value-based Adoption Model (VAM), the Consumer Acceptance of Technology model (CAT) and the Social Influence model (IS). Data were collected using snowball sampling, a non-probabilistic sampling technique used in social research. A survey questionnaire was distributed through the social network Facebook, starting from the author's contacts, with respondents asked to repost the survey link on their personal pages, broadening the sample. Data collection took place in September and October 2013, yielding a total of 362 completed questionnaires. The study found a significant effect of Perceived Value on Purchase Intention (t statistic = 4.506; 1% significance level) and supported the moderating influence of the Adopter Profile on that relationship (t statistic = 4.066; 1% significance level), with a large impact on Purchase Intention (f² = 0.582) and moderate predictive relevance (q² = 0.290). Among the antecedent variables related to technology adoption, Perceived Ease of Use, Perceived Complexity and Perceived Risk showed no significant effect on Perceived Value.
The model contributed significantly to explaining the factors that influence Perceived Value (R² = 51.7%) and the effect of Perceived Value on Purchase Intention (R² = 49.1%) for portable electronic devices. Support for the presumed moderating influence of the Adopter Profile on the relationship between Perceived Value and Purchase Intention indicates that organizations should better understand the consumers of this type of mobile device, segmenting the market and developing actions aligned with each adopter profile.
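The f² effect size quoted above is Cohen's f², which can be computed from the R² of models estimated with and without the moderator. The sketch below shows the formula with made-up R² values for illustration, not the study's data.

```python
# Cohen's f-squared for the incremental contribution of a predictor or
# moderator. Input R^2 values below are illustrative, not the study's.

def cohens_f2(r2_with, r2_without):
    """f^2 = (R^2_included - R^2_excluded) / (1 - R^2_included)."""
    return (r2_with - r2_without) / (1.0 - r2_with)

# e.g. if adding the moderator lifts R^2 from 0.30 to 0.45:
effect = cohens_f2(0.45, 0.30)
```

By the usual rules of thumb, f² around 0.02, 0.15 and 0.35 mark small, medium and large effects, which is why the study's 0.582 counts as a high-impact moderation.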
Abstract:
Over the past decade, organizations worldwide have begun to widely adopt agile software development practices, which offer greater flexibility to frequently changing business requirements, better cost effectiveness due to minimization of waste, faster time-to-market, and closer collaboration between business and IT. At the same time, IT services are continuing to be increasingly outsourced to third parties, providing the organizations with the ability to focus on their core capabilities as well as to take advantage of better demand scalability, access to specialized skills, and cost benefits. An output-based pricing model, where the customers pay directly for the functionality that was delivered rather than the effort spent, is quickly becoming a new trend in IT outsourcing, allowing risk to be transferred away from the customer while at the same time offering much better incentives for the supplier to optimize processes and improve efficiency, and consequently producing a true win-win outcome. Despite the widespread adoption of both agile practices and output-based outsourcing, there is little formal research available on how the two can be effectively combined in practice. Moreover, little practical guidance exists on how companies can measure the performance of their agile projects, which are being delivered in an output-based outsourced environment. This research attempted to shed light on this issue by developing a practical project monitoring framework which may be readily applied by organizations to monitor the performance of agile projects in an output-based outsourcing context, thus taking advantage of the combined benefits of such an arrangement. Following a modified action research approach, this research was divided into two cycles, each consisting of the Identification, Analysis, Verification, and Conclusion phases.
During Cycle 1, a list of six Key Performance Indicators (KPIs) was proposed and accepted by the professionals in the studied multinational organization, which formed the core of the proposed framework and answered the first research sub-question of what needs to be measured. In Cycle 2, a more in-depth analysis was provided for each of the suggested Key Performance Indicators, including the techniques for capturing, calculating, and evaluating the information provided by each KPI. In the course of Cycle 2, the second research sub-question was answered, clarifying how the data for each KPI needed to be measured, interpreted, and acted upon. Consequently, after two incremental research cycles, the primary research question was answered, describing the practical framework that may be used for monitoring the performance of agile IT projects delivered in an output-based outsourcing context. This framework was evaluated by the professionals within the context of the studied organization and received positive feedback across all four evaluation criteria set forth in this research: the low overhead of data collection, high value of provided information, understandability of the metric dashboard, and high generalizability of the proposed framework.
Abstract:
We describe a technique for finding pixelwise correspondences between two images by using models of objects of the same class to guide the search. The object models are 'learned' from example images (also called prototypes) of an object class. The models consist of a linear combination of prototypes. The flow fields giving pixelwise correspondences between a base prototype and each of the other prototypes must be given. A novel image of an object of the same class is matched to a model by minimizing an error between the novel image and the current guess for the closest model image. Currently, the algorithm applies to line drawings of objects. An extension to real grey level images is discussed.
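The "linear combination of prototypes" idea can be sketched with a plain least-squares fit: stack the prototype images as columns and solve for the coefficients that best reconstruct the novel image. The paper minimizes its match error iteratively and also estimates flow fields; this sketch stands in only for the coefficient-fitting step, with tiny made-up images.

```python
# Sketch: match a novel image to a linear combination of prototype images
# by least squares. The prototypes and 'novel image' here are synthetic.

import numpy as np

def fit_model(prototypes, novel):
    """Find coefficients c minimizing ||sum_i c_i * prototype_i - novel||."""
    A = np.stack([p.ravel() for p in prototypes], axis=1)  # pixels x prototypes
    c, *_ = np.linalg.lstsq(A, novel.ravel(), rcond=None)
    return c, (A @ c).reshape(novel.shape)                 # coefficients, model image

protos = [np.eye(4), np.ones((4, 4))]
novel = 0.5 * protos[0] + 0.25 * protos[1]
c, recon = fit_model(protos, novel)
```

Because the synthetic novel image lies exactly in the span of the prototypes, the fit recovers the mixing coefficients (0.5 and 0.25) exactly; real images would leave a residual that the full algorithm drives down by also warping via the flow fields.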
Abstract:
The formulation of a new process-based crop model, the general large-area model (GLAM) for annual crops, is presented. The model has been designed to operate on spatial scales commensurate with those of global and regional climate models. It aims to simulate the impact of climate on crop yield. Procedures for model parameter determination and optimisation are described, and demonstrated for the prediction of groundnut (i.e. peanut; Arachis hypogaea L.) yields across India for the period 1966-1989. Optimal parameters (e.g. extinction coefficient, transpiration efficiency, rate of change of harvest index) were stable over space and time, provided the estimate of the yield technology trend was based on the full 24-year period. The model has two location-specific parameters: the planting date and the yield gap parameter. The latter varies spatially and is determined by calibration. The optimal value varies slightly when different input data are used. The model was tested using a historical data set on a 2.5° × 2.5° grid to simulate yields. Three sites are examined in detail: grid cells from Gujarat in the west, Andhra Pradesh towards the south, and Uttar Pradesh in the north. Agreement between observed and modelled yield was variable, with correlation coefficients of 0.74, 0.42 and 0, respectively. Skill was highest where the climate signal was greatest, and correlations were comparable to or greater than correlations with seasonal mean rainfall. Yields from all 35 cells were aggregated to simulate all-India yield. The correlation coefficient between observed and simulated yields was 0.76, and the root mean square error was 8.4% of the mean yield. The model can be easily extended to any annual crop for the investigation of the impacts of climate variability (or change) on crop yield over large areas. © 2004 Elsevier B.V. All rights reserved.
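The two skill measures quoted above (correlation between observed and simulated yields, and RMSE as a percentage of the observed mean) can be computed as below. The yield series is made up for illustration; it is not the study's data.

```python
# Sketch of the yield-skill measures: Pearson correlation and RMSE as a
# percentage of the observed mean. The series below are illustrative.

import numpy as np

def skill(observed, simulated):
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    r = np.corrcoef(obs, sim)[0, 1]                            # Pearson correlation
    rmse_pct = 100.0 * np.sqrt(np.mean((obs - sim) ** 2)) / obs.mean()
    return r, rmse_pct

r, rmse_pct = skill([800, 900, 1000, 1100], [820, 880, 1010, 1120])
```

Reporting RMSE as a percentage of the mean yield, as the paper does, lets skill be compared across regions with very different absolute yield levels.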
Abstract:
The agent-based model presented here comprises an algorithm that computes the degree of hydration, the water consumption and the layer thickness of C-S-H gel as functions of time for different temperatures and different w/c ratios. The results are in agreement with reported experimental studies, demonstrating the applicability of the model. As the available experimental results regarding elevated curing temperatures are scarce, the model could be recalibrated in the future. By combining the agent-based computational model with TGA analysis, a semi-empirical method is obtained for better understanding the microstructure development in ordinary cement pastes and for predicting the influence of temperature on the hydration process.
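For orientation, a commonly used empirical form for degree of hydration versus time is a three-parameter exponential evaluated at an Arrhenius "equivalent age" that accounts for curing temperature. This is a generic sketch of that standard form, not the paper's agent-based algorithm, and all parameter values are illustrative.

```python
# Generic empirical hydration sketch: alpha(t) = alpha_u * exp(-(tau/t)^beta),
# evaluated at an Arrhenius equivalent age. Parameters are illustrative.

import math

def equivalent_age(t_hours, temp_c, e_a=40000.0, r=8.314, t_ref_k=293.15):
    """Scale real age to a 20 °C reference via Arrhenius temperature kinetics."""
    return t_hours * math.exp((e_a / r) * (1.0 / t_ref_k - 1.0 / (temp_c + 273.15)))

def hydration_degree(t_eq_hours, alpha_u=0.85, tau=15.0, beta=0.8):
    """Ultimate degree alpha_u approached along an S-shaped curve in time."""
    if t_eq_hours <= 0:
        return 0.0
    return alpha_u * math.exp(-(tau / t_eq_hours) ** beta)

alpha_3d = hydration_degree(equivalent_age(72.0, 40.0))  # 3 days cured at 40 °C
```

Higher curing temperature inflates the equivalent age and so accelerates the predicted early hydration, which is the qualitative effect the agent-based model captures mechanistically.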
Abstract:
This paper presents a simple profitability-based decision model to show how synergistic gains generated by the joint adoption of complementary innovations may influence the firm's adoption decision. For this purpose a weighted index of intra-firm diffusion is built to investigate empirically the drivers of the intensity of joint use of a set of complementary innovations. The findings indicate that establishment size, ownership structure and product market concentration are important determinants of the intensity of use. Interestingly, the factors that affect the extent of use of technological innovations do also affect that of clusters of management practices. However, they can explain only part of the heterogeneity of the benefits from joint use.
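A weighted index of intra-firm diffusion of the kind described above can be sketched as a weighted average of per-innovation use intensities. The weights and scores below are illustrative assumptions, not the paper's data.

```python
# Sketch of a weighted intra-firm diffusion index: each complementary
# innovation's extent of use (in [0, 1]) weighted by an importance weight.
# Values below are illustrative.

def diffusion_index(use_intensity, weights):
    """Weighted average of per-innovation use intensities."""
    total = sum(weights)
    return sum(u * w for u, w in zip(use_intensity, weights)) / total

# Three complementary innovations: fully used, half used, not used.
idx = diffusion_index([1.0, 0.5, 0.0], [2.0, 1.0, 1.0])
```

An index of this shape lets the intensity of joint use be regressed on firm characteristics such as establishment size, ownership structure and product market concentration, as the paper does.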
Abstract:
Purpose - The purpose of this paper is to construct a new e-commerce innovation and adoption model that takes into account various stages of e-commerce adoption (interactive, non-interactive and stabilised) and covers technological, organisational and environmental factors. This was tested using data collected from manufacturing and service companies in Saudi Arabia (SA) to reveal inhibitors and catalysts for e-commerce adoption. Design/methodology/approach - This study uses new data from surveys of 202 companies and then uses exploratory factor analysis and structural equation modelling for analyses. Findings - This study shows that the new stage-oriented model (SOM) is valid and can reveal specific detailed nuances of e-commerce adoption within a particular setting. Surprising results show that SA is not so very different to developed western countries in respect to e-commerce adoption. However there are some important differences which are discussed in detail. Research limitations/implications - A new SOM for e-commerce adoption is provided which may be used by other IS adoption researchers. Practical implications - Managers responsible for the adoption of e-commerce in SA, the Middle East and beyond can learn from these findings to speed up adoption rates and make e-commerce more effective. Social implications - This work may help spread e-commerce use throughout SA, the Middle East and to other developing nations. Originality/value - The results add to the extremely limited number of empirical studies that have been conducted to investigate e-commerce adoption in the context of Arabic countries.
Abstract:
There has been an increasing interest in the use of agent-based simulation and some discussion of the relative merits of this approach as compared to discrete-event simulation. There are differing views on whether an agent-based simulation offers capabilities that discrete-event cannot provide, or whether all agent-based applications can at least in theory be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two versions of the discrete-event model presented use a traditional process-flow approach normally adopted in discrete-event simulation software and also an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
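The paper's central point, that an event scheduler of the kind used in discrete-event tools can host agent behaviour, can be sketched in a few lines: agents are driven from a time-ordered event queue rather than a synchronous per-tick loop, each agent rescheduling its own next action. The structure and names below are illustrative, not the ARENA/NetLogo models themselves.

```python
# Sketch: agent-based behaviour on a discrete-event engine. Each event is
# (time, agent); on firing, the agent acts and schedules its next event.

import heapq
import random

def run(n_agents=3, horizon=10.0, seed=1):
    rng = random.Random(seed)
    actions = {a: 0 for a in range(n_agents)}
    # each agent's first action occurs at a random time early in the run
    events = [(rng.uniform(0.0, 2.0), a) for a in range(n_agents)]
    heapq.heapify(events)
    while events:
        t, agent = heapq.heappop(events)        # next event in time order
        if t > horizon:
            break
        actions[agent] += 1                     # the agent 'acts' here
        # agent reschedules itself after an exponential delay (rate 1)
        heapq.heappush(events, (t + rng.expovariate(1.0), agent))
    return actions

counts = run()
```

The same loop structure underlies process-flow discrete-event models; only the payload of each event (an autonomous agent decision rather than a queue/resource step) makes it "agent-based".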