395 results for Business Model Adaption.
Abstract:
Purpose: The purpose of this paper is to review, critique and develop a research agenda for the Elaboration Likelihood Model (ELM). The model was introduced by Petty and Cacioppo over three decades ago and has since been modified, revised and extended. Given modern communication contexts, it is appropriate to question the model's validity and relevance. Design/methodology/approach: The authors develop a conceptual approach, based on a comprehensive and extensive review and critique of the ELM and its development since its inception. Findings: This paper focuses on major issues concerning the ELM, including the model's assumptions and descriptive nature, continuum questions, multi-channel processing and mediating variables, before turning to the need to replicate the ELM and offering recommendations for its future development. Research limitations/implications: This paper offers a series of questions as research implications: whether the ELM could or should be replicated or extended, whether argument quality requires greater conceptualization, how movement along the continuum and between the central and peripheral routes to persuasion can be explained, and whether new methodologies and technologies can help better understand consumer thinking and behaviour. All of these relate to the current need to explore the relevance of the ELM in a more modern context. Practical implications: It is time to question the validity and relevance of the ELM. The diversity of online and offline media options and the variants of consumer choice raise significant issues. Originality/value: While the ELM continues to be widely cited and taught as one of the major cornerstones of persuasion, questions are raised concerning its relevance and validity in 21st-century communication contexts.
Abstract:
In-memory databases have become a mainstay of enterprise computing, offering significant performance and scalability boosts for online analytical and (to a lesser extent) transactional processing, as well as improved prospects for integration across different applications through an efficient shared database layer. Significant research and development has been undertaken over several years on the data management considerations of in-memory databases. However, limited insights are available on their impacts on applications and their supporting middleware platforms, and on how these need to evolve to fully function through, and leverage, in-memory database capabilities. This paper provides a first, comprehensive exposition of how in-memory databases impact Business Process Management, as a mission-critical and exemplary model-driven integration and orchestration middleware. Through it, we argue that in-memory databases will render some prevalent uses of legacy BPM middleware obsolete, but will also open up exciting possibilities for tighter application integration, better process automation performance and some entirely new BPM capabilities, such as process-based application customization. To validate the feasibility of in-memory BPM, we develop a surprisingly simple BPM runtime embedded into SAP HANA that provides BPMN-based process automation capabilities.
Abstract:
Information and Communication Technology (ICT) has become an integral part of societies across the globe. This study demonstrates how successful technology integration by 10 experienced teachers in an Australian high school was dependent on teacher-driven change and innovation that influenced the core business of teaching and learning. The teachers were subject specialists across a range of disciplines, engaging their Year Eight students (aged 12-14 years) in the Technology Rich Classrooms programme. Two classrooms were renovated to accommodate the newly acquired computer hardware. The first classroom adopted a one-to-one desktop model, with all the computers with Internet access arranged in a front-facing pattern. The second classroom had computers arranged in small groups. The students also used Blackboard to access learning materials after school hours. Qualitative data were gathered from teachers, mainly through structured and unstructured interviews and a range of other approaches, to ascertain their perceptions of the new initiative. This investigation showed that ICT was impacting positively on the core business of teaching and learning. With the support of the school leadership team, the built environment was enabling teachers to use ICT. This influenced their pedagogical approaches and the types of learning activities they designed and implemented. As a consequence, teachers felt that students were motivated and benefited from this experience.
Abstract:
Occupational stress research has consistently demonstrated negative effects for employees. Research also describes potential moderators of this relationship. While research has revealed some positive effects of emotional intelligence (EI) on employee adjustment, it has neglected to investigate its potential stress-buffering effects. Based on the Job Demands-Resources model, it was predicted that higher trait emotional intelligence would buffer the potential negative effects of stressors on employee adjustment. Hierarchical multiple regression analyses with a sample of 306 nurses found no main effects of EI but revealed eight moderating effects. While some interactions support the buffering hypothesis, others revealed buffering for those with low EI. Findings are discussed in terms of theoretical and practical implications.
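As a rough illustration of the kind of buffering (moderation) test described, here is a minimal Python sketch of a two-step hierarchical regression with a stressor x EI interaction term; the column names stressor, ei and adjustment are hypothetical and not taken from the study.

```python
# Hypothetical sketch of a moderated (buffering) hierarchical regression.
# Assumes a DataFrame with columns 'stressor', 'ei' and 'adjustment';
# these names are illustrative, not from the study.
import pandas as pd
import statsmodels.formula.api as smf

def fit_buffering_model(df: pd.DataFrame):
    # Centre predictors so the interaction term is interpretable.
    df = df.assign(
        stressor_c=df["stressor"] - df["stressor"].mean(),
        ei_c=df["ei"] - df["ei"].mean(),
    )
    # Step 1: main effects only.
    main = smf.ols("adjustment ~ stressor_c + ei_c", data=df).fit()
    # Step 2: add the stressor x EI interaction; a significant interaction
    # that weakens the negative stressor slope is the buffering pattern.
    moderated = smf.ols("adjustment ~ stressor_c * ei_c", data=df).fit()
    return main, moderated
```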
Abstract:
This paper presents a layered framework for integrating different Socio-Technical Systems (STS) models and perspectives into a whole-of-systems model. Holistic modelling plays a critical role in the engineering of STS because of the interplay between the social and technical elements within these systems and the resulting emergent behaviour. The framework decomposes STS models into components, where each component is a static object, a dynamic object or a behavioural object. Based on the existing literature, a classification of the different elements that make up STS, whether social, technical or natural environment elements, is developed; each object can in turn be classified according to the STS elements it represents. Using the proposed framework, it is possible to systematically decompose models to the extent that points of interface can be identified and the contextual factors required to transform a component of one model to interface with another are obtained. Using an airport inbound passenger facilitation process as a case study socio-technical system, three different models are analysed: a Business Process Modelling Notation (BPMN) model, a Hybrid Queue-based Bayesian Network (HQBN) model and an Agent-Based Model (ABM). It is found that the framework enables the modeller to identify non-trivial interface points, such as between the spatial interactions of an ABM and the causal reasoning of an HQBN, and between the process activity representation of a BPMN model and simulated behavioural performance in an HQBN. Such a framework is a necessary enabler for integrating different modelling approaches in understanding and managing STS.
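To make the framework's taxonomy concrete, the following is a hypothetical Python sketch of how model components might be tagged as static, dynamic or behavioural objects and classified by STS element; all class and instance names are illustrative, not the paper's.

```python
# Illustrative data structures for the component taxonomy described above.
from dataclasses import dataclass
from enum import Enum

class ObjectKind(Enum):
    STATIC = "static"
    DYNAMIC = "dynamic"
    BEHAVIOURAL = "behavioural"

class ElementClass(Enum):
    SOCIAL = "social"
    TECHNICAL = "technical"
    NATURAL_ENVIRONMENT = "natural environment"

@dataclass
class ModelComponent:
    name: str
    source_model: str          # e.g. "BPMN", "HQBN", "ABM"
    kind: ObjectKind
    element: ElementClass

@dataclass
class InterfacePoint:
    """A candidate point where a component of one model feeds another."""
    provider: ModelComponent
    consumer: ModelComponent
    context: str               # contextual factors needed for the transform

# e.g. linking ABM spatial interactions to HQBN causal reasoning
passenger_flow = ModelComponent("passenger movement", "ABM",
                                ObjectKind.BEHAVIOURAL, ElementClass.SOCIAL)
queue_delay = ModelComponent("queue delay node", "HQBN",
                             ObjectKind.DYNAMIC, ElementClass.TECHNICAL)
link = InterfacePoint(passenger_flow, queue_delay,
                      context="arrival rates aggregated over time windows")
```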
Abstract:
Business process models have traditionally been an effective way of examining business practices to identify areas for improvement. While common information-gathering approaches are generally efficacious, they can be quite time-consuming and carry the risk of inaccuracies when information is forgotten or incorrectly interpreted by analysts. In this study, the potential of a role-playing approach for process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. Empirical data obtained in this study suggest that this approach may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.
Abstract:
Business Process Management describes a holistic management approach for the systematic design, modeling, execution, validation, monitoring and improvement of organizational business processes. Traditionally, most attention within this community has been given to control-flow aspects, i.e., the ordering and sequencing of business activities, often in isolation from the context in which these activities occur. In this paper, we propose an approach that allows executable process models to be integrated with Geographic Information Systems. This approach enables process models to take geospatial and other geographic aspects into account in an explicit manner, both during the modeling phase and the execution phase. We contribute a structured modeling methodology, based on the well-known Business Process Model and Notation standard, which is formalized by means of a mapping to executable Colored Petri nets. We illustrate the feasibility of our approach by means of a sustainability-focused case example of a process with important ecological concerns.
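As a toy illustration of the general idea, and not the paper's formal mapping, the sketch below shows a BPMN-style task rendered as a Colored-Petri-net-like transition whose guard inspects geospatial token data; all names, coordinates and the guard condition are hypothetical.

```python
# Toy sketch: a BPMN task mapped to a Petri-net transition whose guard
# reads geospatial attributes carried in the token colour. Not the paper's
# formal mapping; purely illustrative.
from dataclasses import dataclass, field

@dataclass
class Place:
    name: str
    tokens: list = field(default_factory=list)   # coloured tokens as dicts

@dataclass
class Transition:
    name: str                 # corresponds to a BPMN task
    inputs: list
    outputs: list
    guard: callable           # guard over token colour, e.g. location data

    def fire(self):
        for place in self.inputs:
            for token in list(place.tokens):
                if self.guard(token):
                    place.tokens.remove(token)
                    for out in self.outputs:
                        out.tokens.append(token)
                    return True
        return False

# Example: only process inspection requests located inside a target area.
pending = Place("pending", tokens=[{"case": 1, "lat": -27.47, "lon": 153.03}])
done = Place("inspected")
inspect = Transition("Inspect site", [pending], [done],
                     guard=lambda t: -28.0 < t["lat"] < -27.0)
inspect.fire()
```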
Abstract:
In an ever-evolving business landscape, change is an ever-present part of any organisation's lifecycle. This thesis presents communication as a fundamental element of effective change management. Drawing from the existing change communication literature and two case studies, this thesis examines how organisations utilise strategic change communication to manage identity change. As a result, this study presents a conceptual model that outlines a process of change communication strategy and implementation. This model is offered as a step toward connecting important scholarship into a more comprehensive portrait of change communication during identity change than has so far been available.
Abstract:
Stock indexes are passive 'value-weighted' portfolios and should not have alphas that are significantly different from zero. If an index produces an insignificant alpha, then significant alphas for equity funds using this index can be attributed solely to manager performance. However, recent literature suggests that US stock indexes can demonstrate significant alphas, which ultimately raises questions regarding equity fund manager performance both in the US and abroad. In this paper, we employ the Carhart four-factor model and newly available Asian-Pacific risk factors to generate alphas and risk factor loadings for eight Australian stock indexes from January 2004 to December 2012. We find that the initial full-sample-period analysis provides no indication of significant alphas in the indexes examined. However, by carrying out 36-month rolling regressions, we discover at least four significant alphas in seven of the eight indexes, as well as variability in the factor loadings. As previously reported in the US, this paper confirms similar issues with the four-factor model when using Australian stock indexes for performance benchmarking. In effectively measuring Australian equity fund manager performance, it is therefore essential to evaluate a fund's alpha and risk factors relative to the alpha and risk factors of the appropriate benchmark index.
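For context, the Carhart four-factor model regresses an index's excess return on market, size, value and momentum factors; the sketch below shows a 36-month rolling estimation of the alpha and factor loadings using statsmodels, with hypothetical column names (exret, mkt, smb, hml, wml) rather than the authors' data.

```python
# Rolling Carhart four-factor regression (illustrative sketch):
#   R_it - R_ft = alpha_i + b_MKT*MKT_t + b_SMB*SMB_t
#                 + b_HML*HML_t + b_WML*WML_t + e_it
# Assumes a monthly DataFrame with excess returns in 'exret' and factor
# columns 'mkt', 'smb', 'hml', 'wml'; column names are assumptions.
import statsmodels.api as sm
from statsmodels.regression.rolling import RollingOLS

def rolling_alphas(df, window=36):
    X = sm.add_constant(df[["mkt", "smb", "hml", "wml"]])
    res = RollingOLS(df["exret"], X, window=window).fit()
    # res.params holds the rolling alpha ('const') and factor loadings.
    return res.params
```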
Abstract:
We describe the development and parameterization of a grid-based model of African savanna vegetation processes. The model was developed with the objective of exploring elephant effects on the diversity of savanna species and structure, and in this formulation concentrates on the relative cover of grass and woody plants, the vertical structure of the woody plant community, and the distribution of these over space. Grid cells are linked by seed dispersal and fire, and environmental variability is included in the form of stochastic rainfall and fire events. The model was parameterized from an extensive review of the African savanna literature; where available, parameter values varied widely. The most plausible set of parameters produced long-term coexistence between woody plants and grass, with the tree-grass balance being more sensitive to changes in parameters influencing demographic processes and drought incidence and response, and less sensitive to the fire regime. There was considerable diversity in the woody structure of savanna systems within the range of uncertainty in tree growth rate parameters. Thus, given the paucity of height growth data for woody plant species in southern African savannas, managers of natural areas should be cognizant of the different growth and damage-response attributes of tree species when considering whether to act on perceived elephant threats to vegetation.
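The following is a highly simplified, hypothetical sketch of the kind of grid-based update such a model performs, with stochastic rainfall and fire and neighbour-based seed dispersal; all rules and parameter values are placeholders and are not drawn from the paper.

```python
# Simplified grid-cell update: rainfall-driven growth of grass and woody
# cover, neighbour-based seed dispersal, and stochastic fire that knocks
# back woody plants. Parameter values are placeholders only.
import numpy as np

rng = np.random.default_rng(0)
N = 50                                    # grid is N x N cells
grass = np.full((N, N), 0.5)              # fractional grass cover
woody = np.full((N, N), 0.2)              # fractional woody cover

def step(grass, woody):
    rain = rng.gamma(shape=2.0, scale=0.3)            # stochastic rainfall
    grass = np.clip(grass + 0.1 * rain * (1 - grass - woody), 0, 1)
    woody = np.clip(woody + 0.02 * rain * (1 - grass - woody), 0, 1)
    # seed dispersal: woody establishment depends on neighbouring cover
    neighbours = (np.roll(woody, 1, 0) + np.roll(woody, -1, 0) +
                  np.roll(woody, 1, 1) + np.roll(woody, -1, 1)) / 4
    woody = np.clip(woody + 0.005 * neighbours, 0, 1)
    if rng.random() < 0.2:                            # stochastic fire year
        burnt = rng.random((N, N)) < grass            # fire carried by grass
        woody = np.where(burnt, woody * 0.7, woody)   # top-kill of small trees
        grass = np.where(burnt, grass * 0.3, grass)
    return grass, woody

for _ in range(200):
    grass, woody = step(grass, woody)
```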
Abstract:
The DeLone and McLean (D&M) model (2003) has been broadly used and generally recognised as a useful model for gauging the success of IS implementations. However, it is not without limitations. In this study, we evaluate a model that extends the D&M model and attempts to address some of its limitations by providing a more complete measurement model of systems success. To that end, we augment the D&M (2003) model with three variables: business value, institutional trust, and future readiness. We propose that the addition of these variables allows systems success to be assessed at both the systems level and the business level. Consequently, we develop a measurement model rather than a structural or predictive model of systems success.
Abstract:
We develop a hybrid cellular automata model to describe the effect of the immune system and chemokines on a growing tumor. The hybrid cellular automata model consists of partial differential equations to model chemokine concentrations, and discrete cellular automata to model cell-cell interactions and changes. The computational implementation overlays these two components on the same spatial region. We present representative simulations of the model and show that increasing the number of immature dendritic cells (DCs) in the domain causes a decrease in the number of tumor cells. This result strongly supports the hypothesis that DCs can be used as a cancer treatment. Furthermore, we also use the hybrid cellular automata model to investigate the growth of a tumor in a number of computational cancer patients. Using these virtual patients, the model shows that increasing the number of DCs in the domain leads to longer survival. Not surprisingly, the model also reflects the fact that the parameter related to the tumor division rate plays an important role in tumor metastasis.
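A schematic, hypothetical sketch of the hybrid scheme is given below: a finite-difference update of a diffusing chemokine field (the PDE part) overlaid on cellular-automaton rules for tumour cells and dendritic cells on the same grid; the rules and parameter values are illustrative only.

```python
# Hybrid PDE/CA sketch: chemokine diffusion by finite differences, plus
# simple automaton rules for tumour division and DC-mediated killing.
# Rules and parameters are placeholders, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
N = 60
chemokine = np.zeros((N, N))
tumour = np.zeros((N, N), dtype=bool)
tumour[N // 2, N // 2] = True
dcs = rng.random((N, N)) < 0.02          # sparse immature dendritic cells

def laplacian(u):
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

def step(chemokine, tumour, dcs, D=0.1, secretion=1.0, decay=0.05,
         p_divide=0.08, p_kill=0.3):
    # PDE part: tumour cells secrete chemokine, which diffuses and decays.
    chemokine = chemokine + D * laplacian(chemokine) \
                + secretion * tumour - decay * chemokine
    # CA part: dividing tumour cells occupy a neighbouring site ...
    divisions = tumour & (rng.random((N, N)) < p_divide)
    axis, direction = divmod(rng.integers(0, 4), 2)
    tumour = tumour | np.roll(divisions, 1 if direction else -1, axis=axis)
    # ... and DCs in high-chemokine regions kill co-located tumour cells.
    activated = dcs & (chemokine > chemokine.mean())
    killed = tumour & activated & (rng.random((N, N)) < p_kill)
    tumour = tumour & ~killed
    return chemokine, tumour, dcs

for _ in range(100):
    chemokine, tumour, dcs = step(chemokine, tumour, dcs)
```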