851 results for business process deduction
Abstract:
I am sure you’ve heard it too: Green is the new Black. Black was the color of choice back in the days when Henry Ford introduced process standardization with his assembly line for the Ford Model T (over 15 million of these were sold!); today, Green is the color of choice for many business organizations, private and public. I am not talking about the actual color of their business shirts or their logo 2.0; I am referring to the eco-aware movement that has pushed sustainability into the top-ten list of business buzzwords. What used to be a boutique market for tourism and political activists has become the biggest business revolution since the e-commerce boom. Public and private organizations alike are pushing towards “sustainable” solutions and practices. That push is triggered partly by the immense reputational gains associated with branding your organization as “green”, and partly by emerging societal, legal, and regulatory pressures that force organizations to become more ecologically aware and sustainable. But the boom goes beyond organizational reality. Even in academia, sustainability has become a research “fashion wave” (see [1] if you are interested in research fashion waves), similar to the hype around neuroscience that our colleagues in the natural sciences are witnessing these days. Mind you, I’m a fan. A big fan, in fact. As academics, we are constantly searching for problem areas that offer an opportunity to do rigorous research (studies that are executed to perfection) on relevant topics (studies that have applied practical value and provide impact to the community). What better playground could there be than exploring the options that Business Process Management provides for creating a sustainable, green future? I’m getting excited just writing about this! So, join me in exploring some of the current thoughts on how BPM can contribute to the sustainability fashion parade, and let me introduce you to some of the works that scholars have produced recently in their attempts to identify solutions.
Abstract:
Research found that today’s organisations are increasingly aware of the potential barriers and perceived challenges associated with the successful delivery of change — including cultural and sub-cultural indifference; financial constraints; restricted timelines; insufficient senior management support; fragmented key stakeholder commitment; and inadequate training. The delivery and application of Innovative Change (see glossary) within a construction industry organisation tends to require a certain level of ‘readiness’. This readiness is the combination of an organisation’s ability to part from undertakings that may be old, traditional, or inefficient, and its ability to readily adopt a procedure or initiative which is new, improved, or more efficient. Despite the construction industry’s awareness of the various threats and opportunities associated with the delivery of change, research found that little attention is currently given to developing a ‘decision-making framework’ comprising measurable elements (dynamics) that may assist in more accurately determining an organisation’s level of readiness or ability to deliver innovative change. To resolve this, an initial Background Literature Review in 2004 identified six such dynamics, those of Change, Innovation, Implementation, Culture, Leadership, and Training and Education, which were then hypothesised to be key components of a ‘Conceptual Decision-making Framework’ (CDF) for delivering innovative change within an organisation. To support this hypothesis, a second (more extensive) Literature Review was undertaken from late 2007 to mid 2009. A Delphi study was embarked on in June 2008, inviting fifteen building and construction industry members to form a panel. The selection criteria required panel members to hold senior positions (manager and above) within a recognised field or occupation, and to have experience, understanding and/or knowledge of the process of delivering change within organisations. The final panel comprised nine representatives from private and public industry organisations and tertiary/research and development (R&D) universities. The Delphi study developed, distributed and collated two rounds of survey questionnaires over a four-month period, comprising open-ended and closed questions (referred to as factors). The first round of Delphi survey questionnaires was distributed to the panel in August 2008, asking members to rate the relevancy of the six hypothesised dynamics. In early September 2008, round-one responses were returned, analysed and documented. From this, an additional three dynamics were identified and confirmed by the panel as being highly relevant to the decision-making process when delivering innovative change within an organisation. The additional dynamics (‘Knowledge-sharing and Management’; ‘Business Process Requirements’; and ‘Life-cycle Costs’) were then added to the first six and used to populate the second (final) Delphi survey questionnaire. This was distributed to the same nine panel members in October 2008, this time asking them to rate the relevancy of all nine dynamics. In November 2008, round-two responses were returned, analysed, summarised and documented. Final results confirmed stability in responses and met Delphi study guidelines. The final contribution is twofold. Firstly, the findings confirm all nine dynamics as key components of the proposed CDF for delivering innovative change within an organisation.
Secondly, the future development and testing of an ‘Innovative Change Delivery Process’ (ICDP) is proposed, one that is underpinned by an ‘Innovative Change Decision-making Framework’ (ICDF), an ‘Innovative Change Delivery Analysis’ (ICDA) program, and an ‘Innovative Change Delivery Guide’ (ICDG).
Abstract:
The concept of system use has suffered from a "too simplistic definition" (DeLone and McLean [9], p. 16). This paper reviews various attempts at conceptualizing and measuring system use and then proposes a re-conceptualization of it as "the level of incorporation of an information system within a user's processes." We then go on to develop the concept of a Functional Interface Point (FIP) and four dimensions of system usage: automation level, the proportion of the business process encoded by the information system; extent, the proportion of the FIPs used by the business process; frequency, the rate at which FIPs are used by the participants in the process; and thoroughness, the level of use of the information/functionality provided by the system at an FIP. The article concludes with a discussion of some implications of this re-conceptualization and areas for follow-on research.
Abstract:
Intermediaries have introduced electronic services with varying success. One of the problems an intermediary faces is deciding what kind of exchange service it should offer to its customers and suppliers. For example, should it only provide a catalogue, or should it also enable customers to order products? Developing the right exchange design is a complex undertaking because of the many design options on the one hand and the interests of multiple actors to be considered on the other. This is far more difficult than simple prescriptions like ‘creating a win-win situation’ suggest. We address this problem by developing design patterns for the exchanges between customers, intermediary, and suppliers related to role, linkage, transparency, and novelty choices. To develop these design patterns, we studied four distinct electronic intermediaries and identified exchange design choices that require trade-offs relating to the interests of customers, intermediary, and suppliers. The exchange design patterns contribute to the development of design theory for electronic intermediaries by filling a gap between basic business models and detailed business process designs.
Abstract:
To facilitate the implementation of workflows, enterprise and workflow system vendors typically provide workflow templates for their software. Each of these templates depicts a variant of how the software supports a certain business process, allowing the user to save the effort of creating models and links to system components from scratch by selecting and activating the appropriate template. Combining the strengths of different templates is, however, only achievable by manually adapting the templates, which is cumbersome. In this paper we therefore suggest combining different workflow templates into a single configurable workflow template. Using the workflow modeling language of SAP’s WebFlow engine, we show how such a configurable workflow modeling language can be created by identifying the configurable elements in the original language. Requirements imposed on configurations prevent invalid configurations. Based on a default configuration, such configurable templates can be used as easily as traditional templates. The suggested approach is also applicable to other workflow modeling languages.
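To make the idea of a configurable template concrete, here is a minimal sketch in Python. It does not use SAP WebFlow’s actual modeling language or API; all names (ConfigurableStep, ConfigurableTemplate, the enabled/skipped settings) are hypothetical. The point is only to illustrate how requirements on configurations can rule out invalid choices, and how a default configuration makes the template usable as-is, like a traditional template.

```python
# Hypothetical illustration of a configurable workflow template;
# this is NOT SAP WebFlow's modeling language or API.
from dataclasses import dataclass

@dataclass
class ConfigurableStep:
    name: str
    configurable: bool = False   # only configurable elements may deviate
    default: str = "enabled"     # "enabled" keeps the step, "skipped" drops it

@dataclass
class ConfigurableTemplate:
    steps: list

    def configure(self, settings=None):
        """Derive a concrete workflow variant from a configuration.

        Unspecified elements fall back to their defaults, so an empty
        configuration immediately yields a valid variant; the requirement
        that only configurable elements may be changed blocks invalid
        configurations."""
        settings = settings or {}
        variant = []
        for step in self.steps:
            choice = settings.get(step.name, step.default)
            if choice != step.default and not step.configurable:
                raise ValueError(f"'{step.name}' is not configurable")
            if choice == "enabled":
                variant.append(step.name)
        return variant

template = ConfigurableTemplate([
    ConfigurableStep("receive request"),
    ConfigurableStep("manager approval", configurable=True),
    ConfigurableStep("execute request"),
])
print(template.configure())                                  # default variant
print(template.configure({"manager approval": "skipped"}))   # leaner variant
```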
Abstract:
There is increasing attention to the importance of Enterprise Systems (ES) and Information Systems (IS) for Small and Medium Enterprises (SMEs). The same attention must be reflected in the IS graduate curriculum. Studies reveal that despite healthy demand from industry for IS management expertise, most IS graduates are ill-equipped to meet the challenges of modern organizations. The majority of contemporary firms, represented by SMEs, seek employees with a balance of business process knowledge and ES software skills. This article describes a curriculum that teaches Information Technology (IT) and IS management concepts in an SME context. The curriculum conceptualises a ‘learn-by-doing’ approach to provide business process and ES software-specific knowledge for its students. The approach recommends coverage of traditional content related to SMEs’ operations, strategies, IT investment and management issues, while providing an increased focus on the strategic use of enterprise IT. The study addresses, to an extent, the perennial challenge of updating the IS curriculum given the rapid pace of technological change.
Abstract:
Web service technology is increasingly being used to build various e-Applications, in domains such as e-Business and e-Science. Characteristic benefits of web service technology are its interoperability, decoupling and just-in-time integration. Using web service technology, an e-Application can be implemented by web service composition — by composing existing individual web services in accordance with the business process of the application. This means the application is provided to customers in the form of a value-added composite web service. An important and challenging issue of web service composition is how to meet Quality-of-Service (QoS) requirements. These include customer-focused attributes such as response time, price, throughput and reliability, as well as how best to deliver QoS results for the composites, which in turn fulfils customers’ expectations and achieves their satisfaction. Fulfilling these QoS requirements, i.e. addressing the QoS-aware web service composition problem, is the focus of this project. From a computational point of view, QoS-aware web service composition can be transformed into diverse optimisation problems. These problems are characterised as complex, large-scale, highly constrained and multi-objective. We therefore use genetic algorithms (GAs) to address QoS-based service composition problems. More precisely, this study addresses three important subproblems of QoS-aware web service composition: QoS-based web service selection for a composite web service, accommodating constraints on inter-service dependence and conflict; QoS-based resource allocation and scheduling for multiple composite services on hybrid clouds; and performance-driven composite service partitioning for decentralised execution. Based on operations research theory, we model the three problems as a constrained optimisation problem, a resource allocation and scheduling problem, and a graph partitioning problem, respectively. Then, we present novel GAs to address these problems. We also conduct experiments to evaluate the performance of the new GAs. Finally, verification experiments are performed to show the correctness of the GAs. The major outcomes from the first problem are three novel GAs: a penalty-based GA, a min-conflict hill-climbing repairing GA, and a hybrid GA. These GAs adopt different strategies to handle constraints on inter-service dependence and conflict, a factor that has been largely ignored by existing algorithms and that can lead to the generation of infeasible composite services. Experimental results demonstrate the effectiveness of our GAs for handling the QoS-based web service selection problem with constraints on inter-service dependence and conflict, as well as their better scalability than the existing integer programming-based method for large-scale web service selection problems. The major outcomes from the second problem are two GAs: a random-key GA and a cooperative coevolutionary GA (CCGA). Experiments demonstrate the good scalability of the two algorithms. In particular, the CCGA scales well as the number of composite services involved in a problem increases, an ability no other algorithm demonstrates. The outcome of the third problem is a novel GA for composite service partitioning for decentralised execution. Compared with existing heuristic algorithms, the new GA is more suitable for large-scale composite web service program partitioning problems.
In addition, the GA outperforms existing heuristic algorithms, generating a better deployment topology for the decentralised execution of a composite web service. These effective and scalable GAs can be integrated into QoS-based management tools to facilitate the delivery of feasible, reliable and high-quality composite web services.
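As a concrete illustration of the penalty-based constraint handling described above, the following Python sketch evolves a selection of one candidate service per abstract task while penalising violations of inter-service dependence and conflict constraints. The candidate QoS values, the additive cost aggregation, the penalty weight and the GA operators are all simplifying assumptions for illustration, not the thesis’s actual encoding or parameters.

```python
# Minimal penalty-based GA for QoS-based service selection (illustrative only).
import random

# One candidate list per abstract task; each candidate = (price, response_time)
candidates = [
    [(4, 2.0), (6, 1.2), (5, 1.5)],
    [(3, 2.5), (7, 0.8)],
    [(5, 1.0), (4, 1.8), (6, 0.9)],
]
# Hypothetical inter-service constraints over chosen candidate indices:
# dependence: picking candidates[0][1] requires picking candidates[1][1]
dependence = [((0, 1), (1, 1))]
# conflict: candidates[1][0] and candidates[2][2] must not co-occur
conflict = [((1, 0), (2, 2))]
PENALTY = 100.0  # assumed weight added per violated constraint

def fitness(chromosome):
    """Lower is better: summed QoS cost plus a penalty per violation."""
    cost = sum(sum(candidates[t][c]) for t, c in enumerate(chromosome))
    violations = sum(1 for (a, i), (b, j) in dependence
                     if chromosome[a] == i and chromosome[b] != j)
    violations += sum(1 for (a, i), (b, j) in conflict
                      if chromosome[a] == i and chromosome[b] == j)
    return cost + PENALTY * violations

def evolve(pop_size=20, generations=50):
    """Tiny elitist GA with one-point crossover and point mutation."""
    pop = [[random.randrange(len(c)) for c in candidates]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # keep the better half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(candidates))
            child = a[:cut] + b[cut:]
            t = random.randrange(len(candidates))  # mutate one task's choice
            child[t] = random.randrange(len(candidates[t]))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best selection:", best, "fitness:", fitness(best))
```

Infeasible selections are never discarded outright; the penalty term simply makes them uncompetitive, which is the essence of the penalty-based strategy the abstract contrasts with repair-based (min-conflict hill-climbing) handling.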
Abstract:
DeLone and McLean (1992, p. 16) argue that the concept of “system use” has suffered from a “too simplistic definition.” Despite decades of substantial research on system use, the concept is yet to receive strong theoretical scrutiny, and many measures of system use have been developed idiosyncratically, lacking credibility or comparability. This paper reviews various attempts at conceptualizing and measuring system use and then proposes a re-conceptualization of it as “the level of incorporation of an information system within a user’s processes.” The definition is supported with the theory of work systems, systems, and Key-User-Group considerations. We then go on to develop the concept of a Functional-Interface-Point (FIP) and four dimensions of system usage: extent, the proportion of the FIPs used by the business process; frequency, the rate at which FIPs are used by the participants in the process; thoroughness, the level of use of the information/functionality provided by the system at an FIP; and attitude towards use, a set of measures that assess the level of comfort, degree of respect and the challenges set forth by the system. The paper argues that the automation level, the proportion of the business process encoded by the information system, has a mediating impact on system use. The article concludes with a discussion of some implications of this re-conceptualization and areas for follow-on research.
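The behavioural dimensions lend themselves to simple ratio measures. The sketch below shows one way they could be operationalised from a hypothetical log of FIP interactions; the log format, the observation window and the numeric scoring of thoroughness are illustrative assumptions, since the paper defines the dimensions conceptually rather than as code.

```python
# Illustrative operationalisation of the proposed usage dimensions from a
# hypothetical event log; the format and numbers are made up for this sketch.

# Functional-Interface-Points (FIPs) the system offers to the business process
all_fips = {"create_order", "approve_order", "ship_order", "invoice"}

# One record per interaction: (participant, FIP, depth of use in [0, 1],
# where depth scores how much of the FIP's functionality was exercised)
log = [
    ("alice", "create_order", 0.9),
    ("bob",   "create_order", 0.5),
    ("alice", "approve_order", 0.7),
    ("bob",   "invoice", 0.4),
]
observation_days = 10
participants = {who for who, _, _ in log}

used_fips = {fip for _, fip, _ in log}
extent = len(used_fips) / len(all_fips)                  # share of FIPs used
frequency = len(log) / (observation_days * len(participants))  # uses per participant-day
thoroughness = sum(depth for _, _, depth in log) / len(log)    # mean depth of use

print(f"extent={extent:.2f}, frequency={frequency:.2f}, "
      f"thoroughness={thoroughness:.2f}")
```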
Abstract:
Despite promising benefits and advantages, there are reports of failures and low realisation of benefits in Enterprise System (ES) initiatives. Among the research on the factors that influence ES success, there is a dearth of studies on the knowledge implications of multiple end-user groups using the same ES application. An ES facilitates the work of several user groups, ranging from strategic management, through management, to operational staff, all using the same system for multiple objectives. Given the fundamental characteristics of ES – integration of modules, business process views, and aspects of information transparency – it is necessary that all frequent end-users share a reasonable amount of common knowledge and integrate their knowledge to yield new knowledge. Recent literature on ES implementation highlights the importance of Knowledge Integration (KI) for implementation success. Unfortunately, the importance of KI is often overlooked, and little is known about the role of KI in ES success. Many organisations do not achieve the potential benefits from their ES investment because they do not consider the need for, or their ability to achieve, the integration of their employees’ knowledge. This study is designed to improve our understanding of the influence of KI among ES end-users on operational ES success. The three objectives of the study are: (I) to identify and validate the antecedents of KI effectiveness; (II) to investigate the impact of KI effectiveness on the goodness of individuals’ ES-knowledge base; and (III) to examine the impact of the goodness of individuals’ ES-knowledge base on operational ES success. For this purpose, we employ the KI factors identified by Grant (1996) and the IS-impact measurement model from the work of Gable et al. (2008) to examine ES success. The study derives its findings from data gathered from six Malaysian companies in order to attain the three-fold goal outlined above. The relationships between the antecedents of KI effectiveness and its consequences are tested using 188 survey responses representing the views of management and operational employment cohorts. Using statistical methods, we confirm three antecedents of KI effectiveness and validate the consequences of these antecedents for ES success. The findings demonstrate a statistically significant positive impact of KI effectiveness on ES success, with KI effectiveness contributing almost one-third of ES success. This research makes a number of contributions to the understanding of the influence of KI on ES success. First, based on the empirical work using a complete nomological net model, the role of KI effectiveness in ES success is evidenced. Second, the model provides a theoretical lens for a more comprehensive understanding of the impact of KI on the level of ES success. Third, restructuring the dimensions of the knowledge-based theory to fit the context of ES extends its applicability and generalisability to contemporary Information Systems. Fourth, the study develops and validates measures for the antecedents of KI effectiveness. Fifth, the study demonstrates the statistically significant positive influence of the goodness of KI on ES success. From a practical viewpoint, this study emphasises the importance of KI effectiveness as a direct antecedent of ES success. Practical lessons can be drawn from this study to empirically identify the critical factors among the antecedents of KI effectiveness that should be given attention.
Abstract:
In keeping with the proliferation of free software development initiatives and the increased interest in the business process management domain, many open source workflow and business process management systems have appeared during the last few years and are now under active development. This upsurge gives rise to two important questions: What are the capabilities of these systems, and how do they compare to each other and to their closed source counterparts? In other words: what is the state of the art in the area? To gain insight into these questions, we have conducted an in-depth analysis of three of the major open source workflow management systems – jBPM, OpenWFE, and Enhydra Shark – the results of which are reported here. This analysis is based on the workflow patterns framework and continues the series of evaluations performed using the same framework on closed source systems, business process modelling languages, and web-service composition standards. The results of the evaluations of the three open source systems are compared with each other and with the results of evaluations of three representative closed source systems: Staffware, WebSphere MQ, and Oracle BPEL PM. The overall conclusion is that open source systems are targeted more toward developers than business analysts. They generally provide less support for the patterns than closed source systems, particularly with respect to the resource perspective, i.e. the various ways in which work is distributed amongst business users and managed through to completion.
Abstract:
A service-oriented system is composed of independent software units, namely services, that interact with one another exclusively through message exchanges. The proper functioning of such a system depends on whether or not each individual service behaves as the other services expect it to behave. Since services may be developed and operated independently, it is unrealistic to assume that this is always the case. This article addresses the problem of checking and quantifying how much the actual behavior of a service, as recorded in message logs, conforms to the expected behavior as specified in a process model. We consider the case where the expected behavior is defined using the BPEL industry standard (Business Process Execution Language for Web Services). BPEL process definitions are translated into Petri nets, and Petri net-based conformance checking techniques are applied to derive two complementary indicators of conformance: fitness and appropriateness. The approach has been implemented in a toolset for business process analysis and mining, namely ProM, and has been tested in an environment comprising multiple Oracle BPEL servers.
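For intuition, the fitness indicator in this style of conformance checking is typically computed by replaying the log on the Petri net and counting token mismatches, following Rozinat and van der Aalst’s definition as implemented in ProM’s conformance checker. The sketch below applies that formula to made-up replay counts; a real analysis would obtain the counts by replaying the message logs on the net derived from the BPEL definition.

```python
# Token-replay fitness in the style of ProM's conformance checker;
# the replay totals below are invented for illustration.

def replay_fitness(produced, consumed, missing, remaining):
    """fitness = 1/2*(1 - missing/consumed) + 1/2*(1 - remaining/produced)

    'missing' tokens had to be created artificially so that a logged
    activity could fire; 'remaining' tokens were left in the net after
    replay. A log that fits the model perfectly yields fitness 1.0."""
    return 0.5 * (1 - missing / consumed) + 0.5 * (1 - remaining / produced)

# Hypothetical totals from replaying logged service conversations on the
# Petri net derived from a BPEL process definition
print(replay_fitness(produced=420, consumed=420, missing=12, remaining=9))
```

Appropriateness, the complementary indicator, instead asks whether the model describes the observed behavior in a sufficiently precise and structurally suitable way; it is not captured by this single ratio.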