908 results for Collaborative business processes


Relevance: 100.00%

Publisher:

Abstract:

Companies must not see e-Business as a panacea but should instead assess the specific impact of implementing e-Business on their business from both an internal and an external perspective. E-Business is promoted as increasing the speed of response and reducing costs locally, but these benefits must be assessed for the wider business rather than as local improvements. This paper argues that any assessment must include quantitative analysis covering the physical as well as the information flows within a business. It is noted that as business processes are e-enabled their structure does not significantly change, and it is only through the use of modelling techniques that the operational impact can be ascertained. The paper reviews techniques appropriate for this type of analysis, as well as specific modelling tools and applications, and from this review derives a set of requirements for e-Business process modelling.

Relevance: 100.00%

Publisher:

Abstract:

Benchmarking techniques have evolved since Xerox’s pioneering visits to Japan in the late 1970s, and the focus of benchmarking has shifted over the same period. Tracing in detail the evolution of benchmarking in one specific area of business activity, supply and distribution management, as seen by the participants in that evolution, creates a picture of a movement from single-function, cost-focused, competitive benchmarking, through cross-functional, cross-sectoral, value-oriented benchmarking, to process benchmarking. As process efficiency and effectiveness become the primary foci of benchmarking activities, the measurement parameters used to benchmark performance converge with the factors used in business process modelling. The possibility is therefore emerging of modelling business processes and then feeding the models with actual data from benchmarking exercises. This would overcome the most common criticism of benchmarking, namely that it intrinsically lacks the ability to move beyond current best practice. Indeed, the combined power of modelling and benchmarking may prove to be the basic building block of informed business process re-engineering.

Relevance: 100.00%

Publisher:

Abstract:

Knowledge has been a subject of interest and inquiry for thousands of years, since at least the time of the ancient Greeks, and no doubt even before that. “What is knowledge?” continues to be an important topic of discussion in philosophy. More recently, interest in managing knowledge has grown in step with the perception that we increasingly live in a knowledge-based economy. Drucker (1969) is usually credited as the first to popularize the knowledge-based economy concept, linking the importance of knowledge with rapid technological change. Karl Wiig coined the term knowledge management (hereafter KM) for a NATO seminar in 1986, and its popularity took off following the publication of Nonaka and Takeuchi’s book “The Knowledge Creating Company” (Nonaka & Takeuchi, 1995). Knowledge creation is in fact just one of many activities involved in KM; others include sharing, retaining, refining, and using knowledge, and there are many such lists of activities (Holsapple & Joshi, 2000; Probst, Raub, & Romhardt, 1999; Skyrme, 1999; Wiig, De Hoog, & Van der Spek, 1997). Both academic and practical interest in KM has continued to increase throughout the last decade. This article first outlines the different types of knowledge, then discusses the various routes by which knowledge management can be implemented, advocating a process-based route. It goes on to explain how people, processes, and technology need to fit together for effective KM, gives some examples of this route in use, and finally looks towards the future.

Relevance: 100.00%

Publisher:

Abstract:

With new and emerging e-business technologies available to transform business processes, it is important to understand how those technologies will affect the performance of a business. Will the overall business process be cheaper, faster and more accurate, or will a sub-optimal change have been implemented? The use of simulation to model the behaviour of business processes is well established, and it has been applied to e-business processes to understand their performance in terms of measures such as lead-time, cost and responsiveness. This paper introduces the concept of simulation components that enable simulation models of e-business processes to be built quickly from generic e-business templates. The paper describes how these components were devised and presents results from their application in case studies.
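The component idea described in this abstract can be sketched in miniature: parameterised, reusable process-step templates that are chained into a process model and run to estimate lead-time and cost. This is a minimal illustration only; the class names, parameters, and the exponential service-time assumption are invented here and are not the paper's actual tool.

```python
import random

class ProcessComponent:
    """A generic, reusable template for one step of an e-business process."""

    def __init__(self, name, mean_time, cost_per_run):
        self.name = name
        self.mean_time = mean_time        # mean processing time (hours)
        self.cost_per_run = cost_per_run  # cost incurred per execution

    def run(self, rng):
        # Assume exponentially distributed service time around the mean.
        return rng.expovariate(1.0 / self.mean_time), self.cost_per_run

def simulate(process, n_orders=1000, seed=42):
    """Push n_orders through the chained components; report mean lead-time and cost."""
    rng = random.Random(seed)
    total_time = total_cost = 0.0
    for _ in range(n_orders):
        for step in process:
            t, c = step.run(rng)
            total_time += t
            total_cost += c
    return total_time / n_orders, total_cost / n_orders

# Compose a model of a simple e-enabled order process from the templates.
order_process = [
    ProcessComponent("web order capture", 0.1, 0.05),
    ProcessComponent("automated credit check", 0.5, 0.20),
    ProcessComponent("fulfilment", 8.0, 2.50),
]
mean_lead_time, mean_cost = simulate(order_process)
print(f"mean lead-time {mean_lead_time:.2f} h, mean cost {mean_cost:.2f}")
```

Because each component carries its own timing and cost parameters, swapping a manual step for an e-enabled one (a different component in the list) immediately shows the effect on the overall process measures rather than just the local step.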

Relevance: 100.00%

Publisher:

Abstract:

This paper develops a structured method, from the perspective of value, to organise and optimise the business processes of a product servitised supply chain (PSSC). The method integrates the e3value modelling tool with associated value measurement, evaluation and analysis techniques, enabling the business processes of a PSSC to be visualised, modelled and optimised. At the same time, value co-creation and the potential contribution to an organisation’s profitability can be enhanced. The findings not only assist organisations attempting to adopt servitisation by helping them avert the servitisation paradox, but also help a servitised organisation to identify its key business processes and to clarify their influence on supply chain operations.

Relevance: 100.00%

Publisher:

Abstract:

This paper considers the problem of managing business processes with a changeable structure and offers a situation-based approach to solving it. The approach rests on a situational model of business process management, in which a process is represented as a set of situations, each connected with a script defining the necessary actions. The process is managed by means of rules that formalize the functional requirements placed on processes.
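The situational model described above can be sketched as a simple dispatch: rules map the current process state to a situation, and each situation is connected with a script of actions. All names and the order-approval scenario are invented for illustration; the paper's own formalism is not reproduced here.

```python
# Scripts: the actions a situation requires (illustrative only).
def approve(order):
    order["status"] = "approved"

def escalate(order):
    order["status"] = "escalated"

# Rules formalising functional requirements: each maps a condition
# on the process state to a situation.
rules = [
    (lambda o: o["amount"] <= 1000, "routine"),
    (lambda o: o["amount"] > 1000, "exceptional"),
]

# Each situation is connected with the script defining necessary actions.
scripts = {
    "routine": [approve],
    "exceptional": [escalate],
}

def manage(order):
    """Select the situation matching the current state and run its script."""
    for condition, situation in rules:
        if condition(order):
            for action in scripts[situation]:
                action(order)
            return situation

order = {"amount": 2500, "status": "new"}
situation = manage(order)
print(situation, order["status"])  # exceptional escalated
```

Because the process is driven by the rule set rather than a fixed control flow, changing the process structure amounts to editing the rules and scripts, which is the flexibility the situational approach is after.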

Relevance: 100.00%

Publisher:

Abstract:

The sharing of near real-time traceability knowledge in supply chains plays a central role in coordinating business operations and is a key driver of their success. However, before traceability datasets received from external partners can be integrated with datasets generated internally within an organisation, they need to be validated against information recorded for the physical goods received, as well as against bespoke rules defined to ensure uniformity, consistency and completeness within the supply chain. In this paper, we present a knowledge-driven framework for the runtime validation of critical constraints on incoming traceability datasets encapsulated as EPCIS event-based linked pedigrees. The constraints are defined using SPARQL queries and SPIN rules. We present a novel validation architecture based on integrating the Apache Storm framework for real-time, distributed computation with popular Semantic Web/Linked Data libraries, and we exemplify our methodology on an abstraction of the pharmaceutical supply chain.
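The kinds of checks this abstract describes can be illustrated without the Semantic Web stack: the paper expresses its constraints as SPARQL queries and SPIN rules over linked pedigrees, but the same completeness and consistency logic can be shown on a plain-dict stand-in for an EPCIS event. The field names and the validation rules below are simplified assumptions, not the paper's actual constraint set.

```python
from datetime import datetime, timezone

# Completeness rule: fields every incoming event must carry
# (a simplified stand-in for the EPCIS event schema).
REQUIRED = {"eventTime", "epcList", "bizStep", "readPoint"}

def validate_event(event, received_epcs):
    """Validate one incoming traceability event against the goods
    physically received; return a list of constraint violations."""
    errors = []

    # Completeness: all required fields must be present.
    missing = REQUIRED - event.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")

    # Consistency: every EPC claimed in the dataset must match an EPC
    # recorded for the physical goods actually received.
    unknown = set(event.get("epcList", [])) - received_epcs
    if unknown:
        errors.append(f"EPCs not physically received: {sorted(unknown)}")

    # Uniformity: the event time must not lie in the future.
    t = event.get("eventTime")
    if t is not None and t > datetime.now(timezone.utc):
        errors.append("eventTime is in the future")

    return errors

event = {
    "eventTime": datetime(2014, 3, 1, tzinfo=timezone.utc),
    "epcList": ["urn:epc:id:sgtin:0614141.107346.2018"],
    "bizStep": "shipping",
    "readPoint": "urn:epc:id:sgln:0614141.00777.0",
}
print(validate_event(event, {"urn:epc:id:sgtin:0614141.107346.2018"}))  # []
```

In the paper's architecture the equivalent of `validate_event` would run inside a Storm topology, so each incoming pedigree is checked as it streams in rather than in batch.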

Relevance: 100.00%

Publisher:

Abstract:

The EPCIS specification provides an event-oriented mechanism to record product movement information across stakeholders in supply chain business processes. Besides enabling the sharing of event-based traceability datasets, track-and-trace implementations must also be equipped with the capability to validate integrity constraints and detect runtime exceptions without compromising the time-to-deliver schedule of the shipping and receiving parties. In this paper, we present a methodology for detecting exceptions arising during the processing of EPCIS event datasets. We propose an extension to the EEM ontology for modelling EPCIS exceptions and show how runtime exceptions can be detected and reported. We exemplify and evaluate our approach on an abstraction of pharmaceutical supply chains.

Relevance: 100.00%

Publisher:

Abstract:

Supply chains comprise complex processes spanning multiple trading partners. The various operations involved generate a large number of events that need to be integrated in order to enable internal and external traceability. Furthermore, provenance of the artifacts and agents involved in supply chain operations is now a key traceability requirement. In this paper, we propose a Semantic Web/Linked Data powered framework for the event-based representation and analysis of supply chain activities governed by the EPCIS specification. We specifically show how a new EPCIS event type called "Transformation Event" can be semantically annotated using EEM, the EPCIS Event Model, to generate linked data that can be exploited for internal event-based traceability in supply chains involving transformation of products. For integrating provenance with traceability, we propose a mapping from EEM to PROV-O. We exemplify our approach on an abstraction of the production processes that are part of the wine supply chain.

Relevance: 100.00%

Publisher:

Abstract:

Coordination of business processes is the management of dependencies, where dependencies constrain how tasks are performed. It has traditionally been done in an intuitive fashion, without paying much attention to the coordination load, defined here as the ratio between the time spent on coordination activities and the total task time. Previous efforts to understand and analyze coordination have resulted in mostly qualitative approaches to categorizing and recommending coordination strategies. This research seeks to answer two questions: (1) How can we analyze process coordination problems to improve overall performance? (2) What guidance can we provide to reduce the coordination load of a process and consequently improve the organization's performance? To that end, this effort developed a quantitative measure for the coordination load of business processes and a methodology for applying that measure.

This effort used a management simulation game to provide a controlled laboratory environment, enabling the manipulation of the task factors variability, analyzability, and interdependence in order to measure their impact on coordination load. The hypothesis was that the more variable, non-analyzable, and interdependent a process, the higher the coordination load, and that a higher coordination load would have a negative impact on performance. Coordination load was measured via the surrogate coordination time, and performance via profit.

A 2² × 3¹ full factorial design, with two replicates, was run to observe the impact on the variables coordination time and profit. Validated spreadsheets and questionnaires were used as the data collection instruments for each scenario. The experimental results indicate that lower task analyzability (p = 0.036) and higher task interdependence (p = 0.000) lead to higher coordination load, and that higher levels of task variability (p = 0.049) lead to lower performance. However, contrary to the hypotheses postulated by this work, coordination load did not prove to be a strong predictor of performance (correlation of -0.086).

These findings from the laboratory experiment, together with other lessons learned, were incorporated into a quantitative measure, a survey instrument for gathering data on the variables in the measure, and a methodology to quantify the coordination load of production business processes. The practicality of the methodology is demonstrated with an example.