736 results for Open Business Model


Relevance: 40.00%

Abstract:

This paper takes its root in a trivial observation: management approaches are unable to provide relevant guidelines for coping with the uncertainty and trust of our modern world. Managers therefore seek to reduce uncertainty through information-supported decision-making sustained by ex-ante rationalization. They strive for the best possible solution, stability, predictability, and control of the "future". Hence, they turn to a plethora of "prescriptive panaceas" and "management fads" that promise simple solutions through best practices. However, these solutions are ineffective: they address only one part of a system (e.g. an organization) instead of the whole, and they miss the interactions and interdependencies with other parts, leading to "suboptimization". Classical cause-and-effect investigations are not very helpful in this regard. Where do we go from here? In this conversation, we want to challenge the assumptions supporting traditional management approaches and shed some light on the problem of management discourse fads, using the concept of maturity and maturity models in the context of temporary organizations as a support for reflection. The global economy is characterized by the use and development of standards, and compliance with standards as a practice is said to enable better managerial decision-making under uncertainty, control of complexity, and higher performance. Among this plethora of standards, organizational maturity and maturity models hold a specific place, owing to a general belief that organizational performance depends on the continuous improvement of (business) processes, grounded in a kind of evolutionary metaphor. Our intention is neither to offer a new "evidence-based management fad" to practitioners, nor to suggest a research gap to scholars.
Rather, we want to open an assumption-challenging conversation with regard to mainstream approaches (neo-classical economics and organization theory), turning "our eyes away from the blinding light of eternal certitude towards the refracted world of turbid finitude" (Long, 2002, p. 44), generating what Bernstein has named "Cartesian Anxiety" (Bernstein, 1983, p. 18), and to revisit the conceptualization of maturity and maturity models. We rely on conventions theory and a systemic-discursive perspective; these two lenses share information & communication and self-producing systems as common threads. Furthermore, the narrative approach is well suited to exploring complex ways of thinking about organizational phenomena as complex systems. This approach fits our object of curiosity, the concept of maturity and maturity models, since maturity models (as standards) are discourses and systems of regulation. The main contribution of this conversation is the suggestion to move from a neo-classical "theory of the game", which aims to make the complex world simpler by playing the game, to a "theory of the rules of the game", which aims to influence and challenge the rules constitutive of maturity models (conventions, governing systems), reconciling individual calculation with the social context and making possible the coordination of relationships and the cooperation of agents with divergent or potentially divergent interests and values. A second contribution is the reconceptualization of maturity as a structural coupling between conventions, rather than as an independent variable leading to organizational performance.

Relevance: 40.00%

Abstract:

This case study examines the way in which Knowledge Unlatched is combining collective action and open access licenses to encourage innovation in markets for specialist academic books. Knowledge Unlatched is a not-for-profit organisation established to help a global community of libraries coordinate their book purchasing activities more effectively and, in so doing, to ensure that the books librarians select for their own collections become available for free for anyone in the world to read. The Knowledge Unlatched model is an attempt to re-coordinate a market in order to facilitate a transition to digitally appropriate publishing models that include open access. It offers librarians an opportunity to facilitate the open access publication of books that their own readers would value access to. It provides publishers with a stable income stream on titles selected by libraries, as well as the ability to continue selling books to a wider market on their own terms. Knowledge Unlatched provides a rich case study for researchers and practitioners interested in understanding how innovations in procurement practices can be used to stimulate more effective, equitable markets for socially valuable products.

Relevance: 40.00%

Abstract:

An encryption scheme is non-malleable if giving an adversary an encryption of a message does not increase its chances of producing an encryption of a related message (under a given public key). Fischlin introduced a stronger notion, known as complete non-malleability, which requires attackers to have negligible advantage even if they are allowed to transform the public key under which the related message is encrypted. Ventre and Visconti later proposed a comparison-based definition of this security notion, more in line with the well-studied definitions of Bellare et al. The same authors also provided additional feasibility results in the form of two constructions of completely non-malleable schemes: one in the common reference string model using non-interactive zero-knowledge (NIZK) proofs, and another using interactive encryption schemes. However, the only previously known completely non-malleable (and non-interactive) scheme in the standard model is quite inefficient, as it relies on the generic NIZK approach, and the existence of efficient schemes in the common reference string model was left as an open problem. More recently, two efficient public-key encryption schemes were proposed by Libert and Yung and by Barbosa and Farshim, both based on pairing-based identity-based encryption. At ACISP 2011, Sepahi et al. proposed a method for achieving completely non-malleable public-key encryption using lattices, but gave no security proof for the proposed scheme. In this paper we review that scheme and provide its security proof in the standard model. Our study shows that Sepahi's scheme remains secure even in a post-quantum world, since no known quantum algorithm for lattice problems performs significantly better than the best known classical (i.e., non-quantum) algorithms.
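To make the baseline notion concrete, the following is a classic textbook illustration (not taken from the paper) of why unpadded RSA is malleable: an adversary can turn a ciphertext into an encryption of a related message without ever learning the plaintext, which is exactly the behaviour non-malleable schemes rule out.

```python
# Classic textbook illustration (not from the paper): unpadded RSA is
# malleable. Tiny toy parameters only; real RSA uses large primes and
# randomized padding.

p, q = 61, 53
n = p * q                   # modulus 3233
e, d = 17, 2753             # e*d ≡ 1 (mod lcm(p-1, q-1))

m = 65
c = pow(m, e, n)            # Enc(m)

# Without knowing m, an adversary maps Enc(m) to Enc(2*m) by
# multiplying in an encryption of 2 (RSA is multiplicatively homomorphic):
c_mauled = (c * pow(2, e, n)) % n
print(pow(c_mauled, d, n))  # decrypts to 2*m = 130
```

The mauled ciphertext decrypts to a message related to the original in a predictable way, demonstrating the attack that (complete) non-malleability definitions are designed to exclude.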

Relevance: 40.00%

Abstract:

This thesis presents novel techniques for addressing the problems of continuous change and inconsistencies in large process model collections. The developed techniques treat process models as a collection of fragments and facilitate version control, standardization and automated process model discovery using fragment-based concepts. Experimental results show that the presented techniques are beneficial in consolidating large process model collections, specifically when there is a high degree of redundancy.

Relevance: 40.00%

Abstract:

In order to execute, study, or improve operating procedures, companies document them as business process models. Often, business process analysts capture every single exception-handling or alternative task-handling scenario within a model. This tendency results in large process specifications, in which the core process logic becomes hidden among numerous modeling constructs. To fulfill different tasks, companies develop several model variants of the same business process at different abstraction levels. Maintenance of such model groups then involves a lot of synchronization effort and is error-prone. We propose an abstraction technique that allows generalization of process models. Business process model abstraction assumes a detailed model of a process to be available and derives coarse-grained models from it. The task of abstraction is to tell significant model elements from insignificant ones and to reduce the latter. We propose to learn insignificant process elements from supplementary model information, e.g., task execution time or frequency of task occurrence. Finally, we discuss a mechanism that gives the user control of the model abstraction level: an abstraction slider.
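The slider idea can be sketched in a few lines. Everything below is an illustrative assumption (task names, frequencies, and a simple frequency-threshold rule), not the paper's actual abstraction algorithm:

```python
# Illustrative sketch: abstract a process model by dropping tasks whose
# relative execution frequency falls below a slider threshold. The model
# and the threshold rule are hypothetical examples only.

def abstract_model(tasks, slider):
    """Keep only tasks whose share of executions meets the slider value
    (0.0 = full detail, higher values = coarser view)."""
    total = sum(freq for _, freq in tasks)
    return [name for name, freq in tasks if freq / total >= slider]

process = [
    ("receive order", 100),
    ("check stock", 95),
    ("handle address-typo exception", 2),   # rare exception path
    ("ship order", 93),
]

print(abstract_model(process, 0.0))   # every task survives
print(abstract_model(process, 0.05))  # the rare exception path is abstracted away
```

Moving the slider from 0.0 upward progressively removes infrequent exception handling, leaving the core process logic visible.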

Relevance: 40.00%

Abstract:

Empirical evidence shows that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication arises for example when the repository covers multiple variants of the same processes or due to copy-pasting. Previous work has addressed the problem of efficiently retrieving exact clones that can be refactored into shared subprocess models. This article studies the broader problem of approximate clone detection in process models. The article proposes techniques for detecting clusters of approximate clones based on two well-known clustering algorithms: DBSCAN and Hierarchical Agglomerative Clustering (HAC). The article also defines a measure of standardizability of an approximate clone cluster, meaning the potential benefit of replacing the approximate clones with a single standardized subprocess. Experiments show that both techniques, in conjunction with the proposed standardizability measure, accurately retrieve clusters of approximate clones that originate from copy-pasting followed by independent modifications to the copied fragments. Additional experiments show that both techniques produce clusters that match those produced by human subjects and that are perceived to be standardizable.
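A minimal sketch of the underlying idea, under simplifying assumptions: fragments are reduced to sets of task labels, similarity is plain Jaccard distance, and grouping is a greedy single-linkage pass. The fragments, threshold, and grouping rule are all invented for illustration; the article's actual techniques use DBSCAN and HAC over process model fragments.

```python
# Illustrative sketch of approximate clone detection: cluster process
# fragments (modelled as sets of task labels) whose Jaccard distance is
# small. Data and threshold are hypothetical.

def jaccard_distance(a, b):
    """1 minus the share of labels the two fragments have in common."""
    return 1.0 - len(a & b) / len(a | b)

def cluster_fragments(fragments, max_distance=0.4):
    """Greedy single-linkage grouping: a fragment joins a cluster if it
    is close to any member; otherwise it starts a new cluster."""
    clusters = []
    for name, labels in fragments.items():
        for cluster in clusters:
            if any(jaccard_distance(labels, fragments[m]) <= max_distance
                   for m in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

fragments = {
    "F1": {"check invoice", "approve payment", "archive"},
    "F2": {"check invoice", "approve payment", "archive", "notify vendor"},
    "F3": {"hire employee", "assign desk"},
}
print(cluster_fragments(fragments))  # F1 and F2 are approximate clones
```

F1 and F2 differ by one copied-then-modified task and land in one cluster, which would be a candidate for replacement by a single standardized subprocess.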

Relevance: 40.00%

Abstract:

In-memory databases have become a mainstay of enterprise computing, offering significant performance and scalability boosts for online analytical and (to a lesser extent) transactional processing, as well as improved prospects for integration across different applications through an efficient shared database layer. Significant research and development has been undertaken over several years concerning the data management considerations of in-memory databases. However, limited insights are available on the impacts on applications and their supportive middleware platforms, and on how these need to evolve to fully function through, and leverage, in-memory database capabilities. This paper provides a first, comprehensive exposition of how in-memory databases impact Business Process Management, as a mission-critical and exemplary model-driven integration and orchestration middleware. Through it, we argue that in-memory databases will render some prevalent uses of legacy BPM middleware obsolete, but also open up exciting possibilities for tighter application integration, better process automation performance and some entirely new BPM capabilities such as process-based application customization. To validate the feasibility of in-memory BPM, we develop a surprisingly simple BPM runtime embedded in SAP HANA that provides BPMN-based process automation capabilities.

Relevance: 40.00%

Abstract:

This paper proposes a new multi-resource multi-stage scheduling problem for optimising the open-pit drilling, blasting and excavating operations under equipment capacity constraints. The flow process is analysed based on real-life data from an Australian iron ore mine site. The objective of the model is to maximise the throughput and minimise the total idle times of equipment at each stage. The following comprehensive mining attributes and constraints have been considered: types of equipment; operating capacities of equipment; ready times of equipment; speeds of equipment; block-sequence-dependent movement times of equipment; equipment-assignment-dependent operation times of blocks; distances between each pair of blocks; due windows of blocks; material properties of blocks; swell factors of blocks; and slope requirements of blocks. The problem is formulated as a mixed integer program and solved with the ILOG CPLEX optimiser. The proposed model is validated with extensive computational experiments to improve mine production efficiency at the operational level. The model also provides an intelligent decision support tool to account for the availability and usage of equipment units for the drilling, blasting and excavating stages.
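The idle-time objective can be illustrated with a deliberately tiny stand-in: three blocks flowing through one drill, one blaster and one excavator, with made-up durations and a brute-force search over block orders. This is a toy for intuition only, not the paper's mixed integer program.

```python
# Toy sketch of the scheduling objective: three stages (drill, blast,
# excavate), one machine per stage, minimise total machine idle time.
# Durations and block names are invented for illustration.

from itertools import permutations

# per-block processing times at (drill, blast, excavate)
blocks = {"B1": (3, 2, 4), "B2": (1, 4, 2), "B3": (2, 1, 3)}

def idle_time(order):
    """Total time each stage's machine waits between consecutive jobs."""
    finish = [0, 0, 0]   # finish time of each stage's machine
    idle = 0
    for b in order:
        ready = 0        # when this block is ready for the next stage
        for stage, dur in enumerate(blocks[b]):
            begin = max(ready, finish[stage])
            if finish[stage] > 0:              # machine already started
                idle += begin - finish[stage]  # gap = waiting for work
            finish[stage] = begin + dur
            ready = finish[stage]
    return idle

best = min(permutations(blocks), key=idle_time)
print(best, idle_time(best))
```

The real model adds equipment ready times, movement times, due windows and the other constraints listed above, which is why it needs a MIP solver rather than enumeration.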

Relevance: 40.00%

Abstract:

Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
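The "continuously compared and aligned" idea can be sketched as a minimal replay check: an event log is replayed against a toy transition-system model, and the share of explained traces measures how well model and reality agree. The model, log and fitness notion below are invented for illustration; the chapter targets whole process model collections.

```python
# Hedged sketch of comparing observed and modeled behavior: replay an
# event log against a toy transition-system model. All data is made up.

model = {  # allowed direct successions, with explicit start/end markers
    "start":    {"register"},
    "register": {"approve", "reject"},
    "approve":  {"end"},
    "reject":   {"end"},
}

def fits(trace):
    """True if the trace is a complete walk through the model."""
    path = ["start"] + trace + ["end"]
    return all(b in model.get(a, set()) for a, b in zip(path, path[1:]))

log = [["register", "approve"], ["register", "reject"], ["approve"]]
fitness = sum(fits(t) for t in log) / len(log)
print(f"log fitness: {fitness:.2f}")
```

A "liquid" collection would react to a low fitness value by adapting the model (or flagging the deviating behavior), keeping models in sync with the logs.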

Relevance: 40.00%

Abstract:

This special issue of Cultural Science Journal is devoted to the report of a groundbreaking experiment in re-coordinating global markets for specialist scholarly books and enabling the knowledge commons: the Knowledge Unlatched proof-of-concept pilot. The pilot took place between January 2012 and September 2014. It involved libraries, publishers, authors, readers and research funders in the process of developing and testing a global library consortium model for supporting Open Access books. The experiment established that authors, librarians, publishers and research funding agencies can work together in powerful new ways to enable open access; that doing so is cost effective; and that a global library consortium model has the potential to dramatically widen access to the knowledge and ideas contained in book-length scholarly works.

Relevance: 40.00%

Abstract:

This paper proposes a new multi-resource multi-stage mine production timetabling problem for optimising the open-pit drilling, blasting and excavating operations under equipment capacity constraints. The flow process is analysed based on real-life data from an Australian iron ore mine site. The objective of the model is to maximise the throughput and minimise the total idle times of equipment at each stage. The following comprehensive mining attributes and constraints are considered: types of equipment; operating capacities of equipment; ready times of equipment; speeds of equipment; block-sequence-dependent movement times; equipment-assignment-dependent operational times; etc. The model also accounts for the availability and usage of equipment units at multiple operational stages, namely the drilling, blasting and excavating stages. The problem is formulated as a mixed integer program and solved with the ILOG CPLEX optimiser. The proposed model is validated with extensive computational experiments to improve mine production efficiency at the operational level.

Relevance: 40.00%

Abstract:

This thesis provides two main contributions. The first is BP-TRBAC, a unified authorisation model that can support legacy systems as well as business process systems. BP-TRBAC supports specific features required by business process environments. It is designed to be used as an independent, enterprise-wide authorisation model, rather than as part of the workflow system, and to serve as the main authorisation model for an organisation. The second contribution is BP-XACML, an authorisation policy language designed to represent BPM authorisation policies for business processes; this contribution also includes a policy model for BP-XACML. Using BP-TRBAC as an authorisation model together with BP-XACML as an authorisation policy language allows an organisation to manage and control authorisation requests from workflow systems and other legacy systems.
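As a rough intuition for what a role- and time-aware authorisation check decides, here is a hypothetical miniature: roles grant task permissions limited to a time window. The names, data structure and rule are illustrative only and far simpler than the BP-TRBAC model or the BP-XACML policy language defined in the thesis.

```python
# Hypothetical miniature of a temporal role-based authorisation check.
# Roles, tasks and time windows are invented for illustration.

from datetime import time

ROLE_PERMISSIONS = {
    "clerk":   {("submit_invoice", time(8), time(18))},
    "manager": {("approve_invoice", time(8), time(18)),
                ("submit_invoice", time(0), time(23, 59))},
}

def is_authorised(role, task, at):
    """Grant if some permission of the role covers the task at time `at`."""
    return any(task == t and start <= at <= end
               for t, start, end in ROLE_PERMISSIONS.get(role, ()))

print(is_authorised("clerk", "submit_invoice", time(9)))    # inside window
print(is_authorised("clerk", "approve_invoice", time(9)))   # not permitted
```

An enterprise-wide model like BP-TRBAC would answer such requests centrally for both workflow and legacy systems, with policies expressed in a language such as BP-XACML rather than hard-coded tables.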

Relevance: 40.00%

Abstract:

Similar to most other creative industries, the evolution of the music industry is heavily shaped by media technologies. This was equally true in 1999, when the global recorded music industry had experienced two decades of continuous growth, largely driven by the rapid transition from vinyl records to Compact Discs. The transition encouraged avid music listeners to purchase much of their music collections all over again in order to hear their favourite music with "digital sound". As a consequence of this successful product innovation, recorded music sales (in unit terms) more than doubled between the early 1980s and the end of the 1990s. It was against this backdrop that the first peer-to-peer file sharing service was developed and released to the mainstream music market in 1999 by the college student Shawn Fanning. The service was named Napster, and it marks the beginning of an era that is now a classic example of how an innovation can disrupt an entire industry and make large swathes of existing industry competences obsolete. File sharing services such as Napster, followed by a range of similar services in its path, reduced physical unit sales in the music industry to levels not seen since the 1970s. The severe impact of the internet on physical sales shocked many music industry executives, who spent much of the 2000s vigorously trying to reverse the decline and make the disruptive technologies go away. In the end, they learned that their efforts were to no avail; the impact on the music industry proved to be transformative, irreversible and, to many music industry professionals, devastating. Thousands of people lost their livelihoods, and music companies large and small folded or were forced into mergers or acquisitions. But as always during periods of disruption, the past 15 years have also been very innovative, spurring a plethora of new music business models.
These new business models have mainly emerged outside the music industry, and the innovators have often been required to be both persuasive and persistent in order to gain acceptance from the risk-averse and cash-poor music industry establishment. Apple was one such change agent: in 2003 it became the first company to open up a functioning and legal market for online music. The iTunes Music Store was the first online retail outlet able to offer the music catalogues of all the major music companies; it used an entirely novel pricing model, and it allowed consumers to de-bundle the music album and buy only the songs they actually liked. Songs had previously been bundled by physical necessity as discs or cassettes, but with the iTunes Music Store, the institutionalized album bundle slowly started to fall apart. The consequences had an immediate impact on music retailing, and within just a few years many brick-and-mortar record stores were forced out of business in markets across the world. The transformation also had disruptive consequences beyond music retailing, redefining music companies' organizational structures, work processes and routines, as well as professional roles. The iTunes Music Store was in one sense a disruptive innovation, but it was at the same time relatively incremental, since the major labels' positions and power structures remained largely unscathed. The rights holders still controlled their intellectual properties, and the structures governing the royalties paid per song sold were predictable, transparent and in line with established music industry practices.

Relevance: 40.00%

Abstract:

We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historical data, survey data, management planning data, expert knowledge and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be used to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations, such as planning and control.
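As a hedged sketch of what "converting data into a usable Bayesian probability form" can look like in the simplest case, the following turns hypothetical survey counts into a posterior probability via a conjugate Beta-Binomial update. The prior, the data and the railway framing are all invented for illustration; the paper's integration method is considerably broader.

```python
# Hedged sketch: convert survey counts into a Bayesian probability via a
# Beta-Binomial conjugate update. All numbers are invented.

from math import isclose  # used in verification below

def beta_update(alpha, beta, successes, failures):
    """Conjugate update: Beta(a, b) prior + Binomial data -> Beta posterior."""
    return alpha + successes, beta + failures

# weak prior belief: on-time departure probability around 0.5
alpha, beta = 1.0, 1.0

# hypothetical survey: 80 of 100 sampled train departures were on time
alpha, beta = beta_update(alpha, beta, successes=80, failures=20)

posterior_mean = alpha / (alpha + beta)
print(f"posterior mean on-time probability: {posterior_mean:.3f}")
```

The same mechanism lets expert knowledge enter as the prior and observed or survey data enter as the update, which is one simple way heterogeneous inputs can be merged into a common probabilistic form.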

Relevance: 40.00%

Abstract:

This study presents a comprehensive mathematical formulation model for a short-term open-pit mine block sequencing problem, which considers nearly all relevant technical aspects in open-pit mining. The proposed model aims to obtain the optimum extraction sequences of the original-size (smallest) blocks over short time intervals and in the presence of real-life constraints, including precedence relationship, machine capacity, grade requirements, processing demands and stockpile management. A hybrid branch-and-bound and simulated annealing algorithm is developed to solve the problem. Computational experiments show that the proposed methodology is a promising way to provide quantitative recommendations for mine planning and scheduling engineers.
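The simulated-annealing half of such a hybrid can be sketched on a toy instance: search over block extraction orders that respect precedence and minimise a made-up cost. Blocks, precedences and the cost function below are invented; the study couples this kind of metaheuristic search with branch-and-bound on a full mixed integer formulation.

```python
# Simplified sketch of simulated annealing for block sequencing.
# All data and the cost function are invented for illustration.

import math
import random

random.seed(7)

precedence = {"B3": {"B1"}, "B4": {"B2"}}       # B1 before B3, B2 before B4
cost_of = {"B1": 4, "B2": 1, "B3": 3, "B4": 2}  # toy per-block weights

def feasible(order):
    """Every block's predecessors must appear earlier in the order."""
    seen = set()
    for b in order:
        if not precedence.get(b, set()) <= seen:
            return False
        seen.add(b)
    return True

def cost(order):
    # weighted completion: extracting heavy blocks late costs more
    return sum((i + 1) * cost_of[b] for i, b in enumerate(order))

def anneal(order, temp=10.0, cooling=0.95, steps=300):
    current, best = list(order), list(order)
    for _ in range(steps):
        cand = list(current)
        i, j = random.sample(range(len(cand)), 2)
        cand[i], cand[j] = cand[j], cand[i]      # swap two positions
        if not feasible(cand):
            continue
        delta = cost(cand) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = cand                       # accept (maybe uphill)
            if cost(current) < cost(best):
                best = list(current)
        temp *= cooling
    return best

result = anneal(["B1", "B2", "B3", "B4"])
print(result, cost(result))
```

In the hybrid scheme described above, branch-and-bound provides optimality guarantees on subproblems while annealing of this kind explores the sequence space quickly.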