465 results for New business enterprises - Management
Abstract:
BIM as a suite of technologies has been enabled by significant improvements in IT infrastructure, the capabilities of computer hardware and software, the increasing adoption of BIM, and the development of Industry Foundation Classes (IFC), which facilitate the sharing of information between firms. The report highlights the advantages of BIM, particularly increased utility and speed, better data quality and enhanced fault finding in all construction phases. Additionally, BIM promotes enhanced collaboration and visualisation of data, mainly in the design and construction phases. There are a number of barriers to the effective implementation of BIM. These include, somewhat paradoxically: a single detailed model (which precludes scenarios and the development of detailed alternative designs); the need for three different interoperability standards for effective implementation; added work for the designer, which needs to be recognised and remunerated; and the size and complexity of BIM, which requires significant investment in human capital to realise its full potential. There are also a number of challenges to implementing BIM. The report identifies these as a range of issues concerning IP, liability, risks and contracts, and the authenticity of users. Additionally, implementing BIM requires investment in new technology, skills training and the development of new ways of collaborating. Finally, there are likely to be Trade Practices concerns, as requiring certain technology owned by relatively few firms may limit
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing the likelihood of a process fault occurring and the severity of its impact. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between risks relating to different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system, and its effectiveness has been evaluated using a real-life scenario in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and a comparison with the event data provided by the company, show that concurrently executed process instances complete with significantly fewer faults and lower fault severities when the recommendations provided by our recommendation system are taken into account.
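The core recommendation idea can be sketched in a few lines. This is not the authors' implementation (which traverses decision trees over process data, resources and durations, and adds an integer linear program across concurrent instances); it only illustrates recommending the next action with the lowest risk estimated from past executions, with all names invented:

```python
# Minimal sketch (not the paper's method): estimate a fault rate per
# candidate next task from past executions, then recommend the task with
# the lowest estimated risk. All task names and data are illustrative.

from collections import defaultdict

def fault_rates(history):
    """history: list of (next_task, faulted) pairs from past executions."""
    counts = defaultdict(lambda: [0, 0])  # task -> [faults, total]
    for task, faulted in history:
        counts[task][0] += int(faulted)
        counts[task][1] += 1
    return {t: f / n for t, (f, n) in counts.items()}

def recommend(history, candidates):
    """Return the candidate next task with the lowest estimated risk."""
    rates = fault_rates(history)
    # Unseen tasks get a neutral prior of 0.5 (an assumption of this sketch).
    return min(candidates, key=lambda t: rates.get(t, 0.5))

history = [("manual_check", True), ("manual_check", False),
           ("auto_check", False), ("auto_check", False)]
print(recommend(history, ["manual_check", "auto_check"]))  # -> auto_check
```

A decision tree generalises this by conditioning the fault estimate on process data, resources and durations rather than on the task name alone.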
Abstract:
This series of research vignettes is aimed at sharing current and interesting research findings from our team of international Entrepreneurship researchers. This vignette deals with the process of new venture creation, and specifically the sequence in which different ‘start-up activities’ are undertaken.
Abstract:
Knowledge management (KM) strategy is the planned or actual coordination of a firm's major goals and learning in time; this coordination continually co-aligns the firm's knowledge-based resources with the environment. Based on the organic perspective of strategy, a KM performance evaluation approach should be able to 1) review the knowledge governance mechanisms and learning routines that underpin the KM strategy, as well as the performance outcomes driven by the strategy, and 2) predict the evolution of performance drivers and outcomes into the future to facilitate strategic planning. This study combined a survey study and a system dynamics (SD) simulation to demonstrate the transformation from a mechanistic to an organic perspective on KM strategy and performance evaluation. The survey study was conducted based on a sample of 143 construction contractors and used structural equation modeling (SEM) techniques to develop a KM performance index for reviewing the key elements that underpin KM strategy. The SD simulation predicted the development of KM strategy configurations and the evolution of KM performance over time. The organic KM performance evaluation approach demonstrated by this study has significant potential to improve the alignment of KM strategy within an increasingly dynamic business environment.
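The system dynamics side of the approach can be illustrated with a toy stock-and-flow simulation. The single-stock model and all parameter values below are invented for illustration; they are not the study's calibrated model:

```python
# Illustrative SD fragment (not the study's model): one stock, "KM
# performance", fed by a learning inflow with diminishing returns and
# drained by knowledge obsolescence, integrated with Euler steps.
# learning_rate and decay are invented parameters.

def simulate(periods, performance=0.2, learning_rate=0.15, decay=0.05, dt=1.0):
    trajectory = [performance]
    for _ in range(periods):
        inflow = learning_rate * (1.0 - performance)  # diminishing returns
        outflow = decay * performance                 # knowledge obsolescence
        performance += (inflow - outflow) * dt
        trajectory.append(performance)
    return trajectory

path = simulate(10)
print(round(path[-1], 3))  # performance rises toward its equilibrium
```

Running the model forward in time is what lets an organic evaluation predict how performance drivers and outcomes evolve, rather than only reviewing them at a single point.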
Abstract:
Business process models have traditionally been an effective way of examining business practices to identify areas for improvement. While common information gathering approaches are generally efficacious, they can be quite time consuming and risk introducing inaccuracies when information is forgotten or incorrectly interpreted by analysts. In this study, the potential of a role-playing approach for process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. Empirical data obtained in this study suggests that this approach may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.
Abstract:
Structural reform through forced mergers has been a dominant feature of Australian local government for decades. Advocates of compulsory consolidation contend that larger municipalities perform better across a wide range of attributes, including financial sustainability. While empirical scholars of local government have invested considerable effort into investigating these claims, no-one has yet examined the performance of Brisbane City Council against other local authorities, despite the fact that it is by far the largest council in Australia. This paper seeks to remedy this neglect by comparing Brisbane with Sydney City Council, an average of six south east Queensland councils and an average of ten metropolitan New South Wales councils against four measures of financial performance over the period 2008 to 2011.
Abstract:
This chapter analyses recent policy reforms in the national history curriculum in both Australia and the Russian Federation. It analyses those emphases in the national curriculum in history that depict new representations and historiography and the ways in which these are foregrounded in History school textbooks. In doing so, it considers the debates about which versions of the nation’s past are deemed significant, and what should be transmitted to future generations of citizens. In this discussion of national history curricula, consideration is given to the curriculum’s officially defined status as an instrument in the process of ideological transformation and nation-building. The chapter also examines how history textbooks are implicated in this process, in terms of reproducing and representing the content that is selected and emphasised in a national history curriculum.
Abstract:
Business processes are prone to continuous and unexpected changes. Process workers may start executing a process differently in order to adjust to changes in workload, season, guidelines or regulations, for example. Early detection of business process changes based on their event logs – also known as business process drift detection – enables analysts to identify and act upon changes that may otherwise affect process performance. Previous methods for business process drift detection are based on an exploration of a potentially large feature space, and in some cases they require users to manually identify the specific features that characterize the drift. Depending on the explored feature set, these methods may miss certain types of changes. This paper proposes a fully automated and statistically grounded method for detecting process drift. The core idea is to perform statistical tests over the distributions of runs observed in two consecutive time windows. By adaptively sizing the window, the method strikes a trade-off between classification accuracy and drift detection delay. A validation on synthetic and real-life logs shows that the method accurately detects typical change patterns and scales well enough to be applicable to online drift detection.
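The core comparison step can be sketched directly. The paper's method adaptively sizes its windows and uses a proper statistical test; the sketch below uses fixed, equal-sized windows and a chi-squared statistic with a threshold chosen purely for illustration:

```python
# Illustrative sketch only: compare the distributions of runs (trace
# variants) observed in two consecutive windows. A large chi-squared
# statistic signals that the process behaviour has drifted. The fixed
# threshold below is an assumption of this sketch, not the paper's test.

from collections import Counter

def chi_squared(window_a, window_b):
    """Chi-squared statistic over run frequencies in two equal-sized windows."""
    freq_a, freq_b = Counter(window_a), Counter(window_b)
    stat = 0.0
    for run in set(freq_a) | set(freq_b):
        a, b = freq_a[run], freq_b[run]
        expected = (a + b) / 2  # under the no-drift hypothesis
        stat += (a - expected) ** 2 / expected + (b - expected) ** 2 / expected
    return stat

def drift_detected(window_a, window_b, threshold=6.0):
    return chi_squared(window_a, window_b) > threshold

before = ["ABC"] * 9 + ["ACB"]   # runs named by their task order
after  = ["ACB"] * 9 + ["ABC"]
print(drift_detected(before, after))  # -> True
```

Shrinking the windows lowers detection delay but makes the frequency estimates noisier, which is exactly the trade-off the adaptive window sizing manages.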
Abstract:
This paper addresses the problem of identifying and explaining behavioral differences between two business process event logs. The paper presents a method that, given two event logs, returns a set of statements in natural language capturing behavior that is present or frequent in one log, while absent or infrequent in the other. This log delta analysis method allows users to diagnose differences between normal and deviant executions of a process or between two versions or variants of a process. The method relies on a novel approach to losslessly encode an event log as an event structure, combined with a frequency-enhanced technique for differencing pairs of event structures. A validation of the proposed method shows that it accurately diagnoses typical change patterns and can explain differences between normal and deviant cases in a real-life log, more compactly and precisely than previously proposed methods.
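A toy version of log delta analysis conveys the flavour of the output. This is not the paper's event-structure technique; it merely compares the frequency of each "directly follows" relation in two logs and emits a natural-language statement when a relation is frequent in one log but rare or absent in the other:

```python
# Toy approximation (not the paper's lossless event-structure encoding):
# report directly-follows relations whose relative frequency differs
# between two logs. The gap threshold is an invented parameter.

from collections import Counter

def follows(log):
    """Relative frequency of each directly-follows pair in a log."""
    pairs = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    total = sum(len(t) - 1 for t in log) or 1
    return {p: c / total for p, c in pairs.items()}

def delta_statements(log1, log2, gap=0.3):
    f1, f2 = follows(log1), follows(log2)
    out = []
    for (a, b) in sorted(set(f1) | set(f2)):
        d = f1.get((a, b), 0) - f2.get((a, b), 0)
        if abs(d) >= gap:
            where = "the first log" if d > 0 else "the second log"
            out.append(f"'{a}' is directly followed by '{b}' mainly in {where}")
    return out

normal  = [["submit", "approve"], ["submit", "approve"]]
deviant = [["submit", "reject"], ["submit", "reject"]]
print(delta_statements(normal, deviant))
```

The event-structure encoding in the paper captures concurrency and causality losslessly, which directly-follows frequencies alone cannot.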
Abstract:
Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
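The continuous comparison of observed and modeled behavior that keeps the collection "liquid" can be sketched with a deliberately reduced notion of fitness. Here a model is just a set of permitted directly-follows pairs; all names are illustrative, and real conformance checking uses alignment techniques rather than this simplification:

```python
# Hedged sketch of the comparison step: measure how well observed traces
# fit a model, reduced here to a set of permitted "directly follows"
# pairs. Models whose fitness drops below a chosen level could then be
# flagged for realignment with actual organizational behavior.

def fitness(model_pairs, log):
    """Fraction of observed directly-follows steps permitted by the model."""
    ok = total = 0
    for trace in log:
        for step in zip(trace, trace[1:]):
            total += 1
            ok += step in model_pairs
    return ok / total if total else 1.0

model = {("register", "check"), ("check", "pay")}
log = [["register", "check", "pay"], ["register", "pay"]]
print(round(fitness(model, log), 2))  # -> 0.67
```

Recomputing such a measure as new events arrive is what would let a model collection self-adapt instead of drifting out of sync with the real processes.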
Abstract:
The African philosophy of Ubuntu is typically characterised as a communitarian philosophy that emphasises virtues such as compassion, tolerance and harmony. In recent years there has been growing interest in this philosophy, and in how it can be applied to a variety of disciplines and issues. Several authors have provided useful introductions of Ubuntu in the field of business ethics and suggested theoretical ways in which it could be applied. The purpose of this paper is to extend this discussion by providing a more critical analysis of Ubuntu and business ethics with the aim of clarifying the role that Ubuntu can play, and providing guidance for further research in this area. The analysis consists of three sections. In the first, certain problems are identified within the existing literature. This is followed by a consideration of alternative perspectives and interpretations of Ubuntu. The last section, following from the first two, identifies specific areas requiring further research, both empirical and non-empirical, as well as ways in which Ubuntu could be fruitfully applied.
Abstract:
Research question/issue: This paper frames the debate on corporate governance convergence in terms of the morality underlying corporate governance models. The claims and arguments of moral relativism are presented to provide theoretical structure to the moral aspects of corporate governance convergence, and ultimately the normative question of whether convergence should occur. Research findings/insights: The morality underlying different models of corporate governance has largely been ignored in the corporate governance convergence literature. A range of moral philosophies and principles that underlie the dominant corporate governance models are identified. This leads to a consideration of the claims and arguments of moral relativism relating to corporate governance. A research agenda around the claims of Descriptive and Metaethical moral relativism, and which ultimately informs the associated normative argument, is then suggested. Theoretical/academic implications: The application of moral relativism to the debate on corporate governance convergence presents a theoretical structure to the analysis and consideration of its moral aspects. This structure lends itself to further research, both empirical and conceptual. Practitioner/policy implications: The claims and arguments of moral relativism provide a means of analysing calls that are made for a culturally or nationally ‘appropriate’ model of corporate governance. This can assist in providing direction for corporate governance reforms and is of particular relevance for developing countries which have inherited Western corporate governance models through colonialism.