Abstract:
Project selection is a decision-making process influenced not merely by technical aspects but also by the people involved in the process. Organisational culture is described as a set of values and norms shared by people within an organisation that affects the way they interact with each other and with stakeholders outside the organisation. The aim of this paper is to emphasise the importance of organisational culture, in addition to the technical aspects of a project, in improving the quality of decisions in the project selection process. The discussion is based on an extensive literature review and, as such, represents the first part of a research agenda investigating the impact of organisational culture on the project selection process, applicable specifically to road infrastructure contracts. Four existing models of organisational culture (Denison 1990; Cameron and Quinn 2006; Hofstede 2001; Glaser et al. 1987) are discussed and reviewed in view of their use in the larger research project to investigate the impact of culture on identified critical elements of decision-making. An understanding of the way organisational culture impacts on project selection will increase the likelihood of relevant government departments selecting projects that achieve their stated organisational goals in future.
Abstract:
The calibration process in micro-simulation is extremely complicated. The difficulties are more prevalent when the process encompasses fitting both aggregate and disaggregate parameters, e.g. travel time and headway. Current calibration practice operates mostly at the aggregate level, for example travel time comparison. Such practices are popular for assessing network performance. Though these applications are significant, there is another stream of micro-simulated calibration at the disaggregate level. This study focuses on such a micro-calibration exercise, which is key to better comprehending motorway traffic risk levels and the management of variable speed limit (VSL) and ramp metering (RM) techniques. A selected section of the Pacific Motorway in Brisbane is used as a case study. The discussion primarily covers the critical issues encountered during the parameter adjustment exercise (e.g. vehicular, driving behaviour) with reference to key traffic performance indicators such as speed, lane distribution and headway at specific motorway points. The endeavour is to highlight the utility and implications of such disaggregate-level simulation for improved traffic prediction studies. The aspects of calibrating for points, in comparison to calibrating for the whole network, are also briefly addressed to examine critical issues such as the suitability of local calibration at a global scale. The paper will be of interest to transport professionals in Australia/New Zealand, where micro-simulation, in particular at point level, is still a comparatively less explored territory in motorway management.
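When simulated indicators are compared against observed ones in calibration exercises of this kind, a widely used acceptance measure is the GEH statistic. The abstract does not name the measure used, so the sketch below is offered only as an illustration of how such a comparison is commonly scored:

```python
import math

def geh(simulated: float, observed: float) -> float:
    """GEH statistic, commonly used in micro-simulation calibration to
    compare simulated and observed traffic counts or flows."""
    return math.sqrt(2 * (simulated - observed) ** 2 / (simulated + observed))

# A common rule of thumb: GEH below 5 on a link indicates a good fit.
score = geh(1050, 1000)  # ~1.56, well within the usual threshold
```

The statistic behaves like a relative error for large flows while remaining tolerant of small absolute differences at low flows, which is why it is preferred over a plain percentage difference in traffic work.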
Abstract:
Transmission smart grids will use a digital platform for the automation of high voltage substations. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value digital process connections for smart substation automation. A time synchronisation system is required for a sampled value process bus; however, the details are not defined in IEC 61850-9-2. PTPv2 provides the greatest accuracy of network-based time transfer systems, with timing errors of less than 100 ns achievable. The suitability of PTPv2 to synchronise sampling in a digital process bus is evaluated, with preliminary results indicating that steady state performance of low cost clocks is an acceptable ±300 ns, but that corrections issued by grandmaster clocks can introduce significant transients. Extremely stable grandmaster oscillators are required to ensure any corrections are sufficiently small that time synchronising performance is not degraded.
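The nanosecond-level offsets discussed above are produced by PTPv2's standard delay request-response exchange, in which master and slave exchange four timestamps. A minimal sketch of that calculation, under IEEE 1588's usual assumption of a symmetric network path:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """IEEE 1588 delay request-response calculation.
    t1: Sync sent by master, t2: Sync received by slave,
    t3: Delay_Req sent by slave, t4: Delay_Req received by master.
    Assumes the one-way path delay is the same in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2          # slave clock ahead of master
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # one-way network delay
    return offset, mean_path_delay

# Example with nanosecond timestamps: a slave clock running 300 ns ahead
# of the master over a 1000 ns one-way path.
offset, delay = ptp_offset_and_delay(0, 1300, 2000, 2700)
# offset == 300.0, delay == 1000.0
```

Any asymmetry between the two path directions appears directly as an offset error, which is one reason transparent or boundary clocks are used in substation networks.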
Abstract:
In this editorial letter, we provide the readers of Information Systems with a bird's-eye introduction to Process-aware Information Systems (PAIS) – a sub-field of Information Systems that has drawn growing attention in the past two decades, both as an engineering and as a management discipline. Against this backdrop, we briefly discuss how the papers included in this special issue contribute to extending the body of knowledge in this field.
Abstract:
Business Process Management (BPM) is a topic that continues to grow in significance as organisations seek to gain and sustain competitive advantage in an increasingly global environment. Despite anecdotal evidence of organisations improving performance by pursuing a BPM approach, there is little theory that explains and substantiates this relationship. This study provides the first theory on the progression and maturity of BPM initiatives within organisations and provides a vital starting point upon which future research in this area can build. The researcher starts by clearly defining three key terms (BPM Initiative, BPM Progression and BPM Maturity), showing the relationship between these three concepts and proposing their relationship with organisational performance. The researcher then combines extant literature with the Delphi technique and the case study method to explore the progression and measurement of BPM initiatives within organisations. The study builds upon the principles of general theories, including the Punctuated Equilibrium Model and Dynamic Capabilities, to present theory on BPM Progression and BPM Maturity. Using the BPM Capability Framework developed through an international Delphi study series, the researcher shows how the specific organisational context influences which capability areas an organisation chooses to progress. By comparing five separate organisations over an extended time, the researcher is able to show that, despite this disparity, there is some evidence of consistency with regard to the capability areas progressed. This suggests that subsequent identification of progression paths may be possible. The study also shows that the approach and scope taken to BPM within each organisation is a likely predictor of such paths. These outcomes result in the proposal of a formative model for measuring BPM Maturity.
Abstract:
Project selection is a complex decision-making process as it involves multiple objectives, constraints and stakeholders. Understanding the organisation, in particular its organisational culture, is an essential stage in improving the decision-making process. The influences of organisational culture on decision-making can be seen in the way people work as a team, act and cooperate in their teamwork to achieve the set goals, and also in how people think, prioritise and decide. This paper aims to provide evidence of the impact of organisational culture on the decision-making process in project selection, in the Indonesian context. Data was collected from a questionnaire survey developed based on existing models of organisational culture (Denison 1990, Hofstede 2001, and Glaser et al. 1987). Four main cultural traits (involvement, consistency, mission and power-distance) were selected and employed to examine the influence of organisational culture on the effectiveness of decision-making in current Indonesian project selection processes. The results reveal that there are differences in organisational culture between two organisations in three provinces. They also suggest that organisational culture (particularly the traits of 'involvement', 'consistency' and 'mission') contributes to the effectiveness of decision-making in the selected cases.
Abstract:
Formalised service innovation is a central tenet of enterprise systems lifecycle phases. Event-driven process models extended with knowledge objects are found not to be useful in early lifecycle phases. When an upgrade is required, a map of the knowledge infrastructure is needed to better design further service innovation, because functional maps no longer describe the context adequately. By looking at formal changes to business processes as service innovations, and recognising the knowledge infrastructure inherent in services generally, changes driven through technology such as ES can be better understood with the application of frameworks such as B-KIDE.
Abstract:
Product Lifecycle Management has been developed as an approach to providing timely engineering information. However, the number of domain specialisations within manufacturing makes such information communication disjointed, inefficient and error-prone. In this paper we propose an immersive 3D visualization of linked domain-specific information views for improving and accelerating communication processes in Product Lifecycle Management. With a common yet understandable visualization of several domain views, interconnections and dependencies become obvious. The conceptual framework presented here links domain-specific information extracts from Product Lifecycle Management systems with each other and displays them via an integrated 3D representation scheme. We expect this visualization framework to support holistic tactical decision-making processes between domain experts in operational and tactical manufacturing scenarios.
Abstract:
This study examines a dialogue process managers can use to explore community attitudes. The objectives of the research are, first, to develop a dialogue process that engages community audiences on climate mitigation strategies; second, to understand participants' perspectives and potential reactions, in particular to underground storage of CO2, and to determine the strategies that most effectively engage people in dialogue so that the climate change debate can move forward; and finally, to develop a dialogue process that can be used by managers on other politically sensitive topics. Knowledge of the dynamics of psychosocial relationships and communication between stakeholders contributed to increased understanding of the issues. The key findings of this study indicate that the public can be engaged in dialogue on the issue of CO2 capture and storage and low emission technologies without engendering adverse reactions. The dialogue process is critical to participants' engagement and led to behaviour change in energy use.
Abstract:
There has been discussion about whether corporate decision-making can be helped by decision support systems with regard to qualitative aspects of decision-making, e.g. troubleshooting (Löf and Möller, 2003). Intelligent decision support systems have been developed to help business controllers perform their business analysis. However, few papers have investigated the user's point of view regarding such systems. How do decision-makers perceive the use of decision support systems in general, and dashboards in particular? Are dashboards useful tools for business controllers? Based on the technology acceptance model and on positive mood theory, we suggest a series of antecedent factors that influence the perceived usefulness and perceived ease of use of dashboards. A survey is used to collect data regarding the measurement constructs. The managerial implications of this paper consist in showing the degree of penetration of dashboards into organisational decision-making and some of the factors that explain this penetration rate.
Abstract:
Previous research has put forward a number of properties of business process models that have an impact on their understandability. Two such properties are compactness and (block-)structuredness. What has not been sufficiently appreciated at this point is that these desirable properties may be at odds with one another. This paper presents the results of a two-pronged study aimed at exploring the trade-off between compactness and structuredness of process models. The first prong of the study is a comparative analysis of the complexity of a set of unstructured process models from industrial practice and of their corresponding structured versions. The second prong is an experiment wherein a cohort of students was exposed to semantically equivalent unstructured and structured process models. The key finding is that structuredness is not an absolute desideratum for process model understandability. Instead, subtle trade-offs between structuredness and other model properties are at play.
Abstract:
Variants of the same process can be encountered within one organization or across different organizations. For example, different municipalities, courts, and rental agencies all need to support highly similar processes. In fact, procurement and sales processes can be found in almost any organization. However, despite these similarities, there is also the need to allow for local variations in a controlled manner. Therefore, many academics and practitioners have advocated the use of configurable process models (sometimes referred to as reference models). A configurable process model describes a family of similar process models in a given domain. Such a model can be configured to obtain a specific process model that is subsequently used to handle individual cases, for instance, to process customer orders. Process configuration is notoriously difficult as there may be all kinds of interdependencies between configuration decisions. In fact, an incorrect configuration may lead to behavioral issues such as deadlocks and livelocks. To address this problem, we present a novel verification approach inspired by the "operating guidelines" used for partner synthesis. We view the configuration process as an external service, and compute a characterization of all such services which meet particular requirements via the notion of configuration guideline. As a result, we can characterize all feasible configurations (i.e., configurations without behavioral problems) at design time, instead of repeatedly checking each individual configuration while configuring a process model.
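The contrast drawn above, between repeatedly checking each individual configuration and characterising all feasible ones at design time, can be illustrated with a toy model. The activities and dependency rule below are hypothetical, and the brute-force enumeration is the naive baseline that a configuration guideline avoids, not the operating-guidelines technique itself:

```python
from itertools import product

# Toy configurable model: three optional activities. A configuration
# decides which activities are enabled. The rule in is_feasible is a
# made-up stand-in for the behavioural constraints that cause
# deadlocks in real configurable process models.
ACTIVITIES = ["check_credit", "ship_goods", "send_invoice"]

def is_feasible(config):
    enabled = dict(zip(ACTIVITIES, config))
    if enabled["ship_goods"] and not enabled["send_invoice"]:
        return False  # would deadlock waiting on an invoice that never comes
    return any(config)  # an empty process is not a valid variant

# Naive design-time check: enumerate every configuration once and keep
# the feasible ones. A configuration guideline characterises this
# feasible set symbolically, without per-configuration model checking.
feasible = [c for c in product([True, False], repeat=3) if is_feasible(c)]
# 5 of the 8 possible configurations survive
```

Even this tiny example shows why interdependencies matter: a locally sensible choice (enable shipping) is only correct in combination with another decision made elsewhere in the model.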
Abstract:
Process mining techniques are able to extract knowledge from event logs commonly available in today’s information systems. These techniques provide new means to discover, monitor, and improve processes in a variety of application domains. There are two main drivers for the growing interest in process mining. On the one hand, more and more events are being recorded, thus, providing detailed information about the history of processes. On the other hand, there is a need to improve and support business processes in competitive and rapidly changing environments. This manifesto is created by the IEEE Task Force on Process Mining and aims to promote the topic of process mining. Moreover, by defining a set of guiding principles and listing important challenges, this manifesto hopes to serve as a guide for software developers, scientists, consultants, business managers, and end-users. The goal is to increase the maturity of process mining as a new tool to improve the (re)design, control, and support of operational business processes.
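As a concrete illustration of extracting knowledge from an event log, many process discovery algorithms start from the directly-follows relation between activities. The event log below is hypothetical and the sketch is only the first step of discovery, not a full mining algorithm:

```python
from collections import Counter

def directly_follows(log):
    """Count directly-follows pairs (a, b) across all traces: the basic
    relation that many process discovery algorithms build upon."""
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Hypothetical event log: each trace is one case's ordered activities,
# as recorded by an information system.
log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "check", "approve"],
]
dfg = directly_follows(log)
# dfg[("register", "check")] == 3, dfg[("check", "approve")] == 2
```

From such frequency counts a discovery algorithm can infer sequencing, choices ("check" is followed by either "approve" or "reject") and, with more relations, concurrency.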
Abstract:
The health system is one sector dealing with a deluge of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently. In addition, many healthcare organisations still have stand-alone systems that are not integrated for information management and decision-making. This shows there is a need for an effective system to capture, collate and distribute health data. Implementing the data warehouse concept in healthcare is therefore potentially one solution for integrating health data. Data warehousing has been used to support business intelligence and decision-making in many other sectors, such as engineering, defence and retail. The research problem to be addressed is: how can data warehousing assist the decision-making process in healthcare? To address this problem, the researcher narrowed the investigation to focus on a cardiac surgery unit, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as the case study. The cardiac surgery unit at TPCH uses a stand-alone database of patient clinical data, which supports clinical audit, service management and research functions. However, much of the time the interaction between the cardiac surgery unit information system and other units is minimal, with only limited and basic two-way interaction with the other clinical and administrative databases at TPCH that support decision-making processes. The aims of this research are to investigate what decision-making issues healthcare professionals face with the current information systems, and how decision-making might be improved within this healthcare setting by implementing an aligned data warehouse model or models.
As part of the research, the researcher proposes and develops a suitable data warehouse prototype based on the cardiac surgery unit's needs, integrating the Intensive Care Unit database, the Clinical Costing unit database (Transition II) and the Quality and Safety unit database [electronic discharge summary (e-DS)]. The goal is to improve the current decision-making processes. The main objectives of this research are to improve access to integrated clinical and financial data, potentially providing better information for decision-making. Based on the questionnaire responses and the consulted literature, the results indicate a centralised data warehouse model for the cardiac surgery unit at this stage. A centralised data warehouse model addresses current needs and can also be upgraded to an enterprise-wide warehouse model or a federated data warehouse model, as discussed in many of the consulted publications. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data was analysed using SAS enterprise edition 4.3. In the final stage, the data warehouse prototype was evaluated by collecting feedback from the end users. This was achieved by using output created from the data warehouse prototype as examples of the data desired and possible in a data warehouse environment. According to the feedback collected from the end users, implementation of a data warehouse was seen as a useful tool to inform management options, provide a more complete representation of the factors related to a decision scenario, and potentially reduce information product development time. However, many constraints exist in this research.
These constraints include technical issues such as data incompatibilities and the integration of the cardiac surgery database and e-DS database servers, as well as Queensland Health information restrictions (Queensland Health information-related policies, patient data confidentiality and ethics requirements), limited availability of support from IT technical staff, and time restrictions. These factors influenced the warehouse model development process, necessitating an incremental approach, and highlight the presence of many practical barriers to data warehousing and integration at the clinical service level. Limitations included the use of a small convenience sample of survey respondents and a single-site case report study design. As mentioned previously, the proposed data warehouse is a prototype and was developed using only four database repositories. Despite this constraint, the research demonstrates that by implementing a data warehouse at the service level, decision-making is supported and data quality issues related to access and availability can be reduced, providing many benefits. Output reports produced from the data warehouse prototype demonstrated usefulness for improving decision-making in the management of clinical services, and for quality and safety monitoring for better clinical care. In the future, the centralised model selected can be upgraded to an enterprise-wide architecture by integrating additional hospital units' databases.
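The integrated clinical and financial reporting described above is typically achieved with a star schema: a fact table of events linked to dimension tables. The sketch below uses hypothetical table and column names for illustration only; it is not the actual TPCH warehouse design:

```python
import sqlite3

# Illustrative star schema: a fact table of procedures joined to a
# clinical (patient) dimension and a financial (cost centre) dimension,
# so one query can combine data that stand-alone systems keep apart.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_patient (patient_id INTEGER PRIMARY KEY, age INTEGER);
CREATE TABLE dim_cost_centre (centre_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_procedure (
    proc_id INTEGER PRIMARY KEY,
    patient_id INTEGER REFERENCES dim_patient,
    centre_id INTEGER REFERENCES dim_cost_centre,
    icu_hours REAL, cost REAL);
INSERT INTO dim_patient VALUES (1, 64), (2, 71);
INSERT INTO dim_cost_centre VALUES (10, 'ICU'), (11, 'Theatre');
INSERT INTO fact_procedure VALUES
    (100, 1, 10, 48.0, 12000.0),
    (101, 2, 10, 72.0, 15500.0),
    (102, 2, 11, 0.0, 30000.0);
""")

# Integrated clinical + financial view: average ICU hours and total
# cost per cost centre, answered by a single query.
rows = con.execute("""
    SELECT c.name, AVG(f.icu_hours), SUM(f.cost)
    FROM fact_procedure f JOIN dim_cost_centre c USING (centre_id)
    GROUP BY c.name ORDER BY c.name
""").fetchall()
```

The point of the dimensional layout is exactly the benefit the end users reported: one decision scenario, one query, rather than manual reconciliation across separate systems.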