918 results for Process model alignment


Relevance: 30.00%

Abstract:

Temporary Traffic Control Plans (TCPs), which provide construction phasing to maintain traffic during construction operations, are an integral component of highway construction project design. Using the initial design, designers develop estimated quantities for the required TCP devices, which become the basis for bids submitted by highway contractors. However, actual as-built quantities often differ significantly from the engineer's original estimate. The total cost of TCP phasing on highway construction projects amounts to 6–10% of the total construction cost. Variations between engineer-estimated quantities and final quantities reduce cost control, increase the likelihood of cost-related litigation, and distort bid rankings and contractor selection. Statistical analyses of over 2000 highway construction projects were performed to determine the sources of variation, which were then used as the basis for developing an automated hybrid prediction model that combines multiple regression and heuristic rules to provide accurate TCP quantities and costs. The predictive accuracy of the developed model was demonstrated through several case studies.
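The hybrid of multiple regression and heuristic rules described above can be pictured with a minimal sketch like the one below; the project features, training quantities, and adjustment rule are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Illustrative project features: [project length (km), duration (days), lanes closed]
# and observed as-built TCP device quantities (placeholder numbers).
X = np.array([[1.2, 30, 1], [3.5, 90, 2], [0.8, 20, 1], [5.0, 150, 3]], dtype=float)
y = np.array([120.0, 410.0, 90.0, 640.0])

# Step 1: multiple regression fitted by ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])          # add intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)

def predict_tcp_quantity(features):
    """Regression estimate followed by a heuristic correction rule (hypothetical)."""
    base = float(np.array([1.0, *features]) @ beta)   # regression component
    # Heuristic rule (assumed for illustration): long multi-lane closures tend to be
    # under-estimated, so inflate the regression estimate by 10%.
    if features[0] > 4.0 and features[2] >= 3:
        base *= 1.10
    return max(base, 0.0)                             # quantities cannot be negative

print(predict_tcp_quantity([2.0, 60, 2]))
```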

Relevance: 30.00%

Abstract:

The previous chapters gave an insightful introduction to the various facets of Business Process Management. We now share a rich understanding of the essential ideas behind designing and managing processes for organizational purposes. We have also learned about the various streams of research and development that have influenced contemporary BPM. As a matter of fact, BPM has become a holistic management discipline. As such, it requires that a plethora of facets be addressed for its successful and sustainable application. This chapter provides a framework that consolidates and structures the essential factors that constitute BPM as a whole. Drawing from research in the field of maturity models, we suggest six core elements of BPM: strategic alignment, governance, methods, information technology, people, and culture. These six elements serve as the structure for this BPM Handbook.

Relevance: 30.00%

Abstract:

A BPMN model is well-structured if splits and joins are always paired into single-entry-single-exit blocks. Well-structuredness is often a desirable property as it promotes readability and makes models easier to analyze. However, many process models found in practice are not well-structured, and it is not always feasible or even desirable to restrict process modelers to producing only well-structured models. Also, not all processes can be captured as well-structured process models. An alternative to forcing modelers to produce well-structured models is to automatically transform unstructured models into well-structured ones when needed and possible. This talk reviews existing results on the automatic transformation of unstructured process models into structured ones.
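One necessary condition behind the single-entry-single-exit notion can be illustrated with a small dominator-based check; this is a simplified sketch (assuming every node lies on a path from start to end), not the transformation algorithm reviewed in the talk.

```python
import networkx as nx

def dominator_sets(G, start):
    """All dominators of each node, derived from the immediate-dominator tree.
    Assumes every node in G is reachable from start."""
    idom = nx.immediate_dominators(G, start)
    dom = {}
    for n in G:
        chain, cur = {n}, n
        while idom[cur] != cur:
            cur = idom[cur]
            chain.add(cur)
        dom[n] = chain
    return dom

def forms_sese_block(G, start, end, split, join):
    """Necessary condition for a split/join pair to bound a single-entry-single-exit
    block: the split dominates the join and the join post-dominates the split."""
    dom = dominator_sets(G, start)
    post = dominator_sets(G.reverse(copy=True), end)   # post-dominators via reversed graph
    return split in dom[join] and join in post[split]

# Structured toy model: start -> split -> (a | b) -> join -> end
G = nx.DiGraph([("start", "split"), ("split", "a"), ("split", "b"),
                ("a", "join"), ("b", "join"), ("join", "end")])
print(forms_sese_block(G, "start", "end", "split", "join"))   # True

# Unstructured variant: branch b bypasses the join and flows straight to end
H = nx.DiGraph([("start", "split"), ("split", "a"), ("split", "b"),
                ("a", "join"), ("b", "end"), ("join", "end")])
print(forms_sese_block(H, "start", "end", "split", "join"))   # False
```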

Relevance: 30.00%

Abstract:

Validation is an important issue in the development and application of Bayesian Belief Network (BBN) models, especially when the outcome of the model cannot be directly observed. Despite this, few frameworks for validating BBNs have been proposed, and fewer still have been applied to substantive real-world problems. In this paper we adopt the approach of Pitchforth and Mengersen (2013), which includes nine validation tests that focus on the structure, discretisation, parameterisation and behaviour of the BBNs included in the case study. We describe the process and results of implementing this validation framework on a model of a real airport terminal system, with particular reference to its effectiveness in producing a valid model that can be used and understood by operational decision makers. In applying the proposed validation framework we demonstrate the overall validity of the Inbound Passenger Facilitation Model as well as the effectiveness of the validity framework itself.
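As a flavour of the behaviour-oriented tests in such a framework, the sketch below runs a monotonicity check on a toy two-node network; the node names and probabilities are hypothetical and unrelated to the actual Inbound Passenger Facilitation Model.

```python
import numpy as np

# Toy two-node BBN (hypothetical): Load -> Congestion, both binary.
# The conditional probabilities below are illustrative numbers only.
p_congestion_given_load = np.array([0.15, 0.70])     # P(congestion=1 | load=0/1)

def p_congestion(p_load):
    """Marginal P(congestion=1) for a given prior P(load=1)."""
    return (1 - p_load) * p_congestion_given_load[0] + p_load * p_congestion_given_load[1]

# Behaviour-sensitivity check in the spirit of a validation test: the model's output
# should move in the direction domain experts expect as an input prior is varied.
priors = np.linspace(0.0, 1.0, 11)
outputs = [p_congestion(p) for p in priors]
assert all(b >= a for a, b in zip(outputs, outputs[1:])), "monotonicity violated"
print([round(o, 3) for o in outputs])
```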

Relevance: 30.00%

Abstract:

The ineffectiveness of current design processes has been well studied and has resulted in widespread calls for the evolution and development of new management processes. Even following the advent of BIM, we continue to move from one stage to another without necessarily having resolved all the issues. CAD design technology, if well handled, could have significantly raised the level of quality and efficiency of current processes, but in practice this was not fully realized. Technology alone therefore cannot solve all the problems, and the advent of BIM could result in a similar bottleneck. For a precise definition of the problem to be solved, we should start by understanding the main current bottlenecks that have yet to be overcome by either new technologies or management processes, and the human behaviour-related issues that affect the adoption and utilization of new technologies. The fragmented and dispersed nature of the AEC sector, and the huge number of small organizations that comprise it, are a major limiting factor. Several authors have addressed this issue, and more recently IDDS has been defined as the highest level of achievement. However, what is written on IDDS describes a highly idealized target state; it is a holistic, utopian proposition intended to create the research agenda for moving towards that state. Key to IDDS is the framing of a new management model that addresses the problems associated with key aspects: technology, processes, policies and people. One of the primary areas to be further studied is the process of collaborative work and understanding, together with the development of proposals to overcome the many cultural barriers that currently exist and impede the advance of new management methods. The purpose of this paper is to define and delimit the problems to be solved so that it becomes possible to implement a new management model for a collaborative design process.

Relevance: 30.00%

Abstract:

The Fluid–Structure Interaction (FSI) problem is significant in science and engineering and poses challenges for computational mechanics. The coupled Finite Element–Smoothed Particle Hydrodynamics (FE-SPH) model is a robust technique for the simulation of FSI problems. However, two important steps in the coupled FE-SPH model, neighbor searching and contact searching, are extremely time-consuming. The Point-In-Box (PIB) searching algorithm was developed by Swegle to improve searching efficiency; however, its efficiency can be significantly affected by the distribution of points (nodes in FEM and particles in SPH). In this paper, in order to improve searching efficiency, a novel Striped-PIB (S-PIB) searching algorithm is proposed to overcome the PIB algorithm's sensitivity to point distribution, and the two time-consuming steps of neighbor searching and contact searching are integrated into a single searching step. The accuracy and efficiency of the newly developed searching algorithm are studied through efficiency tests and FSI problems. It is found that the new algorithm can significantly improve computational efficiency, and it is believed to be a powerful tool for FSI analysis.
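The basic Point-In-Box idea of narrowing candidates along one sorted axis, and its sensitivity to point distribution, can be sketched as follows; this illustrates the general principle only, not Swegle's implementation or the proposed S-PIB algorithm.

```python
import numpy as np

def build_index(points):
    """Sort particle indices along x, as in a Point-In-Box style search."""
    order = np.argsort(points[:, 0])
    return order, points[order, 0]

def points_in_box(points, order, sorted_x, lo, hi):
    """Return indices of points inside the axis-aligned box [lo, hi]."""
    # Binary search narrows candidates to an x-slab ...
    left = np.searchsorted(sorted_x, lo[0], side="left")
    right = np.searchsorted(sorted_x, hi[0], side="right")
    candidates = order[left:right]
    # ... then the remaining coordinates are checked directly. If points cluster
    # into a thin x-slab, this candidate set stays large, which is the distribution
    # sensitivity that the S-PIB variant is said to address.
    inside = np.all((points[candidates] >= lo) & (points[candidates] <= hi), axis=1)
    return candidates[inside]

rng = np.random.default_rng(0)
pts = rng.random((10_000, 3))
order, sx = build_index(pts)
print(len(points_in_box(pts, order, sx, lo=np.array([0.4, 0.4, 0.4]), hi=np.array([0.6, 0.6, 0.6]))))
```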

Relevance: 30.00%

Abstract:

The banking industry is under pressure. In order to compete, banks should adapt by concentrating on specific customer needs, following an outside-in perspective. This paper presents the design of a business model for banks that addresses this development by providing flexible and comprehensive support for retail banking clients. It is demonstrated that the identification of customer processes and the consequent alignment of banking services to those processes offer great potential to increase customer retention in banking. It is also shown that information technology, especially smartphones, can serve as an interface between customers and suppliers to enable an alignment of offerings to customer processes. This approach enables the integration of banks into their customers' lifestyle, creating emotional added value, improving the personal relationship and strengthening the customers' affiliation with the bank. The paper presents the design of such a customer-process-centric smartphone application and derives success factors for its implementation.

Relevance: 30.00%

Abstract:

Passenger experience has become a major factor influencing the success of an airport. In this context, passenger flow simulation has been used in designing and managing airports. However, most passenger flow simulations fail to consider group dynamics when developing passenger flow models. In this paper, an agent-based model is presented to simulate passenger behaviour during the airport check-in and evacuation processes. The simulation results show that passenger behaviour can have a significant influence on the performance and utilisation of services in airport terminals. The model was created using AnyLogic software and its parameters were initialised using recent research data published in the literature.
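A minimal, group-aware queueing sketch of the kind of check-in behaviour described above is shown below; the arrival rate, group-size distribution, and service times are assumed placeholder values, not parameters of the paper's AnyLogic model.

```python
import random

# Minimal sketch of a group-aware check-in queue (illustrative parameters only):
# groups arrive together and are processed as a unit at a single desk.
random.seed(1)
SIM_MINUTES, DESKS, SERVICE_PER_PAX = 180, 4, 1.5   # assumed values

desk_free_at = [0.0] * DESKS
waits, t = [], 0.0
while t < SIM_MINUTES:
    t += random.expovariate(1 / 2.0)                # a group arrives every ~2 minutes
    group_size = random.choice([1, 1, 2, 2, 3, 4])  # groups travel together
    desk = min(range(DESKS), key=lambda d: desk_free_at[d])
    start = max(t, desk_free_at[desk])
    desk_free_at[desk] = start + SERVICE_PER_PAX * group_size
    waits.append(start - t)

print(f"mean wait: {sum(waits) / len(waits):.2f} min over {len(waits)} groups")
```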

Relevance: 30.00%

Abstract:

Metastasis, the passage of primary tumour cells throughout the body via the vascular system and their subsequent proliferation into secondary lesions in distant organs, represents a poor prognosis and is therefore an understandably feared event for cancer patients. Despite considerable advances in cancer diagnosis and treatment, most deaths are the result of metastases resistant to conventional treatment [1]. Rather than being a random process, metastasis involves a series of organised steps leading to the growth of a secondary tumour. Malignant tumours stimulate the production of new vessels by the host, and this process is a prerequisite for the increase in size of a new tumour [2]. Angiogenesis not only permits tumour expansion but also allows the entry of tumour cells into the circulation, and it is probably the most vital event in the metastatic process [3]. Metastasis and angiogenesis [4] have received much attention in recent years. A biological understanding of both phenomena seems to be an urgent priority in the search for effective prevention and treatment of tumour progression. Studies in vitro and in vivo have shown that one of the most important barriers to the passage of malignant cells is the basement membrane. The crossing of such barriers is a vital step in the formation of a metastasis [5].

Relevance: 30.00%

Abstract:

This paper addresses the problem of determining optimal designs for biological process models with intractable likelihoods, with the goal of parameter inference. The Bayesian approach is to choose a design that maximises the mean of a utility, where the utility is a function of the posterior distribution; its estimation therefore requires likelihood evaluations. However, many problems in experimental design involve models with intractable likelihoods, that is, likelihoods that are neither analytically available nor computable in a reasonable amount of time. We propose a novel solution using indirect inference (II), a well-established method in the literature, and the Markov chain Monte Carlo (MCMC) algorithm of Müller et al. (2004). Indirect inference employs an auxiliary model with a tractable likelihood in conjunction with the generative model, the assumed true model of interest, which has an intractable likelihood. Our approach is to estimate a map between the parameters of the generative and auxiliary models, using simulations from the generative model. An II posterior distribution is formed to expedite utility estimation. We also present a modification to the utility that allows the Müller algorithm to sample from a substantially sharpened utility surface, with little computational effort. Unlike competing methods, the II approach can handle complex design problems for models with intractable likelihoods on a continuous design space, with possible extension to many observations. The methodology is demonstrated using two stochastic models: a simple tractable death process used to validate the approach, and a motivating stochastic model for the population evolution of macroparasites.
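The core indirect-inference step, estimating a map from generative to auxiliary parameters using simulations, can be sketched on the simple death process mentioned above; the auxiliary model (slope of log counts over time) and all numbers below are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_death_process(theta, n0=50, t_obs=(1.0, 2.0, 3.0)):
    """Gillespie simulation of a pure death process; returns counts at t_obs."""
    t, n, out = 0.0, n0, []
    times = iter(t_obs)
    next_t = next(times)
    while True:
        dt = rng.exponential(1.0 / (theta * n)) if n > 0 else np.inf
        while t + dt >= next_t:               # observation falls before the next death
            out.append(n)
            nxt = next(times, None)
            if nxt is None:
                return np.array(out, dtype=float)
            next_t = nxt
        t += dt
        n -= 1

def auxiliary_stat(counts, n0=50, t_obs=(1.0, 2.0, 3.0)):
    """Auxiliary model: slope of log-counts over time (a tractable summary)."""
    return np.polyfit(t_obs, np.log(np.maximum(counts, 1.0) / n0), 1)[0]

# Estimate the map theta -> auxiliary parameter from simulations of the generative model.
thetas = np.linspace(0.1, 1.0, 10)
aux = np.array([np.mean([auxiliary_stat(simulate_death_process(th)) for _ in range(50)])
                for th in thetas])
slope, intercept = np.polyfit(thetas, aux, 1)
print(f"fitted map: aux ~ {slope:.2f} * theta + {intercept:.2f}")  # expect slope near -1
```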

Relevance: 30.00%

Abstract:

Robert evaporators in Australian sugar factories are traditionally constructed with 44.45 mm outside diameter stainless steel tubes of approximately 2 m length for all stages of evaporation. There are a few vessels with longer tubes (up to 2.8 m) and with smaller and larger diameters (38.1 and 50.8 mm). Queensland University of Technology is undertaking a study to investigate the heat transfer performance of tubes of different lengths and diameters across the whole range of process conditions typically encountered in the evaporator set. Incorporating these results into practical evaporator designs requires an understanding of the cost implications of constructing evaporator vessels with calandrias having tubes of different dimensions. Cost savings are expected for tubes of smaller diameter and longer length in terms of material, labour and installation costs in the factory. However, these savings must be weighed against the heat transfer area requirements for the evaporation duty, which will likely be a function of the tube dimensions. In this paper a capital cost model is described which provides the relative cost of constructing and installing Robert evaporators of the same heating surface area but with different tube dimensions. Evaporators of 2000, 3000, 4000 and 5000 m² are investigated. This model will be used in conjunction with the heat transfer efficiency data (when available) to determine the optimum tube dimensions for a new evaporator at a specified evaporation duty. Consideration is also given to other factors such as juice residence time (and its implications for sucrose degradation and control) and droplet de-entrainment in evaporators of different tube dimensions.
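The link between heating surface area and tube dimensions is simple geometry (surface area per tube ≈ π d L), which is sketched below for an assumed 4000 m² vessel using the tube sizes mentioned in the abstract; this is illustrative arithmetic only and not the capital cost model described in the paper.

```python
import math

def tube_count(area_m2, diameter_mm, length_m):
    """Tubes needed to provide a given heating surface area (A = N * pi * d * L),
    taking the quoted diameter as the heat transfer diameter for illustration."""
    return math.ceil(area_m2 / (math.pi * (diameter_mm / 1000.0) * length_m))

# Illustrative comparison for a 4000 m2 calandria:
for d_mm, L_m in [(44.45, 2.0), (38.1, 2.8), (50.8, 2.0)]:
    print(f"d = {d_mm} mm, L = {L_m} m -> {tube_count(4000, d_mm, L_m)} tubes")
```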

Relevance: 30.00%

Abstract:

Carbon nanotips have been synthesized from a thin carbon film deposited on silicon by bias-enhanced hot filament chemical vapor deposition under different process parameters. Scanning electron microscopy results indicate that high-quality carbon nanotips can only be obtained under conditions where the ion flux is effectively drawn from the plasma sustained in a CH4 + NH3 + H2 gas mixture. It is shown that the morphology of the carbon nanotips can be controlled by varying process parameters such as the applied bias, gas pressure, and the NH3/H2 mass flow ratio. The nanotip formation process is examined through a model that accounts for surface diffusion, in addition to the sputtering and deposition processes included in existing models. This model makes it possible to explain the major differences in the morphologies of the carbon nanotips formed with and without the aid of the plasma, as well as to interpret the changes in their aspect ratio caused by variations in the ion/gas fluxes. Viable ways to optimize the plasma-based process parameters for synthesizing high-quality carbon nanotips are suggested. The results are relevant to the development of advanced plasma- and ion-assisted methods of nanoscale synthesis and processing.

Relevance: 30.00%

Abstract:

Here we report on an unconventional Ni-P alloy-catalyzed, high-throughput, highly reproducible chemical vapor deposition of ultralong carbon microcoils using an acetylene precursor in the temperature range 700-750 °C. Scanning electron microscopy analysis reveals that the carbon microcoils have a unique double-helix structure and a uniform circular cross-section. It is shown that the double-helix carbon microcoils have outstanding superelastic properties. The microcoils can be extended up to 10-20 times their original coil length and quickly recover their original state once the force is released. A mechanical model of the carbon coils with a large spring index is developed to describe their extension and contraction. Given the initial coil parameters, this mechanical model successfully accounts for the geometric nonlinearity of the spring constants of carbon micro- and nanocoils, and it is found to be in good agreement with the experimental data over the whole stretching process.
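For reference, the classical small-deflection stiffness of a helical coil and the spring index are given by the textbook relation below; the paper's specific nonlinear formulation, which captures the large-extension behaviour, is not reproduced here.

```latex
% Linear (small-deflection) stiffness of a close-coiled helical spring with n active
% turns, wire diameter d, mean coil diameter D, shear modulus G, and spring index C:
\[
  k = \frac{G\,d^{4}}{8\,D^{3}\,n}, \qquad C = \frac{D}{d}
\]
% At the 10-20x extensions reported for the microcoils the pitch angle changes
% substantially during stretching, so k no longer stays constant; the paper's model
% is said to capture this geometric nonlinearity, which the linear formula does not.
```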

Relevance: 30.00%

Abstract:

We have developed a Hierarchical Look-Ahead Trajectory Model (HiLAM) that incorporates the firing pattern of medial entorhinal grid cells in a planning circuit that includes interactions with the hippocampus and prefrontal cortex. We show the model's flexibility in representing large real-world environments using odometry information obtained from challenging video sequences. We acquire the visual data from a camera mounted on a small tele-operated vehicle. The camera has a panoramic field of view with its focal point approximately 5 cm above ground level, similar to what would be expected from a rat's point of view. Using established algorithms for calculating perceptual speed from the apparent rate of visual change over time, we generate raw dead reckoning information, which loses spatial fidelity over time due to error accumulation. We rectify this loss of fidelity by exploiting the loop-closure detection ability of a biologically inspired robot navigation model termed RatSLAM. The rectified motion information serves as a velocity input to HiLAM to encode the environment in the form of grid cell and place cell maps. Finally, we show goal-directed path planning results of HiLAM in two different environments: an indoor square maze used in rodent experiments and an outdoor arena more than two orders of magnitude larger than the indoor maze. Together these results bridge for the first time the gap between higher fidelity bio-inspired navigation models (HiLAM) and more abstracted but highly functional bio-inspired robotic mapping systems (RatSLAM), and move from simulated environments into real-world studies in rodent-sized arenas and beyond.
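The drift-and-correct behaviour described above, dead reckoning accumulating error until a loop closure removes it, can be sketched in a few lines; the noise levels, trajectory, and the linear error-spreading correction are illustrative assumptions and do not reflect the actual HiLAM or RatSLAM implementations.

```python
import numpy as np

rng = np.random.default_rng(7)

# Integrate noisy speed and heading around one assumed circular loop, then remove
# the accumulated drift when a loop closure reveals a return to the starting point.
true_speed, true_turn, steps = 0.1, 2 * np.pi / 400, 400
pose = np.zeros(3)                                          # x, y, heading
path = []
for _ in range(steps):
    v = true_speed + rng.normal(0, 0.005)                   # odometry noise
    w = true_turn + rng.normal(0, 0.002)
    pose[2] += w
    pose[0] += v * np.cos(pose[2])
    pose[1] += v * np.sin(pose[2])
    path.append(pose[:2].copy())

print(f"accumulated drift before correction: {np.linalg.norm(path[-1]):.3f} units")

# Crude loop-closure correction: spread the end-point error linearly along the path.
correction = -np.array(path[-1])
corrected = [p + correction * (i + 1) / steps for i, p in enumerate(path)]
print(f"end-point error after correction: {np.linalg.norm(corrected[-1]):.3f} units")
```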

Relevance: 30.00%

Abstract:

Design Science is the process of solving ‘wicked problems’ through designing, developing, instantiating, and evaluating novel solutions (Hevner, March, Park and Ram, 2004). Wicked problems are described as agent finitude in combination with problem complexity and normative constraint (Farrell and Hooker, 2013). In Information Systems Design Science, determining that problems are ‘wicked’ differentiates Design Science research from Solutions Engineering (Winter, 2008) and is a necessary part of proving relevance to Information Systems Design Science research (Hevner, 2007; Iivari, 2007). Problem complexity is characterised by many problem components with nested, dependent and co-dependent relationships interacting through multiple feedback and feed-forward loops. Farrell and Hooker (2013) specifically state that for wicked problems “it will often be impossible to disentangle the consequences of specific actions from those of other co-occurring interactions”. This paper discusses the application of an Enterprise Information Architecture modelling technique to disentangle the complexity of a wicked problem in one case. It proposes that such a modelling technique can be applied to other wicked problems and can lay the foundations for proving relevance to DSR, provide solution pathways for artefact development, and help substantiate the elements required to produce Design Theory.