868 results for process modelling
Abstract:
Purpose – The purpose of this paper is to outline a seven-phase simulation conceptual modelling procedure that incorporates existing practice and embeds a process reference model (i.e. SCOR). Design/methodology/approach – An extensive review of the simulation and SCM literature identifies a set of requirements for a domain-specific conceptual modelling procedure. The associated design issues for each requirement are discussed and the utility of SCOR in the process of conceptual modelling is demonstrated using two development cases. Ten key concepts are synthesised and aligned to a general process for conceptual modelling. Further work is outlined to detail, refine and test the procedure with different process reference models in different industrial contexts. Findings – Simulation conceptual modelling is often regarded as the most important yet least understood aspect of a simulation project (Robinson, 2008a). Even today, there has been little research into guidelines to aid in the creation of a conceptual model. Design issues are discussed for building an ‘effective’ conceptual model and the domain-specific requirements for modelling supply chains are addressed. The ten key concepts are incorporated to aid in describing the supply chain problem (i.e. components and relationships that need to be included in the model), model content (i.e. rules for determining the simplest model boundary and level of detail to implement the model) and model validation. Originality/value – The paper addresses Robinson's (2008a) call for research in defining and developing new approaches for conceptual modelling and Manuj et al.'s (2009) discussion on improving the rigour of simulation studies in SCM. It is expected that more detailed guidelines will yield benefits to both expert (i.e. averting typical modelling failures) and novice modellers (i.e. guided practice; less reliance on hopeful intuition).
Abstract:
Queuing is one of the very important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most of the studies regarding patient flow systems had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor, and pharmacy. The DEA results indicated that the waiting times and other related queuing variables need considerable minimisation at both stages.
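As a rough illustration of the kind of linear program behind a DEA efficiency score, the sketch below implements a basic input-oriented CCR model on made-up queuing data (mean wait, queue length, patients served); it is not the study's model, data or orientation.

```python
# Minimal input-oriented CCR DEA sketch on hypothetical queuing data
# (not the study's actual model or dataset).
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Return the CCR input-oriented efficiency score for every DMU.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    _, s = Y.shape
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical DMUs (e.g. clinic sessions):
# inputs = [mean wait (min), queue length], output = [patients served].
X = np.array([[35, 12], [20, 8], [50, 20], [25, 9]], dtype=float)
Y = np.array([[60], [55], [70], [52]], dtype=float)
print(dea_ccr_input(X, Y))   # efficiency scores in (0, 1]
```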
Abstract:
The sharing of product and process information plays a central role in coordinating supply chain operations and is a key driver for their success. "Linked pedigrees" (linked datasets that encapsulate event-based traceability information about artifacts as they move along the supply chain) provide a scalable mechanism to record and facilitate the sharing of track-and-trace knowledge among supply chain partners. In this paper we present "OntoPedigree", a content ontology design pattern for the representation of linked pedigrees, which can be specialised and extended to define domain-specific traceability ontologies. Events captured within the pedigrees are specified using EPCIS, a GS1 standard for the specification of traceability information within and across enterprises, while certification information is described using PROV, a vocabulary for modelling provenance of resources. We exemplify the utility of OntoPedigree in linked pedigrees generated for supply chains within the perishable goods and pharmaceuticals sectors.
Abstract:
Popular dimension reduction and visualisation algorithms, for instance Metric Multidimensional Scaling, t-distributed Stochastic Neighbour Embedding and the Gaussian Process Latent Variable Model, rely on the assumption that input dissimilarities are Euclidean. It is well known that this assumption does not hold for most datasets, and high-dimensional data often lies on a manifold of unknown global geometry. We present a method for improving the manifold charting process, coupled with Elastic MDS, such that we no longer assume that the manifold is Euclidean or of any particular structure. We draw on the benefits of different dissimilarity measures, allowing their relative responsibilities, under a linear combination, to drive the visualisation process.
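A simplified sketch of the general idea of combining dissimilarity measures (not the authors' Elastic MDS, and with a fixed weight in place of the learned responsibilities): two dissimilarity matrices, one Euclidean and one graph-geodesic, are mixed linearly and embedded with standard metric MDS.

```python
# Simplified illustration: embed points from a weighted linear combination
# of two dissimilarity measures; the fixed weight w stands in for the
# "relative responsibilities" that the paper infers rather than fixes.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.neighbors import kneighbors_graph
from sklearn.manifold import MDS
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import squareform, pdist

X, _ = make_swiss_roll(n_samples=300, random_state=0)

# Dissimilarity 1: plain Euclidean distances.
D_euc = squareform(pdist(X))

# Dissimilarity 2: geodesic distances along a k-nearest-neighbour graph.
graph = kneighbors_graph(X, n_neighbors=10, mode="distance")
D_geo = shortest_path(graph, method="D", directed=False)

# Linear combination with a fixed weight (assumed for illustration).
w = 0.3
D = w * D_euc + (1 - w) * D_geo

embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(D)
print(embedding.shape)  # (300, 2)
```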
Abstract:
Nowadays, driven by regulation and internal motivations, financial institutions pay closer attention to their risks. Besides the previously dominant market and credit risks, the new trend is to handle operational risk systematically. Operational risk is the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. First we present the basic features of operational risk and its modelling and regulatory approaches, and then we analyse operational risk within a simulation model framework of our own development. Our approach is based on the analysis of the latent risk process instead of the manifest risk process that is widely popular in the risk literature. In our model the latent risk process is a stochastic, mean-reverting process, the so-called Ornstein-Uhlenbeck process. Within the model framework we define a catastrophe as the breach of a critical barrier by the process. We analyse the distributions of catastrophe frequency, severity and first hitting time, not only for a single process but for a dual process as well. Based on our first results we could not falsify the Poisson character of the frequency or the long-tailed character of the severity; the distribution of the first hitting time requires more sophisticated analysis. At the end of the paper we examine the advantages of simulation-based forecasting, and we conclude with possible directions for further research.
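A minimal simulation sketch of the latent-risk idea, with assumed parameter values rather than the authors' calibration: an Ornstein-Uhlenbeck process is simulated with an Euler-Maruyama scheme, each up-crossing of a critical barrier is counted as a catastrophe, and the frequency, severity and first hitting time are recorded.

```python
# Minimal sketch of the latent-risk process idea with assumed parameters:
# simulate a mean-reverting Ornstein-Uhlenbeck process and treat each breach
# of a critical barrier as a "catastrophe".
import numpy as np

rng = np.random.default_rng(42)

kappa, mu, sigma = 2.0, 0.0, 1.0   # mean-reversion speed, level, volatility (assumed)
barrier = 1.0                      # critical barrier (assumed)
dt, horizon, n_paths = 1 / 250, 1.0, 2000
n_steps = int(horizon / dt)

frequencies, severities, first_hits = [], [], []
for _ in range(n_paths):
    x, above = mu, False
    n_breach, first_hit = 0, np.nan
    for step in range(n_steps):
        x += kappa * (mu - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if x > barrier and not above:
            n_breach += 1                      # a new catastrophe (up-crossing)
            severities.append(x - barrier)     # overshoot at the moment of breach
            if np.isnan(first_hit):
                first_hit = (step + 1) * dt
        above = x > barrier
    frequencies.append(n_breach)
    first_hits.append(first_hit)

print("mean catastrophe frequency per year:", np.mean(frequencies))
print("mean severity (overshoot):", np.mean(severities) if severities else 0.0)
print("mean first hitting time (hit paths only):", np.nanmean(first_hits))
```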
Abstract:
The carbon-dioxide emissions of a gas-fuelled power generator subject to compliance under the EU Emissions Trading System are modelled using a real-options model for four underlying instruments (peak and off-peak electricity prices, the gas price and the emission quota). The profit-maximising power plant operates and emits only if the margin realisable on the electricity produced is positive. The aggregate future emissions therefore correspond to a sum of European binary spread options. Within the model, the expected value and the probability-density function of emissions can be estimated; from the latter, the Value at Risk of the emission-quota position can be derived, which gives the cost of meeting the plant's compliance obligation at a given confidence level. In the stochastic model the underlying instruments follow geometric Ornstein-Uhlenbeck processes, fitted to publicly available price data from the German energy exchange (EEX). Using the simulation model, the effects of ceteris paribus changes in various technological and market factors on the emissions level and the cost of compliance (Value at Risk) are analysed.
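A simplified Monte Carlo sketch of the binary-spread-option view of emissions, with assumed (not EEX-calibrated) parameters: prices follow geometric Ornstein-Uhlenbeck processes, the plant emits a fixed amount on each day the spread is positive, and a quantile of the resulting emission distribution plays the role of the Value at Risk figure.

```python
# Simplified Monte Carlo sketch with assumed plant and price parameters
# (not the paper's calibrated EEX model).
import numpy as np

rng = np.random.default_rng(0)

def gou_paths(x0, kappa, mu, sigma, n_steps, n_paths, dt):
    """Simulate geometric OU: the log-price follows an OU process."""
    logx = np.full(n_paths, np.log(x0))
    out = np.empty((n_steps, n_paths))
    for t in range(n_steps):
        logx += kappa * (mu - logx) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        out[t] = np.exp(logx)
    return out

n_days, n_paths, dt = 250, 5000, 1 / 250
power = gou_paths(60.0, 3.0, np.log(60.0), 0.4, n_days, n_paths, dt)   # EUR/MWh (assumed)
gas   = gou_paths(25.0, 3.0, np.log(25.0), 0.3, n_days, n_paths, dt)   # EUR/MWh_th (assumed)
quota = gou_paths(15.0, 1.0, np.log(15.0), 0.5, n_days, n_paths, dt)   # EUR/tCO2 (assumed)

heat_rate, emission_factor, daily_output = 2.0, 0.4, 1000.0  # assumed plant characteristics
spread = power - heat_rate * gas - emission_factor * quota   # clean spark spread per MWh
runs = spread > 0.0                                          # produce only if the spread is positive

emissions = (runs * emission_factor * daily_output).sum(axis=0)   # tCO2 per simulated year
print("expected annual emissions (t):", emissions.mean())
print("95% quantile of emissions (t):", np.quantile(emissions, 0.95))
```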
Abstract:
The shallow-water configuration of the Gulf of Trieste allows the stress due to wind and waves to propagate along the whole water column down to the bottom. When the stress exceeds a particular threshold, it produces resuspension of the benthic detritus. The benthic sediments in the North Adriatic are rich in organic matter, transported there by many rivers. This biologically active particulate matter, while it remains in the water, can be transported throughout the Adriatic basin by the basin-wide circulation. In this work a first implementation of a resuspension/deposition submodel within the oceanographic coupled physical-biogeochemical one-dimensional numerical model POM-BFM is presented. At first only the climatological wind stress forcing was considered; next an annual cycle of wave motion was introduced at the surface; and finally some exceptional wave events were imposed in different periods of the year. The results show a strong relationship between the efficiency of the resuspension process and the stratification of the water column. During summer the strong stratification can confine a large quantity of suspended matter near the bottom, while during winter even a low concentration of particulate can reach the surface and remain in the water for several months without settling, thereby influencing the biogeochemical system. Regarding the biological effects, the organic particulate injected into the water column allows a sudden growth of the pelagic bacteria, which compete with the phytoplankton for nutrients and strongly inhibit its growth. This happens especially during summer, when the suspended benthic detritus concentration is greater.
Abstract:
Atomisation of an aqueous solution for tablet film coating is a complex process with multiple factors determining droplet formation and properties. The importance of droplet size for an efficient process and a high-quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogeneous coatings whilst simultaneously avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film-coat characteristics was investigated using X-ray microcomputed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality-by-design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for production of droplets at a small (20 μm) and large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction and the volume median diameter taken as a response. DOE yielded information on the relationship that three critical process parameters (pump rate, atomisation pressure and coating-polymer concentration) had with droplet size. The model generated was robust, scoring highly for model fit (R2 = 0.977), predictability (Q2 = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised bed coating of tablet cores was performed with either small or large droplets, followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence revealing that smaller droplets formed thinner, more uniform and less porous film coats.
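For illustration only, the sketch below fits a quadratic response-surface model with an interaction term on synthetic data (not the study's measurements) and reports the R2 of the fit; parameter names and ranges are assumed.

```python
# Illustrative quadratic response-surface fit on synthetic data: droplet size
# as a function of pump rate, atomisation pressure and polymer concentration,
# including a pump-rate x pressure interaction of the kind the abstract reports.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 40
pump = rng.uniform(2, 10, n)        # pump rate (assumed range, g/min)
pressure = rng.uniform(1, 3, n)     # atomisation pressure (assumed range, bar)
conc = rng.uniform(5, 15, n)        # polymer concentration (assumed range, % w/w)

# Synthetic response: size shrinks with pressure, grows with pump rate and
# concentration, with a pump x pressure interaction plus measurement noise.
droplet = (60 + 4 * pump - 15 * pressure + 1.5 * conc
           - 1.2 * pump * pressure + 2 * (pressure - 2) ** 2
           + rng.normal(0, 2, n))

X = np.column_stack([pump, pressure, conc])
X_quad = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
model = LinearRegression().fit(X_quad, droplet)
print("R^2:", r2_score(droplet, model.predict(X_quad)))
```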
Abstract:
Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels, heat pumps, etc., as they have potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, the requirement for maintenance management also increases. The standard routine for building maintenance is inspection, which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which carry costs in terms of component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes; these processes determine the limits based on the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger; a heat pump; and a set of bearings. The identified degradation points for each case study, from a PF, a GP and a hybrid (PF and GP combined) DbM implementation, are assessed against known degradation points. The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components which are in continuous operation. For components which have seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which are a combination of GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from BSC operation.
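A minimal sketch of the Gaussian Process half of such a DbM scheme, on synthetic data and with an assumed degradation limit (the thesis' case-study components and tuned models are not reproduced): a GP is fitted to a degradation metric and maintenance is flagged when the predicted metric, plus its uncertainty band, crosses the limit.

```python
# Minimal Gaussian-Process degradation-tracking sketch with synthetic data
# and an assumed degradation limit (not the thesis' case studies).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
t = np.linspace(0, 100, 80)[:, None]                 # operating time (days)
metric = 0.02 * t.ravel() + 0.3 * np.sin(t.ravel() / 5) + rng.normal(0, 0.1, 80)

kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, metric)

t_future = np.linspace(0, 150, 300)[:, None]
mean, std = gp.predict(t_future, return_std=True)

limit = 2.0                                          # assumed degradation limit
crossed = t_future[mean + 2 * std > limit]           # flag upper band crossing the limit
if crossed.size:
    print("schedule maintenance around day", float(crossed[0]))
else:
    print("no predicted limit crossing within the horizon")
```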
Abstract:
The distribution, abundance, behaviour, and morphology of marine species is affected by spatial variability in the wave environment. Maps of wave metrics (e.g. significant wave height Hs, peak energy wave period Tp, and benthic wave orbital velocity URMS) are therefore useful for predictive ecological models of marine species and ecosystems. A number of techniques are available to generate maps of wave metrics, with varying levels of complexity in terms of input data requirements, operator knowledge, and computation time. Relatively simple "fetch-based" models are generated using geographic information system (GIS) layers of bathymetry and dominant wind speed and direction. More complex, but computationally expensive, "process-based" models are generated using numerical models such as the Simulating Waves Nearshore (SWAN) model. We generated maps of wave metrics based on both fetch-based and process-based models and asked whether predictive performance in models of benthic marine habitats differed. Predictive models of seagrass distribution for Moreton Bay, Southeast Queensland, and Lizard Island, Great Barrier Reef, Australia, were generated using maps based on each type of wave model. For Lizard Island, performance of the process-based wave maps was significantly better for describing the presence of seagrass, based on Hs, Tp, and URMS. Conversely, for the predictive model of seagrass in Moreton Bay, based on benthic light availability and Hs, there was no difference in performance using the maps of the different wave metrics. For predictive models where wave metrics are the dominant factor determining ecological processes it is recommended that process-based models be used. Our results suggest that for models where wave metrics provide secondarily useful information, either fetch- or process-based models may be equally useful.
Abstract:
Geomorphic process units have been derived in order to allow quantification via GIS techniques at a catchment scale. Mass movement rates based on existing field measurements are employed in the budget calculations. In the Kärkevagge catchment, Northern Sweden, 80% of the area can be identified either as a source area for sediments or as a zone where sediments are deposited. The overall budget for the slopes beneath the rockwalls in the Kärkevagge is approximately 680 t/a, whilst about 150 t/a are transported into the fluvial system.
Abstract:
This paper aims to present a state-of-the-art review of the scope and practical implications of the Building Information Modelling (BIM) platform in UK construction practice. Theoretical developments suggest that BIM is an integration of both product and process innovation, not just a disparate set of software tools. BIM provides effective collaboration, visual representation and data management, which enable the smooth flow of information throughout the project's lifecycle. The most frequently reported benefits relate to capital costs (capex), operational costs (opex) and time savings. Key challenges, however, centre on the interoperability of software, capital installation costs, in-house experience, client preference, and cultural issues within design teams and within the organisation. The paper concludes with a critical commentary on the changing roles and the process required to implement BIM in UK construction projects, and suggests areas for further research.
Abstract:
Due to the variability and stochastic nature of wind power systems, accurate wind power forecasting has an important role in developing reliable and economic power system operation and control strategies. As wind variability is stochastic, Gaussian Process regression has recently been introduced to capture the randomness of wind energy. However, the disadvantages of Gaussian Process regression include its computational complexity and its inability to adapt to time-varying time-series systems. A variant Gaussian Process for time series forecasting is introduced in this study to address these issues. This new method is shown to be capable of reducing computational complexity and increasing prediction accuracy. It is further proved that the forecasting result converges as the number of available data points approaches infinity. Further, a teaching learning based optimization (TLBO) method is used to train the model and to accelerate the learning rate. The proposed modelling and optimization method is applied to forecast both the wind power generation of Ireland and that from a single wind farm to show the effectiveness of the proposed method.
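An illustrative autoregressive Gaussian Process forecast on synthetic wind power data; the paper's variant GP and its TLBO-based training are not reproduced here, and standard marginal-likelihood kernel fitting is used instead.

```python
# Illustrative autoregressive GP forecast on synthetic wind power data
# (standard sklearn GP, not the paper's variant GP or TLBO training).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(7)
n = 300
power = 50 + 20 * np.sin(np.arange(n) / 12) + rng.normal(0, 3, n)   # synthetic MW series

# Build lagged features: predict p(t) from the previous `lags` observations.
lags = 6
X = np.column_stack([power[i:n - lags + i] for i in range(lags)])
y = power[lags:]

split = int(0.8 * len(y))
kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[:split], y[:split])

y_hat, y_std = gp.predict(X[split:], return_std=True)
rmse = np.sqrt(np.mean((y_hat - y[split:]) ** 2))
print("one-step-ahead RMSE (MW):", rmse)
```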
Abstract:
The stretch blow moulding (SBM) process is the main method for the mass production of PET containers, and understanding the constitutive behaviour of PET during this process is critical for designing the optimum product and process. However, owing to its nonlinear viscoelastic behaviour, PET is highly sensitive to its thermomechanical history, which makes modelling its constitutive behaviour complex. This means that a constitutive model will be useful only if it is known to be valid under the actual conditions of interest to the SBM process. The aim of this work was to develop a new material characterization method providing new data on the deformation behaviour of PET relevant to the SBM process. In order to achieve this goal, a reliable and robust characterization method was developed, based on an instrumented stretch rod and a digital image correlation system, to determine the stress-strain relationship of the material in deforming preforms during free stretch-blow tests. The effect of preform temperature and air mass flow rate on the deformation behaviour of PET was also investigated.
Abstract:
This paper details the results from a large European Union rotomoulding research project on the adaptation and development of industrial microwave oven technology for the rotational moulding process. Following computer modelling, an industrial-scale microwave oven was specifically designed, manufactured and attached to the drop arm of a conventional rotational moulding machine, where extensive moulding trials were carried out. The design and development of the microwave oven and test mould, together with the achieved savings in energy efficiency and improvements in mould heating rate, are discussed.