891 results for Process Modelling, Process Management, Risk Modelling
Abstract:
This master's thesis presents a study on the requisite cooling of an activated sludge process in the paper and pulp industry. The energy consumption of the paper and pulp industry, and of its wastewater treatment plants in particular, is relatively high, so it is useful to understand the wastewater treatment processes of such industries. The activated sludge process is a biological mechanism that degrades the carbonaceous compounds present in wastewater. The modified activated sludge model constructed here aims to imitate the bio-kinetics of an activated sludge process. However, due to the complicated non-linear behavior of the biological process, modelling this system is laborious and challenging. We first find a system solution using steady-state modelling of Activated Sludge Model No. 1 (ASM1), approached with Euler's method and an ordinary differential equation solver. Furthermore, an enthalpy study of the paper and pulp industry's vital pollutants was carried out and used to estimate the temperature shift over time, from which the operation of the cooling water is formulated. These findings support forecasting of the plant's operation in a cost-effective manner and management of effluent efficiency. The final stage of the thesis was the optimization of the ASM1 steady state.
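As a toy illustration of the steady-state approach the abstract describes, the sketch below applies forward Euler to a heavily simplified two-state (substrate/biomass) surrogate with Monod kinetics, stepping until the derivatives vanish. The equations, parameter values and names are illustrative assumptions, not the 13-component ASM1 used in the thesis.

```python
# Minimal sketch (not the thesis model): forward-Euler integration of a
# simplified substrate/biomass pair with Monod kinetics until both rates
# vanish, i.e. a steady state. All parameter values are placeholders.
MU_MAX = 4.0    # maximum specific growth rate (1/day)
K_S = 10.0      # half-saturation constant (g COD/m^3)
Y = 0.67        # biomass yield (g biomass per g COD)
B = 0.3         # biomass decay rate (1/day)
D = 0.5         # dilution rate (1/day)
S_IN = 300.0    # influent substrate concentration (g COD/m^3)

def rates(s, x):
    """Time derivatives of substrate s and biomass x."""
    growth = MU_MAX * s / (K_S + s) * x
    ds = D * (S_IN - s) - growth / Y   # inflow/outflow minus consumption
    dx = growth - (B + D) * x          # growth minus decay and washout
    return ds, dx

def euler_steady_state(s=S_IN, x=50.0, dt=1e-3, tol=1e-6, max_steps=10_000_000):
    """Step forward with Euler's method until both rates are ~0."""
    for _ in range(max_steps):
        ds, dx = rates(s, x)
        if abs(ds) < tol and abs(dx) < tol:
            return s, x
        s, x = s + dt * ds, x + dt * dx
    raise RuntimeError("did not converge to a steady state")

print(euler_steady_state())  # ~(2.5, 124.6) with these placeholder values
```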
Abstract:
Transmission system operators and distribution system operators are facing new challenges in terms of reliability, power quality, and cost efficiency. Although the potential of energy storage to address those challenges is recognized, the economic implications are still unclear, which introduces risk into the business models. This thesis investigates the technical and economic value indicators of lithium-ion battery energy storage systems (BESS) in grid-scale applications. To that end, a comprehensive performance model of a lithium-ion BESS with estimation of degradation effects is developed. The model development process involves a literature review on lifetime modelling, the reuse and modification of results from previous studies, the construction of the additional system parts, and their integration into a complete tool. The constructed model is capable of describing the dynamic behavior of the BESS voltage, state of charge, temperature and capacity loss. In addition to the model, five control strategies for a BESS unit providing primary frequency regulation are implemented. Questions related to BESS dimensioning and the end-of-life (EoL) criterion are addressed. Simulations are performed with one month of real frequency data acquired from Fingrid. Lifetime and cost-benefit analysis of the simulation results allows the control strategies to be compared and the preferable one determined. Finally, the study performs a sensitivity analysis of economic profitability with respect to system size, EoL criterion and system price. The research reports that a BESS can be profitable in certain cases and presents recommendations.
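A toy sketch of one plausible control strategy of the kind described: a droop law maps the frequency deviation to charge/discharge power, and the state of charge is integrated with a one-way efficiency. The dead band, droop slope, ratings and efficiency below are assumptions for illustration, not the thesis parameters.

```python
from dataclasses import dataclass

@dataclass
class Bess:
    """Toy lithium-ion BESS state (illustrative ratings, not the thesis model)."""
    energy_mwh: float = 1.0    # usable capacity
    power_mw: float = 1.0      # rated power
    soc: float = 0.5           # state of charge, 0..1
    efficiency: float = 0.95   # one-way efficiency

def droop_power(freq_hz, rated_mw, deadband_hz=0.01, full_at_hz=0.1):
    """Droop law: zero inside the dead band, full rated power at +/-0.1 Hz.
    Positive output means discharge (frequency below the 50 Hz nominal)."""
    dev = 50.0 - freq_hz
    if abs(dev) <= deadband_hz:
        return 0.0
    frac = max(-1.0, min(1.0, dev / full_at_hz))
    return frac * rated_mw

def step(bess, freq_hz, dt_h=1.0 / 3600.0):
    """Advance the state of charge by one time step of dt_h hours."""
    p = droop_power(freq_hz, bess.power_mw)
    if p >= 0:  # discharging: the cells supply more than the grid receives
        de = p * dt_h / bess.efficiency
    else:       # charging: losses reduce the energy actually stored
        de = p * dt_h * bess.efficiency
    bess.soc = max(0.0, min(1.0, bess.soc - de / bess.energy_mwh))
    return p

# Example: an hour of slightly low frequency steadily drains the battery.
b = Bess()
for _ in range(3600):
    step(b, 49.97)
print(round(b.soc, 3))  # ~0.184
```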
Abstract:
Agriculture plays a central role in the Earth system. It contributes to the greenhouse effect through emissions of CO2, CH4 and N2O, can cause soil degradation and eutrophication, can alter regional water cycles, and will itself be strongly affected by climate change. Since all these processes are closely linked through the underlying nutrient and water fluxes, they should be treated within a consistent modelling framework. Until recently, however, a lack of data and insufficient process understanding have prevented this at the global scale. This thesis presents the first version of such a consistent global modelling framework, with an emphasis on the simulation of agricultural yields and the resulting N2O emissions. This emphasis was chosen because a correct representation of plant growth is an essential prerequisite for simulating all other processes. Furthermore, current and potential agricultural yields are important driving forces of land-use change and will be strongly affected by climate change. The second emphasis is the estimation of agricultural N2O emissions, since no process-based N2O model has so far been applied at the global scale. The existing agro-ecosystem model Daycent was chosen as the basis for the global modelling. In addition to creating the simulation environment, the required global data sets for soil parameters, climate and agricultural management were first compiled. Since no global database of planting dates is yet available, and since planting dates will shift with climate change, a routine for computing them was developed. Its results agree well with the FAO crop calendars that are available for some crops and countries. The Daycent model was then parameterized and calibrated for yield calculations of wheat, rice, maize, soybean, millet, pulses, potato, cassava and cotton. The simulation results show that Daycent correctly represents the most important climate, soil and management effects on yield formation. Computed country averages agree well with FAO data (R2 = 0.66 for wheat, rice and maize; R2 = 0.32 for soybean), and spatial yield patterns largely correspond to the observed distribution of crops and to subnational statistics. The modelling of agricultural N2O emissions with Daycent was preceded by a statistical analysis of N2O and NO emission measurements from natural and agricultural ecosystems. The parameters identified as significant for N2O (fertilizer amount, soil carbon content, soil pH, texture, crop type, fertilizer type) and for NO (fertilizer amount, soil nitrogen content, climate) largely agree with the results of an earlier analysis. For emissions from soils under natural vegetation, for which no such statistical analysis previously existed, soil carbon content, soil pH, bulk density, drainage and vegetation type have a significant influence on N2O emissions, while NO emissions depend significantly on soil carbon content and vegetation type. Based on the statistical models developed from these data, global emissions from agricultural soils amount to 3.3 Tg N/y for N2O and 1.4 Tg N/y for NO.
Such statistical models are useful for computing estimates and uncertainty ranges of N2O and NO emissions from a large number of measurements. The dynamics of soil nitrogen, influenced in particular by plant growth, climate change and land-use change, can however only be captured by applying process-oriented models. For the modelling of N2O emissions with Daycent, its trace gas module was first extended with a more detailed calculation of nitrification and denitrification and with an account of freeze-thaw emissions. This revised model version was then tested against N2O emission measurements under different climates and crops. Both the dynamics and the totals of N2O emissions are reproduced satisfactorily, with model efficiencies for monthly means between 0.1 and 0.66 for most sites. Based on the revised model version, N2O emissions were computed for the previously parameterized crops. Emission rates and crop-specific differences largely agree with values reported in the literature. Fertilizer-induced emissions, currently estimated by the IPCC at 1.25 +/- 1% of the applied fertilizer amount, range from 0.77% (rice) to 2.76% (maize). The total of the computed emissions from agricultural soils for the mid-1990s amounts to 2.1 Tg N2O-N/y, which agrees with estimates from other studies.
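The fertilizer-induced emission factor cited above (the IPCC's 1.25 +/- 1%) has a simple definition: the extra N2O-N emitted relative to an unfertilized control, expressed as a percentage of the fertilizer N applied. A minimal sketch of that calculation, with hypothetical plot values:

```python
def fertilizer_induced_ef(e_fertilized, e_control, n_applied):
    """Fertilizer-induced emission factor in percent: extra N2O-N emitted
    per unit of fertilizer N applied. All inputs share one unit system,
    e.g. kg N/ha/y."""
    return 100.0 * (e_fertilized - e_control) / n_applied

# Hypothetical plots: 2.5 kg N2O-N/ha/y from a fertilized field,
# 0.9 kg from an unfertilized control, 120 kg fertilizer N/ha applied.
print(fertilizer_induced_ef(2.5, 0.9, 120.0))  # -> ~1.33 percent
```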
Abstract:
The capability for collaboration is a key success factor for networked enterprises. The paper introduces a methodology that supports the application of Enterprise Modelling in order to improve maturity for collaboration. The methodology considers the current status of interoperability maturity in order to deduce the right modelling approach. This approach is combined with quality criteria for the models in order to guide the modelling process. Both the deduced approach and the quality criteria are related to the levels of interoperability proposed by the ATHENA Interoperability Framework.
Abstract:
Formal and analytical risk models prescribe how risk should be incorporated into construction bids. However, the actual process by which contractors and their clients negotiate and agree on price is complex and not clearly articulated in the literature. Using participant observation, the entire tender process was shadowed in two leading UK construction firms. This was compared with the propositions of analytical models and significant differences were found. The 670 hours of work observed across both firms revealed three stages of the bidding process. Bidding activities were categorized and their extent estimated as deskwork (32%), calculations (19%), meetings (14%), documents (13%), off-days (11%), conversations (7%), correspondence (3%) and travel (1%). Risk allowances of 1-2% were priced into some bids, and three tiers of risk apportionment in bids were identified. However, priced risks may sometimes be excluded from the final bidding price to enhance competitiveness. Thus, although risk apportionment affects a contractor's pricing strategy, other complex microeconomic factors also affect price. Instead of being priced in as contingencies, risk was dealt with mostly through contractual rather than price mechanisms, to reflect commercial imperatives. The findings explain why some assumptions underpinning analytical models may not be sustainable in practice, and why what actually happens in practice matters to those who seek to model the pricing of construction bids.
Abstract:
The rate and scale of human-driven changes can exert profound impacts on ecosystems, the species that make them up, and the services they provide that sustain humanity. Given the speed at which these changes are occurring, one of society's major challenges is to coexist within ecosystems and to manage ecosystem services sustainably. The effect of possible scenarios of global change on ecosystem services can be explored using ecosystem models. Such models should adequately represent ecosystem processes above and below the soil surface (aboveground and belowground) and the interactions between them. We explore possibilities for including such interactions in ecosystem models at scales that range from global to local. At the regional to global scale we suggest expanding the plant functional type concept (aggregating plants into groups according to their physiological attributes) to include functional types of aboveground-belowground interactions. At the scale of discrete plant communities, process-based and organism-oriented models could be combined into "hybrid approaches" that include an organism-oriented mechanistic representation of a limited number of trophic interactions within an otherwise process-oriented approach. Under global change, the density and activity of the organisms determining these processes may change non-linearly, so explicit knowledge of the organisms and their responses should ideally be included. At the individual plant scale, a common organism-based conceptual model of aboveground-belowground interactions has emerged. This conceptual model facilitates the formulation of research questions to guide experiments aiming to identify patterns that are common within, but differ between, ecosystem types and biomes. Such experiments inform modelling approaches at larger scales. Future ecosystem models should better include this evolving knowledge of common patterns of aboveground-belowground interactions. Improved ecosystem models are necessary tools for reducing the uncertainty in the information that assists us in the sustainable management of our environment in a changing world.
Abstract:
This paper reviews four approaches used to create rational tools to aid the planning and management of the building design process, and then proposes a fifth approach. The new approach is based on the mechanical aspects of technology rather than on subjective design issues. The knowledge base contains, for each construction technology, a generic model of the detailed design process. Each activity in the process is specified by its input and output information needs. By connecting the input demands of one technology with the output supply of another, a map or network of design activity is formed. It is thus possible to structure a specific model from the generic knowledge base within a knowledge-based engineering (KBE) system.
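The input-output matching can be pictured as a small graph-building routine: the sketch below connects each technology's output supply to other technologies' input demands to form the edges of a design-activity network. The technologies and information items are invented for illustration.

```python
from collections import defaultdict

# Invented technologies: each declares the information it needs (inputs)
# and the information it produces (outputs).
technologies = {
    "structural_frame": {"inputs": {"grid_layout"},
                         "outputs": {"frame_geometry", "load_paths"}},
    "cladding":         {"inputs": {"frame_geometry"},
                         "outputs": {"panel_sizes"}},
    "services":         {"inputs": {"frame_geometry", "load_paths"},
                         "outputs": {"duct_routes"}},
}

def build_design_network(techs):
    """Return (producer, consumer, info item) edges by matching each
    technology's output supply to the input demands of the others."""
    producers = defaultdict(list)
    for name, tech in techs.items():
        for item in tech["outputs"]:
            producers[item].append(name)
    edges = []
    for name, tech in techs.items():
        for item in tech["inputs"]:
            for producer in producers.get(item, []):
                edges.append((producer, name, item))
    return edges

for edge in build_design_network(technologies):
    print(edge)  # e.g. ('structural_frame', 'cladding', 'frame_geometry')
```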
Abstract:
Design management research usually deals with the processes within the professional design team, and yet, in the UK, the volume of the total project information produced by the specialist trade contractors equals or exceeds that produced by the design team. There is a need to understand the scale of this production task and to plan and manage it accordingly. The model of the process on which the plan is based, while generic, must be sufficiently robust to cover the majority of instances. An approach using design elements, in sufficient depth to support the development of tools for a predictive model of the process, is described. The starting point is that each construction element and its components have a generic sequence of design activities. Specific requirements tailor the element's application to the building. There are then the constraints produced by interaction with other elements. The selection of a component within one element may therefore impose a set of constraints that affect the choice of other design elements. A design decision can thus be seen as an interrelated element-constraint-element (ECE) sub-net. To illustrate this approach, the process within precast concrete cladding is used as an example.
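A minimal sketch of the ECE idea, with invented elements, component options and constraints: deciding on a component for one element prunes the admissible options of the elements it constrains.

```python
# Invented example: element -> candidate components.
options = {
    "cladding_panel": ["precast_concrete", "curtain_wall"],
    "fixing":         ["cast_in_socket", "drilled_anchor"],
}

# Each constraint links a decision on one element to the choices it
# still allows on another element (the element-constraint-element link).
constraints = [
    ("cladding_panel", "precast_concrete", "fixing", {"cast_in_socket"}),
    ("cladding_panel", "curtain_wall", "fixing", {"drilled_anchor"}),
]

def propagate(decided_element, chosen, options, constraints):
    """Prune other elements' options that a decision rules out."""
    pruned = {element: list(comps) for element, comps in options.items()}
    for src, value, target, allowed in constraints:
        if src == decided_element and value == chosen:
            pruned[target] = [c for c in pruned[target] if c in allowed]
    return pruned

print(propagate("cladding_panel", "precast_concrete", options, constraints))
# -> {'cladding_panel': [...], 'fixing': ['cast_in_socket']}
```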
Abstract:
Elevated levels of low-density-lipoprotein cholesterol (LDL-C) in the plasma are a well-established risk factor for the development of coronary heart disease. Plasma LDL-C levels are in part determined by the rate at which LDL particles are removed from the bloodstream by hepatic uptake. The uptake of LDL by mammalian liver cells occurs mainly via receptor-mediated endocytosis, a process which entails the binding of these particles to specific receptors in specialised areas of the cell surface, the subsequent internalisation of the receptor-lipoprotein complex, and ultimately the degradation and release of the ingested lipoproteins' constituent parts. We formulate a mathematical model to study the binding and internalisation (endocytosis) of LDL and VLDL particles by hepatocytes in culture. The system of ordinary differential equations, which includes a cholesterol-dependent pit production term representing feedback regulation of surface receptors in response to intracellular cholesterol levels, is analysed using numerical simulations and steady-state analysis. Our numerical results show good agreement with in vitro experimental data describing LDL uptake by cultured hepatocytes following delivery of a single bolus of lipoprotein. Our model is then adapted to reflect the in vivo situation, in which lipoproteins are continuously delivered to the hepatocyte. In this case, our model suggests that competition between LDL and VLDL particles for binding to the pits on the cell surface affects the intracellular cholesterol concentration. In particular, we predict that when low levels of lipoprotein are delivered continuously to the cell surface, more VLDL than LDL occupies the pits, since VLDL are better competitors for receptor binding. VLDL have a cholesterol content comparable to that of LDL particles; however, due to the larger size of VLDL, one pit-bound VLDL particle blocks the binding of several LDLs, and there is a resultant drop in the intracellular cholesterol level. When lipoprotein is delivered continuously at high levels, VLDL particles still out-compete LDL particles for receptor binding, and consequently more VLDL than LDL particles occupy the pits. Although the maximum intracellular cholesterol level is similar for high and low levels of lipoprotein delivery, the maximum is reached more rapidly when the delivery rates are high. The implications of these results for the design of in vitro experiments are discussed.
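The competition effect can be reproduced with a far simpler surrogate than the paper's model: in the sketch below, LDL and VLDL compete for a shared pit capacity, and one VLDL particle occupies several times the capacity of one LDL, so VLDL dominates the bound pool. The rate equations and all parameter values are illustrative assumptions.

```python
# A much-simplified surrogate of the competition mechanism, not the paper's
# model: LDL and VLDL compete for shared pit capacity.
K_ON_LDL, K_ON_VLDL = 0.02, 0.05   # binding rates; VLDL competes better
K_INT = 0.1                        # internalisation rate of bound particles
SIZE_LDL, SIZE_VLDL = 1.0, 4.0     # pit capacity occupied per particle

def simulate(ldl, vldl, capacity=100.0, dt=0.01, t_end=500.0):
    """Forward-Euler run to (near) steady state; returns bound LDL, VLDL."""
    bound_l = bound_v = 0.0
    for _ in range(int(t_end / dt)):
        free = max(0.0, capacity - SIZE_LDL * bound_l - SIZE_VLDL * bound_v)
        d_l = K_ON_LDL * ldl * free - K_INT * bound_l
        d_v = K_ON_VLDL * vldl * free - K_INT * bound_v
        bound_l += dt * d_l
        bound_v += dt * d_v
    return bound_l, bound_v

bl, bv = simulate(ldl=1.0, vldl=1.0)
print(f"bound LDL {bl:.1f}, bound VLDL {bv:.1f}")  # VLDL dominates the pits
```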
Abstract:
Supplier selection has a great impact on supply chain management. The quality of supplier selection also affects the profitability of organisations operating in the supply chain. As suppliers can provide a variety of services and customers demand ever higher quality of service provision, organisations face the challenge of making the right choice of supplier for the right needs. Existing methods for supplier selection, such as data envelopment analysis (DEA) and the analytic hierarchy process (AHP), can automatically select competitive suppliers and decide the winning supplier(s). However, these methods are not capable of determining the right selection criteria, which should be derived from the business strategy. The ontology model described in this paper integrates the strengths of DEA and AHP with new mechanisms that ensure the right supplier is selected by the right criteria for the right customer needs.
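For reference, the AHP component of such a method derives criteria weights from a pairwise comparison matrix, conventionally via its principal eigenvector. The sketch below shows that step in isolation, with hypothetical criteria and comparison values; the DEA and ontology layers are not shown.

```python
import numpy as np

# Hypothetical pairwise comparison matrix over three criteria, using the
# usual reciprocal scale: A[i, j] says how much criterion i matters
# relative to criterion j.
criteria = ["price", "quality", "delivery"]
A = np.array([
    [1.0, 1.0 / 3.0, 2.0],
    [3.0, 1.0,       4.0],
    [0.5, 1.0 / 4.0, 1.0],
])

# Weights = normalized principal eigenvector (Perron vector) of A.
eigenvalues, eigenvectors = np.linalg.eig(A)
principal = eigenvectors[:, np.argmax(eigenvalues.real)].real
weights = principal / principal.sum()

for criterion, weight in zip(criteria, weights):
    print(f"{criterion}: {weight:.2f}")  # quality carries the largest weight
```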
Abstract:
Automatic indexing and retrieval of digital data pose major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system, Dynamic REtrieval Analysis and semantic metadata Management (DREAM), designed to automatically and intelligently index huge repositories of special-effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase, supporting the storage, indexing and retrieval of large data sets of special-effects video clips as an exemplar application domain. This paper provides its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes.
Abstract:
The process by which contractors take account of risk when calculating their bids for construction work is investigated, based on preliminary investigations and case studies in Ghana and the UK. The two countries were chosen, more or less arbitrarily, for the case studies and to test the idea that there are systematic differences between the approaches taken in different places. Clear differences were found in the risk pricing approaches of contractors in the two countries. The differences appeared to emanate from the professional knowledge and competence of the bid team members, company policy, corporate accountability and the business environments in which the contractors operate. Both groups of contractors take account of risk in their estimates. However, risk accountability was found to be higher on the agenda in the tender process of UK contractors: documented more systematically, and assessed and managed more rigorously with input from the whole bid team. Risk accountability takes place at three levels of the tender process and is dictated strongly by market forces and company circumstances.
Modelling sediment supply and transport in the River Lugg: strategies for controlling sediment loads
Abstract:
The River Lugg has particular problems with high sediment loads, which have had detrimental impacts on ecology and fisheries. A new dynamic, process-based model of hydrology and sediments (INCA-SED) has been developed and applied to the River Lugg system using an extensive data set from 1995–2008. The model simulates sediment sources and sinks throughout the catchment and gives a good representation of the sediment response at 22 reaches along the River Lugg. A key question considered in using the model is how to manage sediment sources so that concentrations and bed loads can be reduced in the river system. Altogether, five sediment management scenarios were selected for testing on the River Lugg, including land use change, contour tillage, hedging and buffer strips. Running the model with parameters altered to simulate these five scenarios produced some interesting results. All scenarios achieved some reduction in sediment levels, with the 40% land use change achieving the largest, a 19% reduction. The other scenarios also achieved significant reductions of between 7% and 9%, buffer strips performing best among them at close to 9%. The results suggest that if hedge introduction, contour tillage and buffer strips were all applied, sediment reductions would total 24%, considerably improving the current sediment situation. We present a novel cost-effectiveness analysis of our results in which the percentage of land removed from production is used as the cost function. Given the minimal loss of land associated with contour tillage, hedges and buffer strips, we suggest that these management practices are the most cost-effective combination for reducing sediment loads.
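The cost-effectiveness measure reduces to a simple ratio of sediment reduction to the percentage of land removed from production. In the sketch below, the 19% and 9% reductions come from the abstract, the values for hedges and contour tillage are placeholders within the reported 7-9% range, and all land-removed figures are hypothetical.

```python
# Cost-effectiveness = sediment reduction per percent of land removed from
# production. Land-removed figures are hypothetical placeholders.
scenarios = {
    "40% land use change": {"reduction_pct": 19.0, "land_removed_pct": 40.0},
    "buffer strips":       {"reduction_pct": 9.0,  "land_removed_pct": 1.5},
    "hedge introduction":  {"reduction_pct": 8.0,  "land_removed_pct": 1.0},
    "contour tillage":     {"reduction_pct": 7.0,  "land_removed_pct": 0.5},
}

for name, s in scenarios.items():
    ratio = s["reduction_pct"] / s["land_removed_pct"]
    print(f"{name}: {ratio:.1f}% reduction per % of land removed")
```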
Abstract:
Purpose – While Freeman's stakeholder management approach has attracted much attention from both scholars and practitioners, little empirical work has considered the interconnectedness of organisational perspectives and stakeholder perspectives. The purpose of this paper is to respond to this gap by developing and empirically testing a bi-directional model of organisation/stakeholder relationships. Design/methodology/approach – A conceptual framework is developed that integrates how stakeholders are affected by organisations with how they affect organisations. Quantitative data relating to both sides of the relationship are obtained from 700 customers of a European service organisation and analysed using the partial least squares structural equation modelling (PLS-SEM) technique. Findings – The findings provide empirical support for the notion of mutual dependency between organisations and stakeholders advocated by stakeholder theorists. The results suggest that the way stakeholders relate to organisations depends on how organisations relate to stakeholders. Originality/value – The study is original on two fronts: first, it provides a framework and process that researchers can use to model bi-directional relationships with other stakeholder groups and in different contexts; second, it presents an example application of bi-directional research by empirically linking organisational and stakeholder expectations in the case of customers of a UK service organisation.