987 results for 280111 Conceptual Modelling
Abstract:
The increasing share of electricity generated from renewable energy sources has set high load-control requirements for power grid balancing markets. The essential grid balance between electricity consumption and generation is currently hard to achieve economically with new-generation solutions. Conventional combustion power generation is therefore examined in this thesis as a solution to this issue. Circulating fluidized bed (CFB) technology is known to have sufficient scale to act as a large grid-balancing unit. Although the load change rate of a CFB unit is moderately high, a supplementary repowering solution is evaluated in this thesis to maximize load change capability. The repowering heat duty is delivered to the CFB feed water preheating section by a smaller gas turbine (GT) unit. Consequently, steam extraction preheating can be decreased and a large amount of the gas turbine exhaust heat can be utilized in the CFB process to reach maximum plant electrical efficiency. Earlier studies of repowering have focused on efficiency improvements and retrofitting to maximize plant electrical output. This study, however, presents the CFB load change improvements achievable with supplementary GT heat. The repowering study is prefaced with a literature and theory review of both processes to maximize the accuracy of the research. Both dynamic and steady-state simulations, carried out with the APROS simulation tool, are used to evaluate the effects of repowering on CFB unit operation. Finally, a conceptual-level analysis compares repowered plant performance with state-of-the-art CFB performance. The simulations show that repowering yields considerable improvements in the CFB process parameters. The results thus indicate that higher ramp rates can be achieved with repowered CFB technology, making the plant better suited to grid balancing markets.
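As a rough illustration of the energy balance behind this repowering scheme, the sketch below estimates the electrical output and efficiency of a GT-repowered CFB unit in which GT exhaust heat displaces steam extraction for feed water preheating. All function names, parameter values, and the assumed GT efficiency are illustrative assumptions, not figures from the thesis.

```python
# Minimal sketch of the repowering energy balance, under assumed parameters.

def repowered_output(q_fuel_cfb, p_st, p_gt, q_gt_exhaust, f_recovered):
    """Electrical output [MW] and net efficiency of a GT-repowered CFB unit.

    q_fuel_cfb   : CFB fuel heat input [MW]
    p_st         : steam turbine output without repowering [MW]
    p_gt         : gas turbine electrical output [MW]
    q_gt_exhaust : GT exhaust heat delivered to feed water preheating [MW]
    f_recovered  : assumed fraction of the displaced extraction steam heat
                   converted to extra steam turbine work
    """
    p_st_repowered = p_st + f_recovered * q_gt_exhaust
    p_total = p_st_repowered + p_gt
    q_fuel_gt = p_gt / 0.38          # assumed GT electrical efficiency of 38 %
    eta = p_total / (q_fuel_cfb + q_fuel_gt)
    return p_total, eta

if __name__ == "__main__":
    p, eta = repowered_output(q_fuel_cfb=550.0, p_st=250.0, p_gt=50.0,
                              q_gt_exhaust=80.0, f_recovered=0.3)
    print(f"output = {p:.1f} MW, net efficiency = {eta:.1%}")
```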
Abstract:
Sharing information with those who need it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for retrieval and discovery. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study based on an election database of geographically distributed constituencies provided the insights required. This is manifested in the Election Counting and Reporting Software (ECRS) System, an essentially distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them liable to collapse when minor faults occur. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at different tiers. The results of the experiment conducted and their analysis show that such an architecture helps to keep the different components of the software insulated from internal or external faults. The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. Another empirical study examined which of the two prominent markup languages, HTML and XML, is better suited to the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken; the result was in favor of XML. The concepts of the infotron and the infotron dictionary were applied to implement an Information Discovery System (IDS). IDS is a system that starts with the infotron(s) supplied as clue(s) and brews the information required to satisfy the information discoverer's need from the documents available at its disposal (the information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interacted with multiple infotron dictionaries maintained in the system. To demonstrate the working of IDS and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed.
IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, providing an information discovery service. IDLIS demonstrates IDS in action and shows that any legacy system can be augmented effectively with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are covered.
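A highly simplified sketch of the infotron-driven lookup idea: an "infotron dictionary" maps conceptual clues to documents in the information space, and discovery intersects the document sets of the supplied clues. The dictionary structure, document names, and matching rule are assumptions for illustration, not the thesis's actual design.

```python
# Toy infotron-dictionary lookup; all names and data are hypothetical.

infotron_dictionary = {
    "constituency": ["results_north.xml", "results_south.xml"],
    "turnout":      ["results_north.xml", "statutory_report.xml"],
}

def discover(clues, dictionary):
    """Return the documents matched by every supplied infotron clue."""
    doc_sets = [set(dictionary.get(clue, [])) for clue in clues]
    return sorted(set.intersection(*doc_sets)) if doc_sets else []

print(discover(["constituency", "turnout"], infotron_dictionary))
# -> ['results_north.xml']
```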
Abstract:
Across Europe, elevated phosphorus (P) concentrations in lowland rivers have made them particularly susceptible to eutrophication. This is compounded in southern and central UK by increasing pressures on water resources, which may be further enhanced by the potential effects of climate change. The EU Water Framework Directive requires an integrated approach to water resources management at the catchment scale and highlights the need for modelling tools that can distinguish relative contributions from multiple nutrient sources and are consistent with the information content of the available data. Two such models are introduced and evaluated within a stochastic framework using daily flow and total phosphorus concentrations recorded in a clay catchment typical of many areas of the lowland UK. Both models disaggregate empirical annual load estimates, derived from land use data, as a function of surface/near surface runoff, generated using a simple conceptual rainfall-runoff model. Estimates of the daily load from agricultural land, together with those from baseflow and point sources, feed into an in-stream routing algorithm. The first model assumes constant concentrations in runoff via surface/near surface pathways and incorporates an additional P store in the river-bed sediments, depleted above a critical discharge, to explicitly simulate resuspension. The second model, which is simpler, simulates P concentrations as a function of surface/near surface runoff, thus emphasising the influence of non-point source loads during flow peaks and mixing of baseflow and point sources during low flows. The temporal consistency of parameter estimates and thus the suitability of each approach is assessed dynamically following a new approach based on Monte-Carlo analysis.
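As a sketch of the simpler of the two models, the snippet below computes a daily in-stream total phosphorus (TP) concentration as a flow-weighted mixture of baseflow, point sources, and a runoff-dependent non-point contribution. Parameter names and values are assumptions for illustration, not those calibrated in the paper.

```python
# Flow-weighted TP mixing sketch under assumed parameters.

def tp_concentration(q_runoff, q_base, c_base, load_point, k_runoff):
    """Daily in-stream TP concentration [mg/l].

    q_runoff   : surface/near-surface runoff [m3/s]
    q_base     : baseflow [m3/s]
    c_base     : baseflow TP concentration [mg/l]
    load_point : point-source TP load [g/s]
    k_runoff   : coefficient scaling runoff TP concentration with runoff
    """
    q_total = q_runoff + q_base
    c_runoff = k_runoff * q_runoff                 # non-point TP rises with flow peaks
    load = q_runoff * c_runoff + q_base * c_base + load_point   # total load [g/s]
    return load / q_total                          # g/s over m3/s = g/m3 = mg/l

print(f"TP = {tp_concentration(2.0, 0.5, 0.08, 1.5, 0.05):.2f} mg/l")
```

During low flows the baseflow and point-source terms dominate, and during flow peaks the runoff term does, mirroring the mixing behaviour the abstract describes.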
Abstract:
The aim of this work was to couple a nitrogen (N) sub-model to existing lumped (LU4-N) and semi-distributed (LU4-R-N and SD4-R-N) conceptual hydrological models, to improve our understanding of the factors and processes controlling nitrogen cycling and losses in Mediterranean catchments. The N model adopted provides a simplified conceptualization of the soil nitrogen cycle, considering mineralization, nitrification, immobilization, denitrification, plant uptake, and ammonium adsorption/desorption. It also includes nitrification and denitrification in the shallow perched aquifer. We included a soil moisture threshold for all the considered soil biological processes. The results suggested that all the nitrogen processes were highly influenced by rain episodes and that soil microbial processes occurred in pulses stimulated by increases in soil moisture after rain. Our simulations highlighted the riparian zone as a possible source of nitrate, especially after the summer drought period, but also as a potentially important sink of nitrate due to denitrification, particularly during the wettest period of the year. The riparian zone was a key element in simulating catchment nitrate behaviour. The lumped LU4-N model (which does not include the riparian zone) could not be validated, while both the semi-distributed LU4-R-N and SD4-R-N models (which include it) gave satisfactory results in calibration and acceptable results in temporal validation.
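A soil-moisture threshold for a biological process might look like the following sketch, in which mineralisation is inactive below a critical moisture content and pulses as moisture rises after rain. The functional form and all parameter values are assumptions, not the paper's formulation.

```python
# Threshold-controlled N mineralisation sketch; parameters are hypothetical.

def mineralisation_rate(theta, theta_crit, n_org, k_min):
    """Daily N mineralisation [kg N/ha/day].

    theta      : soil moisture content [-]
    theta_crit : moisture threshold below which the process is inactive [-]
    n_org      : organic N pool [kg N/ha]
    k_min      : first-order rate constant [1/day]
    """
    if theta <= theta_crit:
        return 0.0                        # dry soil: microbial process shut off
    f_moisture = (theta - theta_crit) / (1.0 - theta_crit)
    return k_min * f_moisture * n_org     # pulses as theta rises after rain

for theta in (0.10, 0.25, 0.40):
    print(theta, round(mineralisation_rate(theta, 0.15, 2000.0, 0.001), 3))
```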
Abstract:
The rate and scale of human-driven changes can exert profound impacts on ecosystems, the species that make them up and the services they provide that sustain humanity. Given the speed at which these changes are occurring, one of society's major challenges is to coexist within ecosystems and to manage ecosystem services in a sustainable way. The effect of possible scenarios of global change on ecosystem services can be explored using ecosystem models. Such models should adequately represent ecosystem processes above and below the soil surface (aboveground and belowground) and the interactions between them. We explore possibilities to include such interactions in ecosystem models at scales that range from global to local. At the regional to global scale we suggest expanding the plant functional type concept (aggregating plants into groups according to their physiological attributes) to include functional types of aboveground-belowground interactions. At the scale of discrete plant communities, process-based and organism-oriented models could be combined into "hybrid approaches" that include organism-oriented mechanistic representation of a limited number of trophic interactions in an otherwise process-oriented approach. Under global change the density and activity of the organisms driving these processes may change non-linearly, so explicit knowledge of the organisms and their responses should ideally be included. At the individual plant scale a common organism-based conceptual model of aboveground-belowground interactions has emerged. This conceptual model facilitates the formulation of research questions to guide experiments aiming to identify patterns that are common within, but differ between, ecosystem types and biomes. Such experiments inform modelling approaches at larger scales. Future ecosystem models should better include this evolving knowledge of common patterns of aboveground-belowground interactions. Improved ecosystem models are necessary tools to reduce the uncertainty in the information that assists us in the sustainable management of our environment in a changing world.
Abstract:
Many G protein-coupled receptors have been shown to exist as oligomers, but the oligomerization state and its effects on receptor function are unclear. For some G protein-coupled receptors, different radioligands provide different maximal binding capacities in ligand binding assays. Here we have developed mathematical models for co-expressed dimeric and tetrameric species of receptors. We have considered models in which the dimers and tetramers are in equilibrium and in which they do not interconvert, and we have also considered the potential influence of the ligands on the degree of oligomerization. By analogy with agonist efficacy, we have considered ligands that promote, inhibit or have no effect on oligomerization. Cell surface receptor expression and the intrinsic capacity of receptors to oligomerize are quantitative parameters of the equations. The models can account for differences in the maximal binding capacities of radioligands in different preparations of receptors and provide a conceptual framework for simulation and data fitting in complex oligomeric receptor situations.
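A minimal numeric sketch of one such model ingredient: co-expressed dimers (D) and tetramers (T) at equilibrium, 2D <-> T with association constant K. Solving the conservation equation gives the species concentrations, from which different radioligand maximal capacities follow if ligands label different numbers of sites per species. The binding-site assignments and numbers are assumptions for illustration, not the paper's equations.

```python
# Dimer/tetramer equilibrium sketch under assumed parameters.
import math

def species(r_total, k_assoc):
    """Dimer (D) and tetramer (T) concentrations at equilibrium.

    r_total : total receptor, counted in dimer equivalents (D + 2*T)
    k_assoc : association constant for 2 D <-> T, so T = k_assoc * D**2
    Conservation: r_total = D + 2*k_assoc*D**2; take the positive root.
    """
    d = (-1.0 + math.sqrt(1.0 + 8.0 * k_assoc * r_total)) / (4.0 * k_assoc)
    return d, k_assoc * d * d

if __name__ == "__main__":
    d, t = species(r_total=1.0, k_assoc=10.0)
    print(f"dimer = {d:.3f}, tetramer = {t:.3f}")
    # Under assumed labelling schemes, a radioligand with one site per protomer
    # sees Bmax = 2*d + 4*t, while one with a single accessible site per
    # oligomer sees Bmax = d + t, giving different maximal capacities.
```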
Abstract:
Purpose – While Freeman's stakeholder management approach has attracted much attention from both scholars and practitioners, little empirical work has considered the interconnectedness of organisational and stakeholder perspectives. The purpose of this paper is to respond to this gap by developing and empirically testing a bi-directional model of organisation/stakeholder relationships. Design/methodology/approach – A conceptual framework is developed that integrates how stakeholders are affected by organisations with how they affect organisations. Quantitative data relating to both sides of the relationship are obtained from 700 customers of a European service organisation and analysed using the partial least squares structural equation modelling technique. Findings – The findings provide empirical support for the notion of mutual dependency between organisations and stakeholders advocated by stakeholder theorists. The results suggest that the way stakeholders relate to organisations depends on how organisations relate to stakeholders. Originality/value – The study is original on two fronts: first, it provides a framework and process that researchers can use to model bi-directional relationships with other stakeholder groups and in different contexts; second, it presents an example application of bi-directional research by empirically linking organisational and stakeholder expectations in the case of customers of a UK service organisation.
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
Abstract:
The urban boundary layer (UBL) is the part of the atmosphere in which most of the planet’s population now lives, and is one of the most complex and least understood microclimates. Given potential climate change impacts and the requirement to develop cities sustainably, the need for sound modelling and observational tools becomes pressing. This review paper considers progress made in studies of the UBL in terms of a conceptual framework spanning microscale to mesoscale determinants of UBL structure and evolution. Considerable progress in observing and modelling the urban surface energy balance has been made. The urban roughness sub-layer is an important region requiring attention as assumptions about atmospheric turbulence break down in this layer and it may dominate coupling of the surface to the UBL due to its considerable depth. The upper 90% of the UBL (mixed and residual layers) remains under-researched but new remote sensing methods and high resolution modelling tools now permit rapid progress. Surface heterogeneity dominates from neighbourhood to regional scales and should be more strongly considered in future studies. Specific research priorities include humidity within the UBL, high-rise urban canopies and the development of long-term, spatially extensive measurement networks coupled strongly to model development.
Abstract:
Mirroring the paper versions exchanged between businesses today, electronic contracts offer the possibility of dynamic, automatic creation and enforcement of restrictions and compulsions on agent behaviour that are designed to ensure business objectives are met. However, where there are many contracts within a particular application, it can be difficult to determine whether the system can reliably fulfil them all; computer-parsable electronic contracts may allow such verification to be automated. In this paper, we describe a conceptual framework and architecture specification in which normative business contracts can be electronically represented, verified, established, renewed, etc. In particular, we aim to allow systems containing multiple contracts to be checked for conflicts and violations of business objectives. We illustrate the framework and architecture with an aerospace example.
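A toy sketch of the kind of automated conflict check such a framework would enable: norms represented as (modality, agent, action) tuples, with an obligation and a prohibition on the same agent and action flagged as a conflict. The representation and the example norms are assumptions for illustration, not the paper's contract syntax.

```python
# Pairwise obligation/prohibition conflict check; representation is hypothetical.
from itertools import combinations

norms = [
    ("obliged",    "supplier", "deliver_part"),
    ("prohibited", "supplier", "deliver_part"),   # conflicts with the norm above
    ("obliged",    "carrier",  "report_status"),
]

def conflicts(norms):
    """Return pairs of norms that oblige and prohibit the same agent/action."""
    out = []
    for (m1, a1, act1), (m2, a2, act2) in combinations(norms, 2):
        if a1 == a2 and act1 == act2 and {m1, m2} == {"obliged", "prohibited"}:
            out.append(((m1, a1, act1), (m2, a2, act2)))
    return out

print(conflicts(norms))
```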
Abstract:
Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open spaces are limited. Another solution is the more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin. The GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation in terms of flood damage minimization. In the final phase of this research the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
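A schematic sketch of the MPC-GA loop described above: a genetic algorithm proposes gate openings, a trivially simplified stand-in for the conceptual river model predicts water levels, and a flood-damage cost function ranks the candidates. The placeholder model, cost weights, and GA settings are all illustrative assumptions, not the study's configuration.

```python
# GA searching gate openings against a flood-damage proxy; all values assumed.
import random

N_GATES, POP, GENERATIONS = 3, 20, 40
H_FLOOD = 2.0                 # water level above which damage accrues [m]

def predict_levels(gates, inflow=5.0):
    # Placeholder for the conceptual river model: wider openings lower levels.
    return [inflow / (0.5 + g) for g in gates]

def cost(gates):
    # Flood-damage proxy: squared exceedance of the flood level at each site.
    return sum(max(0.0, h - H_FLOOD) ** 2 for h in predict_levels(gates))

def evolve():
    pop = [[random.random() for _ in range(N_GATES)] for _ in range(POP)]
    for _ in range(GENERATIONS):
        pop.sort(key=cost)
        parents = pop[: POP // 2]                       # keep the fittest half
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(p) for p in zip(a, b)]   # uniform crossover
            i = random.randrange(N_GATES)                   # point mutation
            child[i] = min(1.0, max(0.0, child[i] + random.gauss(0.0, 0.1)))
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

if __name__ == "__main__":
    best = evolve()
    print("best openings:", [round(g, 2) for g in best], "cost:", round(cost(best), 3))
```

In the actual MPC setting this optimization would be re-run at each control step over a receding horizon, with the conceptual model replacing the placeholder above.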
Abstract:
This paper addresses topics related to measuring the trade and poverty nexus that are relevant, confusing, or in need of more attention. It sheds a critical light on the existing material and suggests needed lines of research. It begins with questions specific to the realities of Latin America and the Caribbean (LAC); general methodological issues are then examined from the same viewpoint. In a broader perspective, further ideas for the research agenda are formulated. The main conclusion is that relevant findings still demand considerable effort. Moreover, the information-measurement-model-evaluation paradigm is not enough, as policy guidelines are usually too general. In LAC, it must be extended and deepened, accounting more for the heterogeneity of cases, including physical constraints whenever possible, and incorporating new ways of integrating local and global perspectives. Other aspects, such as the role of specific juridical measures, should also play a part. How all this can be combined into more encompassing evaluations remains an open question.
Abstract:
The increasing computing power of microcomputers has stimulated the building of direct manipulation interfaces that allow graphical representation of Linear Programming (LP) models. This work discusses the components of such a graphical interface as the basis for a system to assist users in formulating LP problems. In essence, it proposes a methodology that divides the modelling task into three stages: specification of the Data Model, the Conceptual Model, and the LP Model. The need for Artificial Intelligence techniques in problem conceptualisation and in supporting the model formulation task is illustrated.
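The three-stage progression might be illustrated as follows with a toy product-mix problem, moving from raw data (Data Model) through a verbal formulation (Conceptual Model) to an algebraic LP handed to a solver. The example data are assumptions for illustration; scipy.optimize.linprog is used here simply as a convenient standard LP solver.

```python
# Data Model -> Conceptual Model -> LP Model, on a hypothetical product mix.
from scipy.optimize import linprog

# 1. Data Model: raw problem data
profit = {"chairs": 30.0, "tables": 50.0}      # profit per unit
hours = {"chairs": 2.0, "tables": 4.0}         # labour hours per unit
labour_available = 100.0

# 2. Conceptual Model: decide how many units of each product to make,
#    maximising total profit subject to the labour-hour budget.

# 3. LP Model: the algebraic formulation handed to the solver
products = list(profit)
c = [-profit[p] for p in products]             # negated: linprog minimises
A_ub = [[hours[p] for p in products]]
b_ub = [labour_available]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(products))
print({p: round(x, 1) for p, x in zip(products, res.x)}, "profit:", -res.fun)
```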
Abstract:
Many research findings, as well as reports from teachers, describe students' problem-solving strategies as rote manipulation of formulas. The resulting dissatisfaction with quantitative physics textbook problems seems to influence attitudes towards the role of mathematics in physics education in general. Mathematics is often seen as a tool for calculation that hinders a conceptual understanding of physical principles. However, the role of mathematics cannot be reduced to this technical aspect. Hence, instead of setting mathematics aside, we delve into the nature of physical science to reveal the strong conceptual relationship between mathematics and physics. Moreover, we suggest that, for both prospective teaching and further research, a focus on deeply exploring this interdependency can significantly improve the understanding of physics. To provide a suitable basis, we develop a new model for analysing different levels of mathematical reasoning within physics. It also serves as a guideline for shifting attention from technical to structural mathematical skills when teaching physics. We demonstrate its applicability by analysing physical-mathematical reasoning processes with an example.