886 results for Systems Simulation
Abstract:
This paper describes an assessment of the nitrogen and phosphorus dynamics of the River Kennet in the south east of England. The Kennet catchment (1200 km²) is a predominantly groundwater-fed river system impacted by agricultural and sewage sources of nutrient (nitrogen and phosphorus) pollution. The results from a suite of simulation models are integrated to assess the key spatial and temporal variations in the nitrogen (N) and phosphorus (P) chemistry, and the influence of changes in phosphorus inputs from a Sewage Treatment Works on macrophyte and epiphyte growth patterns. The models used are the Export Coefficient model, the Integrated Nitrogen in Catchments model, and a new model of in-stream phosphorus and macrophyte dynamics: the 'Kennet' model. The paper concludes with a discussion of the present state of knowledge regarding water quality functioning, future research needs in environmental modelling, and the use of models as management tools for large, nutrient-impacted riverine systems. (C) 2003 IMACS. Published by Elsevier B.V. All rights reserved.
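The Export Coefficient model named in this abstract estimates a catchment's diffuse nutrient load as a sum over land-use classes of area multiplied by a per-class export coefficient, plus any point-source (e.g. sewage) load. A minimal sketch of that calculation, with purely illustrative coefficients and areas (not values from the paper):

```python
def export_coefficient_load(land_uses, point_source_load=0.0):
    """Annual catchment nutrient load (kg/yr) by the export-coefficient
    approach: sum over land uses of (area * export coefficient),
    plus point-source inputs.

    land_uses: list of (area_km2, export_kg_per_km2_per_yr) tuples.
    """
    diffuse = sum(area * coeff for area, coeff in land_uses)
    return diffuse + point_source_load

# Illustrative figures for a mixed 1200 km2 catchment (not the Kennet's).
land_uses = [
    (700.0, 2500.0),  # arable land, kg N / km2 / yr
    (400.0, 800.0),   # grassland
    (100.0, 150.0),   # woodland
]
total_n = export_coefficient_load(land_uses, point_source_load=5.0e4)
print(f"Estimated N load: {total_n:.0f} kg/yr")  # 2135000 kg/yr
```

The same function applies to phosphorus with P-specific coefficients; the split between diffuse and point-source terms is what lets scenario studies vary sewage inputs independently of land use.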
Abstract:
Brief periods of high temperature which occur near flowering can severely reduce the yield of annual crops such as wheat and groundnut. A parameterisation of this well-documented effect is presented for groundnut (i.e. peanut; Arachis hypogaea L.). This parameterisation was combined with an existing crop model, allowing the impact of season-mean temperature, and of brief high-temperature episodes at various times near flowering, to be both independently and jointly examined. The extended crop model was tested with independent data from controlled environment experiments and field experiments. The impact of total crop duration was captured, with simulated duration being within 5% of observations for the range of season-mean temperatures used (20-28 degrees C). In simulations across nine differently timed high temperature events, eight of the absolute differences between observed and simulated yield were less than 10% of the control (no-stress) yield. The parameterisation of high temperature stress also allows the simulation of heat tolerance across different genotypes. Three parameter sets, representing tolerant, moderately sensitive and sensitive genotypes, were developed and assessed. The new parameterisation can be used in climate change studies to estimate the impact of heat stress on yield. It can also be used to assess the potential for adaptation of cropping systems to increased temperature threshold exceedance via the choice of genotype characteristics. (c) 2005 Elsevier B.V. All rights reserved.
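The kind of parameterisation described here — yield sensitivity to brief high temperature near flowering, with genotype-dependent thresholds — can be sketched as a simple reduction factor on pod-set. The threshold values below are assumptions for illustration only, not the paper's fitted parameters:

```python
def heat_stress_factor(t_flowering, t_crit=36.0, t_zero=47.0):
    """Fraction of pod-set retained after a high-temperature episode
    at flowering: no damage up to t_crit, complete failure at t_zero,
    linear decline in between. Thresholds are illustrative; a tolerant
    genotype would use higher values, a sensitive one lower values."""
    if t_flowering <= t_crit:
        return 1.0
    if t_flowering >= t_zero:
        return 0.0
    return (t_zero - t_flowering) / (t_zero - t_crit)

print(heat_stress_factor(40.0))                # 7/11, about 0.636
print(heat_stress_factor(40.0, t_crit=34.0))   # lower: a sensitive genotype
```

Swapping the two threshold parameters is all that distinguishes the genotype parameter sets in this sketch, which mirrors how the abstract describes tolerant versus sensitive genotypes being represented.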
Abstract:
Reanalysis data provide an excellent test bed for impacts prediction systems, because they represent an upper limit on the skill of climate models. Indian groundnut (Arachis hypogaea L.) yields have been simulated using the General Large-Area Model (GLAM) for annual crops and the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA-40). The ability of ERA-40 to represent the Indian summer monsoon has been examined. The ability of GLAM, when driven with daily ERA-40 data, to model both observed yields and observed relationships between subseasonal weather and yield has been assessed. Mean yields were simulated well across much of India. Correlations between observed and modeled yields, where these are significant, are comparable to correlations between observed yields and ERA-40 rainfall. Uncertainties due to the input planting window, crop duration, and weather data have been examined. A reduction in the root-mean-square error of simulated yields was achieved by applying bias correction techniques to the precipitation. The stability of the relationship between weather and yield over time has been examined. Weather-yield correlations vary on decadal time scales, and this has direct implications for the accuracy of yield simulations. Analysis of the skewness of both detrended yields and precipitation suggests that nonclimatic factors are partly responsible for this nonstationarity. Evidence from other studies, including data on cereal and pulse yields, indicates that this result is not particular to groundnut yield. The detection and modeling of nonstationary weather-yield relationships emerges from this study as an important part of the process of understanding and predicting the impacts of climate variability and change on crop yields.
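The abstract does not specify which bias correction technique was applied to the ERA-40 precipitation; one common, simple choice is a multiplicative (mean-ratio) correction, sketched below with invented data values:

```python
def ratio_bias_correct(model_precip, obs_mean):
    """Scale modelled precipitation so its mean matches the observed
    climatological mean. This is a simple multiplicative correction;
    the paper's exact technique may differ."""
    model_mean = sum(model_precip) / len(model_precip)
    factor = obs_mean / model_mean
    return [p * factor for p in model_precip]

model = [2.0, 0.0, 5.0, 1.0]  # mm/day, illustrative reanalysis values
corrected = ratio_bias_correct(model, obs_mean=3.0)
print(corrected)  # [3.0, 0.0, 7.5, 1.5] -- mean is now 3.0
```

A multiplicative correction preserves dry days (zeros stay zero) and the relative timing of rainfall, which matters when the downstream crop model is sensitive to subseasonal weather, as GLAM is.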
Abstract:
Grass-based diets are of increasing socio-economic importance in dairy cattle farming, but their low supply of glucogenic nutrients may limit the production of milk. Current evaluation systems that assess the energy supply and requirements are based on metabolisable energy (ME) or net energy (NE). These systems do not consider the characteristics of the energy-delivering nutrients. In contrast, mechanistic models take into account the site of digestion, the type of nutrient absorbed and the type of nutrient required for production of milk constituents, and may therefore give a better prediction of supply and requirement of nutrients. The objective of the present study is to compare the ability of three energy evaluation systems, viz. the Dutch NE system, the Agricultural and Food Research Council (AFRC) ME system, and the Feed into Milk (FIM) ME system, and of a mechanistic model based on Dijkstra et al. [Simulation of digestion in cattle fed sugar cane: prediction of nutrient supply for milk production with locally available supplements. J. Agric. Sci., Cambridge 127, 247-60] and Mills et al. [A mechanistic model of whole-tract digestion and methanogenesis in the lactating dairy cow: model development, evaluation and application. J. Anim. Sci. 79, 1584-97] to predict the feed value of grass-based diets for milk production. The dataset for evaluation consists of 41 treatments of grass-based diets (at least 0.75 g ryegrass/g diet on DM basis). For each model, the predicted energy or nutrient supply, based on observed intake, was compared with predicted requirement based on observed performance. Assessment of the error of energy or nutrient supply relative to requirement is made by calculation of mean square prediction error (MSPE) and by concordance correlation coefficient (CCC). All energy evaluation systems predicted energy requirement to be lower (6-11%) than energy supply.
The root MSPE (expressed as a proportion of the supply) was lowest for the mechanistic model (0.061), followed by the Dutch NE system (0.082), the FIM ME system (0.097) and the AFRC ME system (0.118). For the energy evaluation systems, the error due to overall bias of prediction dominated the MSPE, whereas for the mechanistic model, proportionally 0.76 of MSPE was due to random variation. CCC analysis confirmed the higher accuracy and precision of the mechanistic model compared with the energy evaluation systems. The error of prediction was positively related to grass protein content for the Dutch NE system, and was also positively related to grass DMI level for all models. In conclusion, current energy evaluation systems overestimate energy supply relative to energy requirement on grass-based diets for dairy cattle. The mechanistic model predicted glucogenic nutrients to limit performance of dairy cattle on grass-based diets, and proved to be more accurate and precise than the energy systems. The mechanistic model could be improved by allowing the parameters for glucose maintenance and utilization requirements to be variable. (C) 2007 Elsevier B.V. All rights reserved.
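The two evaluation statistics used in this study — root MSPE expressed as a proportion of supply, and Lin's concordance correlation coefficient — follow standard formulas, sketched generically below. This is not the paper's own code, and the example numbers are invented:

```python
import math

def rmspe_and_ccc(supply, requirement):
    """Root mean square prediction error as a proportion of mean supply,
    and Lin's concordance correlation coefficient (CCC), for paired
    predicted-supply / predicted-requirement values."""
    n = len(supply)
    ms = sum(supply) / n
    mr = sum(requirement) / n
    mspe = sum((s - r) ** 2 for s, r in zip(supply, requirement)) / n
    var_s = sum((s - ms) ** 2 for s in supply) / n
    var_r = sum((r - mr) ** 2 for r in requirement) / n
    cov = sum((s - ms) * (r - mr)
              for s, r in zip(supply, requirement)) / n
    # CCC penalises both location shift (ms - mr) and scale mismatch.
    ccc = 2 * cov / (var_s + var_r + (ms - mr) ** 2)
    return math.sqrt(mspe) / ms, ccc

# Invented energy values (MJ/day) for three diets.
supply = [105.0, 110.0, 98.0]
requirement = [100.0, 102.0, 95.0]
rmspe, ccc = rmspe_and_ccc(supply, requirement)
print(f"rMSPE = {rmspe:.3f}, CCC = {ccc:.3f}")
```

Decomposing the MSPE into bias, slope and random components (as the study reports) follows from the same means, variances and covariance computed here.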
Abstract:
This article introduces a quantitative approach to e-commerce system evaluation based on the theory of process simulation. The general concept of e-commerce system simulation is motivated by several limitations of e-commerce system development: the large initial investment of time and money, and the long period from business planning through system development, testing and operation to eventual return. In other words, currently used system analysis and development methods cannot tell investors how good their e-commerce system could be, what return on investment they could expect, or which areas of the initial business plan they should improve. To examine the value and potential effects of an e-commerce business plan, a quantitative evaluation approach is needed, and the authors believe that process simulation is an appropriate option. The overall objective of this article is to apply the theory of process simulation to e-commerce system evaluation, which the authors achieve through an experimental study of a business plan for online construction and demolition waste exchange. The methodologies adopted include literature review, system analysis and development, simulation modelling and analysis, and case study. The results include the concept of e-commerce system simulation, a comprehensive review of simulation methods adopted in e-commerce system evaluation, and a real case study applying simulation to e-commerce system evaluation. The authors hope that adopting the process simulation approach can effectively support business decision-making and improve the efficiency of e-commerce systems.
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
User interfaces have the primary role of enabling access to information that meets individual users' needs. However, user-system interaction is still rigid, especially in complex environments involving various types of users. Among the approaches for improving user interface agility, we present a normative approach to the design of web application interfaces, which allows delivering personalized services to users according to parameters extracted from the simulation of norms in the social context. A case study in an e-Government context illustrates the implications of the approach.
Abstract:
A beamforming algorithm is introduced based on a general objective function that approximates the bit error rate for wireless systems with binary phase shift keying and quadrature phase shift keying modulation schemes. The proposed minimum approximate bit error rate (ABER) beamforming approach does not rely on a Gaussian assumption for the channel noise, and is therefore also applicable when the channel noise is non-Gaussian. The simulation results show that the proposed minimum ABER solution improves on the standard minimum mean square error beamforming solution in terms of a smaller achievable system bit error rate.
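The ABER idea can be illustrated for the BPSK case: each training snapshot contributes a Gaussian-tail term to an approximate bit error rate, and the beamformer weights are chosen to minimise that quantity. The sketch below shows only the objective, with an assumed smoothing width `rho` and invented array data; the paper's optimiser is not reproduced:

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def approx_ber(w, snapshots, bits, rho=0.3):
    """Approximate BER of a BPSK beamformer w over training data:
    each snapshot x with transmitted bit b (+1/-1) contributes
    Q(b * Re(w^H x) / rho). rho is an assumed smoothing width."""
    total = 0.0
    for x, b in zip(snapshots, bits):
        y = sum(wi.conjugate() * xi for wi, xi in zip(w, x))
        total += q_function(b * y.real / rho)
    return total / len(snapshots)

# Two-element array, signal steering vector [1, 1], small fixed "noise".
bits = [1, -1, 1, -1]
noise = [(0.1 + 0.05j, -0.05 + 0.1j), (0.05 - 0.1j, 0.1 + 0.0j),
         (-0.1 + 0.0j, 0.05 + 0.05j), (0.0 + 0.1j, -0.1 - 0.05j)]
snapshots = [(b + n1, b + n2) for b, (n1, n2) in zip(bits, noise)]

aligned = (0.5 + 0j, 0.5 + 0j)      # matched to the steering vector
mismatched = (1.0 + 0j, -1.0 + 0j)  # orthogonal to it
print(approx_ber(aligned, snapshots, bits) <
      approx_ber(mismatched, snapshots, bits))  # True
```

Because the objective is built directly from these Q-terms rather than from second-order statistics, no Gaussianity of the noise is assumed, which is the point the abstract makes against the MMSE solution.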
Abstract:
E-learning systems are of increasing importance in higher education. However, the state of the art of e-learning applications, as well as the state of the practice, does not achieve the level of interactivity that current learning theories advocate. In this paper, the possibility of enhancing e-learning systems to achieve deep learning has been studied by replicating an experiment in which students had to learn basic software engineering principles. One group learned these principles using a static approach, while the other group learned the same principles using a system-dynamics-based approach, which provided interactivity and feedback. The results show that, quantitatively, the latter group achieved a better understanding of the principles; furthermore, qualitatively, they enjoyed the learning experience.
Abstract:
Since its introduction in 1993, the Message Passing Interface (MPI) has become a de facto standard for writing High Performance Computing (HPC) applications on clusters and Massively Parallel Processors (MPPs). The recent emergence of multi-core processor systems presents a new challenge for established parallel programming paradigms, including those based on MPI. This paper presents a new Java messaging system called MPJ Express. Using this system, we exploit multiple levels of parallelism - messaging and threading - to improve application performance on multi-core processors. We refer to our approach as nested parallelism. This MPI-like Java library can support nested parallelism by using Java or Java OpenMP (JOMP) threads within an MPJ Express process. The practicality of this approach is assessed by porting to Java a massively parallel structure formation code from cosmology called Gadget-2. We introduce nested parallelism in the Java version of the simulation code and report good speed-ups. To the best of our knowledge, this is the first time this kind of hybrid parallelism has been demonstrated in a high performance Java application. (C) 2009 Elsevier Inc. All rights reserved.
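The two-level scheme — message passing between processes, threads within each process — can be mimicked with Python's standard library (MPJ Express itself is a Java library; this is only a structural analogue, where processes stand in for MPI ranks and threads for JOMP threads, and it assumes a platform where `multiprocessing` can fork worker processes):

```python
from concurrent.futures import ThreadPoolExecutor
from multiprocessing import Pool

def partial_sum(chunk):
    # Inner level: two threads split the chunk, like JOMP threads
    # working inside one MPJ Express process.
    halves = [chunk[: len(chunk) // 2], chunk[len(chunk) // 2:]]
    with ThreadPoolExecutor(max_workers=2) as threads:
        return sum(threads.map(sum, halves))

def nested_sum(data, ranks=2):
    # Outer level: one chunk per "rank", combined at the end, like a
    # reduction across MPI processes.
    step = (len(data) + ranks - 1) // ranks
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    with Pool(ranks) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(nested_sum(list(range(1000))))  # 499500
```

In Java the inner level would give true shared-memory parallelism on the cores of one node; Python's GIL makes this version purely structural, which is why the sketch is an analogue rather than a performance demonstration.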
Abstract:
State-of-the-art computational methodologies are used to investigate the energetics and dynamics of photodissociated CO and NO in myoglobin (Mb···CO and Mb···NO). This includes the combination of molecular dynamics, ab initio MD, free energy sampling, and effective dynamics methods to compare the results with studies using X-ray crystallography and ultrafast spectroscopy methods. It is shown that modern simulation techniques, along with a careful description of the intermolecular interactions, can give quantitative agreement with experiments on complex molecular systems. Based on this agreement, predictions can be made for as yet uncharacterized species.
Abstract:
Recent research in multi-agent systems incorporates fault tolerance concepts, but does not explore the extension and implementation of such ideas for large-scale parallel computing systems. The work reported in this paper investigates a swarm array computing approach, namely 'Intelligent Agents'. A task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and to complete the task successfully. The feasibility of the approach is validated by simulations on an FPGA using a multi-agent simulator, and by implementation of a parallel reduction algorithm on a computer cluster using the Message Passing Interface.
Abstract:
The transport of stratospheric air into the troposphere within deep convection was investigated using the Met Office Unified Model version 6.1. Three cases were simulated in which convective systems formed over the UK in the summer of 2005. For each of these three cases, simulations were performed on a grid having 4 km horizontal grid spacing in which the convection was parameterized and on a grid having 1 km horizontal grid spacing, which permitted explicit representation of the largest energy-containing scales of deep convection. Cross-tropopause transport was diagnosed using passive tracers that were initialized above the dynamically defined tropopause (2 potential vorticity unit surface) with a mixing ratio of 1. Although the synoptic-scale environment and triggering mechanisms varied between the cases, the total simulated transport was similar in all three cases. The total stratosphere-to-troposphere transport over the lifetime of the convective systems ranged from 25 to 100 kg/m2 across the simulated convective systems and resolutions, which corresponds to ∼5–20% of the total mass located within a stratospheric column extending 2 km above the tropopause. In all simulations, the transport into the lower troposphere (defined as below 3.5 km elevation) accounted for ∼1% of the total transport across the tropopause. In the 4 km runs most of the transport was due to parameterized convection, whereas in the 1 km runs the transport was due to explicitly resolved convection. The largest difference between the simulations with different resolutions occurred in the one case of midlevel convection considered, in which the total transport in the 1 km grid spacing simulation with explicit convection was 4 times that in the 4 km grid spacing simulation with parameterized convection. 
Although the total cross-tropopause transport was similar, stratospheric tracer was deposited more deeply to near-surface elevations in the convection-parameterizing simulations than in convection-permitting simulations.
Abstract:
In this paper, the stability of one-step-ahead predictive controllers based on non-linear models is established. It is shown that, under conditions which can be fulfilled by most industrial plants, the closed-loop system is robustly stable in the presence of plant uncertainties and input–output constraints. There is no requirement that the plant should be open-loop stable, and the analysis is valid for general forms of non-linear system representation, including the case when the problem is constraint-free. The effectiveness of controllers designed according to the algorithm analyzed in this paper is demonstrated on a recognized benchmark problem and on a simulation of a continuous stirred-tank reactor (CSTR). In both examples a radial basis function neural network is employed as the non-linear system model.
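As a concrete illustration of the setting, the sketch below pairs a radial basis function one-step-ahead model with a constrained controller that minimises the squared predicted tracking error. All numbers, and the grid-search optimiser, are illustrative assumptions; the paper's algorithm and its stability analysis are not reproduced here:

```python
import math

def rbf_predict(y, u, centers, weights, width=1.0):
    """One-step-ahead RBF model: y(k+1) = sum_j w_j * phi(||[y, u] - c_j||),
    with Gaussian basis functions of a common width."""
    out = 0.0
    for (cy, cu), w in zip(centers, weights):
        d2 = (y - cy) ** 2 + (u - cu) ** 2
        out += w * math.exp(-d2 / (2 * width ** 2))
    return out

def one_step_controller(y, ref, centers, weights,
                        u_min=-1.0, u_max=1.0, n=201):
    """Choose the input within [u_min, u_max] minimising the squared
    one-step-ahead tracking error. Grid search keeps the sketch simple;
    a real controller would use a proper optimiser."""
    best_u, best_err = u_min, float("inf")
    for i in range(n):
        u = u_min + (u_max - u_min) * i / (n - 1)
        err = (ref - rbf_predict(y, u, centers, weights)) ** 2
        if err < best_err:
            best_u, best_err = u, err
    return best_u

# Toy model: two centers in (y, u) space, invented weights.
centers = [(0.0, 0.0), (0.0, 1.0)]
weights = [0.0, 1.0]
u = one_step_controller(0.0, 1.0, centers, weights)
print(f"chosen input: {u:.2f}")  # 1.00 for this toy model
```

The hard clipping of the search interval is how the input constraint enters; the stability question the paper addresses is precisely whether this kind of constrained one-step-ahead minimisation keeps the closed loop well behaved.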