941 results for Simulation-models
Abstract:
Solar-plus-heat-pump systems are often very complex in design, sometimes with special heat pump arrangements and control. Detailed heat pump models can therefore make system simulations very slow while still failing to match real heat pump performance in a system. The idea here is to start from a standard measured performance map of test points for a heat pump according to EN 14825 and then determine characteristic parameters for a simplified correlation-based model of the heat pump. By plotting heat pump test data in different ways, including as power input and output rather than only as COP, a simple relationship could be seen. Using the same methodology as in the QDT part of the EN 12975 collector test standard, it could be shown that a very simple model describes the heat pump test data very accurately once 4 parameters in the resulting correlation equation are identified. © 2012 The Authors.
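To make the identification step concrete, here is a minimal sketch in Python, assuming an illustrative bilinear correlation Q = a0 + a1*Tsource + a2*Tsink + a3*Tsource*Tsink; the paper's actual correlation equation and test data are not reproduced here, so the functional form and all values are placeholders.

```python
import numpy as np

# EN 14825-style test points (illustrative values, not real test data):
# source/sink temperatures [degC] and measured heat output [kW]
t_source = np.array([-7.0, 2.0, 7.0, 12.0, -7.0, 2.0, 7.0, 12.0])
t_sink   = np.array([35.0, 35.0, 35.0, 35.0, 55.0, 55.0, 55.0, 55.0])
q_meas   = np.array([ 4.1,  5.3,  6.2,  7.0,  3.2,  4.2,  5.0,  5.7])

# Assumed bilinear correlation: q = a0 + a1*Ts + a2*Tk + a3*Ts*Tk
X = np.column_stack([np.ones_like(t_source), t_source, t_sink,
                     t_source * t_sink])

# Identify the 4 parameters by ordinary least squares
params, *_ = np.linalg.lstsq(X, q_meas, rcond=None)
q_fit = X @ params

print("identified parameters:", params)
print("max abs error [kW]:", np.max(np.abs(q_fit - q_meas)))
```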
Abstract:
Renewable energy production is a basic supplement to stabilize rapidly increasing global energy demand and skyrocketing energy prices, as well as to balance fluctuating supply from non-renewable energy sources at electrical grid hubs. European energy traders, governmental and private energy providers, and other stakeholders have recently become major beneficiaries and clients of hydropower simulation solutions. The relationship between rainfall-runoff model outputs and the energy production of hydropower plants has not been clearly studied. In this research, the association of rainfall, catchment characteristics, river network, and runoff with the energy production of a particular hydropower station is examined. The essence of this study is to justify the correspondence between runoff extracted from a calibrated catchment and the energy production of a hydropower plant located at the catchment outlet; to employ a technique that converts runoff to energy based on statistical and graphical trend analysis of the two; and to provide an environment for energy forecasting. For rainfall-runoff model setup and calibration, the MIKE 11 NAM model is applied, while the MIKE 11 SO model is used to track, adopt, and set a control strategy at the hydropower location for the runoff-energy correlation. The model is tested at two selected micro run-of-river hydropower plants located in southern Germany. The model test comprises two consecutive calibrations: one for the rainfall-runoff model and the other for the energy simulation. Calibration results and supporting verification plots for the two case studies indicate that simulated discharge and energy production are comparable with the measured discharge and energy production, respectively.
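The paper derives its runoff-to-energy conversion from statistical and graphical trend analysis; as a point of reference, here is a minimal sketch of the classical hydropower relation P = η·ρ·g·Q·H, with all plant parameters and discharge values illustrative.

```python
import numpy as np

RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def energy_from_runoff(q_m3s, head_m, efficiency=0.85, dt_hours=1.0):
    """Convert a simulated discharge series [m^3/s] to energy [kWh]
    for a run-of-river plant with fixed head and efficiency."""
    q = np.asarray(q_m3s, dtype=float)
    power_w = efficiency * RHO_WATER * G * q * head_m   # instantaneous power [W]
    return power_w / 1000.0 * dt_hours                  # energy per step [kWh]

# Hourly NAM-simulated discharge (illustrative values)
discharge = [2.4, 2.6, 3.1, 2.9]
print(energy_from_runoff(discharge, head_m=12.0))
```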
Abstract:
This study presents an approach to combining the uncertainties of hydrological model outputs predicted by a number of machine learning models. The machine-learning-based uncertainty prediction approach is very useful for estimating a hydrological model's uncertainty in a particular hydro-meteorological situation in real-time applications [1]. In this approach, hydrological model realizations from Monte Carlo simulations are used to build different machine learning uncertainty models that predict the uncertainty (quantiles of the pdf) of the deterministic output of the hydrological model. The uncertainty models are trained using antecedent precipitation and streamflows as inputs. The trained models are then employed to predict the model output uncertainty specific to new input data. We used three machine learning models, namely artificial neural networks, model trees, and locally weighted regression, to predict output uncertainties. These three models produce similar verification results, which can be improved by merging their outputs dynamically. We propose an approach that forms a committee of the three models to combine their outputs. The approach is applied to estimate the uncertainty of streamflow simulations from a conceptual hydrological model in the Brue catchment in the UK and the Bagmati catchment in Nepal. The verification results show that the merged output is better than any individual model output. [1] D. L. Shrestha, N. Kayastha, D. P. Solomatine, and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press, 2013.
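A minimal sketch of the committee idea, assuming scikit-learn models as stand-ins (a k-nearest-neighbours regressor substitutes for locally weighted regression) and a synthetic target in place of the Monte Carlo quantile statistics; the paper's actual merging scheme may differ from the skill-weighted average used here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Illustrative inputs: antecedent precipitation and streamflow
X = rng.random((500, 2))
# Illustrative target: a quantile (e.g. Q90) of the MC-derived uncertainty,
# here a synthetic function standing in for the real MC statistics
y = 2.0 * X[:, 0] + 0.5 * np.sin(6 * X[:, 1]) + 0.1 * rng.standard_normal(500)

X_train, X_val = X[:400], X[400:]
y_train, y_val = y[:400], y[400:]

# Three uncertainty models; KNN stands in for locally weighted regression
models = [MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
          DecisionTreeRegressor(max_depth=6, random_state=0),
          KNeighborsRegressor(n_neighbors=10, weights="distance")]

preds, weights = [], []
for m in models:
    m.fit(X_train, y_train)
    p = m.predict(X_val)
    preds.append(p)
    weights.append(1.0 / np.mean(np.abs(p - y_val)))  # skill-based weight

# Committee output: weighted merge of the three predictions
# (sketch only: weights and evaluation share the same validation set)
weights = np.array(weights) / np.sum(weights)
merged = np.tensordot(weights, np.array(preds), axes=1)
print("merged MAE:", np.mean(np.abs(merged - y_val)))
```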
Abstract:
While the simulation of flood risks originating from the overtopping of river banks is well covered by continuously evaluated programs to improve flood protection measures, flash flooding is not. Flash floods are triggered by short, local thunderstorm cells with high precipitation intensities. Small catchments have short response times and flow paths, and convective thunder cells may result in potential flooding of endangered settlements. Assessing local flooding and flood pathways requires a detailed hydraulic simulation of the surface runoff. Hydrological models usually do not incorporate surface runoff at this level of detail; instead, empirical equations are applied for runoff detention. Conversely, 2D hydrodynamic models usually neither accept distributed rainfall as input nor implement any type of soil/surface interaction as hydrological models do. Given several cases of local flash flooding in recent years, the issue has emerged both for practical reasons and as a research topic: closing the model gap between distributed rainfall and distributed runoff formation. Therefore, a 2D hydrodynamic model, solving the depth-averaged flow equations with a finite volume discretization, was extended to accept direct rainfall, enabling simulation of the associated runoff formation. The model itself is used as the numerical engine; rainfall is introduced by modifying water levels at fixed time intervals. The paper not only deals with the general application of the software but also tests the numerical stability and reliability of the simulation results. The tests performed use different artificial as well as measured rainfall series as input. Key parameters of the simulation, such as losses, roughness, and the time interval for water level manipulation, are tested regarding their impact on stability.
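A minimal sketch of the rainfall-injection step described above, assuming a regular grid and a constant loss rate; function names and values are illustrative, and the hydrodynamic solver itself is elided.

```python
import numpy as np

def apply_rainfall(water_level, rain_mm_per_h, dt_s, loss_mm_per_h=0.0):
    """Add distributed rainfall (minus losses) to the 2D water-level grid,
    as done at fixed intervals between hydrodynamic solver steps."""
    net_mm = max(rain_mm_per_h - loss_mm_per_h, 0.0) * dt_s / 3600.0
    return water_level + net_mm / 1000.0  # mm -> m

# Illustrative 2D grid of water levels [m], updated every 5 minutes
h = np.zeros((100, 100))
for step in range(12):                      # one hour of rainfall
    # ... hydrodynamic solver would advance the flow field here ...
    h = apply_rainfall(h, rain_mm_per_h=30.0, dt_s=300, loss_mm_per_h=5.0)

print("water depth added after 1 h:", h[0, 0], "m")
```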
Abstract:
Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag and rank simultaneously perform better in this case. Second, this translates into superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for the long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
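The SCCF restrictions have no off-the-shelf implementation, but the cointegration side of such an exercise can be sketched with statsmodels, here assuming a synthetic bivariate system built around one common stochastic trend (so the true cointegration rank is 1 by construction).

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(0)

# Simulate a cointegrated bivariate system sharing one stochastic trend
n = 500
trend = np.cumsum(rng.standard_normal(n))          # common random walk
y = np.column_stack([trend + rng.standard_normal(n),
                     0.5 * trend + rng.standard_normal(n)])

# Johansen-type rank selection, then a rank-restricted VECM fit
rank_test = select_coint_rank(y, det_order=0, k_ar_diff=1)
print("selected cointegration rank:", rank_test.rank)

model = VECM(y, k_ar_diff=1, coint_rank=1, deterministic="co")
res = model.fit()
print("cointegrating vector (beta):\n", res.beta)
print("5-step-ahead forecast:\n", res.predict(steps=5))
```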
Abstract:
This paper develops background considerations to help better frame the results of a CGE exercise. Three main criticisms are usually addressed to CGE efforts. First, they are too aggregate, their conclusions failing to shed light on relevant sectors or issues. Second, they imply huge data requirements; timeliness is frequently jeopardised by outdated sources, with benchmarks referring to realities gone by. Finally, the results are meaningless, as they answer wrong or ill-posed questions: modelling demands end up creating a rather artificial context in which the original questions lose content. In spite of a positive outlook on the first two, the crucial questions lie in the third point. After elaborating such questions, and trying to answer some, the text argues that CGE models can come closer to reality. Although their use is still too scarce to give way to a fruitful symbiosis between negotiations and simulation results, they remain the only available technique providing a global, inter-related way of capturing the economy-wide effects of several different policies. International organisations can play a major role in supporting and encouraging improvements. They are also uniquely positioned to enhance information and data sharing, and to bring people from various origins together to share their experiences. Serious and complex homework is, however, required to correct at least the most dangerous present shortcomings of the technique.
Abstract:
In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies in the same vein, which evaluate the original series of each stock, we evaluate synthetic series created on the basis of linear models of stocks. Following Burgess (1999), we use the stepwise regression model for the formation of models of each stock. We then use the variance ratio profile together with a Monte Carlo simulation to select models with potential predictability. Unlike Burgess (1999), we carry out White's (2000) Reality Check in order to verify the existence of positive returns out of sample. We use the strategies proposed by Sullivan, Timmermann & White (1999) and Hsu & Kuan (2005), amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology with 1,000 simulations, we find strong evidence of predictability in the models, even when transaction costs are included.
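A minimal sketch of the variance ratio profile used for model selection, omitting the Monte Carlo envelope that determines which profiles indicate potential predictability; data and horizons are illustrative.

```python
import numpy as np

def variance_ratio_profile(prices, max_lag=20):
    """Variance ratio VR(q) = Var(r_t(q)) / (q * Var(r_t)) for q = 2..max_lag.
    Values persistently away from 1 suggest departures from a random walk."""
    r = np.diff(np.log(np.asarray(prices, dtype=float)))  # 1-period log returns
    var1 = np.var(r, ddof=1)
    profile = {}
    for q in range(2, max_lag + 1):
        rq = np.convolve(r, np.ones(q), mode="valid")     # q-period returns
        profile[q] = np.var(rq, ddof=1) / (q * var1)
    return profile

# Illustrative synthetic price series (random walk => VR(q) near 1)
rng = np.random.default_rng(0)
p = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(1000)))
print(variance_ratio_profile(p, max_lag=5))
```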
Abstract:
This paper is concerned with evaluating value-at-risk estimates. It is well known that using only binary variables for this sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. It is shown that the new backtest provides a sufficient condition to assess the performance of a quantile model, whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker & Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with the daily S&P 500 time series.
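For contrast, here is a sketch of a classic binary backtest (Kupiec's proportion-of-failures test), which uses only the 0/1 violation sequence and thus illustrates the information loss the proposed quantile-regression backtest is meant to avoid; the return data and the static VaR are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(returns, var_forecasts, coverage=0.01):
    """Kupiec proportion-of-failures test: likelihood ratio on the
    0/1 violation sequence only (no magnitude information)."""
    hits = np.asarray(returns) < np.asarray(var_forecasts)  # VaR as lower quantile
    n, x = len(hits), int(hits.sum())
    if x == 0 or x == n:
        return x, float("nan")  # degenerate hit sequence
    pi = x / n
    lr = -2 * ((n - x) * np.log(1 - coverage) + x * np.log(coverage)
               - (n - x) * np.log(1 - pi) - x * np.log(pi))
    return x, 1 - chi2.cdf(lr, df=1)

rng = np.random.default_rng(0)
rets = rng.standard_normal(1000) * 0.01
var_1pct = np.full(1000, np.quantile(rets, 0.01))  # illustrative static VaR
violations, p_value = kupiec_pof(rets, var_1pct, coverage=0.01)
print(violations, "violations, p-value:", p_value)
```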
Abstract:
A control system was designed to allow humans to manually drive a usually automatic, two-wheeled hovercraft. The size, mass, and driving style of this vehicle make it difficult for an everyday, untrained person to handle. During this thesis, several control layouts were designed with the objective of creating an intuitive and easy way of driving such a vehicle. In the end, two were user-tested using a simulation (also developed during this thesis) of the said hovercraft, set against obstacles similar to those expected in its real environment. The two layouts differ only slightly in performance, but numerous issues were found that can be used to design a better control layout. This means that no definitive winner was found, but a foundation for a better design was indeed established.
Abstract:
The work done in this thesis attempts to demonstrate the importance of using models that can predict and represent the mobility of our society. To answer the proposed challenges, two models were examined. The first concerns macro-simulation, with the intention of finding a solution for the service frequency of the bus company Horários do Funchal, responsible for transport in the city of Funchal and some surrounding areas. Based on a simplified model of the city, it was possible to increase the frequency of journeys while obtaining an overall reduction in costs. The second model concerns the micro-simulation of Avenida do Mar, where a new roundabout (Praça da Autonomia) connecting to this avenue is currently being built. It was therefore proposed to study the impact on local traffic and the implementation of new traffic lights for this purpose. Four possible scenarios were created, considering increasing the number of lanes on the roundabout or inserting a bus lane. The results showed that a roundabout with three running lanes is the best option, because the waiting queues are minimal and, at the environmental level, this design emits fewer pollutants. Thus, this thesis presents two possible methods of urban planning. Transport modelling is an area under constant development; the overall goal is to encourage the use of these models more and more, and as such it is important that more people devote themselves to studying new ways of addressing current problems, so that we can have more accurate models and increase their credibility.
Abstract:
This paper presents, as a set of guidelines, how to apply the conservative distributed simulation paradigm (the CMB protocol) to develop efficient applications. Using these guidelines, even a user with little experience of distributed simulation and computer architecture can obtain good performance from distributed simulations using conservative synchronization protocols for parallel processes. The set of guidelines focuses on a specific application domain, the performance evaluation of computer systems, considering models with coarse granularity and few logical processes, running over two platforms: parallel (a high-performance communication environment) and distributed (a low-performance communication environment).
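A minimal sketch of the conservative (CMB) synchronization idea the guidelines build on, assuming one lookahead value per logical process; class and method names are illustrative, and deadlock avoidance via null messages is reduced to its timestamp bookkeeping.

```python
import heapq

class LogicalProcess:
    """Minimal sketch of a CMB (Chandy-Misra-Bryant) conservative LP:
    events on each input channel arrive in timestamp order, and null
    messages advertise a lower bound so receivers can safely advance."""

    def __init__(self, name, in_channels, lookahead):
        self.name = name
        self.lookahead = lookahead
        self.clock = 0.0
        self.channel_clock = {c: 0.0 for c in in_channels}  # last ts per channel
        self.pending = []                                   # (ts, payload) heap

    def receive(self, channel, ts, payload=None):
        self.channel_clock[channel] = ts
        if payload is not None:          # null messages carry no payload
            heapq.heappush(self.pending, (ts, payload))

    def safe_time(self):
        # An event is safe iff no channel can still deliver an earlier one
        return min(self.channel_clock.values())

    def step(self):
        safe = self.safe_time()
        while self.pending and self.pending[0][0] <= safe:
            ts, payload = heapq.heappop(self.pending)
            self.clock = ts
            print(f"{self.name}: processed {payload!r} at t={ts}")
        # Outgoing null message: promise not to send earlier than this
        return self.clock + self.lookahead

lp = LogicalProcess("LP1", in_channels=["a", "b"], lookahead=0.5)
lp.receive("a", 1.0, "job")
lp.receive("b", 2.0)            # null message advances channel b
print("next null-message timestamp:", lp.step())
```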
Abstract:
A theoretical investigation has been carried out to characterize the bulk and selected surfaces of anatase TiO2. The calculations are performed using the B3LYP hybrid functional and a 6-31G basis set within the periodic density functional approximation. Optimization procedures have been employed to determine the equilibrium geometry of the crystal and slab surface models. The compressibility, band structure, and the bulk and surface charge distributions are reported. The relative surface energies follow the sequence (001) < (101) < (100) ≪ (110) ≪ (111), from the most stable surface to the least stable one. Relaxation of the (001) and (101) surfaces is moderate, with no displacements exceeding approximately 0.19 Å. The theoretical results are compared with previous theoretical studies and available experimental data. © 2001 Elsevier B.V. All rights reserved.
Abstract:
This paper describes a novel approach to mapping lightning processes using fuzzy logic. The estimation is carried out by a fuzzy system based on Sugeno's architecture. Simulation results confirm that the proposed approach can be used efficiently in this type of problem.
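The abstract does not publish the system's rule base, so the following is a generic zero-order Sugeno inference sketch with two invented rules and made-up membership parameters, only to illustrate the weighted-average architecture the paper refers to.

```python
import numpy as np

def gauss(x, c, sigma):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def sugeno_estimate(field_strength, distance):
    """Zero-order Sugeno inference with two illustrative rules; the real
    system's rule base and membership functions are not published here."""
    # Rule firing strengths (product t-norm over the two antecedents)
    w1 = gauss(field_strength, c=8.0, sigma=2.0) * gauss(distance, c=5.0, sigma=3.0)
    w2 = gauss(field_strength, c=3.0, sigma=2.0) * gauss(distance, c=20.0, sigma=5.0)
    z1, z2 = 0.9, 0.2          # crisp consequents: high / low strike likelihood
    return (w1 * z1 + w2 * z2) / (w1 + w2)   # weighted-average defuzzification

print(sugeno_estimate(field_strength=7.5, distance=6.0))
```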
Abstract:
A Lyapunov-based stabilizing control design method for uncertain nonlinear dynamical systems using fuzzy models is proposed. The controller is constructed using a design model of the dynamical process to be controlled, obtained from the truth model using a fuzzy modeling approach. The truth model represents a detailed description of the process dynamics and is used in a simulation experiment to evaluate the performance of the controller design. A method for generating the local models that constitute the design model is proposed. Sufficient conditions for the stability and stabilizability of fuzzy models using fuzzy state-feedback controllers are given. The results obtained are illustrated with a numerical example involving a four-dimensional nonlinear model of a stick balancer.
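A minimal sketch of fuzzy state feedback in the parallel-distributed-compensation style, assuming two invented local linear models blended by a membership function; this is not the paper's four-dimensional stick balancer, and the gains are chosen by hand rather than derived from the paper's Lyapunov conditions.

```python
import numpy as np

# Two local linear models (the fuzzy design model) sharing input matrix B
A1 = np.array([[0.0, 1.0], [1.0, 0.0]])
A2 = np.array([[0.0, 1.0], [2.0, 0.0]])
B  = np.array([[0.0], [1.0]])

# Local state-feedback gains (illustrative: both local closed loops
# place their poles at -2, so the blended system is stable as well)
K1 = np.array([[5.0, 4.0]])
K2 = np.array([[6.0, 4.0]])

def memberships(x1):
    h1 = np.exp(-x1 ** 2)        # "x1 near zero" rule weight
    return h1, 1.0 - h1

x = np.array([1.0, 0.0])         # initial state
dt = 0.01
for _ in range(500):             # 5 s of simulated response (Euler steps)
    h1, h2 = memberships(x[0])
    A = h1 * A1 + h2 * A2                    # blended design model
    u = -(h1 * K1 + h2 * K2) @ x             # fuzzy state feedback
    x = x + dt * (A @ x + (B @ u).ravel())

print("state after 5 s (should be near the origin):", x)
```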
Abstract:
Today, the trend within the electronics industry is towards the use of rapid and advanced simulation methodologies in association with synthesis toolsets. This paper presents an approach developed to support mixed-signal circuit design and analysis. The proposed methodology takes a novel approach to developing behavioural model descriptions of mixed-signal circuit topologies: by constructing a set of subsystems, it supports the automated mapping of MATLAB®/Simulink® models to structural VHDL-AMS descriptions. The tool developed, named MS2SV, reads a Simulink® model file and translates it to structural VHDL-AMS code. It also creates the file structure required to simulate the translated model in SystemVision™. To validate the methodology and the developed program, the DAC08, AD7524, and AD5450 data converters were studied and initially modelled in MATLAB®/Simulink®. The VHDL-AMS code generated automatically by MS2SV (MATLAB®/Simulink® to SystemVision™) was then simulated in SystemVision™. The simulation results show that the proposed approach, based on VHDL-AMS descriptions of the original model library elements, allows behavioural-level simulation of complex mixed-signal circuits.