952 results for Dynamic simulation


Relevance:

30.00%

Publisher:

Abstract:

The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
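
The calibrate-then-invert idea can be sketched in a few lines. Everything below is a hypothetical stand-in: the quadratic per-port pressure model, its coefficients and the noise level are illustrative, not the HYFLEX CFD data, and only angle of attack is recovered.

```python
import numpy as np

# Hypothetical stand-in for CFD-computed port pressures: each of the
# nine ports gets a smooth pressure-vs-angle-of-attack relation.
rng = np.random.default_rng(0)
alphas = np.linspace(-5.0, 15.0, 21)          # calibration angles, deg
n_ports = 9

def cfd_port_pressure(alpha, k):
    """Synthetic 'CFD' pressure at port k (arbitrary smooth model)."""
    return 50.0 + 2.0 * k + 0.8 * alpha - 0.03 * (alpha - k) ** 2

# Calibration step: fit a quadratic empirical model per port to the
# 'CFD' samples.
coeffs = [np.polyfit(alphas, cfd_port_pressure(alphas, k), 2)
          for k in range(n_ports)]

def estimate_alpha(pressures):
    """Invert the calibrated model: least-squares search over alpha."""
    grid = np.linspace(-5.0, 15.0, 2001)
    model = np.array([np.polyval(c, grid) for c in coeffs])   # (9, N)
    resid = ((model - np.asarray(pressures)[:, None]) ** 2).sum(axis=0)
    return grid[np.argmin(resid)]

# 'Flight' measurement at alpha = 7.3 deg with small transducer noise.
meas = [cfd_port_pressure(7.3, k) + rng.normal(0, 0.05)
        for k in range(n_ports)]
alpha_hat = estimate_alpha(meas)
print(round(alpha_hat, 1))
```

The same least-squares inversion extends to sideslip and dynamic pressure by adding those variables to the model grid.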

Relevance:

30.00%

Publisher:

Abstract:

1. A model of the population dynamics of Banksia ornata was developed, using stochastic dynamic programming (a state-dependent decision-making tool), to determine optimal fire management strategies that incorporate trade-offs between biodiversity conservation and fuel reduction. 2. The modelled population of B. ornata was described by its age and density, and was exposed to the risk of unplanned fires and stochastic variation in germination success. 3. For a given population in each year, three management strategies were considered: (i) lighting a prescribed fire; (ii) controlling the incidence of unplanned fire; (iii) doing nothing. 4. The optimal management strategy depended on the state of the B. ornata population, with the time since the last fire (age of the population) being the most important variable. Lighting a prescribed fire at an age of less than 30 years was only optimal when the density of seedlings after a fire was low (<100 plants ha⁻¹) or when there were benefits of maintaining a low fuel load by using more frequent fire. 5. Because the cost of management was assumed to be negligible (relative to the value of the persistence of the population), the do-nothing option was never the optimal strategy, although lighting prescribed fires had only marginal benefits when the mean interval between unplanned fires was less than 20-30 years.
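
The state-dependent decision logic can be illustrated with a minimal backward-induction stochastic dynamic program. The state space (age only), rewards, fire probability and thresholds below are invented stand-ins, not the B. ornata parameters, but the qualitative behaviour (burning a young stand is not optimal) mirrors the abstract.

```python
import numpy as np

# State: years since the last fire. Actions: prescribed burn or nothing.
ages = np.arange(0, 51)
actions = ["burn", "nothing"]
p_unplanned = 0.05                 # yearly chance of an unplanned fire

def reward(age, action):
    # Persistence value grows with the seed bank (age), minus a fuel
    # penalty for old stands; burning a young stand is costly.
    persist = min(age / 30.0, 1.0)
    fuel_penalty = 0.02 * max(age - 30, 0)
    if action == "burn":
        return persist - 0.5 * (age < 15)
    return persist - fuel_penalty

horizon = 50
V = np.zeros(len(ages))
policy = np.empty((horizon, len(ages)), dtype=object)
for t in reversed(range(horizon)):          # backward induction
    V_new = np.empty_like(V)
    for i, age in enumerate(ages):
        q = {}
        for a in actions:
            if a == "burn":
                nxt = V[0]                  # fire resets age to 0
            else:
                i_up = min(i + 1, len(ages) - 1)
                nxt = p_unplanned * V[0] + (1 - p_unplanned) * V[i_up]
            q[a] = reward(age, a) + nxt
        best = max(q, key=q.get)
        policy[t, i] = best
        V_new[i] = q[best]
    V = V_new

print(policy[0, 5], policy[0, 50])
```

The exact optimal action for every state falls out of the value iteration, which is the advantage SDP has over forward Monte Carlo simulation.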

Relevance:

30.00%

Publisher:

Abstract:

A dynamic modelling methodology, which combines on-line variable estimation and parameter identification with physical laws to form an adaptive model for rotary sugar drying processes, is developed in this paper. In contrast to the conventional rate-based models using empirical transfer coefficients, the heat and mass transfer rates are estimated by using on-line measurements in the new model. Furthermore, a set of improved sectional solid transport equations with localized parameters is developed in this work to replace the global correlation for the computation of solid retention time. Since a number of key model variables and parameters are identified on-line using measurement data, the model is able to closely track the dynamic behaviour of rotary drying processes within a broad range of operational conditions. This adaptive model is validated against experimental data obtained from a pilot-scale rotary sugar dryer. The proposed modelling methodology can be easily incorporated into nonlinear model based control schemes to form a unified modelling and control framework.
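
On-line parameter identification of this kind is commonly built on recursive least squares. A scalar sketch, with a hypothetical linear measurement model standing in for the heat and mass transfer relations:

```python
import numpy as np

# Recursive least squares (RLS): estimate an unknown transfer-like
# coefficient theta from streaming measurements y = theta * u + noise.
# The measurement model and all numbers are illustrative stand-ins.
rng = np.random.default_rng(1)
theta_true = 2.5
theta_hat, P = 0.0, 1e3            # initial estimate and covariance

for _ in range(200):
    u = rng.uniform(0.5, 2.0)              # measured driving force
    y = theta_true * u + rng.normal(0, 0.05)
    # Standard scalar RLS update; a forgetting factor lam < 1 would
    # let the estimate track slowly drifting process parameters.
    lam = 1.0
    k = P * u / (lam + u * P * u)
    theta_hat += k * (y - theta_hat * u)
    P = (P - k * u * P) / lam

print(round(theta_hat, 2))
```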

Relevance:

30.00%

Publisher:

Abstract:

A generalised model for the prediction of single char particle gasification dynamics, accounting for multi-component mass transfer with chemical reaction, heat transfer, as well as structure evolution and peripheral fragmentation, is developed in this paper. Maxwell-Stefan analysis is uniquely applied to both micro- and macropores within the framework of the dusty-gas model to account for the bidisperse nature of the char, which differs significantly from the conventional models that are based on a single pore type. The peripheral fragmentation and random-pore correlation incorporated into the model enable prediction of structure/reactivity relationships. The occurrence of chemical reaction within the boundary layer reported by Biggs and Agarwal (Chem. Eng. Sci. 52 (1997) 941) has been confirmed through an analysis of the CO/CO2 product ratio obtained from model simulations. However, it is also quantitatively observed that the significance of the boundary layer reaction reduces notably with the reduction of oxygen concentration in the flue gas, operational pressure and film thickness. Computations have also shown that in the presence of diffusional gradients peripheral fragmentation occurs in the early stages on the surface, after which conversion quickens significantly due to the small particle size. Results of the early commencement of peripheral fragmentation at relatively low overall conversion obtained from a large number of simulations agree well with experimental observations reported by Feng and Bhatia (Energy & Fuels 14 (2000) 297). Comprehensive analysis of simulation results is carried out based on well-accepted physical principles to rationalise model predictions. (C) 2001 Elsevier Science Ltd. All rights reserved.
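
The random-pore correlation mentioned above has a standard closed form (the Bhatia-Perlmutter expression). A quick numerical check, with an illustrative structural parameter psi rather than one fitted to a particular char:

```python
import math

# Random-pore correlation relating reactive surface area to conversion
# X through a structural parameter psi: S/S0 = (1-X)*sqrt(1-psi*ln(1-X)).
def random_pore_area_ratio(X, psi):
    return (1.0 - X) * math.sqrt(1.0 - psi * math.log(1.0 - X))

psi = 4.0          # illustrative value; psi > 2 gives an area maximum
ratios = [random_pore_area_ratio(X, psi) for X in (0.0, 0.2, 0.5)]
# Surface area first rises above S0 (pore growth), then pore
# coalescence brings it back down at higher conversion.
print([round(r, 3) for r in ratios])
```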

Relevance:

30.00%

Publisher:

Abstract:

Developments in computer and three-dimensional (3D) digitiser technologies have made it possible to keep track of the broad range of data required to simulate an insect moving around or over the highly heterogeneous habitat of a plant's surface. Properties of plant parts vary within a complex canopy architecture, and insect damage can induce further changes that affect an animal's movements, development and likelihood of survival. Models of plant architectural development based on Lindenmayer systems (L-systems) serve as dynamic platforms for simulation of insect movement, providing an explicit model of the developing 3D structure of a plant as well as allowing physiological processes associated with plant growth and responses to damage to be described and simulated. Simple examples of the use of the L-system formalism to model insect movement, operating at different spatial scales, from insects foraging on an individual plant to insects flying around plants in a field, are presented. Such models can be used to explore questions about the consequences of changes in environmental architecture and configuration on host finding, exploitation and its population consequences. In effect this model is a 'virtual ecosystem' laboratory to address local as well as landscape-level questions pertinent to plant-insect interactions, taking plant architecture into account. (C) 2002 Elsevier Science B.V. All rights reserved.
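
The rewriting core of an L-system is compact enough to show directly. The rules below are a textbook example (Lindenmayer's A→AB, B→A), not the plant model from the paper:

```python
# Minimal L-system engine: production rules are applied in parallel to
# every module of the string at each derivation step.
def derive(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's original algae-growth rules: A -> AB, B -> A.
rules = {"A": "AB", "B": "A"}
out = derive("A", rules, 5)
print(out)        # successive string lengths follow the Fibonacci sequence
```

Plant models replace these symbols with parameterised modules (internodes, leaves, buds) and interpret the string geometrically to obtain the 3D structure insects move over.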

Relevance:

30.00%

Publisher:

Abstract:

It has been argued that power-law time-to-failure fits for cumulative Benioff strain and an evolution in size-frequency statistics in the lead-up to large earthquakes are evidence that the crust behaves as a Critical Point (CP) system. If so, intermediate-term earthquake prediction is possible. However, this hypothesis has not been proven. If the crust does behave as a CP system, stress correlation lengths should grow in the lead-up to large events through the action of small to moderate ruptures and drop sharply once a large event occurs. However, this evolution in stress correlation lengths cannot be observed directly. Here we show, using the lattice solid model to describe discontinuous elasto-dynamic systems subjected to shear and compression, that it is possible for correlation lengths to exhibit CP-type evolution. In the case of a granular system subjected to shear, this evolution occurs in the lead-up to the largest event and is accompanied by an increasing rate of moderate-sized events and power-law acceleration of Benioff strain release. In the case of an intact sample system subjected to compression, the evolution occurs only after a mature fracture system has developed. The results support the existence of a physical mechanism for intermediate-term earthquake forecasting and suggest this mechanism is fault-system dependent. This offers an explanation of why accelerating Benioff strain release is not observed prior to all large earthquakes. The results prove the existence of an underlying evolution in discontinuous elasto-dynamic systems which is capable of providing a basis for forecasting catastrophic failure and earthquakes.
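
The power-law time-to-failure relation for cumulative Benioff strain, eps(t) = A − B(tf − t)^m, becomes linear in A and B once the failure time tf is fixed, so it can be fitted by ordinary least squares. The constants below are illustrative (m ≈ 0.3 is in the commonly quoted range), and the data are synthetic:

```python
import numpy as np

# Synthetic accelerating Benioff-strain curve eps(t) = A - B*(tf - t)**m.
A, B, m, tf = 10.0, 3.0, 0.3, 100.0
t = np.linspace(0.0, 99.0, 200)
eps = A - B * (tf - t) ** m

# With tf assumed known, substitute x = (tf - t)**m; the model is then
# a straight line eps = A - B*x, recoverable by linear least squares.
x = (tf - t) ** m
coef = np.polyfit(x, eps, 1)       # slope ~ -B, intercept ~ A
B_hat, A_hat = -coef[0], coef[1]
print(round(A_hat, 2), round(B_hat, 2))
```

In practice tf is unknown and is scanned over a grid, keeping the (tf, m) pair with the smallest residual.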

Relevance:

30.00%

Publisher:

Abstract:

A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following different management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations.
It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.

Relevance:

30.00%

Publisher:

Abstract:

For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations, in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
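
The back-to-back principle can be sketched with a toy model: two implementations of the same first-order lag, one carrying a deliberately injected sign error, driven by identical inputs. The model and the fault are illustrative, not the paper's observer construction.

```python
# Back-to-back testing in its simplest form: run two independent
# implementations on identical inputs and inspect the residual.
def reference_step(x, u, a=0.9, b=0.1):
    return a * x + b * u

def buggy_step(x, u, a=0.9, b=0.1):
    return a * x - b * u          # injected sign error: a typical fault

def residuals(inputs):
    x1 = x2 = 0.0
    res = []
    for u in inputs:
        x1 = reference_step(x1, u)
        x2 = buggy_step(x2, u)
        res.append(x1 - x2)       # residual between the two outputs
    return res

r = residuals([1.0] * 20)
# A structured, persistently nonzero residual flags a coding error;
# identical implementations would give an all-zero residual.
print(max(abs(v) for v in r) > 1e-9)
```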

Relevance:

30.00%

Publisher:

Abstract:

In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.
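
The subspace test at the heart of the classification step can be sketched as a projection problem: a candidate error is admissible only if the observed residuals lie in the column span of its feature matrix. The matrices and residual below are illustrative stand-ins, not those derived in the paper.

```python
import numpy as np

def classify(feature, residuals, tol=1e-8):
    """'possible' if the residuals fit in span(feature), else 'impossible'."""
    proj, *_ = np.linalg.lstsq(feature, residuals, rcond=None)
    misfit = np.linalg.norm(feature @ proj - residuals)
    return "possible" if misfit < tol else "impossible"

F1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])   # spans e1, e2
F2 = np.array([[0.0], [0.0], [1.0]])                  # spans e3
r = np.array([[2.0], [-1.0], [0.0]])                  # lies in span(F1)

print(classify(F1, r), classify(F2, r))
```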

Relevance:

30.00%

Publisher:

Abstract:

Pectus excavatum is the most common congenital deformity of the anterior chest wall, in which an abnormal formation of the rib cage gives the chest a caved-in or sunken appearance. Today, the surgical correction of this deformity is carried out in children and adults through the Nuss technique, which consists of the placement of a prosthetic bar under the sternum and over the ribs. Although this technique has been shown to be safe and reliable, not all patients have achieved an adequate cosmetic outcome. This often leads to psychological problems and social stress, before and after the surgical correction. This paper targets this particular problem by presenting a method to predict the patient's surgical outcome based on pre-surgical imaging information and dynamic modelling of the chest skin. The proposed approach uses the patient's pre-surgical thoracic CT scan and anatomical-surgical references to perform a 3D segmentation of the left ribs, right ribs, sternum and skin. The technique encompasses three steps: a) approximation of the cartilages, between the ribs and the sternum, through B-spline interpolation; b) a volumetric mass-spring model that connects two layers - an inner skin layer based on the outer pleura contour and the outer skin surface; and c) displacement of the sternum according to the prosthetic bar position. A dynamic model of the skin around the chest wall region was generated, capable of simulating the effect of the movement of the prosthetic bar along the sternum. The results were compared and validated with the patient's post-surgical skin surface acquired with the Polhemus FastSCAN system.
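
The mass-spring idea can be illustrated in one dimension: a chain of nodes relaxing under a prescribed end displacement that stands in for the bar pushing the sternum. Geometry, stiffness and damping below are illustrative, not the paper's volumetric model.

```python
import numpy as np

n = 10                              # chain of nodes (a strip of 'skin')
x = np.linspace(0.0, 9.0, n)        # initial node positions
rest = 1.0                          # spring rest length
k_s, dt, damping = 50.0, 0.01, 0.9  # stiffness, time step, damping
v = np.zeros(n)

x[-1] += 2.0                        # prescribed displacement (the 'bar')
for _ in range(2000):               # damped explicit relaxation
    f = np.zeros(n)
    d = np.diff(x)                  # spring force: f = k * (stretch - rest)
    fs = k_s * (d - rest)
    f[:-1] += fs
    f[1:] -= fs
    v = damping * (v + dt * f)
    v[0] = v[-1] = 0.0              # both ends held fixed
    x[1:-1] += dt * v[1:-1]

# At equilibrium the interior nodes share the imposed stretch evenly.
print(round(x[5] - x[4], 2))
```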

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a predictive optimal matrix converter controller for a flywheel energy storage system used as a Dynamic Voltage Restorer (DVR). The flywheel energy storage device is based on a steel seamless tube mounted as a vertical-axis flywheel to store kinetic energy. The motor/generator is a Permanent Magnet Synchronous Machine driven by the AC-AC Matrix Converter. The matrix control method uses a discrete-time model of the converter system to predict the expected values of the input and output currents for all the 27 possible vectors generated by the matrix converter. An optimal controller minimizes control errors using a weighted cost functional. The flywheel and control process were tested as a DVR to mitigate voltage sags and swells. Simulation results show that the DVR is able to compensate the critical load voltage without delays, voltage undershoots or overshoots, overcoming the input/output coupling of matrix converters.
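
The enumerate-predict-minimize loop of such finite-set predictive control can be sketched as follows. The one-step plant model, the five candidate levels (standing in for the 27 matrix-converter vectors) and the cost weights are all hypothetical:

```python
def predict(state, vector):
    """Hypothetical one-step discrete-time model of the output current."""
    return 0.8 * state + 0.2 * vector

def best_vector(state, reference, candidates,
                w_track=1.0, w_switch=0.01, previous=0.0):
    """Evaluate every admissible vector and pick the cost minimizer."""
    def cost(v):
        err = reference - predict(state, v)
        # Weighted cost: tracking error plus a switching-effort penalty.
        return w_track * err ** 2 + w_switch * (v - previous) ** 2
    return min(candidates, key=cost)

candidates = [-1.0, -0.5, 0.0, 0.5, 1.0]   # stand-in switching levels
v = best_vector(state=0.0, reference=0.2, candidates=candidates)
print(v)
```

In the converter itself the prediction covers both input and output currents, and the cost weights trade off the two tracking errors.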

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new integrated model for variable-speed wind energy conversion systems, considering more accurate dynamics of the wind turbine, rotor, generator, power converter and filter. Pulse-width modulation by space vector modulation associated with sliding mode control is used for controlling the power converters. Also, power factor control is introduced at the output of the power converters. Comprehensive performance simulation studies are carried out with matrix, two-level and multilevel power converter topologies in order to adequately assess the system performance. Conclusions are duly drawn.

Relevance:

30.00%

Publisher:

Abstract:

Because of the adverse effect of CO2 from fossil fuel combustion on the earth's ecosystems, finding the most cost-effective method for CO2 capture is an important area of research. The predominant process for CO2 capture currently employed by industry is chemical absorption in amine solutions. A dynamic model of the desorption process in monoethanolamine (MEA) solution was developed. Henry's law was used for modelling the vapour-phase equilibrium of the CO2, and fugacity ratios calculated by the Peng-Robinson equation of state (EOS) were used for H2O, MEA, N2 and O2. Chemical reactions between CO2 and MEA were included in the model, along with the enhancement factor for chemical absorption. Liquid and vapour energy balances were developed to calculate the liquid and vapour temperatures, respectively.
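
The Henry's-law closure can be sketched directly. The reference Henry constant and its van 't Hoff-type temperature dependence below are illustrative round numbers, not fitted MEA-solution values:

```python
import math

# Henry's law for CO2: p_CO2 = H(T) * x_CO2, with an assumed van 't
# Hoff-type temperature dependence of the Henry constant.
H_ref, T_ref, C = 3.0e3, 298.15, 2000.0   # kPa/mol-frac, K, K (assumed)

def henry(T):
    return H_ref * math.exp(-C * (1.0 / T - 1.0 / T_ref))

def co2_partial_pressure(x_co2, T):
    """Equilibrium CO2 partial pressure over the liquid, kPa."""
    return henry(T) * x_co2

p25 = co2_partial_pressure(0.01, 298.15)
p60 = co2_partial_pressure(0.01, 333.15)
# Solubility falls with temperature, so the equilibrium partial
# pressure at 60 C exceeds the 25 C value for the same liquid loading,
# which is what drives the CO2 out of solution in the stripper.
print(round(p25, 1), round(p60, 1))
```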

Relevance:

30.00%

Publisher:

Abstract:

Electricity markets are complex environments, involving numerous entities trying to obtain the best advantages and profits while limited by power-network characteristics and constraints. The restructuring and consequent deregulation of electricity markets introduced a new economic dimension to the power industry. Some observers have criticized the restructuring process, however, because it has failed to improve market efficiency and has complicated the assurance of reliability and fairness of operations. To study and understand this type of market, we developed the Multiagent Simulator of Competitive Electricity Markets (MASCEM) platform based on multiagent simulation. The MASCEM multiagent model includes players with strategies for bid definition, acting in forward, day-ahead, and balancing markets and considering both simple and complex bids. Our goal with MASCEM was to simulate as many market models and player types as possible. This approach makes MASCEM both a short- and medium-term simulator as well as a tool to support long-term decisions, such as those taken by regulators. This article proposes a new methodology integrated in MASCEM for bid definition in electricity markets. This methodology uses reinforcement learning algorithms to let players perceive changes in the environment, thus helping them react to the dynamic environment and adapt their bids accordingly.
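
The bid-adaptation idea can be sketched with a simple action-value learner. Candidate prices, the uniform-price clearing rule and the cost figure are hypothetical stand-ins, not MASCEM's market models:

```python
import random

# A player keeps an action-value estimate per candidate bid price and
# updates it from realized profits (epsilon-greedy exploration).
random.seed(0)
bids = [30.0, 40.0, 50.0]           # candidate bid prices, EUR/MWh
q = {b: 0.0 for b in bids}
alpha, eps = 0.1, 0.1               # learning rate, exploration rate
cost, clearing_price = 25.0, 45.0   # stand-in market conditions

for _ in range(2000):
    b = random.choice(bids) if random.random() < eps else max(q, key=q.get)
    # A bid is accepted only if at or below the clearing price, and the
    # accepted energy is paid at the uniform clearing price.
    profit = (clearing_price - cost) if b <= clearing_price else 0.0
    q[b] += alpha * (profit - q[b])

best_bid = max(q, key=q.get)
print(best_bid)
```

In a real simulation the clearing price moves with the other agents' bids, which is exactly the non-stationarity that motivates on-line learning here.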

Relevance:

30.00%

Publisher:

Abstract:

Electricity markets are complex environments with very particular characteristics, a critical one being the constant changes they are subject to. This is a result of the restructuring of electricity markets, which was intended to increase competitiveness but also greatly increased their complexity and unpredictability. The growing unpredictability amplified the need for market participants to foresee market behaviour. The need to understand market mechanisms and how the interaction of the players involved affects market outcomes has driven the growing use of simulation tools. Multi-agent software is particularly well suited to analysing dynamic, adaptive systems with complex interactions among their constituents, such as electricity markets. This dissertation presents ALBidS - Adaptive Learning strategic Bidding System, a multiagent system created to provide decision support to market negotiating players. This system is integrated with the MASCEM electricity market simulator, so that its advantage in supporting a market player can be tested using cases based on real market data. ALBidS considers several methodologies based on distinct approaches, providing alternative suggestions for the best actions the supported player can perform. The action actually taken is selected by reinforcement learning algorithms, which, for each situation, simulation circumstance and context, decide which proposed action has the highest chance of success. Some of the considered approaches are supported by a mechanism that creates profiles of competitor players.
These profiles are built from competitors' observed past actions and reactions in specific situations, such as success and failure. The system's context awareness and its analysis of simulation circumstances, covering both result quality and execution time, are complementary mechanisms that endow ALBidS with further adaptation and learning capabilities.