950 results for Java Simulation Tools


Relevance: 100.00%

Abstract:

Developments in theory and experiment have raised the prospect of an electronic technology based on the discrete nature of electron tunnelling through a potential barrier. This thesis deals with novel design and analysis tools developed to study such systems. Possible devices include those constructed from ultrasmall normal tunnelling junctions. These exhibit charging effects, including the Coulomb blockade and correlated electron tunnelling. They allow transistor-like control of the transfer of single carriers, and present the prospect of digital systems operating at the information-theoretic limit. As such, they are often referred to as single electronic devices. Single electronic devices exhibit self-quantising logic and good structural tolerance. Their speed, immunity to thermal noise, and operating voltage all scale beneficially with junction capacitance. For ultrasmall junctions, room-temperature operation at sub-picosecond timescales seems feasible. However, such devices are sensitive to external charge, whether from trapping-detrapping events, externally gated potentials, or system cross-talk. Quantum effects such as macroscopic quantum tunnelling of charge may degrade performance. Finally, any practical system will be complex and spatially extended (amplifying the above problems), and prone to fabrication imperfection. This summarises why new design and analysis tools are required. Simulation tools are developed, concentrating on the basic building blocks of single electronic systems: the tunnelling junction array and the gated turnstile device. Three main points are considered: the best method of estimating capacitance values from the physical system geometry; the mathematical model which should represent electron tunnelling based on these data; and the application of this model to the investigation of single electronic systems. (DXN004909)
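
For readers unfamiliar with the charging-effect modelling the thesis refers to, the following Java sketch evaluates the standard orthodox-theory tunnelling rate for a single ultrasmall junction. It is a minimal illustration under assumed values (the junction capacitance, tunnel resistance and temperature are placeholders), not the thesis's own simulation tool.

    /** Minimal sketch: orthodox-theory tunnelling rate for a single ultrasmall junction. */
    public class TunnelRate {
        static final double E = 1.602176634e-19;   // elementary charge (C)
        static final double KB = 1.380649e-23;     // Boltzmann constant (J/K)

        /** Rate (1/s) of a tunnelling event that lowers the system free energy by dF (J),
         *  through a junction of tunnel resistance rt (ohm) at temperature t (K). */
        static double rate(double dF, double rt, double t) {
            if (t <= 0.0) return dF > 0 ? dF / (E * E * rt) : 0.0;   // zero-temperature limit
            double x = dF / (KB * t);
            if (Math.abs(x) < 1e-12) return KB * t / (E * E * rt);   // dF -> 0 limit
            return (dF / (E * E * rt)) / (1.0 - Math.exp(-x));
        }

        public static void main(String[] args) {
            double c = 1e-18;                        // junction capacitance (F), illustrative
            double ec = E * E / (2 * c);             // single-electron charging energy (J)
            System.out.printf("Charging energy: %.3e J (%.1f meV)%n", ec, ec / E * 1e3);
            System.out.printf("Rate for dF = Ec at 4.2 K: %.3e 1/s%n", rate(ec, 1e5, 4.2));
        }
    }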

Relevance: 100.00%

Abstract:

Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable and healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis contains more advanced daylighting metrics (Climate-Based Daylight Metrics, CBDM). Yet these tools (new metrics or simulation tools) are not currently understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that can import a wide variety of files or can be integrated into the current 3D modelling software or package. These tools need to support both point-in-time simulations and annual analyses. There is a current need for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in based software is attempting to meet this need through third-party analysis, although some of these packages are heavily reliant on their host program. Such programs, however, allow dynamic daylighting simulation, making it easier to calculate daylighting accurately regardless of the modelling platform the designer uses, while producing more tangible analyses without the need to process raw data manually.
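
As a rough illustration of the metrics discussed above, the Java sketch below computes a daylight factor from paired indoor/outdoor illuminance readings and a simple daylight-autonomy figure (one style of climate-based metric) from an hourly illuminance series. All illuminance values and thresholds are hypothetical, and the sketch is not drawn from any of the surveyed tools.

    import java.util.Arrays;

    /** Minimal sketch: two common daylighting metrics. Illuminance values are hypothetical, in lux. */
    public class DaylightMetrics {

        /** Daylight factor (%) = indoor / outdoor illuminance under an overcast sky. */
        static double daylightFactor(double indoorLux, double outdoorLux) {
            return 100.0 * indoorLux / outdoorLux;
        }

        /** Daylight autonomy: fraction of occupied hours in which the indoor
         *  illuminance meets or exceeds the target level (a simple CBDM-style metric). */
        static double daylightAutonomy(double[] hourlyIndoorLux, double targetLux) {
            long met = Arrays.stream(hourlyIndoorLux).filter(e -> e >= targetLux).count();
            return (double) met / hourlyIndoorLux.length;
        }

        public static void main(String[] args) {
            System.out.printf("DF = %.1f %%%n", daylightFactor(200, 10000));
            double[] hours = {50, 120, 300, 450, 600, 520, 310, 90};   // one occupied day
            System.out.printf("DA(300 lx) = %.0f %%%n", 100 * daylightAutonomy(hours, 300));
        }
    }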

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 90.00%

Abstract:

Simulating passenger flows within airports is very important, as it can provide an indication of queue lengths, bottlenecks, system capacity and overall level of service. To date, visual simulation tools such as agent-based models have focused on processing formalities such as check-in, and have not incorporated discretionary activities such as duty-free shopping. As airport retail contributes greatly to airport revenue generation, but also has potentially detrimental effects on facilitation efficiency benchmarks, this study developed a simplified simulation model which captures common duty-free purchasing opportunities as well as high-level passenger behaviours. It is argued that such a model enables more realistic simulation of passenger facilitation, and provides a platform for simulating real-time revenue generation as well as more complex passenger behaviours within the airport. Simulations are conducted to verify the suitability of the model for inclusion in the international arrivals process for assessing passenger flow and infrastructure utilization.
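
The following Java sketch shows, in heavily simplified form, the kind of arrivals-flow model the abstract describes: passengers queue for immigration and, once cleared, may make a discretionary duty-free stop that generates revenue. The arrival rate, desk capacity, shopping probability and spend figures are illustrative assumptions, not values from the study.

    import java.util.Random;

    /** Minimal sketch of an arrivals-flow model with an optional duty-free stop.
     *  All rates, probabilities and spend figures are illustrative assumptions. */
    public class ArrivalsFlowSketch {
        public static void main(String[] args) {
            Random rng = new Random(42);
            int minutes = 120;                 // simulated horizon
            double arrivalsPerMin = 3.0;       // mean passenger arrivals per minute
            int immigrationDesks = 4;          // each desk clears one passenger per minute
            double shopProbability = 0.35;     // chance a cleared passenger visits duty-free
            double meanSpend = 48.0;           // mean duty-free spend per shopping passenger

            int immigrationQueue = 0;
            double revenue = 0.0;

            for (int t = 0; t < minutes; t++) {
                // Arrivals approximated by a simple noisy per-minute draw.
                int arrivals = (int) Math.round(arrivalsPerMin + rng.nextGaussian());
                immigrationQueue += Math.max(arrivals, 0);

                // Immigration processing.
                int cleared = Math.min(immigrationQueue, immigrationDesks);
                immigrationQueue -= cleared;

                // Cleared passengers may make a discretionary duty-free stop.
                for (int i = 0; i < cleared; i++) {
                    if (rng.nextDouble() < shopProbability) {
                        revenue += meanSpend * (0.5 + rng.nextDouble());  // spend varies 50-150%
                    }
                }
                if (t % 30 == 0) {
                    System.out.printf("t=%3d min  queue=%3d  revenue=$%.0f%n",
                            t, immigrationQueue, revenue);
                }
            }
        }
    }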

Relevance: 90.00%

Abstract:

In microscopic traffic simulators, the interaction between vehicles is considered. The dynamics of the system then becomes an emergent property of the interaction between its components. Such interactions include lane-changing, car-following behaviours and intersection management. Although such simulators produce realistic predictions in some cases, they do not allow for an important aspect of the dynamics, namely the driver-vehicle interaction. This paper introduces a physically sound vehicle-driver model for realistic microscopic simulation. By building a nanoscopic traffic simulation model that uses steering angle and throttle position as parameters, the model aims to overcome the unrealistic acceleration and deceleration values found in various microscopic simulation tools. A physics engine calculates the driving force of the vehicle, and the preliminary results presented here show that, with a realistic driver-vehicle-environment simulator, it becomes possible to model realistic driver and vehicle behaviours in a traffic simulation.
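
A minimal Java sketch of the physics-based idea described above follows: longitudinal vehicle motion is driven by throttle position, with drive force bounded by a maximum and opposed by drag and rolling resistance, so acceleration stays within physically plausible limits. All vehicle parameters are hypothetical, and the model is far simpler than the nanoscopic simulator the paper presents.

    /** Minimal sketch: longitudinal vehicle dynamics driven by throttle position,
     *  illustrating how a physics-based model bounds acceleration realistically.
     *  All vehicle parameters are hypothetical. */
    public class VehicleDynamicsSketch {
        public static void main(String[] args) {
            double mass = 1400;          // kg
            double maxDriveForce = 6000; // N at full throttle
            double dragCoeff = 0.36;     // aerodynamic drag factor (0.5 * rho * Cd * A)
            double rollCoeff = 0.015;    // rolling-resistance coefficient
            double g = 9.81;
            double dt = 0.1;             // s
            double v = 0;                // m/s

            for (double t = 0; t <= 20; t += dt) {
                double throttle = t < 10 ? 0.8 : 0.0;         // ease off after 10 s
                double drive = throttle * maxDriveForce;
                double drag = dragCoeff * v * v;
                double roll = rollCoeff * mass * g;
                double accel = (drive - drag - (v > 0 ? roll : 0)) / mass;
                v = Math.max(0, v + accel * dt);              // forward Euler integration
                if (Math.round(t * 10) % 20 == 0)             // print every 2 s
                    System.out.printf("t=%4.1f s  v=%5.1f m/s  a=%5.2f m/s^2%n", t, v, accel);
            }
        }
    }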

Relevance: 90.00%

Abstract:

In power hardware-in-the-loop (PHIL) simulations, a real-time simulated power system is interfaced to a piece of hardware, usually called the hardware under test (HuT). A PHIL test can be realized using several simulation tools. Among them, the Real Time Digital Simulator (RTDS) is an ideal tool for performing complex power system simulations in near real time. Stable operation of the entire system and the accuracy of the simulation results are the main concerns in a PHIL simulation. In this paper, a power network simulated on RTDS is interfaced to the HuT through a voltage source converter (VSC). Stability and other interface problems are studied, and a new method to stabilize some unstable PHIL cases is proposed. PHIL simulation results in PSCAD and RSCAD are presented.
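
To illustrate the kind of interface stability issue the paper addresses, the Java sketch below implements the commonly used voltage-type ideal transformer model (ITM) interface with a one-step feedback delay, showing how the loop diverges when the source-to-hardware impedance ratio exceeds one and how a first-order filter on the fed-back current can settle it. This is a textbook illustration with illustrative impedances, not the stabilization method proposed in the paper.

    /** Minimal sketch of a voltage-type ideal-transformer-model (ITM) PHIL interface:
     *  the simulated Thevenin source drives a resistive hardware model through a
     *  one-step feedback delay; a first-order filter on the fed-back current is one
     *  common stabilisation measure. Impedance values are illustrative only. */
    public class PhilInterfaceSketch {
        static void run(double zSource, double zHardware, double filterGain) {
            double vSource = 100.0;       // simulated Thevenin voltage (V)
            double iFeedback = 0.0;       // current injected back into the simulation (A)
            for (int step = 1; step <= 8; step++) {
                double vPoc = vSource - zSource * iFeedback;        // point-of-coupling voltage
                double iHardware = vPoc / zHardware;                // hardware-under-test response
                iFeedback += filterGain * (iHardware - iFeedback);  // low-pass feedback
                System.out.printf("step %d  v=%8.2f V  i=%8.2f A%n", step, vPoc, iFeedback);
            }
        }
        public static void main(String[] args) {
            System.out.println("Unfiltered feedback, Zs/Zh = 1.5 (diverges):");
            run(1.5, 1.0, 1.0);
            System.out.println("Filtered feedback, same impedance ratio (settles):");
            run(1.5, 1.0, 0.5);
        }
    }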

Relevance: 90.00%

Abstract:

Because of the variable and changing environment, advisors and farmers are seeking systems that provide risk management support at a number of time scales. The Agricultural Production Systems Research Unit, Toowoomba, Australia, has developed a suite of tools to assist advisors and farmers to better manage risk in cropping. These tools range from simple rainfall analysis tools (Rainman, HowWet, HowOften) through crop simulation tools (WhopperCropper and YieldProphet) to the most complex, APSFarm, a whole-farm analysis tool. Most are derivatives of the APSIM crop model. These tools encompass a range of complexity and potential benefit, both to the farming community and for government policy. This paper describes the development and usage of two specific products, WhopperCropper and APSFarm. WhopperCropper facilitates simulation-aided discussion of growers' exposure to risk when comparing alternative crop input options. The user can readily generate 'what-if' scenarios that separate the major influences while holding other factors constant; interactions of the major inputs can also be tested. A manager can examine the effects of input levels (and Southern Oscillation Index phase) to broadly determine input levels that match their attitude to risk. APSFarm has been used to demonstrate that management changes can have different effects over short and long time periods. It can be used to test local advisors' and farmers' knowledge and experience of their desired rotation system. This study has shown that crop type has a larger influence than more conservative minimum soil water triggers in the long term. However, in short-term dry periods, minimum soil water triggers and the maximum area of the various crops can give significant financial gains.
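
As a sketch of the 'what-if' scenario generation that WhopperCropper supports, the following Java snippet crosses two input factors (nitrogen rate and sowing window) while holding everything else constant. The yield response function is a hypothetical placeholder standing in for the APSIM crop model.

    /** Minimal sketch of 'what-if' scenario generation: vary one or two inputs across
     *  their levels while holding the rest constant. The yield response below is a
     *  placeholder, not the APSIM model. */
    public class WhatIfScenarios {
        // Hypothetical yield response (t/ha) to nitrogen rate and sowing-window index.
        static double yield(double nitrogenKgHa, int sowingWindow) {
            double nResponse = 2.0 + 3.0 * (1 - Math.exp(-nitrogenKgHa / 80.0));
            double sowingPenalty = 0.3 * sowingWindow;      // later windows yield less
            return Math.max(0, nResponse - sowingPenalty);
        }

        public static void main(String[] args) {
            double[] nitrogenRates = {0, 40, 80, 120};       // kg N/ha
            int[] sowingWindows = {0, 1, 2};                 // early, mid, late
            System.out.println("N (kg/ha)  window  yield (t/ha)");
            for (double n : nitrogenRates)
                for (int w : sowingWindows)
                    System.out.printf("%8.0f  %6d  %6.2f%n", n, w, yield(n, w));
        }
    }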

Relevance: 90.00%

Abstract:

Agricultural systems models worldwide are increasingly being used to explore options and solutions for the food security, climate change adaptation and mitigation and carbon trading problem domains. APSIM (Agricultural Production Systems sIMulator) is one such model that continues to be applied and adapted to this challenging research agenda. From its inception twenty years ago, APSIM has evolved into a framework containing many of the key models required to explore changes in agricultural landscapes with capability ranging from simulation of gene expression through to multi-field farms and beyond. Keating et al. (2003) described many of the fundamental attributes of APSIM in detail. Much has changed in the last decade, and the APSIM community has been exploring novel scientific domains and utilising software developments in social media, web and mobile applications to provide simulation tools adapted to new demands. This paper updates the earlier work by Keating et al. (2003) and chronicles the changing external challenges and opportunities being placed on APSIM during the last decade. It also explores and discusses how APSIM has been evolving to a “next generation” framework with improved features and capabilities that allow its use in many diverse topics.

Relevance: 90.00%

Abstract:

In a probabilistic assessment of the performance of structures subjected to uncertain environmental loads such as earthquakes, an important problem is to determine the probability that the structural response exceeds some specified limits within a given duration of interest. This problem is known as the first excursion problem, and it has been a challenging problem in the theory of stochastic dynamics and reliability analysis. In spite of the enormous amount of attention the problem has received, there is no procedure available for its general solution, especially for engineering problems of interest where the complexity of the system is large and the failure probability is small.

The application of simulation methods to solving the first excursion problem is investigated in this dissertation, with the objective of assessing the probabilistic performance of structures subjected to uncertain earthquake excitations modeled by stochastic processes. From a simulation perspective, the major difficulty in the first excursion problem comes from the large number of uncertain parameters often encountered in the stochastic description of the excitation. Existing simulation tools are examined, with special regard to their applicability in problems with a large number of uncertain parameters. Two efficient simulation methods are developed to solve the first excursion problem. The first method is developed specifically for linear dynamical systems, and it is found to be extremely efficient compared to existing techniques. The second method is more robust to the type of problem, and it is applicable to general dynamical systems. It is efficient for estimating small failure probabilities because the computational effort grows at a much slower rate with decreasing failure probability than standard Monte Carlo simulation. The simulation methods are applied to assess the probabilistic performance of structures subjected to uncertain earthquake excitation. Failure analysis is also carried out using the samples generated during simulation, which provide insight into the probable scenarios that will occur given that a structure fails.
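
For context, the Java sketch below shows the standard Monte Carlo baseline for a first excursion problem: a linear single-degree-of-freedom oscillator is driven by a discretised white-noise excitation, and the failure probability is estimated as the fraction of samples whose response exceeds a displacement threshold. The oscillator parameters and threshold are illustrative, and the sketch represents the baseline method, not the two more efficient methods developed in the dissertation.

    import java.util.Random;

    /** Minimal sketch: standard Monte Carlo estimate of a first-excursion probability
     *  for a linear single-degree-of-freedom oscillator under a white-noise-like
     *  excitation. All parameters are illustrative. */
    public class FirstExcursionMC {
        public static void main(String[] args) {
            Random rng = new Random(1);
            double wn = 2 * Math.PI;      // natural frequency (rad/s)
            double zeta = 0.05;           // damping ratio
            double dt = 0.01, duration = 10.0;
            double sigma = 1.0;           // excitation intensity
            double threshold = 0.45;      // |x| > threshold defines failure (~3 sigma here)
            int samples = 20000, failures = 0;

            for (int n = 0; n < samples; n++) {
                double x = 0, v = 0;
                boolean failed = false;
                for (double t = 0; t < duration && !failed; t += dt) {
                    double f = sigma * rng.nextGaussian() / Math.sqrt(dt); // discretised white noise
                    double a = f - 2 * zeta * wn * v - wn * wn * x;
                    v += a * dt;                          // semi-implicit Euler integration
                    x += v * dt;
                    if (Math.abs(x) > threshold) failed = true;
                }
                if (failed) failures++;
            }
            double pf = (double) failures / samples;
            System.out.printf("Estimated first-excursion probability: %.4f%n", pf);
            System.out.printf("Coefficient of variation of estimate:  %.2f%n",
                    Math.sqrt((1 - pf) / (pf * samples)));
        }
    }

The coefficient-of-variation line illustrates the point made in the abstract: for standard Monte Carlo the relative error grows as the failure probability shrinks, which is why more efficient methods are needed for small probabilities.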

Relevance: 90.00%

Abstract:

Nowadays, in-lab train control simulation tools play a crucial role in reducing extensive and expensive on-site railway testing activities. In this paper, we present our contribution in this arena by detailing the internals of our European Rail Traffic Management System (ERTMS) in-lab demonstrator. This demonstrator is built on a general-purpose simulation framework, Riverbed Modeler (previously OPNET Modeler). Our framework models both ERTMS subsystems: the Automatic Train Protection application layer, based on movement authority message exchange, and the telecommunication subsystem, based on GSM-R communication technology. We provide detailed information on our modelling strategy and validate our simulation framework with real trace data. Under the current industry migration scenario from the legacy and increasingly obsolete GSM-R to IP-based heterogeneous technologies, our simulation framework is a valuable tool for railway operators. As an example, we present the assessment of related performance indicators for a specific railway network using a candidate replacement technology, LTE, versus the current legacy technology. To the best of our knowledge, there is no similar initiative able to measure the impact of the telecommunication subsystem on railway network availability.
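
As an illustration of the kind of performance indicator comparison mentioned above, the Java sketch below estimates the mean movement-authority round-trip delay and delivery ratio over two parameterised radio bearers. The latency, jitter and loss figures are hypothetical placeholders, not measured GSM-R or LTE values, and the model is far simpler than the Riverbed-based framework described.

    import java.util.Random;

    /** Minimal sketch: compare movement-authority round-trip delays over two candidate
     *  radio bearers. Latency distributions and loss rates are hypothetical placeholders. */
    public class MaDelaySketch {
        static double[] simulate(double meanLatencyMs, double jitterMs, double lossRate,
                                 int exchanges, Random rng) {
            double total = 0;
            int delivered = 0;
            for (int i = 0; i < exchanges; i++) {
                if (rng.nextDouble() < lossRate) continue;      // message lost, MA not renewed
                double uplink = meanLatencyMs + jitterMs * Math.abs(rng.nextGaussian());
                double downlink = meanLatencyMs + jitterMs * Math.abs(rng.nextGaussian());
                total += uplink + downlink;
                delivered++;
            }
            return new double[] {total / delivered, (double) delivered / exchanges};
        }

        public static void main(String[] args) {
            Random rng = new Random(7);
            double[] legacy = simulate(220, 60, 0.01, 10000, rng);    // "GSM-R-like" bearer
            double[] candidate = simulate(45, 15, 0.002, 10000, rng); // "LTE-like" bearer
            System.out.printf("Legacy bearer:    mean RTT %.0f ms, delivery %.2f %%%n",
                    legacy[0], 100 * legacy[1]);
            System.out.printf("Candidate bearer: mean RTT %.0f ms, delivery %.2f %%%n",
                    candidate[0], 100 * candidate[1]);
        }
    }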

Relevance: 90.00%

Abstract:

The newly formed Escape and Evacuation Naval Authority regulates the provision of abandonment equipment and procedures for all Ministry of Defence vessels. As such, it ensures that access routes on board are evaluated early in the design process to maximize their efficiency and to eliminate, as far as possible, any congestion that might occur during escape. This analysis can be undertaken using a computer-based simulation of given escape scenarios that replicates the layout of the vessel and the interactions between each individual and the ship structure. One software tool that facilitates this type of analysis is maritimeEXODUS. This tool, through large-scale testing and validation, emulates human shipboard behaviour during emergency scenarios; however, it is largely based around the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. Hence there existed a clear requirement to understand the behaviour of well-trained naval personnel, as opposed to civilian passengers, and to be able to model the fixtures and fittings that are exclusive to warships, thus allowing improvements to both maritimeEXODUS and other software products. Human factors trials using the Royal Navy training facilities at Whale Island, Portsmouth, were recently undertaken to collect data that improves our understanding of these differences. It is hoped that this data will form the basis of a long-term improvement package that will provide global validation of these simulation tools and assist in the development of specific escape and evacuation standards for warships. © 2005: Royal Institution of Naval Architects.

Relevance: 90.00%

Abstract:

Problems in the preservation of the quality of granular material products are complex and arise from a series of sources during transport and storage. In either designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, practical measurement and simulation tools and technologies are required to support the process engineer. These technologies are needed both to help identify the source of such problems and then to design them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, which use, where possible, micro-mechanical relationships to characterize the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these computational models of unit processes in the analysis of large-scale processes involving the handling of granular materials. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of the material from a silo, its transport through a dilute-phase pneumatic conveyor, and its storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent key granular processes, including particle size segregation, degradation, and moisture migration caking.
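
The Java sketch below illustrates, at a toy level, the chaining of unit-process models described above: a particle fines fraction is updated through silo discharge and pneumatic conveying, and then feeds a simple caking indicator for storage. The relations used are hypothetical placeholders for the micro-mechanics-based models in the paper.

    /** Minimal sketch of chaining unit-process models for a granular handling line:
     *  silo discharge -> dilute-phase conveying -> big-bag storage. The fines-fraction
     *  and caking relations are hypothetical placeholders for the detailed models. */
    public class HandlingChainSketch {
        public static void main(String[] args) {
            double finesFraction = 0.05;      // mass fraction below the fines cut size

            // Silo discharge: segregation concentrates fines in the later part of the draw.
            double drawnFraction = 0.8;       // how much of the silo has been discharged
            finesFraction *= (1.0 + 0.5 * drawnFraction);

            // Pneumatic conveying: particle degradation generates additional fines,
            // growing with conveying velocity.
            double airVelocity = 25;          // m/s
            finesFraction += 0.002 * (airVelocity - 15);

            // Storage: a simple caking index from fines content, humidity and temperature swing.
            double relativeHumidity = 0.7, tempSwingC = 12;
            double cakingIndex = finesFraction * relativeHumidity * (tempSwingC / 10.0);

            System.out.printf("Fines fraction after conveying: %.3f%n", finesFraction);
            System.out.printf("Caking index in storage:        %.3f%n", cakingIndex);
        }
    }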

Relevance: 90.00%

Abstract:

The increasing complexity and scale of cloud computing environments, due to widespread data centre heterogeneity, makes measurement-based evaluations highly difficult to achieve. The use of simulation tools to support decision making in cloud computing environments is therefore an increasing trend. However, the data required to model cloud computing environments with an appropriate degree of accuracy is typically voluminous, very difficult to collect without some form of automation, often unavailable in a suitable format, and time-consuming to prepare manually. In this research, an automated method for cloud computing topology definition, data collection and model creation is presented, within the context of a suite of tools that have been developed and integrated to support these activities.
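
A minimal Java sketch of the automation idea follows: inventory records collected automatically from a data centre are parsed into a small topology model that a simulator could then consume. The record format, field names and host data are hypothetical.

    import java.util.ArrayList;
    import java.util.List;

    /** Minimal sketch of turning automatically collected data-centre inventory records
     *  into a simulation model definition. The record format and fields are hypothetical. */
    public class TopologyModelBuilder {
        record Host(String name, int cores, int memoryGb, String rack) {}

        static List<Host> parseInventory(List<String> lines) {
            List<Host> hosts = new ArrayList<>();
            for (String line : lines) {
                String[] f = line.split(",");
                hosts.add(new Host(f[0].trim(), Integer.parseInt(f[1].trim()),
                                   Integer.parseInt(f[2].trim()), f[3].trim()));
            }
            return hosts;
        }

        public static void main(String[] args) {
            // Records as they might arrive from an automated collection step.
            List<String> inventory = List.of(
                    "host-a01, 32, 256, rack-1",
                    "host-a02, 32, 256, rack-1",
                    "host-b01, 64, 512, rack-2");
            List<Host> model = parseInventory(inventory);
            int totalCores = model.stream().mapToInt(Host::cores).sum();
            System.out.println("Hosts in model:  " + model.size());
            System.out.println("Total cores:     " + totalCores);
        }
    }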

Relevance: 90.00%

Abstract:

Electricity markets are not only a new reality but an evolving one, as the involved players and rules change at a relatively high rate. Multi-agent simulation combined with artificial intelligence techniques may result in sophisticated tools that are very helpful in this context. Several simulation tools have already been developed, some of them very interesting. However, at the present stage it is important to take a step forward in electricity market simulators, as this is crucial for facing changes in power systems. This paper explains the context and needs of electricity market simulation, describing the most important characteristics of available simulators. We present our work on the MASCEM simulator, presenting its features as well as the improvements being made to address the changing and challenging reality of electricity markets.

Relevance: 90.00%

Abstract:

Electricity markets are complex environments with very particular characteristics. A critical issue regarding these specific characteristics concerns the constant changes they are subject to. This is a result of the restructuring of electricity markets, which was performed to increase competitiveness, but which also greatly increased the complexity and unpredictability of those markets. The constant growth in market unpredictability amplified the need for market participants to foresee market behaviour. The need to understand the market mechanisms, and how the interaction of the involved players affects market outcomes, contributed to the growing use of simulation tools. Multi-agent based software is particularly well suited to analysing dynamic and adaptive systems with complex interactions among their constituents, such as electricity markets. This dissertation presents ALBidS (Adaptive Learning strategic Bidding System), a multi-agent system created to provide decision support to market negotiating players. This system is integrated with the MASCEM electricity market simulator, so that its advantage in supporting a market player can be tested using cases based on real market data. ALBidS considers several methodologies based on very distinct approaches, to provide alternative suggestions for the best actions the supported player should perform. The approach chosen as the player's actual action is selected through reinforcement learning algorithms, which, for each situation, set of simulation circumstances and context, decide which proposed action has the highest likelihood of success. Some of the considered approaches are supported by a mechanism that creates profiles of competitor players. These profiles are built according to the competitors' observed past actions and reactions when faced with specific situations, such as success and failure. The system's context awareness and its analysis of the simulation circumstances, both in terms of results and execution-time adaptation, are complementary mechanisms that endow ALBidS with further adaptation and learning capabilities.
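
As a sketch of the reinforcement-learning selection step described above, the following Java snippet runs an epsilon-greedy bandit over a handful of candidate bidding strategies, maintaining an incremental value estimate for each and choosing the most promising one at every round. The strategy names and reward model are hypothetical and do not represent ALBidS's actual algorithms.

    import java.util.Random;

    /** Minimal sketch of reinforcement-learning strategy selection: an epsilon-greedy
     *  bandit keeps a value estimate per candidate bidding strategy and picks the most
     *  promising one. The reward model and strategy names are hypothetical. */
    public class StrategySelectorSketch {
        public static void main(String[] args) {
            String[] strategies = {"average-price", "regression-forecast", "competitor-profile"};
            double[] value = new double[strategies.length];   // running value estimates
            int[] plays = new int[strategies.length];
            double epsilon = 0.1;
            Random rng = new Random(3);

            for (int round = 0; round < 500; round++) {
                // Exploration vs exploitation.
                int choice;
                if (rng.nextDouble() < epsilon) {
                    choice = rng.nextInt(strategies.length);
                } else {
                    choice = 0;
                    for (int s = 1; s < strategies.length; s++)
                        if (value[s] > value[choice]) choice = s;
                }
                // Hypothetical reward: profit obtained by bidding with this strategy.
                double reward = new double[]{0.4, 0.55, 0.7}[choice] + 0.2 * rng.nextGaussian();
                plays[choice]++;
                value[choice] += (reward - value[choice]) / plays[choice];  // incremental mean
            }
            for (int s = 0; s < strategies.length; s++)
                System.out.printf("%-20s value=%.2f plays=%d%n", strategies[s], value[s], plays[s]);
        }
    }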