946 results for Time domain simulation tools
Multivariate analyses of variance and covariance for simulation studies involving normal time series
Abstract:
Photocopy.
Abstract:
* This paper was prepared under the programme of fundamental scientific research of the Presidium of the Russian Academy of Sciences «Mathematical simulation and intellectual systems», within the project "Theoretical foundation of the intellectual systems based on ontologies for intellectual support of scientific researches".
Abstract:
Context. Fossil systems are defined as X-ray bright galaxy groups (or clusters) with a two-magnitude difference between their two brightest galaxies within half the projected virial radius, and represent an interesting extreme of the population of galaxy agglomerations. However, the physical conditions and processes leading to their formation are still poorly constrained. Aims. We compare the outskirts of fossil systems with those of normal groups to understand whether environmental conditions play a significant role in their formation. We study groups of galaxies in both numerical simulations and observations. Methods. We use a variety of statistical tools, including the spatial cross-correlation function and the local density parameter Delta(5), to probe differences in the density and structure of the environments of "normal" and "fossil" systems in the Millennium simulation. Results. We find that the number density of galaxies surrounding fossil systems evolves from greater than that observed around normal systems at z = 0.69 to lower than that of normal systems by z = 0. Both fossil and normal systems exhibit an increment in their otherwise radially declining local density measure (Delta(5)) at distances of order 2.5 r(vir) from the system centre. We show that this increment is more noticeable for fossil systems than for normal systems and demonstrate that this difference is linked to the earlier formation epoch of fossil groups. Despite the importance of the assembly time, we show that the environment differs between fossil and non-fossil systems of similar masses and formation times throughout their evolution. We also confirm that the physical characteristics identified in the Millennium simulation can be detected in SDSS observations. Conclusions. Our results confirm the commonly held belief that fossil systems assembled earlier than normal systems, but also show that the surroundings of fossil groups could be responsible for the formation of their large magnitude gap.
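The local density parameter Delta(5) is a nearest-neighbour density estimate (density inferred from the distance to the fifth nearest galaxy). As a hedged illustration of the general idea only, and not the paper's exact definition or normalisation, a minimal sketch might look like this; the function name, the box-mean normalisation and the toy data are assumptions:

```python
import numpy as np

def local_density_delta5(positions, targets, k=5):
    # Distance from each target to every galaxy position
    d = np.linalg.norm(targets[:, None, :] - positions[None, :, :], axis=-1)
    d.sort(axis=1)
    # Column 0 is the zero self-distance when each target also appears in
    # `positions`, so column k is the k-th nearest neighbour
    dk = d[:, k]
    density_k = k / (4.0 / 3.0 * np.pi * dk**3)          # galaxies per unit volume
    mean_density = len(positions) / np.prod(np.ptp(positions, axis=0))
    return density_k / mean_density                       # overdensity Delta_k

# Toy usage: 2000 random galaxies in a cube, Delta_5 around the first five of them
rng = np.random.default_rng(0)
galaxies = rng.uniform(0.0, 100.0, size=(2000, 3))
print(local_density_delta5(galaxies, galaxies[:5]))
```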
Abstract:
Computational simulations of the title reaction are presented, covering a temperature range from 300 to 2000 K. At lower temperatures we find that initial formation of the cyclopropene complex by addition of methylene to acetylene is irreversible, as is the stabilisation process via collisional energy transfer. Product branching between propargyl and the stable isomers is predicted at 300 K as a function of pressure for the first time. At intermediate temperatures (1200 K), complex temporal evolution involving multiple steady states begins to emerge. At high temperatures (2000 K) the timescale for subsequent unimolecular decay of thermalized intermediates begins to impinge on the timescale for reaction of methylene, such that the rate of formation of propargyl product does not admit a simple analysis in terms of a single time-independent rate constant until the methylene supply becomes depleted. Likewise, at the elevated temperatures the thermalized intermediates cannot be regarded as irreversible product channels. Our solution algorithm involves spectral propagation of a symmetrised version of the discretized master equation matrix, and is implemented in a high precision environment which makes hitherto unachievable low-temperature modelling a reality.
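The solution algorithm described in the last sentence rests on symmetrising the discretised master-equation matrix so that the populations can be propagated through an eigendecomposition. The sketch below shows that general pattern only, in ordinary double precision and with an invented two-isomer toy system; it is not the authors' high-precision implementation, and all rate values are illustrative:

```python
import numpy as np

def propagate_master_equation(K, p_eq, p0, times):
    """Spectral propagation of dp/dt = K p for a rate matrix K obeying
    detailed balance with equilibrium populations p_eq. With
    D = diag(sqrt(p_eq)), S = D^-1 K D is real symmetric, so eigh()
    gives a stable solution p(t) = D V exp(L t) V^T D^-1 p0.
    (Ordinary double precision here; the paper's point is that a
    high-precision environment is needed for low-temperature modelling.)"""
    d = np.sqrt(p_eq)
    S = K * (d[None, :] / d[:, None])          # S_ij = K_ij * sqrt(p_j / p_i)
    lam, V = np.linalg.eigh(0.5 * (S + S.T))   # symmetrise against round-off
    c = V.T @ (p0 / d)                         # expansion coefficients
    return np.array([d * (V @ (np.exp(lam * t) * c)) for t in times])

# Toy two-isomer system with equilibrium populations 0.7 / 0.3 (rates in s^-1)
p_eq = np.array([0.7, 0.3])
K = np.array([[-3.0,  7.0],
              [ 3.0, -7.0]])                   # columns sum to zero
p0 = np.array([0.0, 1.0])                      # start entirely in isomer 2
print(propagate_master_equation(K, p_eq, p0, [0.0, 0.1, 1.0]))
```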
Abstract:
Earthquakes and tsunamis along Morocco's coasts have been reported since historical times, and the threat posed by tsunamis must be included in coastal risk studies. This study focuses on the tsunami impact and vulnerability assessment of the Casablanca harbour and surrounding area using a combination of tsunami inundation numerical modelling, field survey data and a geographic information system. The tsunami scenario used here is compatible with the 1755 Lisbon event, which we consider to be the worst-case tsunami scenario. Hydrodynamic modelling was performed with an adapted version of the Cornell Multigrid Coupled Tsunami Model from Cornell University. The simulation covers the eastern domain of the Azores-Gibraltar fracture zone, corresponding to the largest tsunamigenic area in the North Atlantic. The proposed vulnerability model attempts to provide an insight into the tsunami vulnerability of the building stock. The results, in the form of a vulnerability map, will be useful for decision makers and local authorities in improving community resilience to tsunami hazards.
Abstract:
In almost all industrialized countries, the energy sector has undergone a severe restructuring that has brought greater complexity to market players' interactions. This complexity made way for the creation of decision support tools that facilitate the study and understanding of these markets. MASCEM, the "Multiagent Simulator for Competitive Electricity Markets", arose in this context, providing a framework for evaluating new rules, new behaviour, and new participants in deregulated electricity markets. MASCEM uses game theory, machine learning techniques, scenario analysis and optimisation techniques to model market agents and to provide them with decision support. ALBidS is a multiagent system created to provide decision support to market negotiating players. Fully integrated with MASCEM, it considers several different methodologies based on very distinct approaches. The Six Thinking Hats is a powerful technique used to look at decisions from different perspectives; its goal is to force the thinker to move outside his or her habitual thinking style. It was developed to be used mainly at meetings in order to "run better meetings, make faster decisions". This dissertation presents a study of the applicability of the Six Thinking Hats technique in decision support systems, particularly within the multiagent paradigm of the MASCEM simulator. This work therefore proposes a new agent, a meta-learner based on the Six Thinking Hats technique, that organizes several of ALBidS' strategies and combines their distinct answers into a single one that is expected to outperform any of them.
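The abstract does not spell out how the Six Thinking Hats meta-learner works internally, so the following is a purely hypothetical sketch of the generic pattern it describes: several strategies each propose an answer (here, a bid price), and a meta-learner combines them into a single answer while re-weighting each strategy according to its past accuracy. The class, method and strategy names are all illustrative assumptions, not ALBidS code:

```python
from dataclasses import dataclass, field

@dataclass
class MetaLearner:
    """Hypothetical sketch: combine the answers of several bidding strategies
    into one bid, weighting each strategy by its past accuracy (exponentially
    decayed absolute error). Not the ALBidS/Six Thinking Hats algorithm itself,
    only the general meta-learning pattern the abstract describes."""
    strategies: dict                       # name -> callable(context) -> price
    decay: float = 0.9
    weights: dict = field(default_factory=dict)

    def propose(self, context):
        answers = {n: s(context) for n, s in self.strategies.items()}
        w = {n: self.weights.get(n, 1.0) for n in answers}
        bid = sum(w[n] * a for n, a in answers.items()) / sum(w.values())
        return bid, answers

    def feedback(self, answers, realised_price):
        # Shrink the weight of strategies that were far from the market outcome
        for n, a in answers.items():
            err = abs(a - realised_price)
            old = self.weights.get(n, 1.0)
            self.weights[n] = self.decay * old + (1 - self.decay) / (1.0 + err)

# Usage with two toy strategies
ml = MetaLearner({"average": lambda ctx: ctx["avg"], "last": lambda ctx: ctx["last"]})
bid, answers = ml.propose({"avg": 45.0, "last": 52.0})
ml.feedback(answers, realised_price=50.0)
print(round(bid, 2), ml.weights)
```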
Abstract:
In previous works we proposed a hybrid wired/wireless PROFIBUS solution in which the interconnection between the heterogeneous media is accomplished through bridge-like devices, with wireless stations able to move between different wireless cells. We also proposed a worst-case timing analysis that assumed stationary stations. In this paper we advance these previous works by proposing a worst-case timing analysis for the system's message streams that considers the effect of inter-cell mobility.
Abstract:
A number of characteristics are fuelling the interest in extending Ethernet to also cover factory-floor distributed real-time applications. Full-duplex links, non-blocking and priority-based switching, and bandwidth availability, just to mention a few, are the characteristics upon which that interest is building. But will Ethernet technologies really manage to replace traditional fieldbus networks? Ethernet technology, by itself, does not include features above the lower layers of the OSI communication model. In the past few years, a considerable amount of work has been devoted to the timing analysis of Ethernet-based technologies. However, the majority of those works are restricted to the analysis of subsets of the overall computing and communication system, and thus do not address timeliness at a holistic level. To this end, we are addressing a few inter-linked research topics with the purpose of setting a framework for the development of tools suitable to extract temporal properties of Commercial-Off-The-Shelf (COTS) Ethernet-based factory-floor distributed systems. This framework is being applied to a specific COTS technology, Ethernet/IP. In this paper, we reason about the modelling and simulation of Ethernet/IP-based systems, and about the use of statistical analysis techniques to provide usable results. Discrete event simulation models of a distributed system can be a powerful tool for the timeliness evaluation of the overall system, but particular care must be taken with the results provided by traditional statistical analysis techniques.
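The closing caveat refers to the fact that discrete event simulation output (end-to-end delays, for instance) is autocorrelated, so confidence intervals computed as if the observations were independent are too narrow. One common remedy is the batch-means method; the following is a minimal sketch under that assumption, with illustrative function names, batch count and toy data rather than anything from the paper's framework:

```python
import math
import random
import statistics

def batch_means_ci(samples, n_batches=20, t_quantile=2.093):
    """Confidence interval for the steady-state mean of correlated simulation
    output (e.g. end-to-end frame delays) via batch means: split the run into
    batches and treat the batch averages as approximately independent.
    t_quantile is the Student-t value for n_batches - 1 degrees of freedom
    (2.093 ~ 95% for 19 dof); adjust it if n_batches changes."""
    size = len(samples) // n_batches
    means = [statistics.fmean(samples[i * size:(i + 1) * size])
             for i in range(n_batches)]
    grand_mean = statistics.fmean(means)
    half_width = t_quantile * statistics.stdev(means) / math.sqrt(n_batches)
    return grand_mean, half_width

# Toy usage on autocorrelated delays (AR(1)-like process)
random.seed(1)
delays, d = [], 1.0
for _ in range(20000):
    d = 0.9 * d + random.expovariate(10.0)
    delays.append(d)
print(batch_means_ci(delays))
```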
Abstract:
WorldFIP is standardised as European Norm EN 50170 - General Purpose Field Communication System. Field communication systems (fieldbuses) started to be widely used as the communication support for distributed computer-controlled systems (DCCS), and are being used in all sorts of process control and manufacturing applications within different types of industries. There are several advantages in using fieldbuses as a replacement for the traditional point-to-point links between sensors/actuators and computer-based control systems. Some of these are economic (cable savings) but, more importantly, fieldbuses allow an increased decentralisation and distribution of the processing power over the field. Typically DCCS have real-time requirements that must be fulfilled; by this we mean that process data must be transferred between network computing nodes within a maximum admissible time span. WorldFIP has very interesting mechanisms to schedule data transfers: it explicitly distinguishes two types of traffic, periodic and aperiodic. In this paper we describe how WorldFIP handles these two types of traffic and, more importantly, we provide a comprehensive analysis for guaranteeing the real-time requirements of both. A major contribution is made in the analysis of the worst-case response time of aperiodic transfer requests.
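The worst-case response time analysis itself is not reproduced in the abstract. Purely to convey the flavour of bounding the response time of aperiodic traffic served in reserved windows of a cyclic schedule, here is a deliberately simplified sketch; the FIFO queueing model, the fixed per-cycle window and all parameter names are assumptions, not the WorldFIP-specific analysis developed in the paper:

```python
import math

def aperiodic_wcrt(c_req, pending, ec_len, window):
    """Deliberately simplified worst-case response time bound for an aperiodic
    transfer request, assuming each elementary cycle of length ec_len ends
    with a fixed aperiodic window of length `window` and requests are served
    FIFO. Only the generic flavour of such an analysis, not the paper's.

    c_req   : transmission time of the request itself
    pending : total transmission time of requests queued ahead of it
    """
    backlog = pending + c_req
    cycles = math.ceil(backlog / window)        # windows needed to drain the backlog
    # Worst case: released just after a window closes, so it waits almost a full
    # cycle before service starts, then completes inside the last window used.
    return (cycles - 1) * ec_len + (ec_len - window) + (backlog - (cycles - 1) * window)

# Toy usage (times in ms)
print(aperiodic_wcrt(c_req=0.4, pending=1.2, ec_len=10.0, window=2.0))
```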
Abstract:
In the past few years, a significant amount of work has been devoted to the timing analysis of Ethernet-based technologies. However, none of these works addresses the problem of timeliness evaluation at a holistic level. This paper describes a research framework embracing this objective. It is advocated that simulation models can be a powerful tool, not only for timeliness evaluation, but also for enabling less pessimistic assumptions in analytical response time approaches, which are most often afflicted with simplifications that lead to pessimistic assumptions and, therefore, misleading results. To this end, we address a few inter-linked research topics with the purpose of setting a framework for developing tools suitable to extract temporal properties of commercial-off-the-shelf (COTS) factory-floor communication systems.
Abstract:
Compositional schedulability analysis of hierarchical real-time systems is a well-studied problem. Various techniques have been developed to abstract the resource requirements of components in such systems, and schedulability has been addressed using these abstract representations (also called component interfaces). These approaches for compositional analysis incur resource overheads when they abstract components into interfaces. In this talk, we define notions of resource schedulability and optimality for component interfaces, and compare various approaches.
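The talk abstract does not name a specific interface model. One standard example is the periodic resource model, in which a component's interface is a (period, budget) pair and the abstraction overhead is the extra bandwidth the interface demands beyond the component's own utilisation. A minimal sketch under those assumptions (EDF inside the component, the slightly pessimistic linear supply bound, illustrative names) follows:

```python
from functools import reduce
from math import floor, gcd

def dbf_edf(tasks, t):
    """Demand bound function of implicit-deadline periodic tasks (C, T) under EDF."""
    return sum(floor(t / T) * C for C, T in tasks)

def lsbf(period, budget, t):
    """Linear lower bound on the supply of a periodic resource (period, budget)."""
    return max(0.0, (budget / period) * (t - 2 * (period - budget)))

def min_interface_budget(tasks, period, step=0.05):
    """Smallest budget (at `step` granularity) such that the periodic resource
    interface (period, budget) can schedule the component under EDF, using the
    linear supply bound. The gap between budget/period and the task-set
    utilisation is the abstraction overhead."""
    hyper = reduce(lambda a, b: a * b // gcd(a, b), (T for _, T in tasks))
    checks = sorted({k * T for _, T in tasks for k in range(1, hyper // T + 1)})
    for i in range(1, int(period / step) + 1):
        budget = i * step
        if all(dbf_edf(tasks, t) <= lsbf(period, budget, t) for t in checks):
            return budget
    return None

# Component with two tasks (C, T): utilisation 0.40
tasks = [(1, 5), (2, 10)]
print(min_interface_budget(tasks, period=2))
```

For this toy component the returned budget is 1.0 over a period of 2, i.e. an interface bandwidth of 0.5, while the tasks only require a utilisation of 0.4; the 0.1 difference is the kind of abstraction overhead the talk refers to.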
Abstract:
Simulation analysis is an important approach to developing and evaluating systems in terms of development time and cost. This paper demonstrates the application of the Time Division Cluster Scheduling (TDCS) tool to the configuration of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs using simulation analysis, as an illustrative example that confirms the practical applicability of the tool. The simulation study analyses how the number of retransmissions impacts the reliability of data transmission, the energy consumption of the nodes and the end-to-end communication delay, based on a simulation model implemented in the Opnet Modeler. The configuration parameters of the network are obtained directly from the TDCS tool. The simulation results show that the number of retransmissions impacts the reliability, the energy consumption and the end-to-end delay in such a way that improving one may degrade the others.
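The quantitative trade-off in the paper comes from the Opnet Modeler simulation of the whole cluster-tree. A back-of-the-envelope single-hop calculation with independent losses (an assumption made here for illustration only) already shows why adding retransmissions raises reliability while also raising the expected delay and energy:

```python
def retransmission_tradeoff(p_loss, max_retries, t_attempt, e_attempt):
    """Single-hop trade-off with independent losses (probability p_loss per
    attempt) and at most max_retries retransmissions: reliability rises with
    retries while the expected delay and energy also rise. Purely illustrative;
    the paper's numbers come from a full cluster-tree simulation."""
    n = max_retries + 1                                # total attempts allowed
    reliability = 1.0 - p_loss ** n
    # Expected attempts actually made (truncated geometric distribution)
    exp_attempts = sum((k + 1) * (1 - p_loss) * p_loss ** k for k in range(n))
    exp_attempts += n * p_loss ** n                    # all attempts used, still lost
    return reliability, exp_attempts * t_attempt, exp_attempts * e_attempt

for retries in (0, 1, 3, 5):
    rel, delay, energy = retransmission_tradeoff(0.2, retries,
                                                 t_attempt=0.015, e_attempt=0.5)
    print(f"retries={retries}: reliability={rel:.3f}, "
          f"delay={delay * 1000:.1f} ms, energy={energy:.2f} mJ")
```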
Abstract:
This technical report presents a description of the output data files and the tools used to validate and to extract information from the output data files generated by the Repeater-Based Hybrid Wired/Wireless Network Simulator and the Bridge-Based Hybrid Wired/Wireless Network Simulator.
Abstract:
The performance of the Weather Research and Forecast (WRF) model in wind simulation was evaluated under different numerical and physical options for an area of Portugal located in complex terrain and characterized by its significant wind energy resource. The grid nudging and integration time of the simulations were the numerical options tested. Since the goal is to simulate the near-surface wind, the physical parameterization schemes for the boundary layer were the ones under evaluation. The influences of local terrain complexity and simulation domain resolution on the model results were also studied. Data from three wind measuring stations located within the chosen area were compared with the model results in terms of Root Mean Square Error, Standard Deviation Error and Bias. Wind speed histograms, occurrences and energy wind roses were also used for model evaluation. Globally, the model accurately reproduced the local wind regime, despite a significant underestimation of the wind speed. The wind direction is reasonably well simulated by the model, especially in wind regimes with a clear dominant sector, but in the presence of low wind speeds the characterization of the wind direction (observed and simulated) is very subjective and leads to higher deviations between simulations and observations. Within the tested options, results show that the use of grid nudging in simulations whose integration time does not exceed 2 days is the best numerical configuration, and that the parameterization set composed of the MM5, Yonsei University and Noah physical schemes is the most suitable for this site. Results were poorer in sites with higher terrain complexity, mainly due to limitations of the terrain data supplied to the model. Increasing the simulation domain resolution alone is not enough to significantly improve the model performance. Results suggest that error minimization in wind simulation can be achieved by testing and choosing a suitable numerical and physical configuration for the region of interest together with the use of high-resolution terrain data, if available.
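The evaluation metrics named above (Root Mean Square Error, Bias and Standard Deviation Error) have conventional definitions; as a minimal sketch of how they would be computed for a pair of observed and simulated wind-speed series, with toy data that is not from the paper:

```python
import math

def _std(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def wind_error_metrics(observed, simulated):
    """Root Mean Square Error, Bias and Standard Deviation Error between an
    observed and a simulated wind-speed series (conventional definitions;
    the paper's sign conventions may differ)."""
    diffs = [s - o for o, s in zip(observed, simulated)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    stde = _std(simulated) - _std(observed)    # error in simulated variability
    return rmse, bias, stde

# Toy usage with six hourly wind speeds (m/s)
obs = [5.1, 6.3, 7.8, 4.2, 3.9, 6.7]
sim = [4.6, 5.8, 7.1, 4.0, 3.5, 6.0]
print(wind_error_metrics(obs, sim))
```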