141 results for event-driven simulation
Abstract:
With the advent of object-oriented languages and the portability of Java, the development and use of class libraries has become widespread. Effective class reuse depends on class reliability which in turn depends on thorough testing. This paper describes a class testing approach based on modeling each test case with a tuple and then generating large numbers of tuples to thoroughly cover an input space with many interesting combinations of values. The testing approach is supported by the Roast framework for the testing of Java classes. Roast provides automated tuple generation based on boundary values, unit operations that support driver standardization, and test case templates used for code generation. Roast produces thorough, compact test drivers with low development and maintenance cost. The framework and tool support are illustrated on a number of non-trivial classes, including a graphical user interface policy manager. Quantitative results are presented to substantiate the practicality and effectiveness of the approach. Copyright (C) 2002 John Wiley & Sons, Ltd.
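Roast itself is not reproduced in the abstract, but the tuple-generation idea is easy to sketch. The minimal Python fragment below (the class under test, field names, boundary values, and oracle are all hypothetical, not Roast's API) generates the cross product of boundary values as test-case tuples:

```python
import itertools

# Hypothetical boundary values for a bounded-counter constructor taking
# (initial, lower, upper); each generated combination is one test-case tuple.
boundary_values = {
    "initial": [0, 1, 99, 100],
    "lower": [-1, 0, 1],
    "upper": [99, 100, 101],
}

def generate_tuples(domains):
    """Yield every combination of boundary values as a test-case tuple."""
    names = sorted(domains)
    for combo in itertools.product(*(domains[n] for n in names)):
        yield dict(zip(names, combo))

def is_valid(case):
    # Oracle: the bounds must bracket the initial value.
    return case["lower"] <= case["initial"] <= case["upper"]

for case in generate_tuples(boundary_values):
    expected = is_valid(case)
    # A real driver would construct the class under test here and compare
    # its behaviour (value accepted / exception raised) against `expected`.
    print(case, "->", "accept" if expected else "reject")
```

The cross product grows quickly, which is exactly why the paper emphasises compact drivers and automated generation rather than hand-written cases.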
Abstract:
Developments in computer and three-dimensional (3D) digitiser technologies have made it possible to keep track of the broad range of data required to simulate an insect moving around or over the highly heterogeneous habitat of a plant's surface. Properties of plant parts vary within a complex canopy architecture, and insect damage can induce further changes that affect an animal's movements, development and likelihood of survival. Models of plant architectural development based on Lindenmayer systems (L-systems) serve as dynamic platforms for simulation of insect movement, providing an explicit model of the developing 3D structure of a plant as well as allowing physiological processes associated with plant growth and responses to damage to be described and simulated. Simple examples of the use of the L-system formalism to model insect movement, operating at different spatial scales, from insects foraging on an individual plant to insects flying around plants in a field, are presented. Such models can be used to explore questions about the consequences of changes in environmental architecture and configuration on host finding, exploitation and its population consequences. In effect this model is a 'virtual ecosystem' laboratory to address local as well as landscape-level questions pertinent to plant-insect interactions, taking plant architecture into account. (C) 2002 Elsevier Science B.V. All rights reserved.
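For readers unfamiliar with the formalism, a minimal sketch of parallel L-system rewriting follows; the production rule is illustrative and is not one of the plant models from the paper:

```python
# Minimal deterministic L-system: every symbol in the string is rewritten
# in parallel at each step. The single rule below is illustrative (a simple
# branching pattern), not a model from the paper.
rules = {"A": "A[+A]A"}   # apex grows and spawns a bracketed lateral branch

def rewrite(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        # Symbols without a rule (here +, [ and ]) rewrite to themselves.
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

print(rewrite("A", rules, 3))
```

A turtle-graphics interpretation of the resulting string (brackets push and pop position, + turns) yields the 3D plant structure over which simulated insects move.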
Abstract:
Models of plant architecture allow us to explore how genotype-environment interactions affect the development of plant phenotypes. Such models generate masses of data organised in complex hierarchies. This paper presents a generic system for creating and automatically populating a relational database from data generated by the widely used L-system approach to modelling plant morphogenesis. Techniques from compiler technology are applied to generate attributes (new fields) in the database, to simplify query development for the recursively-structured branching relationship. Use of biological terminology in an interactive query builder contributes towards making the system biologist-friendly. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
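The paper's generic system is not reproduced here, but the core idea, storing a branching structure relationally and querying the recursive parent-child relationship, can be sketched with hypothetical table and column names:

```python
import sqlite3

# Store plant components with a parent link, then walk the branching
# relationship with a recursive query. Schema and rows are hypothetical.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE component (id INTEGER PRIMARY KEY, parent INTEGER, kind TEXT, length REAL)"
)
rows = [
    (1, None, "stem",   10.0),
    (2, 1,    "branch",  4.0),
    (3, 1,    "branch",  5.0),
    (4, 2,    "leaf",    1.5),
]
con.executemany("INSERT INTO component VALUES (?, ?, ?, ?)", rows)

# Recursive query: everything borne, directly or indirectly, by component 1.
query = """
WITH RECURSIVE subtree(id, kind, depth) AS (
    SELECT id, kind, 0 FROM component WHERE id = 1
    UNION ALL
    SELECT c.id, c.kind, s.depth + 1
    FROM component c JOIN subtree s ON c.parent = s.id
)
SELECT id, kind, depth FROM subtree ORDER BY depth, id
"""
for row in con.execute(query):
    print(row)
```

Precomputing derived attributes (new fields) at load time, as the paper does with compiler techniques, spares the biologist from writing such recursive queries by hand.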
Abstract:
CULTURE is an Artificial Life simulation that aims to provide primary school children with opportunities to become actively engaged in the high-order thinking processes of problem solving and critical thinking. A preliminary evaluation of CULTURE has found that it offers the freedom for children to take part in process-oriented learning experiences. Through providing children with opportunities to make inferences, validate results, explain discoveries and analyse situations, CULTURE encourages the development of high-order thinking skills. The evaluation found that CULTURE allows users to autonomously explore the important scientific concepts of life and living, and energy and change within a software environment that children find enjoyable and easy to use.
Abstract:
This paper presents results on the simulation of the solid state sintering of copper wires using Monte Carlo techniques based on elements of lattice theory and cellular automata. The initial structure is superimposed onto a triangular, two-dimensional lattice, where each lattice site corresponds to either an atom or vacancy. The number of vacancies varies with the simulation temperature, while a cluster of vacancies is a pore. To simulate sintering, lattice sites are picked at random and reoriented in terms of an atomistic model governing mass transport. The probability that an atom has sufficient energy to jump to a vacant lattice site is related to the jump frequency, and hence the diffusion coefficient, while the probability that an atomic jump will be accepted is related to the change in energy of the system as a result of the jump, as determined by the change in the number of nearest neighbours. The jump frequency is also used to relate model time, measured in Monte Carlo Steps, to the actual sintering time. The model incorporates bulk, grain boundary and surface diffusion terms and includes vacancy annihilation on the grain boundaries. The predictions of the model were found to be consistent with experimental data, both in terms of the microstructural evolution and in terms of the sintering time. (C) 2002 Elsevier Science B.V. All rights reserved.
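A heavily simplified sketch of the Metropolis-style move described above, on a square rather than triangular lattice and with illustrative bond energy and temperature, might look like this:

```python
import math
import random

# Toy square lattice (the paper uses a triangular one): 1 = atom, 0 = vacancy.
N = 32
random.seed(0)
lattice = [[1 if random.random() > 0.1 else 0 for _ in range(N)] for _ in range(N)]
E_BOND, KT = 1.0, 0.5   # illustrative bond energy and temperature scale

def neighbours(i, j):
    # Periodic nearest neighbours on the square lattice.
    return [((i + di) % N, (j + dj) % N)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def site_energy(i, j):
    # -E_BOND per occupied neighbour, counted only if this site holds an atom.
    return -E_BOND * lattice[i][j] * sum(lattice[x][y] for x, y in neighbours(i, j))

def attempt_exchange():
    # Pick a random site and a random neighbour; try an atom-vacancy jump.
    i, j = random.randrange(N), random.randrange(N)
    x, y = random.choice(neighbours(i, j))
    if lattice[i][j] == lattice[x][y]:
        return
    before = site_energy(i, j) + site_energy(x, y)
    lattice[i][j], lattice[x][y] = lattice[x][y], lattice[i][j]
    dE = site_energy(i, j) + site_energy(x, y) - before
    if dE > 0 and random.random() >= math.exp(-dE / KT):
        # Metropolis rejection based on the change in bond energy: undo.
        lattice[i][j], lattice[x][y] = lattice[x][y], lattice[i][j]

for _ in range(N * N * 10):   # N*N attempts is roughly one Monte Carlo step
    attempt_exchange()
```

The paper's model additionally weights jump attempts by the jump frequency (linking Monte Carlo steps to real sintering time) and includes grain boundary and surface diffusion plus vacancy annihilation, none of which is captured in this toy.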
Abstract:
Multi-environment trials (METs) used to evaluate breeding lines vary in the number of years that they sample. We used a cropping systems model to simulate the target population of environments (TPE) for 6 locations over 108 years for 54 'near-isolines' of sorghum in north-eastern Australia. For a single reference genotype, each of 547 trials was clustered into 1 of 3 'drought environment types' (DETs) based on a seasonal water stress index. Within sequential METs of 2 years duration, the frequencies of these drought patterns often differed substantially from those derived for the entire TPE. This was reflected in variation in the mean yield of the reference genotype. For the TPE and for 2-year METs, restricted maximum likelihood methods were used to estimate components of genotypic and genotype by environment variance. These also varied substantially, although not in direct correlation with frequency of occurrence of different DETs over a 2-year period. Combined analysis over different numbers of seasons demonstrated the expected improvement in the correlation between MET estimates of genotype performance and the overall genotype averages as the number of seasons in the MET was increased.
Abstract:
This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU and is suitable for any size simulation. The proposed approach can facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach presented here demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The simulation approach is termed conditional simulation by successive residuals as, at each step, a small set (group) of random variables is simulated with an LU decomposition of a matrix of updated conditional covariance of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
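The full successive-residuals algorithm is developed in the paper; the sketch below shows only the baseline it builds on: unconditional LU (here Cholesky) simulation z = Lw with w ~ N(0, I), plus a two-group column partition in which the second group is simulated as a conditional (kriging-type) mean given the first plus an independent residual term. The covariance model and grid are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D exponential covariance on a small grid.
x = np.arange(50, dtype=float)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)

# Classic LU simulation: z = L w with w ~ N(0, I).
L = np.linalg.cholesky(C)
z = L @ rng.standard_normal(len(x))

# Column-partitioned view: simulate group 1, then group 2 as a conditional
# mean given group 1 plus a residual term,
#   z2 = C21 C11^{-1} z1 + L_res w2,  L_res L_res^T = C22 - C21 C11^{-1} C12.
k = 25
C11, C12, C21, C22 = C[:k, :k], C[:k, k:], C[k:, :k], C[k:, k:]
z1 = np.linalg.cholesky(C11) @ rng.standard_normal(k)
A = C21 @ np.linalg.inv(C11)
L_res = np.linalg.cholesky(C22 - A @ C12)
z2 = A @ z1 + L_res @ rng.standard_normal(len(x) - k)
z_grouped = np.concatenate([z1, z2])   # same distribution as z above
```

Iterating this split over many small groups, with the conditional covariance of residuals updated at each step, is what lets the paper's method sidestep decomposing one huge matrix.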
Abstract:
Seasonal climate forecasting offers potential for improving management of crop production risks in the cropping systems of NE Australia. But how is this capability best connected to management practice? Over the past decade, we have pursued participative systems approaches involving simulation-aided discussion with advisers and decision-makers. This has led to the development of discussion support software as a key vehicle for facilitating infusion of forecasting capability into practice. In this paper, we set out the basis of our approach, its implementation and preliminary evaluation. We outline the development of the discussion support software Whopper Cropper, which was designed for, and in close consultation with, public and private advisers. Whopper Cropper consists of a database of simulation output and a graphical user interface to generate analyses of risks associated with crop management options. The charts produced provide conversation pieces for advisers to use with their farmer clients in relation to the significant decisions they face. An example application, detail of the software development process and an initial survey of user needs are presented. We suggest that discussion support software is about moving beyond traditional notions of supply-driven decision support systems. Discussion support software is largely demand-driven and can complement participatory action research programs by providing cost-effective general delivery of simulation-aided discussions about relevant management actions. The critical role of farm management advisers and dialogue among key players is highlighted. We argue that the discussion support concept, as exemplified by the software tool Whopper Cropper and the group processes surrounding it, provides an effective means to infuse innovations, like seasonal climate forecasting, into farming practice. Crown Copyright (C) 2002 Published by Elsevier Science Ltd. All rights reserved.
Abstract:
In the picture-word interference task, naming responses are facilitated when a distractor word is orthographically and phonologically related to the depicted object as compared to an unrelated word. We used event-related functional magnetic resonance imaging (fMRI) to investigate the cerebral hemodynamic responses associated with this priming effect. Serial (or independent-stage) and interactive models of word production that explicitly account for picture-word interference effects assume that the locus of the effect is at the level of retrieving phonological codes, a role attributed recently to the left posterior superior temporal cortex (Wernicke's area). This assumption was tested by randomly presenting participants with trials from orthographically related and unrelated distractor conditions and acquiring image volumes coincident with the estimated peak hemodynamic response for each trial. Overt naming responses occurred in the absence of scanner noise, allowing reaction time data to be recorded. Analysis of this data confirmed the priming effect. Analysis of the fMRI data revealed blood oxygen level-dependent signal decreases in Wernicke's area and the right anterior temporal cortex, whereas signal increases were observed in the anterior cingulate, the right orbitomedial prefrontal, somatosensory, and inferior parietal cortices, and the occipital lobe. The results are interpreted as supporting the locus for the facilitation effect as assumed by both classes of theoretical model of word production. In addition, our results raise the possibilities that, counterintuitively, picture-word interference might be increased by the presentation of orthographically related distractors, due to competition introduced by activation of phonologically related word forms, and that this competition requires inhibitory processes to be resolved. The priming effect is therefore viewed as being sufficient to offset the increased interference. We conclude that information from functional imaging studies might be useful for constraining theoretical models of word production. (C) 2002 Elsevier Science (USA).
Abstract:
Input-driven models provide an explicit and readily testable account of language learning. Although we share Ellis's view that the statistical structure of the linguistic environment is a crucial and, until recently, relatively neglected variable in language learning, we also recognize that the approach makes three assumptions about cognition and language learning that are not universally shared. The three assumptions concern (a) the language learner as an intuitive statistician, (b) the constraints on what constitute relevant surface cues, and (c) the redescription problem faced by any system that seeks to derive abstract grammatical relations from the frequency of co-occurring surface forms and functions. These are significant assumptions that must be established if input-driven models are to gain wider acceptance. We comment on these issues and briefly describe a distributed, instance-based approach that retains the key features of the input-driven account advocated by Ellis but that also addresses shortcomings of the current approaches.
Abstract:
It has been argued that power-law time-to-failure fits for cumulative Benioff strain and an evolution in size-frequency statistics in the lead-up to large earthquakes are evidence that the crust behaves as a Critical Point (CP) system. If so, intermediate-term earthquake prediction is possible. However, this hypothesis has not been proven. If the crust does behave as a CP system, stress correlation lengths should grow in the lead-up to large events through the action of small to moderate ruptures and drop sharply once a large event occurs. However this evolution in stress correlation lengths cannot be observed directly. Here we show, using the lattice solid model to describe discontinuous elasto-dynamic systems subjected to shear and compression, that it is possible for correlation lengths to exhibit CP-type evolution. In the case of a granular system subjected to shear, this evolution occurs in the lead-up to the largest event and is accompanied by an increasing rate of moderate-sized events and power-law acceleration of Benioff strain release. In the case of an intact sample system subjected to compression, the evolution occurs only after a mature fracture system has developed. The results support the existence of a physical mechanism for intermediate-term earthquake forecasting and suggest this mechanism is fault-system dependent. This offers an explanation of why accelerating Benioff strain release is not observed prior to all large earthquakes. The results prove the existence of an underlying evolution in discontinuous elasto-dynamic systems which is capable of providing a basis for forecasting catastrophic failure and earthquakes.
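As background, the power-law time-to-failure relation mentioned above is commonly written as eps(t) = A + B(t_f - t)^m with m < 1. A minimal fitting sketch on synthetic data (not a real catalogue; all parameter values are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

# Power-law time-to-failure form often fitted to cumulative Benioff strain:
#   eps(t) = A + B * (t_f - t)**m,  with m < 1 and B < 0 for acceleration.
def ttf(t, A, B, t_f, m):
    # sign/abs guard keeps the model defined if the optimiser tries t > t_f.
    return A + B * np.sign(t_f - t) * np.abs(t_f - t) ** m

# Synthetic accelerating sequence with noise (illustrative only).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 9.5, 80)
eps = ttf(t, 10.0, -3.0, 10.0, 0.3) + 0.05 * rng.standard_normal(t.size)

popt, _ = curve_fit(ttf, t, eps, p0=(8.0, -2.0, 11.0, 0.5))
A, B, t_f, m = popt
print(f"fitted failure time t_f = {t_f:.2f}, exponent m = {m:.2f}")
```

The fitted t_f is the forecast failure time; the paper's point is that the physical justification for such fits rests on growing stress correlation lengths, which the lattice solid simulations make observable.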
Abstract:
The particle-based Lattice Solid Model (LSM) was developed to provide a basis to study the physics of rocks and the nonlinear dynamics of earthquakes (MORA and PLACE, 1994; PLACE and MORA, 1999). A new modular and flexible LSM approach has been developed that allows different microphysics to be easily included in or removed from the model. The approach provides a virtual laboratory where numerical experiments can easily be set up and all measurable quantities visualised. The proposed approach provides a means to simulate complex phenomena such as fracturing or localisation processes, and enables the effect of different micro-physics on macroscopic behaviour to be studied. The initial 2-D model is extended to allow three-dimensional simulations to be performed and particles of different sizes to be specified. Numerical bi-axial compression experiments under different confining pressure are used to calibrate the model. By tuning the different microscopic parameters (such as coefficient of friction, microscopic strength and distribution of grain sizes), the macroscopic strength of the material can be adjusted to be in agreement with laboratory experiments, and the orientation of fractures is consistent with the theoretical value predicted based on the Mohr-Coulomb diagram. Simulations indicate that 3-D numerical models have different macroscopic properties from 2-D models and, hence, the model must be recalibrated for 3-D simulations. These numerical experiments illustrate that the new approach is capable of simulating typical rock fracture behaviour. The new model provides a basis to investigate nucleation, rupture and slip pulse propagation in complex fault zones without the previous model limitations of a regular low-level surface geometry and being restricted to two dimensions.
Abstract:
In order to understand the earthquake nucleation process, we need to understand the effective frictional behavior of faults with complex geometry and fault gouge zones. One important aspect of this is the interaction between the friction law governing the behavior of the fault on the microscopic level and the resulting macroscopic behavior of the fault zone. Numerical simulations offer a possibility to investigate the behavior of faults on many different scales and thus provide a means to gain insight into fault zone dynamics on scales which are not accessible to laboratory experiments. Numerical experiments have been performed to investigate the influence of the geometric configuration of faults with a rate- and state-dependent friction at the particle contacts on the effective frictional behavior of these faults. The numerical experiments are designed to be similar to laboratory experiments by DIETERICH and KILGORE (1994) in which a slide-hold-slide cycle was performed between two blocks of material and the resulting peak friction was plotted vs. holding time. Simulations with a flat fault without a fault gouge have been performed to verify the implementation. These have shown close agreement with comparable laboratory experiments. The simulations performed with a fault containing fault gouge have demonstrated a strong dependence of the critical slip distance Dc on the roughness of the fault surfaces and are in qualitative agreement with laboratory experiments.
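For reference, a common rate- and state-dependent formulation (the Dieterich ageing law) predicts the slide-hold-slide healing trend directly: during a stationary hold the state variable grows roughly linearly with time, so peak friction on re-slide increases approximately as b ln(hold time). A minimal sketch with illustrative parameter values:

```python
import numpy as np

# Rate- and state-dependent friction (Dieterich "ageing" law):
#   mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc),   dtheta/dt = 1 - V*theta/Dc
# Parameter values below are illustrative, not those of the paper.
mu0, a, b = 0.6, 0.010, 0.015
V0, Dc = 1.0e-6, 1.0e-5          # reference velocity (m/s), critical slip (m)

def peak_friction_after_hold(t_hold, V=V0):
    theta_ss = Dc / V            # steady-state state variable before the hold
    theta = theta_ss + t_hold    # ageing during a (near-)stationary hold
    return mu0 + a * np.log(V / V0) + b * np.log(V0 * theta / Dc)

for t_hold in (1.0, 10.0, 100.0, 1000.0):   # hold times in seconds
    print(f"hold {t_hold:7.1f} s -> peak friction {peak_friction_after_hold(t_hold):.4f}")
```

Plotting peak friction against the logarithm of holding time reproduces the linear healing trend of the laboratory slide-hold-slide experiments the simulations are compared against.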
Abstract:
Solid earth simulations have recently been developed to address issues such as natural disasters, global environmental destruction and the conservation of natural resources. The simulation of solid earth phenomena involves the analysis of complex structures including strata, faults, and heterogeneous material properties. Simulation of the generation and cycle of earthquakes is particularly important, but such simulations require the analysis of complex fault dynamics. GeoFEM is a parallel finite-element analysis system intended for solid earth field phenomena problems. This paper describes recent development in the GeoFEM project for the simulation of earthquake generation and cycles.
Abstract:
The Agricultural Production Systems Simulator (APSIM) is a modular modelling framework that has been developed by the Agricultural Production Systems Research Unit in Australia. APSIM was developed to simulate biophysical processes in farming systems, in particular where there is interest in the economic and ecological outcomes of management practice in the face of climatic risk. The paper outlines APSIM's structure and provides details of the concepts behind the different plant, soil and management modules. These modules include a diverse range of crops, pastures and trees, soil processes including water balance, N and P transformations, soil pH, erosion and a full range of management controls. Reports of APSIM testing in a diverse range of systems and environments are summarised. An example of model performance in a long-term cropping systems trial is provided. APSIM has been used in a broad range of applications, including support for on-farm decision making, farming systems design for production or resource management objectives, assessment of the value of seasonal climate forecasting, analysis of supply chain issues in agribusiness activities, development of waste management guidelines, risk assessment for government policy making and as a guide to research and education activity. An extensive citation list for these model testing and application studies is provided. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
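APSIM's code is not shown in the abstract; the fragment below is only a toy sketch of a modular daily-timestep architecture in the spirit described, where plant, soil and management modules can be plugged in or swapped independently (module names, state variables and growth rules are all hypothetical):

```python
# Toy modular daily-timestep farming-systems loop; everything is illustrative.
class SoilWater:
    def __init__(self, capacity_mm=150.0):
        self.store, self.capacity = 75.0, capacity_mm
    def step(self, state):
        # Simple bucket: add rain, cap at capacity (no drainage or ET here).
        self.store = min(self.capacity, self.store + state["rain_mm"])
        state["available_water_mm"] = self.store

class Crop:
    def __init__(self):
        self.biomass = 0.0
    def step(self, state):
        # Growth limited by water supply (toy rule).
        self.biomass += 0.01 * min(state["available_water_mm"], 50.0)
        state["biomass_t_ha"] = self.biomass

class Simulator:
    def __init__(self, modules):
        self.modules = modules      # modules can be added or swapped freely
    def run(self, weather):
        for rain in weather:
            state = {"rain_mm": rain}
            for m in self.modules:  # each module reads/writes shared state
                m.step(state)
        return state

final = Simulator([SoilWater(), Crop()]).run(weather=[0, 12, 5, 0, 20] * 20)
print(final)
```

The shared-state, pluggable-module pattern is what lets a framework of this kind combine many crops, soil processes and management rules in one simulation without hard-wiring them together.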