18 results for Implementation Model

in University of Queensland eSpace - Australia


Relevance: 40.00%

Abstract:

The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes, and gouge processes. However, results from those simulations show that the next step towards more realistic experiments will require models containing a significantly larger number of particles than current models, and hence a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly greater computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles have shown that the parallel implementation of the Lattice Solid Model can achieve a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
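The reported parallel efficiency of about 80% follows from the usual speedup-per-processor definition. A minimal sketch (the timing figures below are invented for illustration, not taken from the paper's benchmarks):

```python
# Parallel speedup and efficiency from wall-clock timings.
# The timing values below are illustrative, not from the paper.
def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Efficiency = speedup / number of processors."""
    speedup = t_serial / t_parallel
    return speedup / n_procs

# Hypothetical benchmark: 1000 s on 1 processor, 19.5 s on 64.
eff = parallel_efficiency(1000.0, 19.5, 64)
print(f"speedup = {1000.0 / 19.5:.1f}x, efficiency = {eff:.1%}")
```

An efficiency near 1.0 means the extra processors are almost fully utilised; values around 0.8 at large processor counts, as reported here, indicate modest communication and load-balancing overhead.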

Relevance: 40.00%

Abstract:

In this study, the 3-D Lattice Solid Model (LSMearth or LSM) was extended by introducing particle-scale rotation. In the new model, each 3-D particle has six degrees of freedom: three for translational motion and three for orientation. Six kinds of relative motion are permitted between two neighboring particles, and six interactions are transferred, i.e., a radial force, two shear forces, a twisting torque, and two bending torques. By using quaternion algebra, the relative rotation between two particles is decomposed into two sequence-independent rotations, so that all interactions due to the relative motions between interacting rigid bodies can be uniquely determined. After incorporating this mechanism and introducing bond breaking under torsion and bending into the LSM, several tests of 2-D and 3-D rock failure under uniaxial compression were carried out. Compared with simulations lacking the single-particle rotational mechanism, the new simulation results match experimental results of rock fracture more closely and are hence encouraging. Since more parameters are introduced, an approach for choosing the new parameters is presented.
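A sequence-independent decomposition of a relative rotation can be illustrated with the standard swing-twist split in quaternion algebra. This is a generic sketch, not necessarily the paper's exact formulation; the function names and conventions are assumptions:

```python
import math

# Minimal quaternion helpers; q = (w, x, y, z). Names are illustrative.
def q_mul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_norm(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def swing_twist(q, axis):
    """Split unit rotation q into a twist about the unit vector `axis`
    (e.g. the bond axis between two particles, giving torsion) and a
    swing perpendicular to it (giving bending), with q = swing * twist."""
    w, x, y, z = q
    ax, ay, az = axis
    dot = x*ax + y*ay + z*az           # projection of vector part onto axis
    twist = q_norm((w, dot*ax, dot*ay, dot*az))
    swing = q_mul(q, q_conj(twist))    # conj == inverse for unit quaternions
    return swing, twist

# relative rotation between two particle orientations q1, q2:
#   q_rel = q_mul(q_conj(q1), q2)
```

For a pure rotation about the axis itself, the twist component recovers the whole rotation and the swing degenerates to the identity, which is a convenient sanity check.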

Relevance: 30.00%

Abstract:

A parallel computing environment to support the optimization of large-scale engineering systems was designed and implemented on Windows-based personal computer networks, using the master-worker model and the Parallel Virtual Machine (PVM). The environment decomposes a large engineering system into a number of smaller subsystems that are optimized in parallel on worker nodes, and coordinates the subsystem optimization results on the master node. It consists of six functional modules: the master control, the optimization model generator, the optimizer, the data manager, the monitor, and the post-processor. An object-oriented design of these modules is presented. The environment supports all steps from the generation of optimization models to their solution and visualization on networks of computers. User-friendly graphical interfaces make it easy to define the problem and to monitor and steer the optimization process. The environment has been verified on an example of large space-truss optimization. (C) 2004 Elsevier Ltd. All rights reserved.
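The master-worker decomposition the environment relies on can be sketched in a few lines. The paper's system is built on PVM with six dedicated modules; the stand-in below uses Python's multiprocessing instead, and the subsystem names and toy objective are invented:

```python
from multiprocessing import Pool

# Illustrative master-worker decomposition; the paper uses PVM, this sketch
# uses Python's multiprocessing as a stand-in. Subsystem data is made up.
def optimise_subsystem(subsystem):
    """Worker: minimise a simple quadratic for one subsystem (placeholder
    for a real subsystem optimisation)."""
    name, target = subsystem
    best = min(range(-10, 11), key=lambda x: (x - target) ** 2)
    return name, best

def master(subsystems, n_workers=2):
    """Master: farm subsystems out to workers, then coordinate the results."""
    with Pool(n_workers) as pool:
        results = pool.map(optimise_subsystem, subsystems)
    return dict(results)

if __name__ == "__main__":
    print(master([("truss-A", 3), ("truss-B", -7), ("truss-C", 5)]))
```

In the real environment the coordination step on the master would feed subsystem results back into the system-level model rather than simply collecting them.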

Relevance: 30.00%

Abstract:

It was hypothesized that employees' perceptions of an organizational culture strong in human relations values and open systems values would be associated with heightened levels of readiness for change, which, in turn, would predict change implementation success. Similarly, it was predicted that reshaping capabilities would lead to change implementation success via their effects on employees' perceptions of readiness for change. Using a temporal research design, these propositions were tested with 67 employees in a state government department who were about to undergo the implementation of a new end-user computing system in their workplace. Change implementation success was operationalized as user satisfaction and system usage. There was evidence to suggest that employees who perceived strong human relations values in their division at Time 1 reported higher levels of readiness for change at pre-implementation, which, in turn, predicted system usage at Time 2. In addition, readiness for change mediated the relationship between reshaping capabilities and system usage. Analyses also revealed that pre-implementation levels of readiness for change exerted a positive main effect on employees' satisfaction with the system's accuracy, user-friendliness, and formatting functions at post-implementation. These findings are discussed in terms of their theoretical contribution to the readiness-for-change literature, and in relation to the practical importance of developing positive change attitudes among employees if change initiatives are to be successful.

Relevance: 30.00%

Abstract:

This paper discusses how the AustLit: Australian Literature Gateway's interpretation, enhancement, and implementation of the International Federation of Library Associations and Institutions' Functional Requirements for Bibliographic Records (FRBR Final Report 1998) model is meeting the needs of Australian literature scholars for accurate bibliographic representation of the histories of literary texts. It also explores how the AustLit Gateway's underpinning research principles, which are based on the tradition of scholarly enumerative and descriptive bibliography with enhancements from analytical bibliography and literary biography, have shaped the implementation of the FRBR model. The major enhancement, or alteration, to the model is the use of enhanced manifestations, which allow all agents' contributions to be represented in a highly granular format by enabling creation events to be incorporated at all levels of the Work, Expression, and Manifestation nexus.
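The Work-Expression-Manifestation nexus with creation events attachable at every level can be sketched as a simple data structure. The field names and example values here are illustrative assumptions, not AustLit's actual schema:

```python
from dataclasses import dataclass, field

# Sketch of FRBR's Work-Expression-Manifestation levels with creation
# events attachable at each level, as in AustLit's enhanced manifestations.
@dataclass
class CreationEvent:
    agent: str          # e.g. author, translator, illustrator, publisher
    role: str
    date: str = ""

@dataclass
class Manifestation:
    description: str
    events: list = field(default_factory=list)   # e.g. publication events

@dataclass
class Expression:
    form: str                                    # e.g. "novel", "translation"
    events: list = field(default_factory=list)
    manifestations: list = field(default_factory=list)

@dataclass
class Work:
    title: str
    events: list = field(default_factory=list)
    expressions: list = field(default_factory=list)

# Hypothetical record: a translated expression of a work, with the
# translator's and publisher's contributions recorded at their own levels.
work = Work("Example Work", events=[CreationEvent("A. Author", "author", "1901")])
expr = Expression("translation",
                  events=[CreationEvent("T. Translator", "translator", "1950")])
expr.manifestations.append(
    Manifestation("First translated edition",
                  events=[CreationEvent("Example Press", "publisher", "1950")]))
work.expressions.append(expr)
```

Attaching events at each level, rather than only to the work, is what gives the granular representation of every agent's contribution that the paper describes.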

Relevance: 30.00%

Abstract:

The aim of the study presented was to implement a process model to simulate the dynamic behaviour of a pilot-scale process for anaerobic two-stage digestion of sewage sludge. The model was implemented to support experimental investigations of the anaerobic two-stage digestion process. The model concept, implemented in the simulation software package MATLAB(TM)/Simulink(R), is a derivative of the IWA Anaerobic Digestion Model No. 1 (ADM1), which was developed by the IWA task group for mathematical modelling of anaerobic processes. In the present study, the original model concept was adapted and applied to replicate a two-stage digestion process. Testing procedures, including balance checks and 'benchmarking' tests, were carried out to verify the accuracy of the implementation. These combined measures ensured a faultless model implementation without numerical inconsistencies. Parameters for both the thermophilic and the mesophilic process stages were estimated successfully using data from lab-scale experiments described in the literature. Due to the high number of parameters in the structured model, it was necessary to develop a customised procedure that limited the range of parameters to be estimated. The accuracy of the optimised parameter sets was assessed against experimental data from pilot-scale experiments. Under these conditions, the model predicted the dynamic behaviour of a two-stage digestion process at pilot scale reasonably well. (C) 2004 Elsevier Ltd. All rights reserved.
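A drastically simplified analogue of a two-stage digester, with a built-in balance check via the analytic steady state, might look as follows. ADM1 itself tracks many more state variables and processes; the first-order kinetics, rate constants, and retention times below are invented for illustration:

```python
# Toy two-stage digester: two completely mixed reactors in series with
# first-order substrate degradation. ADM1 tracks many more states; the
# rate constants and hydraulic retention times here are invented.
def simulate(s_in=10.0, k1=0.8, k2=0.3, hrt1=2.0, hrt2=10.0,
             dt=0.01, t_end=100.0):
    """Explicit-Euler integration of substrate in stage 1 (thermophilic)
    and stage 2 (mesophilic); returns the near-steady-state concentrations."""
    s1 = s2 = 0.0
    t = 0.0
    while t < t_end:
        ds1 = (s_in - s1) / hrt1 - k1 * s1   # inflow/outflow + reaction
        ds2 = (s1 - s2) / hrt2 - k2 * s2
        s1 += dt * ds1
        s2 += dt * ds2
        t += dt
    return s1, s2

s1, s2 = simulate()
```

The analytic steady states, s_in / (1 + k1 * hrt1) for stage 1 and s1 / (1 + k2 * hrt2) for stage 2, provide the kind of balance check the study mentions: the integrated solution should converge to them.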

Relevance: 30.00%

Abstract:

Motivation: An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either have limitations due to the minimal assumptions made or, with more specific assumptions, are computationally intensive. Results: By converting the value of the test statistic used to test the significance of each gene to a z-score, we propose a simple two-component normal mixture that adequately models the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
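One common way to fit such a two-component normal mixture is EM with the null component fixed at N(0,1); the posterior probability that a gene is null is then the null component's responsibility for its z-score. A sketch on synthetic z-scores (the data, starting values, and the fixed-null choice are assumptions for illustration, not necessarily the paper's exact method):

```python
import math, random

# Empirical-Bayes sketch: z ~ pi0*N(0,1) + (1-pi0)*N(mu, sigma^2).
def norm_pdf(z, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fit_mixture(z, n_iter=200):
    """EM with the null component fixed at N(0,1); returns the fitted
    parameters and the per-gene posterior null probabilities."""
    pi0, mu, sigma = 0.9, 2.0, 1.0
    for _ in range(n_iter):
        # E-step: posterior null probabilities (responsibilities)
        tau0 = [pi0 * norm_pdf(x) /
                (pi0 * norm_pdf(x) + (1 - pi0) * norm_pdf(x, mu, sigma))
                for x in z]
        # M-step: update mixing proportion and non-null parameters
        pi0 = sum(tau0) / len(z)
        w = [1 - t for t in tau0]
        sw = sum(w)
        mu = sum(wi * x for wi, x in zip(w, z)) / sw
        sigma = math.sqrt(sum(wi * (x - mu) ** 2 for wi, x in zip(w, z)) / sw)
    return pi0, mu, sigma, tau0

# Synthetic data: 90% null genes, 10% differentially expressed around z = 3.
random.seed(1)
z = [random.gauss(0, 1) for _ in range(900)] + [random.gauss(3, 1) for _ in range(100)]
pi0, mu, sigma, tau0 = fit_mixture(z)
```

Genes whose posterior null probability tau0 falls below a chosen threshold would be flagged as differentially expressed.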

Relevance: 30.00%

Abstract:

Hysteresis models that eliminate the artificial pumping errors associated with the Kool-Parker (KP) soil moisture hysteresis model, such as the Parker-Lenhard (PL) method, can be computationally demanding in unsaturated transport models, since they need to retain the wetting-drying history of the system. The pumping errors in these models must be eliminated for the correct simulation of cyclical systems (e.g. transport above a tidally forced watertable, or infiltration and redistribution under periodic irrigation) if the soils exhibit significant hysteresis. A modification is made here to the PL method that allows it to be applied more readily in numerical models by eliminating the need to store a large number of soil moisture reversal points. The modified PL method largely eliminates artificial pumping error and so essentially retains the accuracy of the original PL approach. The modified PL method is implemented in HYDRUS-1D (version 2.0), which is then used to simulate cyclic capillary fringe dynamics, both to show the influence of removing artificial pumping errors and to demonstrate the ease of implementation. Artificial pumping errors are shown to be significant, for the soils and system characteristics used here, in numerical experiments of transport above a fluctuating watertable. (c) 2005 Elsevier B.V. All rights reserved.
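The history that the original PL method must retain is essentially a stack of soil moisture reversal points, pruned whenever a wetting or drying excursion closes the scanning loop a reversal opened. The sketch below illustrates that bookkeeping generically; it is not the paper's modified method (which avoids storing this history), and the closure rule is a simplified assumption:

```python
# Generic reversal-point bookkeeping for hysteresis scanning curves:
# a reversal pushes the previous saturation extremum onto a stack, and
# passing a stored reversal point again closes (erases) that loop.
def track_reversals(saturations):
    """Return the stack of open reversal points after a saturation history."""
    stack = []          # alternating local extrema of saturation
    prev = None
    direction = 0       # +1 wetting, -1 drying
    for s in saturations:
        if prev is not None:
            d = 1 if s > prev else -1 if s < prev else direction
            if direction and d != direction:
                stack.append(prev)            # a reversal opens a scanning loop
            # closure: passing a stored reversal point erases its loop
            while len(stack) >= 2 and (
                    (d > 0 and s >= stack[-2]) or (d < 0 and s <= stack[-2])):
                stack.pop()
                stack.pop()
            direction = d
        prev = s
    return stack

# Wet to 0.8, dry to 0.4, re-wet to 0.6: two reversal points remain open.
open_points = track_reversals([0.2, 0.8, 0.4, 0.6])
```

Under long cyclic forcing, such as a tidal watertable, this stack can grow and shrink continually, which is the storage burden the paper's modification removes.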

Relevance: 30.00%

Abstract:

Despite the insight gained from 2-D particle models, and given that the dynamics of crustal faults occur in 3-D space, the question remains: how do 3-D fault gouge dynamics differ from those in 2-D? Traditionally, 2-D modeling has been preferred over 3-D simulation because of the computational cost of solving 3-D problems. However, modern high-performance computing architectures, combined with a parallel implementation of the Lattice Solid Model (LSM), provide the opportunity to explore 3-D fault micro-mechanics and to advance the understanding of effective constitutive relations of fault gouge layers. In this paper, macroscopic friction values from 2-D and 3-D LSM simulations, performed on an SGI Altix 3700 super-cluster, are compared. Two rectangular elastic blocks of bonded particles, with a rough fault plane and separated by a region of randomly sized non-bonded gouge particles, are sheared in opposite directions by normally loaded driving plates. The results demonstrate that the gouge particles in the 3-D models undergo significant out-of-plane motion during shear. The 3-D models also exhibit a higher mean macroscopic friction than the 2-D models for varying values of interparticle friction. 2-D LSM gouge models have previously been shown to exhibit accelerating energy release in simulated earthquake cycles, supporting the Critical Point hypothesis. The 3-D models are shown to display accelerating energy release as well, and good fits of power-law time-to-failure functions to the cumulative energy release are obtained.
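Fitting a power-law time-to-failure function is typically done by linear regression in log-log space. A generic sketch with synthetic, noise-free data (the exponent, prefactor, and failure time below are invented, not the paper's values, and the failure time is taken as known rather than fitted):

```python
import math

# Sketch: fit a power-law time-to-failure relation, rate ~ C*(tf - t)^(-p),
# by least-squares regression of log(rate) against log(tf - t).
def fit_power_law(times, rates, tf):
    xs = [math.log(tf - t) for t in times]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
             sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return -slope, math.exp(intercept)      # exponent p and prefactor C

# Synthetic accelerating release: exponent 0.7, prefactor 2.0, failure at t=100.
tf = 100.0
times = list(range(0, 99))
rates = [2.0 * (tf - t) ** -0.7 for t in times]
p, c = fit_power_law(times, rates, tf)
```

With noisy simulation data, tf would usually be treated as an additional fit parameter found by a one-dimensional search over candidate failure times.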

Relevance: 30.00%

Abstract:

This work presents a new approach to the problem of simultaneous localization and mapping (SLAM), inspired by computational models of the hippocampus of rodents. The rodent hippocampus has been extensively studied with respect to navigation tasks, and displays many of the properties of a desirable SLAM solution. RatSLAM is an implementation of a hippocampal model that can perform SLAM in real time on a real robot. It uses a competitive attractor network to integrate odometric information with landmark sensing to form a consistent representation of the environment. Experimental results show that RatSLAM can operate with ambiguous landmark information and recover from both minor and major path integration errors.
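The competitive attractor dynamics can be illustrated with a minimal one-dimensional ring of cells: local excitation plus global inhibition maintains a single activity packet, and path integration from odometry shifts it. All parameters below are invented for illustration and are far simpler than RatSLAM's pose-cell network:

```python
import math

# Minimal 1-D competitive attractor: a ring of N cells holding one
# activity packet. Kernel width, inhibition level, and N are invented.
N = 60

def step(activity, shift=0):
    # path integration: rotate the activity packet by `shift` cells
    if shift:
        activity = activity[-shift:] + activity[:-shift]
    # local excitation via a circular Gaussian-like kernel
    new = [0.0] * N
    for i, a in enumerate(activity):
        if a > 0:
            for d in range(-3, 4):
                new[(i + d) % N] += a * math.exp(-d * d / 2.0)
    # global inhibition + normalisation implement the competition
    m = max(new)
    new = [max(0.0, v - 0.3 * m) for v in new]
    total = sum(new)
    return [v / total for v in new]

# Start with a packet at cell 10 and integrate 5 odometry steps of +2 cells.
activity = [1.0 if i == 10 else 0.0 for i in range(N)]
for _ in range(5):
    activity = step(activity, shift=2)
```

In RatSLAM, landmark sensing injects extra activity at previously associated cells, letting the competition correct a drifted packet, which is how path integration errors are recovered from.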

Relevance: 30.00%

Abstract:

Achieving consistency between a specification and its implementation is an important part of software development. In previous work, we presented a method and tool support for testing a formal specification using animation, and then verifying an implementation of that specification. The method is based on a testgraph, which provides a partial model of the application under test. The testgraph is used in combination with an animator to generate test sequences for testing the formal specification. The same testgraph is used during testing to execute those same sequences on the implementation and to ensure that the implementation conforms to the specification. So far, the method and its tool support have been applied to software components that can be accessed through an application programming interface (API). In this paper, we use an industrially based case study to discuss the problems associated with applying the method to a software system with a graphical user interface (GUI). In particular, the lack of a standardised interface, together with controllability and observability problems, makes it difficult to automate the testing of the implementation. The method can still be applied, but the amount of testing that can be carried out on the implementation is limited by the manual effort involved.
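The testgraph idea can be sketched generically: a small graph yields operation sequences that are executed against both a specification-level step function (standing in for the animator) and an implementation, comparing the resulting traces. The counter example and depth-bounded traversal below are invented for illustration, not the paper's tool:

```python
# Testgraph: nodes are abstract states, edges are labelled with operations.
# This toy graph models a counter that is never decremented below zero.
testgraph = {
    "empty": [("inc", "one")],
    "one":   [("inc", "many"), ("dec", "empty")],
    "many":  [("dec", "one")],
}

def sequences(graph, start, depth):
    """Enumerate operation sequences of up to `depth` edges from `start`."""
    if depth == 0:
        return [[]]
    result = []
    for op, nxt in graph.get(start, []):
        for tail in sequences(graph, nxt, depth - 1):
            result.append([op] + tail)
    return result

def run(ops, step_fn):
    """Apply an operation sequence with a given step function; return the trace."""
    state, trace = 0, []
    for op in ops:
        state = step_fn(state, op)
        trace.append(state)
    return trace

spec_step = lambda s, op: s + 1 if op == "inc" else s - 1
impl_step = lambda s, op: s + 1 if op == "inc" else max(0, s - 1)

# Conformance check: implementation traces must match specification traces
# on every sequence the testgraph generates.
for seq in sequences(testgraph, "empty", 3):
    assert run(seq, spec_step) == run(seq, impl_step), seq
```

For a GUI system the difficulty the paper describes is precisely that `impl_step` cannot be driven programmatically through a standard interface, so executing and observing each sequence requires manual effort.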