142 results for implementation method
in University of Queensland eSpace - Australia
Abstract:
In population pharmacokinetic studies, the precision of parameter estimates depends on the population design. Methods based on the Fisher information matrix have been developed and extended to population studies to evaluate and optimize designs. In this paper we propose simple programming tools to evaluate population pharmacokinetic designs. This involved developing an expression for the Fisher information matrix for nonlinear mixed-effects models, including estimation of the variance of the residual error. We implemented this expression as a generic function for two software applications: S-PLUS and MATLAB. The evaluation of population designs based on two pharmacokinetic examples from the literature is shown to illustrate the efficiency and the simplicity of this theoretical approach. Although no design optimization method is provided, these functions can be used to select and compare population designs among a large set of possible designs, avoiding extensive simulation.
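As a rough illustration of this kind of design evaluation (not the authors' S-PLUS/MATLAB functions), the Python sketch below scores candidate sampling schedules for a hypothetical one-compartment oral model by the log-determinant of an approximate Fisher information matrix for the fixed effects, using finite-difference sensitivities and an additive residual error. The model, parameter values and candidate designs are invented for the example, and between-subject variability is ignored.

```python
import numpy as np

def one_compartment(t, ka, ke, V, dose=100.0):
    """Hypothetical one-compartment oral model: concentration at times t."""
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def fisher_information(times, theta, sigma=0.1, eps=1e-6):
    """Approximate FIM for the fixed effects of a nonlinear model with additive
    residual error sd `sigma`, using finite-difference sensitivities."""
    theta = np.asarray(theta, dtype=float)
    f0 = one_compartment(times, *theta)
    J = np.empty((len(times), len(theta)))
    for k in range(len(theta)):
        dtheta = theta.copy()
        dtheta[k] += eps
        J[:, k] = (one_compartment(times, *dtheta) - f0) / eps
    return J.T @ J / sigma**2          # FIM under i.i.d. additive error

# Compare two candidate sampling designs by the D-criterion (log det FIM).
theta = (1.0, 0.1, 20.0)               # ka, ke, V (illustrative values)
for design in (np.array([0.5, 2.0, 8.0, 24.0]), np.array([1.0, 2.0, 3.0, 4.0])):
    M = fisher_information(design, theta)
    print(design, np.linalg.slogdet(M)[1])
```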
Abstract:
We present an efficient and robust method for calculating state-to-state reaction probabilities utilising the Lanczos algorithm for a real symmetric Hamiltonian. The method recasts the time-independent Artificial Boundary Inhomogeneity technique recently introduced by Jang and Light (J. Chem. Phys. 102 (1995) 3262) into a tridiagonal (Lanczos) representation. The calculation proceeds at the cost of a single Lanczos propagation for each boundary inhomogeneity function and yields all state-to-state probabilities (elastic, inelastic and reactive) over an arbitrary energy range. The method is applied to the collinear H + H2 reaction and the results demonstrate that it is accurate and efficient in comparison with previous calculations. (C) 2002 Elsevier Science B.V. All rights reserved.
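The tridiagonal (Lanczos) representation underlying the method can be illustrated with a generic sketch. The code below is not the ABI scattering calculation, only the basic Lanczos recursion for a real symmetric matrix, with a toy check that the extreme Ritz values of the tridiagonal projection approach the extreme eigenvalues of the original matrix; the matrix, starting vector and subspace size are arbitrary.

```python
import numpy as np

def lanczos(H, v0, m):
    """Lanczos iteration for a real symmetric matrix H: returns the tridiagonal
    coefficients (alpha, beta) and the Krylov basis V (rows are basis vectors)."""
    n = len(v0)
    V = np.zeros((m, n))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    V[0] = v0 / np.linalg.norm(v0)
    w = H @ V[0]
    alpha[0] = V[0] @ w
    w -= alpha[0] * V[0]
    for j in range(1, m):
        beta[j - 1] = np.linalg.norm(w)
        V[j] = w / beta[j - 1]
        w = H @ V[j] - beta[j - 1] * V[j - 1]
        alpha[j] = V[j] @ w
        w -= alpha[j] * V[j]
    return alpha, beta, V

# Toy check: the extreme eigenvalues of the tridiagonal matrix approximate those of H.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200)); H = (A + A.T) / 2
alpha, beta, V = lanczos(H, rng.standard_normal(200), 50)
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
print(np.linalg.eigvalsh(T)[-3:], np.linalg.eigvalsh(H)[-3:])
```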
Abstract:
Free-space optical interconnects (FSOIs), made up of dense arrays of vertical-cavity surface-emitting lasers, photodetectors and microlenses, can be used for implementing high-speed and high-density communication links, and hence replace inferior electrical interconnects. A major concern in the design of FSOIs is minimization of the optical channel cross talk arising from laser beam diffraction. In this article we introduce modifications to the mode expansion method of Tanaka et al. [IEEE Trans. Microwave Theory Tech. MTT-20, 749 (1972)] to make it an efficient tool for modelling and design of FSOIs in the presence of diffraction. We demonstrate that our modified mode expansion method has accuracy similar to the exact solution of the Huygens-Kirchhoff diffraction integral in cases of both weak and strong beam clipping, and that it is much more accurate than the existing approximations. The strength of the method is twofold: first, it is applicable in the region of pronounced diffraction (strong beam clipping), where all other approximations fail; second, unlike the exact-solution method, it can be efficiently used for modelling diffraction on multiple apertures. These features make the mode expansion method useful for the design and optimization of free-space architectures containing multiple optical elements, including optical interconnects and optical clock distribution systems. (C) 2003 Optical Society of America.
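The core idea of a mode expansion treatment of beam clipping can be sketched in one dimension: expand the field behind a hard aperture in an orthonormal Hermite-Gauss basis via overlap integrals and work with the coefficients thereafter. The sketch below is purely illustrative (the mode width, aperture half-width and mode count are arbitrary) and is not the paper's modified method, which handles the full VCSEL/microlens geometry.

```python
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi, sqrt

def hermite_gauss(n, x, w=1.0):
    """Orthonormal 1-D Hermite-Gauss mode of order n with width w."""
    coeff = np.zeros(n + 1); coeff[n] = 1.0
    norm = 1.0 / sqrt(2.0**n * factorial(n) * sqrt(pi) * w)
    return norm * hermval(x / w, coeff) * np.exp(-x**2 / (2 * w**2))

# Field behind a hard aperture: a clipped fundamental Gaussian beam.
x = np.linspace(-8, 8, 4001)
dx = x[1] - x[0]
aperture = np.abs(x) < 1.5                      # assumed aperture half-width
clipped = hermite_gauss(0, x) * aperture

# Expand the clipped field in the first N modes via overlap integrals,
# then reconstruct; the coefficients carry the diffraction information.
N = 30
c = np.array([np.sum(clipped * hermite_gauss(n, x)) * dx for n in range(N)])
recon = sum(c[n] * hermite_gauss(n, x) for n in range(N))
print("power in expansion:", np.sum(np.abs(recon)**2) * dx,
      "clipped power:", np.sum(clipped**2) * dx)
```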
Abstract:
We have recently developed a scalable Artificial Boundary Inhomogeneity (ABI) method [Chem. Phys. Lett. 366, 390–397 (2002)] based on the Lanczos algorithm, and in this work explore an alternative iterative implementation based on the Chebyshev algorithm. Detailed comparisons between the two iterative methods have been made in terms of efficiency as well as convergence behavior. The Lanczos subspace ABI method was also further improved by the use of a simpler three-term backward recursion algorithm to solve the subspace linear system. The two iterative methods are tested on model collinear H + H2 reactive state-to-state scattering.
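In a Lanczos representation the subspace linear system is symmetric tridiagonal, so it can be solved by a short forward/backward recursion. The sketch below uses the standard Thomas algorithm as a stand-in for the paper's specific three-term backward recursion; the matrix entries are random and diagonally dominant purely for the correctness check.

```python
import numpy as np

def solve_tridiagonal(alpha, beta, b):
    """Thomas algorithm: solve T x = b, with T symmetric tridiagonal (diagonal
    `alpha`, off-diagonal `beta`), as a stand-in for the subspace linear solve."""
    n = len(alpha)
    c = np.zeros(n - 1)   # modified upper diagonal
    d = np.zeros(n)       # modified right-hand side
    c[0] = beta[0] / alpha[0]
    d[0] = b[0] / alpha[0]
    for i in range(1, n):                     # forward elimination
        denom = alpha[i] - beta[i - 1] * c[i - 1]
        if i < n - 1:
            c[i] = beta[i] / denom
        d[i] = (b[i] - beta[i - 1] * d[i - 1]) / denom
    x = np.zeros(n)
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):            # backward recursion
        x[i] = d[i] - c[i] * x[i + 1]
    return x

# Quick check against a dense solve.
rng = np.random.default_rng(1)
alpha = rng.uniform(2, 3, 50); beta = rng.uniform(0, 0.5, 49)
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
b = rng.standard_normal(50)
print(np.allclose(solve_tridiagonal(alpha, beta, b), np.linalg.solve(T, b)))
```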
Abstract:
Complementing our recent work on subspace wavepacket propagation [Chem. Phys. Lett. 336 (2001) 149], we introduce a Lanczos-based implementation of the Faber polynomial quantum long-time propagator. The original version [J. Chem. Phys. 101 (1994) 10493] implicitly handles non-Hermitian Hamiltonians, that is, those perturbed by imaginary absorbing potentials to handle unwanted reflection effects. However, like many wavepacket propagation schemes, it encounters a bottleneck associated with dense matrix-vector multiplications. Our implementation seeks to reduce the quantity of such costly operations without sacrificing numerical accuracy. For some benchmark scattering problems, our approach compares favourably with the original. (C) 2004 Elsevier B.V. All rights reserved.
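A generic way to see how a Lanczos-based implementation limits the number of dense matrix-vector products is the short-iterative Lanczos propagator sketched below: the only products with H occur while building a small Krylov basis, and the time evolution is then carried out in the subspace. This is not the authors' Faber polynomial scheme and, unlike it, assumes a real symmetric (Hermitian) Hamiltonian; the matrix, step size and subspace dimension are arbitrary.

```python
import numpy as np

def lanczos_basis(H, v, m):
    """Build an m-dimensional Krylov basis V and the tridiagonal projection T of H."""
    n = len(v)
    V = np.zeros((m, n), dtype=complex)
    T = np.zeros((m, m))
    V[0] = v / np.linalg.norm(v)
    for j in range(m):
        w = H @ V[j]
        T[j, j] = np.real(np.vdot(V[j], w))
        w = w - T[j, j] * V[j] - (T[j, j - 1] * V[j - 1] if j > 0 else 0)
        if j < m - 1:
            T[j, j + 1] = T[j + 1, j] = np.linalg.norm(w)
            V[j + 1] = w / T[j + 1, j]
    return V, T

def propagate(H, psi, dt, m=20):
    """One short-time step: exp(-i H dt) psi approximated in the Krylov subspace,
    so only m matrix-vector products with H are required."""
    V, T = lanczos_basis(H, psi, m)
    evals, U = np.linalg.eigh(T)
    e1 = np.zeros(m); e1[0] = 1.0
    small = U @ (np.exp(-1j * evals * dt) * (U.T @ e1))
    return np.linalg.norm(psi) * (V.T @ small)

# Toy Hamiltonian: the propagation conserves the wavepacket norm to good accuracy.
rng = np.random.default_rng(2)
A = rng.standard_normal((300, 300)); H = (A + A.T) / 2
psi0 = rng.standard_normal(300) + 0j
psi1 = propagate(H, psi0, dt=0.05)
print(abs(np.linalg.norm(psi1) - np.linalg.norm(psi0)))
```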
Abstract:
This article describes the construction and use of a systematic, structured method of mental health country situation appraisal, intended to help meet the need for conceptual tools that assist planners and policy makers in developing and auditing policy and implementation strategies. The tool encompasses the key domains of context, needs, resources, provisions and outcomes; it provides a framework for synthesizing key qualitative and quantitative information, flagging gaps in knowledge, and reviewing existing policies. It serves as an enabling tool to alert and inform policy makers, professionals and other key stakeholders about important issues which need to be considered in mental health policy development. It provides detailed country-specific information in a systematic format, to facilitate global sharing of experiences of mental health reform and strategies between policy makers and other stakeholders. Lastly, it is designed to be a capacity-building tool for local stakeholders to enhance situation appraisal, and multisectoral policy development and implementation.
Abstract:
The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that in order to make the next step towards more realistic experiments it will be necessary to use models containing a significantly larger number of particles than current models. Thus, those simulations will require a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for the simulation of the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles have shown that the parallel implementation of the Lattice Solid Model can achieve a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
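For reference, the parallel efficiency quoted above is the usual ratio E(p) = T(1) / (p T(p)). A minimal helper is sketched below with made-up timings; the real benchmarks used models with millions of particles and are not reproduced here.

```python
def parallel_efficiency(t_serial, timings):
    """Speedup S(p) = T(1)/T(p) and efficiency E(p) = S(p)/p from wall-clock times."""
    return {p: (t_serial / t_p, t_serial / t_p / p) for p, t_p in timings.items()}

# Illustrative (made-up) timings in seconds for a fixed-size model.
timings = {2: 5300.0, 8: 1450.0, 32: 390.0, 128: 103.0}
for p, (speedup, eff) in parallel_efficiency(10000.0, timings).items():
    print(f"p={p:4d}  speedup={speedup:6.1f}  efficiency={eff:5.2f}")
```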
Abstract:
Recently the Balanced method was introduced as a class of quasi-implicit methods for solving stiff stochastic differential equations. We examine asymptotic and mean-square stability for several implementations of the Balanced method and give a generalized result for the mean-square stability region of any Balanced method. We also investigate the optimal implementation of the Balanced method with respect to strong convergence.
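For a scalar test equation, a Balanced method step can be written down in a few lines. The sketch below uses one common choice of control functions, C_n = c0*h + c1*|dW|, and should be read as an illustration rather than the paper's optimal implementation; the coefficients, step size and test equation are arbitrary. At the chosen step size the drift is stiff enough that an explicit Euler-Maruyama step would be mean-square unstable, while the balanced step remains stable.

```python
import numpy as np

def balanced_step(x, h, dW, a, b, c0, c1):
    """One step of a balanced (quasi-implicit) method for dX = a(X)dt + b(X)dW:
    X_{n+1} = X_n + a h + b dW + C (X_n - X_{n+1}),  C = c0*h + c1*|dW|  (C >= 0).
    In the scalar case the implicit equation is solved in closed form."""
    C = c0 * h + c1 * np.abs(dW)
    return x + (a(x) * h + b(x) * dW) / (1.0 + C)

# Stiff linear test equation dX = lam*X dt + mu*X dW; since 2*lam + mu**2 < 0,
# the exact second moment E[X^2] decays to zero.
lam, mu = -500.0, 2.0
a = lambda x: lam * x
b = lambda x: mu * x
rng = np.random.default_rng(3)
h, nsteps, npaths = 0.01, 100, 2000
x = np.full(npaths, 1.0)
for _ in range(nsteps):
    dW = rng.normal(0.0, np.sqrt(h), npaths)
    x = balanced_step(x, h, dW, a, b, c0=abs(lam), c1=abs(mu))
print("mean-square value after integration:", np.mean(x**2))
```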
Abstract:
Motivation: An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either are limited by the minimal assumptions they make or, under more specific assumptions, are computationally intensive. Results: By converting the value of the test statistic used to test the significance of each gene to a z-score, we propose a simple two-component normal mixture that adequately models the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
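A minimal version of this kind of procedure can be sketched as an EM fit of a two-component normal mixture to the z-scores, here with the null component fixed at N(0,1) for simplicity (an empirical null could also be estimated); the synthetic data, starting values and iteration count below are arbitrary.

```python
import numpy as np
from scipy.stats import norm

def fit_null_mixture(z, n_iter=200):
    """EM fit of a two-component normal mixture to z-scores: component 0 is the
    theoretical null N(0,1); component 1 (non-null genes) has free mean/variance.
    Returns (pi0, mu1, sigma1, tau0), where tau0[i] is the posterior probability
    that gene i is null."""
    pi0, mu1, sigma1 = 0.9, np.mean(z), 2.0 * np.std(z)
    for _ in range(n_iter):
        f0 = norm.pdf(z, 0.0, 1.0)
        f1 = norm.pdf(z, mu1, sigma1)
        tau0 = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)      # E-step
        pi0 = tau0.mean()                                   # M-step
        w1 = 1.0 - tau0
        mu1 = np.sum(w1 * z) / np.sum(w1)
        sigma1 = np.sqrt(np.sum(w1 * (z - mu1) ** 2) / np.sum(w1))
    return pi0, mu1, sigma1, tau0

# Synthetic example: 90% null genes, 10% differentially expressed.
rng = np.random.default_rng(4)
z = np.concatenate([rng.normal(0, 1, 9000), rng.normal(3, 1, 1000)])
pi0, mu1, sigma1, tau0 = fit_null_mixture(z)
print(f"estimated pi0={pi0:.3f}, non-null mean={mu1:.2f}")
print("genes called non-null at tau0 < 0.1:", np.sum(tau0 < 0.1))
```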
Abstract:
In this study, the 3-D Lattice Solid Model (LSMearth or LSM) was extended by introducing particle-scale rotation. In the new model, each 3-D particle has six degrees of freedom: three for translational motion and three for orientation. Six kinds of relative motion are permitted between two neighboring particles, and six interactions are transmitted, i.e., a radial force, two shearing forces, a twisting torque and two bending torques. By using quaternion algebra, the relative rotation between two particles is decomposed into two sequence-independent rotations, so that all interactions due to the relative motions between interacting rigid bodies can be uniquely determined. After incorporating this mechanism and introducing bond breaking under torsion and bending into the LSM, several tests of 2-D and 3-D rock failure under uniaxial compression were carried out. Compared with simulations without the single-particle rotational mechanism, the new simulation results match experimental results of rock fracture more closely and are hence encouraging. Since more parameters are introduced, an approach for choosing the new parameters is presented.
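One way to split a relative rotation into a torsion part and a bending part, in the spirit described above, is the quaternion swing-twist decomposition sketched below; the exact decomposition and sign conventions used in the model may differ, and the bond axis and example angles are arbitrary.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def twist_bend_split(qa, qb, bond_axis):
    """Relative rotation q_rel = qa^-1 * qb between two bonded particles, split
    into a twist about the bond axis (-> torsion) and a remaining swing
    (-> bending) via the swing-twist decomposition. Unit quaternions assumed."""
    q_rel = quat_mul(quat_conj(qa), qb)
    v = q_rel[1:]
    proj = np.dot(v, bond_axis) * np.asarray(bond_axis)
    twist = np.concatenate(([q_rel[0]], proj))
    twist /= np.linalg.norm(twist)
    swing = quat_mul(q_rel, quat_conj(twist))    # q_rel = swing * twist
    return twist, swing

# Example: particle b twisted 30 deg about the bond axis and bent 20 deg off it.
axis = np.array([0.0, 0.0, 1.0])
qa = np.array([1.0, 0.0, 0.0, 0.0])
t, s = np.radians(30) / 2, np.radians(20) / 2
qb = quat_mul(np.array([np.cos(s), np.sin(s), 0, 0]),     # bend about x
              np.array([np.cos(t), 0, 0, np.sin(t)]))     # twist about z
twist, swing = twist_bend_split(qa, qb, axis)
print("recovered twist angle:", np.degrees(2 * np.arccos(np.clip(twist[0], -1, 1))))
print("recovered bend  angle:", np.degrees(2 * np.arccos(np.clip(swing[0], -1, 1))))
```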
Abstract:
The Java programming language supports concurrency. Concurrent programs are harder to verify than their sequential counterparts due to their inherent nondeterminism and a number of specific concurrency problems such as interference and deadlock. In previous work, we proposed a method for verifying concurrent Java components based on a mix of code inspection, static analysis tools, and the ConAn testing tool. The method was derived from an analysis of concurrency failures in Java components, but was not applied in practice. In this paper, we explore the method by applying it to an implementation of the well-known readers-writers problem and a number of mutants of that implementation. We only apply it to a single, well-known example, and so we do not attempt to draw any general conclusions about the applicability or effectiveness of the method. However, the exploration does point out several strengths and weaknesses in the method, which enable us to fine-tune it before we carry out a more formal evaluation on other, more realistic components.
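For readers unfamiliar with the example, a minimal readers-writers monitor is sketched below, in Python rather than Java: any number of readers may hold the lock concurrently, while a writer requires exclusive access. It only illustrates the kind of component being verified, not the verification method itself; seeded mutants of such a class would typically weaken the wait conditions or drop notifications.

```python
import threading

class ReadWriteLock:
    """Minimal readers-writers monitor: many concurrent readers,
    writers get exclusive access (no writer-preference policy)."""
    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0
        self._writing = False

    def acquire_read(self):
        with self._cond:
            while self._writing:          # wait while a writer is active
                self._cond.wait()
            self._readers += 1

    def release_read(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:        # last reader wakes waiting writers
                self._cond.notify_all()

    def acquire_write(self):
        with self._cond:
            while self._writing or self._readers > 0:
                self._cond.wait()
            self._writing = True

    def release_write(self):
        with self._cond:
            self._writing = False
            self._cond.notify_all()
```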
Abstract:
Achieving consistency between a specification and its implementation is an important part of software development. In previous work, we have presented a method and tool support for testing a formal specification using animation and then verifying an implementation of that specification. The method is based on a testgraph, which provides a partial model of the application under test. The testgraph is used in combination with an animator to generate test sequences for testing the formal specification. The same testgraph is used during testing to execute those same sequences on the implementation and to ensure that the implementation conforms to the specification. So far, the method and its tool support have been applied to software components that can be accessed through an application programmer interface (API). In this paper, we use an industrially-based case study to discuss the problems associated with applying the method to a software system with a graphical user interface (GUI). In particular, the lack of a standardised interface, as well as controllability and observability problems, make it difficult to automate the testing of the implementation. The method can still be applied, but the amount of testing that can be carried out on the implementation is limited by the manual effort involved.
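A testgraph-driven setup can be sketched generically: a labelled directed graph whose arcs name operations, a walker that turns bounded paths into test sequences, and a driver that runs each sequence on both the specification animator and the implementation and compares traces. Everything below (the stack-like graph, the operation names, the stub objects in the commented-out conformance check) is hypothetical and is not the authors' tool.

```python
# Hypothetical sketch: a testgraph as a labelled directed graph; each arc carries
# an operation to run on both the specification animator and the implementation.
testgraph = {
    "Empty": [("push", "One")],
    "One":   [("push", "Many"), ("pop", "Empty")],
    "Many":  [("pop", "One")],
}

def sequences(graph, start, depth):
    """Enumerate operation sequences of a given length by walking the testgraph."""
    paths = [((), start)]
    for _ in range(depth):
        paths = [(ops + (op,), nxt) for ops, node in paths for op, nxt in graph[node]]
    return [ops for ops, _ in paths]

def run(sequence, system):
    """Drive an object exposing the arc operations as methods; return its trace."""
    trace = []
    for op in sequence:
        trace.append(getattr(system, op)())
    return trace

# Conformance check (sketch): identical traces from animator and implementation.
# for seq in sequences(testgraph, "Empty", 4):
#     assert run(seq, animator_stub) == run(seq, implementation_adapter)
```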
Abstract:
The Equilibrium Flux Method [1] is a kinetic theory based finite volume method for calculating the flow of a compressible ideal gas. It is shown here that, in effect, the method solves the Euler equations with added pseudo-dissipative terms and that it is a natural upwinding scheme. The method can be easily modified so that the flow of a chemically reacting gas mixture can be calculated. Results from the method for a one-dimensional non-equilibrium reacting flow are shown to agree well with a conventional continuum solution. Results are also presented for the calculation of a plane two-dimensional flow, at hypersonic speed, of a dissociating gas around a blunt-nosed body.
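The finite-volume structure into which such a kinetic flux drops can be sketched for the 1-D Euler equations. The code below uses a simple Rusanov (local Lax-Friedrichs) flux as a stand-in for the Equilibrium Flux Method's kinetic split fluxes, so it shows only the surrounding conservative update, not the EFM flux itself; the grid, time step and Sod-type initial data are illustrative.

```python
import numpy as np

GAMMA = 1.4

def flux(U):
    """Physical Euler flux for conserved variables U = (rho, rho*u, E)."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, (E + p) * u])

def rusanov(UL, UR):
    """Simple dissipative numerical flux used here as a stand-in; an EFM code
    would replace this with the kinetic (half-range Maxwellian) split fluxes."""
    def wavespeed(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
        return np.abs(u) + np.sqrt(GAMMA * p / rho)
    s = np.maximum(wavespeed(UL), wavespeed(UR))
    return 0.5 * (flux(UL) + flux(UR)) - 0.5 * s * (UR - UL)

# Sod-type shock tube, first-order finite-volume update.
nx, dx, dt = 400, 1.0 / 400, 0.0005
rho = np.where(np.arange(nx) < nx // 2, 1.0, 0.125)
p   = np.where(np.arange(nx) < nx // 2, 1.0, 0.1)
U = np.array([rho, np.zeros(nx), p / (GAMMA - 1.0)])
for _ in range(400):
    F = rusanov(U[:, :-1], U[:, 1:])                     # interface fluxes
    U[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])       # conservative update
print("density range after 0.2 time units:", U[0].min(), U[0].max())
```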
Abstract:
The level set method has been implemented in a computational volcanology context. New techniques are presented to solve the advection equation and the reinitialisation equation. These techniques are based upon an algorithm developed in the finite difference context, but are modified to take advantage of the robustness of the finite element method. The resulting algorithm is tested on a well documented Rayleigh–Taylor instability benchmark [19], and on an axisymmetric problem where the analytical solution is known. Finally, the algorithm is applied to a basic study of lava dome growth.
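A 1-D sketch of the two ingredients named above (upwind advection of the level set function and PDE-based reinitialisation toward a signed distance function) is given below; it uses simple first-order finite differences rather than the finite element formulation of the paper, and the distorted test profile is invented.

```python
import numpy as np

def advect(phi, u, dx, dt):
    """First-order upwind step for phi_t + u * phi_x = 0 (periodic domain)."""
    dminus = (phi - np.roll(phi, 1)) / dx     # backward difference
    dplus  = (np.roll(phi, -1) - phi) / dx    # forward difference
    return phi - dt * np.where(u > 0, u * dminus, u * dplus)

def reinitialise(phi, dx, iters=50, dtau=None):
    """Pseudo-time iteration of phi_tau = sign(phi0)(1 - |phi_x|), which restores
    the signed-distance property without moving the zero contour much."""
    dtau = dtau or 0.5 * dx
    s = phi / np.sqrt(phi**2 + dx**2)         # smoothed sign of the initial field
    for _ in range(iters):
        dminus = (phi - np.roll(phi, 1)) / dx
        dplus  = (np.roll(phi, -1) - phi) / dx
        # Godunov upwinding of |phi_x| driven by the sign of s
        grad = np.where(
            s > 0,
            np.sqrt(np.maximum(np.maximum(dminus, 0)**2, np.minimum(dplus, 0)**2)),
            np.sqrt(np.maximum(np.minimum(dminus, 0)**2, np.maximum(dplus, 0)**2)),
        )
        phi = phi - dtau * s * (grad - 1.0)
    return phi

# A distorted interface: reinitialisation recovers |phi_x| ~ 1 near the zero set.
x = np.linspace(0, 1, 401, endpoint=False)
phi = (x - 0.5) * (1.5 + np.sin(6 * np.pi * x))   # zero near x = 0.5, non-unit slope
phi = reinitialise(phi, x[1] - x[0])
print("slope at the interface:", (phi[201] - phi[199]) / (2 * (x[1] - x[0])))
```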
Abstract:
Inaccurate species identification confounds insect ecological studies. Examining aspects of Trichogramma ecology pertinent to the novel insect resistance management strategy for future transgenic cotton, Gossypium hirsutum L., production in the Ord River Irrigation Area (ORIA) of Western Australia required accurate differentiation between morphologically similar Trichogramma species. Established molecular diagnostic methods for Trichogramma identification use species-specific sequence differences in the internal transcribed spacer (ITS)-2 chromosomal region; yet difficulties arise in discerning polymerase chain reaction (PCR) fragments of similar base pair length by gel electrophoresis. This necessitates the restriction enzyme digestion of PCR-amplified ITS-2 fragments to readily differentiate Trichogramma australicum Girault and Trichogramma pretiosum Riley. To overcome the time and expense associated with a two-step diagnostic procedure, we developed a “one-step” multiplex PCR technique using species-specific primers designed to the ITS-2 region. This approach allowed for a high-throughput analysis of samples as part of ongoing ecological studies examining Trichogramma biological control potential in the ORIA, where these two species occur in sympatry.