70 results for RT-LAB simulation
in University of Queensland eSpace - Australia
Abstract:
What interactions are sufficient to simulate arbitrary quantum dynamics in a composite quantum system? We provide an efficient algorithm to simulate any desired two-body Hamiltonian evolution using any fixed two-body entangling n-qubit Hamiltonian and local unitary operations. It follows that universal quantum computation can be performed using any entangling interaction and local unitary operations.
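Schematically, the construction rests on the fact that conjugating the fixed entangling interaction H by local unitaries and concatenating short evolutions reproduces any desired two-body evolution in a Trotter-like fashion. The following is a generic sketch of that idea, not the paper's explicit algorithm:
\[
e^{-i H_{\mathrm{target}} t} \;\approx\; \prod_{k=1}^{m} V_k \, e^{-i H t_k} \, V_k^{\dagger},
\]
where each V_k is a product of single-qubit unitaries and the durations t_k are chosen so that the conjugated Hamiltonians V_k H V_k^\dagger sum, to first order, to H_target.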
Abstract:
The N-methyl-D-aspartate (NMDA)-selective subtype of ionotropic glutamate receptor is of importance in neuronal differentiation and synapse consolidation, activity-dependent forms of synaptic plasticity, and excitatory amino acid-mediated neuronal toxicity [Neurosci. Res. Program, Bull. 19 (1981) 1; Lab. Invest. 68 (1993) 372]. NMDA receptors exist in vivo as tetrameric or pentameric complexes comprising proteins from two families of homologous subunits, designated NR1 and NR2(A-D) [Biochem. Biophys. Res. Commun. 185 (1992) 826]. The gene coding for the human NR1 subunit (hNR1) is composed of 21 exons, three of which (4, 20 and 21) can be differentially spliced to generate a total of eight distinct subunit variants. We detail here a competitive RT-PCR (cRT-PCR) protocol to quantify endogenous levels of hNR1 splice variants in autopsied human brain. Quantitation of each hNR1 splice variant is performed using standard curve methodology in which a known amount of synthetic ribonucleic acid competitor (internal standard) is co-amplified against total RNA. This method can be used for the quantitation of hNR1 mRNA levels in response to acute or chronic disease states, in particular in the glutamatergic-associated neuronal loss observed in Alzheimer's disease [J. Neurochem. 78 (2001) 175]. Furthermore, alterations in hNR1 mRNA expression may be reflected at the translational level, resulting in functional changes in the NMDA receptor. (C) 2003 Elsevier Science B.V. All rights reserved.
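A minimal sketch of the standard-curve calculation underlying competitive RT-PCR quantitation: a dilution series of the synthetic competitor is co-amplified with the sample, and the equivalence point (target/competitor ratio of 1) gives the endogenous transcript amount. Variable names and numbers below are illustrative placeholders, not data from the paper.

# Illustrative standard-curve fit for competitive RT-PCR (hypothetical data)
import numpy as np

# Known amounts of synthetic RNA competitor spiked into each reaction (amol)
competitor_amol = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
# Measured band-intensity ratios of target product to competitor product
target_over_competitor = np.array([8.2, 2.9, 1.1, 0.33, 0.09])

# Linear fit of log(ratio) versus log(competitor); at the equivalence point
# the ratio is 1 (log ratio = 0) and target amount equals competitor amount.
slope, intercept = np.polyfit(np.log10(competitor_amol),
                              np.log10(target_over_competitor), 1)
log_equivalence = -intercept / slope
print(f"Estimated hNR1 transcript amount: {10**log_equivalence:.2f} amol")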
Abstract:
Air Traffic Control Laboratory Simulator (ATC-lab) is a new low- and medium-fidelity task environment that simulates air traffic control. ATC-lab allows the researcher to study human performance of tasks under tightly controlled experimental conditions in a dynamic, spatial environment. The researcher can create standardized air traffic scenarios by manipulating a wide variety of parameters. These include temporal and spatial variables. There are two main versions of ATC-lab. The medium-fidelity simulator provides a simplified version of en route air traffic control, requiring participants to visually search a screen and both recognize and resolve conflicts so that adequate separation is maintained between all aircraft. The low-fidelity simulator presents pairs of aircraft in isolation, controlling the participant's focus of attention, which provides a more systematic measurement of conflict recognition and resolution performance. Preliminary studies have demonstrated that ATC-lab is a flexible tool for applied cognition research.
Abstract:
The XSophe-Sophe-XeprView® computer simulation software suite enables scientists to easily determine spin Hamiltonian parameters from isotropic, randomly oriented and single crystal continuous wave electron paramagnetic resonance (CW EPR) spectra from radicals and isolated paramagnetic metal ion centers or clusters found in metalloproteins, chemical systems and materials science. XSophe provides an X-windows graphical user interface to the Sophe programme and allows: creation of multiple input files, local and remote execution of Sophe, the display of sophelog (output from Sophe) and input parameters/files. Sophe is a sophisticated computer simulation software programme employing a number of innovative technologies including: the Sydney OPera HousE (SOPHE) partition and interpolation schemes, a field segmentation algorithm, the mosaic misorientation linewidth model, parallelization and spectral optimisation. In conjunction with the SOPHE partition scheme and the field segmentation algorithm, the SOPHE interpolation scheme and the mosaic misorientation linewidth model greatly increase the speed of simulations for most spin systems. Employing brute force matrix diagonalization in the simulation of an EPR spectrum from a high spin Cr(III) complex with the spin Hamiltonian parameters g_e = 2.00, D = 0.10 cm^-1, E/D = 0.25, A_x = 120.0, A_y = 120.0, A_z = 240.0 x 10^-4 cm^-1 requires a SOPHE grid size of N = 400 (to produce a good signal to noise ratio) and takes 229.47 s. In contrast, the use of either the SOPHE interpolation scheme or the mosaic misorientation linewidth model requires a SOPHE grid size of only N = 18 and takes 44.08 and 0.79 s, respectively. Results from Sophe are transferred via the Common Object Request Broker Architecture (CORBA) to XSophe and subsequently to XeprView®, where the simulated CW EPR spectra (1D and 2D) can be compared to the experimental spectra. Energy level diagrams, transition roadmaps and transition surfaces aid the interpretation of complicated randomly oriented CW EPR spectra and can be viewed with a web browser and an OpenInventor scene graph viewer.
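For orientation, a commonly used textbook form of the spin Hamiltonian whose parameters (g, D, E, A) such simulations fit is shown below; this is a standard convention, not necessarily the exact formulation implemented in Sophe:
\[
\mathcal{H} = \mu_B\,\mathbf{B}\cdot\mathbf{g}\cdot\mathbf{S}
 + D\!\left[S_z^2 - \tfrac{1}{3}S(S+1)\right]
 + E\!\left(S_x^2 - S_y^2\right)
 + \mathbf{S}\cdot\mathbf{A}\cdot\mathbf{I},
\]
with the electron Zeeman, axial and rhombic zero-field splitting, and hyperfine terms, respectively.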
Abstract:
OctVCE is a Cartesian cell CFD code produced especially for numerical simulations of shock and blast wave interactions with complex geometries. Virtual Cell Embedding (VCE) was chosen as its Cartesian cell kernel as it is simple to code and sufficient for practical engineering design problems. This also makes the code much more ‘user-friendly’ than structured grid approaches, as the gridding process is done automatically. The CFD methodology relies on a finite-volume formulation of the unsteady Euler equations and is solved using a standard explicit Godunov (MUSCL) scheme. Both octree-based adaptive mesh refinement and shared-memory parallel processing capability have also been incorporated. For further details on the theory behind the code, see the companion report 2007/12.
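A generic sketch of the explicit finite-volume update underlying such a scheme (the code's exact flux treatment and reconstruction are in the companion report):
\[
\mathbf{U}_i^{\,n+1} = \mathbf{U}_i^{\,n}
 - \frac{\Delta t}{V_i} \sum_{f \in \partial i} \mathbf{F}_f \cdot \hat{\mathbf{n}}_f \, A_f ,
\]
where U_i is the vector of conserved quantities averaged over cell i, V_i the cell volume, and F_f the Godunov/MUSCL numerical flux through face f with area A_f and outward normal n_f.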
Abstract:
Numerical experiments using a finite difference method were carried out to determine the motion of axisymmetric Taylor vortices for narrow-gap Taylor vortex flow. When a pressure gradient is imposed on the flow the vortices are observed to move with an axial speed of 1.16 +/- 0.005 times the mean axial flow velocity. The method of Brenner was used to calculate the long-time axial spread of material in the flow. For flows where there is no pressure gradient, the axial dispersion scales with the square root of the molecular diffusion, in agreement with the results of Rosenbluth et al. for high Peclet number dispersion in spatially periodic flows with a roll structure. When a pressure gradient is imposed the dispersion increases by an amount approximately equal to 6.5 x 10^-4 (W̄)^2 d^2 / D_m, where W̄ is the average axial velocity in the annulus, analogous to Taylor dispersion for laminar flow in an empty tube.
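For comparison, the classical Taylor-Aris result for laminar flow in an empty tube of radius a, to which the abstract draws the analogy, is the standard textbook expression
\[
D_{\mathrm{eff}} = D_m + \frac{a^2\,\bar{W}^2}{48\,D_m},
\]
whereas the annular Taylor-vortex flow studied here yields the smaller coefficient 6.5 x 10^-4 quoted above.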
Abstract:
To simulate cropping systems, crop models must not only give reliable predictions of yield across a wide range of environmental conditions, they must also quantify water and nutrient use well, so that the status of the soil at maturity is a good representation of the starting conditions for the next cropping sequence. To assess their suitability for this task, a range of crop models currently used in Australia was tested. The models differed in their design objectives, complexity and structure and were (i) tested on diverse, independent data sets from a wide range of environments and (ii) further evaluated at the component level with one detailed data set from a semi-arid environment. All models were coded into the cropping systems shell APSIM, which provides a common soil water and nitrogen balance. Crop development was input, thus differences between simulations were caused entirely by differences in simulating crop growth. Under nitrogen non-limiting conditions, between 73 and 85% of the observed kernel yield variation across environments was explained by the models. This ranged from 51 to 77% under varying nitrogen supply. Water and nitrogen effects on leaf area index were predicted poorly by all models, resulting in erroneous predictions of dry matter accumulation and water use. When measured light interception was used as input, most models improved in their prediction of dry matter and yield. This test highlighted a range of compensating errors in all modelling approaches. Time course and final amount of water extraction were simulated well by two models, while others left up to 25% of potentially available soil water in the profile. Kernel nitrogen percentage was predicted poorly by all models due to its sensitivity to small dry matter changes. Yield and dry matter could be estimated adequately for a range of environmental conditions using the general concepts of radiation use efficiency and transpiration efficiency. However, leaf area and kernel nitrogen dynamics need to be improved to achieve better estimates of water and nitrogen use if such models are to be used to evaluate cropping systems. (C) 1998 Elsevier Science B.V.
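A minimal sketch of the "most limiting resource" growth concept referred to above, combining radiation use efficiency and transpiration efficiency; the parameter names and values are illustrative placeholders, not taken from any of the tested models.

def daily_biomass_gain(intercepted_radiation_mj, transpiration_mm, vpd_kpa,
                       rue=1.3, te_coeff=9.0):
    """Daily above-ground biomass gain (g/m^2) as the minimum of a
    radiation-limited and a water-limited estimate (illustrative only)."""
    radiation_limited = rue * intercepted_radiation_mj        # g/MJ * MJ/m^2
    water_limited = te_coeff * transpiration_mm / vpd_kpa     # VPD-normalised TE
    return min(radiation_limited, water_limited)

print(daily_biomass_gain(intercepted_radiation_mj=8.0,
                         transpiration_mm=4.0, vpd_kpa=2.0))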
Abstract:
Previous work has identified several shortcomings in the ability of four spring wheat and one barley model to simulate crop processes and resource utilization. This can have important implications when such models are used within systems models where final soil water and nitrogen conditions of one crop define the starting conditions of the following crop. In an attempt to overcome these limitations and to reconcile a range of modelling approaches, existing model components that worked demonstrably well were combined with new components for aspects where existing capabilities were inadequate. This resulted in the Integrated Wheat Model (I_WHEAT), which was developed as a module of the cropping systems model APSIM. To increase the predictive capability of the model, process detail was reduced, where possible, by replacing groups of processes with conservative, biologically meaningful parameters. I_WHEAT does not contain a soil water or soil nitrogen balance; these are present as other modules of APSIM. In I_WHEAT, yield is simulated using a linear increase in harvest index, whereby nitrogen or water limitations can lead to early termination of grain filling and hence cessation of the harvest index increase. Dry matter increase is calculated either from the amount of intercepted radiation and radiation conversion efficiency or from the amount of water transpired and transpiration efficiency, depending on the most limiting resource. Leaf area and tiller formation are calculated from thermal time and a cultivar-specific phyllochron interval. Nitrogen limitation first reduces leaf area and then affects radiation conversion efficiency as it becomes more severe. Water or nitrogen limitations result in reduced leaf expansion, accelerated leaf senescence or tiller death. This reduces the radiation load on the crop canopy (i.e. demand for water) and can make nitrogen available for translocation to other organs. Sensitive feedbacks between light interception and dry matter accumulation are avoided by having environmental effects act directly on leaf area development, rather than via biomass production. This makes the model more stable across environments without losing the interactions between the different external influences. When comparing model output with models tested previously using data from a wide range of agro-climatic conditions, yield and biomass predictions were equal to the best of those models, but improvements could be demonstrated for simulating leaf area dynamics in response to water and nitrogen supply, kernel nitrogen content, and total water and nitrogen use. I_WHEAT does not require calibration for any of the environments tested. Further model improvement should concentrate on improving phenology simulations, a more thorough derivation of coefficients to describe leaf area development and a better quantification of some processes related to nitrogen dynamics. (C) 1998 Elsevier Science B.V.
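A hedged sketch of the linear harvest-index yield approach with stress-induced early termination, as described above; this is a generic illustration, not the I_WHEAT module itself, and the parameter values are placeholders.

def grain_yield(biomass_by_day, hi_rate=0.018, hi_max=0.50, stress_by_day=None):
    """Accumulate harvest index linearly during grain filling; severe water or
    nitrogen stress terminates the increase early (illustrative sketch)."""
    hi = 0.0
    for day in range(len(biomass_by_day)):
        if stress_by_day and stress_by_day[day]:
            break                       # stress ends grain filling early
        hi = min(hi + hi_rate, hi_max)  # linear daily increase, capped
    return hi * biomass_by_day[-1]      # yield = harvest index * final biomass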
Abstract:
The linearity of daily linear harvest index (HI) increase can provide a simple means to predict grain growth and yield in field crops. However, the stability of the rate of increase across genotypes and environments is uncertain. Data from three field experiments were collated to investigate the phase of linear HI increase of sunflower (Helianthus annuus L.) across environments by changing genotypes, sowing time, N level, and solar irradiation level. Linear increase in HI was similar among different genotypes, N levels, and radiation treatments (mean 0.0125 d^-1), but significant differences occurred between sowings. The linear increase in HI was not stable at very low temperatures (down to 9 degrees C) during grain filling, due to possible limitations to biomass accumulation and translocation (mean 0.0091 d^-1). Using the linear increase in HI to predict grain yield requires predictions of the duration from anthesis to the onset of linear HI increase (lag phase) and the cessation of linear HI increase. These studies showed that the lag phase differed, and the linear HI increase ceased when 91% of the anthesis to physiological maturity period had been completed.
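Schematically, the harvest-index trajectory implied by these results can be written in a generic piecewise form, with the lag duration t_lag and cessation time t_end to be predicted separately, as the abstract notes:
\[
HI(t) =
\begin{cases}
0, & t < t_{\mathrm{lag}} \\
r\,(t - t_{\mathrm{lag}}), & t_{\mathrm{lag}} \le t \le t_{\mathrm{end}} \\
r\,(t_{\mathrm{end}} - t_{\mathrm{lag}}), & t > t_{\mathrm{end}}
\end{cases}
\qquad r \approx 0.0125\ \mathrm{d}^{-1},
\]
with t measured from anthesis and t_end at about 91% of the anthesis-to-maturity period.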
Abstract:
A version of the Agricultural Production Systems Simulator (APSIM) capable of simulating the key agronomic aspects of intercropping maize between legume shrub hedgerows was described and parameterised in the first paper of this series (Nelson et al., this issue). In this paper, APSIM is used to simulate maize yields and soil erosion from traditional open-field farming and hedgerow intercropping in the Philippine uplands. Two variants of open-field farming were simulated using APSIM, continuous and fallow, for comparison with intercropping maize between leguminous shrub hedgerows. Continuous open-field maize farming was predicted to be unsustainable in the long term, while fallow open-field farming was predicted to slow productivity decline by spreading the effect of erosion over a larger cropping area. Hedgerow intercropping was predicted to reduce erosion by maintaining soil surface cover during periods of intense rainfall, contributing to sustainable production of maize in the long term. In the third paper in this series, Nelson et al. (this issue) use cost-benefit analysis to compare the economic viability of hedgerow intercropping relative to traditional open-field farming of maize in relatively inaccessible upland areas. (C) 1998 Elsevier Science Ltd. All rights reserved.
Abstract:
The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
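For context, a pressure model commonly used in the flush-air-data-system literature relates each port pressure to the local flow incidence angle; the empirically calibrated HYFLEX model built in this work will differ in detail:
\[
p_i = q_c\!\left(\cos^2\theta_i + \varepsilon\,\sin^2\theta_i\right) + p_\infty ,
\]
where θ_i is the flow incidence angle at port i (a function of angle of attack, angle of sideslip, and port location), q_c the impact pressure, p_∞ the static pressure, and ε a calibration parameter blending modified-Newtonian and potential-flow behaviour.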
Abstract:
A space-marching code for the simulation and optimization of inviscid supersonic flow in three dimensions is described. The flow in a scramjet module with a relatively complex three-dimensional geometry is examined and wall-pressure estimates are compared with experimental data. Given that viscous effects are not presently included, the comparison is reasonable. The thermodynamic compromise of adding heat in a diverging combustor is also examined. The code is then used to optimize the shape of a thrust surface for a simpler (box-section) scramjet module in the presence of uniform and nonuniform heat distributions. The optimum two-dimensional profiles for the thrust surface are obtained via a perturbation procedure that requires about 30-50 flow solutions. It is found that the final shapes are fairly insensitive to the details of the heat distribution.
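A minimal sketch of the kind of perturbation procedure described, in which each candidate surface shape costs one flow solution; the solver call, parameterisation, and step sizes are placeholders, not the paper's actual implementation.

import numpy as np

def optimise_thrust_surface(evaluate_thrust, y0, step=0.01, n_iter=10):
    """Crude coordinate-perturbation optimisation of thrust-surface control
    points; evaluate_thrust stands in for a space-marching flow solution."""
    y = np.array(y0, dtype=float)
    best = evaluate_thrust(y)
    for _ in range(n_iter):
        for i in range(len(y)):
            for delta in (+step, -step):
                trial = y.copy()
                trial[i] += delta
                thrust = evaluate_thrust(trial)  # one flow solution per trial
                if thrust > best:
                    y, best = trial, thrust
    return y, best

# Example with a dummy surrogate in place of a real flow solver:
# best_y, best_thrust = optimise_thrust_surface(lambda y: -np.sum((y - 0.3)**2), [0.0]*5)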
Abstract:
Two experimental studies were conducted to examine whether the stress-buffering effects of behavioral control on work task responses varied as a function of procedural information. Study 1 manipulated low and high levels of task demands, behavioral control, and procedural information for 128 introductory psychology students completing an in-basket activity. ANOVA procedures revealed a significant three-way interaction among these variables in the prediction of subjective task performance and task satisfaction. It was found that procedural information buffered the negative effects of task demands on ratings of performance and satisfaction only under conditions of low behavioral control. This pattern of results suggests that procedural information may have a compensatory effect when the work environment is characterized by a combination of high task demands and low behavioral control. Study 2 (N = 256) utilized simple and complex versions of the in-basket activity to examine the extent to which the interactive relationship among task demands, behavioral control, and procedural information varied as a function of task complexity. There was further support for the stress-buffering role of procedural information on work task responses under conditions of low behavioral control. This effect was, however, only present when the in-basket activity was characterized by high task complexity, suggesting that the interactive relationship among these variables may depend on the type of tasks performed at work. Copyright (C) 1999 John Wiley & Sons, Ltd.