142 results for Partial oxalate method
in University of Queensland eSpace - Australia
Abstract:
The authors describe rock art dating research in Australia using the oxalate method. While the array of dates obtained (which range from c. 1200 to c. 25,000 BP) shows a satisfactory correlation with other archaeological data, there are mismatches which suggest that some motifs were imitated by later artists, and/or that the mineral accretions continued to form periodically, perhaps continuously, as a regional phenomenon over a long period of time.
Abstract:
Lateral ventricular volumes based on segmented brain MR images can be significantly underestimated if partial volume effects are not considered. This is because a group of voxels in the neighborhood of the lateral ventricles is often mis-classified as gray matter voxels due to partial volume effects. This group of voxels is actually a mixture of ventricular cerebrospinal fluid and white matter and, therefore, a portion of it should be included as part of the lateral ventricular structure. In this note, we describe an automated method for the measurement of lateral ventricular volumes on segmented brain MR images. Image segmentation was carried out using a combination of intensity correction and thresholding. The method features a procedure for addressing mis-classified voxels in the neighborhood of the lateral ventricles. A detailed analysis showed that lateral ventricular volumes could be underestimated by 10 to 30%, depending upon the size of the lateral ventricular structure, if mis-classified voxels were not included. The method was validated through comparison with averaged manually traced volumes. Finally, the merit of the method is demonstrated in the evaluation of the rate of lateral ventricular enlargement. (C) 2001 Elsevier Science Inc. All rights reserved.
Abstract:
Six Burkholderia solanacearum (formerly Pseudomonas solanacearum) genomic DNA fragments were isolated, using RAPD techniques and cloning, from three genetically diverse strains: ACH092 (Biovar 4), ACH0158 (Biovar 2) and ACH0171 (Biovar 3) (1). One of these cloned fragments was selected because it was consistently present in all bacterial strains analysed. The remaining five clones were selected because Southern hybridisation revealed that each showed partial or complete specificity towards the strain of origin. A seventh genomic fragment showing a strain-specific distribution in Southern hybridisations was obtained by differential restriction, hybridisation and cloning of genomic DNA. Each of these clones was sequenced and primers to amplify the insert were designed. When DNA from the strain of origin was used as template, PCR amplification of each of these fragments yielded a single band on gel analysis. One pair of primers amplified the species-constant fragment of 281 bp from DNA of all B. solanacearum strains investigated, from DNA of the closely related bacterium which causes "blood disease" of banana (BDB), and from P. syzygii. The sensitivity of detection of B. solanacearum using these ubiquitous primers was between 1.3 and 20 bacterial cells. The feasibility and reliability of a PCR approach to the detection and identification of B. solanacearum was tested on diverse strains of the bacterium in several countries and laboratories.
Abstract:
A detailed analysis procedure is described for evaluating rates of volumetric change in brain structures based on structural magnetic resonance (MR) images. In this procedure, a series of image processing tools have been employed to address the problems encountered in measuring rates of change based on structural MR images. These tools include an algorithm for intensity non-uniformity correction, a robust algorithm for three-dimensional image registration with sub-voxel precision and an algorithm for brain tissue segmentation. A unique feature of the procedure is the use of a fractional volume model that has been developed to provide a quantitative measure of the partial volume effect. With this model, the fractional constituent tissue volumes are evaluated for voxels at the tissue boundary that manifest the partial volume effect, thus allowing tissue boundaries to be defined at a sub-voxel level and in an automated fashion. Validation studies are presented on key algorithms including segmentation and registration. An overall assessment of the method is provided through the evaluation of the rates of brain atrophy in a group of normal elderly subjects, for whom the rate of brain atrophy due to normal aging is predictably small. An application of the method is given in Part II, where the rates of brain atrophy in various brain regions are studied in relation to normal aging and Alzheimer's disease. (C) 2002 Elsevier Science Inc. All rights reserved.
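As a concrete illustration of the fractional volume idea, the sketch below uses a generic two-tissue linear mixture model in which a boundary voxel's intensity is a weighted average of two assumed pure-tissue intensities. The intensity values and voxel size are illustrative assumptions, not the authors' actual model or parameters.

```python
import numpy as np

def fractional_volume(intensity, i_pure_a, i_pure_b):
    """Generic two-tissue partial volume model (illustrative sketch).

    A boundary voxel's intensity is assumed to be a linear mixture of two
    pure-tissue intensities:  I = f * i_pure_a + (1 - f) * i_pure_b.
    Solving for f gives the fractional volume of tissue A in the voxel.
    """
    f = (intensity - i_pure_b) / (i_pure_a - i_pure_b)
    return np.clip(f, 0.0, 1.0)

# Example: CSF (dark) vs. white matter (bright); intensities are assumptions.
csf_mean, wm_mean = 30.0, 110.0
boundary_voxels = np.array([30.0, 50.0, 70.0, 90.0, 110.0])
csf_fractions = fractional_volume(boundary_voxels, csf_mean, wm_mean)

voxel_volume_mm3 = 1.0  # assume 1 mm isotropic voxels
print(csf_fractions)                            # [1. 0.75 0.5 0.25 0.]
print(csf_fractions.sum() * voxel_volume_mm3)   # CSF volume contributed
```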
Abstract:
Subcycling, or the use of different timesteps at different nodes, can be an effective way of improving the computational efficiency of explicit transient dynamic structural solutions. The method that has been most widely adopted uses a nodal partition, extending the central difference method, in which small timestep updates are performed by interpolating the displacement at neighbouring large timestep nodes. This approach leads to narrow bands of unstable timesteps, or statistical stability. It can also be in error due to lack of momentum conservation on the timestep interface. The author has previously proposed energy conserving algorithms that avoid the first problem of statistical stability. However, these sacrifice accuracy to achieve stability. An approach to conserve momentum on an element interface by adding partial velocities is considered here. Applied to extend the central difference method, this approach is simple and has accuracy advantages. The method can be programmed by summing impulses of internal forces, evaluated using local element timesteps, in order to predict a velocity change at a node. However, it is still only statistically stable, so the timestep size must be monitored adaptively for accuracy and adjusted if necessary. By replacing the central difference method with the explicit generalized alpha method, it is possible to gain stability by dissipating the high frequency response that leads to stability problems. However, coding the algorithm is less elegant, as the response depends on previous partial accelerations. Extension to implicit integration is shown to be impractical due to the neglect of remote effects of internal forces acting across a timestep interface. (C) 2002 Elsevier Science B.V. All rights reserved.
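For context, a minimal sketch of the base central difference (leapfrog) update that these subcycling schemes extend is given below, for a single spring-mass degree of freedom. The mass, stiffness and timestep are assumptions, and the impulse accumulation is only a comment-level pointer to the momentum-conserving idea, not the nodal-partition subcycling algorithm itself.

```python
import numpy as np

# Central difference (leapfrog) update for m*a + k*u = 0, illustrative only.
m, k = 1.0, 100.0                  # assumed mass and stiffness
dt = 0.9 * 2.0 / np.sqrt(k / m)    # just below the critical timestep 2/omega
n_steps = 200

u = 1.0            # initial displacement
v = 0.0            # velocity at the half step, v^{n-1/2}
impulse = 0.0      # running sum of internal-force impulses

for _ in range(n_steps):
    f_int = -k * u              # internal force at time n
    v += (f_int / m) * dt       # v^{n+1/2} = v^{n-1/2} + a^n * dt
    u += v * dt                 # u^{n+1}   = u^n       + v^{n+1/2} * dt
    # In the momentum-conserving subcycling idea, impulses like f_int*dt
    # (evaluated with local element timesteps) would be summed to predict
    # the velocity change at an interface node.
    impulse += f_int * dt

print(u, v)
```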
Abstract:
The Equilibrium Flux Method [1] is a kinetic theory based finite volume method for calculating the flow of a compressible ideal gas. It is shown here that, in effect, the method solves the Euler equations with added pseudo-dissipative terms and that it is a natural upwinding scheme. The method can be easily modified so that the flow of a chemically reacting gas mixture can be calculated. Results from the method for a one-dimensional non-equilibrium reacting flow are shown to agree well with a conventional continuum solution. Results are also presented for the calculation of a plane two-dimensional flow, at hypersonic speed, of a dissociating gas around a blunt-nosed body.
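The finite-volume framework into which a kinetic-theory flux such as the EFM fits can be sketched with a generic 1D Euler solver. The version below deliberately uses a simple Rusanov (local Lax-Friedrichs) numerical flux rather than the EFM flux, and the Sod-type initial data, grid and CFL number are illustrative assumptions.

```python
import numpy as np

# Generic 1D finite-volume solver for the Euler equations (Sod shock tube).
# NOT the Equilibrium Flux Method: a Rusanov flux stands in for the kinetic
# flux purely to illustrate the finite-volume/upwinding framework.
gamma = 1.4
nx, t_end = 400, 0.2
dx = 1.0 / nx
x = (np.arange(nx) + 0.5) * dx

rho = np.where(x < 0.5, 1.0, 0.125)
vel = np.zeros(nx)
p = np.where(x < 0.5, 1.0, 0.1)
U = np.array([rho, rho * vel, p / (gamma - 1) + 0.5 * rho * vel**2])

def flux(U):
    rho, mom, E = U
    u = mom / rho
    p = (gamma - 1) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, u * (E + p)]), u, p

t = 0.0
while t < t_end:
    F, u, p = flux(U)
    c = np.sqrt(gamma * p / U[0])
    dt = min(0.4 * dx / np.max(np.abs(u) + c), t_end - t)
    # Rusanov numerical flux at the interior cell interfaces
    smax = np.maximum(np.abs(u[:-1]) + c[:-1], np.abs(u[1:]) + c[1:])
    Fhat = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * smax * (U[:, 1:] - U[:, :-1])
    U[:, 1:-1] -= dt / dx * (Fhat[:, 1:] - Fhat[:, :-1])  # end cells held fixed
    t += dt

print(U[0, ::50])  # sampled density profile after the shock tube evolution
```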
Abstract:
A general, fast wavelet-based adaptive collocation method is formulated for heat and mass transfer problems involving a steep moving profile of the dependent variable. The technique of grid adaptation is based on sparse point representation (SPR). The method is applied and tested for the case of a gas–solid non-catalytic reaction in a porous solid at high Thiele modulus. Accurate and convergent steep profiles are obtained for Thiele modulus as large as 100 for the case of a slab, and are found to match the analytical solution.
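A minimal 1D sketch of the sparse point representation idea is shown below: odd-indexed points on a dyadic grid are retained only where the interpolating-wavelet detail (the sample minus its prediction from neighbouring coarse points) exceeds a threshold. Linear interpolation, the threshold value and the tanh test profile are assumptions for illustration, not the scheme of the paper.

```python
import numpy as np

def spr_mask(f, eps=1e-3):
    """Sparse point representation mask on a dyadic 1D grid (sketch)."""
    keep = np.zeros(f.size, dtype=bool)
    keep[::2] = True                      # coarse (even) points always kept
    interp = 0.5 * (f[:-2:2] + f[2::2])   # linear prediction at odd points
    detail = np.abs(f[1:-1:2] - interp)   # interpolating-wavelet details
    keep[1:-1:2] = detail > eps           # keep odd points with large detail
    keep[-1] = True                       # always keep the boundary point
    return keep

# A steep front, loosely mimicking a sharp profile at high Thiele modulus.
x = np.linspace(0.0, 1.0, 257)
f = np.tanh(100.0 * (x - 0.5))
mask = spr_mask(f, eps=1e-3)
print(mask.sum(), "of", x.size, "points retained, clustered near the front")
```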
Abstract:
This paper identifies research priorities in evaluating the ways in which "genomic medicine", the use of genetic information to prevent and treat disease, may reduce tobacco-related harm by: (1) assisting more smokers to quit; (2) preventing non-smokers from beginning to smoke tobacco; and (3) reducing the harm caused by tobacco smoking. The method proposed to achieve the first aim is "pharmacogenetics", the use of genetic information to optimise the selection of smoking-cessation programmes by screening smokers for polymorphisms that predict responses to different methods of smoking cessation. This method competes with the development of more effective forms of smoking cessation that involve vaccinating smokers against the effects of nicotine and using new pharmaceuticals (such as cannabinoid antagonists and nicotine agonists). The second and third aims are more speculative. They include: screening the population for genetic susceptibility to nicotine dependence and intervening (e.g., by vaccinating children and adolescents against the effects of nicotine) to prevent smoking uptake, and screening the population for genetic susceptibility to tobacco-related diseases. A framework is described for future research on these policy options. This includes: epidemiological modelling and economic evaluation to specify the conditions under which these strategies are cost-effective; and social psychological research into the effect of providing genetic information on smokers' preparedness to quit, and the general views of the public on tobacco smoking.
Abstract:
The level set method has been implemented in a computational volcanology context. New techniques are presented to solve the advection equation and the reinitialisation equation. These techniques are based upon an algorithm developed in the finite difference context, but are modified to take advantage of the robustness of the finite element method. The resulting algorithm is tested on a well documented Rayleigh–Taylor instability benchmark [19], and on an axisymmetric problem where the analytical solution is known. Finally, the algorithm is applied to a basic study of lava dome growth.
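The level set advection step can be illustrated with a very small 1D finite difference sketch (first-order upwind). This is not the finite element algorithm of the paper, and the advection speed, grid and interface location are assumptions.

```python
import numpy as np

# 1D first-order upwind advection of a level set function: phi_t + u*phi_x = 0.
nx = 200
dx = 1.0 / nx
x = (np.arange(nx) + 0.5) * dx
u = 0.3                        # assumed constant advection speed (u > 0)
dt = 0.5 * dx / abs(u)         # CFL-limited timestep

phi = x - 0.25                 # signed distance to an interface at x = 0.25

t, t_end = 0.0, 1.0
while t < t_end:
    step = min(dt, t_end - t)
    # backward (upwind) difference, valid for u > 0
    phi[1:] -= u * step / dx * (phi[1:] - phi[:-1])
    t += step

# In general |phi_x| drifts away from 1 during advection, and the
# reinitialisation equation  phi_tau + sign(phi_0)(|phi_x| - 1) = 0  is solved
# to restore the signed distance property. Here the profile stays linear, so
# the zero level set simply translates to about x = 0.25 + u*t_end = 0.55.
print(x[np.argmin(np.abs(phi))])
```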
Abstract:
Inaccurate species identification confounds insect ecological studies. Examining aspects of Trichogramma ecology pertinent to the novel insect resistance management strategy for future transgenic cotton, Gossypium hirsutum L., production in the Ord River Irrigation Area (ORIA) of Western Australia required accurate differentiation between morphologically similar Trichogramma species. Established molecular diagnostic methods for Trichogramma identification use species-specific sequence difference in the internal transcribed spacer (ITS)-2 chromosomal region; yet, difficulties arise discerning polymerase chain reaction (PCR) fragments of similar base pair length by gel electrophoresis. This necessitates the restriction enzyme digestion of PCR-amplified ITS-2 fragments to readily differentiate Trichogramma australicum Girault and Trichogramma pretiosum Riley. To overcome the time and expense associated with a two-step diagnostic procedure, we developed a “one-step” multiplex PCR technique using species-specific primers designed to the ITS-2 region. This approach allowed for a high-throughput analysis of samples as part of ongoing ecological studies examining Trichogramma biological control potential in the ORIA where these two species occur in sympatry.
Abstract:
A narrow absorption feature in an atomic or molecular gas (such as iodine or methane) is used as the frequency reference in many stabilized lasers. As part of the stabilization scheme an optical frequency dither is applied to the laser. In optical heterodyne experiments, this dither is transferred to the RF beat signal, reducing the spectral power density and hence the signal to noise ratio over that in the absence of dither. We removed the dither by mixing the raw beat signal with a dithered local oscillator signal. When the dither waveform is matched to that of the reference laser the output signal from the mixer is rendered dither free. Application of this method to a Winters iodine-stabilized helium-neon laser reduced the bandwidth of the beat signal from 6 MHz to 390 kHz, thereby lowering the detection threshold from 5 pW of laser power to 3 pW. In addition, a simple signal detection model is developed which predicts similar threshold reductions.
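The cancellation mechanism can be sketched numerically: a beat signal and a local oscillator carrying the same phase dither are multiplied, and the difference-frequency product is dither free. The frequencies, modulation index and sample rate below are illustrative assumptions, not the experimental values.

```python
import numpy as np

# Dither removal by mixing with a matched-dither local oscillator (sketch).
fs = 1.0e6                                  # sample rate (Hz), assumed
t = np.arange(0, 0.05, 1 / fs)
f_beat, f_lo, f_dither = 100e3, 80e3, 1e3   # illustrative frequencies
beta = 20.0                                 # phase index produced by the dither

dither_phase = beta * np.sin(2 * np.pi * f_dither * t)
beat = np.cos(2 * np.pi * f_beat * t + dither_phase)   # dithered beat signal
lo = np.cos(2 * np.pi * f_lo * t + dither_phase)       # matched-dither LO

mixed = beat * lo
# cos(A)cos(B) = 0.5[cos(A-B) + cos(A+B)]: the difference term at
# f_beat - f_lo = 20 kHz has the common dither cancelled, while the sum term
# (near 180 kHz, with doubled dither) can be removed by filtering.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(freqs[np.argmax(spectrum)])   # ~20 kHz: a narrow, dither-free line
```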
Abstract:
Clifford Geertz was best known for his pioneering excursions into symbolic or interpretive anthropology, especially in relation to Indonesia. Less well recognised are his stimulating explorations of the modern economic history of Indonesia. His thinking on the interplay of economics and culture was most fully and vigorously expounded in Agricultural Involution. That book deployed a succinctly packaged past in order to solve a pressing contemporary puzzle: Java's enduring rural poverty and apparent social immobility. Initially greeted with acclaim, the book later, and ironically, stimulated the deep and multi-layered research that led to the eventual rejection of Geertz's central contentions. But the veracity or otherwise of Geertz's inventive characterisation of Indonesian economic development now seems irrelevant; what is profoundly important is the extraordinary stimulus he gave to a generation of scholars to explore Indonesia's modern economic history with a depth and intensity previously unimaginable.
Abstract:
In this review we demonstrate how the algebraic Bethe ansatz is used for the calculation of the energy spectra and form factors (operator matrix elements in the basis of Hamiltonian eigenstates) in exactly solvable quantum systems. As examples we apply the theory to several models of current interest in the study of Bose-Einstein condensates, which have been successfully created using ultracold dilute atomic gases. The first model we introduce describes Josephson tunnelling between two coupled Bose-Einstein condensates. It can be used not only for the study of tunnelling between condensates of atomic gases, but also for solid state Josephson junctions and coupled Cooper pair boxes. The theory is also applicable to models of atomic-molecular Bose-Einstein condensates, with two examples given and analysed. Additionally, these same two models are relevant to studies in quantum optics. Finally, we discuss the model of Bardeen, Cooper and Schrieffer in this framework, which is appropriate for systems of ultracold fermionic atomic gases, as well as being applicable to the description of superconducting correlations in metallic grains with nanoscale dimensions. In applying all the above models to physical situations, the need for an exact analysis of small-scale systems is established due to large quantum fluctuations which render mean-field approaches inaccurate.
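As a point of comparison for such small-scale systems, a canonical two-mode boson tunnelling (Josephson) Hamiltonian can also be diagonalised by brute force in the Fock basis. The sketch below does exactly that; it is not the algebraic Bethe ansatz construction reviewed here, and the parameter values are chosen purely for illustration.

```python
import numpy as np

# Brute-force exact diagonalisation of a two-mode boson tunnelling Hamiltonian,
#   H = (K/8)(N1 - N2)^2 - (dE/2)(N1 - N2) - (EJ/2)(a1^dag a2 + a2^dag a1),
# for fixed total particle number N, in the Fock basis |n, N - n>.
def josephson_spectrum(N, K=1.0, dE=0.0, EJ=1.0):
    dim = N + 1
    H = np.zeros((dim, dim))
    for n in range(dim):                       # n bosons in mode 1
        imbalance = 2 * n - N                  # N1 - N2
        H[n, n] = K / 8.0 * imbalance**2 - dE / 2.0 * imbalance
        if n < N:                              # a1^dag a2 : |n> -> |n + 1>
            hop = -EJ / 2.0 * np.sqrt((n + 1) * (N - n))
            H[n + 1, n] = hop
            H[n, n + 1] = hop                  # Hermitian conjugate
    return np.linalg.eigvalsh(H)

energies = josephson_spectrum(N=20, K=1.0, dE=0.0, EJ=1.0)
print(energies[:5])   # lowest few levels of the 21-dimensional problem
```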
Abstract:
In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
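The role of the tilting parameter can be illustrated with the classical fixed change of measure for the M/M/1 queue, in which the arrival and service rates are interchanged. The sketch below estimates the probability of reaching buffer level B within a busy cycle under this fixed tilt; it is not the adaptive three-stage cross-entropy procedure proposed in the paper, and the rates, buffer level and replication count are assumptions.

```python
import numpy as np

# Importance sampling for the probability that an M/M/1 queue, starting with
# one customer, reaches buffer level B before emptying (overflow in a busy
# cycle), simulated via the embedded jump chain. The change of measure is the
# classical fixed tilt that interchanges arrival and service rates.
rng = np.random.default_rng(0)
lam, mu, B, n_rep = 0.5, 1.0, 25, 20_000     # illustrative parameters

p = lam / (lam + mu)        # original prob. that the next event is an arrival
p_tilt = mu / (lam + mu)    # tilted (swapped) arrival probability

estimates = np.empty(n_rep)
for i in range(n_rep):
    level, log_lr = 1, 0.0
    while 0 < level < B:
        if rng.random() < p_tilt:            # arrival under the tilted law
            level += 1
            log_lr += np.log(p / p_tilt)
        else:                                # departure under the tilted law
            level -= 1
            log_lr += np.log((1 - p) / (1 - p_tilt))
    estimates[i] = np.exp(log_lr) if level == B else 0.0

est = estimates.mean()
rel_err = estimates.std(ddof=1) / np.sqrt(n_rep) / est
exact = (1 - mu / lam) / (1 - (mu / lam) ** B)   # gambler's-ruin check
print(est, exact, rel_err)
```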