50 results for multi-method study
Abstract:
A kinetic-theory-based Navier-Stokes solver has been implemented on a parallel supercomputer (Intel iPSC Touchstone Delta) to study the leeward flowfield of a blunt-nosed delta wing at 30-deg incidence at hypersonic speeds (similar to the proposed HERMES aerospace plane). Computational results are presented for a series of grids for both inviscid and laminar viscous flows at Reynolds numbers of 225,000 and 2.25 million. In addition, comparisons are made between the present calculations and two independent calculations of the same flows (by L. LeToullec and P. Guillen, and by S. Menne) presented at the Workshop on Hypersonic Flows for Re-entry Problems, Antibes, France, 1991.
Abstract:
The level set method has been implemented in a computational volcanology context. New techniques are presented to solve the advection equation and the reinitialisation equation. These techniques are based upon an algorithm developed in the finite-difference context, but are modified to take advantage of the robustness of the finite element method. The resulting algorithm is tested on a well-documented Rayleigh–Taylor instability benchmark [19], and on an axisymmetric problem where the analytical solution is known. Finally, the algorithm is applied to a basic study of lava dome growth.
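For reference, the two equations named above take their standard level set forms (the paper's particular finite element discretisation is not reproduced here): the level set field \(\phi\) is advected with the flow velocity and periodically restored to a signed distance function,

\[
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = 0,
\qquad
\frac{\partial \phi}{\partial \tau} = S(\phi_{0})\left(1 - \lvert\nabla\phi\rvert\right),
\]

where \(\tau\) is a pseudo-time and \(S(\phi_{0})\) is a smoothed sign function of the field before reinitialisation.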
Abstract:
The research reported here draws on a study of five teenagers from a Dinka-speaking community of Sudanese settling in Australia. A range of factors including language proficiency, social network structure and language attitudes are examined as possible causes for the variability of language use. The results and discussion illustrate how the use of a triangulated research approach captured the complexity of the participants' language situation and was critical to developing a full understanding of the interplay of factors influencing the teens' language maintenance and shift in a way that no single method could. Further, it shows that the employment of different methodologies allowed for flexibility in data collection to ensure the fullest response from participants. Overall, this research suggests that for studies of non-standard communities, variability in research methods may prove more of a strength than the use of standardised instruments and approaches.
Abstract:
Rupture of a light cellophane diaphragm in an expansion tube has been studied by an optical method. The influence of the light diaphragm on test flow generation has long been recognised; however, the diaphragm rupture mechanism is less well known. It has been previously postulated that the diaphragm ruptures around its periphery due to the dynamic pressure loading of the shock wave, with the diaphragm material at some stage being removed from the flow to allow the shock to accelerate to the measured speeds downstream. The images obtained in this series of experiments are the first to show the mechanism of diaphragm rupture and mass removal in an expansion tube. A light diaphragm was impulsively loaded by a shock wave, and a series of images was recorded holographically throughout the rupture process, showing the gradual destruction of the diaphragm. Features such as the diaphragm material, the interface between gases, and a reflected shock were clearly visualised. Both qualitative and quantitative aspects of the rupture dynamics were derived from the images and compared with existing one-dimensional theory.
Abstract:
Clifford Geertz was best known for his pioneering excursions into symbolic or interpretive anthropology, especially in relation to Indonesia. Less well recognised are his stimulating explorations of the modern economic history of Indonesia. His thinking on the interplay of economics and culture was most fully and vigorously expounded in Agricultural Involution. That book deployed a succinctly packaged past in order to solve a pressing contemporary puzzle: Java's enduring rural poverty and apparent social immobility. Initially greeted with acclaim, the book later, and ironically, stimulated the deep and multi-layered research that eventually led to the rejection of Geertz's central contentions. But the veracity or otherwise of Geertz's inventive characterisation of Indonesian economic development now seems irrelevant; what is profoundly important is the extraordinary stimulus he gave to a generation of scholars to explore Indonesia's modern economic history with a depth and intensity previously unimaginable.
Abstract:
In this review we demonstrate how the algebraic Bethe ansatz is used for the calculation of the energy spectra and form factors (operator matrix elements in the basis of Hamiltonian eigenstates) in exactly solvable quantum systems. As examples, we apply the theory to several models of current interest in the study of Bose-Einstein condensates, which have been successfully created using ultracold dilute atomic gases. The first model we introduce describes Josephson tunnelling between two coupled Bose-Einstein condensates. It can be used not only for the study of tunnelling between condensates of atomic gases, but also for solid-state Josephson junctions and coupled Cooper-pair boxes. The theory is also applicable to models of atomic-molecular Bose-Einstein condensates, with two examples given and analysed. Additionally, these same two models are relevant to studies in quantum optics. Finally, we discuss the model of Bardeen, Cooper and Schrieffer in this framework, which is appropriate for systems of ultracold fermionic atomic gases, as well as being applicable to the description of superconducting correlations in metallic grains with nanoscale dimensions. In applying all the above models to physical situations, the need for an exact analysis of small-scale systems is established, due to large quantum fluctuations which render mean-field approaches inaccurate.
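As a point of reference, the two-coupled-condensate (canonical Josephson) model mentioned first is conventionally written in the two-mode form below; notation and scalings vary between treatments, so this should be read as indicative rather than as the review's exact expression:

\[
H = \frac{K}{8}\left(N_{1}-N_{2}\right)^{2} - \frac{\Delta\mu}{2}\left(N_{1}-N_{2}\right) - \frac{E_{J}}{2}\left(a_{1}^{\dagger}a_{2} + a_{2}^{\dagger}a_{1}\right),
\]

where \(N_{i}=a_{i}^{\dagger}a_{i}\), \(K\) is the interaction (charging) energy, \(\Delta\mu\) an external potential bias and \(E_{J}\) the tunnelling amplitude.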
Abstract:
In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
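A minimal sketch of the three-stage idea, written here for the embedded random walk of an M/M/1 queue; the parametrisation (tilting the up-step probability), rates and sample sizes are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_cycle(p_tilt, p, L):
    """One busy cycle of the embedded M/M/1 walk under tilted up-step
    probability p_tilt, starting from one customer. Returns the overflow
    indicator, the likelihood ratio, and the up/down step counts."""
    x, up, down = 1, 0, 0
    while 0 < x < L:
        if rng.random() < p_tilt:
            x, up = x + 1, up + 1      # arrival
        else:
            x, down = x - 1, down + 1  # departure
    w = (p / p_tilt) ** up * ((1 - p) / (1 - p_tilt)) ** down
    return x >= L, w, up, down

def ce_update(p_tilt, p, L, n=2000):
    """One cross-entropy step: re-estimate the tilting parameter from
    likelihood-ratio-weighted samples that reached the buffer level."""
    num = den = 0.0
    for _ in range(n):
        hit, w, up, down = simulate_cycle(p_tilt, p, L)
        if hit:
            num += w * up
            den += w * (up + down)
    return num / den if den > 0 else p_tilt

lam, mu, L_small, L = 1.0, 2.0, 5, 30
p = lam / (lam + mu)                    # nominal up-step probability

v = ce_update(p, p, L_small)            # stage 1: tilt for a small buffer level
v = ce_update(v, p, L)                  # stage 2: refine at the actual (large) level
runs = [simulate_cycle(v, p, L) for _ in range(20000)]
gamma = np.mean([w if hit else 0.0 for hit, w, _, _ in runs])  # stage 3: IS estimate

r = (1 - p) / p                         # exact gambler's-ruin value for comparison
print(f"IS estimate {gamma:.3e}, exact {(r - 1) / (r**L - 1):.3e}, tilt {v:.3f}")
```

For the M/M/1 case the overflow probability is known in closed form (used above only as a check), which is what makes it a convenient test bed before conjecturing that the same behaviour holds for more general networks.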
Abstract:
A reversible linear master equation model is presented for pressure- and temperature-dependent bimolecular reactions proceeding via multiple long-lived intermediates. This kinetic treatment, which applies when the reactions are measured under pseudo-first-order conditions, facilitates accurate and efficient simulation of the time dependence of the populations of reactants, intermediate species and products. Detailed exploratory calculations have been carried out to demonstrate the capabilities of the approach, with applications to the bimolecular association reaction C3H6 + H ⇌ C3H7 and the bimolecular chemical activation reaction C2H2 + ¹CH2 → C3H3 + H. The efficiency of the method can be dramatically enhanced through use of a diffusion approximation to the master equation, and a methodology for exploiting the sparse structure of the resulting rate matrix is established.
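A minimal sketch of propagating a linear master equation dp/dt = M p with a sparse rate matrix; the toy three-species scheme and rate constants below are illustrative stand-ins for the energy-grained matrices treated in the paper:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import expm_multiply

# Toy pseudo-first-order scheme A <-> B -> C; columns of M sum to zero
# so that total population is conserved.
k_ab, k_ba, k_bc = 2.0, 1.0, 0.5          # illustrative rate coefficients, s^-1
M = sp.csr_matrix([
    [-k_ab,  k_ba,          0.0],
    [ k_ab, -(k_ba + k_bc), 0.0],
    [ 0.0,   k_bc,          0.0],
])

p0 = np.array([1.0, 0.0, 0.0])            # all population starts as reactant A
# expm_multiply evaluates exp(M t) @ p0 on a time grid without forming the
# dense matrix exponential, which is what makes sparse rate matrices tractable.
traj = expm_multiply(M, p0, start=0.0, stop=5.0, num=6, endpoint=True)
for t, pop in zip(np.linspace(0.0, 5.0, 6), traj):
    print(f"t = {t:3.1f} s   populations = {np.round(pop, 4)}")
```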
Abstract:
The restructuring of power industries has brought fundamental changes to both power system operation and planning. This paper presents a new planning method that uses a multi-objective optimization (MOOP) technique, together with human knowledge, to expand the transmission network in open access schemes. The method starts with a candidate pool of feasible expansion plans. Subsequent selection of the best candidates is carried out through a MOOP approach, in which multiple objectives are tackled simultaneously, aiming at integrating market operation and planning as one unified process in the context of a deregulated system. Human knowledge is applied in both stages to ensure that the selection reflects practical engineering and management concerns. The expansion plan from MOOP is assessed against reliability criteria before it is finalized. The proposed method has been tested on the IEEE 14-bus system, and relevant analyses and discussions are presented.
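A minimal sketch of the non-dominated (Pareto) filtering step that a MOOP-based selection of candidate plans relies on; the objectives, candidate plans and numbers below are hypothetical and not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    cost: float           # investment cost (to be minimised)
    congestion: float     # expected congestion cost (to be minimised)
    eens: float           # expected energy not served (to be minimised)

def dominates(a: Plan, b: Plan) -> bool:
    """a dominates b if it is no worse on every objective and strictly
    better on at least one (all objectives are minimised)."""
    oa, ob = (a.cost, a.congestion, a.eens), (b.cost, b.congestion, b.eens)
    return all(x <= y for x, y in zip(oa, ob)) and any(x < y for x, y in zip(oa, ob))

def pareto_front(pool):
    """Keep only the non-dominated candidate plans."""
    return [p for p in pool if not any(dominates(q, p) for q in pool)]

pool = [
    Plan("add line 2-4",          40.0, 12.0, 0.08),
    Plan("add line 1-5",          55.0,  9.0, 0.05),
    Plan("add line 3-4",          60.0, 13.0, 0.09),  # dominated by "add line 1-5"
    Plan("add lines 2-4 and 4-5", 70.0, 15.0, 0.04),
]
for p in pareto_front(pool):
    print(p.name)
```

Human-knowledge screening, as described in the abstract, would then operate on this reduced front rather than on the full candidate pool.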
Abstract:
This paper describes U2DE, a finite-volume code that numerically solves the Euler equations. The code was used to perform multi-dimensional simulations of the gradual opening of a primary diaphragm in a shock tube. From the simulations, the speed of the developing shock wave was recorded and compared with other estimates. The ability of U2DE to compute shock speed was confirmed by comparing numerical results with the analytic solution for an ideal shock tube. For high initial pressure ratios across the diaphragm, previous experiments have shown that the measured shock speed can exceed the shock speed predicted by one-dimensional models. The shock speeds computed with the present multi-dimensional simulation were higher than those estimated by previous one-dimensional models and, thus, were closer to the experimental measurements. This indicates that multi-dimensional flow effects were partly responsible for the relatively high shock speeds measured in the experiments.
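For context, the ideal one-dimensional benchmark referred to is the classical shock-tube relation; a sketch of evaluating it numerically is given below, with gas conditions chosen purely for illustration (this is not the U2DE code):

```python
import numpy as np
from scipy.optimize import brentq

def shock_speed(p4p1, a1, a4, g1=1.4, g4=1.4):
    """Shock speed from the ideal 1-D shock-tube relation, given the
    diaphragm pressure ratio p4/p1 and the driver/driven sound speeds."""
    def residual(p2p1):
        term = (g4 - 1.0) * (a1 / a4) * (p2p1 - 1.0) / np.sqrt(
            2.0 * g1 * (2.0 * g1 + (g1 + 1.0) * (p2p1 - 1.0)))
        if term >= 1.0:          # beyond the infinite-pressure-ratio limit
            return 1e12
        return p2p1 * (1.0 - term) ** (-2.0 * g4 / (g4 - 1.0)) - p4p1
    p2p1 = brentq(residual, 1.0 + 1e-9, p4p1)   # pressure ratio across the shock
    Ms = np.sqrt(1.0 + (g1 + 1.0) / (2.0 * g1) * (p2p1 - 1.0))
    return Ms * a1

# Illustrative only: air driving air, both initially at room temperature.
print(f"Ideal shock speed: {shock_speed(100.0, 347.0, 347.0):.0f} m/s")
```

As the abstract notes, the multi-dimensional effects of a gradually opening diaphragm push the computed shock speed above this idealised one-dimensional value.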
Abstract:
The large fat globules that can be present in UHT milk due to inadequate homogenisation cause a cream layer to form that limits the shelf life of UHT milk. Four different particle size measurement techniques were used to measure the size of fat globules in poorly homogenised UHT milk processed in a UHT pilot plant. The thickness of the cream layer that formed during storage was negatively correlated with homogenisation pressure. It was positively correlated with the mass mean diameter and the percentage volume of particles between 1.5 and 2 μm in diameter, as determined by laser light scattering using the Malvern Mastersizer. Also, the thickness of the cream layer was positively correlated with the volume mode diameter and the percentage volume of particles between 1.5 and 2 μm in diameter, as determined by electrical impedance using the Coulter Counter. The cream layer thickness did not correlate significantly with the Coulter Counter measurements of volume mean diameter, or with volume percentages of particles between 2 and 5 μm or 5 and 10 μm in diameter. Spectroturbidimetry (Emulsion Quality Analyser) and light microscopy analyses were found to be unsuitable for assessing the size of the fat particles. This study suggests that the fat globule size distribution as determined by the electrical impedance method (Coulter Counter) is the most useful for determining the efficiency of homogenisation and therefore for predicting the stability of the fat emulsion in UHT milk during storage.
Abstract:
Algorithms for explicit integration of structural dynamics problems with multiple time steps (subcycling) are investigated. Only one such algorithm, due to Smolinski and Sleith, has proved to be stable in a classical sense. A simplified version of this algorithm that retains its stability is presented. However, as with the original version, it can be shown to sacrifice accuracy to achieve stability. Another algorithm in use is shown to be only statistically stable, in that a probability of stability can be assigned if appropriate time step limits are observed. This probability improves rapidly with the number of degrees of freedom in a finite element model. The stability problems are shown to be a property of the central difference method itself, which is modified to give the subcycling algorithm. A related problem is shown to arise when a constraint equation in time is introduced into a time-continuous space-time finite element model.
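For context, the undamped explicit central difference update that these subcycling schemes modify can be written as

\[
\mathbf{u}^{\,n+1} = 2\,\mathbf{u}^{\,n} - \mathbf{u}^{\,n-1} + \Delta t^{2}\,\mathbf{M}^{-1}\!\left(\mathbf{f}^{\,n} - \mathbf{K}\,\mathbf{u}^{\,n}\right),
\]

with the familiar stability limit \(\Delta t \le 2/\omega_{\max}\), where \(\omega_{\max}\) is the highest natural frequency of the discretised model; subcycling applies this update with different time steps in different subdomains, which is where the stability questions examined above arise.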