422 results for QA76
Abstract:
We consider the problem of finding the heat distribution and the shape of the liquid fraction during laser welding of a thick steel plate using the finite volume CFD package PHYSICA. Since the shape of the keyhole is not known in advance, the following two-stage approach to handling this problem has been employed. In the first stage, we determine the geometry of the keyhole for the steady-state case and form an appropriate mesh that includes both the workpiece and the keyhole. In the second stage, we impose the boundary conditions by assigning a temperature to the walls of the keyhole and find the heat distribution and the shape of the liquid fraction for a given welding speed and material properties. We construct a fairly accurate approximation of the keyhole as a sequence of sliced cones. A formula for finding the initial radius of the keyhole is derived by determining the radius of the vaporisation isotherm for the line heat source. We report on the results of a series of computational experiments for various heat input values and welding velocities.
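The idea of taking the initial keyhole radius from the vaporisation isotherm of a line heat source can be sketched numerically. The sketch below assumes Rosenthal's classical moving line-source temperature field; the function name, bracketing interval and the mild-steel property values are illustrative assumptions, not figures from the paper.

```python
import numpy as np
from scipy.special import k0          # modified Bessel function K0
from scipy.optimize import brentq

def keyhole_radius(q, v, k, alpha, d, T_vap, T0):
    """Radius, perpendicular to the travel direction, at which the
    Rosenthal line-source field reaches the vaporisation temperature.

    Solves (q / (2*pi*k*d)) * K0(v*r / (2*alpha)) = T_vap - T0 for r.
    """
    target = (T_vap - T0) * 2.0 * np.pi * k * d / q   # required K0 value
    f = lambda r: k0(v * r / (2.0 * alpha)) - target
    # K0 is monotonically decreasing, so a single root lies in the bracket.
    return brentq(f, 1e-9, 1.0)

# Illustrative values for a steel plate (assumed, not from the paper):
r0 = keyhole_radius(q=4000.0,    # absorbed power, W
                    v=0.02,      # welding speed, m/s
                    k=30.0,      # thermal conductivity, W/(m K)
                    alpha=7e-6,  # thermal diffusivity, m^2/s
                    d=0.01,      # plate thickness, m
                    T_vap=3100.0, T0=300.0)
print(f"initial keyhole radius = {r0 * 1e3:.2f} mm")
```

Doubling the absorbed power lowers the required K0 value and therefore enlarges the radius, which matches the expected trend of keyhole size with heat input.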
Abstract:
A new approach to the prediction of bend lifetime in pneumatic conveyors subject to erosive wear is described. Mathematical modelling is exploited: commercial Computational Fluid Dynamics (CFD) software is used for the prediction of air flow and particle tracks, and custom code for the modelling of bend erosion and lifetime prediction. The custom code uses a toroidal geometry and employs a range of empirical data rather than trying to fit classical erosion models to a particular circumstance. The data used were obtained relatively quickly and easily from a gas-blast erosion tester. A full-scale pneumatic conveying rig was used to validate a sample of the bend lifetime predictions, and the results suggest accuracy of within ±65% using calibration methods. Finally, the work is distilled into user-friendly interactive software that will make erosion lifetime predictions for a wide range of bends under varying conveying conditions. This could be a valuable tool for the pneumatic conveyor design or maintenance engineer.
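The empirical-data approach can be illustrated with a small sketch: interpolate a wear rate over impact angle from gas-blast tester measurements, then convert the available wall thickness into a lifetime. All names, numbers and the linear wear model below are illustrative assumptions, not the paper's custom code.

```python
import bisect

def bend_lifetime_hours(impacts, wall_thickness_mm, angle_deg, rate_table):
    """Sketch of an empirical lifetime estimate.

    impacts          : particle mass striking the worst-wear zone, kg/hour
    wall_thickness_mm: wall available before penetration, mm
    rate_table       : sorted (impact angle deg, mm lost per kg) pairs
                       measured on a gas-blast erosion tester
    """
    angles = [a for a, _ in rate_table]
    rates = [r for _, r in rate_table]
    i = bisect.bisect_left(angles, angle_deg)
    if i == 0:                       # below the measured range: clamp
        rate = rates[0]
    elif i == len(angles):           # above the measured range: clamp
        rate = rates[-1]
    else:                            # linear interpolation between points
        a0, a1 = angles[i - 1], angles[i]
        w = (angle_deg - a0) / (a1 - a0)
        rate = rates[i - 1] * (1 - w) + rates[i] * w
    return wall_thickness_mm / (rate * impacts)

# Tester data for a ductile steel (invented for illustration): wear
# peaks at shallow-to-moderate impact angles.
table = [(10, 0.002), (20, 0.005), (30, 0.006), (45, 0.004), (90, 0.002)]
life = bend_lifetime_hours(impacts=1.5, wall_thickness_mm=4.0,
                           angle_deg=25, rate_table=table)
print(f"predicted lifetime = {life:.0f} h")
```

Using measured rates directly, rather than fitting a classical erosion law, is exactly the design choice the abstract describes: the table carries the material-specific behaviour, and the code only interpolates it.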
Abstract:
When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe from an evacuation point of view? In the wake of major maritime disasters such as the Herald of Free Enterprise and the Estonia, and in light of the growth in the numbers of high-density, high-speed ferries and large-capacity cruise ships, issues concerned with the evacuation of passengers and crew at sea are receiving renewed interest. In the maritime industry, ship evacuation models offer the promise to bring evacuation considerations quickly and efficiently into the design phase, while the ship is "on the drawing board". maritimeEXODUS, winner of the BCS, CITIS and RINA awards, is such a model. Features such as the ability to realistically simulate human response to fire, the capability to model human performance in heeled orientations, a virtual reality environment that produces realistic visualisations of the modelled scenarios, and an integrated abandonment model make maritimeEXODUS a truly unique tool for assessing the evacuation capabilities of all types of vessels under a variety of conditions. This paper describes the maritimeEXODUS model, the SHEBA facility from which data concerning passenger/crew performance in conditions of heel are derived, and an example application demonstrating the model's use in performing an evacuation analysis for a large passenger ship, partially based on the requirements of MSC circular 1033.
Abstract:
In this paper, the continuous casting process for steel slab production is modelled using a multi-physics approach. For this purpose, a Finite Volume (FV) numerical model was constructed in 3D with the following characteristics: time-dependent, turbulent fluid flow and heat transfer in the molten steel and flux regions; solidification of the skin layer under prescribed heat loss boundary conditions; particle tracking simulation of argon bubbles injected with the metal into the mould; full coupling between bubbles and liquid through buoyancy and interfacial forces using a novel gas accumulation technique; and a full transient simulation of flux-metal interface behaviour under the influence of gravity, fluid inertial forces and bubble plume buoyancy. The unstructured-mesh FV code PHYSICA, developed at Greenwich, was used to carry out the simulations, with physical process data and properties supplied by IRSID SA.
Abstract:
The computational modelling of metal forming processes is now well established. In this work
Abstract:
In this paper the use of free-surface techniques, within the framework of a finite volume methodology, is investigated for the simulation of metal forming processes. In such processes, for example extrusion and forging, a workpiece is subjected to large scale deformation to create the product's shape. The use of Eulerian free-surface techniques to predict this final shape offers the advantage, over the traditionally used Lagrangian finite element method, of not requiring remeshing. Two free-surface techniques are compared by modelling a typical example of this type of process: non-Newtonian extrusion of an aluminium workpiece through a conical die.
Abstract:
The computational modelling of extrusion and forging processes is now well established. There are two main approaches: Lagrangian and Eulerian. The first has considerable complexities associated with remeshing, especially when the code is parallelised. The second approach means that the mould has to be assumed to be entirely rigid and this may not be the case. In this paper, a novel approach is described which utilises finite volume methods on unstructured meshes. This approach involves the solution of free surface non-Newtonian fluid flow equations in an Eulerian context to track the behaviour of the workpiece and its extrusion/forging, and the solution of the solid mechanics equations in the Lagrangian context to predict the deformation/stress behaviour of the die. Test cases for modelling extrusion and forging problems using this approach will be presented.
Abstract:
Social network analysts have tried to capture the idea of social role explicitly by proposing a framework that gives precise conditions under which groups of actors play equivalent roles. They term these methods positional analysis techniques. The most general definition is regular equivalence, which captures the idea that equivalent actors are related in a similar way to equivalent alters. Regular equivalence gives rise to a whole class of partitions on a network. Given a network, we have two different computational problems. The first is how to find a particular regular equivalence. An algorithm exists to find the largest regular partition, but there are no efficient algorithms to test whether there is a regular k-partition, that is, a partition into k groups that is regular. In addition, when dealing with real data, it is unlikely that any regular partitions exist. To overcome this problem, relaxations of regular equivalence have been proposed, along with optimisation techniques to find nearly regular partitions. In this paper we review the algorithms that have been developed to find particular regular equivalences and look at some of the recent theoretical results which give an insight into the complexity of finding regular partitions.
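The largest regular partition mentioned above can be found by iterative refinement: start with all actors in one class and repeatedly split classes whose members see different sets of neighbouring classes. This is a minimal sketch in the spirit of such refinement algorithms; the example graph is hypothetical.

```python
def maximal_regular_partition(nodes, edges):
    """Largest regular equivalence of a directed graph.

    Two nodes stay in the same class while the *sets* of classes of
    their out-neighbours and of their in-neighbours agree; classes are
    split until a fixed point is reached.
    """
    nodes = list(nodes)
    succ = {u: set() for u in nodes}
    pred = {u: set() for u in nodes}
    for u, v in edges:
        succ[u].add(v)
        pred[v].add(u)
    cls = {u: 0 for u in nodes}              # start with one class
    while True:
        # Signature: which classes a node sends to and receives from.
        sig = {u: (frozenset(cls[v] for v in succ[u]),
                   frozenset(cls[v] for v in pred[u])) for u in nodes}
        relabel, new = {}, {}
        for u in nodes:                       # deterministic relabelling
            relabel.setdefault(sig[u], len(relabel))
            new[u] = relabel[sig[u]]
        if new == cls:                        # fixed point: partition is regular
            break
        cls = new
    blocks = {}
    for u in nodes:
        blocks.setdefault(cls[u], []).append(u)
    return sorted(sorted(b) for b in blocks.values())

# A two-level hierarchy (hypothetical): one head, two managers, four
# workers. Regular equivalence recovers the three roles.
edges = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]
print(maximal_regular_partition(range(7), edges))
```

Note that testing for an arbitrary regular k-partition is the hard problem the abstract refers to; the refinement above only delivers the coarsest regular partition, not partitions of a prescribed size.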
Abstract:
The problems encountered when using traditional rectangular pulse hierarchical point process models for fine temporal resolution, and the growing number of available tip-time records, suggest that rainfall increments from tipping-bucket gauges be modelled directly. Poisson processes are used with an arrival rate modulated by a Markov chain in continuous time. The paper shows how, by using two or three states for this chain, much of the structure of the rainfall intensity distribution and the wet/dry sequences can be represented for time-scales as small as 5 minutes.
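A two-state Markov-modulated Poisson process of the kind described can be simulated with competing exponential clocks: at each step the next event is either a bucket tip or a regime switch. This is a minimal sketch; the rates below are illustrative, not fitted values from the paper.

```python
import random

def simulate_mmpp(switch_rate, tip_rate, t_end, seed=0):
    """Tip times of a Poisson process whose arrival rate is modulated
    by a continuous-time Markov chain with two states (0 = dry, 1 = wet).

    switch_rate[s] : rate of leaving state s, per hour
    tip_rate[s]    : bucket-tip arrival rate while in state s, per hour
    """
    rng = random.Random(seed)
    t, state, tips = 0.0, 0, []
    while t < t_end:
        total = switch_rate[state] + tip_rate[state]
        t += rng.expovariate(total)        # time to the next event of either kind
        if t >= t_end:
            break
        if rng.random() < tip_rate[state] / total:
            tips.append(t)                 # a bucket tip is recorded
        else:
            state = 1 - state              # the chain switches regime
    return tips

# Long dry spells with rare tips, wet spells with frequent tips:
tips = simulate_mmpp(switch_rate=[0.1, 0.5], tip_rate=[0.01, 12.0],
                     t_end=1000.0)
print(len(tips), "tips in 1000 h")
```

Aggregating the returned tip times into 5-minute bins gives exactly the kind of fine-resolution increment series the abstract proposes to model directly.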
Abstract:
In attempts to conserve the species diversity of trees in tropical forests, monitoring of diversity in inventories is essential. For effective monitoring it is crucial to be able to make meaningful comparisons between different regions, or comparisons of the diversity of a region at different times. Many species diversity measures have been defined, including the well-known abundance and entropy measures. All such measures share a number of problems in their effective practical use. Probably the most problematic is that they cannot be used to meaningfully assess changes, since they are only concerned with the number of species or the proportions of the population/sample which they constitute. A natural (though simplistic) model of a species frequency distribution is the multinomial distribution. It is shown that the likelihood analysis of samples from such a distribution is closely related to a number of entropy-type measures of diversity. Hence a comparison of the species distributions on two plots, using the multinomial model and likelihood methods, leads to generalised cross-entropy as the likelihood ratio test statistic of the null hypothesis that the species distributions are the same. Data from 30 contiguous plots in a forest in Sumatra are analysed using these methods. Significance tests between all pairs of plots yield extremely low p-values, indicating strongly that it ought to be "obvious" that the observed species distributions differ between plots. In terms of how different the plots are, and how these differences vary over the whole study site, a display of the degrees of freedom of the test (equivalent to the number of shared species) seems to be the most revealing indicator, as well as the simplest.
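The generalised cross-entropy statistic can be sketched for a pair of plots as a standard G-test on species counts. The species names and counts below are invented for illustration, not the Sumatra data, and the chi-square reference distribution is the usual large-sample approximation.

```python
from math import log
from scipy.stats import chi2

def g_test(plot_a, plot_b):
    """Likelihood-ratio (G) test of the null hypothesis that two plots
    share one multinomial species distribution.

    G = 2 * sum O * ln(O / E), summed over plots and species, with
    expected counts E taken from the pooled species proportions.
    """
    species = set(plot_a) | set(plot_b)
    n_a, n_b = sum(plot_a.values()), sum(plot_b.values())
    g = 0.0
    for s in species:
        pooled = plot_a.get(s, 0) + plot_b.get(s, 0)
        for obs, n in ((plot_a.get(s, 0), n_a), (plot_b.get(s, 0), n_b)):
            exp = n * pooled / (n_a + n_b)
            if obs > 0:
                g += 2.0 * obs * log(obs / exp)
    df = len(species) - 1        # for two plots: species present minus one
    return g, df, chi2.sf(g, df)

plot_a = {"Shorea": 40, "Dipterocarpus": 25, "Ficus": 5}
plot_b = {"Shorea": 10, "Dipterocarpus": 30, "Ficus": 30}
g, df, p = g_test(plot_a, plot_b)
print(f"G = {g:.2f}, df = {df}, p = {p:.3g}")
```

Running such a test over all pairs of plots, and displaying the degrees of freedom alongside the p-values, mirrors the analysis the abstract describes.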
Abstract:
The concept of a “true” ground-truth map is introduced, from which the inaccuracy/error of any production map may be measured. A partition of the mapped region is defined in terms of the “residual rectification” transformation. Geometric RMS-type and geometric distortion error criteria are defined, as well as a map mis-classification error criterion (the latter for hard and fuzzy production maps). The total map error is defined to be the sum, over the sets of the map partition mentioned above, of these three error components integrated over each set.
Abstract:
The recent history and current trends in the collection and archiving of forest information and models are reviewed. The question is posed as to whether the community of forest modellers ought to take some action in setting up a Forest Model Archive (FMA) as a means of conserving and sharing the heritage of forest models that have been developed over several decades. The paper discusses the various alternatives of what an FMA could be, and should be. It then goes on to formulate a conceptual model as the basis for the construction of an FMA. Finally, the question of software architecture is considered. Again there are a number of possible solutions. We discuss the alternatives, some in considerable detail, but leave the final decisions on these issues to the forest modelling community. This paper has spawned the “Greenwich Initiative” on the FMA. An internet discussion group on the topic will be launched by the “Trafalgar Group”, which will span both IUFRO 4.1 and 4.11, and further discussion is planned to take place at the Forest Modelling Conference in Portugal, June 2002.