933 results for 3D-t computational simulation


Relevance: 20.00%

Abstract:

OctVCE is a Cartesian cell CFD code produced especially for numerical simulations of shock and blast wave interactions with complex geometries. Virtual Cell Embedding (VCE) was chosen as its Cartesian cell kernel as it is simple to code and sufficient for practical engineering design problems. This also makes the code much more ‘user-friendly’ than structured grid approaches, as the gridding process is done automatically. The CFD methodology relies on a finite-volume formulation of the unsteady Euler equations, solved using a standard explicit Godunov (MUSCL) scheme. Both octree-based adaptive mesh refinement and shared-memory parallel processing capability have also been incorporated. For further details on the theory behind the code, see the companion report 2007/12.
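The explicit Godunov (MUSCL) update mentioned in the abstract can be illustrated on the simplest possible case. The sketch below applies a minmod-limited MUSCL step to 1-D linear advection in pure Python; it is an illustrative analogue only, not OctVCE's Euler solver, and every name in it is invented for the example.

```python
# Minimal 1-D analogue of an explicit Godunov (MUSCL) update, applied to
# linear advection u_t + a u_x = 0 with a > 0 and periodic boundaries.
# Illustrative sketch only, not OctVCE code.

def minmod(a, b):
    """Slope limiter: pick the smaller-magnitude slope, zero at extrema."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def muscl_step(u, c):
    """One explicit MUSCL step at CFL number c (0 < c <= 1)."""
    n = len(u)
    # limited cell slopes
    s = [minmod(u[i] - u[i - 1], u[(i + 1) % n] - u[i]) for i in range(n)]
    # reconstructed (upwind) interface states at i + 1/2
    left = [u[i] + 0.5 * (1.0 - c) * s[i] for i in range(n)]
    # conservative update: u_i -= c * (F_{i+1/2} - F_{i-1/2})
    return [u[i] - c * (left[i] - left[i - 1]) for i in range(n)]

# advect a square pulse one step; total "mass" is conserved
u0 = [1.0 if 3 <= i <= 6 else 0.0 for i in range(20)]
u1 = muscl_step(u0, 0.5)
```

The limiter is what keeps the second-order reconstruction from producing new extrema near the discontinuities of the pulse.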

Relevance: 20.00%

Abstract:

Numerical experiments using a finite difference method were carried out to determine the motion of axisymmetric Taylor vortices for narrow-gap Taylor vortex flow. When a pressure gradient is imposed on the flow, the vortices are observed to move with an axial speed of 1.16 ± 0.005 times the mean axial flow velocity. The method of Brenner was used to calculate the long-time axial spread of material in the flow. For flows where there is no pressure gradient, the axial dispersion scales with the square root of the molecular diffusion, in agreement with the results of Rosenbluth et al. for high Peclet number dispersion in spatially periodic flows with a roll structure. When a pressure gradient is imposed, the dispersion increases by an amount approximately equal to 6.5 × 10⁻⁴ W̄²d²/D_m, where W̄ is the average axial velocity in the annulus, d is the gap width and D_m is the molecular diffusivity, analogous to Taylor dispersion for laminar flow in an empty tube.
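The enhanced-dispersion estimate quoted above is a single algebraic expression; the sketch below simply evaluates it. The input values are purely illustrative, not taken from the paper.

```python
# Extra axial dispersion from an imposed pressure gradient,
# D_axial ~ 6.5e-4 * Wbar^2 * d^2 / Dm, as quoted in the abstract.

def axial_dispersion(w_bar, d, d_m):
    """Enhanced axial dispersion coefficient.

    w_bar : mean axial velocity in the annulus
    d     : annular gap width
    d_m   : molecular diffusivity
    """
    return 6.5e-4 * w_bar**2 * d**2 / d_m

# e.g. w_bar = 1 mm/s, d = 1 mm, Dm = 1e-9 m^2/s (all in SI units)
d_axial = axial_dispersion(1e-3, 1e-3, 1e-9)
```

As with classical Taylor dispersion, the enhancement grows with the square of the mean velocity and inversely with the molecular diffusivity.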

Relevance: 20.00%

Abstract:

To simulate cropping systems, crop models must not only give reliable predictions of yield across a wide range of environmental conditions; they must also quantify water and nutrient use well, so that the status of the soil at maturity is a good representation of the starting conditions for the next cropping sequence. To assess their suitability for this task, a range of crop models currently used in Australia were tested. The models differed in their design objectives, complexity and structure and were (i) tested on diverse, independent data sets from a wide range of environments and (ii) further evaluated at the component level with one detailed data set from a semi-arid environment. All models were coded into the cropping systems shell APSIM, which provides a common soil water and nitrogen balance. Crop development was input, thus differences between simulations were caused entirely by differences in simulating crop growth. Under nitrogen non-limiting conditions, between 73 and 85% of the observed kernel yield variation across environments was explained by the models. This ranged from 51 to 77% under varying nitrogen supply. Water and nitrogen effects on leaf area index were predicted poorly by all models, resulting in erroneous predictions of dry matter accumulation and water use. When measured light interception was used as input, most models improved in their prediction of dry matter and yield. This test highlighted a range of compensating errors in all modelling approaches. The time course and final amount of water extraction were simulated well by two models, while others left up to 25% of potentially available soil water in the profile. Kernel nitrogen percentage was predicted poorly by all models due to its sensitivity to small dry matter changes. Yield and dry matter could be estimated adequately for a range of environmental conditions using the general concepts of radiation use efficiency and transpiration efficiency.
However, leaf area and kernel nitrogen dynamics need to be improved to achieve better estimates of water and nitrogen use if such models are to be used to evaluate cropping systems. (C) 1998 Elsevier Science B.V.
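The radiation use efficiency and transpiration efficiency concepts named above reduce, in their simplest form, to a most-limiting-resource rule for daily growth. The sketch below shows that rule; the parameter values and season length are illustrative assumptions, not taken from the paper or any of the tested models.

```python
# Daily dry matter increase set by the most limiting resource:
# radiation-limited (intercepted radiation * RUE) or
# water-limited (transpiration * TE). Values are illustrative only.

def daily_growth(intercepted_mj, rue, transpired_mm, te):
    """Daily dry matter increase (g/m2): min of the two resource-limited rates."""
    return min(intercepted_mj * rue, transpired_mm * te)

# a 100-day season: 10 MJ/m2/day intercepted at RUE = 1.4 g/MJ,
# 3 mm/day transpired at TE = 5 g/m2 per mm -> radiation-limited growth
biomass = sum(daily_growth(10.0, 1.4, 3.0, 5.0) for _ in range(100))
```

Switching which term of the `min` is active is how such models move between radiation-limited and water-limited growth phases within a season.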

Relevance: 20.00%

Abstract:

Previous work has identified several shortcomings in the ability of four spring wheat models and one barley model to simulate crop processes and resource utilization. This can have important implications when such models are used within systems models, where the final soil water and nitrogen conditions of one crop define the starting conditions of the following crop. In an attempt to overcome these limitations and to reconcile a range of modelling approaches, existing model components that worked demonstrably well were combined with new components for aspects where existing capabilities were inadequate. This resulted in the Integrated Wheat Model (I_WHEAT), which was developed as a module of the cropping systems model APSIM. To increase the predictive capability of the model, process detail was reduced, where possible, by replacing groups of processes with conservative, biologically meaningful parameters. I_WHEAT does not contain a soil water or soil nitrogen balance; these are present as other modules of APSIM. In I_WHEAT, yield is simulated using a linear increase in harvest index, whereby nitrogen or water limitations can lead to early termination of grain filling and hence cessation of the harvest index increase. Dry matter increase is calculated either from the amount of intercepted radiation and radiation conversion efficiency or from the amount of water transpired and transpiration efficiency, depending on the most limiting resource. Leaf area and tiller formation are calculated from thermal time and a cultivar-specific phyllochron interval. Nitrogen limitation first reduces leaf area and then affects radiation conversion efficiency as it becomes more severe. Water or nitrogen limitations result in reduced leaf expansion, accelerated leaf senescence or tiller death. This reduces the radiation load on the crop canopy (i.e. demand for water) and can make nitrogen available for translocation to other organs.
Sensitive feedbacks between light interception and dry matter accumulation are avoided by having environmental effects act directly on leaf area development, rather than via biomass production. This makes the model more stable across environments without losing the interactions between the different external influences. When comparing model output with models tested previously using data from a wide range of agro-climatic conditions, yield and biomass predictions were equal to the best of those models, but improvements could be demonstrated for simulating leaf area dynamics in response to water and nitrogen supply, kernel nitrogen content, and total water and nitrogen use. I_WHEAT does not require calibration for any of the environments tested. Further model improvement should concentrate on improving phenology simulations, a more thorough derivation of coefficients to describe leaf area development and a better quantification of some processes related to nitrogen dynamics. (C) 1998 Elsevier Science B.V.
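The linear harvest index rule described above can be sketched in a few lines: the harvest index (HI) rises at a constant daily rate during grain filling, capped at a maximum, and stress terminates grain filling early, freezing HI. The rate, cap and stress trigger below are illustrative assumptions, not I_WHEAT's actual parameter values.

```python
# Yield = biomass * HI, with HI increasing linearly during grain filling.
# Water or nitrogen stress (modelled here as a hypothetical stress_day)
# ends grain filling early and freezes HI. All numbers are illustrative.

def simulate_yield(biomass, days, dhi_dt=0.02, hi_max=0.50, stress_day=None):
    """Return yield after `days` of grain filling at HI rate dhi_dt/day."""
    hi = 0.0
    for day in range(days):
        if stress_day is not None and day >= stress_day:
            break  # early termination of grain filling
        hi = min(hi + dhi_dt, hi_max)
    return biomass * hi

unstressed = simulate_yield(1000.0, 30)               # HI reaches its cap
stressed = simulate_yield(1000.0, 30, stress_day=10)  # HI frozen early
```

The attraction of this formulation is that stress only needs to set the end of grain filling, rather than being propagated through a chain of growth processes.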

Relevance: 20.00%

Abstract:

Traditional waste stabilisation pond (WSP) models encounter problems predicting pond performance because they cannot account for the influence of pond features, such as inlet structure or pond geometry, on fluid hydrodynamics. In this study, two-dimensional (2-D) computational fluid dynamics (CFD) models were compared to experimental residence time distributions (RTDs) from the literature. In one of the three geometries simulated, the 2-D CFD model successfully predicted the experimental RTD. However, flow patterns in the other two geometries were not well described, due to the difficulty of representing the three-dimensional (3-D) experimental inlet in the 2-D CFD model and the sensitivity of the model results to the assumptions used to characterise the inlet. Neither a velocity similarity nor a geometric similarity approach to inlet representation in 2-D gave results correlating with the experimental data. However, it was shown that 2-D CFD models were not affected by changes in the values of model parameters which are difficult to predict, particularly the turbulent inlet conditions. This work suggests that 2-D CFD models cannot be used a priori to give an adequate description of the hydrodynamic patterns in WSPs. (C) 1998 Elsevier Science Ltd. All rights reserved.
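Comparing a CFD model to an experimental RTD usually starts by reducing the tracer exit curve to summary moments. A minimal sketch of the first moment (mean residence time) is below; the tracer curve is synthetic, and the function name is invented for the example.

```python
# Mean residence time from a tracer exit curve: E(t) is the normalised
# concentration curve and t_mean = ∫ t·E(t) dt, evaluated here with the
# trapezoidal rule. The data are synthetic.

def mean_residence_time(times, conc):
    """t_mean = (∫ t·c dt) / (∫ c dt), trapezoidal integration."""
    def trapz(ys):
        return sum(0.5 * (ys[i] + ys[i + 1]) * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    return trapz([t * c for t, c in zip(times, conc)]) / trapz(conc)

times = [0.0, 1.0, 2.0, 3.0, 4.0]
conc = [0.0, 1.0, 2.0, 1.0, 0.0]   # symmetric pulse centred at t = 2
t_mean = mean_residence_time(times, conc)
```

A mean residence time well below the nominal (volume/flow) value is the classic signature of the short-circuiting that pond inlet structures induce.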

Relevance: 20.00%

Abstract:

A version of the Agricultural Production Systems Simulator (APSIM) capable of simulating the key agronomic aspects of intercropping maize between legume shrub hedgerows was described and parameterised in the first paper of this series (Nelson et al., this issue). In this paper, APSIM is used to simulate maize yields and soil erosion from traditional open-field farming and hedgerow intercropping in the Philippine uplands. Two variants of open-field farming were simulated using APSIM, continuous and fallow, for comparison with intercropping maize between leguminous shrub hedgerows. Continuous open-field maize farming was predicted to be unsustainable in the long term, while fallow open-field farming was predicted to slow productivity decline by spreading the effect of erosion over a larger cropping area. Hedgerow intercropping was predicted to reduce erosion by maintaining soil surface cover during periods of intense rainfall, contributing to sustainable production of maize in the long term. In the third paper in this series, Nelson et al. (this issue) use cost-benefit analysis to compare the economic viability of hedgerow intercropping relative to traditional open-field farming of maize in relatively inaccessible upland areas. (C) 1998 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

A space-marching code for the simulation and optimization of inviscid supersonic flow in three dimensions is described. The flow in a scramjet module with a relatively complex three-dimensional geometry is examined and wall-pressure estimates are compared with experimental data. Given that viscous effects are not presently included, the comparison is reasonable. The thermodynamic compromise of adding heat in a diverging combustor is also examined. The code is then used to optimize the shape of a thrust surface for a simpler (box-section) scramjet module in the presence of uniform and nonuniform heat distributions. The optimum two-dimensional profiles for the thrust surface are obtained via a perturbation procedure that requires about 30-50 flow solutions. It is found that the final shapes are fairly insensitive to the details of the heat distribution.
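A perturbation procedure of the kind described above can be sketched as greedy coordinate perturbation of a few shape control points. In the sketch below, a stand-in quadratic "thrust" objective replaces the flow solution that the real code evaluates for each candidate shape; the objective, control points and step size are all assumptions for illustration.

```python
# Greedy perturbation of a thrust-surface profile: each control point is
# nudged up and down in turn, keeping changes that improve the objective.
# In the real code every objective evaluation is a flow solution; here a
# stand-in quadratic "thrust" function is used instead.

def optimise(profile, objective, step=0.1, sweeps=10):
    """Return the best profile found by +/- step coordinate perturbations."""
    best = objective(profile)
    for _ in range(sweeps):
        for i in range(len(profile)):
            for delta in (step, -step):
                trial = profile[:]
                trial[i] += delta
                f = objective(trial)
                if f > best:
                    best, profile = f, trial
    return profile, best

# stand-in objective: "thrust" is maximal when the profile matches a target ramp
target = [0.1, 0.3, 0.6, 1.0]
thrust = lambda p: -sum((a - b) ** 2 for a, b in zip(p, target))
shape, f = optimise([0.0, 0.0, 0.0, 0.0], thrust)
```

Because each candidate costs a full flow solution, keeping the parameterisation to a handful of control points is what makes the modest solution count reported above achievable.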

Relevance: 20.00%

Abstract:

An automated method for extracting brain volumes from three commonly acquired three-dimensional (3D) MR images (proton density, T1-weighted, and T2-weighted) of the human head is described. The procedure is divided into four levels: preprocessing, segmentation, scalp removal, and postprocessing. A user-provided reference point is the sole operator-dependent input required. The method's parameters were first optimized, then fixed and applied to 30 repeat data sets from 15 normal older adult subjects to investigate its reproducibility. Percent differences between total brain volumes (TBVs) for the subjects' repeated data sets ranged from 0.5% to 2.2%. We conclude that the method is both robust and reproducible and has the potential for wide application.
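The reproducibility figure quoted above is a percent difference between repeated total brain volume (TBV) measurements. A minimal sketch is below; taking the mean of the pair as the denominator is an assumption (the abstract does not state the exact denominator), and the volumes are illustrative.

```python
# Percent difference between two repeat TBV measurements of one subject,
# relative to the mean of the pair (denominator choice is an assumption).

def percent_difference(v1, v2):
    """Unsigned percent difference between two repeat volume measurements."""
    return abs(v1 - v2) / ((v1 + v2) / 2.0) * 100.0

# two repeat scans in cm^3, about 1% apart
delta = percent_difference(1200.0, 1212.0)
```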

Relevance: 20.00%

Abstract:

Computer models can be combined with laboratory experiments for the efficient determination of (i) peptides that bind MHC molecules and (ii) T-cell epitopes. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures. This requires the definition of standards and experimental protocols for model application. We describe the requirements for validation and assessment of computer models. The utility of combining accurate predictions with a limited number of laboratory experiments is illustrated by practical examples. These include the identification of T-cell epitopes from IDDM-, melanoma- and malaria-related antigens by combining computational and conventional laboratory assays. The success rate in determining antigenic peptides, each in the context of a specific HLA molecule, ranged from 27 to 71%, while the natural prevalence of MHC-binding peptides is 0.1-5%.

Relevance: 20.00%

Abstract:

Two experimental studies were conducted to examine whether the stress-buffering effects of behavioral control on work task responses varied as a function of procedural information. Study 1 manipulated low and high levels of task demands, behavioral control, and procedural information for 128 introductory psychology students completing an in-basket activity. ANOVA procedures revealed a significant three-way interaction among these variables in the prediction of subjective task performance and task satisfaction. It was found that procedural information buffered the negative effects of task demands on ratings of performance and satisfaction only under conditions of low behavioral control. This pattern of results suggests that procedural information may have a compensatory effect when the work environment is characterized by a combination of high task demands and low behavioral control. Study 2 (N = 256) utilized simple and complex versions of the in-basket activity to examine the extent to which the interactive relationship among task demands, behavioral control, and procedural information varied as a function of task complexity. There was further support for the stress-buffering role of procedural information on work task responses under conditions of low behavioral control. This effect was, however, only present when the in-basket activity was characterized by high task complexity, suggesting that the interactive relationship among these variables may depend on the type of tasks performed at work. Copyright (C) 1999 John Wiley & Sons, Ltd.

Relevance: 20.00%

Abstract:

RWMODEL II simulates the Rescorla-Wagner model of Pavlovian conditioning. It is written in Delphi and runs under Windows 3.1 and Windows 95. The program was designed for novice and expert users and can be employed in teaching, as well as in research. It is user friendly and requires a minimal level of computer literacy but is sufficiently flexible to permit a wide range of simulations. It allows the display of empirical data, against which predictions from the model can be validated.
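The Rescorla-Wagner rule that RWMODEL II simulates is compact enough to sketch directly: on each trial, the associative strength V of every conditioned stimulus (CS) present changes by ΔV = αβ(λ − V_total), where V_total sums over the CSs present on that trial. The sketch below is not RWMODEL II itself, and the parameter values are illustrative.

```python
# Rescorla-Wagner update: dV = alpha * beta * (lambda - V_total) for each
# CS present on a trial, with V_total computed before any updates.

def rw_trial(v, present, lam, alpha, beta):
    """Apply one conditioning trial in place; v maps CS name -> strength."""
    v_total = sum(v[cs] for cs in present)
    for cs in present:
        v[cs] += alpha[cs] * beta * (lam - v_total)
    return v

v = {"light": 0.0, "tone": 0.0}
alpha = {"light": 0.3, "tone": 0.3}
for _ in range(50):                     # light-alone acquisition
    rw_trial(v, ["light"], lam=1.0, alpha=alpha, beta=0.5)
for _ in range(50):                     # compound trials: the tone is blocked
    rw_trial(v, ["light", "tone"], lam=1.0, alpha=alpha, beta=0.5)
```

After the schedule above, the light's strength approaches λ while the tone's stays near zero, reproducing the blocking effect, one of the empirical phenomena such simulations are typically validated against.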

Relevance: 20.00%

Abstract:

In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

Protein kinases exhibit various degrees of substrate specificity. The large number of different protein kinases in the eukaryotic proteomes makes it impractical to determine the specificity of each enzyme experimentally. To test whether it is possible to discriminate potential substrates from non-substrates by simple computational techniques, we analysed the binding enthalpies of modelled enzyme-substrate complexes and attempted to correlate them with experimental enzyme kinetics measurements. The crystal structures of phosphorylase kinase and cAMP-dependent protein kinase were used to generate models of the enzyme with a series of known peptide substrates and non-substrates, and the approximate enthalpy of binding was assessed following energy minimization. We show that the computed enthalpies do not correlate closely with kinetic measurements, but the method can distinguish good substrates from weak substrates and non-substrates. Copyright (C) 2002 John Wiley & Sons, Ltd.

Relevance: 20.00%

Abstract:

Axial X-ray computed tomography (CT) scanning provides a convenient means of recording the three-dimensional form of soil structure. The technique has been used for nearly two decades, but initial development concentrated on qualitative description of images. More recently, increasing effort has been put into quantifying the geometry and topology of macropores likely to contribute to preferential flow in soils. Here we describe a novel technique for tracing connected macropores in CT scans. After object extraction, three-dimensional mathematical morphological filters are applied to quantify the reconstructed structure. These filters consist of sequences of so-called erosions and/or dilations of a 32-face structuring element to describe object distances and volumes of influence. The tracing and quantification methodologies were tested on a set of undisturbed soil cores collected in a Swiss pre-alpine meadow, where a new earthworm species (Aporrectodea nocturna) was accidentally introduced. Given the reduced number of samples analysed in this study, the results presented only illustrate the potential of the method to reconstruct and quantify macropores. Our results suggest that the introduction of the new species induced very limited change to the soil structure; for example, no difference in total macropore length or mean diameter was observed. However, in the zone colonised by the new species, individual macropores tended to have a longer average length, be more vertical and be further apart at some depths. Overall, the approach proved well suited to the analysis of the three-dimensional architecture of macropores. It provides a framework for the analysis of complex structures, which are less satisfactorily observed and described using 2D imaging. (C) 2002 Elsevier Science B.V. All rights reserved.
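The erosion and dilation filters described above operate on a binary 3D image. The sketch below implements both on nested lists in pure Python; a full 3×3×3 (26-connected) structuring element stands in for the paper's 32-face element, which is an assumption for the example.

```python
# Binary 3D morphological filters: dilation grows objects by one voxel
# shell, erosion (implemented as the complement of dilating the
# complement) peels one shell off. 3x3x3 structuring element.

def dilate(img):
    """A voxel becomes 1 if any voxel in its 3x3x3 neighbourhood is 1."""
    nz, ny, nx = len(img), len(img[0]), len(img[0][0])
    out = [[[0] * nx for _ in range(ny)] for _ in range(nz)]
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                out[z][y][x] = int(any(
                    img[z + dz][y + dy][x + dx]
                    for dz in (-1, 0, 1) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if 0 <= z + dz < nz and 0 <= y + dy < ny and 0 <= x + dx < nx))
    return out

def erode(img):
    """Binary erosion: complement of the dilation of the complement."""
    nz, ny, nx = len(img), len(img[0]), len(img[0][0])
    inv = [[[1 - img[z][y][x] for x in range(nx)] for y in range(ny)] for z in range(nz)]
    d = dilate(inv)
    return [[[1 - d[z][y][x] for x in range(nx)] for y in range(ny)] for z in range(nz)]

# a solid 3x3x3 block inside a 5x5x5 volume erodes to its centre voxel
img = [[[1 if 1 <= z <= 3 and 1 <= y <= 3 and 1 <= x <= 3 else 0
         for x in range(5)] for y in range(5)] for z in range(5)]
eroded = erode(img)
```

Counting how many erosions a voxel survives is the basis of the distance and volume-of-influence measures the paper derives from such filter sequences.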

Relevance: 20.00%

Abstract:

The explosive growth in biotechnology combined with major advances in information technology has the potential to radically transform immunology in the postgenomics era. Not only do we now have ready access to vast quantities of existing data, but new data with relevance to immunology are being accumulated at an exponential rate. Resources for computational immunology include biological databases and methods for data extraction, comparison, analysis and interpretation. Publicly accessible biological databases of relevance to immunologists number in the hundreds and are growing daily. The ability to efficiently extract and analyse information from these databases is vital for efficient immunology research. Most importantly, a new generation of computational immunology tools enables modelling of peptide transport by the transporter associated with antigen processing (TAP), modelling of antibody binding sites, identification of allergenic motifs and modelling of T-cell receptor serial triggering.