146 results for computational modelling

in University of Queensland eSpace - Australia



Resumo:

Experimental aerodynamic studies of the flows around new aerocapture spacecraft configurations are currently being conducted in the superorbital expansion tubes at The University of Queensland. Short-duration flows at speeds of 10--13 km/s are produced in the expansion tube facility and are then applied to the model spacecraft. Although high-temperature effects, such as molecular dissociation, have long been part of the computational modelling of expansion tube flows at speeds below 10 km/s, radiation may now be a significant mechanism of energy transfer within the shock layer on the model. This paper studies the coupling of radiation energy transport for an optically thin gas to the flow dynamics in order to obtain accurate predictions of thermal loads on the spacecraft. The results show that the effect of radiation on the flowfields of subscale models for expansion tube experiments can be assessed by measurements of total heat transfer and radiative heat transfer.
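A common closure for radiative loss in an optically thin gas is the Planck-mean emission term q_rad = 4·κ_p·σ·T⁴; a minimal sketch of evaluating it, assuming this standard form (the paper's actual radiation model and coefficient values are not given here):

```python
# Volumetric radiative energy loss for an optically thin gas:
#   q_rad = 4 * kappa_p * sigma * T**4
# where kappa_p is the Planck-mean absorption coefficient. This is a
# standard optically thin approximation, not necessarily the paper's model.

SIGMA_SB = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def optically_thin_loss(kappa_p, temperature):
    """Radiative source term (W/m^3) for an optically thin shock layer."""
    return 4.0 * kappa_p * SIGMA_SB * temperature ** 4

# Emission scales as T^4, which is why radiation only becomes a significant
# energy-transfer mechanism at the highest flow speeds.
print(optically_thin_loss(0.1, 10000.0))
```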

Relevance: 70.00%

Abstract:

The ability to grow microscopic spherical birefringent crystals of vaterite, a calcium carbonate mineral, has allowed the development of an optical microrheometer based on optical tweezers. However, since these crystals are birefringent, and worse, are expected to have non-uniform birefringence, computational modeling of the microrheometer is a highly challenging task. Modeling the microrheometer - and optical tweezers in general - typically requires large numbers of repeated calculations for the same trapped particle. This places strong demands on the efficiency of computational methods used. While our usual method of choice for computational modelling of optical tweezers - the T-matrix method - meets this requirement of efficiency, it is restricted to homogeneous isotropic particles. General methods that can model complex structures such as the vaterite particles, such as finite-difference time-domain (FDTD) or finite-difference frequency-domain (FDFD) methods, are inefficient. Therefore, we have developed a hybrid FDFD/T-matrix method that combines the generality of volume-discretisation methods such as FDFD with the efficiency of the T-matrix method. We have used this hybrid method to calculate optical forces and torques on model vaterite spheres in optical traps. We present and compare the results of computational modelling and experimental measurements.

Relevance: 60.00%

Abstract:

This paper describes recent advances made in computational modelling of the sugar cane liquid extraction process. The saturated fibro-porous material is rolled between circumferentially grooved rolls, which enhance frictional grip and provide a low-resistance path for liquid flow during the extraction process. Previously reported two-dimensional (2D) computational models account for the large deformation of the porous material by solving the fully coupled governing fibre stress and fluid-flow equations using finite element techniques. While the 2D simulations provide much insight into the overarching cause-effect relationships, predictions of mechanical quantities such as roll separating force and particularly torque as a function of roll speed and degree of compression are not satisfactory for industrial use. It is considered that the unsatisfactory response in roll torque prediction may be due to the stress levels that exist between the groove tips and roots, which have been largely neglected in the geometrically simplified 2D model. This paper gives results for both two- and three-dimensional finite element models and highlights their strengths and weaknesses in predicting key milling parameters. (c) 2005 Elsevier B.V. All rights reserved.

Relevance: 60.00%

Abstract:

The hypothesis that lipid rafts exist in plasma membranes and have crucial biological functions remains controversial. The lateral heterogeneity of proteins in the plasma membrane is undisputed, but the contribution of cholesterol-dependent lipid assemblies to this complex, non-random organization promotes vigorous debate. In the light of recent studies with model membranes, computational modelling and innovative cell biology, I propose an updated model of lipid rafts that readily accommodates diverse views on plasma-membrane micro-organization.

Relevance: 60.00%

Abstract:

Systems biology is based on computational modelling and simulation of large networks of interacting components. Models may be intended to capture processes, mechanisms, components and interactions at different levels of fidelity. Input data are often large and geographically disperse, and may require the computation to be moved to the data, not vice versa. In addition, complex system-level problems require collaboration across institutions and disciplines. Grid computing can offer robust, scalable solutions for distributed data, compute and expertise. We illustrate some of the range of computational and data requirements in systems biology with three case studies: one requiring large computation but small data (orthologue mapping in comparative genomics), a second involving complex terabyte data (the Visible Cell project) and a third that is both computationally and data-intensive (simulations at multiple temporal and spatial scales). Authentication, authorisation and audit systems are currently not well scalable and may present bottlenecks for distributed collaboration, particularly where outcomes may be commercialised. Challenges remain in providing lightweight standards to facilitate the penetration of robust, scalable grid-type computing into diverse user communities to meet the evolving demands of systems biology.

Relevance: 40.00%

Abstract:

Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
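One of the listed methods, the quantitative matrix, reduces to summing per-position residue weights over a candidate peptide; a toy sketch of the idea, with entirely made-up weights rather than any published allele-specific matrix:

```python
# Quantitative-matrix (position-specific scoring) sketch for MHC-binding
# prediction: score a peptide by summing per-position residue weights.
# The toy matrix below is illustrative only, not a real MHC model.

def score_peptide(peptide, matrix, default=0.0):
    """Sum per-position residue weights; higher = stronger predicted binding."""
    return sum(matrix[i].get(res, default) for i, res in enumerate(peptide))

# Toy 3-position matrix for demonstration purposes.
toy_matrix = [
    {"L": 1.0, "A": 0.2},
    {"K": 0.8, "E": -0.5},
    {"V": 0.6},
]

print(score_peptide("LKV", toy_matrix))  # 1.0 + 0.8 + 0.6 = 2.4
```

In practice such matrices are trained on measured binding data, and, as the abstract stresses, must be validated on held-out peptides before being trusted.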

Relevance: 40.00%

Abstract:

The paper presents a computational system based upon formal principles to run spatial models for environmental processes. The simulator is named SimuMap because it is typically used to simulate spatial processes over a mapped representation of terrain. A model is formally represented in SimuMap as a set of coupled sub-models. The paper considers the situation where spatial processes operate at different time levels, but are still integrated. An example of such a situation commonly occurs in watershed hydrology, where overland flow and stream channel flow have very different flow rates but are highly related as they are subject to the same terrain runoff processes. SimuMap is able to run a network of sub-models that express different time-space derivatives for water flow processes. Sub-models may be coded generically with a map algebra programming language that uses a surface data model. To address the problem of differing time levels in simulation, the paper: (i) reviews general approaches for numerical solvers, (ii) considers the constraints that need to be enforced to use more adaptive time steps in discrete time specified simulations, and (iii) describes the scaling of transfer rates in equations that use different time bases for time-space derivatives. A multistep scheme is proposed for SimuMap. This is presented along with a description of its visual programming interface, its modelling formalisms and future plans. (C) 2003 Elsevier Ltd. All rights reserved.
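Point (iii), rescaling transfer rates between time bases, amounts to a linear change of step length when a rate is expressed per step; a minimal sketch under that assumption (the function name is illustrative, not SimuMap's API):

```python
def scale_rate(rate_per_step, from_dt, to_dt):
    """Rescale a per-step transfer rate to a different time base,
    assuming the rate is linear in step length (valid for small steps)."""
    return rate_per_step * (to_dt / from_dt)

# An overland-flow sub-model stepping at 60 s and a channel-flow sub-model
# stepping at 1 s must exchange water at consistent rates:
channel_rate = scale_rate(0.6, from_dt=60.0, to_dt=1.0)
print(channel_rate)
```

A multistep scheme like the one proposed would run the fine-step sub-model several times per coarse step, applying such a rescaling at each exchange.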

Relevance: 40.00%

Abstract:

A one-dimensional computational model of pilling of a fibre assembly has been created. The model follows a set of individual fibres, as free ends and loops appear as fuzz and are progressively withdrawn from the body of the assembly, and entangle to form pills, which eventually break off or are pulled out. The time dependence of the computation is given by ticks, which correspond to cycles of a wear and laundering process. The movement of the fibres is treated as a reptation process. A set of standard values is used as inputs to the computation. Predictions are given of the change with number of cycles of mass of fuzz, mass of pills, and mass removed from the assembly. Changes in the standard values allow sensitivity studies to be carried out.
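The body-to-fuzz-to-pill mass flow described above can be caricatured as a tick-driven compartment model; a sketch with placeholder rate constants (the paper's actual fibre-level reptation model and standard values are not reproduced):

```python
# Toy tick-based pilling balance: each tick (one wear/laundering cycle)
# moves a fraction of fibre mass body -> fuzz -> pills, and a fraction of
# pill mass breaks off. Rate constants are illustrative only.

def run_pilling(ticks, body=100.0, fuzz=0.0, pills=0.0,
                k_fuzz=0.05, k_pill=0.3, k_off=0.1):
    """Return (body, fuzz, pills, removed) masses after the given ticks."""
    removed = 0.0
    for _ in range(ticks):
        new_fuzz = k_fuzz * body    # fibre ends withdrawn as fuzz
        new_pill = k_pill * fuzz    # fuzz entangling into pills
        broken = k_off * pills      # pills breaking off / pulled out
        body -= new_fuzz
        fuzz += new_fuzz - new_pill
        pills += new_pill - broken
        removed += broken
    return body, fuzz, pills, removed

print(run_pilling(20))
```

Total mass is conserved across compartments, which is a useful sanity check when varying the rate constants in a sensitivity study.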

Relevance: 30.00%

Abstract:

We use the finite element method to solve reactive mass transport problems in fluid-saturated porous media. In particular, we discuss the mathematical expression of the chemical reaction terms involved in the mass transport equations for an isothermal, non-equilibrium chemical reaction. It has turned out that the Arrhenius law in chemistry is a good mathematical expression for such non-equilibrium chemical reactions, especially from the computational point of view. Using the finite element method and the Arrhenius law, we investigate the distributions of pH (i.e. the concentration of H+) and the relevant reactive species in a groundwater system. Although the main focus of this study is on the contaminant transport problems in groundwater systems, the related numerical techniques and principles are equally applicable to the orebody formation problems in the geosciences. Copyright (C) 1999 John Wiley & Sons, Ltd.
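The Arrhenius law mentioned above, k = A·exp(−Ea/(R·T)), is straightforward to evaluate pointwise in a reaction term; a minimal sketch with illustrative parameter values (not taken from the paper):

```python
import math

R_GAS = 8.314  # universal gas constant, J mol^-1 K^-1

def arrhenius_rate(pre_exponential, activation_energy, temperature):
    """Arrhenius law: k = A * exp(-Ea / (R * T))."""
    return pre_exponential * math.exp(-activation_energy / (R_GAS * temperature))

# Rates rise steeply with temperature; e.g. for an illustrative
# activation energy of 50 kJ/mol, compare 350 K against 300 K:
print(arrhenius_rate(1.0e3, 5.0e4, 350.0) / arrhenius_rate(1.0e3, 5.0e4, 300.0))
```

This exponential temperature sensitivity is what makes the law convenient for non-equilibrium reaction terms in a finite element discretisation: it is smooth and cheap to evaluate at quadrature points.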

Relevance: 30.00%

Abstract:

Protein kinases exhibit various degrees of substrate specificity. The large number of different protein kinases in the eukaryotic proteomes makes it impractical to determine the specificity of each enzyme experimentally. To test whether it is possible to discriminate potential substrates from non-substrates by simple computational techniques, we analysed the binding enthalpies of modelled enzyme-substrate complexes and attempted to correlate them with experimental enzyme kinetics measurements. The crystal structures of phosphorylase kinase and cAMP-dependent protein kinase were used to generate models of the enzyme with a series of known peptide substrates and non-substrates, and the approximate enthalpy of binding assessed following energy minimization. We show that the computed enthalpies do not correlate closely with kinetic measurements, but the method can distinguish good substrates from weak substrates and non-substrates. Copyright (C) 2002 John Wiley & Sons, Ltd.

Relevance: 30.00%

Abstract:

The explosive growth in biotechnology combined with major advances in information technology has the potential to radically transform immunology in the postgenomics era. Not only do we now have ready access to vast quantities of existing data, but new data with relevance to immunology are being accumulated at an exponential rate. Resources for computational immunology include biological databases and methods for data extraction, comparison, analysis and interpretation. Publicly accessible biological databases of relevance to immunologists number in the hundreds and are growing daily. The ability to efficiently extract and analyse information from these databases is vital for efficient immunology research. Most importantly, a new generation of computational immunology tools enables modelling of peptide transport by the transporter associated with antigen processing (TAP), modelling of antibody binding sites, identification of allergenic motifs and modelling of T-cell receptor serial triggering.

Relevance: 30.00%

Abstract:

We present finite element simulations of the mixing of reactive mineral-carrying fluids and mineralization in pore-fluid saturated hydrothermal/sedimentary basins. In particular we explore the mixing of reactive sulfide and sulfate fluids and the relevant patterns of mineralization for lead, zinc and iron minerals in the regime of temperature-gradient-driven convective flow. Since mineralization and ore body formation may last quite a long period of time in a hydrothermal basin, it is commonly assumed in geochemistry that the solutions of minerals are in, or near, an equilibrium state. Therefore, the mineralization rate of a particular kind of mineral can be expressed as the product of the pore-fluid velocity and the equilibrium concentration of this particular kind of mineral. Using this mineralization rate, the potential of the modern mineralization theory is illustrated by means of finite element studies of reactive mineral-carrying fluid mixing problems in materially homogeneous and inhomogeneous porous rock basins.
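Taken at face value, the mineralization-rate expression stated in the abstract is a simple product of two fields; a sketch under that reading, with illustrative units and values:

```python
def mineralization_rate(pore_fluid_velocity, equilibrium_concentration):
    """MR = u * C_eq, the product form stated in the abstract.
    With u in m/s and C_eq in kg/m^3, MR has units kg m^-2 s^-1.
    Values below are illustrative, not from the paper."""
    return pore_fluid_velocity * equilibrium_concentration

# Slow Darcy-scale convective flow carrying a dilute equilibrium mineral load:
print(mineralization_rate(1.0e-8, 0.05))
```

The tiny magnitude of such rates is consistent with the abstract's point that ore body formation plays out over geologically long timescales.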

Relevance: 30.00%

Abstract:

Allergy is a major cause of morbidity worldwide. The number of characterized allergens and related information is increasing rapidly creating demands for advanced information storage, retrieval and analysis. Bioinformatics provides useful tools for analysing allergens and these are complementary to traditional laboratory techniques for the study of allergens. Specific applications include structural analysis of allergens, identification of B- and T-cell epitopes, assessment of allergenicity and cross-reactivity, and genome analysis. In this paper, the most important bioinformatic tools and methods with relevance to the study of allergy have been reviewed.

Relevance: 30.00%

Abstract:

A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
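The two-component Weibull mixture has the closed-form survivor function S(t) = π·S_acute(t) + (1 − π)·S_chronic(t); a minimal sketch of evaluating it, with illustrative parameters (the paper's fitted values and random hospital effects are not reproduced):

```python
import math

def weibull_survival(t, shape, scale):
    """Weibull survivor function S(t) = exp(-(t/scale)**shape)."""
    return math.exp(-((t / scale) ** shape))

def mixture_survival(t, pi_acute, acute, chronic):
    """Two-component mixture: a fraction pi_acute of patients follows the
    acute-phase Weibull (shape, scale), the remainder the chronic-phase one."""
    return (pi_acute * weibull_survival(t, *acute)
            + (1.0 - pi_acute) * weibull_survival(t, *chronic))

# Illustrative parameters: high early hazard for the acute component
# (small scale), slow decline for the chronic component (large scale).
print(mixture_survival(30.0, 0.35, (1.2, 10.0), (0.9, 400.0)))
```

In the paper the mixture weights and Weibull parameters are estimated by EM with hospital-level random effects; the sketch above only shows the mixture form itself.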