99 results for Computational simulation
Abstract:
Computer models can be combined with laboratory experiments for the efficient determination of (i) peptides that bind MHC molecules and (ii) T-cell epitopes. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures. This requires the definition of standards and experimental protocols for model application. We describe the requirements for validation and assessment of computer models. The utility of combining accurate predictions with a limited number of laboratory experiments is illustrated by practical examples, including the identification of T-cell epitopes from IDDM-, melanoma- and malaria-related antigens by combining computational predictions with conventional laboratory assays. The success rate in determining antigenic peptides, each in the context of a specific HLA molecule, ranged from 27 to 71%, while the natural prevalence of MHC-binding peptides is only 0.1-5%.
Abstract:
Two experimental studies were conducted to examine whether the stress-buffering effects of behavioral control on work task responses varied as a function of procedural information. Study 1 manipulated low and high levels of task demands, behavioral control, and procedural information for 128 introductory psychology students completing an in-basket activity. ANOVA procedures revealed a significant three-way interaction among these variables in the prediction of subjective task performance and task satisfaction. It was found that procedural information buffered the negative effects of task demands on ratings of performance and satisfaction only under conditions of low behavioral control. This pattern of results suggests that procedural information may have a compensatory effect when the work environment is characterized by a combination of high task demands and low behavioral control. Study 2 (N = 256) utilized simple and complex versions of the in-basket activity to examine the extent to which the interactive relationship among task demands, behavioral control, and procedural information varied as a function of task complexity. There was further support for the stress-buffering role of procedural information on work task responses under conditions of low behavioral control. This effect was, however, only present when the in-basket activity was characterized by high task complexity, suggesting that the interactive relationship among these variables may depend on the type of tasks performed at work. Copyright (C) 1999 John Wiley & Sons, Ltd.
Abstract:
RWMODEL II simulates the Rescorla-Wagner model of Pavlovian conditioning. It is written in Delphi and runs under Windows 3.1 and Windows 95. The program was designed for novice and expert users and can be employed in teaching, as well as in research. It is user friendly and requires a minimal level of computer literacy but is sufficiently flexible to permit a wide range of simulations. It allows the display of empirical data, against which predictions from the model can be validated.
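The Rescorla-Wagner rule that RWMODEL II simulates is compact enough to state directly: on each trial, every stimulus present has its associative strength V updated by ΔV = αβ(λ − ΣV), where ΣV is the summed strength of all stimuli present on that trial. Below is a minimal Python sketch of that rule (the program itself is written in Delphi; all parameter values here are illustrative):

```python
# Minimal sketch of the Rescorla-Wagner update rule; an illustration in
# Python, not RWMODEL II's Delphi code. Parameter values are illustrative.

def rescorla_wagner(trials, alphas, beta, lam):
    """Return the associative strengths V after each trial.

    trials : list of sets of stimulus names present on each trial
    alphas : dict mapping stimulus name -> salience (alpha)
    beta   : learning-rate parameter associated with the US
    lam    : asymptote of conditioning (lambda); 0 on extinction trials
    """
    V = {s: 0.0 for s in alphas}
    history = []
    for present in trials:
        total = sum(V[s] for s in present)   # combined prediction of the US
        for s in present:                    # update every cue present
            V[s] += alphas[s] * beta * (lam - total)
        history.append(dict(V))
    return history

# Simple acquisition: a single CS "light" paired with the US for 10 trials
curves = rescorla_wagner([{"light"}] * 10, {"light": 0.3}, beta=0.5, lam=1.0)
print(curves[-1])  # V approaches lambda = 1.0 along a negatively accelerated curve
```

The negatively accelerated acquisition curve this produces is exactly the kind of model prediction the program lets users display against empirical data.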
Abstract:
In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.
Abstract:
Protein kinases exhibit various degrees of substrate specificity. The large number of different protein kinases in eukaryotic proteomes makes it impractical to determine the specificity of each enzyme experimentally. To test whether potential substrates could be discriminated from non-substrates by simple computational techniques, we analysed the binding enthalpies of modelled enzyme-substrate complexes and attempted to correlate them with experimental enzyme kinetics measurements. The crystal structures of phosphorylase kinase and cAMP-dependent protein kinase were used to generate models of each enzyme with a series of known peptide substrates and non-substrates, and the approximate enthalpy of binding was assessed following energy minimization. We show that the computed enthalpies do not correlate closely with kinetic measurements, but the method can distinguish good substrates from weak substrates and non-substrates. Copyright (C) 2002 John Wiley & Sons, Ltd.
Abstract:
The explosive growth in biotechnology combined with major advances in information technology has the potential to radically transform immunology in the postgenomics era. Not only do we now have ready access to vast quantities of existing data, but new data with relevance to immunology are being accumulated at an exponential rate. Resources for computational immunology include biological databases and methods for data extraction, comparison, analysis and interpretation. Publicly accessible biological databases of relevance to immunologists number in the hundreds and are growing daily. The ability to efficiently extract and analyse information from these databases is vital for efficient immunology research. Most importantly, a new generation of computational immunology tools enables modelling of peptide transport by the transporter associated with antigen processing (TAP), modelling of antibody binding sites, identification of allergenic motifs and modelling of T-cell receptor serial triggering.
Abstract:
The paper presents a theory for modeling flow in anisotropic, viscous rock. This theory was originally developed for the simulation of large-deformation processes, including the folding and kinking of multi-layered visco-elastic rock (Muhlhaus et al. [1,2]). The orientation of slip planes in the context of crystallographic slip is determined by the normal vector, the director, of these surfaces. The model is applied to simulate anisotropic mantle convection. We compare the evolution of flow patterns, Nusselt number and director orientations for isotropic and anisotropic rheologies. In the simulations we utilize two different finite element methodologies: the Lagrangian Integration Point method of Moresi et al. [8], and an Eulerian formulation, which we implemented in the finite-element-based PDE solver Fastflo (www.cmis.csiro.au/Fastflo/). The reason for utilizing two different finite element codes was, firstly, to study the influence of an anisotropic power-law rheology, which is not currently implemented in the Lagrangian Integration Point scheme [8], and, secondly, to compare the numerical performance of the Eulerian (Fastflo) and Lagrangian integration schemes [8]. It turned out that, whereas in the Lagrangian method the Nusselt number vs. time plot reached only a quasi-steady state in which the Nusselt number oscillates around a steady-state value, the Eulerian scheme reaches exact steady states and produces a high degree of alignment (director orientation locally orthogonal to the velocity vector almost everywhere in the computational domain). In the simulations, emergent anisotropy was strongest, in terms of modulus contrast, in the up- and down-welling plumes. Mechanisms for anisotropic material behavior in the mantle dynamics context are discussed by Christensen [3]. The dominant mineral phases in the mantle generally do not exhibit strong elastic anisotropy, but they may still be oriented by the convective flow. Thus viscous anisotropy (the main focus of this paper) may or may not correlate with elastic or seismic anisotropy.
Abstract:
Allergy is a major cause of morbidity worldwide. The number of characterized allergens and the volume of related information are increasing rapidly, creating demands for advanced information storage, retrieval and analysis. Bioinformatics provides useful tools for analysing allergens, and these are complementary to traditional laboratory techniques for the study of allergens. Specific applications include structural analysis of allergens, identification of B- and T-cell epitopes, assessment of allergenicity and cross-reactivity, and genome analysis. This paper reviews the most important bioinformatic tools and methods relevant to the study of allergy.
Abstract:
A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
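In the notation of the abstract, the mixture survival function is S(t) = πS1(t) + (1 − π)S2(t), with each component Weibull. The sketch below shows the mixture survival function and one way to simulate such data; the hospital random effects and the EM estimation step are omitted, and all parameter values are hypothetical:

```python
# Illustrative two-component Weibull survival mixture (acute vs. chronic
# subpopulations). Random hospital effects and EM estimation are omitted;
# every parameter value is hypothetical.
import numpy as np

def mixture_survival(t, pi, shape1, scale1, shape2, scale2):
    """S(t) = pi * S1(t) + (1 - pi) * S2(t), both components Weibull."""
    s1 = np.exp(-(t / scale1) ** shape1)   # acute-phase survival
    s2 = np.exp(-(t / scale2) ** shape2)   # chronic-phase survival
    return pi * s1 + (1.0 - pi) * s2

def simulate(n, pi, shape1, scale1, shape2, scale2, seed=0):
    """Draw survival times: pick a component, then a Weibull time from it."""
    rng = np.random.default_rng(seed)
    acute = rng.random(n) < pi
    return np.where(acute,
                    scale1 * rng.weibull(shape1, n),
                    scale2 * rng.weibull(shape2, n))

times = simulate(1000, pi=0.4, shape1=0.8, scale1=30.0, shape2=1.2, scale2=365.0)
print(mixture_survival(np.array([30.0, 365.0]), 0.4, 0.8, 30.0, 1.2, 365.0))
```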
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
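Of the prediction methods listed, the quantitative-matrix approach is the most transparent: a peptide's predicted binding score is simply the sum of position-specific coefficients for its residues. The toy sketch below illustrates the idea; the matrix values are invented, and real matrices cover all 20 amino acids at each position of, typically, a 9-mer:

```python
# Toy quantitative-matrix scorer for peptide-MHC binding prediction.
# The coefficients below are invented for illustration only.

def matrix_score(peptide, matrix):
    """Sum position-specific coefficients; residues not in the matrix score 0."""
    return sum(matrix[i].get(aa, 0.0) for i, aa in enumerate(peptide))

# Hypothetical coefficients for the three positions of a 3-mer "motif"
toy_matrix = [
    {"L": 1.2, "M": 0.8, "A": -0.5},   # position 1: anchor preferring L/M
    {"Y": 0.4, "F": 0.3, "G": -0.2},   # position 2: weak preference
    {"V": 1.0, "L": 0.9, "K": -1.1},   # position 3: C-terminal anchor
]

for pep in ("LYV", "AGK"):
    print(pep, matrix_score(pep, toy_matrix))  # predicted strong vs. weak binder
```

A threshold on such scores, calibrated on peptides of known binding affinity, is what turns a matrix into a binder/non-binder classifier whose quality must then be tested and validated, as the article argues.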
Abstract:
Numerical methods are used to simulate double-diffusion driven convective pore-fluid flow and rock alteration in three-dimensional fluid-saturated geological fault zones. The double diffusion is caused by the combination of a positive upward temperature gradient and a positive downward salinity concentration gradient within a fault zone that is assumed to be more permeable than its surrounding rocks. To ensure the physical meaningfulness of the numerical solutions, the numerical method used in this study is validated against a benchmark problem for which the analytical solution for the critical Rayleigh number of the system is available. The theoretical value of the critical Rayleigh number can be used to judge whether or not double-diffusion driven convective pore-fluid flow can take place within such a system. Once the possibility of triggering this convective flow is theoretically confirmed for the numerical model of a three-dimensional fluid-saturated geological fault zone system, the corresponding numerical solutions for the convective flow and temperature are directly coupled with a geochemical system. Through numerical simulation of the coupled system of convective fluid flow, heat transfer, mass transport and chemical reactions, we investigate the effect of the double-diffusion driven convective pore-fluid flow on rock alteration, the direct consequence of mineral redistribution through dissolution, transport and precipitation, within the fault zone system. (c) 2005 Elsevier B.V. All rights reserved.
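For orientation, a porous-medium thermal Rayleigh number has the form Ra = ρ0 g β ΔT k H / (μ κ), and convective flow can set in once Ra exceeds the critical value for the geometry at hand. The sketch below uses the classical horizontal-porous-layer value Ra_c = 4π² purely for illustration; the fault-zone system of the paper has its own analytical critical value, and the double-diffusive case additionally involves a solutal Rayleigh number that is omitted here. All property values are hypothetical:

```python
# Illustrative onset-of-convection check. The critical value 4*pi**2 is the
# classical horizontal-porous-layer result, used here only for illustration;
# the paper's fault-zone geometry has its own analytical critical value.
import math

def porous_rayleigh(rho0, g, beta, dT, k_perm, H, mu, kappa):
    """Ra = rho0 * g * beta * dT * k * H / (mu * kappa)."""
    return rho0 * g * beta * dT * k_perm * H / (mu * kappa)

# Hypothetical fluid and rock properties for a 5 km deep permeable zone
Ra = porous_rayleigh(rho0=1000.0, g=9.81, beta=2.1e-4, dT=100.0,
                     k_perm=1e-12, H=5000.0, mu=1e-3, kappa=1e-6)
Ra_c = 4 * math.pi ** 2
print(f"Ra = {Ra:.0f}, Ra_c = {Ra_c:.1f}")
print("convective pore-fluid flow possible" if Ra > Ra_c else "conduction only")
```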
Abstract:
A comprehensive probabilistic model for simulating dendrite morphology and investigating dendritic growth kinetics during solidification has been developed, based on a modified Cellular Automaton (mCA) for microscopic modeling of nucleation, growth of crystals and solute diffusion. The mCA model numerically calculates solute redistribution in both the solid and liquid phases, the curvature of dendrite tips and the growth anisotropy. The model takes account of thermal, curvature and solute diffusion effects, and can therefore simulate microstructure formation on the scale of the dendrite tip length. This model was then applied to simulate dendritic solidification of an Al-7%Si alloy. Both directional and equiaxed dendritic growth were simulated to investigate the effects of growth anisotropy and cooling rate on dendrite morphology. Furthermore, the competitive growth and selection of dendritic crystals were also investigated.
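The capture rule at the heart of a cellular-automaton solidification model can be illustrated in a few lines: solid cells convert neighbouring liquid cells to solid with some probability each step. The toy sketch below deliberately omits everything that makes the mCA model quantitative (solute diffusion, tip curvature, growth anisotropy and thermal effects); grid size, step count and capture probability are arbitrary:

```python
# Toy cellular-automaton growth from a single nucleus: solid cells capture
# liquid 4-neighbours with probability p_capture each step. Solute diffusion,
# curvature and anisotropy from the real mCA model are omitted.
import numpy as np

rng = np.random.default_rng(1)
N, steps, p_capture = 101, 60, 0.3
solid = np.zeros((N, N), dtype=bool)
solid[N // 2, N // 2] = True                     # single nucleus at the centre

for _ in range(steps):
    # liquid cells with at least one solid 4-neighbour may be captured
    nbr = (np.roll(solid, 1, 0) | np.roll(solid, -1, 0) |
           np.roll(solid, 1, 1) | np.roll(solid, -1, 1))
    candidates = nbr & ~solid
    solid |= candidates & (rng.random((N, N)) < p_capture)

print("solid fraction:", solid.mean())
```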
Abstract:
The St. Lawrence Island polynya (SLIP) is a commonly occurring winter phenomenon in the Bering Sea, in which dense saline water produced during new ice formation is thought to flow northward through the Bering Strait and help maintain the Arctic Ocean halocline. Winter darkness and inclement weather conditions have made continuous in situ and remote observation of this polynya difficult. However, imagery acquired from the European Space Agency ERS-1 Synthetic Aperture Radar (SAR) has allowed observation of the St. Lawrence Island polynya using both the imagery and derived ice-displacement products. With the development of ARCSyM, a high-resolution regional model of the Arctic atmosphere/sea ice system, simulation of the SLIP in a climate model is now possible, and intercomparisons between remotely sensed products and simulations can lead to additional insight into the SLIP formation process. Low-resolution SAR, SSM/I and AVHRR infrared imagery for the St. Lawrence Island region are compared with the results of a model simulation for the period of 24-27 February 1992. The imagery illustrates a polynya opening event. With the northerly winds strong and consistent over several days, the coupled model captures the SLIP event with moderate accuracy. However, the introduction of a stability-dependent atmosphere-ice drag coefficient, which allows feedbacks between atmospheric stability, open water, and air-ice drag, produces a more accurate simulation of the SLIP in comparison to satellite imagery. Model experiments show that the polynya event is forced primarily by changes in atmospheric circulation followed by persistent favorable conditions; ocean surface currents are found to have a small but positive impact on the simulation, which is enhanced when wind forcing is weak or variable.
Abstract:
Fluid mixing in steady and unsteady flow through a channel containing periodic square obstructions has been studied using a finite-difference simulation to determine fluid velocities, followed by passive marker-particle advection to examine fluid transport out of the cavities formed between the obstructions. The geometry and flow conditions were chosen from the work of Perkins (1989, M.S. Thesis, Lehigh University; 1992, Ph.D. Thesis, Lehigh University), who investigated heat transfer enhancement due to unsteady flow through such an obstructed channel. Particle advection shows that flow regimes predicted to give good mixing, based on snapshots of instantaneous streamline contour plots, were not necessarily able to efficiently mix fluid that started in the cavity regions throughout the channel. Poincaré sections show regular regions existing under these conditions which inhibit efficient fluid transport. These regular regions are found to disappear when the unsteady flow velocity is increased. (C) 1997 Elsevier Science Ltd.
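Passive marker advection of this kind amounts to integrating dx/dt = u(x, t) for each tracer. The sketch below does so with a midpoint (RK2) scheme in an analytic, time-periodic cellular flow that stands in for the paper's finite-difference channel solution; sampling the positions once per forcing period would yield the Poincaré sections mentioned above:

```python
# Passive tracer advection in a prescribed, time-periodic 2-D velocity field.
# The analytic field below is a stand-in for the finite-difference solution
# of the obstructed-channel flow; all parameters are illustrative.
import numpy as np

def velocity(x, y, t, eps=0.1):
    """Incompressible, time-periodic cellular flow (illustrative only)."""
    a = eps * np.sin(2 * np.pi * t)              # periodic lateral oscillation
    u = -np.pi * np.sin(np.pi * (x + a)) * np.cos(np.pi * y)
    v = np.pi * np.cos(np.pi * (x + a)) * np.sin(np.pi * y)
    return u, v

def advect(x, y, t0, t1, dt=1e-3):
    """Midpoint (RK2) time stepping of tracer positions."""
    t = t0
    while t < t1:
        u1, v1 = velocity(x, y, t)
        u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)
        x, y, t = x + dt * u2, y + dt * v2, t + dt
    return x, y

# Seed a blob of markers in one "cavity" and advect for five forcing periods
x0, y0 = np.meshgrid(np.linspace(0.1, 0.3, 20), np.linspace(0.1, 0.3, 20))
xf, yf = advect(x0.ravel(), y0.ravel(), 0.0, 5.0)
print("marker spread:", xf.std(), yf.std())
```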
Abstract:
Systems approaches can help to evaluate and improve the agronomic and economic viability of nitrogen application in frequently water-limited environments. This requires a sound understanding of crop physiological processes and well-tested simulation models. This experiment on spring wheat therefore aimed to better quantify water x nitrogen effects on wheat by deriving key crop physiological parameters that have proven useful in simulating crop growth. For spring wheat grown in Northern Australia under four levels of nitrogen (0 to 360 kg N ha^-1) and either entirely on stored soil moisture or under full irrigation, kernel yields ranged from 343 to 719 g m^-2. Yield increases were strongly associated with increases in kernel number (9150-19950 kernels m^-2), indicating the sensitivity of this parameter to water and N availability. Total water extraction under a rain shelter was 240 mm, with a maximum extraction depth of 1.5 m. A substantial amount of mineral nitrogen available deep in the profile (below 0.9 m) was taken up by the crop; this was the source of the nitrogen uptake observed after anthesis. Under dry conditions this late uptake accounted for approximately 50% of total nitrogen uptake and resulted in high (>2%) kernel nitrogen percentages even when no nitrogen was applied. Anthesis LAI values were reduced by 63% under sub-optimal water supply and by 50% under sub-optimal nitrogen supply. Radiation use efficiency (RUE) based on total incident short-wave radiation was 1.34 g MJ^-1 and did not differ among treatments. The conservative nature of RUE was the result of the crop reducing leaf area rather than leaf nitrogen content (which would have affected photosynthetic activity) under these moderate levels of nitrogen limitation. The transpiration efficiency coefficient was also conservative, averaging 4.7 Pa in the dry treatments. Kernel nitrogen percentage varied from 2.08 to 2.42%. The study provides a data set and a basis for improving the simulation of water and nitrogen effects on spring wheat. (C) 1997 Elsevier Science B.V.
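The radiation-use-efficiency relation behind the reported value is a simple proportionality: biomass accumulation = RUE x incident short-wave radiation. A one-line worked example follows; RUE = 1.34 g MJ^-1 is the value reported above, while the radiation figure is hypothetical:

```python
# Worked example of the RUE relation; the abstract defines RUE on total
# incident short-wave radiation. The incident value below is hypothetical.
RUE = 1.34        # g biomass per MJ incident short-wave radiation (reported)
incident = 20.0   # MJ m^-2 day^-1 incident short-wave radiation (hypothetical)
print(f"{RUE * incident:.1f} g m^-2 day^-1 biomass accumulation")  # 26.8
```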