10 results for Turbulence -- Computer simulation

in National Center for Biotechnology Information - NCBI


Relevance:

100.00%

Publisher:

Abstract:

High-resolution video microscopy, image analysis, and computer simulation were used to study the role of the Spitzenkörper (Spk) in apical branching of ramosa-1, a temperature-sensitive mutant of Aspergillus niger. A shift to the restrictive temperature led to a cytoplasmic contraction that destabilized the Spk, causing its disappearance. After a short transition period, new Spk appeared where the two incipient apical branches emerged. Changes in cell shape, growth rate, and Spk position were recorded and transferred to the fungus simulator program to test the hypothesis that the Spk functions as a vesicle supply center (VSC). The simulation faithfully duplicated the elongation of the main hypha and the two apical branches. Elongating hyphae exhibited the growth pattern described by the hyphoid equation. During the transition phase, when no Spk was visible, the growth pattern was nonhyphoid, with consecutive periods of isometric and asymmetric expansion; the apex became enlarged and blunt before the apical branches emerged. Video microscopy images suggested that the branch Spk were formed anew by gradual condensation of vesicle clouds. Simulation exercises where the VSC was split into two new VSCs failed to produce realistic shapes, thus supporting the notion that the branch Spk did not originate by division of the original Spk. The best computer simulation of apical branching morphogenesis included simulations of the ontogeny of branch Spk via condensation of vesicle clouds. This study supports the hypothesis that the Spk plays a major role in hyphal morphogenesis by operating as a VSC—i.e., by regulating the traffic of wall-building vesicles in the manner predicted by the hyphoid model.
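The hyphoid equation referenced above has the closed form y = x·cot(xV/N), where N is the rate of vesicle release from the VSC and V is the rate of VSC advance, so the tip height approaches N/V. A minimal sketch of the profile (parameter values are hypothetical, not taken from the study):

```python
import math

def hyphoid(x, V=1.0, N=2.0):
    """Hyphoid profile y = x * cot(x * V / N); as x -> 0 the profile
    approaches the tip height N / V.  V and N values here are hypothetical."""
    return x / math.tan(x * V / N)

# Sample the profile from just behind the tip outward.
profile = [hyphoid(x) for x in (0.01, 0.5, 1.0, 2.0)]
```

The profile is widest at the tip and falls away behind it, which is the characteristic tapered hyphal shape the simulation reproduces.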

Relevance:

100.00%

Publisher:

Abstract:

Objective: To compare the cost effectiveness of two possible modifications to the current UK screening programme: shortening the screening interval from three to two years and extending the age of invitation to a final screen from 64 to 69.
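Comparisons of this kind are typically summarized as an incremental cost-effectiveness ratio: the extra cost of the modified programme per extra unit of health gain. A sketch with entirely hypothetical numbers, not figures from the study:

```python
def icer(cost_new, effect_new, cost_base, effect_base):
    """Incremental cost-effectiveness ratio: (delta cost) / (delta effect)."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical: the modified screening programme costs more but gains
# more life-years than the current one.
cost_per_life_year = icer(cost_new=1_200_000, effect_new=105.0,
                          cost_base=1_000_000, effect_base=100.0)
```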

Relevance:

100.00%

Publisher:

Abstract:

We describe a procedure for the generation of chemically accurate computer-simulation models to study chemical reactions in the condensed phase. The process involves (i) the use of a coupled semiempirical quantum and classical molecular mechanics method to represent solutes and solvent, respectively; (ii) the optimization of semiempirical quantum mechanics (QM) parameters to produce a computationally efficient and chemically accurate QM model; (iii) the calibration of a quantum/classical microsolvation model using ab initio quantum theory; and (iv) the use of statistical mechanical principles and methods to simulate, on massively parallel computers, the thermodynamic properties of chemical reactions in aqueous solution. The utility of this process is demonstrated by the calculation of the enthalpy of reaction in vacuum and free energy change in aqueous solution for a proton transfer involving methanol, methoxide, imidazole, and imidazolium, which are functional groups involved with proton transfers in many biochemical systems. An optimized semiempirical QM model is produced, which results in the calculation of heats of formation of the above chemical species to within 1.0 kcal/mol (1 kcal = 4.18 kJ) of experimental values. The use of the calibrated QM and microsolvation QM/MM (molecular mechanics) models for the simulation of a proton transfer in aqueous solution gives a calculated free energy that is within 1.0 kcal/mol (12.2 calculated vs. 12.8 experimental) of a value estimated from experimental pKa values of the reacting species.
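The "value estimated from experimental pKa values" rests on the standard relation ΔG = RT·ln(10)·ΔpKa. A sketch using standard constants (the pKa difference passed in is illustrative, not the paper's):

```python
import math

R_KCAL = 1.987e-3  # gas constant in kcal/(mol*K)

def dg_from_dpka(delta_pka, temperature=298.15):
    """Free-energy change of a proton transfer estimated from the pKa
    difference of the two acids: dG = R * T * ln(10) * delta_pKa."""
    return R_KCAL * temperature * math.log(10) * delta_pka
```

At room temperature each unit of ΔpKa is worth about 1.36 kcal/mol, which is how a pKa table converts into the free-energy benchmark the simulation is compared against.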

Relevance:

90.00%

Publisher:

Abstract:

The origin of the catalytic power of enzymes is discussed, paying attention to evolutionary constraints. It is pointed out that enzyme catalysis reflects energy contributions that cannot be determined uniquely by current experimental approaches without augmenting the analysis by computer simulation studies. The use of energy considerations and computer simulations allows one to exclude many of the popular proposals for the way enzymes work. It appears that the standard approaches used by organic chemists to catalyze reactions in solutions are not used by enzymes. This point is illustrated by considering the desolvation hypothesis and showing that it cannot account for a large increase in kcat relative to the corresponding kcage for the reference reaction in a solvent cage. The problems associated with other frequently invoked mechanisms also are outlined. Furthermore, it is pointed out that mutation studies are inconsistent with ground state destabilization mechanisms. After considering factors that were not optimized by evolution, we review computer simulation studies that reproduced the overall catalytic effect of different enzymes. These studies pointed toward electrostatic effects as the most important catalytic contributions. The nature of this electrostatic stabilization mechanism is far from being obvious because the electrostatic interaction between the reacting system and the surrounding area is similar in enzymes and in solution. However, the difference is that enzymes have a preorganized dipolar environment that does not have to pay the reorganization energy for stabilizing the relevant transition states. Apparently, the catalytic power of enzymes is stored in their folding energy in the form of the preorganized polar environment.
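The size of a kcat/kcage ratio maps onto transition-state stabilization through ΔΔG‡ = RT·ln(kcat/kcage). A sketch of that bookkeeping (plain transition-state theory, not a result from the review):

```python
import math

R_KCAL = 1.987e-3  # gas constant in kcal/(mol*K)

def barrier_lowering(rate_ratio, temperature=298.15):
    """Activation-barrier reduction implied by a rate enhancement:
    ddG = R * T * ln(k_cat / k_cage)."""
    return R_KCAL * temperature * math.log(rate_ratio)
```

A tenfold rate enhancement corresponds to only about 1.4 kcal/mol of barrier lowering, which is why large catalytic effects demand substantial electrostatic stabilization of the transition state.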

Relevance:

90.00%

Publisher:

Abstract:

This computer simulation is based on a model of the origin of life proposed by H. Kuhn and J. Waser, where the evolution of short molecular strands is assumed to take place in a distinctly structured spatiotemporal environment. In their model, the prebiotic situation is strongly simplified to grasp essential features of the evolution of the genetic apparatus without attempting to trace the historic path. By confining the computer implementation to principal aspects and focusing on critical features of the model, a deeper understanding of the model's premises is achieved. Each generation consists of three steps: (i) construction of the devices (entities exposed to selection) presently available; (ii) selection; and (iii) multiplication of the isolated strands (R oligomers) by complementary copying, with occasional variation by copying mismatch. In the beginning, the devices are single strands with random sequences; later, increasingly complex aggregates of strands form devices, and in favorable cases a hairpin-assembler device develops. A monomers interlink by binding to the hairpin-assembler device, and a translation machinery, called the hairpin-assembler-enzyme device, emerges, which translates the sequence of R1 and R2 monomers in the assembler strand into the sequence of A1 and A2 monomers in the A oligomer, which works as an enzyme.
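Step (iii), complementary copying with occasional mismatch, can be sketched for two complementary R-monomer types (the labels and the error model are simplifications, not the authors' implementation):

```python
import random

COMPLEMENT = {"R1": "R2", "R2": "R1"}  # two complementary monomer types

def copy_strand(strand, error_rate, rng):
    """Produce the complementary copy of a strand; with probability
    error_rate a position is filled with a random monomer instead
    (copying mismatch, the source of variation in the model)."""
    copy = []
    for monomer in strand:
        if rng.random() < error_rate:
            copy.append(rng.choice(["R1", "R2"]))  # mismatch
        else:
            copy.append(COMPLEMENT[monomer])
    return copy

rng = random.Random(0)
parent = ["R1", "R2", "R2", "R1"]
faithful = copy_strand(parent, error_rate=0.0, rng=rng)
```

With a zero error rate, copying twice recovers the original sequence; a small nonzero rate supplies the variation on which selection acts.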

Relevance:

80.00%

Publisher:

Abstract:

Heparinase I from Flavobacterium heparinum has important uses for elucidating the complex sequence heterogeneity of heparin-like glycosaminoglycans (HLGAGs). Understanding the biological function of HLGAGs has been impaired by the limited methods for analysis of pure or mixed oligosaccharide fragments. Here, we use methodologies involving MS and capillary electrophoresis to investigate the sequence of events during heparinase I depolymerization of HLGAGs. In an initial step, heparinase I preferentially cleaves exolytically at the nonreducing terminal linkage of the HLGAG chain, although it also cleaves internal linkages at a detectable rate. In a second step, heparinase I has a strong preference for cleaving the same substrate molecule processively, i.e., to cleave the next site toward the reducing end of the HLGAG chain. Computer simulation showed that the experimental results presented here from analysis of oligosaccharide degradation were consistent with literature data for degradation of polymeric HLGAG by heparinase I. This study presents direct evidence for a predominantly exolytic and processive mechanism of depolymerization of HLGAG by heparinase I.
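The mostly-exolytic, partly-endolytic picture can be caricatured with a toy fragment-size simulation (chain length and probabilities are hypothetical; this is not the authors' analysis):

```python
import random

def depolymerize(length, p_internal, rng):
    """Fragment sizes released from one chain.  With probability
    1 - p_internal the enzyme cleaves the nonreducing terminal linkage
    (exolytic, processive); otherwise it cuts a random internal linkage
    and the nonreducing-side piece is released as a product."""
    fragments = []
    remaining = length
    while remaining > 1:
        if rng.random() < p_internal:
            cut = rng.randrange(1, remaining)  # endolytic cut
            fragments.append(cut)
            remaining -= cut
        else:
            fragments.append(1)                # exolytic: terminal unit off
            remaining -= 1
    fragments.append(remaining)
    return fragments

rng = random.Random(0)
purely_exolytic = depolymerize(10, p_internal=0.0, rng=rng)
```

A purely exolytic enzyme yields only unit-length products, whereas admitting internal cuts produces the mixed fragment-size distributions that the MS and capillary electrophoresis data discriminate between.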

Relevance:

80.00%

Publisher:

Abstract:

Protein folding occurs on a time scale ranging from milliseconds to minutes for a majority of proteins. Computer simulation of protein folding, from a random configuration to the native structure, is nontrivial owing to the large disparity between the simulation and folding time scales. As an effort to overcome this limitation, simple models with idealized protein subdomains, e.g., the diffusion–collision model of Karplus and Weaver, have gained some popularity. We present here new results for the folding of a four-helix bundle within the framework of the diffusion–collision model. Even with such simplifying assumptions, a direct application of standard Brownian dynamics methods would consume 10,000 processor-years on current supercomputers. We circumvent this difficulty by invoking a special Brownian dynamics simulation. The method features the calculation of the mean passage time of an event from the flux overpopulation method and the sampling of events that lead to productive collisions even if their probability is extremely small (because of large free-energy barriers that separate them from the higher probability events). Using these developments, we demonstrate that a coarse-grained model of the four-helix bundle can be simulated in several days on current supercomputers. Furthermore, such simulations yield folding times that are in the range of time scales observed in experiments.
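The flux-overpopulation machinery itself is beyond a short sketch, but the quantity it estimates, a mean passage time, can be illustrated with naive Brownian dynamics in one dimension, where the answer is known analytically (L²/2D for a reflecting wall at 0 and an absorbing target at L). This toy estimate is not the authors' method:

```python
import math
import random

def mfpt_1d(D=1.0, L=1.0, dt=1e-3, trials=500, seed=0):
    """Mean first-passage time of a Brownian particle started at a
    reflecting wall (x = 0) to an absorbing boundary (x = L),
    estimated from Euler-Maruyama trajectories.
    Exact value for this geometry: L**2 / (2 * D)."""
    rng = random.Random(seed)
    step = math.sqrt(2.0 * D * dt)
    total = 0.0
    for _ in range(trials):
        x, t = 0.0, 0.0
        while x < L:
            x += step * rng.gauss(0.0, 1.0)
            if x < 0.0:
                x = -x  # reflect at the wall
            t += dt
        total += t
    return total / trials
```

The brute-force cost of such direct sampling is exactly why rare, high-barrier events (the productive collisions in the diffusion–collision model) require the specialized sampling the abstract describes.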

Relevance:

80.00%

Publisher:

Abstract:

Whole-genome duplication approximately 10^8 years ago was proposed as an explanation for the many duplicated chromosomal regions in Saccharomyces cerevisiae. Here we have used computer simulations and analytic methods to estimate some parameters describing the evolution of the yeast genome after this duplication event. Computer simulation of a model in which 8% of the original genes were retained in duplicate after genome duplication, and 70–100 reciprocal translocations occurred between chromosomes, produced arrangements of duplicated chromosomal regions very similar to the map of real duplications in yeast. An analytical method produced an independent estimate of 84 map disruptions. These results imply that many smaller duplicated chromosomal regions exist in the yeast genome in addition to the 55 originally reported. We also examined the possibility of determining the original order of chromosomal blocks in the ancestral unduplicated genome, but this cannot be done without information from one or more additional species. If the genome sequence of one other species (such as Kluyveromyces lactis) were known it should be possible to identify 150–200 paired regions covering the whole yeast genome and to reconstruct approximately two-thirds of the original order of blocks of genes in yeast. Rates of interchromosome translocation in yeast and mammals appear similar despite their very different rates of homologous recombination per kilobase.
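The simulated scenario (duplicate the genome, then scramble it with reciprocal translocations) can be sketched as follows. Chromosome and block counts are illustrative, the 92% gene-loss step is omitted for brevity, and only the duplication-plus-translocation idea comes from the text:

```python
import random

def reciprocal_translocation(genome, rng):
    """Swap the tails of two randomly chosen chromosomes at random breakpoints."""
    i, j = rng.sample(range(len(genome)), 2)
    bi = rng.randrange(len(genome[i]) + 1)
    bj = rng.randrange(len(genome[j]) + 1)
    genome[i], genome[j] = (genome[i][:bi] + genome[j][bj:],
                            genome[j][:bj] + genome[i][bi:])

rng = random.Random(0)
# Ancestral genome: 8 chromosomes of 10 gene blocks (sizes are illustrative).
ancestral = [[f"c{c}b{b}" for b in range(10)] for c in range(8)]
genome = [list(chrom) for chrom in ancestral + ancestral]  # whole-genome duplication
for _ in range(84):  # the analytic map-disruption estimate quoted in the text
    reciprocal_translocation(genome, rng)
```

Each translocation conserves gene content while breaking up the long duplicated regions, which is what fragments the original 16 mirror-image chromosomes into the many smaller duplicated blocks observed in the real map.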

Relevance:

80.00%

Publisher:

Abstract:

In the maximum parsimony (MP) and minimum evolution (ME) methods of phylogenetic inference, evolutionary trees are constructed by searching for the topology that shows the minimum number of mutational changes required (M) and the smallest sum of branch lengths (S), respectively, whereas in the maximum likelihood (ML) method the topology showing the highest maximum likelihood (A) of observing a given data set is chosen. However, the theoretical basis of the optimization principle remains unclear. We therefore examined the relationships of M, S, and A for the MP, ME, and ML trees with those for the true tree by using computer simulation. The results show that M and S are generally greater for the true tree than for the MP and ME trees when the number of nucleotides examined (n) is relatively small, whereas A is generally lower for the true tree than for the ML tree. This finding indicates that the optimization principle tends to give incorrect topologies when n is small. To deal with this disturbing property of the optimization principle, we suggest that more attention should be given to testing the statistical reliability of an estimated tree rather than to finding the optimal tree with excessive efforts. When a reliability test is conducted, simplified MP, ME, and ML algorithms such as the neighbor-joining method generally give conclusions about phylogenetic inference very similar to those obtained by the more extensive tree search algorithms.
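The quantity M minimized by MP can be computed for any fixed topology with Fitch's small-parsimony algorithm; a minimal single-site sketch on a toy four-taxon tree (not the simulation setup of the study, which searches over topologies):

```python
def fitch_score(node, states):
    """Fitch (1971) small parsimony: minimum number of state changes
    needed on a rooted binary tree for one character.
    node: a leaf name (str) or a pair (left, right); states: leaf -> state."""
    def walk(n):
        if isinstance(n, str):
            return {states[n]}, 0
        left_set, left_m = walk(n[0])
        right_set, right_m = walk(n[1])
        if left_set & right_set:
            return left_set & right_set, left_m + right_m
        return left_set | right_set, left_m + right_m + 1
    return walk(node)[1]

tree = (("human", "chimp"), ("mouse", "rat"))
m = fitch_score(tree, {"human": "T", "chimp": "T", "mouse": "G", "rat": "G"})
```

Summing this score over sites gives the tree length M for one topology; the MP method then picks the topology with the smallest total, which is exactly the optimization step the abstract argues can mislead when n is small.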

Relevance:

80.00%

Publisher:

Abstract:

The proneural genes encode basic-helix–loop–helix (bHLH) proteins and promote the formation of distinct types of sensory organs. In Drosophila, two sets of proneural genes, atonal (ato) and members of the achaete–scute complex (ASC), are required for the formation of chordotonal (ch) organs and external sensory (es) organs, respectively. We assayed the production of sensory organs in transgenic flies expressing chimeric genes of ato and scute (sc), a member of ASC, and found that the information that specifies ch organs resides in the bHLH domain of ato; chimeras containing the b domain of ato and the HLH domain of sc also induced ch organ formation, but to a lesser extent than those containing the bHLH domain of ato. The b domains of ato and sc differ in seven residues. Mutations of these seven residues in the b domain of ato suggest that most or perhaps all of these residues are required for induction of ch organs. None of these seven residues is predicted to contact DNA directly by computer simulation using the structure of the myogenic factor MyoD as a model, implying that interaction of ato with other cofactors is likely to be involved in neuronal type specification.