958 results for Computer-simulations


Relevance:

60.00%

Publisher:

Abstract:

The ultimate goal of this thesis is an understanding of the phase diagram of hydrogen at ultrahigh pressures, ranging from insulating H2 to metallic H. Because ultrahigh-pressure conditions are difficult to create in the laboratory, computer simulations are an important alternative tool of investigation. Such calculations are, however, a major challenge. One of the biggest problems is the accurate evaluation of the Born-Oppenheimer potential, which must be adequate for both the insulating and the metallic phase, and which must also account for the strong correlations induced by the covalent H2 bonds and by the possible phase transitions. Our efforts have been directed at this problem. In the context of Variational Monte Carlo (VMC), the Shadow Wave Function (SWF) is a very promising option. Owing to its flexibility in describing both localized and delocalized systems, and its ability to capture high-order correlations, it is an ideal candidate for our purposes. Unfortunately, its formulation entails a sign problem, which limits its applicability. Nevertheless, this difficulty can be circumvented by fixing the nodal structure a priori. With this formalism we were able to significantly improve the description of the electronic structure of hydrogen, which offers a very promising perspective. In the course of this research we also investigated the nature of the sign problem affecting the SWF, gaining a deeper understanding of its origin. This thesis is divided into four chapters. The first chapter introduces VMC and the SWF, with particular emphasis on fermionic systems. Chapter 2 reviews the literature on the phase diagram of hydrogen at ultrahigh pressure. The third chapter presents the implementation of our VMC program and the results obtained. Finally, Chapter 4 summarizes our efforts toward solving the sign problem associated with the SWF.
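The abstract outlines the variational Monte Carlo machinery only briefly. As a minimal illustration of the core VMC idea (Metropolis sampling of |ψ|² and averaging the local energy), not of the Shadow Wave Function itself, the following sketch estimates the ground-state energy of a 1D harmonic oscillator with a Gaussian trial wave function; the trial exponents and sampling parameters are illustrative choices.

```python
import numpy as np

# Minimal Variational Monte Carlo sketch: 1D harmonic oscillator (hbar = m = omega = 1).
# Trial wave function psi(x) = exp(-alpha * x^2); its local energy is
# E_L(x) = alpha + x^2 * (1/2 - 2*alpha^2), from -psi''/(2*psi) + x^2/2.
rng = np.random.default_rng(0)

def local_energy(x, alpha):
    return alpha + x**2 * (0.5 - 2.0 * alpha**2)

def vmc_energy(alpha, n_steps=200_000, step=1.0):
    x, energies = 0.0, []
    for i in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Metropolis acceptance with probability |psi(x_new)/psi(x)|^2
        if rng.random() < np.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        if i > n_steps // 10:          # discard equilibration steps
            energies.append(local_energy(x, alpha))
    return np.mean(energies)

for alpha in (0.3, 0.5, 0.7):          # alpha = 0.5 is the exact ground state (E = 0.5)
    print(alpha, vmc_energy(alpha))
```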

Relevance:

60.00%

Publisher:

Abstract:

In recent years, several previously unknown phenomena have been observed experimentally, such as the existence of distinct pre-nucleation structures. These observations have contributed to a new understanding of the molecular-scale processes that occur during the nucleation and growth of crystals. The impact of such pre-nucleation structures on biomineralization is not yet sufficiently understood, and there are many conceivable mechanisms by which biomolecular modifiers such as peptides could interact with pre-nucleation structures and thereby influence mineral nucleation. Molecular simulations are well suited to analyzing the formation of pre-nucleation structures in the presence of modifiers. This thesis describes an approach for analyzing the interaction of peptides with the dissolved constituents of the emerging crystals by means of molecular dynamics simulations.

To enable informative simulations, the quality of existing force fields was first examined with respect to their description of oligoglutamates interacting with calcium ions in aqueous solution. Large discrepancies between established force fields emerged, and none of the force fields examined provided a realistic description of the ion pairing of these complex ions. A strategy for optimizing existing biomolecular force fields in this respect was therefore developed. Relatively small changes to the parameters governing the ion-peptide van der Waals interactions were sufficient to obtain a reliable model for the system under study.

Comprehensive sampling of the phase space of these systems is particularly challenging because of the many degrees of freedom and the strong interactions between calcium ions and glutamate in solution. The biasing potential replica exchange molecular dynamics method was therefore tuned for the sampling of oligoglutamates, and peptides of several chain lengths were simulated in the presence of calcium ions. Using sketch-map analysis, numerous stable ion-peptide complexes that could influence the formation of pre-nucleation structures were identified in the simulations. Depending on the chain length of the peptide, these complexes exhibit characteristic distances between the calcium ions, which resemble some of the calcium-calcium distances in those phases of calcium oxalate crystals that grow in the presence of oligoglutamates. This analogy between the calcium-ion distances in dissolved ion-peptide complexes and in calcium oxalate crystals may point to a role for ion-peptide complexes in the nucleation and growth of biominerals, and offers a possible explanation for the experimentally observed ability of oligoglutamates to influence the phase of the forming crystal.
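The biasing potential variant used in the thesis is not spelled out in the abstract. As a minimal sketch of the underlying replica exchange idea, the following shows the standard Metropolis criterion for swapping configurations between two temperature replicas; the energies, temperatures, and units are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
k_B = 0.0083145  # Boltzmann constant, kJ/(mol*K)

def try_swap(E_i, T_i, E_j, T_j):
    """Metropolis criterion for exchanging configurations between two
    temperature replicas: accept with probability min(1, exp(delta)),
    where delta = (1/kT_i - 1/kT_j) * (E_i - E_j)."""
    delta = (1.0 / (k_B * T_i) - 1.0 / (k_B * T_j)) * (E_i - E_j)
    return rng.random() < min(1.0, np.exp(delta))

# Illustrative: a low-energy configuration at 300 K vs. a neighbor at 320 K.
print(try_swap(E_i=-500.0, T_i=300.0, E_j=-480.0, T_j=320.0))
```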

Relevance:

60.00%

Publisher:

Abstract:

The problem of localizing a scatterer, which represents a tumor, in a homogeneous circular domain, which represents a breast, is addressed. A breast imaging method based on microwaves is considered. Microwave imaging encompasses several techniques for detecting, localizing, and characterizing tumors in breast tissue, and all of them involve an electromagnetic inverse scattering problem. For scatterer detection, an algorithm based on a linear solution procedure, inspired by the MUltiple SIgnal Classification (MUSIC) algorithm and the Time Reversal (TR) method, is implemented. The algorithm returns a reconstructed image of the investigation domain, called the pseudospectrum, in which the scatterer position is detected. A preliminary analysis of the algorithm's performance as a function of the working frequency is carried out: the resolution and the signal-to-noise ratio of the pseudospectra improve when a multi-frequency approach is adopted. The Geometrical Mean MUSIC algorithm (GM-MUSIC) is proposed as the multi-frequency method. The performance of GM-MUSIC is tested in several realistic computer simulations. The analysis shows that the algorithm detects the scatterer only as long as the electrical parameters of the breast are known. This is an evident limitation, since in a real-life situation the anatomy of the breast is unknown. An improvement to GM-MUSIC is therefore proposed: the Eye-GMMUSIC algorithm, which needs no a priori information on the electrical parameters of the breast. It is an optimization algorithm based on pattern search: it searches for the breast parameters that minimize the Signal-to-Clutter Mean Ratio (SCMR) of the signal. Finally, the GM-MUSIC and Eye-GMMUSIC algorithms are tested on a microwave breast cancer detection system consisting of a dipole antenna, a Vector Network Analyzer, and a novel breast phantom built at the University of Bologna. The reconstruction of the experimental data confirms the ability of GM-MUSIC to localize a scatterer in a homogeneous medium.
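The paper applies the MUSIC idea to scattering data; as a simpler, self-contained illustration of how a MUSIC pseudospectrum is built from a noise subspace, the sketch below runs the classic narrowband array-processing variant on synthetic data. The array size, source angles, and noise level are arbitrary choices, not the paper's setup.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(2)

# Classic narrowband MUSIC on a uniform linear array (half-wavelength spacing).
M, snapshots, true_doas = 8, 200, np.deg2rad([-20.0, 35.0])

def steering(theta):
    # Array response for a plane wave arriving from angle theta.
    return np.exp(1j * np.pi * np.arange(M) * np.sin(theta))

# Synthesize snapshots: two uncorrelated sources plus white noise.
A = np.column_stack([steering(t) for t in true_doas])
S = rng.standard_normal((2, snapshots)) + 1j * rng.standard_normal((2, snapshots))
N = rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots))
X = A @ S + 0.1 * N

R = X @ X.conj().T / snapshots                 # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(R)           # eigenvalues in ascending order
En = eigvecs[:, : M - 2]                       # noise subspace (M - n_sources vectors)

grid = np.deg2rad(np.linspace(-90, 90, 721))
pseudo = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2 for t in grid])

peaks, _ = find_peaks(pseudo)                  # pseudospectrum peaks mark the sources
best = peaks[np.argsort(pseudo[peaks])[-2:]]
print(np.rad2deg(grid[best]))                  # close to -20 and 35 degrees
```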

Relevance:

60.00%

Publisher:

Abstract:

Transient trapping is a recently introduced mechanism of on-line sample concentration and separation. It involves the injection of a short plug of micellar solution in front of the sample, making it similar to sweeping in partial-filling MEKC. Here, we examine the mechanism of transient trapping by means of computer simulations and compare it to sweeping in MEKC for two analytes, sulforhodamine B and sulforhodamine 101. The simulation results confirm the mechanism for concentration and separation originally proposed. The mechanism for concentration is similar to sweeping, in that the analytes are picked up and accumulated by the micelles that penetrate the sample zone. The mechanism for separation, however, is quite distinctive: the concentrated analytes are trapped for a few seconds at the sample/micelle boundary, are released as the micelle concentration is reduced by electromigration dispersion, and then separate down the resulting micelle gradient. The simulation results suggest that a significant contribution to band broadening arises from the micelle gradient, with shallower gradients producing broader peaks. This is offset, however, by an increase in selectivity, so that resolution is enhanced even though the peaks are broader. Transient trapping achieved resolution similar to that obtained by sweeping MEKC in one tenth of the time and one quarter of the capillary length, which results in a two- to threefold increase in sensitivity.
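The simulation software used in the study is not named in the abstract. The toy sketch below illustrates only the final step of the proposed mechanism, separation down a micelle gradient: the effective analyte velocity is a retention-factor-weighted average of the aqueous and micellar velocities, so analytes with different binding constants travel at different speeds once the micelle concentration varies along the capillary. All parameter values are invented for illustration.

```python
import numpy as np

# Toy illustration (not the paper's simulation model): in MEKC the effective
# analyte velocity is v_eff = (v_aq + k * v_mc) / (1 + k), where the retention
# factor k is proportional to the local micelle concentration. On a spatial
# micelle gradient, analytes with different binding constants separate.
v_aq, v_mc = 2.0, 0.2            # illustrative aqueous and micellar velocities, mm/s

def micelle_conc(x):
    # Illustrative decaying micelle gradient along the capillary (a.u.).
    return np.clip(1.0 - 0.05 * x, 0.0, None)

def v_eff(x, K):
    k = K * micelle_conc(x)      # retention factor follows the local concentration
    return (v_aq + k * v_mc) / (1.0 + k)

# Two analytes starting together, with different micelle binding constants.
x = {"weak binder": 0.0, "strong binder": 0.0}
K = {"weak binder": 2.0, "strong binder": 8.0}
dt = 0.01
for _ in range(2000):            # integrate dx/dt = v_eff(x) for 20 s
    for name in x:
        x[name] += v_eff(x[name], K[name]) * dt
print(x)                         # the strong binder lags: the zones separate
```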

Relevance:

60.00%

Publisher:

Abstract:

Nowadays, computer simulation is used in many fields, particularly in laboratories, where it allows the exploration of data that are sometimes experimentally inaccessible. In less developed countries, where up-to-date laboratories for practical lessons in chemistry are lacking, especially in secondary schools and some institutions of higher learning, it can allow learners to carry out experiments such as titrations without laboratory materials and equipment. Computer simulations may also help teachers better explain the realities of practical lessons, given that computers have become very accessible and inexpensive compared with the acquisition of laboratory materials and equipment. This work is aimed at producing a virtual laboratory that permits the simulation of an acid-base titration and an oxidation-reduction titration using synthetic images. To this end, an appropriate numerical method was used to derive suitable flowcharts, which were then transcribed into source code in a programming language to produce the software.
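The abstract does not specify the numerical method used. As a hedged sketch of what such a virtual titration might compute, the following derives the pH curve for a strong acid/strong base titration directly from the charge balance, which reduces to a quadratic in [H+]; the concentrations and volumes are illustrative.

```python
import numpy as np

# Minimal sketch (the thesis' own numerical method is not specified): pH curve
# for titrating a strong acid with a strong base, from the exact charge balance
# [H+] + [Na+] = [OH-] + [Cl-], which gives a quadratic in [H+].
Kw = 1.0e-14
c_acid, v_acid = 0.10, 25.0       # mol/L HCl, mL in the flask
c_base = 0.10                     # mol/L NaOH in the burette

def ph_after(v_base):
    v_tot = v_acid + v_base
    cl = c_acid * v_acid / v_tot  # total chloride (from the acid)
    na = c_base * v_base / v_tot  # total sodium (from the titrant)
    d = na - cl                   # net base excess
    # Charge balance -> h^2 + d*h - Kw = 0; keep the positive root.
    h = (-d + np.sqrt(d * d + 4.0 * Kw)) / 2.0
    return -np.log10(h)

for v in (0.0, 12.5, 24.9, 25.0, 25.1, 37.5):
    print(f"{v:5.1f} mL NaOH -> pH {ph_after(v):5.2f}")   # sharp jump at 25.0 mL
```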

Relevance:

60.00%

Publisher:

Abstract:

In the modern life and medical sciences, major efforts are currently concentrated on creating artificial photoenzymes consisting of light-oxygen-voltage-sensitive (LOV) domains fused to a target enzyme. Such protein constructs possess great potential for controlling cell metabolism as well as gene function upon light stimulus. This has recently been impressively demonstrated by the design of a novel artificial fusion protein connecting the AsLOV2-Jα photosensor from Avena sativa with the Rac1 GTPase (AsLOV2-Jα-Rac1), which was used to control the motility of cancer cells from the HeLa line. Although tremendous progress has been achieved in generating such protein constructs, a detailed understanding of their signaling pathway after photoexcitation is still in its infancy. Here, we show through computer simulations of the AsLOV2-Jα-Rac1 photoenzyme that the early processes after formation of the Cys450-FMN adduct involve the breakage of a hydrogen bond between the carbonyl oxygen FMN-C4O and the amino group of Gln513, followed by a rotational reorientation of the Gln513 side chain. This initial event is followed by successive events, including β-sheet tightening and the transmission of torsional stress along the Iβ-sheet, which disrupts the Jα-helix from the N-terminal end. Finally, this process triggers the detachment of the AsLOV2-Jα photosensor from the Rac1 GTPase, ultimately enabling the activation of Rac1 via binding of the effector protein PAK1.
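A hedged sketch of how such an event might be detected in practice: tracking the donor-acceptor distance of the FMN-C4O...Gln513 hydrogen bond across an MD trajectory with the MDAnalysis library. The file names, residue numbering, and atom names are placeholders, not the actual system files of the study.

```python
import MDAnalysis as mda
import numpy as np

# Hypothetical sketch: track the FMN C4=O ... Gln513 NE2 hydrogen-bond distance
# over an MD trajectory to detect its rupture after adduct formation. The file
# names, residue numbers, and atom names below are placeholders; adapt them to
# the actual topology of the simulated AsLOV2-Ja-Rac1 system.
u = mda.Universe("photoenzyme.psf", "photoenzyme.dcd")   # placeholder files
acceptor = u.select_atoms("resname FMN and name O4")     # FMN C4 carbonyl oxygen
donor = u.select_atoms("resid 513 and name NE2")         # Gln513 amide nitrogen

distances = []
for ts in u.trajectory:
    d = np.linalg.norm(acceptor.positions[0] - donor.positions[0])
    distances.append(d)

# A sustained jump above ~3.5 Angstrom would indicate the H-bond has broken.
print(np.mean(distances), np.max(distances))
```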

Relevance:

60.00%

Publisher:

Abstract:

There are numerous statistical methods for quantitative trait linkage analysis in human studies. An ideal method would have high power to detect genetic loci contributing to the trait, would be robust to non-normality in the phenotype distribution, would be appropriate for general pedigrees, would allow the incorporation of environmental covariates, and would be appropriate in the presence of selective sampling. We recently described a general framework for quantitative trait linkage analysis, based on generalized estimating equations, of which many current methods are special cases. This procedure is appropriate for general pedigrees and easily accommodates environmental covariates. In this paper, we use computer simulations to investigate the power and robustness of a variety of linkage test statistics built upon our general framework. We also propose two novel test statistics that take account of higher moments of the phenotype distribution in order to accommodate non-normality. These new linkage tests are shown to have high power and to be robust to non-normality. While we have not yet examined the performance of our procedures in the context of selective sampling via computer simulations, the proposed tests satisfy all of the other qualities of an ideal quantitative trait linkage analysis method.
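The GEE-based statistics themselves are not given in the abstract. As a simpler, classical illustration of the same task (testing quantitative trait linkage in sib pairs), the sketch below implements Haseman-Elston regression on simulated data; it is a stand-in for the idea, not the authors' framework.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Classic Haseman-Elston regression: squared sib-pair trait differences are
# regressed on the estimated proportion of alleles shared identical by descent
# (IBD) at the test locus; a significantly negative slope indicates linkage.
n_pairs = 500
pi_hat = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])

# Simulate a QTL: sibs sharing more alleles IBD get more similar phenotypes.
shared = rng.standard_normal(n_pairs)
y1 = np.sqrt(pi_hat) * shared + np.sqrt(1 - pi_hat) * rng.standard_normal(n_pairs)
y2 = np.sqrt(pi_hat) * shared + np.sqrt(1 - pi_hat) * rng.standard_normal(n_pairs)

res = stats.linregress(pi_hat, (y1 - y2) ** 2)
print(f"slope = {res.slope:.3f} (expected negative under linkage), "
      f"one-sided p = {res.pvalue / 2:.2e}")
```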

Relevance:

60.00%

Publisher:

Abstract:

Comments on an article by Kashima et al. (see record 2007-10111-001). In their target article, Kashima and colleagues try to show how a connectionist conceptualization of the self is best suited to capture the self's temporal and socio-culturally contextualized nature. They propose a new model and, to support it, conduct computer simulations of psychological phenomena whose importance for the self has long been clear even if not formally modeled, such as imitation and the learning of sequence and narrative. As we explicated when we advocated connectionist models as a metaphor for the self in Mischel and Morf (2003), we fully endorse the utility of such a metaphor, as these models have some of the processing characteristics necessary for capturing key aspects and functions of a dynamic cognitive-affective self-system. As elaborated in that chapter, we see their principal strength in the fact that connectionist models can take account of multiple simultaneous processes without invoking a single central control. All outputs reflect a distributed pattern of activation across a large number of simple processing units, the nature of which depends on (and changes with) the connection weights between the links and the satisfaction of mutual constraints across these links (Rumelhart & McClelland, 1986). This allows a simple account of why certain input features predominate at times, while others take over on other occasions. (PsycINFO Database Record (c) 2008 APA, all rights reserved)
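For readers unfamiliar with the metaphor, here is a minimal sketch (not Kashima et al.'s model) of distributed activation and mutual constraint satisfaction: a Hopfield-style network stores a pattern in its connection weights and settles back into it from a corrupted input, with no central controller.

```python
import numpy as np

rng = np.random.default_rng(4)

# Minimal Hopfield-style sketch: a distributed pattern is stored in symmetric
# connection weights, and asynchronous updates settle the network into the
# state that best satisfies the mutual constraints between units.
pattern = rng.choice([-1, 1], size=32)

W = np.outer(pattern, pattern).astype(float)   # Hebbian weights store the pattern
np.fill_diagonal(W, 0.0)                       # no self-connections

state = pattern.copy()
state[:10] *= -1                               # corrupt part of the input
for _ in range(5):                             # asynchronous constraint satisfaction
    for i in rng.permutation(len(state)):
        state[i] = 1 if W[i] @ state >= 0 else -1

print(np.array_equal(state, pattern))          # the stored pattern is recovered
```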

Relevance:

60.00%

Publisher:

Abstract:

Rising concerns about environmental pollution and global warming have stimulated research interest in hydrogen as an alternative energy source. To apply hydrogen to transportation, several issues have to be solved, among which hydrogen storage is the most critical. Many materials and devices have been developed; however, none is yet able to meet the DOE storage targets. The primary issue for hydrogen physisorption is the weak interaction between hydrogen and the surface of solid materials, resulting in negligible adsorption at room temperature. Solving this issue requires increasing the interaction between hydrogen molecules and the adsorbent surface. In this study, intrinsic electric dipoles are investigated as a means of enhancing the adsorption energy. Computer simulations of single ionic compounds forming clusters with hydrogen molecules showed that the electrical charge of a substance plays an important role in generating an attractive interaction with hydrogen molecules. To further examine the effect of electrostatic interactions on hydrogen adsorption, activated carbon with a large surface area was impregnated with various ionic salts, including LiCl, NaCl, KCl, KBr, and NiCl, and its hydrogen storage performance was evaluated using a volumetric method. Corresponding computer simulations were carried out using the DFT (Density Functional Theory) method combined with point-charge arrays. Both the experimental and the computational results show that the hydrogen adsorption capacity and the hydrogen-solid interaction increase with the electrical dipole moment. Besides intrinsic dipoles, an externally applied electric field can be another means of enhancing hydrogen adsorption. Hydrogen adsorption under an applied electric field was examined using porous nickel foil as electrodes. The electrical signals showed that the adsorption capacity increased with increasing gas pressure and external electric voltage. Direct measurement of the amount of hydrogen adsorbed was also carried out on porous nickel oxide and magnesium oxide, using the piezoelectric material PMN-PT, charged by the applied pressure, as the charge supplier. The enhancement from the PMN-PT-generated charges is evident at hydrogen pressures between 0 and 60 bar, where the hydrogen uptake increases by about 35% for nickel oxide and 25% for magnesium oxide. Computer simulation reveals that, under the external electric field, the electron cloud of a hydrogen molecule is pulled toward the adsorbent site and can overlap with the adsorbent electrons, which in turn enhances the adsorption energy. Experiments were also carried out to examine the effect of charge-induced enhancement on hydrogen spillover. The results show that the overall storage capacity of nickel oxide increased remarkably, by a factor of 4.
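A back-of-the-envelope counterpart to the simulations (not the thesis' DFT setup): the leading electrostatic enhancement for a nonpolar H2 molecule near a point charge is the charge-induced-dipole energy U = -(1/2)αE². The sketch below evaluates it for a unit charge at a few angstroms using the known H2 polarizability; the distance and charge are illustrative.

```python
import numpy as np

# Charge-induced-dipole estimate: U = -(1/2) * alpha * E^2, with the field of a
# point charge E = q / (4*pi*eps0*r^2). Values below are illustrative.
eps0 = 8.854e-12            # vacuum permittivity, F/m
e = 1.602e-19               # elementary charge, C
N_A = 6.022e23              # Avogadro constant, 1/mol
alpha_vol = 0.787e-30       # H2 polarizability volume, m^3
alpha = 4 * np.pi * eps0 * alpha_vol   # SI polarizability, C*m^2/V

def induction_energy_kJ_per_mol(q, r):
    E = q / (4 * np.pi * eps0 * r**2)          # field at distance r, V/m
    U = -0.5 * alpha * E**2                    # induced-dipole energy, J
    return U * N_A / 1000.0

# H2 at 3 Angstrom from a unit point charge: several kJ/mol of extra binding,
# on top of the few kJ/mol typical of plain physisorption.
print(induction_energy_kJ_per_mol(e, 3.0e-10))
```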

Relevance:

60.00%

Publisher:

Abstract:

This study investigated the effectiveness of incorporating several new instructional strategies into an International Baccalaureate (IB) chemistry course in terms of how they supported high school seniors' understanding of electrochemistry. The three new methods were (a) providing opportunities for visualization of particle movement through student manipulation of physical models and interactive computer simulations, (b) explicitly addressing common misconceptions identified in the literature, and (c) teaching an algorithmic, step-wise approach for determining the products of aqueous solution electrolysis. Changes in student understanding were assessed through test scores on both internally and externally administered exams over a two-year period. It was found that visualization practice and explicit misconception instruction improved student understanding, but the effect was more apparent in the short term. The data suggested that the instruction time spent on algorithm practice was insufficient to cause a significant test-score improvement. There was, however, a substantial increase in the percentage of experimental-group students who chose to answer an optional electrochemistry-related external exam question, indicating an increase in student confidence. Implications for future instruction are discussed.
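The classroom algorithm itself is not reproduced in the abstract; the sketch below encodes the standard textbook decision rules for predicting the electrode products of aqueous electrolysis with inert electrodes, as one plausible rendering of such a step-wise approach.

```python
# Hedged sketch of a step-wise rule set for predicting the products of aqueous
# electrolysis (standard textbook rules for inert electrodes; not necessarily
# the exact algorithm taught in the study).
ACTIVE_METALS = {"Li", "Na", "K", "Ca", "Mg", "Al", "Ba", "Sr"}  # harder to reduce than water
HALIDES = {"Cl": "Cl2", "Br": "Br2", "I": "I2"}                  # oxidized before water if concentrated

def electrolysis_products(cation, anion, concentrated=False):
    # Cathode: active-metal cations are not reduced in water; H2 evolves instead.
    cathode = "H2 (from water)" if cation in ACTIVE_METALS else f"{cation} (metal deposit)"
    # Anode: a concentrated halide gives the halogen; otherwise water is oxidized to O2.
    anode = HALIDES[anion] if anion in HALIDES and concentrated else "O2 (from water)"
    return cathode, anode

print(electrolysis_products("Na", "Cl", concentrated=True))   # H2 and Cl2 (brine)
print(electrolysis_products("Cu", "SO4"))                     # Cu deposit and O2
```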

Relevance:

60.00%

Publisher:

Abstract:

Moisture-induced distress has been the prevalent distress type affecting the deterioration of both asphalt and concrete pavement sections. While various surface techniques have been employed over the years to minimize the ingress of moisture into pavement structural sections, subsurface drainage components such as open-graded base courses remain the best alternative for minimizing the time that pavement structural sections are exposed to saturated conditions. This research therefore focuses on assessing the performance and cost-effectiveness of pavement sections containing both treated and untreated open-graded aggregate base materials. Three common roadway aggregates, two virgin aggregates and one recycled aggregate, were investigated using four open gradations and two binder types. Laboratory tests were conducted to determine the hydraulic, mechanical, and durability characteristics of treated and untreated open-graded mixes made from the three aggregate types. The results of the experimental program show that, for the same gradation and mix design, limestone samples have the greatest drainage capacity, stability under traffic loads, and resistance to degradation from environmental conditions such as freeze-thaw. However, depending on the gradation and mix design used, all three aggregate types, namely limestone, natural gravel, and recycled concrete, can meet the minimum coefficient of hydraulic conductivity required for good drainage in most pavements. Test results for both asphalt- and cement-treated open-graded samples indicate that an air void content in the range of 15-25% will produce a treated open-graded base course with sufficient drainage capacity and long-term stability under both traffic and environmental loads. Using the new Mechanistic-Empirical Pavement Design Guide (MEPDG) software, computer simulations of pavement performance were conducted on sections containing these open-graded aggregate base materials to determine whether the MEPDG-predicted pavement performance is sensitive to drainage. Using three truck traffic levels and four climatic regions, the simulations indicate that the predicted performance was not sensitive to the drainage characteristics of the open-graded base course. Based on the MEPDG-predicted pavement performance, the cost-effectiveness of the pavement sections with open-graded bases was computed on the assumption that the increased service life of these sections was attributable to the positive effects of subsurface drainage. The two cost analyses gave contrasting results, one indicating that the inclusion of open-graded base courses can lead to substantial savings.
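As a reminder of the quantity the laboratory tests target, the sketch below applies Darcy's law to estimate the drainage capacity of an open-graded base layer; the conductivity, slope, and thickness values are illustrative, not the study's data.

```python
# Rough sketch (illustrative values, not the study's data): Darcy's law gives
# the drainage capacity of an open-graded base layer as q = k * i * A, where
# k is the coefficient of hydraulic conductivity, i the hydraulic gradient,
# and A the flow cross-section per unit width of pavement.
k = 350.0          # hydraulic conductivity, m/day (plausible open-graded base)
slope = 0.02       # resultant slope of the base layer (hydraulic gradient)
thickness = 0.10   # base layer thickness, m

q = k * slope * thickness          # discharge per metre of pavement width, m^3/day/m
print(f"drainage capacity: {q:.2f} m^3/day per metre of pavement width")
```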

Relevance:

60.00%

Publisher:

Abstract:

In the laboratory of Dr. Dieter Jaeger at Emory University, we use computer simulations to study how the biophysical properties of neurons—including their three-dimensional structure, passive membrane resistance and capacitance, and active membrane conductances generated by ion channels—affect the way that the neurons transfer synaptic inputs into the action potential streams that represent their output. Because our ultimate goal is to understand how neurons process and relay information in a living animal, we try to make our computer simulations as realistic as possible. As such, the computer models reflect the detailed morphology and all of the ion channels known to exist in the particular neuron types being simulated, and the model neurons are tested with synaptic input patterns that are intended to approximate the inputs that real neurons receive in vivo. The purpose of this workshop tutorial was to explain what we mean by ‘in vivo-like’ synaptic input patterns, and how we introduce these input patterns into our computer simulations using the freely available GENESIS software package (http://www.genesis-sim.org/GENESIS). The presentation was divided into four sections: first, an explanation of what we are talking about when we refer to in vivo-like synaptic input patterns
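As a concrete example of the simplest 'in vivo-like' input, the sketch below generates a homogeneous Poisson spike train, the usual first approximation to the irregular synaptic bombardment a neuron receives in vivo; in a GENESIS simulation such spike times would typically be fed to the model via a timetable element. The rate and duration are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Homogeneous Poisson spike train: exponentially distributed inter-spike
# intervals (ISIs) give irregular, 'in vivo-like' input timing with ISI
# coefficient of variation close to 1.
def poisson_spike_train(rate_hz, duration_s):
    """Draw spike times by accumulating exponentially distributed ISIs."""
    isis = rng.exponential(1.0 / rate_hz, size=int(rate_hz * duration_s * 2))
    times = np.cumsum(isis)
    return times[times < duration_s]

spikes = poisson_spike_train(rate_hz=20.0, duration_s=5.0)
isi = np.diff(spikes)
print(len(spikes), "spikes; ISI CV ~ 1:", np.std(isi) / np.mean(isi))
```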

Relevance:

60.00%

Publisher:

Abstract:

A change in synaptic strength arising from the activation of two neuronal pathways at approximately the same time is a form of associative plasticity and may underlie classical conditioning. Previously, a cellular analog of a classical conditioning protocol was demonstrated to produce short-term associative plasticity at the connections between sensory and motor neurons in Aplysia. A similar training protocol produced long-term (24-hour) enhancement of excitatory postsynaptic potentials (EPSPs): EPSPs produced by sensory neurons in which activity was paired with a reinforcing stimulus were significantly larger than unpaired controls 24 hours after training. To examine whether the associative plasticity observed at these synapses may be involved in higher-order forms of classical conditioning, a neural analog of contingency was developed. In addition, computer simulations were used to analyze whether the associative plasticity observed in Aplysia could, in theory, account for second-order conditioning and blocking.
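The dissertation's neural-analog simulations are not reproduced here; as a compact illustration that blocking can emerge from a simple associative rule, the sketch below runs the Rescorla-Wagner model through a standard blocking protocol.

```python
# Illustrative sketch (not the dissertation's neural model): the Rescorla-Wagner
# rule reproduces blocking, one of the higher-order phenomena examined in the
# simulations. On each trial, delta_V = alpha*beta * (lambda - V_total) for
# every stimulus present.
alpha_beta, lam = 0.2, 1.0
V = {"A": 0.0, "B": 0.0}          # associative strengths of stimuli A and B

def trial(stimuli):
    v_total = sum(V[s] for s in stimuli)
    for s in stimuli:
        V[s] += alpha_beta * (lam - v_total)

for _ in range(30):               # Phase 1: A alone paired with the US; V_A -> lambda
    trial(["A"])
for _ in range(30):               # Phase 2: compound AB paired with the US
    trial(["A", "B"])

print(V)                          # V_B stays near 0: prior learning on A 'blocks' B
```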

Relevance:

60.00%

Publisher:

Abstract:

I studied the apolipoprotein (apo) B 3′ variable number tandem repeat (VNTR) and carried out computer simulations of the stepwise mutation model to address four questions: (1) How did the apo B VNTR originate? (2) What is the mutational mechanism of repeat-number change at the apo B VNTR? (3) To what extent are population-level and molecular-level events responsible for the contemporary apo B allele frequency distribution? (4) Can VNTR allele frequency distributions be explained by a simple and conservative mutation-drift model? I used three general approaches: (1) I characterized the apo B VNTR region in non-human primate species; (2) I constructed haplotypes of polymorphic markers flanking the apo B VNTR in a sample of individuals from Lorraine, France, and studied the associations between the flanking-marker haplotypes and apo B VNTR size; (3) I performed computer simulations of the one-step stepwise mutation model and compared the results to real data in terms of four allele frequency distribution characteristics. From this work I conclude that the apo B VNTR originated after an initial duplication of a sequence that is still present as a single-copy sequence in New World monkey species, and not by the transposition of an array of repeats from elsewhere in the genome. It is unlikely that recombination is the primary mutational mechanism; moreover, the clustered nature of the haplotype associations implicates a stepwise mutational mechanism. The high frequencies of certain haplotype-allele size combinations make it evident that population-level events have also been important in shaping the apo B VNTR allele frequency distribution. The computer simulations of the one-step stepwise mutation model show that bimodal and multimodal allele frequency distributions are not unexpected at loci evolving via stepwise mutation. Short tandem repeat loci fit the stepwise mutation model best, followed by microsatellite loci. I therefore conclude that the mutational mechanisms of VNTR loci differ according to repeat unit size. (Abstract shortened by UMI.)
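As a minimal illustration of the kind of simulation described, the sketch below runs a one-step stepwise mutation model under Wright-Fisher drift and prints the resulting allele (repeat-count) frequency distribution, which is often bimodal or multimodal; the population size, mutation rate, and run length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# One-step stepwise mutation model under genetic drift (illustrative
# parameters): each generation, 2N allele copies are resampled with
# replacement (Wright-Fisher drift), and each copy gains or loses one repeat
# unit with probability mu.
N, mu, generations = 500, 1e-3, 20_000
alleles = np.full(2 * N, 20)                       # start: all alleles have 20 repeats

for _ in range(generations):
    alleles = rng.choice(alleles, size=2 * N)      # drift: resample gametes
    mutate = rng.random(2 * N) < mu                # stepwise mutation: +/- 1 repeat
    alleles[mutate] += rng.choice([-1, 1], size=mutate.sum())

sizes, counts = np.unique(alleles, return_counts=True)
for s, c in zip(sizes, counts):
    print(s, "#" * (60 * c // counts.max()))       # crude allele-frequency histogram
```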

Relevance:

60.00%

Publisher:

Abstract:

Primate immunodeficiency viruses, or lentiviruses (HIV-1, HIV-2, and SIV), and hepatitis delta virus (HDV) are RNA viruses characterized by rapid evolution. Infection by primate immunodeficiency viruses usually results in the development of acquired immunodeficiency syndrome (AIDS) in humans and AIDS-like illnesses in Asian macaques. Similarly, hepatitis delta virus infection causes hepatitis and liver cancer in humans. These viruses are heterogeneous within an infected patient and among individuals. Substitution rates in the virus genomes are high and vary among lineages and among sites. Methods of phylogenetic analysis were applied to study the evolution of primate lentiviruses and hepatitis delta virus, with the following results: (1) The substitution rate varies among sites of primate lentivirus genes according to the two-parameter gamma distribution, with the shape parameter α being close to 1. (2) Primate immunodeficiency viruses fall into species-specific lineages; viral transmissions across primate species are therefore not as frequent as previous authors have suggested. (3) Primate lentiviruses have acquired or lost their pathogenicity several times in the course of evolution. (4) Evidence was provided for multiple infections of a North American patient by distinct HIV-1 strains of the B subtype. (5) Computer simulations indicate that the probability of committing an error in testing HIV transmission depends on the number of virus sequences and their length, the divergence times among the sequences, and the model of nucleotide substitution. (6) For future investigations of HIV-1 transmission, using longer virus sequences and avoiding distant outgroups is recommended. (7) Hepatitis delta virus strains are usually related according to the geographic region of isolation. (8) The evolution of HDV is characterized by a synonymous substitution rate that is lower than both the nonsynonymous substitution rate and the rate of evolution of the noncoding region. (9) There is a strong preference for G and C nucleotides at the third codon positions of the HDV coding region.
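To make result (1) concrete, the sketch below draws site-specific rates from a mean-one gamma distribution with shape parameter alpha and shows how the dispersion of substitution counts across sites grows as alpha decreases; the parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Gamma-distributed rate variation among sites (illustrative parameters):
# site rates are drawn from a gamma distribution with shape alpha and mean 1,
# and substitution counts per site are then Poisson. Alpha close to 1 (as
# estimated for primate lentivirus genes) implies strong rate heterogeneity;
# large alpha approaches rate constancy across sites.
def site_substitutions(alpha, mean_subs=2.0, n_sites=1000):
    rates = rng.gamma(shape=alpha, scale=1.0 / alpha, size=n_sites)  # mean 1
    return rng.poisson(mean_subs * rates)

for alpha in (0.5, 1.0, 10.0):
    counts = site_substitutions(alpha)
    print(f"alpha={alpha:5.1f}  variance/mean of substitutions per site: "
          f"{counts.var() / counts.mean():.2f}")
```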