956 results for Computer models


Relevance: 70.00%

Publisher:

Abstract:

In the laboratory of Dr. Dieter Jaeger at Emory University, we use computer simulations to study how the biophysical properties of neurons—including their three-dimensional structure, passive membrane resistance and capacitance, and active membrane conductances generated by ion channels—affect the way that the neurons transfer synaptic inputs into the action potential streams that represent their output. Because our ultimate goal is to understand how neurons process and relay information in a living animal, we try to make our computer simulations as realistic as possible. As such, the computer models reflect the detailed morphology and all of the ion channels known to exist in the particular neuron types being simulated, and the model neurons are tested with synaptic input patterns that are intended to approximate the inputs that real neurons receive in vivo. The purpose of this workshop tutorial was to explain what we mean by ‘in vivo-like’ synaptic input patterns, and how we introduce these input patterns into our computer simulations using the freely available GENESIS software package (http://www.genesis-sim.org/GENESIS). The presentation was divided into four sections: first, an explanation of what we are talking about when we refer to in vivo-like synaptic input patterns
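
As a rough illustration of one common way such 'in vivo-like' background input is approximated (the abstract does not specify the exact statistics used), the sketch below generates Poisson-distributed spike times for a hypothetical population of excitatory and inhibitory synapses in plain Python rather than GENESIS script; in practice these times would be passed to the compartmental model's synaptic objects. All rates and synapse counts are invented for illustration.

```python
import numpy as np

def poisson_spike_train(rate_hz, duration_s, rng):
    """Homogeneous Poisson spike times (in s): draw exponential inter-spike intervals."""
    times = []
    t = rng.exponential(1.0 / rate_hz)
    while t < duration_s:
        times.append(t)
        t += rng.exponential(1.0 / rate_hz)
    return np.array(times)

rng = np.random.default_rng(0)
duration = 2.0  # seconds of simulated input

# Hypothetical in vivo-like background: many slow excitatory and fewer, faster inhibitory inputs
exc_trains = [poisson_spike_train(5.0, duration, rng) for _ in range(100)]   # 100 synapses at 5 Hz
inh_trains = [poisson_spike_train(12.0, duration, rng) for _ in range(25)]   # 25 synapses at 12 Hz

print("total excitatory spikes:", sum(len(t) for t in exc_trains))
print("total inhibitory spikes:", sum(len(t) for t in inh_trains))
```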

Relevance: 70.00%

Publisher:

Abstract:

The development of electrophoretic computer models and their use for the simulation of electrophoretic processes has increased significantly during the last few years. Recently, GENTRANS and SIMUL5 were extended with algorithms that describe chemical equilibria between solutes and a buffer additive in a fast 1:1 interaction process, an approach that enables simulation of the electrophoretic separation of enantiomers. For acidic cationic systems with sodium and H3O(+) as leading and terminating components, respectively, acetic acid as counter component, charged weak bases as samples, and a neutral CD as chiral selector, the new codes were used to investigate the dynamics of isotachophoretic adjustment of enantiomers, enantiomer separation, boundaries between enantiomers and between an enantiomer and a buffer constituent of like charge, and zone stability. The impact of leader pH, selector concentration, free mobility of the weak base, mobilities of the formed complexes, and complexation constants could thereby be elucidated. For selected examples with methadone enantiomers as analytes and (2-hydroxypropyl)-β-CD as selector, simulated zone patterns were found to compare well with those monitored experimentally in capillary setups with two conductivity detectors or with an absorbance and a conductivity detector. Simulation thus provides insight into the formation of isotachophoretic boundaries and into zone stability in the presence of complexation equilibria that was hitherto inaccessible.
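
The simulators themselves solve the full coupled transport and equilibrium equations; as a much simpler, hedged sketch of why a neutral selector can separate enantiomers at all, the snippet below evaluates the standard effective-mobility relation for fast 1:1 complexation, mu_eff = (mu_free + mu_complex*K*[C]) / (1 + K*[C]), for two enantiomers that differ only in their complexation constant. All numerical values are hypothetical and only order-of-magnitude plausible.

```python
def effective_mobility(mu_free, mu_complex, K, selector_conc):
    """Effective mobility of an analyte in fast 1:1 equilibrium with a neutral selector."""
    return (mu_free + mu_complex * K * selector_conc) / (1.0 + K * selector_conc)

# Hypothetical values loosely in the range of a weak base with a neutral CD selector
mu_free = 20e-9          # m^2/(V*s), free mobility of both enantiomers (identical)
mu_complex = 8e-9        # m^2/(V*s), mobility of the analyte-CD complex
K_R, K_S = 180.0, 230.0  # 1/M, complexation constants of the two enantiomers

for c_cd in (0.0, 2e-3, 5e-3, 10e-3):  # selector concentration in M
    mu_R = effective_mobility(mu_free, mu_complex, K_R, c_cd)
    mu_S = effective_mobility(mu_free, mu_complex, K_S, c_cd)
    print(f"[CD]={c_cd*1e3:5.1f} mM  mu_R={mu_R:.2e}  mu_S={mu_S:.2e}  delta={mu_R - mu_S:.2e}")
```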

Relevance: 70.00%

Publisher:

Abstract:

Computer models were used to examine whether, and under what conditions, the assembly of a multimeric protein complex is inhibited by high concentrations of one of its components, an effect analogous to the prozone phenomenon in precipitin tests. A series of idealized simple "ball-and-stick" structures representing small oligomeric complexes of protein molecules formed by reversible binding reactions were analyzed to determine the binding steps leading to each structure. The equilibrium state of each system was then determined over a range of starting concentrations and dissociation constants (Kd values), and the steady-state concentration of structurally complete oligomer was calculated for each situation. A strong inhibitory effect at high concentrations was shown by any protein molecule forming a bridge between two or more separable parts of the complex. By contrast, proteins linked to the outside of the complex by a single bond showed no inhibition whatsoever at any concentration. Nonbridging, multivalent proteins in the body of the complex could show an inhibitory effect or not, depending on the structure of the complex and the strength of its bonds. On the basis of this study, we suggest that the prozone phenomenon will occur widely in living cells and that it could be a crucial factor in the regulation of protein complex formation.
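
As a hedged, toy illustration of the bridging effect described above (not the authors' actual ball-and-stick models), the sketch below solves the equilibrium of a minimal three-component complex A-B-C in which B bridges A and C through two independent sites with the same Kd. Titrating B shows the prozone-like behaviour: the amount of complete complex is maximal near stoichiometric B and falls again at large excess. All concentrations and constants are hypothetical.

```python
import numpy as np
from scipy.optimize import fsolve

def complete_complex(A_tot, B_tot, C_tot, Kd):
    """Equilibrium amount of the complete A-B-C complex, where B bridges A and C.

    Toy model: B has two independent sites (one for A, one for C), both with
    dissociation constant Kd. Concentrations are in arbitrary consistent units.
    """
    K = 1.0 / Kd  # association constant

    def residuals(log_free):
        a, b, c = np.exp(log_free)           # free monomer concentrations
        ab, bc, abc = K * a * b, K * b * c, K * K * a * b * c
        return [a + ab + abc - A_tot,         # mass balance for A
                b + ab + bc + abc - B_tot,    # mass balance for B
                c + bc + abc - C_tot]         # mass balance for C

    a, b, c = np.exp(fsolve(residuals, np.log([A_tot, B_tot, C_tot]) - 1.0))
    return K * K * a * b * c

# Hypothetical titration (concentrations in uM): fix A and C, raise the bridging protein B
A_tot = C_tot = 1.0
Kd = 0.01
for B_tot in (0.1, 0.5, 1.0, 5.0, 50.0):
    abc = complete_complex(A_tot, B_tot, C_tot, Kd)
    print(f"B_tot={B_tot:6.1f}  complete complex={abc:.3f}")
```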

Relevance: 70.00%

Publisher:

Abstract:

Too often, validation of computer models is treated as a "once and forget" task. In this paper a systematic and graduated approach to evacuation model validation is suggested. This involves: (i) component testing, (ii) functional validation, (iii) qualitative validation and (iv) quantitative validation. Viewed in this manner, validation is considered an on-going activity and an integral part of the life cycle of the software. While the first three components of the validation protocol pose little or no significant difficulty, the task of quantitative validation poses a number of challenges, the most significant being a shortage of suitable experimental data. Finally, the validation protocol used in the development of the EXODUS suite of evacuation models is examined.

Relevance: 60.00%

Publisher:

Abstract:

Computer models can be combined with laboratory experiments for the efficient determination of (i) peptides that bind MHC molecules and (ii) T-cell epitopes. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures. This requires the definition of standards and experimental protocols for model application. We describe the requirements for validation and assessment of computer models. The utility of combining accurate predictions with a limited number of laboratory experiments is illustrated by practical examples. These include the identification of T-cell epitopes from IDDM-, melanoma- and malaria-related antigens by combining computational predictions with conventional laboratory assays. The success rate in determining antigenic peptides, each in the context of a specific HLA molecule, ranged from 27 to 71%, while the natural prevalence of MHC-binding peptides is 0.1-5%.

Relevance: 60.00%

Publisher:

Abstract:

Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
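
As a hedged sketch of one of the simpler methods listed above, a quantitative (position-specific scoring) matrix, the snippet below scores 9-mer peptides by summing per-position coefficients. The matrix here is randomly generated with two artificially boosted "anchor" positions, so the scores are meaningless except as an illustration of the mechanics; a real matrix would be trained on measured binding data for a specific MHC allele.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

rng = np.random.default_rng(42)
# Hypothetical 9-mer quantitative matrix: one log-odds-like coefficient per
# (position, amino acid). Values are illustrative only.
matrix = rng.normal(0.0, 1.0, size=(9, 20))
matrix[1, AA_INDEX["L"]] += 3.0   # pretend position 2 strongly prefers Leu (anchor residue)
matrix[8, AA_INDEX["V"]] += 3.0   # pretend position 9 strongly prefers Val (anchor residue)

def score_peptide(peptide, matrix):
    """Additive quantitative-matrix score of a 9-mer peptide."""
    assert len(peptide) == matrix.shape[0]
    return sum(matrix[i, AA_INDEX[aa]] for i, aa in enumerate(peptide))

# Two well-known 9-mer epitopes and a dummy sequence, used purely as example input
for pep in ("SLYNTVATL", "GILGFVFTL", "AAAAAAAAA"):
    print(pep, round(score_peptide(pep, matrix), 2))
```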

Relevance: 60.00%

Publisher:

Abstract:

Dissertation presented to obtain the PhD degree in Electrical and Computer Engineering - Electronics

Relevance: 60.00%

Publisher:

Abstract:

This thesis presents a search for a new design of the frame of a permanent magnet generator mounted in a wind turbine. The objective of this work is to offer new design ideas for the stator frame, that is, new concepts for connecting the stator core to the stator frame in a generator. The desired aims of the new design concepts are: simplification of production of the structure; reduced material use; use of standard components; light weight of the construction; and so on. The thesis contains several new possible designs for the stator frame structure, as well as a list of possible connection concepts that can be used to join the stator to the frame. All new ideas are described and compared according to how well they match the desired purposes of the work. The new design concepts are modeled using modern software. The main part of the thesis contains several approximate computer models of the current and newly proposed constructions, together with a description of the loads and stresses in the current stator frame and an evaluation of the most important stress and load characteristics. The final design is the result of all the preceding research and comprises a description of a new frame structure and a joining concept for it. This structure meets the main aims of the work, but a detailed design with dimensions and verification calculations of the frame and welds has not been carried out. The thesis gives an account of the design search and of the evaluation and comparison of new concepts for the generator structure. It also gives a general overview of renewable energy technology and of wind turbines and their components.

Relevance: 60.00%

Publisher:

Abstract:

In this work, various computer models, calculation procedures and methods are developed to support the integration of large amounts of wind power into the electrical power supply. The calculation model for simulating simultaneously fed-in wind energy generates aggregated time series for arbitrarily composed groups of wind turbines, based on measured wind and power data from the recent past. This model provides important baseline data for the analysis of wind energy feed-in, including future scenarios. To investigate the effects of wind power feed-in from large-scale turbine clusters in the gigawatt range, various statistical analyses and illustrative representations are developed. The model developed in this work for calculating the currently fed-in wind power from online-measured power data of representative wind farms provides valuable information for the power and frequency control performed by grid operators. The associated procedures for determining representative sites and for verifying their representativeness form the basis for an accurate mapping of the wind power feed-in of larger supply areas, based on only a few power measurements at wind farms. Another valuable tool for the optimal integration of wind energy into the electrical power supply are the forecast models, which determine the wind power feed-in to be expected in the short to medium term. Building on previous research, this work presents two models based on artificial neural networks that provide the expected time course of wind power for grid regions and control zones using measured power data or forecast meteorological parameters. The software integration of the model for the currently fed-in wind power with the models for short-term and day-ahead forecasting offers an attractive complete solution for incorporating wind energy into the control centres of grid operators. The interfaces developed for this purpose and the modular structure of the program allow simple and fast implementation in any system environment. Based on the performance of the online and forecast models, operational management strategies are addressed for wind farms aggregated into gigawatt-scale clusters, intended to enable an integration of the planned offshore wind farms that is optimal with respect to ecological and economic criteria as well as security of supply.
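
As a hedged sketch of the basic idea behind the online model (estimating a region's current wind power feed-in from a few representative, metered wind farms), the snippet below extrapolates the capacity factor of the metered farms to the region's installed capacity. Farm names, capacities and measurements are invented; the actual method additionally selects and validates representative sites and would weight them accordingly.

```python
from dataclasses import dataclass

@dataclass
class WindFarm:
    name: str
    installed_mw: float   # installed capacity of the metered farm
    measured_mw: float    # current online power measurement

def upscale(representative_farms, region_installed_mw):
    """Estimate the total regional feed-in from a few representative farms.

    Uses the capacity-normalised output of the metered farms as the regional
    capacity factor; a production system would weight sites by how well each
    one represents the region.
    """
    cap = sum(f.installed_mw for f in representative_farms)
    power = sum(f.measured_mw for f in representative_farms)
    capacity_factor = power / cap
    return capacity_factor * region_installed_mw

farms = [WindFarm("Farm A", 40.0, 26.0),
         WindFarm("Farm B", 25.0, 14.5),
         WindFarm("Farm C", 60.0, 41.0)]
print(f"Estimated regional feed-in: {upscale(farms, 2500.0):.0f} MW of 2500 MW installed")
```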

Relevance: 60.00%

Publisher:

Abstract:

The possibility of future rapid climatic changes is a pressing concern amongst climate scientists. For example, an abrupt collapse of the ocean's Thermohaline Circulation (THC) would rapidly cool the northern hemisphere and reduce the net global primary productivity of vegetation, according to computer models. It is unclear how to incorporate such low-probability, high-impact events into the development of economic policy. This paper reviews the salient aspects of rapid climate change relevant to economists and policy makers. The main scientific certainties and uncertainties are clearly delineated, with the aim of guiding the goals of economic analysis and ensuring that they retain fidelity to their scientific underpinnings.

Relevance: 60.00%

Publisher:

Abstract:

Our understanding of the climate system has been revolutionized recently, by the development of sophisticated computer models. The predictions of such models are used to formulate international protocols, intended to mitigate the severity of global warming and its impacts. Yet, these models are not perfect representations of reality, because they remove from explicit consideration many physical processes which are known to be key aspects of the climate system, but which are too small or fast to be modelled. The purpose of this paper is to give a personal perspective of the current state of knowledge regarding the problem of unresolved scales in climate models. A recent novel solution to the problem is discussed, in which it is proposed, somewhat counter-intuitively, that the performance of models may be improved by adding random noise to represent the unresolved processes.
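
As a hedged toy illustration of the idea of representing unresolved processes by random noise (not the scheme proposed in the paper), the snippet below integrates the same linearly damped, forced variable with and without an additive stochastic term using a simple Euler-Maruyama step. All parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 0.01, 5000
x_det, x_sto = 0.0, 0.0
forcing = 1.0   # resolved, deterministic forcing
damping = 0.5
sigma = 0.3     # hypothetical amplitude of the unresolved-process noise

for _ in range(n_steps):
    # Deterministic model: only the resolved forcing and damping
    x_det += dt * (forcing - damping * x_det)
    # Stochastically parameterised model: unresolved processes enter as additive noise
    x_sto += dt * (forcing - damping * x_sto) + sigma * np.sqrt(dt) * rng.standard_normal()

print(f"deterministic equilibrium: {x_det:.2f}")
print(f"stochastic realisation:    {x_sto:.2f}")
```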

Relevance: 60.00%

Publisher:

Abstract:

Recent severe flooding in the UK has highlighted the need for better information on flood risk, increasing the pressure on engineers to enhance the capabilities of computer models for flood prediction. This paper evaluates the benefits to be gained from the use of remotely sensed data to support flood modelling. The remotely sensed data available can be used either to produce high-resolution digital terrain models (DTMs) from light detection and ranging (Lidar) data, or to generate accurate inundation maps of past flood events from airborne synthetic aperture radar (SAR) data and aerial photography. The paper reports on the modelling of real flood events that occurred at two UK sites on the rivers Severn and Ouse. At these sites a combination of remotely sensed data and recorded hydrographs was available. It is concluded, first, that Lidar-generated DTMs support the generation of considerably better models and enhance the visualisation of model results and, second, that flood outlines obtained from airborne SAR or aerial images help develop an appreciation of the hydraulic behaviour of important model components and facilitate model validation. The need for further research is highlighted by a number of limitations, namely: the difficulty of obtaining an adequate representation of hydraulically important features such as embankment crests and walls; uncertainties in the validation data; and the difficulty of extracting flood outlines from airborne SAR images in urban areas.
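
As a hedged illustration of the most basic way a high-resolution DTM supports inundation mapping, the sketch below intersects a planar water surface with a synthetic terrain grid and flags cells lying below the water level as wet. A real flood model would route flow and enforce hydraulic connectivity; the DTM and water level here are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic 1 m-resolution DTM (elevations in metres); a Lidar DTM would be read from file.
dtm = 10.0 + np.cumsum(rng.normal(0.0, 0.05, size=(200, 200)), axis=1)

water_level = 10.5              # hypothetical water surface elevation (m)
inundated = dtm < water_level   # crude planar intersection, ignores hydraulic connectivity
depth = np.where(inundated, water_level - dtm, 0.0)

print(f"wet cells: {inundated.sum()} of {inundated.size}")
print(f"max depth: {depth.max():.2f} m, mean depth over wet cells: {depth[inundated].mean():.2f} m")
```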

Relevance: 60.00%

Publisher:

Abstract:

The geospace environment is controlled largely by events on the Sun, such as solar flares and coronal mass ejections, which generate significant geomagnetic and upper atmospheric disturbances. The study of this Sun-Earth system, which has become known as space weather, has both intrinsic scientific interest and practical applications. Adverse conditions in space can damage satellites and disrupt communications, navigation, and electric power grids, as well as endanger astronauts. The Center for Integrated Space Weather Modeling (CISM), a Science and Technology Center (STC) funded by the U.S. National Science Foundation (see http://www.bu.edu/cism/), is developing a suite of integrated physics-based computer models that describe the space environment from the Sun to the Earth for use in both research and operations [Hughes and Hudson, 2004, p. 1241]. To further this mission, advanced education and training programs sponsored by CISM encourage students to view space weather as a system that encompasses the Sun, the solar wind, the magnetosphere, and the ionosphere/thermosphere. This holds especially true for participants in the CISM space weather summer school [Simpson, 2004].

Relevance: 60.00%

Publisher:

Abstract:

Determination of the local structure of a polymer glass by scattering methods is complex due to the number of spatial and orientational correlations, both from within the polymer chain (intrachain) and between neighbouring chains (interchain), from which the scattering arises. Recently considerable advances have been made in the structural analysis of relatively simple polymers such as poly(ethylene) through the use of broad Q neutron scattering data tightly coupled to atomistic modelling procedures. This paper presents the results of an investigation into the use of these procedures for the analysis of the local structure of a-PMMA, which is chemically more complex, with a much greater number of intrachain structural parameters. We have utilised high quality neutron scattering data obtained using SANDALS at ISIS, coupled with computer models representing both the single chain and the bulk polymer system. Several different modelling approaches, encompassing techniques such as Reverse Monte Carlo refinement and energy minimisation, have been explored, and their relative merits and successes are discussed. These different approaches highlight structural parameters which any realistic model of glassy atactic PMMA must replicate.
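
As a hedged sketch of the acceptance step at the heart of a Reverse Monte Carlo refinement (greatly simplified relative to the procedures used for PMMA), the snippet below repeatedly perturbs a model structure factor and keeps the trial according to a Metropolis-style rule based on the change in chi-squared against a synthetic "measured" S(Q). In a real refinement the perturbation would be a move of an atom in the model, with S(Q) recomputed from the coordinates.

```python
import numpy as np

rng = np.random.default_rng(3)

def chi_squared(model_sq, target_sq, sigma=0.02):
    """Misfit between the model and 'measured' structure factor on a common Q grid."""
    return np.sum(((model_sq - target_sq) / sigma) ** 2)

# Synthetic stand-ins: in a real refinement, target_sq is the measured neutron S(Q)
# and model_sq is recomputed from the atomistic model after every trial move.
q = np.linspace(0.5, 30.0, 300)
target_sq = 1.0 + 0.3 * np.sin(q) * np.exp(-q / 10.0)
model_sq = np.ones_like(q)
chi2 = chi_squared(model_sq, target_sq)

for step in range(20000):
    trial_sq = model_sq + rng.normal(0.0, 0.005, size=q.size)  # stand-in for "move an atom, recompute S(Q)"
    trial_chi2 = chi_squared(trial_sq, target_sq)
    # Metropolis-style acceptance rule used in RMC: always keep improvements,
    # keep a worse fit with probability exp(-delta_chi2 / 2)
    if trial_chi2 < chi2 or rng.random() < np.exp(-(trial_chi2 - chi2) / 2.0):
        model_sq, chi2 = trial_sq, trial_chi2

print(f"final chi^2 per point: {chi2 / q.size:.2f}")
```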

Relevance: 60.00%

Publisher:

Abstract:

Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model's output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
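
As a hedged sketch of the resampling idea (simplified relative to the article's Bayesian treatment, which places a posterior distribution on the confusion-matrix probabilities), the snippet below column-normalises a hypothetical confusion matrix to obtain P(true class | mapped class), resamples the true class of every pixel in a site, and aggregates the results into per-realisation land cover proportions. All class names and counts are invented.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical confusion matrix: rows = reference (true) class, columns = mapped class.
# A real analysis would use the validation counts for the land cover product in question.
classes = ["forest", "cropland", "urban", "water"]
confusion = np.array([[85,  8,  2,  1],
                      [10, 70,  5,  0],
                      [ 3, 12, 40,  1],
                      [ 2,  0,  1, 48]], dtype=float)

# P(true class | mapped class): normalise each column (an implicit uniform prior;
# the published method instead samples these probabilities from their posterior).
p_true_given_mapped = confusion / confusion.sum(axis=0, keepdims=True)

# Mapped pixel counts for one model grid cell (site), by mapped class.
mapped_counts = np.array([600, 250, 100, 50])

def one_realisation():
    """Resample every pixel's true class and return the site's class proportions."""
    totals = np.zeros(len(classes))
    for j, n in enumerate(mapped_counts):
        totals += rng.multinomial(n, p_true_given_mapped[:, j])
    return totals / mapped_counts.sum()

realisations = np.array([one_realisation() for _ in range(1000)])
for k, name in enumerate(classes):
    print(f"{name:8s} proportion: {realisations[:, k].mean():.3f} +/- {realisations[:, k].std():.3f}")
```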