941 results for SPICE simulations
Abstract:
While sound and video may capture viewers' attention, interaction can captivate them. Such interaction was not available prior to the advent of Digital Television; indeed, what lies at the heart of the Digital Television revolution is this new type of interactive content, offered in the form of interactive Television (iTV) services. On top of that, the new world of converged networks has created demand for a new type of converged services on a range of mobile terminals (Tablet PCs, PDAs, and mobile phones). This paper presents a new approach to service creation that allows simulations and rapid prototypes created in the accessible desktop multimedia authoring package Macromedia Director to be semi-automatically translated into services ready for broadcast. This is achieved by a series of tools that de-skill and speed up the process of creating digital TV user interfaces (UIs) and applications for mobile terminals. The benefits of rapid prototyping are essential for the production of these new types of services and are therefore discussed in the first section of this paper. The following sections present an overview of the operation of the content and service creation and management sub-systems, illustrating why these tools form an important and integral part of a system responsible for creating, delivering, and managing converged broadcast and telecommunications services. The next section examines a number of candidate metadata languages for describing the iTV service user interface, along with the schema language adopted in this project. A detailed description of the operation of the two tools is then provided to offer insight into how they can be used to de-skill and speed up the process of creating digital TV user interfaces and applications for mobile terminals. Finally, representative broadcast-oriented and telecommunication-oriented converged service components are introduced, demonstrating how these tools have been used to generate different types of services.
Abstract:
Neurons in Action (NIA1, 2000; NIA1.5, 2004; NIA2, 2007), a set of tutorials and linked simulations, is designed to acquaint students with neuronal physiology through interactive, virtual laboratory experiments. Here we explore the uses of NIA in lecture, both interactive and didactic, as well as in the undergraduate laboratory, in the graduate seminar course, and as an examination tool through homework and problem-set assignments. NIA, built with the simulator NEURON (http://www.neuron.yale.edu/neuron/), displays voltages, currents, and conductances in a membrane patch, or signals moving within the dendrites, soma, and/or axon of a neuron. Customized simulations start with the plain lipid bilayer and progress through equilibrium potentials; currents through single Na and K channels; Na and Ca action potentials; voltage clamp of a patch or a whole neuron; voltage spread and propagation in axons, motoneurons, and nerve terminals; synaptic excitation and inhibition; and advanced topics such as channel kinetics and coincidence detection. The user asks and answers "what if" questions by specifying neuronal parameters, ion concentrations, and temperature; the experimental results are then plotted as conductances, currents, and voltage changes. Such exercises provide immediate confirmation or refutation of students' ideas, guiding their learning. The tutorials are hyperlinked to explanatory information and to original research papers. Although the NIA tutorials were designed as a sequence to equip a student with a working knowledge of fundamental neuronal principles, we find that faculty are using the individual tutorials in a variety of educational situations, some of which are described here. We offer these ideas to colleagues using interactive software, whether NIA or another tool, for educating students of differing backgrounds in the subject of neurophysiology.
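NIA packages its experiments as guided tutorials, but the underlying NEURON simulator also exposes a Python API. As a rough illustration of the kind of "what if" experiment the tutorials wrap (all parameter values here are illustrative, not taken from NIA), a minimal current-clamp simulation of a Hodgkin-Huxley patch might look like this:

```python
from neuron import h
h.load_file("stdrun.hoc")          # NEURON's standard run system

soma = h.Section(name="soma")      # single-compartment "patch"
soma.L = soma.diam = 20.0          # length and diameter in micrometers
soma.insert("hh")                  # classic Hodgkin-Huxley Na, K, and leak channels

stim = h.IClamp(soma(0.5))         # current injection at the middle of the section
stim.delay, stim.dur, stim.amp = 1.0, 0.5, 0.5   # ms, ms, nA

t = h.Vector().record(h._ref_t)            # record time
v = h.Vector().record(soma(0.5)._ref_v)    # record membrane voltage

h.celsius = 6.3        # the "what if": rerun at, e.g., 20.0 to see the spike narrow
h.finitialize(-65.0)   # resting potential in mV
h.continuerun(10.0)    # simulate 10 ms

print(f"peak membrane voltage: {v.max():.1f} mV")
```

Rerunning with different temperatures, channel densities, or stimulus amplitudes and replotting `v` against `t` reproduces the basic interaction loop the tutorials are built around.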
Abstract:
In recent years, interactive media and tools such as scientific simulations, simulation environments, and dynamic data visualizations have become established methods in the neural and cognitive sciences. University teachers of these disciplines therefore face the challenge of integrating such media into the neuroscientific curriculum. Simulations and dynamic visualizations in particular offer great opportunities for teachers and learners, since they are both illustrative and explorable. However, simulations also pose instructional problems: they are abstract, demand some computer skills, and require conceptual knowledge of what the simulations are intended to explain. Guided by two central questions, this article provides an overview of possible approaches for neuroscience education and opens perspectives for their curricular integration: (i) how can complex scientific media be transformed for educational use in a manner that is efficient and comprehensible for students at all levels, and (ii) what technical infrastructure can support this transformation? Using educational simulations for the neurosciences and their application in courses as examples, answers to these questions are proposed (a) by introducing a specific educational simulation approach for the neurosciences, (b) by introducing an e-learning environment for simulations, and (c) by providing examples of curricular integration at different levels, which may help academic teachers to integrate newly created or existing interactive educational resources into their courses.
Abstract:
The high complexity of cellular intralogistics systems and their control architecture calls for modern simulation and visualization techniques, so that statements about the performance and future-proofness of a planned system can be made in advance. This paper presents a concept for a simulation system for VR-based verification of the control of cellular intralogistics systems. We describe the construction of a simulation model for an existing real-world facility and give an overview of the components of the simulation, in particular the connection to the controller of the real agent-based system.
Abstract:
Performance availability, a measure of the degree of fulfillment of logistics processes, is usually determined during the operation of a logistics facility. We have developed a simulation system that determines the performance availability of a cellular intralogistics facility in advance. The highly detailed simulation captures not only the influence of static parameters, such as the dimensioning of the facility, but also dynamic parameters, such as vehicle behavior and order composition. Thanks to the real-time and VR capability of the presented system, the results can be presented in a VR environment. Intuitive interaction mechanisms and visualization metaphors provide direct access to the performance availability of the system and to the quantities that influence it.
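The abstract does not state how performance availability is computed from the simulation. One simplified reading, with the class names, fields, and aggregation rule all assumed for illustration, is the share of simulated orders completed within their agreed deadlines:

```python
from dataclasses import dataclass

@dataclass
class Order:
    completed_at: float  # simulated completion time, seconds
    due_at: float        # agreed deadline, seconds

def performance_availability(orders: list[Order]) -> float:
    """Share of orders fulfilled within their agreed deadline.

    A simplified reading of performance availability as the degree of
    fulfillment of the logistics process; the paper's exact definition
    may weight or aggregate differently.
    """
    if not orders:
        return 1.0
    on_time = sum(1 for o in orders if o.completed_at <= o.due_at)
    return on_time / len(orders)

# example trace from a simulation run (values illustrative)
trace = [Order(12.0, 15.0), Order(30.0, 25.0), Order(8.0, 10.0)]
print(f"performance availability: {performance_availability(trace):.2f}")
```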
Abstract:
We present a molecular modeling study, based on ab initio and classical molecular dynamics calculations, investigating the three-dimensional structure and supramolecular assembly of heptapyrenotide oligomers in aqueous solution. Our calculations show that free oligomers self-assemble into helical structures characterized by an inner core of π-stacked pyrene units and external grooves formed by the linker moieties. The coiling of the linkers is highly ordered, dominated by hydrogen-bond interactions among the phosphate and amide groups. Our models support a mechanism of longitudinal supramolecular oligomerization based on interstrand pyrene intercalation. Only a minimal number of pyrene units intercalate at one end, favoring the formation of very extended longitudinal chains, as also detected in AFM experiments. Our results provide a structural explanation for the mechanism of chirality amplification in 1:1 mixtures of standard heptapyrenotides and modified oligomers with covalently linked deoxycytidine, based on selective molecular recognition and binding of the nucleotide to the groove of the left-handed helix.
Abstract:
We analyze the impact of stratospheric volcanic aerosols on the diurnal temperature range (DTR) over Europe using long-term subdaily station records. We compare the results with a 28-member ensemble of European Centre/Hamburg version 5.4 (ECHAM5.4) general circulation model simulations. Eight stratospheric volcanic eruptions during the instrumental period are investigated. Seasonal all-sky and clear-sky DTR anomalies are compared with contemporary (approximately 20 year) reference periods. Clear-sky conditions are used to eliminate cloud effects and better estimate the signal from the direct radiative forcing of the volcanic aerosols. We do not find a consistent effect of stratospheric aerosols on all-sky DTR. For clear skies, we find average DTR anomalies of −0.08°C in the observations (−0.13°C in the model), with the largest effect in the second winter after the eruption. Although the clear-sky DTR anomalies from different stations, volcanic eruptions, and seasons show heterogeneous signals in order of magnitude and sign, the significantly negative DTR anomalies (e.g., after the Tambora eruption) are qualitatively consistent with other studies. Relating the clear-sky DTR anomalies to the radiative forcing from the stratospheric volcanic eruptions, we find the resulting sensitivity to be of the same order of magnitude as previously published estimates for tropospheric aerosols during the so-called "global dimming" period (i.e., the 1950s to 1980s). Analyzing cloud cover changes after volcanic eruptions reveals an increase in clear-sky days in both data sets. Quantifying the impact of stratospheric volcanic eruptions on clear-sky DTR over Europe provides valuable information for the study of the radiative effects of stratospheric aerosols and for geoengineering purposes.
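As a sketch of the anomaly computation described above, assuming daily Tmax/Tmin series already derived from the subdaily records (the column names, the exact reference-window convention, and the post-eruption span are assumptions, not taken from the paper):

```python
import pandas as pd

def seasonal_dtr_anomaly(daily: pd.DataFrame, eruption_year: int) -> pd.Series:
    """Seasonal DTR anomalies after an eruption, relative to a ~20-year
    reference period ending just before the eruption.

    `daily` needs a DatetimeIndex and 'tmax'/'tmin' columns in degC.
    """
    dtr = daily["tmax"] - daily["tmin"]
    seasonal = dtr.resample("QS-DEC").mean()   # DJF, MAM, JJA, SON means
    years = seasonal.index.year
    ref = seasonal[(years >= eruption_year - 20) & (years < eruption_year)]
    climatology = ref.groupby(ref.index.quarter).mean()   # per-season reference mean
    post = seasonal[(years >= eruption_year) & (years <= eruption_year + 3)]
    return post - post.index.quarter.map(climatology).to_numpy()
```

The same routine can be run on the all-sky series or on the subset of clear-sky days to separate cloud effects from the direct aerosol forcing.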
Abstract:
Consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on each of which we can run an accurate forward flow-and-transport simulation. We address the problem of rapidly identifying a subset of candidates whose responses best match a reference response curve. To keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited and turned into a non-stationary kriging method in which the covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run on an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of the misfits for the remaining parameter fields guide the sequential selection of candidate parameter fields. The proposed algorithm, Proxy-based Kriging for Sequential Inversion (ProKSI), relies on a variant of Expected Improvement, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI's performance illustrates the efficiency and robustness of the approach with different kinds of proxies.
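A toy sketch of the ingredients this approach combines: a covariance built from proxy responses, kriging prediction of misfits, and an Expected Improvement-type selection rule. Everything below is an assumption for illustration; in particular, a plain Gaussian kernel on proxy outputs stands in for the paper's combined non-stationary kernel. `P_*` are (n, d) arrays of proxy response features and `misfits` are the misfit values from the accurate simulator.

```python
import numpy as np
from scipy.stats import norm

def proxy_kernel(P, Q, scale=1.0):
    """Gaussian kernel on proxy responses: two parameter fields are deemed
    similar when the fast proxy responds similarly to them."""
    d2 = np.sum((P[:, None, :] - Q[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * scale**2))

def gp_misfit(P_train, misfits, P_cand, noise=1e-6):
    """Kriging prediction of the misfit for not-yet-simulated candidates."""
    K = proxy_kernel(P_train, P_train) + noise * np.eye(len(P_train))
    Ks = proxy_kernel(P_cand, P_train)
    mean = Ks @ np.linalg.solve(K, misfits)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.maximum(var, 0.0)

def expected_improvement(mean, var, best_misfit):
    """EI of beating the best misfit so far; the top-EI candidate is the
    next parameter field to run through the accurate simulator."""
    sd = np.sqrt(var) + 1e-12
    z = (best_misfit - mean) / sd
    return sd * (z * norm.cdf(z) + norm.pdf(z))
```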
Abstract:
In the context of expensive numerical experiments, a promising way to alleviate computational costs is to use partially converged simulations instead of exact solutions; the gain in computational time comes at the price of precision in the response. This work addresses the problem of fitting a Gaussian process model to partially converged simulation data for subsequent use in prediction. The main challenge lies in adequately approximating the error due to partial convergence, which is correlated in both the design-variable and time directions. We propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that accurately reflects the actual structure of the error. Practical solutions are proposed for the parameter estimation issues associated with the model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared with a classical kriging model.
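As a toy illustration of a covariance on the joint (design, computational time) space: the paper's actual kernel is tailored to the real error structure, so the exponential decay and the additive converged-plus-error form below are assumptions, not the published model.

```python
import numpy as np

def joint_kernel(X1, t1, X2, t2, ls=1.0, decay=1.0):
    """Toy covariance over (design x, computational time t).

    Converged-response part: a stationary Gaussian kernel in x.
    Partial-convergence error part: the same spatial correlation, scaled
    by exp(-decay*t) * exp(-decay*t') so that the error variance shrinks
    as the simulations are run longer (nonstationary in time).
    """
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    k_x = np.exp(-d2 / (2.0 * ls**2))
    g = np.exp(-decay * t1)[:, None] * np.exp(-decay * t2)[None, :]
    return k_x * (1.0 + g)
```

Predicting at a large t then recovers an estimate of the fully converged response from cheap, partially converged runs.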
Abstract:
It is often claimed that scientists can obtain new knowledge about nature by running computer simulations. How is this possible? I answer this question by arguing that computer simulations are arguments. This view parallels Norton's argument view of thought experiments. I show that computer simulations can be reconstructed as arguments that fully capture the epistemic power of the simulations. Assuming the extended-mind hypothesis, I further argue that to run a computer simulation is to execute the reconstructing argument. I discuss some objections and reject the view that computer simulations produce knowledge because they are experiments. I conclude by comparing thought experiments and computer simulations, on the assumption that both are arguments.