949 results for Simulations biomécaniques
Abstract:
While sound and video may capture viewers' attention, interaction can captivate them. Such interaction was not available prior to the advent of Digital Television. In fact, what lies at the heart of the Digital Television revolution is this new type of interactive content, offered in the form of interactive Television (iTV) services. On top of that, the new world of converged networks has created a demand for a new type of converged services on a range of mobile terminals (Tablet PCs, PDAs and mobile phones). This paper presents a new approach to service creation that allows for the semi-automatic translation of simulations and rapid prototypes created in the accessible desktop multimedia authoring package Macromedia Director into services ready for broadcast. This is achieved by a series of tools that de-skill and speed up the process of creating digital TV user interfaces (UI) and applications for mobile terminals. The benefits of rapid prototyping are essential for the production of these new types of services and are therefore discussed in the first section of this paper. The following sections present an overview of the operation of the content, service creation and management sub-systems, which illustrates why these tools form an important and integral part of a system responsible for creating, delivering and managing converged broadcast and telecommunications services. The next section examines a number of candidate metadata languages for describing the iTV service user interface, together with the schema language adopted in this project. A detailed description of the operation of the two tools is provided to offer an insight into how they can be used to de-skill and speed up the process of creating digital TV user interfaces and applications for mobile terminals. Finally, representative broadcast-oriented and telecommunication-oriented converged service components are introduced, demonstrating how these tools have been used to generate different types of services.
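The abstract does not disclose the schema adopted for describing the iTV user interface, so the following is only a hypothetical sketch of the kind of device-independent UI description such tools might consume and specialise per terminal; every name and value in it is invented for illustration, not the project's actual schema or tool output.

```python
# Hypothetical sketch: one shared, declarative UI description specialised for
# several terminal profiles. All identifiers and values are illustrative.

UI_DESCRIPTION = {
    "service": "news-ticker",
    "widgets": [
        {"type": "video", "region": "main"},
        {"type": "text", "region": "ticker", "content": "Headlines ..."},
    ],
}

TERMINAL_PROFILES = {
    "set-top-box": {"width": 720, "height": 576, "font_pt": 24},
    "pda": {"width": 320, "height": 240, "font_pt": 10},
    "mobile-phone": {"width": 176, "height": 208, "font_pt": 8},
}


def layout_for(profile_name: str) -> dict:
    """Specialise the shared UI description for one terminal profile."""
    profile = TERMINAL_PROFILES[profile_name]
    return {
        "service": UI_DESCRIPTION["service"],
        "screen": (profile["width"], profile["height"]),
        "widgets": [dict(w, font_pt=profile["font_pt"]) for w in UI_DESCRIPTION["widgets"]],
    }


for name in TERMINAL_PROFILES:
    print(name, layout_for(name))
```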
Abstract:
Neurons in Action (NIA1, 2000; NIA1.5, 2004; NIA2, 2007), a set of tutorials and linked simulations, is designed to acquaint students with neuronal physiology through interactive, virtual laboratory experiments. Here we explore the uses of NIA in lecture, both interactive and didactic, as well as in the undergraduate laboratory, in the graduate seminar course, and as an examination tool through homework and problem set assignments. NIA, made with the simulator NEURON (http://www.neuron.yale.edu/neuron/), displays voltages, currents, and conductances in a membrane patch or signals moving within the dendrites, soma and/or axon of a neuron. Customized simulations start with the plain lipid bilayer and progress through equilibrium potentials; currents through single Na and K channels; Na and Ca action potentials; voltage clamp of a patch or a whole neuron; voltage spread and propagation in axons, motoneurons and nerve terminals; synaptic excitation and inhibition; and advanced topics such as channel kinetics and coincidence detection. The user asks and answers "what if" questions by specifying neuronal parameters, ion concentrations, and temperature, and the experimental results are then plotted as conductances, currents, and voltage changes. Such exercises provide immediate confirmation or refutation of the student's ideas to guide their learning. The tutorials are hyperlinked to explanatory information and to original research papers. Although the NIA tutorials were designed as a sequence to empower a student with a working knowledge of fundamental neuronal principles, we find that faculty are using the individual tutorials in a variety of educational situations, some of which are described here. Here we offer ideas to colleagues using interactive software, whether NIA or another tool, for educating students of differing backgrounds in the subject of neurophysiology.
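As a concrete illustration of the "equilibrium potentials" stage of the tutorials, the Nernst equation can be evaluated directly. The sketch below is independent of NIA/NEURON, and the squid-axon-like concentrations are illustrative values only.

```python
import math

R = 8.314     # gas constant, J mol^-1 K^-1
F = 96485.0   # Faraday constant, C mol^-1


def nernst_potential_mv(c_out_mm: float, c_in_mm: float, z: int, temp_c: float) -> float:
    """Equilibrium (Nernst) potential in mV for an ion of valence z,
    given outside/inside concentrations in mM and temperature in deg C."""
    t_k = temp_c + 273.15
    return 1000.0 * (R * t_k) / (z * F) * math.log(c_out_mm / c_in_mm)


# Squid-axon-like concentrations at 6.3 deg C (illustrative values only):
print("E_K  ~ %5.1f mV" % nernst_potential_mv(20.0, 400.0, +1, 6.3))   # about -72 mV
print("E_Na ~ %5.1f mV" % nernst_potential_mv(440.0, 50.0, +1, 6.3))   # about +52 mV
```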
Abstract:
In recent years, interactive media and tools, such as scientific simulations, simulation environments and dynamic data visualizations, have become established methods in the neural and cognitive sciences. Hence, university teachers of the neural and cognitive sciences face the challenge of integrating these media into the neuroscientific curriculum. Simulations and dynamic visualizations in particular offer great opportunities for teachers and learners, since they are both illustrative and explorable. However, simulations pose instructional problems: they are abstract and demand both computer skills and conceptual knowledge about what the simulations are intended to explain. By following two central questions, this article provides an overview of possible approaches to be applied in neuroscience education and opens perspectives for their curricular integration: (i) how can complex scientific media be transformed for educational use in a manner that is efficient and comprehensible for students at all levels, and (ii) what technical infrastructure can support this transformation? Exemplified by educational simulations for the neurosciences and their application in courses, answers to these questions are proposed (a) by introducing a specific educational simulation approach for the neurosciences, (b) by introducing an e-learning environment for simulations, and (c) by providing examples of curricular integration at different levels, which might help academic teachers to integrate newly created or existing interactive educational resources into their courses.
Abstract:
The high complexity of cellular intralogistics systems and their control architecture suggests the use of modern simulation and visualization techniques, so that statements about the performance and future-proofness of a planned system can be made in advance. This paper presents a concept for a simulation system for VR-based verification of the control of cellular intralogistics systems. The creation of a simulation model for an existing real-world facility is described, and an overview of the components of the simulation is given, in particular the coupling to the control system of the real agent-based installation.
Abstract:
The performance availability (Leistungsverfügbarkeit), as a measure of the degree of fulfilment of logistic processes, is usually determined while a logistics facility is in operation. We have developed a simulation system that allows the performance availability of a cellular intralogistics facility to be determined in advance. The highly detailed simulation captures not only the influence of static parameters, such as the dimensioning of the facility, but also of dynamic parameters, such as vehicle behaviour or order composition. Owing to the real-time and VR capability of the presented system, a presentation in a VR environment is possible. Intuitive interaction mechanisms and visualization metaphors provide direct access to the performance availability of the system and to the quantities that influence it.
Abstract:
We present a molecular modeling study, based on ab initio and classical molecular dynamics calculations, of the three-dimensional structure and supramolecular assembly formation of heptapyrenotide oligomers in aqueous solution. Our calculations show that free oligomers self-assemble into helical structures characterized by an inner core formed by π-stacked pyrene units and external grooves formed by the linker moieties. The coiling of the linkers is highly ordered, dominated by hydrogen-bond interactions among the phosphate and amide groups. Our models support a mechanism of longitudinal supramolecular oligomerization based on interstrand pyrene intercalation. Only a minimal number of pyrene units intercalate at one end, favoring the formation of very extended longitudinal chains, as also detected by AFM experiments. Our results provide a structural explanation of the mechanism of chirality amplification in 1:1 mixtures of standard heptapyrenotides and modified oligomers with covalently linked deoxycytidine, based on selective molecular recognition and binding of the nucleotide to the groove of the left-wound helix.
Abstract:
We analyze the impact of stratospheric volcanic aerosols on the diurnal temperature range (DTR) over Europe using long-term subdaily station records. We compare the results with a 28-member ensemble of European Centre/Hamburg version 5.4 (ECHAM5.4) general circulation model simulations. Eight stratospheric volcanic eruptions during the instrumental period are investigated. Seasonal all-sky and clear-sky DTR anomalies are compared with contemporary (approximately 20-year) reference periods. Clear-sky conditions are used to eliminate cloud effects and better estimate the signal from the direct radiative forcing of the volcanic aerosols. We do not find a consistent effect of stratospheric aerosols on all-sky DTR. For clear skies, we find average DTR anomalies of −0.08°C (−0.13°C) in the observations (in the model), with the largest effect in the second winter after the eruption. Although the clear-sky DTR anomalies from different stations, volcanic eruptions, and seasons show heterogeneous signals in terms of order of magnitude and sign, the significantly negative DTR anomalies (e.g., after the Tambora eruption) are qualitatively consistent with other studies. Relating the clear-sky DTR anomalies to the radiative forcing from stratospheric volcanic eruptions, we find the resulting sensitivity to be of the same order of magnitude as previously published estimates for tropospheric aerosols during the so-called "global dimming" period (i.e., 1950s to 1980s). Analyzing cloud cover changes after volcanic eruptions reveals an increase in clear-sky days in both data sets. Quantifying the impact of stratospheric volcanic eruptions on clear-sky DTR over Europe provides valuable information for the study of the radiative effect of stratospheric aerosols and for geoengineering purposes.
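A minimal sketch of the anomaly computation described above (DTR as daily maximum minus minimum temperature, seasonal means referenced to a roughly 20-year period); the station handling, clear-sky screening and ensemble averaging of the actual study are omitted, and the numbers are synthetic.

```python
import numpy as np


def dtr(t_max: np.ndarray, t_min: np.ndarray) -> np.ndarray:
    """Diurnal temperature range from daily maximum and minimum temperatures."""
    return t_max - t_min


def seasonal_dtr_anomaly(dtr_season: np.ndarray, dtr_reference: np.ndarray) -> float:
    """Seasonal-mean DTR anomaly relative to a (roughly 20-year) reference period."""
    return float(np.nanmean(dtr_season) - np.nanmean(dtr_reference))


# Synthetic daily DTR values in degC: one post-eruption winter vs. ~20 reference winters.
rng = np.random.default_rng(0)
post_eruption = rng.normal(7.9, 1.0, 90)
reference = rng.normal(8.0, 1.0, 20 * 90)
print("clear-sky DTR anomaly: %.2f degC" % seasonal_dtr_anomaly(post_eruption, reference))
```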
Abstract:
Let us consider a large set of candidate parameter fields, such as hydraulic conductivity maps, on which we can run an accurate forward flow and transport simulation. We address the issue of rapidly identifying a subset of candidates whose responses best match a reference response curve. In order to keep the number of calls to the accurate flow simulator computationally tractable, a recent distance-based approach relying on fast proxy simulations is revisited and turned into a non-stationary kriging method in which the covariance kernel is obtained by combining a classical kernel with the proxy. Once the accurate simulator has been run for an initial subset of parameter fields and a kriging metamodel has been inferred, the predictive distributions of misfits for the remaining parameter fields can be used as a guide to select candidate parameter fields sequentially. The proposed algorithm, Proxy-based Kriging for Sequential Inversion (ProKSI), relies on a variant of the Expected Improvement, a popular criterion for kriging-based global optimization. A statistical benchmark of ProKSI's performance illustrates the efficiency and robustness of the approach when different kinds of proxies are used.
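ProKSI relies on a variant of the Expected Improvement; the sketch below shows only the classical EI criterion for minimisation that such a variant builds on, applied to kriging-predicted misfits of the remaining candidate fields (all numbers are illustrative).

```python
import numpy as np
from scipy.stats import norm


def expected_improvement(mu, sigma, best_misfit):
    """Classical Expected Improvement for minimisation: expected reduction of the
    best misfit found so far, given kriging-predicted misfit means mu and
    standard deviations sigma for the remaining candidate fields."""
    mu, sigma = np.asarray(mu, float), np.maximum(np.asarray(sigma, float), 1e-12)
    z = (best_misfit - mu) / sigma
    return (best_misfit - mu) * norm.cdf(z) + sigma * norm.pdf(z)


def select_next_candidate(mu, sigma, best_misfit):
    """Index of the candidate parameter field to run next through the accurate simulator."""
    return int(np.argmax(expected_improvement(mu, sigma, best_misfit)))


# Illustrative kriging predictions for five remaining candidate fields:
mu = [0.80, 0.50, 1.20, 0.45, 0.90]
sigma = [0.05, 0.30, 0.10, 0.02, 0.40]
print(select_next_candidate(mu, sigma, best_misfit=0.50))  # picks the uncertain candidate 1
```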
Abstract:
In the context of expensive numerical experiments, a promising solution for alleviating the computational cost consists of using partially converged simulations instead of exact solutions. The gain in computational time comes at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge is the adequate approximation of the error due to partial convergence, which is correlated in both the design-variable and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that accurately reflects the actual structure of the error. Practical solutions are proposed for the parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
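As an illustration of a covariance kernel over the joint (design, computational time) space, the sketch below combines a stationary term for the converged response with an error term whose amplitude decays with computational time. This is one simple nonstationary construction under that general idea, not the kernel proposed in the paper; all parameter values are illustrative.

```python
import numpy as np


def rbf(x, x2, lengthscale):
    """Squared-exponential kernel between two design points."""
    d2 = np.sum((np.asarray(x, float) - np.asarray(x2, float)) ** 2)
    return float(np.exp(-0.5 * d2 / lengthscale ** 2))


def joint_kernel(x, t, x2, t2, ls_design=1.0, ls_error=0.5,
                 var_converged=1.0, var_error=0.25, tau=10.0):
    """Illustrative nonstationary covariance over (design, computational time):
    a stationary term for the fully converged response plus an error term whose
    amplitude decays as computational time grows. Not the paper's kernel."""
    converged = var_converged * rbf(x, x2, ls_design)
    error = var_error * rbf(x, x2, ls_error) * np.exp(-t / tau) * np.exp(-t2 / tau)
    return converged + error


# Covariance between a short run and a long run at nearby design points:
print(joint_kernel([0.10, 0.20], 2.0, [0.15, 0.25], 50.0))
```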
Abstract:
It is often claimed that scientists can obtain new knowledge about nature by running computer simulations. How is this possible? I answer this question by arguing that computer simulations are arguments. This view parallels Norton’s argument view about thought experiments. I show that computer simulations can be reconstructed as arguments that fully capture the epistemic power of the simulations. Assuming the extended mind hypothesis, I furthermore argue that running the computer simulation is to execute the reconstructing argument. I discuss some objections and reject the view that computer simulations produce knowledge because they are experiments. I conclude by comparing thought experiments and computer simulations, assuming that both are arguments.
Abstract:
The purpose of this evaluation project was to describe the integration of simulation into a nursing internship program and to help prepare new graduate nurses for patient care. Additionally, learning styles and perceptions of active learning, collaboration among peers, ways of learning, expectation of simulation, satisfaction, self-confidence, and design of simulation were examined. [See PDF for complete abstract]
Abstract:
Accurate estimation of the firn lock-in depth is essential for correctly linking gas and ice chronologies in ice core studies. Here, two approaches to constrain the evolution of the firn depth in Antarctica over the last deglaciation are presented: outputs of a firn densification model, and measurements of δ15N of N2 in air trapped in the ice, assuming that δ15N is only affected by gravitational fractionation in the firn column. Since the firn densification process is largely governed by surface temperature and accumulation rate, we have investigated four ice cores drilled in coastal (Berkner Island, BI, and James Ross Island, JRI) and semi-coastal (TALDICE and EPICA Dronning Maud Land, EDML) Antarctic regions. Combined with available ice core air-δ15N measurements from the EPICA Dome C (EDC) site, the studied regions encompass a large range of surface accumulation rates and temperature conditions. Our δ15N profiles reveal a heterogeneous response of the firn structure to glacial–interglacial climatic changes. While firn densification simulations correctly predict the TALDICE δ15N variations, they systematically fail to capture the large millennial-scale δ15N variations measured at BI and the glacial δ15N levels measured at JRI and EDML – a mismatch previously reported for central East Antarctic ice cores. New constraints on the EDML gas–ice depth offset during the Laschamp event (~41 ka) and the last deglaciation do not favour the hypothesis of a large convective zone within the firn as the explanation of the glacial mismatch between firn models and δ15N data at this site. While we could not conduct an in-depth study of the influence of impurities in snow on firnification from the existing datasets, our detailed comparison between the δ15N profiles and firn model simulations under different temperature and accumulation rate scenarios suggests that the role of the accumulation rate may have been underestimated in the current description of firnification models.
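Under the stated assumption that δ15N is affected only by gravitational fractionation, the standard barometric relation links δ15N to the height of the diffusive firn column; the sketch below evaluates and inverts that relation (thermal fractionation and a possible convective zone are deliberately ignored, and the example numbers are illustrative).

```python
import math

R = 8.314          # gas constant, J mol^-1 K^-1
G = 9.81           # gravitational acceleration, m s^-2
DELTA_M = 1.0e-3   # mass difference between 15N14N and 14N2, kg mol^-1


def delta15n_grav(z_m: float, t_k: float) -> float:
    """Gravitational enrichment (per mil) for a diffusive column of height z_m
    at mean firn temperature t_k, from the barometric relation."""
    return (math.exp(DELTA_M * G * z_m / (R * t_k)) - 1.0) * 1000.0


def diffusive_column_height(d15n_permil: float, t_k: float) -> float:
    """Invert the barometric relation to estimate the diffusive column height in m."""
    return math.log(d15n_permil / 1000.0 + 1.0) * R * t_k / (DELTA_M * G)


# Illustrative: 0.30 per mil at 230 K corresponds to a diffusive column of ~58 m.
print("%.1f m" % diffusive_column_height(0.30, 230.0))
```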
Abstract:
Understanding natural climate variability and its driving factors is crucial to assessing future climate change. Therefore, comparing proxy-based climate reconstructions with forcing factors as well as comparing these with paleoclimate model simulations is key to gaining insights into the relative roles of internal versus forced variability. A review of the state of modelling of the climate of the last millennium prior to the CMIP5–PMIP3 (Coupled Model Intercomparison Project Phase 5–Paleoclimate Modelling Intercomparison Project Phase 3) coordinated effort is presented and compared to the available temperature reconstructions. Simulations and reconstructions broadly agree on reproducing the major temperature changes and suggest an overall linear response to external forcing on multidecadal or longer timescales. Internal variability is found to have an important influence at hemispheric and global scales. The spatial distribution of simulated temperature changes during the transition from the Medieval Climate Anomaly to the Little Ice Age disagrees with that found in the reconstructions. Thus, either internal variability is a possible major player in shaping temperature changes through the millennium or the model simulations have problems realistically representing the response pattern to external forcing. A last millennium transient climate response (LMTCR) is defined to provide a quantitative framework for analysing the consistency between simulated and reconstructed climate. Beyond an overall agreement between simulated and reconstructed LMTCR ranges, this analysis is able to single out specific discrepancies between some reconstructions and the ensemble of simulations. The disagreement is found in the cases where the reconstructions show reduced covariability with external forcings or when they present high rates of temperature change.
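The abstract does not reproduce the LMTCR definition; as a hedged sketch of how such a transient response is commonly formalised (the paper's exact definition may differ in detail), a regression of hemispheric temperature anomalies on the total external forcing can be scaled by the forcing of a CO2 doubling:

$$ \mathrm{LMTCR} \;\approx\; \left.\frac{\Delta T}{\Delta F}\right|_{\mathrm{regression}} \times F_{2\times\mathrm{CO_2}}, \qquad F_{2\times\mathrm{CO_2}} \approx 3.7\ \mathrm{W\,m^{-2}}, $$

where ΔT/ΔF is the slope of (smoothed) temperature anomalies regressed on the total external forcing over the last millennium, so that the LMTCR carries the units of a temperature change.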