914 results for system models
Abstract:
Thesis (Ph.D.)--University of Washington, 2015
Abstract:
Existing parking simulations, like most simulations, are intended to provide insights into a system or to make predictions. The knowledge they provide has accumulated over the years, and several research efforts have devised detailed parking system models. This thesis describes the use of an agent-based parking simulation in the context of the development of a larger parking system. It focuses on flexibility rather than fidelity, showing when it is relevant for a parking simulation to consume dynamically changing GIS data from external, online sources and how this can be addressed. The simulation generates the parking occupancy information that sensing technologies should eventually produce and supplies it to the larger parking system. It is built as a Java application based on the MASON toolkit and consumes GIS data from an ArcGis Server. The application context of the implemented parking simulation is a university campus with free, on-street parking places.
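As a rough illustration of the agent-based occupancy loop described above, here is a minimal Python sketch; the spot loader, agent behaviour and probabilities are assumptions for illustration only, and the thesis implementation (Java, MASON toolkit, ArcGis Server feed) is not reproduced here.

```python
import random
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ParkingSpot:
    spot_id: str
    occupied: bool = False

@dataclass
class Driver:
    driver_id: int
    parked_at: Optional[str] = None

    def step(self, spots: List[ParkingSpot]) -> None:
        # Try to park in a random free spot; once parked, leave with a small probability.
        if self.parked_at is None:
            free = [s for s in spots if not s.occupied]
            if free:
                chosen = random.choice(free)
                chosen.occupied = True
                self.parked_at = chosen.spot_id
        elif random.random() < 0.1:
            for s in spots:
                if s.spot_id == self.parked_at:
                    s.occupied = False
            self.parked_at = None

def load_spots_from_gis(feature_records):
    # Stand-in for consuming dynamically changing GIS features
    # fetched from an external, online map service.
    return [ParkingSpot(spot_id=rec["id"]) for rec in feature_records]

if __name__ == "__main__":
    spots = load_spots_from_gis([{"id": f"S{i}"} for i in range(50)])
    drivers = [Driver(i) for i in range(80)]
    for _ in range(100):
        for d in drivers:
            d.step(spots)
    occupancy = sum(s.occupied for s in spots) / len(spots)
    print(f"simulated occupancy after 100 steps: {occupancy:.0%}")
```

The occupancy figure printed at the end corresponds to the kind of information the simulation supplies to the larger parking system in place of sensor data.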
Abstract:
In computational neuroscience, it has been hypothesized that the visual system, from the retina up to at least the primary visual cortex, continuously fits a probabilistic model with latent variables to its stream of perceptions. Neither the exact model nor the exact fitting method is known, but existing algorithms for fitting such models require conditional estimates of the latent variables. This can help us understand why the visual system might fit such a model: if the model is appropriate, these conditional estimates can also form an excellent representation for analysing the semantic content of the perceived images. The work presented here uses image classification performance (discrimination between common object types) as a basis for comparing models of the visual system, and algorithms for fitting these models (viewed as probability densities) to images. This thesis (a) shows that models based on the complex cells of visual area V1 generalize better from labelled training examples than conventional neural networks, whose hidden units more closely resemble V1 simple cells; (b) presents a new interpretation of complex-cell-based models of the visual system as probability distributions, together with new algorithms for fitting them to data; and (c) shows that these models form representations that are better for image classification after having been trained as probability models. Two additional technical innovations that made this work possible are also described: a random search algorithm for selecting hyperparameters, and a compiler for matrix mathematical expressions that can optimize these expressions for both central (CPU) and graphics (GPU) processors.
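As an illustration of the hyperparameter random search mentioned at the end of the abstract, here is a minimal sketch; the sampling ranges, parameter names and toy objective are assumptions for illustration, not the thesis code.

```python
import math
import random

def sample_config(rng):
    # Draw each hyperparameter independently from a broad prior;
    # log-uniform sampling is typical for scale-like parameters.
    return {
        "learning_rate": 10 ** rng.uniform(-5, -1),
        "n_hidden": rng.choice([64, 128, 256, 512, 1024]),
        "l2_penalty": 10 ** rng.uniform(-6, -2),
    }

def random_search(evaluate, n_trials=50, seed=0):
    rng = random.Random(seed)
    best_score, best_config = -math.inf, None
    for _ in range(n_trials):
        config = sample_config(rng)
        score = evaluate(config)  # e.g. validation classification accuracy
        if score > best_score:
            best_score, best_config = score, config
    return best_config, best_score

if __name__ == "__main__":
    # Toy objective standing in for "train a model and return validation accuracy".
    def fake_validation_accuracy(cfg):
        return -abs(math.log10(cfg["learning_rate"]) + 3) - cfg["l2_penalty"]

    best, score = random_search(fake_validation_accuracy, n_trials=100)
    print(best, score)
```

Because each trial is independent, the search parallelizes trivially and covers high-dimensional configuration spaces more evenly than a grid of the same size.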
Abstract:
This thesis presents a joint optimization of the hybrid operating strategy and the behaviour of the combustion engine. Transferring the function modules used in the electronic control unit (ECU) into the simulation environment for longitudinal vehicle dynamics provides an efficient way of applying the original parameterization. At the same time, the behaviour of the combustion engine must be reproduced in such a way that both the steady-state and the dynamic behaviour, including all relevant influencing factors, can be represented. The tool developed to transfer the ECU functions defined in Ascet into the Simulink simulation environment not only enables simulation of the relevant function modules but also fulfils further important requirements. Increased flexibility with respect to changes in data and function releases, as well as the parameterizability of the function modules, are improvements worth mentioning here. Artificial neural networks are used to model the steady-state system behaviour of the combustion engine. The optimal number of neurons is selected by considering the SSE (sum of squared errors) for the training and verification data. If necessary, the interpolation behaviour is improved by adding Gaussian process models in order to ensure the desired model quality. The Gaussian process models are used to generate additional support points, which are incorporated into the modelling with reduced priority. Linear transfer functions are used to model the dynamic system behaviour. When minimizing the deviation between the model output and the measurement results, the 2σ interval of the relative error distribution is considered in addition to the SSE. Implementing the ECU function modules and the derived actuator-sensor plant models in the simulation environment for longitudinal vehicle dynamics increases the simulation time and enlarges the parameter space. The cost-vector optimization approach known from control engineering contributes decisively to a systematic consideration and optimization of the target quantities. The result of the procedure is characterized by the optimum of the Pareto front of the individual design specifications. The increasing simulation times penalize minimum-search methods that require a large number of iterations. To avoid the use of a random variable, which contributes significantly to the growth in the number of iterations, while still globalizing the search in the parameter space, the newly developed DelaunaySearch method is employed. In contrast to well-known algorithms such as particle swarm optimization or evolutionary algorithms, the newly developed method relies on a systematic analysis of the simulation results obtained so far when searching for the minimum of a cost function. The information gained from this analysis is used to identify regions with the best prospects of containing a minimum. The iterative procedure thus dispenses with a random variable when determining the next iteration step. The computation yields a well-chosen starting point for a subsequent local optimization.
Building on the simulation of the longitudinal vehicle dynamics, the ECU functions and the actuator-sensor plant models within one simulation environment, the hybrid operating strategy is optimized jointly with the control of the combustion engine. Furthermore, the link between the operating strategy and the engine control is extended through the development and implementation of a new function. The tools presented here enabled not only testing of the new functionality but also an estimate of the potential improvements in fuel consumption and exhaust emissions. Overall, an efficient test environment for the joint optimization of the operating strategy and the combustion engine behaviour of a hybrid vehicle was realized.
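As an illustration of the Pareto-front idea underlying the cost-vector optimization described above, here is a minimal sketch; the design parameters and objective values are assumptions for illustration, and the DelaunaySearch method itself is not reproduced.

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly better in at least one
    # (all objectives are to be minimized, e.g. fuel consumption and an emission measure).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    # candidates: list of (design_parameters, objective_vector) pairs from simulation runs
    return [(params, obj) for params, obj in candidates
            if not any(dominates(other_obj, obj) for _, other_obj in candidates)]

if __name__ == "__main__":
    simulated = [
        ({"soc_target": 0.6, "load_point_shift": 10}, (5.1, 0.42)),
        ({"soc_target": 0.5, "load_point_shift": 20}, (4.9, 0.47)),
        ({"soc_target": 0.7, "load_point_shift": 5},  (5.3, 0.40)),
        ({"soc_target": 0.6, "load_point_shift": 25}, (5.2, 0.48)),  # dominated by the first entry
    ]
    for params, obj in pareto_front(simulated):
        print(params, obj)
```

Filtering the accumulated simulation results in this way yields the trade-off set from which a single design specification can then be chosen and refined by local optimization.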
Abstract:
As exploration of our solar system and outer space moves into the future, spacecraft are being developed to venture on increasingly challenging missions with bold objectives. The spacecraft tasked with completing these missions are becoming progressively more complex, which increases the potential for mission failure due to hardware malfunctions and unexpected spacecraft behavior. A solution to this problem lies in the development of an advanced fault management system. Fault management enables a spacecraft to respond to failures and take repair actions so that it may continue its mission. The two main approaches developed for spacecraft fault management have been rule-based and model-based systems. Rules map sensor information to system behaviors, thus achieving fast response times and making the actions of the fault management system explicit. These rules are developed by having a human reason through the interactions between spacecraft components, a process limited by the number of interactions a human can reason about correctly. In the model-based approach, the human provides component models, and the fault management system reasons automatically about system-wide interactions and complex fault combinations. This approach improves correctness and makes the underlying system models explicit, whereas they are implicit in the rule-based approach. We propose a fault detection engine, Compiled Mode Estimation (CME), that unifies the strengths of the rule-based and model-based approaches. CME uses a compiled model to determine spacecraft behavior more accurately. Reasoning related to fault detection is compiled in an off-line process into a set of concurrent, localized diagnostic rules. These are then combined on-line with sensor information to reconstruct the diagnosis of the system. These rules enable a human to inspect the diagnostic consequences of CME. Additionally, CME is capable of reasoning through component interactions automatically while still providing fast and correct responses. The implementation of this engine has been tested against the NEAR spacecraft's advanced rule-based system, resulting in detection of failures beyond those detected by the rules. This evolution in fault detection will enable future missions to explore the furthest reaches of the solar system without the burden of human intervention to repair failed components.
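As a rough illustration of how compiled, localized diagnostic rules can be combined on-line with sensor information, here is a minimal sketch; the rule structure, component names and thresholds are assumptions for illustration, not the CME implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class DiagnosticRule:
    component: str
    fault_mode: str
    # Condition compiled off-line from the component model; evaluated against live sensors.
    condition: Callable[[Dict[str, float]], bool]

def diagnose(rules: List[DiagnosticRule], sensors: Dict[str, float]) -> List[Tuple[str, str]]:
    # Combine the localized rules on-line: each firing rule contributes a candidate
    # (component, fault mode) to the reconstructed system diagnosis.
    return [(r.component, r.fault_mode) for r in rules if r.condition(sensors)]

if __name__ == "__main__":
    rules = [
        DiagnosticRule("thruster_A", "valve_stuck_closed",
                       lambda s: s["thruster_A_cmd"] > 0.5 and s["thrust_measured"] < 0.1),
        DiagnosticRule("battery", "cell_degraded",
                       lambda s: s["bus_voltage"] < 24.0 and s["solar_input"] > 0.8),
    ]
    sensors = {"thruster_A_cmd": 1.0, "thrust_measured": 0.02,
               "bus_voltage": 27.5, "solar_input": 0.9}
    print(diagnose(rules, sensors))  # -> [('thruster_A', 'valve_stuck_closed')]
```

Because each rule is local and explicit, a human can inspect exactly which sensor conditions lead to which diagnosis, while the rule set as a whole was derived automatically rather than hand-written.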
Abstract:
Shelf and coastal seas are regions of exceptionally high biological productivity, high rates of biogeochemical cycling and immense socio-economic importance. They are, however, poorly represented by the present generation of Earth system models, both in terms of resolution and process representation. Hence, these models cannot be used to elucidate the role of the coastal ocean in global biogeochemical cycles and the effects that global change (both direct anthropogenic and climatic) is having on them. Here, we present a system for simulating all the coastal regions around the world (the Global Coastal Ocean Modelling System) in a systematic and practical fashion. It is based on automatically generating multiple nested model domains, using the Proudman Oceanographic Laboratory Coastal Ocean Modelling System coupled to the European Regional Seas Ecosystem Model. Preliminary results from the system are presented. These demonstrate the viability of the concept, and we discuss the prospects for using the system to explore key areas of global change in shelf seas, such as their role in the carbon cycle and climate change effects on fisheries.
Abstract:
The Atlantic meridional overturning circulation (AMOC) is an important component of the climate system. Models indicate that the AMOC can be perturbed by freshwater forcing in the North Atlantic. Using an ocean-atmosphere general circulation model, we investigate the dependence of such a perturbation of the AMOC, and the consequent climate change, on the region of freshwater forcing. A wide range of changes in AMOC strength is found after 100 years of freshwater forcing. The largest changes in AMOC strength occur when the regions of deepwater formation in the model are forced directly, although reductions in deepwater formation in one area may be compensated by enhanced formation elsewhere. North Atlantic average surface air temperatures correlate linearly with the AMOC decline, but warming may occur in localised regions, notably over Greenland and where deepwater formation is enhanced. This brings into question the representativeness of temperature changes inferred from Greenland ice-core records.
Abstract:
The purpose of Research Theme 4 (RT4) was to advance understanding of the basic science issues at the heart of the ENSEMBLES project, focusing on the key processes that govern climate variability and change, and that determine the predictability of climate. Particular attention was given to understanding linear and non-linear feedbacks that may lead to climate surprises, and to understanding the factors that govern the probability of extreme events. Improved understanding of these issues will contribute significantly to the quantification and reduction of uncertainty in seasonal to decadal predictions and projections of climate change. RT4 exploited the ENSEMBLES integrations (stream 1) performed in RT2A as well as undertaking its own experimentation to explore key processes within the climate system. It was working at the cutting edge of problems related to climate feedbacks, the interaction between climate variability and climate change, especially how climate change pertains to extreme events, and the predictability of the climate system on a range of time-scales. The statistical methodologies developed for extreme event analysis are new and state-of-the-art. The RT4-coordinated experiments, which have been conducted with six different atmospheric GCMs forced by common time-invariant sea surface temperature (SST) and sea-ice fields (removing some sources of inter-model variability), are designed to help to understand model uncertainty (rather than scenario or initial condition uncertainty) in predictions of the response to greenhouse-gas-induced warming. RT4 links strongly with RT5 on the evaluation of the ENSEMBLES prediction system and feeds back its results to RT1 to guide improvements in the Earth system models and, through its research on predictability, to steer the development of methods for initialising the ensembles.
Abstract:
Perturbations to the carbon cycle could constitute large feedbacks on future changes in atmospheric CO2 concentration and climate. This paper demonstrates how carbon cycle feedback can be expressed in formally similar ways to climate feedback, and thus compares their magnitudes. The carbon cycle gives rise to two climate feedback terms: the concentration–carbon feedback, resulting from the uptake of carbon by land and ocean as a biogeochemical response to the atmospheric CO2 concentration, and the climate–carbon feedback, resulting from the effect of climate change on carbon fluxes. In the earth system models of the Coupled Climate–Carbon Cycle Model Intercomparison Project (C4MIP), climate–carbon feedback on warming is positive and of a similar size to the cloud feedback. The concentration–carbon feedback is negative; it has generally received less attention in the literature, but in magnitude it is 4 times larger than the climate–carbon feedback and more uncertain. The concentration–carbon feedback is the dominant uncertainty in the allowable CO2 emissions that are consistent with a given CO2 concentration scenario. In modeling the climate response to a scenario of CO2 emissions, the net carbon cycle feedback is of comparable size and uncertainty to the noncarbon–climate response. To quantify simulated carbon cycle feedbacks satisfactorily, a radiatively coupled experiment is needed, in addition to the fully coupled and biogeochemically coupled experiments, which are referred to as coupled and uncoupled in C4MIP. The concentration–carbon and climate–carbon feedbacks do not combine linearly, and the concentration–carbon feedback is dependent on scenario and time.
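In the notation commonly used for these two carbon cycle terms in C4MIP-style analyses (a sketch of the standard linear decomposition; the symbols here are conventional choices rather than quotations from the paper), the change in combined land and ocean carbon storage is split into a response to atmospheric CO2 and a response to temperature change:

```latex
\Delta C_{\mathrm{L}} + \Delta C_{\mathrm{O}} \;=\; \beta\,\Delta C_{\mathrm{A}} \;+\; \gamma\,\Delta T
```

Here β (carbon stored per unit rise in atmospheric CO2) expresses the concentration–carbon response and γ (carbon stored per degree of warming) the climate–carbon response. With β > 0 and γ < 0, the first term acts as a negative feedback on atmospheric CO2 and the second as a positive one, consistent with the signs described above.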
Abstract:
The Atlantic thermohaline circulation (THC) is an important part of the earth's climate system. Previous research has shown large uncertainties in simulating future changes in this critical system. The simulated THC response to idealized freshwater perturbations and the associated climate changes have been intercompared as an activity of World Climate Research Program (WCRP) Coupled Model Intercomparison Project/Paleo-Modeling Intercomparison Project (CMIP/PMIP) committees. This intercomparison among models ranging from the earth system models of intermediate complexity (EMICs) to the fully coupled atmosphere-ocean general circulation models (AOGCMs) seeks to document and improve understanding of the causes of the wide variations in the modeled THC response. The robustness of particular simulation features has been evaluated across the model results. In response to 0.1-Sv (1 Sv is equivalent to 10^6 m^3 s^-1) freshwater input in the northern North Atlantic, the multimodel ensemble mean THC weakens by 30% after 100 yr. All models simulate some weakening of the THC, but no model simulates a complete shutdown of the THC. The multimodel ensemble indicates that the surface air temperature could present a complex anomaly pattern with cooling south of Greenland and warming over the Barents and Nordic Seas. The Atlantic ITCZ tends to shift southward. In response to 1.0-Sv freshwater input, the THC switches off rapidly in all model simulations. A large cooling occurs over the North Atlantic. The annual mean Atlantic ITCZ moves into the Southern Hemisphere. Models disagree in terms of the reversibility of the THC after its shutdown. In general, the EMICs and AOGCMs obtain similar THC responses and climate changes, with more pronounced and sharper patterns in the AOGCMs.
Abstract:
Inverse problems for dynamical system models of cognitive processes comprise the determination of synaptic weight matrices or kernel functions for neural networks or neural/dynamic field models, respectively. We introduce dynamic cognitive modeling as a three tier top-down approach where cognitive processes are first described as algorithms that operate on complex symbolic data structures. Second, symbolic expressions and operations are represented by states and transformations in abstract vector spaces. Third, prescribed trajectories through representation space are implemented in neurodynamical systems. We discuss the Amari equation for a neural/dynamic field theory as a special case and show that the kernel construction problem is particularly ill-posed. We suggest a Tikhonov-Hebbian learning method as regularization technique and demonstrate its validity and robustness for basic examples of cognitive computations.
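For orientation, the Amari neural field equation referred to above is usually written in the standard form below, and the kernel construction problem asks for the kernel w given prescribed trajectories u. The Tikhonov-regularized least-squares functional that follows is a generic sketch of the kind of regularization meant here, not the paper's exact formulation:

```latex
% Amari neural field equation (standard form)
\tau\,\frac{\partial u(x,t)}{\partial t}
  = -\,u(x,t) + \int_{\Omega} w(x,y)\, f\bigl(u(y,t)\bigr)\, \mathrm{d}y + h(x)

% Generic Tikhonov-regularized kernel estimate for a prescribed trajectory u(x,t):
% minimize, for each x, a data-misfit term plus a penalty on the kernel norm
\min_{w(x,\cdot)}\;
  \int_0^T \Bigl(\tau\,\partial_t u(x,t) + u(x,t) - h(x)
      - \int_{\Omega} w(x,y)\, f\bigl(u(y,t)\bigr)\, \mathrm{d}y \Bigr)^{\!2}\mathrm{d}t
  \;+\; \alpha \int_{\Omega} w(x,y)^2\, \mathrm{d}y
```

Because many different kernels can reproduce the same finite set of trajectories, the data-misfit term alone is ill-posed; the penalty with regularization parameter α > 0 selects a stable, minimum-norm kernel.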
Abstract:
Dual-system models suggest that English past tense morphology involves two processing routes: rule application for regular verbs and memory retrieval for irregular verbs (Pinker, 1999). In second language (L2) processing research, Ullman (2001a) suggested that both verb types are retrieved from memory, but more recently Clahsen and Felser (2006) and Ullman (2004) argued that past tense rule application can be automatised with experience by L2 learners. To address this controversy, we tested highly proficient Greek-English learners with naturalistic or classroom L2 exposure, compared to native English speakers, in a self-paced reading task involving past tense forms embedded in plausible sentences. Our results suggest that, irrespective of the type of exposure, proficient L2 learners with extended L2 exposure apply rule-based processing.
Abstract:
The World Weather Research Programme (WWRP) and the World Climate Research Programme (WCRP) have identified collaborations and scientific priorities to accelerate advances in analysis and prediction at subseasonal-to-seasonal time scales, which include i) advancing knowledge of mesoscale–planetary-scale interactions and their prediction; ii) developing high-resolution global–regional climate simulations, with advanced representation of physical processes, to improve the predictive skill of subseasonal and seasonal variability of high-impact events, such as seasonal droughts and floods, blocking, and tropical and extratropical cyclones; iii) contributing to the improvement of data assimilation methods for monitoring and prediction used in coupled ocean–atmosphere–land and Earth system models; and iv) developing and transferring diagnostic and prognostic information tailored to socioeconomic decision making. The document puts forward the specific underpinning research, linkages, and requirements necessary to achieve the goals of the proposed collaboration.
Abstract:
Plant traits – the morphological, anatomical, physiological, biochemical and phenological characteristics of plants and their organs – determine how primary producers respond to environmental factors, affect other trophic levels, influence ecosystem processes and services and provide a link from species richness to ecosystem functional diversity. Trait data thus represent the raw material for a wide range of research from evolutionary biology, community and functional ecology to biogeography. Here we present the global database initiative named TRY, which has united a wide range of the plant trait research community worldwide and gained an unprecedented buy-in of trait data: so far 93 trait databases have been contributed. The data repository currently contains almost three million trait entries for 69 000 out of the world's 300 000 plant species, with a focus on 52 groups of traits characterizing the vegetative and regeneration stages of the plant life cycle, including growth, dispersal, establishment and persistence. A first data analysis shows that most plant traits are approximately log-normally distributed, with widely differing ranges of variation across traits. Most trait variation is between species (interspecific), but significant intraspecific variation is also documented, up to 40% of the overall variation. Plant functional types (PFTs), as commonly used in vegetation models, capture a substantial fraction of the observed variation – but for several traits most variation occurs within PFTs, up to 75% of the overall variation. In the context of vegetation models these traits would better be represented by state variables rather than fixed parameter values. The improved availability of plant trait data in the unified global database is expected to support a paradigm shift from species to trait-based ecology, offer new opportunities for synthetic plant trait research and enable a more realistic and empirically grounded representation of terrestrial vegetation in Earth system models.
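As an illustration of the between- versus within-species (interspecific versus intraspecific) variance partitioning reported above, here is a minimal sketch; the one-way sum-of-squares decomposition and the example trait values are assumptions for illustration, not the TRY analysis pipeline.

```python
from collections import defaultdict

def variance_partition(records):
    # records: list of (species, trait_value); classic one-way decomposition of the
    # total sum of squares into between-species and within-species components.
    by_species = defaultdict(list)
    for species, value in records:
        by_species[species].append(value)
    values = [v for _, v in records]
    grand_mean = sum(values) / len(values)
    ss_total = sum((v - grand_mean) ** 2 for v in values)
    ss_between = sum(len(vs) * (sum(vs) / len(vs) - grand_mean) ** 2
                     for vs in by_species.values())
    ss_within = ss_total - ss_between
    return ss_between / ss_total, ss_within / ss_total

if __name__ == "__main__":
    # Hypothetical trait measurements (e.g. a leaf trait) for two species.
    data = [("Quercus robur", 48.2), ("Quercus robur", 51.0), ("Quercus robur", 55.3),
            ("Pinus sylvestris", 3.1), ("Pinus sylvestris", 3.6), ("Pinus sylvestris", 2.9)]
    inter, intra = variance_partition(data)
    print(f"interspecific: {inter:.1%}, intraspecific: {intra:.1%}")
```

For approximately log-normally distributed traits, the same decomposition would typically be applied to log-transformed values, and the grouping can equally be done by plant functional type instead of species.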
Abstract:
The scientific understanding of the Earth’s climate system, including the central question of how the climate system is likely to respond to human-induced perturbations, is comprehensively captured in GCMs and Earth System Models (ESM). Diagnosing the simulated climate response, and comparing responses across different models, is crucially dependent on transparent assumptions of how the GCM/ESM has been driven – especially because the implementation can involve subjective decisions and may differ between modelling groups performing the same experiment. This paper outlines the climate forcings and setup of the Met Office Hadley Centre ESM, HadGEM2-ES for the CMIP5 set of centennial experiments. We document the prescribed greenhouse gas concentrations, aerosol precursors, stratospheric and tropospheric ozone assumptions, as well as implementation of land-use change and natural forcings for the HadGEM2-ES historical and future experiments following the Representative Concentration Pathways. In addition, we provide details of how HadGEM2-ES ensemble members were initialised from the control run and how the palaeoclimate and AMIP experiments, as well as the “emission driven” RCP experiments were performed.