907 results for Hyperbolic Dynamic System
Abstract:
More than a century ago, Ramón y Cajal pioneered the description of neural circuits. Currently, new techniques are being developed to streamline the characterization of entire neural circuits. Even if this 'connectome' approach is successful, it will represent only a static description of neural circuits. Thus, a fundamental question in neuroscience is to understand how information is dynamically represented by neural populations. In this thesis, I studied two main aspects of dynamical population codes.

First, I studied how exposure, or adaptation, to oriented gratings for a fraction of a second dynamically changes the population response of primary visual cortex neurons. The effects of adaptation to oriented gratings have been extensively explored in psychophysical and electrophysiological experiments. However, whether rapid adaptation induces a change in the primary visual cortex's functional connectivity that dynamically impacts population coding accuracy is currently unknown. To address this issue, we performed multi-electrode recordings in primary visual cortex, where adaptation has previously been shown to induce changes in the selectivity and response amplitude of individual neurons. We found that adaptation improves population coding accuracy. The improvement was more prominent for iso- and orthogonal-orientation adaptation, consistent with previously reported psychophysical experiments. We propose that selective decorrelation is a metabolically inexpensive mechanism that the visual system employs to dynamically adapt neural responses to the statistics of the input stimuli and improve coding efficiency.

Second, I investigated how ongoing activity modulates orientation coding in single neurons, neural populations, and behavior. Cortical networks are never silent, even in the absence of external stimulation. Ongoing activity can account for up to 80% of the metabolic energy consumed by the brain. Thus, a fundamental question is to understand the functional role of ongoing activity and its impact on neural computations. I studied how orientation coding by individual neurons and cell populations in primary visual cortex depends on the spontaneous activity before stimulus presentation. We hypothesized that, since the ongoing activity of nearby neurons is strongly correlated, it would influence the ability of the entire population of orientation-selective cells to process orientation, depending on the prestimulus spontaneous state. Our findings demonstrate that ongoing activity dynamically filters incoming stimuli to shape the accuracy of orientation coding by individual neurons and cell populations, and that this interaction affects behavioral performance.

In summary, this thesis is a contribution to the study of how dynamic internal states, such as rapid adaptation and ongoing activity, modulate population code accuracy.
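As a rough, hypothetical illustration of the decoding analyses described above (none of this is the thesis code; the tuning model, noise structure, and all parameter values are assumptions), the sketch below decodes stimulus orientation from a simulated V1 population and shows how reducing shared, correlated noise, loosely analogous to the selective decorrelation proposed in the thesis, can change linear decoding accuracy:

```python
# Hypothetical sketch: orientation decoding from a simulated population,
# before vs. after "decorrelation" (reduced shared gain noise).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_neurons, n_trials = 50, 400
prefs = np.linspace(0, np.pi, n_neurons, endpoint=False)  # preferred orientations

def simulate(theta_a, theta_b, gain_corr):
    """Simulate trials of two nearby orientations with shared gain noise."""
    X, y = [], []
    for label, theta in enumerate((theta_a, theta_b)):
        tuning = np.exp(2.0 * (np.cos(2 * (prefs - theta)) - 1))  # von Mises-like
        for _ in range(n_trials // 2):
            shared = rng.normal()                 # common source -> correlations
            private = rng.normal(size=n_neurons)  # independent noise
            rate = tuning * (1 + gain_corr * shared) + 0.3 * private
            X.append(rate)
            y.append(label)
    return np.array(X), np.array(y)

for rho, tag in [(0.5, "correlated (pre-adaptation)"),
                 (0.1, "decorrelated (post-adaptation)")]:
    X, y = simulate(np.pi / 4, np.pi / 4 + np.deg2rad(10), rho)
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
    print(f"{tag}: decoding accuracy = {acc:.2f}")
```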
Abstract:
A diverse suite of geochemical tracers, including 87Sr/86Sr and 143Nd/144Nd isotope ratios, the rare earth elements (REEs), and select trace elements, was used to determine sand-sized sediment provenance and transport pathways within the San Francisco Bay coastal system. This study complements a large interdisciplinary effort (Barnard et al., 2012) that seeks to better understand recent geomorphic change in a highly urbanized and dynamic estuarine-coastal setting. Sand-sized sediment provenance in this geologically complex system is important to estuarine resource managers and was assessed by examining the geographic distribution of this suite of geochemical tracers from the primary sources (fluvial and rock) throughout the bay, adjacent coast, and beaches. Due to their intrinsic geochemical nature, 143Nd/144Nd isotopic ratios provide the most resolved picture of where sediment in this system is likely sourced and how it moves through this estuarine system into the Pacific Ocean. For example, Nd isotopes confirm that the predominant source of sand-sized sediment to Suisun Bay, San Pablo Bay, and Central Bay is the Sierra Nevada Batholith via the Sacramento River, with lesser contributions from the Napa and San Joaquin Rivers. Isotopic ratios also reveal hot spots of local sediment accumulation, such as the basalt and chert deposits around the Golden Gate Bridge and the high-magnetite deposits of Ocean Beach. Sand-sized sediment that exits San Francisco Bay accumulates on the ebb-tidal delta and is in part conveyed southward by longshore currents. Broadly, the geochemical tracers reveal a complex story of multiple sediment sources, dynamic intra-bay sediment mixing and reworking, and eventual dilution and transport by energetic marine processes. The combined geochemical results provide information on sediment movement into and through San Francisco Bay and further our understanding of how sustained anthropogenic activities that limit sediment inputs to the system (e.g., dike and dam construction), as well as those that directly remove sediments from within the Bay (e.g., aggregate mining and dredging), can have long-lasting effects.
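For readers unfamiliar with how isotope ratios constrain provenance, here is a minimal sketch of a standard two-endmember Nd mixing calculation; the endmember ratios and Nd concentrations below are placeholders, not data from this study:

```python
# Hedged illustration: solving a binary isotope mixing equation for the
# mass fraction of one sediment source (all numbers are placeholders).
def source_fraction(r_mix, r_a, r_b, c_a, c_b):
    """Mass fraction f of endmember A in a binary sediment mixture.

    Mixing model: R_mix = (f*C_a*R_a + (1-f)*C_b*R_b) / (f*C_a + (1-f)*C_b),
    where R is the 143Nd/144Nd ratio and C is the Nd concentration (ppm).
    """
    num = c_b * (r_mix - r_b)
    den = c_a * (r_a - r_mix) + c_b * (r_mix - r_b)
    return num / den

# Placeholder endmembers: e.g., Sierra Nevada-like vs. Coast Range-like sand.
f = source_fraction(r_mix=0.51255, r_a=0.51260, r_b=0.51240,
                    c_a=20.0, c_b=15.0)
print(f"fraction from source A: {f:.2f}")   # -> 0.69 with these placeholders
```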
Abstract:
We introduce two probabilistic, data-driven models that predict a ship's speed and the situations where a ship is likely to get stuck in ice, based on the joint effect of ice features such as the thickness and concentration of level ice, ice ridges, and rafted ice; ice compression is also considered. To develop the models, two datasets were utilized. First, data from the Automatic Identification System about the performance of a selected ship were used. Second, a numerical ice model, HELMI, developed at the Finnish Meteorological Institute, provided information about the ice field. The relations between the ice conditions and ship movements were established using Bayesian learning algorithms. The case study presented in this paper considers a single, unassisted trip of an ice-strengthened bulk carrier between two Finnish ports in the presence of challenging ice conditions, which varied in time and space. The obtained results show good predictive power of the models: on average, 80% accuracy for predicting the ship's speed within specified bins, and above 90% for predicting cases where a ship may get stuck in ice. We expect this new approach to facilitate safe and effective route selection in ice-covered waters, where ship performance is reflected in the objective function.
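As a hedged sketch of the general approach (the paper's Bayesian models, features, and data are not reproduced here; the synthetic data, feature names, and the naive Bayes stand-in below are all assumptions):

```python
# Illustrative sketch only: predicting speed bins and "stuck in ice" events
# from ice features with a simple Bayesian classifier on synthetic data.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 2000
ice_thickness = rng.uniform(0.0, 0.8, n)   # level-ice thickness [m] (assumed)
concentration = rng.uniform(0.0, 1.0, n)   # ice concentration [0..1]
ridging = rng.uniform(0.0, 1.0, n)         # ridge density proxy
compression = rng.integers(0, 2, n)        # ice compression present?

# Synthetic "true" speed [kn], then discretized into bins as in the paper.
speed = (14 - 8 * ice_thickness - 4 * concentration - 3 * ridging
         - 2 * compression + rng.normal(0, 1, n))
bins = np.digitize(speed, [4, 8, 12])      # four speed bins

X = np.column_stack([ice_thickness, concentration, ridging, compression])
acc = cross_val_score(GaussianNB(), X, bins, cv=5).mean()
print(f"speed-bin prediction accuracy: {acc:.2f}")

# "Stuck in ice" indicator: speed below an (assumed) threshold.
stuck = (speed < 2).astype(int)
acc_stuck = cross_val_score(GaussianNB(), X, stuck, cv=5).mean()
print(f"besetting prediction accuracy: {acc_stuck:.2f}")
```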
Abstract:
We compared particle data from a moored video camera system with sediment trap derived fluxes at ~1100 m depth in the highly dynamic coastal upwelling system off Cape Blanc, Mauritania. Between spring 2008 and winter 2010, the trap collected settling particles in 9-day intervals, while the camera recorded in-situ particle abundance and size distribution every third day. Particle fluxes were highly variable (40-1200 mg m**-2 d**-1) and followed distinct seasonal patterns, with peaks during spring, summer, and fall. The particle flux patterns from the sediment traps correlated with the total particle volume captured by the video camera, which ranged from 1 to 22 mm**3 l**-1. The measured increase in total particle volume during periods of high mass flux appeared to be related to increases in particle concentration rather than to increases in average particle size. We observed events that had similar particle fluxes but showed clear differences in particle abundance and size distribution, and vice versa. Such observations can only be explained by shifts in the composition of the settling material, with changes in both particle density and chemical composition. For example, the input of wind-blown dust from the Sahara during September 2009 led to the formation of high numbers of comparably small particles in the water column. This suggests that, besides seasonal changes, the composition of marine particles in a given region also undergoes episodic changes. The time between the appearance of high dust concentrations in the atmosphere and the increase in lithogenic flux in the 1100 m deep trap suggested an average settling rate of 200 m d**-1, indicating a close and fast coupling between dust input and the sedimentation of the material.
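The quoted settling rate follows from simple arithmetic on the trap depth and the observed time lag; a minimal check (the ~5.5-day lag below is inferred from the stated depth and rate, not quoted from the paper):

```python
# Back-of-envelope check of the average settling rate quoted above.
trap_depth_m = 1100        # trap depth stated in the abstract [m]
lag_days = 5.5             # assumed atmosphere-to-trap delay [d]
settling_rate = trap_depth_m / lag_days
print(f"average settling rate = {settling_rate:.0f} m/day")  # ~200 m/day
```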
Abstract:
The paper aims to develop a quasi-dynamic interregional input-output model for evaluating the macro-economic impacts of small city development. The features of the model are summarized as follows: (1) the consumption expenditure of households is regarded as an endogenous variable, (2) technological change is determined by the change in industrial Location Quotient caused by firms' investment activities, and (3) a strong feedback function between the city design and the economic analysis is provided. To check the performance of the model, Saemangeum's Flux City Design Plan is used as the simulation target.
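To make the input-output core concrete, here is a minimal sketch of the underlying Leontief accounting, including the endogenization of household consumption mentioned in feature (1); the two-sector coefficients below are invented for illustration, and the paper's model is interregional and quasi-dynamic on top of this structure:

```python
# Minimal Leontief input-output sketch (assumed 2-sector numbers).
import numpy as np

A = np.array([[0.20, 0.30],          # technical coefficients
              [0.10, 0.25]])
f = np.array([100.0, 50.0])          # final demand by sector

x = np.linalg.solve(np.eye(2) - A, f)   # total output: x = (I - A)^-1 f
print("sector outputs:", x)

# Endogenizing households (a "closed" IO model): append a household
# consumption column and a labor-income row, then solve the enlarged system.
hc = np.array([0.15, 0.20])          # household consumption coefficients
lr = np.array([0.30, 0.25])          # labor input coefficients
A_closed = np.block([[A, hc[:, None]],
                     [lr[None, :], np.zeros((1, 1))]])
f_closed = np.array([100.0, 50.0, 0.0])
x_closed = np.linalg.solve(np.eye(3) - A_closed, f_closed)
print("closed-model outputs:", x_closed)
```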
Abstract:
This paper uses a GVC (Global Value Chain)-based CGE model to assess the impact of the TTIP between the U.S. and the EU on their main trading partners, such as the BRICS countries, which are mainly engaged at the low end of the global value chain division of labor. The simulation results indicate that, in general, the TTIP would positively impact global trade and economies due to the reduction of both tariff and non-tariff barriers. With large increases in US–EU bilateral trade, significant economic gains for the U.S. and the EU can be expected. For most BRICS countries, aggregate exports and GDP suffer small negative impacts from the TTIP (Brazil being the exception), but inter-country trade within the BRICS economies increases, due to the substitution effect between US–EU trade and imports from BRICS countries once the TTIP commences.
Abstract:
Models are an effective tool for systems and software design. They allow software architects to abstract from non-relevant details. These qualities are also useful for the technical management of networks, systems, and software, such as those that compose service-oriented architectures. Models can provide a set of well-defined abstractions over the distributed heterogeneous service infrastructure that enable its automated management. We propose to use the managed system as a source of dynamically generated runtime models, and to decompose management processes into a composition of model transformations. We have created an autonomic service deployment and configuration architecture that obtains, analyzes, and transforms system models to apply the required actions, while remaining oblivious to the low-level details. An instrumentation layer automatically builds these models and translates the planned management actions into operations on the system. We illustrate these concepts with a distributed service update operation.
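As a purely hypothetical sketch of the idea (the class, field, and action names below are illustrative, not the authors' API): management logic expressed as a transformation from a runtime model to a list of actions that an instrumentation layer would execute:

```python
# Hypothetical sketch: a management process as a model transformation.
from dataclasses import dataclass

@dataclass
class ServiceModel:              # runtime model built by the instrumentation layer
    name: str
    version: str
    node: str

def plan_update(model: ServiceModel, target_version: str) -> list[str]:
    """Transform a system model into an ordered list of management actions."""
    if model.version == target_version:
        return []                # nothing to do; model already matches target
    return [f"drain {model.name} on {model.node}",
            f"deploy {model.name}:{target_version} to {model.node}",
            f"activate {model.name}:{target_version}"]

current = ServiceModel(name="billing", version="1.2", node="node-7")
for action in plan_update(current, target_version="1.3"):
    print(action)                # the instrumentation layer would execute these
```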
Abstract:
In this paper, the dynamic response of a hydro power plant providing secondary regulation reserve is studied in detail. Special emphasis is given to the elastic water column effects, both in the penstock and in the tailrace tunnel. For this purpose, a nonlinear model is used, based on the analogy between the mass and momentum conservation equations of a water conduit and those of wave propagation in transmission lines. The influence of the plant configuration and design parameters on the fulfilment of the Spanish Electrical System Operator requirements is analysed.
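For reference, the elastic water-column behaviour invoked above is classically described by the water hammer (mass and momentum conservation) equations, which is what makes the transmission-line analogy possible; the notation below is the textbook convention, not necessarily the paper's:

```latex
% Elastic water column in a conduit (classical water hammer form):
%   H = piezometric head, Q = flow rate, a = pressure-wave speed,
%   A = conduit cross-section, D = diameter, f = friction factor, g = gravity.
\begin{align}
  \frac{\partial H}{\partial t} + \frac{a^{2}}{gA}\,\frac{\partial Q}{\partial x} &= 0
    && \text{(mass conservation)} \\
  \frac{\partial Q}{\partial t} + gA\,\frac{\partial H}{\partial x}
    + \frac{f\,Q\,\lvert Q\rvert}{2DA} &= 0
    && \text{(momentum conservation)}
\end{align}
```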
Abstract:
This article describes a new visual servo control scheme and the strategies used to carry out dynamic tasks on the Robotenis platform. This platform is basically a parallel robot equipped with a system for the acquisition and processing of visual information; its main feature is a completely open control architecture, designed in order to implement, test, and compare control strategies and algorithms (visual and actuated-joint controllers). The following sections describe a new visual control strategy specially designed to track and intercept objects in 3D space. The results are compared with a controller presented in previous works, where the end effector of the robot keeps a constant distance from the tracked object. In this work, the controller is specially designed to allow changes in the tracking reference. Changes in the tracking reference can be used to grip a moving object or, as in this case, to hit a hanging Ping-Pong ball. Lyapunov stability is taken into account in the controller design.
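As a generic, hedged illustration of the kind of tracking law the abstract alludes to (this is a textbook proportional servo with a time-varying reference, not the Robotenis controller; all values are invented):

```python
# Illustrative tracking step: drive the end effector toward a target with a
# changeable tracking reference (offset), with a Lyapunov-stable error decay.
import numpy as np

def servo_step(p_ee, p_target, offset, lam=2.0, dt=0.01):
    """One control step on the task error e = p_ee - p_ref.

    With commanded velocity v = -lam * e, the error obeys e_dot = -lam * e,
    which is stable for V(e) = 0.5 * e.T @ e since V_dot = -lam * ||e||^2 <= 0.
    """
    p_ref = p_target + offset      # tracking reference; offset may change online
    e = p_ee - p_ref
    v = -lam * e                   # commanded Cartesian velocity
    return p_ee + v * dt

p_ee = np.array([0.3, 0.0, 0.5])          # end-effector position [m]
ball = np.array([0.0, 0.0, 1.0])          # target (could move each step)
offset = np.array([0.0, 0.0, -0.1])       # e.g., approach from below to hit
for _ in range(500):
    p_ee = servo_step(p_ee, ball, offset)
print("final end-effector position:", p_ee)   # converges to ball + offset
```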
Abstract:
This paper introduces a method to analyze and predict the stability and transient performance of a distributed system where COTS (commercial-off-the-shelf) modules share an input filter. The presented procedure is based on data measured at the input and output terminals of the power modules. The required information for the analysis is obtained by performing frequency response measurements for each converter. The obtained data are used to compute special transfer functions, which partly determine the source and load interactions within the converters. The system-level dynamic description is constructed from the measured and computed transfer functions, introducing the cross-coupling mechanisms within the system. System stability can be studied based on the well-known impedance-related minor-loop gain at an arbitrary interface within the system.
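A minimal sketch of the impedance-based criterion named above (the filter and load values are placeholders; the actual paper builds the system description from measured transfer functions):

```python
# Hedged sketch: minor-loop gain T = Z_source / Z_load at an interface,
# here an ideal LC input filter feeding a constant-power-like load.
import numpy as np

f = np.logspace(1, 5, 400)                       # 10 Hz .. 100 kHz
s = 2j * np.pi * f

L, C, R_cpl = 10e-6, 100e-6, -4.0                # placeholder filter + CPL
Z_L = s * L
Z_C = 1.0 / (s * C)
Z_source = Z_L * Z_C / (Z_L + Z_C)               # filter output impedance (L || C)
Z_load = np.full(s.shape, R_cpl, dtype=complex)  # constant-power load: negative R

T_minor = Z_source / Z_load                      # minor-loop gain
# A simple, conservative screen: |T| < 1 everywhere is sufficient for
# stability; otherwise a full Nyquist analysis of T_minor is needed.
print("max |T_minor| =", np.abs(T_minor).max())
```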
Abstract:
Driver behaviour, as well as the surrounding infrastructure, decisively influences pollutant emissions from vehicles in real traffic situations. This article deals with a preliminary study of the interaction between the dynamic variables recorded in a vehicle (the driving pattern) and the pollutant emissions produced over a given urban route. A "dynamic performance index" (DPI) has been established, calculated from driving pattern parameters that in turn depend on the traffic congestion level and route characteristics, in order to determine whether the driving has been aggressive, normal, or calm. Two passenger cars instrumented with a portable activity measurement system (to record dynamic variables) and on-board emission measurement equipment have been used. This study has shown that smooth driving patterns can reduce NOx emissions by up to 80% and fuel consumption by up to 20% on the same route.
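Since the article's exact DPI formula is not given in the abstract, here is a hedged sketch using relative positive acceleration (RPA), a common aggressiveness measure, as a stand-in; the thresholds and the synthetic speed trace are invented:

```python
# Illustrative stand-in for a driving-pattern index: relative positive
# acceleration (RPA) computed from a recorded speed trace.
import numpy as np

def dynamic_performance_index(speed_ms, dt=1.0):
    """RPA = sum(v * a_plus * dt) / distance, from a 1 Hz speed trace [m/s]."""
    a = np.diff(speed_ms) / dt                   # acceleration [m/s^2]
    v = speed_ms[1:]
    distance = np.sum(speed_ms) * dt             # total distance [m]
    return np.sum(v * np.clip(a, 0.0, None) * dt) / distance

rng = np.random.default_rng(2)
speed = np.abs(np.cumsum(rng.normal(0.1, 0.6, 600)))  # synthetic 10-min trace
dpi = dynamic_performance_index(speed)
# Illustrative thresholds only, not the article's calibration.
label = "aggressive" if dpi > 0.25 else "normal" if dpi > 0.15 else "calm"
print(f"DPI (RPA) = {dpi:.3f} m/s^2 -> {label} driving")
```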
Abstract:
Ciao is a logic-based, multi-paradigm programming system. One of its most distinguishing features is that it supports a large number of semantic and syntactic language features which can be selectively activated or deactivated for each program module. As a result, a module can be written in, for example, ISO-Prolog plus constraints and higher order, while another can be a pure logic module with a different control rule such as iterative deepening and/or tabling, and perhaps using constructive negation. A powerful and modular extension mechanism allows user-level design and implementation of such features and sub-languages. Another distinguishing feature of Ciao is its powerful assertion language, which allows expressing many kinds of program properties (ranging from, e.g., moded types to resource consumption), as well as tests and documentation. The compiler is capable of statically finding violations of these properties or verifying that programs comply with them, and issuing certificates of this compliance. The compiler also performs many types of optimizations, including automatic parallelization. It offers very competitive performance, while retaining the flexibility and interactive development of a dynamic language. We will present a hands-on overview of the system, through small examples which emphasize the novel aspects and the motivations which lie behind Ciao's design and implementation.
Abstract:
Ciao Prolog incorporates a module system which allows separate compilation and sensible creation of standalone executables. We describe some of the main aspects of the Ciao modular compiler, ciaoc, which takes advantage of the characteristics of the Ciao Prolog module system to automatically perform separate and incremental compilation and efficiently build small, standalone executables with competitive run-time performance. ciaoc can also detect statically a larger number of programming errors. We also present a generic code processing library for handling modular programs, which provides an important part of the functionality of ciaoc. This library allows the development of program analysis and transformation tools in a way that is to some extent orthogonal to the details of module system design, and has been used in the implementation of ciaoc and other Ciao system tools. We also describe the different types of executables which can be generated by the Ciao compiler, which offer different tradeoffs between executable size, startup time, and portability, depending, among other factors, on the linking regime used (static, dynamic, lazy, etc.). Finally, we provide experimental data which illustrate these tradeoffs.
Abstract:
Ciao is a public domain, next-generation multi-paradigm programming environment with a unique set of features: Ciao offers a complete Prolog system, supporting ISO-Prolog, but its novel modular design allows both restricting and extending the language. As a result, it allows working with fully declarative subsets of Prolog and also extending these subsets (or ISO-Prolog) both syntactically and semantically. Most importantly, these restrictions and extensions can be activated separately on each program module, so that several extensions can coexist in the same application for different modules. Ciao also supports (through such extensions) programming with functions, higher-order (with predicate abstractions), constraints, and objects, as well as feature terms (records), persistence, several control rules (breadth-first search, iterative deepening, ...), concurrency (threads/engines), a good base for distributed execution (agents), and parallel execution. Libraries also support WWW programming, sockets, external interfaces (C, Java, Tcl/Tk, relational databases, etc.), etc. Ciao offers support for programming in the large with a robust module/object system, module-based separate/incremental compilation (automatic, with no need for makefiles), an assertion language for declaring (optional) program properties (including types and modes, but also determinacy, non-failure, cost, etc.), automatic static inference and static/dynamic checking of such assertions, etc. Ciao also offers support for programming in the small, producing small executables (including only those builtins used by the program), and support for writing scripts in Prolog. The Ciao programming environment includes a classical top level and a rich Emacs interface with an embeddable source-level debugger and a number of execution visualization tools. The Ciao compiler (which can be run outside the top-level shell) generates several forms of architecture-independent and standalone executables, which run with speed, efficiency, and executable size that are very competitive with those of other commercial and academic Prolog/CLP systems. Library modules can be compiled into compact bytecode or C source files, and linked statically, dynamically, or autoloaded. The novel modular design of Ciao enables, in addition to modular program development, effective global program analysis and static debugging and optimization via source-to-source program transformation. These tasks are performed by the Ciao preprocessor (ciaopp, distributed separately). The Ciao programming environment also includes lpdoc, an automatic documentation generator for LP/CLP programs. It processes Prolog files adorned with (Ciao) assertions and machine-readable comments and generates manuals in many formats, including PostScript, PDF, texinfo, info, HTML, man, etc., as well as online help, ASCII README files, and entries for indices of manuals (info, WWW, ...), and maintains WWW distribution sites.