13 results for simple timing task

in Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco


Relevance:

30.00%

Publisher:

Abstract:

During the last two decades, the analysis of 1/f noise in cognitive science has led to considerable progress in the way we understand the organization of our mental life. However, there is still a lack of specific models providing explanations of how 1/f noise is generated in coupled brain-body-environment systems, since existing models and experiments typically target either externally observable behaviour or isolated neuronal systems, but do not address the interplay between neuronal mechanisms and sensorimotor dynamics. We present a conceptual model of a minimal neurorobotic agent solving a behavioural task that makes it possible to relate the mechanistic (neurodynamic) and behavioural levels of description. The model consists of a simulated robot controlled by a network of Kuramoto oscillators with homeostatic plasticity and the ability to develop behavioural preferences mediated by sensorimotor patterns. With only three oscillators, this simple model displays self-organized criticality in the form of robust 1/f noise and a wide multifractal spectrum. We show that the emergence of self-organized criticality and 1/f noise in our model is the result of three simultaneous conditions: a) non-linear interaction dynamics capable of generating stable collective patterns, b) internal plastic mechanisms modulating the sensorimotor flows, and c) strong sensorimotor coupling with the environment that induces transient metastable neurodynamic regimes. We carry out a number of experiments to show that both synaptic plasticity and strong sensorimotor coupling play a necessary role, as constituents of self-organized criticality, in the generation of 1/f noise. The experiments also prove useful for testing the robustness of 1/f scaling by comparing the results of different techniques. Finally, we discuss the role of conceptual models as mediators between nomothetic and mechanistic models, and how they can inform future experimental research in which self-organized criticality includes sensorimotor coupling among the essential interaction-dominant processes giving rise to 1/f noise.
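A minimal sketch of the model class mentioned above: a bare-bones Kuramoto network of three oscillators with a toy homeostatic adjustment of the coupling, plus a crude spectral-slope check on the order-parameter fluctuations. The numerical values, the plasticity rule and the absence of any sensorimotor loop are illustrative assumptions; this is not the agent described in the abstract.

```python
import numpy as np

# Toy Kuramoto network with a hypothetical homeostatic rule (illustrative only).
rng = np.random.default_rng(0)
N, dt, steps = 3, 0.01, 20000
omega = rng.uniform(0.8, 1.2, N)          # natural frequencies (assumed)
K = np.full((N, N), 0.5)                  # coupling matrix (assumed initial value)
theta = rng.uniform(0, 2 * np.pi, N)
target_sync = 0.7                         # hypothetical homeostatic set point
eta = 0.001                               # hypothetical plasticity rate
order = np.empty(steps)

for t in range(steps):
    diff = theta[None, :] - theta[:, None]                 # sin(theta_j - theta_i) terms
    theta = (theta + dt * (omega + (K * np.sin(diff)).mean(axis=1))) % (2 * np.pi)
    r = np.abs(np.exp(1j * theta).mean())                  # Kuramoto order parameter
    K += eta * (target_sync - r)                           # homeostatic drift of coupling
    order[t] = r

# Power spectrum of the order-parameter fluctuations; a 1/f-like profile would show up
# as an approximately linear log-log decay (slope near -1) at low frequencies.
freqs = np.fft.rfftfreq(steps, dt)[1:]
power = np.abs(np.fft.rfft(order - order.mean()))[1:] ** 2
lo = len(freqs) // 10
slope = np.polyfit(np.log(freqs[:lo]), np.log(power[:lo]), 1)[0]
print(f"low-frequency spectral slope ~ {slope:.2f}")
```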

Relevance:

20.00%

Publisher:

Abstract:

The paper adapts a non-cooperative game presented by Dagan, Serrano and Volij (1997) for bankruptcy problems to the context of TU veto balanced games. We investigate the relationship between the Nash outcomes of the non-cooperative game and solution concepts of cooperative games such as the nucleolus, the kernel and the egalitarian core.

Relevance:

20.00%

Publisher:

Abstract:

When it comes to measuring blade-tip clearance or blade-tip timing in turbines, reflective intensity-modulated optical fiber sensors overcome several traditional limitations of capacitive, inductive or discharging-probe sensors. This paper presents the signals and results corresponding to the third stage of a multistage turbine rig, obtained from a transonic wind-tunnel test. The probe is based on a trifurcated bundle of optical fibers mounted on the turbine casing. To eliminate the influence of light-source intensity variations and blade-surface reflectivity, the sensing principle is based on the quotient of the voltages obtained from the two receiving bundle legs. A discrepancy of less than 3% with respect to a commercial sensor was observed in tip-clearance measurements. Regarding tip-timing measurements, the travelling-wave spectrum was obtained, which provides the average vibration amplitude for all blades at a particular nodal diameter. With this approach, both blade-tip timing and tip-clearance measurements can be carried out simultaneously. The results obtained on the test turbine rig demonstrate the suitability and reliability of this type of sensor, and suggest the possibility of performing these measurements in real turbines under real working conditions.
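The quotient-based sensing principle can be illustrated in a few lines: dividing the voltages of the two receiving legs cancels common factors such as source intensity and blade reflectivity, and a calibration curve then maps the ratio to clearance. The linear calibration and all numbers below are hypothetical, not those of the sensor described in the paper.

```python
import numpy as np

# Illustrative only: the voltage ratio removes common intensity/reflectivity factors;
# the calibration mapping the ratio to clearance is a made-up placeholder.
def tip_clearance(v_leg1: np.ndarray, v_leg2: np.ndarray, calib) -> np.ndarray:
    ratio = v_leg2 / v_leg1            # common source/reflectivity factors cancel here
    return calib(ratio)

calib = lambda r: 2.5 * r - 0.8        # hypothetical linear calibration, clearance in mm

v1 = np.array([1.02, 0.98, 1.01])      # volts, receiving leg 1 (synthetic)
v2 = np.array([0.61, 0.60, 0.63])      # volts, receiving leg 2 (synthetic)
print(tip_clearance(v1, v2, calib))
```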

Relevance:

20.00%

Publisher:

Abstract:

The smart grid is a highly complex system that is emerging from the traditional power grid through the addition of new and sophisticated communication and control devices. This will make it possible to integrate new elements for distributed power generation and to achieve increasingly automated operation, both for the actions of utilities and for those of customers. To model such systems, a bottom-up method is followed, using only a few basic elements structured into two layers: a physical layer for electrical power transmission and a logical layer for communication between elements. A simple case study is presented to analyse the simulation possibilities. It shows a microgrid model with dynamic load management and an integrated approach that can process both electrical and communication flows.
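As a rough illustration of the bottom-up, two-layer idea, the sketch below defines hypothetical element and controller classes in which a power-balance step stands in for the physical layer and a message log for the logical layer. The class names, the 10% load-shedding rule and all figures are assumptions, not the paper's model.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    power_kw: float          # positive = generation, negative = load

@dataclass
class Controller:
    elements: list = field(default_factory=list)
    messages: list = field(default_factory=list)      # logical layer: message log

    def balance(self) -> float:                        # physical layer: power balance
        net = sum(e.power_kw for e in self.elements)
        if net < 0:                                    # demand exceeds generation
            for e in self.elements:
                if e.power_kw < 0:                     # shed part of each load (toy rule)
                    e.power_kw *= 0.9
                    self.messages.append(f"shed 10% at {e.name}")
        return sum(e.power_kw for e in self.elements)

# Hypothetical microgrid: one PV generator and two household loads.
grid = Controller([Element("pv", 4.0), Element("house_1", -3.0), Element("house_2", -2.5)])
print(grid.balance(), grid.messages)
```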

Relevance:

20.00%

Publisher:

Abstract:

This paper is a version of the discussion paper titled "Simple coalitional strategy profiles".

Relevance:

20.00%

Publisher:

Abstract:

Hydrogen is the only atom for which the Schrödinger equation is solvable. Consisting only of a proton and an electron, hydrogen is the lightest element and, nevertheless, is far from simple. Under ambient conditions it forms diatomic H2 molecules in the gas phase, but different temperatures and pressures lead to a complex phase diagram which is not yet completely known. Solid hydrogen was first documented in 1899 [1] and was found to be insulating. At higher pressures, however, hydrogen can be metallized. In 1935 Wigner and Huntington predicted that the metallization pressure would be 25 GPa [2], where the molecules would dissociate to form a monatomic metal, like the alkali metals that lie below hydrogen in the periodic table. The prediction of the metallization pressure turned out to be wrong: metallic hydrogen has not been found yet, even under pressures as high as 320 GPa. Nevertheless, extrapolations based on optical measurements suggest that a metallic phase may be attained at 450 GPa [3]. The interest of materials scientists in metallic hydrogen can be attributed, to a great extent, to Ashcroft, who in 1968 suggested that such a system could be a high-temperature superconductor [4]. The temperature at which this material would exhibit a transition from a superconducting to a non-superconducting state (Tc) was estimated to be around room temperature. The implications of such a statement are very interesting in the field of astrophysics: in planets that contain a large quantity of hydrogen and whose temperature is below Tc, superconducting hydrogen may be found, especially at the centre, where the gravitational pressure is high. This might be the case of Jupiter, whose proportion of hydrogen is about 90%. There are also speculations suggesting that the high magnetic field of Jupiter is due to persistent currents related to the superconducting phase [5]. The metallization and superconductivity of hydrogen have puzzled scientists for decades, and the community is trying to answer several questions. For instance, what is the structure of hydrogen at very high pressures? Or, more generally, what is the maximum Tc a phonon-mediated superconductor can have [6]? A great experimental effort has been devoted to pursuing metallic hydrogen and trying to answer the questions above; however, the characterization of the solid phases of hydrogen is a hard task. Achieving the high pressures needed to reach the sought-after phases requires advanced technology. Diamond anvil cells (DAC) are commonly used devices. They consist of two diamonds with tips of small area, so that when a force is applied the pressure exerted is very high. This pressure is uniaxial, but it can be turned into hydrostatic pressure using transmitting media. Nowadays this method makes it possible to reach pressures higher than 300 GPa, but even at this pressure hydrogen does not show metallic properties. A recently developed technique that improves on the DAC can reach pressures as high as 600 GPa [7], a promising step forward in high-pressure physics. Another drawback is that the electronic density of the structures is so low that X-ray diffraction patterns have low resolution. For these reasons, ab initio studies are an important source of knowledge in this field, within their limitations.
When treating hydrogen there are many subtleties in the calculations: since the atoms are so light, the ions forming the crystalline lattice undergo significant displacements even at very low temperatures, and even at T = 0 K, due to Heisenberg's uncertainty principle. Thus the energy corresponding to this zero-point (ZP) motion is significant and has to be included in an accurate determination of the most stable phase. This has been done by including ZP vibrational energies within the harmonic approximation for a range of pressures at T = 0 K, giving rise to a series of structures that are stable in their respective pressure ranges [8]. Very recently, a treatment of the phases of hydrogen that includes anharmonicity in the ZP energies has suggested that the relative stability of the phases may change with respect to calculations within the harmonic approximation [9]. Many of the proposed structures for solid hydrogen have been investigated. In particular, the Cmca-4 structure, found to be the stable one from 385 to 490 GPa [8], is metallic. Calculations for this structure, within the harmonic approximation for the ionic motion, predict a Tc of up to 242 K at 450 GPa [10]. Nonetheless, due to the large ionic displacements, the harmonic approximation may not suffice to describe the system correctly. The aim of this work is to apply a recently developed method to treat anharmonicity, the stochastic self-consistent harmonic approximation (SSCHA) [11], to Cmca-4 metallic hydrogen. In this way we will be able to study the effects of anharmonicity on the phonon spectrum and to try to understand the changes it may cause in the value of Tc. The work is structured as follows. First we present the theoretical basis of the calculations: Density Functional Theory (DFT) for the electronic calculations, phonons in the harmonic approximation, and the SSCHA. Then we apply these methods to Cmca-4 hydrogen and discuss the results obtained. In the last chapter we draw some conclusions and propose possible future work.
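For context on the Tc figures quoted above: phonon-mediated superconductors are commonly assessed with the McMillan-Allen-Dynes formula, Tc = (omega_log / 1.2) * exp[-1.04 (1 + lambda) / (lambda - mu* (1 + 0.62 lambda))], where lambda is the electron-phonon coupling constant, omega_log the logarithmic average phonon frequency and mu* the Coulomb pseudopotential. The short sketch below simply evaluates this formula; the input values are illustrative only and are not results of this work. Anharmonicity enters such an estimate indirectly, by modifying the phonon spectrum and hence lambda and omega_log.

```python
import math

def allen_dynes_tc(lam: float, omega_log_K: float, mu_star: float = 0.1) -> float:
    """McMillan-Allen-Dynes estimate of Tc (in K) for a phonon-mediated superconductor.
    lam: electron-phonon coupling; omega_log_K: log-averaged phonon frequency in kelvin;
    mu_star: Coulomb pseudopotential."""
    return (omega_log_K / 1.2) * math.exp(
        -1.04 * (1.0 + lam) / (lam - mu_star * (1.0 + 0.62 * lam))
    )

# Illustrative inputs only (not values computed in this work): a strongly coupled case.
print(f"Tc ~ {allen_dynes_tc(lam=2.0, omega_log_K=1500.0):.0f} K")
```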

Relevance:

20.00%

Publisher:

Abstract:

Insurance activity involves the transfer of risk from the policyholder to the insurer. The insurer undertakes to pay a benefit if the risk materializes. This reverses the usual production cycle: the insurer sells cover without knowing the exact timing and cost of that cover. This particularity of the insurance business explains why an insurance company needs to be solvent at all times and in the face of any contingency. For this reason, the solvency of insurance companies has been addressed in the successive regulations governing insurance activity and has been given ever greater importance. Insurer solvency is currently regulated by the European Solvency I directive, which establishes two concepts to guarantee solvency: technical provisions and the solvency margin. Technical provisions are calculated to guarantee the static solvency of the company, that is, its ability to meet, at a given point in time, the commitments it has assumed. The solvency margin covers dynamic solvency, which refers to future events that may affect the insurer's capacity. However, within a trend towards integrated risk management in which the banking sector had already moved ahead of insurance with the Basel II framework, a European project to reform Solvency I was launched, and in November 2009 Directive 2009/138/EC of the European Parliament and of the Council, on life assurance and on the taking-up and pursuit of the business of insurance and reinsurance, better known as Solvency II, was adopted. This directive entails a profound change in the current solvency rules for insurance companies, with the aim of establishing a common regulatory framework at the European level that is better adapted to the risk profile of each insurer. The new directive defines two distinct capital levels: the SCR (Solvency Capital Requirement) and the MCR (Minimum Capital Requirement). For the calculation of the SCR, the insurer is free to choose between two models: a standard model proposed by the European Insurance and Occupational Pensions Authority (EIOPA), which allows a simple calculation, and an internal model developed by the company itself, which must be approved by the competent authorities. A mixed model combining the standard and internal approaches is also contemplated. To develop the standard model, a series of quantitative impact studies (QIS) has been carried out. The latest study, QIS5, sets out the calculation of the SCR most precisely: it defines shocks to be applied to the company's balance sheet in order to stress it, and the SCR is constructed from the results obtained. The objective of this work is to synthesize the QIS5 technical specifications for life insurance and to carry out a practical application for a standard endowment (mixto puro) life insurance product. In the practical application, the cash flows associated with this product are determined in order to calculate its best estimate. The SCR is then determined by applying the shocks for mortality, lapse and expense risks. Finally, we calculate the risk margin associated with the SCR.
We conclude this final degree project (TFG) with conclusions, the bibliography used, and an annex containing the tables employed.
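As a rough illustration of the best-estimate-plus-shock logic described above, the sketch below discounts a made-up projection of cash outflows, recomputes it under a crudely represented mortality stress, and takes the increase in the liability as the capital charge for that module. All cash flows, discount rates and the shock representation are assumptions, not the QIS5 specification itself.

```python
import numpy as np

def best_estimate(cash_flows, rates):
    """Discounted expected cash outflows (best estimate of the liability)."""
    t = np.arange(1, len(cash_flows) + 1)
    return float(np.sum(np.asarray(cash_flows) / (1 + np.asarray(rates)) ** t))

# Hypothetical base projection for an endowment-style contract (EUR) and an assumed
# risk-free term structure.
base_cf = [100, 110, 120, 5000]              # yearly outflows, maturity benefit last
rates = [0.010, 0.012, 0.015, 0.017]

be_base = best_estimate(base_cf, rates)

# QIS5-style mortality stress (a permanent increase in mortality rates) is crudely
# represented here by a made-up re-projected set of outflows.
shocked_cf = [118, 130, 142, 4965]
be_shocked = best_estimate(shocked_cf, rates)

scr_mortality = max(be_shocked - be_base, 0.0)   # capital charge = rise in liabilities
print(f"best estimate = {be_base:.0f}, SCR (mortality module) = {scr_mortality:.0f}")
```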

Relevance:

20.00%

Publisher:

Abstract:

[ES] The main objective of this Bachelor's thesis, "Control de un sistema de accionamientos de traslación basado en correa para un manipulador de cinemática paralela", is the implementation of a control system for operating a two-degree-of-freedom parallel kinematic manipulator driven by two DC electric motors. The central component of this control system is a laptop computer whose processor executes the actions required to carry out the control task. The most important and laborious part of the project is therefore the development of a control application which, running on that computer, allows the user to operate the parallel kinematic manipulator. To do so, the application must be able to interpret the motion commands given by the user and transmit them to the computer's processor. In addition, to complete the control system, several sensors must be implemented to detect and transmit the signals needed to avoid emergency situations in which the manipulator is about to collide with an object or a person. In conclusion, by meeting the objectives of this thesis, a simple, intuitive and easily operated control system will be available that will allow any future user to operate a parallel kinematic robot.
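As a purely illustrative companion to the description above, the sketch below shows how such a control application might map a user motion command to motor setpoints and honour an emergency-stop signal. The CoreXY-style belt kinematics, the constants and the function names are hypothetical and do not reproduce the project's actual implementation.

```python
BELT_TRAVEL_MM_PER_REV = 60.0      # assumed belt travel per motor revolution

def inverse_kinematics(x_mm: float, y_mm: float) -> tuple[float, float]:
    """Map a target (x, y) platform position to the two carriage positions,
    assuming a CoreXY-like belt arrangement: a = x + y, b = x - y."""
    return x_mm + y_mm, x_mm - y_mm

def motor_setpoints(x_mm: float, y_mm: float, emergency_stop: bool) -> tuple[float, float]:
    if emergency_stop:                      # proximity sensor tripped: command no motion
        return 0.0, 0.0
    a, b = inverse_kinematics(x_mm, y_mm)
    # Convert carriage positions to motor revolutions.
    return a / BELT_TRAVEL_MM_PER_REV, b / BELT_TRAVEL_MM_PER_REV

print(motor_setpoints(120.0, 80.0, emergency_stop=False))
print(motor_setpoints(120.0, 80.0, emergency_stop=True))
```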

Relevance:

20.00%

Publisher:

Abstract:

[ES] Audio and video editing is nowadays a very common practice all over the world, in both professional and domestic settings; so much so that by 2018, video downloads and uploads were expected to account for 80% of internet traffic. To offer users simple yet powerful audio and video editing, there is a great deal of paid software that can be very efficient and produce good results, but some users may not be able to afford it or may not find it available on their platform. This is where the Kdenlive video editor comes in: a free, open-source video editor developed by a community of users and developers who together are turning Kdenlive into a program on a par with commercial video editors. Although there is a large community of people helping one another, the current documentation is not entirely aimed at new developers with no previous experience. This work adds new functionality to Kdenlive, at the request of the developer community, and creates documentation to help new developers concentrate directly on programming instead of on searching for information and prior training on the program.

Relevance:

20.00%

Publisher:

Abstract:

Arriving at the apparently simple Periodic Table of the Elements has taken tremendous effort over thousands of years. In this paper we present a brief history of the discovery of the chemical elements from prehistory to the present day, revealing the controversies that arose along the way and acknowledging the important work performed by alchemists in the advancement of knowledge. This is especially relevant if we consider that alchemy existed for many thousands of years, whereas "Chemistry", officially established as a science in the eighteenth century, has operated as such for only a few hundred. Even so, if we consider the progress of the discovery and isolation of the chemical elements throughout history, it can be observed that most elements were identified in the nineteenth and twentieth centuries, reflecting the development of the instrumental techniques that facilitated this task.

Relevance:

20.00%

Publisher:

Abstract:

Climate change has differentially affected the timing of seasonal events for interacting trophic levels, and this has often led to increased selection on seasonal timing. Yet, the environmental variables driving this selection have rarely been identified, limiting our ability to predict future ecological impacts of climate change. Using a dataset spanning 31 years from a natural population of pied flycatchers (Ficedula hypoleuca), we show that directional selection on timing of reproduction intensified in the first two decades (1980-2000) but weakened during the last decade (2001-2010). Against expectation, this pattern could not be explained by the temporal variation in the phenological mismatch with food abundance. We therefore explored an alternative hypothesis that selection on timing was affected by conditions individuals experience when arriving in spring at the breeding grounds: arriving early in cold conditions may reduce survival. First, we show that in female recruits, spring arrival date in the first breeding year correlates positively with hatch date; hence, early-hatched individuals experience colder conditions at arrival than late-hatched individuals. Second, we show that when temperatures at arrival in the recruitment year were high, early-hatched young had a higher recruitment probability than when temperatures were low. We interpret this as a potential cost of arriving early in colder years, and climate warming may have reduced this cost. We thus show that higher temperatures in the arrival year of recruits were associated with stronger selection for early reproduction in the years these birds were born. As arrival temperatures in the beginning of the study increased, but recently declined again, directional selection on timing of reproduction showed a nonlinear change. We demonstrate that environmental conditions with a lag of up to two years can alter selection on phenological traits in natural populations, something that has important implications for our understanding of how climate can alter patterns of selection in natural populations.
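As a purely illustrative aside, an interaction of this kind (hatch date by arrival temperature acting on recruitment probability) could be probed with a logistic regression such as the sketch below. The data are synthetic and the model specification is an assumption, not the one used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data only: early hatching (negative hatch_date) is assumed to pay off
# more in warm arrival years, mimicking the reported pattern.
rng = np.random.default_rng(1)
n = 500
hatch_date = rng.normal(0, 1, n)            # standardized hatch date (early = negative)
arrival_temp = rng.normal(0, 1, n)          # standardized spring temperature at arrival
logit_p = -1.0 - 0.2 * hatch_date + 0.1 * arrival_temp - 0.4 * hatch_date * arrival_temp
recruited = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df = pd.DataFrame(dict(recruited=recruited, hatch_date=hatch_date, arrival_temp=arrival_temp))

# Logistic regression with a hatch_date x arrival_temp interaction; a negative
# interaction coefficient corresponds to stronger benefits of early hatching in warm years.
model = smf.logit("recruited ~ hatch_date * arrival_temp", data=df).fit(disp=False)
print(model.params)
```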