133 results for temporal stability


Relevance: 20.00%

Abstract:

This paper is a sequel to "Normal forms, stability and splitting of invariant manifolds I. Gevrey Hamiltonians", in which we gave a new construction of resonant normal forms with an exponentially small remainder for near-integrable Gevrey Hamiltonians at a quasi-periodic frequency, using a method of periodic approximations. In this second part we focus on finitely differentiable Hamiltonians, and we derive normal forms with a polynomially small remainder. As applications, we obtain a polynomially large upper bound on the stability time for the evolution of the action variables and a polynomially small upper bound on the splitting of invariant manifolds for hyperbolic tori.
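Schematically, for a near-integrable Hamiltonian H = h + εf, and with all exponents written as placeholders (the precise values, which depend on the differentiability class and on the Diophantine properties of the frequency, are given in the paper), the results have the following shape:

```latex
% Normal form with polynomially small remainder (a, b, a' are placeholder exponents):
H \circ \Phi = h + g + R, \qquad \|R\| \le C\,\varepsilon^{a},
% yielding polynomial bounds on the drift of the actions along solutions:
|I(t) - I(0)| \le C'\,\varepsilon^{b} \quad \text{for } |t| \le c\,\varepsilon^{-a'},
% in contrast with the exponentially small remainder of the Gevrey case of Part I:
\|R\| \le C\,e^{-c\,\varepsilon^{-\alpha}}.
```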


Study and assessment of the main soil properties and of the erosion dynamics related to land use and agricultural abandonment in different environments of the Cap de Creus peninsula. For this study, 7 different environments were selected, representative of a sequence of uses up to abandonment at different stages of vegetation succession. The environments comprise cultivated soils (vineyard and olive grove), forest soils (cork oak and pine woodland), pastures (meadows), and scrubland (Cistus monspeliensis and Erica arborea), respectively burnt repeatedly and left unburnt for 25 years. In each environment, erosion plots were installed to assess sediment production and the mobilization of nutrients (carbon and nitrogen) during rainfall episodes that generate surface runoff.


Research project carried out during a stay at the Institute for Atmospheric and Climate Science, in Germany, between 2010 and 2012. The solar radiation reaching the Earth's surface is a key factor among the processes that control the Earth's climate, given its role in the energy balance and the hydrological cycle. Establishing its contribution to recent climate change is very difficult because of the complexity of the processes involved, the large amount of information required, and the uncertainty of the databases currently available. The main objective of the project has therefore been to generate a sunshine-duration database including the longest series (from the late 19th century onwards) available across Europe. This database complements, for our continent, the Global Energy Balance Archive (GEBA) maintained and managed by the group hosting the postdoctoral fellow, and extends the available climate series of solar irradiance measurements both spatially (especially in southern European countries) and temporally. Since sunshine duration is a proxy for solar irradiance, the project has also carried out an exhaustive calibration between the two variables, in order to generate a new reconstructed database of the latter, available from the late 19th century onwards in Europe. A second objective of the project has been to continue working at a finer scale over the Iberian Peninsula, in order to provide a better understanding of the "global dimming/brightening" phenomenon and its impact on the hydrological cycle and the energy balance.
Finally, a third objective of this postdoctoral project has been to continue studying possible large-scale weekly cycles of different climate variables, a line of research relevant to detecting possible effects of anthropogenic aerosols on climate at short time scales, and hence closely linked to the "global dimming/brightening" phenomenon.


Many terrestrial and marine systems are experiencing accelerating decline due to the effects of global change. This situation has raised concern about the consequences of biodiversity losses for ecosystem function, ecosystem service provision, and human well-being, and highlights the urgent need to evaluate the consequences of biodiversity loss. Coastal marine habitats are a main focus of attention because they harbour high biological diversity, are among the most productive systems in the world, and experience high levels of anthropogenic interaction. Because marine biodiversity is a dynamic entity and this study was concerned with global change impacts, it focused on benthic biodiversity trends over large spatial and long temporal scales. The main aim of this project was to investigate the current extent of biodiversity of the highly diverse benthic coralligenous communities in the Mediterranean Sea, detect their changes, and predict their future trajectories over broad spatial and long temporal scales. These marine communities are characterized by structural species with low growth rates and long life spans, and are therefore considered particularly sensitive to disturbances. For this purpose, the project analyzed permanent photographic plots over time at four locations in the NW Mediterranean Sea. The spatial scale of the study provided information on the level of species similarity between these locations, offering a solid background on the amount of large-scale variability in coralligenous communities; the temporal scale was fundamental for determining natural variability, in order to discriminate between changes due to natural factors and those related to the impact of disturbances (e.g. mass mortality events related to positive thermal anomalies, extreme catastrophic events).
This study directly addressed the challenging task of analyzing quantitative biodiversity data from these highly diverse marine benthic communities. Overall, the scientific knowledge gained through this research project will improve our understanding of the functioning of marine ecosystems and of their trajectories under global change.


Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or Binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows.

Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes.

Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
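The baseline Poisson τ-leap step that the paper extends can be sketched in a few lines; the birth-death system, rate constants, and step size below are illustrative assumptions, not the paper's RK variant or its test cases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-reaction birth-death system (not from the paper):
#   R1: 0 -> A   with rate k1
#   R2: A -> 0   with rate k2 * [A]
k1, k2 = 10.0, 0.1
nu = np.array([+1, -1])          # state-change vectors for R1, R2

def propensities(x):
    return np.array([k1, k2 * x])

def tau_leap(x0, tau, n_steps):
    """Baseline Poisson tau-leap: fire all channels within each fixed step."""
    x = x0
    for _ in range(n_steps):
        a = propensities(x)
        # Firings of each channel over [t, t + tau) drawn as Poisson(a_j * tau)
        k = rng.poisson(a * tau)
        x = max(x + int(k @ nu), 0)   # clamp to avoid negative populations
    return x

# The steady state of this system has mean k1 / k2 = 100.
samples = [tau_leap(100, tau=0.5, n_steps=200) for _ in range(500)]
print(np.mean(samples))   # close to 100
```

The sample mean stays near k1/k2, but as τ grows the steady-state variance drifts away from the exact SSA value; that drift is precisely the defect the RK-extended τ-leap methods in the abstract are designed to control.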


In this work we describe the use of bilinear statistical models as a means of factoring shape variability into two components, attributed to inter-subject variation and to the intrinsic dynamics of the human heart. We show that it is feasible to reconstruct the shape of the heart at discrete points in the cardiac cycle, provided we are given a small number of shape instances representing the same heart at different points in that cycle. Using a temporal and a spatial alignment step in the preprocessing of the shapes, around half of the reconstruction errors were on the order of the axial image resolution of 2 mm, and over 90% were within 3.5 mm. From this, we conclude that the dynamics were indeed separated from the inter-subject variability in our dataset.
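As a rough illustration of factoring shape data into subject and phase components, the sketch below separates the two with a plain SVD on synthetic, exactly bilinear data; the dimensions and generative model are assumptions, and this is not the paper's model or dataset.

```python
import numpy as np

rng = np.random.default_rng(1)

S, P, d, r = 20, 10, 30, 3   # subjects, cardiac phases, landmark coords, rank

# Synthetic shapes that are exactly bilinear: shape(s, p) = sum_k a[s,k] * B[k,p,:]
A = rng.normal(size=(S, r))          # per-subject factors (inter-subject variation)
B = rng.normal(size=(r, P, d))       # per-phase basis shapes (cardiac dynamics)
Y = np.einsum('sk,kpd->spd', A, B)   # all shapes, array of shape (S, P, d)

# Matricize as subjects-vs-(phase, coordinate) and factor with an SVD
M = Y.reshape(S, P * d)
U, sv, Vt = np.linalg.svd(M, full_matrices=False)
subject_factors = U[:, :r] * sv[:r]        # inter-subject component
phase_basis = Vt[:r].reshape(r, P, d)      # dynamics component

# Reconstruct every subject's full cycle from its r-dimensional subject code
Y_hat = np.einsum('sk,kpd->spd', subject_factors, phase_basis)
err = np.max(np.abs(Y - Y_hat))
print(err)   # essentially zero for exactly bilinear data
```

On real shapes the data are only approximately low-rank, so the truncated reconstruction error is nonzero and plays the role of the millimetre errors reported in the abstract.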


This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected.
On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
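The forward Eulerian integration step can be sketched in 1D; Gaussian bumps stand in for the spatiotemporal B-Spline kernels, and all names and parameters are illustrative assumptions rather than the paper's implementation (the image similarity and incompressibility terms are omitted).

```python
import numpy as np

# 1D toy velocity field: a sum of separable spatiotemporal kernels
# (Gaussian bumps stand in for the B-Spline kernels of the paper).
centers_x = np.linspace(0.0, 1.0, 5)   # spatial kernel centres
centers_t = np.linspace(0.0, 1.0, 4)   # temporal kernel centres
weights = 0.02 * np.ones((len(centers_t), len(centers_x)))

def velocity(x, t, sx=0.2, st=0.3):
    """v(x, t) = sum_{i,j} w[j,i] * K_x(x - x_i) * K_t(t - t_j)."""
    kx = np.exp(-0.5 * ((x[:, None] - centers_x[None, :]) / sx) ** 2)
    kt = np.exp(-0.5 * ((t - centers_t) / st) ** 2)
    return kx @ (kt @ weights)          # one velocity per query point

def integrate(x0, T=1.0, n_steps=100):
    """Forward Eulerian integration of the non-stationary velocity field,
    following each material point along its own trajectory."""
    dt = T / n_steps
    x = x0.copy()
    for k in range(n_steps):
        x = x + dt * velocity(x, k * dt)
    return x

x0 = np.linspace(0.0, 1.0, 11)          # material points at t = 0
phi = integrate(x0)
disp = phi - x0                         # recovered displacement field
```

Because the velocity is evaluated at the moving points rather than at the initial grid, the composition of many small steps stays invertible for small enough dt, which is the diffeomorphic property the method's name refers to.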


This study investigates the development of fluency in 30 advanced L2 learners of English over a period of 15 months. In order to measure fluency, several temporal variables and hesitation phenomena are analyzed and compared. Oral competence is assessed by means of an oral interview carried out with the learners. Data collection takes place at three different times: before (T1) and after (T2) a six-month period of formal instruction (FI, 80 hours) at the home university, and after a three-month study abroad (SA) term (T3). The data are analyzed quantitatively. Developmental gains in fluency are measured for the whole period, adopting a view of complementarity between the two learning contexts. From these results, a group of high-fluency speakers is identified. Correlations between fluency gains and individual and contextual variables are computed, and a more qualitative analysis is performed of the high-fluency speakers' performance and behavior. Results show an overall development of students' oral fluency over the 15-month period, favored by the combination of a period of FI at home followed by a three-month SA.


This paper takes up previous work of the authors on the relationship between non-quasi-competitiveness (the increase in price caused by an increase in the number of oligopolists) and stability of the equilibrium in the classical Cournot oligopoly model. Though it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, in their previous work the authors put forward a model in which a situation of monopoly changed to duopoly, losing quasi-competitiveness while maintaining the stability of the equilibrium. That model could not, at the time, be extended to any number of oligopolists. The present paper exhibits such an extension: an oligopoly model in which the loss of quasi-competitiveness survives the presence in the market of as many firms as one wishes, and where the successive Cournot equilibrium points are unique and asymptotically stable. In this way, the conjecture that non-quasi-competitiveness and instability are equivalent in the long run is, for the first time, proved false.
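For contrast, the textbook linear-demand Cournot model exhibits the opposite pairing of the two properties discussed here: it is quasi-competitive (price falls with each entry) and its simultaneous best-reply dynamics lose stability from three firms on (Theocharis' classical result). A minimal sketch, with illustrative demand and cost parameters:

```python
# Textbook linear Cournot: inverse demand p = a - Q, identical marginal cost c.
a, c = 10.0, 2.0

def equilibrium(n):
    """Symmetric Cournot equilibrium with n firms."""
    q = (a - c) / (n + 1)           # per-firm quantity
    p = (a + n * c) / (n + 1)       # equilibrium price
    return q, p

# Quasi-competitive: price falls monotonically as firms enter.
prices = [equilibrium(n)[1] for n in range(1, 6)]
print(prices)   # 6.0, then decreasing toward c

# Stability of simultaneous best-reply dynamics
#   q_i(t+1) = (a - c - sum_{j != i} q_j(t)) / 2,
# whose dominant eigenvalue is -(n - 1)/2: stable only for n = 2.
def dominant_eigenvalue(n):
    return -(n - 1) / 2.0
```

The point of the abstract is that this pairing is not forced: the authors construct a model where price rises with entry and yet every successive equilibrium remains asymptotically stable.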


In experiments with two-person sequential games we analyze whether responses to favorable and unfavorable actions depend on the elicitation procedure. In our hot treatment the second player responds to the first player's observed action, while in our cold treatment we follow the strategy method and have the second player decide on a contingent action for each and every possible first-player move, without first observing this move. Our analysis centers on the degree to which subjects deviate from the maximization of their pecuniary rewards as a response to others' actions. Our results show no difference in behavior between the two treatments. We also find evidence of the stability of subjects' preferences with respect to their behavior over time and to the consistency of their choices as first and second mover.


We estimate a forward-looking monetary policy reaction function for the postwar United States economy, before and after Volcker's appointment as Fed Chairman in 1979. Our results point to substantial differences in the estimated rule across periods. In particular, interest rate policy in the Volcker-Greenspan period appears to have been much more sensitive to changes in expected inflation than in the pre-Volcker period. We then compare some of the implications of the estimated rules for the equilibrium properties of inflation and output, using a simple macroeconomic model, and show that the Volcker-Greenspan rule is stabilizing.
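The stabilizing role of a rule that responds more than one-for-one to inflation can be sketched with a deliberately simple backward-looking recursion; this toy, with invented parameters, is not the forward-looking rule or model estimated in the paper.

```python
# Stylized backward-looking sketch of why a rule that responds more than
# one-for-one to inflation (phi > 1) is stabilizing. Parameters are
# illustrative only.
kappa, sigma, r = 0.3, 1.0, 2.0   # Phillips-curve slope, IS slope, real rate

def simulate(phi, pi0=1.0, T=40):
    """pi(t+1) = pi(t) + kappa*y(t),  y(t) = -sigma*(i(t) - pi(t) - r),
    under the policy rule i(t) = r + phi*pi(t)."""
    pi = pi0
    path = [pi]
    for _ in range(T):
        y = -sigma * (phi - 1.0) * pi   # output gap implied by the rule
        pi = pi + kappa * y
        path.append(pi)
    return path

passive = simulate(phi=0.8)   # violates the one-for-one threshold
active = simulate(phi=1.5)    # satisfies it

print(abs(passive[-1]), abs(active[-1]))
```

In this recursion inflation follows pi(t+1) = (1 - kappa*sigma*(phi - 1)) * pi(t), so deviations die out when phi exceeds one and compound when it falls short, mirroring the contrast the paper draws between the pre-Volcker and Volcker-Greenspan rules.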


It is widely accepted in the literature about the classical Cournot oligopoly model that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the equilibrium. In this paper, though, we present a model in which a stable unique symmetric equilibrium is reached for any number of oligopolists as industry price increases with each new entry. Consequently, the suspicion that non-quasi-competitiveness implies, in the long run, instability is proved false.


We develop a coordination game to model interactions between fundamentals and liquidity during unstable periods in financial markets. We then propose a flexible econometric framework for estimation of the model and analysis of its quantitative implications. The specific empirical application is carry trades in the yen-dollar market, including the turmoil of 1998. We find a generally very deep market, with low information disparities amongst agents. We occasionally observe episodes of market fragility, or turmoil, with "up by the escalator, down by the elevator" patterns in prices. The key role of strategic behavior in the econometric model is also confirmed.


In this paper, we discuss pros and cons of different models for financial market regulation and supervision and we present a proposal for the re-organisation of regulatory and supervisory agencies in the Euro Area. Our arguments are consistent with both new theories and the effective behaviour of financial intermediaries in industrialized countries. Our proposed architecture for financial market regulation is based on the assignment of different objectives or "finalities" to different authorities, both at the domestic and the European level. According to this perspective, the three objectives of supervision - microeconomic stability, investor protection and proper behaviour, efficiency and competition - should be assigned to three distinct European authorities, each one at the centre of a European system of financial regulators and supervisors specialized in overseeing the entire financial market with respect to a single regulatory objective and regardless of the subjective nature of the intermediaries. Each system should be structured and organized similarly to the European System of Central Banks and work in connection with the central bank, which would remain the institution responsible for price and macroeconomic stability. We suggest a plausible path to build our 4-peak regulatory architecture in the Euro area.


Although the histogram is the most widely used density estimator, it is well-known that the appearance of a constructed histogram for a given bin width can change markedly for different choices of anchor position. In this paper we construct a stability index $G$ that assesses the potential changes in the appearance of histograms for a given data set and bin width as the anchor position changes. If a particular bin width choice leads to an unstable appearance, the arbitrary choice of any one anchor position is dangerous, and a different bin width should be considered. The index is based on the statistical roughness of the histogram estimate. We show via Monte Carlo simulation that densities with more structure are more likely to lead to histograms with unstable appearance. In addition, ignoring the precision to which the data values are provided when choosing the bin width leads to instability. We provide several real data examples to illustrate the properties of $G$. Applications to other binned density estimators are also discussed.
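The anchor sensitivity that motivates the index can be reproduced directly; the data, bin width, and helper function below are illustrative assumptions, and the index $G$ itself is not computed here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Bimodal sample: densities with more structure are more anchor-sensitive.
data = np.concatenate([rng.normal(0.0, 1.0, 200),
                       rng.normal(3.0, 1.0, 200)])
h = 1.5   # bin width, deliberately coarse

def hist_counts(anchor):
    """Histogram counts for bin width h with bin edges anchored at `anchor`."""
    lo = anchor + h * np.floor((data.min() - anchor) / h)
    hi = anchor + h * np.ceil((data.max() - anchor) / h)
    edges = np.arange(lo, hi + h / 2, h)
    counts, _ = np.histogram(data, bins=edges)
    return counts

# Same data, same bin width: only the anchor position differs.
c0 = hist_counts(0.0)
c1 = hist_counts(h / 2)
print(c0)
print(c1)
```

With bins this coarse, the apparent shape (and even the apparent number of modes) can differ between the two anchors, which is exactly the instability the index is built to flag before any single anchor is trusted.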