92 results for IMPROVED STABILITY
Abstract:
In this paper, we give a new construction of resonant normal forms with a small remainder for near-integrable Hamiltonians at a quasi-periodic frequency. The construction is based on the special case of a periodic frequency, a Diophantine result concerning the approximation of a vector by independent periodic vectors, and a technique of composition of periodic averaging. It enables us to deal with non-analytic Hamiltonians; in this first part we focus on Gevrey Hamiltonians and derive normal forms with an exponentially small remainder. This extends a result that was previously known only for analytic Hamiltonians and, for Gevrey Hamiltonians, only in the periodic case. As applications, we obtain an exponentially large upper bound on the stability time for the evolution of the action variables and an exponentially small upper bound on the splitting of invariant manifolds for hyperbolic tori, generalizing the corresponding results for analytic Hamiltonians.
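Schematically, such a normal form statement takes the following shape (generic notation, a sketch rather than the paper's precise theorem): for a near-integrable Hamiltonian $H = h + f$ with a perturbation $f$ of size $\epsilon$ in a Gevrey class, one looks for a symplectic transformation $\Phi$ such that

$$ H \circ \Phi = h + g + f^*, \qquad \|f^*\| \le C \exp\!\left(-c\,\epsilon^{-a}\right), $$

where $g$ is in resonant normal form with respect to the frequency and the exponent $a > 0$ depends on the Gevrey exponent and on the Diophantine properties of the frequency vector; the exponentially small remainder $f^*$ is what drives the stability-time and splitting estimates mentioned above.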
Abstract:
This paper is a sequel to "Normal forms, stability and splitting of invariant manifolds I. Gevrey Hamiltonians", in which we gave a new construction of resonant normal forms with an exponentially small remainder for near-integrable Gevrey Hamiltonians at a quasi-periodic frequency, using a method of periodic approximations. In this second part we focus on finitely differentiable Hamiltonians and derive normal forms with a polynomially small remainder. As applications, we obtain a polynomially large upper bound on the stability time for the evolution of the action variables and a polynomially small upper bound on the splitting of invariant manifolds for hyperbolic tori.
Abstract:
In this study, alkaline lignins glyoxalated with glyoxal, a non-volatile and non-toxic aldehyde that can be obtained from several natural resources, were prepared and characterized for use in wood adhesives. The preparation method consisted of the reaction of lignin with glyoxal in an alkaline medium. The influence of reaction conditions such as the sodium hydroxide-to-lignin molar ratio and the reaction time on the properties of the prepared adducts was studied. The analytical techniques used were FTIR and 1H-NMR spectroscopies, gel permeation chromatography (GPC), differential scanning calorimetry (DSC), and thermogravimetric analysis (TGA). Results from both FTIR and 1H-NMR spectroscopies showed that the amount of aliphatic hydroxyl groups introduced onto the lignin molecule increased with increasing reaction time, reached a maximum at 10 h, and then began to decrease. The molecular weights remained unchanged up to 10 h of reaction time and then started to increase, possibly due to repolymerization reactions. DSC analysis showed that the glass transition temperature (Tg) decreased with the introduction of glyoxal onto the lignin molecule, owing to the increase in free volume of the lignin molecules. TGA showed that the thermal stability of the glyoxalated lignin was not affected and remained suitable for wood adhesives. Compared to the original lignin, the modified lignin is reactive and a suitable raw material for adhesive formulations.
Abstract:
Earthquakes occurring around the world each year cause thousands of deaths, millions of dollars in damage to infrastructure, and incalculable human suffering. In recent years, satellite technology has been a significant boon to response efforts following an earthquake and its after-effects by providing mobile communications between response teams and remote sensing of damaged areas to disaster management organizations. In 2007, an international team of students and professionals assembled during the International Space University's Summer Session Program in Beijing, China to examine how satellite and ground-based technology could be better integrated to provide an optimised response in the event of an earthquake. The resulting Technology Resources for Earthquake MOnitoring and Response (TREMOR) proposal describes an integrative prototype response system that will implement mobile satellite communication hubs providing telephone and data links between response teams, onsite telemedicine consultation for emergency first-responders, and satellite navigation systems that will locate and track emergency vehicles and guide search-and-rescue crews. A prototype earthquake simulation system is also proposed, integrating historical data, earthquake precursor data, and local geomatics and infrastructure information to predict the damage that could occur in the event of an earthquake. The backbone of these proposals is a comprehensive education and training program to help individuals, communities and governments prepare in advance. The TREMOR team recommends the coordination of these efforts through a centralised, non-governmental organization.
Abstract:
Study carried out during a stay at the University of British Columbia, Canada, between 2010 and 2012. First, a scale was developed to measure lameness (with values from 1 to 5). This scale was then used to study the association between farm-level risk factors (facility design and management) and the prevalence of lameness in North America. Data were collected from a total of 40 farms in the northeastern United States (NE) and 39 in California (CA). All cows in the highest-producing group were categorized according to the severity of lameness: healthy, lame, and severely lame. The overall prevalence of lameness was 55% in NE and 31% in CA. The prevalence of severe lameness was 8% in NE and 4% in CA. In NE, overall lameness increased with the presence of sawdust in the stalls and decreased on larger farms, with greater amounts of bedding, and with access to pasture. Severe lameness increased with poor stall hygiene and the presence of sawdust in the stalls, and decreased with the amount of bedding provided, the use of sand bedding, and farm size. In CA, overall lameness increased with poor stall hygiene and decreased with farm size, the presence of rubber flooring, greater stall space, greater water-trough space, and hoof disinfection. Severe lameness increased with poor stall hygiene and decreased with the frequency of pen cleaning. In conclusion, changes in management and facility design can help reduce the prevalence of lameness, although the appropriate strategies will vary by region.
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire within that step, with firing counts drawn from a Poisson or binomial distribution. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
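To make the baseline concrete, here is a minimal sketch of the plain Poisson τ-leap update in Python (the reaction system, rate constants, and fixed step size are illustrative assumptions; the RK extension proposed in the paper is not reproduced here):

import numpy as np

rng = np.random.default_rng(0)

# Illustrative system: reversible isomerization A <-> B.
# Rows of nu are the state changes for each reaction: 0: A -> B, 1: B -> A.
nu = np.array([[-1, 1],
               [1, -1]])
k = np.array([1.0, 0.5])  # assumed rate constants

def propensities(x):
    # a_j(x): instantaneous firing rate of each reaction channel.
    return np.array([k[0] * x[0], k[1] * x[1]])

def tau_leap(x0, tau, n_steps):
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        a = propensities(x)
        # Poisson tau-leap: channel j fires K_j ~ Poisson(a_j * tau) times
        # within the step, instead of simulating every event as in SSA.
        fires = rng.poisson(a * tau)
        x = np.maximum(x + nu.T @ fires, 0)  # crude guard against negative counts
    return x

print(tau_leap([1000, 0], tau=0.01, n_steps=500))

Each step replaces many individual SSA firing events with one Poisson draw per channel, which is where the speed-up over the exact algorithm comes from.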
Abstract:
Background: Recent advances in high-throughput technologies have produced a vast amount of protein sequences, while the number of high-resolution structures has seen only a limited increase. This has impelled the development of many strategies to build protein structures from their sequences, generating a considerable number of alternative models. The selection of the model closest to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, by knowledge-based potentials, or by a combination of both. Results: Here, we present and demonstrate a theory to split knowledge-based potentials into biologically meaningful scoring terms and to combine them into new scores that predict near-native structures. Our strategy allows circumventing the problem of defining the reference state. In this approach we give the proof for a simple linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score, we obtained predictions similar to state-of-the-art methods. Moreover, our approach has the advantage of identifying the most relevant terms involved in the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors. Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores detect near-native structures as accurately as state-of-the-art methods and successfully identify wrongly modeled regions of many near-native conformations.
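The kind of Z-score combination described above can be sketched in a few lines (the split energy terms, the decoy models, and the uniform weights are illustrative assumptions, not the paper's actual potentials):

import numpy as np

def zscores(term_energies):
    # term_energies: (n_models, n_terms) matrix of per-term energies for a
    # set of alternative models of the same protein.
    mu = term_energies.mean(axis=0)
    sd = term_energies.std(axis=0, ddof=1)
    return (term_energies - mu) / sd  # Z-score of each term for each model

def composite_score(term_energies, weights=None):
    # Linear combination of per-term Z-scores; uniform weights give the
    # simplest composite score, and the weights can be optimized further.
    z = zscores(term_energies)
    if weights is None:
        weights = np.ones(z.shape[1]) / z.shape[1]
    return z @ weights

# Toy usage: 5 models scored on 3 hypothetical split terms; the model with
# the lowest composite score would be selected as closest to native.
E = np.array([[-1.0, -2.0, -0.5],
              [-0.8, -1.5, -0.7],
              [-1.2, -2.2, -0.4],
              [-0.5, -1.0, -0.9],
              [-1.1, -2.1, -0.6]])
print(composite_score(E).round(2))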
Abstract:
This paper revisits previous work by the authors on the relationship between non-quasi-competitiveness (an increase in price caused by an increase in the number of oligopolists) and the stability of equilibrium in the classical Cournot oligopoly model. Although it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, the authors in their previous work put forward a model in which a monopoly became a duopoly, losing quasi-competitiveness while maintaining the stability of the equilibrium. That model could not, at the time, be extended to an arbitrary number of oligopolists. The present paper exhibits such an extension: an oligopoly model in which the loss of quasi-competitiveness persists however many firms are in the market, and in which the successive Cournot equilibrium points are unique and asymptotically stable. In this way, the conjecture that non-quasi-competitiveness and instability are equivalent in the long run is proved false for the first time.
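For contrast, the textbook linear Cournot model shows what quasi-competitiveness means (a standard example, not the authors' construction): with inverse demand $p = a - bQ$, common marginal cost $c < a$, and $n$ identical firms, the symmetric equilibrium is

$$ q_i^* = \frac{a - c}{b(n+1)}, \qquad p_n^* = \frac{a + nc}{n+1}, \qquad p_{n+1}^* - p_n^* = -\frac{a - c}{(n+1)(n+2)} < 0, $$

so each entry strictly lowers the equilibrium price, i.e. the model is quasi-competitive. A non-quasi-competitive model must reverse the sign of this difference, and the paper's contribution is to do so while keeping each equilibrium unique and asymptotically stable.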
Abstract:
In experiments with two-person sequential games we analyze whether responses to favorable and unfavorable actions depend on the elicitation procedure. In our hot treatment the second player responds to the first player's observed action, while in our cold treatment we follow the strategy method and have the second player decide on a contingent action for each and every possible first-player move, without first observing this move. Our analysis centers on the degree to which subjects deviate from the maximization of their pecuniary rewards as a response to others' actions. Our results show no difference in behavior between the two treatments. We also find evidence of the stability of subjects' preferences with respect to their behavior over time and to the consistency of their choices as first and second mover.
Abstract:
We estimate a forward-looking monetary policy reaction function for the postwar United States economy, before and after Volcker's appointment as Fed Chairman in 1979. Our results point to substantial differences in the estimated rule across periods. In particular, interest rate policy in the Volcker-Greenspan period appears to have been much more sensitive to changes in expected inflation than in the pre-Volcker period. We then compare some of the implications of the estimated rules for the equilibrium properties of inflation and output, using a simple macroeconomic model, and show that the Volcker-Greenspan rule is stabilizing.
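The baseline specification in this literature (notation illustrative) is a forward-looking interest-rate rule with partial adjustment:

$$ r_t^* = r^* + \beta \left( E[\pi_{t+k} \mid \Omega_t] - \pi^* \right) + \gamma\, E[x_t \mid \Omega_t], \qquad r_t = \rho\, r_{t-1} + (1 - \rho)\, r_t^*, $$

where $\pi_{t+k}$ is inflation $k$ periods ahead, $\pi^*$ the inflation target, $x_t$ the output gap, $\Omega_t$ the central bank's information set, and $\rho$ the degree of interest-rate smoothing. A response coefficient $\beta > 1$, so that the nominal rate rises more than one-for-one with expected inflation, is the usual condition for such a rule to be stabilizing.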
Abstract:
Confidence intervals in econometric time series regressions suffer from notorious coverage problems. This is especially true when the dependence in the data is noticeable and sample sizes are small to moderate, as is often the case in empirical studies. This paper suggests using the studentized block bootstrap and discusses practical issues, such as the choice of the block size. A particular data-dependent method is proposed to automate this choice. As a side note, it is pointed out that symmetric confidence intervals are preferred over equal-tailed ones, since they exhibit improved coverage accuracy. The improvements in small-sample performance are supported by a simulation study.
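A minimal sketch of a symmetric studentized block-bootstrap interval for a mean is given below (illustrative only: the block length is fixed by hand rather than chosen by the paper's data-dependent rule, and the within-resample variance is estimated by simple batch means):

import numpy as np

rng = np.random.default_rng(0)

def block_se(x, b):
    # Batch-means standard error of the sample mean under dependence:
    # std of non-overlapping block means, scaled by the number of blocks.
    n = len(x)
    blocks = x[: n - n % b].reshape(-1, b).mean(axis=1)
    return blocks.std(ddof=1) / np.sqrt(len(blocks))

def studentized_block_boot_ci(x, b=10, B=2000, alpha=0.05):
    n = len(x)
    mean, se = x.mean(), block_se(x, b)
    t_abs = np.empty(B)
    for i in range(B):
        # Moving-block bootstrap: glue together random overlapping blocks.
        starts = rng.integers(0, n - b + 1, size=n // b + 1)
        xb = np.concatenate([x[s:s + b] for s in starts])[:n]
        t_abs[i] = abs(xb.mean() - mean) / block_se(xb, b)
    q = np.quantile(t_abs, 1 - alpha)  # symmetric: quantile of |t*|
    return mean - q * se, mean + q * se

# Toy usage on an AR(1) series, where i.i.d.-based intervals undercover.
e = rng.standard_normal(500)
x = np.empty(500); x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + e[t]
print(studentized_block_boot_ci(x))

The symmetric interval uses the quantile of $|t^*|$ rather than two one-sided quantiles, which is the choice the abstract argues improves coverage accuracy.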
Abstract:
It is widely accepted in the literature about the classical Cournot oligopoly model that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the equilibrium. In this paper, though, we present a model in which a stable unique symmetric equilibrium is reached for any number of oligopolists as the industry price increases with each new entry. Consequently, the suspicion that non-quasi-competitiveness implies, in the long run, instability is proved false.
Abstract:
We develop a coordination game to model interactions between fundamentals and liquidity during unstable periods in financial markets. We then propose a flexible econometric framework for estimation of the model and analysis of its quantitative implications. The specific empirical application is carry trades in the yen-dollar market, including the turmoil of 1998. We find a generally very deep market, with low information disparities amongst agents. We occasionally observe episodes of market fragility, or turmoil, with "up by the escalator, down by the elevator" patterns in prices. The key role of strategic behavior in the econometric model is also confirmed.
Abstract:
In this paper, we discuss the pros and cons of different models for financial market regulation and supervision, and we present a proposal for the re-organisation of regulatory and supervisory agencies in the Euro Area. Our arguments are consistent with both new theories and the effective behaviour of financial intermediaries in industrialized countries. Our proposed architecture for financial market regulation is based on the assignment of different objectives or "finalities" to different authorities, both at the domestic and the European level. According to this perspective, the three objectives of supervision - microeconomic stability, investor protection and proper behaviour, efficiency and competition - should be assigned to three distinct European authorities, each one at the centre of a European system of financial regulators and supervisors specialized in overseeing the entire financial market with respect to a single regulatory objective and regardless of the subjective nature of the intermediaries. Each system should be structured and organized similarly to the European System of Central Banks and work in connection with the central bank, which would remain the institution responsible for price and macroeconomic stability. We suggest a plausible path to build our 4-peak regulatory architecture in the Euro area.
Abstract:
Although the histogram is the most widely used density estimator, it is well-known that the appearance of a constructed histogram for a given bin width can change markedly for different choices of anchor position. In this paper we construct a stability index $G$ that assesses the potential changes in the appearance of histograms for a given data set and bin width as the anchor position changes. If a particular bin width choice leads to an unstable appearance, the arbitrary choice of any one anchor position is dangerous, and a different bin width should be considered. The index is based on the statistical roughness of the histogram estimate. We show via Monte Carlo simulation that densities with more structure are more likely to lead to histograms with unstable appearance. In addition, ignoring the precision to which the data values are provided when choosing the bin width leads to instability. We provide several real data examples to illustrate the properties of $G$. Applications to other binned density estimators are also discussed.
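The anchor effect is easy to reproduce numerically; the sketch below holds the bin width fixed, shifts the anchor across one bin, and reports a crude pointwise-spread proxy for instability (not the paper's roughness-based index $G$):

import numpy as np

rng = np.random.default_rng(0)
# Bimodal sample: structure makes anchor sensitivity more visible.
x = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(2, 0.5, 300)])

def hist_density(x, h, anchor, grid):
    # Histogram density estimate with bin width h and bin edges anchored
    # at `anchor`, evaluated pointwise on `grid`.
    lo = anchor + h * np.floor((x.min() - anchor) / h)
    nbins = int(np.ceil((x.max() - lo) / h))
    edges = lo + h * np.arange(nbins + 1)
    counts, _ = np.histogram(x, bins=edges)
    dens = counts / (len(x) * h)
    idx = np.clip(np.searchsorted(edges, grid, side="right") - 1, 0, nbins - 1)
    return dens[idx]

h = 1.0
grid = np.linspace(x.min(), x.max(), 400)
# Same data and bin width, eight anchor positions spread over one bin.
ests = np.array([hist_density(x, h, a, grid)
                 for a in np.linspace(0.0, h, 8, endpoint=False)])
# Crude instability proxy: average pointwise spread across anchors.
print("anchor sensitivity:", ests.std(axis=0).mean().round(4))

A sample with more structure, such as this bimodal one, typically shows a larger spread across anchors than a unimodal one, in line with the abstract's Monte Carlo finding.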