101 results for Frequency stability
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or Binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
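As a point of reference for the method being extended, the following is a minimal sketch of a plain (explicit, fixed-step) Poisson τ-leap update in Python; the function names, the toy reversible isomerisation A <-> B and its rate constants are illustrative only and not taken from the paper.

```python
import numpy as np

def poisson_tau_leap(x0, stoich, propensity, tau, n_steps, rng=None):
    """Explicit Poisson tau-leap: each channel fires K_j ~ Poisson(a_j(x) * tau) times per step."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    trajectory = [x.copy()]
    for _ in range(n_steps):
        a = propensity(x)                      # channel propensities a_j(x)
        k = rng.poisson(a * tau)               # one set of Poisson draws per time step
        x = np.maximum(x + k @ stoich, 0.0)    # apply state changes; clip artefactual negatives
        trajectory.append(x.copy())
    return np.array(trajectory)

# Toy example: reversible isomerisation A <-> B (illustrative rate constants)
c1, c2 = 1.0, 0.5
stoich = np.array([[-1,  1],    # A -> B
                   [ 1, -1]])   # B -> A
propensity = lambda x: np.array([c1 * x[0], c2 * x[1]])
traj = poisson_tau_leap(x0=[1000, 0], stoich=stoich, propensity=propensity, tau=0.01, n_steps=500)
```

The RK extension described in the abstract keeps this single set of Poisson draws per step but combines stage evaluations in a Runge-Kutta fashion so that the steady-state variance remains well-behaved at larger τ.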
Abstract:
This paper revisits previous work by the authors on the relationship between non-quasi-competitiveness (an increase in price caused by an increase in the number of oligopolists) and stability of the equilibrium in the classical Cournot oligopoly model. Although it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, the authors' previous work put forward a model in which a monopoly turned into a duopoly, losing quasi-competitiveness while maintaining the stability of the equilibrium. That model could not, at the time, be extended to an arbitrary number of oligopolists. The present paper exhibits such an extension: an oligopoly model in which the loss of quasi-competitiveness persists no matter how many firms enter the market and where the successive Cournot equilibrium points are unique and asymptotically stable. In this way, the conjecture that non-quasi-competitiveness and long-run instability are equivalent is, for the first time, proved false.
Abstract:
In experiments with two-person sequential games we analyze whether responses to favorable and unfavorable actions depend on the elicitation procedure. In our hot treatment the second player responds to the first player's observed action, while in our cold treatment we follow the strategy method and have the second player decide on a contingent action for each and every possible first-player move, without first observing this move. Our analysis centers on the degree to which subjects deviate from the maximization of their pecuniary rewards as a response to others' actions. Our results show no difference in behavior between the two treatments. We also find evidence of the stability of subjects' preferences with respect to their behavior over time and to the consistency of their choices as first and second mover.
Abstract:
We estimate a forward-looking monetary policy reaction function for the postwar United States economy, before and after Volcker's appointment as Fed Chairman in 1979. Our results point to substantial differences in the estimated rule across periods. In particular, interest rate policy in the Volcker-Greenspan period appears to have been much more sensitive to changes in expected inflation than in the pre-Volcker period. We then compare some of the implications of the estimated rules for the equilibrium properties of inflation and output, using a simple macroeconomic model, and show that the Volcker-Greenspan rule is stabilizing.
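As a hedged illustration of what a forward-looking reaction function of this kind typically looks like (the paper's exact specification may differ; the target rate \(r_t^{*}\), smoothing parameter \(\rho\), and output-gap term \(x_t\) below are standard notation introduced here for exposition):

\[
r_t^{*} = \bar r + \beta\,\bigl(E[\pi_{t+1}\mid\Omega_t] - \pi^{*}\bigr) + \gamma\,E[x_t \mid \Omega_t],
\qquad
r_t = \rho\, r_{t-1} + (1-\rho)\, r_t^{*} + \varepsilon_t .
\]

In simple New Keynesian models, a coefficient \(\beta\) greater than one (nominal rates rising more than one-for-one with expected inflation) is typically what makes such a rule stabilizing, which is the margin on which the pre-Volcker and Volcker-Greenspan estimates are compared.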
Abstract:
It is widely accepted in the literature about the classical Cournot oligopoly model that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the equilibrium. In this paper, though, we present a model in which a stable unique symmetric equilibrium is reached for any number of oligopolists as industry price increases with each new entry. Consequently, the suspicion that non-quasi-competitiveness implies, in the long run, instability is proved false.
Abstract:
We develop a coordination game to model interactions between fundamentals and liquidity during unstable periods in financial markets. We then propose a flexible econometric framework for estimation of the model and analysis of its quantitative implications. The specific empirical application is carry trades in the yen-dollar market, including the turmoil of 1998. We find a generally very deep market, with low information disparities amongst agents. We occasionally observe episodes of market fragility, or turmoil, with "up by the escalator, down by the elevator" patterns in prices. The key role of strategic behavior in the econometric model is also confirmed.
Abstract:
In this paper, we discuss pros and cons of different models for financial market regulation and supervision, and we present a proposal for the re-organisation of regulatory and supervisory agencies in the Euro Area. Our arguments are consistent with both new theories and the effective behaviour of financial intermediaries in industrialized countries. Our proposed architecture for financial market regulation is based on the assignment of different objectives or "finalities" to different authorities, both at the domestic and the European level. According to this perspective, the three objectives of supervision - microeconomic stability, investor protection and proper behaviour, efficiency and competition - should be assigned to three distinct European authorities, each one at the centre of a European system of financial regulators and supervisors specialized in overseeing the entire financial market with respect to a single regulatory objective and regardless of the subjective nature of the intermediaries. Each system should be structured and organized similarly to the European System of Central Banks and work in connection with the central bank, which would remain the institution responsible for price and macroeconomic stability. We suggest a plausible path to build our 4-peak regulatory architecture in the Euro area.
Abstract:
We analyze which normal form solution concepts capture the notion of forward induction, as defined by van Damme (JET, 1989), in the class of generic two-player normal form games preceded by an outside option. We find that none of the known strategic stability concepts (including Mertens stable sets and hyperstable sets) captures this form of forward induction. On the other hand, we show that the evolutionary concept of EES set (Swinkels, JET, 1992) is always consistent with forward induction.
Abstract:
We argue the importance both of developing simple sufficient conditions for the stability of general multiclass queueing networks and of assessing such conditions under a range of assumptions on the weight of the traffic flowing between service stations. To achieve the former, we review a peak-rate stability condition and extend its range of application; for the latter, we introduce a generalisation of the Lu-Kumar network on which the stability condition may be tested for a range of traffic configurations. The peak-rate condition is close to exact when the between-station traffic is light, but degrades as this traffic increases.
Abstract:
This paper presents a classical Cournot oligopoly model with some peculiar features: it is non-quasi-competitive, as the price under N-poly is greater than the monopoly price; a Cournot equilibrium exists and is unique after each new entry; and the successive equilibria after new entries are stable under the adjustment mechanism in which each seller's actual output is adjusted proportionally to the difference between its actual output and its profit-maximizing output. Moreover, the model tends to perfect competition as N goes to infinity, reaching the monopoly price again.
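One common way to formalize this kind of proportional adjustment (a sketch only; the best-response function \(r_i\) and adjustment speed \(k_i > 0\) are notation introduced here, and the paper's exact mechanism may differ):

\[
q_i(t+1) \;=\; q_i(t) \;+\; k_i\Bigl[\, r_i\Bigl(\sum_{j \neq i} q_j(t)\Bigr) \;-\; q_i(t) \Bigr], \qquad i = 1, \dots, N,
\]

where \(q_i\) is firm \(i\)'s output and \(r_i\) maps rivals' aggregate output to firm \(i\)'s profit-maximizing response; asymptotic stability of the Cournot equilibrium then amounts to the eigenvalues of the Jacobian of this map at the equilibrium lying strictly inside the unit circle.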
Abstract:
Previous works on asymmetric information in asset markets tend to focus on the potential gains in the asset market itself. We focus on the market for information and conduct an experimental study to explore, in a game of finite but uncertain duration, whether reputation can be an effective constraint on deliberate misinformation. At the beginning of each period, an uninformed potential asset buyer can purchase information, at a fixed price and from a fully-informed source, about the value of the asset in that period. The informational insiders cannot purchase the asset and are given short-term incentives to provide false information when the asset value is low. Our model predicts that, in accordance with the Folk Theorem, Pareto-superior outcomes featuring truthful revelation should be sustainable. However, this depends critically on beliefs about rationality and behavior. We find that, overall, sellers are truthful 89% of the time. More significantly, the observed frequency of truthfulness is 81% when the asset value is low. Our result is consistent with both mixed-strategy and trigger strategy interpretations and provides evidence that most subjects correctly anticipate rational behavior. We discuss applications to financial markets, media regulation, and the stability of cartels.
Abstract:
The second differential of the entropy is used for analysing the stability of a thermodynamic climatic model. A delay time for the heat flux is introduced, whereby the flux becomes an independent variable. Two different expressions for the second differential of the entropy are used: one follows classical irreversible thermodynamics theory; the second is related to the introduction of the response time and comes from extended irreversible thermodynamics theory. The second differential of the classical entropy leads to unstable solutions for high values of the delay time. The extended expression always implies stable states for an ice-free earth. When the ice-albedo feedback is included, a discontinuous distribution of stable states is found for high response times. Following the thermodynamic analysis of the model, the maximum rates of entropy production at the steady state are obtained. A latitudinally isothermal earth produces the extremum in global entropy production. The material contribution to entropy production (by which we mean the production of entropy by material transport of heat) is a maximum when the latitudinal distribution of temperatures becomes less homogeneous than present values.
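As a hedged sketch of the two entropy expressions being contrasted (the notation u for internal energy density, q for heat flux, λ for thermal conductivity, T for temperature and τ for the flux delay/relaxation time is introduced here for exposition, not taken from the paper), extended irreversible thermodynamics typically supplements the local-equilibrium entropy with a term quadratic in the flux:

\[
s(u,\mathbf{q}) \;=\; s_{\mathrm{eq}}(u) \;-\; \frac{\tau}{2\lambda T^{2}}\,\mathbf{q}\cdot\mathbf{q}.
\]

Stability is then assessed in a Lyapunov fashion through the second differential: with \( \delta^{2}s \le 0 \) by concavity, a steady state is stable when \( \tfrac{d}{dt}\,\delta^{2}s \ge 0 \), so that \( -\delta^{2}s \) acts as a Lyapunov function; the classical expression simply omits the flux term.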
Abstract:
This editorial examines the actions undertaken in issue 5 and announces two important changes for issues 6 and 7, namely in frequency and languages. Furthermore, the use of some bibliometric indicators, such as the impact factor and the rejection rate, is critically analyzed.
Abstract:
We report millimetre-wave continuum observations of the X-ray binaries Cygnus X-3, SS 433, LSI+61 303, Cygnus X-1 and GRS 1915+105. The observations were carried out with the IRAM 30 m antenna at 250 GHz (1.25 mm) from 1998 March 14 to March 20. These millimetre measurements are complemented with centimetre observations from the Ryle Telescope at 15 GHz (2.0 cm) and from the Green Bank Interferometer at 2.25 and 8.3 GHz (13 and 3.6 cm). Both Cygnus X-3 and SS 433 underwent moderate flaring events during our observations, whose main spectral evolution properties are described and interpreted. A significant spectral steepening was observed in both sources during the flare decay, which is likely to be caused by adiabatic expansion, inverse Compton and synchrotron losses. Finally, we also report 250 GHz upper limits for three additional undetected X-ray binary stars: LSI+65 010, LSI+61 235 and X Per.
Abstract:
p27(Kip1) (p27) is a member of the Cip/Kip family of cyclin-dependent kinase inhibitors. Recently, a new function of p27 as a transcriptional regulator has been reported. It has been shown that p27 regulates the expression of target genes mostly involved in splicing, cell cycle, respiration and translation. We report here that p27 directly binds to the transcriptional coactivator PCAF through a region comprising amino acids 91-120. PCAF associates with p27 through its catalytic domain and acetylates p27 at lysine 100. Our data show that overexpression of PCAF induces the degradation of p27 whereas, in contrast, knockdown of PCAF stabilizes the protein. A p27 mutant in which K100 was substituted by arginine (p27-K100R) cannot be acetylated by PCAF and has a half-life much longer than that of p27WT. Moreover, p27-K100R remains stable throughout cell-cycle progression. Ubiquitylation assays and the use of proteasome inhibitors indicate that PCAF induces p27 degradation via the proteasome. We also observed that knockdown of Skp2 did not affect the PCAF-induced degradation of p27. In conclusion, our data suggest that p27 acetylation by PCAF regulates its stability.