68 results for aerobic stability
Abstract:
In standard multivariate statistical analysis common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations and also hypotheses of subcompositional stability. We identify the two problems as being the counterpart of the analysis of paired comparison or split plot experiments and of separate sample comparative experiments in the jargon of standard multivariate analysis. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
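The perturbation operation mentioned above is standard in compositional data analysis: the closed componentwise product of two compositions. A minimal sketch (function names are illustrative, not from the paper):

```python
import numpy as np

def closure(x):
    """Normalize a positive vector so its parts sum to 1 (a composition)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, p):
    """Aitchison perturbation: closed componentwise product of compositions."""
    return closure(np.asarray(x, dtype=float) * np.asarray(p, dtype=float))

# Perturbing by the neutral element (equal parts) leaves a composition unchanged.
x = closure([1.0, 2.0, 7.0])
e = closure([1.0, 1.0, 1.0])
print(perturb(x, e))  # same composition as x
```

Hypotheses about compositional change, such as those in the paper, can then be phrased as statements about the perturbation relating two compositions.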
Abstract:
In this paper, robustness of parametric systems is analyzed using a new approach to interval mathematics called Modal Interval Analysis. Modal Intervals are an interval extension that, unlike classic intervals, recovers some of the properties required of a numerical system. Modal Interval Analysis not only simplifies the computation of interval functions but also allows semantic interpretation of their results. Necessary, sufficient and, in some cases, necessary and sufficient conditions for robust performance are presented.
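One property that classic intervals lack, and which motivates extensions such as Modal Intervals, is an additive inverse: with classic interval arithmetic, X - X is generally not the degenerate interval [0, 0]. A minimal sketch of classic interval addition and subtraction (not the paper's modal extension):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """Classic (set-based) interval [lo, hi]."""
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Classic subtraction widens: every x - y with x, y in the operands.
        return Interval(self.lo - other.hi, self.hi - other.lo)

x = Interval(1.0, 2.0)
print(x - x)  # Interval(lo=-1.0, hi=1.0), not [0, 0]
```

The widening of X - X illustrates why classic intervals form only a semigroup under addition, one of the structural gaps that modal intervals address.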
Abstract:
See the abstract at the beginning of the attached document.
Abstract:
The problem of stability analysis for a class of neutral systems with mixed time-varying neutral, discrete and distributed delays and nonlinear parameter perturbations is addressed. By introducing a novel Lyapunov-Krasovskii functional and combining the descriptor model transformation, the Leibniz-Newton formula, some free-weighting matrices, and a suitable change of variables, new sufficient conditions are established for the stability of the considered system, which are neutral-delay-dependent, discrete-delay-range-dependent, and distributed-delay-dependent. The conditions are presented in terms of linear matrix inequalities (LMIs) and can be efficiently solved using convex programming techniques. Two numerical examples are given to illustrate the efficiency of the proposed method.
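The LMI conditions in such papers generalize the classical Lyapunov inequality A^T P + P A < 0, P > 0. A minimal delay-free sketch of that feasibility check (the matrix A is a hypothetical example; the paper's actual LMIs additionally involve delay bounds and free-weighting matrices):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical Hurwitz system matrix for x' = A x (delay-free illustration).
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
Q = np.eye(2)

# Solve the Lyapunov equation A^T P + P A = -Q.
# Asymptotic stability holds iff the solution P is positive definite.
P = solve_continuous_lyapunov(A.T, -Q)
eigs = np.linalg.eigvalsh((P + P.T) / 2)  # symmetrize before eigenvalue test
print("stable:", bool(np.all(eigs > 0)))
```

For the delayed and perturbed systems of the paper, the same idea is carried out with a Lyapunov-Krasovskii functional, and the resulting LMIs are handed to a semidefinite-programming solver.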
Abstract:
Why does the EU have an ambiguous and inconsistent democracy promotion (DP) policy towards the Mediterranean countries? This paper argues that the EU's DP is determined by a crucial conflict of interests conceptualised as a stability-democracy dilemma. The EU has been attempting to promote democracy, but without risking the current stability and in connivance with incumbent autocratic regimes. In view of this dilemma, the four main characteristics of the EU's DP are explored, namely: gradualism, a strong notion of partnership-building, a narrow definition of civil society, and a strong belief in economic liberalisation. A fifth feature, the EU's relations with moderate Islamists, is analysed as it represents the most striking illustration of these contradictions. The paper concludes by arguing that the definition of a clear DP that considers engagement with moderate Islamists would represent a major step towards squaring the EU's stability-democracy circle.
Abstract:
In this paper, we give a new construction of resonant normal forms with a small remainder for near-integrable Hamiltonians at a quasi-periodic frequency. The construction is based on the special case of a periodic frequency, a Diophantine result concerning the approximation of a vector by independent periodic vectors and a technique of composition of periodic averaging. It enables us to deal with non-analytic Hamiltonians, and in this first part we will focus on Gevrey Hamiltonians and derive normal forms with an exponentially small remainder. This extends a result which was known for analytic Hamiltonians, and only in the periodic case for Gevrey Hamiltonians. As applications, we obtain an exponentially large upper bound on the stability time for the evolution of the action variables and an exponentially small upper bound on the splitting of invariant manifolds for hyperbolic tori, generalizing corresponding results for analytic Hamiltonians.
Abstract:
This paper is a sequel to "Normal forms, stability and splitting of invariant manifolds I. Gevrey Hamiltonians", in which we gave a new construction of resonant normal forms with an exponentially small remainder for near-integrable Gevrey Hamiltonians at a quasi-periodic frequency, using a method of periodic approximations. In this second part we focus on finitely differentiable Hamiltonians, and we derive normal forms with a polynomially small remainder. As applications, we obtain a polynomially large upper bound on the stability time for the evolution of the action variables and a polynomially small upper bound on the splitting of invariant manifolds for hyperbolic tori.
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or Binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
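The plain Poisson τ-leap that this paper extends can be sketched for a single decay reaction A → ∅ (the rate constant and initial count below are illustrative, not from the paper): in each step of fixed size τ, the number of firings is drawn from a Poisson distribution with mean a(n)·τ, where a(n) = k·n is the propensity.

```python
import numpy as np

rng = np.random.default_rng(0)

def tau_leap_decay(n0, k, tau, t_end):
    """Poisson tau-leap for the decay reaction A -> 0 with rate constant k.

    Each step draws the number of firings from Poisson(a(n)*tau), where
    a(n) = k*n is the propensity, instead of simulating firings one by one.
    """
    n, t = n0, 0.0
    while t < t_end:
        firings = rng.poisson(k * n * tau)
        n = max(n - firings, 0)  # clamp so the population never goes negative
        t += tau
    return n

# The mean tracks the deterministic solution n0 * exp(-k * t) ~ 3679 here,
# but the variance degrades as tau grows -- the issue the RK extension targets.
print(tau_leap_decay(10000, k=1.0, tau=0.01, t_end=1.0))
```

The RK τ-leap methods of the paper keep this one-Poisson-draw-per-step cost while controlling the variance error at larger τ.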
Abstract:
Peroxiredoxins are known to interact with hydrogen peroxide (H2O2) and to participate in oxidant scavenging, redox signal transduction, and heat-shock responses. The two-cysteine peroxiredoxin Tpx1 of Schizosaccharomyces pombe has been characterized as the H2O2 sensor that transduces the redox signal to the transcription factor Pap1. Here, we show that Tpx1 is essential for aerobic, but not anaerobic, growth. We demonstrate that Tpx1 has an exquisite sensitivity for its substrate, which explains its participation in maintaining low steady-state levels of H2O2. We also show in vitro and in vivo that inactivation of Tpx1 by oxidation of its catalytic cysteine to a sulfinic acid is always preceded by a sulfinic acid form in a covalently linked dimer, which may be important for understanding the kinetics of Tpx1 inactivation. Furthermore, we provide evidence that a strain expressing Tpx1.C169S, lacking the resolving cysteine, can sustain aerobic growth, and we show that small reductants can modulate the activity of the mutant protein in vitro, probably by supplying a thiol group to substitute for cysteine 169.
Abstract:
This paper revisits previous work by the authors on the relationship between non-quasi-competitiveness (the increase in price caused by an increase in the number of oligopolists) and stability of the equilibrium in the classical Cournot oligopoly model. Though it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, the authors in their previous work put forward a model in which a situation of monopoly changed to duopoly, losing quasi-competitiveness but maintaining the stability of the equilibrium. That model could not, at the time, be extended to any number of oligopolists. The present paper exhibits such an extension. An oligopoly model is shown in which the loss of quasi-competitiveness resists the presence in the market of as many firms as one wishes, and where the successive Cournot equilibrium points are unique and asymptotically stable. In this way, for the first time, the conjecture that non-quasi-competitiveness and instability are equivalent in the long run is proved false.
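For contrast with the paper's non-quasi-competitive construction, the textbook linear Cournot market is quasi-competitive and stable: price falls with each entry, and damped best-response dynamics converge to the unique symmetric equilibrium q* = (a - c)/((n + 1)b). A minimal sketch with illustrative demand and cost parameters (not the paper's model):

```python
def cournot_best_response_dynamics(n, a=100.0, b=1.0, c=10.0, lam=0.5, iters=500):
    """Damped best-response adjustment in a symmetric linear Cournot market.

    Inverse demand p = a - b*Q, constant marginal cost c (illustrative values).
    Each round, every firm moves a fraction lam toward its best response
    q_i = (a - c - b*Q_{-i}) / (2b) to the rivals' total output Q_{-i}.
    """
    q = [0.0] * n
    for _ in range(iters):
        total = sum(q)
        br = [max((a - c - b * (total - qi)) / (2 * b), 0.0) for qi in q]
        q = [(1 - lam) * qi + lam * bi for qi, bi in zip(q, br)]
    return q[0], a - b * sum(q)  # per-firm output and market price

q2, p2 = cournot_best_response_dynamics(2)  # q* = 30, price 40
q3, p3 = cournot_best_response_dynamics(3)  # q* = 22.5, price 32.5 < 40
print(round(q2, 3), round(p2, 3))
print(round(q3, 3), round(p3, 3))
```

Here entry lowers the price (quasi-competitiveness); the paper's contribution is a model where the price instead rises with entry while the successive equilibria remain asymptotically stable.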
Abstract:
In experiments with two-person sequential games we analyze whether responses to favorable and unfavorable actions depend on the elicitation procedure. In our hot treatment the second player responds to the first player's observed action, while in our cold treatment we follow the strategy method and have the second player decide on a contingent action for each and every possible first-player move, without first observing this move. Our analysis centers on the degree to which subjects deviate from the maximization of their pecuniary rewards as a response to others' actions. Our results show no difference in behavior between the two treatments. We also find evidence of the stability of subjects' preferences with respect to their behavior over time and to the consistency of their choices as first and second mover.
Abstract:
We estimate a forward-looking monetary policy reaction function for the postwar United States economy, before and after Volcker's appointment as Fed Chairman in 1979. Our results point to substantial differences in the estimated rule across periods. In particular, interest rate policy in the Volcker-Greenspan period appears to have been much more sensitive to changes in expected inflation than in the pre-Volcker period. We then compare some of the implications of the estimated rules for the equilibrium properties of inflation and output, using a simple macroeconomic model, and show that the Volcker-Greenspan rule is stabilizing.
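Reaction functions of this kind are typically Taylor-type rules. A minimal sketch with hypothetical coefficients (not the paper's estimates) of why an inflation response greater than one is stabilizing: the nominal rate then rises more than one-for-one with expected inflation, so the real rate rises too.

```python
def taylor_rule(pi_expected, output_gap, r_star=2.0, pi_target=2.0,
                phi_pi=1.5, phi_y=0.5):
    """Taylor-type rule with hypothetical coefficients (illustrative only).

    Nominal rate = r* + pi_e + phi_pi*(pi_e - pi*) + phi_y*output_gap.
    phi_pi > 1 is the stabilizing ("Taylor principle") case: the implied
    real rate (nominal rate minus expected inflation) rises with inflation.
    """
    return (r_star + pi_expected
            + phi_pi * (pi_expected - pi_target)
            + phi_y * output_gap)

i_at_target = taylor_rule(2.0, 0.0)   # inflation at target: rate 4.0
i_high_infl = taylor_rule(3.0, 0.0)   # inflation up 1 pt: rate up 2.5 pts
print(i_at_target, i_high_infl)
```

With a response below one, as estimated for the pre-Volcker period, the real rate would instead fall when expected inflation rises, which is the destabilizing case the paper discusses.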
Abstract:
It is widely accepted in the literature about the classical Cournot oligopoly model that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the equilibrium. In this paper, though, we present a model in which a stable unique symmetric equilibrium is reached for any number of oligopolists as industry price increases with each new entry. Consequently, the suspicion that non-quasi-competitiveness implies, in the long run, instability is proved false.
Abstract:
We develop a coordination game to model interactions between fundamentals and liquidity during unstable periods in financial markets. We then propose a flexible econometric framework for estimation of the model and analysis of its quantitative implications. The specific empirical application is carry trades in the yen-dollar market, including the turmoil of 1998. We find a generally very deep market, with low information disparities amongst agents. We occasionally observe episodes of market fragility, or turmoil, with "up by the escalator, down by the elevator" patterns in prices. The key role of strategic behavior in the econometric model is also confirmed.
Abstract:
In this paper, we discuss pros and cons of different models for financial market regulation and supervision, and we present a proposal for the re-organisation of regulatory and supervisory agencies in the Euro Area. Our arguments are consistent with both new theories and the effective behaviour of financial intermediaries in industrialized countries. Our proposed architecture for financial market regulation is based on the assignment of different objectives or "finalities" to different authorities, both at the domestic and the European level. According to this perspective, the three objectives of supervision (microeconomic stability, investor protection and proper behaviour, efficiency and competition) should be assigned to three distinct European authorities, each one at the centre of a European system of financial regulators and supervisors specialized in overseeing the entire financial market with respect to a single regulatory objective and regardless of the subjective nature of the intermediaries. Each system should be structured and organized similarly to the European System of Central Banks and work in connection with the central bank, which would remain the institution responsible for price and macroeconomic stability. We suggest a plausible path to build our 4-peak regulatory architecture in the Euro area.