79 results for Steady State Processes.
Abstract:
Following a general macroeconomic approach, this paper sets out a closed, micro-founded structural model to determine the long-run real exchange rate of a developed economy. In particular, the analysis follows the structure of a Natrex model. The main contribution of this research is the development of a solid theoretical framework that analyses in depth the determinants of the real exchange rate and the details of the equilibrium dynamics after any shock affecting the steady state. In our case, the intertemporal factors derived from the stock-flow relationship are particularly decisive. The main results of the paper can be summarised as follows. First, a complete, well-integrated structural model for long-run real exchange rate determination is developed from first principles. Moreover, within the dynamics of the model, some convergence restrictions are found to be necessary. On the one hand, for medium-run convergence, the sensitivity of the trade balance to changes in the real exchange rate must be higher than the corresponding sensitivity of investment decisions. On the other hand, regarding long-run convergence, it is also necessary both that there be a negative relationship between investment and capital stock accumulation and that the economy's overall saving depend positively on net foreign debt accumulation. In addition, there are interesting conclusions about the effects that shocks to the exogenous variables of the model have on the real exchange rate.
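As a notation-only illustration of the convergence restrictions described in this abstract (a sketch of the stated conditions, not the paper's actual dynamic system), with q the real exchange rate, k the capital stock and f net foreign debt:

\begin{align*}
\text{medium-run convergence:}\quad & \Big|\tfrac{\partial TB}{\partial q}\Big| > \Big|\tfrac{\partial I}{\partial q}\Big|
&& \text{(the trade balance reacts to $q$ more strongly than investment does)}\\
\text{long-run convergence:}\quad & \tfrac{\partial I}{\partial k} < 0, \qquad \tfrac{\partial S}{\partial f} > 0
&& \text{(investment falls as capital accumulates; saving rises with net foreign debt)}
\end{align*}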
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling-rate-of-profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land-augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
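For reference, the three "historical tendencies" can be written in standard notation as follows (definitions only; the paper's land-based generating mechanism and the Chaplygin-method proof are not reproduced here):

\begin{align*}
\text{capital--wage bill ratio:}\quad & \kappa = \frac{K}{\bar{w}L}, \qquad \bar{w}\ \text{the constant subsistence wage},\\
\text{capitalist share:}\quad & \pi = \frac{Y - \bar{w}L}{Y},\\
\text{rate of profit:}\quad & r = \frac{Y - \bar{w}L}{K} = \frac{\pi}{K/Y},
\end{align*}

so that $r$ falls whenever the capital--output ratio $K/Y$ rises faster than the capitalist share $\pi$.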
Abstract:
3,4-Methylenedioxymethamphetamine (MDMA, ecstasy) is a synthetic amphetamine derivative widely used as a recreational drug that produces serotonergic neurotoxicity in animals and possibly also in humans. The underlying mechanism of neurotoxicity includes the formation of reactive oxygen species (ROS), although their source remains a matter of controversy. It has been postulated that MDMA-induced neurotoxicity is mediated by the formation of bioreactive metabolites, specifically the primary catechol metabolites 3,4-dihydroxymethamphetamine (HHMA) and 3,4-dihydroxyamphetamine (HHA), which subsequently give rise to glutathione and N-acetylcysteine conjugates that retain the ability to undergo redox cycling and produce serotonergic neurotoxicity in rats. Although the presence of these metabolites was recently demonstrated in microdialysates from rat brain, their formation in humans has not yet been reported. This work describes the detection of N-acetyl-cysteine-HHMA (NAC-HHMA) and N-acetyl-cysteine-HHA (NAC-HHA) in human urine from 15 recreational MDMA users (1.5 mg/kg) in a controlled setting. The results reveal that in the first 4 hours after MDMA consumption approximately 0.002% of the administered dose is recovered as thioether adducts. Genetic polymorphisms in the expression of the enzymes CYP2D6 and COMT, which together are the main determinants of the steady-state levels of HHMA and HHA, may explain the inter-individual variability observed in the urinary recovery of NAC-HHMA and NAC-HHA. In summary, the formation of neurotoxic thioether adducts of MDMA is demonstrated for the first time in humans. These results support the hypothesis that bioactivation of MDMA to neurotoxic metabolites is a relevant mechanism for the generation of neurotoxicity in humans.
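For scale, assuming a hypothetical 70 kg subject (individual body weights are not reported in this abstract), the amount implied by the 0.002% recovery figure is roughly:

\[
1.5\ \tfrac{\text{mg}}{\text{kg}} \times 70\ \text{kg} = 105\ \text{mg of MDMA}, \qquad
105\ \text{mg} \times 0.002\% \approx 2.1\ \mu\text{g recovered as thioether adducts in the first 4 h}.
\]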
Abstract:
This technical report is a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analyses of sustainability. It addresses the epistemological implications of complexity, which entail the need to deal with multiple scales and non-equivalent narratives (multiple dimensions/attributes) when defining sustainability issues. There is an unavoidable tension between a “steady-state view” (the perception of what is going on now, reflecting a PAST → PRESENT view of reality) and an “evolutionary view” (the unknown transformation that we have to expect in the process of becoming of the observed reality and of the observer, reflecting a PRESENT → FUTURE view of reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators, using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints to sustainability (economic versus social aspects). The narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. The section then explores external constraints to sustainability (economic development versus the environment). The narrative chosen for this discussion focuses on the dark side of the current strategy of economic development based on externalization and the “bubbles-disease”; (4) the last section presents a critical appraisal of the quality of the energy data found in energy statistics. It starts with a discussion of the general goal of statistical accounting. It then introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the mechanism of accounting needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the current system of accounting in energy statistics; (ii) a critical appraisal of the current system of accounting in BP statistics; (iii) a critical appraisal of the current system of accounting in Eurostat statistics. The section ends by proposing an innovative way of representing energy statistics which can prove more useful for those wishing to develop sustainability indicators.
Abstract:
We propose a mixed finite element method for a class of nonlinear diffusion equations, which is based on their interpretation as gradient flows in optimal transportation metrics. We introduce an appropriate linearization of the optimal transport problem, which leads to a mixed symmetric formulation. This formulation preserves the maximum principle for the semi-discrete scheme as well as for the fully discrete scheme for a certain class of problems. In addition, solutions of the mixed formulation maintain exponential convergence in relative entropy towards the steady state in the case of a nonlinear Fokker-Planck equation with a uniformly convex potential. We demonstrate the behavior of the proposed scheme with 2D simulations of the porous medium equation and of blow-up questions in the Patlak-Keller-Segel model.
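As background on the exponential relative-entropy decay mentioned above, a standard statement is given here for the linear Fokker-Planck equation only, as a point of reference (the scheme's discrete entropy property and the nonlinear setting are detailed in the paper itself):

\[
\partial_t \rho = \nabla\cdot\big(\nabla\rho + \rho\,\nabla V\big), \qquad
\rho_\infty = \frac{e^{-V}}{\int e^{-V}\,dx}, \qquad
H(\rho\,|\,\rho_\infty) = \int \rho \log\frac{\rho}{\rho_\infty}\,dx,
\]
\[
\nabla^2 V \succeq \lambda\, I,\ \lambda > 0
\quad\Longrightarrow\quad
H(\rho(t)\,|\,\rho_\infty) \le e^{-2\lambda t}\, H(\rho_0\,|\,\rho_\infty).
\]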
Abstract:
We consider a nonlinear cyclin-content-structured model of a cell population divided into proliferative and quiescent cells. We show, for particular values of the parameters, the existence of solutions that do not depend on the cyclin content. We perform numerical simulations for the general case, obtaining convergence to the steady state for some values of the parameters but also oscillations of the population for others.
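A generic sketch of this class of model (illustrative only; the specific growth, transition and death terms of the paper may differ), with p and q the densities of proliferative and quiescent cells structured by cyclin content x:

\begin{align*}
\partial_t p(t,x) + \partial_x\big(g(x)\,p\big) &= -\big[d_P + L(N)\big]\,p + G(N)\,q,\\
\partial_t q(t,x) &= L(N)\,p - \big[d_Q + G(N)\big]\,q, \qquad
N(t) = \int \big(p+q\big)\,dx,
\end{align*}

where the nonlinearity enters through the transition rates L and G, which depend on the total population N.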
Abstract:
Piped water is used to remove hydration heat from concrete blocks during construction. In this paper we develop an approximate model for this process. The problem reduces to solving a one-dimensional heat equation in the concrete, coupled with a first-order differential equation for the water temperature. Numerical results are presented and the effect of varying the model parameters is shown. An analytical solution is also provided for a steady-state, constant heat generation model. This helps highlight the dependence on certain parameters and can therefore provide an aid in the design of cooling systems.
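As an indication of the kind of parameter dependence such an analytical solution exposes, consider a generic steady-state balance with constant volumetric heat generation Q, conductivity k, coolant temperature T_w at x = 0 and an insulated face at x = L (a sketch only; the paper's geometry and boundary conditions may differ):

\[
k\,\frac{d^2 T}{dx^2} + Q = 0, \qquad T(0) = T_w, \quad \frac{dT}{dx}\Big|_{x=L} = 0
\quad\Longrightarrow\quad
T(x) = T_w + \frac{Q}{k}\Big(Lx - \frac{x^2}{2}\Big),
\]

so the peak temperature rise above the coolant is $QL^2/(2k)$, making the roles of the generation rate, the block size and the conductivity explicit.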
Abstract:
This paper develops a simple model that can be used to analyze the long-term sustainability of the contributory pension system and the steady-state response of pension expenditure to changes in some key demographic and economic variables, in the characteristics of the average pensioner, and in the parameters that describe how pensions are calculated in Spain as a function of workers' Social Security contribution histories.
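A standard accounting decomposition of the kind such steady-state analyses build on (given here for orientation; it is not necessarily the paper's exact identity) is:

\[
\frac{\text{pension expenditure}}{\text{GDP}}
= \underbrace{\frac{\text{pensioners}}{\text{population }65+}}_{\text{eligibility}}
\times \underbrace{\frac{\text{population }65+}{\text{working-age population}}}_{\text{dependency ratio}}
\times \underbrace{\frac{\text{working-age population}}{\text{employed}}}_{\text{inverse employment rate}}
\times \underbrace{\frac{\text{average pension}}{\text{output per employed worker}}}_{\text{benefit ratio}},
\]

so that steady-state expenditure responds multiplicatively to demographics, employment and the generosity of the benefit formula.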
Abstract:
Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete-event simulation can be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
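For concreteness, a minimal sketch of the plain Poisson τ-leap step on a toy birth-death system is given below. This illustrates only the baseline method that the paper extends to Runge-Kutta frameworks; the toy rates, the fixed τ and all names are arbitrary choices for the example, not the paper's test cases.

```python
# Minimal Poisson tau-leap sketch for a toy birth-death system.
# Illustrates only the plain tau-leap step (NOT the paper's RK tau-leap extension).
import numpy as np

# Toy system: 0 -> A (rate k1), A -> 0 (rate k2 * A)
k1, k2 = 10.0, 0.1
stoich = np.array([+1, -1])            # change in copy number of A for each channel

def propensities(x):
    """Channel propensities a_j(x) for the current copy number x."""
    return np.array([k1, k2 * x])

def tau_leap(x0, tau, t_end, seed=0):
    """Advance the state with a fixed leap size tau, firing each channel
    a Poisson(a_j * tau) number of times per step."""
    rng = np.random.default_rng(seed)
    t, x, traj = 0.0, x0, [(0.0, x0)]
    while t < t_end:
        a = propensities(x)
        fires = rng.poisson(a * tau)               # firings per channel in this leap
        x = max(x + int(np.dot(stoich, fires)), 0) # keep the copy number non-negative
        t += tau
        traj.append((t, x))
    return traj

if __name__ == "__main__":
    samples = [x for t, x in tau_leap(x0=0, tau=0.5, t_end=500.0)[200:]]
    print("mean ~", np.mean(samples), " var ~", np.var(samples))
```

In this toy model the exact steady state is Poisson with mean k1/k2 = 100; running the sketch with a large τ reproduces the mean but inflates the variance, which is precisely the error the extended RK τ-leap methods aim to control.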
Abstract:
Peroxiredoxins are known to interact with hydrogen peroxide (H2O2) and to participate in oxidant scavenging, redox signal transduction, and heat-shock responses. The two-cysteine peroxiredoxin Tpx1 of Schizosaccharomyces pombe has been characterized as the H2O2 sensor that transduces the redox signal to the transcription factor Pap1. Here, we show that Tpx1 is essential for aerobic, but not anaerobic, growth. We demonstrate that Tpx1 has an exquisite sensitivity for its substrate, which explains its participation in maintaining low steady-state levels of H2O2. We also show in vitro and in vivo that inactivation of Tpx1 by oxidation of its catalytic cysteine to a sulfinic acid is always preceded by a sulfinic acid form in a covalently linked dimer, which may be important for understanding the kinetics of Tpx1 inactivation. Furthermore, we provide evidence that a strain expressing Tpx1.C169S, lacking the resolving cysteine, can sustain aerobic growth, and we show that small reductants can modulate the activity of the mutant protein in vitro, probably by supplying a thiol group to substitute for cysteine 169.
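The link between substrate sensitivity and low steady-state H2O2 can be seen from a generic steady-state balance (a kinetic sketch with hypothetical rate constants, not the paper's measured parameters):

\[
v_{\text{production}} = \big(k_{\text{Tpx1}}\,[\text{Tpx1}] + k_{\text{other}}\big)\,[\text{H}_2\text{O}_2]_{ss}
\quad\Longrightarrow\quad
[\text{H}_2\text{O}_2]_{ss} = \frac{v_{\text{production}}}{k_{\text{Tpx1}}\,[\text{Tpx1}] + k_{\text{other}}},
\]

so a very high second-order rate constant $k_{\text{Tpx1}}$ keeps the steady-state concentration low for a given production rate.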
Abstract:
In the first part of this paper we try to test the relationship between mothers' earnings, fertility and children's work in the Spanish (Catalan) context of the first third of the 20th century. Specific human capital investment by adult working women resulted in a sharp increase in their real wage and in the opportunity cost of time devoted to housework, including child rearing. Fertility is endogenous to the model and decreases as a result of women's real wage increases. Human capital investment by labouring women and mandatory schooling of children shift the labour supply function to a new steady state with a steeper slope. According to recent papers, this model applies to 20th-century Spain and explains the abolition of children's work. Nonetheless, the model does not apply to 20th-century Latin America. Despite the positive evolution of literacy and life expectancy in this region, other factors led to poor returns on educational human capital investment. In this paper we highlight the role of the increasing share of the informal sector of the economy, which relied on women's and children's work. Second, we stress the role of the evolution of high income inequality and of endogamic school supply in explaining why increasing literacy did not lead to more remarkable human capital improvements.
Abstract:
In this paper we present a simple theory-based measure of the variations in aggregate economic efficiency: the gap between the marginal product of labor and the household's consumption/leisure tradeoff. We show that this indicator corresponds to the inverse of the markup of price over social marginal cost, and give some evidence in support of this interpretation. We then show that, with some auxiliary assumptions, our gap variable may be used to measure the efficiency costs of business fluctuations. We find that the latter costs are modest on average. However, to the extent that the flexible-price equilibrium is distorted, the gross efficiency losses from recessions and gains from booms may be large. Indeed, we find that the major recessions involved large efficiency losses. These results hold for reasonable parameterizations of the Frisch elasticity of labor supply, the coefficient of relative risk aversion, and steady-state distortions.
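One stylized way to see the stated correspondence, under the simplifying assumption of a competitive labour market on the household side (a sketch; the paper's precise definitions may differ):

\[
MC = \frac{W}{MPN}, \qquad MRS = \frac{W}{P}
\quad\Longrightarrow\quad
\frac{P}{MC} = \frac{MPN}{MRS},
\]

so that, in logs, the gap $mrs_t - mpn_t$ equals $\log(MC/P)$, i.e. it moves one-for-one with the inverse of the markup of price over marginal cost.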
Abstract:
In this paper, we develop a general equilibrium model of crime and show that law enforcement has different roles depending on the equilibrium characterization and the value of social norms. When an economy has a unique stable equilibrium in which a fraction of the population is productive and the remainder engages in predation, the government can choose an optimal law enforcement policy to maximize a welfare function evaluated at the steady state. If such a steady state is not unique, law enforcement is still relevant, but in a completely different way, because the steady state that prevails depends on the initial proportions of productive and predator individuals in the economy. The relative importance of these proportions can be changed through law enforcement policy.
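A heavily stylized sketch of the kind of dynamics described (illustrative only; the functional forms are hypothetical and not the paper's), with p the share of predators and e the level of law enforcement:

\[
\dot{p} = p\,(1-p)\,\big[\,V_D(p,e) - V_P(p)\,\big],
\]

where $V_D$ and $V_P$ are the payoffs to predation and production. Steady states occur at $p=0$, $p=1$ and at any interior $p^{*}$ where the payoff gap vanishes; with a unique stable steady state, $e$ can be chosen to maximize welfare at $p^{*}$, whereas with multiple stable steady states the long-run outcome depends on the initial $p$ and enforcement matters instead through its effect on the basins of attraction.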
Abstract:
The paper proposes a technique to jointly test for groupings of unknown size in the cross-sectional dimension of a panel and estimate the parameters of each group, and applies it to identifying convergence clubs in per capita income. The approach uses the predictive density of the data, conditional on the parameters of the model. The steady-state distribution of European regional data clusters around four poles of attraction with different economic features. The distribution of per capita income of OECD countries has two poles of attraction, and each group has clearly identifiable economic characteristics.