938 results for Price dynamics model with memory
Abstract:
Introduction.- Knowledge of predictors of an unfavourable outcome, e.g. non-return to work after an injury, makes it possible to identify patients at risk and to target interventions at modifiable predictors. It has recently been shown that INTERMED, a tool measuring biopsychosocial complexity in four domains (biological, psychological, social and care, with a score between 0 and 60 points), can be useful in this context. The aim of this study was to set up a predictive model for non-return to work using INTERMED in patients undergoing vocational rehabilitation after orthopaedic injury. Patients and methods.- In this longitudinal prospective study, the cohort consisted of 2156 consecutively included inpatients with orthopaedic trauma attending a rehabilitation hospital after a work-, traffic- or sport-related injury. Two years after discharge, a questionnaire regarding return to work was sent out (1502 questionnaires were returned). In addition to INTERMED, 18 predictors known at the baseline of rehabilitation were selected based on previous research. A multivariable logistic regression was performed. Results.- In the multivariable model, not returning to work at 2 years was significantly predicted by the INTERMED score: odds ratio (OR) 1.08 (95% confidence interval, CI [1.06; 1.11]) for a one-point increase on the scale; by qualified work status before the injury, OR = 0.74, CI (0.54; 0.99); by using French as preferred language, OR = 0.60, CI (0.45; 0.80); by upper-extremity injury, OR = 1.37, CI (1.03; 1.81); by higher education (> 9 years), OR = 0.74, CI (0.55; 1.00); and by a 10-year increase in age, OR = 1.15, CI (1.02; 1.29). The area under the receiver operating characteristic (ROC) curve was 0.733 for the full model (INTERMED plus 18 variables). Discussion.- These results confirm that the total INTERMED score is a significant predictor of return to work. The full model with 18 predictors combined with the total INTERMED score has good predictive value. However, the number of variables to measure (19) is too high for use as a screening tool in the clinic.
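As a rough illustration of how a logistic model turns a reported odds ratio into an individual risk prediction, the sketch below converts the INTERMED odds ratio (1.08 per point) into a predicted probability. The intercept and the single-predictor form are invented for illustration; they are not the study's fitted 19-variable model.

```python
import math

def non_rtw_risk(intermed_score, or_per_point=1.08, baseline_log_odds=-1.5):
    """Predicted probability of non-return to work from the INTERMED score
    alone. The OR of 1.08 per point is from the abstract; the intercept
    (baseline_log_odds) is a made-up illustration, not the fitted model."""
    log_odds = baseline_log_odds + math.log(or_per_point) * intermed_score
    return 1.0 / (1.0 + math.exp(-log_odds))
```

Each additional INTERMED point multiplies the odds by 1.08, so the predicted risk rises monotonically with the score.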
Abstract:
In this paper we examine the effect of tax policy on the relationship between inequality and growth in a two-sector non-scale model. In non-scale models, the long-run equilibrium growth rate is determined by technological parameters and is independent of macroeconomic policy instruments. However, this fact does not imply that fiscal policy is unimportant for long-run economic performance. It does have important effects on the levels of key economic variables such as the per capita stock of capital and output. Hence, although the economy grows at the same rate across steady states, the bases for economic growth may differ. The model has three essential features: first, we explicitly model skill accumulation; second, we introduce government finance into the production function; and third, we introduce an income tax to mirror the fiscal events of the 1980s and 1990s in the US. The fact that the non-scale model is associated with higher-order dynamics enables it to replicate the distinctly non-linear nature of inequality in the US with relative ease. The results derived in this paper draw attention to the fact that the non-scale growth model not only fits the US data well in the long run (Jones, 1995b) but also possesses unique abilities in explaining short-term fluctuations of the economy. It is shown that during the transition the response of the simulated relative wage to changes in the tax code is rather non-monotonic, in accordance with the US inequality pattern of the 1980s and early 1990s. More specifically, we have analyzed in detail the dynamics following the simulation of an isolated tax decrease and an isolated tax increase. After a tax decrease, the skill premium follows a lower trajectory than the one it would follow without the tax decrease; hence inequality is reduced for several periods after the fiscal shock. On the contrary, following a tax increase, the skill premium remains above the trajectory it would follow with no tax increase. Consequently, a tax increase implies a higher level of inequality in the economy.
Abstract:
This paper analyzes the issue of the interiority of the optimal population growth rate in a two-period overlapping generations model with endogenous fertility. Using Cobb-Douglas utility and production functions, we show that the introduction of a cost of raising children allows for the possibility of the existence of an interior global maximum in the planner's problem, contrary to the exogenous fertility case.
Abstract:
We consider stochastic partial differential equations with multiplicative noise. We derive an algorithm for the computer simulation of these equations. The algorithm is applied to study domain growth of a model with a conserved order parameter. The numerical results corroborate previous analytical predictions obtained by linear analysis.
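A minimal sketch of the kind of algorithm described: an explicit Euler-Maruyama scheme for a 1-D stochastic PDE with multiplicative noise on a periodic grid. The specific equation (diffusion plus a noise term proportional to the local field), the parameters, and the discretization are illustrative assumptions, not the paper's model.

```python
import numpy as np

def simulate_spde(n=64, steps=500, dt=1e-4, dx=1.0, D=1.0, amp=0.1, seed=0):
    """Euler-Maruyama integration of du = D*u_xx dt + amp*u dW on a periodic
    1-D grid: a deterministic diffusion step plus a multiplicative noise term
    proportional to the local field. Equation and parameters are illustrative."""
    rng = np.random.default_rng(seed)
    u = 1.0 + 0.01 * rng.standard_normal(n)
    for _ in range(steps):
        lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2
        dW = np.sqrt(dt) * rng.standard_normal(n)  # independent Wiener increments
        u = u + D * lap * dt + amp * u * dW
    return u
```

The noise increment scales as sqrt(dt), which is the essential difference from a deterministic Euler step.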
Abstract:
An Ising-like model, with interactions ranging up to next-nearest-neighbor pairs, is used to simulate the process of interface alloying. Interactions are chosen to stabilize an intermediate "antiferromagnetic" ordered structure. The dynamics proceeds exclusively by atom-vacancy exchanges. In order to characterize the process, the time evolution of the width of the intermediate ordered region and the diffusion length is studied. Both lengths are found to follow a power-law evolution with exponents depending on the characteristic features of the model.
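The atom-vacancy exchange dynamics can be sketched as a single Metropolis move on a toy lattice gas. The paper's model is two-dimensional with interactions up to next-nearest neighbors, so everything below (1-D lattice, nearest-neighbor coupling `J`, energy function) is a simplified stand-in.

```python
import math
import random

def vacancy_exchange_step(lattice, beta, J=1.0, rng=None):
    """One attempted atom-vacancy swap with Metropolis acceptance on a 1-D
    periodic lattice. Sites hold +1/-1 (two atomic species) or 0 (a vacancy).
    Composition is conserved because moves are exchanges, never creations."""
    rng = rng or random.Random(0)
    n = len(lattice)
    energy = lambda cfg: -J * sum(cfg[k] * cfg[(k + 1) % n] for k in range(n))
    i = rng.choice([k for k, s in enumerate(lattice) if s == 0])  # pick a vacancy
    j = (i + rng.choice([-1, 1])) % n                             # random neighbor
    trial = lattice[:]
    trial[i], trial[j] = trial[j], trial[i]
    dE = energy(trial) - energy(lattice)
    if dE <= 0 or rng.random() < math.exp(-beta * dE):
        return trial          # exchange accepted
    return lattice            # exchange rejected
```

Because the only move is an exchange, the numbers of atoms and vacancies are invariants of the dynamics, mirroring the conserved composition in the interface-alloying process.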
Abstract:
This work presents an analysis of hysteresis and dissipation in quasistatically driven disordered systems. The study is based on the random-field Ising model with fluctuationless dynamics. It enables us to separate the fraction of the energy input by the driving field that is stored in the system from the fraction dissipated at each step of the transformation. The dissipation is directly related to the occurrence of avalanches, and does not scale with the size of the Barkhausen magnetization jumps. In addition, the change in magnetic field between avalanches provides a measure of the energy barriers between consecutive metastable states.
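A zero-temperature sketch of quasistatic avalanche dynamics in a driven random-field Ising chain, in the spirit of the fluctuationless dynamics described. The 1-D geometry, Gaussian random fields, and driving protocol are illustrative choices; the gaps between the returned fields correspond to the inter-avalanche field changes the abstract relates to energy barriers.

```python
import numpy as np

def avalanche_fields(n=30, disorder=1.0, J=1.0, seed=0):
    """Ascending branch of a zero-temperature random-field Ising chain:
    raise H just enough to destabilize one spin, then relax the resulting
    avalanche at fixed H. Returns the field at which each avalanche starts."""
    rng = np.random.default_rng(seed)
    h = rng.normal(0.0, disorder, n)          # quenched random fields
    s = -np.ones(n, dtype=int)
    fields = []
    while (s == -1).any():
        down = np.where(s == -1)[0]
        nb = s[(down - 1) % n] + s[(down + 1) % n]
        H = (-(J * nb + h[down])).min()       # smallest field that flips a spin
        fields.append(float(H))
        changed = True
        while changed:                        # relax the avalanche at fixed H
            changed = False
            for i in np.where(s == -1)[0]:
                if J * (s[(i - 1) % n] + s[(i + 1) % n]) + h[i] + H >= 0:
                    s[i] = 1
                    changed = True
    return fields
```

Between avalanches every remaining down spin is stable, so the triggering fields form a nondecreasing sequence along the ascending branch of the hysteresis loop.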
Abstract:
We consider the two Higgs doublet model extension of the standard model in the limit where all physical scalar particles are very heavy, too heavy, in fact, to be experimentally produced in forthcoming experiments. The symmetry-breaking sector can thus be described by an effective chiral Lagrangian. We obtain the values of the coefficients of the O(p4) operators relevant to the oblique corrections and investigate to what extent some nondecoupling effects may remain at low energies. A comparison with recent CERN LEP data shows that this model is indistinguishable from the standard model with one doublet and with a heavy Higgs boson, unless the scalar mass splittings are large.
Abstract:
The effect of external fluctuations on the formation of spatial patterns is analyzed by means of a stochastic Swift-Hohenberg model with multiplicative space-correlated noise. Numerical simulations in two dimensions show a shift of the bifurcation point controlled by the intensity of the multiplicative noise. This shift takes place in the ordering direction (i.e., produces patterns), but its magnitude decreases with that of the noise correlation length. Analytical arguments are presented to explain these facts.
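One explicit Euler step of a stochastic Swift-Hohenberg equation with multiplicative noise can be sketched as below. The paper works in two dimensions with space-correlated noise; this 1-D, white-noise version with periodic boundaries is only a sketch, and all parameter values are assumptions.

```python
import numpy as np

def sh_step(u, rng, dt=0.01, eps=0.3, amp=0.1, dx=1.0):
    """One explicit Euler step of du = [eps - (1 + d2/dx2)^2] u - u^3 + amp*u*xi
    with multiplicative white noise xi, on a periodic 1-D grid."""
    lap = lambda f: (np.roll(f, 1) - 2.0 * f + np.roll(f, -1)) / dx**2
    linear = eps * u - (u + 2.0 * lap(u) + lap(lap(u)))  # eps*u - (1+lap)^2 u
    xi = np.sqrt(dt) * rng.standard_normal(u.size)
    return u + (linear - u**3) * dt + amp * u * xi
```

Iterating this step and checking whether a periodic pattern emerges as `eps` is varied is the numerical experiment behind a noise-induced shift of the bifurcation point.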
Abstract:
compatible with the usual nonlocal model governed by surface tension that results from a macroscopic description. To explore this discrepancy, we exhaustively analyze numerical integrations of a phase-field model with dichotomic columnar disorder. We find that two distinct behaviors are possible depending on the capillary contrast between the two values of disorder. In a high-contrast case, where interface evolution is mainly dominated by the disorder, an inherent anomalous scaling is always observed. Moreover, in agreement with experimental work, the interface motion has to be described through a local model. On the other hand, in a lower-contrast case, the interface is dominated by interfacial tension and can be well modeled by a nonlocal model. We have studied both spontaneous and forced-flow imbibition situations, giving a complete set of scaling exponents in each case, as well as a comparison to the experimental results.
Abstract:
We study the influence of disorder strength on the interface roughening process in a phase-field model with locally conserved dynamics. We consider two cases in which the mobility coefficient multiplying the locally conserved current is either constant throughout the system (the two-sided model) or becomes zero in the phase into which the interface advances (the one-sided model). In the limit of weak disorder, both models are completely equivalent and can reproduce the physical process of a fluid diffusively invading a porous medium, where super-rough scaling of the interface fluctuations occurs. On the other hand, increasing disorder causes the scaling properties to change to intrinsic anomalous scaling. In the limit of strong disorder this behavior prevails for the one-sided model, whereas for the two-sided case, nucleation of domains in front of the invading front is observed.
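Scaling exponents of this kind are typically extracted by fitting a power law to the interface width. The helper below fits the growth exponent beta from W(t) ~ t^beta by least squares in log-log coordinates; distinguishing super-rough from intrinsic anomalous scaling additionally requires local widths or the structure factor, which this sketch does not compute.

```python
import math

def growth_exponent(times, widths):
    """Least-squares slope of log W(t) vs log t, i.e. the growth exponent
    beta of kinetic roughening W(t) ~ t^beta. Only the global width is
    fitted here; anomalous-scaling diagnostics need more observables."""
    xs = [math.log(t) for t in times]
    ys = [math.log(w) for w in widths]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```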
Abstract:
OBJECTIVE: This study aimed to assess the impact of individual comorbid conditions, as well as the weight assignment, predictive properties and discriminating power of the Charlson Comorbidity Index (CCI), on outcome in patients with acute coronary syndrome (ACS). METHODS: A prospective multicentre observational study (AMIS Plus Registry) from 69 Swiss hospitals with 29 620 ACS patients enrolled from 2002 to 2012. The main outcome measures were in-hospital and 1-year follow-up mortality. RESULTS: Of the patients, 27% were female (age 72.1 ± 12.6 years) and 73% were male (64.2 ± 12.9 years); 46.8% had comorbidities, and these patients were less likely to receive guideline-recommended drug therapy and reperfusion. Heart failure (adjusted OR 1.88; 95% CI 1.57 to 2.25), metastatic tumours (OR 2.25; 95% CI 1.60 to 3.19), renal disease (OR 1.84; 95% CI 1.60 to 2.11) and diabetes (OR 1.35; 95% CI 1.19 to 1.54) were strong predictors of in-hospital mortality. In this population, the CCI weighted a history of prior myocardial infarction higher (1 instead of -0.4, 95% CI -1.2 to 0.3 points) but heart failure (1 instead of 3.7, 95% CI 2.6 to 4.7) and renal disease (2 instead of 3.5, 95% CI 2.7 to 4.4) lower than the benchmark, in which all comorbidities, age and gender were used as predictors. However, the model with CCI and age discriminates identically to this benchmark (areas under the receiver operating characteristic curves were both 0.76). CONCLUSIONS: Comorbidities greatly influenced clinical presentation, therapies received and the outcome of patients admitted with ACS. Heart failure, diabetes, renal disease and metastatic tumours had a major impact on mortality. The CCI seems to be an appropriate prognostic indicator for in-hospital and 1-year outcomes in ACS patients. ClinicalTrials.gov Identifier: NCT01305785.
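The discrimination comparison above rests on the area under the ROC curve. A compact way to compute it is via the Mann-Whitney statistic, sketched below on hypothetical risk scores (the registry data are of course not reproduced here):

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive case (e.g. died) receives
    a higher risk score than a randomly chosen negative case, counting
    ties as half. Two risk models (e.g. CCI + age vs a full benchmark)
    can be compared on discrimination alone by comparing these areas."""
    pairs = len(scores_pos) * len(scores_neg)
    wins = sum((p > q) + 0.5 * (p == q) for p in scores_pos for q in scores_neg)
    return wins / pairs
```

An AUC of 0.5 means no discrimination; 1.0 means perfect ranking of positives above negatives.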
Abstract:
General Summary. Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity enhancing. The last chapter is not about how to better understand the world but how to measure it, and it was a great pleasure to work on. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much, and how fast? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize the results in Google Earth. A short summary of each of the five chapters is provided below.
The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank, and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, for which we have no a priori expectations about the signs of these effects. Popular fears about the trade effects of differences in environmental regulations might therefore be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First, we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labour that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition). We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes.
Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. We next construct, in a first experiment, a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (neglecting price effects) to compute a static first-order trade effect. This effect increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, it is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum and minimum possible levels of SO2 emissions. These hypothetical levels are obtained by reallocating labour across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that emissions are 90% lower than in the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings from this chapter accord with those of chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", consists of a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension.
This allows us to write present productivity formally as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions. First, it provides new estimates of orders of magnitude for the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it tracks, in a geometrically rigorous way, the path of the world's economic center of gravity.
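The physical center-of-mass construction used for the world's economic center of gravity can be sketched as follows: weight each city's position vector on the unit sphere by its economic mass, average in 3-D, and project back onto the sphere. Any coordinates fed to this sketch are illustrative, not the thesis data.

```python
import math

def center_of_gravity(cities):
    """Weighted 3-D center of mass of points on the globe, projected back
    onto the sphere. `cities` is a list of (lat_deg, lon_deg, weight)."""
    x = y = z = total = 0.0
    for lat, lon, w in cities:
        la, lo = math.radians(lat), math.radians(lon)
        x += w * math.cos(la) * math.cos(lo)
        y += w * math.cos(la) * math.sin(lo)
        z += w * math.sin(la)
        total += w
    x, y, z = x / total, y / total, z / total
    # project the interior point back onto the sphere surface
    return (math.degrees(math.atan2(z, math.hypot(x, y))),
            math.degrees(math.atan2(y, x)))
```

For example, two equal weights on the equator at longitudes 0° and 90° give a center of gravity at latitude 0°, longitude 45°.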
Abstract:
All derivations of the one-dimensional telegrapher's equation based on the persistent random walk model assume a constant speed of signal propagation. We generalize the model here to allow for a variable propagation speed and study several limiting cases in detail. We also show the connections of this model with anomalous diffusion behavior and with inertial dichotomous processes.
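The constant-speed persistent random walk underlying the telegrapher's equation is easy to simulate: the walker moves at speed v and reverses direction at a Poisson rate. The sketch below uses the constant-speed case; the paper's generalization would make the speed vary, e.g. with position or time.

```python
import random

def persistent_walk(T=10.0, v=1.0, flip_rate=1.0, dt=0.01, seed=0):
    """Persistent random walk: move at speed v, reverse direction with
    probability flip_rate*dt per step (first-order Poisson reversals).
    Returns the position at time T; |x| is bounded by v*T."""
    rng = random.Random(seed)
    x, direction, t = 0.0, 1, 0.0
    while t < T:
        x += direction * v * dt
        if rng.random() < flip_rate * dt:   # Poisson direction reversal
            direction = -direction
        t += dt
    return x
```

The finite propagation speed is what distinguishes this process from ordinary diffusion: no realization can travel farther than v*T.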
Abstract:
We consider damage spreading transitions in the framework of mode-coupling theory. This theory describes relaxation processes in glasses in the mean-field approximation, which are known to be characterized by the presence of an exponentially large number of metastable states. For systems evolving under identical but arbitrarily correlated noises, we demonstrate that there exists a critical temperature T0 which separates two different dynamical regimes, depending on whether damage spreads or not in the asymptotic long-time limit. This transition exists for generic noise correlations such that the zero-damage solution is stable at high temperatures, and is minimal for maximal noise correlations. Although this dynamical transition depends on the type of noise correlations, we show that the asymptotic damage has the good properties of a dynamical order parameter, such as (i) independence of the initial damage; (ii) independence of the class of initial condition; and (iii) stability of the transition in the presence of asymmetric interactions which violate detailed balance. For maximally correlated noises we suggest that damage spreading occurs due to the presence of a divergent number of saddle points (as well as metastable states) in the thermodynamic limit, a consequence of the ruggedness of the free-energy landscape which characterizes the glassy state. These results are then compared to extensive numerical simulations of a mean-field glass model (the Bernasconi model) with Monte Carlo heat-bath dynamics. The freedom to choose arbitrary noise correlations in Langevin dynamics makes damage spreading an interesting tool to probe the ruggedness of the configurational landscape.
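The maximally correlated noise protocol can be sketched as follows: two replicas differing in a single spin are updated with heat-bath dynamics using the same random numbers, and the damage is their Hamming distance per site. A toy ferromagnetic 1-D chain stands in here for the Bernasconi model of the paper.

```python
import math
import random

def damage_spreading(steps=500, beta=0.3, n=20, seed=1):
    """Two replicas of a 1-D Ising chain under heat-bath dynamics with
    IDENTICAL random numbers (maximally correlated noise), starting one
    spin apart. Returns the final Hamming distance per site (the damage)."""
    rng = random.Random(seed)
    a = [rng.choice([-1, 1]) for _ in range(n)]
    b = a[:]
    b[0] = -b[0]                            # initial single-site damage
    for _ in range(steps):
        i = rng.randrange(n)
        r = rng.random()                    # same noise for both replicas
        for cfg in (a, b):
            h = cfg[(i - 1) % n] + cfg[(i + 1) % n]
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * h))
            cfg[i] = 1 if r < p_up else -1
    return sum(x != y for x, y in zip(a, b)) / n
```

Whether this damage heals or survives at long times, as temperature and noise correlations are varied, is precisely the diagnostic the paper uses to probe the free-energy landscape.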