831 results for Equilibrium Measure
Abstract:
The “cotton issue” has been a topic of several academic discussions among trade policy analysts. However, the design of trade and agricultural policy in the EU and the USA has become a politically sensitive matter over the last five years. This study, using the Agricultural Trade Policy Simulation Model (ATPSM), aims to gain insights into the global cotton market, to explain why domestic support for cotton has become an issue, to quantify the impact of the new EU agricultural policy on the cotton sector, and to measure the effect of eliminating support policies on production and trade. Results indicate that full trade liberalization would give the four West African countries better terms of trade with the EU. If tariff reduction follows the so-called Swiss formula, world prices would increase by 3.5%.
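For readers unfamiliar with it, the Swiss formula mentioned above is a harmonizing tariff-cut rule; a minimal statement, where the negotiated coefficient a also caps the post-cut tariff:

    \[ t_{\text{new}} = \frac{a \, t_{\text{old}}}{a + t_{\text{old}}} \]

Because high tariffs are cut proportionally more than low ones, the formula delivers the kind of across-the-board liberalization whose world-price effect the study estimates.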
Abstract:
We investigate the eigenvalue statistics of ensembles of normal random matrices as their order N tends to infinity. In the model, the eigenvalues have uniform density within a region determined by a simple analytic polynomial curve. We study the conformal deformations of equilibrium measures of normal random ensembles to the real line and give sufficient conditions for them to converge weakly to a Wigner measure.
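For context, the Wigner (semicircle) measure appearing as the limit here has an explicit density; in the standard normalization,

    \[ d\mu_W(x) = \frac{1}{2\pi} \sqrt{4 - x^2} \, \mathbf{1}_{[-2,2]}(x) \, dx. \]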
Abstract:
In this paper we prove convergence to a chaotic sunspot equilibrium under two learning rules used in the bounded rationality literature. The first shows convergence of the actual dynamics generated by simple adaptive learning rules to a probability distribution that is close to the stationary measure of the sunspot equilibrium; since this stationary measure is absolutely continuous, the convergence to the stochastic equilibrium is robust. The second is based on the E-stability criterion for testing the stability of rational expectations equilibria: we show that the conditional probability distribution defined by the sunspot equilibrium is expectationally stable under a reasonable updating rule for this parameter. We also report some numerical simulations of the proposed processes.
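A minimal sketch of the kind of adaptive updating rule alluded to above, in stochastic-approximation form (the gain sequence and the observed signal h(x_t) are illustrative assumptions, not the paper's exact specification):

    \[ \theta_{t+1} = \theta_t + \gamma_t \big( h(x_t) - \theta_t \big), \qquad \gamma_t = 1/t, \]

so that the belief parameter \theta_t averages past observations of the sunspot process.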
Abstract:
In a general equilibrium model, we show that the value of the equilibrium real exchange rate is affected by its own volatility. Risk-averse exporters, who make their exporting decision before observing the realization of the real exchange rate, choose to export less the more volatile the real exchange rate is. The trade balance and the variance of the real exchange rate are therefore negatively related. An increase in the volatility of the real exchange rate, for instance, deteriorates the trade balance, and to restore equilibrium a real exchange rate depreciation has to take place. In the empirical part of the paper we use the traditional (unconditional) standard deviation of RER changes as our measure of RER volatility. We describe the behavior of RER volatility for Brazil, Argentina and Mexico, using monthly data for the three countries and daily data for Brazil. Interesting patterns of volatility can be associated with the nature of the several stabilization plans adopted in those countries and with changes in the exchange rate regimes.
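A minimal sketch of the volatility measure described above, computed as the sample standard deviation of log RER changes; the function name and data values are hypothetical:

    import numpy as np

    def rer_volatility(rer):
        # Unconditional volatility: standard deviation of log changes in the RER
        log_changes = np.diff(np.log(np.asarray(rer, dtype=float)))
        return log_changes.std(ddof=1)

    # Hypothetical monthly RER index values
    monthly_rer = [100.0, 102.3, 98.7, 101.1, 105.4, 103.9]
    print(rer_volatility(monthly_rer))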
Abstract:
We consider private value auctions where bidders’ types are dependent, a case usually treated by assuming affiliation. We show that affiliation is a restrictive assumption in three senses: topological, measure-theoretic and statistical (affiliation is a very restrictive characterization of positive dependence). We also show that affiliation’s main implications do not generalize to alternative definitions of positive dependence. From this, we propose new approaches to the problems of pure strategy equilibrium existence in first-price auctions (PSEE) and the characterization of the revenue ranking of auctions. For equilibrium existence, we slightly restrict the set of distributions considered, without loss of economic generality, and offer a complete characterization of PSEE. For revenue ranking, we obtain a characterization of the expected revenue differences between second- and first-price auctions with general dependence of types.
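For reference, the affiliation condition at issue (due to Milgrom and Weber): a joint density f of the type vector is affiliated if, with ∨ and ∧ denoting the componentwise maximum and minimum,

    \[ f(x \vee y) \, f(x \wedge y) \;\geq\; f(x) \, f(y) \quad \text{for all } x, y. \]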
Abstract:
In this paper we consider an equilibrium last-passage percolation model on an environment given by a compound two-dimensional Poisson process. We prove an L^2 formula relating the initial measure to the last-passage percolation time. This formula turns out to be a useful tool for analyzing the fluctuations of the last-passage times along non-characteristic directions.
Abstract:
We study a probabilistic model of interacting spins indexed by elements of a finite subset of the d-dimensional integer lattice, d ≥ 1. Conditions of time reversibility are examined. It is shown that the model's equilibrium distribution converges to a limit distribution as the indexing set expands to the whole lattice. The occupied-site percolation problem is solved for the limit distribution. Two models with similar dynamics are also discussed.
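The time-reversibility conditions examined in models of this kind are typically detailed-balance conditions with respect to the equilibrium distribution π; in Markov-chain form, with q the transition rates between spin configurations,

    \[ \pi(x) \, q(x, y) = \pi(y) \, q(y, x) \quad \text{for all configurations } x, y. \]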
Abstract:
In this research, a modification to initiation-aid ignition in bomb calorimetry is developed that involves systematically blending levels of boron and potassium nitrate initiation aids with a bulk structural energetic elemental powder blend. A regression is used to estimate the nominal heat of reaction for the primary reaction. The technique is first applied to the synthesis of TiB2 as a validation study to check whether values close to those in the literature can be achieved. The technique is then applied to two systems of interest, Al-Ti-B and Al-Ti-B4C. In all three investigations, X-ray diffraction is used to characterize the product phases of the reactions, determining their extent and identity as well as any by-products that may have formed as a result of adding the initiation aid. The experimental data indicate that the technique approximates the heat of reaction for the synthesis of TiB2 from Ti-B powder blends, and the formation of TiB2 is supported by volume-fraction analysis by X-ray diffraction. Application to the Al-Ti-B and Al-Ti-B4C blends shows some correlation with variation of the initiation aid, with X-ray diffraction showing the formation of equilibrium products. However, these blends require further investigation to resolve more complex interactions and rule out extraneous variables.
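A minimal sketch of the regression step described above, under the assumption (ours, not stated in the abstract) that measured heats are regressed on the initiation-aid fraction and extrapolated to zero aid; all data values are hypothetical:

    import numpy as np

    # Hypothetical data: mass fraction of B/KNO3 initiation aid vs. measured heat (J/g)
    aid_fraction = np.array([0.05, 0.10, 0.15, 0.20])
    measured_heat = np.array([2950.0, 3120.0, 3280.0, 3460.0])

    # Least-squares linear fit; the intercept estimates the heat of the
    # primary reaction with no initiation aid present
    slope, intercept = np.polyfit(aid_fraction, measured_heat, 1)
    print(f"Estimated nominal heat of reaction: {intercept:.0f} J/g")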
Abstract:
Conventional differential scanning calorimetry (DSC) techniques are commonly used to quantify the solubility of drugs within polymeric controlled-delivery systems. However, the nature of the DSC experiment, and in particular the relatively slow heating rates employed, limits its use to the measurement of drug solubility at the drug's melting temperature. Here, we describe the application of hyper-DSC (HDSC), a variant of DSC involving extremely rapid heating rates, to the calculation of the solubility of a model drug, metronidazole, in silicone elastomer, and demonstrate that the faster heating rates permit the solubility to be calculated under non-equilibrium conditions, such that it better approximates the solubility at the temperature of use. At a heating rate of 400°C/min (HDSC), metronidazole solubility was calculated to be 2.16 mg/g, compared with 6.16 mg/g at 20°C/min.
Abstract:
A study of vapour-liquid equilibria is presented together with current developments. The theory of vapour-liquid equilibria is discussed, and both experimental and predictive methods for obtaining vapour-liquid equilibrium (VLE) data are critically reviewed. The development of a new family of equilibrium stills to measure experimental VLE data from sub-atmospheric pressure to 35 bar is described. Existing experimental techniques are reviewed to highlight the need for these new apparatuses and their major attributes. Details are provided of how the apparatus may be further improved and how computer control may be implemented. To provide a rigorous test of the apparatus, the stills were commissioned using an acetic acid-water mixture at one atmosphere pressure. A Barker-type consistency-test computer program, which allows for association in both phases, has been applied to the data generated and clearly shows that the stills produce data of very high quality. Two high-quality data sets for the acetone-chloroform mixture have been generated at one atmosphere and at 64.3 °C. These data are used to investigate the ability of the novel technique, based on molecular parameters, to predict VLE data for highly polar mixtures. Eight vapour-liquid equilibrium data sets have been produced for the cyclohexane-ethanol mixture at one atmosphere, at 2, 4, 6, 8 and 11 bar, and at 90.9 °C and 132.8 °C. These data sets have been tested for thermodynamic consistency using a Barker-type fitting package and shown to be of high quality. The data have been used to investigate the dependence of the UNIQUAC parameters on temperature, and in addition to compare directly the performance of the predictive methods: original UNIFAC, a modified version of UNIFAC, and the novel technique based on molecular parameters developed from generalised London's potential (GLP) theory.
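For context, Barker-type data reduction and consistency testing of the kind described rest on the low-pressure vapour-liquid equilibrium relation (modified Raoult's law), in which y_i and x_i are the vapour and liquid mole fractions, γ_i the activity coefficient supplied by a model such as UNIQUAC, and P_i^sat the pure-component saturation pressure; vapour-phase corrections, such as the association treated by the program above, refine this basic form:

    \[ y_i P = x_i \gamma_i P_i^{\text{sat}} \]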
Abstract:
A total pressure apparatus has been developed to measure vapour-liquid equilibrium data on binary mixtures at atmospheric and sub-atmospheric pressures. The method gives isothermal data which can be obtained rapidly. Only measurements of total pressure are made, as a direct function of the synthetic liquid-phase composition, with the vapour-phase composition deduced through the Gibbs-Duhem relationship. The need to analyse either of the phases is thus eliminated, and with it the errors introduced by sampling and analysis. One essential requirement is that the pure components be degassed completely, since any deficiency in degassing would introduce errors into the measured pressures. Another is that the central apparatus be absolutely leak-tight, as any leakage of air either into or out of the apparatus would introduce erroneous pressure readings. The apparatus was commissioned by measuring the saturated vapour pressures of degassed water and ethanol as a function of temperature. The pressure-temperature data measured for degassed water were compared directly with data in the literature, with good agreement. Pressure-temperature data were similarly measured for ethanol, methanol and cyclohexane and, where possible, compared directly with literature data. Good agreement between the pure-component data of this work and those available in the literature demonstrates, firstly, that a satisfactory degassing procedure has been achieved and, secondly, that the pressure-temperature measurements are consistent for any one component; since this holds for a number of components, the temperature and pressure measurements are self-consistent and of sufficient accuracy, with compatible precision between the separate means of measuring pressure and temperature. The liquid mixtures studied were ethanol-water, methanol-water and ethanol-cyclohexane. The total pressure was measured as the composition inside the equilibrium cell was varied at a set temperature, giving P-T-x data sets for each mixture over a range of temperatures. A standard fitting package from the literature was used to reduce the raw data and yield y-values, completing the x-y-P-T data sets. A consistency test could not be applied to the P-T-x data, as no y-values were obtained during the experimental measurements. In general, satisfactory agreement was found between the data of this work and those available in the literature. For some runs discrepancies were observed, and further work is recommended to eliminate the problems identified.
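The Gibbs-Duhem relationship invoked above is what allows the vapour composition to be recovered from P-T-x data alone; for a binary mixture at constant temperature and pressure it reads

    \[ x_1 \, d\ln\gamma_1 + x_2 \, d\ln\gamma_2 = 0, \]

so that once the activity coefficients are fitted to the measured total pressures, the y-values follow without sampling the vapour phase.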
Abstract:
The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment, with travel times adjusted to reflect the delays encountered in each iteration. The iterative link-time adjustment is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes while the input capacities are given in hourly volumes, the hourly capacities must be converted to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS and computed as the highest hourly volume of a day divided by the corresponding total daily volume. While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion levels associated with each roadway and is believed to be one of the culprits behind traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method. The assignment results based on constant and variable CONFACs were then compared against ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different, and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident. It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
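For reference, the BPR volume-delay equation mentioned above, with t_0 the free-flow link travel time, v the assigned volume, c the capacity (here, the hourly capacity divided by CONFAC to obtain its daily equivalent), and the conventional default parameters α = 0.15 and β = 4:

    \[ t = t_0 \left[ 1 + \alpha \left( \frac{v}{c} \right)^{\beta} \right] \]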
Abstract:
Macroeconomic policy makers are typically concerned with several indicators of economic performance. We therefore propose to tackle the design of macroeconomic policy using Multicriteria Decision Making (MCDM) techniques. More specifically, we employ Multiobjective Programming (MP) to seek so-called efficient policies. The MP approach is combined with a computable general equilibrium (CGE) model, chosen because such models have the dual advantage of being consistent with standard economic theory while allowing one to measure the effects of a specific policy with real data. Applying the proposed methodology to Spain (via the 1995 Social Accounting Matrix), we first quantified the trade-offs between two specific objectives of fiscal policy: growth and inflation. We then constructed a frontier of efficient policies involving real growth and inflation, and found that policy in 1995 Spain displayed some degree of inefficiency with respect to these two objectives. We close with two sets of policy recommendations that, ostensibly, could have helped Spain at the time. The first deals with efficiency independent of the importance policy makers attach to growth and inflation (general policy recommendations); the second depends on which objective is seen as more important: increasing growth or controlling inflation (objective-specific recommendations).
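A schematic statement of the efficiency concept used, under our illustrative assumption that real growth g(x) is maximized and inflation p(x) minimized over fiscal-policy instruments x feasible in the CGE model:

    \[ \max_{x \in X} \; \big( g(x), \, -p(x) \big), \]

where a policy x* is efficient if no feasible x improves one objective without worsening the other; the set of such x* traces out the frontier the authors construct.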