965 results for Eventual Consistency
Abstract:
A theoretical framework for the joint conservation of energy and momentum in the parameterization of subgrid-scale processes in climate models is presented. The framework couples a hydrostatic resolved (planetary-scale) flow to a nonhydrostatic subgrid-scale (mesoscale) flow. The temporal and horizontal spatial scale separation between the planetary scale and the mesoscale is imposed using multiple-scale asymptotics. Energy and momentum are exchanged through subgrid-scale flux convergences of heat, pressure, and momentum. The generation and dissipation of subgrid-scale energy and momentum are understood using wave-activity conservation laws, which are derived by exploiting the (mesoscale) temporal and horizontal spatial homogeneities of the planetary-scale flow. The relations between these conservation laws and the planetary-scale dynamics represent generalized nonacceleration theorems. A derived relationship between the wave-activity fluxes, which represents a generalization of the second Eliassen-Palm theorem, is key to ensuring consistency between energy and momentum conservation. The framework includes a consistent formulation of heating and entropy production due to kinetic energy dissipation.
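For orientation, the generic form of a wave-activity conservation law and the associated nonacceleration statement can be sketched as follows. This is the standard schematic form, not the paper's specific mesoscale result, and the symbols A, F, and D are assumed notation.

```latex
% Generic wave-activity conservation law (schematic): wave activity A,
% wave-activity flux F, nonconservative sources/sinks D.
\[
  \frac{\partial A}{\partial t} + \nabla \cdot \mathbf{F} = D .
\]
% Nonacceleration theorem (schematically): if the waves are steady and
% conservative, \partial A / \partial t = 0 and D = 0, hence
% \nabla \cdot \mathbf{F} = 0 and the waves exert no net force on the
% mean flow.
```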
Abstract:
In recent years, a number of chemistry-climate models have been developed with an emphasis on the stratosphere. Such models cover a wide range of integration time scales and vary considerably in complexity. Here, the results of specific diagnostics are analysed to examine the differences among individual models and observations and to assess the consistency of model predictions, with a particular focus on polar ozone. For example, many models exhibit a significant cold bias at high latitudes, the "cold pole problem", particularly in the southern hemisphere during winter and spring. This is related to wave propagation from the troposphere and can be reduced by increasing model horizontal resolution and by using non-orographic gravity wave drag. Because modelled polar temperatures differ widely, the models simulate different amounts of polar stratospheric clouds, which in turn produce varying ozone values. The results are also compared to determine the possible future behaviour of ozone, with an emphasis on the polar regions and mid-latitudes. All models predict eventual ozone recovery but give a range of results concerning its timing and extent. Differences in the simulation of gravity waves and planetary waves, as well as model resolution, are likely major sources of uncertainty for this issue. In the Antarctic, the ozone hole has probably almost reached its maximum depth, although the vertical and horizontal extent of depletion may increase slightly further over the next few years. According to the model results, Antarctic ozone recovery could begin in any year within the range 2001 to 2008. The limited number of models that have been integrated sufficiently far indicate that full recovery of ozone to 1980 levels may not occur in the Antarctic until about the year 2050. For the Arctic, most models indicate that small ozone losses may continue for a few more years and that recovery could begin in any year within the range 2004 to 2019. The start of ozone recovery in the Arctic is therefore expected to occur later than in the Antarctic.
Abstract:
We compare hypothetical and observed (experimental) willingness to pay (WTP) for a gradual improvement in the environmental performance of a marketed good (an office table). First, following usual practice in marketing research, subjects' stated WTP for the improvement is obtained. Second, the same subjects participate in a real-reward experiment designed to replicate the scenario valued in the hypothetical question. Our results show that, independently of the degree of the improvement, there are no significant median differences between stated and experimental data. However, subjects reporting extreme values of WTP (low or high) exhibit more moderate behavior in the experiment.
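A minimal sketch of the kind of paired median comparison described above, assuming hypothetical per-subject WTP data and a Wilcoxon signed-rank test; the study's actual test and data are not reproduced here.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
stated = rng.gamma(shape=2.0, scale=10.0, size=60)      # hypothetical stated WTP
experimental = stated + rng.normal(scale=3.0, size=60)  # hypothetical experimental WTP

# Wilcoxon signed-rank test for a median difference between paired samples.
stat, p = wilcoxon(stated, experimental)
print(f"W = {stat:.1f}, p = {p:.3f}")  # large p: no significant median difference
```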
Abstract:
Flood forecasting increasingly relies on numerical weather prediction forecasts to achieve longer lead times. One of the key difficulties emerging in constructing a decision framework for these flood forecasts is what to do when consecutive forecasts are so different that they lead to different conclusions regarding the issuing of warnings or the triggering of other action. In this opinion paper we explore some of the issues surrounding such forecast inconsistency (also known as "jumpiness", "turning points", "continuity", or the number of "swings"). We define forecast inconsistency; discuss the reasons why forecasts might be inconsistent; consider how inconsistency should be analysed, what should be done about it, and how it should be communicated; and ask whether it is a totally undesirable property. Consistency is increasingly emerging as a hot topic in many forecasting environments.
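As an illustration of how such "jumpiness" might be quantified, here is a toy index that counts how often consecutive forecasts for the same event cross a warning threshold. The function, data, and threshold are invented for illustration and are not from the paper.

```python
import numpy as np

def swing_count(forecasts, threshold):
    """Number of times consecutive forecasts cross the warning threshold."""
    above = np.asarray(forecasts) > threshold
    return int(np.sum(above[1:] != above[:-1]))

# Successive forecasts of peak discharge (m^3/s) for one flood event.
runs = [780, 1150, 640, 1220, 1190]
print(swing_count(runs, threshold=1000))  # 3 crossings -> a jumpy sequence
```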
Abstract:
Non-linear methods for estimating variability in time series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn to data analysis is evident, and their use is increasing. However, consistency is a point of concern for these tools: the classification of the temporal organization of a data set might indicate that one series is relatively less ordered than another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might give incorrect results due to this lack of consistency. In this study, we present a method that gains consistency by applying ApEn repeatedly over a wide range of combinations of window length and matching error tolerance. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time series with different degrees of temporal order (combinations of sine waves, logistic maps with different control-parameter values, and random noise). While ApEn and SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. To validate the tool we performed shuffled- and surrogate-data analyses. Statistical analysis confirmed the consistency of the method.
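For reference, a minimal NumPy sketch of standard ApEn, together with one plausible reading of the vApEn aggregation (summing ApEn over a grid of window lengths m and tolerances r); the paper's exact integration scheme may differ.

```python
import numpy as np

def apen(x, m=2, r=0.2):
    """Approximate entropy of a 1-D series x, with embedding dimension m
    and tolerance r given as a fraction of the series' standard deviation."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    tol = r * x.std()

    def phi(m):
        # All length-m templates of x.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance (self-matches included).
        c = np.mean(dist <= tol, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def vapen(x, ms=(1, 2, 3), rs=np.linspace(0.1, 0.5, 9)):
    # Hypothetical aggregation: sum ApEn over the (m, r) grid; the paper's
    # exact scheme over "combinations of window lengths and tolerances"
    # may differ from this plain sum.
    return sum(apen(x, m, r) for m in ms for r in rs)
```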
Abstract:
The growth of online, hybrid, and distance courses challenges institutions to maintain content consistency across multiple platforms. This report examines the policies, standards, and practices that guide course consistency initiatives.
Abstract:
This paper is about economies with a representative consumer. In general a representative consumer need not exist, although there are several well-known sets of sufficient conditions under which one will. It is common practice, however, to use the representative consumer hypothesis without specifically assuming any of these. We show, firstly, that it is possible for the utility of the representative consumer to increase when every actual consumer is made worse off. This reveals a serious shortcoming of welfare judgements based on the representative consumer. Secondly, in economies where this does not occur, there exists a social welfare function, which we construct, that is consistent with welfare judgements based on the utility of the representative consumer. Finally, we provide a converse to Samuelson's (1956) representative consumer result, which relates it to Scitovsky's community indifference curves.
Abstract:
This paper investigates which properties money-demand functions have to satisfy to be consistent with multidimensional extensions of Lucas's (2000) versions of the Sidrauski (1967) and shopping-time models. We also investigate how these classes of models relate to each other regarding the rationalization of money demands. We conclude that money-demand functions rationalizable by the shopping-time model are always rationalizable by the Sidrauski model, but that the converse is not true. The log-log money demand with an interest-rate elasticity greater than or equal to one and the semi-log money demand are counterexamples.
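For concreteness, the two functional forms named as counterexamples are the standard ones from this literature; in assumed notation, with m real money balances, i the nominal interest rate, and A, B, \eta, \xi positive constants:

```latex
% Log-log money demand: constant interest-rate elasticity \eta
% (the stated counterexample requires \eta \ge 1).
\[
  m = A\, i^{-\eta} \quad\Longleftrightarrow\quad \ln m = \ln A - \eta \ln i ,
\]
% Semi-log money demand: constant semi-elasticity \xi.
\[
  m = B e^{-\xi i} \quad\Longleftrightarrow\quad \ln m = \ln B - \xi i .
\]
```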
Abstract:
We consider the problem of time consistency of the Ramsey monetary and fiscal policies in an economy without capital. Following Lucas and Stokey (1983), we allow the government at date t to leave its successor at t + 1 a profile of real and nominal debt of all maturities, as a way to influence its decisions. We show that the Ramsey policies are time consistent if and only if the Friedman rule is the optimal Ramsey policy.
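For context, the Friedman rule referred to above is the standard prescription of a zero net nominal interest rate (schematic statement, assumed notation):

```latex
% Friedman rule: set the net nominal interest rate to zero in every period,
\[
  i_t = 0 \quad \text{for all } t,
\]
% equivalently, deflate at the real interest rate so that the opportunity
% cost of holding money vanishes.
```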
Abstract:
This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant work concluding that macroeconomic models can explain the short-run exchange rate, as well as studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to this debate, the paper presents its own evidence by implementing the model with the best predictive performance reported by Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To this end, we use a sample of 14 currencies against the US dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we include currencies of both developed and developing countries. Our results corroborate Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are needed to evaluate the model properly. After finding that the implemented model cannot be said to produce more accurate forecasts than a random walk, we assess whether the model is at least capable of generating "rational", or "consistent", forecasts. For this, we use the theoretical and instrumental framework defined and implemented by Cheung and Chinn (1998) and conclude that the Taylor-rule model's forecasts are "inconsistent". Finally, we run Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the contemporaneous observed values. We find that the fundamentals-based model is unable to anticipate realized returns.
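A minimal sketch of the final step described above, testing whether lagged model-predicted returns Granger-cause realized returns, using statsmodels; the data here are random placeholders and the lag order is an assumption, not the study's exact setup.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
realized = rng.normal(size=200)    # placeholder realized FX returns
predicted = rng.normal(size=200)   # placeholder model-predicted returns

# Column order matters: the test asks whether the second column helps
# predict the first beyond the first column's own lags.
data = np.column_stack([realized, predicted])
results = grangercausalitytests(data, maxlag=4)
```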
Abstract:
We derive the torsion constraints and show the consistency of the equations of motion of four-dimensional Type II supergravity in superspace with the Type II sigma model. This is achieved by coupling the four-dimensional compactified Type II Berkovits' superstring to an N = 2 curved background and requiring that the sigma model have superconformal invariance at tree level. We compute this in a manifestly 4D N = 2 supersymmetric way. The constraints break the target conformal and SU(2) invariances, and the dilaton is a conformal, SU(2) x U(1) compensator. For the Type II superstring in four dimensions, worldsheet supersymmetry requires two different compensators. One type is described by chiral and anti-chiral superfields; this compensator can be identified with a vector multiplet. The other Type II compensator is described by twist-chiral and twist-anti-chiral superfields and can be identified with a tensor hypermultiplet. Also, the superconformal invariance at tree level selects a particular gauge, in which the matter is fixed but not the compensators. After imposing the reality conditions, we show that the Type II sigma model at tree level is consistent with the equations of motion of Type II supergravity in the string gauge.