134 results for Razón real (ratio rei)


Relevance: 20.00%

Abstract:

For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
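
To make the objects in this abstract concrete, the sketch below contrasts a fixed-bandwidth Gaussian kernel estimate with a simple location-dependent ("balloon") variant whose bandwidth at each evaluation point is the distance to its k-th nearest data point, and compares their empirical L1 errors against a known density. The bandwidth rules, the normal target density and the Monte Carlo setup are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def kde_fixed(x_eval, data, h):
    """Standard Gaussian kernel density estimate with a fixed bandwidth h."""
    u = (x_eval[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def kde_balloon(x_eval, data, k=20):
    """Variable-bandwidth ('balloon') estimate: at each evaluation point the
    bandwidth is the distance to the k-th nearest data point (an assumed rule)."""
    d = np.abs(x_eval[:, None] - data[None, :])
    h = np.sort(d, axis=1)[:, k - 1]            # location-dependent bandwidth h(x)
    u = d / h[:, None]
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h[:, None]).sum(axis=0) if False else \
           np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

# Empirical L1 error against a known density (standard normal), on a grid.
rng = np.random.default_rng(0)
data = rng.standard_normal(500)
grid = np.linspace(-5, 5, 2001)
true = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)
dx = grid[1] - grid[0]
for name, est in [("fixed h=0.3", kde_fixed(grid, data, 0.3)),
                  ("balloon k=20", kde_balloon(grid, data, 20))]:
    print(name, "L1 error ≈", np.sum(np.abs(est - true)) * dx)
```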

Relevance: 20.00%

Abstract:

Most central banks perceive a trade-off between stabilizing inflation and stabilizing the gap between output and desired output. However, the standard new Keynesian framework implies no such trade-off. In that framework, stabilizing inflation is equivalent to stabilizing the welfare-relevant output gap. In this paper, we argue that this property of the new Keynesian framework, which we call the divine coincidence, is due to a special feature of the model: the absence of non-trivial real imperfections. We focus on one such real imperfection, namely, real wage rigidities. When the baseline new Keynesian model is extended to allow for real wage rigidities, the divine coincidence disappears, and central banks indeed face a trade-off between stabilizing inflation and stabilizing the welfare-relevant output gap. We show that not only does the extended model have more realistic normative implications, but it also has appealing positive properties. In particular, it provides a natural interpretation for the dynamic inflation-unemployment relation found in the data.
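
For readers who want the mechanics behind the "divine coincidence", the sketch below writes the textbook New Keynesian Phillips curve and the extra cost-push-style term that a real imperfection such as real wage rigidity introduces. The notation and the reduced form of the extra term follow standard expositions and are assumptions, not a reproduction of this paper's equations.

```latex
% Baseline New Keynesian Phillips curve: inflation depends only on the
% welfare-relevant output gap x_t, so stabilizing one stabilizes the other.
\[
  \pi_t = \beta \, E_t[\pi_{t+1}] + \kappa \, x_t
\]
% With a real imperfection such as real wage rigidity, an endogenous
% cost-push-style term u_t appears and the coincidence breaks down:
\[
  \pi_t = \beta \, E_t[\pi_{t+1}] + \kappa \, x_t + u_t ,
\]
% so setting \pi_t = 0 for all t no longer implies x_t = 0 for all t.
```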

Relevance: 20.00%

Abstract:

We study the effects of German unification in a model with capital accumulation, skill differences and a welfare state. We argue that this event is similar to a mass migration of low-skilled agents holding no capital into a foreign country. Absent a welfare state, we observe an investment boom together with depressed output and employment conditions. Capital owners and high-skilled agents are willing to give up as much as 4% of per-capita consumption in favor of unification. When a welfare state exists, the investment boom disappears and the recession is prolonged. Now, with unification, capital owners and high-skilled agents lose 4% of per-capita consumption.

Relevance: 20.00%

Abstract:

The well-known lack of power of unit root tests has often been attributed to the short length of macroeconomic variables and also to DGPs that depart from the I(1)-I(0) alternatives. This paper shows that by using long spans of annual real GNP and GNP per capita (133 years) high power can be achieved, leading to the rejection of both the unit root and the trend-stationary hypothesis. This suggests that possibly neither model provides a good characterization of these data. Next, more flexible representations are considered, namely, processes containing structural breaks (SB) and fractional orders of integration (FI). Economic justification for the presence of these features in GNP is provided. It is shown that the latter models (FI and SB) are in general preferred to the ARIMA (I(1) or I(0)) ones. As a novelty in this literature, new techniques are applied to discriminate between FI and SB models. It turns out that the FI specification is preferred, implying that GNP and GNP per capita are non-stationary, highly persistent but mean-reverting series. Finally, it is shown that the results are robust when breaks in the deterministic component are allowed for in the FI model. Some macroeconomic implications of these findings are also discussed.
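
To make the I(1)-versus-I(0) testing step concrete, here is a small sketch using statsmodels; the simulated series, lag choices and deterministic-trend specification are illustrative assumptions, and the fractional-integration and structural-break comparisons in the paper require specialized procedures not shown here.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, kpss

rng = np.random.default_rng(0)
n = 133                                   # a long annual span, as in the paper
random_walk = np.cumsum(rng.standard_normal(n)) + 0.02 * np.arange(n)  # I(1) with drift
trend_stat = 0.02 * np.arange(n) + rng.standard_normal(n)              # trend-stationary

for name, y in [("random walk + drift", random_walk), ("trend-stationary", trend_stat)]:
    adf_stat, adf_p, *_ = adfuller(y, regression="ct")                  # H0: unit root
    kpss_stat, kpss_p, *_ = kpss(y, regression="ct", nlags="auto")      # H0: trend-stationarity
    print(f"{name}: ADF p={adf_p:.3f}, KPSS p={kpss_p:.3f}")
```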

Relevance: 20.00%

Abstract:

Confidence in decision making is an important dimension of managerial behavior. However, what is the relation between confidence, on the one hand, and the fact of receiving or expecting to receive feedback on decisions taken, on the other hand? To explore this and related issues in the context of everyday decision making, use was made of the ESM (Experience Sampling Method) to sample decisions taken by undergraduates and business executives. For several days, participants received 4 or 5 SMS messages daily (on their mobile telephones) at random moments, at which point they completed brief questionnaires about their current decision-making activities. Issues considered here include differences between the types of decisions faced by the two groups, their structure, feedback (received and expected), and confidence in decisions taken as well as in the validity of feedback. No relation was found between confidence in decisions and whether participants received or expected to receive feedback on those decisions. In addition, although participants are clearly aware that feedback can provide both confirming and disconfirming evidence, their ability to specify appropriate feedback is imperfect. Finally, difficulties experienced in using the ESM are discussed, as are possibilities for further research using this methodology.

Relevance: 20.00%

Abstract:

This paper analyses the empirical interdependences among asset returns, real activity and inflation from a multicountry and international point of view. We find that nominal stock returns are significantly related to inflation only in the US, that the US term structure of interest rates predicts both domestic and foreign inflation rates while foreign term structures do not have this predictive power, and that innovations in inflation and exchange rates induce insignificant responses of real and financial variables. An interpretation of the dynamics and some policy implications of the results are provided.
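
The kind of exercise described here (responses of real and financial variables to inflation innovations) is typically read off impulse responses from a vector autoregression. The sketch below is a generic reduced-form VAR with simulated stand-in data and an assumed lag order; it does not reproduce the paper's dataset or identification choices.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated stand-ins for the variables discussed: stock returns, real activity,
# inflation (the paper's actual multicountry dataset is not reproduced here).
rng = np.random.default_rng(1)
data = pd.DataFrame(rng.standard_normal((200, 3)),
                    columns=["stock_return", "real_activity", "inflation"])

res = VAR(data).fit(2)                       # reduced-form VAR with 2 lags (assumed)
irf = res.irf(10)                            # impulse responses over 10 steps
# Response of real activity to an orthogonalized inflation innovation, by horizon:
print(irf.orth_irfs[:, data.columns.get_loc("real_activity"),
                       data.columns.get_loc("inflation")])
```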

Relevance: 20.00%

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
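
One common formulation of weighted log-ratio analysis (margin-based row and column weights, weighted double-centering of the log matrix, weighted SVD) can be sketched as follows; the details follow standard descriptions of the method and are assumptions that may differ from the paper's exact treatment.

```python
import numpy as np

def weighted_lra(N, n_dims=2):
    """Weighted log-ratio analysis of a positive two-way table N (one common
    formulation: margin weights, weighted double-centering, weighted SVD)."""
    P = N / N.sum()
    r = P.sum(axis=1)                          # row weights (masses)
    c = P.sum(axis=0)                          # column weights (masses)
    L = np.log(N)
    # Weighted double-centering of the log matrix.
    Y = L - r @ L - (L @ c)[:, None] + r @ L @ c
    # Weighted SVD.
    S = np.sqrt(r)[:, None] * Y * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U[:, :n_dims] * sv[:n_dims] / np.sqrt(r)[:, None]   # row principal coordinates
    cols = Vt[:n_dims].T * sv[:n_dims] / np.sqrt(c)[:, None]   # column principal coordinates
    return rows, cols, sv

# Toy positive table; the paper's applications use linguistic frequencies
# and chemical compositions.
N = np.array([[10., 20., 5.], [8., 15., 12.], [3., 9., 30.], [6., 6., 6.]])
rows, cols, sv = weighted_lra(N)
print(rows.round(3), cols.round(3), sv.round(3), sep="\n")
```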

Relevance: 20.00%

Abstract:

This paper presents a comparative analysis of linear and mixed models for short-term forecasting of a real data series with a high percentage of missing data. The data are the series of significant wave heights registered at regular periods of three hours by a buoy placed in the Bay of Biscay. The series is interpolated with a linear predictor which minimizes the forecast mean square error. The linear models are seasonal ARIMA models, and the mixed models have a linear component and a non-linear seasonal component. The non-linear component is estimated by a non-parametric regression of data versus time. Short-term forecasts, no more than two days ahead, are of interest because they can be used by the port authorities to notify the fleet. Several models are fitted and compared by their forecasting behavior.
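
As a concrete illustration of the linear (seasonal ARIMA) component applied to a gappy 3-hourly series, the sketch below fits a SARIMAX model whose state-space filter handles the missing observations. The simulated series, the period of 8 observations per day and the (p,d,q)(P,D,Q) orders are assumptions for illustration, not the models estimated in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Simulated stand-in for 3-hourly significant wave heights with missing values
# (the real buoy series from the Bay of Biscay is not reproduced here).
rng = np.random.default_rng(2)
t = np.arange(800)
y = 2.0 + 0.8 * np.sin(2 * np.pi * t / 8) + rng.normal(0, 0.3, len(t))  # daily cycle: 8 obs/day
y[rng.random(len(t)) < 0.15] = np.nan        # ~15% missing observations
series = pd.Series(y)

# Seasonal ARIMA with period 8 (one day of 3-hour readings); the Kalman filter
# interpolates the gaps. The orders below are illustrative only.
model = SARIMAX(series, order=(1, 0, 1), seasonal_order=(1, 0, 1, 8), trend="c")
fit = model.fit(disp=False)
print(fit.forecast(steps=16))                # two days ahead = 16 three-hour steps
```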

Relevance: 20.00%

Abstract:

In this paper we study the evolution of the labor share in the OECD since 1970. We show that it is essentially related to the capital-output ratio; that this relationship is shifted by factors such as the price of imported materials or the skill mix; and that discrepancies between the marginal product of labor and the real wage (due to, e.g., product market power, union bargaining, and labor adjustment costs) cause departures from it. We provide estimates of the model with panel data on 14 industries and 14 countries for 1973-93 and use them to compute the evolution of the wage gap in Germany and the US.
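
To see why the labor share and the capital-output ratio are linked, a textbook CES example helps; the functional form and notation below are standard assumptions in this literature, not the paper's estimated specification.

```latex
% With CES technology Y = [ a K^{\rho} + (1-a) L^{\rho} ]^{1/\rho}, where the
% elasticity of substitution is \sigma = 1/(1-\rho), competitive factor pricing gives
\[
  s_K \;=\; \frac{r K}{Y} \;=\; a \left(\frac{K}{Y}\right)^{\rho},
  \qquad
  s_L \;=\; 1 - s_K \;=\; 1 - a \left(\frac{K}{Y}\right)^{\rho},
\]
% so the labor share is a monotone function of the capital-output ratio along this
% schedule; markups, union bargaining or adjustment costs that drive a wedge
% between the real wage and the marginal product of labor shift s_L off it.
```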

Relevance: 20.00%

Abstract:

The Industrial Revolution was characterized by technological progress and an increasing capital intensity. Why did real wages stagnate or fall in the beginning? I answer this question by modeling the Industrial Revolution as the introduction of a relatively more capital intensive production method in a standard neoclassical framework. I show that real wages fall in the beginning of an industrial revolution if and only if technological progress in the relatively more capital intensive sector is relatively fast.

Relevance: 20.00%

Abstract:

Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability of the models. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values and, on the other hand, estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids this problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
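
The estimation difficulty the abstract points to, fitting a binomial distribution when both the population size and the success probability are unknown, can be seen in a few lines: the likelihood has a long, nearly flat ridge along N·p ≈ mean(k), which is why joint estimation is unstable. The simulated numbers, grid and profiling approach below are illustrative assumptions; the paper's actual heuristic, which exploits a multinomial-logit purchase model and varied offer sets, is not reproduced here.

```python
import numpy as np
from scipy.stats import binom

# Observations k_i ~ Binomial(N, p) with both N (market size) and p unknown;
# think of k_i as bookings per day with no-purchases unobserved.
rng = np.random.default_rng(3)
true_N, true_p = 200, 0.05
k = rng.binomial(true_N, true_p, size=30)

# Profile log-likelihood over a grid of N, plugging in the MLE of p given N.
profile = []
for N in range(int(k.max()), 2000, 10):
    p_hat = k.mean() / N
    ll = binom.logpmf(k, N, p_hat).sum()
    profile.append((ll, N, p_hat))
# The top candidates have almost identical likelihoods for very different N.
for ll, N, p_hat in sorted(profile, reverse=True)[:5]:
    print(f"N={N:5d}  p={p_hat:.4f}  loglik={ll:.2f}")
```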

Relevance: 20.00%

Abstract:

Research project carried out by secondary-school students and awarded a CIRIT Prize to foster the scientific spirit among young people (Premi CIRIT per fomentar l'esperit científic del Jovent) in 2010. The initial objective was to analyse, by means of a video camera, a real parabolic motion, a long jump, in an attempt to combine physics and sport. But, as almost always happens when one investigates rigorously and attentively, a door opened onto a new objective. Comparing the jumps of an amateur with those of professionals allowed us to uncover the jumpers' technique, which in turn complicated the motion. The discovery that a small technical adjustment could improve the jump distance reoriented the project and gave it the desired sporting dimension. The methodology followed combined observation, experimental analysis and the application of knowledge, using a technique which, although not new, is rarely used in projects at this level: treating videos as kinematic sensors. Frame-by-frame processing with the MultiLab software made it possible to identify the variables involved in optimizing the jump; the progressions of the different jumps were compared, and the lessons from this analysis were applied to improving performance. An initial documentary search, including interviews with professionals, should also be mentioned.
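
As a minimal illustration of treating video frames as kinematic data, the sketch below fits the parabolic flight of a jump from frame-by-frame positions and recovers take-off speed and angle. The synthetic (t, x, y) points, the 25 fps rate and the take-off values are assumptions; in the project the positions would come from MultiLab's frame analysis, whose output format is not reproduced here.

```python
import numpy as np

# Frame-by-frame centre-of-mass positions (synthetic; 25 fps assumed).
fps = 25.0
g = 9.81
t = np.arange(0, 0.8, 1 / fps)
x = 7.5 * np.cos(np.radians(20)) * t                       # horizontal position (m)
y = 7.5 * np.sin(np.radians(20)) * t - 0.5 * g * t**2      # vertical position (m)

# Fit x(t) = x0 + vx*t and y(t) = y0 + vy*t - (g/2)*t^2 (gravity term removed first).
vy, y0 = np.polyfit(t, y + 0.5 * g * t**2, 1)
vx, x0 = np.polyfit(t, x, 1)

v0 = np.hypot(vx, vy)                                      # take-off speed
angle = np.degrees(np.arctan2(vy, vx))                     # take-off angle
flat_range = vx * (2 * vy / g)                             # range back to take-off height
print(f"take-off speed ≈ {v0:.2f} m/s, angle ≈ {angle:.1f}°, flat range ≈ {flat_range:.2f} m")
```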
