102 results for PARAMETRIC-INSTABILITIES


Relevance: 10.00%

Abstract:

In the mid-1980s, many European countries introduced fixed-term contracts. Since then, their labor markets have become more dynamic. This paper studies the implications of such reforms for the duration distribution of unemployment, with particular emphasis on changes in duration dependence. I estimate a parametric duration model using cross-sectional data drawn from the Spanish Labor Force Survey from 1980 to 1994 to analyze the chances of leaving unemployment before and after the introduction of fixed-term contracts. I find that duration dependence has increased since the reform. Semi-parametric estimation of the model also shows that, for long spells, the probability of leaving unemployment has decreased since the reform.
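For illustration only (a minimal sketch, not the paper's specification), the fragment below fits a Weibull duration model by maximum likelihood to simulated right-censored spells; in this family the shape parameter k carries the duration dependence, with k < 1 meaning the exit hazard falls as a spell lengthens.

```python
# Hypothetical sketch of a parametric duration model (Weibull, censored MLE).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = rng.weibull(0.8, size=2000) * 10.0        # simulated spell lengths (months)
d = (rng.random(2000) > 0.3).astype(float)    # 1 = exit observed, 0 = censored

def negloglik(params):
    k, lam = np.exp(params)                               # enforce positivity
    log_hazard = np.log(k / lam) + (k - 1) * np.log(t / lam)
    cum_hazard = (t / lam) ** k
    return -(d * log_hazard - cum_hazard).sum()

k_hat, lam_hat = np.exp(minimize(negloglik, x0=[0.0, 1.0]).x)
print(f"shape k = {k_hat:.2f} (k < 1 -> negative duration dependence)")
```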

Relevance: 10.00%

Abstract:

We study the contribution of money to business cycle fluctuations in the US, the UK, Japan, and the Euro area using a small-scale structural monetary business cycle model. Constrained likelihood-based estimates of the parameters are provided and time instabilities analyzed. Real balances are statistically important for output and inflation fluctuations. Their contribution changes over time. Models giving money no role provide a distorted representation of the sources of cyclical fluctuations, of the transmission of shocks, and of the events of the last 40 years.

Relevance: 10.00%

Abstract:

There are two fundamental puzzles about trade credit: why does it appear to be so expensive, and why do input suppliers engage in the business of lending money? This paper addresses and answers both questions by analysing the interaction between the financial and the industrial aspects of the supplier-customer relationship. It examines how, in a context of limited enforceability of contracts, suppliers may have a comparative advantage over banks in lending to their customers because they hold the extra threat of stopping the supply of intermediate goods. Suppliers may also act as lenders of last resort, providing insurance against liquidity shocks that may endanger the survival of their customers. The relatively high implicit interest rates of trade credit result from the existence of default and insurance premia. The implications of the model are examined empirically using parametric and nonparametric techniques on a panel of UK firms.

Relevance: 10.00%

Abstract:

This paper presents a comparative analysis of linear and mixed models for short-term forecasting of a real data series with a high percentage of missing data. The data are the series of significant wave heights registered at regular three-hour intervals by a buoy placed in the Bay of Biscay. The series is interpolated with a linear predictor which minimizes the forecast mean square error. The linear models are seasonal ARIMA models, and the mixed models have a linear component and a non-linear seasonal component. The non-linear component is estimated by a non-parametric regression of data versus time. Short-term forecasts, no more than two days ahead, are of interest because they can be used by the port authorities to notify the fleet. Several models are fitted and compared by their forecasting behavior.
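As a rough illustration of the linear component (the model orders, the interpolation step and the synthetic data below are assumptions, not the paper's exact setup), a seasonal ARIMA can be fitted to a 3-hourly series and used for forecasts up to two days, i.e. sixteen 3-hour steps, ahead:

```python
# Hypothetical sketch: a seasonal ARIMA on a 3-hourly series with gaps filled
# by linear interpolation (standing in for the optimal linear predictor), then
# forecast 16 steps (two days) ahead. Orders (1,0,1)(1,0,1,8) are placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
idx = pd.date_range("2000-01-01", periods=800, freq="3h")
hs = 2.0 + 0.5 * np.sin(2 * np.pi * np.arange(800) / 8) + rng.normal(0, 0.2, 800)
y = pd.Series(hs, index=idx)                       # synthetic wave heights (m)
y = y.mask(rng.random(800) < 0.1)                  # knock out ~10% of the data
y = y.interpolate(limit_direction="both")          # fill the gaps

fit = SARIMAX(y, order=(1, 0, 1), seasonal_order=(1, 0, 1, 8)).fit(disp=False)
print(fit.forecast(steps=16))                      # two days = 16 x 3 hours
```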

Relevance: 10.00%

Abstract:

In this paper we analyse the observed systematic differences in costs for teaching hospitals (TH henceforth) in Spain. Concern has been voiced regarding the existence of a bias in the financing of THs now that prospective budgets are in the arena for hospital finance, and claims for adjusting for the legitimate extra costs of teaching on hospital expenditure are well grounded. We focus on the estimation of the impact of teaching status on average cost. We used a version of a multiproduct hospital cost function taking into account some relevant factors from which to derive the observed differences. We assume that the relationship between the explanatory and the dependent variables follows a flexible form for each of the explanatory variables. We also model the underlying covariance structure of the data. We assumed two qualitatively different sources of variation: random effects and serial correlation. Random variation refers both to general level variation (through the random intercept) and to the variation specifically related to teaching status. We postulate that the impact of the random effects is predominant over the impact of the serial correlation effects. The model is estimated by restricted maximum likelihood. Our results show that costs are 9% higher (15% in the case of median costs) in teaching than in non-teaching hospitals. That is, teaching status legitimately explains no more than half of the observed difference in actual costs. The impact of the teaching factor on costs depends on the number of residents, with an increase of 51.11% per resident for hospitals with fewer than 204 residents (third quartile of the number of residents) and 41.84% for hospitals with more than 204 residents. In addition, the estimated dispersion is higher among teaching hospitals. As a result, due to the considerable observed heterogeneity, results should be interpreted with caution. From a policy-making point of view, we conclude that since a higher relative burden for medical training falls under public hospital command, an explicit adjustment for the extra costs that the teaching factor imposes on hospital finance is needed before hospital competition for inpatient services takes place.
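A schematic sketch of this kind of model (hypothetical variable names and simulated data; the flexible functional forms, the teaching-specific random component and the serial correlation structure are omitted) could look as follows, using REML through statsmodels:

```python
# Hypothetical sketch: log-cost regression with a hospital random intercept,
# estimated by REML. The paper's flexible functional forms, teaching-specific
# variance component and serial correlation are not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_hosp, n_obs = 80, 5
df = pd.DataFrame({
    "hospital": np.repeat(np.arange(n_hosp), n_obs),
    "teaching": np.repeat((rng.random(n_hosp) < 0.3).astype(int), n_obs),
    "cases": rng.normal(10.0, 2.0, n_hosp * n_obs),   # stand-in activity measure
})
u = rng.normal(0.0, 0.10, n_hosp)                     # hospital-level effects
df["log_cost"] = (6.0 + 0.09 * df["teaching"] + 0.05 * df["cases"]
                  + u[df["hospital"]] + rng.normal(0.0, 0.05, len(df)))

model = smf.mixedlm("log_cost ~ teaching + cases", df, groups="hospital")
print(model.fit(reml=True).summary())
```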

Relevance: 10.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity-penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite-sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has an infinite VC or pseudo-dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
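Schematically, and in notation assumed here for illustration rather than taken from the paper, the two-stage rule can be written as:

```latex
% Notation assumed for illustration: D_1, D_2 are the two halves of the sample,
% \hat{C}_k is the empirical cover of class F_k built from D_1, \hat{R}_{D_2}
% the empirical risk on D_2, and \hat{\rho}_k the complexity penalty derived
% from the size of the cover.
\[
  \hat{f}_k \in \operatorname*{arg\,min}_{f \in \hat{\mathcal{C}}_k(D_1)} \hat{R}_{D_2}(f),
  \qquad
  \hat{k} \in \operatorname*{arg\,min}_{k \ge 1}
      \Bigl[\, \hat{R}_{D_2}(\hat{f}_k) + \hat{\rho}_k \,\Bigr].
\]
```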

Relevance: 10.00%

Abstract:

This paper investigates what has caused output and inflation volatility to fall in the US, using a small-scale structural model, Bayesian techniques and rolling samples. There are instabilities in the posterior of the parameters describing the private sector, the policy rule and the standard deviation of the shocks. Results are robust to the specification of the policy rule. Changes in the parameters describing the private sector are the largest, but those of the policy rule and the covariance matrix of the shocks explain the changes most.

Relevance: 10.00%

Abstract:

We revisit the debt overhang question. We first use non-parametric techniques to isolate a panel of countries on the downward-sloping section of a debt Laffer curve. In particular, overhang countries are ones where a threshold level of debt is reached in sample, beyond which (initial) debt ends up lowering (subsequent) growth. On average, significantly negative coefficients appear when the face value of debt reaches 60 percent of GDP or 200 percent of exports, and when its present value reaches 40 percent of GDP or 140 percent of exports. Second, we depart from reduced-form growth regressions and perform direct tests of the theory on the thus selected sample of overhang countries. In the spirit of event studies, we ask whether, as the overhang level of debt is reached: (i) investment falls precipitously, as it should when it becomes optimal to default; (ii) economic policy deteriorates observably, as it should when debt contracts become unable to elicit effort on the part of the debtor; and (iii) the terms of borrowing worsen noticeably, as they should when it becomes optimal for creditors to pre-empt default and exact punitive interest rates. We find a systematic response of investment, particularly when property rights are weakly enforced, some worsening of the policy environment, and a fall in interest rates. This easing of borrowing conditions happens because lending by the private sector virtually disappears in overhang situations, and multilateral agencies step in with concessional rates. Thus, while debt relief is likely to improve economic policy (and especially investment) in overhang countries, it is doubtful that it would ease their terms of borrowing, or the burden of debt.

Relevance: 10.00%

Abstract:

We investigate on-line prediction of individual sequences. Given a class of predictors, the goal is to predict as well as the best predictor in the class, where the loss is measured by the self-information (logarithmic) loss function. The excess loss (regret) is closely related to the redundancy of the associated lossless universal code. Using Shtarkov's theorem and tools from empirical process theory, we prove a general upper bound on the best possible (minimax) regret. The bound depends on certain metric properties of the class of predictors. We apply the bound to both parametric and nonparametric classes of predictors. Finally, we point out a suboptimal behavior of the popular Bayesian weighted-average algorithm.
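For orientation, the two central quantities can be stated in standard notation (a sketch, not quoted from the paper): the regret of a prediction strategy $q$ against a class $\mathcal{F}$ under log loss, and Shtarkov's expression for its minimax value over sequences $x^n$ drawn from a finite alphabet.

```latex
% Standard definitions, stated here for orientation rather than quoted from the paper.
\[
  R_n(q,\mathcal{F}) \;=\; \max_{x^n}
      \Bigl[\, -\log q(x^n) \;-\; \inf_{p \in \mathcal{F}} \bigl(-\log p(x^n)\bigr) \Bigr],
  \qquad
  \min_q R_n(q,\mathcal{F}) \;=\; \log \sum_{x^n} \sup_{p \in \mathcal{F}} p(x^n).
\]
```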

Relevance: 10.00%

Abstract:

A new parametric minimum-distance time-domain estimator for ARFIMA processes is introduced in this paper. The proposed estimator minimizes the sum of squared correlations of residuals obtained after filtering a series through ARFIMA parameters. The estimator is easy to compute and is consistent and asymptotically normally distributed for fractionally integrated (FI) processes with an integration order d strictly greater than -0.75. Therefore, it can be applied to both stationary and non-stationary processes. Deterministic components are also allowed in the DGP. Furthermore, as a by-product, the estimation procedure provides an immediate check on the adequacy of the specified model, because the criterion function, when evaluated at the estimated values, coincides with the Box-Pierce goodness-of-fit statistic. Empirical applications and Monte Carlo simulations supporting the analytical results and showing the good performance of the estimator in finite samples are also provided.
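A toy version of the minimum-distance idea for the purely fractional FI(d) case (the ARMA part, deterministic components and the choice of 20 lags below are simplifications and assumptions, not the paper's estimator) might look like this:

```python
# Toy sketch for the purely fractional FI(d) case; the ARMA part, deterministic
# components and the asymptotic theory discussed above are omitted, and the
# 20-lag truncation is an arbitrary choice.
import numpy as np
from scipy.optimize import minimize_scalar

def frac_diff(x, d):
    """Apply the (1 - L)^d filter through its binomial (AR-infinity) expansion."""
    n = len(x)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(n)])

def criterion(d, x, m=20):
    """Sum of squared residual autocorrelations after filtering by (1 - L)^d;
    n times this value is the Box-Pierce Q statistic of the residuals."""
    e = frac_diff(x, d)
    e = e - e.mean()
    acov = np.correlate(e, e, "full")[len(e) - 1:] / len(e)
    rho = acov[1:m + 1] / acov[0]
    return float(np.sum(rho ** 2))

rng = np.random.default_rng(3)
x = rng.normal(size=400)                 # simulated FI(0) series, so true d = 0
res = minimize_scalar(lambda d: criterion(d, x), bounds=(-0.7, 0.9), method="bounded")
print("estimated d:", round(res.x, 3))
```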

Relevance: 10.00%

Abstract:

The goal of this paper is to present an optimal resource allocation model for the regional allocation of public service inputs. The proposed solution maximises relative public service availability in regions located below the best-availability frontier, subject to exogenous budget restrictions and to equality of access for equal need (an equity-based notion of regional needs). The construction of non-parametric deficit indicators of public service availability is proposed through a novel application of Data Envelopment Analysis (DEA) models, whose results offer advantages for the evaluation and improvement of decentralised public resource allocation systems. The method introduced in this paper is relevant as a resource allocation guide for the majority of services centrally funded by the public sector in a given country, such as health care, basic and higher education, citizen safety, justice, transportation, environmental protection, leisure, culture, housing and city planning.
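To make the DEA ingredient concrete, here is a minimal input-oriented CCR efficiency computation with scipy's linear-programming routine; the tiny data set, the orientation and the constant-returns assumption are illustrative and not taken from the paper.

```python
# Hypothetical input-oriented CCR (constant returns) efficiency via linprog;
# a stand-in for the DEA building block, not the paper's deficit indicators.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0]])     # inputs:  1 input  x 4 regions
Y = np.array([[1.0, 2.0, 3.0, 3.5]])     # outputs: 1 output x 4 regions

def ccr_input_efficiency(o):
    """Efficiency score theta* of region o (theta* = 1 means on the frontier)."""
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                          # minimise theta
    A_in = np.hstack([-X[:, [o]], X])                    # X @ lam <= theta * x_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # Y @ lam >= y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print([round(ccr_input_efficiency(o), 3) for o in range(4)])
```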

Relevance: 10.00%

Abstract:

The influence of geomorphological and hydrologic conditions on travertine deposit formation has been studied in the Cogolls Valley (Girona, Spain). Results indicate an evolutionary sequence of the Quaternary materials that is related to instabilities created by progressive changes in the valley system.

Relevance: 10.00%

Abstract:

Our procedure to detect moving groups in the solar neighbourhood (Chen et al., 1997) in the four-dimensional space of the stellar velocity components and age has been improved. The method, which takes advantage of non-parametric estimators of the density distribution to avoid any a priori knowledge of the kinematic properties of these stellar groups, now includes the effect of observational errors in the process of selecting moving-group stars, uses a better estimation of the density distribution of the total sample and field stars, and classifies moving-group stars using all the available information. It is applied here to a carefully selected sample of early-type stars with known radial velocities and Strömgren photometry. Astrometric data are taken from the HIPPARCOS catalogue (ESA, 1997), which results in an important decrease in the observational errors with respect to ground-based data and ensures the uniformity of the observed data. Both the improvement of our method and the use of precise astrometric data have allowed us not only to confirm the existence of classical moving groups, but also to detect finer structures that in several cases can be related to kinematic properties of nearby open clusters or associations.
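A minimal sketch of the non-parametric density idea (not the authors' code; the simulated data, the default bandwidth and the factor-of-two overdensity cut are assumptions) is:

```python
# Hypothetical sketch: kernel density estimation in the 4-D (U, V, W, age) space,
# flagging stars whose total-sample density greatly exceeds the field density.
# Simulated data and the factor-of-two cut are assumptions; scipy's default
# (Scott) bandwidth is used instead of any tailored smoothing.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)
field = rng.normal(0.0, [30.0, 20.0, 15.0, 3.0], size=(2000, 4))   # smooth field
group = rng.normal([-10.0, -20.0, -5.0, 0.5], 1.0, size=(100, 4))  # compact group
stars = np.vstack([field, group])

kde_total = gaussian_kde(stars.T)        # density of the total sample
kde_field = gaussian_kde(field.T)        # density of the field model
excess = kde_total(stars.T) / kde_field(stars.T)
print("candidate moving-group members:", int((excess > 2.0).sum()))
```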

Relevance: 10.00%

Abstract:

In an analysis of proper motions of O and B stars contained in the Input Catalogue for Hipparcos, we have found a clear deviation from the expected pattern of systematic motions which can be readily identified with the associations Cygnus OB1 and Cygnus OB9, located near the edge of the Cygnus Superbubble. The anomalous motions are directed outwards from the center of the Superbubble, which is coincident with the association Cygnus OB2. This seems to support the hypothesis of strong stellar and supernova activity in Cygnus OB2 giving rise to the Superbubble and, by means of gravitational instabilities at its boundaries, to Cygnus OB1 and Cygnus OB9. New uvby-beta aperture photometry of selected O and B stars in the area of Cygnus OB1 and Cygnus OB9 is also presented and analyzed in this paper.

Relevance: 10.00%

Abstract:

Within the European project Hydroptimet (INTERREG IIIB-MEDOCC programme), an intercomparison of limited area models (LAMs) has been performed for intense events that caused heavy damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for giving a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift of the forecast error and to identify the error sources that affected each model's forecasts. High-resolution modelling and domain size seem to have a key role in providing a skillful forecast. Further work, including verification against a wider observational data set, is needed to support this statement.
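As an illustration of contingency-table verification (the scores below, POD, FAR and the equitable threat score, are standard choices; the exact set used in the Hydroptimet comparison is not listed here):

```python
# Standard categorical verification scores from a 2x2 contingency table;
# the threshold, synthetic data and score selection here are illustrative only.
import numpy as np

def skill_scores(forecast, observed, threshold):
    f, o = forecast >= threshold, observed >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    correct_neg = np.sum(~f & ~o)
    n = hits + misses + false_alarms + correct_neg
    hits_random = (hits + misses) * (hits + false_alarms) / n
    pod = hits / (hits + misses)                   # probability of detection
    far = false_alarms / (hits + false_alarms)     # false alarm ratio
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return pod, far, ets

rng = np.random.default_rng(5)
obs = rng.gamma(2.0, 10.0, size=500)               # 24-h rainfall totals (mm)
fcst = obs * rng.lognormal(0.0, 0.4, size=500)     # a noisy synthetic "forecast"
print(skill_scores(fcst, obs, threshold=50.0))
```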