116 results for Intractable Likelihood
Abstract:
This paper shows how a high-level matrix programming language may be used to perform Monte Carlo simulation, bootstrapping, estimation by maximum likelihood and GMM, and kernel regression in parallel on symmetric multiprocessor computers or clusters of workstations. Parallelization is implemented in such a way that an investigator may use the programs without any knowledge of parallel programming. A bootable CD that allows rapid creation of a cluster for parallel computing is introduced. Examples show that parallelization can lead to substantial reductions in computational time. A detailed discussion of how the Monte Carlo problem was parallelized is included as an example of how to write parallel programs for Octave.
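The abstract does not reproduce the Octave code itself. As a rough illustration of the pattern it describes (embarrassingly parallel Monte Carlo replications farmed out to worker processes), here is a minimal Python sketch using the standard multiprocessing module; the toy statistic and function names are assumptions, not the paper's implementation.

```python
import numpy as np
from multiprocessing import Pool

def one_replication(seed):
    """One Monte Carlo replication: a toy statistic (mean of 1,000 normals)."""
    rng = np.random.default_rng(seed)
    return rng.normal(size=1_000).mean()

def monte_carlo(n_reps=10_000, n_workers=4):
    """Split replications across worker processes; each gets its own seed."""
    with Pool(n_workers) as pool:
        stats = pool.map(one_replication, range(n_reps))
    return np.array(stats)

if __name__ == "__main__":
    stats = monte_carlo()
    print(stats.mean(), stats.std())
```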
Abstract:
This comment corrects errors in the estimation procedure that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function; at the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may fall outside the interval [0,1]. We solve this problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
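To see why a kernel that takes negative values can produce probability estimates outside [0,1], the following minimal Python sketch computes a Nadaraya-Watson estimate of a participation probability with a fourth-order (sign-changing) Gaussian kernel. It illustrates the general issue only; it is not the estimator of Martins (2001) nor the Klein and Spady (1993) correction, and the data are invented.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def fourth_order_kernel(u):
    # A standard fourth-order Gaussian kernel; it is negative for |u| > sqrt(3).
    return 0.5 * (3 - u**2) * gaussian_kernel(u)

def nw_probability(x0, x, y, h, kernel):
    """Nadaraya-Watson estimate of P(y = 1 | x = x0)."""
    w = kernel((x - x0) / h)
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = (x + rng.normal(size=500) > 0).astype(float)

grid = np.linspace(-3, 3, 61)
p_hat = np.array([nw_probability(g, x, y, 0.5, fourth_order_kernel) for g in grid])
# With a sign-changing kernel the estimates can fall outside [0, 1].
print("min/max of estimated probabilities:", p_hat.min(), p_hat.max())
```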
Abstract:
Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This field has seen very successful developments. Nevertheless, several empirical studies seem to show that the performance of such models is not always adequate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as kurtosis and symmetry, as well as two estimators (method of moments and maximum likelihood), are studied. Two statistical tests are presented: the first tests for homoskedasticity, and the second discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented to illustrate some of the theoretical results. An empirical study is undertaken for the DM-US exchange rate.
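The abstract does not spell out the QMACH recursion, so no attempt is made to reproduce it here. As a stand-in for the maximum likelihood estimator mentioned above, the sketch below fits a plain ARCH(1) model by Gaussian maximum likelihood in Python, which shows the general estimation approach: build the conditional variances recursively, then maximize the likelihood numerically.

```python
import numpy as np
from scipy.optimize import minimize

def arch1_neg_loglik(params, r):
    """Negative Gaussian log-likelihood of an ARCH(1) model:
    r_t = sigma_t * e_t,  sigma_t^2 = omega + alpha * r_{t-1}^2."""
    omega, alpha = params
    if omega <= 0 or alpha < 0:
        return np.inf
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r**2 / sigma2)

# Simulate an ARCH(1) series with omega = 0.1, alpha = 0.4, then re-estimate.
rng = np.random.default_rng(1)
r = np.empty(2000)
r[0] = 0.0
for t in range(1, len(r)):
    r[t] = np.sqrt(0.1 + 0.4 * r[t - 1] ** 2) * rng.normal()

fit = minimize(arch1_neg_loglik, x0=[0.05, 0.2], args=(r,), method="Nelder-Mead")
print(fit.x)  # estimates of (omega, alpha)
```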
Abstract:
The Hausman (1978) test is based on the vector of differences of two estimators. It is usually assumed that one of the estimators is fully efficient, since this simplifies calculation of the test statistic. However, this assumption limits the applicability of the test, since widely used estimators such as generalized method of moments (GMM) or quasi-maximum likelihood (QML) estimators are often not fully efficient. This paper shows that the test may easily be implemented, using well-known methods, when neither estimator is efficient. To illustrate, we present both simulation results and empirical results for utilization of health care services.
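A minimal sketch of the idea, assuming a paired bootstrap is used to estimate the covariance of the contrast (one well-known route when neither estimator is efficient; the paper's exact procedure may differ): the statistic (b1 - b2)' V(b1 - b2)^{-1} (b1 - b2) is compared with a chi-squared distribution. The two estimators and the data below are invented for illustration.

```python
import numpy as np
from scipy import stats

def hausman_stat(b1, b2, v_diff):
    """Hausman statistic (b1 - b2)' V^{-1} (b1 - b2); chi-squared with k degrees
    of freedom under the null that both estimators are consistent."""
    d = b1 - b2
    h = d @ np.linalg.solve(v_diff, d)
    return h, stats.chi2.sf(h, df=d.size)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

def wls(X, y, w):
    return ols(X * w[:, None], y * w)

rng = np.random.default_rng(2)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
w = 1.0 / (1.0 + X[:, 1] ** 2)  # some (possibly misspecified) weights

# Paired bootstrap of the difference, so no efficiency assumption is needed.
diffs = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    diffs.append(ols(X[idx], y[idx]) - wls(X[idx], y[idx], w[idx]))
v_diff = np.cov(np.array(diffs).T)

print(hausman_stat(ols(X, y), wls(X, y, w), v_diff))
```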
Abstract:
We use a threshold seemingly unrelated regressions specification to assess whether the Central and East European countries (CEECs) are synchronized in their business cycles with the Euro-area. This specification is useful in two ways: first, it takes into account the common institutional factors and the similarities across CEECs in their process of economic transition; second, it captures business cycle asymmetries by allowing for the presence of two distinct regimes for the CEECs. As the CEECs are strongly affected by the Euro-area, these regimes may be associated with Euro-area expansions and contractions. We discuss representation, estimation by maximum likelihood, and inference. The methodology is illustrated using monthly industrial production in eight CEECs. The results show that, apart from Lithuania, the CEECs experience “normal” growth when the Euro-area contracts and “high” growth when the Euro-area expands. Given that the CEECs are “catching up” with the Euro-area, this result shows that most CEECs appear synchronized with the Euro-area cycle.
Keywords: Threshold SURE; asymmetry; business cycles; CEECs.
JEL classification: C33; C50; E32.
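As a simplified, single-equation illustration of the threshold idea (the paper estimates a full SUR system), the sketch below concentrates out the regression coefficients and grid-searches the threshold on a Euro-area indicator, which under Gaussian errors is equivalent to maximum likelihood over the threshold. The variable names and data-generating process are invented for the example.

```python
import numpy as np

def two_regime_ssr(y, X, z, c):
    """Sum of squared residuals when observations are split into two regimes by
    whether the threshold variable z exceeds c, with separate OLS in each regime."""
    ssr = 0.0
    for mask in (z <= c, z > c):
        if mask.sum() < X.shape[1] + 1:
            return np.inf
        b = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
        ssr += np.sum((y[mask] - X[mask] @ b) ** 2)
    return ssr

def fit_threshold(y, X, z, grid):
    """Concentrated estimation: grid search over candidate thresholds."""
    ssrs = [two_regime_ssr(y, X, z, c) for c in grid]
    return grid[int(np.argmin(ssrs))]

rng = np.random.default_rng(3)
n = 300
z = rng.normal(size=n)                                  # e.g. Euro-area growth indicator
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = np.where(z > 0.2, X @ [0.5, 1.5], X @ [0.1, 0.4]) + 0.3 * rng.normal(size=n)
print(fit_threshold(y, X, z, np.quantile(z, np.linspace(0.15, 0.85, 71))))
```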
Abstract:
This paper enquires into whether economic sanctions are effective in destabilizing authoritarian rulers. We argue that this effect is mediated by the type of authoritarian regime against which sanctions are imposed. Thus, personalist regimes and monarchies, which are more dependent on aid and resource rents to maintain their patronage networks, are more likely to be affected by sanctions. In contrast, single-party and military regimes are able to maintain (and even increase) their tax revenues and to reallocate their expenditures and so increase their levels of cooptation. Data on sanction episodes, authoritarian rulers and regimes covering the period 1946–2000 have allowed us to test our hypotheses. To do so, duration models have been run, and the results confirm that personalist autocrats are more vulnerable to foreign pressure. Concretely, the analysis of the modes of exit reveals that sanctions increase the likelihood of an irregular change of ruler, such as a coup. Sanctions are basically ineffective when targeting single-party or military regimes.
Abstract:
The intensive and prolonged use of high-performance computers to run computationally intensive applications, together with the large number of components they contain, drastically increases the probability of failures occurring during operation. The goal of this work is to address fault tolerance in high-performance interconnection networks, starting from the design of fault-tolerant routing policies. We aim to handle a given number of link and node failures, taking into account their impact factors and probability of occurrence. To do so, we exploit the redundancy of existing communication paths, building on adaptive routing approaches capable of covering the four phases of fault tolerance: error detection, damage containment, error recovery, and fault treatment with continuity of service. Experiments show a performance degradation of less than 5%. Future work will address the loss of in-transit information.
Abstract:
This paper empirically analyses the hypothesis of the existence of a dual market for contracts in local services. Large firms that operate on a national basis control the contracts for delivery in the most populated and/or urban municipalities, whereas small firms that operate at a local level hold the contracts in the least populated and/or rural municipalities. The dual market implies high concentration and dominance by major firms in large municipalities, and local monopolies in the smaller ones. This market structure is harmful to competition for the market, as the effective number of competitors is low across all municipalities. Thus, it reduces the likelihood of obtaining cost savings from privatization.
Abstract:
This paper empirically studies the effects of service offshoring on white-collar employment, using data for more than one hundred U.S. occupations. A model of firm behavior based on separability makes it possible to derive the labor demand elasticity with respect to service offshoring for each occupation. Estimation is performed by quasi-maximum likelihood to account for the high degree of censoring in the employment variable. The estimated elasticities are then related to proxies for the skill level and the degree of tradability of the occupations. Results show that service offshoring increases high-skilled employment and decreases medium- and low-skilled employment. Within each skill group, however, service offshoring penalizes tradable occupations and benefits non-tradable occupations.
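The abstract does not state the exact quasi-likelihood used; one common QML choice for a non-negative outcome with many zeros is Poisson pseudo-maximum likelihood with robust standard errors, sketched below in Python with statsmodels on invented toy data. This is an assumption about the estimator class, not the paper's specification.

```python
import numpy as np
import statsmodels.api as sm

# Toy data: employment counts with many zeros (heavy censoring at zero),
# an "offshoring" regressor, and a constant.
rng = np.random.default_rng(4)
n = 1000
offshoring = rng.normal(size=n)
X = sm.add_constant(offshoring)
employment = rng.poisson(np.exp(0.5 - 0.3 * offshoring))  # many zeros by construction

# Poisson quasi-maximum likelihood (pseudo-ML): consistent for the conditional
# mean even if the Poisson distribution is wrong, reported with robust SEs.
qml = sm.GLM(employment, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(qml.params)  # elasticity-type coefficient on offshoring
print(qml.bse)     # robust standard errors
```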
Abstract:
Research in business dynamics has been advancing rapidly in recent years, but the translation of the new knowledge into industrial policy design is slow. One striking aspect in the policy area is that, although research and analysis do not identify a specific optimal rate of business creation and business exit, governments everywhere have adopted business start-up support programs on the implicit principle that more is better. The purpose of this article is to contribute to understanding the implications of the available research for policy design. Economic analysis has identified firm heterogeneity as the most salient characteristic of industrial dynamics, so a better knowledge of the different types of entrepreneur, their behavior, and their specific contribution to innovation and growth would enable us to see into the ‘black box’ of business dynamics and improve the design of appropriate public policies. The empirical analysis performed here shows that not all new businesses have the same impact on relevant economic variables, and that self-employment is of a quite different economic nature to that of firms with employees. It is argued that public programs should not promote indiscriminate entry but rather give priority to able entrants with survival capacities. Survival of entrants is positively related to their size at birth. Innovation and investment improve the likelihood of survival of new manufacturing start-ups. Investment in R&D increases the risk of failure in new firms, although it improves the competitiveness of incumbents.
Abstract:
This study set out to bring into practice a system that had only been proposed theoretically, since the accumulated experience, even before the organic reform of 2003, starts from the figure of the common services, with a criterion of efficiency, rationality and economy for investment in the Catalan Administration of Justice. Taking on and giving a concrete legal response to all the technical challenges can be done from a dogmatic perspective, although the time elapsed also reveals practical demands in the field of procedural doctrine, the socio-cultural context and, naturally, labour needs. In this sense, the objectives are directed towards guaranteeing a material, conceptual and everyday closeness of the Administration of Justice to the citizen, established as an essential axis of the system, bearing in mind the sometimes incomprehensible spending on a reality that has historically been neglected and under-resourced, despite the budgetary effort made in recent years, which, even so, has not prevented serious inconsistencies caused by a lack of rationality and efficiency in the daily use made by the operators involved. The aim is to achieve, through the newly structured judicial office system, an effective and economical improvement in the state of justice in the country, especially with regard to procedural delays, and also to call for legal reforms, particularly of a procedural nature, that do not amount to mere formal adjustments but have a substantive impact through improvements that legal scholarship has long been demanding. The goals defined from a theoretical standpoint have been achieved, and the most significant conceptual problems and practical hypotheses have been addressed, establishing guidelines for responses after first identifying the questions under debate and the usual conflicts that must be confronted. To this end, the applicable legislation and the case law that gives it practical currency have been studied, without overlooking academic doctrine. The current functioning of the judicial office and its plans for the immediate future have been reviewed, highlighting the new technologies in use and planned, as well as the strengths noted and criticisms raised by all the legal operators involved.
Abstract:
It has been argued that by truncating the sample space of the negative binomial and of the inverse Gaussian-Poisson mixture models at zero, one is allowed to extend the parameter space of the model. Here that is proved to be the case for the more general three-parameter Tweedie-Poisson mixture model. It is also proved that the distributions in the extended part of the parameter space are not the zero truncation of mixed Poisson distributions and that, other than for the negative binomial, they are not mixtures of zero-truncated Poisson distributions either. By extending the parameter space one can improve the fit when the frequency of ones is larger and the right tail is heavier than the unextended model allows. Considering the extended model also allows one to use the basic maximum likelihood based inference tools when parameter estimates fall in the extended part of the parameter space, and hence when the m.l.e. does not exist under the unextended model. This extended truncated Tweedie-Poisson model is shown to be useful in the analysis of word and species frequency count data.
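As a minimal illustration of zero truncation in the negative binomial special case mentioned above (the Tweedie-Poisson family itself is not implemented here), the sketch below maximizes the zero-truncated negative binomial likelihood numerically in Python; the parameterization and toy data are assumptions made for the example.

```python
import numpy as np
from scipy import stats, optimize

def ztnb_neg_loglik(params, x):
    """Negative log-likelihood of a zero-truncated negative binomial:
    P(X = x | X > 0) = P(X = x) / (1 - P(X = 0)) for x = 1, 2, ..."""
    log_size, logit_p = params
    size, p = np.exp(log_size), 1.0 / (1.0 + np.exp(-logit_p))
    logpmf = stats.nbinom.logpmf(x, size, p)
    log_p0 = stats.nbinom.logpmf(0, size, p)
    return -np.sum(logpmf - np.log1p(-np.exp(log_p0)))

# Toy frequency-count data: counts of 1, 2, 3, ... (zeros are unobservable).
rng = np.random.default_rng(5)
raw = rng.negative_binomial(0.8, 0.3, size=5000)
x = raw[raw > 0]

fit = optimize.minimize(ztnb_neg_loglik, x0=[0.0, 0.0], args=(x,), method="Nelder-Mead")
size_hat, p_hat = np.exp(fit.x[0]), 1.0 / (1.0 + np.exp(-fit.x[1]))
print(size_hat, p_hat)
```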
Abstract:
In this study, we analyse the degree of polarisation (a concept fundamentally different from that of inequality) in the international distribution of CO2 emissions per capita in the European Union. It is analytically relevant to examine the degree of instability inherent in a distribution and, in the case analysed, the likelihood that the distribution and its evolution will increase or decrease the chances of reaching an agreement. Two approaches were used to measure polarisation: the endogenous approach, in which countries are grouped according to their similarity in terms of emissions, and the exogenous approach, in which countries are grouped geographically. Our findings indicate a clear decrease in polarisation since the mid-1990s, which can essentially be explained by the fact that the different groups of countries have converged (i.e. antagonism among the CO2 emitters has decreased) as the contribution of energy intensity to between-group differences has decreased. This lower degree of polarisation in the CO2 distribution suggests a situation more conducive to reaching EU-wide agreements on the mitigation of CO2 emissions.
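The abstract does not say which polarisation measure is computed; a standard choice in this literature is the Esteban and Ray (1994) index, P(alpha) = sum_i sum_j pi_i^(1+alpha) pi_j |y_i - y_j|, where pi are group population shares and y are group means. The sketch below computes it under that assumption, with invented group shares and per-capita emission levels.

```python
import numpy as np

def esteban_ray(pi, y, alpha=1.0):
    """Esteban-Ray polarisation index: sum_i sum_j pi_i^(1+alpha) * pi_j * |y_i - y_j|,
    where pi are group population shares and y are group means (e.g. per-capita CO2)."""
    pi, y = np.asarray(pi, float), np.asarray(y, float)
    return np.sum(np.outer(pi ** (1.0 + alpha), pi) * np.abs(np.subtract.outer(y, y)))

# Toy example: two equal-sized groups of countries. When the group means converge,
# antagonism between groups falls and the index decreases.
shares = [0.5, 0.5]
print(esteban_ray(shares, [4.0, 12.0]))  # far apart -> higher polarisation
print(esteban_ray(shares, [7.0, 9.0]))   # converged -> lower polarisation
```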
Abstract:
This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions on the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and I analyze the likelihood of the different specifications. I show that the existence of plausible alternative representations of the economy forces the policymaker to face the problem of model uncertainty. Then, I use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
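As a schematic of how the two criteria differ, the sketch below evaluates hypothetical policy rules against hypothetical candidate models by posterior-weighted (Bayesian) loss and by worst-case (minimax) loss. All numbers are illustrative assumptions, not results from the paper.

```python
import numpy as np

# Hypothetical losses of each policy rule under each candidate model of the
# oil-price transmission channel (rows: rules, columns: models).
losses = np.array([
    [1.00, 1.40, 2.10],   # standard Taylor rule
    [1.05, 1.20, 1.60],   # inertial rule (interest-rate smoothing)
    [1.10, 1.15, 1.30],   # inertial rule that also responds to the real oil price
])
posterior = np.array([0.5, 0.3, 0.2])   # posterior model probabilities

bayes_loss = losses @ posterior          # Bayesian (model-averaged) criterion
minimax_loss = losses.max(axis=1)        # worst case over models

print("Bayes-optimal rule:  ", int(np.argmin(bayes_loss)))
print("Minimax-optimal rule:", int(np.argmin(minimax_loss)))
```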
Abstract:
Land cover classification is a key research field in remote sensing and land change science, as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land cover classification remains a difficult task, and it is especially challenging in heterogeneous tropical landscapes where such maps are nonetheless of great importance. The present study aims to establish an efficient classification approach to accurately map all broad land cover classes in a large, heterogeneous tropical area of Bolivia, as a basis for further studies (e.g., land cover-land use change). Specifically, we compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbour and four different support vector machines, SVM), and hybrid classifiers, using both hard and soft (fuzzy) accuracy assessments. In addition, we test whether the inclusion of a textural index (homogeneity) in the classifications improves their performance. We classified Landsat imagery for two dates corresponding to dry and wet seasons and found that non-parametric, and particularly SVM, classifiers outperformed both parametric and hybrid classifiers. We also found that the use of the homogeneity index along with reflectance bands significantly increased the overall accuracy of all the classifications, but particularly of the SVM algorithms. We observed that improvements in producer’s and user’s accuracies through the inclusion of the homogeneity index differed depending on the land cover class. Early-growth/degraded forests, pastures, grasslands and savanna were the classes that improved most, especially with the SVM radial basis function and SVM sigmoid classifiers, though with both classifiers all land cover classes were mapped with producer’s and user’s accuracies of around 90%. Our approach seems very well suited to accurately mapping land cover in tropical regions, and thus has the potential to contribute to conservation initiatives, climate change mitigation schemes such as REDD+, and rural development policies.
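As a generic sketch of the kind of classifier described (not the study's actual pipeline), the Python code below trains an RBF support vector machine on reflectance bands plus a precomputed homogeneity texture feature using scikit-learn. The data are randomly generated placeholders, so the reported accuracy is meaningless here; real work would sample pixels from imagery and reference polygons.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Toy per-pixel feature matrix: reflectance bands plus a precomputed GLCM
# homogeneity value; labels are land cover classes.
rng = np.random.default_rng(7)
n = 2000
bands = rng.random((n, 6))            # 6 reflectance bands
homogeneity = rng.random((n, 1))      # texture feature (assumed precomputed)
X = np.hstack([bands, homogeneity])
y = rng.integers(0, 5, size=n)        # 5 land cover classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
svm_rbf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
svm_rbf.fit(X_tr, y_tr)
print("overall accuracy:", accuracy_score(y_te, svm_rbf.predict(X_te)))
```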