Abstract:
Consider a model with parameter phi and an auxiliary model with parameter theta. Let phi be randomly sampled from a given density over the known parameter space. Monte Carlo methods can be used to draw simulated data and compute the corresponding estimate of theta, say theta_tilde. A large set of tuples (phi, theta_tilde) can be generated in this manner. Nonparametric methods may be used to fit the function E(phi | theta_tilde = a) from these tuples. It is proposed to estimate phi using the fitted E(phi | theta_tilde = theta_hat), where theta_hat is the auxiliary estimate computed from the real sample data. Under certain assumptions, this estimator is consistent and asymptotically normally distributed. Monte Carlo results for dynamic panel data and vector autoregressions show that it can have very attractive small-sample properties. Confidence intervals can be constructed from the quantiles of the phi values for which theta_tilde is close to theta_hat. Such confidence intervals are found to have very accurate coverage.
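A minimal sketch of the simulation procedure described above, using a toy AR(1) model in which phi is the autoregressive coefficient and the auxiliary estimate theta_tilde is the (small-sample biased) OLS slope. The uniform sampling density, sample size, bandwidth and interval window are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T, R = 30, 20000  # sample length and number of Monte Carlo replications (assumed)

def simulate_ar1(phi, T, rng):
    """Simulate an AR(1) series y_t = phi * y_{t-1} + e_t."""
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = phi * y[t - 1] + rng.standard_normal()
    return y

def ols_slope(y):
    """Auxiliary estimator theta_tilde: OLS regression of y_t on y_{t-1}."""
    x, z = y[:-1], y[1:]
    return np.dot(x, z) / np.dot(x, x)

# Generate tuples (phi, theta_tilde) by sampling phi from a chosen density.
phis = rng.uniform(-0.9, 0.9, R)
thetas = np.array([ols_slope(simulate_ar1(p, T, rng)) for p in phis])

def fitted_mean(a, h=0.02):
    """Nadaraya-Watson estimate of E(phi | theta_tilde = a), Gaussian kernel."""
    w = np.exp(-0.5 * ((thetas - a) / h) ** 2)
    return np.sum(w * phis) / np.sum(w)

# On real data theta_hat would come from the sample; a simulated one stands in.
theta_hat = ols_slope(simulate_ar1(0.5, T, rng))
print("bias-corrected estimate of phi:", fitted_mean(theta_hat))

# Confidence interval: quantiles of phi among draws with theta_tilde near theta_hat.
near = np.abs(thetas - theta_hat) < 0.02
print("90% interval:", np.quantile(phis[near], [0.05, 0.95]))
```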
Abstract:
The aim of this paper is to analyse the Catalan economy using a National Accounting Matrix with environmental accounts (NAMEA) built with 2001 data. We focus on the analysis of the emission multipliers and also analyse the impact of a 10% reduction in greenhouse emissions on those multipliers. This emission-reduction percentage would bring the Catalan economy into compliance with the maximum emissions level allowed by the Kyoto Protocol. We consider three possible scenarios that would allow this goal to be met. First, we simulate a 10% reduction in regional emissions and a 5% drop in the endogenous income of the multiplier model (production, factorial and private income). Second, we simulate a 10% reduction in emissions and a 10% increase in endogenous income. Finally, we simulate a 10% reduction in emissions and a 5% increase in endogenous income. Additionally, we analyse the decomposition of the emission multipliers into own effects, open effects and circular effects to capture the different channels of the emission generation process. Keywords: NAMEA, emission multipliers, Kyoto Protocol.
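As a hedged illustration of how emission multipliers are typically obtained in a NAMEA/SAM framework: with A the matrix of average expenditure propensities of the endogenous accounts and e the vector of emission coefficients, the emission multipliers are e'(I - A)^{-1}. The 3-account matrix below is invented for the example, not the Catalan data.

```python
import numpy as np

# Invented 3-account example, NOT the Catalan NAMEA data.
A = np.array([[0.20, 0.10, 0.05],   # average expenditure propensities of the
              [0.15, 0.25, 0.10],   # endogenous accounts (production, factors,
              [0.10, 0.05, 0.30]])  # private income in the multiplier model)
e = np.array([0.8, 0.3, 0.1])       # emissions per unit of endogenous income

M = np.linalg.inv(np.eye(3) - A)    # accounting multipliers (I - A)^{-1}
emission_multipliers = e @ M        # total emissions generated per unit of
print(emission_multipliers)         # exogenous injection into each account

# A 10% emission cut scales e; the income scenarios rescale the injections.
print(0.9 * e @ M)
```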
Abstract:
This paper investigates vulnerability to poverty in Haiti. Research on vulnerability in developing countries has been scarce owing to the heavy data requirements of vulnerability studies (e.g. panels or long series of cross-sections). The methodology adopted here allows vulnerability to poverty to be assessed by exploiting the short panel structure of data nested at different levels. The decomposition method reveals that vulnerability in Haiti is largely a rural phenomenon and that schooling correlates negatively with vulnerability. Most importantly, among the different shocks affecting household income, meso-level shocks are found to be in general far more important than covariate shocks. This finding points to interesting policy implications for decentralizing policies aimed at alleviating vulnerability to poverty.
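A rough sketch of the kind of decomposition the abstract alludes to: the variance of residual log income is split into a meso-level component shared by households in the same locality and an idiosyncratic household component. The data layout, variable names and simulated magnitudes are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical nested data: 1000 households in 50 localities (meso level).
loc_effect = rng.normal(scale=0.6, size=50)          # shared locality shocks
df = pd.DataFrame({"locality": np.repeat(np.arange(50), 20)})
df["log_income_residual"] = (loc_effect[df["locality"]]
                             + rng.normal(scale=0.8, size=1000))

meso_means = df.groupby("locality")["log_income_residual"].transform("mean")
var_meso = meso_means.var()                           # meso-level shocks
var_idio = (df["log_income_residual"] - meso_means).var()  # household shocks
print("meso share of variance:", var_meso / (var_meso + var_idio))
```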
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
The peace process in Northern Ireland demonstrates that new sovereignty formulas need to be explored in order to meet the demands of the populations and territories in conflict. The profound transformation of the classic symbolic elements of the nation-state within the context of the European Union has greatly contributed to the prospects for a resolution of this old conflict. Today's discussions are focused on the search for instruments of shared sovereignty that are adapted to a complex and plural social reality. This new approach to finding a solution to the Irish conflict is particularly relevant to the Basque debate about formulating creative and modern solutions to similar conflicts over identity and sovereignty. The notion of shared sovereignty implemented in Northern Ireland (a formula for complex interdependent relations) is of significant relevance to the broader international community and is likely to become an increasingly potent and far-reaching model for conflict resolution and peace building.
Abstract:
This PhD project aims to study paraphrasing, initially understood as the different ways in which the same content can be expressed linguistically. We will examine this concept in depth, trying to define and delimit its scope more accurately; in doing so, we also aim to discover which kinds of structures and phenomena it covers. Although some paraphrase typologies exist, the great majority only apply to English and focus on lexical and syntactic transformations. Our intention is to go further and propose a paraphrase typology for Spanish and Catalan that combines lexical, syntactic, semantic and pragmatic knowledge. We apply a bottom-up methodology, collecting evidence of this phenomenon from the data. For this purpose, we are initially using the Spanish Wikipedia as our corpus: the internal structure of this encyclopedia makes it a good resource for extracting paraphrase examples for our investigation. This empirical approach will be complemented with linguistic knowledge, and by comparing and contrasting our results with previously proposed paraphrase typologies in order to extend the set of paraphrase forms found in our corpus. The fact that the same content can be expressed in many different ways presents a major challenge for Natural Language Processing (NLP) applications, so research on paraphrasing has recently been attracting increasing attention in the fields of NLP and Computational Linguistics. The results obtained in this investigation would be of great interest for many of these applications.
Abstract:
Based on the critique of Ahumada et al. (2007, Review of Income and Wealth), we revise existing estimates of the size of the German underground economy. Among other things, it turns out that most of these estimates are untenable and that the tax-pressure-induced size of the German underground economy may be much lower than previously thought. To that extent, German policy and law makers have been misguided during the last three decades. We therefore introduce the Modified-Cash-Deposit-Ratio (MCDR) approach, which is not subject to this critique, and apply it to Germany for the period 1960 to 2008. JEL: O17, Q41, C22. Keywords: underground economy, shadow economy, cash-deposit ratio, currency demand approach, MIMIC approach
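The abstract does not spell out the MCDR mechanics, so the sketch below shows only the classic cash-deposit-ratio logic that this family of methods builds on (Gutmann-style): currency holdings in excess of a benchmark ratio to deposits are attributed to the underground economy. All figures are invented; this is not the paper's MCDR variant.

```python
# Classic cash-deposit-ratio logic (NOT the paper's MCDR; figures invented).
currency = 220.0   # currency in circulation, bn EUR
deposits = 1000.0  # demand deposits, bn EUR
k0 = 0.18          # benchmark cash-deposit ratio from a 'clean' base period
velocity = 5.0     # assumed income velocity of money

excess_cash = currency - k0 * deposits   # cash attributed to shadow activity
shadow_output = velocity * excess_cash   # implied underground income
print(f"implied underground economy: {shadow_output:.0f} bn EUR")
```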
Abstract:
The aim of this paper is to analyse the co-location patterns of industries and firms. We study the spatial distribution of firms from different industries at a microgeographic level and, from this, identify the main reasons for their locational behaviour. The empirical application uses data from the Mercantile Registers of Spanish firms (manufacturers and services). Inter-sectoral linkages are shown using self-organizing maps. Keywords: clusters, microgeographic data, self-organizing maps, firm location. JEL classification: R10, R12, R34
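A compact sketch of the self-organizing-map idea used to display inter-sectoral linkages: a grid of codebook vectors is trained so that nearby units represent similar input profiles. This minimal NumPy implementation with random data and a small grid is illustrative only, not the paper's actual estimation.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 5))          # e.g., industry feature vectors (invented)

gx, gy, d = 6, 6, X.shape[1]      # a 6x6 map
W = rng.random((gx, gy, d))       # codebook vectors
coords = np.array([[i, j] for i in range(gx)
                   for j in range(gy)]).reshape(gx, gy, 2)

for t in range(2000):
    x = X[rng.integers(len(X))]
    # Find the best-matching unit (closest codebook vector).
    dist = np.linalg.norm(W - x, axis=2)
    bmu = np.unravel_index(dist.argmin(), dist.shape)
    # Neighbourhood width and learning rate decay over time.
    sigma = 3.0 * np.exp(-t / 1000)
    lr = 0.5 * np.exp(-t / 1000)
    g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2) / (2 * sigma**2))
    W += lr * g[..., None] * (x - W)

# Industries mapped to nearby units have similar profiles (co-location patterns).
```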
Abstract:
This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods, implemented in the statistical package R. The parametric analysis is limited to the estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis involves kernel density estimation, and we illustrate the benefits of transforming the data before applying kernel-based methods. We use a log-transformation and an optimal transformation, chosen from a class of transformations, that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, the Appendix of the paper includes all the R code used in the analysis, so that readers, both students and educators, can fully explore the techniques described.
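A short sketch of the transformation trick described above (the paper's own code is in R; this Python/NumPy version is an illustrative analogue): estimate a kernel density on the log claims, where the data are roughly symmetric, then back-transform, which for a density means f_X(x) = f_Y(log x) / x. The simulated claim amounts are invented.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
claims = rng.lognormal(mean=7.0, sigma=1.2, size=500)  # invented claim amounts

# Kernel density on the log scale, where the data are roughly symmetric.
kde_log = gaussian_kde(np.log(claims))

def claim_density(x):
    """Back-transformed density: f_X(x) = f_Y(log x) / x."""
    x = np.asarray(x, dtype=float)
    return kde_log(np.log(x)) / x

grid = np.linspace(claims.min(), claims.max(), 5)
print(claim_density(grid))
```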
Abstract:
The studies of Giacomo Becattini concerning the notion of the "Marshallian industrial district" have led to a revolution in the field of economic development around the world. This paper offers an interpretation of the methodology adopted by Becattini, whose roots are clearly Marshallian: Becattini proposes a return to economics as a complex social science that operates in historical time. We adopt a Schumpeterian approach to method in economic analysis in order to highlight the similarities between Marshall's and Becattini's approaches. Finally, the paper uses the distinction between logical time, real time and historical time, which enables us to study the "localized" economic process in a Becattinian way.
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
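A minimal sketch of biasing a classical greedy heuristic with a non-uniform (here geometric) distribution, in the spirit described above: instead of always taking the best remaining candidate, the k-th best is picked with geometrically decaying probability, so repeated runs yield many alternative good solutions. The knapsack-style instance and the beta parameter are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def biased_randomized_greedy(values, weights, capacity, beta=0.3):
    """Greedy knapsack heuristic with geometrically biased selection:
    the k-th best remaining candidate is picked with prob ~ beta*(1-beta)**k."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    chosen, load = [], 0.0
    while order:
        k = min(rng.geometric(beta) - 1, len(order) - 1)  # biased, not uniform
        i = order.pop(k)
        if load + weights[i] <= capacity:
            chosen.append(i)
            load += weights[i]
    return chosen

values = [60, 100, 120, 70, 90]
weights = [10, 20, 30, 15, 25]
# Re-running the biased heuristic quickly yields alternative good solutions.
for _ in range(3):
    sol = biased_randomized_greedy(values, weights, capacity=50)
    print(sol, sum(values[i] for i in sol))
```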
Abstract:
This paper proposes a new methodology to compute Value at Risk (VaR) for quantifying losses in credit portfolios. We approximate the cumulative distribution of the loss function by a finite combination of Haar wavelet basis functions and calculate the coefficients of the approximation by inverting its Laplace transform. The Wavelet Approximation (WA) method is especially suitable for non-smooth distributions, which often arise in small or concentrated portfolios when the hypotheses of the Basel II formulas are violated. To test the methodology we take the Vasicek one-factor portfolio credit loss model as our model framework. WA is an accurate, robust and fast method that allows VaR to be estimated much more quickly than with a Monte Carlo (MC) method at the same level of accuracy and reliability.
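For context, a sketch of the Vasicek one-factor framework the paper tests against: in the large-homogeneous-portfolio limit the loss CDF has a closed form, so VaR at level alpha can be computed directly. The WA method itself (Haar wavelets plus Laplace inversion) is more involved and not reproduced here; the PD, rho and alpha values are illustrative.

```python
from scipy.stats import norm

# Vasicek one-factor, large-homogeneous-portfolio limit (illustrative inputs).
PD, rho, alpha = 0.02, 0.15, 0.999  # default prob., asset correlation, VaR level

# Loss-fraction quantile:
# VaR_a = Phi((Phi^{-1}(PD) + sqrt(rho) * Phi^{-1}(a)) / sqrt(1 - rho))
var_alpha = norm.cdf((norm.ppf(PD) + rho**0.5 * norm.ppf(alpha))
                     / (1 - rho)**0.5)
print(f"{alpha:.1%} VaR (loss fraction): {var_alpha:.4f}")
```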
Abstract:
This article focuses on business risk management in the insurance industry. A methodology is proposed for estimating the profit loss caused by each customer in the portfolio due to policy cancellation. Using data from a European insurance company, customer behaviour over time is analysed in order to estimate the probability of policy cancellation and the resulting potential profit loss. Customers may hold contracts in up to two different lines of business: motor insurance and other diverse insurance (such as home contents, life or accident insurance). Implications for understanding customer cancellation behaviour as the core of business risk management are outlined.
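A hedged sketch of the core quantity in the abstract: expected profit loss per customer = probability of cancellation x profit at stake. The logistic model, the features (tenure, number of lines of business) and the simulated profits are illustrative placeholders, not the company's data or the paper's exact model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
# Invented features: tenure (years) and lines of business held (1 or 2).
X = np.column_stack([rng.uniform(0, 10, n), rng.integers(1, 3, n)])
p_true = 1 / (1 + np.exp(0.4 * X[:, 0] + 0.8 * X[:, 1] - 2))
cancelled = (rng.random(n) < p_true).astype(int)
profit = rng.gamma(2.0, 150.0, n)        # invented per-customer profit

model = LogisticRegression().fit(X, cancelled)
p_cancel = model.predict_proba(X)[:, 1]  # estimated cancellation probability

expected_loss = p_cancel * profit        # profit at risk from cancellation
print("portfolio profit at risk:", round(expected_loss.sum(), 2))
```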
Abstract:
Solving multi-stage oligopoly models by backward induction can easily become a complex task when firms are multi-product and demands are derived from a nested logit framework. This paper shows that, under the assumption that within-segment firm shares are equal across segments, the analytical expression for equilibrium profits can be substantially simplified. The size of the error arising when this condition does not hold perfectly is also computed; numerical examples show that the error is in general rather small. This assumption therefore buys analytical tractability in a class of models that has been used to address relevant policy questions, such as firm entry into an industry or the relation between competition and location. The simplifying approach proposed in this paper is aimed at helping to improve these types of models so that they can deliver more accurate recommendations.
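To make the setting concrete, a sketch of nested logit market shares for products grouped into segments (nests): s_j = s_{j|g} * s_g, where s_{j|g} is the within-segment share and s_g the segment share built from inclusive values. The utilities and nesting parameter are invented, and the outside good is omitted for brevity; the paper's simplification applies when each firm's within-segment share is the same in every segment.

```python
import numpy as np

lam = 0.6                       # nesting parameter (illustrative)
# Mean utilities: rows = segments (nests), columns = products.
v = np.array([[1.0, 0.5, 0.2],
              [0.8, 0.8, 0.1]])

expv = np.exp(v / lam)
D = expv.sum(axis=1)            # inclusive values per segment
within = expv / D[:, None]      # within-segment shares s_{j|g}
seg = D**lam / (D**lam).sum()   # segment shares s_g
shares = within * seg[:, None]  # unconditional shares s_j = s_{j|g} * s_g
print(shares)

# The simplification applies when the rows of 'within' coincide across segments.
```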