998 results for Econometric methods


Relevance:

60.00%

Publisher:

Abstract:

A history of specialties in economics since the late 1950s is constructed on the basis of a large corpus of documents from economics journals. The production of this history relies on a combination of algorithmic methods that avoid subjective assessments of the boundaries of specialties: bibliographic coupling, automated community detection in dynamic networks, and text mining. These methods uncover a structuring of economics around recognizable specialties with some significant changes over the period covered (1956–2014). Among our results, especially noteworthy are (1) the clear-cut existence of ten families of specialties, (2) the disappearance in the late 1970s of a specialty focused on general economic theory, (3) the dispersal of the econometrics-centered specialty in the early 1990s and the ensuing importance of specific econometric methods for the identity of many specialties since the 1990s, and (4) the low level of specialization of individual economists throughout the period in contrast to physicists as early as the late 1960s.
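No code accompanies the abstract; the following minimal sketch illustrates the two network steps on toy data, using networkx's greedy modularity communities as a stand-in for the dynamic community detection the authors describe (all paper and reference labels are hypothetical):

    # Sketch of bibliographic coupling + community detection on a toy corpus.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Each paper maps to the set of references it cites (toy example).
    references = {
        "p1": {"r1", "r2", "r3"},
        "p2": {"r2", "r3", "r4"},
        "p3": {"r5", "r6"},
        "p4": {"r5", "r6", "r7"},
    }

    # Bibliographic coupling: two papers are linked when they share
    # references, weighted by the size of the shared reference set.
    G = nx.Graph()
    papers = list(references)
    for i, a in enumerate(papers):
        for b in papers[i + 1:]:
            shared = len(references[a] & references[b])
            if shared:
                G.add_edge(a, b, weight=shared)

    # Communities in the coupling network stand in for "specialties".
    for community in greedy_modularity_communities(G, weight="weight"):
        print(sorted(community))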

Relevance:

60.00%

Publisher:

Abstract:

The literature refers to the relationship between the socio-economic status of parents and children as intergenerational social mobility. The scope of this mobility encompasses different aspects such as educational attainment, income, wealth, prestige and occupational status. Intergenerational occupational mobility in particular is an interesting topic in the economic literature because it is positively associated with economic achievement and professional success. Low mobility implies that human capital, skills and talent can be misallocated. As a consequence, workers' effort, motivation and productivity could be negatively affected, which would have adverse effects on economic growth and competitiveness. This paper studies the evolution of intergenerational social mobility in Spain during the 21st century. The methodology involves associating the National Classification of Occupations (CNO-94) with the New International Socio-economic Index of Occupational Status (ISEI-08) in order to establish a socio-economic hierarchy. Once the occupational ranking is defined, we use statistical and econometric methods to assess the occupational transitions between fathers and children and to analyse the covariates' effects on these transitions, including the children's educational attainment as an explanatory variable. The data come from the 2005 and 2011 Living Conditions Survey (INE, 2005, 2011). The results are displayed by distinguishing children according to their birth cohort.
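As a hedged illustration of the transition analysis described above, this sketch builds a row-normalised father-to-child transition matrix from toy occupational ranks; the column names and data are illustrative, not the survey variables, and the paper's covariate models are richer:

    # Toy father -> child occupational transition matrix.
    import pandas as pd

    df = pd.DataFrame({
        "father_rank": ["low", "low", "mid", "mid", "high", "high"],
        "child_rank":  ["low", "mid", "mid", "high", "high", "high"],
    })

    # Row-normalised transition probabilities P(child rank | father rank).
    transitions = pd.crosstab(df["father_rank"], df["child_rank"],
                              normalize="index")
    print(transitions)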

Relevance:

60.00%

Publisher:

Abstract:

We survey articles explaining hedge fund returns, which rely largely on non-linear multifactor models that capture hedge funds' non-linear pay-offs and exposures. We provide an integrated view of the implicit-factor and statistical-factor models that are largely able to explain the hedge fund return-generating process. We present their evolution through time by discussing pioneering studies that made a significant contribution to knowledge, as well as recent innovative studies that examine hedge fund exposures using advanced econometric methods. This is the first review to analyze very recent studies that explain a large part of hedge fund variation. We conclude by presenting some gaps for future research.


Relevance:

60.00%

Publisher:

Abstract:

The success of regional development policies depends on the homogeneity of the territorial units. This paper proposes a framework for obtaining homogeneous territorial clusters based on a Pareto frontier that considers multiple criteria related to territories' endogenous resources, economic profile and socio-cultural features. The framework is developed in two phases. First, the criteria correlated with development at the territorial-unit level are determined through statistical and econometric methods. Then, a multi-criteria approach allocates each territorial unit (parish) to a territorial agglomerate according to the Pareto frontier established.
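A minimal sketch of the Pareto-frontier step, assuming toy criterion scores where higher is better; the paper's criteria and allocation rule are more elaborate:

    # Minimal Pareto-frontier filter over territorial units scored on
    # several criteria (higher is better). Data are illustrative.
    import numpy as np

    scores = np.array([
        [0.8, 0.2, 0.5],   # unit A
        [0.6, 0.6, 0.6],   # unit B
        [0.4, 0.9, 0.3],   # unit C
        [0.3, 0.3, 0.2],   # unit D (dominated by B)
    ])

    def pareto_mask(points):
        """True for points not dominated by any other point."""
        n = len(points)
        mask = np.ones(n, dtype=bool)
        for i in range(n):
            for j in range(n):
                # j dominates i: at least as good everywhere, better somewhere.
                if (i != j and np.all(points[j] >= points[i])
                        and np.any(points[j] > points[i])):
                    mask[i] = False
                    break
        return mask

    print(pareto_mask(scores))  # [ True  True  True False]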

Relevance:

40.00%

Publisher:

Abstract:

This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are analysed. The first is an econometric problem concerning the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed, and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes the optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments were conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use, with data extracted from the 2009 National Household Travel Survey.

The second part of this work offers a comprehensive statistical analysis of free-flow speed distributions, applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles in its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability. A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Overall, this work contributes to the transportation-modeling literature by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random-effects models for free-flow speed estimation that take the survey design into account. All methods are rigorously validated on both real and simulated data.
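For the multivariate normal probabilities needed by the unordered (probit) structure, scipy exposes Genz-style numerical integration through multivariate_normal.cdf; a minimal sketch with an illustrative trivariate covariance (not the dissertation's model):

    # Evaluating a trivariate normal rectangle probability, as needed for
    # the unordered probit likelihood; scipy wraps Genz's numerical
    # integration routines. Mean, covariance and bounds are illustrative.
    import numpy as np
    from scipy.stats import multivariate_normal

    mean = np.zeros(3)
    cov = np.array([
        [1.0, 0.5, 0.3],
        [0.5, 1.0, 0.4],
        [0.3, 0.4, 1.0],
    ])

    # P(X1 <= 0, X2 <= 0, X3 <= 0) for X ~ N(mean, cov); the estimate is
    # stochastic, so repeated calls can differ slightly in the last digits.
    p = multivariate_normal(mean=mean, cov=cov).cdf(np.zeros(3))
    print(p)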

Relevance:

30.00%

Publisher:

Abstract:

When linear equality constraints are invariant through time they can be incorporated into estimation by restricted least squares. If, however, the constraints are time-varying, this standard methodology cannot be applied. In this paper we show how to incorporate linear time-varying constraints into the estimation of econometric models. The method involves the augmentation of the observation equation of a state-space model prior to estimation by the Kalman filter. Numerical optimisation routines are used for the estimation. A simple example drawn from demand analysis is used to illustrate the method and its application.
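A minimal sketch of the augmentation idea, assuming a random-walk state and a single constraint per period; all matrices are illustrative toys, not the paper's demand-analysis example:

    # A time-varying linear constraint r_t' beta_t = q_t is appended to the
    # observation equation as an extra, noise-free "observation" before
    # running the Kalman filter.
    import numpy as np

    def kalman_step(beta, P, X_t, y_t, r_t, q_t, H, Q):
        """One filter step with the constraint row appended."""
        # Prediction (random-walk state transition, for simplicity).
        beta_pred, P_pred = beta, P + Q

        # Augmented observation: stack the data row and the constraint row.
        Z = np.vstack([X_t, r_t])            # augmented design matrix
        obs = np.append(y_t, q_t)            # augmented observed vector
        R = np.diag(np.append(H, 0.0))       # zero variance -> exact constraint

        # Standard Kalman update with the augmented system.
        S = Z @ P_pred @ Z.T + R
        K = P_pred @ Z.T @ np.linalg.inv(S)
        beta_new = beta_pred + K @ (obs - Z @ beta_pred)
        P_new = P_pred - K @ Z @ P_pred
        return beta_new, P_new

    # Two-coefficient example with the constraint beta1 + beta2 = 1 each t.
    rng = np.random.default_rng(0)
    beta, P = np.zeros(2), np.eye(2)
    for t in range(50):
        X_t = rng.normal(size=(1, 2))
        y_t = X_t @ np.array([0.3, 0.7]) + 0.1 * rng.normal(size=1)
        beta, P = kalman_step(beta, P, X_t, y_t,
                              r_t=np.array([[1.0, 1.0]]), q_t=1.0,
                              H=np.array([0.01]), Q=1e-4 * np.eye(2))
    print(beta, beta.sum())  # the sum is held at (numerically) 1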

Relevance:

30.00%

Publisher:

Abstract:

This paper surveys recent evidence on the determinants of (national and/or foreign) industrial location. We find that the basic analytical framework has remained essentially unaltered since the pioneering contributions of the early 1980s while, in contrast, there have been significant advances in the quality of the data and, to a lesser extent, in the econometric modelling. We also identify certain determinants (neoclassical and institutional factors) that tend to produce largely consistent results across the reviewed studies. In light of this evidence, we suggest future lines of research.

Relevance:

30.00%

Publisher:

Abstract:

The present study describes in detail the major technological advances in the rubber-growing industry over the last four decades. The major technological changes experienced in the rubber plantation industry during the period are the introduction of high-yielding planting materials, the scientific application of fertilisers, the use of pesticides, tapping during the rainy season using rain guards, the use of yield stimulants, and improved tapping methods.
School of Management Studies, Cochin University of Science and Technology

Relevance:

30.00%

Publisher:

Abstract:

This paper presents four non-survey methods to construct a full-information international input-output table from national IO tables and international import and export statistics, and tests them against the semi-survey international IO table for nine East Asian countries and the USA constructed by the Institute of Developing Economies in Japan. The tests show that the impact on domestic flows of using self-sufficiency ratios is small, except for Singapore and Malaysia, two countries with large volumes of smuggling and transit trade. As regards the accuracy of the international flows, all methods show considerable errors: 10%-40% for commodities and 10%-70% for services. When more information is added, i.e. going from Method 1 to Method 4, the accuracy increases, except for Method 2, which generally produces larger errors than Method 1. In all, it seems doubtful whether replacing the semi-survey Asian-Pacific IO table with one of the four non-survey tables is justified, except when the semi-survey table itself is also considered just another estimate.
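A minimal sketch of the self-sufficiency-ratio step that underlies the simplest of the non-survey methods, with toy flows; the paper's Methods 1-4 add progressively more trade information:

    # Illustrative split of a national IO flow table into domestic and
    # imported parts using self-sufficiency ratios. Numbers are toy values.
    import numpy as np

    Z = np.array([[10., 4.],         # total (domestic + imported) flows
                  [ 6., 8.]])
    output  = np.array([30., 25.])   # gross output by sector
    imports = np.array([ 5., 2.])    # imports by product
    exports = np.array([ 8., 3.])    # exports by product

    # Self-sufficiency ratio: share of domestic supply in domestic use.
    domestic_use = output + imports - exports
    ssr = np.minimum(1.0, (output - exports) / domestic_use)

    # Rows are products, so each row of Z is scaled by its product's ratio.
    Z_domestic = ssr[:, None] * Z
    Z_imported = Z - Z_domestic
    print(Z_domestic)
    print(Z_imported)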

Relevance:

30.00%

Publisher:

Abstract:

In recent years, VAR models have become the main econometric tool for testing whether a relationship between variables may exist and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including the sampling period, the set of endogenous variables, and the deterministic terms). For VAR models we use the Granger causality test to verify one variable's ability to predict another; in the presence of cointegration we use VECM models to jointly estimate the long-run and short-run coefficients; and for small data sets with overfitting problems we use Bayesian VAR models, with impulse response functions and variance decomposition, to analyse the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time series and formulating different hypotheses. Three VAR models were used: first, to study monetary policy decisions and discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the Euro Area (paper 1); second, to extend the evidence on the money endogeneity hypothesis by evaluating the effects of bank securitization on the monetary policy transmission mechanism in the United States (paper 2); and third, to evaluate the effects of ageing on health expenditure in Italy in terms of economic policy implications (paper 3). The thesis is introduced in Chapter 1, which outlines the context, motivation and purpose of this research, while the structure and summary, as well as the main results, are described in the remaining chapters.

Chapter 2 examines, using a first-difference VAR model with quarterly Euro Area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called nominal GDP targeting rule (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results show a causal relationship running from the gap between nominal GDP growth and target GDP growth to changes in three-month market interest rates. The same analysis does not appear to confirm a significant inverse causal relationship from changes in the market interest rate to that growth gap. Similar results were obtained when the market interest rate was replaced with the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule, and raises more general doubts about the applicability of the Taylor rule and of all conventional monetary policy rules to the case at hand. The results instead appear more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015). These lines of research challenge the simplistic view that the scope of monetary policy is to stabilize inflation, real GDP or nominal income around a "natural equilibrium" level. Rather, they suggest that central banks actually pursue a more complex goal, namely the regulation of the financial system, with particular reference to the relationships between creditors and debtors and the relative solvency of economic units.

Chapter 3 analyses loan supply, considering the endogeneity of money arising from banks' securitization activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate money endogeneity in the short and long run with a study of the United States during its two main crises: the bursting of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, we consider the effects of financial innovation on the lending channel, using the securitization-adjusted loan series in order to verify whether the American banking system is induced to seek cheaper sources of funding, such as securitization, under restrictive monetary policy (Altunbas et al., 2009). The analysis is based on the monetary aggregates M1 and M2. Using VECM models, we examine a long-run relationship between the variables in levels and evaluate the effects of the money supply by analysing how far monetary policy affects short-run deviations from the long-run relationship. The results show that securitization influences the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and highlighting that economic agents are motivated to increase securitization as a precautionary hedge against monetary policy shocks.

Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the ageing index and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data drawn from the OECD and Eurostat databases. The impulse response functions and the variance decomposition show a positive relationship: from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the ageing index to per capita health expenditure. The impact of ageing on health expenditure is more significant than that of the other variables. Overall, our results suggest that disabilities closely related to ageing may be the main driver of health expenditure in the short-to-medium run. Good health-care management helps improve patient welfare without increasing total health expenditure. However, policies that improve the health status of the elderly may be needed to lower per capita demand for health and social services.
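A minimal sketch of the Chapter 2 toolkit: fitting a reduced-form VAR on simulated stand-in series and testing Granger causality in both directions with statsmodels; the variable names are illustrative, not the thesis data:

    # Fit a two-variable VAR and test Granger causality both ways.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(1)
    n = 200
    x, y = np.zeros(n), np.zeros(n)
    for t in range(1, n):
        x[t] = 0.5 * x[t - 1] + rng.normal()
        y[t] = 0.4 * y[t - 1] + 0.3 * x[t - 1] + rng.normal()  # x causes y

    data = pd.DataFrame({"gdp_gap": x, "rate_change": y})  # stand-in names
    fit = VAR(data).fit(maxlags=4, ic="aic")

    # One direction of causality, as in the nominal-GDP-rule test.
    print(fit.test_causality("rate_change", ["gdp_gap"], kind="f").summary())
    print(fit.test_causality("gdp_gap", ["rate_change"], kind="f").summary())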

Relevance:

30.00%

Publisher:

Abstract:

This study has three main objectives. First, it develops a generalization of the commonly used EKS method for multilateral price comparisons. It is shown that the EKS system can be generalized so that weights can be attached to each of the link comparisons used in the EKS computations. These weights can account for differing levels of reliability of the underlying binary comparisons. Second, various reliability measures and corresponding weighting schemes are presented and their merits discussed. Third, these new methods are applied to an international data set of manufacturing prices from the ICOP project. Although theoretically superior, the weighted EKS method turns out to have a generally small empirical impact compared with the unweighted EKS; the impact is larger when the method is applied at lower levels of aggregation. Finally, the importance of using sector-specific PPPs in assessing relative levels of manufacturing productivity is indicated.
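A minimal sketch of one simple weighted-EKS variant, attaching a reliability weight to each bridge country; the paper's scheme weights each link comparison, and the Fisher matrix below is a toy, country-reversal-consistent example:

    # Weighted EKS transitivization of a matrix of binary Fisher PPPs.
    import numpy as np

    # Binary Fisher PPPs satisfy country reversal: F[k, j] = 1 / F[j, k].
    F = np.array([
        [1.0,     1.2,     0.8],
        [1 / 1.2, 1.0,     0.7],
        [1 / 0.8, 1 / 0.7, 1.0],
    ])
    w = np.array([1.0, 0.5, 1.0])   # reliability weight per bridge country

    def weighted_eks(F, w):
        """Geometric mean of indirect links j -> l -> k, weighted by w[l]."""
        M = F.shape[0]
        W = w / w.sum()
        out = np.empty_like(F)
        for j in range(M):
            for k in range(M):
                out[j, k] = np.prod((F[j, :] * F[:, k]) ** W)
        return out

    P = weighted_eks(F, w)
    # The result is transitive: P[j, k] * P[k, m] equals P[j, m].
    print(np.isclose(P[0, 1] * P[1, 2], P[0, 2]))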

Relevance:

20.00%

Publisher:

Abstract:

The aim of this clinical study was to determine the efficacy of Uncaria tomentosa (cat's claw) against denture stomatitis (DS). Fifty patients with DS were randomly assigned to 3 groups to receive 2% miconazole, placebo, or 2% U. tomentosa gel. DS level was recorded immediately, after 1 week of treatment, and 1 week after treatment ended. The clinical effectiveness of each treatment was measured using Newton's criteria. Mycologic samples from the palatal mucosa and prosthesis were obtained at each evaluation period to determine colony-forming units per milliliter (CFU/mL) and to identify fungal species. Candida species were identified with HiCrome Candida and the API 20C AUX biochemical test. DS severity decreased in all groups (P < .05). A significant reduction in the number of CFU/mL after 1 week (P < .05) was observed in all groups and remained after 14 days (P > .05). C. albicans was the most prevalent microorganism before treatment, followed by C. tropicalis, C. glabrata, and C. krusei, regardless of the group and time evaluated. U. tomentosa gel had the same effect as 2% miconazole gel and is an effective topical adjuvant treatment for denture stomatitis.

Relevance:

20.00%

Publisher:

Abstract:

Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled to partial least squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. ESI(-)-FT-ICR mass spectra typically show a resolving power of ca. 500,000 and a mass accuracy below 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M - H](-) ions, identified primarily as naphthenic acids, phenols and carbazole-analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg of KOH g(-1). To facilitate spectral interpretation, three variable selection methods were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method appears the most appropriate for selecting important variables, reducing the number of variables to 183 and producing a root mean square error of prediction of 0.32 mg of KOH g(-1). By reducing the size of the data, it was possible to relate the selected variables to their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
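A hedged sketch of the PLS-plus-VIP step on simulated data; the shapes, names and informative variables are illustrative, not the FT-ICR matrix from the paper, and UVE and iPLS are not shown:

    # Fit a PLS regression and rank variables by the standard VIP score.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(7)
    n_samples, n_vars = 40, 500
    X = rng.normal(size=(n_samples, n_vars))
    # A toy "TAN" driven by the first 5 variables plus noise.
    tan = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=n_samples)

    pls = PLSRegression(n_components=3).fit(X, tan)

    def vip_scores(pls, X):
        """Variable importance in the projection (standard VIP formula)."""
        T = pls.transform(X)                   # scores, shape (n, A)
        W = pls.x_weights_                     # weights, shape (p, A)
        Q = pls.y_loadings_                    # shape (targets, A)
        p = W.shape[0]
        ssy = (T ** 2).sum(axis=0) * (Q ** 2).sum(axis=0)  # y-variance per comp.
        Wn = W / np.linalg.norm(W, axis=0)
        return np.sqrt(p * (Wn ** 2 @ ssy) / ssy.sum())

    vip = vip_scores(pls, X)
    print(np.argsort(vip)[::-1][:5])  # informative variables should rank high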