9 results for Random coefficient multinomial logit
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The goal of this dissertation is to use statistical tools to analyze specific financial risks that played dominant roles in the US financial crisis of 2008-2009. The first risk relates to the level of aggregate stress in the financial markets. I estimate the impact of financial stress on economic activity and monetary policy using structural VAR analysis. The second set of risks concerns the US housing market. Two prominent risks are in fact associated with a US mortgage, since a borrower can either prepay or default on it. I test for unobservable heterogeneity in the borrower's decision to default on or prepay a mortgage by estimating a multinomial logit model with borrower-specific random coefficients.
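For reference, the random coefficient (mixed) multinomial logit underlying this approach gives borrower $i$'s probability of outcome $j$ (e.g., default, prepay, or continue) by integrating the standard logit kernel over the distribution of the borrower-specific coefficients; the notation below is generic, not the thesis's own:

$$ P_{ij} = \int \frac{\exp(x_{ij}'\beta_i)}{\sum_{k}\exp(x_{ik}'\beta_i)}\, f(\beta_i \mid \theta)\, d\beta_i $$

Unobserved heterogeneity corresponds to a non-degenerate mixing density $f(\cdot \mid \theta)$; testing for its existence amounts to testing whether the variance parameters in $\theta$ are zero.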
Abstract:
This research was triggered by an emergent trend in customer behavior: customers have rapidly expanded their channel experiences and preferences beyond traditional channels (such as stores), and they expect the companies they do business with to have a presence on all of these channels. This evidence has produced increasing interest in multichannel customer behavior and has motivated several researchers to study the dynamics of customers' channel choices in a multichannel environment. We study how the consumer decision process for channel choice and response to marketing communications evolves for a cohort of new customers. We assume a newly acquired customer's decisions are described by a "trial" model, but the customer's choice process evolves to a "post-trial" model as the customer learns his or her preferences and becomes familiar with the firm's marketing efforts. The trial and post-trial decision processes are each described by a different multinomial logit choice model, and the evolution from the trial to the post-trial model is governed by a customer-level geometric distribution that captures the time it takes the customer to make the transition. We utilize data from a major retailer that sells through three channels: retail store, the Internet, and catalog. The model is estimated using Bayesian methods that allow for cross-customer heterogeneity. This allows us to obtain distinct parameter estimates for the trial and post-trial stages and to estimate the speed of the transition at the individual level. The results show, for example, that the customer decision process does indeed evolve over time. Customers differ in the duration of the trial period, and marketing has a different impact on channel choice in the trial and post-trial stages. Furthermore, we show that some people switch channel decision processes while others do not, and we find that several factors affect the probability of switching decision processes. Insights from this study can help managers tailor their marketing communication strategy as customers gain channel choice experience. Managers may also gain insights into the timing of direct marketing communications: they can predict the duration of the trial phase at the individual level, detecting customers with a quick, long, or even absent trial phase; they can predict whether a customer will change his or her decision process over time; and they can influence the switching process using specific marketing tools.
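One plausible way to write the likelihood structure just described, in generic notation (the thesis's exact specification may differ): if customer $i$'s transition time $\tau_i$ is geometric with parameter $q_i$, the likelihood of an observed choice sequence $c_{i1},\dots,c_{iT}$ mixes the two logit models over the latent switch time:

$$ L_i = \sum_{\tau=1}^{\infty} q_i (1-q_i)^{\tau-1} \prod_{t<\tau} P^{\mathrm{trial}}_i(c_{it}) \prod_{t\ge\tau} P^{\mathrm{post}}_i(c_{it}) $$

where $P^{\mathrm{trial}}_i$ and $P^{\mathrm{post}}_i$ are multinomial logit probabilities with distinct, customer-specific parameters estimated by Bayesian methods.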
Abstract:
The aim of this work is to put forward a statistical mechanics theory of social interaction, generalizing econometric discrete choice models. After showing the formal equivalence linking econometric multinomial logit models to equilibrium statistical mechanics, a multi-population generalization of the Curie-Weiss model for ferromagnets is considered as a starting point in developing a model capable of describing sudden shifts in aggregate human behaviour. Existence of the thermodynamic limit for the model is shown by an asymptotic sub-additivity method, and factorization of correlation functions is proved almost everywhere. The exact solution of the model in the thermodynamic limit is obtained by finding converging upper and lower bounds for the system's pressure, and the solution is used to prove an analytic result on the number of possible equilibrium states of a two-population system. The work stresses the importance of linking the regimes predicted by the model to real phenomena, and to this end it proposes two procedures to estimate the model's parameters from micro-level data. These are applied to three case studies based on census-type data: though these studies prove ultimately inconclusive on the empirical level, considerations are drawn that encourage further refinement of the chosen modelling approach in future work.
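In the standard multi-population Curie-Weiss setting (generic notation, not necessarily the thesis's own), population $a$ has relative size $\alpha_a$, magnetization $m_a$, interaction couplings $J_{ab}$, and external fields $h_a$; the mean-field Hamiltonian and the self-consistency equations read:

$$ H_N(\sigma) = -\frac{N}{2}\sum_{a,b} J_{ab}\,\alpha_a \alpha_b\, m_a m_b - N\sum_a \alpha_a h_a m_a, \qquad m_a = \tanh\!\Big(\sum_b J_{ab}\,\alpha_b\, m_b + h_a\Big). $$

Multiple solutions of the self-consistency system are what allow the model to describe sudden shifts in aggregate behaviour, and the logit equivalence arises because each agent's conditional choice probability has the binary-logit form $e^{h}/(e^{h}+e^{-h})$.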
Abstract:
The papers included in this thesis deal with a few aspects of insurance economics that have seldom been addressed in the applied literature. In the first paper I apply, for the first time, the tools of the economics of crime to study the determinants of insurance fraud, using data on Italian provinces. The contributions to the literature are manifold:
- The price of insuring is positively correlated with the propensity to defraud.
- Social norms constrain fraudulent behavior, but their strength is curtailed in economic downturns.
- I apply a simple extension of the Random Coefficient model, which allows for the presence of time-invariant covariates and asymmetries in the impact of the regressors.
The second paper assesses how the evolution of macro-prudential regulation of insurance companies has been reflected in their equity prices. I employ a standard event-study methodology, deriving the definition of the "control" and "treatment" groups from what is implied by the regulatory framework. The main results are:
- Markets care about the evolution of the legislation. Their perception has shifted from an initial positive assessment of a possible implicit "too big to fail" subsidy to a more negative one related to its cost in terms of stricter capital requirements.
- The size of this phenomenon is positively related to the leverage, size, and geographical location of the insurance companies.
The third paper introduces a novel methodology to forecast non-life insurance premiums and profitability as a function of macroeconomic variables, using the simultaneous-equation framework traditionally employed in macroeconometric models and a simple theoretical model of insurance pricing to derive a long-term relationship between premiums, claims expenses, and short-term rates. The model is shown to provide better forecasts of premiums and profitability than the single-equation specifications commonly used in applied analysis.
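For the second paper's methodology, the standard market-model event study (sketched here in generic notation; the paper's exact specification may differ) measures the market reaction to each regulatory event as cumulative abnormal returns:

$$ AR_{it} = R_{it} - \hat\alpha_i - \hat\beta_i R_{mt}, \qquad CAR_i(t_1,t_2) = \sum_{t=t_1}^{t_2} AR_{it}, $$

with $\hat\alpha_i$, $\hat\beta_i$ estimated over a pre-event window and the treatment/control split taken from the scope of the regulatory framework.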
Abstract:
High-frequency seismograms contain features that reflect the random inhomogeneities of the earth. In this work I use an imaging method to locate high-contrast, small-scale heterogeneities with respect to the background earth medium. This method was first introduced by Nishigami (1991) and then applied to different volcanic and tectonically active areas (Nishigami, 1997; 2000; 2006). The scattering imaging method is applied to two volcanic areas: Campi Flegrei and Mt. Vesuvius. Volcanic and seismically active areas are often characterized by complex velocity structures, due to the presence of rocks with different elastic properties. I introduce some modifications to the original method in order to make it suitable for small and highly complex media. In particular, for very complex media the single-scattering approximation assumed by Nishigami (1991) is not applicable, as the mean free path becomes short; the multiple-scattering, or diffusive, approximation becomes closer to reality. In this thesis, differently from the original Nishigami (1991) method, I use the mean of the recorded coda envelopes as the reference curve and calculate the variations from this average envelope. In this way I implicitly assume no particular scattering regime for the "average" scattered radiation, whereas I consider the variations as due to waves singly scattered by the strongest heterogeneities. The imaging method is applied to a relatively small area (20 x 20 km), a choice justified by the short length of the analyzed codas of the low-magnitude earthquakes. I apply the unmodified Nishigami method to the volcanic area of Campi Flegrei and compare the results with other tomographies of the same area. The scattering images, obtained with waves at frequencies around 18 Hz, show strong scatterers in correspondence with the submerged caldera rim in the southern part of Pozzuoli bay. Strong scattering is also found below the Solfatara crater, characterized by the presence of densely fractured, fluid-filled rocks and by a strong thermal anomaly. The modified Nishigami technique is applied to the Mt. Vesuvius area. Results show a low-scattering area just below the central cone and a high-scattering area around it. The high-scattering zone seems to be due to the contrast between the high-rigidity body located beneath the crater and the low-rigidity materials around it. The central low-scattering area overlaps the hydrothermal reservoirs located below the central cone. An interpretation of the results in terms of the geological properties of the medium is also supplied, aiming to find a correspondence between the scattering properties and the geological nature of the material. A complementary result reported in this thesis is that the strong heterogeneity of the volcanic medium creates a phenomenon called "coda localization": the shape of the seismograms recorded by stations at the top of the volcanic edifice of Mt. Vesuvius differs from that of the seismograms recorded at the bottom. This behavior is explained by the fact that, at large lapse times, the coda energy is not uniformly distributed within a region surrounding the source.
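As a minimal sketch of the modification described above (function names and the smoothing choice are hypothetical, not the thesis's code), the reference curve is the stacked mean coda envelope and the mapped quantity is the relative deviation from it:

```python
import numpy as np
from scipy.signal import hilbert

def coda_envelope(trace, fs, smooth_s=1.0):
    """Smoothed amplitude envelope of a band-passed coda trace."""
    env = np.abs(hilbert(trace))
    win = max(1, int(smooth_s * fs))
    return np.convolve(env, np.ones(win) / win, mode="same")

def envelope_residuals(traces, fs):
    """Relative deviation of each coda envelope from the stack mean.

    Using the mean envelope as reference avoids assuming any single- or
    multiple-scattering regime for the average radiation; positive
    residuals are attributed to waves singly scattered by the strongest
    heterogeneities.
    """
    envs = np.vstack([coda_envelope(tr, fs) for tr in traces])
    ref = envs.mean(axis=0)
    return (envs - ref) / ref
```

The residuals would then be back-projected onto candidate scatterer locations using travel-time constraints, which is the imaging step proper.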
Abstract:
The inherent stochastic character of most physical quantities involved in engineering models has led to ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed. However, it is widely acknowledged that the only universal method available to solve any kind of stochastic mechanics problem accurately is Monte Carlo simulation. One of the key steps in implementing this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability distribution. Its computational efficiency and robustness are also very good, even when dealing with strongly non-Gaussian distributions. Moreover, the resulting samples possess all the relevant, well-defined, and desired properties of "translation fields", including crossing rates and distributions of extremes. The second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of the constituent bars of existing truss structures, using static loads and strain measurements. In cases of missing data and of damage affecting only a small portion of a bar, genetic algorithms have proved to be an effective tool for solving the problem.
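A minimal one-dimensional sketch of the "translation field" idea (the thesis treats the harder multi-dimensional, multi-variate case; the names and the example marginal/spectrum below are illustrative assumptions): a Gaussian field is synthesized spectrally and then mapped through its CDF into the target marginal. Note that this naive version matches the marginal exactly but distorts the target spectrum; correcting the underlying Gaussian spectrum for that distortion is precisely where a full algorithm must do its work.

```python
import numpy as np
from scipy.stats import norm, lognorm

def translation_field_1d(n, dx, psd, marginal_ppf, seed=None):
    """Sample a homogeneous 1-D non-Gaussian field as a translation field."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=dx)
    # Gaussian field via spectral filtering of complex white noise
    noise = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
    g = np.fft.irfft(noise * np.sqrt(psd(freqs)), n=n)
    g = (g - g.mean()) / g.std()          # standard Gaussian marginal
    return marginal_ppf(norm.cdf(g))      # translate to the target marginal

# Illustrative target: lognormal marginal, Lorentzian power spectrum
field = translation_field_1d(
    n=4096, dx=0.1,
    psd=lambda f: 1.0 / (1.0 + (2.0 * np.pi * f) ** 2),
    marginal_ppf=lognorm(s=0.5).ppf,
    seed=0,
)
```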
Abstract:
It is usual to hear a strange short sentence: «Random is better than...». Why is randomness a good solution to a certain engineering problem? There are many possible answers, all of them depending on the topic considered. In this thesis I discuss two topics that benefit from randomizing some of the waveforms involved in signal manipulation. In particular, the advantages are obtained by shaping the second-order statistics of antipodal sequences used in intermediate signal processing stages. The first topic is in the area of analog-to-digital conversion and is named Compressive Sensing (CS). CS is a novel paradigm in signal processing that merges signal acquisition and compression, so that a signal can be acquired directly in compressed form. In this thesis, after a broad description of the CS methodology and its related architectures, I present a new approach that achieves high compression by designing the second-order statistics of a set of additional waveforms involved in the signal acquisition/compression stage. The second topic is in the area of communication systems; in particular, I focus on ultra-wideband (UWB) systems. One option for producing and decoding UWB signals is direct-sequence spreading with multiple access based on code division (DS-CDMA). Focusing on this methodology, I address the coexistence of a DS-CDMA system with a narrowband interferer. To do so, I minimize the joint effect of both multiple-access interference (MAI) and narrowband interference (NBI) on a simple matched-filter receiver. I show that, when the statistical properties of the spreading sequences are suitably designed, performance improvements are possible with respect to a system exploiting chaos-based sequences that minimize MAI only.
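A minimal sketch of the CS acquisition stage discussed above (dimensions and the canonical sparsity basis are illustrative assumptions; the contribution described in the abstract is to shape the correlation of the antipodal rows rather than draw them i.i.d. as done here):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 256, 64, 5                      # signal length, measurements, sparsity

# k-sparse signal (canonical basis assumed for illustration)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

# antipodal (+/-1) sensing waveforms: their second-order statistics are
# the design lever; i.i.d. rows correspond to the unshaped baseline
A = rng.choice([-1.0, 1.0], size=(m, n))

y = A @ x                                 # acquisition in compressed form: m << n
# x is recovered from (y, A) with a sparse solver (e.g. basis pursuit),
# omitted here for brevity
```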
Abstract:
Tomato is one of the main crops of the Italian agri-food sector and a staple ingredient of the national culinary tradition. Tomatoes processed by the canning industry can be transformed into several product categories, which differ in the processing techniques employed and the characteristics of the finished product. The share of total food expenditure devoted to eating out is increasing globally, and the food industry's interest in this sales channel is therefore growing. While the literature contains numerous studies of final consumers' purchasing processes, there is no evidence of similar studies conducted on Food Service operators. The main objective of this research is to assess the preferences of Food Service purchasing managers for different types of processed tomato, in relation to a range of relevant product attributes and customer characteristics. Data were collected through a hypothetical choice experiment carried out in Italy and selected foreign markets. The results indicate that peeled tomatoes (Pelati) are the processed-tomato category preferred by the surveyed Food Service purchasing managers, with 35% of stated preferences across the proposed choice contexts, followed by pulp (Polpa, 25%), passata (20%), and concentrate (Concentrato, 15%). Estimation of a random parameters Logit econometric model showed that certain credence quality attributes, often used by the food industry in differentiation and positioning strategies in the Retail market, can also play an important role in shaping the preferences of Food Service operators. This could therefore be an interesting line of research to develop in the future, possibly combining analysis methodologies based on hypothetical and non-hypothetical choice experiments.