6 results for predictability
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The subject of this work is the study of the immigration phenomenon, with emphasis on the aspects related to the integration of an immigrant population into a host one. The aim of this work is to show the forecasting ability of a recent finding in which the behavior of integration quantifiers was analyzed and investigated with a mathematical model of statistical-physics origin (a generalization of the monomer-dimer model). After providing a detailed literature review of the model, we show that such a model is not only able to identify the social mechanism that drives a particular integration process, but also provides correct forecasts. The research reported here shows that the proposed model of integration and its forecasting framework are simple and effective tools for reducing uncertainty about how integration phenomena emerge and how they are likely to develop in response to increased migration levels in the future.
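As a hedged illustration of the kind of scaling law this literature tests (the thesis may analyze different quantifiers), the monomer-dimer approach typically asks whether an integration quantifier Q, such as the fraction of mixed marriages, follows the neutral random-mixing law or the interaction-driven square-root law in the immigrant density \gamma:

    Q(\gamma) \propto \gamma(1-\gamma) \quad\text{(random mixing)}
    \qquad\text{vs.}\qquad
    Q(\gamma) \propto \sqrt{\gamma(1-\gamma)} \quad\text{(imitative social interactions)}

Which of the two laws fits the data is what identifies the social mechanism driving a given integration process, and the fitted law is then extrapolated to produce the forecasts mentioned above.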
Abstract:
The common thread of this thesis is the investigation of the properties and behavior of assemblies. Groups of objects display peculiar properties, which can be very far from the simple sum of their components’ properties. This is all the more true the smaller the inter-object distance, i.e., the higher the density and the smaller the container. “Confinement” is in fact a key concept in many of the topics explored and reported here. It can be conceived as a spatial limitation that nevertheless gives rise to unexpected processes and phenomena based on inter-object communication. Such phenomena eventually result in “non-linear properties”, responsible for the low predictability of large assemblies. Chapter 1 provides two insights into surface chemistry, namely (i) a supramolecular assembly based on orthogonal forces, and (ii) selective and sensitive fluorescent sensing in thin polymeric films. In Chapters 2 to 4 the confinement of molecules plays a major role. Most of the work focuses on FRET within core-shell nanoparticles, investigated both through a simulation model and through experiments. Results of great applicative interest are obtained, such as a method for tuning the emission wavelength at constant excitation and a way of overcoming self-quenching by setting up a competitive deactivation channel. We envisage applications of these materials as labels for multiplexing analysis and in all fields of fluorescence imaging where brightness coupled with biocompatibility and water solubility is required. Adducts of nanoparticles and molecular photoswitches are investigated in the context of super-resolution techniques for fluorescence microscopy. In Chapter 5 a method is proposed to prepare a library of functionalized Pluronic F127, which gives access to a twofold “smart” nanomaterial, namely (i) luminescent and (ii) surface-functionalized SCSSNPs. In Chapter 6 the focus shifts to confinement effects on a larger size scale: moving from nanometers to micrometers, we investigate microparticles flowing in microchannels, where a constriction affects the structure and dynamics of the colloidal paste at very long range.
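To make the FRET mechanism at the heart of Chapters 2 to 4 concrete, here is a minimal Python sketch (not the thesis’ actual simulation model) of the standard Förster relation E = 1/(1 + (r/R0)^6) for donors confined in a core and acceptors in a shell; the radii, Förster radius, and dye counts are hypothetical placeholders:

    import numpy as np

    rng = np.random.default_rng(0)
    core_radius, shell_thickness = 2.5, 1.5  # nm, hypothetical core-shell geometry
    r0 = 5.0                                 # nm, hypothetical Foerster radius

    def fret_efficiency(r, r0):
        # Standard Foerster relation: transfer efficiency vs. donor-acceptor distance.
        return 1.0 / (1.0 + (r / r0) ** 6)

    def random_points(n, r_min, r_max):
        # Uniform random positions inside a spherical shell between r_min and r_max.
        radii = rng.uniform(r_min ** 3, r_max ** 3, n) ** (1.0 / 3.0)
        dirs = rng.normal(size=(n, 3))
        return radii[:, None] * dirs / np.linalg.norm(dirs, axis=1, keepdims=True)

    donors = random_points(200, 0.0, core_radius)                                # dyes in the core
    acceptors = random_points(200, core_radius, core_radius + shell_thickness)   # dyes in the shell

    # Mean pairwise efficiency as a crude proxy for core-to-shell energy transfer.
    d = np.linalg.norm(donors[:, None, :] - acceptors[None, :, :], axis=-1)
    print(f"mean pairwise FRET efficiency: {fret_efficiency(d, r0).mean():.3f}")

In this toy geometry, shrinking the particle brings donors and acceptors closer and raises the mean efficiency, illustrating why confinement matters for the photophysics described above.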
Abstract:
The thesis aims to present the new business opportunities offered by the Web. The revolutionary change brought about by the pervasiveness of the Net and all related activities has confronted companies with a different way of relating to their consumers, who are increasingly informed, aware, and demanding, and to their competitors. The challenge to be met in order to remain competitive on the market is significant and the change is developing rapidly: the aspects that characterize this new digital paradigm are speed and changeability, but at the same time measurability, quantifiability, and predictability. Thanks to the technological tools available and to the dynamics of the various web spaces (sites, social networks, blogs, forums), it is easier than in the past to track the impact of initiatives, product launches, promotions, and advertising, measuring their return on investment as well as the end user’s perception. A data-centric approach to marketing, through network-monitoring analyses, therefore allows a brand to make more targeted and considered investments based on estimates and forecasts. Among the most significant digital marketing strategies cited are social advertising, keyword advertising, digital PR, social media, email marketing, and many others. Two case histories are also reported: one is an excellent example of co-creation, in which the brand involved the public directly in the production process by letting the fans of its official Facebook Page choose the yogurt flavors to be put on sale. The second, an international lead-generation case, allowed the brand to measure the conversion of site visitors (after they filled in a pop-in form) into actual buyers, linking site traffic data to sales data. An example of how closely online and offline communicate.
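As a toy illustration of the funnel arithmetic behind the lead-generation case (the abstract reports no actual figures; every number and name below is invented), a minimal sketch linking traffic, leads, and sales to conversion rates and return on investment:

    # Hypothetical funnel figures for a lead-generation campaign.
    visitors = 120_000        # site visitors in the campaign window
    leads = 4_800             # visitors who filled in the pop-in form
    buyers = 360              # leads matched against offline sales records
    revenue = 54_000.0        # revenue attributed to those buyers (EUR)
    ad_spend = 18_000.0       # campaign cost (EUR)

    lead_rate = leads / visitors            # visitor -> lead conversion
    purchase_rate = buyers / leads          # lead -> buyer conversion
    roi = (revenue - ad_spend) / ad_spend   # simple return on investment

    print(f"visitor->lead: {lead_rate:.1%}, lead->buyer: {purchase_rate:.1%}, ROI: {roi:.0%}")

Linking the online traffic table to offline sales records, as in the case history, is precisely what makes the lead-to-buyer step measurable at all.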
Abstract:
This thesis is the result of a project aimed at the study of a crucial topic in finance: default risk, whose measurement and modelling have gained increasing relevance in recent years. We investigate the main issues related to the default phenomenon from both a methodological and an empirical perspective. The topics of default predictability and correlation are treated with constant attention to modelling solutions and with a critical review of the literature. From the methodological point of view, our analysis results in the proposal of a new class of models, called Poisson Autoregression with Exogenous Covariates (PARX). The PARX models, including both autoregressive and exogenous components, are able to capture the dynamics of default count time series, characterized by persistence of shocks and slowly decaying autocorrelation. Application of different PARX models to the monthly default counts of US industrial firms in the period 1982-2011 provides empirical insight into default dynamics and supports the identification of the main default predictors at the aggregate level.
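For readers unfamiliar with this model class, a schematic PARX specification (the parametrization in the thesis may differ) models the default count y_t as conditionally Poisson, with an intensity that feeds back on past counts, past intensities, and exogenous covariates x_t:

    y_t \mid \mathcal{F}_{t-1} \sim \mathrm{Poisson}(\lambda_t),
    \qquad
    \lambda_t = \omega + \sum_{i=1}^{p} \alpha_i \, y_{t-i}
              + \sum_{j=1}^{q} \beta_j \, \lambda_{t-j}
              + f(x_{t-1}),

with \omega > 0, \alpha_i, \beta_j \ge 0, and f(\cdot) \ge 0 so that the intensity stays positive. The autoregressive terms capture the persistence of shocks and the slowly decaying autocorrelation mentioned above, while f(x_{t-1}) lets macro-financial predictors shift the default intensity.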
Abstract:
This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take this experiment as an opportunity to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests used in CSEP experiments to evaluate model performance. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to take epistemic uncertainty into account in PSHA. The most widely used is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree and then show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
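The rate-to-probability step examined in the second chapter is the standard Poissonian conversion of PSHA: if \lambda(a) is the annual rate of exceedance of a ground-motion level a, the probability of at least one exceedance in t years is taken to be

    P(A > a \text{ in } t \text{ years}) = 1 - e^{-\lambda(a)\, t},

which is exact only if exceedances follow a Poisson process. Roughly, Le Cam's theorem bounds the total-variation distance between the law of a superposition of many rare events and a Poisson law, which is the route by which the chapter can argue that declustering is not needed for this conversion to hold.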
Abstract:
The research thesis sets out to investigate the effect that principles/values produce on the parameter of constitutional review, in order to verify their implications for legality in terms of predictability and certainty. In particular, having outlined the connection between constitutional principles and values and having reconstructed, according to the theory of the legal order, the relationship between values and normativity, the thesis analyzes the effects produced, on the interpretive level, by the opening of the constitutional parameter to the logic of values, emphasizing the repercussions on the constitutional review of legislation. Having identified the link between principles and values in the functional capacity of the former to realize fundamental rights, the thesis stresses how the broadest realization of constitutional principles-values could take place at the expense of statute law and legal certainty, in an inversely proportional relationship. This appears evident from the privileged standpoint of criminal law, where a substantive legality, read in the light of criteria of adequacy and reasonable proportion, although closer to the demands of justice in the concrete case, risks, if pushed to interpretive excess, invading the field of the legislator, who alone is entitled to make value choices.