875 results for Latent Dirichlet Allocation
Abstract:
Asset allocation choices are a recurring problem for every investor, who is continually engaged in combining different asset classes to arrive at an investment consistent with his or her preferences. The need to support asset managers in carrying out their tasks has fed, over time, a vast literature proposing numerous portfolio construction strategies and models. This thesis attempts to survey some innovative forecasting models and some strategies in the field of tactical asset allocation, and then to assess their practical implications. First, we verify whether any relationships exist between the dynamics of selected macroeconomic variables and the financial markets; the goal is to identify an econometric model capable of guiding managers' strategies in the construction of their investment portfolios. The analysis considers the US market over a period characterized by rapid economic transformation and high stock price volatility. Second, we examine the validity of momentum and contrarian trading strategies in futures markets, in particular those of the Eurozone, which lend themselves well to such strategies thanks to the absence of shorting restrictions and to low transaction costs. The investigation shows that both anomalies are stable over time: the abnormal returns persist even under the traditional asset pricing models, such as the CAPM, the Fama-French model and the Carhart model. Finally, using the EGARCH-M approach, we produce volatility forecasts for the returns of the stocks in the Dow Jones index; these forecasts are then used as inputs to determine the views to be fed into the Black-Litterman model. For several values of the scalar tau, the results show that the average excess returns of the new combined vector exceed the vector of market equilibrium excess returns, albeit at higher levels of risk.
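For reference, the "new combined vector" mentioned above is, in the standard Black-Litterman formulation (notation assumed here, not quoted from the thesis), the posterior mean

    \mu_{BL} = \left[(\tau\Sigma)^{-1} + P^{\top}\Omega^{-1}P\right]^{-1} \left[(\tau\Sigma)^{-1}\Pi + P^{\top}\Omega^{-1}Q\right]

where \Pi is the vector of market equilibrium excess returns, \Sigma the return covariance matrix, P and Q the view portfolios and view returns (here informed by the EGARCH-M volatility forecasts), \Omega the view uncertainty, and \tau the scalar whose values are varied in the results.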
Abstract:
This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic Scheduling Problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns the assignment of times and resources to a set of activities that are to be repeated indefinitely, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources; instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for computing a maximum-throughput mapping of applications specified as SDFGs onto multi-core architectures; results show that the approach can handle instances that are realistic in size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint, along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on problems of practical size.
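As a sketch of the modular-arithmetic idea (standard cyclic-scheduling notation, assumed rather than quoted from the thesis): with period \lambda, start times s_i and durations d_i, a precedence between activities i and j that repeats at every iteration can be written as

    s_j \ge s_i + d_i - \lambda\,\omega_{ij}, \qquad \omega_{ij} \in \mathbb{Z}_{\ge 0}

where \omega_{ij} is the iteration distance between the two occurrences. Generate-and-test approaches fix \lambda and repeatedly solve the resulting linear problem, whereas treating \lambda as a decision variable makes the model non-linear but lets the period be inferred from the scheduling decisions, as described above.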
Abstract:
Comparing latent constructs (loaded by reflective, congeneric measures) across cultures means studying how these unobserved variables vary, and/or covary with each other, after controlling for possibly disturbing cultural forces. This leads to the so-called 'measurement invariance' issue, which concerns the extent to which data collected with the same multi-item measurement instrument (i.e., a self-report questionnaire whose items underlie common latent constructs) are comparable across different cultural environments. It would be unthinkable to explore latent-variable heterogeneity across populations (e.g., latent means, latent variances, latent covariances, or the magnitude of structural path coefficients expressing causal relations among latent variables) without controlling for cultural bias in the underlying measures. Furthermore, it would be unrealistic to attempt this correction without a framework able to take all these potential cultural biases across populations into account simultaneously, since the real world 'acts' simultaneously as well. As a consequence, the researcher may want to control for cultural forces by hypothesizing that they all act at the same time across the groups being compared, and then examine whether they inflate or suppress the new estimates by placing hierarchically nested constraints on the originally estimated parameters. Multi-Sample Structural Equation Modeling-based Confirmatory Factor Analysis (MS-SEM-based CFA) still represents a dominant and flexible statistical framework for working out this potential cultural bias in a simultaneous way. This dissertation attempts to introduce new viewpoints on measurement invariance handled under the covariance-based SEM framework, by means of a consumer behavior modeling application on functional food choices.
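In compact form, the nested-constraints idea reads as follows (standard multi-group CFA notation, assumed here): for group g the measurement model is

    x_g = \tau_g + \Lambda_g \xi_g + \delta_g

Configural invariance constrains only the pattern of the loadings \Lambda_g; metric invariance adds \Lambda_1 = \dots = \Lambda_G; scalar invariance further imposes \tau_1 = \dots = \tau_G; and only under (at least partial) scalar invariance do comparisons of latent means across groups become meaningful.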
Abstract:
The aim of this thesis is to propose Bayesian estimation, through Markov chain Monte Carlo, of multidimensional item response theory models for graded responses with complex structures and correlated traits. In particular, the work focuses on the multiunidimensional and the additive underlying latent structures: the former is widely used and represents a classical approach in multidimensional item response analysis, while the latter is able to reflect the complexity of real interactions between items and respondents. A simulation study is conducted to evaluate parameter recovery for the proposed models under different conditions (sample size, test and subtest length, number of response categories, and correlation structure). The results show that parameter recovery is particularly sensitive to the sample size, owing to the model complexity and the large number of parameters to be estimated; for a sufficiently large sample size, the parameters of the multiunidimensional and additive graded response models are well reproduced. The results are also affected by the trade-off between the number of items in the test and the number of item categories. An application of the proposed models to response data collected to investigate Romagna and San Marino residents' perceptions of and attitudes towards the tourism industry is also presented.
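For reference, a standard multidimensional graded response formulation (assumed here, not quoted from the thesis) models the probability that respondent i answers item j in category c or above as

    P(Y_{ij} \ge c \mid \boldsymbol{\theta}_i) = \Phi\!\left(\mathbf{a}_j^{\top}\boldsymbol{\theta}_i - b_{jc}\right), \qquad b_{j1} < \dots < b_{j,C-1}

with discrimination vector \mathbf{a}_j, ordered thresholds b_{jc} and correlated latent traits \boldsymbol{\theta}_i. The multiunidimensional structure restricts each \mathbf{a}_j to a single nonzero entry (every item measures exactly one trait), while the additive structure allows several traits to contribute to the same item.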
Abstract:
This thesis focuses on the role of B cells in mCMV and Leishmania major infection. B cells are an essential component of the adaptive immune system and play a key role in the humoral immune response. In mCMV infection we analyzed the influence of B cells on the virus-specific CD8 T cell response, in particular the role of B cells as IL-10-secreting cells, as a source of immunoglobulin (Ig), and as antigen-presenting cells. In Leishmania major infection we investigated the role of Ig in Th1- and Th2-directed disease. In mCMV infection we found that B cell-derived IL-10 effectively suppresses the acute virus-specific CD8 T cell response, whereas IL-10 secreted by dendritic cells has no obvious effect. Ig has no effect on the acute virus-specific CD8 T cell response, but it is essential in the memory response: when Ig is missing, the CD8 T cell population remains high in the memory response 135 days post infection. The complete absence of B cells dramatically reduces the acute virus-specific CD8 T cell response, and B cell reconstitution only partially rescues this reduction. Comparing this reduction in a B cell-free organism with an organism with depleted dendritic cells gave a similar result. To exclude a malfunction of the CD8 T cells in the B cell-deficient mice, the decreased virus-specific CD8 T cell population was confirmed in a B cell depletion model. Further, bone marrow chimeras with a CD40-deficient (CD40-/-) B cell compartment showed a decreased virus-specific response, implicating CD40 on B cells. Taken together, these results suggest a role for B cells in antigen presentation during mCMV infection. Furthermore, we took advantage of the altered mCMV-specific CD8 T cell memory response in mice without Ig to investigate the memory inflation of CD8 T cells specific for distinct mCMV-specific peptides. Using a SIINFEKL-presenting virus system, we were able to shorten the time until memory inflation occurs and to show that direct presentation stimulates memory inflation. In Leishmania major infection, Ig of Th2-balanced BALB/c mice plays a central role in preventing systemic infection, although ear lesions are smaller in IgMi mice, which lack specific Ig; here the parasite loads of ears and spleen are elevated, and IMS reconstitution does not affect the parasite load. In contrast, in Th1-balanced C57BL/6 mice, reconstitution of IgMi mice with serum from either untreated or immunized mice decreased the parasite load in spleen and ear; furthermore, IMS treatment reduced the size of the spleen and the levels of the cytokines IL-10, IL-4, IL-2 and IFN-γ to levels comparable to wild-type mice.
Abstract:
Apple latent infection caused by Neofabraea alba: host-pathogen interaction and disease management. Bull's eye rot (BER), caused by Neofabraea alba, is one of the most frequent and damaging latent infections of stored pome fruits worldwide. Fruit infection occurs in the orchard, but disease symptoms appear only 3 months after harvest, during refrigerated storage. In Italy BER is particularly serious for late-harvest apple cultivars such as 'Pink Lady™'. The purposes of this thesis were: i) to evaluate the influence of 'Pink Lady™' apple primary metabolites on N. alba quiescence; ii) to evaluate the influence of pH on BER susceptibility in five different apple cultivars; iii) to find non-chemical methods to control N. alba infection; iv) to identify fungal volatile compounds that could serve as markers of N. alba infection. Results on the role of primary metabolites showed that chlorogenic, quinic and malic acid inhibit N. alba development. The evaluation of cultivar susceptibility showed that Granny Smith was the most resistant apple cultivar among the varieties analyzed. Moreover, Granny Smith showed the lowest pH value from harvest until the end of storage, supporting the hypothesis that ambient pH could be involved in the interaction between N. alba and apple. In order to find new technologies able to improve lenticel rot management, a non-destructive device for the determination of chlorophyll content was applied. Results showed that fruit with higher chlorophyll content are less susceptible to BER, and molecular analyses supported this result: fruits with higher chlorophyll content showed up-regulation of PGIP and HCT, genes involved in plant defence. Through the application of PTR-MS and SPME GC-MS, 25 volatile organic compounds emitted by N. alba were identified; among them, 16 molecules were identified as potential biomarkers.
Abstract:
This thesis presents results on Fourier series and then on Fejér series, which are useful for analyzing the so-called Cauchy-Dirichlet problem for the heat equation of a homogeneous bar. The goal is to find classical solutions of the problem whose initial datum is first a function of class C^1 and then a merely continuous function.
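For concreteness, a standard statement of the problem (assumed normalization; the thesis may set it up differently): find u(x,t) such that

    u_t = u_{xx} \ \text{on}\ (0,\pi)\times(0,T), \qquad u(0,t) = u(\pi,t) = 0, \qquad u(x,0) = f(x)

If f has Fourier sine coefficients b_n, the candidate classical solution is u(x,t) = \sum_{n\ge 1} b_n e^{-n^2 t}\sin(nx); when f is merely continuous, Fejér (Cesàro) means of the series replace the ordinary partial sums to recover the convergence needed at the initial time.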
Abstract:
Classic group recommender systems focus on providing suggestions for a fixed group of people. Our work takes an inside look at designing a new recommender system capable of making suggestions for a sequence of activities, dividing people into subgroups in order to boost overall group satisfaction. However, this idea increases the problem complexity along several dimensions and poses a great challenge to the algorithm's performance. To understand its effectiveness, given the enhanced complexity of precise problem solving, we implemented an experimental system using data collected from a variety of web services concerning the city of Paris. The system recommends activities to a group of users through two different approaches: Local Search and Constraint Programming. The general results show that the number of subgroups can significantly influence the Constraint Programming approach's computational time and efficacy. Generally, Local Search finds results much more quickly than Constraint Programming. Over a lengthy period of time, Local Search performs better than Constraint Programming, with similar final results.
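A minimal sketch of the Local Search side of such a system (hypothetical names and an assumed satisfaction(u, a) scoring function; not the thesis's implementation): start from a random assignment of users to subgroups, each subgroup attending one activity, and greedily move single users while their satisfaction improves.

    import random

    def local_search(users, activities, satisfaction, n_subgroups, iters=1000):
        # Assumed inputs: users and activities are lists of ids,
        # satisfaction(u, a) -> float scores user u's liking of activity a.
        acts = [random.choice(activities) for _ in range(n_subgroups)]  # one activity per subgroup
        assign = {u: random.randrange(n_subgroups) for u in users}      # user -> subgroup index
        for _ in range(iters):
            u = random.choice(users)
            # move u to the subgroup whose activity u likes best
            best_g = max(range(n_subgroups), key=lambda g: satisfaction(u, acts[g]))
            if satisfaction(u, acts[best_g]) > satisfaction(u, acts[assign[u]]):
                assign[u] = best_g
        total = sum(satisfaction(u, acts[assign[u]]) for u in users)
        return assign, acts, total

A Constraint Programming approach would instead state the subgroup-assignment and activity-sequencing decisions declaratively and search exhaustively, which explains the quicker (but not always better) results observed for Local Search.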
Abstract:
This thesis addresses the mathematical finance problem of strategic asset allocation, which consists in the process of optimally allocating resources among the various financial assets available on a market. Building on Harry Markowitz's theory, a portfolio meeting efficiency requirements in terms of the risk-return trade-off is constructed through rigorous mathematical steps. Application examples developed with the Mathematica software are also provided.
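For reference, the classical Markowitz mean-variance problem underlying the construction (standard notation, assumed here): given asset weights w, covariance matrix \Sigma and expected returns \mu,

    \min_{w}\ w^{\top}\Sigma w \qquad \text{s.t.}\ \ \mu^{\top}w = r, \quad \mathbf{1}^{\top}w = 1

As the target return r varies, the optimal portfolios trace the efficient frontier from which the investor picks a risk-return trade-off.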
Abstract:
High Performance Computing is a technology used by computational clusters to build processing systems able to deliver far more powerful services than traditional computers. As a consequence, HPC technology has become a decisive factor in industrial competition and in research. HPC systems keep growing in terms of nodes and cores; forecasts indicate that the number of nodes will soon reach one million. This kind of architecture also entails very high costs in terms of resource consumption, which become unsustainable for the industrial market. A centralized scheduler cannot manage such a large number of resources while keeping a reasonable response time. This thesis presents a distributed scheduling model based on constraint programming, which models the scheduling problem through a set of temporal and resource constraints that must be satisfied. The scheduler tries to optimize resource performance and to approach a desired consumption profile, considered optimal. Several different models are analyzed, and each of them is tested in various environments.
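A minimal sketch of the two ingredients such a model combines (hypothetical Python with assumed names; the thesis works inside a constraint programming solver rather than checking candidate schedules procedurally): a cumulative resource-capacity check and a penalty measuring the distance from the desired consumption profile.

    def cumulative_ok(starts, durs, reqs, capacity, horizon):
        # Cumulative constraint: at every time step, the summed requirement
        # of the tasks running at that step must not exceed the capacity.
        usage = [0] * horizon
        for s, d, r in zip(starts, durs, reqs):
            for t in range(s, min(s + d, horizon)):
                usage[t] += r
        return all(u <= capacity for u in usage), usage

    def profile_distance(usage, target):
        # Objective term: distance of the actual consumption from the
        # desired (assumed) target profile, summed over the horizon.
        return sum(abs(u - g) for u, g in zip(usage, target))

    # Example: ok, usage = cumulative_ok([0, 1, 2], [2, 2, 1], [1, 2, 1], capacity=3, horizon=5)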