194 results for conditional models
Abstract:
This paper proposes an empirical framework to study the effects of a policy regime change, defined as an unpredictable and permanent change in the policy parameters. In particular, I show how to make conditional forecasts and perform impulse response and counterfactual analysis. As an application, the effects of changes in fiscal policy rules in the US are investigated. I find that discretionary fiscal policy has become more countercyclical over the last decades. In the absence of such a change, the surplus would have been higher, debt lower and the output gap more volatile, but only until the mid-1980s. An increase in the degree of countercyclicality of fiscal policy has a positive effect on the output gap in periods where the debt-to-GDP ratio is low, and a zero or negative effect when the ratio is high. This explains why the more countercyclical stance of systematic fiscal policy taking place in 2008:II is predicted to be rather ineffective for recovering from the crisis.
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may differ across sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the available unit is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
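The two-stage idea described above can be sketched in a small simulation. This is a minimal illustration under assumed parameters (independent Bernoulli incidence per part, then an additive log-ratio normal draw over the non-zero parts), not the authors' exact models or estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_composition(p_present, mu, sigma):
    """Two-stage draw: (1) Bernoulli incidence decides which parts are
    non-zero; (2) a logistic-normal draw distributes the unit total over
    the non-zero parts. All parameters here are illustrative."""
    D = len(p_present)
    present = rng.random(D) < np.asarray(p_present)   # stage 1: incidence
    x = np.zeros(D)
    k = int(present.sum())
    if k == 1:
        x[present] = 1.0                              # whole unit in one part
    elif k > 1:
        # stage 2: additive log-ratio normal over the k non-zero parts
        z = rng.normal(mu, sigma, size=k - 1)
        w = np.append(np.exp(z), 1.0)
        x[present] = w / w.sum()                      # composition sums to 1
    return present.astype(int), x

incidence, comp = simulate_composition([0.9, 0.5, 0.7], mu=0.0, sigma=1.0)
print(incidence, comp)
```

The pair `(incidence, comp)` mirrors the incidence matrix and conditional compositional matrix mentioned in the abstract, one row at a time.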
Abstract:
The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes connectivity of instruments to report relevant features for monitoring. Also, the existence of enough historical records with diagnosed breakdowns is required to make probabilistic models reliable and useful for predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships that trigger predictive maintenance services based on the influence between observed features and previously documented diagnostics.
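As a minimal sketch of the availability calculation the abstract refers to: for a two-state (up/down) Markov model with constant failure rate λ and repair rate μ, the steady-state availability is μ/(λ+μ). The rates below are illustrative, not taken from the paper.

```python
# Steady-state availability of an instrument modeled as a two-state
# (up/down) continuous-time Markov chain; rates would in practice be
# estimated from historical breakdown/repair records.

def availability(failure_rate: float, repair_rate: float) -> float:
    """Long-run fraction of time the instrument is operational."""
    return repair_rate / (failure_rate + repair_rate)

# Example: one failure every 1000 h on average, 10 h mean time to repair.
lam = 1 / 1000.0   # failures per hour
mu = 1 / 10.0      # repairs per hour
print(round(availability(lam, mu), 4))  # 0.9901
```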
Abstract:
Our work is concerned with user modelling in open environments. Our proposal follows the line of contributions to advances in user modelling in open environments enabled by Agent Technology, in what has been called the Smart User Model (SUM). Our research contains a holistic study of user modelling in several research areas related to users. We have developed a conceptualization of user modelling by means of examples from a broad range of research areas, with the aim of improving our understanding of user modelling and its role in the next generation of open and distributed service environments. This report is organized as follows: In chapter 1 we introduce our motivation and objectives. Then, in chapters 2, 3, 4 and 5 we provide the state of the art on user modelling. In chapter 2, we give the main definitions of elements described in the report. In chapter 3, we present a historical perspective on user models. In chapter 4, we provide a review of user models from the perspective of different research areas, with special emphasis on the give-and-take relationship between Agent Technology and user modelling. In chapter 5, we describe the main challenges that, from our point of view, need to be tackled by researchers wanting to contribute to advances in user modelling. The study of the state of the art is followed by an exploratory work in chapter 6, where we define a SUM and a methodology to deal with it. We also present some case studies in order to illustrate the methodology. Finally, we present the thesis proposal to continue the work, together with its corresponding work schedule and timeline.
Abstract:
Customer satisfaction and retention are key issues for organizations in today's competitive marketplace. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the instigation of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM) because it does not rely on strict assumptions about the data. However, this choice was based upon some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, both SEM and PLS approaches were compared by evaluating perceptions of the Isle of Man Post Office's products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.
Abstract:
Calculating explicit closed-form solutions of Cournot models where firms have private information about their costs is, in general, very cumbersome. Most authors therefore consider linear demands and constant marginal costs. However, within this framework, the nonnegativity constraint on prices (and quantities) has been ignored or not properly dealt with, and the correct calculation of all Bayesian Nash equilibria is more complicated than expected. Moreover, multiple symmetric and interior Bayesian equilibria may exist for an open set of parameters. The reason for this is that linear demand is not really linear, since there is a kink at zero price: the general ''linear'' inverse demand function is P(Q) = max{a - bQ, 0} rather than P(Q) = a - bQ.
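The kink at zero price described above can be made concrete in a few lines; the parameter values below are illustrative.

```python
# "Linear" inverse demand with the nonnegativity kink at zero price:
# P(Q) = max(a - b*Q, 0) rather than the unconstrained line a - b*Q.

def inverse_demand(Q: float, a: float = 10.0, b: float = 1.0) -> float:
    """Price as a function of total quantity; price cannot go negative."""
    return max(a - b * Q, 0.0)

# For Q <= a/b both formulas agree; beyond that the kink binds.
print(inverse_demand(5))   # 5.0  (interior: a - b*Q)
print(inverse_demand(15))  # 0.0  (kink binds: a - b*Q = -5 < 0)
```

It is precisely the region where the kink binds that the abstract argues has been ignored when computing Bayesian Nash equilibria.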
Abstract:
We present a detailed analytical and numerical study of the avalanche distributions of the continuous damage fiber bundle model (CDFBM). Linearly elastic fibers undergo a series of partial failure events which give rise to a gradual degradation of their stiffness. We show that the model reproduces a wide range of mechanical behaviors. We find that macroscopic hardening and plastic responses are characterized by avalanche distributions which exhibit an algebraic decay with exponents between 5/2 and 2, different from those observed in mean-field fiber bundle models. We also derive analytically the phase diagram of a family of CDFBM which covers a large variety of potential avalanche size distributions. Our results provide a unified view of the statistics of breaking avalanches in fiber bundle models.
Abstract:
The introduction of an infective-infectious period on the geographic spread of epidemics is considered in two different models. The classical evolution equations arising in the literature are generalized and the existence of epidemic wave fronts is revisited. The asymptotic speed is obtained and improves previous results for the Black Death plague.
Abstract:
This project proposes an application for the automatic calibration of P-system models. To that end, it first presents a study of P-system models and the procedure researchers follow to develop this type of model. An initial serial solution to the problem is developed and its weak points are analysed. A parallel version is then proposed that significantly improves execution time while maintaining high efficiency and scalability.
Abstract:
In this project, we aim to create an application that allows fast and efficient processing of the results produced by a natural-ecosystem simulation tool called PlinguaCore. The goal of this processing is twofold. First, to design an API that efficiently processes the large amount of data generated by the PlinguaCore ecosystem simulator. Second, to make this API easy to integrate into other applications, both for data processing and for model calibration.
Abstract:
The radiation distribution function used by Domínguez and Jou [Phys. Rev. E 51, 158 (1995)] has been recently modified by Domínguez-Cascante and Faraudo [Phys. Rev. E 54, 6933 (1996)]. However, in these studies neither distribution was written in terms of directly measurable quantities. Here a solution to this problem is presented, and we also propose an experiment that may make it possible to determine the distribution function of nonequilibrium radiation experimentally. The results derived do not depend on a specific distribution function for the matter content of the system.
Abstract:
Study carried out during a stay at the Stanford University School of Medicine, Division of Radiation Oncology, United States, between 2010 and 2012. During the two years of the postdoctoral fellowship I worked on two different projects. First, continuing previous studies by the group, we wanted to investigate the cause of the differences in hypoxia levels that we had observed in lung cancer models. Our hypothesis was that these differences were due to the functionality of the vasculature. We used two preclinical models: one in which tumors formed spontaneously in the lungs, and another in which we injected the cells subcutaneously. We used techniques such as dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and the Hoechst 33342 perfusion assay, and both showed that the vasculature of the spontaneous tumors was far more functional than that of the subcutaneous tumors. From this study we can conclude that the differences in hypoxia levels among lung cancer tumor models may be due to variation in the formation and functionality of the vasculature. Therefore, the choice of preclinical model is essential, both for studies of hypoxia and angiogenesis and for therapies targeting these phenomena. The other project I developed concerns radiotherapy and its possible role in promoting tumor self-seeding by circulating tumor cells (CTCs). This effect has been described in some preclinical tumor models. To carry out our studies, we used a mouse breast cancer cell line, either permanently labeled with the Photinus pyralis gene or unlabeled, and performed in vitro and in vivo studies. Both showed that tumor irradiation promotes cell invasion and tumor self-seeding by CTCs.
This finding should be considered in the context of clinical radiotherapy in order to achieve the best treatment for patients with high CTC levels.
Abstract:
The use of cannabis sativa preparations as recreational drugs can be traced back to the earliest civilizations. However, animal models of cannabinoid addiction allowing the exploration of neural correlates of cannabinoid abuse have been developed only recently. We review these models and the role of the CB1 cannabinoid receptor, the main target of natural cannabinoids, and its interaction with opioid and dopamine transmission in reward circuits. Extensive reviews on the molecular basis of cannabinoid action are available elsewhere (Piomelli et al., 2000; Schlicker and Kathmann, 2001).
Abstract:
Background: Few studies have used longitudinal ultrasound measurements to assess the effect of traffic-related air pollution on fetal growth. Objective: We examined the relationship between exposure to nitrogen dioxide (NO2) and aromatic hydrocarbons [benzene, toluene, ethylbenzene, m/p-xylene, and o-xylene (BTEX)] and fetal growth assessed by 1,692 ultrasound measurements among 562 pregnant women from the Sabadell cohort of the Spanish INMA (Environment and Childhood) study. Methods: We used temporally adjusted land-use regression models to estimate exposures to NO2 and BTEX. We fitted mixed-effects models to estimate longitudinal growth curves for femur length (FL), head circumference (HC), abdominal circumference (AC), biparietal diameter (BPD), and estimated fetal weight (EFW). Unconditional and conditional SD scores were calculated at 12, 20, and 32 weeks of gestation. Sensitivity analyses were performed considering time-activity patterns during pregnancy. Results: Exposure to BTEX from early pregnancy was negatively associated with growth in BPD during weeks 20–32. None of the other fetal growth parameters were associated with exposure to air pollution during pregnancy. When considering only women who spent 2 hr/day in nonresidential outdoor locations, effect estimates were stronger and statistically significant for the association between NO2 and growth in HC during weeks 12–20 and growth in AC, BPD, and EFW during weeks 20–32. Conclusions: Our results lend some support to an effect of exposure to traffic-related air pollutants from early pregnancy on fetal growth during mid-pregnancy.