920 results for New Keynesian model, Bayesian methods, Monetary policy, Great Inflation
Abstract:
Purpose - The purpose of this paper is to present designs for an accelerated life test (ALT). Design/methodology/approach - Bayesian methods and Markov chain Monte Carlo (MCMC) simulation were used. Findings - The paper proposes a Bayesian method based on MCMC for ALT under the Exponentiated-Weibull (EW) distribution (for lifetime) and the Arrhenius model (relating the stress variable and parameters). The paper concludes that it is a reasonable alternative to classical statistical methods, since the implementation of the proposed method is simple, does not require advanced computational understanding, and inferences on the parameters can be made easily. Using the predictive density of a future observation, a procedure was developed to plan an ALT and to verify whether the conforming fraction of the manufacturing process reaches some desired level of quality. This procedure is useful for statistical process control in many industrial applications. Research limitations/implications - The results may be applied by a semiconductor manufacturer. Originality/value - The Exponentiated-Weibull-Arrhenius model has never before been used to plan an ALT. © Emerald Group Publishing Limited.
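The paper's exact model and priors are not reproduced here; the following is a minimal sketch of the kind of sampler the abstract describes: a random-walk Metropolis-Hastings chain for an Exponentiated-Weibull lifetime whose scale follows an Arrhenius link to stress temperature. All data, stress levels, step sizes, and the flat priors are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ew_loglik(t, alpha, beta, sigma):
    # Exponentiated-Weibull log-likelihood: F(t) = [1 - exp(-(t/sigma)^beta)]^alpha
    z = (t / sigma) ** beta
    return np.sum(np.log(alpha * beta / sigma) + (beta - 1) * np.log(t / sigma)
                  - z + (alpha - 1) * np.log1p(-np.exp(-z)))

def arrhenius_sigma(temp_k, a, b):
    # Arrhenius link: characteristic life shrinks as stress temperature rises
    return np.exp(a + b / temp_k)

# synthetic ALT data at two stress temperatures (Kelvin); alpha = 1, beta = 1.5
temps = np.array([323.0, 353.0])
data = [s * rng.weibull(1.5, 50) for s in arrhenius_sigma(temps, -2.0, 2000.0)]

def log_post(theta):
    # theta = (a, b, log alpha, log beta); flat priors for illustration only
    a, b, la, lb = theta
    return sum(ew_loglik(t, np.exp(la), np.exp(lb), arrhenius_sigma(T, a, b))
               for T, t in zip(temps, data))

theta = np.array([-2.0, 2000.0, 0.0, 0.4])      # chain start
step = np.array([0.05, 50.0, 0.05, 0.05])       # per-parameter proposal scales
lp, chain = log_post(theta), []
for _ in range(2000):
    prop = theta + step * rng.normal(size=4)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis-Hastings accept
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain)
```

Inference on the parameters then reduces to summarizing `chain` after a burn-in, which is the simplicity the paper argues for relative to classical ALT methods.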
Abstract:
Includes bibliography
Abstract:
Graduate Program in Economics - FCLAR
Abstract:
Introduction: Many experimental models using lung lavage have been developed for the study of acute respiratory distress syndrome (ARDS). The original technique has been modified by many authors, resulting in difficulties with reproducibility. There is insufficient detail on the lung injury models used, including hemodynamic stability during animal preparation and drawbacks encountered such as mortality. The authors studied the effects of pulmonary recruitment and of a fixed tidal volume (Vt) or fixed inspiratory pressure on the establishment of the experimental ARDS model. Methods: Adult rabbits were submitted to repeated lung lavages with 30 ml/kg warm saline until the ARDS definition (PaO2/FiO2 ≤ 100) was reached. The animals were divided into three groups, according to the technique used for mechanical ventilation: 1) fixed Vt of 10 ml/kg; 2) fixed inspiratory pressure (IP) with a tidal volume of 10 ml/kg prior to the first lung lavage; and 3) fixed Vt of 10 ml/kg with pulmonary recruitment before the first lavage. Results: The use of alveolar recruitment maneuvers and the use of a fixed Vt or IP between the lung lavages did not change the number of lung lavages necessary to obtain the experimental model of ARDS or the hemodynamic stability of the animals during the procedure. A trend was observed toward an increased mortality rate with the recruitment maneuver and with the use of a fixed IP. Discussion: There were no differences between the three study groups: neither the lung recruitment maneuver nor the choice of fixed tidal volume versus fixed inspiratory pressure affected the number of lung lavages necessary to obtain the ARDS animal model. Furthermore, the three procedures resulted in good hemodynamic stability of the animals and a low mortality rate. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
This thesis presents Bayesian solutions to inference problems for three types of social network data structures: a single observation of a social network, repeated observations on the same social network, and repeated observations on a social network developing through time. A social network is conceived as a structure consisting of actors and their social interaction with each other. A common conceptualisation of social networks is to let the actors be represented by nodes in a graph, with edges between pairs of nodes that are relationally tied to each other according to some definition. Statistical analysis of social networks is to a large extent concerned with modelling these relational ties, which lends itself to empirical evaluation. The first paper deals with a family of statistical models for social networks called exponential random graphs, which take various structural features of the network into account. In general, the likelihood functions of exponential random graphs are known only up to a constant of proportionality. A procedure for performing Bayesian inference using Markov chain Monte Carlo (MCMC) methods is presented. The algorithm consists of two basic steps: one in which an ordinary Metropolis-Hastings updating step is used, and another in which an importance sampling scheme is used to calculate the acceptance probability of the Metropolis-Hastings step. In the second paper, a method for modelling reports given by actors (or other informants) on their social interaction with others is investigated in a Bayesian framework. The model contains two basic ingredients: the unknown network structure and functions that link this unknown network structure to the reports given by the actors. These functions take the form of probit link functions.
An intrinsic problem is that the model is not identified, meaning that there are combinations of values of the unknown structure and the parameters in the probit link functions that are observationally equivalent. Instead of using restrictions to achieve identification, it is proposed that the different observationally equivalent combinations of parameters and unknown structure be investigated a posteriori. Estimation of parameters is carried out using Gibbs sampling with a switching device that enables transitions between posterior modal regions. The main goal of the procedures is to provide tools for comparing different model specifications. Papers 3 and 4 propose Bayesian methods for longitudinal social networks. The premise of the models investigated is that overall change in social networks occurs as a consequence of sequences of incremental changes. Models for the evolution of social networks using continuous-time Markov chains are meant to capture these dynamics. Paper 3 presents an MCMC algorithm for exploring the posteriors of parameters for such Markov chains. More specifically, the unobserved evolution of the network between observations is explicitly modelled, thereby avoiding the need to deal with explicit formulas for the transition probabilities. This enables likelihood-based parameter inference in a wider class of network evolution models than was previously available. Paper 4 builds on the inference procedure proposed in Paper 3 and demonstrates how to perform model selection for a class of network evolution models.
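The two-step MCMC idea of the first paper can be illustrated on a deliberately trivial exponential random graph whose only statistic is the edge count. This is not the thesis's implementation: for a pure edge-count model the ties are independent, so graph simulation is exact, but the importance-sampling estimate of the intractable normalizing-constant ratio inside the Metropolis-Hastings acceptance probability has the same shape as in the general case. Graph size, prior, and step scale are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pairs = 8 * 7 // 2             # dyads in a toy 8-node undirected graph

def sample_edges(theta, size):
    # For an edge-count ERGM p(x) ∝ exp(theta * edges(x)), ties are independent
    # Bernoulli(sigmoid(theta)); richer statistics (triangles, stars) would
    # require Gibbs/Metropolis sweeps over the adjacency matrix here instead.
    p = 1.0 / (1.0 + np.exp(-theta))
    return rng.binomial(n_pairs, p, size=size)   # edge counts of simulated graphs

s_obs = 10                       # observed edge count (synthetic)

theta, chain = 0.0, []
for _ in range(500):
    prop = theta + 0.3 * rng.normal()
    # importance sampling step: z(prop)/z(theta) is estimated by the mean of
    # exp((prop - theta) * edges(x)) over graphs x drawn at the current theta,
    # which makes the otherwise intractable acceptance ratio computable
    draws = sample_edges(theta, 50)
    log_zratio = np.log(np.mean(np.exp((prop - theta) * draws)))
    # Metropolis-Hastings acceptance with a N(0, 5^2) prior on theta
    log_acc = (prop - theta) * s_obs - log_zratio \
              - (prop ** 2 - theta ** 2) / (2 * 5.0 ** 2)
    if np.log(rng.uniform()) < log_acc:
        theta = prop
    chain.append(theta)
```

The same skeleton applies when `sample_edges` is replaced by a full graph sampler and the sufficient statistics become a vector.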
Abstract:
This research was triggered by an emergent trend in customer behavior: customers have rapidly expanded their channel experiences and preferences beyond traditional channels (such as stores), and they expect the company with which they do business to have a presence on all these channels. This evidence has produced increasing interest in multichannel customer behavior and has motivated several researchers to study customers' channel choice dynamics in multichannel environments. We study how the consumer decision process for channel choice and response to marketing communications evolves for a cohort of new customers. We assume a newly acquired customer's decisions are described by a "trial" model, but the customer's choice process evolves to a "post-trial" model as the customer learns his or her preferences and becomes familiar with the firm's marketing efforts. The trial and post-trial decision processes are each described by a different multinomial logit choice model, and the evolution from the trial to the post-trial model is governed by a customer-level geometric distribution that captures the time it takes for the customer to make the transition. We utilize data for a major retailer who sells through three channels: retail store, the Internet, and catalog. The model is estimated using Bayesian methods that allow for cross-customer heterogeneity. This allows us to obtain distinct parameter estimates for the trial and post-trial stages and to estimate the speed of this transition at the individual level. The results show, for example, that the customer decision process does indeed evolve over time. Customers differ in the duration of the trial period, and marketing has a different impact on channel choice in the trial and post-trial stages. Furthermore, we show that some people switch channel decision processes while others do not, and we find that several factors affect the probability of switching decision processes.
Insights from this study can help managers tailor their marketing communication strategy as customers gain channel choice experience. Managers may also gain insights into the timing of direct marketing communications: they can predict the duration of the trial phase at the individual level, identifying customers with a quick, long, or even absent trial phase. They can even predict whether a customer will change decision processes over time, and they can influence the switching process using specific marketing tools.
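The model structure described above, two multinomial logit regimes bridged by a geometric switching time, can be sketched compactly. The channel set, covariates, and parameter values below are invented for illustration and are not the paper's estimates.

```python
import numpy as np

def mnl_probs(beta, x):
    # multinomial logit over channels given covariates x (channels x features)
    u = x @ beta
    e = np.exp(u - u.max())                  # stabilized softmax
    return e / e.sum()

def channel_probs(t, beta_trial, beta_post, pi, x):
    # The switch time follows a customer-level Geometric(pi), so the
    # probability of still being in the trial regime at purchase occasion t
    # is (1 - pi)^(t - 1); choice probabilities mix the two regimes.
    p_trial = (1 - pi) ** (t - 1)
    return p_trial * mnl_probs(beta_trial, x) + (1 - p_trial) * mnl_probs(beta_post, x)

# toy example: 3 channels (store, internet, catalog), 2 covariates each
x = np.array([[1.0, 0.2],
              [1.0, 0.8],
              [1.0, 0.5]])
beta_trial = np.array([0.5, -1.0])           # hypothetical trial-stage weights
beta_post = np.array([0.2, 2.0])             # hypothetical post-trial weights
p1 = channel_probs(1, beta_trial, beta_post, pi=0.3, x=x)   # early occasion
p9 = channel_probs(9, beta_trial, beta_post, pi=0.3, x=x)   # later occasion
```

In the paper's Bayesian estimation, `beta_trial`, `beta_post`, and `pi` carry customer-level heterogeneity; here they are fixed scalars purely to show the mixture mechanics.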
Abstract:
This thesis focuses on two aspects of European economic integration: exchange rate stabilization between non-euro countries and the euro area, and real and nominal convergence of Central and Eastern European countries. Each chapter covers these aspects from both a theoretical and an empirical perspective. Chapter 1 investigates whether the introduction of the euro was accompanied by a shift in the de facto exchange rate policy of European countries outside the euro area, using methods recently developed in the literature to detect "Fear of Floating" episodes. I find that European inflation targeters have tried to stabilize the euro exchange rate after its introduction; fixed exchange rate arrangements, by contrast, apart from official policy changes, remained stable. Finally, the euro seems to have gained a relevant role as a reference currency even outside Europe. Chapter 2 proposes an approach to estimating central bank preferences starting from the central bank's optimization problem within a small open economy, using Sweden as a case study, to determine whether stabilization of the exchange rate played a role in the monetary policy rule of the Riksbank. The results show that it did not influence interest rate setting; exchange rate stabilization probably occurred as a result of increased economic integration and business cycle convergence. Chapter 3 studies the interactions between wages in the public sector, the traded private sector, and the closed sector in ten EU transition countries. The theoretical literature on wage spillovers suggests that the traded sector should be the leader in wage setting, with non-traded sector wages adjusting. We show that there is large heterogeneity across countries, and that sheltered and public sector wages are often the leaders in wage determination. This result is relevant from a policy perspective, since wage spillovers, by causing costs to grow faster than productivity, may affect the international cost competitiveness of the traded sector.
Abstract:
The dissertation consists of four papers that aim at providing new contributions in the field of macroeconomics, monetary policy and financial stability. The first paper proposes a new Dynamic Stochastic General Equilibrium (DSGE) model with credit frictions and a banking sector to study the pro-cyclicality of credit and the role of different prudential regulatory frameworks in affecting business cycle fluctuations and in restoring macroeconomic and financial stability. The second paper develops a simple DSGE model capable of evaluating the effects of large purchases of treasuries by central banks. This theoretical framework is employed to evaluate the impact on yields and the macroeconomy of large purchases of medium- and long-term government bonds recently implemented in the US and UK. The third paper studies the effects of ECB communications about unconventional monetary policy operations on the perceived sovereign risk of Italy over the last five years. The empirical results are derived from both an event-study analysis and a GARCH model, which uses Italian long-term bond futures to disentangle expected from unexpected policy actions. The fourth paper proposes a DSGE model with an endogenous term structure of interest rates, which is able to replicate the stylized facts regarding the yield curve and the term premium in the US over the period 1987:3-2011:3, without compromising its ability to match macro dynamics.
Abstract:
Autism Spectrum Disorders (ASDs) describe a set of neurodevelopmental disorders and represent a significant public health problem. Currently, ASDs are not diagnosed before the second year of life, but early identification would be crucial, as early interventions are much more effective than specific therapies started in later childhood. To this aim, cheap and contact-less automatic approaches have recently aroused great clinical interest. Among them, the cry and the movements of the newborn, both involving the central nervous system, have been proposed as possible indicators of neurological disorders. This PhD work is a first step towards solving this challenging problem. An integrated system is presented that enables the recording of audio (crying) and video (movement) data of the newborn, their automatic analysis with innovative techniques for the extraction of clinically relevant parameters, and their classification with data mining techniques. New robust algorithms were developed for the selection of the voiced parts of the cry signal, the estimation of acoustic parameters based on the wavelet transform, and the analysis of the infant's general movements (GMs) through a new body model for segmentation and 2D reconstruction. In addition to a thorough literature review, this thesis presents the state of the art on these topics, which shows that no studies exist concerning normative ranges for newborn infant cry in the first 6 months of life, nor the correlation between cry and movements. Using the new automatic methods, a population of control infants ("low-risk", LR) was compared to a group of "high-risk" (HR) infants, i.e. siblings of children already diagnosed with ASD. A subset of LR infants clinically diagnosed as having typical development (TD) and one infant affected by ASD were also compared. The results show that the selected acoustic parameters allow good differentiation between the two groups, providing new diagnostic and therapeutic perspectives.
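As one hypothetical ingredient of such a pipeline, voiced-segment selection is often approximated by short-time energy thresholding. The function below is a generic sketch under that assumption, not the thesis's algorithm, which uses more robust, wavelet-based processing; frame length and threshold are arbitrary.

```python
import numpy as np

def voiced_frames(signal, sr, frame_ms=20, energy_quantile=0.7):
    # crude voiced/unvoiced selection: keep frames whose short-time energy
    # exceeds a quantile threshold (real systems add pitch and zero-crossing
    # criteria; this is only a placeholder for a full cry-analysis front end)
    n = int(sr * frame_ms / 1000)
    frames = signal[: len(signal) // n * n].reshape(-1, n)
    energy = (frames ** 2).mean(axis=1)
    return energy > np.quantile(energy, energy_quantile)

# synthetic check: 1 s of silence followed by 1 s of noise at 8 kHz
rng = np.random.default_rng(2)
sig = np.concatenate([np.zeros(8000), rng.normal(size=8000)])
mask = voiced_frames(sig, 8000)          # 100 frames of 20 ms each
```

Acoustic parameters (fundamental frequency, resonance frequencies) would then be estimated only on the frames flagged by `mask`.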
Abstract:
Background For reliable assessment of ventilation inhomogeneity, multiple-breath washout (MBW) systems should be realistically validated. We describe a new lung model for in vitro validation under physiological conditions and the assessment of a new nitrogen (N2) MBW system. Methods The N2MBW setup indirectly measures the N2 fraction (FN2) from main-stream carbon dioxide (CO2) and side-stream oxygen (O2) signals: FN2 = 1 − FO2 − FCO2 − FArgon. For in vitro N2MBW, a double-chamber plastic lung model was filled with water, heated to 37°C, and ventilated at various lung volumes, respiratory rates, and FCO2. In vivo N2MBW was undertaken in triplicate on two occasions in 30 healthy adults. The primary N2MBW outcome was functional residual capacity (FRC). We assessed the in vitro error (√[difference²]) between measured and model FRC (100-4174 mL), and the error between tests of in vivo FRC, lung clearance index (LCI), and normalized phase III slope indices (Sacin and Scond). Results The model generated 145 FRCs under BTPS conditions and various breathing patterns. The mean (SD) error was 2.3 (1.7)%. For FRCs of 500 to 4174 mL, 121 (98%) were within 5%. For FRCs of 100 to 400 mL, the error was below 7%. The in vivo FRC error between tests was 10.1 (8.2)%. LCI was the most reproducible ventilation inhomogeneity index. Conclusion The lung model generates lung volumes under the conditions encountered during clinical MBW testing and enables realistic validation of MBW systems. The new N2MBW system reliably measures lung volumes and delivers reproducible LCI values.
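The indirect FN2 computation stated in the Methods, and the washout-based FRC outcome, reduce to two short formulas. The argon fraction and the washout numbers below are illustrative (the argon value is the standard fraction in dry air, not a value reported by the study).

```python
def fn2(fo2, fco2, far=0.0093):
    # indirect N2 fraction from the O2 and CO2 signals, as in the abstract:
    # FN2 = 1 - FO2 - FCO2 - FArgon (far defaults to argon's fraction in dry air)
    return 1.0 - fo2 - fco2 - far

def frc_from_washout(vol_n2_expired, fn2_start, fn2_end):
    # multiple-breath washout FRC estimate: net N2 volume washed out divided
    # by the drop in alveolar N2 fraction over the washout
    return vol_n2_expired / (fn2_start - fn2_end)

fn2_air = fn2(0.2095, 0.0004)            # roomlike air: about 0.78
frc = frc_from_washout(1.56, 0.78, 0.02) # hypothetical washout, litres
```

In a real N2MBW system both signals are aligned breath by breath and integrated over expired flow; the sketch only shows the algebra of the outcome.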
Abstract:
The purpose of this study is to develop statistical methodology to facilitate indirect estimation of the concentrations of antiretroviral drugs and viral loads in the prostate gland and the seminal vesicle. Differences in antiretroviral drug concentrations between these organs may lead to suboptimal concentrations in one gland compared to the other. Suboptimal levels of the antiretroviral drugs will not fully suppress the virus in that gland, leaving a source of sexually transmissible virus and increasing the chance of selecting for drug-resistant virus. This information may be useful in selecting an antiretroviral drug regimen that achieves optimal concentrations in most glands of the male genital tract. Using fractionally collected semen ejaculates, Lundquist (1949) measured, in each fraction, levels of surrogate markers that are uniquely produced by specific male accessory glands. To determine the original glandular concentrations of the surrogate markers, Lundquist solved a simultaneous series of linear equations. This method has several limitations: in particular, it does not yield a unique solution, it does not address measurement error, and it disregards inter-subject variability in the parameters. To cope with these limitations, we developed a mechanistic latent variable model based on the physiology of the male genital tract and the surrogate markers. We employ a Bayesian approach and perform a sensitivity analysis with regard to the distributional assumptions on the random effects and priors. The model and Bayesian approach are validated on experimental data where the concentration of a drug should be (biologically) differentially distributed between the two glands. In this example, the Bayesian model-based conclusions are found to be robust to model specification, and this hierarchical approach leads to more scientifically valid conclusions than the original methodology. In particular, unlike existing methods, the proposed model-based approach was not affected by a common form of outliers.
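The classical Lundquist-style step that the abstract criticizes, recovering glandular concentrations from fractionated measurements, amounts to a linear system. The sketch below uses invented mixing proportions and a least-squares solve; the thesis's contribution replaces this with a hierarchical Bayesian latent variable model, which is not shown here.

```python
import numpy as np

# Hypothetical mixing proportions: rows = ejaculate fractions, columns =
# glands (prostate, seminal vesicle); each fraction mixes the two fluids.
W = np.array([[0.8, 0.2],
              [0.5, 0.5],
              [0.1, 0.9]])

C_gland_true = np.array([40.0, 5.0])       # unknown glandular concentrations
# measured marker concentration in each fraction, with small measurement error
y = W @ C_gland_true + np.array([0.3, -0.2, 0.1])

# classical step: solve the simultaneous linear equations (least squares,
# since measurement error makes an exact solution impossible)
C_hat, *_ = np.linalg.lstsq(W, y, rcond=None)
```

The limitations listed in the abstract are visible even here: the point estimate `C_hat` carries no uncertainty, assumes the mixing matrix `W` is known exactly, and pools no information across subjects, which is what motivates the Bayesian hierarchical alternative.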