960 results for Accelerated failure time model
Abstract:
Motivated by new and innovative rental business models, this paper develops a novel discrete-time model of a rental operation with random loss of inventory due to customer use. The inventory level is chosen before the start of a finite rental season, and customers not immediately served are lost. Our analysis framework uses stochastic comparisons of sample paths to derive structural results that hold in great generality for demands, rental durations, and rental unit lifetimes. Considering different "recirculation" rules (i.e., which rental unit to choose to meet each demand), we prove the concavity of the expected profit function and identify the optimal recirculation rule. A numerical study clarifies when considering rental unit loss and recirculation rules matters most for the inventory decision: accounting for rental unit loss can increase the expected profit by 7% for a single season and becomes even more important as the time horizon lengthens. We also observe that the optimal inventory level responds non-monotonically to an increasing loss probability. Finally, we show that choosing the optimal recirculation rule over another simple policy allows more rental units to be profitably added, and the profit-maximizing service level increases by up to 6 percentage points.
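The abstract does not give the model's primitives, so the following Python sketch is only a toy illustration of the kind of discrete-time rental dynamics described: Bernoulli returns, use-dependent unit loss, Poisson demand with lost sales, and two hypothetical recirculation rules ("fewest_uses" and "most_uses") standing in for the rules studied in the paper. Every parameter name and the specific loss mechanism are assumptions, not the paper's formulation.

```python
import math
import random


def simulate_season(inventory, periods, demand_rate, return_prob,
                    loss_prob_per_use, price, unit_cost, rule, rng):
    """Toy discrete-time rental season (assumed dynamics, not the paper's model).

    Each unit carries a usage count; a returning unit is scrapped with
    probability min(1, loss_prob_per_use * uses).  `rule` decides which
    available unit serves the next request ('fewest_uses' or 'most_uses'),
    a stand-in for the paper's recirculation rules.
    """
    available = [0] * inventory      # usage counts of units on the shelf
    rented = []                      # usage counts of units out with customers
    revenue = 0.0

    for _ in range(periods):
        # Returns: each rented unit independently comes back this period.
        still_out, came_back = [], []
        for uses in rented:
            (came_back if rng.random() < return_prob else still_out).append(uses)
        rented = still_out

        # Random loss due to customer use: worn units are scrapped more often.
        for uses in came_back:
            if rng.random() >= min(1.0, loss_prob_per_use * uses):
                available.append(uses)

        # Demand (Poisson draw via Knuth's method); unmet requests are lost.
        demand, p = 0, rng.random()
        while p > math.exp(-demand_rate):
            demand, p = demand + 1, p * rng.random()

        for _ in range(demand):
            if not available:
                break                # customer not immediately served is lost
            available.sort()
            unit = available.pop(0 if rule == "fewest_uses" else -1)
            rented.append(unit + 1)  # unit goes out with one more use on it
            revenue += price

    return revenue - unit_cost * inventory


if __name__ == "__main__":
    rng = random.Random(42)
    for rule in ("fewest_uses", "most_uses"):
        profit = sum(simulate_season(12, 30, 5.0, 0.3, 0.02, 10.0, 40.0, rule, rng)
                     for _ in range(500)) / 500
        print(rule, round(profit, 1))
```

Sweeping the `inventory` argument over a range of values mimics the pre-season inventory decision whose optimum the paper characterizes.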
Abstract:
This presentation was both an illustrated lecture and a published paper presented at the IMPACT 9 Conference, Printmaking in the Post-Print Age, Hangzhou, China, 2015. It was an extension of the exhibition catalogue essay for the Bluecoat Gallery exhibition of the same name. In 2014 I curated an exhibition, The Negligent Eye, at the Bluecoat Gallery in Liverpool as the result of a longstanding interest in scanning and 3D printing and the role of these in changing the field of Print within Fine Art practice. In the aftermath of curating the show I have continued to reflect on this material with reference to the writings of Vilém Flusser and Hito Steyerl. The work in the exhibition came from a wide range of artists of all generations, most of whom are not explicitly located within Printmaking. Whilst some work did not use any scanning technology at all, a shared fascination with the particular translating device of the systematizing 'eye' of a scanning digital video camera, flatbed or medical scanner was expressed by all the work in the show. Through writing this paper I aim to extend my own understanding of questions which arose from the juxtapositions of work and the production of the accompanying catalogue. The show developed in dialogue with curators Bryan Biggs and Sarah-Jane Parsons of the Bluecoat Gallery, who sent a series of questions about scanning to participating artists. In reflecting upon their answers I will extend the discussions begun in the process of this research. A kind of created attention deficit disorder seems to operate on us all today, driving us to make and distribute images and information at speed. What value do ways of making which require slow looking or intensive material explorations have in this accelerated system? What model of the world is being constructed by the drive of simulated realities toward ever-greater resolution, so-called high definition? How are our perceptions of reality being altered by the world-view presented in the smooth, colourful, ever-morphing simulations that surround us? The limitations of digital technology are often a starting point for artists to reflect on our relationship to real-world fragility. I will be looking at practices where tactility or dimensionality in a form of hard copy engages with these questions, using examples from the exhibition. Artists included in the show were: Cory Arcangel, Christiane Baumgartner, Thomas Bewick, Jyll Bradley, Maurice Carlin, Helen Chadwick, Susan Collins, Conroy/Sanderson, Nicky Coutts, Elizabeth Gossling, Beatrice Haines, Juneau Projects, Laura Maloney, Bob Matthews, London Fieldworks (with the participation of Gustav Metzger), Marilène Oliver, Flora Parrott, South Atlantic Souvenirs, Imogen Stidworthy, Jo Stockham, Wolfgang Tillmans, Alessa Tinne, Michael Wegerer, Rachel Whiteread, Jane and Louise Wilson. Keywords: Scanning, Art, Technology, Copy, Materiality.
Abstract:
This thesis presents quantitative studies of T cell and dendritic cell (DC) behaviour in mouse lymph nodes (LNs) in the naive state and following immunisation. These processes are of importance and interest in basic immunology, and better understanding could improve both diagnostic capacity and therapeutic manipulations, potentially helping to produce more effective vaccines or to develop treatments for autoimmune diseases. The problem is also interesting conceptually, as it is relevant to other fields where 3D movement of objects is tracked with a discrete scanning interval. A general immunology introduction is presented in chapter 1. In chapter 2, I apply quantitative methods to multi-photon imaging data to measure how T cells and DCs are spatially arranged in LNs. This has been previously studied to describe differences between the naive and immunised state and as an indicator of the magnitude of the immune response in LNs, but previous analyses have been largely descriptive. The quantitative analysis shows that some of the previous conclusions may have been premature. In chapter 3, I use Bayesian state-space models to test hypotheses about the mode of T cell search for DCs. A two-state mode of movement, where T cells can be classified as either interacting with a DC or freely migrating, is supported over a model in which T cells would home in on DCs at a distance through, for example, the action of chemokines. In chapter 4, I study whether T cell migration is linked to the geometric structure of the fibroblast reticular cell (FRC) network. I find support for the hypothesis that the movement is constrained to the FRC network over an alternative 'random walk with persistence time' model, where cells would move randomly with a short-term persistence driven by a hypothetical T cell intrinsic 'clock'. I also present unexpected results on the FRC network geometry. Finally, a quantitative method is presented for addressing some measurement biases inherent to multi-photon imaging. In all three chapters, novel findings are made, and the methods developed have the potential for further use to address important problems in the field. In chapter 5, I present a summary and synthesis of results from chapters 3-4 and a more speculative discussion of these results and potential future directions.
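The abstract names the competing movement hypotheses without formalizing them; purely as an illustrative sketch (notation assumed, not taken from the thesis), the two-state description of chapter 3 could be written as a switching random walk:

\[
z_t \in \{\text{interacting}, \text{migrating}\}, \qquad
\Pr(z_{t+1} = j \mid z_t = i) = \pi_{ij},
\]
\[
\mathbf{x}_{t+1} - \mathbf{x}_t \mid z_t \sim \mathcal{N}\!\left(\mathbf{0},\, \sigma_{z_t}^{2} I_3\right),
\qquad \sigma_{\text{interacting}} \ll \sigma_{\text{migrating}},
\]

whereas the rejected "homing at a distance" alternative would add a drift term pointing toward the nearest DC. Fitting such a model in a Bayesian state-space framework means inferring the hidden states \(z_t\) jointly with the movement parameters from the discretely sampled cell positions \(\mathbf{x}_t\).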
Abstract:
Nowadays, organizations tend to develop themselves in order to become more effective and efficient. In this context, this study proposes a model for calculating Quality Costs (CQ) in the maintenance and sustainment of Portuguese Air Force (FA) weapon systems, contributing to the continuous improvement of its Airworthiness and Quality Management System (SGQA). The study evaluates the use of the Prevention, Appraisal and Failure (PAF) model for calculating CQ within the SGQA, how Information Systems (SI) can contribute to this calculation, and which system structure should integrate and operationalize the model. The investigation follows hypothetico-deductive reasoning, using a qualitative strategy applied to a case study of the Epsilon TB-30 weapon system. After presenting a theoretical framework, the hypotheses raised are tested through document analysis and interviews with personnel in key functions within this scope. The study shows that it is possible to use the PAF model to calculate CQ in the SGQA; however, the SI and the system's processes need to be adapted to operationalize it. Finally, an implementation plan for the CQ model is proposed, and some recommendations are made for its future development.
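For reference, the PAF framework mentioned above is the standard quality-costing decomposition; in a minimal sketch (notation assumed, not taken from the study), total quality costs are

\[
C_Q = C_{\text{prevention}} + C_{\text{appraisal}} + C_{\text{internal failure}} + C_{\text{external failure}},
\]

where prevention and appraisal costs are incurred to avoid and detect nonconformities, and internal/external failure costs arise from defects found before and after delivery, respectively.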
Abstract:
Doctorate in Mathematics
Abstract:
The first paper sheds light on the informational content of high-frequency data and daily data. I assess the economic value of the two families of models by comparing their performance in forecasting asset volatility through the Value at Risk metric. In running the comparison, this paper introduces two key assumptions: jumps in prices and a leverage effect in volatility dynamics. Findings suggest that high-frequency data models do not exhibit a superior performance over daily data models. In the second paper, building on Majewski et al. (2015), I propose an affine discrete-time model, labeled VARG-J, which is characterized by a multifactor volatility specification. In the VARG-J model, volatility experiences periods of extreme movements through a jump factor modeled as an Autoregressive Gamma Zero process. Estimation under the historical measure is carried out by quasi-maximum likelihood with the Extended Kalman Filter. This strategy allows both volatility factors to be filtered out by introducing a measurement equation that relates realized volatility to latent volatility. The risk premia parameters are calibrated using call options written on the S&P 500 index. The results clearly illustrate the important contribution of the jump factor to the pricing performance of options and the economic significance of the volatility jump risk premia. In the third paper, I analyze whether there is empirical evidence of contagion at the bank level, measuring the direction and the size of contagion transmission between European markets. In order to understand and quantify contagion transmission in the banking market, I estimate the econometric model of Aït-Sahalia et al. (2015), in which contagion is defined as the within- and between-country transmission of shocks and asset returns are directly modeled as a Hawkes jump-diffusion process. The empirical analysis indicates that there is clear evidence of contagion from Greece to European countries as well as self-contagion in all countries.
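For reference, the Value at Risk metric used for the model comparison in the first paper is, in standard notation (an assumption here, not a quotation from the thesis), the conditional quantile of the return distribution:

\[
\Pr\!\left(r_{t+1} \le -\mathrm{VaR}^{\alpha}_{t} \,\middle|\, \mathcal{F}_t\right) = \alpha,
\]

so a volatility model is judged by how often realized returns breach its forecast quantile relative to the nominal level \(\alpha\).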
Abstract:
This work summarizes a wide variety of research activities carried out with the main objective of increasing the efficiency and reducing the fuel consumption of Gasoline Direct Injection engines, especially under high loads. For this purpose, two main innovative technologies have been studied, Water Injection and Low-Pressure Exhaust Gas Recirculation, which help to reduce the temperature of the gases inside the combustion chamber and thus mitigate knock, one of the main limiting factors for the efficiency of modern downsized engines that operate at high specific power. A prototype Port Water Injection system was developed and extensive experimental work was carried out, initially to identify the benefits and limitations of this technology. This led to the subsequent development and testing of a combustion controller, implemented in a Rapid Control Prototyping environment, capable of managing water injection to achieve knock mitigation and a more efficient combustion phase. Regarding Low-Pressure Exhaust Gas Recirculation, a commercial engine already equipped with this technology was used to carry out experimental work in a similar fashion to that of water injection. Another prototype water injection system was mounted on this second engine, in order to test both technologies, first separately, to compare them under equal conditions, and then together, in search of a possible synergy. Additionally, based on experimental data from several engines tested during this study, including both GDI and GCI engines, a real-time model (or virtual sensor) for the estimation of the maximum in-cylinder pressure has been developed and validated. This parameter is of vital importance for determining the rate at which damage occurs to engine components, and therefore for extracting maximum performance without inducing permanent damage.
Abstract:
In the face of the current economic and financial environment, predicting corporate bankruptcy is arguably a phenomenon of increasing interest to investors, creditors, borrowing firms, and governments alike. Within the strand of literature focused on bankruptcy forecasting we can find diverse types of research employing a wide variety of techniques, but only a few researchers have used survival analysis for the examination of this issue. We propose a model for the prediction of corporate bankruptcy based on survival analysis, a technique which stands on its own merits. In this research, the hazard rate is the probability of "bankruptcy" as of time t, conditional upon having survived until time t. Many hazard models are applied in a context where the running of time naturally affects the hazard rate. The model employed in this paper uses the survival time or the hazard risk as the dependent variable, considering the unsuccessful companies as censored observations.
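As a minimal formalization in standard survival-analysis notation (assumed here rather than quoted from the paper), the hazard rate of bankruptcy at time \(t\), given survival up to \(t\), is

\[
h(t) = \lim_{\Delta t \to 0}
\frac{\Pr\!\left(t \le T < t + \Delta t \mid T \ge t\right)}{\Delta t}
= \frac{f(t)}{S(t)}, \qquad S(t) = \Pr(T > t),
\]

where \(T\) is the firm's time to bankruptcy, \(f\) its density, and \(S\) the survival function; firms still operating at the end of the observation window enter the likelihood as censored observations.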
Abstract:
First published online: December 16, 2014.
Abstract:
Leptospirosis in humans usually involves hypokalaemia and hypomagnesaemia and the putative mechanism underlying such ionic imbalances may be related to nitric oxide (NO) production. We previously demonstrated the correlation between serum levels of NO and the severity of renal disease in patients with severe leptospirosis. Methylene blue inhibits soluble guanylyl cyclase (downstream of the action of any NO synthase isoforms) and was recently reported to have beneficial effects on clinical and experimental sepsis. We investigated the occurrence of serum ionic changes in experimental leptospirosis at various time points (4, 8, 16 and 28 days) in a hamster model. We also determined the effect of methylene blue treatment when administered as an adjuvant therapy, combined with late initiation of standard antibiotic (ampicillin) treatment. Hypokalaemia was not reproduced in this model: all of the groups developed increased levels of serum potassium (K). Furthermore, hypermagnesaemia, rather than magnesium (Mg) depletion, was observed in this hamster model of acute infection. These findings may be associated with an accelerated progression to acute renal failure. Adjuvant treatment with methylene blue had no effect on survival or serum Mg and K levels during acute-phase leptospirosis in hamsters.
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not add further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
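For context, a hedged sketch of the RAW filter in the usual leapfrog notation (standard textbook coefficients, not necessarily the exact choices made in the dissertation): writing the filter displacement as \(d_n = \bar{x}_{n-1} - 2x_n + x_{n+1}\), the update is

\[
\bar{x}_n = x_n + \frac{\alpha\nu}{2}\, d_n, \qquad
\bar{x}_{n+1} = x_{n+1} - \frac{(1-\alpha)\nu}{2}\, d_n,
\]

where \(\nu\) is the filter strength and \(\alpha = 1\) recovers the classical Robert-Asselin filter; distributing the correction across the two time levels is what avoids distorting the mean and improves the accuracy of the time stepping.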
Abstract:
Purpose - The purpose of this paper is to present designs for an accelerated life test (ALT). Design/methodology/approach - Bayesian methods and Markov chain Monte Carlo (MCMC) simulation methods were used. Findings - In the paper, a Bayesian method based on MCMC is proposed for ALT under the Exponentiated-Weibull (EW) distribution (for lifetime) and the Arrhenius model (relating the stress variable to the parameters). The paper concludes that this is a reasonable alternative to classical statistical methods, since the implementation of the proposed method is simple, does not require advanced computational expertise, and inferences on the parameters can be made easily. Using the predictive density of a future observation, a procedure was developed to plan an ALT and to verify whether the conforming fraction of the manufacturing process reaches a desired quality level. This procedure is useful for statistical process control in many industrial applications. Research limitations/implications - The results may be applied by a semiconductor manufacturer. Originality/value - The Exponentiated-Weibull-Arrhenius model has never before been used to plan an ALT. © Emerald Group Publishing Limited.
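A hedged sketch of the model class involved (the exact parameterization in the paper may differ): the exponentiated-Weibull lifetime distribution with an Arrhenius link for the scale parameter can be written as

\[
F(t \mid V) = \left[1 - \exp\!\left\{-\left(t / \sigma(V)\right)^{\alpha}\right\}\right]^{\theta},
\qquad
\sigma(V) = \exp\!\left(\beta_0 + \beta_1 / V\right),
\]

where \(V\) is the (absolute-temperature) stress level, \(\alpha\) and \(\theta\) are shape parameters, and the Arrhenius relation lets lifetimes observed at elevated stress be extrapolated to use conditions.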
Abstract:
Background: Several models have been designed to predict the survival of patients with heart failure. These, while available and widely used both for stratifying patients and for deciding upon different treatment options at the individual level, have several limitations. Specifically, some clinical variables that may influence prognosis may have an influence that changes over time. Statistical models that include such characteristics may help in evaluating prognosis. The aim of the present study was to analyze and quantify the impact of modeling heart failure survival allowing for covariates with time-varying effects known to be independent predictors of overall mortality in this clinical setting. Methodology: Survival data from an inception cohort of five hundred patients diagnosed with heart failure functional class III and IV between 2002 and 2004 and followed up to 2006 were analyzed using the Cox proportional hazards model, variations of the Cox model, and the Aalen additive model. Principal Findings: One hundred and eighty-eight (188) patients died during follow-up. For the patients under study, age, serum sodium, hemoglobin, serum creatinine, and left ventricular (LV) ejection fraction were significantly associated with mortality. Evidence of a time-varying effect was suggested for the last three. Both high hemoglobin and high LV ejection fraction were associated with a reduced risk of dying, with a stronger initial effect. High creatinine, associated with an increased risk of dying, also presented a stronger initial effect. The impacts of age and sodium were constant over time. Conclusions: The current study points to the importance of evaluating covariates with time-varying effects in heart failure models. The analysis performed suggests that variations of the Cox and Aalen models constitute a valuable tool for identifying these variables. The implementation of covariates with time-varying effects into heart failure prognostication models may reduce bias and increase the specificity of such models.
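As a sketch of what "covariates with time-varying effects" means in this setting (standard notation, assumed rather than quoted from the study), the Cox and Aalen specifications generalize to

\[
\lambda(t \mid \mathbf{x}) = \lambda_0(t)\, \exp\!\left\{\mathbf{x}^{\top} \boldsymbol{\beta}(t)\right\}
\quad \text{(Cox with time-varying coefficients)},
\]
\[
\lambda(t \mid \mathbf{x}) = \beta_0(t) + \mathbf{x}^{\top} \boldsymbol{\beta}(t)
\quad \text{(Aalen additive hazards)},
\]

so that, for example, a strong early effect of low ejection fraction that fades later corresponds to a coefficient \(\beta_j(t)\) that shrinks toward zero as \(t\) grows.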
Abstract:
Analysis of recurrent events has been widely discussed in medical, health services, insurance, and engineering areas in recent years. This research proposes to use a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function for recurrent events data and the associated risk factors. This method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One of the advantages of using a nonhomogeneous Yule process for recurrent events is that it assumes that the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation is used to provide estimates of the parameters in the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. An example concerning recurrent myocardial infarction events, comparing two distinct populations (Mexican-Americans and Non-Hispanic Whites) in the Corpus Christi Heart Project, is examined.
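One way to make the "proportional to the number of prior events" property explicit (this notation is an illustration, not quoted from the thesis): conditional on the event count \(N(t^-) = n\) and covariates \(\mathbf{x}\), the birth-process intensity could be written

\[
\lambda\big(t \mid N(t^{-}) = n, \mathbf{x}\big) = n \,\lambda_0(t)\, \exp\!\left(\mathbf{x}^{\top}\boldsymbol{\beta}\right),
\]

so each additional past event scales up the instantaneous recurrence rate, while \(\lambda_0(t)\) carries the nonhomogeneous time dependence and \(\exp(\mathbf{x}^{\top}\boldsymbol{\beta})\) the proportional effect of covariates.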