993 results for Hazard Models
Abstract:
Fire is a major management issue in the southwestern United States. Three spatial models of fire risk were developed for Coconino County, Northern Arizona. These models were generated using thematic data layers depicting vegetation, elevation, wind speed and direction, and precipitation for January (winter), June (summer), and July (start of the monsoon season). ArcGIS 9.0 was used to weight attributes in raster layers to reflect their influence on fire risk and to interpolate raster data layers from point data. Final models were generated using the raster calculator in the Spatial Analyst extension of ArcGIS 9.0. Ultimately, the unique combinations of variables resulted in three different models illustrating how fire risk changes over the course of the year.
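The overlay step described above amounts to a weighted sum of reclassified raster layers. The snippet below is a minimal numpy sketch of that idea; the layer values and weights are illustrative placeholders rather than those used in the study, which was carried out with the ArcGIS 9.0 raster calculator rather than Python.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reclassified raster layers (higher value = higher fire risk),
# standing in for vegetation, elevation, wind, and precipitation grids.
vegetation = rng.integers(1, 6, size=(100, 100)).astype(float)
elevation = rng.integers(1, 6, size=(100, 100)).astype(float)
wind = rng.integers(1, 6, size=(100, 100)).astype(float)
precip = rng.integers(1, 6, size=(100, 100)).astype(float)

# Illustrative weights reflecting each layer's assumed influence on fire risk;
# the weights actually used in the study are not given in the abstract.
weights = {"vegetation": 0.4, "elevation": 0.1, "wind": 0.3, "precip": 0.2}

# Weighted-sum overlay, analogous to a raster-calculator expression.
fire_risk = (weights["vegetation"] * vegetation
             + weights["elevation"] * elevation
             + weights["wind"] * wind
             + weights["precip"] * precip)

print("risk range:", fire_risk.min(), "to", fire_risk.max())
```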
Abstract:
This paper deals with the testing of autoregressive conditional duration (ACD) models by gauging the distance between the parametric density and hazard rate functions implied by the duration process and their non-parametric estimates. We derive the asymptotic justification using the functional delta method for fixed and gamma kernels, and then investigate the finite-sample properties through Monte Carlo simulations. Although our tests display some size distortion, bootstrapping suffices to correct the size without compromising their excellent power. We show the practical usefulness of such testing procedures for the estimation of intraday volatility patterns.
Abstract:
This paper deals with the estimation and testing of conditional duration models by looking at the density and baseline hazard rate functions. More precisely, we focus on the distance between the parametric density (or hazard rate) function implied by the duration process and its non-parametric estimate. Asymptotic justification is derived using the functional delta method for fixed and gamma kernels, whereas finite-sample properties are investigated through Monte Carlo simulations. Finally, we show the practical usefulness of such testing procedures by carrying out an empirical assessment of whether autoregressive conditional duration models are appropriate tools for modelling price durations of stocks traded at the New York Stock Exchange.
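As a rough illustration of the distance idea (not the papers' actual test statistic or its asymptotic theory), the sketch below compares a fitted parametric density with a non-parametric kernel estimate on simulated durations; the exponential model, the fixed Gaussian kernel, and the integrated squared difference are simplifying assumptions made only for this example.

```python
import numpy as np
from scipy.stats import expon, gaussian_kde

rng = np.random.default_rng(0)

# Simulated durations; the papers work with durations from fitted ACD models.
x = rng.exponential(scale=1.0, size=2000)

# Parametric density implied by the (here: exponential) duration model,
# evaluated at the maximum likelihood scale estimate.
scale_hat = x.mean()
grid = np.linspace(0.01, x.max(), 500)
f_param = expon.pdf(grid, scale=scale_hat)

# Non-parametric estimate; a fixed Gaussian kernel is used for simplicity,
# whereas the papers also consider gamma kernels to handle the boundary at zero.
f_kernel = gaussian_kde(x)(grid)

# Integrated squared distance between the parametric and non-parametric estimates.
distance = np.trapz((f_param - f_kernel) ** 2, grid)
print("integrated squared distance:", distance)
```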
Abstract:
The paper analyzes a two-period general equilibrium model with individual risk and moral hazard. Each household faces two individual states of nature in the second period. These states differ solely in the household's vector of initial endowments, which is strictly larger in the first state (good state) than in the second state (bad state). In the first period households choose a non-observable action. Higher levels of action give a higher probability of the good state of nature occurring, but lower levels of utility. Households have access to an insurance market that allows transfer of income across states of nature. I consider two models of financial markets: the price-taking behavior model and the nonlinear pricing model. In the price-taking behavior model, suppliers of insurance have a belief about each household's action and take asset prices as given. A variation of standard arguments shows the existence of a rational expectations equilibrium. For a generic set of economies every equilibrium is constrained sub-optimal: there are commodity prices and a reallocation of financial assets satisfying the first-period budget constraint such that, at each household's optimal choice given those prices and asset reallocation, markets clear and every household's welfare improves. In the nonlinear pricing model, suppliers of insurance behave strategically, offering nonlinear pricing contracts to the households. I provide sufficient conditions for the existence of equilibrium and investigate the optimality properties of the model. If there is a single commodity, then every equilibrium is constrained optimal. If there is more than one commodity, then for a generic set of economies every equilibrium is constrained sub-optimal.
Abstract:
In this paper we propose a hybrid hazard regression model with threshold stress which includes the proportional hazards and the accelerated failure time models as particular cases. To express the behavior of lifetimes, the generalized gamma distribution is assumed, and an inverse power law model with a threshold stress is considered. For parameter estimation we develop a sampling-based posterior inference procedure based on Markov chain Monte Carlo techniques. We assume proper but vague priors for the parameters of interest. A simulation study investigates the frequentist properties of the proposed estimators obtained under the assumption of vague priors. Further, some discussion of model selection criteria is given. The methodology is illustrated on simulated and real lifetime data sets.
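For orientation, the two special cases nested by such a hybrid model have the standard forms sketched below, together with an inverse power law acceleration factor that is active only above a threshold stress x_0; the exact parametrisation adopted in the paper may differ.

```latex
% Proportional hazards (PH) special case
h(t \mid x) = h_0(t)\,\exp\{\beta' x\}

% Accelerated failure time (AFT) special case
h(t \mid x) = \psi(x)\, h_0\big(\psi(x)\, t\big)

% Inverse power law acceleration factor with threshold stress x_0 (illustrative form)
\psi(x) = \left(\frac{x - x_0}{c}\right)^{p}, \qquad x > x_0
```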
Abstract:
Flood disasters are a major cause of fatalities and economic losses, and several studies indicate that global flood risk is currently increasing. In order to reduce and mitigate the impact of river flood disasters, the current trend is to integrate existing structural defences with non-structural measures. This calls for a wider application of advanced hydraulic models for flood hazard and risk mapping, engineering design, and flood forecasting systems. Within this framework, two different hydraulic models for large-scale analysis of flood events have been developed. The two models, named CA2D and IFD-GGA, adopt an integrated approach based on the diffusive shallow water equations and a simplified finite volume scheme. The models are also designed for massive code parallelization, which is of key importance in reducing run times in large-scale and high-detail applications. The two models were first applied to several numerical cases to test the reliability and accuracy of different model versions. Then, the most effective versions were applied to different real flood events and flood scenarios. The IFD-GGA model showed serious problems that prevented further applications. On the contrary, the CA2D model proved to be fast and robust, and able to reproduce 1D and 2D flow processes in terms of water depth and velocity. In most applications the accuracy of model results was good and adequate for large-scale analysis. Where complex flow processes occurred, local errors were observed due to the model approximations. However, they did not compromise the correct representation of overall flow processes. In conclusion, the CA2D model can be a valuable tool for the simulation of a wide range of flood event types, including lowland and flash flood events.
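For reference, models of this kind are typically built on the zero-inertia (diffusive) form of the shallow water equations, sketched below with Manning friction; the specific discretisation and parallelization strategy used by CA2D and IFD-GGA are not reproduced here.

```latex
% Continuity
\frac{\partial h}{\partial t} + \frac{\partial (uh)}{\partial x} + \frac{\partial (vh)}{\partial y} = 0

% Momentum reduced to a balance of water-surface slope and friction slope
\frac{\partial (h+z)}{\partial x} = -S_{fx}, \qquad \frac{\partial (h+z)}{\partial y} = -S_{fy}

% Manning friction slope (x component; the y component is analogous)
S_{fx} = \frac{n^2\, u \sqrt{u^2 + v^2}}{h^{4/3}}
```

Here h is the water depth, z the bed elevation, (u, v) the depth-averaged velocity components, and n Manning's roughness coefficient.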
Abstract:
This thesis is divided into three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods that are commonly used to take into account the epistemic uncertainty in PSHA. The most widely used method is the logic tree, which lies at the basis of the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework based on ensemble modelling that properly accounts for epistemic uncertainties in PSHA.
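The conversion of exceedance rates into exceedance probabilities that the declustering discussion refers to is the standard Poisson relation, stated here only for context (it is not the thesis's corrected formulation):

```latex
P\big(\text{at least one exceedance of level } a \text{ in } T \text{ years}\big) \;=\; 1 - e^{-\lambda(a)\,T}
```

where λ(a) is the annual rate of exceedance of the ground-motion level a.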
Abstract:
Jewell and Kalbfleisch (1992) consider the use of marker processes for applications related to estimation of the survival distribution of time to failure. Marker processes were assumed to be stochastic processes that, at a given point in time, provide information about the current hazard and consequently about the remaining time to failure. Particular attention was paid to calculations based on a simple additive model for the relationship between the hazard function at time t and the history of the marker process up until time t. Specific applications to the analysis of AIDS data included the use of markers as surrogate responses for onset of AIDS with censored data and as predictors of the time elapsed since infection in prevalent individuals. Here we review recent work on the use of marker data to tackle these kinds of problems with AIDS data. The Poisson marker process with an additive model, introduced in Jewell and Kalbfleisch (1992), may be a useful "test" example for comparison of various procedures.
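Schematically, the simple additive model referred to above relates the hazard at time t to the current marker value as follows; this is a stylised version for illustration, not necessarily the exact specification in Jewell and Kalbfleisch (1992):

```latex
\lambda\big(t \mid \{Y(s) : s \le t\}\big) \;=\; \lambda_0 \;+\; \beta\, Y(t)
```

where Y(t) is the marker process and λ_0 a baseline rate.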
Abstract:
Intensive care unit (ICU) patients are well known to be highly susceptible to nosocomial (i.e. hospital-acquired) infections due to their poor health and many invasive therapeutic treatments. The effects of acquiring such infections in the ICU on mortality are, however, poorly understood. Our goal is to quantify these effects using data from the National Surveillance Study of Nosocomial Infections in Intensive Care Units (Belgium). This is a challenging problem because of the presence of time-dependent confounders (such as exposure to mechanical ventilation) which lie on the causal path from infection to mortality. Standard statistical analyses may be severely misleading in such settings and have shown contradictory results. While inverse probability weighting for marginal structural models can be used to accommodate time-dependent confounders, inference for the effect of ICU-acquired infections on mortality under such models is further complicated (a) by the fact that marginal structural models infer the effect of acquiring infection on a given, fixed day in the ICU, which is not well defined when ICU discharge comes prior to that day; (b) by informative censoring of the survival time due to hospital discharge; and (c) by the instability of the inverse weighting estimation procedure. We accommodate these problems by developing inference under a new class of marginal structural models which describe the hazard of death for patients if, possibly contrary to fact, they stayed in the ICU for at least a given number of days s and acquired infection or not on that day. Using these models we estimate that, if patients stayed in the ICU for at least s days, the effect of acquiring infection on day s would be to multiply the subsequent hazard of death by 2.74 (95 per cent conservative CI 1.48; 5.09).
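In schematic form, this class of marginal structural models specifies, for patients still in the ICU on day s, a hazard of death that is multiplied by a constant factor when infection is acquired on that day; the notation below is ours, and the reported estimate corresponds to exp(β) ≈ 2.74.

```latex
\lambda\big(t \mid A_s = a,\ \text{in ICU at day } s\big) \;=\; \lambda_{0,s}(t)\,\exp\{\beta a\}, \qquad t \ge s
```

where A_s indicates acquisition of infection on day s and λ_{0,s}(t) is the baseline hazard for patients still in the ICU at day s.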
Abstract:
There is an emerging interest in modeling spatially correlated survival data in biomedical and epidemiological studies. In this paper, we propose a new class of semiparametric normal transformation models for right-censored spatially correlated survival data. This class of models assumes that survival outcomes marginally follow a Cox proportional hazards model with an unspecified baseline hazard, and their joint distribution is obtained by transforming the survival outcomes to normal random variables, whose joint distribution is assumed to be multivariate normal with a spatial correlation structure. A key feature of the class of semiparametric normal transformation models is that it provides a rich class of spatial survival models where regression coefficients have a population-average interpretation and the spatial dependence of survival times is conveniently modeled through the transformed variables by flexible normal random fields. We study the relationship between the spatial correlation structure of the transformed normal variables and the dependence measures of the original survival times. Direct nonparametric maximum likelihood estimation in such models is practically prohibited by the high-dimensional intractable integration of the likelihood function and the infinite-dimensional nuisance baseline hazard parameter. We hence develop a class of spatial semiparametric estimating equations, which conveniently estimate the population-level regression coefficients and the dependence parameters simultaneously. We study the asymptotic properties of the proposed estimators and show that they are consistent and asymptotically normal. The proposed method is illustrated with an analysis of data from the East Boston Asthma Study, and its performance is evaluated using simulations.
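The normal-transformation construction can be sketched as follows, with F_i the marginal (Cox-model) distribution function of the i-th survival time and Φ the standard normal distribution function; the notation is ours, and the spatial parametrisation of Σ(θ) is left unspecified.

```latex
\tilde{T}_i \;=\; \Phi^{-1}\big(F_i(T_i \mid x_i)\big), \qquad
(\tilde{T}_1, \ldots, \tilde{T}_n)' \;\sim\; N\big(0, \Sigma(\theta)\big)
```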
Abstract:
In many clinical trials to evaluate treatment efficacy, it is believed that there may exist latent treatment effectiveness lag times, after which a medical procedure or chemical compound would be in full effect. In this article, semiparametric regression models are proposed and studied to estimate the treatment effect accounting for such latent lag times. The new models take advantage of the invariance property of the additive hazards model in marginalizing over random effects, so parameters in the models are easy to estimate and interpret, while the flexibility of not specifying the baseline hazard function is retained. Monte Carlo simulation studies demonstrate the appropriateness of the proposed semiparametric estimation procedure. Data collected in an actual randomized clinical trial, which evaluated the effectiveness of biodegradable carmustine polymers for the treatment of recurrent brain tumors, are analyzed.
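The additive hazards model whose invariance property is exploited here has the standard form below, in which covariates shift the hazard additively rather than multiplicatively; how the latent lag time enters the proposed semiparametric models is not reproduced in this sketch.

```latex
\lambda(t \mid Z) \;=\; \lambda_0(t) \;+\; \beta' Z(t)
```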
Abstract:
Previous studies of the sediments of Lake Lucerne have shown that massive subaqueous mass movements affecting unconsolidated sediments on lateral slopes are a common process in this lake, and, in view of historical reports describing damaging waves on the lake, it was suggested that tsunamis generated by mass movements represent a considerable natural hazard on the lakeshores. Newly performed numerical simulations combining two-dimensional, depth-averaged models for mass-movement propagation and for tsunami generation, propagation and inundation reproduce a number of reported tsunami effects. Four analysed mass-movement scenarios, three of them based on documented slope failures involving volumes of 5.5 to 20.8 × 10^6 m^3, show peak wave heights of several metres and maximum runup of 6 to over 10 m in the directly affected basins, while effects in neighbouring basins are less drastic. The tsunamis cause large-scale inundation over distances of several hundred metres on flat alluvial plains close to the mass-movement source areas. Basins at the ends of the lake experience regular water-level oscillations with characteristic periods of several minutes. The vulnerability of potentially affected areas has increased dramatically since the times of the damaging historical events, warranting a thorough evaluation of the hazard.
Abstract:
Mathematical models of disease progression predict disease outcomes and are useful epidemiological tools for planners and evaluators of health interventions. The R package gems is a tool that simulates disease progression in patients and predicts the effect of different interventions on patient outcome. Disease progression is represented by a series of events (e.g., diagnosis, treatment and death), displayed in a directed acyclic graph. The vertices correspond to disease states and the directed edges represent events. The package gems allows simulations based on a generalized multistate model that can be described by a directed acyclic graph with continuous transition-specific hazard functions. The user can specify an arbitrary hazard function and its parameters. The model includes parameter uncertainty, does not need to be a Markov model, and may take the history of previous events into account. Applications are not limited to the medical field and extend to other areas where multistate simulation is of interest. We provide a technical explanation of the multistate models used by gems, explain the functions of gems and their arguments, and show a sample application.
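The kind of simulation gems performs can be illustrated with a small hand-rolled example: a three-state directed acyclic graph (diagnosed, treated, dead) with transition-specific exponential hazards, simulated patient by patient. This is a plain Python sketch of the general idea only; it is not the gems API (gems is an R package), and unlike gems it assumes constant (Markov) hazards and no parameter uncertainty.

```python
import random

# Transition-specific hazard rates (per month) on a simple DAG.
# The rates are illustrative placeholders, not estimates from any study.
HAZARDS = {
    ("diagnosed", "treated"): 0.20,
    ("diagnosed", "dead"): 0.02,
    ("treated", "dead"): 0.05,
}

def simulate_patient(rng, start="diagnosed", horizon=120.0):
    """Simulate one trajectory as competing exponential transitions on the DAG."""
    state, t, path = start, 0.0, [(0.0, start)]
    while True:
        edges = [(nxt, rate) for (cur, nxt), rate in HAZARDS.items() if cur == state]
        if not edges:
            break  # absorbing state reached
        # Sample a candidate time for every outgoing transition; take the earliest.
        candidates = [(rng.expovariate(rate), nxt) for nxt, rate in edges]
        dt, nxt = min(candidates)
        t += dt
        if t > horizon:
            break  # censored at the simulation horizon
        state = nxt
        path.append((t, state))
    return path

rng = random.Random(1)
cohort = [simulate_patient(rng) for _ in range(1000)]
dead = sum(1 for p in cohort if p[-1][1] == "dead")
print(f"{dead / len(cohort):.1%} of simulated patients reach the absorbing state")
```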
Abstract:
Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent to both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m long road section in the Swiss Alps, we present results from 488 rockfalls detected in 1260 trees. We show that tree impact data can not only be used (i) to reconstruct the real frequency of rockfalls for individual cells, but that they also serve (ii) the calibration of the rockfall model Rockyfor3D, as well as (iii) the transformation of simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per metre of road section, also enable quantification and direct comparison of the hazard potential between the zones. The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
Abstract:
The standard analyses of survival data involve the assumption that survival and censoring are independent. When censoring and survival are related, the phenomenon is known as informative censoring. This paper examines the effects of an informative censoring assumption on the hazard function and the estimated hazard ratio provided by the Cox model. The limiting factor in all analyses of informative censoring is the problem of non-identifiability. Non-identifiability implies that it is impossible to distinguish a situation in which censoring and death are independent from one in which there is dependence. However, it is possible that informative censoring occurs. Examination of the literature indicates how others have approached the problem and covers the relevant theoretical background. Three models are examined in detail. The first model uses conditionally independent marginal hazards to obtain the unconditional survival function and hazards. The second model is based on the Gumbel Type A method for combining independent marginal distributions into bivariate distributions using a dependency parameter. Finally, a formulation based on a compartmental model is presented and its results described. For the latter two approaches, the resulting hazard is used in the Cox model in a simulation study. The unconditional survival distribution formed from the first model involves dependency, but the crude hazard resulting from this unconditional distribution is identical to the marginal hazard, and inferences based on the hazard are valid. The hazard ratios formed from two distributions following the Gumbel Type A model are biased by a factor that depends on the amount of censoring in the two populations and on the strength of the dependency between death and censoring. The Cox model estimates this biased hazard ratio. In general, the hazard resulting from the compartmental model is not constant, even if the individual marginal hazards are constant, unless censoring is non-informative; the hazard ratio tends to a specific limit. Methods of evaluating situations in which informative censoring is present are described, and the relative utility of the three models examined is discussed.