980 results for Hazard Mitigation


Relevance:

20.00%

Publisher:

Abstract:

Bird-aircraft strikes at the Atlantic City International Airport (ACY) increased from 18 in 1989 to 37 in 1990. The number of bird-aircraft strikes involving gulls (Larus spp.) during this time rose from 6 to 27, a 350% increase. The predominant species involved in bird strikes was the laughing gull (L. atricilla). Pursuant to an interagency agreement between the U.S. Department of Transportation (USDOT), Federal Aviation Administration (FAA) and the U.S. Department of Agriculture (USDA), Animal and Plant Health Inspection Service (APHIS)/Animal Damage Control (ADC), ADC established an Emergency/Experimental Bird Hazard Reduction Force (BHRF) at ACY in 1991. An Environmental Assessment (EA) and Finding of No Significant Impact (FONSI) for the 1991 Emergency/Experimental BHRF were executed and signed by the FAA on 19 May 1991. The BHRF was adopted at this time by the FAA Technical Center as an annual program to reduce bird strikes at ACY. The BHRF goals are to minimize or eliminate the incidence of bird-aircraft strikes and runway closures due to increased bird activity. A BHRF team consisting of ADC personnel patrolled ACY for 95 days, from 26 May until 28 August 1992, for a total of 2,949 person-hours. The BHRF used a combination of pyrotechnics, amplified gull distress tapes, and live ammunition to harass gulls away from the airport from dawn to dusk. Gull-aircraft strikes during BHRF operations in 1992 were reduced by 86% compared to gull strikes during the summer months of 1990, when there was no BHRF team. Runway closures due to bird activity decreased 100% compared to 1990 and 1991 closures. The BHRF should continue at ACY as long as birds are a threat to human safety and aircraft operations.

Relevance:

20.00%

Publisher:

Abstract:

The Canadian Wildlife Service has had twenty-five years' experience with the problems caused by bird contacts with aircraft. I experienced my first bird strike while flying as an observer on a waterfowl survey in August 1940. Officers of the Service investigated bird problems at airports at Yarmouth, Nova Scotia, and Cartierville, Quebec, in the late 1940s. Those incidents, involving gulls and low-speed piston-engined aircraft, caused minor damage to the aircraft but considerable disturbance to the operators. As aircraft speeds increased and airports became more numerous and busier, the problem grew in extent and complexity. By 1960 it was apparent that the problem would grow worse and that work should be directed toward reducing the number of incidents. In 1960 an Electra aircraft crashed at Boston, Massachusetts, killing 61 passengers. Starlings were involved in the engine malfunction which preceded the crash. In November 1962 a Viscount aircraft was damaged by collision with two swans between Baltimore and Washington and crashed with a loss of 17 lives. Those incidents focused attention on the bird hazard problem in the United States.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we propose a hybrid hazard regression model with threshold stress which includes the proportional hazards and the accelerated failure time models as particular cases. To express the behavior of lifetimes, the generalized gamma distribution is assumed, and an inverse power law model with a threshold stress is considered. For parameter estimation we develop a sampling-based posterior inference procedure based on Markov chain Monte Carlo techniques. We assume proper but vague priors for the parameters of interest. A simulation study investigates the frequentist properties of the proposed estimators obtained under the assumption of vague priors. Further, model selection criteria are discussed. The methodology is illustrated on simulated and real lifetime data sets.
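The sampling-based posterior inference the abstract describes can be illustrated with a toy random-walk Metropolis sampler. The sketch below deliberately uses a plain exponential lifetime model with a vague Gamma prior rather than the paper's generalized-gamma regression; the data, prior hyperparameters and proposal scale are all illustrative assumptions:

```python
import math
import random

random.seed(42)

# Toy data: 200 exponential lifetimes with true rate 2.0 (illustrative)
data = [random.expovariate(2.0) for _ in range(200)]
n, s = len(data), sum(data)

def log_posterior(lam):
    # Exponential likelihood times a proper but vague Gamma(0.01, 0.01) prior:
    # the posterior kernel is lam^(n + a - 1) * exp(-lam * (s + b))
    if lam <= 0:
        return -math.inf
    return (n + 0.01 - 1) * math.log(lam) - lam * (s + 0.01)

# Random-walk Metropolis: propose, then accept with probability
# min(1, posterior ratio), comparing on the log scale
samples, lam = [], 1.0
for _ in range(5000):
    proposal = lam + random.gauss(0.0, 0.2)
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(lam):
        lam = proposal
    samples.append(lam)

burned = samples[1000:]                              # discard burn-in
posterior_mean = sum(burned) / len(burned)           # sits near the true rate 2.0
```

With a vague prior the posterior mean tracks the maximum likelihood estimate n/s, which is the frequentist behaviour the simulation study in the paper examines.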

Relevance:

20.00%

Publisher:

Abstract:

In many applications of lifetime data analysis, it is important to perform inferences about the change-point of the hazard function. The change-point could be a maximum for unimodal hazard functions or a minimum for bathtub-shaped hazard functions, and is usually of great interest in medical or industrial applications. For lifetime distributions where this change-point of the hazard function can be calculated analytically, its maximum likelihood estimator is easily obtained from the invariance properties of maximum likelihood estimators. From the asymptotic normality of the maximum likelihood estimators, confidence intervals can also be obtained. Considering the exponentiated Weibull distribution for the lifetime data, we have different forms for the hazard function: constant, increasing, unimodal, decreasing or bathtub-shaped. This model gives great flexibility of fit, but there is no analytic expression for the change-point of its hazard function. We therefore use Markov chain Monte Carlo methods to obtain posterior summaries for the change-point of the hazard function under the exponentiated Weibull distribution.
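Although no closed form exists, the change-point can be located numerically for any fixed parameter values, which is the computation an MCMC scheme repeats per posterior draw. A minimal sketch, assuming CDF F(t) = [1 - exp(-(t/σ)^α)]^θ; the parameter values α = 0.5, θ = 4, σ = 1 (chosen so that α < 1 and αθ > 1, giving a unimodal hazard) and the grid resolution are illustrative:

```python
import math

def ew_hazard(t, alpha, theta, sigma=1.0):
    """Hazard h(t) = f(t) / (1 - F(t)) of the exponentiated Weibull,
    with CDF F(t) = [1 - exp(-(t/sigma)^alpha)]^theta."""
    z = (t / sigma) ** alpha
    u = 1.0 - math.exp(-z)                     # inner Weibull CDF
    F = u ** theta                             # exponentiated Weibull CDF
    f = (theta * alpha / sigma) * (t / sigma) ** (alpha - 1) \
        * math.exp(-z) * u ** (theta - 1)      # density
    return f / (1.0 - F)

# For alpha < 1 with alpha * theta > 1 the hazard is unimodal, so the
# change-point is its interior maximum; locate it by grid search.
alpha, theta = 0.5, 4.0
grid = [0.01 * i for i in range(1, 1000)]
change_point = max(grid, key=lambda t: ew_hazard(t, alpha, theta))
```

Wrapping this search around each MCMC draw of (α, θ, σ) yields posterior samples of the change-point itself.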

Relevance:

20.00%

Publisher:

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the worldwide forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of probabilistic seismic hazard analysis (PSHA): the declustering of seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations for declustering seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods commonly used to account for epistemic uncertainty in PSHA. The most widely used is the logic tree, which underlies the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
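The rate-to-probability conversion that the Poisson assumption licenses can be written in one line: for a constant annual exceedance rate λ and an exposure of T years, P(at least one exceedance) = 1 - exp(-λT). A minimal sketch; the example rate and exposure are illustrative values, not from the thesis:

```python
import math

def exceedance_probability(rate, t_years):
    # Under Poisson occurrences at a constant annual rate,
    # P(at least one exceedance in t_years) = 1 - exp(-rate * t_years)
    return 1.0 - math.exp(-rate * t_years)

# Illustrative: a ground-motion level with annual exceedance rate 0.0021
# over a 50-year exposure gives roughly the familiar "10% in 50 years"
p = exceedance_probability(0.0021, 50)
```

This is exactly the step whose justification the second chapter revisits: Le Cam's theorem delivers the Poissonian behaviour of exceedances without first declustering the catalog.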

Relevance:

20.00%

Publisher:

Abstract:

In this paper we propose methods for smooth hazard estimation of a time variable that is interval-censored. These methods allow one to model the transformed hazard in terms of either smooth (smoothing-spline) or linear functions of time and other relevant time-varying predictor variables. We illustrate the use of this method on a dataset of hemophiliacs where the outcome, time to seroconversion for HIV, is interval-censored and left-truncated.
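The defining feature of interval-censored data is that each observation contributes P(L < T ≤ R) = S(L) - S(R) to the likelihood, where S is the survival function. A minimal sketch under a constant-hazard (exponential) survival model; the paper's spline-based hazard is not reproduced here, and the rate and censoring intervals are illustrative:

```python
import math

def surv(t, lam):
    # Exponential survival function S(t) = exp(-lam * t)
    return math.exp(-lam * t)

def interval_censored_loglik(intervals, lam):
    # An event known only to lie in (L, R] contributes
    # log P(L < T <= R) = log[S(L) - S(R)] to the log-likelihood
    return sum(math.log(surv(L, lam) - surv(R, lam)) for L, R in intervals)

# Illustrative censoring intervals (in years) and hazard rate
intervals = [(0.5, 1.5), (1.0, 2.0), (0.0, 0.5)]
ll = interval_censored_loglik(intervals, 0.8)
```

Left truncation, as in the hemophiliac data, would further divide each contribution by S(entry time); the smoothing-spline approach replaces the constant rate with a flexible function of time.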

Relevance:

20.00%

Publisher:

Abstract:

The municipality of San Juan La Laguna, Guatemala is home to approximately 5,200 people and is located on the western side of the Lake Atitlán caldera. Steep slopes surround all but the eastern side of San Juan. The Lake Atitlán watershed is susceptible to many natural hazards, but the most predictable are the landslides that can occur annually with each rainy season, especially during high-intensity events. Hurricane Stan hit Guatemala in October 2005; the resulting flooding and landslides devastated the Atitlán region. Locations of landslide and non-landslide points were obtained from field observations and orthophotos taken following Hurricane Stan. This study used data from multiple attributes at every landslide and non-landslide point and applied different multivariate analyses to optimize a model for landslide prediction during high-intensity precipitation events like Hurricane Stan. The attributes considered in this study are: geology, geomorphology, distance to faults and streams, land use, slope, aspect, curvature, plan curvature, profile curvature and topographic wetness index. The attributes were pre-evaluated for their ability to predict landslides using four different attribute evaluators, all available in the open-source data mining software Weka: filtered subset, information gain, gain ratio and chi-squared. Three multivariate algorithms (decision tree J48, logistic regression and BayesNet) were optimized for landslide prediction using different attributes. The following statistical parameters were used to evaluate model accuracy: precision, recall, F-measure and area under the receiver operating characteristic (ROC) curve. The algorithm BayesNet yielded the most accurate model and was used to build a probability map of landslide initiation points. The probability map developed in this study was also compared to the results of a bivariate landslide susceptibility analysis conducted for the watershed encompassing Lake Atitlán and San Juan.
Landslides from Tropical Storm Agatha in 2010 were used to independently validate this study's multivariate model and the bivariate model. The ultimate aim of this study is to share the methodology and results with municipal contacts from the author's time as a U.S. Peace Corps volunteer, to facilitate more effective future landslide hazard planning and mitigation.
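The four evaluation metrics named above can be computed directly from a classifier's predicted labels and scores, as Weka does internally. A minimal sketch; the example labels, scores and the 0.5 threshold are invented for illustration:

```python
def precision_recall_f(y_true, y_pred):
    # Counts over binary labels: 1 = landslide point, 0 = non-landslide point
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

def roc_auc(y_true, scores):
    # AUC = probability a random positive outscores a random negative (ties 0.5)
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented example: model probabilities for six points, thresholded at 0.5
y_true = [1, 1, 0, 0, 1, 0]
scores = [0.9, 0.4, 0.35, 0.8, 0.7, 0.1]
y_pred = [1 if s >= 0.5 else 0 for s in scores]
precision, recall, f_measure = precision_recall_f(y_true, y_pred)
auc = roc_auc(y_true, scores)
```

Unlike precision, recall and F-measure, the AUC is threshold-free, which is why it is often the headline number when comparing classifiers such as J48, logistic regression and BayesNet.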