1000 results for multiple hazard
Abstract:
Large parts of the world are subject to one or more natural hazards, such as earthquakes, tsunamis, landslides, tropical storms (hurricanes, cyclones and typhoons), coastal inundation and flooding. Virtually the entire world is at risk of man-made hazards. In recent decades, rapid population growth and economic development in hazard-prone areas have greatly increased the potential of multiple hazards to damage and destroy buildings, bridges, power plants, and other infrastructure, posing a grave danger to communities and disrupting economic and societal activities. Although an individual hazard is significant in many parts of the United States (U.S.), in certain areas more than one hazard may threaten the constructed environment. In such areas, structural design and construction practices should address multiple hazards in an integrated manner to achieve structural performance consistent with owner expectations and general societal objectives. The growing interest in and importance of multiple-hazard engineering have recently been recognized. This has spurred the evolution of multiple-hazard risk-assessment frameworks and the development of design approaches, which have paved the way for future research on the sustainable construction of new and improved structures and the retrofitting of existing ones. This report provides a review of the literature and the current state of practice for assessment, design and mitigation of the impact of multiple hazards on structural infrastructure. It also presents an overview of future research needs related to the multiple-hazard performance of constructed facilities.
Abstract:
The objective of this thesis is to outline a Performance-Based Engineering (PBE) framework to address the multiple hazards of Earthquake (EQ) and subsequent Fire Following Earthquake (FFE). Currently, fire codes in the United States are largely empirical and prescriptive in nature. The reliance on prescriptive requirements makes quantifying damage sustained due to fire difficult. Additionally, the empirical standards derive from furnace testing of individual members or individual assemblies, which has been shown to differ greatly from full structural-system behavior. The very nature of fire behavior (ignition, growth, suppression, and spread) is fundamentally difficult to quantify due to the inherent randomness present in each stage of fire development. The study of interactions between earthquake damage and fire behavior is also in its infancy, with essentially no empirical testing results available. This thesis presents a literature review, a discussion and critique of the state of the art, and a summary of software currently used to estimate loss due to EQ and FFE. A generalized PBE framework for EQ and subsequent FFE is presented, along with a combined hazard-probability-to-performance-objective matrix and a table of the variables necessary to fully implement the proposed framework. Future research requirements and a summary are also provided, with discussion of the difficulties inherent in adequately describing the multiple hazards of EQ and FFE.
Abstract:
In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, with oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructure face the ocean to the southwest, towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of the Horseshoe and Marques de Pombal faults (HSMPF) as the worst-case scenario, with wave heights of over 10 m reaching the coast approximately 22 min after the rupture. It dominates the aggregate scenario, accounting for about 60 % of the impact area at the test site in terms of maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km2.
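The "aggregate scenario" described above can be understood as a cell-wise maximum over the individual scenario hazard grids. A minimal sketch, assuming illustrative grid values (not data from the Sines study):

```python
# Hedged sketch: an aggregate hazard map built as the cell-wise maximum of
# per-scenario maxima (wave height, flow depth, ...). The 2x3 grids below
# are illustrative values only, not results from the Sines test site.

def aggregate_max(grids):
    """Cell-wise maximum across a list of equally shaped 2-D grids."""
    rows, cols = len(grids[0]), len(grids[0][0])
    return [[max(g[r][c] for g in grids) for c in range(cols)]
            for r in range(rows)]

scenario_a = [[0.5, 1.2, 0.0],
              [2.1, 0.8, 0.3]]
scenario_b = [[1.0, 0.9, 0.4],
              [1.5, 1.1, 0.0]]

aggregate = aggregate_max([scenario_a, scenario_b])
```

Inspecting which scenario supplies each cell's maximum is what lets a single source (here, the analogue of HSMPF) be identified as dominating a share of the impact area.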
Abstract:
Background Estimates of the disease burden due to multiple risk factors can show the potential gain from combined preventive measures. But few such investigations have been attempted, and none on a global scale. Our aim was to estimate the potential health benefits from removal of multiple major risk factors. Methods We assessed the burden of disease and injury attributable to the joint effects of 20 selected leading risk factors in 14 epidemiological subregions of the world. We estimated population attributable fractions, defined as the proportional reduction in disease or mortality that would occur if exposure to a risk factor were reduced to an alternative level, from data for risk factor prevalence and hazard size. For every disease, we estimated joint population attributable fractions, for multiple risk factors, by age and sex, from the direct contributions of individual risk factors. To obtain the direct hazards, we reviewed publications and re-analysed cohort data to account for the part of each hazard that is mediated through other risks. Results Globally, an estimated 47% of premature deaths and 39% of total disease burden in 2000 resulted from the joint effects of the risk factors considered. These risks caused a substantial proportion of important diseases, including diarrhoea (92-94%), lower respiratory infections (55-62%), lung cancer (72%), chronic obstructive pulmonary disease (60%), ischaemic heart disease (83-89%), and stroke (70-76%). Removal of these risks would have increased global healthy life expectancy by 9.3 years (17%), ranging from 4.4 years (6%) in the developed countries of the western Pacific to 16.1 years (43%) in parts of sub-Saharan Africa. Interpretation Removal of major risk factors would not only increase healthy life expectancy in every region, but also reduce some of the differences between regions. The potential for disease prevention and health gain from tackling major known risks simultaneously would be substantial.
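The population attributable fraction defined above has a standard textbook form; a minimal sketch, noting that the joint PAF below assumes independent, unmediated risks, whereas the study re-analysed cohort data to strip out mediated effects:

```python
# Hedged sketch of the textbook population attributable fraction (PAF):
# PAF = P(RR - 1) / (P(RR - 1) + 1), with P the exposure prevalence and RR
# the relative risk. The joint PAF multiplies complements, which assumes
# independent risks; the study's mediation adjustment is not attempted here.

def paf(prevalence, relative_risk):
    """Single-risk PAF from prevalence and relative risk."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

def joint_paf(pafs):
    """1 - product of (1 - PAF_i): fraction attributable to any of the risks."""
    remaining = 1.0
    for p in pafs:
        remaining *= (1.0 - p)
    return 1.0 - remaining

p1 = paf(0.30, 2.0)   # e.g. 30% exposed, RR = 2 (toy values)
p2 = paf(0.10, 4.0)   # e.g. 10% exposed, RR = 4 (toy values)
combined = joint_paf([p1, p2])
```

The complement-product form is why the joint burden of many risks is less than the sum of individual fractions: the same death cannot be attributed twice.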
Abstract:
Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for the runout zone delimitation. First, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and on the analysis of two sets of aerial photographs for the temporal probability estimation. The hazard initiation map was then used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). Estimation of the debris flow magnitude was omitted, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model with a 10 m resolution was used, together with land use, geology and debris flow hazard initiation maps, as inputs to the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Runout areas were then calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates information about debris flow spreading direction probabilities, showing the areas most likely to be affected by future debris flows.
Limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach to debris flow hazard analysis, associating automatic detection of the source areas with a simple assessment of debris flow spreading, provides results suitable for subsequent hazard and risk studies. However, more testing is needed to validate the parameters and results and to establish their transferability to other study areas.
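The multiple-flow-direction spreading step mentioned above can be sketched in a few lines, in the spirit of the Holmgren-style algorithms used by tools like Flow-R: flow leaving a cell is split among lower neighbours in proportion to slope raised to an exponent. Cell size, exponent and elevations below are illustrative assumptions, not the Flow-R implementation:

```python
# Hedged sketch of one multiple-flow-direction (MFD) spreading step: flow is
# distributed to downslope neighbours in proportion to slope**x. Elevations,
# cell size (10 m, matching the DEM resolution in the study) and the exponent
# are illustrative.
import math

def mfd_fractions(dem, r, c, cell_size=10.0, x=4.0):
    """Return {(dr, dc): fraction} of flow sent to each downslope neighbour."""
    weights = {}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(dem) and 0 <= nc < len(dem[0]):
                dist = cell_size * math.hypot(dr, dc)
                slope = (dem[r][c] - dem[nr][nc]) / dist
                if slope > 0:           # only downslope neighbours receive flow
                    weights[(dr, dc)] = slope ** x
    total = sum(weights.values())
    return {k: w / total for k, w in weights.items()} if total else {}

dem = [[105.0, 104.0, 103.0],
       [104.0, 102.0, 100.0],
       [103.0, 101.0,  99.0]]
fractions = mfd_fractions(dem, 1, 1)   # spread from the centre cell
```

Iterating this step cell by cell, and stopping where an energy criterion is no longer met, is the essence of the runout delimitation the abstract describes.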
Abstract:
OBJECTIVE To assess the effectiveness of glatiramer acetate (GA) compared to other multiple sclerosis (MS) therapies in routine clinical practice. MATERIALS AND METHODS Observational cohort study carried out in MS patients treated with GA (GA cohort) or with other MS therapies after switching from GA (non-GA cohort). Study data were obtained through review of our MS patient database. The primary endpoint was the Expanded Disability Status Scale (EDSS) score reached at the end of treatment/last check-up. RESULTS A total of 180 patients were included: GA cohort n = 120, non-GA cohort n = 60. Patients in the GA cohort showed better EDSS scores at the end of treatment/last check-up (mean ± SD, 2.8 ± 1.8 vs. 3.9 ± 2.2; P = 0.001) and were 1.65 times more likely to show better EDSS scores than the non-GA cohort (odds ratio, 0.606; 95%CI, 0.436-0.843; P = 0.003). Patients in the GA cohort showed a longer mean time to reach an EDSS score of 6 (209.1 [95%CI, 187.6-230.6] vs. 164.3 [95%CI, 137.0-191.6] months; P = 0.004) and slower disability progression (hazard ratio, 0.415 [95%CI, 0.286-0.603]; P < 0.001). The annualized relapse rate was lower in the GA cohort (mean ± SD, 0.5 ± 0.5 vs. 0.8 ± 0.5; P = 0.001), and quality of life was better in this cohort than in the non-GA cohort (mean ± SD, 0.7 ± 0.1 vs. 0.6 ± 0.2; P = 0.01). CONCLUSIONS GA may slow the progression of EDSS scores to a greater extent than other MS therapies, as well as achieving a greater reduction in relapses and a greater improvement in patients' quality of life. Switching from GA to other MS therapies was not shown to result in a better treatment response.
Abstract:
BACKGROUND: High-dose chemotherapy with autologous stem-cell transplantation is a standard treatment for young patients with multiple myeloma. Residual disease is almost always present after transplantation and is responsible for relapse. This phase 3, placebo-controlled trial investigated the efficacy of lenalidomide maintenance therapy after transplantation. METHODS: We randomly assigned 614 patients younger than 65 years of age who had nonprogressive disease after first-line transplantation to maintenance treatment with either lenalidomide (10 mg per day for the first 3 months, increased to 15 mg if tolerated) or placebo until relapse. The primary end point was progression-free survival. RESULTS: Lenalidomide maintenance therapy improved median progression-free survival (41 months vs. 23 months with placebo; hazard ratio, 0.50; P<0.001). This benefit was observed across all patient subgroups, including those based on the β2-microglobulin level, cytogenetic profile, and response after transplantation. With a median follow-up period of 45 months, more than 70% of patients in both groups were alive at 4 years. The rates of grade 3 or 4 peripheral neuropathy were similar in the two groups. The incidence of second primary cancers was 3.1 per 100 patient-years in the lenalidomide group versus 1.2 per 100 patient-years in the placebo group (P=0.002). Median event-free survival (with events that included second primary cancers) was significantly improved with lenalidomide (40 months vs. 23 months with placebo; P<0.001). CONCLUSIONS: Lenalidomide maintenance after transplantation significantly prolonged progression-free and event-free survival among patients with multiple myeloma. Four years after randomization, overall survival was similar in the two study groups. (Funded by the Programme Hospitalier de Recherche Clinique and others; ClinicalTrials.gov number, NCT00430365.)
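A hazard ratio and a median can be sanity-checked against each other: under proportional hazards with roughly exponential survival, the median scales as 1/HR. A minimal sketch using the reported placebo median (23 months) and HR (0.50); the gap between the implied 46 months and the observed 41 simply reflects that real survival curves are not exactly exponential:

```python
# Hedged arithmetic check: for an exponential survival curve, median = ln(2)/rate,
# so applying a hazard ratio of 0.50 to the placebo hazard doubles the median.
import math

def exponential_median(rate):
    """Median survival time of an exponential distribution with the given hazard rate."""
    return math.log(2.0) / rate

placebo_rate = math.log(2.0) / 23.0        # hazard implied by a 23-month median
lenalidomide_rate = 0.50 * placebo_rate    # apply the reported hazard ratio
implied_median = exponential_median(lenalidomide_rate)
```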
Abstract:
PURPOSE This prospective multicenter phase III study compared the efficacy and safety of a triple combination (bortezomib-thalidomide-dexamethasone [VTD]) versus a dual combination (thalidomide-dexamethasone [TD]) in patients with multiple myeloma (MM) progressing or relapsing after autologous stem-cell transplantation (ASCT). PATIENTS AND METHODS Overall, 269 patients were randomly assigned to receive bortezomib (1.3 mg/m(2) intravenous bolus) or no bortezomib for 1 year, in combination with thalidomide (200 mg per day orally) and dexamethasone (40 mg orally once a day on 4 days once every 3 weeks). Bortezomib was administered on days 1, 4, 8, and 11 with a 10-day rest period (day 12 to day 21) for eight cycles (6 months), and then on days 1, 8, 15, and 22 with a 20-day rest period (day 23 to day 42) for four cycles (6 months). RESULTS Median time to progression (primary end point) was significantly longer with VTD than TD (19.5 v 13.8 months; hazard ratio, 0.59; 95% CI, 0.44 to 0.80; P = .001), the complete response plus near-complete response rate was higher (45% v 25%; P = .001), and the median duration of response was longer (17.2 v 13.4 months; P = .03). The 24-month survival rate was in favor of VTD (71% v 65%; P = .093). Grade 3 peripheral neuropathy was more frequent with VTD (29% v 12%; P = .001), as were the rates of grade 3 and 4 infection and thrombocytopenia. CONCLUSION VTD was more effective than TD in the treatment of patients with MM with progressive or relapsing disease post-ASCT but was associated with a higher incidence of grade 3 neurotoxicity.
Abstract:
HIV infection has a broad spectrum of renal manifestations. This study examined the clinical and histological manifestations of HIV-associated renal disease and predictors of renal outcomes. Sixty-one HIV patients (64% male, mean age 45 years) were retrospectively evaluated. Clinical presentation and renal histopathology were assessed, as well as CD4 T-cell count and viral load. The predictive value of histological lesion, baseline CD4 cell count and viral load for end-stage renal disease (ESRD) or death was determined using the Cox regression model. The outcomes of chronic kidney disease (CKD) and of ESRD or death were evaluated by baseline CD4 cell count. The distribution of initial clinical presentations was non-nephrotic proteinuria (54%), acute kidney injury (28%), nephrotic syndrome (23%), and chronic kidney disease (22%). Focal segmental glomerulosclerosis (28%), mainly the collapsing form (HIVAN), acute interstitial nephritis (AIN) (26%), and immune complex-mediated glomerulonephritis (ICGN) (25%) were the predominant renal histologies. Baseline CD4 cell count ≥200 cells/mm3 was a protective factor against CKD (hazard ratio=0.997; 95%CI=0.994-0.999; P=0.012). At last follow-up, 64% of patients with baseline CD4 ≥200 cells/mm3 had eGFR >60 mL·min-1·(1.73 m2)-1, compared to 35% of patients who presented with CD4 <200 cells/mm3 (log rank=9.043, P=0.003). In conclusion, the main histological lesion of HIV-associated renal disease was HIVAN, followed by AIN and ICGN. These findings reinforce the need to biopsy HIV patients with kidney impairment and/or proteinuria. Baseline CD4 cell count ≥200 cells/mm3 was associated with better renal function after 2 years of follow-up.
Abstract:
In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the exceedances occur according to a non-homogeneous Poisson process (NHPP). This Poisson process has rate function lambda(t), t >= 0, which depends on parameters that must be estimated. We consider two forms of rate function: the Weibull and the Goel-Okumoto. We consider models with and without change-points; when change-points are assumed present, there may be one, two or three of them, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. The results are applied to ozone data provided by the Mexico City monitoring network. We first assume that no change-points are present; depending on the fit of the model, we then allow for one, two or three change-points.
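For the Weibull rate function, the exceedance probability described above has a closed form: the expected number of exceedances in [0, T] is the integrated rate, and the count is Poisson with that mean. A minimal sketch with illustrative parameter values (not the Gibbs-sampling estimates from the paper):

```python
# Hedged sketch of NHPP exceedance probabilities with a Weibull rate
# lambda(t) = (beta/sigma) * (t/sigma)**(beta - 1), whose integral is
# Lambda(T) = (T/sigma)**beta. The count N(T) is Poisson(Lambda(T)).
# sigma and beta below are toy values, not estimates from the paper.
import math

def weibull_mean_count(T, sigma, beta):
    """Integrated Weibull rate: Lambda(T) = (T/sigma)**beta."""
    return (T / sigma) ** beta

def prob_k_exceedances(k, T, sigma, beta):
    """P(N(T) = k) for a Poisson count with mean Lambda(T)."""
    mean = weibull_mean_count(T, sigma, beta)
    return math.exp(-mean) * mean ** k / math.factorial(k)

# Probability of exactly 0, 1, 2 exceedances in 30 days (toy parameters).
probs = [prob_k_exceedances(k, T=30.0, sigma=60.0, beta=2.0) for k in range(3)]
```

A change-point model would simply use a different (sigma, beta) pair on each sub-interval and sum the integrated rates.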
Abstract:
Light-frame wood buildings are widely built in the United States (U.S.). Natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of the U.S. are evaluated. Various sources of uncertainty are considered in the collapse risk assessment so that their influence on the collapse risk of light-frame wood construction can be evaluated. The collapse risks of the same building subjected to maximum considered earthquakes in different seismic zones are found to be non-uniform. In certain areas of the U.S., snow accumulation is significant, causing huge economic losses and threatening life safety. Limited study has been performed of snow hazard combined with seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. Snow accumulation has a significant influence on the seismic losses of the building, and the Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to perform risk assessment for light-frame wood construction.
For home owners and stakeholders, risk expressed in terms of economic losses is much easier to understand than engineering parameters (e.g., inter-story drift). The proposed framework is used in two applications. The first assesses the loss of a building subjected to mainshock-aftershock sequences; aftershock and downtime costs are found to be important factors in the assessment of seismic losses. The framework is also applied to a wood building in the state of Washington to assess its loss under combined earthquake and snow loads. The proposed framework proves to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future work are also identified.
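A filtered Poisson process of the kind described above can be sketched compactly: snowfall events arrive as a Poisson process, each deposits a random load, and the ground load decays between events. The arrival rate, mean load and exponential melt below are illustrative assumptions, not the calibrated model from the study:

```python
# Hedged sketch of a filtered Poisson process (FPP) for snow load:
# S(t) = sum_i X_i * exp(-melt_rate * (t - t_i)) over events with t_i <= t.
# Unlike a Bernoulli indicator, this captures accumulation across events.
import math
import random

def simulate_snow_load(days, event_rate, mean_load, melt_rate, seed=0):
    """Return the daily snow load over a season of `days` days."""
    rng = random.Random(seed)
    events = []                               # (arrival day, deposited load)
    t = 0.0
    while True:
        t += rng.expovariate(event_rate)      # Poisson inter-arrival times
        if t > days:
            break
        events.append((t, rng.expovariate(1.0 / mean_load)))
    return [sum(x * math.exp(-melt_rate * (d - ti))
                for ti, x in events if ti <= d)
            for d in range(days)]

load = simulate_snow_load(days=120, event_rate=0.2, mean_load=0.4, melt_rate=0.05)
```

Pairing a simulated load history like this with seismic demand is the kind of combined snow-and-earthquake assessment the framework performs.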
Abstract:
The municipality of San Juan La Laguna, Guatemala is home to approximately 5,200 people and is located on the western side of the Lake Atitlán caldera. Steep slopes surround all but the eastern side of San Juan. The Lake Atitlán watershed is susceptible to many natural hazards, but the most predictable are the landslides that can occur annually with each rainy season, especially during high-intensity events. Hurricane Stan hit Guatemala in October 2005; the resulting flooding and landslides devastated the Atitlán region. Locations of landslide and non-landslide points were obtained from field observations and from orthophotos taken following Hurricane Stan. This study used data from multiple attributes at every landslide and non-landslide point, and applied different multivariate analyses to optimize a model for landslide prediction during high-intensity precipitation events like Hurricane Stan. The attributes considered in this study are: geology, geomorphology, distance to faults and streams, land use, slope, aspect, curvature, plan curvature, profile curvature and topographic wetness index. The attributes were pre-evaluated for their ability to predict landslides using four different attribute evaluators, all available in the open-source data mining software Weka: filtered subset, information gain, gain ratio and chi-squared. Three multivariate algorithms (decision tree J48, logistic regression and BayesNet) were optimized for landslide prediction using different attribute subsets. The following statistical parameters were used to evaluate model accuracy: precision, recall, F-measure and area under the receiver operating characteristic (ROC) curve. The BayesNet algorithm yielded the most accurate model and was used to build a probability map of landslide initiation points. The probability map developed in this study was also compared to the results of a bivariate landslide susceptibility analysis conducted for the watershed encompassing Lake Atitlán and San Juan.
Landslides from Tropical Storm Agatha (2010) were used to independently validate both this study's multivariate model and the bivariate model. The ultimate aim of this study is to share the methodology and results with municipal contacts from the author's time as a U.S. Peace Corps volunteer, to facilitate more effective landslide hazard planning and mitigation in the future.
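The accuracy metrics named above (precision, recall, F-measure) follow directly from a confusion matrix. A minimal sketch with toy labels, not the Weka output from the study:

```python
# Hedged sketch of precision, recall and F-measure for a binary landslide
# classifier, computed from true/false positives and false negatives.
# The label lists are illustrative toy values.

def confusion(actual, predicted, positive="landslide"):
    tp = sum(a == p == positive for a, p in zip(actual, predicted))
    fp = sum(a != positive and p == positive for a, p in zip(actual, predicted))
    fn = sum(a == positive and p != positive for a, p in zip(actual, predicted))
    return tp, fp, fn

def precision_recall_f(actual, predicted, positive="landslide"):
    tp, fp, fn = confusion(actual, predicted, positive)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

actual    = ["landslide", "landslide", "stable", "stable", "landslide", "stable"]
predicted = ["landslide", "stable", "stable", "landslide", "landslide", "stable"]
p, r, f = precision_recall_f(actual, predicted)
```

Precision penalises false alarms, recall penalises missed landslides, and the F-measure balances the two; ROC area additionally requires ranking the predicted probabilities.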
Abstract:
Magill and Quinzii (2002; Capital market equilibrium with moral hazard. Journal of Mathematical Economics 38, 149-190) showed that, in a stock market economy with private information, the moral hazard problem may be resolved provided that a spanning overlap condition is satisfied. This result depends on the assumption that the technology is given by a stochastic production function with a single scalar input. The object of the present paper is to extend the analysis of Magill and Quinzii to the case of multiple inputs. We show that their main result extends to this general case if and only if, for each firm, the number of linearly independent combinations of securities having payoffs correlated with, but not dependent on, the firm's output is equal to the number of degrees of freedom in the firm's production technology.
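The counting condition above is, in essence, a rank test: the span of the relevant security payoffs must have dimension equal to the technology's degrees of freedom. A minimal illustration with a toy payoff matrix (the economic content of "correlated with, but not dependent on, the firm's output" is not modelled here):

```python
# Hedged illustration: checking the dimension condition as a matrix rank test.
# Each row is one security's payoff across states of the world; the matrix
# values and the two-input assumption are toy choices.

def matrix_rank(rows, tol=1e-10):
    """Rank via Gaussian elimination with partial pivoting (pure Python)."""
    m = [list(r) for r in rows]
    rank, col = 0, 0
    n_rows, n_cols = len(m), len(m[0])
    while rank < n_rows and col < n_cols:
        pivot = max(range(rank, n_rows), key=lambda i: abs(m[i][col]))
        if abs(m[pivot][col]) < tol:
            col += 1
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for i in range(rank + 1, n_rows):
            factor = m[i][col] / m[rank][col]
            m[i] = [a - factor * b for a, b in zip(m[i], m[rank])]
        rank += 1
        col += 1
    return rank

payoffs = [[1.0, 2.0, 3.0],
           [2.0, 4.0, 6.0],   # redundant: a multiple of the first security
           [0.0, 1.0, 1.0]]
spanning_ok = matrix_rank(payoffs) >= 2   # technology with 2 degrees of freedom
```

The redundant second security contributes nothing to the span, which is exactly why counting securities rather than independent payoff directions would overstate what markets can insure.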