913 results for "Probabilistic fire risk analysis"


Relevance: 100.00%

Abstract:

Observability measures the support of computer systems to accurately capture, analyze, and present (collectively, observe) the internal information about the systems. Observability frameworks play important roles in program understanding, troubleshooting, performance diagnosis, and optimization. However, traditional solutions are either expensive or coarse-grained, consequently compromising their utility in accommodating today’s increasingly complex software systems. New solutions are emerging for VM-based languages due to the full control language VMs have over program executions. Existing solutions of this kind, nonetheless, still lack flexibility, have high overhead, or provide limited context information for developing powerful dynamic analyses. In this thesis, we present a VM-based infrastructure, called the marker tracing framework (MTF), to address the deficiencies in the existing solutions and provide better observability for VM-based languages. MTF serves as a solid foundation for implementing fine-grained, low-overhead program instrumentation. Specifically, MTF allows analysis clients to: 1) define custom events with rich semantics; 2) specify precisely the program locations where the events should trigger; and 3) adaptively enable/disable the instrumentation at runtime. In addition, MTF-based analysis clients are more powerful because they have access to all information available to the VM. To demonstrate the utility and effectiveness of MTF, we present two analysis clients: 1) dynamic typestate analysis with adaptive online program analysis (AOPA); and 2) selective probabilistic calling context analysis (SPCC). In addition, we evaluate the runtime performance of MTF and the typestate client with the DaCapo benchmarks. The results show that: 1) MTF has acceptable runtime overhead when tracing moderate numbers of marker events; 2) AOPA is highly effective in reducing the event frequency for the dynamic typestate analysis; and 3) language VMs can be exploited to offer greater observability.
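
The event-driven typestate checking with adaptive disabling described above can be illustrated with a small, language-agnostic sketch. The `TypestateMonitor` class, the event names, and the file-handle property below are hypothetical illustrations, not the actual MTF API (MTF is a JVM-internal framework with a richer interface); they only mirror the idea of user-defined marker events and of turning instrumentation off once a verdict is reached.

```python
# Minimal sketch of event-driven typestate checking with adaptive disabling.
# The property, states, and event names are hypothetical; they are not MTF's API.

class TypestateMonitor:
    """Checks a per-object typestate property over a stream of marker events."""

    # Allowed transitions for a simple file-handle protocol: open -> read* -> close
    TRANSITIONS = {
        ("CLOSED", "open"): "OPEN",
        ("OPEN", "read"): "OPEN",
        ("OPEN", "close"): "CLOSED",
    }

    def __init__(self):
        self.state = "CLOSED"
        self.enabled = True          # adaptive on/off switch (AOPA-like idea)
        self.violation = None

    def on_event(self, event):
        if not self.enabled:
            return                   # instrumentation already turned off
        next_state = self.TRANSITIONS.get((self.state, event))
        if next_state is None:
            self.violation = f"illegal '{event}' in state {self.state}"
            self.enabled = False     # verdict reached: stop tracing this object
        else:
            self.state = next_state


if __name__ == "__main__":
    monitor = TypestateMonitor()
    for ev in ["open", "read", "close", "read"]:   # 'read' after 'close' is illegal
        monitor.on_event(ev)
    print(monitor.violation)   # -> illegal 'read' in state CLOSED
```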

Relevance: 100.00%

Abstract:

Obtaining ecotoxicological data on pesticides in tropical regions is imperative for performing more realistic risk analyses, and avoidance tests have been proposed as a useful, fast and cost-effective tool. Therefore, the present study aimed to evaluate the avoidance behavior of Eisenia andrei towards a formulated product, Vertimec® 18 EC (a.i. abamectin), in tests performed on a reference tropical artificial soil (TAS), to derive ecotoxicological data under tropical conditions, and on a natural soil (NS), simulating crop field conditions. In the TAS tests an adaptation of the substrate recommended by the OECD and ISO protocols was used, with coconut fiber residues as the source of organic matter. Concentrations of the pesticide in the TAS tests ranged from 0 to 7 mg abamectin/kg (dry weight, d.w.). In the NS tests, earthworms were exposed to samples of soils sprayed in situ with 0.9 L of Vertimec® 18 EC/ha (RD), twice that dosage (2RD), and distilled water (control), respectively, and to 2RD:control dilutions (12.5, 25, 50, 75%). All tests were performed at 25 ± 2 °C, to simulate tropical conditions, and under a 12hL:12hD photoperiod. The organisms avoided contaminated TAS, with an EC50,48h of 3.918 mg/kg soil d.w., a LOEC of 1.75 mg/kg soil d.w. and a NOEC of 0.85 mg/kg soil d.w. No significant avoidance response occurred in any NS test. Abamectin concentrations in NS were considerably lower than the EC50,48h and LOEC determined in the TAS tests. The results obtained help to overcome the lack of ecotoxicological data on pesticides under tropical conditions, but more tests with different soil invertebrates are needed to improve pesticide risk analysis.
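
As a rough illustration of how an EC50 is derived from avoidance-test counts, the sketch below fits a two-parameter log-logistic curve to hypothetical avoidance data; the concentrations and response fractions are invented for illustration and are not the study's measurements.

```python
# Sketch: estimating an EC50 from avoidance-test data with a log-logistic fit.
# The data points below are hypothetical, not the measurements of the study.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, slope):
    """Fraction of worms avoiding the treated soil as a function of concentration."""
    return 1.0 / (1.0 + (ec50 / conc) ** slope)

# Hypothetical concentrations (mg a.i./kg soil d.w.) and avoidance fractions
conc = np.array([0.44, 0.85, 1.75, 3.5, 7.0])
avoid = np.array([0.10, 0.25, 0.45, 0.55, 0.80])

params, _ = curve_fit(log_logistic, conc, avoid, p0=[2.0, 1.0])
ec50, slope = params
print(f"EC50 ~ {ec50:.2f} mg/kg soil d.w., slope ~ {slope:.2f}")
```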

Relevance: 100.00%

Abstract:

We tested the early performance of 16 native early-, mid-, and late-successional tree species in response to four intensities of grass removal in an abandoned cattle pasture dominated by the introduced, invasive African grass Cynodon plectostachyus, within the Lacandon rainforest region, southeast Mexico. Increasing the intensity of grass removal significantly improved the performance of many species, especially of early- and mid-successional species, while the performance of late-successional species was relatively poor and did not differ significantly among treatments. Good site preparation and at least one additional grass removal four months after seedling transplant were found to be essential; additional grass removals significantly improved sapling performance in most cases. In order to evaluate the potential of transplanting tree seedlings successfully in abandoned tropical pastures, we developed a "planting risk index" combining field performance measurements and plantation cost estimates. Our results showed a great potential for establishing restoration plantings with many early- and mid-successional species. Although the planting risk of late-successional species was considered high, certain species showed some capacity for acclimation after 18 months and should be considered in future planting arrangements in view of their long-term contributions to biodiversity maintenance and to human welfare through the delivery of ecosystem services. Conducting a planting risk analysis can help avoid failure of restoration strategies involving simultaneous planting of early-, mid-, and late-successional tree species. This in turn will improve the cost-effectiveness of initial interventions in large-scale, long-term restoration programs.
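
A composite "planting risk index" of the kind described can be sketched as a weighted combination of normalized field performance and plantation cost. The species names, scores, and weights below are hypothetical placeholders, not the values used by the authors.

```python
# Sketch of a composite planting risk index: low survival/growth and high cost
# increase risk. Species, scores, and weights are hypothetical placeholders.
import numpy as np

species = ["early_succ_A", "mid_succ_B", "late_succ_C"]
survival = np.array([0.90, 0.75, 0.40])       # fraction surviving at 18 months
rel_growth = np.array([0.80, 0.60, 0.20])     # normalized growth performance (0-1)
rel_cost = np.array([0.30, 0.45, 0.90])       # normalized plantation cost (0-1)

w_perf, w_cost = 0.7, 0.3                     # hypothetical weights
performance = 0.5 * (survival + rel_growth)   # combine the two performance measures
risk = w_perf * (1.0 - performance) + w_cost * rel_cost

for name, r in zip(species, risk):
    print(f"{name}: planting risk index = {r:.2f}")
```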

Relevance: 100.00%

Abstract:

Comparison of two software packages for risk analysis of the road transport of dangerous goods (TRAT GIS 4.1 and QRAM 3.6), applied to a simple case study and to the real case of Casalecchio di Reno, a municipality in the province of Bologna.

Relevance: 100.00%

Abstract:

Seismic design has undergone a profound transformation in recent years: with the introduction of the new technical codes, practice has moved from checking the local capacity of individual members to a probabilistic design approach, which requires satisfying a set of limit states, each associated with a given probability of exceedance. The seismic reliability of a structure is usually assessed through methodologies known as Probabilistic Seismic Design Analysis (PSDA), in accordance with the Performance-Based Earthquake Engineering (PBEE) framework. Within this probabilistic procedure, the definition of the seismic intensity measure is of great importance: it can be used both as a predictor of the structural response to a seismic event and as a parameter to define the hazard of a site. These intensity measures can be defined directly from the recorded ground motion, such as the peak ground acceleration, or from the response, linear or nonlinear, of a structure subjected to that motion, i.e., the so-called spectral intensity measures. As will be shown, intensity measures that satisfy certain properties are preferable, so as to make the solution of the problem with the probabilistic PBEE approach as effective as possible. The main objective of this dissertation is to evaluate some of these properties for a large number of seismic intensity measures, starting from the structural response results obtained through nonlinear time-history analyses carried out for several structural typologies with different mechanical properties.
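
One property commonly checked for an intensity measure (IM) is "efficiency", usually quantified as the dispersion of a log-log regression of the structural response (EDP) on the IM. The sketch below illustrates that computation on synthetic IM/EDP pairs, which are placeholders and not the results of the thesis.

```python
# Sketch: efficiency of an intensity measure quantified as the residual dispersion
# of the regression ln(EDP) = ln(a) + b*ln(IM). Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
im = rng.lognormal(mean=-1.0, sigma=0.6, size=40)          # e.g. Sa(T1) in g
edp = 0.02 * im ** 0.9 * rng.lognormal(0.0, 0.3, size=40)  # e.g. peak drift ratio

# Fit ln(EDP) = ln(a) + b * ln(IM) and measure the residual dispersion beta
b, ln_a = np.polyfit(np.log(im), np.log(edp), 1)
residuals = np.log(edp) - (ln_a + b * np.log(im))
beta = residuals.std(ddof=2)   # two fitted parameters

print(f"a = {np.exp(ln_a):.4f}, b = {b:.2f}, dispersion beta = {beta:.2f}")
```

A lower dispersion means the IM predicts the response more tightly, so fewer records are needed for a given confidence in the demand model.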

Relevance: 100.00%

Abstract:

Verocytotoxin-producing Escherichia coli (VTEC or STEC) are among the most important causes of foodborne disease currently present in Europe. Their presence on farms raising food-producing animals represents a significant risk to consumer health. As a consequence of the contamination that commonly occurs during slaughter or milking, VTEC can be present in meat and milk, and they pose a serious risk whenever preparation for consumption or processing does not include treatments capable of inactivating them (e.g., raw or undercooked meat, unpasteurized milk, fresh raw-milk cheeses). Contamination of cultivated fields through the spreading of manure or through contaminated water can disseminate these strains, which are normally harboured in the intestine of ruminants (domestic and wild); vegetables eaten raw, juices, and even seeds have also been implicated in severe outbreaks, with serious enteric manifestations and complications that can lead to severe clinical pictures and even death. Pathogenic VTEC strains ingested with food can cause gastrointestinal symptoms, with watery or bloody diarrhoea (in 50% of cases), abdominal cramps, mild fever and, in a smaller proportion of cases, nausea and vomiting. In some cases (about 5-10%) the gastrointestinal infection is complicated by toxaemic manifestations characterized by haemolytic uraemic syndrome (HUS), with haemolytic anaemia, severe renal failure and neurological involvement, or by thrombotic thrombocytopenic purpura. The mortality rate of patients with E. coli infection is below 1%. Data provided by the ECDC on foodborne infections in the period 2006-2010 showed a slightly increasing trend in the number of infections starting from 2007. The objective of the studies carried out is to evaluate the prevalence and behaviour of VTEC for a more in-depth risk analysis.

Relevance: 100.00%

Abstract:

This work presents a comprehensive methodology for the reduction of analytical or numerical stochastic models characterized by uncertain input parameters or boundary conditions. The technique, based on Polynomial Chaos Expansion (PCE) theory, represents a versatile solution for solving direct or inverse problems related to the propagation of uncertainty. The potential of the methodology is assessed by investigating different application contexts related to groundwater flow and transport scenarios, such as global sensitivity analysis, risk analysis and model calibration. This is achieved by implementing a numerical code, developed in the MATLAB environment, which is presented here in its main features and tested with literature examples. The procedure has been designed with flexibility and efficiency in mind, in order to ensure its adaptability to different fields of engineering; it has been applied to different case studies related to flow and transport in porous media. Each application is associated with innovative elements such as (i) new analytical formulations describing the motion and displacement of non-Newtonian fluids in porous media, (ii) the application of global sensitivity analysis to a high-complexity numerical model inspired by a real case of risk of radionuclide migration in the subsurface environment, and (iii) the development of a novel sensitivity-based strategy for parameter calibration and experiment design in laboratory-scale tracer transport.
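
One way to see how a PCE surrogate yields global sensitivity information is the toy sketch below: a degree-2 Hermite expansion is fitted by least squares to a simple two-input model with standard normal inputs, and first-order Sobol indices follow directly from the squared coefficients. The model, sample size, and truncation are illustrative only and unrelated to the MATLAB code developed in the thesis.

```python
# Toy sketch: degree-2 polynomial chaos expansion (probabilists' Hermite basis)
# for a 2-input model, fitted by least squares, with first-order Sobol indices
# read off the PCE coefficients. The model is illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + 0.8 * x1 * x2 + 0.3 * (x2**2 - 1.0)  # toy model

# Probabilists' Hermite polynomials: He0=1, He1(x)=x, He2(x)=x^2-1 (E[He_n^2]=n!)
basis = np.column_stack([
    np.ones(n),          # multi-index (0,0)
    x1,                  # (1,0)
    x2,                  # (0,1)
    x1**2 - 1.0,         # (2,0)
    x2**2 - 1.0,         # (0,2)
    x1 * x2,             # (1,1)
])
norms = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 1.0])   # E[psi_k^2] for each basis term

coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
var_terms = coef**2 * norms
total_var = var_terms[1:].sum()                     # exclude the constant term

S1 = (var_terms[1] + var_terms[3]) / total_var      # terms involving only x1
S2 = (var_terms[2] + var_terms[4]) / total_var      # terms involving only x2
print(f"First-order Sobol indices: S1 = {S1:.2f}, S2 = {S2:.2f}")
```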

Relevance: 100.00%

Abstract:

On the one hand, prosthodontic reconstructions compensate for the sequelae of negative changes in the oral cavity; on the other hand, they often enhance or accelerate them. As a consequence of negative changes in the oral cavity over time, treatment planning for RPDs becomes highly complex. A set of reliable criteria is necessary for decision-making and problem management. It appears that the majority of published data on RPDs does not depict high effectiveness of this treatment modality. From a strict evidence-based dentistry point of view, the level of evidence for RPDs is low, if not missing. Randomized controlled trials on RPDs are difficult to design, are not feasible for some questions due to the complexity of the material, or may remain without clinical relevance. The literature rarely gives information on denture design, tooth selection, and management of the compromised structural integrity of teeth. To date, treatment outcomes with RPDs must be interpreted in light of the bias in indication and patient selection for RPDs. Better clinical models should be elaborated, with more stringent concepts for providing RPDs. This encompasses: risk analysis and patient assessment, proper indications for maintenance or extraction of teeth, strategic placement of implants, biomechanical aspects, materials, and technology. Although there is a tendency to offer fixed prostheses to our patients, this might change again with demographic changes and with an increase in the ageing population, an increase in reduced dentitions, and low socioeconomic wealth in large parts of the world.

Relevance: 100.00%

Abstract:

In natural hazard research, risk is defined as a function of (1) the probability of occurrence of a hazardous process, and (2) the related extent of damage, defined by the value of the exposed elements at risk and their physical vulnerability. Until now, various works have been undertaken to determine vulnerability values for objects exposed to geomorphic hazards such as mountain torrents. Yet many studies only provide rough estimates of vulnerability values based on proxies for process intensities. Moreover, the vulnerability functions proposed in the literature show a wide range, in particular with respect to medium and high process magnitudes. In our study, we compare vulnerability functions for torrent processes derived from studies in test sites located in the Austrian Alps and in Taiwan. Based on this comparison, we identify needs for future research in order to enhance mountain hazard risk management, with a particular focus on the question of vulnerability at the catchment scale.
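
The risk definition stated at the start of the abstract can be written out explicitly. The sketch below evaluates it for a single element at risk using a hypothetical vulnerability function of process intensity; the functional form and all numbers are placeholders, not the fitted curves of the study.

```python
# Sketch of the risk relation R = p_occurrence * value * vulnerability(intensity)
# for one element at risk. The vulnerability curve and numbers are hypothetical.
import math

def vulnerability(intensity_m):
    """Degree of loss (0-1) as a function of process intensity, e.g. deposit depth in m."""
    # A smooth S-shaped placeholder curve; real curves are fitted to damage data.
    return 1.0 - math.exp(-(intensity_m / 1.5) ** 2)

p_event = 0.01            # annual probability of the design torrent event
building_value = 350_000  # monetary value of the exposed building (EUR)
intensity = 1.2           # process intensity at the building (m of deposition)

annual_risk = p_event * building_value * vulnerability(intensity)
print(f"Vulnerability = {vulnerability(intensity):.2f}, "
      f"expected annual loss = {annual_risk:,.0f} EUR")
```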

Relevance: 100.00%

Abstract:

Skeletal diseases such as osteoporosis impose a severe socio-economic burden on ageing societies. Decreasing mechanical competence causes a rise in bone fracture incidence and mortality, especially after the age of 65 years. The mechanisms by which bone damage is accumulated under different loading modes, and its impact on bone strength, are unclear. We hypothesise that damage accumulated in one loading mode increases the fracture risk in another. This study aimed at identifying continuum damage interactions between tensile and compressive loading modes. We propose and identify the material constants of a novel piecewise 1D constitutive model capable of describing the mechanical response of bone in combined tensile and compressive loading histories. We performed several sets of loading–reloading experiments to compute stiffness, plastic strains, and stress-strain curves. For tensile overloading, a stiffness reduction (damage) of 60% at 0.65% accumulated plastic strain was detectable as a stiffness reduction of 20% under compression. For compressive overloading, 60% damage at 0.75% plastic strain was detectable as a stiffness reduction of 50% in tension. Plastic strain at ultimate stress was the same in tension and compression. Compression showed softening and tension exponential hardening in the post-yield regime. The hardening behaviour in compression is unaffected by a previous overload in tension, but the hardening behaviour in tension is affected by a previous overload in compression, as the tensile reloading strength is significantly reduced. This paper demonstrates how damage accumulated under one loading mode affects the mechanical behaviour in another loading mode. To explain this and to illustrate a possible implementation, we propose a theoretical model. Including such loading-mode-dependent damage and plasticity behaviour in finite element models will help to improve fracture risk analysis of whole bones and bone-implant structures.
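
The damage variable used above is simply the relative loss of stiffness between the initial and reloading branches, with the plastic strain read from the unloading intercept. A minimal sketch of that bookkeeping, on invented loading-reloading numbers (not the measured data), is given below.

```python
# Sketch: continuum damage as relative stiffness loss, D = 1 - E_reload / E_0,
# with the accumulated plastic strain taken from the elastic unloading intercept.
# All stiffness, stress, and strain values below are invented placeholders.
E0 = 18.0          # initial modulus of the bone sample (GPa)
E_reload = 7.2     # modulus measured on reloading after a tensile overload (GPa)

damage = 1.0 - E_reload / E0
print(f"Damage D = {damage:.2f}")                 # ~0.60 for these placeholder values

# Plastic strain: residual strain at zero stress after unloading along E_reload
strain_at_unload = 0.018    # total strain when unloading starts
stress_at_unload = 0.080    # stress at that point (GPa, i.e. 80 MPa)
plastic_strain = strain_at_unload - stress_at_unload / E_reload
print(f"Accumulated plastic strain = {plastic_strain * 100:.2f} %")
```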

Relevance: 100.00%

Abstract:

External beam radiation therapy is used to treat nearly half of the more than 200,000 new cases of prostate cancer diagnosed in the United States each year. During a radiation therapy treatment, healthy tissues in the path of the therapeutic beam are exposed to high doses. In addition, the whole body is exposed to a low-dose bath of unwanted scatter radiation from the pelvis and leakage radiation from the treatment unit. As a result, survivors of radiation therapy for prostate cancer face an elevated risk of developing a radiogenic second cancer. Recently, proton therapy has been shown to reduce the dose delivered by the therapeutic beam to normal tissues during treatment compared to intensity modulated x-ray therapy (IMXT, the current standard of care). However, the magnitude of stray radiation doses from proton therapy, and their impact on the incidence of radiogenic second cancers, was not known. The risk of a radiogenic second cancer following proton therapy for prostate cancer relative to IMXT was determined for 3 patients of large, median, and small anatomical stature. Doses delivered to healthy tissues from the therapeutic beam were obtained from treatment planning system calculations. Stray doses from IMXT were taken from the literature, while stray doses from proton therapy were simulated using a Monte Carlo model of a passive scattering treatment unit and an anthropomorphic phantom. Baseline risk models were taken from the Biological Effects of Ionizing Radiation VII report. A sensitivity analysis was conducted to characterize the sensitivity of the risk calculations to uncertainties in the risk model, the relative biological effectiveness (RBE) of neutrons for carcinogenesis, and inter-patient anatomical variations. The risk projections revealed that proton therapy carries a lower risk of radiogenic second cancer incidence following prostate irradiation compared to IMXT. The sensitivity analysis revealed that the results of the risk analysis depended only weakly on uncertainties in the risk model and inter-patient variations. Second cancer risks were sensitive to changes in the RBE of neutrons. However, the findings of the study were qualitatively consistent for all patient sizes and risk models considered, and for all neutron RBE values less than 100.
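
The structure of such a risk comparison can be sketched in a few lines: for each modality, organ doses (therapeutic plus stray, with neutron dose scaled by an assumed RBE) are multiplied by organ-specific risk coefficients and summed, and the ratio is swept over the neutron RBE. All doses and coefficients below are hypothetical placeholders, and the simple linear model is a deliberate simplification; these are not the BEIR VII values, dose-response models, or doses computed in the study.

```python
# Sketch of the structure of a second-cancer risk projection: sum over organs of
# (risk coefficient) x (organ dose), with stray neutron dose scaled by an assumed
# RBE. Doses and coefficients are hypothetical placeholders, not BEIR VII values.
risk_coeff = {"bladder": 0.010, "rectum": 0.008, "colon": 0.012}  # per Gy (hypothetical)

therapeutic_dose = {            # Gy from the therapeutic beam (hypothetical)
    "IMXT":   {"bladder": 18.0, "rectum": 15.0, "colon": 3.0},
    "proton": {"bladder": 10.0, "rectum": 8.0,  "colon": 1.5},
}
stray_photon_dose = {"IMXT":   {"bladder": 0.5,  "rectum": 0.4,  "colon": 0.6},
                     "proton": {"bladder": 0.0,  "rectum": 0.0,  "colon": 0.0}}
stray_neutron_dose = {"IMXT":   {"bladder": 0.0,  "rectum": 0.0,  "colon": 0.0},
                      "proton": {"bladder": 0.02, "rectum": 0.02, "colon": 0.03}}  # Gy

def projected_risk(modality, neutron_rbe):
    total = 0.0
    for organ, coeff in risk_coeff.items():
        dose = (therapeutic_dose[modality][organ]
                + stray_photon_dose[modality][organ]
                + neutron_rbe * stray_neutron_dose[modality][organ])
        total += coeff * dose
    return total

for rbe in (5, 25, 100):   # sensitivity of the comparison to the assumed neutron RBE
    ratio = projected_risk("proton", rbe) / projected_risk("IMXT", rbe)
    print(f"neutron RBE = {rbe:3d}: proton/IMXT risk ratio = {ratio:.2f}")
```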

Relevance: 100.00%

Abstract:

This study applies a methodology for obtaining derived frequency curves (of maximum discharged flows and maximum water levels reached) within a Monte Carlo simulation framework, for inclusion in a dam risk analysis model. Its behaviour is compared with that of frequency curves obtained using traditionally employed techniques.
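
A stripped-down version of such a derived frequency analysis is sketched below: synthetic flood hydrographs are sampled, routed through a single reservoir by level-pool routing with a simple spillway rating, and the simulated maxima are ranked into empirical frequency curves. The reservoir geometry, spillway coefficients, and flood statistics are invented for illustration and do not reproduce the methodology of the study.

```python
# Sketch of derived frequency curves by Monte Carlo: sample synthetic floods,
# route them through a reservoir (level-pool routing, simple spillway rating),
# and rank the simulated maxima. All reservoir/flood parameters are invented.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 2000
dt = 3600.0                      # routing time step (s)
duration = 48 * 3600.0           # hydrograph duration (s)
t = np.arange(0.0, duration, dt)

area = 2.0e6                     # reservoir surface area (m^2), assumed constant
h_spill = 0.0                    # water level at spillway crest (m)
c_spill, L_spill = 2.1, 30.0     # spillway discharge coefficient and length

# Sample peak inflows from a Gumbel distribution (location 300 m3/s, scale 120 m3/s)
peaks = rng.gumbel(loc=300.0, scale=120.0, size=n_sim).clip(min=10.0)

max_outflow = np.empty(n_sim)
max_level = np.empty(n_sim)
for i, qp in enumerate(peaks):
    # Triangular inflow hydrograph peaking at one third of the duration
    inflow = np.interp(t, [0.0, duration / 3.0, duration], [0.0, qp, 0.0])
    h, out_peak, h_peak = h_spill, 0.0, h_spill
    for q_in in inflow:
        q_out = c_spill * L_spill * max(h - h_spill, 0.0) ** 1.5
        h += (q_in - q_out) * dt / area          # level-pool mass balance
        out_peak, h_peak = max(out_peak, q_out), max(h_peak, h)
    max_outflow[i], max_level[i] = out_peak, h_peak

# Empirical exceedance probabilities (Weibull plotting positions)
p_exc = np.arange(1, n_sim + 1) / (n_sim + 1)
q_sorted = np.sort(max_outflow)[::-1]
h_sorted = np.sort(max_level)[::-1]
print("T=100 yr outflow ~ %.0f m3/s, level ~ %.2f m" %
      (np.interp(0.01, p_exc, q_sorted), np.interp(0.01, p_exc, h_sorted)))
```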

Relevance: 100.00%

Abstract:

Forest fires are the main cause of tree mortality in Mediterranean Europe and the most serious threat to Spanish forest ecosystems. In the region of Valencia, about one hundred surveillance vehicles are deployed every day, and their distribution relies essentially on a fire risk index computed from meteorological conditions. This thesis focuses on the design and validation of a new integrated fire risk index, specially adapted to the Mediterranean region and intended to support decision-making in the daily allocation of forest fire surveillance resources. The index adopts the integrated-risk approach introduced over the last decade, which comprises two risk components: ignition danger and vulnerability. The former represents the probability that a fire starts and the potential danger of its spreading, while vulnerability accounts for the characteristics of the territory and the potential effects of fire on it. To compute the danger component, indicators were identified relating to the natural and human agents that cause fires, historical occurrence, and fuel condition, the latter being closely related to meteorology and species. For vulnerability, indicators were used that represent both the potential effects of a fire (fire behaviour, defence infrastructure) and the characteristics of the terrain (value, regeneration capacity, etc.). All these indicators form a hierarchical structure in which, following the recommendations of the European Commission for fire risk indices, both short-term and long-term risk indicators have been included. The final value of the index is obtained by progressively aggregating the components that make up each level of the hierarchical structure and then integrating them. Since multi-criteria decision techniques are specifically oriented to problems based on hierarchical structures, the TOPSIS method was applied to obtain the final integration of the model. Expert opinion was introduced into the model through the weighting of each component of the index; the AHP method was used to obtain the weights given by each expert and to combine them into a single weight per indicator. The index was validated using Generalized Estimating Equation models, which account for possibly correlated responses. Validation used official records of fires that occurred between 1994 and 2003, referenced to a 10x10 km grid, with fire occurrence and burned area as dependent variables. The validation results show good performance of the occurrence danger sub-index, with a high degree of correlation between the sub-index and occurrence, a good fit of the logistic model, and good discriminating power. The vulnerability sub-index, in turn, did not show a significant correlation between its values and the burned area, which does not rule out its validity, since some of its components are subjective in nature and independent of the area burned.
Overall, the index performs well for distributing surveillance resources according to ignition danger. Nevertheless, new lines of research are identified and discussed that could improve the overall fit of the index. In particular, the apparent correlation observed in the province of Valencia between the forested area within each 10 km grid cell and its fire risk (the smaller the forested area, the higher the fire risk) needs to be studied in more depth. Other aspects to be investigated are the sensitivity of the weights of each component and the introduction of factors related to potential firefighting resources into the vulnerability sub-index.
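
The two multi-criteria steps mentioned above can be illustrated compactly: an AHP pairwise-comparison matrix yields indicator weights via its principal eigenvector, and TOPSIS scores alternatives (here, grid cells) by their closeness to ideal and anti-ideal points. The comparison matrix and indicator scores below are invented for illustration and are not the expert judgements or indicators of the thesis.

```python
# Sketch of the two aggregation steps: AHP weights from a pairwise comparison
# matrix (principal eigenvector) and TOPSIS scoring of alternatives (grid cells).
# The comparison matrix and the indicator scores are invented examples.
import numpy as np

# --- AHP: weights for three indicators from one expert's pairwise comparisons ---
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()                       # normalized AHP weights

# --- TOPSIS: rank four grid cells described by the three indicators ---
X = np.array([[0.7, 0.4, 0.9],              # rows: cells, columns: indicators
              [0.2, 0.8, 0.3],
              [0.5, 0.5, 0.5],
              [0.9, 0.1, 0.6]])
R = X / np.linalg.norm(X, axis=0)           # vector-normalize each criterion
V = R * weights                             # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # all indicators treated as "benefit"
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)    # higher = closer to the ideal point

print("AHP weights:", np.round(weights, 3))
print("TOPSIS closeness per cell:", np.round(closeness, 3))
```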

Relevance: 100.00%

Abstract:

Algorithms for distributed agreement are a powerful means for formulating distributed versions of existing centralized algorithms. We present a toolkit for this task and show how it can be used systematically to design fully distributed algorithms for static linear Gaussian models, including principal component analysis, factor analysis, and probabilistic principal component analysis. These algorithms do not rely on a fusion center, require only low-volume local (1-hop neighborhood) communications, and are thus efficient, scalable, and robust. We show how they are also guaranteed to asymptotically converge to the same solution as the corresponding existing centralized algorithms. Finally, we illustrate the functioning of our algorithms on two examples, and examine the inherent cost-performance tradeoff.
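
A minimal sketch of the consensus idea behind such algorithms is given below: each node averages its local sample covariance with its 1-hop neighbours using Metropolis weights, and once the covariances agree every node can extract the same principal components locally, without a fusion center. The graph, data sizes, and weight choice are illustrative assumptions; the paper's algorithms are derived more formally and cover the full family of linear Gaussian models.

```python
# Sketch: distributed PCA via average consensus on local covariance matrices.
# Each node only talks to its 1-hop neighbours (Metropolis weights); after the
# covariances agree, every node computes the same principal components locally.
# The graph, data, and weights are illustrative, not the paper's derivation.
import numpy as np

rng = np.random.default_rng(3)
d, n_nodes, n_local = 4, 5, 50
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}  # connected graph

# Each node observes local samples of the same underlying 4-dimensional process
mixing = rng.standard_normal((d, d))
local_cov = []
for _ in range(n_nodes):
    samples = rng.standard_normal((n_local, d)) @ mixing.T
    local_cov.append(np.cov(samples, rowvar=False))

# Metropolis consensus weights guarantee convergence to the network-wide average
def metropolis_weight(i, j):
    return 1.0 / (1 + max(len(neighbors[i]), len(neighbors[j])))

cov = [c.copy() for c in local_cov]
for _ in range(200):                       # consensus iterations
    new = []
    for i in range(n_nodes):
        update = cov[i].copy()
        for j in neighbors[i]:
            update += metropolis_weight(i, j) * (cov[j] - cov[i])
        new.append(update)
    cov = new

# All nodes now hold (approximately) the average covariance; PCA is done locally
centralized = sum(local_cov) / n_nodes
eigvals_node0 = np.linalg.eigvalsh(cov[0])
eigvals_central = np.linalg.eigvalsh(centralized)
print("max eigenvalue gap vs. centralized PCA:",
      np.abs(eigvals_node0 - eigvals_central).max())
```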