895 results for Realistic threat
Abstract:
Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent to both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m long road section in the Swiss Alps, we present results from 488 rockfalls detected in 1260 trees. We show that tree impact data can not only be used (i) to reconstruct the real frequency of rockfalls for individual cells, but also serve (ii) to calibrate the rockfall model Rockyfor3D, as well as (iii) to transform simulated trajectories into real frequencies. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones.
The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
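The frequency reconstruction described above reduces, per cell, to simple arithmetic on the tree-ring record. The sketch below uses invented numbers (not the paper's data) to show the conversion from dated impact scars to a return interval and to rock passages per meter of road section:

```python
# Hedged sketch with illustrative numbers (not the paper's data): turning dated
# impact scars in one grid cell's trees into a rockfall frequency, a return
# interval, and rock passages per meter of road section per year.
impacts = 12          # rockfall impact scars dated in the cell's trees
record_years = 150    # length of the tree-ring record (age of oldest usable trees)
cell_width_m = 40.0   # cell width along the road section

frequency_per_year = impacts / record_years            # events per year in the cell
return_interval = record_years / impacts               # years between events
passages_per_m_yr = frequency_per_year / cell_width_m  # passages per meter per year

print(frequency_per_year, return_interval, passages_per_m_yr)
```

The last quantity is what makes cells directly comparable along the road, since it normalizes activity by the exposed length.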
Abstract:
BACKGROUND Ambrosia artemisiifolia (common ragweed) pollen is a potent allergen and has recently been found in Switzerland, spreading from the southwest of the country. The aim of this study is to describe Ambrosia sensitisation rates in the population-based SAPALDIA cohort (Swiss Study on Air Pollution And Lung Diseases In Adults) and to test whether an increase in these rates could be observed. METHODS Among the 6345 participants from 8 areas who provided blood samples in 1991 and 2002, 5823 had valid results for specific IgE against common inhalant allergens tested with Phadiatop. In 2002 Ambrosia sensitisation was measured and positive tests were analysed for Artemisia vulgaris (mugwort). Blood samples taken in 1991 in Ticino and Geneva were also tested for Ambrosia. RESULTS The sensitisation rate (Phadiatop) did not increase significantly between the two surveys; sensitisation was found in 30% of the participants. A proportion of 7.9% showed specific IgE to Ambrosia pollen. The sensitisation rate in Lugano and Geneva had not changed substantially since 1991. Among those sensitised to Ambrosia, 82% also showed specific IgE against Artemisia, suggesting a high rate of cross-reactivity. Only 1.3% were sensitised to Ambrosia alone. The incidence of asthma or hay fever in participants with specific IgE to Ambrosia pollen was not higher than in the general study population. CONCLUSION Currently Ambrosia pollen does not appear to be an important cause of inhalant allergies in Switzerland. Sensitisation rates are low and have not increased since 1991. Due to cross-reactivity, Ambrosia sensitisation may be a consequence of primary sensitisation to Artemisia. Elimination of Ambrosia plants is nevertheless mandatory to avoid a future increase.
Abstract:
OBJECTIVE To systematically review the reporting of metaphase II (MII) oocyte development after xenotransplantation of human ovarian tissue. DESIGN Systematic review in accordance with the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). SETTING Not applicable. PATIENT(S) Not applicable. INTERVENTION(S) Formation of MII oocytes after xenotransplantation of human ovarian tissue. MAIN OUTCOME MEASURE(S) Any outcome reported in PubMed. RESULT(S) Six publications were identified that report on the formation of MII oocytes after xenotransplantation of human ovarian tissue. CONCLUSION(S) Xenografting of human ovarian tissue has proved to be a useful model for examining ovarian function and follicle development in vivo. With human follicles that have matured through xenografting, the possibility of cancer transmission and relapse can also be eliminated, because cancer cells are not able to penetrate the zona pellucida. The reported studies have demonstrated that xenografted ovarian tissue from a range of species, including humans, can produce antral follicles that contain mature (MII) oocytes, and it has been shown that mouse oocytes have the potential to give rise to live young. Although some ethical questions remain unresolved, xenotransplantation may be a promising method for restoring fertility. This review furthermore describes the value of xenotransplantation as a tool in reproductive biology and discusses the ethical and potential safety issues regarding ovarian tissue xenotransplantation as a means of recovering fertility.
Abstract:
Previous multicast research often makes commonly accepted but unverified assumptions about network topologies and group member distribution in simulation studies. In this paper, we propose a framework to systematically evaluate multicast performance for different protocols. We identify a series of metrics, and carry out extensive simulation studies on these metrics with different topological models and group member distributions for three case studies. Our simulation results indicate that realistic topology and group membership models are crucial to accurate multicast performance evaluation. These results can provide guidance for multicast researchers to perform realistic simulations, and facilitate the design and development of multicast protocols.
Abstract:
Decades of research show that environmental exposure to the chemical benzene is associated with severe carcinogenic, hematotoxic and genotoxic effects on the human body. As such, the Environmental Protection Agency (EPA) has designated the chemical as a Hazardous Air Pollutant and prescribed benzene air concentration guidelines that provide cities with an ideal ambient level to protect human health. However, in Houston, Texas, a city home to the top industrial benzene emitters in the US, which undoubtedly contribute greatly to the potentially unsafe levels of ambient benzene, regulations beyond the EPA’s unenforceable guidelines are critical to protecting public health. Despite this, the EPA has failed to establish National Ambient Air Quality Standards (NAAQS) for benzene. States are thus left to regulate air benzene levels on their own; in the case of Texas, the Texas Commission on Environmental Quality (TCEQ) and the state legislature have failed to proactively develop legally enforceable policies to reduce major-source benzene emissions. This inaction continues to exacerbate a public health problem, which may only be solved through a legal framework that restricts preventable benzene emissions to protect human health and holds industrial companies accountable for violations of such regulations and standards. This analysis explores legal barriers that the City of Houston and other relevant agencies currently face in their attempt to demand and bring about such change.
Abstract:
This chapter attempts to identify some important issues in developing realistic simulation models based on new economic geography, and it suggests a direction for solving the difficulties. Specifically, adopting the IDE Geographical Simulation Model (IDE-GSM) as an example, we discuss some problems in developing a realistic simulation model for East Asia. The first and largest problem in this region is the lack of reliable economic datasets at the sub-national level, and this issue needs to be resolved in the long term. However, to deal with the existing situation in the short term, we utilize some techniques to produce more realistic and reliable simulation models. One key compromise is to use a 'topology' representation of geography, rather than the 'mesh' or 'grid' representation or simple 'straight lines' connecting each city that are used in many other models. In addition, a modal choice model that takes into consideration both money and time costs seems to work well.
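A 'topology' representation with a money-plus-time modal choice can be illustrated with a minimal sketch. All city names, costs, and the value-of-time parameter below are invented for illustration, not taken from IDE-GSM: each edge of the city graph carries a monetary cost and a travel time per mode, and routing minimizes a generalized cost.

```python
import heapq

# Illustrative sketch (not IDE-GSM): a 'topology' network where each edge
# carries a monetary cost and a travel time per mode; route choice minimizes
# generalized cost = money + value_of_time * time.
VALUE_OF_TIME = 10.0  # assumed money units per hour

# edges[(u, v)] = list of (mode, money_cost, hours) -- all values hypothetical
edges = {
    ("Bangkok", "Phnom Penh"): [("road", 50.0, 12.0), ("sea", 20.0, 48.0)],
    ("Phnom Penh", "Ho Chi Minh"): [("road", 30.0, 6.0)],
}

def generalized_cost(money, hours):
    return money + VALUE_OF_TIME * hours

def cheapest_route(src, dst):
    # Dijkstra over cities; for each edge the modal choice picks the mode
    # with the lowest generalized cost.
    graph = {}
    for (u, v), modes in edges.items():
        best = min(generalized_cost(m, h) for _, m, h in modes)
        graph.setdefault(u, []).append((v, best))
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

print(cheapest_route("Bangkok", "Ho Chi Minh"))
```

Here the slow sea link loses to the road link despite its lower monetary cost, which is the kind of trade-off a time-blind model would miss.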
Abstract:
The threat of impact or explosive loads is regrettably a scenario to be taken into account in the design of lifeline or critical civilian buildings. These are often made of concrete and not specifically designed for military threats. Numerical simulation of such cases may be undertaken with the aid of state-of-the-art explicit dynamic codes; however, several difficult challenges are inherent to such models: the material modeling of the anisotropic failure of concrete, consideration of reinforcement bars and important structural details, adequate modeling of pressure waves from explosions in complex geometries, and efficient solution of models of complete buildings which can realistically assess failure modes. In this work we employ LS-Dyna for calculation, with Lagrangian finite elements and explicit time integration. Reinforced concrete may be represented in a fairly accurate fashion with recent models such as the CSCM model [1] and segregated rebars constrained within the continuum mesh. However, such models cannot realistically be employed for complete models of large buildings, due to limitations of time and computer resources. The use of structural beam and shell elements for this purpose would be the obvious solution, with much lower computational cost. However, this modeling requires careful calibration in order to reproduce adequately the highly nonlinear response of structural concrete members, including bending with and without compression, cracking, plastic crushing, plastic deformation of reinforcement, erosion of failed elements, etc. The main objective of this work is to provide a strategy for modeling such scenarios based on structural elements, using available material models for structural elements [2] and techniques to include the reinforcement in a realistic way. These models are calibrated against fully three-dimensional models and shown to be accurate enough. At the same time, they provide the basis for realistic simulation of impact and explosion on full-scale buildings.
Abstract:
Accurate characterization of the radio channel in tunnels is of great importance for new signaling and train control communications systems. To model this environment, measurements have been taken at 2.4 GHz in a real environment in the Madrid subway. The measurements were carried out with four base station transmitters installed in a 2-km tunnel and a mobile receiver installed on a standard train. First, with an optimum antenna configuration, all the propagation characteristics of a complex subway environment, including near shadowing, path loss, shadow fading, fast fading, level crossing rate (LCR), and average fade duration (AFD), were measured and computed. Thereafter, comparisons of propagation characteristics in a double-track tunnel (9.8-m width) and a single-track tunnel (4.8-m width) were made. Finally, all the measurement results are presented in a complete table for accurate statistical modeling.
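Two of the statistics named above, LCR and AFD, can be estimated from any sampled envelope trace. The following sketch uses synthetic Rayleigh-like data (not the Madrid measurements) and the standard definitions: upward threshold crossings per second for LCR, and time below the threshold per fade for AFD.

```python
import math
import random

# Illustrative sketch: estimating level crossing rate (LCR) and average fade
# duration (AFD) from a sampled signal-envelope trace. The trace is synthetic
# Rayleigh-like data (magnitude of two Gaussian components), not measured data.
random.seed(0)
fs = 1000.0  # assumed sample rate, samples per second
env = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(10000)]

def lcr_afd(envelope, threshold, sample_rate):
    # LCR: number of upward crossings of the threshold per second.
    crossings = sum(1 for a, b in zip(envelope, envelope[1:])
                    if a < threshold <= b)
    duration = len(envelope) / sample_rate
    lcr = crossings / duration
    # AFD: total time spent below the threshold divided by the number of fades.
    time_below = sum(1 for x in envelope if x < threshold) / sample_rate
    afd = time_below / crossings if crossings else float("inf")
    return lcr, afd

lcr, afd = lcr_afd(env, threshold=1.0, sample_rate=fs)
print(lcr, afd)
```

Note the built-in consistency check: LCR times AFD equals the fraction of time the envelope spends below the threshold.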
Abstract:
The energy and specific energy absorbed in the main cell compartments (nucleus and cytoplasm) in typical radiobiology experiments are usually estimated by calculations, as they are not accessible to direct measurement. In most previous work, the cell geometry is modelled using a combination of simple mathematical volumes. We propose a method based on high-resolution confocal imaging and ion beam analysis (IBA) in order to import realistic cell nucleus geometries into Monte Carlo simulations and thus take into account the variety of geometries encountered in a typical cell population. Seventy-six cell nuclei were imaged using confocal microscopy and their chemical composition was measured using IBA. A cellular phantom was created from these data using the ImageJ image analysis software and imported into the Geant4 Monte Carlo simulation toolkit. Total energy and specific energy distributions in the 76 cell nuclei were calculated for two types of irradiation protocols: a 3 MeV alpha particle microbeam used for targeted irradiation and a 239Pu alpha source used for large-angle random irradiation. Qualitative images of the energy deposited along the particle tracks were produced and show good agreement with images of DNA double-strand-break signalling proteins obtained experimentally. The methodology presented in this paper provides microdosimetric quantities calculated from realistic cellular volumes. It is based on open-source software that is publicly available.
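The specific energy referred to above is the energy imparted divided by the mass of the target volume, expressed in gray. A minimal sketch with illustrative numbers (not the paper's measured geometries, and assuming the deposits are fully contained in the nucleus):

```python
# Illustrative sketch: converting energy deposits in a cell nucleus into
# specific energy (z = energy imparted / mass, in gray). All numbers are
# assumed for illustration, not taken from the paper.

E_MEV_TO_J = 1.602176634e-13  # MeV -> joule

def specific_energy_gray(deposits_mev, volume_um3, density_g_cm3=1.0):
    # mass in kg: volume (um^3 -> cm^3) * density (g/cm^3) -> grams -> kg
    mass_kg = volume_um3 * 1e-12 * density_g_cm3 * 1e-3
    energy_j = sum(deposits_mev) * E_MEV_TO_J
    return energy_j / mass_kg  # Gy = J/kg

# e.g. ten 3 MeV deposits assumed fully contained in a ~100 um^3 nucleus
z = specific_energy_gray([3.0] * 10, volume_um3=100.0)
print(round(z, 3))
```

The tiny mass of a nucleus is why even a handful of alpha traversals yields tens of gray, while the same energy in a whole cell would be a far smaller dose.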
Abstract:
The behavior of quantum dot, quantum wire, and quantum well InAs/GaAs solar cells is studied with a very simplified model based on experimental results in order to assess their performance as a function of the low-bandgap material volume fraction f_LOW. The efficiency of structured devices is found to exceed the efficiency of a non-structured GaAs cell, in particular under concentration, when f_LOW is high; this condition is easier to achieve with quantum wells. If three different quasi-Fermi levels appear with quantum dots, the efficiency can be much higher.
Abstract:
An attractive but challenging technology for high-efficiency solar energy conversion is the intermediate band solar cell (IBSC), whose theoretical efficiency limit is 63%, yet which has so far failed to yield high efficiencies in practice. The most advanced IBSC technology is that based on quantum dots (QDs): the QD-IBSC. In this paper, k·p calculations of photon absorption in the QDs are combined with a multi-level detailed balance model. The model has been used to reproduce the measured quantum efficiency of a real QD-IBSC and its temperature dependence. This allows the analysis of individual sub-bandgap transition currents, which has not yet been possible experimentally, yielding a deeper understanding of the failure of current QD-IBSCs. Based on the agreement with experimental data, the model is believed to be realistic enough to evaluate future QD-IBSC proposals.
Abstract:
Intercontinental ballistic missiles are capable of placing a nuclear warhead more than 5,000 km away from their launching base. With the lethal power of a nuclear warhead, a whole city could be wiped out by a single weapon, causing millions of deaths. The threat posed to any country by a single ICBM captured by a terrorist group or launched by a 'rogue' state is therefore huge, and it is increasing as more countries achieve nuclear and advanced launcher capabilities. In order to suppress, or at least reduce, this threat the United States created the National Missile Defense System, which involved, among other systems, the development of long-range interceptors whose aim is to destroy incoming ballistic missiles in their midcourse phase. Ballistic Missile Defense is a high-profile topic that has lately been the focus of political controversy, when the U.S. decided to expand the system to Europe over the opposition of Russia. However, the technical characteristics of this system are mostly unknown to the general public. The interception of an ICBM using a long-range interceptor missile, as intended within the Ground-Based Missile Defense System of the American National Missile Defense (NMD), implies a series of problems of formidable complexity:
- The incoming missile has to be detected almost immediately after launch.
- The incoming missile has to be tracked along its trajectory with great accuracy.
- The interceptor missile has to implement a fast and accurate guidance algorithm in order to reach the incoming missile as soon as possible.
- The Kinetic Kill Vehicle deployed by the interceptor boost vehicle has to be able to detect the reentry vehicle once it has been deployed by the ICBM, when it offers a very low infrared signature, in order to perform a final rendezvous manoeuvre.
- The Kinetic Kill Vehicle has to be able to discriminate the reentry vehicle from the surrounding debris and decoys.
- The Kinetic Kill Vehicle has to be able to implement an accurate guidance algorithm in order to perform a kinetic interception (direct collision) of the reentry vehicle at relative speeds of more than 10 km/s.
All these problems are dealt with simultaneously by the Ground-Based Missile Defense System, which is developing very complex and expensive sensors, communications and control centers, and long-range interceptors (the Ground-Based Interceptor missile) including a Kinetic Kill Vehicle. Among all the technical challenges involved in this interception scenario, this thesis focuses on the algorithms required for the guidance of the interceptor missile and the Kinetic Kill Vehicle in order to perform the direct collision with the ICBM. These guidance algorithms are analysed in depth in part III, where conventional guidance strategies are reviewed and optimal guidance algorithms are developed for this interception problem. A realistic simulation of the interception scenario between an ICBM and a Ground-Based Interceptor designed to destroy it was considered necessary in order to compare different guidance strategies with meaningful results. Consequently, a highly representative simulator of an ICBM and a Kill Vehicle has been implemented, as detailed in part II, and the creation of these simulators has also become one of the purposes of this thesis. In summary, the main purposes of this thesis are:
- To develop a highly representative simulator of an interception scenario between an ICBM and a Kill Vehicle launched from a Ground-Based Interceptor.
- To analyse the main existing guidance algorithms for both the ascent phase and the terminal phase of the missiles; novel conclusions are obtained from these analyses.
- To develop original optimal guidance algorithms for the interception problem.
- To compare the results obtained using the different guidance strategies, assess the behaviour of the optimal guidance algorithms, and analyse the feasibility of the Ballistic Missile Defense system in terms of guidance (part IV).
As a secondary objective, a general overview of the state of the art in ballistic missiles and anti-ballistic missile defence is provided (part I).
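Among the conventional guidance strategies such a thesis reviews, proportional navigation is the classic example: commanded lateral acceleration a = N·Vc·λ̇, with navigation constant N, closing speed Vc and line-of-sight rate λ̇. The following minimal planar simulation is illustrative only (all numbers invented, and a generous 10-m "hit" radius to suit the coarse time step), not the thesis's simulator:

```python
import math

# Minimal 2-D proportional-navigation sketch (illustrative, not the thesis
# code): command a = N * Vc * lambda_dot, applied perpendicular to the
# interceptor's velocity, against a non-maneuvering constant-velocity target.

def simulate_pn(N=4.0, dt=0.001, t_max=60.0):
    # target position/velocity (m, m/s); interceptor starts at the origin
    tx, ty, tvx, tvy = 40000.0, 10000.0, -3000.0, 0.0
    mx, my, mvx, mvy = 0.0, 0.0, 2000.0, 500.0
    r = math.hypot(tx - mx, ty - my)
    for _ in range(int(t_max / dt)):
        rx, ry = tx - mx, ty - my
        r = math.hypot(rx, ry)
        if r < 10.0:
            return True, r  # treated as a hit in this coarse simulation
        vx, vy = tvx - mvx, tvy - mvy            # relative velocity
        los_rate = (rx * vy - ry * vx) / (r * r)  # line-of-sight rate (rad/s)
        vc = -(rx * vx + ry * vy) / r             # closing speed (m/s)
        a_cmd = N * vc * los_rate
        # apply commanded acceleration perpendicular to interceptor velocity
        vm = math.hypot(mvx, mvy)
        ax, ay = -a_cmd * mvy / vm, a_cmd * mvx / vm
        mvx += ax * dt; mvy += ay * dt
        mx += mvx * dt; my += mvy * dt
        tx += tvx * dt; ty += tvy * dt
    return False, r

hit, miss = simulate_pn()
print(hit)
```

Against a non-maneuvering target on a feasible collision course, proportional navigation with N of around 3 to 5 drives the line-of-sight rate toward zero and closes to a direct collision; the optimal guidance laws the thesis develops are aimed at the cases this simple law handles poorly, such as maneuvering or accelerating targets.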