946 results for Natural disaster warning systems.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Natural disasters stand as one of the greatest challenges facing urban populations. Cities are built and shaped by economic and political pressures, without respect for environmental characteristics. Data from the National Civil Defence show a large number of disasters occurring in Brazilian cities between 2009 and 2011, with more than 5,000 occurrences of natural disasters reported over those years. The failures of Brazilian public policy show up in urban planning, which admits the settlement of people in inappropriate areas. Another issue to be considered is the population's non-response to civil defense warnings: people often prefer to risk staying in high-risk areas for fear of being robbed while they are away, and end up ignoring the notices issued by the Civil Defense, thus increasing the number of victims when a weather event actually triggers a natural disaster.
Abstract:
This research aims to analyze and understand the teaching of geography from the perspective of historical-critical pedagogy, taking as its object of analysis the Center for Integrated Natural Disaster Alerts (CIADEN) for students in Cycle I of Elementary School, located at Etec “Astor de Mattos Carvalho” in Cabrália Paulista-SP. It deepens knowledge of geography teaching through a critical discussion of severe weather and of the use of new technologies to transform the social order and its relationship with the environment. The goal is for students and the community to gain a new proposal for action based on the content learned, capable of addressing present and future problems related to weather, climate, and natural disasters, and to implement a culture of prevention and risk perception by diffusing socially useful knowledge.
Abstract:
Graduate Program in Mechanical Engineering - FEIS
Abstract:
The inadequate use and exploitation of natural resources is restricting the occurrence of aroeira (Myracrodruon urundeuva F.F. & M.F. Allemão), which is now on the FAO list of endangered species. This exploitation causes a decrease in the genetic base of M. urundeuva populations, which makes it difficult to find genotypes with stability and adaptability to different growing conditions. This study aimed at estimating the genetic variation, productivity, stability, and adaptability of progenies of a natural M. urundeuva population from the Ecological Station of Paulo de Faria-SP under different planting systems. DBH (diameter at breast height) was evaluated in four progeny tests of M. urundeuva: i) planted with Anadenanthera falcata and Guazuma ulmifolia (TP-AMA); ii) single (TP-ASO); iii) planted with annual crops (TP-SAF); and iv) planted with Corymbia citriodora (TP-EUCA), installed in Selvíria-MS. The experimental design consisted of complete randomized blocks with three replications and a variable number of plants per plot in each of the four planting systems. From the joint analysis of the planting systems studied, it was found that: i) there were variations among planting systems, particularly in TP-SAF; ii) only in TP-EUCA was it possible to detect variations among the progenies; iii) the effects of the genotype x environment interaction were not significant. Accordingly, the harmonic mean of genotypic values (MHVG), the relative performance of genotypic values against the mean of each site (PRVG), and the harmonic mean of the relative performance of genotypic values (MHPRVG) for DBH identified, respectively: progenies with greater stability, greater adaptability, and simultaneous stability and adaptability across the different planting systems. The use of these selection criteria provided a more refined selection of the best progenies of M. urundeuva under the different planting systems studied.
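The three selection criteria named in this abstract can be illustrated numerically. The sketch below uses made-up genotypic values for three hypothetical progenies across three planting systems; the real study estimated genotypic values with mixed models, which this sketch does not reproduce.

```python
from statistics import harmonic_mean

# Hypothetical genotypic values (e.g. DBH) for three progenies in
# three planting systems; purely illustrative numbers.
values = {
    "P1": [10.0, 12.0, 11.0],   # stable, average performance
    "P2": [9.0, 15.0, 8.0],     # high in one system, unstable
    "P3": [11.0, 11.5, 11.2],   # stable and above average
}
n_env = 3
env_means = [sum(v[e] for v in values.values()) / len(values) for e in range(n_env)]

def mhvg(vals):
    """Harmonic mean of genotypic values: penalizes instability."""
    return harmonic_mean(vals)

def prvg(vals):
    """Mean performance relative to each site mean: rewards adaptability."""
    return sum(v / m for v, m in zip(vals, env_means)) / n_env

def mhprvg(vals):
    """Harmonic mean of relative performances: stability and adaptability jointly."""
    return harmonic_mean([v / m for v, m in zip(vals, env_means)])

ranking = sorted(values, key=lambda p: mhprvg(values[p]), reverse=True)
print(ranking)
```

The harmonic mean penalizes low values more heavily than the arithmetic mean, which is why these criteria favor progenies that never perform poorly in any planting system.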
Abstract:
Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. 
DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.
Abstract:
Abstract Background This article aims to discuss the incorporation of traditional time in the construction of a management scenario for pink shrimp in the Patos Lagoon estuary (RS), Brazil. To meet this objective, two procedures were adopted: one at a conceptual level and another at a methodological level. At the conceptual level, the concept of traditional time as a form of traditional ecological knowledge (TEK) was adopted. Method At the methodological level, we conducted a wide literature review of the scientific knowledge (SK) that guides recommendations for pink shrimp management by restricting the fishing season in the Patos Lagoon estuary; in addition, we reviewed the ethno-scientific literature which describes traditional calendars as a management base for artisanal fishers in the Patos Lagoon estuary. Results Results demonstrate that TEK and SK describe similar estuarine biological processes, but are incommensurable at the resource-management level. On the other hand, the construction of a “management scenario” for pink shrimp is possible through the development of “criteria for hierarchies of validity” which arise from a productive dialog between SK and TEK. Conclusions The commensurable and incommensurable levels reveal different bases of time-space perception between traditional ecological knowledge and scientific knowledge. Despite incommensurability at the management level, it is possible to establish guidelines for the construction of “management scenarios” and to support a co-management process.
Abstract:
Subduction zones are the preferred settings for tsunamigenic earthquakes, where friction between oceanic and continental plates produces strong seismicity. The topics and methodologies discussed in this thesis are focused on understanding the rupture process of the seismic sources of great earthquakes that generate tsunamis. Tsunamigenesis is controlled by several kinematic characteristics of the parent earthquake, such as the focal mechanism, the depth of the rupture, and the slip distribution along the fault area, as well as by the mechanical properties of the source zone. Each of these factors plays a fundamental role in tsunami generation. Therefore, inferring the source parameters of tsunamigenic earthquakes is crucial to understanding the generation of the consequent tsunami and thus to mitigating the risk along the coasts. The typical way to gather information about the source process is to invert the available geophysical data. Tsunami data, moreover, are useful for constraining the portion of the fault area that extends offshore, generally close to the trench, which other kinds of data are not able to constrain. In this thesis I discuss the rupture process of some recent tsunamigenic events, as inferred by means of an inverse method. I first present the 2003 Tokachi-Oki (Japan) earthquake (Mw 8.1). In this study the slip distribution on the fault has been inferred by inverting tsunami waveform, GPS, and bottom-pressure data. The joint inversion of tsunami and geodetic data constrains the slip distribution on the fault much better than separate inversions of the single datasets. We then studied the earthquake that occurred in 2007 in southern Sumatra (Mw 8.4). By inverting several tsunami waveforms, both in the near and in the far field, we determined the slip distribution and the mean rupture velocity along the causative fault.
Since the largest patch of slip was concentrated on the deepest part of the fault, this is the likely reason for the small tsunami waves that followed the earthquake, underlining the crucial role that rupture depth plays in controlling tsunamigenesis. Finally, we present a new rupture model for the great 2004 Sumatra earthquake (Mw 9.2). We performed a joint inversion of tsunami waveform, GPS, and satellite altimetry data to infer the slip distribution, the slip direction, and the rupture velocity on the fault. Furthermore, in this work we present a novel method to estimate, in a self-consistent way, the average rigidity of the source zone. Estimating the source-zone rigidity is important since it may play a significant role in tsunami generation; particularly for slow earthquakes, a low rigidity value is sometimes necessary to explain how an earthquake with a relatively low seismic moment may generate significant tsunamis. This latter point may be relevant for explaining the mechanics of tsunami earthquakes, one of the open issues in present-day seismology. The investigation of these tsunamigenic earthquakes has underlined the importance of using a joint inversion of different geophysical data to determine the rupture characteristics. The results shown here have important implications for the implementation of new tsunami warning systems, particularly in the near field, for the improvement of the current ones, and for the planning of inundation maps for tsunami-hazard assessment along coastal areas.
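The benefit of jointly inverting heterogeneous datasets can be sketched as a weighted least-squares problem. Everything below is synthetic (random Green's functions, invented slip values and noise levels); it only illustrates stacking tsunami and geodetic data into one linear system, not the actual inversion method of the thesis.

```python
import numpy as np

# Joint inversion sketch: recover slip on 3 fault patches from two
# synthetic datasets with different noise levels.
rng = np.random.default_rng(0)
true_slip = np.array([2.0, 5.0, 1.0])

# Green's functions mapping slip to observations (assumed/synthetic).
G_tsunami = rng.normal(size=(8, 3))
G_gps = rng.normal(size=(6, 3))

d_tsunami = G_tsunami @ true_slip + rng.normal(scale=0.1, size=8)
d_gps = G_gps @ true_slip + rng.normal(scale=0.05, size=6)

# Weight each dataset by its assumed noise level, then stack: the
# joint system constrains slip better than either dataset alone.
w_t, w_g = 1 / 0.1, 1 / 0.05
G = np.vstack([w_t * G_tsunami, w_g * G_gps])
d = np.concatenate([w_t * d_tsunami, w_g * d_gps])

slip, *_ = np.linalg.lstsq(G, d, rcond=None)
print(slip)
```

Real inversions add positivity and smoothing constraints on slip and propagate the tsunami through a numerical wave model rather than a linear operator; the weighting-and-stacking step shown here is the common core of any joint inversion.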
Abstract:
This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously while fulfilling the tasks for which they were deployed in the first place. Many of these tasks require a deep analysis of language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures with the help of linguistic information encoded in a grammar G and a lexicon L: G + L + C → S (1) The idea that underlies intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon: G + L + S → L' (2) Moreover, the thesis claims that a system can only be considered intelligent if it not only makes maximum use of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. One of the central elements of this work is therefore the formulation of a set of criteria for intelligent lexical acquisition systems, subsumed under one paradigm: the Learn-Alpha design rule. The thesis describes the design and quality of a prototype for such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora.
To illustrate four major challenges of constructing such a system: a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity management system; b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment; c) the reliability of these entries depends on the system's decision on whether it has seen 'enough' input; and d) general properties of language might render some lexical features indeterminable if the system tries to acquire them with too high a precision. The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon. This work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars, and learning by unification. Then the postulation of the Learn-Alpha design rule is presented. The second chapter outlines the theory that underlies Learn-Alpha and exposes all the related notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha. The fourth chapter presents the design and results of a bootstrapping experiment conducted on this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, and selection of prepositions and sentential complements, among others. The thesis concludes with a review of the findings and motivation for further improvements, as well as proposals for future research on the automatic induction of lexical features.
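The two schematic formulas in this abstract can be mocked up in a few lines. The grammar, valency frames, and evidence threshold below are toy stand-ins for the HPSG machinery and the Learn-Alpha criteria, chosen only to show the analyze/learn/revise cycle.

```python
from collections import Counter

# Minimal sketch of the schematic loop G + L + C -> S -> L'.
# Frames and the evidence threshold are illustrative assumptions,
# not the thesis's HPSG-based machinery.

lexicon = {"give": set()}            # L: verb -> set of acquired valency frames
corpus = [                            # C, paired with analyzed structures S
    ("give", ("NP", "NP", "NP")),     # ditransitive use
    ("give", ("NP", "NP", "PP")),     # prepositional use, seen only once
    ("give", ("NP", "NP", "NP")),
]

MIN_EVIDENCE = 2  # 'enough' input before an entry is trusted (challenge c)

# ANALYZE: collect lexical facts from the structures S
evidence = Counter()
for verb, frame in corpus:
    evidence[(verb, frame)] += 1

# LEARN/REDUCE: S -> L'; entries below threshold are also revised away
for (verb, frame), n in evidence.items():
    if n >= MIN_EVIDENCE:
        lexicon[verb].add(frame)
    else:
        lexicon[verb].discard(frame)

print(lexicon)
```

The `discard` branch is the revision step the thesis demands of an intelligent system: a frame acquired on thin evidence is withdrawn rather than kept.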
Abstract:
So-called cascading events, which lead to high-impact, low-frequency scenarios, are raising concern worldwide. A chain of events results in a major industrial accident with dreadful (and often unpredicted) consequences. Cascading events can be the result of the realization of an external threat, such as a terrorist attack or a natural disaster, or of a “domino effect”. During domino events the escalation of a primary accident is driven by the propagation of the primary event to nearby units, causing an overall increase in accident severity and in the risk associated with an industrial installation. Natural disasters such as intense flooding, hurricanes, earthquakes, and lightning have also been found capable of increasing the risk of an industrial area, triggering losses of containment of hazardous materials and major accidents. The scientific community usually refers to these accidents as “NaTechs”: natural events triggering industrial accidents. In this document, a state of the art of available approaches to the modelling, assessment, prevention, and management of domino and NaTech events is described. However, the relevant work carried out in past studies still needs to be consolidated and completed in order to be applicable in a real industrial framework. New methodologies, developed during my research activity and aimed at the quantitative assessment of domino and NaTech accidents, are presented. The tools and methods provided in this study aim to assist progress toward a consolidated and universal methodology for the assessment and prevention of cascading events, contributing to enhanced safety and sustainability of the chemical and process industry.
Abstract:
This Strategy and Action Plan was written within the framework of the project on Sustainable Land Management in the High Pamir and Pamir-Alai Mountains (PALM). PALM is an integrated transboundary initiative of the governments of the Kyrgyz Republic and the Republic of Tajikistan. It aims to address the interlinked problems of land degradation and poverty within a region that is one of Central Asia’s crucial sources of freshwater and a location of biodiversity hotspots. The project is executed by the Committee on Environment Protection in Tajikistan and the National Center for Mountain Regions Development in Kyrgyzstan, with financial support from the Global Environment Facility (GEF) and other donors. The United Nations Environment Programme (UNEP) is the GEF Implementing Agency for the project, and the United Nations University (UNU) is the International Executing Agency. This Strategy and Action Plan integrates the work of three main teams of experts, namely the Pamir-Alai Transboundary Strategy and Action Plan (PATSAP) team, the Legal Task Forces, and a team of Natural Disaster Risk specialists. The PATSAP team was coordinated by the Centre for Development and Environment (CDE), University of Bern, Switzerland. The Legal Task Force was led by the Australian Centre for Agriculture and Law of the University of New England (UNE), and responsibility for the Natural Disaster Risk assessment was with the Central-Asian Institute of Applied Geosciences (CAIAG) in Bishkek, Kyrgyzstan. The development of the strategy took place from June 2009 to October 2010. The activities included field study tours for updating the information base with first-hand information from the local level, coordination meetings with actors from the region, and two multi-level stakeholder forums conducted in Khorog and Osh to identify priorities and to collect ideas for concrete action plans.
The baseline information collected for the Strategy and Action Plan has been compiled by the experts and made available as reports. A joint multi-level stakeholder forum was conducted in Jirgitol, Tajikistan, for in-depth discussion of the transboundary aspects. In August 2010, the draft Strategy and Action Plan was distributed among local, national, and international actors for consultation, and their comments were discussed at feedback forums in Khorog and Bishkek. This Strategy and Action Plan is intended as a recommendation. Nevertheless, it proposes concrete mechanisms for implementing the proposed sustainable land management (SLM) activities: The Regional Natural Resources Governance Framework provides the legal and policy concepts, principles, and regulatory requirements needed to create an enabling environment for SLM in the High Pamir and Pamir-Alai region at the transboundary, national, and local levels. The priority directions outlined provide a framework for the elaboration of rayon-level strategies and for strategies on specific topics (forestry, livestock, etc.), as well as for further development of government programmes and international projects. The action plans may serve as a pool of concrete ideas, which can be taken up by different institutions and in smaller or larger projects. Finally, this document provides a basis for the elaboration and signing of targeted cooperation agreements on land use and management between the leaders of Osh oblast (Kyrgyz Republic), Gorno Badakhshan Autonomous Oblast, and Jirgitol rayon (Republic of Tajikistan).
Abstract:
Plant‐mediated interactions between herbivores are important determinants of community structure and plant performance in natural and agricultural systems. Current research suggests that the outcome of the interactions is determined by herbivore and plant identity, which may result in stochastic patterns that impede adaptive evolution and agricultural exploitation. However, few studies have systematically investigated specificity versus general patterns in a given plant system by varying the identity of all involved players. We investigated the influence of herbivore identity and plant genotype on the interaction between leaf‐chewing and root‐feeding herbivores in maize using a partial factorial design. We assessed the influence of leaf induction by oral secretions of six different chewing herbivores on the response of nine different maize genotypes and three different root feeders. Contrary to our expectations, we found a highly conserved pattern across all three dimensions of specificity: The majority of leaf herbivores elicited a negative behavioral response from the different root feeders in the large majority of tested plant genotypes. No facilitation was observed in any of the treatment combinations. However, the oral secretions of one leaf feeder and the responses of two maize genotypes did not elicit a response from a root‐feeding herbivore. Together, these results suggest that plant‐mediated interactions in the investigated system follow a general pattern, but that a degree of specificity is nevertheless present. Our study shows that within a given plant species, plant‐mediated interactions between herbivores of the same feeding guild can be stable. This stability opens up the possibility of adaptations by associated organisms and suggests that plant‐mediated interactions may contribute more strongly to evolutionary dynamics in terrestrial (agro)ecosystems than previously assumed.
Abstract:
This study examined changes in call volume and call type, and the resulting effect of TeleHealth Nurse, the Houston Fire Department's (HFD) telephone nurse line, on call burden during Hurricane Ike. On September 13, 2008, Hurricane Ike made landfall in the Galveston area and continued north through Houston, causing catastrophic damage to infrastructure and posing a public health threat. The overall goal of this study was to use data from the Houston Fire Department to better understand the needs of citizens before, during, and after a hurricane. The study examined four aspects of HFD's emergency response. The first section compared call volumes around the time of Hurricane Ike in 2008 with the same time period in 2007. The data showed a 12% increase in calls surrounding Hurricane Ike compared to the previous year, with a p value <.001. Next, the study evaluated the types of calls prevalent during Hurricane Ike compared to the same time period in 2007. The data showed a statistically significant increase in chronic health problems such as diabetes and cardiac events, in obstetric calls, and in breathing problems, falls, and lacerations during the days following Hurricane Ike. There was also a statistically significant increase in auto med alerts and check patients surrounding Hurricane Ike's landfall. The third section analyzed the change in call volume sent to HFD's Telephone Nurse Line during Hurricane Ike compared to earlier time periods, while the fourth and final section examined the types of calls sent to the nurse line during Hurricane Ike. The data showed limited use of the TeleHealth Nurse line before Hurricane Ike, but when the winds were at their strongest and ambulances were unable to leave the station, the nurse line was the only functioning medical help some people were able to receive.
These studies provide a better understanding of the number and types of calls that a city might experience during a natural disaster, such as a hurricane, and demonstrate the usefulness of an EMS telephone nurse line during such an event.
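As a rough illustration of the kind of comparison the study reports, the sketch below tests an increase in call counts between two periods with a two-sided Poisson rate test. The counts are hypothetical, chosen only to reproduce the reported ~12% increase; they are not the study's data.

```python
import math

# Hypothetical call counts for comparable periods in 2007 and 2008
# (illustrative assumption, not the study's actual figures).
calls_2007 = 25_000
calls_2008 = 28_000

increase = (calls_2008 - calls_2007) / calls_2007

# Treat each period's total as a Poisson count; under the null of
# equal rates, (n1 - n2) / sqrt(n1 + n2) is approximately N(0, 1).
z = (calls_2008 - calls_2007) / math.sqrt(calls_2008 + calls_2007)
p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))  # two-sided

print(f"increase = {increase:.0%}, z = {z:.1f}, significant at .001: {p < 0.001}")
```

With counts this large even a modest relative increase yields a very large z statistic, consistent with the reported p < .001.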
Abstract:
Decreasing accidents in highway and urban environments is the main motivation for research on and development of driving assistance systems, also called ADAS (Advanced Driver Assistance Systems). In recent years, many of these systems have appeared in commercial vehicles: ABS, Cruise Control (CC), parking assistance, and warning systems (including GPS), among others. However, the implementation of driving assistance systems acting on the steering wheel is more limited because of their complexity and sensitivity. This paper focuses on the development, testing, and implementation of a driver assistance system for controlling the steering wheel in curves. The system is divided into two levels: an inner control loop that executes the position and speed targets, softening the action on the steering wheel, and an outer control loop (a fuzzy-logic controller) that sends the reference to the inner loop according to the environment and vehicle conditions. Tests have been carried out on different curves and at different speeds. The system has been proved in a commercial vehicle with satisfactory results.
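The two-level architecture described in this abstract can be sketched as an outer rule-based loop feeding a reference to an inner tracking loop. The rules and gains below are illustrative assumptions standing in for the paper's tuned fuzzy controller, not a reproduction of it.

```python
# Two-level steering sketch: an outer rule-based loop picks a steering
# reference from curve radius and speed; an inner PD loop tracks it,
# softening the action on the wheel. All constants are assumptions.

def outer_reference(curve_radius_m: float, speed_kmh: float) -> float:
    """Map environment and vehicle state to a steering reference (degrees)."""
    base = 600.0 / curve_radius_m      # tighter curve -> larger angle
    if speed_kmh > 80.0:               # rule: soften the action at high speed
        base *= 0.7
    return base

def inner_step(angle, rate, reference, dt):
    """Critically damped PD position loop on the steering-wheel angle."""
    kp, kd = 1.0, 2.0                  # kd = 2*sqrt(kp): no overshoot
    accel = kp * (reference - angle) - kd * rate
    rate += accel * dt                 # semi-implicit Euler integration
    return angle + rate * dt, rate

ref = outer_reference(curve_radius_m=120.0, speed_kmh=60.0)
angle, rate = 0.0, 0.0
for _ in range(400):                   # simulate 4 s at 10 ms steps
    angle, rate = inner_step(angle, rate, ref, dt=0.01)

print(round(ref, 2), round(angle, 2))
```

Separating the loops this way lets the outer rules be retuned for different environments without touching the inner loop that guarantees smooth wheel motion.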
Abstract:
The time evolution of an ensemble of dynamical systems coupled through an irregular interaction scheme gives rise to dynamics of great complexity and emergent phenomena that cannot be predicted from the properties of the individual systems.
The main objective of this thesis is precisely to increase our understanding of the interplay between the interaction topology and the collective dynamics that a complex network can support. This is a very broad subject, so in this thesis we will limit ourselves to the study of three relevant problems that have strong connections among them. First, it is a well-known fact that in many natural and manmade systems that can be represented as complex networks the topology is not static; rather, it depends on the dynamics taking place on the network (as it happens, for instance, in the neuronal networks in the brain). In these adaptive networks the topology itself emerges from the self-organization in the system. To better understand how the properties that are commonly observed in real networks spontaneously emerge, we have studied the behavior of systems that evolve according to local adaptive rules that are empirically motivated. Our numerical and analytical results show that self-organization brings about two of the most universally found properties in complex networks: at the mesoscopic scale, the appearance of a community structure, and, at the macroscopic scale, the existence of a power law in the weight distribution of the network interactions. The fact that these properties show up in two models with quantitatively different mechanisms that follow the same general adaptive principles suggests that our results may be generalized to other systems as well, and they may be behind the origin of these properties in some real systems. We also propose a new measure that provides a ranking of the elements in a network in terms of their relevance for the maintenance of collective dynamics. Specifically, we study the vulnerability of the elements under perturbations or large fluctuations, interpreted as a measure of the impact these external events have on the disruption of collective motion. 
Our results suggest that the dynamic vulnerability measure depends largely on local properties (our conclusions thus being valid for different topologies) and they show a non-trivial dependence of the vulnerability on the connectivity of the network elements. Finally, we propose a strategy for the imposition of generic goal dynamics on a given network, and we explore its performance in networks with different topologies that support turbulent dynamical regimes. It turns out that heterogeneous networks (and most real networks that have been studied belong in this category) are the most suitable for our strategy for the targeting of desired dynamics, the strategy being very effective even when the knowledge on the network topology is far from accurate. Aside from their theoretical relevance for the understanding of collective phenomena in complex systems, the methods and results here discussed might lead to applications in experimental and technological systems, such as in vitro neuronal systems, the central nervous system (where pathological synchronous activity sometimes occurs), communication systems or power grids.