921 results for Disaster communications


Relevance: 30.00%

Abstract:

When hurricanes strike the built and natural environment, public authorities sometimes have no choice but to order the mandatory evacuation of the population located in at-risk areas. Because the course of a disaster and human behaviour are unpredictable, evacuation operations face significant uncertainty. Past experience has shown that information and communications technologies (ICT) have the potential to advance the state of the art in evacuation management. Despite this recognition, empirical research on the subject remains limited to date. This case study of New York City explores how integrating ICT into the operational planning of organizations with transportation responsibilities can improve their response to events and influence the overall success of the disaster management system. The analysis is based on information gathered through semi-structured interviews with New York City transportation and disaster management organizations, as well as with academic experts. The results highlight the potential of ICT for internal decision-making. Although ICT are widely recognized as effective means of exchanging information within and between organizations, these uses face certain technological, organizational, structural and systemic constraints. This observation made it possible to identify the constraints experienced in the usual practices of urban systems management.

Relevance: 30.00%

Abstract:

Abstract 1: Social networks such as Twitter are often used for disseminating and collecting information during natural disasters, and their potential for use in disaster management has been acknowledged. However, a more nuanced understanding of the communications that take place on social networks is required to integrate this information more effectively into disaster management processes. The type and value of the information shared should be assessed to determine the benefits and issues, with credibility and reliability as known concerns. Mapping tweets onto the modelled stages of a disaster can be a useful evaluation for determining the benefits and drawbacks of using data from social networks, such as Twitter, in disaster management. A thematic analysis of the content, language and tone of tweets during the UK storms and floods of 2013/14 was conducted. Manual scripting was used to determine the official sequence of events and to classify the stages of the disaster into the phases of the Disaster Management Lifecycle, producing a timeline. Twenty-five topics discussed on Twitter emerged, and three key types of tweets, based on language and tone, were identified. The timeline represents the events of the disaster, according to the Met Office reports, classed into B. Faulkner's Disaster Management Lifecycle framework. Context is provided when the analysed tweets are observed against the timeline, illustrating a potential basis and benefit for mapping tweets into the Disaster Management Lifecycle phases. Comparing the number of tweets submitted in each month with the timeline suggests that users tweet more as an event heightens and persists; furthermore, users generally express greater emotion and urgency in their tweets. This paper concludes that the thematic analysis of content on social networks, such as Twitter, can be useful in gaining additional perspectives for disaster management. It demonstrates that mapping tweets into the phases of a Disaster Management Lifecycle model can have benefits in the recovery phase, not just the response phase, to potentially improve future policies and activities.

Abstract 2: The current execution of privacy policies, as a mode of communicating information to users, is unsatisfactory. Social networking sites (SNS) exemplify this issue, attracting growing concerns regarding their use of personal data and its effect on user privacy. This demonstrates the need for more informative policies. However, SNS lack the incentives required to improve policies, which is exacerbated by the difficulties of creating a policy that is both concise and compliant. Standardization addresses many of these issues, providing benefits for users and SNS, although it is only possible if policies share attributes which can be standardized. This investigation used thematic analysis and cross-document structure theory to assess the similarity of attributes between the privacy policies (as available in August 2014) of the six most frequently visited SNS globally. Using the Jaccard similarity coefficient, two types of attribute were measured: the clauses used by SNS and the coverage of forty recommendations made by the UK Information Commissioner's Office. Analysis showed that whilst similarity in the clauses used was low, similarity in the recommendations covered was high, indicating that SNS use different clauses but convey similar information. The analysis also showed that low similarity in the clauses was largely due to differences in semantics, elaboration and functionality between SNS. This paper therefore proposes that the policies of SNS already share attributes, indicating the feasibility of standardization, and five recommendations are made to begin facilitating this, based on the findings of the investigation.
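The Jaccard similarity coefficient mentioned in Abstract 2 can be sketched as follows. This is a minimal illustration only: the attribute names below are invented and do not reproduce the paper's actual coding of policy clauses or ICO recommendations.

```python
# Hypothetical sketch of measuring attribute overlap between two
# privacy policies with the Jaccard similarity coefficient.

def jaccard(a, b):
    """Jaccard coefficient |A ∩ B| / |A ∪ B| between two attribute sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # two empty sets are conventionally treated as identical
    return len(a & b) / len(a | b)

# Invented attribute sets for two social networking sites
policy_a = {"data_collection", "cookies", "third_party_sharing", "retention"}
policy_b = {"data_collection", "cookies", "advertising", "retention"}

print(jaccard(policy_a, policy_b))  # 3 shared of 5 total -> 0.6
```

A coefficient near 1 would indicate that two policies cover almost the same attributes; the paper's finding of high similarity in recommendations covered, despite low similarity in clauses, corresponds to high Jaccard scores on one attribute type but not the other.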

Relevance: 30.00%

Abstract:

Includes bibliography.

Relevance: 30.00%

Abstract:

This occasional paper examines the experiences of three leading global centres of the ICT industry – India, Silicon Valley, and Estonia – to reflect on how the lessons of these models can be applied to the context of countries in the Caribbean region. Several sectors of the technology industry are considered in relation to their suitability for establishment in the Caribbean. Animation is an area that is showing encouraging signs of development in several countries, and which offers some promise as a significant source of employment in the region. However, the global market for animation production is likely to become increasingly competitive, as improved technology has reduced barriers to entry into the industry not only in the Caribbean, but around the world. The region's animation industry will need to move swiftly up the value chain if it is to avoid the downsides of being caught in an increasingly commoditized market. Mobile applications development has also been widely heralded as an industry for the Caribbean. However, the market for consumer-oriented smartphone applications has matured very quickly, and is now a very difficult sector in which to compete. Caribbean mobile developers would be better served by focusing on creating applications to suit the needs of regional industries and governments, rather than attempting to gain notice in over-saturated consumer marketplaces such as the iTunes App Store and Google Play. Another sector considered for the Caribbean is "big data" analysis. This area holds significant potential for growth in coming years, but the Caribbean, which is generally considered to be a data-poor region, currently lacks a sufficient base of local customers to form a competitive foundation for such an industry.
While a Caribbean big data industry could plausibly be oriented toward outsourcing, that orientation would limit positive externalities from the sector, and benefits from its establishment would largely accrue only to a relatively small number of direct participants in the industry. Instead, development in the big data sector should be twinned with the development of products to build a regional customer base for the industry. The region has pressing needs in areas such as disaster risk reduction, water resource management, and support for agricultural production. Development of big data solutions – and other technology products – to address areas such as these could help to establish niche industries that both support the needs of local populations and provide viable opportunities for the export of higher-value products and services to regions of the world with similar needs.

Relevance: 30.00%

Abstract:

This document was adapted from a paper originally presented to the 8th Annual Caribbean Conference of Comprehensive Disaster Management, held in Montego Bay, Jamaica, in December 2013. It summarizes several activities that ECLAC has undertaken to assess the current state of information and communications technology (ICT) in the field of disaster risk management (DRM) as practiced in the Caribbean. These activities included an in-depth study that encompassed a survey of disaster management organizations in the region, an Expert Group Meeting attended by the heads of several national disaster offices, and a training workshop for professionals working in DRM in the Caribbean. One of the notable conclusions of ECLAC's investigation on this topic is that the lack of human capacity is the single largest constraint faced in the implementation of ICT projects for DRM in the Caribbean. In considering strategies to address the challenge of limited human capacity at a regional level, two separate issues are recognized – the need to increase the ICT capabilities of disaster management professionals, and the need to make ICT specialists available to disaster management organizations to advise and assist in the implementation of technology-focused projects. To that end, two models are proposed to engage with this issue at a regional level. The first entails the establishment of a network of ICT trainers in the Caribbean to help DRM staff develop a strategic understanding of how technology can be used to further their organizational goals. The second is the development of "Centres of Excellence" for ICT in the Caribbean, which would enable the deployment of specialized ICT expertise to national disaster management offices on a project-by-project basis.

Relevance: 30.00%

Abstract:

Hurricanes, earthquakes, floods, and other serious natural hazards have been credited with causing changes in regional economic growth, income, employment, and wealth. Natural disasters are said to cause: (1) an acceleration of existing economic trends; (2) an expansion of employment and income due to recovery operations (the so-called silver lining); and (3) an alteration in the structure of regional economic activity due to changes in "intra-" and "inter-" regional trading patterns and technological change. Theoretical and stylized disaster simulations (Cochrane 1975; Haas, Cochrane, and Kates 1977; Petak et al. 1982; Ellson et al. 1983, 1984; Boisvert 1992; Brookshire and McKee 1992) point towards a wide scope of possible negative and long-lasting impacts upon economic activity and structure. This work examines the consequences of Hurricane Andrew on Dade County's economy. Following the work of Ellson et al. (1984), Guimaraes et al. (1993), and West and Lenze (1993; 1994), a regional econometric forecasting model (DCEFM) using a framework of "with" and "without" the hurricane is constructed and used to assess Hurricane Andrew's impact on the structure and level of economic activity in Dade County, Florida. The results of the simulation exercises show that the direct economic impact of Hurricane Andrew on Dade County was of short duration and of isolated sectoral impact, generally limited to the construction, TCP (transportation, communications, and public utilities), and agricultural sectors. Regional growth and changes in income and employment reacted directly to, and within the range and direction set by, national economic activity. The simulations also lead to the conclusion that areal extent, infrastructure, and sector-specific damages or impacts, as opposed to monetary losses, are the primary determinants of a disaster's effects upon employment, income, growth, and economic structure.

Relevance: 20.00%

Abstract:

Attributed to human-mediated dispersal, a species of the Anopheles gambiae complex invaded northeastern Brazil in 1930. This event is considered unique among the intercontinental introductions of disease vectors, and the most serious one: "Few threats to the future health of the Americas have equalled that inherent in the invasion of Brazil, in 1930, by Anopheles gambiae." Because it was only in the 1960s that An. gambiae was recognized as a species complex, now comprising seven species, the precise species identity of the Brazilian invader remains a mystery. Here we used historical DNA analysis of museum specimens collected in Brazil at the time of the invasion, aiming to identify the Brazilian invader. Our results identify the arid-adapted Anopheles arabiensis as the actual invading species. Establishing the identity of the species, in addition to being of intrinsic historical interest, can inform responses to future threats of this sort, especially in a changing environment. Furthermore, these results highlight the potential danger of human-mediated range expansions of insect disease vectors and the importance of museum collections in retrieving historical information.

Relevance: 20.00%

Abstract:

The structural engineering community in Brazil faces new challenges with the recent occurrence of high-intensity tornados. Satellite surveillance data show that the area covering the south-east of Brazil, Uruguay and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. In this paper, reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and a limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower that is the subject of this study has become very common in Brazil due to increasing mobile phone coverage. The study shows that optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations.
Failure consequences are also different for the different parties involved in the design, construction and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings), it is not too costly to over-design; this observation is in agreement with the observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of the design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads like tornados, the optimum reliability is strongly dependent on the selected design life.
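The cost-risk trade-off described in this abstract can be sketched numerically. The exponential failure-probability model and all cost figures below are invented for illustration; the paper's actual limit states, wind-load models and reliability analysis are not reproduced here.

```python
import math

def failure_probability(s):
    # Toy model (assumption): limit state exceedance probability decays
    # exponentially with the additional partial safety factor s.
    return math.exp(-5.0 * s)

def total_expected_cost(s, unit_cost=100.0, failure_cost=1e5):
    # Total expected cost = construction cost (grows with s)
    # + expected failure cost (probability x consequence),
    # taken here over a single failure mode for simplicity.
    return unit_cost * s + failure_probability(s) * failure_cost

# Grid search over candidate safety factors
grid = [i / 100 for i in range(50, 301)]
s_opt = min(grid, key=total_expected_cost)
s_opt_severe = min(grid, key=lambda s: total_expected_cost(s, failure_cost=1e7))
```

Even in this toy setting, the optimum safety factor rises when the consequence cost of failure rises (`s_opt_severe > s_opt`), mirroring the study's conclusion that optimum reliability depends strongly on the cost of failure, and hence on tower location.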

Relevance: 20.00%

Abstract:

We propose a robust and low-complexity scheme to estimate and track carrier frequency from signals traveling under low signal-to-noise ratio (SNR) conditions in highly nonstationary channels. These scenarios arise in planetary exploration missions subject to high dynamics, such as the Mars exploration rover missions. The method comprises a bank of adaptive linear predictors (ALP) supervised by a convex combiner that dynamically aggregates the individual predictors. The adaptive combination is able to outperform the best individual estimator in the set, which leads to a universal scheme for frequency estimation and tracking. A simple technique for bias compensation considerably improves the ALP performance. It is also shown that retrieval of frequency content by a fast Fourier transform (FFT)-search method, instead of only inspecting the angle of a particular root of the error predictor filter, enhances performance, particularly at very low SNR levels. Simple techniques that enforce frequency continuity further improve the overall performance. In summary, we show by extensive simulations that adaptive linear prediction methods yield a robust and competitive frequency tracking technique.
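The convex-combination idea behind the scheme can be sketched with two LMS-type linear predictors of different step sizes, mixed by a sigmoid-parameterized weight that is adapted by gradient descent on the combined prediction error. The step sizes, filter order and noisy-sinusoid test signal below are assumptions for illustration; the paper's actual ALP bank, bias compensation and FFT-search stage are not reproduced.

```python
import math
import random

def lms_predictor(mu, order=4):
    """One adaptive linear predictor: forecasts x[n] from the past `order` samples."""
    w = [0.0] * order
    def step(history, target):
        y = sum(wi * xi for wi, xi in zip(w, history))  # predict with old weights
        e = target - y
        for i in range(order):                          # LMS weight update
            w[i] += mu * e * history[i]
        return y
    return step

random.seed(0)
ORDER = 4
fast, slow = lms_predictor(0.05, ORDER), lms_predictor(0.005, ORDER)
a = 0.0  # combiner state; the mixing weight is lam = sigmoid(a)
x = [math.cos(0.3 * n) + 0.05 * random.gauss(0.0, 1.0) for n in range(2000)]

err_comb = err_zero = 0.0
for n in range(ORDER, len(x)):
    hist, target = x[n - ORDER:n], x[n]
    lam = 1.0 / (1.0 + math.exp(-a))
    y1, y2 = fast(hist, target), slow(hist, target)
    y = lam * y1 + (1.0 - lam) * y2                 # convex combination of the bank
    e = target - y
    a += 0.1 * e * (y1 - y2) * lam * (1.0 - lam)    # gradient step on combiner state
    err_comb += e * e
    err_zero += target * target                     # baseline: always predict zero
```

Because the sigmoid keeps the mixing weight in (0, 1), the combined output is always a convex mixture of the bank's predictions, and the combiner can drift toward whichever predictor is currently tracking better.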

Relevance: 20.00%

Abstract:

This paper investigates the characteristics of the Power Spectral Density (PSD) of chaotic signals generated by skew tent maps. The influence of the Lyapunov exponent on the autocorrelation sequence and on the PSD is evaluated via computational simulations. We conclude that the essential bandwidth of these signals is strongly related to this exponent and they can be low-pass or high-pass depending on the family's parameter. This way, the PSD of a chaotic signal is a function of the generating map although this is not a one-to-one relationship. (C) 2009 Elsevier Ltd. All rights reserved.
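The low-pass/high-pass dependence on the map parameter can be illustrated numerically with a skew tent map on [0, 1] with break point a (an assumed normalization; the paper's exact map family may differ). Because this map preserves the uniform density, the lag-1 autocorrelation of its orbits works out to 2a − 1: negative (sign-alternating, high-pass-like) for a < 0.5 and positive (low-pass-like) for a > 0.5.

```python
def skew_tent(x, a):
    """Skew tent map on [0, 1] with break point a (0 < a < 1)."""
    return x / a if x <= a else (1.0 - x) / (1.0 - a)

def lag1_autocorr(a, n=50_000, x0=0.1234):
    """Estimate the lag-1 autocorrelation of a length-n orbit of the map."""
    s, x = [], x0
    for _ in range(n):
        s.append(x)
        x = skew_tent(x, a)
    mean = sum(s) / n
    var = sum((v - mean) ** 2 for v in s) / n
    cov = sum((s[i] - mean) * (s[i + 1] - mean) for i in range(n - 1)) / (n - 1)
    return cov / var
```

For example, `lag1_autocorr(0.3)` comes out near 2(0.3) − 1 = −0.4, while `lag1_autocorr(0.7)` is near +0.4, consistent with the paper's observation that the PSD character is set by the map parameter (and hence the Lyapunov exponent) without the relationship being one-to-one.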

Relevance: 20.00%

Abstract:

Background. A sample of 1089 Australian adults was selected for the longitudinal component of the Quake Impact Study, a 2-year, four-phase investigation of the psychosocial effects of the 1989 Newcastle earthquake. Of these, 845 (78%) completed a survey 6 months post-disaster as well as one or more of the three follow-up surveys. Methods. The phase I survey was used to construct dimensional indices of self-reported exposure to threat and disruption, and also to classify subjects by their membership of five 'at risk' groups (the injured; the displaced; owners of damaged small businesses; helpers in threat and non-threat situations). Psychological morbidity was assessed at each phase using the 12-item General Health Questionnaire (GHQ-12) and the Impact of Event Scale (IES). Results. Psychological morbidity declined over time but tended to stabilize at about 12 months post-disaster for general morbidity (GHQ-12) and at about 18 months for trauma-related (IES) morbidity. Initial exposure to threat and/or disruption were significant predictors of psychological morbidity throughout the study and had superior predictive power to membership of the targeted 'at risk' groups. The degree of ongoing disruption and other life events since the earthquake were also significant predictors of morbidity. The injured reported the highest levels of distress, but there was a relative absence of morbidity among the helpers. Conclusions. Future disaster research should carefully assess the threat and disruption experiences of survivors at the time of the event and monitor ongoing disruptions in the aftermath in order to target interventions more effectively.

Relevance: 20.00%

Abstract:

From a general model of fiber optics, we investigate the physical limits of soliton-based terabaud communication systems. In particular we consider Raman and initial quantum noise effects which are often neglected in fiber communications. Simulations of the position diffusion in dark and bright solitons show that these effects become increasingly important at short pulse durations, even over kilometer-scale distances. We also obtain an approximate analytic theory in agreement with numerical simulations, which shows that the Raman effects exceed the Gordon-Haus jitter for sub-picosecond pulses. (C) 1997 Elsevier Science B.V.