836 results for Catastrophic events
Abstract:
Collisions between pedestrians and vehicles continue to be a major problem throughout the world. Pedestrians trying to cross roads and railway tracks without caution are highly susceptible to collisions with vehicles and trains. Continuing financial, human and other losses have prompted transport-related organizations to come up with various solutions addressing this issue. However, the quest for new and significant improvements in this area is still ongoing. This work addresses this issue by building a general framework using computer vision techniques to automatically monitor pedestrian movements in such high-risk areas, enabling better analysis of activity and the creation of future alerting strategies. As a result of rapid development in the electronics and semiconductor industries, CCTV cameras are extensively deployed in public places to capture video footage. This footage can then be used to analyse crowd activities in those places. This work seeks to identify abnormal behaviour of individuals in video footage. We propose using a Semi-2D Hidden Markov Model (HMM), a Full-2D HMM and a Spatial HMM to model the normal activities of people. The outliers of the model (i.e. those observations with insufficient likelihood) are identified as abnormal activities. Location features, flow features and optical flow textures are used as features for the model. The proposed approaches are evaluated using the publicly available UCSD datasets, and we demonstrate improved performance using the Semi-2D HMM compared with other state-of-the-art methods. Further, we illustrate how the proposed methods can be applied to detect anomalous events at rail level crossings.
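The outlier-detection idea in this abstract (flagging observation sequences whose model likelihood falls below a threshold) can be sketched with a toy discrete HMM. This is an illustrative sketch only, not the paper's Semi-2D HMM: the two-state model, its parameters and the threshold below are all invented for the example.

```python
import math

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the standard forward algorithm."""
    n_states = len(start_p)
    # initialise with the first observation
    alpha = [start_p[s] * emit_p[s][obs[0]] for s in range(n_states)]
    for o in obs[1:]:
        alpha = [
            sum(alpha[prev] * trans_p[prev][s] for prev in range(n_states))
            * emit_p[s][o]
            for s in range(n_states)
        ]
    return math.log(sum(alpha))

# Hypothetical 2-state model over a 2-symbol alphabet: each state strongly
# prefers one symbol and tends to persist, so rapid switching is unlikely.
start_p = [0.6, 0.4]
trans_p = [[0.9, 0.1], [0.1, 0.9]]
emit_p  = [[0.8, 0.2], [0.2, 0.8]]

normal      = [0, 0, 0, 1, 0, 0]   # mostly persistent behaviour
alternating = [0, 1, 0, 1, 0, 1]   # rapid switching, poorly explained

threshold = -4.5  # in practice tuned on held-out normal data
for name, seq in (("normal", normal), ("alternating", alternating)):
    ll = forward_log_likelihood(seq, start_p, trans_p, emit_p)
    print(name, round(ll, 2), "abnormal" if ll < threshold else "normal")
```

With these parameters the persistent sequence scores above the threshold and the rapidly alternating one below it, which is the decision rule the paper applies (with far richer models and features) to video-derived observations.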
Abstract:
The current approach for protecting the receiving water environment from urban stormwater pollution is the adoption of structural measures commonly referred to as Water Sensitive Urban Design (WSUD). The treatment efficiency of WSUD measures closely depends on the design of the specific treatment units. As stormwater quality can be influenced by rainfall characteristics, the selection of appropriate rainfall events for treatment design is essential to ensure the effectiveness of WSUD systems. Based on extensive field investigation of four urban residential catchments and computer modelling, this paper details a technically robust approach for the selection of rainfall events for stormwater treatment design using a three-component model. The modelling outcomes indicate that selecting smaller average recurrence interval (ARI) events of high intensity and short duration as the threshold for treatment system design is the most feasible, since these events cumulatively generate a major portion of the annual pollutant load compared with other types of rainfall events, despite producing a relatively smaller runoff volume. This implies that designs based on small, more frequent rainfall events rather than larger rainfall events would be appropriate in terms of treatment performance, cost-effectiveness and possible savings in the land area needed.
Abstract:
Bioacoustic data can provide an important basis for environmental monitoring. To explore the large volumes of field recordings collected, an automated similarity search algorithm is presented in this paper. A region of an audio recording, defined by frequency and time bounds, is provided by a user; the content of the region is used to construct a query. In the retrieval process, our algorithm automatically scans through recordings to search for similar regions. In detail, we present a feature extraction approach based on the visual content of vocalisations – in this case, ridges – and develop a generic regional representation of vocalisations for indexing. Our feature extraction method works best for bird vocalisations showing ridge characteristics. The regional representation allows the content of an arbitrary region of a continuous recording to be described in a compressed format.
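The query-by-region workflow described above can be sketched in miniature. Note the simplification: the paper extracts ridge-based features, whereas this toy compares raw spectrogram patches directly by cosine similarity; the spectrogram values and region bounds are invented for illustration.

```python
import math

def extract_region(spec, f_lo, f_hi, t_lo, t_hi):
    """Cut a rectangular region (frequency bins x time frames) from a
    spectrogram stored as a 2D list."""
    return [row[t_lo:t_hi] for row in spec[f_lo:f_hi]]

def cosine_similarity(a, b):
    """Cosine similarity of two equal-sized 2D patches, flattened."""
    flat_a = [v for row in a for v in row]
    flat_b = [v for row in b for v in row]
    dot = sum(x * y for x, y in zip(flat_a, flat_b))
    na = math.sqrt(sum(x * x for x in flat_a))
    nb = math.sqrt(sum(x * x for x in flat_b))
    return dot / (na * nb) if na and nb else 0.0

def scan(spec, query, f_lo, f_hi):
    """Slide the query window along the time axis, scoring each position."""
    width = len(query[0])
    n_frames = len(spec[0])
    return [
        (t, cosine_similarity(query, extract_region(spec, f_lo, f_hi, t, t + width)))
        for t in range(n_frames - width + 1)
    ]

# Toy recording: 3 frequency bins x 8 time frames, with an energetic
# vocalisation around frames 4-5.
spec = [
    [0, 0, 0, 0, 9, 8, 0, 0],
    [0, 1, 0, 0, 8, 9, 0, 0],
    [0, 0, 0, 0, 7, 8, 0, 0],
]
query = extract_region(spec, 0, 3, 4, 6)   # user-selected region
scores = scan(spec, query, 0, 3)
best_t = max(scores, key=lambda s: s[1])[0]
print(best_t)
```

In a real system the scan would run over many long recordings, which is why the paper compresses regions into an indexable representation rather than matching raw patches.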
Abstract:
Evidence from population-based studies of women increasingly points to the inter-related nature of reproductive health, lifestyle, and chronic disease risk. This paper describes the recently established International Collaboration for a Life Course Approach to Reproductive Health and Chronic Disease (InterLACE). InterLACE aims to advance the evidence base for women's health policy beyond associations from disparate studies by means of systematic and culturally sensitive synthesis of longitudinal data. Currently InterLACE draws on individual-level data on reproductive health and chronic disease for 200,000 women from over thirteen studies of women's health in seven countries. The rationale for this multi-study research programme is set out in terms of a life course perspective on reproductive health. The research programme will build a comprehensive picture of reproductive health through life in relation to chronic disease risk. Although combining multiple international studies poses methodological challenges, InterLACE represents an invaluable opportunity to strengthen the evidence needed to guide the development of timely and tailored preventive health strategies.
Abstract:
Acoustic sensing is a promising approach to scaling faunal biodiversity monitoring. Scaling the analysis of audio collected by acoustic sensors is a big data problem. Standard approaches for dealing with big acoustic data include automated recognition and crowd-based analysis. Automatic methods are fast at processing but hard to rigorously design, whilst manual methods are accurate but slow. In particular, manual methods of acoustic data analysis are constrained by a 1:1 time relationship between the data and its analysts – the inherent need to listen to the audio. This paper demonstrates how the efficiency of crowd-sourced sound analysis can be increased by an order of magnitude through the visual inspection of audio visualized as spectrograms. Experimental data suggest that an analysis speedup of 12× is obtainable for suitable types of acoustic analysis when only spectrograms are shown.
Abstract:
The Medical Board of Australia's Code of Conduct reminds doctors that "When adverse events occur, you have a responsibility to be open and honest in your communication with your patient, to review what has occurred and to report appropriately." Whether this duty is more honoured in the breach than in the observance may or may not be the case. Faced with the English concerns and the Netherlands research, an evidence-based assessment of compliance with the ethical duty to disclose adverse events is warranted.
Abstract:
The Climate Commission recently outlined the trend of major extreme weather events in different regions of Australia, including heatwaves, floods, droughts, bushfires, cyclones and storms. These events already impose an enormous health and financial burden onto society and are projected to occur more frequently and intensely. Unless we act now, further financial losses and increasing health burdens seem inevitable. We seek to highlight the major areas for interdisciplinary investigation, identify barriers and formulate response strategies.
Abstract:
Poor complaint management may result in organizations losing customers and revenue. Consumers exhibit negative emotional responses when dissatisfied and this may lead to a complaint to a third-party organization. Since little information is available on the role of emotion in the consumer complaint process or how to manage complaints effectively, we offer an emotions perspective by applying Affective Events Theory (AET) to complaint behavior. This study presents the first application of AET in a consumption context and advances a theoretical framework supported by qualitative research for emotional responses to complaints. In contrast to commonly held views on gender and emotion, men as well as women use emotion-focused coping to complain.
Abstract:
The Chinese government should be commended for its open, concerted, and rapid response to the recent H7N9 influenza outbreak. However, the first known case was not reported until 48 days after disease onset.1 Although the difficulties in detecting the virus and the lack of suitable diagnostic methods have been the focus of discussion,2 systematic limitations that may have contributed to this delay have hardly been discussed. The detection speed of surveillance systems is limited by the highly structured nature of information flow and hierarchical organisation of these systems. Flu surveillance usually relies on notification to a central authority of laboratory confirmed cases or presentations to sentinel practices for flu-like illness. Each step in this pathway presents a bottleneck at which information and time can be lost; this limitation must be dealt with...
Abstract:
The terrorist attacks in the United States on September 11, 2001 appeared to be a harbinger of increased terrorism and violence in the 21st century, bringing terrorism and political violence to the forefront of public discussion. Questions about these events abound, and "Estimating the Historical and Future Probabilities of Large Scale Terrorist Events" [Clauset and Woodard (2013)] asks specifically, "how rare are large scale terrorist events?" and, in general, encourages discussion on the role of quantitative methods in terrorism research and in policy and decision-making. Answering the primary question raises two challenges. The first is identifying terrorist events. The second is finding a simple yet robust model for rare events that has good explanatory and predictive capabilities. The challenge of identifying terrorist events is acknowledged and addressed by reviewing and using data from two well-known and reputable sources: the Memorial Institute for the Prevention of Terrorism-RAND database (MIPT-RAND) [Memorial Institute for the Prevention of Terrorism] and the Global Terrorism Database (GTD) [National Consortium for the Study of Terrorism and Responses to Terrorism (START) (2012), LaFree and Dugan (2007)]. Clauset and Woodard (2013) provide a detailed discussion of the limitations of the data and the models used, in the context of the larger issues surrounding terrorism and policy.
Abstract:
Twitter is the focus of much research attention, both in traditional academic circles and in commercial market and media research, as analytics give increasing insight into the performance of the platform in areas as diverse as political communication, crisis management, television audiencing and other industries. While methods for tracking Twitter keywords and hashtags have developed apace and are well documented, the make-up of the Twitter user base and its evolution over time have been less understood to date. Recent research efforts have taken advantage of functionality provided by Twitter's Application Programming Interface to develop methodologies to extract information that allows us to understand the growth of Twitter, its geographic spread and the processes by which particular Twitter users have attracted followers. From politicians to sporting teams, and from YouTube personalities to reality television stars, this technique enables us to gain an understanding of what prompts users to follow others on Twitter. This article outlines how we came upon this approach, describes the method we adopted to produce accession graphs and discusses their use in Twitter research. It also addresses the wider ethical implications of social network analytics, particularly in the context of a detailed study of the Twitter user base.
Abstract:
The incidence of major storm surges in the last decade has dramatically emphasized the immense destructive capability of extreme water level events, particularly when driven by severe tropical cyclones. Given this risk, it is vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood and erosion management, engineering, and future land-use planning, and to ensure that the risks of catastrophic structural failure due to under-design, or of expensive waste due to over-design, are minimised. Australia has a long history of coastal flooding from tropical cyclones. Using a novel integration of two modeling techniques, this paper provides the first estimates of present-day extreme water level exceedance probabilities around the whole coastline of Australia, and the first estimates that combine the influence of astronomical tides, storm surges generated by both extra-tropical and tropical cyclones, and seasonal and inter-annual variations in mean sea level. Initially, an analysis of tide gauge records was used to assess the characteristics of tropical cyclone-induced surges around Australia. However, given the dearth (temporal and spatial) of information around much of the coastline, and therefore the inability of these gauge records to adequately describe the regional climatology, an observationally based stochastic tropical cyclone model was developed to synthetically extend the tropical cyclone record to 10,000 years. Wind and pressure fields derived for these synthetically generated events were then used to drive a hydrodynamic model of the Australian continental shelf region, with annual maximum water levels extracted to estimate exceedance probabilities around the coastline. To validate this methodology, selected historic storm surge events were simulated and the resultant storm surges compared with gauge records.
Tropical cyclone induced exceedance probabilities have been combined with estimates derived from a 61-year water level hindcast described in a companion paper to give a single estimate of present day extreme water level probabilities around the whole coastline of Australia. Results of this work are freely available to coastal engineers, managers and researchers via a web-based tool (www.sealevelrise.info). The described methodology could be applied to other regions of the world, like the US east coast, that are subject to both extra-tropical and tropical cyclones.
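The last step described above – turning a series of annual maximum water levels into exceedance probabilities – can be sketched with a simple empirical estimator. This is a rough illustration only: the water levels below are made up, and the study itself works with 10,000 years of synthetic cyclone-driven maxima rather than a ten-value sample; the Weibull plotting position used here is one common convention, not necessarily the study's.

```python
def exceedance_table(annual_maxima):
    """Empirical annual exceedance probabilities via Weibull plotting
    positions: AEP = rank / (n + 1), ARI = 1 / AEP (years)."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    table = []
    for rank, level in enumerate(ranked, start=1):
        aep = rank / (n + 1)               # annual exceedance probability
        table.append((level, aep, 1.0 / aep))  # (level in m, AEP, ARI in yr)
    return table

# Hypothetical annual maximum water levels (m) for a ten-year record.
maxima = [1.8, 2.1, 1.6, 2.9, 1.9, 2.4, 1.7, 2.0, 3.4, 2.2]
for level, aep, ari in exceedance_table(maxima):
    print(f"{level:.1f} m  AEP={aep:.2f}  ARI={ari:.1f} yr")
```

With a synthetic record extended to 10,000 years, the same ranking approach resolves the rare, high-ARI events that a short gauge record cannot.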
Abstract:
Existing techniques for automated discovery of process models from event logs largely focus on extracting flat process models. In other words, they fail to exploit the notion of subprocess, as well as structured error handling and repetition constructs provided by contemporary process modeling notations, such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of BPMN models containing subprocesses, interrupting and non-interrupting boundary events, and loop and multi-instance markers. The technique analyzes dependencies between data attributes associated with events, in order to identify subprocesses and to extract their associated logs. Parent process and subprocess models are then discovered separately using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. A validation with one synthetic and two real-life logs shows that process models derived using the proposed technique are more accurate and less complex than those derived with flat process model discovery techniques.
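One step of the pipeline described above – extracting a subprocess log so that parent and subprocess models can be discovered separately – can be sketched as a log-splitting routine. This is a hedged sketch under strong assumptions: the event log, activity names, and the attribute `order_id` are all hypothetical, and the hard part of the technique (detecting which data attribute delimits a subprocess) is taken as given.

```python
from collections import defaultdict

# Hypothetical flat event log for one case: events carrying the (assumed,
# already-identified) subprocess key `order_id` belong to subprocess
# instances; the rest belong to the parent process.
event_log = [
    {"case": 1, "activity": "Receive order"},
    {"case": 1, "activity": "Check item", "order_id": "A"},
    {"case": 1, "activity": "Pack item",  "order_id": "A"},
    {"case": 1, "activity": "Check item", "order_id": "B"},
    {"case": 1, "activity": "Pack item",  "order_id": "B"},
    {"case": 1, "activity": "Ship order"},
]

def split_log(log, key):
    """Partition a flat log: events carrying `key` form subprocess traces
    (one per key value); the remaining events stay in the parent trace."""
    parent, sub = [], defaultdict(list)
    for event in log:
        if key in event:
            sub[event[key]].append(event["activity"])
        else:
            parent.append(event["activity"])
    return parent, dict(sub)

parent, subs = split_log(event_log, "order_id")
print(parent)  # parent-process trace
print(subs)    # one subprocess trace per order_id value
```

Each resulting log can then be fed to any flat process-discovery technique, after which boundary events and multi-instance markers are identified heuristically, as the abstract describes.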
Abstract:
The life history strategies of massive Porites corals make them a valuable resource not only as key providers of reef structure, but also as recorders of past environmental change. Yet recent documented evidence of an unprecedented increase in the frequency of mortality in Porites warrants investigation into the history of mortality and associated drivers. To achieve this, both an accurate chronology and an understanding of the life history strategies of Porites are necessary. Sixty-two individual Uranium–Thorium (U–Th) dates from 50 dead massive Porites colonies from the central inshore region of the Great Barrier Reef (GBR) revealed the timing of mortality to have occurred predominantly over two main periods from 1989.2 ± 4.1 to 2001.4 ± 4.1, and from 2006.4 ± 1.8 to 2008.4 ± 2.2 A.D., with a small number of colonies dating earlier. Overall, the peak ages of mortality are significantly correlated with maximum sea-surface temperature anomalies. Despite potential sampling bias, the frequency of mortality increased dramatically post-1980. These observations are similar to the results reported for the Southern South China Sea. High resolution measurements of Sr/Ca and Mg/Ca obtained from a well preserved sample that died in 1994.6 ± 2.3 revealed that the time of death occurred at the peak of sea surface temperatures (SST) during the austral summer. In contrast, Sr/Ca and Mg/Ca analysis in two colonies dated to 2006.9 ± 3.0 and 2008.3 ± 2.0, suggest that both died after the austral winter. An increase in Sr/Ca ratios and the presence of low Mg-calcite cements (as determined by SEM and elemental ratio analysis) in one of the colonies was attributed to stressful conditions that may have persisted for some time prior to mortality. 
For both colonies, however, the timing of mortality coincides with the 4th and 6th largest flood events reported for the Burdekin River in the past 60 years, implying that factors associated with terrestrial runoff may have been responsible for mortality. Our results show that a combination of U–Th and elemental ratio geochemistry can potentially be used to precisely and accurately determine the timing and season of mortality in modern massive Porites corals. For reefs where long-term monitoring data are absent, the ability to reconstruct historical events in coral communities may prove useful to reef managers by providing some baseline knowledge on disturbance history and associated drivers.
Abstract:
Low-temperature plasmas in direct contact with arbitrary, written linear features on a Si wafer enable catalyst-free integration of carbon nanotubes into a Si-based nanodevice platform and in situ resolution of individual nucleation events. The graded nanotube arrays show reliable, reproducible, and competitive performance in electron field emission and biosensing nanodevices.