967 results for "extreme climatic event"
Abstract:
The rapid increase in the deployment of CCTV systems has led to a greater demand for algorithms that can process incoming video feeds. These algorithms are designed to extract information of interest for human operators. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task, where the system is trained on normal data and is required to detect events which do not fit the learned 'normal' model. Many researchers have tried various sets of features to train different learning models to detect abnormal behaviour in video footage. In this work we propose using a Semi-2D Hidden Markov Model (HMM) to model the normal activities of people; observations with insufficient likelihood under the model are identified as abnormal activities. Our Semi-2D HMM is designed to model both the temporal and spatial causalities of crowd behaviour by assuming that the current state of the Hidden Markov Model depends not only on the previous state in the temporal direction, but also on the previous states of the adjacent spatial locations. Two different HMMs are trained to model the vertical and horizontal spatial causal information, respectively. Location features, flow features and optical flow textures are used as the features for the model. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
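The likelihood-threshold decision rule described in this abstract can be illustrated with a small sketch. This is a minimal, hypothetical example using a standard temporal HMM from `hmmlearn`; the spatial coupling that makes the model 'Semi-2D', the specific features and the threshold choice are not reproduced here and are assumptions for illustration only.

```python
# Minimal sketch: HMM-based novelty detection on per-frame feature vectors.
# The spatial (Semi-2D) coupling described in the abstract is not modelled here;
# a standard temporal HMM is used only to illustrate the likelihood-threshold rule.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_normal_model(normal_sequences, n_states=8):
    """Fit an HMM on sequences of 'normal' feature vectors (one array per clip)."""
    X = np.vstack(normal_sequences)
    lengths = [len(seq) for seq in normal_sequences]
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    # Assumed threshold: 1st percentile of per-frame log-likelihood on training clips
    train_ll = [model.score(seq) / len(seq) for seq in normal_sequences]
    threshold = np.percentile(train_ll, 1)
    return model, threshold

def is_abnormal(model, threshold, sequence):
    """Flag a test clip whose average log-likelihood falls below the threshold."""
    return model.score(sequence) / len(sequence) < threshold
```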
Abstract:
In the face of Australia’s disaster-prone environment, architects Ian Weir and James Davidson are reconceptualising how our residential buildings might become more resilient to fire, flood and cyclone. With their first-hand experience of natural disasters, James, director of Emergency Architects Australia (EAA), and Ian, one of Australia’s few ‘bushfire architects’, discuss the ways we can design with disaster in mind. Dr Ian Weir explores a holistic ‘ground up’ approach to bushfire design, in which landscape, building design and habitation patterns are orchestrated to respond to site-specific fire characteristics. Ian’s research is developed through design studio teaching at QUT and through built works in Western Australia’s fire-prone forests and heathlands.
Abstract:
Bouncing Back Architecture Exhibition: This exhibition showcases interpretations of urban resiliency by 2nd and 4th year undergraduate architecture students, who explore the notion of Bouncing Back from the 2011 Queensland floods in the context of the contemporary Brisbane built environment. Design solutions have been expressed in a variety of forms, including emergency shelters, flood-proof housing and a range of urban designs, some of which address extreme environmental conditions. Design Process Workshop | Architecture Workshop with Queensland Academy of Creative Industries Students: In collaboration with Homegrown facilitator Natalie Wright, Lindy Osborne and Glenda Caldwell, together with some of their architecture students from the QUT School of Design, extended the university design studio experience to 18 secondary school students, who brainstormed and designed emergency food distribution shelters for those affected by floods. Designs and models created in the workshop were subsequently included in the Bouncing Back Architecture Exhibition.
Abstract:
Process mining encompasses the research area concerned with knowledge discovery from information system event logs. Within the process mining research area, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behaviour captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is considered highly valuable for the development and assessment of both process discovery and conformance checking techniques.
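One way to picture the kind of metric such a framework would compute is a simple activity-level fitness measure over an event log. The sketch below is not the ProM framework or any specific metric from the literature; the log format and the reduction of the model to a set of allowed directly-follows pairs are simplifying assumptions.

```python
# Minimal sketch of a conformance metric: fraction of directly-follows pairs
# in the log that are allowed by a (simplified) process model.
from typing import List, Set, Tuple

def directly_follows_fitness(log: List[List[str]],
                             allowed: Set[Tuple[str, str]]) -> float:
    """Return the share of observed (a, b) successions permitted by the model."""
    observed, conforming = 0, 0
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            observed += 1
            conforming += (a, b) in allowed
    return conforming / observed if observed else 1.0

# Toy example: one deviating succession ("register", "pay") lowers fitness
log = [["register", "check", "pay"], ["register", "pay"]]
model = {("register", "check"), ("check", "pay")}
print(directly_follows_fitness(log, model))  # 0.666...
```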
Abstract:
Free association norms indicate that words are organized into semantic/associative neighborhoods within a larger network of words and links that bind the net together. We present evidence indicating that memory for a recent word event can depend on implicitly and simultaneously activating related words in its neighborhood. Processing a word during encoding primes its network representation as a function of the density of the links in its neighborhood. Such priming increases recall and recognition and can have long-lasting effects when the word is processed in working memory. Evidence for this phenomenon is reviewed in extralist cuing, primed free association, intralist cuing, and single-item recognition tasks. The findings also show that when a related word is presented to cue the recall of a studied word, the cue activates it within an array of related words that distract and reduce the probability of its selection. The activation of the semantic network thus produces priming benefits during encoding and search costs during retrieval. In extralist cuing, recall is a negative function of cue-to-distracter strength and a positive function of neighborhood density, cue-to-target strength, and target-to-cue strength. We show how four measures derived from the network can be combined and used to predict memory performance. These measures play different roles in different tasks, indicating that the contribution of the semantic network varies with the context provided by the task. We evaluate spreading activation and quantum-like entanglement explanations for the priming effect produced by neighborhood density.
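For readers who want a concrete picture of such network measures, a minimal sketch follows, computing plausible analogues of cue-to-target strength, target-to-cue strength, neighborhood density and cue-to-distracter strength from free-association norms stored as a weighted adjacency mapping. The exact definitions and the way the paper combines the four measures are not reproduced; everything here is an illustrative assumption.

```python
# Minimal sketch: deriving simple predictor measures from free-association norms.
# 'norms' maps a cue word to {associate: forward strength}; definitions are illustrative.
from typing import Dict

Norms = Dict[str, Dict[str, float]]

def cue_to_target(norms: Norms, cue: str, target: str) -> float:
    return norms.get(cue, {}).get(target, 0.0)

def target_to_cue(norms: Norms, cue: str, target: str) -> float:
    return norms.get(target, {}).get(cue, 0.0)

def neighborhood_density(norms: Norms, target: str) -> float:
    """Mean strength of the links among the target's associates (its neighborhood)."""
    neighbors = list(norms.get(target, {}))
    links = [norms.get(a, {}).get(b, 0.0)
             for a in neighbors for b in neighbors if a != b]
    return sum(links) / len(links) if links else 0.0

def cue_to_distracters(norms: Norms, cue: str, target: str) -> float:
    """Total strength from the cue to words other than the target (competition)."""
    return sum(s for w, s in norms.get(cue, {}).items() if w != target)
```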
Abstract:
In this paper, we propose an approach which attempts to solve the problem of surveillance event detection, assuming that we know the definition of the events. To facilitate the discussion, we first define two concepts: the event of interest refers to the event that the user requests the system to detect, and the background activities are any other events in the video corpus. This is an unsolved problem due to many factors, as listed below:
1) Occlusions and clustering: Surveillance scenes of significant interest, at locations such as airports, railway stations and shopping centers, are often crowded, so occlusions and clustering of people are frequently encountered. This significantly affects the feature extraction step; for instance, trajectories generated by object tracking algorithms are usually not robust under such conditions.
2) The requirement for real-time detection: The system should process the video fast enough, in both the feature extraction and the detection steps, to facilitate real-time operation.
3) Massive size of the training data set: Suppose an event lasts for 1 minute in a video with a frame rate of 25 fps; the number of frames for this event is 60 × 25 = 1500. If we want a training data set with many positive instances of the event, the video is likely to be very large (i.e. hundreds of thousands of frames or more). How to handle such a large data set is a problem frequently encountered in this application.
4) Difficulty in separating the event of interest from background activities: The events of interest often co-exist with a set of background activities. Temporal groundtruth is typically very ambiguous, as it does not distinguish the event of interest from the wide range of co-existing background activities, and it is not practical to annotate the spatial locations of the events in large amounts of video data. This problem becomes more serious in the detection of multi-agent interactions, since the location of these events often cannot be constrained to within a bounding box.
5) Challenges in determining the temporal boundaries of the events: An event can occur at any arbitrary time with an arbitrary duration. The temporal segmentation of events is difficult and ambiguous, and is also affected by other factors such as occlusions.
Abstract:
Extracting and aggregating the relevant event records relating to an identified security incident from the multitude of heterogeneous logs in an enterprise network is a difficult challenge. Presenting the information in a meaningful way is an additional challenge. This paper looks at solutions to this problem by first identifying three main transforms: log collection, correlation, and visual transformation. Having identified that the CEE project will address the first transform, this paper focuses on the second, while the third is left for future work. To aggregate event records by correlation, we demonstrate the use of two correlation methods, simple and composite. These make use of a defined mapping schema and confidence values to dynamically query the normalised dataset and to constrain result events to within a time window. Doing so improves the quality of the results, which is required for the iterative re-querying process being undertaken. The final results of the process are output as nodes and edges suitable for presentation as a network graph.
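The simple correlation step can be pictured as a windowed query over normalised event records. The sketch below is a hypothetical simplification: events are plain dictionaries with a timestamp and normalised attribute fields, the mapping schema and confidence values are reduced to a field list, and the CEE-based collection and visual transforms are out of scope.

```python
# Minimal sketch of simple event-record correlation within a time window.
# Events are assumed to be normalised dicts with a 'time' field (seconds) plus
# attributes such as 'src_ip' and 'user'; schema/confidence handling is simplified.
from typing import Dict, List, Tuple

Event = Dict[str, object]

def correlate(seed: Event, events: List[Event],
              fields: List[str], window: float = 300.0) -> List[Event]:
    """Return events sharing any mapped field with the seed, within +/- window seconds."""
    hits = []
    for e in events:
        if abs(float(e["time"]) - float(seed["time"])) > window:
            continue
        if any(seed.get(f) is not None and e.get(f) == seed.get(f) for f in fields):
            hits.append(e)
    return hits

def to_graph(seed: Event, hits: List[Event]) -> Tuple[List[str], List[Tuple[str, str]]]:
    """Emit nodes and edges (seed -> correlated event ids) for network-graph display."""
    nodes = [str(seed["id"])] + [str(e["id"]) for e in hits]
    edges = [(str(seed["id"]), str(e["id"])) for e in hits]
    return nodes, edges
```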
Abstract:
Risk identification is one of the most challenging stages in the risk management process. Conventional risk management approaches provide little guidance, and companies often rely on the knowledge of experts for risk identification. In this paper we demonstrate how risk indicators can be used to predict process delays via a method for configuring so-called Process Risk Indicators (PRIs). The method learns suitable configurations from past process behaviour recorded in event logs. To validate the approach, we have implemented it as a plug-in of the ProM process mining framework and have conducted experiments using various data sets from a major insurance company.
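As a rough illustration of what configuring a risk indicator from an event log might look like, the sketch below tunes a single elapsed-time threshold on historical case durations and applies it to a running case. It is not the PRI configuration method or the ProM plug-in described here; the percentile rule and field names are assumptions.

```python
# Minimal sketch of a Process Risk Indicator (PRI) configured from an event log.
# A single indicator is tuned on historical case durations and then applied to a
# running case; the tuning rule and inputs are illustrative assumptions.
import numpy as np

def configure_pri(historical_durations_h, delayed_flags, percentile=75):
    """Pick an elapsed-time threshold from cases that ended up delayed."""
    delayed = [d for d, late in zip(historical_durations_h, delayed_flags) if late]
    return np.percentile(delayed, percentile) if delayed else float("inf")

def pri_fires(elapsed_hours, threshold):
    """The indicator fires when a running case has already exceeded the threshold."""
    return elapsed_hours > threshold

# Example: tune on past cases, then check a running case
threshold = configure_pri([10, 14, 30, 55, 62], [False, False, True, True, True], 25)
print(pri_fires(48, threshold))  # True under this toy configuration
```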
Extreme temperatures and emergency department admissions for childhood asthma in Brisbane, Australia
Abstract:
Objectives To examine the effect of extreme temperatures on emergency department admissions (EDAs) for childhood asthma. Methods An ecological design was used in this study. A Poisson linear regression model combined with a distributed lag non-linear model was used to quantify the effect of temperature on EDAs for asthma among children aged 0–14 years in Brisbane, Australia, during January 2003–December 2009, while controlling for air pollution, relative humidity, day of the week, season and long-term trends. The model residuals were checked to identify whether there was an added effect due to heat waves or cold spells. Results There were 13 324 EDAs for childhood asthma during the study period. Both hot and cold temperatures were associated with increases in EDAs for childhood asthma, and their effects both appeared to be acute. An added effect of heat waves on EDAs for childhood asthma was observed, but no added effect of cold spells was found. Male children and children aged 0–4 years were most vulnerable to heat effects, while children aged 10–14 years were most vulnerable to cold effects. Conclusions Both hot and cold temperatures seemed to affect EDAs for childhood asthma. As climate change continues, children aged 0–4 years are at particular risk for asthma.
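The modelling approach (a Poisson regression of daily admission counts on temperature with lagged terms, adjusted for confounders) can be sketched as follows. This is a simplified illustration using plain lagged temperature terms in place of the distributed lag non-linear model's cross-basis; the column names and lag window are assumptions, not the study's specification.

```python
# Simplified sketch: Poisson regression of daily asthma EDAs on temperature with lags.
# The cross-basis of the distributed lag non-linear model is replaced by plain lagged
# temperature terms; column names below are assumed, not taken from the study.
import pandas as pd
import statsmodels.formula.api as smf

def fit_temperature_model(df: pd.DataFrame, max_lag: int = 3):
    """df needs columns: edas, temp, humidity, pm10, dow, season, time_trend."""
    df = df.copy()
    for lag in range(1, max_lag + 1):
        df[f"temp_lag{lag}"] = df["temp"].shift(lag)
    lag_terms = " + ".join(f"temp_lag{l}" for l in range(1, max_lag + 1))
    formula = (f"edas ~ temp + {lag_terms} + humidity + pm10 "
               f"+ C(dow) + C(season) + time_trend")
    return smf.poisson(formula, data=df.dropna()).fit()
```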
Abstract:
Temperate Australia sits between the heat engine of the tropics and the cold Southern Ocean, encompassing a range of rainfall regimes and falling under the influence of different climatic drivers. Despite this heterogeneity, broad-scale trends in climatic and environmental change are evident over the past 30 ka. During the early glacial period (∼30–22 ka) and the Last Glacial Maximum (∼22–18 ka), the climate was relatively cool across the entire temperate zone, and there was an expansion of grasslands and increased fluvial activity in the regionally important Murray–Darling Basin. The temperate region at this time appears to have been dominated by expanded sea ice in the Southern Ocean, forcing a northerly shift in the position of the oceanic fronts and a concomitant influx of cold water along the southeast (including Tasmania) and southwest Australian coasts. The deglacial period (∼18–12 ka) was characterised by glacial recession and eventual disappearance resulting from an increase in temperature deduced from terrestrial records, while there is some evidence for climatic reversals (e.g. the Antarctic Cold Reversal) in high-resolution marine sediment cores through this period. The high spatial density of Holocene terrestrial records reveals an overall expansion of sclerophyll woodland and rainforest taxa across the temperate region after ∼12 ka, presumably in response to increasing temperature, while hydrological records reveal spatially heterogeneous hydro-climatic trends. Patterns after ∼6 ka suggest higher-frequency climatic variability that possibly reflects the onset of large-scale climate variability caused by the El Niño–Southern Oscillation.
Abstract:
Part of the chapter: "Sale of Sperm, Health Records, Minimally Conscious States, and Duties of Candour" Although ethical obligations and good medical practice guidelines clearly contemplate open disclosure, there is a dearth of authority as to the nature and extent of a legal duty on Australian doctors to disclose adverse events to patients.