830 results for discrete event systems
Abstract:
An important responsibility of the Environment Protection Authority, Victoria, is to set objectives for levels of environmental contaminants. To support the development of environmental objectives for water quality, a need has been identified to understand the dual impacts of concentration and duration of a contaminant on biota in freshwater streams. For suspended solids contamination, information from the Newcombe and Jensen [North American Journal of Fisheries Management, 16(4):693--727, 1996] study of freshwater fish, together with daily suspended solids data from the United States Geological Survey (USGS) stream monitoring network, is utilised. The study group was requested to examine the utility of both the Newcombe and Jensen results and the USGS data, as well as the formulation of a procedure, for use by the Environment Protection Authority, Victoria, that takes the concentration and duration of harmful episodes into account when assessing water quality. The extent to which the impact of a toxic event on fish health could be modelled deterministically was also considered. It was found that concentration and exposure duration were the main factors compounding the severity of the effects of suspended solids on freshwater fish. A protocol for assessing the cumulative effect on fish health and a simple deterministic model, based on the biology of gill harm and recovery, were proposed.
References
D. W. T. Au, C. A. Pollino, R. S. S. Wu, P. K. S. Shin, S. T. F. Lau, and J. Y. M. Tang. Chronic effects of suspended solids on gill structure, osmoregulation, growth, and triiodothyronine in juvenile green grouper Epinephelus coioides. Marine Ecology Progress Series, 266:255--264, 2004.
J. C. Bezdek, S. K. Chuah, and D. Leep. Generalized k-nearest neighbor rules. Fuzzy Sets and Systems, 18:237--26, 1986.
E. T. Champagne, K. L. Bett-Garber, A. M. McClung, and C. Bergman. Sensory characteristics of diverse rice cultivars as influenced by genetic and environmental factors. Cereal Chem., 81:237--243, 2004.
S. G. Cheung and P. K. S. Shin. Size effects of suspended particles on gill damage in green-lipped mussel Perna viridis. Marine Pollution Bulletin, 51(8--12):801--810, 2005.
D. H. Evans. The fish gill: site of action and model for toxic effects of environmental pollutants. Environmental Health Perspectives, 71:44--58, 1987.
G. C. Grigg. The failure of oxygen transport in a fish at low levels of ambient oxygen. Comp. Biochem. Physiol., 29:1253--1257, 1969.
G. Holmes, A. Donkin, and I. H. Witten. Weka: A machine learning workbench. In Proceedings of the Second Australia and New Zealand Conference on Intelligent Information Systems, volume 24, pages 357--361, Brisbane, Australia, 1994. IEEE Computer Society.
D. D. Macdonald and C. P. Newcombe. Utility of the stress index for predicting suspended sediment effects: response to comments. North American Journal of Fisheries Management, 13:873--876, 1993.
C. P. Newcombe. Suspended sediment in aquatic ecosystems: ill effects as a function of concentration and duration of exposure. Technical report, British Columbia Ministry of Environment, Lands and Parks, Habitat Protection Branch, Victoria, 1994.
C. P. Newcombe and J. O. T. Jensen. Channel suspended sediment and fisheries: a synthesis for quantitative assessment of risk and impact. North American Journal of Fisheries Management, 16(4):693--727, 1996.
C. P. Newcombe and D. D. Macdonald. Effects of suspended sediments on aquatic ecosystems. North American Journal of Fisheries Management, 11(1):72--82, 1991.
K. Schmidt-Nielsen. Scaling: Why Is Animal Size So Important? Cambridge University Press, NY, 1984.
J. S. Schwartz, A. Simon, and L. Klimetz. Use of fish functional traits to associate in-stream suspended sediment transport metrics with biological impairment. Environmental Monitoring and Assessment, 179(1--4):347--369, 2011.
E. A. Shaw and J. S. Richardson. Direct and indirect effects of sediment pulse duration on stream invertebrate assemblages and rainbow trout (Oncorhynchus mykiss) growth and survival. Canadian Journal of Fisheries and Aquatic Sciences, 58:2213--2221, 2001.
P. Tiwari and H. Hasegawa. Demand for housing in Tokyo: A discrete choice analysis. Regional Studies, 38:27--42, 2004.
Y. Tramblay, A. Saint-Hilaire, T. B. M. J. Ouarda, F. Moatar, and B. Hecht. Estimation of local extreme suspended sediment concentrations in California rivers. Science of the Total Environment, 408:4221--
Abstract:
We address the problem of finite-horizon optimal control of discrete-time linear systems with input constraints and uncertainty. The uncertainty in the problem analysed arises from incomplete state information (output feedback) and stochastic disturbances. We analyse the complexities associated with finding optimal solutions, and we consider two suboptimal strategies that could be employed for larger optimization horizons.
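A minimal sketch of one classic suboptimal strategy of this flavour, assuming certainty equivalence: solve the unconstrained finite-horizon LQR problem by a backward Riccati recursion, then saturate the resulting inputs to respect the constraint. The matrices A, B, Q, R, the horizon N and the bound u_max below are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Illustrative system and cost (assumed, not from the paper)
A = np.array([[1.0, 0.1], [0.0, 1.0]])   # double-integrator-like dynamics
B = np.array([[0.005], [0.1]])
Q = np.eye(2)                             # state cost
R = np.array([[0.1]])                     # input cost
N, u_max = 20, 1.0                        # horizon and input bound

# Backward Riccati recursion: P_N = Q, P_k = Q + A'P(A - B K)
P = Q.copy()
gains = []
for _ in range(N):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # feedback gain
    P = Q + A.T @ P @ (A - B @ K)
    gains.append(K)
gains.reverse()  # gains[k] now applies at step k

# Forward simulation: clip (saturate) inputs, add a stochastic disturbance
rng = np.random.default_rng(0)
x = np.array([1.0, 0.0])
for k in range(N):
    u = np.clip(-gains[k] @ x, -u_max, u_max)          # input constraint
    x = A @ x + B @ u + 0.01 * rng.standard_normal(2)  # disturbance
print("final state:", x)
```

Saturating an unconstrained optimal law like this is cheap but not optimal, which is exactly why the complexity of truly optimal constrained output-feedback solutions is of interest.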
Abstract:
The development of global navigation satellite systems (GNSS) nowadays provides solutions to many applied problems with ever higher quality and accuracy. Research carried out by the Bavarian Academy of Sciences and Humanities in Munich (BAW) in the field of airborne gravimetry is based on sophisticated processing of data from high-frequency GNSS receivers for kinematic aircraft positioning. The algorithms applied for inertial acceleration determination rely on the high sampling rate (50 Hz) and on reducing factors such as ionospheric scintillation and multipath at the aircraft, including antenna near-field effects. The quality of the GNSS-derived kinematic heights is also studied by intercomparison with lift height variations collected by a precise, high-sampling-rate vertical scale [1]. This work aims at more accurate determination of mini-aircraft altitude by means of high-frequency GNSS receivers, in particular by considering their dynamic behaviour.
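As a rough illustration of acceleration determination from kinematic positions, heights sampled at 50 Hz can be smoothed and then double-differentiated. The synthetic signal and the smoothing window below are assumptions for the sketch, not BAW's actual processing chain.

```python
import numpy as np

fs = 50.0                      # GNSS sampling rate (Hz), as in the abstract
t = np.arange(0, 10, 1 / fs)
# synthetic noisy height series in metres (assumed stand-in for GNSS heights)
h = 100 + 2 * np.sin(0.5 * t) + 0.002 * np.random.randn(t.size)

# Low-pass the heights first: raw double differencing amplifies noise
win = 25                       # 0.5 s moving-average window (assumed)
h_smooth = np.convolve(h, np.ones(win) / win, mode="same")

# Central second difference: a[k] ~ (h[k+1] - 2 h[k] + h[k-1]) * fs^2
acc = (h_smooth[2:] - 2 * h_smooth[1:-1] + h_smooth[:-2]) * fs**2
print("vertical acceleration (m/s^2), first samples:", acc[:5])
```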
Abstract:
Organisations are constantly seeking new ways to improve operational efficiency. This research study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use the cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related costs within organisations.
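A compact sketch of the genetic-algorithm idea, under the assumption of a toy encoding in which a scenario assigns a resource to each task and the objective combines processing time with a resource-utilisation penalty. The weights, sizes and cost terms are hypothetical, not the paper's cost structure.

```python
import random

N_TASKS, N_RESOURCES, POP, GENS = 8, 3, 30, 50
random.seed(1)
# assumed per-task, per-resource processing durations
DURATION = [[random.uniform(1, 5) for _ in range(N_RESOURCES)]
            for _ in range(N_TASKS)]

def cost(scenario):
    # time component: total processing time of the chosen assignments
    time_cost = sum(DURATION[t][r] for t, r in enumerate(scenario))
    # utilisation component: penalise resources left idle (assumed weight)
    idle = N_RESOURCES - len(set(scenario))
    return time_cost + 2.0 * idle

def crossover(a, b):
    cut = random.randrange(1, N_TASKS)      # one-point crossover
    return a[:cut] + b[cut:]

def mutate(s, rate=0.1):
    return [random.randrange(N_RESOURCES) if random.random() < rate else g
            for g in s]

pop = [[random.randrange(N_RESOURCES) for _ in range(N_TASKS)]
       for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=cost)
    elite = pop[: POP // 2]                 # truncation selection
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(POP - len(elite))]

best = min(pop, key=cost)
print("best scenario:", best, "cost:", round(cost(best), 2))
```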
Abstract:
Through the application of process mining, valuable evidence-based insights can be obtained about business processes in organisations. As a result, the field has seen increased uptake in recent years, as evidenced by success stories and growing tool support. Despite this impact, however, current performance analysis capabilities remain somewhat limited in the context of information-poor event logs. For example, natural daily and weekly patterns are not considered. In this paper a new framework for analysing event logs is defined, based on the concept of the event gap. The framework allows for a systematic approach to sophisticated performance-related analysis of event logs containing varying degrees of information. The paper formalises a range of event gap types and then presents an implementation as well as an evaluation of the proposed approach.
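The most basic form of the event-gap notion can be illustrated in a few lines of pandas: the elapsed time between consecutive events of the same case. The column names and the toy log below are assumptions for the sketch, not the paper's formal definitions.

```python
import pandas as pd

# toy event log (column names assumed)
log = pd.DataFrame({
    "case":      ["c1", "c1", "c1", "c2", "c2"],
    "activity":  ["register", "check", "decide", "register", "decide"],
    "timestamp": pd.to_datetime([
        "2024-01-01 09:00", "2024-01-01 09:40", "2024-01-02 10:00",
        "2024-01-01 11:00", "2024-01-01 11:05"]),
})

log = log.sort_values(["case", "timestamp"])
# gap = time since the previous event of the same case
log["gap"] = log.groupby("case")["timestamp"].diff()
print(log[["case", "activity", "gap"]])
```

Large gaps straddling nights or weekends hint at the natural daily and weekly patterns the abstract mentions, which the formalised gap types would distinguish.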
Abstract:
Collisions between different types of road users at intersections form a substantial component of the road toll. This paper presents an analysis of driver, cyclist, motorcyclist and pedestrian behaviour at intersections, based on the application of an integrated suite of ergonomics methods, the Event Analysis of Systemic Teamwork (EAST) framework, to on-road study data. EAST was used to analyse behaviour at three intersections using data derived from an on-road study of drivers, cyclists, motorcyclists and pedestrians. The analysis shows the differences in behaviour and cognition across the different road user groups and pinpoints instances where these differences may be creating conflicts between road users. The role of intersection design in creating these differences in behaviour and the resulting conflicts is discussed. It is concluded that intersections are currently not designed in a way that supports behaviour across the four forms of road user studied. Interventions designed to improve intersection safety are discussed.
Abstract:
Novel computer vision techniques have been developed to automatically detect unusual events in crowded scenes from the video feeds of surveillance cameras. The research is useful for the design of next-generation intelligent video surveillance systems. The two major contributions are the construction of a novel machine learning model for multiple instance learning through compressive sensing, and the design of novel feature descriptors in the compressed video domain.
Abstract:
Due to the popularity of security cameras in public places, it is of interest to design an intelligent system that can efficiently detect events automatically. This paper proposes a novel algorithm for multi-person event detection. To ensure greater-than-real-time performance, features are extracted directly from compressed MPEG video. A novel histogram-based feature descriptor that captures the angles between extracted particle trajectories is proposed, allowing motion patterns of multi-person events in the video to be captured. To alleviate the need for fine-grained annotation, we propose the use of Labelled Latent Dirichlet Allocation, a “weakly supervised” method that allows the use of coarse temporal annotations, which are much simpler to obtain. This novel system is able to run at approximately ten times real-time while preserving state-of-the-art detection performance for multi-person events on a 100-hour real-world surveillance dataset (TRECVid SED).
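A hedged sketch of a histogram-of-angles descriptor in the spirit described: compute a direction vector for each extracted trajectory, take all pairwise angles, and bin them into a histogram. The synthetic trajectories, the direction definition and the bin count are assumptions, not the paper's exact feature.

```python
import numpy as np

def direction(traj):
    """Unit direction vector of a trajectory given as (T, 2) points."""
    v = traj[-1] - traj[0]
    return v / (np.linalg.norm(v) + 1e-9)

# synthetic particle trajectories (assumed stand-ins for tracked motion)
trajs = [np.cumsum(np.random.randn(20, 2), axis=0) for _ in range(6)]
dirs = np.array([direction(t) for t in trajs])

# pairwise angles in [0, pi] between all trajectory pairs
cos = np.clip(dirs @ dirs.T, -1.0, 1.0)
i, j = np.triu_indices(len(dirs), k=1)
angles = np.arccos(cos[i, j])

# 8-bin histogram descriptor (bin count assumed)
hist, _ = np.histogram(angles, bins=8, range=(0, np.pi), density=True)
print("angle histogram feature:", np.round(hist, 2))
```

Intuitively, people converging or dispersing produce characteristic angle distributions, which is what makes such a descriptor informative for multi-person events.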
Abstract:
This paper presents a low-bandwidth multi-robot communication system designed to serve as a backup communication channel in the event that a robot suffers a network device fault. While much research has been performed on distributing network communication across the multiple robots within a system, individual robots remain susceptible to hardware failure. In the past, such robots would simply be removed from service and their tasks re-allocated to other members. However, there are times when a faulty robot might be crucial to a mission, or might be able to contribute in a less communication-intensive role. By allowing robots to encode and decode messages into unique sequences of DTMF symbols, called words, our system is able to facilitate continued low-bandwidth communication between robots without access to network communication. Our results show that the system is capable of permitting robots to negotiate task initiation and termination, and is flexible enough to permit a pair of robots to perform a simple turn-taking task.
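The encoding step can be pictured as follows: each DTMF symbol is the sum of one low-band and one high-band sine from the standard DTMF frequency tables, and a word is a concatenation of symbol tones. The symbol duration, sampling rate and the example word below are assumptions, not the paper's protocol.

```python
import numpy as np

# standard DTMF row (low) and column (high) frequencies in Hz
LOW  = {"1": 697, "2": 697, "3": 697, "4": 770, "5": 770,
        "6": 770, "7": 852, "8": 852, "9": 852, "0": 941}
HIGH = {"1": 1209, "2": 1336, "3": 1477, "4": 1209, "5": 1336,
        "6": 1477, "7": 1209, "8": 1336, "9": 1477, "0": 1336}

def dtmf_word(symbols, fs=8000, dur=0.1):
    """Concatenate DTMF tones for a sequence of symbols (a 'word')."""
    t = np.arange(int(fs * dur)) / fs
    return np.concatenate([
        np.sin(2 * np.pi * LOW[s] * t) + np.sin(2 * np.pi * HIGH[s] * t)
        for s in symbols])

signal = dtmf_word("4217")    # hypothetical "start task" word
print(signal.shape)           # samples ready to play through a speaker
```

Decoding on the receiving robot would detect the two dominant frequencies per frame (e.g. with a Goertzel filter) and look the symbol up in the same tables.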
Abstract:
Abnormal event detection has attracted a lot of attention in the computer vision research community in recent years due to the increased focus on automated surveillance systems to improve security in public places. Owing to the scarcity of training data and the fact that the definition of an abnormality depends on context, abnormal event detection is generally formulated as a data-driven approach in which activities are modeled in an unsupervised fashion during the training phase. In this work, we use a Gaussian mixture model (GMM) to cluster the activities during the training phase, and propose a Gaussian mixture model based Markov random field (GMM-MRF) to estimate the likelihood scores of new videos in the testing phase. Furthermore, we propose two new features, optical acceleration and the histogram of optical flow gradients, to detect the presence of abnormal objects and speed violations in the scene. We show that our proposed method outperforms other state-of-the-art abnormal event detection algorithms on the publicly available UCSD dataset.
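The train/test formulation described maps naturally onto a short sketch: fit a GMM to features of normal activity, then flag test samples whose likelihood falls below a threshold. The synthetic features and the threshold are stand-ins for motion descriptors such as optical-flow statistics, and the paper's MRF layer is omitted here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train = rng.normal(0, 1, size=(500, 4))        # "normal" activity features
test = np.vstack([rng.normal(0, 1, (5, 4)),    # normal-looking samples
                  rng.normal(6, 1, (3, 4))])   # abnormal, e.g. speeding

# unsupervised training phase: cluster normal activities with a GMM
gmm = GaussianMixture(n_components=3, random_state=0).fit(train)

# testing phase: per-sample log-likelihood under the learned model
scores = gmm.score_samples(test)
threshold = np.percentile(gmm.score_samples(train), 1)  # assumed cut-off
print("abnormal:", scores < threshold)
```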
Abstract:
DNA double-strand breaks (DSBs) are particularly lethal and genotoxic lesions that can arise either through endogenous (physiological or pathological) processes or through exogenous factors, particularly ionizing radiation and radiomimetic compounds. Phosphorylation of the H2A histone variant H2AX at the serine-139 residue, in the highly conserved C-terminal SQEY motif, forming γH2AX, is an early response to DNA double-strand breaks [1]. This phosphorylation event is mediated by the phosphatidylinositol 3-kinase (PI3K) family of proteins: ataxia telangiectasia mutated (ATM), the DNA-dependent protein kinase catalytic subunit, and ATM and RAD3-related (ATR) [2]. Overall, DSB induction results in the formation of discrete nuclear γH2AX foci which can be easily detected and quantitated by immunofluorescence microscopy [2]. Given the unique specificity and sensitivity of this marker, analysis of γH2AX foci has led to a wide range of applications in biomedical research, particularly in radiation biology and nuclear medicine. The quantitation of γH2AX foci has been most widely investigated in cell culture systems in the context of ionizing radiation-induced DSBs. Apart from cellular radiosensitivity, immunofluorescence-based assays have also been used to evaluate the efficacy of radiation-modifying compounds. In addition, γH2AX has been used as a molecular marker to examine the efficacy of various DSB-inducing compounds and has recently been heralded as an important marker of ageing and disease, particularly cancer [3]. Further, immunofluorescence-based methods have been adapted to suit the detection and quantitation of γH2AX foci ex vivo and in vivo [4,5]. Here, we demonstrate a typical immunofluorescence method for the detection and quantitation of γH2AX foci in mouse tissues.
Abstract:
Today’s information systems log vast amounts of data. These collections of data (implicitly) describe events (e.g. placing an order or taking a blood test) and hence provide information on the actual execution of business processes. The analysis of such data provides an excellent starting point for business process improvement. This is the realm of process mining, an area which has provided a repertoire of many analysis techniques. Despite the impressive capabilities of existing process mining algorithms, dealing with the abundance of data recorded by contemporary systems and devices remains a challenge. Of particular importance is the capability to guide the meaningful interpretation of “oceans of data” by process analysts. To this end, insights from the field of visual analytics can be leveraged. This article proposes an approach in which process states are reconstructed from event logs and visualised in succession, leading to an animated history of a process. The approach is customisable in how a process state, partially defined through a collection of activity instances, is visualised: one can select a map and specify a projection of events onto this map based on the properties of the events. The paper describes a comprehensive implementation of the proposal, realised using the open-source process mining framework ProM, and reports on an evaluation of the approach conducted with Suncorp, one of Australia’s largest insurance companies.
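The state-reconstruction idea can be sketched in a few lines: given start and complete events, the process state at time t is the set of activity instances active at t, and replaying increasing values of t yields the animation frames. The column names and the toy log below are assumptions, not ProM's data model.

```python
import pandas as pd

# toy event log with activity-instance lifespans (column names assumed)
log = pd.DataFrame({
    "case":     ["c1", "c1", "c2", "c2"],
    "activity": ["triage", "assess", "triage", "assess"],
    "start":    pd.to_datetime(["09:00", "09:30", "09:10", "09:50"]),
    "complete": pd.to_datetime(["09:20", "10:00", "09:40", "10:30"]),
})

def state_at(t):
    """Activity instances active at time t: started but not yet completed."""
    t = pd.to_datetime(t)
    active = log[(log["start"] <= t) & (log["complete"] > t)]
    return list(zip(active["case"], active["activity"]))

for t in ["09:15", "09:35", "09:55"]:     # successive animation frames
    print(t, state_at(t))
```

Projecting each active instance onto a user-chosen map (e.g. by location or department attributes of the events) then gives the customisable animated view described.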
Abstract:
Human resources are often responsible for the execution of business processes. In order to evaluate resource performance and identify best practices as well as opportunities for improvement, managers need objective information about resource behaviour. Companies often use information systems to support their processes, and these systems record information about process execution in event logs. We present a framework for analysing and evaluating resource behaviour through mining such event logs. The framework provides a method for extracting descriptive information about resource skills, utilisation, preferences, productivity and collaboration patterns; a method for analysing relationships between different resource behaviours and outcomes; and a method for evaluating overall resource productivity, tracking its changes over time and comparing it with the productivity of other resources. To demonstrate the applicability of our framework, we apply it to analyse the behaviour of employees in an Australian company, and we evaluate its usefulness through a survey of managers in industry.
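One of the descriptive metrics mentioned, resource utilisation, can be illustrated as busy time over an observation window. The column names and the eight-hour window below are assumptions for the sketch, not the framework's exact definitions.

```python
import pandas as pd

# toy event log with per-resource work items (column names assumed)
log = pd.DataFrame({
    "resource": ["alice", "alice", "bob"],
    "start":    pd.to_datetime(["09:00", "11:00", "09:30"]),
    "complete": pd.to_datetime(["10:30", "12:00", "13:00"]),
})

window = pd.Timedelta(hours=8)                 # assumed working day
busy = (log["complete"] - log["start"]).groupby(log["resource"]).sum()
print((busy / window).rename("utilisation"))   # fraction of the day busy
```

Computed per week or per month, the same ratio supports the framework's tracking of productivity changes over time and comparisons across resources.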
Abstract:
Oscillations of neural activity may bind widespread cortical areas into a neural representation that encodes disparate aspects of an event. To test this theory, we have turned to data collected from complex partial epilepsy (CPE) patients with chronically implanted depth electrodes. Data from regions critical to word and face information processing were analyzed using spectral coherence measurements. Similar analyses of intracranial EEG (iEEG) during seizure episodes reveal hippocampal formation (HCF) to neocortical (NC) spectral coherence patterns that are characteristic of specific seizure stages (Klopp et al. 1996). We are now building a computational memory model to examine whether the spatio-temporal patterns of human iEEG spectral coherence emerge in a computer simulation of HCF cellular distribution, membrane physiology and synaptic connectivity. Once the model is reasonably scaled, it will be used as a tool to explore neural parameters that are critical to memory formation and epileptogenesis.
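The core measurement here, spectral coherence between two recording sites, can be sketched with scipy: magnitude-squared coherence between two channels sharing a common oscillatory drive. The synthetic signals, sampling rate and frequency band below are illustrative assumptions, not the study's recordings.

```python
import numpy as np
from scipy.signal import coherence

fs = 256                                   # assumed iEEG sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
shared = np.sin(2 * np.pi * 8 * t)         # common 8 Hz (theta-band) drive
x = shared + 0.5 * np.random.randn(t.size) # stand-in for an HCF channel
y = shared + 0.5 * np.random.randn(t.size) # stand-in for an NC channel

# magnitude-squared coherence as a function of frequency
f, Cxy = coherence(x, y, fs=fs, nperseg=512)
band = (f >= 4) & (f <= 12)
print("peak coherence in 4-12 Hz:", Cxy[band].max().round(2))
```

High coherence in a band indicates a consistent phase and amplitude relationship between the two sites at that frequency, which is what makes the measure suitable for characterising HCF-NC coupling across seizure stages.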