956 results for Event data recorders.
Abstract:
We have searched for periodic variations of the electronic recoil event rate in the (2-6) keV energy range recorded between February 2011 and March 2012 with the XENON100 detector, adding up to 224.6 live days in total. Following a detailed study to establish the stability of the detector and its background contributions during this run, we performed an unbinned profile likelihood analysis to identify any periodicity up to 500 days. We find a global significance of less than 1 sigma for all periods, suggesting no statistically significant modulation in the data. While the local significance for an annual modulation is 2.8 sigma, the analysis of a multiple-scatter control sample and the phase of the modulation disfavor a dark matter interpretation. The DAMA/LIBRA annual modulation, interpreted as a dark matter signature with axial-vector coupling of WIMPs to electrons, is excluded at 4.8 sigma.
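The period scan described above can be illustrated with a toy fit. This is a hypothetical sketch on synthetic data, not the XENON100 analysis: it replaces the unbinned profile likelihood with a simple linear least-squares sinusoid fit at each trial period, and the rates, amplitudes, and grid choices are invented for illustration.

```python
import numpy as np

# Hypothetical sketch on synthetic data: scan trial periods and fit
# rate(t) = C + a*cos(2*pi*t/P) + b*sin(2*pi*t/P) by linear least squares.
# The real analysis is an unbinned profile likelihood; this only
# illustrates the idea of scanning periods up to 500 days.

rng = np.random.default_rng(0)
t = np.linspace(0, 800, 400)          # observation times in days (invented)
true_period = 365.25                  # annual modulation
rate = (5.0 + 0.3 * np.cos(2 * np.pi * t / true_period)
        + rng.normal(0, 0.05, t.size))   # synthetic event rate with noise

def fit_amplitude(t, y, period):
    """Best-fit modulation amplitude at one trial period."""
    X = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / period),
                         np.sin(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.hypot(coef[1], coef[2]))  # sqrt(a^2 + b^2)

periods = np.linspace(100, 500, 81)   # trial periods on a 5-day grid
amps = np.array([fit_amplitude(t, rate, p) for p in periods])
best_period = float(periods[np.argmax(amps)])
```

In the real analysis the best-fit amplitude at each period is then compared against the no-modulation hypothesis, which is where the quoted local and global significances come from.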
Abstract:
Adult male and female Weddell seals (Leptonychotes weddellii) were fitted with time-depth recorders (TDRs) at Drescher Inlet (Riiser Larsen Ice Shelf), eastern Weddell Sea coast, in February 1998. Eight of 15 data sets were selected for analyses to investigate the seals' foraging behaviour (doi:10.1594/PANGAEA.511465, doi:10.1594/PANGAEA.511454, doi:10.1594/PANGAEA.511456, doi:10.1594/PANGAEA.511457, doi:10.1594/PANGAEA.511459, doi:10.1594/PANGAEA.511462, doi:10.1594/PANGAEA.511466, doi:10.1594/PANGAEA.511467). These data sets provided simultaneous dive records of eight seals over eight days. The seals primarily foraged within two depth layers: from the sea surface to 160 m, where temperature and salinity varied considerably, and from 340 to 450 m near the bottom, where temperature was lowest and salinity highest, with little variation. While pelagic and benthic diving occurred during daylight, the seals foraged almost exclusively in the upper water column at night. Trawling during daytime confirmed that Pleuragramma antarcticum were by far the most abundant fish both in the pelagial and close to the bottom. Pelagic night-hauls at 110-170 m depth showed highly variable biomass of P. antarcticum, with a peak at around midnight. The temporal changes in the local abundance of P. antarcticum, particularly in the pelagial, may explain the trends in the seals' pelagic and benthic foraging activities. This is the first study to describe the jaw movements of a hunting seal, which are presumably indicative of feeding events. Trophic links from the Weddell seal to fish, zooplankton and krill, Euphausia superba, are discussed. Another seven data sets did not overlap substantially with the selected time frame (doi:10.1594/PANGAEA.511458, doi:10.1594/PANGAEA.511460, doi:10.1594/PANGAEA.511464, doi:10.1594/PANGAEA.511468, doi:10.1594/PANGAEA.511469, doi:10.1594/PANGAEA.511453, doi:10.1594/PANGAEA.511463).
A total of 25 Weddell seals were immobilised during the study period using a combination of ketamine, xylazine, and diazepam. Seven seals were immobilised once, 15 twice, and three were immobilised three times, for a total of 46 immobilisation procedures. Narcoses were terminated with yohimbine (doi:10.1594/PANGAEA.438933).
Abstract:
To date, big data applications have focused on the store-then-process paradigm. In this paper we describe an initiative to deal with big data applications for continuous streams of events. In many emerging applications, the volume of data being streamed is so large that the traditional store-then-process paradigm is either not suitable or too inefficient. Moreover, soft real-time requirements might severely limit the engineering solutions. Many scenarios fit this description. In network security for cloud data centres, for instance, very high volumes of IP packets and events from sensors at firewalls, network switches, routers, and servers must be analyzed to detect attacks in minimal time, in order to limit the effect of the malicious activity on the IT infrastructure. Similarly, in the fraud department of a credit card company, payment requests must be processed online, as quickly as possible, in order to provide meaningful results in real time. An ideal system would detect fraud during the authorization process, which lasts hundreds of milliseconds, and deny the payment authorization, minimizing the damage to the user and the credit card company.
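The contrast with store-then-process can be sketched with a minimal streaming detector. The event shape, thresholds, and the velocity-based rule below are assumptions for illustration, not the system described in the paper: each payment request is scored as it arrives, keeping only a small sliding window of state per card.

```python
from collections import defaultdict, deque

# Hypothetical sketch of the process-as-it-arrives paradigm: flag a card that
# issues more than MAX_REQUESTS payment requests inside a WINDOW_MS window.
# Card IDs, thresholds and the event shape are invented for illustration.

WINDOW_MS = 1_000
MAX_REQUESTS = 3

class FraudDetector:
    def __init__(self):
        self.recent = defaultdict(deque)          # card_id -> timestamps

    def on_event(self, card_id, ts_ms):
        """Score one payment request; True means 'looks suspicious'."""
        window = self.recent[card_id]
        window.append(ts_ms)
        while window and ts_ms - window[0] > WINDOW_MS:
            window.popleft()                      # evict expired timestamps
        return len(window) > MAX_REQUESTS

detector = FraudDetector()
events = [("card-1", t) for t in (0, 100, 200, 300, 400)] + [("card-2", 0)]
flags = [detector.on_event(card, ts) for card, ts in events]
```

State stays bounded per card, so events never need to be stored in full before they are processed.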
Abstract:
Traffic flow time series data are usually high dimensional and very complex. They are also sometimes imprecise and distorted due to data collection sensor malfunction. Additionally, events like congestion caused by traffic accidents add more uncertainty to real-time traffic conditions, making traffic flow forecasting a complicated task. This article presents a new data preprocessing method targeting multidimensional time series with a very high number of dimensions and shows its application to real traffic flow time series from the California Department of Transportation (PeMS web site). The proposed method consists of three main steps. First, based on mTESL, a language for defining events in multidimensional time series, we identify a number of event types in time series that correspond to either incorrect data or data with interference. Second, each event type is restored utilizing an original method that combines real observations, locally forecasted values and historical data. Third, an exponential smoothing procedure is applied globally to eliminate noise interference and other random errors so as to provide good quality source data for future work.
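The third step, global exponential smoothing, is standard and can be sketched directly. The smoothing factor and the sample series below are illustrative, and the mTESL-based event identification and restoration steps, which are specific to the article, are not reproduced.

```python
# Minimal sketch of the third step only: simple exponential smoothing applied
# globally to a flow series. The alpha value and sample data are illustrative.

def exponential_smoothing(series, alpha=0.3):
    """s[0] = x[0]; s[t] = alpha * x[t] + (1 - alpha) * s[t-1]."""
    smoothed = [series[0]]
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

flow = [100, 102, 250, 101, 99, 103]   # 250 mimics a sensor spike
smooth = exponential_smoothing(flow)   # the spike is strongly damped
```

Lower alpha values damp transient spikes more aggressively, at the cost of lagging behind genuine changes in the flow level.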
Abstract:
The EPA promulgated the Exceptional Events Rule, codifying guidance regarding the exclusion of monitoring data from compliance decisions due to uncontrollable natural or exceptional events. This capstone examines the documentation systems utilized by agencies requesting that data be excluded from compliance decisions due to exceptional events. A screening tool is developed to determine whether an event would meet exceptional event criteria. New data sources are available to enhance analysis, but evaluation shows many are unusable in their current form. The EPA and states must collaborate to develop consistent evaluation methodologies for documenting exceptional events to improve the efficiency and effectiveness of the new rule. To utilize newer, sophisticated data, consistent, user-friendly translation systems must be developed.
Abstract:
There is consensus among community and road safety agencies that driver fatigue is a major road safety issue, and it is well known that excessive fatigue is linked with an increased risk of a motor vehicle crash. Previous research has implicated a wide variety of factors involved in fatigue-related crashes, and the effects of these various factors on crash risk can be interpreted as causal (i.e. alcohol and/or drugs may induce fatigue states) or additive (e.g. where a lack of sleep is combined with alcohol). As such, the purpose of this investigation was to examine self-report data to determine whether there are any differences in the prevalence, crash characteristics, and travel patterns of males and females involved in a fatigue-related crash or close call event. Such research is important to understand how fatigue-related incidents occur within the typical driving patterns of men and women, and it provides a starting point to explore whether males and females experience and understand the risk of driving when tired in the same way. A representative sample (N = 1,600) of residents living in the Australian Capital Territory (ACT) and New South Wales (NSW), Australia, were surveyed regarding their experience of fatigue and their involvement in fatigue-related crashes and close call incidents. Results revealed that over 35% of participants reported having had a close call or crash due to driving when tired in the five years prior to the study being conducted. In addition, the results obtained revealed a number of interesting characteristics that provide preliminary evidence that gender differences do exist when examining the prevalence, crash characteristics, and travel patterns of males and females involved in a fatigue-related crash or close call event. It is argued that the results obtained can provide particularly useful information for the refinement and further development of appropriate countermeasures that better target this complex issue.
Abstract:
With the increasing growth of cultural events both in Australia and internationally, there has also been an increase in event management studies, in theory and in practice. Although a series of related knowledge and skills required specifically by event managers has already been identified by many researchers (Perry et al., 1996; Getz, 2002; Silvers et al., 2006) and generic event management models have been proposed, including ‘project management’ strategies in an event context (Getz, 2007), knowledge gaps still exist in relation to identifying specific types of events, especially not-for-profit arts events. For events of a largely voluntary nature, insufficient resources, including finance, human resources and infrastructure, are recognised as the most challenging issue. Therefore, the concepts and principles adopted by large-scale commercial events may not be suitable for not-for-profit arts events aiming at providing professional network opportunities for artists. Building partnerships is identified as a key strategy in developing an effective event management model for this type of event. Using the 2008 World Dance Alliance Global Summit (WDAGS) in Brisbane, 13-18 July, as a case study, the level, nature and relationship of key partners are investigated. Data are triangulated from interviews with organisers of the 2008 WDAGS, online and email surveys of delegates, participant observation and analysis of formal and informal documents, to produce a management model suited to this kind of event.
Abstract:
Emerging data streaming applications in Wireless Sensor Networks require reliable and energy-efficient transport protocols. Our recent Wireless Sensor Network deployment in the Burdekin delta, Australia, for water monitoring [T. Le Dinh, W. Hu, P. Sikka, P. Corke, L. Overs, S. Brosnan, Design and deployment of a remote robust sensor network: experiences from an outdoor water quality monitoring network, in: Second IEEE Workshop on Practical Issues in Building Sensor Network Applications (SenseApp 2007), Dublin, Ireland, 2007] is one such example. This application involves streaming sensed data such as pressure, water flow rate, and salinity periodically from many scattered sensors to the sink node, which in turn relays them via an IP network to a remote site for archiving, processing, and presentation. While latency is not a primary concern in this class of application (the sampling rate is usually in terms of minutes or hours), energy efficiency is. Continuous long-term operation and reliable delivery of the sensed data to the sink are also desirable. This paper proposes ERTP, an Energy-efficient and Reliable Transport Protocol for Wireless Sensor Networks. ERTP is designed for data streaming applications in which sensor readings are transmitted from one or more sensor sources to a base station (or sink). ERTP uses a statistical reliability metric which ensures that the number of data packets delivered to the sink exceeds the defined threshold. Our extensive discrete event simulations and experimental evaluations show that ERTP is significantly more energy-efficient than current approaches, reducing energy consumption by more than 45%. Consequently, sensor nodes are more energy-efficient and the lifespan of the unattended WSN is increased.
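A statistical delivery target of the kind ERTP uses can be illustrated with a back-of-the-envelope calculation. The per-hop loss model below (independent losses with a fixed per-hop success probability) is an assumption for illustration, not ERTP's actual mechanism: it picks the smallest per-hop retransmission limit that meets an end-to-end delivery-ratio threshold.

```python
# Back-of-the-envelope sketch of a statistical (rather than per-packet)
# reliability target: find the smallest per-hop retransmission limit r such
# that the expected end-to-end delivery ratio over `hops` hops meets `target`.
# The independent-loss, fixed-success-probability model is an assumption.

def min_retransmissions(p_hop, hops, target):
    """Smallest r with (1 - (1 - p_hop)**r)**hops >= target."""
    r = 1
    while (1 - (1 - p_hop) ** r) ** hops < target:
        r += 1
        if r > 100:
            raise ValueError("target unreachable within 100 retries")
    return r

# e.g. 80% per-hop success over 5 hops, aiming at 95% end-to-end delivery
r = min_retransmissions(p_hop=0.8, hops=5, target=0.95)
```

Capping retransmissions at the computed limit, rather than insisting on per-packet end-to-end acknowledgement, is what saves energy when only a statistical delivery guarantee is required.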
Abstract:
This dissertation develops the model of a prototype system for the digital lodgement of spatial data sets with statutory bodies responsible for the registration and approval of land related actions under the Torrens Title system. Spatial data pertain to the location of geographical entities together with their spatial dimensions and are classified as point, line, area or surface. This dissertation deals with a sub-set of spatial data, land boundary data that result from the activities performed by surveying and mapping organisations for the development of land parcels. The prototype system has been developed, utilising an event-driven paradigm for the user-interface, to exploit the potential of digital spatial data being generated from the utilisation of electronic techniques. The system provides for the creation of a digital model of the cadastral network and dependent data sets for an area of interest from hard copy records. This initial model is calibrated on registered control and updated by field survey to produce an amended model. The field-calibrated model then is electronically validated to ensure it complies with standards of format and content. The prototype system was designed specifically to create a database of land boundary data for subsequent retrieval by land professionals for surveying, mapping and related activities. Data extracted from this database are utilised for subsequent field survey operations without the need to create an initial digital model of an area of interest. Statistical reporting of differences resulting when subsequent initial and calibrated models are compared, replaces the traditional checking operations of spatial data performed by a land registry office. Digital lodgement of survey data is fundamental to the creation of the database of accurate land boundary data. 
This creation of the database is fundamental also to the efficient integration of accurate spatial data about land being generated by modern technology, such as global positioning systems, remote sensing and imaging, with land boundary information and other information held in Government databases. The prototype system developed provides for the delivery of accurate, digital land boundary data for the land registration process to ensure the continued maintenance of the integrity of the cadastre. Such data should also meet the more general and encompassing requirements of, and prove to be of tangible, longer-term benefit to, the developing electronic land information industry.
Abstract:
This paper describes and analyses the procurement processes employed in delivering the Sydney Olympic Stadium, arguably the most significant stadium project in the region today. This current high-profile project is discussed in terms of a case study into the procurement processes used. Interviews, personal site visits and questionnaires were used to obtain information on the procurement processes used and comments on their application to the project. The alternative procurement process used on this project, Design and Construction within a Build, Own, Operate and Transfer (BOOT) project, is likely to impact on the construction industry as a whole. Already other projects and sectors are following this lead. Based on a series of on-site interviews and questionnaires, a series of benefits and drawbacks to this procurement strategy are provided. The Olympic Stadium project has also been further analysed during construction through a Degree of Interaction framework to determine anticipated project success. This analysis investigates project interaction and user satisfaction to provide a comparable rating. A series of questionnaires were used to collect data to calculate the Degree of Interaction and User Satisfaction ratings.
Abstract:
Unusual event detection in crowded scenes remains challenging because of the diversity of events and noise. In this paper, we present a novel approach for unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic texture described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute the sparse coefficients using the Dantzig Selector algorithm, which was proposed in the compressed sensing literature. The reconstruction errors are then computed, based on which we detect the abnormal items. Our approach can be used to detect both local and global abnormal events. We evaluate our algorithm on the UCSD Abnormality Datasets for local anomaly detection, where it outperforms current state-of-the-art approaches, and we also obtain promising results for rapid escape detection using the PETS2009 dataset.
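The detection principle, that normal observations reconstruct well over the learnt basis while unusual ones leave a large residual, can be sketched on toy vectors. This sketch substitutes a tiny orthogonal matching pursuit solver for the Dantzig Selector and random vectors for LBP-TOP features, so the dictionary, sparsity level, and data are all assumptions.

```python
import numpy as np

# Toy sketch: a tiny orthogonal matching pursuit (OMP) solver stands in for
# the Dantzig Selector, and random vectors stand in for LBP-TOP features.
# Dictionary D, sparsity k and the two test vectors are all assumptions.

def omp(D, y, k):
    """Greedily select k atoms of D that best reconstruct y."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    return support, residual

rng = np.random.default_rng(1)
D = rng.normal(size=(20, 10))
D /= np.linalg.norm(D, axis=0)              # unit-norm dictionary atoms

normal_obs = 1.5 * D[:, 2] + 0.5 * D[:, 7]  # lies in the span of two atoms
unusual_obs = rng.normal(size=20)           # random vector: reconstructs poorly

def recon_error(y, k=2):
    """Sparse reconstruction error used as the abnormality score."""
    _, residual = omp(D, y, k)
    return float(np.linalg.norm(residual))

e_normal, e_unusual = recon_error(normal_obs), recon_error(unusual_obs)
```

An observation would then be flagged as unusual when its reconstruction error exceeds a threshold calibrated on normal training data.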
Abstract:
INTRODUCTION: Workforce planning for first aid and medical coverage of mass gatherings is hampered by limited research. In particular, the characteristics and likely presentation patterns of low-volume mass gatherings of between several hundred to several thousand people are poorly described in the existing literature. OBJECTIVES: This study was conducted to: 1. Describe key patient and event characteristics of medical presentations at a series of mass gatherings, including events smaller than those previously described in the literature; 2. Determine whether event type and event size affect the mean number of patients presenting for treatment per event, and specifically, whether the 1:2,000 deployment rule used by St John Ambulance Australia is appropriate; and 3. Identify factors that are predictive of injury at mass gatherings. METHODS: A retrospective, observational, case-series design was used to examine all cases treated by two Divisions of St John Ambulance (Queensland) in the greater metropolitan Brisbane region over a three-year period (01 January 2002-31 December 2004). Data were obtained from routinely collected patient treatment forms completed by St John officers at the time of treatment. Event-related data (e.g., weather, event size) were obtained from event forms designed for this study. Outcome measures include: total and average number of patient presentations for each event; event type; and event size category. Descriptive analyses were conducted using chi-square tests, and mean presentations per event and event type were investigated using Kruskal-Wallis tests. Logistic regression analyses were used to identify variables independently associated with injury presentation (compared with non-injury presentations). RESULTS: Over the three-year study period, St John Ambulance officers treated 705 patients over 156 separate events. 
The mean number of patients who presented with any medical condition at small events (≤2,000 attendees) did not differ significantly from that at large events (>2,000 attendees) (4.44 vs. 4.67, F = 0.72, df = 1, 154, p = 0.79). Logistic regression analyses indicated that presentation with an injury, compared with non-injury, was independently associated with male gender, winter season, and sporting events, even after adjusting for relevant variables. CONCLUSIONS: In this study of low-volume mass gatherings, a similar number of patients sought medical treatment at small (≤2,000 patrons) and large (>2,000 patrons) events. This demonstrates that for low-volume mass gatherings, planning based solely on anticipated event size may be flawed and could lead to inappropriate levels of first-aid coverage. This study also highlights the importance of considering other factors, such as event type and patient characteristics, when determining appropriate first-aid resourcing for low-volume events. Additionally, identification of factors predictive of injury presentations at mass gatherings has the potential to significantly enhance the ability of event coordinators to plan effective prevention strategies and response capability for these events.
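The logistic-regression step can be illustrated with a self-contained sketch. The data below are simulated and every coefficient is invented, not the study's estimates: injury presentation is modelled from binary indicators for male gender, winter season, and sporting event, and the fitted coefficients are read off as odds ratios.

```python
import numpy as np

# Illustrative sketch only: simulated data, invented coefficients. Injury
# (vs. non-injury) presentation is modelled from binary indicators for
# male gender, winter season, and sporting event.

rng = np.random.default_rng(2)
n = 2000
X = rng.integers(0, 2, size=(n, 3)).astype(float)   # male, winter, sporting
true_beta = np.array([0.8, 0.6, 1.0])               # assumed log-odds effects
logits = -1.0 + X @ true_beta
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

def fit_logistic(X, y, lr=0.5, steps=3000):
    """Batch gradient ascent on the logistic log-likelihood (with intercept)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(Xb @ beta)))
        beta += lr * Xb.T @ (y - p) / len(y)
    return beta

beta_hat = fit_logistic(X, y)
odds_ratios = np.exp(beta_hat[1:])   # >1 means the factor raises injury odds
```

Reporting exponentiated coefficients as odds ratios, adjusted for the other predictors, is the standard way such "independently associated" findings are expressed.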