406 results for Acoustic event classification
Abstract:
Load in distribution networks is normally measured at the 11 kV supply points; little or no information is available about the types of customers and their contributions to the load. This paper proposes statistical methods to decompose an unknown distribution feeder load into its customer load sector/subsector profiles. The approach used in this paper should assist electricity suppliers in economic load management, strategic planning and future network reinforcements.
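The decomposition idea can be sketched as a least-squares problem: given known per-sector load profiles, recover each sector's contribution to an aggregate feeder measurement. This is a minimal illustration with hypothetical profiles and a synthetic feeder load, not the paper's actual statistical method:

```python
import numpy as np

# Hypothetical hourly profiles (per-unit) for three customer sectors
hours = np.arange(24)
domestic = 0.6 + 0.4 * np.exp(-((hours - 19) ** 2) / 8.0)  # evening peak
commercial = 0.3 + 0.7 * ((hours >= 9) & (hours <= 17))    # office hours
industrial = np.full(24, 0.9)                              # flat base load

P = np.column_stack([domestic, commercial, industrial])

# Synthetic feeder load: a known mix of the three sector profiles
true_mix = np.array([120.0, 80.0, 50.0])  # e.g. kW scaling per sector
feeder = P @ true_mix

# Recover the sector contributions by ordinary least squares
est_mix, *_ = np.linalg.lstsq(P, feeder, rcond=None)
print(np.round(est_mix, 1))
```

With noiseless synthetic data the mix is recovered exactly; real feeder measurements would require noise handling and constraints (e.g. non-negative contributions).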
Abstract:
Structural health monitoring (SHM) refers to the procedure used to assess the condition of structures so that their performance can be monitored and any damage detected early. Early detection of damage and appropriate retrofitting help prevent failure of the structure, save money spent on maintenance or replacement, and ensure the structure operates safely and efficiently during its whole intended life. Though visual inspection and other techniques such as vibration-based ones are available for SHM of structures such as bridges, the acoustic emission (AE) technique is an attractive option and is increasing in use. AE waves are high-frequency stress waves generated by the rapid release of energy from localised sources within a material, such as crack initiation and growth. The AE technique involves recording these waves by means of sensors attached to the surface and then analysing the signals to extract information about the nature of the source. High sensitivity to crack growth, the ability to locate the source, its passive nature (no need to supply energy from outside, as energy from the damage source itself is utilised) and the possibility of real-time monitoring (detecting a crack as it occurs or grows) are some of the attractive features of the AE technique. In spite of these advantages, challenges still exist in using the AE technique for monitoring applications, especially in the analysis of recorded AE data, as large volumes of data are usually generated during monitoring. The need for effective data analysis can be linked with three main aims of monitoring: (a) accurately locating the source of damage; (b) identifying and discriminating signals from different sources of acoustic emission; and (c) quantifying the level of damage of an AE source for severity assessment. In the AE technique, the location of the emission source is usually calculated using the times of arrival and velocities of the AE signals recorded by a number of sensors.
Complications arise, however, because AE waves can travel through a structure in a number of different modes with different velocities and frequencies. Hence, to locate a source accurately it is necessary to identify the modes recorded by the sensors. This study proposed and tested the use of time-frequency analysis tools, such as the short-time Fourier transform, to identify the modes and the use of the velocities of these modes to achieve very accurate results. Further, this study explored the possibility of reducing the number of sensors needed for data capture by using the velocities of modes captured by a single sensor for source localisation. A major problem in the practical use of the AE technique is the presence of AE sources other than crack-related ones, such as rubbing and impacts between different components of a structure. These spurious AE signals often mask the signals from crack activity; hence, discriminating signals to identify their sources is very important. This work developed a model that uses different signal processing tools, such as cross-correlation, magnitude squared coherence and energy distribution in different frequency bands, as well as modal analysis (comparing amplitudes of identified modes), to accurately differentiate signals from different simulated AE sources. Quantification tools to assess the severity of damage sources are highly desirable in practical applications. Though different damage quantification methods have been proposed for the AE technique, not all have achieved universal approval or been shown suitable for all situations. The b-value analysis, which involves studying the distribution of amplitudes of AE signals, and its modified form (known as improved b-value analysis) were investigated for suitability for damage quantification in ductile materials such as steel.
This was found to give encouraging results for the analysis of laboratory data, thereby extending the possibility of its use to real-life structures. By addressing these primary issues, it is believed that this thesis has helped improve the effectiveness of the AE technique for structural health monitoring of civil infrastructure such as bridges.
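The time-of-arrival localisation principle mentioned in the abstract can be illustrated in one dimension: two sensors bracket a source, and the arrival-time difference of a mode of known velocity fixes the source position. This is a toy sketch with illustrative values, not the thesis's modal-analysis procedure:

```python
# Linear (1-D) AE source localisation from an arrival-time difference.
# v is the (assumed known) velocity of the identified wave mode.
v = 5000.0           # m/s, e.g. a longitudinal mode in steel (illustrative)
x1, x2 = 0.0, 2.0    # sensor positions (m)

source = 0.7         # true source position (m), used to synthesise arrivals
t1 = abs(source - x1) / v
t2 = abs(source - x2) / v

# For a source between the sensors:
#   t1 - t2 = (2x - x1 - x2) / v   =>   x = (x1 + x2)/2 + v*(t1 - t2)/2
x_est = (x1 + x2) / 2 + v * (t1 - t2) / 2
print(round(x_est, 3))  # 0.7
```

The thesis's contribution is precisely that this only works once the recorded mode (and hence its velocity) has been correctly identified, e.g. via short-time Fourier analysis.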
Abstract:
This technical report describes the methods used to obtain a list of acoustic indices that are used to characterise the structure and distribution of acoustic energy in recordings of the natural environment. In particular, it describes methods for noise reduction in recordings of the environment and a fast clustering algorithm used to estimate the spectral richness of long recordings.
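A common form of spectrogram noise reduction for environmental recordings is per-frequency-bin background subtraction. The following is a minimal sketch on a toy spectrogram (the report's actual noise-removal procedure may differ; the median-based background estimate here is an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spectrogram: rows = frequency bins, columns = time frames.
# Background noise plus a brief narrow-band "call" in bin 5.
spec = rng.normal(loc=3.0, scale=0.5, size=(10, 200))
spec[5, 80:90] += 6.0

# Per-bin noise reduction: subtract each frequency bin's median
# (an estimate of its background level) and clip at zero.
noise_profile = np.median(spec, axis=1, keepdims=True)
clean = np.clip(spec - noise_profile, 0.0, None)

# After cleaning, the bin containing the call dominates the energy.
print(np.argmax(clean.sum(axis=1)))
```

Acoustic indices (e.g. counts of spectral peaks per bin) are then computed on the noise-reduced spectrogram rather than the raw one.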
Abstract:
Acoustic sensors provide an effective means of monitoring biodiversity at large spatial and temporal scales. They can continuously and passively record large volumes of data over extended periods; however, these data must be analysed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced users can produce accurate results; however, the time and effort required to process even small volumes of data can make manual analysis prohibitive. Our research examined the use of sampling methods to reduce the cost of analysing large volumes of acoustic sensor data, while retaining high levels of species detection accuracy. Utilising five days of manually analysed acoustic sensor data from four sites, we examined a range of sampling rates and methods, including random, stratified and biologically informed. Our findings indicate that randomly selecting 120 one-minute samples from the three hours immediately following dawn provided the most effective sampling method. This method detected, on average, 62% of total species after 120 one-minute samples were analysed, compared to 34% of total species from traditional point counts. Our results demonstrate that targeted sampling methods can provide an effective means of analysing large volumes of acoustic sensor data efficiently and accurately.
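The sampling strategy can be simulated: draw random one-minute segments from the post-dawn window and count how many species are detected at least once. This toy simulation uses synthetic species-calling data, not the study's field data:

```python
import random

random.seed(1)

# Hypothetical dawn chorus: 50 species, each vocal in a random
# subset of the 180 one-minute segments after dawn.
n_species, n_minutes = 50, 180
minute_species = [set() for _ in range(n_minutes)]
for sp in range(n_species):
    for m in random.sample(range(n_minutes), k=random.randint(2, 30)):
        minute_species[m].add(sp)

# Randomly sample 120 of the 180 minutes and count species detected.
sampled = random.sample(range(n_minutes), k=120)
detected = set().union(*(minute_species[m] for m in sampled))
print(f"{len(detected)}/{n_species} species detected")
```

Varying `k` in the final sample reproduces the species-accumulation trade-off the study quantifies: more samples yield diminishing returns in newly detected species.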
Abstract:
Effective risk management is crucial for any organisation. One of its key steps is risk identification, but few tools exist to support this process. Here we present a method for the automatic discovery of a particular type of process-related risk, the danger of deadline transgressions or overruns, based on the analysis of event logs. We define a set of time-related process risk indicators, i.e., patterns observable in event logs that highlight the likelihood of an overrun, and then show how instances of these patterns can be identified automatically using statistical principles. To demonstrate its feasibility, the approach has been implemented as a plug-in module to the process mining framework ProM and tested using an event log from a Dutch financial institution.
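One simple statistical instance of a time-related risk indicator is flagging running cases whose elapsed time already exceeds a threshold derived from completed-case durations. This is an illustrative sketch, not the indicator definitions implemented in the ProM plug-in:

```python
import statistics

# Durations (in days) of completed cases from a hypothetical event log
completed = [3.0, 4.5, 2.8, 5.1, 3.9, 4.2, 3.3, 4.8, 3.6, 4.0]
mu = statistics.mean(completed)
sigma = statistics.stdev(completed)
threshold = mu + 2 * sigma  # simple statistical overrun threshold

# Elapsed times (days) of currently running cases; flag likely overruns
running = {"case-17": 2.1, "case-23": 6.9, "case-31": 3.5}
at_risk = [c for c, t in running.items() if t > threshold]
print(at_risk)  # ['case-23']
```

In a real log the threshold would be computed per process variant or per activity, since overall case durations mix very different behaviours.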
Abstract:
This article outlines the key recommendations of the Australian Law Reform Commission’s review of the National Classification Scheme, as outlined in its report Classification – Content Regulation and Convergent Media (ALRC, 2012). It identifies key contextual factors that underpin the need for reform of media classification laws and policies, including the fragmentation of regulatory responsibilities and the convergence of media platforms, content and services, as well as discussing the ALRC’s approach to law reform.
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, as the gathered information comes from the crowd, data quality is always hard to manage. There are many ways to manage data quality, and reputation management is one of the common approaches. In recent years, many research teams have deployed audio or image sensors in natural environments in order to monitor the status of animals or plants. The collected data are analysed by ecologists. However, as the amount of collected data is extremely large and the number of ecologists is very limited, it is impossible for scientists to analyse all these data manually. The functions of existing automated tools to process the data are still very limited and the results are still not very accurate. Therefore, researchers have turned to recruiting members of the general public who are interested in helping scientific research to perform pre-processing tasks such as species tagging. Although research teams can save time and money by recruiting citizens who volunteer their time and skills to help with data analysis, the reliability of contributed data varies considerably. Therefore, this research aims to investigate techniques to enhance the reliability of data contributed by general citizens in scientific research projects, especially acoustic sensing projects. In particular, we aim to investigate how to use reputation management to enhance data reliability. Reputation systems have been used to resolve uncertainty and improve data quality in many marketing and e-commerce domains. The commercial organizations that have chosen to embrace reputation management and implement the technology have gained many benefits.
Data quality issues are significant in the domain of Citizen Science due to the quantity and diversity of the people and devices involved. However, research on reputation management in this area is relatively new. We therefore start our investigation by examining existing reputation systems in different domains. We then design novel reputation management approaches for Citizen Science projects to categorise participants and data. We have investigated several critical elements which may influence data reliability in Citizen Science projects, including personal information such as location and education, and performance information such as the ability to recognise certain bird calls. The designed reputation framework is evaluated by a series of experiments involving many participants collecting and interpreting data, in particular environmental acoustic data. Our research exploring the advantages of reputation management in Citizen Science (or crowdsourcing in general) will help increase awareness among organizations that are unacquainted with its potential benefits.
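A widely used building block for such reputation systems is the beta-reputation model: a participant's expected accuracy is the mean of a Beta posterior over their verified and rejected contributions. This is a generic sketch of that model, not the framework designed in the thesis:

```python
# Minimal beta-reputation sketch: each participant accumulates counts
# of verified (r) and rejected (s) annotations; reputation is the mean
# of the Beta(r + 1, s + 1) posterior over their accuracy.
def reputation(r: int, s: int) -> float:
    """Expected accuracy under a uniform Beta(1, 1) prior."""
    return (r + 1) / (r + s + 2)

# A participant whose bird-call tags were verified 18 times and
# rejected twice:
print(round(reputation(18, 2), 3))  # 0.864

# A newcomer with no history starts at the neutral prior mean:
print(reputation(0, 0))  # 0.5
```

Scores like these can then weight contributed tags, so that agreement among high-reputation participants counts for more than raw vote counts.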
Abstract:
During a major flood event, the inundation of urban environments leads to complicated flow motion most often associated with significant sediment fluxes. In the present study, a series of field measurements were conducted in an inundated section of the City of Brisbane (Australia) around the peak of a major flood in January 2011. Some experiments were performed to use ADV backscatter amplitude as a surrogate estimate of the suspended sediment concentration (SSC) during the flood event. The flood water deposit samples were predominantly silty material with a median particle size of about 25 μm, and they exhibited a non-Newtonian behavior under rheological testing. In the inundated urban environment during the flood, estimates of suspended sediment concentration presented a general trend of increasing SSC with decreasing water depth. The suspended sediment flux data showed substantial sediment flux amplitudes consistent with the murky appearance of floodwaters. Altogether, the results highlighted the large suspended sediment loads and fluctuations in the inundated urban setting, possibly associated with non-Newtonian behavior. During the receding flood, some unusual long-period oscillations were observed (periods of about 18 min), although the cause of these oscillations remains unknown. The field deployment was conducted in challenging conditions, highlighting a number of practical issues during a natural disaster.
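Using ADV backscatter amplitude as an SSC surrogate typically rests on a calibration in which log-concentration varies roughly linearly with amplitude. The sketch below fits such a relation to hypothetical calibration pairs; the values and the linear log-space model are assumptions for illustration, not the study's calibration data:

```python
import numpy as np

# Hypothetical calibration pairs: ADV backscatter amplitude (counts)
# vs. laboratory-measured suspended sediment concentration (kg/m^3).
amp = np.array([90.0, 110.0, 130.0, 150.0, 170.0])
ssc = np.array([0.05, 0.12, 0.30, 0.75, 1.90])

# Common surrogate model: log10(SSC) is linear in backscatter amplitude.
slope, intercept = np.polyfit(amp, np.log10(ssc), 1)

def ssc_from_amp(a: float) -> float:
    """Estimate SSC (kg/m^3) from a backscatter amplitude reading."""
    return 10 ** (slope * a + intercept)

print(round(ssc_from_amp(140.0), 3))
```

Once calibrated, the continuous amplitude record from the deployed ADV yields a continuous SSC time series, which the study combined with velocity data to estimate suspended sediment fluxes.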
Abstract:
This article investigates the role of information communication technologies (ICTs) in establishing a well-aligned, authentic learning environment for a diverse cohort of non-cognate and cognate students studying event management in a higher education context. Based on a case study which examined the way ICTs assisted in accommodating diverse learning needs, styles and stages in an event management subject offered in the Creative Industries Faculty at Queensland University of Technology in Brisbane, Australia, the article uses an action research approach to generate grounded, empirical data on the effectiveness of the dynamic, individualised curriculum frameworks that the use of ICTs makes possible. The study provides insights into the way non-cognate and cognate students respond to different learning tools. It finds that whilst non-cognate and cognate students do respond to learning tools differently, due to a differing degree of emphasis on technical, task or theoretical competencies, the use of ICTs allows all students to improve their performance by providing multiple points of entry into the content. In this respect, whilst the article focuses on the way ICTs can be used to develop an authentic, well-aligned curriculum model that meets the needs of event management students in a higher education context, with findings relevant for event educators in Business, Hospitality, Tourism and Creative Industries, the strategies outlined may also be useful for educators in other fields who are faced with similar challenges when designing and developing curriculum for diverse cohorts.
Performance of elite seated discus throwers in F30s classes: part II: does feet positioning matter?
Abstract:
Background: Studies on the relationship between performance and the design of the throwing frame have been limited. Part I provided only a description of the whole body positioning. Objectives: The specific objectives were (a) to benchmark feet positioning characteristics (i.e. position, spacing and orientation) and (b) to investigate the relationship between performance and these characteristics for male seated discus throwers in F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed in this study. Feet positioning was characterised by three-dimensional data of the front and back feet position, as well as spacing and orientation, corresponding to the distance between and the angle made by both feet, respectively. Results: Only 4 of 30 feet positioning characteristics presented a correlation coefficient greater than 0.5, including the feet spacing on the mediolateral and anteroposterior axes in the F34 class, as well as the back foot position and feet spacing on the mediolateral axis in the F33 class. Conclusions: This study provided key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.
Abstract:
With the increasing number of stratospheric particles available for study (via the U2 and/or WB57F collections), it is essential that a simple, yet rational, classification scheme be developed for general use. Such a scheme should be applicable to all particles collected from the stratosphere, rather than limited to only extraterrestrial or chemical sub-groups. Criteria for the efficacy of such a scheme would include: (a) objectivity, (b) ease of use, (c) acceptance within the broader scientific community and (d) how well the classification provides intrinsic categories which are consistent with our knowledge of particle types present in the stratosphere.
Abstract:
Several investigators have recently proposed classification schemes for stratospheric dust particles [1-3]. In addition, extraterrestrial materials within stratospheric dust collections may be used as a measure of micrometeorite flux [4]. However, little attention has been given to the problems of the stratospheric collection as a whole. Some of these problems include: (a) determination of accurate particle abundances at a given point in time; (b) the extent of bias in the particle selection process; (c) the variation of particle shape and chemistry with size; (d) the efficacy of proposed classification schemes and (e) an accurate determination of physical parameters associated with the particle collection process (e.g. minimum particle size collected, collection efficiency, variation of particle density with time). We present here preliminary results from SEM, EDS and, where appropriate, XRD analysis of all of the particles from a collection surface which sampled the stratosphere between 18 and 20 km in altitude. Determinations of particle densities from this study may then be used to refine models of the behavior of particles in the stratosphere [5].
Abstract:
This project investigates musicalisation and intermediality in the writing and devising of composed theatre. Its research question asks "How does the narrative of a musical play differ when it emerges from a setlist of original songs?", the aim being to create a performance event that is neither music nor theatre. This involves the composition of lyrics, music, action, spoken text and projected image, gathered in a script and presented in performance. Scholars such as Kulezic-Wilson (in Kendrick, L. and Roesner, D., 2011: 34) outline the acoustic dimension of the 'performative turn' (Mungen, Ernst and Bentzweizer, 2012) as heralding "…a shift of emphasis on how meaning is created (and veiled) and how the spectrum of theatrical creation and reception is widened." Rebstock and Roesner (2012) capture similar approaches, building on Lehmann's work on the post-dramatic, under the new term 'composed theatre'. This practice-led research draws influence from these new theoretical frames, pushing beyond 'the musical'. Springing from a set of original songs in dialogue with performed narrative, Bear with Me is a 45-minute music-driven work for children, involving projected image and participatory action. Bear with Me's intermedial hybrid of theatrical, screen and concert presentations shows that a simple setlist of original songs can be the starting point for the structure of a complex intermedial performance. Bear with Me was programmed into the Queensland Performing Arts Centre's Out of the Box Festival and was first performed in the Tony Gould Gallery at the Queensland Performing Arts Centre in June 2012. The season sold out. A masterclass on my playwriting methodology was presented at the Connecting The Dots Symposium, which ran alongside the festival.