983 results for Acoustic event classification
Abstract:
INTRODUCTION: Workforce planning for first aid and medical coverage of mass gatherings is hampered by limited research. In particular, the characteristics and likely presentation patterns of low-volume mass gatherings of between several hundred and several thousand people are poorly described in the existing literature. OBJECTIVES: This study was conducted to: 1. Describe key patient and event characteristics of medical presentations at a series of mass gatherings, including events smaller than those previously described in the literature; 2. Determine whether event type and event size affect the mean number of patients presenting for treatment per event, and specifically, whether the 1:2,000 deployment rule used by St John Ambulance Australia is appropriate; and 3. Identify factors that are predictive of injury at mass gatherings. METHODS: A retrospective, observational, case-series design was used to examine all cases treated by two Divisions of St John Ambulance (Queensland) in the greater metropolitan Brisbane region over a three-year period (01 January 2002-31 December 2004). Data were obtained from routinely collected patient treatment forms completed by St John officers at the time of treatment. Event-related data (e.g., weather, event size) were obtained from event forms designed for this study. Outcome measures include: total and average number of patient presentations for each event; event type; and event size category. Descriptive analyses were conducted using chi-square tests, and mean presentations per event and event type were investigated using Kruskal-Wallis tests. Logistic regression analyses were used to identify variables independently associated with injury presentation (compared with non-injury presentations). RESULTS: Over the three-year study period, St John Ambulance officers treated 705 patients over 156 separate events. The mean number of patients who presented with any medical condition at small events (less than or equal to 2,000 attendees) did not differ significantly from that of large (>2,000 attendees) events (4.44 vs. 4.67, F = 0.72, df = 1, 154, p = 0.79). Logistic regression analyses indicated that presentation with an injury compared with non-injury was independently associated with male gender, winter season, and sporting events, even after adjusting for relevant variables. CONCLUSIONS: In this study of low-volume mass gatherings, a similar number of patients sought medical treatment at small (≤2,000 patrons) and large (>2,000 patrons) events. This demonstrates that for low-volume mass gatherings, planning based solely on anticipated event size may be flawed, and could lead to inappropriate levels of first-aid coverage. This study also highlights the importance of considering other factors, such as event type and patient characteristics, when determining appropriate first-aid resourcing for low-volume events. Additionally, identification of factors predictive of injury presentations at mass gatherings has the potential to significantly enhance the ability of event coordinators to plan effective prevention strategies and response capability for these events.
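The injury-prediction regression described above can be illustrated with a short, hedged sketch. The predictor names (male, winter, sporting_event) and the synthetic records are hypothetical stand-ins; the study's actual data and covariates are not reproduced here.

```python
# Minimal sketch of an injury vs. non-injury logistic regression of the kind
# described above. The predictors and the synthetic data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),            # patient gender (1 = male)
    "winter": rng.integers(0, 2, n),          # event held in winter
    "sporting_event": rng.integers(0, 2, n),  # event type indicator
})
# Synthetic outcome: 1 = injury presentation, 0 = non-injury presentation
logit = -1.0 + 0.8 * df["male"] + 0.5 * df["winter"] + 0.9 * df["sporting_event"]
df["injury"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["male", "winter", "sporting_event"]])
model = sm.Logit(df["injury"], X).fit(disp=False)

# Adjusted odds ratios with 95% confidence intervals
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```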
Abstract:
Many existing schemes for malware detection are signature-based. Although they can effectively detect known malware, they cannot detect variants of known malware or new malware. Most network servers, such as online shopping malls, Picasa, YouTube, and Blogger, do not expect executable code in their inbound network traffic. Therefore, such network applications can be protected from malware infection by monitoring their ports to see if incoming packets contain any executable content. This paper proposes a content-classification scheme that identifies executable content in incoming packets. The proposed scheme analyzes the packet payload in two steps. It first analyzes the payload to see if it contains multimedia-type data. If not, it then classifies the payload as either text-type or executable. Although in our experiments the proposed scheme shows low rates of false negatives and false positives (4.69% and 2.53%, respectively), the remaining inaccuracies still require further inspection to efficiently detect the occurrence of malware. In this paper, we also propose simple statistical and combinatorial analyses to deal with false positives and negatives.
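As an illustration only, a two-step payload check in the spirit of the scheme above might look like the following sketch; the magic-byte lists and the printable-byte threshold are simplified assumptions, not the paper's actual classification algorithm.

```python
# Illustrative two-step payload check: multimedia first, then text vs. executable.
MULTIMEDIA_MAGIC = [b"\xff\xd8\xff", b"\x89PNG", b"GIF8", b"ID3", b"RIFF"]
EXECUTABLE_MAGIC = [b"MZ", b"\x7fELF"]

def classify_payload(payload: bytes) -> str:
    """Classify a packet payload as multimedia, executable, or text."""
    if any(payload.startswith(m) for m in MULTIMEDIA_MAGIC):
        return "multimedia"
    if any(payload.startswith(m) for m in EXECUTABLE_MAGIC):
        return "executable"
    # Fall back to a crude text heuristic: proportion of printable bytes.
    printable = sum(32 <= b < 127 or b in (9, 10, 13) for b in payload)
    ratio = printable / max(len(payload), 1)
    return "text" if ratio > 0.9 else "executable"

print(classify_payload(b"MZ\x90\x00"))                      # -> executable
print(classify_payload(b"GET /index.html HTTP/1.1\r\n"))    # -> text
```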
Abstract:
People interact with mobile computing devices everywhere: while sitting, walking, running, or even driving. Adapting the interface to suit these contexts is important; this paper therefore proposes a simple human activity classification system. Our approach uses a vector magnitude recognition technique to detect and classify when a person is stationary (not walking), casually walking, or jogging, without any prior training. A user study confirmed the accuracy of the approach.
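A minimal sketch of a training-free, vector-magnitude activity classifier of this kind is shown below; the window-spread thresholds are illustrative assumptions rather than values from the paper.

```python
# Threshold-based activity classifier on the acceleration vector magnitude.
# The thresholds (in m/s^2) and the window length are illustrative assumptions.
import math

STATIONARY_MAX = 1.0   # assumed magnitude-spread threshold for "stationary"
WALKING_MAX = 6.0      # assumed magnitude-spread threshold for "casual walking"

def classify_window(samples):
    """samples: list of (ax, ay, az) accelerometer readings for one window."""
    magnitudes = [math.sqrt(ax**2 + ay**2 + az**2) for ax, ay, az in samples]
    # Use the spread of the magnitude within the window as the activity cue.
    spread = max(magnitudes) - min(magnitudes)
    if spread < STATIONARY_MAX:
        return "stationary"
    if spread < WALKING_MAX:
        return "walking"
    return "jogging"

print(classify_window([(0.1, 0.2, 9.8), (0.0, 0.1, 9.9)]))  # -> stationary
```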
Abstract:
This paper considers issues of methodological innovation in communication, media and cultural studies that arise out of the extent to which we now live in a media environment characterised by an abundance of digital media, the convergence of media platforms, content and services, and the globalisation of media content through ubiquitous computing and high-speed broadband networks. These developments have also entailed a shift in the producer-consumer relationships that characterised the 20th century mass communications paradigm, with the rapid proliferation of user-created content, accelerated innovation, the growing empowerment of media users themselves, and the blurring of distinctions between public and private, as well as age-based distinctions in terms of what media can be accessed by whom and for what purpose. The paper considers these issues through a case study of the Australian Law Reform Commission's National Classification Scheme Review.
Abstract:
Automatic species recognition plays an important role in assisting ecologists to monitor the environment. One critical issue in this research area is that software developers need prior knowledge of the specific targets of interest in order to build templates for them. This paper proposes a novel approach to automatic species recognition based on generic knowledge about acoustic events. Acoustic component detection is the most critical and fundamental part of the proposed approach. This paper gives clear definitions of acoustic components and presents three clustering algorithms for detecting four acoustic components in sound recordings: whistles, clicks, slurs, and blocks. The experimental results demonstrate that these acoustic component recognisers achieve high precision and recall.
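The general acoustic-component-detection pipeline (spectrogram, energy thresholding, clustering of time-frequency points) can be sketched as follows; DBSCAN and the chosen parameters are illustrative stand-ins for the paper's three component-specific clustering algorithms.

```python
# Generic sketch of clustering spectrogram peaks into acoustic components.
import numpy as np
from scipy import signal
from sklearn.cluster import DBSCAN

fs = 22050
t = np.arange(0, 2.0, 1 / fs)
audio = np.sin(2 * np.pi * 3000 * t) * (t < 0.3)  # synthetic whistle-like tone

f, times, Sxx = signal.spectrogram(audio, fs=fs, nperseg=512, noverlap=256)
Sxx_db = 10 * np.log10(Sxx + 1e-12)

# Keep only strong time-frequency cells, then cluster them into components.
rows, cols = np.where(Sxx_db > Sxx_db.max() - 20)
points = np.column_stack([times[cols], f[rows] / 1000.0])  # seconds, kHz
labels = DBSCAN(eps=0.1, min_samples=5).fit_predict(points)
print("detected components:", len(set(labels) - {-1}))
```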
Abstract:
Purpose – The work presented in this paper aims to provide an approach to classifying web logs by personal properties of users. Design/methodology/approach – The authors describe an iterative system that begins with a small set of manually labeled terms, which are used to label queries from the log. A set of background knowledge related to these labeled queries is acquired by combining web search results on these queries. This background set is used to obtain many terms that are related to the classification task. The system then ranks each of the related terms, choosing those that best fit the personal properties of the users. These terms are then used to begin the next iteration. Findings – The authors identify the difficulties of classifying web logs by approaching this problem from a machine learning perspective. By applying the approach developed, the authors are able to show that many queries in a large query log can be classified. Research limitations/implications – Evaluating results in this type of classification work is difficult, as the true personal properties of web users are unknown. Evaluation of the classification results by comparing classified queries to well-known age-related sites is a direction currently being explored. Practical implications – This research is background work that can be incorporated into search engines or other web-based applications, to help marketing companies and advertisers. Originality/value – This research enhances the current state of knowledge in short-text classification and query log learning. Keywords: Classification schemes, Computer networks, Information retrieval, Man-machine systems, User interfaces
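The iterative bootstrapping loop described above can be sketched as follows; scoring related terms by raw co-occurrence with labelled queries is an assumed simplification of the authors' ranking step, and the example queries are hypothetical.

```python
# Simplified bootstrapping: seed terms label queries, co-occurring terms are
# scored, and the top-scoring terms become the next iteration's seeds.
from collections import Counter

def iterate_labels(queries, seed_terms, rounds=2, top_k=3):
    terms = set(seed_terms)
    for _ in range(rounds):
        labelled = [q for q in queries if terms & set(q.split())]
        counts = Counter(w for q in labelled for w in q.split() if w not in terms)
        terms |= {w for w, _ in counts.most_common(top_k)}
    return terms

queries = ["pokemon games for kids", "retirement savings advice",
           "kids cartoon videos", "games online free"]
print(iterate_labels(queries, seed_terms={"kids"}))
```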
Abstract:
In this study we set out to dissociate the developmental time course of automatic symbolic number processing and cognitive control functions in grade 1-3 British primary school children. Event-related potential (ERP) and behavioral data were collected in a physical size discrimination numerical Stroop task. Task-irrelevant numerical information was processed automatically as early as grade 1. Weakening interference and strengthening facilitation indicated the parallel development of general cognitive control and automatic number processing. Relationships among ERP and behavioral effects suggest that control functions play a larger role in younger children and that the automaticity of number processing increases from grade 1 to 3.
Abstract:
Pedestrians’ use of MP3 players or mobile phones increases their risk of being hit by motor vehicles. We present an approach for detecting a crash risk level using the computing power and the microphone of mobile devices, which can be used to alert the user in advance of an approaching vehicle so as to avoid a crash. A single feature-extractor classifier is usually unable to deal with the diversity of risky acoustic scenarios. In this paper, we address the problem of detecting vehicles approaching a pedestrian with a novel, simple, non-resource-intensive acoustic method. The method uses a set of existing statistical tools to mine signal features. Audio features are adaptively thresholded for relevance and classified with a three-component heuristic. The resulting Acoustic Hazard Detection (AHD) system has a very low false positive detection rate. The results of this study could help mobile device manufacturers embed the presented features into future portable devices and contribute to road safety.
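A rough sketch of the adaptive-thresholding idea is given below; the three frame-level features, the deviation margin, and the 2-of-3 voting rule are assumptions for illustration, not the paper's actual statistical tools or heuristic.

```python
# Frame-level audio features compared against a running background estimate;
# a frame is flagged when enough features deviate strongly from the background.
import numpy as np

def frame_features(frame, fs):
    """frame: 1-D NumPy array of audio samples."""
    rms = np.sqrt(np.mean(frame ** 2))
    zero_crossings = np.mean(np.abs(np.diff(np.sign(frame)))) / 2
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1 / fs)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return np.array([rms, zero_crossings, centroid])

def detect_hazard(frames, fs, margin=3.0):
    history, flags = [], []
    for frame in frames:
        feats = frame_features(frame, fs)
        if len(history) >= 10:
            mu = np.mean(history, axis=0)
            sigma = np.std(history, axis=0) + 1e-12
            exceed = np.abs(feats - mu) / sigma > margin
            flags.append(bool(exceed.sum() >= 2))   # simple 2-of-3 vote
        else:
            flags.append(False)                     # still learning background
        history.append(feats)
    return flags
```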
Abstract:
In this paper, we describe the main processes and operations in mining industries and present a comprehensive survey of operations research methodologies that have been applied over the last several decades. The literature review is classified into four main categories: mine design; mine production; mine transportation; and mine evaluation. Mine design models are further separated according to the two main mining methods: open-pit and underground. Mine production models are subcategorised into two groups: ore mining and coal mining. Mine transportation models are further partitioned in accordance with fleet management, truck haulage and train scheduling. Mine evaluation models are further subdivided into four clusters in terms of mining method selection, quality control, financial risks and environmental protection. The main characteristics of four Australian commercial mining software packages are addressed and compared. This paper bridges the gaps in the literature and motivates researchers to develop more applicable, realistic and comprehensive operations research models and solution techniques that are directly linked with mining industries.
Abstract:
It is a big challenge to acquire correct user profiles for personalized text classification, since users may be unsure when describing their interests. Traditional approaches to user profiling adopt machine learning (ML) to automatically discover classification knowledge from explicit user feedback describing personal interests. However, the accuracy of ML-based methods cannot be significantly improved in many cases because of the term independence assumption and the uncertainties associated with them. This paper presents a novel relevance feedback approach for personalized text classification. It applies data mining to discover knowledge from relevant and non-relevant text and constrains the specific knowledge with reasoning rules to eliminate conflicting information. We also developed a Dempster-Shafer (DS) approach as the means to utilise the specific knowledge to build high-quality data models for classification. Experimental results on Reuters Corpus Volume 1 and TREC topics show that the proposed technique achieves encouraging performance compared with state-of-the-art relevance feedback models.
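Dempster's rule of combination, which underlies the DS step above, can be sketched directly; the mass assignments in the example are illustrative, not values mined from Reuters Corpus Volume 1.

```python
# Dempster's rule of combination over the frame {relevant, non-relevant}.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dicts."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    # Normalise by the non-conflicting mass (Dempster's rule).
    return {k: v / (1 - conflict) for k, v in combined.items()}

R, N = frozenset({"relevant"}), frozenset({"non-relevant"})
theta = R | N  # the whole frame of discernment
m1 = {R: 0.6, theta: 0.4}
m2 = {R: 0.5, N: 0.2, theta: 0.3}
print(combine(m1, m2))
```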
Abstract:
Background: Access to cardiac services is essential for appropriate implementation of evidence-based therapies to improve outcomes. The Cardiac Accessibility and Remoteness Index for Australia (Cardiac ARIA) aimed to derive an objective, geographic measure reflecting access to cardiac services. Methods: An expert panel defined an evidence-based clinical pathway. Using Geographic Information Systems (GIS), a numeric/alpha index was developed at two points along the continuum of care. The acute category (numeric) measured the time from the emergency call to arrival at an appropriate medical facility via road ambulance. The aftercare category (alpha) measured access to four basic services (family doctor, pharmacy, cardiac rehabilitation, and pathology services) when a patient returned to their community. Results: The numeric index ranged from 1 (access to a principal referral centre with a cardiac catheterisation service ≤ 1 hour) to 8 (no ambulance service, > 3 hours to a medical facility, air transport required). The alphabetic index ranged from A (all 4 services available within a 1-hour drive-time) to E (no services available within 1 hour). 13.9 million Australians (71%) resided within Cardiac ARIA 1A locations (hospital with a cardiac catheterisation laboratory and all aftercare within 1 hour). Those outside Cardiac ARIA 1A locations were over-represented by people aged over 65 years (32%) and Indigenous people (60%). Conclusion: The Cardiac ARIA index demonstrated substantial inequity in access to cardiac services in Australia. This methodology can be used to inform cardiology health service planning, and it could be applied to other common disease states within other regions of the world.
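A sketch of how such a two-part index could be assigned is shown below. Only the endpoints of each scale are defined in the abstract, so the intermediate numeric bands and the alpha mapping by service count are assumptions for illustration, not the published Cardiac ARIA rules.

```python
# Illustrative assignment of the numeric (acute) and alpha (aftercare) categories.
def alpha_category(services_within_1h: int) -> str:
    """Aftercare category from the number of the four key services
    (GP, pharmacy, cardiac rehabilitation, pathology) within a 1-hour drive.
    Only A (4 services) and E (0 services) are defined in the abstract;
    the B-D mapping is assumed."""
    return {4: "A", 3: "B", 2: "C", 1: "D", 0: "E"}[services_within_1h]

def numeric_category(hours_to_cath_lab: float, ambulance_available: bool) -> int:
    """Acute category from road-ambulance time to a cardiac catheterisation
    facility. Only categories 1 and 8 are defined in the abstract; the
    intermediate banding below is assumed."""
    if not ambulance_available:
        return 8
    if hours_to_cath_lab <= 1:
        return 1
    if hours_to_cath_lab <= 3:
        return 4   # assumed mid-range band
    return 7

print(numeric_category(0.5, True), alpha_category(4))  # -> 1 A
```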
Abstract:
Cardiovascular disease (CVD) continues to impose a heavy burden in terms of cost, disability and death in Australia. Evidence suggests that increasing remoteness, where cardiac services are scarce, is linked to an increased risk of dying from CVD. Fatal CVD events are reported to be between 20% and 50% higher in rural areas than in major cities. The Cardiac ARIA project, with its extensive use of Geographic Information Systems (GIS), ranks each of Australia’s 20,387 urban, rural and remote population centres by accessibility to essential services or resources for the management of a cardiac event. This unique, innovative and highly collaborative project delivers a powerful tool to highlight and combat the burden imposed by CVD in Australia. Cardiac ARIA is innovative: it is a model that could be applied internationally and to other acute and chronic conditions such as mental health, midwifery, cancer, respiratory, diabetes and burns services. Cardiac ARIA was designed to: 1. Determine, by expert panel, the minimal services and resources required for the management of a cardiac event in any urban, rural or remote population location in Australia, using a single patient pathway to access care. 2. Derive a classification using GIS accessibility modelling for each of Australia’s 20,387 urban, rural and remote population locations. 3. Compare the Cardiac ARIA categories and population locations with census-derived population characteristics. Key findings are as follows:
• In the event of a cardiac emergency, the majority of Australians had very good access to cardiac services. Approximately 71%, or 13.9 million people, lived within one hour of a category one hospital.
• 68% of older Australians lived within one hour of a category one hospital (Principal Referral Hospital with access to cardiac catheterisation).
• Only 40% of Indigenous people lived within one hour of a category one hospital.
• 16% (74,000) of Indigenous people lived more than one hour from a hospital.
• 3% (91,000) of people 65 years of age or older lived more than one hour from any hospital or clinic.
• Approximately 96% of people, or 19 million, lived within one hour of the four key services to support cardiac rehabilitation and secondary prevention.
• 75% of Indigenous people lived within one hour of the four key services to support cardiac rehabilitation and secondary prevention. Fourteen percent (64,000) of Indigenous people had poor access to these services.
• 12% (56,000) of Indigenous people were more than one hour from a hospital and had access to only one of the four key services (usually a medical service) to support cardiac rehabilitation and secondary prevention.
Abstract:
This paper presents techniques that can lead to the diagnosis of faults in a small multi-cylinder diesel engine. Preliminary analysis of the acoustic emission (AE) signals is outlined, including time-frequency analysis and selection of the optimum frequency band. The results of applying mean field independent component analysis (MFICA) to separate the AE root mean square (RMS) signals, and the effects of changing parameter values, are also outlined. The results on separation of RMS signals show that this technique has the potential to increase the probability of successfully identifying the AE events associated with the various mechanical events within the combustion process of multi-cylinder diesel engines.
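The unmixing step can be illustrated on synthetic data; note that scikit-learn's FastICA is substituted here purely as an accessible stand-in for the mean field ICA (MFICA) algorithm used in the paper, and the burst sources and mixing matrix are assumptions.

```python
# Illustrative ICA separation of mixed AE RMS-like signals (synthetic data).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 2000)
# Two synthetic "cylinder event" sources: periodic bursts at different phases
s1 = np.exp(-((t % 0.25) / 0.01)) + 0.01 * rng.standard_normal(t.size)
s2 = np.exp(-(((t + 0.1) % 0.25) / 0.015)) + 0.01 * rng.standard_normal(t.size)
S = np.column_stack([s1, s2])

A = np.array([[1.0, 0.6], [0.4, 1.0]])   # assumed mixing between sensors
X = S @ A.T                              # observed sensor RMS signals

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)             # recovered source estimates
print(S_est.shape)                       # (2000, 2)
```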
Abstract:
Vibration analysis has been a prime tool in condition monitoring of rotating machines; however, its application to internal combustion engines remains a challenge because engine vibration signatures are highly non-stationary and therefore not suitable for popular spectrum-based analysis. Signal-to-noise ratio is a major concern in engine signature analysis due to the severe background noise generated by consecutive mechanical events, such as combustion and valve opening and closing, especially in multi-cylinder engines. Acoustic Emission (AE) has been found to give an excellent signal-to-noise ratio, allowing discrimination of fine detail of normal or abnormal events during a given cycle. AE has been used to detect faults such as exhaust valve leakage, fuel injection behaviour, and aspects of the combustion process. This paper presents a review of AE applications to diesel engine monitoring and a preliminary investigation of AE signatures measured on an 18-cylinder diesel engine. AE is compared with vibration acceleration for varying operating conditions: load and speed. Frequency characteristics of AE from those events are analysed in the time-frequency domain via the short-time Fourier transform. The results show the great potential of AE analysis for detecting various defects in diesel engines.
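The short-time Fourier transform step is straightforward to sketch; the sampling rate, window settings, and synthetic AE burst below are illustrative assumptions, not the paper's measurement setup.

```python
# Minimal STFT sketch for time-frequency analysis of an AE-like burst signal.
import numpy as np
from scipy import signal

fs = 1_000_000                     # assumed AE sampling rate (1 MHz)
t = np.arange(0, 0.01, 1 / fs)
# Synthetic AE burst: decaying 150 kHz oscillation starting at 2 ms
envelope = np.where(t > 0.002, np.exp(-(t - 0.002) * 3000), 0.0)
ae = envelope * np.sin(2 * np.pi * 150_000 * t) + 0.01 * np.random.randn(t.size)

f, times, Zxx = signal.stft(ae, fs=fs, nperseg=1024, noverlap=768)
magnitude_db = 20 * np.log10(np.abs(Zxx) + 1e-12)
print(magnitude_db.shape)          # (frequency bins, time frames)
```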
Abstract:
As civil infrastructures such as bridges age, there is a concern for safety and a need for cost-effective and reliable monitoring tools. Different diagnostic techniques are available nowadays for structural health monitoring (SHM) of bridges. Acoustic emission is one such technique, with the potential to predict failure. The phenomenon of rapid release of energy within a material by crack initiation or growth, in the form of stress waves, is known as acoustic emission (AE). The AE technique involves recording the stress waves by means of sensors and subsequently analysing the recorded signals, which convey information about the nature of the source. AE can be used as a local SHM technique to monitor specific regions with a visible presence of cracks or crack-prone areas, such as welded regions and joints with bolted connections, or as a global technique to monitor the whole structure. The strength of the AE technique lies in its ability to detect active crack activity, thus helping to prioritise maintenance work by focusing on active rather than dormant cracks. In spite of being a promising tool, some challenges still stand in the way of successful application of the AE technique. One is the generation of large amounts of data during testing; hence effective data analysis and management are necessary, especially for long-term monitoring. Complications also arise because a number of spurious sources can give AE signals; therefore, different source discrimination strategies are necessary to distinguish genuine signals from spurious ones. Another major challenge is the quantification of the damage level by appropriate analysis of the data. Intensity analysis using severity and historic indices, as well as b-value analysis, are important methods that will be discussed and applied to the analysis of laboratory experimental data in this paper.
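The b-value analysis mentioned above can be sketched as a fit of the Gutenberg-Richter-style relation log10 N(≥A) = a − b·(A/20) to cumulative AE hit counts; the synthetic amplitudes and the amplitude scaling convention are assumptions for illustration, and conventions vary between studies.

```python
# Sketch of AE b-value estimation from hit amplitudes (in dB).
import numpy as np

rng = np.random.default_rng(2)
amplitudes_db = 40 + rng.exponential(scale=10, size=2000)  # synthetic AE hits

# Cumulative count of hits at or above each amplitude level
levels = np.arange(40, 100, 2)
counts = np.array([(amplitudes_db >= a).sum() for a in levels])
mask = counts > 0

# Least-squares fit: log10 N = a - b * (A / 20)  =>  slope = -b
slope, intercept = np.polyfit(levels[mask] / 20.0, np.log10(counts[mask]), 1)
b_value = -slope
print(f"estimated b-value: {b_value:.2f}")
```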