332 results for decomposition techniques
Abstract:
Frog protection has become increasingly essential due to the rapid decline of frog biodiversity. Therefore, it is valuable to develop new methods for studying this biodiversity. In this paper, a novel feature extraction method based on perceptual wavelet packet decomposition is proposed for classifying frog calls in noisy environments. Pre-processing and syllable segmentation are first applied to the frog call. Then, a spectral peak track is extracted from each syllable where possible. Track duration, dominant frequency and oscillation rate are extracted directly from the track. With the k-means clustering algorithm, the calculated dominant frequencies of all frog species are clustered into k parts, producing a frequency scale for wavelet packet decomposition. Based on this adaptive frequency scale, wavelet packet decomposition is applied to the frog calls. Using the wavelet packet decomposition coefficients, a new feature set named perceptual wavelet packet decomposition sub-band cepstral coefficients is extracted. Finally, a k-nearest neighbour (k-NN) classifier is used for the classification. The experimental results show that the proposed features achieve an average classification accuracy of 97.45%, which outperforms syllable features (86.87%) and the Mel-frequency cepstral coefficients (MFCCs) feature (90.80%).
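To make the clustering step concrete, here is a minimal sketch of how dominant frequencies could be clustered with k-means to derive an adaptive frequency scale; the frequency values, the choice of k, and the variable names are illustrative placeholders, not the paper's data.

```python
# Illustrative sketch: cluster dominant frequencies of frog syllables with
# k-means; the sorted centroids define an adaptive frequency scale for
# wavelet packet decomposition. All input values are hypothetical.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical dominant frequencies (Hz) extracted from spectral peak tracks.
dominant_freqs = np.array([612.0, 2480.0, 3550.0, 980.0, 2210.0,
                           4120.0, 1530.0, 2890.0, 760.0, 3310.0]).reshape(-1, 1)

k = 4  # assumed number of divisions of the frequency scale
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(dominant_freqs)

# Sorted centroids give the boundaries of the adaptive frequency scale.
freq_scale = np.sort(kmeans.cluster_centers_.ravel())
print("Adaptive frequency scale (Hz):", freq_scale)
```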
Abstract:
Frogs have received increasing attention due to their effectiveness as indicators of environmental change. Therefore, it is important to monitor and assess frogs. With the development of sensor techniques, large volumes of audio data (including frog calls) have been collected and need to be analysed. After transforming the audio data into a spectrogram representation using the short-time Fourier transform, visual inspection of this representation motivates the use of image processing techniques for analysing audio data. Applying an acoustic event detection (AED) method to the spectrograms, acoustic events are first detected, and ridges are then extracted from them. Three feature sets, Mel-frequency cepstral coefficients (MFCCs), the AED feature set and the ridge feature set, are then used for frog call classification with a support vector machine classifier. Fifteen frog species widely distributed across Queensland, Australia, are selected to evaluate the proposed method. The experimental results show that the ridge feature set achieves an average classification accuracy of 74.73%, which outperforms the MFCCs (38.99%) and the AED feature set (67.78%).
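As a rough illustration of the pipeline's first and last steps (not the study's implementation), the sketch below computes an STFT spectrogram and trains an SVM on placeholder feature vectors; the sampling rate, signal, features, and labels are all assumptions.

```python
# Illustrative sketch: spectrogram via short-time Fourier transform, then
# SVM classification of per-call feature vectors. Data are synthetic
# stand-ins for real recordings and ridge features.
import numpy as np
from scipy.signal import stft
from sklearn.svm import SVC

fs = 22050                                # assumed sampling rate (Hz)
audio = np.random.randn(fs * 2)           # stand-in for a 2-second recording
f, t, Z = stft(audio, fs=fs, nperseg=512)
spectrogram = np.abs(Z)                   # magnitude spectrogram

# Hypothetical feature vectors (e.g. ridge statistics) and species labels.
X = np.random.randn(60, 12)
y = np.repeat(np.arange(3), 20)
clf = SVC(kernel="rbf").fit(X, y)
print("Training accuracy:", clf.score(X, y))
```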
Abstract:
Eleanor Smith [pseudonym], teacher: I was talking to the kids about MacDonalds – I forget exactly what the context was – I said "ah, the Americans call them French fries, and, you know, MacDonalds is an American chain and they call them French fries because the Americans call them French fries", and this little Australian kid in the front row, very Australian child, said to me, "I call them French fries!" ... Um, a fourth grade boy whom I taught in 1993 at this school, the world basketball championships were on ... Americans were playing their dream machine and the Boomers were up against them ... and, ah, this boy was very interested in basketball ... but it's not in my blood, not in the way cricket is for example ... Um, um, and I said to this fellow, "um, well", I said, "Australia's up against Dream Machine tomorrow". He [Jason, pseudonym] said, "Ah, you know, Boomers probably won't win". ... I said, "Well that's sport, mate". I said, "You never know in sport. Australia might win". And he looked at me and he said, "I'm not going for Australia, I'm going for America". This is from an Australian boy! And I thought so strong is the hype, so strong is the, is the, power of the media, etc., that this boy is not [pause], I can't tell you how outraged I was. Here's me as an Australian and I don't even support basketball, it's not even my sport, um, but that he would respond like that because of the power of the American machine that's converting kids' minds, the way they think, where they're putting their loyalties, etc. I was just appalled, but that's where he was. And when I asked kids for their favourite place, he said Los Angeles.
Abstract:
The purpose of this article is to show the applicability and benefits of design of experiments techniques as an optimization tool for discrete simulation models. Simulated systems are computational representations of real-life systems whose state evolves as discrete events occur over time. In this study, a production system designed under the JIT (Just in Time) business philosophy is used; JIT seeks to achieve excellence in organizations through waste reduction in all operational aspects. The most typical tool of JIT systems is KANBAN production control, which seeks to synchronize demand with the flow of materials, minimize work in process, and define production metrics. Using experimental design techniques for stochastic optimization, the impact of the operational factors on the efficiency of the KANBAN/CONWIP simulation model is analyzed. The results show the effectiveness of integrating experimental design techniques and discrete simulation models in the calculation of the operational parameters. Furthermore, the reliability of the resulting methodologies was improved with an additional statistical consideration.
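As a rough sketch of how experimental design can drive a simulation study (not the article's actual model), the example below runs a 2^3 full-factorial design against a stand-in response function and estimates main effects; the factor names and the response function are hypothetical.

```python
# Illustrative sketch: a 2^3 full-factorial design estimating the main
# effects of three coded operational factors on a simulated response.
# The response function is a stand-in for one KANBAN/CONWIP simulation run.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def simulate(kanban_cards, conwip_limit, demand_rate):
    # Hypothetical stand-in for one replication of the discrete simulation.
    return (5 * kanban_cards + 3 * conwip_limit - 2 * demand_rate
            + rng.normal(scale=0.5))

levels = [-1, +1]                              # coded factor levels
design = np.array(list(itertools.product(levels, repeat=3)))
responses = np.array([simulate(*run) for run in design])

# Main effect = mean response at high level minus mean at low level.
for i, name in enumerate(["kanban_cards", "conwip_limit", "demand_rate"]):
    effect = (responses[design[:, i] == 1].mean()
              - responses[design[:, i] == -1].mean())
    print(f"Main effect of {name}: {effect:.2f}")
```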
Abstract:
Over the last two decades, there has been an increasing awareness of, and interest in, the use of spatial moment techniques to provide insight into a range of biological and ecological processes. Models that incorporate spatial moments can be viewed as extensions of mean-field models. These mean-field models often consist of systems of classical ordinary differential equations and partial differential equations, whose derivation, at some point, hinges on the simplifying assumption that individuals in the underlying stochastic process encounter each other at a rate that is proportional to the average abundance of individuals. This assumption has several implications, the most striking of which is that mean-field models essentially neglect any impact of the spatial structure of individuals in the system. Moment dynamics models extend traditional mean-field descriptions by accounting for the dynamics of pairs, triples and higher n-tuples of individuals. This means that moment dynamics models can, to some extent, account for how the spatial structure affects the dynamics of the system in question.
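A minimal worked contrast, under an assumed birth-death process with crowding (the notation is illustrative, not drawn from any particular model reviewed here): the mean-field model closes the dynamics with the average density, while a moment dynamics model retains the second spatial moment.

```latex
% Mean-field model: individuals are assumed to encounter each other at a
% rate proportional to the average density N(t), giving a quadratic term.
\[
  \frac{\mathrm{d}N}{\mathrm{d}t} \;=\; (b - d)\,N \;-\; c\,N^{2}.
\]
% Moment dynamics (pair approximation): the N^2 closure is replaced by the
% second spatial moment Z_2(t), the density of pairs, so local clustering
% or segregation can influence the dynamics.
\[
  \frac{\mathrm{d}N}{\mathrm{d}t} \;=\; (b - d)\,N \;-\; c\,Z_{2}(t).
\]
```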
Abstract:
Large-scale integration of solar photovoltaic (PV) generation in distribution networks has resulted in over-voltage problems. Several control techniques have been developed to address the over-voltage problem using Deterministic Load Flow (DLF). However, the intermittent characteristics of PV generation require Probabilistic Load Flow (PLF) to capture variability that DLF ignores. Traditional PLF techniques are not well suited to distribution systems and suffer from several drawbacks, such as computational burden (Monte Carlo, conventional convolution), accuracy that degrades with system complexity (point estimation method), the need for linearization (multi-linear simulation) and convergence problems (Gram–Charlier expansion, Cornish–Fisher expansion). In this research, Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD) is used to quantify the over-voltage issues, with and without a voltage control algorithm, in distribution networks with active generation. The LHS technique is verified on a test network and on a real system from an Australian distribution network service provider. The accuracy and computational burden of the simulated results are also compared with Monte Carlo simulations.
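To illustrate the core sampling idea (a sketch, not the paper's implementation), the code below draws Latin Hypercube samples and uses a Cholesky factor of an assumed correlation matrix to induce correlation between two uncertain inputs; the correlation value and dimensionality are placeholders.

```python
# Illustrative sketch: Latin Hypercube Sampling combined with a Cholesky
# decomposition to impose a target correlation between two uncertain
# inputs (e.g. PV output and load). All distributions are hypothetical.
import numpy as np
from scipy.stats import qmc, norm

n = 1000
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n)                     # LHS points on the unit square
z = norm.ppf(u)                           # map to independent normals

target_corr = np.array([[1.0, 0.7],
                        [0.7, 1.0]])
L = np.linalg.cholesky(target_corr)
z_corr = z @ L.T                          # induce the target correlation

print("Sample correlation:\n", np.corrcoef(z_corr, rowvar=False))
```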
Abstract:
Environmental changes have put great pressure on biological systems, leading to the rapid decline of biodiversity. To monitor this change and protect biodiversity, animal vocalizations have been widely explored with the aid of acoustic sensors deployed in the field. Consequently, large volumes of acoustic data are collected. However, traditional manual methods that require ecologists to physically visit sites to collect biodiversity data are both costly and time consuming. Therefore, it is essential to develop new semi-automated and automated methods to identify species in audio recordings. In this study, a novel feature extraction method based on wavelet packet decomposition is proposed for frog call classification. After syllable segmentation, each syllable of a frog's advertisement call is represented by a spectral peak track, from which track duration, dominant frequency and oscillation rate are calculated. Then, a k-means clustering algorithm is applied to the dominant frequencies, and the centroids of the clustering results are used to generate the frequency scale for wavelet packet decomposition (WPD). Next, a new feature set named adaptive-frequency-scaled wavelet packet decomposition sub-band cepstral coefficients is extracted by performing WPD on the windowed frog calls. Furthermore, the statistics of all feature vectors over each windowed signal are calculated to produce the final feature set. Finally, two well-known classifiers, a k-nearest neighbour classifier and a support vector machine classifier, are used for classification. In our experiments, we use two different datasets from Queensland, Australia: 18 frog species from commercial recordings, and 8 frog species from James Cook University field recordings. The weighted classification accuracy with our proposed method is 99.5% for the 18 frog species and 97.4% for the 8 frog species, which outperforms all other comparable methods.
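As a rough sketch of the sub-band cepstral idea (not the authors' exact feature), the example below decomposes a windowed signal with a wavelet packet tree and applies a log-energy-plus-DCT step analogous to MFCC computation; the wavelet, decomposition depth, and signal are assumptions.

```python
# Illustrative sketch: wavelet packet decomposition of a windowed signal,
# then a cepstral-style feature from the sub-band energies (log energies
# followed by a DCT). The signal is a stand-in for a frog call window.
import numpy as np
import pywt
from scipy.fft import dct

signal = np.random.randn(1024)            # stand-in windowed frog call
wp = pywt.WaveletPacket(data=signal, wavelet="db4", maxlevel=4)

# Energy of each terminal sub-band at the deepest level.
nodes = wp.get_level(4, order="freq")
energies = np.array([np.sum(np.square(n.data)) for n in nodes])

# Log-energy + DCT, analogous to how MFCCs are formed from filterbanks.
features = dct(np.log(energies + 1e-12), norm="ortho")
print("Sub-band cepstral features:", np.round(features, 3))
```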
Abstract:
Sampling design is critical to the quality of quantitative research, yet it does not always receive appropriate attention in nursing research. The current article details how balancing probability techniques with practical considerations produced a representative sample of Australian nursing homes (NHs). Budgetary, logistical, and statistical constraints were managed by excluding some NHs (e.g., those too difficult to access) from the sampling frame; a stratified, random sampling methodology yielded a final sample of 53 NHs from a population of 2,774. In testing the adequacy of representation of the study population, chi-square tests for goodness of fit generated nonsignificant results for distribution by distance from major city and type of organization. A significant result for state/territory was expected and was easily corrected for by the application of weights. The current article provides recommendations for conducting high-quality, probability-based samples and stresses the importance of testing the representativeness of achieved samples.
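To make the representativeness check concrete, here is a minimal sketch of a chi-square goodness-of-fit test comparing achieved stratum counts against population proportions; the counts and proportions are hypothetical, not the study's data.

```python
# Illustrative sketch: does the achieved sample match the population
# distribution across strata? Counts and proportions are invented.
from scipy.stats import chisquare

sample_counts = [18, 21, 14]              # sampled NHs per stratum
population_props = [0.35, 0.40, 0.25]     # population proportions
n = sum(sample_counts)
expected = [p * n for p in population_props]

stat, pvalue = chisquare(f_obs=sample_counts, f_exp=expected)
# A nonsignificant result gives no evidence of misrepresentation.
print(f"chi2 = {stat:.2f}, p = {pvalue:.3f}")
```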
Abstract:
This paper provides an empirical estimation of energy efficiency and other proximate factors that explain energy intensity in Australia for the period 1978-2009. The analysis is performed by decomposing the changes in energy intensity into energy efficiency, fuel mix and structural change effects, using sectoral and sub-sectoral data. The results show that the driving forces behind the decrease in energy intensity in Australia are the efficiency effect and the sectoral composition effect, with the former found to be more prominent than the latter. Moreover, the favourable impact of the composition effect has slowed consistently in recent years. A perfect positive association characterizes the relationship between energy intensity and carbon intensity in Australia. The decomposition results indicate that Australia needs to improve energy efficiency further to reduce energy intensity and carbon emissions.
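A standard identity behind this kind of index decomposition, sketched in generic notation (not necessarily the authors' exact formulation): aggregate energy intensity is the activity-share-weighted sum of sectoral intensities, so its change splits into efficiency and composition effects.

```latex
% E = total energy use, Y = total activity; E_i and Y_i are their sectoral
% counterparts. I_i = E_i / Y_i is sectoral intensity and s_i = Y_i / Y
% is the sectoral activity share.
\[
  I \;=\; \frac{E}{Y}
    \;=\; \sum_{i} \frac{E_i}{Y_i}\,\frac{Y_i}{Y}
    \;=\; \sum_{i} I_i\, s_i .
\]
% Changes in the I_i drive the efficiency effect; changes in the s_i
% drive the structural (composition) effect.
```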
Abstract:
Changes in the aggregate intensity of energy-related CO2 emissions, total CO2 emissions and per-capita CO2 emissions in Australia are decomposed using a Logarithmic Mean Divisia Index (LMDI) method for the period 1978-2010. Results indicate that improvements in energy efficiency played a dominant role in the measured 17% reduction in the aggregate intensity of CO2 emissions in Australia over the period. Structural changes in the economy, such as changes in the relative importance of the services sector vis-à-vis manufacturing, have also played a major role in achieving this outcome. Results also suggest that, without these mitigating factors, income per capita and population effects could well have produced an increase in total emissions more than 50% higher than actually occurred over the period. Perhaps most starkly, the results indicate that, without these mitigating factors, the growth in CO2 emissions per capita could have been over 150% higher than actually observed. Notwithstanding this, the study suggests that, for Australia to meet its Copenhagen commitment, the relative average per annum effectiveness of these mitigating factors during 2010-2020 probably needs to be almost three times what it was in the 2005-2010 period – a very daunting challenge indeed for Australia's policymakers.
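As a minimal sketch of the additive LMDI idea (a hypothetical two-factor example with made-up figures, not the paper's multi-factor analysis), the change in emissions splits into an activity effect and an intensity effect whose sum reproduces the total change.

```python
# Illustrative sketch: two-factor additive LMDI decomposition of a change
# in CO2 emissions, C = Q * (C/Q) = activity * emissions intensity.
import math

def lmdi_weight(c0, c1):
    # Logarithmic mean L(c0, c1); reduces to c0 when c0 == c1.
    return c0 if math.isclose(c0, c1) else (c1 - c0) / math.log(c1 / c0)

Q0, Q1 = 100.0, 130.0        # activity (e.g. GDP), base and final year
C0, C1 = 80.0, 88.0          # CO2 emissions, base and final year
I0, I1 = C0 / Q0, C1 / Q1    # emissions intensity

w = lmdi_weight(C0, C1)
activity_effect = w * math.log(Q1 / Q0)
intensity_effect = w * math.log(I1 / I0)

# The two effects sum exactly to the total change C1 - C0.
print(f"Total change: {C1 - C0:.2f}")
print(f"Activity effect: {activity_effect:.2f}, "
      f"intensity effect: {intensity_effect:.2f}")
```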
Abstract:
This paper examines the asymmetry of changes in CO2 emissions.
Abstract:
In Australia, railway systems play a vital role in transporting the sugarcane crop from farms to mills. In this paper, a novel job shop approach is proposed to create a more efficient integrated harvesting and sugarcane transport scheduling system that reduces the cost of sugarcane transport. Several benefits can be attained by treating the train scheduling problem as a job shop problem. The job shop formulation is generic and suitable for any train scheduling problem, and it prevents two trains from operating on one section at the same time because each section, like a machine, is treated as unique. The technique is therefore more likely to find better solutions in reasonable computation times.
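To illustrate the analogy (a toy sketch, not the paper's scheduler), the code below treats track sections as machines that hold one train at a time and builds a simple conflict-free schedule by releasing each train through its route in turn; the trains, sections, and traversal times are invented.

```python
# Illustrative sketch of the job shop analogy: each track section is a
# "machine" holding one train at a time; each train is a "job" visiting
# sections in order. A simple list scheduler enforces the rule.
trains = {
    "train_A": [("S1", 10), ("S2", 15), ("S3", 5)],
    "train_B": [("S2", 12), ("S3", 8), ("S1", 6)],
}

section_free = {}                        # earliest time each section is free
train_ready = {name: 0 for name in trains}
schedule = []

for name, route in trains.items():
    for section, duration in route:
        start = max(train_ready[name], section_free.get(section, 0))
        end = start + duration
        schedule.append((name, section, start, end))
        train_ready[name] = end
        section_free[section] = end      # section occupied until 'end'

for entry in schedule:
    print("{}: section {} from t={} to t={}".format(*entry))
```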
Abstract:
Detect and Avoid (DAA) technology is widely acknowledged as a critical enabler for unsegregated Remotely Piloted Aircraft (RPA) operations, particularly Beyond Visual Line of Sight (BVLOS). Image-based DAA in the visible spectrum is a promising technological option for addressing the challenges DAA presents. Two impediments to progress for this approach are the scarcity of available video footage for training and testing algorithms, and the lack of testing regimes and specifications that facilitate repeatable, statistically valid performance assessment. This paper includes three key contributions undertaken to address these impediments. First, we detail our progress towards the creation of a large hybrid collision and near-collision encounter database. Second, we explore the suitability of techniques employed by the biometric research community (Speaker Verification and Language Identification) for DAA performance optimisation and assessment. These techniques include Detection Error Trade-off (DET) curves, Equal Error Rates (EER) and the Detection Cost Function (DCF). Finally, the hybrid database and the speech-based techniques are combined and employed in the assessment of a contemporary image-based DAA system, which includes stabilisation, morphological filtering and a Hidden Markov Model (HMM) temporal filter.
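As a small sketch of one of these metrics (synthetic scores, not the paper's system outputs), the example below estimates an Equal Error Rate from detection scores by finding the operating point where the false-alarm and miss rates coincide.

```python
# Illustrative sketch: Equal Error Rate (EER) from detector scores, as
# used in speaker verification and borrowed here for DAA assessment.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
target_scores = rng.normal(2.0, 1.0, 500)      # aircraft present
nontarget_scores = rng.normal(0.0, 1.0, 500)   # aircraft absent

y = np.concatenate([np.ones(500), np.zeros(500)])
scores = np.concatenate([target_scores, nontarget_scores])

fpr, tpr, _ = roc_curve(y, scores)
fnr = 1 - tpr                                  # miss rate
eer_index = np.argmin(np.abs(fnr - fpr))       # point where FNR ~ FPR
eer = (fnr[eer_index] + fpr[eer_index]) / 2
print(f"EER ~= {eer:.3f}")
```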
Abstract:
This paper presents a statistical aircraft trajectory clustering approach aimed at discriminating between typical manned and expected unmanned traffic patterns. First, a resampled version of each trajectory is modelled using a mixture of Von Mises distributions (circular statistics). Second, the remodelled trajectories are globally aligned using tools from bioinformatics. Third, the alignment scores are used to cluster the trajectories using an iterative k-medoids approach and an appropriate distance function. The approach is then evaluated using synthetically generated unmanned aircraft flights combined with real air traffic position reports taken over a sector of Northern Queensland, Australia. Results suggest that the technique is useful in distinguishing between expected unmanned and manned aircraft traffic behaviour, as well as identifying some common conventional air traffic patterns.
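To sketch the clustering step (a toy stand-in: Euclidean distances between random points replace the alignment-score distances, and all names are illustrative), here is a minimal iterative k-medoids implementation over a precomputed distance matrix.

```python
# Illustrative sketch: iterative k-medoids on a precomputed distance
# matrix, standing in for alignment-score distances between trajectories.
import numpy as np

rng = np.random.default_rng(0)
points = rng.normal(size=(30, 2))                 # stand-in trajectories
D = np.linalg.norm(points[:, None] - points[None, :], axis=-1)

k = 3
medoids = rng.choice(len(points), size=k, replace=False)
for _ in range(100):
    labels = np.argmin(D[:, medoids], axis=1)     # assign to nearest medoid
    new_medoids = medoids.copy()
    for j in range(k):
        members = np.where(labels == j)[0]
        # New medoid minimises total distance to the cluster's members.
        within = D[np.ix_(members, members)].sum(axis=1)
        new_medoids[j] = members[np.argmin(within)]
    if np.array_equal(new_medoids, medoids):      # converged
        break
    medoids = new_medoids

print("Medoid indices:", medoids)
```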
Abstract:
Electricity generation is vital in developed countries to power the many mechanical and electrical devices that people require. Unfortunately, electricity generation is costly, and although electricity can be generated on demand, it cannot be stored efficiently. Electricity generation is also difficult to manage because exact demand is unknown from one instant to the next. A number of services are required to manage fluctuations in electricity demand and to protect the system when frequency falls too low. One current approach is automatic under-frequency load shedding (AUFLS). This article proposes new methods for optimising AUFLS in New Zealand's power system. The core ideas were developed during the 2015 Maths and Industry Study Group (MISG) in Brisbane, Australia. The problem was motivated by Transpower Limited, a company that manages New Zealand's power system and transports bulk electricity from where it is generated to where it is needed. The approaches developed in this article can be used in electrical power systems anywhere in the world.