2 results for Modeling Non-Verbal Behaviors Using Machine Learning
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Replicable experimental studies using a novel experimental facility and a machine-based odour quantification technique were conducted to demonstrate the relationship between odour emission rates and pond loading rates. The odour quantification technique consisted of an electronic nose, the AromaScan A32S, and an artificial neural network. Odour concentrations determined by olfactometry were used together with the AromaScan responses to train the artificial neural network. The trained network was able to predict the odour emission rates for the test data with a correlation coefficient of 0.98. Time-averaged odour emission rates predicted by the machine-based odour quantification technique were strongly correlated with volatile solids loading rate, demonstrating the increased magnitude of emissions from a heavily loaded effluent pond. However, it was not possible to obtain the same relationship between volatile solids loading rates and odour emission rates from the individual data points. It is concluded that taking a limited number of odour samples over a short period is unlikely to provide a representative rate of odour emissions from an effluent pond. A continuous odour monitoring instrument will be required for that more demanding task.
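The abstract's core technique, training an artificial neural network to map electronic-nose sensor responses to olfactometry-derived odour concentrations, can be sketched as follows. This is a minimal illustration only: the sensor data here are synthetic placeholders (the original study used AromaScan A32S responses and olfactometry measurements, which are not available), and the network architecture is an assumption, not the one used in the paper.

```python
# Sketch: train a small neural-network regressor on electronic-nose style
# sensor responses, with odour concentrations as the target. All data below
# are synthetic stand-ins for the AromaScan A32S responses and olfactometry
# labels described in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 32))  # 32 sensor channels (placeholder)
y = X.sum(axis=1)                          # synthetic "odour concentration" target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A small multilayer perceptron stands in for the paper's (unspecified) network.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)

# Evaluate as in the abstract: correlation between predicted and measured values.
r = np.corrcoef(model.predict(X_te), y_te)[0, 1]
print(f"test correlation coefficient: {r:.2f}")
```

On real data, this is the step the abstract reports achieving a correlation coefficient of 0.98; the figure here depends entirely on the synthetic inputs.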
Abstract:
Agricultural pests are responsible for millions of dollars in crop losses and management costs every year. In order to implement optimal site-specific treatments and reduce control costs, new methods to accurately monitor and assess pest damage need to be investigated. In this paper, we explore the combination of unmanned aerial vehicles (UAVs), remote sensing and machine learning techniques as a promising methodology to address this challenge. The deployment of UAVs as a sensor platform is a rapidly growing field of study for biosecurity and precision agriculture applications. In this experiment, a data collection campaign is performed over a sorghum crop severely damaged by white grubs (Coleoptera: Scarabaeidae). The larvae of these scarab beetles feed on the roots of plants, which in turn impairs root exploration of the soil profile. In the field, crop health status could be classified into three levels: bare soil where plants were decimated, transition zones of reduced plant density and healthy canopy areas. In this study, we describe the UAV platform deployed to collect high-resolution RGB imagery as well as the image processing pipeline implemented to create an orthoimage. An unsupervised machine learning approach is formulated in order to create a meaningful partition of the image into each of the crop levels. The aim of this approach is to simplify the image analysis step by minimizing user input requirements and avoiding the manual data labelling necessary in supervised learning approaches. The implemented algorithm is based on the K-means clustering algorithm. In order to control high-frequency components present in the feature space, a neighbourhood-oriented parameter is introduced by applying Gaussian convolution kernels prior to K-means clustering. The results show the algorithm delivers consistent decision boundaries that classify the field into three clusters, one for each crop health level as shown in Figure 1.
The methodology presented in this paper represents an avenue for further research towards automated crop damage assessment and biosecurity surveillance.
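The segmentation pipeline the abstract describes, Gaussian convolution to suppress high-frequency components in the feature space followed by K-means clustering into three crop-health classes, can be sketched as below. The orthoimage here is a synthetic placeholder (three bands mimicking bare soil, transition and healthy canopy), and the kernel width `sigma=2` is an assumed value, not one reported in the paper.

```python
# Sketch: Gaussian smoothing of RGB features prior to K-means clustering,
# as in the described unsupervised crop-damage segmentation. The "orthoimage"
# is a synthetic 60x60 stand-in with three horizontal bands.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = np.zeros((60, 60, 3))
img[:20] = [0.6, 0.4, 0.3]    # bare soil (placeholder colour)
img[20:40] = [0.4, 0.5, 0.3]  # transition zone of reduced plant density
img[40:] = [0.2, 0.7, 0.2]    # healthy canopy
img += rng.normal(0.0, 0.05, img.shape)  # high-frequency noise

# Neighbourhood-oriented step: Gaussian convolution applied per channel
# to control high-frequency components before clustering (sigma assumed).
smooth = np.stack(
    [gaussian_filter(img[..., c], sigma=2) for c in range(3)], axis=-1
)

# K-means partitions the smoothed feature space into three clusters,
# one per crop health level.
labels = (
    KMeans(n_clusters=3, n_init=10, random_state=0)
    .fit_predict(smooth.reshape(-1, 3))
    .reshape(60, 60)
)
print("clusters found:", np.unique(labels))
```

Because K-means operates per pixel-feature, the Gaussian pre-smoothing is what keeps the resulting decision boundaries spatially coherent rather than speckled by noise.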