262 results for litter mixture


Relevance: 10.00%

Abstract:

This paper presents a robust stochastic framework for the incorporation of visual observations into conventional estimation, data fusion, navigation and control algorithms. The representation combines Isomap, a non-linear dimensionality reduction algorithm, with expectation maximization, a statistical learning scheme. The joint probability distribution of this representation is computed offline based on existing training data. The training phase of the algorithm results in a nonlinear and non-Gaussian likelihood model of natural features conditioned on the underlying visual states. This generative model can be used online to instantiate likelihoods corresponding to observed visual features in real-time. The instantiated likelihoods are expressed as a Gaussian mixture model and are conveniently integrated within existing non-linear filtering algorithms. Example applications based on real visual data from heterogeneous, unstructured environments demonstrate the versatility of the generative models.
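The online instantiation step can be sketched numerically. Assuming the offline phase has learnt a joint Gaussian mixture over a scalar visual state x and a scalar feature z (a deliberately minimal stand-in for the paper's Isomap/EM model; all numbers and names below are illustrative), conditioning each component on an observed feature yields the Gaussian mixture likelihood that a non-linear filter can consume:

```python
import numpy as np

def condition_gmm(weights, means, covs, z_obs):
    """Condition a joint GMM over (x, z) on an observed feature z_obs.

    means[k] = [mu_x, mu_z]; covs[k] = 2x2 joint covariance.
    Returns weights, means and variances of the Gaussian mixture over
    the state x, i.e. the instantiated likelihood for the filter.
    """
    new_w, new_mu, new_var = [], [], []
    for w, mu, S in zip(weights, means, covs):
        mu_x, mu_z = mu
        Sxx, Sxz, Szz = S[0, 0], S[0, 1], S[1, 1]
        # Standard Gaussian conditioning for each component
        mu_cond = mu_x + Sxz / Szz * (z_obs - mu_z)
        var_cond = Sxx - Sxz ** 2 / Szz
        # Reweight by the marginal likelihood of z_obs under this component
        lik = np.exp(-0.5 * (z_obs - mu_z) ** 2 / Szz) / np.sqrt(2 * np.pi * Szz)
        new_w.append(w * lik)
        new_mu.append(mu_cond)
        new_var.append(var_cond)
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu), np.array(new_var)

# A two-component joint model standing in for the offline training result
weights = [0.5, 0.5]
means = [np.array([0.0, 0.0]), np.array([5.0, 4.0])]
covs = [np.array([[1.0, 0.8], [0.8, 1.0]])] * 2

w, mu, var = condition_gmm(weights, means, covs, z_obs=3.9)
print(w, mu, var)  # mixture mass concentrates on the component near z_obs
```

Components far from the observed feature receive negligible weight, so the instantiated likelihood remains compact even when the offline model has many components.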

Relevance: 10.00%

Abstract:

This paper presents a robust stochastic model for the incorporation of natural features within data fusion algorithms. The representation combines Isomap, a non-linear manifold learning algorithm, with Expectation Maximization, a statistical learning scheme. The representation is computed offline and results in a non-linear, non-Gaussian likelihood model relating visual observations such as color and texture to the underlying visual states. The likelihood model can be used online to instantiate likelihoods corresponding to observed visual features in real-time. The likelihoods are expressed as a Gaussian Mixture Model so as to permit convenient integration within existing nonlinear filtering algorithms. The resulting compactness of the representation is especially suitable to decentralized sensor networks. Real visual data consisting of natural imagery acquired from an Unmanned Aerial Vehicle is used to demonstrate the versatility of the feature representation.

Relevance: 10.00%

Abstract:

The aim of this paper is to demonstrate the validity of using Gaussian mixture models (GMM) for representing probabilistic distributions in a decentralised data fusion (DDF) framework. GMMs are a powerful and compact stochastic representation allowing efficient communication of feature properties in large scale decentralised sensor networks. It will be shown that GMMs provide a basis for analytical solutions to the update and prediction operations for general Bayesian filtering. Furthermore, a variant of the Covariance Intersection algorithm for Gaussian mixtures will be presented ensuring a conservative update for the fusion of correlated information between two nodes in the network. In addition, purely visual sensory data will be used to show that decentralised data fusion and tracking of non-Gaussian states observed by multiple autonomous vehicles is feasible.
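For a single pair of Gaussian estimates, the conservative fusion step can be sketched as follows. This is a minimal illustration, not the paper's mixture variant (which applies a CI-style update across component pairs); the scalar weight is chosen here by a grid search minimising the determinant of the fused covariance, one common choice:

```python
import numpy as np

def covariance_intersection(mu_a, P_a, mu_b, P_b, n_grid=101):
    """Fuse two Gaussian estimates with unknown cross-correlation.

    Covariance Intersection: P^-1 = w*P_a^-1 + (1-w)*P_b^-1, with the
    fused mean weighted accordingly and w chosen to minimise det(P).
    The result is conservative for any actual correlation.
    """
    Ia, Ib = np.linalg.inv(P_a), np.linalg.inv(P_b)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        info = w * Ia + (1 - w) * Ib
        P = np.linalg.inv(info)
        mu = P @ (w * Ia @ mu_a + (1 - w) * Ib @ mu_b)
        d = np.linalg.det(P)
        if best is None or d < best[0]:
            best = (d, mu, P)
    return best[1], best[2]

# Two node estimates with complementary uncertainty (illustrative values)
mu_a, P_a = np.array([0.0, 0.0]), np.diag([1.0, 4.0])
mu_b, P_b = np.array([1.0, 1.0]), np.diag([4.0, 1.0])
mu, P = covariance_intersection(mu_a, P_a, mu_b, P_b)
```

Because the fused information matrix is a convex combination of the two inputs, the update never claims more certainty than is justified, which is what makes it safe for fusing correlated information between network nodes.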

Relevance: 10.00%

Abstract:

In this paper, we present the application of a non-linear dimensionality reduction technique for the learning and probabilistic classification of hyperspectral images. Hyperspectral imaging spectroscopy is an emerging technique for geological investigations from airborne or orbital sensors. It gives much greater information content per pixel than a normal colour image. This should greatly help with the autonomous identification of natural and man-made objects in unfamiliar terrains for robotic vehicles. However, the large information content of such data makes interpretation of hyperspectral images time-consuming and user-intensive. We propose the use of Isomap, a non-linear manifold learning technique, combined with Expectation Maximisation in graphical probabilistic models for learning and classification. Isomap is used to find the underlying manifold of the training data. This low-dimensional representation of the hyperspectral data facilitates the learning of a Gaussian Mixture Model representation, whose joint probability distributions can be calculated offline. The learnt model is then applied to the hyperspectral image at runtime and data classification can be performed.
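The classification stage can be sketched with class-conditional Gaussian mixtures. In the paper the inputs are low-dimensional Isomap embeddings of hyperspectral pixels; here synthetic 2-D points and the class names stand in for them so the sketch stays short and runnable (scikit-learn assumed available):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-ins for low-dimensional embeddings of hyperspectral pixels
# (synthetic; the paper computes these offline from training data)
train_rock = rng.normal([0.0, 0.0], 0.3, size=(150, 2))
train_soil = rng.normal([3.0, 3.0], 0.3, size=(150, 2))

# One Gaussian Mixture Model per class, learnt offline
gmm_rock = GaussianMixture(n_components=2, random_state=0).fit(train_rock)
gmm_soil = GaussianMixture(n_components=2, random_state=0).fit(train_soil)

def classify(pixels):
    """Assign each embedded pixel to the class with higher likelihood."""
    ll = np.column_stack([gmm_rock.score_samples(pixels),
                          gmm_soil.score_samples(pixels)])
    return ll.argmax(axis=1)          # 0 = rock, 1 = soil

# Runtime classification of new embedded pixels
test = np.vstack([rng.normal([0.0, 0.0], 0.3, size=(50, 2)),
                  rng.normal([3.0, 3.0], 0.3, size=(50, 2))])
truth = np.repeat([0, 1], 50)
accuracy = (classify(test) == truth).mean()
```

The class names "rock"/"soil" and all distribution parameters are hypothetical; the point is only the split between an offline model-learning phase and a cheap per-pixel likelihood evaluation at runtime.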

Relevance: 10.00%

Abstract:

In the study of traffic safety, expected crash frequencies across sites are generally estimated via the negative binomial model, assuming time-invariant safety. Since the time-invariant safety assumption may be invalid, Hauer (1997) proposed a modified empirical Bayes (EB) method. Despite the modification, no attempts have been made to examine the generalisable form of the marginal distribution resulting from the modified EB framework. Because the hyper-parameters needed to apply the modified EB method are not readily available, an assessment is lacking on how accurately the modified EB method estimates safety in the presence of time-variant safety and regression-to-the-mean (RTM) effects. This study derives the closed-form marginal distribution, and reveals that the marginal distribution in the modified EB method is equivalent to the negative multinomial (NM) distribution, which is essentially the same as the likelihood function used in the random effects Poisson model. As a result, this study shows that the gamma posterior distribution from the multivariate Poisson-gamma mixture can be estimated using the NM model or the random effects Poisson model. This study also shows that the estimation errors from the modified EB method are systematically smaller than those from the comparison group method by simultaneously accounting for the RTM and time-variant safety effects. Hence, the modified EB method via the NM model is a generalisable method for estimating safety in the presence of time-variant safety and RTM effects.
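The Poisson-gamma marginal underlying the EB framework can be checked by simulation: if the site mean λ follows a gamma distribution with shape a and scale θ, and counts are Poisson given λ, the marginal counts are negative binomial with mean aθ and variance aθ(1 + θ). A Monte Carlo sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)
a, theta = 2.0, 3.0                      # gamma shape and scale (illustrative)
lam = rng.gamma(a, theta, size=200_000)  # site-specific expected crash frequency
y = rng.poisson(lam)                     # observed crash counts per site

# Negative binomial marginal: mean = a*theta, var = a*theta*(1 + theta)
print(y.mean(), a * theta)               # sample mean vs 6.0
print(y.var(), a * theta * (1 + theta))  # sample variance vs 24.0
```

The extra-Poisson variance aθ² is exactly the between-site heterogeneity that the negative binomial (and, across multiple periods, the negative multinomial) model absorbs.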

Relevance: 10.00%

Abstract:

This paper presents a comprehensive review of scientific and grey literature on gross pollutant traps (GPTs). GPTs are designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. Their application involves professional societies, research organisations, local city councils, government agencies and the stormwater industry—often in partnership. In view of this, the 113 references include unpublished manuscripts from these bodies along with scientific peer-reviewed conference papers and journal articles. The literature reviewed was organised into a matrix of six main devices and nine research areas (testing methodologies), which include: design appraisal study, field monitoring/testing, experimental flow fields, gross pollutant capture/retention characteristics, residence time calculations, hydraulic head loss, screen blockages, flow visualisations and computational fluid dynamics (CFD). When the fifty-four-item matrix was analysed, twenty-eight research gaps were found in the tabulated literature. It was also found that the number of research gaps increased if only the scientific literature was considered. It is hoped that, in addition to informing the research community at QUT, this literature review will also be of use to other researchers in this field.

Relevance: 10.00%

Abstract:

The Reporting and Reception of Indigenous Issues in the Australian Media was a three-year project financed by the Australian government through its Australian Research Council Large Grants Scheme and run by Professor John Hartley (of Murdoch and then Edith Cowan University, Western Australia). The purpose of the research was to map the ways in which indigeneity was constructed and circulated in Australia's mediasphere. The analysis of the 'reporting' element of the project was relatively straightforward: a mixture of content analysis of a large number of items in the media, and detailed textual analysis of a smaller number of key texts. The discoveries were interesting: when analysis approaches the media as a whole, rather than focussing exclusively on news or serious drama genres, the representation of indigeneity is not nearly as homogeneous as has previously been assumed. And if researchers do not explicitly set out to uncover racism in every text, it is by no means guaranteed they will find it [1]. The question of how to approach the 'reception' of these issues - and particularly reception by indigenous Australians - proved to be a far more challenging one. In attempting to research this area, Hartley and I (working as a research assistant on the project) often found ourselves hampered by the axioms that underlie much media research. Traditionally, the 'reception' of media by indigenous people in Australia has been researched in ethnographic ways. This research repeatedly discovers that indigenous people in Australia are powerless in the face of new forms of media. Indigenous populations are represented as victims of aggressive and powerful intrusions: ‘What happens when a remote community is suddenly inundated by broadcast TV?’; ‘Overnight they will go from having no radio and television to being bombarded by three TV channels’; ‘The influence of film in an isolated, traditionally oriented Aboriginal community’ [2].
This language of ‘influence’, ‘bombarded’, and ‘inundated’ presents metaphors not just of war but of a war being lost. It tells of an unequal struggle, of a more powerful force impinging upon a weaker one. What else could be the relationship of an Aboriginal audience to something which is ‘bombarding’ them? Or by which they are ‘inundated’? This attitude might best be summed up by the title of an article by Elihu Katz: ‘Can authentic cultures survive new media?’ [3]. In such writing, there is little sense that what is being addressed might be seen as a series of discursive encounters, negotiations and acts of meaning-making in which indigenous people — communities and audiences — might be productive. Certainly, the points of concern in this type of writing are important. The question of what happens when a new communication medium is summarily introduced to a culture is certainly an important one. But the language used to describe this interaction is misleading. And it is noticeable that such writing is fascinated with the relationship of only traditionally oriented Aboriginal communities to the media of mass communication.

Relevance: 10.00%

Abstract:

Thermogravimetry combined with evolved gas mass spectrometry has been used to ascertain the stability of the ‘cave’ mineral brushite. X-ray diffraction shows that brushite from the Jenolan Caves is very pure. Thermogravimetric analysis coupled with ion current mass spectrometry shows a mass loss at 111°C due to loss of water of hydration. A further decomposition step occurs at 190°C with the conversion of hydrogen phosphate to a mixture of calcium ortho-phosphate and calcium pyrophosphate. TG-DTG shows the mineral is not stable above 111°C. A mechanism for the formation of brushite on calcite surfaces is proposed, and this mechanism has relevance to the formation of brushite in urinary tracts.

Relevance: 10.00%

Abstract:

Biochars produced by slow pyrolysis of greenwaste (GW), poultry litter (PL), papermill waste (PS), and biosolids (BS) were shown to reduce N2O emissions from an acidic Ferrosol. Similar reductions were observed for the untreated GW feedstock. Soil was amended with biochar or feedstock giving application rates of 1 and 5%. Following an initial incubation, nitrogen (N) was added at 165 kg/ha as urea. Microcosms were again incubated before being brought to 100% water-filled porosity and held at this water content for a further 47 days. The flooding phase accounted for the majority (>80%) of total N2O emissions. The control soil released 3165 mg N2O-N/m2, or 15.1% of the available N as N2O. Amendment with 1 and 5% GW feedstock significantly reduced emissions to 1470 and 636 mg N2O-N/m2, respectively. This was equivalent to 8.6 and 3.8% of applied N. The GW biochar produced at 350°C was least effective in reducing emissions, resulting in 1625 and 1705 mg N2O-N/m2 for 1 and 5% amendments. Amendment with BS biochar at 5% had the greatest impact, reducing emissions to 518 mg N2O-N/m2, or 2.2% of the applied N over the incubation period. Metabolic activity as measured by CO2 production could not explain the differences in N2O emissions between controls and amendments, nor could NH4+ or NO3− concentrations in biochar-amended soils. A decrease in NH4+ and NO3− following GW feedstock application is likely to have been responsible for reducing N2O emissions from this amendment. Reduction in N2O emissions from the biochar-amended soils was attributed to increased adsorption of NO3−. Small reductions are possible due to improved aeration and porosity leading to lower levels of denitrification and N2O emissions. Alternatively, increased pH was observed, which can drive denitrification through to dinitrogen during soil flooding.

Relevance: 10.00%

Abstract:

Hybrid system representations have been applied to many challenging modeling situations. In these hybrid system representations, a mixture of continuous and discrete states is used to capture the dominating behavioural features of a nonlinear, possibly uncertain, model under approximation. Unfortunately, the problem of how best to design a suitable hybrid system model has not yet been fully addressed. This paper proposes a new joint state-measurement relative entropy rate based approach for this design purpose. Design examples and simulation studies are presented which highlight the benefits of our proposed design approach.

Relevance: 10.00%

Abstract:

Pt/graphene nanosheet/SiC based devices are fabricated and characterized, and their performance toward hydrogen gas is investigated. The graphene nanosheets are synthesized via the reduction of spray-coated graphite oxide deposited onto SiC substrates. Raman and X-ray photoelectron spectroscopies indicate incomplete reduction of the graphite oxide, resulting in partially oxidized graphene nanosheet layers of less than 10 nm thickness. The effects of interfaces on the nonlinear behavior of the Pt/graphene and graphene/SiC junctions are investigated. Current-voltage measurements of the sensors toward 1% hydrogen in a synthetic air gas mixture at various temperatures ranging up to 100 °C are performed. From the dynamic response, a voltage shift of ∼100 mV is recorded for 1% hydrogen at a constant current bias of 1 mA at 100 °C. © 2010 American Chemical Society.

Relevance: 10.00%

Abstract:

Epilepsy is characterized by the spontaneous and seemingly unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. An automatic system that detects seizure onsets would allow patients or the people near them to take appropriate precautions, and could provide more insight into this phenomenon. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by the higher order spectra (HOS) has been reported to be a promising approach to differentiate between normal, background (pre-ictal) and epileptic EEG signals. In this work, we made a comparative study of the performance of Gaussian mixture model (GMM) and Support Vector Machine (SVM) classifiers using the features derived from HOS and from the power spectrum. Results show that the selected HOS based features achieve 93.11% classification accuracy compared to 88.78% with features derived from the power spectrum for a GMM classifier. The SVM classifier achieves an improvement from 86.89% with features based on the power spectrum to 92.56% with features based on the bispectrum.
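The comparison can be sketched with synthetic features standing in for the HOS-derived ones. The 2-D points and three well-separated classes below are illustrative, so both classifiers score near 100% here rather than the paper's reported accuracies (scikit-learn assumed available):

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-ins for HOS-derived EEG features: three classes
# (normal, pre-ictal, epileptic), 100 feature vectors each
centers = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 4.0]])
X = np.vstack([rng.normal(c, 0.5, size=(100, 2)) for c in centers])
y = np.repeat([0, 1, 2], 100)

# GMM classifier: one mixture per class, predict by maximum likelihood
gmms = [GaussianMixture(n_components=2, random_state=0).fit(X[y == k])
        for k in range(3)]
gmm_pred = np.column_stack([g.score_samples(X) for g in gmms]).argmax(axis=1)

# SVM classifier trained on the same features
svm_pred = SVC(kernel="rbf").fit(X, y).predict(X)

gmm_acc = (gmm_pred == y).mean()
svm_acc = (svm_pred == y).mean()
```

The GMM route models each class density generatively and compares likelihoods, while the SVM discriminates between classes directly; with real, overlapping HOS features the two can rank differently, as the reported accuracies show.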