225 results for BENCHMARK


Relevance: 10.00%

Abstract:

This thesis presents a novel approach to mobile robot navigation using visual information, towards the goal of long-term autonomy. A novel concept, the continuous appearance-based trajectory, is proposed to overcome the limitations of previous robot navigation systems, and two new algorithms for mobile robots, CAT-SLAM and CAT-Graph, are presented and evaluated. These algorithms outperform state-of-the-art methods on public benchmark datasets and in large-scale real-world environments, and will help enable the widespread use of mobile robots in everyday applications.

Relevance: 10.00%

Abstract:

The coupling of kurtosis-based indexes and envelope analysis represents one of the most successful and widespread procedures for the diagnostics of incipient faults in rolling element bearings. Kurtosis-based indexes are often used to select the proper demodulation band for the application of envelope-based techniques. Kurtosis itself, in slightly different formulations, is applied to the prognostics and condition monitoring of rolling element bearings as a standalone tool for a fast indication of developing faults. This paper shows for the first time the strong analytical connection that holds between these two families of indexes. In particular, analytical identities are derived for the squared envelope spectrum (SES) and the kurtosis of the corresponding band-pass filtered analytic signal, and it is demonstrated how the sum of the peaks in the SES corresponds to the raw 4th-order moment. The analytical results also reveal a link with another signal processing technique recently used in bearing diagnostics: cepstrum pre-whitening. These results form the basis for the discussion of an optimal indicator for the choice of the demodulation band, the ratio of cyclic content (RCC), which endows the kurtosis with selectivity in the cyclic frequency domain and whose performance is compared with more traditional kurtosis-based indicators such as the protrugram. A benchmark, performed on numerical simulations and on experimental data from two different test rigs, demonstrates the superior effectiveness of this indicator. Finally, a short introduction to the potential of the newly proposed index in the field of prognostics is given through an additional experimental example, in which the RCC is tested on data collected from an endurance bearing test rig and shown to track the development of the damage with a single numerical index.
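As a rough illustration of the quantities involved (not the paper's derivation or code), the following sketch builds a toy amplitude-modulated signal, band-pass filters it, forms the analytic signal with a Hilbert transform, and computes both the squared envelope spectrum and the raw 4th-order moment of the envelope; the filter band, modulation frequency, and all signal parameters are arbitrary choices for the demo.

```python
# Toy sketch only: SES and raw 4th-order moment of a band-pass filtered
# analytic signal; parameters are arbitrary, not taken from the paper.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 20000.0                                   # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
# amplitude-modulated resonance (fault-like modulation at 97 Hz) buried in noise
x = (1 + 0.8 * np.cos(2 * np.pi * 97 * t)) * np.cos(2 * np.pi * 4000 * t)
x += 0.2 * rng.standard_normal(t.size)

b, a = butter(4, [3000, 5000], btype="bandpass", fs=fs)
xa = hilbert(filtfilt(b, a, x))                # band-pass filtered analytic signal

se = np.abs(xa) ** 2                           # squared envelope
ses = np.abs(np.fft.rfft(se)) / se.size        # squared envelope spectrum (magnitude)

m4 = np.mean(np.abs(xa) ** 4)                  # raw 4th-order moment of the analytic signal
# Parseval's theorem ties the energy of the SES lines to sum(se**2), i.e. to m4,
# which is the analytical bridge between SES peaks and kurtosis-type indexes.
print("raw 4th-order moment:", m4)
print("dominant SES line [Hz]:", np.argmax(ses[1:]) + 1)   # expected near 97 Hz
```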

Relevance: 10.00%

Abstract:

Diagnostics of rolling element bearings involves a combination of different signal enhancement and analysis techniques. The most common procedure comprises a first step of order tracking and synchronous averaging, which removes from the signal the undesired components synchronous with the shaft harmonics, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been derived to identify damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to eliminate the deterministic components completely, and an additional pre-whitening step is needed before the envelope analysis. Different techniques have been proposed in the past with this aim: the most widespread are linear prediction filtering and spectral kurtosis. Recently, a new pre-whitening technique based on cepstral analysis has been proposed: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it is a good candidate for the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique is tested on data measured on a full-scale industrial bearing test rig capable of reproducing harsh operating conditions. A benchmark comparison with the traditional pre-whitening techniques is made as a final verification of the potential of cepstrum pre-whitening.
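Cepstrum pre-whitening is commonly implemented by zeroing the whole real cepstrum except at zero quefrency, which is equivalent to dividing the spectrum by its own magnitude; a minimal sketch of that common formulation (not the benchmark code used in the paper) is given below.

```python
# Minimal cepstrum pre-whitening sketch: IFFT(FFT(x) / |FFT(x)|), i.e. the signal
# with a flattened magnitude spectrum; a toy illustration, not the paper's code.
import numpy as np

def cepstrum_prewhiten(x, eps=1e-12):
    X = np.fft.fft(x)
    return np.real(np.fft.ifft(X / (np.abs(X) + eps)))  # eps guards empty spectral lines

# usage: whiten a signal dominated by a deterministic tone before envelope analysis
rng = np.random.default_rng(1)
n = 4096
x = np.sin(2 * np.pi * 50 * np.arange(n) / n) + 0.1 * rng.standard_normal(n)
xw = cepstrum_prewhiten(x)
print("whitened signal std:", xw.std())
```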

Relevance: 10.00%

Abstract:

Whole-image descriptors such as GIST have been used successfully for persistent place recognition when combined with temporal or sequential filtering techniques. However, whole-image descriptor localization systems often apply a heuristic rather than a probabilistic approach to place recognition, requiring substantial environment-specific tuning prior to deployment. In this paper we present a novel online solution that uses statistical approaches to calculate place recognition likelihoods for whole-image descriptors, without requiring either environmental tuning or pre-training. Using a real-world benchmark dataset, we show that this method creates distributions appropriate to a specific environment in an online manner. Our method performs comparably to FAB-MAP in raw place recognition performance, and integrates into a state-of-the-art probabilistic mapping system to provide performance superior to whole-image methods that are not based on true probability distributions. The method provides a principled means of combining the powerful change-invariant properties of whole-image descriptors with probabilistic back-end mapping systems, without the need for prior training or system tuning.

Relevance: 10.00%

Abstract:

In this paper we present a novel place recognition algorithm inspired by recent discoveries in human visual neuroscience. The algorithm combines intolerant but fast low-resolution whole-image matching with a highly tolerant sub-image patch matching process. The approach does not require prior training and works on single images (although we use a cohort normalization score to exploit temporal frame information), alleviating the need for either a velocity signal or an image sequence and differentiating it from current state-of-the-art methods. We demonstrate the algorithm on the challenging Alderley sunny day – rainy night dataset, which has previously been solved only by integrating over 320-frame-long image sequences. The system achieves 21.24% recall at 100% precision, matching drastically different daytime and night-time images of places while successfully rejecting match hypotheses between highly aliased images of different places. The results provide a new benchmark for single-image, condition-invariant place recognition.
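As a rough, hypothetical sketch of the fast low-resolution whole-image matching component only (the sub-image patch matching stage and the cohort normalization score are not shown, and the block-average downsampling and SAD scoring used here are assumptions, not the paper's choices), a tiny thumbnail matcher might look like this:

```python
# Hypothetical low-resolution whole-image matcher: block-average thumbnails,
# normalize, and score by sum of absolute differences (lower = better match).
import numpy as np

def downsample(img, out_h=8, out_w=8):
    """Block-average a grayscale image down to a tiny thumbnail."""
    h, w = img.shape
    img = img[: h - h % out_h, : w - w % out_w]          # trim so blocks divide evenly
    return img.reshape(out_h, h // out_h, out_w, w // out_w).mean(axis=(1, 3))

def match_score(query, ref):
    """Sum of absolute differences between mean/std-normalized thumbnails."""
    q = (query - query.mean()) / (query.std() + 1e-9)
    r = (ref - ref.mean()) / (ref.std() + 1e-9)
    return np.abs(q - r).sum()

# usage: find the best-matching reference image for a query frame
rng = np.random.default_rng(0)
refs = [rng.random((480, 640)) for _ in range(5)]
query = refs[2] + 0.05 * rng.standard_normal((480, 640))
scores = [match_score(downsample(query), downsample(r)) for r in refs]
print("best match:", int(np.argmin(scores)))
```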

Relevance: 10.00%

Abstract:

A novel method of matching the stiffness and continuously variable damping of an ECAS (electronically controlled air suspension) based on LQG (linear quadratic Gaussian) control is proposed to simultaneously improve the road-friendliness and ride comfort of a two-axle school bus. Taking into account the suspension nonlinearities and the target-height-dependent variation in suspension characteristics, a stiffness model of the ECAS mounted on the drive axle of the bus was developed based on thermodynamics, and the key parameters were obtained through field tests. By determining the proper range of the target height for the ECAS of the fully loaded bus from the design requirements on the vehicle body bounce frequency, the control algorithm for the target suspension height (i.e., stiffness) was derived according to driving speed and road roughness. Taking into account the nonlinearities of a continuously variable semi-active damper, the damping force was obtained by subtracting the air spring force from the optimum integrated suspension force calculated by LQG control. Finally, a GA (genetic algorithm)-based matching method between stepped variable damping and stiffness was employed as a benchmark to evaluate the effectiveness of the LQG-based matching method. Simulation results indicate that, compared with the GA-based matching method, both the dynamic tire force and the vehicle body vertical acceleration responses are markedly reduced around the vehicle body bounce frequency with the LQG-based matching method: peak values of the dynamic tire force PSD (power spectral density) decrease by 73.6%, 60.8% and 71.9% in the three cases, and the corresponding reductions for the vehicle body vertical acceleration are 71.3%, 59.4% and 68.2%. Strong robustness to variations in driving speed and road roughness is also observed for the LQG-based matching method.
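For orientation only, the sketch below sets up a generic linear quarter-car model with placeholder parameters (not the bus model, parameters, or full LQG estimator of the paper) and computes an LQR state-feedback force with SciPy; the last lines mimic the step described in the abstract of subtracting an air-spring force from the optimal integrated suspension force to obtain the force demanded from the variable damper.

```python
# Simplified quarter-car LQR sketch with placeholder parameters; illustrative only.
import numpy as np
from scipy.linalg import solve_continuous_are

ms, mu = 4500.0, 500.0        # sprung / unsprung mass [kg] (assumed)
ks, kt = 2.0e5, 1.0e6         # suspension / tire stiffness [N/m] (assumed)

# states: [suspension deflection, sprung-mass velocity, tire deflection, unsprung velocity]
A = np.array([[0.0, 1.0, 0.0, -1.0],
              [-ks / ms, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [ks / mu, 0.0, -kt / mu, 0.0]])
B = np.array([[0.0], [1.0 / ms], [0.0], [-1.0 / mu]])

# weights trading off ride comfort, suspension travel and dynamic tire force
# (values are placeholders for the demo, not the paper's weighting)
Q = np.diag([1.0e4, 1.0e2, 1.0e5, 1.0e1])
R = np.array([[1.0e-4]])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)            # optimal state-feedback gain

x = np.array([0.02, -0.1, 0.005, 0.2])     # example measured/estimated state
u_opt = (-K @ x).item()                    # optimal integrated suspension force [N]
f_air = ks * x[0]                          # placeholder (linearized) air-spring force
f_damper = u_opt - f_air                   # force demanded from the variable damper
print(u_opt, f_damper)
```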

Relevance: 10.00%

Abstract:

In this paper, we provide an overview of the Social Event Detection (SED) task that is part of the MediaEval Benchmark for Multimedia Evaluation 2013. This task requires participants to discover social events and organize the related media items in event-specific clusters within a collection of Web multimedia. Social events are events that are planned by people, attended by people, and for which the social multimedia are also captured by people. We describe the challenges, datasets, and the evaluation methodology.

Relevance: 10.00%

Abstract:

The use of Wireless Sensor Networks (WSNs) for Structural Health Monitoring (SHM) has become a promising approach owing to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data synchronization error and data loss have prevented these systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to be able to overcome a large number of these technical uncertainties. Nevertheless, there is limited research examining the effects of the uncertainties of generic WSN platforms or verifying the capability of SHM-oriented WSNs, particularly in demanding SHM applications such as modal analysis and damage identification of real civil structures. This article first reviews the major technical uncertainties of both generic and SHM-oriented WSN platforms and the efforts of the SHM research community to cope with them. Then, the effects of the most prominent inherent WSN uncertainty on the first level of a common Output-only Modal-based Damage Identification (OMDI) approach are intensively investigated. Experimental accelerations collected by a wired sensor system on a benchmark civil structure are used as clean data before being contaminated with different levels of data pollutants to simulate practical uncertainties in both WSN platforms. Statistical analyses are employed comprehensively to uncover the distribution pattern of the uncertainty influence on the OMDI approach. The results show that the uncertainties of generic WSNs can seriously affect Level 1 OMDI methods that utilize mode shapes. They also show that SHM-oriented WSNs can substantially lessen this impact and recover true structural information without resorting to costly computational solutions.
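As a hypothetical illustration of the kind of data pollution described (the exact contamination models and levels used in the article are not reproduced here), the sketch below injects a random per-channel synchronization offset and random sample loss into clean multi-channel acceleration records.

```python
# Hypothetical contamination sketch: per-channel sync offsets and random data loss
# (filled by interpolation) applied to clean accelerations; not the article's scheme.
import numpy as np

def contaminate(acc, loss_rate=0.02, max_sync_offset=5, seed=0):
    """acc: (n_samples, n_channels) clean accelerations -> polluted copy."""
    rng = np.random.default_rng(seed)
    out = acc.copy()
    n = out.shape[0]
    for ch in range(out.shape[1]):
        # synchronization error: shift the channel by a random integer sample offset
        out[:, ch] = np.roll(out[:, ch], rng.integers(-max_sync_offset, max_sync_offset + 1))
        # data loss: randomly dropped samples, here replaced by linear interpolation
        good = rng.random(n) >= loss_rate
        out[:, ch] = np.interp(np.arange(n), np.flatnonzero(good), out[good, ch])
    return out

clean = np.random.default_rng(1).standard_normal((10000, 4))   # stand-in for wired data
polluted = contaminate(clean)
print(np.mean(np.abs(polluted - clean)))
```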

Relevance: 10.00%

Abstract:

The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires low computational effort, enabling the use of a higher-dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also considered one of the most suitable methods for modern sensing systems such as wireless sensors. Despite these advantages, the method is rather strict in its input requirements, as it assumes the training data to be multivariate normal, an assumption that does not always hold, particularly at an early monitoring stage. As a consequence, it may result in an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based on the Monte Carlo simulation methodology, with the addition of several controlling and evaluation tools to assess the condition of the output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setup for the data generation process and thereby avoid unnecessarily excessive data. The efficacy of this scheme is demonstrated via application to data from a benchmark structure in the field.
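For context, the general Mahalanobis-squared-distance novelty detection step (not the controlled data generation scheme proposed in the article) can be sketched as follows, with a chi-squared threshold that is only valid under the multivariate normality assumption discussed above; all data here are synthetic.

```python
# Mahalanobis-squared-distance novelty detection on synthetic features:
# baseline (healthy) data define the mean and covariance; test features with
# MSD above a chi-squared threshold are flagged as novel / potentially damaged.
import numpy as np
from scipy.stats import chi2

def msd(X, mean, cov_inv):
    d = X - mean
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)   # d_i^T * Sigma^-1 * d_i per row

rng = np.random.default_rng(0)
train = rng.multivariate_normal(np.zeros(5), np.eye(5), size=500)   # healthy features
mean = train.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train, rowvar=False))

test = rng.multivariate_normal(np.full(5, 3.0), np.eye(5), size=10) # shifted (damaged?)
threshold = chi2.ppf(0.99, df=5)   # valid only if the training data are multivariate normal
print(msd(test, mean, cov_inv) > threshold)
```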

Relevance: 10.00%

Abstract:

Pile foundations transfer loads from superstructures to stronger subsoil, so their strength and stability can affect structural safety. This paper treats the response of a reinforced concrete pile in saturated sand to a buried explosion. Fully coupled computer simulation techniques are used together with five different material models. The influence of reinforcement on pile response is investigated, and the important safety parameters of horizontal deformation and tensile stress in the pile are evaluated. Results indicate that adequate longitudinal reinforcement and proper detailing of transverse reinforcement can reduce pile damage. The present findings can serve as a benchmark reference for future analysis and design.

Relevance: 10.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mapper/reducer placement problem is a generalization of the classical bin packing problem, which is NP-complete. In this paper we therefore propose a new heuristic algorithm for the mapper/reducer placement problem in cloud computing and evaluate it by comparing it with several other heuristics, on solution quality and computation time, over a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional placement that puts a fixed number of mappers/reducers on each machine. The comparison shows that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement while still meeting the computation deadline.
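Since the placement problem generalizes bin packing, a classic first-fit-decreasing heuristic gives a feel for this family of algorithms; the sketch below is a generic illustration with assumed one-dimensional resource demands, not the heuristic proposed in the paper.

```python
# Generic first-fit-decreasing bin-packing sketch: place tasks with given resource
# demands onto machines of fixed capacity, opening a new machine when nothing fits.
from typing import List

def first_fit_decreasing(demands: List[float], capacity: float) -> List[List[float]]:
    machines: List[List[float]] = []   # tasks assigned to each machine
    loads: List[float] = []            # current load of each machine
    for d in sorted(demands, reverse=True):
        for i in range(len(loads)):
            if loads[i] + d <= capacity:
                machines[i].append(d)
                loads[i] += d
                break
        else:                          # no existing machine can host the task
            machines.append([d])
            loads.append(d)
    return machines

# usage: task demands expressed as fractions of one machine's capacity
print(first_fit_decreasing([0.5, 0.7, 0.2, 0.4, 0.3, 0.6], capacity=1.0))
```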

Relevance: 10.00%

Abstract:

The wide applicability of correlation analysis inspired the development of this paper. A new correlated modified particle swarm optimization (COM-PSO) algorithm is developed, and a Correlation Adjustment algorithm is proposed to restore the desired correlation between the considered variables of all particles at each iteration. It is shown that the best solution, the mean and standard deviation of the solutions over multiple runs, and the convergence speed improve when the correlation between the variables is increased; however, for some rotated benchmark functions the opposite results are obtained. Moreover, the best solution and the mean and standard deviation of the solutions improve when the number of correlated variables of the benchmark functions is increased. The simulation results and convergence performance are compared with those of the original PSO. The improvement in results, the convergence speed, and the ability of the proposed COM-PSO to simulate correlated phenomena are discussed on the basis of the experimental results.
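For reference, a baseline PSO loop on the sphere benchmark function is sketched below; the paper's Correlation Adjustment step is only marked as a placeholder comment because its details are not reproduced here, and all hyperparameters are arbitrary.

```python
# Baseline particle swarm optimization sketch (standard velocity/position update);
# the COM-PSO correlation-adjustment step is indicated but not implemented.
import numpy as np

def pso(f, dim=10, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        # <- the COM-PSO Correlation Adjustment step would act on x here (not shown)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

best_x, best_f = pso(lambda z: np.sum(z ** 2))   # sphere benchmark function
print(best_f)
```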

Relevance: 10.00%

Abstract:

Experiences from loss reduction projects in the electric power distribution companies (EPDCs) of Iran are presented. The loss reduction methods proposed individually by 14 EPDCs are listed, together with the corresponding energy savings (ES), investment costs (IC), and loss rate reductions. To illustrate the effectiveness and performance of the loss reduction methods, three parameters are proposed: energy saving per investment cost (ESIC), energy saving per quantity (ESPQ), and investment cost per quantity (ICPQ). The overall ESIC of the 14 EPDCs, as well as the average and standard deviation of the ESIC for each individual method, is presented and compared. In addition, the average and standard deviation of the ESPQ and ICPQ for each loss reduction method are provided and investigated. These parameters are useful, both as a benchmark and as background for planning purposes, for EPDCs that intend to reduce electric losses in distribution networks.
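As a trivial worked example with made-up numbers (not data from the 14 EPDCs), the three comparison parameters are simple ratios:

```python
# Worked example with assumed values for a single loss-reduction method.
energy_saving_mwh = 1200.0     # ES: assumed annual energy saving [MWh]
investment_cost = 450_000.0    # IC: assumed investment cost [monetary units]
quantity = 300                 # Q: assumed quantity of the applied measure (e.g. units installed)

esic = energy_saving_mwh / investment_cost   # energy saving per investment cost
espq = energy_saving_mwh / quantity          # energy saving per quantity
icpq = investment_cost / quantity            # investment cost per quantity
print(esic, espq, icpq)
```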

Relevance: 10.00%

Abstract:

UV-vis photodissociation action spectroscopy is becoming increasingly prevalent because of advances in, and the commercial availability of, ion trapping technologies and tunable laser sources. This study outlines in detail an instrumental arrangement, combining a commercial ion-trap mass spectrometer and a tunable nanosecond pulsed laser source, for performing fully automated photodissociation action spectroscopy on gas-phase ions. The components of the instrumentation are outlined, including the optical and electronic interfacing, in addition to the control software for automating the experiment and performing online analysis of the spectra. To demonstrate the utility of this ensemble, the photodissociation action spectra of the 4-chloroanilinium, 4-bromoanilinium, and 4-iodoanilinium cations are presented and discussed. Multiple photoproducts are detected in each case, and the photoproduct yields are followed as a function of laser wavelength. It is shown that the wavelength-dependent partitioning of the halide-loss, H-loss, and NH₃-loss channels can be broadly rationalized in terms of the relative carbon–halide bond dissociation energies and processes of energy redistribution. The photodissociation action spectrum of (phenyl)Ag₂⁺ is compared with a literature spectrum as a further benchmark.
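As a small, hypothetical illustration of following photoproduct yields as a function of laser wavelength (the normalization and all intensity values below are assumptions, not data from the study), one might compute per-channel yields like this:

```python
# Hypothetical photoproduct-yield calculation: fragment counts divided by the total
# ion count at each scanned wavelength; all numbers are invented for the demo.
import numpy as np

wavelengths_nm = np.array([260.0, 262.0, 264.0])           # scanned laser wavelengths
precursor = np.array([9000.0, 7000.0, 8000.0])             # surviving precursor ion counts
fragments = {                                               # photoproduct ion counts
    "halide loss": np.array([400.0, 900.0, 600.0]),
    "H loss": np.array([150.0, 300.0, 200.0]),
    "NH3 loss": np.array([50.0, 120.0, 80.0]),
}

total = precursor + sum(fragments.values())
for channel, counts in fragments.items():
    yield_fraction = counts / total                         # photoproduct yield per wavelength
    print(channel, np.round(yield_fraction, 3))
```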

Relevance: 10.00%

Abstract:

Sugar cane is a major source of food and fuel worldwide. Biotechnology has the potential to improve economically important traits in sugar cane as well as to diversify sugar cane beyond traditional applications such as sucrose production. High levels of transgene expression are key to the success of improving crops through biotechnology. Here we describe new molecular tools that both expand and improve gene expression capabilities in sugar cane. We have identified promoters that can be used to drive high levels of gene expression in the leaf and stem of transgenic sugar cane. One of these promoters, derived from the Cestrum yellow leaf curling virus, drives levels of constitutive transgene expression significantly higher than those achieved by the historical benchmark, the maize polyubiquitin-1 (Zm-Ubi1) promoter. A second promoter, the maize phosphoenolpyruvate carboxylase promoter, was found to be a strong, leaf-preferred promoter that enables levels of expression comparable to Zm-Ubi1 in this organ. Transgene expression was increased approximately 50-fold by gene modification, which included optimising the codon usage of the coding sequence to better suit sugar cane. We also describe a novel dual transcriptional enhancer that increased gene expression from different promoters, boosting expression from Zm-Ubi1 more than eightfold. These molecular tools will be extremely valuable for the improvement of sugar cane through biotechnology.