334 results for RM extended algorithm


Relevance: 20.00%

Abstract:

We propose an iterative algorithm to detect transient segments in audio signals. The short-time Fourier transform (STFT) is used to detect rapid local changes in the audio signal. The algorithm iterates two steps: (a) calculating a function of the STFT and (b) building a transient signal. A dynamic thresholding scheme is used to locate the potential positions of transients in the signal. The iterative procedure ensures that genuine transients are built up while localised spectral noise is suppressed by an energy criterion. The extracted transient signal is then compared to a ground-truth dataset. The algorithm performed well on two databases: on the EBU-SQAM database of monophonic sounds it achieved an F-measure of 90%, while on our database of polyphonic audio it achieved an F-measure of 91%. This technique is being used as a preprocessing step for a tempo-analysis algorithm and a TSR (Transients + Sines + Residue) decomposition scheme.
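A minimal sketch of the two-step idea in the abstract above, with per-frame energy standing in for the STFT-based function and a median-based dynamic threshold; all parameter values and names are illustrative, not the paper's:

```python
def detect_transients(signal, frame=64, hop=32, k=2.5):
    """Hypothetical sketch: (a) compute a per-frame energy function
    (a stand-in for the STFT-based function), then (b) keep frames whose
    energy exceeds a dynamic, median-based threshold as transients."""
    # Step (a): frame-wise energy over a sliding window.
    energies = []
    for start in range(0, len(signal) - frame + 1, hop):
        chunk = signal[start:start + frame]
        energies.append(sum(x * x for x in chunk))
    if not energies:
        return []
    # Step (b): dynamic threshold relative to the median frame energy.
    threshold = k * sorted(energies)[len(energies) // 2]
    return [i * hop for i, e in enumerate(energies) if e > threshold]

# A short click embedded in low-level noise is flagged near its position.
sig = [0.01] * 256 + [1.0] * 8 + [0.01] * 256
print(detect_transients(sig))
```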

Relevance: 20.00%

Abstract:

It has been shown recently that the acoustic performance of extended-tube expansion chambers can be improved substantially by making the extended inlet and outlet equal to half and a quarter of the chamber length, respectively, duly incorporating the end corrections due to the evanescent higher-order modes generated at the discontinuities. Such chambers, however, suffer from the disadvantages of high back pressure and the generation of aerodynamic noise at the area discontinuities. Both disadvantages can be overcome by means of a perforated bridge between the extended inlet and extended outlet. This paper deals with the design, or tuning, of these extended concentric tube resonators.
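The tuning rule quoted above (extended inlet of half and extended outlet of a quarter of the chamber length, each shortened by an end correction) can be written down directly; the function name and the end-correction value below are illustrative assumptions:

```python
def tuned_extension_lengths(chamber_length, end_correction=0.0):
    """The rule stated in the abstract: extended inlet ~ L/2 and extended
    outlet ~ L/4, each reduced by the end correction accounting for the
    evanescent higher-order modes (correction value assumed)."""
    inlet = chamber_length / 2.0 - end_correction
    outlet = chamber_length / 4.0 - end_correction
    return inlet, outlet

# A 0.4 m chamber with an assumed 10 mm end correction.
print(tuned_extension_lengths(0.4, 0.01))
```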

Relevance: 20.00%

Abstract:

Effective conservation and management of natural resources require up-to-date information on land cover (LC) types and their dynamics. LC dynamics are captured using multi-resolution remote sensing (RS) data with appropriate classification strategies. RS data combined with important environmental layers (either remotely acquired or derived from ground measurements) are, however, more effective in addressing LC dynamics and associated changes. Compared to conventional classification techniques, these ancillary layers provide additional information for delineating LC class decision boundaries. This communication examines whether ancillary and derived geographical layers such as vegetation index, temperature, digital elevation model (DEM), aspect, slope and texture improve the classification accuracy of RS data. This has been implemented in three terrains of varying topography. The study would help in selecting appropriate ancillary data, depending on the terrain, for better classified information.
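As a toy illustration of why ancillary layers help, a hypothetical minimum-distance (nearest-centroid) classifier separates two spectrally identical classes only once a DEM layer is stacked onto the spectral bands; all data, class names and function names here are invented:

```python
from collections import defaultdict

def stack_features(bands, ancillary):
    """Per-pixel feature vectors: spectral bands augmented with ancillary
    layers (e.g. DEM, slope, vegetation index)."""
    return [list(b) + list(a) for b, a in zip(bands, ancillary)]

def nearest_centroid(train_X, train_y, x):
    # Class centroids from labelled pixels, then minimum-distance assignment.
    sums, counts = {}, defaultdict(int)
    for feats, label in zip(train_X, train_y):
        sums.setdefault(label, [0.0] * len(feats))
        sums[label] = [s + f for s, f in zip(sums[label], feats)]
        counts[label] += 1
    dist = lambda c: sum((ci - xi) ** 2 for ci, xi in zip(c, x))
    return min(sums, key=lambda lbl: dist([v / counts[lbl] for v in sums[lbl]]))

# Two classes with identical spectra are separable only via elevation (DEM).
bands = [[0.5], [0.5], [0.5], [0.5]]
dem = [[100.0], [110.0], [900.0], [920.0]]
X = stack_features(bands, dem)
y = ["crop", "crop", "forest", "forest"]
print(nearest_centroid(X, y, [0.5, 905.0]))
```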

Relevance: 20.00%

Abstract:

Algorithms for adaptive mesh refinement using a residual error estimator are proposed for fluid flow problems in a finite volume framework. The residual error estimator, referred to as the R-parameter, is used to derive refinement and coarsening criteria for the adaptive algorithms. An adaptive strategy based on the R-parameter is proposed for continuous flows, while a hybrid adaptive algorithm employing a combination of error indicators and the R-parameter is developed for discontinuous flows. Numerical experiments for inviscid and viscous flows on different grid topologies demonstrate the effectiveness of the proposed algorithms on arbitrary polygonal grids.
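A one-dimensional sketch of the refine/coarsen logic such criteria drive; the estimator below is a generic stand-in for the paper's R-parameter, and the coarsening branch (a symmetric merge of neighbouring low-error cells) is only noted in a comment:

```python
def adapt_cells(cells, estimator, refine_tol):
    """Hypothetical refinement pass: cells whose error estimate exceeds
    refine_tol are bisected; a full scheme would also merge neighbouring
    cells whose estimates fall below a coarsening tolerance."""
    new_cells = []
    for a, b in cells:
        if estimator(a, b) > refine_tol:
            mid = 0.5 * (a + b)
            new_cells.extend([(a, mid), (mid, b)])  # refine: bisect the cell
        else:
            new_cells.append((a, b))                # keep the cell as-is
    return new_cells

# With cell width as a proxy error estimate, one pass bisects the wide cell.
cells = [(0.0, 1.0), (1.0, 1.2)]
print(adapt_cells(cells, lambda a, b: b - a, refine_tol=0.3))
```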

Relevance: 20.00%

Abstract:

A fully discrete C⁰ interior penalty finite element method is proposed and analyzed for the extended Fisher-Kolmogorov (EFK) equation u_t + γΔ²u - Δu + u³ - u = 0 with appropriate initial and boundary conditions, where γ is a positive constant. We derive a regularity estimate for the solution u of the EFK equation that is explicit in γ, and as a consequence we derive a priori error estimates that are robust in γ. (C) 2013 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Abstract:

We address the problem of mining targeted association rules over multidimensional market-basket data. Here, each transaction has, in addition to the set of purchased items, ancillary dimension attributes associated with it. Based on these dimensions, transactions can be visualized as distributed over the cells of an n-dimensional cube. In this framework, a targeted association rule is of the form {X -> Y}_R, where R is a convex region in the cube and X -> Y is a traditional association rule within region R. We first describe the TOARM algorithm, based on classical techniques, for identifying targeted association rules. Then, we discuss the concepts of bottom-up aggregation and cubing, leading to the CellUnion technique. This approach is further extended, using notions of cube-count interleaving and credit-based pruning, to derive the IceCube algorithm. Our experiments demonstrate that IceCube consistently provides the best execution-time performance, especially for large and complex data cubes.
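A baseline evaluation of a targeted rule {X -> Y}_R can be sketched as a region filter followed by an ordinary support/confidence count (this is the naive scan, not the CellUnion or IceCube optimizations; all names and data are illustrative):

```python
def targeted_rule_stats(transactions, region, X, Y):
    """Restrict transactions to the cells of region R in the dimension
    cube, then measure X -> Y as a traditional rule within that region."""
    in_region = [t for t in transactions if t["dims"] in region]
    n = len(in_region)
    n_x = sum(1 for t in in_region if X <= t["items"])
    n_xy = sum(1 for t in in_region if (X | Y) <= t["items"])
    support = n_xy / n if n else 0.0
    confidence = n_xy / n_x if n_x else 0.0
    return support, confidence

# Dimensions: (city, month); region R covers only the "Delhi" cells.
txns = [
    {"dims": ("Delhi", 1), "items": {"bread", "milk"}},
    {"dims": ("Delhi", 2), "items": {"bread"}},
    {"dims": ("Pune", 1), "items": {"bread", "milk"}},
]
R = {("Delhi", 1), ("Delhi", 2)}
print(targeted_rule_stats(txns, R, {"bread"}, {"milk"}))
```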

Relevance: 20.00%

Abstract:

The rapid growth in the field of data mining has led to the development of various methods for outlier detection. Though detection of outliers has been well explored in the context of numerical data, dealing with categorical data is still evolving. In this paper, we propose a two-phase algorithm for detecting outliers in categorical data based on a novel definition of outliers. In the first phase, this algorithm derives a clustering of the given data, followed by a ranking phase that determines the set of most likely outliers. The proposed algorithm is expected to perform better as it can identify different types of outliers, employing two independent ranking schemes based on attribute value frequencies and the inherent clustering structure of the given data. Unlike some existing methods, the computational complexity of this algorithm is not affected by the number of outliers to be detected. The efficacy of this algorithm is demonstrated through experiments on various public-domain categorical data sets.
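One of the two ranking ideas named above, ranking by attribute value frequencies, can be sketched in a few lines; this omits the clustering-based ranking and is only an illustration of the principle:

```python
from collections import Counter

def rank_outliers(records, k=1):
    """Records built from globally rare attribute values receive low
    frequency scores and are ranked as the most likely outliers."""
    n_attrs = len(records[0])
    # Per-attribute value frequencies across the whole data set.
    freq = [Counter(r[j] for r in records) for j in range(n_attrs)]
    score = lambda r: sum(freq[j][r[j]] for j in range(n_attrs))
    return sorted(records, key=score)[:k]

# The single (blue, square) record is rare in every attribute.
data = [("red", "round")] * 5 + [("blue", "square")]
print(rank_outliers(data))
```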

Relevance: 20.00%

Abstract:

We present a novel multi-timescale Q-learning algorithm for average-cost control in a Markov decision process subject to multiple inequality constraints. We formulate a relaxed version of this problem through the Lagrange multiplier method. Our algorithm differs from Q-learning in that it updates two parameters: a Q-value parameter and a policy parameter. The Q-value parameter is updated on a slower timescale than the policy parameter. Whereas Q-learning with function approximation can diverge in some cases, our algorithm is seen to be convergent as a result of the aforementioned timescale separation. We show the results of experiments on a problem of constrained routing in a multistage queueing network. Our algorithm is seen to exhibit good performance, and the various inequality constraints are seen to be satisfied upon convergence of the algorithm.
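A single update of the Lagrangian relaxation idea can be sketched as follows. This is a simplified stand-in, not the paper's two-timescale actor scheme: here the constraint cost enters the reward through a multiplier, and the Q-value and the multiplier move on different stepsizes; every name and parameter value is an assumption:

```python
def constrained_q_step(Q, lam, s, a, reward, cost, s2, actions,
                       alpha=0.01, beta=0.1, budget=1.0, discount=0.9):
    """One hypothetical update: fold the constraint cost into the reward
    via the Lagrange multiplier lam, update Q on stepsize alpha, and
    update lam by dual ascent on a different stepsize beta."""
    penalised = reward - lam * cost
    target = penalised + discount * max(Q[(s2, b)] for b in actions)
    Q[(s, a)] += alpha * (target - Q[(s, a)])
    # Dual ascent: lam grows while the incurred cost exceeds the budget.
    lam = max(0.0, lam + beta * (cost - budget))
    return lam

# One illustrative update on a two-state, two-action toy problem.
Q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}
lam = constrained_q_step(Q, 0.0, s=0, a=1, reward=1.0, cost=2.0,
                         s2=1, actions=(0, 1))
```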

Relevance: 20.00%

Abstract:

Using a Girsanov change of measure, we propose novel variations within a particle-filtering algorithm, applied to the inverse problem of state and parameter estimation in nonlinear dynamical systems of engineering interest. The variations weakly correct for the linearization or integration errors that almost invariably occur when numerically propagating the process dynamics, typically governed by nonlinear stochastic differential equations (SDEs). Specifically, the correction for linearization, provided by the likelihood or the Radon-Nikodym derivative, is incorporated within the evolving flow in two steps. The likelihood, an exponential martingale, is split into a product of two factors; the correction owing to the first factor is implemented via rejection sampling in the first step. The second factor, which is directly computable, is accounted for via two different schemes: one employing resampling, and the other using a gain-weighted innovation term added to the drift field of the process dynamics, thereby overcoming the problem of sample dispersion posed by resampling. The proposed strategies, employed as add-ons to existing particle filters (the bootstrap and auxiliary SIR filters in this work), are found to non-trivially improve the convergence and accuracy of the estimates and to yield reduced mean-square errors vis-a-vis those obtained through the parent filtering schemes.
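For orientation, one step of the plain bootstrap filter, the parent scheme the corrections above are added on to, looks as follows; this sketch contains none of the Girsanov-based corrections, and the toy model at the bottom is invented:

```python
import math
import random

def bootstrap_step(particles, weights, propagate, likelihood, obs, rng):
    """Generic bootstrap-filter step: propagate each particle through the
    dynamics, reweight by the observation likelihood, then resample."""
    particles = [propagate(p, rng) for p in particles]
    weights = [w * likelihood(obs, p) for w, p in zip(weights, particles)]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Multinomial resampling resets the weights to uniform.
    particles = rng.choices(particles, weights=weights, k=len(particles))
    return particles, [1.0 / len(particles)] * len(particles)

# Toy scalar model: random-walk dynamics, Gaussian-shaped observation weight.
rng = random.Random(0)
ps, ws = [0.0] * 100, [0.01] * 100
for obs in (1.0, 1.0, 1.0):
    ps, ws = bootstrap_step(ps, ws,
                            lambda p, r: p + r.gauss(0.0, 0.5),
                            lambda y, p: math.exp(-(y - p) ** 2),
                            obs, rng)
```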

Relevance: 20.00%

Abstract:

Latent variable methods, such as PLCA (Probabilistic Latent Component Analysis), have been successfully used for analysis of non-negative signal representations. In this paper, we formulate PLCS (Probabilistic Latent Component Segmentation), which models each time frame of a spectrogram as a spectral distribution. Given the signal spectrogram, the segmentation boundaries are estimated using a maximum-likelihood approach. For an efficient solution, the algorithm imposes a hard constraint that each segment is modelled by a single latent component. This hard constraint makes ML boundary estimation solvable by dynamic programming. Unlike earlier ML segmentation techniques, the PLCS framework does not impose a parametric assumption. PLCS can be naturally extended to model coarticulation between successive phones. Experiments on the TIMIT corpus show that the proposed technique is promising compared to state-of-the-art speech segmentation algorithms.
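The dynamic-programming step that the one-component-per-segment constraint enables can be sketched generically; the cost function below is a stand-in for the negative log-likelihood of modelling a frame range with a single component, and the data are invented:

```python
def segment_dp(cost, n_frames, n_segments):
    """Given a cost(i, j) for modelling frames i..j-1 with one model,
    find the segmentation into n_segments of minimum total cost."""
    INF = float("inf")
    best = [[INF] * (n_segments + 1) for _ in range(n_frames + 1)]
    back = [[0] * (n_segments + 1) for _ in range(n_frames + 1)]
    best[0][0] = 0.0
    for j in range(1, n_frames + 1):
        for k in range(1, n_segments + 1):
            for i in range(k - 1, j):
                c = best[i][k - 1] + cost(i, j)
                if c < best[j][k]:
                    best[j][k], back[j][k] = c, i
    bounds, j = [], n_frames          # backtrack the chosen split points
    for k in range(n_segments, 0, -1):
        j = back[j][k]
        bounds.append(j)
    return sorted(bounds)[1:]         # interior boundaries only

# Two clearly different regimes are split at frame 3.
frames = [0.0, 0.0, 0.0, 5.0, 5.0, 5.0]
def sse(i, j):
    seg = frames[i:j]
    mean = sum(seg) / len(seg)
    return sum((x - mean) ** 2 for x in seg)
print(segment_dp(sse, 6, 2))
```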

Relevance: 20.00%

Abstract:

We theoretically propose and computationally demonstrate the generation of an extended light-sheet for fluorescence microscopy. This is made possible by a specially designed double-window spatial filter that allows light to pass through the periphery and center of a cylindrical lens. When illuminated with a plane wave, the proposed filter produces an extended depth of focus, along with side-lobes due to other interferences in the transverse focal plane. Computational studies show a maximum light-sheet extension of 3.38 times for single-photon excitation and 3.68 times for multiphoton excitation, compared to a state-of-the-art single plane illumination microscopy system. This technique may facilitate the study of large biological specimens (such as zebrafish embryos and tissue) with high spatial resolution and reduced photobleaching. (C) 2013 AIP Publishing LLC.

Relevance: 20.00%

Abstract:

Neutron powder diffraction study of Ba(Ti1-xZrx)O3 at close composition intervals has revealed coexistence of ferroelectric phases: orthorhombic (Amm2) + tetragonal (P4mm) for 0.02 <= x <= 0.05 and rhombohedral (R3m) + orthorhombic (Amm2) for 0.07 <= x < 0.09. These compositions exhibit relatively enhanced piezoelectric properties as compared to their single-phase counterparts outside this composition region, confirming the polymorphic phase boundary nature of the phase coexistence regions. (C) 2013 AIP Publishing LLC.

Relevance: 20.00%

Abstract:

An electron rich porous metal-organic framework (MOF) has been synthesized, which acts as an effective heterogeneous catalyst for Diels-Alder reactions through encapsulation of the reactants in confined nano-channels of the framework.

Relevance: 20.00%

Abstract:

The contour tree is a topological abstraction of a scalar field that captures evolution in level set connectivity. It is an effective representation for visual exploration and analysis of scientific data. We describe a work-efficient, output-sensitive, and scalable parallel algorithm for computing the contour tree of a scalar field defined on a domain that is represented using either an unstructured mesh or a structured grid. A hybrid implementation of the algorithm using the GPU and multi-core CPU can compute the contour tree of an input containing 16 million vertices in less than ten seconds with a speedup factor of up to 13. Experiments based on an implementation in a multi-core CPU environment show near-linear speedup for large data sets.
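The contour-tree construction itself is involved; as a sketch of the quantity it abstracts, the following (not the paper's algorithm) counts connected components of a sublevel set on a 1-D domain and shows how that count evolves with the level, which is exactly the connectivity information the contour tree records:

```python
def sublevel_components(values, t):
    """Number of connected components of the sublevel set {f <= t}
    on a 1-D domain, computed by a single sweep over maximal runs."""
    comps, inside = 0, False
    for v in values:
        if v <= t and not inside:
            comps, inside = comps + 1, True   # a new run starts
        elif v > t:
            inside = False                    # the current run ends
    return comps

# As t rises, components appear at minima and merge at saddles/maxima.
f = [0, 3, 1, 4, 1, 5]
print([sublevel_components(f, t) for t in range(6)])
```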

Relevance: 20.00%

Abstract:

Using a genetic algorithm, a global optimization method inspired by nature's evolutionary process, we have improved the quantitative refocused constant-time INEPT experiment (Q-INEPT-CT) of Makela et al. (JMR 204 (2010) 124-130) under various optimization constraints. The improved `average polarization transfer' and `min-max difference' of the new delay sets effectively reduce the experimental time by a factor of two (compared with Q-INEPT-CT, Makela et al.) without compromising accuracy. We also discuss a quantitative spectral editing technique based on average polarization transfer. (C) 2013 Elsevier Inc. All rights reserved.
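The underlying search loop can be sketched as a plain real-coded genetic algorithm (truncation selection, arithmetic crossover, Gaussian mutation); the objective below is a toy stand-in for the actual NMR delay-set fitness, and all parameter values are illustrative:

```python
import random

def genetic_minimise(fitness, bounds, pop=20, gens=40, seed=1):
    """Minimise fitness over a box given by bounds = [(lo, hi), ...]."""
    rng = random.Random(seed)
    dim = len(bounds)
    popn = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness)
        parents = popn[:pop // 2]                  # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # arithmetic crossover
            d = rng.randrange(dim)                        # point mutation
            lo, hi = bounds[d]
            child[d] = min(hi, max(lo, child[d] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        popn = parents + children                  # elitist replacement
    return min(popn, key=fitness)

# Toy objective: recover the delay vector closest to (0.3, 0.7).
best = genetic_minimise(lambda v: (v[0] - 0.3) ** 2 + (v[1] - 0.7) ** 2,
                        [(0.0, 1.0), (0.0, 1.0)])
```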