922 results for Polynomial Approximation


Relevance: 60.00%

Abstract:

Local polynomial approximation of data is an approach to signal denoising. Savitzky-Golay (SG) filters are finite-impulse-response kernels which, when convolved with the data, yield a local polynomial approximation for a chosen set of filter parameters. When the noise follows Gaussian statistics, minimizing the mean-squared error (MSE) between the noisy signal and its polynomial approximation is optimal in the maximum-likelihood (ML) sense, but the MSE criterion is not optimal under non-Gaussian noise. In this paper, we robustify the SG filter for applications in which the noise follows a heavy-tailed distribution. The optimal filtering criterion is achieved by ℓ1-norm minimization of the error through the iteratively reweighted least-squares (IRLS) technique. It is interesting to note that at each iteration we solve a weighted SG filter by minimizing an ℓ2 norm, yet the process converges to the ℓ1-minimized output. The results show consistent improvement over the performance of the standard SG filter.
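
As a concrete illustration of the IRLS idea (a minimal sketch, not the authors' exact filter design), the snippet below fits a local polynomial in each window by weighted least squares and reweights by the inverse square root of the residual magnitude, so that the iteration drives the fit towards the ℓ1 solution; the window length, polynomial order, and iteration count are arbitrary choices for the example.

```python
import numpy as np

def irls_sg_filter(x, window=11, order=3, n_iter=10, eps=1e-6):
    """l1-robust local polynomial smoothing via IRLS (illustrative sketch).

    For each sample, a degree-`order` polynomial is fitted to the
    surrounding `window` samples; scaling the rows of the least-squares
    system by 1/sqrt(|residual|) makes the weighted l2 objective
    approximate the l1 objective as the iteration proceeds.
    """
    half = window // 2
    t = np.arange(-half, half + 1)
    A = np.vander(t, order + 1, increasing=True)   # local design matrix
    xp = np.pad(x, half, mode="edge")              # simple edge handling
    y = np.empty(len(x))
    for i in range(len(x)):
        seg = xp[i:i + window]
        w = np.ones(window)
        coef = np.zeros(order + 1)
        for _ in range(n_iter):
            coef, *_ = np.linalg.lstsq(A * w[:, None], w * seg, rcond=None)
            r = seg - A @ coef
            w = 1.0 / np.sqrt(np.maximum(np.abs(r), eps))  # IRLS reweighting
        y[i] = coef[0]                             # fitted value at the window centre
    return y

# Example: smooth signal corrupted by heavy-tailed (Laplacian) noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
noisy = np.sin(2 * np.pi * t) + rng.laplace(scale=0.2, size=t.size)
denoised = irls_sg_filter(noisy)
```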

Relevance: 60.00%

Abstract:

A high-resolution interrogation technique for fiber Bragg grating (FBG) sensors, based on a linear photodiode array spectrometer, is demonstrated. Spline interpolation and a Polynomial Approximation Algorithm (PAA) are applied to the data points acquired by the spectrometer to improve the original PAA-based interrogation method, so that fewer pixels are required to achieve the same resolution. Theoretical analysis indicates that if the FWHM of an FBG covers more than 3 pixels, the resolution of the central wavelength shift is better than 1 pm; when the number of pixels covered increases to 6, the nominal resolution improves to 0.001 pm. Experimental results show that a Bragg wavelength resolution of ~1 pm is obtained for an FBG with an FWHM of ~0.2 nm using a spectrometer with a pixel resolution of ~70 pm.
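
A minimal numerical sketch of the peak-localization idea (with an assumed 70 pm pixel pitch and a synthetic Gaussian FBG spectrum, not the paper's PAA implementation): the pixels around the reflection maximum are fitted with a low-order polynomial and the Bragg wavelength is read off from the vertex of the fit, giving sub-pixel resolution.

```python
import numpy as np

# Assumed spectrometer grid: ~70 pm per pixel around 1550 nm (illustrative)
pixel_pitch_nm = 0.070
wavelengths = 1549.0 + pixel_pitch_nm * np.arange(30)

# Synthetic Gaussian FBG reflection peak with FWHM ~ 0.2 nm
true_bragg = 1550.012
sigma = 0.2 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
spectrum = np.exp(-0.5 * ((wavelengths - true_bragg) / sigma) ** 2)

# Fit a parabola to log(spectrum) over the pixels nearest the maximum;
# for a Gaussian line shape the log is exactly quadratic, so the vertex
# recovers the central wavelength with sub-pixel accuracy.
k = int(np.argmax(spectrum))
sel = slice(k - 2, k + 3)
c2, c1, _ = np.polyfit(wavelengths[sel], np.log(spectrum[sel]), 2)
bragg_estimate = -c1 / (2.0 * c2)

print(f"true {true_bragg:.4f} nm, estimated {bragg_estimate:.4f} nm")
```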

Relevance: 60.00%

Abstract:

The band structure of the Zn1-xCdxSySe1-y quaternary alloy is calculated using the empirical pseudopotential method and the virtual crystal approximation. The alloy is found to be a direct-gap semiconductor for all compositions x and y. A polynomial approximation is obtained for the energy gap as a function of the compositions x and y. Electron and hole effective masses are also calculated along various symmetry axes for different compositions, and the results agree fairly well with the available experimental values.
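
The sketch below only illustrates the fitting step, using made-up gap values on a composition grid (the real inputs would be the pseudopotential band gaps): a quadratic bivariate polynomial E_g(x, y) is obtained by linear least squares over a monomial basis.

```python
import numpy as np

# Composition grid (x: Cd fraction, y: S fraction) with synthetic gap values
# standing in for the pseudopotential results -- NOT the paper's data.
xg, yg = np.meshgrid(np.linspace(0.0, 1.0, 11), np.linspace(0.0, 1.0, 11))
x, y = xg.ravel(), yg.ravel()
eg = 2.8 - 1.1 * x + 0.9 * y + 0.3 * x * y + 0.2 * x**2 - 0.1 * y**2  # eV, invented

# Quadratic bivariate monomial basis: 1, x, y, x^2, xy, y^2
basis = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coef, *_ = np.linalg.lstsq(basis, eg, rcond=None)

def eg_fit(xc, yc):
    """Polynomial approximation of the energy gap E_g(x, y) in eV (illustrative)."""
    return coef @ np.array([1.0, xc, yc, xc**2, xc * yc, yc**2])

print(f"E_g(0.5, 0.5) ~ {eg_fit(0.5, 0.5):.3f} eV")
```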

Relevance: 60.00%

Abstract:

The first thermodynamic dissociation constants of glycine in 5 and 15 mass % glucose + water mixed solvents have been determined at five temperatures from 5 to 45 °C from precise emf measurements of a cell without liquid junction using hydrogen and Ag-AgCl electrodes, together with a new method of polynomial approximation proposed on the basis of Pitzer's electrolyte solution theory in our previous paper. The results obtained from the two methods agree within experimental error. The standard free energy of transfer of HCl from water to the aqueous mixed solvents has also been calculated and the results are discussed.

Relevance: 60.00%

Abstract:

The emfs of the cell Cu|CuSO_4|Hg_2SO_4-Hg were determined at five temperatures from 278.15 K to 313.15 K. Based on Pitzer's equation, a polynomial approximation for determining the standard emf, E_m, was proposed. The values of E_m obtained by the authors' method agree, within experimental error, with those obtained from the extended Debye-Hückel equation. Compared with extrapolation using the extended Debye-Hückel equation, the uncertainty introduced by the choice of the ion-size parameter is avoided. By the...
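
A generic sketch of the extrapolation step (invented data and a simple polynomial in sqrt(m), not the authors' Pitzer-based expressions): the measured emf is converted to a Nernst-corrected auxiliary function whose intercept at infinite dilution is the standard emf.

```python
import numpy as np

R, T, F = 8.314462618, 298.15, 96485.332             # SI units
m = np.array([0.005, 0.01, 0.02, 0.05, 0.10])        # molality, mol/kg
emf = np.array([0.428, 0.412, 0.397, 0.376, 0.360])  # V, invented for illustration

# Nernst-corrected auxiliary function for a 2-electron cell with a single
# 2-2 electrolyte: E' = E + (RT/F) ln(m) = E_standard - (RT/F) ln(gamma±)
e_prime = emf + (R * T / F) * np.log(m)

# Fit E' as a low-order polynomial in sqrt(m); the intercept at m -> 0,
# where gamma± -> 1, is the standard emf.
coeffs = np.polyfit(np.sqrt(m), e_prime, 2)
e_standard = np.polyval(coeffs, 0.0)
print(f"extrapolated standard emf ~ {e_standard:.4f} V")
```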

Relevance: 60.00%

Abstract:

The paper considers the open shop scheduling problem of minimizing the makespan, provided that one of the machines has to process the jobs according to a given sequence. We show that in the preemptive case the problem is polynomially solvable for an arbitrary number of machines. If preemption is not allowed, the problem is NP-hard in the strong sense when the number of machines is variable, and NP-hard in the ordinary sense in the case of two machines. For the latter case we give a heuristic algorithm that runs in linear time and produces a schedule with makespan at most 5/4 times the optimal value. We also show that the two-machine problem in the non-preemptive case is solvable in pseudopolynomial time by a dynamic programming algorithm, and that the algorithm can be converted into a fully polynomial approximation scheme. © 1998 John Wiley & Sons, Inc. Naval Research Logistics 45: 705–731, 1998.
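
For orientation, the standard two-machine open shop makespan lower bound is the maximum of the two machine loads and the largest total job length; the sketch below computes it for a toy instance. It is a generic bound used to assess schedules, not the paper's linear-time 5/4-approximation heuristic or its dynamic program.

```python
def open_shop_lower_bound(a, b):
    """Makespan lower bound for a two-machine open shop.

    a[j], b[j] are the processing times of job j on machines A and B;
    any schedule is at least as long as each machine's total load and
    as each job's total processing requirement.
    """
    return max(sum(a), sum(b), max(x + y for x, y in zip(a, b)))

a = [3, 5, 2, 6]   # toy instance
b = [4, 2, 7, 1]
print(open_shop_lower_bound(a, b))   # max(16, 14, 9) = 16
```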

Relevance: 60.00%

Abstract:

This paper considers the problem of processing n jobs in a two-machine non-preemptive open shop to minimize the makespan, i.e., the maximum completion time. One of the machines is assumed to be non-bottleneck. It is shown that, unlike its flow shop counterpart, the problem is NP-hard in the ordinary sense. On the other hand, the problem is shown to be solvable by a dynamic programming algorithm that requires pseudopolynomial time, and this algorithm can be converted into a fully polynomial approximation scheme. An O(n log n) approximation algorithm is also designed which finds a schedule with makespan at most 5/4 times the optimal value, and this bound is tight.

Relevance: 60.00%

Abstract:

We consider the problem of scattering of a time-harmonic acoustic incident plane wave by a sound soft convex polygon. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the computational cost required to achieve a prescribed level of accuracy grows linearly with respect to the frequency of the incident wave. Recently Chandler–Wilde and Langdon proposed a novel Galerkin boundary element method for this problem for which, by incorporating the products of plane wave basis functions with piecewise polynomials supported on a graded mesh into the approximation space, they were able to demonstrate that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency. Here we propose a related collocation method, using the same approximation space, for which we demonstrate via numerical experiments a convergence rate identical to that achieved with the Galerkin scheme, but with a substantially reduced computational cost.
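
One ingredient that is easy to show in isolation is the graded mesh: element endpoints accumulate algebraically towards a corner so that the piecewise polynomial factors can resolve the solution's corner behaviour. The grading exponent below is an arbitrary choice for the illustration, not the value used in the Chandler-Wilde and Langdon analysis.

```python
import numpy as np

def graded_mesh(n_elements, q=3.0, length=1.0):
    """Nodes on [0, length] of a mesh with n_elements elements, graded
    towards the corner at 0: x_j = length * (j / n) ** q.  Larger q packs
    more, smaller elements near the corner (q is illustrative here)."""
    j = np.arange(n_elements + 1)
    return length * (j / n_elements) ** q

nodes = graded_mesh(8)
print(np.round(nodes, 4))            # nodes clustered near the corner
print(np.round(np.diff(nodes), 4))   # element widths grow away from it
```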

Relevance: 60.00%

Abstract:

We consider the scattering of a time-harmonic acoustic incident plane wave by a sound soft convex curvilinear polygon with Lipschitz boundary. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows at least linearly with respect to the frequency of the incident wave. Here we propose a novel Galerkin boundary element method with a hybrid approximation space, consisting of the products of plane wave basis functions with piecewise polynomials supported on several overlapping meshes; a uniform mesh on illuminated sides, and graded meshes refined towards the corners of the polygon on illuminated and shadow sides. Numerical experiments suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy need only grow logarithmically as the frequency of the incident wave increases.

Relevance: 60.00%

Abstract:

The present work studies the influence of macroeconomic factors on credit risk for installment auto loan operations. The study is based on 4,887 credit operations surveyed in the Credit Risk Information System (SCR) held by the Brazilian Central Bank. Using survival analysis applied to interval-censored data, we obtain a model for the hazard function and propose a method for calculating the probability of default over a twelve-month horizon. Our results indicate a strong time dependence of the hazard function, captured by a polynomial approximation, in all estimated models. The model with the best Akaike Information Criterion estimates a positive effect of 0.07% for males on the baseline hazard function and of 0.011% for an increase of ten basis points in the annual interest rate of the operation; for each R$ 1,000.00 added to the installment the hazard function suffers a negative effect of 0.28%, with an estimated increase of 0.0069% for the same amount added to the contracted value of the operation. For the macroeconomic factors, we find statistically significant effects for the unemployment rate (-0.12%), its first lag (0.12%), the first difference of the industrial production index (-0.008%), the first lag of the inflation rate (-0.13%), and the exchange rate (-0.23%). We do not find statistically significant results for the other tested variables.
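
To make the link between the fitted hazard and the reported twelve-month probability of default explicit, the sketch below evaluates PD(T) = 1 - exp(-H(T)), where H is the cumulative hazard obtained by integrating a polynomial hazard; the coefficients are invented for illustration and are not the estimated SCR model.

```python
import numpy as np

# Hypothetical polynomial hazard h(t) in 1/month (t = months since origination);
# the coefficients are illustrative only, not the fitted SCR model.
h = np.array([-0.00003, 0.0008, 0.004])      # h(t) = -3e-5 t^2 + 8e-4 t + 4e-3

def prob_default(horizon_months=12.0):
    """PD over the horizon: 1 - exp(-H(T)), with H the cumulative hazard
    obtained by integrating the polynomial hazard exactly."""
    H = np.polyint(h)                        # antiderivative, highest power first
    cum_hazard = np.polyval(H, horizon_months) - np.polyval(H, 0.0)
    return 1.0 - np.exp(-cum_hazard)

print(f"12-month probability of default: {prob_default():.2%}")
```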

Relevance: 60.00%

Abstract:

This work presents a modelling and identification method for a wheeled mobile robot, including the actuator dynamics. Instead of the classic modelling approach, in which the robot position coordinates (x, y) are used as state variables (resulting in a nonlinear model), the proposed discrete model is based on the travelled-distance increment Delta_l. The resulting model is therefore linear and time-invariant and can be identified through classical methods such as recursive least squares. This approach has one difficulty: Delta_l cannot be measured directly. In this paper, the problem is solved using an estimate of Delta_l based on a second-order polynomial approximation. Experimental data were collected and the proposed method was used to identify the model of a real robot.
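
A rough sketch of the two ingredients, under assumptions not given in the abstract (sampling period, window length, and a toy second-order model to identify): Delta_l is estimated by fitting a second-order polynomial to the most recent travelled-distance samples and differencing the fit, and the linear model is identified with a standard recursive least-squares update.

```python
import numpy as np

def delta_l_estimate(l_history, dt):
    """Estimate the travelled-distance increment from the last few noisy
    samples of the travelled distance, via a 2nd-order polynomial fit.
    This is a generic reconstruction of the idea, not the paper's formula."""
    t = np.arange(len(l_history)) * dt
    c = np.polyfit(t, np.asarray(l_history), 2)
    return np.polyval(c, t[-1]) - np.polyval(c, t[-2])

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least-squares step for y = phi . theta + noise."""
    k = P @ phi / (lam + phi @ P @ phi)      # gain vector
    theta = theta + k * (y - phi @ theta)    # parameter update
    P = (P - np.outer(k, phi @ P)) / lam     # covariance update
    return theta, P

# Increment estimate from five hypothetical distance samples (metres, dt = 50 ms)
print(delta_l_estimate([0.000, 0.021, 0.044, 0.065, 0.089], dt=0.05))

# Toy identification run: y(k) = a*y(k-1) + b*u(k-1) + noise
rng = np.random.default_rng(1)
a_true, b_true = 0.9, 0.5
theta, P = np.zeros(2), 1000.0 * np.eye(2)
u, y_prev = rng.standard_normal(200), 0.0
for k in range(1, 200):
    y = a_true * y_prev + b_true * u[k - 1] + 0.01 * rng.standard_normal()
    theta, P = rls_update(theta, P, np.array([y_prev, u[k - 1]]), y)
    y_prev = y
print(theta)   # converges towards [0.9, 0.5]
```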

Relevance: 60.00%

Abstract:

An RBFN implemented with quantized parameters is proposed and the relative (or limited) approximation property is presented. Simulation results for sinusoidal function approximation at various quantization levels are shown. The results indicate that the network retains good approximation capability even under severe quantization. Parameter quantization reduces the memory size and circuit complexity required to store the network parameters, leading to compact mixed-signal circuits suitable for low-power applications. © 2008 IEEE.
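
A minimal numerical sketch of the effect being described (pure software, not the mixed-signal circuit, with an arbitrary network size and quantizer): a Gaussian RBF network is fitted to a sinusoid by linear least squares, its weights and centres are uniformly quantized to a given number of bits, and the resulting approximation error is reported.

```python
import numpy as np

def quantize(v, bits):
    """Uniform quantization of v to 2**bits levels spanning its range."""
    lo, hi = float(v.min()), float(v.max())
    step = (hi - lo) / (2 ** bits - 1)
    return lo + step * np.round((v - lo) / step)

# Fit a Gaussian RBF network to a sinusoid by linear least squares.
x = np.linspace(0.0, 2.0 * np.pi, 200)
target = np.sin(x)
centers = np.linspace(0.0, 2.0 * np.pi, 12)
width = 0.6
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))
weights, *_ = np.linalg.lstsq(Phi, target, rcond=None)

# Quantize the stored parameters and measure the approximation error.
for bits in (10, 6, 4, 3):
    wq, cq = quantize(weights, bits), quantize(centers, bits)
    Phiq = np.exp(-((x[:, None] - cq[None, :]) ** 2) / (2.0 * width**2))
    rms = np.sqrt(np.mean((Phiq @ wq - target) ** 2))
    print(f"{bits}-bit parameters: RMS error {rms:.4f}")
```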

Relevance: 60.00%

Abstract:

Fractal theory has a large number of applications in image and signal analysis. Although the fractal dimension can be used as an image object descriptor, a multiscale approach such as the multiscale fractal dimension (MFD) increases the amount of information extracted from an object. MFD provides a curve describing object complexity across scales. However, this curve carries much redundant information, which can be discarded without loss in performance. A descriptor technique is therefore needed to analyze this curve and to reduce the dimensionality of the data by selecting its meaningful descriptors. This paper presents a comparative study of different techniques for generating MFD descriptors. It compares well-known and state-of-the-art descriptors, such as Fourier, Wavelet, Polynomial Approximation (PA), Functional Data Analysis (FDA), Principal Component Analysis (PCA), Symbolic Aggregate Approximation (SAX), kernel PCA, Independent Component Analysis (ICA), and geometrical and statistical features. The descriptors are evaluated in a classification experiment using Linear Discriminant Analysis on the descriptors computed from MFD curves of two data sets: generic shapes and rotated fish contours. Results indicate that PCA, FDA, PA and Wavelet Approximation provide the best MFD descriptors for recognition and classification tasks. © 2012 Elsevier B.V. All rights reserved.
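
As a small illustration of two of the compared descriptor families (on synthetic curves, with arbitrary descriptor sizes, not the paper's data): each MFD curve is summarized either by the coefficients of a low-order polynomial fit (PA) or by its projection onto the leading principal components (PCA) of the curve set.

```python
import numpy as np

# Synthetic stand-ins for multiscale fractal dimension (MFD) curves:
# 40 curves sampled at 100 scales (real curves would come from shape contours).
rng = np.random.default_rng(42)
scales = np.linspace(0.0, 1.0, 100)
curves = np.array([1.2 + 0.5 * np.exp(-s * scales) + 0.02 * rng.standard_normal(100)
                   for s in rng.uniform(1.0, 8.0, size=40)])

# Polynomial Approximation (PA) descriptors: fit coefficients per curve.
pa_descriptors = np.array([np.polyfit(scales, c, 4) for c in curves])   # 5 per curve

# PCA descriptors: projection on the leading principal components of the set.
mean_curve = curves.mean(axis=0)
_, _, vt = np.linalg.svd(curves - mean_curve, full_matrices=False)
pca_descriptors = (curves - mean_curve) @ vt[:5].T                      # 5 per curve

print(pa_descriptors.shape, pca_descriptors.shape)   # (40, 5) (40, 5)
```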