171 results for Filters methods


Relevance: 20.00%

Publisher:

Abstract:

An aeroelastic analysis based on finite elements in space and time is used to model the helicopter rotor in forward flight. The rotor blade is represented as an elastic cantilever beam undergoing flap and lag bending, elastic torsion and axial deformations. The objective of the improved design is to reduce vibratory loads at the rotor hub, which are the main source of helicopter vibration. Constraints are imposed on aeroelastic stability, and move limits are imposed on the blade elastic stiffness design variables. Using the aeroelastic analysis, response surface approximations are constructed for the objective function (vibratory hub loads). It is found that second-order polynomial response surfaces constructed using the central composite design from the theory of design of experiments adequately represent the aeroelastic model in the vicinity of the baseline design. Optimization results show a reduction in the objective function of about 30 per cent. A key accomplishment of this paper is the decoupling of the analysis and optimization problems using response surface methods, which should encourage the use of optimization methods by the helicopter industry. (C) 2002 Elsevier Science Ltd. All rights reserved.
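A minimal sketch of the surrogate-based idea described in this abstract, not the paper's implementation: a second-order polynomial response surface is fitted over a two-variable central composite design and then minimised within the move limits. The objective `hub_load` and all numerical values are illustrative stand-ins for the vibratory hub loads computed by the aeroelastic analysis.

```python
# Sketch: second-order response surface over a central composite design.
import numpy as np

def hub_load(x):
    # Hypothetical stand-in for the expensive aeroelastic hub-load computation.
    return 1.0 + 0.3 * x[0] ** 2 + 0.5 * x[1] ** 2 - 0.2 * x[0] * x[1] + 0.1 * x[0]

def central_composite_design(n=2, alpha=np.sqrt(2)):
    # Factorial corners, axial (star) points, and the centre point.
    corners = np.array([[i, j] for i in (-1, 1) for j in (-1, 1)], float)
    axial = np.array([[alpha, 0], [-alpha, 0], [0, alpha], [0, -alpha]])
    centre = np.zeros((1, n))
    return np.vstack([corners, axial, centre])

def quadratic_basis(X):
    # [1, x1, x2, x1^2, x2^2, x1*x2] for each design point.
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

X = central_composite_design()
y = np.array([hub_load(x) for x in X])
coeffs, *_ = np.linalg.lstsq(quadratic_basis(X), y, rcond=None)

# Minimise the fitted surface on a grid within assumed move limits [-1, 1]^2.
grid = np.linspace(-1, 1, 101)
G1, G2 = np.meshgrid(grid, grid)
P = np.column_stack([G1.ravel(), G2.ravel()])
surface = quadratic_basis(P) @ coeffs
print("surrogate minimum near design point:", P[np.argmin(surface)])
```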

Relevance: 20.00%

Publisher:

Abstract:

Satisfiability algorithms for propositional logic have improved enormously in recent years. This improvement increases the attractiveness of satisfiability methods for first-order logic that reduce the problem to a series of ground-level satisfiability problems. R. Jeroslow introduced a partial instantiation method of this kind that differs radically from the standard resolution-based methods. This paper lays the theoretical groundwork for an extension of his method that is general enough and efficient enough for general logic programming with indefinite clauses. In particular, we improve Jeroslow's approach by (1) extending it to logic with functions, (2) accelerating it through the use of satisfiers, as introduced by Gallo and Rago, and (3) simplifying it to obtain further speedup. We provide a similar development for a "dual" partial instantiation approach defined by Hooker and suggest a primal-dual strategy. We prove correctness of the primal and dual algorithms for full first-order logic with functions, as well as termination on unsatisfiable formulas. We also report some preliminary computational results.
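A toy sketch of the general reduction described above, not Jeroslow's partial instantiation algorithm itself: first-order clauses are instantiated over a (here fixed) set of ground constants, and the resulting propositional problem is checked by brute force. The clause encoding and the single variable name 'X' are simplifying assumptions.

```python
# Toy reduction of a first-order problem to a ground-level satisfiability check.
from itertools import product

# A literal is (sign, predicate, argument); 'X' is a variable, anything else a constant.
clauses = [
    [(+1, 'p', 'X')],                   # forall X. p(X)
    [(-1, 'p', 'a'), (+1, 'q', 'a')],   # p(a) -> q(a)
    [(-1, 'q', 'a')],                   # not q(a)
]

def ground_instances(clauses, constants):
    grounded = []
    for clause in clauses:
        has_var = any(arg == 'X' for _, _, arg in clause)
        if not has_var:
            grounded.append(clause)
            continue
        for c in constants:
            grounded.append([(s, p, c if a == 'X' else a) for s, p, a in clause])
    return grounded

def ground_sat(grounded):
    # Brute-force propositional check over all truth assignments to the ground atoms.
    atoms = sorted({(p, a) for clause in grounded for _, p, a in clause})
    for bits in product([False, True], repeat=len(atoms)):
        model = dict(zip(atoms, bits))
        if all(any((model[(p, a)] if s > 0 else not model[(p, a)])
                   for s, p, a in clause) for clause in grounded):
            return True
    return False

constants = ['a']
print("satisfiable over {a}?", ground_sat(ground_instances(clauses, constants)))  # False
```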

Relevance: 20.00%

Publisher:

Abstract:

This paper proposes a derivative-free two-stage extended Kalman filter (2-EKF) especially suited for state and parameter identification of mechanical oscillators under Gaussian white noise. Two sources of modeling uncertainties are considered: (1) errors in linearization, and (2) an inadequate system model. The state vector is presently composed of the original dynamical/parameter states plus the so-called bias states accounting for the unmodeled dynamics. An extended Kalman estimation concept is applied within a framework predicated on explicit and derivative-free local linearizations (DLL) of nonlinear drift terms in the governing stochastic differential equations (SDEs). The original and bias states are estimated by two separate filters; the bias filter improves the estimates of the original states. Measurements are artificially generated by corrupting the numerical solutions of the SDEs with noise through an implicit form of a higher-order linearization. Numerical illustrations are provided for a few single- and multidegree-of-freedom nonlinear oscillators, demonstrating the remarkable promise that 2-EKF holds over its more conventional EKF-based counterparts. DOI: 10.1061/(ASCE)EM.1943-7889.0000255. (C) 2011 American Society of Civil Engineers.
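The sketch below is a plain discrete-time EKF for joint state and parameter estimation of a noise-driven Duffing oscillator with its displacement measured; it illustrates the estimation setting only and does not reproduce the paper's derivative-free two-stage (bias-filter) construction. All model constants and noise levels are assumed for illustration.

```python
# Plain EKF jointly estimating [displacement, velocity, stiffness] of a Duffing oscillator.
import numpy as np

rng = np.random.default_rng(0)
dt, c, k_true, alpha = 0.01, 0.4, 10.0, 5.0
n_steps = 5000

def step(x, v, k, w=0.0):
    # Explicit Euler step of x'' + c x' + k x + alpha x^3 = w (white-noise forcing).
    return x + dt * v, v + dt * (-c * v - k * x - alpha * x ** 3) + np.sqrt(dt) * w

# Simulate the "true" system and noisy displacement measurements.
xs, x, v = [], 1.0, 0.0
for _ in range(n_steps):
    x, v = step(x, v, k_true, w=rng.normal(0, 1.0))
    xs.append(x)
meas = np.array(xs) + rng.normal(0, 0.01, n_steps)

# EKF over the augmented state z = [x, v, k].
z = np.array([0.5, 0.0, 5.0])          # deliberately wrong initial stiffness guess
P = np.diag([1.0, 1.0, 25.0])
Q = np.diag([1e-8, dt * 1.0, 1e-6])    # crude process-noise tuning
R = np.array([[0.01 ** 2]])
H = np.array([[1.0, 0.0, 0.0]])

for y in meas:
    x_, v_, k_ = z
    # Predict with the nonlinear model and its Jacobian.
    z = np.array([x_ + dt * v_,
                  v_ + dt * (-c * v_ - k_ * x_ - alpha * x_ ** 3),
                  k_])
    F = np.array([[1.0, dt, 0.0],
                  [dt * (-k_ - 3 * alpha * x_ ** 2), 1.0 - dt * c, -dt * x_],
                  [0.0, 0.0, 1.0]])
    P = F @ P @ F.T + Q
    # Update with the displacement measurement.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    z = z + (K @ (np.array([y]) - H @ z)).ravel()
    P = (np.eye(3) - K @ H) @ P

print("estimated stiffness:", z[2], " true:", k_true)
```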

Relevance: 20.00%

Publisher:

Abstract:

A study of environmental chloride and groundwater balance was carried out to assess their relative merits for estimating average groundwater recharge in a humid climate with a relatively shallow water table. The hybrid water fluctuation method splits the hydrologic year into a recharge (wet) season and a no-recharge (dry) season, first to appraise specific yield during the dry season and then to estimate recharge from the water table rise during the wet season. This well-elaborated and suitable method was then used as a standard against which to assess the effectiveness of the chloride method in a humid forested environment. An effective specific yield of 0.08 was obtained for the study area; it reflects an effective basin-wide process and is insensitive to local heterogeneities in the aquifer system. The hybrid water fluctuation method gives an average recharge of 87.14 mm/year at the basin scale, which represents 5.7% of the annual rainfall. Recharge estimated with the chloride method varies between 16.24 and 236.95 mm/year, with an average of 108.45 mm/year, or 7% of the mean annual precipitation. The discrepancy between the recharge values estimated by the hybrid water fluctuation and chloride mass balance methods is substantial, which suggests that the chloride mass balance method may be ineffective in this humid environment.
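A back-of-the-envelope sketch of the two estimators compared above. The specific yield and the reported recharge values come from the abstract; the wet-season water table rise and the chloride concentrations are assumed purely so that the numbers come out near the reported averages.

```python
# Two standard recharge estimators used in the study.

def recharge_wtf(specific_yield, water_table_rise_mm):
    # Water table fluctuation method: R = Sy * dh (wet-season rise).
    return specific_yield * water_table_rise_mm

def recharge_cmb(precip_mm, cl_rain_mg_l, cl_groundwater_mg_l):
    # Chloride mass balance: R = P * Cl_rain / Cl_groundwater.
    return precip_mm * cl_rain_mg_l / cl_groundwater_mg_l

Sy = 0.08                          # effective specific yield reported in the study
rise = 1089.0                      # assumed cumulative wet-season rise (mm)
print(recharge_wtf(Sy, rise))      # ~87 mm/year, close to the reported value

P = 1530.0                         # assumed annual rainfall (mm); 87.14 mm is ~5.7% of it
print(recharge_cmb(P, 0.5, 7.05))  # assumed Cl concentrations giving ~108 mm/year
```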

Relevance: 20.00%

Publisher:

Abstract:

For the successful performance of a granular filter medium, existing design guidelines, which are based on the particle size distribution (PSD) characteristics of the base soil and filter medium, require two contradictory conditions to be satisfied, viz., soil retention and permeability. In spite of the wider applicability of these guidelines, it is well recognized that (i) they are applicable to a particular range of soils tested in the laboratory, (ii) the design procedures do not include performance-based selection criteria, and (iii) there are no means to establish the sensitivity of the important variables influencing performance. In the present work, analytical solutions are developed to obtain a factor of safety with respect to soil-retention and permeability criteria for a base soil - filter medium system subjected to a soil boiling condition. The proposed analytical solutions take into consideration relevant geotechnical properties such as void ratio, permeability, dry unit weight, effective friction angle, shape and size of soil particles, seepage discharge, and existing hydraulic gradient. The solution is validated through example applications and experimental results, and it is established that it can be used successfully in the selection as well as design of granular filters and can be applied to all types of base soils.
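As a much simpler stand-in for the paper's analytical solutions, the sketch below computes a textbook factor of safety against soil boiling from the critical hydraulic gradient i_c = (G_s - 1)/(1 + e) and an existing exit gradient; the soil properties used are assumed values, not from the paper.

```python
# Textbook check of the factor of safety against soil boiling (heave).

def critical_gradient(specific_gravity, void_ratio):
    # Critical hydraulic gradient: i_c = (G_s - 1) / (1 + e).
    return (specific_gravity - 1.0) / (1.0 + void_ratio)

def boiling_factor_of_safety(specific_gravity, void_ratio, exit_gradient):
    # FS = i_c / i_exit for the existing upward seepage gradient.
    return critical_gradient(specific_gravity, void_ratio) / exit_gradient

# Illustrative base-soil properties (assumed).
Gs, e, i_exit = 2.65, 0.65, 0.45
print("critical gradient:", round(critical_gradient(Gs, e), 2))
print("factor of safety :", round(boiling_factor_of_safety(Gs, e, i_exit), 2))
```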

Relevance: 20.00%

Publisher:

Abstract:

In this paper, expressions for the convolution multiplication properties of the MDCT are derived starting from the equivalent DFT representations. Using these expressions, methods for implementing linear filtering through block convolution in the MDCT domain are presented. The implementation is exact for symmetric filters and approximate for non-symmetric filters in the case of rectangular-window-based MDCT. For a general MDCT window function, the filtering is done on the windowed segments, and hence the convolution is approximate for symmetric as well as non-symmetric filters. This approximation error is shown to be perceptually insignificant for symmetric impulse response filters. Moreover, the inherent 50% overlap between adjacent frames used in MDCT computation reduces this approximation error, much as it smooths other block processing errors. The presented techniques are useful for compressed-domain processing of audio signals.
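For reference, a minimal direct (O(N^2)) implementation of the MDCT of one sine-windowed, 50%-overlapped frame from its definition; the convolution multiplication properties derived in the paper are not reproduced here, and the window and frame length are illustrative.

```python
# Direct MDCT of a single windowed frame (2N samples in, N coefficients out).
import numpy as np

def mdct(frame):
    two_n = len(frame)
    n_out = two_n // 2
    n = np.arange(two_n)
    k = np.arange(n_out)[:, None]
    # X[k] = sum_n x[n] cos[(pi/N)(n + 1/2 + N/2)(k + 1/2)]
    basis = np.cos(np.pi / n_out * (n + 0.5 + n_out / 2.0) * (k + 0.5))
    return basis @ frame

N = 8
window = np.sin(np.pi / (2 * N) * (np.arange(2 * N) + 0.5))  # sine window
signal = np.random.default_rng(1).standard_normal(2 * N)
coeffs = mdct(window * signal)
print(coeffs.shape)  # (N,) MDCT coefficients of one overlapping frame
```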

Relevance: 20.00%

Publisher:

Abstract:

Conventional hardware implementation techniques for FIR filters require the filter coefficients to be computed in software and stored in memory. This approach is static in the sense that any further fine-tuning of the filter requires new coefficients to be computed in software. In this paper, we propose an alternative technique for implementing FIR filters in hardware. We store a large number of impulse response coefficients of the ideal filter (having a box-type frequency response) in memory. We then perform the windowing of these coefficients in hardware, using integer sequences as window functions. The integer sequences are also generated in hardware. This approach offers flexibility in fine-tuning the filter, such as varying the transition bandwidth around a particular cutoff frequency.
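A software sketch of the idea, with the paper's hardware pipeline and its particular integer window sequences left out: many taps of the ideal (box-response) low-pass impulse response are stored, and an integer-valued window, here a simple triangular integer sequence used as a stand-in, is applied to them.

```python
# Windowing stored ideal low-pass taps with an integer-valued window sequence.
import numpy as np

def ideal_lowpass(num_taps, cutoff):
    # Sinc impulse response of the ideal filter with a "box" frequency response.
    n = np.arange(num_taps) - (num_taps - 1) / 2.0
    return 2 * cutoff * np.sinc(2 * cutoff * n)

def integer_triangular_window(num_taps, peak=64):
    # Integer sequence rising to `peak` and back; simple to generate in hardware.
    half = np.arange(1, num_taps // 2 + 1)
    ramp = np.round(peak * half / (num_taps // 2)).astype(int)
    return np.concatenate([ramp, ramp[::-1]] if num_taps % 2 == 0
                          else [ramp, [peak], ramp[::-1]])

num_taps, cutoff = 101, 0.2        # cutoff as a fraction of the sample rate
h_ideal = ideal_lowpass(num_taps, cutoff)
w = integer_triangular_window(num_taps)
h = h_ideal * w / w.max()          # windowed (and renormalised) FIR taps
print(len(h), "taps; first few:", np.round(h[:4], 4))
```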

Relevance: 20.00%

Publisher:

Abstract:

Context-sensitive points-to analysis is critical for several program optimizations. However, as the number of contexts grows exponentially, the storage requirements of the analysis increase tremendously for large programs, making the analysis non-scalable. We propose a scalable flow-insensitive, context-sensitive, inclusion-based points-to analysis that uses a specially designed multi-dimensional Bloom filter to store the points-to information. Two key observations motivate our proposal: (i) points-to information (between pointer and object and between pointer and pointer) is sparse, and (ii) moving from an exact to an approximate representation of points-to information only reduces precision without affecting the correctness of the (may-points-to) analysis. By using an approximate representation, a multi-dimensional Bloom filter can significantly reduce the memory requirements with a probabilistic bound on the loss in precision. Experimental evaluation on SPEC 2000 benchmarks and two large open source programs reveals that with an average storage requirement of 4 MB, our approach achieves almost the same precision (98.6%) as the exact implementation. By increasing the average memory to 27 MB, it achieves precision up to 99.7% for these benchmarks. Using Mod/Ref analysis as the client, we find that the client analysis is often unaffected even when there is some loss of precision in the points-to representation. The NoModRef percentage is within 2% of the exact analysis, while requiring 4 MB (maximum 15 MB) of memory and less than 4 minutes on average for the points-to analysis. Another major advantage of our technique is that it allows precision to be traded off against the memory usage of the analysis.
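A minimal sketch of the underlying idea, not the paper's multi-dimensional Bloom filter or the analysis itself: may-points-to facts are stored in an ordinary Bloom filter, so membership queries may return false positives (lost precision) but never false negatives. The hash construction and the example facts are assumptions.

```python
# Approximate storage of (context, pointer, object) points-to facts in a Bloom filter.
import hashlib

class BloomFilter:
    def __init__(self, num_bits=1 << 16, num_hashes=4):
        self.num_bits, self.num_hashes = num_bits, num_hashes
        self.bits = bytearray(num_bits // 8)

    def _indices(self, key):
        # Derive k bit positions from salted SHA-256 digests of the key.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, key):
        for idx in self._indices(key):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def __contains__(self, key):
        # May report a false positive, never a false negative.
        return all(self.bits[idx // 8] & (1 << (idx % 8)) for idx in self._indices(key))

points_to = BloomFilter()
points_to.add(("main->foo", "p", "obj@line12"))
print(("main->foo", "p", "obj@line12") in points_to)   # True
print(("main->bar", "q", "obj@line99") in points_to)   # almost surely False
```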