171 results for Filters methods


Relevance: 20.00%

Abstract:

In this paper, we consider the problem of computing numerical solutions of Itô stochastic differential equations (SDEs). Five-stage Milstein (FSM) methods are constructed for solving SDEs driven by an m-dimensional Wiener process. The FSM methods are fully explicit, and we prove that they are convergent with strong order 1 for SDEs driven by an m-dimensional Wiener process. The mean-square stability analysis (with a multidimensional Wiener process) shows that the stable regions of the FSM methods are unbounded and larger than those of the Milstein method and the three-stage Milstein methods.
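
For context, the classical single-stage Milstein scheme that the FSM methods build on can be sketched for a scalar SDE (a minimal illustration; the five-stage scheme and the multidimensional Wiener case are not reproduced here):

```python
import numpy as np

def milstein(f, g, dg, x0, t_end, n_steps, rng):
    """Classical Milstein scheme for the scalar SDE dX = f(X) dt + g(X) dW.

    dg is the derivative g'(X); the scheme has strong order 1 for
    scalar Wiener noise.
    """
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        # Euler-Maruyama step plus the Milstein correction term
        x = x + f(x) * dt + g(x) * dw + 0.5 * g(x) * dg(x) * (dw**2 - dt)
    return x

# Geometric Brownian motion dX = mu*X dt + sigma*X dW (known exact solution)
mu, sigma = 0.05, 0.2
rng = np.random.default_rng(0)
x_T = milstein(lambda x: mu * x, lambda x: sigma * x, lambda x: sigma,
               1.0, 1.0, 1000, rng)
```

Testing against the exact strong solution of geometric Brownian motion over many paths is the standard way to verify the strong order-1 rate.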

Relevance: 20.00%

Abstract:

This work analyses the influence of several design methods on the degree of creativity of the design outcome. A design experiment was carried out in which the participants were divided into four teams of three members, and each team was asked to work applying a different design method. The selected methods were Brainstorming, Functional Analysis, and the SCAMPER method. The 'degree of creativity' of each design outcome was assessed by means of a questionnaire offered to a number of experts and by three different metrics: the metric of Moss, the metric of Sarkar and Chakrabarti, and the evaluation of innovative potential. The three metrics share the property of measuring creativity as a combination of the degree of novelty and the degree of usefulness. The results show that Brainstorming provides more creative outcomes than applying no method at all, while this could not be shown for SCAMPER and Functional Analysis.

Relevance: 20.00%

Abstract:

This paper presents a modified cellulose acetate membrane, prepared using a dry casting technique, that can be used to lyse erythrocytes and isolate hemoglobin. Isolation of hemoglobin is thus achieved without the use of lysis buffers. The cellulose acetate (CA) membranes are embedded with ammonium chloride (NH4Cl) and potassium bicarbonate (KHCO3), which act as lysing agents. The presence of the embedded salts is confirmed using EDS analysis. The pores in the CA membrane act as filters. The average pore size in these membranes is designed to be 1.5 µm, as characterized by SEM analysis, so that they allow hemoglobin to pass through while blocking all other cells and unlysed erythrocytes present in blood. When a drop of blood is added to the membrane, the NH4Cl and KHCO3 embedded in the membrane dissolve in the plasma and lyse the erythrocytes. The filtered hemoglobin is characterized using UV-Vis spectroscopy. The results indicate extraction of a higher concentration of hemoglobin compared with conventional methods.

Relevance: 20.00%

Abstract:

In this paper we study constrained maximum entropy and minimum divergence optimization problems, in the case where integer-valued sufficient statistics exist, using tools from computational commutative algebra. We show that the estimation of parametric statistical models in this case can be transformed into solving a system of polynomial equations. We give an implicit description of maximum entropy models by embedding them in algebraic varieties, and we give a Gröbner basis method to compute them. For minimum KL-divergence models, we show that implicitization preserves specialization of the prior distribution. This result leads to a Gröbner basis method for embedding minimum KL-divergence models in algebraic varieties. (C) 2012 Elsevier Inc. All rights reserved.
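
The reduction to polynomial systems can be illustrated with a computer algebra system; a toy sketch using SymPy's `groebner` on a generic system (not the paper's moment equations):

```python
from sympy import symbols, groebner

# A toy polynomial system standing in for the polynomial estimating
# equations that arise with integer-valued sufficient statistics.
x, y = symbols('x y')
G = groebner([x**2 + y**2 - 1, x - y], x, y, order='lex')

# The reduced lex-order Groebner basis triangularizes the system,
# so the solutions can be read off by back-substitution.
basis = list(G.exprs)
```

With a lex ordering the basis eliminates variables one at a time, which is what makes it useful for solving the estimating equations exactly rather than numerically.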

Relevance: 20.00%

Abstract:

Many dynamical systems, including lakes, organisms, ocean circulation patterns, or financial markets, are now thought to have tipping points where critical transitions to a contrasting state can happen. Because critical transitions can occur unexpectedly and are difficult to manage, there is a need for methods that can be used to identify when a critical transition is approaching. Recent theory shows that we can identify the proximity of a system to a critical transition using a variety of so-called 'early warning signals', and successful empirical examples suggest a potential for practical applicability. However, while the range of proposed methods for predicting critical transitions is rapidly expanding, opinions on their practical use differ widely, and there is no comparative study that tests the limitations of the different methods to identify approaching critical transitions using time-series data. Here, we summarize a range of currently available early warning methods and apply them to two simulated time series that are typical of systems undergoing a critical transition. In addition to a methodological guide, our work offers a practical toolbox that may be used in a wide range of fields to help detect early warning signals of critical transitions in time-series data.
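
Two of the most common early-warning indicators, rising variance and rising lag-1 autocorrelation ("critical slowing down"), can be sketched as rolling-window statistics (a minimal illustration, not the paper's full toolbox):

```python
import numpy as np

def rolling_indicators(x, win):
    """Rolling variance and lag-1 autocorrelation, two standard
    early-warning indicators of an approaching critical transition."""
    var, ac1 = [], []
    for i in range(len(x) - win + 1):
        w = x[i:i + win]
        var.append(np.var(w))
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(var), np.array(ac1)

# An AR(1) series whose coefficient ramps toward 1 mimics a system
# losing resilience on the way to a transition.
rng = np.random.default_rng(1)
n = 600
phi = np.linspace(0.2, 0.97, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal()

var, ac1 = rolling_indicators(x, 100)
```

On such a series both indicators trend upward toward the transition, which is the signature the early-warning literature looks for.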

Relevance: 20.00%

Abstract:

Three-component chiral derivatization protocols have been developed for ¹H, ¹³C and ¹⁹F NMR spectroscopic discrimination of chiral diacids through their coordination and self-assembly with optically active (R)-alpha-methylbenzylamine and 2-formylphenylboronic acid or 3-fluoro-2-formylphenylboronic acid. These protocols yield a mixture of diastereomeric imino-boronate esters, which are identified by well-resolved diastereotopic peaks with chemical shift differences of up to 0.6 and 2.1 ppm in the corresponding ¹H and ¹⁹F NMR spectra, without any racemization or kinetic resolution, thereby enabling the determination of enantiopurity. A protocol has also been developed for the discrimination of chiral alpha-methyl amines, using optically pure trans-1,2-cyclohexanedicarboxylic acid in combination with 2-formylphenylboronic acid or 3-fluoro-2-formylphenylboronic acid. The proposed strategies have been demonstrated on a large number of chiral diacids and chiral alpha-methyl amines.

Relevance: 20.00%

Abstract:

Electrical failure of insulation is known to be an extremal random process: nominally identical pro-rated specimens of equipment insulation at constant stress fail at inordinately different times, even under laboratory test conditions. To estimate the life of power equipment, it is necessary to run long-duration ageing experiments under accelerated stresses, and to acquire and analyze insulation-specific failure data. In the present work, Resin Impregnated Paper (RIP), a relatively new insulation system of choice in transformer bushings, is taken as an example. The failure data have been processed using proven statistical methods, both graphical and analytical. The physical model governing insulation failure at constant accelerated stress is assumed to be a temperature-dependent inverse power law model.
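
The graphical treatment of such failure data usually means a Weibull probability plot; below is a minimal median-rank regression sketch on synthetic failure times (illustrative of the general technique only, not the paper's exact analysis):

```python
import numpy as np

def weibull_mrr(failure_times):
    """Median-rank regression: fit a 2-parameter Weibull distribution by
    a straight line on the ln(t) vs ln(-ln(1-F)) probability plot."""
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)        # Bernard's median-rank approximation
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    beta, c = np.polyfit(x, y, 1)    # slope = shape parameter beta
    eta = np.exp(-c / beta)          # scale parameter eta (63.2% life)
    return beta, eta

# Synthetic failure data drawn from a known Weibull(shape=2.5, scale=100)
rng = np.random.default_rng(2)
times = 100.0 * rng.weibull(2.5, size=200)
beta_hat, eta_hat = weibull_mrr(times)
```

A shape parameter above 1 indicates wear-out behaviour, which is the regime accelerated-ageing tests on insulation are designed to probe.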

Relevance: 20.00%

Abstract:

In this paper we study the problem of designing SVM classifiers when the kernel matrix, K, is affected by uncertainty. Specifically, K is modeled as a positive affine combination of given positive semidefinite kernels, with the coefficients ranging in a norm-bounded uncertainty set. We treat the problem using the Robust Optimization methodology. This reduces the uncertain SVM problem to a deterministic conic quadratic problem, which can in principle be solved by a polynomial-time Interior Point (IP) algorithm. However, for large-scale classification problems, IP methods become intractable and one has to resort to first-order gradient-type methods. The strategy we use here is to reformulate the robust counterpart of the uncertain SVM problem as a saddle-point problem and employ a special gradient scheme which works directly on the convex-concave saddle function. The algorithm is a simplified version of a general scheme due to Juditsky and Nemirovski (2011). It achieves an O(1/T^2) reduction of the initial error after T iterations. A comprehensive empirical study on both synthetic data and real-world protein structure data sets shows that the proposed formulations achieve the desired robustness, and the saddle-point-based algorithm outperforms the IP method significantly.
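
The saddle-point idea can be illustrated in a much simplified form by plain gradient descent-ascent on a toy convex-concave function (this is not the Juditsky-Nemirovski scheme of the paper, just the basic mechanic):

```python
def gda(grad_x, grad_y, x, y, lr, iters):
    """Plain gradient descent-ascent on a convex-concave saddle function:
    descend in the minimizing variable x, ascend in the maximizing
    variable y."""
    for _ in range(iters):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x, y = x - lr * gx, y + lr * gy
    return x, y

# Toy saddle function L(x, y) = 0.5*x**2 + x*y - 0.5*y**2,
# strongly convex in x and strongly concave in y, saddle point at (0, 0).
x_star, y_star = gda(lambda x, y: x + y,   # dL/dx
                     lambda x, y: x - y,   # dL/dy
                     x=3.0, y=-2.0, lr=0.1, iters=500)
```

For a purely bilinear saddle function plain descent-ascent can cycle, which is why schemes like mirror-prox add an extrapolation step; the strong convexity-concavity here is what makes the naive iteration converge.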

Relevance: 20.00%

Abstract:

Edge-preserving smoothing is widely used in image processing, and bilateral filtering is one way to achieve it. The bilateral filter is a nonlinear combination of domain and range filters. Implementing the classical bilateral filter is computationally intensive, owing to the nonlinearity of the range filter. In the standard form, the domain and range filters are Gaussian functions, and the performance depends on the choice of the filter parameters. Recently, a constant-time implementation of the bilateral filter has been proposed based on a raised-cosine approximation to the Gaussian. We address the problem of determining the optimal parameters for this raised-cosine-based constant-time implementation. To determine the optimal parameters, we propose the use of Stein's unbiased risk estimator (SURE). The fast bilateral filter accelerates the search for optimal parameters by speeding up the optimization of the SURE cost. Experimental results show that the SURE-optimal raised-cosine-based bilateral filter has nearly the same performance as the SURE-optimal standard Gaussian bilateral filter and the oracle mean squared error (MSE)-based optimal bilateral filter.
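
A brute-force (non-constant-time) 1D bilateral filter makes the domain/range weighting explicit (a sketch; the raised-cosine acceleration and the SURE-based parameter selection are not shown):

```python
import numpy as np

def bilateral_1d(signal, sigma_s, sigma_r, radius):
    """Brute-force 1D bilateral filter: Gaussian domain (spatial) kernel
    times Gaussian range (intensity) kernel, normalized per sample.
    Cost is O(N * radius), unlike the constant-time approximations."""
    out = np.empty_like(signal, dtype=float)
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        d = np.arange(lo, hi) - i
        w = (np.exp(-d**2 / (2.0 * sigma_s**2)) *
             np.exp(-(signal[lo:hi] - signal[i])**2 / (2.0 * sigma_r**2)))
        out[i] = np.sum(w * signal[lo:hi]) / np.sum(w)
    return out

# Noisy step edge: the filter smooths the flat regions but, because the
# range kernel suppresses weights across the jump, preserves the edge.
rng = np.random.default_rng(3)
x = np.concatenate([np.zeros(50), np.ones(50)]) + 0.05 * rng.normal(size=100)
y = bilateral_1d(x, sigma_s=3.0, sigma_r=0.2, radius=9)
```

The nonlinearity the abstract refers to is visible here: the weights `w` depend on the signal values themselves, so the filter is not a fixed convolution.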

Relevance: 20.00%

Abstract:

Nonlinear equations in mathematical physics and engineering are solved by linearizing the equations, forming iterative procedures, and then executing the numerical simulation. For strongly nonlinear problems, the iterative solution can diverge due to numerical instability, which limits the application of numerical simulation to such problems. Helicopter aeroelasticity involves the solution of systems of nonlinear equations in a computationally expensive environment. Reliable solution methods that do not need a Jacobian calculation at each iteration are needed for this problem. In this paper, a comparative study is carried out by incorporating different methods for solving the nonlinear equations in helicopter trim. Three different methods based on calculating the Jacobian only at the initial guess are investigated. (C) 2011 Elsevier Masson SAS. All rights reserved.
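
A frozen-Jacobian iteration of the kind investigated here is the classical chord method; a minimal sketch on a toy algebraic system (not the helicopter trim equations):

```python
import numpy as np

def chord_method(f, jac0, x0, tol=1e-10, max_iter=200):
    """Chord (frozen-Jacobian) iteration: Newton's method, except the
    Jacobian is evaluated once at the initial guess and reused at every
    step, trading quadratic for linear convergence to save Jacobian
    evaluations."""
    x = np.asarray(x0, dtype=float)
    J = jac0(x)                       # evaluated once, never updated
    for _ in range(max_iter):
        dx = np.linalg.solve(J, -f(x))
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Toy system: x0**2 + x1 - 3 = 0,  x0 + x1**2 - 5 = 0, root at (1, 2)
f = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
jac = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])
root = chord_method(f, jac, [0.9, 1.9])
```

The method converges only when the initial guess is close enough that the frozen Jacobian remains a good approximation, which is exactly the trade-off such trim solvers must manage.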

Relevance: 20.00%

Abstract:

The present study performs a spatial and temporal trend analysis of annual, monthly and seasonal maximum and minimum temperatures (t_max, t_min) in India. Recent trends in annual, monthly, winter, pre-monsoon, monsoon and post-monsoon extreme temperatures (t_max, t_min) have been analyzed for three time slots, viz. 1901-2003, 1948-2003 and 1970-2003. For this purpose, time series of extreme temperatures for India as a whole and for seven homogeneous regions are considered, viz. Western Himalaya (WH), Northwest (NW), Northeast (NE), North Central (NC), East Coast (EC), West Coast (WC) and Interior Peninsula (IP). Rigorous trend detection has been exercised using a variety of non-parametric methods that account for the effect of serial correlation. During the last three decades, a minimum temperature trend is present over all India as well as in all temperature-homogeneous regions, either at the annual level or in at least one season (winter, pre-monsoon, monsoon, post-monsoon). The results agree with the earlier observation that the trend in minimum temperature is significant over India in the last three decades (Kothawale et al., 2010). The sequential MK test reveals that most of the trends in both maximum and minimum temperature began after 1970, at annual as well as seasonal levels. (C) 2012 Elsevier B.V. All rights reserved.
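
The basic Mann-Kendall (MK) trend test behind such analyses can be sketched as follows (a minimal version: normal approximation, no tie or serial-correlation corrections, which the paper's non-parametric methods do handle):

```python
import math
import numpy as np

def mann_kendall(x):
    """Non-parametric Mann-Kendall trend test (no ties assumed):
    returns the S statistic (sum of pairwise signs) and the two-sided
    p-value from the normal approximation with continuity correction."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / math.sqrt(var_s) if s != 0 else 0.0
    p = math.erfc(abs(z) / math.sqrt(2.0))   # two-sided p-value
    return s, p

# Warming-like series: a linear trend plus noise gives a large positive S
rng = np.random.default_rng(4)
temps = 0.02 * np.arange(100) + 0.3 * rng.normal(size=100)
s_stat, p_val = mann_kendall(temps)
```

Because the test uses only the signs of pairwise differences, it is robust to outliers and to the non-normality typical of temperature records.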

Relevance: 20.00%

Abstract:

Savitzky-Golay (S-G) filters are finite impulse response lowpass filters obtained while smoothing data using a local least-squares (LS) polynomial approximation. Savitzky and Golay proved in their hallmark paper that local LS fitting of polynomials and their evaluation at the mid-point of the approximation interval is equivalent to filtering with a fixed impulse response. The problem that we address here is, "how to choose a pointwise minimum mean squared error (MMSE) S-G filter length or order for smoothing, while preserving the temporal structure of a time-varying signal." We solve the bias-variance tradeoff involved in the MMSE optimization using Stein's unbiased risk estimator (SURE). We observe that the 3-dB cutoff frequency of the SURE-optimal S-G filter is higher where the signal varies fast locally, and vice versa, essentially enabling us to suitably trade off the bias and variance, thereby resulting in near-MMSE performance. At low signal-to-noise ratios (SNRs), it is seen that the adaptive filter length algorithm performance improves by incorporating a regularization term in the SURE objective function. We consider the algorithm performance on real-world electrocardiogram (ECG) signals. The results exhibit considerable SNR improvement. Noise performance analysis shows that the proposed algorithms are comparable, and in some cases, better than some standard denoising techniques available in the literature.
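
The local LS origin of the S-G weights can be sketched directly (a minimal version: fixed window and order, no boundary handling, and none of the SURE-based adaptation described above):

```python
import numpy as np

def savgol_coeffs(window, order):
    """Savitzky-Golay smoothing weights: LS-fit a polynomial of the
    given order on a symmetric window and evaluate the fit at the
    centre point, which collapses to a fixed impulse response."""
    half = window // 2
    A = np.vander(np.arange(-half, half + 1), order + 1, increasing=True)
    # Row 0 of the pseudoinverse evaluates the fitted polynomial at 0
    return np.linalg.pinv(A)[0]

def savgol_smooth(x, window, order):
    h = savgol_coeffs(window, order)
    return np.convolve(x, h[::-1], mode='same')

rng = np.random.default_rng(5)
t = np.linspace(0.0, 1.0, 200)
true = np.sin(2.0 * np.pi * t)
noisy = true + 0.1 * rng.normal(size=200)
smooth = savgol_smooth(noisy, window=21, order=3)
```

The weights sum to 1 (the filter reproduces constants exactly), and lengthening the window lowers the variance at the cost of bias where the signal bends, which is precisely the tradeoff the SURE criterion optimizes pointwise.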

Relevance: 20.00%

Abstract:

Low-frequency sounds are advantageous for long-range acoustic signal transmission, but for small animals they constitute a challenge for signal detection and localization. The efficient detection of sound in insects is enhanced by mechanical resonance either in the tracheal or tympanal system before subsequent neuronal amplification. Making small structures resonant at low sound frequencies poses challenges for insects and has not been adequately studied. Similarly, detecting the direction of long-wavelength sound using interaural signal amplitude and/or phase differences is difficult for small animals. Pseudophylline bushcrickets predominantly call at high, often ultrasonic frequencies, but a few paleotropical species use lower frequencies. We investigated the mechanical frequency tuning of the tympana of one such species, Onomarchus uninotatus, a large bushcricket that produces a narrow-bandwidth call at an unusually low carrier frequency of 3.2 kHz. Onomarchus uninotatus, like most bushcrickets, has two large tympanal membranes on each fore-tibia. We found that both these membranes vibrate like hinged flaps anchored at the dorsal wall and do not show higher modes of vibration in the frequency range investigated (1.5-20 kHz). The anterior tympanal membrane acts as a low-pass filter, attenuating sounds at frequencies above 3.5 kHz, in contrast to the high-pass filter characteristic of other bushcricket tympana. Responses to higher frequencies are partitioned to the posterior tympanal membrane, which shows maximal sensitivity at several broad frequency ranges, peaking at 3.1, 7.4 and 14.4 kHz. This partitioning between the two tympanal membranes constitutes an unusual feature of peripheral auditory processing in insects. The complex tracheal shape of O. uninotatus also deviates from the known tube or horn shapes associated with simple band-pass or high-pass amplification of tracheal input to the tympana. Interestingly, while the anterior tympanal membrane shows directional sensitivity at conspecific call frequencies, the posterior tympanal membrane is not directional at conspecific frequencies and instead shows directionality at higher frequencies.

Relevance: 20.00%

Abstract:

This paper presents an experimental study that was conducted to compare the results obtained from using different design methods (brainstorming (BR), functional analysis (FA), and SCAMPER) in design processes. The objectives of this work are twofold. The first was to determine whether there are any differences in the length of time devoted to the different types of activities that are carried out in the design process, depending on the method that is employed; in other words, whether the design methods that are used make a difference in the profile of time spent across the design activities. The second objective was to analyze whether there is any kind of relationship between the time spent on design process activities and the degree of creativity in the solutions that are obtained. Creativity was evaluated by means of the degree of novelty and the level of resolution of the designed solutions, using the creative product semantic scale (CPSS) questionnaire. The results show that there are significant differences between the amount of time devoted to activities related to understanding the problem and the typology of the design method, intuitive or logical, that is used. While the amount of time spent on analyzing the problem is very small in intuitive methods such as brainstorming and SCAMPER (around 8-9% of the time), with logical methods like functional analysis practically half the time is devoted to analyzing the problem. It was also found that the amount of time spent in each design phase has an influence on the results in terms of creativity, but the results are not strong enough to determine to what extent they are affected. This paper offers new data and results on the distinct benefits to be obtained from applying design methods. [DOI: 10.1115/1.4007362]