23 results for Smoothing

in Deakin Research Online - Australia


Relevance: 20.00%

Abstract:

Aggregation operators model various operations on fuzzy sets, such as conjunction, disjunction and averaging. Recently, double aggregation operators have been introduced; they model multistep aggregation processes. The choice of aggregation operator depends on the particular problem, and can be made by fitting the operator to empirical data. We examine fitting general aggregation operators by using a new method of monotone Lipschitz smoothing. We study various boundary conditions and constraints which determine specific types of aggregation.
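
As an illustration of what fitting an aggregation operator to empirical data can look like (a minimal sketch, not the authors' monotone Lipschitz smoothing method), the weights of a weighted arithmetic mean, one of the simplest averaging operators, can be fitted by constrained least squares:

```python
# Minimal sketch: fit the weights of a weighted arithmetic mean to
# hypothetical empirical data, under the usual constraints that the
# weights are non-negative and sum to one.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.random((50, 3))            # 50 observed input tuples in [0,1]^3
y = X @ np.array([0.5, 0.3, 0.2])  # hypothetical aggregated outputs

def loss(w):
    return np.sum((X @ w - y) ** 2)  # least-squares fit to the data

res = minimize(loss, x0=np.full(3, 1 / 3),
               bounds=[(0, 1)] * 3,
               constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
print(res.x)  # fitted weights, approximately (0.5, 0.3, 0.2)
```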

Relevance: 20.00%

Abstract:

This paper describes a new approach to multivariate scattered data smoothing. It is assumed that the data are generated by a Lipschitz continuous function f, and include random noise to be filtered out. The proposed approach uses a known, or estimated, value of the Lipschitz constant of f, and forces the data to be consistent with the Lipschitz properties of f. Depending on the assumptions about the distribution of the random noise, smoothing is reduced to a standard quadratic or linear programming problem. We discuss an efficient algorithm which eliminates the redundant inequality constraints. Numerical experiments illustrate the applicability and efficiency of the method. This approach provides an efficient new tool for multivariate scattered data approximation.
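
A minimal sketch of the linear programming variant, assuming one-dimensional inputs, a known Lipschitz constant M, and an l1-type noise model. On sorted 1-D data the constraints between neighbouring points imply all pairwise Lipschitz constraints, which echoes the paper's point about eliminating redundant constraints:

```python
# Variables are the smoothed values z_i plus slacks t_i >= |z_i - y_i|;
# the constraints force |z_{i+1} - z_i| <= M * (x_{i+1} - x_i).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 30))
y = np.sin(2 * np.pi * x) + rng.laplace(0, 0.1, 30)  # noisy samples
M = 2 * np.pi                      # Lipschitz constant of sin(2*pi*x)

n = len(x)
c = np.concatenate([np.zeros(n), np.ones(n)])  # minimise sum of slacks
rows, b = [], []
for i in range(n):                 # t_i >= |z_i - y_i|
    r = np.zeros(2 * n); r[i] = 1; r[n + i] = -1
    rows.append(r); b.append(y[i])
    r = np.zeros(2 * n); r[i] = -1; r[n + i] = -1
    rows.append(r); b.append(-y[i])
for i in range(n - 1):             # neighbour Lipschitz constraints
    bound = M * (x[i + 1] - x[i])
    r = np.zeros(2 * n); r[i] = -1; r[i + 1] = 1
    rows.append(r); b.append(bound)
    rows.append(-r); b.append(bound)

res = linprog(c, A_ub=np.array(rows), b_ub=np.array(b),
              bounds=[(None, None)] * n + [(0, None)] * n)
z = res.x[:n]                      # Lipschitz-consistent smoothed values
```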

Relevance: 20.00%

Abstract:

Q-ball imaging was presented as a model-free, linear and multimodal diffusion-sensitive approach to reconstructing the diffusion orientation distribution function (ODF) from diffusion-weighted MRI data. ODFs are widely used to estimate fiber orientations. A smoothness constraint was proposed to achieve a balance between angular resolution and noise stability in ODF reconstruction, and different regularization methods have been proposed for this purpose. However, these methods are not robust and are quite sensitive to the global regularization parameter. Although numerical methods such as the L-curve test can be used to select a globally appropriate regularization parameter, no single value is suitable for all regions of interest; the result may be oversmoothing that neglects an existing fiber population. In this paper, we propose to include an interpolation step, based on Delaunay triangulation, prior to the spherical harmonic decomposition. This provides a reliable, robust and accurate smoothing approach that is easy to implement and does not require other numerical methods to define the required parameters. The fiber orientations estimated using this approach are also more accurate than those obtained with other common approaches.
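
For contrast, here is a minimal sketch of the globally regularized spherical harmonic fit that the abstract argues against (a Laplace-Beltrami penalty with a single parameter lam, in the style commonly used for ODF reconstruction); the proposed Delaunay interpolation step is not shown:

```python
# Regularised least-squares SH fit of a diffusion signal sampled at
# gradient directions (theta = azimuth, phi = polar angle).
import numpy as np
from scipy.special import sph_harm

def sh_basis(order, theta, phi):
    """Real, even-order spherical harmonic basis plus the
    Laplace-Beltrami penalty matrix."""
    cols, lb = [], []
    for l in range(0, order + 1, 2):
        for m in range(-l, l + 1):
            y = sph_harm(abs(m), l, theta, phi)
            if m < 0:
                y = np.sqrt(2) * y.imag
            elif m > 0:
                y = np.sqrt(2) * y.real
            else:
                y = y.real
            cols.append(y)
            lb.append((l * (l + 1)) ** 2)  # Laplace-Beltrami eigenvalue^2
    return np.column_stack(cols), np.diag(lb)

def fit_sh(signal, theta, phi, order=4, lam=0.006):
    # lam = 0.006 is a commonly cited default, assumed here.
    B, L = sh_basis(order, theta, phi)
    # Tikhonov-style solution: c = (B^T B + lam * L)^-1 B^T s
    return np.linalg.solve(B.T @ B + lam * L, B.T @ signal)
```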

Relevance: 20.00%

Abstract:

Software reliability growth models (SRGMs) are extensively employed in software engineering to assess the reliability of software before its release for operational use. These models are usually parametric functions obtained by statistically fitting parametric curves, using maximum likelihood estimation or least-squares methods, to plots of the cumulative number of observed failures N(t) against a period of systematic testing time t. Since the 1970s, a very large number of SRGMs have been proposed in the reliability and software engineering literature, and these are often very complex, reflecting the involved testing regimes that often took place during the software development process. In this paper we extend some of our previous work by adopting a nonparametric approach to SRGM modeling based on local polynomial modeling with kernel smoothing. These models require very few assumptions, which facilitates the estimation process and renders them relevant under a wide variety of situations. Finally, we provide numerical examples in which these models are evaluated and compared.
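
A minimal sketch of local polynomial modeling with kernel smoothing of the kind described (a local linear fit with a Gaussian kernel; not the authors' exact estimator or bandwidth choice):

```python
# Nonparametric estimate of the cumulative failure curve N(t): at each
# query time, fit a kernel-weighted straight line to the observed data.
import numpy as np

def local_linear(t_query, t, N, bandwidth):
    est = np.empty_like(t_query, dtype=float)
    for i, t0 in enumerate(t_query):
        w = np.exp(-0.5 * ((t - t0) / bandwidth) ** 2)  # Gaussian weights
        X = np.column_stack([np.ones_like(t), t - t0])  # local design matrix
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ N)
        est[i] = beta[0]            # intercept = fitted value at t0
    return est
```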

Relevance: 20.00%

Abstract:

Hemodynamic models have high potential for understanding the functional differences of the brain. However, full system identification, in the sense of fitting the model to actual functional magnetic resonance imaging (fMRI) data, is practically difficult and is still an active area of research. We present a simulation-based Bayesian approach for nonlinear model-based analysis of fMRI data. The idea is to perform joint state and parameter estimation within a general filtering framework. One advantage of Bayesian methods is that they provide a complete description of the posterior distribution, not just a single point estimate. We use an auxiliary particle filter adjoined with a kernel smoothing approach to address this joint estimation problem.
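
A minimal sketch of the kernel smoothing idea for static parameters inside a particle filter (a Liu-West-style shrinkage step, one common way to realize it; the surrounding auxiliary particle filter is omitted):

```python
# Parameter particles are shrunk towards their weighted mean and then
# jittered with a matching kernel, keeping them diverse without inflating
# their overall spread.
import numpy as np

rng = np.random.default_rng(0)

def kernel_smooth_parameters(theta, weights, delta=0.98):
    """theta: (n_particles, n_params) parameter particles;
    weights: normalised importance weights summing to 1."""
    a = (3 * delta - 1) / (2 * delta)        # shrinkage coefficient
    h2 = 1 - a ** 2                          # kernel bandwidth squared
    mean = weights @ theta                   # weighted parameter mean
    cov = np.atleast_2d(np.cov(theta.T, aweights=weights))
    shrunk = a * theta + (1 - a) * mean      # shrink towards the mean
    jitter = rng.multivariate_normal(np.zeros(theta.shape[1]),
                                     h2 * cov, size=theta.shape[0])
    return shrunk + jitter                   # refreshed parameter particles
```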

Relevance: 20.00%

Abstract:

Two multidimensional HPLC separations of an Australian red wine are presented in which >70% of the available separation space was used. A porous graphitic carbon (PGC) stationary phase was used as the first dimension in both separations, with an RP core-shell column and a fully porous hydrophilic interaction chromatography column used separately in the second dimension. To overcome peak-analysis problems caused by signal noise and low detection limits, the data were pre-processed with penalised least-squares smoothing. The PGC × RP combination separated 85 peaks with a spreading angle of 71° and the PGC × hydrophilic interaction chromatography combination separated 207 peaks with a spreading angle of 80°. Both 2D-HPLC separations were completed in 76 min using a comprehensive stop-and-go approach. A smoothing step added to the peak-picking process greatly reduced the number of false peaks caused by noise in the chromatograms: because the peaks were small in magnitude, thresholds alone could not reject the noise, and the 1874 peaks located in the non-smoothed PGC × RP separation reduced to 227 peaks once smoothing was included.
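
A minimal sketch of a penalised least-squares smoother of the kind used for pre-processing (the Whittaker/Eilers form, minimising ||y - z||² + λ||Dz||² with a second-difference operator D; the paper's exact penalty settings are not given here):

```python
# Whittaker smoother: trade fidelity to the raw chromatogram y against
# roughness of the smoothed signal z, controlled by lam.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def whittaker_smooth(y, lam=1e4):
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))  # 2nd diffs
    A = sparse.eye(n) + lam * (D.T @ D)
    return spsolve(A.tocsc(), y)   # smoothed signal z
```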

Relevance: 20.00%

Abstract:

In this paper, we address the problem of noisy images that arise during relevance feedback for medical image retrieval. We concentrate on noisy images caused by users mislabeling irrelevant images as relevant ones, and propose a noise-smoothing relevance feedback (NS-RF) method. In NS-RF, a two-step strategy handles the noisy images. In step 1, a noise-elimination algorithm identifies and eliminates the noisy images. In step 2, to further alleviate the influence of noisy images, a fuzzy membership function estimates the relevance probabilities of the retained relevant images. After noise handling, a fuzzy support vector machine, which can weight different relevant images by their relevance probabilities, is adopted to re-rank the images. Experimental results on the IRMA medical image collection demonstrate that the proposed method deals with noisy images effectively.
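
A minimal sketch of the re-ranking idea (function and variable names are hypothetical): a fuzzy SVM is approximated here by passing the estimated relevance probabilities as per-sample weights to a standard SVM, so that doubtful feedback images influence the decision boundary less:

```python
import numpy as np
from sklearn.svm import SVC

def rerank(features, labels, relevance_prob, candidates):
    """features: feedback image features; labels: +1 relevant / -1
    irrelevant; relevance_prob: fuzzy memberships in (0, 1];
    candidates: feature matrix of images to be ranked."""
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(features, labels, sample_weight=relevance_prob)
    scores = clf.decision_function(candidates)  # higher = more relevant
    return np.argsort(-scores)                  # indices in ranked order
```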

Relevance: 10.00%

Abstract:

This paper describes a new method of monotone interpolation and smoothing of multivariate scattered data. It is based on the assumption that the function to be approximated is Lipschitz continuous. The method provides the optimal approximation in the worst-case scenario, together with tight error bounds. Smoothing of noisy data subject to monotonicity constraints is converted into a quadratic programming problem. Estimation of the unknown Lipschitz constant from the data by sample splitting and cross-validation is described, and an extension of the method to locally Lipschitz functions is presented.
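
A minimal sketch of estimating the unknown Lipschitz constant by sample splitting and cross-validation, assuming 1-D data: candidate constants are scored by how well the worst-case-optimal Lipschitz interpolant (the midpoint of the upper and lower Lipschitz envelopes) built on one split predicts the other:

```python
import numpy as np

def lipschitz_predict(x_train, y_train, x_test, M):
    d = np.abs(x_test[:, None] - x_train[None, :])    # pairwise distances
    upper = np.min(y_train[None, :] + M * d, axis=1)  # upper envelope
    lower = np.max(y_train[None, :] - M * d, axis=1)  # lower envelope
    return (upper + lower) / 2                        # optimal interpolant

def estimate_lipschitz(x, y, candidates, n_splits=5):
    rng = np.random.default_rng(0)
    errs = np.zeros(len(candidates))
    for _ in range(n_splits):
        idx = rng.permutation(len(x))                 # random sample split
        tr, te = idx[: len(x) // 2], idx[len(x) // 2:]
        for k, M in enumerate(candidates):
            pred = lipschitz_predict(x[tr], y[tr], x[te], M)
            errs[k] += np.mean((pred - y[te]) ** 2)   # held-out error
    return candidates[np.argmin(errs)]                # best candidate M
```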

Relevance: 10.00%

Abstract:

For various applications it is necessary to know not only global solar radiation values but also their diffuse and beam components. Because often only global values are available, several models have been developed to establish correlations between the diffuse fraction and various predictors. These typically include the clearness index, but may also include solar angle, temperature and humidity. The clearness index is the proportion of extraterrestrial radiation reaching a location, where the extraterrestrial value used in the calculation varies with latitude and time of year. Existing correlations have been developed using data principally from latitudes greater than 40°, often using data from only a few locations, and with few exceptions have not used solar altitude as a predictor. Generally the data consist of hourly integrated values. A model has been developed using hourly data from a weather station set up at Deakin University, Geelong. Another model has been developed for 15-minute data values in order to ascertain whether the smoothing introduced by using hourly data makes a significant difference to the overall results. The construction of such models has been investigated, enabling an extension of the research, inclusive of other stations, to be performed systematically. A final investigation, using data from other Australian locations, explained some of the considerable scatter by adding apparent solar time as a predictor, which proved to be significantly better than solar altitude.
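
A minimal sketch of computing the clearness index and fitting a diffuse-fraction correlation to it (the logistic model form and the data values are assumptions for illustration, not necessarily the model developed here):

```python
import numpy as np
from scipy.optimize import curve_fit

def clearness_index(global_hor, extraterrestrial_hor):
    """kt = global horizontal irradiation / extraterrestrial horizontal
    irradiation for the same period (the latter varies with latitude
    and time of year)."""
    return global_hor / extraterrestrial_hor

def logistic_fraction(kt, a, b):
    return 1.0 / (1.0 + np.exp(a + b * kt))  # diffuse fraction in (0, 1)

# hypothetical observations: clearness index kt and measured diffuse fraction d
kt = np.array([0.1, 0.3, 0.5, 0.7, 0.8])
d = np.array([0.98, 0.90, 0.55, 0.25, 0.18])
(a, b), _ = curve_fit(logistic_fraction, kt, d, p0=(-5.0, 8.0))
```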

Relevance: 10.00%

Abstract:

A new versatile computer-controlled electrochemical/ESR data acquisition system has been developed for the investigation of short-lived radicals with lifetimes of 20 milliseconds and greater. Different computer programs have been developed to monitor the decay of radicals over hours or minutes, seconds, or milliseconds. Signal averaging and Fourier smoothing are employed in order to improve the signal-to-noise ratio. Two microcomputers control the system: a home-made computer containing the M6800 chip, which controls the magnetic field, and an IBM PC XT, which controls the electrochemistry and the data acquisition. The computer programs are written in Fortran and C, and call machine-language subroutines. The system functions by having the radical generated by an electrochemical pulse; after or during the pulse the ESR data are collected. Decaying radicals with half-lives of seconds or greater have their spectra collected in the magnetic field domain, which can be swept as fast as 200 Gauss per second. The decay of radicals in the millisecond region is monitored by time-resolved ESR, a technique in which data are collected in both the time domain and the magnetic field domain. Previously, time-resolved ESR had been used (without field modulation) to investigate ultra-short-lived species with lifetimes of only a few microseconds. The application of the data acquisition system to chemical systems is illustrated. This is the first computer-controlled system in which the radical is generated by electrochemical means and the ESR data subsequently collected.
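
A minimal sketch of Fourier smoothing as a way to improve the signal-to-noise ratio (a crude low-pass filter for illustration; the actual system's filter design is not described in the abstract):

```python
# Transform the averaged spectrum, attenuate high-frequency components,
# and transform back.
import numpy as np

def fourier_smooth(signal, cutoff_fraction=0.1):
    """Zero all Fourier components above cutoff_fraction of the
    available frequency range."""
    spectrum = np.fft.rfft(signal)
    cutoff = int(len(spectrum) * cutoff_fraction)
    spectrum[cutoff:] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```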

Relevance: 10.00%

Abstract:

Australia’s superannuation system consists of individual retirement accounts that cannot be accessed until the taxpayer reaches the legislated preservation age. Most of the deposits to these accounts are the mandatory contributions that employers make. Some of the claimed justifications for superannuation are weak. Specifically, claims that superannuation is necessary to prevent a looming ageing crisis, or is justified on the grounds of intergenerational equity, lose much of their force when examined in the context of substantially higher future incomes. One justification for superannuation that has merit is that it helps promote income smoothing. Although there are strong arguments for retirement policies that promote income smoothing, given the long-term trend towards income inequality there are also convincing arguments for an emphasis on retirement policies that distribute incomes more equally. If income smoothing is, on balance, seen as a desirable goal, then there is merit in Australia’s superannuation system being complemented by a fully funded, government-run defined-benefits scheme.

Relevance: 10.00%

Abstract:

Chromatographic detection responses are recorded digitally, and a peak is ideally represented by a Gaussian distribution. Raising a Gaussian to the power n raises the peak height to that power but reduces the standard deviation by a factor of √n. Hence the disparity between peak responses and low-level noise grows, with a corresponding decrease in peak width; this increases the S/N ratio and the peak-to-peak resolution. The ramifications are that poor resolution in complex chromatographic data can be improved, and low signal responses embedded at near-noise levels can be enhanced. This data treatment is potentially very useful in 2D-HPLC, where sample dilution between dimensions reduces the signal response, and in post-reaction detection methods, where band broadening is increased by the reaction coils. In this work, power functions applied to chromatographic data are discussed in the context of (a) complex separation problems, (b) 2D-HPLC separations, and (c) post-column reaction detectors.
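
A quick numeric check of the stated effect, using a hypothetical peak of height 0.5 and unit standard deviation:

```python
# Raising a Gaussian peak to the power n raises its height to the power n
# and narrows its standard deviation by a factor of sqrt(n).
import numpy as np

x = np.linspace(-5, 5, 10001)
g = 0.5 * np.exp(-x**2 / 2)       # height 0.5, sigma = 1
n = 3
p = g ** n                        # powered peak

height = p.max()                  # = 0.5**3 = 0.125
# recover sigma from the second moment of the peak shape
sigma = np.sqrt(np.sum(x**2 * p) / np.sum(p))
print(height, sigma)              # ~0.125 and ~1/sqrt(3) = 0.577
```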

Relevance: 10.00%

Abstract:

This work demonstrates a novel Bayesian learning approach for model-based analysis of functional magnetic resonance imaging (fMRI) data. We use a physiologically inspired hemodynamic model and investigate a method to simultaneously infer the neural activity together with the hidden states and the physiological parameters of the model. This joint estimation problem is still an open topic. We use a particle filter, accompanied by a kernel smoothing approach, to address the problem within a general filtering framework. Simulation results show that the proposed method is consistent and has good potential to be enhanced for further fMRI data analysis.
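
A minimal sketch of the kind of physiologically inspired hemodynamic model referred to (a Friston-style balloon model with typical literature parameter values, which is an assumption; for the kernel smoothing step inside a particle filter, see the earlier sketch):

```python
# Neural activity u(t) drives a vasodilatory signal s, blood inflow f,
# venous volume v and deoxyhemoglobin content q, from which a BOLD
# signal is computed.
import numpy as np
from scipy.integrate import solve_ivp

EPS, TAU_S, TAU_F, TAU_0, ALPHA, E0 = 0.5, 0.8, 0.4, 1.0, 0.32, 0.34

def balloon(t, state, u):
    s, f, v, q = state
    ds = EPS * u(t) - s / TAU_S - (f - 1) / TAU_F
    df = s
    dv = (f - v ** (1 / ALPHA)) / TAU_0
    dq = (f * (1 - (1 - E0) ** (1 / f)) / E0
          - q * v ** (1 / ALPHA - 1)) / TAU_0
    return [ds, df, dv, dq]

u = lambda t: 1.0 if 1.0 <= t <= 2.0 else 0.0   # brief stimulus
sol = solve_ivp(balloon, (0, 20), [0, 1, 1, 1], args=(u,), max_step=0.05)
v, q = sol.y[2], sol.y[3]
# percent BOLD change with commonly used coefficients (assumed values)
bold = 100 * 0.04 * (7 * E0 * (1 - q) + 2 * (1 - q / v)
                     + (2 * E0 - 0.2) * (1 - v))
```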