750 results for SMOOTHING SPLINES


Relevance: 20.00%

Abstract:

We examine the strategies interwar working-class British households used to “smooth” consumption over time and guard against negative contingencies such as illness, unemployment, and death. Newly discovered returns from the U.K. Ministry of Labour's 1937/38 Household Expenditure Survey are used to fully categorize expenditure smoothing via nineteen credit/savings vehicles. We find that households made extensive use of expenditure-smoothing devices. Families' reliance on expenditure-smoothing is shown to be inversely related to household income, while households also used these mechanisms more intensively during expenditure crisis phases of the family life cycle, especially the years immediately after new household formation.

Relevance: 20.00%

Abstract:

We present a new subcortical structure shape modeling framework that uses heat kernel smoothing constructed with the Laplace-Beltrami eigenfunctions. The cotan discretization is used to numerically obtain the eigenfunctions of the Laplace-Beltrami operator along the surface of subcortical brain structures. The eigenfunctions are then used to construct the heat kernel, which smooths out measurement noise along the surface. The proposed framework is applied to investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shape. We detected a significant age effect on the hippocampus, in accordance with previous studies. In addition, we detected a significant gender effect on the amygdala. Since we did not find any such differences using traditional volumetric methods, our results demonstrate the benefit of the proposed framework over traditional volumetric methods.
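
To make the smoothing step concrete, the sketch below applies heat kernel smoothing in the eigenbasis of a mesh Laplacian: a per-vertex signal is projected onto the eigenfunctions, each coefficient is attenuated by exp(-λ_k t), and the signal is reconstructed. This is a minimal illustration only; it assumes precomputed, orthonormal eigenpairs (the cotan discretization and mass-matrix weighting used in the paper are not shown), and the function and variable names are illustrative.

```python
import numpy as np

def heat_kernel_smooth(signal, eigenvalues, eigenvectors, t):
    """Smooth a per-vertex signal with the heat kernel exp(-t * Laplacian).

    eigenvalues  : (k,) Laplace-Beltrami eigenvalues (ascending)
    eigenvectors : (n_vertices, k) eigenfunctions, assumed orthonormal
    t            : diffusion time; larger t means more smoothing
    """
    coeffs = eigenvectors.T @ signal          # project onto the eigenbasis
    coeffs *= np.exp(-eigenvalues * t)        # attenuate high-frequency modes
    return eigenvectors @ coeffs              # reconstruct in vertex space

# toy usage with a stand-in basis and a random "surface measurement"
rng = np.random.default_rng(0)
n, k = 500, 50
evecs, _ = np.linalg.qr(rng.normal(size=(n, k)))   # stand-in orthonormal basis
evals = np.linspace(0.0, 10.0, k)                  # stand-in spectrum
noisy = rng.normal(size=n)
smoothed = heat_kernel_smooth(noisy, evals, evecs, t=1.0)
```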

Relevance: 20.00%

Abstract:

Purpose: To quantify the extent to which the new registration method DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra) may reduce the required smoothing kernel width, and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated atrophy approach was employed to explore the role of smoothing kernel, group size, and their interactions on VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels between 0 and 12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups of 50 participants, and 8-10 mm for groups of 25, at P < 0.05 with family-wise error correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, albeit at P < 0.05 with false discovery rate correction. Conclusion: Our data confirm that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency toward smaller kernels for larger groups. Importantly, kernel selection was also affected by the threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
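
As a point of reference for the kernel widths discussed above, the following hedged sketch shows the standard FWHM-to-sigma conversion and an isotropic Gaussian smoothing of a grey-matter probability map with SciPy; the function name, array shape and voxel size are illustrative assumptions, not taken from the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_gm_map(gm_map, fwhm_mm, voxel_size_mm):
    """Apply an isotropic Gaussian smoothing kernel specified by FWHM in mm."""
    sigma_mm = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM -> sigma
    sigma_vox = sigma_mm / np.asarray(voxel_size_mm, dtype=float)
    return gaussian_filter(gm_map, sigma=sigma_vox)

# e.g. a 6 mm kernel on 1.5 mm isotropic grey-matter maps (illustrative sizes)
gm = np.random.rand(121, 145, 121)
smoothed = smooth_gm_map(gm, fwhm_mm=6.0, voxel_size_mm=(1.5, 1.5, 1.5))
```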

Relevance: 20.00%

Abstract:

Radar refractivity retrievals have the potential to accurately capture near-surface humidity fields from the phase change of ground clutter returns. In practice, phase changes are very noisy and the required smoothing will diminish large radial phase change gradients, leading to severe underestimates of large refractivity changes (ΔN). To mitigate this, the mean refractivity change over the field (ΔNfield) must be subtracted prior to smoothing. However, both observations and simulations indicate that highly correlated returns (e.g., when single targets straddle neighboring gates) result in underestimates of ΔNfield when pulse-pair processing is used. This may contribute to reported differences of up to 30 N units between surface observations and retrievals. This effect can be avoided if ΔNfield is estimated using a linear least squares fit to azimuthally averaged phase changes. Nevertheless, subsequent smoothing of the phase changes will still tend to diminish the all-important spatial perturbations in retrieved refractivity relative to ΔNfield; an iterative estimation approach may be required. The uncertainty in the target location within the range gate leads to additional phase noise proportional to ΔN, pulse length, and radar frequency. The use of short pulse lengths is recommended, not only to reduce this noise but to increase both the maximum detectable refractivity change and the number of suitable targets. Retrievals of refractivity fields must allow for large ΔN relative to an earlier reference field. This should be achievable for short pulses at S band, but phase noise due to target motion may prevent this at C band, while at X band even the retrieval of ΔN over shorter periods may at times be impossible.
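
The least squares estimate of ΔNfield recommended above can be sketched as follows, assuming the standard linear relation between scan-to-scan phase change and refractivity change, Δφ(r) ≈ 4π f r ΔN × 10⁻⁶ / c. The sketch averages the phase changes azimuthally as complex phasors, unwraps along range, and fits a straight line; the constants, signs and the handling of unreliable targets should be checked against the actual processing chain.

```python
import numpy as np

C = 2.998e8  # speed of light, m/s

def delta_n_field(phase_change, ranges_m, freq_hz):
    """Estimate the field-mean refractivity change from clutter phase changes.

    phase_change : (n_az, n_gates) scan-to-scan phase differences in radians
    ranges_m     : (n_gates,) range of each gate in metres
    freq_hz      : radar frequency in Hz
    """
    # azimuthal average done on complex phasors to avoid wrap-around problems,
    # then unwrapped along range
    mean_phase = np.angle(np.exp(1j * phase_change).mean(axis=0))
    mean_phase = np.unwrap(mean_phase)
    # linear least squares fit of phase vs. range; slope = 4*pi*f*dN*1e-6/c
    slope, _ = np.polyfit(ranges_m, mean_phase, 1)
    return slope * C / (4.0 * np.pi * freq_hz) * 1e6
```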

Relevance: 20.00%

Abstract:

The Shuttle Radar Topography Mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (approximately 30 m) for areas within the USA and at 3 arc-second (approximately 90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface with coherent angular properties (i.e., slope, aspect) between neighbouring pixels, which is an important characteristic when dealing with terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in the western USA, including different adjustment schemes (changes in the nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the National Elevation Dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighting (IDW)). The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, owing to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram over short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid punctual artifacts generated by predicted points at the same location as original data points; and (iii) using a small nugget effect value, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines agree well with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and by splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
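
A minimal sketch of the three kriging guidelines listed above (local neighbourhood, jittered coordinates, small nugget) is given below using the pykrige package; this is an assumed implementation, not the authors' software, and the variogram parameters are placeholders to be fitted to the actual SRTM samples.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging   # assumes the pykrige package is available

def resample_dem(x, y, z, grid_x, grid_y, rng_seed=0):
    """Resample coarse elevation samples onto a finer grid by ordinary kriging."""
    rng = np.random.default_rng(rng_seed)
    # (ii) tiny coordinate jitter so prediction points never coincide exactly
    #      with data points, avoiding punctual artifacts
    jitter = 1e-6
    xj = x + rng.uniform(-jitter, jitter, size=x.shape)
    yj = y + rng.uniform(-jitter, jitter, size=y.shape)
    ok = OrdinaryKriging(
        xj, yj, z,
        variogram_model="spherical",
        # (iii) small nugget so real terrain features are not smoothed away;
        #       sill and range here are placeholders, not fitted values
        variogram_parameters={"sill": float(np.var(z)), "range": 500.0, "nugget": 1e-3},
    )
    # (i) moving-window kriging: use only the nearest samples around each node
    z_grid, _ = ok.execute("grid", grid_x, grid_y,
                           backend="loop", n_closest_points=16)
    return z_grid
```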

Relevance: 20.00%

Abstract:

The need for monotone approximation of scattered data often arises in regression problems where monotonicity is semantically important. One such domain is fuzzy set theory, where membership functions and aggregation operators are order preserving. Least squares polynomial splines provide great flexibility when modeling non-linear functions, but may fail to be monotone. Linear restrictions on spline coefficients provide necessary and sufficient conditions for spline monotonicity. The spline basis is selected in such a way that these restrictions take an especially simple form. The resulting non-negative least squares problem can be solved by a variety of standard, proven techniques. Additional interpolation requirements can also be imposed in the same framework. The method is applied to fuzzy systems, where membership functions and aggregation operators are constructed from empirical data.
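
A minimal sketch of the idea that monotonicity reduces to non-negativity of coefficients is shown below; it uses truncated linear (hinge) basis functions rather than the polynomial spline basis of the paper, so it illustrates the principle rather than the paper's method.

```python
import numpy as np
from scipy.optimize import nnls

def monotone_fit(x, y, knots):
    """Monotone non-decreasing least-squares fit.

    Basis: the intercept split into (+1, -1) columns (so it stays unrestricted)
    plus hinge functions max(0, x - knot); non-negative hinge coefficients
    guarantee a non-decreasing fit.
    """
    def design(xs):
        return np.column_stack(
            [np.ones_like(xs), -np.ones_like(xs)]
            + [np.maximum(0.0, xs - t) for t in knots]
        )
    coef, _ = nnls(design(x), y)
    return lambda xs: design(xs) @ coef

# usage: fit a noisy increasing membership-like function
x = np.linspace(0, 1, 200)
y = 1 / (1 + np.exp(-10 * (x - 0.5))) + 0.05 * np.random.randn(200)
f = monotone_fit(x, y, knots=np.linspace(0, 1, 15))
```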

Relevance: 20.00%

Abstract:

Splines with free knots have been extensively studied with regard to calculating optimal knot positions. The dependence of the approximation accuracy on the knot distribution is highly nonlinear, and optimisation techniques face a difficult problem of multiple local minima. The domain of the problem is a simplex, which adds to the complexity. We have applied a recently developed cutting angle method of deterministic global optimisation, which allows one to solve a wide class of optimisation problems on a simplex. The results of the cutting angle method are subsequently improved by a local discrete gradient method. The resulting algorithm is sufficiently fast and guarantees that the global minimum has been reached. The results of numerical experiments are presented.
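
The optimisation problem can be sketched as follows: the objective is the least-squares spline error as a function of the interior knot positions, and a generic global optimiser from SciPy stands in for the cutting angle / discrete gradient combination described above; knot counts, penalty values and data are illustrative.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline
from scipy.optimize import differential_evolution

x = np.linspace(0.0, 1.0, 400)
y = np.sin(8 * np.pi * x**2) + 0.05 * np.random.randn(x.size)
k, n_knots = 3, 6

def knot_error(params):
    """Sum of squared residuals of the least-squares spline with these knots."""
    t = np.sort(params)
    # keep knots strictly interior and separated, else return a large penalty
    if t[0] <= x[0] + 1e-3 or t[-1] >= x[-1] - 1e-3 or np.min(np.diff(t)) < 1e-3:
        return 1e6
    try:
        spl = LSQUnivariateSpline(x, y, t, k=k)
    except ValueError:             # Schoenberg-Whitney conditions violated
        return 1e6
    return float(np.sum((spl(x) - y) ** 2))

# generic global optimiser as a stand-in for the cutting angle method
result = differential_evolution(knot_error, bounds=[(0.0, 1.0)] * n_knots, seed=0)
best_knots = np.sort(result.x)
```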


Relevance: 20.00%

Abstract:

Aggregation operators model various operations on fuzzy sets, such as conjunction, disjunction and averaging. Recently, double aggregation operators have been introduced; they model multistep aggregation processes. The choice of aggregation operator depends on the particular problem, and can be made by fitting the operator to empirical data. We examine fitting general aggregation operators using a new method of monotone Lipschitz smoothing. We study various boundary conditions and constraints which determine specific types of aggregation.
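
As a much simpler stand-in for the monotone Lipschitz smoothing method described above, the sketch below fits one of the simplest averaging aggregation operators, a weighted arithmetic mean, to empirical input/output pairs; the constraints w ≥ 0 and Σw = 1 automatically enforce monotonicity and the boundary conditions A(0,…,0) = 0 and A(1,…,1) = 1. Function names and data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def fit_weighted_mean(X, y):
    """Fit the weights of a weighted arithmetic mean A(x) = sum_i w_i * x_i."""
    n = X.shape[1]
    res = minimize(
        lambda w: np.sum((X @ w - y) ** 2),        # squared fitting error
        x0=np.full(n, 1.0 / n),
        bounds=[(0.0, None)] * n,                  # w_i >= 0
        constraints=[{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}],  # sum = 1
        method="SLSQP",
    )
    return res.x

# usage: recover weights from noisy observations of an unknown averaging operator
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))
y = X @ np.array([0.5, 0.3, 0.2]) + 0.01 * rng.normal(size=200)
weights = fit_weighted_mean(X, y)
```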

Relevance: 20.00%

Abstract:

This paper describes a new approach to multivariate scattered data smoothing. It is assumed that the data are generated by a Lipschitz continuous function f and include random noise to be filtered out. The proposed approach uses a known or estimated value of the Lipschitz constant of f, and forces the data to be consistent with the Lipschitz properties of f. Depending on the assumptions about the distribution of the random noise, smoothing reduces to a standard quadratic or linear programming problem. We discuss an efficient algorithm which eliminates the redundant inequality constraints. Numerical experiments illustrate the applicability and efficiency of the method. This approach provides an efficient new tool for multivariate scattered data approximation.
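
For the Gaussian-noise case, the quadratic program described above can be written directly, as sketched below with cvxpy (an assumed, off-the-shelf solver rather than the paper's specialized algorithm): minimize ‖z − y‖² subject to |z_i − z_j| ≤ M‖x_i − x_j‖ for all pairs. The sketch keeps all O(n²) constraints; eliminating the redundant ones is part of the paper's contribution.

```python
import numpy as np
import cvxpy as cp   # assumed available; the paper uses its own QP/LP machinery

def lipschitz_smooth(X, y, lipschitz_const):
    """Project noisy values y onto the set of data consistent with the given
    Lipschitz constant (squared-error / Gaussian-noise case)."""
    n = len(y)
    # pairwise distances between the scattered sample points
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    z = cp.Variable(n)
    constraints = [
        z[i] - z[j] <= lipschitz_const * d[i, j]
        for i in range(n) for j in range(n) if i != j
    ]
    cp.Problem(cp.Minimize(cp.sum_squares(z - y)), constraints).solve()
    return z.value

# toy usage: smooth noisy samples of a function with Lipschitz constant ~2*pi
rng = np.random.default_rng(0)
X = rng.uniform(size=(60, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.2 * rng.normal(size=60)
smoothed = lipschitz_smooth(X, y, lipschitz_const=2 * np.pi)
```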

Relevance: 20.00%

Abstract:

Least squares polynomial splines are an effective tool for data fitting, but they may fail to preserve essential properties of the underlying function, such as monotonicity or convexity. The shape restrictions are translated into linear inequality conditions on spline coefficients. The basis functions are selected in such a way that these conditions take a simple form, and the problem becomes a non-negative least squares problem, for which effective and robust methods of solution exist. Multidimensional monotone approximation is achieved by using tensor-product splines with the appropriate restrictions. Additional interpolation conditions can also be introduced. Conversion formulas to the traditional B-spline representation are provided.

Relevance: 20.00%

Abstract:

Q-ball imaging was presented as a model-free, linear and multimodal diffusion-sensitive approach to reconstructing the diffusion orientation distribution function (ODF) from diffusion-weighted MRI data. ODFs are widely used to estimate fiber orientations. A smoothness constraint has been proposed to achieve a balance between angular resolution and noise stability in ODF reconstruction, and different regularization methods have been proposed for this purpose. However, these methods are not robust and are quite sensitive to the global regularization parameter. Although numerical methods such as the L-curve test are used to define a globally appropriate regularization parameter, no single value is suitable for all regions of interest. This may result in over-smoothing and potentially in neglecting an existing fiber population. In this paper, we propose to include an interpolation step prior to the spherical harmonic decomposition. This interpolation step, based on Delaunay triangulation, provides a reliable, robust and accurate smoothing approach. The method is easy to implement and does not require other numerical methods to define the required parameters. In addition, the fiber orientations estimated with this approach are more accurate than those obtained with other common approaches.
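
For contrast with the Delaunay-interpolation approach, the sketch below shows the standard Laplace-Beltrami-regularized spherical harmonic fit that depends on a single global parameter λ; the real spherical harmonic convention and the default λ value are assumptions and may differ from any particular Q-ball implementation.

```python
import numpy as np
from scipy.special import sph_harm

def real_sh_basis(theta, phi, lmax):
    """Real, even-order spherical harmonic basis evaluated at (theta, phi).

    theta : polar angle in [0, pi], phi : azimuth in [0, 2*pi).
    Returns the design matrix B and the harmonic order l of each column.
    """
    cols, orders = [], []
    for l in range(0, lmax + 1, 2):              # even l only: antipodal symmetry
        for m in range(-l, l + 1):
            Y = sph_harm(abs(m), l, phi, theta)  # scipy: sph_harm(m, l, azimuth, polar)
            if m < 0:
                cols.append(np.sqrt(2) * Y.imag)
            elif m == 0:
                cols.append(Y.real)
            else:
                cols.append(np.sqrt(2) * Y.real)
            orders.append(l)
    return np.column_stack(cols), np.array(orders)

def fit_sh(signal, theta, phi, lmax=8, lam=0.006):
    """Least-squares SH coefficients with Laplace-Beltrami regularization."""
    B, l = real_sh_basis(theta, phi, lmax)
    L = np.diag((l * (l + 1)) ** 2)              # penalizes high angular frequencies
    return np.linalg.solve(B.T @ B + lam * L, B.T @ signal)

# usage: `signal` holds diffusion measurements on directions (theta, phi);
# the fitted coefficients then feed the Funk-Radon transform to obtain the ODF.
```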

Relevance: 20.00%

Abstract:

Software reliability growth models (SRGMs) are extensively employed in software engineering to assess the reliability of software before its release for operational use. These models are usually parametric functions obtained by statistically fitting parametric curves, using maximum likelihood estimation or least-squares methods, to plots of the cumulative number of observed failures N(t) against a period of systematic testing time t. Since the 1970s, a very large number of SRGMs have been proposed in the reliability and software engineering literature, and these are often very complex, reflecting the involved testing regimes that often took place during the software development process. In this paper we extend some of our previous work by adopting a nonparametric approach to SRGM modeling based on local polynomial modeling with kernel smoothing. These models require very few assumptions, thereby facilitating the estimation process and also rendering them more relevant under a wide variety of situations. Finally, we provide numerical examples in which these models are evaluated and compared.
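
A minimal local-linear kernel smoother of the cumulative failure counts, in the spirit of the nonparametric approach described above (though not the authors' exact estimator), can be sketched as follows; the bandwidth and data are illustrative.

```python
import numpy as np

def local_linear_fit(t_obs, n_obs, t_eval, bandwidth):
    """Local-linear kernel estimate of the mean cumulative failure count N(t).

    At each evaluation time, fit a weighted straight line with Gaussian kernel
    weights centred on that time and return its fitted value (the intercept).
    """
    estimates = np.empty_like(t_eval, dtype=float)
    for k, t0 in enumerate(t_eval):
        u = (t_obs - t0) / bandwidth
        w = np.exp(-0.5 * u**2)                    # Gaussian kernel weights
        A = np.column_stack([np.ones_like(t_obs), t_obs - t0])
        W = np.diag(w)
        beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ n_obs)
        estimates[k] = beta[0]                     # fitted value at t0
    return estimates

# usage: smooth weekly cumulative failure counts (illustrative data)
weeks = np.arange(1, 31, dtype=float)
failures = np.cumsum(np.random.poisson(3, size=30)).astype(float)
smooth = local_linear_fit(weeks, failures, weeks, bandwidth=3.0)
```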