205 results for SPLINES


Relevance: 10.00%

Abstract:

In the present paper we study the approximation of functions with bounded mixed derivatives by sparse tensor product polynomials in positive order tensor product Sobolev spaces. We introduce a new sparse polynomial approximation operator which exhibits optimal convergence properties in L2 and the tensorized H1 norm simultaneously on a standard k-dimensional cube. In the special case k=2 the suggested approximation operator is also optimal in L2 and tensorized H1 (without essential boundary conditions). This allows us to construct an optimal sparse p-version FEM with sparse piecewise continuous polynomial splines, reducing the number of unknowns from O(p^2), needed for the full tensor product computation, to O(p log p), required for the suggested sparse technique, while preserving the same optimal convergence rate in terms of p. We apply this result to an elliptic differential equation and an elliptic integral equation with random loading and compute the covariances of the solutions with the same reduced number of unknowns. Several numerical examples support the theoretical estimates.
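To make the unknown counts concrete, here is a small Python sketch (not from the paper) comparing the size of the full tensor-product polynomial index set with a sparse, hyperbolic-cross-style index set in two dimensions; the particular sparse index set is an illustrative assumption, chosen only to show the O(p^2) versus O(p log p) growth.

```python
# Compare the number of unknowns in full vs. sparse tensor-product
# polynomial spaces on the unit square (illustrative count only).

def full_tensor_count(p):
    # all pairs (i, j) with 0 <= i, j <= p  ->  O(p^2) unknowns
    return (p + 1) ** 2

def sparse_tensor_count(p):
    # hyperbolic-cross-style set: (i, j) with (i + 1) * (j + 1) <= p + 1,
    # which grows like O(p log p)
    return sum(1 for i in range(p + 1) for j in range(p + 1)
               if (i + 1) * (j + 1) <= p + 1)

for p in (4, 8, 16, 32, 64):
    print(p, full_tensor_count(p), sparse_tensor_count(p))
```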

Relevance: 10.00%

Abstract:

The Shuttle Radar Topography Mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (~30 m) for areas within the USA and at 3 arc-second (~90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface with coherence of angular properties (i.e. slope, aspect) between neighbouring pixels, which is an important characteristic when dealing with terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in western USA, which include different adjustment schemes (changes in the nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the national elevation dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighted (IDW)). The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, due to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram at short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid point artifacts generated by predicted points with the same location as original data points; and (iii) using a small value of the nugget effect, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines show good agreement with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and by splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
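As a rough illustration of the kriging setup described above (small nugget, jittered coordinates), the sketch below resamples synthetic elevation samples to a finer grid with the third-party pykrige package; the variogram model, its parameter values, and the use of pykrige itself are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # assumed dependency (pip install pykrige)

rng = np.random.default_rng(0)

# Synthetic coarse-grid elevation samples (stand-in for 3-arcsec SRTM data).
x, y = np.meshgrid(np.arange(0, 30, 3.0), np.arange(0, 30, 3.0))
z = 100 + 5 * np.sin(x / 10) + 3 * np.cos(y / 8) + rng.normal(0, 0.2, x.shape)
x, y, z = x.ravel(), y.ravel(), z.ravel()

# (ii) tiny random jitter of the coordinates, to avoid artifacts at predicted
# points that coincide exactly with data points.
x = x + rng.uniform(-1e-3, 1e-3, x.size)
y = y + rng.uniform(-1e-3, 1e-3, y.size)

# (iii) small nugget to avoid smoothing out terrain features;
# variogram parameters here are purely illustrative.
ok = OrdinaryKriging(
    x, y, z,
    variogram_model="spherical",
    variogram_parameters={"sill": 10.0, "range": 15.0, "nugget": 0.01},
)

# Resample to a finer grid (stand-in for the 1-arcsec target resolution).
gridx = np.arange(0, 30, 1.0)
gridy = np.arange(0, 30, 1.0)
z_fine, ss = ok.execute("grid", gridx, gridy)
print(z_fine.shape)
```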

Relevance: 10.00%

Abstract:

In this paper, a novel statistical test is introduced to compare two locally stationary time series. The proposed approach is a Wald test considering time-varying autoregressive modeling and function projections in adequate spaces. The covariance structure of the innovations may also be time-varying. In order to obtain function estimators for the time-varying autoregressive parameters, we consider function expansions in spline and wavelet bases. Simulation studies provide evidence that the proposed test has good performance. We also assess its usefulness when applied to a financial time series.
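A minimal sketch of the spline-expansion idea (not the authors' code): estimate a time-varying AR(1) coefficient by expanding it in a cubic B-spline basis and solving the resulting least-squares problem with NumPy/SciPy. The Wald statistic, the time-varying innovation covariance, and the wavelet alternative are omitted; BSpline.design_matrix assumes SciPy 1.8 or later.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(1)

# Simulate a locally stationary AR(1): x_t = a(t/T) x_{t-1} + e_t
T = 2000
u = np.arange(T) / T
a_true = 0.5 + 0.3 * np.sin(2 * np.pi * u)          # smooth time-varying coefficient
x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true[t] * x[t - 1] + rng.normal()

# Cubic B-spline design matrix over rescaled time u in [0, 1]
k = 3
interior = np.linspace(0, 1, 8)[1:-1]
knots = np.r_[[0.0] * (k + 1), interior, [1.0] * (k + 1)]
B = BSpline.design_matrix(u[1:], knots, k).toarray()   # requires SciPy >= 1.8

# Regress x_t on B(u_t) * x_{t-1}: least-squares estimate of the spline coefficients
Z = B * x[:-1, None]
coef, *_ = np.linalg.lstsq(Z, x[1:], rcond=None)
a_hat = B @ coef                                        # estimated a(u_t)
print(np.round(np.corrcoef(a_hat, a_true[1:])[0, 1], 3))
```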

Relevance: 10.00%

Abstract:

In this paper we extend partial linear models with normal errors to Student-t errors. Penalized likelihood equations are applied to derive the maximum likelihood estimates, which appear to be robust against outlying observations in the sense of the Mahalanobis distance. In order to study the sensitivity of the penalized estimates under some usual perturbation schemes in the model or data, the local influence curvatures are derived and some diagnostic graphics are proposed. A motivating example, preliminarily analyzed under normal errors, is reanalyzed under Student-t errors. The local influence approach is used to compare the sensitivity of the model estimates. (C) 2010 Elsevier B.V. All rights reserved.
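The following sketch fits a partial linear model with Student-t errors by direct numerical minimization of a penalized negative log-likelihood using SciPy; it is a simplified stand-in with illustrative data, degrees of freedom, and penalty, and it does not reproduce the paper's penalized likelihood equations or the local influence diagnostics.

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize
from scipy.stats import t as student_t

rng = np.random.default_rng(2)

# Partial linear model: y = X beta + f(s) + error, with Student-t errors
n = 300
X = rng.normal(size=(n, 2))
s = np.sort(rng.uniform(0, 1, n))
f_true = np.sin(2 * np.pi * s)
y = X @ np.array([1.0, -0.5]) + f_true + student_t.rvs(df=4, scale=0.3,
                                                       size=n, random_state=3)

# Cubic B-spline basis for the nonparametric part
k = 3
knots = np.r_[[0.0] * (k + 1), np.linspace(0, 1, 10)[1:-1], [1.0] * (k + 1)]
B = BSpline.design_matrix(s, knots, k).toarray()
D = np.diff(np.eye(B.shape[1]), n=2, axis=0)      # second-difference penalty

def neg_pen_loglik(params, nu=4.0, lam=1.0):
    beta, gamma = params[:2], params[2:]
    resid = y - X @ beta - B @ gamma
    # Student-t log-likelihood (unit scale for simplicity) plus roughness penalty
    return -student_t.logpdf(resid, df=nu).sum() + lam * np.sum((D @ gamma) ** 2)

res = minimize(neg_pen_loglik, np.zeros(2 + B.shape[1]), method="BFGS")
print(np.round(res.x[:2], 3))   # estimates of beta; t errors downweight outliers
```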

Relevance: 10.00%

Abstract:

Fuzzy logic provides a mathematical formalism for a unified treatment of the vagueness and imprecision that are ever present in decision support and expert systems in many areas. The choice of aggregation operators is crucial to the behavior of the system that is intended to mimic human decision making. This paper discusses how aggregation operators can be selected and adjusted to fit empirical data, in the form of a series of test cases. Both parametric and nonparametric regression are considered and compared. A practical application of the proposed methods to the electronic implementation of clinical guidelines is presented.
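As a sketch of the parametric route, the example below fits the weight and exponent of a two-argument weighted power mean, one common family of aggregation operators, to a set of synthetic test cases by least squares; the operator family, the data, and the use of scipy.optimize.curve_fit are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Weighted power mean of two inputs: a simple parametric aggregation operator.
def power_mean(xy, w, p):
    x, y = xy
    return (w * x**p + (1 - w) * y**p) ** (1.0 / p)

# Empirical "test cases": pairs of partial scores and the aggregated value
# an expert assigned to them (synthetic here).
rng = np.random.default_rng(4)
x = rng.uniform(0.1, 1.0, 50)
y = rng.uniform(0.1, 1.0, 50)
target = power_mean((x, y), 0.7, 0.5) + rng.normal(0, 0.02, 50)

# Fit the operator parameters to the empirical data.
(w_hat, p_hat), _ = curve_fit(power_mean, (x, y), target, p0=[0.5, 1.0],
                              bounds=([0.0, 0.1], [1.0, 5.0]))
print(round(w_hat, 2), round(p_hat, 2))
```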

Relevance: 10.00%

Abstract:

Approximation order is an important feature of all wavelets. It implies that polynomials up to degree p-1 lie in the space spanned by the scaling function(s). In the scalar case, the scalar sum rules determine the approximation order; equivalently, the left eigenvectors of the infinite down-sampled convolution matrix H determine the combinations of scaling functions required to produce the desired polynomial. For multi-wavelets the condition for approximation order is similar to the conditions in the scalar case. Generalized left eigenvectors of the matrix Hf, a finite portion of H, determine the combinations of scaling functions that produce the desired superfunction from which polynomials of the desired degree can be reproduced. The superfunctions in this work are taken to be B-splines; however, any refinable function can serve as the superfunction. The condition for approximation order is derived, and new symmetric, compactly supported and orthogonal multi-wavelets with approximation orders one, two, three and four are constructed.
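In the scalar case the sum rules are easy to verify numerically. The sketch below checks that the Daubechies D4 scaling filter satisfies the sum rules of order two, i.e. that the alternating discrete moments vanish for m = 0 and m = 1; this is a scalar illustration only, not the multiwavelet construction of the paper.

```python
import numpy as np

# Daubechies D4 scaling filter coefficients (approximation order 2).
s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))

# Scalar sum rules for approximation order p:
#   sum_k (-1)^k k^m h_k = 0   for m = 0, 1, ..., p-1
k = np.arange(len(h))
for m in range(2):
    moment = np.sum((-1.0) ** k * k ** m * h)
    print(f"m = {m}: sum (-1)^k k^m h_k = {moment:.2e}")   # both vanish (up to roundoff)
```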

Relevance: 10.00%

Abstract:

Detection of the lane boundaries of a road from images or video taken by a video capturing device in a suburban environment is a challenging task. In this paper, a novel lane detection algorithm is proposed that does not require camera parameters and robustly detects lane boundaries in real time, especially on suburban roads. Initially, the proposed method fits the CIE L*a*b* transformed road chromaticity values (that is, the a* and b* values) to a bivariate Gaussian model, followed by classification of the road area based on the Mahalanobis distance. Secondly, the classified road area acts as an arbitrarily shaped region of interest (AROI) for extracting blobs from the image filtered by a two-dimensional Gabor filter; this forms the first image cue. Thirdly, a second image cue is employed to obtain an entropy image. Results from the color-based cue and the entropy cue are then integrated, followed by an outlier removal process. Finally, the detected lane points are used as control points to fit Bezier splines, which can form arbitrary shapes. The algorithm was implemented and experiments were carried out on suburban roads. The results show the effectiveness of the algorithm in producing accurate lane boundaries on curves and in the presence of other objects on the road.
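A minimal sketch of the first cue (illustrative only): convert a synthetic frame to CIE L*a*b* with scikit-image, fit a bivariate Gaussian to the a*b* values of a seed patch assumed to be road, and classify pixels by Mahalanobis distance; the seed region, threshold, and test image are assumptions.

```python
import numpy as np
from skimage.color import rgb2lab  # assumed dependency (scikit-image)

# Synthetic RGB frame: green "grass" upper half, grey "road" lower half.
img = np.zeros((120, 160, 3))
img[:60] = [0.2, 0.6, 0.2]
img[60:] = [0.5, 0.5, 0.5]
img += np.random.default_rng(5).normal(0, 0.02, img.shape)

lab = rgb2lab(np.clip(img, 0, 1))
ab = lab[..., 1:]                       # chromaticity channels a*, b*

# Fit a bivariate Gaussian to a seed patch assumed to be road
# (e.g. the region directly in front of the vehicle).
seed = ab[100:120, 60:100].reshape(-1, 2)
mu = seed.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(seed, rowvar=False))

# Mahalanobis distance of every pixel to the road model, then threshold.
diff = ab - mu
d2 = np.einsum("...i,ij,...j->...", diff, cov_inv, diff)
road_mask = d2 < 9.0                    # threshold chosen for illustration
print(road_mask[110, 80], road_mask[10, 80])   # a road pixel vs. a grass pixel
```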

Relevance: 10.00%

Abstract:

One approach to the detection of curves at subpixel accuracy involves the reconstruction of such features from subpixel edge data points. A new technique is presented for reconstructing and segmenting curves with subpixel accuracy using deformable models. A curve is represented as a set of interconnected Hermite splines forming a snake generated from the subpixel edge information that minimizes the global energy functional integral over the set. While previous work on the minimization was mostly based on the Euler-Lagrange transformation, the authors use the finite element method to solve the energy minimization equation. The advantages of this approach over the Euler-Lagrange transformation approach are that the method is straightforward, leads to positive m-diagonal symmetric matrices, and has the ability to cope with irregular geometries such as junctions and corners. The energy functional integral solved using this method can also be used to segment the features by searching for the location of the maxima of the first derivative of the energy over the elementary curve set.
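The paper solves the energy minimization with interconnected Hermite splines and the finite element method; the sketch below instead uses the standard finite-difference snake formulation, only to illustrate the banded symmetric system and the semi-implicit update involved in this kind of deformable-model fitting.

```python
import numpy as np

def snake_step(pts, force, alpha=0.1, beta=0.5, gamma=1.0):
    """One semi-implicit update of a closed snake (finite-difference form).

    pts   : (n, 2) current contour points
    force : (n, 2) external force sampled at the points
    The internal energy yields a symmetric, banded (circulant) matrix A.
    """
    n = len(pts)
    eye = np.eye(n)
    d2 = -2 * eye + np.roll(eye, 1, axis=0) + np.roll(eye, -1, axis=0)  # 2nd difference
    A = -alpha * d2 + beta * (d2 @ d2)           # membrane + thin-plate terms
    return np.linalg.solve(A + gamma * eye, gamma * pts + force)

# Toy usage: a circle shrinking under a constant inward force.
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
pts = np.c_[np.cos(theta), np.sin(theta)]
for _ in range(50):
    pts = snake_step(pts, -0.05 * pts)           # stand-in for an image-derived force
print(round(float(np.linalg.norm(pts, axis=1).mean()), 3))   # mean radius has decreased
```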

Relevance: 10.00%

Abstract:

Emergency department access block is an urgent problem faced by many public hospitals today. When access block occurs, patients in need of acute care cannot access inpatient wards within an optimal time frame. A widely held belief is that access block is the end product of a long causal chain, which involves poor discharge planning, insufficient bed capacity, and inadequate admission intensity to the wards. This paper studies the last link of the causal chain: the effect of admission intensity on access block, using data from a metropolitan hospital in Australia. We applied several modern statistical methods to analyze the data. First, we modeled the admission events as a nonhomogeneous Poisson process and estimated the time-varying admission intensity with penalized regression splines. Next, we established a functional linear model to investigate the effect of the time-varying admission intensity on emergency department access block. Finally, we used functional principal component analysis to explore the variation in the daily time-varying admission intensities. The analyses suggest that improving admission practice during off-peak hours may have the most impact on reducing the number of emergency department access blocks.
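A simplified sketch of the first step (not the authors' code): simulate admission times from a nonhomogeneous Poisson process and estimate a time-varying intensity by smoothing hourly counts with a smoothing spline from SciPy, which stands in for the penalized regression splines used in the paper.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(6)

# Simulate admission times over one day from a nonhomogeneous Poisson process
# with a peaked true intensity (admissions/hour), via thinning.
def lam(t):                       # t in hours
    return 2 + 6 * np.exp(-((t - 14) ** 2) / 18.0)

lam_max = 8.5
cand = np.cumsum(rng.exponential(1 / lam_max, 2000))
cand = cand[cand < 24]
times = cand[rng.uniform(0, lam_max, cand.size) < lam(cand)]

# Hourly counts, then a smoothing spline as a simple intensity estimate.
counts, edges = np.histogram(times, bins=24, range=(0, 24))
mid = 0.5 * (edges[:-1] + edges[1:])
intensity = UnivariateSpline(mid, counts, s=2 * counts.sum() ** 0.5)

print(np.round(intensity(14), 1), np.round(intensity(4), 1))  # peak vs. off-peak hour
```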

Relevance: 10.00%

Abstract:

Monotonicity preserving interpolation and approximation have received substantial attention in the last thirty years because of their numerous applications in computer-aided design, statistics, and machine learning [9, 10, 19]. Constrained splines are particularly popular because of their flexibility in modeling different geometrical shapes, sound theoretical properties, and availability of numerically stable algorithms [9,10,26]. In this work we examine parallelization and adaptation for GPUs of a few algorithms of monotone spline interpolation and data smoothing, which arose in the context of estimating probability distributions.
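The GPU parallelization is the contribution of the work; as a plain CPU illustration of monotonicity-preserving interpolation, the sketch below compares SciPy's PCHIP interpolator (monotone cubic Hermite) with an unconstrained cubic spline on monotone data.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator, CubicSpline

# Monotone data, e.g. an empirical cumulative distribution function.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 0.02, 0.05, 0.60, 0.95, 1.00])

pchip = PchipInterpolator(x, y)     # monotonicity-preserving
cubic = CubicSpline(x, y)           # unconstrained, may overshoot

xx = np.linspace(0, 5, 501)
print("pchip monotone:", bool(np.all(np.diff(pchip(xx)) >= 0)))
print("cubic monotone:", bool(np.all(np.diff(cubic(xx)) >= 0)))
```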

Relevance: 10.00%

Abstract:

Objective. We have developed an image analysis methodology for quantifying the anisotropy of neuronal projections on patterned substrates. Approach. Our method is based on the fitting of smoothing splines to the digital traces produced using a non-maximum suppression technique. This enables precise estimates of the local tangents uniformly along the neurite length, and leads to unbiased orientation distributions suitable for objectively assessing the anisotropy induced by tailored surfaces. Main results. In our application, we demonstrate that carbon nanotubes arrayed in parallel bundles over gold surfaces induce a considerable neurite anisotropy; a result which is relevant for regenerative medicine. Significance. Our pipeline is generally applicable to the study of fibrous materials on 2D surfaces and should also find applications in the study of DNA, microtubules, and other polymeric materials.
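A minimal sketch of the tangent-estimation step (illustrative, not the published pipeline): fit a smoothing spline to a synthetic 2D trace with scipy.interpolate.splprep and read off local tangent angles from the first derivative; the non-maximum-suppression tracing step is not reproduced.

```python
import numpy as np
from scipy.interpolate import splprep, splev

rng = np.random.default_rng(7)

# A noisy digital trace of a gently curving neurite (synthetic).
t = np.linspace(0, 1, 200)
x = 100 * t + rng.normal(0, 0.5, t.size)
y = 10 * np.sin(2 * np.pi * t) + rng.normal(0, 0.5, t.size)

# Smoothing spline fit to the trace; s controls the amount of smoothing.
tck, u = splprep([x, y], s=len(t) * 0.5)

# Local tangent angles from the first derivative, sampled uniformly along the curve.
dx, dy = splev(np.linspace(0, 1, 500), tck, der=1)
angles = np.degrees(np.arctan2(dy, dx))

# Orientation distribution, e.g. to quantify anisotropy relative to a pattern axis.
hist, edges = np.histogram(angles, bins=36, range=(-90, 90))
peak = hist.argmax()
print(round(float(0.5 * (edges[peak] + edges[peak + 1])), 1))   # dominant orientation (deg)
```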

Relevance: 10.00%

Abstract:

This paper presents subdivision-based vector graphics for image representation and creation. The graphics representation is a subdivision surface defined by a triangular mesh augmented with color attributes at vertices and feature attributes at edges. Special cubic B-splines are proposed to describe the curvilinear features of an image. New subdivision rules are then designed accordingly, which are applied to the mesh and the color attributes to define the spatial distribution and piecewise-smoothly varying colors of the image. A sharpness factor is introduced to control the color transition across the curvilinear edges. In addition, an automatic algorithm is developed to convert a raster image into such a vector graphics representation. The algorithm first detects the curvilinear features of the image, then constructs a triangulation based on the curvilinear edges and feature attributes, and finally iteratively optimizes the vertex color attributes and updates the triangulation. Compared with existing vector-based image representations, the proposed representation and algorithm have the following advantages in addition to the common merits (such as editability and scalability): 1) they allow flexible mesh topology and handle images or objects with complicated boundaries or features effectively; 2) they are able to faithfully reconstruct curvilinear features, especially in modeling subtle shading effects around feature curves; and 3) they offer a simple way for the user to create images in a freehand style. The effectiveness of the proposed method has been demonstrated in experiments.
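The subdivision rules and attribute handling are beyond a short example; the sketch below only illustrates representing a curvilinear feature as a clamped cubic B-spline over a few control points, using SciPy. The control points and the clamped knot vector are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import BSpline

# Control points of a curvilinear feature (e.g. a silhouette edge), chosen arbitrarily.
ctrl = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 2.5], [5.0, 1.0], [6.0, 3.0]])

# Clamped cubic B-spline: repeated end knots make the curve interpolate the endpoints.
k = 3
n = len(ctrl)
knots = np.r_[[0.0] * (k + 1), np.linspace(0, 1, n - k + 1)[1:-1], [1.0] * (k + 1)]
curve = BSpline(knots, ctrl, k)

pts = curve(np.linspace(0, 1, 100))        # dense sampling of the feature curve
print(pts[0], pts[-1])                      # endpoints coincide with the end control points
```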

Relevance: 10.00%

Abstract:

Purpose We analyzed the changes in the body mass index (BMI) distribution for urban Australian adults between 1980 and 2007.

Methods We used data from participants of six consecutive Australian nation-wide surveys with measured weight and height between 1980 and 2007. We used quantile regression to estimate mean BMI (for percentiles of BMI) and prevalence of severe obesity, modeled by natural splines in age, date of birth, and survey date.

Results Since 1980, the right skew in the BMI distribution for Australian adults has increased greatly for men and women, driven by increases in skew associated with age and birth cohort/period. Between 1980 and 2007, the average 5-year increase in the 95th percentile of BMI was 1 kg/m2 in women and 0.8 kg/m2 in men. The increase in the median was about a third of this, and for the 10th percentile, a fifth of this. We estimated that for the cohort born in 1960 around 31% of men and women were obese by age 50 years, compared with 11% of the 1930 birth cohort.

Conclusions There have been large increases in the right skew of the BMI distribution for urban Australian adults between 1980 and 2007, and birth cohort effects suggest that similar increases are likely to continue.
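A small sketch of the quantile-regression-with-splines setup described in the Methods above, on synthetic data: statsmodels quantile regression for the median and the 95th percentile of BMI with a spline basis in age. The data are synthetic, and a patsy B-spline basis (bs) is used as a stand-in for the natural splines of the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)

# Synthetic survey-style data: BMI depending on age, with a right-skewed error.
n = 2000
age = rng.uniform(20, 75, n)
bmi = 22 + 0.06 * age + rng.gamma(shape=2.0, scale=1.5, size=n)
df = pd.DataFrame({"bmi": bmi, "age": age})

# Quantile regression for the median and the 95th percentile of BMI,
# with a spline basis in age.
median_fit = smf.quantreg("bmi ~ bs(age, df=4)", df).fit(q=0.50)
p95_fit = smf.quantreg("bmi ~ bs(age, df=4)", df).fit(q=0.95)

pred = pd.DataFrame({"age": [30, 50, 70]})
print(median_fit.predict(pred).round(1).tolist())
print(p95_fit.predict(pred).round(1).tolist())
```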

Relevance: 10.00%

Abstract:

This work presents a methodology for the numerical optimization of preforms and dies in axisymmetric and plane-strain forging problems. To this end, a computational code was developed, composed essentially of three modules: a pre-processing module, an analysis module, and an optimization module. Each of these was built by adding routines to commercial or academic programs available at GMAp and CEMACOM. A manager program was developed to control these modules during the optimization process. The proposed approach introduces a new objective function to be minimized, based on a Boolean XOR (exclusive or) operation over the two plane polygons that represent, respectively, the desired geometry of the component and the geometry obtained in the simulation. This approach aims to eliminate possible geometric problems associated with the objective functions commonly used in related research. The work employs numerical sensitivity analysis via the finite difference method. The difficulties associated with this technique are studied, and two points are identified as limiting the approach for metal forming problems (large elastoplastic deformations with frictional contact): low efficiency and contamination of the gradients in the presence of remeshing. A new finite difference procedure is developed which eliminates these difficulties, making it applicable to arbitrary problems with characteristics competitive with those of the analytical approach. Unstructured meshes are handled by means of Laplacian smoothing, preserving their topologies. In the case of preform optimization, the contour of the component to be optimized is parameterized by B-splines whose control points are adopted as design variables. In the case of die optimization, on the other hand, the parameterization is carried out in terms of straight-line segments and circular arcs; the design variables adopted are then the coordinates of the segment endpoints, the radii and centres of the arcs, etc. The methodology is completed by the application of the mathematical programming algorithms of Krister Svanberg (Globally Convergent Method of Moving Asymptotes) and Klaus Schittkowski (Sequential Quadratic Programming, NLPQLP). Numerical results are presented showing the evolution of the adopted implementations and the efficiency gain obtained.
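A minimal sketch of the XOR-based objective function: the area of the Boolean symmetric difference between the target polygon and the simulated polygon, computed here with the shapely package (an assumed dependency); the forging simulation, the B-spline parameterization of the preform contour, and the MMA/SQP optimizers are not reproduced.

```python
from shapely.geometry import Polygon  # assumed dependency (pip install shapely)

def xor_objective(target_pts, simulated_pts):
    """Area of the Boolean XOR (symmetric difference) between two plane polygons.

    Zero means the simulated component fills the desired geometry exactly;
    any excess or missing material increases the objective.
    """
    target = Polygon(target_pts)
    simulated = Polygon(simulated_pts)
    return target.symmetric_difference(simulated).area

# Toy usage: desired cross-section vs. a slightly under-filled simulated one.
target = [(0, 0), (4, 0), (4, 2), (0, 2)]
simulated = [(0, 0), (4, 0), (4, 1.8), (0.2, 2)]
print(round(xor_objective(target, simulated), 3))
```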