15 results for SMOOTHING SPLINE

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

60.00%

Publisher:

Abstract:

Purpose: To quantitatively evaluate changes induced by the application of a femoral blood-pressure cuff (BPC) on run-off magnetic resonance angiography (MRA), a method previously proposed to reduce venous contamination in the leg. Materials and Methods: This study was Health Insurance Portability and Accountability Act (HIPAA)- and Institutional Review Board (IRB)-compliant. We used time-resolved gradient-echo gadolinium (Gd)-enhanced MRA to measure BPC effects on arterial, venous, and soft-tissue enhancement. Seven healthy volunteers (six men) were studied with the BPC applied unilaterally at the mid-femoral level, using a 1.5T MR system after intravenous injection of Gd-BOPTA. Statistical analysis included the Wilcoxon signed rank test and a cubic smoothing spline fit. Results: We found that BPC application induces delayed venous filling (as previously described), but also significant decreases in arterial inflow, arterial enhancement, and vascular-soft tissue contrast, as well as delayed peak enhancement (effects that had not been previously measured). Conclusion: The potential benefits of using a BPC for run-off MRA must be balanced against the potential pitfalls elucidated by our findings.
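A minimal sketch of the two statistical tools named in this abstract, run on synthetic data (all variable names and numbers are hypothetical, not the study's measurements): a Wilcoxon signed rank test for paired cuffed/uncuffed measurements and a cubic smoothing spline fit to an enhancement-versus-time curve.

```python
import numpy as np
from scipy.stats import wilcoxon
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Hypothetical paired arterial-enhancement measurements (arbitrary units).
enh_no_cuff = rng.normal(100, 10, size=7)
enh_with_cuff = enh_no_cuff - rng.normal(8, 3, size=7)
stat, p_value = wilcoxon(enh_no_cuff, enh_with_cuff)
print(f"Wilcoxon signed rank: statistic={stat:.1f}, p={p_value:.3f}")

# Cubic smoothing spline (k=3) through a noisy time-enhancement curve;
# the smoothing factor s trades fidelity against smoothness.
t = np.linspace(0, 60, 40)                      # seconds after injection
signal = 100 * np.exp(-((t - 25) / 12) ** 2) + rng.normal(0, 3, t.size)
spline = UnivariateSpline(t, signal, k=3, s=len(t) * 9.0)
t_peak = t[np.argmax(spline(t))]
print(f"Estimated peak enhancement at t = {t_peak:.1f} s")
```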

Relevance:

20.00%

Publisher:

Abstract:

The shuttle radar topography mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (about 30 m) for areas within the USA and at 3 arc-second (about 90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will produce a surface in which angular properties (i.e. slope, aspect) are coherent between neighbouring pixels, an important characteristic when dealing with terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in the western USA, including different adjustment schemes (changes in the nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the national elevation dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighted (IDW)). The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, due to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram at short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid punctual artifacts generated by predicted points with the same location as original data points; and (iii) using a small nugget effect value, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines agree well with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and by splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
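A hedged sketch of the three resampling guidelines listed above, using the pykrige package (an assumption; the paper does not prescribe an implementation) and synthetic stand-in elevation samples rather than real SRTM data.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(42)
x = rng.uniform(0, 900, 500)        # metres, stand-in for 3" sample locations
y = rng.uniform(0, 900, 500)
z = 50 * np.sin(x / 300) + 30 * np.cos(y / 250) + rng.normal(0, 0.5, 500)

# (ii) jitter coordinates slightly so no prediction point coincides exactly
# with a data point, avoiding punctual artifacts.
x_j = x + rng.normal(0, 0.01, x.size)
y_j = y + rng.normal(0, 0.01, y.size)

ok = OrdinaryKriging(
    x_j, y_j, z,
    variogram_model="spherical",
    # (iii) a small but non-zero nugget, so terrain features are not smoothed away
    variogram_parameters={"sill": 900.0, "range": 400.0, "nugget": 1.0},
)

# Predict on a finer (1"-like) grid; a moving window of nearby points
# approximates guideline (i) of working only with the immediate neighbourhood.
gridx = np.arange(0, 900, 30.0)
gridy = np.arange(0, 900, 30.0)
z_fine, var = ok.execute("grid", gridx, gridy, backend="loop", n_closest_points=16)
```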

Relevance:

10.00%

Publisher:

Abstract:

This article discusses a combined forecasting model for producing seasonal climate predictions. In it, point forecasts from stochastic models are aggregated to obtain the best projections over time. Autoregressive integrated moving average models, exponential smoothing models, and forecasts based on canonical correlation analysis are used. Forecast quality control is performed through residual analysis and by evaluating the percentage reduction in the unexplained variance of the combined model relative to the forecasts of the individual models. Examples of the application of these concepts in models developed at the Instituto Nacional de Meteorologia (INMET) show good results and illustrate that, in most cases, the combined model's forecasts outperform those of each component model when compared against observed data.
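A minimal sketch of forecast combination in the spirit described above: point forecasts from an ARIMA model and an exponential smoothing model are pooled with weights inversely proportional to their in-sample residual variance. This is only an illustration on a synthetic series, not INMET's operational model, and the canonical-correlation component is omitted.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
# Hypothetical monthly climate anomaly series with an annual cycle.
n = 120
t = np.arange(n)
series = pd.Series(10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, n))

arima_fit = ARIMA(series, order=(1, 0, 1)).fit()
es_fit = ExponentialSmoothing(series, trend="add", seasonal="add",
                              seasonal_periods=12).fit()

# Inverse-variance weights from in-sample residuals.
w = np.array([1 / np.var(arima_fit.resid), 1 / np.var(es_fit.resid)])
w = w / w.sum()

horizon = 6
combined = w[0] * arima_fit.forecast(horizon).to_numpy() \
         + w[1] * es_fit.forecast(horizon).to_numpy()
print("Combined 6-step forecast:", np.round(combined, 2))
```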

Relevance:

10.00%

Publisher:

Abstract:

Background: Patients with chronic obstructive pulmonary disease (COPD) can have recurrent disease exacerbations triggered by several factors, including air pollution. Visits to the respiratory emergency department can be a direct result of short-term exposure to air pollution. The aim of this study was to investigate the relationship between the daily number of COPD emergency department visits and the daily ambient air concentrations of PM10, SO2, NO2, CO and O3 in the City of Sao Paulo, Brazil. Methods: The sample data were collected between 2001 and 2003 and are categorised by gender and age. Generalised linear Poisson regression models were adopted to control for both short- and long-term seasonal changes as well as for temperature and relative humidity. Non-linear dependencies were controlled using a natural cubic spline function. Third-degree polynomial distributed lag models were adopted to estimate both the lag structures and the cumulative effects of the air pollutants. Results: PM10 and SO2 readings showed both acute and lagged effects on COPD emergency department visits. Interquartile range increases in their concentrations (28.3 µg/m³ and 7.8 µg/m³, respectively) were associated with cumulative 6-day increases of 19% and 16% in COPD admissions, respectively. An effect on women was observed at lag 0, and among the elderly the lag period was noted to be longer. Increases in CO concentration showed impacts in the female and elderly groups. NO2 and O3 presented mild effects on the elderly and on women, respectively. Conclusion: These results indicate that air pollution affects health in a gender- and age-specific manner and should be considered a relevant risk factor that exacerbates COPD in urban environments.
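A hedged sketch of the modelling strategy described above: a Poisson GLM for daily visit counts with a natural cubic spline for the seasonal trend and a linear term for lagged PM10, adjusted for temperature and humidity. The data frame, column names, and coefficients are all hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 1000
df = pd.DataFrame({
    "day": np.arange(n),
    "pm10_lag1": rng.gamma(4, 10, n),          # µg/m³, lagged one day
    "temp": 20 + 5 * np.sin(2 * np.pi * np.arange(n) / 365),
    "humidity": rng.uniform(40, 90, n),
})
rate = np.exp(1.5 + 0.004 * df["pm10_lag1"] + 0.01 * np.sin(df["day"] / 50))
df["copd_visits"] = rng.poisson(rate)

# cr(day, df=8) is patsy's natural cubic regression spline, controlling the
# short- and long-term seasonal trend within the Poisson GLM.
model = smf.glm(
    "copd_visits ~ cr(day, df=8) + pm10_lag1 + temp + humidity",
    data=df,
    family=sm.families.Poisson(),
).fit()

# Percent increase in visits per interquartile-range increase in lagged PM10.
iqr = df["pm10_lag1"].quantile(0.75) - df["pm10_lag1"].quantile(0.25)
print(f"{100 * (np.exp(model.params['pm10_lag1'] * iqr) - 1):.1f}% per IQR")
```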

Relevance:

10.00%

Publisher:

Abstract:

Today several different unsupervised classification algorithms are commonly used to cluster similar patterns in a data set based only on its statistical properties. Especially in image data applications, self-organizing methods for unsupervised classification have been successfully applied to cluster pixels or groups of pixels in order to perform segmentation tasks. The first important contribution of this paper is the development of a self-organizing method for data classification, named the Enhanced Independent Component Analysis Mixture Model (EICAMM), built by modifying the Independent Component Analysis Mixture Model (ICAMM) to address some of its limitations and make it more efficient. Moreover, a pre-processing methodology is also proposed, based on combining Sparse Code Shrinkage (SCS) for image denoising with the Sobel edge detector. In the experiments of this work, EICAMM and other self-organizing models were applied to segment images in their original and pre-processed versions. A comparative analysis showed satisfactory and competitive image segmentation results for the proposals presented herein. (C) 2008 Published by Elsevier B.V.
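The pre-processing described above combines SCS denoising with a Sobel edge detector; only the Sobel step is sketched below, using scipy.ndimage on a synthetic image. The SCS and EICAMM stages are not reproduced here.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
image = np.zeros((64, 64))
image[16:48, 16:48] = 1.0                       # a bright square on a dark field
image += rng.normal(0, 0.05, image.shape)       # mild noise

# Gradient magnitude from horizontal and vertical Sobel responses.
gx = ndimage.sobel(image, axis=1)
gy = ndimage.sobel(image, axis=0)
edges = np.hypot(gx, gy)
print("Strongest edge response:", float(edges.max()))
```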

Relevance:

10.00%

Publisher:

Abstract:

Simulated annealing (SA) is an optimization technique that can handle cost functions with varying degrees of nonlinearity, discontinuity and stochasticity, as well as arbitrary boundary conditions and constraints imposed on these cost functions. Here the SA technique is applied to the problem of robot path planning. Three situations are considered: the path is represented as a polyline, as a Bezier curve, or as a spline-interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution when defining the next candidate solution. (C) 2010 Elsevier Ltd. All rights reserved.
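A minimal simulated-annealing sketch for the spline-interpolated case: the path is a cubic spline through a few movable waypoints, and the cost mixes path length with a penalty for entering a circular obstacle. The paper's per-parameter sensitivity adaptation is not reproduced; this is a plain SA loop with a geometric cooling schedule and hypothetical scene parameters.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)
start, goal = np.array([0.0, 0.0]), np.array([10.0, 0.0])
obstacle, radius = np.array([5.0, 0.0]), 1.5

def cost(waypoints_y):
    xs = np.linspace(start[0], goal[0], len(waypoints_y) + 2)
    ys = np.concatenate(([start[1]], waypoints_y, [goal[1]]))
    spline = CubicSpline(xs, ys)
    t = np.linspace(start[0], goal[0], 200)
    pts = np.column_stack([t, spline(t)])
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    # Penalise sample points that fall inside the obstacle.
    dist = np.linalg.norm(pts - obstacle, axis=1)
    penalty = np.sum(np.maximum(radius - dist, 0.0))
    return length + 50.0 * penalty

state = np.zeros(3)                       # y-coordinates of 3 interior waypoints
best, best_cost = state.copy(), cost(state)
temperature = 5.0
for _ in range(2000):
    candidate = state + rng.normal(0, 0.3, state.shape)
    delta = cost(candidate) - cost(state)
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        state = candidate
        if cost(state) < best_cost:
            best, best_cost = state.copy(), cost(state)
    temperature *= 0.999                  # geometric cooling schedule
print("Best waypoint heights:", np.round(best, 2), "cost:", round(best_cost, 2))
```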

Relevance:

10.00%

Publisher:

Abstract:

The practicability of estimating directional wave spectra based on a vessel's first-order response has been recently addressed by several researchers. Different alternatives regarding statistical inference methods and possible drawbacks that could arise from their application have been extensively discussed, with an apparent preference for estimations based on Bayesian inference algorithms. Most of the results on this matter, however, rely exclusively on numerical simulations or at best on few and sparse full-scale measurements, comprising a questionable basis for validation purposes. This paper discusses several issues that have recently been debated regarding the advantages of Bayesian inference and different alternatives for its implementation. Among those are the definition of the best set of input motions, the number of parameters required for guaranteeing smoothness of the spectrum in frequency and direction and how to determine their optimum values. These subjects are addressed in the light of an extensive experimental campaign performed with a small-scale model of an FPSO platform (VLCC hull), which was conducted in an ocean basin in Brazil. Tests involved long and short crested seas with variable levels of directional spreading and also bimodal conditions. The calibration spectra measured in the tank by means of an array of wave probes configured the paradigm for estimations. Results showed that a wide range of sea conditions could be estimated with good precision, even those with somewhat low peak periods. Some possible drawbacks that have been pointed out in previous works concerning the viability of employing large vessels for such a task are then refuted. Also, it is shown that a second parameter for smoothing the spectrum in frequency may indeed increase the accuracy in some situations, although the criterion usually proposed for estimating the optimum values (ABIC) demands large computational effort and does not seem adequate for practical on-board systems, which require expeditious estimations. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Mixed models have become important in analyzing the results of experiments, particularly those that require more complicated models (e.g., those that involve longitudinal data). This article describes a method for deriving the terms in a mixed model. Our approach extends an earlier method by Brien and Bailey to explicitly identify terms for which autocorrelation and smooth trend arising from longitudinal observations need to be incorporated in the model. At the same time we retain the principle that the model used should include, at least, all the terms that are justified by the randomization. This is done by dividing the factors into sets, called tiers, based on the randomization and determining the crossing and nesting relationships between factors. The method is applied to formulate mixed models for a wide range of examples. We also describe the mixed model analysis of data from a three-phase experiment to investigate the effect of time of refinement on Eucalyptus pulp from four different sources. Cubic smoothing splines are used to describe differences in the trend over time and unstructured covariance matrices between times are found to be necessary.

Relevance:

10.00%

Publisher:

Abstract:

We model and calibrate the arguments in favor of and against short-term and long-term debt. These arguments broadly include: the maturity premium, sustainability, and service smoothing. We use a dynamic-equilibrium model with tax distortions and uncertainty in government outlays, and model maturity as the fraction of debt that needs to be rolled over every period. In the model, the benefits of defaulting are tempered by higher future interest rates. We then calibrate our artificial economy and solve for the optimal debt maturity for Brazil, as an example of a developing country, and for the US, as an example of a mature economy. We find that the calibrated costs of defaulting on long-term debt more than offset the costs associated with short-term debt. Therefore, short-term debt implies higher welfare levels.

Relevance:

10.00%

Publisher:

Abstract:

The main arguments in favor of and against nominal and indexed debt are the incentive to default through inflation versus hedging against unforeseen shocks. We model and calibrate these arguments to assess their quantitative importance. We use a dynamic equilibrium model with tax distortions, uncertainty in government outlays, and contingent debt service. Our framework also recognizes that contingent debt can be associated with incentive problems and lack of commitment. Thus, the benefits of unexpected inflation are tempered by higher interest rates. We find that the costs from inflation more than offset the benefits from reducing tax distortions. We further discuss the sustainability of nominal debt in developing (volatile) countries. (C) 2010 Elsevier Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Background: People with less education in Europe, Asia, and the United States are at higher risk of mortality associated with daily and longer-term air pollution exposure. We examined whether educational level modified associations between mortality and ambient particulate pollution (PM10) in Latin America, using several timescales. Methods: The study population included people who died during 1998-2002 in Mexico City, Mexico; Santiago, Chile; and Sao Paulo, Brazil. We fit city-specific robust Poisson regressions to daily deaths from nonexternal-cause mortality, and then stratified by age, sex, and educational attainment among adults older than age 21 years (none, some primary, some secondary, and high school degree or more). Predictor variables included a natural spline for the temporal trend, linear PM10 and apparent temperature at matching lags, and day-of-week indicators. We evaluated PM10 at lags 0 and 1 day, and fit an unconstrained distributed lag model for cumulative 6-day effects. Results: The effects of a 10-µg/m³ increment in lag 1 PM10 on all nonexternal-cause adult mortality were 0.39% (95% confidence interval = 0.13%-0.65%) for Mexico City, 1.04% (0.71%-1.38%) for Sao Paulo, and 0.61% (0.40%-0.83%) for Santiago. We found cumulative 6-day effects on adult mortality in Santiago (0.86% [0.48%-1.23%]) and Sao Paulo (1.38% [0.85%-1.91%]), but no consistent gradients by educational status. Conclusions: PM10 had important short- and intermediate-term effects on mortality in these Latin American cities, but the associations did not differ consistently by educational level.
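A sketch of the unconstrained distributed lag idea mentioned above: PM10 at lags 0-5 enters a Poisson model as separate terms, and the cumulative 6-day effect is the sum of their coefficients. The data, column names, and spline trend term are hypothetical stand-ins, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 1500
df = pd.DataFrame({"day": np.arange(n), "pm10": rng.gamma(5, 12, n)})
for lag in range(6):                                  # lag 0 ... lag 5 columns
    df[f"pm10_l{lag}"] = df["pm10"].shift(lag)
rate = np.exp(3.0 + 0.0006 * df[[f"pm10_l{l}" for l in range(6)]].sum(axis=1))
df["deaths"] = rng.poisson(rate.fillna(rate.mean()))
df = df.dropna()

lag_terms = " + ".join(f"pm10_l{lag}" for lag in range(6))
fit = smf.glm(f"deaths ~ cr(day, df=10) + {lag_terms}",
              data=df, family=sm.families.Poisson()).fit()

# Cumulative effect = sum of the individual lag coefficients.
cumulative = sum(fit.params[f"pm10_l{lag}"] for lag in range(6))
print(f"Cumulative 6-day effect per 10 µg/m³: "
      f"{100 * (np.exp(10 * cumulative) - 1):.2f}%")
```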

Relevance:

10.00%

Publisher:

Abstract:

This paper describes a novel template-based meshing approach for generating good quality quadrilateral meshes from 2D digital images. This approach builds upon an existing image-based mesh generation technique called Imeshp, which enables us to create a segmented triangle mesh from an image without the need for an image segmentation step. Our approach generates a quadrilateral mesh using an indirect scheme, which converts the segmented triangle mesh created by the initial steps of the Imesh technique into a quadrilateral one. The triangle-to-quadrilateral conversion makes use of template meshes of triangles. To ensure good element quality, the conversion step is followed by a smoothing step, which is based on a new optimization-based procedure. We show several examples of meshes generated by our approach, and present a thorough experimental evaluation of the quality of the meshes given as examples.
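The paper's smoothing step is a new optimization-based procedure that is not reproduced here; as a stand-in, the sketch below applies plain Laplacian smoothing to the interior vertex of a tiny quadrilateral mesh, which illustrates the general idea of moving each free vertex toward the average of its neighbours. The mesh and connectivity are hypothetical.

```python
import numpy as np

# A 3x3 grid of vertices (9 points) forming 4 quads; only the centre is free.
verts = np.array([[x, y] for y in (0.0, 1.0, 2.0) for x in (0.0, 1.0, 2.0)])
quads = [(0, 1, 4, 3), (1, 2, 5, 4), (3, 4, 7, 6), (4, 5, 8, 7)]
verts[4] += [0.35, -0.4]                 # perturb the interior vertex

# Build vertex adjacency from quad edges.
neighbours = {i: set() for i in range(len(verts))}
for a, b, c, d in quads:
    for u, v in ((a, b), (b, c), (c, d), (d, a)):
        neighbours[u].add(v)
        neighbours[v].add(u)

interior = [4]                           # boundary vertices stay fixed
for _ in range(20):                      # a few Laplacian smoothing sweeps
    for i in interior:
        verts[i] = np.mean([verts[j] for j in neighbours[i]], axis=0)

print("Smoothed interior vertex:", np.round(verts[4], 3))
```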

Relevance:

10.00%

Publisher:

Abstract:

Brazil's State of Sao Paulo Research Foundation

Relevance:

10.00%

Publisher:

Abstract:

The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images presenting no smoothing and reproducing the covariance model; consequently, these images reproduce both the sample histogram and the sample semivariogram. However, there is still a problem, which is the lack of local accuracy of simulated images. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, yet at the expense of providing only one deterministic estimate of the random function.
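The paper's specific postprocessing algorithm is not given here; as a simple illustration of what "correcting the smoothing effect" means, the sketch below applies an affine transformation that restores the sample mean and standard deviation to a set of hypothetical, overly smooth kriged values.

```python
import numpy as np

rng = np.random.default_rng(5)
sample = rng.normal(50, 12, 300)                  # conditioning data values
kriged = 50 + 0.6 * (sample - 50) + rng.normal(0, 1, 300)   # smoothed estimates

m_s, s_s = sample.mean(), sample.std()
m_k, s_k = kriged.mean(), kriged.std()
corrected = m_s + (kriged - m_k) * (s_s / s_k)    # match mean and spread

print("variance before:", round(kriged.var(), 1),
      "after:", round(corrected.var(), 1),
      "target:", round(sample.var(), 1))
```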

Relevance:

10.00%

Publisher:

Abstract:

In this paper we extend partial linear models with normal errors to Student-t errors. Penalized likelihood equations are applied to derive the maximum likelihood estimates, which appear to be robust against outlying observations in the sense of the Mahalanobis distance. In order to study the sensitivity of the penalized estimates under some usual perturbation schemes in the model or data, the local influence curvatures are derived and some diagnostic graphics are proposed. A motivating example, preliminarily analyzed under normal errors, is reanalyzed under Student-t errors, and the local influence approach is used to compare the sensitivity of the model estimates. (C) 2010 Elsevier B.V. All rights reserved.