545 results for Smoothing


Relevance: 10.00%

Abstract:

Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind “noise,” which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical “downscaling” of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme.
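
A minimal sketch of the downscaling idea described above, under illustrative assumptions (hourly solar wind speed in a NumPy array, a boxcar filter standing in for the 8 h smoother, and resampling of the observed residuals as the noise model); this is not the authors' code:

```python
import numpy as np

def downscale(model_series, residual_samples, n_members=10, seed=0):
    """Toy statistical downscaling: add stochastic small-scale structure,
    drawn from an empirical residual distribution, to a smooth model series."""
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(n_members):
        noise = rng.choice(residual_samples, size=model_series.size, replace=True)
        members.append(model_series + noise)
    return np.array(members)

# Mimic "solar wind model output" by smoothing synthetic hourly observations with
# an 8 h boxcar, then treat the removed small-scale structure as the noise to re-add.
hours = np.arange(24 * 30)
obs = (400 + 50 * np.sin(2 * np.pi * hours / (27 * 24))
       + np.random.default_rng(1).normal(0, 30, hours.size))
kernel = np.ones(8) / 8.0
smooth = np.convolve(obs, kernel, mode="same")           # stand-in for model output
residuals = obs - smooth                                  # observed small-scale "noise"
ensemble = downscale(smooth, residuals, n_members=20)     # ensemble of downscaled drivers
print(ensemble.shape)                                     # (20, 720)
```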

Relevance: 10.00%

Abstract:

We present an analysis of the accuracy of the method introduced by Lockwood et al. (1994) for the determination of the magnetopause reconnection rate from the dispersion of precipitating ions in the ionospheric cusp region. Tests are made by applying the method to synthesised data. The simulated cusp ion precipitation data are produced by an analytic model of the evolution of newly-opened field lines, along which magnetosheath ions are first injected across the magnetopause and then dispersed as they propagate into the ionosphere. The rate at which these newly opened field lines are generated by reconnection can be varied. The derived reconnection rate estimates are then compared with the input variation to the model and the accuracy of the method assessed. Results are presented for steady-state reconnection, for continuous reconnection with a sine-wave variation in rate, and for reconnection which occurs only in square-wave pulses. It is found that the method always yields the total flux reconnected (per unit length of the open-closed field-line boundary) to better than 5% accuracy, but that pulses tend to be smoothed so that the peak reconnection rate within the pulse is underestimated and the pulse length is overestimated. This smoothing is reduced if the separation between energy channels of the instrument is reduced; however, this also increases the experimental uncertainty in the estimates, an effect which can be countered by improving the time resolution of the observations. The limited time resolution of the data is shown to set a minimum reconnection rate below which the method gives spurious short-period oscillations about the true value. Various examples of reconnection rate variations derived from cusp observations are discussed in the light of this analysis.
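
As a toy numerical illustration of the smoothing behaviour reported above (not the Lockwood et al. (1994) method itself), the sketch below smears a square-wave reconnection-rate pulse with a boxcar filter standing in for the finite energy-channel and time resolution; the integrated flux is preserved while the peak is underestimated and the pulse is broadened:

```python
import numpy as np

t = np.arange(0.0, 600.0, 1.0)                        # time, s (1 s sampling)
rate = np.where((t >= 200) & (t < 230), 1.0, 0.0)     # true pulse: 1 (arb. units) for 30 s
window = 60                                           # assumed effective smoothing length, s
smeared = np.convolve(rate, np.ones(window) / window, mode="same")

print("total flux  true/est:", rate.sum(), round(smeared.sum(), 2))        # ~equal
print("peak rate   true/est:", rate.max(), round(smeared.max(), 2))        # peak underestimated
print("pulse width true/est:", (rate > 0.05).sum(), (smeared > 0.05).sum())  # pulse broadened
```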

Relevance: 10.00%

Abstract:

This paper investigates the effect on balance of a number of Schur product-type localization schemes which have been designed with the primary function of reducing spurious far-field correlations in forecast error statistics. The localization schemes studied comprise a non-adaptive scheme (where the moderation matrix is decomposed in a spectral basis), and two adaptive schemes, namely a simplified version of SENCORP (Smoothed ENsemble COrrelations Raised to a Power) and ECO-RAP (Ensemble COrrelations Raised to A Power). The paper shows, we believe for the first time, how the degree of balance (geostrophic and hydrostatic) implied by the error covariance matrices localized by these schemes can be diagnosed. Here it is considered that an effective localization scheme is one that reduces spurious correlations adequately but also minimizes disruption of balance (where the 'correct' degree of balance or imbalance is assumed to be possessed by the unlocalized ensemble). By varying free parameters that describe each scheme (e.g. the degree of truncation in the schemes that use the spectral basis, the 'order' of each scheme, and the degree of ensemble smoothing), it is found that a particular configuration of the ECO-RAP scheme is best suited to the convective-scale system studied. According to our diagnostics, this ECO-RAP configuration still weakens geostrophic and hydrostatic balance, but less so overall than the other schemes.
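
A minimal sketch of Schur (element-wise) product localization of an ensemble covariance matrix, using a generic Gaussian taper as the moderation matrix rather than the SENCORP or ECO-RAP schemes of the paper; the 1-D grid, ensemble size and length scale are illustrative assumptions:

```python
import numpy as np

def localize(ensemble, lengthscale, grid):
    """ensemble: (n_members, n_state); grid: (n_state,) positions."""
    perturbations = ensemble - ensemble.mean(axis=0)
    B = perturbations.T @ perturbations / (ensemble.shape[0] - 1)  # raw ensemble covariance
    dist = np.abs(grid[:, None] - grid[None, :])
    C = np.exp(-0.5 * (dist / lengthscale) ** 2)                   # smooth moderation (localization) matrix
    return C * B                                                   # Schur product damps far-field correlations

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1000.0, 200)                               # 1-D state, km
ens = rng.normal(size=(20, grid.size))                             # toy 20-member ensemble
B_loc = localize(ens, lengthscale=100.0, grid=grid)
print(B_loc.shape)                                                 # (200, 200)
```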

Relevance: 10.00%

Abstract:

Quantitative palaeoclimate reconstructions are widely used to evaluate climate model performance. Here, as part of an effort to provide such a data set for Australia, we examine the impact of analytical decisions and sampling assumptions on modern-analogue reconstructions using a continent-wide pollen data set. There is a high degree of correlation between temperature variables in the modern climate of Australia, but there is sufficient orthogonality in the variations of precipitation, summer and winter temperature and plant-available moisture to allow independent reconstructions of these four variables to be made. The method of analogue selection does not affect the reconstructions, although bootstrap resampling provides a more reliable technique for obtaining robust measures of uncertainty. The number of analogues used affects the quality of the reconstructions: the most robust reconstructions are obtained using 5 analogues. The quality of reconstructions based on post-1850 CE pollen samples differs little from that of reconstructions using samples from between 1450 and 1849 CE, showing that post-European-settlement modification of vegetation has no impact on the fidelity of the reconstructions, although it substantially increases the availability of potential analogues. Reconstructions based on core-top samples are more realistic than those using surface samples, but using only core-top samples would substantially reduce the number of available analogues and therefore increase the uncertainty of the reconstructions. Spatial and/or temporal averaging of pollen assemblages prior to analysis negatively affects the subsequent reconstructions for some variables and increases the associated uncertainties. In addition, the quality of the reconstructions is affected by the degree of spatial smoothing of the original climate data, with the best reconstructions obtained using climate data from a 0.5° resolution grid, which corresponds to the typical size of the pollen catchment. This study provides a methodology that can be used to produce reliable palaeoclimate reconstructions for Australia, which will fill a major gap in the data sets used to evaluate climate models.
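
The sketch below illustrates the modern-analogue idea with hypothetical arrays (synthetic pollen proportions and site temperatures, squared-chord dissimilarity, 5 analogues, bootstrap uncertainty); it is not the data set or code used in the study:

```python
import numpy as np

def squared_chord(a, b):
    """Squared-chord dissimilarity between pollen assemblages (proportions)."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2, axis=-1)

def mat_reconstruct(fossil, modern_pollen, modern_climate, k=5, n_boot=1000, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    d = squared_chord(modern_pollen, fossil)        # dissimilarity to every modern sample
    analogues = np.argsort(d)[:k]                   # k closest analogues (k=5 found most robust)
    best = modern_climate[analogues].mean()
    # bootstrap resampling of the selected analogues for an uncertainty estimate
    boot = modern_climate[rng.choice(analogues, size=(n_boot, k))].mean(axis=1)
    return best, boot.std()

rng = np.random.default_rng(1)
modern_pollen = rng.dirichlet(np.ones(30), size=500)   # 500 modern samples, 30 taxa
modern_temp = rng.uniform(5, 30, size=500)             # e.g. mean annual temperature at each site
fossil_sample = rng.dirichlet(np.ones(30))
estimate, uncertainty = mat_reconstruct(fossil_sample, modern_pollen, modern_temp)
print(round(estimate, 1), "+/-", round(uncertainty, 1))
```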

Relevance: 10.00%

Abstract:

We provide new evidence on sea surface temperature (SST) variations and paleoceanographic/paleoenvironmental changes over the past 1500 years for the north Aegean Sea (NE Mediterranean). The reconstructions are based on multiproxy analyses obtained from the high resolution (decadal to multidecadal) marine record M2 retrieved from the Athos basin. Reconstructed SSTs show an increase from ca. 850 to 950 AD and from ca. 1100 to 1300 AD. A cooling phase of almost 1.5 °C is observed from ca. 1600 AD to 1700 AD. This seems to have been the starting point of a continuous SST warming trend until the end of the reconstructed period, interrupted by two prominent cooling events at 1832 ± 15 AD and 1995 ± 1 AD. Application of adaptive kernel smoothing suggests that the current warming in the reconstructed SSTs of the north Aegean might be unprecedented in the context of the past 1500 years. Internal variability in atmospheric/oceanic circulation systems, as well as external forcings such as solar radiation and volcanic activity, could have affected temperature variations in the north Aegean Sea over the past 1500 years. The marked temperature drop of ~2 °C at 1832 ± 15 AD could be related to the 1809 AD ‘unknown’ and the 1815 AD Tambora volcanic eruptions. Paleoenvironmental proxy indices of the M2 record show enhanced riverine/continental inputs in the northern Aegean after ca. 1450 AD. The paleoclimatic evidence derived from the M2 record is combined with a socio-environmental study of the history of the north Aegean region. We show that the cultivation of temperature-sensitive crops, i.e. walnut, vine and olive, co-occurred with stable and warmer temperatures, while its end coincided with a significant episode of cooler temperatures. Periods of agricultural growth in Macedonia coincide with periods of warmer and more stable SSTs, but further exploration is required in order to identify the causal links behind the observed phenomena. The Black Death likely caused major changes in agricultural activity in the north Aegean region, as reflected in the pollen data from land sites of Macedonia and the M2 proxy reconstructions. Finally, we conclude that the early modern peaks in mountain vegetation in the Rhodope and Macedonia highlands, visible also in the M2 record, were very likely climate-driven.

Relevance: 10.00%

Abstract:

This paper describes a novel template-based meshing approach for generating good quality quadrilateral meshes from 2D digital images. This approach builds upon an existing image-based mesh generation technique called Imeshp, which enables us to create a segmented triangle mesh from an image without the need for an image segmentation step. Our approach generates a quadrilateral mesh using an indirect scheme, which converts the segmented triangle mesh created by the initial steps of the Imesh technique into a quadrilateral one. The triangle-to-quadrilateral conversion makes use of template meshes of triangles. To ensure good element quality, the conversion step is followed by a smoothing step, which is based on a new optimization-based procedure. We show several examples of meshes generated by our approach and present a thorough experimental evaluation of their quality.
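
For illustration only, the sketch below applies plain Laplacian smoothing to the interior vertices of a small quadrilateral mesh; this generic technique is a stand-in for, not a reproduction of, the optimization-based smoothing procedure described above:

```python
import numpy as np

def laplacian_smooth(verts, quads, boundary, iterations=10):
    """verts: (n, 2) coordinates; quads: (m, 4) vertex indices; boundary: set of fixed vertices."""
    verts = verts.copy()
    # build vertex adjacency from quad edges
    neighbours = {i: set() for i in range(len(verts))}
    for q in quads:
        for a, b in zip(q, np.roll(q, -1)):
            neighbours[a].add(b)
            neighbours[b].add(a)
    for _ in range(iterations):
        new = verts.copy()
        for v, nbrs in neighbours.items():
            if v in boundary or not nbrs:
                continue                               # keep boundary vertices fixed
            new[v] = verts[list(nbrs)].mean(axis=0)    # move to centroid of neighbours
        verts = new
    return verts

# tiny 2x2-quad example: 3x3 grid of vertices, centre vertex perturbed
verts = np.array([[x, y] for y in range(3) for x in range(3)], dtype=float)
verts[4] += [0.3, -0.2]
quads = np.array([[0, 1, 4, 3], [1, 2, 5, 4], [3, 4, 7, 6], [4, 5, 8, 7]])
boundary = {0, 1, 2, 3, 5, 6, 7, 8}
print(laplacian_smooth(verts, quads, boundary)[4])     # centre vertex pulled back toward (1, 1)
```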

Relevance: 10.00%

Abstract:

The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images that present no smoothing and reproduce the covariance model; consequently, these images reproduce both the sample histogram and the sample semivariogram. However, there is still a problem, which is the lack of local accuracy of simulated images. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, though at the expense of providing a single deterministic estimate of the random function.

Relevance: 10.00%

Abstract:

The Shuttle Radar Topography Mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (~30 m) for areas within the USA and at 3 arc-second (~90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface whose angular properties (i.e. slope, aspect) are coherent between neighbouring pixels, which is an important characteristic when dealing with terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in the western USA, including different adjustment schemes (changes in nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the National Elevation Dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighting (IDW)). The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, due to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram at short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid point artifacts generated by predicted points at the same locations as original data points; and (iii) using a small nugget effect value, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines show good agreement with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and splines are very similar, we consider the results produced by kriging to be superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
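
A compact NumPy-only ordinary kriging sketch illustrating the three basic concepts listed above (local neighbourhood, small coordinate jitter, small nugget); the exponential variogram model, parameter values and synthetic elevations are assumptions for illustration, not those used in the study:

```python
import numpy as np

def ordinary_krige(xy, z, x0, n_neigh=16, sill=1.0, range_par=500.0, nugget=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    xy = xy + rng.normal(0.0, 1e-6, xy.shape)             # (ii) tiny coordinate jitter
    d0 = np.hypot(*(xy - x0).T)
    idx = np.argsort(d0)[:n_neigh]                        # (i) local neighbourhood only
    pts, vals = xy[idx], z[idx]

    def cov(h):                                           # exponential covariance model
        return sill * np.exp(-h / range_par)

    h = np.hypot(*(pts[:, None, :] - pts[None, :, :]).transpose(2, 0, 1))
    A = np.empty((n_neigh + 1, n_neigh + 1))
    A[:n_neigh, :n_neigh] = cov(h) + nugget * np.eye(n_neigh)   # (iii) small nugget
    A[-1, :], A[:, -1], A[-1, -1] = 1.0, 1.0, 0.0               # Lagrange row/column for OK
    b = np.append(cov(d0[idx]), 1.0)                            # covariances to target + constraint
    w = np.linalg.solve(A, b)[:n_neigh]
    return float(w @ vals)

rng = np.random.default_rng(1)
xy = rng.uniform(0, 3000, (400, 2))                       # hypothetical sample locations (m)
z = 500 + 0.05 * xy[:, 0] + rng.normal(0, 2, 400)         # hypothetical elevations (m)
print(ordinary_krige(xy, z, np.array([1500.0, 1500.0])))
```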

Relevance: 10.00%

Abstract:

In this paper we extend partial linear models with normal errors to Student-t errors. Penalized likelihood equations are applied to derive the maximum likelihood estimates, which appear to be robust against outlying observations in the sense of the Mahalanobis distance. In order to study the sensitivity of the penalized estimates under some usual perturbation schemes in the model or data, the local influence curvatures are derived and some diagnostic graphics are proposed. A motivating example, preliminarily analyzed under normal errors, is reanalyzed under Student-t errors. The local influence approach is used to compare the sensitivity of the model estimates. (C) 2010 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

A great deal has been written about forecasting, but few articles address the development of time series models of call volumes for emergency services. In this study, we apply and compare different forecasting techniques for the call volume of the emergency service Rescue 1122 in Lahore, Pakistan. Data are taken from emergency calls to Rescue 1122 from 1 January 2008 to 31 December 2009, giving 731 daily observations. Our goal is to develop a simple model that could be used for forecasting the daily call volume. Two different approaches are used for forecasting the daily call volume: the Box-Jenkins (ARIMA) methodology and the smoothing methodology. We build the forecasting models for call volume and present a comparison of the two techniques.
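
A hedged sketch of the two approaches on a synthetic daily call-volume series using statsmodels; the ARIMA order and exponential-smoothing (Holt-Winters) settings are illustrative assumptions, not those fitted to the Rescue 1122 data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
days = pd.date_range("2008-01-01", "2009-12-31", freq="D")     # 731 days, as in the study
calls = 300 + 20 * np.sin(2 * np.pi * np.arange(len(days)) / 7) + rng.normal(0, 10, len(days))
series = pd.Series(calls, index=days)                           # synthetic daily call volume

train, test = series[:-30], series[-30:]

arima_fc = ARIMA(train, order=(1, 0, 1)).fit().forecast(steps=30)
smooth_fc = ExponentialSmoothing(train, trend="add", seasonal="add",
                                 seasonal_periods=7).fit().forecast(30)

for name, fc in [("ARIMA", arima_fc), ("Holt-Winters smoothing", smooth_fc)]:
    mae = np.mean(np.abs(test.values - np.asarray(fc)))         # out-of-sample comparison
    print(f"{name}: MAE = {mae:.1f} calls/day")
```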

Relevance: 10.00%

Abstract:

In this work I analyze the model proposed by Goldfajn (2000) to study the choice of the denomination of the public debt. The main purpose of the analysis is to point out possible reasons why new empirical evidence provided by Bevilaqua, Garcia and Nechio (2004), regarding a more recent time period, finds weaker empirical support for the model. I also provide a measure of the overestimation of the welfare gains of hedging the debt caused by the simplified time frame of the model. Assuming a time-preference parameter of 0.9, for instance, welfare gains associated with a hedge on the debt that cuts in half a once-and-for-all 20%-of-GDP shock to government spending run around 1.43% of GDP under the no-tax-smoothing structure of the model. Under a Ramsey allocation, though, welfare gains amount to just around 0.05% of GDP.

Relevance: 10.00%

Abstract:

Asymmetric kernels are quite useful for the estimation of density functions with bounded support. Gamma kernels are designed to handle density functions whose supports are bounded from one end only, whereas beta kernels are particularly convenient for the estimation of density functions with compact support. These asymmetric kernels are nonnegative and free of boundary bias. Moreover, their shape varies according to the location of the data point, thus also changing the amount of smoothing. This paper applies the central limit theorem for degenerate U-statistics to compute the limiting distribution of a class of asymmetric kernel functionals.
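
The sketch below illustrates gamma- and beta-kernel density estimation in the spirit described above, with the kernel's shape parameters tied to the evaluation point so the amount of smoothing varies across the support; the bandwidth and synthetic data are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def gamma_kde(x, data, b):
    """Support [0, inf): gamma kernel with shape x/b + 1 and scale b, averaged over the data."""
    return stats.gamma.pdf(data[None, :], a=x[:, None] / b + 1.0, scale=b).mean(axis=1)

def beta_kde(x, data, b):
    """Support [0, 1]: beta kernel with parameters x/b + 1 and (1 - x)/b + 1."""
    return stats.beta.pdf(data[None, :], x[:, None] / b + 1.0,
                          (1.0 - x[:, None]) / b + 1.0).mean(axis=1)

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 101)
sample = rng.beta(0.8, 2.0, size=500)        # mass piling up at 0: hard for symmetric kernels
print(beta_kde(grid, sample, b=0.02)[:3])    # no weight spills below the boundary at 0
print(gamma_kde(grid, sample, b=0.02)[:3])
```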

Relevance: 10.00%

Abstract:

Although there has been substantial research on long-run co-movement (common trends) in the empirical macroeconomics literature, little or no work has been done on short-run co-movement (common cycles). Investigating common cycles is important on two grounds: first, their existence is an implication of most dynamic macroeconomic models; second, they impose important restrictions on dynamic systems, which can be used for efficient estimation and forecasting. In this paper, using a methodology that takes into account short- and long-run co-movement restrictions, we investigate their existence in a multivariate data set containing U.S. per-capita output, consumption, and investment. As predicted by theory, the data have common trends and common cycles. Based on the results of a post-sample forecasting comparison between restricted and unrestricted systems, we show that a non-trivial loss of efficiency results when common cycles are ignored. If permanent shocks are associated with changes in productivity, the latter fails to be an important source of variation for output and investment, contradicting simple aggregate dynamic models. Nevertheless, these shocks play a very important role in explaining the variation of consumption, showing evidence of smoothing. Furthermore, it seems that permanent shocks to output play a much more important role in explaining unemployment fluctuations than previously thought.

Relevance: 10.00%

Abstract:

The main objective of this dissertation is to investigate the impact of accruals on the variability of corporate earnings (EVAR), which influences the practice of income smoothing in Brazilian publicly traded firms. We first discuss the importance of the financial statements, which must be disclosed in compliance with generally accepted accounting principles; their disclosure should represent the firm's economic and financial reality for the decision-making process of shareholders and creditors. At certain moments, however, managers feel motivated to manage reported earnings in an attempt to reduce the variability of profits through the use of accruals. Accruals correspond to the difference between net income and operating cash flow. In this process of reducing earnings volatility, managers resort to the practice of income smoothing, seeking to reduce potential distortions in the price of the firm's shares. The sample in this study consists of a group of 163 publicly traded firms listed on Bovespa that reported financial information between 2000 and 2007, categorized by sector using data obtained from Economática. The statistical approach used in the research was regression analysis, applied to explain the different cross-sectional models. The results indicate that accruals are significant in explaining the variability of corporate earnings (EVAR) of Brazilian companies. Furthermore, our results suggest that the structural model identifying EVAR in Brazilian companies should be explained by non-accounting variables different from those reported for North American firms.
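
An illustrative sketch (hypothetical data, not the Economática sample) of the quantities involved: accruals computed as net income minus operating cash flow, and firm-level earnings variability regressed cross-sectionally on the magnitude of accruals:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
firms, years = 163, 8                                   # 163 firms, 2000-2007 as in the study
cfo = rng.normal(100, 30, (firms, years))               # hypothetical operating cash flow
accruals = rng.normal(0, 15, (firms, years))            # hypothetical accruals
net_income = cfo + accruals                             # accruals = net income - operating cash flow

df = pd.DataFrame({
    "evar": net_income.std(axis=1),                     # earnings variability per firm
    "accruals_mag": np.abs(accruals).mean(axis=1),      # average accruals magnitude per firm
})
ols = sm.OLS(df["evar"], sm.add_constant(df["accruals_mag"])).fit()
print(ols.params)                                       # cross-sectional regression coefficients
```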

Relevance: 10.00%

Abstract:

This thesis analyzes the fiscal policy of the Brazilian state of Rio Grande do Sul between 1970 and 2003. The study is organized as follows: first, it reviews the debate on whether public deficits matter; second, it tests the sustainability of Rio Grande do Sul's fiscal policy by means of unit root and cointegration tests; third, it tests the tax-smoothing hypothesis for the case of Rio Grande do Sul; and fourth, it tests several hypotheses about the determinants of the public deficit, measured by the change in the debt/GDP ratio, and of the primary deficit for the case of Rio Grande do Sul. The hypotheses are grouped into economic, institutional, and political factors.
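
A sketch of the kind of unit root and cointegration testing mentioned above, run on synthetic revenue and spending series rather than the Rio Grande do Sul data, using statsmodels:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(0)
n = 136                                       # e.g. quarterly observations, 1970-2003
shock = rng.normal(0, 1, n).cumsum()          # common stochastic trend
revenue = 10 + shock + rng.normal(0, 0.5, n)
spending = 12 + shock + rng.normal(0, 0.5, n)

for name, series in [("revenue", revenue), ("spending", spending)]:
    stat, pvalue, *_ = adfuller(series)       # ADF unit root test on each series
    print(f"ADF {name}: stat={stat:.2f}, p={pvalue:.2f}")

stat, pvalue, _ = coint(revenue, spending)    # Engle-Granger cointegration test
print(f"Engle-Granger cointegration: stat={stat:.2f}, p={pvalue:.2f}")
```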