914 results for Monotonic interpolation


Relevance: 10.00%

Abstract:

In this paper we present error analysis for a Monte Carlo algorithm for evaluating bilinear forms of matrix powers. An Almost Optimal Monte Carlo (MAO) algorithm for solving this problem is formulated. Results for the structure of the probability error are presented, and the construction of robust and interpolation Monte Carlo algorithms is discussed. Results are presented comparing the performance of the Monte Carlo algorithm with that of a corresponding deterministic algorithm. The two algorithms are tested on a well-balanced matrix, and the effects of perturbing this matrix, by small and large amounts, are then studied.
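A minimal sketch of the kind of estimator this abstract analyses, assuming the MAO-style choice of densities (start states drawn proportionally to |v|, transitions proportionally to |a_ij|); the function name and test values are illustrative, not the paper's implementation:

```python
import numpy as np

def mc_bilinear_form(A, v, h, k, n_chains=50_000, seed=0):
    """Monte Carlo estimate of the bilinear form v^T A^k h, using MAO-style
    densities: start states ~ |v_i|, transitions ~ |a_ij| row-wise."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    p0 = np.abs(v) / np.abs(v).sum()                      # initial density
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)  # transition densities
    total = 0.0
    for _ in range(n_chains):
        i = rng.choice(n, p=p0)
        W = v[i] / p0[i]                # importance weight of the start state
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            W *= A[i, j] / P[i, j]      # weight update along the chain
            i = j
        total += W * h[i]
    return total / n_chains

# Sanity check against the deterministic value of v^T A^3 h.
rng = np.random.default_rng(1)
A = rng.random((5, 5)) * 0.2
v, h = rng.random(5), rng.random(5)
print(mc_bilinear_form(A, v, h, k=3), v @ np.linalg.matrix_power(A, 3) @ h)
```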

Relevance: 10.00%

Abstract:

In this paper we analyse the applicability and robustness of Markov chain Monte Carlo algorithms for eigenvalue problems. We restrict our consideration to real symmetric matrices. Almost Optimal Monte Carlo (MAO) algorithms for solving eigenvalue problems are formulated. Results for the structure of both the systematic and the probability error are presented. It is shown that the values of both errors can be controlled independently by different algorithmic parameters. The results show how the systematic error depends on the matrix spectrum. The analysis of the probability error shows that the closer (in some sense) the matrix under consideration is to a stochastic matrix, the smaller this error is. Sufficient conditions for constructing robust and interpolation Monte Carlo algorithms are obtained. For stochastic matrices an interpolation Monte Carlo algorithm is constructed. A number of numerical tests on large symmetric dense matrices are performed in order to study experimentally the dependence of the systematic error on the structure of the matrix spectrum. We also study how the probability error depends on the balancing of the matrix.
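For intuition, a sketch of the power-method-style estimate such algorithms compute, lambda_max ~ (h, A^k h) / (h, A^(k-1) h), with both bilinear forms accumulated along the same weighted random walks; names, parameters and the test matrix are illustrative:

```python
import numpy as np

def mc_dominant_eigenvalue(A, k=20, n_chains=50_000, seed=0):
    """MCMC estimate of the dominant eigenvalue of a real symmetric matrix,
    via the ratio (h, A^k h) / (h, A^(k-1) h) with h = (1, ..., 1)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)  # MAO transition densities
    num = den = 0.0
    for _ in range(n_chains):
        i = rng.integers(n)             # uniform start state
        W = float(n)                    # 1 / p0 with p0 = 1/n
        for _ in range(k - 1):
            j = rng.choice(n, p=P[i])
            W *= A[i, j] / P[i, j]
            i = j
        den += W                        # contributes to (h, A^(k-1) h)
        j = rng.choice(n, p=P[i])
        num += W * A[i, j] / P[i, j]    # one more step for (h, A^k h)
    return num / den

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric test matrix
print(mc_dominant_eigenvalue(A), np.linalg.eigvalsh(A).max())
```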

Relevance: 10.00%

Abstract:

Concentrations of dissolved organic carbon (DOC) have increased in many, but not all, surface waters across acid-impacted areas of Europe and North America over the last two decades. Over the last eight years several hypotheses have been put forward to explain these increases, but none is yet universally accepted. Research in this area appears to have reached a stalemate between those favouring declining atmospheric deposition, climate change or land management as the key driver of long-term DOC trends. While it is clear that many of these factors influence DOC dynamics in soil and stream waters, their effect varies over different temporal and spatial scales. We argue that regional differences in acid deposition loading may account for the apparent discrepancies between studies. DOC has shown strong monotonic increases in areas which have experienced strong downward trends in pollutant sulphur and/or sea-salt deposition. Elsewhere, climatic factors that strongly influence seasonality have also dominated inter-annual variability, and here long-term monotonic DOC trends are often difficult to detect. Furthermore, in areas receiving similar acid loadings, different catchment characteristics could have affected the site-specific sensitivity to changes in acidity and therefore the magnitude of DOC release in response to changes in sulphur deposition. We suggest that confusion over these temporal and spatial scales of investigation has contributed unnecessarily to the disagreement over the main regional driver(s) of DOC trends, and that the data behind the majority of these studies are more compatible than is often conveyed.

Relevance: 10.00%

Abstract:

A kriging interpolation method is combined with an object-based evaluation measure to assess the ability of the UK Met Office's dispersion and weather prediction models to predict the evolution of a plume of tracer as it was transported across Europe. The object-based evaluation method, SAL, considers aspects of the Structure, Amplitude and Location of the pollutant field. The SAL method is able to quantify errors in the predicted size and shape of the pollutant plume through the structure component, the over- or under-prediction of the pollutant concentrations through the amplitude component, and the position of the pollutant plume through the location component. The quantitative results of the SAL evaluation are similar for both models and consistent with a subjective visual inspection of the predictions. A negative structure component for both models, throughout the entire 60-hour plume dispersion simulation, indicates that the modelled plumes are too small and/or too peaked compared to the observed plume at all times. The amplitude component for both models is strongly positive at the start of the simulation, indicating that surface concentrations are over-predicted by both models for the first 24 hours, but modelled concentrations are within a factor of 2 of the observations at later times. Finally, for both models, the location component is small for the first 48 hours after the start of the tracer release, indicating that the modelled plumes are situated close to the observed plume early in the simulation, but this plume location error grows at later times. The SAL methodology has also been used to identify differences in the transport of pollution in the dispersion and weather prediction models. The convection scheme in the weather prediction model is found to transport more pollution vertically out of the boundary layer into the free troposphere than the dispersion model's convection scheme, resulting in lower pollutant concentrations near the surface and hence a better forecast for this case study.
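A simplified stand-in for the SAL computation, treating each field as a single object (the full method's object-identification step is omitted); the function name is illustrative:

```python
import numpy as np

def sal_simplified(mod, obs, dx=1.0):
    """Simplified S, A, L components for two 2-D pollutant fields on the same
    grid; amplitude and location follow the standard definitions, while the
    structure term compares whole-field "scaled volumes" sum(f)/max(f)."""
    a = (mod.mean() - obs.mean()) / (0.5 * (mod.mean() + obs.mean()))

    # Location: normalised distance between the fields' centres of mass.
    ny, nx = obs.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    com = lambda f: np.array([(yy * f).sum(), (xx * f).sum()]) / f.sum()
    d_max = np.hypot(ny - 1, nx - 1) * dx       # largest distance in the domain
    loc = np.linalg.norm((com(mod) - com(obs)) * dx) / d_max

    # Structure: too-small/too-peaked fields give a negative component.
    v_mod, v_obs = mod.sum() / mod.max(), obs.sum() / obs.max()
    s = (v_mod - v_obs) / (0.5 * (v_mod + v_obs))
    return s, a, loc

obs = np.zeros((50, 50)); obs[20:30, 20:30] = 1.0
mod = np.zeros((50, 50)); mod[22:28, 25:31] = 3.0   # smaller, peakier, shifted
print(sal_simplified(mod, obs))                     # S < 0, A > 0, L > 0
```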

Relevance: 10.00%

Abstract:

The ARM Shortwave Spectrometer (SWS) measures zenith radiance at 418 wavelengths between 350 and 2170 nm. Because of its 1-second sampling resolution, the SWS provides a unique capability to study the transition zone between cloudy and clear-sky areas. A spectrally invariant behavior is found between ratios of zenith radiance spectra during the transition from cloudy to cloud-free. This behavior suggests that the spectral signature of the transition zone is a linear mixture of the two extremes (definitely cloudy and definitely clear). The weighting function of the linear mixture is a wavelength-independent characteristic of the transition zone. It is shown that the transition zone spectrum is fully determined by this function and the zenith radiance spectra of the clear and cloudy regions. An important result of these findings is that high temporal resolution radiance measurements in the clear-to-cloud transition zone can be well approximated by lower temporal resolution measurements plus linear interpolation.
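The linear-mixture result lends itself to a one-line estimator: writing I_trans(lambda) ~ w*I_clear(lambda) + (1-w)*I_cloudy(lambda) with a wavelength-independent w, least squares over the spectrum recovers w. A small sketch with synthetic stand-in spectra (not SWS data):

```python
import numpy as np

def mixture_weight(transition, clear, cloudy):
    """Least-squares estimate of the wavelength-independent weight w in
    I_trans = w * I_clear + (1 - w) * I_cloudy, fitted across all channels."""
    num = ((transition - cloudy) * (clear - cloudy)).sum()
    den = ((clear - cloudy) ** 2).sum()
    return num / den

wl = np.linspace(350.0, 2170.0, 418)          # SWS wavelength grid, nm
clear = 0.05 + 0.02 * np.sin(wl / 300.0)      # stand-in zenith radiance spectra
cloudy = 0.40 + 0.05 * np.cos(wl / 500.0)
transition = 0.3 * clear + 0.7 * cloudy       # built from a known w = 0.3
print(mixture_weight(transition, clear, cloudy))   # recovers ~0.3
```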

Relevance: 10.00%

Abstract:

Rainfall can be modeled as a spatially correlated random field superimposed on a background mean value; therefore, geostatistical methods are appropriate for the analysis of rain gauge data. Nevertheless, there are certain typical features of these data that must be taken into account to produce useful results, including the generally non-Gaussian mixed distribution, the inhomogeneity and low density of observations, and the temporal and spatial variability of spatial correlation patterns. Many studies show that rigorous geostatistical analysis performs better than other available interpolation techniques for rain gauge data. Important elements are the use of climatological variograms and the appropriate treatment of rainy and nonrainy areas. Benefits of geostatistical analysis for rainfall include ease of estimating areal averages, estimation of uncertainties, and the possibility of using secondary information (e.g., topography). Geostatistical analysis also facilitates the generation of ensembles of rainfall fields that are consistent with a given set of observations, allowing for a more realistic exploration of errors and their propagation in downstream models, such as those used for agricultural or hydrological forecasting. This article provides a review of geostatistical methods used for kriging, exemplified where appropriate by daily rain gauge data from Ethiopia.
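As a sketch of the kriging step underlying such analyses, ordinary kriging at a single target point with an exponential variogram; the variogram model and its parameters are placeholders for a climatological variogram fitted to historical gauge data:

```python
import numpy as np

def ordinary_kriging(xy_obs, z_obs, xy_tgt, sill=1.0, vrange=50.0, nugget=0.0):
    """Ordinary kriging estimate at xy_tgt from point observations z_obs,
    with exponential variogram gamma(h) = nugget + sill*(1 - exp(-h/vrange))."""
    gamma = lambda h: nugget + sill * (1.0 - np.exp(-h / vrange))
    n = len(z_obs)
    d = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = gamma(d)
    K[n, :] = K[:, n] = 1.0          # Lagrange row/column enforcing sum(w) = 1
    K[n, n] = 0.0
    rhs = np.append(gamma(np.linalg.norm(xy_obs - xy_tgt, axis=-1)), 1.0)
    w = np.linalg.solve(K, rhs)[:n]  # kriging weights
    return w @ z_obs

xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # gauge locations (km)
z = np.array([5.0, 0.0, 2.0])                          # daily rainfall (mm)
print(ordinary_kriging(xy, z, np.array([3.0, 3.0])))
```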

Relevance: 10.00%

Abstract:

In this study we quantify the relationship between the aerosol optical depth increase from a volcanic eruption and the severity of the subsequent surface temperature decrease. This investigation is made by simulating 10 different sizes of eruption in a global circulation model (GCM) by changing the stratospheric sulfate aerosol optical depth at each time step. The sizes of the simulated eruptions range from Pinatubo-sized up to the magnitude of supervolcanic eruptions, around 100 times the size of Pinatubo. From these simulations we find that there is a smooth monotonic relationship between the global mean maximum aerosol optical depth anomaly and the global mean temperature anomaly, and we derive a simple mathematical expression which fits this relationship well. We also construct similar relationships between global mean aerosol optical depth and the temperature anomaly at every individual model grid box to produce global maps of best-fit coefficients and fit residuals. These maps are used, with caution, to find the eruption size at which a local temperature anomaly is clearly distinct from the local natural variability, and to approximate the temperature anomalies which the model may simulate following a Tambora-sized eruption. To our knowledge, this is the first study to quantify the relationship between aerosol optical depth and the resulting temperature anomalies in a simple way, using the wealth of data that is available from GCM simulations.
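The abstract does not quote the derived expression, so the sketch below only illustrates the fitting step with a hypothetical saturating form and made-up sample values, not the paper's data or formula:

```python
import numpy as np
from scipy.optimize import curve_fit

def dT_of_aod(tau, a, b):
    """Hypothetical fit: temperature anomaly as a saturating function of the
    global mean maximum aerosol optical depth anomaly tau."""
    return -a * np.log1p(b * tau)

tau = np.array([0.1, 0.3, 1.0, 3.0, 10.0])      # illustrative eruption sizes
dT = np.array([-0.3, -0.7, -1.6, -3.0, -5.2])   # illustrative anomalies (K)
(a, b), _ = curve_fit(dT_of_aod, tau, dT, p0=(1.0, 1.0))
print(a, b, dT - dT_of_aod(tau, a, b))          # coefficients and residuals
```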

Relevance: 10.00%

Abstract:

Using 4 years of radar and lidar observations of layer clouds from the Chilbolton Observatory in the UK, we show that almost all (95%) ice particles formed at temperatures >-20°C appear to originate from supercooled liquid clouds. At colder temperatures, there is a monotonic decline in the fraction of liquid-topped ice clouds: 50% at -27°C, falling to zero at -37°C (where homogeneous freezing of water droplets occurs). This strongly suggests that deposition nucleation plays a relatively minor role in the initiation of ice in mid-level clouds. It also means that the initial growth of the ice particles occurs predominantly within a liquid cloud, a situation which promotes rapid production of precipitation via the Bergeron-Findeisen mechanism.

Relevance: 10.00%

Abstract:

The dependence of much of Africa on rain-fed agriculture leads to a high vulnerability to fluctuations in rainfall amount. Hence, accurate monitoring of near-real-time rainfall is particularly useful, for example in forewarning possible crop shortfalls in drought-prone areas. Unfortunately, ground-based observations are often inadequate. Rainfall estimates from satellite-based algorithms and numerical model outputs can fill this data gap; however, rigorous assessment of such estimates is required. In this case, three satellite-based products (NOAA-RFE 2.0, GPCP-1DD and TAMSAT) and two numerical model outputs (ERA-40 and ERA-Interim) have been evaluated for Uganda in East Africa using a network of 27 rain gauges. The study focuses on the years 2001 to 2005 and considers the main rainy season (February to June). All data sets were converted to the same temporal and spatial scales. Kriging was used for the spatial interpolation of the gauge data. All three satellite products showed similar characteristics and had a high level of skill that exceeded both model outputs. ERA-Interim had a tendency to overestimate, whilst ERA-40 consistently underestimated the Ugandan rainfall.
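A minimal sketch of the gridpoint comparison such a validation comes down to once all products share a grid; the metrics here are generic choices, not necessarily the study's exact skill measures:

```python
import numpy as np

def skill_scores(estimate, reference):
    """Bias, mean absolute error and correlation of a satellite or model
    rainfall estimate against a gauge-based (e.g. kriged) reference field."""
    e = np.asarray(estimate, dtype=float).ravel()
    r = np.asarray(reference, dtype=float).ravel()
    return {
        "bias": (e - r).mean(),          # > 0 means overestimation
        "mae": np.abs(e - r).mean(),
        "corr": np.corrcoef(e, r)[0, 1],
    }
```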

Relevance: 10.00%

Abstract:

This paper proposes a nonlinear regression structure comprising a wavelet network and a linear term. The introduction of the linear term is aimed at providing a more parsimonious interpolation in high-dimensional spaces when the modelling samples are sparse. A constructive procedure for building such structures, termed linear-wavelet networks, is described. For illustration, the proposed procedure is employed in the framework of dynamic system identification. In an example involving a simulated fermentation process, it is shown that a linear-wavelet network yields a smaller approximation error when compared with a wavelet network with the same number of regressors. The proposed technique is also applied to the identification of a pressure plant from experimental data. In this case, the results show that the introduction of wavelets considerably improves the prediction ability of a linear model. Standard errors on the estimated model coefficients are also calculated to assess the numerical conditioning of the identification process.
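An illustrative least-squares fit of such a structure with the wavelet centres and scales held fixed (the paper's constructive selection procedure is not reproduced); the Mexican-hat wavelet and all names are assumptions:

```python
import numpy as np

def mexican_hat(u):
    return (1.0 - u ** 2) * np.exp(-0.5 * u ** 2)

def fit_linear_wavelet(X, y, centres, scale):
    """Least squares for y ~ X @ w + b + sum_i c_i * psi(||x - t_i|| / s):
    a linear term plus a wavelet network with fixed centres t_i and scale s."""
    u = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=-1) / scale
    Phi = np.hstack([X, np.ones((len(X), 1)), mexican_hat(u)])  # regressors
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return Phi, theta

# Sparse samples of a nonlinear map superimposed on a linear trend.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(40, 2))
y = 2.0 * X[:, 0] - X[:, 1] + np.sin(3.0 * X[:, 0] * X[:, 1])
centres = rng.uniform(-1.0, 1.0, size=(8, 2))
Phi, theta = fit_linear_wavelet(X, y, centres, scale=0.5)
print(np.abs(Phi @ theta - y).max())    # training residual
```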

Relevance: 10.00%

Abstract:

This paper is a tutorial introduction to pseudospectral optimal control. With pseudospectral methods, a function is approximated as a linear combination of smooth basis functions, which are often chosen to be Legendre or Chebyshev polynomials. Collocation of the differential-algebraic equations is performed at orthogonal collocation points, which are selected to yield interpolation of high accuracy. Pseudospectral methods directly discretize the original optimal control problem, recasting it as a nonlinear programming problem, and a numerical optimizer is then employed to find approximate local optimal solutions. The paper also briefly describes the functionality and implementation of PSOPT, an open-source software package written in C++ that employs pseudospectral discretization methods to solve multi-phase optimal control problems. The software implements the Legendre and Chebyshev pseudospectral methods, and it has useful features such as automatic differentiation, sparsity detection, and automatic scaling. The use of pseudospectral methods is illustrated with two problems taken from the literature on computational optimal control.
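As a flavour of the machinery involved, the standard Chebyshev-Gauss-Lobatto differentiation matrix (Trefethen's cheb.m), the building block such methods use to collocate differential equations; this is generic pseudospectral code, not PSOPT internals:

```python
import numpy as np

def cheb(N):
    """Chebyshev-Gauss-Lobatto points on [-1, 1] and the differentiation
    matrix D such that D @ f(x) approximates f'(x) spectrally accurately."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))     # diagonal = negative row sums
    return D, x

D, x = cheb(16)
print(np.abs(D @ np.sin(x) - np.cos(x)).max())   # spectrally small error
```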

Relevance: 10.00%

Abstract:

This paper describes the implementation of a 3D variational (3D-Var) data assimilation scheme for a morphodynamic model applied to Morecambe Bay, UK. A simple decoupled hydrodynamic and sediment transport model is combined with a data assimilation scheme to investigate the ability of such methods to improve the accuracy of the predicted bathymetry. The inverse forecast error covariance matrix is modelled using a Laplacian approximation which is calibrated for the length scale parameter required. Calibration is also performed for the Soulsby-van Rijn sediment transport equations. The data used for assimilation purposes comprises waterlines derived from SAR imagery covering the entire period of the model run, and swath bathymetry data collected by a ship-borne survey for one date towards the end of the model run. A LiDAR survey of the entire bay carried out in November 2005 is used for validation purposes. The comparison of the predictive ability of the model alone with the model-forecast-assimilation system demonstrates that using data assimilation significantly improves the forecast skill. An investigation of the assimilation of the swath bathymetry as well as the waterlines demonstrates that the overall improvement is initially large, but decreases over time as the bathymetry evolves away from that observed by the survey. Combining the calibration runs into a pseudo-ensemble provides a higher skill score than a single optimized model run. A brief comparison of the Optimal Interpolation assimilation method with the 3D-Var method shows that the two schemes give similar results.
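A linear toy version of the 3D-Var analysis step, with the inverse background-error covariance modelled by a discrete Laplacian in the spirit the abstract describes; dimensions, parameters and names are all illustrative:

```python
import numpy as np

def laplacian_binv(n, alpha=1.0, beta=4.0):
    """1-D stand-in for a Laplacian-modelled inverse background covariance,
    B^-1 = alpha*I - beta*L; the ratio beta/alpha sets the length scale."""
    L = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    return alpha * np.eye(n) - beta * L

def var3d(xb, Binv, y, H, Rinv):
    """Minimiser of J(x) = (x-xb)^T B^-1 (x-xb) + (y-Hx)^T R^-1 (y-Hx)."""
    A = Binv + H.T @ Rinv @ H                  # Hessian of J (up to a factor 2)
    return np.linalg.solve(A, Binv @ xb + H.T @ Rinv @ y)

# Assimilate two point depth observations into a 20-point background profile.
n = 20
xb = np.zeros(n)                               # background bathymetry anomaly
H = np.zeros((2, n)); H[0, 5] = H[1, 14] = 1.0 # observation operator
ya = var3d(xb, laplacian_binv(n), np.array([1.0, -0.5]), H, np.eye(2) / 0.1)
print(ya.round(2))                             # smooth increments around the obs
```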

Relevance: 10.00%

Abstract:

The arbitrarily structured C-grid scheme TRiSK (Thuburn, Ringler, Skamarock and Klemp, 2009, 2010) is used in the Model for Prediction Across Scales (MPAS) and is being considered by the UK Met Office for their next dynamical core. However, the hexagonal C-grid supports a branch of spurious Rossby modes which lead to erroneous grid-scale oscillations of potential vorticity (PV). It is shown how these modes can be harmlessly controlled by using upwind-biased interpolation schemes for PV. A number of existing advection schemes for PV are tested, including that used in MPAS, and none is found to give adequate results for all grids and all cases. Therefore a new scheme is proposed: continuous, linear-upwind stabilised transport (CLUST), a blend between centred and linear-upwind interpolation with the blend dependent on the flow direction with respect to the cell edge. A diagnostic of grid-scale oscillations is proposed which discriminates between schemes more sharply than potential enstrophy alone; indeed, some schemes are found to destroy potential enstrophy while grid-scale oscillations grow. CLUST performs well on hexagonal-icosahedral grids and unrotated skipped latitude-longitude grids of the sphere for various shallow-water test cases. Despite the computational modes, the hexagonal-icosahedral grid performs well since these modes are easy and harmless to filter. As a result, TRiSK appears to perform better than a spectral shallow-water model.
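A 1-D periodic sketch of a CLUST-style face interpolation: the upwind stencil follows the local flow direction, and a fixed blend coefficient stands in for the scheme's edge-dependent blending on the hexagonal C-grid:

```python
import numpy as np

def clust_face_values(q, u, blend=0.25):
    """Blend of centred and linear-upwind face values on a periodic 1-D grid;
    blend=0 is pure centred, blend=1 pure linear-upwind."""
    qc = 0.5 * (q + np.roll(q, -1))                       # centred face value
    up_pos = 1.5 * q - 0.5 * np.roll(q, 1)                # linear-upwind, u > 0
    up_neg = 1.5 * np.roll(q, -1) - 0.5 * np.roll(q, -2)  # linear-upwind, u < 0
    qu = np.where(u >= 0.0, up_pos, up_neg)
    return (1.0 - blend) * qc + blend * qu

# One forward-Euler advection step of a PV-like field q with face velocities u.
n, dx, dt = 64, 1.0 / 64, 0.005
x = np.arange(n) * dx
q, u = np.sin(2.0 * np.pi * x), np.ones(n)
flux = u * clust_face_values(q, u)
q_new = q - dt / dx * (flux - np.roll(flux, 1))    # flux-divergence update
print(np.abs(q_new - np.sin(2.0 * np.pi * (x - dt))).max())
```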

Relevance: 10.00%

Abstract:

Records of Atlantic basin tropical cyclones (TCs) since the late nineteenth century indicate a very large upward trend in storm frequency. This increase in documented TCs has been previously interpreted as resulting from anthropogenic climate change. However, improvements in observing and recording practices provide an alternative interpretation for these changes: recent studies suggest that the number of potentially missed TCs is sufficient to explain a large part of the recorded increase in TC counts. This study explores the influence of another factor—TC duration—on observed changes in TC frequency, using a widely used Atlantic hurricane database (HURDAT). It is found that the occurrence of short-lived storms (duration of 2 days or less) in the database has increased dramatically, from less than one per year in the late nineteenth–early twentieth century to about five per year since about 2000, while medium- to long-lived storms have increased little, if at all. Thus, the previously documented increase in total TC frequency since the late nineteenth century in the database is primarily due to an increase in very short-lived TCs. The authors also undertake a sampling study based upon the distribution of ship observations, which provides quantitative estimates of the frequency of missed TCs, focusing just on the moderate to long-lived systems with durations exceeding 2 days in the raw HURDAT. Upon adding the estimated numbers of missed TCs, the time series of moderate to long-lived Atlantic TCs show substantial multidecadal variability, but neither time series exhibits a significant trend since the late nineteenth century, with a nominal decrease in the adjusted time series. Thus, to understand the source of the century-scale increase in Atlantic TC counts in HURDAT, one must explain the relatively monotonic increase in very short-duration storms since the late nineteenth century. While it is possible that the recorded increase in short-duration TCs represents a real climate signal, the authors consider that it is more plausible that the increase arises primarily from improvements in the quantity and quality of observations, along with enhanced interpretation techniques. These have allowed National Hurricane Center forecasters to better monitor and detect initial TC formation, and thus incorporate increasing numbers of very short-lived systems into the TC database.

Relevance: 10.00%

Abstract:

Typeface design: a series of collaborative projects commissioned by Adobe, Inc. and Brill to develop extensive polytonic Greek typefaces. The two Adobe typefaces can be seen as an extension of previous research for the Garamond Premier Pro family (2005), and conclude a research theme started in 1998 with work for Adobe’s Minion Pro Greek. Together these typefaces define the state of the art for text-intensive Greek typesetting with wide character sets (from classical texts, to poetry, to essays, to prose). They serve both as exemplars for other developers and as vehicles for developing the potential of Greek text typography, for example with the parallel inclusion of monotonic and polytonic characters, detailed localised punctuation options, fluid handling of case-conversion issues, and innovative options such as accented small caps (originally requested by bibliographers, and subsequently rolled out to a general user base). The Brill typeface (for the established academic publisher) has an exceptionally wide character set to cover several academic disciplines, and is intended to differentiate sufficiently from its partner Latin typeface while maintaining a clear texture in both offset and low-resolution print-on-demand reproduction. This work involved substantial testing and modification of the design, especially of the diacritics, to maintain the readability of unfamiliar words. Altogether these typefaces form a study in how Greek typesetting can meet contemporary typographic requirements while resonating with historically accurate styles, where these are present. Significant research in printing archives helped to identify appropriate styles, as well as to originate variants that are stylistically coherent even when historical equivalents were absent.