26 results for Trigonometric interpolation

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

20.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding-algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding-algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding-algorithms using linear and cubic interpolation functions. 
It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of x-ray CT-images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
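The core idea above — interpolating the *integrated* data with a shape-preserving Hermite curve and differencing it on the target grid — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: SciPy's PCHIP interpolant stands in for the parametrized Hermitian curve (so there is no tuning parameter here), and all names are illustrative.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def conservative_regrid(edges_src, values, edges_dst):
    """Re-bin histogrammed data by interpolating its cumulative integral.

    Interpolating the antiderivative and differencing it on the target
    grid conserves the overall integral; a shape-preserving Hermite
    interpolant (PCHIP) avoids the over- and undershoots of high-order
    polynomials, so positive data stay positive.
    """
    # Cumulative integral evaluated at the source bin edges (starts at 0).
    cum = np.concatenate([[0.0], np.cumsum(values * np.diff(edges_src))])
    F = PchipInterpolator(edges_src, cum)
    # New bin contents = differences of the interpolated integral per width.
    return np.diff(F(edges_dst)) / np.diff(edges_dst)

src_edges = np.linspace(0.0, 1.0, 11)            # 10 source bins
src_vals = np.abs(np.sin(6 * src_edges[:-1])) + 0.1
dst_edges = np.linspace(0.0, 1.0, 26)            # 25 finer bins
dst_vals = conservative_regrid(src_edges, src_vals, dst_edges)

# Total integral is conserved and no negative values appear.
src_int = np.sum(src_vals * np.diff(src_edges))
dst_int = np.sum(dst_vals * np.diff(dst_edges))
```

Because the cumulative integral of positive data is monotone and PCHIP preserves monotonicity, the differenced result is non-negative by construction; the integral is conserved exactly whenever the target edges span the same interval as the source edges.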

Relevance:

20.00%

Publisher:

Abstract:

Purpose: Development of an interpolation algorithm for re‐sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re‐sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be continuously distributed by an analytic function considering the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high‐order polynomial interpolations, which do not fulfill all the above‐mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re‐distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second‐order polynomials. It is demonstrated that these interpolation functions may over‐ or underestimate the source data by about 10%–20%, while the parameter of the new algorithm can be adjusted in order to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re‐gridding a series of X‐ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re‐sampling algorithms using high‐order polynomial interpolation functions may result in significant artifacts in the re‐sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.

Relevance:

20.00%

Publisher:

Abstract:

The first part of this paper provides a comprehensive and self-contained account of the interrelationships between algebraic properties of varieties and properties of their free algebras and equational consequence relations. In particular, proofs are given of known equivalences between the amalgamation property and the Robinson property, the congruence extension property and the extension property, and the flat amalgamation property and the deductive interpolation property, as well as various dependencies between these properties. These relationships are then exploited in the second part of the paper in order to provide new proofs of amalgamation and deductive interpolation for the varieties of lattice-ordered abelian groups and MV-algebras, and to determine important subvarieties of residuated lattices where these properties hold or fail. In particular, a full description is given of all subvarieties of commutative GMV-algebras possessing the amalgamation property.
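For orientation, the deductive interpolation property mentioned above can be stated in its standard form (the notation here is generic, not the paper's): for the equational consequence relation $\models_{\mathcal V}$ of a variety $\mathcal V$,

$$
\Sigma \models_{\mathcal V} \varepsilon
\quad\Longrightarrow\quad
\exists\, \Pi \ \text{in the shared language of } \Sigma \text{ and } \varepsilon
\ \text{such that}\ \Sigma \models_{\mathcal V} \Pi \ \text{and}\ \Pi \models_{\mathcal V} \varepsilon,
$$

i.e. the consequence can always be factored through a set of equations $\Pi$ using only the variables common to premises and conclusion.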

Relevance:

20.00%

Publisher:

Abstract:

Stable oxygen isotope composition of atmospheric precipitation (δ18Op) was scrutinized from 39 stations distributed over Switzerland and its border zone. Monthly amount-weighted δ18Op values averaged over the 1995–2000 period showed the expected strong linear altitude dependence (−0.15 to −0.22‰ per 100 m) only during the summer season (May–September). Steeper gradients (~ −0.56 to −0.60‰ per 100 m) were observed for winter months over a low elevation belt, while hardly any altitudinal difference was seen for high elevation stations. This dichotomous pattern could be explained by the characteristically shallower vertical atmospheric mixing height during winter season and provides empirical evidence for recently simulated effects of stratified atmospheric flow on orographic precipitation isotopic ratios. This helps explain "anomalous" deflected altitudinal water isotope profiles reported from many other high relief regions. Grids and isotope distribution maps of the monthly δ18Op have been calculated over the study region for 1995–1996. The adopted interpolation method took into account both the variable mixing heights and the seasonal difference in the isotopic lapse rate and combined them with residual kriging. The presented data set allows a point estimation of δ18Op with monthly resolution. According to the test calculations executed on subsets, this biannual data set can be extended back to 1992 with maintained fidelity and, with a reduced station subset, even back to 1983 at the expense of faded reliability of the derived δ18Op estimates, mainly in the eastern part of Switzerland. Before 1983, reliable results can only be expected for the Swiss Plateau since important stations representing eastern and south-western Switzerland were not yet in operation.
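The interpolation strategy described — an altitudinal trend combined with residual interpolation — can be sketched as below. This is a toy illustration under stated assumptions: the station data are synthetic, a single least-squares lapse rate stands in for the seasonally varying, mixing-height-dependent gradients of the study, and inverse-distance weighting stands in for residual kriging; all names are hypothetical.

```python
import numpy as np

def detrended_residual_interp(xy, elev, d18o, xy_q, elev_q, p=2.0):
    """Estimate d18O at a query point: elevation trend + interpolated residual.

    Step 1: fit a linear altitudinal lapse rate by least squares.
    Step 2: interpolate the detrended residuals to the query location
            (inverse-distance weighting here, kriging in the study).
    Step 3: add the trend back at the query elevation.
    """
    A = np.column_stack([np.ones_like(elev), elev])
    coef, *_ = np.linalg.lstsq(A, d18o, rcond=None)   # intercept, lapse rate
    resid = d18o - A @ coef
    d = np.linalg.norm(xy - xy_q, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** p
    resid_q = np.sum(w * resid) / np.sum(w)
    return coef[0] + coef[1] * elev_q + resid_q

# Synthetic stations: d18O decreasing by ~0.2 permil per 100 m of elevation.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(30, 2))            # station coordinates, km
elev = rng.uniform(300, 3000, size=30)            # station elevations, m
d18o = -8.0 - 0.002 * elev + rng.normal(0, 0.2, size=30)

est = detrended_residual_interp(xy, elev, d18o, np.array([50.0, 50.0]), 1500.0)
```

Detrending before spatial interpolation is what lets the method honor the strong, season-dependent elevation signal instead of smearing it into the spatial weights.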

Relevance:

20.00%

Publisher:

Abstract:

Many observed time series of the global radiosonde or PILOT networks exist as fragments distributed over different archives. Identifying and merging these fragments can enhance their value for studies on the three-dimensional spatial structure of climate change. The Comprehensive Historical Upper-Air Network (CHUAN version 1.7), which was substantially extended in 2013, and the Integrated Global Radiosonde Archive (IGRA) are the most important collections of upper-air measurements taken before 1958. CHUAN (tracked) balloon data start in 1900, with higher numbers from the late 1920s onward, whereas IGRA data start in 1937. However, a substantial fraction of those measurements were not taken at synoptic times (preferably 00:00 or 12:00 GMT) and were recorded on altitude levels instead of standard pressure levels. To make them comparable with more recent data, the records have been brought to synoptic times and standard pressure levels using state-of-the-art interpolation techniques, employing geopotential information from the National Oceanic and Atmospheric Administration (NOAA) 20th Century Reanalysis (NOAA 20CR). From 1958 onward, the European Re-Analysis archives (ERA-40 and ERA-Interim) available at the European Centre for Medium-Range Weather Forecasts (ECMWF) are the main data sources. These are easier to use, but PILOT data still have to be interpolated to standard pressure levels. Fractions of the same records distributed over different archives have been merged, if necessary, taking care that the data remain traceable back to their original sources. If possible, station IDs assigned by the World Meteorological Organization (WMO) have been allocated to the station records. For some records which have never been identified by a WMO ID, a local ID above 100 000 has been assigned. The merged data set contains 37 wind records longer than 70 years and 139 temperature records longer than 60 years.
It can be seen as a useful basis for further data processing steps, most notably homogenization and gridding, after which it should be a valuable resource for climatological studies. Homogeneity adjustments for wind using the NOAA 20CR as a reference are described in Ramella Pralungo and Haimberger (2014). Reliable homogeneity adjustments for temperature beyond 1958 using a surface-data-only reanalysis such as the NOAA 20CR as a reference have yet to be created. All the archives and metadata files are available in ASCII and netCDF format in the PANGAEA archive.
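The interpolation of ascents to standard pressure levels can be sketched as follows. This is an illustrative stand-in, not the dataset's processing code: it interpolates linearly in log-pressure (a common convention for upper-air data), refuses to extrapolate beyond the observed range, and uses made-up observation values.

```python
import numpy as np

# A typical subset of WMO standard pressure levels, in hPa.
STD_LEVELS_HPA = np.array([1000, 925, 850, 700, 500, 400, 300, 250, 200, 150, 100])

def to_standard_levels(p_obs, t_obs, std=STD_LEVELS_HPA):
    """Interpolate one ascent to standard pressure levels, linear in log(p).

    Levels outside the observed pressure range are set to NaN rather
    than extrapolated.
    """
    order = np.argsort(p_obs)[::-1]                  # surface upward
    p_s, t_s = p_obs[order], t_obs[order]
    # np.interp needs increasing abscissae, so reverse to ascending log(p).
    return np.interp(np.log(std.astype(float)),
                     np.log(p_s)[::-1], t_s[::-1],
                     left=np.nan, right=np.nan)

# Hypothetical ascent: pressures in hPa, temperatures in K.
p_obs = np.array([1008.0, 900.0, 812.0, 640.0, 470.0, 305.0, 180.0])
t_obs = np.array([288.0, 281.0, 275.0, 263.0, 248.0, 228.0, 213.0])
t_std = to_standard_levels(p_obs, t_obs)
```

Here the 150 and 100 hPa levels fall above the highest observation and come back as NaN; everything within the observed range is filled.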

Relevance:

20.00%

Publisher:

Abstract:

Strict next-to-leading order (NLO) results for the dilepton production rate from a QCD plasma at temperatures above a few hundred MeV suffer from a breakdown of the loop expansion in the regime of soft invariant masses M² ≪ (πT)². In this regime an LPM resummation is needed for obtaining the correct leading-order result. We show how to construct an interpolation between the hard NLO and the leading-order LPM expression, which is theoretically consistent in both regimes and free from double counting. The final numerical results are presented in a tabulated form, suitable for insertion into hydrodynamical codes.
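The "free from double counting" requirement follows the standard pattern of matching two asymptotic regimes; schematically (this display illustrates the general construction, not the paper's exact formula),

$$
\Gamma_{\mathrm{interp}}(M) \;=\; \Gamma_{\mathrm{NLO}}(M) \;+\; \Gamma_{\mathrm{LPM}}(M) \;-\; \Gamma_{\mathrm{overlap}}(M),
$$

where $\Gamma_{\mathrm{overlap}}$ denotes the common limit of the two expressions (the soft limit of the NLO rate, equivalently the hard limit of the LPM-resummed rate), so that the shared contribution is counted exactly once in both regimes.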

Relevance:

20.00%

Publisher:

Abstract:

The main method of proving the Craig Interpolation Property (CIP) constructively uses cut-free sequent proof systems. Until now, however, no such method has been known for proving the CIP using more general sequent-like proof formalisms, such as hypersequents, nested sequents, and labelled sequents. In this paper, we start closing this gap by presenting an algorithm for proving the CIP for modal logics by induction on a nested-sequent derivation. This algorithm is applied to all the logics of the so-called modal cube.
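For reference, the property being proved is the standard Craig Interpolation Property (generic notation, not specific to this paper):

$$
\vdash A \to B
\quad\Longrightarrow\quad
\exists\, I \ \text{such that}\ \vdash A \to I,\ \ \vdash I \to B,\ \ \mathrm{var}(I) \subseteq \mathrm{var}(A) \cap \mathrm{var}(B),
$$

where $\mathrm{var}(\cdot)$ is the set of propositional variables. A constructive proof extracts the interpolant $I$ from a given derivation of $A \to B$ — here, by induction on a nested-sequent derivation.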

Relevance:

10.00%

Publisher:

Abstract:

The present chapter gives a comprehensive introduction to the display and quantitative characterization of scalp field data. After introducing the construction of scalp field maps, different interpolation methods, the effect of the recording reference, and the computation of spatial derivatives are discussed. The arguments raised in this first part have important implications for resolving a potential ambiguity in the interpretation of differences of scalp field data. In the second part of the chapter, different approaches for comparing scalp field data are described. All of these comparisons can be interpreted unambiguously in terms of differences of intracerebral sources, either in strength or in location and orientation. In the present chapter we refer only to scalp field potentials, but mapping can also be used to display other features, such as power or statistical values. However, the rules for comparing and interpreting scalp field potentials might not apply to such data.

Generic form of scalp field data

Electroencephalogram (EEG) and event-related potential (ERP) recordings consist of one value for each sample in time and for each electrode. The recorded EEG and ERP data thus represent a two-dimensional array, with one dimension corresponding to the variable "time" and the other dimension corresponding to the variable "space" or electrode. Table 2.1 shows ERP measurements over a brief time period. The ERP data (averaged over a group of healthy subjects) were recorded with 19 electrodes during a visual paradigm. The parietal midline Pz electrode has been used as the reference electrode.
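The effect of the recording reference mentioned above can be made concrete with a small sketch (synthetic data, illustrative names only): re-referencing a Pz-referenced electrodes-by-time array to the average reference shifts every channel, but leaves all between-electrode differences — and hence the map topography — unchanged.

```python
import numpy as np

def to_average_reference(data):
    """Re-reference an (electrodes x time) array to the average reference.

    Subtracting the instantaneous mean across electrodes changes each
    channel's absolute value but preserves every between-electrode
    difference at every time sample.
    """
    return data - data.mean(axis=0, keepdims=True)

rng = np.random.default_rng(1)
erp = rng.normal(size=(19, 100))     # 19 electrodes, 100 time samples, Pz-referenced
erp_avg = to_average_reference(erp)

# Differences between any two electrodes are reference-invariant.
diff_before = erp[3] - erp[7]
diff_after = erp_avg[3] - erp_avg[7]
```

This invariance is why map *differences* can be interpreted in terms of intracerebral sources without ambiguity, even though the values at individual electrodes depend on the chosen reference.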