973 results for discrete cosine transform


Relevance: 20.00%

Abstract:

The class of all Exponential-Polynomial-Trigonometric (EPT) functions is classical and equal to the Euler-d'Alembert class of solutions of linear differential equations with constant coefficients. The class of non-negative EPT functions defined on [0, ∞) was discussed in Hanzon and Holland (2010), of which EPT probability density functions are an important subclass. EPT functions can be represented as c·exp(Ax)·b, where A is a square matrix, b a column vector and c a row vector; the triple (A, b, c) is a minimal realization of the EPT function and is unique only up to a basis transformation. Here the class of 2-EPT probability density functions on R is defined and shown to be closed under a variety of operations. The class is also generalised to include mixtures with a point mass at zero; this class coincides with the class of probability density functions with rational characteristic functions. It is illustrated that the Variance Gamma density is a 2-EPT density under a parameter restriction. A discrete 2-EPT process is a process whose increments are stochastically independent 2-EPT random variables. It is shown that the distributions of the minimum and maximum of such a process are EPT densities mixed with a point mass at zero, and that the Laplace transforms of these distributions correspond to the discrete-time Wiener-Hopf factors of the discrete-time 2-EPT process. A distribution of daily log-returns, observed over the period 1931-2011 for a prominent US index, is approximated with a 2-EPT density function. Without the non-negativity condition, it is shown how this problem transforms into a discrete-time rational approximation problem, which is carried out with the rational approximation software RARL2. The non-negativity constraint is then imposed via a convex optimisation procedure after the unconstrained approximation. Necessary and sufficient conditions are derived to characterise infinitely divisible EPT and 2-EPT functions. Infinitely divisible 2-EPT density functions generate 2-EPT Lévy processes, and an asset's log-returns can be modelled as a 2-EPT Lévy process. Closed-form pricing formulae are then derived for European options with specific times to maturity, and formulae for discretely monitored Lookback options and 2-period Bermudan options are also provided. Certain Greeks of these options, including Delta and Gamma, are computed analytically. MATLAB scripts are provided for calculations involving 2-EPT functions, and numerical option pricing examples illustrate the effectiveness of the 2-EPT approach to financial modelling.
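The triple representation makes numerical evaluation of an EPT function straightforward. As a minimal Python sketch (an invented two-dimensional example, not the authors' MATLAB scripts):

```python
import numpy as np
from scipy.linalg import expm

def ept(x, A, b, c):
    """Evaluate an EPT function f(x) = c · exp(A x) · b for x >= 0."""
    return float(c @ expm(A * x) @ b)

# Illustrative triple: f(x) = 2 e^{-x} - 2 e^{-2x}, a non-negative
# EPT density on [0, inf) that integrates to 1.
A = np.array([[-1.0,  0.0],
              [ 0.0, -2.0]])
b = np.array([2.0, -2.0])
c = np.array([1.0,  1.0])

xs = np.linspace(0.0, 10.0, 2001)
vals = np.array([ept(x, A, b, c) for x in xs])
dx = xs[1] - xs[0]
print("min value:", vals.min())          # non-negative on the grid
print("integral ≈", vals.sum() * dx)     # ≈ 1, so f is a density
```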

Relevance: 20.00%

Abstract:

A novel hybrid data-driven approach is developed for forecasting power system parameters, with the goal of increasing the efficiency of short-term forecasting studies for non-stationary time series. The proposed approach is based on mode decomposition and a feature analysis of initial retrospective data using the Hilbert-Huang transform and machine learning algorithms. The random forests and gradient boosting trees learning techniques were examined; these decision tree techniques were used to rank the importance of the variables employed in the forecasting models, with the Mean Decrease Gini index employed as the impurity function. The resulting hybrid forecasting models employ the radial basis function neural network and support vector regression. Apart from the introduction and references, the paper is organized as follows. The second section presents the background and a review of several approaches to short-term forecasting of power system parameters. In the third section, a hybrid machine learning-based algorithm using the Hilbert-Huang transform is developed for short-term forecasting of power system parameters. The fourth section describes the decision tree learning algorithms used to rank variable importance. Finally, the experimental results are presented for the following electric power problems: active power flow forecasting, electricity price forecasting, and wind speed and direction forecasting.
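The variable-ranking step can be sketched with off-the-shelf tools. A minimal example using scikit-learn's impurity-based importances (the synthetic data and feature names below are placeholders, not the paper's dataset; for regression trees the measure is Mean Decrease in Impurity, the analogue of Mean Decrease Gini):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Placeholder retrospective features (not the paper's data): lagged load,
# temperature, wind speed, and a pure-noise column for comparison.
n = 2000
X = rng.normal(size=(n, 4))
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(scale=0.5, size=n)

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Rank variables by Mean Decrease in Impurity, averaged over the forest.
names = ["lagged_load", "temperature", "wind_speed", "noise"]
for name, score in sorted(zip(names, forest.feature_importances_),
                          key=lambda t: -t[1]):
    print(f"{name:12s} {score:.3f}")
```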

Relevance: 20.00%

Abstract:

We estimate a carbon mitigation cost curve for the U.S. commercial sector based on econometric estimation of the responsiveness of fuel demand and equipment choices to energy price changes. The model econometrically estimates fuel demand conditional on fuel choice, which is characterized by a multinomial logit model. Separate estimation of end uses (e.g., heating, cooking) using the U.S. Commercial Buildings Energy Consumption Survey allows for exceptionally detailed estimation of price responsiveness disaggregated by end use and fuel type. We then construct aggregate long-run elasticities, by fuel type, through a series of simulations; own-price elasticities range from -0.9 for district heat services to -2.9 for fuel oil. The simulations form the basis of a marginal cost curve for carbon mitigation, which suggests that a price of $20 per ton of carbon would result in an 8% reduction in commercial carbon emissions, and a price of $100 per ton would result in a 28% reduction.
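The fuel-choice stage of such a model can be sketched with standard tools. A hedged toy example using statsmodels' MNLogit (the variables and simulated data are illustrative placeholders, not the paper's CBECS specification):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Placeholder data (not CBECS): each building chooses among three fuels
# (0 = electricity, 1 = natural gas, 2 = fuel oil) as relative prices vary.
n = 3000
log_price_gas = rng.normal(size=n)
log_price_oil = rng.normal(size=n)
X = sm.add_constant(np.column_stack([log_price_gas, log_price_oil]))

# Simulate choices from a known logit so the fit has something to recover.
utils = np.column_stack([np.zeros(n),
                         1.0 - 1.5 * log_price_gas,
                         0.5 - 1.5 * log_price_oil])
probs = np.exp(utils) / np.exp(utils).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in probs])

fit = sm.MNLogit(y, X).fit(disp=0)
print(fit.summary())   # own-price coefficients should come out near -1.5
```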

Relevance: 20.00%

Abstract:

Within industrial automation systems, three-dimensional (3-D) vision provides very useful feedback information for autonomous operation of various manufacturing equipment (e.g., industrial robots, material handling devices, assembly systems, and machine tools). The hardware performance of contemporary 3-D scanning devices is suitable for online utilization. However, the bottleneck is the lack of real-time algorithms for recognition of geometric primitives (e.g., planes and natural quadrics) from a scanned point cloud. One of the most important and most frequent geometric primitives in various engineering tasks is the plane. In this paper, we propose a new fast one-pass algorithm for recognition (segmentation and fitting) of planar segments from a point cloud. To effectively segment planar regions, we exploit the orthogonality of certain wavelets to polynomial functions, as well as their sensitivity to abrupt changes. After segmentation of planar regions, we estimate the parameters of the corresponding planes using standard fitting procedures. For point cloud structuring, a z-buffer algorithm with mesh-triangle representation in barycentric coordinates is employed. The proposed recognition method is tested and experimentally validated in several real-world case studies.
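The paper's one-pass wavelet segmentation is not reproduced here, but the "standard fitting procedure" applied to each segmented region can be illustrated. A minimal total-least-squares plane fit via the SVD (the sample data are invented):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point set: returns (centroid, unit normal).

    The normal is the right singular vector for the smallest singular value
    of the centred point matrix -- the standard total-least-squares fit.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

# Noisy samples from the plane z = 0.2 x - 0.5 y + 3.
rng = np.random.default_rng(2)
xy = rng.uniform(-1, 1, size=(500, 2))
z = 0.2 * xy[:, 0] - 0.5 * xy[:, 1] + 3 + rng.normal(scale=0.01, size=500)
pts = np.column_stack([xy, z])

centre, normal = fit_plane(pts)
print("normal ~", normal / normal[2])   # ≈ (-0.2, 0.5, 1) up to sign and scale
```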

Relevance: 20.00%

Abstract:

The tomography problem is investigated when the available projections are restricted to a limited angular domain. It is shown that a previous algorithm proposed for extrapolating the data to the missing cone in Fourier space is unstable in the presence of noise because of the ill-posedness of the problem. A regularized algorithm is proposed, which converges to stable solutions. The efficiency of both algorithms is tested by means of numerical simulations.
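The instability described here is easy to reproduce on any severely ill-conditioned linear model. A hedged toy demonstration (a smooth-kernel operator standing in for the missing-cone problem, not the paper's algorithm) of how a tiny data perturbation destroys the naive solution while a Tikhonov-regularised one remains stable:

```python
import numpy as np

rng = np.random.default_rng(3)

# An ill-conditioned smooth forward operator: its singular values decay
# rapidly, so inversion amplifies noise enormously.
n = 12
s = np.linspace(0, 1, n)
A = np.exp(-10 * (s[:, None] - s[None, :]) ** 2)
x_true = np.sin(2 * np.pi * s)
y_noisy = A @ x_true + 1e-6 * rng.normal(size=n)   # tiny data error

x_naive = np.linalg.solve(A, y_noisy)              # error blows up
lam = 1e-6
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y_noisy)  # Tikhonov

print("naive error:      ", np.linalg.norm(x_naive - x_true))
print("regularised error:", np.linalg.norm(x_reg - x_true))
```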

Relevance: 20.00%

Abstract:

For Pt. I see ibid., vol. 1, p. 301 (1985). In the first part of this work, a general definition of an inverse problem with discrete data was given and an analysis in terms of singular systems was performed. The problem of the numerical stability of the solution, which in that paper was only briefly discussed, is the main topic of this second part. When the condition number of the problem is too large, a small error in the data can produce an extremely large error in the generalised solution, which therefore has no physical meaning. The authors review most of the methods which have been developed for overcoming this difficulty, including numerical filtering, Tikhonov regularisation, iterative methods, the Backus-Gilbert method, and so on. Regularisation methods for the stable approximation of generalised solutions obtained through minimisation of suitable seminorms (C-generalised solutions), such as the method of Phillips (1962), are also considered.
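For orientation, Tikhonov regularisation and numerical filtering act on the same singular system {σ_i; u_i, v_i} of the operator A; the following is standard material, stated here as background rather than quoted from the paper:

```latex
x_\lambda = \arg\min_x \left\{ \|Ax - y\|^2 + \lambda \|x\|^2 \right\}
          = (A^* A + \lambda I)^{-1} A^* y
          = \sum_i \frac{\sigma_i^2}{\sigma_i^2 + \lambda}\,
            \frac{\langle y, u_i \rangle}{\sigma_i}\, v_i .
```

Components with σ_i² ≫ λ are retained and those with σ_i² ≪ λ are damped; replacing the smooth filter factors with a hard cut-off on σ_i gives the truncated-SVD form of numerical filtering mentioned in the abstract.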


Relevance: 20.00%

Abstract:

The present work uses the discrete element method (DEM) to describe assemblies of particulate bulk materials. Working numerical descriptions of entire processes using this scheme are infeasible because of the very large number of elements (10^12 or more in a moderately sized industrial silo). However, it is possible to capture much of the essential bulk mechanics through selective DEM on important regions of an assembly, thereafter using the information in continuum numerical descriptions of particulate processes. The continuum numerical model uses population balances of the various components in bulk solid mixtures. It depends on constitutive relationships for the internal transfer, creation and/or destruction of components within the mixture. In this paper we show the means of generating such relationships for two important flow phenomena: segregation, whereby particles differing in some important property (often size) separate into discrete phases, and degradation, whereby particles break into sub-elements through impact on each other or shearing. We perform DEM simulations under a range of representative conditions, extracting the important parameters for the relevant transfer, creation and/or destruction of particles in certain classes within the assembly over time. Continuum predictions of segregation and degradation using this scheme are currently being successfully validated against bulk experimental data and are beginning to be used in schemes to improve the design and operation of bulk solids process plants.
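The element-level update at the core of a DEM simulation is compact enough to sketch. A toy one-dimensional spring-dashpot contact between two particles (all parameters are invented for illustration, not taken from the paper):

```python
import numpy as np

# Minimal 1-D DEM sketch: two spherical particles on a line with a linear
# spring-dashpot normal contact, integrated by semi-implicit Euler steps.
k, eta = 1e5, 5.0          # contact stiffness [N/m] and damping [N·s/m] (assumed)
r, m = 0.01, 0.004         # particle radius [m] and mass [kg] (assumed)
dt = 1e-6                  # time step, well below the contact oscillation period

x = np.array([0.0, 0.05])  # centre positions
v = np.array([1.0, -1.0])  # particles approach each other at 1 m/s each

for step in range(200_000):
    gap = (x[1] - x[0]) - 2 * r             # negative gap = overlap
    f = 0.0
    if gap < 0:                             # particles in contact
        f = -k * gap - eta * (v[1] - v[0])  # repulsive spring + dashpot force
    a = np.array([-f / m, f / m])           # equal and opposite accelerations
    v += a * dt
    x += v * dt

print("post-collision velocities:", v)      # slower than 1 m/s: dashpot dissipated energy
```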

Relevance: 20.00%

Abstract:

The solution process for diffusion problems usually involves the time development separately from the space solution. A finite difference algorithm in time requires a sequential time development in which all previous values must be determined prior to the current value. The Stehfest Laplace transform algorithm, however, allows time solutions without knowledge of prior values. It is therefore of interest to develop a time-domain decomposition suitable for implementation in a parallel environment. One such possibility is to use the Laplace transform to develop coarse-grained solutions which act as the initial values for a set of fine-grained solutions. The independence of the Laplace transform solutions means that we do indeed have a time-domain decomposition process. Any suitable time solver can be used for the fine-grained solution. To illustrate the technique, we use an Euler solver in time together with the dual reciprocity boundary element method for the space solution.
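The inversion step is short enough to state in full. A self-contained sketch of the Gaver-Stehfest algorithm (standard weights; this is not code from the paper), tested on a transform with a known inverse:

```python
from math import exp, factorial, log

def stehfest_weights(N):
    """Gaver-Stehfest weights V_1..V_N for even N."""
    M = N // 2
    V = []
    for i in range(1, N + 1):
        s = 0.0
        for k in range((i + 1) // 2, min(i, M) + 1):
            s += (k ** M * factorial(2 * k) /
                  (factorial(M - k) * factorial(k) * factorial(k - 1) *
                   factorial(i - k) * factorial(2 * k - i)))
        V.append((-1) ** (M + i) * s)
    return V

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s), with no earlier time levels."""
    ln2 = log(2.0)
    V = stehfest_weights(N)
    return (ln2 / t) * sum(V[i - 1] * F(i * ln2 / t) for i in range(1, N + 1))

# Test on F(s) = 1/(s + 1), whose inverse transform is exp(-t).
for t in (0.5, 1.0, 2.0):
    print(t, stehfest_invert(lambda s: 1.0 / (s + 1.0), t), exp(-t))
```

Because each evaluation of f(t) is independent, the coarse-grained time levels can be computed in parallel, which is exactly what makes the time-domain decomposition possible.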

Relevance: 20.00%

Abstract:

The uptake and diffusion of solvents across polymer membranes is important in controlled drug delivery, in drug uptake into, for example, infusion bags and containers, and in transport across protective clothing. Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) spectroscopy has been used to monitor the effects of different solvents on the diffusion of a model compound, 4-cyanophenol (CNP), across silicone membrane and on the equilibrium concentration of CNP obtained in the membrane following diffusion. ATR-FTIR spectroscopic imaging of membrane diffusion was used to establish when the boundary conditions applied to Fick's second law, used to model the diffusion of permeants across the silicone membrane, do not hold. The imaging experiments indicated that when the solvent was not taken up appreciably into the membrane, the presence of discrete solvent pools between the ATR crystal and the silicone membrane can affect the diffusion profile of the permeant; this effect is more significant if the permeant has a high solubility in the solvent. In contrast, solvents that are taken up into the membrane to a greater extent, or those in which the solubility of the permeant is relatively low, were found to show a good fit to the diffusion model. As such, these systems allow the ATR-FTIR spectroscopic approach to give mechanistic insight into how particular solvents enhance permeation. The solubility of CNP in the solvent and the uptake of the solvent into the membrane were found to be important influences on the equilibrium concentration of the permeant obtained in the membrane following diffusion. In general, solvents which were taken up to a significant extent into the membrane, and which caused the membrane to swell, increased the diffusion coefficient of the permeant in the membrane, though other factors such as solvent viscosity may also be important.
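For orientation, the diffusion model in question is Fick's second law with the boundary conditions typical of ATR-FTIR membrane experiments; the formulation below is standard and is stated as an assumption about the setup (membrane of thickness L resting on the impermeable ATR crystal at x = 0, donor vehicle holding the face x = L at constant concentration), not quoted from the paper:

```latex
% Fick's second law in the membrane, 0 <= x <= L (x = 0 at the ATR crystal):
\frac{\partial C}{\partial t} = D\,\frac{\partial^2 C}{\partial x^2},
\qquad C(x,0)=0,\quad C(L,t)=C_{\mathrm{eq}},\quad
\left.\frac{\partial C}{\partial x}\right|_{x=0}=0.
% Concentration at the crystal face, from the standard series solution:
\frac{C(0,t)}{C_{\mathrm{eq}}}
  = 1-\frac{4}{\pi}\sum_{n=0}^{\infty}\frac{(-1)^{n}}{2n+1}
    \exp\!\left(-\frac{D\,(2n+1)^{2}\pi^{2}\,t}{4L^{2}}\right).
```

Solvent pools between the crystal and the membrane violate the zero-flux condition at x = 0, which is why the measured uptake profile then departs from this model.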

Relevance: 20.00%

Abstract:

Species size distributions for metazoan benthic invertebrates conform to a highly conservative bimodal pattern, regardless of the sieve mesh sizes or the number of sieves used in their extraction. This pattern is not an artefact of sampling a size continuum, as suggested by the computer simulations using just two fixed mesh sizes in Bett (2013; Mar Ecol Prog Ser 487:1-6). The meiobenthos and macrobenthos are coherent entities, each with a distinct suite of functional attributes, and should not be regarded as a single unit for ecological modelling purposes.