72 results for Minimum Entropy Deconvolution


Relevance:

20.00%

Publisher:

Abstract:

Internal risk management models of the kind popularized by J. P. Morgan are now used widely by the world’s most sophisticated financial institutions as a means of measuring risk. Using the returns on three of the most popular futures contracts on the London International Financial Futures Exchange, in this paper we investigate the possibility of using multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models for the calculation of minimum capital risk requirements (MCRRs). We propose a method for the estimation of the value at risk of a portfolio based on a multivariate GARCH model. We find that the consideration of the correlation between the contracts can lead to more accurate, and therefore more appropriate, MCRRs compared with the values obtained from a univariate approach to the problem.
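A minimal numerical sketch of the multivariate idea (all parameters hypothetical, not the paper's LIFFE estimates): two GARCH(1,1) return series linked by a constant conditional correlation, with portfolio value at risk taken as an empirical quantile. Ignoring the cross-contract correlation understates the risk of a positively correlated portfolio.

```python
import numpy as np

def simulate_ccc_garch(n, omega, alpha, beta, rho, seed=0):
    """Simulate two return series, each GARCH(1,1), linked by a
    constant conditional correlation rho (hypothetical parameters)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    h = np.full(2, omega / (1.0 - alpha - beta))   # unconditional variance
    r = np.zeros((n, 2))
    for t in range(n):
        z = L @ rng.standard_normal(2)             # correlated shocks
        r[t] = np.sqrt(h) * z
        h = omega + alpha * r[t] ** 2 + beta * h   # GARCH(1,1) recursion
    return r

def portfolio_var(returns, weights, level=0.95):
    """Portfolio value at risk: the empirical lower-tail quantile of
    the weighted portfolio return, reported as a positive number."""
    return -np.quantile(returns @ weights, 1.0 - level)

r = simulate_ccc_garch(5000, omega=0.05, alpha=0.05, beta=0.90, rho=0.6)
w = np.array([0.5, 0.5])
var_multivariate = portfolio_var(r, w)

# A univariate-style calculation that ignores the cross-contract
# correlation (mimicked here by shuffling one series) understates the
# risk of this positively correlated portfolio.
r_indep = np.column_stack([r[:, 0],
                           np.random.default_rng(1).permutation(r[:, 1])])
var_univariate = portfolio_var(r_indep, w)
```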

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the frequency of extreme events for three LIFFE futures contracts for the calculation of minimum capital risk requirements (MCRRs). We propose a semiparametric approach where the tails are modelled by the Generalized Pareto Distribution and smaller risks are captured by the empirical distribution function. We compare the capital requirements from this approach with those calculated from the unconditional density and from a conditional density - a GARCH(1,1) model. Our primary finding is that, both in-sample and for a hold-out sample, our extreme value approach yields results superior to those of the other two models, which do not explicitly model the tails of the return distribution. Since the use of these internal models will be permitted under the EC-CAD II, they could be widely adopted in the near future for determining capital adequacies. Hence, close scrutiny of competing models is required to avoid a potentially costly misallocation of capital resources while at the same time ensuring the safety of the financial system.
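A sketch of the semiparametric construction on toy data (method-of-moments GPD fit and threshold choice are illustrative simplifications, not the paper's estimation procedure): the body of the loss distribution is handled by the empirical quantile, the tail by a Generalized Pareto fit to threshold excesses.

```python
import numpy as np

def gpd_moment_fit(excesses):
    """Method-of-moments estimates of the GPD shape (xi) and scale (beta)
    from threshold excesses (valid for xi < 1/2)."""
    m, s2 = excesses.mean(), excesses.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / s2)
    beta = 0.5 * m * (m * m / s2 + 1.0)
    return xi, beta

def semiparametric_var(losses, level, tail_frac=0.1):
    """VaR of the loss distribution: empirical quantile in the body,
    GPD fitted to excesses over a high threshold in the tail."""
    n = len(losses)
    u = np.quantile(losses, 1.0 - tail_frac)      # tail threshold
    exc = losses[losses > u] - u
    k = len(exc)
    if level <= 1.0 - k / n:                      # body: empirical quantile
        return np.quantile(losses, level)
    xi, beta = gpd_moment_fit(exc)
    # GPD tail quantile (Pickands-Balkema-de Haan form)
    return u + (beta / xi) * (((n / k) * (1.0 - level)) ** (-xi) - 1.0)

rng = np.random.default_rng(42)
losses = rng.standard_t(df=4, size=10000)         # fat-tailed toy losses
var99 = semiparametric_var(losses, 0.99)
var999 = semiparametric_var(losses, 0.999)
```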

Relevance:

20.00%

Publisher:

Abstract:

The recent low and prolonged minimum of the solar cycle, along with the slow growth in activity of the new cycle, has led to suggestions that the Sun is entering a Grand Solar Minimum (GSMi), potentially as deep as the Maunder Minimum (MM). This raises questions about the persistence and predictability of solar activity. We study the autocorrelation functions and predictability R^2_L(t) of solar indices, particularly group sunspot number R_G and heliospheric modulation potential phi for which we have data during the descent into the MM. For R_G and phi, R^2_L(t) > 0.5 for times into the future of t = 4 and 3 solar cycles, respectively: sufficient to allow prediction of a GSMi onset. The lower predictability of sunspot number R_Z is discussed. The current declines in peak and mean R_G are the largest since the onset of the MM and exceed those around 1800 which failed to initiate a GSMi.
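The two diagnostics used above can be illustrated on a synthetic index (a noisy 11-step cycle standing in for group sunspot number; the data and the simple lag-regression form of R^2_L are assumptions for illustration):

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function of a 1-D series."""
    x = np.asarray(x, float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

def lag_r2(x, lag):
    """R^2 of predicting x(t+lag) from x(t) by linear regression --
    a simple analogue of the predictability measure R^2_L(t)."""
    past, future = x[:-lag], x[lag:]
    r = np.corrcoef(past, future)[0, 1]
    return r * r

# Toy 'solar index': a noisy ~11-step cycle (hypothetical data)
t = np.arange(600)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * t / 11) + 0.2 * rng.standard_normal(600)

rho = acf(x, 22)
r2_one_cycle = lag_r2(x, 11)   # predictability one "cycle" ahead
```

High autocorrelation at whole-cycle lags is what makes multi-cycle prediction of a grand-minimum onset feasible.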

Relevance:

20.00%

Publisher:

Abstract:

The objective of this paper is to reconsider the Maximum Entropy Production conjecture (MEP) in the context of a very simple two-dimensional zonal-vertical climate model able to represent the total material entropy production due at the same time to both horizontal and vertical heat fluxes. MEP is applied first to a simple four-box model of climate which accounts for both horizontal and vertical material heat fluxes. It is shown that, under the condition of fixed insolation, a MEP solution is found with reasonably realistic temperature and heat fluxes, thus generalising results from independent two-box horizontal or vertical models. It is also shown that the meridional and the vertical entropy production terms are independently involved in the maximisation and thus MEP can be applied to each subsystem with fixed boundary conditions. We then extend the four-box model by increasing its resolution, and compare it with GCM output. A MEP solution is found which is fairly realistic as far as the horizontal large-scale organisation of the climate is concerned, whereas the vertical structure appears unrealistic and presents seriously unstable features. This study suggests that the thermal meridional structure of the atmosphere is predicted fairly well by MEP once the insolation is given, but the vertical structure of the atmosphere cannot be predicted satisfactorily by MEP unless constraints are imposed to represent the determination of longwave absorption by water vapour and clouds as a function of the state of the climate. Furthermore, an order-of-magnitude estimate of contributions to the material entropy production due to horizontal and vertical processes within the climate system is provided by using two different methods. In both cases we found that approximately 40 mW m^-2 K^-1 of material entropy production is due to vertical heat transport and 5-7 mW m^-2 K^-1 to horizontal heat transport.
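The MEP selection principle can be sketched with a classic two-box toy (not the paper's four-box model; the Budyko-type radiation constants and insolation values below are illustrative assumptions): each box balances absorbed solar input, linearised longwave cooling A + B*T, and a meridional flux F, and MEP picks the F that maximises the material entropy production.

```python
import numpy as np

# Hypothetical constants: Budyko-type longwave cooling A + B*T and
# absorbed insolation for a warm (low-latitude) and cold (high-latitude) box.
A, B = 203.3, 2.09            # W m^-2 and W m^-2 K^-1
S_hot, S_cold = 300.0, 170.0  # absorbed insolation, W m^-2

def temperatures(F):
    """Box temperatures (deg C) from energy balance given flux F (W m^-2)."""
    T_hot = (S_hot - F - A) / B
    T_cold = (S_cold + F - A) / B
    return T_hot, T_cold

def entropy_production(F):
    """Material entropy production of the meridional transport,
    sigma = F * (1/T_cold - 1/T_hot), temperatures in kelvin."""
    T_hot, T_cold = temperatures(F)
    return F * (1.0 / (T_cold + 273.15) - 1.0 / (T_hot + 273.15))

# Grid search over the flux: sigma vanishes at F = 0 and where the
# boxes reach equal temperature, so the MEP state is interior.
F_grid = np.linspace(0.0, 65.0, 6501)
sigma = np.array([entropy_production(F) for F in F_grid])
F_mep = F_grid[np.argmax(sigma)]
sigma_max = sigma.max()       # order 10 mW m^-2 K^-1 for these numbers
```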

Relevance:

20.00%

Publisher:

Abstract:

We present an outlook on the climate system thermodynamics. First, we construct an equivalent Carnot engine with an associated efficiency and frame the Lorenz energy cycle in a macroscale thermodynamic context. Then, by exploiting the second law, we prove that the lower bound to the entropy production is proportional to the integrated absolute value of the internal entropy fluctuations. An exergetic interpretation is also proposed. Finally, the controversial maximum entropy production principle is reinterpreted as requiring the joint optimization of heat transport and mechanical work production. These results provide tools for climate change analysis and for the validation of climate models.
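The "equivalent Carnot engine" bookkeeping can be sketched with illustrative numbers (the effective temperatures and heat input below are assumptions, not the paper's values): heat absorbed at a warm effective temperature and released at a cold one bounds the mechanical work output, and at the Carnot limit the entropy received and discharged balance exactly.

```python
# Hypothetical effective temperatures of the warm and cold reservoirs (K)
# and heat input to the warm pool (W m^-2).
Theta_plus, Theta_minus = 265.0, 245.0
Phi_plus = 120.0

eta = 1.0 - Theta_minus / Theta_plus   # equivalent Carnot efficiency
W_max = eta * Phi_plus                 # upper bound on mechanical work output

# Entropy budget at the Carnot (reversible) limit: the entropy carried
# out with the discharged heat equals the entropy carried in, so the
# internal entropy production is zero; any irreversibility (W < W_max)
# makes the discharged entropy strictly larger.
S_in = Phi_plus / Theta_plus
S_out = (Phi_plus - W_max) / Theta_minus
```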

Relevance:

20.00%

Publisher:

Abstract:

We study the regularization problem for linear, constant coefficient descriptor systems Ex' = Ax+Bu, y1 = Cx, y2 = Γx' by proportional and derivative mixed output feedback. Necessary and sufficient conditions are given, which guarantee that there exist output feedbacks such that the closed-loop system is regular, has index at most one and E+BGΓ has a desired rank, i.e., there is a desired number of differential and algebraic equations. To resolve the freedom in the choice of the feedback matrices we then discuss how to obtain the desired regularizing feedback of minimum norm and show that this approach leads to useful results in the sense of robustness only if the rank of E is decreased. Numerical procedures are derived to construct the desired feedback gains. These numerical procedures are based on orthogonal matrix transformations which can be implemented in a numerically stable way.
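A small numerical illustration of the regularization effect (the matrices, the rank-one derivative feedback, and the randomized regularity test are all hypothetical, not the paper's construction): a singular E makes the open-loop system a genuine descriptor system, while a derivative output feedback u = -G y2 replaces E by E + B G Gamma, which here becomes nonsingular, so the closed-loop pencil is regular with index zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy descriptor system E x' = A x + B u (hypothetical matrices):
# E is singular, so the open-loop system carries an algebraic constraint.
E = np.diag([1.0, 1.0, 0.0])
A = rng.standard_normal((3, 3))
B = np.array([[0.0], [0.0], [1.0]])
Gamma = np.array([[0.0, 0.0, 1.0]])    # derivative output y2 = Gamma x'

def is_regular(E, A, trials=8):
    """A pencil (E, A) is regular iff det(lambda*E - A) is not
    identically zero; probe it at a few random lambda values."""
    rng = np.random.default_rng(1)
    return any(abs(np.linalg.det(lam * E - A)) > 1e-10
               for lam in rng.standard_normal(trials))

# Derivative feedback u = -G y2 turns the pencil into (E + B G Gamma, A).
G = np.array([[1.0]])
E_closed = E + B @ G @ Gamma           # here E_closed is nonsingular,
                                       # so the closed loop is a pure ODE
rank_open = np.linalg.matrix_rank(E)
rank_closed = np.linalg.matrix_rank(E_closed)
```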

Relevance:

20.00%

Publisher:

Abstract:

Open solar flux (OSF) variations can be described by the imbalance between source and loss terms. We use spacecraft and geomagnetic observations of OSF from 1868 to the present and assume the OSF source, S, varies with the observed sunspot number, R. Computing the required fractional OSF loss, χ, reveals a clear solar cycle variation, in approximate phase with R. While peak R varies significantly from cycle to cycle, χ is surprisingly constant in both amplitude and waveform. Comparisons of χ with measures of heliospheric current sheet (HCS) orientation reveal a strong correlation. The cyclic nature of χ is exploited to reconstruct OSF back to the start of sunspot records in 1610. This agrees well with the available spacecraft, geomagnetic, and cosmogenic isotope observations. Assuming S is proportional to R yields near-zero OSF throughout the Maunder Minimum. However, χ becomes negative during periods of low R, particularly the most recent solar minimum, meaning OSF production is underestimated. This is related to continued coronal mass ejection (CME) activity, and therefore OSF production, throughout solar minimum, despite R falling to zero. Correcting S for this produces a better match to the recent solar minimum OSF observations. It also results in a cycling, nonzero OSF during the Maunder Minimum, in agreement with cosmogenic isotope observations. These results suggest that during the Maunder Minimum, HCS tilt cycled as it has over recent solar cycles, and the CME rate was roughly constant at the levels measured during the most recent two solar minima.
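The source-loss balance can be sketched as a simple continuity model (all coefficients, units, and the toy sunspot record below are hypothetical): marching OSF(t+1) = OSF(t) + S(t) - χ(t)·OSF(t), with a floor term in the source standing in for the CME activity that persists when R falls to zero.

```python
import numpy as np

def osf_model(R, chi, S_per_spot=0.25, S_floor=0.5, osf0=8.0):
    """March open solar flux forward under a source-loss balance:
    OSF(t+1) = OSF(t) + S(t) - chi(t) * OSF(t).
    The source S scales with sunspot number R plus a floor term for
    CME activity that continues at R = 0 (hypothetical numbers,
    arbitrary flux units per time step)."""
    osf = [osf0]
    for r, x in zip(R, chi):
        source = S_per_spot * r + S_floor
        osf.append(osf[-1] + source - x * osf[-1])
    return np.array(osf)

# Toy sunspot record: two 11-step "cycles", then a grand-minimum tail.
t = np.arange(33)
R = np.where(t < 22, 80.0 * np.abs(np.sin(np.pi * t / 11)), 0.0)
chi = np.full_like(R, 0.12)            # fractional loss per step

osf = osf_model(R, chi)
osf_no_floor = osf_model(R, chi, S_floor=0.0)
# Without the floor the flux decays toward zero once R = 0; with it,
# the grand-minimum OSF settles near S_floor / chi instead.
```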

Relevance:

20.00%

Publisher:

Abstract:

We study the empirical performance of the classical minimum-variance hedging strategy, comparing several econometric models for estimating hedge ratios of crude oil, gasoline and heating oil crack spreads. Given the great variability and large jumps in both spot and futures prices, considerable care is required when processing the relevant data and accounting for the costs of maintaining and re-balancing the hedge position. We find that the variance reduction produced by all models is statistically and economically indistinguishable from the one-for-one “naïve” hedge. However, minimum-variance hedging models, especially those based on GARCH, generate much greater margin and transaction costs than the naïve hedge. Therefore we encourage hedgers to use a naïve hedging strategy on the crack spread bundles now offered by the exchange; this strategy is the cheapest and easiest to implement. Our conclusion contradicts the majority of the existing literature, which favours the implementation of GARCH-based hedging strategies.
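The classical minimum-variance hedge at the heart of the comparison is a one-line estimator (the correlated toy data below are hypothetical, not crack-spread prices): the OLS hedge ratio h* = Cov(Δs, Δf)/Var(Δf), evaluated against the one-for-one naive hedge by the fraction of spot variance removed.

```python
import numpy as np

def min_variance_hedge_ratio(spot_changes, futures_changes):
    """Classical OLS hedge ratio h* = Cov(ds, df) / Var(df)."""
    cov = np.cov(spot_changes, futures_changes, ddof=1)
    return cov[0, 1] / cov[1, 1]

def variance_reduction(spot_changes, futures_changes, h):
    """Fraction of spot variance removed by shorting h futures."""
    hedged = spot_changes - h * futures_changes
    return 1.0 - hedged.var(ddof=1) / spot_changes.var(ddof=1)

# Toy correlated spot/futures price changes (hypothetical data).
rng = np.random.default_rng(7)
f = rng.standard_normal(2000)
s = 0.95 * f + 0.3 * rng.standard_normal(2000)

h_star = min_variance_hedge_ratio(s, f)
vr_ols = variance_reduction(s, f, h_star)
vr_naive = variance_reduction(s, f, 1.0)   # one-for-one "naive" hedge
# vr_ols and vr_naive land very close together, the kind of
# indistinguishability the paper reports for crack spreads.
```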

Relevance:

20.00%

Publisher:

Abstract:

The ability to retrieve information from different layers within a stratified sample using terahertz pulsed reflection imaging and spectroscopy has traditionally been resolution limited by the pulse width available. In this paper, a deconvolution algorithm is presented which circumvents this resolution limit, enabling deep sub-wavelength and sub-pulse width depth resolution. The algorithm is explained through theoretical investigation, and demonstrated by reconstructing signals reflected from boundaries in stratified materials that cannot be resolved directly from the unprocessed time-domain reflection signal. Furthermore, the deconvolution technique has been used to recreate sub-surface images from a stratified sample: imaging the reverse side of a piece of paper.
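A generic frequency-domain sketch of the resolution-recovery idea (this is a standard Wiener/Tikhonov-damped deconvolution on synthetic 1-D data, not the paper's algorithm): two boundary reflections spaced closer than the pulse width merge in the measured trace, but dividing out the reference pulse spectrum separates them again.

```python
import numpy as np

def wiener_deconvolve(measured, pulse, eps=1e-3):
    """Frequency-domain deconvolution of the reference pulse out of the
    measured reflection signal, with Wiener/Tikhonov-style damping to
    keep the division stable where the pulse spectrum is weak."""
    n = len(measured)
    P = np.fft.rfft(pulse, n)
    M = np.fft.rfft(measured, n)
    H = M * np.conj(P) / (np.abs(P) ** 2 + eps)
    return np.fft.irfft(H, n)

# Toy stratified-sample trace: a broad reference pulse and two boundary
# reflections separated by less than the pulse width.
n = 512
t = np.arange(n)
pulse = np.exp(-0.5 * ((t - 40) / 6.0) ** 2)   # broad reference pulse
impulse = np.zeros(n)
impulse[100] = 1.0                              # first boundary
impulse[108] = -0.6                             # second, sub-pulse-width away
measured = np.convolve(impulse, pulse)[:n]

recovered = wiener_deconvolve(measured, pulse)
# 'recovered' shows a sharp peak and trough near samples 100 and 108,
# even though the two echoes overlap heavily in 'measured'.
```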

Relevance:

20.00%

Publisher:

Abstract:

A custom-built deconvolution technique for pulsed terahertz imaging is presented in this paper. It is examined as a tool for the measurement of thin transparent films. The power of the technique is illustrated by using the impulse response function, calculated in the deconvolution process, to recreate terahertz images of both sides of a piece of paper derived from one single terahertz reflection measurement.

Relevance:

20.00%

Publisher:

Abstract:

We develop a new sparse kernel density estimator using a forward constrained regression framework, within which the nonnegative and summing-to-unity constraints on the mixing weights can easily be satisfied. Our main contribution is to derive a recursive algorithm to select significant kernels one at a time based on the minimum integrated square error (MISE) criterion for both the selection of kernels and the estimation of mixing weights. The proposed approach is simple to implement and the associated computational cost is very low. Specifically, the complexity of our algorithm is on the order of the number of training data points N, which is much lower than the order-N^2 cost of the best existing sparse kernel density estimators. Numerical examples are employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with comparable accuracy to those of the classical Parzen window estimate and other existing sparse kernel density estimators.
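The flavour of forward kernel selection can be shown with a deliberately simplified greedy sketch (uniform renormalised weights and an ISE target on a grid, rather than the paper's forward constrained regression with MISE-optimal weights): kernels centred on data points are added one at a time, each chosen to best match the full Parzen window estimate.

```python
import numpy as np

def gaussian(x, c, h):
    """Gaussian kernel of bandwidth h centred at c."""
    return np.exp(-0.5 * ((x - c) / h) ** 2) / (h * np.sqrt(2 * np.pi))

def sparse_kde(data, grid, h, n_kernels):
    """Greedy forward selection sketch: with the full Parzen estimate as
    the target, add one kernel (centred on a data point) at a time,
    picking the centre that minimises the integrated squared error on
    the grid. Uniform weights are renormalised at each step, so they
    stay nonnegative and sum to one."""
    parzen = np.mean([gaussian(grid, c, h) for c in data], axis=0)
    dx = grid[1] - grid[0]
    chosen = []
    for _ in range(n_kernels):
        best_c, best_ise = None, np.inf
        for c in data:                      # candidate centres: data points
            cand = chosen + [c]
            est = np.mean([gaussian(grid, ci, h) for ci in cand], axis=0)
            ise = np.sum((est - parzen) ** 2) * dx
            if ise < best_ise:
                best_c, best_ise = c, ise
        chosen.append(best_c)
    est = np.mean([gaussian(grid, c, h) for c in chosen], axis=0)
    return est, chosen, parzen

rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 100)])
grid = np.linspace(-5, 5, 501)
est, centres, parzen = sparse_kde(data, grid, h=0.4, n_kernels=8)
```

Eight kernels already track the 200-kernel Parzen estimate closely on this bimodal sample, which is the sparsity-versus-accuracy trade the abstract describes.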