39 results for Permutation-Symmetric Covariance


Relevance: 10.00%

Abstract:

The aim of this work is to invert the ionospheric electron density profile from riometer (Relative Ionospheric Opacity Meter) measurements. The new riometer instrument KAIRA (Kilpisjärvi Atmospheric Imaging Receiver Array) is used to measure the cosmic HF radio noise absorption that takes place in the D-region ionosphere between 50 and 90 km. To invert the electron density profile, synthetic data are used to infer the unknown parameter Neq with a spline height method, which represents the electron density profile at different altitudes. Moreover, a smoothing prior method is also used to sample from the posterior distribution by truncating the prior covariance matrix. The smoothing-prior approach makes it easier to explore the posterior with the MCMC (Markov chain Monte Carlo) method.
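The combination of a smoothing prior with MCMC can be sketched with a toy random-walk Metropolis sampler. Everything below (the altitude grid, the noise level, the prior strength `lam`) is invented for illustration and is not the thesis's actual KAIRA setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical electron-density-like profile on a coarse D-region altitude grid.
alts = np.linspace(50, 90, 9)                       # km
true_prof = np.exp(-(alts - 75) ** 2 / 200.0)       # smooth "true" profile
data = true_prof + 0.05 * rng.standard_normal(alts.size)  # synthetic noisy data

# Smoothing prior: penalize second differences of the profile.
D2 = np.diff(np.eye(alts.size), n=2, axis=0)
lam = 50.0                                          # prior strength (assumed)

def log_post(x):
    misfit = np.sum((data - x) ** 2) / (2 * 0.05 ** 2)
    smooth = lam * np.sum((D2 @ x) ** 2) / 2
    return -(misfit + smooth)

# Random-walk Metropolis over the whole profile vector.
x = data.copy()
lp = log_post(x)
samples = []
for _ in range(20000):
    prop = x + 0.01 * rng.standard_normal(x.size)
    lpp = log_post(prop)
    if np.log(rng.uniform()) < lpp - lp:            # Metropolis accept/reject
        x, lp = prop, lpp
    samples.append(x.copy())
post_mean = np.mean(samples[5000:], axis=0)         # discard burn-in
```

The posterior mean trades off fidelity to the data against smoothness, which is the role the smoothing prior plays in the inversion.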

Relevance: 10.00%

Abstract:

This thesis examines the suitability of VaR in foreign exchange rate risk management from the perspective of a European investor. Four different VaR models are evaluated to assess whether VaR is a valuable tool in managing foreign exchange rate risk: the historical method, the historical bootstrap method, the variance-covariance method and Monte Carlo simulation. The data are divided into emerging and developed market currencies to allow a more detailed analysis, and cover foreign exchange rates from 31 January 2000 to 30 April 2014. The results show that none of these VaR models should be relied on as the sole tool in foreign exchange rate risk management. The variance-covariance method and Monte Carlo simulation perform the worst in both currency portfolios. Both historical methods perform better, but should still be treated as complements to other, more sophisticated analysis tools. A comparative study of VaR estimates and forward prices is also included in the thesis. The study reveals that, despite the high hedging cost of emerging market currencies, the risk captured by VaR is more expensive still, and FX forward hedging is therefore recommended.
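The four VaR models named above can be sketched for a single currency series; the simulated returns below stand in for the FX data (series length, volatility and the 99% level are all assumptions):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)
returns = 0.006 * rng.standard_normal(3500)   # hypothetical daily FX log-returns

level = 0.99

# Historical method: empirical quantile of observed returns.
var_hist = -np.quantile(returns, 1 - level)

# Variance-covariance method: normality assumption with sample mean/std.
z = NormalDist().inv_cdf(level)
var_vc = -(returns.mean() - z * returns.std())

# Monte Carlo simulation: resimulate from the fitted normal distribution.
sims = returns.mean() + returns.std() * rng.standard_normal(100000)
var_mc = -np.quantile(sims, 1 - level)

# Historical bootstrap: average the quantile over resampled histories.
boots = [-np.quantile(rng.choice(returns, returns.size), 1 - level)
         for _ in range(200)]
var_boot = float(np.mean(boots))
```

For a portfolio of several currencies the variance-covariance method would replace the scalar standard deviation with the portfolio variance computed from the covariance matrix of the currency returns.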

Relevance: 10.00%

Abstract:

This thesis models the squeezing of the tube and computes the fluid motion of a peristaltic pump. The simulations were conducted using the COMSOL Multiphysics FSI module. The model is set up as axisymmetric, with several simulation cases to give a clear understanding of the results. The model captures the total displacement of the tube, the velocity magnitude, and the average pressure fluctuation of the fluid motion. Many of the underlying mathematical and physical concepts are also reviewed, together with their real-world applications. To solve the problems within the resource constraints, the mass balance and momentum equations, finite element concepts, the arbitrary Lagrangian-Eulerian method, one-way and two-way coupling methods, and the COMSOL Multiphysics simulation setup are studied and briefly described.

Relevance: 10.00%

Abstract:

Rough turning is an important process for manufacturing cylindrically symmetric parts. Thus far, increasing the level of automation in rough turning has relied on process monitoring methods or adaptive turning control methods that aim to keep the process conditions constant. However, in order to improve process safety, quality and efficiency, adaptive turning control should be transformed into an intelligent machining system that optimizes cutting values to match process conditions or actively seeks to improve them. In this study, primary and secondary chatter and chip formation are studied to understand how to measure the effect of these phenomena on the process conditions and how to avoid undesired cutting conditions. The concept of cutting state is used to address the combination of these phenomena and the current use of the power capacity of the lathe. The measures of these phenomena are not derived from physical models; instead, their severity is modelled against expert opinion. Based on the concept of cutting state, an expert-system-style fuzzy control system capable of optimizing the cutting process was created. An important aspect of the system is its ability to adapt to several cutting phenomena appearing at once, even when those phenomena would potentially require conflicting control actions.
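The idea of resolving conflicting control actions in a fuzzy cutting-state controller can be illustrated with a toy rule base. The membership shapes, severity scales and feed-rate actions below are entirely invented, not the thesis's actual expert rules:

```python
# Toy fuzzy "cutting state": grade the severity of two phenomena and
# defuzzify a feed-rate adjustment even when the rules conflict.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cutting_state(chatter, chip):
    # Severities in [0, 1], e.g. modelled against expert opinion.
    high_chatter = tri(chatter, 0.4, 1.0, 1.6)   # saturates near severity 1
    bad_chip = tri(chip, 0.4, 1.0, 1.6)
    ok = 1.0 - max(high_chatter, bad_chip)
    # Rule base: chatter asks to slow the feed, poor chip formation asks to
    # speed it up; when both fire, the stronger phenomenon dominates.
    slow, keep, fast = high_chatter, ok, bad_chip
    # Weighted-average defuzzification over actions -10% / 0% / +10%.
    return (-10 * slow + 0 * keep + 10 * fast) / (slow + keep + fast)
```

A real system would carry many more phenomena (primary vs. secondary chatter, power usage) and rules, but the aggregation-then-defuzzification pattern is the same.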

Relevance: 10.00%

Abstract:

This thesis studies the suitability of a recent data assimilation method, the Variational Ensemble Kalman Filter (VEnKF), for real-life fluid dynamics problems in hydrology. VEnKF combines a variational formulation of the data assimilation problem, based on minimizing an energy functional, with an Ensemble Kalman filter approximation to the Hessian matrix that also serves as an approximation to the inverse of the error covariance matrix. One of the significant features of VEnKF is the very frequent resampling of the ensemble: resampling is done at every observation step. This unusual feature is further compounded by observation interpolation, which is found to be beneficial for numerical stability; in that case the ensemble is resampled at every time step of the numerical model. VEnKF is applied in several configurations to data from a real laboratory-scale dam-break problem modelled with the shallow water equations. It is also tested on a two-layer Quasi-Geostrophic atmospheric flow problem. In both cases VEnKF proves to be an efficient and accurate data assimilation method that renders the analysis more realistic than the numerical model alone. It also proves robust against filter instability thanks to its adaptive nature.
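The ensemble update that VEnKF repeats at every observation can be illustrated with a single analysis step of a plain stochastic EnKF (this is not the variational VEnKF formulation itself; dimensions, the observation operator and noise levels are all invented):

```python
import numpy as np

rng = np.random.default_rng(2)

n, m, N = 8, 3, 50                    # state dim, obs dim, ensemble size
H = np.zeros((m, n))
H[[0, 1, 2], [1, 4, 7]] = 1.0         # observe three state components
R = 0.1 * np.eye(m)                   # observation error covariance

truth = np.sin(np.arange(n, dtype=float))
ens = truth[:, None] + rng.standard_normal((n, N))       # prior ensemble
y = H @ truth + np.sqrt(0.1) * rng.standard_normal(m)    # noisy observation

# Sample covariance from the ensemble anomalies.
X = ens - ens.mean(axis=1, keepdims=True)
P = X @ X.T / (N - 1)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)             # Kalman gain

# Perturbed observations keep the analysis spread statistically consistent.
Y = y[:, None] + np.sqrt(0.1) * rng.standard_normal((m, N))
analysis = ens + K @ (Y - H @ ens)
```

In VEnKF the gain is not formed explicitly; the analysis comes from minimizing an energy functional with an ensemble approximation of the Hessian, and the ensemble is then redrawn, but the net effect per observation step is an update of this kind.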

Relevance: 10.00%

Abstract:

Optimization of quantum measurement processes has a pivotal role in carrying out better, more accurate or less disruptive, measurements and experiments on a quantum system. In particular, convex optimization, i.e., identifying the extreme points of the convex sets and subsets of quantum measuring devices, plays an important part in quantum optimization, since the typical figures of merit for measuring processes are affine functionals. In this thesis, we discuss results determining the extreme quantum devices and their relevance, e.g., in quantum-compatibility-related questions. Notably, we see that a compatible device pair where one device is extreme can be joined into a single apparatus in an essentially unique way. Moreover, we show that the question of whether a pair of quantum observables can be measured jointly can often be formulated in a weaker form when some of the observables involved are extreme. Another major line of research treated in this thesis deals with convex analysis of special restricted sets of quantum devices: covariance structures or, in particular, generalized imprimitivity systems. Some results on the structure of covariant observables and instruments are presented, as well as results identifying the extreme points of covariance structures in quantum theory. As a special case study, not published anywhere before, we examine the structure of Euclidean-covariant localization observables for spin-0 particles. We also discuss the general form of Weyl-covariant phase-space instruments. Finally, certain optimality measures originating from convex geometry are introduced for quantum devices: boundariness, measuring how 'close' a quantum apparatus is to the algebraic boundary of the device set, and the robustness of incompatibility, quantifying the level of incompatibility for a quantum device pair by measuring the highest amount of noise the pair tolerates without becoming compatible. Boundariness is further associated with minimum-error discrimination of quantum devices, and the robustness of incompatibility is shown to behave monotonically under certain compatibility-non-decreasing operations. Moreover, the value of the robustness of incompatibility is given for a few special device pairs.

Relevance: 10.00%

Abstract:

This thesis concerns the analysis of epidemic models. We adopt the Bayesian paradigm and develop suitable Markov chain Monte Carlo (MCMC) algorithms. This is done by considering an Ebola outbreak in the Democratic Republic of Congo, former Zaïre, in 1995 as a case study for SEIR epidemic models. We model the Ebola epidemic deterministically using ODEs and stochastically through SDEs to take into account a possible bias in each compartment. Since the model has unknown parameters, we use several methods to estimate them, such as least squares, maximum likelihood and MCMC. The motivation for choosing MCMC over the other methods in this thesis is its ability to tackle complicated nonlinear problems with a large number of parameters. First, in a deterministic Ebola model, we compute the likelihood function using the sum-of-squared-residuals method and estimate the parameters with LSQ and MCMC. We sample the parameters and then use them to calculate the basic reproduction number and to study the disease-free equilibrium. We run convergence diagnostics on the posterior chain and confirm the viability of the model. The results show that the Ebola model fits the observed onset data with high precision, and all the unknown model parameters are well identified. Second, we convert the ODE model into an SDE Ebola model. We compute the likelihood function using the extended Kalman filter (EKF) and estimate the parameters again. The motivation for the SDE formulation here is to consider the impact of modelling errors; moreover, the EKF approach allows us to formulate a filtered likelihood for the parameters of such a stochastic model. We use the MCMC procedure to attain the posterior distributions of the parameters of the drift and diffusion parts of the SDE Ebola model. In this thesis, we analyse two cases: (1) the model error covariance matrix of the dynamic noise is close to zero, i.e. only a small amount of stochasticity is added into the model. The results are then similar to those obtained from the deterministic Ebola model, even though the methods of computing the likelihood function differ; (2) the model error covariance matrix is different from zero, i.e. a considerable amount of stochasticity is introduced into the Ebola model. This accounts for the situation where we know that the model is not exact. As a result, we obtain parameter posteriors with larger variances. Consequently, the model predictions then show larger uncertainties, in accordance with the assumption of an incomplete model.
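The deterministic building blocks of the analysis, an SEIR integrator, the basic reproduction number, and the sum-of-squared-residuals misfit that sits inside the likelihood, can be sketched as follows. The parameter values and the synthetic onset data are illustrative, not the thesis's estimates for the 1995 outbreak:

```python
import numpy as np

def seir(beta, sigma, gamma, N=5_000_000, I0=1, days=120, dt=0.1):
    """Forward-Euler SEIR integrator returning daily symptom-onset rates."""
    S, E, I, R = N - I0, 0.0, float(I0), 0.0
    steps_per_day = int(round(1 / dt))
    onsets = []                          # daily new symptomatic cases, sigma * E
    for step in range(int(round(days / dt))):
        dS = -beta * S * I / N
        dE = beta * S * I / N - sigma * E
        dI = sigma * E - gamma * I
        dR = gamma * I
        S += dt * dS; E += dt * dE; I += dt * dI; R += dt * dR
        if step % steps_per_day == 0:
            onsets.append(sigma * E)
    return np.array(onsets)

# Illustrative rates: incubation ~9.4 days, infectious period ~7 days.
beta, sigma, gamma = 0.35, 1 / 9.4, 1 / 7.0

# Basic reproduction number for SEIR: R0 = beta / gamma.
r0 = beta / gamma

# Sum of squared residuals against (here: synthetic) onset counts; this is
# the misfit an LSQ fit minimizes and an MCMC sampler evaluates repeatedly.
obs = seir(0.35, 1 / 9.4, 1 / 7.0) + 0.5
ssr = float(np.sum((seir(beta, sigma, gamma) - obs) ** 2))
```

The SDE variant replaces the deterministic trajectory with one carrying dynamic noise, and the EKF then propagates a state covariance alongside it to form the filtered likelihood.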

Relevance: 10.00%

Abstract:

For my Licentiate thesis, I conducted research on risk measures. Continuing that research, I now focus on capital allocation. In the proportional capital allocation principle, the choice of risk measure plays a very important part. In the chapters Introduction and Basic concepts, we introduce three definitions of economic capital, discuss the purpose of capital allocation, present different viewpoints on capital allocation and give an overview of the relevant literature. Risk measures are defined and the concept of a coherent risk measure is introduced. Examples of important risk measures are given, e.g., Value at Risk (VaR) and Tail Value at Risk (TVaR). We also discuss the implications of dependence and review some important distributions. In the chapter on Capital allocation we introduce different principles for allocating capital; we prefer to work with the proportional allocation method. In the following chapter, Capital allocation based on tails, we focus on insurance business lines with heavy-tailed loss distributions. To emphasize capital allocation based on tails, we define the following risk measures: Conditional Expectation, Upper Tail Covariance and Tail Covariance Premium Adjusted (TCPA). In the final chapter, called Illustrative case study, we simulate two sets of data with five insurance business lines using Normal copulas and Cauchy copulas. The proportional capital allocation is calculated using TCPA as the risk measure, and it is compared with the result when VaR is used as the risk measure and with covariance capital allocation. In this thesis, it is emphasized that no single allocation principle is perfect for all purposes. When focusing on the tail of the losses, the allocation based on TCPA is a good choice, since TCPA in a sense combines features of TVaR and the tail covariance.
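The proportional allocation principle can be sketched for five simulated business lines. TVaR is used as the risk measure here because its formula is standard; the TCPA formula itself is not reproduced, and the lognormal lines (independent, with increasing tail heaviness) are an invented stand-in for the copula-simulated data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Five business lines with increasingly heavy-tailed lognormal losses.
losses = rng.lognormal(mean=0.0, sigma=[0.5, 0.8, 1.1, 1.4, 1.7],
                       size=(200000, 5))

def tvar(x, level=0.99):
    """Tail Value at Risk: mean loss beyond the VaR quantile."""
    q = np.quantile(x, level)
    return x[x >= q].mean()

# Stand-alone risk of each line, and economic capital for the aggregate.
rho = np.array([tvar(losses[:, i]) for i in range(5)])
total_capital = tvar(losses.sum(axis=1))

# Proportional principle: split total capital in proportion to rho.
alloc = total_capital * rho / rho.sum()
```

Swapping `tvar` for VaR, a covariance-based measure, or TCPA changes only `rho`; the proportional split itself stays the same, which is why the choice of risk measure is so decisive for the resulting allocation.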

Relevance: 10.00%

Abstract:

Various studies in the field of econophysics have shown that fluid flows have analogues in financial market behavior, the typical parallel being drawn between energy in fluids and information in markets. However, the geometry of the manifold on which markets act out their dynamics (corporate space) is not yet known. In this thesis, utilizing a seven-year time series of prices of the stocks used to compute the S&P500 index on the New York Stock Exchange, we have created local charts of the corporate space with the goal of finding standing waves and other soliton-like patterns in the behavior of stock price deviations from the S&P500 index. After first calculating the correlation matrix of normalized stock price deviations from the S&P500 index, we have performed a local singular value decomposition over a set of four different time windows as guides to the nature of the patterns that may emerge. It turns out that in almost all cases, each singular vector is essentially determined by a relatively small set of companies with large positive or negative weights on that singular vector. Over particular time windows, these weights are sometimes strongly correlated with at least one industrial sector, and certain sectors are more prone to fast dynamics whereas others sustain longer standing waves.
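The correlation-matrix-plus-SVD pipeline can be sketched on synthetic deviations. The data below are invented (250 days, 40 stocks, one "sector" factor shared by the first ten companies) rather than the actual S&P500 series:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical price deviations: 250 trading days x 40 stocks, with a common
# factor loaded onto the first 10 companies (a toy "industrial sector").
T, n = 250, 40
dev = rng.standard_normal((T, n))
sector = rng.standard_normal(T)
dev[:, :10] += 2.0 * sector[:, None]

# Normalize each stock's deviations, form the correlation matrix, decompose.
z = (dev - dev.mean(axis=0)) / dev.std(axis=0)
corr = z.T @ z / T
U, s, Vt = np.linalg.svd(corr)

# The leading singular vector is dominated by a small set of companies,
# here the ten that share the sector factor.
lead = np.abs(U[:, 0])
top = np.argsort(lead)[::-1][:10]
```

Repeating the decomposition over sliding time windows, as done in the thesis, shows how the dominant company sets, and hence the local geometry of the corporate space, evolve in time.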