989 results for expected shortfall portfolio optimization


Relevance:

20.00%

Publisher:

Abstract:

The sampling scheme is essential in the investigation of the spatial variability of soil properties in Soil Science studies. The high cost of sampling schemes optimized with additional sampling points for each physical and chemical soil property prevents their use in precision agriculture. The purpose of this study was to obtain an optimal sampling scheme for sets of physical and chemical properties and to investigate its effect on the quality of soil sampling. Soil was sampled in a 42-ha area, at 206 geo-referenced points arranged in a regular grid with 50 m spacing, at a depth of 0.00-0.20 m. In order to obtain an optimal sampling scheme for every physical and chemical property, a sample grid, a medium-scale variogram and the extended Spatial Simulated Annealing (SSA) method were used to minimize the kriging variance. The optimization procedure was validated by constructing maps of relative improvement comparing the sample configuration before and after the process. A greater concentration of recommended points was observed in specific areas (NW-SE direction), which also reflects a greater estimation variance at these locations. The addition of optimal samples in specific regions increased the accuracy by up to 2% for chemical and 1% for physical properties. The use of a sample grid and a medium-scale variogram as prior information for the design of additional sampling schemes proved very promising for determining the locations of these additional points for all physical and chemical soil properties, enhancing the accuracy of the kriging estimates.
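As a rough illustration of the optimization step described in this abstract — not the authors' actual implementation — the sketch below runs a simulated annealing loop that places extra sample points so as to lower a crude proxy for the mean kriging variance (the variogram value at the distance to the nearest sample). The variogram parameters, grid geometry and annealing schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_variogram(h, nugget=0.1, sill=1.0, range_m=200.0):
    """Exponential variogram model (parameters illustrative, not fitted to data)."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / range_m))

def mean_kriging_variance(samples, grid):
    """Crude proxy for the mean kriging variance: variogram value at the
    distance to the nearest sample, averaged over the prediction grid."""
    d = np.linalg.norm(grid[:, None, :] - samples[None, :, :], axis=2)
    return exp_variogram(d.min(axis=1)).mean()

def ssa_add_points(existing, grid, n_new=8, iters=500, temp=1.0, cool=0.99):
    """Spatial simulated annealing: randomly perturb candidate extra points,
    accepting moves that lower the objective (or occasionally raise it)."""
    new = rng.uniform(grid.min(axis=0), grid.max(axis=0), size=(n_new, 2))
    curr = mean_kriging_variance(np.vstack([existing, new]), grid)
    for _ in range(iters):
        cand = new.copy()
        i = rng.integers(n_new)
        cand[i] = np.clip(cand[i] + rng.normal(0.0, 30.0, 2),
                          grid.min(axis=0), grid.max(axis=0))
        val = mean_kriging_variance(np.vstack([existing, cand]), grid)
        if val < curr or rng.random() < np.exp((curr - val) / temp):
            new, curr = cand, val
        temp *= cool
    return new, curr

# Regular 50 m grid as in the study; assume only every fourth point was sampled.
xs = np.arange(0.0, 650.0, 50.0)
grid = np.array([(x, y) for x in xs for y in xs])
existing = grid[::4]
before = mean_kriging_variance(existing, grid)
new_pts, after = ssa_add_points(existing, grid)
```

Because added points can only shorten the distance to the nearest sample, the proxy after optimization is never worse than before; the annealing merely decides *where* the improvement concentrates, mirroring the NW-SE clustering reported above.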

Relevance:

20.00%

Publisher:

Abstract:

We present a procedure for the optical characterization of thin-film stacks from spectrophotometric data. The procedure overcomes the intrinsic limitations arising in the numerical determination of many parameters from reflectance or transmittance spectra measurements. The key point is to use all the information available from the manufacturing process in a single global optimization process. The method is illustrated by a case study of sol-gel applications.
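A minimal sketch of the idea, under strong simplifying assumptions: fit the index and thickness of a *single* homogeneous film to a synthetic normal-incidence reflectance spectrum (Airy formula, no absorption), using bounds from the "manufacturing process" to constrain a global optimizer. The substrate index, wavelength range and bounds are all made up for illustration; the paper's actual procedure handles full stacks.

```python
import numpy as np
from scipy.optimize import differential_evolution

def reflectance(lam_nm, n_film, d_nm, n0=1.0, n_sub=1.52):
    """Normal-incidence reflectance of one homogeneous film on a substrate
    (Airy summation; absorption neglected)."""
    r01 = (n0 - n_film) / (n0 + n_film)
    r12 = (n_film - n_sub) / (n_film + n_sub)
    beta = 2.0 * np.pi * n_film * d_nm / lam_nm
    r = (r01 + r12 * np.exp(-2j * beta)) / (1 + r01 * r12 * np.exp(-2j * beta))
    return np.abs(r) ** 2

lam = np.linspace(400.0, 800.0, 120)
true_n, true_d = 1.90, 310.0                 # "manufactured" film (hypothetical)
measured = reflectance(lam, true_n, true_d)  # noise-free synthetic spectrum

def misfit(p):
    n_film, d_nm = p
    return np.sum((reflectance(lam, n_film, d_nm) - measured) ** 2)

# Bounds encode prior knowledge from the manufacturing process — this is what
# keeps the many-parameter inverse problem well determined.
res = differential_evolution(misfit, bounds=[(1.3, 2.4), (100.0, 600.0)],
                             seed=1, tol=1e-10)
n_hat, d_hat = res.x
```

Without the bounds (or with many layers), the misfit landscape has many near-degenerate minima in optical thickness n·d, which is exactly the limitation the global, manufacturing-informed optimization is meant to overcome.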

Relevance:

20.00%

Publisher:

Abstract:

A computer-aided method to improve the thickness uniformity attainable when coating multiple substrates inside a thermal evaporation physical vapor deposition unit is presented. The study is developed for the classical spherical (dome-shaped) calotte and also for a plane sector reversible holder setup. This second arrangement is very useful for coating both sides of the substrate, such as antireflection multilayers on lenses. The design of static correcting shutters for both kinds of configurations is also discussed. Some results of using the method are presented as an illustration.
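The non-uniformity such a method has to correct can be estimated from the classical cosine-law emission of a small-area evaporation source. For a flat substrate parallel to the source at height h, the relative thickness falls off as t(ρ)/t(0) = 1/(1 + (ρ/h)²)²; the numbers below are illustrative, not from the paper.

```python
import numpy as np

def relative_thickness_flat(rho, h):
    """Thickness relative to the point directly above a small-area cosine-law
    source, for a flat substrate parallel to the source at height h:
    t(rho)/t(0) = 1 / (1 + (rho/h)**2)**2."""
    return 1.0 / (1.0 + (rho / h) ** 2) ** 2

h = 400.0                            # source-to-substrate distance, mm (illustrative)
rho = np.linspace(0.0, 200.0, 5)     # radial positions across the holder
profile = relative_thickness_flat(rho, h)
nonuniformity = 1.0 - profile[-1]    # relative thickness loss at the edge
```

For these numbers the edge of the holder receives only 64% of the central thickness; rotating the calotte averages the azimuthal variation, and the static correcting shutters discussed in the abstract are shaped to trim precisely this radial profile.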

Relevance:

20.00%

Publisher:

Abstract:

Long-term preservation of bioreporter bacteria is essential for the functioning of cell-based detection devices, particularly when field application, e.g., in developing countries, is intended. We varied the culture conditions (i.e., the NaCl content of the medium), the storage protection media, and the preservation method (vacuum drying vs. encapsulation in gels that remain hydrated) in order to achieve optimal preservation of the activity of As(III) bioreporter bacteria during up to 12 weeks of storage at 4 degrees C. The presence of 2% sodium chloride during cultivation improved the response intensity of some bioreporters upon reconstitution, particularly of those that had been dried and stored in the presence of sucrose or trehalose and 10% gelatin. The most satisfying, stable response to arsenite after 12 weeks of storage was obtained with cells that had been dried in the presence of 34% trehalose and 1.5% polyvinylpyrrolidone. Amendments of peptone, meat extract, sodium ascorbate, and sodium glutamate preserved the bioreporter activity only for the first 2 weeks, but not during long-term storage. Only short-term stability was likewise achieved when bioreporter bacteria were encapsulated in gels that remained hydrated during storage.

Relevance:

20.00%

Publisher:

Abstract:

Coalescing compact binary systems are important sources of gravitational waves. Here we investigate the detectability of this gravitational radiation by the recently proposed laser interferometers. The spectral density of noise for various practicable configurations of the detector is also reviewed. This includes laser interferometers with delay lines and Fabry-Pérot cavities in the arms, both in standard and dual recycling arrangements. The sensitivity of the detector in all those configurations is presented graphically and the signal-to-noise ratio is calculated numerically. For all configurations we find values of the detector's parameters which maximize the detectability of coalescing binaries, the discussion comprising Newtonian- as well as post-Newtonian-order effects. Contour plots of the signal-to-noise ratio are also presented in certain parameter domains which illustrate the interferometer's response to coalescing binary signals.
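The signal-to-noise calculation at the heart of such studies is the matched-filter integral rho² = 4 ∫ |h̃(f)|²/Sₙ(f) df. The sketch below evaluates it numerically for the Newtonian chirp spectral shape |h̃(f)|² ∝ f^(-7/3) against a toy noise curve (a steep low-frequency wall plus a flat floor); the band, amplitudes and noise model are placeholders, not the delay-line or Fabry-Pérot curves analyzed in the paper.

```python
import numpy as np

def snr_squared(f, h_abs2, S_n):
    """Matched-filter SNR: rho^2 = 4 * integral of |h(f)|^2 / S_n(f) df
    (trapezoidal rule)."""
    y = h_abs2 / S_n
    return 4.0 * np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(f))

f = np.linspace(10.0, 1000.0, 5000)        # Hz, detector band (illustrative)
h_abs2 = (f / 100.0) ** (-7.0 / 3.0)       # Newtonian chirp shape, arbitrary units
S_n = (f / 50.0) ** (-4.0) + 1.0           # toy noise: low-f wall + flat floor

rho = np.sqrt(snr_squared(f, h_abs2, S_n))
```

Maximizing detectability, as done in the paper, amounts to tuning detector parameters (recycling gains, cavity storage times) so that the minimum of Sₙ(f) tracks the band where the chirp carries most of its integrand.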

Relevance:

20.00%

Publisher:

Abstract:

The protein shells, or capsids, of nearly all spherelike viruses adopt icosahedral symmetry. In the present Letter, we propose a statistical thermodynamic model for viral self-assembly. We find that icosahedral symmetry is not expected for viral capsids constructed from structurally identical protein subunits and that this symmetry requires (at least) two internal switching configurations of the protein. Our results indicate that icosahedral symmetry is not a generic consequence of free energy minimization but requires optimization of internal structural parameters of the capsid proteins.

Relevance:

20.00%

Publisher:

Abstract:

Preface

The starting point for this work, and eventually the subject of the whole thesis, was the question: how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimation of the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is non-observable. There are several estimation methodologies that deal with the estimation of latent variables. One appeared to be particularly interesting. It proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure was derived only for stochastic volatility models without jumps. Thus, it became the subject of my research. This thesis consists of three parts. Each one is written as an independent and self-contained article. At the same time, the questions that are answered by the second and third parts of this work arise naturally from the issues investigated and results obtained in the first one. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps both in the asset price and in the variance process. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function for the stochastic volatility jump-diffusion models.
The empirical part of the chapter suggests that, besides stochastic volatility, jumps both in the mean and in the volatility equation are relevant for modelling returns of the S&P500 index, which has been chosen as a general representative of the stock asset class. Hence, the next question is: what jump process should be used to model returns of the S&P500? The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either an exponential or a double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any ground for comparison, it is unreasonable to be sure that our parameter estimates and the true parameters of the models coincide. The conclusion of the second chapter provides one more reason to perform that kind of test. Thus, the third part of this thesis concentrates on the estimation of the parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets.
The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of finding the true parameters, and the third chapter proves that our estimator indeed has this ability. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question immediately arises: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used for its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure. In practice, however, this relationship is not so straightforward, owing to the increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated; thus, the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, due to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of the parameters of stochastic volatility jump-diffusion models.
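To make the ECF idea concrete in the simplest possible setting — not the affine SVJD characteristic function derived in the thesis — the sketch below matches the empirical characteristic function of i.i.d. Gaussian data to the model's closed-form characteristic function by minimizing a weighted squared distance over a grid of frequencies. The weight function, grid and sample size are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
x = rng.normal(loc=0.5, scale=2.0, size=4000)   # observed "returns"

u = np.linspace(-2.0, 2.0, 81)                  # frequency grid for the integral
w = np.exp(-u ** 2)                             # weight keeping the distance finite

# Empirical characteristic function evaluated on the grid
ecf = np.exp(1j * u[:, None] * x[None, :]).mean(axis=1)

def model_cf(u, mu, sigma):
    """Closed-form unconditional CF of a Gaussian; stands in for the
    closed-form SVJD characteristic function derived in the thesis."""
    return np.exp(1j * u * mu - 0.5 * (sigma * u) ** 2)

def objective(p):
    mu, sigma = p
    return np.sum(w * np.abs(ecf - model_cf(u, mu, sigma)) ** 2)

res = minimize(objective, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
```

The same recipe carries over to the latent-variance case once the joint unconditional characteristic function is available in closed form: no discretization or simulation of the variance path is needed, which is precisely the estimator's selling point.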

Relevance:

20.00%

Publisher:

Abstract:

Geographic information systems (GIS) and artificial intelligence (AI) techniques were used to develop an intelligent snow removal asset management system (SRAMS). The system has been evaluated through a case study examining snow removal from the roads in Black Hawk County, Iowa, for which the Iowa Department of Transportation (Iowa DOT) is responsible. The SRAMS comprises an expert system that contains the logical rules and expertise of the Iowa DOT's snow removal experts in Black Hawk County, and a geographic information system to access and manage road data. The system is implemented on a mid-range PC by integrating MapObjects 2.1 (a GIS package), Visual Rule Studio 2.2 (an AI shell), and Visual Basic 6.0 (a programming tool). The system can be used efficiently to generate prioritized snowplowing routes in visual format, to optimize the allocation of assets for plowing, and to track materials (e.g., salt and sand). A test of the system reveals an improvement in snowplowing time of 1.9 percent for moderate snowfall and 9.7 percent for snowstorm conditions over the current manual system.
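The expert-system half of such an architecture boils down to scoring road segments with codified rules and sorting. The toy rules, road names and traffic figures below are invented for illustration; the actual Iowa DOT rule base in Visual Rule Studio differs.

```python
# Hypothetical rules in the style of an expert system: road class and traffic
# volume determine plowing priority, with a storm-condition bump for critical routes.
ROAD_CLASS_WEIGHT = {"interstate": 3, "primary": 2, "secondary": 1}

def plow_priority(road, snowfall_cm):
    score = ROAD_CLASS_WEIGHT[road["cls"]] * 10 + road["aadt"] / 1000.0
    if snowfall_cm > 15:                 # storm conditions: bump critical routes
        score += 5 * road.get("critical", 0)
    return score

roads = [
    {"name": "US-218", "cls": "primary", "aadt": 24000, "critical": 1},
    {"name": "I-380", "cls": "interstate", "aadt": 41000},
    {"name": "Cty-D38", "cls": "secondary", "aadt": 900},
]
route = sorted(roads, key=lambda r: plow_priority(r, snowfall_cm=20), reverse=True)
order = [r["name"] for r in route]
```

In the real system the GIS layer supplies the road attributes and renders the resulting priority order as a map, rather than a sorted list.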

Relevance:

20.00%

Publisher:

Abstract:

The individual life model has always been considered the one closest to the real situation of the total claims of a life insurance portfolio. It only makes the "nearly inevitable assumption" of independence of the life lengths of the insured persons in the portfolio. Many clinical studies, however, have demonstrated positive dependence of paired lives such as husband and wife. In our opinion, it is not unrealistic to expect a considerable number of married couples in any life insurance portfolio (e.g. life insurance contracts formalized at the time of signing a mortgage), and these dependences materially increase the values of the stop-loss premiums associated with the aggregate claims of the portfolio. Since the stop-loss order is the order followed by any risk-averse decision maker, the simplifying hypothesis of independence constitutes a real financial danger for the company, in the sense that most of its decisions are based on the aggregate claims distribution. In this paper, we determine approximations for the distribution of the aggregate claims of a life insurance portfolio with some married couples, and we describe how to make safe decisions when we do not know exactly the dependence structure between the risks in each couple. Results in this paper are partly based on results in Dhaene and Goovaerts (1997).
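The effect the abstract warns about is easy to demonstrate by simulation: comparing the stop-loss premium E[(S − d)₊] of a portfolio of independent lives against the extreme case where both members of each couple die together (comonotonic pairs). Mortality rate, benefit, retention and portfolio size below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n_couples, q, benefit, d = 500, 0.02, 1.0, 30.0   # hypothetical portfolio
sims = 20000

def stop_loss(S, d):
    """Stop-loss premium E[(S - d)_+] estimated by simulation."""
    return np.maximum(S - d, 0.0).mean()

# Independent lives: each of the 2*n insured dies independently with prob q.
deaths_ind = rng.random((sims, 2 * n_couples)) < q
S_ind = benefit * deaths_ind.sum(axis=1)

# Comonotonic couples: both members of a couple die together (extreme positive
# dependence), so a couple's claim is 0 or 2*benefit.
deaths_com = rng.random((sims, n_couples)) < q
S_com = 2 * benefit * deaths_com.sum(axis=1)

pi_ind = stop_loss(S_ind, d)
pi_com = stop_loss(S_com, d)
```

Both portfolios have the same expected claims, but the dependent one has a fatter right tail, so its stop-loss premium is markedly larger — which is why pricing the reinsurance layer under the independence hypothesis understates the risk.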

Relevance:

20.00%

Publisher:

Abstract:

Beta coefficients are not stable if we modify the observation periods of the returns. The market portfolio composition also varies, whereas changes in the betas are the same, whether they are calculated as regression coefficients or as a ratio of the risk premiums. The instantaneous beta, obtained when the capitalization frequency approaches infinity, may be a useful tool in portfolio selection.
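The two definitions of beta mentioned above can be checked side by side on simulated returns generated under an exact single-index model, where they should coincide: the regression coefficient Cov(Rᵢ, Rₘ)/Var(Rₘ) and the ratio of risk premiums (E[Rᵢ] − r_f)/(E[Rₘ] − r_f). All parameters are hypothetical; the abstract's instantaneous beta (the limit as the compounding frequency grows) is not computed here.

```python
import numpy as np

rng = np.random.default_rng(11)
rf, beta_true, n = 0.001, 1.3, 100_000
rm = rf + 0.005 + 0.02 * rng.standard_normal(n)          # market returns
ri = rf + beta_true * (rm - rf) + 0.01 * rng.standard_normal(n)

# Beta as a regression coefficient
beta_reg = np.cov(ri, rm)[0, 1] / np.var(rm)

# Beta as a ratio of risk premiums (valid when CAPM prices the asset exactly)
beta_prem = (ri.mean() - rf) / (rm.mean() - rf)
```

Re-running this with returns aggregated over longer observation periods (compounding instead of summing) is what breaks the stability the abstract refers to.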

Relevance:

20.00%

Publisher:

Abstract:

This paper analyzes a continuous-time stochastic model in which the decision maker discounts instantaneous utilities and the final function at constant but different rates of time preference. In this setting one can model problems in which, as time approaches the final moment, the valuation of the final function increases relative to the instantaneous utilities. This kind of asymmetry cannot be described with either standard or variable discounting. In order to obtain time-consistent solutions, the stochastic dynamic programming equation is derived, whose solutions are Markovian equilibria. For this type of time preferences, the classical consumption and investment model (Merton, 1971) is studied for CRRA and CARA utility functions, comparing the Markovian equilibria with the time-inconsistent solutions. Finally, the introduction of a random final time is discussed.
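The benchmark against which such Markovian equilibria are compared is the classical Merton (1971) CRRA rule, under which the optimal fraction of wealth held in the risky asset is the constant π* = (μ − r)/(γσ²). A one-line sketch with made-up market parameters:

```python
def merton_fraction(mu, r, sigma, gamma):
    """Merton (1971) CRRA portfolio rule: optimal risky-asset fraction
    (mu - r) / (gamma * sigma**2). With a single constant discount rate this
    rule is time-consistent; the paper above studies how it changes when
    instantaneous utility and the final function are discounted at
    different rates."""
    return (mu - r) / (gamma * sigma ** 2)

# Illustrative parameters: 8% drift, 2% risk-free rate, 20% volatility, gamma = 3
pi_star = merton_fraction(mu=0.08, r=0.02, sigma=0.2, gamma=3.0)
```

For these numbers the agent holds half of wealth in the risky asset at all times; under the paper's asymmetric discounting the equilibrium strategy generally ceases to be this simple constant.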

Relevance:

20.00%

Publisher:

Abstract:

The process of free reserves in a non-life insurance portfolio, as defined in the classical model of risk theory, is modified by the introduction of dividend policies that set maximum levels for the accumulation of reserves. The first part of the work formulates the quantification of the dividend payments via the expectation of their present value under different hypotheses. The second part presents a solution based on a system of linear equations for discrete dividend payments in the case of a constant dividend barrier, illustrated by solving a specific case.
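A minimal sketch of the linear-system approach, for a toy discrete model rather than the paper's: each period one unit of premium comes in and a claim of 0 or 2 occurs; surplus above a constant barrier b is paid out as a dividend, and ruin stops all payments. The expected present value of dividends V(x) then satisfies a linear system over the states x = 0, …, b. All numerical values are illustrative.

```python
import numpy as np

v, p0, p2, b = 0.95, 0.6, 0.4, 5    # discount factor, claim probs, barrier
# V(x) = expected present value of future dividends from surplus x, where
# V(x) = v * [ p0 * (div(x) + V(min(x+1, b))) + p2 * V(x-1) ],  V := 0 at ruin,
# and div(x) = max(x + 1 - b, 0) is the overflow paid when the barrier binds.
n = b + 1
A = np.eye(n)
c = np.zeros(n)
for x in range(n):
    up = min(x + 1, b)
    A[x, up] -= v * p0              # no claim: surplus rises (capped at b)
    if x - 1 >= 0:
        A[x, x - 1] -= v * p2       # claim of 2: surplus falls; x = 0 means ruin
    c[x] = v * p0 * max(x + 1 - b, 0)
V = np.linalg.solve(A, c)           # V[x] for x = 0, ..., b
```

Since the discounted transition matrix has spectral radius below one, the system is always solvable, and V is increasing in the initial surplus, as one would expect: starting nearer the barrier means dividends start flowing sooner.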