994 results for Evolving modeling
Abstract:
This paper is concerned with experimental and modeling studies of the smoldering rates of incense sticks as a function of the ambient oxygen fraction in air, the flow velocity, and the stick size. The experimental results are obtained for both forward and reverse smolder conditions. The results are explained on the basis of surface combustion due to diffusion of oxygen to the surface by both free and forced convection, with the heat released at the surface conducted into the solid, convected into the stream, and radiated from the surface. The heat release at the surface is controlled by the convective transport of the oxidizer to the surface. To obtain the diffusion rates, particularly for reverse smolder, CFD calculations of the fluid flow along with a passive scalar are needed; these calculations have been made for both forward and reverse smolder. An interesting aspect of the CFD calculations is that while the Nusselt number for forward smolder shows a clear √Re_u dependence (Re_u = flow Reynolds number), the result for reverse smolder shows a peak in the variation with Reynolds number, with values lower than for forward smolder and with unsteadiness in the flow beyond a certain flow rate. The results for the flow behavior and the Nusselt number are used in a simple model of the heat transfer at the smoldering surface to obtain the dependence of the smoldering rate on the diameter of the incense stick, the flow rate of air, and the oxygen fraction. The results are presented in terms of a correlation for the non-dimensional smoldering rate with the radiant flux from the surface and the heat generation rate at the surface. The correlations appear reasonable for both the forward and reverse smolder cases.
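A minimal numerical sketch of the scaling described above, assuming a correlation of the form Nu = c·√Re for forward smolder and an oxygen-transport-limited surface heat release via a heat/mass-transfer analogy; the prefactor, diffusivity, and heat-of-combustion values are placeholders, not numbers from the paper.

```python
import numpy as np

def nusselt_forward(reynolds, c=0.6):
    # Assumed forward-smolder correlation Nu ~ c * sqrt(Re); c is a placeholder.
    return c * np.sqrt(reynolds)

def oxygen_limited_heat_release(reynolds, diameter, y_o2,
                                rho_air=1.2, d_o2=2.0e-5, dh_o2=13.1e6):
    """Surface heat release per unit area when convective oxygen transport
    controls: q'' = h_m * rho_air * Y_O2 * dH_O2, with the mass-transfer
    coefficient h_m obtained from a Sherwood number taken ~ Nusselt."""
    sh = nusselt_forward(reynolds)        # heat/mass-transfer analogy: Sh ~ Nu
    h_m = sh * d_o2 / diameter            # mass-transfer coefficient, m/s
    return h_m * rho_air * y_o2 * dh_o2   # W/m^2

# Example: a 3 mm stick at Re = 50 in ambient air (oxygen mass fraction 0.23).
q_surface = oxygen_limited_heat_release(reynolds=50.0, diameter=3e-3, y_o2=0.23)
```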
Abstract:
The 11β-hydroxysteroid dehydrogenase enzymes (11β-HSD) 1 and 2 regulate the amounts of cortisone and cortisol in tissues. An excess of the 11β-HSD1 enzyme, particularly in visceral adipose tissue, causes the classic symptoms of the metabolic syndrome, which offers an opportunity to treat the metabolic syndrome by selective inhibition of 11β-HSD1. Inhibition of the 11β-HSD2 enzyme causes cortisone-mediated activation of mineralocorticoid receptors, which in turn leads to hypertensive side effects. Despite these side effects, inhibiting 11β-HSD2 may be useful in situations where the aim is to raise the amount of cortisol in the body. Numerous selective 11β-HSD1 inhibitors have been developed, but fewer 11β-HSD2 inhibitors have been reported. The difference between the active sites of these two isozymes is also unknown, which complicates the development of selective inhibitors for either enzyme. This work had two aims: (1) to find the difference between the 11β-HSD enzymes and (2) to develop a pharmacophore model that could be used for virtual screening of selective 11β-HSD2 inhibitors. The problem was approached computationally: with homology modeling, docking of small molecules into the protein, ligand-based pharmacophore modeling, and virtual screening. The SwissModeler program was used for homology modeling, and the resulting model superimposed well both on its template (17β-HSD1) and on the 11β-HSD1 enzyme. No difference between the enzymes was found by examining the superimposed structures. Seven compounds, six of which are 11β-HSD2 selective, were docked into both enzymes using the program GOLD. In 11β-HSD1 the compounds bound like most 11β-HSD1-selective or non-selective inhibitors, whereas in 11β-HSD2 all compounds docked in a reversed orientation. This binding mode enables hydrogen bonds to Ser310 and Asn171, amino acids seen only in the 11β-HSD2 enzyme. The program LigandScout3.0 was used for pharmacophore modeling, and the virtual screenings were also run with it. The two pharmacophore models created, based on the six 11β-HSD2-selective compounds also used for docking, consisted of six features (hydrogen-bond acceptor, hydrogen-bond donor, and hydrophobic) and exclusion volumes. For 11β-HSD2 selectivity the most important features are a hydrogen-bond acceptor that can form a bond with Ser310 and a hydrogen-bond donor next to it. No interaction partner for this hydrogen-bond donor was found in the 11β-HSD2 model; a water molecule suitably oriented in the protein could, however, provide the missing interaction partner. Because both pharmacophore models retrieved 11β-HSD2-selective compounds and excluded non-selective ones in a test screening, both models were used to screen a database compiled from the compounds held at the University of Innsbruck (2700 in total). From the hits found in the two screenings, a total of ten compounds were selected and sent for biological testing. The results of the biological tests will ultimately confirm how well the created models actually represent 11β-HSD2 selectivity.
Abstract:
We develop an alternate characterization of the statistical distribution of the inter-cell interference power observed in the uplink of CDMA systems. We show that the lognormal distribution better matches the cumulative distribution and complementary cumulative distribution functions of the uplink interference than the conventionally assumed Gaussian distribution and variants based on it. This is in spite of the fact that many users together contribute to the uplink interference, with the number of users and their locations both being random. Our observations hold even in the presence of power control and cell selection, which have hitherto been used to justify the Gaussian approximation. The parameters of the lognormal are obtained by matching moments, for which detailed analytical expressions that incorporate wireless propagation, cellular layout, power control, and cell selection parameters are developed. The moment-matched lognormal model, while not perfect, is an order of magnitude better than the Gaussian-based models at capturing the interference power distribution.
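As an illustration of the moment-matching step (the paper's analytical moment expressions are not reproduced here), a short sketch that fits lognormal parameters to a given interference mean and variance; the sample data below are synthetic placeholders.

```python
import numpy as np

def lognormal_from_moments(mean, var):
    """Match a lognormal to a given mean and variance by solving
    E[X] = exp(mu + sigma^2 / 2) and Var[X] = (exp(sigma^2) - 1) * E[X]^2."""
    sigma2 = np.log(1.0 + var / mean**2)
    mu = np.log(mean) - 0.5 * sigma2
    return mu, np.sqrt(sigma2)

# Synthetic stand-in for uplink interference power samples (watts).
rng = np.random.default_rng(0)
interference = rng.gamma(shape=2.0, scale=1e-13, size=100_000)
mu, sigma = lognormal_from_moments(interference.mean(), interference.var())
```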
Abstract:
A detailed mechanics-based model is developed to analyze the problem of structural instability in slender aerospace vehicles. Coupling among the rigid-body modes, the longitudinal vibrational modes, and the transverse vibrational modes due to an asymmetric lifting-body cross-section is considered. The model also incorporates the effects of aerodynamic pressure and the propulsive thrust of the vehicle. The model is one-dimensional, and it can be applied to idealized slender vehicles with complex shapes. The condition under which a flexible body with internal stress waves behaves like a perfectly rigid body is derived. Two methods are developed for finite element discretization of the system: (1) a time-frequency Fourier spectral finite element method and (2) an h-p finite element method. Numerical results using the above methods are presented in Part II of this paper. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling, and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. Lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with nine different error distributions on Standard and Poor's 500 index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly, as well as monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard and Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not improved by allowing for skewness. Allowing the kurtosis and skewness to be time-varying does not further improve the density forecasts but, on the contrary, makes them slightly worse. In Essay 3 a new model incorporating conditional variance, skewness, and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed. The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard and Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts for both 1% and 5% VaR. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
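A brief sketch of the GARCH(1,1) variance recursion underlying Essay 1's forecasts. The choice of error distribution (normal versus a leptokurtic alternative) enters through the likelihood used to estimate the parameters, which is omitted here; the parameter values shown are placeholders rather than estimates from the essay.

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path from the GARCH(1,1) recursion
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = returns.var()                     # initialize at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

# Placeholder parameters; in practice they are estimated by maximum likelihood
# under the chosen error distribution.
r = np.random.default_rng(1).normal(0.0, 0.01, 2500)
sigma2 = garch11_variance(r, omega=1e-6, alpha=0.08, beta=0.90)
one_step_forecast = 1e-6 + 0.08 * r[-1] ** 2 + 0.90 * sigma2[-1]
```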
Abstract:
Modeling and forecasting of implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require an accurate measure of volatility. However, this has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options exhibit two patterns: the volatility smirk (skew) and the volatility term structure, which, when examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS and extends the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variations in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS. Of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and an additional 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions; it increases monotonically when moving from the median quantile to the uppermost quantile (i.e., 95%), so OLS underestimates this relationship at the upper quantiles. Additionally, the asymmetric relationship is more pronounced with the smirk (skew) adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested over 1992-2009 using unconditional, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for forecasting the daily VaR of a portfolio tracking the DAX30 index; the implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
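A small sketch of the kind of quantile regression used in the second essay, with synthetic data and a hypothetical specification (the IV change regressed on the return and its negative part); it is not the exact model of the essay, only an illustration of estimating the return-volatility relation at different quantiles.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical daily data: index returns and same-day changes in an IV index.
rng = np.random.default_rng(1)
ret = rng.normal(0.0, 0.01, 2000)
dvol = -2.0 * ret - 4.0 * np.minimum(ret, 0.0) + rng.normal(0.0, 0.01, 2000)
data = pd.DataFrame({"dvol": dvol, "ret": ret, "neg_ret": np.minimum(ret, 0.0)})

# Estimate the return-volatility relation at several quantiles of the IV-change
# distribution; the negative-return term captures the asymmetry.
for q in (0.5, 0.75, 0.95):
    fit = smf.quantreg("dvol ~ ret + neg_ret", data).fit(q=q)
    print(q, fit.params["ret"], fit.params["neg_ret"])
```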
Abstract:
In this thesis we deal with the concept of risk. The objective is to bring together and draw conclusions from some normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. The serial dependency, however, behaves differently in bear and bull markets: it is strongly positive in rising markets, whereas in bear markets it is closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we conclude that volatility is not easily estimated, even from high-frequency data. It is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust across differing volatility regimes than models with a complex parameter universe. In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
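A minimal sketch of the realized variance estimator investigated in essays two and three: the sum of squared intraday log returns for one trading day. The five-minute sampling grid and the price path below are illustrative assumptions.

```python
import numpy as np

def realized_variance(intraday_prices):
    """Realized variance for one trading day: sum of squared intraday
    log returns sampled on a fixed grid (e.g. five-minute prices)."""
    log_returns = np.diff(np.log(intraday_prices))
    return np.sum(log_returns ** 2)

# Illustrative five-minute price path for a single day (79 observations).
steps = np.random.default_rng(2).normal(0.0, 0.0005, 78)
prices = 100.0 * np.exp(np.concatenate(([0.0], np.cumsum(steps))))
daily_rv = realized_variance(prices)
```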
Abstract:
Financial time series tend to behave in a manner that is not directly drawn from a normal distribution. Asymmetries and nonlinearities are usually present, and these characteristics need to be taken into account, which makes forecasting future return and risk rather complicated. The existing models for predicting risk help to a certain degree, but the complexity of financial time series data makes the task difficult. The essays in this dissertation support the introduction of nonlinearities and asymmetries for the purpose of better models and forecasts of both the mean and the variance. Linear and nonlinear models are consequently introduced in this dissertation. The advantage of nonlinear models is that they can account for asymmetries. Asymmetric patterns usually mean that large negative returns appear more often than positive returns of the same magnitude. This goes hand in hand with the fact that negative returns are associated with higher risk than positive returns of the same magnitude. These models are important because of their ability to produce the best possible estimates and predictions of future returns and risk.
Abstract:
The human resource (HR) function is under pressure both to change roles and to play a large variety of roles. Questions of change and development in the HR function become particularly interesting in the context of mergers and acquisitions when two corporations are integrated. The purpose of the thesis is to examine the roles played by the HR function in the context of large-scale mergers and thus to understand what happens to the HR function in such change environments, and to shed light on the underlying factors that influence changes in the HR function. To achieve this goal, the study seeks first to identify the roles played by the HR function before and after the merger, and second, to identify the factors that affect the roles played by the HR function. It adopts a qualitative case study approach including ten focal case organisations (mergers) and four matching cases (non-mergers). The sample consists of large corporations originating from either Finland or Sweden. HR directors and members of the top management teams within the case organisations were interviewed. The study suggests that changes occur within the HR function, and that the trend is for the HR function to become increasingly strategic. However, the HR function was found to play strategic roles only when the HR administration ran smoothly. The study also suggests that the HR function has become more versatile. An HR function that was perceived to be mainly administrative before the merger is likely after the merger to perform some strategically important activities in addition to the administrative ones. Significant changes in the roles played by the HR function were observed in some of the case corporations. This finding suggests that the merger integration process is a window of opportunity for the HR function. HR functions that take a proactive and leading role during the integration process might expand the number of roles played and move from being an administrator before the merger to also being a business partner after integration. The majority of the HR functions studied remained mainly reactive during the organisational change process and although the evidence showed that they moved towards strategic tasks, the intra-functional changes remained comparatively small in these organisations. The study presents a new model that illustrates the impact of the relationship between the top management team and the HR function on the role of the HR function. The expectations held by the top management team for the HR function and the performance of the HR function were found to interact. On a dimension reaching from tactical to strategic, HR performance is likely to correspond to the expectations held by top management.
Abstract:
The objective of this paper is to investigate and model the characteristics of the prevailing volatility smiles and surfaces on the DAX and ESX index options markets. Continuing the line of research on Implied Volatility Functions, the Standardized Log-Moneyness model is introduced and fitted to historical data. The model replaces the constant volatility parameter of the Black & Scholes pricing model with a matrix of volatilities with respect to moneyness and maturity, and it is tested out-of-sample. Considering the dynamics, the results support the hypotheses put forward in this study, implying that the smile increases in magnitude when maturity and ATM volatility decrease, and that there is a negative/positive correlation between a change in the underlying asset/time to maturity and implied ATM volatility. Further, the Standardized Log-Moneyness model indicates an improvement in pricing accuracy compared to previous Implied Volatility Function models, although the parameters of the models must be re-estimated continuously if they are to fully capture the changing dynamics of the volatility smiles.
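A short sketch of the constant-volatility Black & Scholes baseline that the model replaces, together with one plausible standardization of log-moneyness; the exact definition used in the paper is not given in the abstract, so the scaling by ATM volatility and root maturity below is an assumption.

```python
import numpy as np
from scipy.stats import norm

def bs_call(spot, strike, maturity, rate, sigma):
    """Black & Scholes European call with a single constant volatility."""
    d1 = (np.log(spot / strike) + (rate + 0.5 * sigma**2) * maturity) / (sigma * np.sqrt(maturity))
    d2 = d1 - sigma * np.sqrt(maturity)
    return spot * norm.cdf(d1) - strike * np.exp(-rate * maturity) * norm.cdf(d2)

def standardized_log_moneyness(spot, strike, maturity, rate, atm_vol):
    """Hypothetical standardization: log-moneyness in the implied forward
    price, scaled by ATM volatility and sqrt(maturity)."""
    forward = spot * np.exp(rate * maturity)
    return np.log(strike / forward) / (atm_vol * np.sqrt(maturity))

# A smile model would then map (standardized log-moneyness, maturity) to a
# volatility, which replaces the constant sigma passed to bs_call.
```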
Abstract:
The notion of optimization is inherent in protein design. A long linear chain of the twenty types of amino acid residues is known to fold to a 3-D conformation that minimizes the combined inter-residue energy interactions. There are two distinct protein design problems, viz. predicting the folded structure from a given sequence of amino acid monomers (the folding problem) and determining a sequence for a given folded structure (the inverse folding problem). These two problems are closely analogous to engineering structural analysis and structural optimization problems, respectively. In the folding problem, a protein chain with a given sequence folds to a conformation, called the native state, which has a unique global minimum energy value when compared to all other unfolded conformations. This involves a search in the conformation space. It is somewhat akin to the principle of minimum potential energy that determines the deformed static equilibrium configuration of an elastic structure of given topology, shape, and size subjected to certain boundary conditions. In the inverse folding problem, one has to design a sequence with some objectives (having a specific feature of the folded structure, docking with another protein, etc.) and constraints (the sequence being fixed in some portion, a particular composition of amino acid types, etc.) while obtaining a sequence that would fold to the desired conformation, satisfying the criteria of folding. This requires a search in the sequence space. It is similar to structural optimization in the design-variable space, wherein a certain feature of the structural response is optimized subject to some constraints while satisfying the governing static or dynamic equilibrium equations. Based on this similarity, in this work we apply topology optimization methods to protein design, discuss modeling issues, and present some initial results.
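To make the "search in sequence space" concrete, here is a toy sketch using simulated annealing over a sequence with a stand-in scoring function; it illustrates the search space only and is not the topology optimization formulation applied in the paper.

```python
import math
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the twenty residue types

def toy_score(sequence):
    """Stand-in objective; a real inverse-folding study would score how well
    the sequence stabilizes the target conformation."""
    return sum(abs(ord(a) - ord(b)) for a, b in zip(sequence, sequence[1:]))

def anneal_sequence(length=30, steps=10_000, temp=5.0, cooling=0.999):
    """Simulated-annealing search over sequences: propose single-residue
    mutations and accept them with a temperature-dependent probability."""
    seq = [random.choice(AMINO_ACIDS) for _ in range(length)]
    score = toy_score(seq)
    for _ in range(steps):
        trial = list(seq)
        trial[random.randrange(length)] = random.choice(AMINO_ACIDS)
        trial_score = toy_score(trial)
        if trial_score <= score or random.random() < math.exp((score - trial_score) / temp):
            seq, score = trial, trial_score
        temp *= cooling
    return "".join(seq), score
```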
Abstract:
The copper(II) complex [Cu(salgly)(bpy)]·4H₂O (1), where salgly is a tridentate glycinato-salicylaldimine Schiff-base ligand, is prepared and structurally characterized. The complex is found to be catalytically active in the oxidation of ascorbic acid by dioxygen, and the process is also effective in the presence of benzylamine, giving benzaldehyde as a product, thus modeling the activity of the Cu-B site of dopamine beta-hydroxylase. (C) 2000 Elsevier Science S.A. All rights reserved.
Abstract:
A method is presented to model server unreliability in closed queuing networks. Breakdowns and repairs of servers, assumed to be time-dependent, are modeled using virtual customers and virtual servers in the system. The problem is thus converted into a closed queue with all servers reliable and with preemptive-resume priority centers. Several recent preemptive-priority approximations, as well as an approximation proposed here, are used in the analysis. This method has approximately the same computational requirements as mean-value analysis for a network of identical dimensions and is therefore very efficient.
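For reference, a compact sketch of exact mean-value analysis for a closed product-form network of single-server queueing centers, the baseline whose computational cost the proposed method is compared with; the unreliability construction with virtual customers and virtual servers is not reproduced here.

```python
def mean_value_analysis(service_times, visit_ratios, n_customers):
    """Exact MVA recursion for a closed network of single-server queueing
    centers: residence times, throughput, and mean queue lengths are built
    up one customer at a time."""
    k = len(service_times)
    queue_len = [0.0] * k
    throughput = 0.0
    for n in range(1, n_customers + 1):
        residence = [visit_ratios[i] * service_times[i] * (1.0 + queue_len[i])
                     for i in range(k)]
        throughput = n / sum(residence)
        queue_len = [throughput * residence[i] for i in range(k)]
    return throughput, queue_len

# Example: three centers, unit visit ratios, ten customers circulating.
x, q = mean_value_analysis([0.05, 0.08, 0.02], [1.0, 1.0, 1.0], 10)
```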