995 results for Exponential models
Abstract:
The important application of semistatic hedging in financial markets naturally leads to the notion of quasi-self-dual processes. The focus of our study is to give new characterizations of quasi-self-duality. We analyze quasi-self-dual Lévy-driven markets which do not admit arbitrage opportunities and derive a set of equivalent conditions for the stochastic logarithm of quasi-self-dual martingale models. Since for nonvanishing order parameter two martingale properties have to be satisfied simultaneously, there is a nontrivial relation between the order and shift parameter representing carrying costs in financial applications. This leads to an equation containing an integral term which has to be inverted in applications. We first discuss several important properties of this equation and, for some well-known Lévy-driven models, derive a family of closed-form inversion formulae.
Abstract:
Most models of tumorigenesis assume that the tumor grows by increased cell division. In these models, it is generally supposed that daughter cells behave as do their parents, and cell numbers have clear potential for exponential growth. We have constructed simple mathematical models of tumorigenesis through failure of programmed cell death (PCD) or differentiation. These models do not assume that descendant cells behave as their parents do. The models predict that exponential growth in cell numbers does sometimes occur, usually when stem cells fail to die or differentiate. At other times, exponential growth does not occur: instead, the number of cells in the population reaches a new, higher equilibrium. This behavior is predicted when fully differentiated cells fail to undergo PCD. When cells of intermediate differentiation fail to die or to differentiate further, the values of growth parameters determine whether growth is exponential or leads to a new equilibrium. The predictions of the model are sensitive to small differences in growth parameters. Failure of PCD and differentiation, leading to a new equilibrium number of cells, may explain many aspects of tumor behavior: for example, early premalignant lesions such as cervical intraepithelial neoplasia, the fact that some tumors very rarely become malignant, the observation of plateaux in the growth of some solid tumors, and, finally, long lag phases of growth until mutations arise that eventually result in exponential growth.
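The equilibrium-versus-exponential dichotomy described in this abstract can be illustrated with a minimal two-compartment sketch (this is an assumed toy model, not the authors' actual equations; the rates r, p and mu are hypothetical):

```python
import numpy as np  # not required; pure-Python Euler steps below

def simulate(r, p, mu, S0=100.0, D0=0.0, T=200.0, dt=0.01):
    """Euler integration of a toy two-compartment model:
       dS/dt = r*S          (stem cells; r > 0 if they fail to die/differentiate)
       dD/dt = p*S - mu*D   (differentiated cells; mu is the PCD rate)."""
    S, D = S0, D0
    for _ in range(int(T / dt)):
        S += r * S * dt
        D += (p * S - mu * D) * dt
    return S, D

# Scenario 1: stem-cell homeostasis (r = 0) with reduced but nonzero PCD.
# The differentiated pool settles at the higher equilibrium p*S/mu
# instead of growing without bound.
_, D_eq = simulate(r=0.0, p=1.0, mu=0.1)   # equilibrium near 1000

# Scenario 2: stem cells fail to die or differentiate (r > 0),
# giving exponential growth of cell numbers.
S_exp, _ = simulate(r=0.05, p=1.0, mu=0.1)
```

The sensitivity to growth parameters noted in the abstract corresponds here to the sign of r: any positive net stem-cell growth rate tips the system from equilibrium to exponential growth.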
Abstract:
The exponential growth of subjective information in the framework of Web 2.0 has led to the need to create Natural Language Processing tools able to analyse and process such data for multiple practical applications. These tools require training on specifically annotated corpora, whose level of detail must be fine enough to capture the phenomena involved. This paper presents EmotiBlog, a fine-grained annotation scheme for subjectivity. We show the manner in which it is built and demonstrate the benefits it brings to systems using it for training, through experiments we carried out on opinion mining and emotion detection. We employ corpora of different textual genres: a set of annotated reported speech extracted from news articles, the news titles annotated with polarity and emotion from SemEval 2007 (Task 14), and ISEAR, a corpus of real-life self-expressed emotion. We also show how the model built from the EmotiBlog annotations can be enhanced with external resources. The results demonstrate that EmotiBlog, through its structure and annotation paradigm, offers high-quality training data for systems dealing with both opinion mining and emotion detection.
Abstract:
NPT and NVT Monte Carlo simulations are applied to models for methane and water to predict the PVT behaviour of these fluids over a wide range of temperatures and pressures. The potential models examined in this paper have previously been presented in the literature with their specific parameters optimised to fit phase coexistence data. The exponential-6 potential for methane gives generally good prediction of PVT behaviour over the full range of temperatures and pressures studied, with the only significant deviation from experimental data seen at high temperatures and pressures. The NSPCE water model shows very poor prediction of PVT behaviour, particularly at dense conditions. To improve this, the charge separation in the NSPCE model is varied with density. Improvements for vapour and liquid phase PVT predictions are achieved with this variation. No improvement was found in the prediction of the oxygen-oxygen radial distribution by varying charge separation under dense phase conditions. (C) 2004 Elsevier B.V. All rights reserved.
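For readers unfamiliar with the exponential-6 (modified Buckingham) form, a common parameterisation can be sketched as follows; the well depth eps, minimum position rm and softness alpha below are illustrative values, not the fitted methane parameters used in the paper:

```python
import math

def exp6(r, eps, rm, alpha):
    """Exponential-6 potential in a common parameterisation:
    U(r) = eps/(alpha-6) * [6*exp(alpha*(1 - r/rm)) - alpha*(rm/r)**6],
    with well depth -eps at the minimum r = rm."""
    return eps / (alpha - 6.0) * (
        6.0 * math.exp(alpha * (1.0 - r / rm)) - alpha * (rm / r) ** 6
    )

# Illustrative (hypothetical) parameters.
eps, rm, alpha = 1.0, 4.2, 15.0
u_min = exp6(rm, eps, rm, alpha)       # potential at the minimum: -eps
u_far = exp6(3.0 * rm, eps, rm, alpha) # weak attraction far from the minimum
```

The exponential repulsive wall (first term) replaces the r^-12 term of Lennard-Jones, which is one reason exp-6 models can track PVT data over wide temperature and pressure ranges.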
Abstract:
We develop a model for exponential decay of broadband pulses, and examine its implications for experiments on optical precursors. One of the signature features of Brillouin precursors is attenuation with a less rapid decay than that predicted by Beer's Law. Depending on the pulse parameters and the model that is adopted for the dielectric properties of the medium, the limiting z-dependence of the loss has been described as z^(-1/2), z^(-1/3), exponential, or, in more detailed descriptions, some combination of the above. Experimental results in the search for precursors are examined in light of the different models, and a stringent test for sub-exponential decay is applied to data on propagation of 500 femtosecond pulses through 1-5 meters of water. (C) 2005 Optical Society of America.
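The qualitative distinction between Beer's-Law and algebraic attenuation can be sketched numerically (illustrative decay constants only, not the paper's water data): however small its initial amplitude, a z^(-1/2) envelope eventually exceeds any exponential, and that crossover is what a sub-exponential decay test looks for.

```python
import math

def beer(z, alpha=1.0):
    """Beer's-Law intensity decay exp(-alpha*z)."""
    return math.exp(-alpha * z)

def precursor(z, z0=1.0):
    """Algebraic z^(-1/2) envelope often quoted for Brillouin precursors."""
    return (z / z0) ** -0.5

# Even a precursor component 1000x weaker at z = z0 dominates the
# exponentially attenuated field at sufficient depth.
deep = [z for z in range(1, 50) if 1e-3 * precursor(z) > beer(z)]
```

With these (assumed) constants the crossover occurs at z = 8; in an experiment, the analogous signature is a departure from a straight line on a log-intensity versus depth plot.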
Abstract:
The recurrence interval statistics for regional seismicity follow a universal distribution function, independent of the tectonic setting or average rate of activity (Corral, 2004). The universal function is a modified gamma distribution, with power-law scaling for recurrence intervals shorter than the mean interval and exponential decay for longer intervals. We employ the method of Corral (2004) to examine the recurrence statistics of a range of cellular automaton earthquake models. The majority of models have an exponential distribution of recurrence intervals, the same as that of a Poisson process. One model, the Olami-Feder-Christensen automaton, has recurrence statistics consistent with regional seismicity for a certain range of the conservation parameter of that model. For conservation parameters in this range, the event size statistics are also consistent with regional seismicity. Models whose dynamics are dominated by characteristic earthquakes do not appear to display universality of recurrence statistics.
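The Poisson-process baseline against which the automata are compared can be sketched as follows: for a Poisson process the inter-event times are exponential, so their coefficient of variation (std/mean) equals 1, whereas the modified-gamma universal curve deviates from this. (The simulated rate below is arbitrary.)

```python
import random

random.seed(0)

# Exponential recurrence intervals of a Poisson process with rate 2 events
# per unit time (illustrative value).
rate = 2.0
intervals = [random.expovariate(rate) for _ in range(100_000)]

mean = sum(intervals) / len(intervals)
var = sum((x - mean) ** 2 for x in intervals) / len(intervals)
cv = var ** 0.5 / mean   # coefficient of variation; 1 for exponential
```

A measured CV significantly different from 1 (as for regional seismicity, which is clustered) is one simple diagnostic separating the Olami-Feder-Christensen regime from the Poisson-like automata.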
Abstract:
The particle size of the bed sediments in or on many natural streams, alluvial fans, laboratory flumes, irrigation canals and mine waste deltas varies exponentially with distance along the stream. A plot of the available worldwide exponential bed particle size diminution coefficient data against stream length is presented which shows that all the data lie within a single narrow band extending over virtually the whole range of stream lengths and bed sediment particle sizes found on Earth. This correlation applies to both natural and artificial flows with both sand and gravel beds, irrespective of either the solids concentration or whether normal or reverse sorting occurs. This strongly suggests that there are common mechanisms underlying the exponential diminution of bed particles in subaerial aqueous flows of all kinds. Thus existing models of sorting and abrasion applicable to some such flows may be applicable to others. A comparison of exponential laboratory abrasion and field diminution coefficients suggests that abrasion is unlikely to be significant in gravel and sand bed streams shorter than about 10 km to 100 km, and about 500 km, respectively. Copyright (C) 1999 John Wiley & Sons, Ltd.
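The exponential diminution model underlying the correlation, D(x) = D0 * exp(-a*x), and the log-linear estimation of the diminution coefficient a can be sketched as follows (the grain sizes and coefficient are synthetic, not the worldwide field data):

```python
import math

# Synthetic bed-sediment sizes following D(x) = D0 * exp(-a*x), with an
# assumed diminution coefficient a = 0.02 per km (illustrative only).
D0, a = 64.0, 0.02                      # mm, 1/km
xs = [0.0, 10.0, 50.0, 100.0, 200.0]    # distance downstream, km
Ds = [D0 * math.exp(-a * x) for x in xs]

# Log-linear least squares recovers the coefficient: ln D = ln D0 - a*x.
n = len(xs)
mx = sum(xs) / n
my = sum(math.log(D) for D in Ds) / n
slope = sum((x - mx) * (math.log(D) - my) for x, D in zip(xs, Ds)) / \
        sum((x - mx) ** 2 for x in xs)
a_hat = -slope
```

Plotting fitted coefficients like a_hat against stream length is exactly the kind of compilation the abstract describes.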
Abstract:
This report outlines the derivation and application of a non-zero-mean Gaussian process with a polynomial-exponential covariance function, which forms the prior wind-field model used in 'autonomous' disambiguation. It is used principally because the non-zero mean permits the computation of realistic local wind-vector prior probabilities, required as the marginals of the full wind-field prior when applying the scaled-likelihood trick. Since the full prior is multivariate normal, these marginals are very simple to compute.
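The simplicity of the marginals can be sketched as follows. The kernel below is one generic polynomial-times-exponential form (a Matern-3/2-like covariance); it and the site positions and mean wind value are assumptions for illustration, not the report's actual kernel or data:

```python
import math

def poly_exp_cov(r, sigma2=1.0, ell=2.0):
    """A generic polynomial-exponential covariance of separation r
    (Matern-3/2-like form; hypothetical, not the report's exact kernel)."""
    return sigma2 * (1.0 + r / ell) * math.exp(-r / ell)

# Wind-field prior at a few 1-D sites with a non-zero mean.
sites = [0.0, 1.0, 3.0]
mean = [5.0, 5.0, 5.0]   # assumed prior mean wind component, m/s
K = [[poly_exp_cov(abs(a - b)) for b in sites] for a in sites]

# Because the joint prior is multivariate normal, the marginal at site i
# is simply N(mean[i], K[i][i]) -- no integration needed.
marginals = [(mean[i], K[i][i]) for i in range(len(sites))]
```

This diagonal read-off is what makes the scaled-likelihood trick cheap to apply at every local wind vector.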
Abstract:
An interactive hierarchical Generative Topographic Mapping (HGTM) [HGTM] has been developed to visualise complex data sets. In this paper, we build a more general visualisation system by extending the HGTM visualisation system in 3 directions: (1) We generalize HGTM to noise models from the exponential family of distributions. The basic building block is the Latent Trait Model (LTM) developed in [Kabanpami]. (2) We give the user a choice of initializing the child plots of the current plot in either interactive or automatic mode. In the interactive mode the user interactively selects "regions of interest" as in [HGTM], whereas in the automatic mode an unsupervised minimum message length (MML)-driven construction of a mixture of LTMs is employed. (3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool to improve our understanding of the visualisation plots, since they can highlight the boundaries between data clusters. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode. Such a situation often arises when visualizing large data sets. We illustrate our approach on a toy example and apply our system to three more complex real data sets.
Abstract:
Recently, we have developed the hierarchical Generative Topographic Mapping (HGTM), an interactive method for visualization of large high-dimensional real-valued data sets. In this paper, we propose a more general visualization system by extending HGTM in three ways, which allows the user to visualize a wider range of data sets and better support the model development process. 1) We integrate HGTM with noise models from the exponential family of distributions. The basic building block is the Latent Trait Model (LTM). This enables us to visualize data of inherently discrete nature, e.g., collections of documents, in a hierarchical manner. 2) We give the user a choice of initializing the child plots of the current plot in either interactive or automatic mode. In the interactive mode, the user selects "regions of interest," whereas in the automatic mode, an unsupervised minimum message length (MML)-inspired construction of a mixture of LTMs is employed. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode. Such a situation often arises when visualizing large data sets. 3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool to improve our understanding of the visualization plots, since they can highlight the boundaries between data clusters. We illustrate our approach on a toy example and evaluate it on three more complex real data sets. © 2005 IEEE.
Abstract:
We study a class of models that has been used with success in the modelling of climatological sequences. These models are based on the notion of renewal. We first examine the probabilistic aspects of these models and then study the estimation of their parameters and their asymptotic properties, in particular consistency and asymptotic normality. For applications, we discuss two particular classes of discrete-time alternating renewal processes. The first class is defined by sojourn-time laws that are translated (shifted) negative binomial distributions; the second class, suggested by Green, is derived from a continuous-time alternating renewal process whose sojourn-time laws are exponential with parameters α0 and α1, respectively.
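The second class can be sketched by direct simulation: an alternating renewal process whose sojourn times in the two states (e.g. dry/wet spells in a climatological sequence) are exponential with rates a0 and a1. The rates below are illustrative, not estimates from data:

```python
import random

random.seed(1)

# Alternating renewal process with exponential sojourn laws of rates
# a0 (state 0) and a1 (state 1) -- illustrative parameter values.
a0, a1 = 0.5, 2.0
time_in = [0.0, 0.0]
state = 0
for _ in range(200_000):
    rate = a0 if state == 0 else a1
    time_in[state] += random.expovariate(rate)
    state = 1 - state

# Long-run fraction of time in state 0 is (1/a0) / (1/a0 + 1/a1),
# i.e. a1 / (a0 + a1) = 0.8 for these rates.
frac0 = time_in[0] / sum(time_in)
```

The simulated occupancy fraction converging to the renewal-theoretic value is a quick consistency check of the kind the asymptotic results in the paper formalise for the estimators.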
Abstract:
In the area of stress-strength models there has been a large amount of work on estimating the reliability R = Pr(X2 < X1) when X1 and X2 are independent random variables belonging to the same univariate family of distributions. The algebraic form of R = Pr(X2 < X1) has been worked out for the majority of well-known distributions, including the normal, uniform, exponential, gamma, Weibull and Pareto. However, there are still many other distributions for which the form of R is not known. We have identified at least some 30 distributions with no known form for R. In this paper we consider some of these distributions and derive the corresponding forms of the reliability R. The calculations involve the use of various special functions.
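As a sketch of what "a known algebraic form for R" means, the exponential case (one of the solved families listed above) has the closed form R = l2/(l1 + l2) for independent X1 ~ Exp(l1), X2 ~ Exp(l2), which a Monte Carlo estimate can verify; the rates below are arbitrary:

```python
import random

random.seed(0)

# Closed form for independent exponentials X1 ~ Exp(l1), X2 ~ Exp(l2):
#   R = Pr(X2 < X1) = l2 / (l1 + l2)
l1, l2 = 1.0, 3.0
R_closed = l2 / (l1 + l2)   # 0.75 for these rates

# Monte Carlo check of the algebraic form.
n = 200_000
hits = sum(random.expovariate(l2) < random.expovariate(l1) for _ in range(n))
R_mc = hits / n
```

For the distributions treated in the paper no such elementary formula exists, and the derivations instead involve special functions.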
Abstract:
Mathematics Subject Classification: 74D05, 26A33
Abstract:
The northern Antarctic Peninsula is one of the fastest changing regions on Earth. The disintegration of the Larsen-A Ice Shelf in 1995 caused tributary glaciers to adjust by speeding up, surface lowering, and overall increased ice-mass discharge. In this study, we investigate the temporal variation of these changes at the Dinsmoor-Bombardier-Edgeworth glacier system by analyzing dense time series from various spaceborne and airborne Earth observation missions, covering precollapse ice shelf conditions and subsequent adjustments through 2014. Our results show a response of the glacier system some months after the breakup, reaching maximum surface velocities at the glacier front of up to 8.8 m/d in 1999 and a subsequent decrease to ~1.5 m/d in 2014. Using a dense time series of interferometrically derived TanDEM-X digital elevation models and photogrammetric data, an exponential function was fitted to the decrease in surface elevation. Elevation changes in areas below 1000 m a.s.l. amounted to at least 130±15 m between 1995 and 2014, with change rates of ~3.15 m/a between 2003 and 2008. Current change rates (2010-2014) are in the range of 1.7 m/a. Mass imbalances were computed with different scenarios of boundary conditions. The most plausible results amount to -40.7±3.9 Gt. The contribution to sea level rise was estimated to be 18.8±1.8 Gt, corresponding to a 0.052±0.005 mm sea level equivalent, for the period 1995-2014. Our analysis and scenario considerations revealed that major uncertainties still exist due to insufficiently accurate ice-thickness information. The second largest uncertainty in the computations was the glacier surface mass balance, which is still poorly known. Our time series analysis facilitates an improved comparison with GRACE data and serves as input to modeling of glacio-isostatic uplift in this region.
The study contributed to a better understanding of how glacier systems adjust to ice shelf disintegration.
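The exponential-relaxation fit used for the surface-elevation decrease can be sketched as follows; the elevation series, asymptote and e-folding time below are synthetic stand-ins, not the study's TanDEM-X data:

```python
import math

# Synthetic surface-elevation change mimicking post-collapse adjustment:
#   h(t) = h_f + dh * exp(-t / tau)
# (illustrative values: asymptotic change h_f, amplitude dh, timescale tau)
h_f, dh, tau = -130.0, 130.0, 6.0          # metres, metres, years
ts = [0.0, 2.0, 5.0, 10.0, 15.0, 19.0]     # years since the 1995 collapse
hs = [h_f + dh * math.exp(-t / tau) for t in ts]

# With the asymptote assumed known, a log-linear least-squares fit of
# ln(h - h_f) against t recovers the e-folding time of the lowering.
n = len(ts)
mt = sum(ts) / n
my = sum(math.log(h - h_f) for h in hs) / n
slope = sum((t - mt) * (math.log(h - h_f) - my) for t, h in zip(ts, hs)) / \
        sum((t - mt) ** 2 for t in ts)
tau_hat = -1.0 / slope
```

In the study the fitted curve summarises how change rates fell from ~3.15 m/a (2003-2008) to ~1.7 m/a (2010-2014) as the glacier system approached a new state.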
Abstract:
The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix-exponential transformation guarantees the positive-definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69-104), especially for developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of RMESV-ALM, and the finite sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high-frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
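The positive-definiteness guarantee mentioned above is a general property of the matrix exponential of a symmetric matrix, which can be sketched in the 2x2 case (the "log-covariance" entries below are arbitrary, and this is the generic mechanism, not the RMESV-ALM specification itself):

```python
import math

def expm_sym2(a, b, c):
    """Matrix exponential of the symmetric 2x2 matrix [[a, b], [b, c]],
    via expm(A) = e^m [cosh(s) I + (sinh(s)/s)(A - m I)],
    where m = (a + c)/2 and s = sqrt(((a - c)/2)^2 + b^2)."""
    m = (a + c) / 2.0
    s = math.hypot((a - c) / 2.0, b)
    coef = math.sinh(s) / s if s > 0 else 1.0
    e = math.exp(m)
    return [[e * (math.cosh(s) + coef * (a - m)), e * coef * b],
            [e * coef * b, e * (math.cosh(s) + coef * (c - m))]]

# Even an indefinite symmetric 'log-covariance' matrix maps to a valid
# covariance: det(expm(A)) = e^{tr A} > 0 and the diagonal is positive,
# so no positivity constraints are needed on the underlying dynamics.
S = expm_sym2(0.2, -0.9, -0.5)
det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
```

Letting the unconstrained entries follow the asymmetric long-memory dynamics while the exponential map enforces positive-definiteness is the design idea the abstract refers to.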