987 results for variance function
Abstract:
In this work we revisit the problem of hedging a contingent claim under the mean-square criterion. We prove that in an incomplete market a probability measure can be identified under which the value process of the hedging portfolio becomes a martingale. This is in fact a new proposition on the martingale representation theorem. The new results also identify a weight function that serves as an approximation to the Radon-Nikodým derivative of the unique neutral martingale measure.
Abstract:
There is an increasing demand for environmental assessments of the marine environment to include ecosystem function. However, existing schemes are predominantly based on taxonomic (i.e. structural) measures of biodiversity. Biodiversity and Ecosystem Function (BEF) relationships are suggested to provide a mechanism for converting taxonomic information into surrogates of ecosystem function. This review assesses the evidence for marine BEF relationships and their potential to be used in practical monitoring applications (i.e. operationalized). Five key requirements were identified for the practical application of BEF relationships: (1) a complete understanding of the strength, direction and prevalence of marine BEF relationships; (2) an understanding of which biological components are influential within specific BEF relationships; (3) that the biodiversity of the selected biological components can be measured easily; (4) that the ecological mechanisms most important for generating marine BEF relationships, i.e. identity effects or complementarity, are known; and (5) that the proportion of the overall functional variance explained by biodiversity, and hence BEF relationships, has been established. Numerous positive and some negative BEF relationships were found within the literature, although many reproduced poorly the natural species richness, trophic structures or multiple functions of real ecosystems (requirement 1). Null relationships were also reported. The consistency of the positive and negative relationships was often low, which compromised the ability to generalize BEF relationships and to apply BEF confidently within marine monitoring. Equally, some biological components and functions have received little or no investigation. Expert judgement was used to attribute biological components using spatial extent, presence and functional rate criteria (requirement 2).
This approach highlighted the biological components contributing most to specific ecosystem functions, and many of the particularly influential components were found to have received the least research attention. The need for biodiversity to be measurable (requirement 3) can be met for most biological components, although this is difficult for the functionally important microbes. Identity effects underpinned most marine BEF relationships (requirement 4). As such, processes that translated structural biodiversity measures into functional diversity were found to generate better BEF relationships. The contribution made by biodiversity, over and above abiotic influences, to the total expression of a particular ecosystem function was rarely measured or considered (requirement 5). Hence it is not possible to determine the overall importance of BEF relationships within the total ecosystem functioning observed. In the few studies where abiotic factors had been considered, it was clear that these modified BEF relationships and had their own direct influence on functional rate. Based on the five requirements, the information required for immediate ‘operationalization’ of BEF relationships within marine functional monitoring is lacking. However, the concept of BEF inclusion within practical monitoring applications, supported by ecological modelling, shows promise for providing surrogate indicators of functioning.
Abstract:
Arcellacea (testate lobose amoebae) communities were assessed from 73 sediment-water interface samples collected from 33 lakes in urban and rural settings within the Greater Toronto Area (GTA), Ontario, Canada, as well as from forested control areas in the Lake Simcoe area, Algonquin Park and eastern Ontario. The results were used to: (1) develop a statistically rigorous arcellacean-based training set for sedimentary phosphorus (Olsen P (OP)) loading; and (2) derive a transfer function to reconstruct OP levels during the post-European settlement era (AD 1870s onward) using a chronologically well-constrained core from Haynes Lake on the environmentally sensitive Oak Ridges Moraine, within the GTA. Ordination analysis indicated that OP most influenced arcellacean assemblages, explaining 6.5% (p < 0.005) of total variance. An improved training set, in which the influence of other important environmental variables (e.g. total organic carbon, total nitrogen, Mg) was reduced, comprised 40 samples from 31 lakes and was used to construct a transfer function for lacustrine arcellaceans for sedimentary phosphorus (Olsen P) using tolerance-downweighted weighted averaging (WA-Tol) with inverse deshrinking (RMSEPjack = 77 pp; r2jack = 0.68). The inferred reconstruction indicates that OP levels remained near pre-settlement background levels from settlement in the late AD 1800s through to the early AD 1970s. Since OP runoff from both forests and pasture is minimal, early agricultural land use within the lake catchment was most likely pasture and/or was used to grow perennial crops such as Timothy-grass for hay. A significant increase in inferred OP concentration beginning ~ AD 1972 may have been related to a change in crops (e.g. corn production) in the catchment, resulting in more runoff, and to the introduction of chemical fertilizers.
A dramatic decline in OP after ~ AD 1985 probably corresponds to a reduction in chemical fertilizer use related to advances in agronomy, which permitted a more precise control over required fertilizer application. Another significant increase in OP levels after ~ AD 1995 may have been related to the construction of a large golf course upslope and immediately to the north of Haynes Lake in AD 1993, where significant fertilizer use is required to maintain the fairways. These results demonstrate that arcellaceans have great potential for reconstructing lake water geochemistry and will complement other proxies (e.g. diatoms) in paleolimnological research.
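The weighted-averaging-with-inverse-deshrinking procedure named above can be sketched compactly: taxon optima are abundance-weighted means of the environmental variable, raw inferences are abundance-weighted means of those optima, and a linear regression of observed on inferred values undoes the resulting shrinkage. The sketch below is a minimal illustration only; the tolerance-downweighting step of WA-Tol is omitted, and all function and variable names are illustrative rather than taken from the study.

```python
import numpy as np

def wa_transfer_function(Y_train, x_train):
    """Weighted-averaging (WA) transfer function with inverse deshrinking.

    Y_train : (n_samples, n_taxa) array of taxon abundances
    x_train : (n_samples,) observed environmental variable (e.g. Olsen P)
    Returns a predict(Y) callable for new assemblage data.
    """
    # Taxon optima: abundance-weighted mean of the variable for each taxon
    optima = (Y_train * x_train[:, None]).sum(axis=0) / Y_train.sum(axis=0)

    # Raw inference: abundance-weighted mean of the taxon optima
    def raw_infer(Y):
        return (Y * optima).sum(axis=1) / Y.sum(axis=1)

    # Inverse deshrinking: regress observed values on raw inferences,
    # then apply that regression to expand the compressed raw scale
    b1, b0 = np.polyfit(raw_infer(Y_train), x_train, 1)

    def predict(Y):
        return b0 + b1 * raw_infer(Y)

    return predict
```

On training data with a clean linear gradient the deshrunk predictions recover the observed values exactly, which is what the deshrinking regression is designed to do.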
Abstract:
Objective: To explore, using functional magnetic resonance imaging (fMRI), the functional organisation of phonological processing in young adults born very preterm.
Subjects: Six right handed male subjects with radiological evidence of thinning of the corpus callosum were selected from a cohort of very preterm subjects. Six normal right handed male volunteers acted as controls.
Method: Blood oxygenation level dependent contrast echoplanar images were acquired over five minutes at 1.5 T while subjects performed the tasks. During the ON condition, subjects were visually presented with pairs of non-words and asked to press a key when a pair rhymed (phonological processing). This task alternated with the OFF condition, which required subjects to make letter case judgments of visually presented pairs of consonant letter strings (orthographic processing). Generic brain activation maps were constructed from individual images by sinusoidal regression and non-parametric testing. Between-group differences in the mean power of experimental response were identified on a voxel-wise basis by analysis of variance.
Results: Compared with controls, the subjects with thinning of the corpus callosum showed significantly reduced power of response in the left hemisphere, including the peristriate cortex and the cerebellum, as well as in the right parietal association area. Significantly increased power of response was observed in the right precentral gyrus and the right supplementary motor area.
Conclusions: The data show evidence of increased frontal and decreased occipital activation in male subjects with neurodevelopmental thinning of the corpus callosum, which may be due to the operation of developmental compensatory mechanisms.
Abstract:
We developed the concept of split-'t to deal with large molecules (in terms of the number of electrons and nuclear charge Z). This naturally leads to partitioning the local energy into components due to each electron shell. Minimization of the variance of the valence-shell local energy is used to optimize a simple two-parameter CuH wave function. Molecular properties (spectroscopic constants and the dipole moment) are calculated for the optimized and nearly optimized wave functions using the variational quantum Monte Carlo method. Our best results are comparable to those from the singles and doubles configuration interaction (SDCI) method.
Abstract:
This thesis tested whether cognitive performance during passive heat stress may be affected by changes in cerebrovascular variables as opposed to strictly thermally-induced changes. A pharmacological reduction in cerebral blood flow (CBF) using indomethacin along with a hypocapnia-induced CBF reduction during passive heat stress (Tre ~1.5°C above baseline) were used to investigate any cerebrovascular-mediated changes in cognitive performance. Repeated measures analysis of variance indicated that One-Touch Stockings of Cambridge (OTS) performance was not affected by a significant reduction in CBF during passive heat stress. More specifically, OTS accuracy measures did not change as a result of either a reduction in CBF or increasing passive heat stress. However, it was found that OTS response time indices improved with increasing passive heat stress independent of CBF changes. In conclusion, a significant reduction in CBF does not cause additional changes in performance of an executive functioning task during severe passive heat stress.
Abstract:
The attached file was created with Scientific WorkPlace LaTeX.
Abstract:
The present study concerns the characterization of probability distributions using the residual entropy function. The concept of entropy is extensively used in the literature as a quantitative measure of the uncertainty associated with a random phenomenon. The commonly used lifetime models in reliability theory are the exponential, Pareto, beta, Weibull and gamma distributions. Several characterization theorems are obtained for the above models using reliability concepts such as the failure rate, the mean residual life function, the vitality function and the variance residual life function. Most of the work on characterization of distributions in the reliability context centres on the failure rate or the residual life function. An important aspect of interest in the study of entropy is that of locating distributions for which Shannon's entropy is maximum subject to certain restrictions on the underlying random variable. We introduce the geometric vitality function and examine its properties; it is established that the geometric vitality function determines the distribution uniquely. The problem of averaging the residual entropy function is examined, and truncated versions of higher-order entropies are also defined. It is established that the residual entropy function determines the distribution uniquely and that its constancy is characteristic of the geometric distribution.
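For reference, the residual entropy function has the following standard form (continuous case shown; the geometric-distribution characterization mentioned above is its discrete analogue). For a lifetime X with density f and survival function F̄,

```latex
H(f;t) = -\int_{t}^{\infty} \frac{f(x)}{\bar{F}(t)}\,
         \log\!\frac{f(x)}{\bar{F}(t)}\, dx ,
\qquad \bar{F}(t) = \Pr(X > t).
```

That is, H(f;t) is the Shannon entropy of the remaining lifetime given survival to age t, and its constancy in t singles out the memoryless models, consistent with the characterization result stated above.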
Abstract:
The electron localization function (ELF) has so far proven a valuable tool for determining the location of electron pairs. Because of this, the ELF has been widely used to understand the nature of chemical bonding and to discuss the mechanisms of chemical reactions. Up to now, most applications of the ELF have been performed with monodeterminantal methods, and only a few attempts to calculate this function for correlated wave functions have been carried out. Here, a formulation of the ELF valid for mono- and multiconfigurational wave functions is given and compared with previously reported approaches. The method described does not require the use of the homogeneous electron gas to define the ELF, at variance with the ELF definition given by Becke. The effect of electron correlation on the ELF, introduced by means of configuration interaction with singles and doubles calculations, is discussed in the light of results derived from a set of atomic and molecular systems.
Abstract:
New construction algorithms for radial basis function (RBF) network modelling are introduced, based on the A-optimality and D-optimality experimental design criteria respectively. We utilize new cost functions, based on experimental design criteria, for model selection that simultaneously optimize model approximation and parameter variance (A-optimality) or model robustness (D-optimality). The proposed approaches are based on the forward orthogonal least-squares (OLS) algorithm, such that the new A-optimality- and D-optimality-based cost functions are constructed on the basis of an orthogonalization process that gains computational advantages and hence maintains the inherent computational efficiency associated with the conventional forward OLS approach. The proposed approach enhances the very popular forward-OLS-based RBF model construction method, since the resultant RBF models are constructed in a manner in which the system dynamics approximation capability, model adequacy and robustness are optimized simultaneously. The numerical examples provided show significant improvement based on the D-optimality design criterion, demonstrating that there is significant room for improvement in modelling via the popular RBF neural network.
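As a rough illustration of the kind of construction described, the sketch below performs greedy forward selection with Gram–Schmidt orthogonalization, scoring each candidate regressor by its OLS error-reduction ratio plus a log-energy penalty on the orthogonalized column as a stand-in for a D-optimality term. The combined score and the weight `beta` are assumptions for illustration, not the paper's exact cost function.

```python
import numpy as np

def forward_ols_doptimality(Phi, y, n_terms, beta=1e-3):
    """Greedy forward selection of regressors (columns of Phi) for target y.

    Score = OLS error-reduction ratio + beta * log(energy of the
    orthogonalized regressor); the log-energy term (an assumed
    D-optimality-style proxy) discourages near-dependent columns.
    """
    n, m = Phi.shape
    selected, W = [], []          # chosen column indices, orthogonalized columns
    yy = y @ y
    for _ in range(n_terms):
        best, best_score = None, -np.inf
        for j in range(m):
            if j in selected:
                continue
            w = Phi[:, j].astype(float).copy()
            for q in W:           # Gram-Schmidt against already-chosen terms
                w -= (q @ w) / (q @ q) * q
            energy = w @ w
            if energy < 1e-12:    # column is (numerically) redundant
                continue
            err = (w @ y) ** 2 / (energy * yy)   # error-reduction ratio
            score = err + beta * np.log(energy)
            if score > best_score:
                best_score, best = score, (j, w)
        j, w = best
        selected.append(j)
        W.append(w)
    return selected
```

Because the columns are orthogonalized incrementally, each candidate's score can be evaluated without refitting the whole model, which is the computational advantage the forward OLS framework provides.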
Abstract:
An alternative blind deconvolution algorithm for white-noise driven minimum phase systems is presented and verified by computer simulation. This algorithm uses a cost function based on a novel idea: variance approximation and series decoupling (VASD), and suggests that not all autocorrelation function values are necessary to implement blind deconvolution.
Abstract:
The detection of long-range dependence in time series analysis is an important task to which this paper contributes by showing that whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, it is not possible for long memory to be identified using the sum of the sample autocorrelations, as usually defined. The reason for this is that the sample sum is a predetermined constant for any stationary time series; a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third order sample (auto) bicorrelations at lags h,k≥1, is also a predetermined constant, different from that in the second order case, for any stationary time series of arbitrary length.
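The "predetermined constant" is easy to verify numerically: with the usual sample autocorrelation (deviations taken from the sample mean, normalized by c_0), the sum over all available lags h = 1, ..., n-1 equals -1/2 for any series whatsoever, since the sum of cross-products telescopes to ((Σd)² - Σd²)/2 with Σd = 0. A minimal check (illustrative code, not from the paper):

```python
import numpy as np

def sum_sample_acf(x):
    """Sum of the sample autocorrelations r_h for h = 1..n-1, using the
    standard estimator r_h = c_h / c_0 with deviations from the sample mean."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - x.mean()
    c0 = d @ d
    return sum((d[:n - h] @ d[h:]) / c0 for h in range(1, n))
```

Running this on any two series with different dependence structures returns the same value, -1/2, which is the paper's point: the sum carries no information about long memory.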
Abstract:
There is an increasing body of research investigating whether abnormal glucose tolerance is associated with cognitive impairments, the evidence from which is equivocal. A systematic search of the literature identified twenty-three studies that assessed either clinically defined impaired glucose tolerance (IGT) or variance in glucose tolerance within the clinically defined normal range (NGT). The findings suggest that poor glucose tolerance is associated with cognitive impairments, with decrements in verbal memory being most prevalent. However, the evidence for decrements in other domains was weak. The NGT studies report a stronger glucose tolerance-cognition association than the IGT studies, which is likely to be due to the greater number of glucose tolerance parameters and the more sensitive cognitive tests in the NGT studies compared with the IGT studies. It is also speculated that the negative cognitive impact of abnormalities in glucose tolerance increases with age, and that glucose consumption is most beneficial to individuals with poor glucose tolerance compared with individuals with normal glucose tolerance. The roles of potential mechanisms are discussed.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In this article, we consider the synthetic control chart with two-stage sampling (SyTS chart) to control the process mean and variance. During the first stage, one item of the sample is inspected; if its value X_1 is close to the target value of the process mean, then the sampling is interrupted. Otherwise, the sampling goes on to the second stage, where the remaining items are inspected and the statistic T = Σ(x_i − μ_0 + ξσ_0)² is computed taking into account all items of the sample. The design parameter ξ is a function of X_1. When the statistic T is larger than a specified value, the sample is classified as nonconforming. According to the synthetic procedure, the signal is based on the Conforming Run Length (CRL). The CRL is the number of samples taken from the process since the previous nonconforming sample until the occurrence of the next nonconforming sample. If the CRL is sufficiently small, then a signal is generated. A comparative study shows that the SyTS chart and the joint X̄ and S charts with double sampling are very similar in performance. However, from the practical viewpoint, the SyTS chart is more convenient to administer than the joint charts.
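The decision rule described above can be sketched as follows. The warning half-width `w`, the limit `ucl` for T, and the CRL threshold `L` are design parameters whose names and values here are illustrative; the sketch also simplifies ξ to a constant, whereas in the paper it depends on X_1.

```python
import numpy as np

def syts_signal(samples, mu0, sigma0, xi, w, ucl, L):
    """Sketch of the synthetic two-stage sampling (SyTS) decision rule.

    samples : iterable of 1-D arrays, one subgroup each (first item is X_1)
    Returns True if the chart signals on this sequence of subgroups.
    """
    crl = 0
    for sample in samples:
        crl += 1
        # Stage 1: if the first item is close to target, stop sampling
        if abs(sample[0] - mu0) <= w * sigma0:
            continue                      # subgroup classified as conforming
        # Stage 2: inspect the rest and compute T over the whole subgroup
        t = np.sum((sample - mu0 + xi * sigma0) ** 2)
        if t > ucl:                       # nonconforming subgroup
            if crl <= L:                  # short conforming run length
                return True               # synthetic chart signals
            crl = 0                       # otherwise restart the CRL count
    return False
```

A single isolated nonconforming subgroup after a long conforming run does not signal; a second nonconforming subgroup arriving within L samples of the first does, which is the CRL mechanism the abstract describes.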