987 results for harmonic random process


Relevance:

80.00%

Publisher:

Abstract:

It is well known that the first successful valuation of a stock option was obtained by solving a deterministic partial differential equation (PDE) of parabolic type with complementary conditions specific to the option. In this approach, the randomness in the option value process is eliminated through a no-arbitrage argument. An alternative approach is to construct a replicating portfolio for the option. From this viewpoint the payoff function of the option is a random process which, under a new probability measure, turns out to be of a special type: a martingale. Accordingly, the value of the replicating portfolio (equivalently, of the option) is calculated as an expectation, with respect to this new measure, of the discounted value of the payoff function. Since the expectation is, by definition, an integral, its calculation can be made simpler by resorting to powerful methods already available in the theory of analytic functions. In this paper we use precisely two of those techniques to find the well-known value of a European call option.
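
A minimal numerical sketch of this martingale view: under the new measure, the discounted expected payoff of a European call can be estimated by Monte Carlo and checked against the Black-Scholes PDE solution. All parameter values below are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

# Illustrative parameters (assumed, not from the paper).
S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0

# Martingale view: the option value is the discounted expectation of
# the payoff under the risk-neutral measure, where S_T is lognormal.
rng = np.random.default_rng(0)
Z = rng.standard_normal(200_000)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
mc_price = np.exp(-r * T) * np.mean(np.maximum(ST - K, 0.0))

# PDE view: the Black-Scholes closed form, for comparison.
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
bs_price = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(f"Monte Carlo: {mc_price:.4f}   Black-Scholes: {bs_price:.4f}")
```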


Relevance:

80.00%

Publisher:

Abstract:

In zebrafish, germ cells are responsible for transmitting genetic information from one generation to the next. During the first cleavages of zebrafish embryonic development, a specialized part of the cytoplasm known as germ plasm is responsible for committing four blastomeres to become the progenitors of all germ cells in the forming embryo. Much is known about how the germ plasm is spatially distributed in early stages of primordial germ cell development, a process described as dependent on microtubules and actin. However, little is known about how the material is inherited after it reorganizes into a perinuclear location, or how its symmetrical distribution is regulated to ensure proper inheritance by both daughter cells. It is also not clear whether a controlled mechanism regulates the number of granules inherited by the daughter cells, or whether inheritance is a random process. We describe the distribution of germ plasm material from 4 hpf to 24 hpf in zebrafish primordial germ cells using Vasa protein as a marker. Vasa-positive material appears conglomerated into three to four large spherical structures at 4 hpf. As development progresses, these large structures become smaller perinuclear granules that reach a total number of approximately 30 at 24 hpf. We investigated how this transformation occurs and how the minus-end-directed microtubule motor protein Dynein plays a role in this process. Additionally, we describe specific colocalization of microtubules and perinuclear granules during interphase and, more interestingly, during all stages of cell division. We show that the distribution of granules appears to be regulated: during cell division, daughter cells inherit an equal number of granules. We propose that, owing to the permanent colocalization of microtubular structures with germinal granules during interphase and cell division, a coordinated mechanism between these structures may ensure proper distribution of the material among daughter cells. Furthermore, we show that exposure to the microtubule-depolymerizing drug nocodazole leads to disassembly of the germ cell nuclear lamin matrix, chromatin condensation, and fusion of granules into one large conglomerate, revealing the dependence of granular distribution on microtubules and proper nuclear structure.
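
As a back-of-the-envelope contrast to the regulated, equal inheritance reported above, the purely random alternative can be made concrete: if each of the roughly 30 granules were inherited independently by either daughter cell, binomial partitioning predicts sizeable imbalances. The sketch below simulates that null model; the granule count comes from the abstract, while the 50/50 inheritance probability and the number of simulated divisions are assumptions.

```python
import numpy as np

# Null model: each of ~30 perinuclear granules (the count reported at
# 24 hpf) is inherited independently with probability 1/2 by either
# daughter cell. Division count and 50/50 probability are assumptions.
rng = np.random.default_rng(1)
n_granules, n_divisions = 30, 10_000

to_daughter_a = rng.binomial(n_granules, 0.5, size=n_divisions)
imbalance = np.abs(2 * to_daughter_a - n_granules)   # |a - b|

print("random partition: mean |difference| =", imbalance.mean())
print("fraction of perfectly equal splits  =", np.mean(imbalance == 0))
# A regulated mechanism, as proposed above, would show far smaller
# imbalances than this random baseline.
```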

Relevance:

80.00%

Publisher:

Abstract:

The time-of-detection method for aural avian point counts is a new method of estimating abundance that allows for uncertain probability of detection. The method has been specifically designed to allow for variation in the singing rates of birds. It involves dividing the time interval of the point count into several subintervals and recording, for each bird, the detection history of the subintervals in which it sings. The method can be viewed as generating data equivalent to closed capture–recapture information. The method differs from the distance and multiple-observer methods in that it does not require all birds to sing during the point count. As this method is new and there is some concern as to how well individual birds can be followed, we carried out a field test of the method using simulated known populations of singing birds, using a laptop computer to send signals to audio stations distributed around a point. The system mimics actual aural avian point counts, but also allows us to know the size and spatial distribution of the populations we are sampling. Fifty 8-min point counts (broken into four 2-min intervals) using eight species of birds were simulated. The singing rate of an individual bird of a species was simulated as a Markovian process (singing bouts followed by periods of silence), which we felt was more realistic than a truly random process. The main emphasis of our paper is to compare results from species singing at (high and low) homogeneous rates per interval with those singing at (high and low) heterogeneous rates. Population size was estimated accurately for the species simulated with a high, homogeneous probability of singing. Populations of simulated species with lower but homogeneous singing probabilities were somewhat underestimated. Populations of species simulated with heterogeneous singing probabilities were substantially underestimated. Underestimation was caused both by the very low detection probabilities of all distant individuals and by individuals with low singing rates also having very low detection probabilities.
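
A minimal sketch of the simulation idea described above: each bird's singing is a two-state Markov chain (bout/silence) sampled across the four 2-min subintervals, and its row of zeros and ones is the capture-recapture detection history. The transition probabilities, the number of birds, and the simplification that singing implies detection are all assumptions for illustration.

```python
import numpy as np

# Each bird: two-state Markov chain (1 = singing bout, 0 = silent),
# sampled once per 2-min subinterval. Transition probabilities and the
# assumption that singing implies detection are illustrative.
rng = np.random.default_rng(2)
p_start, p_continue = 0.4, 0.7    # start a bout / stay in a bout
n_birds, n_intervals = 50, 4      # four 2-min subintervals

histories = np.zeros((n_birds, n_intervals), dtype=int)
state = rng.random(n_birds) < p_start            # singing in interval 1?
for t in range(n_intervals):
    histories[:, t] = state
    # Markovian bouts: staying in a bout is more likely than starting one.
    state = rng.random(n_birds) < np.where(state, p_continue, p_start)

detected = histories.any(axis=1)    # birds with at least one detection
print("example capture histories:\n", histories[detected][:5])
print(f"{detected.sum()} of {n_birds} birds detected at least once")
```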

Relevance:

80.00%

Publisher:

Abstract:

The experimental variogram computed in the usual way by the method of moments and the Haar wavelet transform are similar in that they filter data and yield informative summaries that may be interpreted. The variogram filters out constant values; wavelets can filter variation at several spatial scales and thereby provide a richer repertoire for analysis, and they demand no assumptions other than that of finite variance. This paper compares the two functions, identifying the part of the Haar wavelet transform that gives it its advantages. It goes on to show that the generalized variogram of order k = 1, 2, and 3 filters linear, quadratic, and cubic polynomials from the data, respectively, corresponding to the more complex wavelets in Daubechies's family. The additional filter coefficients of the latter can reveal features of the data that are not evident in its usual form. Three examples in which data recorded at regular intervals on transects are analyzed illustrate the extended form of the variogram. The apparent periodicity of gilgais in Australia seems to be accentuated as filter coefficients are added, but otherwise the analysis provides no new insight. Analysis of hyperspectral data with a strong linear trend showed that the wavelet-based variograms filtered it out. Adding filter coefficients in the analysis of the topsoil across the Jurassic scarplands of England changed the upper bound of the variogram; it then resembled the within-class variogram computed by the method of moments. To elucidate these results, we simulated several series of data to represent a random process with values fluctuating about a mean, data with a long-range linear trend, data with local trend, and data with stepped transitions. The results suggest that the wavelet variogram can filter out the effects of long-range trend and of transitions from one class to another, as across boundaries, but not of local trend.
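
To make the comparison concrete, here is a small sketch, assuming a regular transect and simulated data rather than the paper's series, that computes the method-of-moments variogram alongside the mean squared Haar detail coefficients. At the finest scale the Haar filter is a scaled first difference, so both statistics filter out constant values.

```python
import numpy as np

def variogram(z, max_lag):
    """Experimental (method-of-moments) variogram on a regular transect."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

def haar_detail_variance(z, level):
    """Mean squared Haar detail coefficient at dyadic scale 2**level.

    The Haar detail at level j compares half-block means over blocks of
    length 2**j; at level 1 it is a scaled first difference, so its
    mean square matches the lag-1 variogram over non-overlapping pairs.
    """
    step = 2 ** (level - 1)
    n = (len(z) // (2 * step)) * 2 * step
    halves = z[:n].reshape(-1, 2, step).mean(axis=2)   # (block, half)
    detail = (halves[:, 0] - halves[:, 1]) * np.sqrt(step / 2.0)
    return np.mean(detail ** 2)

# A random series fluctuating about a mean, plus a long-range linear
# trend (illustrative, not the paper's field data).
rng = np.random.default_rng(3)
x = np.arange(512)
z = rng.normal(size=512) + 0.01 * x

print("variogram, lags 1-4:", variogram(z, 4).round(3))
print("Haar detail variances, levels 1-3:",
      [round(haar_detail_variance(z, j), 3) for j in range(1, 4)])
```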

Relevance:

80.00%

Publisher:

Abstract:

With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events, such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological "reanalysis" data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33-year reanalysis data set (MERRA, from NASA-GMAO) is used to construct an hourly time series of nationally aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33-year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data as well as the underlying model are freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
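
A sketch of how such event counts could be extracted and checked for Poisson-like behaviour: prolonged low-generation events are runs of consecutive hours below a capacity-factor threshold, and for a Poisson-like process the yearly counts should have variance close to their mean. The hourly series below is a simulated stand-in, not the MERRA-derived GB data, and the threshold and run length are assumptions.

```python
import numpy as np

def count_events(cf, threshold=0.1, min_hours=24):
    """Count runs of >= min_hours consecutive hours below threshold."""
    below = np.concatenate(([0], (cf < threshold).astype(int), [0]))
    edges = np.diff(below)
    starts = np.where(edges == 1)[0]
    ends = np.where(edges == -1)[0]
    return int(np.sum((ends - starts) >= min_hours))

# Simulated stand-in for a 33-year hourly capacity-factor series: a
# persistent AR(1) signal squashed into (0, 1). Not MERRA data.
rng = np.random.default_rng(4)
hours_per_year, n_years = 8760, 33
counts = []
for _ in range(n_years):
    e = rng.normal(size=hours_per_year)
    x = np.empty(hours_per_year)
    x[0] = e[0]
    for t in range(1, hours_per_year):
        x[t] = 0.997 * x[t - 1] + 0.08 * e[t]
    cf = 1.0 / (1.0 + np.exp(-(x - 0.5) / 0.3))   # map to (0, 1)
    counts.append(count_events(cf))

counts = np.array(counts)
# For a Poisson-like process the yearly counts should have an index of
# dispersion (variance / mean) near 1.
print("mean:", counts.mean().round(2), "variance:", counts.var(ddof=1).round(2))
```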

Relevance:

80.00%

Publisher:

Abstract:

A discrete-time random process is described which can generate bursty sequences of events. A Bernoulli process, where the probability of an event occurring at time t is given by a fixed probability x, is modified to include a memory effect whereby the event probability is increased proportionally to the number of events that occurred within a given amount of time preceding t. For small values of x the interevent time distribution follows a power law with exponent −2 − x. We consider a dynamic network where each node forms and breaks connections according to this process. The value of x for each node depends on the fitness distribution, ρ(x), from which it is drawn; we find exact solutions for the expectation of the degree distribution for a variety of possible fitness distributions, both with and without the memory effect. This work can potentially lead to methods for uncovering hidden fitness distributions from fast-changing temporal network data, such as online social communications and fMRI scans.
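
The described process translates directly into a short simulation. The length of the memory window and the proportionality constant below are assumptions, since the abstract fixes only the structure of the process.

```python
import numpy as np

def bursty_process(x, n_steps, window=10, alpha=0.05, seed=0):
    """Bernoulli process with memory: the event probability at step t
    is x plus alpha times the number of events in the preceding
    `window` steps (window and alpha are illustrative choices)."""
    rng = np.random.default_rng(seed)
    events = np.zeros(n_steps, dtype=int)
    for t in range(n_steps):
        recent = events[max(0, t - window):t].sum()
        events[t] = rng.random() < min(1.0, x + alpha * recent)
    return events

events = bursty_process(x=0.01, n_steps=200_000)
iet = np.diff(np.flatnonzero(events))      # interevent times
# For small x the interevent-time distribution should show a
# power-law tail with exponent -2 - x.
hist, edges = np.histogram(iet, bins=np.logspace(0, 4, 25), density=True)
print("interevent-time density in log-spaced bins:", hist[:8].round(6))
```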

Relevance:

80.00%

Publisher:

Abstract:

We report time-evolution studies of low-coverage CO adsorption (surface hydrogen site blocking < 40%) and oxidative stripping on stepped Pt(776) and Pt(554) surfaces. It was observed that there is no preferential site occupancy for CO adsorption on step or terrace sites. It is proposed that CO adsorption onto these surfaces is a random process, and that after adsorption there is no appreciable shift from CO-(111) to CO-(110) sites. This implies that adsorbed CO molecules either have a very long residence time or a diffusion coefficient much lower than previously thought. After CO electrooxidation, the sites released included both terrace (111) and step (110) orientations. For surface hydrogen site blocking > 40%, lateral interactions might play a role in preferential CO site occupancy.
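
For illustration only, the null model implied by "no preferential site occupancy" can be stated as a one-line simulation: under random adsorption, the CO step occupancy should simply track the step-site fraction of the surface. The step-site fraction used below is hypothetical, not a measured value for Pt(776) or Pt(554).

```python
import numpy as np

# Null model of random adsorption: each CO molecule lands on a step
# site with probability equal to the surface's step-site fraction.
# The fraction here is hypothetical, chosen only for illustration.
rng = np.random.default_rng(8)
step_fraction = 0.15          # assumed fraction of (110) step sites
n_molecules = 10_000

on_step = rng.random(n_molecules) < step_fraction
print("fraction of CO on steps:", on_step.mean())
print("expected under random adsorption:", step_fraction)
# A measured step occupancy well above this baseline would instead
# indicate preferential adsorption or fast diffusion to steps.
```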

Relevance:

80.00%

Publisher:

Abstract:

In this work a new method of separate estimation for the ARMA spectral model is proposed, based on the modified Yule-Walker equations and on the least squares method. The method consists of performing AR filtering on the generated random process to obtain a new random estimate, from which the ARMA model parameters are re-estimated, yielding a better spectrum estimate. Some numerical examples are presented to illustrate the performance of the proposed method, which is evaluated by the relative error and the average variation coefficient.
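
A minimal sketch of the separated-estimation idea under stated assumptions: the AR part of a simulated ARMA(2, 1) process is estimated from the modified Yule-Walker equations, and the data are then AR-filtered so the remaining, approximately MA, process can be re-estimated, e.g. by least squares. The simulated parameters are illustrative, and this shows the generic scheme rather than the paper's exact algorithm.

```python
import numpy as np
from scipy.signal import lfilter

def autocov(x, max_lag):
    """Biased sample autocovariances r(0..max_lag)."""
    x = x - x.mean()
    n = len(x)
    return np.array([x[:n - k] @ x[k:] / n for k in range(max_lag + 1)])

def modified_yule_walker(x, p, q):
    """AR(p) part of an ARMA(p, q) model from the modified Yule-Walker
    equations, which use autocovariances at lags q+1 .. q+p."""
    r = autocov(x, p + q)
    R = np.array([[r[abs(q + i - j)] for j in range(1, p + 1)]
                  for i in range(1, p + 1)])
    rhs = np.array([r[q + i] for i in range(1, p + 1)])
    return np.linalg.solve(R, rhs)

# Simulated ARMA(2, 1): x_t = 1.2 x_{t-1} - 0.4 x_{t-2} + e_t + 0.5 e_{t-1}
rng = np.random.default_rng(5)
a_poly, b_poly = [1.0, -1.2, 0.4], [1.0, 0.5]
x = lfilter(b_poly, a_poly, rng.normal(size=20_000))

a_hat = modified_yule_walker(x, p=2, q=1)
print("true AR: [1.2, -0.4]   estimated:", a_hat.round(3))

# Separated-estimation step: AR-filtering the data with the estimated
# polynomial leaves an approximately MA(1) process, from which the
# remaining parameters can be re-estimated (e.g. by least squares).
residual_ma = lfilter(np.concatenate(([1.0], -a_hat)), [1.0], x)
```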

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this work is to present a frequency-domain model demonstrating the operation of an electromagnetic arrangement for controlling the injection of zero-sequence currents into the electrical system. Considering the diversity of the sequential distribution of the harmonic components of a current, the proposed device can be used to mitigate zero-sequence components. This device, here called an electromagnetic suppressor, consists of an electromagnetic blocker and an electromagnetic filter whose joint operation can provide paths of high and low impedance that can be conveniently adjusted in search of a desired performance. This study presents physical considerations, mathematical modeling, and computer simulations that demonstrate the viability of this application as an alternative in the design of filtering systems. The performance analysis is based on the frequency response of harmonic transmittances. The efficacy of this technique in direct actions to maximize the harmonic mitigation process is demonstrated.
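
A generic frequency-domain sketch of the high/low impedance idea, not the paper's actual arrangement: the zero-sequence current divides between a series "blocker" branch and a shunt "filter" branch, and the transmittance toward the source can be evaluated per harmonic. All component values are hypothetical.

```python
import numpy as np

# Zero-sequence current divider: series "blocker" branch (R-L) toward
# the source, shunt "filter" branch (series R-L-C) tuned near the 5th
# harmonic. All component values are hypothetical.
f = 60.0 * np.arange(1, 26)            # harmonic orders 1..25 at 60 Hz
w = 2.0 * np.pi * f

R_block, L_block = 0.5, 0.2                  # blocker: high series impedance
R_filt, L_filt, C_filt = 0.1, 2e-3, 150e-6   # filter: low near resonance

Z_block = R_block + 1j * w * L_block
Z_filt = R_filt + 1j * (w * L_filt - 1.0 / (w * C_filt))

# Transmittance per harmonic: the share of injected zero-sequence
# current that still reaches the source side; values near zero mean
# the harmonic is diverted into the low-impedance filter path.
T = np.abs(Z_filt / (Z_filt + Z_block))
for h in (1, 3, 5, 7, 9):
    print(f"harmonic {h:2d}: |T| = {T[h - 1]:.3f}")
```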

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)


Relevance:

80.00%

Publisher:

Abstract:

This paper addresses the investment decisions of 373 large Brazilian firms from 1997 to 2004 in the presence of financial constraints, using panel data. A Bayesian econometric model with ridge regression was used to address multicollinearity problems among the variables in the model. Prior distributions are assumed for the parameters, classifying the model into random or fixed effects. We used a Bayesian approach to estimate the parameters, considering normal and Student t distributions for the errors, and assumed that the initial values of the lagged dependent variable are not fixed but generated by a random process. The recursive predictive density criterion was used for model comparison. Twenty models were tested, and the results indicated that multicollinearity does influence the values of the estimated parameters. Controlling for capital intensity, financial constraints are found to be more important for capital-intensive firms, probably due to their lower profitability indexes, higher fixed costs, and higher degree of property diversification.
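
The ridge step can be made concrete in a small sketch: with a Gaussian prior on the coefficients, the posterior mean is the ridge estimator, which stabilizes the estimates when regressors are nearly collinear. The regressor names, the simulated data, and the penalty value below are illustrative, not the paper's specification.

```python
import numpy as np

# Bayesian ridge sketch: with y = X b + e, e ~ N(0, s^2 I) and prior
# b ~ N(0, (s^2 / lam) I), the posterior mean of b is the ridge
# estimator (X'X + lam I)^{-1} X'y. Data are simulated with
# deliberately collinear regressors; names and values are illustrative.
rng = np.random.default_rng(6)
n = 373
cashflow = rng.normal(size=n)
sales = 0.95 * cashflow + 0.05 * rng.normal(size=n)   # near-collinear
X = np.column_stack([np.ones(n), cashflow, sales])
y = X @ np.array([0.5, 1.0, 1.0]) + rng.normal(scale=0.5, size=n)

lam = 5.0
I = np.eye(X.shape[1])
b_ols = np.linalg.solve(X.T @ X, X.T @ y)              # unstable here
b_ridge = np.linalg.solve(X.T @ X + lam * I, X.T @ y)  # posterior mean

print("OLS:  ", b_ols.round(3))
print("ridge:", b_ridge.round(3))
```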

Relevance:

80.00%

Publisher:

Abstract:

The brain's structural and functional systems, protein-protein interaction networks, and gene networks are examples of biological systems that share features of complex networks, such as highly connected nodes, modularity, and small-world topology. Recent studies indicate that some pathologies present topological network alterations relative to norms seen in the general population. Therefore, methods to discriminate the processes that generate the different classes of networks (e.g., normal and disease) might be crucial for the diagnosis, prognosis, and treatment of disease. It is known that several topological properties of a network (graph) can be described by the distribution of the spectrum of its adjacency matrix. Moreover, large networks generated by the same random process have the same spectrum distribution, allowing us to use it as a "fingerprint". Based on this relationship, we introduce the entropy of a graph spectrum to measure the "uncertainty" of a random graph, and the Kullback-Leibler and Jensen-Shannon divergences between graph spectra to compare networks. We also introduce general methods for model selection and network model parameter estimation, as well as a statistical procedure to test the nullity of the divergence between two classes of complex networks. Finally, we demonstrate the usefulness of the proposed methods by applying them to (1) protein-protein interaction networks of different species and (2) networks derived from children diagnosed with Attention Deficit Hyperactivity Disorder (ADHD) and typically developing children. We conclude that scale-free networks best describe all the protein-protein interactions. We also show that our proposed measures succeeded in identifying topological changes in the network where other commonly used measures (number of edges, clustering coefficient, average path length) failed.
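
A minimal sketch of the spectral comparison under stated assumptions: Erdős-Rényi graphs stand in for the biological networks, and a simple histogram replaces whatever density estimator the paper uses. It computes the adjacency spectrum, forms its distribution, then evaluates the spectral entropy and the Jensen-Shannon divergence between graphs.

```python
import numpy as np

def spectral_density(A, bins):
    """Histogram estimate of the adjacency-spectrum distribution."""
    eig = np.linalg.eigvalsh(A)
    dens, _ = np.histogram(eig, bins=bins)
    return dens / dens.sum()

def entropy(p, eps=1e-12):
    return -np.sum(p * np.log(p + eps))

def js_divergence(p, q, eps=1e-12):
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(7)

def er_adjacency(n, p):
    """Erdos-Renyi G(n, p) adjacency matrix."""
    A = np.triu((rng.random((n, n)) < p).astype(float), k=1)
    return A + A.T

# Two graphs from the same random process vs one from a different one:
# spectra from the same process should be close to indistinguishable.
bins = np.linspace(-20.0, 50.0, 71)
p1 = spectral_density(er_adjacency(300, 0.05), bins)
p2 = spectral_density(er_adjacency(300, 0.05), bins)
q1 = spectral_density(er_adjacency(300, 0.15), bins)

print("spectral entropy:", entropy(p1).round(3))
print("JS, same process:     ", js_divergence(p1, p2).round(5))
print("JS, different density:", js_divergence(p1, q1).round(5))
```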