966 results for Random process
Abstract:
Standard algorithms in tracking and other state-space models assume identical and synchronous sampling rates for the state and measurement processes. However, real trajectories of objects are typically characterized by prolonged smooth sections, with sharp, but infrequent, changes. Thus, a more parsimonious representation of a target trajectory may be obtained by direct modeling of maneuver times in the state process, independently from the observation times. This is achieved by assuming the state arrival times to follow a random process, typically specified as Markovian, so that state points may be allocated along the trajectory according to the degree of variation observed. The resulting variable dimension state inference problem is solved by developing an efficient variable rate particle filtering algorithm to recursively update the posterior distribution of the state sequence as new data becomes available. The methodology is quite general and can be applied across many models where dynamic model uncertainty occurs on-line. Specific models are proposed for the dynamics of a moving object under internal forcing, expressed in terms of the intrinsic dynamics of the object. The performance of the algorithms with these dynamical models is demonstrated on several challenging maneuvering target tracking problems in clutter. © 2006 IEEE.
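The variable rate idea described above can be illustrated with a toy particle filter in which each particle carries its own random maneuver (state-arrival) times. The sketch below is not the authors' algorithm: it assumes a 1-D constant-velocity motion between maneuvers, Poisson-distributed maneuver times, a Gaussian position measurement, and illustrative parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch (not the authors' algorithm): a bootstrap-style
# variable rate particle filter for a 1-D target.  Each particle carries
# its own random maneuver times; between maneuvers the velocity is
# constant, and at each maneuver a new velocity is drawn.

N_PART  = 500     # number of particles
DT      = 1.0     # observation interval
RATE    = 0.05    # assumed Poisson rate of maneuvers (state arrivals)
SIGMA_V = 2.0     # std of velocity change at a maneuver
SIGMA_Y = 1.0     # measurement noise std

def propagate(pos, vel, dt):
    """Advance each particle over one observation interval,
    inserting maneuvers at Poisson-distributed times."""
    t = np.zeros_like(pos)
    while True:
        tau = rng.exponential(1.0 / RATE, size=pos.shape)  # time to next maneuver
        step = np.minimum(tau, dt - t)
        pos += vel * step
        t += step
        done = t >= dt
        if done.all():
            return pos, vel
        # particles that maneuvered before dt draw a new velocity
        vel = np.where(done, vel, vel + rng.normal(0.0, SIGMA_V, size=vel.shape))

def filter_step(pos, vel, y):
    pos, vel = propagate(pos.copy(), vel.copy(), DT)
    # weight by the Gaussian measurement likelihood
    w = np.exp(-0.5 * ((y - pos) / SIGMA_Y) ** 2)
    w /= w.sum()
    idx = rng.choice(N_PART, size=N_PART, p=w)   # multinomial resampling
    return pos[idx], vel[idx], np.sum(w * pos)

# toy run on a few synthetic measurements
pos = rng.normal(0, 1, N_PART)
vel = rng.normal(0, 1, N_PART)
for y in [0.5, 1.7, 2.9, 4.4, 5.0]:
    pos, vel, estimate = filter_step(pos, vel, y)
    print(f"posterior mean position ≈ {estimate:.2f}")
```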
Abstract:
The propagation of waves in an extended, irregular medium is studied under the "quasi-optics" and the "Markov random process" approximations. Under these assumptions, a Fokker-Planck equation satisfied by the characteristic functional of the random wave field is derived. A complete set of the moment equations with different transverse coordinates and different wavenumbers is then obtained from the characteristic functional. The derivation does not require Gaussian statistics of the random medium and the result can be applied to the time-dependent problem. We then solve the moment equations for the phase correlation function, angular broadening, temporal pulse smearing, intensity correlation function, and the probability distribution of the random waves. The necessary and sufficient conditions for strong scintillation are also given.
We also consider the problem of diffraction of waves by a random, phase-changing screen. The intensity correlation function is solved in the whole Fresnel diffraction region and the temporal pulse broadening function is derived rigorously from the wave equation.
The method of smooth perturbations is applied to interplanetary scintillations. We formulate and calculate the effects of the solar-wind velocity fluctuations on the observed intensity power spectrum and on the ratio of the observed "pattern" velocity and the true velocity of the solar wind in the three-dimensional spherical model. The r.m.s. solar-wind velocity fluctuations are found to be ~200 km/sec in the region about 20 solar radii from the Sun.
We then interpret the observed interstellar scintillation data using the theories derived under the Markov approximation, which remain valid for strong scintillation. We find that the Kolmogorov power-law spectrum with an outer scale of 10 to 100 pc fits the scintillation data and that the ambient averaged electron density in the interstellar medium is about 0.025 cm⁻³. It is also found that there exists a region of strong electron density fluctuation with thickness ~10 pc and mean electron density ~7 cm⁻³ between the pulsar PSR 0833-45 and the Earth.
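For reference, the Kolmogorov power-law spectrum referred to above is conventionally written for the three-dimensional electron-density fluctuation spectrum as below; this is the standard textbook form with inner scale l₀ and outer scale L₀, not an expression taken from this work.

```latex
% Standard 3-D Kolmogorov form of the electron-density fluctuation
% spectrum (spectral index 11/3), valid between the outer and inner scales:
\Phi_{N_e}(q) \;=\; C_N^{2}\, q^{-11/3},
\qquad \frac{2\pi}{L_0} \;\ll\; q \;\ll\; \frac{2\pi}{l_0}.
```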
Abstract:
This paper examines the determinants of unemployment duration in a competing risks framework with two destination states: inactivity and employment. The innovation is the recognition of defective risks. A polynomial hazard function is used to differentiate between two possible sources of infinite durations. The first is produced by a random process of unlucky draws, the second by workers rejecting a destination state. The evidence favors the mover-stayer model over the search model. Refinement of the former approach, using a more flexible baseline hazard function, produces a robust and more convincing explanation for positive and zero transition rates out of unemployment.
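For readers unfamiliar with the term, the notion of a defective risk can be made precise with standard competing-risks algebra; the expressions below are generic, not the paper's specific polynomial hazard parameterization.

```latex
% Cause-specific hazards: h_e(t) for exit to employment, h_i(t) for exit
% to inactivity.  The probability of still being unemployed at time t is
S(t) \;=\; \exp\!\left(-\int_0^{t}\bigl[h_e(u)+h_i(u)\bigr]\,du\right).
% A risk is defective when its integrated hazard remains finite,
\int_0^{\infty} h_e(u)\,du \;<\; \infty
\quad\Longrightarrow\quad
\Pr(\text{never exit to employment}) \;>\; 0,
% so a positive fraction of spells never ends in that destination,
% producing infinite durations even without a run of unlucky draws.
```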
Abstract:
The ability to exchange keys between users is vital in any wireless-based security system. A key generation technique which exploits the randomness of the wireless channel is a promising alternative to existing key distribution techniques, e.g., public key cryptography. In this paper, a secure key generation scheme based on the subcarriers' channel responses in orthogonal frequency-division multiplexing (OFDM) systems is proposed. We first implement a time-variant multipath channel with its channel impulse response modelled as a wide sense stationary (WSS) uncorrelated scattering random process and demonstrate that each subcarrier's channel response is also a WSS random process. We then define the X% coherence time as the time required to produce an X% correlation coefficient in the autocorrelation function (ACF) of each channel tap, and find that when all the channel taps have the same Doppler power spectrum, all subcarriers' channel responses have the same ACF as the channel taps. The subcarrier's channel response is then sampled every X% coherence time and quantized into key bits. The randomness of all the key sequences is tested using the National Institute of Standards and Technology (NIST) statistical test suite, and the results indicate that the commonly used sampling interval of 50% coherence time cannot guarantee the randomness of the key sequence.
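A minimal numerical sketch of the sampling-and-quantization idea described above follows. The sum-of-sinusoids Doppler model, the 50% coherence-time threshold and the single-bit median quantizer are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch: sample a WSS fading-channel gain once per
# "X% coherence time" and quantize the samples into key bits.

FD = 10.0     # maximum Doppler frequency [Hz] (assumed)
FS = 1000.0   # channel sampling rate [Hz]
N  = 20000    # number of channel samples

def rayleigh_fading(n, fd, fs, n_sin=64):
    """Sum-of-sinusoids approximation of a Rayleigh-fading tap."""
    t = np.arange(n) / fs
    phases = rng.uniform(0, 2 * np.pi, (2, n_sin))
    dopplers = fd * np.cos(rng.uniform(0, 2 * np.pi, n_sin))
    i = np.cos(2 * np.pi * np.outer(t, dopplers) + phases[0]).sum(axis=1)
    q = np.cos(2 * np.pi * np.outer(t, dopplers) + phases[1]).sum(axis=1)
    return (i + 1j * q) / np.sqrt(n_sin)

def coherence_time(h, fs, level=0.5):
    """Smallest lag at which the normalized ACF drops below `level`."""
    x = np.abs(h) - np.abs(h).mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0]
    lag = np.argmax(acf < level)
    return lag / fs

h = rayleigh_fading(N, FD, FS)
tc = coherence_time(h, FS, level=0.5)               # "50% coherence time"
step = max(1, int(round(tc * FS)))

samples = np.abs(h)[::step]                         # sample every coherence time
bits = (samples > np.median(samples)).astype(int)   # 1-bit median quantizer

print(f"50% coherence time ≈ {tc*1e3:.1f} ms, {bits.size} key bits")
print("first bits:", "".join(map(str, bits[:32])))
```

In a real system the resulting bit sequences would then be passed to the NIST statistical test suite, as the abstract describes.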
Abstract:
This paper presents a key generation system derived from the channel responses of individual subcarriers in orthogonal frequency-division multiplexing (OFDM) systems. Practical aspects of the security were investigated by implementing our key generation scheme on a wireless open-access research platform (WARP), which enables us to obtain channel estimates of individual OFDM subcarriers, a feature not currently available in most commercial wireless interface cards. The channel response of an individual OFDM subcarrier is usually a wide sense stationary random process, which allows us to find the optimal probing period and maximize the key generation rate. The implementation requires cross-layer design as it involves interaction between the physical and MAC layers. We have experimentally verified the feasibility and principles of key generation, and also evaluated the performance of our system in terms of randomness, key generation rate and key disagreement rate, which shows that OFDM subcarriers' channel responses are valid for key generation.
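The two evaluation metrics named above can be computed as shown in this small sketch; the function names and the toy bit-flip error rate are illustrative, not taken from the WARP implementation.

```python
import numpy as np

# Illustrative metrics for a generated key: key disagreement rate (KDR)
# between the two parties' bit sequences, and key generation rate (KGR)
# in bits per second of channel probing.

def key_disagreement_rate(bits_a, bits_b):
    """Fraction of positions where the two extracted keys differ."""
    bits_a, bits_b = np.asarray(bits_a), np.asarray(bits_b)
    return np.mean(bits_a != bits_b)

def key_generation_rate(n_bits, probe_duration_s):
    """Extracted key bits per second of channel probing."""
    return n_bits / probe_duration_s

# toy example: Bob's key is Alice's key with ~3% of bits flipped
rng = np.random.default_rng(2)
alice = rng.integers(0, 2, 128)
bob = alice ^ (rng.random(128) < 0.03)
print("KDR:", key_disagreement_rate(alice, bob))
print("KGR:", key_generation_rate(alice.size, 10.0), "bit/s")
```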
Abstract:
The main objective of this thesis is the development of robust variogram estimators with good efficiency properties. The variogram is a fundamental tool in geostatistics, since it models the dependence structure of the process under study and decisively influences the prediction of new observations. Traditional variogram estimation methods are not robust, i.e., they are sensitive to small deviations from the model assumptions. This issue matters because the properties that motivate the use of such methods may not hold in a neighbourhood of the assumed model. The thesis begins with a review of the main concepts of geostatistics and of traditional variogram estimation, followed by a summary of fundamental notions of statistical robustness. A new variogram estimation method, named the multiple variograms estimator, is then presented. The method consists of four steps, in which robustness and efficiency criteria alternately prevail: from the initial sample, pointwise variogram estimates are computed in a robust way; based on these pointwise estimates, the model parameters are estimated by least squares; the two previous steps are repeated, producing a set of multiple estimates of the variogram function; finally, the final variogram estimate is defined as the median of the estimates obtained previously. In this way an estimator is obtained with good robustness properties and good efficiency for Gaussian processes. The research also revealed that, when discrete estimates are used in the first stage of variogram estimation, there are situations in which the identifiability of the parameters is not guaranteed. For the most common variogram models, it was possible to establish mildly restrictive conditions that guarantee a unique solution of the variogram estimation problem. Variogram estimation always assumes stationarity of the mean of the process. Since objective procedures to assess this condition are important, this work proposes a test to validate that hypothesis. The test statistic is an MM-estimator whose distribution is unknown under the dependence conditions assumed; to approximate it, a version of the bootstrap suitable for dependent observations from spatial processes is presented. Finally, the multiple variograms estimator is evaluated in terms of its practical application. The thesis includes a simulation study that confirms the established properties. In all cases analysed, the multiple variograms estimator produced better results than the usual alternatives, both under the assumed distribution and under contaminated distributions.
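The four-step scheme can be sketched in code. The version below is only a rough illustration of the idea, not the thesis's estimators: it uses the median of squared increments as a stand-in robust pointwise estimate, an exponential variogram model fitted by least squares, repetition over random sub-transects, and a final pointwise median of the fitted curves.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

def exp_model(h, sill, range_par):
    """Exponential variogram model with zero nugget (assumed)."""
    return sill * (1.0 - np.exp(-h / range_par))

def pointwise_variogram(z, lags):
    """Robustified pointwise estimates: median of squared increments
    (a stand-in for a proper robust estimator)."""
    return np.array([0.5 * np.median((z[h:] - z[:-h]) ** 2) for h in lags])

def multiple_variogram(z, lags, n_rep=50, block=300):
    fits = []
    for _ in range(n_rep):
        start = rng.integers(0, len(z) - block)        # random sub-transect
        gamma = pointwise_variogram(z[start:start + block], lags)
        p, _ = curve_fit(exp_model, lags, gamma,
                         p0=[gamma.max(), 10.0], maxfev=5000)
        fits.append(exp_model(np.asarray(lags, float), *p))
    return np.median(fits, axis=0)   # final estimate: pointwise median of fits

# toy 1-D Gaussian process on a transect (AR(1) as a quick surrogate)
z = np.zeros(500)
for i in range(1, 500):
    z[i] = 0.9 * z[i - 1] + rng.normal()
lags = np.arange(1, 30)
print(multiple_variogram(z, lags)[:5])
```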
Abstract:
BACKGROUND: The synthesis of published research in systematic reviews is essential when providing evidence to inform clinical and health policy decision-making. However, the validity of systematic reviews is threatened if journal publications represent a biased selection of all studies that have been conducted (dissemination bias). To investigate the extent of dissemination bias we conducted a systematic review that determined the proportion of studies published as peer-reviewed journal articles and investigated factors associated with full publication in cohorts of studies (i) approved by research ethics committees (RECs) or (ii) included in trial registries. METHODS AND FINDINGS: Four bibliographic databases were searched for methodological research projects (MRPs) without limitations for publication year, language or study location. The searches were supplemented by handsearching the references of included MRPs. We estimated the proportion of studies published using prediction intervals (PI) and a random effects meta-analysis. Pooled odds ratios (OR) were used to express associations between study characteristics and journal publication. Seventeen MRPs (23 publications) evaluated cohorts of studies approved by RECs; the proportion of published studies had a PI between 22% and 72%, and the weighted pooled proportion when combining estimates would be 46.2% (95% CI 40.2%-52.4%, I² = 94.4%). Twenty-two MRPs (22 publications) evaluated cohorts of studies included in trial registries; the PI of the proportion published ranged from 13% to 90%, and the weighted pooled proportion would be 54.2% (95% CI 42.0%-65.9%, I² = 98.9%). REC-approved studies with statistically significant results (compared with those without statistically significant results) were more likely to be published (pooled OR 2.8; 95% CI 2.2-3.5). Phase-III trials were also more likely to be published than phase-II trials (pooled OR 2.0; 95% CI 1.6-2.5). The probability of publication within two years after study completion ranged from 7% to 30%. CONCLUSIONS: A substantial proportion of the studies approved by RECs or included in trial registries remains unpublished. Due to the large heterogeneity, a prediction of the publication probability for a future study is very uncertain. Non-publication of research is not a random process; for example, it is associated with the direction of study findings. Our findings suggest that the dissemination of research findings is biased.
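The kind of random-effects pooling quoted above can be sketched with DerSimonian-Laird weighting of log-odds of publication. The study counts below are invented purely for illustration; they are not the MRPs analysed in the review.

```python
import numpy as np

# Sketch of DerSimonian–Laird random-effects pooling of proportions on the
# log-odds scale.  The (published, total) counts are hypothetical.

studies = [(30, 80), (55, 120), (12, 60), (70, 110)]

logit, var = [], []
for k, n in studies:
    p = k / n
    logit.append(np.log(p / (1 - p)))
    var.append(1.0 / k + 1.0 / (n - k))          # variance of the log-odds
logit, var = np.array(logit), np.array(var)

# DerSimonian–Laird between-study variance tau^2
w = 1.0 / var
q = np.sum(w * (logit - np.sum(w * logit) / w.sum()) ** 2)
df = len(studies) - 1
c = w.sum() - np.sum(w ** 2) / w.sum()
tau2 = max(0.0, (q - df) / c)

w_star = 1.0 / (var + tau2)                      # random-effects weights
pooled_logit = np.sum(w_star * logit) / w_star.sum()
pooled_p = 1.0 / (1.0 + np.exp(-pooled_logit))
i2 = max(0.0, (q - df) / q) * 100                # heterogeneity statistic

print(f"pooled proportion ≈ {pooled_p:.3f}, I² ≈ {i2:.0f}%")
```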
Abstract:
In zebrafish, germ cells are responsible for transmitting the genetic information from one generation to the next. During the first cleavages of zebrafish embryonic development, a specialized part of the cytoplasm known as germ plasm is responsible for committing four blastomeres to become the progenitors of all germ cells in the forming embryo. Much is known about how the germ plasm is spatially distributed in early stages of primordial germ cell development, a process described to be dependent on microtubules and actin. However, little is known about how the material is inherited after it reorganizes into a perinuclear location, or how its symmetrical distribution is regulated to ensure proper inheritance of the material by both daughter cells. It is also not clear whether there is a controlled mechanism that regulates the number of granules inherited by the daughter cells, or whether it is a random process. We describe the distribution of germ plasm material from 4 hpf to 24 hpf in zebrafish primordial germ cells using Vasa protein as a marker. Vasa-positive material appears to be conglomerated into 3 to 4 large spherical structures at 4 hpf. As development progresses, these large structures become smaller perinuclear granules that reach a total number of approximately 30 at 24 hpf. We investigated how this transformation occurs and how the minus-end-directed microtubule motor protein Dynein plays a role in this process. Additionally, we describe specific colocalization of microtubules and perinuclear granules during interphase and, more interestingly, during all stages of cell division. We show that the distribution of granules appears to be regulated: during cell division, daughter cells inherit an equal number of granules. We propose that, owing to the permanent colocalization of microtubular structures with germinal granules during interphase and cell division, a coordinated mechanism between these structures may ensure proper distribution of the material among daughter cells. Furthermore, we show that exposure to the microtubule-depolymerizing drug nocodazole leads to disassembly of the germ cell nuclear lamin matrix, chromatin condensation, and fusion of granules into a large conglomerate, revealing the dependence of granule distribution on microtubules and proper nuclear structure.
Abstract:
The time-of-detection method for aural avian point counts is a new method of estimating abundance, allowing for uncertain probability of detection. The method has been specifically designed to allow for variation in singing rates of birds. It involves dividing the time interval of the point count into several subintervals and recording the detection history of the subintervals when each bird sings. The method can be viewed as generating data equivalent to closed capture–recapture information. The method is different from the distance and multiple-observer methods in that it is not required that all the birds sing during the point count. As this method is new and there is some concern as to how well individual birds can be followed, we carried out a field test of the method using simulated known populations of singing birds, using a laptop computer to send signals to audio stations distributed around a point. The system mimics actual aural avian point counts, but also allows us to know the size and spatial distribution of the populations we are sampling. Fifty 8-min point counts (broken into four 2-min intervals) using eight species of birds were simulated. Singing rate of an individual bird of a species was simulated following a Markovian process (singing bouts followed by periods of silence), which we felt was more realistic than a truly random process. The main emphasis of our paper is to compare results from species singing at (high and low) homogenous rates per interval with those singing at (high and low) heterogeneous rates. Population size was estimated accurately for the species simulated, with a high homogeneous probability of singing. Populations of simulated species with lower but homogeneous singing probabilities were somewhat underestimated. Populations of species simulated with heterogeneous singing probabilities were substantially underestimated. Underestimation was caused by both the very low detection probabilities of all distant individuals and by individuals with low singing rates also having very low detection probabilities.
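The simulation design described above can be sketched directly: each bird sings according to a two-state Markov chain across the four intervals, and the detection history is the record of intervals in which it sang and was audible. The transition probabilities, number of birds and detection-by-distance rule below are assumptions for the demo, not the paper's simulation settings.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy time-of-detection simulation: an 8-min point count split into four
# 2-min intervals; singing follows a two-state Markov chain (bout/silence).

N_BIRDS   = 40
INTERVALS = 4
P_START   = 0.4     # P(silent -> singing) per interval
P_STOP    = 0.3     # P(singing -> silent) per interval
MAX_DIST  = 100.0   # detection radius in metres (assumed)

dist = rng.uniform(0, 150, N_BIRDS)          # distance of each simulated bird
audible = dist < MAX_DIST

histories = np.zeros((N_BIRDS, INTERVALS), dtype=int)
singing = rng.random(N_BIRDS) < 0.5          # initial state
for t in range(INTERVALS):
    histories[:, t] = singing & audible      # detected if singing and audible
    flip = np.where(singing,
                    rng.random(N_BIRDS) < P_STOP,
                    rng.random(N_BIRDS) < P_START)
    singing = np.where(flip, ~singing, singing)

detected = histories.any(axis=1)
print(f"true N = {N_BIRDS}, birds detected at least once = {detected.sum()}")
print("example detection histories:\n", histories[:5])
```

The detection histories generated this way are the closed capture-recapture style data to which the abundance estimators are then applied.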
Abstract:
The experimental variogram computed in the usual way by the method of moments and the Haar wavelet transform are similar in that they filter data and yield informative summaries that may be interpreted. The variogram filters out constant values; wavelets can filter variation at several spatial scales and thereby provide a richer repertoire for analysis and demand no assumptions other than that of finite variance. This paper compares the two functions, identifying that part of the Haar wavelet transform that gives it its advantages. It goes on to show that the generalized variogram of order k=1, 2, and 3 filters linear, quadratic, and cubic polynomials from the data, respectively, which correspond with more complex wavelets in Daubechies's family. The additional filter coefficients of the latter can reveal features of the data that are not evident in its usual form. Three examples in which data recorded at regular intervals on transects are analyzed illustrate the extended form of the variogram. The apparent periodicity of gilgais in Australia seems to be accentuated as filter coefficients are added, but otherwise the analysis provides no new insight. Analysis of hyperspectral data with a strong linear trend showed that the wavelet-based variograms filtered it out. Adding filter coefficients in the analysis of the topsoil across the Jurassic scarplands of England changed the upper bound of the variogram; it then resembled the within-class variogram computed by the method of moments. To elucidate these results, we simulated several series of data to represent a random process with values fluctuating about a mean, data with long-range linear trend, data with local trend, and data with stepped transitions. The results suggest that the wavelet variogram can filter out the effects of long-range trend, but not local trend, and of transitions from one class to another, as across boundaries.
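For reference, the usual method-of-moments (Matheron) estimator of the variogram compared above is:

```latex
% Method-of-moments estimator of the variogram at lag h, where N(h) is
% the number of data pairs separated by h:
\hat{\gamma}(h) \;=\; \frac{1}{2\,N(h)} \sum_{i=1}^{N(h)}
  \bigl[\,z(x_i + h) - z(x_i)\,\bigr]^{2}.
% The Haar wavelet coefficient at the same scale is, up to a constant,
% the difference between the means of two adjacent blocks of data, which
% is why the two summaries behave similarly on regular transects.
```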
Abstract:
With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33 year reanalysis data set (MERRA, from NASA-GMAO), is used to construct an hourly time series of nationally-aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33 year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data as well as the underlying model are freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
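The counting of prolonged low-generation events and the Poisson-like behaviour mentioned above can be illustrated as follows. The synthetic AR(1) capacity-factor series, the 10% threshold and the 24-hour minimum duration are stand-in assumptions, not the MERRA-derived GB data or the paper's definitions.

```python
import numpy as np

rng = np.random.default_rng(5)

HOURS_PER_YEAR = 8760
N_YEARS   = 33
THRESHOLD = 0.10    # capacity factor below which generation is "low"
MIN_HOURS = 24      # event = at least 24 consecutive low hours

def synthetic_capacity_factor(n):
    """Persistent AR(1) surrogate for an hourly national capacity factor."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = 0.995 * x[i - 1] + rng.normal(0, 0.03)
    return np.clip(0.3 + x, 0, 1)

def count_prolonged_low(cf, threshold, min_hours):
    """Number of runs of consecutive low-generation hours >= min_hours."""
    low = (cf < threshold).astype(np.int8)
    edges = np.diff(np.concatenate(([0], low, [0])))
    starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
    return int(np.sum((ends - starts) >= min_hours))

counts = np.array([count_prolonged_low(synthetic_capacity_factor(HOURS_PER_YEAR),
                                       THRESHOLD, MIN_HOURS)
                   for _ in range(N_YEARS)])
# For a Poisson process, the variance of yearly counts equals the mean.
print(f"mean events/year = {counts.mean():.2f}, variance = {counts.var():.2f}")
```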
Abstract:
A discrete-time random process is described, which can generate bursty sequences of events. A Bernoulli process, where the probability of an event occurring at time t is given by a fixed probability x, is modified to include a memory effect whereby the event probability is increased proportionally to the number of events that occurred within a given amount of time preceding t. For small values of x the interevent time distribution follows a power law with exponent −2−x. We consider a dynamic network where each node forms and breaks connections according to this process. The value of x for each node depends on the fitness distribution, ρ(x), from which it is drawn; we find exact solutions for the expectation of the degree distribution for a variety of possible fitness distributions, and for the cases where the memory effect is and is not present. This work can potentially lead to methods to uncover hidden fitness distributions from fast-changing temporal network data, such as online social communications and fMRI scans.
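The process can be simulated directly. In the sketch below the reinforcement rule x·(1 + count of events in the memory window) and the window length are illustrative assumptions standing in for the paper's exact definition; the final lines crudely check the heavy tail of the interevent-time distribution.

```python
import numpy as np

rng = np.random.default_rng(6)

def bursty_events(x, window, n_steps):
    """Bernoulli process with base probability x, boosted in proportion
    to the number of events in the preceding memory window."""
    events = np.zeros(n_steps, dtype=bool)
    for t in range(n_steps):
        recent = events[max(0, t - window):t].sum()
        p = min(1.0, x * (1.0 + recent))
        events[t] = rng.random() < p
    return events

def interevent_times(events):
    return np.diff(np.flatnonzero(events))

ev = bursty_events(x=0.01, window=50, n_steps=200_000)
gaps = interevent_times(ev)

# crude check of the heavy tail: log-log slope of the interevent-time histogram
hist, edges = np.histogram(gaps, bins=np.logspace(0, 3, 20), density=True)
mask = hist > 0
centers = np.sqrt(edges[:-1] * edges[1:])[mask]
slope = np.polyfit(np.log(centers), np.log(hist[mask]), 1)[0]
print(f"{ev.sum()} events, fitted interevent tail exponent ≈ {slope:.2f}")
```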
Abstract:
We report time-evolution studies of low-coverage CO adsorption (surface hydrogen site blocking < 40%) and oxidative stripping on stepped Pt(776) and Pt(554) surfaces. It was observed that there is no preferential site occupancy for CO adsorption on step or terrace sites. It is proposed that CO adsorption onto these surfaces is a random process, and that after CO adsorption there is no appreciable shift from CO-(111) to CO-(110) sites. This implies that after adsorption, CO molecules either have a very long residence time, or the diffusion coefficient is much lower than previously thought. After CO electrooxidation the sites released included both terrace (111) and step (110) orientations. For surface hydrogen site blocking > 40%, lateral interactions might play a role in the preferential CO site occupancy. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
In this work a new method of separated estimation of the ARMA spectral model is proposed, based on the modified Yule-Walker equations and on the least squares method. The proposed method performs an AR filtering of the generated random process, yielding a new estimate that is used to re-estimate the ARMA model parameters and thus provide a better spectrum estimate. Numerical examples are presented to illustrate the performance of the proposed method, which is evaluated by the relative error and the average variation coefficient.
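The first two steps of such a scheme, estimating the AR coefficients from the modified Yule-Walker equations and then filtering out the estimated AR dynamics, can be sketched as below. The model orders, the synthetic ARMA(2,1) data and the simple residual check are illustrative assumptions, not the paper's experiments.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(7)

def sample_autocov(x, max_lag):
    """Biased sample autocovariances r(0..max_lag)."""
    x = x - x.mean()
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def modified_yule_walker(x, p, q):
    """Solve r(q+i) = sum_j a_j r(q+i-j), i = 1..p, for the AR coefficients."""
    r = sample_autocov(x, p + q)
    R = np.array([[r[abs(q + i - j)] for j in range(1, p + 1)]
                  for i in range(1, p + 1)])
    rhs = np.array([r[q + i] for i in range(1, p + 1)])
    return np.linalg.solve(R, rhs)

# synthetic ARMA(2,1): x_t = 1.2 x_{t-1} - 0.5 x_{t-2} + e_t + 0.4 e_{t-1}
x = lfilter([1.0, 0.4], [1.0, -1.2, 0.5], rng.normal(size=20_000))

a_hat = modified_yule_walker(x, p=2, q=1)
print("estimated AR coefficients:", np.round(a_hat, 3))   # ≈ [1.2, -0.5]

# AR filtering: remove the estimated AR dynamics, leaving (roughly) the MA
# component, whose autocovariances would feed the remaining estimation step.
residual = lfilter(np.concatenate(([1.0], -a_hat)), [1.0], x)
print("residual autocovariances:", np.round(sample_autocov(residual, 2), 3))
```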