992 results for Sequential Gaussian Simulation
Abstract:
In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and their delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form of the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.
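The abstract gives no implementation detail, so the following is only a rough single-antenna sketch of the Rao-Blackwellized idea: particles handle the discrete, time-varying set of active taps, while a per-particle Kalman filter tracks the conditionally linear-Gaussian tap amplitudes observed through pilots. The AR(1) amplitude dynamics, the birth/death proposal and the noise levels are assumptions for illustration, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

MAX_TAPS = 4        # hypothetical upper bound on the channel length
N_PART   = 200      # particles over the discrete "active taps" hypothesis
RHO, Q, R = 0.99, 0.01, 0.1   # assumed AR(1) amplitude dynamics and noise levels

def rbpf_step(particles, y, pilot):
    """One RBPF update: each particle carries an active-tap mask plus a
    Kalman mean/covariance for the amplitudes of the taps."""
    weights = np.empty(N_PART)
    for i, p in enumerate(particles):
        # occasionally propose birth/death of a tap (unknown, time-varying order)
        if rng.random() < 0.05:
            k = rng.integers(MAX_TAPS)
            p["mask"][k] = ~p["mask"][k]
        # Kalman prediction for the amplitude vector
        m = RHO * p["mean"]
        P = RHO**2 * p["cov"] + Q * np.eye(MAX_TAPS)
        # observation: y = pilot . (mask * amplitudes) + noise
        H = (pilot * p["mask"]).reshape(1, -1)
        S = H @ P @ H.T + R
        K = P @ H.T / S
        innov = y - (H @ m)[0]
        p["mean"] = m + (K * innov).ravel()
        p["cov"]  = P - K @ H @ P
        weights[i] = np.exp(-0.5 * innov**2 / S[0, 0]) / np.sqrt(S[0, 0])
    weights /= weights.sum()
    idx = rng.choice(N_PART, N_PART, p=weights)      # resample
    return [dict(mask=particles[j]["mask"].copy(),
                 mean=particles[j]["mean"].copy(),
                 cov=particles[j]["cov"].copy()) for j in idx]

particles = [dict(mask=(np.arange(MAX_TAPS) == 0),
                  mean=np.zeros(MAX_TAPS),
                  cov=np.eye(MAX_TAPS)) for _ in range(N_PART)]
for t in range(50):                                   # toy pilot sequence
    pilot = rng.choice([-1.0, 1.0], MAX_TAPS)
    y = pilot[0] * 1.0 + np.sqrt(R) * rng.normal()    # toy channel: a single unit tap
    particles = rbpf_step(particles, y, pilot)
print("posterior mean number of active taps:",
      np.mean([p["mask"].sum() for p in particles]))
```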
Abstract:
We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
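The abstract does not spell out the combination scheme; a common construction in this literature, and a reasonable stand-in here, is an exponentially weighted average of the simple predictors under squared loss. The three experts below (zero, last value, running mean) and the learning rate are illustrative assumptions.

```python
import numpy as np

def combined_predictor(x, eta=0.5):
    """Exponentially weighted average of simple experts under squared loss.
    Experts (illustrative): predict 0, predict the previous value, predict the running mean."""
    n = len(x)
    losses = np.zeros(3)                  # cumulative squared loss of each expert
    preds = np.zeros(n)
    for t in range(n):
        history = x[:t]
        experts = np.array([
            0.0,
            history[-1] if t > 0 else 0.0,
            history.mean() if t > 0 else 0.0,
        ])
        w = np.exp(-eta * losses)
        w /= w.sum()
        preds[t] = w @ experts            # aggregated prediction for x[t]
        losses += (experts - x[t]) ** 2   # update expert losses once x[t] is revealed
    return preds

rng = np.random.default_rng(1)
x = 1.0 + 0.01 * rng.standard_normal(1000).cumsum()   # a toy slowly varying sequence
p = combined_predictor(x)
print("average squared prediction error:", np.mean((p - x) ** 2))
```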
Abstract:
Spanning avalanches in the 3D Gaussian Random Field Ising Model (3D-GRFIM) with metastable dynamics at T=0 have been studied. Statistical analysis of the field values for which avalanches occur has enabled a Finite-Size Scaling (FSS) study of the avalanche density to be performed. Furthermore, a direct measurement of the geometrical properties of the avalanches has confirmed an earlier hypothesis that several types of spanning avalanches with two different fractal dimensions coexist at the critical point. We finally compare the phase diagram of the 3D-GRFIM with metastable dynamics with the same model in equilibrium at T=0.
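For readers unfamiliar with the dynamics referred to above, a minimal sketch of zero-temperature metastable (single-spin-flip) dynamics for a Gaussian random-field Ising model on a small cubic lattice is given below: the external field is raised adiabatically and each triggered avalanche is propagated and its size recorded. The lattice size and disorder strength are arbitrary illustrative choices, far from the scales needed for the paper's finite-size scaling study.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)
L, SIGMA = 8, 2.5                       # small lattice and disorder strength (illustrative)
h = rng.normal(0.0, SIGMA, (L, L, L))   # Gaussian random fields
s = -np.ones((L, L, L), dtype=int)      # start fully magnetized down

def local_field(idx, H):
    i, j, k = idx
    nn = (s[(i+1) % L, j, k] + s[(i-1) % L, j, k] +
          s[i, (j+1) % L, k] + s[i, (j-1) % L, k] +
          s[i, j, (k+1) % L] + s[i, j, (k-1) % L])
    return nn + h[idx] + H              # ferromagnetic coupling J = 1

avalanche_sizes, trigger_fields = [], []
while (s < 0).any():
    # adiabatic ramp: raise H just enough to destabilize one down spin
    down = [tuple(idx) for idx in np.argwhere(s < 0)]
    needed = [-local_field(idx, 0.0) for idx in down]
    H = min(needed)
    queue = deque([down[int(np.argmin(needed))]])
    size = 0
    while queue:                        # propagate the avalanche at fixed H
        idx = queue.popleft()
        if s[idx] > 0 or local_field(idx, H) < 0:
            continue
        s[idx] = 1
        size += 1
        i, j, k = idx
        for nb in [((i+1) % L, j, k), ((i-1) % L, j, k), (i, (j+1) % L, k),
                   (i, (j-1) % L, k), (i, j, (k+1) % L), (i, j, (k-1) % L)]:
            if s[nb] < 0:
                queue.append(nb)
    avalanche_sizes.append(size)
    trigger_fields.append(H)

print("number of avalanches:", len(avalanche_sizes))
largest = int(np.argmax(avalanche_sizes))
print("largest avalanche: size", avalanche_sizes[largest],
      "triggered at H =", round(trigger_fields[largest], 3))
```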
Abstract:
Simulation has traditionally been used for analyzing the behavior of complex real-world problems. Even though only some features of the problems are considered, simulation time tends to become quite high even for common simulation problems. Parallel and distributed simulation is a viable technique for accelerating such simulations. The success of parallel simulation depends heavily on the combination of the simulation application, the algorithm and the environment. In this thesis a conservative parallel simulation algorithm is applied to the simulation of a cellular network application in a distributed workstation environment. The thesis presents a distributed simulation environment, Diworse, which is based on the use of networked workstations. A distributed environment is considered especially hard for conservative simulation algorithms due to the high cost of communication. In this thesis, however, the distributed environment is shown to be a viable alternative if the amount of communication is kept reasonable. Novel ideas of multiple message simulation and channel reduction enable efficient use of this environment for the simulation of a cellular network application. The distribution of the simulation is based on a modification of the well-known Chandy-Misra deadlock avoidance algorithm with null messages. The basic Chandy-Misra algorithm is modified by using the null message cancellation and multiple message simulation techniques. The modifications reduce the number of null messages and the time required for their execution, thus reducing the overall simulation time. The null message cancellation technique reduces the processing time of null messages, as an arriving null message cancels other unprocessed null messages. The multiple message simulation forms groups of messages: several messages are simulated before the newly created messages are released. If the message population in the simulation is sufficient, no additional delay is caused by this operation. A new technique for taking the simulation application into account is also presented. Performance is improved by establishing a neighborhood for the simulation elements. The neighborhood concept is based on a channel reduction technique, where the properties of the application exclusively determine which connections are necessary when a certain accuracy of the simulation results is required. Distributed simulation is also analyzed in order to find out the effect of the different elements of the implemented simulation environment. This analysis is performed using critical path analysis, which allows a lower bound on the simulation time to be determined. In this thesis critical times are computed for sequential and parallel traces. The analysis based on sequential traces reveals the parallel properties of the application, whereas the analysis based on parallel traces reveals the properties of the environment and the distribution.
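The null message cancellation idea described above is easy to sketch in isolation: when a null message arrives on an input channel, any older, not-yet-processed null messages in that channel's queue carry only weaker timing guarantees and can be discarded. The sketch below is a minimal illustration of that rule; the Msg/InputChannel structures are assumptions for the example, not Diworse internals.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Msg:
    timestamp: float
    null: bool          # True for a null (lookahead-only) message
    payload: object = None

class InputChannel:
    """One input channel of a conservative logical process."""
    def __init__(self):
        self.queue = deque()

    def receive(self, msg: Msg):
        # Null message cancellation: a newer null message carries a larger
        # lower-bound timestamp, so pending unprocessed null messages are obsolete.
        if msg.null:
            while self.queue and self.queue[-1].null:
                self.queue.pop()
        self.queue.append(msg)

    def channel_clock(self) -> float:
        # the channel clock is the timestamp of the most recent message received
        return self.queue[-1].timestamp if self.queue else 0.0

ch = InputChannel()
ch.receive(Msg(1.0, null=True))
ch.receive(Msg(2.0, null=True))                       # cancels the 1.0 null message
ch.receive(Msg(3.0, null=False, payload="call setup"))
print([(m.timestamp, m.null) for m in ch.queue])      # [(2.0, True), (3.0, False)]
```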
Abstract:
A wide range of tests for heteroskedasticity have been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies that seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods, yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values, for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves intractable null distribution problems, in particular those raised by the sup-type and combined test statistics as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation. The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable and (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
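The Monte Carlo test technique itself is simple to sketch: simulate the test statistic N-1 times under the null hypothesis and compute the p-value from the rank of the observed statistic among the simulated ones. The Goldfeld-Quandt statistic and the tiny design below are illustrative choices, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(3)

def goldfeld_quandt(y, X, split):
    """Ratio of residual variances from separate OLS fits on the two sub-samples."""
    def s2(ys, Xs):
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        r = ys - Xs @ beta
        return r @ r / (len(ys) - Xs.shape[1])
    return s2(y[split:], X[split:]) / s2(y[:split], X[:split])

def monte_carlo_pvalue(stat_obs, simulate_stat, N=99):
    """Monte Carlo test p-value: rank of the observed statistic among N-1
    statistics simulated under the null hypothesis."""
    sims = np.array([simulate_stat() for _ in range(N - 1)])
    return (1 + np.sum(sims >= stat_obs)) / N

# toy homoskedastic regression (the null hypothesis holds)
n, split = 60, 30
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])
y = X @ np.array([1.0, 0.5]) + rng.standard_normal(n)

obs = goldfeld_quandt(y, X, split)
# the GQ statistic does not depend on the regression coefficients or the error
# scale, so the null simulation can use pure standard-normal errors
p = monte_carlo_pvalue(obs, lambda: goldfeld_quandt(rng.standard_normal(n), X, split))
print("Goldfeld-Quandt statistic:", round(obs, 3), " MC p-value:", p)
```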
Abstract:
The first two articles build procedures to simulate vectors of univariate states and to estimate parameters in nonlinear and non-Gaussian state space models. We propose state space specifications that offer more flexibility for modeling dynamic relationships with latent variables. Our procedures are extensions of the HESSIAN method of McCausland [2012]. They use approximations of the posterior density of the vector of states that make it possible to simulate directly from the posterior distribution of the state vector, to simulate the state vector in one block and jointly with the vector of parameters, and to avoid data augmentation. These properties allow us to build posterior simulators with very high relative numerical efficiency. Being generic, they open a new path in nonlinear and non-Gaussian state space analysis with limited input from the modeler. The third article is an essay in commodity market analysis. Private firms coexist with farmers' cooperatives in commodity markets in sub-Saharan African countries. Private firms hold the largest market share, while some theoretical models predict their disappearance once confronted with farmers' cooperatives. Moreover, some empirical studies and observations link cooperative incidence in a region with interpersonal trust, and thus with farmers' trust toward cooperatives. We propose a model that supports these empirical facts: a model in which the cooperative's reputation is a leading factor determining the market equilibrium of price competition between a cooperative and a private firm.
Abstract:
In this thesis, (outlier-)robust estimators and tests for the unknown parameter of a continuous density function are developed using likelihood depth, introduced by Mizera and Müller (2004). The resulting procedures are then applied to three different distributions. For one-dimensional parameters, the likelihood depth of a parameter in a data set is computed as the minimum of the fraction of the data for which the derivative of the log-likelihood function with respect to the parameter is non-negative and the fraction for which this derivative is non-positive. The parameter for which both fractions are equal therefore has the greatest depth. This parameter is initially chosen as the estimator, since likelihood depth is intended to measure how well a parameter fits the data set. Asymptotically, the parameter with the greatest depth is the one for which the probability that the derivative of the log-likelihood function with respect to the parameter is non-negative for an observation equals one half. If this is not the case for the underlying parameter, the estimator based on likelihood depth is biased. This thesis shows how this bias can be corrected so that the corrected estimators are consistent. To develop tests for the parameter, the simplicial likelihood depth introduced by Müller (2005), which is a U-statistic, is used. It turns out that for the same distributions for which likelihood depth yields biased estimators, the simplicial likelihood depth is an unbiased U-statistic. In particular, its asymptotic distribution is therefore known and tests for various hypotheses can be formulated. For some hypotheses, however, the shift in depth leads to poor power of the corresponding test. Corrected tests are therefore introduced, together with conditions under which they are consistent. The thesis consists of two parts. The first part presents the general theory of the estimators and tests and proves their consistency. In the second part the theory is applied to three different distributions: the Weibull distribution, the Gaussian copula and the Gumbel copula. This shows how the procedures of the first part can be used to derive (robust) consistent estimators and tests for the unknown parameter of the distribution. Overall, robust estimators and tests based on likelihood depth can be found for all three distributions. On uncontaminated data, existing standard methods are partly superior, but the advantage of the new methods becomes apparent on contaminated data and data with outliers.
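The depth definition above translates directly into code. As a minimal illustration (using an exponential model rather than the three distributions treated in the thesis), the sketch below computes the likelihood depth of a rate parameter over a grid and takes the deepest value as the uncorrected estimate; for this model the maximum-depth estimate is 1/median and is biased, which is exactly the kind of bias the thesis corrects.

```python
import numpy as np

def likelihood_depth(lam, x):
    """Likelihood depth of the rate parameter lam for an exponential model:
    min of the fractions of observations with non-negative and non-positive
    score  d/d(lam) log f(x; lam) = 1/lam - x."""
    score = 1.0 / lam - x
    return min(np.mean(score >= 0), np.mean(score <= 0))

rng = np.random.default_rng(4)
x = rng.exponential(scale=2.0, size=200)        # true rate = 0.5

grid = np.linspace(0.05, 2.0, 400)
depths = np.array([likelihood_depth(l, x) for l in grid])
lam_depth = grid[depths.argmax()]

print("maximum-depth estimate of the rate:", round(lam_depth, 3))
print("(for this model it is close to 1/median =", round(1 / np.median(x), 3),
      "and hence biased for the true rate 0.5)")
```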
Abstract:
Satellite-based rainfall monitoring is widely used for climatological studies because of its full global coverage, but it is also of great importance for operational purposes, especially in areas such as Africa where there is a lack of ground-based rainfall data. Satellite rainfall estimates have enormous potential benefits as input to hydrological and agricultural models because of their real-time availability, low cost and full spatial coverage. One issue that needs to be addressed is the uncertainty on these estimates. This is particularly important in assessing the likely errors on the output from non-linear models (rainfall-runoff or crop yield) which make use of the rainfall estimates, aggregated over an area, as input. Correct assessment of the uncertainty on the rainfall is non-trivial as it must take account of:
• the difference in spatial support of the satellite information and the independent data used for calibration
• uncertainties on the independent calibration data
• the non-Gaussian distribution of rainfall amount
• the spatial intermittency of rainfall
• the spatial correlation of the rainfall field
This paper describes a method for estimating the uncertainty on satellite-based rainfall values taking account of these factors. The method involves, firstly, a stochastic calibration which completely describes the probability of rainfall occurrence and the pdf of rainfall amount for a given satellite value, and, secondly, the generation of an ensemble of rainfall fields based on the stochastic calibration but with the correct spatial correlation structure within each ensemble member. This is achieved by the use of geostatistical sequential simulation. The ensemble generated in this way may be used to estimate uncertainty at larger spatial scales. A case study of daily rainfall monitoring in the Gambia, west Africa, for the purpose of crop yield forecasting is presented to illustrate the method.
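Since geostatistical sequential simulation is the engine of the ensemble generation step, a minimal one-dimensional sketch of sequential Gaussian simulation may be useful: nodes are visited in random order and each is drawn from the Gaussian distribution conditioned, via simple kriging, on the original data and on previously simulated nodes. The exponential covariance, grid and conditioning values below are illustrative assumptions; the paper additionally models rainfall occurrence and a non-Gaussian amount distribution.

```python
import numpy as np

rng = np.random.default_rng(5)

def cov(h, sill=1.0, corr_len=10.0):
    """Exponential covariance model (illustrative choice)."""
    return sill * np.exp(-np.abs(h) / corr_len)

def sgs_1d(grid_x, data_x, data_z):
    """Sequential Gaussian simulation of standard-normal values on a 1-D grid,
    conditioned on (data_x, data_z), using simple kriging with mean 0."""
    known_x, known_z = list(data_x), list(data_z)
    sim = np.empty(len(grid_x))
    for i in rng.permutation(len(grid_x)):        # random visiting order
        x0 = grid_x[i]
        xs = np.array(known_x)
        C = cov(xs[:, None] - xs[None, :]) + 1e-9 * np.eye(len(xs))
        c0 = cov(xs - x0)
        w = np.linalg.solve(C, c0)                # simple kriging weights
        mean = w @ np.array(known_z)
        var = max(cov(0.0) - w @ c0, 1e-12)       # kriging variance
        sim[i] = rng.normal(mean, np.sqrt(var))   # draw from the conditional Gaussian
        known_x.append(x0)                        # the simulated node becomes data
        known_z.append(sim[i])
    return sim

grid = np.arange(0.0, 50.0, 1.0)
ensemble = np.array([sgs_1d(grid, data_x=[5.3, 30.7], data_z=[1.2, -0.4])
                     for _ in range(20)])         # 20 equally probable realizations
print("spread near the datum at x=5.3:", round(ensemble[:, 5].std(), 2))
print("spread far from any datum (x=20):", round(ensemble[:, 20].std(), 2))
```

Averaging each realization over an area and looking at the spread across the ensemble gives the uncertainty at that larger spatial scale, which is the use the paper makes of the ensemble.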
Abstract:
In a sequential clinical trial, accrual of data on patients often continues after the stopping criterion for the study has been met. This is termed “overrunning.” Overrunning occurs mainly when the primary response from each patient is measured after some extended observation period. The objective of this article is to compare two methods of allowing for overrunning. In particular, simulation studies are reported that assess the two procedures in terms of how well they maintain the intended type I error rate. The effect on power resulting from the incorporation of “overrunning data” using the two procedures is evaluated.
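The abstract does not describe the two overrunning procedures themselves, so the sketch below only illustrates the shape of such a simulation study: under the null hypothesis it runs a two-look sequential z-test (Pocock-type boundary 2.178) many times and estimates the type I error rate of two naive ways of handling the data that accrue after early stopping. The design, boundary and sample sizes are illustrative assumptions, not the article's procedures.

```python
import numpy as np

rng = np.random.default_rng(6)

def one_trial(n_look1=50, n_total=100, n_overrun=20, z_bound=2.178):
    """Two-look sequential z-test under H0 (mean 0, variance 1).  Returns a pair:
    (rejection ignoring the overrun, rejection after naively adding the overrun data)."""
    x = rng.standard_normal(n_total + n_overrun)
    z1 = x[:n_look1].sum() / np.sqrt(n_look1)
    if abs(z1) >= z_bound:
        # stopped early; n_overrun further responses arrive after the decision
        n_used = n_look1 + n_overrun
        z_upd = x[:n_used].sum() / np.sqrt(n_used)
        return True, abs(z_upd) >= z_bound
    z2 = x[:n_total].sum() / np.sqrt(n_total)
    return abs(z2) >= z_bound, abs(z2) >= z_bound

res = np.array([one_trial() for _ in range(20000)])
print("estimated type I error, overrun ignored:         ", res[:, 0].mean())
print("estimated type I error, overrun naively included:", res[:, 1].mean())
```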
Abstract:
In this paper the implementation of dynamic data reconciliation techniques for sequential modular models is described. The paper is organised as follows. First, an introduction to dynamic data reconciliation is given. Then, the online use of rigorous process models is introduced. The sequential modular approach to dynamic simulation is briefly discussed followed by a short review of the extended Kalman filter. The second section describes how the modules are implemented. A simulation case study and its results are also presented.
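As a minimal, self-contained illustration of the extended Kalman filter step reviewed in the paper, the sketch below reconciles noisy level measurements against a single draining-tank "module" (a stand-in for one unit of a sequential modular flowsheet). The tank model, noise variances and step size are assumptions for the example only.

```python
import numpy as np

rng = np.random.default_rng(7)

# draining-tank "module": dh/dt = (F_in - k*sqrt(h)) / A   (illustrative model)
A, K_V, F_IN, DT = 2.0, 0.5, 0.8, 0.1
Q, R = 1e-4, 0.05**2             # assumed process and measurement noise variances

def f(h):                        # one Euler step of the module
    return h + DT * (F_IN - K_V * np.sqrt(max(h, 1e-6))) / A

def f_jac(h):                    # derivative of f with respect to the state
    return 1.0 - DT * K_V / (2.0 * A * np.sqrt(max(h, 1e-6)))

# simulate the "true" tank and noisy level measurements
h_true, meas = 1.0, []
for _ in range(200):
    h_true = f(h_true) + rng.normal(0, np.sqrt(Q))
    meas.append(h_true + rng.normal(0, np.sqrt(R)))

# extended Kalman filter: predict with the module, correct with the measurement
h_hat, P = 0.5, 1.0
reconciled = []
for z in meas:
    h_pred = f(h_hat)                       # prediction through the module
    P_pred = f_jac(h_hat) ** 2 * P + Q
    K = P_pred / (P_pred + R)               # correction (H = 1: level measured directly)
    h_hat = h_pred + K * (z - h_pred)
    P = (1.0 - K) * P_pred
    reconciled.append(h_hat)

print("last measured level:  ", round(meas[-1], 4))
print("last reconciled level:", round(reconciled[-1], 4))
```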
Abstract:
The electronic properties of liquid hydrogen fluoride (HF) were investigated by carrying out sequential quantum mechanics/Born-Oppenheimer molecular dynamics calculations. The structure of the liquid is in good agreement with recent experimental information. Emphasis was placed on the analysis of polarisation effects, dynamic polarisability and electronic excitations in liquid HF. Our results indicate an increase, in the liquid phase, of the dipole moment (~0.5 D) and of the isotropic polarisability (5%) relative to their gas-phase values. Our best estimate for the first vertical excitation energy in liquid HF indicates a blue-shift of 0.4 +/- 0.2 eV relative to that of the gas-phase monomer (10.4 eV).
Abstract:
We present experimental evidence of the existence of cell variability in terms of threshold light dose for cultured Hep G2 cells (a liver cancer cell line). Using a theoretical model to describe the effects caused by successive photodynamic therapy (PDT) sessions, and based on the consequences of a partial response, we introduce the concept of a threshold dose distribution within a tumor. The experimental model consists of a stack of flasks and simulates successive layers of a tissue exposed to PDT. The results indicate that cells from the same culture can respond in different ways to similar PDT-induced damage. Moreover, the consequence is a partial killing of the cells submitted to PDT, and the death fraction decreased at each in vitro PDT session. To demonstrate the occurrence of cell population modification as a response to PDT, we constructed a simple theoretical model and assumed that the threshold dose distribution for a cell population of a tumor is represented by a modified Gaussian distribution.
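The partial-response argument can be made concrete numerically: if each cell carries its own threshold dose drawn from a truncated (here simply zero-truncated) Gaussian, killing the cells whose threshold is exceeded shifts the surviving population toward higher thresholds, so the death fraction falls at each successive session. The parameter values, the dose and the per-session dose fluctuation below are illustrative assumptions, not values fitted to the Hep G2 data.

```python
import numpy as np

rng = np.random.default_rng(8)

# per-cell threshold light doses: Gaussian truncated at zero
# (an illustrative stand-in for the "modified Gaussian" of the abstract)
thresholds = rng.normal(loc=10.0, scale=3.0, size=100_000)
thresholds = thresholds[thresholds > 0]

DOSE = 10.0                     # nominal light dose per PDT session (illustrative)
for session in range(1, 5):
    # effective dose per cell fluctuates between sessions (assumed optical heterogeneity)
    delivered = DOSE + rng.normal(0.0, 1.5, size=len(thresholds))
    killed = delivered >= thresholds
    print(f"session {session}: death fraction = {killed.mean():.3f}, "
          f"survivors = {(~killed).sum()}")
    thresholds = thresholds[~killed]    # surviving cells carry their higher thresholds forward
```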
Abstract:
This work presents the use of sequential injection analysis (SIA) and response surface methodology as tools for the optimization of Fenton-based processes. Alizarin red S dye (C.I. 58005) was used as a model compound for the anthraquinone family, whose pigments are widely used in the coatings industry. The following factors were considered: the [H(2)O(2)]:[Alizarin] and [H(2)O(2)]:[FeSO(4)] ratios and pH. The SIA system was designed to add reagents to the reactor and to perform on-line sampling of the reaction medium, sending the samples to a flow-through spectrophotometer for monitoring the color reduction of the dye. The proposed system fed the statistical program with degradation data for fast construction of response surface plots. After optimization, 99.7% of the dye was degraded and the TOC content was reduced to 35% of the original value. Low reagent consumption and high sampling throughput were the remarkable features of the SIA system.
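The response-surface step can be sketched independently of the flow system: fit a full second-order model in the three coded factors and locate its stationary point. The code below is a minimal sketch of that fitting step; the design and the degradation values are synthetic placeholders generated only to exercise the code, not the paper's measurements.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(9)

def quad_design(X):
    """Full second-order model matrix: 1, x_i, x_i^2, x_i*x_j (coded factors)."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# coded levels of the three factors on a small three-level grid (illustrative design)
levels = np.array([-1.0, 0.0, 1.0])
X = np.array([[a, b, c] for a in levels for b in levels for c in levels])

# synthetic degradation response, used ONLY to demonstrate the fitting machinery
y = (90 - 8 * (X[:, 0] - 0.4) ** 2 - 6 * (X[:, 1] + 0.2) ** 2
       - 10 * (X[:, 2] - 0.1) ** 2 + rng.normal(0, 1, len(X)))

beta, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)

# stationary point of the fitted quadratic y = b0 + b'x + x'Bx:  solve (2B) x = -b
b = beta[1:4]                                   # linear coefficients
B = np.diag(beta[4:7]).astype(float)            # pure quadratic terms on the diagonal
for (i, j), c in zip(combinations(range(3), 2), beta[7:]):
    B[i, j] = B[j, i] = c / 2.0                 # interaction terms split symmetrically
x_opt = np.linalg.solve(-2.0 * B, b)
print("fitted optimum (coded factor levels):", np.round(x_opt, 2))
```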
Abstract:
In this article we use factor models to describe a certain class of covariance structure for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions to the models exposed here in the prediction area. This discussion leads to open issues on real time implementation and natural model comparisons.
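To fix ideas, a minimal simulation of the kind of structure described, a single common factor whose log-variance follows a stationary AR(1) and factor loadings that drift as a slow random walk, is sketched below. The dimensions and parameter values are arbitrary illustrative choices, not estimates from exchange-rate data.

```python
import numpy as np

rng = np.random.default_rng(10)

T, P = 500, 3                       # time points and number of series (illustrative)
PHI, MU, S_ETA = 0.97, -1.0, 0.15   # AR(1) log-volatility of the common factor
S_LOAD, S_IDIO = 0.01, 0.05         # random-walk loading and idiosyncratic noise scales

h = np.empty(T)                     # factor log-variance
h[0] = MU
B = np.empty((T, P))                # time-varying factor loadings
B[0] = np.array([1.0, 0.8, 0.6])
y = np.empty((T, P))                # observed returns

for t in range(T):
    if t > 0:
        h[t] = MU + PHI * (h[t - 1] - MU) + S_ETA * rng.standard_normal()
        B[t] = B[t - 1] + S_LOAD * rng.standard_normal(P)   # loadings drift slowly
    f_t = np.exp(h[t] / 2) * rng.standard_normal()          # common SV factor
    y[t] = B[t] * f_t + S_IDIO * rng.standard_normal(P)     # factor structure + noise

# the common factor shows up as strong cross-series correlation
print("sample correlation matrix of the simulated returns:")
print(np.round(np.corrcoef(y.T), 2))
```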