886 results for Discrete Gaussian Sampling


Relevance:

20.00%

Publisher:

Abstract:

We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise of the hydrograph, an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan, and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach in the efficient real-time measurement of stormwater systems.
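
A minimal sketch, not the authors' firmware, of the kind of forecast-driven sampling controller the abstract describes. The thresholds, the 0.1 weight on forecast rain, and the interval bounds are assumptions for illustration only.

```python
import math

def next_sampling_interval(flow_estimate, flow_prediction,
                           forecast_rain_mm, base_interval_s=900,
                           min_interval_s=60):
    """Shorten the sampling interval when an abrupt change looks likely.

    flow_estimate    -- current local state estimate of flow
    flow_prediction  -- embedded-model prediction for the next step
    forecast_rain_mm -- hourly precipitation forecast pulled over IP
    """
    # Expected relative change in the signal between now and the next step.
    expected_change = abs(flow_prediction - flow_estimate) / max(flow_estimate, 1e-6)

    # Heuristic risk score: a large predicted change or heavy forecast rain
    # both raise the chance of missing the rising limb of the hydrograph.
    risk = expected_change + 0.1 * forecast_rain_mm

    # Map risk onto an interval between min_interval_s and base_interval_s.
    interval = base_interval_s * math.exp(-risk)
    return max(min_interval_s, min(base_interval_s, int(interval)))

# Example: a node expecting a sharp rise with 8 mm/h of forecast rain
print(next_sampling_interval(flow_estimate=0.2, flow_prediction=0.5,
                             forecast_rain_mm=8.0))
```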

Relevance:

20.00%

Publisher:

Abstract:

This work emphasizes the inclusion of uncertainties in the evaluation of structural behavior, aiming at a better representation of the system's characteristics and a quantification of the significance of these uncertainties in design. Comparisons are made between existing classical reliability analysis techniques, such as FORM, direct Monte Carlo simulation (MC) and Monte Carlo simulation with adaptive importance sampling (MCIS), and the approximate methods of the Response Surface (RS) and Artificial Neural Networks (ANN). Whenever possible, the comparisons highlight the advantages and drawbacks of each technique in problems of increasing complexity. Formulations ranging from explicit limit-state functions to implicit formulations with spatial variability of loading and material properties, including stochastic fields, are analyzed. In particular, the reliability analysis of reinforced concrete structures including the effect of the spatial variability of their properties is addressed. To that end, a finite element model for the representation of reinforced concrete is proposed, incorporating the main characteristics observed in this material. A model was also developed for the generation of multidimensional non-Gaussian stochastic fields for the material properties, independent of the finite element mesh, and techniques were implemented to accelerate the structural evaluations required by any of the techniques employed. For reliability assessment via the Response Surface technique, the algorithm developed by Rajashekhar et al. (1993) was implemented. For the Artificial Neural Network approach, codes were developed to simulate multilayer perceptron and radial basis function networks, which were then incorporated into the reliability evaluation algorithm developed by Shao et al. (1997). In general, the simulation techniques performed rather poorly on the more complex problems, while the first-order FORM technique and the approximate Response Surface and Artificial Neural Network techniques stood out, albeit with accuracy impaired by the approximations involved.
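
A minimal sketch, unrelated to the thesis' code, of the kind of comparison it describes: failure-probability estimation for a toy explicit limit-state function by direct Monte Carlo versus importance sampling. The limit-state function and the assumed design point (1.5, 1.5) are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x1, x2):
    """Toy explicit limit-state function: failure when g < 0."""
    return 3.0 - x1 - x2          # standard-normal inputs assumed

n = 100_000

# Direct Monte Carlo
x = rng.standard_normal((n, 2))
pf_mc = np.mean(g(x[:, 0], x[:, 1]) < 0)

# Importance sampling centered on an assumed design point (1.5, 1.5)
mu = np.array([1.5, 1.5])
y = rng.standard_normal((n, 2)) + mu
# Likelihood ratio between the true density and the shifted sampling density
w = np.exp(-0.5 * np.sum(y**2, axis=1) + 0.5 * np.sum((y - mu)**2, axis=1))
pf_is = np.mean((g(y[:, 0], y[:, 1]) < 0) * w)

print(f"direct MC: {pf_mc:.5f}  importance sampling: {pf_is:.5f}")
```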

Relevance:

20.00%

Publisher:

Abstract:

Two experiments and a sampling survey were analyzed in the context of spatial data. The experiments were laid out in randomized complete blocks: experiment one (EXP 1) evaluated eight white clover cultivars, studying the variables Total Dry Matter (MST) and Grass Dry Matter (MSGRAM), while experiment two (EXP 2) evaluated 20 cultivars of forage species, studying the variable Establishment Percentage (%IMPL). The variables were analyzed within the framework of mixed models, with spatial variability modeled through exponential, spherical and Gaussian semivariograms. Average reductions of 19% and 14% in the Coefficient of Variation (CV) of the cultivar means were observed, along with average reductions of 24.6% and 33.3% in the standard errors of the proposed orthogonal contrasts for MST and MSGRAM. In the sampling survey, the spatial association of Aristida laevis (Nees) Kunth, Paspalum notatum Fl and Desmodium incanum DC was studied, sampled along a fixed transect of contiguous quadrats at four sampling unit sizes (0.1x0.1 m; 0.1x0.3 m; 0.1x0.5 m; and 0.1x1.0 m). For Aristida laevis (Nees) Kunth and Paspalum notatum Fl, the semivariograms fitted well at the smaller sampling unit sizes, with the fit deteriorating as the sampling unit grew larger. Desmodium incanum DC showed the opposite behavior, with semivariograms fitting better at the larger sampling unit sizes.
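
A minimal sketch, on synthetic data rather than the trial data, of the two ingredients the abstract relies on: an empirical semivariogram along a one-dimensional transect and the exponential semivariogram model gamma(h) = nugget + sill * (1 - exp(-h / range)). The nugget, sill and range values below are assumptions for illustration.

```python
import numpy as np

def empirical_semivariogram(values, max_lag):
    """Classical estimator: half the mean squared difference at each lag
    along a 1-D transect of equally spaced observations."""
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = values[h:] - values[:-h]
        gammas.append(0.5 * np.mean(diffs**2))
    return np.array(gammas)

def exponential_model(h, nugget, sill, range_):
    """Exponential semivariogram model."""
    return nugget + sill * (1.0 - np.exp(-h / range_))

# Synthetic transect: smooth spatial signal plus noise (illustration only)
rng = np.random.default_rng(1)
x = np.arange(200)
values = np.sin(x / 15.0) + 0.3 * rng.standard_normal(200)

lags = np.arange(1, 21)
gamma_hat = empirical_semivariogram(values, 20)
gamma_fit = exponential_model(lags, nugget=0.09, sill=0.5, range_=10.0)
print(np.round(gamma_hat[:5], 3), np.round(gamma_fit[:5], 3))
```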

Relevance:

20.00%

Publisher:

Abstract:

Convex combinations of long-memory estimates using the same data observed at different sampling rates can decrease the standard deviation of the estimates, at the cost of inducing a slight bias. The convex combination of such estimates requires a preliminary correction for the bias observed at lower sampling rates, reported by Souza and Smith (2002). Through Monte Carlo simulations, we investigate the bias and the standard deviation of the combined estimates, as well as the root mean squared error (RMSE), which takes both into account. Comparing standard methods with their combined versions, the latter achieve a lower RMSE for the two semi-parametric estimators under study (by about 30% on average for ARFIMA(0,d,0) series).
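
A minimal sketch, with simulated estimates rather than the paper's semi-parametric estimators, of the trade-off described above: a convex combination of a full-rate estimate and a (bias-corrected) lower-rate estimate accepts a small bias in exchange for a lower RMSE. The noise levels, the residual bias of 0.01 and the equal weights are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
d_true = 0.3
n_rep = 20_000

# Simulated estimator at the full sampling rate: unbiased, noisy.
d_full = d_true + 0.10 * rng.standard_normal(n_rep)
# Simulated estimator at a lower sampling rate: small residual bias after
# correction, with independent noise, so combining the two still helps.
d_low = d_true + 0.01 + 0.10 * rng.standard_normal(n_rep)

def rmse(est):
    return np.sqrt(np.mean((est - d_true) ** 2))

w = 0.5                                 # equal convex weights for illustration
d_comb = w * d_full + (1 - w) * d_low

print(f"RMSE full rate : {rmse(d_full):.4f}")
print(f"RMSE low rate  : {rmse(d_low):.4f}")
print(f"RMSE combined  : {rmse(d_comb):.4f}")
```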

Relevance:

20.00%

Publisher:

Abstract:

This paper develops a framework to test whether discrete-valued, irregularly spaced financial transactions data follow a subordinated Markov process. For that purpose, we consider a specific optional sampling scheme in which a continuous-time Markov process is observed only when it crosses some discrete level. This framework is convenient because it accommodates not only the irregular spacing of transactions data, but also price discreteness. Further, it turns out that, under such an observation rule, the current price duration is independent of previous price durations given the current price realization. A simple nonparametric test then follows by examining whether this conditional independence property holds. Finally, we investigate whether or not bid-ask spreads follow Markov processes using transactions data from the New York Stock Exchange. The motivation lies in the fact that asymmetric-information models of market microstructure predict that the Markov property does not hold for the bid-ask spread. The results are mixed in the sense that the Markov assumption is rejected for three out of the five stocks analyzed.
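
A minimal sketch, on simulated data, of the intuition behind the testing idea (not the paper's actual nonparametric test): under the null, consecutive price durations should show no association once we condition on the current price level. The price levels, the exponential durations and the use of a within-level correlation are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5_000

# Simulated transactions: discrete price levels and exponential durations
# whose mean depends only on the current price level (the null holds here).
prices = rng.integers(0, 5, size=n)
durations = rng.exponential(scale=1.0 + prices)

# Within each price level, correlate the current duration with the previous one.
for p in range(5):
    idx = np.where(prices[1:] == p)[0] + 1
    cur, prev = durations[idx], durations[idx - 1]
    corr = np.corrcoef(cur, prev)[0, 1]
    print(f"price level {p}: corr(current, previous duration) = {corr:+.3f}")
```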

Relevance:

20.00%

Publisher:

Abstract:

The most widely used sequential stochastic simulation algorithm is sequential Gaussian simulation (sGs). In theory, the larger the number L of realizations executed, the better stochastic methods reproduce the uncertainty space of the random variable Z(u). However, L sometimes needs to be so large that the use of this technique becomes prohibitive. This thesis presents a more efficient strategy. The sequential Gaussian simulation algorithm was modified to increase its efficiency: replacing the Monte Carlo method with the Latin Hypercube Sampling (LHS) technique allows the uncertainty space of Z(u) to be characterized more quickly for a given precision. The proposed technique also guarantees that the entire theoretical uncertainty model is sampled, especially in its extreme portions.
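
A minimal sketch, outside any geostatistics code, of the core substitution the thesis describes: instead of drawing L independent uniform quantiles (Monte Carlo) to sample a local conditional Gaussian distribution, draw one stratified quantile per stratum (Latin Hypercube Sampling), which forces the tails to be visited. The conditional mean and standard deviation below are assumed values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
L = 20                      # number of realizations
mean, std = 2.0, 0.5        # assumed local conditional Gaussian distribution

# Plain Monte Carlo: L independent uniform quantiles.
u_mc = rng.uniform(size=L)

# Latin Hypercube Sampling in 1-D: one quantile per stratum [k/L, (k+1)/L).
u_lhs = (np.arange(L) + rng.uniform(size=L)) / L
rng.shuffle(u_lhs)          # random order across realizations

z_mc = norm.ppf(u_mc, loc=mean, scale=std)
z_lhs = norm.ppf(u_lhs, loc=mean, scale=std)

print("MC  quantile coverage:", np.round(np.sort(u_mc), 2))
print("LHS quantile coverage:", np.round(np.sort(u_lhs), 2))
print("MC  mean/std:", z_mc.mean().round(3), z_mc.std().round(3))
print("LHS mean/std:", z_lhs.mean().round(3), z_lhs.std().round(3))
```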

Relevance:

20.00%

Publisher:

Abstract:

We analyze simultaneous discrete public good games with incomplete information and continuous contributions. To use the terminology of Admati and Perry (1991), we consider contribution and subscription games. In the former, contributions are not refunded if the project is not completed, while in the latter they are. For the special case where provision by a single player is possible, we show the existence of an equilibrium in both contribution and subscription games where a player decides to provide the good by himself. For the case where it is not feasible for a single player to provide the good by himself, we show that any equilibrium of both games is inefficient. We also provide a sufficient condition for "contributing zero" to be the unique equilibrium of the contribution game with n players and characterize e

Relevance:

20.00%

Publisher:

Abstract:

We analyze simultaneous discrete public good games with incomplete information and continuous contributions. To use the terminology of Admati and Perry (1991), we consider contribution and subscription games. In the former, contributions are not refunded if the project is not completed, while in the latter they are. For the special case where provision by a single player is possible, we show the existence of an equilibrium in both contribution and subscription games where a player decides to provide the good by himself. For the case where it is not feasible for a single player to provide the good by himself, we show that there exist equilibria of the subscription game where each participant pays the same amount. Moreover, using the technical apparatus from Myerson (1981), we show that neither the subscription nor the contribution games admit ex-post efficient equilibria. In addition, we provide a sufficient condition for "contributing zero" to be the unique equilibrium of the contribution game with n players.

Relevance:

20.00%

Publisher:

Abstract:

Economists and policymakers have long been concerned with increasing the supply of health professionals in rural and remote areas. This work seeks to understand which factors influence physicians' choice of practice location right after completing residency. Unlike previous papers, we analyse the Brazilian misallocation and assess the particularities of developing countries. We use a discrete choice model approach with a multinomial logit specification. Two rich databases are employed, containing the location and wage of formally employed physicians as well as details of their postgraduate training. Our main findings are that amenities matter, physicians have a strong tendency to remain in the region where they completed residency, and salaries are significant in the choice of urban, but not rural, communities. We conjecture this is due to attachments built during training and to infrastructure concerns.
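
A minimal sketch, with made-up alternatives and coefficients rather than the paper's estimates, of the multinomial logit choice probabilities that underlie the discrete choice approach described above.

```python
import numpy as np

# Hypothetical alternatives: practice locations with assumed attributes.
# Columns: log wage, amenity index, 1 if the physician did residency there.
X = np.array([
    [10.5, 0.9, 1.0],   # large urban center, residency location
    [10.2, 0.6, 0.0],   # other urban center
    [10.8, 0.2, 0.0],   # rural community offering a higher wage
])

# Assumed utility coefficients (illustration only): wages matter, amenities
# matter, and there is a strong attachment to the residency location.
beta = np.array([1.0, 2.0, 1.5])

v = X @ beta                          # deterministic utilities
p = np.exp(v - v.max())               # softmax, numerically stabilized
p /= p.sum()

for name, prob in zip(["urban/residency", "urban/other", "rural/high wage"], p):
    print(f"{name:17s} choice probability = {prob:.3f}")
```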

Relevance:

20.00%

Publisher:

Abstract:

When estimating policy parameters, also known as treatment effects, the assignment-to-treatment mechanism almost always causes endogeneity and thus biases many estimates of these policy parameters. Additionally, heterogeneity in program impacts is more likely to be the norm than the exception for most social programs. In situations where these issues are present, estimation of the Marginal Treatment Effect (MTE) parameter makes use of an instrument to avoid assignment bias and simultaneously to account for effects that are heterogeneous across individuals. Although this parameter is point identified in the literature, the assumptions required for identification may be strong. Given that, we use weaker assumptions in order to partially identify the MTE, i.e. to establish a methodology for estimating MTE bounds, implementing it computationally and showing results from Monte Carlo simulations. The partial identification we perform requires the MTE to be a monotone function of the propensity score, which is a reasonable assumption in several economic examples, and the simulation results show that it is possible to obtain informative bounds even in restricted cases where point identification is lost. Additionally, in situations where the estimated bounds are not informative and traditional point identification is lost, we suggest a more generic method to point estimate the MTE using the Moore-Penrose pseudo-inverse matrix, achieving better results than traditional methods.
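
A minimal sketch, unrelated to the thesis' actual setup, of the linear-algebra step the last sentence alludes to: when a discretized MTE enters a set of linear moment conditions through a possibly rank-deficient design matrix, the Moore-Penrose pseudo-inverse yields the minimum-norm least-squares solution. The grid, the assumed monotone MTE and the random design matrix are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical setup: observed moments y are linear in a discretized MTE
# curve m over a grid of unobserved resistance values, y = A @ m + noise,
# with A possibly rank-deficient (fewer moments than grid points).
grid = np.linspace(0.05, 0.95, 10)
m_true = 1.0 - grid                    # assumed monotone MTE for illustration
A = rng.uniform(size=(6, 10))          # 6 moment conditions, 10 grid points
y = A @ m_true + 0.01 * rng.standard_normal(6)

# Moore-Penrose pseudo-inverse: minimum-norm least-squares estimate of m.
m_hat = np.linalg.pinv(A) @ y

print("true MTE :", np.round(m_true, 2))
print("estimate :", np.round(m_hat, 2))
```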

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents general methods of non-Gaussian analysis in infinite dimensional spaces. As main applications we study Poisson and compound Poisson spaces. Given a probability measure μ on a co-nuclear space, we develop an abstract theory based on generalized Appell systems, which are bi-orthogonal. We study its properties as well as the generated Gelfand triples. As an example we consider the important case of Poisson measures. The product and Wick calculus are developed in this context. We provide formulas for the change of the generalized Appell system under a transformation of the measure. The L² structure for the Poisson, compound Poisson and Gamma measures is elaborated. We exhibit the chaos decomposition using the Fock isomorphism and obtain the representation of the creation and annihilation operators. We construct two types of differential geometry on the configuration space over a differentiable manifold; these two geometries are related through the Dirichlet forms for Poisson measures as well as for their perturbations. Finally, we construct the internal geometry on the space of compound configurations, in particular the intrinsic gradient, the divergence and the Laplace-Beltrami operator. As a result, we may define the Dirichlet forms associated to a diffusion process and consequently obtain the representation of the Lie algebra of vector fields with compact support. All these results extend directly to marked Poisson spaces.

Relevance:

20.00%

Publisher:

Abstract:

The bubble crab Dotilla fenestrata forms very dense populations on the sand flats of the eastern coast of Inhaca Island, Mozambique, making it an interesting biological model to examine spatial distribution patterns and test the relative efficiency of common sampling methods. Due to its apparent ecological importance within the sandy intertidal community, understanding the factors ruling the dynamics of Dotilla populations is also a key issue. In this study, different techniques of estimating crab density are described, and the trends of spatial distribution of the different population categories are shown. The studied populations are arranged in discrete patches located at the well-drained crests of nearly parallel mega sand ripples. For a given sample size, there was an obvious gain in precision by using a stratified random sampling technique, considering discrete patches as strata, compared to the simple random design. Average density and variance differed considerably among patches since juveniles and ovigerous females were found clumped, with higher densities at the lower and upper shore levels, respectively. Burrow counting was found to be an adequate method for large-scale sampling, although it consistently underestimated actual crab density by nearly half. Regression analyses suggested that crabs smaller than 2.9 mm carapace width tend to be undetected in visual burrow counts. A visual survey of sampling plots over several patches of a large Dotilla population showed that crab density varied in an interesting oscillating pattern, apparently following the topography of the sand flat. Patches extending to the lower shore contained higher densities than those mostly covering the higher shore. Within-patch density variability also pointed to the same trend, but the density increment towards the lowest shore level varied greatly among the patches compared.
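
A minimal sketch, with a simulated patchy population rather than the Inhaca data, of why stratified random sampling by patch gains precision over simple random sampling when densities differ strongly among patches. The Poisson means, patch sizes and sample size are assumed values.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulated crab counts per plot in three discrete patches with very
# different mean densities (assumed values for illustration).
patches = [rng.poisson(5, 400), rng.poisson(20, 400), rng.poisson(60, 400)]
population = np.concatenate(patches)
n_total = 30                                 # plots sampled per survey

def simple_random(pop):
    return rng.choice(pop, n_total, replace=False).mean()

def stratified(patches):
    # Equal allocation: n_total / 3 plots per patch; patch means averaged
    # with equal weights because the patches are equally sized here.
    per = n_total // len(patches)
    return np.mean([rng.choice(p, per, replace=False).mean() for p in patches])

srs = [simple_random(population) for _ in range(2000)]
strs = [stratified(patches) for _ in range(2000)]
print(f"true mean {population.mean():.2f}")
print(f"simple random : mean {np.mean(srs):.2f}, sd {np.std(srs):.2f}")
print(f"stratified    : mean {np.mean(strs):.2f}, sd {np.std(strs):.2f}")
```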