949 results for Geo-statistical model

Relevance: 80.00%

Publisher:

Abstract:

Submarine groundwater discharge (SGD) is an integral part of the hydrological cycle and represents an important aspect of land-ocean interactions. We used a numerical model to simulate flow and salt transport in a nearshore groundwater aquifer under varying wave conditions based on yearlong random wave data sets, including storm surge events. The results showed significant flow asymmetry, with rapid response of influxes and retarded response of effluxes across the seabed to the irregular wave conditions. While a storm surge immediately intensified seawater influx to the aquifer, the subsequent return of intruded seawater to the sea, as part of an increased SGD, was gradual. Using functional data analysis, we revealed and quantified retarded, cumulative effects of past wave conditions on SGD, including the fresh groundwater and recirculating seawater discharge components. The retardation was characterized well by a gamma distribution function regardless of wave conditions. The relationships between discharge rates and wave parameters were quantifiable by a regression model in a functional form independent of the actual irregular wave conditions. This statistical model provides a useful method for analyzing and predicting SGD from nearshore unconfined aquifers affected by random waves.
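The gamma-distributed retardation described above can be sketched as a weighted sum of past wave forcing. This is a minimal illustration, not the study's actual functional regression: the function names, the shape/scale values, and the use of wave height as the sole forcing variable are all assumptions.

```python
import math

def gamma_kernel(lag, shape=2.0, scale=3.0):
    """Gamma-distribution weight given to forcing `lag` time steps in the past."""
    if lag <= 0:
        return 0.0
    return (lag ** (shape - 1.0) * math.exp(-lag / scale)
            / (math.gamma(shape) * scale ** shape))

def lagged_discharge(wave_heights, shape=2.0, scale=3.0):
    """Discharge proxy at each step: gamma-weighted sum of past wave heights,
    so a storm surge raises discharge gradually rather than instantaneously."""
    return [sum(wave_heights[s] * gamma_kernel(t - s, shape, scale)
                for s in range(t))
            for t in range(len(wave_heights))]
```

For a single storm pulse, the response peaks a few steps later (at lag (shape − 1) × scale for this kernel) and then decays, mimicking the retarded SGD response.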

Abstract:

Microarrays are high-throughput biological assays that allow the screening of thousands of genes for their expression. The main idea behind microarrays is to compute, for each gene, a unique signal that is directly proportional to the quantity of mRNA that was hybridized on the chip. The large number of steps, and the errors associated with each step, make the generated expression signal noisy. As a result, microarray data need to be carefully pre-processed before their analysis can be assumed to lead to reliable and biologically relevant conclusions. This thesis focuses on developing methods for improving the gene signal and further utilizing this improved signal for higher-level analysis. To achieve this, first, approaches for designing microarray experiments using various optimality criteria, considering both biological and technical replicates, are described. A carefully designed experiment leads to a signal with low noise, as the effect of unwanted variation is minimized and the precision of the estimates of the parameters of interest is maximized. Second, a system for improving the gene signal by using three scans at varying scanner sensitivities is developed. A novel Bayesian latent intensity model is then applied to these three sets of expression values, corresponding to the three scans, to estimate the suitably calibrated true signal of the genes. Third, a novel image segmentation approach that segregates the fluorescent signal from the undesired noise is developed using an additional dye, SYBR Green RNA II. This technique helps to identify the signal arising only from the hybridized DNA, while signal corresponding to dust, scratches, spilled dye and other noise is discarded. Fourth, an integrated statistical model is developed in which signal correction, systematic array effects, dye effects and differential expression are modelled jointly, as opposed to a sequential application of several methods of analysis.
The methods described here have been tested only on cDNA microarrays but can, with some modifications, also be applied to other high-throughput technologies. Keywords: high-throughput technology, microarray, cDNA, multiple scans, Bayesian hierarchical models, image analysis, experimental design, MCMC, WinBUGS.
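As an illustration of how several scans of the same array can be pooled, the sketch below assumes each scan observes roughly gain × latent intensity and recovers the latent value by least squares. This is a deliberate simplification of the Bayesian latent intensity model described above (no saturation handling, no priors, no noise model); the function name and gain values are hypothetical.

```python
def combine_scans(scans, gains):
    """Least-squares estimate of the latent spot intensity from several scans,
    where scan i observes approximately gains[i] * latent_intensity."""
    num = sum(g * y for y, g in zip(scans, gains))
    den = sum(g * g for g in gains)
    return num / den
```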

Abstract:

The Baltic Sea is a geologically young, large brackish-water basin, and few of the species living there have fully adapted to its special conditions. Many species live at the edge of their distribution range in terms of one or more environmental variables, such as salinity or temperature. Environmental fluctuations are known to cause fluctuations in population abundance, and this effect is especially strong near the edges of the distribution range, where even small changes in an environmental variable can be critical to the success of a species. This thesis examines which environmental factors are the most important for the success of various commercially exploited fish species in the northern Baltic Sea. It also examines the uncertainties related to the fish stocks' current and potential status, as well as to their relationship with their environment. The aim is to quantify the uncertainties related to fisheries and environmental management, to find potential management strategies that can be used to reduce uncertainty in management results, and to develop methodology for uncertainty estimation in natural resources management. Bayesian statistical methods are utilized because of their ability to treat uncertainty explicitly in all parts of the statistical model. The results show that uncertainty about important parameters is large even for the most intensively studied fish species, such as salmon (Salmo salar L.) and Baltic herring (Clupea harengus membras L.). On the other hand, management approaches that reduce uncertainty can be found. These include utilizing information about the ecological similarity of fish stocks and species, and using management variables that are directly related to stock parameters that can be measured easily and without extrapolations or assumptions.
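One way "ecological similarity" can reduce uncertainty is hierarchical partial pooling: each stock's estimate is shrunk toward the across-stock mean in proportion to its own uncertainty. The sketch below is a textbook normal-normal shrinkage step, not the thesis's actual model; all names and variance values are illustrative.

```python
def partial_pool(means, variances, between_var):
    """Shrink each stock-specific estimate toward the grand mean; noisier
    stocks (larger variance) borrow more strength from the others."""
    grand = sum(means) / len(means)
    return [(m / v + grand / between_var) / (1.0 / v + 1.0 / between_var)
            for m, v in zip(means, variances)]
```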

Abstract:

Accelerator mass spectrometry (AMS) is an ultrasensitive technique for measuring the concentration of a single isotope. The electric and magnetic fields of an electrostatic accelerator system are used to filter out other isotopes from the ion beam. The high velocity means that molecules can be destroyed and removed from the measurement background. As a result, concentrations down to one atom in 10^16 atoms are measurable. This thesis describes the construction of the new AMS system in the Accelerator Laboratory of the University of Helsinki. The system is described in detail along with the relevant ion optics, and its performance and some of the 14C measurements made with it are reported. In the second part of the thesis, a novel statistical model for the analysis of AMS data is presented. Bayesian methods are used in order to make the best use of the available information. In the new model, instrumental drift is modelled with a continuous first-order autoregressive process. This enables rigorous normalization to standards measured at different times. The Poisson statistical nature of a 14C measurement is also properly taken into account, so that uncertainty estimates are much more stable. It is shown that, overall, the new model improves both the accuracy and the precision of AMS measurements. In particular, the results can be improved for samples with very low 14C concentrations or measured only a few times.
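Two ingredients of the model can be illustrated briefly: Poisson counting statistics set a floor on the relative precision of any count total, and instrument drift can be represented as a first-order autoregressive series. The sketch below shows those two ideas only, with made-up parameter values; it is not the thesis's full Bayesian model.

```python
import math
import random

def counting_precision(total_counts):
    """Relative 1-sigma precision of a Poisson count: sqrt(N)/N = 1/sqrt(N)."""
    return 1.0 / math.sqrt(total_counts)

def ar1_drift(n, phi=0.9, sigma=0.05, seed=1):
    """Continuous first-order autoregressive drift series (e.g. log machine
    efficiency). Successive measurements are correlated, which is what lets
    standards measured at different times still anchor the normalization."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x
```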

Abstract:

In Helsinki's Evangelical Lutheran congregations, the share of residents who are members of the church, relative to all people living in each congregation's geographical area, varies from 62.4 per cent in Paavali to 80.7 per cent in Munkkiniemi. The congregation boundaries are about to be redrawn to even out these differences. This thesis examines more closely the reasons for the differences between Helsinki's districts. The data consisted of statistical information gathered from the Population Information System of Finland, including information, by age group, on the population register keeper, marital status, native tongue, level of education and gender at the end of 2005. Additional data were gathered from the Helsinki Region Statistics web service, covering dwellings, income levels and the main activities of the inhabitants of the districts. The main method was stepwise linear regression; cross-tabulation and correlation matrices were used as supporting methods. The result of the study is a statistical model that explains 72.2 per cent of the variation in membership shares across the congregations. The dependent variable was the share of people in each district who are members of the Evangelical Lutheran Church. The independent variables were the share of people whose native tongue is other than Finnish or Swedish, the share of rented apartments, the share of apartments with four rooms and a kitchen, the share of detached houses, and the shares of women and of people with no income in the districts. The independent variables in the model reflect the proportion of foreigners, the housing stock, gender and the income level of the population. A high share of foreigners, people with no income and rented apartments explains a low share of church members.
Conversely, a high share of church members in a district is explained by large apartments, detached houses and the number of women living there.
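The reported figure, 72.2 per cent of variation explained, is the model's R². As a reminder of what that statistic measures, the sketch below computes R² for a one-predictor least-squares fit; it is a generic illustration, not the thesis's multi-variable stepwise model.

```python
def ols_r2(x, y):
    """Coefficient of determination (R^2) of a single-predictor OLS fit,
    i.e. the squared correlation between predictor and response."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)
```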

Abstract:

In contemporary wideband orthogonal frequency division multiplexing (OFDM) systems, such as Long Term Evolution (LTE) and WiMAX, different subcarriers over which a codeword is transmitted may experience different signal-to-noise-ratios (SNRs). Thus, adaptive modulation and coding (AMC) in these systems is driven by a vector of subcarrier SNRs experienced by the codeword, and is more involved. Exponential effective SNR mapping (EESM) simplifies the problem by mapping this vector into a single equivalent flat-fading SNR. Analysis of AMC using EESM is challenging owing to its non-linear nature and its dependence on the modulation and coding scheme. We first propose a novel statistical model for the EESM, which is based on the Beta distribution. It is motivated by the central limit approximation for random variables with a finite support. It is simpler and as accurate as the more involved ad hoc models proposed earlier. Using it, we develop novel expressions for the throughput of a point-to-point OFDM link with multi-antenna diversity that uses EESM for AMC. We then analyze a general, multi-cell OFDM deployment with co-channel interference for various frequency-domain schedulers. Extensive results based on LTE and WiMAX are presented to verify the model and analysis, and gain new insights.
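The EESM itself is a standard mapping: the vector of per-subcarrier SNRs γ_n is collapsed into γ_eff = −β ln((1/N) Σ exp(−γ_n/β)), where β is calibrated per modulation and coding scheme. A direct sketch (the β value below is arbitrary):

```python
import math

def eesm(subcarrier_snrs, beta):
    """Exponential effective SNR mapping: collapse per-subcarrier (linear)
    SNRs into a single equivalent flat-fading SNR."""
    n = len(subcarrier_snrs)
    return -beta * math.log(sum(math.exp(-s / beta) for s in subcarrier_snrs) / n)
```

When all subcarriers share the same SNR the mapping returns it unchanged; for unequal SNRs it returns a value below the arithmetic mean, penalizing variability, which is what makes a single AMC threshold usable.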

Abstract:

Significant changes in extreme rainfall characteristics over India are reported in recent studies, though there is disagreement on the spatial uniformity and causes of the trends. Based on recent theoretical advances in Extreme Value Theory (EVT), we analyze changes in extreme rainfall characteristics over India using a high-resolution daily gridded (1° latitude × 1° longitude) dataset. Intensity, duration and frequency of excess rain over a high threshold in the summer monsoon season are modeled by non-stationary distributions whose parameters vary with physical covariates: the El Niño-Southern Oscillation index (ENSO index), an indicator of large-scale natural variability; global average temperature, an indicator of human-induced global warming; and local mean temperature, which possibly indicates more localized changes. Each non-stationary model considers one physical covariate, and the best statistical model chosen at each rainfall grid point gives the most significant physical driver for each extreme rainfall characteristic at that grid point. Intensity, duration and frequency of extreme rainfall exhibit non-stationarity due to different drivers, and no spatially uniform pattern in their changes is observed across the country. At most locations the duration of extreme rainfall spells is found to be stationary, while non-stationary associations between intensity and frequency and local changes in temperature are detected at a large number of locations. This study presents the first application of non-stationary statistical modeling of the intensity, duration and frequency of extreme rainfall over India. The developed models are further used for rainfall frequency analysis to show changes in the 100-year extreme rainfall event.
Our findings indicate the varying nature of each extreme rainfall characteristic and its drivers, and emphasize the necessity of a comprehensive framework to assess the resulting risks of precipitation-induced flooding. (C) 2014 Elsevier B.V. All rights reserved.
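The three modelled characteristics, i.e. intensity, duration and frequency of excess rain over a high threshold, can be extracted from a daily series as sketched below. This is only the bookkeeping step that precedes fitting any (non-)stationary distribution; the function name and the toy threshold are illustrative.

```python
def idf_over_threshold(rain, thresh):
    """Intensity (mean excess over threshold), duration (mean wet-spell
    length) and frequency (number of spells) of threshold exceedances."""
    spells, excesses, cur = [], [], 0
    for r in rain:
        if r > thresh:
            cur += 1
            excesses.append(r - thresh)
        elif cur:
            spells.append(cur)
            cur = 0
    if cur:
        spells.append(cur)
    freq = len(spells)
    duration = sum(spells) / freq if freq else 0.0
    intensity = sum(excesses) / len(excesses) if excesses else 0.0
    return intensity, duration, freq
```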

Abstract:

Statistical model-based methods are presented for the reconstruction of autocorrelated signals in impulsive plus continuous noise environments. Signals are modelled as autoregressive and noise sources as discrete and continuous mixtures of Gaussians, allowing for robustness in highly impulsive and non-Gaussian environments. Markov Chain Monte Carlo methods are used for reconstruction of the corrupted waveforms within a Bayesian probabilistic framework and results are presented for contaminated voice and audio signals.
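The signal and noise models can be sketched by simulation: an autoregressive waveform corrupted by a two-component Gaussian mixture, one low-variance "continuous" component and one rare high-variance "impulsive" component. The AR coefficients and mixture parameters below are made up for illustration; the Bayesian MCMC reconstruction itself is not shown.

```python
import random

def ar2_with_mixture_noise(n, a1=1.5, a2=-0.7, p_imp=0.05,
                           sigma_cont=0.1, sigma_imp=3.0, seed=0):
    """Generate a stable AR(2) signal x and its corrupted observation y,
    where y adds continuous Gaussian noise plus occasional impulses."""
    rng = random.Random(seed)
    x = [0.0, 0.0]
    for _ in range(n - 2):
        x.append(a1 * x[-1] + a2 * x[-2] + rng.gauss(0.0, 0.2))
    y = [xi + rng.gauss(0.0, sigma_imp if rng.random() < p_imp else sigma_cont)
         for xi in x]
    return x, y
```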

Abstract:

We present a statistical model-based approach to signal enhancement in the case of additive broadband noise. Because broadband noise is localised in neither time nor frequency, its removal is one of the most pervasive and difficult signal enhancement tasks. In order to improve perceived signal quality, we take advantage of human perception and define a best estimate of the original signal in terms of a cost function incorporating perceptual optimality criteria. We derive the resultant signal estimator and implement it in a short-time spectral attenuation framework. Audio examples, references, and further information may be found at http://www-sigproc.eng.cam.ac.uk/~pjw47.
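Short-time spectral attenuation applies a per-frequency-bin gain to each analysis frame. The sketch below uses plain power spectral subtraction with a gain floor; the perceptually optimal estimator of the paper would replace this simple gain rule, and the floor value here is arbitrary.

```python
def attenuation_gains(frame_power, noise_power, floor=0.01):
    """Per-bin gain for one short-time frame: attenuate bins whose power is
    close to the noise estimate; the floor limits musical-noise artifacts."""
    return [max(1.0 - noise_power / max(p, 1e-12), floor) for p in frame_power]
```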

Abstract:

This paper considers a class of dynamic spatial point processes (PPs) that evolves over time in a Markovian fashion. This Markov-in-time PP is hidden and observed indirectly through another PP via thinning, displacement and noise. This statistical model is important for multi-object tracking applications, and we present an approximate likelihood-based method for estimating the model parameters. The work is supported by an extensive numerical study.
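The observation model, namely thinning, displacement and superposed clutter, can be sketched for a one-dimensional point pattern. This simulates the forward model only, with hypothetical parameter values; the approximate likelihood estimation is not shown, and for brevity the clutter count is fixed rather than Poisson-distributed.

```python
import random

def observe(points, rng, p_detect=0.9, sigma=0.1, n_clutter=2,
            region=(0.0, 10.0)):
    """One noisy observation of a hidden 1-D point pattern: each point is
    kept with probability p_detect (thinning), jittered by Gaussian noise
    (displacement), and clutter points are added uniformly over the region."""
    obs = [p + rng.gauss(0.0, sigma) for p in points if rng.random() < p_detect]
    obs += [rng.uniform(*region) for _ in range(n_clutter)]
    return obs
```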

Abstract:

In view of recent interest in the Cl37(ν_solar, e-)Ar37 reaction cross section, information on some aspects of mass-37 nuclei has been obtained using the K39(d, α)Ar37 and Cl35(He3, p)Ar37 reactions. Ar37 levels have been found at 0, 1.41, 1.62, 2.22, 2.50, 2.80, 3.17, 3.27, 3.53, 3.61, 3.71, (3.75), (3.90), 3.94, 4.02, (4.21), 4.28, 4.32, 4.40, 4.45, 4.58, 4.63, 4.74, 4.89, 4.98, 5.05, 5.10, 5.13, 5.21, 5.35, 5.41, 5.44, 5.54, 5.58, 5.67, 5.77, and 5.85 MeV (the underlined values correspond to previously tabulated levels). The nuclear temperature calculated from the Ar37 level density is 1.4 MeV. Angular distributions of the lowest six levels from the K39(d, α)Ar37 reaction at Ed = 10 MeV indicate a dominant direct-interaction mechanism and the inapplicability of the 2I + 1 rule of the statistical model. Comparison of the spectra obtained with the K39(d, α)Ar37 and Cl35(He3, p)Ar37 reactions leads to the suggestion that the 5.13-MeV level is the analog of the T = 3/2 Cl37 ground state. The ground-state Q-value of the Ca40(p, α)K37 reaction has been measured: -5179 ± 9 keV. This value implies a K37 mass excess of -24804 ± 10 keV. Descriptions of an NMR magnetometer and a sixteen-detector array used in conjunction with a 61-cm double-focusing magnetic spectrometer are included in appendices.
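The 2I + 1 rule of the statistical (compound-nucleus) model predicts that final-state populations scale with each level's spin degeneracy. A minimal sketch of those relative weights (the spin values in the test are arbitrary examples, not levels from this work):

```python
def statistical_weights(spins):
    """Relative populations predicted by the 2I+1 rule: each final level of
    spin I is fed in proportion to its degeneracy 2I + 1."""
    degeneracies = [2.0 * spin + 1.0 for spin in spins]
    total = sum(degeneracies)
    return [g / total for g in degeneracies]
```

A dominant direct-reaction mechanism, as indicated by the angular distributions above, shows up as populations deviating from these weights.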

Abstract:

This paper presents a method to generate new melodies by conserving the semiotic structure of a template piece. A pattern discovery algorithm is applied to the template piece to extract significant segments: those that are repeated and those that are transposed within the piece. Two strategies are combined to describe the semiotic coherence structure of the template piece: inter-segment coherence and intra-segment coherence. Once the structure is described, it is used as a template for new musical content that is generated using a statistical model created from a corpus of bertso melodies and iteratively improved using a stochastic optimization method. Results show that the method effectively describes the coherence structure of a piece by discovering repetition and transposition relations between segments, and by representing the relations among notes within the segments. For bertso generation the method correctly conserves all intra- and inter-segment coherence of the template, and the optimization method produces coherent generated melodies.
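The pattern-discovery step, finding segments that repeat exactly or reappear transposed, can be sketched by comparing interval patterns: two segments are transpositions of each other exactly when their successive pitch intervals match (a shift of 0 is an exact repeat). This brute-force sketch over a MIDI-pitch list is an illustration, not the paper's actual algorithm.

```python
def find_transposed_repeats(pitches, length):
    """Return (i, j, shift) for every pair of length-`length` segments whose
    interval patterns match; shift is the transposition in semitones."""
    def intervals(seg):
        return tuple(b - a for a, b in zip(seg, seg[1:]))
    hits = []
    n = len(pitches)
    for i in range(n - length + 1):
        for j in range(i + 1, n - length + 1):
            a, b = pitches[i:i + length], pitches[j:j + length]
            if intervals(a) == intervals(b):
                hits.append((i, j, b[0] - a[0]))  # shift 0 -> exact repeat
    return hits
```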

Abstract:

This thesis centres on the strategies voters employ to process political information, in the context of the 2006 Brazilian presidential campaign. We propose a statistical model of political information processing, built on contributions from the social sciences, economics, cognitive psychology and communication studies, and above all on evidence from our research design. The design combined qualitative and quantitative methods with an analysis of the rhetorical strategies employed by candidates and political parties in the Horário Gratuito de Propaganda Eleitoral (HGPE, free electoral advertising time), the dynamic element of our study, since it synthesizes the information flows of political campaigns. This set of methodological approaches was applied to a case study of voters in Belo Horizonte, embedded in the complex informational environment of presidential campaigns. With incomplete information, voters had to choose whom to believe, coping with uncertainty about the election outcome and about the future behaviour of the actors, aware that campaign rhetoric was oriented toward persuasion. Our work sought to map the strategies voters use to select topics of the debate for attention and to process new political information acquired through multiple interactions over the course of the campaign. This complex task amounts to choosing by whom to be persuaded. Drawing on the empirical evidence, we sought to answer several standing questions in this field, among them: 1) Among the many topics raised in the contest between parties and candidates, which ones does the individual choose to attend to and believe, and why? 2) Which variables mediate this interaction with new information, and what is their weight in explaining the voting decision?
3) Do the priorities of the voter's political agenda change over the course of the campaign? 4) Do voters broaden their general repertoire of political information? 5) Do perceptions of government performance and of the priority issues on the voter's agenda change over the course of the campaign?

Abstract:

This thesis belongs to the body of research that seeks to understand how elections work in Brazil. Specifically, the objective is to investigate negative advertising during presidential elections. Five chapters were developed to this end. The first situates the reader in the normative debate on the role of negative advertising in electoral democracy, discussing the importance of attacks in a range of circumstances, such as political mobilization, the informational environment and vote choice. The second chapter is a broad content analysis of the negative advertising aired in the Horário Gratuito de Propaganda Eleitoral during the presidential elections of 1989, 1994, 1998, 2002, 2006 and 2010, first and second rounds. The methodology followed the guidelines formulated by Figueiredo et al. (1998), adapted to the specificities of negative advertising. Interesting tendencies were uncovered; the most striking, without doubt, is the low rate of attacks between candidates. The third chapter investigates the strategic use of spot advertisements during presidential campaigns. I discuss the regulated character of the Brazilian model of political advertising, and nevertheless identify divergent strategies in the use of negative spots, with the evening slot being the predominant locus of attacks. The fourth chapter builds a model of negative campaigning based on game theory, seeking to answer the questions: who attacks whom, when and why? I argue that negative advertising is the last resource candidates deploy in the contest for votes; its central purpose is to change the opponent's trend, and it is therefore used mainly by candidates trailing in vote-intention polls. The fifth and final chapter develops a statistical model to measure the impact of negative advertising on vote-intention figures.