10 results for ESTIMATORS
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
The characterization of thermocouple sensors for temperature measurement in varying-flow environments is a challenging problem. Recently, the authors introduced novel difference-equation-based algorithms that allow in situ characterization of temperature measurement probes consisting of two thermocouple sensors with differing time constants. In particular, a linear least squares (LS) lambda formulation of the characterization problem, which yields unbiased estimates when identified using generalized total LS, was introduced. These algorithms assume that time constants do not change during operation and are, therefore, appropriate for temperature measurement in homogeneous constant-velocity liquid or gas flows. This paper develops an alternative β-formulation of the characterization problem that has the major advantage of allowing exploitation of a priori knowledge of the ratio of the sensor time constants, thereby facilitating the implementation of computationally efficient algorithms that are less sensitive to measurement noise. A number of variants of the β-formulation are developed, and appropriate unbiased estimators are identified. Monte Carlo simulation results are used to support the analysis.
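To make the role of the time-constant ratio concrete, the following sketch simulates two first-order sensors exposed to the same fluctuating gas temperature and recovers one time constant by single-parameter least squares once the ratio is assumed known. It is a simplified continuous-time illustration with assumed names and values, not the authors' difference-equation algorithm; the naive use of differentiated noisy measurements is also exactly the setting in which ordinary LS becomes biased, which is what motivates the unbiased estimators discussed in the abstract.

```python
# Minimal sketch (assumed values, not the authors' algorithm): estimating a
# thermocouple time constant by least squares when the ratio of the two sensor
# time constants is assumed known a priori.
import numpy as np

rng = np.random.default_rng(0)

# Two first-order sensors, tau * dT/dt + T = Tg, exposed to the same (unknown)
# gas temperature Tg, with tau2 = beta * tau1 and beta assumed known.
dt, n = 1e-3, 5000
tau1_true, beta = 0.040, 2.0
tau2_true = beta * tau1_true
t = np.arange(n) * dt
Tg = 300.0 + 20.0 * np.sin(2 * np.pi * 3.0 * t)   # fluctuating gas temperature

T1 = np.empty(n); T2 = np.empty(n)
T1[0] = T2[0] = Tg[0]
for k in range(n - 1):
    T1[k + 1] = T1[k] + dt / tau1_true * (Tg[k] - T1[k])
    T2[k + 1] = T2[k] + dt / tau2_true * (Tg[k] - T2[k])

# Noisy measurements (illustrative noise level).
y1 = T1 + rng.normal(0.0, 0.01, n)
y2 = T2 + rng.normal(0.0, 0.01, n)

# Eliminating the unknown Tg from the two sensor equations gives
#   y2 - y1 = tau1 * (dy1/dt - beta * dy2/dt),
# a single-parameter least squares problem in tau1 once beta is known.
dy1 = np.gradient(y1, dt)
dy2 = np.gradient(y2, dt)
phi = dy1 - beta * dy2            # regressor
z = y2 - y1                       # response
tau1_hat = np.dot(phi, z) / np.dot(phi, phi)
print(f"estimated tau1 = {tau1_hat:.4f} s (true value {tau1_true:.4f} s)")
```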
Abstract:
The purpose of this study is to compare the inferability of various synthetic as well as real biological regulatory networks. To assess differences, we apply local network-based measures; that is, instead of applying global measures, we investigate and assess an inference algorithm locally, at the level of individual edges and subnetworks. We demonstrate the behaviour of our local network-based measures on different regulatory networks by conducting large-scale simulations, using ARACNE as an exemplary inference algorithm. The results of our exploratory analysis allow us not only to gain new insights into the strengths and weaknesses of an inference algorithm with respect to the characteristics of different regulatory networks, but also to obtain information that could be used to design novel problem-specific statistical estimators.
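As a hedged illustration of what a local, edge-level assessment looks like (as opposed to a single global score), the sketch below computes a per-node F1 score between a known network and a perturbed "inferred" one. The measure and toy data are assumptions for illustration only, not the local network-based measures defined in the study, and no actual ARACNE run is involved.

```python
# Minimal sketch of a local, node-level evaluation of an inferred network,
# as opposed to one global score. Toy data; not the study's measures.
import numpy as np

rng = np.random.default_rng(1)
n = 8
true_net = (rng.random((n, n)) < 0.25).astype(int)   # true directed adjacency
np.fill_diagonal(true_net, 0)

# A fake "inferred" network: the truth with some edges flipped, standing in
# for the output of an inference algorithm such as ARACNE.
inferred = true_net.copy()
flip = rng.random((n, n)) < 0.15
inferred[flip] = 1 - inferred[flip]
np.fill_diagonal(inferred, 0)

def local_f1(true_adj, pred_adj, node):
    """F1 score restricted to edges incident to a single node."""
    mask = np.zeros_like(true_adj, dtype=bool)
    mask[node, :] = True
    mask[:, node] = True
    t, p = true_adj[mask], pred_adj[mask]
    tp = np.sum((t == 1) & (p == 1))
    fp = np.sum((t == 0) & (p == 1))
    fn = np.sum((t == 1) & (p == 0))
    if tp == 0:
        return 0.0
    prec, rec = tp / (tp + fp), tp / (tp + fn)
    return 2 * prec * rec / (prec + rec)

for v in range(n):
    print(f"node {v}: local F1 = {local_f1(true_net, inferred, v):.2f}")
```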
Abstract:
In this article, we extend the earlier work of Freeland and McCabe [Journal of Time Series Analysis (2004) Vol. 25, pp. 701–722] and develop a general framework for maximum likelihood (ML) analysis of higher-order integer-valued autoregressive processes. Our exposition includes the case where the innovation sequence has a Poisson distribution and the thinning is binomial. A recursive representation of the transition probability of the model is proposed. Based on this transition probability, we derive expressions for the score function and the Fisher information matrix, which form the basis for ML estimation and inference. Similar to the results in Freeland and McCabe (2004), we show that the score function and the Fisher information matrix can be neatly represented as conditional expectations. Using the INAR(2) specification with binomial thinning and Poisson innovations, we examine both the asymptotic efficiency and finite sample properties of the ML estimator in relation to the widely used conditional least squares (CLS) and Yule–Walker (YW) estimators. We conclude that, if the Poisson assumption can be justified, there are substantial gains to be had from using ML, especially when the thinning parameters are large.
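To make the ML step concrete, the sketch below evaluates the INAR(2) transition probability by directly convolving the two binomial thinning components with the Poisson innovation, and maximizes the resulting conditional log-likelihood numerically. It is a generic illustration under assumed parameter values: the direct summation used here is not the recursive representation proposed in the article, and no score or Fisher-information expressions are exploited.

```python
# Minimal sketch: conditional ML for an INAR(2) model with binomial thinning
# and Poisson innovations, X_t = a1 o X_{t-1} + a2 o X_{t-2} + e_t.
import numpy as np
from scipy.stats import binom, poisson
from scipy.optimize import minimize

def neg_loglik(theta, x):
    a1, a2, lam = theta
    if not (0.0 < a1 < 1.0 and 0.0 < a2 < 1.0 and a1 + a2 < 1.0 and lam > 0.0):
        return np.inf
    m = x.max()
    idx = np.arange(m + 1)
    # Pre-tabulate the pmfs used in the convolution.
    B1 = binom.pmf(idx[None, :], idx[:, None], a1)   # B1[y, j] = P(Bin(y, a1) = j)
    B2 = binom.pmf(idx[None, :], idx[:, None], a2)
    P = poisson.pmf(idx, lam)
    ll = 0.0
    for t in range(2, len(x)):
        xt, y1, y2 = x[t], x[t - 1], x[t - 2]
        p = 0.0
        for j in range(min(xt, y1) + 1):             # convolve the three parts
            k = np.arange(min(xt - j, y2) + 1)
            p += B1[y1, j] * np.sum(B2[y2, k] * P[xt - j - k])
        ll += np.log(p + 1e-300)
    return -ll

# Simulate a short INAR(2) series (illustrative parameter values).
rng = np.random.default_rng(2)
a1_true, a2_true, lam_true, n, burn = 0.3, 0.2, 1.5, 300, 100
x = np.zeros(n + burn, dtype=int)
for t in range(2, n + burn):
    x[t] = (rng.binomial(x[t - 1], a1_true) + rng.binomial(x[t - 2], a2_true)
            + rng.poisson(lam_true))
x = x[burn:]

res = minimize(neg_loglik, x0=[0.2, 0.2, 1.0], args=(x,), method="Nelder-Mead")
print("ML estimates (a1, a2, lambda):", np.round(res.x, 3))
```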
Abstract:
In astrophysical systems, radiation-matter interactions are important in transferring energy and momentum between the radiation field and the surrounding material. This coupling often makes it necessary to consider the role of radiation when modelling the dynamics of astrophysical fluids. During the last few years, there have been rapid developments in the use of Monte Carlo methods for numerical radiative transfer simulations. Here, we present an approach to radiation hydrodynamics that is based on coupling Monte Carlo radiative transfer techniques with finite-volume hydrodynamical methods in an operator-split manner. In particular, we adopt an indivisible packet formalism to discretize the radiation field into an ensemble of Monte Carlo packets and employ volume-based estimators to reconstruct the radiation field characteristics. In this paper, the numerical tools of this method are presented and their accuracy is verified in a series of test calculations. Finally, as a practical example, we use our approach to study the influence of the radiation-matter coupling on the homologous expansion phase and the bolometric light curve of Type Ia supernova explosions. © 2012 The Authors. Monthly Notices of the Royal Astronomical Society © 2012 RAS.
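The volume-based estimators mentioned here can be illustrated with a path-length estimator of the kind commonly used in Monte Carlo radiative transfer: each packet contributes to the radiation energy density of every cell it crosses in proportion to the path length it travels there. The sketch below is a deliberately simple 1D, interaction-free toy with assumed packet energies and grid, not the scheme implemented in the paper.

```python
# Minimal sketch of a volume-based (path-length) Monte Carlo estimator for the
# radiation energy density in grid cells. Geometry and values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
c = 2.998e10                      # speed of light [cm/s]
dt = 1.0                          # duration of the time step [s]
n_cells = 10
edges = np.linspace(0.0, 1.0e12, n_cells + 1)   # 1D slab cell boundaries [cm]
path_sum = np.zeros(n_cells)      # accumulated packet path length per cell [cm]

e_packet = 1.0e40                 # energy carried by each indivisible packet [erg]
n_packets = 5000

for _ in range(n_packets):
    # Launch a packet at the left edge with a random direction cosine mu in (0, 1].
    x, mu = 0.0, 1.0 - rng.random()
    while x < edges[-1]:
        cell = np.searchsorted(edges, x, side="right") - 1
        d = (edges[cell + 1] - x) / mu   # slant path length within this cell
        path_sum[cell] += d              # path-length contribution to the estimator
        x = edges[cell + 1] + 1e-6       # step just across the boundary

volumes = np.diff(edges)          # cell "volumes" (lengths in this 1D toy)
energy_density = e_packet * path_sum / (c * volumes * dt)
print(energy_density)
```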
Abstract:
A three-dimensional Monte Carlo code for modelling radiation transport in Type Ia supernovae is described. In addition to tracking Monte Carlo quanta to follow the emission, scattering and deposition of radiative energy, a scheme involving volume-based Monte Carlo estimators is used to allow properties of the emergent radiation field to be extracted for specific viewing angles in a multidimensional structure. This eliminates the need to compute spectra or light curves by angular binning of emergent quanta. The code is applied to two test problems to illustrate consequences of multidimensional structure on the modelling of light curves. First, elliptical models are used to quantify how large-scale asphericity can introduce angular dependence to light curves. Secondly, a model which incorporates complex structural inhomogeneity, as predicted by modern explosion models, is used to investigate how such structure may affect light-curve properties. © 2006 RAS.
Abstract:
This article investigates to what extent the worldwide increase in body mass index (BMI) has been affected by economic globalization and inequality. We used time-series and longitudinal cross-national analysis of 127 countries from 1980 to 2008. Data on mean adult BMI were obtained from the Global Burden of Metabolic Risk Factors of Chronic Diseases Collaborating Group. Globalization was measured using the Swiss Economic Institute (KOF) index of economic globalization. Economic inequality between countries was measured with the mean difference in gross domestic product per capita purchasing power parity in international dollars. Economic inequality within countries was measured using the Gini index from the Standardized World Income Inequality Database. Other covariates, including poverty, population size, urban population, openness to trade and foreign direct investment, were taken from the World Development Indicators (WDI) database. Time-series regression analyses showed that the global increase in BMI is positively associated with both the index of economic globalization and inequality between countries, after adjustment for covariates. Longitudinal panel data analyses showed that the association between economic globalization and BMI is robust after controlling for all covariates and using different estimators. The association between economic inequality within countries and BMI, however, was significant only among high-income nations. More research is needed to study the pathways between economic globalization and BMI. These findings, however, contribute to explaining how contemporary globalization can be reformed to promote better health and control the global obesity epidemic. © 2013 Copyright Taylor and Francis Group, LLC.
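The longitudinal analyses referred to here rely on standard panel estimators. As one hedged illustration of what such an estimator does, the sketch below applies the country fixed-effects (within) transformation to synthetic data; the variable names, data-generating values and single regressor are assumptions for illustration only, not the study's data or specification.

```python
# Minimal sketch of a country fixed-effects ("within") panel estimator on
# synthetic data. Not the study's data, model or covariate set.
import numpy as np

rng = np.random.default_rng(4)
n_countries, n_years = 50, 29                      # e.g. 1980-2008
country = np.repeat(np.arange(n_countries), n_years)

# Synthetic "globalization index" and BMI with unobserved country effects.
alpha = rng.normal(0.0, 2.0, n_countries)
glob = rng.normal(50.0, 10.0, n_countries * n_years)
bmi = 22.0 + alpha[country] + 0.05 * glob + rng.normal(0.0, 0.5, glob.size)

def within(v, groups):
    """Demean a variable within each group (the fixed-effects transformation)."""
    out = v.astype(float).copy()
    for g in np.unique(groups):
        out[groups == g] -= v[groups == g].mean()
    return out

y, x = within(bmi, country), within(glob, country)
beta_fe = np.dot(x, y) / np.dot(x, x)              # slope of BMI on globalization
print(f"within estimate of the globalization coefficient: {beta_fe:.3f}")
```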
Abstract:
Summary: We present a new R package, diveRsity, for the calculation of various diversity statistics, including common diversity partitioning statistics (θ, G) and population differentiation statistics (D, G'ST, χ² test for population heterogeneity), among others. The package calculates these estimators along with their respective bootstrapped confidence intervals for loci, sample population pairwise and global levels. Various plotting tools are also provided for a visual evaluation of estimated values, allowing users to critically assess the validity and significance of statistical tests from a biological perspective. diveRsity has a set of unique features, which facilitate the use of an informed framework for assessing the validity of the use of traditional F-statistics for the inference of demography, with reference to specific marker types, particularly focusing on highly polymorphic microsatellite loci. However, the package can be readily used for other co-dominant marker types (e.g. allozymes, SNPs). Detailed examples of usage and descriptions of package capabilities are provided. The examples demonstrate useful strategies for the exploration of data and interpretation of results generated by diveRsity. Additional online resources for the package are also described, including a GUI web app version intended for those with more limited experience using R for statistical analysis. © 2013 British Ecological Society.
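The kind of quantity and uncertainty the package reports can be illustrated generically: the sketch below computes Nei's GST from allele frequencies and attaches a bootstrap confidence interval by resampling loci. It is a plain-Python illustration with toy data and a simplified multilocus average, not the diveRsity implementation or its exact estimators.

```python
# Minimal sketch: a differentiation statistic (Nei's GST) with a bootstrap CI
# obtained by resampling loci. Toy allele frequencies; not diveRsity itself.
import numpy as np

rng = np.random.default_rng(5)

def gst(freqs):
    """Nei's GST for one locus; freqs has shape (n_pops, n_alleles)."""
    hs = np.mean(1.0 - np.sum(freqs ** 2, axis=1))   # mean within-population het.
    pbar = freqs.mean(axis=0)
    ht = 1.0 - np.sum(pbar ** 2)                     # total heterozygosity
    return (ht - hs) / ht if ht > 0 else 0.0

# Toy allele-frequency data: 20 loci, 3 populations, 4 alleles each.
loci = [rng.dirichlet(np.ones(4), size=3) for _ in range(20)]
# Simplified multilocus value: the mean of per-locus GST.
point = np.mean([gst(f) for f in loci])

# Bootstrap over loci to get a confidence interval for the multilocus mean.
boots = []
for _ in range(2000):
    sample = rng.choice(len(loci), size=len(loci), replace=True)
    boots.append(np.mean([gst(loci[i]) for i in sample]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"GST = {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```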
Abstract:
This paper considers inference from multinomial data and addresses the problem of choosing the strength of the Dirichlet prior under a mean-squared error criterion. We compare the Maximum Likelihood Estimator (MLE) and the most commonly used Bayesian estimators obtained by assuming a prior Dirichlet distribution with non-informative prior parameters, that is, the parameters of the Dirichlet are equal and altogether sum up to the so-called strength of the prior. Under this criterion, the MLE becomes preferable to the Bayesian estimators as the number of categories k of the multinomial increases, because the non-informative Bayesian estimators induce a region in which they are dominant that quickly shrinks as k increases. This can be avoided if the strength of the prior is not kept constant but decreased with the number of categories. We argue that the strength should decrease at least k times faster than it does in the usual estimators.
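A small Monte Carlo comparison of the kind described can be sketched directly: the code below estimates the mean-squared error of the MLE and of the posterior mean under a symmetric Dirichlet prior of fixed strength, for a true distribution at the prior mean and for one far from it, at two values of k. The sample size, strength value and the two example distributions are illustrative assumptions, not the paper's experiments.

```python
# Minimal sketch: MSE of the MLE vs a Bayesian estimator with a symmetric
# Dirichlet(s/k, ..., s/k) prior of fixed total strength s. Illustrative values.
import numpy as np

rng = np.random.default_rng(6)
N, s, n_rep = 50, 10.0, 3000         # sample size, prior strength, replications

def mse(p, estimator):
    total = 0.0
    for _ in range(n_rep):
        counts = rng.multinomial(N, p)
        total += np.sum((estimator(counts) - p) ** 2)
    return total / n_rep

for k in (3, 30):
    mle = lambda c: c / N
    bayes = lambda c: (c + s / k) / (N + s)      # posterior mean under the prior
    uniform = np.full(k, 1.0 / k)                # true p at the prior mean
    skewed = np.r_[0.9, np.full(k - 1, 0.1 / (k - 1))]   # true p far from it
    for label, p in (("uniform", uniform), ("skewed", skewed)):
        print(f"k={k:2d} {label:8s} MSE(MLE)={mse(p, mle):.5f} "
              f"MSE(Bayes)={mse(p, bayes):.5f}")
```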
Abstract:
In this brief, a hybrid filter algorithm is developed to deal with the state estimation (SE) problem for power systems by taking into account the impact from the phasor measurement units (PMUs). Our aim is to include PMU measurements when designing the dynamic state estimators for power systems with traditional measurements. Also, as data dropouts inevitably occur in the transmission channels of traditional measurements from the meters to the control center, the missing measurement phenomenon is also tackled in the state estimator design. In the framework of the extended Kalman filter (EKF) algorithm, the PMU measurements are treated as inequality constraints on the states with the aid of the statistical criterion, and then the addressed SE problem becomes a constrained optimization one based on the probability-maximization method. The resulting constrained optimization problem is then solved using the particle swarm optimization algorithm together with the penalty function approach. The proposed algorithm is applied to estimate the states of power systems with both traditional and PMU measurements in the presence of probabilistic missing data. Extensive simulations are carried out on the IEEE 14-bus test system, and it is shown that the proposed algorithm gives much improved estimation performance over the traditional EKF method.
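To make the constrained-optimization ingredient concrete, the sketch below minimizes a small weighted least-squares cost with a basic particle swarm optimizer, handling an inequality constraint (standing in for a PMU-derived bound on a state) through a quadratic penalty term. The two-state toy model, bound values and PSO tuning constants are assumptions for illustration; this is not the paper's EKF-based algorithm or its test system.

```python
# Minimal sketch: penalty-function handling of an inequality constraint inside
# a plain particle swarm optimizer, applied to a toy weighted least-squares
# state-estimation cost. Illustrative model and values only.
import numpy as np

rng = np.random.default_rng(7)

# Toy measurement model: z = H x + noise, with a "PMU-like" bound on x[0].
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([1.02, 0.98])
R_inv = np.diag([1.0 / 0.01, 1.0 / 0.01, 1.0 / 0.02])
z = H @ x_true + rng.normal(0.0, 0.05, 3)
lower, upper = 1.00, 1.05            # inequality constraints on x[0]

def cost(x, penalty=1e4):
    r = z - H @ x
    base = r @ R_inv @ r                             # weighted least-squares cost
    violation = max(0.0, lower - x[0]) + max(0.0, x[0] - upper)
    return base + penalty * violation ** 2           # quadratic constraint penalty

# Plain particle swarm optimization.
n_particles, n_iter = 30, 200
pos = rng.uniform(0.5, 1.5, (n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("constrained PSO estimate:", np.round(gbest, 4), "true state:", x_true)
```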