799 results for Measurement
Measurement of the energy spectrum of cosmic rays above 10^18 eV using the Pierre Auger Observatory
Abstract:
We report a measurement of the flux of cosmic rays with unprecedented precision and statistics using the Pierre Auger Observatory. Based on fluorescence observations in coincidence with at least one surface detector, we derive a spectrum for energies above 10^18 eV. We also update the previously published energy spectrum obtained with the surface detector array. The two spectra are combined, addressing the systematic uncertainties and, in particular, the influence of the energy resolution on the spectral shape. The spectrum can be described by a broken power law E^(-gamma) with index gamma = 3.3 below the ankle, which is measured at log10(E_ankle/eV) = 18.6. Above the ankle the spectrum is described by a power law with index 2.6, followed by a flux suppression above about log10(E/eV) = 19.5, detected with high statistical significance. (C) 2010 Elsevier B.V. All rights reserved.
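As a rough illustration of the spectral shape reported in the abstract above, the following sketch (not the Auger collaboration's fit code; the exponential form of the suppression is an assumption made here purely for illustration) evaluates a broken power law with index 3.3 below the ankle at log10(E/eV) = 18.6 and index 2.6 above it:

```python
import numpy as np

# Toy sketch of the reported spectral shape (not the Auger fit code):
# a broken power law E^(-gamma) with gamma = 3.3 below the ankle at
# log10(E/eV) = 18.6 and gamma = 2.6 above it. The flux suppression
# above log10(E/eV) ~ 19.5 is modelled here by an ad-hoc exponential
# cutoff, purely for illustration.
LOG_E_ANKLE = 18.6
LOG_E_SUPP = 19.5
GAMMA_BELOW = 3.3
GAMMA_ABOVE = 2.6

def flux(log10_e):
    """Relative cosmic-ray flux J(E), normalized to 1 at the ankle."""
    e = 10.0 ** log10_e
    e_ankle = 10.0 ** LOG_E_ANKLE
    if log10_e < LOG_E_ANKLE:
        return (e / e_ankle) ** (-GAMMA_BELOW)
    j = (e / e_ankle) ** (-GAMMA_ABOVE)      # continuous at the ankle
    e_supp = 10.0 ** LOG_E_SUPP
    return j * np.exp(-max(0.0, e - e_supp) / e_supp)
```

On a log-log plot this reproduces the reported features: a steeper slope of -3.3 below the ankle, -2.6 above it, and an extra decline setting in near 10^19.5 eV.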
Abstract:
In this article, we discuss inferential aspects of measurement error regression models with null intercepts when the unknown quantity x (latent variable) follows a skew-normal distribution. We first examine the maximum-likelihood approach to estimation via the EM algorithm, exploring statistical properties of the model considered. Then, the marginal likelihood, the score function and the observed information matrix of the observed quantities are presented, allowing direct inference implementation. In order to discuss some diagnostic techniques for this type of model, we derive the appropriate matrices for assessing the local influence on the parameter estimates under different perturbation schemes. The results and methods developed in this paper are illustrated using part of a real data set from Hadgu and Koch [1999, Application of generalized estimating equations to a dental randomized clinical trial. Journal of Biopharmaceutical Statistics, 9, 161-178].
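For readers unfamiliar with the skew-normal distribution assumed for the latent variable above, its standard density is f(z) = 2*phi(z)*Phi(alpha*z), where phi and Phi are the standard normal density and CDF and alpha controls the asymmetry. A minimal generic sketch (not the authors' code):

```python
import math

def skew_normal_pdf(x, alpha, loc=0.0, scale=1.0):
    """Density of SN(loc, scale, alpha): f(z) = 2*phi(z)*Phi(alpha*z).

    alpha = 0 recovers the normal density; alpha > 0 skews to the right.
    """
    z = (x - loc) / scale
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(alpha * z / math.sqrt(2.0)))
    return 2.0 * phi * Phi / scale
```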
Abstract:
The main object of this paper is to discuss Bayes estimation of the regression coefficients in the elliptically distributed simple regression model with measurement errors. The posterior distribution for the line parameters is obtained in closed form, considering the following: the ratio of the error variances is known, an informative prior distribution is placed on the error variance, and non-informative prior distributions are placed on the regression coefficients and on the incidental parameters. We prove that the posterior distribution of the regression coefficients has at most two real modes. Situations with a single mode are more likely than those with two modes, especially in large samples. The precision of the modal estimators is studied by deriving the Hessian matrix, which, although complicated, can be computed numerically. The posterior mean is estimated using the Gibbs sampling algorithm and approximations by normal distributions. The results are applied to a real data set and connections with results in the literature are reported. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
This work presents a Bayesian semiparametric approach for dealing with regression models where the covariate is measured with error. Given that (1) the assumption of normal errors is very restrictive and (2) assuming a specific elliptical distribution for the errors (Student-t, for example) may be somewhat presumptuous, there is a need for more flexible methods that assume only symmetry of the errors (admitting unknown kurtosis). In this sense, the main advantage of this extended Bayesian approach is the possibility of considering generalizations of the elliptical family of models by using Dirichlet process priors in both dependent and independent situations. Conditional posterior distributions are implemented, allowing the use of Markov chain Monte Carlo (MCMC) to generate the posterior distributions. An interesting result is that the Dirichlet process prior is not updated in the case of the dependent elliptical model. Furthermore, an analysis of a real data set is reported to illustrate the usefulness of our approach in dealing with outliers. Finally, the proposed semiparametric models are compared graphically with the parametric normal model through the posterior densities of the coefficients. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Scale mixtures of the skew-normal (SMSN) distribution form a class of asymmetric thick-tailed distributions that includes the skew-normal (SN) distribution as a special case. The main advantage of this class of distributions is that its members are easy to simulate and have a nice hierarchical representation, facilitating easy implementation of the expectation-maximization algorithm for maximum-likelihood estimation. In this paper, we assume an SMSN distribution for the unobserved value of the covariates and a symmetric scale mixture of the normal distribution for the error term of the model. This provides a robust alternative for parameter estimation in multivariate measurement error models. Specific distributions examined include univariate and multivariate versions of the SN, skew-t, skew-slash and skew-contaminated normal distributions. The results and methods are applied to a real data set.
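The hierarchical representation that makes these distributions "easy to simulate" can be sketched generically in the univariate case (this is the standard convolution and scale-mixture construction, not the authors' code; parameter choices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def rskew_normal(alpha, size):
    """Simulate SN(0, 1, alpha) via the standard convolution representation:
    Z = delta*|U0| + sqrt(1 - delta^2)*U1, delta = alpha / sqrt(1 + alpha^2),
    with U0, U1 independent standard normals."""
    delta = alpha / np.sqrt(1.0 + alpha ** 2)
    u0 = np.abs(rng.normal(size=size))
    u1 = rng.normal(size=size)
    return delta * u0 + np.sqrt(1.0 - delta ** 2) * u1

def rskew_t(alpha, nu, size):
    """Skew-t as a scale mixture of skew-normals: X = Z / sqrt(W),
    W ~ Gamma(nu/2, rate=nu/2); tails get heavier as nu decreases."""
    z = rskew_normal(alpha, size)
    w = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=size)
    return z / np.sqrt(w)

z = rskew_normal(alpha=5.0, size=100_000)   # asymmetric, normal tails
t = rskew_t(alpha=5.0, nu=4.0, size=100_000)  # asymmetric, heavy tails
```

Swapping the mixing distribution of W yields the other members of the class (slash, contaminated normal), which is what makes the family convenient inside an EM loop.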
Abstract:
In general, the normal distribution is assumed for the surrogate of the true covariates in the classical error model. This paper considers a class of distributions, which includes the normal one, for the variables subject to error. An estimation approach yielding consistent estimators is developed and simulation studies are reported.
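As a toy illustration of why consistency is the issue in the classical error model, the sketch below shows the attenuation of the naive least-squares slope and a generic moment-based correction under an assumed known reliability ratio (this is not the estimation approach of the paper; all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy sketch (not the paper's estimator): under the classical error model
# W = x + U, the naive least-squares slope of Y on W is attenuated by the
# reliability ratio lambda = var(x) / (var(x) + var(U)); dividing the
# naive slope by a known lambda restores consistency.
n = 200_000
beta = 2.0
sigma_x, sigma_u = 1.0, 0.5
x = rng.normal(0.0, sigma_x, n)            # true (latent) covariate
w = x + rng.normal(0.0, sigma_u, n)        # error-prone surrogate
y = beta * x + rng.normal(0.0, 0.3, n)     # response

naive = np.polyfit(w, y, 1)[0]             # biased towards zero
lam = sigma_x ** 2 / (sigma_x ** 2 + sigma_u ** 2)   # reliability = 0.8
corrected = naive / lam                    # consistent for beta
```

Here the naive slope converges to beta * lambda = 1.6 rather than the true beta = 2.0, while the corrected estimator recovers beta.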
Abstract:
Influence diagnostic methods are extended in this article to the Grubbs model when the unknown quantity x (latent variable) follows a skew-normal distribution. Diagnostic measures are derived from the case-deletion approach and the local influence approach under several perturbation schemes. The observed information matrix for the postulated model and the Delta matrices for the corresponding perturbed models are derived. Results obtained for a real data set are reported, illustrating the usefulness of the proposed methodology.
Abstract:
In many epidemiological studies it is common to resort to regression models relating the incidence of a disease to its risk factors. The main goal of this paper is to consider inference on such models with error-prone observations and variances of the measurement errors changing across observations. We suppose that the observations follow a bivariate normal distribution and the measurement errors are normally distributed. Aggregate data allow the estimation of the error variances. Maximum likelihood estimates are computed numerically via the EM algorithm. Consistent estimation of the asymptotic variance of the maximum likelihood estimators is also discussed. Test statistics are proposed for testing hypotheses of interest. Further, we implement a simple graphical device that enables an assessment of the model's goodness of fit. Results of simulations concerning the properties of the test statistics are reported. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease. Copyright (C) 2008 John Wiley & Sons, Ltd.
A robust Bayesian approach to null intercept measurement error model with application to dental data
Abstract:
Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution. However, the normality assumption may not always be correct. The skew-normal/independent distributions are a class of asymmetric thick-tailed distributions that includes the skew-normal distribution as a special case. In this paper, we explore the use of the skew-normal/independent distribution as a robust alternative for the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal, skew-t, skew-slash and skew-contaminated normal distributions. The methods developed are illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Objectives: Studies of the viscoelastic properties of the vocal folds are normally performed with rheometers that use parallel plates whose interplate space is usually assigned a fixed value. In tissues subject to variation of thickness between samples, fixed gaps could result in different compressions, compromising the comparison among them. We performed an experimental study to determine whether different compressions can lead to different results in measurements of dynamic viscosity (DV) of vocal fold samples. Methods: We measured the DV of vocal fold samples of 10 larynges of cadavers under 3 different compression levels, corresponding to 0.2, 0.5, and 1.0 N, on an 8-mm-diameter parallel-plate rheometer. Results: The DV varied directly with compression. We observed statistically significant differences between the results at 0.2 and 1.0 N (p = 0.0396) and at 0.5 and 1.0 N (p = 0.0442). Conclusions: The study demonstrated that the level of compression influences the DV measure and suggests that a defined compression level should be used in rheometric studies of biological tissues.
Abstract:
Electroanalytical techniques are very promising for the quality control of crude vegetable raw material. Solid-state differential pulse voltammetry in the supporting electrolyte is able to detect the oxidation signals of the active material, which can be used as a parameter to identify the type of crude vegetable raw material and its antioxidant activity. The working electrode consisted of a carbon paste electrode modified with the powder of the vegetable raw material (EMF). The electrochemical measurements were performed in a cell containing the working (EMF), reference (Ag/AgCl, KCl(sat)) and auxiliary (Pt) electrodes.
Abstract:
This Thesis project is part of the all-round automation of the production of the concentrating solar PV/T system Absolicon X10. ABSOLICON Solar Concentrator AB has invented and started production of the prospective solar concentrating system Absolicon X10. The aims of this Thesis project are designing, assembling, calibrating and putting into operation the automatic measurement system intended to evaluate the shape of concentrating parabolic reflectors. On the basis of the requirements of the company administration and the needs of the real production process, the operation conditions for the Laser-testing rig were formulated, and the basic concept of using laser radiation was defined. In the first step, the complex design of the whole system was made and its division into parts was defined. After preliminary simulations, the function and operation conditions of all the parts were formulated. In the next steps, the detailed design of all the parts was carried out. Most components were ordered from the respective companies, and some of the mechanical components were made in the workshop of the company. All parts of the Laser-testing rig were assembled and tested. The software that controls the Laser-testing rig was created in LabVIEW; to tune and test it, a special simulator was designed and assembled. When all parts were assembled into the complete system, the Laser-testing rig was tested, calibrated and tuned. In the workshop of Absolicon AB, trial measurements were conducted and the Laser-testing rig was installed in the production line at the plant in Soleftea.
Abstract:
This Thesis project is part of research conducted in the solar industry. ABSOLICON Solar Concentrator AB has invented and started production of the prospective solar concentrating system Absolicon X10. The aims of this Thesis project are designing, assembling, calibrating and putting into operation the automatic measurement system intended to evaluate the distribution of the density of solar radiation in the focal line of concentrating parabolic reflectors, and to measure radiation from an artificial light source serving as a calibration-testing tool. On the basis of the requirements of the company's administration and the needs of designing the concentrating reflectors, the operation conditions for the Sun-Walker were formulated. In the first step, the complex design of the whole system was made and its division into parts was specified. After preliminary simulations, the functions and operation conditions of all the parts were formulated. In the next steps, the detailed design of all the parts was made. Most components were ordered from the respective companies, and some of the mechanical components were made in the workshop of the company. All parts of the Sun-Walker were assembled and tested. The software, which controls the Sun-Walker and conducts measurements of solar irradiation, was created in LabVIEW. To tune and test the software, a special simulator was designed and assembled. When all parts were assembled into the complete system, the Sun-Walker was tested, calibrated and tuned.
Abstract:
We generalize the standard linear-response (Kubo) theory to obtain the conductivity of a system that is subject to a quantum measurement of the current. Our approach can be used to specifically elucidate how back-action inherent to quantum measurements affects electronic transport. To illustrate the utility of our general formalism, we calculate the frequency-dependent conductivity of graphene and discuss the effect of measurement-induced decoherence on its value in the dc limit. We are able to resolve an ambiguity related to the parametric dependence of the minimal conductivity.
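As context for the generalization described above, the standard (measurement-free) Kubo formula for the linear-response conductivity can be written, in one common textbook convention (the notation here is an assumption of this note, not taken from the article), as

```latex
\sigma_{\alpha\beta}(\omega)
  = \frac{i}{\omega + i0^{+}}
    \left[ \frac{n e^{2}}{m}\,\delta_{\alpha\beta}
         - \frac{i}{\hbar V} \int_{0}^{\infty} \! dt\;
           e^{i(\omega + i0^{+})t}\,
           \big\langle [\hat{j}_{\alpha}(t), \hat{j}_{\beta}(0)] \big\rangle \right],
```

where the expectation value is taken in the unperturbed equilibrium state. The article's generalization modifies this expectation so that it is evaluated over the conditional dynamics of a continuously monitored current, introducing the measurement back-action that is absent from the standard expression.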