923 results for data publishing
Abstract:
This work develops a new methodology for discriminating between models for interval-censored data, based on bootstrap residual simulation, by observing the deviance difference of one model relative to another, following Hinde (1992). Data of this sort generally produce a large number of tied observations, in which case survival time can be regarded as discrete. The Cox proportional hazards model for grouped data (Prentice & Gloeckler, 1978) and the logistic model (Lawless, 1982) can therefore be fitted by means of generalized linear models. Whitehead (1989) treated censoring as an indicator variable with a binomial distribution and fitted the Cox proportional hazards model using the complementary log-log link function; likewise, a logistic model can be fitted using the logit link function. The proposed methodology is an alternative to the score tests developed by Colosimo et al. (2000), in which these models for discrete binary data are obtained as particular cases of the Aranda-Ordaz asymmetric family of distributions; those tests are therefore built on the link functions used to generate the fit. The example motivating this study is a dataset from an experiment on a flax cultivar planted on four substrata susceptible to the pathogen Fusarium oxysporum. The response variable, the time until blighting, was observed at intervals over 52 days. The results were compared in terms of model fit and AIC values.
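Since both candidate models reduce to binomial GLMs that differ only in the link function, the fitting step admits a compact sketch. The Python code below, using statsmodels, fits the same invented interval-censored data with a complementary log-log link (the grouped-data proportional hazards model) and a logit link (the logistic model) and prints their deviances, the quantity the bootstrap procedure compares. The dataset, covariates, and effect sizes are made up and are not the flax experiment, and the CamelCase link classes assume a recent statsmodels release.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Invented grouped survival data: each (subject, interval) pair contributes
# a binary outcome (failed in that interval or not), mirroring the
# Whitehead (1989) GLM formulation described above.
rng = np.random.default_rng(42)
n_subj, n_int = 80, 6
substratum = rng.integers(0, 4, n_subj)        # four substrata, as in the study
rows = []
for i in range(n_subj):
    for j in range(n_int):
        # Illustrative discrete hazard, increasing with substratum and time
        p = 1.0 - np.exp(-np.exp(-3.0 + 0.5 * substratum[i] + 0.2 * j))
        fail = int(rng.uniform() < p)
        rows.append((j, substratum[i], fail))
        if fail:
            break

df = pd.DataFrame(rows, columns=["interval", "substratum", "fail"])
X = pd.get_dummies(df["interval"], prefix="int", dtype=float)  # interval effects
X["substratum"] = df["substratum"].astype(float)

# Same linear predictor, two candidate link functions
fit_cll = sm.GLM(df["fail"], X,
                 family=sm.families.Binomial(sm.families.links.CLogLog())).fit()
fit_log = sm.GLM(df["fail"], X,
                 family=sm.families.Binomial(sm.families.links.Logit())).fit()
# The deviance difference between these fits is what the bootstrap calibrates
print("deviance cloglog:", fit_cll.deviance, "  logit:", fit_log.deviance)
```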
Abstract:
The usual particle emission scenario used in hydrodynamics presupposes that particles instantaneously stop interacting (freeze-out) once they reach some three-dimensional surface. Another formalism has recently been developed where particle emission occurs continuously during the whole expansion of thermalized matter. Here we compare both mechanisms in a simplified hydrodynamical framework and show that they lead to a drastically different interpretation of data.
Abstract:
We compare the results obtained with the continuous emission model against data from Pb-Pb collisions. We determine the initial conditions necessary to reproduce the strange-particle ratios (experiment WA97) and, with these results, study the dependence of the inverse slope parameter T on particle mass. Some particle spectra are also shown.
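As a toy illustration of the inverse slope parameter, the Python sketch below extracts T from a thermal-like transverse-mass spectrum, dN/dm_T ∝ exp(-m_T/T), by a log-linear fit. All numbers are invented, and this simple fit stands in for, rather than reproduces, the model-based analysis of the paper.

```python
import numpy as np

# Invented thermal-like spectrum for one particle species (a kaon, say)
T_true = 0.230                          # inverse slope in GeV, illustrative
mass = 0.494                            # particle mass in GeV
mT = np.linspace(mass, mass + 1.5, 40)  # transverse-mass grid
rng = np.random.default_rng(3)
spectrum = np.exp(-mT / T_true) * rng.normal(1.0, 0.05, mT.size)

# dN/dmT ~ exp(-mT/T)  =>  log(spectrum) is linear in mT with slope -1/T
slope, _ = np.polyfit(mT, np.log(spectrum), 1)
print(f"fitted inverse slope T = {-1.0 / slope:.3f} GeV (true {T_true} GeV)")
```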
Abstract:
Linear mixed effects models are frequently used to analyse longitudinal data, due to their flexibility in modelling the covariance structure between and within observations. Further, it is easy to deal with unbalanced data, either with respect to the number of observations per subject or per time period, and with varying time intervals between observations. In most applications of mixed models to biological sciences, a normal distribution is assumed both for the random effects and for the residuals. This, however, makes inferences vulnerable to the presence of outliers. Here, linear mixed models employing thick-tailed distributions for robust inferences in longitudinal data analysis are described. Specific distributions discussed include the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted, and the Gibbs sampler and the Metropolis-Hastings algorithms are used to carry out the posterior analyses. An example with data on orthodontic distance growth in children is discussed to illustrate the methodology. Analyses based on either the Student-t distribution or on the usual Gaussian assumption are contrasted. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process for modelling distributions of the random effects and of residuals in linear mixed models, and the MCMC implementation allows the computations to be performed in a flexible manner.
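To make the MCMC machinery concrete, here is a minimal, self-contained Python sketch of a random-intercept model with Student-t residuals, fitted by random-walk Metropolis. The data are simulated stand-ins for the orthodontic example, and the priors, proposal scale, and fixed ν = 4 are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-in for the orthodontic data: 27 children, measured at
# ages 8, 10, 12, 14; values and effect sizes are illustrative only.
ages = np.tile([8.0, 10.0, 12.0, 14.0], 27)
subj = np.repeat(np.arange(27), 4)
b_true = rng.normal(0.0, 2.0, 27)
y = 17.0 + 0.66 * ages + b_true[subj] + rng.standard_t(df=4, size=ages.size)

def log_posterior(theta, nu=4.0):
    """Unnormalized log posterior: Student-t residuals for robustness,
    N(0, 2^2) random intercepts, flat priors on the fixed effects and
    on log(sigma)."""
    beta0, beta1, log_sigma = theta[0], theta[1], theta[2]
    b = theta[3:]
    sigma = np.exp(log_sigma)
    r = (y - beta0 - beta1 * ages - b[subj]) / sigma
    loglik = -0.5 * (nu + 1.0) * np.sum(np.log1p(r * r / nu)) - y.size * log_sigma
    logprior = -0.5 * np.sum(b * b) / 4.0
    return loglik + logprior

# Random-walk Metropolis over the full parameter vector, a simple stand-in
# for the Gibbs / Metropolis-Hastings scheme described in the abstract.
theta = np.concatenate([[17.0, 0.5, 0.0], np.zeros(27)])
lp = log_posterior(theta)
keep = []
for it in range(30000):
    prop = theta + 0.02 * rng.standard_normal(theta.size)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if it >= 15000 and it % 10 == 0:          # thin after burn-in
        keep.append([theta[0], theta[1], np.exp(theta[2])])

post = np.array(keep)
print("posterior means (intercept, slope, sigma):", post.mean(axis=0))
```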
Abstract:
The CMS Collaboration conducted a month-long data-taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.
Abstract:
The CMS Hadron Calorimeter in the barrel, endcap and forward regions is fully commissioned. Cosmic ray data were taken with and without magnetic field at the surface hall and, after installation in the experimental hall, a hundred meters underground. Various measurements were also performed during the few days of beam in the LHC in September 2008. Calibration parameters were extracted, and the energy response of the HCAL determined from test beam data was checked.
Abstract:
This paper discusses the design and performance of the time measurement technique and of the synchronization systems of the CMS hadron calorimeter. Time measurement performance results are presented from test beam data taken in the years 2004 and 2006. For hadronic showers of energy greater than 100 GeV, the timing resolution is measured to be about 1.2 ns. Time synchronization and out-of-time background rejection results are presented from the Cosmic Run At Four Tesla and LHC beam runs taken in the Autumn of 2008. The inter-channel synchronization is measured to be within 2 ns.
Abstract:
Through this workshop, database experts from the various ministries and the Central Statistical Office (CSO) were introduced to the CREATE and PROCESS modules of the REDATAM software, which can be used for database creation and data analysis. This workshop was the second in a series of workshops aimed at promoting human-resource development and capacity-building at the national and regional levels in the use of the REDATAM software. It also served as a qualifier for a follow-up workshop, to be held in 2010, on the use of the web-publishing application of the software.
Abstract:
The strategic management of information plays a fundamental role in organizational management, since decision making is driven by the need to survive in a highly competitive market. Companies are constantly concerned with information transparency and good corporate governance (CG) practices, which in turn shape relations between the controlling interests of a company and its investors. In this context, this article examines the relationship between the disclosure of information by joint-stock companies using XBRL and the open data model adopted by the Brazilian government, a model spurred by the publication of the Information Access Law (Lei de Acesso à Informação), No. 12,527 of 18 November 2011. Access to information should be supported by a mediation policy in order to underpin investors' knowledge construction and decision making. XBRL is the main model for publishing financial information. Using XBRL together with the new semantic standards created for Linked Data strengthens the dissemination of information, and also creates mechanisms for analysing and cross-referencing data against the various open databases available on the Internet, adding value to the data and information accessed by civil society.
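To illustrate the Linked Data idea concretely, the Python sketch below expresses a single hypothetical XBRL fact as RDF triples with rdflib, so it could be cross-referenced with other open datasets. The namespaces, company identifier, and values are all invented, since real XBRL-to-RDF mappings depend on the taxonomy in use.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Hypothetical namespaces; actual XBRL-to-RDF vocabularies vary by taxonomy.
EX = Namespace("http://example.org/xbrl/")
CO = Namespace("http://example.org/company/")

g = Graph()
g.bind("ex", EX)

# One XBRL fact (say, net income reported by a listed company) expressed
# as Linked Data triples; every identifier and value here is illustrative.
fact = URIRef("http://example.org/fact/netincome-2011")
g.add((fact, RDF.type, EX.Fact))
g.add((fact, EX.concept, EX.NetIncome))
g.add((fact, EX.entity, CO["BR-0001"]))
g.add((fact, EX.period, Literal("2011", datatype=XSD.gYear)))
g.add((fact, EX.value, Literal("1250000.00", datatype=XSD.decimal)))
g.add((fact, EX.unit, Literal("BRL")))

print(g.serialize(format="turtle"))
```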
Abstract:
We present an implementation of the F-statistic to carry out the first search in data from the Virgo laser interferometric gravitational wave detector for periodic gravitational waves from a priori unknown, isolated rotating neutron stars. We searched a frequency f₀ range from 100 Hz to 1 kHz and the frequency-dependent spindown f₁ range from -1.6(f₀/100 Hz) × 10⁻⁹ Hz s⁻¹ to zero. A large part of this frequency-spindown space was unexplored by any of the all-sky searches published so far. Our method consisted of a coherent search over two-day periods using the F-statistic, followed by a search for coincidences among the candidates from the two-day segments. We have introduced a number of novel techniques and algorithms that allow the use of the fast Fourier transform (FFT) algorithm in the coherent part of the search, resulting in a fifty-fold speed-up in the computation of the F-statistic with respect to the algorithm used in the other pipelines. No significant gravitational wave signal was found. The sensitivity of the search was estimated by injecting signals into the data. In the most sensitive parts of the detector band, more than 90% of signals with dimensionless gravitational-wave amplitude greater than 5 × 10⁻²⁴ would have been detected.
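The following toy Python sketch shows the core idea behind the FFT-based coherent stage: the FFT of a data segment concentrates a weak periodic signal's power into a single frequency bin. The sampling rate, segment length, and signal parameters are made up, and this is not the Virgo F-statistic pipeline itself.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 4096.0                             # sampling rate in Hz (illustrative)
t = np.arange(0.0, 128.0, 1.0 / fs)     # one short coherent segment
f0 = 432.1                              # "unknown" source frequency, made up
x = 0.05 * np.sin(2.0 * np.pi * f0 * t) + rng.standard_normal(t.size)

# Coherent stage: the normalized power spectrum concentrates the signal's
# power into one bin, whose location estimates f0.
power = np.abs(np.fft.rfft(x)) ** 2 / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
k = 1 + int(np.argmax(power[1:]))       # skip the DC bin
print(f"recovered frequency: {freqs[k]:.3f} Hz (injected {f0} Hz)")
```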