846 results for Semantic Publishing, Linked Data, Bibliometrics, Informetrics, Data Retrieval, Citations


Relevance: 50.00%

Abstract:

This work develops a new methodology for discriminating between models for interval-censored data, based on bootstrap residual simulation and on observing the deviance difference of one model relative to another, following Hinde (1992). This sort of data generally produces a large number of tied observations, in which case survival time can be regarded as discrete. The Cox proportional hazards model for grouped data (Prentice & Gloeckler, 1978) and the logistic model (Lawless, 1982) can therefore be fitted by means of generalized linear models. Whitehead (1989) treated censoring as an indicator variable with a binomial distribution and fitted the Cox proportional hazards model using the complementary log-log link function; a logistic model can likewise be fitted using the logit link. The proposed methodology is an alternative to the score tests developed by Colosimo et al. (2000), in which these models arise for discrete binary data as particular cases of the Aranda-Ordaz asymmetric family; those tests are therefore built on the link functions used to generate the fit. The motivating example is a dataset from an experiment on a flax cultivar planted on four substrata susceptible to the pathogen Fusarium oxysporum. The response variable, the time until blighting, was observed at intervals over 52 days. The results were compared with the model fits and the AIC values.
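As a rough illustration of the grouped-data fits described above (a sketch, not the authors' code), the fragment below fits the grouped-time proportional hazards model as a binomial GLM with a complementary log-log link, fits the discrete logistic model with a logit link, and compares deviances. The file name and column names (`flax_intervals.csv`, `event`, `interval`, `substrate`) are hypothetical.

```python
# Minimal sketch (assumed data layout): each row is one subject-interval,
# with `event` = 1 if blighting occurred in that interval, 0 otherwise,
# `interval` a categorical time index, and `substrate` the treatment.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("flax_intervals.csv")  # hypothetical file

# Grouped-time Cox PH model: binomial GLM with complementary log-log link.
cloglog = smf.glm(
    "event ~ C(interval) + C(substrate)", data=df,
    family=sm.families.Binomial(sm.families.links.CLogLog()),
).fit()

# Discrete logistic model: same linear predictor, logit link.
logit = smf.glm(
    "event ~ C(interval) + C(substrate)", data=df,
    family=sm.families.Binomial(sm.families.links.Logit()),
).fit()

# Deviance difference used to discriminate between the two links; its
# reference distribution would come from bootstrap residual simulation.
print(cloglog.deviance - logit.deviance, cloglog.aic, logit.aic)
```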

Relevance: 50.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 50.00%

Abstract:

The usual particle emission scenario used in hydrodynamics presupposes that particles instantaneously stop interacting (freeze-out) once they reach some three-dimensional surface. Another formalism has recently been developed where particle emission occurs continuously during the whole expansion of thermalized matter. Here we compare both mechanisms in a simplified hydrodynamical framework and show that they lead to a drastically different interpretation of data.

Relevance: 50.00%

Abstract:

We compare the results obtained with the continuous emission model to data from Pb-Pb collisions. We determine the initial conditions needed to reproduce the strange-particle ratios measured by experiment WA97 and, using these results, study the dependence of the inverse slope parameter T on particle mass. Some particle spectra are also shown.
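For illustration, the inverse slope parameter T is conventionally extracted by fitting an exponential in transverse mass, dN/(mT dmT) ∝ exp(-mT/T), to each particle spectrum. The sketch below does this with synthetic numbers, not WA97 data.

```python
# Standard inverse-slope extraction (illustrative, synthetic data):
# fit dN/(mT dmT) = A * exp(-mT / T) and read off T per particle species.
import numpy as np
from scipy.optimize import curve_fit

def mt_exponential(mt, A, T):
    return A * np.exp(-mt / T)

mt = np.linspace(1.2, 2.4, 12)                  # GeV, synthetic bin centres
yields = 500 * np.exp(-mt / 0.25) * np.random.normal(1.0, 0.05, mt.size)

(A, T), _ = curve_fit(mt_exponential, mt, yields, p0=(500.0, 0.2))
print(f"inverse slope T = {T:.3f} GeV")
```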

Relevance: 50.00%

Abstract:

It can be stated that technological evolution (the development of new measurement instruments such as software, satellites, and computers, as well as the falling cost of storage media) allows organizations to produce and acquire large amounts of data in a short time. Because of this data volume, research organizations become potentially vulnerable to the impacts of the information explosion. One solution adopted by some organizations is the use of information-system tools to support the documentation, retrieval, and analysis of data. In the scientific domain, these tools are developed to store different metadata standards (data about data). During the development of such tools, the adoption of standards such as the Unified Modeling Language (UML) stands out, whose diagrams support the modelling of different aspects of the software. The aim of this study is to present an information-system tool that supports the documentation of organizational data through metadata and to highlight the software-modelling process using UML. The Digital Geospatial Metadata Standard, widely used for data cataloguing by scientific organizations worldwide, is addressed, along with UML's dynamic and static diagrams, such as use-case, sequence, and class diagrams. The development of information-system tools can be a way to promote the organization and dissemination of scientific data. However, the modelling process requires special attention to the development of interfaces that will encourage use of the tools.
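As a purely illustrative sketch of the kind of catalogue record such a tool would store (the field names are assumptions that only loosely echo common geospatial metadata elements, not the standard's full schema):

```python
# Minimal sketch of a catalogue record for a scientific dataset; the fields
# are hypothetical and only loosely follow common geospatial metadata elements.
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    title: str
    abstract: str
    originator: str            # responsible organization
    publication_date: str      # ISO 8601, e.g. "2006-05-01"
    bounding_box: tuple        # (west, south, east, north) in degrees
    keywords: list = field(default_factory=list)

record = DatasetMetadata(
    title="Soil moisture survey",
    abstract="Example record used to illustrate metadata documentation.",
    originator="Example Research Organization",
    publication_date="2006-05-01",
    bounding_box=(-48.0, -23.0, -47.0, -22.0),
    keywords=["soil", "moisture"],
)
print(record.title, record.bounding_box)
```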

Relevance: 50.00%

Abstract:

Linear mixed effects models are frequently used to analyse longitudinal data, due to their flexibility in modelling the covariance structure between and within observations. Further, it is easy to deal with unbalanced data, either with respect to the number of observations per subject or per time period, and with varying time intervals between observations. In most applications of mixed models to biological sciences, a normal distribution is assumed both for the random effects and for the residuals. This, however, makes inferences vulnerable to the presence of outliers. Here, linear mixed models employing thick-tailed distributions for robust inferences in longitudinal data analysis are described. Specific distributions discussed include the Student-t, the slash and the contaminated normal. A Bayesian framework is adopted, and the Gibbs sampler and the Metropolis-Hastings algorithms are used to carry out the posterior analyses. An example with data on orthodontic distance growth in children is discussed to illustrate the methodology. Analyses based on either the Student-t distribution or on the usual Gaussian assumption are contrasted. The thick-tailed distributions provide an appealing robust alternative to the Gaussian process for modelling distributions of the random effects and of residuals in linear mixed models, and the MCMC implementation allows the computations to be performed in a flexible manner.
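Below is a minimal sketch of the scale-mixture Gibbs sampler underlying such robust fits, for a random-intercept model with Student-t subject effects. The variance components and degrees of freedom are held fixed for brevity; a full analysis would sample them too, and would need Metropolis-Hastings only for non-conjugate conditionals.

```python
# Gibbs sketch for y_ij = mu + u_i + e_ij with Student-t random intercepts,
# using the normal/gamma scale-mixture representation of the t distribution.
# sigma2, tau2 and the degrees of freedom nu are fixed to keep this short.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic longitudinal data: n subjects, m observations each.
n, m, mu_true = 20, 4, 10.0
u_true = rng.standard_t(df=4, size=n)           # heavy-tailed subject effects
y = mu_true + u_true[:, None] + rng.normal(0, 0.5, (n, m))

sigma2, tau2, nu = 0.25, 1.0, 4.0               # fixed for the sketch
mu, u = 0.0, np.zeros(n)

for it in range(2000):
    # lambda_i | u_i ~ Gamma(shape=(nu+1)/2, rate=(nu + u_i^2/tau2)/2)
    lam = rng.gamma((nu + 1) / 2, 2.0 / (nu + u**2 / tau2))
    # u_i | rest : normal with precision m/sigma2 + lam_i/tau2
    prec = m / sigma2 + lam / tau2
    mean = (y - mu).sum(axis=1) / sigma2 / prec
    u = rng.normal(mean, np.sqrt(1.0 / prec))
    # mu | rest : normal under a flat prior
    mu = rng.normal((y - u[:, None]).mean(), np.sqrt(sigma2 / (n * m)))

# A single late draw; a real analysis would store and summarize the chain.
print(f"draw of mu after 2000 iterations: {mu:.2f}")
```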

Relevance: 50.00%

Abstract:

In this work, initial crystallographic studies of human haemoglobin (Hb) crystallized in an isoionic, oxygen-free PEG solution are presented. Under these conditions, functional measurements of the O2-linked binding of water molecules and release of protons have shown that Hb assumes an unforeseen new allosteric conformation. Determination of the high-resolution structure of a crystal of human deoxy-Hb fully stripped of anions may provide a structural explanation for the role of anions in the allosteric properties of Hb and, in particular, for the influence of chloride on the Bohr effect, the mechanism by which Hb oxygen affinity is regulated by pH. X-ray diffraction data were collected to 1.87 Å resolution using a synchrotron-radiation source. The crystals belong to space group P2(1)2(1)2, and preliminary analysis revealed one tetramer in the asymmetric unit. The structure is currently being refined using maximum-likelihood protocols.

Relevance: 50.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 50.00%

Abstract:

Despite the abundant availability of protocols and applications for peer-to-peer file sharing, several drawbacks remain in the field. Among the most notable is the lack of a simple, interoperable way to share information among independent peer-to-peer networks. Another is that shared content can be accessed only by a limited number of compatible applications, making it inaccessible to other applications and systems. In this work we present a new approach to peer-to-peer data indexing, focused on the organization and retrieval of the metadata that describes the shared content. This approach results in a common, interoperable infrastructure that provides transparent access to data shared on multiple data-sharing networks via a simple API. The proposed approach is evaluated using a case study, implemented as a cross-platform extension to the Mozilla Firefox browser, and demonstrates the advantages of such interoperability over conventional distributed data access strategies.
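A hedged sketch of the kind of common indexing API the abstract describes follows; the class and method names are illustrative assumptions, not the paper's actual interface.

```python
# Illustrative metadata index for content shared across independent
# peer-to-peer networks; names and structure are hypothetical.
from dataclasses import dataclass

@dataclass
class SharedItem:
    content_hash: str                  # network-independent identifier
    networks: list                     # e.g. ["gnutella", "bittorrent"]
    metadata: dict                     # title, size, mime type, ...

class MetadataIndex:
    def __init__(self):
        self._by_hash = {}
        self._by_keyword = {}

    def publish(self, item: SharedItem):
        """Register an item so any client can find it via the common API."""
        self._by_hash[item.content_hash] = item
        for word in str(item.metadata.get("title", "")).lower().split():
            self._by_keyword.setdefault(word, set()).add(item.content_hash)

    def search(self, keyword: str):
        """Return items matching a keyword, regardless of source network."""
        hashes = self._by_keyword.get(keyword.lower(), set())
        return [self._by_hash[h] for h in hashes]

index = MetadataIndex()
index.publish(SharedItem("ab12", ["gnutella"], {"title": "Open Data Primer"}))
print([i.content_hash for i in index.search("data")])
```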

Relevance: 50.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 50.00%

Abstract:

Within the next decade, the improved version 2 of the Global Ozone Monitoring Experiment (GOME-2), an ultraviolet-visible spectrometer dedicated to the observation of key atmospheric trace species from space, will be launched successively on board three EUMETSAT Polar System (EPS) MetOp satellites. Starting with the launch of MetOp-1, scheduled for summer 2006, the GOME-2 series will extend until 2020 the global monitoring of atmospheric composition pioneered with ERS-2 GOME-1 since 1995 and enhanced with Envisat SCIAMACHY since 2002 and EOS-Aura OMI since 2004. For more than a decade, an international pool of scientific teams active in ground- and space-based ultraviolet-visible remote sensing has contributed to the successful post-launch validation of trace gas data products and the associated maturation of retrieval algorithms for the latter satellites, ensuring that geophysical data products are, or become, reliable and accurate enough for their intended research and applications. Building on this experience, this consortium now plans to develop and carry out appropriate validation of a list of GOME-2 trace gas column data of both tropospheric and stratospheric relevance: nitrogen dioxide (NO2), ozone (O3), bromine monoxide (BrO), chlorine dioxide (OClO), formaldehyde (HCHO), and sulphur dioxide (SO2). The proposed investigation will combine four complementary approaches, resulting in an end-to-end validation of the expected column data products.

Relevance: 50.00%

Abstract:

Until mid-2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, such as a large cloud-dependent offset occurring at northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before the operational switch-on of SGP 3.0 and the public release of upgraded SCIAMACHY NO2 data, we investigated the accuracy of the algorithm transfer (a) by checking the consistency of SGP 3.0 with the prototype algorithms, and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement over the previous processor, IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although their quantitative impact on SGP 3.0 vertical columns is not significant.
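As an illustration of the kind of comparison a delta-validation involves (synthetic numbers, not the study's data), coincident satellite and ground-based columns are differenced and summarized:

```python
# Illustrative satellite-vs-ground column comparison with synthetic data;
# real validations pair coincident, cloud-screened observations.
import numpy as np

rng = np.random.default_rng(1)
ground = rng.uniform(1e15, 8e15, 200)                  # NO2 columns, molec/cm^2
satellite = 0.95 * ground + rng.normal(0, 3e14, 200)   # synthetic bias + noise

diff = satellite - ground
rel_bias = np.median(diff / ground) * 100
print(f"median relative bias: {rel_bias:.1f}%  scatter: {diff.std():.2e}")
```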

Relevance: 50.00%

Abstract:

Despite the abundant availability of protocols and applications for peer-to-peer file sharing, several drawbacks remain in the field. Among the most notable is the lack of a simple, interoperable way to share information among independent peer-to-peer networks. Another is that shared content can be accessed only by a limited number of compatible applications, making it inaccessible to other applications and systems. In this work we present a new approach to peer-to-peer data indexing, focused on the organization and retrieval of the metadata that describes the shared content. This approach results in a common, interoperable infrastructure that provides transparent access to data shared on multiple data-sharing networks via a simple API. The proposed approach is evaluated using a case study, implemented as a cross-platform extension to the Mozilla Firefox browser, and demonstrates the advantages of such interoperability over conventional distributed data access strategies. © 2009 IEEE.

Relevance: 50.00%

Abstract:

The CMS Collaboration conducted a month-long data-taking exercise, the Cosmic Run At Four Tesla, during October and November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data. © 2010 IOP Publishing Ltd and SISSA.