987 results for Experiment data
Abstract:
We show that a hadron gas model with continuous particle emission instead of freeze-out may solve some of the problems (high values of the freeze-out density and specific net charge) that one encounters in the latter case when studying strange particle ratios such as those from the WA85 experiment. This underlines the need for a better understanding of particle emission in hydrodynamics in order to analyze data. It also reopens the possibility of a quark-hadron transition occurring with phase equilibrium instead of explosively.
Abstract:
We show that the accumulated CERN LEP-II data taken at √s = 130-206 GeV can establish more restrictive bounds on doubly charged bilepton couplings and masses than any other experiment so far. We also analyze the discovery potential of a prospective linear collider operating in both e+e- and eγ modes.
Abstract:
Within the next decade, the improved version 2 of the Global Ozone Monitoring Experiment (GOME-2), an ultraviolet-visible spectrometer dedicated to the observation of key atmospheric trace species from space, will be launched successively on board three EUMETSAT Polar System (EPS) MetOp satellites. Starting with the launch of MetOp-1 scheduled for summer 2006, the GOME-2 series will extend until 2020 the global monitoring of atmospheric composition pioneered with ERS-2 GOME-1 since 1995 and enhanced with Envisat SCIAMACHY since 2002 and EOS-Aura OMI since 2004. For more than a decade, an international pool of scientific teams active in ground- and space-based ultraviolet-visible remote sensing has contributed to the successful post-launch validation of trace gas data products and the associated maturation of retrieval algorithms for the latter satellites, ensuring that geophysical data products are, or become, reliable and accurate enough for their intended research and applications. Building on this experience, this consortium now plans to develop and carry out appropriate validation of a list of GOME-2 trace gas column data of both tropospheric and stratospheric relevance: nitrogen dioxide (NO2), ozone (O3), bromine monoxide (BrO), chlorine dioxide (OClO), formaldehyde (HCHO), and sulphur dioxide (SO2). The proposed investigation will combine four complementary approaches resulting in an end-to-end validation of the expected column data products.
Abstract:
The CMS Collaboration conducted a month-long data-taking exercise known as the Cosmic Run At Four Tesla in late 2008 in order to complete the commissioning of the experiment for extended operation. The operational lessons resulting from this exercise were addressed in the subsequent shutdown to better prepare CMS for LHC beams in 2009. The cosmic data collected have been invaluable to study the performance of the detectors, to commission the alignment and calibration techniques, and to make several cosmic ray measurements. The experimental setup, conditions, and principal achievements from this data-taking exercise are described along with a review of the preceding integration activities. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
Results are presented from a search for the rare decays Bs^0 → μ+μ− and B^0 → μ+μ− in pp collisions at √s = 7 and 8 TeV, with data samples corresponding to integrated luminosities of 5 and 20 fb^−1, respectively, collected by the CMS experiment at the LHC. An unbinned maximum-likelihood fit to the dimuon invariant mass distribution gives a branching fraction B(Bs^0 → μ+μ−) = (3.0 +1.0/−0.9) × 10^−9, where the uncertainty includes both statistical and systematic contributions. An excess of Bs^0 → μ+μ− events with respect to background is observed with a significance of 4.3 standard deviations. For the decay B^0 → μ+μ− an upper limit of B(B^0 → μ+μ−) < 1.1 × 10^−9 at the 95% confidence level is determined. Both results are in agreement with the expectations from the standard model. © 2013 CERN. Published by the American Physical Society.
Abstract:
Caribbean census microdata are not easily accessible to researchers. Although there are well-established and commonly used technical, administrative and legal procedures for disseminating anonymized census microdata to researchers, they have not been widely used in the Caribbean. The small size of Caribbean countries makes anonymization relatively more difficult, and standard methods are not always directly applicable. This study reviews commonly used methods of disseminating census microdata and considers their applicability to the Caribbean. It demonstrates the application of statistical disclosure control methods using the census datasets of Grenada and Trinidad and Tobago, and considers various possible designs of microdata release files in terms of disclosure risk and utility to researchers. It then considers how various forms of microdata dissemination (public use files, licensed use files, remote data access and secure data laboratories) could be used to disseminate census microdata. It concludes that there is scope for a substantial expansion of access to Caribbean census microdata and that, through collaboration with international organisations and data archives, this can be achieved with relatively little burden on statistical offices.
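As a minimal sketch of one disclosure-risk check from the family of statistical disclosure control methods the study refers to (not code from the study itself), the snippet below flags records that are unique on a chosen set of quasi-identifiers; the file and column names in the usage note are hypothetical.

    # Minimal sketch (not from the study): a k-anonymity-style disclosure-risk check
    # that flags records which are unique on a set of quasi-identifiers.
    import pandas as pd

    def sample_uniques(df: pd.DataFrame, quasi_identifiers: list) -> pd.DataFrame:
        """Return the records whose quasi-identifier combination occurs only once."""
        sizes = df.groupby(quasi_identifiers)[quasi_identifiers[0]].transform("size")
        return df[sizes == 1]

    # Usage sketch (hypothetical file and column names):
    # census = pd.read_csv("census_microdata.csv")
    # risky = sample_uniques(census, ["parish", "age_group", "occupation"])
    # print(len(risky), "sample-unique records out of", len(census))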
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Gravitational waves from a variety of sources are predicted to superpose to create a stochastic background. This background is expected to contain unique information from throughout the history of the Universe that is unavailable through standard electromagnetic observations, making its study of fundamental importance to understanding the evolution of the Universe. We carry out a search for the stochastic background with the latest data from the LIGO and Virgo detectors. Consistent with predictions from most stochastic gravitational-wave background models, the data display no evidence of a stochastic gravitational-wave signal. Assuming a gravitational-wave spectrum of Ω_GW(f) = Ω_α (f/f_ref)^α, we place 95% confidence level upper limits on the energy density of the background in each of four frequency bands spanning 41.5-1726 Hz. In the frequency band of 41.5-169.25 Hz, for a spectral index of α = 0, we constrain the energy density of the stochastic background to be Ω_GW(f) < 5.6 × 10^−6. For the 600-1000 Hz band, Ω_GW(f) < 0.14 (f/900 Hz)^3, a factor of 2.5 lower than the best previously reported upper limits. We find Ω_GW(f) < 1.8 × 10^−4 using a spectral index of zero for 170-600 Hz and Ω_GW(f) < 1.0 (f/1300 Hz)^3 for 1000-1726 Hz, bands in which no previous direct limits have been placed. The limits in these four bands are the lowest direct measurements to date on the stochastic background. We discuss the implications of these results in light of the recent claim by the BICEP2 experiment of possible evidence for inflationary gravitational waves.
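As an illustration of how to read the frequency-dependent limit quoted for the 600-1000 Hz band (spectral index α = 3, reference frequency 900 Hz), the bound can be evaluated at any frequency inside the band; the 700 Hz value below is our own arithmetic, not a number from the paper.

    \[
    \Omega_{\mathrm{GW}}(f) < 0.14\left(\frac{f}{900\ \mathrm{Hz}}\right)^{3}
    \;\Rightarrow\;
    \Omega_{\mathrm{GW}}(700\ \mathrm{Hz}) < 0.14\left(\frac{700}{900}\right)^{3} \approx 0.066 .
    \]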
Abstract:
This report differs from previous reports in two respects: it covers experimental work up to January 1, 1935, and it includes brief abstracts of publications since the last report. Previously most of the report dealt with work done before the end of the fiscal year; that is, work done between June 30 and January 1 was not reported until over a year later, for the most part. The present report corrects that defect, and in addition the abstracts of publications will make the report useful as a reference guide to published matter. The projects are discussed under subject headings and in addition to the abstracts, brief reports of progress in projects under way are included. Complete data for these projects are not included; rather an attempt has been made to show how far the work has gone and to indicate some of the directions or trends of the work. The drouth of the past summer reduced yields severely. As a result the collection of significant data on yields was almost impossible. A few of the Experiment Station workers have been loaned to federal projects. Despite these handicaps many projects have been advanced and many have been completed.
Abstract:
Each plasma physics laboratory has a proprietary control and data acquisition system, and it usually differs from one laboratory to another. This means that each laboratory has its own way of controlling the experiment and retrieving data from the database. Fusion research relies to a great extent on international collaboration, and such private systems make it difficult to follow the work remotely. The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The choice of MDSplus (Model Driven System plus) is justified by the fact that it is widely used, so scientists from different institutions may use the same system in different experiments on different tokamaks without needing to know how each machine handles its data acquisition and analysis. Another important point is that MDSplus has a library system that allows communication between different programming languages (Java, Fortran, C, C++, Python) and programs such as MATLAB, IDL and OCTAVE. In the case of the TCABR tokamak, interfaces (the subject of this paper) were developed between the system already in use and MDSplus, rather than adopting MDSplus at all stages, from control and data acquisition to data analysis. This was done to preserve a complex system already in operation, which would otherwise take a long time to migrate. This implementation also allows new components to be added that use MDSplus fully at all stages. (c) 2012 Elsevier B.V. All rights reserved.
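To make the remote-participation idea concrete, the sketch below shows remote signal retrieval through the MDSplus thin-client Python binding, one of the language interfaces the abstract mentions. The server address, tree name, shot number and node path are hypothetical placeholders, not actual TCABR values.

    # Sketch: remote signal retrieval via the MDSplus thin-client Python API.
    # Server address, tree name, shot number and node path are hypothetical.
    from MDSplus import Connection

    conn = Connection("mdsplus.example.org")   # hypothetical data server
    conn.openTree("tcabr", 12345)              # hypothetical tree name and shot number

    # Retrieve a stored signal and its time base via TDI expressions.
    signal = conn.get(r"\PLASMA_CURRENT").data()          # hypothetical node path
    time = conn.get(r"dim_of(\PLASMA_CURRENT)").data()

    print(signal.shape, time.shape)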
Abstract:
We present measurements of Underlying Event observables in pp collisions at √s = 0.9 and 7 TeV. The analysis is performed as a function of the highest charged-particle transverse momentum in the event, p_T,LT. Different regions are defined with respect to the azimuthal direction of the leading (highest transverse momentum) track: Toward, Transverse and Away. The Toward and Away regions collect the fragmentation products of the hardest partonic interaction. The Transverse region is expected to be most sensitive to the Underlying Event activity. The study is performed with charged particles above three different p_T thresholds: 0.15, 0.5 and 1.0 GeV/c. In the Transverse region we observe an increase in the multiplicity by a factor of 2-3 between the lower and higher collision energies, depending on the track p_T threshold considered. Data are compared to PYTHIA 6.4, PYTHIA 8.1 and PHOJET. On average, all models considered underestimate the multiplicity and summed p_T in the Transverse region by about 10-30%.
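A minimal sketch of the region assignment described above: tracks are classified by their azimuthal distance from the leading track. The 60 and 120 degree boundaries are the conventional Underlying Event choice and are an assumption here, since the abstract does not quote the numerical cuts.

    # Sketch (not the ALICE code): sorting charged tracks into the Toward,
    # Transverse and Away regions by azimuthal distance from the leading track.
    # The 60/120 degree boundaries are an assumption (conventional UE choice).
    import math

    def ue_region(phi_track: float, phi_leading: float) -> str:
        dphi = abs(math.remainder(phi_track - phi_leading, 2 * math.pi))  # folded into [0, pi]
        if dphi < math.pi / 3:          # |dphi| <  60 deg
            return "Toward"
        elif dphi < 2 * math.pi / 3:    # 60 deg <= |dphi| < 120 deg
            return "Transverse"
        return "Away"                   # |dphi| >= 120 deg

    # Usage sketch: count tracks above a pT threshold (e.g. 0.5 GeV/c) per region.
    # regions = [ue_region(t.phi, leading.phi) for t in tracks if t.pt > 0.5]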
Abstract:
Period-adding cascades have been observed experimentally and numerically in the dynamics of neurons and pancreatic cells, lasers, electric circuits, chemical reactions, oceanic internal waves, and also in air bubbling. We show that the period-adding cascades appearing in bubbling from a nozzle submerged in a viscous liquid can be reproduced by a simple model, based on some hydrodynamical principles, dealing with the time evolution of two variables, bubble position and air-chamber pressure, through a system of differential equations with a detachment rule based on force balance. The model further reduces to an iterated one-dimensional map giving the pressures at the detachments, where the time between bubbles comes out as an observable of the dynamics. The model not only shows good agreement with experimental data, but is also able to predict the influence of the main parameters involved, such as the length of the hose connecting the air supplier to the needle, the needle radius and the needle length. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.3695345]
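The following is a structural sketch only of the grow-detach-reset loop the abstract describes (two coupled ODEs plus an event-based detachment rule, yielding the sequence of pressures at detachment and the times between bubbles). The right-hand sides and the detachment threshold are placeholders chosen so the loop runs; they are not the paper's equations.

    # Structural sketch: integrate placeholder ODEs for (bubble position x,
    # chamber pressure p) until a detachment event fires, record the pressure
    # at detachment, reset, and repeat. PLACEHOLDER dynamics, not the model.
    from scipy.integrate import solve_ivp

    def rhs(t, y):
        x, p = y                        # bubble position, air-chamber pressure
        return [p,                      # placeholder growth law
                1.0 - x - 0.5 * p]      # placeholder chamber-pressure balance

    def detachment(t, y):               # placeholder force-balance rule
        x, p = y
        return x - 1.0                  # "detach" when the bubble reaches x = 1
    detachment.terminal = True
    detachment.direction = 1

    state, t0 = [0.0, 0.8], 0.0
    pressures, times = [], []
    for _ in range(50):                 # 50 consecutive bubbles
        sol = solve_ivp(rhs, (t0, t0 + 100.0), state, events=detachment)
        t_det = sol.t_events[0][0]
        x_det, p_det = sol.y_events[0][0]
        pressures.append(p_det)         # one point of the 1-D map of detachment pressures
        times.append(t_det - t0)        # time between bubbles, the observable
        t0, state = t_det, [0.0, p_det] # a new bubble starts at the nozzle

    print(list(zip(times[:5], pressures[:5])))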
Abstract:
In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes which are strongly correlated with the groups of individuals. Even though methods for analysing such data are today well developed and close to reaching a standard organization (through the effort of proposed international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method is available to answer it. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps of an experiment, from the production of the array, through quality control, to the preprocessing steps used in the data analysis in the rest of the dissertation. Chapter 2 provides a critical review of standard analysis methods, stressing their main problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for each of the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists in a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC); a list of the differentially expressed genes is generated for each SAM application (a sketch of this resampling loop is given after this abstract). After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence as differentially expressed in the 1,000 lists. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two simulated data sets generated from beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score > 300 and separates the data into two clusters by hierarchical clustering.
We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method to address the evaluation of similarities in a three-class problem by means of the Relevance Vector Machine [4]. In fact, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only thing that can play a crucial role: in some cases similarities can give useful, and sometimes even more important, information. Given three classes, the goal could be to establish, with a certain level of confidence, whether the third one is more similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [2] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, then evaluate the third class, G2, as a test set to obtain, for each G2 sample, the probability of belonging to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been guessed at in the literature, but no measure of significance had been given before.
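The sketch below mirrors the MultiSAM resampling-and-scoring loop as described in the abstract, with one explicit substitution: a per-probe Welch t-test with a fixed p-value cut stands in for the SAM statistic, which is not reimplemented here. Matrix shapes and the 0.01 threshold are illustrative assumptions.

    # Sketch of the MultiSAM loop (Welch t-test standing in for the SAM statistic).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def multisam_scores(lpc, mpc, n_iter=1000, alpha=0.01):
        """lpc: (n_probes, n_lpc) expression matrix for the less populated class.
           mpc: (n_probes, n_mpc) matrix for the more populated class, n_mpc > n_lpc.
           Returns, per probe, how often (0..n_iter) it was called differentially
           expressed against a random same-size subsample of the MPC."""
        n_probes, n_lpc = lpc.shape
        scores = np.zeros(n_probes, dtype=int)
        for _ in range(n_iter):
            cols = rng.choice(mpc.shape[1], size=n_lpc, replace=False)
            sub = mpc[:, cols]
            # stand-in for SAM: per-probe Welch t-test between LPC and the subsample
            _, pvals = stats.ttest_ind(lpc, sub, axis=1, equal_var=False)
            scores += (pvals < alpha).astype(int)
        return scores

    # Usage sketch: keep probes with score > 300, as in the abstract's CLL example.
    # scores = multisam_scores(lpc_matrix, mpc_matrix)
    # selected = np.where(scores > 300)[0]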