Abstract:
The STAR Collaboration at the Relativistic Heavy Ion Collider presents measurements of J/psi -> e(+) e(-) at midrapidity and high transverse momentum (p(T) > 5 GeV/c) in p + p and central Cu + Cu collisions at root s(NN) = 200 GeV. The inclusive J/psi production cross section for Cu + Cu collisions is found to be consistent at high p(T) with the binary-collision-scaled cross section for p + p collisions. This is in contrast, at a 97% confidence level, to the suppression of J/psi production observed at lower p(T). Azimuthal correlations of J/psi with charged hadrons in p + p collisions provide an estimate of the contribution of B-hadron decays to J/psi production of 13% +/- 5%.
Abstract:
Aims. Given that in most cases only thermal pressure is taken into account in the hydrostatic equilibrium equation used to estimate galaxy cluster masses, the main purpose of this paper is to consider the contribution of three non-thermal components to total mass measurements. The non-thermal pressure is composed of cosmic ray, turbulent and magnetic pressures. Methods. To estimate the thermal pressure we used public XMM-Newton archival data of five Abell clusters to derive temperature and density profiles. To describe the magnetic pressure, we assume a radial distribution for the magnetic field, B(r) proportional to rho(g)^alpha. To preserve generality we let alpha range from 0.5 to 0.9, as indicated by observations and numerical simulations. Turbulent motions and bulk velocities add a turbulent pressure, which is considered using an estimate from numerical simulations. For this component, we assume an isotropic pressure, P(turb) = (1/3) rho(g) (sigma(r)^2 + sigma(t)^2). We also consider the contribution of cosmic ray pressure, P(cr) proportional to r^(-0.5). Thus, besides the gas (thermal) pressure, we include these three non-thermal components in the magnetohydrostatic equilibrium equation and compare the total mass estimates with the values obtained without them. Results. A consistent description of the non-thermal components could yield a variation in mass estimates ranging from 10% to ~30%. We verified that in the inner parts of cool core clusters the cosmic ray component is comparable to the magnetic pressure, while in non-cool core clusters the cosmic ray component is dominant. For cool core clusters the magnetic pressure is the dominant component, contributing more than 50% of the total mass variation due to non-thermal pressure components. However, for non-cool core clusters, the major influence comes from the cosmic ray pressure, which accounts for more than 80% of the total mass variation due to non-thermal pressure effects.
For our sample, the maximum contribution of the turbulent component to the total mass variation can reach almost 20%. Although all of the assumptions agree with previous works, it is important to note that our results rely on the specific parametrization adopted here. We show that this analysis can be regarded as a starting point for a more detailed and refined exploration of the influence of non-thermal pressure in the intra-cluster medium (ICM).
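The mass comparison described in this abstract rests on the magnetohydrostatic equilibrium equation with the total (thermal plus non-thermal) pressure; a sketch of the standard form, using the parametrizations quoted above (the exact normalizations are the paper's):

```latex
% Total pressure: thermal plus the three non-thermal terms
P_{\mathrm{tot}}(r) = P_{\mathrm{th}}(r) + P_{B}(r) + P_{\mathrm{turb}}(r) + P_{\mathrm{cr}}(r)

% Enclosed mass from (magneto)hydrostatic equilibrium
M(<r) = -\frac{r^{2}}{G\,\rho_{g}(r)}\,\frac{dP_{\mathrm{tot}}}{dr},
\qquad
P_{\mathrm{turb}} = \tfrac{1}{3}\,\rho_{g}\left(\sigma_{r}^{2} + \sigma_{t}^{2}\right),
\qquad
B(r) \propto \rho_{g}^{\alpha},
\qquad
P_{\mathrm{cr}} \propto r^{-1/2}
```

Dropping the three non-thermal terms recovers the purely thermal mass estimate that the paper uses as its baseline.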
Abstract:
In this paper, we estimate the losses during teleportation processes requiring either two high-Q cavities or a single bimodal cavity. The estimates were carried out using the phenomenological operator approach introduced by de Almeida et al. [Phys. Rev. A 62, 033815 (2000)].
Abstract:
Bounds on the exchange-correlation energy of many-electron systems are derived and tested. By using universal scaling properties of the electron-electron interaction, we obtain the exponent of the bounds in three, two, one, and quasi-one dimensions. From the properties of the electron gas in the dilute regime, the tightest estimate to date is given for the numerical prefactor of the bound, which is crucial in practical applications. Numerical tests on various low-dimensional systems are in line with the bounds obtained and give evidence of an interesting dimensional crossover between two and one dimensions.
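In three dimensions the bound in question has the familiar Lieb-Oxford form; assuming the dimensional scaling implied by the abstract (a density exponent of 1 + 1/D in D dimensions), the bounds can be written as:

```latex
% Lieb-Oxford-type bound in D dimensions; C_D is the numerical
% prefactor whose tightest estimate the abstract reports
E_{\mathrm{xc}}[n] \;\geq\; -C_{D} \int n(\mathbf{r})^{\,1 + 1/D}\, d^{D}r
% D = 3: exponent 4/3;  D = 2: exponent 3/2;  D = 1: exponent 2
```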
Abstract:
In this article, we evaluate the use of simple Lee-Goldburg cross-polarization (LG-CP) NMR experiments for obtaining quantitative information on molecular motion in the intermediate regime. In particular, we introduce the measurement of Hartmann-Hahn matching profiles for the assessment of heteronuclear dipolar couplings, as well as dynamics, as a reliable and robust alternative to the more common analysis of build-up curves. We have carried out spin dynamics simulations in order to test the method's sensitivity to intermediate motion and to address its limitations concerning possible experimental imperfections. We further demonstrate the successful use of simple theoretical concepts, most prominently Anderson-Weiss (AW) theory, to analyze the data. We also propose an alternative way to estimate activation energies of molecular motions, based upon the acquisition of only two LG-CP spectra per temperature at different temperatures. As experimental tests, molecular jumps in imidazole methyl sulfonate, trimethylsulfoxonium iodide, and bisphenol A polycarbonate were investigated with the new method.
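The activation-energy estimate described above ultimately reduces to Arrhenius arithmetic on the extracted jump rates; a minimal sketch of that step, assuming rate constants have already been obtained from the LG-CP spectra (the mapping from spectra to rates is the paper's method and is not shown here):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius_activation_energy(k1, T1, k2, T2):
    """Activation energy from jump rates k1, k2 (s^-1) measured at
    temperatures T1, T2 (K), assuming k = A * exp(-Ea / (R * T)):
    Ea = R * ln(k2 / k1) / (1/T1 - 1/T2)."""
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)
```

Two rates at two temperatures fix both A and Ea, which is why only two spectra per temperature suffice in principle.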
Abstract:
Thanks to recent advances in molecular biology, allied to an ever-increasing amount of experimental data, the functional state of thousands of genes can now be extracted simultaneously by using methods such as cDNA microarrays and RNA-Seq. Particularly important related investigations are the modeling and identification of gene regulatory networks from expression data sets. Such knowledge is fundamental for many applications, such as disease treatment, therapeutic intervention strategies and drug design, as well as for planning new high-throughput experiments. Methods have been developed for gene network modeling and identification from expression profiles. However, an important open problem regards how to validate such approaches and their results. This work presents an objective approach for validation of gene network modeling and identification which comprises the following three main aspects: (1) Artificial Gene Networks (AGNs) model generation through theoretical models of complex networks, which is used to simulate temporal expression data; (2) a computational method for gene network identification from the simulated data, which is founded on a feature selection approach where a target gene is fixed and the expression profile is observed for all other genes in order to identify a relevant subset of predictors; and (3) validation of the identified AGN-based network through comparison with the original network. The proposed framework allows several types of AGNs to be generated and used in order to simulate temporal expression data. The results of the network identification method can then be compared to the original network in order to estimate its properties and accuracy. Some of the most important theoretical models of complex networks have been assessed: the uniformly-random Erdos-Renyi (ER), the small-world Watts-Strogatz (WS), the scale-free Barabasi-Albert (BA), and geographical networks (GG).
The experimental results indicate that the inference method was sensitive to variation in the average degree k, its network recovery rate decreasing as k increased. The signal size was important for the inference method to achieve better accuracy in the network identification rate, presenting very good results even with small expression profiles. However, the adopted inference method was not able to recognize distinct structures of interaction among genes, behaving similarly when applied to different network topologies. In summary, the proposed framework, though simple, was adequate for the validation of the inferred networks by identifying some properties of the evaluated method, and it can be extended to other inference methods.
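The validation step (3) above amounts to comparing the inferred edge set against the original AGN's; a minimal pure-Python sketch, assuming "network recovery rate" means the fraction of original edges re-identified (function names are illustrative, not from the paper):

```python
import random

def erdos_renyi_edges(n, p, seed=0):
    """Edge set of a uniformly random (Erdos-Renyi) undirected graph G(n, p)."""
    rng = random.Random(seed)
    return {frozenset((i, j))
            for i in range(n) for j in range(i + 1, n)
            if rng.random() < p}

def recovery_rate(original_edges, inferred_edges):
    """Fraction of the original network's edges recovered by the inference method."""
    if not original_edges:
        return 0.0
    return len(original_edges & inferred_edges) / len(original_edges)
```

The same comparison applies unchanged to WS, BA or GG edge sets, which is what lets the framework contrast topologies.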
Abstract:
Background: The inference of gene regulatory networks (GRNs) from large-scale expression profiles is nowadays one of the most challenging problems of Systems Biology. Many techniques and models have been proposed for this task. However, it is not generally possible to recover the original topology with great accuracy, mainly due to the short time series data, given the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRN inference methods based on entropy (mutual information), a new criterion function is here proposed. Results: In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach and the conditional entropy is applied as criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks and its gene transference function is obtained by random drawing from the set of possible Boolean functions, thus creating its dynamics. The DREAM time series data, in turn, vary in network size, and their topologies are based on real networks; the dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions: A remarkable improvement of accuracy was observed in the experimental results, the non-Shannon entropy reducing the number of false connections in the inferred topology.
The best value of the Tsallis free parameter was on average in the range 2.5 <= q <= 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for investigating the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/.
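A sketch of the criterion function described above, built on the Tsallis entropy S_q = (1 - sum_i p_i^q)/(q - 1) over discretized expression states; the table layout and names are illustrative, not the DimReduction implementation:

```python
import math

def tsallis_entropy(probs, q=2.5):
    """Tsallis generalized entropy S_q = (1 - sum_i p_i^q) / (q - 1).
    The limit q -> 1 recovers the Shannon entropy (in nats)."""
    ps = [p for p in probs if p > 0]
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in ps)
    return (1.0 - sum(p ** q for p in ps)) / (q - 1.0)

def conditional_tsallis_entropy(joint, q=2.5):
    """Criterion function H_q(target | predictors) from a joint probability
    table joint[x][y] (rows: predictor states, columns: target states):
    sum_x p(x) * S_q(target | X = x).  Lower values flag better predictor sets."""
    h = 0.0
    for row in joint:
        px = sum(row)
        if px > 0:
            h += px * tsallis_entropy([p / px for p in row], q)
    return h
```

In the feature selection loop, candidate predictor subsets for each target gene would be ranked by this conditional entropy, with q in the reported 2.5-3.5 range.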
Abstract:
Background: Feature selection is a pattern recognition approach to choose important variables according to some criterion in order to distinguish or explain certain phenomena (i.e., for dimensionality reduction). There are many genomic and proteomic applications that rely on feature selection to answer questions such as selecting signature genes which are informative about some biological state, e.g., normal tissues and several types of cancer, or inferring a prediction network among elements such as genes, proteins and external stimuli. In these applications, a recurrent problem is the lack of samples with which to perform an adequate estimate of the joint probabilities between element states. A myriad of feature selection algorithms and criterion functions have been proposed, although it is difficult to point to the best solution for each application. Results: The intent of this work is to provide an open-source multiplatform graphical environment for bioinformatics problems, which supports many feature selection algorithms, criterion functions and graphic visualization tools such as scatterplots, parallel coordinates and graphs. A feature selection approach for growing genetic networks from seed genes (targets or predictors) is also implemented in the system. Conclusion: The proposed feature selection environment allows data analysis using several algorithms, criterion functions and graphic visualization tools. Our experiments have shown the software's effectiveness in two distinct types of biological problems. Moreover, the environment can be used in different pattern recognition applications, although its main focus is bioinformatics tasks.
Abstract:
Background: The aim of this study was to estimate the prevalence of fibromyalgia, as well as to assess the major symptoms of this syndrome, in an adult, low socioeconomic status population assisted by the primary health care system in a city in Brazil. Methods: We cross-sectionally sampled individuals assisted by the public primary health care system (n = 768, 35-60 years old). Participants were interviewed by phone and screened for pain. They were then invited to be clinically assessed (304 accepted). Pain was estimated using a Visual Analogue Scale (VAS). Fibromyalgia was assessed using the Fibromyalgia Impact Questionnaire (FIQ), along with screening for tender points using dolorimetry. Statistical analyses included Bayesian statistics and the Kruskal-Wallis ANOVA test (significance level = 5%). Results: From the phone-interview screening, we divided participants (n = 768) into three groups: No Pain (NP) (n = 185), Regional Pain (RP) (n = 388) and Widespread Pain (WP) (n = 106). Among the 304 subjects participating in the clinical assessments, the prevalence of fibromyalgia was 4.4% (95% confidence interval [2.6%; 6.3%]). Symptoms of pain (VAS and FIQ), feeling well, job ability, fatigue, morning tiredness, stiffness, anxiety and depression were statistically different among the groups. In multivariate analyses we found that individuals with FM and WP had significantly higher impairment than those with RP and NP. FM and WP were similarly disabling, while RP was not significantly different from NP. Conclusion: Fibromyalgia is prevalent in the low socioeconomic status population assisted by the public primary health care system. The prevalence (4.4%) was similar to that found in studies of more diverse socioeconomic populations. Individuals with FM and WP show significant impairment of their well-being.
Abstract:
Background: Worldwide, a high proportion of HIV-infected individuals enter into HIV care late. Here, our objective was to estimate the impact that late entry into HIV care has had on AIDS mortality rates in Brazil. Methodology/Principal Findings: We analyzed data from information systems regarding HIV-infected adults who sought treatment at public health care facilities in Brazil from 2003 to 2006. We initially estimated the prevalence of late entry into HIV care, as well as the probability of death in the first 12 months, the percentage of the risk of death attributable to late entry, and the number of avoidable deaths. We subsequently adjusted the annual AIDS mortality rate by excluding such deaths. Of the 115,369 patients evaluated, 50,358 (43.6%) had entered HIV care late, and 18,002 died in the first 12 months, representing a 16.5% probability of death in the first 12 months (95% CI: 16.3-16.7). By comparing patients who entered HIV care late with those who gained timely access, we found that the risk ratio for death was 49.5 (95% CI: 45.1-54.2). The percentage of the risk of death attributable to late entry was 95.5%, translating to 17,189 potentially avoidable deaths. Averting those deaths would have lowered the 2003-2006 AIDS mortality rate by 39.5%. Including asymptomatic patients with CD4(+) T cell counts >200 and <= 350 cells/mm(3) in the group who entered HIV care late increased this proportion by 1.8%. Conclusions/Significance: In Brazil, antiretroviral drugs reduced AIDS mortality by 43%. Timely entry would reduce that rate by a similar proportion, as well as resulting in a 45.2% increase in the effectiveness of the program for HIV care. The World Health Organization recommendation that asymptomatic patients with CD4(+) T cell counts <= 350 cells/mm(3) be treated would not have a significant impact on this scenario.
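The "percentage of the risk of death attributable to late entry" is an attributable-fraction calculation. The abstract does not state which formula was used, but Levin's population attributable fraction reproduces the reported 95.5% from the stated prevalence of late entry (43.6%) and risk ratio (49.5):

```python
def population_attributable_fraction(prevalence, risk_ratio):
    """Levin's formula: PAF = p*(RR - 1) / (1 + p*(RR - 1)),
    where p is the exposure prevalence and RR the risk ratio."""
    excess = prevalence * (risk_ratio - 1.0)
    return excess / (1.0 + excess)

paf = population_attributable_fraction(0.436, 49.5)  # ~0.955, i.e. 95.5%
```

Applying this fraction to the 18,002 first-year deaths gives roughly 17,189 potentially avoidable deaths, consistent with the abstract.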
Abstract:
We simplify the known formula for the asymptotic estimate of the number of deterministic and accessible automata with n states over a k-letter alphabet. The proof relies on the theory of Lagrange inversion applied in the context of generalized binomial series.
Abstract:
In this work a simple and reliable method for the simultaneous determination of Cr, Fe, Ni and V in crude oil, using emulsion sampling graphite furnace atomic absorption spectrometry, is proposed. Under the best conditions, sample masses around 50 mg were weighed in polypropylene tubes and emulsified in a mixture of 0.5% (v v(-1)) hexane + 6% (m v(-1)) Triton X-100 (R). Under compromise conditions, the pyrolysis and atomization temperatures for the simultaneous determination of Cr, Fe, Ni and V were 1400 degrees C and 2500 degrees C, respectively. Aliquots of 20 mu L of reference solution and sample emulsion were co-injected into the graphite tube with 10 mu L of 1.0 g L(-1) Mg(NO(3))(2) as chemical modifier. The detection limits (n = 10, 3 sigma) and characteristic masses were, respectively: 0.07 mu g g(-1) and 19 pg for Cr; 2.15 mu g g(-1) and 31 pg for Fe; 1.25 mu g g(-1) and 44 pg for Ni; and 1.15 mu g g(-1) and 149 pg for V. The reliability of the proposed method was checked by analysis of a fuel oil Standard Reference Material (SRM 1634c, NIST). The concentrations found presented no statistical differences from the certified values at the 95% confidence level.
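The quoted detection limits follow the usual 3-sigma convention (n = 10 blank readings); a generic sketch of that calculation, with the calibration slope (sensitivity) as an assumed input rather than a value from the paper:

```python
import statistics

def detection_limit_3sigma(blank_signals, sensitivity):
    """LOD = 3 * s_blank / m, where s_blank is the standard deviation of
    repeated blank measurements and m is the calibration slope
    (signal per unit concentration), so the result is in concentration units."""
    return 3.0 * statistics.stdev(blank_signals) / sensitivity
```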
Abstract:
Quantifying the rate of propagule release is of prime importance for estimating the reproductive output of natural populations, but simple methods to obtain such data are seldom reported. We designed and tested an inexpensive apparatus capable of reliably measuring the release of gametes, eggs or larvae of sessile marine invertebrates on vertical walls. A population of the acorn barnacle Chthamalus bisinuatus was sampled with this trap over 68 days to obtain a time series of naupliar release. An apparent semilunar trend is shown, indicating the effectiveness of this sampling method.
Abstract:
Tropical ecosystems play a large and complex role in the global carbon cycle. Clearing of natural ecosystems for agriculture leads to large pulses of CO(2) to the atmosphere from terrestrial biomass. Concurrently, the remaining intact ecosystems, especially tropical forests, may be sequestering a large amount of carbon from the atmosphere in response to global environmental changes, including climate changes and an increase in atmospheric CO(2). Here we use an approach that integrates census-based historical land use reconstructions, remote-sensing-based contemporary land use change analyses, and simulation modeling of terrestrial biogeochemistry to estimate the net carbon balance over the period 1901-2006 for the state of Mato Grosso, Brazil, which is one of the most rapidly changing agricultural frontiers in the world. By the end of this period, we estimate that of the state's 925 225 km(2), 221 092 km(2) had been converted to pastures and 89 533 km(2) to croplands, with forest-to-pasture conversions being the dominant land use trajectory but with transitions to croplands increasing rapidly in the last decade. These conversions have led to a cumulative release of 4.8 Pg C to the atmosphere, with ~80% from forest clearing and 20% from the clearing of cerrado. Over the same period, we estimate that the residual undisturbed ecosystems accumulated 0.3 Pg C in response to CO(2) fertilization. Therefore, the net emissions of carbon from Mato Grosso over this period were 4.5 Pg C. Net carbon emissions from Mato Grosso since 2000 averaged 146 Tg C/yr, on the order of Brazil's fossil fuel emissions during this period. These emissions were associated with the expansion of croplands to grow soybeans. While alternative management regimes in croplands, including tillage, fertilization, and cropping patterns, promote carbon storage in ecosystems, they remain a small portion of the net carbon balance for the region.
This detailed accounting of a region's carbon balance is the type of foundational analysis needed by the new United Nations Collaborative Programme on Reducing Emissions from Deforestation and Forest Degradation (REDD).
Abstract:
The Brazilian Amazon is one of the most rapidly developing agricultural areas in the world and represents a potentially large future source of greenhouse gases from land clearing and subsequent agricultural management. In an integrated approach, we estimate the greenhouse gas dynamics of natural ecosystems and agricultural ecosystems after clearing in the context of a future climate. We examine scenarios of deforestation and postclearing land use to estimate the future (2006-2050) impacts on carbon dioxide (CO(2)), methane (CH(4)), and nitrous oxide (N(2)O) emissions from the agricultural frontier state of Mato Grosso, using a process-based biogeochemistry model, the Terrestrial Ecosystems Model (TEM). We estimate a net emission of greenhouse gases from Mato Grosso, ranging from 2.8 to 15.9 Pg CO(2)-equivalents (CO(2)-e) from 2006 to 2050. Deforestation is the largest source of greenhouse gas emissions over this period, but land uses following clearing account for a substantial portion (24-49%) of the net greenhouse gas budget. Due to land-cover and land-use change, there is a small foregone carbon sequestration of 0.2-0.4 Pg CO(2)-e by natural forests and cerrado between 2006 and 2050. Both deforestation and future land-use management play important roles in the net greenhouse gas emissions of this frontier, suggesting that both should be considered in emissions policies. We find that avoided deforestation remains the best strategy for minimizing future greenhouse gas emissions from Mato Grosso.
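Expressing the combined CO(2), CH(4) and N(2)O budget in CO(2)-equivalents, as the abstract's 2.8-15.9 Pg CO(2)-e figure does, requires global warming potentials; a sketch using 100-year GWPs from IPCC AR4 (illustrative assumption; the paper may adopt different values):

```python
# 100-year global warming potentials (IPCC AR4 values; illustrative assumption)
GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2_equivalents(emissions_by_gas):
    """Sum emissions (one mass unit per gas) weighted by GWP100 to get CO2-e."""
    return sum(mass * GWP100[gas] for gas, mass in emissions_by_gas.items())
```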