905 results for: Cosmology, cosmic voids, mass function, astrophysics, large scale structure, theory


Relevance: 100.00%

Abstract:

The EU-funded research project ALARM will develop and test methods and protocols for the assessment of large-scale environmental risks in order to minimise negative human impacts. Research focuses on the assessment and forecast of changes in biodiversity and in the structure, function, and dynamics of ecosystems. This includes the relationships between society, the economy and biodiversity.

Relevance: 100.00%

Abstract:

Many numerical models for weather prediction and climate studies are run at resolutions that are too coarse to resolve convection explicitly, but too fine to justify the local equilibrium assumed by conventional convective parameterizations. The Plant-Craig (PC) stochastic convective parameterization scheme addresses this problem by removing the assumption that a given grid-scale situation must always produce the same sub-grid-scale convective response. Instead, for each timestep and gridpoint, one of the many possible convective responses consistent with the large-scale situation is randomly selected. The scheme requires as input the large-scale state, as opposed to the instantaneous grid-scale state, but must nonetheless be able to account for genuine variations in the large-scale situation. Here we investigate the behaviour of the PC scheme in three-dimensional simulations of radiative-convective equilibrium, demonstrating in particular that the space-time averaging required to produce a good representation of the input large-scale state does not conflict with the requirement to capture large-scale variations. The resulting equilibrium profiles agree well with those obtained from established deterministic schemes, and with corresponding cloud-resolving model simulations. Unlike the conventional schemes, the statistics for mass flux and rainfall variability from the PC scheme also agree well with relevant theory and vary appropriately with spatial scale. The scheme is further shown to adapt automatically to changes in grid length and in forcing strength.
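The random-selection step can be illustrated with a minimal sketch. In the published Plant-Craig formulation, individual cloud-base mass fluxes follow an exponential distribution and the cloud number is Poisson-distributed; the sketch below assumes those forms, with illustrative parameter values, and is not the scheme's actual implementation.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's algorithm: multiply uniforms until the product drops below exp(-lam).
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def stochastic_mass_flux(mean_total_flux, mean_cloud_flux, rng):
    """One random convective response consistent with a given large-scale state.

    mean_total_flux: ensemble-mean mass flux implied by the large-scale state
    mean_cloud_flux: mean mass flux per individual cloud (illustrative value)
    """
    n_clouds = sample_poisson(mean_total_flux / mean_cloud_flux, rng)
    # Each cloud's flux is drawn from an exponential distribution p(m) ~ exp(-m/<m>).
    return sum(rng.expovariate(1.0 / mean_cloud_flux) for _ in range(n_clouds))

rng = random.Random(0)
# Individual draws fluctuate, but their mean converges to the large-scale value.
draws = [stochastic_mass_flux(mean_total_flux=2.0, mean_cloud_flux=0.1, rng=rng)
         for _ in range(20000)]
mean = sum(draws) / len(draws)
```

Averaging many realizations recovers the deterministic (equilibrium) response, which is the sense in which the scheme remains consistent with the large-scale state.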

Relevance: 100.00%

Abstract:

Advances in our understanding of the large-scale electric and magnetic fields in the coupled magnetosphere-ionosphere system are reviewed. The literature appearing in the period January 1991–June 1993 is sorted into eight general areas of study. The phenomenon of substorms receives the most attention in this literature, with the location of onset being the single most discussed issue. However, while the magnetic topology in substorm phases was widely debated, less attention was paid to the relationship of convection to the substorm cycle. A significantly new consensus view of substorm expansion and recovery phases emerged, which was termed the ‘Kiruna Conjecture’ after the conference at which it gained widespread acceptance. The second largest area of interest was dayside transient events, both near the magnetopause and in the ionosphere. It became apparent that these phenomena include at least two classes of events, probably due to transient reconnection bursts and sudden changes in solar wind dynamic pressure. The contribution of both types of event to convection is controversial. The realisation that induction effects decouple electric fields in the magnetosphere and ionosphere, on time scales shorter than several substorm cycles, calls for a broadening of the range of measurement techniques both in the ionosphere and at the magnetopause. Several new techniques were introduced, including ionospheric observations which yield the reconnection rate as a function of time. The magnetospheric and ionospheric behaviour under various quasi-steady interplanetary conditions was studied using magnetic cloud events. For northward IMF conditions, reverse convection in the polar cap was found to be predominantly a summer hemisphere phenomenon, and even for extremely rare prolonged southward IMF conditions, the magnetosphere was observed to oscillate through various substorm cycles rather than forming a steady-state convection bay.

Relevance: 100.00%

Abstract:

Data from 58 strong-lensing events surveyed by the Sloan Lens ACS Survey are used to estimate the projected galaxy mass inside their Einstein radii by two independent methods: stellar dynamics and strong gravitational lensing. We perform a joint analysis of these two estimates within models with up to three degrees of freedom with respect to the lens density profile, stellar velocity anisotropy, and line-of-sight (LOS) external convergence, which incorporates the effect of the large-scale structure on strong lensing. A Bayesian analysis is employed to estimate the model parameters, evaluate their significance, and compare models. We find that the data favor Jaffe's light profile over Hernquist's, but that any particular choice between these two does not change the qualitative conclusions with respect to the features of the system that we investigate. The density profile is compatible with an isothermal profile, being slightly steeper and having an uncertainty in the logarithmic slope of the order of 5% in models that take into account prior ignorance on anisotropy and external convergence. We identify a considerable degeneracy between the density profile slope and the anisotropy parameter, which largely increases the uncertainties in the estimates of these parameters, but we encounter no evidence in favor of an anisotropic velocity distribution on average for the whole sample. An LOS external convergence following a prior probability distribution given by cosmology has a small effect on the estimation of the lens density profile, but can increase the dispersion of its value by nearly 40%.
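Both estimators in this abstract target the projected mass within the Einstein radius. For reference, the standard strong-lensing relation behind that quantity (a textbook result, not a formula specific to this paper) is:

```latex
M_E = \Sigma_{\rm cr}\,\pi R_E^2 ,
\qquad
\Sigma_{\rm cr} = \frac{c^2}{4\pi G}\,\frac{D_s}{D_l D_{ls}} ,
```

where D_l, D_s and D_ls are the angular-diameter distances to the lens, to the source, and from lens to source; a line-of-sight external convergence rescales the surface density inferred from the lensing data.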

Relevance: 100.00%

Abstract:

We investigate the impact of the existence of a primordial magnetic field on the filter mass, which characterizes the minimum baryonic mass that can form in dark matter (DM) haloes. For masses below the filter mass, the baryon content of DM haloes is severely depressed. The filter mass is the mass at which the baryon-to-DM mass ratio in a halo equals half the baryon-to-DM ratio of the Universe. The filter mass has previously been used in semi-analytic calculations of galaxy formation without taking into account the possible existence of a primordial magnetic field. We examine here its effect on the filter mass. For homogeneous comoving primordial magnetic fields of B_0 ~ 1 or 2 nG and a re-ionization epoch that starts at redshift z_s = 11 and is completed at z_r = 8, the filter mass at redshift 8, for example, is increased by factors of 4.1 and 19.8, respectively. The dependence of the filter mass on the parameters describing the re-ionization epoch is investigated. Our results are particularly important for the formation of low-mass galaxies in the presence of a homogeneous primordial magnetic field. For example, for B_0 ~ 1 nG and a re-ionization epoch of z_s ~ 11 and z_r ~ 7, our results indicate that galaxies of total mass M ~ 5 × 10^8 M_⊙ need to form at redshifts z_F ≳ 2.0, and galaxies of total mass M ~ 10^8 M_⊙ at redshifts z_F ≳ 7.7.
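The verbal definition of the filter mass above can be stated compactly. The first relation is exactly what the abstract says; the second is the Gnedin (2000) fitting formula commonly used for the halo baryon fraction, quoted from the general literature rather than from this paper:

```latex
\left.\frac{M_b}{M_{\rm dm}}\right|_{M = M_F} = \frac{1}{2}\,\frac{\Omega_b}{\Omega_{\rm dm}} ,
\qquad
f_b(M) \simeq \frac{\Omega_b/\Omega_m}{\left[\,1 + \left(2^{1/3}-1\right) M_F/M\,\right]^{3}} .
```

With this form, haloes of mass M ≪ M_F retain almost no baryons, while M ≫ M_F haloes approach the cosmic baryon fraction.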

Relevance: 100.00%

Abstract:

A new accelerating cosmology driven only by baryons plus cold dark matter (CDM) is proposed in the framework of general relativity. In this scenario the present accelerating stage of the Universe is powered by the negative pressure describing the gravitationally induced particle production of cold dark matter particles. This kind of scenario has only one free parameter, and the differential equation governing the evolution of the scale factor is exactly the same as that of the ΛCDM model. For a spatially flat Universe, as predicted by inflation (Ω_dm + Ω_baryon = 1), it is found that the effectively observed matter density parameter is Ω_m,eff = 1 − α, where α is the constant parameter specifying the CDM particle creation rate. The supernovae test based on the Union data (2008) requires α ≈ 0.71, so that Ω_m,eff ≈ 0.29, as independently derived from weak gravitational lensing, the large-scale structure and other complementary observations.
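Since the abstract states that the scale-factor equation coincides with that of ΛCDM, the implied expansion law and the quoted parameter values can be written as follows (an inference from the abstract's statements, not a formula quoted from the paper):

```latex
\frac{H^2(z)}{H_0^2} = (1-\alpha)(1+z)^3 + \alpha ,
\qquad
\Omega_{m,\rm eff} = 1-\alpha
\quad (\alpha \simeq 0.71 \;\Rightarrow\; \Omega_{m,\rm eff} \simeq 0.29).
```

The constant creation-rate parameter α thus plays, at the background level, exactly the role of Ω_Λ in ΛCDM.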

Relevance: 100.00%

Abstract:

The relationship between the structure and function of biological networks constitutes a fundamental issue in systems biology. In particular, the structure of protein-protein interaction networks is related to important biological functions. In this work, we investigated how the resilience of such networks is determined by their large-scale features. Four species are taken into account, namely the yeast Saccharomyces cerevisiae, the worm Caenorhabditis elegans, the fly Drosophila melanogaster and Homo sapiens. We adopted two entropy-related measurements (degree entropy and dynamic entropy) in order to quantify the overall degree of robustness of these networks. We verified that while they exhibit similar structural variations under random node removal, they differ significantly when subjected to intentional attacks (hub removal). As a matter of fact, more complex species tended to exhibit more robust networks. More specifically, we quantified how six important measurements of network topology (namely the clustering coefficient, average degree of neighbors, average shortest path length, diameter, assortativity coefficient, and slope of the power-law degree distribution) correlated with the two entropy measurements. Our results revealed that the fraction of hubs and the average neighbor degree contribute significantly to the resilience of networks. In addition, the topological analysis of the removed hubs indicated that the presence of alternative paths between the proteins connected to hubs tends to reinforce resilience. The analysis performed here helps to understand how resilience arises in networks and can be applied to the development of protein network models.
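As an illustration of one of the two measurements, here is a minimal pure-Python computation of degree entropy, taken as the Shannon entropy of the degree distribution; the paper's exact normalization and its dynamic entropy are not reproduced here.

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy (in nats) of the degree distribution of an undirected graph."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    counts = Counter(deg.values())  # degree value -> number of nodes with that degree
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A 4-cycle is degree-regular (every node has degree 2), so its entropy is zero.
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
# A star is heterogeneous: one hub of degree 3 plus three leaves of degree 1.
star = [(0, 1), (0, 2), (0, 3)]
```

Hub-dominated (heterogeneous) degree distributions give higher degree entropy than regular ones, which is why entropy-based measures track the structural changes caused by hub removal.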

Relevance: 100.00%

Abstract:

Analyses of circulating metabolites in large prospective epidemiological studies could lead to improved prediction and better biological understanding of coronary heart disease (CHD). We performed a mass spectrometry-based non-targeted metabolomics study for association with incident CHD events in 1,028 individuals (131 events; 10-year median follow-up) with validation in 1,670 individuals (282 events; 3.9-year median follow-up). Four metabolites were replicated and independent of main cardiovascular risk factors [lysophosphatidylcholine 18:1 (hazard ratio [HR] per standard deviation [SD] increment = 0.77, P-value < 0.001), lysophosphatidylcholine 18:2 (HR = 0.81, P-value < 0.001), monoglyceride 18:2 (MG 18:2; HR = 1.18, P-value = 0.011) and sphingomyelin 28:1 (HR = 0.85, P-value = 0.015)]. Together they contributed to moderate improvements in discrimination and re-classification in addition to traditional risk factors (C-statistic: 0.76 vs. 0.75; NRI: 9.2%). MG 18:2 was associated with CHD independently of triglycerides. Lysophosphatidylcholines were negatively associated with body mass index, C-reactive protein and with less evidence of subclinical cardiovascular disease in an additional 970 participants; a reverse pattern was observed for MG 18:2. MG 18:2 showed an enrichment (P-value = 0.002) of significant associations with CHD-associated SNPs (P-value = 1.2×10^-7 for association with rs964184 in the ZNF259/APOA5 region) and a weak but positive causal effect (odds ratio = 1.05 per SD increment in MG 18:2, P-value = 0.05) on CHD, as suggested by Mendelian randomization analysis. In conclusion, we identified four lipid-related metabolites with evidence for clinical utility, as well as a causal role in CHD development.

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

We present a new method to quantify substructures in clusters of galaxies, based on the analysis of the intensity of structures. This analysis is done in a residual image that is the result of subtracting, from the X-ray image, a surface-brightness model obtained by fitting a two-dimensional analytical model (beta-model or Sersic profile) with elliptical symmetry. Our method is applied to 34 clusters observed by the Chandra X-ray Observatory that are in the redshift range z ∈ [0.02, 0.2] and have a signal-to-noise ratio (S/N) greater than 100. We present the calibration of the method and the relations between the substructure level and physical quantities, such as the mass, X-ray luminosity, temperature, and cluster redshift. We use our method to separate the clusters into two sub-samples of high and low substructure levels. We conclude, using Monte Carlo simulations, that the method recovers the true amount of substructure very well for clusters with small angular core radii (with respect to the whole image size) and good-S/N observations. We find no evidence of correlation between the substructure level and physical properties of the clusters such as gas temperature, X-ray luminosity, and redshift; however, our analysis suggests a trend between the substructure level and cluster mass. The scaling relations for the two sub-samples (high- and low-substructure-level clusters) are different (they present an offset, i.e., given a fixed mass or temperature, low-substructure clusters tend to be more X-ray luminous), which is an important result for cosmological tests using the mass-luminosity relation to obtain the cluster mass function, since these rely on the assumption that clusters do not present different scaling relations according to their dynamical state.
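The residual-image idea can be sketched in a few lines: build a symmetric beta-model surface-brightness map, add an off-centre clump standing in for substructure, subtract the smooth model, and quantify what remains. Everything below (grid size, parameter values, and the residual fraction used as a "substructure level") is illustrative rather than the paper's calibrated estimator, which fits the model to real Chandra images.

```python
import math

def beta_model(x, y, x0, y0, s0, rc, beta):
    # 2D beta-model surface brightness: S(r) = S0 * (1 + (r/rc)^2)^(0.5 - 3*beta)
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return s0 * (1.0 + r2 / rc ** 2) ** (0.5 - 3.0 * beta)

size = 41
center = size // 2
# Smooth, symmetric cluster model (here circular for simplicity).
model = [[beta_model(x, y, center, center, 100.0, 5.0, 0.7)
          for x in range(size)] for y in range(size)]

# "Observed" image: smooth cluster plus an off-centre Gaussian clump (substructure).
image = [[model[y][x] + 20.0 * math.exp(-((x - 30) ** 2 + (y - 30) ** 2) / 8.0)
          for x in range(size)] for y in range(size)]

# Substructure level: fraction of the flux left in the residual image.
residual = sum(abs(image[y][x] - model[y][x]) for x in range(size) for y in range(size))
total = sum(image[y][x] for x in range(size) for y in range(size))
substructure_level = residual / total
```

A perfectly relaxed (symmetric) cluster would leave a residual of zero; the clump contributes a small positive fraction, which is the quantity being calibrated against cluster properties.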

Relevance: 100.00%

Abstract:

The origin of cosmic rays at all energies is still uncertain. In this paper, we present and explore an astrophysical scenario to produce cosmic rays with energy ranging from below 10^15 to 3 × 10^20 eV. We show here that just our Galaxy and the radio galaxy Cen A, each with their own galactic cosmic-ray particles but with those from the radio galaxy pushed up in energy by a relativistic shock in the jet emanating from the active black hole, are sufficient to describe the most recent data in the PeV to near-ZeV energy range. Data are available over this entire energy range from the KASCADE, KASCADE-Grande, and Pierre Auger Observatory experiments. The energy spectrum calculated here correctly reproduces the measured spectrum beyond the knee and, contrary to widely held expectations, no other extragalactic source population is required to explain the data even at energies far below the general cutoff expected at 6 × 10^19 eV, the Greisen-Zatsepin-Kuz'min cutoff due to interaction with the cosmic microwave background. We present several predictions for the source population, the cosmic-ray composition, and the propagation to Earth which can be tested in the near future.

Relevance: 100.00%

Abstract:

Starting from the Fisher matrix for counts in cells, we derive the full Fisher matrix for surveys of multiple tracers of large-scale structure. The key step is the classical approximation, which allows us to write the inverse of the covariance of the galaxy counts in terms of the naive matrix inverse of the covariance in a mixed position-space and Fourier-space basis. We then compute the Fisher matrix for the power spectrum in bins of the 3D wavenumber k, the Fisher matrix for functions of position (or redshift z) such as the linear bias of the tracers and/or the growth function, and the cross-terms of the Fisher matrix that express the correlations between estimations of the power spectrum and estimations of the bias. When the bias and growth function are fully specified, and the Fourier-space bins are large enough that the covariance between them can be neglected, the Fisher matrix for the power spectrum reduces to the widely used result that was first derived by Feldman, Kaiser & Peacock. Assuming isotropy, a fully analytical calculation of the Fisher matrix in the classical approximation can be performed in the case of a constant-density, volume-limited survey.
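The "widely used result first derived by Feldman, Kaiser & Peacock" is, in its standard form for parameters θ_i of the power spectrum P(k),

```latex
F_{ij} = \frac{1}{2} \int d^3x \int \frac{d^3k}{(2\pi)^3}\,
\frac{\partial \ln P(k)}{\partial \theta_i}\,
\frac{\partial \ln P(k)}{\partial \theta_j}
\left[ \frac{\bar{n}(\vec{x})\, P(k)}{1 + \bar{n}(\vec{x})\, P(k)} \right]^2 ,
```

where n̄(x) is the mean number density of tracers; the squared factor in brackets is the usual effective-volume weight, which saturates at unity in the sample-variance-limited regime n̄P ≫ 1.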

Relevance: 100.00%

Abstract:

This is an observational study of the large-scale moisture transport over South America, with some analyses of its relation to subtropical rainfall. The concept of aerial rivers is proposed as a framework: it is an analogy between the main pathways of moisture flow in the atmosphere and surface rivers. Unlike surface rivers, aerial rivers gain water through evaporation and lose it through precipitation. The magnitude of the vertically integrated moisture transport is the discharge, and precipitable water is like the mass of the liquid column; multiplied by an equivalent speed, it gives the discharge. The trade-wind flow into Amazonia, and the north/northwesterly flow to the subtropics east of the Andes, are aerial rivers. Aerial lakes are the sections of a moisture pathway where the flow slows down and broadens because of diffluence, and becomes deeper, with higher precipitable water. This is the case over Amazonia, downstream of the trade-wind confluence. In the dry season, moisture from the aerial lake is transported northeastward, but a weaker flow over southern Amazonia heads southward toward the subtropics. Southern Amazonia appears as a source of moisture for this flow. The aerial-river discharge to the subtropics is comparable to that of the Amazon River. Variations in the amount of moisture coming from Amazonia have an important effect on the variability of the discharge. Correlations between the flow from Amazonia and subtropical rainfall are not strong. However, in some months within the set of dry seasons observed, a strong increase (decrease) in this flow occurred together with an important increase (decrease) in subtropical rainfall.
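The river analogy maps onto standard vertically integrated quantities. Writing them out makes the "precipitable water times an equivalent speed gives the discharge" statement explicit (standard meteorological definitions, with q the specific humidity, v the horizontal wind, and p_s the surface pressure):

```latex
\vec{Q} = \frac{1}{g}\int_0^{p_s} q\,\vec{v}\;dp
\quad \text{(vertically integrated moisture transport, the ``discharge'')},
\qquad
W = \frac{1}{g}\int_0^{p_s} q\;dp
\quad \text{(precipitable water)},
```

so that |Q| = W v_eq defines the equivalent speed v_eq: where the flow slows (v_eq drops) at fixed discharge, W must grow, which is the "aerial lake" behaviour described above.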

Relevance: 100.00%

Abstract:

Planck-scale physics may influence the evolution of cosmological fluctuations in the early stages of cosmological evolution. Because of the quasi-exponential redshifting which occurs during an inflationary period, the physical wavelengths of comoving scales that correspond to the present large-scale structure of the Universe were smaller than the Planck length in the early stages of the inflationary period. This trans-Planckian effect has previously been studied using toy models. The Horava-Lifshitz (HL) theory offers the chance to study this problem in a candidate UV-complete theory of gravity. In this paper we study the evolution of cosmological perturbations according to HL gravity, assuming that matter gives rise to an inflationary background. As is usually done in inflationary cosmology, we assume that the fluctuations originate in their minimum-energy state. In the trans-Planckian region the fluctuations obey a nonlinear dispersion relation of Corley-Jacobson type. In the "healthy extension" of HL gravity there is an extra degree of freedom which plays an important role in the UV region but decouples in the IR, and which influences the cosmological perturbations. We find that in spite of these important changes compared to the usual description, the overall scale invariance of the power spectrum of cosmological perturbations is recovered. However, we obtain oscillations in the spectrum as a function of wave number, with a relative amplitude of order unity and with an effective frequency which scales nonlinearly with wave number. Taking the usual inflationary parameters, we find that the frequency of the oscillations is so large as to render the effect difficult to observe.
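A Corley-Jacobson-type dispersion relation has the generic form below; the coefficients b_n in the HL case come from the theory's higher spatial-derivative terms, so this is the schematic family of relations rather than the paper's exact expression:

```latex
\omega^2 = k^2 \left[\, 1 + \sum_{n \geq 1} b_n \left( \frac{k}{M} \right)^{2n} \right],
```

where M is the scale of new physics: for k ≪ M the standard linear relation ω ≈ k is recovered, while in the trans-Planckian regime k ≳ M the higher powers of k dominate and modify the mode evolution.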

Relevance: 100.00%

Abstract:

Bioinformatics has, in the last few decades, played a fundamental role in making sense of the huge amount of data produced. Once the complete sequence of a genome has been obtained, the major problem becomes characterizing its coding regions as fully as possible. Protein sequence annotation is challenging and, due to the size of the problem, only computational approaches can provide a feasible solution. As recently pointed out by the Critical Assessment of Function Annotations (CAFA), the most accurate methods are those based on the transfer-by-homology approach, and the most incisive contribution is given by cross-genome comparisons. The present thesis describes a non-hierarchical sequence clustering method for automatic large-scale protein annotation, called "The Bologna Annotation Resource Plus" (BAR+). The method is based on an all-against-all alignment of more than 13 million protein sequences, characterized by a very stringent metric. BAR+ can safely transfer functional features (Gene Ontology and Pfam terms) within clusters by means of a statistical validation, even in the case of multi-domain proteins. Within BAR+ clusters it is also possible to transfer the three-dimensional structure (when a template is available). This is possible by way of cluster-specific HMM profiles that can be used to calculate reliable template-to-target alignments even in the case of distantly related proteins (sequence identity < 30%). Other BAR+-based applications were developed during my doctorate, including the prediction of magnesium-binding sites in human proteins, the classification of the ABC transporter superfamily, and the functional prediction (GO terms) of the CAFA targets. Remarkably, in the CAFA assessment, BAR+ placed among the ten most accurate methods. At present, as a web server for functional and structural protein sequence annotation, BAR+ is freely available at http://bar.biocomp.unibo.it/bar2.0.
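Non-hierarchical clustering driven by a stringent pairwise criterion amounts to taking connected components over the pairs of sequences that pass the threshold. The sketch below uses union-find with made-up identity and coverage cutoffs; BAR+'s actual metric and thresholds are those defined in the thesis, not these placeholders.

```python
def cluster_pairs(n_items, scored_pairs, min_identity, min_coverage):
    """Group items into clusters via union-find over pairs passing both thresholds.

    scored_pairs: iterable of (i, j, identity, coverage) tuples, as would come
    from an all-against-all comparison. Threshold values are illustrative.
    """
    parent = list(range(n_items))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving keeps trees shallow
            x = parent[x]
        return x

    for i, j, ident, cov in scored_pairs:
        if ident >= min_identity and cov >= min_coverage:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj  # union: link the two components

    clusters = {}
    for x in range(n_items):
        clusters.setdefault(find(x), []).append(x)
    return sorted(clusters.values())

# Pairs as (i, j, identity%, coverage%): only pairs passing both cutoffs link sequences.
pairs = [(0, 1, 95.0, 99.0), (1, 2, 45.0, 92.0), (3, 4, 30.0, 95.0)]
result = cluster_pairs(5, pairs, min_identity=40.0, min_coverage=90.0)
```

Annotation transfer then operates within each resulting cluster: features validated on one member can be propagated to the others, which is the transfer-by-homology step the clustering enables.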