74 results for LARGE-SCALE STRUCTURE OF UNIVERSE

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

100.00%

Publisher:

Abstract:

We studied superclusters of galaxies in a volume-limited sample extracted from the Sloan Digital Sky Survey Data Release 7 and from mock catalogues based on a semi-analytical model of galaxy evolution in the Millennium Simulation. A density field method was applied to a sample of galaxies brighter than M_r = -21 + 5 log h_100 to identify superclusters, taking into account selection and boundary effects. In order to evaluate the influence of the threshold density, we have chosen two thresholds: the first maximizes the number of objects (D1) and the second constrains the maximum supercluster size to ~120 h^-1 Mpc (D2). We have performed a morphological analysis, using Minkowski functionals, based on a parameter that increases monotonically from filaments to pancakes. An anticorrelation was found between supercluster richness (and total luminosity or size) and the morphological parameter, indicating that filamentary structures tend to be richer, larger and more luminous than pancakes in both observed and mock catalogues. We have also used the mock samples to compare supercluster morphologies identified in position and velocity spaces, concluding that our morphological classification is not biased by the peculiar velocities. Monte Carlo simulations designed to investigate the reliability of our results with respect to random fluctuations show that these results are robust. Our analysis indicates that filaments and pancakes present different luminosity and size distributions.
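
A common way to build such a morphological parameter from the four Minkowski functionals (volume V, surface area S, integrated mean curvature C and Euler characteristic) is through the Sahni-Sathyaprakash-Shandarin shapefinders; the abstract does not spell out the exact definition used in the paper, so the LaTeX sketch below is only an illustration of that standard construction:

    T = \frac{3V}{S}, \qquad B = \frac{S}{C}, \qquad L = \frac{C}{4\pi},
    \qquad P = \frac{B - T}{B + T}, \qquad F = \frac{L - B}{L + B}

Here T, B and L are the thickness, breadth and length of a structure, P its planarity and F its filamentarity; a pancake has P >> F and a filament F >> P, so a contrast such as P - F grows monotonically from filament-like towards pancake-like shapes.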

Relevance:

100.00%

Publisher:

Abstract:

We recently predicted the existence of random primordial magnetic fields (RPMFs) in the form of randomly oriented cells with a dipole-like structure, with cell size L_0 and average magnetic field B_0. Here, we investigate models for primordial magnetic fields with a similar web-like structure, and other geometries, differing perhaps in L_0 and B_0. The effect of RPMFs on the formation of the first galaxies is investigated. The filtering mass, M_F, is the halo mass below which baryon accretion is severely depressed. We show that these RPMFs could influence the formation of galaxies by altering the filtering mass and the baryon gas fraction of a halo, f_g. The effect is particularly strong in small galaxies. We find, for example, that for a comoving B_0 = 0.1 μG, a reionization epoch that starts at z_s = 11 and ends at z_e = 8, and L_0 = 100 pc, f_g at z = 12 becomes severely depressed for M < 10^7 M_⊙, whereas for B_0 = 0 it becomes severely depressed only for much smaller masses, M < 10^5 M_⊙. We suggest that observations of M_F and f_g at high redshifts can give information on the intensity and structure of primordial magnetic fields.

Relevance:

100.00%

Publisher:

Abstract:

We investigate the impact of the existence of a primordial magnetic field on the filter mass, which characterizes the minimum baryonic mass that can form in dark matter (DM) haloes. For masses below the filter mass, the baryon content of DM haloes is severely depressed. The filter mass is the mass at which the baryon-to-DM mass ratio in a halo equals half the baryon-to-DM ratio of the Universe. The filter mass has previously been used in semi-analytic calculations of galaxy formation without taking into account the possible existence of a primordial magnetic field; we examine here its effect on the filter mass. For homogeneous comoving primordial magnetic fields of B_0 ~ 1 or 2 nG and a reionization epoch that starts at redshift z_s = 11 and is completed at z_r = 8, the filter mass at redshift 8 is increased, for example, by factors of 4.1 and 19.8, respectively. The dependence of the filter mass on the parameters describing the reionization epoch is investigated. Our results are particularly important for the formation of low-mass galaxies in the presence of a homogeneous primordial magnetic field. For example, for B_0 ~ 1 nG and a reionization epoch with z_s ~ 11 and z_r ~ 7, our results indicate that galaxies of total mass M ~ 5 × 10^8 M_⊙ need to form at redshifts z_F ≳ 2.0, and galaxies of total mass M ~ 10^8 M_⊙ at redshifts z_F ≳ 7.7.
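
The definition above (halo baryon-to-DM ratio equal to half the universal value at M = M_F) is the one satisfied by the Gnedin (2000) fitting formula widely used in such semi-analytic calculations; the abstract does not give the exact expression adopted in the paper, so the relation below is only an orientation sketch in LaTeX:

    f_g(M) = \frac{f_b}{\left[ 1 + \left( 2^{1/3} - 1 \right) M_F / M \right]^{3}}

where f_b is the universal baryon fraction, so that f_g(M_F) = f_b / 2 and f_g is strongly suppressed for M << M_F.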

Relevance:

100.00%

Publisher:

Abstract:

Cosmic shear requires high-precision measurement of galaxy shapes in the presence of the observational point spread function (PSF) that smears out the image. The PSF must therefore be known for each galaxy to high accuracy. However, for several reasons, the PSF is usually wavelength dependent; therefore, differences between the spectral energy distributions of the observed objects introduce further complexity. In this paper, we investigate the effect of the wavelength dependence of the PSF, focusing on instruments in which the PSF size is dominated by the diffraction limit of the telescope and which use broad-band filters for shape measurement. We first calculate biases on cosmological parameter estimation from cosmic shear when the stellar PSF is used uncorrected. Using realistic galaxy and star spectral energy distributions and populations and a simple three-component circular PSF, we find that the colour dependence must be taken into account for the next generation of telescopes. We then consider two different methods for removing the effect: (i) the use of stars of the same colour as the galaxies and (ii) estimation of the galaxy spectral energy distribution using multiple colours and a telescope model for the PSF. We find that both of these methods correct the effect to levels below the tolerances required for per cent level measurements of dark energy parameters. Comparison of the two methods favours the template-fitting method, because its efficiency is less dependent on galaxy redshift than that of the broad-band colour method and it takes full advantage of deeper photometry.
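
As a rough numerical illustration of why the colour dependence matters for a diffraction-limited PSF, the Python sketch below computes an SED-weighted effective PSF width; the filter edges, telescope diameter and toy power-law SEDs are invented for the example and do not come from the paper:

    import numpy as np

    # Wavelength grid across a broad shape-measurement band (illustrative 550-900 nm filter).
    lam = np.linspace(550e-9, 900e-9, 500)      # metres
    D = 1.2                                     # telescope diameter in metres (illustrative)

    def effective_psf_fwhm(sed):
        """SED-weighted diffraction-limited PSF FWHM, using ~1.22 lambda / D per wavelength."""
        fwhm_lambda = 1.22 * lam / D            # radians
        weights = sed / np.trapz(sed, lam)      # normalised SED weights
        return np.trapz(weights * fwhm_lambda, lam)

    # Toy spectral energy distributions: a blue (star-like) and a red (galaxy-like) power law.
    blue_sed = lam ** -2.0
    red_sed = lam ** 2.0

    rad_to_arcsec = 180.0 / np.pi * 3600.0
    print("blue-object PSF FWHM:", effective_psf_fwhm(blue_sed) * rad_to_arcsec, "arcsec")
    print("red-object PSF FWHM:", effective_psf_fwhm(red_sed) * rad_to_arcsec, "arcsec")
    # The redder object sees a slightly larger PSF, so using the stellar PSF uncorrected
    # biases galaxy shape (and hence cosmic shear) measurements.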

Relevance:

100.00%

Publisher:

Abstract:

Enantiomerically pure (R)- and (S)-γ-hydroxy organochalcogenides are prepared using poly[(R)-3-hydroxybutanoate] (PHB) as the starting material. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Aims. An analytical solution for the discrepancy between observed core-like profiles and predicted cusp profiles in dark matter halos is studied. Methods. We calculate the distribution function for Navarro-Frenk-White halos and extract energy from the distribution, taking into account the effects of baryonic physics processes. Results. We show with a simple argument that we can reproduce the evolution of a cusp to a flat density profile by a decrease of the initial potential energy.
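
For reference, the Navarro-Frenk-White profile referred to above has the cuspy inner behaviour that the proposed energy-extraction argument flattens into a core:

    \rho_{\mathrm{NFW}}(r) = \frac{\rho_s}{(r/r_s)\,(1 + r/r_s)^{2}}

i.e. \rho \propto r^{-1} for r << r_s (the cusp) and \rho \propto r^{-3} for r >> r_s, whereas a cored profile tends to a constant density as r -> 0.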

Relevance:

100.00%

Publisher:

Abstract:

The relationship between the structure and function of biological networks constitutes a fundamental issue in systems biology. In particular, the structure of protein-protein interaction networks is related to important biological functions. In this work, we investigated how the resilience of such networks is determined by their large-scale features. Four species are taken into account, namely the yeast Saccharomyces cerevisiae, the worm Caenorhabditis elegans, the fly Drosophila melanogaster and Homo sapiens. We adopted two entropy-related measurements (degree entropy and dynamic entropy) in order to quantify the overall degree of robustness of these networks. We verified that, while they exhibit similar structural variations under random node removal, they differ significantly when subjected to intentional attacks (hub removal). As a matter of fact, more complex species tended to exhibit more robust networks. More specifically, we quantified how six important measurements of network topology (namely clustering coefficient, average degree of neighbors, average shortest path length, diameter, assortativity coefficient, and slope of the power-law degree distribution) correlated with the two entropy measurements. Our results revealed that the fraction of hubs and the average neighbor degree contribute significantly to the resilience of the networks. In addition, the topological analysis of the removed hubs indicated that the presence of alternative paths between the proteins connected to hubs tends to reinforce resilience. This analysis helps to understand how resilience is established in networks and can be applied to the development of protein network models.
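
The abstract does not spell out how the two entropies are defined; a minimal Python sketch of one standard choice, the Shannon entropy of the degree distribution, together with the hub-removal (intentional attack) experiment it is tracked under, could look like the following (the function names and the Barabási-Albert stand-in network are ours, not the paper's):

    import numpy as np
    import networkx as nx

    def degree_entropy(graph):
        """Shannon entropy of the degree distribution, H = -sum_k p(k) log p(k)."""
        degrees = np.array([d for _, d in graph.degree()])
        _, counts = np.unique(degrees, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    def attack_hubs(graph, fraction=0.05):
        """Remove the top `fraction` highest-degree nodes and return the remaining entropy."""
        g = graph.copy()
        n_remove = int(fraction * g.number_of_nodes())
        hubs = sorted(g.degree(), key=lambda kv: kv[1], reverse=True)[:n_remove]
        g.remove_nodes_from(node for node, _ in hubs)
        return degree_entropy(g)

    # Toy scale-free stand-in for a protein-protein interaction network.
    ppi_like = nx.barabasi_albert_graph(2000, 3, seed=1)
    print("degree entropy before attack:", degree_entropy(ppi_like))
    print("degree entropy after hub removal:", attack_hubs(ppi_like))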

Relevance:

100.00%

Publisher:

Abstract:

Microgauss magnetic fields are observed in all galaxies at low and high redshifts. The origin of these intense magnetic fields is a challenging question in astrophysics. We show here that the natural plasma fluctuations in the primordial Universe (assumed to be random), predicted by the fluctuation-dissipation theorem, yield ~0.034 μG fields over ~0.3 kpc regions in galaxies. If the dipole magnetic fields predicted by the fluctuation-dissipation theorem are not completely random, microgauss fields over regions ≳0.34 kpc are easily obtained. The model is thus a strong candidate for resolving the problem of the origin of magnetic fields in ≲10^9 years in high-redshift galaxies.

Relevance:

100.00%

Publisher:

Abstract:

The landfall of Cyclone Catarina on the Brazilian coast in March 2004 became known as the first documented hurricane in the South Atlantic Ocean, promoting a new view on how large-scale features can contribute to tropical transition. The aim of this paper is to put the large-scale circulation associated with Catarina's transition in climate perspective. This is discussed in the light of a robust pattern of spatial correlations between thermodynamic and dynamic variables of importance for hurricane formation. A discussion of how transition mechanisms respond to the present-day circulation is presented. These associations help in understanding why Catarina formed in a region previously thought to be hurricane-free. Catarina developed over a large-scale area of thermodynamically favourable air/sea temperature contrast. This aspect explains the paradox that such a rare system developed when the sea surface temperature was slightly below average. But, although thermodynamics played an important role, it is apparent that Catarina would not have formed without the key dynamic interplay triggered by a high-latitude blocking. The blocking was associated with an extreme positive phase of the Southern Annular Mode (SAM) both hemispherically and locally, and the nearby area where Catarina developed is found to be more cyclonic during the positive phase of the SAM. A conceptual model is developed and a 'South Atlantic index' is introduced as a useful diagnostic of potential conditions leading to tropical transition in the area, where large-scale indices indicate trends towards more favourable atmospheric conditions for tropical cyclone formation. Copyright (c) 2008 Royal Meteorological Society

Relevance:

100.00%

Publisher:

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain functions are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphics processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model each neuron. Communication among neurons located in different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons receiving random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards holding two GPUs each, when compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
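
The abstract maps one Hodgkin-Huxley neuron to one CUDA thread; as an illustration of the per-neuron work each such thread performs, here is a minimal single-neuron forward-Euler step in plain Python with the standard squid-axon parameters (the paper's actual model variant, integrator and synaptic coupling are not given in the abstract):

    import math

    # Standard Hodgkin-Huxley constants (uF/cm^2, mS/cm^2, mV).
    C_M, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3
    E_NA, E_K, E_L = 50.0, -77.0, -54.4

    def hh_step(v, m, h, n, i_ext, dt=0.01):
        """One forward-Euler step of the four coupled HH ODEs for a single neuron."""
        a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)

        i_ion = (G_NA * m**3 * h * (v - E_NA)
                 + G_K * n**4 * (v - E_K)
                 + G_L * (v - E_L))
        v += dt * (i_ext - i_ion) / C_M
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        return v, m, h, n

    # 50 ms of one neuron driven by a constant current; on the GPU, each CUDA thread
    # would run this update for its own neuron, with synaptic exchange coordinated by the CPU.
    state = (-65.0, 0.05, 0.6, 0.32)
    for _ in range(int(50.0 / 0.01)):
        state = hh_step(*state, i_ext=10.0)
    print("membrane potential after 50 ms: %.1f mV" % state[0])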

Relevance:

100.00%

Publisher:

Abstract:

The complex interactions among endangered ecosystems, landowners' interests, and different models of land tenure and use constitute an important series of challenges for those seeking to maintain and restore biodiversity and augment the flow of ecosystem services. Over the past 10 years, we have developed a data-based approach to address these challenges and to achieve medium- and large-scale ecological restoration of riparian areas on private lands in the state of São Paulo, southeastern Brazil. Given the varying motivations for ecological restoration, the location of riparian areas within landholdings, and the environmental zoning of different riparian areas, best-practice restoration methods were developed for each situation. A total of 32 ongoing projects, covering 527,982 ha, were evaluated in large sugarcane farms and small mixed farms, and six different restoration techniques have been developed to help upscale the effort. Small mixed farms had higher proportions of land requiring protection as riparian areas (13.3%) and lower forest cover in riparian areas (18.3%) than large sugarcane farms (10.0% and 36.9%, respectively). In both types of farms, forest fragments required some degree of restoration. Historical anthropogenic degradation has compromised forest ecosystem structure and functioning, despite their high diversity of native tree and shrub species. Notably, land-use patterns in riparian areas differed markedly. Large sugarcane farms had higher proportions of riparian areas occupied by highly mechanized agriculture, abandoned fields, and anthropogenic wet fields created by siltation in water courses. In contrast, in small mixed-crop farms, low- or non-mechanized agriculture and pasturelands were predominant. Despite these differences, plantations of native tree species covering the entire area were by far the main restoration method needed both by large sugarcane farms (76.0%) and small mixed farms (92.4%), in view of the low resilience of target sites, reduced forest cover, and high fragmentation, all of which limit the potential for autogenic restoration. We propose that plantations should be carried out with a high diversity of native species in order to create biologically viable restored forests and to assist long-term biodiversity persistence at the landscape scale. Finally, we propose strategies to integrate the political, socio-economic and methodological aspects needed to upscale restoration efforts in tropical forest regions throughout Latin America and elsewhere. (C) 2010 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

The nucleus ^46Ti has been studied with the reaction ^42Ca(^7Li,p2n)^46Ti at a bombarding energy of 31 MeV. Thin target foils backed with a thick Au layer were used. Five new levels of negative parity were observed. Several lifetimes have been determined with the Doppler-shift attenuation method. Low-lying experimental negative-parity levels are assigned to three bands with K^π = 3, 0, and 4, which are interpreted in terms of the large-scale shell model, considering particle-hole excitations from the d_3/2 and s_1/2 orbitals. Shell-model calculations were performed using a few effective interactions; however, good agreement was not achieved in the description of either negative- or positive-parity low-lying levels.

Relevance:

100.00%

Publisher:

Abstract:

Large-scale cortical networks exhibit characteristic topological properties that shape communication between brain regions and global cortical dynamics. Analysis of complex networks allows the description of connectedness, distance, clustering, and centrality, which reveal different aspects of how the network's nodes communicate. Here, we focus on a novel analysis of complex walks in a series of mammalian cortical networks that model potential dynamics of information flow between individual brain regions. We introduce two new measures called absorption and driftness. Absorption is the average length of random walks between any two nodes, and takes into account all paths that may diffuse activity throughout the network. Driftness is the ratio between absorption and the corresponding shortest path length. For a given node of the network, we also define four related measurements, namely in- and out-absorption as well as in- and out-driftness, as the averages of the corresponding measures from all nodes to that node, and from that node to all nodes, respectively. We find that the cat thalamo-cortical system incorporates features of two classic network topologies: Erdős-Rényi graphs with respect to in-absorption and in-driftness, and configuration models with respect to out-absorption and out-driftness. Moreover, taken together, these four measures separate the network nodes based on broad functional roles (visual, auditory, somatomotor, and frontolimbic).
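
Using the definitions given in the abstract (absorption = average length of a random walk from one node until it first reaches another; driftness = absorption divided by the corresponding shortest path length), a small Python sketch of how they can be computed on a directed graph is shown below; the fundamental-matrix approach and the toy graph are ours, not necessarily the paper's implementation:

    import numpy as np
    import networkx as nx

    def absorption(graph, source, target):
        """Expected number of random-walk steps from `source` until it first hits `target`,
        obtained by making `target` absorbing and solving (I - Q) t = 1."""
        others = [v for v in graph.nodes() if v != target]
        index = {v: i for i, v in enumerate(others)}
        q = np.zeros((len(others), len(others)))
        for v in others:
            succ = list(graph.successors(v))
            for w in succ:
                if w != target:
                    q[index[v], index[w]] = 1.0 / len(succ)
        t = np.linalg.solve(np.eye(len(others)) - q, np.ones(len(others)))
        return t[index[source]]

    def driftness(graph, source, target):
        """Ratio between absorption and the corresponding shortest path length."""
        return absorption(graph, source, target) / nx.shortest_path_length(graph, source, target)

    # Toy directed network: a cycle with one shortcut.
    g = nx.DiGraph([(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)])
    print(absorption(g, 0, 3), driftness(g, 0, 3))   # 2.5 and 1.25 for this toy graph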

Relevance:

100.00%

Publisher:

Abstract:

Large-scale enzymatic resolution of racemic sulcatol 2 has been useful for stereoselective biocatalysis. The reaction was fast and selective, using vinyl acetate as the acyl group donor and lipase from Candida antarctica (CALB) as the catalyst. The large-scale reaction (5.0 g, 39 mmol) afforded high optical purities for (S)-(+)-sulcatol 2 and (R)-(+)-sulcatyl acetate 3 (ee > 99 per cent) and good yields (45 per cent) within a short time (40 min). Thermodynamic parameters for the chemoesterification of sulcatol 2 by vinyl acetate were evaluated. The enthalpy and Gibbs free energy values of this reaction were negative, indicating that the process is exothermic and spontaneous, which is in agreement with the reaction obtained enzymatically.
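
The spontaneity statement follows from the standard thermodynamic relation (the specific ΔH and ΔG values are reported in the paper, not in the abstract):

    \Delta G = \Delta H - T\,\Delta S, \qquad \Delta G < 0 \Rightarrow \text{spontaneous}, \qquad \Delta H < 0 \Rightarrow \text{exothermic}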

Relevance:

100.00%

Publisher:

Abstract:

The power loss reduction in distribution systems (DSs) is a nonlinear and multiobjective problem. Service restoration in DSs is computationally even harder, since it additionally requires a solution in real time. Both DS problems are computationally complex: for large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the problem solution simpler. On the other hand, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with the NDE (MEAN) results in the proposed approach for solving DS problems in large-scale networks. Simulation results have shown that the MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, the MEAN has shown a sublinear running time as a function of the system size. Tests with networks ranging from 632 to 5166 switches indicate that the MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively little running time.
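
The node-depth encoding mentioned above represents each tree (feeder) of the network forest as the sequence of (node, depth) pairs produced by a depth-first traversal, which is what allows radiality and connectivity constraints to be kept implicitly rather than as explicit equations; a minimal illustrative sketch in Python (our own simplification, not the paper's implementation) is:

    def node_depth_encoding(children, root):
        """Node-depth encoding of the tree rooted at `root`:
        the list of (node, depth) pairs in depth-first order."""
        encoding, stack, seen = [], [(root, 0)], set()
        while stack:
            node, depth = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            encoding.append((node, depth))
            for child in reversed(children.get(node, [])):
                stack.append((child, depth + 1))
        return encoding

    # Toy radial feeder: substation 0 supplying buses 1-5.
    feeder = {0: [1, 2], 1: [3, 4], 2: [5]}
    print(node_depth_encoding(feeder, 0))
    # [(0, 0), (1, 1), (3, 2), (4, 2), (2, 1), (5, 2)]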