43 results for future energy scenario
Abstract:
We discuss the role of dissipation in the explosive spinodal decomposition scenario of hadron production during the chiral transition after a high-energy heavy ion collision. We use a Langevin description inspired by microscopic nonequilibrium field theory results to perform real-time lattice simulations of the behavior of the chiral fields. We show that the effect of dissipation can be dramatic. Analytic results for the short-time dynamics are also presented. (c) 2005 Elsevier B.V. All rights reserved.
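As a rough illustration of the kind of real-time Langevin lattice simulation described above (a minimal one-dimensional sketch under assumed parameters and an assumed quartic potential, not the authors' code), one can evolve a scalar field with dissipation coefficient eta and thermal noise tied to it by the fluctuation-dissipation relation:

```python
import numpy as np

# Sketch of a 1D Langevin lattice evolution (illustrative assumptions only):
#   d^2 phi/dt^2 - laplacian(phi) + eta * dphi/dt + V'(phi) = xi,
# with Gaussian noise obeying <xi xi> = 2 * eta * T (fluctuation-dissipation).
N, dx, dt = 128, 1.0, 0.01
eta, T = 1.0, 1.0                      # assumed dissipation and temperature
phi = np.random.normal(0.0, 0.1, N)    # small fluctuations around phi = 0
pi = np.zeros(N)                       # conjugate momentum, dphi/dt

def dV(phi, m2=-1.0, lam=1.0):
    """Derivative of an assumed quartic potential V = m2/2 phi^2 + lam/4 phi^4;
    the negative m2 mimics the spinodal (unstable) region."""
    return m2 * phi + lam * phi**3

for step in range(10000):
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    xi = np.random.normal(0.0, np.sqrt(2.0 * eta * T / (dx * dt)), N)
    pi += dt * (lap - dV(phi) - eta * pi + xi)   # semi-implicit Euler update
    phi += dt * pi
```

Raising eta in this toy model visibly slows the growth of the unstable (spinodal) modes, which is the qualitative effect of dissipation the abstract refers to.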
Abstract:
There is a plethora of dark energy parametrizations that can fit current type Ia supernova data. However, these data are only sensitive to redshifts up to order one, and many of these parametrizations break down at higher redshifts. In this paper we study the effect of dark energy models on the formation of dark halos. We select a couple of dark energy parametrizations that remain well behaved at high redshifts and compute their effect on the evolution of density perturbations in the linear and non-linear regimes. Using the Press-Schechter formalism we show that they produce distinguishable signatures in the number counts of dark halos. Therefore, future observations of galaxy clusters can provide complementary constraints on the behaviour of dark energy.
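For reference, the standard Press-Schechter comoving number density of halos of mass M used in such number-count studies is

```latex
\frac{dn}{dM} = \sqrt{\frac{2}{\pi}}\,\frac{\bar{\rho}}{M}\,
  \frac{\delta_c}{\sigma^2(M,z)} \left|\frac{d\sigma}{dM}\right|
  \exp\!\left(-\frac{\delta_c^2}{2\,\sigma^2(M,z)}\right)
```

where \(\bar{\rho}\) is the mean matter density, \(\sigma(M,z)\) the rms linear density fluctuation smoothed on mass scale \(M\), and \(\delta_c \simeq 1.686\) the linearly extrapolated collapse threshold. The dark energy model enters through the linear growth factor in \(\sigma(M,z)\) (and, in some treatments, through \(\delta_c\)), which is how different parametrizations imprint distinguishable halo counts.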
Abstract:
The DØ experiment at Fermilab's Tevatron will record several petabytes of data over the next five years in pursuing the goals of understanding nature and searching for the origin of mass. The computing resources required to analyze these data far exceed the capabilities of any one institution. Moreover, the widely scattered geographical distribution of DØ collaborators poses further serious difficulties for the optimal use of human and computing resources. These difficulties will be exacerbated in future high-energy physics experiments such as the LHC. The computing grid has long been recognized as a solution to these problems. This technology is being made a more immediate reality to end users in DØ by developing a grid in the DØ Southern Analysis Region (DOSAR), DOSAR-Grid, using the available resources within it and a home-grown local task manager, McFarm. We present the architecture in which DOSAR-Grid is implemented, the technologies used, the functionality of the grid, and the experience of operating the grid in simulation, reprocessing, and data analysis for a currently running HEP experiment.
Abstract:
Common sense tells us that the future is an essential element in any strategy. In addition, there is a good deal of literature on scenario planning, an important tool for considering the future in strategic terms. However, in many organizations there is serious resistance to the development of scenarios, and they are not broadly implemented by companies. Yet even organizations that do not rely heavily on scenario development do, in fact, construct visions to guide their strategies. What happens, then, when this vision is not consistent with the future? To address this problem, the present article proposes a method for checking the content and consistency of an organization's vision of the future, no matter how it was conceived. The proposed method is grounded in theoretical concepts from the field of futures studies, which are described in this article. This study was motivated by the search for new ways of improving and using scenario techniques in strategic decision making. The method was then tested on a company in the field of information technology to check its operational feasibility. The test showed that the proposed method is operationally feasible and was capable of analyzing the vision of the company being studied, indicating both its shortcomings and its points of inconsistency. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
As follows from the classical analysis, the typical final state of a dark energy universe in which the dominant energy condition is violated is a finite-time, sudden future singularity (a big rip). For a number of dark energy universes (including scalar phantom and effective phantom theories as well as specific quintessence models) we demonstrate that quantum effects play the dominant role near a big rip, driving the universe out of the future singularity (or, at least, moderating it). As a consequence, the entropy bounds with quantum corrections become well defined near a big rip. Similarly, black hole mass loss due to phantom accretion is not as dramatic as was expected: masses do not vanish to zero, owing to the transient character of the phantom evolution stage. Some examples of cosmological evolution for a negative, time-dependent equation of state are also considered, with the same conclusions. The occurrence of negative entropy (or negative temperature) in phantom thermodynamics is briefly discussed.
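As a worked illustration of the classical big-rip behaviour the abstract starts from: for a constant phantom equation of state \(w < -1\) in a flat FRW universe, the scale factor diverges at a finite rip time \(t_s\),

```latex
a(t) \propto (t_s - t)^{\,2/[3(1+w)]}, \qquad w < -1
```

Since the exponent \(2/[3(1+w)]\) is negative for \(w < -1\), the scale factor \(a\), the Hubble rate \(H = \dot{a}/a\), and the energy density \(\rho \propto a^{-3(1+w)}\) all blow up as \(t \to t_s\); the quantum corrections discussed above act precisely in this regime.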
Abstract:
In spatial electric load forecasting, determining future land use is one of the most important tasks, and one of the most difficult, because of the stochastic nature of city growth. This paper proposes a fast and efficient algorithm to determine the future land use of vacant land in the utility service area, using ideas from knowledge extraction and evolutionary algorithms. The methodology was implemented in a full simulation software package for spatial electric load forecasting, showing a high rate of success when the results are compared to information gathered from specialists. The importance of this methodology lies in the reduced set of data needed to perform the task and in its simplicity of implementation, a great advantage for electric utilities without specialized tools for this planning activity. © 2008 IEEE.
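A minimal sketch of the evolutionary idea described above (the encoding, operators, and fitness below are assumptions standing in for the paper's knowledge-extraction rules, which are not reproduced here): each individual assigns a land-use class to every vacant cell, and fitness rewards agreement with neighbourhood-derived rules.

```python
import random

LAND_USES = ["residential", "commercial", "industrial"]  # assumed classes

def fitness(assignment, neighbor_use):
    """Assumed score: fraction of vacant cells whose assigned class matches
    the dominant land use of their developed neighbourhood."""
    return sum(u == neighbor_use[i] for i, u in enumerate(assignment)) / len(assignment)

def evolve(neighbor_use, pop_size=50, generations=200, mut_rate=0.05):
    """Simple generational GA over land-use assignments for vacant cells."""
    n = len(neighbor_use)
    pop = [[random.choice(LAND_USES) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, neighbor_use), reverse=True)
        survivors = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n):                     # per-gene mutation
                if random.random() < mut_rate:
                    child[i] = random.choice(LAND_USES)
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda ind: fitness(ind, neighbor_use))
```

In practice the fitness would encode the extracted growth rules (proximity to roads, load density, zoning) rather than this single neighbourhood-agreement score.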
Abstract:
Transactional memory (TM) is a new synchronization mechanism devised to simplify parallel programming, thereby helping programmers to unleash the power of current multicore processors. Although software implementations of TM (STM) have been extensively analyzed in terms of runtime performance, little attention has been paid to an equally important constraint faced by nearly all computer systems: energy consumption. In this work we conduct a comprehensive study of energy and runtime tradeoffs in software transactional memory systems. We characterize the behavior of three state-of-the-art lock-based STM algorithms, along with three different conflict resolution schemes. As a result of this characterization, we propose a DVFS-based technique that can be integrated into the resolution policies so as to improve the energy-delay product (EDP). Experimental results show that our DVFS-enhanced policies are indeed beneficial for applications with high contention levels. Improvements of up to 59% in EDP can be observed in this scenario, with an average EDP reduction of 16% across the STAMP workloads. © 2012 IEEE.
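A schematic sketch of how a DVFS hook might be woven into an STM conflict-resolution policy (the thresholds, frequencies, and the set_frequency interface are assumptions; the paper's actual policies are not reproduced): when the measured abort ratio is high, drop the core frequency while transactions back off and retry, so contended spinning burns less energy.

```python
import time

LOW_FREQ, HIGH_FREQ = 1.2e9, 3.0e9      # assumed available P-states (Hz)
ABORT_THRESHOLD = 0.3                    # assumed contention trigger

def set_frequency(hz):
    """Placeholder for a platform DVFS interface (e.g. a cpufreq governor
    on Linux); hardware- and OS-specific in practice."""
    pass

class ContentionManager:
    """Toy contention manager: track commit/abort counts and switch
    frequency based on the observed abort ratio (the EDP idea above)."""

    def __init__(self):
        self.commits = 0
        self.aborts = 0

    def abort_ratio(self):
        total = self.commits + self.aborts
        return self.aborts / total if total else 0.0

    def on_conflict(self, backoff_s=1e-6):
        self.aborts += 1
        if self.abort_ratio() > ABORT_THRESHOLD:
            set_frequency(LOW_FREQ)      # slow down while highly contended
        time.sleep(backoff_s)            # back off, then the caller retries

    def on_commit(self):
        self.commits += 1
        if self.abort_ratio() <= ABORT_THRESHOLD:
            set_frequency(HIGH_FREQ)     # restore speed once contention drops
```

The design point is that a slowed core loses little useful throughput while its transactions are mostly aborting, so the EDP improves at high contention, matching the trend the abstract reports.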