36 results for CLUMPY UNIVERSE
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The tunneling approach to the wave function of the Universe has been recently criticized by Bousso and Hawking who claim that it predicts a catastrophic instability of de Sitter space with respect to pair production of black holes. We show that this claim is unfounded. First, we argue that different horizon size regions in de Sitter space cannot be treated as independently created, as they contend. And second, the WKB tunneling wave function is not simply the inverse of the Hartle-Hawking one, except in very special cases. Applied to the related problem of pair production of massive particles, we argue that the tunneling wave function leads to a small constant production rate, and not to a catastrophe as the argument of Bousso and Hawking would suggest.
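For orientation, the de Sitter nucleation probabilities usually quoted for the two wave functions, for vacuum energy density ρ_v, differ only in the sign of the Euclidean action of the S⁴ instanton; the naive "inverse" relation shown below is precisely what the abstract argues holds only in very special cases (standard conventions, not the authors' detailed WKB analysis):

```latex
% Euclidean action of the S^4 (de Sitter) instanton
S_E = -\frac{3}{8 G^2 \rho_v}, \qquad
P_{\rm HH} \propto e^{-S_E} = e^{+3/(8 G^2 \rho_v)}, \qquad
P_{\rm T} \propto e^{+S_E} = e^{-3/(8 G^2 \rho_v)} .
```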
Abstract:
If the effective cosmological constant is nonzero, our observable universe may enter a stage of exponential expansion. In such a case, regions of it may tunnel back to the false vacuum of an inflaton scalar field, and inflation with a high expansion rate may resume in those regions. An ideal eternal observer would then witness an infinite succession of cycles from false vacuum to true, and back. Within each cycle, the entire history of a hot universe would be replayed. If there were several minima of the inflaton potential, our ideal observer would visit each one of these minima with a frequency which depends on the shape of the potential. We generalize the formalism of stochastic inflation to analyze the global structure of the universe when this recycling process is taken into account.
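The stochastic-inflation formalism being generalized here is usually summarized by a Langevin equation for the coarse-grained inflaton, in which quantum fluctuations crossing the horizon act as white noise on the slow-roll motion; a standard schematic form (our notation, not necessarily the paper's) is:

```latex
% slow-roll drift plus horizon-crossing quantum noise
\frac{d\phi}{dt} = -\frac{V'(\phi)}{3H(\phi)} + \frac{H(\phi)^{3/2}}{2\pi}\,\xi(t),
\qquad
\langle \xi(t)\,\xi(t') \rangle = \delta(t - t') .
```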
Abstract:
The atmospheric Cherenkov gamma-ray telescope MAGIC, designed for a low energy threshold, has detected very-high-energy gamma rays from a giant flare of the distant Quasi-Stellar Radio Source (in short: radio quasar) 3C 279, at a distance of more than 5 billion light-years (a redshift of 0.536). No quasar had been observed previously in very-high-energy gamma radiation, and this is also the most distant object detected emitting gamma rays above 50 gigaelectron volts. Because high-energy gamma rays may be stopped by interacting with the diffuse background light in the universe, the observations by MAGIC imply a low density of such light, consistent with that known from galaxy counts.
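The inference from the detected spectrum to the diffuse background light rests on the standard attenuation relation for pair production of gamma rays on background photons; schematically (not specific to the MAGIC analysis):

```latex
% gamma-gamma -> e+ e- absorption along the line of sight
F_{\rm obs}(E) = F_{\rm int}(E)\, e^{-\tau(E, z)} ,
```

where τ(E, z) is the optical depth accumulated out to redshift z = 0.536; a hard observed spectrum bounds τ from above and hence bounds the density of the extragalactic background light.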
Abstract:
We report preliminary findings from the analysis of a database under construction. The paper explores the legislative process in search of some of the alleged consequences of cabinet coalitions in a presidential system. Coalition effects should be less evident in the success of executive initiatives: strategic behavior hampers this intuitive measure of performance. Better measures, because they are less subject to strategic considerations, are the odds of passage of legislators' bills and the time proposals take to be approved. Thus measured, coalition effects are discernible. Analysis of the universe of proposals processed in the fragmented Uruguayan Parliament between 1985 and 2000 reveals that coalition, observed during about half the period, swells the success rates of coalition members by 60% on average (and by as much as 150% for those close to the president). Event history analysis shows that coalitions cut the wait for an executive bill by 3 months, one-sixth of the average wait. The reverse effect is felt on the duration of legislators' bills.
Abstract:
Report on a research stay at the Physics Department of New York University, United States, between 2006 and 2008. One of the most influential observations in modern cosmology has been the empirical determination that the Universe is currently in a phase of Accelerated Expansion (AE). This phenomenon implies that either the Universe is dominated by a new matter/energy sector, or General Relativity ceases to hold on cosmological scales. The first possibility comprises the Dark Energy (DE) models, whose main problem is that DE must have properties so special that they are hard to justify theoretically. The second possibility requires the construction of consistent theories of gravity modified at large distances, which generalize the models of massive gravity. Phenomenological interest in these theories also resurged with the appearance of the first examples of such models, such as the model of Dvali, Gabadadze and Porrati (DGP), which consists of a type of brane in an extra dimension. Unfortunately, this model cannot consistently explain the AE of the Universe. One of the goals of this project has been to establish the internal and phenomenological viability of large-distance modified-gravity models. On the phenomenological side, we focused on the question that matters most in practice: finding observational signatures that allow these models to be distinguished from DE models. At a more theoretical level, we also investigated the meaning of the instabilities of the DGP model. The other major goal we set ourselves was the construction of new theories of this kind. In the second part of the project, we developed the "Cascading DGP" model and demonstrated its consistency; it generalizes the DGP model to more extra dimensions and represents the second known consistent, Lorentz-invariant model in flat space. The existence of modified-gravity models beyond DGP is of great interest, since it could make it possible to obtain the AE of the Universe in a purely geometric way.
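For reference, the DGP setup mentioned above is usually summarized by a bulk-plus-brane action and the resulting modified Friedmann equation; a standard schematic form (textbook conventions, not the report's own equations) is:

```latex
% 5D bulk term plus 4D brane-localized curvature term
S = M_5^3 \int d^5x \sqrt{-g_5}\, R_5 + M_4^2 \int d^4x \sqrt{-g_4}\, R_4,
\qquad r_c = \frac{M_4^2}{2 M_5^3},
% modified Friedmann equation on the brane
H^2 \mp \frac{H}{r_c} = \frac{8\pi G}{3}\,\rho .
```

Here r_c is the crossover scale beyond which gravity weakens; one sign choice gives the self-accelerating branch whose instabilities are referred to above.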
Abstract:
Different methods have been proposed to dynamically provide execution environments for scientific applications that hide the complexity of the underlying distributed and heterogeneous infrastructures. Recently, virtualization has emerged as a promising technology for providing such environments. Virtualization abstracts away the details of physical hardware and provides virtualized resources for high-level scientific applications, offering a cost-effective and flexible way to use and manage computing resources. Such an abstraction is appealing in Grid computing and Cloud computing for better matching jobs (applications) to computational resources. This work applies the virtualization concept to the Condor dynamic resource management system, using the Condor Virtual Universe to harvest the existing virtual computing resources to their maximum utility. It allows existing computing resources to be provisioned dynamically at run time by users, based on application requirements, instead of statically at design time, thereby laying the basis for the efficient use of the available resources.
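As a concrete illustration of submitting work to the Condor Virtual (VM) Universe the abstract refers to, here is a minimal sketch using HTCondor's Python bindings; the submit keys mirror ordinary submit-file commands, and the image path and resource figures are hypothetical:

```python
# Minimal sketch, assuming HTCondor's Python bindings are installed and a
# VM-universe-enabled pool is reachable.
import htcondor

# Submit-description equivalent of a "vm" universe job: Condor boots the
# supplied VMware image on an execute node instead of running an executable.
sub = htcondor.Submit({
    "executable": "vm-worker",                # serves as a label in the vm universe
    "universe": "vm",
    "vm_type": "vmware",
    "vm_memory": "1024",                      # MB of RAM for the guest
    "vmware_dir": "/images/worker-vm",        # hypothetical dir with .vmx/.vmdk files
    "vmware_should_transfer_files": "true",   # ship the image to the execute node
    "log": "vm_job.log",
})

schedd = htcondor.Schedd()
result = schedd.submit(sub)    # recent bindings; older ones queue inside a transaction
print("submitted cluster", result.cluster())
```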
Abstract:
Report for the scientific sojourn carried out at Dartmouth College, from August 2007 until February 2008. It has been very successful, from different viewpoints: scientific, philosophical, human. During the past six months we have definitely advanced towards an understanding of the behaviour of the fluctuations of the quantum vacuum in the presence of boundaries, moving and non-moving, and also in situations where the topology of space-time changes: the dynamical Casimir effect, regularization problems, particle-creation statistics according to different boundary conditions, etc. We have solved some longstanding problems and obtained quite remarkable results in this subject (as we explain in more detail below). We also pursued a general approach towards a viable modified f(R) gravity in both the Jordan and the Einstein frames (which are known to be mathematically equivalent, but not physically so). A class of exponential, realistic modified gravities has been introduced by us and investigated with care. Special focus was placed on step-class models, the most promising from the phenomenological viewpoint, which provide a natural way to classify all viable modified gravities. One- and two-step models were considered, but the analysis is extensible to N-step models. Both inflation in the early universe and the onset of the recent accelerated expansion arise in these models in a natural, unified way, which makes them very promising. Moreover, it is demonstrated in our work that models in this category easily pass all local tests, including stability of the spherical-body solution, non-violation of Newton's law, and generation of a very heavy positive mass for the additional scalar degree of freedom.
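A representative one-step exponential f(R) model of the class described, written here only as a schematic example (the report's own models may differ in form and parameters):

```latex
f(R) = R - 2\Lambda \left( 1 - e^{-R/R_0} \right) .
```

For R ≫ R₀ this reduces to General Relativity with cosmological constant Λ, while the effective constant switches off as R → 0; stacking two such terms with widely separated scales R₀ yields the two-step models that unify early inflation with late-time acceleration.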
Abstract:
Grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational resources. Grid enables access to the resources but does not guarantee any quality of service, nor does it provide performance isolation: the job of one user can influence the performance of another user's job. A further problem is that Grid users belong to the scientific community and their jobs require specific, customized software environments; providing the perfect environment to the user is very difficult in Grid because of its dispersed and heterogeneous nature. Although Cloud computing provides full customization and control, there is no procedure for submitting user jobs as simple as in Grid. Grid computing can provide customized resources and performance to the user by means of virtualization. A virtual machine can join the Grid as an execution node, or it can be submitted as a job with the user's jobs inside. Where the first method gives quality of service and performance isolation, the second additionally provides customization and administration. In this thesis, a solution is proposed to enable virtual machine reuse, providing performance isolation together with customization and administration; the same virtual machine can be used for several jobs. In the proposed solution, customized virtual machines join the Grid pool on user request. Two scenarios are described to achieve this goal. In the first, the user submits a customized virtual machine as a job, and the virtual machine joins the Grid pool when it is powered on. In the second, user-customized virtual machines are preconfigured on the execution system and join the Grid pool on user request. Condor and VMware Server are used to deploy and test the scenarios; Condor supports virtual machine jobs. Scenario 1 is deployed using the Condor VM universe. Scenario 2 uses the VMware VIX API to script powering the remote virtual machines on and off. The experimental results show that, since scenario 2 does not need to transfer the virtual machine image, the image becomes live on the pool much faster. In scenario 1, the virtual machine runs as a Condor job, so it is easy to administer. The only pitfall of scenario 1 is the network traffic.
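For scenario 2, the power-on step can be scripted against a remote VMware Server host; the sketch below uses vmrun, the command-line front end to VIX, rather than the VIX C API used in the thesis, and the host URL, credentials, and .vmx path are all hypothetical:

```python
# Hedged sketch of scenario 2: powering a preconfigured VM on from a script
# so that it joins the Grid pool on user request.
import subprocess

VMX = "[standard] workers/grid-worker-01.vmx"   # hypothetical datastore path

def power(action: str) -> None:
    """Run 'vmrun start' or 'vmrun stop' against a remote VMware Server host."""
    subprocess.run(
        ["vmrun",
         "-T", "server",                        # target a VMware Server host
         "-h", "https://vmhost:8333/sdk",       # hypothetical host URL
         "-u", "admin", "-p", "secret",         # hypothetical credentials
         action, VMX],
        check=True,
    )

power("start")   # the VM boots and its Condor daemons join the Grid pool
```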
Abstract:
A research project that aims to answer the following questions: Does Goya's vision of the female universe, as rendered in his paintings, correspond to the social reality of the women of his time? The eighteenth century is said to have been the century of women, but is that really true? Was it the century of women, or rather the century of some women? The nature of this century is explored through two parallel itineraries: literature on the one hand and the visual arts on the other. Comparing documentary analysis with iconographic data may perhaps provide an answer.
Abstract:
One of the unresolved questions of modern physics is the nature of Dark Matter. Strong experimental evidence suggests that the presence of this elusive component in the energy budget of the Universe is quite significant, without, however, providing conclusive information about its nature. The most plausible scenario is that of weakly interacting massive particles (WIMPs), which includes a large class of non-baryonic Dark Matter candidates with a mass typically between a few tens of GeV and a few TeV, and a cross section of the order of weak interactions. The search for Dark Matter particles with very-high-energy gamma-ray Cherenkov telescopes is based on the premise that WIMPs can self-annihilate, leading to the production of detectable species, such as photons. These photons are very energetic and, since they are undeflected by the Universe's magnetic fields, they can be traced straight back to the source of their creation. The downside of the approach is a great amount of background radiation, coming from conventional astrophysical objects, that usually hides clear signals of Dark Matter particle interactions. That is why a good choice of observational candidates is the crucial factor in the search for Dark Matter. With MAGIC (Major Atmospheric Gamma-ray Imaging Cherenkov Telescopes), a two-telescope ground-based system located on La Palma, Canary Islands, we choose objects like the dwarf spheroidal satellite galaxies of the Milky Way and galaxy clusters for our search. Our idea is to increase the chances of WIMP detection by pointing at objects that are relatively close, with a great amount of Dark Matter and as little pollution from stars as possible. At the moment, several observation projects are ongoing and analyses are being performed.
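The selection criteria at the end (nearby targets, high Dark Matter content, little stellar background) map directly onto the standard parameterization of the expected annihilation flux, which factorizes into a particle-physics term and an astrophysical "J-factor"; schematically:

```latex
% particle-physics factor times line-of-sight integral of the squared density
\frac{d\Phi}{dE} =
\frac{\langle \sigma v \rangle}{8\pi m_\chi^2}\,\frac{dN_\gamma}{dE}
\times
\int_{\Delta\Omega} d\Omega \int_{\rm l.o.s.} \rho^2(l)\, dl ,
```

where ⟨σv⟩ is the annihilation cross section, m_χ the WIMP mass, dN_γ/dE the photon spectrum per annihilation, and the line-of-sight integral of ρ² (the J-factor) is what makes dense, nearby targets such as dwarf spheroidals attractive.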
Abstract:
We study how to promote compliance with rules in everyday situations. Having access to unique data on the universe of users of all public libraries in Barcelona, we test the effect of sending email messages with different contents. We find that users return their items earlier if asked to do so in a simple email. Emails reminding users of the penalties associated with late returns are more effective than emails with only a generic reminder. We find differential treatment effects by user types. The characteristics we analyze are previous compliance, gender, age, and nationality.
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to that of an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent reliance on fixing some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least lack any consensus on their validity.

How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model is the cause of performance, and to compare it against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and the analysis of results are less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, and the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
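A toy sketch of the week-alternating assignment described above; the flight numbers, system names, and the exact alternation scheme are illustrative assumptions, not Iberia's actual setup:

```python
# Toy schedule: treatment flights alternate between competing RM systems on
# adjacent weeks, while control flights stay under the incumbent throughout.
import itertools

treatment_flights = ["IB0001", "IB0002"]        # hypothetical flight numbers
control_flights = ["IB0003"]                    # incumbent manual/algorithmic mix
systems = itertools.cycle(["algorithm_A", "algorithm_B"])

schedule = {}                                   # (flight, week) -> controlling system
for week in range(1, 40):                       # roughly the 9-month test window
    system = next(systems)
    for f in treatment_flights:
        schedule[(f, week)] = system
    for f in control_flights:
        schedule[(f, week)] = "incumbent"
```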
Abstract:
We consider the classical stochastic fluctuations of spacetime geometry induced by quantum fluctuations of massless nonconformal matter fields in the early Universe. To this end, we supplement the stress-energy tensor of these fields with a stochastic part, which is computed along the lines of the Feynman-Vernon and Schwinger-Keldysh techniques; the Einstein equation is therefore upgraded to a so-called Einstein-Langevin equation. We consider in some detail the conformal fluctuations of flat spacetime and the fluctuations of the scale factor in a simple cosmological model introduced by Hartle, which consists of a spatially flat isotropic cosmology driven by radiation and dust.
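The upgrade described, from the semiclassical Einstein equation to an Einstein-Langevin equation, is usually written by adding a Gaussian stochastic source with vanishing mean whose correlator is a noise kernel; schematically, in the standard notation of stochastic semiclassical gravity:

```latex
% stress-energy expectation value supplemented by a stochastic part \xi_{ab}
G_{ab}[g+h] = 8\pi G \left( \langle T_{ab} \rangle[g+h] + \xi_{ab}[g] \right),
\qquad
\langle \xi_{ab} \rangle = 0, \quad
\langle \xi_{ab}(x)\,\xi_{cd}(y) \rangle = N_{abcd}(x, y),
```

where the noise kernel N_{abcd} is built from the symmetrized two-point function of the stress-tensor fluctuations, as obtained from the Feynman-Vernon influence functional.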
Abstract:
A semiclassical cosmological model is considered which consists of a closed Friedmann-Robertson-Walker spacetime in the presence of a cosmological constant, which mimics the effect of an inflaton field, and a massless, non-conformally coupled quantum scalar field. We show that the back-reaction of the quantum field, which consists basically of a nonlocal term due to gravitational particle creation and a noise term induced by the quantum fluctuations of the field, is able to drive the cosmological scale factor over the barrier of the classical potential, so that if the universe starts near zero scale factor (initial singularity) it can make the transition to an exponentially expanding de Sitter phase. We compute the probability of this transition, and it turns out to be comparable with the probability that the universe tunnels from "nothing" into an inflationary stage in quantum cosmology. This suggests that in the presence of matter fields the back-reaction on the spacetime should not be neglected in quantum cosmology.
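In minisuperspace language, the classical barrier in question is the familiar potential for the scale factor of a closed FRW universe with cosmological constant Λ = 3H²; in schematic form, with numerical prefactors absorbed:

```latex
U(a) = a^2 \left( 1 - H^2 a^2 \right),
```

which is positive for 0 < a < 1/H. The tunneling from "nothing" mentioned at the end is the quantum passage through this barrier into the de Sitter regime a ≥ 1/H, and the claim above is that back-reaction can carry the scale factor over it with comparable probability.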
Abstract:
An inflating brane world can be created from "nothing" together with its anti-de Sitter (AdS) bulk. The resulting space-time has compact spatial sections bounded by the brane. During inflation, the continuum of KK modes is separated from the massless zero mode by the gap m=(3/2)H, where H is the Hubble rate. We consider the analogue of the Nariai solution and argue that it describes the pair production of "black cigars" attached to the inflating brane. When the size of the instantons is much larger than the AdS radius, the 5-dimensional action agrees with the 4-dimensional one; hence the 5D and 4D gravitational entropies are the same in this limit. We also consider thermal instantons with an AdS black hole in the bulk. These may be interpreted as describing the creation of a hot universe from nothing, or the production of AdS black holes in the vicinity of a pre-existing inflating brane world. The Lorentzian evolution of the brane world after creation is briefly discussed. An additional "integration constant" in the Friedmann equation, accompanying a term which dilutes like radiation, describes the tidal force in the fifth direction and arises from the mass of a spherical object inside the bulk. In general, this could be a 5-dimensional black hole or a "parallel" brane world of negative tension, concentric with our brane world. In the case of thermal solutions, and in the spirit of the AdS/CFT correspondence, one may attribute the additional term to thermal radiation in the boundary theory. Then, for temperatures well below the AdS scale, the entropy of this radiation agrees with the entropy of the black hole in the AdS bulk.
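The "integration constant" mentioned above enters the brane Friedmann equation as a dark-radiation term; a standard schematic form at energy densities well below the brane tension (the usual ρ² correction is omitted, and μ is set by the mass of the bulk object) is:

```latex
% Friedmann equation on the brane with the dark-radiation term \mu / a^4
H^2 = \frac{\Lambda_4}{3} + \frac{8\pi G_4}{3}\,\rho + \frac{\mu}{a^4} .
```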