987 results for dark energy experiments
Abstract:
A substantial fraction of the volume of the Universe is dominated by almost empty space. Alongside the luminous filamentary structures of the cosmic web, there are vast and smooth regions that have remained outside the cosmological spotlight during the past decades: cosmic voids. Although essentially devoid of matter, voids encode fundamental information about the cosmological framework and have gradually become an effective and competitive cosmological probe. In this Thesis work we present key results on the cosmological exploitation of voids. We focused on the number density of voids as a function of their radius, known as the void size function, developing an effective pipeline for its cosmological use. We proposed a new parametrisation of the most widely used theoretical void size function to model voids identified in the distribution of biased tracers (i.e. dark matter haloes, galaxies and galaxy clusters), a step of fundamental importance for extending the analysis to real survey data. We then applied this methodology to study voids in alternative cosmological scenarios. First, we exploited voids to break the degeneracies between cosmological scenarios characterised by modified gravity and the inclusion of massive neutrinos. Second, we analysed voids from the perspective of the Euclid survey, focusing on the constraining power of void abundances for dynamical dark energy models with massive neutrinos. Moreover, we explored other void statistics, such as void density profiles and void clustering (i.e. the void-galaxy and void-void correlations), providing cosmological forecasts for the Euclid mission. Finally, we focused on probe combination, highlighting the strong potential of jointly analysing multiple void statistics and of combining the void size function with other cosmological probes. Our results show the key role of void analysis in constraining the fundamental parameters of the cosmological model and pave the way for future studies on this topic.
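The central statistic named above, the void size function, can be illustrated with a minimal sketch; the Python snippet below (purely illustrative numbers and names, not the thesis pipeline) simply histograms a catalogue of void radii into dn/dlnR for a periodic box.

    # Sketch (not the thesis pipeline): measure dn/dlnR from a void catalogue.
    import numpy as np

    def void_size_function(radii, box_size, bin_edges):
        """Number density of voids per logarithmic radius bin, dn/dlnR."""
        volume = box_size**3
        counts, _ = np.histogram(np.log(radii), bins=np.log(bin_edges))
        dlnr = np.diff(np.log(bin_edges))
        centres = np.sqrt(bin_edges[:-1] * bin_edges[1:])  # geometric bin centres
        return centres, counts / (volume * dlnr)

    # toy usage with synthetic radii (illustrative numbers, units of Mpc/h)
    rng = np.random.default_rng(0)
    radii = rng.lognormal(mean=np.log(15.0), sigma=0.3, size=5000)
    edges = np.logspace(np.log10(5.0), np.log10(60.0), 15)
    centres, dndlnr = void_size_function(radii, box_size=1000.0, bin_edges=edges)
    print(centres, dndlnr)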
Abstract:
Cosmic voids are vast and underdense regions emerging between the elements of the cosmic web and dominating the large-scale structure of the Universe. Void number counts and density profiles have been demonstrated to provide powerful cosmological probes. Indeed, thanks to their low-density nature and their very large sizes, voids represent natural laboratories to test alternative dark energy scenarios, modifications of gravity and the presence of massive neutrinos. Despite the increasing use of cosmic voids in Cosmology, a commonly accepted definition for these objects has not yet been reached. For this reason, different void-finding algorithms have been proposed over the years. Void finders based on density or geometrical criteria are affected by intrinsic uncertainties, and in recent years new solutions have been explored to address these issues. The most interesting is based on the idea of identifying void positions through the dynamics of the mass tracers, without performing any direct reconstruction of the density field. The goal of this Thesis is to provide a performant void finder based on dynamical criteria. The Back-in-time void finder (BitVF) we present uses tracers as test particles, and their orbits are reconstructed from their actual clustered configuration back to the homogeneous and isotropic distribution expected in the early Universe. Once the displacement field is reconstructed, the density field is computed from its divergence, and void centres are identified as local minima of this field. In this Thesis work we applied the developed void-finding algorithm to simulations. From the resulting void samples we computed different void statistics, comparing the results to those obtained with VIDE, the most popular void finder. BitVF proved able to produce more reliable void samples than VIDE. The BitVF algorithm will be a fundamental tool for precision cosmology, especially with upcoming galaxy surveys.
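As a rough illustration of the dynamical idea described above, and not the actual BitVF implementation, the sketch below takes a displacement field already reconstructed on a regular grid, estimates a density contrast from its divergence (delta ≈ -div Psi at linear order), and marks local minima as candidate void centres; all array shapes and numbers are illustrative assumptions.

    # Sketch of the dynamical idea only, not the BitVF code: density from the
    # divergence of a displacement field Psi on a grid, voids at local minima.
    import numpy as np
    from scipy.ndimage import minimum_filter

    def voids_from_displacement(psi_x, psi_y, psi_z, cell_size):
        """psi_* are 3D arrays of displacement components on a regular grid."""
        # at linear (Zel'dovich-like) order, delta ~ -div(Psi)
        delta = -(np.gradient(psi_x, cell_size, axis=0)
                  + np.gradient(psi_y, cell_size, axis=1)
                  + np.gradient(psi_z, cell_size, axis=2))
        # candidate void centres: cells that are local minima of delta
        is_minimum = delta == minimum_filter(delta, size=3, mode='wrap')
        return delta, np.argwhere(is_minimum)

    # toy usage on a random field (illustrative only)
    rng = np.random.default_rng(1)
    psi = [rng.normal(size=(32, 32, 32)) for _ in range(3)]
    delta, centres = voids_from_displacement(*psi, cell_size=2.0)
    print(len(centres), "candidate void centres")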
Abstract:
The ΛCDM model is the simplest, yet so far most successful, cosmological model for describing the evolution of the Universe. It is based on Einstein's theory of General Relativity and explains the accelerated expansion of the Universe by introducing the cosmological constant Λ, which represents the contribution of so-called dark energy, an entity about which very little is known with certainty. Alternative theoretical models have nevertheless been proposed to describe the effects of this mysterious quantity, for instance by introducing additional degrees of freedom, as in Horndeski theory. The main goal of this thesis is to study these models with the tensor computer algebra system xAct. In particular, our aim is to implement a universal procedure that allows the equations of motion and the time evolution of any generic model to be derived starting from its action.
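The thesis relies on the Mathematica package xAct; as a loose analogue of the generic step "from the action to the equations of motion", the sketch below uses Python/SymPy (the language adopted for all code sketches in this document) to derive the equation of motion of a toy homogeneous scalar field, not a Horndeski model.

    # Toy analogue in SymPy of "action -> equation of motion" (the thesis uses xAct).
    import sympy as sp
    from sympy.calculus.euler import euler_equations

    t, m = sp.symbols('t m', positive=True)
    phi = sp.Function('phi')(t)

    # Lagrangian of a free, homogeneous, massive scalar field (toy model)
    L = sp.Rational(1, 2) * sp.diff(phi, t)**2 - sp.Rational(1, 2) * m**2 * phi**2

    # Euler-Lagrange equation derived automatically from the Lagrangian
    print(euler_equations(L, [phi], [t]))  # [Eq(-m**2*phi(t) - Derivative(phi(t), (t, 2)), 0)]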
Abstract:
The high sensitivity and excellent timing accuracy of Geiger-mode avalanche photodiodes make them ideal sensors for pixel detectors performing particle tracking in high-energy physics experiments at future linear colliders. Nevertheless, it is well known that these sensors suffer from dark counts and afterpulsing noise, which induce false hits (indistinguishable from event detection) as well as an increase of the required area of the readout system. In this work, we present a comparison between APDs fabricated in a high-voltage 0.35 µm and in a high-integration 0.13 µm commercially available CMOS technology, performed to determine which of them best fits the particle-collider requirements. In addition, a readout circuit that allows low-noise operation is introduced. An experimental characterization of the proposed pixel is also presented in this work.
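To make the impact of dark counts concrete, the back-of-the-envelope sketch below estimates the probability of a false hit inside a gate window from an assumed dark count rate via Poisson statistics; the rates and window length are illustrative assumptions, not measured values for these devices.

    # Illustrative only: probability of a false hit from dark counts in a gate.
    import numpy as np

    def false_hit_probability(dark_count_rate_hz, gate_window_s):
        # Poisson statistics: P(at least one dark count in the window)
        return 1.0 - np.exp(-dark_count_rate_hz * gate_window_s)

    for dcr in (1e3, 1e4, 1e5):  # Hz, assumed values
        print(dcr, false_hit_probability(dcr, gate_window_s=100e-9))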
Abstract:
Different treatments that could be implemented in the home environment are evaluated with the objective of reaching a more rational and efficient use of energy. We consider that a detailed knowledge of energy-consuming behaviour is paramount for the development and implementation of new technologies, services and even policies that could result in more rational energy use. The proposed evaluation methodology is based on the development of economic experiments implemented in an experimental economics laboratory, where the behaviour of individuals when making decisions related to energy use in the domestic environment can be tested.
Search for dark matter and large extra dimensions in monojet events in pp collisions at √s = 7 TeV
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
We present the results and discussion of a study of a possible suppression of the extragalactic neutrino flux during its propagation, due to a nonstandard interaction with a field that is a candidate for dark matter. In particular, we study the interaction of neutrinos with an ultra-light scalar field. It is shown that the extragalactic neutrino flux may be suppressed by such an interaction, providing a mechanism to reduce the ultra-high-energy neutrino flux. We calculate both the case of non-self-conjugate and that of self-conjugate ultra-light dark matter. In the first case, the suppression is independent of the neutrino and dark matter masses. We conclude that care must be taken when explaining limits on the neutrino flux through source acceleration mechanisms alone, since there could be other mechanisms, such as absorption during propagation, that reduce the neutrino flux [1].
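As a purely schematic illustration of absorption during propagation (not the paper's calculation), the sketch below attenuates a flux by a simple optical depth tau = n_DM * sigma * L; every number in it is an illustrative assumption.

    # Schematic only: attenuation of a flux by an optical depth tau = n * sigma * L.
    import numpy as np

    def attenuated_flux(flux_emitted, n_dm_per_cm3, sigma_cm2, path_length_cm):
        tau = n_dm_per_cm3 * sigma_cm2 * path_length_cm
        return flux_emitted * np.exp(-tau)

    # toy numbers chosen only to give tau of order one
    mpc_in_cm = 3.086e24
    print(attenuated_flux(1.0, n_dm_per_cm3=1e3, sigma_cm2=3e-30,
                          path_length_cm=100 * mpc_in_cm))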
Abstract:
Although the Standard Model of particle physics (SM) provides an extremely successful description of ordinary matter, astronomical observations show that it accounts for only around 5% of the total energy density of the Universe, whereas around 30% is contributed by dark matter. Motivated by anomalies in cosmic-ray observations and by attempts to resolve open questions of the SM such as the (g-2)_mu discrepancy, proposed U(1) extensions of the SM gauge group have attracted attention in recent years. In the considered U(1) extensions a new, light messenger particle, the hidden photon, couples to the hidden sector as well as to the electromagnetic current of the SM through kinetic mixing. This allows for a search for this particle in laboratory experiments exploring the electromagnetic interaction. Various experimental programs have been started to search for hidden photons, for instance in electron-scattering experiments, which are a versatile tool to explore a variety of physics phenomena. One approach is the dedicated search in fixed-target experiments at modest energies, as performed at MAMI or at JLAB. In these experiments the scattering of an electron beam off a hadronic target, e + (A,Z) -> e + (A,Z) + l^+ l^-, is investigated and a search for a very narrow resonance in the invariant-mass distribution of the lepton pair is performed. This requires an accurate understanding of the theoretical basis of the underlying processes. To this end, the first part of this work demonstrates how the hidden photon can be motivated from existing puzzles encountered at the precision frontier of the SM. The main part of this thesis deals with the theoretical framework for electron-scattering fixed-target experiments searching for hidden photons. As a first step, the cross section for the bremsstrahlung emission of hidden photons in such experiments is studied. Based on these results, the applicability of the Weizsäcker-Williams approximation, which is widely used in the design of such experimental setups, to the calculation of the signal cross section is investigated. In a next step, the reaction e + (A,Z) -> e + (A,Z) + l^+ l^- is analyzed as signal and background process in order to describe existing data obtained by the A1 experiment at MAMI, with the aim of giving accurate predictions of exclusion limits for the hidden-photon parameter space. The derived methods are then used to make predictions for future experiments, e.g. at MESA or at JLAB, allowing for a comprehensive study of the discovery potential of these complementary experiments. In the last part, a feasibility study for probing the hidden-photon model with rare kaon decays is performed. For this purpose, invisible as well as visible decays of the hidden photon are considered within different classes of models. This allows one to derive bounds on the parameter space from existing data and to estimate the reach of future experiments.
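The generic analysis step named above, searching for a very narrow resonance in an invariant-mass distribution, can be illustrated with a toy bump hunt; the sketch below (not the A1 analysis code, with all numbers invented for illustration) builds a smooth spectrum with a narrow peak and scans it with a naive S/sqrt(B) estimate using sidebands.

    # Toy bump hunt, not the A1 analysis: scan an invariant-mass spectrum for a
    # narrow resonance using a naive S/sqrt(B) with a sideband background estimate.
    import numpy as np

    def local_significance(counts, lo, hi, sideband):
        observed = counts[lo:hi].sum()
        background = 0.5 * (counts[lo - sideband:lo].sum() + counts[hi:hi + sideband].sum())
        return (observed - background) / np.sqrt(background) if background > 0 else 0.0

    # toy spectrum: smooth falling background plus a small narrow peak near 0.17 GeV
    rng = np.random.default_rng(2)
    edges = np.linspace(0.05, 0.30, 251)  # GeV, illustrative range and binning
    background = rng.exponential(scale=0.05, size=200000)
    signal = rng.normal(loc=0.17, scale=0.001, size=300)
    counts, _ = np.histogram(np.concatenate([background, signal]), bins=edges)

    width = 3  # window width in bins
    best = max((local_significance(counts, i, i + width, width), i)
               for i in range(width, len(counts) - 2 * width))
    print("largest local excess: %.1f sigma near %.3f GeV"
          % (best[0], 0.5 * (edges[best[1]] + edges[best[1] + width])))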
Abstract:
The experiments at the Large Hadron Collider at the European Centre for Particle Physics, CERN, rely on efficient and reliable trigger systems for singling out interesting events. This thesis documents two online timing-monitoring tools for the central trigger of the ATLAS experiment, as well as the adaptation of the central trigger simulation as part of the upgrade for the second LHC run. Moreover, a search for candidates for so-called Dark Matter, for which there is ample cosmological evidence, is presented. This search for generic weakly interacting massive particles (WIMPs) is based on roughly 20 fb⁻¹ of proton-proton collisions at a centre-of-mass energy of √s = 8 TeV recorded with the ATLAS detector in 2012. The considered signature is events with a highly energetic jet and large missing transverse energy. No significant deviation from the theory prediction is observed. Exclusion limits are derived on the parameters of different signal models and compared to the results of other experiments. Finally, the results of a simulation study on the potential of the analysis at √s = 14 TeV are presented.
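As a schematic illustration of how an exclusion limit can be derived from a counting experiment (this is not the ATLAS statistical machinery), the sketch below finds the largest signal yield still compatible at 95% CL with an observed event count and an assumed background expectation; the numbers are toy values.

    # Schematic counting-experiment limit, not the ATLAS statistical procedure.
    import numpy as np
    from scipy.stats import poisson

    def upper_limit_95(n_obs, background, s_max=1000.0):
        """Largest signal yield still compatible with the data at 95% CL."""
        s_grid = np.arange(0.0, s_max, 0.01)
        compatible = poisson.cdf(n_obs, background + s_grid) >= 0.05
        return s_grid[compatible].max()

    # toy numbers, purely illustrative of the counting method
    print(upper_limit_95(n_obs=520, background=500.0))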
Abstract:
We consider an effective field theory for a gauge singlet Dirac dark matter particle interacting with the standard model fields via effective operators suppressed by the scale Λ≳1 TeV. We perform a systematic analysis of the leading loop contributions to spin-independent Dirac dark matter–nucleon scattering using renormalization group evolution between Λ and the low-energy scale probed by direct detection experiments. We find that electroweak interactions induce operator mixings such that operators that are naively velocity suppressed and spin dependent can actually contribute to spin-independent scattering. This allows us to put novel constraints on Wilson coefficients that were so far poorly bounded by direct detection. Constraints from current searches are already significantly stronger than LHC bounds, and will improve in the near future. Interestingly, the loop contribution we find is isospin violating even if the underlying theory is isospin conserving.
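A toy numerical illustration of the operator-mixing effect described above (with an invented anomalous-dimension matrix, not the paper's result) is sketched below: running two Wilson coefficients from Λ down to a low scale with dC/dln(mu) = G·C generates a coefficient at low energies even though it vanishes at Λ.

    # Toy anomalous-dimension matrix, invented numbers: leading-log running of two
    # Wilson coefficients, dC/dln(mu) = G C, from Lambda down to a low scale mu.
    import numpy as np
    from scipy.linalg import expm

    Lambda, mu_low = 1000.0, 1.0        # GeV, illustrative scales
    G = np.array([[0.02, 0.00],
                  [0.05, 0.01]])        # lower-left entry: C1 feeds into C2 (mixing)
    C_at_Lambda = np.array([1.0, 0.0])  # only C1 is generated by the UV theory

    t = np.log(mu_low / Lambda)         # ln(mu/Lambda) < 0 when running down
    C_at_mu = expm(G * t) @ C_at_Lambda
    print(C_at_mu)                      # C2 is now non-zero, purely through mixing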
Abstract:
Ocean acidification is expected to lower the net accretion of coral reefs, yet little is known about its effect on coral photophysiology. This study investigated the effect of increasing CO2 on photosynthetic capacity and photoprotection in Acropora formosa. The photoprotective role of photorespiration within dinoflagellates (genus Symbiodinium) has largely been overlooked due to the focus on the presence of a carbon-concentrating mechanism, despite the evolutionary persistence of a Form II Rubisco. The photorespiratory fixation of oxygen produces phosphoglycolate, which would otherwise inhibit carbon fixation through the Calvin cycle if it were not converted to glycolate by phosphoglycolate phosphatase (PGPase). Glycolate is then either excreted or processed by enzymes in the photorespiratory glycolate and/or glycerate pathways, adding to the pool of carbon fixed in photosynthesis. We found that CO2 enrichment led to enhanced photoacclimation (increased chlorophyll a per cell) to the subsaturating light levels. Light-enhanced dark respiration per cell and xanthophyll de-epoxidation increased, with resultant decreases in photosynthetic capacity (Pnmax) per chlorophyll. The conservative CO2 emission scenario (A1B; 600-790 ppm) led to a 38% increase in Pnmax per cell, whereas the 'business-as-usual' scenario (A1FI; 1160-1500 ppm) led to a 45% reduction in PGPase expression and no change in Pnmax per cell. These findings support an important functional role for PGPase in dinoflagellates that is potentially compromised under CO2 enrichment.
Abstract:
Investigations of antimatter allow us to shed light on fundamental issues of contemporary physics. The only antiatom presently available, antihydrogen, is produced making use of the Antiproton Decelerator (AD) facility at CERN. The international collaborations currently operating there (ALPHA, ASACUSA and ATRAP) have succeeded in producing antihydrogen and are now working on its confinement and manipulation. The AEGIS experiment is currently completing the commissioning of the apparatus that will generate and manipulate antiatoms. The present paper, after a report on the main results achieved in antihydrogen physics, gives an overview of the AEGIS experiment, describes its current status and discusses its first goal.
Abstract:
Laser tweezers and atomic force microscopes are increasingly used to probe the interactions and mechanical properties of individual molecules. Unfortunately, using such time-dependent perturbations to force rare molecular events also drives the system away from equilibrium. Nevertheless, we show how equilibrium free energy profiles can be extracted rigorously from repeated nonequilibrium force measurements on the basis of an extension of Jarzynski's remarkable identity between free energies and the irreversible work.
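The identity referred to above can be written as ΔF = -kT ln⟨exp(-W/kT)⟩ over repeated work measurements; the short sketch below (toy Gaussian work values, illustrative units) shows how a free energy difference is recovered from nonequilibrium work data with this estimator.

    # Sketch of the Jarzynski estimator: Delta F = -kT ln < exp(-W/kT) >.
    import numpy as np

    def jarzynski_free_energy(work_values, kT):
        """Free energy difference from repeated nonequilibrium work measurements."""
        w = np.asarray(work_values) / kT
        log_mean_exp = np.logaddexp.reduce(-w) - np.log(len(w))  # stable log<exp(-w)>
        return -kT * log_mean_exp

    # toy Gaussian work distribution with <W> = dF + sigma^2/(2 kT), illustrative units
    rng = np.random.default_rng(3)
    kT, dF, sigma = 4.1, 10.0, 2.0
    W = rng.normal(loc=dF + sigma**2 / (2 * kT), scale=sigma, size=100000)
    print(jarzynski_free_energy(W, kT))  # close to dF = 10.0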