28 results for particle physics - cosmology connection
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
The discovery of neutrino oscillations provides solid evidence for nonzero neutrino masses and leptonic mixing. The fact that neutrino masses are so tiny constitutes a puzzling problem in particle physics. From the theoretical viewpoint, the smallness of neutrino masses can be elegantly explained through the seesaw mechanism. Another challenging issue for particle physics and cosmology is the explanation of the matter-antimatter asymmetry observed in Nature. Among the viable mechanisms, leptogenesis is a simple and well-motivated framework. In this paper we briefly review these aspects, with emphasis on the possibility of linking neutrino physics to the cosmological baryon asymmetry generated through leptogenesis.
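For orientation (a standard relation, assumed here to be the type-I seesaw; the abstract does not spell out which variant), the seesaw suppression of light neutrino masses by a heavy scale M_R reads:

```latex
% Schematic type-I seesaw relation (illustrative, not taken from the paper):
% light neutrino masses from Dirac masses m_D and heavy Majorana masses M_R.
\[
  m_\nu \;\simeq\; -\, m_D \, M_R^{-1} \, m_D^{T},
  \qquad
  m_\nu \sim \frac{y^2 v^2}{M_R} \approx 0.05\ \text{eV}
  \quad\text{for}\quad y \sim 1,\ v \simeq 174\ \text{GeV},\ M_R \sim 6\times 10^{14}\ \text{GeV}.
\]
```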
Abstract:
The main properties of strangelets, namely their energy per baryon, radius and electric charge, are studied in the unpaired magnetized strange quark matter (MSQM) and paired magnetized colour flavour locked (MCFL) phases. Temperature effects are taken into account in order to study their stability, within the framework of the MIT bag model, relative to the Fe-56 isotope and to nonmagnetized strangelets. We conclude that the presence of a magnetic field tends to further stabilize the strangelets, even when temperature is considered. It is also shown that MCFL strangelets are more stable than ordinary MSQM strangelets for typical gap values of the order of 100 MeV. A distinctive feature in the detection of strangelets, either in cosmic rays or in heavy-ion collider experiments, could be their electric charge. We find that the electric charge is modified in the presence of the magnetic field, leading to higher (lower) charge values for MSQM (MCFL) strangelets, when compared to the nonmagnetized case.
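For orientation, the stability comparison with Fe-56 mentioned above amounts to an energy-per-baryon criterion; schematically (standard reference value, not a number quoted from the paper):

```latex
% Schematic strangelet stability criterion against the most bound nucleus, Fe-56:
\[
  \frac{E}{A}\bigg|_{\text{strangelet}} \;<\; \frac{M(^{56}\mathrm{Fe})}{56} \;\approx\; 930\ \text{MeV},
\]
% a lower E/A (e.g. from magnetic-field or pairing contributions) means greater stability.
```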
Abstract:
We consider a simple extension of the Standard Model by adding two Higgs triplets and a complex scalar singlet to its particle content. In this framework, the CP symmetry is spontaneously broken at high energies by the complex vacuum expectation value of the scalar singlet. Such a breaking leads to leptonic CP violation at low energies. The model also exhibits an A_4 × Z_4 flavor symmetry which, after being spontaneously broken at a high-energy scale, yields a tribimaximal pattern in the lepton sector. We consider small perturbations around the tribimaximal vacuum alignment condition in order to generate nonzero values of θ_13, as required by the latest neutrino oscillation data. It is shown that the value of θ_13 recently measured by the Daya Bay Reactor Neutrino Experiment can be accommodated in our framework together with large Dirac-type CP violation. We also address the viability of leptogenesis in our model through the out-of-equilibrium decays of the Higgs triplets. In particular, the CP asymmetries in the triplet decays into two leptons are computed and it is shown that the effective leptogenesis and low-energy CP-violating phases are directly linked.
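For context, the tribimaximal pattern referred to above predicts θ_13 = 0 (standard textbook form, not reproduced from the paper), which is why perturbations of the vacuum alignment are needed to match the Daya Bay measurement:

```latex
% Tribimaximal (TBM) lepton mixing matrix; sign conventions vary.
\[
U_{\rm TBM} =
\begin{pmatrix}
 \sqrt{2/3}  & 1/\sqrt{3} & 0 \\
 -1/\sqrt{6} & 1/\sqrt{3} & -1/\sqrt{2} \\
 -1/\sqrt{6} & 1/\sqrt{3} & 1/\sqrt{2}
\end{pmatrix}
\quad\Longrightarrow\quad
\sin^2\theta_{12}=\tfrac{1}{3},\quad \sin^2\theta_{23}=\tfrac{1}{2},\quad \theta_{13}=0 .
\]
```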
Abstract:
We study a model consisting of particles with dissimilar bonding sites ("patches"), which exhibits self-assembly into chains connected by Y-junctions, and investigate its phase behaviour by both simulations and theory. We show that, as the energy cost ε_j of forming Y-junctions increases, the extent of the liquid-vapour coexistence region at lower temperatures and densities is reduced. The phase diagram thus acquires a characteristic "pinched" shape in which the liquid branch density decreases as the temperature is lowered. To our knowledge, this is the first model in which the predicted topological phase transition between a fluid composed of short chains and a fluid rich in Y-junctions is actually observed. Above a certain threshold for ε_j, condensation ceases to exist because the entropy gain of forming Y-junctions can no longer offset their energy cost. We also show that the properties of these phase diagrams can be understood in terms of a temperature-dependent effective valence of the patchy particles. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3605703]
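A schematic reading of the energy-entropy argument above (our shorthand, not the paper's notation): Y-junction-driven condensation survives only while the free-energy cost of a junction is negative,

```latex
% Schematic free-energy balance for forming a Y-junction out of chain ends:
\[
  \Delta f_{\rm j} \;\simeq\; \epsilon_{\rm j} - T\,\Delta s_{\rm j} \;<\; 0 ,
\]
% so once \epsilon_{\rm j} \gtrsim T\,\Delta s_{\rm j} the junction-rich liquid
% is no longer favoured and condensation disappears.
```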
Abstract:
We investigate the phase behaviour of 2D mixtures of bi-functional and tri-functional patchy particles and 3D mixtures of bi-functional and tetra-functional patchy particles by means of Monte Carlo simulations and Wertheim theory. We start by computing the critical points of the pure systems and then investigate how the critical parameters change upon lowering the temperature. We extend the successive umbrella sampling method to mixtures, making it possible to extract information about the phase behaviour of the system at a fixed temperature over the whole range of densities and compositions of interest. (C) 2013 AIP Publishing LLC.
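The successive umbrella sampling (SUS) method mentioned above can be illustrated with a minimal sketch: grand-canonical Monte Carlo on an ideal gas (not the patchy-particle mixtures of the paper), where each window restricts the particle number to {N, N+1} and the stitched histogram ratios reconstruct ln P(N). The activity zV, window count and sweep numbers below are arbitrary illustrative choices, and the exact ideal-gas result is used only as a consistency check.

```python
import random
import math

def sus_ideal_gas(zV=20.0, N_max=60, sweeps=50_000, seed=1):
    """Successive umbrella sampling for an ideal gas in the grand-canonical
    ensemble: each window restricts the particle number to {N, N+1} and the
    ratio P(N+1)/P(N) is estimated from the window histogram."""
    rng = random.Random(seed)
    ln_P = [0.0]                      # ln P(N), up to an overall constant
    for N in range(N_max):
        hist = [0, 0]                 # visits to n = N and n = N + 1
        n = N
        for _ in range(sweeps):
            if rng.random() < 0.5:    # attempt insertion  n -> n + 1
                if n == N and rng.random() < min(1.0, zV / (n + 1)):
                    n += 1
            else:                     # attempt deletion   n -> n - 1
                if n == N + 1 and rng.random() < min(1.0, n / zV):
                    n -= 1
            hist[n - N] += 1
        ln_P.append(ln_P[-1] + math.log(hist[1] / hist[0]))
    return ln_P

if __name__ == "__main__":
    ln_P = sus_ideal_gas()
    # exact ideal-gas result: ln P(N) - ln P(0) = N ln(zV) - ln(N!);
    # the stitched estimate should agree within Monte Carlo error.
    exact = [n * math.log(20.0) - math.lgamma(n + 1) for n in range(len(ln_P))]
    for n in (0, 10, 20, 40, 60):
        print(n, round(ln_P[n], 1), round(exact[n], 1))
```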
Abstract:
To determine self-consistently the time evolution of the particle size and number density, in situ multi-angle polarization-sensitive laser light scattering was used. Cross-polarization intensities (incident and scattered light intensities with opposite polarization) measured at 135 degrees, together with ex situ transmission electron microscopy analysis, demonstrate the existence of nonspherical agglomerates during the early phase of agglomeration. Later in the particle growth, both techniques again reveal spherical particles. The presence of strong cross-polarization intensities is accompanied by low-frequency instabilities detected in the scattered light intensities and the plasma emission. It is found that the particle radius and particle number density during the agglomeration phase can be well described by the Brownian free molecule coagulation model. Application of this neutral-particle coagulation model is justified by calculation of the particle charge, whereby it is shown that particles of a few tens of nanometers can be considered neutral under our experimental conditions. The measured particle dispersion can be well described by a Brownian free molecule coagulation model including a log-normal particle size distribution. (C) 1996 American Institute of Physics.
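A minimal monodisperse sketch of the Brownian free-molecule coagulation picture invoked above, assuming kinetic-theory collision rates and conservation of total particle volume; the temperature, mass density, initial radius and initial number density below are invented placeholders rather than the experimental values, and the full treatment in the paper also includes a log-normal size distribution.

```python
import numpy as np

# Minimal monodisperse sketch of Brownian free-molecule coagulation
# (illustrative parameters; NOT the experimental values of the paper).
kB    = 1.380649e-23      # J/K
T     = 400.0             # gas temperature [K] (assumed)
rho_p = 2.0e3             # particle mass density [kg/m^3] (assumed)
r0    = 5.0e-9            # initial particle radius [m] (assumed)
N0    = 1.0e17            # initial number density [m^-3] (assumed)

def kernel(r):
    """Free-molecule collision kernel for two equal spheres of radius r:
    cross section pi*(2r)^2 times the mean relative thermal speed
    sqrt(8 kB T / (pi * m/2)), with m the particle mass."""
    m = rho_p * (4.0 / 3.0) * np.pi * r**3
    return np.pi * (2.0 * r)**2 * np.sqrt(8.0 * kB * T / (np.pi * m / 2.0))

def simulate(t_end=1.0, dt=1e-5):
    """Integrate dN/dt = -(1/2) K(r) N^2 with radius growth from mass conservation."""
    N, r, t = N0, r0, 0.0
    out = []
    while t < t_end:
        N += -0.5 * kernel(r) * N**2 * dt
        r = r0 * (N0 / N)**(1.0 / 3.0)    # conserve total particle volume
        t += dt
        out.append((t, N, r))
    return out

if __name__ == "__main__":
    for t, N, r in simulate()[::20000]:
        print(f"t = {t:8.4f} s   N = {N:9.3e} m^-3   r = {r*1e9:6.2f} nm")
```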
Abstract:
Motivated by the dark matter and baryon asymmetry problems, we analyze a complex singlet extension of the Standard Model with a Z_2 symmetry (which provides a dark matter candidate). After a detailed two-loop calculation of the renormalization group equations for the new scalar sector, we study the radiative stability of the model up to a high energy scale (with the constraint that the 126 GeV Higgs boson found at the LHC is in the spectrum) and find that it requires the existence of a new scalar state mixing with the Higgs, with a mass larger than 140 GeV. This bound is not very sensitive to the cutoff scale as long as the latter is larger than 10^10 GeV. We then include all experimental and observational constraints/measurements from collider data, from dark matter direct detection experiments, and from the Planck satellite, and in addition require stability at least up to the grand unified theory scale, to find that the lower bound is raised to about 170 GeV, while the dark matter particle must be heavier than about 50 GeV.
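A toy sketch of what such a "radiative stability" check looks like, using a drastically truncated one-loop running of a Higgs-like quartic coupling (electroweak gauge terms dropped, no singlet couplings, rough boundary values); the paper's analysis uses the full two-loop equations of its extended scalar sector, which this sketch does not reproduce.

```python
import numpy as np

# Toy one-loop running of a Higgs-like quartic coupling, used only to
# illustrate the notion of "radiative stability up to a high scale".
# Conventions: V contains lambda*(H^dagger H)^2, so lambda(mt) ~ mh^2/(2 v^2) ~ 0.13.
loop = 1.0 / (16.0 * np.pi**2)

def betas(y):
    lam, yt, g3 = y
    b_lam = loop * (24.0 * lam**2 + 12.0 * lam * yt**2 - 6.0 * yt**4)  # EW gauge terms dropped
    b_yt  = loop * yt * (4.5 * yt**2 - 8.0 * g3**2)
    b_g3  = loop * (-7.0) * g3**3
    return np.array([b_lam, b_yt, b_g3])

def run(lam0=0.13, yt0=0.94, g30=1.17, t_max=np.log(1e16 / 173.0), steps=4000):
    """RK4 integration in t = ln(mu / mt); yields (mu, lambda(mu)) samples."""
    y, t, dt = np.array([lam0, yt0, g30]), 0.0, t_max / steps
    out = []
    for _ in range(steps):
        k1 = betas(y); k2 = betas(y + dt / 2 * k1)
        k3 = betas(y + dt / 2 * k2); k4 = betas(y + dt * k3)
        y = y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
        out.append((173.0 * np.exp(t), y[0]))
    return out

if __name__ == "__main__":
    for mu, lam in run():
        if lam < 0.0:
            print(f"toy quartic coupling turns negative near mu ~ {mu:.2e} GeV")
            break
    else:
        print("toy quartic coupling stays positive up to 1e16 GeV")
```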
Abstract:
We investigate a mechanism that generates exact solutions of scalar field cosmologies in a unified way. The procedure investigated here permits the recovery of almost all known solutions, and allows one to derive new solutions as well. In particular, we derive and discuss one novel solution defined in terms of the Lambert function. The solutions are organised in a classification which depends on the choice of a generating function, denoted by x(φ), that reflects the underlying thermodynamics of the model. We also analyse and discuss the existence of form-invariance dualities between solutions. A general way of defining the latter in a fashion appropriate for scalar fields is put forward.
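For reference, the system these generating-function methods solve exactly is the standard flat FLRW scalar-field cosmology, written here in reduced Planck units (our notation, not necessarily the paper's):

```latex
% Flat FLRW cosmology sourced by a homogeneous scalar field phi(t) with potential V(phi),
% in units 8*pi*G = c = 1:
\[
  H^2 = \frac{1}{3}\left(\tfrac{1}{2}\dot{\phi}^{\,2} + V(\phi)\right),
  \qquad
  \dot{H} = -\tfrac{1}{2}\dot{\phi}^{\,2},
  \qquad
  \ddot{\phi} + 3H\dot{\phi} + V'(\phi) = 0 .
\]
```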
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making their discrimination hard to establish using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (χ_ARM, ARM, B1/2, ARM/SIRM) magnetic proxies coupled with SEM microscopy to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. In order to study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of ~7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a better connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario where the energy released by the tsunami wave was strong enough to overtop and erode an important amount of sand from the littoral dune and mix it with reworked materials from underlying layers at least 1 m in depth. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
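As an illustration of the "multivariate statistical analyses" step, a minimal principal-component sketch on a hypothetical proxy table; the proxy names follow the abstract, but every number below is an invented placeholder, not the measured data of the study.

```python
import numpy as np

# Hypothetical magnetic-proxy table: rows = samples, columns = proxies
# (MS, SIRM, chi_ARM, ARM/SIRM). Values are invented placeholders.
proxies = ["MS", "SIRM", "chi_ARM", "ARM/SIRM"]
X = np.array([
    [12.0, 3.1, 0.45, 0.021],
    [11.5, 2.9, 0.43, 0.020],
    [25.0, 7.8, 0.95, 0.034],
    [24.2, 7.5, 0.90, 0.033],
    [18.0, 5.0, 0.70, 0.027],
])

# Principal component analysis via SVD of the standardized data matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                        # sample coordinates in PC space
explained = S**2 / np.sum(S**2)          # fraction of variance per component

print("explained variance:", np.round(explained, 3))
print("PC1 loadings:", dict(zip(proxies, np.round(Vt[0], 2))))
print("sample scores (PC1, PC2):\n", np.round(scores[:, :2], 2))
# Samples that plot close together in (PC1, PC2) share a similar magnetic
# signature; this is how a deposit can be compared with candidate source units.
```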
Abstract:
This letter reports on the magnetic properties of Ti(1-x)Co(x)O2 anatase-phase nanopowders with different Co contents. It is shown that, in addition to the transition-metal doping, oxygen vacancies play an important role in promoting long-range ferromagnetic order in the studied material. Furthermore, the results allow ruling out the premise of a strict connection between Co clustering and the ferromagnetism observed in the Co:TiO2 anatase system.
Abstract:
The idea of grand unification in a minimal supersymmetric SU(5) × SU(5) framework is revisited. It is shown that the unification of the gauge couplings into a unique coupling constant can be achieved at a high-energy scale compatible with proton decay constraints. This requires the addition of minimal particle content at intermediate energy scales. In particular, the introduction of the SU(2)_L triplets belonging to the (15, 1) + (15-bar, 1) representations, as well as of the scalar triplet Sigma_3 and octet Sigma_8 in the (24, 1) representation, turns out to be crucial for unification. The masses of these intermediate particles can vary over a wide range, and even lie in the TeV region. In contrast, the exotic vector-like fermions must be sufficiently heavy, with masses above 10^10 GeV. We also show that, if the SU(5) × SU(5) theory is embedded into a heterotic string scenario, it is not possible to achieve gauge coupling unification with gravity at the perturbative string scale.
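For orientation, the unification statement can be read through the standard one-loop running of the inverse gauge couplings, whose slopes b_i are shifted by any intermediate-scale states (such as the triplets and octet above) added to the spectrum; schematically (textbook formula, not the paper's full analysis):

```latex
% One-loop running of the gauge couplings between thresholds; adding intermediate
% multiplets changes the beta coefficients b_i and hence where the three lines meet:
\[
  \alpha_i^{-1}(\mu) \;=\; \alpha_i^{-1}(M_Z) \;-\; \frac{b_i}{2\pi}\,\ln\frac{\mu}{M_Z},
  \qquad
  \alpha_1^{-1}(M_{\rm GUT}) = \alpha_2^{-1}(M_{\rm GUT}) = \alpha_3^{-1}(M_{\rm GUT}).
\]
```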
Abstract:
We consider a simple model consisting of particles with four bonding sites ("patches"), two of type A and two of type B, on the square lattice, and investigate its global phase behavior by simulations and theory. We set the interaction between B patches to zero and calculate the phase diagram as the ratio between the AB and the AA interactions, ε*_AB, varies. In line with previous work on three-dimensional off-lattice models, we show that the liquid-vapor phase diagram exhibits a re-entrant or "pinched" shape for the same range of ε*_AB, suggesting that the ratio of the energy scales - and the corresponding empty fluid regime - is independent of the dimensionality of the system and of the lattice structure. In addition, the model exhibits an order-disorder transition that is ferromagnetic in the re-entrant regime. The use of low-dimensional lattice models allows the simulation of sufficiently large systems to establish the nature of the liquid-vapor critical points and to describe the structure of the liquid phase in the empty fluid regime, where the size of the "voids" increases as the temperature decreases. We have found that the liquid-vapor critical point is in the 2D Ising universality class, with a scaling region that decreases rapidly as the temperature decreases. The results of simulations and theoretical analysis suggest that, for patchy particle models with a re-entrant, empty fluid regime, the line of order-disorder transitions intersects the condensation line at a multi-critical point at zero temperature and density. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3657406]
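As background for the universality-class statement above, the exact 2D Ising critical exponents that the liquid-vapor critical point is expected to share are (textbook values, not results of the paper):

```latex
% Exact critical exponents of the two-dimensional Ising universality class:
\[
  \beta = \tfrac{1}{8}, \qquad \gamma = \tfrac{7}{4}, \qquad \nu = 1, \qquad \eta = \tfrac{1}{4},
\]
% e.g. the order parameter (here the liquid-vapor density difference) vanishes as
% \Delta\rho \sim |T - T_c|^{1/8} on approach to the critical point.
```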
Abstract:
The Tevatron has measured a discrepancy relative to the standard model prediction in the forward-backward asymmetry in top quark pair production. This asymmetry grows with the rapidity difference of the two top quarks. It also increases with the invariant mass of the top-quark pair, reaching, for high invariant masses, 3.4 standard deviations above the next-to-leading-order prediction for the QCD charge asymmetry. However, perfect agreement between experiment and the standard model was found in both the total and the differential cross section of top quark pair production. As this result could be a sign of new physics, we have parametrized the new physics in terms of a complete set of dimension-six operators involving the top quark. We have then used a Markov chain Monte Carlo approach to find the set of parameters that best fits the data, using all available data on top quark pair production at the Tevatron. We have found that only a very small number of operators are able to fit the data better than the standard model.
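A minimal sketch of the Markov chain Monte Carlo fitting strategy described above, with a single illustrative coefficient shifting the predicted asymmetry linearly; the response coefficient, "data" and uncertainties below are invented for illustration and are not the paper's operator basis, dataset or results.

```python
import random
import math

# --- placeholder "data" and "theory" (illustrative numbers only) -----------
A_FB_SM   = 0.072     # assumed SM prediction for the asymmetry
A_FB_obs  = 0.19      # assumed measured value
sigma_obs = 0.065     # assumed experimental uncertainty
k_C       = 0.10      # assumed linear response of A_FB to a Wilson coefficient C

def chi2(C):
    """Chi-square of a single observable with a linearized new-physics shift."""
    return ((A_FB_SM + k_C * C - A_FB_obs) / sigma_obs) ** 2

def metropolis(n_steps=100_000, step=0.2, seed=7):
    """Sample the posterior of C (flat prior) with a random-walk Metropolis chain."""
    rng = random.Random(seed)
    C, chain = 0.0, []
    for _ in range(n_steps):
        C_new = C + rng.gauss(0.0, step)
        if rng.random() < math.exp(-0.5 * (chi2(C_new) - chi2(C))):
            C = C_new
        chain.append(C)
    return chain[n_steps // 10:]          # drop burn-in

if __name__ == "__main__":
    chain = metropolis()
    mean = sum(chain) / len(chain)
    var  = sum((c - mean) ** 2 for c in chain) / len(chain)
    print(f"C = {mean:.2f} +/- {var**0.5:.2f}  (illustrative fit, not the paper's result)")
```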
Abstract:
The LHC has found hints for a Higgs particle of 125 GeV. We investigate the possibility that such a particle is a mixture of scalar and pseudoscalar states. For definiteness, we concentrate on a two-Higgs-doublet model with explicit CP violation and soft Z_2 violation. Including all Higgs production mechanisms, we determine the current constraints obtained by comparing h → γγ with h → VV*, and comment on the information which can be gained by measurements of h → bb̄. We find the bound |s_2| ≲ 0.83 at one sigma, where |s_2| = 0 (|s_2| = 1) corresponds to a pure scalar (pure pseudoscalar) state.
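A schematic way to see why comparing h → γγ with h → VV* constrains the pseudoscalar admixture (our simplified parametrization, not the paper's full two-Higgs-doublet analysis): only the CP-even component of h couples to W/Z pairs at tree level,

```latex
% Schematic scaling for a mass eigenstate h that is a scalar/pseudoscalar admixture,
% with |s_2|^2 taken here as the pseudoscalar fraction (|s_2| = 0: pure scalar):
\[
  g_{hVV} \;\propto\; \sqrt{1-|s_2|^{2}}\; g_{hVV}^{\rm SM},
\]
% so h -> VV* is suppressed as |s_2| grows, while loop-induced channels such as
% h -> gamma gamma retain both scalar and pseudoscalar (top-loop) contributions;
% comparing the two channels is what bounds |s_2|.
```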