885 results for PARTICLE CREATION
Abstract:
We consider a massless scalar field in a two-dimensional spacetime inside an oscillating cavity with mixed boundary conditions. Discussing the phenomenon of particle creation, we consider a parametric resonance situation in which the oscillation frequency of the boundary is twice the frequency of the first mode of the static cavity. For convenience, we assume that the boundary at rest imposes the Neumann condition on the field, while the other one, moving non-relativistically, imposes the Dirichlet condition. Following the procedure developed by Dodonov and Klimov (Phys. Rev. A 53, 2664 (1996)), we compute the number of created particles, the generation rate, and the energy in the cavity. We compare our results with those found in the literature for the Dirichlet-Dirichlet case.
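For concreteness, the resonance condition quoted above can be made explicit. Below is a minimal Python sketch, assuming the textbook Neumann-Dirichlet spectrum ω_n = (n - 1/2)πc/L for the static cavity; the sinh² growth law and the wall amplitude ε are illustrative weak-drive assumptions, not the paper's computed result.

```python
# Minimal sketch (not the paper's derivation): static mode spectrum of a
# 1D cavity with one Neumann and one Dirichlet wall, and the parametric
# resonance condition used in the abstract. Units: c = L = 1.
import numpy as np

L = 1.0          # cavity length
c = 1.0          # speed of light

def omega(n):
    """Eigenfrequencies of the Neumann-Dirichlet cavity: w_n = (n - 1/2) pi c / L."""
    return (n - 0.5) * np.pi * c / L

omega_1 = omega(1)               # first static mode
omega_drive = 2.0 * omega_1      # boundary oscillation frequency at resonance

print(f"first mode       w_1  = {omega_1:.4f}")
print(f"drive frequency 2 w_1 = {omega_drive:.4f}")

# Under parametric resonance the photon number in the resonant mode is
# expected to grow exponentially with the "slow time" eps*w_1*t, where eps
# is the (small) relative amplitude of the wall oscillation. The sinh^2
# form below is the standard weak-drive estimate, shown for illustration.
eps = 0.01                       # assumed dimensionless wall amplitude
t = np.linspace(0.0, 500.0, 6)
N_t = np.sinh(eps * omega_1 * t / 2.0) ** 2
for ti, Ni in zip(t, N_t):
    print(f"t = {ti:6.1f}  ->  N(t) ~ {Ni:.3e}")
```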
Abstract:
Our previous results on the nonperturbative calculations of the mean current and of the energy-momentum tensor in QED with a T-constant electric field are generalized to arbitrary dimensions. The renormalized mean values are found, and the vacuum polarization and particle creation contributions to these mean values are isolated in the large-T limit; we also relate the vacuum polarization contributions to the one-loop effective Euler-Heisenberg Lagrangian. Peculiarities in odd dimensions are considered in detail. We adapt general results obtained in 2+1 dimensions to the conditions realized in the Dirac model for graphene. We study quantum electronic and energy transport in graphene at low carrier density and low temperatures, when quantum interference effects are important. Our description of quantum transport in graphene is based on the so-called generalized Furry picture in QED, in which the strong external field is taken into account nonperturbatively; this approach is not restricted to a semiclassical approximation for carriers and does not use any statistical assumptions inherent in Boltzmann transport theory. In addition, we consider the evolution of the mean electromagnetic field in graphene, taking into account the backreaction of the matter field on the applied external field. We find solutions of the corresponding Dirac-Maxwell set of equations and with their help calculate the effective mean electromagnetic field and the effective mean values of the current and the energy-momentum tensor. The nonlinear and linear I-V characteristics experimentally observed in both low- and high-mobility graphene samples are quite well explained in the framework of the proposed approach, their peculiarities being essentially due to carrier creation from the vacuum by the applied electric field. DOI: 10.1103/PhysRevD.86.125022
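As a point of reference for the particle creation contributions discussed above, the following Python sketch evaluates the leading Schwinger estimate of the pair creation rate in 3+1 dimensions; the formula is the standard one-loop leading term in natural units, and the field values are illustrative, not the paper's nonperturbative computation.

```python
# Illustrative sketch (not the paper's nonperturbative computation): the
# leading Schwinger estimate for the pair-creation rate per unit volume in
# a constant electric field, w ~ (eE)^2/(4 pi^3) * exp(-pi m^2 / (eE)),
# in natural units hbar = c = 1 (3+1 dimensions, leading term only).
import numpy as np

def schwinger_rate(eE, m):
    """Leading-order pair creation rate density for field strength eE and mass m."""
    return (eE) ** 2 / (4.0 * np.pi ** 3) * np.exp(-np.pi * m ** 2 / eE)

m = 1.0                                   # fermion mass (sets the unit)
for eE in (0.1, 0.5, 1.0, 5.0):           # field strength in units of m^2
    print(f"eE/m^2 = {eE:4.1f}  ->  w = {schwinger_rate(eE, m):.3e}")

# For graphene's massless Dirac carriers the exponential suppression is
# absent, which is why even modest fields create carriers from the vacuum;
# the 2+1-dimensional massless rate scales as (eE)^(3/2) (stated here for
# orientation; see the paper for the precise coefficient).
```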
Abstract:
We propose an alternative, nonsingular cosmic scenario based on gravitationally induced particle production. The model is an attempt to evade the coincidence and cosmological constant problems of the standard model (ΛCDM) and to connect the early and late time accelerating stages of the Universe. Our space-time emerges from a pure initial de Sitter stage, thereby providing a natural solution to the horizon problem. Subsequently, due to an instability provoked by the production of massless particles, the Universe evolves smoothly to the standard radiation dominated era, the production of radiation ending as required by conformal invariance. Next, the radiation becomes subdominant and the Universe enters the cold dark matter dominated era. Finally, the negative pressure associated with the creation of cold dark matter (CCDM model) particles accelerates the expansion and drives the Universe to a final de Sitter stage. The late time cosmic expansion history of the CCDM model is exactly like that of the standard ΛCDM model; however, there is no dark energy. The model evolves between two limiting (early and late time) de Sitter regimes. All the stages are also discussed in terms of a scalar field description. This complete scenario is fully determined by two extreme energy densities, or equivalently, the associated de Sitter Hubble scales, connected by ρ_I/ρ_f = (H_I/H_f)^2 ~ 10^122, a result that has no correlation with the cosmological constant problem. We also study the linear growth of matter perturbations at the final accelerating stage. It is found that the CCDM growth index can be written as a function of the Λ growth index, γ_Λ ≈ 6/11. In this framework, we also compare the observed growth rate of clustering with that predicted by the current CCDM model. Performing a χ² statistical test, we show that the CCDM model provides growth rates that match the observed growth rate of structure sufficiently well.
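The quoted index γ_Λ ≈ 6/11 can be checked numerically. The sketch below integrates the standard linear growth equation for a flat ΛCDM background (Ω_m0 = 0.3 is an assumed value) and compares the effective growth index today with 6/11; it is a consistency illustration, not the paper's CCDM calculation.

```python
# Sketch: the Lambda-CDM growth index gamma ~ 6/11 quoted in the abstract,
# checked numerically by integrating the linear growth equation
#   D'' + (3/a + dlnE/da) D' - 3 Om0 / (2 a^5 E^2) D = 0
# and comparing f = dlnD/dlna with Om(a)^gamma. Om0 = 0.3 is an assumed value.
import numpy as np
from scipy.integrate import solve_ivp

Om0 = 0.3

def E(a):                       # dimensionless Hubble rate for flat LCDM
    return np.sqrt(Om0 / a**3 + 1.0 - Om0)

def rhs(a, y):
    D, Dp = y
    dlnE_da = (-1.5 * Om0 / a**4) / E(a)**2
    return [Dp, -(3.0 / a + dlnE_da) * Dp + 1.5 * Om0 / (a**5 * E(a)**2) * D]

a = np.linspace(1e-3, 1.0, 2000)
sol = solve_ivp(rhs, (a[0], a[-1]), [a[0], 1.0], t_eval=a, rtol=1e-8)
D, Dp = sol.y
f = a * Dp / D                  # growth rate f = dlnD/dlna
Om_a = Om0 / (a**3 * E(a)**2)   # matter density parameter

gamma = np.log(f[-1]) / np.log(Om_a[-1])   # effective index today
print(f"gamma(a=1) = {gamma:.4f}   (6/11 = {6/11:.4f})")
```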
Abstract:
We discuss a new interacting model for the cosmological dark sector in which the attenuated dilution of cold dark matter scales as a^(-3) f(a), where f(a) is an arbitrary function of the cosmic scale factor a. From thermodynamic arguments, we show that f(a) is proportional to the entropy source of the particle creation process. In order to investigate the cosmological consequences of this kind of interacting model, we expand f(a) in a power series, and viable cosmological solutions are obtained. Finally, we use current observational data to place constraints on the interacting function f(a).
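To make the parametrization concrete, here is a minimal sketch of the a^(-3) f(a) dilution law with a truncated power series for f(a); the coefficients are illustrative placeholders, not the constrained values from the observational analysis.

```python
# Sketch: attenuated dilution of cold dark matter, rho_dm ~ a^(-3) f(a),
# with f(a) expanded as a power series as in the abstract. The coefficients
# below are illustrative placeholders, not the paper's best-fit values.
import numpy as np

def f_series(a, coeffs=(1.0, 0.1, 0.01)):
    """f(a) = f0 + f1*a + f2*a^2 + ... ; f == 1 recovers standard CDM dilution."""
    return sum(c * a**k for k, c in enumerate(coeffs))

def rho_dm(a, rho0=1.0, coeffs=(1.0, 0.1, 0.01)):
    """Dark matter density normalized so that rho_dm(a=1) = rho0."""
    return rho0 * a**-3 * f_series(a, coeffs) / f_series(1.0, coeffs)

for a in (0.25, 0.5, 1.0):
    # ratio to the standard a^(-3) dilution: the "attenuation" factor
    print(f"a = {a:4.2f}:  rho_dm / rho_cdm = {rho_dm(a) / a**-3:.4f}")
```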
Abstract:
The nonequilibrium phase transition of the one-dimensional triplet-creation model is investigated using the n-site approximation scheme. We find that the phase diagram in the space of parameters (γ, D), where γ is the particle decay probability and D is the diffusion probability, exhibits a tricritical point for n ≥ 4. However, fitting the tricritical coordinates (γ_t, D_t) using data for 4 ≤ n ≤ 13 predicts that γ_t becomes negative for n ≥ 26, thus indicating that the phase transition is always continuous in the limit n → ∞. The large discrepancies between the critical parameters obtained in this limit and those obtained by Monte Carlo simulations, as well as a puzzling non-monotonic dependence of these parameters on the order of the approximation n, argue for the inadequacy of the n-site approximation for studying the triplet-creation model at computationally feasible values of n.
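For orientation only, the following sketch evolves a crude one-site mean-field caricature of a triplet-creation-type process (creation requires three occupied sites, decay with probability γ); the rate equation is an assumption for illustration and is far simpler than the n-site scheme used in the paper.

```python
# Schematic one-site mean-field sketch of a triplet-creation-type model:
# creation requires three occupied neighboring sites, decay occurs with
# probability gamma. The rate equation below is an assumed caricature of
# the model (the paper uses the far more refined n-site approximation).

def rho_dot(rho, gamma, lam=1.0):
    """d rho/dt = lam * rho^3 * (1 - rho) - gamma * rho (illustrative)."""
    return lam * rho**3 * (1.0 - rho) - gamma * rho

def steady_state(gamma, rho0=0.9, dt=0.01, steps=200_000):
    rho = rho0
    for _ in range(steps):
        rho += dt * rho_dot(rho, gamma)
    return rho

for gamma in (0.05, 0.10, 0.14, 0.20):
    print(f"gamma = {gamma:.2f}  ->  rho_ss = {steady_state(gamma):.4f}")

# A jump of rho_ss to zero as gamma grows past ~4/27 signals a
# discontinuous transition at this crude level; the abstract's point is
# that higher-order approximations and simulations disagree on whether
# such discontinuous behavior survives.
```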
Abstract:
During the last decade, advances in sensor design and improved base materials have pushed the radiation hardness of current silicon detector technology to impressive performance. It should allow operation of the tracking systems of the Large Hadron Collider (LHC) experiments at nominal luminosity (10^34 cm^-2 s^-1) for about 10 years. At the higher luminosities foreseen for future upgrades, however, current silicon detectors are unable to cope with the radiation environment. Silicon carbide (SiC), which has recently been recognized as potentially radiation hard, is now being studied. In this work, the effect of high-energy neutron irradiation on 4H-SiC particle detectors was analyzed. Schottky and junction particle detectors were irradiated with 1 MeV neutrons up to a fluence of 10^16 cm^-2. It is well known that the degradation of detectors under irradiation, independently of the structure used for their realization, is caused by lattice defects, such as the creation of point-like defects, dopant deactivation, and dead-layer formation, and that a crucial aspect for understanding the defect kinetics at a microscopic level is the correct identification of the crystal defects in terms of their electrical activity. In order to clarify the defect kinetics, thermal transient spectroscopy (DLTS and PICTS) analyses of different samples irradiated at increasing fluences were carried out. The defect evolution was correlated with the transport properties of the irradiated detector, always in comparison with the un-irradiated one. The charge collection efficiency degradation of Schottky detectors induced by neutron irradiation was related to the increasing concentration of defects as a function of the neutron fluence.
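As background on the DLTS analysis mentioned above, the sketch below evaluates the standard deep-level emission rate e_n(T) = σ_n v_th N_c exp(-E_a/kT), whose Arrhenius behavior underlies trap identification; the cross section, prefactor, and activation energy are assumed illustrative values, not the measured 4H-SiC defect parameters.

```python
# Illustrative sketch of the deep-level emission rate underlying DLTS:
# e_n(T) = sigma_n * v_th(T) * N_c(T) * exp(-E_a / kT), with v_th ~ T^(1/2)
# and N_c ~ T^(3/2), so e_n/T^2 gives a straight Arrhenius line of slope
# -E_a/k. Parameter values below are assumed, not measured ones.
import numpy as np

k_B = 8.617e-5                  # Boltzmann constant [eV/K]

def emission_rate(T, E_a, sigma=1e-15, C=3.0e21):
    """Electron emission rate [1/s]; C lumps the T^2 prefactor constants (assumed)."""
    return sigma * C * T**2 * np.exp(-E_a / (k_B * T))

E_a = 0.65                      # assumed trap activation energy [eV]
for T in (250.0, 300.0, 350.0):
    print(f"T = {T:5.1f} K  ->  e_n = {emission_rate(T, E_a):.3e} s^-1")
```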
Abstract:
The AEgIS experiment is an interdisciplinary collaboration between atomic, plasma and particle physicists, with the scientific goal of performing the first precision measurement of the Earth's gravitational acceleration on antimatter. The principle of the experiment is as follows: cold antihydrogen atoms are synthesized in a Penning-Malmberg trap, are Stark-accelerated towards a moiré deflectometer, the classical counterpart of an atom interferometer, and annihilate on a position-sensitive detector. Crucial to the success of the experiment is an antihydrogen detector that will be used to demonstrate the production of antihydrogen and also to measure the temperature of the anti-atoms and the creation of a beam. The operating requirements for the detector are very challenging: it must operate at close to 4 K inside a 1 T solenoidal magnetic field and identify the annihilation of the antihydrogen atoms that are produced during the 1 μs period of antihydrogen production. Our solution, called the FACT detector, is based on a novel multi-layer scintillating fiber tracker with SiPM readout and an off-the-shelf FPGA-based readout system. This talk will present the design of the FACT detector and detail its operation in the context of the AEgIS experiment.
Abstract:
For two two-level atoms coupled to a single bosonic mode that is driven and heavily damped, the steady state can be entangled by resonantly driving the system [S. Schneider and G. J. Milburn, Phys. Rev. A 65, 042107 (2002)]. We present a scheme to significantly increase the steady-state entanglement by using homodyne-mediated feedback, in which the bosonic mode is that of an electromagnetic cavity, the output of which is measured and the resulting homodyne photocurrent used to modulate the field driving the qubits. Such feedback can increase the nonlinear response to both the decoherence process of the two-qubit system and the coherent evolution of the individual qubits. We present the properties of the entangled states using the SO(3) Q function.
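As a baseline for the feedback scheme, the sketch below computes the no-feedback steady state of two resonantly driven qubits with collective decay, the Schneider-Milburn-type regime referred to above, and evaluates its Wootters concurrence; the Hamiltonian and rates are assumed for illustration, and the homodyne feedback term itself is omitted.

```python
# Minimal numpy sketch (no feedback): steady state of two resonantly driven
# qubits whose common damped mode has been adiabatically eliminated,
# leaving collective decay. Rates and the Hamiltonian are assumed for
# illustration; the paper's homodyne-feedback modulation is omitted.
import numpy as np

I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sm = np.array([[0, 0], [1, 0]], dtype=complex)    # lowering operator

Omega, Gamma = 1.0, 2.0                           # assumed drive and decay rates
H = 0.5 * Omega * (np.kron(sx, I2) + np.kron(I2, sx))
J = np.kron(sm, I2) + np.kron(I2, sm)             # collective jump operator

# Liouvillian on the column-stacked rho: vec(A rho B) = (B^T kron A) vec(rho)
I4 = np.eye(4)
JdJ = J.conj().T @ J
Liou = (-1j * (np.kron(I4, H) - np.kron(H.T, I4))
        + Gamma * (np.kron(J.conj(), J)
                   - 0.5 * np.kron(I4, JdJ) - 0.5 * np.kron(JdJ.T, I4)))

# Steady state = null vector of the Liouvillian
w, v = np.linalg.eig(Liou)
rho = v[:, np.argmin(np.abs(w))].reshape(4, 4, order="F")
rho = rho / np.trace(rho)
rho = 0.5 * (rho + rho.conj().T)                  # clean numerical noise

# Wootters concurrence of the steady state
yy = np.kron(sy, sy)
R = rho @ yy @ rho.conj() @ yy
lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(R))))[::-1]
print(f"steady-state concurrence = {max(0.0, lam[0] - lam[1:].sum()):.4f}")
```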
Abstract:
This article analyzes the historical, social and cognitive dimensions of the sociology of medicine in the construction of its identity, from Wolf Lepenies' perspective. It is understood that the construction of an identity does not end with the first historical manifestations, but is consolidated when it is institutionalized and structured as a field of knowledge by creating its own forms of cognitive expression. The text is divided into three parts: in the first, the precursors are presented, highlighting the role played by some travelers, naturalists and folklore scholars, followed by social physician-scientists and the first social scientists (1940-1969). In the second part, aspects of the consolidation of the social sciences in health are presented at two significant moments, namely the 1970s and 1980s. In the third part, the issues raised by the field are addressed in general terms. It is considered that, once the main structural stages are in place, there is still a need for the formation of new generations of social scientists in health. It is also essential to disseminate scientific production and to deepen and institutionalize the field's relations with its sociological matrices on the one hand and with the field of health on the other.
Abstract:
Current data indicate that the size of high-density lipoprotein (HDL) may be considered an important marker for cardiovascular disease risk. We established reference values of mean HDL size and volume in an asymptomatic, representative Brazilian population sample (n=590) and their associations with metabolic parameters by gender. Size and volume were determined in HDL isolated from plasma by polyethylene glycol precipitation of apoB-containing lipoproteins and measured using the dynamic light scattering (DLS) technique. Although the gender and age distributions agreed with other studies, the mean HDL size reference value was slightly lower than in some other populations. Both HDL size and volume were influenced by gender and varied according to age. HDL size was associated with age and HDL-C (total population); inversely with non-white ethnicity and CETP (females); and with HDL-C and PLTP mass (males). On the other hand, HDL volume was determined only by HDL-C (total population and both genders) and by PLTP mass (males). The reference values for mean HDL size and volume using the DLS technique were thus established in an asymptomatic, representative Brazilian population sample, along with their related metabolic factors. HDL-C was a major determinant of HDL size and volume, which were differently modulated in females and males.
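For reference, the DLS measurement underlying these size values reduces to the Stokes-Einstein relation. The sketch below converts an assumed diffusion coefficient into a hydrodynamic diameter and a mean spherical volume; the numbers are illustrative, not the study's reference values.

```python
# Sketch of the DLS principle used to size HDL: dynamic light scattering
# measures a diffusion coefficient D, and the Stokes-Einstein relation
# d_h = k_B T / (3 pi eta D) converts it to a hydrodynamic diameter.
# The diffusion coefficient below is an assumed illustrative value.
import math

k_B = 1.380649e-23          # Boltzmann constant [J/K]
T = 298.15                  # temperature [K]
eta = 0.89e-3               # viscosity of water at 25 C [Pa s]

def hydrodynamic_diameter(D):
    """Stokes-Einstein hydrodynamic diameter [m] from diffusion coefficient D [m^2/s]."""
    return k_B * T / (3.0 * math.pi * eta * D)

D = 5.0e-11                 # assumed D for a particle of roughly HDL size [m^2/s]
d_nm = hydrodynamic_diameter(D) * 1e9
print(f"hydrodynamic diameter ~ {d_nm:.2f} nm")
volume_nm3 = math.pi / 6.0 * d_nm**3      # sphere volume from mean diameter
print(f"mean particle volume  ~ {volume_nm3:.1f} nm^3")
```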
Abstract:
Evolving interfaces were initially applied to solutions of scientific problems in Fluid Dynamics. With the advent of the more robust modeling provided by the Level Set method, their original boundaries of applicability were extended. In the Geometric Modeling area specifically, the works published until then relating Level Set to three-dimensional surface reconstruction concentrated on reconstruction from a data cloud dispersed in space; the approach based on parallel planar slices transversal to the object to be reconstructed is still incipient. Based on this fact, the present work analyzes the feasibility of the Level Set method for three-dimensional reconstruction, offering a methodology that integrates ideas already proven efficient in the literature with proposals for handling the method's inherent limitations not yet satisfactorily treated, in particular the excessive smoothing of fine features of contours evolving under Level Set. In this respect, the Particle Level Set variant is suggested as a solution, owing to its proven intrinsic capability to preserve the mass of dynamic fronts. Finally, synthetic and real data sets are used to qualitatively evaluate the presented three-dimensional surface reconstruction methodology.
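The excessive smoothing noted above can be seen in a few lines. The sketch below evolves a square contour under curvature flow on a 2D grid, the canonical Level Set smoothing term; grid size, time step, and the initial contour are illustrative, and the loss of enclosed area is exactly the mass loss that Particle Level Set is designed to counter.

```python
# Minimal 2D sketch of Level Set evolution under curvature flow,
# phi_t = kappa * |grad phi|: the smoothing that, as the abstract notes,
# erodes fine contour features and loses mass (motivating Particle Level
# Set). Grid, time step, and the initial square contour are illustrative.
import numpy as np

n = 128
x = np.linspace(-1.0, 1.0, n)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x)
phi = np.maximum(np.abs(X), np.abs(Y)) - 0.5       # signed distance to a square

dt, steps, eps = 0.25 * dx * dx, 500, 1e-12
area0 = (phi < 0).mean() * 4.0                     # enclosed area (domain is 2x2)
for _ in range(steps):
    gy, gx = np.gradient(phi, dx)                  # axis 0 is y, axis 1 is x
    norm = np.sqrt(gx**2 + gy**2) + eps
    kappa = (np.gradient(gx / norm, dx, axis=1)    # curvature = div(grad phi/|grad phi|)
             + np.gradient(gy / norm, dx, axis=0))
    phi += dt * kappa * norm                       # curvature-driven evolution

area1 = (phi < 0).mean() * 4.0
print(f"enclosed area: {area0:.3f} -> {area1:.3f}  (corners rounded, mass lost)")
```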
Abstract:
Two case studies are presented describing the process of public school teachers authoring and creating chemistry simulations. They are part of the Virtual Didactic Laboratory for Chemistry, a project developed by the School of the Future of the University of Sao Paulo. The documental analysis of the material produced by two groups of teachers reflects different selection processes for both themes and problem-situations when creating simulations. The study demonstrates the potential for chemistry learning of an approach that takes students' everyday lives into account and is based on collaborative work among teachers and researchers. Also, from the teachers' perspective, the possibilities of interaction that a simulation offers for classroom activities are considered.
Abstract:
The aim of this paper is to analyze the process of knowledge creation in the development of high-technology products in projects with varying degrees of innovation. The main contribution to the literature is the systematization of an approach for analyzing knowledge creation during the product innovation process. Three innovation projects developed by a company specialized in industrial automation systems were investigated using case studies. The knowledge creation processes that took place in these three projects were analyzed comparatively. As a distinctive result of this paper, the main features of the knowledge creation processes as influenced by the degree of technological innovation are identified.
Abstract:
Creation of cold dark matter (CCDM) can be macroscopically described by a negative pressure, and the mechanism is therefore capable of accelerating the Universe without the need for an additional dark energy component. In this framework, we discuss the evolution of perturbations by considering a Neo-Newtonian approach in which, unlike in standard Newtonian cosmology, the fluid pressure is taken into account even in the homogeneous and isotropic background equations (Lima, Zanchin, and Brandenberger, MNRAS 291, L1, 1997). The evolution of the density contrast is calculated in the linear approximation and compared to the one predicted by the ΛCDM model. The difference between the CCDM and ΛCDM predictions at the perturbative level is quantified using three different statistical methods, namely: a simple χ²-analysis in the relevant parameter space, Bayesian statistical inference, and, finally, a Kolmogorov-Smirnov test. We find that under certain circumstances the CCDM scenario analyzed here predicts an overall dynamics (including the Hubble flow and the matter fluctuation field) that fully recovers that of the traditional cosmic concordance model. Our basic conclusion is that such a reduction of the dark sector provides a viable alternative description to the accelerating ΛCDM cosmology.
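To illustrate the last of the three comparison steps, the sketch below applies a two-sample Kolmogorov-Smirnov test to two toy growth histories; the "CCDM" curve here is a stand-in obtained by shifting the effective matter parameter, not the paper's Neo-Newtonian solution.

```python
# Sketch of the model-comparison step: quantify the difference between two
# density-contrast histories with a two-sample Kolmogorov-Smirnov test,
# one of the three statistics mentioned in the abstract. The "CCDM" curve
# here is a toy stand-in (LCDM growth proxy with a slightly shifted Om0),
# not the paper's Neo-Newtonian solution.
import numpy as np
from scipy.stats import ks_2samp

def growth_history(Om0, a):
    """Toy growth-rate proxy: f(a) ~ Om(a)^(6/11) for a flat background."""
    E2 = Om0 / a**3 + 1.0 - Om0
    return (Om0 / (a**3 * E2)) ** (6.0 / 11.0)

a = np.linspace(0.1, 1.0, 200)
f_lcdm = growth_history(0.30, a)
f_ccdm = growth_history(0.32, a)        # assumed effective parameter shift
stat, p = ks_2samp(f_lcdm, f_ccdm)
print(f"KS statistic = {stat:.4f}, p-value = {p:.4f}")
```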