925 results for TIME-LIKE GEODESICS


Relevance:

80.00%

Publisher:

Abstract:

In Einstein's theory of General Relativity, the field equations relate the geometry of space-time to the content of matter and energy, the sources of the gravitational field. This content is described by a second-order tensor, known as the energy-momentum tensor. On the other hand, the energy-momentum tensors that have physical meaning are not specified by the theory. In the 1970s, Hawking and Ellis proposed a set of conditions, considered reasonable from a physical point of view, in order to limit the arbitrariness of these tensors. These conditions, which became known as the Hawking-Ellis energy conditions, play important roles in the gravitational scenario. They are widely used as powerful analytical tools: from the demonstration of important theorems concerning the behavior of gravitational fields and their associated geometries, and the quantum behavior of gravity, to the analysis of cosmological models. In this dissertation we present a rigorous deduction of the several energy conditions currently in vogue in the scientific literature: the Null Energy Condition (NEC), the Weak Energy Condition (WEC), the Strong Energy Condition (SEC), the Dominant Energy Condition (DEC) and the Null Dominant Energy Condition (NDEC). Bearing in mind the most immediate applications in Cosmology and Gravitation, the deductions were initially made for the energy-momentum tensor of a generalized perfect fluid and then extended to scalar fields with minimal and non-minimal coupling to the gravitational field. We also present a study of the possible violations of some of these energy conditions. Aiming at the study of the singular nature of some exact solutions of Einstein's General Relativity, in 1955 the Indian physicist Raychaudhuri derived an equation that is today considered fundamental to the study of the gravitational attraction of matter, and which became known as the Raychaudhuri equation. This famous equation is fundamental to the understanding of gravitational attraction in Astrophysics and Cosmology and to the comprehension of the singularity theorems, such as the Hawking-Penrose theorem on the singularity of gravitational collapse. In this dissertation we derive the Raychaudhuri equation, the Frobenius theorem and the focusing theorem for time-like and null congruences of a pseudo-Riemannian manifold. We discuss the geometric and physical meaning of this equation, its connections with the energy conditions, and some of its several applications.
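
For reference, for a perfect fluid with energy density \rho and pressure p the first four conditions reduce to the familiar inequalities below, and the Raychaudhuri equation for a time-like geodesic congruence with expansion \theta, shear \sigma_{ab} and vorticity \omega_{ab} takes its standard form; these are textbook expressions assumed from the general literature, not formulas quoted from the dissertation:

\mathrm{NEC:}\ \rho + p \ge 0, \qquad
\mathrm{WEC:}\ \rho \ge 0,\ \rho + p \ge 0, \qquad
\mathrm{SEC:}\ \rho + 3p \ge 0,\ \rho + p \ge 0, \qquad
\mathrm{DEC:}\ \rho \ge |p|,

\frac{d\theta}{d\tau} \;=\; -\frac{1}{3}\,\theta^{2} \;-\; \sigma_{ab}\sigma^{ab} \;+\; \omega_{ab}\omega^{ab} \;-\; R_{ab}\,u^{a}u^{b}.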

Relevance:

80.00%

Publisher:

Abstract:

In this dissertation, after a brief review of Einstein's General Relativity theory and its application to the Friedmann-Lemaître-Robertson-Walker (FLRW) cosmological models, we present and discuss the alternative theories of gravity dubbed f(R) gravity. These theories come about when one replaces the Ricci scalar R in the Einstein-Hilbert action by some well-behaved nonlinear function f(R). They provide an alternative way to explain the current cosmic acceleration without invoking either a dark energy component or the existence of extra spatial dimensions. In dealing with f(R) gravity, two different variational approaches may be followed, namely the metric and the Palatini formalisms, which lead to very different equations of motion. We briefly describe the metric formalism and then concentrate on the Palatini variational approach to the gravity action. We make a systematic and detailed derivation of the field equations for Palatini f(R) gravity, which generalize Einstein's equations of General Relativity, and also obtain the generalized Friedmann equations, which can be used for cosmological tests. As an example, using recent compilations of type Ia supernovae observations, we show how the f(R) = R − α/Rⁿ class of gravity theories explains the recently observed acceleration of the universe by placing reasonable constraints on the free parameters α and n. We also examine the question as to whether Palatini f(R) gravity theories permit space-times in which causality, a fundamental issue in any physical theory [22], is violated. As is well known, in General Relativity there are solutions to the field equations that have causal anomalies in the form of closed time-like curves, the renowned Gödel model being the best known example of such a solution. Here we show that every perfect-fluid Gödel-type solution of Palatini f(R) gravity with density ρ and pressure p satisfying the weak energy condition ρ + p ≥ 0 is necessarily isometric to the Gödel geometry, demonstrating, therefore, that these theories present causal anomalies in the form of closed time-like curves. This result extends a theorem on Gödel-type models to the framework of Palatini f(R) gravity theory. We derive an expression for a critical radius r_c (beyond which causality is violated) for an arbitrary Palatini f(R) theory. The expression makes apparent that the violation of causality depends on the form of f(R) and on the matter content components. We concretely examine the Gödel-type perfect-fluid solutions in the f(R) = R − α/Rⁿ class of Palatini gravity theories, and show that for positive matter density and for α and n in the range permitted by the observations, these theories do not admit the Gödel geometry as a perfect-fluid solution of their field equations. In this sense, f(R) gravity theory remedies the causal pathology in the form of closed time-like curves which is allowed in General Relativity. We also examine the violation of causality of Gödel type by considering a single scalar field as the matter content. For this source, we show that Palatini f(R) gravity gives rise to a unique Gödel-type solution with no violation of causality. Finally, we show that by combining a perfect fluid and a scalar field as sources of Gödel-type geometries, we obtain both solutions with closed time-like curves and solutions with no violation of causality.
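
For orientation, the Palatini setup referred to above can be summarized by the standard action and field equations below (textbook form, with \kappa = 8\pi G; this is assumed from the general literature rather than copied from the dissertation):

S = \frac{1}{2\kappa}\int d^{4}x\,\sqrt{-g}\,f(R) + S_{m}[g_{\mu\nu},\psi], \qquad R \equiv g^{\mu\nu}R_{\mu\nu}(\Gamma),

f'(R)\,R_{\mu\nu}(\Gamma) - \tfrac{1}{2}f(R)\,g_{\mu\nu} = \kappa\,T_{\mu\nu}, \qquad
\nabla^{\Gamma}_{\alpha}\!\left(\sqrt{-g}\,f'(R)\,g^{\mu\nu}\right) = 0,

where the metric g_{\mu\nu} and the connection \Gamma are varied independently; setting f(R) = R recovers General Relativity.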

Relevance:

80.00%

Publisher:

Abstract:

A precise formulation of the strong Equivalence Principle is essential to the understanding of the relationship between gravitation and quantum mechanics. The relevant aspects are reviewed in a context that includes General Relativity but allows for the presence of torsion. For the sake of brevity, a concise statement is proposed for the Principle: an ideal observer immersed in a gravitational field can choose a reference frame in which gravitation goes unnoticed. This statement is given a clear mathematical meaning through an accurate discussion of its terms. It holds for ideal observers (time-like, smooth, non-intersecting curves), but not for real, spatially extended observers. Analogous results hold for gauge fields. The difference between gravitation and the other fundamental interactions comes from their distinct roles in the equation of force.
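
As a schematic illustration of that last point (standard textbook equations, not formulas quoted from the paper): along a world line with four-velocity u^{a}, gravitation enters only through the connection term, whereas a gauge interaction such as electromagnetism appears as a force on the right-hand side,

\frac{du^{a}}{ds} + \Gamma^{a}{}_{bc}\,u^{b}u^{c} = 0
\qquad\text{versus}\qquad
\frac{du^{a}}{ds} + \Gamma^{a}{}_{bc}\,u^{b}u^{c} = \frac{q}{m}\,F^{a}{}_{b}\,u^{b},

so the connection term can be made to vanish along the observer's world line by a suitable choice of frame, while the force term cannot.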

Relevance:

80.00%

Publisher:

Abstract:

Nerve regeneration in a sensory nerve was obtained by the application of two different techniques: the inside-out vein graft (IOVG group) and the standard vein graft (SVG group). These techniques provide a good microenvironment for axon regeneration in motor nerves, but their efficiency for the regeneration of sensory nerves is controversial. The saphenous nerve was sectioned and repaired by the inside-out and standard vein graft techniques in rats. After 4, 12, and 20 weeks the graft and the distal stump were observed under electron microscopy. In each period studied, the pattern, diameters, and myelin-sheath thickness of the regenerated axons were measured in the graft and distal stump, and the nerve fibers regenerated by the two techniques were compared. Regenerated nerve fibers were prominent in both vein grafts 4 weeks after the surgical procedures; in the distal stump, regenerated nerve fibers were observed only from 12 weeks onward. No statistically significant difference between the inside-out and standard vein grafts was observed in the diameters and myelin-sheath thickness of the myelinated fibers after 20 weeks. On the other hand, the inside-out group had a greater number of regenerated axons than the standard group. Capillary invasion occurred in both the graft and the distal stump, especially in the IOVG group, and the regenerated axons followed these capillaries throughout, like satellite microfascicles. After 20 weeks, the diameters of the fibers repaired by the standard vein graft technique were closer to those of normal fibers than those repaired by the inside-out vein graft; the pattern of the regenerated axons, however, was better in the IOVG group.

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

80.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

80.00%

Publisher:

Abstract:

Particle fluidization is widely used in industry, mainly because of the high rates of heat and mass transfer between the phases. The coupling of Computational Fluid Dynamics (CFD) with the Discrete Element Method (DEM) has become attractive for fluidization simulation, since in this approach the motion of the particles is analysed more directly than in other types of approaches. The main drawback of CFD-DEM coupling is the high computational cost of tracking every particle in the system, which leads to the use of strategies for reducing simulation time that, if used incorrectly, can compromise the results. The present work deals with the application of CFD-DEM coupling to the analysis of alumina fluidization, an important problem for the mineral sector. Several parameters capable of influencing the results and the simulation time were analysed, such as the time steps, the drag models, the particle size distribution, the stiffness constant, the use of representative particles larger than the real particles, etc. The DEM interaction force model used was the Linear Spring-Dashpot (LSD) model. All simulations were carried out with the ANSYS FLUENT 14.5 software, and the results were compared with experimental data and with the literature. These results confirmed the ability of the linear LSD model to predict the global behavior of alumina beds and to reduce simulation time, provided that the model parameters are properly defined.
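
As an illustration of the linear spring-dashpot (LSD) contact model mentioned above, the Python sketch below computes the normal contact force between two spherical particles; the stiffness and damping values are hypothetical placeholders, not parameters taken from the dissertation or from ANSYS FLUENT.

import numpy as np

def lsd_normal_force(x1, x2, v1, v2, r1, r2, k_n=1.0e4, eta_n=5.0):
    # Linear spring-dashpot normal force acting on particle 1.
    # k_n: spring stiffness [N/m], eta_n: damping coefficient [N s/m] (placeholder values).
    d = x2 - x1
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist            # positive only while the spheres are in contact
    if overlap <= 0.0:
        return np.zeros_like(d)           # no contact, no force
    n = d / dist                          # unit normal from particle 1 towards particle 2
    v_rel_n = np.dot(v1 - v2, n)          # normal component of the relative (approach) velocity
    f_mag = k_n * overlap + eta_n * v_rel_n   # spring repulsion plus viscous dissipation
    return -f_mag * n                     # repulsive force on particle 1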

Relevance:

80.00%

Publisher:

Abstract:

The subject of this thesis lies in the area of Applied Mathematics known as inverse problems. Inverse problems are those in which a set of measured data is analysed in order to obtain as much information as possible about a model that is assumed to represent a system in the real world. We study two inverse problems in the fields of classical and quantum physics: QCD condensates from tau-decay data, and the inverse conductivity problem. Despite a concentrated effort by physicists extending over many years, an understanding of QCD from first principles continues to be elusive. Fortunately, data continue to appear which provide a rather direct probe of the inner workings of the strong interactions. We use a functional method which allows us to extract, within rather general assumptions, phenomenological parameters of QCD (the condensates) from a comparison of the time-like experimental data with asymptotic space-like results from theory. The price to be paid for the generality of the assumptions is relatively large errors in the values of the extracted parameters. Although we do not claim that our method is superior to other approaches, we hope that our results lend additional confidence to the numerical results obtained with the help of methods based on QCD sum rules. Electrical impedance tomography (EIT) is a technology developed to image the electrical conductivity distribution of a conductive medium. The technique works by performing simultaneous measurements of direct or alternating electric currents and voltages on the boundary of an object. These are the data used by an image-reconstruction algorithm to determine the electrical conductivity distribution within the object. In this thesis, two approaches to EIT image reconstruction are proposed. The first is based on reformulating the inverse problem in terms of integral equations; this method uses only a single set of measurements for the reconstruction. The second approach is an algorithm based on linearisation which uses more than one set of measurements. A promising result is that one can qualitatively reconstruct the conductivity inside the cross-section of a human chest. Even though the human volunteer is neither two-dimensional nor circular, such reconstructions can be useful in medical applications: monitoring for lung problems such as accumulating fluid or a collapsed lung, and noninvasive monitoring of heart function and blood flow.
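
As a generic illustration of the linearisation approach to such an inverse problem (a minimal sketch under standard assumptions, not the specific algorithm of the thesis), the Python snippet below performs one Tikhonov-regularised least-squares update of the conductivity from a sensitivity matrix J and boundary-voltage residuals d; the matrix sizes and the regularisation parameter are hypothetical.

import numpy as np

def linearized_step(J, d, lam=1e-3):
    # Minimise ||J x - d||^2 + lam * ||x||^2 for the conductivity update x.
    # J: sensitivity (Jacobian) matrix, shape (measurements, pixels)
    # d: measured minus predicted boundary voltages
    # lam: Tikhonov regularisation parameter (placeholder value)
    n = J.shape[1]
    A = J.T @ J + lam * np.eye(n)        # regularised normal equations
    return np.linalg.solve(A, J.T @ d)

# Toy usage with synthetic data, only to show the call pattern.
rng = np.random.default_rng(0)
J = rng.normal(size=(208, 576))          # e.g. a 16-electrode protocol on a 24 x 24 pixel grid
x_true = np.zeros(576)
x_true[300] = 1.0                        # a single conductive anomaly
d = J @ x_true + 0.01 * rng.normal(size=208)
x_rec = linearized_step(J, d)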

Relevance:

80.00%

Publisher:

Abstract:

Among all possible realizations of quark and antiquark assembly, the nucleon (the proton and the neutron) is the most stable of all hadrons and consequently has been the subject of intensive studies. Its mass, shape, radius and more complex representations of its internal structure have been measured for several decades using different probes. The proton (spin 1/2) is described by the electric $G_E$ and magnetic $G_M$ form factors, which characterize its internal structure. The simplest way to measure the proton form factors consists in measuring the angular distribution of electron-proton elastic scattering, accessing the so-called space-like region where $q^2 < 0$. Using the crossed channel $\bar{p}p \leftrightarrow e^+e^-$, one accesses another kinematical region, the so-called time-like region where $q^2 > 0$. However, due to the $\bar{p}p \leftrightarrow e^+e^-$ threshold $q^2_{\mathrm{th}}$, only the kinematical domain $q^2 > q^2_{\mathrm{th}} > 0$ is available. To access the unphysical region, one may use the reaction $\bar{p}p \to \pi^0 e^+e^-$, where the $\pi^0$ takes away part of the system energy, allowing $q^2$ to be varied between $q^2_{\mathrm{th}}$ and almost 0. This thesis aims to show the feasibility of such measurements with the PANDA detector, which will be installed on the new high-intensity antiproton ring at the FAIR facility at Darmstadt. To describe the $\bar{p}p \to \pi^0 e^+e^-$ reaction, a Lagrangian-based approach is developed. The 5-fold differential cross section is determined and related to linear combinations of hadronic tensors. Under the assumption of one-nucleon exchange, the hadronic tensors are expressed in terms of the two complex proton electromagnetic form factors. An extraction method is developed which provides access to the proton electromagnetic form factor ratio $R = |G_E|/|G_M|$ and, for the first time in an unpolarized experiment, to the cosine of the phase difference. Such measurements have never been performed in the unphysical region up to now. Extended simulations were performed to show how the ratio R and the cosine can be extracted from the positron angular distribution. Furthermore, a model is developed for the $\bar{p}p \to \pi^0\pi^+\pi^-$ background reaction, considered the most dangerous one. The background-to-signal cross-section ratio was estimated under different combinations of cuts on the particle-identification information from the different detectors and on the kinematic fits. The background contribution can be reduced to the percent level or even less. The corresponding signal efficiency ranges from a few percent to 30%. The precision of the determination of the ratio R and of the cosine is estimated from the expected counting rates via the Monte Carlo method. A part of this thesis is also dedicated to more technical work: the study of the prototype of the electromagnetic calorimeter and the determination of its resolution.
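
For orientation, in the one-photon-exchange approximation the angular distribution of the annihilation channel $\bar{p}p \to e^+e^-$, which underlies the time-like form factors, has the standard form below; this is a textbook expression assumed from the general literature, not a formula quoted from the thesis:

\frac{d\sigma}{d\cos\theta} \;\propto\; \left(1+\cos^{2}\theta\right)|G_M|^{2} \;+\; \frac{1}{\tau}\,\sin^{2}\theta\,|G_E|^{2},
\qquad \tau = \frac{q^{2}}{4m_p^{2}},

so the shape of the angular distribution fixes $R = |G_E|/|G_M|$, while the phase difference between $G_E$ and $G_M$ requires additional observables, such as the $\bar{p}p \to \pi^0 e^+e^-$ channel studied in the thesis.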

Relevance:

80.00%

Publisher:

Abstract:

The availability of a high-intensity antiproton beam with momentum up to 15 GeV/c at the future FAIR will open a unique opportunity to investigate wide areas of nuclear physics with the $\overline{P}$ANDA (anti$\overline{P}$roton ANnihilations at DArmstadt) detector. Part of these investigations concerns the electromagnetic form factors of the proton in the time-like region and the study of the Transition Distribution Amplitudes, for which feasibility studies have been performed in this Thesis. Moreover, simulations to study the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter of $\overline{P}$ANDA are presented. This detector is crucial especially for the reconstruction of processes like $\bar{p}p \to e^+ e^- \pi^0$, investigated in this work. Different arrangements of dead material were studied. The results show that both the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter fulfill the requirements for the detection of backward particles, and that this detector is necessary for the reconstruction of the channels of interest. The study of the annihilation channel $\bar{p}p \to e^+ e^-$ will improve the knowledge of the electromagnetic form factors in the time-like region, and will help to understand their connection with the electromagnetic form factors in the space-like region. In this Thesis the feasibility of a measurement of the $\bar{p}p \to e^+ e^-$ cross section with $\overline{P}$ANDA is studied using Monte Carlo simulations. The major background channel $\bar{p}p \to \pi^+ \pi^-$ is taken into account. The results show a $10^9$ background suppression factor, which assures a sufficiently clean signal with less than 0.1% background contamination. The signal can be measured with an efficiency greater than 30% up to $s = 14$ (GeV/c)$^2$. The electromagnetic form factors are extracted from the reconstructed signal and the corrected angular distribution. Above this $s$ limit, the low cross section will not allow the direct extraction of the electromagnetic form factors. However, the total cross section can still be measured, and an extraction of the electromagnetic form factors is possible under certain assumptions on the ratio between the electric and magnetic contributions. The Transition Distribution Amplitudes are new non-perturbative objects describing the transition between a baryon and a meson. They are accessible in hard exclusive processes like $\bar{p}p \to e^+ e^- \pi^0$. The study of this process with $\overline{P}$ANDA will test the Transition Distribution Amplitude approach. This work includes a feasibility study for measuring this channel with $\overline{P}$ANDA. The main background reaction here is $\bar{p}p \to \pi^+ \pi^- \pi^0$. A background suppression factor of $10^8$ has been achieved while keeping a signal efficiency above 20%. Part of this work has been published in the European Physical Journal A 44, 373-384 (2010).
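
A quick consistency check on the quoted numbers (an inference from the figures above, not a value taken from the thesis): a suppression factor of $10^9$ together with a residual contamination below 0.1% corresponds to a raw background-to-signal cross-section ratio of at most about $10^6$, since $10^{6}/10^{9} = 10^{-3} = 0.1\%$.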

Relevance:

80.00%

Publisher:

Abstract:

We analyze the pion transition form factor using dispersion theory. We calculate the singly-virtual form factor in the time-like region based on data for the e⁺e⁻ → 3π cross section, generalizing previous studies on ω, ϕ → 3π decays and γπ → ππ scattering, and verify our result by comparing to e⁺e⁻ → π⁰γ data. We perform the analytic continuation to the space-like region, predicting the poorly constrained space-like transition form factor below 1 GeV, and extract the slope of the form factor at vanishing momentum transfer, a_π = (30.7 ± 0.6) × 10⁻³. We derive the dispersive formalism necessary for the extension of these results to the doubly-virtual case, as required for the pion-pole contribution to hadronic light-by-light scattering in the anomalous magnetic moment of the muon.
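
For reference, the slope quoted above is defined through the standard low-energy expansion of the singly-virtual transition form factor (the conventional normalization is assumed here, not quoted from the paper):

F_{\pi^{0}\gamma^{*}\gamma}(q^{2}) \;=\; F_{\pi^{0}\gamma\gamma}(0)\left(1 + a_{\pi}\,\frac{q^{2}}{m_{\pi^{0}}^{2}} + \mathcal{O}(q^{4})\right).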

Relevance:

80.00%

Publisher:

Abstract:

Space debris in geostationary orbits may be detected with optical telescopes when the objects are illuminated by the Sun. The advantage over radar lies in the illumination: radar has to illuminate the objects itself, so its detection sensitivity decreases with the fourth power of the distance. The German Space Operations Center, GSOC, together with the Astronomical Institute of the University of Bern, AIUB, is setting up a telescope system called SMARTnet to demonstrate the capability of performing geostationary surveillance. The telescope system will consist of two telescopes on one mount: a smaller telescope with an aperture of 20 cm will serve for fast surveys, while the larger one, with an aperture of 50 cm, will be used for follow-up observations. The telescopes will be operated by GSOC from Oberpfaffenhofen through the internal monitoring and control system SMARTnetMAC. The observation plan will be generated by SMARTnetPlanning seven days in advance by an optimized planning scheduler, taking into account downtime such as cloudy nights, the priority of objects, etc. In each picture taken, stars are identified and everything that is not a star is treated as a possible object. If the same object can be identified in multiple pictures within a short time span, the resulting trace is called a tracklet. In the next step, several tracklets are correlated to identify individual objects, and ephemeris data for these objects are generated and catalogued. This will allow for services like collision avoidance to ensure safe operations for GSOC's satellites. The complete data-processing chain is handled by BACARDI, the backbone catalogue of relational debris information, and is presented as a poster.
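
As a schematic illustration of the tracklet-building step described above (a minimal Python sketch with hypothetical data structures and thresholds, not the SMARTnet/BACARDI implementation), the snippet below links time-ordered non-stellar detections into a tracklet when their apparent motion is consistent with a roughly constant drift.

from dataclasses import dataclass

@dataclass
class Detection:
    t: float     # exposure epoch [s]
    ra: float    # right ascension [deg]
    dec: float   # declination [deg]

def build_tracklet(detections, max_rate=0.01, tol=1e-3):
    # Greedy linking of time-ordered detections into a single tracklet.
    # max_rate: largest plausible apparent motion [deg/s]; tol: allowed deviation
    # from linear motion [deg]. Both are hypothetical placeholder values.
    dets = sorted(detections, key=lambda d: d.t)
    tracklet = [dets[0]]
    for det in dets[1:]:
        prev = tracklet[-1]
        dt = det.t - prev.t
        if dt <= 0:
            continue
        rate = ((det.ra - prev.ra) ** 2 + (det.dec - prev.dec) ** 2) ** 0.5 / dt
        if rate > max_rate:
            continue                      # too fast to be the same object
        if len(tracklet) >= 2:
            # extrapolate the last two accepted points linearly to the new epoch
            p0, p1 = tracklet[-2], tracklet[-1]
            f = (det.t - p1.t) / (p1.t - p0.t)
            ra_pred = p1.ra + f * (p1.ra - p0.ra)
            dec_pred = p1.dec + f * (p1.dec - p0.dec)
            if abs(det.ra - ra_pred) > tol or abs(det.dec - dec_pred) > tol:
                continue                  # not consistent with linear motion
        tracklet.append(det)
    return tracklet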

Relevance:

80.00%

Publisher:

Abstract:

Many readings, texts, debates, and research programs in communication theory, culture, and art include a recurring topic that goes unnoticed amid the various interpretations. It began to intrigue us enough that we paid closer attention to the notions we had of it. For this reason we decided to try out, in these lines, some ideas recovered from certain readings on experience. Why concern ourselves with experience? For some time now a certain contemporary discourse (sometimes characterized as postmodern) has repeated, and we repeat, that experience, the feeling of the present, has changed in an entirely new direction. Quite a few debates concentrate on describing this new scenario, trying to answer the question raised above: what directions has contemporary experience taken? And of course that question is accompanied by another, more skeptical, doubt: is contemporary experience richer or deeper than in the past?

Relevance:

80.00%

Publisher:

Abstract:

Biblioclasm has been defined as the human compulsion to destroy books. Since time immemorial, people have sought to impose their ideas through the destruction of those that contradicted their own, and books have been the target of this "biblioclastic drive" throughout human history. The last military dictatorship that our country endured left deep marks on society. Through censorship and intervention in different spheres (education, film, theatre, literature, among others), a discourse, a language and a set of practices were constructed that are today recognized as belonging to that historical period. Just as happened with the disappearance of people and their bodies, repression in the cultural sphere was part of a systematic plan, conceived, calculated and carried out by agencies of the Argentine State created for that purpose and by officials (military and civilian) who were part of that plan. It is considered especially important that librarians, as professionals who contribute day by day to the preservation of memory, address this kind of problem and reflect upon it. This work is framed within research on the recent past, taking as its axis the Universal Declaration of Human Rights and its position on freedom of expression. It attempts to reconstruct the mechanisms of censorship and how they materialized in the experiences of different actors connected with the world of books and with the cultural and political movement of La Plata, with the aim of contributing to the social memory of our city.