956 results for equivalence principle


Relevance:

60.00%

Publisher:

Abstract:

Atmospheric neutrinos make it possible to test principles of relativity such as Lorentz invariance and the weak equivalence principle. In some theories, small deviations from these principles can lead to measurable neutrino oscillations. In this work, the neutrino events recorded by the AMANDA detector are searched for such alternative oscillation effects. The AMANDA neutrino telescope is located at the geographic South Pole and is embedded in the Antarctic ice sheet at a depth between 1500 m and 2000 m. AMANDA detects muon neutrinos via the Cherenkov light of neutrino-induced muons, from which the direction of the track of the original neutrino can be reconstructed. From the AMANDA data of the years 2000 to 2003, 3401 neutrino-induced muon events were selected out of roughly seven billion recorded events, which consist mainly of the background of atmospheric muons. This data set was examined for alternative oscillation effects. No evidence for such effects was found. For maximal mixing angles, the limit on oscillation parameters that violate Lorentz invariance or the equivalence principle was set to Δβ (2ΦΔγ) < 5.15 × 10^-27.
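For orientation, a minimal two-flavor sketch of the kind of alternative oscillation searched for here, using the parametrization common in the violation-of-Lorentz-invariance (VLI) and violation-of-equivalence-principle (VEP) literature rather than anything quoted from this thesis: the survival probability of a muon neutrino of energy E after a baseline L takes the form

\[
P(\nu_\mu \to \nu_\mu) \;\simeq\; 1 - \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta\beta\,E\,L}{2}\right),
\qquad \Delta\beta \;\longrightarrow\; 2\,|\Phi|\,\Delta\gamma \ \ \text{(VEP case)},
\]

where Δβ is the difference of the limiting velocities of the two neutrino eigenstates and 2|Φ|Δγ the analogous VEP combination of gravitational potential and coupling difference. Unlike the standard mass-driven phase Δm²L/(4E), this phase grows with energy, which is why high-energy atmospheric neutrinos crossing the Earth can constrain parameters as small as 10^-27.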

Relevance:

60.00%

Publisher:

Abstract:

The main goal of the AEgIS experiment at CERN is to test the weak equivalence principle for antimatter. AEgIS will measure the free-fall of an antihydrogen beam traversing a moiré deflectometer. The goal is to determine the gravitational acceleration with an initial relative accuracy of 1% by using an emulsion detector combined with a silicon μ-strip detector to measure the time of flight. Nuclear emulsions can measure the annihilation vertex of antihydrogen atoms with a precision of ~ 1–2 μm r.m.s. We present here results for emulsion detectors operated in vacuum using low energy antiprotons from the CERN antiproton decelerator. We compare with Monte Carlo simulations, and discuss the impact on the AEgIS project.

Relevance:

60.00%

Publisher:

Abstract:

The main goal of the AEgIS experiment at CERN is to test the weak equivalence principle for antimatter. We will measure the Earth's gravitational acceleration g with antihydrogen atoms being launched in a horizontal vacuum tube and traversing a moiré deflectometer. We intend to use a position-sensitive device made of nuclear emulsions (combined with a time-of-flight detector such as silicon μ-strips) to measure precisely their annihilation points at the end of the tube. The goal is to determine g with a 1% relative accuracy. In 2012 we tested emulsion films in vacuum and at room temperature with low energy antiprotons from the CERN antiproton decelerator. First results on the expected performance for AEgIS are presented.

Relevance:

60.00%

Publisher:

Abstract:

We propose to build and operate a detector based on the emulsion film technology for the measurement of the gravitational acceleration on antimatter, to be performed by the AEgIS experiment (AD6) at CERN. The goal of AEgIS is to test the weak equivalence principle with a precision of 1% on the gravitational acceleration g by measuring the vertical position of the annihilation vertex of antihydrogen atoms after their free fall while moving horizontally in a vacuum pipe. With the emulsion technology developed at the University of Bern we propose to improve the performance of AEgIS by exploiting the superior position resolution of emulsion films over other particle detectors. The idea is to use a new type of emulsion films, especially developed for applications in vacuum, to yield a spatial resolution of the order of one micron in the measurement of the sag of the antihydrogen atoms in the gravitational field. This is an order of magnitude better than what was planned in the original AEgIS proposal.
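As a back-of-the-envelope illustration of why micron-level vertex resolution matters; the numbers here are illustrative assumptions, not values from the proposal: an antihydrogen atom travelling horizontally with velocity v over a flight path of length L falls by

\[
\Delta y \;=\; \tfrac{1}{2}\,g\,t^{2} \;=\; \tfrac{1}{2}\,g\left(\frac{L}{v}\right)^{2},
\]

so for, say, v ≈ 500 m/s and L ≈ 1 m the sag is only about 20 μm; a vertex resolution of the order of one micron on the annihilation point then corresponds to a few-percent determination of g for a single atom, and averaging over many annihilations can push this towards the 1% goal.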

Relevance:

60.00%

Publisher:

Abstract:

The main goal of the AEgIS experiment is to measure the local gravitational acceleration of antihydrogen, ḡ, and thus perform a direct test of the weak equivalence principle with antimatter. In the first phase of the experiment the aim is to measure ḡ with 1% relative precision. This paper presents the antihydrogen production method and a description of some components of the experiment which are necessary for the gravity measurement. The current status of the AEgIS experimental apparatus is presented and recent commissioning results with antiprotons are outlined. In conclusion we discuss the short-term goals of the AEgIS collaboration that will pave the way for the first gravity measurement in the near future.

Relevance:

60.00%

Publisher:

Abstract:

The precise measurement of forces is one way to obtain deep insight into the fundamental interactions present in nature. In the context of neutral antimatter, the gravitational interaction is of high interest, potentially revealing new forces that violate the weak equivalence principle. Here we report on a successful extension of a tool from atom optics—the moiré deflectometer—for a measurement of the acceleration of slow antiprotons. The setup consists of two identical transmission gratings and a spatially resolving emulsion detector for antiproton annihilations. Absolute referencing of the observed antimatter pattern with a photon pattern experiencing no deflection allows the direct inference of forces present. The concept is also straightforwardly applicable to antihydrogen measurements as pursued by the AEgIS collaboration. The combination of these very different techniques from high energy and atomic physics opens a very promising route to the direct detection of the gravitational acceleration of neutral antimatter.
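A minimal sketch of the moiré-deflectometer relation underlying this measurement; this is the generic textbook form with my own symbols, not a quotation from the paper: for particles of velocity v crossing two identical gratings separated by a distance L, a constant acceleration a shifts the periodic shadow pattern recorded at a third plane, again a distance L downstream, by

\[
\Delta y \;=\; a\,\tau^{2}, \qquad \tau = \frac{L}{v},
\]

independently of each particle's initial position and transverse velocity. Referencing the antiproton pattern against the undeflected photon pattern taken with the same gratings therefore converts a tiny fringe displacement directly into a force, which is the strategy described in the abstract.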

Relevance:

60.00%

Publisher:

Abstract:

The asymptotic structure of the far-wake behind a charged body in a rarefied plasma flow is investigated under the assumption of small ion-to-electron temperature ratio and of a flow speed hypersonic with respect to the ions but not with respect to the electrons. It is found that waves are excited even if the flow is subacoustic (flow velocity less than the ion-acoustic speed). For both superacoustic and subacoustic velocities a steep wave front develops, separating the weakly perturbed, quasineutral plasma ahead from the region behind where ion waves appear. Near the axis a trailing front develops; the region between this and the axis is quasineutral for superacoustic speeds. The decay laws in all of these regions, the self-similar structure of the fronts and the general character of the waves are determined. The damping of the waves and special flow details for bodies large and small compared with the Debye length are discussed. A nonlinear analysis of the leading wave front in superacoustic flow is carried out. A hyperacoustic equivalence principle is presented.
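For readers outside plasma physics, the dividing speed invoked here is the ion-acoustic speed; a standard expression in the assumed limit of small ion-to-electron temperature ratio (notation mine, not the paper's) is

\[
c_s \;\simeq\; \sqrt{\frac{k_B T_e}{m_i}}, \qquad \text{subacoustic: } U < c_s, \qquad \text{superacoustic: } U > c_s,
\]

where T_e is the electron temperature, m_i the ion mass and U the speed of the plasma flow relative to the body.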

Relevance:

60.00%

Publisher:

Abstract:

Control design for stochastic uncertain nonlinear systems is traditionally based on minimizing the expected value of a suitably chosen loss function. Moreover, most control methods usually assume the certainty equivalence principle to simplify the problem and make it computationally tractable. We offer an improved probabilistic framework which is not constrained by these previous assumptions and provides a more natural setting for incorporating and dealing with uncertainty. The focus of this paper is on developing this framework to obtain an optimal control law strategy using a fully probabilistic approach for information extraction from process data, which does not require detailed knowledge of system dynamics. Moreover, the proposed framework allows handling the problem of input-dependent noise. A basic paradigm is proposed and the resulting algorithm is discussed. The proposed probabilistic control method applies to the general class of nonlinear discrete-time systems. It is demonstrated theoretically on the affine class. A nonlinear simulation example is also provided to validate the theoretical development.
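As background for the certainty-equivalence assumption the authors relax, a generic statement of it (standard control-theory material, not the paper's own formulation): the stochastic problem asks for

\[
u_t^{*} \;=\; \arg\min_{u_t}\; \mathbb{E}\big[\,\ell(x_{t+1},u_t)\,\big|\,\mathcal{D}_t\,\big],
\]

whereas a certainty-equivalent controller first forms the point estimate \hat{x}_{t+1}(u_t) = \mathbb{E}[x_{t+1} \mid \mathcal{D}_t, u_t] and then minimizes \ell(\hat{x}_{t+1}(u_t),u_t). The two coincide for linear dynamics with quadratic loss and additive Gaussian noise, but in general they differ, notably when the noise is input-dependent, which is exactly the situation the proposed fully probabilistic framework is meant to handle.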

Relevance:

60.00%

Publisher:

Abstract:

Robert J. Barro, a professor at Harvard University, is known among economists mainly for his results in the macroeconomic modeling of economic policy. His work covers both theoretical and empirical research. The present study summarizes the assumptions and results of those of Barro's investigations which, starting from the Ricardian equivalence principle, helped novel results explaining the theory of fiscal policy to take shape. In the 1980s the high budget deficit of the United States prompted many economists to work out theories on this subject. Since the excessive size of the budget deficit, which endangers our participation in the monetary union, is a source of almost daily debate in Hungary, it is particularly interesting and timely to review how a modern economist thinks about the causes and consequences of budget deficits. ________________ The question of budgetary discipline emerges in relation to the criteria of the Economic and Monetary Union in almost all European specialist journals today. Much less attention is paid to budgetary overspending, the adjustment of which posed a serious puzzle for the government and the economists of the United States in the 1980s. The Lucasian world of new classical economics questioned the effectiveness of government intervention, refuting above all the efficiency of fiscal policy. The macroeconomic models of Barro (1979, 1986) introduced in the present study - building upon a theoretical approach to economic policy on similar foundations - examine the effect of budgetary spending principally from a long-run perspective. His empirical analysis, spanning almost seventy years (1916–1982), is based upon the time series of variables affecting the budgetary deficit of the United States, distinguishing the effect of the usual government expenses from the above-average items within them. On the basis of his investigation of the United States and the United Kingdom he, furthermore, did not reject the economy-invigorating role of government spending; he opposed Lucas's conclusions and in this sense moved a modest step closer to the Keynesian standpoint. Barro, however, argues irrefutably on classical grounds: he recalls and reevaluates the Ricardian equivalence principle, summarizes the critiques raised against it and unintentionally praises the Classical economists. According to Barro we cannot ignore the old theorem of Ricardo if we endeavor to model government spending - we have to take it into account, if not definitely as a positive, then at least as a normative economic relationship.
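To make the Ricardian equivalence principle discussed above concrete, a textbook two-period illustration with invented numbers (not taken from Barro's papers): let the government cut lump-sum taxes today by 100, financing the gap with debt repaid next period at an interest rate of r = 5%, so taxes tomorrow rise by 105. For a forward-looking household with incomes y_1, y_2 and taxes \tau_1, \tau_2, lifetime wealth

\[
W \;=\; (y_1 - \tau_1) \;+\; \frac{y_2 - \tau_2}{1+r}
\]

changes by +100 - 105/1.05 = 0. The household therefore saves the whole tax cut, private saving rises one-for-one with the public deficit, and consumption and interest rates are left unchanged; this is the sense in which debt finance and tax finance are equivalent under Barro's assumptions (lump-sum taxes, no borrowing constraints, operative bequest motives).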

Relevance:

60.00%

Publisher:

Abstract:

Erratum to: A high-flux BEC source for mobile atom interferometers in: New Journal of Physics 17 (2015) 065001

Relevance:

60.00%

Publisher:

Abstract:

Quantum sensors based on coherent matter-waves are precise measurement devices whose ultimate accuracy is achieved with Bose-Einstein condensates (BECs) in extended free fall. This is ideally realized in microgravity environments such as drop towers, ballistic rockets and space platforms. However, the transition from lab-based BEC machines to robust and mobile sources with comparable performance is a challenging endeavor. Here we report on the realization of a miniaturized setup, generating a flux of 4 × 10^5 quantum degenerate Rb-87 atoms every 1.6 s. Ensembles of 1 × 10^5 atoms can be produced at a 1 Hz rate. This is achieved by loading a cold atomic beam directly into a multi-layer atom chip that is designed for efficient transfer from laser-cooled to magnetically trapped clouds. The attained flux of degenerate atoms is on par with current lab-based BEC experiments while offering significantly higher repetition rates. Additionally, the flux is approaching those of current interferometers employing Raman-type velocity selection of laser-cooled atoms. The compact and robust design allows for mobile operation in a variety of demanding environments and paves the way for transportable high-precision quantum sensors.

Relevance:

60.00%

Publisher:

Abstract:

We propose a very long baseline atom interferometer test of Einstein's equivalence principle (EEP) with ytterbium and rubidium extending over 10 m of free fall. In view of existing parametrizations of EEP violations, this choice of test masses significantly broadens the scope of atom interferometric EEP tests with respect to other performed or proposed tests by comparing two elements with high atomic numbers. In the first step, our experimental scheme will allow us to reach an accuracy in the Eötvös ratio of 7 × 10^-13. This achievement will constrain violation scenarios beyond our present knowledge and will represent an important milestone for exploring a variety of schemes for further improvements of the tests as outlined in the paper. We will discuss the technical realisation in the new infrastructure of the Hannover Institute of Technology (HITec) and give a short overview of the requirements needed to reach this accuracy. The experiment will demonstrate a variety of techniques which will be employed in future tests of EEP, high-accuracy gravimetry and gravity gradiometry. It includes operation of a force-sensitive atom interferometer with an alkaline-earth-like element in free fall, beam splitting over macroscopic distances and novel source concepts.
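For reference, the figure of merit quoted here is the Eötvös ratio, commonly defined for two test species as (standard definition, symbols mine):

\[
\eta(\mathrm{Yb},\mathrm{Rb}) \;=\; 2\,\frac{a_{\mathrm{Yb}} - a_{\mathrm{Rb}}}{a_{\mathrm{Yb}} + a_{\mathrm{Rb}}},
\]

the normalized difference of the measured free-fall accelerations; universality of free fall, and with it this facet of the EEP, predicts η = 0, so the quoted target of 7 × 10^-13 bounds any differential acceleration between ytterbium and rubidium at that fractional level.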

Relevance:

30.00%

Publisher:

Abstract:

In the first part of this paper we show a similarity between the Structural Risk Minimization (SRM) principle (Vapnik, 1982) and the idea of Sparse Approximation, as defined in Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). Then we focus on two specific (approximate) implementations of SRM and Sparse Approximation which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
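For concreteness, the two optimization problems being compared can be sketched in their generic textbook forms (my notation; the paper's precise conditions for equivalence are not reproduced here). Support vector regression with the ε-insensitive loss solves

\[
\min_{f}\;\; C\sum_{i=1}^{N}\big|y_i - f(x_i)\big|_{\varepsilon} \;+\; \tfrac{1}{2}\,\|f\|_{K}^{2},
\]

while Basis Pursuit De-Noising solves

\[
\min_{a}\;\; \tfrac{1}{2}\,\Big\|y - \textstyle\sum_{j} a_j \varphi_j\Big\|_{2}^{2} \;+\; \lambda\,\|a\|_{1}.
\]

Roughly speaking, when the dictionary {φ_j} consists of the kernel K centred at the data points, both problems reduce to quadratic programs over the expansion coefficients, which is the sense in which the paper relates them.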

Relevance:

30.00%

Publisher:

Abstract:

Complex numbers appear in the Hilbert space formulation of quantum mechanics, but not in the formulation in phase space. Quantum symmetries are described by complex, unitary or antiunitary operators defining ray representations in Hilbert space, whereas in phase space they are described by real, true representations. Equivalence of the formulations requires that the former representations can be obtained from the latter and vice versa. Examples are given. Equivalence of the two formulations also requires that complex superpositions of state vectors can be described in the phase space formulation, and it is shown that this leads to a nonlinear superposition principle for orthogonal, pure-state Wigner functions. It is concluded that the use of complex numbers in quantum mechanics can be regarded as a computational device to simplify calculations, as in all other applications of mathematics to physical phenomena.
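As background for the superposition issue raised here, the phase-space object for a pure state is the Wigner function; with a standard convention (ħ kept explicit, notation mine)

\[
W_\psi(x,p) \;=\; \frac{1}{\pi\hbar}\int \mathrm{d}y\;\psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar},
\]

a complex superposition \psi = c_1\psi_1 + c_2\psi_2 of orthogonal pure states gives

\[
W_\psi \;=\; |c_1|^{2}\,W_{11} + |c_2|^{2}\,W_{22} + c_1^{*}c_2\,W_{12} + c_2^{*}c_1\,W_{21},
\]

where W_{ij} denotes the same integral with \psi_i^{*} and \psi_j inserted. The interference terms W_{12}, W_{21} carry the complex amplitudes, so a self-contained phase-space formulation must be able to express them, nonlinearly, in terms of W_{11} and W_{22} alone, which is the nonlinear superposition principle the abstract refers to.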