964 results for 230201 Probability Theory
Abstract:
We present a general formalism for deriving bounds on the shape parameters of the weak and electromagnetic form factors, using as input correlators calculated from perturbative QCD and exploiting analyticity and unitarity. The values resulting from the symmetries of QCD at low energies or from lattice calculations at special points inside the analyticity domain can be included in an exact way. We write down the general solution of the corresponding Meiman problem for an arbitrary number of interior constraints and the integral equations that allow one to include the phase of the form factor along a part of the unitarity cut. A formalism that includes the phase and some information on the modulus along a part of the cut is also given. For illustration we present constraints on the slope and curvature of the Kℓ3 scalar form factor and discuss our findings in some detail. The techniques are useful for checking the consistency of various inputs and for controlling the parameterizations of the form factors entering precision predictions in flavor physics.
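For orientation, a common convention for the shape parameters (an illustrative choice here, not necessarily the one adopted in the paper) defines the slope λ₀' and curvature λ₀'' through the Taylor expansion of the normalized scalar form factor around t = 0:

```latex
\frac{f_0(t)}{f_0(0)} = 1 + \lambda_0'\,\frac{t}{M_\pi^2}
  + \frac{1}{2}\,\lambda_0''\,\Big(\frac{t}{M_\pi^2}\Big)^2 + \dots
```

Bounds of the kind discussed above then carve out an allowed domain in the (λ₀', λ₀'') plane.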
Abstract:
Migrating electrons in biological systems are normally extraneous; taking this into account, electron delocalisation across the hydrogen bonds in proteins is re-examined. It is seen that an extraneous electron can travel rapidly via the low-lying virtual orbitals of the hydrogen-bonded π-electronic structure of the peptide units in proteins. The frequency of electron transfer decreases slowly with increasing path length; however, the coupling of electronic and protonic motions enhances this frequency. Transfer of electrons across the hydrogen bonds in accordance with the double-exchange mechanism does not appear to be possible. This theory thus offers a way for an extraneous electron to transfer within protein structures.
Abstract:
In this thesis I study various quantum coherence phenomena and lay some of the foundations for a systematic coherence theory. So far, the approach to quantum coherence in science has been purely phenomenological. In my thesis I try to answer the question of what quantum coherence is and how it should be approached within the framework of physics, the metatheory of physics, and the terminology related to them. It is worth noticing that quantum coherence is a conserved quantity that can be exactly defined. I propose a way to define quantum coherence mathematically from the density matrix of the system. Degenerate quantum gases, i.e., Bose condensates and ultracold Fermi systems, form a good laboratory for studying coherence, since their entropy is small and their coherence is large, and thus they exhibit strong coherence phenomena. Concerning coherence phenomena in degenerate quantum gases, I concentrate mainly on collective association of atoms into molecules, Rabi oscillations, and decoherence. It turns out that collective association and oscillations do not depend on the spin statistics of the particles. Moreover, I study the logical features of decoherence in closed systems via a simple spin model. I argue that decoherence is a valid concept even in systems that can experience recoherence, i.e., Poincaré recurrences. Metatheoretically this is a remarkable result, since it justifies quantum cosmology: studying the whole universe (i.e., physical reality) purely quantum physically is meaningful and valid science, in which decoherence explains why the quantum physical universe appears very classical to cosmologists and other scientists. The study of the logical structure of closed systems also reveals that sufficiently complex closed (physical) systems obey a principle similar to Gödel's incompleteness theorem in logic: it is impossible to describe a closed system completely from within the system, and the inside and outside descriptions of the system can differ remarkably. Understanding this feature may make it possible to comprehend coarse-graining better and to define the mutual entanglement of quantum systems uniquely.
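As a hedged illustration of how coherence can be quantified from the density matrix (standard measures from the coherence-theory literature, not necessarily the exact definition proposed in the thesis), one may use, in a fixed basis,

```latex
C_{\ell_1}(\rho) = \sum_{i \neq j} |\rho_{ij}|,
\qquad
C_{\mathrm{rel}}(\rho) = S(\rho_{\mathrm{diag}}) - S(\rho),
```

where S is the von Neumann entropy and ρ_diag retains only the diagonal of ρ; both quantities vanish exactly for incoherent (diagonal) states.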
Abstract:
In this thesis the current status and some open problems of noncommutative quantum field theory are reviewed. The introduction aims to put these theories in their proper context as a part of the larger program to model the properties of quantized space-time. Throughout the thesis, special focus is placed on the role of noncommutative time and how its nonlocal nature presents us with problems. Applications in scalar field theories as well as in gauge field theories are presented. The infinite nonlocality of space-time introduced by the noncommutative coordinate operators leads to interesting structure and new physics: high energy and low energy scales are mixed, causality and unitarity are threatened, and in gauge theory the tools for model building are drastically reduced. As a case study in noncommutative gauge theory, the Dirac quantization condition of magnetic monopoles is examined, with the conclusion that, at least in perturbation theory, it cannot be fulfilled in noncommutative space.
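The standard setup behind such theories (the thesis's conventions may differ) replaces commuting coordinates by operators obeying

```latex
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu},
\qquad
(f \star g)(x) = f(x)\,
\exp\!\Big(\tfrac{i}{2}\overleftarrow{\partial}_{\mu}\,\theta^{\mu\nu}\,\overrightarrow{\partial}_{\nu}\Big)\, g(x),
```

where θ^{μν} is a constant antisymmetric matrix and the Moyal star product implements the noncommutativity in field theory; a nonzero θ^{0i} is precisely what makes time noncommutative and puts unitarity and causality at risk.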
Abstract:
In this paper, a novel genetic algorithm is developed that generates artificial chromosomes with probability control to solve machine scheduling problems. Generating artificial chromosomes for a genetic algorithm (ACGA) is closely related to Evolutionary Algorithms Based on Probabilistic Models (EAPM). The artificial chromosomes are generated by a probability model that extracts gene information from the current population. ACGA is a hybrid algorithm because it integrates both conventional genetic operators and a probability model. The ACGA proposed in this paper further employs the "evaporation concept" used in Ant Colony Optimization (ACO) to solve the permutation flowshop problem. Evaporation reduces the effect of past experience and encourages the exploration of new alternative solutions. We propose three different methods for setting the probability of evaporation, which is applied as soon as a job is assigned to a position in the permutation flowshop problem. Experimental results show that our ACGA with the evaporation concept performs better than several algorithms in the literature.
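A minimal sketch of the core loop, assuming one plausible reading of the evaporation rule (the paper proposes three variants; the function names, the rate rho, and the exact damping rule below are illustrative):

```python
import random

def build_probability_model(population, n_jobs):
    """Estimate P(job j at position p) from the current population of permutations."""
    model = [[0.0] * n_jobs for _ in range(n_jobs)]
    for perm in population:
        for pos, job in enumerate(perm):
            model[pos][job] += 1.0
    n = float(len(population))
    return [[c / n for c in row] for row in model]

def sample_artificial_chromosome(model, n_jobs, rho=0.1):
    """Sample a job permutation position by position; as soon as a job is
    assigned to a position, its probability mass there is 'evaporated'
    (damped by rho) so that later samples explore alternatives.
    Note: the model is mutated in place, so evaporation persists across calls."""
    remaining = set(range(n_jobs))
    chromosome = []
    for pos in range(n_jobs):
        weights = [model[pos][j] if j in remaining else 0.0
                   for j in range(n_jobs)]
        if sum(weights) <= 0.0:
            job = random.choice(sorted(remaining))  # degenerate column: uniform fallback
        else:
            job = random.choices(range(n_jobs), weights=weights)[0]
        model[pos][job] *= (1.0 - rho)  # evaporation of past experience
        chromosome.append(job)
        remaining.remove(job)
    return chromosome
```

In a full ACGA, chromosomes sampled this way would be injected into the population alongside offspring produced by the conventional genetic operators.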
Abstract:
Modern elementary particle physics is based on quantum field theories. Currently, our understanding is that both the smallest structures of matter and the composition of the universe are described by quantum field theories, which present the observable phenomena by describing particles as vibrations of fields. The Standard Model of particle physics is a quantum field theory describing the electromagnetic, weak, and strong interactions in terms of a gauge field theory. However, it is believed that the Standard Model describes physics properly only up to a certain energy scale. This scale cannot be much larger than the so-called electroweak scale, i.e., the masses of the gauge fields W^± and Z^0. Beyond this scale, the Standard Model has to be modified. In this dissertation, supersymmetric theories are used to tackle the problems of the Standard Model. For example, the quadratic divergences that plague the Higgs boson mass in the Standard Model cancel in supersymmetric theories. Experimental facts concerning the neutrino sector indicate that lepton number is violated in Nature. On the other hand, lepton-number-violating Majorana neutrino masses can induce sneutrino-antisneutrino oscillations in any supersymmetric model. In this dissertation, I present some viable signals for detecting sneutrino-antisneutrino oscillation at colliders. At the e-gamma collider (at the International Linear Collider), the number of electron-sneutrino-antisneutrino oscillation signal events is quite high and the backgrounds are quite small. A similar study for the LHC shows that, even though there are several backgrounds, the sneutrino-antisneutrino oscillations can be detected. A useful asymmetry observable is introduced and studied. Usually, the oscillation probability formula for sneutrinos produced at rest is used; here, however, we study the general oscillation probability. The Lorentz factor and the distance at which the measurement is made inside the detector can have effects, especially when the sneutrino decay width is very small. These effects are demonstrated for a certain scenario at the LHC.
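For orientation, a hedged sketch of the kind of formula involved (a generic two-state oscillation with decay, not necessarily the dissertation's exact expression): in the sneutrino rest frame

```latex
P_{\tilde{\nu}\to\bar{\tilde{\nu}}}(t) \;\simeq\;
e^{-\Gamma t}\,\sin^{2}\!\Big(\frac{\Delta m_{\tilde{\nu}}\,t}{2}\Big),
\qquad
t = \frac{L}{\beta\gamma}\;\;(\text{lab-frame distance } L),
```

so the Lorentz factor γ and the measurement distance L enter through the replacement of the proper time; integrating over all decay times gives the familiar x²/(2(1 + x²)) with x = Δm/Γ, which shows why a very small decay width makes the oscillation signal most visible.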
Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by a French scientist, P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, which were published in a memoir in 1774, one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples, the idea being that the sample would be a miniature of the population. It is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics and introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw samples repeatedly from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
Abstract:
Based on a method proposed by Reddy and Daum, the equations governing the steady inviscid nonreacting gasdynamic laser (GDL) flow in a supersonic nozzle are reduced to a universal form so that the solutions depend on a single parameter which combines all the other parameters of the problem. Solutions are obtained for a sample case of available data and compared with existing results to validate the present approach. Also, similar solutions for a sample case are presented.
Abstract:
A microscopic study of the non-Markovian (or memory) effects on collective orientational relaxation in a dense dipolar liquid is carried out by using an extended hydrodynamic approach which provides a reliable description of the dynamical processes occurring at molecular length scales. Detailed calculations of the wave-vector dependent orientational correlation functions are presented. The memory effects are found to play an important role; the non-Markovian results differ considerably from those of the Markovian theory. In particular, a slow long-time decay of the longitudinal orientational correlation function is observed for dense liquids, which becomes weaker in the presence of a sizeable translational contribution to the collective orientational relaxation. This slow decay can be attributed to intermolecular correlations at molecular length scales. The longitudinal component of the orientational correlation function becomes oscillatory in the underdamped limit of momentum relaxation, and the frequency dependence of the friction reduces the frictional resistance on the collective excitations (commonly known as dipolarons), making them long lived. The theory predicts that these dipolarons can, therefore, be important in chemical relaxation processes, in contradiction to the claims of some earlier theoretical studies.
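Schematically, the non-Markovian structure at issue is that of a memory-function (Zwanzig-Mori) equation for the wave-vector dependent orientational correlation function (a generic form; the detailed kernel comes from the extended hydrodynamic theory):

```latex
\frac{\partial C(k,t)}{\partial t}
= -\int_{0}^{t} \Gamma(k,\,t-t')\, C(k,t')\,\mathrm{d}t'.
```

The Markovian approximation Γ(k,t) = 2Γ₀(k) δ(t) recovers a simple exponential decay, so the slow long-time tails and the underdamped oscillations described above are direct signatures of the memory kernel.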
Abstract:
A molecular theory of dielectric relaxation in a dense binary dipolar liquid is presented. The theory takes into account the effects of intra- and interspecies intermolecular interactions. It is shown that the relaxation is, in general, nonexponential. In certain limits, we recover the biexponential form traditionally used to analyze the experimental data of dielectric relaxation in a binary mixture. However, the relaxation times are widely different from the predictions of the noninteracting rotational diffusion model of Debye for a binary system. Detailed numerical evaluation of the frequency-dependent dielectric function ε(ω) is carried out by using the known analytic solution of the mean spherical approximation (MSA) model for the two-particle direct correlation function of a polar mixture. A microscopic expression for the wave-vector (k) and frequency (ω) dependent dielectric function ε(k, ω) of a binary mixture is also presented. The theoretical predictions for ε(ω) (= ε(k = 0, ω)) have been compared with the available experimental results. In particular, the present theory offers a molecular explanation of the fusing of the two relaxation channels of the neat liquids, observed by Schallamach many years ago.
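The biexponential form mentioned above corresponds in the frequency domain to a two-Debye expression (the sign convention for iωτ varies between authors):

```latex
\varepsilon(\omega) - \varepsilon_{\infty}
= \frac{\Delta\varepsilon_{1}}{1 - i\omega\tau_{1}}
+ \frac{\Delta\varepsilon_{2}}{1 - i\omega\tau_{2}},
```

with one relaxation time per channel; the molecular theory yields effective τ₁ and τ₂ that deviate widely from the Debye rotational-diffusion values and that can merge into a single channel, as in Schallamach's observation.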
Abstract:
Fujikawa's method of evaluating the supercurrent and superconformal current anomalies, using the heat-kernel regularization scheme, is extended to theories with gauge invariance, in particular to the off-shell N=1 supersymmetric Yang-Mills (SSYM) theory. The Jacobians of the supersymmetry and superconformal transformations are finite. Although the gauge-fixing term is not supersymmetric and the regularization scheme is not manifestly supersymmetric, we find that the regularized Jacobians are gauge invariant and finite, and that they can be expressed in such a way that there is no one-loop supercurrent anomaly for the N=1 SSYM theory. The superconformal anomaly is nonzero and agrees with similar results obtained using other methods.
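For context, the generic structure of Fujikawa's method in its standard textbook form (the supersymmetric case treated in the paper involves the corresponding superspace operators): under a local transformation ψ → e^{iα(x)Γ}ψ, the path-integral measure changes by a Jacobian

```latex
\mathcal{D}\psi\,\mathcal{D}\bar{\psi} \;\to\;
\mathcal{D}\psi\,\mathcal{D}\bar{\psi}\,
\exp\!\Big(-2i\int \mathrm{d}^{4}x\,\alpha(x)\,\mathcal{A}(x)\Big),
\qquad
\mathcal{A}(x) = \lim_{M\to\infty}
\mathrm{tr}\Big[\Gamma\, e^{-(\slashed{D}/M)^{2}}\Big](x,x),
```

and the anomaly is the statement that the heat-kernel regularized density 𝒜(x) does not vanish.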
Abstract:
We have derived explicitly the large-scale distribution of the quantum Ohmic resistance of a disordered one-dimensional conductor. We show that in the thermodynamic limit this distribution is characterized by two independent parameters for strong disorder, leading to a two-parameter scaling theory of localization. Only in the limit of weak disorder do we recover single-parameter scaling, consistent with existing theoretical treatments.
Abstract:
The Integrated Force Method (IFM) is a novel matrix formulation developed for analyzing civil, mechanical, and aerospace engineering structures. In this method, all independent/internal forces are treated as unknown variables, which are calculated by simultaneously imposing the equations of equilibrium and the compatibility conditions. This paper presents a new 12-node serendipity quadrilateral plate bending element, MQP12, for the analysis of thin and thick plate problems using IFM. The Mindlin-Reissner plate theory has been employed in the formulation, which accounts for the effect of shear deformation. The performance of this new element with respect to accuracy and convergence is studied by analyzing many standard benchmark plate bending problems. The results of the new element MQP12 are compared with those of displacement-based 12-node plate bending elements available in the literature, as well as with exact solutions. The new element MQP12 is free from shear locking and performs excellently in both thin and moderately thick plate bending situations.
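Schematically (a hedged sketch of the IFM structure as it is usually presented, with illustrative symbol names), the m equilibrium equations and the r = n − m compatibility conditions are imposed simultaneously on the n unknown internal forces {F}:

```latex
[B]\{F\} = \{P\} \;\;(\text{equilibrium}),
\qquad
[C][G]\{F\} = \{\delta R\} \;\;(\text{compatibility}),
```

and the two sets are stacked into a single square system [S]{F} = {P*} that is solved directly for the internal forces, with displacements recovered by back-substitution.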