999 results for Supersymmetric gauge theory


Relevance:

20.00%

Publisher:

Abstract:

Recent work of Jones et al. giving the long-range behaviour of the pair correlation function is used to confirm that the critical ratio P_c/(n_c k_B T_c) = 1/2 in the Born-Green theory. This deviates from experimental results on simple insulating liquids by more than the predictions of the van der Waals equation of state. A brief discussion of conditions for thermodynamic consistency, which the Born-Green theory violates, is then given. Finally, the approach of the Ornstein-Zernike correlation function to its critical-point behaviour is discussed within the Born-Green theory.
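
For context, the standard numbers behind this comparison can be set side by side (textbook values; only the Born-Green ratio of 1/2 comes from the abstract itself):

```latex
% Critical compressibility ratio; the Born-Green value is from the
% abstract, the van der Waals and experimental (e.g. argon) values
% are standard reference numbers.
\[
  Z_c \equiv \frac{P_c}{n_c k_B T_c}, \qquad
  Z_c^{\mathrm{Born\text{-}Green}} = \tfrac{1}{2}, \qquad
  Z_c^{\mathrm{vdW}} = \tfrac{3}{8} = 0.375, \qquad
  Z_c^{\mathrm{expt}} \approx 0.29,
\]
% so the Born-Green prediction indeed lies further from experiment
% than the van der Waals one, as the abstract states.
```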

Relevance:

20.00%

Publisher:

Abstract:

The nonlinear propagation characteristics of surface acoustic waves on an isotropic elastic solid have been studied in this paper. The solution of the harmonic boundary value problem for Rayleigh waves is obtained as a generalized Fourier series whose coefficients are proportional to the slowly varying amplitudes of the various harmonics. When solved, the infinite set of coupled equations for the amplitudes exhibits an oscillatory slow variation, signifying a continuous transfer of energy back and forth among the various harmonics. A conservation relation is derived among all the harmonic amplitudes.
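
A minimal two-harmonic toy system of the same general type (a fundamental quadratically coupled to its second harmonic) illustrates the back-and-forth energy transfer and the conservation relation; the equations, the coupling kappa, the mismatch delta_k, and the initial data below are illustrative choices, not the paper's own amplitude equations.

```python
# Toy coupled-amplitude system: fundamental A1 and second harmonic A2
# with quadratic coupling and a phase mismatch (all parameters invented).
import numpy as np

kappa, delta_k = 1.0, 2.0

def rhs(t, a):
    a1, a2 = a
    return np.array([1j * kappa * np.conj(a1) * a2 * np.exp(-1j * delta_k * t),
                     0.5j * kappa * a1**2 * np.exp(1j * delta_k * t)])

a = np.array([1.0 + 0j, 0.0 + 0j])   # all energy initially in the fundamental
dt, steps = 0.01, 1500
for n in range(steps):                # classic fixed-step RK4
    t = n * dt
    k1 = rhs(t, a)
    k2 = rhs(t + dt / 2, a + dt / 2 * k1)
    k3 = rhs(t + dt / 2, a + dt / 2 * k2)
    k4 = rhs(t + dt, a + dt * k3)
    a = a + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    if (n + 1) % 250 == 0:
        e1, e2 = abs(a[0])**2, abs(a[1])**2
        # Energy sloshes between the harmonics while the Manley-Rowe-type
        # invariant |A1|^2 + 2|A2|^2 stays constant.
        print(f"t={t + dt:5.2f}  |A1|^2={e1:.3f}  |A2|^2={e2:.3f}  "
              f"invariant={e1 + 2 * e2:.6f}")
```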

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis was to develop measurement techniques and systems for measuring air quality and to provide information about air quality conditions and the amount of gaseous emissions from semi-insulated and uninsulated dairy buildings in Finland and Estonia. Specialization and intensification in livestock farming, such as in dairy production, is usually accompanied by an increase in concentrated environmental emissions. In addition to high moisture, the presence of dust and corrosive gases, and widely varying gas concentrations in dairy buildings, Finland and Estonia experience winter temperatures reaching below -40 ºC and summer temperatures above +30 ºC. The adoption of new technologies for long-term air quality monitoring and measurement remains relatively uncommon in dairy buildings because the construction and maintenance of accurate monitoring systems for long-term use are too expensive for the average dairy farmer to afford. Though accurate air quality measurement systems intended mainly for research purposes have been documented in the past, standardised methods and documentation of affordable systems and simple methods for performing air quality and emissions measurements in dairy buildings are unavailable. In this study, we built three measurement systems: 1) a Stationary system with integrated affordable sensors for on-site measurements, 2) a Wireless system with affordable sensors for off-site measurements, and 3) a Mobile system consisting of expensive and accurate sensors for measuring air quality. In addition to assessing existing methods, we developed simplified methods for measuring ventilation and emission rates in dairy buildings. The three measurement systems were successfully used to measure air quality in uninsulated, semi-insulated, and fully insulated dairy buildings between 2005 and 2007. When carefully calibrated, the affordable sensors in the systems gave reasonably accurate readings. The spatial air quality survey showed high variation in microclimate conditions in the dairy buildings measured. The average indoor concentrations were 950 ppm for carbon dioxide, 5 ppm for ammonia and 48 ppm for methane; the average relative humidity was 70% and the average inside air velocity 0.2 m/s. The average winter and summer indoor temperatures during the measurement period were -7 ºC and +24 ºC for the uninsulated, +3 ºC and +20 ºC for the semi-insulated, and +10 ºC and +25 ºC for the fully insulated dairy buildings. The measurement results showed that the uninsulated dairy buildings had lower indoor gas concentrations and emissions than the fully insulated buildings. Although occasionally exceeded, the ventilation rates and average indoor air quality in the dairy buildings were largely within recommended limits. We assessed the traditional heat balance, moisture balance, carbon dioxide balance and direct airflow methods for estimating ventilation rates. Direct velocity measurement for estimating the ventilation rate proved impractical for naturally ventilated buildings. Two new methods were therefore developed: the first is applicable in buildings in which the ventilation can be stopped or completely closed; the second is useful in naturally ventilated buildings with large openings and high ventilation rates, where spatial gas concentrations are heterogeneously distributed.
The two traditional methods (carbon dioxide and methane balances) and two newly developed methods (theoretical modelling using Fick's law and boundary-layer theory, and the recirculation flux-chamber technique) were used to estimate ammonia emissions from the dairy buildings. Using the traditional carbon dioxide balance method, ammonia emissions per cow from the dairy buildings ranged from 7 g day⁻¹ to 35 g day⁻¹, and methane emissions per cow ranged from 96 g day⁻¹ to 348 g day⁻¹. The developed methods proved to be as accurate as the traditional ones: the variation between the mean emissions estimated with the traditional and the developed methods was less than 20%. The developed modelling procedure provided a sound framework for examining the impact of production systems on ammonia emissions in dairy buildings.
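
A minimal sketch of the carbon dioxide balance method mentioned above: at steady state the ventilation rate Q satisfies Q·(C_in − C_out) = P, where P is the herd's total CO2 production. The numbers below are illustrative assumptions, not measurements from the thesis.

```python
# Steady-state CO2 balance for ventilation-rate estimation (toy numbers).

def ventilation_rate_co2(co2_production_m3_per_h: float,
                         c_in_ppm: float, c_out_ppm: float) -> float:
    """Ventilation rate (m^3/h) from a steady-state CO2 balance."""
    delta = (c_in_ppm - c_out_ppm) * 1e-6  # ppm -> volume fraction
    if delta <= 0:
        raise ValueError("indoor CO2 must exceed outdoor CO2")
    return co2_production_m3_per_h / delta

# Hypothetical example: 50 cows at an assumed 0.25 m^3 CO2/h per cow,
# the 950 ppm average indoor concentration reported above, 400 ppm outdoors.
q = ventilation_rate_co2(50 * 0.25, 950.0, 400.0)
print(f"estimated ventilation rate: {q:,.0f} m^3/h")  # ~22,700 m^3/h
```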

Relevance:

20.00%

Publisher:

Abstract:

In this paper the kinematics of a weak shock front governed by a hyperbolic system of conservation laws is studied. This is used to develop a method for solving problems involving the propagation of nonlinear unimodal waves. It consists of first solving the nonlinear wave problem by moving along the bicharacteristics of the system and then fitting the shock into this solution field, so that it satisfies the necessary jump conditions. The kinematics of the shock leads in a natural way to the definition of "shock-rays", which play the same role as the "rays" in a continuous flow. A special case of a circular cylinder introduced suddenly in a constant streaming flow is studied in detail. The shock fitted in the upstream region propagates with a velocity which is the mean of the velocities of the linear and the nonlinear wave fronts. Downstream, the solution is given by an expansion wave.
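
The "mean of the linear and nonlinear wave-front velocities" is the familiar weak-shock rule; the inviscid Burgers equation gives its simplest exact form (standard background, not a formula quoted from the paper):

```latex
% Rankine-Hugoniot speed for the inviscid Burgers equation.
\[
  u_t + \left(\tfrac{1}{2}u^2\right)_x = 0
  \quad\Longrightarrow\quad
  s = \frac{u_L + u_R}{2},
\]
% i.e. a weak shock travels at the average of the characteristic
% speeds just behind (u_L) and just ahead (u_R) of the front.
```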

Relevance:

20.00%

Publisher:

Abstract:

Many grand unified theories (GUTs) predict non-Abelian monopoles which are sources of non-Abelian (and Abelian) magnetic flux. In the preceding paper, we discussed in detail the topological obstructions to the global implementation of the action of the "unbroken symmetry group" H on a classical test particle in the field of such a monopole. In this paper, the existence of similar topological obstructions to the definition of the H action on the fields in such a monopole sector, as well as on the states of a quantum-mechanical test particle in the presence of such fields, is shown in detail. Some subgroups of H which can be globally realized as groups of automorphisms are identified. We also discuss the application of our analysis to the SU(5) GUT and show in particular that the non-Abelian monopoles of that theory break color and electroweak symmetries.

Relevance:

20.00%

Publisher:

Abstract:

The third-kind linear integral equation g(t)φ(t) = f(t) + λ∫_a^b K(t, t′)φ(t′) dt′, where g(t) vanishes at a finite number of points in (a, b), is considered. In general, the Fredholm Alternative theory [5] does not hold good for this type of integral equation. However, imposing certain conditions on g(t) and K(t, t′), the above integral equation was shown [1, pp. 49–57] to obey a Fredholm-type theory, except for a certain class of kernels for which the question was left open. In this note a theory is presented for the equation under consideration with some additional assumptions on such kernels.
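
For contrast, the classical Fredholm alternative concerns equations of the second kind, where the coefficient of the unknown never vanishes (standard definitions, not taken from the note itself):

```latex
% Second kind: the coefficient of phi is identically 1, so the
% classical Fredholm theory applies.
\[
  \phi(t) = f(t) + \lambda \int_a^b K(t, t')\,\phi(t')\,dt' .
\]
% Third kind: phi is instead multiplied by a g(t) that vanishes at
% finitely many points of (a, b), which is what breaks the alternative.
```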

Relevance:

20.00%

Publisher:

Abstract:

I agree with Costanza and Finkelstein (2015) that it is futile to further invest in the study of generational differences in the work context due to a lack of appropriate theory and methods. The key problem with the generations concept is that splitting continuous variables such as age or time into a few discrete units involves arbitrary cutoffs and atheoretical groupings of individuals (e.g., stating that all people born between the early 1960s and early 1980s belong to Generation X). As noted by methodologists, this procedure leads to a loss of information about individuals and reduced statistical power (MacCallum, Zhang, Preacher, & Rucker, 2002). Due to these conceptual and methodological limitations, I regard it as very difficult if not impossible to develop a “comprehensive theory of generations” (Costanza & Finkelstein, p. 20) and to rigorously examine generational differences at work in empirical studies.
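
The loss of power from splitting a continuous variable, the point credited to MacCallum, Zhang, Preacher, and Rucker (2002), is easy to demonstrate by simulation; the sketch below (sample size, effect size, and the 1972 cutoff are all arbitrary illustrative choices) compares a test on the continuous predictor with a t-test after imposing a generational cutoff.

```python
# Illustrative simulation: dichotomizing a continuous predictor
# (birth year) into "generations" discards information and power.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, effect, trials = 200, 0.15, 2000
power_cont = power_group = 0

for _ in range(trials):
    birth_year = rng.uniform(1960, 1985, n)
    z = (birth_year - birth_year.mean()) / birth_year.std()
    outcome = effect * z + rng.normal(0.0, 1.0, n)

    _, p_cont = stats.pearsonr(birth_year, outcome)    # continuous test
    a, b = outcome[birth_year < 1972], outcome[birth_year >= 1972]
    _, p_group = stats.ttest_ind(a, b)                 # arbitrary grouping

    power_cont += p_cont < 0.05
    power_group += p_group < 0.05

print(f"power with continuous predictor: {power_cont / trials:.2f}")
print(f"power after arbitrary grouping:  {power_group / trials:.2f}")
```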

Relevance:

20.00%

Publisher:

Abstract:

Background: Excessive speed is a primary contributing factor to young novice road trauma, including intentional and unintentional speeds above posted limits or too fast for conditions. The objective of this research was to conduct a systematic review of recent investigations into novice drivers' speed selection, with particular attention to applications and limitations of theory and methodology.
Method: Systematic searches of peer-reviewed and grey literature were conducted during September 2014. Abstract reviews identified 71 references potentially meeting the selection criteria: investigations since the year 2000 into factors that influence (directly or indirectly) the actual speed (i.e., behaviour or performance) of young (age <25 years) and/or novice (recently licensed) drivers.
Results: Full-paper reviews resulted in 30 final references: 15 focused on intentional speeding and 15 on broader speed selection investigations. Both sets identified a range of individual (e.g., beliefs, personality) and social (e.g., peer, adult) influences, were predominantly theory-driven, and applied cross-sectional designs. Intentional speeding investigations largely utilised self-reports, while other investigations more often included actual driving (simulated or 'real world'). The latter also identified cognitive workload and external environment influences, as well as targeted interventions.
Discussion and implications: Applications of theory have shifted the novice speed-related literature beyond a simplistic focus on intentional speeding as human error. The potential emerged to develop a 'grand theory' of intentional speeding and to fill gaps in understanding broader speed selection influences. This includes the need for future investigations of vehicle-related and physical-environment influences, and for methodologies that move beyond cross-sectional designs and rely less on self-reports.

Relevance:

20.00%

Publisher:

Abstract:

The adequacy and efficiency of existing legal and regulatory frameworks dealing with corporate phoenix activity have been repeatedly called into question over the past two decades through various reviews, inquiries, targeted regulatory operations and the implementation of piecemeal legislative reform. Despite these efforts, phoenix activity does not appear to have abated. While there is no law in Australia that declares 'phoenix activity' to be illegal, the behaviour that tends to manifest in phoenix activity is capable of transgressing a vast array of law, including, for example, corporate law, tax law and employment law. This paper explores the notion that the persistence of phoenix activity despite the sheer extent of this law suggests that the law is not acting as powerfully as it might as a deterrent. Economic theories of entrepreneurship and innovation can to some extent explain why this is the case, and also offer a sound basis for the evaluation and reconsideration of the existing law. The challenges facing key regulators are significant. Phoenix activity is not limited to a particular corporate demographic: it occurs in SMEs, in large companies and in corporate groups. The range of behaviour that can amount to phoenix activity is so broad that not all phoenix activity is illegal. This paper considers regulatory approaches to these challenges via analysis of approaches to the detection and enforcement of the underlying law capturing illegal phoenix activity. Remedying the mischief of phoenix activity is of practical importance. The benefits include continued confidence in our economy, law that inspires best practice among directors, and law that is articulated in a manner such that penalties act as a sufficient deterrent and the regulatory system is able to detect offenders and bring them to account. Any further reforms must accommodate and tolerate legal phoenix activity, at least to some extent. Even then, phoenix activity pushes tolerance of repeated entrepreneurial failure to its absolute limit. The more limited liability is misused and abused, the stronger the argument for placing some restrictions on access to it. This paper proposes that such an approach is a legitimate next step for a robust and mature capitalist economy.

Relevance:

20.00%

Publisher:

Abstract:

We demonstrate the phenomenon stated in the title, using for illustration a two-dimensional scalar-field model with a triple-well potential. At the classical level, this system supports static topological solitons with finite energy. Upon quantisation, however, these solitons develop infinite energy, which cannot be renormalised away. Thus this quantised model has no soliton sector, even though classical solitons exist. Finally, when the model is extended supersymmetrically by adding a Majorana field, finiteness of the soliton energy is recovered.
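
For concreteness, a representative triple-well potential (an illustrative standard form; the paper's exact expression did not survive extraction into this listing) is:

```latex
% Hypothetical illustrative triple well: three degenerate minima at
% phi = 0 and phi = +/- a, supporting static kinks that interpolate
% between adjacent minima.
\[
  V(\phi) = \frac{\lambda^2}{2}\,\phi^2\left(\phi^2 - a^2\right)^2 .
\]
```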

Relevance:

20.00%

Publisher:

Abstract:

A semi-empirical model is presented for describing the interionic interactions in molten salts using the experimentally available structure data. An extension of Bertaut's method of non-overlapping charges is used to estimate the electrostatic interaction energy in ionic melts. It is shown, in agreement with earlier computer simulation studies, that this energy increases when an ionic salt melts. The repulsion between ions is described using a compressible ion theory which uses structure-independent parameters. The van der Waals interactions and the thermal free energy are also included in the total energy, which is minimised with respect to isostructural volume variations to calculate the equilibrium density. Detailed results are presented for three molten systems, NaCl, CaCl2 and ZnCl2, and are shown to be in satisfactory agreement with experiments. With reliable structural data now being reported for several other molten salts, the present study gains relevance.
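
A toy version of the minimisation step described above: a model energy per ion pair with a Madelung-like attraction and a steep short-range repulsion, minimised over the (isostructural) volume. The constants A, B and n are invented for illustration, not the paper's parameters.

```python
# Minimise a toy total energy E(V) over volume to find the
# equilibrium density (model units throughout).
from scipy.optimize import minimize_scalar

A, B, n = 1.0, 0.1, 9.0

def total_energy(volume: float) -> float:
    r = volume ** (1.0 / 3.0)      # nearest-neighbour distance ~ V^(1/3)
    return -A / r + B / r**n       # attraction + short-range repulsion

res = minimize_scalar(total_energy, bounds=(0.1, 10.0), method="bounded")
print(f"equilibrium volume: {res.x:.3f}  (model units)")
print(f"energy at minimum:  {res.fun:.4f}")
```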

Relevance:

20.00%

Publisher:

Abstract:

There exist various proposals for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic suggestion, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault-tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems which are described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and to carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique: the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered. Studying their applicability for quantum computation could be a topic of further research.
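
The "non-trivial fusion algebra" idea in miniature: the sketch below uses the two-particle Fibonacci model (1 and tau, with tau x tau = 1 + tau) as a stand-in, not the S_3 quantum-double model of the thesis, purely to show how fusion multiplicities are encoded and checked for associativity.

```python
# Fusion multiplicities N[a, b, c] = multiplicity of c in a x b,
# for the Fibonacci anyon model (illustrative stand-in only).
import numpy as np

labels = ["1", "tau"]
N = np.zeros((2, 2, 2), dtype=int)
N[0, 0, 0] = 1                       # 1 x 1 = 1
N[0, 1, 1] = N[1, 0, 1] = 1          # 1 x tau = tau x 1 = tau
N[1, 1, 0] = N[1, 1, 1] = 1          # tau x tau = 1 + tau  (non-trivial!)

# Associativity: sum_e N[a,b,e] N[e,c,d] == sum_f N[b,c,f] N[a,f,d]
lhs = np.einsum("abe,ecd->abcd", N, N)
rhs = np.einsum("bcf,afd->abcd", N, N)
print("fusion algebra associative:", np.array_equal(lhs, rhs))

for a in range(2):
    for b in range(2):
        out = " + ".join(labels[c] for c in range(2) if N[a, b, c])
        print(f"{labels[a]} x {labels[b]} = {out}")
```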

Relevance:

20.00%

Publisher:

Abstract:

This master's thesis explores some of the most recent developments in noncommutative quantum field theory. This old theme, first suggested by Heisenberg in the late 1940s, has had a renaissance during the last decade due to the firmly held belief that space-time becomes noncommutative at small distances, and also due to the discovery that string theory in a background field gives rise to noncommutative field theory as an effective low-energy limit. This has led to interesting attempts to create a noncommutative standard model, a noncommutative minimal supersymmetric standard model, noncommutative gravity theories, etc. This thesis reviews themes and problems such as UV/IR mixing, charge quantization, how to deal with noncommutative symmetries, how to solve the Seiberg-Witten map, its connection to fluid mechanics, and the problem of constructing general coordinate transformations to obtain a theory of noncommutative gravity. An emphasis has been put on presenting both the group-theoretical results and the string-theoretical ones, so that a comparison of the two can be made.
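
As a compact reminder of the standard definitions behind "noncommutative field theory" (textbook material, not specific to this thesis):

```latex
% Coordinates obeying a constant commutator
\[
  [\hat{x}^\mu, \hat{x}^\nu] = i\,\theta^{\mu\nu}
\]
% are modelled on ordinary functions by trading pointwise multiplication
% for the Moyal star product,
\[
  (f \star g)(x) = f(x)\,
  \exp\!\Big(\tfrac{i}{2}\,\overleftarrow{\partial}_{\mu}\,
  \theta^{\mu\nu}\,\overrightarrow{\partial}_{\nu}\Big)\, g(x),
\]
% whose lowest-order correction reproduces the commutator:
% x^mu * x^nu - x^nu * x^mu = i theta^{mu nu}.
```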

Relevance:

20.00%

Publisher:

Abstract:

Our present-day understanding of the fundamental constituents of matter and their interactions is based on the Standard Model of particle physics, which relies on quantum gauge field theories. On the other hand, the large-scale dynamical behaviour of spacetime is understood via Einstein's general theory of relativity. The merging of these two complementary aspects of nature, quantum and gravity, is one of the greatest goals of modern fundamental physics. Achieving it would help us understand the short-distance structure of spacetime, thus shedding light on the events in the singular states of general relativity, such as black holes and the Big Bang, where our current models of nature break down. The formulation of quantum field theories in noncommutative spacetime is an attempt to realize the idea of nonlocality at short distances, which our present understanding of these different aspects of nature suggests, and consequently to find testable hints of the underlying quantum behaviour of spacetime. The formulation of noncommutative theories encounters various unprecedented problems, which derive from their peculiar inherent nonlocality. Arguably the most serious of these is the so-called UV/IR mixing, which makes the derivation of observable predictions especially hard by causing new, tedious divergences to which our previously well-developed renormalization methods for quantum field theories do not apply. In the thesis I review the basic mathematical concepts of noncommutative spacetime, different formulations of quantum field theories in this context, and the theoretical understanding of UV/IR mixing. In particular, I put forward new results to be published, which show that the theory of quantum electrodynamics in noncommutative spacetime defined via the Seiberg-Witten map also suffers from UV/IR mixing. Finally, I review some of the most promising ways to overcome the problem. The final solution remains a challenge for the future.
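
The shape of the UV/IR mixing problem can be sketched with the standard one-loop result for noncommutative phi^4 theory (due to Minwalla, Van Raamsdonk and Seiberg; schematic here, and sign and normalisation conventions vary between references):

```latex
% Nonplanar one-loop two-point function with cutoff Lambda (schematic).
\[
  \Gamma^{(2)}_{\mathrm{nonplanar}} \sim \Lambda_{\mathrm{eff}}^{2},
  \qquad
  \Lambda_{\mathrm{eff}}^{2} = \frac{1}{1/\Lambda^{2} + p \circ p},
  \qquad
  p \circ p \equiv -\,p_{\mu}\,\theta^{\mu\nu}\theta_{\nu\rho}\,p^{\rho},
\]
% so removing the cutoff (Lambda -> infinity) leaves
% Lambda_eff^2 = 1/(p o p), which blows up as p -> 0: an ultraviolet
% divergence reborn as an infrared singularity.
```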