996 results for CONVENTIONAL THEORY


Relevance:

20.00%

Publisher:

Abstract:

Fujikawa's method of evaluating the supercurrent and the superconformal current anomalies, using the heat-kernel regularization scheme, is extended to theories with gauge invariance, in particular, to the off-shell N=1 supersymmetric Yang-Mills (SSYM) theory. The Jacobians of supersymmetry and superconformal transformations are finite. Although the gauge-fixing term is not supersymmetric and the regularization scheme is not manifestly supersymmetric, we find that the regularized Jacobians are gauge invariant and finite and they can be expressed in such a way that there is no one-loop supercurrent anomaly for the N=1 SSYM theory. The superconformal anomaly is nonzero and the anomaly agrees with a similar result obtained using other methods.

Relevance:

20.00%

Publisher:

Abstract:

We have derived explicitly the large-scale distribution of the quantum Ohmic resistance of a disordered one-dimensional conductor. We show that in the thermodynamic limit this distribution is characterized by two independent parameters for strong disorder, leading to a two-parameter scaling theory of localization. Only in the limit of weak disorder do we recover single-parameter scaling, consistent with existing theoretical treatments.
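The breadth of the resistance distribution can be illustrated with a toy calculation (a sketch, not this paper's derivation): compose identical weak scatterers with independent random phases using the standard one-dimensional composition law for the dimensionless resistance ρ, and track the statistics of ln(1 + ρ). The scatterer resistance `rho1` and the chain length below are arbitrary illustrative choices.

```python
import math
import random

def sample_resistance(n_scatterers, rho1, rng):
    """Dimensionless resistance of a chain of identical weak scatterers.

    Uses the 1D composition law with a random relative phase theta:
        rho' = rho + rho1 + 2*rho*rho1
               + 2*sqrt(rho*rho1*(1+rho)*(1+rho1))*cos(theta)
    """
    rho = 0.0
    for _ in range(n_scatterers):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        cross = 2.0 * math.sqrt(rho * rho1 * (1.0 + rho) * (1.0 + rho1))
        rho = rho + rho1 + 2.0 * rho * rho1 + cross * math.cos(theta)
    return rho

rng = random.Random(0)
samples = [sample_resistance(200, 0.01, rng) for _ in range(2000)]

# ln(1 + rho) is roughly Gaussian; its mean and variance both grow with
# chain length, so rho itself is very broadly (log-normally) distributed
# and is not self-averaging.
logs = [math.log(1.0 + r) for r in samples]
mean_log = sum(logs) / len(logs)
var_log = sum((x - mean_log) ** 2 for x in logs) / len(logs)
```

For strong disorder the mean and the variance of ln(1 + ρ) need not stay locked to each other, which is the sense in which a second scaling parameter becomes necessary.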

Relevance:

20.00%

Publisher:

Abstract:

The Integrated Force Method (IFM) is a novel matrix formulation developed for analyzing civil, mechanical and aerospace engineering structures. In this method all independent/internal forces are treated as unknown variables, which are calculated by simultaneously imposing the equations of equilibrium and the compatibility conditions. This paper presents a new 12-node serendipity quadrilateral plate bending element, MQP12, for the analysis of thin and thick plate problems using IFM. The Mindlin-Reissner plate theory, which accounts for the effect of shear deformation, has been employed in the formulation. The performance of this new element with respect to accuracy and convergence is studied by analyzing many standard benchmark plate bending problems. The results of the new element MQP12 are compared with those of displacement-based 12-node plate bending elements available in the literature, and with exact solutions. The new element MQP12 is free from shear locking and performs excellently in both thin and moderately thick plate bending situations.
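The core IFM idea, member forces as primary unknowns with equilibrium and compatibility imposed together, can be sketched on a far smaller problem than a plate element. The toy example below (three parallel springs carrying one load; stiffnesses and load are made-up numbers, and this is not the MQP12 formulation itself) stacks one equilibrium equation and two compatibility conditions into a single system S F = P and solves for the forces directly.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Three parallel springs between a rigid support and a loaded rigid bar.
# Unknowns: the member forces F1, F2, F3 (elongation of spring i is Fi/ki).
k = [100.0, 200.0, 300.0]   # hypothetical spring stiffnesses
P = 60.0                    # hypothetical applied load

S = [
    [1.0, 1.0, 1.0],                  # equilibrium:   F1 + F2 + F3 = P
    [1.0 / k[0], -1.0 / k[1], 0.0],   # compatibility: e1 - e2 = 0
    [0.0, 1.0 / k[1], -1.0 / k[2]],   # compatibility: e2 - e3 = 0
]
F = gauss_solve(S, [P, 0.0, 0.0])     # member forces, about [10.0, 20.0, 30.0]
```

A displacement method would instead solve for the single bar displacement and post-compute the forces; IFM delivers the forces, usually the quantities of engineering interest, in one step.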

Relevance:

20.00%

Publisher:

Abstract:

The modern subject is what we can call a self-subjecting individual: someone in whose inner reality a more permanent governability has been implanted, a governability that works inside the agent. Michel Foucault's genealogy of the modern subject is the history of its constitution by power practices. By a flight of imagination, suppose that this history is not an evolving social structure or cultural phenomenon, but one of those insects (a moth, say) whose life cycle consists of three stages or moments: crawling larva, encapsulated pupa, and flying adult. Foucault's history of power practices presents the same kind of miracle of total metamorphosis. The main forces in the general field of power can be apprehended through a generalisation of three rationalities functioning side by side in the plurality of different practices of power: domination, normalisation and the law. Domination is a force functioning by the rationality of reason of state: the state's essence is power, power is firm domination over people, and people are the state's resource, by which the state's strength is measured. Normalisation is a force that takes hold of people from the inside of society: it imposes society's own reality, its empirical verity, as a norm on people through silently working jurisdictional operations that exclude pathological individuals who fall too far from the average of the population as a whole. The law is a counterforce to both domination and normalisation. Accounting for elements of legal practice as omnihistorical is not possible without a view of the general field of power. Without this view, and only in terms of the operations and tactical manoeuvres of the practice of law, nothing of the kind can be seen: the only thing that practice manifests is constant change itself. However, the backdrop of law's tacit dimension, that is, the power relations between law, domination and normalisation, allows one to see more.
In the general field of power, the function of law is exactly to maintain the constant possibility of change. Whereas domination and normalisation would stabilise society, the law makes it move. The European individual has a reality as a problem. What is a problem? A problem is something that allows entry into the field of thought, said Foucault. To be a problem, "it is necessary for a certain number of factors to have made it uncertain, to have made it lose familiarity, or to have provoked a certain number of difficulties around it". Entering the field of thought through problematisations of the European individual (human forms, power and knowledge), one is able to glimpse the historical backgrounds of our present being. These were produced, and then again buried, in intersections between practices of power and games of truth. In the problem of the European individual one has suitable circumstances that bring to light forces that have passed through the individual through the centuries.

Relevance:

20.00%

Publisher:

Abstract:

Hypertexts are digital texts characterized by interactive hyperlinking and a fragmented textual organization. Increasingly prominent since the early 1990s, hypertexts have become a common text type both on the Internet and in a variety of other digital contexts. Although studied widely in disciplines like hypertext theory and media studies, formal linguistic approaches to hypertext continue to be relatively rare. This study examines coherence negotiation in hypertext with particular reference to hypertext fiction. Coherence, or the quality of making sense, is a fundamental property of textness. Proceeding from the premise that coherence is a subjectively evaluated property rather than an objective quality arising directly from textual cues, the study focuses on the processes through which readers interact with hyperlinks and negotiate continuity between hypertextual fragments. The study begins with a typological discussion of textuality and an overview of the historical and technological precedents of modern hypertexts. Then, making use of text linguistic, discourse analytical, pragmatic, and narratological approaches to textual coherence, the study takes established models developed for analyzing and describing conventional texts, and examines their applicability to hypertext. Primary data derived from a collection of hyperfictions is used throughout to illustrate the mechanisms in practice. Hypertextual coherence negotiation is shown to require the ability to cognitively operate between local and global coherence by means of processing lexical cohesion, discourse topical continuities, inferences and implications, and shifting cognitive frames. The main conclusion of the study is that the style of reading required by hypertextuality fosters a new paradigm of coherence.
Defined as fuzzy coherence, this new approach to textual sensemaking is predicated on an acceptance of the coherence challenges readers experience when the act of reading comes to involve repeated encounters with referentially imprecise hyperlinks and discourse topical shifts. A practical application of fuzzy coherence is shown to be in effect in the way coherence is actively manipulated in hypertext narratives.

Relevance:

20.00%

Publisher:

Abstract:

Clustering is a process of partitioning a given set of patterns into meaningful groups. The clustering process can be viewed as consisting of the following three phases: (i) feature selection phase, (ii) classification phase, and (iii) description generation phase. Conventional clustering algorithms implicitly use knowledge about the clustering environment to a large extent in the feature selection phase. This reduces the need for environmental knowledge in the remaining two phases, permitting the use of a simple numerical measure of similarity in the classification phase. Conceptual clustering algorithms proposed by Michalski and Stepp [IEEE Trans. PAMI, PAMI-5, 396–410 (1983)] and Stepp and Michalski [Artif. Intell., pp. 43–69 (1986)] make use of the knowledge about the clustering environment in the form of a set of predefined concepts to compute the conceptual cohesiveness during the classification phase. Michalski and Stepp [IEEE Trans. PAMI, PAMI-5, 396–410 (1983)] have argued that the results obtained with the conceptual clustering algorithms are superior to conventional methods of numerical classification. However, this claim was not supported by the experimental results obtained by Dale [IEEE Trans. PAMI, PAMI-7, 241–244 (1985)]. In this paper a theoretical framework, based on an intuitively appealing set of axioms, is developed to characterize the equivalence between conceptual clustering and conventional clustering. In other words, it is shown that any classification obtained using conceptual clustering can also be obtained using conventional clustering and vice versa.
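The "conceptual → conventional" direction of the equivalence can be made concrete with a toy example (the concepts below are hypothetical, and this is not the paper's axiomatic construction): let a set of predefined concepts induce a dissimilarity that is 0 within a concept and 1 across concepts; conventional single-link clustering under that purely numerical measure then recovers exactly the concept-based partition.

```python
# Hypothetical predefined concepts: three intervals on the real line.
concepts = [
    lambda x: x < 0,            # "negative"
    lambda x: 0 <= x < 10,      # "small"
    lambda x: x >= 10,          # "large"
]

def concept_id(x):
    """Index of the first concept covering x."""
    for i, covers in enumerate(concepts):
        if covers(x):
            return i

def dissimilarity(x, y):
    """Numerical measure induced by the concepts: 0 within, 1 across."""
    return 0.0 if concept_id(x) == concept_id(y) else 1.0

data = [-3, -1, 2, 5, 12, 40]

# Conventional single-link clustering at threshold 0.5:
# join a point to any cluster containing a near-enough member.
clusters = []
for x in data:
    for cluster in clusters:
        if any(dissimilarity(x, y) < 0.5 for y in cluster):
            cluster.append(x)
            break
    else:
        clusters.append([x])

# Direct grouping by concept, for comparison with the numerical result.
conceptual = {}
for x in data:
    conceptual.setdefault(concept_id(x), []).append(x)
```

Both procedures yield the partition {−3, −1}, {2, 5}, {12, 40}: once the environmental knowledge is folded into the measure, the classification phase itself is purely numerical.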

Relevance:

20.00%

Publisher:

Abstract:

Measurements of the electrical resistivity of thin potassium wires at temperatures near 1 K have revealed a minimum in the resistivity as a function of temperature. By proposing that the electrons in these wires have undergone localization, albeit with large localization length, and that inelastic-scattering events destroy the coherence of that state, we can explain both the magnitude and shape of the temperature-dependent resistivity data. Localization of electrons in these wires is to be expected because, due to the high purity of the potassium, the elastic mean free path is comparable to the diameters of the thinnest samples, making the Thouless length lT (or inelastic diffusion length) much larger than the diameter, so that the wire is effectively one dimensional. The inelastic events effectively break the wire into a series of localized segments, whose resistances can be added to obtain the total resistance of the wire. The ensemble-averaged resistance for all possible segmented wires, weighted with a Poisson distribution of inelastic-scattering lengths along the wire, yields a length dependence for the resistance that is proportional to L³/lin(T), provided that lin(T) ≪ L, where L is the sample length and lin(T) is some effective temperature-dependent one-dimensional inelastic-scattering length. A more sophisticated approach using a Poisson distribution in inelastic-scattering times, which takes into account the diffusive motion of the electrons along the wire through the Thouless length, yields a length- and temperature-dependent resistivity proportional to (L/lT)⁴ under appropriate conditions. Inelastic-scattering lifetimes are inferred from the temperature-dependent bulk resistivities (i.e., those of thicker, effectively three-dimensional samples), assuming that a minimum amount of energy must be exchanged for a collision to be effective in destroying the phase coherence of the localized state.
If the dominant inelastic mechanism is electron-electron scattering, then our result, given the appropriate choice of the channel number parameter, is consistent with the data. If electron-phason scattering were of comparable importance, then our results would remain consistent; however, the inelastic-scattering lifetime inferred from bulk resistivity data is then too short, because the electron-phason mechanism dominates the inelastic-scattering rate even though the two mechanisms may be of comparable importance for the bulk resistivity. Possible reasons why the electron-phason mechanism might be less effective in thin wires than in bulk are discussed.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to construct a nonequilibrium statistical-mechanics theory to study hysteresis in ferromagnetic systems. We study the hysteretic response of model spin systems to periodic magnetic fields H(t) as a function of the amplitude H0 and frequency Ω. At fixed H0, we find conventional, squarelike hysteresis loops at low Ω, and rounded, roughly elliptical loops at high Ω, in agreement with experiments. For the O(N→∞), d=3, (Φ²)² model with Langevin dynamics, we find a novel scaling behavior for the area A of the hysteresis loop, of the form A ∝ H0^0.66 Ω^0.33. We carry out a Monte Carlo simulation of the hysteretic response of the two-dimensional, nearest-neighbor, ferromagnetic Ising model. These results agree qualitatively with the results obtained for the O(N) model.
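The Ising part of such a study can be sketched with single-spin-flip Metropolis dynamics in a sinusoidally varying field (lattice size, temperature, amplitude and cycle discretization below are illustrative choices, not the paper's parameters); the loop area is accumulated as A = ∮ H dm, computed here as −∮ m dH over one closed field cycle.

```python
import math
import random

def hysteresis_loop_area(L=12, T=2.0, H0=1.0, steps=64, seed=1):
    """Loop area A = ∮ H dm for a 2D Ising ferromagnet in a sinusoidal field."""
    rng = random.Random(seed)
    s = [[1] * L for _ in range(L)]          # all spins up initially

    def metropolis_sweep(H):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
                  + s[i][(j + 1) % L] + s[i][(j - 1) % L])
            dE = 2.0 * s[i][j] * (nb + H)    # cost of flipping spin (i, j), J = 1
            if dE <= 0.0 or rng.random() < math.exp(-dE / T):
                s[i][j] = -s[i][j]

    area, prev_H, prev_m = 0.0, None, None
    for t in range(steps + 1):
        H = H0 * math.cos(2.0 * math.pi * t / steps)
        metropolis_sweep(H)
        m = sum(map(sum, s)) / (L * L)       # magnetization per spin
        if prev_m is not None:
            # trapezoid rule for A = -∮ m dH (= ∮ H dm on a closed loop)
            area -= 0.5 * (m + prev_m) * (H - prev_H)
        prev_H, prev_m = H, m
    return area

area = hysteresis_loop_area()                # positive: m lags H, so work is dissipated
```

Slowing the drive (more field steps, or more sweeps per field value) shrinks the lag between m and H and hence the loop, which is how the frequency dependence of A enters in a simulation of this kind.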

Relevance:

20.00%

Publisher:

Abstract:

The general structure of a metric-torsion theory of gravitation allows a parity-violating contribution to the complete action which is linear in the curvature tensor and vanishes identically in the absence of torsion. The resulting action involves, apart from the constant κ = 8πG/c⁴, a coupling (B) which governs the strength of the parity interaction mediated by torsion. In this model the Brans-Dicke scalar field generates the torsion field, even though it has zero spin. An interesting consequence of the theory is that its results for the solar system differ very little from those obtained from Brans-Dicke (BD) theory; the theory is therefore indistinguishable from BD theory in solar-system experiments.

Relevance:

20.00%

Publisher:

Abstract:

Numerous reports from several parts of the world have confirmed that on calm clear nights a minimum in air temperature can occur just above ground, at heights of the order of 1/2 m or less. This phenomenon, first observed by Ramdas & Atmanathan (1932), carries the associated paradox of an apparently unstable layer that sustains itself for several hours, and has not so far been satisfactorily explained. We formulate here a theory that considers energy balance between radiation, conduction and free or forced convection in humid air, with surface temperature, humidity and wind incorporated into an appropriate mathematical model as parameters. A complete numerical solution of the coupled air-soil problem is used to validate an approach that specifies the surface temperature boundary condition through a cooling rate parameter. Utilizing a flux-emissivity scheme for computing radiative transfer, the model is numerically solved for various values of turbulent friction velocity. It is shown that a lifted minimum is predicted by the model for values of ground emissivity not too close to unity, and for sufficiently low surface cooling rates and eddy transport. Agreement with observation for reasonable values of the parameters is demonstrated. A heuristic argument is offered to show that radiation substantially increases the critical Rayleigh number for convection, thus circumventing or weakening Rayleigh-Bénard instability. The model highlights the key role played by two parameters generally ignored in explanations of the phenomenon, namely surface emissivity and soil thermal conductivity, and shows that it is unnecessary to invoke the presence of such particulate constituents as haze to produce a lifted minimum.

Relevance:

20.00%

Publisher:

Abstract:

A microscopic theory of the statics and the dynamics of solvation of an ion in a binary dipolar liquid is presented. The theory properly includes the different intermolecular correlations that are present in a binary mixture. As a result, the theory can explain several important aspects of both the statics and the dynamics of solvation that are observed in experiments. It provides a microscopic explanation of the preferential solvation of the more polar species by the solute ion. The dynamics of solvation is predicted to be highly non-exponential, in general. The average relaxation time is found to change nonlinearly with the composition of the mixture. These predictions are in qualitative agreement with the experimental results.

Relevance:

20.00%

Publisher:

Abstract:

Recently we presented a microscopic expression for dielectric friction on a rotating dipole. This expression has a rather curious structure, involving the contributions of the transverse polarization modes of the solvent and also of the molecular length scale processes. It is shown here that under proper limiting conditions, this expression reduces exactly to the classical continuum model expression of Nee and Zwanzig [J. Chem. Phys. 52, 6353 (1970)]. The derivation requires the use of the asymptotic form of the orientation‐dependent total pair correlation function, the neglect of the contributions of translational modes of the solvent, and also the use of the limit that the size of the solvent molecules goes to zero. Thus, the derivation can be important in understanding the validity of the continuum model and can also help in explaining the results of a recent computer simulation study of dielectric relaxation in a Brownian dipolar lattice.

Relevance:

20.00%

Publisher:

Abstract:

A recently developed microscopic theory of solvation dynamics in real dipolar liquids is used to calculate, for the first time, the solvation time correlation function in liquid acetonitrile, water and methanol. The calculated results are in excellent agreement with known experimental and computer simulation studies.