Abstract:
Migrating electrons in biological systems are normally extraneous, and with this in mind the electron delocalisation across the hydrogen bonds in proteins is re-examined. It is seen that an extraneous electron can travel rapidly via the low-lying virtual orbitals of the hydrogen-bonded π-electronic structure of peptide units in proteins. The frequency of electron transfer decreases slowly with increasing path length; however, the coupling of electronic and protonic motions enhances this frequency. Transfer of electrons across the hydrogen bonds in accordance with the double-exchange mechanism does not appear to be possible. The theory thus offers a route by which an extraneous electron can transfer within protein structures.
Abstract:
In this thesis I study various quantum coherence phenomena and lay some of the foundations for a systematic coherence theory. So far, the approach to quantum coherence in science has been purely phenomenological. In the thesis I try to answer the question of what quantum coherence is and how it should be approached within the framework of physics, the metatheory of physics, and the related terminology. Notably, quantum coherence is a conserved quantity that can be exactly defined, and I propose a way to define it mathematically from the density matrix of the system. Degenerate quantum gases, i.e., Bose condensates and ultracold Fermi systems, form a good laboratory for studying coherence, since their entropy is small and their coherence is large, so they exhibit strong coherence phenomena. Concerning coherence phenomena in degenerate quantum gases, I concentrate mainly on the collective association of atoms into molecules, Rabi oscillations, and decoherence. It appears that collective association and the oscillations do not depend on the spin-statistics of the particles. Moreover, I study the logical features of decoherence in closed systems via a simple spin model. I argue that decoherence is a valid concept also in systems that may experience recoherence, i.e., Poincaré recurrences. Metatheoretically this is a remarkable result, since it justifies quantum cosmology: studying the whole universe (i.e., physical reality) purely quantum physically is meaningful and valid science, in which decoherence explains why the quantum physical universe appears very classical to cosmologists and other scientists. The study of the logical structure of closed systems also reveals that sufficiently complex closed (physical) systems obey a principle similar to Gödel's incompleteness theorem in logic: it is impossible to describe a closed system completely from within the system, and the inside and outside descriptions of the system can differ remarkably. Understanding this feature may make it possible to comprehend coarse-graining better and to define the mutual entanglement of quantum systems uniquely.
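The thesis's exact density-matrix definition of coherence is not reproduced in the abstract; as a minimal sketch of the general idea, the following assumes the common l1-norm measure, the sum of the moduli of the off-diagonal elements of the density matrix in a fixed basis (an assumption for illustration, not necessarily the thesis's definition):

```python
import numpy as np

def l1_coherence(rho: np.ndarray) -> float:
    """l1-norm of coherence: sum of |off-diagonal elements| of a density matrix."""
    return float(np.abs(rho).sum() - np.trace(np.abs(rho)))

# Illustrative qubit states: maximally coherent |+><+| vs. maximally mixed I/2.
plus  = np.array([[0.5, 0.5], [0.5, 0.5]])
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])
print(l1_coherence(plus))   # 1.0 -> fully coherent in this basis
print(l1_coherence(mixed))  # 0.0 -> no coherence
```

Note that any such measure is basis-dependent, so "coherence" must always be read relative to a chosen basis.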
Abstract:
In this thesis the current status and some open problems of noncommutative quantum field theory are reviewed. The introduction aims to put these theories in their proper context as part of the larger program to model the properties of quantized space-time. Throughout the thesis, special focus is placed on the role of noncommutative time and on the problems posed by its nonlocal nature. Applications in scalar field theories as well as in gauge field theories are presented. The infinite nonlocality of space-time introduced by the noncommutative coordinate operators leads to interesting structure and new physics: high- and low-energy scales are mixed, causality and unitarity are threatened, and in gauge theory the tools for model building are drastically reduced. As a case study in noncommutative gauge theory, the Dirac quantization condition for magnetic monopoles is examined, with the conclusion that, at least in perturbation theory, it cannot be fulfilled in noncommutative space.
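For orientation, the noncommutative coordinate operators referred to above are conventionally characterized as follows (standard definitions from the noncommutative field theory literature, not quoted from the thesis):

```latex
% Constant-commutator noncommutativity and the Moyal star product.
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu},
\qquad
(f \star g)(x) = f(x)\,
\exp\!\Big(\tfrac{i}{2}\,\overleftarrow{\partial}_{\mu}\,\theta^{\mu\nu}\,
\overrightarrow{\partial}_{\nu}\Big)\, g(x).
% Noncommutative time corresponds to \theta^{0i} \neq 0; the infinite series
% of derivatives in the star product is the source of the nonlocality.
```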
Abstract:
The stochastic behavior of an aero-engine failure/repair process is analyzed from a Bayesian perspective. The numbers of failures/repairs in the component sockets of this multi-component system are assumed to follow independent renewal processes with Weibull inter-arrival times. Based on field failure/repair data from a large number of such engines, with independent Gamma priors on the scale parameters and log-concave priors on the shape parameters, an exact method of sampling from the resulting posterior distributions of the parameters is proposed. The generated parameter values are then used to obtain the posteriors of the expected number of system repairs, the system failure rate, and the conditional intensity function, which are computed using a recursive formula.
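The paper's exact recursive formula is not given in the abstract; a minimal sketch of the classical ingredient it builds on, the discretized renewal equation M(t) = F(t) + ∫₀ᵗ M(t−u) dF(u) for the expected number of repairs under Weibull inter-arrival times, might look like this (parameter values illustrative; in the Bayesian setting the recursion would be averaged over posterior draws of the shape and scale):

```python
import numpy as np

def weibull_cdf(t, shape, scale):
    return 1.0 - np.exp(-(t / scale) ** shape)

def renewal_function(t_max, shape, scale, n_steps=2000):
    """Expected number of renewals M(t) on a grid, via the discretized
    renewal equation M(t) = F(t) + int_0^t M(t-u) dF(u)."""
    dt = t_max / n_steps
    t = np.arange(n_steps + 1) * dt
    F = weibull_cdf(t, shape, scale)
    dF = np.diff(F)                         # F(t_j) - F(t_{j-1}), j = 1..n_steps
    M = np.zeros(n_steps + 1)
    for n in range(1, n_steps + 1):
        # M(0) = 0, so the j = n convolution term vanishes.
        M[n] = F[n] + np.dot(M[:n][::-1], dF[:n])
    return t, M

t, M = renewal_function(t_max=10.0, shape=1.5, scale=2.0)
print(M[-1])   # expected number of repairs by t = 10
```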
Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. Another significant factor was that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's principle of inverse probability and on his derivation of the central limit theorem, published in a memoir of 1774 that is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples, the idea being that the sample should be a miniature of the population; this idea still prevails. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics and introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw samples repeatedly from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design of the CPS. An important criterion was to have a method whose data-collection costs were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
Abstract:
Based on a method proposed by Reddy and Daum, the equations governing steady inviscid nonreacting gasdynamic laser (GDL) flow in a supersonic nozzle are reduced to a universal form, so that the solutions depend on a single parameter combining all the other parameters of the problem. Solutions are obtained for a sample case with available data and compared with existing results to validate the present approach; similar solutions for a sample case are also presented.
Abstract:
A microscopic study of the non-Markovian (or memory) effects on collective orientational relaxation in a dense dipolar liquid is carried out using an extended hydrodynamic approach, which provides a reliable description of the dynamical processes occurring at molecular length scales. Detailed calculations of the wave-vector-dependent orientational correlation functions are presented. The memory effects are found to play an important role; the non-Markovian results differ considerably from those of the Markovian theory. In particular, a slow long-time decay of the longitudinal orientational correlation function is observed for dense liquids, which becomes weaker in the presence of a sizeable translational contribution to the collective orientational relaxation. This slow decay can be attributed to intermolecular correlations at molecular length scales. The longitudinal component of the orientational correlation function becomes oscillatory in the underdamped limit of momentum relaxation, and the frequency dependence of the friction reduces the frictional resistance on the collective excitations (commonly known as dipolarons), making them long lived. The theory predicts that these dipolarons can therefore be important in chemical relaxation processes, in contradiction to the claims of some earlier theoretical studies.
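As a toy illustration of the Markovian/non-Markovian distinction (a single-mode memory-function equation, not the paper's wave-vector-dependent extended hydrodynamic description), one can integrate dC/dt = −∫₀ᵗ K(t−s) C(s) ds and compare an exponential memory kernel with its delta-correlated (Markovian) limit:

```python
import numpy as np

def evolve_gle(kernel, t_max=20.0, dt=0.01):
    """Integrate the memory equation dC/dt = -int_0^t K(t-s) C(s) ds, C(0)=1,
    by explicit Euler with a rectangle-rule convolution."""
    n = int(round(t_max / dt))
    t = np.arange(n + 1) * dt
    K = kernel(t)
    C = np.zeros(n + 1)
    C[0] = 1.0
    for i in range(n):
        conv = np.dot(K[i::-1], C[:i + 1]) * dt   # int_0^{t_i} K(t_i - s) C(s) ds
        C[i + 1] = C[i] - dt * conv
    return t, C

gamma, tau = 1.0, 2.0                             # illustrative friction and memory time
t, C_mem = evolve_gle(lambda s: (gamma / tau) * np.exp(-s / tau))
C_markov = np.exp(-gamma * t)                     # Markovian limit of the same kernel
for i in (200, 600, 1000):                        # t = 2, 6, 10
    print(t[i], C_mem[i], C_markov[i])
```

With these values the exponential kernel puts the mode in the underdamped regime (C(t) obeys C'' + C'/τ + (γ/τ)C = 0), so the non-Markovian correlation function oscillates rather than decaying monotonically, in the spirit of the underdamped behavior described above.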
Abstract:
A molecular theory of dielectric relaxation in a dense binary dipolar liquid is presented. The theory takes into account the effects of intra- and interspecies intermolecular interactions. It is shown that the relaxation is, in general, nonexponential. In certain limits, we recover the biexponential form traditionally used to analyze experimental data on dielectric relaxation in a binary mixture. However, the relaxation times differ widely from the predictions of the noninteracting rotational diffusion model of Debye for a binary system. A detailed numerical evaluation of the frequency-dependent dielectric function ε(ω) is carried out using the known analytic solution of the mean spherical approximation (MSA) model for the two-particle direct correlation function of a polar mixture. A microscopic expression for the wave-vector (k) and frequency (ω) dependent dielectric function ε(k, ω) of a binary mixture is also presented. The theoretical predictions for ε(ω) (= ε(k = 0, ω)) are compared with the available experimental results. In particular, the present theory offers a molecular explanation of the fusing of the two relaxation channels of the neat liquids, observed by Schallamach many years ago.
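The biexponential limit mentioned above corresponds to a two-channel Debye form of the dielectric function; a minimal numerical sketch (illustrative parameters, not the paper's MSA-based microscopic result) is:

```python
import numpy as np

def eps_two_debye(omega, eps_inf, strengths, taus):
    """Dielectric function for biexponential (two-Debye-channel) relaxation:
    eps(omega) = eps_inf + sum_i d_eps_i / (1 + i*omega*tau_i)."""
    eps = np.full_like(omega, eps_inf, dtype=complex)
    for d_eps, tau in zip(strengths, taus):
        eps += d_eps / (1.0 + 1j * omega * tau)
    return eps

omega = np.logspace(-2, 2, 400)                    # angular frequency (arbitrary units)
eps = eps_two_debye(omega, eps_inf=2.0,
                    strengths=(30.0, 10.0),        # dielectric strengths of the channels
                    taus=(8.0, 0.5))               # relaxation times of the channels
print(omega[np.argmax(-eps.imag)])                 # position of the dielectric loss peak
```

When the two relaxation times approach each other, the two loss peaks merge into one, which is the kind of fusing of relaxation channels the abstract refers to.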
Abstract:
Fujikawa's method of evaluating the supercurrent and superconformal current anomalies, using the heat-kernel regularization scheme, is extended to theories with gauge invariance, in particular to the off-shell N=1 supersymmetric Yang-Mills (SSYM) theory. The Jacobians of the supersymmetry and superconformal transformations are finite. Although the gauge-fixing term is not supersymmetric and the regularization scheme is not manifestly supersymmetric, we find that the regularized Jacobians are gauge invariant and finite, and that they can be expressed in such a way that there is no one-loop supercurrent anomaly for the N=1 SSYM theory. The superconformal anomaly is nonzero and agrees with results obtained by other methods.
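For reference, the heat-kernel-regularized Jacobian at the core of Fujikawa's method has the schematic form below (standard textbook notation; the supersymmetric case discussed above requires the corresponding superspace operators):

```latex
% Under a local transformation with parameter \epsilon(x), the path-integral
% measure changes by
J = \exp\!\Big(-i \int \mathrm{d}^4x\; \epsilon(x)\, \mathcal{A}(x)\Big),
\qquad
\mathcal{A}(x) = \lim_{M \to \infty} \sum_n \varphi_n^{\dagger}(x)\,
\Gamma\, e^{-\lambda_n^{2}/M^{2}}\, \varphi_n(x),
% where \varphi_n, \lambda_n are eigenmodes and eigenvalues of the relevant
% gauge-covariant operator and \Gamma is the matrix generated by the
% transformation (\gamma_5 in the chiral case).
```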
Abstract:
We have derived explicitly the large-scale distribution of the quantum Ohmic resistance of a disordered one-dimensional conductor. We show that in the thermodynamic limit this distribution is characterized, for strong disorder, by two independent parameters, leading to a two-parameter scaling theory of localization. Only in the limit of weak disorder do we recover single-parameter scaling, consistent with existing theoretical treatments.
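A standard numerical companion to this picture (not the authors' analytic derivation) is to build the Landauer transmission of a disordered 1D tight-binding chain from 2x2 transfer matrices and test the single-parameter-scaling relation var(−ln T) ≈ 2⟨−ln T⟩, which holds at weak disorder and drifts when the disorder becomes strong. A sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def minus_log_T(N, W, E=0.0):
    """-ln(transmission) of an N-site Anderson chain (hopping 1, on-site
    energies uniform in [-W/2, W/2]) between perfect leads, E = 2 cos k."""
    k = np.arccos(E / 2.0)
    M = np.eye(2)
    for eps in rng.uniform(-W / 2, W / 2, N):
        M = np.array([[E - eps, -1.0], [1.0, 0.0]]) @ M   # transfer-matrix product
    a, b = M[0]
    c, d = M[1]
    eik = np.exp(1j * k)
    # Match plane waves in the leads to the total transfer matrix to extract
    # the reflection (r) and transmission (t) amplitudes.
    r = ((a * eik + b) - eik * (c * eik + d)) / (eik * (c / eik + d) - (a / eik + b))
    t = np.exp(-1j * k * N) * (c * (eik + r / eik) + d * (1 + r))
    return -np.log(abs(t) ** 2)

for W in (1.0, 3.0):                             # weak vs. strong disorder
    x = np.array([minus_log_T(400, W) for _ in range(500)])
    print(W, x.mean(), x.var() / x.mean())       # ratio near 2 signals one-parameter scaling
```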
Abstract:
In this dissertation I study language complexity from a typological perspective. Since the structuralist era, it has been assumed that local complexity differences in languages are balanced out in cross-linguistic comparisons and that complexity is not affected by the geopolitical or sociocultural aspects of the speech community. However, these assumptions have seldom been studied systematically from a typological point of view. My objective is to define complexity so that it is possible to compare it across languages and to approach its variation with the methods of quantitative typology. My main empirical research questions are: i) does language complexity vary in any systematic way in local domains, and ii) can language complexity be affected by the geographical or social environment? These questions are studied in three articles, whose findings are summarized in the introduction to the dissertation. In order to enable cross-language comparison, I measure complexity as the description length of the regularities in an entity; I separate it from difficulty, focus on local instead of global complexity, and break it up into different types. This approach helps avoid the problems that plagued earlier metrics of language complexity. My approach to grammar is functional-typological in nature, and the theoretical framework is basic linguistic theory. I delimit the empirical research functionally to the marking of core arguments (the basic participants in the sentence). I assess the distributions of complexity in this domain with multifactorial statistical methods and use different sampling strategies, implementing, for instance, the Greenbergian view of universals as diachronic laws of type preference. My data come from large and balanced samples (up to approximately 850 languages), drawn mainly from reference grammars. The results suggest that various significant trends occur in the marking of core arguments in regard to complexity and that complexity in this domain correlates with population size. These results provide evidence that linguistic patterns interact among themselves in terms of complexity, that language structure adapts to the social environment, and that there may be cognitive mechanisms that limit complexity locally. My approach to complexity and language universals can therefore be successfully applied to empirical data and may serve as a model for further research in these areas.
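Although the dissertation measures description length over grammatical descriptions rather than raw text, the underlying "description length of the regularities" idea can be sketched with a crude compression proxy (an illustration only, not the author's metric):

```python
import gzip
import random

def description_length(text: str) -> int:
    """Compressed size in bytes: a rough upper bound on the description
    length of the regularities in a string."""
    return len(gzip.compress(text.encode("utf-8")))

random.seed(0)
regular = "sg-NOM pl-NOM sg-ACC pl-ACC " * 40   # one small paradigm, repeated
noisy = "".join(random.choice("abcdefgh- ") for _ in range(len(regular)))
print(description_length(regular))   # small: highly regular marking pattern
print(description_length(noisy))     # larger: little exploitable regularity
```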
Abstract:
The Integrated Force Method (IFM) is a novel matrix formulation developed for analyzing civil, mechanical, and aerospace engineering structures. In this method all independent/internal forces are treated as the unknown variables, which are calculated by simultaneously imposing the equations of equilibrium and the compatibility conditions. This paper presents a new 12-node serendipity quadrilateral plate bending element, MQP12, for the analysis of thin and thick plate problems using the IFM. The Mindlin-Reissner plate theory, which accounts for the effect of shear deformation, is employed in the formulation. The performance of the new element with respect to accuracy and convergence is studied by analyzing many standard benchmark plate bending problems. The results of the new element MQP12 are compared with those of displacement-based 12-node plate bending elements available in the literature, and with exact solutions. The new element MQP12 is free from shear locking and performs excellently for both thin and moderately thick plate bending situations.
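As a minimal sketch of the IFM idea on the smallest statically indeterminate system (two collinear springs in series, not the paper's MQP12 plate element), the equilibrium equation and the compatibility condition are assembled into one square system and solved simultaneously for the force unknowns:

```python
import numpy as np

k1, k2, P = 100.0, 300.0, 10.0       # illustrative stiffnesses and nodal load

# Internal forces F1, F2 (tension positive) are the primary unknowns.
S = np.array([
    [1.0,      -1.0     ],           # equilibrium at the loaded node: F1 - F2 = P
    [1.0 / k1,  1.0 / k2],           # compatibility: elongations cancel, F1/k1 + F2/k2 = 0
])
F = np.linalg.solve(S, np.array([P, 0.0]))
print(F)                             # [ P*k1/(k1+k2), -P*k2/(k1+k2) ] = [2.5, -7.5]
```

The displacement method would instead solve for the nodal displacement first; the IFM's distinguishing step is exactly this simultaneous treatment of equilibrium and compatibility with forces as primaries.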
Abstract:
The modern subject is what we may call a self-subjecting individual: someone in whose inner reality a more permanent governability has been implanted, a governability that works inside the agent. Michel Foucault's genealogy of the modern subject is the history of its constitution by power practices. By a flight of imagination, suppose that this history is not an evolving social structure or cultural phenomenon, but one of those insects (a moth) whose life cycle consists of three stages or moments: crawling larva, encapsulated pupa, and flying adult. Foucault's history of power practices presents the same kind of miracle of total metamorphosis. The main forces in the general field of power can be apprehended through a generalisation of three rationalities functioning side by side in the plurality of different practices of power: domination, normalisation, and the law. Domination is a force functioning by the rationality of reason of state: the state's essence is power, power is firm domination over people, and people are the state's resource, by which the state's strength is measured. Normalisation is a force that takes hold of people from the inside of society: it imposes society's own reality, its empirical verity, as a norm on people through silently working jurisdictional operations that exclude pathological individuals too far from the average of the population as a whole. The law is a counterforce to both domination and normalisation. Accounting for elements of legal practice as omnihistorical is not possible without a view of the general field of power. Without this view, and only in terms of the operations and tactical manoeuvres of the practice of law, nothing of the kind can be seen: the only thing that practice manifests is constant change itself. However, the backdrop of law's tacit dimension, that is, the power relations between law, domination, and normalisation, allows one to see more. In the general field of power, the function of law is exactly to maintain the constant possibility of change: whereas domination and normalisation would stabilise society, the law makes it move. The European individual has a reality as a problem. What is a problem? A problem is something that allows entry into the field of thought, said Foucault: "To be a problem, it is necessary for a certain number of factors to have made it uncertain, to have made it lose familiarity, or to have provoked a certain number of difficulties around it." Entering the field of thought through problematisations of the European individual (human forms, power, and knowledge), one is able to glimpse the historical backgrounds of our present being. These were produced, and then again buried, in intersections between practices of power and games of truth. The problem of the European individual thus provides suitable circumstances for bringing to light the forces that have passed through the individual over the centuries.