Abstract:
There exist various proposals for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is one of the more exotic proposals, which makes use of the properties of quasiparticles that manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault tolerance. This feature is the main incentive for studying topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique; rather, the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered. Studying their applicability for quantum computation could be a topic of further research.
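To make "non-trivial fusion algebra" concrete, a standard illustration (stated here for orientation, not quoted from the thesis; the labels 1, \eta and \Lambda are shorthand for the trivial, sign and two-dimensional irreducible representations of S_3) is the general fusion rule

a \times b = \sum_c N_{ab}^{c}\, c,

where the non-negative integers N_{ab}^{c} count the fusion channels. For the pure-charge sectors of an S_3 model, labelled by the irreps of S_3, the two-dimensional charge obeys \Lambda \times \Lambda = 1 + \eta + \Lambda: two such quasiparticles can fuse into three distinct outcomes, which is exactly the non-uniqueness of quantum-number addition referred to above.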
Abstract:
In technicolor theories the scalar sector of the Standard Model is replaced by a strongly interacting sector. Although the Standard Model has been exceptionally successful, its scalar sector causes theoretical problems that make these theories an attractive alternative. I begin my thesis by considering QCD, the known example of strong interactions. The theory exhibits two phenomena: confinement and chiral symmetry breaking. I find the low-energy dynamics to be similar to that of the sigma models. I then analyze the problems of the Standard Model Higgs sector, mainly unnaturalness and triviality. Motivated by the example of QCD, I introduce the minimal technicolor model to resolve these problems. I demonstrate that the minimal model is free of anomalies and then deduce the main elements of its low-energy particle spectrum. I find that the particle spectrum contains massless or very light technipions, as well as technibaryons and techni-vector mesons with high masses of over 1 TeV. Standard Model fermions remain strictly massless at this stage. I therefore introduce the companion theory of flavor for technicolor, called extended technicolor. I show that the Standard Model fermions and technihadrons receive masses, but that they remain too light. I also discuss flavor-changing neutral currents and precision electroweak measurements. I then show that walking technicolor models partly solve these problems. In these models, contrary to QCD, the coupling evolves slowly over a large range of energy scales. This behavior enhances the masses, so that even the light technihadrons are too heavy to be detected at current particle accelerators. All the observed masses of the Standard Model particles can also be generated, except those of the bottom and top quarks. It is thus shown in this thesis that, excluding the masses of the third-generation quarks, theories based on walking technicolor can in principle reproduce the observed particle spectrum.
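For orientation, the standard way to express "walking" (a textbook relation, not a result specific to this thesis) is that the beta function is nearly zero, \beta(g) \approx 0, so the coupling stays almost constant between the technicolor scale \Lambda_{TC} and the extended-technicolor scale \Lambda_{ETC}. This enhances the technifermion condensate that feeds the fermion masses:

\langle \bar{T}T \rangle_{ETC} \simeq \left( \Lambda_{ETC} / \Lambda_{TC} \right)^{\gamma_m} \langle \bar{T}T \rangle_{TC},

where the anomalous dimension \gamma_m \approx 1 in walking theories, in contrast to the rapid running and small \gamma_m of a QCD-like coupling.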
Abstract:
The rail-sleeper system is idealized as an infinite, periodic beam-mass system. Use is made of the periodicity principle for the semi-infinite halves on either side of the forcing point for evaluation of the wave propagation constants and the corresponding modal vectors. It is shown that the spread of acceleration away from the forcing point depends primarily upon one of the wave propagation constants. However, all four modal vectors (two for the left-hand side and two for the right-hand side) determine the driving point impedance of the rail-sleeper system, which in combination with the driving point impedance of the wheel (adopted from the preceding companion paper) determines the forces generated by the combined surface roughness and the resultant accelerations. The compound one-third octave acceleration levels generated by typical roughness spectra are generally of the same order as the observed levels.
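The periodicity principle invoked here is Floquet's theorem for periodic structures (a standard relation, stated for orientation): free harmonic waves in adjacent bays of the periodic system are related by

q_{n+1} = e^{\mu}\, q_n, \qquad \mu = \delta + i\varepsilon,

where q_n is the state vector at the n-th sleeper bay and \mu is the wave propagation constant. The attenuation constant \delta governs how rapidly the response decays with distance from the forcing point, consistent with the finding that the spread of acceleration depends primarily on one of the propagation constants.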
Abstract:
The shelf life of mangoes is limited by two main postharvest diseases when these are not consistently managed: anthracnose (Colletotrichum gloeosporioides) and stem end rot (SER) (Fusicoccum parvum). The management of these diseases has often relied mainly on fungicides, applied either as field sprays or as postharvest dips. These have served the industry fairly well, allowing fruit to be transported, stored and sold at markets distant from the areas of production. There are, however, concerns about the continued use of these fungicides as the main or only tool for managing these diseases. This has necessitated a rethink of how these diseases could be sustainably managed into the future using a systems approach that focuses on integrated crop management. This is a holistic approach that considers all the crop protection management strategies, including the genetics of the plant and its ability to defend itself naturally from infection, supported by plant activators and growth regulators. It also considers other cultural or agronomic management tools, such as crop nutrition, timely application of irrigation water and regular pruning of trees as a means of reducing inoculum levels in the orchards. The ultimate aim of this approach is to increase yields and obtain long-term sustainable production. It is guided by the sustainable crop production principle, which states that producers should apply as little input as possible but as much as needed.
Abstract:
The Standard Model of particle physics consists of quantum electrodynamics (QED) and the weak and strong nuclear interactions. QED is the basis for molecular properties, and thus it defines much of the world we see. The weak nuclear interaction is responsible for decays of nuclei, among other things, and in principle it should also have effects at the molecular scale. The strong nuclear interaction is hidden in interactions inside nuclei. From high-energy and atomic experiments it is known that the weak interaction does not conserve parity. Consequently, the weak interaction, and specifically the exchange of the Z^0 boson between a nucleon and an electron, induces small energy shifts of different sign for mirror-image molecules. This in turn makes one enantiomer of a molecule energetically more favorable than the other, and also shifts the spectral lines of the mirror-image pair of molecules in opposite directions, creating a splitting. Parity violation (PV) in molecules, however, has not been observed. The topic of this thesis is how the weak interaction affects certain molecular magnetic properties, namely certain parameters of nuclear magnetic resonance (NMR) and electron spin resonance (ESR) spectroscopies. The thesis consists of numerical estimates of NMR and ESR spectral parameters and investigations of how different aspects of quantum chemical computation affect them. PV contributions to the NMR shielding and spin-spin coupling constants are investigated from the computational point of view. All aspects of the quantum chemical electronic structure computations are found to be very important, which makes accurate computations challenging. Effects of molecular geometry are also investigated using a model system of polysilyene chains. The PV contribution to the NMR shielding constant is found to saturate once the chain reaches a certain length, but the effects of local geometry can be large. Rigorous vibrational averaging is also performed for a relatively small and rigid molecule. Vibrational corrections to the PV contribution are found to be only a couple of per cent. PV contributions to the ESR g-tensor are also evaluated using a series of molecules. Unfortunately, all the estimates are below the experimental limits, but PV in some of the heavier molecules comes close to present-day experimental resolution.
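The sign structure described above can be summarized in one line (a standard relation in the molecular PV literature, not specific to this thesis): because the PV energy shift is odd under parity, a property such as the NMR shielding constant of the two enantiomers reads

\sigma_{L/R} = \sigma_{pc} \pm \sigma_{pv},

so the parity-conserving part \sigma_{pc} is common to both mirror images, while their resonance lines are split by 2\sigma_{pv}. Resolving such a splitting would constitute an observation of parity violation in molecules.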
Abstract:
Since 2007, 96 wild Queensland groupers, Epinephelus lanceolatus (Bloch), have been found dead in NE Australia. In some cases, Streptococcus agalactiae (Group B Streptococcus, GBS) was isolated. In the present study, a GBS isolate from a wild grouper case was employed in experimental challenge trials in hatchery-reared Queensland grouper using different routes of exposure. Injection resulted in rapid development of clinical signs including bilateral exophthalmia, hyperaemic skin or fins and abnormal swimming. Death occurred in, and GBS was re-isolated from, 98% of injected fish, and GBS was detected by PCR in the brain, head kidney and spleen of all fish, regardless of challenge dose. Challenge by immersion resulted in lower morbidity with a clear dose response. Whilst infection was established via oral challenge by admixture with feed, no mortality occurred. Histology showed pathology consistent with GBS infection in the organs examined from all injected fish, from fish challenged with medium and high doses by immersion, and from the high-dose oral challenge. These experimental challenges demonstrated that GBS isolated from wild Queensland grouper reproduced disease in experimentally challenged fish and resulted in pathology consistent with that seen in wild Queensland grouper infected with S. agalactiae.
Abstract:
This painting is dedicated "in honor of Meyer Ehrenberg by his students", as stated on a scroll held by the sitter. The scroll is dated October 7, 1820 and is followed by a list of names in two columns: M. Ehrenberg, P. Ehrenberg, B. Ehrenberg, M. Imanuel, M. Balke, B. Meier, L. Franck, M. Cohen, W. Fraenkel, M. Chohns, P. Goldschmidt, J. Lippoa, A. Nathan, M. Kramer, J. Fraenkel, M. Goldschmidt. Ehrenberg was the founder of the first modern Jewish Day School in Germany, at Wolfenbuettel. He was the great-grandfather of the Jewish theologian Franz Rosenzweig.
Abstract:
Atomic layer deposition (ALD) is a method for thin film deposition which has been extensively studied for binary oxide thin film growth. Studies on multicomponent oxide growth by ALD remain relatively few owing to the increased number of factors that come into play when more than one metal is employed: more metal precursors are required, and the surface may change significantly during successive stages of the growth. Multicomponent oxide thin films can nevertheless be prepared in a well-controlled way as long as the same principle that makes binary oxide ALD work so well is followed for each constituent element: in short, the film growth has to be self-limiting. ALD of various multicomponent oxides was studied. SrTiO3, BaTiO3, Ba(1-x)SrxTiO3 (BST), SrTa2O6, Bi4Ti3O12, BiTaO4 and SrBi2Ta2O9 (SBT) thin films were prepared, many of them for the first time by ALD. The chemistries of the binary oxides are shown to influence the processing of their multicomponent counterparts. The compatibility of precursor volatilities, thermal stabilities and reactivities is essential for multicomponent oxide ALD, but it should be noted that the main reactive species, the growing film itself, must also be compatible with self-limiting growth chemistry. In the cases of BaO and Bi2O3 the growth of the binary oxide was very difficult, but the presence of Ti or Ta in the growing film made self-limiting growth possible. The application of the deposited films as dielectric and ferroelectric materials was studied. Post-deposition annealing treatments in different atmospheres were used to achieve the desired crystalline phase or, more generally, to improve electrical properties. Electrode materials strongly influenced the leakage current densities in the prepared metal-insulator-metal (MIM) capacitors. Film permittivities above 100 and leakage current densities below 1×10^-7 A/cm2 were achieved with several of the materials.
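The self-limiting requirement can be idealized (an illustrative textbook picture, not a model taken from the thesis) as Langmuir-type saturation of the surface reaction during each precursor pulse,

\theta(t) = \theta_{sat}\left(1 - e^{-kt}\right),

so that for sufficiently long pulse times the coverage \theta, and hence the growth per cycle, becomes independent of the precursor dose. A constituent whose binary oxide fails to saturate in this way, as for BaO and Bi2O3 here, disrupts controlled multicomponent growth unless the chemistry of the growing film restores the saturation.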
Abstract:
Purpose – This paper aims to go beyond a bookkeeping approach to evolutionary analysis, whereby surviving firms are deemed better adapted and extinct firms less adapted. Drawing on the preliminary findings of research into the Hobart pizza industry, evidence is presented of the need to adopt a more traditional approach to applying evolutionary theories within organizational research. Design/methodology/approach – After a brief review of the relevant literature, the preliminary findings of research into the Hobart pizza industry are presented. Several evolutionary concepts that are commonplace in ecological research are then introduced to help explain the emergent findings. The paper concludes with consideration given to advancing a more consistent approach to employing evolutionary theories within organizational research. Findings – The paper finds that the process of selection cannot be assumed to occur evenly across time and/or space. Within geographically small markets, different forms of selection operate in different ways and to different degrees, requiring the use of more traditional evolutionary theories to highlight the causal processes associated with population change. Research limitations/implications – The paper concludes by highlighting Geoffrey Hodgson’s Principle of Consistency. It is demonstrated that a failure to truly understand how and why a theory is used in one domain will likely result in its misuse in another domain. At present, too few evolutionary concepts are employed in organizational research to ensure an appreciation of the underlying causal processes through which social change occurs. Originality/value – The concepts introduced throughout this paper, whilst not new, provide new entry points for organizational researchers intent on employing an evolutionary approach to understand the process of social change.
Abstract:
In the case of pipe trifurcation, previous observations report negative energy losses in the centre branch. This is anomalous, because by the principle of conservation of energy there should not be any negative energy loss. Earlier investigators have suggested that this may be due to the non-inclusion of the kinetic energy coefficient (α) in the computations of energy losses, but without any experimental evidence. In the present work, energy loss coefficients have been evaluated from experimentally determined velocity profiles. It has been found that with the inclusion of α in the computations of energy loss, there is no negative energy loss in the centre branch.
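For reference, the kinetic energy (Coriolis) coefficient used in such computations is defined (standard hydraulics, consistent with the usage above) from the velocity profile u over a cross-section of area A with mean velocity V:

\alpha = \frac{1}{A V^{3}} \int_{A} u^{3}\, dA \;\; (\geq 1),

and the energy head at a section is z + p/(\rho g) + \alpha V^{2}/(2g). Since the profiles in the branches are strongly non-uniform, \alpha exceeds unity there, and setting \alpha = 1 misstates the kinetic energy terms from which the branch losses are computed.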
Abstract:
This paper explores how whiteness scholarship can support deep engagement with both historical and contemporary forms of whiteness and racism in early childhood education. To this point, the uptake of whiteness scholarship in the field of early childhood has focused predominantly on autobiographical narratives. These narratives recount white educators’ stories of ‘becoming aware’ of or ‘unmasking’ their whiteness. In colonising contexts, including Australia, New Zealand and Canada, understanding how whiteness operates in different ways, and what this means for educational research and practice, can support researchers and educators to identify and describe more fully the impacts of subtle forms of racism in their everyday practices. In this paper, whiteness is explored in a broader sense: as a form of property; as an organising principle for institutional behaviours and practices; and as a fluid identity or subject position. These three intersecting elements of whiteness are drawn on to analyse data from a doctoral study about embedding Aboriginal and Torres Strait Islander perspectives in early childhood education curricula in two Australian urban childcare settings. The analysis focuses on how whiteness operated within the research site and research processes, along with the actions, inaction and talk of two educators engaged in embedding work. Findings show that both the researcher and the educators reinforced, rather than reduced, the impacts of whiteness and racism, despite the best of intentions.
Abstract:
In this Thesis, we develop theory and methods for computational data analysis. The problems in data analysis are approached from three perspectives: statistical learning theory, the Bayesian framework, and the information-theoretic minimum description length (MDL) principle. Contributions in statistical learning theory address the possibility of generalization to unseen cases, and regression analysis with partially observed data, with an application to mobile device positioning. In the second part of the Thesis, we discuss so-called Bayesian network classifiers, and show that they are closely related to logistic regression models. In the final part, we apply the MDL principle to tracing the history of old manuscripts, and to noise reduction in digital signals.
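The connection between the two model classes can be seen directly (a standard derivation, given here for orientation): for a two-class Naive Bayes model with class Y and discrete features x_1, ..., x_d, the posterior log-odds

\log \frac{P(Y=1 \mid x)}{P(Y=0 \mid x)} = \log \frac{P(Y=1)}{P(Y=0)} + \sum_{i=1}^{d} \log \frac{P(x_i \mid Y=1)}{P(x_i \mid Y=0)}

is linear in indicator functions of the feature values, i.e. it has exactly the functional form of a logistic regression model; the two differ in how the weights are fitted (generatively versus by conditional likelihood).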
Abstract:
This paper presents a method of designing a minimax filter in the presence of large plant uncertainties and constraints on the mean squared values of the estimates. The minimax filtering problem is reformulated in the framework of a deterministic optimal control problem, and the method of solution employed invokes the matrix Minimum Principle. The constrained linear filter and its relation to singular control problems are illustrated. For the class of problems considered here, it is shown that the filter can be constrained separately after carrying out the minimaximization. Numerical examples are presented to illustrate the results.
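In outline (a generic formulation of this problem class, not the paper's exact notation), the filter F is chosen against the worst plant q in the uncertainty set Q, subject to the mean-square constraints on the estimates:

\min_{F} \max_{q \in Q} \; E\left[ \| x - \hat{x}_F \|^{2} \right] \quad \text{subject to} \quad E[\hat{x}_{F,i}^{2}] \le c_i,

and recasting this in terms of the evolution of the error-covariance matrix is what allows the matrix Minimum Principle to supply necessary conditions for optimality.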
Abstract:
Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, however, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
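As a concrete illustration of the multinomial case, the sketch below implements the published linear-time recurrence of Kontkanen and Myllymäki for the NML normalizing sum (the parametric complexity). The function name is mine, and floating-point arithmetic is used throughout, whereas the thesis also considers exact rational computation:

```python
import math

def multinomial_complexity(K, n):
    """Normalizing sum C(K, n) of the NML distribution for a K-category
    multinomial model and sample size n, via the linear-time recurrence
        C(k, n) = C(k-1, n) + (n / (k - 2)) * C(k-2, n),  k >= 3.
    """
    # Base case C(1, n) = 1: with one category there is a single
    # possible data set, which the model predicts with probability 1.
    c_km2 = 1.0
    if K == 1:
        return c_km2
    # Base case C(2, n): explicit binomial sum over the count h of the
    # first category (Python evaluates 0.0 ** 0 to 1.0, as required).
    c_km1 = sum(math.comb(n, h) * (h / n) ** h * ((n - h) / n) ** (n - h)
                for h in range(n + 1))
    for k in range(3, K + 1):
        c_km2, c_km1 = c_km1, c_km1 + (n / (k - 2)) * c_km2
    return c_km1

# Stochastic complexity (NML code length, in nats) of observed counts:
# -log maximized likelihood plus log of the normalizing sum.
counts = [5, 3, 2]
n, K = sum(counts), len(counts)
log_ml = sum(h * math.log(h / n) for h in counts if h > 0)
print(-log_ml + math.log(multinomial_complexity(K, n)))
```

The recurrence avoids the exponential sum over all count vectors, reducing the cost to O(n + K) once the binomial base case has been evaluated.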
Abstract:
The Minimum Description Length (MDL) principle is a general, well-founded theoretical formalization of statistical modeling. The most important notion of MDL is the stochastic complexity, which can be interpreted as the shortest description length of a given sample of data relative to a model class. The exact definition of the stochastic complexity has gone through several evolutionary steps. The latest instantiation is based on the so-called Normalized Maximum Likelihood (NML) distribution, which has been shown to possess several important theoretical properties. However, applications of this modern version of MDL have been quite rare because of computational complexity problems: for discrete data, the definition of NML involves an exponential sum, and in the case of continuous data, a multi-dimensional integral that is usually infeasible to evaluate or even approximate accurately. In this doctoral dissertation, we present mathematical techniques for computing NML efficiently for some model families involving discrete data. We also show how these techniques can be used to apply MDL in two practical applications: histogram density estimation and clustering of multi-dimensional data.
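For reference, the NML distribution over data sets x^n relative to a model class with maximum likelihood estimator \hat{\theta}(\cdot) is (standard definition, consistent with the description above)

P_{NML}(x^n) = \frac{P(x^n \mid \hat{\theta}(x^n))}{\sum_{y^n} P(y^n \mid \hat{\theta}(y^n))},

and the stochastic complexity is -\log P_{NML}(x^n). The denominator, running over all possible data sets of size n, is exactly the exponential sum (or, for continuous data, the multi-dimensional integral) whose efficient evaluation this dissertation addresses.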