958 results for Semiclassical violation of the equivalence principle
Abstract:
Bibliography: p. [221]-222.
Abstract:
One of the problems in solving AI tasks with neurocomputing methods is the considerable training time required. This problem is especially acute when high forecast reliability or pattern-recognition quality must be reached. Some formalized ways of increasing a network's training speed without losing precision are proposed here. The offered approaches are based on the Sufficiency Principle, a formal representation of the aim of a concrete task and of the conditions (limitations) on its solving [1]. This develops the concept of including a formal description of the aims in the context of such AI tasks as classification, pattern recognition, estimation, etc.
Abstract:
Mathematics Subject Classification: 35CXX, 26A33, 35S10
Abstract:
Same-sex parenting is by no means a new phenomenon but the legal recognition and acceptance of gay and lesbian couples as parents is a relatively recent development in most countries. Traditionally, such recognition has been opposed on the basis of the claim that the best interests of children could not be met by gay and lesbian parents. This thesis examines the validity of this argument and it explores the true implications of the best interests principle in this context. The objective is to move away from subjective or moral conceptions of the best interests principle to an understanding which is informed by relevant sociological and psychological data and which is guided by reference to the rights contained in the UN Convention on the Rights of the Child. Using this perspective, the thesis addresses the overarching issue of whether the law should offer legal recognition and protection to gay and lesbian families and the more discrete matter of how legal protection should be provided. It is argued that the best interests principle can be used to demand that same-sex parenting arrangements should be afforded legal recognition and protection. Suggestions are also presented as to the most appropriate manner of providing for this recognition. In this regard, guidance is drawn from the English and South African experience in this area. Overall, the objective is to assess the current laws from the perspective of the best interests principle so as to ensure that the law operates in a manner which adheres to the rights and interests of children.
Abstract:
A complainant alleged the Department of Revenue violated the South Carolina Procurement Code. This paper examines that complaint.
Abstract:
Graduate Program in Physics - IFT
Abstract:
As far as external gravitational fields described by Newton's theory are concerned, theory shows that there is an unavoidable conflict between the universality of free fall (Galileo's equivalence principle) and quantum mechanics, a result confirmed by experiment. Is this conflict due, perhaps, to the use of Newton's gravity, instead of general relativity, in the analysis of the external gravitational field? The answer is negative. To show this we compute the lowest-order corrections to the cross-section for the scattering of different quantum particles by an external gravitational field in the framework of Einstein's linearized gravity. To first order the cross-sections are spin-dependent; if the calculations are pushed to the next order they become dependent upon energy as well. Therefore Galileo's equivalence principle and, consequently, the classical equivalence principle are violated in both cases. We address these issues here.
Abstract:
The purpose of this exploratory investigation was to provide a more precise understanding and basis from which to assess the potential role of the precautionary principle in tourism. The precautionary principle, analogous to the ideal of sustainable development, is a future-focused planning and regulatory mechanism that emphasizes pro-action and recognizes the limitations of contemporary scientific methods. A total of 100 respondents (80 tourism academics, 20 regional government tourism officials) from Canada, the United States, the United Kingdom, Australia, and New Zealand completed the web-based survey between May and June 2003. Respondents reported their understanding of the precautionary principle, rated stakeholder involvement and education strategies, assessed potential barriers to implementation, and appraised the steps of a proposed framework for implementation. Due to low subsample numbers, measures of central tendency were primarily used to compare groups, while inferential statistics were applied when warranted. Results indicated that most respondents (79%) felt the principle could be a guiding principle for tourism, while local and regional government entities were reported to have the most power in the implementation process. Findings suggested close links between the precautionary principle and sustainability, as concern for future generations was the most critical element of the principle for tourism. Overall, tourism academics were more supportive of the precautionary principle in tourism than were regional government tourism officials. Only minor variation was found in responses among regional groups across all variables. This study established basic ground for understanding the precautionary principle in tourism and has been effective in formulating more precise questions for future research.
Abstract:
This thesis contributes to a general theory of project design. Situated within a demand shaped by the challenges of sustainable development, the main objective of this research is to contribute a theoretical model of design that makes it possible to better situate the use of tools and standards for evaluating the sustainability of a project. The fundamental principles of these normative instruments are analyzed along four dimensions: ontological, methodological, epistemological, and teleological. Indicators of certain counter-productive effects linked, in particular, to compliance with these standards confirm the need for a theory of qualitative judgment. Our main hypothesis rests on the conceptual framework offered by the notion of the "precautionary principle", whose first formulations date back to the early 1970s and which aimed precisely to remedy the shortcomings of traditional scientific evaluation tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design theory (design thinking), it focuses on how the consideration of sustainability has evolved. From this perspective, we observe that theories of "green design" dating from the early 1960s, as well as theories of "ecological design" from the 1970s and 1980s, eventually converged with the recent theories of "sustainable design" that emerged in the early 1990s. The different approaches to the "precautionary principle" are then examined with respect to the question of project sustainability. Standard risk-assessment methods are compared with approaches using the precautionary principle, revealing certain limits in the design of a project.
A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched out. This model offers a global view for judging a project that integrates principles of sustainable development, and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of "prudence" as it was historically used to guide architectural judgment. What, then, of the challenges posed by the judgment of architectural projects amid the rise of standardized evaluation methods (e.g. Leadership in Energy and Environmental Design, LEED)? The thesis proposes a reinterpretation of design theory as formulated by Donald A. Schön as a way of taking evaluation tools such as LEED into account. This exercise, however, reveals an epistemological obstacle that must be addressed in a reformulation of the model. In line with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED standardized sustainability-assessment method. A preliminary series of "tensions" is identified in the process of designing and judging the projects. These tensions are then categorized into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualization (analogical/logical); (2) uncertainty (epistemological/methodological); (3) comparability (interpretive/analytical); and (4) proposition (universality/contextual relevance).
These conceptual tensions are regarded as vectors that correlate with the theoretical model, which they help to enrich without constituting validations in the positivist sense of the term. These confrontations with reality make it possible to better define the epistemological obstacle identified earlier. This thesis thus highlights the generally underestimated impacts of environmental standardization on the process of designing and judging projects, taking as a non-restrictive example the examination of Canadian architecture competitions for public buildings. The conclusion underscores the need for a new form of "reflexive prudence" and for a more critical use of current sustainability-assessment tools. It calls for an instrumentation founded on global integration rather than on the opposition of environmental approaches.
Abstract:
By using a nonholonomic moving-frame version of the general covariance principle (an active version of the equivalence principle), an analysis of the gravitational coupling prescription of teleparallel gravity is made. It is shown that the coupling prescription determined by this principle is always equivalent to the corresponding prescription of general relativity, even in the presence of fermions. An application to the case of a Dirac spinor is made.
Abstract:
A strict proof of the equivalence of the Duffin-Kemmer-Petiau and Klein-Gordon-Fock theories is presented for physical S-matrix elements in the case of charged scalar particles minimally interacting with an external or quantized electromagnetic field. The Hamiltonian canonical approach to the Duffin-Kemmer-Petiau theory is first developed in both the component and the matrix form. The theory is then quantized through the construction of the generating functional for the Green's functions, and the physical matrix elements of the S-matrix are proved to be relativistic invariants. The equivalence of the two theories is then proved for the matrix elements of the scattered scalar particles, using the reduction formulas of Lehmann, Symanzik, and Zimmermann, and for the many-photon Green's functions.
Abstract:
We studied the low-energy motion of particles in the generally covariant version of Horava-Lifshitz gravity proposed by Horava and Melby-Thompson. Using a scalar field coupled to gravity according to the minimal-substitution recipe proposed by da Silva, and taking the geometrical-optics limit, we could write an effective relativistic metric for a general solution. As a result, we discovered that the equivalence principle is not in general recovered at low energies, unless the spatial Laplacian of A vanishes. Finally, we analyzed the motion on the spherically symmetric solution proposed by Horava and Melby-Thompson, where we could find its effective line element and compute spin-0 geodesics. Using standard methods we have shown that such an effective metric cannot reproduce Newton's law of gravity even in the weak-field approximation.
Abstract:
In the maximum parsimony (MP) and minimum evolution (ME) methods of phylogenetic inference, evolutionary trees are constructed by searching for the topology that shows the minimum number of mutational changes required (M) and the smallest sum of branch lengths (S), respectively, whereas in the maximum likelihood (ML) method the topology showing the highest likelihood (L) of observing a given data set is chosen. However, the theoretical basis of this optimization principle remains unclear. We therefore examined the relationships of M, S, and L for the MP, ME, and ML trees with those for the true tree by using computer simulation. The results show that M and S are generally greater for the true tree than for the MP and ME trees when the number of nucleotides examined (n) is relatively small, whereas L is generally lower for the true tree than for the ML tree. This finding indicates that the optimization principle tends to give incorrect topologies when n is small. To deal with this disturbing property of the optimization principle, we suggest that more attention should be given to testing the statistical reliability of an estimated tree rather than to finding the optimal tree with excessive effort. When a reliability test is conducted, simplified MP, ME, and ML algorithms such as the neighbor-joining method generally give conclusions about phylogenetic inference very similar to those obtained by more extensive tree-search algorithms.
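As a concrete illustration of the parsimony criterion M, the Fitch small-parsimony algorithm counts the minimum number of mutational changes a fixed topology requires at a single site. The sketch below is a minimal illustration under that standard algorithm, not code from the study; the function and variable names are illustrative.

```python
# Minimal sketch of the Fitch small-parsimony count for one site on a
# fixed rooted binary topology (names are illustrative assumptions).
def fitch_score(tree, leaf_states):
    """tree: nested 2-tuples of leaf names; leaf_states: leaf -> nucleotide."""
    changes = 0

    def post_order(node):
        nonlocal changes
        if isinstance(node, str):            # leaf: singleton state set
            return {leaf_states[node]}
        left, right = node
        a, b = post_order(left), post_order(right)
        common = a & b
        if common:                           # children agree: no change here
            return common
        changes += 1                         # children disagree: one change
        return a | b

    post_order(tree)
    return changes

# Topology ((A,B),(C,D)) with states A, G, A, A needs a single change:
print(fitch_score((("A", "B"), ("C", "D")),
                  {"A": "A", "B": "G", "C": "A", "D": "A"}))  # → 1
```

Summing this count over all sites gives the tree length M that the MP criterion minimizes over topologies.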
Abstract:
Simplifying the Einstein field equation by assuming the cosmological principle yields a set of differential equations which governs the dynamics of the universe as described in the cosmological standard model. The cosmological principle assumes that space appears the same everywhere and in every direction; moreover, the principle has earned its position as a fundamental assumption in cosmology by being compatible with the observations of the 20th century. It was not until the current century that observations on cosmological scales showed significant deviations from isotropy and homogeneity, implying a violation of the principle. Among these observations are the inconsistency between local and non-local Hubble parameter evaluations, the baryon acoustic features of the Lyman-α forest, and the anomalies of the cosmic microwave background radiation. As a consequence, cosmological models beyond the cosmological principle have been studied extensively; after all, the principle is a hypothesis and as such should be tested as frequently as any other assumption in physics. In this thesis, the effects of inhomogeneity and anisotropy, arising as a consequence of discarding the cosmological principle, are investigated. The geometry and matter content of the universe become more cumbersome, and the resulting effects on the Einstein field equation are introduced. The cosmological standard model and its issues, both fundamental and observational, are presented. Particular interest is given to the local Hubble parameter, supernova explosions, baryon acoustic oscillations, and cosmic microwave background observations, as well as to the cosmological constant problems. Resolutions that have been explored and proposed by violating the cosmological principle are reviewed. The thesis is concluded by a summary and an outlook on the included research papers.
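For reference, the set of differential equations obtained by imposing the cosmological principle on the Einstein field equation are the standard Friedmann equations of the FLRW metric; in one conventional form (with scale factor a(t), spatial curvature k, energy density ρ, pressure p, and cosmological constant Λ), they read:

```latex
\left(\frac{\dot a}{a}\right)^{2}
  = \frac{8\pi G}{3}\,\rho - \frac{k c^{2}}{a^{2}} + \frac{\Lambda c^{2}}{3},
\qquad
\frac{\ddot a}{a}
  = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right) + \frac{\Lambda c^{2}}{3}.
```

Inhomogeneous or anisotropic models of the kind studied in the thesis replace these two equations with a more cumbersome system in which expansion can depend on position and direction.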
Abstract:
Laboratory tests with aqueous solutions of Euphorbia splendens var. hislopii latex have demonstrated seasonal stability of the molluscicidal principle, with LD90 values of 1.14 ppm (spring), 1.02 ppm (fall), 1.09 ppm (winter), and 1.07 ppm (summer) determined against Biomphalaria tenagophila in the field. Assays on latex collected in Belo Horizonte and Recife yielded LD90 values similar to those obtained with the reference substance collected in Rio de Janeiro (Ilha do Governador), demonstrating geographic stability of the molluscicidal effect. The molluscicidal action of aqueous dilutions of the latex in natura, centrifuged (precipitate), and lyophilized was stable for up to 124 days at room temperature (in natura) and for up to 736 days in a common refrigerator at 10 to 12 °C (lyophilized product). A 5.0 ppm solution is 100% lethal for snails up to 13 days after preparation, the effect being gradually lost to almost total inactivity by the 30th day; this observation indicates that the active principle in dilute solution is unstable. These properties, together with the wide distribution of the plant, its resistance and adaptation to the tropical climate, its easy cultivation, and the ease of obtaining the latex and preparing the molluscicidal solution, make this a promising material for large-scale use in the control of schistosomiasis.