120 results for FORMALISMS


Relevance:

10.00%

Publisher:

Abstract:

In this work we study two different one-dimensional quantum systems: a linear potential in an infinite well and an inverted harmonic oscillator in an infinite well. We solve the Schrödinger equation for both systems and obtain the energy eigenvalues and eigenfunctions, using the boundary conditions together with numerical methods. The motivation for our study comes from an experimental background. For the linear potential we use two different boundary conditions. The first is the so-called normal boundary condition, in which the wave function goes to zero at the edge of the well. The second is the derivative boundary condition, in which the derivative of the wave function goes to zero at the edge of the well. The actual solutions are Airy functions. In the case of the inverted oscillator the solutions are parabolic cylinder functions, and they are obtained using only the normal boundary condition. Both potentials are compared with the particle-in-a-box solutions. We also present figures and tables that show what the solutions look like, and the similarities and differences with the particle-in-a-box solutions are illustrated visually. The figures and calculations are done with mathematical software. We further compare the linear potential to a case where the infinite wall is present only on the left side, and for this case we also show graphical information on the different properties. For the inverted harmonic oscillator we take a closer look at quantum-mechanical tunneling. We present some of the history of quantum tunneling theory and its developers, and finally introduce the Feynman path integral formulation, which enables us to obtain the instanton solutions. The instanton solutions are a way to examine the tunneling properties of the quantum system. The results are compared with the solutions of the double-well potential, which as a quantum system is very similar to our case; since the solutions are obtained with the same methods, the comparison is relatively easy. All in all, we go through some of the stages of quantum theory and look at the different ways to interpret it. We also present the special functions needed in our solutions and examine their properties and their relations to other special functions. It is essential to note that different mathematical formalisms can be used to obtain the desired result. Quantum theory has been built over more than one hundred years and admits different approaches; different aspects make it possible to look at different things.
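
No code accompanies the abstract; as a hedged illustration of the kind of boundary-value computation it describes, the sketch below finds energy eigenvalues of a linear potential V(x) = F·x inside an infinite well under the normal (Dirichlet) boundary condition, by locating zeros of the Airy-function boundary determinant. The units (ħ²/2m = 1), parameter values, and function names are assumptions made for this example, not taken from the thesis.

```python
# Sketch: eigenvalues of a linear potential V(x) = F*x in an infinite well [0, L]
# with the "normal" boundary condition psi(0) = psi(L) = 0.  Units: hbar^2/2m = 1.
import numpy as np
from scipy.special import airy
from scipy.optimize import brentq

F, L = 5.0, 1.0                          # assumed slope and well width

def z(x, E):
    """Airy argument for -psi'' + F*x*psi = E*psi."""
    return F ** (1.0 / 3.0) * (x - E / F)

def boundary_det(E):
    """Vanishes when psi = a*Ai + b*Bi can satisfy psi(0) = psi(L) = 0."""
    Ai0, _, Bi0, _ = airy(z(0.0, E))
    AiL, _, BiL, _ = airy(z(L, E))
    return Ai0 * BiL - AiL * Bi0

# Scan energies for sign changes and refine each root with Brent's method.
E_grid = np.linspace(0.1, 400.0, 4000)
vals = [boundary_det(E) for E in E_grid]
eigenvalues = [brentq(boundary_det, a, b)
               for a, b, fa, fb in zip(E_grid[:-1], E_grid[1:], vals[:-1], vals[1:])
               if fa * fb < 0]
print(eigenvalues[:5])   # lowest levels; pure particle-in-a-box levels here are n^2*pi^2
```

The derivative boundary condition would use Ai' and Bi' (the other two values returned by scipy.special.airy) in the same determinant.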

Relevance:

10.00%

Publisher:

Abstract:

Qualitative spatial reasoning (QSR) is an important field of AI that deals with qualitative aspects of spatial entities. Regions and their relationships are described in qualitative terms instead of numerical values. This approach models human reasoning about such entities more closely than other approaches. Relationships between regions that we encounter in daily life are normally formulated in natural language. For example, one can outline one's room plan to an expert by indicating which rooms should be connected to each other. Mereotopology, as an area of QSR, combines mereology, topology and algebraic methods. As mereotopology plays an important role in region-based theories of space, our focus is on one of the most widely referenced formalisms for QSR, the region connection calculus (RCC). RCC is a first-order theory based on a primitive connectedness relation, a binary symmetric relation satisfying some additional properties. Using this relation we can define a set of basic binary relations which have the property of being jointly exhaustive and pairwise disjoint (JEPD), which means that between any two spatial entities exactly one of the basic relations holds. Basic reasoning can then be done by using the composition operation on relations, whose results are stored in a composition table. Relation algebras (RAs) have become a central tool for spatial reasoning in the area of QSR. These algebras are based on equational reasoning, which can be used to derive further relations between regions in a given situation. Each of these algebras describes the relations between regions up to a certain degree of detail. In this thesis we use the method of splitting atoms in an RA in order to reproduce known algebras such as RCC15 and RCC25 systematically and to generate new algebras, and hence a more detailed description of regions, beyond RCC25.
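
As an illustration of the composition-table reasoning mentioned above, here is a minimal sketch (not from the thesis) of one path-consistency refinement step over a JEPD set of basic relations. Only a few composition-table entries are included, chosen as uncontroversial RCC8 examples; a real reasoner would use the complete table of the algebra in question.

```python
# Toy composition-based refinement over RCC8-style basic relations.
from itertools import product

UNIVERSAL = frozenset({"DC", "EC", "PO", "TPP", "NTPP", "TPPi", "NTPPi", "EQ"})

# composition[(r, s)] = basic relations that may hold between x and z
# when r(x, y) and s(y, z) hold.  Missing pairs default to the universal relation.
composition = {
    ("EQ", "TPP"): {"TPP"},
    ("TPP", "TPP"): {"TPP", "NTPP"},
    ("NTPP", "NTPP"): {"NTPP"},
}

def compose(R, S):
    """Compose two disjunctive relations (sets of basic relations)."""
    out = set()
    for r, s in product(R, S):
        out |= composition.get((r, s), UNIVERSAL)
    return out

def refine(Rxy, Ryz, Rxz):
    """One path-consistency step: constrain R(x,z) by R(x,y) o R(y,z)."""
    return set(Rxz) & compose(Rxy, Ryz)

# If x is a tangential proper part of y, and y of z, then x is a proper part of z.
print(refine({"TPP"}, {"TPP"}, UNIVERSAL))   # prints {'TPP', 'NTPP'} (set order may vary)
```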

Relevance:

10.00%

Publisher:

Abstract:

Among many other knowledge representation formalisms, Ontologies and Formal Concept Analysis (FCA) aim at modeling ‘concepts’. We discuss how these two formalisms may complement one another from an application point of view. In particular, we will see how FCA can be used to support Ontology Engineering, and how ontologies can be exploited in FCA applications. The interplay of FCA and ontologies is studied along the life cycle of an ontology: (i) FCA can support the building of the ontology as a learning technique. (ii) The established ontology can be analyzed and navigated by using techniques of FCA. (iii) Last but not least, the ontology may be used to improve an FCA application.
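
Since the paper builds on FCA, the following small sketch (illustrative only, with an invented toy context) shows the basic FCA construction: deriving all formal concepts, i.e. closed (extent, intent) pairs, of a binary object-attribute context by brute force.

```python
# Brute-force derivation of the formal concepts of a tiny formal context.
from itertools import chain, combinations

objects = {"g1", "g2", "g3"}
attributes = {"a", "b", "c"}
incidence = {("g1", "a"), ("g1", "b"), ("g2", "b"), ("g2", "c"), ("g3", "b")}

def intent(A):
    """Attributes shared by all objects in A."""
    return {m for m in attributes if all((g, m) in incidence for g in A)}

def extent(B):
    """Objects having all attributes in B."""
    return {g for g in objects if all((g, m) in incidence for m in B)}

def concepts():
    """Enumerate all (extent, intent) pairs by closing every object subset."""
    seen = []
    for A in chain.from_iterable(combinations(sorted(objects), r)
                                 for r in range(len(objects) + 1)):
        B = intent(set(A))
        A_closed = extent(B)
        if (A_closed, B) not in seen:
            seen.append((A_closed, B))
    return seen

for ext, inte in concepts():
    print(sorted(ext), sorted(inte))
```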

Relevance:

10.00%

Publisher:

Abstract:

About ten years ago, triadic contexts were presented by Lehmann and Wille as an extension of Formal Concept Analysis. However, they have rarely been used up to now, which may be due to the rather complex structure of the resulting diagrams. In this paper, we go one step back and discuss how traditional line diagrams of standard (dyadic) concept lattices can be used for exploring and navigating triadic data. Our approach is inspired by the slice & dice paradigm of On-Line Analytical Processing (OLAP). We recall the basic ideas of OLAP and show how they may be transferred to triadic contexts. For modeling the navigation patterns a user might follow, we use the formalism of finite state machines. To present the benefits of our model, we show how it can be used for navigating the IT Baseline Protection Manual of the German Federal Office for Information Security.
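
The slicing step described above can be sketched in a few lines: fixing one condition of a triadic context projects it onto an ordinary dyadic context, which can then be displayed as a standard line diagram. The triples below are invented for illustration (loosely themed on a security manual); they are not taken from the paper.

```python
# "Slicing" a triadic context by fixing one condition yields a dyadic context.
triples = {                     # (object, attribute, condition) incidence
    ("serverA", "encrypted", "at-rest"),
    ("serverA", "encrypted", "in-transit"),
    ("serverB", "encrypted", "in-transit"),
    ("serverB", "audited",   "at-rest"),
}

def slice_by_condition(triadic, condition):
    """Project the triadic incidence onto the objects/attributes of one condition."""
    return {(g, m) for (g, m, b) in triadic if b == condition}

print(slice_by_condition(triples, "in-transit"))
# {('serverA', 'encrypted'), ('serverB', 'encrypted')}
```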

Relevance:

10.00%

Publisher:

Abstract:

DIADEM, created by THOMSON-CSF, is a methodology for specifying and developing user interfaces. It improves both the productivity of the interface development process and the quality of the resulting interface. The method supports user interface development in three respects. (1) DIADEM defines the roles of the people involved and their tasks, and organises the sequence of activities. (2) It provides graphical formalisms supporting information exchange between people. (3) It offers a basic set of rules for optimal human-machine interfaces. The use of DIADEM in three areas (process control, sales support, and multimedia presentation) was observed and evaluated by our laboratory in the European project DIAMANTA (ESPRIT P20507). The method provides an open procedure that leaves room for adaptation to a specific application and environment. This paper gives an overview of DIADEM and shows how to extend its formalisms for developing multimedia interfaces.

Relevance:

10.00%

Publisher:

Abstract:

Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
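
As a concrete anchor for the F-B discussion, here is a minimal sketch of the forward pass for a discrete HMM. The toy parameters are assumptions, not from the paper; the recursion itself is the standard one that the paper recasts as inference in a PIN.

```python
# Forward pass of the forward-backward algorithm for a two-state discrete HMM.
import numpy as np

pi = np.array([0.6, 0.4])                  # initial state distribution
A = np.array([[0.7, 0.3],                  # A[i, j] = P(state_{t+1} = j | state_t = i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],                  # B[i, k] = P(obs = k | state = i)
              [0.2, 0.8]])

def forward(obs):
    """Return P(observations) via the forward recursion alpha_t(i)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(forward([0, 1, 1, 0]))               # likelihood of a short observation sequence
```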

Relevance:

10.00%

Publisher:

Abstract:

Computational formalisms have been pushing the boundaries of the field of computing for the last 80 years, and much debate has surrounded what computing entails: what it is, and what it is not. This paper seeks to explore the boundaries of the ideas of computation and provide a framework for enabling a constructive discussion of computational ideas. First, a review of computing is given, ranging from Turing Machines to interactive computing. Then, a variety of natural physical systems are considered for their computational qualities. From this exploration, a framework is presented under which all dynamical systems can be considered as instances of the class of abstract computational platforms. An abstract computational platform is defined both by its intrinsic dynamics and by how it allows computation that is meaningful to an external agent through the configuration of constraints upon those dynamics. It is asserted that a platform’s computational expressiveness is directly related to the freedom with which constraints can be placed. Finally, the requirements for a formal constraint description language are considered, and it is proposed that Abstract State Machines may provide a reasonable basis for such a language.

Relevance:

10.00%

Publisher:

Abstract:

Using a new proposal for the "picture lowering" operators, we compute the tree-level scattering amplitude in the minimal pure spinor formalism by performing the integration over the pure spinor space as a multidimensional Cauchy-type integral. The amplitude is written in terms of the projective pure spinor variables, which turns out to be useful for relating rigorously the minimal and non-minimal versions of the pure spinor formalism. The natural language for relating these formalisms is the Čech-Dolbeault isomorphism. Moreover, the Dolbeault cocycle corresponding to the tree-level scattering amplitude must be evaluated in SO(10)/SU(5) instead of the whole pure spinor space, which means that the origin is removed from this space. The Čech-Dolbeault language also plays a key role in proving the invariance of the scattering amplitude under BRST, Lorentz and supersymmetry transformations, as well as the decoupling of unphysical states. We also relate the Green's function for the massless scalar field in ten dimensions to the tree-level scattering amplitude, and comment on the scattering amplitude at higher orders. In contrast with the traditional picture lowering operators, with our new proposal the tree-level scattering amplitude is independent of the constant spinors introduced to define them, and the BRST-exact terms decouple without integrating over these constant spinors.

Relevance:

10.00%

Publisher:

Abstract:

We employed density functional theory with small basis sets, B3LYP/LANL2DZ, to study FeTIM complexes with different pairs of axial ligands (CO, H(2)O, NH(3), imidazole and CH(3)CN). These calculations did not reveal relevant changes in molecular quantities such as bond lengths, vibrational frequencies and electronic populations that would support any significant back-donation to the carbonyl or acetonitrile axial ligands. Moreover, a back-donation mechanism to the macrocycle cannot be used to explain the observed changes in molecular properties along these complexes with CO or CH(3)CN. This work also indicates that complexes with CO show smaller binding energies and are less stable than complexes with CH(3)CN. Further, the electronic band with the largest intensity in the visible region (or close to this region) is associated with the transition from an occupied 3d orbital on iron to an empty pi* orbital located on the macrocycle. The energy of this Metal-to-Ligand Charge Transfer (MLCT) transition shows a linear relation with the total charge of the macrocycle in these complexes as given by the Mulliken or Natural Population Analysis (NPA) formalisms. Finally, the macrocycle total charge seems to be influenced by the field induced by the axial ligands. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Prosodic/Template Morphology, which "draws heavily on the theoretical apparatus and formalisms of the generative phonology model known as autosegmental phonology" (Katamba, F. 1993: 154), is the best analysis that can handle Arabic morphology. Verbs in Arabic are represented on three independent tiers: the root tier, the skeletal tier and the vocalic melody tier (Katamba, F. 1993). Vowel morphemes, which are represented by diacritics, are inserted within the consonant morphemes, which are represented by primary symbols, to form words. The morpheme tier hypothesis paves the way to understanding the nonconcatenative morphology of Arabic. This paper analyzes gender in perfect active and passive 3rd person singular verbs on the basis of PM. The analysis focuses mainly on the most common Arabic verbs, the triconsonantal verbs, with a brief introduction to the less common quadriconsonantal perfect active and passive masculine and feminine 3rd person singular verbs. I shall also cast light on some vowel changes that certain verbs undergo when the voice changes.
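
The three-tier representation can be illustrated with a short sketch (not part of the paper) that interleaves a triconsonantal root with a CV skeleton and a vocalic melody. The root k-t-b and the melodies for the perfect active and passive are standard textbook examples.

```python
# Root-and-pattern (nonconcatenative) word formation: fill C slots from the root
# tier and V slots from the vocalic melody tier, following the skeletal tier.
def interleave(root, skeleton, melody):
    """Interleave consonants and vowels according to the CV skeleton."""
    consonants, vowels = iter(root), iter(melody)
    return "".join(next(consonants) if slot == "C" else next(vowels)
                   for slot in skeleton)

root = ("k", "t", "b")                      # root tier: 'write'
skeleton = "CVCVCV"                         # skeletal tier
print(interleave(root, skeleton, "aaa"))    # kataba - perfect active, 3rd sg. masc.
print(interleave(root, skeleton, "uia"))    # kutiba - perfect passive, 3rd sg. masc.
```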

Relevance:

10.00%

Publisher:

Abstract:

Many solutions to AI problems require the task to be represented in one of a multitude of rigorous mathematical formalisms. The construction of such mathematical models forms a difficult problem which is often left to the user of the problem solver. This void between problem solvers and the problems is studied by the eclectic field of automated modelling. Within this field, compositional modelling, a knowledge-based methodology for system modelling, has established itself as a leading approach. In general, a compositional modeller organises knowledge in a structure of composable fragments that relate to particular system components or processes. Its embedded inference mechanism chooses the appropriate fragments with respect to a given problem, instantiates and assembles them into a consistent system model. Many different types of compositional modeller exist, however, with significant differences in their knowledge representation and approach to inference. This paper examines compositional modelling. It presents a general framework for building and analysing compositional modellers. Based on this framework, a number of influential compositional modellers are examined and compared. The paper also identifies the strengths and weaknesses of compositional modelling and discusses some typical applications.
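
The fragment-selection-and-assembly idea described above can be caricatured in a few lines. The sketch below is a toy, with invented fragment names, conditions and equations; real compositional modellers use far richer knowledge representations and inference mechanisms.

```python
# Toy compositional modeller: fragments carry applicability conditions and
# equations; a naive assembler instantiates every fragment whose conditions hold.
from dataclasses import dataclass

@dataclass
class Fragment:
    name: str
    conditions: set           # facts that must hold in the scenario
    equations: list           # model equations contributed when selected

LIBRARY = [
    Fragment("tank-mass-balance", {"tank"}, ["dV/dt = Qin - Qout"]),
    Fragment("valve-flow", {"tank", "outlet-valve"}, ["Qout = k * sqrt(V)"]),
    Fragment("heater", {"heating-element"}, ["dT/dt = P / (c * V)"]),
]

def assemble(scenario):
    """Select and combine all fragments applicable to the scenario."""
    chosen = [f for f in LIBRARY if f.conditions <= scenario]
    return [eq for f in chosen for eq in f.equations]

print(assemble({"tank", "outlet-valve"}))
# ['dV/dt = Qin - Qout', 'Qout = k * sqrt(V)']
```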

Relevance:

10.00%

Publisher:

Abstract:

Interoperability of water quality data depends on the use of common models, schemas and vocabularies. However, terms are usually collected during different activities and projects in isolation from one another, resulting in vocabularies with the same scope being represented with different terms, using different formats and formalisms, and published through various access methods. Significantly, most water quality vocabularies conflate multiple concepts in a single term, e.g. quantity kind, units of measure, substance or taxon, medium and procedure. This bundles information associated with separate elements from the OGC Observations and Measurements (O&M) model into a single slot. We have developed a water quality vocabulary, formalized using RDF, and published as Linked Data. The terms were extracted from existing water quality vocabularies. The observable property model is inspired by O&M but aligned with existing ontologies. The core is an OWL ontology that extends the QUDT ontology for Unit and QuantityKind definitions. We add classes to generalize the QuantityKind model, and properties for explicit description of the conflated concepts. The key elements are defined to be sub-classes or sub-properties of SKOS elements, which enables a SKOS view to be published through standard vocabulary APIs, alongside the full view. QUDT terms are re-used where possible, supplemented with additional Unit and QuantityKind entries required for water quality. Along with items from separate vocabularies developed for objects, media, and procedures, these are linked into definitions in the actual observable property vocabulary. Definitions of objects related to chemical substances are linked to items from the Chemical Entities of Biological Interest (ChEBI) ontology. Mappings to other vocabularies, such as DBPedia, are maintained in separate files. Formalizing the model for observable properties, and clearly labelling the separate concerns, allows water quality observations from different sources to be merged more easily and also transformed to O&M for cross-domain applications.
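
As a hedged sketch of the kind of entry such a vocabulary publishes, the snippet below builds a small RDF graph with rdflib in which the conflated concerns are kept in separate properties. The EX namespace, property names, and individual terms are invented for illustration; the actual vocabulary defines its own URIs aligned with QUDT and SKOS.

```python
# Illustrative observable-property entry with separate quantity kind, unit,
# substance, medium and procedure slots (all EX names are invented).
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import SKOS, RDF

EX = Namespace("http://example.org/water-quality/")

g = Graph()
prop = EX["nitrate-concentration-filtered-water"]
g.add((prop, RDF.type, EX.ObservableProperty))
g.add((prop, SKOS.prefLabel, Literal("Nitrate concentration in filtered water", lang="en")))
g.add((prop, EX.quantityKind, EX.MassConcentration))   # would generalize qudt QuantityKind
g.add((prop, EX.unit, EX.MilligramPerLitre))           # re-used from QUDT where possible
g.add((prop, EX.substance, EX.Nitrate))                # could link to a ChEBI term
g.add((prop, EX.medium, EX.FilteredWater))
g.add((prop, EX.procedure, EX.IonChromatography))

print(g.serialize(format="turtle"))
```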

Relevance:

10.00%

Publisher:

Abstract:

Although formal methods can dramatically increase the quality of software systems, they have not been widely adopted in the software industry. Many software companies have the perception that formal methods are not cost-effective because they rely on mathematical notations that are difficult for non-experts to assimilate. The Java Modelling Language (JML), presented in Section 3.3, is an academic initiative towards the development of a common formal specification language for Java programs and the implementation of tools to check program correctness. This master thesis shows how JML-based formal methods can be used to formally develop a privacy-sensitive Java application: a smart card application for managing medical appointments, named HealthCard. We follow the software development strategy introduced by João Pestana, presented in Section 3.4. Our work influenced the development of this strategy by providing hands-on insight into the challenges of developing a privacy-sensitive application in Java. Pestana’s strategy is based on a three-step evolution of software specifications, from informal ones, through semiformal ones, to JML formal specifications. We further demonstrate that this strategy can be automated by implementing a tool that generates JML formal specifications from a well-defined subset of informal software specifications. Hence, our work shows that JML-based formal methods techniques are cost-effective and that they can be made popular in the software industry. Although formal methods are not popular in many software development companies, we endeavour to integrate formal methods into general software practices. We hope our work can contribute to a better acceptance of mathematically based formalisms and tools by software engineers. The structure of this document is as follows. In Section 2, we describe the preliminaries of this thesis work: we introduce the application for managing medical appointments that we have implemented and describe the technologies used in its development. This section further illustrates the Java Card Remote Method Invocation communication model used in the medical application for the client and server applications. Section 3 introduces software correctness, including design by contract and the concept of contract in JML. Section 4 presents the design structure of the application. Section 5 shows the implementation of the HealthCard. Section 6 describes how the HealthCard is verified and validated using JML formal methods tools. Section 7 includes some metrics of the HealthCard implementation and specification. Section 8 presents a short example of how the client side of a smart card application can be implemented while respecting formal specifications. Section 9 describes a prototype tool to generate JML formal specifications from informal specifications automatically. Section 10 describes some challenges and main ideas that came up during the development of the HealthCard. The full formal specification and implementation of the HealthCard smart card application presented in this document can be reached at https://sourceforge.net/projects/healthcard/.
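
JML itself annotates Java code and is not reproduced here; as a loose analogy only, the Python snippet below mimics the design-by-contract idea of pre- and postconditions with a decorator. The function and its contract are invented stand-ins for illustration, not part of the HealthCard or of JML.

```python
# Design-by-contract analogy: check a precondition before the call and a
# postcondition on the result, in the spirit of JML's requires/ensures clauses.
def contract(requires=lambda *a, **k: True, ensures=lambda result: True):
    def wrap(fn):
        def checked(*args, **kwargs):
            assert requires(*args, **kwargs), "precondition violated"
            result = fn(*args, **kwargs)
            assert ensures(result), "postcondition violated"
            return result
        return checked
    return wrap

@contract(requires=lambda slots: slots >= 0, ensures=lambda r: r >= 0)
def remaining_appointments(slots: int) -> int:
    """Toy stand-in for a HealthCard-style operation on appointment slots."""
    return slots - 1 if slots > 0 else 0

print(remaining_appointments(3))   # 2
```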

Relevance:

10.00%

Publisher:

Abstract:

In this dissertation, after a brief review of Einstein's General Relativity theory and its application to the Friedmann-Lemaître-Robertson-Walker (FLRW) cosmological models, we present and discuss the alternative theories of gravity dubbed f(R) gravity. These theories come about when one substitutes, in the Einstein-Hilbert action, the Ricci curvature R by some well-behaved nonlinear function f(R). They provide an alternative way to explain the current cosmic acceleration without invoking either a dark energy component or the existence of extra spatial dimensions. In dealing with f(R) gravity, two different variational approaches may be followed, namely the metric and the Palatini formalisms, which lead to very different equations of motion. We briefly describe the metric formalism and then concentrate on the Palatini variational approach to the gravity action. We make a systematic and detailed derivation of the field equations for Palatini f(R) gravity, which generalize Einstein's equations of General Relativity, and also obtain the generalized Friedmann equations, which can be used for cosmological tests. As an example, using recent compilations of type Ia supernovae observations, we show how the f(R) = R − fi/R^n class of gravity theories explains the recently observed acceleration of the universe by placing reasonable constraints on the free parameters fi and n. We also examine the question of whether Palatini f(R) gravity theories permit space-times in which causality, a fundamental issue in any physical theory [22], is violated. As is well known, in General Relativity there are solutions to the field equations that have causal anomalies in the form of closed time-like curves, the renowned Gödel model being the best-known example of such a solution. Here we show that every perfect-fluid Gödel-type solution of Palatini f(R) gravity with density ρ and pressure p satisfying the weak energy condition ρ + p ≥ 0 is necessarily isometric to the Gödel geometry, demonstrating, therefore, that these theories present causal anomalies in the form of closed time-like curves. This result extends a theorem on Gödel-type models to the framework of Palatini f(R) gravity theory. We derive an expression for a critical radius rc (beyond which causality is violated) for an arbitrary Palatini f(R) theory. The expression makes apparent that the violation of causality depends on the form of f(R) and on the matter content components. We concretely examine the Gödel-type perfect-fluid solutions in the f(R) = R − fi/R^n class of Palatini gravity theories, and show that for positive matter density and for fi and n in the range permitted by the observations, these theories do not admit the Gödel geometry as a perfect-fluid solution of their field equations. In this sense, f(R) gravity theory remedies the causal pathology in the form of closed time-like curves which is allowed in General Relativity. We also examine the violation of causality of Gödel type by considering a single scalar field as the matter content. For this source, we show that Palatini f(R) gravity gives rise to a unique Gödel-type solution with no violation of causality. Finally, we show that by combining a perfect fluid plus a scalar field as sources of Gödel-type geometries, we obtain both solutions in the form of closed time-like curves and solutions with no violation of causality.
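
For reference, the Palatini field equations that the dissertation derives take the well-known form below (sketched here from the standard literature on Palatini f(R) gravity, not quoted from the dissertation itself), with the connection Γ varied independently of the metric:

```latex
f'(\mathcal{R})\,\mathcal{R}_{(\mu\nu)}(\Gamma) - \tfrac{1}{2}\, f(\mathcal{R})\, g_{\mu\nu} = \kappa\, T_{\mu\nu},
\qquad
\tilde{\nabla}_{\alpha}\!\left( \sqrt{-g}\; f'(\mathcal{R})\, g^{\mu\nu} \right) = 0 ,
```

where \mathcal{R}_{\mu\nu}(\Gamma) is the Ricci tensor built from the independent connection, \mathcal{R} = g^{\mu\nu}\mathcal{R}_{\mu\nu}, and f' = df/d\mathcal{R}. For f(\mathcal{R}) = \mathcal{R} the second equation forces Γ to be the Levi-Civita connection of g and the first reduces to Einstein's equations.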

Relevance:

10.00%

Publisher:

Abstract:

Individual-based modelling has been increasingly employed to analyse ecological processes and to develop and evaluate theories, as well as for wildlife management and conservation purposes. Individual-based models (IBMs) are highly flexible and allow the detailed use of parameters with greater biological meaning, making them more realistic than classical population models, which are bound within a rigid mathematical formalism. This article presents and discusses seven reasons for adopting IBMs in simulation studies in ecology: (1) the inherent complexity of ecological systems, which is not amenable to formal mathematical analysis; (2) population processes are emergent phenomena, resulting from the interactions among their constituent elements (individuals) and between these and the environment; (3) predictive power; (4) ecology's definitive adoption of an evolutionary view; (5) individuals are discrete entities; (6) interactions are localized in space; and (7) individuals differ from one another.