939 results for abstract data type
Abstract:
The efficiency of the biological pump of carbon to the deep ocean depends largely on the biologically mediated export of carbon from the surface ocean and its remineralization with depth. Global satellite studies have primarily focused on chlorophyll concentration and net primary production (NPP) to understand the role of phytoplankton in these processes. Recent satellite retrievals of phytoplankton composition now allow the size of phytoplankton cells to be considered. Here, we improve understanding of how phytoplankton size structure affects particle export, remineralization, and transfer. Particulate organic carbon (POC) flux observations from sediment traps and ²³⁴Th are compiled across the global ocean. Annual climatologies of NPP, percent microplankton, and POC flux are constructed at four time-series locations and within biogeochemical provinces, and sinking velocities are calculated to align surface variables with POC flux at depth. Parameters that characterize POC flux vs. depth (export flux ratio, labile fraction, remineralization length scale) are then fit to the aligned dataset. Times of the year dominated by different size compositions are identified and fit separately in regions of the ocean where phytoplankton cell size shows enough dynamic range over the annual cycle. Considering all data together, our findings support the paradigm of high export flux but low transfer efficiency in more productive regions, and vice versa for oligotrophic regions. However, when parsing by dominant size class, we find that periods dominated by small cells have both greater export flux and lower transfer efficiency than periods when large cells comprise a greater proportion of the phytoplankton community.
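As a rough illustration of the kind of flux-versus-depth fit described above, the sketch below (Python) fits an exponential-plus-refractory profile governed by the three named parameters to synthetic trap-style data; the functional form, reference depth, and data are assumptions for illustration, not the study's own code or observations.

    # Sketch: fit a POC flux-vs-depth profile of the form
    #   F(z) = F0 * (f_lab * exp(-(z - z0) / L) + (1 - f_lab)),
    # where f_lab is the labile fraction, L the remineralization length scale,
    # and the export flux ratio would be F0 divided by co-located NPP.
    import numpy as np
    from scipy.optimize import curve_fit

    Z0 = 100.0  # assumed reference export depth (m)

    def poc_flux(z, f0, f_lab, L):
        """POC flux at depth z (m) below the export depth."""
        return f0 * (f_lab * np.exp(-(z - Z0) / L) + (1.0 - f_lab))

    # Synthetic sediment-trap / 234Th-style observations (depth m, flux mg C m-2 d-1)
    z_obs = np.array([100, 200, 500, 1000, 2000, 3000])
    f_obs = np.array([120, 65, 32, 22, 18, 16])

    popt, _ = curve_fit(poc_flux, z_obs, f_obs,
                        p0=[100.0, 0.8, 300.0],
                        bounds=([0, 0, 10], [np.inf, 1, 5000]))
    f0, f_lab, L = popt
    print(f"F(z0) = {f0:.1f}, labile fraction = {f_lab:.2f}, length scale = {L:.0f} m")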
Abstract:
The calculus of binary relations was created by De Morgan in 1860 and later developed to a great extent by Peirce and Schröder. Tarski, Givant, Freyd, and Scedrov showed that relation algebras are capable of formalizing first-order logic, higher-order logic, and set theory. Building on the mathematical results of Tarski and Freyd, this thesis develops denotational and operational semantics for constraint logic programming with relation algebra as their foundation. The main idea is the concept of executable semantics: semantics whose defining feature is that execution is possible using the standard reasoning of the semantic universe, in this case equational reasoning. This work shows that distributive relation algebras with a fixed-point operator capture all of the standard theory and metatheory of constraint logic programming, including the trees used in proof search. Most program optimization techniques, partial evaluation, and abstract interpretation can be carried out using the semantics presented here, and the proof of correctness of the implementation turns out to be extremely simple. In the first part of the thesis, a constraint logic program is translated into a set of relational terms. The standard set-theoretic interpretation of these relations coincides with the standard semantics of CLP. Queries against the translated program are carried out by rewriting relations. To conclude the first part, the correctness and operational equivalence of this new semantics are proved, and a unification algorithm based on relation rewriting is defined. The second part of the thesis develops a semantics for constraint logic programming using Freyd's theory of allegories, the categorical version of the algebra of relations. To this end, two new notions are defined, the Regular Lawvere Category and the _-allegory, in which a logic program can be interpreted. The fundamental advantage of the categorical approach is the definition of a categorical machine that improves on the rewriting system presented in the first part. Thanks to the use of tabular relations, the machine models efficient execution without leaving a strictly formal framework. Using diagram rewriting, an algorithm for computing pullbacks in Regular Lawvere Categories is defined. The domains of the tabulations provide information about memory usage and free variables, while shared state is captured by the diagrams. The specification of the machine induces the formal derivation of an efficient instruction set. The categorical framework brings other important advantages, such as the possibility of incorporating algebraic data types, functions, and other extensions to Prolog, while preserving the fully declarative character of our semantics. ABSTRACT The calculus of binary relations was introduced by De Morgan in 1860, to be greatly developed by Peirce and Schröder, as well as many others in the twentieth century. Using different formulations of relational structures, Tarski, Givant, Freyd, and Scedrov have shown how relation algebras can provide a variable-free way of formalizing first-order logic, higher-order logic, and set theory, among other formal systems.
Building on those mathematical results, we develop denotational and operational semantics for Constraint Logic Programming using relation algebra. The idea of executable semantics plays a fundamental role in this work, both as a philosophical and a technical foundation. We call a semantics executable when program execution can be carried out using the regular theory and tools that define the semantic universe. Throughout this work, pure algebraic reasoning is the basis of the denotational and operational results, eliminating all the classical non-equational meta-theory associated with traditional semantics for Logic Programming. All reasoning, including execution, is performed algebraically, to the point that we could say the denotational semantics of a CLP program is directly executable. Techniques like optimization, partial evaluation, and abstract interpretation find a natural place in our algebraic models. Other properties, like correctness of the implementation or program transformation, are easy to check, as they are carried out using instances of the general equational theory. In the first part of the work, we translate Constraint Logic Programs to binary relations in a modified version of the distributive relation algebras used by Tarski. Execution is carried out by a rewriting system. We prove adequacy and operational equivalence of the semantics. In the second part of the work, the relation-algebraic approach is improved by using allegory theory, a categorical version of the algebra of relations developed by Freyd and Scedrov. The use of allegories lifts the semantics to typed relations, which capture the number of logical variables used by a predicate or program state in a declarative way. A logic program is interpreted in a _-allegory, which is in turn generated from a new notion of Regular Lawvere Category. As in the untyped case, program translation coincides with program interpretation. Thus, we develop a categorical machine directly from the semantics. The machine is based on relation composition, with a pullback calculation algorithm at its core. The algorithm is defined with the help of a notion of diagram rewriting. In this operational interpretation, types represent information about memory allocation, and the execution mechanism is more efficient, thanks to the faithful representation of shared state by categorical projections. We finish the work by illustrating how the categorical semantics allows the incorporation into Prolog of constructs typical of Functional Programming, like abstract data types and strict and lazy functions.
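As a loose illustration of the relation-algebraic reading of logic programs that the thesis builds on, the sketch below (Python) encodes finite binary relations as sets of pairs and derives a relation by composition; the encoding and the example clause are illustrative assumptions, not the thesis's construction.

    # Minimal sketch of relation-algebra operations over finite binary relations.
    def compose(r, s):
        """Relational composition r ; s = {(a, c) | (a, b) in r and (b, c) in s}."""
        return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

    def converse(r):
        """Converse of r: {(b, a) | (a, b) in r}."""
        return {(b, a) for (a, b) in r}

    def meet(r, s):
        """Intersection (meet) of two relations."""
        return r & s

    # A small "parent" relation and the derived "grandparent" relation,
    # read as a relational rendering of the clause
    #   grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    parent = {("ann", "bob"), ("bob", "cid"), ("bob", "dee")}
    grandparent = compose(parent, parent)
    print(grandparent)  # {('ann', 'cid'), ('ann', 'dee')}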
Abstract:
Data acquisition systems used in the diagnostics of thermonuclear fusion devices face major challenges posed by long-pulse devices. Even in short-pulse devices, where data are analyzed after the discharge, a large amount of data remains unanalyzed, which means a great deal of knowledge is still to be discovered within the existing databases. In the last decade, the fusion community has made a great effort to improve off-line analysis methods to mitigate this problem, but it has not been fully solved, because some of these methods must run in real time. This paradigm leads to the conclusion that long-pulse devices will have to include data acquisition systems with local processing capabilities, able to execute advanced analysis algorithms. The research carried out in this thesis aims to determine whether it is possible to increase the local real-time processing capacity of such systems by using GPUs. To that end, during the experimentation period, different proposals were evaluated through real use cases developed for some of the most representative fusion devices, such as ITER, JET, and TCV. The conclusions and experience gained in that phase made it possible to propose a model and a development methodology for including this technology in acquisition systems for diagnostics of different kinds. The model defines not only the optimal hardware architecture for this integration, but also the incorporation of this new processing resource into the Supervisory Control and Data Acquisition (SCADA) system used in the fusion community (EPICS), providing a complete solution. The proposal is complemented by the definition of a methodology that addresses the weaknesses identified and lays out a path for integrating the solution into existing hardware and software standards. The final evaluation was carried out by developing a use case representative of diagnostics that require image acquisition and processing, in the context of the international ITER device, and it has been successfully tested at its facilities. The solution proposed in this work has been included by the ITER IO in its catalog of standard solutions for the development of its future diagnostics. In addition, a noteworthy outcome of the research in this thesis is the technology transfer agreement reached with National Instruments, which will allow the data acquisition systems used in fusion devices to be updated. ABSTRACT Data acquisition systems used in the diagnostics of thermonuclear fusion devices face important challenges due to the change in the data acquisition paradigm needed for long pulse operation. Even in short pulse devices, where data is mainly analyzed after the discharge has finished, there is still a large amount of data that has not been analyzed, leaving a great deal of knowledge buried in the databases that hold the vast amount of data generated.
There has been a strong effort in the fusion community over the last decade to improve offline analysis methods to overcome this problem, but it has proved to be insufficient unless some of these mechanisms can be run in real time. In long pulse devices this new paradigm, where data acquisition devices include local processing capabilities able to run advanced data analysis algorithms, will be a must. The research work done in this thesis aims to determine whether it is possible to increase the local capacity for real-time processing of such systems by using GPUs. For that, during the experimentation period, various proposals were evaluated through use cases developed for several of the most representative fusion devices: ITER, JET, and TCV. The conclusions and experience obtained made it possible to propose a model, and a development methodology, for including this technology in acquisition systems for diagnostics of different kinds. The model defines not only the optimal hardware architecture for achieving this integration, but also the incorporation of this new processing resource into one of the Supervisory Control and Data Acquisition (SCADA) systems most relevant in the fusion community at the moment (EPICS), providing a complete solution. The final evaluation was performed through a use case developed for a generic diagnostic requiring image acquisition and processing for the international ITER device, and it has been successfully tested at its premises. The solution proposed in this thesis has been included by the ITER IO in its catalog of standard solutions for the development of its future diagnostics. This has been possible thanks to the technology transfer agreement signed with National Instruments, which has allowed us to modify and update one of their core software products targeted at the acquisition systems used in these devices.
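As a purely illustrative sketch of the local GPU-processing idea (not the thesis's implementation, which targets EPICS and National Instruments hardware), the snippet below offloads a simple per-frame operation to a GPU; the use of CuPy and the background-subtraction step are assumptions.

    # Illustrative only: per-frame image processing offloaded to a GPU.
    import numpy as np
    import cupy as cp

    def process_frame_gpu(frame: np.ndarray, background: np.ndarray) -> float:
        """Subtract a background image on the GPU and return the total intensity."""
        d_frame = cp.asarray(frame, dtype=cp.float32)   # host -> device copy
        d_bg = cp.asarray(background, dtype=cp.float32)
        d_clean = cp.clip(d_frame - d_bg, 0, None)      # computed on the GPU
        return float(d_clean.sum())                     # device -> host scalar

    # Synthetic 1-megapixel frames standing in for camera data.
    rng = np.random.default_rng(0)
    background = rng.normal(100, 5, (1024, 1024))
    frame = background + rng.normal(0, 5, (1024, 1024))
    print(process_frame_gpu(frame, background))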
Abstract:
This study gives an overview of the theoretical foundations, empirical procedures and derived results of the literature identifying determinants of land prices. Special attention is given to the effects of different government support policies on land prices. Since almost all empirical studies on the determination of land prices refer either to the net present value method or the hedonic pricing approach as a theoretical basis, a short review of these models is provided. While the two approaches have different theoretical bases, their empirical implementation converges. Empirical studies use a broad range of variables to explain land values and we systematise those into six categories. In order to investigate the influence of different measures of government support on land prices, a meta-regression analysis is carried out. Our results reveal a significantly higher rate of capitalisation for decoupled direct payments and a significantly lower rate of capitalisation for agri-environmental payments, as compared to the rest of government support. Furthermore, the results show that taking theoretically consistent land rents (returns to land) and including non-agricultural variables like urban pressure in the regression implies lower elasticities of capitalisation. In addition, we find a significant influence of the land type, the data type and estimation techniques on the capitalisation rate.
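For readers unfamiliar with the technique, the sketch below (Python) shows the general shape of such a meta-regression: reported capitalisation rates regressed on study characteristics and weighted by inverse variance. The variable names and data are synthetic assumptions, not the authors' dataset.

    # Sketch of a meta-regression on reported capitalisation rates.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 60
    df = pd.DataFrame({
        "capitalisation_rate": rng.normal(0.3, 0.1, n),  # reported estimates
        "decoupled_payment": rng.integers(0, 2, n),      # support-measure dummies
        "agri_env_payment": rng.integers(0, 2, n),
        "uses_land_rent": rng.integers(0, 2, n),         # returns-to-land dependent variable
        "urban_pressure_ctrl": rng.integers(0, 2, n),    # non-agricultural control included
        "se": rng.uniform(0.02, 0.1, n),                 # reported standard errors
    })

    X = sm.add_constant(df[["decoupled_payment", "agri_env_payment",
                            "uses_land_rent", "urban_pressure_ctrl"]])
    # Weight each estimate by its inverse variance, as is common in meta-regression.
    fit = sm.WLS(df["capitalisation_rate"], X, weights=1.0 / df["se"] ** 2).fit()
    print(fit.summary())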
Abstract:
Senior thesis written for Oceanography 445
Abstract:
A refinement calculus provides a method for transforming specifications to executable code, maintaining the correctness of the code with respect to its specification. In this paper we introduce modules into a logic programming refinement calculus. Modules allow data types to be grouped together with sets of procedures that manipulate the data types. By placing restrictions on the way a program uses a module, we develop a technique for refining the module so that it uses a more efficient representation of the data type.
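The calculus itself is formal and logic-programming based; purely as an informal illustration of the underlying idea, the sketch below (Python) shows a module whose data-type representation is refined to a more efficient one behind an unchanged interface.

    # Informal illustration: two modules with the same interface, where the
    # second refines the representation of the data type for efficiency.
    class BagAsList:
        """Naive module: a multiset represented as a list."""
        def __init__(self):
            self._items = []
        def add(self, x):
            self._items.append(x)
        def count(self, x):
            return self._items.count(x)        # O(n) per query

    class BagAsDict:
        """Refined module: same interface, more efficient representation."""
        def __init__(self):
            self._counts = {}
        def add(self, x):
            self._counts[x] = self._counts.get(x, 0) + 1
        def count(self, x):
            return self._counts.get(x, 0)      # O(1) per query

    # A client restricted to the module interface observes the same behaviour.
    for Bag in (BagAsList, BagAsDict):
        b = Bag()
        for x in "abracadabra":
            b.add(x)
        assert b.count("a") == 5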
Abstract:
This paper describes a formal component language, used to support automated component-based program development. The components, referred to as templates, are machine processable, meaning that appropriate tool support, such as retrieval support, can be developed. The templates are highly adaptable, meaning that they can be applied to a wide range of problems. Some of the main features of the language are described, including: higher-order parameters; state variable declarations; specification statements and conditionals; applicability conditions and theories; meta-level place holders; and abstract data structures.
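As a loose analogy only (not the paper's template language), the sketch below (Python) shows a reusable template with a higher-order parameter and an applicability condition, instantiated to two concrete program fragments.

    # Loose analogy: a "template" parameterised by an operation and guarded
    # by an applicability condition, instantiated to concrete procedures.
    from typing import Callable, Iterable, TypeVar

    T = TypeVar("T")

    def accumulate_template(op: Callable[[T, T], T], unit: T,
                            applicable: Callable[[list], bool]):
        """Fold a collection with `op`, subject to an applicability condition."""
        def instance(xs: Iterable[T]) -> T:
            xs = list(xs)
            assert applicable(xs), "applicability condition violated"
            acc = unit
            for x in xs:
                acc = op(acc, x)
            return acc
        return instance

    # Two instantiations of the same template.
    sum_ints = accumulate_template(lambda a, b: a + b, 0, lambda xs: True)
    max_nonneg = accumulate_template(max, 0, lambda xs: all(x >= 0 for x in xs))
    print(sum_ints([1, 2, 3]), max_nonneg([4, 7, 2]))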
Abstract:
Cognitive systems research involves the synthesis of ideas from natural and artificial systems in the analysis, understanding, and design of all intelligent systems. This chapter discusses the cognitive systems associated with the hippocampus (HC) of the human brain and their possible role in behaviour and neurodegenerative disease. The hippocampus (HC) is concerned with the analysis of highly abstract data derived from all sensory systems but its specific role remains controversial. Hence, there have been three major theories concerning its function, viz., the memory theory, the spatial theory, and the behavioral inhibition theory. The memory theory has its origin in the surgical destruction of the HC, which results in severe anterograde and partial retrograde amnesia. The spatial theory has its origin in the observation that neurons in the HC of animals show activity related to their location within the environment. By contrast, the behavioral inhibition theory suggests that the HC acts as a ‘comparator’, i.e., it compares current sensory events with expected or predicted events. If a set of expectations continues to be verified then no alteration of behavior occurs. If, however, a ‘mismatch’ is detected then the HC intervenes by initiating appropriate action by active inhibition of current motor programs and initiation of new data gathering. Understanding the cognitive systems of the hippocampus in humans may aid in the design of intelligent systems involved in spatial mapping, memory, and decision making. In addition, this information may lead to a greater understanding of the course of clinical dementia in the various neurodegenerative diseases in which there is significant damage to the HC.
Abstract:
This article discusses the structure, anatomical connections, and functions of the hippocampus (HC) of the human brain and its significance in neuropsychology and disease. The HC is concerned with the analysis of highly abstract data derived from all sensory systems but its specific role remains controversial. Hence, there have been three major theories concerning its function, viz., the memory theory, the spatial theory, and the behavioral inhibition system (BIS) theory. The memory theory has its origin in the surgical destruction of the HC, which results in severe anterograde and partial retrograde amnesia. The spatial theory has its origin in the observation that neurons in the HC of animals show activity related to their location within the environment. By contrast, the behavioral inhibition theory suggests that the HC acts as a ‘comparator’, i.e., it compares current sensory events with expected or predicted events. If a set of expectations continues to be verified then no alteration of behavior occurs. If, however, a ‘mismatch’ is detected then the HC intervenes by initiating appropriate action by active inhibition of current motor programs and initiation of new data gathering. Understanding the anatomical connections of the hippocampus may lead to a greater understanding of memory, spatial orientation, and states of anxiety in humans. In addition, HC damage is a feature of neurodegenerative diseases such as Alzheimer’s disease (AD), dementia with Lewy bodies (DLB), Pick’s disease (PiD), and Creutzfeldt-Jakob disease (CJD) and understanding HC function may help to explain the development of clinical dementia in these disorders.
Abstract:
In this article, we review the means for visualizing the syntax, semantics, and source code of programming languages that support the procedural and/or object-oriented paradigms. We examine how the structure of source code in the structured and object-oriented programming styles has influenced different approaches to teaching them. We maintain a thesis, valid for the object-oriented programming paradigm, that the design and the programming of classes are done by the same specialist, and that the training of this specialist should therefore include design as well as programming skills and knowledge of modeling abstract data structures. We pose the question of how the high level of abstraction in the object-oriented paradigm should be presented as a simple model at the design stage, so that the complexity at the programming stage stays low and is easy to learn. We answer this question by building models in UML notation, taking a concrete example from teaching practice that includes programming techniques for inheritance and polymorphism.
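Purely as an illustration of the kind of teaching example mentioned above (the article's own example is not reproduced here), the sketch below pairs a small UML-style abstraction with Python code exercising inheritance and polymorphism.

    # Illustrative class hierarchy of the kind a UML class diagram would model.
    from abc import ABC, abstractmethod
    import math

    class Shape(ABC):
        """Abstract base class: the design-stage (UML-level) abstraction."""
        @abstractmethod
        def area(self) -> float: ...

    class Rectangle(Shape):                    # inheritance
        def __init__(self, w: float, h: float):
            self.w, self.h = w, h
        def area(self) -> float:
            return self.w * self.h

    class Circle(Shape):                       # inheritance
        def __init__(self, r: float):
            self.r = r
        def area(self) -> float:
            return math.pi * self.r ** 2

    # Polymorphism: client code depends only on the abstract interface.
    shapes: list[Shape] = [Rectangle(2, 3), Circle(1)]
    print(sum(s.area() for s in shapes))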
Abstract:
OLIVEIRA, Jonas Sâmi Albuquerque de; ENDERS, Bertha Cruz; MENEZES, Rejane Maria Paiva de; MEDEIROS, Soraya Maria de. O estágio extracurricular remunerado no cuidar da enfermagem nos hospitais de ensino. Revista Gaúcha de Enfermagem, Porto Alegre (RS), v. 30, n. 2, p. 311-8, jun. 2009.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08