990 results for Functional programming languages
Abstract:
In previous work we showed how to verify a compiler for a small language with exceptions. In this article we show how to calculate, as opposed to verify, an abstract machine for this language. The key step is the use of Reynolds's defunctionalization, an old program transformation technique that has recently been rejuvenated by the work of Danvy et al.
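For readers unfamiliar with the technique, the following is a minimal Haskell sketch of defunctionalization (a generic illustration, not the paper's construction): each function that can flow into a higher-order argument is replaced by a constructor of a first-order data type, which is interpreted by an apply function.

```haskell
-- Higher-order original
applyTwice :: (Int -> Int) -> Int -> Int
applyTwice f x = f (f x)

exampleHO :: Int
exampleHO = applyTwice (+ 1) 0 + applyTwice (* 2) 3

-- Defunctionalized version: one constructor per lambda that is passed in
data Fun = Add1 | Mul2          -- first-order representations of (+1), (*2)

apply :: Fun -> Int -> Int      -- the dispatcher introduced by the transform
apply Add1 n = n + 1
apply Mul2 n = n * 2

applyTwiceD :: Fun -> Int -> Int
applyTwiceD f x = apply f (apply f x)

exampleFO :: Int
exampleFO = applyTwiceD Add1 0 + applyTwiceD Mul2 3

main :: IO ()
main = print (exampleHO, exampleFO)   -- (14,14): both versions agree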
Abstract:
Corecursive programs produce values of greatest fixpoint types, in contrast to recursive programs, which consume values of least fixpoint types. There are a number of widely used methods for proving properties of corecursive programs, including fixpoint induction, the take lemma, and coinduction. However, these methods are all rather low level, in that they do not exploit the common structure that is often present in corecursive definitions. We argue for a more structured approach to proving properties of corecursive programs. In particular, we show that by writing corecursive programs using a simple operator that encapsulates a common pattern of corecursive definition, we can then use high-level algebraic properties of this operator to conduct proofs in a purely calculational style that avoids the use of inductive or coinductive methods.
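As an illustration, assuming the operator in question behaves like the standard unfold for streams, a corecursive definition written with that one operator might look as follows in Haskell; the operator's algebraic laws are then what enable the purely calculational proofs the abstract describes.

```haskell
data Stream a = Cons a (Stream a)

hd :: Stream a -> a
hd (Cons x _) = x

tl :: Stream a -> Stream a
tl (Cons _ xs) = xs

-- unfold encapsulates a common pattern of corecursive definition:
-- the head and the next seed are both computed from the current seed
unfold :: (b -> a) -> (b -> b) -> b -> Stream a
unfold h t b = Cons (h b) (unfold h t (t b))

nats :: Stream Integer
nats = unfold id (+ 1) 0                  -- 0, 1, 2, ...

-- map over streams, itself written with unfold rather than raw corecursion
mapS :: (a -> b) -> Stream a -> Stream b
mapS f = unfold (f . hd) tl

-- finite observation, only for running the example
takeS :: Int -> Stream a -> [a]
takeS n (Cons x xs)
  | n <= 0    = []
  | otherwise = x : takeS (n - 1) xs

main :: IO ()
main = print (takeS 5 (mapS (* 2) nats))  -- [0,2,4,6,8]
```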
Abstract:
The definition of the notion of a "function" is not cast in stone, but depends upon what we adopt as types in our language. With partial equivalence relations (pers) as types in a relational language, we show that the functional relations are precisely those satisfying the simple equation f = f ∘ f∪ ∘ f, where "∘" and "∪" are respectively the composition and converse operators for relations. This article forms part of "A calculational theory of pers as types".
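The equation can be checked executably on finite relations. The following Haskell sketch (an illustration, not the paper's per-based development) tests f = f ∘ f∪ ∘ f with relations represented as pair lists; note that on raw sets this equation characterises the difunctional relations, and the paper's point is that with pers as types these are exactly the functional ones.

```haskell
import Data.List (nub, sort)

type Rel a b = [(a, b)]

-- diagrammatic composition: a related to c iff a r b and b s c for some b
comp :: Eq b => Rel a b -> Rel b c -> Rel a c
comp r s = [ (a, c) | (a, b) <- r, (b', c) <- s, b == b' ]

conv :: Rel a b -> Rel b a   -- converse: swap each pair
conv = map (\(a, b) -> (b, a))

setEq :: (Ord a, Ord b) => Rel a b -> Rel a b -> Bool
setEq xs ys = sort (nub xs) == sort (nub ys)

satisfies :: (Ord a, Ord b) => Rel a b -> Bool
satisfies f = setEq f (f `comp` conv f `comp` f)

main :: IO ()
main = do
  print (satisfies [(1, 'a'), (2, 'b')])                      -- True: a function
  print (satisfies [(1, 'a'), (1, 'b'), (2, 'a'), (2, 'b')])  -- True: a difunctional "block"
  print (satisfies [(1, 'a'), (1, 'b'), (2, 'b'), (2, 'c')])  -- False: not difunctional
```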
Abstract:
MAIDL, André Murbach; CARVILHE, Claudio; MUSICANTE, Martin A. Maude Object-Oriented Action Tool. Electronic Notes in Theoretical Computer Science. [S.l.: s.n.], 2008.
Abstract:
The PROSPER (Proof and Specification Assisted Design Environments) project advocates the use of toolkits which allow existing verification tools to be adapted to a more flexible format so that they may be treated as components. A system incorporating such tools becomes another component that can be embedded in an application. This paper describes the PROSPER Toolkit, which makes this possible. The nature of communication between components is specified in a language-independent way, and is implemented in several common programming languages so that a wide variety of tools can access the toolkit.
Abstract:
One challenge in data assimilation (DA) methods is how to compute the error covariance of the model state. Ensemble methods have been proposed for producing error covariance estimates, as the error is propagated in time using the non-linear model. Variational methods, on the other hand, use concepts from control theory, whereby the state estimate is optimized from both the background and the measurements. Numerical optimization schemes are applied which avoid the memory storage and huge matrix inversions required by classical Kalman filter methods. The Variational Ensemble Kalman Filter (VEnKF), a method inspired by the Variational Kalman Filter (VKF), enjoys the benefits of both ensemble methods and variational methods. It avoids the filter inbreeding problems which emerge when the ensemble spread underestimates the true error covariance; in VEnKF this is tackled by resampling the ensemble every time measurements are available. One advantage of VEnKF over VKF is that it needs neither tangent linear code nor adjoint code. In this thesis, VEnKF has been applied to a two-dimensional shallow water model simulating a dam-break experiment. The model is a public code, with water height measurements recorded at seven stations along the mid-line of the 21.2 m long, 1.4 m wide flume. Because the data were too sparse to assimilate against the 30 171-element model state vector, we chose to interpolate the data both in time and in space. The results of the assimilation were compared with those of a pure simulation. We found that the results produced by VEnKF were more realistic, without the numerical artifacts present in the pure simulation. Creating wrapper code for a model and a DA scheme can be challenging, especially when the two were designed independently or are poorly documented. In this thesis we present a non-intrusive approach to coupling the model and a DA scheme: an external program sends and receives information between the model and the DA procedure using files. The advantage of this method is that the changes needed in the model code are minimal, only a few lines which handle input and output. Apart from being simple to couple, the approach can be employed even if the two are written in different programming languages, because the communication is not through code. The non-intrusive approach accommodates parallel computing by simply telling the control program to wait until all processes have ended before the DA procedure is invoked. It is worth mentioning the overhead introduced by the approach, as at every assimilation cycle both the model and the DA procedure have to be initialized. Nonetheless, the method can be an ideal approach for a benchmark platform for testing DA methods. The non-intrusive VEnKF has been applied to the multi-purpose hydrodynamic model COHERENS to assimilate Total Suspended Matter (TSM) in lake Säkylän Pyhäjärvi. The lake has an area of 154 km² and an average depth of 5.4 m. Turbidity and chlorophyll-a concentrations from MERIS satellite images for 7 days between May 16 and July 6, 2009 were available, and the effect of organic matter was computationally eliminated to obtain TSM data. Because of the computational demands of both COHERENS and VEnKF, we chose a 1 km grid resolution. The VEnKF results were compared with measurements recorded at an automatic station located in the north-western part of the lake; however, because the TSM data are sparse in both time and space, the results could not be matched well. The use of multiple automatic stations with real-time data is important to alleviate the time-sparsity problem; with DA, this will help in better understanding environmental hazard variables, for instance. We found that using a very large ensemble does not necessarily improve the results, because there is a limit beyond which additional ensemble members add very little to the performance. The successful implementation of the non-intrusive VEnKF and the ensemble size limit on performance lead to the emerging area of Reduced Order Modelling (ROM). To save computational resources, running the full-blown model is avoided in ROM. Applying ROM with the non-intrusive DA approach may yield a cheaper algorithm that relaxes the computational challenges existing in the field of modelling and DA.
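A minimal sketch of the file-based, non-intrusive coupling loop described above is given below in Haskell. The executable names ("./model", "./da_step") and file names are hypothetical stand-ins; the thesis's actual wrapper code is not reproduced here.

```haskell
import Control.Monad (forM_)
import System.Process (callProcess)

assimilationCycles :: Int
assimilationCycles = 7   -- e.g. one cycle per available measurement time

main :: IO ()
main =
  forM_ [1 .. assimilationCycles] $ \cycleNo -> do
    -- run the forward model; assume it reads state.in and writes state.out
    callProcess "./model" ["state.in", "state.out"]
    -- run the DA step; assume it reads the forecast and this cycle's
    -- observations and writes the analysis, which becomes the next initial state
    callProcess "./da_step"
      ["state.out", "obs_" ++ show cycleNo ++ ".dat", "state.in"]
    -- note: both programs are re-initialised every cycle, which is the
    -- overhead the abstract mentions; communication is only through files
```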
Abstract:
Humans have a high ability to extract information from visual data acquired by sight. Through a learning process, which starts at birth and continues throughout life, image interpretation becomes almost instinctive. At a glance, one can easily describe a scene with reasonable precision, naming its main components. Usually, this is done by extracting low-level features such as edges, shapes and textures, and associating them with high-level meanings; in this way, a semantic description of the scene is produced. An example of this is the human capacity to recognize and describe the physical and behavioural characteristics of other people, or biometrics. Soft biometrics also represent inherent characteristics of the human body and behaviour, but do not allow unique identification of a person. The computer vision field aims to develop methods capable of performing visual interpretation with performance similar to humans. This thesis proposes computer vision methods which allow high-level information extraction from images in the form of soft biometrics. The problem is approached with both unsupervised and supervised learning methods. The first seeks to group images via automatic feature extraction learning, using convolution techniques, evolutionary computing and clustering; the images employed in this approach contain faces and people. The second approach employs convolutional neural networks, which can operate on raw images, learning both the feature extraction and classification processes. Here, images are classified according to gender and clothing, the latter divided into the upper and lower parts of the human body. The first approach, when tested with different image datasets, obtained an accuracy of approximately 80% for faces versus non-faces and 70% for people versus non-people. The second, tested using images and videos, obtained an accuracy of about 70% for gender, 80% for upper-body clothing and 90% for lower-body clothing. The results of these case studies show that the proposed methods are promising, allowing automatic high-level annotation of images. This opens possibilities for the development of applications in diverse areas, such as content-based image and video retrieval and automatic video surveillance, reducing human effort in the tasks of manual annotation and monitoring.
Abstract:
Dynamically typed programming languages such as JavaScript and Python defer type checking to run time. In order to optimize performance, virtual machine implementations for dynamic languages must try to eliminate redundant dynamic type tests. This is usually done using a type inference analysis. However, such analyses are often costly and involve trade-offs between compilation time and the precision of the results obtained. This has led to the design of increasingly complex VM architectures. We propose lazy basic block versioning, a simple just-in-time compilation technique that effectively eliminates redundant dynamic type tests on critical execution paths. This new approach lazily generates specialized versions of basic blocks while propagating contextualized type information. Our technique requires no costly program analyses, is not constrained by the precision limitations of traditional type inference analyses, and avoids the complexity of speculative optimization techniques. Three extensions give basic block versioning interprocedural optimization capabilities. The first allows it to attach type information to object properties and global variables. Entry point specialization then lets it pass type information from calling functions to called functions. Finally, call continuation specialization transmits the types of return values from callees back to callers at no dynamic cost. We demonstrate empirically that these extensions allow basic block versioning to eliminate more dynamic type tests than any static type inference analysis.
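The following Haskell sketch illustrates the core idea of lazy basic block versioning on a toy intermediate representation (the names and structure are illustrative, not the thesis's implementation): block versions are generated on demand, cached per entry type context, and a known type lets the generated code drop the dynamic type test.

```haskell
import qualified Data.Map.Strict as Map
import Data.IORef

data Ty = TInt | TUnknown deriving (Eq, Ord, Show)
type Ctx = Map.Map String Ty          -- typing context at block entry
type Block = Ctx -> [String]          -- "compiler": emits pseudo-instructions

-- Generic block: emits a type test only when x's type is unknown
addBlock :: Block
addBlock ctx =
  case Map.findWithDefault TUnknown "x" ctx of
    TInt -> ["add_int x, 1"]                          -- test eliminated
    _    -> ["test_int x", "branch_int x", "add_int x, 1"]

-- Version cache: specialize a block at most once per (label, context)
specialize :: IORef (Map.Map (String, Ctx) [String])
           -> String -> Block -> Ctx -> IO [String]
specialize cacheRef label block ctx = do
  cache <- readIORef cacheRef
  case Map.lookup (label, ctx) cache of
    Just code -> return code                          -- reuse existing version
    Nothing   -> do
      let code = block ctx                            -- generate lazily
      writeIORef cacheRef (Map.insert (label, ctx) code cache)
      return code

main :: IO ()
main = do
  cacheRef <- newIORef Map.empty
  v1 <- specialize cacheRef "add" addBlock (Map.fromList [("x", TUnknown)])
  v2 <- specialize cacheRef "add" addBlock (Map.fromList [("x", TInt)])
  mapM_ putStrLn (v1 ++ ["--"] ++ v2)
```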
Abstract:
This work proposes to adjust the Notification Oriented Paradigm (NOP) so that it supports fuzzy concepts. NOP is inspired by elements of the imperative and declarative paradigms, seeking to solve some of the drawbacks of both. By decomposing an application into a network of smaller computational entities that are executed only when necessary, NOP eliminates the need to perform unnecessary computations and helps to achieve better logical-causal uncoupling, facilitating code reuse and application distribution over multiple processors or machines. In addition, NOP allows the logical-causal knowledge to be expressed at a high level of abstraction, through rules in IF-THEN format. Fuzzy systems, in turn, perform logical inferences on causal knowledge bases (IF-THEN rules) that can deal with problems involving uncertainty. Since NOP uses IF-THEN rules in an alternative way, reducing redundant evaluations and providing better decoupling, this research was carried out to identify, propose and evaluate the changes that must be made to NOP to allow it to be used in the development of fuzzy systems. After that, two fully usable materializations were created: a C++ framework and a complete programming language (LingPONFuzzy), both providing support for fuzzy inference systems. From there, case studies were created and several test cases were run in order to validate the proposed solution. The test results show a significant reduction in the number of rules evaluated in comparison to a fuzzy system developed using conventional tools (frameworks), which can represent an improvement in application performance.
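To make the notification idea concrete, here is a minimal Haskell sketch (crisp rather than fuzzy for brevity, and illustrative rather than the LingPONFuzzy design): each fact is indexed to the IF-THEN rules that reference it, so an assignment re-evaluates only the notified rules instead of the whole rule base.

```haskell
import qualified Data.Map.Strict as Map

type Fact = String
type Facts = Map.Map Fact Double

data Rule = Rule
  { ruleName :: String
  , premises :: [Fact]               -- facts the rule is subscribed to
  , fires    :: Facts -> Bool        -- IF-part over current fact values
  }

-- dependency index: fact -> rules to notify when it changes
index :: [Rule] -> Map.Map Fact [Rule]
index rules =
  Map.fromListWith (++) [ (f, [r]) | r <- rules, f <- premises r ]

-- assign a fact and re-evaluate only the notified rules
assign :: Map.Map Fact [Rule] -> Facts -> Fact -> Double -> (Facts, [String])
assign idx facts f v =
  let facts'   = Map.insert f v facts
      notified = Map.findWithDefault [] f idx
  in (facts', [ ruleName r | r <- notified, fires r facts' ])

main :: IO ()
main = do
  let r1 = Rule "highTemp"    ["temp"]     (\fs -> Map.findWithDefault 0 "temp" fs > 30)
      r2 = Rule "lowPressure" ["pressure"] (\fs -> Map.findWithDefault 0 "pressure" fs < 1)
      idx = index [r1, r2]
      (_, fired) = assign idx Map.empty "temp" 35
  print fired                 -- ["highTemp"]: only temp-dependent rules ran
```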
Abstract:
In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages such as symbolic and intuitive programming, program portability and geometric coverage are especially important. They save time, help avoid errors during part programming, and permit code re-use. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometric functions and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine and portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit re-use of programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
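As an illustration of the kind of translation such a compiler performs (EGCL's concrete syntax is not reproduced here; the names and parameters below are hypothetical), the following Haskell sketch expands a parametric drilling loop with descriptive variable names into elementary ISO G-code blocks with fixed coordinates.

```haskell
import Text.Printf (printf)

-- "source-level" parameters with descriptive names
holeSpacing, safeZ, drillZ :: Double
holeSpacing = 12.5
safeZ  = 5.0
drillZ = -3.0

holeCount :: Int
holeCount = 4

-- compile one loop iteration down to elementary G-code (G00 rapid, G01 feed)
emitHole :: Double -> [String]
emitHole x =
  [ printf "G00 X%.3f Y0.000 Z%.3f" x safeZ   -- rapid to hole position
  , printf "G01 Z%.3f F100" drillZ            -- feed down (drill)
  , printf "G00 Z%.3f" safeZ                  -- retract
  ]

main :: IO ()
main = mapM_ putStrLn $
  concat [ emitHole (fromIntegral i * holeSpacing) | i <- [0 .. holeCount - 1] ]
```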
Abstract:
Sustainability in buildings, besides reducing environmental impact, contributes to the promotion of social welfare and to increasing the health and productivity of occupants. The search for a way of building that meets the aspirations and development of humanity without degrading the environment has become the great challenge of contemporary architecture. It is considered that incorporating principles that make a building sustainable, with careful choices of design solutions, contributes to better economic and thermal performance of the building, as well as functional and psychological comfort for its users. Based on this general understanding, this work presents an architectural project aimed at health care whose adopted solutions carefully follow the relevant legislation and focus on the theme of sustainability. The methodology began with studies on the themes of death verification services, sustainability and their application in construction, developed through research in academic studies and analysis of architectural projects, using them as references for the solutions adopted. As part of the project analysis, a visit was made to the death verification service in the city of Palmas, Tocantins, providing information which, together with the relevant legislation, led to the functional programming and preliminary dimensioning of the building to be designed. The result of this programming was a set of individual records for each environment, with information on environmental restrictions, the space required for the activities, desirable flows and sustainability strategies, which can be considered the first relevant product of the professional master's degree. Finally, we outlined the basic architectural design of a Death Verification Service (SVO/RN, in Portuguese), whose design process was guided by four points: the use of bioclimatic architecture as the main design feature, the use of resources that cause minimal harm to the environment, the use of modulation and structure in the building as a form of rationalization, and the search for solutions that ensure environmental and psychological comfort for users. It is important to highlight that, besides addressing a theme that is rare in the literature on architectural projects, the whole project was drawn up on the basis of design criteria that contribute to environmental sustainability, with emphasis on thermal performance, energy efficiency and the reuse of rainwater.
Abstract:
This work proposes an environment for programming the programmable logic controllers (PLCs) applied to oil wells that use the BCP artificial lift method. The environment provides an editor based on Sequential Function Charts (SFC) for PLC programming. This language was chosen because it is high-level and accepted by the international standard IEC 61131-3. Running these control programs on a real PLC is made possible by an intermediate language based on the PLCopen TC6 XML specification. For testing and validation of the control programs, an area is available for viewing variables obtained through communication with a real PLC. Thus, the main contribution of this work is a computational environment that allows modelling, testing and validating controls represented in SFC and applied to oil wells with the BCP artificial lift method.
Abstract:
In this document we explain the concept of Visual Programming, study some examples of Visual Programming Languages, and analyse arguments for and against their use in comparison to traditional (textual) programming; additionally, we study three games for learning to program with strong visual features. Based on this research, a new Visual Programming Language (VPL) was created as the basis of a prototype of a game for learning to program. The game is represented by the movement and execution of tasks carried out by a robot inside a house; the programmer's goal is to place objects in the house that manipulate the robot's movement so as to solve a problem.
Abstract:
This dissertation explores the link between hate crimes that occurred in the United Kingdom in June 2017, June 2018 and June 2019 and the Twitter posts of a robust sample of Conservative and radical-right users. In order to avoid the traditional challenges of this kind of research, I adopted a four-staged research protocol that enabled me to merge content produced by a group of randomly selected users and observe the phenomenon from different angles. I collected tweets from thirty Conservative/right-wing accounts for each June over the three years with the help of programming languages such as Python and of CygWin tools. I then examined the language of my data, focusing on humorous content, in order to reveal whether, and if so how, radical users online use humour as a tool to spread their views in conditions of heightened disgust and widespread political instability. A reflection on humour as a moral occurrence, expanding on the works of Christie Davies, together with the application of recent findings on the behavioural immune system to online data, offers new insights into the overlooked humorous nature of radical political discourse. An unorthodox take on the moral foundations pioneered by Jonathan Haidt enriched my understanding of the analysed material by adding a moral-based layer of enquiry to my more traditional content-based one. This convergence of theory, data and real-life events constitutes a viable "collection of strategies" for academia, data scientists, NGOs fighting hate crimes and the wider public alike. Bringing the ideas of Davies, Haidt and others to my data helps us to perceive humorous online content in terms of complex radical narratives that are all too often compressed into a single tweet.