888 results for Software testing. Problem-oriented programming. Teaching methodology
Abstract:
Scientific dissertation submitted for the award of the degree of Master in Civil Engineering
Abstract:
This thesis surveys the advantages and disadvantages of using the dynamic functional programming language Scheme for video game development. The method is first based on a more theoretical approach: a study of the programming needs expressed by this kind of development, together with a detailed description of the Scheme features relevant to video game development, is given to put the subject in context. A practical approach is then taken by developing two video games of increasing complexity: Space Invaders and Lode Runner. Developing these games led to extending Scheme with several domain-specific languages and libraries, notably an object-oriented programming system and a coroutine system. The experience gained from developing these games is finally compared with that of other video game developers in industry who have used Scheme to create commercial titles. In summary, using this language made it possible to reach a high level of abstraction that favours the modularity of the games developed without affecting their performance.
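The coroutine system mentioned in this abstract was built in Scheme and is not reproduced here. As a rough illustration only, the following Python sketch (with hypothetical entity and scheduler names) shows the general idea of coroutine-driven game entities that write their behaviour as straight-line code and suspend once per frame under a cooperative scheduler.

```python
# Illustrative analogue only, not the thesis' Scheme coroutine system.
# Each entity is a generator; 'yield' suspends it until the next frame.

def invader(x, y, step=1):
    """Entity logic as straight-line code; yields its position every frame."""
    while True:
        for _ in range(10):          # march right for 10 frames
            x += step
            yield (x, y)
        y += 1                       # then drop one row
        for _ in range(10):          # march left for 10 frames
            x -= step
            yield (x, y)
        y += 1

def run(entities, frames):
    """Minimal cooperative scheduler: advance every entity once per frame."""
    for frame in range(frames):
        positions = [next(e) for e in entities]
        print(f"frame {frame}: {positions}")

run([invader(0, 0), invader(5, 0)], frames=5)
```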
Abstract:
The quantitative analysis was carried out jointly with Rémi Boivin and Pierre Tremblay and published in the journal Criminologie: Boivin, R., Lamige, C., & Tremblay, P. (2009). La police devrait-elle cibler les taudis malfamés? Criminologie, 42(1), 225-266.
Abstract:
"compositions" is a new R package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry). It provides means for computation, plotting and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and the object-oriented programming paradigm of R. In this way, called functions automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedrons, various compositional plots (boxplots, barplots, pie charts) and extensive graphical tools for principal components. Additionally, portion and proportion lines, straight lines and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all necessary tools to add their own analysis routines. A complete example is included in the appendix.
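The package itself is an R library; as a rough, hypothetical Python analogue (not its API), the following sketch shows the two basic Aitchison-geometry operations that "working in coordinates" builds on: closure to the unit simplex and perturbation of compositions.

```python
# Illustrative analogue only; the "compositions" package is written in R.
import numpy as np

def closure(x):
    """Rescale positive parts so they sum to 1, i.e. form a composition."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, y):
    """Aitchison perturbation: component-wise product, re-closed."""
    return closure(closure(x) * closure(y))

print(perturb([1, 2, 7], [2, 2, 1]))   # hypothetical 3-part data
```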
Abstract:
In an earlier investigation (Burger et al., 2000), five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied by applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and the evaluation of the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcanogenic, a hydrothermal and an ultrabasic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultrabasic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used together with the cosine-theta coefficient as similarity measure. During the last decade considerable progress has been made in compositional data analysis, and many case studies have been published using new tools for the exploratory analysis of such data. It therefore makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors, and visual interpretation of the factor scores would lead to a revision of the earlier results and to answers to the open questions. In this paper we follow the lines of a paper by R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transforming the components and visualizing the factor scores in a spatial context: the compositional factors are plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process. Key words: compositional data analysis, biplot, deep sea sediments
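As an illustration of the ilr-transformation step mentioned in the abstract, the following minimal numpy sketch transforms hypothetical 4-part compositions into (D-1)-dimensional coordinates; it is not the paper's data or workflow.

```python
# Minimal ilr-transform sketch with a Helmert-type orthonormal basis.
import numpy as np

def ilr(X):
    """Isometric log-ratio transform of compositions (rows of X, D parts)."""
    X = np.asarray(X, dtype=float)
    D = X.shape[1]
    # clr: log of parts centred by the log of their geometric mean
    clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)
    # Orthonormal contrast matrix V (D x (D-1)), columns orthogonal to 1
    V = np.zeros((D, D - 1))
    for i in range(1, D):
        V[:i, i - 1] = 1.0 / i
        V[i, i - 1] = -1.0
        V[:, i - 1] *= np.sqrt(i / (i + 1.0))
    return clr @ V

# Hypothetical geochemical compositions from two core depths
comps = [[0.60, 0.25, 0.10, 0.05],
         [0.55, 0.20, 0.15, 0.10]]
print(ilr(comps))   # one row of ilr coordinates per sample
```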
Abstract:
This paper introduces how artificial intelligence technologies can be integrated into a well-known computer aided control system design (CACSD) framework, Matlab/Simulink, using an object-oriented approach. The aim is to build a framework to aid supervisory system analysis, design and implementation. The idea is to take advantage of an existing CACSD framework, Matlab/Simulink, so that engineers can first design a control system and then design a straightforward supervisory system for that control system within the same framework. Thus, expert systems and qualitative reasoning tools are incorporated into this popular CACSD framework to develop a computer aided supervisory system design (CASSD) framework. Object-variables are introduced into Matlab/Simulink for sharing information between tools.
Abstract:
Need to edit an image but don't have any software? No problem - you can do it all online for free at this website - and no annoying adverts either.
Abstract:
The Centre d'Investigació en Robòtica Submarina (CIRS) at the Universitat de Girona has several underwater robots that use a software architecture called Component Oriented Layered-based Architecture for Autonomy (COLA2), which has been developed by students and professors of the centre. To make this architecture more accessible to professors and students at other centres, COLA2 is being adapted to the Robot Operating System (ROS), a generic framework for developing robot applications. This project aims to design a behaviour for the Girona500 robot within the ROS version of the COLA2 architecture. The behaviour must keep the robot at a given position using visual information from the robot's camera and navigation data. The station-keeping task is of vital importance for underwater interventions that require precision, and the environment in which the robot operates does not make this any easier.
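As a rough illustration only (not the COLA2 code), the following minimal ROS node sketches a station-keeping behaviour driven by navigation feedback alone; the topic names, message types and proportional gain are assumptions, and the real behaviour described above also uses the camera.

```python
#!/usr/bin/env python
# Illustrative sketch: hold the pose captured at start-up with a simple
# proportional controller on navigation feedback. Topics and gain are
# hypothetical, not those of the COLA2 architecture.
import rospy
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Twist

class StationKeeping(object):
    def __init__(self):
        self.goal = None                       # pose captured on first message
        self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/navigation/odometry', Odometry, self.on_odom)

    def on_odom(self, msg):
        p = msg.pose.pose.position
        if self.goal is None:
            self.goal = (p.x, p.y, p.z)        # remember where to stay
            return
        cmd = Twist()                          # proportional correction
        cmd.linear.x = 0.5 * (self.goal[0] - p.x)
        cmd.linear.y = 0.5 * (self.goal[1] - p.y)
        cmd.linear.z = 0.5 * (self.goal[2] - p.z)
        self.pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('station_keeping_sketch')
    StationKeeping()
    rospy.spin()
```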
Abstract:
The SystemVerilog implementation of the Open Verification Methodology (OVM) is exercised on an 8b/10b RTL open core design, in the hope of providing a simple yet complete exercise that exposes the key features of OVM. Emphasis is put on the actual usage of the verification components rather than on a complete verification flow, with the aim of helping readers unfamiliar with OVM who seek to apply the methodology to their own designs. A link to the complete code is given to reinforce this aim. We found the methodology easy to use but intimidating at first glance, especially for someone with little experience in object-oriented programming. However, the flexibility, portability and reusability of the verification code become clear once you manage to take the first steps.
Abstract:
Despite several examples of deployed agent systems, there remain barriers to the large-scale adoption of agent technologies. In order to understand these barriers, this paper considers aspects of marketing theory which deal with diffusion of innovations and their relevance to the agents domain and the current state of diffusion of agent technologies. In particular, the paper examines the role of standards in the adoption of new technologies, describes the agent standards landscape, and compares the development and diffusion of agent technologies with that of object-oriented programming. The paper also reports on a simulation model developed in order to consider different trajectories for the adoption of agent technologies, with trajectories based on various assumptions regarding industry structure and the existence of competing technology standards. We present details of the simulation model and its assumptions, along with the results of the simulation exercises.
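As a rough illustration of the kind of adoption trajectory such a simulation compares (not the paper's model), the following sketch runs a simple Bass-style diffusion of a technology through a population of potential adopters; the market size and the innovation and imitation parameters are hypothetical.

```python
# Illustrative Bass-style diffusion sketch, not the paper's simulation model.
def bass_diffusion(market=1000, p=0.01, q=0.35, periods=30):
    adopters, trajectory = 0.0, []
    for _ in range(periods):
        remaining = market - adopters
        new = (p + q * adopters / market) * remaining   # innovation + imitation
        adopters += new
        trajectory.append(adopters)
    return trajectory

for t, a in enumerate(bass_diffusion()):
    print(t, round(a))    # cumulative adopters per period
```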
Abstract:
A problem-oriented language for the structural design of buildings and the corresponding data storage structure are presented as the main core of the PROADE system. The goal is to allow the structural engineer to describe the problem in ordinary engineering terms, with the received data organised for subsequent analysis and dimensioning of the structure. The PROADE problem and the corresponding data are discussed, followed by a description of the system's data storage structures. The PROADE language is then defined, and finally the organisation of the PROADE system is presented.