985 results for Source code visualization
Abstract:
Numérifrag, the first part of this thesis, presents itself as the source code of a digital poetry project in which HTML tags have been manipulated for aesthetic effect. The repetitive, parasitic effect of the code forces the reader to perform a work of decryption in order to restore the poems' legibility. While the text is linear on paper, the programming of each poem as a web page invites the reader to navigate through the work and to actualize its potential for non-linearity. The second part of this thesis, Corps discursif et dispositif dans Le centre blanc de Nicole Brossard, examines the notion of the dispositif as subversion in Nicole Brossard's collection Le centre blanc (1970). The elaboration of this dispositif passes through the body, which expresses itself through the text and finds its breath in the reader through the act of interpretation.
Abstract:
Thesis (Ph.D, Computing) -- Queen's University, 2016-09-30 09:55:51.506
Abstract:
Purpose: Custom cranio-orbital implants have been shown to achieve better performance than their hand-shaped counterparts by restoring skull anatomy more accurately and by reducing surgery time. Designing a custom implant involves reconstructing a model of the patient's skull from a computed tomography (CT) scan. The healthy side of the skull model, contralateral to the damaged region, can then be used to design an implant plan. Designing implants for areas of thin bone, such as the orbits, is challenging due to the poor CT resolution of thin bone structures, which must be segmented manually, making preoperative design time-intensive. The objective of this thesis was to research methods to accurately and efficiently design cranio-orbital implant plans, with a focus on the orbits, and to develop software that integrates these methods. Methods: The software consists of modules that use image- and surface-restoration approaches to enhance both the quality of the CT data and the reconstructed model. It enables users to input CT data and apply tools that output a skull model with restored anatomy, which can then be used to design the implant plan. The software was built on 3D Slicer, an open-source medical visualization platform, and was tested on CT data from thirteen patients. Results: Creating a skull model with restored anatomy using our software took 0.33 hours on average (± 0.04 SD). In comparison, the manual segmentation method took between 3 and 6 hours. To assess the structural accuracy of the reconstructed models, CT data from the thirteen patients was used to compare the models created with our software against those created manually. After registering each pair of skull models, the difference between them was found to be 0.4 mm (± 0.16 SD). Conclusions: We have developed software to design custom cranio-orbital implant plans, with a focus on thin bone structures. The method decreases design time while achieving accuracy similar to the manual method.
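The reported 0.4 mm figure is a distance between registered models. As a rough illustration of how such a comparison could be computed, here is a minimal sketch of a mean closest-point distance between two surface point clouds; the data is synthetic and this is not the thesis's actual 3D Slicer pipeline:

```python
# Sketch: mean closest-point distance between two registered skull models,
# represented here as point clouds sampled from their surfaces.
# Hypothetical toy data; the thesis itself works inside 3D Slicer.
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_distance(points_a: np.ndarray, points_b: np.ndarray) -> float:
    """Average distance from each point of model A to the nearest point of model B."""
    tree = cKDTree(points_b)
    distances, _ = tree.query(points_a)  # nearest-neighbor distances
    return float(distances.mean())

# Toy example: two point clouds offset by ~0.4 mm along x.
rng = np.random.default_rng(0)
model_auto = rng.uniform(0, 100, size=(5000, 3))       # software-reconstructed model (mm)
model_manual = model_auto + np.array([0.4, 0.0, 0.0])  # manually segmented model (mm)

print(f"mean surface distance: {mean_surface_distance(model_auto, model_manual):.2f} mm")
```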
Abstract:
Embedded systems are increasingly integral to daily life, improving the efficiency of modern cyber-physical systems that provide access to sensor data and actuators. As modern architectures become increasingly complex and heterogeneous, their optimization becomes a challenging task; ensuring platform security is equally important to avoid harm to individuals and assets. This study addresses challenges in contemporary embedded systems, focusing on platform optimization and security enforcement. The first part applies machine learning methods to efficiently determine the optimal number of cores for a parallel RISC-V cluster, minimizing energy consumption using static source-code analysis. Results demonstrate that automated platform configuration is viable, with a moderate performance trade-off when relying solely on static features. The second part addresses heterogeneous device mapping: assigning each task to the most suitable computational device in a heterogeneous platform for optimal runtime. The contribution of this part lies in novel pre-processing techniques, together with a training framework based on Siamese networks, that enhance the classification performance of DeepLLVM, a state-of-the-art approach for task mapping; importantly, the proposed approaches are independent of the specific deep-learning model used. Finally, this work addresses the binary exploitation of software running on modern embedded systems. It proposes an architecture for implementing Control-Flow Integrity on embedded platforms with a Root-of-Trust, aiming to strengthen security guarantees with limited hardware modifications. The approach extends the architecture of a modern RISC-V platform for autonomous vehicles with a side-channel communication mechanism that relays control-flow changes made by the process running on the host core to the Root-of-Trust. The approach has limited impact on performance and is effective in enhancing the security of embedded platforms.
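To make the Siamese idea concrete, here is a minimal PyTorch sketch of a contrastive training step over pairs of program-feature embeddings: programs whose optimal device is the same are pulled together, others pushed apart. All layer sizes, names, and the feature dimension are illustrative assumptions, not the DeepLLVM architecture:

```python
# Sketch of Siamese training for device mapping: two programs with the same
# optimal device should embed close together, different devices far apart.
# Feature dimensions and architecture are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, in_dim: int = 128, emb_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, emb_dim),
        )

    def forward(self, x):
        return self.net(x)

def contrastive_loss(za, zb, same_label, margin: float = 1.0):
    """Pull embeddings of same-device programs together, push others apart."""
    dist = torch.norm(za - zb, dim=1)
    pos = same_label * dist.pow(2)
    neg = (1 - same_label) * torch.clamp(margin - dist, min=0).pow(2)
    return (pos + neg).mean()

encoder = Encoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# Toy batch: random static-feature vectors and whether each pair shares a device.
xa, xb = torch.randn(16, 128), torch.randn(16, 128)
same = torch.randint(0, 2, (16,)).float()

loss = contrastive_loss(encoder(xa), encoder(xb), same)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```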
Abstract:
The use of version control systems and the ability to store source code on public platforms such as GitHub have increased the number of passwords, API keys, and tokens that can be found and exploited, creating a serious security issue for individuals and companies. This project presents SAP's secret scanner, Credential Digger: how it scans repositories to detect hardcoded secrets and how it filters out the false positives among them. It also describes my implementation of Credential Digger's pre-commit hook, with a performance comparison of three different implementations of the hook based on how each interacts with the machine learning model. Finally, the project shows how already-detected credentials can be used to decrease the number of false positives by leveraging the similarity between leaks through the Bucket System.
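For context, a pre-commit secret scan typically inspects the staged diff before the commit is created. Below is a minimal sketch of that general idea using illustrative regex rules; it is not Credential Digger's actual hook or rule set, and the real tool additionally applies an ML model to filter false positives:

```python
#!/usr/bin/env python3
# Sketch of a pre-commit hook that scans staged changes for hardcoded secrets.
# Illustrative patterns only; Credential Digger's real hook also uses an ML
# model to filter false positives.
import re
import subprocess
import sys

PATTERNS = {
    "AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic secret": re.compile(r"(password|api[_-]?key|token)\s*=\s*['\"][^'\"]+['\"]", re.I),
}

def staged_diff() -> str:
    return subprocess.run(
        ["git", "diff", "--cached", "--unified=0"],
        capture_output=True, text=True, check=True,
    ).stdout

def main() -> int:
    findings = []
    for line in staged_diff().splitlines():
        if not line.startswith("+") or line.startswith("+++"):
            continue  # only scan added lines, skip file headers
        for name, pattern in PATTERNS.items():
            if pattern.search(line):
                findings.append((name, line))
    for name, line in findings:
        print(f"possible {name}: {line[1:].strip()}", file=sys.stderr)
    return 1 if findings else 0  # non-zero exit blocks the commit

if __name__ == "__main__":
    sys.exit(main())
```

Saved as .git/hooks/pre-commit and made executable, the non-zero exit status aborts the commit until the flagged lines are removed or cleared.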
Abstract:
In this article, based on a talk given at the Leg@l.IT conference (www.legalit.ca), the author offers a quick overview of the features of the electronic filing systems of the Federal Court and the Tax Court of Canada, in order to identify the advantages and drawbacks of each of the proposed technologies. This exercise is part of a broader reflection on the consequences of the gradual migration of certain jurisdictions toward electronic filing. While this attempt to modernize the judicial process is meant to be beneficial, a technological change of such magnitude is not without risks or consequences for the habits and customs of the judicial system. The author thus questions the practice adopted by some courts of developing, in silos, their own solutions for computerizing the management of court records. The lack of system compatibility and the retreat toward proprietary models are causes for concern. Moreover, by entrusting the development of these systems to firms that retain ownership of the source code, the courts contribute to a certain privatization of the process, making the networking of the judicial system all the more difficult. Yet insofar as the systems of different courts will be called upon to communicate and exchange data, the adoption of compatible and open technological solutions is essential. Another problem lies in the legislator's apparent inability to keep pace with the virtualization of the judicial process. Technological change sometimes imposes a conceptual shift that is difficult to reconcile with the applicable legislation. This observation implies the need for deeper reflection on whether to adapt the law to technology, or technology to the law, in order to ensure a coherent and effective coexistence of these two worlds.
Abstract:
Dental identification is the most valuable method for identifying human remains in single cases with major postmortem alterations, as well as in mass casualties, because of its practicability and reliability. Computed tomography (CT) has been investigated as a supportive tool for forensic identification and has proven valuable; it can scan the dentition of a deceased person within minutes. In the present study, we investigated currently used restorative materials using ultra-high-resolution dual-source CT and the extended CT scale, for the purpose of a color-encoded, in-scale, artifact-free visualization in 3D volume rendering. In 122 human molars, 220 cavities with 2-, 3-, 4-, and 5-mm diameters were prepared. These cavities were restored with presently used filling materials (different composites, temporary filling materials, ceramic, and liner) in six teeth for each material and cavity size (exception: amalgam, n = 1). The teeth were CT scanned and the images reconstructed using an extended CT scale. The filling materials were analyzed in terms of the resulting Hounsfield units (HU) and the representation of filling size within the images. The various restorative materials showed distinctly different radiopacities, allowing CT-data-based discrimination; in particular, ceramic and composite fillings could be differentiated. The HU values were used to generate an updated volume-rendering preset for postmortem extended-CT-scale data of the dentition, which easily visualizes the position of restorations, their shape (in scale), and the material used, color-encoded in 3D. The results provide the scientific background for applying 3D volume rendering to visualize the human dentition for forensic identification purposes.
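The core of such a preset is a mapping from HU ranges to per-material display colors. A minimal sketch of that idea follows; the HU thresholds and colors below are hypothetical placeholders, not the calibrated values measured in the study:

```python
# Sketch: classify voxels into restorative materials by Hounsfield-unit range
# and assign each material a display color for volume rendering.
# The HU ranges below are hypothetical, not the study's calibrated values.
import numpy as np

MATERIAL_PRESET = [
    # (hu_min, hu_max, material, RGB color)
    (1500, 4000, "enamel/dentine", (1.0, 1.0, 0.9)),
    (4000, 8000, "composite", (0.2, 0.6, 1.0)),
    (8000, 15000, "ceramic", (0.2, 1.0, 0.4)),
    (15000, 30700, "amalgam", (1.0, 0.3, 0.3)),
]

def classify(volume_hu: np.ndarray) -> dict:
    """Return a boolean mask per material for color-encoded rendering."""
    return {
        name: (volume_hu >= lo) & (volume_hu < hi)
        for lo, hi, name, _color in MATERIAL_PRESET
    }

volume = np.random.randint(-1000, 30000, size=(64, 64, 64))  # toy extended-scale volume
masks = classify(volume)
print({name: int(mask.sum()) for name, mask in masks.items()})
```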
Abstract:
Source routes and spatial diffusion of capuchin monkeys over the past 6 million years, reconstructed in SPREAD 1.0.6 from the MCC tree. The map shows the 10 different regions with which distinct samples were associated. The different transmission routes were calculated from the average rate over time. Only rates with a Bayes factor > 3 were considered significantly different from zero. Significant diffusion pathways are highlighted in colors varying from dark brown to deep red, with dark brown marking the less significant rates and deep red the most significant ones.
Abstract:
This paper presents results of a verification test of a Direct Numerical Simulation code of mixed high order of accuracy using the method of manufactured solutions (MMS). The test is based on the formulation of an analytical solution for the Navier-Stokes equations modified by the addition of a source term. The numerical code was aimed at simulating the temporal evolution of instability waves in a plane Poiseuille flow. The governing equations were solved in a vorticity-velocity formulation for a two-dimensional incompressible flow. The code employed two different numerical schemes: one used mixed high-order compact and non-compact finite differences from fourth- to sixth-order accuracy; the other used spectral methods instead of finite differences in the streamwise direction, which was periodic. In the present test, particular attention was paid to the boundary conditions of the physical problem of interest. Indeed, verification via MMS can be more demanding than the often-used comparison with Linear Stability Theory, particularly because the latter pays no attention to the nonlinear terms. For the present verification test, it was possible to manufacture an analytical solution that reproduced some aspects of an instability wave in a nonlinear stage. Although the MMS results for this mixed-order numerical scheme had to be interpreted with care, the test was very useful, as it gave confidence that the code was free of programming errors.
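The MMS logic can be stated compactly: write the governing equations as an operator equation, choose a smooth manufactured field, and add its residual as a source term so that the manufactured field becomes an exact solution. A generic schematic (the paper's actual manufactured solution for the vorticity-velocity equations is more elaborate):

```latex
% Method of manufactured solutions, schematically:
% write the governing equations as L(u) = 0, pick a smooth manufactured
% field u_m, and define the source term as its residual.
\[
  \mathcal{L}(u) = 0
  \quad\longrightarrow\quad
  S \equiv \mathcal{L}(u_m),
  \qquad
  \mathcal{L}(u) = S .
\]
% u_m is then the exact solution of the modified problem, so the code's
% discretization and programming errors can be measured directly against it.
```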
Abstract:
This paper analyzes the DNA code of several species from the perspective of information content. For that purpose, several concepts and mathematical tools are selected toward establishing a quantitative method without a priori distorting the alphabet represented by the sequence of DNA bases. The synergies of associating Gray code, histogram characterization, and multidimensional scaling visualization lead to a collection of plots with a categorical representation of species and chromosomes.
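A minimal sketch of that pipeline: encode the four bases with a 2-bit Gray code, characterize each sequence by a histogram of encoded k-mers, and project pairwise histogram distances with multidimensional scaling. The base-to-code assignment, k-mer length, and random sequences below are illustrative assumptions, not the paper's exact setup:

```python
# Sketch: Gray-code the DNA alphabet, build k-mer histograms, embed with MDS.
# Sequences are random placeholders, not real chromosome data.
import numpy as np
from sklearn.manifold import MDS

GRAY = {"A": 0b00, "C": 0b01, "T": 0b11, "G": 0b10}  # one possible Gray assignment

def kmer_histogram(seq: str, k: int = 3) -> np.ndarray:
    """Histogram of Gray-encoded k-mers, normalized to a distribution."""
    hist = np.zeros(4 ** k)
    for i in range(len(seq) - k + 1):
        code = 0
        for base in seq[i : i + k]:
            code = (code << 2) | GRAY[base]
        hist[code] += 1
    return hist / hist.sum()

rng = np.random.default_rng(1)
sequences = ["".join(rng.choice(list("ACGT"), size=2000)) for _ in range(6)]
histograms = np.array([kmer_histogram(s) for s in sequences])

# Pairwise Euclidean distances between histograms, embedded in 2D for plotting.
d = np.linalg.norm(histograms[:, None] - histograms[None, :], axis=-1)
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(d)
print(coords)
```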
Abstract:
Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit-error-rate performance and low complexity. To exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error-correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to change the compression ratio of the source (DCT-coefficient bitplane) gracefully according to the correlation between the original and the side information. The proposed LDPC codes perform well over a wide range of source correlations and achieve better rate-distortion (RD) performance than the popular turbo codes.
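The merging operation itself is simple linear algebra over GF(2): replacing two rows of the parity-check matrix H by their XOR turns two syndrome bits into one, so fewer bits need to be transmitted and the compression ratio rises. A toy sketch of one such merge (the matrix is illustrative, far smaller than any practical LDPC code):

```python
# Sketch: merging two parity-check nodes of an LDPC code by XOR-ing the
# corresponding rows of H over GF(2). Fewer rows = fewer syndrome bits,
# i.e. higher compression of the source bitplane. Toy matrix only.
import numpy as np

H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
], dtype=np.uint8)

def merge_checks(H: np.ndarray, i: int, j: int) -> np.ndarray:
    """Replace rows i and j by their GF(2) sum, dropping one row."""
    merged = H[i] ^ H[j]
    keep = [r for r in range(H.shape[0]) if r not in (i, j)]
    return np.vstack([H[keep], merged])

H2 = merge_checks(H, 0, 1)
x = np.random.randint(0, 2, size=H.shape[1], dtype=np.uint8)  # source bitplane
print("syndrome bits before:", H @ x % 2)
print("syndrome bits after: ", H2 @ x % 2)  # one bit fewer to transmit
```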
Abstract:
Absolute positioning, the real-time satellite-based positioning technique that relies solely on global navigation satellite systems, lacks accuracy for several real-time application domains. To provide increased positioning quality, ground- or satellite-based augmentation systems can be devised, depending on the extent of the area to cover. The underlying technique, multiple-reference-station differential positioning, can, in the case of ground systems, be further enhanced through the virtual reference station concept. Our approach is a ground-based system consisting of a small network of three stations in which the virtual reference station concept was implemented. The stations provide code pseudorange corrections, which are combined in the measurement domain with weights inversely proportional to the distance from each source station to the rover. All data links are established through the Internet.
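As stated above, the combination is an inverse-distance weighting of per-station corrections. A minimal sketch with illustrative numbers:

```python
# Sketch: combine code pseudorange corrections from several reference
# stations with weights inversely proportional to station-rover distance.
# Values are illustrative, not measurements from the described network.
def combine_corrections(corrections_m, distances_km):
    """Inverse-distance weighted average of per-station corrections (meters)."""
    weights = [1.0 / d for d in distances_km]
    total = sum(weights)
    return sum(w * c for w, c in zip(weights, corrections_m)) / total

# Three stations: corrections in meters, distances to the rover in km.
corrections = [2.1, 1.7, 2.4]
distances = [12.0, 35.0, 58.0]
print(f"combined correction: {combine_corrections(corrections, distances):.2f} m")
```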
Abstract:
Dissertation presented to the Faculdade de Ciências e Tecnologias da Universidade Nova de Lisboa for the degree of Master in Computer Engineering (Engenharia Informática)
Abstract:
Dissertation presented as a partial requirement for the degree of Master in Geographic Information Science and Systems
Abstract:
Thesis submitted in fulfillment of the requirements for the degree of Master in Biomedical Engineering