963 results for atk-ohjelmat - LSP - Library software package
Abstract:
Software updates are critical to the security of software systems and devices. Yet users often do not install them in a timely manner, leaving their devices open to security exploits. This research explored a redesign of automatic software updates on desktop and mobile devices to improve the uptake of updates, through three studies. First, in a formative interview study, we examined users' updating patterns and behaviors on desktop machines. Second, we distilled these findings into the design of a low-fidelity prototype for desktops and evaluated its efficacy for automating updates by means of a think-aloud study. Third, we investigated individual differences in update automation on Android devices using a large-scale survey and interviews. In this thesis, I present the findings of all three studies and provide evidence for how automatic updates can be better appropriated to fit users on both desktops and mobile devices. Additionally, I provide user interface design suggestions for software updates and outline recommendations for future work to improve the user experience of software updates.
Abstract:
Hardware vendors invest considerable effort in creating low-power CPUs that keep battery life and durability above acceptable levels. In order to achieve this goal and provide a good performance-energy balance for a wide variety of applications, ARM designed the big.LITTLE architecture. This heterogeneous multi-core architecture features two different types of cores: big cores oriented towards performance, and LITTLE cores, which are slower and aimed at saving energy. As all the cores have access to the same memory, multi-threaded applications must resort to some mutual exclusion mechanism to coordinate access to shared data by concurrent threads. Transactional Memory (TM) represents an optimistic approach to shared-memory synchronization. To take full advantage of the features offered by software TM, but also benefit from the characteristics of heterogeneous big.LITTLE architectures, our focus is to propose TM solutions that take into account the power/performance requirements of the application and what is offered by the architecture. In order to understand the current state of the art and obtain useful information for future power-aware software TM solutions, we have analyzed a popular software TM library running on top of an ARM big.LITTLE processor. Experiments show, in general, better scalability on the LITTLE cores for all applications except one, which requires the computing performance that the big cores offer.
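A quick way to feel the asymmetry this analysis measures is to pin the same workload to one core of each cluster and compare run times. The sketch below assumes a Linux system and an 8-core big.LITTLE SoC where core 0 belongs to the LITTLE cluster and core 4 to the big cluster (these IDs are assumptions; the mapping differs between boards), and it uses a plain CPU-bound loop rather than a transactional benchmark.

```python
import os
import time

# Assumed core numbering for an 8-core big.LITTLE SoC; check /proc/cpuinfo
# or the board documentation, as the mapping varies between devices.
LITTLE_CORE = 0
BIG_CORE = 4

def workload(n=3_000_000):
    # CPU-bound toy loop standing in for a benchmark kernel.
    acc = 0
    for i in range(n):
        acc += i * i
    return acc

def time_on(core_id):
    os.sched_setaffinity(0, {core_id})   # pin the current process to one core
    start = time.perf_counter()
    workload()
    return time.perf_counter() - start

print(f"LITTLE core {LITTLE_CORE}: {time_on(LITTLE_CORE):.3f} s")
print(f"big core    {BIG_CORE}: {time_on(BIG_CORE):.3f} s")
```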
Abstract:
Oslo, the capital of Norway, is experiencing sudden population growth: according to estimates from Statistics Norway, an increase of 200,000 inhabitants is expected by 2040. Population growth will bring a considerable increase in water demand and, together with other factors such as ageing infrastructure and climate change, will put notable pressure on the existing water infrastructure. In response to the need for timely change, the city's water utility (Oslo VAV) has decided to fund projects to improve the robustness of the water infrastructure. This thesis is part of the E3WDM project, established in 2005 with the aim of defining a more efficient management of Oslo's water resources. The general objective of the thesis is to build a metabolic model with the UWOT software (Makropoulos et al., 2008) to represent the water consumption of two typical housing types in the city of Oslo. The novelty of this study lies in defining and modelling in-house water demand at a very high level of detail. The new approach provided by UWOT allows the simulation of different intervention strategies and the subsequent optimal management of water resources, minimizing water consumption, energy consumption and costs while satisfying the required water demand. The thesis includes: (i) a description of the UWOT software, in particular the purpose of the model, the innovative approach adopted, its structure and the procedure for building a model of the urban water system; (ii) the definition of the data required to simulate in-house water consumption in the city of Oslo and the methods used to collect them; and (iii) the application of the UWOT model to derive water consumption trends and the subsequent analysis of the results.
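UWOT models demand at the appliance level; as a rough illustration of that bottom-up idea, the sketch below aggregates hypothetical per-appliance figures (not UWOT's calibrated values, nor its API) into a daily household demand.

```python
# Minimal bottom-up demand sketch: daily household demand is the sum over
# appliances of litres/use * uses/person/day * occupants.
APPLIANCES = {                 # (litres per use, uses per person per day) -- assumed figures
    "toilet":          (6.0, 5.0),
    "shower":          (50.0, 0.7),
    "kitchen_tap":     (8.0, 3.0),
    "washing_machine": (60.0, 0.2),
    "dishwasher":      (15.0, 0.15),
}

def daily_demand(occupants):
    return sum(litres * freq * occupants for litres, freq in APPLIANCES.values())

print(f"Estimated demand, 2-person dwelling: {daily_demand(2):.0f} L/day")
```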
Abstract:
Purpose: Custom cranio-orbital implants have been shown to achieve better performance than their hand-shaped counterparts by restoring skull anatomy more accurately and by reducing surgery time. Designing a custom implant involves reconstructing a model of the patient's skull from their computed tomography (CT) scan. The healthy side of the skull model, contralateral to the damaged region, can then be used to design an implant plan. Designing implants for areas of thin bone, such as the orbits, is challenging due to the poor CT resolution of these bone structures, which makes preoperative design time-intensive since thin bone structures in CT data must be manually segmented. The objective of this thesis was to research methods to accurately and efficiently design cranio-orbital implant plans, with a focus on the orbits, and to develop software that integrates these methods. Methods: The software consists of modules that use image and surface restoration approaches to enhance both the quality of the CT data and the reconstructed model. It enables users to input CT data and apply tools that output a skull model with restored anatomy, which can then be used to design the implant plan. The software was built on 3D Slicer, an open-source medical visualization platform, and was tested on CT data from thirteen patients. Results: The average time to create a skull model with restored anatomy using our software was 0.33 ± 0.04 hours (SD). In comparison, the manual segmentation method took between 3 and 6 hours. To assess the structural accuracy of the reconstructed models, CT data from the thirteen patients were used to compare the models created with our software against those created with the manual method. When the skull models were registered together, the difference between each pair of skulls was 0.4 ± 0.16 mm (SD). Conclusions: We have developed software to design custom cranio-orbital implant plans, with a focus on thin bone structures. The method decreases design time and achieves accuracy similar to that of the manual method.
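The contralateral-design step can be illustrated with a minimal sketch: assuming the skull model is already aligned so that the sagittal plane is x = 0 (real pipelines register the CT volume first, e.g. inside 3D Slicer), mirroring the healthy side gives a template for the defect side.

```python
import numpy as np

def mirror_across_sagittal(vertices: np.ndarray) -> np.ndarray:
    """Reflect an (N, 3) array of vertex coordinates across the x = 0 plane."""
    mirrored = vertices.copy()
    mirrored[:, 0] *= -1.0
    return mirrored

# Toy usage: mirror points from the healthy side to serve as a template
# for the damaged side (coordinates are made up).
healthy = np.array([[12.5, 40.0, 55.0],
                    [13.1, 42.2, 53.8]])
template = mirror_across_sagittal(healthy)
print(template)
```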
Abstract:
Security defects are common in large software systems because of their size and complexity. Although efficient development processes, testing, and maintenance policies are applied, a large number of vulnerabilities can still remain. Some vulnerabilities stay in a system from one release to the next because they cannot be easily reproduced through testing; these vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective in identifying the types and locations of vulnerabilities at an early stage and in improving the security of subsequent versions (releases) of the software. We expand an existing concept of software bug classification to vulnerability classification (easily reproducible versus hard to reproduce) and develop a classification framework that differentiates between these vulnerabilities based on code fixes and textual reports. We then investigate potential correlations between the vulnerability categories, classical software metrics, and other runtime environmental factors of reproducibility to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. The vulnerability prediction framework also helps security experts focus their effort on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in subsequent versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 Decision Tree, Random Forest, Logistic Regression, and Naive Bayes) are employed. The effectiveness of the proposed frameworks is assessed using software security defects collected from Mozilla Firefox.
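As a hedged illustration of the prediction framework's core step, the sketch below trains one of the classifiers mentioned above (a Random Forest) on synthetic file-level metrics to separate two vulnerability categories; the feature names and data are illustrative only and do not come from the thesis dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
# Synthetic features standing in for classical metrics per file:
# lines of code, cyclomatic complexity, churn, number of past fixes.
X = rng.normal(size=(500, 4))
# Synthetic label: 1 = "hard to reproduce", 0 = "easily reproducible".
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```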
Abstract:
CERN - the European Organization for Nuclear Research - is one of the largest research centers worldwide, responsible for several discoveries in physics as well as in computer science. The CERN Document Server, also known as CDS Invenio, is software developed at CERN that aims to provide a set of tools for managing digital libraries. In order to improve the functionality of CDS Invenio, a new module called BibCirculation was developed to manage the books (and other items) of the CERN library, working as an integrated library system. This thesis describes the steps taken to achieve the goals of this project, explaining, among other aspects, the process of integration with the other existing modules as well as the way book information is associated with CDS Invenio metadata. A detailed account of the entire implementation and testing process is also provided. Finally, the conclusions of this project and ideas for future development are presented.
Abstract:
The metapopulation paradigm is central in ecology and conservation biology for understanding the dynamics of spatially structured populations in fragmented landscapes. Metapopulations are often studied using simulation modelling, and there is an increasing demand for user-friendly software tools to simulate metapopulation responses to environmental change. Here we describe the MetaLandSim R package, which integrates ideas from metapopulation and graph theories to simulate the dynamics of real and virtual metapopulations. The package offers tools to (i) estimate metapopulation parameters from empirical data, (ii) predict variation in patch occupancy over time in static and dynamic landscapes, either real or virtual, and (iii) quantify the patterns and speed of metapopulation expansion into empty landscapes. MetaLandSim thus provides detailed information on metapopulation processes, which can easily be combined with land use and climate change scenarios to predict metapopulation dynamics and range expansion for a variety of taxa and ecological systems.
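The occupancy-prediction idea can be sketched with a generic stochastic patch occupancy model (this is not MetaLandSim's API; the parameters and the colonisation rule are illustrative): patches go extinct with a fixed probability, and empty patches are colonised with a probability that grows with the number of occupied patches.

```python
import numpy as np

rng = np.random.default_rng(1)
n_patches, e, c, steps = 50, 0.1, 0.05, 100
occupied = rng.random(n_patches) < 0.3           # initial occupancy pattern

for _ in range(steps):
    n_occ = occupied.sum()
    p_col = 1 - (1 - c) ** n_occ                 # more occupied patches, more colonists
    newly_colonised = (~occupied) & (rng.random(n_patches) < p_col)
    gone_extinct = occupied & (rng.random(n_patches) < e)
    occupied = (occupied | newly_colonised) & ~gone_extinct

print(f"Fraction of patches occupied after {steps} steps: {occupied.mean():.2f}")
```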
Abstract:
The Virtual Map Library (Mapoteca Virtual), www.mapoteca.geo.una.ac.cr, is a website built on the Joomla platform, sponsored by the School of Geographic Sciences of the National University, Costa Rica, in collaboration with UNA Virtual. The site aims to support teaching by allowing instructors to upload and disseminate digital cartography for students, and it helps researchers locate the online cartography needed for their work in different areas of knowledge. In addition, it offers a space to present current documents on the practice of cartography and related sciences, while promoting collaboration and free access to digital cartography. Key words: cartography, Mapoteca Virtual, Virtual Map Library, Joomla, digital maps, online teaching tools, School of Geographic Sciences, National University, Costa Rica.
Abstract:
Modern High-Performance Computing (HPC) systems are gradually increasing in size and complexity due to the corresponding demand for larger simulations, which require more complicated tasks and higher accuracy. However, as Dennard scaling approaches its ultimate power limit, software efficiency also plays an important role in increasing the overall performance of a computation. Tools to measure application performance in these increasingly complex environments provide insight into the intricate ways in which software and hardware interact. Monitoring power consumption in order to save energy is possible through processor interfaces such as the Intel Running Average Power Limit (RAPL). Given the low level of these interfaces, they are often paired with an application-level tool like the Performance Application Programming Interface (PAPI). Since many problems in heterogeneous fields can be represented as large linear systems, an optimized and scalable linear-system solver can significantly decrease the time spent computing their solution. One of the most widely used algorithms for solving large systems is Gaussian elimination, whose most popular implementation for HPC systems is found in the Scalable Linear Algebra PACKage (ScaLAPACK) library. Another relevant algorithm, which is gaining popularity in academia, is the Inhibition Method. This thesis compares the energy consumption of the Inhibition Method and of ScaLAPACK's Gaussian elimination, profiling their execution while solving linear systems on the HPC architecture offered by CINECA. Moreover, it collates the energy and power values for different rank, node, and socket configurations. The monitoring tools employed to track the energy consumption of these algorithms are PAPI and RAPL, integrated with the parallel execution of the algorithms managed with the Message Passing Interface (MPI).
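A minimal version of this energy-accounting setup can be sketched by reading the RAPL counters exposed by Linux around a dense solve. The sketch below assumes a host that exposes /sys/class/powercap/intel-rapl:0/energy_uj with read permission (often root-only) and uses a NumPy solve as a stand-in for the ScaLAPACK and Inhibition Method kernels.

```python
import numpy as np

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-level energy counter, microjoules

def read_energy_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

A = np.random.rand(2000, 2000)
b = np.random.rand(2000)

e0 = read_energy_uj()
x = np.linalg.solve(A, b)          # dense solve standing in for the profiled kernels
e1 = read_energy_uj()
# The counter wraps periodically, so long runs need wrap handling.
print(f"Package energy for the solve: {(e1 - e0) / 1e6:.3f} J")
```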
Abstract:
During the last semester of the Master's Degree in Artificial Intelligence, I carried out my internship working for TXT e-Solution on the ADMITTED project. This thesis describes the work done in those months and is divided into two parts, corresponding to the two tasks I was assigned during the internship. The first part introduces the project and the work done on the admittedly library, maintaining the code base and writing the test suites; this work is closer to the software engineering role, developing features, fixing bugs, and testing. The second part describes the experiments on the anomaly detection task using a deep learning technique called an autoencoder; this task is, in turn, closer to the data science role. The two tasks were not carried out simultaneously but one after the other, which is why I preferred to divide them into two separate parts of this thesis.
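The anomaly-detection approach can be sketched as follows: train an autoencoder on normal samples only and flag inputs whose reconstruction error exceeds a threshold taken from the training error distribution. The data, layer sizes, and threshold below are illustrative; they do not reproduce the ADMITTED experiments.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
normal = torch.randn(1024, 20)                      # training data: "normal" samples only

model = nn.Sequential(
    nn.Linear(20, 8), nn.ReLU(),                    # encoder: compress to a bottleneck
    nn.Linear(8, 20),                               # decoder: reconstruct the input
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(200):                                # fit reconstruction of normal data
    opt.zero_grad()
    loss = loss_fn(model(normal), normal)
    loss.backward()
    opt.step()

# Flag anomalies: reconstruction error above a high quantile of the training error.
with torch.no_grad():
    train_err = ((model(normal) - normal) ** 2).mean(dim=1)
    threshold = train_err.quantile(0.99)
    test = torch.cat([torch.randn(5, 20), torch.randn(5, 20) + 4.0])  # last 5 are shifted
    test_err = ((model(test) - test) ** 2).mean(dim=1)
    print((test_err > threshold).tolist())
```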
Abstract:
The Bundle Protocol (BP) version 7 has recently been standardized by the IETF in RFC 9171, but it is the whole DTN (Delay-/Disruption-Tolerant Networking) architecture, of which BP is the core, that is gaining renewed interest thanks to its planned adoption in future space missions. This is obviously positive, but at the same time it seems to make space agencies more interested in deployment than in research, with new BP implementations that may challenge the central role played until now by the historical BP reference implementations, such as ION and DTNME. To make Unibo research on DTN independent of space agency decisions, the development of an internal BP implementation was in order. This is the goal of this thesis, which deals with the design and implementation of Unibo-BP: a novel, research-driven BP implementation, to be released as Free Software. Unibo-BP is fully compliant with RFC 9171, as demonstrated by a series of interoperability tests with ION and DTNME, and it presents a few innovations, such as the ability to manage remote DTN nodes by means of the BP itself. Unibo-BP is compatible with pre-existing Unibo implementations of CGR (Contact Graph Routing) and LTP (Licklider Transmission Protocol) thanks to interfaces designed during the thesis. The thesis project also includes an implementation of TCPCLv3 (TCP Convergence Layer version 3, RFC 7242), which can be used as an alternative to LTPCL to connect with proximate nodes, especially in terrestrial networks. In summary, Unibo-BP is at the heart of a larger project, Unibo-DTN, which aims to implement the main components of a complete DTN stack (BP, TCPCL, LTP, CGR). Moreover, Unibo-BP is compatible with all DTNsuite applications, thanks to an extension of the Unified API library on which DTNsuite applications are based. The hope is that Unibo-BP and the ancillary programs developed during this thesis will contribute to the growth of DTN popularity in academia and among space agencies.
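The contact-plan idea behind CGR can be illustrated with a heavily simplified sketch (this is not Unibo-BP's or ION's routing code): given scheduled contact windows, find the earliest time a bundle can reach the destination, ignoring bundle size, data rates, and queueing.

```python
import heapq

# Illustrative contact plan: (from, to, window start, window end) in seconds.
CONTACTS = [
    ("ground", "sat1", 0, 100),
    ("sat1", "sat2", 120, 200),
    ("sat2", "lander", 150, 300),
]

def earliest_arrival(source, dest, t0=0):
    best = {source: t0}
    queue = [(t0, source)]
    while queue:
        t, node = heapq.heappop(queue)
        if node == dest:
            return t
        for frm, to, start, end in CONTACTS:
            if frm == node and t <= end:
                arrival = max(t, start)              # wait for the contact window to open
                if arrival < best.get(to, float("inf")):
                    best[to] = arrival
                    heapq.heappush(queue, (arrival, to))
    return None                                      # destination unreachable in this plan

print(earliest_arrival("ground", "lander"))          # -> 150
```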
Abstract:
A fosmid metagenomic library was constructed with total community DNA obtained from a municipal wastewater treatment plant (MWWTP), with the aim of identifying new FeFe-hydrogenase genes, which encode the enzymes most important for hydrogen metabolism. The dataset generated by pyrosequencing of the fosmid library was mined to identify environmental gene tags (EGTs) assigned to FeFe-hydrogenases. The majority of EGTs representing FeFe-hydrogenase genes were affiliated with the class Clostridia, suggesting that this group is the main hydrogen producer in the MWWTP analyzed. From the assembled sequences, three FeFe-hydrogenase genes were predicted through detection of the L2 motif (MPCxxKxxE) in the encoded gene product, confirming true FeFe-hydrogenase sequences. These sequences were used to design specific primers to detect fosmids carrying the FeFe-hydrogenase genes predicted from the dataset. Three identified fosmids were completely sequenced. The cloned genomic fragments within these fosmids are closely related to members of the Spirochaetaceae, Bacteroidales and Firmicutes, and their FeFe-hydrogenase sequences are characterized by the structure type M3, which is common to clostridial enzymes. The FeFe-hydrogenase sequences found in this study represent hitherto undetected sequences, indicating high genetic diversity of these enzymes in MWWTPs. The results suggest that MWWTPs should be considered reservoirs of new FeFe-hydrogenase genes.
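The motif check used to confirm candidate sequences is straightforward to express as a pattern match: the L2 motif MPCxxKxxE, with x standing for any residue. The protein sequence in the sketch below is made up for illustration.

```python
import re

# L2 motif MPCxxKxxE, where '.' matches any amino-acid residue.
L2_MOTIF = re.compile(r"MPC..K..E")

def has_l2_motif(protein_seq: str) -> bool:
    return bool(L2_MOTIF.search(protein_seq.upper()))

example = "MKTAYIAMPCGAKLVEQWLND"   # hypothetical sequence containing the motif
print(has_l2_motif(example))        # -> True
```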
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física.
Abstract:
INTRODUCTION: Open access publishing is becoming increasingly popular within the biomedical sciences. SciELO, the Scientific Electronic Library Online, is a digital library covering a selected collection of Brazilian scientific journals, many of which provide open access to full-text articles. This library includes a number of dental journals, some of which may include reports of clinical trials in English, Portuguese and/or Spanish. Thus, SciELO could play an important role as a source of evidence for dental healthcare interventions, especially if it yields a sizeable number of high-quality reports. OBJECTIVE: The aim of this study was to identify reports of clinical trials by handsearching dental journals accessible through SciELO, and to assess the overall quality of these reports. MATERIAL AND METHODS: Electronic versions of six Brazilian dental journals indexed in SciELO were handsearched at www.scielo.br in September 2008. Reports of clinical trials were identified and classified as controlled clinical trials (CCTs - prospective, experimental studies comparing 2 or more healthcare interventions in human beings) or randomized controlled trials (RCTs - a random allocation method is clearly reported), according to Cochrane eligibility criteria. Criteria used to assess methodological quality included: method of randomization, concealment of treatment allocation, blinded outcome assessment, handling of withdrawals and losses, and whether an intention-to-treat analysis had been carried out. RESULTS: The search retrieved 33 CCTs and 43 RCTs. A majority of the reports provided no description of either the method of randomization (75.3%) or concealment of the allocation sequence (84.2%). Participants and outcome assessors were reported as blinded in only 31.2% of the reports. Withdrawals and losses were clearly described in only 6.5% of the reports, and none mentioned an intention-to-treat analysis or any similar procedure. CONCLUSIONS: The results of this study indicate that a substantial number of reports of trials and systematic reviews are available in the dental journals listed in SciELO, and that these could provide valuable evidence for clinical decision making. However, the quality of a number of these reports is of some concern, and improvement in the conduct and reporting of these trials could be achieved if authors adhered to internationally accepted guidelines, e.g. the CONSORT statement.