954 results for Computational tools
Abstract:
Work presented in the context of the European Master's program in Computational Logic, in partial fulfillment of the requirements for the degree of Master of Science in Computational Logic.
Abstract:
The aim of this article is to identify patterns in the spatial distribution of dengue fever cases that occurred in the municipality of Cruzeiro, State of São Paulo, in 2006. This is an ecological, exploratory study that uses spatial analysis tools to prepare thematic maps from Sinan-Net data. The analysis was carried out by area, taking the IBGE census tract as the unit, and covered the four months of 2006 in which the disease occurred in the city. The thematic maps were built with the TerraView 3.3.1 software, which also provided the monthly values of the Global Moran index (I_M) and the Kernel estimation. In 2006, 691 dengue cases were georeferenced (a rate of 864.2 cases/100,000 inhabitants); the Moran indices and p-values obtained were I_M = 0.080 (March), p = 0.11; I_M = 0.285 (April), p = 0.01; I_M = 0.201 (May), p = 0.01; and I_M = 0.002 (June), p = 0.57. The first cases were identified in the Northeast and Central areas of Cruzeiro and the most recent cases in the North, Northeast and Central areas. It was possible to identify the census tracts where the epidemic began and how it unfolded over time and space in the city.
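For reference, the Global Moran index (I_M) reported above is the standard measure of spatial autocorrelation; a common formulation (not quoted from the article, and with the weighting scheme left as an assumption) is, in LaTeX notation:

% Global Moran's I over n areal units (here, IBGE census tracts);
% x_i is the dengue count or rate in tract i, \bar{x} the mean across tracts,
% and w_{ij} a spatial weight, e.g. 1 if tracts i and j are contiguous and 0 otherwise.
I_M = \frac{n}{\sum_{i}\sum_{j} w_{ij}} \cdot
      \frac{\sum_{i}\sum_{j} w_{ij}\,(x_i - \bar{x})(x_j - \bar{x})}
           {\sum_{i}(x_i - \bar{x})^{2}}

Values close to zero suggest spatial randomness, while significantly positive values suggest clustering, consistent with the significant indices reported for April and May.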
Abstract:
Dissertation submitted to obtain the Master's Degree in Biomedical Engineering
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
Dissertation submitted to obtain the Master's Degree in Molecular Genetics and Biomedicine
Abstract:
Dissertation submitted to obtain the Master's Degree in Biotechnology
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
As an introduction to a series of articles focused on the exploration of particular tools and/or methods that bring together digital technology and historical research, the aim of this paper is mainly to highlight and discuss to what extent those methodological approaches can improve the analytical and interpretative capabilities available to historians. At a moment when the digital world presents us with an ever-increasing variety of tools for extracting, analyzing and visualizing large amounts of text, we thought it relevant to bring the digital closer to the broad community of academic historians. Rather than repeating the idea of a digital revolution in historical research, recurrent in the literature since the 1980s, the aim was to show the validity and usefulness of digital tools and methods as another set of highly relevant instruments that historians should consider. To this end, several case studies were used, combining the exploration of specific themes of historical knowledge with the development or discussion of digital methodologies, in order to highlight some changes and challenges that, in our opinion, are already affecting historians' work, such as a greater focus on interdisciplinarity and collaborative work, and the need for the communication of historical knowledge to become more interactive.
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology, Computational Biology.
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology.
Abstract:
The widespread use of mobile devices has made available to the general public capabilities that were hitherto confined to specialized devices. The smartphone, in particular, gives all users the ability to perform multiple tasks, among them taking photographs with the integrated cameras. Although these devices keep receiving better cameras, their manufacturers do not exploit their full potential, since the operating systems normally offer only simple capture APIs and applications. This mobile environment is therefore a good scenario in which to develop applications that help the user obtain a good result when shooting. In an attempt to provide a set of techniques and tools better suited to this task, this dissertation presents, as its contribution, a set of tools for mobile devices that provides real-time information on the composition of the scene before an image is captured. The proposed solution thus supports the user while capturing a scene with a mobile device: the user receives multiple suggestions on the composition of the scene, based on rules of photography and other aids useful to photographers. The tools include horizon detection and a graphical visualization of the color palette present in the scene being photographed. These tools were evaluated with respect to their implementation on mobile devices and to how users assess their usefulness.
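The abstract does not specify how the color-palette tool is computed; purely as an illustration, a dominant-palette extraction can be sketched with k-means clustering over the pixels of a frame. All names below are hypothetical and this is not the dissertation's implementation.

# Minimal sketch: extract a dominant color palette from an image with k-means.
# Assumes Pillow, NumPy and scikit-learn are available; "photo.jpg" is a placeholder path.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

def dominant_palette(path, n_colors=5):
    # Load the frame, downscale for speed, and flatten to a list of RGB pixels.
    img = Image.open(path).convert("RGB").resize((160, 120))
    pixels = np.asarray(img, dtype=np.float64).reshape(-1, 3)
    # Cluster the pixels; the cluster centers approximate the dominant colors.
    km = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels)
    return km.cluster_centers_.astype(int)  # n_colors x 3 array of RGB values

if __name__ == "__main__":
    print(dominant_palette("photo.jpg"))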
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging many integrated, computationally simple cores to perform parallel computations. The main novelty of the MIC architecture relative to GPUs is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for parallel programming of x86-based architectures, which may lead to a gentler learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition and inter-execution-flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple yet powerful and efficient skeleton framework for programming the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. We evaluate the newly developed framework with several well-known benchmarks, such as Saxpy and N-Body, comparing its performance not only against the existing framework when executing on the co-processor, but also between the Xeon Phi and a multi-GPU environment.
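The thesis itself targets OpenCL and C++ on the Xeon Phi; purely as an illustration of the skeleton idea, and using hypothetical names rather than Marrow's API, a map skeleton can be sketched in Python and applied to the Saxpy computation (y <- a*x + y) mentioned among the benchmarks.

# Minimal sketch of the algorithmic-skeleton concept, not the Marrow API:
# a "map" skeleton hides thread management and parallel decomposition from the caller.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def map_skeleton(fn, chunks, workers=4):
    # The skeleton owns resource management (the pool) and the parallel execution.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fn, chunks))

def saxpy(a, x, y, workers=4):
    # Split the vectors into chunks and let the skeleton process them in parallel.
    xs, ys = np.array_split(x, workers), np.array_split(y, workers)
    parts = map_skeleton(lambda xy: a * xy[0] + xy[1], list(zip(xs, ys)), workers)
    return np.concatenate(parts)

if __name__ == "__main__":
    x, y = np.arange(8.0), np.ones(8)
    print(saxpy(2.0, x, y))  # expected: [1, 3, 5, 7, 9, 11, 13, 15]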