802 results for GPU computing
Abstract:
Introduction: Diet-therapy software packages currently constitute a basic tool in the dietary treatment of patients, whether from a physiological and/or pathological point of view. New technologies and research in this field have favoured the emergence of new dietetic-nutritional management applications that facilitate the management of the diet-therapy practice. Objectives: To comparatively study the main diet-therapy applications on the market, in order to give professional users in dietetics and nutrition criteria for selecting one of their principal tools. Results: From our point of view, dietopro.com is, together with some of the other diet-therapy applications analysed, one of the most complete for the management of a nutritional clinic. Conclusion: Depending on their needs, users have different dietetic software packages to choose from. We conclude that the selection of one or another depends on the needs of the professional.
Abstract:
Visualization of vector fields plays an important role in research activities nowadays. Web applications allow fast, multi-platform and multi-device access to data, which results in the need for applications optimized to run on both high-performance and low-performance devices. Point-trajectory calculation procedures usually perform repeated calculations, since several points might lie on the same trajectory. This paper presents a new methodology to calculate point trajectories over a highly dense and uniformly distributed grid of points, in which the trajectories are forced to pass through the points of the grid. Its advantages rely on a highly parallel computing-architecture implementation and on the reduction of the computational effort needed to calculate the stream paths, since unnecessary calculations are avoided by reusing data across iterations. As a case study, the visualization of oceanic currents on the web platform is presented and analyzed, using WebGL as both the parallel computing architecture and the rendering Application Programming Interface.
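The data-reuse idea can be conveyed with a minimal sketch (pure Python, with a hypothetical toy velocity field, not the paper's WebGL implementation): when trajectories are forced through grid points, every point's successor is another grid point, so paths that merge can share their remaining segments instead of being re-integrated.

```python
def next_point(p):
    """Hypothetical quantized advection step: move one cell along a toy
    velocity field, (vx, vy) = (1, 0) for y < 5 and (0, 1) otherwise,
    staying inside a 10x10 grid."""
    x, y = p
    nxt = (x + 1, y) if y < 5 else (x, y + 1)
    if nxt[0] > 9 or nxt[1] > 9:
        return None                      # trajectory leaves the domain
    return nxt

def trace_all(points):
    """Trace the trajectory of every seed point, memoizing the suffix
    that starts at each grid point so shared segments are computed once."""
    suffix = {}                          # grid point -> rest of its path

    def trace(p):
        if p in suffix:
            return suffix[p]             # reuse a previously computed path
        nxt = next_point(p)
        path = [p] + (trace(nxt) if nxt is not None else [])
        suffix[p] = path
        return path

    return {p: trace(p) for p in points}

paths = trace_all([(0, 0), (3, 0)])
# The seed (3, 0) reuses the suffix already computed while tracing (0, 0).
```

In the paper's setting the same memoization pays off massively on a dense grid, since almost every cell lies on some previously traced streamline.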
Abstract:
Master's dissertation, Informatics Engineering, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2015
Abstract:
There are different applications in engineering that require computing improper integrals of the first kind (integrals defined on an unbounded domain), such as: the work required to move an object from the surface of the Earth to infinity (kinetic energy), the electric potential created by a charged sphere, the probability density function or the cumulative distribution function in probability theory, the values of the Gamma function (which is useful to compute the Beta function, used to compute trigonometric integrals), and the Laplace and Fourier transforms (very useful, for example, in differential equations).
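As an illustrative pure-Python sketch (not from the abstract) of how such an integral over [0, ∞) can be evaluated numerically, the substitution x = t/(1-t) maps the unbounded domain onto [0, 1), after which a composite midpoint rule applies; the result is checked against Γ(5) = 4! = 24:

```python
import math

def improper_integral(f, n=10000):
    """Approximate the integral of f over [0, infinity) using the
    substitution x = t / (1 - t), which maps [0, 1) onto [0, infinity),
    followed by a composite midpoint rule (which avoids the endpoint t = 1)."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        x = t / (1.0 - t)
        total += f(x) / (1.0 - t) ** 2 * h   # dx = dt / (1 - t)^2
    return total

# Gamma(5) = integral of x^4 * e^(-x) over [0, infinity) = 4! = 24
approx = improper_integral(lambda x: x ** 4 * math.exp(-x))
```

The midpoint rule is chosen deliberately: it never evaluates the transformed integrand at t = 1, where the change of variables is singular.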
Abstract:
After a decade evolving in the High Performance Computing arena, GPU-equipped supercomputers have conquered the Top500 and Green500 lists, providing unprecedented levels of computational power and memory bandwidth. This year, major vendors have introduced new accelerators based on 3D memory, such as the Xeon Phi Knights Landing by Intel and the Pascal architecture by Nvidia. This paper reviews the hardware features of those new HPC accelerators and unveils their potential performance for scientific applications, with an emphasis on the Hybrid Memory Cube (HMC) and High Bandwidth Memory (HBM) used by commercial products according to roadmaps already announced.
Abstract:
Image and video compression play a major role in the world today, allowing the storage and transmission of large multimedia content volumes. However, the processing of this information requires high computational resources, hence improving the computational performance of these compression algorithms is very important. The Multidimensional Multiscale Parser (MMP) is a pattern-matching-based compression algorithm for multimedia contents, namely images, achieving high compression ratios while maintaining good image quality (Rodrigues et al. [2008]). However, in comparison with other existing algorithms, this algorithm takes some time to execute. Therefore, two parallel implementations for GPUs were proposed by Ribeiro [2016] and Silva [2015], in CUDA and OpenCL-GPU, respectively. In this dissertation, to complement the referred work, we propose two parallel versions that run the MMP algorithm on the CPU: one resorting to OpenMP and another that converts the existing OpenCL-GPU code into OpenCL-CPU. The proposed solutions are able to improve the computational performance of MMP by factors of 3 and 2.7, respectively. High Efficiency Video Coding (HEVC/H.265) is the most recent standard for the compression of image and video. Its impressive compression performance makes it a target for many adaptations, particularly for holoscopic image/video (or light field) processing. Some of the proposed modifications to encode this new multimedia content are based on geometry-based disparity compensation (SS), developed by Conti et al. [2014], and a Geometric Transformations (GT) module, proposed by Monteiro et al. [2015]. These compression algorithms for holoscopic images based on HEVC implement a specific search for similar micro-images that is more efficient than the one performed by HEVC, but their implementation is considerably slower than HEVC.
In order to enable better execution times, we chose the OpenCL API as the GPU-enabling language to increase the module's performance. In its most costly setting, we are able to reduce the GT module's execution time from 6.9 days to less than 4 hours, effectively attaining a speedup of 45×.
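As a quick sanity check on the figures above (illustrative arithmetic only, not from the dissertation), the reported speedup is consistent with the reported times:

```python
# A 45x speedup applied to 6.9 days of execution time lands under 4 hours.
days_before = 6.9
speedup = 45.0
hours_after = days_before * 24.0 / speedup   # roughly 3.7 hours
```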
Abstract:
Acute Coronary Syndrome (ACS) affects a broad and heterogeneous set of human beings and poses a serious diagnosis and risk stratification problem. Although different tools, such as biomarkers, are available for the diagnosis and prognosis of ACS, they have to be previously evaluated and validated in different scenarios and patient cohorts. Besides ensuring that a diagnosis is correct, attention should also be directed to ensuring that therapies are applied correctly and safely. Indeed, this work focuses on the development of a diagnosis decision support system in terms of its knowledge representation and reasoning mechanisms, given here as a formal framework based on Logic Programming, complemented with a problem-solving methodology anchored on Artificial Neural Networks. On the one hand, it caters for the evaluation of the predisposing risk of ACS and the respective Degree-of-Confidence that one has in such a happening. On the other hand, it may be seen as a major development of Multi-Value Logics for understanding phenomena and one's behavior. Undeniably, the proposed model allows for an improvement of the diagnosis process, properly classifying the patients that presented the pathology (sensitivity ranging from 89.7% to 90.9%) as well as the absence of ACS (specificity ranging from 88.4% to 90.2%).
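The two figures quoted in the abstract can be made concrete with a minimal sketch (not the authors' code; the counts below are made up for illustration): sensitivity and specificity computed from a confusion matrix of predicted versus actual diagnoses.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of true ACS cases caught.
    Specificity = TN / (TN + FP): fraction of healthy cases cleared."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical evaluation counts, chosen only to mirror the quoted ranges.
sens, spec = sensitivity_specificity(tp=90, fn=10, tn=89, fp=11)
# sens = 0.90, spec = 0.89
```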
Abstract:
Plants of the genus Schinus are native to South America and were introduced into Mediterranean countries a long time ago. Some Schinus species have been used in folk medicine, and essential oils of Schinus spp. (EOs) have been reported to have antimicrobial, anti-tumoural and anti-inflammatory properties. Such assets are related to the EOs' chemical composition, which depends largely on the species, the geographic and climatic region, and the part of the plants used. Considering the difficulty of inferring the pharmacological properties of the EOs of Schinus species without a hard experimental setting, this work focuses on the development of an Artificial Intelligence grounded Decision Support System to predict pharmacological properties of Schinus EOs. The computational framework was built on top of a Logic Programming Case-Based approach to knowledge representation and reasoning, which caters for the handling of incomplete, unknown, or even self-contradictory information. New clustering methods centered on an analysis of attribute similarities were used to distinguish and aggregate historical data according to the context under which it was added to the Case Base, thereby enhancing the prediction process.
Abstract:
A Flood Vulnerability Index (FloodVI) was developed using Principal Component Analysis (PCA) and a new aggregation method based on Cluster Analysis (CA). PCA simplifies a large number of variables into a few uncorrelated factors representing the social, economic, physical and environmental dimensions of vulnerability. CA groups areas that have the same characteristics in terms of vulnerability into vulnerability classes. The grouping of the areas determines their classification, contrary to other aggregation methods in which the areas' classification determines their grouping. While other aggregation methods distribute the areas into classes in an artificial manner, by imposing a certain probability for an area to belong to a certain class under the assumption that the aggregation measure used is normally distributed, CA does not constrain the distribution of the areas among the classes. FloodVI was designed at the neighbourhood level and was applied to the Portuguese municipality of Vila Nova de Gaia, where several flood events have taken place in the recent past. The FloodVI sensitivity was assessed using three different aggregation methods: the sum of component scores, the first component score and the weighted sum of component scores. The results highlight the sensitivity of the FloodVI to different aggregation methods. Both the sum of component scores and the weighted sum of component scores show similar results. The first component score aggregation method classifies almost all areas as having medium vulnerability, while the results obtained using CA show a distinct differentiation of the vulnerability, where hot spots can be clearly identified. The information provided by records of previous flood events corroborates the results obtained with CA, because the inundated areas with greater damages are those identified as high and very high vulnerability areas by CA. This supports the fact that CA provides a reliable FloodVI.
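The two aggregation styles compared above can be sketched on toy data (pure Python, not the FloodVI dataset; weights and scores below are invented): a weighted sum of component scores per area, followed by a grouping step, here a simple 1-D k-means standing in for the cluster analysis, in which the grouping itself determines the classification.

```python
def weighted_sum(scores, weights):
    """Composite index as a weighted sum of component scores."""
    return sum(s * w for s, w in zip(scores, weights))

def kmeans_1d(values, k, iters=50):
    """Tiny 1-D k-means: returns a class label per value."""
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(v - centers[j]))
                  for v in values]
        for j in range(k):
            members = [v for v, lab in zip(values, labels) if lab == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

# Hypothetical component scores for six neighbourhood areas:
# social, economic, physical, environmental.
weights = [0.4, 0.3, 0.2, 0.1]
areas = [[0.9, 0.8, 0.7, 0.9], [0.1, 0.2, 0.1, 0.0],
         [0.8, 0.9, 0.8, 0.7], [0.2, 0.1, 0.2, 0.3],
         [0.5, 0.5, 0.4, 0.6], [0.15, 0.1, 0.2, 0.1]]
indices = [weighted_sum(a, weights) for a in areas]
labels = kmeans_1d(indices, k=3)
# The class boundaries emerge from the data itself, rather than from an
# assumed (e.g. normal) distribution of the composite index.
```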
Abstract:
The AntiPhospholipid Syndrome (APS) is an acquired autoimmune disorder induced by high levels of antiphospholipid antibodies that cause arterial and venous thrombosis, as well as pregnancy-related complications and morbidity, as clinical manifestations. This autoimmune hypercoagulable state, usually known as Hughes syndrome, has severe consequences for the patients, being one of the main causes of thrombotic disorders and death. Therefore, it is important to be preventive and aware of how probable it is to develop this syndrome. Despite the updates to the antiphospholipid syndrome classification, the diagnosis remains difficult to establish. Additional research on clinically relevant antibodies and standardization of their quantification are required in order to improve the antiphospholipid syndrome risk assessment. Thus, this work focuses on the development of a diagnosis decision support system in terms of a formal agenda built on a Logic Programming approach to knowledge representation and reasoning, complemented with a computational framework based on Artificial Neural Networks. The proposed model allows for improving the diagnosis, properly classifying the patients who actually present this pathology (sensitivity higher than 85%), as well as the absence of APS (specificity close to 95%).
Abstract:
Internet of Things systems are pervasive systems that have evolved from cyber-physical to large-scale systems. Due to the number of technologies involved, software development faces several integration challenges. Among them, the ones most hindering proper integration are those related to system heterogeneity, and thus to interoperability issues. From a software engineering perspective, developers mostly experience the lack of interoperability in two phases of software development: programming and deployment. On the one hand, modern software tends to be distributed across several components, each adopting its most appropriate technology stack, pushing programmers to code in a protocol- and data-agnostic way. On the other hand, each software component should run in the most appropriate execution environment and, as a result, system architects strive to automate deployment in distributed infrastructures. This dissertation aims to improve the development process by introducing proper tools to handle certain aspects of system heterogeneity. Our effort focuses on three of these aspects and, for each of them, we propose a tool addressing the underlying challenge. The first tool aims to handle heterogeneity at the transport- and application-protocol level, the second to manage different data formats, and the third to obtain optimal deployments. To realize the tools, we adopted a linguistic approach, i.e., we provided specific linguistic abstractions that help developers increase the expressive power of the programming language they use, writing better solutions in more straightforward ways. To validate the approach, we implemented use cases to show that the tools can be used in practice and that they help to achieve the expected level of interoperability.
In conclusion, to move a step towards the realization of an integrated Internet of Things ecosystem, we target programmers and architects and propose that they use the presented tools to ease the software development process.
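What "protocol-agnostic" coding means in practice can be sketched as follows (a hypothetical API for illustration, not the dissertation's linguistic abstractions): business logic talks to an abstract transport, and concrete bindings (HTTP, MQTT, an in-memory stub for tests) are swapped in without touching that logic.

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Abstract transport: application code depends only on this interface."""
    @abstractmethod
    def send(self, topic: str, payload: str) -> None: ...
    @abstractmethod
    def receive(self, topic: str) -> str: ...

class InMemoryTransport(Transport):
    """Stand-in for a real protocol binding, useful for testing."""
    def __init__(self):
        self._queues = {}
    def send(self, topic, payload):
        self._queues.setdefault(topic, []).append(payload)
    def receive(self, topic):
        return self._queues.get(topic, []).pop(0)

def report_temperature(transport: Transport, celsius: float) -> None:
    """Protocol-agnostic application code: it never names HTTP or MQTT."""
    transport.send("sensors/temperature", f"{celsius:.1f}")

bus = InMemoryTransport()
report_temperature(bus, 21.57)
```

A real HTTP or MQTT binding would implement the same two methods, so `report_temperature` runs unchanged across protocols.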
Abstract:
Early definitions of the Smart Building focused almost entirely on the technology aspect and did not suggest user interaction at all; indeed, today we would attribute them more to the concept of the automated building. In this sense, control of comfort conditions inside buildings is a well-investigated problem, since it has a direct effect on users' productivity and an indirect effect on energy saving. From the users' perspective, a typical environment can be considered comfortable if it is capable of providing adequate thermal comfort, visual comfort, indoor air quality and acoustic comfort. In recent years, the scientific community has dealt with many challenges, especially from a technological point of view. For instance, smart sensing devices, the internet, and communication technologies have enabled a new paradigm called Edge computing, which brings computation and data storage closer to the location where they are needed, to improve response times and save bandwidth. This has allowed us to improve services, sustainability and decision making. Many solutions have been implemented, such as smart classrooms, control of the thermal condition of the building, and monitoring of HVAC data for the energy efficiency of the campus. Though these projects contribute to the realization of a smart campus, a framework for the smart campus is yet to be determined. These new technologies have also introduced new research challenges: within this thesis work, some of the principal open challenges are faced, proposing a new conceptual framework, technologies and tools to move the actual implementation of smart campuses forward. With this in mind, several problems known in the literature have been investigated: occupancy detection, noise monitoring for acoustic comfort, context awareness inside the building, indoor wayfinding, strategic deployment for air quality, and book preservation.
Abstract:
The present Thesis reports on the various research projects to which I have contributed during my PhD period, working with several research groups, and whose results have been communicated in a number of scientific publications. The main focus of my research activity was to learn, test, exploit and extend the recently developed vdW-DFT (van der Waals corrected Density Functional Theory) methods for computing the structural, vibrational and electronic properties of ordered molecular crystals from first principles. A secondary, and more recent, research activity has been the analysis with microelectrostatic methods of Molecular Dynamics (MD) simulations of disordered molecular systems. While only very unreliable methods based on empirical models were practically usable until a few years ago, accurate calculations of the crystal energy are now possible, thanks to very fast modern computers and to the excellent performance of the best vdW-DFT methods. Accurate energies are particularly important for describing organic molecular solids, since they often exhibit several alternative crystal structures (polymorphs), with very different packing arrangements but very small energy differences. Standard DFT methods do not describe the long-range electron correlations which give rise to the vdW interactions. Although weak, these interactions are extremely sensitive to the packing arrangement, and neglecting them used to be a problem. The calculation of reliable crystal structures and vibrational frequencies has become possible only recently, thanks to the development of some good representations of the vdW contribution to the energy (known as "vdW corrections").
Abstract:
Safe collaboration between a robot and a human operator is a critical requirement for deploying a robotic system into a manufacturing and testing environment. In this dissertation, the safety requirement is developed and implemented for the navigation system of the mobile manipulators. A methodology for human-robot co-existence through a 3D scene analysis is also investigated. The proposed approach exploits the advance in computing capability by relying on graphics processing units (GPUs) for volumetric predictive human-robot contact checking. Apart from guaranteeing the safety of operators, human-robot collaboration is also fundamental when cooperative activities are required, as on an appliance test automation floor. To achieve this, a generalized hierarchical task controller scheme for collision avoidance is developed. This allows the robotic arm to safely approach and inspect the interior of the appliance without collision during the testing procedure. The unpredictable presence of the operators also forms a dynamic obstacle that changes very quickly, thereby requiring a fast reaction from the robot side. In this respect, a GPU-accelerated distance field is computed to speed up the reaction time needed to avoid collisions between the human operator and the robot. Automated appliance testing also involves robotized laundry loading and unloading during life-cycle testing. This task involves laundry detection, grasp pose estimation and manipulation in a container, inside the drum and during recovery grasping. Wrinkle and blob detection algorithms for grasp pose estimation are developed, and grasp poses are calculated along the wrinkles and blobs to perform the grasping task efficiently. By ranking the estimated laundry grasp poses according to a predefined cost function, the robotic arm attempts the grasp poses that are more comfortable from the robot's kinematic side as well as collision-free on the appliance side.
This is achieved through appliance detection, full-model registration and collision-free trajectory execution using online collision avoidance.
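The distance-field idea mentioned above can be illustrated with a tiny brute-force CPU sketch (pure Python, toy 2-D grid; the dissertation computes this on the GPU): precompute, for every cell, the distance to the nearest obstacle, so a clearance check at run time is a single lookup instead of a geometric query.

```python
def distance_field(width, height, obstacles):
    """Brute-force Euclidean distance transform over a grid: for each
    cell, store the distance to the nearest obstacle cell."""
    field = {}
    for x in range(width):
        for y in range(height):
            field[(x, y)] = min(
                ((x - ox) ** 2 + (y - oy) ** 2) ** 0.5
                for ox, oy in obstacles)
    return field

# One obstacle cell; a robot point is "safe" if its clearance exceeds a margin.
field = distance_field(8, 8, obstacles=[(4, 4)])
safe = field[(0, 0)] > 2.0      # run-time clearance check is an O(1) lookup
```

The GPU version parallelizes naturally because every cell's value is computed independently, which is what makes the fast reaction times possible.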
Abstract:
We report on student and staff perceptions of synchronous online teaching and learning sessions in mathematics and computing. The study is based on two surveys of students and tutors conducted 5 years apart, and focusses on the educational experience as well as societal and accessibility dimensions. Key conclusions are that both staff and students value online sessions, to supplement face-to-face sessions, mainly for their convenience, but interaction within the sessions is limited. Students find the recording of sessions particularly helpful in their studies.