53 results for Computer Systems
Abstract:
Transactional memory (TM) is a new synchronization mechanism devised to simplify parallel programming, thereby helping programmers to unleash the power of current multicore processors. Although software implementations of TM (STM) have been extensively analyzed in terms of runtime performance, little attention has been paid to an equally important constraint faced by nearly all computer systems: energy consumption. In this work we conduct a comprehensive study of energy and runtime tradeoffs in software transactional memory systems. We characterize the behavior of three state-of-the-art lock-based STM algorithms, along with three different conflict resolution schemes. As a result of this characterization, we propose a DVFS-based technique that can be integrated into the resolution policies so as to improve the energy-delay product (EDP). Experimental results show that our DVFS-enhanced policies are indeed beneficial for applications with high contention levels. Improvements of up to 59% in EDP can be observed in this scenario, with an average EDP reduction of 16% across the STAMP workloads. © 2012 IEEE.
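The energy-delay product used as the evaluation metric in this abstract is simply energy multiplied by runtime, so an optimization only pays off if its energy savings are not outweighed by a slowdown (or vice versa). A minimal sketch of the computation; the measurement numbers are illustrative, not taken from the paper:

```python
# Energy-delay product (EDP): a combined metric that penalizes both
# energy consumption and runtime; lower is better.
# The measurements below are illustrative, not from the paper.

def edp(energy_joules: float, runtime_seconds: float) -> float:
    """EDP = energy * delay."""
    return energy_joules * runtime_seconds

baseline = edp(energy_joules=120.0, runtime_seconds=10.0)  # 1200.0
tuned = edp(energy_joules=100.0, runtime_seconds=9.0)      # 900.0

# Relative EDP improvement of the tuned configuration.
improvement = (baseline - tuned) / baseline * 100
print(f"EDP improvement: {improvement:.0f}%")  # EDP improvement: 25%
```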
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
Graduate Program in Mechanical Engineering - FEG
Abstract:
Introduction: In the Web environment, greater care is needed in the processing of descriptive and thematic information. The concern with information retrieval in computer systems predates the development of the first personal computers. Information retrieval models have been, and still are, widely used in databases specific to a field whose scope is known. Objectives: To verify how relevance is treated in the main computational models of information retrieval and, especially, how the issue is addressed in the future of the Web, the so-called Semantic Web. Methodology: Bibliographical research. Results: In the classical models studied here, the main concern is retrieving documents whose description is closest to the search expression used by the user, which does not necessarily correspond to what the user really needs. Semantic retrieval relies on ontologies, a feature that extends the user's search to a wider range of potentially relevant options. Conclusions: Relevance is a subjective judgment inherent to the user; it depends on the interaction with the system and, above all, on what the user expects to retrieve in the search. Systems based on a relevance model are not popular, because they require greater interaction and depend on the user's willingness. The Semantic Web is, so far, the most efficient initiative for information retrieval in the digital environment.
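A representative of the classical retrieval models discussed above is the vector space model, which ranks documents by the cosine similarity of their TF-IDF vectors against the query. A minimal sketch, assuming a toy corpus and query (the texts below are illustrative, not from the study):

```python
import math
from collections import Counter

# Minimal vector-space retrieval sketch: rank documents by cosine
# similarity of TF-IDF vectors. Corpus and query are illustrative.

docs = [
    "semantic web uses ontologies",
    "information retrieval in databases",
    "ontologies extend the search",
]
query = "ontologies for information retrieval"

def tokenize(text):
    return text.lower().split()

vocab = sorted({t for d in docs for t in tokenize(d)})
n = len(docs)

def idf(term):
    # Smoothed inverse document frequency.
    df = sum(term in tokenize(d) for d in docs)
    return math.log((n + 1) / (df + 1)) + 1

def tfidf(text):
    tf = Counter(tokenize(text))
    return [tf[t] * idf(t) for t in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

qv = tfidf(query)
ranking = sorted(range(n), key=lambda i: cosine(tfidf(docs[i]), qv), reverse=True)
print(ranking)  # document indices, most similar to the query first
```

Note that ranking by surface similarity is exactly the limitation the abstract points out: the top document matches the query terms, which does not guarantee it matches the user's actual information need.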
Abstract:
Huge image collections are becoming available lately. In this scenario, the use of Content-Based Image Retrieval (CBIR) systems has emerged as a promising approach to support image searches. The objective of CBIR systems is to retrieve the most similar images in a collection, given a query image, by taking into account visual properties such as texture, color, and shape. In these systems, the effectiveness of the retrieval process depends heavily on the accuracy of ranking approaches. Recently, re-ranking approaches have been proposed to improve the effectiveness of CBIR systems by taking into account the relationships among images. Re-ranking approaches consider the relationships among all images in a given dataset and therefore typically demand a huge amount of computational power, which hampers their use in practical situations. On the other hand, these methods can be massively parallelized. In this paper, we propose to speed up the computation of the RL-Sim algorithm, a recently proposed image re-ranking approach, by using the computational power of Graphics Processing Units (GPUs). GPUs are emerging as relatively inexpensive parallel processors that are becoming available on a wide range of computer systems. We address the image re-ranking performance challenges by proposing a parallel solution designed to fit the computational model of GPUs. We conducted an experimental evaluation considering different implementations and devices. Experimental results demonstrate that significant performance gains can be obtained: our approach achieves speedups of 7x over the serial implementation for the overall algorithm and up to 36x on its core steps.
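The core step that re-ranking methods in the RL-Sim family parallelize is comparing the top-k ranked lists of pairs of images: two images are considered similar when their neighbor lists overlap heavily at every depth. A minimal sketch of one common list-overlap measure, assuming illustrative ranked lists (this is not the authors' implementation, and each pairwise score is independent, which is why the computation maps well to GPUs):

```python
# Hedged sketch of a ranked-list similarity measure of the kind used by
# re-ranking methods such as RL-Sim. The image ids below are illustrative.

def list_similarity(ranked_a, ranked_b, k):
    """Average overlap of the two top-k lists at depths 1..k (in [0, 1])."""
    score = 0.0
    for depth in range(1, k + 1):
        inter = len(set(ranked_a[:depth]) & set(ranked_b[:depth]))
        score += inter / depth
    return score / k

# Top-5 ranked lists of image ids for two query images (most similar first).
ra = [3, 7, 1, 9, 4]
rb = [3, 1, 7, 2, 8]

print(list_similarity(ra, rb, k=5))
print(list_similarity(ra, ra, k=5))  # identical lists score 1.0
```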
Abstract:
The paper presents the radiometric parameters determined by the medical physicist during the routine radiotherapy planning service in cases of breast cancer. The contours of the breast volume in patients undergoing breast tumor irradiation at the Hospital das Clínicas, Faculty of Medicine, UNESP, Botucatu (HCFMB) during 2012 were analyzed. To analyze the influence of physical and radiometric parameters on the dose distribution in the irradiated breast volume, four measurements of isodose curves were prepared at four different breast heights and compared with the isodose curves plotted computationally. In routine planning, the medical physicist must determine the isodose curve that gives the best dose distribution homogeneity in the irradiated volume. The treatment plan can be chosen with dedicated computer systems, but these require significantly costly investments and are available only to services with better financial support. In the Medical Physics Service of the Department of Radiotherapy, HCFMB, a two-dimensional software package is used to determine isodose curves; however, this software is out of date and frequently becomes inoperable due to lack of maintenance, and it is a closed system that computer professionals cannot modify. This requires manual preparation of isodose curves, which are subject to uncertainties due to the subjectivity in the clinical interpretation by the radiation oncologist and the medical physicist responsible for planning, besides demanding significant calculation time. The choice of the optimal isodose curve depends on the energy of the radiation beam and on the geometry and dimensions of the irradiated area. The breast contours evaluated in this work showed that, for a given input energy, such as the 1.25 MeV gamma radiation of the telecobalt therapy unit, the determination of the percentage depth dose (PDP) ...
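The percentage depth dose mentioned at the end of the abstract is a standard dosimetric quantity: the absorbed dose at a depth d expressed as a percentage of the dose at the depth of maximum dose. A minimal sketch of the definition; the dose values are illustrative, not clinical data from the study:

```python
# Percentage depth dose (PDP/PDD): dose at depth d as a percentage
# of the dose at the depth of maximum dose (d_max).
# Dose values below are illustrative, not clinical data.

def percentage_depth_dose(dose_at_depth: float, dose_at_dmax: float) -> float:
    return 100.0 * dose_at_depth / dose_at_dmax

# Example: 1.6 Gy measured at 10 cm depth vs 2.0 Gy at d_max.
print(percentage_depth_dose(1.6, 2.0))  # 80.0
```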