795 results for 080504 Ubiquitous Computing
Abstract:
The Graphics Processing Unit (GPU) is present in almost every modern personal computer. Despite its special-purpose design, it has been increasingly used for general computations, with very good results. Hence, there is a growing effort from the community to seamlessly integrate this kind of device into everyday computing. However, to fully exploit the potential of a system comprising GPUs and CPUs, these devices should be presented to the programmer as a single platform. The efficient combination of the power of CPU and GPU devices is highly dependent on each device's characteristics, resulting in platform-specific applications that cannot be ported to different systems. Also, the most efficient work balance among devices is highly dependent on the computations to be performed and the respective data sizes. In this work, we propose a solution for heterogeneous environments based on the abstraction level provided by algorithmic skeletons. Our goal is to take full advantage of the power of all CPU and GPU devices present in a system, without the need for different kernel implementations or explicit work distribution. To that end, we extended Marrow, an algorithmic skeleton framework for multi-GPUs, to support CPU computations and efficiently balance the workload between devices. Our approach is based on an offline training execution that identifies the ideal work balance and platform configurations for a given application and input data size. The evaluation of this work shows that the combination of CPU and GPU devices can significantly boost the performance of our benchmarks in the tested environments when compared to GPU-only executions.
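To make the offline-training idea concrete, the following is a minimal, hypothetical Python sketch (not Marrow's actual API) of how a training run could time several CPU/GPU work splits for one application and input size and keep the best one; run_on_cpu and run_on_gpu are placeholder callables, not framework functions.

```python
# Hypothetical sketch of offline training for CPU/GPU work balancing:
# try several work splits, time each, and keep the fastest configuration.
import time
from concurrent.futures import ThreadPoolExecutor

def benchmark_split(run_on_cpu, run_on_gpu, data, cpu_fraction):
    """Time one execution with `cpu_fraction` of the data handled by the CPU."""
    cut = int(len(data) * cpu_fraction)
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=2) as pool:
        cpu_job = pool.submit(run_on_cpu, data[:cut])   # CPU partition
        gpu_job = pool.submit(run_on_gpu, data[cut:])   # GPU partition
        cpu_job.result(), gpu_job.result()
    return time.perf_counter() - start

def offline_training(run_on_cpu, run_on_gpu, data, steps=10):
    """Return the CPU share of the work that minimises the measured run time."""
    candidates = [i / steps for i in range(steps + 1)]
    timings = {f: benchmark_split(run_on_cpu, run_on_gpu, data, f) for f in candidates}
    return min(timings, key=timings.get)
```

The chosen split would then be stored per application and data size, so later executions can reuse it without re-profiling.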
Abstract:
Breast cancer is the most common cancer among women and a major public health problem. Worldwide, X-ray mammography is the current gold standard for medical imaging of breast cancer. However, it has some well-known limitations: false-negative rates of up to 66% in symptomatic women and false-positive rates of up to 60% are a continued source of concern and debate. These drawbacks prompt the development of other imaging techniques for breast cancer detection, among which is Digital Breast Tomosynthesis (DBT). DBT is a 3D radiographic technique that reduces the obscuring effect of tissue overlap and appears to address both the false-negative and the false-positive issue. The 3D images in DBT are only achieved through image reconstruction methods. These methods play an important role in a clinical setting, since there is a need to implement a reconstruction process that is both accurate and fast. This dissertation deals with the optimization of iterative algorithms through parallel computing, implementing them on Graphics Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) to make the 3D reconstruction faster. Iterative algorithms have been shown to produce the highest-quality DBT images, but since they are computationally intensive, they are currently not adopted in clinical practice. These algorithms have the potential to reduce patient dose in DBT scans. A method of integrating CUDA into Interactive Data Language (IDL) is proposed in order to accelerate the DBT image reconstructions; this method has never been attempted before for DBT. In this work, the system matrix calculation, the most computationally expensive part of iterative algorithms, is accelerated. A speedup of 1.6 is achieved, demonstrating that GPUs can accelerate the IDL implementation.
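As an illustration of where the system matrix fits in an iterative reconstruction, here is a minimal NumPy sketch of a SIRT-style additive update; the abstract does not name the specific iterative algorithm used, so this is only an assumed example, not the dissertation's IDL/CUDA implementation.

```python
# Sketch of an iterative reconstruction that consumes the system matrix A.
# A maps the 3D volume x to the projection data b; computing A is the costly
# step that the dissertation offloads to the GPU.
import numpy as np

def sirt_reconstruct(A, b, iterations=20):
    """Reconstruct volume x from projections b using system matrix A (SIRT-style)."""
    x = np.zeros(A.shape[1])
    row_sums = A.sum(axis=1) + 1e-12        # per-projection normalisation
    col_sums = A.sum(axis=0) + 1e-12        # per-voxel normalisation
    for _ in range(iterations):
        residual = (b - A @ x) / row_sums   # weighted data mismatch
        x += (A.T @ residual) / col_sums    # back-project and update
        x = np.clip(x, 0, None)             # keep attenuation non-negative
    return x
```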
Abstract:
In the current context of innovation, a large number of studies have analysed the potential of the Open Innovation model. In this regard, Henry Chesbrough (2003), considered the father of Open Innovation, states that companies are experiencing a "paradigm shift" in the way they develop their innovation processes and commercialize technology and knowledge. The Open Innovation model thus argues that companies can and should use the resources available beyond their boundaries, this combination of internal and external ideas and technologies being crucial to achieving a leading position in the market. Chesbrough (2003) had already stated that innovation is not done in isolation, and the very dynamism of the current scenario reinforces this idea. Thus, the risks inherent in the innovation process can be mitigated through partnerships between companies and institutions. The adoption of the Open Innovation model rests on the abundance of available knowledge, which may also provide value to the company that created it, as in the case of patent licensing. The present study aimed to identify Open Innovation practices among the partnerships mentioned by Cloud Computing providers. Through Social Network Analysis, matrices were built from the partnerships mentioned by the companies and from information obtained in secondary sources (Sousa, 2012). These relationship matrices (networks) were analysed and represented through diagrams. In this way, it was possible to outline a panorama of the partnerships considered strategic by the interviewed companies and to identify which of them actually constitute Open Innovation practices. Of the 26 strategic partnerships mentioned in the interviews, only 11 were characterized as practices of the open model. The analysis of the practices conducted by the interviewed companies reveals some limitations in the exploitation of the Open Innovation model. Finally, some recommendations are made regarding the implementation of this model by small and medium-sized enterprises based on emerging technologies, such as the concept of cloud computing.
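As a hypothetical illustration of the Social Network Analysis step described above, a relationship matrix can be turned into a network diagram roughly as follows; the company names and the use of the networkx library are assumptions for illustration, not taken from the study.

```python
# Hypothetical sketch: build and draw a partnership network from a
# relationship (adjacency) matrix. Node names are invented placeholders.
import networkx as nx
import matplotlib.pyplot as plt

companies = ["ProviderA", "ProviderB", "University", "StartupC"]
matrix = [              # 1 marks a mentioned strategic partnership
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]

G = nx.Graph()
G.add_nodes_from(companies)
for i, row in enumerate(matrix):
    for j, linked in enumerate(row):
        if linked and i < j:            # avoid duplicate undirected edges
            G.add_edge(companies[i], companies[j])

nx.draw_networkx(G, with_labels=True)   # diagram of the partnership network
plt.show()
```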
Abstract:
This study discusses some fundamental issues that must be addressed so that the development and diffusion of services based on cloud computing can happen positively in several countries. To frame the subject, it discusses public initiatives by the countries most advanced in the application of cloud computing and the Brazilian position in this context. Based on the evidence presented here, it appears that the essential elements for the development and diffusion of cloud computing in Brazil have taken important steps and show signs of maturity, such as the cybercrime legislation. However, other elements still require analysis and adaptations specific to the cloud computing case, such as Intellectual Property Rights. Although broadband services are still lacking, one cannot disregard the government's effort to facilitate access for the whole of society. On the other hand, the large volume of the Brazilian IT market is a factor of interest for companies seeking to invest in the country.
Abstract:
In the following text I will develop three major aspects. The first is to draw attention to those that seem to have been the disciplinary fields where, despite everything, the Digital Humanities (in the broad perspective adopted here) have asserted themselves in a more comprehensive manner. I think it is here that I run the greater risks, not only because of what I have mentioned above, but also because a significant part, perhaps, of the achievements and of the researchers may have escaped the look that I sought to cast upon the past few decades, always influenced by my own experience and the work carried out in the field of History. But this can be considered a work in progress, open to criticism and suggestions. A second point to note is that emphasis will be given to the main lines of development in the relationship between historical research and digital methodologies, resources and tools. Finally, I will try to make a brief analysis of what the appropriation of the Digital Humanities discourse has been in recent years, with admittedly debatable data and methods, because studies are still scarce and little systematic information is available that would allow one to go beyond an introductory reflection.
Abstract:
This paper presents a proposal for a management model based on reliability requirements concerning Cloud Computing (CC). The proposal was based on a literature review focused on the problems, challenges and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment. This literature review examined the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for Cloud Computing environment". This is a proactive proposal that aims to organize and discuss management solutions for the CC environment, targeting improved reliability in the operation of IS applications, for both providers and their customers. Central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established involving the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location and other references related to the use of the service. In this process, the cloud service providers and the consumers themselves must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. The organization of these studies, and their dissemination in the market as a conceptual model able to establish parameters that regulate a reliable relationship between provider and user of IT services in the CC environment, is an interesting instrument to guide providers, developers and users towards delivering secure and reliable services and applications.
Abstract:
Business Intelligence (BI) can be seen as a method that gathers information and data from information systems in order to help companies be more accurate in their decision-making process. Traditionally, BI systems have been associated with the use of Data Warehouses (DW). The prime purpose of a DW is to serve as a repository that stores all the relevant information required for making the correct decision. The necessity to integrate streaming data became crucial with the need to improve the efficiency and effectiveness of the decision process. In primary and secondary education, there is a lack of BI solutions. Given the reality of schools, the main purpose of this study is to provide a Pervasive BI solution able to monitor school and student data anywhere and anytime, in real time, as well as to disseminate the information through ubiquitous devices. The first task consisted of gathering data regarding the different choices made by each student from enrolment in a given school year until the end of it. Thereafter, a dimensional model was developed so that a BI platform could be built. This paper presents the dimensional model, a set of pre-defined indicators, the Pervasive Business Intelligence characteristics and the prototype designed. The main contribution of this study was to offer schools a tool that could help them make accurate decisions in real time. Data dissemination was achieved through a localized application that can be accessed anywhere and anytime.
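For readers unfamiliar with dimensional modelling, the following is a minimal sketch of the star-schema idea the abstract refers to: a fact table of enrolment events linked to dimension tables, with one pre-defined indicator computed over it. Table names, column names and the indicator are invented placeholders, not the study's actual model.

```python
# Hypothetical star-schema sketch: dimensions + fact table, and one indicator
# (drop-out rate per school) computed over the fact table with pandas.
import pandas as pd

dim_school = pd.DataFrame({"school_id": [1, 2], "school_name": ["School A", "School B"]})
dim_year = pd.DataFrame({"year_id": [1], "school_year": ["2013/2014"]})
fact_enrolment = pd.DataFrame({
    "school_id":   [1, 1, 2, 2, 2],
    "year_id":     [1, 1, 1, 1, 1],
    "dropped_out": [0, 1, 0, 0, 1],
})

indicator = (fact_enrolment
             .merge(dim_school, on="school_id")
             .groupby("school_name")["dropped_out"]
             .mean()
             .rename("dropout_rate"))
print(indicator)
```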
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto.
Abstract:
Renal failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from the evaluation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may avoid or delay its progression, namely by reducing or preventing the development of some associated complications, such as hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to accomplish with traditional methodologies and existing problem-solving tools. Hence, in this work, we focus on the development of a hybrid decision support system, whose knowledge representation and reasoning procedures are based on Logic Programming, allowing one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, since the sensitivity and the specificity exhibited values ranging between 93.1–94.9% and 91.9–94.2%, respectively.
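For reference, sensitivity and specificity, the two metrics reported above, are computed from a confusion matrix as shown in this minimal sketch; the counts used here are invented placeholders, not the study's actual results.

```python
# How sensitivity and specificity are derived from confusion-matrix counts.
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) as fractions."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity, specificity

# Illustrative example only: 165 of 175 diseased cases detected,
# 360 of 383 healthy cases correctly cleared.
sens, spec = sensitivity_specificity(tp=165, fn=10, tn=360, fp=23)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```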
Abstract:
One of the central themes of the project concerns the nature of computer science. The recent emergence of this discipline, together with its hybrid origin as a formal science and a technological discipline, means that its characterization is still incomplete, let alone agreed upon among the scientists in the field. In the work "Three Paradigms of Computer Science" by A. Eden, three admittedly exaggerated positions are presented on how to understand the object of study (ontology), the working methods (methodology), and the structure of the theory and the justifications of computing knowledge (epistemology): the so-called rationalist position, based on the idea that programs are logical formulas and that the way of working is deductive; the technocratic position, which presents computer science as an engineering discipline; and the position there called scientific, which would assimilate computing to the empirical sciences. Some of the problems of computer science are related to questions in the philosophy of mathematics, in particular the relation between abstract entities and the world. However, the prescriptive character of the axioms and theorems of programming theories may allow alternative interpretations and would strongly question the possibility of regarding computer science as an empirical science, at least in the traditional sense. On the other hand, it is possible that the kind of analysis of computer science proposed in this project will contribute new ideas for thinking about problems in the philosophy of mathematics. An example of these possible contributions can be seen in Arkoudas' work "Computers, Justification, and Mathematical Knowledge", which sheds new light on the problem of the meaning of mathematical proofs. The objectives of the project are: to characterize the field of computer science; to evaluate the ontological, epistemological and methodological foundations of current computer science; and to analyze the relations between the different heuristic and epistemic perspectives and programming practices.
Abstract:
Monitoring, object-orientation, real-time, execution-time, scheduling
Abstract:
This project examines two distributed computing systems that differ from one another: Condor and BOINC. The possibilities of getting both systems to work together are explored, taking the most effective part of each system so that they complement each other.
Abstract:
Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models. The methods considered include value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally, while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than that obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
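To illustrate the first of the three methods, value function iteration, here is a minimal sketch applied to a toy deterministic growth problem solved on a grid; it is not the paper's fiscal-policy model, and the parameters and grid are invented for illustration.

```python
# Value function iteration on a toy growth problem:
# V(k) = max_{k'} [ log(c) + beta * V(k') ],  c = k**alpha + (1-delta)*k - k'.
import numpy as np

alpha, beta, delta = 0.36, 0.96, 0.1          # technology, discount, depreciation
k_grid = np.linspace(0.5, 10.0, 200)          # capital grid
V = np.zeros(len(k_grid))                     # initial guess for the value function

# Consumption and utility for every (k today, k tomorrow) pair on the grid.
c = k_grid[:, None] ** alpha + (1 - delta) * k_grid[:, None] - k_grid[None, :]
utility = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

for _ in range(1000):
    V_new = np.max(utility + beta * V[None, :], axis=1)   # Bellman update
    if np.max(np.abs(V_new - V)) < 1e-8:                  # sup-norm convergence
        break
    V = V_new

policy = k_grid[np.argmax(utility + beta * V[None, :], axis=1)]  # optimal k' choice
```

The same fixed-point loop carries over to richer settings (stochastic shocks, a risk-sensitive objective, or inequality constraints) by enlarging the state space and adjusting the Bellman operator, which is the ease-of-application point the abstract makes.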
Abstract:
Recent studies at high field (7 Tesla) have reported small metabolite changes, in particular of lactate and glutamate (below 0.3 μmol/g), during visual stimulation. These studies have been limited to the visual cortex because of its high energy metabolism and good magnetic resonance spectroscopy (MRS) sensitivity using a surface coil. The aim of this study was to extend functional MRS (fMRS) to investigate for the first time the metabolite changes during motor activation at 7 T. Small but sustained increases in lactate (0.17 ± 0.05 μmol/g, p < 0.001) and glutamate (0.17 ± 0.09 μmol/g, p < 0.005) were detected during motor activation, followed by a return to baseline after the end of activation. The present study demonstrates that the increases in lactate and glutamate during motor stimulation are small, but similar to those observed during visual stimulation. From the observed glutamate and lactate increases, we infer that these metabolite changes may be a general manifestation of increased neuronal activity. In addition, we propose that the measured increases in metabolite concentration imply an increase in ΔCMRO2 that is transiently below that of ΔCMRGlc during the first 1 to 2 min of stimulation.