87 results for monitoring framework
Abstract:
Dissertation to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
Dissertation to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
Dissertation to obtain the degree of Doctor in Chemical and Biochemical Engineering, speciality in Biochemical Engineering
Abstract:
Dissertation to obtain the degree of Doctor in Environmental Engineering
Abstract:
The Graphics Processing Unit (GPU) is present in almost every modern personal computer. Despite its special-purpose design, it has been increasingly used for general computations, with very good results. Hence, there is a growing effort from the community to seamlessly integrate this kind of device into everyday computing. However, to fully exploit the potential of a system comprising GPUs and CPUs, these devices should be presented to the programmer as a single platform. The efficient combination of the power of CPU and GPU devices is highly dependent on each device's characteristics, resulting in platform-specific applications that cannot be ported to different systems. Moreover, the most efficient work balance among devices depends heavily on the computations to be performed and the respective data sizes. In this work, we propose a solution for heterogeneous environments based on the abstraction level provided by algorithmic skeletons. Our goal is to take full advantage of the power of all CPU and GPU devices present in a system, without the need for different kernel implementations or explicit work distribution. To that end, we extended Marrow, an algorithmic skeleton framework for multi-GPUs, to support CPU computations and efficiently balance the workload between devices. Our approach is based on an offline training execution that identifies the ideal work balance and platform configurations for a given application and input data size. The evaluation of this work shows that the combination of CPU and GPU devices can significantly boost the performance of our benchmarks in the tested environments when compared to GPU-only executions.
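To illustrate the offline-training idea described in this abstract, the sketch below searches candidate CPU/GPU splits and keeps the one with the lowest measured time. It is purely hypothetical and not Marrow's API; the timing function and device rates are invented stand-ins for real profiled executions.

```python
# Hypothetical sketch of an offline training pass that searches for the
# CPU/GPU work split minimising execution time for one input size.
# `time_partition` stands in for actually timing the skeleton under a split;
# the device rates below are invented, not measurements.

def time_partition(cpu_fraction: float, input_size: int) -> float:
    """Placeholder: run the workload with `cpu_fraction` of the work on
    the CPU and the rest on the GPU, returning the wall-clock time."""
    cpu_time = cpu_fraction * input_size / 2.0e9          # pretend CPU rate
    gpu_time = (1.0 - cpu_fraction) * input_size / 8.0e9  # pretend GPU rate
    return max(cpu_time, gpu_time)  # devices work in parallel

def train_best_split(input_size: int, steps: int = 20) -> float:
    """Exhaustively try candidate splits and keep the fastest one."""
    candidates = [i / steps for i in range(steps + 1)]
    return min(candidates, key=lambda f: time_partition(f, input_size))

if __name__ == "__main__":
    for n in (1 << 20, 1 << 24, 1 << 28):
        print(f"input={n}: best CPU fraction = {train_best_split(n):.2f}")
```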
Abstract:
After almost a decade of experience applying model-driven approaches to system development, the reported productivity gains from using models and model transformations to develop entire systems are already undeniable benefits of this approach. However, the slowness of higher-level, rule-based model transformation languages hinders the applicability of this approach at industrial scales. Lower-level, efficient languages can be used instead, but then productivity and easy maintenance cease to exist. The abstraction penalty problem is not new: it also exists for high-level, object-oriented languages, yet everyone is using them now. Why, then, is not everyone using rule-based model transformation languages? In this thesis, we propose a framework, comprising a language and its respective environment, designed to tackle the most performance-critical operation of high-level model transformation languages: pattern matching. This framework shows that it is possible to mitigate the performance penalty while still using high-level model transformation languages.
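As a rough illustration of what the performance-critical pattern matching operation involves, the following minimal sketch enumerates candidate element combinations against a rule's type and link constraints. The model representation and rule are invented for illustration and do not reflect the thesis's actual language.

```python
# Minimal, hypothetical sketch of rule-based pattern matching over a model.
# A model is a set of typed elements plus typed references between them;
# a pattern asks for element combinations satisfying type and link constraints.

from itertools import product

model = {
    "elements": {"c1": "Class", "c2": "Class", "a1": "Attribute"},
    "links": {("c1", "a1"): "owns", ("c1", "c2"): "extends"},
}

def match_class_with_attribute(model):
    """Find all (class, attribute) pairs where the class owns the attribute."""
    classes = [e for e, t in model["elements"].items() if t == "Class"]
    attrs = [e for e, t in model["elements"].items() if t == "Attribute"]
    for c, a in product(classes, attrs):      # naive O(|C| * |A|) search
        if model["links"].get((c, a)) == "owns":
            yield (c, a)

print(list(match_class_with_attribute(model)))  # [('c1', 'a1')]
```

The naive cross-product enumeration shown here is exactly the cost that grows with model size and that such a framework aims to mitigate.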
Abstract:
During drilling operations, cuttings are produced downhole and must be removed to avoid issues that can lead to Non-Productive Time (NPT). Most stuck-pipe events, and the consequent loss of the Bottom-Hole Assembly (BHA), are related to hole cleaning. Many parameters help determine hole cleaning conditions, but a proper selection of the key parameters facilitates monitoring hole cleaning conditions and planning interventions. The aim of hole cleaning monitoring is to keep track of borehole conditions, including hole cleaning efficiency and wellbore stability issues, during drilling operations. Adequate hole cleaning is one of the main concerns in underbalanced drilling operations, especially for directional and horizontal wells. This dissertation addresses some hole cleaning fundamentals, which act as the basis for recommended practice during drilling operations: understanding how parameters such as flow rate, rotations per minute (RPM), rate of penetration (ROP) and mud weight are useful to improve hole cleaning performance, and how Equivalent Circulating Density (ECD), Torque & Drag (T&D) and the cuttings volumes coming from downhole help indicate how clean and stable the well is. In the case study, hole cleaning performance (cuttings volume removal) monitoring is based on real-time measurements of the cuttings volume removed from downhole at a given time, taking into account flow rate, RPM, ROP and drilling fluid (mud) properties; this volume is then plotted and compared to the expected drilled volume. ECD monitoring indicates hole stability conditions, while T&D and the cuttings volume coming from downhole indicate how clean the well is. T&D modeling software provides theoretically calculated T&D trends, which are plotted and compared to the real-time measurements; it uses the measured hookloads to perform a back-calculation of friction factors along the wellbore.
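The comparison between measured and expected cuttings volume rests on simple hole geometry. Below is a minimal sketch assuming an in-gauge hole and illustrative numbers; real monitoring must also account for washouts, cuttings transport lag and mud properties.

```python
# Hedged sketch of the expected-cuttings-volume comparison described above.
# Assumes an in-gauge hole; all figures are illustrative only.

import math

def expected_cuttings_volume(bit_diameter_in: float,
                             rop_ft_per_hr: float,
                             hours: float) -> float:
    """Volume of rock drilled (bbl) = hole cross-section * drilled length."""
    radius_ft = (bit_diameter_in / 12.0) / 2.0
    drilled_ft = rop_ft_per_hr * hours
    volume_ft3 = math.pi * radius_ft ** 2 * drilled_ft
    return volume_ft3 / 5.615  # 1 bbl = 5.615 ft^3

# Example: 8.5" hole, 60 ft/h ROP, one hour of drilling.
expected = expected_cuttings_volume(8.5, 60.0, 1.0)   # about 4.2 bbl
measured = 3.5  # bbl, hypothetical shale-shaker measurement
print(f"expected {expected:.1f} bbl, measured {measured:.1f} bbl")
print("hole cleaning OK" if measured >= 0.8 * expected
      else "cuttings accumulating downhole")
```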
Abstract:
Despite recent progress in robotics, autonomous robots still have too many limitations to reliably help people with disabilities. On the other hand, animals, and especially dogs, have already demonstrated great skill in assisting people in many daily situations. However, dogs also have their own set of limitations: for example, they need to rest periodically and to stay healthy (physically and psychologically), and it is difficult to control them remotely. This project aims to "augment" the assistance dog by developing a system that compensates for some of the dog's weaknesses through a robotic device mounted on the dog's harness. This specific study, part of the COCHISE project, focuses on the development of a system for monitoring the dog's activity and physiological parameters.
Abstract:
Cloud computing has been one of the most important topics in Information Technology, aiming to assure scalable and reliable on-demand services over the Internet. Expanding the application scope of cloud services requires cooperation between clouds from different providers with heterogeneous functionalities. Such collaboration between cloud vendors can provide better Quality of Service (QoS) at a lower price. However, current cloud systems have been developed without concern for seamless cloud interconnection, and they do not actually support the inter-cloud interoperability needed to enable collaboration between cloud service providers. Hence, this PhD work addresses the interoperability issue between cloud providers as a challenging research objective. This thesis proposes a new framework that supports inter-cloud interoperability in a heterogeneous cloud computing-resource environment, with the goal of dispatching the workload to the most effective clouds available at runtime. Analysing the different methodologies that have been applied to resolve various problem scenarios related to interoperability led us to exploit Model-Driven Architecture (MDA) and Service-Oriented Architecture (SOA) methods as appropriate approaches for our inter-cloud framework. Moreover, since distributing operations in a cloud-based environment is a nondeterministic polynomial time (NP-complete) problem, a Genetic Algorithm (GA) based job scheduler is proposed as part of the interoperability framework, offering workload migration with the best performance at the least cost. A new Agent-Based Simulation (ABS) approach is proposed to model the inter-cloud environment with three types of agents: Cloud Subscriber agents, Cloud Provider agents, and Job agents. The ABS model is used to evaluate the proposed framework.
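As a rough illustration of the GA-based scheduling idea, the sketch below evolves job-to-cloud assignments against an invented fitness that trades per-cloud makespan against total price. The cost figures, operators and parameters are assumptions for illustration, not the thesis's actual design.

```python
# Hypothetical minimal GA sketch for inter-cloud job dispatching: each gene
# assigns one job to a cloud; fitness trades makespan against total price.

import random

N_JOBS = 12
CLOUDS = ["cloudA", "cloudB", "cloudC"]
TIME = {"cloudA": 4.0, "cloudB": 2.5, "cloudC": 6.0}   # time per job (invented)
PRICE = {"cloudA": 1.0, "cloudB": 3.0, "cloudC": 0.5}  # price per job (invented)

def fitness(plan):
    """Lower is better: per-cloud makespan plus weighted total price."""
    load = {c: 0.0 for c in CLOUDS}
    for c in plan:
        load[c] += TIME[c]
    return max(load.values()) + 0.5 * sum(PRICE[c] for c in plan)

def evolve(pop_size=30, generations=50, mutation_rate=0.1):
    pop = [[random.choice(CLOUDS) for _ in range(N_JOBS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_JOBS)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:  # point mutation
                child[random.randrange(N_JOBS)] = random.choice(CLOUDS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 2))
```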
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging many integrated cores of low computational power to perform parallel computations. The main novelty of the MIC architecture, relative to GPUs, is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a smaller learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition and inter-execution-flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for programming the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used not only to compare its performance against the existing framework when executing on the co-processor, but also to assess its performance on the Xeon Phi versus a multi-GPU environment.
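To illustrate the skeleton abstraction the thesis advocates, here is a minimal map-skeleton sketch: the programmer supplies only the per-element function, while decomposition and result reassembly stay hidden. The names and structure are invented for illustration and do not reflect Marrow's actual API.

```python
# Illustrative "map" algorithmic skeleton: decomposition across workers and
# result reassembly are hidden inside the skeleton; only `fn` is user code.

from concurrent.futures import ThreadPoolExecutor

def map_skeleton(fn, data, workers=4):
    """Split `data` into chunks, apply `fn` in parallel, reassemble in order."""
    chunk = max(1, len(data) // workers)
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda part: [fn(x) for x in part], parts))
    return [y for part in results for y in part]

# Saxpy-like use: only the per-element computation is user code.
a = 2.0
xs = list(range(8))
ys = list(range(8, 16))
print(map_skeleton(lambda i: a * xs[i] + ys[i], list(range(8))))
# -> [8.0, 11.0, 14.0, 17.0, 20.0, 23.0, 26.0, 29.0]
```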
Abstract:
1. INTRODUCTION: Over time, Public Health has grown in importance within the European Community, but only relatively recently has it been given its deserved prominence in Community legislation. In this context, and with the adoption of the European Public Health Programme, the need arose to update thinking in this area. An opportunity was thus identified to formulate a strategy capable of reducing inequalities while also encompassing health needs. As the issue expanded, and with the purpose of reducing inequalities, Directive 2011/24/EU emerged to regulate patients' rights in cross-border healthcare. 2. OBJECTIVE: The primary objective of this work is to analyse Directive 2011/24/EU, as well as Law no. 52/2014 of 25 August, and to identify the main barriers to the exercise of the right of access to cross-border healthcare by beneficiaries of the SNS in Portugal arising from the application of these legal instruments. 3. METHODOLOGY: An analytical and documentary approach was used, based on qualitative methodology. 4. CONCLUSIONS: The main barriers to the right of access to cross-border healthcare for SNS beneficiaries in Portugal are financial, linguistic and cultural, informational, related to physical mobility and geographical proximity, administrative, and related to continuity of care. The transposition of Directive 2011/24/EU into the Portuguese legal framework essentially results in inequities in access to cross-border healthcare.
Abstract:
Nowadays, the consumption of goods and services on the Internet is constantly increasing. Small and Medium Enterprises (SMEs), mostly from traditional industry sectors, usually do business in weak and fragile market sectors, where customized products and services prevail. To survive and compete in today's markets, they have to readjust their business strategies by creating new manufacturing processes and establishing new business networks through new technological approaches. In order to compete with big enterprises, these partnerships aim at sharing resources, knowledge and strategies to boost the sector's business consolidation through the creation of dynamic manufacturing networks. To meet this demand, we propose the development of a centralized information system that allows enterprises to select and create dynamic manufacturing networks capable of monitoring the whole manufacturing process, including the assembly, packaging and distribution phases. Even networking partners that come from the same area hold multiple, heterogeneous representations of the same knowledge, denoting their own view of the domain. Thus, conceptually, semantically and, consequently, lexically diverse knowledge representations may occur in the network, causing non-transparent sharing of information and interoperability inconsistencies. What is required is a framework, supported by a tool, that flexibly enables the identification, classification and resolution of such semantic heterogeneities. This tool will support the network in establishing semantic mappings, to facilitate the integration of the various enterprises' information systems.
Abstract:
Generating personalized movie recommendations for users is a problem that most commonly relies on user-movie ratings. These ratings are generally used either to understand the user's preferences or to recommend movies that users with similar rating patterns have rated highly. However, movie recommenders are often subject to the Cold-Start problem: new movies have not been rated by anyone, so they will not be recommended to anyone; likewise, the preferences of new users who have not rated any movie cannot be learned. In parallel, social-media platforms such as Twitter collect great amounts of user feedback on movies, as these are very popular nowadays. This thesis proposes to explore feedback shared on Twitter to predict the popularity of new movies, and shows how it can be used to tackle the Cold-Start problem. It also proposes, at a finer grain, to explore the reputation of directors and actors on IMDb to tackle the Cold-Start problem. To assess these aspects, a Reputation-enhanced Recommendation Algorithm is implemented and evaluated on a crawled IMDb dataset with previous user ratings of old movies, together with Twitter data crawled from January 2014 to March 2014, to recommend 60 movies affected by the Cold-Start problem. Twitter proved to be a strong reputation predictor, and the Reputation-enhanced Recommendation Algorithm improved over several baseline methods. Additionally, the algorithm also proved useful when recommending movies in an extreme Cold-Start scenario, where both new movies and users are affected by the Cold-Start problem.
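One way to picture a reputation-enhanced score for cold-start movies is to shrink the (possibly empty) rating mean toward a reputation prior built from external signals. The sketch below is hypothetical: the weights, signals and shrinkage scheme are assumptions for illustration, not the thesis's actual model.

```python
# Hedged sketch of a reputation-enhanced score for cold-start movies: with no
# ratings, fall back entirely on a reputation prior built from external
# signals (e.g. Twitter buzz, director/actor averages); with a few ratings,
# blend the two. All weights and scores below are invented.

def reputation_prior(twitter_buzz: float, cast_avg_rating: float) -> float:
    """Blend external signals into a 0-10 prior score."""
    return 0.6 * twitter_buzz + 0.4 * cast_avg_rating

def predicted_score(ratings, prior: float, k: int = 5) -> float:
    """Shrink the rating mean toward the prior; with no ratings, use the prior."""
    n = len(ratings)
    mean = sum(ratings) / n if n else 0.0
    return (n * mean + k * prior) / (n + k)

prior = reputation_prior(twitter_buzz=8.2, cast_avg_rating=7.0)
print(predicted_score([], prior))          # pure cold start -> prior (7.72)
print(predicted_score([6.0, 9.0], prior))  # few ratings -> shrunk blend
```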
Abstract:
Background. In India, prevalence rates of dementia and prodromal amnestic Mild Cognitive Impairment (MCI) are 3.1% and 4.3%, respectively. Most Indians refer to the full spectrum of cognitive disorders simply as 'memory loss'. Barring prevention or cure, these conditions will rise rapidly with population aging. Evidence-based policies and practices can improve the lives of affected individuals and their caregivers, but will require timely and sustained uptake. Objectives. Framed by social cognitive theories of health behavior, this study explores the knowledge, attitudes and practices concerning cognitive impairment, and the related use of services, among older adults who screen positive for MCI, their primary caregivers, and health providers. Methods. I used the Montreal Cognitive Assessment to screen for cognitive impairment in memory camps in Mumbai. To achieve sampling diversity, I used maximum variation sampling. Ten adults aged 60+ who had no significant functional impairment but screened positive for MCI, together with their caregivers, participated in separate focus groups. Four other such dyads and six doctors/traditional healers completed in-depth interviews. Data were translated from Hindi or Marathi to English and analyzed in Atlas.ti using Framework Analysis. Findings. Knowledge and awareness of cognitive impairment and of available resources were very low. Physicians attributed the condition to disease-induced pathology, while lay persons blamed brain malfunction due to normal aging. The main attitudes were that this condition is not a disease, is not serious and/or is not treatable, and that it evokes stigma toward and among impaired persons, their families and providers. Low knowledge and poor attitudes impeded help-seeking. Conclusions. Cognitive disorders of aging will take a heavy toll on private lives and public resources in developing countries. Early detection, accurate diagnosis, systematic monitoring and quality care are needed to compress the period of morbidity and promote quality of life. Key stakeholders provide essential insights into how scientific and indigenous knowledge and sociocultural attitudes affect the use and provision of resources.
Abstract:
Digital Businesses have become a major driver of economic growth, with an explosion of new startups. At the same time, the sector also includes mature enterprises that have become global giants in a relatively short period of time. Digital Businesses have unique characteristics that make running and managing a Digital Business very different from traditional offline businesses. Digital businesses respond to online users who are highly interconnected and networked, which enables a rapid flow of word of mouth at a pace far greater than ever envisioned for traditional products and services. The relatively low cost of incremental user addition has led to a variety of innovations in the pricing of digital products, including various forms of free and freemium pricing models. This thesis explores the unique characteristics and complexities of Digital Businesses and their implications for the design of Digital Business Models and Revenue Models. The thesis proposes an Agent-Based Modeling Framework that can be used to develop simulation models that capture the complex dynamics of Digital Businesses and the interactions between users of a digital product. Such simulation models can be used for a variety of purposes, such as simple forecasting, analysing the impact of market disturbances, analysing the impact of changes in pricing models, and optimising pricing for maximum revenue generation or for a balance between growth in usage and revenue generation. These models can be developed for a mature enterprise with a large historical record of user growth as well as for early-stage enterprises without much historical data. Through three case studies, the thesis demonstrates the applicability of the Framework and its potential applications.
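As a rough illustration of the kind of simulation model such a framework targets, the sketch below runs a minimal word-of-mouth adoption process with a freemium conversion step. All agents, rates and parameters are invented; a real model would be calibrated against historical user-growth data.

```python
# Minimal, hypothetical agent-based sketch of word-of-mouth adoption with a
# freemium conversion step. All rates are invented for illustration.

import random

random.seed(1)
N_AGENTS, STEPS = 1000, 30
WOM_PROB, CONTACTS, CONVERT_PROB = 0.05, 3, 0.02

state = ["unaware"] * N_AGENTS   # unaware -> free -> premium
state[0] = "free"                # seed user

for step in range(STEPS):
    for i in range(N_AGENTS):
        if state[i] == "unaware":
            continue
        # Each active user talks to a few random contacts (word of mouth).
        for j in random.sample(range(N_AGENTS), CONTACTS):
            if state[j] == "unaware" and random.random() < WOM_PROB:
                state[j] = "free"
        # Free users occasionally convert to the paid tier.
        if state[i] == "free" and random.random() < CONVERT_PROB:
            state[i] = "premium"

print(f"after {STEPS} steps: {state.count('free')} free users, "
      f"{state.count('premium')} premium users")
```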