821 results for: cloud computing, hypervisor, virtualization, live migration, infrastructure as a service
Abstract:
Dissertation presented to obtain the degree of Doctor in Informatics from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Harnessing the idle CPU cycles, storage space and other resources of networked computers for collaborative work is the main focus of all major grid computing research projects. Most university computer labs are nowadays equipped with powerful desktop PCs, and much of the time these machines sit idle, wasting computing power that could be put to good use. However, complex problems and the analysis of very large amounts of data require substantial computational resources. For such problems, one may run the analysis algorithms on very powerful and expensive computers, which limits the number of users who can afford such data analysis tasks. Instead of using single expensive machines, distributed computing systems offer the possibility of using a set of much less expensive machines to do the same task. The BOINC and Condor projects have been used successfully for real scientific research around the world at low cost. The main goal of this work is to deploy both distributed computing platforms, Condor and BOINC, and use them to harness idle PC resources for academic researchers to use in their research work. In this thesis, data mining tasks were performed by implementing several machine learning algorithms on the distributed computing environment.
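The core idea above, splitting a data mining task into independent work units and farming them out to a pool of otherwise idle machines, can be sketched in miniature. The following is an illustrative Python sketch only: the fold-partitioning scheme, the `train_fold` task and the use of a local thread pool are all stand-ins for the BOINC/Condor work units the thesis actually uses, not the thesis's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def train_fold(args):
    """Hypothetical worker task: evaluate one cross-validation fold.
    A real deployment would run a machine learning algorithm here;
    we report the fold mean as a stand-in 'score'."""
    fold_id, data = args
    return fold_id, sum(data) / len(data)

def distribute_folds(dataset, n_folds):
    """Split the dataset into independent folds and dispatch them to
    workers, mirroring how a BOINC/Condor pool processes work units.
    The local thread pool stands in for the distributed machines."""
    folds = [dataset[i::n_folds] for i in range(n_folds)]
    with ThreadPoolExecutor() as pool:
        results = dict(pool.map(train_fold, enumerate(folds)))
    return results
```

Because each fold is independent, the same partitioning maps directly onto Condor job submissions or BOINC work units, which is what makes these data mining tasks a good fit for harvested idle cycles.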
Abstract:
A population-based case-control design was used to investigate the association between migration, urbanisation and schistosomiasis in the Metropolitan Region of Recife, Northeast Brazil. 1022 cases and 994 controls, aged 10 to 25, were selected. Natives and migrants who came from endemic areas have a similar risk of infection. On the other hand, the risk of infection of migrants from non-endemic areas seems to be related to the time elapsed since their arrival in São Lourenço da Mata; those who have been living in that urban area for 5 or more years have a risk of infection similar to that of the natives. Those arriving in the metropolitan region of Recife mostly emigrate from the "zona da mata" and "zona do agreste" in the state of Pernambuco. Due to changes in the sugar agro-industry and to the increase in the area used for cattle grazing, these workers were driven to villages and cities. This pattern of urbanisation created the conditions for the establishment of foci of transmission in São Lourenço da Mata.
Abstract:
Tinea capitis is a dermatophyte infection that occurs mainly in childhood; in Brazil there are few reports in adolescents and adults. The detection of asymptomatic carriers is of great importance for disease control. From February 1998 to February 1999, a study was performed at the outpatient Dermatologic Unit of the Instituto de Puericultura e Pediatria Martagão Gesteira (Universidade Federal do Rio de Janeiro, Brasil) to determine the frequency of asymptomatic carriage and tinea capitis among 79 adolescents, adults and elderly people living in the same household as 56 children (0-12 years) with tinea capitis. Of these, one female adult and one male adult (2.5%) were asymptomatic carriers, and their cultures revealed Trichophyton tonsurans and Microsporum canis, respectively. One female adolescent and two female adults (3.8%) had tinea capitis, and all cultures revealed Trichophyton tonsurans. The study showed that adolescents and adults who live in the same household as children with tinea capitis may be ill or asymptomatic carriers.
Abstract:
Near-real-time media content personalisation is nowadays a major challenge involving media content sources, distributors and viewers. This paper describes an approach to seamless recommendation, negotiation and transaction of personalised media content. It adopts an integrated view of the problem by proposing, on the business-to-business (B2B) side, a brokerage platform to negotiate the media items on behalf of the media content distributors and sources, providing viewers, on the business-to-consumer (B2C) side, with a personalised electronic programme guide (EPG) containing the set of recommended items after negotiation. In this setup, when a viewer connects, the distributor looks up and invites sources to negotiate the contents of the viewer's personal EPG. The proposed multi-agent brokerage platform is structured in four layers, modelling the registration, service agreement, partner lookup and invitation stages as well as the item recommendation, negotiation and transaction stages of the B2B processes. The recommendation service is a rule-based switch hybrid filter, including six collaborative and two content-based filters. The rule-based system selects, at runtime, the filter(s) to apply as well as the final set of recommendations to present. The filter selection is based on the data available, ranging from the history of items watched to the ratings and/or tags assigned to the items by the viewer. Additionally, this module implements (i) a novel item stereotype to represent newly arrived items, (ii) a standard user stereotype for new users, (iii) a novel passive user tag cloud stereotype for socially passive users, and (iv) a new content-based filter named the collinearity and proximity similarity (CPS). At the end of the paper, we present offline results and a case study describing how the recommendation service works. The proposed system provides, to our knowledge, an excellent holistic solution to the problem of recommending multimedia contents.
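The runtime filter selection described above, where the rule-based switch picks which collaborative or content-based filters to apply depending on what viewer data exists, can be illustrated with a small sketch. This is a hypothetical reconstruction: the filter names, the `viewer` dictionary fields and the specific rules are illustrative assumptions, not the paper's actual rule set.

```python
def select_filters(viewer):
    """Hypothetical rule-based switch: choose filters from the data
    available for this viewer (all names/fields are illustrative)."""
    if not viewer.get("history"):
        # No watch history at all: fall back to the new-user stereotype.
        return ["user-stereotype"]
    filters = ["collaborative-history"]
    if viewer.get("ratings"):
        filters.append("collaborative-ratings")
    if viewer.get("tags"):
        # Tags available: a content-based filter such as CPS can run.
        filters.append("content-based-CPS")
    elif not viewer.get("ratings"):
        # Watches but never rates or tags: socially passive viewer,
        # served by the passive-user tag cloud stereotype.
        filters.append("passive-user-tag-cloud")
    return filters
```

The point of the switch is that each viewer profile degrades gracefully: the richer the interaction data, the more filters contribute to the final recommendation set.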
Abstract:
Nowadays, data centers are large energy consumers, and this trend is expected to increase further in the coming years, given the growth of cloud services. A large portion of this power consumption is due to the control of the physical parameters of the data center (such as temperature and humidity). However, these physical parameters are tightly coupled with computations, and even more so in upcoming data centers, where the location of workloads can vary substantially due, for example, to workloads being moved within the cloud infrastructure hosted in the data center. Managing the physical and compute infrastructure of a large data center is therefore an embodiment of a Cyber-Physical System (CPS). In this paper, we describe a data collection and distribution architecture that enables gathering the physical parameters of a large data center at very high temporal and spatial resolution of the sensor measurements. We believe this is an important characteristic for building more accurate heat-flow models of the data center and, with them, finding opportunities to optimize energy consumption. A high-resolution picture of data center conditions also enables minimizing local hot-spots, performing more accurate predictive maintenance (failures in all infrastructure equipment can be detected more promptly) and more accurate billing. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we present the results of a preliminary study of a typical data center radio environment.
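A messaging system that collects and distributes high-resolution sensor readings typically wraps each sample in a timestamped envelope and routes it by topic, so that consumers (heat-flow models, hot-spot monitors, billing) subscribe only to what they need. The sketch below is an assumed message shape and topic scheme for illustration; the paper's actual schema is not given in the abstract.

```python
import json
import time

def make_reading(sensor_id, metric, value, unit):
    """Illustrative message envelope for one sensor sample: a
    per-sample timestamp supports high temporal resolution, and
    the sensor id encodes spatial position (e.g. rack and height)."""
    return {
        "sensor": sensor_id,
        "metric": metric,   # e.g. "temperature" or "humidity"
        "value": value,
        "unit": unit,
        "ts": time.time(),
    }

def topic_for(reading):
    """Hypothetical topic scheme: route by sensor location and metric
    so each consumer subscribes to exactly the streams it needs."""
    return f"dc/{reading['sensor']}/{reading['metric']}"

msg = make_reading("rack12-top", "temperature", 27.5, "C")
payload = json.dumps(msg)  # wire format handed to the messaging system
```

With such a scheme, a predictive-maintenance consumer could subscribe to `dc/+/temperature` style wildcards while the billing service aggregates per-rack power topics independently.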
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology
Abstract:
Abdominal angiostrongyliasis is a zoonotic infection caused by a metastrongylid intra-arterial nematode, Angiostrongylus costaricensis. Accidental human infection may result in abdominal lesions, and treatment with anthelminthics is contraindicated because of the potentially higher morbidity caused by the excitement or death of worms inside the vessels. To evaluate the effect of mebendazole on the localization of the worms, male Swiss mice, 5 weeks old, were infected with 10 third-stage larvae per animal. Twelve infected mice were treated with oral mebendazole, at 5 mg/kg/day, for 5 consecutive days, beginning 22 days after inoculation. As control groups, 12 infected but untreated mice and another 12 uninfected and untreated mice were studied. The findings at necropsy were, respectively for the treated (T) and control (C) groups: 92% and 80% of the worms were inside the cecal mesenteric arterial branch; 8% and 10% were located inside the aorta. Only in group C were some worms (10%) found inside the portal vein or splenic artery. These data indicate that treatment with mebendazole does not lead to distal or ectopic migration of A. costaricensis worms.
Abstract:
13th IEEE/IFIP International Conference on Embedded and Ubiquitous Computing (EUC 2015), 21-23 October 2015, Porto, Portugal. Session W1-A: Multiprocessing and Multicore Architectures.
Abstract:
Dissertation presented to obtain the Master's degree in Informatics Engineering
Abstract:
This paper presents the TEC4SEA research infrastructure created in Portugal to support the research, development, and validation of marine technologies. It is a multidisciplinary open platform capable of supporting the research, development, and testing of marine robotics, telecommunications, and sensing technologies for monitoring and operating in the ocean environment. Thanks to its installed research facilities and privileged geographic location, it allows fast access to the deep sea and can support multidisciplinary research, enabling full validation and evaluation of technological solutions designed for the ocean environment. It is a vertically integrated infrastructure, in the sense that it possesses a set of skills and resources ranging from pure conceptual research to field deployment missions, with strong industrial and logistic capacities in the middle tier of prototype production. TEC4SEA is open to the entire scientific and enterprise community, with a free access policy for researchers affiliated with the research units that ensure its maintenance and sustainability. The paper describes the infrastructure in detail and discusses associated research programmes, providing a strategic vision for deep-sea research initiatives within the context of both the Portuguese National Ocean Strategy and the European Strategy frameworks.
Abstract:
DEWI will provide key solutions for seamless wireless connectivity and interoperability in the everyday physical environment of citizens, thereby contributing significantly to the emerging smart home and smart public space.
Abstract:
Antigenic preparations from Sporothrix schenckii usually involve material from mixed cultures of yeast and mycelia, which present cross-reactions with other deep mycoses. We standardized a pure yeast phase with high cell viability, suitable for obtaining specific excretion-secretion products without somatic contamination. These excretion-secretion products were highly immunogenic and did not produce noticeable cross-reactions in either double immunodiffusion or Western blot. The antigenic preparation consists mainly of proteins with molecular weights between 40 and 70 kDa, some of them with proteolytic activity under mildly acidic conditions. We also observed cathepsin-like activity at two days of culture and chymotrypsin-like activity at four days of culture, consistent with changes in the concentrations of the different secreted proteins. The proteases were able to cleave different subclasses of human IgG, suggesting a sequential production of antigens and molecules that could interact with and interfere with the host immune response.
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as the optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to the interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill the SLAs contracted in real-world environments while reducing energy costs by as much as 21%.
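An interference- and power-aware placement decision of the kind described above can be sketched as a scoring rule: estimate the performance deviation a VM would suffer from co-hosted workloads of the same intensive class, add a power cost, and place the VM on the host minimizing the total. This is a minimal illustrative sketch; the weights, fields and the greedy policy are assumptions, not the paper's actual estimator or scheduling algorithm.

```python
def placement_score(host, vm, interference_penalty=0.5):
    """Hypothetical score combining an interference-driven performance
    deviation estimate with a power cost (all values illustrative).
    More co-hosted VMs of the same class (e.g. CPU- or
    network-intensive) mean more contention on that resource."""
    same_class = sum(1 for v in host["vms"] if v["class"] == vm["class"])
    deviation = interference_penalty * same_class
    return deviation + host["power_per_vm"]

def schedule(vm, hosts):
    """Greedy placement: choose the host with the lowest combined
    interference-plus-power score for this VM."""
    return min(hosts, key=lambda h: placement_score(h, vm))
```

Under this rule a slightly less power-efficient but uncontended host can win over a cheaper host already loaded with VMs of the same intensive class, which is the essential trade-off an interference- and power-aware scheduler navigates.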