981 results for internet data centers


Relevance: 100.00%

Publisher:

Abstract:

The era of big data brings new challenges to network traffic classification, an essential tool for network management and security. To deal with the problems of dynamic ports and encrypted payloads in traditional port-based and payload-based methods, the state-of-the-art approach employs flow statistical features and machine learning techniques to identify network traffic. This chapter reviews the statistical-feature-based traffic classification methods that have been proposed in the last decade. We also examine a new problem: unclean traffic in the training stage of machine learning, caused by labeling mistakes and the complex composition of big Internet data. The chapter further evaluates the performance of typical machine learning algorithms with unclean training data. The review and the empirical study provide a guide for academics and practitioners in choosing proper traffic classification methods in real-world scenarios.
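The effect of unclean training labels described above can be illustrated with a small sketch. A toy nearest-centroid classifier stands in for the machine learning tools the chapter surveys; the flow features, class names, and the 20% label-flip rate are illustrative assumptions, not values from the chapter:

```python
# Sketch: statistical-feature traffic classification with noisy training labels.
import random

random.seed(0)

def make_flow(cls):
    # Toy statistical features per flow: (mean packet size, flow duration).
    mu = {"web": (600.0, 2.0), "p2p": (1200.0, 30.0)}[cls]
    return (random.gauss(mu[0], 150.0), random.gauss(mu[1], 4.0))

train = [(make_flow(c), c) for c in ["web", "p2p"] for _ in range(200)]

# Inject "unclean" labels: flip roughly 20% of the training labels.
noisy = [(x, ("p2p" if c == "web" else "web") if random.random() < 0.2 else c)
         for x, c in train]

def centroids(data):
    # Per-class mean feature vector.
    out = {}
    for cls in ("web", "p2p"):
        pts = [x for x, c in data if c == cls]
        out[cls] = tuple(sum(v) / len(pts) for v in zip(*pts))
    return out

def predict(cent, x):
    # Assign the flow to the nearest class centroid.
    return min(cent, key=lambda c: sum((a - b) ** 2 for a, b in zip(x, cent[c])))

test = [(make_flow(c), c) for c in ["web", "p2p"] for _ in range(100)]
for name, data in [("clean", train), ("noisy", noisy)]:
    cent = centroids(data)
    acc = sum(predict(cent, x) == c for x, c in test) / len(test)
    print(name, round(acc, 3))
```

Even this simple model keeps working under moderate label noise because the noisy points only pull the centroids toward each other; more flexible learners can overfit the mislabeled flows, which is the degradation the chapter's empirical study measures.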

Relevance: 100.00%

Publisher:

Abstract:

Live video forwarding for IP cameras has become a popular service in video data centers. In the forwarding service, requests from end users in different regions arrive in real time to obtain live video streams of IP cameras from inter-connected video data centers. A fundamental scheduling problem is how to assign resources so as to achieve the globally optimal resource cost and forwarding delay when forwarding live video streams. We introduce the resource provisioning cost as the combination of media server cost, connection bandwidth cost, and forwarding delay cost. In this paper, a multi-objective resource provisioning (MORP) approach is proposed to deal with the online inter-datacenter resource provisioning problem. The approach aims at minimizing the resource provisioning cost during live video forwarding. It adaptively allocates media servers in appropriate video data centers and connects the chosen media servers together to provide system scalability and connectivity. Unlike previous works, MORP takes both resource capacity and diversity (e.g., location and price) into consideration during live video forwarding. Finally, the experimental results show that the MORP approach not only cuts the resource provisioning cost by 3% to 10% compared to the benchmark approach, but also shortens the resource provisioning delay.
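The combined provisioning cost can be sketched minimally as below. The per-datacenter prices, delays, and the delay weight are hypothetical illustrations; the paper's actual MORP formulation also handles server connectivity and online arrivals:

```python
# Sketch: choosing a video data center by combined provisioning cost
# (media server + connection bandwidth + weighted forwarding delay).
datacenters = {
    "dc-east": {"server_cost": 1.0, "bandwidth_cost": 0.30, "delay_ms": 40},
    "dc-west": {"server_cost": 0.8, "bandwidth_cost": 0.45, "delay_ms": 90},
    "dc-eu":   {"server_cost": 1.2, "bandwidth_cost": 0.25, "delay_ms": 120},
}

def provisioning_cost(dc, w_delay=0.01):
    # Fold forwarding delay into the cost with an illustrative weight.
    return dc["server_cost"] + dc["bandwidth_cost"] + w_delay * dc["delay_ms"]

best = min(datacenters, key=lambda name: provisioning_cost(datacenters[name]))
print(best, round(provisioning_cost(datacenters[best]), 3))  # → dc-east 1.7
```

The delay weight trades monetary cost against latency; sweeping it traces the multi-objective front the approach optimizes over.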

Relevance: 100.00%

Publisher:

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they are actually experiencing; on the other hand, administrators will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment.
Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Network and Support Vector Machine, to accurately model the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these modeling tools. Third, we presented an approach to optimal VM sizing by employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
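The VM-sizing step can be sketched as follows. A toy inverse-capacity latency curve stands in for the ANN/SVM performance models the thesis trains, and the SLA threshold and size ladder are illustrative assumptions:

```python
# Sketch: optimal VM sizing driven by a performance model.
def predicted_latency_ms(cpu_shares):
    # Toy performance model: latency falls as allocated CPU grows.
    return 20.0 + 400.0 / cpu_shares

def smallest_vm_meeting_sla(sla_ms, sizes=(1, 2, 4, 8, 16)):
    # Pick the cheapest (smallest) size whose predicted latency meets the SLA.
    for s in sizes:
        if predicted_latency_ms(s) <= sla_ms:
            return s
    return None  # no configured size can satisfy this SLA

print(smallest_vm_meeting_sla(120.0))  # → 4
```

With a trained model in place of the toy curve, the same search gives clients the smallest VM that still meets their SLA, which is the performance-based charging model the thesis motivates.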

Relevance: 100.00%

Publisher:

Abstract:

The past decade has seen energy consumption in servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on servers and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. The rapid rise in energy consumption has posed a serious threat to both energy resources and the environment, which makes green computing not only worthwhile but also necessary. This dissertation tackles the challenges of reducing both the energy consumption of server systems and the costs for Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC's power: the server system, which accounts for 56% of the total power consumption of an IDC, and the cooling and humidification systems, which account for about 30%. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DVFS) and Vary-On, Vary-Off (VOVF) mechanisms that work together for greater energy savings. Meanwhile, corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to the IDC's cost management, helping OSPs conserve and manage their electricity costs and lower their carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy-efficient strategies are applied to the server system and the cooling system respectively.
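The price-aware load dispatching idea can be sketched in a few lines. Filling the cheapest locations first is a greedy simplification of the dissertation's optimal strategy, and the prices, capacities, and request volume are made up for illustration:

```python
# Sketch: energy-aware load dispatching across IDC locations by electricity price.
def dispatch(requests, sites):
    # Greedy: fill the cheapest sites first, up to their capacity.
    plan = {}
    for name, info in sorted(sites.items(), key=lambda kv: kv[1]["price"]):
        take = min(requests, info["capacity"])
        plan[name] = take
        requests -= take
        if requests == 0:
            break
    return plan

sites = {"idc-a": {"price": 0.12, "capacity": 500},   # $/kWh, req/s (made up)
         "idc-b": {"price": 0.08, "capacity": 300},
         "idc-c": {"price": 0.10, "capacity": 400}}
print(dispatch(600, sites))  # → {'idc-b': 300, 'idc-c': 300}
```

A carbon-emission limit, as in the dissertation, would add a constraint on how much load the dirtiest sites may absorb before the greedy fill moves on.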
With the rapid development of cloud services, we also carry out research to reduce server energy in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that can effectively map VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The VM/PM mapping probability matrix takes into account resource limitations, VM operation overheads, and server reliability, as well as energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs. We also identify several potential areas for future research in each chapter.
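A minimal sketch of one row of such a VM/PM mapping probability matrix follows. The score here combines only energy efficiency and free capacity, with hypothetical values; the dissertation's full formulation also weighs operation overheads and reliability:

```python
# Sketch: building placement probabilities for one VM request over candidate PMs.
import random

random.seed(1)

pms = {"pm1": {"eff": 0.9, "free": 0.2},   # energy efficiency, free capacity
       "pm2": {"eff": 0.6, "free": 0.8},   # (illustrative values)
       "pm3": {"eff": 0.8, "free": 0.5}}

def mapping_probabilities(vm_demand):
    # Score each PM that can host the VM, then normalize into probabilities.
    scores = {name: p["eff"] * p["free"]
              for name, p in pms.items() if p["free"] >= vm_demand}
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

probs = mapping_probabilities(vm_demand=0.4)
# Sample a placement according to the probability row.
placement = random.choices(list(probs), weights=probs.values())[0]
print(probs, placement)
```

PMs that lack capacity get probability zero, while the remaining probability mass favors efficient, lightly loaded machines, which is where the energy savings come from.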

Relevance: 100.00%

Publisher:

Abstract:

The expected pervasive use of mobile cloud computing and the growing number of Internet data centers have brought forth many concerns, such as the energy costs and energy-saving management of both data centers and mobile connections. Therefore, the need for adaptive and distributed resource allocation schedulers that minimize the communication-plus-computing energy consumption has become increasingly important. In this paper, we propose and test an efficient dynamic resource provisioning scheduler that jointly minimizes computation and communication energy consumption, while guaranteeing user Quality of Service (QoS) constraints. We evaluate the performance of the proposed dynamic resource provisioning algorithm with respect to execution time, goodput, and bandwidth usage, and compare the proposed scheduler against existing approaches. The attained experimental results show that the proposed dynamic resource provisioning algorithm achieves much higher energy savings than the traditional schemes.
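The joint communication-plus-computing trade-off can be sketched with a toy energy model. The cubic-in-frequency compute energy, fixed radio power, deadline, and frequency ladder are all illustrative assumptions, not the paper's actual model:

```python
# Sketch: picking a CPU frequency that minimizes compute + transmit energy
# while meeting a QoS deadline on total (compute + radio) time.
def total_energy(freq_ghz, cycles=2e9, bits=8e6, rate_bps=20e6,
                 k=0.5, p_tx_w=1.0):
    compute_t = cycles / (freq_ghz * 1e9)      # seconds of computation
    radio_t = bits / rate_bps                  # seconds of transmission
    energy = k * freq_ghz ** 3 * compute_t + p_tx_w * radio_t
    return energy, compute_t + radio_t

def best_frequency(deadline_s=3.0, candidates=(0.8, 1.2, 1.6, 2.0, 2.4)):
    # Among frequencies meeting the deadline, take the lowest-energy one.
    feasible = [(total_energy(f)[0], f) for f in candidates
                if total_energy(f)[1] <= deadline_s]
    return min(feasible)[1] if feasible else None

print(best_frequency())  # → 0.8
```

With a loose deadline the lowest feasible frequency wins; tightening the deadline forces higher frequencies and more energy, which is the QoS-versus-energy tension the scheduler navigates.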

Relevance: 100.00%

Publisher:

Abstract:

With the development of Big Data and the growth of cloud computing and the Internet of Things, data centers have been multiplying in Brazil and the rest of the world. Designing and running these sites efficiently has become a necessary challenge, and doing so requires a better understanding of their infrastructure. Thus, this paper presents a bibliographic study using technical concepts in order to understand the specific needs of this environment and the best ways to address them. It discusses the main data center infrastructure systems, methods to improve their energy efficiency, and their future trends.


Relevance: 90.00%

Publisher:

Abstract:

Mixture models are a flexible tool for unsupervised clustering that have found popularity in a vast array of research areas. In studies of medicine, the use of mixtures holds the potential to greatly enhance our understanding of patient responses through the identification of clinically meaningful clusters that, given the complexity of many data sources, may otherwise be intangible. Furthermore, when developed in the Bayesian framework, mixture models provide a natural means for capturing and propagating uncertainty in different aspects of a clustering solution, arguably resulting in richer analyses of the population under study. This thesis aims to investigate the use of Bayesian mixture models in analysing varied and detailed sources of patient information collected in the study of complex disease. The first aim of this thesis is to showcase the flexibility of mixture models in modelling markedly different types of data. In particular, we examine three common variants on the mixture model, namely, finite mixtures, Dirichlet Process mixtures and hidden Markov models. Beyond the development and application of these models to different sources of data, this thesis also focuses on modelling different aspects relating to uncertainty in clustering. Examples of clustering uncertainty considered are uncertainty in a patient's true cluster membership and accounting for uncertainty in the true number of clusters present. Finally, this thesis aims to address and propose solutions to the task of comparing clustering solutions, whether this be comparing patients or observations assigned to different subgroups or comparing clustering solutions over multiple datasets. To address these aims, we consider a case study in Parkinson's disease (PD), a complex and commonly diagnosed neurodegenerative disorder. In particular, two commonly collected sources of patient information are considered.
The first source of data concerns symptoms associated with PD, recorded using the Unified Parkinson's Disease Rating Scale (UPDRS), and constitutes the first half of this thesis. The second half of this thesis is dedicated to the analysis of microelectrode recordings collected during Deep Brain Stimulation (DBS), a popular palliative treatment for advanced PD. Analysis of this second source of data centers on the problems of unsupervised detection and sorting of action potentials or "spikes" in recordings of multiple cell activity, providing valuable information on real-time neural activity in the brain.
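The clustering machinery underlying such analyses can be sketched with a two-component Gaussian mixture fitted by EM. This is a frequentist stand-in for the Bayesian mixtures the thesis develops, on made-up one-dimensional data (e.g. a single spike-waveform feature):

```python
# Sketch: two-component Gaussian mixture via EM on a 1-D feature.
import math
import random

random.seed(0)
# Two made-up clusters of feature values.
data = ([random.gauss(-2.0, 0.5) for _ in range(150)] +
        [random.gauss(3.0, 0.8) for _ in range(150)])

def pdf(x, mu, sd):
    # Gaussian density.
    return math.exp(-((x - mu) ** 2) / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

w, mu, sd = [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0]
for _ in range(50):
    # E-step: responsibility of each component for each point.
    r = [[w[k] * pdf(x, mu[k], sd[k]) for k in (0, 1)] for x in data]
    r = [[a / (a + b), b / (a + b)] for a, b in r]
    # M-step: re-estimate weights, means and standard deviations.
    for k in (0, 1):
        nk = sum(ri[k] for ri in r)
        w[k] = nk / len(data)
        mu[k] = sum(ri[k] * x for ri, x in zip(r, data)) / nk
        sd[k] = max(1e-3, math.sqrt(
            sum(ri[k] * (x - mu[k]) ** 2 for ri, x in zip(r, data)) / nk))

ms = sorted(mu)
print([round(m, 2) for m in ms])
```

A Bayesian treatment replaces these point estimates with posterior distributions, which is what lets the thesis quantify uncertainty in cluster membership and in the number of clusters.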

Relevance: 90.00%

Publisher:

Abstract:

Background: Internet-based surveillance systems provide a novel approach to monitoring infectious diseases. Surveillance systems built on internet data are economically, logistically and epidemiologically appealing and have shown significant promise. The potential for these systems has increased with greater internet availability and shifts in health-related information-seeking behaviour. This approach to monitoring infectious diseases has, however, only been applied to single or small groups of select diseases. This study aims to systematically investigate the potential for developing surveillance and early warning systems using internet search data for a wide range of infectious diseases.

Methods: Official notifications for 64 infectious diseases in Australia were downloaded and correlated with frequencies for 164 internet search terms for the period 2009–13 using Spearman's rank correlations. Time series cross-correlations were performed to assess the potential for search terms to be used in construction of early warning systems.

Results: Notifications for 17 infectious diseases (26.6%) were found to be significantly correlated with a selected search term. The use of internet metrics as a means of surveillance has not previously been described for 12 (70.6%) of these diseases. The majority of diseases identified were vaccine-preventable, vector-borne or sexually transmissible; cross-correlations, however, indicated that vector-borne and vaccine-preventable diseases are best suited for development of early warning systems.

Conclusions: The findings of this study suggest that internet-based surveillance systems have broader applicability to monitoring infectious diseases than has previously been recognised. Furthermore, internet-based surveillance systems have a potential role in forecasting emerging infectious disease events, especially for vaccine-preventable and vector-borne diseases.
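The core of the methods, Spearman's rank correlation between a notification series and a search-term series, can be sketched in plain Python. The two weekly series below are made up for illustration; the study correlated official Australian notifications with real search frequencies:

```python
# Sketch: Spearman's rank correlation between disease notifications and
# search-term frequency (Pearson correlation computed on the ranks).
def ranks(xs):
    # Ranks from 1..n, with tied values given their average rank.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

notifications = [3, 5, 9, 14, 21, 18, 11, 6]      # weekly cases (made up)
searches = [12, 15, 30, 44, 60, 51, 33, 20]       # search frequency (made up)
print(round(spearman(notifications, searches), 3))  # → 1.0
```

For the early-warning analysis, the same correlation would be recomputed with the search series shifted by various lags, keeping the lag at which it peaks.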

Relevance: 90.00%

Publisher:

Abstract:

Scientific dissertation submitted for the degree of Master in Civil Engineering, specialization area of Buildings.

Relevance: 90.00%

Publisher:

Abstract:

The goal of the work presented in this paper is to provide mobile platforms within our campus with a GPS-based data service capable of supporting precise outdoor navigation. This can be achieved by providing campus-wide access to real-time Differential GPS (DGPS) data. As a result, we designed and implemented a three-tier distributed system that provides Internet data links between remote DGPS sources and the campus, together with a campus-wide DGPS data dissemination service. The Internet data link service is a two-tier client/server system where the server side is connected to the DGPS station and the client side is located at the campus. The campus-wide DGPS data provider disseminates the DGPS data received at the campus via the campus Intranet and via a wireless data link. The wireless broadcast is intended for portable receivers equipped with a DGPS wireless interface, and the Intranet link is provided for receivers with a DGPS serial interface. The application is expected to provide adequate support for accurate outdoor campus navigation tasks.
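The fan-out design of the campus-wide provider can be sketched as below. The class, message format, and subscriber model are hypothetical illustrations of the two dissemination paths, not the paper's actual implementation:

```python
# Sketch: a campus DGPS data provider fanning corrections out to both
# Intranet (serial-interface) and wireless subscribers.
class DGPSProvider:
    def __init__(self):
        self.intranet_clients = []   # receivers with a DGPS serial interface
        self.wireless_clients = []   # portable receivers with a wireless link

    def subscribe(self, client, wireless=False):
        (self.wireless_clients if wireless else self.intranet_clients).append(client)

    def on_correction(self, rtcm_message):
        # Relay the correction from the remote DGPS source to every client.
        for client in self.intranet_clients + self.wireless_clients:
            client.append(rtcm_message)

provider = DGPSProvider()
rover_a, rover_b = [], []                     # stand-ins for receiver queues
provider.subscribe(rover_a)                   # campus Intranet link
provider.subscribe(rover_b, wireless=True)    # wireless broadcast
provider.on_correction(b"RTCM correction for SV 12")
print(rover_a == rover_b)  # → True
```

In the deployed system the two lists would be network endpoints rather than in-process queues, but the dissemination logic is the same.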

Relevance: 90.00%

Publisher:

Abstract:

Personal data protection in Switzerland is grounded in the constitution and is given concrete form above all in a federal statute adopted before the advent of the Internet and the widespread transmission of personal information over digital networks. This regulation is complemented by Switzerland's international commitments, notably the Council of Europe's European Convention on Human Rights. The article first delimits the scope of the legislation, which applies to the processing of personal data both by private parties and by federal administrative authorities. A brief analysis follows of the fundamental principles (lawfulness, good faith, proportionality, purpose limitation, accuracy, cross-border disclosure, security, right of access) and their application on the Internet. Finally, the protection of the content of private electronic messages is briefly addressed from the standpoint of telecommunications secrecy and in light of recent case law of the Federal Supreme Court.

Relevance: 90.00%

Publisher:

Abstract:

Wednesday 23rd April 2014
Speaker(s): Willi Hasselbring
Organiser: Leslie Carr
Time: 23/04/2014 11:00-11:50
Location: B32/3077
File size: 669 MB

Abstract: For good scientific practice, it is important that research results may be properly checked by reviewers and possibly repeated and extended by other researchers. This is of particular interest for "digital science", i.e. for in-silico experiments. In this talk, I'll discuss some ways in which software systems and services may contribute to good scientific practice. In particular, I'll present our PubFlow approach to automating publication workflows for scientific data. The PubFlow workflow management system is based on established technology. We integrate institutional repository systems (based on EPrints) and world data centers (in marine science). PubFlow collects provenance data automatically via our monitoring framework Kieker. Provenance information describes the origins and the history of scientific data in its life cycle, and the process by which it arrived. Thus, provenance information is highly relevant to the repeatability and trustworthiness of scientific results. In our evaluation in marine science, we collaborate with the GEOMAR Helmholtz Centre for Ocean Research Kiel.

Relevance: 90.00%

Publisher:

Abstract:

Introduction. In Colombia, 80% of patients with chronic kidney disease on hemodialysis have a peripheral arteriovenous fistula (AVF) that ensures blood flow during hemodialysis (1). Variability in blood flow from the AVF arm toward the distal part of the limb may affect pulse oximetry (SpO2) readings (2), leading health personnel to make wrong decisions. The objective of this study is to clarify whether there is a difference between the SpO2 of the AVF arm and that of the contralateral arm. Materials and methods. A correlation study was carried out between the SpO2 values of the arm with AVF and the arm without AVF in 40 patients attending hemodialysis. Data were collected using a form that recorded the pulse oximetry result and associated variables before, during and after hemodialysis. The medians of the deltas of the differences were compared with Student's t and Mann-Whitney tests, with significance accepted at p < 0.05. Results. No statistically significant differences in SpO2 were found between the arm with AVF and the arm without AVF before, during or after dialysis; however, a statistically significant positive correlation was observed. Conclusions. A statistically significant positive correlation was found, with no differences in the pulse oximetry result between the arm with AVF and the arm without AVF; it is therefore valid to measure pulse oximetry on either arm.
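The paired-arms comparison can be sketched with made-up readings. The study used Student's t and Mann-Whitney tests on the deltas; the sketch below computes the per-patient deltas and a plain Pearson correlation between arms for illustration:

```python
# Sketch: comparing SpO2 on the fistula arm vs the contralateral arm.
def pearson(x, y):
    # Pearson correlation coefficient of two equal-length series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

spo2_fav = [96, 97, 95, 98, 94, 96, 97, 95]    # arm with AVF (made up)
spo2_other = [96, 96, 95, 98, 95, 96, 97, 96]  # contralateral arm (made up)

# Per-patient deltas: near zero on average if the arms agree.
deltas = [a - b for a, b in zip(spo2_fav, spo2_other)]
mean_delta = sum(deltas) / len(deltas)
print(round(mean_delta, 3), round(pearson(spo2_fav, spo2_other), 3))
```

A mean delta near zero together with a positive between-arm correlation mirrors the study's conclusion that either arm gives an equivalent reading.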

Relevance: 90.00%

Publisher: