348 resultados para STREAMING


Relevância:

10.00%

Publicador:

Resumo:

A real-time adaptive resource allocation algorithm that considers the end user's Quality of Experience (QoE) in the context of a video streaming service is presented in this work. An objective no-reference quality metric, namely Pause Intensity (PI), is used to control the priority of resource allocation to users during the scheduling process. An online adjustment is introduced to adaptively set the scheduler's parameter and maintain a desired trade-off between fairness and efficiency. The correlation between the data rates (i.e., video code rates) demanded by users and the data rates allocated by the scheduler is also taken into account. The final allocated rates are determined based on the channel status, the distribution of PI values among users, and the scheduling policy adopted. Furthermore, since the user's capability varies as environmental conditions change, the rate adaptation mechanism for video streaming is considered and its interaction with the scheduling process under the same PI metric is studied. The feasibility of implementing this algorithm is examined and the result is compared with the most common existing scheduling methods.
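The abstract does not give the formula for Pause Intensity, so the following is only a hypothetical illustration of a no-reference, pause-based score: it combines the fraction of viewing time spent paused with the pause frequency, which is the general idea behind PI-style metrics (the function name and exact combination are assumptions, not the paper's definition).

```python
def pause_intensity(pause_durations, observation_time):
    """Hypothetical pause-based QoE score: the product of the fraction
    of time spent paused and the pause frequency. Illustrative only;
    the paper defines Pause Intensity precisely."""
    if observation_time <= 0:
        raise ValueError("observation_time must be positive")
    pause_ratio = sum(pause_durations) / observation_time  # time spent paused
    pause_freq = len(pause_durations) / observation_time   # pauses per second
    return pause_ratio * pause_freq
```

Under this sketch, two pauses of 2 s and 3 s in 100 s of playback score higher than a single 5 s pause, reflecting that frequent interruptions are usually judged worse than one long one.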


AMS Subj. Classification: 68U05, 68P30


This paper is concerned with long-term (20+ years) forecasting of broadband traffic in next-generation networks. Such a long-term approach requires going beyond extrapolations of past traffic data, while facing both high uncertainty in predicting future developments and the fact that, in 20 years, current network technologies and architectures will be obsolete. Thus, "order of magnitude" upper bounds on upstream and downstream traffic are deemed good enough to facilitate such long-term forecasting. These bounds can be obtained by evaluating the limits of human sight and assuming that these limits will be reached by future services or, alternatively, by considering the content transferred by bandwidth-demanding applications such as those using embedded interactive 3D video streaming. The traffic upper bounds are a good indication of peak values and, hence, also of future network capacity demands. Furthermore, the main drivers of traffic growth, including multimedia as well as non-multimedia applications, are identified. New disruptive applications and services that can make good use of the large bandwidth provided by next-generation networks are explored. The results can be used to identify monetization opportunities for future services and to map potential revenues for network operators. © 2014 The Author(s).
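An order-of-magnitude bound of the kind the paper describes can be sketched in a few lines. Every constant below is an assumption chosen for illustration (a commonly quoted ~576-megapixel figure for the eye's field of view, 60 Hz refresh, 24-bit colour, and a 100:1 compression ratio), not a figure taken from the paper:

```python
# Illustrative order-of-magnitude bound on per-viewer video bandwidth,
# derived from assumed limits of human vision. All constants are
# assumptions for this sketch, not the paper's numbers.
pixels = 576e6       # ~576 megapixels, a figure often quoted for the eye
frame_rate = 60      # Hz, above the typical flicker-fusion threshold
bits_per_pixel = 24  # uncompressed colour depth
compression = 100    # assumed achievable compression ratio

raw_bps = pixels * frame_rate * bits_per_pixel  # uncompressed bit rate
bound_bps = raw_bps / compression               # compressed upper bound
print(f"~{bound_bps / 1e9:.0f} Gbit/s per viewer")
```

With these assumptions the bound lands in the single-digit Gbit/s range per viewer, which is the "order of magnitude" style of estimate the paper argues is sufficient for 20-year forecasting.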


The aim of this paper is to review some of the standards connected with multimedia and their metadata. We start with the MPEG family: MPEG-21 provides an open framework for multimedia delivery and consumption, while MPEG-7 is a multimedia content description standard. With the growth of the Internet, several formats were proposed for describing media scenes. Some of them are open standards, such as VRML, X3D, SMIL, SVG, MPEG-4 BIFS, MPEG-4 XMT, MPEG-4 LASeR, and COLLADA, published by ISO, W3C, etc. Television has become the most important mass medium, and standards such as MHEG, DAVIC, Java TV, MHP, GEM, OCAP, and ACAP have been developed for it; efficient video streaming is also presented. Finally, among the large number of standards for representing audiovisual metadata, we cover the Material Exchange Format (MXF), the Digital Picture Exchange (DPX), and the Digital Cinema Package (DCP).


A framework that aims to best utilize mobile network resources for video applications is presented in this paper. The main contribution of the proposed work is a QoE-driven optimization method that can maintain a desired trade-off between fairness and efficiency in allocating resources, in terms of data rates, to video streaming users in LTE networks. This method controls the user satisfaction level from the point of view of service continuity and applies appropriate QoE metrics (Pause Intensity and its variations) to determine the scheduling strategies, in combination with the mechanisms used for adaptive video streaming such as 3GP/MPEG-DASH. The superiority of the proposed algorithms is demonstrated, showing how the resources of a mobile network can be optimally utilized by using quantifiable QoE measurements. This approach can also find the best match between demand and supply in the process of network resource distribution.
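The fairness-efficiency trade-off described above can be made concrete with a toy allocator. This is a minimal sketch under assumed semantics, not the paper's algorithm: a single parameter `beta` blends an efficiency split (rate proportional to channel capacity) with a QoE-fairness split (rate proportional to each user's Pause Intensity, so users who pause most get more).

```python
def allocate_rates(capacities, pi_values, total_rate, beta=0.5):
    """Hypothetical QoE-weighted rate allocation.
    beta=0: split total_rate in proportion to channel capacity (efficiency).
    beta=1: split it in proportion to Pause Intensity (QoE fairness).
    Illustrative only; not the paper's optimization method."""
    eff = [c / sum(capacities) for c in capacities]    # efficiency weights
    fair = [p / sum(pi_values) for p in pi_values]     # QoE-fairness weights
    weights = [(1 - beta) * e + beta * f for e, f in zip(eff, fair)]
    return [total_rate * w for w in weights]
```

For two users with equal channels but PI values 1 and 3, `beta=1.0` steers three quarters of the rate to the user who is pausing more, while `beta=0.0` splits the rate evenly.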


A segment selection method controlled by Quality of Experience (QoE) factors for Dynamic Adaptive Streaming over HTTP (DASH) is presented in this paper. Current rate adaptation algorithms aim to eliminate buffer underrun events by significantly reducing the code rate when experiencing pauses in replay. In reality, however, viewers may choose to accept a level of buffer underrun in order to achieve improved picture fidelity, or to accept degraded picture fidelity in order to maintain service continuity. The rate adaptation scheme proposed in our work can maximize the user QoE in terms of both continuity and fidelity (picture quality) in DASH applications. It is shown that, using this scheme, a high level of quality for streaming services can be achieved, especially at low packet loss rates. Our scheme can also maintain the best trade-off between continuity-based and fidelity-based quality by determining proper threshold values for the level of quality intended by clients with different quality requirements. In addition, the integration of the rate adaptation mechanism with the scheduling process is investigated in the context of a mobile communication network, and the related performance is analyzed.
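A threshold-driven segment pick of the kind described can be sketched as follows. This is a hypothetical illustration, assuming one buffer threshold below which the client prioritizes continuity; the function name, `margin` safety factor, and decision rule are all assumptions, not the paper's scheme.

```python
def select_representation(bitrates, throughput, buffer_s, threshold_s,
                          margin=0.8):
    """Hypothetical QoE-driven DASH segment selection.
    Above the continuity threshold, favour fidelity: pick the highest
    bitrate that fits under a safety margin of the throughput estimate.
    Below it, favour continuity: drop to the lowest bitrate.
    Illustrative only."""
    if buffer_s < threshold_s:
        return min(bitrates)                     # continuity first
    feasible = [b for b in sorted(bitrates) if b <= margin * throughput]
    return feasible[-1] if feasible else min(bitrates)  # fidelity first
```

Clients with different quality intents would, in the paper's terms, use different threshold values: a fidelity-oriented client sets a low threshold and tolerates occasional underruns, while a continuity-oriented client sets a high one.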


Internet Protocol Television (IPTV) is a system in which a digital television service is delivered using the Internet Protocol over a network infrastructure. There is considerable confusion and concern about IPTV, since two different technologies have to be melded together to provide end customers with something better than conventional television. In this research, the functional architecture of the IPTV system was investigated. A Very Large Scale Integration (VLSI) based streaming server controller was designed, and different ways of hosting a web server that can send control signals to the streaming server controller were studied. The web server accepts inputs from the keyboard and the FPGA board switches and, depending on the preset configuration, opens a selected web page and sends the control signals to the streaming server controller. It was observed that the applications run faster on the PowerPC since it is embedded into the FPGA. The commercial market and global deployment of IPTV are also discussed.


With the advent of peer-to-peer networks and, more importantly, sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted, and incomplete data streams gathered wirelessly under dynamically varying conditions. Yet existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To reduce the complexity of the validation process, the developed solution maps the requirements of the application onto a geometric space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system by ascertaining that segregating the data adaptation and prediction processes augments the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that the dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast-convergent adaptation process is deployed, data reduction rates are significantly improved.
Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
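The idea of validating a reading against its spatial neighbours can be sketched minimally. This is an illustrative toy, assuming each sensor's reading should agree with the median of nearby sensors within a tolerance; it is not the dissertation's validation paradigm, and the function name and replacement policy are assumptions.

```python
from statistics import median

def validate_reading(reading, neighbor_readings, tolerance):
    """Hypothetical spatial-association check for sensor data cleaning:
    accept a reading that agrees with the median of its spatial
    neighbours within `tolerance`, otherwise replace it with that
    median and flag it as invalid. A minimal sketch only."""
    ref = median(neighbor_readings)
    if abs(reading - ref) <= tolerance:
        return reading, True    # reading validated
    return ref, False           # reading cleaned (replaced), flagged
```

A median reference rather than a mean is a common choice here because a single corrupted neighbour then cannot drag the reference value with it.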


This article will acquaint you with ten of the more important left-wing films I have reviewed over the past sixteen years as a member of New York Film Critics Online. You will not see familiar works such as "The Battle of Algiers" listed, but rather those that deserve wider attention: the proverbial neglected masterpieces. They originate from different countries and are available through Internet streaming, either freely from YouTube or through Netflix or Amazon rental. In several instances you will be referred to film club websites that, like the films under discussion, deserve wider attention, since they are the counterparts of the small, independent theaters where such films premiere. The country of origin, date, and director will be identified next to the title, followed by a summary of the film and, finally, its availability.


Many systems and applications continuously produce events. These events record the status of a system and trace its behavior; by examining them, system administrators can check for potential problems. If the temporal dynamics of the systems are further investigated, the underlying patterns can be discovered, and the uncovered knowledge can be leveraged to predict future system behavior or to mitigate potential risks. Moreover, system administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite recent advances in data mining, its application to system event mining is still at a rudimentary stage: most work focuses on episode mining or frequent pattern discovery. These methods are unable to provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective, and they provide little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From the perspective of data mining, three correlated directions are considered helpful for system management: (1) provide concise yet comprehensive summaries of the running status of the systems; (2) make the systems more intelligent and autonomous; (3) effectively detect abnormal system behavior. Thanks to the richness of event logs, all of these directions can be addressed in a data-driven manner; in this way, the robustness of the systems can be enhanced and the goal of autonomous management approached.
This dissertation focuses on the foregoing directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness and efficacy of the corresponding solutions.
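Streaming anomaly detection of the kind named above can be illustrated with a very small online detector. This is a generic sketch, not the dissertation's method: it flags an event whose value deviates from the running mean by more than `k` standard deviations, maintaining the statistics incrementally with Welford's online algorithm so that no event history is stored.

```python
class StreamingAnomalyDetector:
    """Minimal streaming anomaly detector: flag values more than k
    standard deviations from the running mean, updating mean/variance
    online with Welford's algorithm. Illustrative sketch only."""
    def __init__(self, k=3.0):
        self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

    def observe(self, x):
        anomalous = False
        if self.n >= 2:  # need at least two samples for a variance
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) > self.k * std
        # Welford update of running mean and sum of squared deviations
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```

On a stream of event counts hovering around 10 per interval, a sudden burst of 100 is flagged immediately, while ordinary fluctuations pass through silently.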



This work performed effluent degradation studies by electrochemical treatment. The electrochemical oxidation (EO) of hydroquinone (H2Q) was carried out in acid medium on a PbO2 electrode by galvanostatic electrolysis, applying current densities of 10 and 30 mA/cm2. The concentration of H2Q was monitored by differential pulse voltammetry (DPV). The experimental results showed that the performance of the galvanostatic electrolysis depends significantly on the applied current density, achieving removal efficiencies of 100% and 80% when applying 30 and 10 mA/cm2, respectively. Furthermore, the electroanalytical technique was effective as a detection method for H2Q. To test the efficiency of the PbO2 electrode, the electrochemical treatment was also conducted on a real effluent, leachate from a landfill. The leachate (600 ml of effluent) was treated in a batch electrochemical cell, with or without the addition of NaCl, applying 7 mA/cm2. The efficiency of EO was assessed in terms of the removal of thermo-tolerant coliforms, total organic carbon (TOC), total phosphorus, and metals (copper, cobalt, chromium, iron and nickel). The results showed efficient removal of coliforms (100%), and the concentration of heavy metals was further decreased by cathodic processes. However, the results for TOC were not satisfactory, achieving only a low total removal of the dissolved organic load. Because this effluent is considered complex, further tests were developed to monitor a larger number of decontamination parameters (turbidity, total solids, color, conductivity, TOC, and the metals barium, chromium, lithium, manganese and zinc), comparing the efficiency of two types of electrochemical treatment (EO or electrocoagulation) using a flow cell.
In the case of EO, Ti/IrO2-TaO5 was used as the anode, whereas in the electrocoagulation process aluminum electrodes were used, applying current densities of 10, 20 and 30 mA/cm2 in the presence and absence of NaCl as an electrolyte. The results showed that EO using a Ti/IrO2-TaO5 anode was efficient when Cl- was present in the effluent. In contrast, electrocoagulation in flow reduced the dissolved organic matter in the effluent under certain experimental conditions.


The great amount of data generated as a result of automation and process supervision in industry leads to two problems: a large demand for disk storage and the difficulty of streaming this data over a telecommunications link. Lossy data compression algorithms emerged in the 1990s to solve these problems and, as a consequence, industries started to use them in industrial supervision systems to compress data in real time. These algorithms were designed to eliminate redundant and undesired information in an efficient and simple way. However, their parameters must be set for each process variable, which becomes impracticable in systems that monitor thousands of variables. In that context, this paper proposes the Adaptive Swinging Door Trending algorithm, an adaptation of Swinging Door Trending in which the main parameters are adjusted dynamically by analyzing signal trends in real time. A comparative performance analysis of lossy data compression algorithms applied to time series of process variables and dynamometer cards is also presented; the algorithms used for comparison were piecewise-linear methods and transform-based methods.
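The classic Swinging Door Trending algorithm that the paper adapts can be sketched as follows. A fixed compression deviation `dev` defines two "door" pivots above and below the last archived point; a new point is archived only when the doors swing open past parallel, i.e. when no single line segment within the tolerance can cover the span. The paper's contribution is tuning `dev`-like parameters adaptively; the fixed-`dev` version below is only the baseline.

```python
def sdt_compress(points, dev):
    """Basic Swinging Door Trending with fixed compression deviation
    `dev`. `points` is a list of (t, v) pairs with strictly
    increasing t. Returns the archived subset of points."""
    if len(points) <= 2:
        return list(points)
    out = [points[0]]
    anchor_t, anchor_v = points[0]
    slope_up, slope_down = float("-inf"), float("inf")
    prev = points[0]
    for t, v in points[1:]:
        # Doors pivot at (anchor_t, anchor_v +/- dev) and only swing open.
        slope_up = max(slope_up, (v - (anchor_v + dev)) / (t - anchor_t))
        slope_down = min(slope_down, (v - (anchor_v - dev)) / (t - anchor_t))
        if slope_up > slope_down:
            # Doors opened past parallel: archive the previous point
            # and restart the doors from it.
            out.append(prev)
            anchor_t, anchor_v = prev
            slope_up = (v - (anchor_v + dev)) / (t - anchor_t)
            slope_down = (v - (anchor_v - dev)) / (t - anchor_t)
        prev = (t, v)
    out.append(prev)
    return out
```

On a perfectly linear ramp the doors never open, so only the endpoints are archived; at a sudden step, the breakpoint is kept. The trade-off the paper targets is visible here: a larger `dev` archives fewer points (better compression, more distortion), and choosing it per variable by hand is what becomes impracticable at scale.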


During the development of the project I learned about Big Data, Android and MongoDB while helping to develop a system for predicting bipolar disorder crises through the massive analysis of information from several sources. Specifically, I wrote a theoretical part on NoSQL databases, Spark Streaming and neural networks, and then designed and configured a MongoDB database for the bipolar disorder project. I also learned about Android and designed and developed an Android mobile application to collect data for use as input to the crisis prediction system. Once the development of the application was finished, I also carried out an evaluation with users.


This paper will look at the benefits and limitations of content distribution using Forward Error Correction (FEC) in conjunction with the Transmission Control Protocol (TCP). FEC can be used to reduce the number of retransmissions that would usually result from a lost packet, so the requirement for TCP to deal with any losses is greatly reduced. There is, however, a side-effect to using FEC as a countermeasure to packet loss: an additional bandwidth requirement. For applications such as real-time video conferencing, delay must be kept to a minimum and retransmissions are certainly not desirable; a balance must therefore be struck between additional bandwidth and delay due to retransmissions. Our results show that, when packet loss occurs, the throughput of data can be significantly improved by combining FEC with TCP, compared to relying solely on TCP retransmissions. Furthermore, a case study applies the result to demonstrate the achievable improvements in the quality of streaming video perceived by end users.
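The simplest FEC scheme that exhibits the trade-off described above is single-parity XOR: one extra packet per block of k data packets (the bandwidth cost) lets the receiver rebuild any one lost packet without a retransmission (the delay saving). This is a generic sketch of the technique, not the paper's implementation, and it assumes equal-length packets.

```python
def xor_parity(packets):
    """Build one parity packet as the byte-wise XOR of k equal-length
    data packets. Generic single-parity FEC sketch, not the paper's
    code; bandwidth overhead is 1/k extra packets per block."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            parity[i] ^= b
    return bytes(parity)

def recover(received, parity):
    """Rebuild the single missing packet of a block from the k-1
    received packets plus the parity packet (XOR of all of them)."""
    return xor_parity(list(received) + [parity])
```

With k = 3 the overhead is one parity packet per three data packets, i.e. 33% extra bandwidth in exchange for surviving one loss per block with zero retransmission delay; larger blocks cut the overhead but protect against proportionally fewer losses.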