9 results for Television bandwidth compression
in Greenwich Academic Literature Archive - UK
Abstract:
The article focuses on an information system that exploits metadata in film and television production. It notes that the television and film industries are accustomed to working on big projects, involving actual film, videotape, and PERT charts for project planning. Scripts, in most instances, go through many revisions. It is essential to attach information to these items in order to manage, track, and retrieve them; metadata eases these operations in both industries.
Abstract:
The television and film industries are used to working on large projects. These projects use media and documents of various types, ranging from actual film and videotape to items such as PERT charts for project planning. Some items, such as scripts, evolve over a period and go through many versions. It is often necessary to attach information to these “objects” in order to manage, track, and retrieve them. On large productions there may be hundreds of personnel who need access to this material and who in their turn generate new items which form some part of the final production. The requirements for this industry in terms of an information system may be generalized and a distributed software architecture built, primarily using the internet, to serve the needs of these projects. This architecture must enable potentially very large collections of objects to be managed in a secure environment with distributed responsibilities held by many working on the production. Copyright © 2005 by the Society of Motion Picture and Television Engineers, Inc.
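As a hedged illustration of the kind of "object" with attached metadata that the paper describes, the following Python sketch shows one possible shape such a record could take; the class and field names are assumptions for illustration, not the paper's schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical metadata record for a production "object" (script, reel, chart).
# Field names are illustrative; the paper does not specify a schema.
@dataclass
class ProductionObject:
    object_id: str                # unique identifier within the production
    kind: str                     # e.g. "script", "videotape", "PERT chart"
    version: int = 1              # scripts evolve through many versions
    owner: str = ""               # who holds responsibility for this item
    tags: dict = field(default_factory=dict)  # free-form descriptive metadata
    created: datetime = field(default_factory=datetime.utcnow)

    def revise(self) -> "ProductionObject":
        """Return the next version of this object, preserving its identity."""
        return ProductionObject(self.object_id, self.kind, self.version + 1,
                                self.owner, dict(self.tags))
```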
Abstract:
An important factor for high-speed optical communication is the availability of ultrafast and low-noise photodetectors. Among the semiconductor photodetectors commonly used in today's long-haul and metro-area fiber-optic systems, avalanche photodiodes (APDs) are often preferred over p-i-n photodiodes due to their internal gain, which significantly improves receiver sensitivity and alleviates the need for optical pre-amplification. Unfortunately, the very process of carrier impact ionization that generates the gain is random and inherently noisy, resulting in fluctuations not only in the gain but also in the time response. We have recently developed a theory characterizing the autocorrelation function of APDs that incorporates the dead-space effect, an effect that is very significant in thin, high-performance APDs. This work extends the time-domain analysis of the dead-space multiplication model to compute the autocorrelation function of the APD impulse response. However, the computation requires a large amount of memory and is very time-consuming. Here we describe our experiences in parallelizing the code in MPI and OpenMP using CAPTools. Several array-partitioning schemes and scheduling policies are implemented and tested. Our results show that the code is scalable up to 64 processors on an SGI Origin 2000 machine and has small average errors.
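As a hedged sketch of one array-partitioning scheme of the kind tested, the following mpi4py fragment block-partitions, by rows, the (t1, t2) grid on which the autocorrelation function is evaluated; the per-point kernel is a placeholder, since the actual dead-space recurrence equations are beyond the scope of this sketch.

```python
# Minimal mpi4py sketch of block-partitioning the (t1, t2) grid on which the
# APD autocorrelation function is evaluated. The kernel below is a placeholder;
# the real dead-space multiplication model solves coupled recurrence equations
# at each grid point, which is what makes the computation expensive.
import numpy as np
from mpi4py import MPI

def autocorr_point(t1: float, t2: float) -> float:
    # Placeholder for the dead-space-model evaluation at one (t1, t2) pair.
    return float(np.exp(-abs(t1 - t2)))

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 512                                  # grid resolution (illustrative)
t = np.linspace(0.0, 10.0, N)

# Block partitioning: each rank owns a contiguous band of rows.
rows = np.array_split(np.arange(N), size)[rank]
local = np.array([[autocorr_point(t[i], t[j]) for j in range(N)] for i in rows])

# Gather the bands back on rank 0 to assemble the full autocorrelation matrix.
bands = comm.gather(local, root=0)
if rank == 0:
    R = np.vstack(bands)
    print("autocorrelation matrix:", R.shape)
```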
Abstract:
Fractal image compression is a relatively recent image compression method. Its extension to sequences of motion images is important in video compression applications. Two basic fractal compression methods, the cube-based and the frame-based methods, are commonly used in the industry, and each has advantages and disadvantages. This paper proposes a hybrid algorithm that combines the advantages of the two methods to produce a good compression algorithm for the video industry. Experimental results show that the hybrid algorithm improves both the compression ratio and the quality of the decompressed images.
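For illustration only, the following Python sketch shows the generic matching search that underlies both the cube-based and frame-based methods: each range block is compared against candidate domain blocks under a least-squares affine map. The function name and error metric are assumptions; the paper's hybrid scheme itself is not reproduced.

```python
# Generic sketch of the matching search at the heart of fractal compression:
# each range block is matched against contracted domain blocks.
import numpy as np

def best_match(range_block: np.ndarray, domains: list[np.ndarray]):
    """Find the domain block and affine map (s, o) minimising the L2 error."""
    best = (np.inf, -1, 0.0, 0.0)
    r = range_block.ravel().astype(float)
    for k, d in enumerate(domains):
        dv = d.ravel().astype(float)
        # Least-squares contrast s and brightness o for r ~ s*dv + o
        s, o = np.polyfit(dv, r, 1)
        err = float(np.sum((s * dv + o - r) ** 2))
        if err < best[0]:
            best = (err, k, s, o)
    return best  # (error, domain index, contrast, brightness)
```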
Abstract:
Fractal video compression is a relatively new video compression method. Its attraction lies in its high compression ratio and simple decompression algorithm, but its computational complexity is high, so parallel algorithms on high-performance machines offer one way out. In this study we partition the matching search, which accounts for the majority of the work in a fractal video compression process, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm achieves a high speedup in these distributed environments.
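A minimal sketch of the same partitioning idea, using Python's multiprocessing in place of DCOM or .NET Remoting workers: the matching search is cut into small independent tasks, one chunk of range blocks each. All names and sizes are illustrative.

```python
# Sketch: partition the matching search into small tasks and distribute them,
# here over local worker processes rather than DCOM/.NET Remoting nodes.
import numpy as np
from multiprocessing import Pool

def match_chunk(args):
    """One task: match a chunk of range blocks against the shared domain pool."""
    chunk, domains = args
    out = []
    for rb in chunk:
        errs = [float(np.sum((rb - d) ** 2)) for d in domains]
        out.append(int(np.argmin(errs)))    # index of best-matching domain
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ranges = [rng.random((8, 8)) for _ in range(64)]     # range blocks
    domains = [rng.random((8, 8)) for _ in range(16)]    # contracted domains
    tasks = [(ranges[i:i + 8], domains) for i in range(0, len(ranges), 8)]
    with Pool() as pool:
        results = [idx for part in pool.map(match_chunk, tasks) for idx in part]
    print(results[:8])
```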
Abstract:
Fractal image compression is a relatively recent image compression method that is simple to use and often yields a high compression ratio. These advantages make it suitable for situations with a single encoding and many decodings, as required in video on demand, archive compression, etc. Two fundamental fractal compression methods, the cube-based and the frame-based methods, are commonly studied; however, each has advantages and disadvantages. This paper extends the fundamental compression methods using the concept of adaptive partition. Experimental results show that algorithms based on adaptive partition can achieve a much higher compression ratio than algorithms based on fixed partition while maintaining the quality of the decompressed images.
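A hedged sketch of adaptive partition in the quadtree style: a block is accepted when some domain matches it within tolerance, and otherwise split into four sub-blocks that are matched recursively. The paper's exact partition rule is not given here; the tolerance and minimum block size are assumptions.

```python
# Sketch of adaptive partitioning: recursively split a range block whenever no
# domain matches it well enough, trading compression ratio for fidelity.
import numpy as np

def encode_adaptive(block, domains, tol=1.0, min_size=4):
    errs = [float(np.sum((block - d[:block.shape[0], :block.shape[1]]) ** 2))
            for d in domains]
    best = int(np.argmin(errs))
    if errs[best] <= tol or block.shape[0] <= min_size:
        return [("leaf", block.shape, best)]           # accept this block
    h, w = block.shape[0] // 2, block.shape[1] // 2
    quads = [block[:h, :w], block[:h, w:], block[h:, :w], block[h:, w:]]
    return [code for q in quads
            for code in encode_adaptive(q, domains, tol, min_size)]
```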
Abstract:
The intrinsically independent nature of the optimal codebook-cube searches in fractal video compression systems is examined and exploited, and the design of a suitable parallel algorithm reflecting this independence is presented. The Message Passing Interface (MPI) is chosen as the communication tool for implementing the parallel algorithm on distributed-memory parallel computers. Experimental results show that the parallel algorithm reduces the compression time and achieves a high speed-up without changing the compression ratio or the quality of the decompressed image. A scalability test was also performed, and the results show that the parallel algorithm is scalable.
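Because each codebook-cube search is independent, the cubes can be distributed across MPI ranks and the results gathered with no change to the matches found. The following mpi4py fragment sketches this under assumed names and sizes; it is not the paper's implementation.

```python
# Sketch: exploit the independence of codebook-cube searches by scattering the
# cubes across MPI ranks; the matches found are identical to a serial run.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    rng = np.random.default_rng(0)
    cubes = [rng.random((4, 4, 4)) for _ in range(32)]    # range cubes
    codebook = [rng.random((4, 4, 4)) for _ in range(8)]  # domain codebook
    work = [cubes[i::size] for i in range(size)]          # cyclic partition
else:
    work, codebook = None, None

my_cubes = comm.scatter(work, root=0)
codebook = comm.bcast(codebook, root=0)

# Each rank searches the full codebook for its own cubes only.
my_best = [int(np.argmin([np.sum((c - d) ** 2) for d in codebook]))
           for c in my_cubes]
all_best = comm.gather(my_best, root=0)
if rank == 0:
    print("matches per rank:", [len(b) for b in all_best])
```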
Abstract:
The authors' experience in the treatment of grey-scale video compression using fractals is summarized and compared with other research in the same field. Their experience with parallel and distributed computing is also discussed.
Abstract:
This article examines the first major British television series about the First World War, The Great War (BBC, 1964), in terms of its cultural, historical and aesthetic significance. As a central component of the BBC's 50th-anniversary commemorative programme to mark the outbreak of war, the series was a major media event: a small-screen memorial cast in sounds and images instead of stone and bronze. This article looks at how the British television audience responded to this form of on-screen commemoration. Material for this article was derived from the series' extensive production records housed in the BBC Written Archives Centre at Caversham, Berkshire, supplemented by, among other sources, interviews and correspondence with several surviving members of the production team. This allows a broader understanding of the motivations of those involved in the production of a groundbreaking historical series, while acknowledging the wide-ranging nature of its audience. [From the Publisher]