14 results for High definition television

in CentAUR: Central Archive, University of Reading - UK


Relevance:

90.00%

Publisher:

Abstract:

This chapter explores the distinctive qualities of the Matt Smith era of Doctor Who, focusing on how dramatic emphases are connected with emphases on visual style, and how this depends on the programme's production methods and technologies. Doctor Who was first made in the 1960s era of live, studio-based, multi-camera television with monochrome pictures. However, as technical innovations such as colour filming, stereo sound, CGI and post-production effects technology, and now High Definition (HD) cameras, have been routinely introduced into the programme, they have given Doctor Who's creators new ways of making visually distinctive narratives. Indeed, it has been argued that since the 1980s television drama has become increasingly like cinema in its production methods and aesthetic aims. Viewers' ability to watch the programme on high-specification TV sets, and to record and repeat episodes using digital media, also encourages attention to visual style in television as much as in cinema. The chapter evaluates how these new circumstances affect what Doctor Who has become, and engages with arguments that visual style has been allowed to override characterisation and story in the current Doctor Who. The chapter refers to specific episodes, and frames the analysis with reference to earlier years in Doctor Who's long history. For example, visual spectacle using green-screen and CGI can function as a set-piece (at the opening or ending of an episode) but can also work 'invisibly' to render a setting realistically. Shooting on location using HD cameras provides a rich and detailed image texture, but also highlights mistakes and especially problems of lighting. The reduction of Doctor Who's budget has led Steven Moffat's episodes to rely less on visual extravagance, connecting back both to Russell T. Davies's concern to show off the BBC's investment in the series and to British traditions of gritty and intimate social drama. Pressures to capitalise on Doctor Who as a branded product are the final aspect of the chapter's analysis, where the role of Moffat as 'showrunner' links him to an American (not British) style of television production in which the preservation of format and brand values gives him unusual power over the look of the series.

Relevance:

80.00%

Publisher:

Abstract:

The creation of OFDM-based Wireless Personal Area Networks (WPANs) has allowed the development of high bit-rate wireless communication devices suitable for streaming High Definition video between consumer products, as demonstrated in Wireless-USB and Wireless-HDMI. However, these devices need high-frequency clock rates, particularly for the OFDM, FFT and symbol-processing sections, resulting in high silicon cost and high electrical power consumption. The high clock rates make hardware prototyping difficult, so verification is very important but costly. Acknowledging that electrical power in wireless consumer devices is more critical than the number of implemented logic gates, this paper presents a Double Data Rate (DDR) architecture for implementation inside an OFDM baseband codec in order to reduce the high-frequency clock rates by a full factor of two. The presented architecture has been implemented and tested for ECMA-368 (the Wireless-USB context), resulting in a maximum clock rate of 264 MHz, instead of the expected 528 MHz, anywhere on the baseband codec die.
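The factor-of-two saving follows directly from processing two samples per clock cycle instead of one. A minimal sketch of that arithmetic (Python, illustrative only; the 528 MS/s baseband sample rate is inferred from the 528 MHz figure quoted above, and the function name is invented):

```python
# Illustrative sketch: how a double-data-rate (two samples per cycle)
# datapath halves the required clock for a fixed sample throughput.
# Not the paper's implementation; just the throughput arithmetic.

def required_clock_mhz(sample_rate_msps: float, samples_per_cycle: int) -> float:
    """Clock (MHz) needed to sustain a given sample rate (MS/s)."""
    return sample_rate_msps / samples_per_cycle

SAMPLE_RATE_MSPS = 528.0  # assumed ECMA-368 baseband sample rate, per the 528 MHz figure

single_rate = required_clock_mhz(SAMPLE_RATE_MSPS, samples_per_cycle=1)
double_rate = required_clock_mhz(SAMPLE_RATE_MSPS, samples_per_cycle=2)

print(f"Single-data-rate datapath: {single_rate:.0f} MHz")  # 528 MHz
print(f"Double-data-rate datapath: {double_rate:.0f} MHz")  # 264 MHz
```

Throughput is preserved; the saving comes from widening the datapath rather than raising the clock, which is why the power argument favours it over a gate-count argument.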

Relevance:

80.00%

Publisher:

Abstract:

The creation of OFDM-based Wireless Personal Area Networks (WPANs) has allowed high bit-rate wireless communication devices suitable for streaming High Definition video between consumer products, as demonstrated in Wireless-USB. However, these devices need high clock rates, particularly for the OFDM sections, resulting in high silicon cost and high electrical power consumption. Acknowledging that electrical power in wireless consumer devices is more critical than the number of implemented logic gates, this paper presents a Double Data Rate (DDR) architecture to reduce the OFDM input and output clock rate by a factor of two. The architecture has been implemented and tested for Wireless-USB (ECMA-368), resulting in a maximum clock rate of 264 MHz, instead of 528 MHz, anywhere on the die.

Relevance:

30.00%

Publisher:

Abstract:

We have designed and implemented a low-cost digital system using closed-circuit television cameras coupled to a digital acquisition system for the recording of in vivo behavioral data in rodents and for allowing observation and recording of more than 10 animals simultaneously at a reduced cost, as compared with commercially available solutions. This system has been validated using two experimental rodent models: one involving chemically induced seizures and one assessing appetite and feeding. We present observational results showing comparable or improved levels of accuracy and observer consistency between this new system and traditional methods in these experimental models, discuss advantages of the presented system over conventional analog systems and commercially available digital systems, and propose possible extensions to the system and applications to nonrodent studies.
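As a purely illustrative sketch of what low-cost multi-camera acquisition can look like in software (not the authors' system; the device indices, frame rate, duration and file names are invented), a handful of capture devices could be polled with OpenCV and each feed written to its own video file:

```python
# Hypothetical sketch: polling several cheap camera feeds with OpenCV and
# writing one video file per cage, in the spirit of the multi-animal
# acquisition system described above.
import time
import cv2  # assumes opencv-python is installed

CAMERA_INDICES = [0, 1, 2]   # hypothetical device indices; >10 in the study
FPS = 15.0                   # assumed acquisition rate
DURATION_S = 10              # short demo recording

captures = [cv2.VideoCapture(i) for i in CAMERA_INDICES]
fourcc = cv2.VideoWriter_fourcc(*"XVID")
writers = []
for i, cap in zip(CAMERA_INDICES, captures):
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)) or 640
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)) or 480
    writers.append(cv2.VideoWriter(f"cage_{i}.avi", fourcc, FPS, (w, h)))

start = time.time()
while time.time() - start < DURATION_S:
    for cap, writer in zip(captures, writers):
        ok, frame = cap.read()        # grab one frame from this cage's camera
        if ok:
            writer.write(frame)       # append it to that cage's video file
    time.sleep(1.0 / FPS)             # crude pacing; real systems would use hardware sync

for cap in captures:
    cap.release()
for writer in writers:
    writer.release()
```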

Relevance:

30.00%

Publisher:

Abstract:

The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using, as a starting point, a number of current tools and techniques that attempt to obtain 'the value' of information, it is proposed that an assessment or filtering mechanism for information needs to be developed. This paper addresses the issue firstly by briefly reviewing the information overload problem, the definition of value, and related research on the value of information in various areas. A characteristic-based framework for information evaluation is then introduced, using key characteristics identified from related work as an example. A Bayesian network method is introduced into the framework to build the linkage between the characteristics and information value, so that the quality and value of information can be calculated quantitatively. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results; the differences between the model's calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues, including challenges to the framework and the implementation of this evaluation method, are raised.
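As a toy illustration of how document characteristics can be linked probabilistically to a value judgement (a simplified naive-Bayes stand-in, not the paper's trained Bayesian network; the characteristic names, probabilities and prior are all invented for illustration):

```python
# Toy illustration: combining binary document characteristics into a
# probability that the document is "high value", in the spirit of the
# Bayesian-network linkage described above.

# Assumed likelihoods: P(characteristic looks good | high value) and
# P(characteristic looks good | low value).
P_GOOD_GIVEN_HIGH = {"completeness": 0.8, "currency": 0.7, "relevance": 0.9}
P_GOOD_GIVEN_LOW  = {"completeness": 0.3, "currency": 0.4, "relevance": 0.2}
PRIOR_HIGH = 0.5  # assumed prior probability that a document is high value

def p_high_value(observed: dict) -> float:
    """Naive-Bayes estimate of P(high value | observed characteristics)."""
    odds = PRIOR_HIGH / (1.0 - PRIOR_HIGH)
    for name, looks_good in observed.items():
        ph = P_GOOD_GIVEN_HIGH[name] if looks_good else 1.0 - P_GOOD_GIVEN_HIGH[name]
        pl = P_GOOD_GIVEN_LOW[name] if looks_good else 1.0 - P_GOOD_GIVEN_LOW[name]
        odds *= ph / pl  # multiply in the likelihood ratio for this characteristic
    return odds / (1.0 + odds)

# Example: a complete and current document that is not obviously relevant.
doc = {"completeness": True, "currency": True, "relevance": False}
print(f"P(high value) = {p_high_value(doc):.2f}")
```

A full Bayesian network would additionally model dependencies between characteristics and learn its probability tables from labelled documents, as the training on 60 documents described above suggests.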

Relevance:

30.00%

Publisher:

Abstract:

The prevalence of the metabolic syndrome (MetS), CVD and type 2 diabetes (T2D) is known to be higher in populations from the Indian subcontinent than in the general UK population. While identification of this increased risk is crucial to allow effective treatment, there is controversy over the applicability of diagnostic criteria, and particularly measures of adiposity, in ethnic minorities. Diagnostic cut-offs for BMI and waist circumference have been derived largely from predominantly white Caucasian populations and are therefore inappropriate and not directly transferable to Asian groups. Many Asian populations, particularly South Asians, have higher total and central adiposity for a similar body weight compared with matched Caucasians, and a greater CVD risk at a lower BMI. Although the causes of CVD and T2D are multi-factorial, diet is thought to make a substantial contribution to the development of these diseases. Low dietary intakes and tissue levels of long-chain (LC) n-3 PUFA in South Asian populations have been linked to high-risk abnormalities in the MetS. Conversely, increasing the dietary intake of LC n-3 PUFA in South Asians has proved an effective strategy for correcting such abnormalities as dyslipidaemia in the MetS. Appropriate diagnostic criteria that include a modified definition of adiposity must be in place to facilitate the early detection, and thus targeted treatment, of increased risk in ethnic minorities.

Relevance:

30.00%

Publisher:

Abstract:

We have designed and implemented a low-cost digital system using closed-circuit television cameras coupled to a digital acquisition system for the recording of in vivo behavioral data in rodents and for allowing observation and recording of more than 10 animals simultaneously at a reduced cost, as compared with commercially available solutions. This system has been validated using two experimental rodent models: one involving chemically induced seizures and one assessing appetite and feeding. We present observational results showing comparable or improved levels of accuracy and observer consistency between this new system and traditional methods in these experimental models, discuss advantages of the presented system over conventional analog systems and commercially available digital systems, and propose possible extensions to the system and applications to non-rodent studies.

Relevance:

30.00%

Publisher:

Abstract:

The usefulness of any simulation of atmospheric tracers using low-resolution winds relies both on the dominance of large spatial scales in the strain field and on a time dependence that results in a cascade to smaller tracer scales. Here, a quantitative study of the accuracy of such tracer simulations is made using the contour advection technique. It is shown that, although contour stretching rates are very insensitive to the spatial truncation of the wind field, the displacement errors in filament position are sensitive. A knowledge of displacement characteristics is essential if Lagrangian simulations are to be used for the inference of air-mass origin. A quantitative lower estimate is obtained for the tracer scale factor (TSF): the ratio of the smallest resolved scale in the advecting wind field to the smallest "trustworthy" scale in the tracer field. For a baroclinic wave life cycle the TSF = 6.1 ± 0.3, while for the Northern Hemisphere wintertime lower stratosphere the TSF = 5.5 ± 0.5, using the most stringent definition of the trustworthy scale. The similarity of the TSF for the two flows is striking, and an explanation is discussed in terms of the activity of potential vorticity (PV) filaments. Uncertainty in contour initialization is investigated for the stratospheric case. The effect of smoothing initial contours is to introduce a spin-up time (2–3 days), after which wind-field truncation errors take over from initialization errors. It is also shown that false detail from the proliferation of fine-scale filaments limits the useful lifetime of such contour advection simulations to 3σ⁻¹ days, where σ is the filament thinning rate, unless filaments narrower than the trustworthy scale are removed by contour surgery. In addition, PV analysis error and diabatic effects are so strong that only PV filaments wider than 50 km are at all believable, even for very high-resolution winds. The minimum wind-field resolution required to accurately simulate filaments down to the erosion scale in the stratosphere (given an initial contour) is estimated, and the implications for the modeling of atmospheric chemistry are briefly discussed.
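To make the use of the TSF concrete, a small back-of-the-envelope calculation (Python; the TSF value is taken from the abstract, while the wind-field resolution and thinning rate are hypothetical inputs chosen purely for illustration):

```python
# Back-of-the-envelope use of the quoted tracer scale factor (TSF).
# TSF = (smallest resolved scale in the winds) / (smallest trustworthy tracer scale).

TSF = 5.5                      # stratospheric value quoted in the abstract
wind_resolution_km = 100.0     # smallest resolved scale in the advecting winds (assumed)
thinning_rate_per_day = 1.0    # filament thinning rate sigma (assumed)

trustworthy_scale_km = wind_resolution_km / TSF      # smallest believable tracer scale
useful_lifetime_days = 3.0 / thinning_rate_per_day   # the 3/sigma limit noted above

print(f"Smallest trustworthy tracer scale: {trustworthy_scale_km:.0f} km")
print(f"Useful simulation lifetime: {useful_lifetime_days:.1f} days")
```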

Relevance:

30.00%

Publisher:

Abstract:

The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.

Relevance:

30.00%

Publisher:

Abstract:

Previous research has shown that listening to stories supports vocabulary growth in preschool and school-aged children and that lexical entries for even very difficult or rare words can be established if these are defined when they are first introduced. However, little is known about the nature of the lexical representations children form for the words they encounter while listening to stories, or whether these are sufficiently robust to support the child’s own use of such ‘high-level’ vocabulary. This study explored these questions by administering multiple assessments of children’s knowledge about a set of newly-acquired vocabulary. Four- and 6-year-old children were introduced to nine difficult new words (including nouns, verbs and adjectives) through three exposures to a story read by their class teacher. The story included a definition of each new word at its first encounter. Learning of the target vocabulary was assessed by means of two tests of semantic understanding – a forced choice picture-selection task and a definition production task – and a grammaticality judgment task, which asked children to choose between a syntactically-appropriate and syntactically-inappropriate usage of the word. Children in both age groups selected the correct pictorial representation and provided an appropriate definition for the target words in all three word classes significantly more often than they did for a matched set of non-exposed control words. However, only the older group was able to identify the syntactically-appropriate sentence frames in the grammaticality judgment task. Further analyses elucidate some of the components of the lexical representations children lay down when they hear difficult new vocabulary in stories and how different tests of word knowledge might overlap in their assessment of these components.

Relevance:

30.00%

Publisher:

Abstract:

Location is of paramount importance within the retail sector, yet locational obsolescence remains an overlooked and poorly defined concept, despite significant concerns over the viability of parts of this complex sector. This paper reviews the existing literature and, through this, explores retail locational obsolescence, including the multi-spatial nature of its driving forces, which range from the global economy, through local markets and submarkets, to individual property-specific factors; and, crucially, the need to disentangle locational obsolescence from other important concepts, such as depreciation and functional obsolescence, with which it is often mistakenly conflated. A conceptual model, definition and diagnostic criteria are then presented to guide future studies, policy development and the allocation of resources. Importantly, three stages are set out to enable the operationalization of the model, essential to future academic and industry studies as well as to the ongoing development of policy in this economically important, complex and contentious area.