933 results for Cameron
Abstract:
This is an extended version of Philip Murphy's inaugural lecture as director of the Institute of Commonwealth Studies, delivered on 23 February 2011. It traces the UK's relationship with the wider Commonwealth over 40 years, paying particular attention to the rhetoric of governments and opposition parties from Wilson and Heath to Cameron. It examines the reasons for the Commonwealth being relegated to a peripheral role in British foreign policy, especially European preoccupations and the issues of Rhodesia and South Africa. It argues that the Commonwealth remains of considerable practical and enormous symbolic importance to the UK. The British government should engage with the Commonwealth more than it has done in the recent past, and the Commonwealth should be both open to and critical of its imperial past.
Selected wheat seed defense proteins exhibit competitive binding to model microbial lipid interfaces
Abstract:
Puroindolines (Pins) and purothionins (Pths) are basic, amphiphilic, cysteine-rich wheat proteins that play a role in plant defense against microbial pathogens. We have examined the co-adsorption and sequential addition of Pins (Pin-a, Pin-b and a mutant form of Pin-b with a Trp-44 to Arg-44 substitution) and β-purothionin (β-Pth) to model anionic lipid layers, using a combination of surface pressure measurements, external reflection FTIR spectroscopy and neutron reflectometry. The results highlighted differences in the protein binding mechanisms, and in the competitive binding and penetration of lipid layers, between the respective Pins and β-Pth. Pin-a formed a blanket-like layer of protein below the lipid surface that reduced or inhibited β-Pth penetration of the lipid layer. Wild-type Pin-b participated in co-operative binding with β-Pth, whereas the mutant Pin-b did not bind to the lipid layer in the presence of β-Pth. The results provide further insight into the role of hydrophobic and cationic amino acid residues in antimicrobial activity.
Abstract:
The optimal utilisation of hyper-spectral satellite observations in numerical weather prediction is often inhibited by incorrectly assuming independent interchannel observation errors. However, in order to represent these observation-error covariance structures, an accurate knowledge of the true variances and correlations is needed. This structure is likely to vary with observation type and assimilation system. The work in this article presents the initial results for the estimation of IASI interchannel observation-error correlations when the data are processed in the Met Office one-dimensional (1D-Var) and four-dimensional (4D-Var) variational assimilation systems. The method used to calculate the observation errors is a post-analysis diagnostic which utilises the background and analysis departures from the two systems. The results show significant differences in the source and structure of the observation errors when processed in the two different assimilation systems, but also highlight some common features. When the observations are processed in 1D-Var, the diagnosed error variances are approximately half the size of the error variances used in the current operational system and are very close in size to the instrument noise, suggesting that this is the main source of error. The errors contain no consistent correlations, with the exception of a handful of spectrally close channels. When the observations are processed in 4D-Var, we again find that the observation errors are being overestimated operationally, but the overestimation is significantly larger for many channels. In contrast to 1D-Var, the diagnosed error variances are often larger than the instrument noise in 4D-Var. It is postulated that horizontal errors of representation, not seen in 1D-Var, are a significant contributor to the overall error here. Finally, observation errors diagnosed from 4D-Var are found to contain strong, consistent correlation structures for channels sensitive to water vapour and surface properties.
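A post-analysis diagnostic of this kind is commonly computed as the sample covariance between analysis departures and background departures. The following is a minimal sketch under that assumption only; the function names, the symmetrisation step, and the array shapes are illustrative choices, not details taken from the paper:

```python
import numpy as np

def diagnose_obs_error_covariance(dep_background, dep_analysis):
    """Estimate the observation-error covariance matrix R from
    background departures (y - H(x_b)) and analysis departures
    (y - H(x_a)), each of shape (n_obs, n_channels).

    The estimate is the mean outer product of the two departure
    vectors, R ~ E[(y - H(x_a)) (y - H(x_b))^T].
    """
    n_obs = dep_background.shape[0]
    R = dep_analysis.T @ dep_background / n_obs
    # Sampling noise breaks exact symmetry, so symmetrise explicitly.
    return 0.5 * (R + R.T)

def correlation_from_covariance(R):
    """Convert a covariance matrix to interchannel correlations."""
    std = np.sqrt(np.diag(R))
    return R / np.outer(std, std)
```

In practice the diagnosed variances (the diagonal of `R`) would be compared against the instrument noise, and the off-diagonal correlations inspected for structure between spectrally close or water-vapour-sensitive channels, as the abstract describes.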
Abstract:
We present results from 30 nights of observations of the open cluster NGC 7789 with the Wide Field Camera on the Isaac Newton Telescope, La Palma. From ~900 epochs, we obtained light curves and Sloan r'-i' colours for ~33000 stars, with ~2400 stars having better than 1 per cent precision. We expected to detect ~2 transiting hot Jupiter planets if 1 per cent of stars host such a companion and a typical hot Jupiter radius is ~1.2 R_J. We find 24 transit candidates, to 14 of which we can assign a period. We rule out the transiting planet model for 21 of these candidates using various robust arguments. For two candidates, we are unable to decide on their nature, although it seems most likely that they are eclipsing binaries as well. We have one candidate exhibiting a single eclipse, for which we derive a radius of 1.81 (+0.09, -0.00) R_J. Three candidates remain that require follow-up observations in order to determine their nature.
Abstract:
British Television Drama provides resources for critical thinking about key aspects of television drama in Britain since 1960, including institutional, textual, cultural, economic and audience-centred modes of study. It presents and contests significant strands of critical work in the field, and comprises essays by TV professionals and academics plus editors' introductions to each section that contextualise the chapters. The new edition includes a revised chapter by acclaimed TV producer Tony Garnett reflecting on his work since Cathy Come Home in the 1960s, new chapters by Phil Redmond, the creator of Brookside and Hollyoaks, and Cameron Roach, Head of Drama Commissioning at Sky TV and former executive producer of Waterloo Road. New academic analyses include work on Downton Abbey, The Sarah Jane Adventures, Ashes to Ashes, adaptations of Persuasion, and the changing production methods on Coronation Street.
Abstract:
Deposit modelling based on archived borehole logs supplemented by a small number of dedicated boreholes is used to reconstruct the main boundary surfaces and the thickness of the main sediment units within the succession of Holocene alluvial deposits underlying the floodplain in the Barking Reach of the Lower Thames Valley. The basis of the modelling exercise is discussed and the models are used to assess the significance of floodplain relief in determining patterns of sedimentation. This evidence is combined with the results of biostratigraphical and geochronological investigations to reconstruct the environmental conditions associated with each successive stage of floodplain aggradation. The two main factors affecting the history and spatial pattern of Holocene sedimentation are shown to be the regional behaviour of relative sea level and the pattern of relief on the surface of the sub-alluvial, Late Devensian Shepperton Gravel. As is generally the case in the Lower Thames Valley, three main stratigraphic units are recognised, the Lower Alluvium, a peat bed broadly equivalent to the Tilbury III peat of Devoy (1979) and an Upper Alluvium. There is no evidence to suggest that the floodplain was substantially re-shaped by erosion during the Holocene. Instead, the relief inherited from the Shepperton Gravel surface was gradually buried either by the accumulation of peat or by deposition of fine-grained sediment from suspension in standing or slow-moving water. The palaeoenvironmental record from Barking confirms important details of the Holocene record observed elsewhere in the Lower Thames Valley, including the presence of Taxus in the valley-floor fen carr woodland between about 5000 and 4000 cal BP, and the subsequent growth of Ulmus on the peat surface.
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported, and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs as well as operational and, ultimately, reputational risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk its obsolescence poses: the structure of a business and its supporting IT resources is captured, modelled and analysed, and the risk to the business of technology obsolescence is identified, enabling remedial action based on qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. The generic technique could, however, be applied to any industry; there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
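A heatmap algorithm of the kind described could, in spirit, be sketched as follows. This is a minimal illustration, not the Capgemini method itself: the field names, weights and traffic-light thresholds are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class SoftwareElement:
    """One software element from an enterprise architecture model.

    Field names are hypothetical stand-ins for whatever the
    EA meta-model actually records.
    """
    name: str
    years_past_end_of_support: float  # 0.0 if still vendor-supported
    business_criticality: int         # 1 (low) .. 5 (high)

def obsolescence_risk(elem: SoftwareElement) -> float:
    """Score risk as support exposure weighted by criticality.

    Exposure ramps from 0 to 1 over five years out of support
    (an arbitrary illustrative cap), so scores range 0..5.
    """
    exposure = min(elem.years_past_end_of_support / 5.0, 1.0)
    return exposure * elem.business_criticality

def heatmap_band(score: float) -> str:
    """Map a numeric risk score to a traffic-light heatmap band."""
    if score >= 3.0:
        return "red"
    if score >= 1.0:
        return "amber"
    return "green"
```

Run over every software element in the model, the bands give the kind of colour-coded view that lets high-risk obsolescent elements stand out for remedial action.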
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported, and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs as well as operational and, ultimately, reputational risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk its obsolescence poses: the structure of a business and its supporting IT resources is captured, modelled and analysed, and the risk to the business of technology obsolescence is identified, enabling remedial action based on qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in two consulting studies carried out by Capgemini involving three UK police forces. The generic technique could, however, be applied to any industry; there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
Abstract:
The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on ‘end-to-end’ uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but a large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.
Abstract:
Multi-proxy analyses from floodplain deposits in the Colne Valley, southern England, have provided a palaeoenvironmental context for the immediately adjacent Terminal Upper Palaeolithic and Early Mesolithic site of Three Ways Wharf. These deposits show the transition from an open cool environment to fully developed heterogeneous floodplain vegetation during the Early Mesolithic. Several distinct phases of burning are shown to have occurred that are chronologically contemporary with the local archaeological record. The floodplain itself is shown to have supported a number of rare Urwaldrelikt insect species implying human manipulation of the floodplain at this time must have been limited or episodic. By the Late Mesolithic a reed-sedge swamp had developed across much of the floodplain, within which repeated burning of the in situ vegetation took place. This indicates deliberate land management practices utilising fire, comparable with findings from other floodplain sequences in southern Britain. With similar sedimentary sequences known to exist across the Colne Valley, often closely associated with contemporary archaeology, the potential for placing the archaeological record within a spatially explicit palaeoenvironmental context is great.
Abstract:
Urban greening solutions such as green roofs help improve residents’ thermal comfort and building insulation. However, not all plants provide the same level of cooling. This is partially due to differences in plant structure and function, including the different mechanisms that plants employ to regulate leaf temperature. The relative ranking of the multiple leaf/plant traits involved in the regulation of leaf temperature (and, consequently, plants’ cooling ‘service’) is not well understood. We therefore investigated the relative importance of water loss, leaf colour, thickness and extent of pubescence for the regulation of leaf temperature, in the context of species for semi-extensive green roofs. Leaf temperatures were measured with an infrared imaging camera in a range of contrasting genotypes within three plant genera (Heuchera, Salvia and Sempervivum). In three glasshouse experiments (each evaluating three or four genotypes of each genus) we varied water availability to the plants and assessed how leaf temperature altered depending on water loss and specific leaf traits. The greatest reductions in leaf temperature were closely associated with higher water loss. Additionally, in the non-succulents (Heuchera, Salvia), lighter leaf colour and longer hair length (on pubescent leaves) both contributed to reduced leaf temperature. However, in the succulent Sempervivum, colour/pubescence made no significant contribution; leaf thickness and water loss rate were the key regulating factors. We propose that this can lead to different plant types having significantly different cooling potentials. We suggest that maintaining transpirational water loss through sustainable irrigation, and selecting urban plants with favourable morphological traits, is the key to maximising the thermal benefits provided by applications such as green roofs.
Abstract:
The emergence and development of digital imaging technologies and their impact on mainstream filmmaking is perhaps the most familiar special effects narrative associated with the years 1981-1999. This is in part because some of the questions raised by the rise of the digital still concern us now, but also because key milestone films showcasing advancements in digital imaging technologies appear in this period, including Tron (1982) and its computer generated image elements, the digital morphing in The Abyss (1989) and Terminator 2: Judgment Day (1991), computer animation in Jurassic Park (1993) and Toy Story (1995), digital extras in Titanic (1997), and ‘bullet time’ in The Matrix (1999). As a result it is tempting to characterize 1981-1999 as a ‘transitional period’ in which digital imaging processes grow in prominence and technical sophistication, and what we might call ‘analogue’ special effects processes correspondingly become less common. But such a narrative risks eliding the other practices that also shape effects sequences in this period. Indeed, the 1980s and 1990s are striking for the diverse range of effects practices in evidence in both big budget films and lower budget productions, and for the extent to which analogue practices persist independently of or alongside digital effects work in a range of production and genre contexts. The chapter seeks to document and celebrate this diversity and plurality, this sustaining of earlier traditions of effects practice alongside newer processes, this experimentation with materials and technologies old and new in the service of aesthetic aspirations alongside budgetary and technical constraints. 
The common characterization of the period as a series of rapid transformations in production workflows, practices and technologies will be interrogated in relation to the persistence of certain key figures such as Douglas Trumbull, John Dykstra, and James Cameron, but also through a consideration of the contexts for and influences on creative decision-making. Comparative analyses of the processes used to articulate bodies, space and scale in effects sequences drawn from different generic sites of special effects work, including science fiction, fantasy, and horror, will provide a further frame for the chapter’s mapping of the commonalities and specificities, continuities and variations in effects practices across the period. In the process, the chapter seeks to reclaim the contribution of analogue processes both to moments of explicit spectacle and to diegetic verisimilitude, in the decades most often associated with the digital’s ‘arrival’.