102 results for The 7pm Project


Relevance:

90.00%

Abstract:

A set of high-resolution radar observations of convective storms has been collected to evaluate such storms in the UK Met Office Unified Model during the DYMECS project (Dynamical and Microphysical Evolution of Convective Storms). The 3-GHz Chilbolton Advanced Meteorological Radar was set up with a scan-scheduling algorithm to automatically track convective storms identified in real time from the operational rainfall radar network. More than 1,000 storm observations gathered over fifteen days in 2011 and 2012 are used to evaluate the model under various synoptic conditions supporting convection. In terms of the detailed three-dimensional morphology, storms in the 1500-m grid-length simulations are shown to produce horizontal structures a factor of 1.5–2 wider than the radar observations. A set of nested model runs at grid lengths down to 100 m shows that the models converge in terms of storm width, but the storm structures in the simulations with the smallest grid lengths are too narrow and too intense compared with the radar observations. The modelled storms were surrounded by a region of drizzle without ice reflectivities above 0 dBZ aloft, which was related to the dominance of ice crystals and was improved by allowing only aggregates as an ice-particle habit. Simulations with graupel outperformed the standard configuration for heavy-rain profiles, but the storm structures were a factor of 2 too wide and the convective cores 2 km too deep.
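
As an illustration of the kind of storm-width diagnostic described above, the sketch below computes an equivalent storm diameter from the area of a reflectivity field exceeding a threshold. The function name, the 30 dBZ threshold and the synthetic fields are illustrative assumptions, not the DYMECS methodology.

import numpy as np

def equivalent_storm_width(reflectivity_dbz, dx_km, threshold_dbz=30.0):
    """Equivalent diameter (km) of the area exceeding a reflectivity threshold.

    reflectivity_dbz : 2-D array of radar or model reflectivity (dBZ)
    dx_km            : horizontal grid spacing (km)
    threshold_dbz    : threshold defining the storm (illustrative choice)
    """
    n_cells = np.count_nonzero(reflectivity_dbz >= threshold_dbz)
    area_km2 = n_cells * dx_km ** 2
    return 2.0 * np.sqrt(area_km2 / np.pi)  # diameter of the equal-area circle

# Synthetic example only: compare a "radar" field on a 300-m grid with a
# "model" field on a 1500-m grid and report the width ratio.
rng = np.random.default_rng(0)
obs = rng.normal(20.0, 10.0, size=(200, 200))   # fake observed reflectivity (dBZ)
mod = rng.normal(25.0, 10.0, size=(40, 40))     # fake modelled reflectivity (dBZ)
ratio = equivalent_storm_width(mod, 1.5) / equivalent_storm_width(obs, 0.3)
print(f"model/observed width ratio: {ratio:.2f}")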

Relevance:

90.00%

Abstract:

The Mediterranean is an important eco-region; however, it suffers from the lack of common procedures for managing and monitoring the sustainability of its protected areas. The INNOVA project addresses this issue by developing a procedure, namely PASEMP, as well as tools that can assist protected-area managers and responsible authorities in developing and implementing a monitoring strategy for their areas. This handbook is proposed as a flexible tool, or reference text, to be used in combination with the PASEMP guidelines to identify indicators; it also contains guidance on how to implement the monitoring strategy and report its results.

Relevance:

90.00%

Abstract:

Nanoparticles emitted from road traffic are the largest source of respiratory exposure for the general public living in urban areas. It has been suggested that the adverse health effects of airborne particles may scale with airborne particle number, which, if correct, focuses attention on the nanoparticle (less than 100 nm) size range that dominates the number count in urban areas. Urban measurements of particle size distributions have tended to show a broadly similar pattern dominated by a mode centred on 20–30 nm diameter, emitted by diesel engine exhaust. In this paper we report the results of measurements of particle number concentration and size distribution made in a major London park as well as on the BT Tower, 160 m aloft. These measurements, taken during the REPARTEE project (Regents Park and BT Tower experiment), show a remarkable shift in particle size distributions, with major losses of the smallest particle class as particles are advected away from the traffic source. In the park, the traffic-related mode at 20–30 nm diameter is much reduced, with a new mode at <10 nm. Size-distribution measurements also revealed higher number concentrations of sub-50 nm particles at the BT Tower on days affected by higher turbulence, as determined by Doppler lidar measurements; this is indicative of a loss of nanoparticles from air aged during less turbulent conditions. These results suggest nanoparticle loss by evaporation rather than by coagulation processes. The results have major implications for understanding the impacts of traffic-generated particulate matter on human health.
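
For readers unfamiliar with how such size distributions are usually described, the sketch below evaluates a lognormal number size distribution, dN/dlog10 Dp, for a traffic-like mode near 25 nm plus a smaller sub-10 nm mode. All mode parameters and concentrations are illustrative assumptions, not REPARTEE measurements.

import numpy as np

def dN_dlogDp(Dp_nm, N_total, Dg_nm, sigma_g):
    """Lognormal mode: particle number per unit log10(diameter), cm^-3."""
    log_sigma = np.log10(sigma_g)
    return (N_total / (np.sqrt(2.0 * np.pi) * log_sigma)
            * np.exp(-np.log10(Dp_nm / Dg_nm) ** 2 / (2.0 * log_sigma ** 2)))

Dp = np.logspace(0, 3, 200)                       # 1 nm to 1 micrometre
# Illustrative parameters only (not measured values):
roadside = dN_dlogDp(Dp, N_total=2.0e4, Dg_nm=25.0, sigma_g=1.8)   # traffic mode
park = (dN_dlogDp(Dp, N_total=4.0e3, Dg_nm=25.0, sigma_g=1.8)      # reduced traffic mode
        + dN_dlogDp(Dp, N_total=6.0e3, Dg_nm=8.0, sigma_g=1.5))    # new sub-10 nm mode
print(f"peak of the park distribution at {Dp[np.argmax(park)]:.1f} nm")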

Relevance:

90.00%

Abstract:

The inclusion of the direct and indirect radiative effects of aerosols in high-resolution global numerical weather prediction (NWP) models is increasingly recognised as important for improving the accuracy of short-range weather forecasts. In this study the impacts of increasing the aerosol complexity in the global NWP configuration of the Met Office Unified Model (MetUM) are investigated. A hierarchy of aerosol representations is evaluated, including three-dimensional monthly-mean speciated aerosol climatologies, fully prognostic aerosols modelled using the CLASSIC aerosol scheme and, finally, aerosols initialised from assimilated aerosol fields from the GEMS project. The prognostic aerosol schemes are better able to predict the temporal and spatial variation of atmospheric aerosol optical depth, which is particularly important in cases of large sporadic aerosol events such as large dust storms or forest fires. Including the direct effect of aerosols improves model biases in outgoing long-wave radiation over West Africa due to a better representation of dust. However, uncertainties in dust optical properties propagate to its direct effect and the subsequent model response. Inclusion of the indirect aerosol effects improves surface radiation biases at the North Slope of Alaska ARM site due to lower cloud amounts in high-latitude clean-air regions. This leads to improved temperature and height forecasts in this region. Impacts on the global mean model precipitation and large-scale circulation fields were found to be generally small in the short-range forecasts. However, the indirect aerosol effect leads to a strengthening of the low-level monsoon flow over the Arabian Sea and Bay of Bengal and an increase in precipitation over Southeast Asia. Regional impacts on the African Easterly Jet (AEJ) are also presented, with the large dust loading in the aerosol climatology enhancing the heat low over West Africa and weakening the AEJ. This study highlights the importance of including a more realistic treatment of aerosol–cloud interactions in global NWP models and the potential for improved global environmental prediction systems through the incorporation of more complex aerosol schemes.
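
A minimal sketch of how the experiment hierarchy described above might be organised is given below; the class and field names are hypothetical illustrations of the study design, not the MetUM or CLASSIC configuration interface.

from dataclasses import dataclass
from enum import Enum

class AerosolSource(Enum):
    CLIMATOLOGY = "3-D monthly-mean speciated aerosol climatology"
    PROGNOSTIC_CLASSIC = "fully prognostic CLASSIC aerosols"
    INITIALISED_GEMS = "aerosols initialised from assimilated GEMS fields"

@dataclass
class AerosolExperiment:
    source: AerosolSource
    direct_effect: bool      # aerosol-radiation interaction active?
    indirect_effect: bool    # aerosol-cloud interaction active?

# One possible ladder of increasing aerosol complexity for short-range forecasts
experiments = [
    AerosolExperiment(AerosolSource.CLIMATOLOGY, direct_effect=True, indirect_effect=False),
    AerosolExperiment(AerosolSource.PROGNOSTIC_CLASSIC, direct_effect=True, indirect_effect=True),
    AerosolExperiment(AerosolSource.INITIALISED_GEMS, direct_effect=True, indirect_effect=True),
]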

Relevance:

90.00%

Abstract:

Predictions of twenty-first-century sea level change show strong regional variation. Regional sea level change observed by satellite altimetry since 1993 is also not spatially homogeneous. By comparison with historical and pre-industrial control simulations using the atmosphere–ocean general circulation models (AOGCMs) of the CMIP5 project, we conclude that the observed pattern is generally dominated by unforced (internally generated) variability, although some regions, especially in the Southern Ocean, may already show an externally forced response. Simulated unforced variability cannot explain the observed trends in the tropical Pacific, but we suggest that this is due to inadequate simulation of variability by CMIP5 AOGCMs rather than evidence of anthropogenic change. We apply the method of pattern scaling to projections of sea level change and show that it gives accurate estimates of future local sea level change in response to anthropogenic forcing as simulated by the AOGCMs under RCP scenarios, implying that the pattern will remain stable in future decades. We note, however, that the use of a single integration to evaluate the performance of the pattern-scaling method tends to exaggerate its accuracy. We find that ocean volume-mean temperature is generally a better predictor than global-mean surface temperature of the magnitude of sea level change, and that the pattern is very similar under the different RCPs for a given model. We determine that the forced signal will be detectable above the noise of unforced internal variability within the next decade globally and may already be detectable in the tropical Atlantic.
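
As a rough illustration of pattern scaling, the sketch below regresses local sea level change on a scalar predictor (here a generic ocean-mean temperature anomaly) to obtain a spatial pattern in metres per kelvin, and then multiplies that pattern by a future predictor time series. The array shapes and synthetic data are assumptions for illustration, not the study's data or exact procedure.

import numpy as np

def fit_pattern(local_slc, predictor):
    """Least-squares slope (m per K) of local sea level change on a scalar predictor.

    local_slc : array (time, lat, lon) of local sea level change from an AOGCM
    predictor : array (time,), e.g. ocean volume-mean temperature anomaly (K)
    """
    x = predictor - predictor.mean()
    y = local_slc - local_slc.mean(axis=0)
    return np.einsum("t,tij->ij", x, y) / np.dot(x, x)

def scale_pattern(pattern, predictor_future):
    """Project local change by scaling the pattern with a future predictor series."""
    return predictor_future[:, None, None] * pattern[None, :, :]

# Synthetic demonstration (shapes and numbers are illustrative only)
rng = np.random.default_rng(0)
T_hist = np.linspace(0.0, 0.8, 95)                                # ocean-mean warming (K)
slc = 0.3 * T_hist[:, None, None] + 0.01 * rng.standard_normal((95, 90, 144))
pattern = fit_pattern(slc, T_hist)                                # ~0.3 m per K here
future = scale_pattern(pattern, np.linspace(0.8, 2.0, 85))        # projected local change (m)
print(future.shape)  # (85, 90, 144)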

Relevance:

90.00%

Abstract:

Personalised nutrition (PN) has the potential to reduce disease risk and optimise health and performance. Although previous research has shown good acceptance of the concept of PN in the UK, preferences regarding the delivery of a PN service (e.g. online v. face-to-face) are not fully understood. It is anticipated that the presence of a healthcare system that is free at the point of delivery, the National Health Service (NHS), may affect end-user preferences for how PN services are delivered in the UK. To determine this, supplementary analysis of qualitative data obtained from focus-group discussions on PN service delivery, collected as part of the Food4Me project in the UK and Ireland, was undertaken. The Irish data provided a comparative analysis for a healthcare system that is not free of charge at the point of delivery for the entire population. Analyses were conducted using the 'framework approach' described by Rabiee (Focus-group interview and data analysis. Proc Nutr Soc 63, 655-660). There was a preference for services to be led by the government and delivered face-to-face, which was perceived to increase trust and transparency and to add value. Both countries associated paying for nutritional advice with increased commitment and motivation to follow guidelines. In contrast to Ireland, however, and despite the perceived benefit of paying, UK discussants still expected PN services to be delivered free of charge by the NHS. Consideration of this unique challenge of free healthcare, which is embedded in the NHS culture, will be crucial when introducing PN to the UK.

Relevance:

90.00%

Abstract:

Aeolian dust modelling has improved significantly over the last ten years, and many institutions now consistently model dust uplift, transport and deposition in general circulation models (GCMs). However, the representation of dust in GCMs is highly variable between modelling communities due to differences in the uplift schemes employed and in the representation of the global circulation that subsequently leads to dust deflation. In this study two different uplift schemes are incorporated in the same GCM. This approach enables a clearer comparison of the dust uplift schemes themselves, without the added complexity of several different transport and deposition models. The global annual mean dust aerosol optical depths (at 550 nm) using the two dust uplift schemes were found to be 0.014 and 0.023, both lying within the estimates from the AeroCom project. However, the models also have appreciably different representations of the dust size distribution adjacent to the West African coast and very different deposition at various sites throughout the globe. The different dust uplift schemes were also capable of influencing the modelled circulation, surface air temperature and precipitation despite the use of prescribed sea surface temperatures. This has important implications for the use of dust models in AMIP-style (Atmospheric Model Intercomparison Project) simulations and in Earth-system modelling.
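
For reference, a global annual mean AOD such as the 0.014 and 0.023 quoted above is typically an area-weighted average over a latitude-longitude grid; the short sketch below shows one common way to compute it. The field shape and the cosine-latitude weighting are illustrative assumptions about a regular grid, not the study's diagnostics code.

import numpy as np

def global_annual_mean_aod(aod, lat_deg):
    """Area-weighted global annual mean of an AOD field on a regular lat-lon grid.

    aod     : array (time, lat, lon) of aerosol optical depth (e.g. monthly means)
    lat_deg : array (lat,) of grid-point latitudes in degrees
    """
    weights = np.cos(np.deg2rad(lat_deg))          # proportional to grid-cell area
    zonal_time_mean = aod.mean(axis=(0, 2))        # average over time and longitude
    return np.average(zonal_time_mean, weights=weights)

# Illustrative call with a synthetic field (not model output)
lat = np.linspace(-89.0, 89.0, 90)
aod_field = np.full((12, 90, 144), 0.02)
print(f"{global_annual_mean_aod(aod_field, lat):.3f}")   # 0.020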

Relevance:

90.00%

Abstract:

Studies within the QLIF project reviewed in this article suggest that organic or low-input management is more likely to result in milk with fatty acid profiles that are higher in α-linolenic acid and/or beneficial isomers of conjugated linoleic acid and antioxidants, with up to a 2.5-fold increase in some cases, relative to milk from conventional production. These advantages are preserved during processing, resulting in elevated contents or concentrations of these constituents in processed dairy products of organic or low-input origin. Much of the literature suggests that these benefits are very likely to be a result of a greater reliance on forages in the dairy diets (especially grazed grass). Since the adoption of alternative breeds or crosses is often an integral part of sustaining these low-input systems, it is not possible to rule out an interaction with genotype in these monitored herds. The results suggest that milk fat composition with respect to human health can be optimised by exploiting grazing in the diet of dairy cows. However, in many European regions this may not be possible due to extremes in temperature, soil moisture levels or both. In such cases milk quality can be maintained by the inclusion of oil seeds in the dairy diets.

Relevance:

90.00%

Abstract:

This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted-for source types such as the flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic, but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India, in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. These measures together defined an SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50 and 80 %, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, the most important measures were the elimination of high-emitting vehicles and wick lamps, as well as reduced emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU, the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050.
For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, and this result was in almost exact agreement with the response calculated based on the emission metrics (reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~ 22 % to this response and CH4 78 %. This could not be fully confirmed by the transient simulations, which attributed about 90 % of the temperature response to CH4 reductions. Attribution of the observed temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing and the co-emitted species of the emission basket chosen. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, in which unforced variability and sea ice responses cause relatively strong temperature fluctuations that may counteract (and thus mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans, with the largest warming reduction, 0.44 (0.39–0.49) K, over the Arctic. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates were increased by about 15 (6–21) mm yr⁻¹ (more than 4 % of total precipitation) from spring to autumn. Thus, the mitigation could help to alleviate expected future drought and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
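
As a generic illustration of how GTP-type metrics convert emission changes into a temperature-response estimate, the sketch below uses the simplified pulse-emission form: the avoided warming at a 20-year horizon is approximated as the sum over species of the emission change multiplied by the species' GTP20 and by the absolute GTP of CO2 at that horizon. This is a sketch under stated assumptions, not the regionally and seasonally resolved calculation used in ECLIPSE, and every number below is a rough, clearly hypothetical placeholder.

# delta_T ≈ sum_i delta_E_i * GTP20_i * AGTP_CO2(20), for a one-off (pulse) emission change.
# All numbers below are illustrative placeholders, not values from the ECLIPSE study.

AGTP_CO2_20 = 6.8e-16                     # K per kg CO2 at H = 20 yr (approximate literature value)
gtp20 = {"CH4": 67.0, "BC": 900.0}        # dimensionless GTP20, illustrative
pulse_reduction_kg = {"CH4": -1.0e11, "BC": -4.0e9}   # hypothetical avoided pulse emissions (kg)

delta_T = sum(pulse_reduction_kg[s] * gtp20[s] * AGTP_CO2_20 for s in gtp20)
for s in gtp20:
    share = pulse_reduction_kg[s] * gtp20[s] * AGTP_CO2_20 / delta_T
    print(f"{s}: {share:.0%} of the estimated temperature response")
print(f"total estimated response: {delta_T:+.3f} K")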

Relevance:

90.00%

Abstract:

In this invited article the authors present an evaluative report on the development of the MESHGuides project (http://www.meshguides.org/). MESHGuides’ objective is to provide education with an international knowledge management system. MESHGuides were conceived as research summaries to support teachers in developing evidence-based practice. Their aim is to enhance teachers’ capacity to engage actively with research in their own classrooms. The original thinking for MESH arose from the work of the UK-based academics Professor Marilyn Leask and Dr Sarah Younie in response to a desire, which has recently gathered momentum in the UK, for the development of a more research-informed teaching profession and for the establishment of an online platform to support evidence-based practice (DfE 2015; Leask and Younie 2001; OECD 2009). The focus of this article is on how the MESHGuides project was conceived and structured, the technical systems supporting it, and the practical reality for academics and teachers of composing and using MESHGuides. The project and the guides are in the early stages of development, and the discussion indicates future possibilities for more global engagement with this knowledge management system.

Relevance:

90.00%

Abstract:

This chapter re-evaluates the diachronic, evolutionist model that establishes the Second World War as a watershed between classical and modern cinemas, and ‘modernity’ as the political project of ‘slow cinema’. I will start by historicising the connection between cinematic speed and modernity, going on to survey the veritable obsession with the modern that continues to beset film studies despite the vagueness and contradictions inherent in the term. I will then attempt to clarify what is really at stake within the modern-classical debate by analysing two canonical examples of Japanese cinema, drawn from the geidomono genre (films on the lives of theatre actors), Kenji Mizoguchi’s Story of the Late Chrysanthemums (Zangiku monogatari, 1939) and Yasujiro Ozu’s Floating Weeds (Ukigusa, 1954), with a view to investigating the role of the long take or, conversely, classical editing, in the production or otherwise of a supposed ‘slow modernity’. By resorting to Ozu and Mizoguchi, I hope to demonstrate that the best narrative films in the world have always combined a ‘classical’ quest for perfection with the ‘modern’ doubt of its existence, hence the futility of classifying cinema in general according to an evolutionary and Eurocentric model based on the classical-modern binary. Rather than on a confusing politics of the modern, I will draw on Bazin’s prophetic insight of ‘impure cinema’, a concept he forged in defence of literary and theatrical screen adaptations. Anticipating by more than half a century the media convergence on which the near totality of our audiovisual experience is currently based, ‘impure cinema’ will give me the opportunity to focus on the confluence of film and theatre in these Mizoguchi and Ozu films as the site of a productive crisis where established genres dissolve into self-reflexive stasis, ambiguity of expression and the revelation of the reality of the film medium, all of which, I argue, are more reliable indicators of a film’s political programme than historical teleology. At the end of the journey, some answers may emerge as to whether the combination of the long take and the long shot is sufficient to account for a film’s ‘slowness’ and whether ‘slow’ is indeed the best concept to signify resistance to the destructive pace of capitalism.

Relevance:

90.00%

Abstract:

The aim of this research is to exhibit how literary playtexts can evoke multisensory trends prevalent in 21st-century theatre. In order to do so, it explores a range of practical forms and theoretical contexts for creating participatory, site-specific and immersive theatre. With reference to literary theory, specifically to semiotics, reader-response theory, postmodernism and deconstruction, it attempts to revise dramatic theory established by Aristotle’s Poetics. Considering Gertrude Stein’s essay, Plays (1935), and relevant trends in theatre and performance, shaped by space, technology and the ever-changing role of the audience member, a postdramatic poetics emerges from which to analyze the plays of Mac Wellman and Suzan-Lori Parks. Distinguishing the two textual lives of a play as the performance playtext and the literary playtext, it examines the conventions of the printed literary playtext, with reference to models of practice that radicalize the play form, including works by Mabou Mines, The Living Theatre and Fiona Templeton. The arguments of this practice-led Ph.D. developed out of direct engagement with the practice project, which explores the multisensory potential of written language when combined with hypermedia. The written thesis traces the development process of a new play, Rumi High, which is presented digitally as a ‘hyper(play)text,’ accessible through the Internet at www.RumiHigh.org. Here, ‘playwrighting’ practice is expanded spatially, collaboratively and textually. Plays are built, designed and crafted with many layers of meaning that explore both linguistic and graphic modes of poetic expression. The hyper(play)text of Rumi High establishes playwrighting practice as curatorial, where performance and literary playtexts are in a reciprocal relationship. This thesis argues that digital writing and reading spaces enable new approaches to expressing the many languages of performance, while expanding the collaborative network that produces the work. It questions how participatory forms of immersive and site-specific theatre can be presented as interactive literary playtexts, which enable the reader to have a multisensory experience. Through a reflection on process and an evaluation of the practice project, this thesis problematizes notions of authorship and text.