862 results for "make energy use more effective"
Abstract:
Commercial kitchens are among the most profligate users of gas, water and electricity in the UK and can leave a large carbon footprint. It is estimated that the total energy consumption of Britain’s catering industry is in excess of 21,600 million kWh per year. To facilitate appropriate energy reduction within licensed restaurants, energy use must be translated into a form that can be compared between kitchens, enabling operators to assess how they are improving and allowing rapid identification of facilities that require action. A review of relevant literature is presented and current benchmarking methods are discussed in order to assist in the development and categorisation of energy-reduction benchmarks for commercial kitchens. Energy use within leading UK brands is discussed for the purpose of benchmarking in terms of factors such as size and output.
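The comparable form of energy use this abstract calls for can be illustrated with a simple normalisation: dividing annual consumption by output (meals served) and by kitchen size. This is a minimal sketch; the function name and all figures below are invented for illustration and are not data or methods from the paper.

```python
# Illustrative benchmark normalisation for commercial kitchens: raw annual
# energy use is converted into intensity figures that can be compared
# between sites of different size and output. All numbers are invented.

def energy_benchmarks(kwh_per_year, meals_per_year, floor_area_m2):
    """Return energy-intensity benchmarks per meal served and per m2."""
    return {
        "kWh_per_meal": kwh_per_year / meals_per_year,
        "kWh_per_m2": kwh_per_year / floor_area_m2,
    }

kitchen_a = energy_benchmarks(150_000, 60_000, 120)  # large, busy site
kitchen_b = energy_benchmarks(90_000, 20_000, 80)    # smaller, quieter site

# Although kitchen_b uses less energy in absolute terms, its per-meal
# intensity (4.5 vs 2.5 kWh/meal) flags it as the site requiring action.
```

Size- and output-normalised figures of this kind are what allow the cross-kitchen comparisons the abstract describes.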
Abstract:
This letter presents an effective approach for selecting an appropriate terrain modeling method when forming a digital elevation model (DEM). The approach achieves a balance between modeling accuracy and modeling speed. A terrain complexity index is defined to represent a terrain's complexity, and a support vector machine (SVM) classifies terrain surfaces as either complex or moderate based on this index together with the terrain elevation range. The classification result recommends a terrain modeling method for a given data set in accordance with its required modeling accuracy. Sample terrain data from the lunar surface are used in constructing an experimental data set. The results show that the terrain complexity index properly reflects terrain complexity, and that the SVM classifier derived from both the terrain complexity index and the terrain elevation range is more effective and generic than one designed from either feature alone. Statistically, the average classification accuracy of the SVMs is about 84.3% ± 0.9% across terrain types (complex or moderate). For various ratios of complex to moderate terrain in a selected data set, the DEM modeling speed increases by up to 19.5% for a given DEM accuracy.
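The two-feature classification described above can be sketched as follows. For brevity, the trained SVM is replaced here by a hand-set linear decision boundary, and the complexity index is taken to be mean absolute local relief; both are illustrative assumptions, not the letter's actual index definition or model parameters.

```python
# Sketch of the two-feature terrain classification: a complexity index and
# the elevation range feed a classifier that labels terrain "complex" or
# "moderate". The index definition, weights and bias are assumptions.

def complexity_index(grid):
    """Mean absolute elevation difference between neighbouring cells."""
    diffs = []
    for i in range(len(grid)):
        for j in range(len(grid[0])):
            if i + 1 < len(grid):
                diffs.append(abs(grid[i][j] - grid[i + 1][j]))
            if j + 1 < len(grid[0]):
                diffs.append(abs(grid[i][j] - grid[i][j + 1]))
    return sum(diffs) / len(diffs)

def elevation_range(grid):
    flat = [v for row in grid for v in row]
    return max(flat) - min(flat)

def classify_terrain(grid, w_c=1.0, w_r=0.05, bias=-3.0):
    """Linear stand-in for the SVM: label 'complex' when the weighted
    score of (complexity index, elevation range) crosses the boundary."""
    score = w_c * complexity_index(grid) + w_r * elevation_range(grid) + bias
    return "complex" if score > 0 else "moderate"

smooth = [[0, 1, 2], [1, 2, 3], [2, 3, 4]]   # gentle ramp
rugged = [[0, 9, 1], [8, 0, 9], [1, 9, 0]]   # strong local relief
labels = (classify_terrain(smooth), classify_terrain(rugged))
# → ('moderate', 'complex')
```

The classifier's label would then select the modeling method, with faster interpolators reserved for "moderate" surfaces.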
Abstract:
In order to make best use of the opportunities provided by space missions such as the Radiation Belt Storm Probes, we determine the response of complementary subionospheric radiowave propagation measurements (VLF), riometer absorption measurements (CNA), and GPS-produced total electron content (vTEC) to different energetic electron precipitation (EEP). We model the relative sensitivity and responses of these instruments to idealised monoenergetic beams of precipitating electrons, and to more realistic EEP spectra chosen to represent radiation belt and substorm precipitation. In the monoenergetic beam case, we find riometers are more sensitive to the same EEP event occurring during the day than during the night, subionospheric VLF shows the opposite relationship, and the change in vTEC is independent of local time. In general, the subionospheric VLF measurements are much more sensitive than the other two techniques for EEP above 200 keV, responding to flux magnitudes two to three orders of magnitude smaller than those detectable by a riometer. Detectable TEC changes occur only for extreme monoenergetic fluxes. For the radiation belt EEP case, clearly detectable subionospheric VLF responses are produced by daytime fluxes that are ~10 times lower than required for riometers, while nighttime fluxes can be 10,000 times lower. Riometers are likely to respond only to radiation belt fluxes during the largest EEP events, and vTEC is unlikely to be significantly disturbed by radiation belt EEP. For the substorm EEP case, both the riometer absorption and the subionospheric VLF technique respond significantly, as does the change in vTEC, which is likely to be detectable at ~3-4 TECu.
Abstract:
The UK Government's Department for Energy and Climate Change has been investigating the feasibility of developing a national energy efficiency data framework covering both domestic and non-domestic buildings. Working closely with the Energy Saving Trust and energy suppliers, the Department aims to develop a data framework to monitor changes in energy efficiency, develop and evaluate programmes, and improve the information available to consumers. Key applications of the framework are to understand trends in built-stock energy use, identify drivers, and evaluate the success of different policies. For energy suppliers, it could identify which energy uses are growing, in which sectors, and why; this would help with market segmentation and the design of products. For building professionals, it could supplement energy audits and modelling of end-use consumption with real data and support the generation of accurate and comprehensive benchmarks. This paper critically examines the results of the first phase of work to construct a national energy efficiency data framework for the domestic sector, focusing on two specific issues: (a) the drivers of domestic energy consumption in terms of the physical nature of dwellings and the socio-economic characteristics of occupants, and (b) the impact of energy efficiency measures on energy consumption.
Abstract:
A new aerosol index for the Along-Track Scanning Radiometers (ATSRs) is presented that provides a means to detect desert dust contamination in infrared SST retrievals. The ATSR Saharan dust index (ASDI) utilises only the thermal infrared channels and may therefore be applied consistently to the entire ATSR data record (1991 to present), for both daytime and night-time observations. The derivation of the ASDI is based on a principal component (PC) analysis (PCA) of two unique pairs of channel brightness temperature differences (BTDs). In 2-D space (i.e. BTD vs BTD), it is found that the loci of data unaffected by aerosol are confined to a single axis of variability. In contrast, the loci of aerosol-contaminated data fall off-axis, shifting in a direction that is approximately orthogonal to the clear-sky axis. The ASDI is therefore defined to be the second PC, where the first PC accounts for the clear-sky variability. The primary ASDI utilises the ATSR nadir and forward-view observations at 11 and 12 μm (ASDI2). A secondary, three-channel nadir-only ASDI (ASDI3) is also defined for situations where data from the forward view are not available. Empirical and theoretical analyses suggest that ASDI is well correlated with aerosol optical depth (AOD: correlation r is typically > 0.7) and provides an effective tool for detecting desert mineral dust. Overall, ASDI2 is found to be more effective than ASDI3, with the latter being sensitive only to very high dust loading. In addition, use of ASDI3 is confined to night-time observations as it relies on data from the 3.7 μm channel, which is sensitive to reflected solar radiation. This highlights the benefits of having data from both a nadir and a forward view for this particular approach to aerosol detection.
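The PCA construction behind the ASDI can be sketched as follows: points in BTD-vs-BTD space are projected onto the second principal component, so clear-sky points (which lie along the first PC) score near zero while off-axis, dust-contaminated points score strongly. This is a minimal sketch; the synthetic BTD values are invented for illustration, and real ATSR channel data and calibration are not used.

```python
# Sketch of an ASDI-style index: project 2-D (BTD, BTD) points onto the
# second principal component of the point cloud. The first PC captures
# clear-sky variability; off-axis (dust-affected) points stand out on PC2.
import math

def second_pc_index(points):
    """Projection of each mean-centred 2-D point onto the 2nd PC."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    # Closed-form eigenvalues of the symmetric 2x2 covariance matrix
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam2 = tr / 2 - math.sqrt(tr ** 2 / 4 - det)   # smaller eigenvalue
    vx, vy = sxy, lam2 - sxx                        # its eigenvector
    if vx == 0 and vy == 0:                         # degenerate: sxy == 0
        vx, vy = (1.0, 0.0) if sxx < syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    return [((x - mx) * vx + (y - my) * vy) / norm for x, y in points]

# Clear-sky BTD pairs lie on one axis; a contaminated point falls off it.
clear_sky = [(t, 2.0 * t) for t in range(5)]   # illustrative, not real BTDs
dusty = (2.0, 6.0)                             # off the clear-sky axis
asdi = second_pc_index(clear_sky + [dusty])
```

In this toy data the dusty point's index magnitude exceeds every clear-sky value, which is the separation the ASDI exploits for dust screening.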
Abstract:
The aim of this article is to improve the communication of probabilistic flood forecasts generated by hydrological ensemble prediction systems (HEPS) by understanding perceptions of different methods of visualizing probabilistic forecast information. This study focuses on interexpert communication and accounts for differences in visualization requirements based on the information content necessary for individual users. The perceptions of the expert group addressed in this study are important because they are the designers and primary users of existing HEPS. Nevertheless, they have sometimes resisted the release of uncertainty information to the general public because of doubts about whether it can be successfully communicated in ways that would be readily understood by nonexperts. In this article, we explore the strengths and weaknesses of existing HEPS visualization methods and thereby formulate some wider recommendations about best practice for HEPS visualization and communication. We suggest that specific training on probabilistic forecasting would foster the use of probabilistic forecasts across a wider range of applications. The results of a case-study exercise showed that there is no overarching agreement between experts on how to display probabilistic forecasts or on what they consider the essential information that should accompany plots and diagrams. Finally, we propose a list of minimum properties that, if consistently displayed with probabilistic forecasts, would make the products more easily understandable. Copyright © 2012 John Wiley & Sons, Ltd.
Abstract:
Background Major depressive disorders (MDD) are a debilitating and pervasive group of mental illnesses afflicting many millions of people, resulting in the loss of 110 million working days and more than 2,500 suicides per annum. Adolescent MDD patients attending NHS clinics show high rates of recurrence into adult life. A meta-analysis of recent research shows that psychological treatments are not as efficacious as previously thought. Modest treatment outcomes, with approximately 65% of cases responding, suggest that aetiological and clinical heterogeneity may hamper the better use of existing therapies and the discovery of more effective treatments. Information on the optimal treatment choice for individuals is lacking, with no validated biomarkers to aid therapeutic decision-making. Methods/Design Magnetic resonance-Improving Mood with Psychoanalytic and Cognitive Therapies, the MR-IMPACT study, plans to identify brain regions implicated in the pathophysiology of depression and to examine whether there are specific behavioural or neural markers predicting remission and/or subsequent relapse in a subsample of depressed adolescents recruited to the IMPACT randomised controlled trial (Registration # ISRCTN83033550). Discussion MR-IMPACT is an investigative biomarker component of the IMPACT pragmatic effectiveness trial. The aim of this investigation is to identify neural markers and regional indicators of the pathophysiology of, and treatment response in, MDD in adolescents. We anticipate that these data may enable more targeted treatment delivery by identifying those patients who may be optimal candidates for therapeutic response.
Abstract:
In 2007, the world reached the unprecedented milestone of half of its people living in cities, and that proportion is projected to reach 60% by 2030. The combined effect of global climate change and rapid urban growth, accompanied by economic and industrial development, will likely make city residents more vulnerable to a number of urban environmental problems, including extreme weather and climate conditions, sea-level rise, poor public health and air quality, atmospheric transport of accidental or intentional releases of toxic material, and limited water resources. One fundamental aspect of predicting future risks and defining mitigation strategies is to understand the weather and regional climate as affected by cities. For this reason, dozens of researchers from many disciplines and nations attended the Urban Weather and Climate Workshop, along with twenty-five students from Chinese universities and institutes. The presentations by the workshop's participants spanned a wide range of topics, from the interaction between urban climate and energy consumption in a changing climate to the impact of urban areas on storms and local circulations, and from the impact of urbanization on the hydrological cycle to air quality and weather prediction.
Abstract:
Cities and global climate change are closely linked: cities are where the bulk of greenhouse gas emissions take place through the consumption of fossil fuels; they are where an increasing proportion of the world’s people live; and they also generate their own climate, commonly characterized by the urban heat island. Understanding how cities affect the cycling of energy, water, and carbon to create an urban climate is therefore a key element of climate mitigation and adaptation strategies, especially in the context of rising global temperatures and deteriorating air quality in many cities. As climate models resolve finer spatial scales, they will need to represent those areas in which more than 50% of the world’s population already live in order to provide climate projections of greater use to planning and decision-making. Finally, many of the processes that are instrumental in determining urban climate are the same factors driving global anthropogenic climate change, namely regional-scale land-use change, increased energy use, and increased emissions of climatically relevant atmospheric constituents. Cities are therefore both a case study for understanding, and an agent in mitigating, anthropogenic climate change. This chapter reviews and summarizes the current state of understanding of the physical basis of urban climates, as well as our ability to represent these in models. We argue that addressing the challenges of managing urban environments in a changing climate requires understanding the energy, water, and carbon balances for an urban landscape and, importantly, their interactions and feedbacks, together with their links to human behaviour and controls. We conclude with some suggestions for where further research is needed.
Abstract:
This paper highlights some communicative and institutional challenges to using ensemble prediction systems (EPS) in operational flood forecasting, warning, and civil protection. Focusing in particular on Sweden's experience of applying EPS to operational flood forecasting as part of the PREVIEW FP6 project, the paper draws on a wider set of site visits, interviews, and participant observation with flood forecasting centres and civil protection authorities (CPAs) in Sweden and 15 other European states to reflect on Sweden's comparative success in enabling CPAs to make operational use of EPS for flood risk management. From that experience, the paper identifies four broader lessons for other countries interested in developing the operational capacity to make, communicate, and use EPS for flood forecasting and civil protection. We conclude that effective training and clear communication of EPS, while clearly necessary, are by no means sufficient to ensure effective use of EPS. Attention must also be given to overcoming the institutional obstacles to their use and to identifying operational choices for which EPS is seen to add value, rather than uncertainty, to operational decision making by CPAs.
Abstract:
Purpose The relative efficacy of different eye exercise regimes is unclear, and in particular the influences of practice, placebo and the amount of effort required are rarely considered. This study recorded conventional clinical measures after different regimes in typical young adults. Methods A total of 156 asymptomatic young adults were directed to carry out eye exercises three times daily for two weeks. Exercises were aimed at improving blur responses (accommodation), disparity responses (convergence), both in a naturalistic relationship, convergence in excess of accommodation, accommodation in excess of convergence, or a placebo regime. They were compared with two control groups, neither of which was given exercises; the second was asked to make maximum effort during the second testing. Results Instruction set and participant effort were more effective than many exercises. Convergence exercises independent of accommodation were the most effective treatment, followed by accommodation exercises, and both regimes resulted in changes in both vergence and accommodation test responses. Exercises targeting convergence and accommodation working together were less effective than those where they were separated. Accommodation measures were prone to large instruction/effort effects, and monocular accommodation facility was subject to large practice effects. Conclusions Separating convergence and accommodation exercises seemed more effective than exercising both systems concurrently, suggesting that stimulation of accommodation and convergence may act in an additive fashion to aid responses. Instruction/effort effects are large and should be carefully controlled if claims for the efficacy of any exercise regime are to be made.
Abstract:
Biodiversity informatics plays a central enabling role in the research community's efforts to address scientific conservation and sustainability issues. Great strides have been made in the past decade in establishing a framework for sharing data, in which taxonomy and systematics has been perceived as the most prominent discipline involved. To some extent this is inevitable, given the use of species names as the pivot around which information is organised. To address the urgent questions around conservation, land use, environmental change, sustainability, food security and ecosystem services that are facing governments worldwide, we need to understand how the ecosystem works. We therefore need a systems approach to understanding biodiversity that moves significantly beyond taxonomy and species observations. Such an approach needs to look at the whole system to address species interactions, both with their environment and with other species. It is clear that some barriers to progress are sociological: essentially, persuading people to use the technological solutions that are already available. This is best addressed by developing more effective systems that deliver immediate benefit to the user, hiding the majority of the technology behind simple user interfaces. An infrastructure should be a space in which activities take place and, as such, should be effectively invisible. This community consultation paper positions the role of biodiversity informatics for the next decade, presenting the actions needed to link the various biodiversity infrastructures invisibly and to facilitate understanding that can support both business and policy-makers. The community considers the goal in biodiversity informatics to be full integration of the biodiversity research community, including citizen science, through a commonly shared, sustainable e-infrastructure across all sub-disciplines that reliably serves science and society alike.
Abstract:
Using 1D Vlasov drift-kinetic computer simulations, it is shown that electron trapping in long-period standing shear Alfven waves (SAWs) provides an efficient sink for wave energy that is much more effective than Landau damping. It is also suggested that the plasma environment of low-altitude auroral-zone geomagnetic field lines is better suited to electron acceleration by inertial- or kinetic-scale Alfven waves. This is due to the self-consistent response of the electron distribution function to SAWs, which must accommodate the low-altitude large-scale current system in standing waves. We characterize these effects in terms of the relative magnitude of the wave phase and electron thermal velocities. While particle trapping is shown to be significant across a wide range of plasma temperatures and wave frequencies, we find that electron beam formation in long-period waves is more effective in relatively cold plasma.
Abstract:
A universal systems design process is specified, tested in a case study and evaluated. It links English narratives to numbers using a categorical language framework, with mathematical mappings taking the place of conjunctions and numbers. The framework is a ring of English narrative words between 1 (option) and 360 (capital); beyond 360 the ring cycles again to 1. English narratives are shown to correspond to the field of fractional numbers. The process can enable the development, presentation and communication of complex narrative policy information among communities of any scale, on a software implementation known as the "ecoputer". The information is more accessible and comprehensive than that in conventional decision support because: (1) it is expressed in narrative language; and (2) the narratives are expressed as compounds of words within the framework. Hence option generation is made more effective than in conventional decision support processes, including Multiple Criteria Decision Analysis, Life Cycle Assessment and Cost-Benefit Analysis. The case study is of a participatory workshop on UK bioenergy project objectives and criteria, at which attributes were elicited in environmental, economic and social systems. From the attributes, the framework was used to derive consequences at a range of levels of precision; these are compared with the project objectives and criteria as set out in the Case for Support. The design process is to be supported by a social information manipulation, storage and retrieval system for numeric and verbal narratives attached to the "ecoputer", which will have an integrated verbal and numeric operating system. Novel design source-code language will assist the development of narrative policy.
The utility of the program, including in the transition to sustainable development and in applications at both community micro-scale and policy macro-scale, is discussed from public, stakeholder, corporate, governmental and regulatory perspectives.
Abstract:
The precipitation of bovine serum albumin (BSA), lysozyme (LYS) and alfalfa leaf protein (ALF) by two large- and two medium-sized condensed tannin (CT) fractions of similar flavan-3-ol subunit composition is described. CT fractions isolated from white clover flowers and big trefoil leaves exhibited high-purity profiles by 1D/2D NMR, with purities >90% (determined by thiolysis). At pH 6.5, large CTs with a mean degree of polymerization (mDP) of ~18 exhibited similar protein precipitation behaviors and were significantly more effective than medium CTs (mDP ~9). Medium CTs exhibited similar capacities to precipitate ALF or BSA, but showed small yet significant differences in their capacity to precipitate LYS. All CTs precipitated ALF more effectively than BSA or LYS. Aggregation of CT-protein complexes likely aided precipitation of ALF and BSA, but not LYS. This study, one of the first to use CTs of confirmed high purity, demonstrates that the mDP of CTs influences protein precipitation efficacy.