13 results for Unified Power Quality Conditioner (UPQC)

in CentAUR: Central Archive, University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

It is now accepted that some human-induced climate change is unavoidable. Potential impacts on water supply have received much attention, but relatively little is known about the concomitant changes in water quality. Projected changes in air temperature and rainfall could affect river flows and, hence, the mobility and dilution of contaminants. Increased water temperatures will affect chemical reaction kinetics and, combined with deteriorations in quality, freshwater ecological status. With increased flows there will be changes in stream power and, hence, sediment loads with the potential to alter the morphology of rivers and the transfer of sediments to lakes, thereby impacting freshwater habitats in both lake and stream systems. This paper reviews such impacts through the lens of UK surface water quality. Widely accepted climate change scenarios suggest more frequent droughts in summer, as well as flash-flooding, leading to uncontrolled discharges from urban areas to receiving water courses and estuaries. Invasion by alien species is highly likely, as is migration of species within the UK adapting to changing temperatures and flow regimes. Lower flows, reduced velocities and, hence, higher water residence times in rivers and lakes will enhance the potential for toxic algal blooms and reduce dissolved oxygen levels. Upland streams could experience increased dissolved organic carbon and colour levels, requiring action at water treatment plants to prevent toxic by-products entering public water supplies. Storms that terminate drought periods will flush nutrients from urban and rural areas or generate acid pulses in acidified upland catchments. Policy responses to climate change, such as the growth of bio-fuels or emission controls, will further impact freshwater quality.

Relevance:

30.00%

Publisher:

Abstract:

This paper deconstructs the relationship between the Environmental Sustainability Index (ESI) and national income. The ESI attempts to provide a single figure which encapsulates 'environmental sustainability' for each country included in the analysis, and this, allied with a 'league table' format so as to name and shame bad performers, has resulted in widespread reporting within the popular presses of a number of countries. In essence, the higher the value of the ESI, the more 'environmentally sustainable' a country is deemed to be. A logical progression beyond the use of the ESI to publicise environmental sustainability is its use within a more analytical context. Thus an index designed to simplify in order to have an impact on policy is used to try to understand causes of good and bad performance in environmental sustainability. For example, the creators of the ESI claim that the ESI is related to GDP/capita (adjusted for Purchasing Power Parity) such that the ESI increases linearly with wealth. While this may in a sense be a comforting picture, do the variables within the ESI allow for alternatives to this story, and if they do, what are the repercussions for those producing such indices for broad consumption amongst policy makers, managers, the press, etc.? The latter point is especially important given the appetite for such indices amongst non-specialists; for all their weaknesses, the ESI and other such aggregated indices will not go away. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define, from the literature, the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
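
As an illustration only (the paper publishes no code; all names below are invented), a semantically rich, dynamic and temporal model element of the kind discussed might be sketched as follows:

from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ModelElement:
    """Illustrative element of a semantically rich, dynamic, temporal model (hypothetical)."""
    name: str
    semantic_type: str                                   # reference to a shared vocabulary/ontology term
    state_history: list = field(default_factory=list)    # list of (timestamp, state-dict) pairs

    def set_state(self, state: dict, at: Optional[datetime] = None) -> None:
        """Record a new dynamic state, timestamped to support temporal queries."""
        self.state_history.append((at or datetime.now(), state))

    def state_at(self, when: datetime) -> Optional[dict]:
        """Return the most recent state recorded at or before `when`."""
        earlier = [s for t, s in self.state_history if t <= when]
        return earlier[-1] if earlier else None

# Example usage: a pump whose operating state changes over time
pump = ModelElement(name="pump-01", semantic_type="ex:CentrifugalPump")
pump.set_state({"running": True, "flow_lps": 12.4})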

Relevance:

30.00%

Publisher:

Abstract:

The aim of this article was to determine which aspects of Huntington's disease (HD) are most important with regard to the health-related quality of life (HrQOL) of patients with this neurodegenerative disease. Seventy patients with HD participated in the study. Assessment comprised the Unified Huntington's Disease Rating Scale (UHDRS) motor, cognitive and functional capacity sections, and the Beck Depression Inventory. Mental and physical HrQOL were assessed using the summary scores of the SF-36. Multiple regression analyses showed that functional capacity and depressive mood were significantly associated with HrQOL, in that greater impairments in HrQOL were associated with higher levels of depressive mood and lower functional capacity. Motor symptoms and cognitive function were not found to be as closely linked with HrQOL. It can therefore be concluded that depressive mood and greater functional incapacity are key factors in HrQOL for people with HD; further longitudinal investigation will be useful to determine their utility as specific targets in intervention studies aimed at improving patient HrQOL, or whether other mediating variables are involved. As these two factors had a similar association with the mental and physical summary scores of the SF-36, this generic measure did not adequately capture and distinguish true mental and physical HrQOL in HD.
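
For illustration only, a multiple regression of the kind described might be set up as below; this is not the authors' code, and the column names are invented:

import pandas as pd
import statsmodels.api as sm

# Hypothetical layout: one row per patient; column names are illustrative only.
df = pd.read_csv("hd_patients.csv")   # e.g. sf36_mcs, sf36_pcs, uhdrs_tfc, bdi, uhdrs_motor, uhdrs_cog

predictors = ["uhdrs_tfc", "bdi", "uhdrs_motor", "uhdrs_cog"]
X = sm.add_constant(df[predictors])            # add an intercept term

for outcome in ["sf36_mcs", "sf36_pcs"]:       # mental and physical SF-36 summary scores
    fit = sm.OLS(df[outcome], X).fit()         # ordinary least squares multiple regression
    print(outcome)
    print(fit.params)                          # coefficients: association of each predictor with HrQOL
    print(fit.pvalues)                         # significance of each association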

Relevance:

30.00%

Publisher:

Abstract:

Military doctrine is one of the conceptual components of war. Its raison d’être is that of a force multiplier: it enables a smaller force to take on and defeat a larger force in battle. This article’s departure point is the aphorism of Sir Julian Corbett, who described doctrine as ‘the soul of warfare’. The second dimension to creating a force multiplier effect is forging doctrine with an appropriate command philosophy. The challenge for commanders is how, in unique circumstances, to formulate, disseminate and apply an appropriate doctrine and combine it with a relevant command philosophy. This can only be achieved by policy-makers and senior commanders successfully answering the Clausewitzian question: what kind of conflict are they involved in? Once an answer has been provided, a synthesis of these two factors can be developed and applied. Doctrine has implications for all three levels of war. Tactically, doctrine does two things: first, it helps to create a tempo of operations; second, it develops a transitory quality that will produce operational effect and ultimately facilitate the pursuit of strategic objectives. Its function here is to provide both training and instruction. At the operational level, instruction and understanding are the critical functions, while at the strategic level doctrine provides understanding and direction. Using John Gooch’s six components of doctrine, it will be argued that there is a lacuna in the theory of doctrine, as these components can manifest themselves in very different ways at the three levels of war. They can in turn affect the transitory quality of tactical operations. Doctrine is pivotal to success in war. Without doctrine and the appropriate command philosophy, military operations cannot be successfully concluded against an active and determined foe.

Relevance:

30.00%

Publisher:

Abstract:

The formulation and performance of the Met Office visibility analysis and prediction system are described. The visibility diagnostic within the limited-area Unified Model is a function of humidity and a prognostic aerosol content. The aerosol model includes advection, industrial and general urban sources, plus boundary-layer mixing and removal by rain. The assimilation is a 3-dimensional variational (3D-Var) scheme in which the visibility observation operator is a very nonlinear function of humidity, aerosol and temperature. A quality control scheme for visibility data is included. Visibility observations can give rise to humidity increments of significant magnitude compared with the direct impact of humidity observations. We present the results of sensitivity studies which show the contribution of different components of the system to improved skill in visibility forecasts. Visibility assimilation is most important within the first 6-12 hours of the forecast and for visibilities below 1 km, while modelling of aerosol sources and advection is important for slightly higher visibilities (1-5 km) and is still significant at longer forecast times.
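
For context (a standard textbook formulation, not reproduced from the paper), a 3D-Var scheme of this kind minimises the cost function

J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\,\mathbf{B}^{-1}\,(\mathbf{x}-\mathbf{x}_b) + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathrm{T}}\,\mathbf{R}^{-1}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr),

where \mathbf{x}_b is the background state, \mathbf{B} and \mathbf{R} are the background and observation error covariance matrices, \mathbf{y} is the vector of observations, and H is the observation operator, which for visibility is the strongly nonlinear mapping from humidity, aerosol and temperature described in the abstract.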

Relevance:

30.00%

Publisher:

Abstract:

This article focuses on sustainable development and public procurement and reflects on the significance of questioning the goals sustainable public procurement seeks to achieve. While it is recognised that developing appropriate legal frameworks and regulatory tools for environmental, social and economic quality assurance is important, achieving sustainable procurement nevertheless remains political. With the forthcoming adoption of new European Union Public Procurement Directives, the article provides a timely reminder that for sustainability to be integral to good procurement, the power of purchase must capture a paradigmatic shift from doing things better to doing better things.

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses issues raised in two recent papers published in this journal about the UK Association of Business Schools' Journal Quality Guide (ABS Guide). While much of the debate about journal rankings in general, and the ABS Guide in particular, has focused on the construction, power and (mis)use of these rankings, this paper differs in that it explains and provides evidence about explicit and implicit biases in the ABS Guide. In so doing, it poses potentially difficult questions that the editors of the ABS Guide need to address and urgently rectify if the ABS Guide seeks to build and retain legitimacy. In particular, the evidence in this paper shows explicit bias in the ABS Guide against several subject areas, including accounting and finance. It also shows implicit bias against accounting and finance when comparing journal rankings in sub-areas shared between accounting and finance and the broader business management subject areas.

Relevance:

30.00%

Publisher:

Abstract:

A theoretical model is presented of an electron acceleration-as-oscillator method, derived from the work of Joseph Larmor and unified with J. Clerk Maxwell’s theory of vorticity, for the displacement of radiation into free space at an antenna interface.
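
For reference only (a standard classical result, not drawn from the paper itself), the Larmor formula gives the power radiated by a non-relativistic accelerating charge as

P = \frac{q^{2}a^{2}}{6\pi\varepsilon_{0}c^{3}},

where q is the charge, a its acceleration, \varepsilon_0 the permittivity of free space and c the speed of light.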

Relevance:

30.00%

Publisher:

Abstract:

Good urban design has the power to aid in the provision of inclusive journey environments, yet traditionally neglects the perspective of the cyclist. This paper starts from the premise that more can be done to understand and articulate cyclists’ experiences and perceptions of the urban environment in which they cycle, as part of a closer linking of urban design qualities with transport planning and infrastructure interventions. This approach is particularly applicable in relation to older cyclists, a group whose needs are often poorly understood and for whom perceptions can significantly influence mobile behaviours. Currently, knowledge regarding the relationship between the built environment and physical activity, including cycling, in older adults is limited. As European countries face up to the challenges associated with ageing populations, some metropolitan regions, such as Munich, Germany, are making inroads into widening cycling’s appeal across generations through a combination of urban design, policy and infrastructure initiatives. The paper provides a systematic understanding of the urban design qualities and built environment features that affect cycling participation and have the potential to contribute towards healthy ageing. Urban design features such as legibility, aesthetics, scale and open space have been shown to influence other mobile behaviours (e.g. walking), but their role as a mediator in cycle behaviour remains under-explored. Many of these design ‘qualities’ are related to individual perceptions; capturing these can help build a picture of quality in the built environment that includes an individual’s relationship with their local neighbourhood and its influences on their mobility choices. Issues of accessibility, facilities, and safety in cycling remain crucial and, when allied to these design ‘qualities’, provide a more rounded reflection of everyday journeys and trips taken or desired. The paper sets out the role that urban design might play in mediating these critical mobility issues and, in particular, in better understanding the ‘quality of the journey’. It concludes by highlighting the need for designers, policy makers, planners and academics to consider the role that design can play in encouraging cycle participation, especially as part of a healthy ageing agenda.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to investigate the effects of numerous milk compositional factors on milk coagulation properties using Partial Least Squares (PLS). Milk from herds of Jersey and Holstein-Friesian cattle was collected across the year and blended (n=55) to maximise variation in composition and coagulation. The milk was analysed for casein, protein, fat, titratable acidity, lactose, Ca2+, urea content, micelle size, fat globule size, somatic cell count and pH. Milk coagulation properties were defined as coagulation time, curd firmness and curd firmness rate, measured by a controlled-strain rheometer. The models derived from PLS had higher predictive power than previous models, demonstrating the value of measuring more milk components. In addition to the well-established relationships with casein and protein levels, casein micelle size (CMS) and fat globule size were found to have as strong an impact on all three models. The study also found a positive impact of fat on milk coagulation properties, and a strong relationship between lactose and curd firmness, and between urea and curd firmness rate, all of which warrant further investigation due to the current lack of knowledge of the underlying mechanisms. These findings demonstrate the importance of using a wider range of milk compositional variables for the prediction of milk coagulation properties, and hence as indicators of milk suitability for cheese making.
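
A minimal sketch of a PLS model of this kind is given below for illustration only; it is not the authors' code, and the column names are invented:

import pandas as pd
from sklearn.cross_decomposition import PLSRegression

# Hypothetical data layout: one row per blended milk sample (names illustrative only).
df = pd.read_csv("milk_blends.csv")
X = df[["casein", "protein", "fat", "titratable_acidity", "lactose", "ca_ion",
        "urea", "micelle_size", "fat_globule_size", "somatic_cell_count", "ph"]]
Y = df[["coagulation_time", "curd_firmness", "curd_firmness_rate"]]

pls = PLSRegression(n_components=4)   # number of latent variables is a modelling choice
pls.fit(X, Y)                         # fit compositional predictors to coagulation properties
print(pls.score(X, Y))                # coefficient of determination of the fitted model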

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussion of the need for information recalled through OLAP systems to maintain the contexts of the transactions captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in OLTP databases without the business rules that were used to process that information. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture data in particular OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends that are contributing to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design has a critical dependency on the capture of facts with associated context; the encoding of facts with contexts into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules back into facts with context; and the recall of facts with associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity for the implementation and use of multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture those data, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
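
UBIRQ itself is not specified in the abstract; the following invented sketch merely illustrates the general idea of co-storing a fact with its transaction context and a reference to the business rule under which it was captured:

from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class BusinessRule:
    rule_id: str          # identifier of the rule used to capture/interpret the fact
    version: str          # rule version active at capture time
    expression: str       # e.g. a validation or derivation expression

@dataclass(frozen=True)
class Fact:
    fact_id: str
    value: float
    captured_at: datetime
    context: dict         # transaction context recorded by the OLTP system
    rule: BusinessRule    # rule under which the fact was encoded into data

# An OLAP query can then recall the fact together with the rule that produced it,
# rather than re-deriving its semantics from separate ETL business rules.
sale = Fact("txn-001", 19.99, datetime.now(),
            {"store": "S12", "currency": "GBP"},
            BusinessRule("price-incl-vat", "v2", "net * 1.20"))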

Relevance:

30.00%

Publisher:

Abstract:

The real-time quality control (RTQC) methods applied to Argo profiling float data by the United Kingdom (UK) Met Office, the United States (US) Fleet Numerical Meteorology and Oceanography Centre, the Australian Bureau of Meteorology and the Coriolis Centre are compared and contrasted. Data are taken from the period 2007 to 2011 inclusive and RTQC performance is assessed with respect to Argo delayed-mode quality control (DMQC). An intercomparison of RTQC techniques is performed using a common data set of profiles from 2010 and 2011. The RTQC systems are found to have similar power in identifying faulty Argo profiles but to vary widely in the number of good profiles incorrectly rejected. The efficacy of individual QC tests is inferred from the results of the intercomparison. Techniques to increase QC performance are discussed.
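
As a minimal sketch (not taken from the paper, and with illustrative thresholds), a real-time range check of the kind applied to profile data could look like this:

def global_range_flags(temperatures_c, salinities_psu,
                       temp_range=(-2.5, 40.0), sal_range=(2.0, 41.0)):
    """Return per-level QC flags in the Argo convention: 1 = good, 4 = bad."""
    flags = []
    for t, s in zip(temperatures_c, salinities_psu):
        ok = temp_range[0] <= t <= temp_range[1] and sal_range[0] <= s <= sal_range[1]
        flags.append(1 if ok else 4)
    return flags

# Example: the second level fails the temperature range check and is flagged bad.
print(global_range_flags([3.5, 45.0], [35.1, 35.2]))   # -> [1, 4]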