862 results for "make energy use more effective"


Relevance:

100.00%

Publisher:

Abstract:

In the fall of 2003, Governor Blagojevich unveiled Opportunity Returns, a regional economic development plan that is the most aggressive, comprehensive approach to creating jobs in Illinois' history. The Governor divided the state into 10 economic development regions -- areas with common economic strengths and needs -- and is developing plans for each region that include specific actions to make these regions more accessible and more attractive to business. This grassroots effort is the product of significant outreach to economic development leaders, local elected officials, and business and community leaders. Each Opportunity Returns economic development plan is designed to be flexible, effective and tailored to deliver real results that local businesses will see, feel and, hopefully, profit from.

Relevance:

100.00%

Publisher:

Abstract:

Objective: To devise more effective physical activity interventions, the mediating mechanisms yielding behavioral change need to be identified. The Baron-Kenny method is most commonly used, but has low statistical power and may not identify mechanisms of behavioral change in small-to-medium sized studies. More powerful statistical tests are available. Study Design and Setting: Inactive adults (N = 52) were randomized to either a print or a print-plus-telephone intervention. Walking and exercise-related social support were assessed at baseline, after the intervention, and 4 weeks later. The Baron-Kenny and three alternative methods of mediational analysis (Freedman-Schatzkin; MacKinnon et al.; bootstrap method) were used to examine the effects of social support on initial behavior change and maintenance. Results: A significant mediational effect of social support on initial behavior change was indicated by the MacKinnon et al., bootstrap, and, marginally, Freedman-Schatzkin methods, but not by the Baron-Kenny method. No significant mediational effect of social support on maintenance of walking was found. Conclusions: Methodologically rigorous intervention studies to identify mediators of change in physical activity are costly and labor intensive, and may not be feasible with large samples. The use of statistically powerful tests of mediational effects in small-scale studies can inform the development of more effective interventions. (C) 2006 Elsevier Inc. All rights reserved.
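The bootstrap approach to mediation mentioned in this abstract can be sketched as follows: resample cases with replacement, re-estimate the a-path (exposure to mediator) and b-path (mediator to outcome, controlling for exposure), and take a percentile interval over the products. The variable names and effect sizes below are illustrative, not taken from the study:

```python
import numpy as np

def bootstrap_mediation(x, m, y, n_boot=5000, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b:
    a = effect of x on mediator m; b = effect of m on y, controlling for x."""
    rng = np.random.default_rng(seed)
    n = len(x)
    indirect = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)          # resample cases with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]         # slope of m on x
        coefs, *_ = np.linalg.lstsq(
            np.column_stack([np.ones(n), mb, xb]), yb, rcond=None)
        indirect[i] = a * coefs[1]           # b is the coefficient on m
    return np.percentile(indirect, [2.5, 97.5])

# Synthetic data with a genuine mediated effect (sizes are hypothetical)
rng = np.random.default_rng(0)
n = 52
x = rng.normal(size=n)                       # intervention exposure
m = 0.6 * x + rng.normal(size=n)             # mediator, e.g. social support
y = 0.5 * m + rng.normal(size=n)             # outcome, e.g. walking
lo, hi = bootstrap_mediation(x, m, y)
print(f"95% bootstrap CI for the indirect effect: [{lo:.2f}, {hi:.2f}]")
```

Mediation is indicated when the interval excludes zero; unlike the Baron-Kenny causal-steps approach, this test makes no normality assumption about the product term, which is why it retains power in small samples.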

Relevance:

100.00%

Publisher:

Abstract:

Because organizations are making large investments in information systems (IS), efficient IS project management has been found critical to success. This study examines how the use of incentives can improve project success. Agency theory is used to identify motivational factors of project success and to help IS owners understand to what extent management incentives can improve IS development and implementation (ISD/I). The outcomes will help practitioners and researchers to build a theoretical model of the project management elements which lead to project success. Given the principal-agent nature of most significant-scale IS development, insights that allow for greater alignment of the agent's goals with those of the principal through incentive contracts will serve to make ISD/I both more efficient and more effective, leading to more successful IS projects.

Relevance:

100.00%

Publisher:

Abstract:

The relationship between theory and practice has been discussed in the social sciences for generations. Academics from management and organization studies regularly lament the divide between theory and practice. They regret the insufficient academic knowledge of managerial problems and their solutions, and criticize the scholarly production of theories that are not relevant for organizational practice (Hambrick 1994). Despite the prevalence of this topic in academic discourse, we do not know much about what kind of academic knowledge would be useful to practice, how it would be produced and how the transfer of knowledge between theory and practice actually works. In short, we do not know how we can make academic work more relevant for practice or even whether this would be desirable. In this introduction to the Special Issue, we apply philosophical, theoretical and empirical perspectives to examine the challenges of studying the generation and use of academic knowledge. We then briefly describe the contribution of the seven papers that were selected for this Special Issue. Finally, we discuss issues that still need to be addressed, and make some proposals for future avenues of research.

Relevance:

100.00%

Publisher:

Abstract:

In Great Britain and Brazil healthcare is free at the point of delivery and based only on citizenship. However, the British NHS is fifty-five years old and has undergone extensive reforms, while the Brazilian SUS is barely fifteen years old. This research investigated the middle-management mediation role within hospitals, comparing managerial planning and control using cost information in Great Britain and Brazil. This investigation was conducted in two stages entailing quantitative and qualitative techniques. The first stage was a survey involving managers of 26 NHS Trusts in Great Britain and 22 public hospitals in Brazil. The second stage consisted of interviews, 10 in Great Britain and 22 in Brazil, conducted in four selected hospitals, two in each country. This research builds on the literature by investigating the interaction of contingency theory and modes of governance in a cross-national study of public hospitals. It further builds on the existing literature by measuring managerial dimensions related to cost information usefulness. The project unveils the practice involved in planning and control processes. It highlights important elements such as the use of predictive models and uncertainty reduction when planning. It uncovers the different mechanisms employed in control processes. It also shows that planning and control within British hospitals are structured procedures guided by overall goals. In contrast, planning and control processes in Brazilian hospitals are accidental, involving more ad hoc actions and a profusion of goals. The clinicians in British hospitals have been integrated into the management hierarchy, and their use of cost information in planning and control processes reflects this integration. However, in Brazil, clinicians have been shown to operate more independently and make little use of cost information, but the potential signalled for cost information use is seen to be even greater than that of their British counterparts.

Relevance:

100.00%

Publisher:

Abstract:

The techno-economic implications of recycling the components of mixed plastics waste have been studied in a two-part investigation: (a) In an economic survey of the prospects for plastics recycling, the plastics waste arisings from the retailing, building, automotive, light engineering and chemical industries were surveyed by means of questionnaires and interviews. This was partially successful and indicated that very considerable quantities of relatively clean plastics packaging were available in major department chains and household stores. Collection systems for such sources, which do not lead to any extra cost, have been suggested. However, the household collection of plastics waste has been found to be uneconomic due to the high cost of collection and transportation and the lack of markets for the end products. (b) In a technical study of blends of PE/PP and PE/PS, which are found in admixture in waste plastics, it has been shown that they exhibit poor mechanical properties due to incompatibility. Consequently, reprocessing of such unsegregated blends results in products of little technological value. The inclusion of some commercial block and graft copolymers which behave as solid phase dispersants (SPDs) increases the toughness of the blends (e.g. EPDM in the PE/PP blend and SBS in the PE/PS blend). EPDM is also found to be very effective for improving the toughness of single-component polypropylene. However, the improved technical properties of such blends have been accompanied by a fast rate of photo-oxidation and loss of toughness due to the presence of unsaturation in the SPDs. The change in mechanical properties occurring during oven ageing and ultra-violet light accelerated weathering of these binary and ternary blends was followed by a viscoelastometric technique (Rheovibron) over a wide range of temperatures, by impact resistance at room temperature, and by changes in functional groups (i.e. carbonyl and trans-1,4-polybutadiene).
Also, the heat and light stability of single and mixed plastics, to which thiol antioxidants were bound to the SPD segment, have been studied and compared with conventional antioxidants. The long-term performance of the mixed plastics containing SPDs has been improved significantly by the use of conventional and bound antioxidants. It is concluded that an estimated 30,000 tonnes/year of plastics waste is available from department chains and household stores which can be converted to useful end products. This justifies pilot experiments in collaboration with supermarkets, recyclers and converters, using low-cost SPDs and additives designed to make the materials more compatible.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of the methods for measuring minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear. This is despite a variety of reasons which suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators.
One of the main objectives of this thesis will be to prove that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines. The result of this is that the use of MEG has, hitherto, been restricted to large institutions which are able to afford the high costs associated with the procurement and maintenance of these machines. In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data.
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
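A standard first step in the dynamical-systems treatment of a single-channel time series described above is time-delay embedding (Takens' theorem), which reconstructs a state-space trajectory from one scalar observable. The sketch below uses a synthetic noisy oscillation as a stand-in for an MEG trace; the embedding dimension and delay are illustrative choices, not values from the thesis:

```python
import numpy as np

def delay_embed(x, dim=3, tau=5):
    """Reconstruct a state-space trajectory from a scalar time series
    via time-delay embedding: row t is (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Toy 'recording': a noisy two-tone oscillation standing in for a
# single-channel MEG trace (purely illustrative)
t = np.linspace(0, 20 * np.pi, 2000)
x = (np.sin(t) + 0.3 * np.sin(2.1 * t)
     + 0.05 * np.random.default_rng(0).normal(size=t.size))
traj = delay_embed(x, dim=3, tau=25)
print(traj.shape)   # (2000 - 2*25, 3) = (1950, 3)
```

The reconstructed trajectory, rather than the raw series, is then the input to nonlinear analyses such as correlation-dimension or Lyapunov-exponent estimation.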

Relevance:

100.00%

Publisher:

Abstract:

World and UK energy resources and use are reviewed and the role of energy conservation in energy policy identified. In considering various energy conservation measures, a distinction is made between energy-intensive and non-intensive industries, and also between direct and indirect uses of energy. Particular attention is given to the non-intensive user of energy. Energy use on one such industrial site has been studied to determine the most effective energy saving measures in the short term. Here it is estimated that over 65% of energy is consumed for indirect purposes, mainly for heating and lighting buildings. Emphasis is placed on energy auditing techniques and those energy saving measures requiring greater technical, economic and organisational resources to secure their implementation. Energy auditing techniques include the use of aerial thermography and snow formation surveys to detect heat losses. Qualitative and quantitative interpretations are carried out, but restricted mainly to evaluating building roof heat losses. From the energy auditing exercise, it is confirmed that the intermittent heating of buildings is the largest and most cost-effective fuel saving measure. This was implemented on the site and a heat monitoring programme established to verify results. Industrial combined heat and power generation is investigated. A proposal for the site demonstrates that there are several obstacles to its successful implementation. By adopting an alternative financial rationale, a way of overcoming these obstacles is suggested. A useful by-product of the study is the classification of industrial sites according to the nature of industrial energy demand patterns. Finally, energy saving measures implemented on the site are quantified using comparative verification methods. Overall fuel savings of 13% are indicated. Cumulative savings in heating fuel amount to 26% over four years, although the heated area increased by approximately 25%.

Relevance:

100.00%

Publisher:

Abstract:

The management and sharing of complex data, information and knowledge is a fundamental and growing concern in the water and other industries for a variety of reasons. For example, risks and uncertainties associated with climate and other changes require knowledge to prepare for a range of future scenarios and potential extreme events. Formal ways in which knowledge can be established and managed can help deliver efficiencies in acquisition, structuring and filtering to provide only the essential aspects of the knowledge really needed. Ontologies are a key technology for this knowledge management. The construction of ontologies is a considerable overhead on any knowledge management programme. Hence current computer science research is investigating generating ontologies automatically from documents using text mining and natural language techniques. As an example of this, results from the application of the Text2Onto tool to stakeholder documents for a project on sustainable water cycle management in new developments are presented. It is concluded that by adopting ontological representations sooner, rather than later, in an analytical process, decision makers will be able to make better use of highly knowledgeable systems containing automated services to ensure that sustainability considerations are included.
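Ontology-learning pipelines of the kind applied here typically begin with extracting candidate concept terms from the document corpus. The sketch below is a deliberately minimal frequency-based stand-in for that first stage; it is not Text2Onto's actual API, and the stopword list, length threshold and sample documents are all illustrative:

```python
import re
from collections import Counter

STOPWORDS = {"the", "of", "and", "a", "in", "to", "for", "is", "on", "with"}

def extract_candidate_concepts(documents, top_k=5):
    """Rank candidate concept terms by corpus frequency - a crude
    stand-in for the term-extraction stage of ontology learning."""
    counts = Counter()
    for doc in documents:
        for token in re.findall(r"[a-z]+", doc.lower()):
            if token not in STOPWORDS and len(token) > 3:
                counts[token] += 1
    return [term for term, _ in counts.most_common(top_k)]

docs = [
    "Sustainable water cycle management in new developments",
    "Stakeholder documents on water management and sustainability",
    "Knowledge management for the water industry",
]
print(extract_candidate_concepts(docs, top_k=3))
```

Real tools add linguistic filtering (part-of-speech patterns), relevance weighting such as TF-IDF, and relation extraction on top of this term-ranking step to propose concepts and taxonomic links.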

Relevance:

100.00%

Publisher:

Abstract:

Analysing the molecular polymorphism and interactions of DNA, RNA and proteins is of fundamental importance in biology. Predicting the functions of polymorphic molecules is important in order to design more effective medicines. Analysing major histocompatibility complex (MHC) polymorphism is important for mate choice, epitope-based vaccine design, transplantation rejection, etc. Most of the existing exploratory approaches cannot analyse these datasets because of the large number of molecules with a high number of descriptors per molecule. This thesis develops novel methods for data projection in order to explore high-dimensional biological datasets by visualising them in a low-dimensional space. With increasing dimensionality, some existing data visualisation methods such as generative topographic mapping (GTM) become computationally intractable. We propose variants of these methods, where we use log-transformations at certain steps of the expectation-maximisation (EM) based parameter learning process, to make them tractable for high-dimensional datasets. We demonstrate these proposed variants both for synthetic data and for an electrostatic potential dataset of MHC class-I. We also propose to extend a latent trait model (LTM), suitable for visualising high-dimensional discrete data, to simultaneously estimate feature saliency as an integrated part of the parameter learning process of a visualisation model. This LTM variant not only gives better visualisation, by modifying the projection map based on feature relevance, but also helps users to assess the significance of each feature. Another problem which is not addressed much in the literature is the visualisation of mixed-type data. We propose to combine GTM and LTM in a principled way, where appropriate noise models are used for each type of data in order to visualise mixed-type data in a single plot. We call this model a generalised GTM (GGTM).
We also propose to extend the GGTM model to estimate feature saliencies while training a visualisation model; this is called GGTM with feature saliency (GGTM-FS). We demonstrate the effectiveness of these proposed models both for synthetic and real datasets. We evaluate visualisation quality using quality metrics such as the distance distortion measure and rank-based measures: trustworthiness, continuity, and mean relative rank errors with respect to data space and latent space. In cases where the labels are known, we also use KL divergence and nearest-neighbour classification error to determine the separation between classes. We demonstrate the efficacy of these proposed models both for synthetic and real biological datasets, with a main focus on the MHC class-I dataset.
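Of the rank-based map-quality metrics named in this abstract, trustworthiness is representative: it penalises points that appear as neighbours in the low-dimensional map but are not neighbours in the data space. A minimal NumPy sketch, following the standard Venna-Kaski definition with toy data in place of the MHC set, might look like:

```python
import numpy as np

def trustworthiness(X, Z, k=5):
    """Trustworthiness of a map Z of data X: 1 minus a normalised penalty
    for map-space neighbours that are not data-space neighbours."""
    n = X.shape[0]
    dX = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    dZ = np.linalg.norm(Z[:, None] - Z[None, :], axis=-1)
    np.fill_diagonal(dX, np.inf)
    np.fill_diagonal(dZ, np.inf)
    # rank of each point among i's data-space neighbours (0 = nearest)
    ranks_X = dX.argsort(axis=1).argsort(axis=1)
    nn_Z = dZ.argsort(axis=1)[:, :k]    # k nearest neighbours in the map
    penalty = 0.0
    for i in range(n):
        for j in nn_Z[i]:
            if ranks_X[i, j] >= k:      # map neighbour, not a data-space neighbour
                penalty += ranks_X[i, j] - k + 1
    return 1.0 - 2.0 * penalty / (n * k * (2 * n - 3 * k - 1))

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 10))
print(trustworthiness(X, X, k=5))         # a perfect 'map' scores 1.0
print(trustworthiness(X, X[:, :2], k=5))  # naive truncation scores lower
```

Continuity is the symmetric counterpart (data-space neighbours missing from the map), computed the same way with the roles of X and Z swapped.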

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a set of criteria for evaluating serious games (SGs) intended as effective methods of engaging energy users and lowering consumption. We discuss opportunities for using SGs in energy research which go beyond existing feedback mechanisms, including the use of immersive virtual worlds for learning and testing behaviours, and for sparking conversations within households. From a review of existing SG evaluation criteria, we define a tailored set of criteria for energy SG development and evaluation. The criteria emphasise the need for a game to increase energy literacy through applicability to real-life energy use/management; clear, actionable goals and feedback; ways of comparing usage socially; and personal relevance. Three existing energy games are evaluated against this framework. The paper concludes by outlining directions for the future development of SGs as an effective tool in social science research, including games which inspire reflection on trade-offs and usage at different scales.

Relevance:

100.00%

Publisher:

Abstract:

Background and Aims: Consumption of antioxidant nutrients can reduce the risk of progression of age-related macular degeneration (AMD), the leading cause of visual impairment in adults over the age of 50 years in the UK. Lutein and zeaxanthin (L&Z) are of particular interest because they are selectively absorbed by the central retina. The objectives of this study were to analyse the dietary intake of a group of AMD patients, assess their ability to prepare and cook healthy food, and make comparisons with people not affected by AMD. Methods: 158 participants with AMD were recruited via the UK charity the Macular Society, and fifty participants without AMD were recruited from optometric practice. A telephone interview was conducted by trained workers, in which participants completed a 24-hour food diary and answered questions about cooking and shopping capabilities. Results: In the AMD group, the average L&Z intake was low for both males and females. Those able to cook a hot meal consumed significantly more L&Z than those who were not able. Most participants were not consuming the recommended dietary allowance of fibre, calcium, or vitamins D and E, and calorific intake was also lower than recommendations for their age group. The non-AMD group consumed more kilocalories and more nutrients than the AMD group, but their L&Z intake was similar to those with AMD. The main factor that influenced participants' food choices was personal preference. Conclusion: For an 'informed' population, many AMD participants were under-consuming nutrients considered to be useful for their condition. Participants without AMD were more likely to reach recommended daily allowance values for energy and a range of nutrients. It is therefore essential to design more effective dietary education and dissemination methods for people with, and at risk of, AMD.

Relevance:

100.00%

Publisher:

Abstract:

Dry eye disease is a common clinical condition whose aetiology and management challenge clinicians and researchers alike. Practitioners have a number of dry eye tests available to clinically assess dry eye disease, in order to treat their patients effectively and successfully. This thesis set out to determine the most relevant and successful key tests for dry eye disease diagnosis and management. There has been very little research on determining the most effective treatment options for these patients; therefore a randomised controlled study was conducted to compare how different artificial tear treatments perform, whether the preferred treatment could have been predicted from the ocular clinical assessment, and whether the preferred treatment subjectively related to the greatest improvement in ocular physiology and tear film stability. This research has found: 1. From the plethora of ocular tear tests available in clinical practice, the tear stability tests, as measured by non-invasive tear break-up time (NITBUT) and invasive tear break-up time (NaFL TBUT), are strongly correlated. The tear volume tests, as measured by the phenol red thread (PRT) test and tear meniscus height (TMH), are also related. Lid-parallel conjunctival folds (LIPCOF) and conjunctival staining are significantly correlated with one another. Symptomology and osmolarity were also found to be important tests in assessing for dry eye. 2. Artificial tear supplements do improve ocular comfort, as well as the ocular surface, as observed by conjunctival staining and the reduction of LIPCOF. There is no strong evidence of one type of artificial tear supplement being more effective than others, and the data suggest that these improvements are due more to time than to the specific drops. 3.
When trying to predict patient preference for artificial tears from baseline measurements, each category of artificial tear supplement showed an improvement in at least one tear metric. The patients' preferred artificial tear supplements were rated much higher than the other three drops used in the study, and the subjective responses were statistically stronger than the signs. 4. Patients are also willing to pay £17 for a community dry eye service in their area. In conclusion, the dry eye tests conducted in the study correlate with one another and with the symptoms reported by the patient. Artificial tears do make a difference objectively as well as subjectively. There is no optimum artificial treatment for dry eye; however, regular, consistent use of artificial eye drops will improve the ocular surface.

Relevance:

100.00%

Publisher:

Abstract:

The poor retention and efficacy of instilled drops as a means of delivering drugs to the ophthalmic environment is well recognised. The potential value of contact lenses as a means of ophthalmic drug delivery, with the consequent improvement of pre-corneal retention, is one obvious route to the development of a more effective ocular delivery system. Furthermore, the increasing availability and clinical use of daily disposable contact lenses provides the platform for the development of viable single-day-use drug delivery devices based on existing materials and lenses. In order to provide a basis for the effective design of such devices, a systematic understanding of the factors affecting the interaction of individual drugs with the lens matrix is required. Because a large number of potential structural variables are involved, it is necessary to achieve some rationalisation of the parameters and physicochemical properties (such as molecular weight, charge, and partition coefficients) that influence drug interactions. Ophthalmic dyes and structurally related compounds based on the same core structure were used to investigate these various factors and the way in which they can be used in concert to design effective release systems for structurally different drugs. Initial studies of passive diffusional release form a necessary precursor to the investigation of the features of the ocular environment that override this simple behaviour. Commercially available contact lenses of differing structural classifications were used to study factors affecting the uptake of the surrogate actives and their release under 'passive' conditions. The interaction between active and lens material shows considerable and complex structure dependence, which is not simply related to equilibrium water content. The structure of the polymer matrix itself was found to have the dominant controlling influence on active uptake, with hydrophobic interaction with the ophthalmic dye playing a major role.
© The Author(s) 2014.