44 results for Energy-based model
in Aston University Research Archive
Abstract:
Measuring variations in efficiency and its extension, eco-efficiency, during a restructuring period in different industries has always been of interest to regulators and policy makers. This paper assesses the impact of restructuring procurement in the Iranian power industry on the performance of power plants. We introduce a new slacks-based model for Malmquist-Luenberger (ML) index measurement and apply it to the power plants to calculate efficiency, eco-efficiency and technological change over the 8-year restructuring period (2003-2010). The results reveal that although the restructuring had different effects on individual power plants, the overall growth in the eco-efficiency of the sector was mainly due to advances in pure technology. We also assess the correlation between the efficiency and eco-efficiency of the power plants, which indicates a close relationship between the two measures, thus lending support to the incorporation of environmental factors in efficiency analysis. © 2014 Elsevier Ltd.
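Once the directional distance scores are obtained from the underlying slacks-based DEA programs (not shown here), the ML index and its efficiency-change/technological-change split have a compact closed form. A minimal sketch, not the paper's own code, with the four distance scores assumed as inputs:

```python
from math import sqrt

def ml_index(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist-Luenberger index from directional distance scores D.
    d_a_b is the distance of the period-b observation measured against
    the period-a frontier (D = 0 means efficient).
    ML > 1 indicates eco-productivity growth."""
    return sqrt(((1 + d_t_t) / (1 + d_t_t1)) * ((1 + d_t1_t) / (1 + d_t1_t1)))

def decompose(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Split ML into efficiency change (EC) and technological
    change (TC), with ML = EC * TC."""
    ec = (1 + d_t_t) / (1 + d_t1_t1)
    tc = sqrt(((1 + d_t1_t1) * (1 + d_t1_t)) / ((1 + d_t_t1) * (1 + d_t_t)))
    return ec, tc
```

The EC/TC decomposition mirrors the paper's attribution of the sector's eco-efficiency growth to advances in pure technology.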
Abstract:
A framework based on continuum damage mechanics and the thermodynamics of irreversible processes, using internal state variables, is used to characterize the distributed damage in viscoelastic asphalt materials in the form of micro-crack initiation and accumulation. At low temperatures and high deformation rates, micro-cracking is considered the source of nonlinearity and thus the cause of deviation from the linear viscoelastic response. Using a non-associated damage evolution law, the proposed model is able to describe the temperature-dependent processes of micro-crack initiation, evolution and macro-crack formation, showing good agreement with the material response in the Superpave indirect tensile (IDT) strength test.
Abstract:
The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probabilities of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
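In its simplest form, the AHP step described above reduces to deriving priority weights from a pairwise comparison matrix and checking Saaty's consistency ratio. A generic sketch (not the paper's implementation; it uses the row geometric-mean approximation rather than the exact principal eigenvector):

```python
from math import prod

# Saaty's random-consistency index for n = 3..5
RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12}

def ahp_weights(A):
    """Priority weights from a pairwise comparison matrix A,
    via the row geometric-mean approximation."""
    n = len(A)
    gm = [prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A):
    """CR = CI / RI with CI = (lmax - n) / (n - 1);
    CR < 0.1 is conventionally considered acceptable."""
    n = len(A)
    w = ahp_weights(A)
    # estimate the principal eigenvalue lmax from the product A @ w
    lmax = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i]
               for i in range(n)) / n
    return ((lmax - n) / (n - 1)) / RANDOM_INDEX[n]
```

For example, comparing three hypothetical risk factors with A = [[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]] gives weights of roughly 0.64, 0.26 and 0.10 with CR ≈ 0.03, i.e. an acceptably consistent judgement matrix.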
Abstract:
Adapting to blurred images makes in-focus images look too sharp, and vice versa (Webster et al, 2002 Nature Neuroscience 5 839-840). We asked how such blur adaptation is related to contrast adaptation. Georgeson (1985 Spatial Vision 1 103-112) found that grating contrast adaptation followed a subtractive rule: the perceived (matched) contrast of a grating was fairly well predicted by subtracting some fraction k (≈0.3) of the adapting contrast from the test contrast. Here we apply that rule to the responses of a set of spatial filters at different scales and orientations. Blur is encoded by the pattern of filter response magnitudes over scale. We tested two versions - the 'norm model' and the 'fatigue model' - against blur-matching data obtained after adaptation to sharpened, in-focus or blurred images. In the fatigue model, filter responses are simply reduced by exposure to the adapter. In the norm model, (a) the visual system is pre-adapted to a focused world, and (b) the discrepancy between observed and expected responses to the experimental adapter leads to additional reduction (or enhancement) of filter responses during experimental adaptation. The two models are closely related, but only the norm model gave a satisfactory account of the results across the four experiments analysed, with one free parameter k. This model implies that the visual system is pre-adapted to focused images, that adapting to in-focus or blank images produces no change in the state of adaptation, and that adapting to sharpened or blurred images changes the state of adaptation, leading to changes in perceived blur or sharpness.
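The subtractive rule and its per-filter 'fatigue model' extension can be stated in a few lines. An illustrative sketch only (the norm model additionally references the responses expected for a focused world, which is not shown here):

```python
def matched_contrast(c_test, c_adapt, k=0.3):
    """Georgeson's subtractive rule: perceived (matched) contrast is
    roughly the test contrast minus a fraction k of the adapting
    contrast, floored at zero."""
    return max(0.0, c_test - k * c_adapt)

def fatigue_adapt(responses, adapter_responses, k=0.3):
    """'Fatigue model' applied filter by filter: each spatial filter's
    response is reduced by k times its response to the adapter; the
    pattern of surviving responses over scale encodes perceived blur."""
    return [max(0.0, r - k * a)
            for r, a in zip(responses, adapter_responses)]
```

For instance, a 0.5-contrast test grating viewed after adapting to a 0.5-contrast grating is matched at about 0.35 contrast under this rule.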
Abstract:
Shropshire Energy Team initiated this study to examine energy consumption and associated emissions in the predominantly rural county of Shropshire. Current use of energy is not sustainable in the long term, and there are various approaches to dealing with the environmental problems it creates. Energy planning by a local authority for a sustainable future requires detailed energy consumption and environmental information. This information would enable target setting and the implementation of policies designed to encourage energy efficiency improvements and the exploitation of renewable energy resources, which could aid regeneration strategies by providing new employment opportunities. Associated reductions in carbon dioxide and other emissions would help to meet national and international environmental targets. In the absence of this detailed information, the objective was to develop a methodology to assess energy consumption and emissions on a regional basis from 1990 onwards for all local planning authorities, enabling a more accurate assessment of the relevant issues so that plans are more appropriate and longer lasting. A first comprehensive set of data was gathered from a wide range of sources, and a strong correlation was found between population and energy consumption for a variety of regions across the UK. The methodology was applied to the county of Shropshire to give, for the first time, estimates of primary fuel consumption, electricity consumption and associated emissions in Shropshire for 1990 to 2025. The estimates provide a suitable baseline for assessing the potential contribution renewable energy could make to meeting electricity demand in the county and to reducing emissions. The assessment indicated that total primary fuel consumption was 63,518,018 GJ/y in 1990, increasing to 119,956,465 GJ/y by 2025, with associated emissions of 1,129,626 t/y of carbon in 1990 rising to 1,303,282 t/y by 2025.
In 1990, 22,565,713 GJ/y of the primary fuel consumption was used for generating electricity, rising to 23,478,050 GJ/y in 2025. If targets to reduce primary fuel consumption are reached, emissions of carbon would fall to 1,042,626 t/y by 2025; if renewable energy targets were also reached, emissions of carbon would fall to 988,638 t/y by 2025.
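Taken at face value, the quoted 2025 figures imply cuts of roughly 20% and 24% against the business-as-usual projection. A quick check using only the numbers given above:

```python
baseline_2025 = 1_303_282      # t/y carbon, business-as-usual projection
with_fuel_targets = 1_042_626  # t/y if primary-fuel targets are met
with_renewables = 988_638      # t/y if renewable targets are also met

def pct_cut(new, base=baseline_2025):
    """Percentage reduction relative to the baseline projection."""
    return round(100 * (base - new) / base, 1)

print(pct_cut(with_fuel_targets))  # 20.0
print(pct_cut(with_renewables))    # 24.1
```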
Abstract:
The starting point of this research was the belief that manufacturing and similar industries need help with the concept of e-business, especially in assessing the relevance of possible e-business initiatives. The research hypothesis was that it should be possible to produce a systematic model that defines, at a useful level of detail, the probable e-business requirements of an organisation based on objective criteria, with an accuracy of 85%-90%. This thesis describes the development and validation of such a model. A preliminary model was developed from a variety of sources, including a survey of current and planned e-business activity and representative examples of e-business material produced by e-business solution providers. The model was subjected to a process of testing and refinement based on recursive case studies, with controls over the improving accuracy and stability of the model. Useful conclusions were also possible as to the relevance of e-business functions to the case study participants themselves. Techniques were evolved to synthesise the e-business requirements of an organisation and present them at a management-summary level of detail, and the results of applying these techniques to all the case studies used in this research are discussed. The conclusion of the research was that the case study methodology employed was successful, and that the resulting model is suitable for practical application in a manufacturing organisation requiring help with a requirements definition process.
Abstract:
We investigate knowledge exchange among commercial organizations, the rationale behind it, and its effects on the market. Knowledge exchange is known to be beneficial for industry, but in order to explain it, authors have used high-level concepts like network effects, reputation, and trust. We attempt to formalize a plausible and elegant explanation of how and why companies adopt information exchange and why it benefits the market as a whole when this happens. This explanation is based on a multiagent model that simulates a market of software providers. Even though the model does not include any high-level concepts, information exchange naturally emerges during simulations as a successful profitable behavior. The conclusions reached by this agent-based analysis are twofold: 1) a straightforward set of assumptions is enough to give rise to exchange in a software market, and 2) knowledge exchange is shown to increase the efficiency of the market.
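The claim that exchange emerges as profitable behaviour can be illustrated with a deliberately tiny simulation. This is not the authors' multiagent model, only a sketch of the mechanism under assumed dynamics: providers accumulate knowledge, sharers pool it, and profit tracks knowledge:

```python
def simulate(rounds=50, learn=0.01):
    """Toy market sketch: providers hold a scalar 'knowledge' level.
    Each round every provider learns a little on its own; sharers
    additionally adopt the best sharer's level (knowledge exchange).
    Profit per round is proportional to knowledge."""
    sharers = [0.1, 0.2, 0.3]
    loners = [0.1, 0.2, 0.3]
    profit_sharers = profit_loners = 0.0
    for _ in range(rounds):
        loners = [k + learn for k in loners]
        sharers = [k + learn for k in sharers]
        best = max(sharers)
        sharers = [best] * len(sharers)  # exchange: everyone catches up
        profit_sharers += sum(sharers)
        profit_loners += sum(loners)
    return profit_sharers, profit_loners
```

Under these assumptions the sharing group always out-earns the isolated group, echoing the paper's conclusion that exchange raises market efficiency as a whole.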
Abstract:
The existing method of pipeline monitoring, which requires an entire pipeline to be inspected periodically, wastes time and is expensive. A risk-based model that reduces the amount of time spent on inspection has been developed. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction method and logical insurance plans. The risk-based model uses the analytic hierarchy process, a multiple-attribute decision-making technique, to identify factors that influence failure on specific segments and analyze their effects by determining the probabilities of risk factors. The severity of failure is determined through consequence analysis, which establishes, in terms of cost, the effect of a failure caused by each risk factor and determines the cumulative effect of failure through probability analysis.
Abstract:
In this work we propose an NLSE-based model of the power and spectral properties of the random distributed feedback (DFB) fiber laser. The model is based on a coupled set of nonlinear Schrödinger equations for the pump and Stokes waves, with distributed feedback due to Rayleigh scattering. The model treats the random backscattering via its average strength, i.e. we assume that the feedback is incoherent; this also allows us to speed up simulations substantially (by up to several orders of magnitude). We find that the model of incoherent feedback predicts a smooth and narrow (compared with the gain spectral profile) generation spectrum in the random DFB fiber laser. The model allows one to optimize the width of the random laser generation spectrum by varying the dispersion and nonlinearity values: we find that high dispersion and low nonlinearity result in a narrower spectrum, which suggests that four-wave mixing between different spectral components in the quasi-mode-less spectrum of the random laser under study could play an important role in spectrum formation. Note that the physical mechanism of spectrum formation and broadening in the random DFB fiber laser has not yet been identified. We also investigate the temporal and statistical properties of the random DFB fiber laser dynamics. Interestingly, we find that the intensity statistics are not Gaussian, and the intensity autocorrelation function reveals that correlations do exist. The possibility of optimizing the system parameters to enhance the observed intrinsic spectral correlations, potentially achieving pulsed (mode-locked) operation of the mode-less random distributed feedback fiber laser, is discussed.
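A generic form of such a coupled system (notation assumed here, not taken from the paper) has counter-propagating Stokes envelopes A± amplified by Raman gain from the pump and linked by an averaged Rayleigh backscattering coefficient ε, with the pump depleted by the Stokes field:

```latex
\pm\frac{\partial A^{\pm}}{\partial z}
  = -\frac{i\beta_2}{2}\,\frac{\partial^2 A^{\pm}}{\partial t^2}
  + i\gamma\,\lvert A^{\pm}\rvert^{2} A^{\pm}
  + \frac{g_R P_p(z) - \alpha_s}{2}\,A^{\pm}
  + \sqrt{\varepsilon}\,A^{\mp},
\qquad
\frac{\mathrm{d}P_p}{\mathrm{d}z}
  = -\alpha_p P_p
  - \frac{\lambda_s}{\lambda_p}\, g_R\, P_p
    \bigl(\lvert A^{+}\rvert^{2} + \lvert A^{-}\rvert^{2}\bigr),
```

where β₂ and γ are the dispersion and Kerr nonlinearity varied in the optimization, g_R is the Raman gain coefficient, α_s and α_p are the fiber losses, and the pump equation is written for a co-propagating pump. Treating the backscattering through its average strength ε is what makes the feedback incoherent and the simulation fast.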
Abstract:
There has been increasing interest in the use of agent-based simulation, and some discussion of the relative merits of this approach compared to discrete-event simulation. Views differ on whether agent-based simulation offers capabilities that discrete-event simulation cannot provide, or whether all agent-based applications could, at least in theory, be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two discrete-event versions use, respectively, the traditional process-flow approach normally adopted in discrete-event simulation software and an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
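The paper's central point, that an event-driven engine can host agent behaviour, can be sketched without either NetLogo or ARENA: each agent 'step' is just an event on a time-ordered event list that re-schedules itself. A minimal illustration (assumed, simplified dynamics; agents hold a 1-D position and take random steps):

```python
import heapq
import random

def des_agents(n_agents=3, horizon=10.0, seed=1):
    """Drive agent behaviour from a classic discrete-event engine:
    a priority queue of (time, agent) events, each of which applies
    the agent's decision rule and schedules that agent's next step."""
    rng = random.Random(seed)
    pos = {a: 0 for a in range(n_agents)}  # agent state (1-D position)
    events = [(rng.random(), a) for a in range(n_agents)]
    heapq.heapify(events)
    while events:
        t, a = heapq.heappop(events)
        if t > horizon:
            break
        pos[a] += rng.choice((-1, 1))      # the agent's decision rule
        heapq.heappush(events, (t + rng.random(), a))
    return pos
```

The spatial display discussed in the paper would simply render `pos` after each event, which is essentially what the VBA-driven spreadsheet does for the ARENA model.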
Abstract:
Softeam has over 20 years of experience providing UML-based modelling solutions, such as its Modelio modelling tool, and its Constellation enterprise model management and collaboration environment. Due to the increasing number and size of the models used by Softeam’s clients, Softeam joined the MONDO FP7 EU research project, which worked on solutions for these scalability challenges and produced the Hawk model indexer among other results. This paper presents the technical details and several case studies on the integration of Hawk into Softeam’s toolset. The first case study measured the performance of Hawk’s Modelio support using varying amounts of memory for the Neo4j backend. In another case study, Hawk was integrated into Constellation to provide scalable global querying of model repositories. Finally, the combination of Hawk and the Epsilon Generation Language was compared against Modelio for document generation: for the largest model, Hawk was two orders of magnitude faster.
Abstract:
Purpose – The purpose of this research is to develop a holistic approach that maximizes the customer service level while minimizing the logistics cost, using an integrated multiple criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach that considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments. Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates the developed fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem. Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain can maintain and increase its competitiveness in two ways: optimizing cost and providing the best service simultaneously. Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Considering the generalization of the research findings and the complexity of the transshipment service network, more cases across multiple industries are necessary to further enhance the validity of the research output. Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, and for the construction and management of the optimal transshipment network. Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and the viewpoints of both service deliverers and customers are taken into account. It is therefore believed to be useful and applicable for transshipment service network design.
Abstract:
In recent years there has been increasing interest in learning distributed representations of word senses. Traditional context-clustering-based models usually require careful tuning of model parameters and typically perform worse on infrequent word senses. This paper presents a novel approach that addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are then used by a context-clustering-based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on half of the metrics in the word similarity task and on 6 out of 13 subtasks in the analogical reasoning task, and give the best overall accuracy in the word sense effect classification task, which shows the effectiveness of our proposed distributed representation learning model.
Abstract:
In this paper we propose an alternative method for measuring the efficiency of Decision Making Units that allows the presence of variables with both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with recently developed methods: the Modified Slacks-Based Model suggested by Sharp et al. (2007) and the Range Directional Measure developed by Silva Portela et al. (2004). A further example explores the advantages of using the new model.
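For reference, the Range Directional Measure mentioned above copes with negative data by measuring the distance to the frontier along unit-specific ranges. A standard statement of the model (notation here may differ from the cited papers) for unit o with inputs x and outputs y is:

```latex
\max_{\beta,\,\lambda}\ \beta
\quad\text{s.t.}\quad
\sum_{j} \lambda_j x_{ij} \le x_{io} - \beta R_{io}^{-}\ \ \forall i,
\qquad
\sum_{j} \lambda_j y_{rj} \ge y_{ro} + \beta R_{ro}^{+}\ \ \forall r,
\qquad
\sum_{j} \lambda_j = 1,\ \ \lambda_j \ge 0,
```

with ranges \(R_{io}^{-} = x_{io} - \min_j x_{ij}\) and \(R_{ro}^{+} = \max_j y_{rj} - y_{ro}\). Because the ranges are non-negative whatever the sign of the data, the program remains well defined when inputs or outputs take negative values, which is the property the paper's alternative method also targets.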
Abstract:
The last few years have witnessed an unprecedented increase in the price of energy available to industry in the United Kingdom and worldwide. The steel industry, as a major consumer of energy delivered in the U.K. (8% of the national total and nearly 25% of the industrial total), and with energy costs currently forming some 28% of total manufacturing cost, is very much aware of the need to conserve energy. Because of the complexity of steelmaking processes, it is imperative to gain a full understanding of each process and its interlinking role in an integrated steelworks. An analysis of energy distribution shows that as much as 70% of heat input is dissipated to the environment in a variety of forms; of these, waste gases offer the best potential for energy conservation. The study identifies areas for, and discusses novel methods of, energy conservation in each process. Application of these schemes in BSC works is developed and their economic incentives are highlighted. A major part of this thesis describes the design, development and testing of a novel ceramic rotary regenerator for heat recovery from high-temperature waste gases, for which no such system was previously available. The regenerator is a compact, efficient heat exchanger; applied to a reheating furnace, it provides a fuel saving of up to 40%. A mathematical model is developed and verified on the pilot plant. The results obtained confirm the success of the concept and material selection, and outline the work needed to develop an industrial unit. Last, but not least, the key position of the energy manager in an energy conservation programme is identified and a new Energy Management Model for the BSC is developed.
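The up-to-40% fuel-saving figure is consistent with a simple available-heat argument: if a fraction of the heat input leaves in the flue gas and a regenerator returns part of it as combustion-air preheat, the fuel saving at fixed useful output follows directly. A sketch with illustrative numbers (assumed for illustration, not taken from the thesis's data):

```python
def fuel_saving(flue_loss, effectiveness):
    """Fraction of fuel saved when a regenerator returns a fraction
    `effectiveness` of the flue-gas heat loss as combustion-air
    preheat, for a fixed useful heat demand:
        eta0 = 1 - flue_loss                    (available heat, no preheat)
        eta1 = eta0 + effectiveness * flue_loss (available heat, preheat)
        saving = 1 - eta0 / eta1
    """
    eta0 = 1.0 - flue_loss
    eta1 = eta0 + effectiveness * flue_loss
    return 1.0 - eta0 / eta1
```

With a flue-gas loss of 60% of heat input (plausible for a high-temperature reheating furnace, given the 70% overall dissipation quoted above) and a regenerator effectiveness of 0.5, this gives a saving of about 43%, the same order as the 40% claimed.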