834 results for INPUT-OUTPUT ANALYSIS


Relevance: 40.00%

Abstract:

Biomass-To-Liquid (BTL) is one of the most promising low carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass, the so-called “second generation biofuels” that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes including: Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how the uncertainty in the input parameters of the cost model could affect the output (i.e. production cost) of the model. This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
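The core of the Monte Carlo uncertainty analysis described above can be illustrated with a short sketch. The cost structure, distributions and figures below are purely hypothetical stand-ins (the thesis's actual cost model was built in MS Excel); the point is how sampling the uncertain input parameters yields a distribution over the production cost rather than a single point estimate.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical cost-model inputs (not the thesis's actual figures):
# each uncertain parameter is sampled from an assumed distribution.
capital_cost = rng.normal(300e6, 30e6, N)         # GBP, total capital investment
feedstock_price = rng.triangular(40, 55, 80, N)   # GBP per tonne dry biomass
feedstock_use = 500_000                           # t/yr, fixed for this sketch
opex = rng.normal(25e6, 2.5e6, N)                 # GBP/yr, other operating costs
fuel_output = 150e6                               # litres/yr
crf = 0.13                                        # capital recovery factor

# Production cost per litre for every sampled parameter set.
production_cost = (capital_cost * crf
                   + feedstock_price * feedstock_use
                   + opex) / fuel_output

print(f"mean production cost: {production_cost.mean():.3f} GBP/l")
print(f"5th-95th percentile : {np.percentile(production_cost, [5, 95])}")
```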

Relevance: 40.00%

Abstract:

In a Data Envelopment Analysis model, some of the weights used to compute the efficiency of a unit can have zero or negligible value despite the importance of the corresponding input or output. This paper offers an approach to preventing inputs and outputs from being ignored in the DEA assessment in the multiple-input, multiple-output VRS environment, building on an approach introduced in Allen and Thanassoulis (2004) for single-input, multiple-output CRS cases. The proposed method is based on the idea of introducing unobserved DMUs, created by adjusting the input and output levels of certain observed relatively efficient DMUs in a manner which reflects a combination of technical information and the decision maker's value judgements. In contrast to many alternative techniques used to constrain weights and/or improve envelopment in DEA, this approach allows one to impose local information on production trade-offs which are in line with the general VRS technology. The suggested procedure is illustrated using real data. © 2011 Elsevier B.V. All rights reserved.
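For readers unfamiliar with the underlying model, the sketch below sets up the standard input-oriented VRS (BCC) envelopment LP that the paper builds on; it is not the authors' unobserved-DMU procedure itself. In their approach, an unobserved DMU would simply be appended as an extra column of X and Y before solving. The data are toy values.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 DMUs, 2 inputs (rows of X), 1 output (rows of Y).
X = np.array([[2, 4, 6, 3, 5],
              [3, 1, 2, 4, 2]], dtype=float)
Y = np.array([[1, 2, 3, 1, 2]], dtype=float)

def vrs_efficiency(X, Y, k):
    """Input-oriented VRS (BCC) efficiency of DMU k.
    Decision vector z = [theta, lambda_1, ..., lambda_n]."""
    m, n = X.shape                                 # inputs x DMUs
    s = Y.shape[0]                                 # outputs
    c = np.r_[1.0, np.zeros(n)]                    # minimise theta
    # sum_j lambda_j x_ij - theta x_ik <= 0
    A_in = np.hstack([-X[:, [k]], X])
    # -sum_j lambda_j y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)   # VRS: lambdas sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for k in range(X.shape[1]):
    print(f"DMU {k}: VRS efficiency = {vrs_efficiency(X, Y, k):.3f}")
```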

Relevance: 40.00%

Abstract:

Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
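A minimal illustration of the α-level idea the model is derived from: at a chosen level α, each fuzzy input or output is replaced by the interval of values whose membership is at least α, and the DEA model is then optimised over those intervals. The triangular fuzzy number below is an invented example.

```python
def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, b, c) at level alpha."""
    a, b, c = tfn
    return a + alpha * (b - a), c - alpha * (c - b)

# A fuzzy input "about 4" and the intervals the alpha-level approach optimises over.
fuzzy_input = (3.0, 4.0, 5.0)
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(fuzzy_input, alpha)
    print(f"alpha={alpha}: input lies in [{lo:.2f}, {hi:.2f}]")
```

At each α, the additive model is solved at the optimistic and pessimistic ends of these intervals, giving interval-valued efficiency scores.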

Relevance: 40.00%

Abstract:

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. In this study, we provide a taxonomy and review of the fuzzy DEA methods. We present a classification scheme with four primary categories, namely, the tolerance approach, the α-level based approach, the fuzzy ranking approach and the possibility approach. We discuss each classification scheme and group the fuzzy DEA papers published in the literature over the past 20 years. To the best of our knowledge, this paper appears to be the only review and complete source of references on fuzzy DEA. © 2011 Elsevier B.V. All rights reserved.

Relevance: 40.00%

Abstract:

Integer-valued data envelopment analysis (DEA) with alternative returns to scale technology has been introduced and developed recently by Kuosmanen and Kazemi Matin. The proportionality assumption of their introduced "natural augmentability" axiom in constant and nondecreasing returns to scale technologies makes it possible to achieve feasible decision-making units (DMUs) of arbitrarily large size. In many real-world applications it is not possible to achieve such production plans, since some of the input and output variables are bounded above. In this paper, we extend the axiomatic foundation of integer-valued DEA models to include bounded output variables. Some model variants are obtained by introducing a new axiom of "boundedness" over the selected output variables. A mixed integer linear programming (MILP) formulation is also introduced for computing efficiency scores in the associated production set. © 2011 The Authors. International Transactions in Operational Research © 2011 International Federation of Operational Research Societies.
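As a rough illustration of the kind of formulation involved, the following MILP sketch computes an input-oriented efficiency score in which the input targets are forced to be integer. It is a simplified CRS variant in the spirit of the integer DEA models discussed, not this paper's exact formulation (which additionally bounds selected outputs); the data are toy values and SciPy's MILP solver stands in for whatever solver the authors used.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Toy data: 4 DMUs, 2 integer-valued inputs, 1 output.
X = np.array([[4, 6, 5, 8],
              [3, 2, 4, 3]], dtype=float)
Y = np.array([[2, 3, 3, 4]], dtype=float)

def integer_dea(X, Y, k):
    """Input-oriented efficiency of DMU k with integer input targets.
    Decision vector z = [theta, lambda_1..lambda_n, xtilde_1..xtilde_m]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n + m)
    c[0] = 1.0                                        # minimise theta
    # sum_j lambda_j x_ij - xtilde_i <= 0
    A1 = np.hstack([np.zeros((m, 1)), X, -np.eye(m)])
    # xtilde_i - theta x_ik <= 0
    A2 = np.hstack([-X[:, [k]], np.zeros((m, n)), np.eye(m)])
    # -sum_j lambda_j y_rj <= -y_rk
    A3 = np.hstack([np.zeros((s, 1)), -Y, np.zeros((s, m))])
    A = np.vstack([A1, A2, A3])
    ub = np.r_[np.zeros(2 * m), -Y[:, k]]
    integrality = np.r_[0, np.zeros(n), np.ones(m)]   # only the targets are integer
    res = milp(c, constraints=LinearConstraint(A, -np.inf, ub),
               integrality=integrality, bounds=Bounds(0, np.inf))
    return res.fun

for k in range(X.shape[1]):
    print(f"DMU {k}: integer-DEA efficiency = {integer_dea(X, Y, k):.3f}")
```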

Relevance: 40.00%

Abstract:

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. This chapter provides a taxonomy and review of the fuzzy DEA (FDEA) methods. We present a classification scheme with six categories, namely, the tolerance approach, the α-level based approach, the fuzzy ranking approach, the possibility approach, the fuzzy arithmetic approach, and the fuzzy random/type-2 fuzzy set approach. We discuss each classification scheme and group the FDEA papers published in the literature over the past 30 years. © 2014 Springer-Verlag Berlin Heidelberg.

Relevance: 40.00%

Abstract:

Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators to act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models. © 2013 Elsevier Inc. All rights reserved.
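A compact sketch of the two ingredients discussed above: a GP marginal likelihood with input-dependent noise, and a finite-difference approximation to the observed information matrix, whose inverse approximates the parameter covariance that the design criterion seeks to minimise. The squared-exponential kernel, the log-linear noise model and all numbers are illustrative assumptions, and the information matrix is evaluated at the data-generating values rather than at a fitted MLE.

```python
import numpy as np

def neg_log_lik(params, x, y):
    """Negative log marginal likelihood of a GP with a squared-exponential
    kernel and input-dependent noise variance exp(a + b*x).
    params = [log signal variance, log lengthscale, a, b]."""
    lsf, lls, a, b = params
    sf2, ls = np.exp(lsf), np.exp(lls)
    d = (x[:, None] - x[None, :]) / ls
    K = sf2 * np.exp(-0.5 * d**2) + np.diag(np.exp(a + b * x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

def observed_information(params, x, y, eps=1e-3):
    """Finite-difference Hessian of the negative log likelihood; its inverse
    approximates the covariance of the GP parameter estimates."""
    p = np.asarray(params, dtype=float)
    H = np.zeros((len(p), len(p)))
    for i in range(len(p)):
        for j in range(len(p)):
            ei, ej = np.eye(len(p))[i] * eps, np.eye(len(p))[j] * eps
            H[i, j] = (neg_log_lik(p + ei + ej, x, y) - neg_log_lik(p + ei - ej, x, y)
                       - neg_log_lik(p - ei + ej, x, y) + neg_log_lik(p - ei - ej, x, y)
                       ) / (4 * eps**2)
    return H

# A replicated design: 5 distinct sites, 3 replicates each.
rng = np.random.default_rng(0)
x = np.repeat(np.linspace(0.0, 1.0, 5), 3)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, np.exp(0.5 * (-2.0 + 1.5 * x)))

H = observed_information([0.0, -1.0, -2.0, 1.5], x, y)
print("approximate parameter covariance:\n", np.linalg.inv(H))
```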

Relevance: 40.00%

Abstract:

In the nonparametric framework of Data Envelopment Analysis, the statistical properties of its estimators have been investigated, and only asymptotic results are available. For DEA estimators, results of practical use have been proved only for the case of one input and one output. However, in real-world problems the production process is usually well described by many variables. In this paper, a machine learning approach to variable aggregation based on Canonical Correlation Analysis is presented. This approach is applied to the efficiency estimation of all farms on Terceira Island in the Azorean archipelago.
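A sketch of the aggregation step, under the assumption that scikit-learn's CCA is an acceptable stand-in for the paper's implementation: the input block and the output block are each collapsed to a single canonical variate, chosen so the two variates are maximally correlated, after which the one-input/one-output DEA results become applicable. The data are random placeholders, not the Azorean farm data, and in practice the variates may need shifting to ensure positivity before DEA.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
n = 50                                  # farms (toy data)
inputs = rng.lognormal(size=(n, 3))     # e.g. land, labour, capital
outputs = rng.lognormal(size=(n, 2))    # e.g. milk, crops

# One pair of canonical variates: maximally correlated linear
# combinations of the input block and the output block.
cca = CCA(n_components=1).fit(inputs, outputs)
x_agg, y_agg = cca.transform(inputs, outputs)

# The aggregated single input/single output can now feed a 1x1 DEA model,
# for which the asymptotic results mentioned above apply.
print("canonical correlation:",
      np.corrcoef(x_agg.ravel(), y_agg.ravel())[0, 1])
```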

Relevance: 40.00%

Abstract:

Non-parametric methods for efficiency evaluation were designed to analyse industries comprising multi-input multi-output producers and lacking data on market prices. Education is a typical example. In this chapter, we review applications of DEA in secondary and tertiary education, focusing on the opportunities that this offers for benchmarking at institutional level. At secondary level, we also investigate the disaggregation of efficiency measures into pupil-level and school-level effects. For higher education, while many analyses concern overall institutional efficiency, we also examine studies that take a more disaggregated approach, centred either on the performance of specific functional areas or on that of individual employees.

Relevance: 40.00%

Abstract:

Waste biomass is generated during the conservation management of semi-natural habitats, and represents an unused resource and potential bioenergy feedstock that does not compete with food production. Thermogravimetric analysis was used to characterise a representative range of biomass generated during conservation management in Wales. Of the biomass types assessed, those dominated by rush (Juncus effusus) and bracken (Pteridium aquilinum) exhibited the highest and lowest volatile compositions respectively, and were selected for bench-scale conversion via fast pyrolysis. Each biomass type was ensiled and a sub-sample of silage was washed and pressed. Demineralisation of conservation biomass through washing and pressing was associated with higher oil yields following fast pyrolysis. The oil yields were within the published range established for the dedicated energy crops miscanthus and willow. In order to examine the potential, a multiple-output energy system was developed, with gross power production estimates following valorisation of the press fluid, char and oil. If used in multi-fuel industrial burners, the char and oil alone would displace 3.9 × 10⁵ tonnes per year of No. 2 light oil using Welsh biomass from conservation management. Bioenergy and product development using these feedstocks could simultaneously support biodiversity management and displace fossil fuels, thereby reducing GHG emissions. Gross power generation predictions show good potential.

Relevance: 40.00%

Abstract:

Recently, wireless network technology has grown at such a pace that scientific research has become a practical reality in a very short time span. Mobile wireless communications have witnessed the adoption of several generations, each of them complementing and improving the former. One mobile system that features high data rates and open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible choices for solutions in the 4G system. 4G is a collection of technologies and standards that will allow a range of ubiquitous computing and wireless communication architectures. The researcher considers the ability to guarantee reliable communications from 100 Mbps, in high mobility links, to as high as 1 Gbps for low mobility users, together with high efficiency in spectrum usage, to be one of the most important characteristics of future 4G mobile systems. On mobile wireless communications networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations, where a terrestrial infrastructure is unavailable. Thus, they must rely upon satellite coverage. A good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. This technique must adapt to the characteristics of the satellite channel and also be efficient in the use of allocated bandwidth. Satellite links are fading channels when used by mobile users. Some measures designed to address these fading environments make use of: (1) spatial diversity (two receive antenna configuration); (2) time diversity (channel interleaver/spreading techniques); and (3) upper layer FEC. The author proposes the use of OFDM (Orthogonal Frequency-Division Multiplexing) for the satellite link by increasing the time diversity. This technique will allow for an increase of the data rate, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation approaches the use of Cooperative Satellite Communications for hybrid satellite/terrestrial networks. By using this technique, the satellite coverage can be extended to areas where there is no direct link to the satellite. For this purpose, a good channel model is necessary.
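A minimal sketch of the OFDM transmit/receive chain proposed for the satellite link: data symbols are mapped onto orthogonal subcarriers with an IFFT and protected by a cyclic prefix, and the receiver removes the prefix and recovers the symbols with an FFT. The subcarrier count, prefix length and QPSK mapping are illustrative choices, and the channel here is ideal.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sub, cp_len = 64, 16                 # subcarriers and cyclic-prefix length

# Random QPSK symbols, one per subcarrier.
bits = rng.integers(0, 2, (n_sub, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# OFDM modulation: IFFT across subcarriers, then prepend the cyclic prefix.
time_sig = np.fft.ifft(qpsk) * np.sqrt(n_sub)
tx = np.r_[time_sig[-cp_len:], time_sig]

# Receiver: drop the prefix and demodulate with an FFT.
rx = np.fft.fft(tx[cp_len:]) / np.sqrt(n_sub)
print("max symbol error:", np.abs(rx - qpsk).max())  # ~0 on an ideal channel
```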

Relevance: 40.00%

Abstract:

Recently, wireless network technology has grown at such a pace that scientific research has become a practical reality in a very short time span. One mobile system that features high data rates and open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible choices for solutions in the 4G system. The researcher considers the ability to guarantee reliable communications at high data rates, together with high efficiency in spectrum usage, to be one of the most important characteristics of future 4G mobile systems. On mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable. The results show that a good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. The dissertation proposes the use of OFDM (Orthogonal Frequency-Division Multiplexing) for the satellite link by increasing the time diversity. This technique will allow for an increase of the data rate, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation approaches the use of Cooperative Satellite Communications for hybrid satellite/terrestrial networks. By using this technique, the satellite coverage can be extended to areas where there is no direct link to the satellite. The issue of Cooperative Satellite Communications is solved through a new algorithm that forwards the received data from the fixed node to the mobile node. This algorithm is very efficient because it does not allow unnecessary transmissions and is based on signal-to-noise ratio (SNR) measures.
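The abstract does not give the forwarding algorithm itself, so the snippet below is only a hypothetical decision rule in its spirit: the fixed node forwards to the mobile node only when the mobile's direct satellite link is too weak to decode and the relay link is strong enough, avoiding unnecessary transmissions. The threshold and SNR values are invented.

```python
import numpy as np

def should_forward(snr_direct_db, snr_relay_db, threshold_db=7.0):
    """Hypothetical SNR-based forwarding rule: relay only when the direct
    satellite link falls below the decoding threshold AND the relay link
    clears it, so no transmission is wasted."""
    return snr_direct_db < threshold_db <= snr_relay_db

rng = np.random.default_rng(3)
for _ in range(5):
    d, r = rng.uniform(0, 15, 2)
    print(f"direct {d:4.1f} dB, relay {r:4.1f} dB -> forward: {should_forward(d, r)}")
```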

Relevance: 40.00%

Abstract:

In this paper, we consider the uplink of a single-cell massive multiple-input multiple-output (MIMO) system with in-phase and quadrature-phase imbalance (IQI). This scenario is of particular importance in massive MIMO systems, where the deployment of lower-cost, lower-quality components is desirable to make massive MIMO a viable technology. In particular, we investigate the effect of IQI on the performance of massive MIMO employing maximum-ratio combining (MRC) receivers. In order to study how IQI affects channel estimation, we derive a new channel estimator for the IQI-impaired model and show that IQI can substantially degrade the performance of MRC receivers. Moreover, a low-complexity IQI compensation scheme suitable for massive MIMO is proposed, which is based on estimation of the IQI coefficients and is independent of the channel gain. The performance of the proposed compensation scheme is analytically evaluated by deriving a tractable approximation of the ergodic achievable rate and providing the asymptotic power scaling laws, assuming transmission over Rayleigh fading channels with log-normal large-scale fading. Finally, we show that massive MIMO effectively suppresses the residual IQI effects, as long as the compensation scheme is applied.
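For concreteness, the widely used receive-IQI signal model and its coefficient-based compensation can be sketched as follows. The imbalance values are illustrative, and the compensation assumes the coefficients mu and nu are known, standing in for the estimates that the paper's scheme would produce.

```python
import numpy as np

rng = np.random.default_rng(5)
g, phi = 1.1, np.deg2rad(5)            # amplitude/phase imbalance (illustrative)
mu = (1 + g * np.exp(-1j * phi)) / 2   # standard receive-IQI model:
nu = (1 - g * np.exp(+1j * phi)) / 2   # y = mu * x + nu * conj(x)

x = (rng.standard_normal(10_000) + 1j * rng.standard_normal(10_000)) / np.sqrt(2)
y = mu * x + nu * np.conj(x)           # IQI-impaired received signal

print(f"image rejection ratio: {10 * np.log10(abs(mu)**2 / abs(nu)**2):.1f} dB")

# Compensation given (estimates of) mu and nu; inverting the 2x2 mixing of
# x and conj(x) recovers the ideal signal exactly.
x_hat = (np.conj(mu) * y - nu * np.conj(y)) / (abs(mu)**2 - abs(nu)**2)
print("residual error after compensation:", np.abs(x_hat - x).max())
```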

Relevance: 40.00%

Abstract:

This paper studies the energy efficiency (EE) of a point-to-point rank-1 Ricean fading multiple-input multiple-output (MIMO) channel. In particular, a tight lower bound and an asymptotic approximation for the EE of the considered MIMO system are presented, under the assumption that the channel is unknown at the transmitter and perfectly known at the receiver. Moreover, the effects of different system parameters, namely, transmit power, spectral efficiency (SE), and number of transmit and receive antennas, on the EE are analytically investigated. An important observation is that, in the high signal-to-noise ratio regime and with the other system parameters fixed, the optimal transmit power that maximizes the EE increases as the Ricean K-factor increases. On the contrary, the optimal SE and the optimal number of transmit antennas decrease as K increases.
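The qualitative trade-off behind the "optimal transmit power" observation can be sketched numerically: spectral efficiency grows only logarithmically in transmit power while consumed power grows linearly, so EE = SE / total power has an interior maximum. The gain term and circuit-power figure below are placeholders, not the paper's bound.

```python
import numpy as np

P = np.logspace(-2, 2, 400)           # candidate transmit powers, W
Nt, K = 4, 3.0                        # transmit antennas, Ricean K-factor
gain = 10.0 * (K + 1) / (K + Nt)      # placeholder effective channel gain
se = np.log2(1.0 + P * gain)          # spectral efficiency, bits/s/Hz
ee = se / (P + 0.5 * Nt)              # EE with an assumed 0.5 W circuit power per antenna

k = np.argmax(ee)
print(f"EE-optimal transmit power ~ {P[k]:.2f} W, "
      f"EE = {ee[k]:.2f} bits/J/Hz, SE = {se[k]:.2f} bits/s/Hz")
```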

Relevance: 40.00%

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time and resource consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging which leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences that are shared between a number of faulty test cases for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach, and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need only inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
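A toy version of the trace-intersection step: given the sequence covers of several failing test cases, repeatedly take the longest common subsequence to obtain a candidate faulty path. Pairwise LCS reduction is a simplification of the dissertation's optimised algorithm, and the traces below are invented.

```python
from functools import reduce

def lcs(a, b):
    """Longest common subsequence of two execution traces (lists of code
    locations), via standard dynamic programming."""
    m, n = len(a), len(b)
    dp = [[[] for _ in range(n + 1)] for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + [a[i]]
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j], key=len)
    return dp[m][n]

# Hypothetical sequence covers of three failing test cases; statements are
# labelled by line id. Their shared subsequence approximates the faulty path.
traces = [
    ["f1", "f3", "f4", "f7", "f9"],
    ["f1", "f2", "f3", "f7", "f9"],
    ["f0", "f1", "f3", "f7", "f8", "f9"],
]
print("candidate faulty path:", reduce(lcs, traces))
```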