929 results for two input two output


Relevance:

80.00%

Publisher:

Abstract:

The main advantage of Data Envelopment Analysis (DEA) is that it does not require any a priori weights for inputs and outputs: each DMU evaluates its efficiency with the input and output weights that are most favorable to it. It can be argued that if DMUs operate under similar circumstances, then the pricing of inputs and outputs should apply uniformly across all DMUs; using different weights for different DMUs makes their efficiencies incomparable and prevents ranking them on a common basis. This is a significant drawback of DEA, although the literature offers many remedies, including the use of a common set of weights (CSW). Moreover, conventional DEA methods require accurate measurement of both inputs and outputs, yet crisp input and output data may not be available in real-world applications. This paper develops a new model for calculating a CSW in fuzzy environments using fuzzy DEA. A numerical example demonstrates the validity and efficacy of the proposed model and compares its results with previous models in the literature.
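
For context, the crisp model that this fuzzy CSW approach generalizes is the standard CCR multiplier model, in which each DMU selects its own most favorable weights. A minimal sketch under illustrative data (the numbers and the use of scipy are assumptions, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: 4 DMUs, 2 inputs, 2 outputs (rows = DMUs).
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0], [5.0, 4.0]])
Y = np.array([[3.0, 2.0], [4.0, 3.0], [5.0, 4.0], [4.0, 5.0]])

def ccr_efficiency(k):
    """CCR multiplier model: DMU k picks its own most favorable
    weights (u for outputs, v for inputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [u (s outputs), v (m inputs)].
    c = np.concatenate([-Y[k], np.zeros(m)])   # maximise u.y_k
    A_ub = np.hstack([Y, -X])                  # u.y_j - v.x_j <= 0 for all j
    A_eq = np.concatenate([np.zeros(s), X[k]])[None, :]  # v.x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun  # efficiency score in (0, 1]

scores = [ccr_efficiency(k) for k in range(len(X))]
```

Because every DMU optimizes its own weights, the resulting scores are not directly comparable across DMUs — precisely the motivation for a common set of weights.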

Relevance:

80.00%

Publisher:

Abstract:

Modern business trends such as agile manufacturing and virtual corporations require high levels of flexibility and responsiveness to consumer demand, together with the ability to select trading partners quickly and efficiently. Automated computational techniques for supply chain formation have the potential to provide significant advantages in speed and efficiency over the traditional manual approach to partner selection. Automated supply chain formation is the process of determining the participants within a supply chain and the terms of the exchanges made between them. In this thesis we present an automated technique for supply chain formation based upon the min-sum loopy belief propagation algorithm (LBP). LBP is a decentralised, distributed message-passing algorithm which allows participants to share their beliefs about the optimal structure of the supply chain based upon their costs, capabilities and requirements. We propose a novel framework for applying LBP to the existing state-of-the-art case of the decentralised supply chain formation problem, and extend this framework to further novel and established problem cases. Specifically, the contributions made by this thesis are:
• A novel framework for applying LBP to the decentralised supply chain formation scenario investigated using the current state-of-the-art approach. Our experimental analysis indicates that LBP is able to match or outperform this approach for the vast majority of problem instances tested.
• A new solution goal for supply chain formation in which economically motivated producers aim to maximise their profits by intelligently altering their profit margins. We propose a rational pricing strategy that allows producers to earn significantly greater profits than a comparable LBP-based profit-making approach.
• An LBP-based framework which allows the algorithm to solve supply chain formation problems in which goods are exchanged in multiple units, a first for a fully decentralised technique. As well as multiple-unit exchanges, this scenario models realistic constraints such as factory capacities and input-to-output ratios. LBP continues to match or outperform an extended version of the existing state-of-the-art approach in this scenario.
• The introduction of a dynamic supply chain formation scenario in which participants can alter their properties, or enter or leave the process, at any time. Our results suggest that LBP deals easily with individual occurrences of these alterations and that performance degrades gracefully when they occur in larger numbers.
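
The min-sum message-passing step the thesis builds on can be illustrated on a minimal two-stage chain, where it is exact ("loopy" refers to running the same updates on cyclic networks). All participants and costs below are illustrative assumptions, not from the thesis:

```python
import numpy as np

# Variable s = chosen raw-goods supplier, variable m = chosen
# manufacturer; unary costs are prices, the pairwise cost encodes
# the delivery cost between each supplier/manufacturer pair.
supplier_cost = np.array([3.0, 4.0])       # price of supplier 0, 1
manufacturer_cost = np.array([5.0, 2.0])   # price of manufacturer 0, 1
pair_cost = np.array([[1.0, 6.0],          # delivery cost s -> m
                      [2.0, 1.0]])

# Min-sum message from s to m: for each choice of m, the cheapest
# total supplier-side cost.
msg_s_to_m = np.min(supplier_cost[:, None] + pair_cost, axis=0)
belief_m = manufacturer_cost + msg_s_to_m
m_star = int(np.argmin(belief_m))          # optimal manufacturer

# Back-track the supplier choice consistent with m_star.
s_star = int(np.argmin(supplier_cost + pair_cost[:, m_star]))
total = (supplier_cost[s_star] + pair_cost[s_star, m_star]
         + manufacturer_cost[m_star])      # cost of the formed chain
```

Each participant only needs the messages from its neighbours, which is what makes the computation fully decentralisable.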

Relevance:

80.00%

Publisher:

Abstract:

We introduce a general matrix formulation for multiuser channels and analyse the special cases of Multiple-Input Multiple-Output channels, channels with interference and relay arrays under LDPC coding using methods developed for the statistical mechanics of disordered systems. We use the replica method to provide results for the typical overlaps of the original and recovered messages and discuss their implications. The results obtained are consistent with belief propagation and density evolution results but also complement them giving additional insights into the information dynamics of these channels with unexpected effects in some cases.

Relevance:

80.00%

Publisher:

Abstract:

The rationale for carrying out this research was to address the clear lack of knowledge surrounding the measurement of public hospital performance in Ireland. The objectives of this research were to develop a comprehensive model for measuring hospital performance and to use this model to measure the performance of public acute hospitals in Ireland in 2007. Having assessed the advantages and disadvantages of various measurement models, we chose the Data Envelopment Analysis (DEA) model for this research. DEA was initiated by Charnes, Cooper and Rhodes in 1978 and further developed by Fare et al. (1983) and Banker et al. (1984). The method used to choose relevant inputs and outputs for inclusion in the model followed that adopted by Casu et al. (2005), which included the use of focus groups. The main conclusions of the research are threefold. Firstly, it is clear that each stakeholder group has differing opinions on what constitutes good performance. It is therefore imperative that any performance measurement model be designed within parameters that are clearly understood by its intended audience. Secondly, there is a lack of publicly available qualitative information in Ireland, which inhibits detailed analysis of hospital performance. Thirdly, based on the available qualitative and quantitative data, the results indicated a high level of efficiency among the public acute hospitals in Ireland in their staffing and non-pay costs, averaging 98.5%. As DEA scores are sensitive to the number of input and output variables as well as to the sample size, it should be borne in mind that a high level of efficiency could be a result of using DEA with too many variables relative to the number of hospitals. No hospital was deemed scale efficient in any of the models, even though the average scale efficiency across all hospitals was relatively high at 90.3%.
Arising from this research, the main recommendations are that information on medical outcomes, survival rates and patient satisfaction should be made publicly available in Ireland; that, despite the high average efficiency level, many individual hospitals need to focus on improving their technical and scale efficiencies; and that performance measurement models incorporating more qualitative data should be developed.
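
For illustration, scale efficiency in a DEA study of this kind is the ratio of the CRS score (Charnes, Cooper and Rhodes, 1978) to the VRS score (Banker et al., 1984). A sketch with made-up hospital data (scipy is an assumed tool, not part of the study):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: inputs (staffing, non-pay costs) and a single
# output per hospital; all values are invented for the sketch.
X = np.array([[5.0, 4.0], [8.0, 6.0], [6.0, 9.0], [12.0, 10.0]])
Y = np.array([[10.0], [18.0], [12.0], [20.0]])

def theta(k, vrs):
    """Input-oriented envelopment model; vrs=True adds the convexity
    constraint sum(lambda) = 1 of Banker et al. (1984)."""
    n = len(X)
    # Variables: [theta, lambda_1..lambda_n]; minimise theta.
    c = np.concatenate([[1.0], np.zeros(n)])
    # X^T lambda - theta * x_k <= 0  and  -Y^T lambda <= -y_k.
    A_ub = np.vstack([
        np.hstack([-X[k][:, None], X.T]),
        np.hstack([np.zeros((Y.shape[1], 1)), -Y.T]),
    ])
    b_ub = np.concatenate([np.zeros(X.shape[1]), -Y[k]])
    A_eq = b_eq = None
    if vrs:
        A_eq = np.concatenate([[0.0], np.ones(n)])[None, :]
        b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# Scale efficiency = theta_CRS / theta_VRS, in (0, 1].
scale_eff = [theta(k, vrs=False) / theta(k, vrs=True) for k in range(len(X))]
```

A scale efficiency below 1 indicates a hospital is not operating at its most productive scale size, which is the sense in which no hospital in the study was scale efficient.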

Relevance:

80.00%

Publisher:

Abstract:

Purpose: The ubiquity and value of teams in healthcare are well acknowledged. However, in practice, healthcare teams vary dramatically in their structures and effectiveness in ways that can damage team processes and patient outcomes. The aim of this paper is to highlight these characteristics and to extrapolate several important aspects of teamwork that have a powerful impact on team effectiveness across healthcare contexts. Design/methodology/approach: The paper draws upon the literature from health services management and organisational behaviour to provide an overview of the current science of healthcare teams. Findings: Underpinned by the input-process-output framework of team effectiveness, team composition, team task, and organisational support are viewed as critical inputs that influence key team processes including team objectives, leadership and reflexivity, which in turn impact staff and patient outcomes. Team training interventions and care pathways can facilitate more effective interdisciplinary teamwork. Originality/value: The paper argues that the prevalence of the term "team" in healthcare makes the synthesis and advancement of the scientific understanding of healthcare teams a challenge. Future research therefore needs to better define the fundamental characteristics of teams in studies in order to ensure that findings based on real teams, rather than pseudo-like groups, are accumulated. © Emerald Group Publishing Limited.

Relevance:

80.00%

Publisher:

Abstract:

In a Data Envelopment Analysis model, some of the weights used to compute the efficiency of a unit can have zero or negligible value despite the importance of the corresponding input or output. This paper offers an approach to preventing inputs and outputs from being ignored in the DEA assessment under the multiple-input, multiple-output VRS environment, building on an approach introduced by Allen and Thanassoulis (2004) for single-input, multiple-output CRS cases. The proposed method is based on introducing unobserved DMUs, created by adjusting the input and output levels of certain observed, relatively efficient DMUs in a manner which reflects a combination of technical information and the decision maker's value judgements. In contrast to many alternative techniques used to constrain weights and/or improve envelopment in DEA, this approach allows one to impose local information on production trade-offs that is in line with the general VRS technology. The suggested procedure is illustrated using real data. © 2011 Elsevier B.V. All rights reserved.
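
The core construction can be sketched in a few lines: an unobserved DMU is obtained by applying a decision-maker trade-off to an observed efficient DMU, and the result is appended to the reference set before re-running the DEA assessment. The data and the trade-off below are illustrative assumptions:

```python
import numpy as np

# Observed efficient DMU (inputs unchanged throughout the sketch).
x = np.array([4.0, 3.0])   # inputs
y = np.array([6.0, 2.0])   # outputs

# Hypothetical trade-off elicited from the decision maker: output 1
# can be reduced by 1 unit in exchange for 0.5 units of output 2.
tradeoff = np.array([-1.0, 0.5])

def unobserved_dmu(y, tradeoff, steps):
    """Output vector of an unobserved DMU obtained by applying the
    trade-off `steps` times to an observed efficient DMU."""
    y_new = y + steps * tradeoff
    if np.any(y_new < 0):
        raise ValueError("trade-off applied too many times")
    return y_new

# Appended to the data set, this point extends the efficient frontier
# in the direction the decision maker considers feasible.
y_unobserved = unobserved_dmu(y, tradeoff, steps=2)
```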

Relevance:

80.00%

Publisher:

Abstract:

We develop a method for fabricating very small silica microbubbles having a micrometer-order wall thickness and demonstrate the first optical microbubble resonator. Our method is based on blowing a microbubble using stable radiative CO2 laser heating rather than unstable convective heating in a flame or furnace. Microbubbles are created along a microcapillary and are naturally opened to the input and output microfluidic or gas channels. The demonstrated microbubble resonator has 370 µm diameter, 2 µm wall thickness, and a Q factor exceeding 10. © 2010 Optical Society of America.

Relevance:

80.00%

Publisher:

Abstract:

Integer-valued data envelopment analysis (DEA) with alternative returns to scale technology has been introduced and developed recently by Kuosmanen and Kazemi Matin. The proportionality assumption of their "natural augmentability" axiom in constant and non-decreasing returns to scale technologies makes it possible to achieve feasible decision-making units (DMUs) of arbitrarily large size. In many real-world applications it is not possible to achieve such production plans, since some of the input and output variables are bounded above. In this paper, we extend the axiomatic foundation of integer-valued DEA models to include bounded output variables. Some model variants are obtained by introducing a new axiom of "boundedness" over the selected output variables. A mixed integer linear programming (MILP) formulation is also introduced for computing efficiency scores in the associated production set. © 2011 The Authors. International Transactions in Operational Research © 2011 International Federation of Operational Research Societies.
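
A simplified sketch of this kind of MILP (input-oriented, CRS, in the spirit of Kuosmanen and Kazemi Matin's envelopment formulation, with a hypothetical output bound U; the data and the use of scipy.optimize.milp are assumptions, not the paper's model):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Illustrative integer data: 3 DMUs with one input and one output.
X = np.array([[2.0], [4.0], [3.0]])
Y = np.array([[2.0], [5.0], [3.0]])
k = 2        # DMU under evaluation
U = 10.0     # hypothetical upper bound on the output ("boundedness")
n, m = X.shape
s = Y.shape[1]

# Variables: [theta, lambda (n), x_tilde (m, int), y_tilde (s, int)].
nv = 1 + n + m + s
c = np.zeros(nv)
c[0] = 1.0   # minimise theta

rows, lo, hi = [], [], []
for i in range(m):          # X^T lambda <= x_tilde
    r = np.zeros(nv); r[1:1 + n] = X[:, i]; r[1 + n + i] = -1.0
    rows.append(r); lo.append(-np.inf); hi.append(0.0)
for i in range(m):          # x_tilde <= theta * x_ik
    r = np.zeros(nv); r[0] = -X[k, i]; r[1 + n + i] = 1.0
    rows.append(r); lo.append(-np.inf); hi.append(0.0)
for j in range(s):          # Y^T lambda >= y_tilde
    r = np.zeros(nv); r[1:1 + n] = Y[:, j]; r[1 + n + m + j] = -1.0
    rows.append(r); lo.append(0.0); hi.append(np.inf)

bounds = Bounds(  # y_tilde >= y_k and y_tilde <= U via variable bounds
    np.concatenate([[0.0], np.zeros(n), np.zeros(m), Y[k]]),
    np.concatenate([[1.0], np.full(n, np.inf), np.full(m, np.inf),
                    np.full(s, U)]))
integrality = np.concatenate([np.zeros(1 + n), np.ones(m + s)])
res = milp(c=c, constraints=LinearConstraint(np.array(rows), lo, hi),
           integrality=integrality, bounds=bounds)
theta_int = res.x[0]        # integer-DEA efficiency of DMU k
```

On this data the integer model returns theta = 1, whereas the continuous CRS model would give 0.8: forcing the reference point up to integer values can leave a DMU undominated.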

Relevance:

80.00%

Publisher:

Abstract:

Data envelopment analysis (DEA) has proven to be an excellent data-oriented efficiency analysis method for comparing decision making units (DMUs) with multiple inputs and multiple outputs. In conventional DEA, it is assumed that the status of each measure is clearly known as either input or output. However, in some situations, a performance measure can play an input role for some DMUs and an output role for others. Cook and Zhu [Eur. J. Oper. Res. 180 (2007) 692–699] referred to these variables as flexible measures. This paper proposes an alternative model in which each flexible measure is treated as either an input or an output variable so as to maximize the technical efficiency of the DMU under evaluation. The main focus of this paper is on the impact that flexible measures have on the definition of the PPS and the assessment of technical efficiency. An example involving UK higher education institutions shows the applicability of the proposed approach.
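
With a single flexible measure, the designation that maximizes a DMU's efficiency can be found by simply trying both roles (the model described above embeds this choice in the optimization itself; the brute-force below is an illustrative equivalent on made-up data):

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: 3 DMUs, one fixed input, one fixed output, and
# one flexible measure whose input/output status is undecided.
fixed_in = np.array([[2.0], [3.0], [4.0]])
fixed_out = np.array([[4.0], [5.0], [6.0]])
flexible = np.array([[1.0], [3.0], [2.0]])

def ccr(X, Y, k):
    """Standard input-oriented CCR multiplier model."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(m)])       # maximise u.y_k
    res = linprog(c, A_ub=np.hstack([Y, -X]), b_ub=np.zeros(n),
                  A_eq=np.concatenate([np.zeros(s), X[k]])[None, :],
                  b_eq=[1.0], bounds=[(0, None)] * (s + m))
    return -res.fun

def flexible_efficiency(k):
    """Try the flexible measure in each role and keep the
    designation that maximises DMU k's efficiency."""
    as_input = ccr(np.hstack([fixed_in, flexible]), fixed_out, k)
    as_output = ccr(fixed_in, np.hstack([fixed_out, flexible]), k)
    return max(as_input, as_output)

scores = [flexible_efficiency(k) for k in range(3)]
```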

Relevance:

80.00%

Publisher:

Abstract:

Emrouznejad et al. (2010) proposed a Semi-Oriented Radial Measure (SORM) model for assessing the efficiency of Decision Making Units (DMUs) by Data Envelopment Analysis (DEA) with negative data. This paper provides a necessary and sufficient condition for the boundedness of the input- and output-oriented SORM models.

Relevance:

80.00%

Publisher:

Abstract:

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. This chapter provides a taxonomy and review of the fuzzy DEA (FDEA) methods. We present a classification scheme with six categories, namely, the tolerance approach, the α-level based approach, the fuzzy ranking approach, the possibility approach, the fuzzy arithmetic approach, and the fuzzy random/type-2 fuzzy set approach. We discuss each classification scheme and group the FDEA papers published in the literature over the past 30 years. © 2014 Springer-Verlag Berlin Heidelberg.
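
As a concrete instance of the α-level based approach, triangular fuzzy data can be reduced to intervals at each α, yielding lower and upper efficiency bounds in the style of Kao and Liu. The sketch below uses the single-input, single-output case, where CCR efficiency reduces to a ratio against the best ratio; all data are illustrative assumptions:

```python
def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, b, c) at
    membership level alpha (alpha = 1 recovers the crisp value b)."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

# Illustrative fuzzy data: 3 DMUs, one fuzzy input and one fuzzy
# output each, given as triangular numbers (a, b, c).
x = [(1.5, 2.0, 2.5), (3.5, 4.0, 4.5), (2.5, 3.0, 3.5)]
y = [(1.8, 2.0, 2.2), (4.5, 5.0, 5.5), (2.7, 3.0, 3.3)]

def efficiency_bounds(k, alpha):
    """Upper bound: DMU k at its best, rivals at their worst;
    lower bound: the reverse."""
    xk_lo, xk_hi = alpha_cut(x[k], alpha)
    yk_lo, yk_hi = alpha_cut(y[k], alpha)
    rivals_worst, rivals_best = [], []
    for j in range(len(x)):
        if j == k:
            continue
        xj_lo, xj_hi = alpha_cut(x[j], alpha)
        yj_lo, yj_hi = alpha_cut(y[j], alpha)
        rivals_worst.append(yj_lo / xj_hi)
        rivals_best.append(yj_hi / xj_lo)
    upper = (yk_hi / xk_lo) / max([yk_hi / xk_lo] + rivals_worst)
    lower = (yk_lo / xk_hi) / max([yk_lo / xk_hi] + rivals_best)
    return lower, upper
```

At α = 1 the interval collapses to the crisp CCR score; lowering α widens the efficiency interval, reflecting the data's fuzziness.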

Relevance:

80.00%

Publisher:

Abstract:

Data envelopment analysis (DEA) has gained a wide range of applications in measuring the comparative efficiency of decision making units (DMUs) with multiple incommensurate inputs and outputs. The standard DEA method requires that the status of all input and output variables be known exactly. However, in many real applications, the status of some measures is not clearly known as inputs or outputs. These measures are referred to as flexible measures. This paper proposes a flexible slacks-based measure (FSBM) of efficiency in which each flexible measure can play an input role for some DMUs and an output role for others, so as to maximize the relative efficiency of the DMU under evaluation. Further, we show that when an operational unit is efficient in a specific flexible measure, this measure can play both input and output roles for this unit. In this case, the optimal input/output designation for the flexible measure is the one that optimizes the efficiency of the artificial average unit. An application in assessing UK higher education institutions is used to show the applicability of the proposed approach. © 2013 Elsevier Ltd. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

Supply chain formation (SCF) is the process of determining the set of participants and exchange relationships within a network, with the goal of setting up a supply chain that meets some predefined social objective. Many proposed solutions for the SCF problem rely on centralized computation, which presents a single point of failure and can also lead to problems with scalability. Decentralized techniques that aid supply chain emergence offer a more robust and scalable approach by allowing participants to deliberate between themselves about the structure of the optimal supply chain. Current decentralized supply chain emergence mechanisms are only able to deal with simplistic scenarios in which goods are produced and traded in single units only, and without taking into account production capacities or input-output ratios other than 1:1. In this paper, we demonstrate the performance of a graphical inference technique, max-sum loopy belief propagation (LBP), in a complex multiunit supply chain emergence scenario which models additional constraints such as production capacities and input-to-output ratios. We also provide results demonstrating the performance of LBP in dynamic environments, where the properties and composition of participants are altered as the algorithm is running. Our results suggest that max-sum LBP produces consistently strong solutions on a variety of network structures in a multiunit problem scenario, and that performance tends not to be affected by on-the-fly changes to the properties or composition of participants.

Relevance:

80.00%

Publisher:

Abstract:

This paper is focused on a parallel Java implementation of a processor defined in a Network of Evolutionary Processors. The processor description is based on JDOM, which provides a complete, Java-based solution for accessing, manipulating, and outputting XML data from Java code. Communication among different processors, needed to obtain a fully functional simulation of a Network of Evolutionary Processors, will be treated in future work. A thread-safe model of processors performs all parallel operations, such as applying rules and filters. Non-deterministic behaviour of the processors is achieved with one thread per rule and per filter (input and output). Different results of a processor evolution are shown.
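
The thread-per-rule pattern described above can be sketched as follows. The original implementation is Java/JDOM; this Python sketch only illustrates the concurrency structure (one thread per rule, a lock-protected shared word set), with filters omitted and all names hypothetical:

```python
import threading

class Processor:
    """Evolutionary processor sketch: each rewriting rule runs in its
    own thread over a shared, lock-protected word set, so the order
    in which rules fire (and hence the result) is non-deterministic."""

    def __init__(self, words, rules):
        self.words = set(words)
        self.rules = rules              # list of (pattern, replacement)
        self.lock = threading.Lock()

    def _apply_rule(self, pattern, replacement):
        with self.lock:                 # thread-safe access to the words
            rewritten = {w.replace(pattern, replacement)
                         for w in self.words}
            self.words |= rewritten     # keep originals, add rewrites

    def evolve(self):
        """One evolution step: launch a thread per rule and wait."""
        threads = [threading.Thread(target=self._apply_rule, args=rule)
                   for rule in self.rules]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

p = Processor({"ab"}, [("a", "b"), ("b", "c")])
p.evolve()
# p.words now depends on the rule interleaving, e.g. it contains "cc"
# only if the rule a->b happened to fire before b->c.
```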