75 results for multiple-input single-output FRF
Abstract:
Integer-valued data envelopment analysis (DEA) with alternative returns-to-scale technologies was recently introduced and developed by Kuosmanen and Kazemi Matin. The proportionality assumption of their "natural augmentability" axiom in constant and non-decreasing returns-to-scale technologies makes it possible to construct feasible decision-making units (DMUs) of arbitrarily large size. In many real-world applications such production plans cannot be achieved, since some of the input and output variables are bounded above. In this paper, we extend the axiomatic foundation of integer-valued DEA models to include bounded output variables. Several model variants are obtained by introducing a new "boundedness" axiom over the selected output variables. A mixed-integer linear programming (MILP) formulation is also introduced for computing efficiency scores in the associated production set. © 2011 The Authors. International Transactions in Operational Research © 2011 International Federation of Operational Research Societies.
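The MILP model itself is not reproduced in the abstract. As a reference point only, the sketch below solves the standard input-oriented CCR envelopment LP with `scipy.optimize.linprog`; the paper's bounded-output, integer-valued variant would add integrality and upper-bound constraints on top of this baseline. The data matrices are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.

    X: inputs (m x n DMUs), Y: outputs (s x n DMUs).
    Decision variables: [theta, lambda_1, ..., lambda_n].
    """
    m, n = X.shape
    s, _ = Y.shape
    c = np.zeros(n + 1)
    c[0] = 1.0                                 # minimise theta

    # sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # -sum_j lambda_j * y_rj <= -y_ro  (outputs at least y_ro)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]

    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    return res.fun

# Illustrative data: 2 inputs, 1 output, 4 DMUs (assumed values).
X = np.array([[2.0, 4.0, 8.0, 4.0],
              [3.0, 1.0, 1.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
for o in range(X.shape[1]):
    print(f"DMU {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
```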
Abstract:
Emrouznejad et al. (2010) proposed a Semi-Oriented Radial Measure (SORM) model for assessing the efficiency of Decision Making Units (DMUs) by Data Envelopment Analysis (DEA) with negative data. This paper provides a necessary and sufficient condition for boundedness of the input and output oriented SORM models.
Abstract:
Conventional DEA models assume deterministic, precise and non-negative data for input and output observations. However, real applications may be characterized by observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that addresses the two problems, interval data and negative data, separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the intervals may have upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes, namely the strictly efficient, the weakly efficient and the inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.
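The abstract does not spell out the decomposition, so the sketch below follows a common interval-DEA convention (the evaluated DMU taken at its most favourable bounds against the least favourable bounds of the others for the upper efficiency bound, and the mirror image for the lower bound); the paper's own scenario construction, which also handles mixed-sign bounds, may differ. All names and data layouts are illustrative.

```python
import numpy as np

def interval_scenarios(XL, XU, YL, YU, o):
    """Build the two data scenarios used to bound DMU `o`'s efficiency.

    XL/XU: lower/upper input bounds (m x n); YL/YU: output bounds (s x n).
    Returns (optimistic, pessimistic) scenarios as (X, Y) pairs.
    """
    # Optimistic for DMU o: its lowest inputs / highest outputs,
    # with the other DMUs at highest inputs / lowest outputs.
    X_opt, Y_opt = XU.copy(), YL.copy()
    X_opt[:, o], Y_opt[:, o] = XL[:, o], YU[:, o]
    # Pessimistic: the mirror image.
    X_pes, Y_pes = XL.copy(), YU.copy()
    X_pes[:, o], Y_pes[:, o] = XU[:, o], YL[:, o]
    return (X_opt, Y_opt), (X_pes, Y_pes)

# Each scenario would then be fed to an ordinary DEA model (e.g. the CCR
# sketch above) to obtain the upper and lower efficiency scores for DMU o.
```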
Abstract:
We develop a method for fabricating very small silica microbubbles having a micrometer-order wall thickness and demonstrate the first optical microbubble resonator. Our method is based on blowing a microbubble using stable radiative CO2 laser heating rather than unstable convective heating in a flame or furnace. Microbubbles are created along a microcapillary and are naturally opened to the input and output microfluidic or gas channels. The demonstrated microbubble resonator has 370 µm diameter, 2 µm wall thickness, and a Q factor exceeding 10. © 2010 Optical Society of America.
Abstract:
The entorhinal cortex (EC) controls hippocampal input and output, playing major roles in memory and spatial navigation. Different layers of the EC subserve different functions, and a number of studies have compared the properties of neurones across layers. We have studied synaptic inhibition and excitation in EC neurones, and we have previously compared spontaneous synaptic release of glutamate and GABA using patch clamp recordings of synaptic currents in principal neurones of layers II (L2) and V (L5). Here, we add comparative studies in layer III (L3). Such studies essentially look at neuronal activity from a presynaptic viewpoint. To correlate this with the postsynaptic consequences of spontaneous transmitter release, we have determined global postsynaptic conductances mediated by the two transmitters, using a method to estimate conductances from membrane potential fluctuations. We have previously presented some of these data for L3 and now extend this to L2 and L5. Inhibition dominates excitation in all layers, but the ratio follows a clear rank order (highest to lowest) of L2>L3>L5. The variance of the background conductances was markedly higher for both excitation and inhibition in L2 compared to L3 or L5. We also show that induction of synchronized network epileptiform activity by blockade of GABA inhibition reveals a relative reluctance of L2 to participate in such activity. This was associated with maintenance of a dominant background inhibition in L2, whereas in L3 and L5 the absolute level of inhibition fell below that of excitation, coincident with the appearance of synchronized discharges. Further experiments identified potential roles for competition between ambient GABA and bicuculline at the GABA-A receptor, and for strychnine-sensitive glycine receptors, in residual inhibition in L2. We discuss our results in terms of the control of excitability in neuronal subpopulations of EC neurones and what these may suggest for their functional roles. © 2014 Greenhill et al.
Abstract:
Desktop user interface design originates from the fact that users are stationary and can devote all of their visual resource to the application with which they are interacting. In contrast, users of mobile and wearable devices are typically in motion while using their device, which means that they cannot devote all, or any, of their visual resource to interaction with the mobile application -- it must remain with the primary task, often for safety reasons. Additionally, such devices have limited screen real estate, and traditional input and output capabilities are generally restricted. Consequently, if we are to develop effective applications for use on mobile or wearable technology, we must embrace a paradigm shift with respect to the interaction techniques we employ for communication with such devices. This paper discusses why it is necessary to embrace a paradigm shift in terms of interaction techniques for mobile technology and presents two novel multimodal interaction techniques, which are effective alternatives to traditional, visual-centric interface designs on mobile devices, as empirical examples of the potential to achieve this shift.
Abstract:
The entorhinal cortex (EC) is a key brain area controlling both hippocampal input and output via neurones in layer II and layer V, respectively. It is also a pivotal area in the generation and propagation of epilepsies involving the temporal lobe. We have previously shown that within the network of the EC, neurones in layer V are subject to powerful synaptic excitation but weak inhibition, whereas the reverse is true in layer II. The deep layers are also highly susceptible to acutely provoked epileptogenesis. Considerable evidence now points to a role of spontaneous background synaptic activity in control of neuronal, and hence network, excitability. In the present article we describe results of studies where we have compared background release of the excitatory transmitter, glutamate, and the inhibitory transmitter, GABA, in the two layers, the role of this background release in the balance of excitability, and its control by presynaptic auto- and heteroreceptors on presynaptic terminals. © The Physiological Society 2004.
Abstract:
Clogging is the main operational problem associated with horizontal subsurface flow constructed wetlands (HSSF CWs). The measurement of saturated hydraulic conductivity has proven to be a suitable technique for assessing clogging within HSSF CWs. The vertical and horizontal distribution of hydraulic conductivity was assessed in two full-scale HSSF CWs by using two different in situ permeameter methods (the falling head (FH) and constant head (CH) methods). Horizontal hydraulic conductivity profiles showed that both methods are related by a power function (FH = CH^0.7821, r^2 = 0.76) within the recorded range of hydraulic conductivities (0-70 m/day). However, the FH method provided lower values of hydraulic conductivity than the CH method (one to three times lower). Despite discrepancies between the magnitudes of the reported readings, the relative distribution of clogging obtained via both methods was similar. Therefore, both methods are useful for exploring the general distribution of clogging and, especially, for assessing clogged areas originating from preferential flow paths within full-scale HSSF CWs. Discrepancies between the methods (in both magnitude and pattern) arose from the vertical hydraulic conductivity profiles under highly clogged conditions. It is believed this can be attributed to procedural differences between the methods, such as the method of permeameter insertion (twisting versus hammering). Results from both methods suggest that clogging develops along the shortest distance between water input and output. Results also show that the design and maintenance of inlet distributors and outlet collectors appear to have a great influence on the pattern of clogging, and hence on the asset lifetime of HSSF CWs. © Springer Science+Business Media B.V. 2011.
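Given the fitted power function FH = CH^0.7821 (r^2 = 0.76) quoted above, a constant-head reading can be translated into the corresponding falling-head estimate; the snippet below is only that one-line conversion, valid within the reported 0-70 m/day range.

```python
def fh_from_ch(ch_m_per_day: float) -> float:
    """Falling-head estimate from a constant-head reading (m/day),
    using the fitted relation FH = CH**0.7821 (r^2 = 0.76)."""
    return ch_m_per_day ** 0.7821

# Example: a CH reading of 20 m/day maps to roughly 10.4 m/day FH,
# i.e. about half the CH value, consistent with the reported
# one-to-three-fold difference between the two methods.
print(fh_from_ch(20.0))
```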
Abstract:
We provide a theoretical explanation of the results on the intensity distributions and correlation functions obtained from a random-beam speckle field in nonlinear bulk waveguides reported in the recent publication by Bromberg et al. [Nat. Photonics 4, 721 (2010)]. We study both the focusing and defocusing cases and, in the limit of small speckle size (a short-correlated disordered beam), provide analytical asymptotics for the intensity probability distributions at the output facet. Additionally, we provide a simple relation between the speckle sizes at the input and output of a focusing nonlinear waveguide. The results are of practical significance for nonlinear Hanbury Brown and Twiss interferometry in both optical waveguides and Bose-Einstein condensates. © 2012 American Physical Society.
Abstract:
We analyze a business model for e-supermarkets to enable multi-product sourcing capacity through co-opetition (collaborative competition). The logistics aspect of our approach is to design and execute a network system where “premium” goods are acquired from vendors at multiple locations in the supply network and delivered to customers. Our specific goals are to: (i) investigate the role of premium product offerings in creating critical mass and profit; (ii) develop a model for the multiple-pickup single-delivery vehicle routing problem in the presence of multiple vendors; and (iii) propose a hybrid solution approach. To solve the problem introduced in this paper, we develop a hybrid metaheuristic approach that uses a Genetic Algorithm for vendor selection and allocation, and a modified savings algorithm for the capacitated VRP with multiple pickup, single delivery and time windows (CVRPMPDTW). The proposed Genetic Algorithm guides the search for optimal vendor pickup location decisions, and for each generated solution in the genetic population, a corresponding CVRPMPDTW is solved using the savings algorithm. We validate our solution approach against published VRPTW solutions and also test our algorithm with Solomon instances modified for CVRPMPDTW.
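The modified savings algorithm for the CVRPMPDTW is not detailed in the abstract; the sketch below only computes the classical Clarke-Wright savings values on which such heuristics build, with an illustrative distance matrix in which index 0 is the depot.

```python
import numpy as np

def clarke_wright_savings(dist):
    """Classical Clarke-Wright savings s(i, j) = d(0, i) + d(0, j) - d(i, j)
    for all customer pairs, returned in decreasing order of saving.
    `dist` is a symmetric distance matrix with the depot at index 0."""
    n = dist.shape[0]
    savings = []
    for i in range(1, n):
        for j in range(i + 1, n):
            s = dist[0, i] + dist[0, j] - dist[i, j]
            savings.append((s, i, j))
    return sorted(savings, reverse=True)

# Illustrative instance: 1 depot + 3 customers (assumed distances).
dist = np.array([[0.0, 4.0, 5.0, 6.0],
                 [4.0, 0.0, 2.0, 7.0],
                 [5.0, 2.0, 0.0, 3.0],
                 [6.0, 7.0, 3.0, 0.0]])
for s, i, j in clarke_wright_savings(dist):
    print(f"merge ({i}, {j}): saving {s:.1f}")
```

In the paper's hybrid approach, routes built from such savings would additionally have to respect the capacity, multiple-pickup, single-delivery and time-window constraints, while the vendor-selection decisions come from the Genetic Algorithm layer.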
Abstract:
Data Envelopment Analysis (DEA) is a powerful analytical technique for measuring the relative efficiency of alternatives based on their inputs and outputs. The alternatives can be countries that attempt to enhance their productivity and environmental efficiencies concurrently. However, when desirable outputs such as productivity increase, undesirable outputs (e.g. carbon emissions) increase as well, which makes the performance evaluation questionable. In addition, traditional environmental efficiency has typically been measured with crisp input and output data (desirable and undesirable). However, the input and output data, such as CO2 emissions, in real-world evaluation problems are often imprecise or ambiguous. This paper proposes a DEA-based framework in which the input and output data are characterized by symmetrical and asymmetrical fuzzy numbers. The proposed method allows the environmental evaluation to be assessed at different levels of certainty. The validity of the proposed model has been tested and its usefulness is illustrated using two numerical examples. An application to energy efficiency among 23 European Union (EU) member countries is further presented to show the applicability and efficacy of the proposed approach under asymmetric fuzzy numbers.
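The "different levels of certainty" in the abstract typically correspond to alpha-cuts of the fuzzy data. As an illustration only (the paper's exact fuzzy DEA formulation is not given here), the snippet below turns symmetric and asymmetric triangular fuzzy observations into crisp intervals at a chosen alpha, which an interval or fuzzy DEA model would then consume.

```python
def alpha_cut(center, left_spread, right_spread, alpha):
    """Alpha-cut [lower, upper] of a triangular fuzzy number
    (center, left_spread, right_spread); a symmetric number simply has
    left_spread == right_spread."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return (center - (1.0 - alpha) * left_spread,
            center + (1.0 - alpha) * right_spread)

# Illustrative CO2-emission observation (values are assumptions):
# "about 350 Mt, but possibly 20 Mt lower or 40 Mt higher".
print(alpha_cut(350.0, 20.0, 40.0, alpha=0.5))   # (340.0, 370.0)
```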
Abstract:
Non-parametric methods for efficiency evaluation were designed to analyse industries comprising multi-input multi-output producers and lacking data on market prices. Education is a typical example. In this chapter, we review applications of DEA in secondary and tertiary education, focusing on the opportunities that this offers for benchmarking at institutional level. At secondary level, we investigate also the disaggregation of efficiency measures into pupil-level and school-level effects. For higher education, while many analyses concern overall institutional efficiency, we examine also studies that take a more disaggregated approach, centred either around the performance of specific functional areas or that of individual employees.
Abstract:
An iterative method is proposed for computing the channel capacity of continuous-output channels with either discrete or continuous input. The efficiency of the new method is demonstrated by comparison with the classical Blahut-Arimoto algorithm for several known channels. Moreover, we also present a hybrid method combining the advantages of both the Blahut-Arimoto algorithm and our iterative approach. The new method is especially efficient for channels with an a priori unknown discrete input alphabet.
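The paper's own iterative method is not described in the abstract, but the classical Blahut-Arimoto baseline it is compared against is standard; a minimal discrete-memoryless version is sketched below and sanity-checked on a binary symmetric channel.

```python
import numpy as np

def blahut_arimoto(P, tol=1e-9, max_iter=10_000):
    """Capacity (in bits) of a discrete memoryless channel with
    transition matrix P[x, y] = P(y | x), via Blahut-Arimoto."""
    n_x = P.shape[0]
    p = np.full(n_x, 1.0 / n_x)            # input distribution, start uniform
    for _ in range(max_iter):
        r = p @ P                            # induced output distribution
        # D(x) = KL divergence between P(. | x) and r, in bits
        with np.errstate(divide="ignore", invalid="ignore"):
            logterm = np.where(P > 0, np.log2(P / r), 0.0)
        D = np.sum(P * logterm, axis=1)
        c = np.exp2(D)
        lower, upper = np.log2(p @ c), np.log2(c.max())
        p = p * c / (p @ c)                  # multiplicative update
        if upper - lower < tol:
            break
    return lower

# Sanity check: BSC with crossover 0.1 has capacity 1 - H(0.1) ~ 0.531 bits.
bsc = np.array([[0.9, 0.1],
                [0.1, 0.9]])
print(blahut_arimoto(bsc))
```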
Abstract:
In this research, the recovery of a DQPSK signal will be demonstrated using a single Mach-Zehnder interferometer (MZI). By changing the phase delay in one of the arms, it will be shown that different delays produce different output levels. It will also be shown that, with a certain level of phase shift, the DQPSK signal can be converted into four different, equally spaced optical power levels, with each decoded level representing one of the four possible bit permutations. By using this additional phase shift in one of the arms, the number of MZIs required for decoding can be reduced from two to one.
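The exact bias phase is not stated in the abstract. As a purely illustrative check, assuming the constructive port of a one-symbol-delay MZI follows P ∝ cos²((Δφ + φ_b)/2), a bias of φ_b = arctan(1/3) ≈ 18.4° maps the four DQPSK phase differences {0, π/2, π, 3π/2} onto four equally spaced power levels:

```python
import numpy as np

# Assumed transfer function of the constructive port of a delay-line MZI:
# P(dphi) = cos^2((dphi + phase_bias) / 2), with power normalised to 1.
phase_bias = np.arctan(1.0 / 3.0)               # ~0.3217 rad (~18.4 deg)
dphi = np.array([0.0, 0.5, 1.0, 1.5]) * np.pi   # DQPSK phase differences

levels = np.cos((dphi + phase_bias) / 2.0) ** 2
print(np.sort(levels))              # ~[0.026, 0.342, 0.658, 0.974]
print(np.diff(np.sort(levels)))     # equal steps of ~0.316 (= 1/sqrt(10))
```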
Abstract:
In data envelopment analysis (DEA), operating units are compared on their outputs relative to their inputs. The identification of an appropriate input-output set is of decisive significance if assessment of the relative performance of the units is not to be biased. This paper reports on a novel approach used for identifying a suitable input-output set for assessing central administrative services at universities. A computer-supported group support system was used with an advisory board to enable the analysts to extract information pertaining to the boundaries of the unit of assessment and the corresponding input-output variables. The approach provides for a more comprehensive and less inhibited discussion of input-output variables to inform the DEA model. © 2005 Operational Research Society Ltd. All rights reserved.