928 results for Multiple-input-multiple-output (MIMO)


Relevance:

100.00%

Abstract:

Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA; (2) we propose a new fuzzy additive DEA model derived from the α-level approach; and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
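
For reference, the crisp additive DEA model that the fuzzy formulation generalises can be written as below; this is the standard textbook form rather than the paper's exact notation, and the fuzzy variant replaces the crisp data x_ij and y_rj with fuzzy numbers handled through their α-level sets.

\[
\max_{\lambda, s^-, s^+} \; \sum_{i=1}^{m} s_i^- + \sum_{r=1}^{s} s_r^+
\quad \text{s.t.} \quad
\sum_{j=1}^{n} \lambda_j x_{ij} + s_i^- = x_{io}, \qquad
\sum_{j=1}^{n} \lambda_j y_{rj} - s_r^+ = y_{ro}, \qquad
\sum_{j=1}^{n} \lambda_j = 1, \qquad
\lambda_j,\, s_i^-,\, s_r^+ \ge 0,
\]

where DMU o is the unit under evaluation; it is efficient exactly when all optimal slacks are zero.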

Relevance:

100.00%

Abstract:

Integer-valued data envelopment analysis (DEA) with alternative returns-to-scale technologies has recently been introduced and developed by Kuosmanen and Kazemi Matin. The proportionality assumption underlying their "natural augmentability" axiom in constant and nondecreasing returns-to-scale technologies makes it possible to achieve feasible decision-making units (DMUs) of arbitrarily large size. In many real-world applications it is not possible to achieve such production plans, since some of the input and output variables are bounded above. In this paper, we extend the axiomatic foundation of integer-valued DEA models to include bounded output variables. Several model variants are obtained by introducing a new axiom of "boundedness" over the selected output variables. A mixed-integer linear programming (MILP) formulation is also introduced for computing efficiency scores in the associated production set. © 2011 The Authors. International Transactions in Operational Research © 2011 International Federation of Operational Research Societies.
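
As a rough sketch of how such a bound enters a mixed-integer formulation (one plausible shape following the general integer-DEA approach, not the authors' exact axiomatic model), an input-oriented efficiency score with upper bounds on a selected set B of outputs could be computed as:

\[
\min_{\theta,\lambda,\tilde{x},\tilde{y}} \; \theta
\quad \text{s.t.} \quad
\sum_j \lambda_j x_{ij} \le \tilde{x}_i \le \theta x_{io}, \qquad
y_{ro} \le \tilde{y}_r \le \sum_j \lambda_j y_{rj}, \qquad
\tilde{y}_r \le \bar{y}_r \ (r \in B), \qquad
\tilde{x}_i, \tilde{y}_r \in \mathbb{Z}_+, \; \lambda_j \ge 0,
\]

where \bar{y}_r are the output bounds; the integrality of the reference targets (\tilde{x}, \tilde{y}) is what turns the usual LP into a MILP.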

Relevance:

100.00%

Abstract:

Emrouznejad et al. (2010) proposed a Semi-Oriented Radial Measure (SORM) model for assessing the efficiency of Decision Making Units (DMUs) by Data Envelopment Analysis (DEA) with negative data. This paper provides a necessary and sufficient condition for the boundedness of the input- and output-oriented SORM models.

Relevance:

100.00%

Abstract:

Conventional DEA models assume deterministic, precise and non-negative input and output observations. However, real applications may be characterized by observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that addressed the two problems (interval data and negative data) separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the upper and lower bounds of the intervals may have different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes: strictly efficient, weakly efficient and inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.
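
A minimal sketch of the three-class labelling step, assuming each DMU carries an interval efficiency score [eLower, eUpper] with efficient scores normalised to 1 (the method and variable names are illustrative, not the paper's notation):

```java
// Classify a DMU from its interval efficiency score [eLower, eUpper].
// Strictly efficient: efficient even at its worst-case bound;
// weakly efficient: efficient only at its best-case bound;
// inefficient: below 1 even in the best case.
static String classify(double eLower, double eUpper) {
    if (eLower >= 1.0) return "strictly efficient";
    if (eUpper >= 1.0) return "weakly efficient";
    return "inefficient";
}
```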

Relevance:

100.00%

Abstract:

We develop a method for fabricating very small silica microbubbles having a micrometer-order wall thickness and demonstrate the first optical microbubble resonator. Our method is based on blowing a microbubble using stable radiative CO2 laser heating rather than unstable convective heating in a flame or furnace. Microbubbles are created along a microcapillary and are naturally open to the input and output microfluidic or gas channels. The demonstrated microbubble resonator has a diameter of 370 µm, a wall thickness of 2 µm and a Q factor exceeding 10. © 2010 Optical Society of America.

Relevance:

100.00%

Abstract:

The entorhinal cortex (EC) controls hippocampal input and output, playing major roles in memory and spatial navigation. Different layers of the EC subserve different functions and a number of studies have compared the properties of neurones across layers. We have studied synaptic inhibition and excitation in EC neurones, and we have previously compared spontaneous synaptic release of glutamate and GABA using patch-clamp recordings of synaptic currents in principal neurones of layers II (L2) and V (L5). Here, we add comparative studies in layer III (L3). Such studies essentially look at neuronal activity from a presynaptic viewpoint. To correlate this with the postsynaptic consequences of spontaneous transmitter release, we have determined global postsynaptic conductances mediated by the two transmitters, using a method to estimate conductances from membrane potential fluctuations. We have previously presented some of these data for L3 and now extend the comparison to L2 and L5. Inhibition dominates excitation in all layers, but the ratio follows a clear rank order (highest to lowest) of L2 > L3 > L5. The variance of the background conductances was markedly higher for excitation and inhibition in L2 compared with L3 or L5. We also show that induction of synchronized network epileptiform activity by blockade of GABA inhibition reveals a relative reluctance of L2 to participate in such activity. This was associated with maintenance of a dominant background inhibition in L2, whereas in L3 and L5 the absolute level of inhibition fell below that of excitation, coincident with the appearance of synchronized discharges. Further experiments identified potential roles in the residual inhibition in L2 for competition between ambient GABA and bicuculline at the GABAA receptor and for strychnine-sensitive glycine receptors. We discuss our results in terms of the control of excitability in neuronal subpopulations of EC neurones and what these may suggest for their functional roles. © 2014 Greenhill et al.

Relevance:

100.00%

Abstract:

Desktop user interface design originates from the fact that users are stationary and can devote all of their visual resource to the application with which they are interacting. In contrast, users of mobile and wearable devices are typically in motion whilst using their device, which means that they cannot devote all, or any, of their visual resource to interaction with the mobile application; it must remain with the primary task, often for safety reasons. Additionally, such devices have limited screen real estate, and traditional input and output capabilities are generally restricted. Consequently, if we are to develop effective applications for use on mobile or wearable technology, we must embrace a paradigm shift with respect to the interaction techniques we employ for communication with such devices. This paper discusses why it is necessary to embrace a paradigm shift in terms of interaction techniques for mobile technology and presents two novel multimodal interaction techniques, which are effective alternatives to traditional, visual-centric interface designs on mobile devices, as empirical examples of the potential to achieve this shift.

Relevance:

100.00%

Abstract:

Supply chain formation (SCF) is the process of determining the set of participants and exchange relationships within a network with the goal of setting up a supply chain that meets some predefined social objective. Many proposed solutions for the SCF problem rely on centralized computation, which presents a single point of failure and can also lead to problems with scalability. Decentralized techniques that aid supply chain emergence offer a more robust and scalable approach by allowing participants to deliberate among themselves about the structure of the optimal supply chain. Current decentralized supply chain emergence mechanisms are able to deal only with simplistic scenarios in which goods are produced and traded in single units, without taking into account production capacities or input-output ratios other than 1:1. In this paper, we demonstrate the performance of a graphical inference technique, max-sum loopy belief propagation (LBP), in a complex multiunit supply chain emergence scenario which models additional constraints such as production capacities and input-to-output ratios. We also provide results demonstrating the performance of LBP in dynamic environments, where the properties and composition of participants are altered as the algorithm is running. Our results suggest that max-sum LBP produces consistently strong solutions on a variety of network structures in a multiunit problem scenario, and that performance tends not to be affected by on-the-fly changes to the properties or composition of participants.
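
To make the message-passing step concrete, here is a minimal max-sum factor-to-variable update for a single pairwise factor, written as a generic sketch rather than the authors' implementation; payoff and msgFromB are hypothetical names for the factor table and the incoming variable-to-factor message.

```java
// Max-sum update: message from a pairwise factor f(a, b) to variable a.
// payoff[i][j] is the factor value for a = i, b = j;
// msgFromB[j] is the incoming message from variable b.
static double[] factorToVariableA(double[][] payoff, double[] msgFromB) {
    double[] out = new double[payoff.length];
    for (int i = 0; i < payoff.length; i++) {
        double best = Double.NEGATIVE_INFINITY;
        for (int j = 0; j < msgFromB.length; j++) {
            // maximise the factor value plus the message over the eliminated variable
            best = Math.max(best, payoff[i][j] + msgFromB[j]);
        }
        out[i] = best;
    }
    return out;
}
```

In the supply chain setting, the factor would encode a participant's constraints (capacities, input-to-output ratios) and the variable domains the possible allocations of goods.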

Relevance:

100.00%

Abstract:

The entorhinal cortex (EC) is a key brain area controlling both hippocampal input and output via neurones in layer II and layer V, respectively. It is also a pivotal area in the generation and propagation of epilepsies involving the temporal lobe. We have previously shown that, within the network of the EC, neurones in layer V are subject to powerful synaptic excitation but weak inhibition, whereas the reverse is true in layer II. The deep layers are also highly susceptible to acutely provoked epileptogenesis. Considerable evidence now points to a role of spontaneous background synaptic activity in the control of neuronal, and hence network, excitability. In the present article we describe the results of studies in which we compared background release of the excitatory transmitter, glutamate, and the inhibitory transmitter, GABA, in the two layers, the role of this background release in the balance of excitability, and its control by auto- and heteroreceptors on presynaptic terminals. © The Physiological Society 2004.

Relevance:

100.00%

Abstract:

The energy-balancing capability of cooperative communication is utilized to solve the energy hole problem in wireless sensor networks. We first propose a cooperative transmission strategy in which intermediate nodes participate in two cooperative multi-input single-output (MISO) transmissions, with the node at the previous hop and with a selected node at the next hop, respectively. Then, we study the power allocation optimization problems of the cooperative transmission strategy by examining two different approaches: network lifetime maximization (NLM) and energy consumption minimization (ECM). For NLM, the numerical optimal solution is derived, and a search algorithm for a suboptimal solution is provided when the optimal solution does not exist. For ECM, a closed-form solution is obtained. Numerical and simulation results show that both approaches achieve a much longer network lifetime than single-input single-output (SISO) transmission strategies and other cooperative communication schemes. Moreover, NLM, which features energy balancing, outperforms ECM, which focuses on energy efficiency, in terms of network lifetime.
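
As a generic illustration of the NLM objective (a common definition of network lifetime, not the paper's exact model), lifetime is the time until the first node exhausts its energy, so the power allocation problem takes the form:

\[
\max_{p \in \mathcal{P}} \; \min_{i} \; \frac{E_i}{P_i(p)},
\]

where E_i is node i's initial energy, P_i(p) its average power consumption under allocation p, and \mathcal{P} the feasible set of allocations; ECM instead minimises the total consumption \sum_i P_i(p).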

Relevance:

100.00%

Abstract:

This paper focuses on a parallel Java implementation of a processor defined in a Network of Evolutionary Processors. The processor description is based on JDom, which provides a complete, Java-based solution for accessing, manipulating and outputting XML data from Java code. Communication among different processors, required to obtain a fully functional simulation of a Network of Evolutionary Processors, will be treated in future work. A thread-safe model of the processor performs all parallel operations, such as rule application and filtering. Non-deterministic behavior of the processor is achieved with one thread for each rule and for each filter (input and output). Different results of a processor evolution are shown.
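
A minimal sketch of the thread-per-rule idea described above, assuming a hypothetical Rule interface and a shared word pool (these names are illustrative, not the paper's classes):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// One worker thread per evolutionary rule; thread scheduling provides the
// non-deterministic interleaving of rule applications mentioned above.
interface Rule { String apply(String word); }

class RuleWorker implements Runnable {
    private final Rule rule;
    private final Set<String> pool; // words currently held by the processor

    RuleWorker(Rule rule, Set<String> pool) { this.rule = rule; this.pool = pool; }

    public void run() {
        for (String w : pool) {               // weakly consistent concurrent iteration
            String rewritten = rule.apply(w); // apply this thread's rule
            if (rewritten != null) pool.add(rewritten);
        }
    }
}

// Usage: Set<String> pool = ConcurrentHashMap.newKeySet();
//        new Thread(new RuleWorker(w -> w.replaceFirst("a", "b"), pool)).start();
```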

Relevance:

100.00%

Abstract:

Clogging is the main operational problem associated with horizontal subsurface flow constructed wetlands (HSSF CWs). The measurement of saturated hydraulic conductivity has proven to be a suitable technique for assessing clogging within HSSF CWs. The vertical and horizontal distribution of hydraulic conductivity was assessed in two full-scale HSSF CWs by using two different in situ permeameter methods (the falling head (FH) and constant head (CH) methods). Horizontal hydraulic conductivity profiles showed that the two methods are related by a power function (FH = CH^0.7821, r^2 = 0.76) within the recorded range of hydraulic conductivities (0-70 m/day). However, the FH method provided lower values of hydraulic conductivity than the CH method (one to three times lower). Despite discrepancies between the magnitudes of the reported readings, the relative distribution of clogging obtained with both methods was similar. Therefore, both methods are useful when exploring the general distribution of clogging and, especially, when assessing clogged areas originating from preferential flow paths within full-scale HSSF CWs. Discrepancies between the methods (in both magnitude and pattern) arose from the vertical hydraulic conductivity profiles under highly clogged conditions. It is believed this can be attributed to procedural differences between the methods, such as the method of permeameter insertion (twisting versus hammering). Results from both methods suggest that clogging develops along the shortest distance between the water input and output. The results also show that the design and maintenance of inlet distributors and outlet collectors appear to have a great influence on the pattern of clogging, and hence on the asset lifetime of HSSF CWs. © Springer Science+Business Media B.V. 2011.
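
To illustrate the reported fit with an example calculation (the numbers below are illustrative, not measurements from the study): a constant-head reading of CH = 50 m/day corresponds to

\[
\mathrm{FH} = \mathrm{CH}^{0.7821} = 50^{0.7821} \approx 21 \ \text{m/day},
\]

i.e. roughly 2.3 times lower, consistent with the reported "one to three times lower" range.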

Relevance:

100.00%

Abstract:

The cybernetic idea presented here is formed on the basis of neurophysiological, neuropsychological and neurocybernetic data, together with plausible hypotheses, including the author's own, that fill the gaps in these data. Attention is focused first on the general principles of memory organization in the brain and on the processes taking part in it that realize psychical functions such as the perception and identification of input information about patterns, as well as problem solving specified by the input and output conditions. The realization of the second, essentially cogitative, function is discussed in terms of figurative and lingual thinking at the levels of intuition and understanding. The rationale for, and the principles of, a bionic approach to the creation of appropriate artificial intelligence tools are proposed.

Relevance:

100.00%

Abstract:

We provide a theoretical explanation of the results on the intensity distributions and correlation functions obtained from a random-beam speckle field in nonlinear bulk waveguides reported in the recent publication by Bromberg et al. [Nat. Photonics 4, 721 (2010)]. We study both the focusing and defocusing cases and, in the limit of small speckle size (a short-correlated disordered beam), provide analytical asymptotes for the intensity probability distributions at the output facet. Additionally, we provide a simple relation between the speckle sizes at the input and output of a focusing nonlinear waveguide. The results are of practical significance for nonlinear Hanbury Brown and Twiss interferometry in both optical waveguides and Bose-Einstein condensates. © 2012 American Physical Society.

Relevance:

100.00%

Abstract:

Data Envelopment Analysis (DEA) is a powerful analytical technique for measuring the relative efficiency of alternatives based on their inputs and outputs. The alternatives can be countries that attempt to enhance their productivity and environmental efficiencies concurrently. However, when desirable outputs such as productivity increase, undesirable outputs (e.g. carbon emissions) increase as well, making the performance evaluation questionable. In addition, environmental efficiency has traditionally been measured with crisp input and output data (desirable and undesirable). However, the input and output data in real-world evaluation problems, such as CO2 emissions, are often imprecise or ambiguous. This paper proposes a DEA-based framework in which the input and output data are characterized by symmetrical and asymmetrical fuzzy numbers. The proposed method allows the environmental evaluation to be carried out at different levels of certainty. The validity of the proposed model has been tested and its usefulness is illustrated with two numerical examples. An application to energy efficiency among 23 European Union (EU) member countries is further presented to show the applicability and efficacy of the proposed approach under asymmetric fuzzy numbers.
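
For reference, a triangular fuzzy observation x̃ = (a, b, c) with a ≤ b ≤ c (symmetric when b - a = c - b) typically enters such models through its α-cut; this is the standard construction rather than this paper's specific notation:

\[
[\tilde{x}]_\alpha = \big[\, a + \alpha (b - a), \; c - \alpha (c - b) \,\big], \qquad \alpha \in [0, 1],
\]

so that increasing α narrows the interval and corresponds to evaluating the efficiencies at a higher level of certainty.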