950 results for fuzzy set models


Relevance: 90.00%

Publisher:

Abstract:

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with imprecise and ambiguous data in DEA. This chapter provides a taxonomy and review of fuzzy DEA (FDEA) methods. We present a classification scheme with six categories, namely the tolerance approach, the α-level based approach, the fuzzy ranking approach, the possibility approach, the fuzzy arithmetic approach, and the fuzzy random/type-2 fuzzy set approach. We discuss each category and group the FDEA papers published in the literature over the past 30 years. © 2014 Springer-Verlag Berlin Heidelberg.
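
To illustrate the α-level based approach mentioned above: fuzzy inputs and outputs are reduced to intervals at a chosen α-cut, and crisp DEA is then solved over those intervals to bound each DMU's efficiency. The sketch below shows only the α-cut step for triangular fuzzy data; the data and the function name `alpha_cut` are hypothetical, and it is not a full FDEA formulation.

```python
# Illustrative alpha-cut of a triangular fuzzy number (a, m, b):
# at level alpha it collapses to the interval
# [a + alpha*(m - a), b - alpha*(b - m)].
def alpha_cut(tri, alpha):
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Hypothetical fuzzy input data for three DMUs (triangular numbers).
fuzzy_inputs = {"DMU1": (2.0, 3.0, 4.5), "DMU2": (1.5, 2.0, 2.5), "DMU3": (3.0, 4.0, 6.0)}

for dmu, tri in fuzzy_inputs.items():
    lo, hi = alpha_cut(tri, alpha=0.5)
    # In an alpha-level FDEA model, crisp DEA would now be solved with
    # inputs ranging over [lo, hi] to obtain efficiency bounds.
    print(f"{dmu}: input interval at alpha=0.5 -> [{lo:.2f}, {hi:.2f}]")
```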

Relevance: 90.00%

Publisher:

Abstract:

Descriptions of vegetation communities are often based on vague semantic terms describing species presence and dominance. For this reason, some researchers advocate the use of fuzzy sets in the statistical classification of plant species data into communities. In this study, spatially referenced vegetation abundance values collected from Greek phrygana were analysed by ordination (DECORANA), and classified on the resulting axes using fuzzy c-means to yield a point data-set representing local memberships in characteristic plant communities. The fuzzy clusters matched vegetation communities noted in the field, which tended to grade into one another, rather than occupying discrete patches. The fuzzy set representation of the community exploited the strengths of detrended correspondence analysis while retaining richer information than a TWINSPAN classification of the same data. Thus, in the absence of phytosociological benchmarks, meaningful and manageable habitat information could be derived from complex, multivariate species data. We also analysed the influence of the reliability of different surveyors' field observations by multiple sampling at a selected sample location. We show that the impact of surveyor error was more severe in the Boolean than the fuzzy classification. © 2007 Springer.
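
As a rough sketch of the classification step, the following minimal fuzzy c-means implementation (fuzziness exponent m = 2, synthetic two-dimensional scores standing in for ordination axes; not the study's DECORANA/phrygana data) returns the graded memberships that make the fuzzy representation richer than a crisp partition.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centres and a membership
    matrix U of shape (n_samples, c) whose rows sum to 1."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # random initial fuzzy partition
    p = 1.0 / (m - 1.0)
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        # squared distances of every sample to every centre
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2) + 1e-12
        # standard FCM membership update
        U = 1.0 / (d2 ** p * (1.0 / d2 ** p).sum(axis=1, keepdims=True))
    return centres, U

# Synthetic two-dimensional scores standing in for two ordination axes.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.3, size=(30, 2)) for loc in ([0, 0], [2, 2], [0, 2])])
centres, U = fuzzy_c_means(X, c=3)
print("memberships of the first sample:", np.round(U[0], 2))
```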

Relevance: 90.00%

Publisher:

Abstract:

A basic-matrices method is proposed for the analysis of the Leontief model (LM) when some of its components are given only imprecisely (fuzzily). The LM can be construed as a forecasting task for product expenses and output on the basis of known statistical information, with several elements of the technological matrix, the constraint vector, and the variable bounds given imprecisely. Elements of the technological matrix and the right-hand sides of the LM constraint vector may also be functions of some arguments; in this case a dynamic analogue of the task arises. An essential complication of the LM lies in including variable constraints and a criterion function in it.
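
For orientation, the crisp Leontief balance is x = Ax + d, solved as x = (I − A)⁻¹ d. The sketch below, with hypothetical numbers, shows this crisp solution and a naive interval (α-cut-style) bracketing when one technological coefficient is imprecise; it illustrates the setting only, not the basic-matrices method itself.

```python
import numpy as np

# Crisp Leontief model: x = A x + d  =>  x = (I - A)^{-1} d.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])          # technological (input-output) matrix
d = np.array([100.0, 50.0])         # final demand

x = np.linalg.solve(np.eye(2) - A, d)
print("crisp gross output:", np.round(x, 1))

# If a coefficient is only known imprecisely, e.g. A[0, 1] in [0.25, 0.35]
# (a rough interval stand-in for a fuzzily given value), the output can be
# bracketed by re-solving the model at the interval endpoints.
for a01 in (0.25, 0.35):
    A_mod = A.copy()
    A_mod[0, 1] = a01
    print(f"A[0,1]={a01}:", np.round(np.linalg.solve(np.eye(2) - A_mod, d), 1))
```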

Relevance: 90.00%

Publisher:

Abstract:

* This work is partially supported by CICYT (Spain) under project TIN 2005-08943-C02-001 and by UPM-CAM (Spain) under project R05/11240.

Relevance: 90.00%

Publisher:

Abstract:

For inference purposes in both classical and fuzzy logic, neither the information itself should be contradictory, nor should any of the items of available information contradict each other. To avoid these problems in fuzzy logic, a study of contradiction was initiated by Trillas et al. in [5] and [6]. They introduced the concepts of a self-contradictory fuzzy set and of contradiction between two fuzzy sets. Moreover, the need to study not only contradiction but also its degree is pointed out in [1] and [2], where some measures for this purpose are suggested. Nevertheless, contradiction could be measured in other ways. This paper focuses on the study of contradiction between two fuzzy sets, dealing with the problem from a geometrical point of view that allows us to find new ways to measure the degree of contradiction. To do this, the two fuzzy sets are interpreted as a subset of the unit square, and the so-called contradiction region is determined. In particular, we tackle the case in which both sets trace a curve in [0,1]². This new geometrical approach allows us to obtain different functions that measure contradiction through distances. Moreover, some properties of these contradiction measure functions are established and, in some particular cases, the relations among the different functions are obtained.
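
As a toy illustration of the distance-based idea (not one of the paper's measures), the points (μ_A(x), μ_B(x)) can be checked against the region u + v ≤ 1 of the unit square, which corresponds to A being included in the negation of B, and a degree can be derived from the distance of those points to the boundary u + v = 1. The membership functions below are hypothetical.

```python
import numpy as np

# Membership values of two fuzzy sets A and B over a sampled universe
# (hypothetical data; in the paper the pairs trace a curve in [0,1]^2).
x = np.linspace(0.0, 1.0, 101)
mu_A = 0.6 * np.exp(-8 * (x - 0.2) ** 2)
mu_B = 0.3 * np.exp(-8 * (x - 0.9) ** 2)

# Trillas-style contradiction: A and B are contradictory when
# mu_A(x) + mu_B(x) <= 1 for every x, i.e. every point (mu_A, mu_B) lies
# below the line u + v = 1 in the unit square.
# One simple distance-based degree: the normalised distance from the point
# set to that boundary, which drops to 0 as soon as some point leaves the
# contradiction region.
signed_dist = (1.0 - mu_A - mu_B) / np.sqrt(2.0)
degree = max(0.0, signed_dist.min()) / (1.0 / np.sqrt(2.0))
print(f"illustrative contradiction degree: {degree:.3f}")
```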

Relevance: 90.00%

Publisher:

Abstract:

The design of reverse logistics networks has now emerged as a major issue for manufacturers, not only in developed countries where legislation and societal pressures are strong, but also in developing countries where the adoption of reverse logistics practices may offer a competitive advantage. This paper presents a new model for partner selection for reverse logistics centres in green supply chains. The model offers three advantages. Firstly, it enables economic, environmental, and social factors to be considered simultaneously. Secondly, by integrating fuzzy set theory and artificial immune optimization technology, it enables both quantitative and qualitative criteria to be considered throughout the whole decision-making process. Thirdly, it extends the flat criteria structure used for partner selection evaluation for reverse logistics centres to a more suitable hierarchical structure. The applicability of the model is demonstrated by means of an empirical application based on data from a Chinese electronic equipment and instruments manufacturing company.
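
A minimal sketch of the fuzzy-set side only, assuming triangular linguistic ratings, weighted-average aggregation, and centroid defuzzification; the artificial immune optimization and the paper's hierarchical criteria structure are not reproduced, and all names and values are hypothetical.

```python
# Hypothetical linguistic scale mapped to triangular fuzzy numbers (a, m, b).
SCALE = {"poor": (0.0, 0.1, 0.3), "fair": (0.2, 0.5, 0.8), "good": (0.7, 0.9, 1.0)}

# Ratings of one candidate reverse-logistics partner on three criteria
# (economic, environmental, social) and the criteria weights.
ratings = {"economic": "good", "environmental": "fair", "social": "good"}
weights = {"economic": 0.5, "environmental": 0.3, "social": 0.2}

# A weighted average of triangular numbers is again triangular (component-wise).
agg = [sum(weights[c] * SCALE[ratings[c]][i] for c in ratings) for i in range(3)]

# Centroid defuzzification of a triangular number: (a + m + b) / 3.
score = sum(agg) / 3.0
print("aggregate fuzzy score:", [round(v, 3) for v in agg], "-> crisp:", round(score, 3))
```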

Relevance: 90.00%

Publisher:

Abstract:

Intelligent systems are now inherent to society, supporting a synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and the quality of software functioning may invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis for human-centered applications, and their interpretability is one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic and life-long system operation. As most software registers its inner events by means of logs, log analysis is an approach to keeping systems operational. Logs are characterized as Big Data assembled in high-volume streams, being unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods to provide maintenance solutions applied to anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses. LP provides a deeper semantic interpretation of the anomalous occurrences. The solutions evolve over time and are general-purpose, being highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed to approach automatic parsing of system logs. All the methods perform recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving intelligent systems literature, the proposed method, eLP, is able to process streams of words and sentences. In terms of AD accuracy, FBeM achieved (85.64 ± 3.69)%, eGNN reached (96.17 ± 0.78)%, eGFC obtained (92.48 ± 1.21)%, and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP also generates a log grammar and presents a higher level of model interpretability.
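
A generic sketch of the recursive create-or-update step shared by granular classifiers of this kind (not FBeM, eGNN, eGFC, or eLP themselves): a sample that falls inside an existing granule of its class updates that granule incrementally, otherwise a new granule is created. Granule geometry, widths, and data are hypothetical.

```python
import numpy as np

class Granule:
    def __init__(self, x, label, width=0.3):
        self.centre = np.asarray(x, float)
        self.width = width          # half-width of the (hyper-)box granule
        self.label = label
        self.n = 1

    def covers(self, x):
        return np.all(np.abs(np.asarray(x) - self.centre) <= self.width)

    def update(self, x):
        # incremental (recursive) mean update of the granule centre
        self.n += 1
        self.centre += (np.asarray(x) - self.centre) / self.n

def learn_stream(stream, width=0.3):
    """Single-pass granular learning: create or update granules per sample."""
    granules = []
    for x, label in stream:
        hits = [g for g in granules if g.label == label and g.covers(x)]
        if hits:
            hits[0].update(x)                            # refine an existing granule
        else:
            granules.append(Granule(x, label, width))    # create a new granule
    return granules

# Hypothetical 2-D stream with "normal" and "anomaly" labels.
rng = np.random.default_rng(0)
stream = [((rng.normal(0, 0.1), rng.normal(0, 0.1)), "normal") for _ in range(50)]
stream += [((1.0, 1.0), "anomaly")]
print(len(learn_stream(stream)), "granules formed")
```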

Relevance: 80.00%

Publisher:

Abstract:

Bulk density of undisturbed soil samples can be measured using computed tomography (CT) techniques with a spatial resolution of about 1 mm. However, this technique may not be readily accessible. On the other hand, x-ray radiographs have only been considered as qualitative images to describe morphological features. A calibration procedure was set up to generate two-dimensional, high-resolution bulk density images from x-ray radiographs made with a conventional x-ray diffraction apparatus. Test bricks were made to assess the accuracy of the method. Slices of impregnated soil samples were made using hardsetting seedbeds that had been gamma scanned at 5-mm depth increments in a previous study. The calibration procedure involved three stages: (i) calibration of the image grey levels in terms of glass thickness using a staircase made from glass cover slips, (ii) measurement of the ratio between the soil and resin mass attenuation coefficients and the glass mass attenuation coefficient, using compacted bricks of known thickness and bulk density, and (iii) image correction accounting for the heterogeneity of the irradiation field. The procedure was simple and rapid, and the equipment was easily accessible. The accuracy of the bulk density determination was good (mean relative error 0.015). The bulk density images showed good spatial resolution, so that many structural details could be observed. The depth functions were consistent with both the global shrinkage and the gamma probe data previously obtained. The suggested method could easily be applied to the new fuzzy set approach to soil structure, which requires the generation of bulk density images. It would also be an invaluable tool for studies requiring high-resolution bulk density measurement, such as studies of soil surface crusts.
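
A rough sketch of how such a calibration could be chained together, assuming Beer-Lambert attenuation so that equal grey levels imply equal products of mass attenuation coefficient, density, and thickness: grey level → equivalent glass thickness (stage i) → bulk density via the attenuation-coefficient ratio (stage ii), with the field correction of stage (iii) assumed already applied. All numbers and names are hypothetical, not the paper's calibration data.

```python
import numpy as np

# Stage (i): calibration of grey level vs. glass thickness, here a simple
# linear fit to hypothetical staircase readings (grey level, thickness in cm).
grey = np.array([200.0, 170.0, 140.0, 110.0, 80.0])
thick = np.array([0.00, 0.05, 0.10, 0.15, 0.20])
slope, intercept = np.polyfit(grey, thick, 1)

def equivalent_glass_thickness(g):
    return slope * g + intercept           # cm of glass giving grey level g

# Stage (ii): ratio of the (soil + resin) to glass mass attenuation
# coefficients, glass density, and slice thickness -- illustrative values.
ATTEN_RATIO = 0.85       # (mu/rho)_soil+resin / (mu/rho)_glass
RHO_GLASS = 2.5          # g cm^-3
SLICE_THICKNESS = 0.3    # cm, thickness of the impregnated soil slice

def bulk_density(grey_level):
    """Convert an (already field-corrected) grey level to bulk density,
    assuming equal attenuation: mu_s * rho_s * t_s = mu_g * rho_g * t_g."""
    t_glass = equivalent_glass_thickness(grey_level)
    return RHO_GLASS * t_glass / (ATTEN_RATIO * SLICE_THICKNESS)

print(f"bulk density at grey level 120: {bulk_density(120.0):.2f} g/cm^3")
```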

Relevance: 80.00%

Publisher:

Abstract:

Intensive use of Distributed Generation (DG) represents a change in the paradigm of power system operation, making small-scale energy generation and storage decisions relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, should be assured. This paper presents a new approach to solving the economic dispatch problem in smart grids. The proposed resource management methodology involves two stages. The first uses fuzzy set theory to define the forecast ranges of the natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management, and demand response.
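
A minimal sketch of the two-stage idea: stage one turns triangular fuzzy forecasts of renewable generation and load into α-cut ranges, and stage two covers a pessimistic net load with a simple merit-order dispatch standing in for the heuristic optimization of the paper. Units, costs, and names are hypothetical.

```python
def alpha_cut(tri, alpha):
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Stage 1: triangular fuzzy forecasts (MW) and their alpha-cut ranges.
wind_fcst = (10.0, 18.0, 24.0)
load_fcst = (70.0, 80.0, 95.0)
wind_lo, wind_hi = alpha_cut(wind_fcst, alpha=0.6)
load_lo, load_hi = alpha_cut(load_fcst, alpha=0.6)

# Pessimistic net load: highest load minus lowest wind availability.
net_load = load_hi - wind_lo

# Stage 2: greedy merit-order dispatch of dispatchable units (name, cost, capacity).
units = [("storage", 30.0, 12.0), ("gas", 55.0, 40.0), ("diesel", 90.0, 40.0)]
remaining, schedule = net_load, {}
for name, cost, cap in sorted(units, key=lambda u: u[1]):
    p = min(cap, max(0.0, remaining))
    schedule[name], remaining = p, remaining - p
print("net load to cover:", round(net_load, 1), "MW ->", schedule)
```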

Relevance: 80.00%

Publisher:

Abstract:

Most distributed generation and smart grid research is dedicated to the study of network operation parameters, reliability, and related issues. However, much of this research uses traditional test systems, such as the IEEE test systems. This work proposes a voltage magnitude study in the presence of fault conditions, considering the realistic specifications found in countries like Brazil. The methodology considers a hybrid method of fuzzy sets and Monte Carlo simulation based on fuzzy-probabilistic models, and a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study on a real 12-bus sub-transmission network.
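
A structural sketch of such a hybrid, assuming the network evaluation itself is replaced by a placeholder: each Monte Carlo trial samples line outages from failure probabilities, while the load is a triangular fuzzy number evaluated at its α-cut bounds. The function `voltage_pu` merely stands in for the power-flow and remedial-action step; probabilities and parameters are hypothetical.

```python
import random

def alpha_cut(tri, alpha):
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Hypothetical line outage probabilities and a fuzzy load (MW).
line_fail_prob = {"L1": 0.02, "L2": 0.05, "L3": 0.01}
fuzzy_load = (8.0, 10.0, 13.0)

def voltage_pu(outages, load):
    """Placeholder for the power-flow / remedial-action evaluation:
    voltage sags with the load and with each line out of service."""
    return 1.0 - 0.004 * load - 0.03 * len(outages)

random.seed(0)
worst = []
for _ in range(2000):                                   # Monte Carlo trials
    out = [l for l, p in line_fail_prob.items() if random.random() < p]
    lo, hi = alpha_cut(fuzzy_load, alpha=0.5)           # fuzzy load interval
    worst.append(min(voltage_pu(out, lo), voltage_pu(out, hi)))

violations = sum(v < 0.95 for v in worst) / len(worst)
print(f"share of trials with voltage below 0.95 pu: {violations:.3f}")
```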

Relevance: 80.00%

Publisher:

Abstract:

Most distributed generation and smart grid research is dedicated to studies of network operation parameters, reliability, and related issues. However, many of these works use traditional test systems, for instance the IEEE test systems. This paper proposes voltage magnitude and reliability studies in the presence of fault conditions, considering realistic conditions found in countries like Brazil. The methodology considers a hybrid method of fuzzy sets and Monte Carlo simulation based on fuzzy-probabilistic models, and a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study that considers a real 12-bus sub-transmission network.

Relevance: 80.00%

Publisher:

Abstract:

In today's highly competitive environment, an effective supplier selection process is very important for the success of any organization [33]. This dissertation seeks to determine which criteria and methods are most widely used in the supplier selection problem, thereby supporting entities that intend to carry out supplier selection more effectively. To reach the proposed objectives, an analysis was carried out of articles reviewing the literature on methods and criteria from 1985 to 2012. From the data obtained in these reviews, it was possible to identify the three main methods used over the years, namely DEA, AHP, and fuzzy set theory, as well as the main criteria used in supplier selection. This dissertation presents an overview of decision making and of the methods used in multi-criteria decision making. It addresses the supplier selection problem, its selection process, and the literature reviews of the selection methods and criteria used in recent years. Finally, it presents the contribution to supplier selection of the study carried out during the development of this dissertation, presenting and explaining the main supplier selection methods as well as the criteria used.

Relevance: 80.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Electrical and Computer Engineering.

Relevance: 80.00%

Publisher:

Abstract:

In recent years, the fight against money laundering has emerged as a key issue of financial regulation. The Wolfsberg Group is an important multistakeholder agreement establishing corporate responsibility (CR) principles against money laundering in a domain where international coordination remains otherwise difficult. The fact that 10 out of the 25 top private banking institutions joined this initiative opens up an interesting puzzle concerning the conditions for the participation of key industry players in the Wolfsberg Group. The article presents a fuzzy-set analysis of seven hypotheses based on firm-level organizational factors, the macro-institutional context, and the regulatory framework. Results from the analysis of these 25 financial institutions show that public ownership of the bank and the existence of a code of conduct are necessary conditions for participation in the Wolfsberg Group, whereas factors related to the type of financial institution, combined with the existence of a black list, are sufficient for explaining participation.
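
The necessary-condition test in fuzzy-set QCA is usually scored by the consistency Σ min(x_i, y_i) / Σ y_i. A short sketch with invented membership scores (not the article's 25-bank data set):

```python
def necessity_consistency(condition, outcome):
    """fsQCA consistency of 'condition is necessary for outcome':
    sum(min(x, y)) / sum(y)."""
    num = sum(min(x, y) for x, y in zip(condition, outcome))
    return num / sum(outcome)

# Hypothetical fuzzy membership scores for a handful of banks:
# condition = "has a code of conduct", outcome = "participates in Wolfsberg".
code_of_conduct = [1.0, 0.8, 0.3, 0.2, 0.9, 0.1]
participation   = [0.9, 0.7, 0.4, 0.1, 0.8, 0.0]

print(f"necessity consistency: {necessity_consistency(code_of_conduct, participation):.2f}")
```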

Relevance: 80.00%

Publisher:

Abstract:

Executive Summary. The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broad scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation carried out to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on the ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics relative to those of realized returns from portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate the ones resulting from optimization only with respect to, for example, the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, that is, the sequence of expected shortfalls for a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose yields a portfolio return distribution that second-order stochastically dominates those obtained under virtually all of the individual performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
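
The absolute Lorenz curve check described above amounts to comparing, quantile by quantile, the average of the worst q-fraction of returns (a scaled expected-shortfall sequence): if one curve is never below the other, the first distribution second-order stochastically dominates. A minimal sketch with simulated, hypothetical returns:

```python
import numpy as np

def absolute_lorenz(returns, quantiles):
    """Cumulative mean of the worst q-fraction of returns for each q
    (a scaled expected-shortfall sequence used for SSD checks)."""
    r = np.sort(returns)
    n = len(r)
    return np.array([r[: max(1, int(np.ceil(q * n)))].mean() for q in quantiles])

def second_order_dominates(a, b, quantiles=np.linspace(0.01, 1.0, 100)):
    """Pointwise comparison: a SSD-dominates b if its curve is never below b's."""
    return np.all(absolute_lorenz(a, quantiles) >= absolute_lorenz(b, quantiles))

rng = np.random.default_rng(42)
aggregated = rng.normal(0.008, 0.03, 5000)   # hypothetical realised returns
single     = rng.normal(0.006, 0.05, 5000)

print("aggregated SSD-dominates single measure:",
      second_order_dominates(aggregated, single))
```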