33 results for Fuzzy set theory

in Aston University Research Archive


Relevance: 100.00%

Abstract:

This paper develops an integrated approach, combining quality function deployment (QFD), fuzzy set theory, and the analytic hierarchy process (AHP), to evaluate and select optimal third-party logistics service providers (3PLs). In the approach, multiple evaluating criteria are derived from the requirements of company stakeholders using a series of houses of quality (HOQ). The importance of the evaluating criteria is prioritized with respect to the degree of achieving the stakeholder requirements using fuzzy AHP. Based on the ranked criteria, alternative 3PLs are evaluated and compared with each other, again using fuzzy AHP, to make an optimal selection. The effectiveness of the proposed approach is demonstrated by applying it to a Hong Kong-based enterprise that supplies hard disk components. The proposed integrated approach outperforms existing approaches because the outsourcing strategy and 3PL selection are derived from the corporate/business strategy.
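As a concrete illustration of the fuzzy-AHP step, the sketch below derives crisp criterion weights from a triangular-fuzzy pairwise-comparison matrix. The three criteria, the comparison values and the geometric-mean/centroid variant of fuzzy AHP are illustrative assumptions; they are not taken from the paper's QFD/HOQ-to-AHP pipeline.

    import numpy as np

    # Each entry is a triangular fuzzy number (l, m, u) saying how much more
    # important the row criterion is than the column criterion (hypothetical values).
    M = np.array([
        [[1, 1, 1],       [2, 3, 4],      [4, 5, 6]],     # cost
        [[1/4, 1/3, 1/2], [1, 1, 1],      [1, 2, 3]],     # delivery reliability
        [[1/6, 1/5, 1/4], [1/3, 1/2, 1],  [1, 1, 1]],     # service flexibility
    ])

    # Fuzzy geometric mean of each row, component-wise over (l, m, u).
    g = np.prod(M, axis=1) ** (1.0 / M.shape[1])

    # Normalise so the fuzzy weights stay triangular: l by the sum of u's,
    # m by the sum of m's, u by the sum of l's.
    col = g.sum(axis=0)
    w_fuzzy = np.stack([g[:, 0] / col[2], g[:, 1] / col[1], g[:, 2] / col[0]], axis=1)

    # Centroid defuzzification and renormalisation give crisp criterion weights.
    w = w_fuzzy.mean(axis=1)
    w /= w.sum()
    print(np.round(w, 3))    # cost receives the largest weight in this toy matrix

The same weighting step can then be repeated over the alternative 3PLs under each criterion to produce an overall ranking.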

Relevance: 100.00%

Abstract:

Renewable energy project development is highly complex and success is by no means guaranteed. Decisions are often made with approximate or uncertain information, yet the methods currently employed by decision-makers do not necessarily accommodate this. Levelised energy cost (LEC) is one commonly applied measure used within the energy industry to assess the viability of potential projects and inform policy. The research proposes a method for accommodating such uncertainty by enhancing the traditional discounted LEC measure with fuzzy set theory. Furthermore, the research develops the fuzzy LEC (F-LEC) methodology to incorporate the cost of financing a project from debt and equity sources. Applied to an example bioenergy project, the research demonstrates the benefit of incorporating fuzziness for decision-making on project viability, optimal capital structure and key-variable sensitivity analysis. The proposed method contributes by incorporating uncertain and approximate information into the widely utilised LEC measure and by being applicable to a wide range of energy project viability decisions. © 2013 Elsevier Ltd. All rights reserved.
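The underlying discounted LEC calculation can be fuzzified with alpha-cuts, as the minimal sketch below shows. The project figures, the choice of fuzzy inputs (capital and O&M costs as triangular fuzzy numbers) and the crisp discount rate are assumptions for illustration; this is not the paper's F-LEC model.

    import numpy as np

    years  = 20
    rate   = 0.08                       # crisp discount rate
    energy = 8_000.0                    # MWh generated per year (crisp)
    capex  = (9.0e6, 10.0e6, 12.0e6)    # triangular fuzzy capital cost (l, m, u)
    opex   = (250e3, 300e3, 400e3)      # triangular fuzzy annual O&M cost (l, m, u)

    def tri_cut(tfn, alpha):
        """Alpha-cut [lower, upper] of a triangular fuzzy number (l, m, u)."""
        l, m, u = tfn
        return l + alpha * (m - l), u - alpha * (u - m)

    disc = np.array([(1 + rate) ** -t for t in range(1, years + 1)])
    den  = (energy * disc).sum()        # discounted energy output, crisp

    for alpha in (0.0, 0.5, 1.0):
        (c_lo, c_hi), (o_lo, o_hi) = tri_cut(capex, alpha), tri_cut(opex, alpha)
        # LEC is increasing in every cost term, so interval bounds follow directly.
        lec_lo = (c_lo + (o_lo * disc).sum()) / den
        lec_hi = (c_hi + (o_hi * disc).sum()) / den
        print(f"alpha={alpha:.1f}: LEC in [{lec_lo:.1f}, {lec_hi:.1f}] per MWh")

At alpha = 1 the interval collapses to the crisp base-case LEC; lower alpha-cuts widen the interval and expose the decision's sensitivity to the uncertain cost inputs.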

Relevance: 100.00%

Abstract:

Using fuzzy-set qualitative comparative analysis (fsQCA), this study investigates the conditions leading to a higher level of innovation. More specifically, the study explores the impact of inter-organisational knowledge transfer networks and organisations' internal capabilities on different types of innovation in Small to Medium size Enterprises (SMEs) in the high-tech sector. A survey instrument was used to collect data from a sample of UK SMEs. The findings show that although individual factors are important, there is no need for a company to perform well in all the areas. The fsQCA, which enables the examination of the impacts of different combinations of factors, reveals that there are a number of paths to achieve better incremental and radical innovation performance. Companies need to choose the one that is closest to their abilities and fits best with their resources.
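Two computations sit at the heart of any fsQCA study of this kind: calibrating raw measures into fuzzy-set memberships and scoring the consistency and coverage of a candidate configuration against the outcome. The sketch below illustrates both on synthetic survey-style data; the condition names, thresholds and data are assumptions, not the study's.

    import numpy as np

    def calibrate(x, full_non, crossover, full_in):
        """Direct calibration: map raw scores to [0, 1] membership via a logistic
        curve anchored at full non-membership, crossover and full membership."""
        scale = np.where(x >= crossover,
                         3.0 / (full_in - crossover),
                         3.0 / (crossover - full_non))
        return 1.0 / (1.0 + np.exp(-(x - crossover) * scale))

    rng = np.random.default_rng(0)
    internal_capability = calibrate(rng.uniform(1, 7, 50), 2.0, 4.0, 6.0)   # 1-7 survey scale
    network_ties        = calibrate(rng.uniform(1, 7, 50), 2.0, 4.0, 6.0)
    radical_innovation  = calibrate(rng.uniform(1, 7, 50), 2.0, 4.0, 6.0)

    # Configuration "strong internal capability AND strong network ties":
    # the fuzzy intersection is the pointwise minimum of the memberships.
    config = np.minimum(internal_capability, network_ties)

    # Consistency and coverage of "config is sufficient for radical innovation".
    overlap = np.minimum(config, radical_innovation).sum()
    consistency = overlap / config.sum()
    coverage    = overlap / radical_innovation.sum()
    print(f"consistency = {consistency:.2f}, coverage = {coverage:.2f}")

Configurations whose consistency clears a chosen threshold are then retained and minimised into the alternative "paths" to high innovation performance described above.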

Relevance: 100.00%

Abstract:

This paper deals with a very important issue in any knowledge engineering discipline: the accurate representation and modelling of real-life data and its processing by human experts. The work is applied to the GRiST Mental Health Risk Screening Tool for assessing risks associated with mental-health problems. The complexity of risk data and the wide variations in clinicians' expert opinions make it difficult to elicit representations of uncertainty that form an accurate and meaningful consensus. Doing so requires integrating each expert's estimation of a continuous distribution of uncertainty across a range of values. This paper describes an algorithm that generates a consensual distribution while simultaneously measuring the consistency of the inputs. Hence it provides a measure of confidence in each data item's risk contribution at the input stage and can help indicate the quality of subsequent risk predictions. © 2010 IEEE.
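The paper's consensus algorithm is not reproduced here, but the general idea of pooling several experts' uncertainty distributions and scoring their agreement can be sketched as follows. The pooling rule (a simple linear opinion pool) and the consistency measure (mean pairwise overlap) are illustrative assumptions only.

    import numpy as np

    grid = np.linspace(0.0, 1.0, 101)             # possible values of the risk item

    def expert_distribution(mode, spread):
        """One clinician's belief, encoded as a normalised triangular bump."""
        d = np.clip(1.0 - np.abs(grid - mode) / spread, 0.0, None)
        return d / d.sum()

    experts = [expert_distribution(0.30, 0.20),
               expert_distribution(0.40, 0.25),
               expert_distribution(0.35, 0.15)]

    consensus = np.mean(experts, axis=0)          # simple linear opinion pool

    # Consistency: average pairwise overlap (sum of pointwise minima), in [0, 1].
    pairs = [(i, j) for i in range(len(experts)) for j in range(i + 1, len(experts))]
    consistency = np.mean([np.minimum(experts[i], experts[j]).sum() for i, j in pairs])

    print(f"consensus mode = {grid[consensus.argmax()]:.2f}, consistency = {consistency:.2f}")

A low consistency score would flag, at the input stage, that the experts disagree and that risk predictions built on this item deserve less confidence.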

Relevance: 90.00%

Abstract:

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague. Many researchers have proposed various fuzzy methods for dealing with the imprecise and ambiguous data in DEA. This chapter provides a taxonomy and review of the fuzzy DEA (FDEA) methods. We present a classification scheme with six categories, namely the tolerance approach, the α-level based approach, the fuzzy ranking approach, the possibility approach, the fuzzy arithmetic approach, and the fuzzy random/type-2 fuzzy set approach. We discuss each category and group the FDEA papers published in the literature over the past 30 years accordingly. © 2014 Springer-Verlag Berlin Heidelberg.
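As an illustration of the α-level based category, the sketch below computes the optimistic (upper-bound) CCR efficiency of each DMU when, at a chosen α-cut, every fuzzy input and output has been reduced to an interval: the evaluated DMU uses its most favourable bounds and the other DMUs their least favourable ones. The data and this Kao-and-Liu-style construction are assumptions for illustration, not any single surveyed model.

    import numpy as np
    from scipy.optimize import linprog

    # Interval data at one alpha-cut: x[j] = (lower, upper) input, y[j] likewise.
    x = np.array([[(1.8, 2.2)], [(1.0, 1.2)], [(2.7, 3.3)]])    # one input,  3 DMUs
    y = np.array([[(0.9, 1.1)], [(1.9, 2.1)], [(1.1, 1.4)]])    # one output, 3 DMUs

    def optimistic_efficiency(o):
        n, m, s = x.shape[0], x.shape[1], y.shape[1]
        yo, xo = y[o, :, 1], x[o, :, 0]             # DMU o: upper outputs, lower inputs
        c = np.concatenate([-yo, np.zeros(m)])      # maximise u.yo  ->  minimise -u.yo
        A_eq = [np.concatenate([np.zeros(s), xo])]  # normalisation v.xo = 1
        A_ub, b_ub = [], []
        for j in range(n):
            yj = yo if j == o else y[j, :, 0]       # other DMUs: lower outputs,
            xj = xo if j == o else x[j, :, 1]       #             upper inputs
            A_ub.append(np.concatenate([yj, -xj]))  # u.yj - v.xj <= 0
            b_ub.append(0.0)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        return -res.fun

    for o in range(3):
        print(f"DMU {o}: optimistic efficiency at this alpha-cut = {optimistic_efficiency(o):.3f}")

Repeating the calculation with the bounds swapped gives the pessimistic efficiency, and sweeping α from 0 to 1 traces out the fuzzy efficiency score of each DMU.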

Relevance: 90.00%

Abstract:

Descriptions of vegetation communities are often based on vague semantic terms describing species presence and dominance. For this reason, some researchers advocate the use of fuzzy sets in the statistical classification of plant species data into communities. In this study, spatially referenced vegetation abundance values collected from Greek phrygana were analysed by ordination (DECORANA), and classified on the resulting axes using fuzzy c-means to yield a point data-set representing local memberships in characteristic plant communities. The fuzzy clusters matched vegetation communities noted in the field, which tended to grade into one another, rather than occupying discrete patches. The fuzzy set representation of the community exploited the strengths of detrended correspondence analysis while retaining richer information than a TWINSPAN classification of the same data. Thus, in the absence of phytosociological benchmarks, meaningful and manageable habitat information could be derived from complex, multivariate species data. We also analysed the influence of the reliability of different surveyors' field observations by multiple sampling at a selected sample location. We show that the impact of surveyor error was more severe in the Boolean than the fuzzy classification. © 2007 Springer.
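The fuzzy c-means step can be sketched in a few lines: points in a two-dimensional ordination space (standing in for the DECORANA axes) receive graded memberships in each cluster rather than a hard Boolean label. The synthetic data, the number of clusters c = 2 and the fuzzifier m = 2 are assumptions; the code is the textbook Bezdek algorithm, not the study's exact configuration.

    import numpy as np

    rng = np.random.default_rng(1)
    pts = np.vstack([rng.normal([0, 0], 0.4, (30, 2)),    # two loose "communities"
                     rng.normal([2, 1], 0.4, (30, 2))])

    def fuzzy_c_means(X, c=2, m=2.0, iters=100, tol=1e-6):
        U = rng.dirichlet(np.ones(c), size=len(X))        # random initial memberships
        for _ in range(iters):
            W = U ** m
            centres = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted cluster centres
            d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
            inv = d ** (-2.0 / (m - 1.0))
            U_new = inv / inv.sum(axis=1, keepdims=True)  # graded memberships, rows sum to 1
            if np.abs(U_new - U).max() < tol:
                return U_new, centres
            U = U_new
        return U, centres

    U, centres = fuzzy_c_means(pts)
    print(np.round(U[:3], 2))      # each site belongs partly to both communities
    print(centres.round(2))

The graded membership rows are what allow communities that grade into one another to be represented, in contrast to the all-or-nothing assignment of a Boolean (e.g. TWINSPAN-style) classification.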

Relevance: 80.00%

Abstract:

This paper introduces a new technique for the investigation of limited dependent variable models. It illustrates that variable precision rough set theory (VPRS), allied with a modern method of classification, or discretisation, of data, can outperform the more standard approaches employed in economics, such as a probit model. These approaches and certain inductive decision tree methods are compared (through a Monte Carlo simulation approach) in the analysis of the decisions reached by the UK Monopolies and Mergers Commission. We show that, particularly in small samples, the VPRS model can improve on more traditional models, both in-sample and, especially, in out-of-sample prediction. A similar improvement in out-of-sample prediction over the decision tree methods is also shown.
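The core VPRS construction, the β-lower approximation, is easy to state in code: an equivalence class on the condition attributes is assigned to a decision class when the proportion of its objects carrying that decision reaches the precision threshold β. The toy cases and β = 0.8 below are assumptions, not the Commission data analysed in the paper.

    from collections import defaultdict

    # (condition attributes, decision), e.g. discretised firm characteristics
    cases = [(("high", "yes"), 1), (("high", "yes"), 1), (("high", "yes"), 0),
             (("low",  "yes"), 0), (("low",  "no"),  0), (("low",  "no"),  0),
             (("high", "no"),  1), (("high", "no"),  0)]
    beta = 0.8                       # precision threshold

    # Group objects into equivalence classes on the condition attributes.
    classes = defaultdict(list)
    for cond, dec in cases:
        classes[cond].append(dec)

    # An equivalence class enters the beta-lower approximation of a decision
    # when at least a proportion beta of its objects carry that decision.
    for decision in (0, 1):
        lower = [cond for cond, decs in classes.items()
                 if decs.count(decision) / len(decs) >= beta]
        print(f"beta-lower approximation of decision {decision}: {lower}")

Relaxing β below 1 is what lets VPRS tolerate a controlled amount of misclassification, which is why it copes better with noisy, small-sample data than the classical rough set model.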

Relevance: 80.00%

Abstract:

Market orientation (MO) and marketing performance measurement (MPM) are two of the most widespread strategic marketing concepts among practitioners. However, some have questioned the benefits of extensive investments in MO and MPM. More importantly, little is known about which combinations of MO and MPM are optimal in ensuring high business performance. To address this research gap, the authors analyze a unique data set of 628 firms with a novel method of configurational analysis: fuzzy-set qualitative comparative analysis. In line with prior research, the authors find that MO is an important determinant of business performance. However, to reap its benefits, managers need to complement it with appropriate MPM, the level and focus of which vary across firms. For example, whereas large firms and market leaders generally benefit from comprehensive MPM, small firms may benefit from measuring marketing performance only selectively or by focusing on particular dimensions of marketing performance. The study also finds that many of the highest-performing firms do not follow any of the particular best practices identified.

Relevance: 40.00%

Abstract:

The main advantage of Data Envelopment Analysis (DEA) is that it does not require any a priori weights for inputs and outputs: each decision making unit (DMU) is allowed to evaluate its efficiency with the input and output weights that are most favorable to it. It can be argued, however, that if DMUs are experiencing similar circumstances, the pricing of inputs and outputs should apply uniformly across all DMUs. Using different weights for different DMUs means that their efficiencies cannot be compared or ranked on the same basis. This is a significant drawback of DEA, although the literature offers several remedies, including the use of a common set of weights (CSW). In addition, conventional DEA methods require accurate measurement of both the inputs and outputs; however, crisp input and output data may not always be available in real-world applications. This paper develops a new model for the calculation of a CSW in fuzzy environments using fuzzy DEA. A numerical example is used to show the validity and efficacy of the proposed model and to compare its results with those of previous models available in the literature.
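For crisp data, one simple way to obtain a common set of weights is to choose a single weight vector that minimises the total gap between weighted inputs and weighted outputs across all DMUs, then score every DMU with those shared weights. The sketch below implements that illustrative formulation; it is not the fuzzy CSW model proposed in the paper, and the data are made up.

    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 1.0], [4.0, 3.0], [3.0, 2.0]])   # inputs:  3 DMUs x 2
    Y = np.array([[1.0], [2.0], [1.2]])                  # outputs: 3 DMUs x 1
    n, m = X.shape
    s = Y.shape[1]

    # Variables z = [u (output weights), v (input weights)].
    # Minimise the total gap  sum_j (v.x_j - u.y_j)  subject to u.y_j <= v.x_j.
    c = np.concatenate([-Y.sum(axis=0), X.sum(axis=0)])
    A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = [np.concatenate([np.zeros(s), X.sum(axis=0)])]  # scale: total weighted input = n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[float(n)],
                  bounds=(1e-3, None))                   # keep all weights strictly positive
    u, v = res.x[:s], res.x[s:]

    efficiency = (Y @ u) / (X @ v)                       # every DMU scored with the same weights
    print(np.round(efficiency, 3))

Because all DMUs are evaluated with the same (u, v), the resulting scores are directly comparable and yield a complete ranking, which is exactly the property the fuzzy CSW model extends to imprecise data.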

Relevance: 30.00%

Abstract:

In this paper we review recent theoretical approaches for analysing the dynamics of on-line learning in multilayer neural networks using methods adopted from statistical physics. The analysis is based on monitoring a set of macroscopic variables from which the generalisation error can be calculated. A closed set of dynamical equations for the macroscopic variables is derived analytically and solved numerically. The theoretical framework is then employed for defining optimal learning parameters and for analysing the incorporation of second order information into the learning process using natural gradient descent and matrix-momentum based methods. We will also briefly explain an extension of the original framework for analysing the case where training examples are sampled with repetition.
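The flavour of the macroscopic description can be conveyed by a toy simulation: instead of tracking all N weights of a student trained online, one monitors a handful of order parameters from which the generalisation error follows. The sketch below does this for a single-layer teacher-student perceptron with a Hebbian update, a deliberate simplification of the multilayer (soft committee machine) setting analysed in the paper; with the teacher normalised so that B·B = N, the standard result eps_g = arccos(R/√Q)/π applies.

    import numpy as np

    N, steps, rng = 1000, 20000, np.random.default_rng(0)
    B = rng.normal(size=N)
    B *= np.sqrt(N) / np.linalg.norm(B)        # teacher vector, normalised so B.B = N
    J = np.zeros(N)                            # student weights

    for t in range(1, steps + 1):
        x = rng.normal(size=N)                 # random input pattern
        y = np.sign(B @ x)                     # teacher label
        J += y * x / np.sqrt(N)                # online Hebbian update
        if t % 5000 == 0:
            R = (J @ B) / N                    # teacher-student overlap
            Q = (J @ J) / N                    # student self-overlap
            eps_g = np.arccos(R / np.sqrt(Q)) / np.pi
            print(f"t/N={t / N:5.1f}  R={R:7.3f}  Q={Q:8.3f}  eps_g={eps_g:.3f}")

In the theoretical framework the same quantities R and Q (and their multilayer analogues) obey a closed set of deterministic differential equations in the rescaled time t/N, which is what allows optimal learning parameters to be derived analytically rather than by simulation.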

Relevance: 30.00%

Abstract:

The performance of "typical set (pairs) decoding" for ensembles of Gallager's linear code is investigated using statistical physics. In this decoding method, errors occur, either when the information transmission is corrupted by atypical noise, or when multiple typical sequences satisfy the parity check equation as provided by the received corrupted codeword. We show that the average error rate for the second type of error over a given code ensemble can be accurately evaluated using the replica method, including the sensitivity to message length. Our approach generally improves the existing analysis known in the information theory community, which was recently reintroduced in IEEE Trans. Inf. Theory 45, 399 (1999), and is believed to be the most accurate to date. © 2002 The American Physical Society.
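A brute-force toy version of typical-set decoding makes the second error type concrete: the decoder fails whenever more than one noise vector of near-typical weight reproduces the observed syndrome. The tiny code length, the random sparse parity-check matrix and the enumeration below are purely illustrative; the paper's analysis treats large Gallager codes analytically via the replica method, not by simulation.

    import itertools
    import numpy as np

    rng = np.random.default_rng(3)
    n, k, p, trials = 16, 8, 0.1, 200
    H = (rng.random((n - k, n)) < 0.3).astype(int)       # random sparse parity checks

    # Noise vectors whose weight is close to the typical value p*n.
    max_w = int(np.ceil(p * n)) + 2
    candidates = []
    for w in range(max_w + 1):
        for pos in itertools.combinations(range(n), w):
            z = np.zeros(n, dtype=int)
            z[list(pos)] = 1
            candidates.append(z)
    cand_syndromes = [H @ z % 2 for z in candidates]

    ambiguous = 0
    for _ in range(trials):
        noise = (rng.random(n) < p).astype(int)          # BSC noise on an all-zero codeword
        syndrome = H @ noise % 2
        matches = sum(np.array_equal(s, syndrome) for s in cand_syndromes)
        ambiguous += matches > 1                         # several typical explanations
    print(f"fraction of blocks with multiple typical explanations: {ambiguous / trials:.2f}")

For long codes this enumeration is hopeless, which is why the average frequency of such ambiguous blocks over a code ensemble is computed analytically with the replica method instead.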

Relevance: 30.00%

Abstract:

The whole set of nickel(II) complexes with non-derivatized edta-type hexadentate ligands has been investigated in terms of their structural and electronic properties. Two more complexes have been prepared in order to complete the set: trans(O5)-[Ni(ED3AP)]2- and trans(O5O6)-[Ni(EDA3P)]2-. The trans(O5) geometry has been verified crystallographically, while the trans(O5O6) geometry of the second complex has been predicted by DFT calculations and spectral analysis. Mutual dependence has been established between: the number of five-membered carboxylate rings, the octahedral/tetrahedral deviation of the metal-ligand/nitrogen-neighbour-atom angles, and the charge-transfer energies (CTE) calculated by Morokuma's energy decomposition analysis; and between the energy of the absorption bands and the HOMO–LUMO gap.

Relevance: 30.00%

Abstract:

What does endogenous growth theory tell about regional economies? Empirics of R&D worker-based productivity growth, Regional Studies. Endogenous growth theory emerged in the 1990s as ‘new growth theory’ accounting for technical progress in the growth process. This paper examines the role of research and development (R&D) workers underlying the Romer model (1990) and its subsequent modifications, and compares it with a model based on the accumulation of human capital engaged in R&D. Cross-section estimates of the models against productivity growth of European regions in the 1990s suggest that each R&D worker has a unique set of knowledge while his/her contributions are enhanced by knowledge sharing within a region as well as spillovers from other regions in proximity.
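For reference, the knowledge production function of the Romer (1990) model referred to above can be written as follows, with A the stock of knowledge, H_A the human capital (R&D workers) employed in research and δ research productivity; the paper's estimated cross-section specification is not reproduced here.

    \dot{A} \;=\; \delta\, H_A\, A ,
    \qquad
    g_A \;\equiv\; \frac{\dot{A}}{A} \;=\; \delta\, H_A

The growth rate of technology is thus proportional to the stock of R&D workers, which is the mechanism the regional cross-section estimates, augmented with intra-regional knowledge sharing and inter-regional spillovers, are designed to probe.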

Relevance: 30.00%

Abstract:

Previous developments in the opportunism-independent theory of the firm are either restricted to special cases or are derived from the capabilities or resource-based perspective. However, a more general opportunism-independent approach can be developed, based on the work of Demsetz and Coase, which is nevertheless contractual in nature. This depends on 'direction', that is, deriving economic value by permitting one set of actors to direct the activities of another, and of non-human factors of production. Direction helps to explain not only firm boundaries and organisation, but also the existence of firms, without appealing to opportunism or moral hazard. The paper also considers the extent to which it is meaningful to speak of 'contractual' theories in the absence of opportunism, and whether this analysis can be extended beyond the employment contract to encompass ownership of assets by the firm. © The Author 2005. Published by Oxford University Press on behalf of the Cambridge Political Economy Society. All rights reserved.

Relevance: 30.00%

Abstract:

Purpose – The purpose of this editorial is to bring together thoughts and opinions from the Editors and Senior Advisory Board of EJM regarding the nature of the long-debated “theory-practice divide” in marketing scholarship.
Design/methodology/approach – The authors synthesise diverse opinions from senior academics in order both to inspire further debate in marketing scholarship, and to draw some important conclusions for marketing academia as a whole.
Findings – The authors propose that, for marketing scholarship to mature and progress, room must be found for those who wish to focus both on practical and on pure marketing scholarship. Career advancement from both routes is vital.
Research limitations/implications – The topic of the theory-practice gap is complex. Many diverse opinions are cited and, due to space constraints, the coverage of many issues is necessarily brief.
Practical implications – Scholars should find the thoughts contained in the paper of significant interest.
Originality/value – The paper appears to be the first to bring together such a set of diverse opinions on the subject, and to try to draw some overall pragmatic conclusions, while still recognising the multiplicity of valid thought in the area.