19 results for rough set theory

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

This paper introduces a new technique for the investigation of limited-dependent variable models. It illustrates that variable precision rough set theory (VPRS), allied with a modern method of classification, or discretisation of data, can outperform the more standard approaches employed in economics, such as a probit model. These approaches and certain inductive decision tree methods are compared (through a Monte Carlo simulation approach) in the analysis of the decisions reached by the UK Monopolies and Mergers Commission. We show that, particularly in small samples, the VPRS model can improve on more traditional models, both in-sample and, particularly, in out-of-sample prediction. A similar improvement in out-of-sample prediction over the decision tree methods is also shown.
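The core VPRS idea can be sketched briefly. Classical rough sets admit an object to the positive region only if its condition-equivalence class predicts a single decision with certainty; VPRS relaxes this to a majority-inclusion threshold β, which is what lets it tolerate noise in small samples. A minimal sketch, not the paper's implementation (the data layout and β value are illustrative):

```python
from collections import defaultdict

def beta_positive_region(objects, condition, decision, beta=0.8):
    """VPRS beta-positive region: objects whose condition-equivalence
    class predicts one decision with relative frequency >= beta."""
    # Group objects into equivalence classes by their condition attributes.
    classes = defaultdict(list)
    for obj in objects:
        classes[condition[obj]].append(obj)

    positive = set()
    for members in classes.values():
        counts = defaultdict(int)
        for obj in members:
            counts[decision[obj]] += 1
        # Majority-inclusion test; beta = 1 recovers classical rough sets.
        if max(counts.values()) / len(members) >= beta:
            positive.update(members)
    return positive
```

For example, a class whose members split 2:1 between decisions has purity 2/3, so it enters the positive region at β = 0.6 but not at β = 0.8.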

Relevance:

90.00%

Publisher:

Abstract:

Web document cluster analysis plays an important role in information retrieval by organizing large numbers of documents into a small number of meaningful clusters. Traditional web document clustering is based on the Vector Space Model (VSM), which takes into account only two levels of knowledge granularity (document and term) and ignores the bridging paragraph granularity. This two-level granularity can lead to unsatisfactory clustering results exhibiting "false correlation". To deal with this problem, a Hierarchical Representation Model with Multi-granularity (HRMM), which consists of a five-layer representation of data and a two-phase clustering process, is proposed based on granular computing and article structure theory. To deal with the zero-valued similarity problem resulting from the sparse term-paragraph matrix, an ontology-based strategy and a tolerance-rough-set-based strategy are introduced into HRMM. By using granular computing, structural knowledge hidden in documents can be captured more efficiently and effectively in HRMM, and thus web document clusters of higher quality can be generated. Extensive experiments show that HRMM, HRMM with the tolerance-rough-set strategy, and HRMM with ontology all significantly outperform VSM and a representative non-VSM-based algorithm, WFP, in terms of the F-Score.
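The tolerance-rough-set strategy can be illustrated in miniature: terms that co-occur in at least θ documents form tolerance classes, and a text unit's term set is enriched with its upper approximation, so sparse representations that share no literal terms can still overlap. A hedged sketch under these assumptions, not the paper's HRMM implementation (the threshold θ and the data are invented):

```python
from collections import defaultdict

def tolerance_classes(docs, theta=2):
    """Terms t and u are tolerant if they co-occur in at least theta
    documents. Returns term -> its tolerance class (term + tolerant terms)."""
    cooc = defaultdict(int)
    terms = set()
    for doc in docs:
        uniq = set(doc)
        terms |= uniq
        for t in uniq:
            for u in uniq:
                if t != u:
                    cooc[(t, u)] += 1
    return {t: {t} | {u for u in terms if cooc[(t, u)] >= theta}
            for t in terms}

def upper_approximation(unit_terms, classes):
    """Enrich a unit's term set with all tolerant terms, giving non-zero
    similarity between units that share no literal terms."""
    enriched = set()
    for t in unit_terms:
        enriched |= classes.get(t, {t})
    return enriched
```

A paragraph containing only "a" is then represented by {"a"} plus every term tolerant with "a", which mitigates the zero-valued similarities produced by a sparse term-paragraph matrix.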

Relevance:

80.00%

Publisher:

Abstract:

This paper develops an integrated approach, combining quality function deployment (QFD), fuzzy set theory, and the analytic hierarchy process (AHP), to evaluate and select optimal third-party logistics service providers (3PLs). In the approach, multiple evaluating criteria are derived from the requirements of company stakeholders using a series of houses of quality (HOQ). The importance of the evaluating criteria is prioritized with respect to the degree of achieving the stakeholder requirements using fuzzy AHP. Based on the ranked criteria, alternative 3PLs are evaluated and compared with each other, again using fuzzy AHP, to make an optimal selection. The effectiveness of the proposed approach is demonstrated by applying it to a Hong Kong-based enterprise that supplies hard disk components. The proposed integrated approach outperforms existing approaches because the outsourcing strategy and 3PL selection are derived from the corporate/business strategy.
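One common way to combine fuzzy set theory with AHP is Buckley's geometric-mean method over triangular fuzzy comparison judgments; the paper may use a different variant, but a minimal sketch of this idea looks like:

```python
def fuzzy_ahp_weights(matrix):
    """Buckley-style fuzzy AHP. matrix[i][j] is a triangular fuzzy number
    (l, m, u): the pairwise comparison of criterion i over criterion j.
    Returns crisp, normalised criteria weights."""
    n = len(matrix)
    # Fuzzy geometric mean of each row, component-wise.
    gmeans = []
    for row in matrix:
        l = m = u = 1.0
        for (a, b, c) in row:
            l *= a; m *= b; u *= c
        gmeans.append((l ** (1 / n), m ** (1 / n), u ** (1 / n)))
    # Normalise: lower bounds divide by the sum of uppers and vice versa.
    sl = sum(g[0] for g in gmeans)
    sm = sum(g[1] for g in gmeans)
    su = sum(g[2] for g in gmeans)
    fuzzy_w = [(g[0] / su, g[1] / sm, g[2] / sl) for g in gmeans]
    # Defuzzify by the centroid and renormalise to sum to 1.
    crisp = [(l + m + u) / 3 for (l, m, u) in fuzzy_w]
    total = sum(crisp)
    return [w / total for w in crisp]
```

With a judgment that criterion 1 is twice as important as criterion 2, i.e. a comparison of (2, 2, 2), the weights come out roughly 2/3 and 1/3, as expected.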

Relevance:

80.00%

Publisher:

Abstract:

Renewable energy project development is highly complex and success is by no means guaranteed. Decisions are often made with approximate or uncertain information, yet the methods currently employed by decision-makers do not necessarily accommodate this. Levelised energy costs (LEC) are one commonly applied measure used within the energy industry to assess the viability of potential projects and inform policy. This research proposes a method for accommodating such uncertainty by enhancing the traditional discounted LEC measure with fuzzy set theory. Furthermore, the research develops the fuzzy LEC (F-LEC) methodology to incorporate the cost of financing a project from debt and equity sources. Applied to an example bioenergy project, the research demonstrates the benefit of incorporating fuzziness for decision-making on project viability, optimal capital structure, and key-variable sensitivity analysis. The proposed method contributes by incorporating uncertain and approximate information into the widely used LEC measure and by being applicable to a wide range of energy project viability decisions. © 2013 Elsevier Ltd. All rights reserved.
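The underlying LEC measure is the ratio of discounted lifetime costs to discounted lifetime energy output, LEC = Σ Cₜ/(1+r)ᵗ ÷ Σ Eₜ/(1+r)ᵗ. A minimal sketch of fuzzifying the discount rate with a triangular fuzzy number, evaluated at its three vertices, follows; this is a simplification of a full F-LEC treatment (which would use α-cut arithmetic and fuzzified cash flows), and the numbers are invented:

```python
def lec(costs, energy, rate):
    """Levelised energy cost: discounted lifetime costs divided by
    discounted lifetime energy output. costs[t], energy[t] are year-t values."""
    num = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    den = sum(e / (1 + rate) ** t for t, e in enumerate(energy))
    return num / den

def fuzzy_lec(costs, energy, rate_tfn):
    """Propagate a triangular fuzzy discount rate (l, m, u) by evaluating
    the LEC at each vertex -- a coarse stand-in for full alpha-cut
    arithmetic -- and return the resulting (low, mid, high) LEC."""
    return tuple(sorted(lec(costs, energy, r) for r in rate_tfn))
```

A project with an upfront cost of 100, annual costs of 10, and annual output of 50 over two operating years has an LEC of 120/100 = 1.2 at a zero discount rate; the fuzzy version returns an interval around that reflecting discount-rate uncertainty.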

Relevance:

40.00%

Publisher:

Abstract:

Concept evaluation in the early phase of product development plays a crucial role in new product development, as it determines the direction of the subsequent design activities. However, the evaluation information at this stage comes mainly from experts' judgments, which are subjective and imprecise. How to manage this subjectivity so as to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method which combines information entropy theory with rough numbers. Rough numbers are first used to aggregate individual judgments and priorities and to handle the vagueness of a group decision-making environment. A rough-number-based information entropy method is then proposed to determine the relative weights of the evaluation criteria. Composite performance values based on rough numbers are calculated to rank the candidate design concepts. The results of a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen objectivity across the decision-making process.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we review recent theoretical approaches for analysing the dynamics of on-line learning in multilayer neural networks using methods adopted from statistical physics. The analysis is based on monitoring a set of macroscopic variables from which the generalisation error can be calculated. A closed set of dynamical equations for the macroscopic variables is derived analytically and solved numerically. The theoretical framework is then employed for defining optimal learning parameters and for analysing the incorporation of second order information into the learning process using natural gradient descent and matrix-momentum based methods. We will also briefly explain an extension of the original framework for analysing the case where training examples are sampled with repetition.

Relevance:

30.00%

Publisher:

Abstract:

The performance of "typical set (pairs) decoding" for ensembles of Gallager's linear code is investigated using statistical physics. In this decoding method, errors occur, either when the information transmission is corrupted by atypical noise, or when multiple typical sequences satisfy the parity check equation as provided by the received corrupted codeword. We show that the average error rate for the second type of error over a given code ensemble can be accurately evaluated using the replica method, including the sensitivity to message length. Our approach generally improves the existing analysis known in the information theory community, which was recently reintroduced in IEEE Trans. Inf. Theory 45, 399 (1999), and is believed to be the most accurate to date. © 2002 The American Physical Society.

Relevance:

30.00%

Publisher:

Abstract:

The whole set of nickel(II) complexes with non-derivatized edta-type hexadentate ligands has been investigated in terms of their structural and electronic properties. Two further complexes were prepared in order to complete the set: trans(O5)-[Ni(ED3AP)]2- and trans(O5O6)-[Ni(EDA3P)]2-. The trans(O5) geometry has been verified crystallographically, and the trans(O5O6) geometry of the second complex has been predicted by DFT theory and spectral analysis. A mutual dependence has been established between: the number of five-membered carboxylate rings; the octahedral/tetrahedral deviation of metal-ligand/nitrogen-neighbour-atom angles; the charge-transfer energies (CTE) calculated by Morokuma's energy decomposition analysis; the energy of the absorption bands; and the HOMO–LUMO gap.

Relevance:

30.00%

Publisher:

Abstract:

What does endogenous growth theory tell about regional economies? Empirics of R&D worker-based productivity growth, Regional Studies. Endogenous growth theory emerged in the 1990s as ‘new growth theory’ accounting for technical progress in the growth process. This paper examines the role of research and development (R&D) workers underlying the Romer model (1990) and its subsequent modifications, and compares it with a model based on the accumulation of human capital engaged in R&D. Cross-section estimates of the models against productivity growth of European regions in the 1990s suggest that each R&D worker has a unique set of knowledge while his/her contributions are enhanced by knowledge sharing within a region as well as spillovers from other regions in proximity.

Relevance:

30.00%

Publisher:

Abstract:

Previous developments in the opportunism-independent theory of the firm are either restricted to special cases or are derived from the capabilities or resource-based perspective. However, a more general opportunism-independent approach can be developed, based on the work of Demsetz and Coase, which is nevertheless contractual in nature. This depends on 'direction', that is, deriving economic value by permitting one set of actors to direct the activities of another, and of non-human factors of production. Direction helps to explain not only firm boundaries and organisation, but also the existence of firms, without appealing to opportunism or moral hazard. The paper also considers the extent to which it is meaningful to speak of 'contractual' theories in the absence of opportunism, and whether this analysis can be extended beyond the employment contract to encompass ownership of assets by the firm. © The Author 2005. Published by Oxford University Press on behalf of the Cambridge Political Economy Society. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this editorial is to bring together thoughts and opinions from the Editors and Senior Advisory Board of EJM regarding the nature of the long-debated “theory-practice divide” in marketing scholarship. Design/methodology/approach – The authors synthesise diverse opinions from senior academics in order both to inspire further debate in marketing scholarship, and to draw some important conclusions for marketing academia as a whole. Findings – The authors propose that, for marketing scholarship to mature and progress, room must be found for those who wish to focus both on practical and on pure marketing scholarship. Career advancement from both routes is vital. Research limitations/implications – The topic of the theory-practice gap is complex. Many diverse opinions are cited and, due to space constraints, the coverage of many issues is necessarily brief. Practical implications – Scholars should find the thoughts contained in the paper of significant interest. Originality/value – The paper appears to be the first to bring together such a set of diverse opinions on the subject, and to try to draw some overall pragmatic conclusions, while still recognising the multiplicity of valid thought in the area.

Relevance:

30.00%

Publisher:

Abstract:

Interest is growing around the application of lean techniques to new product introduction (NPI). Although a relatively emergent topic compared with the application of ‘lean’ within the factory, since 2000 there has been an exponential rise in the literature on this subject. However, much of this work focuses on describing and extolling the virtues of the ‘Toyota approach’ to design. Therefore, by way of a stock take for the UK, the present authors' research has set out to understand how well lean product design practices have been adopted by leading manufacturers. This has been achieved by carrying out in-depth case studies with three carefully selected manufacturers of complex engineered products. This paper describes these studies, the detailed results and subsequent findings, and concludes that both the awareness and adoption of practices is generally embryonic and far removed from the theory advocated in the literature.

Relevance:

30.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT. This study is about leadership in American Evangelical Churches, which, as a sub-set of American Christianity, are growing while American Christianity as a whole is in decline. As a result, evangelicalism is quickly becoming the dominant iteration of American Christianity. It is anecdotal that well-led churches grow while poorly led churches do not, yet no one has identified what leadership, in the evangelical church context, is. Researchers have investigated a number of aspects of church leadership (much of it without identifying whether or not the churches under investigation were evangelical), but no one has put forth a unified theory linking these aspects together. The purpose of this research is to address that gap and develop a theory that explains how evangelicals view leadership in their local churches. In this study of three churches, dissimilar in size and governance, a purely qualitative approach to data collection and analysis was employed. The study involved 60 interviews that sought points of view from top- and mid-level leadership along with congregant followers, and borrowed heavily from Glaser and Strauss's (1967) grounded theory approach to data analysis. The result is a theory that provides a unified explanation of how leadership actually works in the three evangelical churches. Several implications for practice are discussed regarding the theory's usefulness as a method of leadership education and evaluation. An original finding is that an individual's incumbency within the organization acts as a form of social power. The limitations of this research are those generally imputed to purely qualitative research, in that questions are raised about the theory's applicability to evangelical churches beyond the three studied. Suggestions for further research involve addressing those limitations.

Relevance:

30.00%

Publisher:

Abstract:

Does entrepreneurial optimism affect business performance? Using a unique data set based on a repeated survey design, we investigate this relationship empirically. Our measures of 'optimism' and 'realism' are derived from comparing the turnover growth expectations of 133 owner-managers with the actual outcomes one year later. Our results indicate that entrepreneurial optimists perform significantly better in terms of profits than pessimists. Moreover, it is the optimist-realist combination that performs best. We interpret our results using regulatory focus theory.
