40 results for Multi-dimensional scaling

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Sentiment analysis has long focused on binary classification of text as either positive or negative. There has been little work on mapping sentiments or emotions into multiple dimensions. This paper studies a Bayesian modeling approach to multi-class sentiment classification and the prediction of multi-dimensional sentiment distributions. It proposes effective mechanisms to incorporate supervised information, such as labeled feature constraints and document-level sentiment distributions derived from the training data, into model learning. We have evaluated our approach on datasets collected from the confession section of the Experience Project website, where people share their life experiences and personal stories. Our results show that using the latent representation of the training documents derived from our approach as features to build a maximum entropy classifier outperforms other approaches on multi-class sentiment classification. In the more difficult task of predicting multi-dimensional sentiment distributions, our approach gives superior performance compared to several competitive baselines. © 2012 ACM.
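
The final classification step described in the abstract (a maximum entropy classifier built on latent document representations) can be sketched as follows. This is an illustrative sketch only: the latent features below are random placeholders, whereas the paper derives them from its Bayesian sentiment model, and the class count and scikit-learn usage are our assumptions rather than the authors' code.

```python
# Sketch: maximum entropy classification on latent document features.
# "Maximum entropy" is equivalent to multinomial logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_docs, n_topics, n_classes = 200, 10, 5  # hypothetical sizes, not from the paper

# Placeholder latent representations: per-document topic/sentiment proportions.
X = rng.dirichlet(np.ones(n_topics), size=n_docs)
y = rng.integers(0, n_classes, size=n_docs)  # placeholder sentiment labels

# Fit the maxent classifier on the latent features.
maxent = LogisticRegression(max_iter=1000)
maxent.fit(X, y)
pred = maxent.predict(X)
print(pred.shape)
```

In the paper the feature vectors would come from the trained Bayesian model rather than a Dirichlet draw; everything else about the classification step is standard.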

Relevance: 100.00%

Abstract:

Text summarization has been studied for over half a century, but traditional methods process texts empirically and neglect the fundamental characteristics and principles of language use and understanding. Automatic summarization is a desirable technique for processing big data. This reference summarizes previous text summarization approaches in a multi-dimensional category space, introduces a multi-dimensional methodology for research and development, unveils the basic characteristics and principles of language use and understanding, investigates some fundamental mechanisms of summarization, studies dimensions of representation, and proposes a multi-dimensional evaluation mechanism. The investigation extends to incorporating pictures into summaries and to the summarization of videos, graphs and pictures, and converges to a general summarization method. Further, some basic behaviors of summarization are studied in the complex cyber-physical-social space. Finally, a creative summarization mechanism is proposed as an effort toward the creative summarization of things, an open process of interactions among physical objects, data, people, and systems in cyber-physical-social space viewed through a multi-dimensional lens of semantic computing. These insights can inspire research and development in many computing areas.

Relevance: 100.00%

Abstract:

Visualising data for exploratory analysis is a big challenge in scientific and engineering domains where there is a need to gain insight into the structure and distribution of the data. Typically, visualisation methods like principal component analysis and multi-dimensional scaling are used, but it is difficult to incorporate prior knowledge about the structure of the data into the analysis. In this technical report we discuss a complementary approach based on an extension of a well-known non-linear probabilistic model, the Generative Topographic Mapping. We show that by including prior information about the covariance structure in the model, we are able to improve both the data visualisation and the model fit.
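
As a point of reference for the two baseline methods the report names (principal component analysis and multi-dimensional scaling), here is a minimal sketch of both on placeholder data. The report's actual contribution, the extended Generative Topographic Mapping with covariance priors, is not reproduced here; the data and dimensions are arbitrary assumptions.

```python
# Sketch: the two standard visualisation baselines, PCA and metric MDS,
# each projecting placeholder high-dimensional data down to 2-D.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))  # placeholder: 100 samples, 8 dimensions

pca_coords = PCA(n_components=2).fit_transform(X)   # linear projection
mds_coords = MDS(n_components=2, random_state=1).fit_transform(X)  # distance-preserving embedding
print(pca_coords.shape, mds_coords.shape)
```

Neither method lets you express prior knowledge about the data's covariance structure, which is the gap the report's GTM extension addresses.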

Relevance: 100.00%

Abstract:

Visualising data for exploratory analysis is a major challenge in many applications. Visualisation allows scientists to gain insight into the structure and distribution of the data, for example finding common patterns and relationships between samples as well as between variables. Typically, visualisation methods like principal component analysis and multi-dimensional scaling are employed. These methods are favoured for their simplicity, but they cannot cope with missing data and make it difficult to incorporate prior knowledge about properties of the variable space into the analysis; this is particularly important for the high-dimensional, sparse datasets typical of geochemistry. In this paper we show how to utilise a block-structured correlation matrix using a modification of a well-known non-linear probabilistic visualisation model, the Generative Topographic Mapping (GTM), which can cope with missing data. The block structure supports direct modelling of strongly correlated variables. We show that by including prior structural information it is possible to improve both the data visualisation and the model fit. These benefits are demonstrated on artificial data as well as on a real geochemical dataset used for oil exploration, where the proposed modifications improved the missing-data imputation results by 3 to 13%.
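
The block-structured correlation idea can be illustrated directly: variables within a block are strongly correlated, while variables in different blocks are treated as independent. The block sizes and within-block correlation below are illustrative assumptions, not values from the paper, and this sketch covers only the prior structure, not the GTM modification itself.

```python
# Sketch: building a block-diagonal correlation matrix as a structural
# prior, with correlation `within` inside each block and 0 across blocks.
import numpy as np

def block_correlation(block_sizes, within=0.8):
    """Return a block-diagonal correlation matrix for the given block sizes."""
    d = sum(block_sizes)
    C = np.zeros((d, d))
    start = 0
    for size in block_sizes:
        C[start:start + size, start:start + size] = within  # within-block correlation
        start += size
    np.fill_diagonal(C, 1.0)  # each variable perfectly correlates with itself
    return C

C = block_correlation([3, 2, 4])  # hypothetical grouping of 9 variables
print(C.shape)  # (9, 9)
```

In the paper, such a structure would encode domain knowledge that groups of geochemical variables move together, which the modified GTM then exploits during visualisation and missing-data imputation.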

Relevance: 100.00%

Abstract:

In this chapter we provide a comprehensive overview of the emerging field of visualising and browsing image databases. We start with a brief introduction to content-based image retrieval and the traditional query-by-example search paradigm that many retrieval systems employ. We specify the problems associated with this type of interface, such as users not being able to formulate a query because they have no target image or concept in mind. The idea of browsing systems is then introduced as a means to combat these issues, harnessing the cognitive power of the human mind in order to speed up image retrieval. We detail common methods by which the often high-dimensional feature data extracted from images can be used to visualise image databases in an intuitive way. Systems using dimensionality reduction techniques, such as multi-dimensional scaling, are reviewed along with those that cluster images using either divisive or agglomerative techniques, as well as graph-based visualisations. While visualisation of an image collection is useful for providing an overview of the contained images, it forms only part of an image database navigation system. We therefore also present various methods provided by these systems to allow for interactive browsing of these datasets. A further area we explore is user studies of systems and visualisations, where we look at the different evaluations undertaken to test usability and compare systems, and highlight the key findings from these studies. We conclude the chapter with several recommendations for future work in this area. © 2011 Springer-Verlag Berlin Heidelberg.
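
One of the clustering-based browsing approaches the chapter reviews, agglomerative clustering of image feature vectors, can be sketched in a few lines. The feature vectors below are random placeholders for real image descriptors, and the cluster count is an arbitrary assumption; this is not the chapter's own implementation.

```python
# Sketch: agglomerative (bottom-up) clustering of image features to form
# browsable groups, here cut into at most 5 clusters.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
features = rng.normal(size=(50, 16))  # placeholder: 50 images, 16-D descriptors

Z = linkage(features, method="average")          # build the merge hierarchy
labels = fcluster(Z, t=5, criterion="maxclust")  # cut it into <= 5 groups
print(labels.shape)
```

A browsing interface would show one representative image per cluster and let the user descend into the hierarchy, which is the navigation pattern these systems build on.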

Relevance: 80.00%

Abstract:

Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights into the domain which can be helpful in developing powerful models, but they need a modelling framework that helps them to use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of input space, often work better, since the behaviour of different areas may vary. Classical local models, such as Mixture of Experts, segment the input space automatically, which is not always effective, and they also lack the involvement of domain experts to guide a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The resulting segmentation is then used to develop effective local regression models.
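
The local-regression idea can be sketched as follows. The segmentation rule below is a simple placeholder standing in for the paper's expert-driven, visualisation-based segmentation, and the data is synthetic; only the overall pattern (one linear model per segment instead of a single global model) reflects the abstract.

```python
# Sketch: fit one linear regression model per input-space segment.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(300, 2))
# Behaviour differs by region, so no single linear model fits everywhere.
y = np.where(X[:, 0] < 0, 2 * X[:, 1], -3 * X[:, 1])

segments = (X[:, 0] >= 0).astype(int)  # placeholder for expert segmentation
local_models = {}
for s in np.unique(segments):
    mask = segments == s
    local_models[s] = LinearRegression().fit(X[mask], y[mask])

print(len(local_models))  # 2
```

Each local model recovers its region's behaviour exactly here, whereas a single global fit would average the two regimes away; the paper's contribution is letting domain experts draw the segment boundaries interactively.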

Relevance: 80.00%

Abstract:

Innovation is part and parcel of any service that is to remain competitive in today's environment. Quality improvement in healthcare services is a complex, multi-dimensional task. This study proposes innovation management in healthcare services using a logical framework. A problem tree and an objective tree are developed to identify and mitigate issues and concerns, and a logical framework is formulated to develop a plan for implementation and monitoring strategies, potentially creating an environment for continuous quality improvement in a specific unit. We recommend the logical framework as a valuable model for innovation management in healthcare services. Copyright © 2006 Inderscience Enterprises Ltd.

Relevance: 80.00%

Abstract:

Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation, and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps: problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This study shows LFA applied to three service processes in one hospital. This very limited population sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance. It may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws, and there is an absence of an integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.

Relevance: 80.00%

Abstract:

Purpose - The purpose of the paper is to develop an integrated quality management model which identifies problems, suggests solutions, develops a framework for implementation, and helps to evaluate the performance of healthcare services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. It has been applied to three acute healthcare services (Operating room utilization, Accident and emergency, and Intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This paper shows LFA applied to three service processes in one hospital; ideally it should also be tested in several hospitals and on other services. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, and corrective measures are taken for superior performance, there is an absence of an integrated approach which can identify and analyze issues, provide solutions to resolve those issues, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions in order to improve process performance. This study introduces an integrated and uniform quality management tool that integrates operations with organizational strategies. © Emerald Group Publishing Limited.

Relevance: 80.00%

Abstract:

Purpose – This paper aims to contribute to the debate on the drivers of the productivity gap that exists between the UK and its major international competitors. Design/methodology/approach – From the macro perspective, the paper explores the quantitative evidence on the productivity differentials and how they are measured. From the micro perspective, it explores the quantitative evidence on the role of management practices, claimed to be a key determinant in promoting firm competitiveness and in bridging the UK gap. Findings – This study suggests that management practices are an ambiguous driver of firm productivity and higher firm performance. On the methodological side, qualitative and subjective measures of either management practices or firm performance are often used, which makes the results not comparable across studies, across firms, or even within firms over time. Productivity and profitability are often, and erroneously, used interchangeably, although productivity is only one element of firm performance. Management practices, on the other hand, are multi-dimensional constructs that generally do not demonstrate a straightforward relationship with productivity variables; to assume that they are the only driver of higher productivity may be misleading. Moreover, there is evidence of an inverse causal relationship between management practices and firm performance, which calls into question most empirical results of the extant literature based on the unidirectional assumption of direct causality between management practices and firm performance. Research limitations/implications – These and other issues suggest that more research is needed to deepen the understanding of the UK productivity gap, and that more quantitative evidence should be provided on the way in which management practices contribute to UK competitiveness. Their impact is not easily measurable due to their complexity and their complementary nature, and this is fertile ground for further research.
Originality/value – This paper brings together the evidence on the UK productivity gap and its main drivers provided by the economics, management and performance measurement literature. This issue scores very highly on the agenda of policy makers and academics and has important implications for practitioners interested in evaluating the impact of managerial best practices.

Relevance: 80.00%

Abstract:

Self-service technology is affecting the service encounter. The potential reduction in personal contact through self-service technology may affect assessments of consumer satisfaction and commitment, making it necessary to investigate self-service technology usage, particularly the long-term impact on consumers' relationships with service organisations. Thus, this paper presents a framework for investigating the impact of self-service technology on consumer satisfaction and on a multi-dimensional measure of consumer commitment. Illustrative quotes from exploratory in-depth interviews support the framework and lead to a set of propositions. Future research directions for testing the framework are also discussed, and potential implications of this research are outlined.

Relevance: 80.00%

Abstract:

FDI plays a key role in development, particularly in the resource-constrained transition economies of Central and Eastern Europe with relatively low savings rates. Gains from technology transfer play a critical role in motivating FDI, yet its potential may be hampered by a large technology gap between the source and host country. While the extent of this gap has traditionally been attributed to education, skills and capital intensity, recent literature has also emphasized the possible role of the institutional environment in this respect. Despite tremendous interest among policy-makers and academics in understanding the factors attracting FDI (Bevan and Estrin, 2000; Globerman and Shapiro, 2003), our knowledge about the effects of institutions on the location choice and ownership structure of foreign firms remains limited. This paper attempts to fill this gap in the literature by examining the link between institutions and foreign ownership structures. To the best of our knowledge, Javorcik (2004) is the only paper that uses firm-level data to analyse the role of institutional quality in an outward investor's entry mode in transition countries. Our paper extends Javorcik (2004) in a number of ways: (a) rather than a cross-section, we use panel data for the period 1997-2006; (b) rather than a binary variable, we use percentage foreign ownership as a continuous variable; (c) we consider multi-dimensional institutional variables, such as corruption, intellectual property rights protection and government stability, and we also use factor analysis to generate a composite index of institutional quality and examine how a stronger institutional environment could affect foreign ownership; (d) we explore how the distance between the institutional environments of the source and host countries affects foreign ownership in the host country.
The firm-level data used includes both domestic and foreign firms for the period 1997-2006 and is drawn from ORBIS, a commercially available dataset provided by Bureau van Dijk. In order to examine the link between institutions and foreign ownership structures, we estimate four log-linear ownership equations/specifications augmented by institutional and other control variables. We find evidence that the decision of a foreign firm to either locate its subsidiary or acquire an existing domestic firm depends not only on factor cost differences but also on differences in the institutional environment between the host and source countries.
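
The shape of a log-linear ownership specification of the kind described can be sketched with ordinary least squares. Everything below is illustrative: the variable names, coefficients and synthetic data are our assumptions, not the paper's ORBIS sample or its exact four specifications.

```python
# Sketch: OLS estimation of a log-linear ownership equation, regressing
# a log foreign-ownership measure on institutional and control variables.
import numpy as np

rng = np.random.default_rng(4)
n = 500
corruption = rng.normal(size=n)    # hypothetical institutional-quality proxy
ipr = rng.normal(size=n)           # hypothetical IPR-protection proxy
log_wage_gap = rng.normal(size=n)  # hypothetical factor-cost control
# Synthetic dependent variable with known coefficients plus noise.
log_ownership = (0.5 - 0.3 * corruption + 0.2 * ipr
                 + 0.1 * log_wage_gap + rng.normal(scale=0.1, size=n))

# Solve the least-squares problem with an intercept column.
X = np.column_stack([np.ones(n), corruption, ipr, log_wage_gap])
beta, *_ = np.linalg.lstsq(X, log_ownership, rcond=None)
print(np.round(beta, 2))
```

With panel data the paper's specifications would also carry firm and time effects; this sketch shows only the cross-sectional core of such an equation.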

Relevance: 80.00%

Abstract:

This paper begins by suggesting that when considering Corporate Social Responsibility (CSR), even CSR justified in terms of the business case, stakeholders are of great importance to corporations. In the UK, the Company Law Review (DTI, 2002) has suggested that it is appropriate for UK companies to be managed on the basis of an enlightened shareholder approach. Within this approach the importance of stakeholders other than shareholders is recognised as being instrumental in providing shareholder value. Given the importance of these other stakeholders, it is then important that corporate management measure and manage stakeholder performance. There are two general approaches that could be adopted for this: the use of monetary values to reflect stakeholder value or cost, and the use of non-monetary values. To consider these approaches further, this paper examines their possible use for two stakeholder groups: employees and the environment. It concludes that there are ethical and practical difficulties in calculating economic values for stakeholder resources, and so prefers a multi-dimensional approach to stakeholder performance measurement that does not use economic valuation.

Relevance: 80.00%

Abstract:

Digital image processing is exploited in many diverse applications, but the size of digital images places excessive demands on current storage and transmission technology. Image data compression is required to permit further use of digital image processing. Conventional image compression techniques based on statistical analysis have reached a saturation level, so it is necessary to explore more radical methods. This thesis is concerned with novel methods, based on the use of fractals, for achieving significant compression of image data within reasonable processing time and without introducing excessive distortion. Images are modelled as fractal data and this model is exploited directly by compression schemes. The validity of this is demonstrated by showing that the fractal complexity measure of fractal dimension is an excellent predictor of image compressibility. A method of fractal waveform coding is developed which has low computational demands and performs better than conventional waveform coding methods such as PCM and DPCM. Fractal techniques based on the use of space-filling curves are developed as a mechanism for the hierarchical application of conventional techniques. Two particular applications are highlighted: the re-ordering of data during image scanning and the mapping of multi-dimensional data to one dimension. It is shown that there are many possible space-filling curves which may be used to scan images, and that selection of an optimum curve leads to significantly improved data compression. The multi-dimensional mapping property of space-filling curves is used to substantially speed up the lookup process in vector quantisation. Iterated function systems are compared with vector quantisers, and the computational complexity of iterated function system encoding is also reduced by using the efficient matching algorithms identified for vector quantisers.
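
The space-filling-curve scan order discussed in the thesis can be illustrated with the classic Hilbert-curve construction, which maps a one-dimensional position along the curve to 2-D pixel coordinates so that consecutive positions are always adjacent pixels. This is the standard textbook d2xy algorithm, not the thesis's own implementation, and the Hilbert curve is only one of the many possible space-filling curves the thesis compares.

```python
# Sketch: map a distance d along a Hilbert curve covering a
# 2^order x 2^order grid to (x, y) coordinates (classic d2xy construction).
def hilbert_d2xy(order, d):
    """Return the (x, y) cell at distance d along the Hilbert curve."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):  # grid side length is 2**order
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        # Rotate/reflect the quadrant so sub-curves join up.
        if ry == 0:
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Locality-preserving scan order for a 4x4 image.
scan = [hilbert_d2xy(2, d) for d in range(16)]
print(scan[:4])  # [(0, 0), (1, 0), (1, 1), (0, 1)]
```

Because each step moves to a neighbouring pixel, re-ordering image data along such a curve keeps spatially close pixels close in the 1-D stream, which is what makes the subsequent compression (and the vector-quantisation lookup speed-up) effective.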