895 results for Customer-value based approach
Abstract:
E-learning systems are becoming increasingly important in higher education. However, the state of the art of e-learning applications, as well as the state of the practice, does not achieve the level of interactivity that current learning theories advocate. In this paper, the possibility of enhancing e-learning systems to achieve deep learning has been studied by replicating an experiment in which students had to learn basic software engineering principles. One group learned these principles using a static approach, while the other group learned the same principles using a system-dynamics-based approach, which provided interactivity and feedback. The results show that, quantitatively, the latter group achieved a better understanding of the principles; furthermore, qualitatively, they enjoyed the learning experience.
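To make the "interactivity and feedback" idea concrete, here is a minimal, purely illustrative stock-and-flow sketch of the kind a system-dynamics-based learning tool might let students experiment with; the parameters and the staffing scenario are assumptions for illustration, not the model used in the experiment.

```python
# Illustrative only: a toy stock-and-flow model of the kind an interactive,
# system-dynamics-based learning tool might let students experiment with.
# Parameter names and values are assumptions, not taken from the paper.

def simulate(total_tasks=200.0, staff=5, hire_week=None, new_staff=5, weeks=60, dt=1.0):
    """Euler-integrate remaining work; a bigger team adds communication overhead."""
    remaining = total_tasks
    team = staff
    history = []
    for week in range(weeks):
        if hire_week is not None and week == hire_week:
            team += new_staff                           # the interactive "what if" lever
        overhead = 0.1 * team * (team - 1) / 2.0        # pairwise communication cost
        productivity = max(team * 1.0 - overhead, 0.1)  # tasks completed per week
        remaining = max(remaining - productivity * dt, 0.0)
        history.append((week, round(remaining, 1)))
        if remaining == 0.0:
            break
    return history

# Compare an early hire with a late hire: the week in the last tuple is the
# finish week, showing how little a late staffing boost shortens the project.
print(simulate(hire_week=5)[-1])
print(simulate(hire_week=35)[-1])
```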
Abstract:
We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners, which operate on discriminative image-intensity features that are defined on small patches and are fast to compute. A database designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line-search texture boundary detection in the direction normal to an initial boundary estimate. The method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, requiring a ribbon-like search region, and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking, and compare the method's performance with that of other existing methods for line-search boundary detection.
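As a rough illustration of line-search boundary detection along the normal to an initial estimate, the sketch below scores candidate offsets along the normal direction; the simple patch-contrast score is a stand-in for the trained weak-learner classifier described above, and the synthetic image and parameters are assumptions.

```python
# A minimal sketch of line-search boundary refinement along the normal to an
# initial boundary estimate. The patch features and the "classifier" below are
# stand-ins; the paper trains boosted weak learners on a dedicated database.
import numpy as np

def patch_score(image, y, x, half=3):
    """Stand-in transition score: intensity contrast between the two half-patches."""
    y, x = int(round(y)), int(round(x))
    left = image[y - half:y + half + 1, x - half:x]
    right = image[y - half:y + half + 1, x + 1:x + half + 1]
    if left.size == 0 or right.size == 0:
        return -np.inf
    return abs(left.mean() - right.mean())

def refine_point(image, point, normal, search=10):
    """Search along the normal direction for the most likely texture transition."""
    normal = np.asarray(normal, dtype=float)
    normal /= np.linalg.norm(normal)
    offsets = np.arange(-search, search + 1)
    scores = [patch_score(image, *(np.asarray(point) + t * normal)) for t in offsets]
    return np.asarray(point) + offsets[int(np.argmax(scores))] * normal

rng = np.random.default_rng(0)
img = np.hstack([rng.normal(0.2, 0.05, (64, 30)), rng.normal(0.8, 0.05, (64, 34))])
print(refine_point(img, point=(32, 25), normal=(0, 1)))  # should move toward column ~30
```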
Abstract:
A sparse kernel density estimator is derived based on the zero-norm constraint, in which the zero-norm of the kernel weights is incorporated to enhance model sparsity. The classical Parzen window estimate is adopted as the desired response for density estimation, and an approximate function of the zero-norm is used to achieve mathematical tractability and algorithmic efficiency. Under the mild condition of a positive definite design matrix, the kernel weights of the proposed density estimator based on the zero-norm approximation can be obtained using the multiplicative nonnegative quadratic programming algorithm. Using the D-optimality based selection algorithm as a preprocessing step to select a small significant subset design matrix, the proposed zero-norm based approach offers an effective means for constructing very sparse kernel density estimates with excellent generalisation performance.
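A simplified sketch of the weight-fitting idea, assuming a Gaussian kernel, the Parzen estimate as the desired response, and a plain multiplicative nonnegative update followed by a crude pruning step in place of the paper's exact zero-norm approximation:

```python
# A rough sketch of fitting nonnegative kernel weights against a Parzen-window
# target with a multiplicative update, then thresholding small weights to mimic
# a sparsity-inducing (zero-norm style) step. Kernel width, threshold and the
# renormalisation shortcut are assumptions, not the paper's exact algorithm.
import numpy as np

def gaussian_kernel(a, b, h):
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / (2 * h * h)) / np.sqrt(2 * np.pi * h * h)

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(1, 1.0, 100)])
h = 0.3
K = gaussian_kernel(x, x, h)          # candidate kernels centred on the data
parzen = K.mean(axis=1)               # desired response: classical Parzen estimate

B, c = K.T @ K, K.T @ parzen          # quadratic form of the least-squares fit
w = np.full(len(x), 1.0 / len(x))
for _ in range(200):
    w *= c / (B @ w + 1e-12)          # multiplicative nonnegative update
    w /= w.sum()                      # keep the weights a valid mixture

w[w < 1e-3] = 0.0                     # crude stand-in for the zero-norm pruning
w /= w.sum()
print("kernels retained:", int((w > 0).sum()), "of", len(x))
```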
Abstract:
This paper derives an efficient algorithm for constructing sparse kernel density (SKD) estimates. The algorithm first selects a very small subset of significant kernels using an orthogonal forward regression (OFR) procedure based on the D-optimality experimental design criterion. The weights of the resulting sparse kernel model are then calculated using a modified multiplicative nonnegative quadratic programming algorithm. Unlike most SKD estimators, the proposed D-optimality regression approach is an unsupervised construction algorithm and does not require an empirical desired response for the kernel selection task. The strength of the D-optimality OFR lies in the fact that the algorithm automatically selects a small subset of the most significant kernels associated with the largest eigenvalues of the kernel design matrix, which account for most of the energy of the kernel training data, and this also guarantees the most accurate kernel weight estimates. The proposed method is also computationally attractive in comparison with many existing SKD construction algorithms. Extensive numerical investigation demonstrates the ability of this regression-based approach to efficiently construct a very sparse kernel density estimate with excellent test accuracy, and our results show that the proposed method compares favourably with other existing sparse methods, in terms of test accuracy, model sparsity and complexity, for constructing kernel density estimates.
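A simplified sketch of a D-optimality-style greedy forward selection, relying on the standard fact that adding a column grows det(Φᵀ Φ) by the squared norm of that column's component orthogonal to the already selected columns; plain Gram-Schmidt is used here rather than the paper's exact OFR machinery.

```python
# A simplified sketch of D-optimality-style greedy kernel selection: at each
# step add the candidate column whose component orthogonal to the already
# selected columns is largest, which maximises the growth of det(Phi^T Phi).
# The orthogonalisation below is plain Gram-Schmidt, not the paper's exact OFR.
import numpy as np

def greedy_d_optimal(K, n_select):
    """K: (n_samples, n_candidates) kernel design matrix."""
    selected, basis = [], []
    for _ in range(n_select):
        best_j, best_gain = None, -1.0
        for j in range(K.shape[1]):
            if j in selected:
                continue
            r = K[:, j].copy()
            for q in basis:                      # remove components along chosen columns
                r -= (q @ K[:, j]) * q
            gain = r @ r                         # squared residual norm = det growth factor
            if gain > best_gain:
                best_gain, best_j = gain, j
        selected.append(best_j)
        q = K[:, best_j].copy()
        for b in basis:
            q -= (b @ K[:, best_j]) * b
        basis.append(q / np.linalg.norm(q))
    return selected

rng = np.random.default_rng(0)
x = rng.normal(size=300)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.5)   # Gaussian kernel design matrix
print("selected kernel centres:", greedy_d_optimal(K, n_select=10))
```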
Abstract:
View-based and Cartesian representations provide rival accounts of visual navigation in humans, and here we explore possible models for the view-based case. A visual “homing” experiment was undertaken by human participants in immersive virtual reality. The distributions of end-point errors on the ground plane differed significantly in shape and extent depending on visual landmark configuration and relative goal location. A model based on simple visual cues captures important characteristics of these distributions. Augmenting the visual features to include 3D elements such as stereo and motion parallax results in a set of models that describe the data accurately, demonstrating the effectiveness of a view-based approach.
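For context, one classic view-based homing rule is sketched below (the "average landmark vector" heuristic): the agent stores only a view-derived summary at the goal and moves to cancel the difference with its current view. This is a generic illustration of view-based navigation, not one of the specific models fitted to the virtual-reality data.

```python
# A toy version of a classic view-based homing rule: store the mean unit vector
# towards the visible landmarks at the goal, then repeatedly step along the
# difference between the current and stored vectors. Landmark layout, gain and
# iteration count are arbitrary illustrative choices.
import numpy as np

def average_landmark_vector(position, landmarks):
    vecs = landmarks - position                      # vectors to each landmark
    units = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    return units.mean(axis=0)                        # purely view-based summary

landmarks = np.array([[0.0, 5.0], [4.0, 4.0], [-3.0, 3.0]])
goal = np.array([0.0, 0.0])
alv_goal = average_landmark_vector(goal, landmarks)  # "snapshot" taken at the goal

pos = np.array([2.5, -1.5])
for _ in range(200):                                 # iterative homing
    pos = pos + 2.0 * (average_landmark_vector(pos, landmarks) - alv_goal)
print("final position (should be near the goal):", np.round(pos, 2))
```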
Abstract:
An alternative approach to understanding innovation is made using two intersecting ideas. The first is that successful innovation requires consideration of the social and organizational contexts in which it is located. The complex context of construction work is characterized by inter-organizational collaboration, a project-based approach and power distributed amongst collaborating organizations. The second is that innovations can be divided into two modes: ‘bounded’, where the implications of innovation are restricted within a single, coherent sphere of influence, and ‘unbounded’, where the effects of implementation spill over beyond this. Bounded innovations are adequately explained within the construction literature. However, less discussed are unbounded innovations, where the collaboration of many firms is required for successful implementation, even though many innovations can be considered unbounded within construction's inter-organizational context. It is argued that unbounded innovations require an approach that can understand and facilitate the interactions both among a range of actors and between those actors and technological artefacts. The insights from a sociology of technology approach can be applied to the multiplicity of negotiations and alignments that constitute the implementation of unbounded innovation. The utility of concepts from the sociology of technology, including ‘system building’ and ‘heterogeneous engineering’, is demonstrated by applying them to an empirical study of an unbounded innovation on a major construction project (the new terminal at Heathrow Airport, London, UK). This study suggests that ‘system building’ produces outcomes that are not only transformations of practices, processes and systems, but also the potential transformation of technologies themselves.
Abstract:
The construction sector is often described as lagging behind other major industries. At first this appears fair when considering the concept of corporate social responsibility (CSR). It is argued that CSR is ill-defined, with firms struggling to make sense of and engage with it. The literature suggests that the short-termism of construction firms renders the long-term, triple-bottom-line principle of CSR untenable. This seems to be borne out by literature indicating that construction firms typically adopt a compliance-based approach to CSR instead of discretionary CSR, which is regarded as adding the most value to firms and benefiting the broadest group of stakeholders. However, this research, conducted in the UK using a regional construction firm, offers a counter-argument: discretionary CSR approaches are well embedded and enacted within the firm's business operations even though they are not formally articulated as CSR strategies and thus remain 'hidden'. This raises questions for the current CSR debate. First, is ‘hidden’ CSR relevant to the long-term success of construction firms? Second, to what extent do these firms need to reinvent themselves to formally take advantage of the CSR agenda?
Abstract:
There are many published methods available for creating keyphrases for documents. Previous work in the field has shown that, in a significant proportion of cases, author-selected keyphrases are not appropriate for the document they accompany: often the keyphrases are not updated when the focus of a paper changes, or they are more classificatory than explanatory. This motivates the use of automated methods to improve the use of keyphrases. The published methods are all evaluated using different corpora, typically one relevant to their field of study. This not only makes it difficult to incorporate the useful elements of algorithms in future work but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of six corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a synonym-based approach. These methods were compared to evaluate performance and quality of results, and to provide a future benchmark. It is shown that, with the comparison metric used for this study, Term Frequency and Inverse Document Frequency were the best algorithms, with the synonym-based approach following them. Further work in the area is required to determine a more appropriate comparison metric.
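A minimal sketch of the two baselines the comparison found strongest, Term Frequency and Inverse Document Frequency, applied to toy documents; the tokeniser and the unigram-only candidates are simplifying assumptions.

```python
# Score candidate terms by raw term frequency and by inverse document frequency.
# The tiny corpus and the unigram-only candidates are illustrative assumptions.
import math
import re
from collections import Counter

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

corpus = [
    "kernel density estimation with sparse kernel models",
    "texture boundary detection for interactive segmentation",
    "sparse kernel density estimates and kernel weights",
]
doc = corpus[0]

tf = Counter(tokens(doc))                                   # term frequency in this document
df = Counter(t for d in corpus for t in set(tokens(d)))     # document frequency over the corpus
idf = {t: math.log(len(corpus) / df[t]) for t in tf}        # inverse document frequency

print("top by TF :", [t for t, _ in tf.most_common(3)])
print("top by IDF:", sorted(tf, key=lambda t: idf[t], reverse=True)[:3])
```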
Abstract:
The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion due to their higher resolution. The weakness of existing bottom-up models at capturing individual green technology buying behaviour has been identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods to incorporate buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
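A toy agent-based sketch in this spirit is shown below: each household adopts once the subsidised, peer-influenced price drops below its willingness to pay. All parameter values are illustrative assumptions rather than the prototype model's inputs.

```python
# A toy agent-based adoption model: households adopt a green technology when the
# subsidised price, nudged down by peer adoption, falls below their willingness
# to pay. Every number here is an assumption chosen only to show the mechanism.
import random

random.seed(42)

N_AGENTS, YEARS = 1000, 15
BASE_PRICE, SUBSIDY, PEER_WEIGHT = 100.0, 20.0, 30.0

willingness = [random.uniform(20, 120) for _ in range(N_AGENTS)]
adopted = [False] * N_AGENTS

for year in range(YEARS):
    adoption_rate = sum(adopted) / N_AGENTS
    price = BASE_PRICE * 0.97 ** year                      # modest annual cost decline
    effective_price = price - SUBSIDY - PEER_WEIGHT * adoption_rate
    for i in range(N_AGENTS):
        if not adopted[i] and willingness[i] >= effective_price:
            adopted[i] = True
    if year % 5 == 0 or year == YEARS - 1:
        print(f"year {year:2d}: {sum(adopted) / N_AGENTS:.1%} of households adopted")
```

Changing the SUBSIDY value and re-running gives a crude sense of how such a model could be used to compare policy measures.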
Abstract:
Farming freshwater prawns with fish in rice fields is widespread in coastal regions of southwest Bangladesh because of favourable resources and ecological conditions. This article provides an overview of an ecosystem-based approach to integrated prawn-fish-rice farming in southwest Bangladesh. The practice of prawn and fish farming in rice fields is a form of integrated aquaculture-agriculture, which provides a wide range of social, economic and environmental benefits. Integrated prawn-fish-rice farming plays an important role in the economy of Bangladesh, earning foreign exchange and increasing food production. However, this unique farming system in coastal Bangladesh is particularly vulnerable to climate change. We suggest that community-based adaptation strategies must be developed to cope with the challenges. We propose that integrated prawn-fish-rice farming could be relocated from the coastal region to less vulnerable upland areas, but caution that this will require appropriate adaptation strategies and an enabling institutional environment.
Abstract:
The article explores how fair trade and associated private agri-food standards are incorporated into public procurement in Europe. Procurement law is underpinned by principles of equity, non-discrimination and transparency; one consequence is that legal obstacles exist to fair trade being privileged within procurement practice. These obstacles have pragmatic dimensions, concerning whether and how procurement can be used to fulfil wider social policy objectives or to incorporate private standards; they also bring to the fore underlying issues of value. Taking an agency-based approach and incorporating the concept of governability, the article draws on empirical evidence to demonstrate the role played by different actors in negotiating fair trade’s passage into procurement by pre-empting and managing legal risk. This process exposes the contestations that arise when contrasting values come together within sustainable procurement. This examination of fair trade in public procurement helps reveal how practices and knowledge on ethical consumption enter a new governance arena within the global agri-food system.
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym-based approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that, when authors of Reuters’ news articles do provide keyphrases, these are of good quality, but that more often than not no keyphrases are provided at all.
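To illustrate the synonym-based idea, the sketch below pools term counts within synonym groups so that a concept expressed with varied wording still scores highly; the tiny hand-made synonym map is a hypothetical stand-in for a real thesaurus and is not the specific method evaluated in the paper.

```python
# A rough sketch of synonym-based scoring: candidate terms that share a synonym
# group pool their counts. The synonym map and stopword list are hypothetical.
import re
from collections import Counter

SYNONYMS = {  # hypothetical, hand-made groups standing in for a real thesaurus
    "picture": "image", "photo": "image", "image": "image",
    "method": "approach", "technique": "approach", "approach": "approach",
}
STOPWORDS = {"the", "a", "an", "with", "is", "each", "of"}

def canonical(term):
    return SYNONYMS.get(term, term)

doc = "The technique compares each photo with a reference image; the method is fast."
terms = [t for t in re.findall(r"[a-z]+", doc.lower()) if t not in STOPWORDS]
counts = Counter(canonical(t) for t in terms)
print(counts.most_common(2))   # the "image" and "approach" groups pool their counts
```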
Abstract:
Business process modelling can help an organisation better understand and improve its business processes. Most business process modelling methods adopt a task- or activity-based approach to identifying business processes. Within our work, we use activity theory to categorise elements within organisations as being either human beings, activities or artefacts. Due to the direct relationship between these three elements, an artefact-oriented approach to organisation analysis emerges. Organisational semiotics highlights the ontological dependency between affordances within an organisation. We analyse the ontological dependency between organisational elements, and therefore produce the ontology chart for artefact-oriented business process modelling in order to clarify the relationships between the elements of an organisation. Furthermore, we adopt the techniques of semantic analysis and norm analysis, from organisational semiotics, to develop the artefact-oriented method for business process modelling. The proposed method provides a novel perspective for identifying and analysing business processes, as well as agents and artefacts, as the artefact-oriented perspective captures the fundamental flow of an organisation. The modelling results enable an organisation to understand and model its processes from an artefact perspective, viewing an organisation as a network of artefacts. The information and practice captured and stored in artefacts can also be shared and reused between organisations that produce similar artefacts.
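A minimal data-structure sketch of "an organisation as a network of artefacts": each artefact records the activities and agents it depends on, so a process can be traced artefact by artefact. The class and example content are purely illustrative, not the paper's ontology charts or notation.

```python
# Illustrative only: represent artefacts with their producing activity, the
# downstream activities that use them, and the human agents involved, then
# trace which artefacts a given activity touches.
from dataclasses import dataclass, field

@dataclass
class Artefact:
    name: str
    produced_by: str                               # the activity that creates it
    used_by: list = field(default_factory=list)    # downstream activities
    agents: list = field(default_factory=list)     # human beings involved

invoice = Artefact("invoice", produced_by="billing",
                   used_by=["payment processing", "auditing"],
                   agents=["accounts clerk"])
order = Artefact("purchase order", produced_by="procurement",
                 used_by=["billing"], agents=["buyer"])

def trace(artefacts, activity):
    """List artefacts an activity produces or consumes, making the flow explicit."""
    return [a.name for a in artefacts if activity in a.used_by or a.produced_by == activity]

print(trace([invoice, order], "billing"))   # ['invoice', 'purchase order']
```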
Abstract:
Purpose – This paper seeks to make the case for new research into the perceived fairness and impact of executive pay.
Design/methodology/approach – The paper reviews the literature regarding executive compensation and corporate performance and examines the evidence that a more egalitarian approach to pay could be justified in terms of long-term shareholder value.
Findings – There would appear to be no evidence to suggest that the growing gap between the pay of executives and that of the average employee generates long-term enterprise value, and it may even be detrimental to firms, if not to the liberal capitalist consensus on which the corporate licence to operate is based.
Research limitations/implications – The paper outlines a new approach to tracking income differentials against corporate performance through the development of a corporate Gini coefficient “league table”.
Social implications – The proposed research is expected to point towards better practice in executive remuneration, and to support the growing momentum for a sustainable and enlightened approach to business, in which the key goal is long-term enterprise value based on a fair distribution of the rewards of business.
Originality/value – In producing a deeper understanding of the impact of widening income differentials, the paper should be of interest to senior executives in publicly quoted companies as well as press commentators, government officials and academics.
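The proposed corporate Gini coefficient "league table" would rest on the standard Gini calculation over a firm's pay distribution; a common discrete form is sketched below, with invented pay figures used purely to show the arithmetic.

```python
# Standard discrete Gini coefficient over a pay distribution. The two example
# firms and their pay figures are invented solely to illustrate the calculation.
def gini(values):
    xs = sorted(values)
    n = len(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * sum(xs)) - (n + 1) / n

flat_firm = [30_000] * 99 + [60_000]          # small executive premium
steep_firm = [30_000] * 99 + [3_000_000]      # large executive premium
print(f"flat firm Gini:  {gini(flat_firm):.3f}")   # ~0.010
print(f"steep firm Gini: {gini(steep_firm):.3f}")  # ~0.493
```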
Abstract:
Concern that European forest biodiversity is depleted and declining has provoked widespread efforts to improve management practices. To gauge the success of these actions, appropriate monitoring of forest ecosystems is paramount. Multi-species indicators are frequently used to assess the state of biodiversity and its response to implemented management, but generally applicable and objective methodologies for species selection are lacking. Here we use a niche-based approach, underpinned by coarse quantification of species' resource use, to objectively select species for inclusion in a pan-European forest bird indicator. We identify both the minimum number of species required to deliver full resource coverage and the most sensitive combination of species, and explore the trade-off between two key characteristics, sensitivity and redundancy, associated with indicators comprising different numbers of species. We compare our indicator to an existing forest bird indicator selected on the basis of expert opinion and show that it is more representative of the wider community. We also present alternative indicators for regional and forest-type-specific monitoring and show that species choice can have a significant impact on the indicator and on consequent projections about the state of the biodiversity it represents. Furthermore, by comparing indicator sets drawn from the currently monitored species and the full forest bird community, we identify gaps in the coverage of the current monitoring scheme. We believe that adopting this niche-based framework for species selection supports the objective development of multi-species indicators and that it has good potential to be extended to a range of habitats and taxa.
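Finding the minimum number of species required to deliver full resource coverage is essentially a set-cover question; the sketch below applies a greedy heuristic to an invented species-by-resource table, leaving out the sensitivity and redundancy weighting discussed above.

```python
# A sketch of the resource-coverage step in a niche-based selection: choose the
# fewest species whose combined resource use covers every resource class, via a
# greedy set-cover heuristic. The species-by-resource table is invented, and the
# paper's sensitivity/redundancy considerations are deliberately omitted.
resources_used = {   # hypothetical coarse resource-use profiles
    "woodpecker":   {"deadwood", "old growth", "insects"},
    "crossbill":    {"conifer seed", "old growth"},
    "warbler":      {"shrub layer", "insects"},
    "nightjar":     {"open ground", "insects"},
    "capercaillie": {"old growth", "shrub layer", "conifer seed"},
}
all_resources = set().union(*resources_used.values())

selected, covered = [], set()
while covered != all_resources:
    best = max(resources_used, key=lambda s: len(resources_used[s] - covered))
    selected.append(best)
    covered |= resources_used[best]

print("minimum indicator set (greedy):", selected)
```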