15 results for User-centric API Framework
in Aston University Research Archive
Abstract:
Learning user interests from online social networks helps to better understand user behaviors and provides useful guidance for designing user-centric applications. Apart from analyzing users' online content, it is also important to consider users' social connections in the social Web. Graph regularization methods have been widely used in various text mining tasks and can leverage the graph structure information extracted from data. Previous graph regularization methods operate under the cluster assumption that nearby nodes are more similar and that nodes on the same structure (typically referred to as a cluster or a manifold) are likely to be similar. We argue that learning user interests from complex, sparse, and dynamic social networks should instead be based on the link structure assumption, under which node similarities are evaluated from the local link structures rather than from explicit links between two nodes. We propose a regularization framework based on the relation bipartite graph, which can be constructed from any type of relation. Using Twitter as our case study, we evaluate the proposed framework on social networks built from retweet relations. Both quantitative and qualitative experiments show that our proposed method outperforms several competitive baselines in learning user interests over a set of predefined topics. It also gives superior results compared to the baselines on retweet prediction and topical authority identification. © 2014 ACM.
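The graph-regularization idea in this abstract can be illustrated with a small sketch. This is not the authors' algorithm: the toy retweet matrix, the shared-endpoint similarity, and the closed-form smoother below are illustrative assumptions, meant only to show how local link structure (shared relation endpoints) rather than direct user-to-user links can drive a Laplacian regularizer over interest scores.

```python
import numpy as np

# Hypothetical toy data: rows are users, columns are relation endpoints
# (e.g. source tweets each user retweeted). Values are invented.
B = np.array([
    [1, 1, 0, 0],   # user 0
    [1, 1, 1, 0],   # user 1
    [0, 0, 1, 1],   # user 2
    [0, 0, 0, 1],   # user 3
], dtype=float)

# Similarity from local link structure: users sharing many relation
# endpoints are similar even without an explicit direct link.
W = B @ B.T
np.fill_diagonal(W, 0.0)

# Graph Laplacian for the regularizer sum_ij W_ij (f_i - f_j)^2.
D = np.diag(W.sum(axis=1))
L = D - W

# Noisy initial interest scores for one topic; smooth them by
# minimising ||f - y||^2 + lam * f^T L f, whose closed form is below.
y = np.array([1.0, 0.0, 0.0, 0.0])
lam = 0.5
f = np.linalg.solve(np.eye(4) + lam * L, y)
```

After smoothing, user 1 (structurally close to user 0 through shared retweet endpoints) receives a higher score than users 2 and 3, even though no direct link between users is ever used.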
Abstract:
This paper presents the design and results of a task-based user study, based on Information Foraging Theory, of a novel user interaction framework, uInteract, for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real-life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings not only show how effective and easy to use the uInteract framework is, but also illustrate the value of Information Foraging Theory for interpreting user interaction with CBIR. © 2011 Springer-Verlag Berlin Heidelberg.
Abstract:
Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), Neuroscale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be taken care of while creating a new model, and provides information about how to install and use the tool. The user manual does not require readers to be familiar with the algorithms the tool implements; basic computing skills are enough to operate the software.
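Of the projection methods the manual lists, PCA is the simplest to sketch. The snippet below is not miniDVMS code; it is a minimal illustration, on synthetic data, of the conventional PCA projection the tool also supports (GTM and HGTM are considerably more involved).

```python
import numpy as np

# Synthetic data with one dominant direction of variance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 0] *= 4.0

# Centre the data, then take principal axes from the SVD of the
# centred matrix; the top two rows of Vt span the projection plane.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[:2].T   # 2-D coordinates suitable for visualisation
```

The resulting two columns are what a visualisation front end would plot; by construction the first component carries the most variance.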
Abstract:
Purpose – This paper aims to present a framework that will help manufacturing firms to configure their internal production and support operations to enable effective and efficient delivery of products and their closely associated services. Design/methodology/approach – First, the key definitions and literature sources directly associated with servitization of manufacturing are established. Then, a theoretical framework that categorises the key characteristics of a manufacturer's operations strategy is developed and populated using both evidence from the extant literature and empirical data. Findings – The framework captures a set of operations principles, structures and processes that can guide a manufacturer in the delivery of product-centric servitized offerings. These are illustrated and contrasted against operations that deliver purely products (production operations) and those that deliver purely services (service operations). Research limitations/implications – The work is based on a review of the literature supported by data collected from an exploratory case study. Whilst it provides an essential platform, further research will be needed to validate the framework. Originality/value – The principal contribution of this paper is a framework that captures the key characteristics of operations for product-centric servitized manufacture.
Abstract:
The development of increasingly powerful computers, which has enabled the use of windowing software, has also opened the way for the computer study, via simulation, of very complex physical systems. In this study, the main issues related to the implementation of interactive simulations of complex systems are identified and discussed. Most existing simulators are closed in the sense that there is no access to the source code and, even if it were available, adaptation to interaction with other systems would require extensive code re-writing. This work aims to increase the flexibility of such software by developing a set of object-oriented simulation classes, which can be extended, by subclassing, at any level, i.e., at the problem domain, presentation or interaction levels. A strategy, which involves the use of an object-oriented framework, concurrent execution of several simulation modules, use of a networked windowing system and the re-use of existing software written in procedural languages, is proposed. A prototype tool which combines these techniques has been implemented and is presented. It allows the on-line definition of the configuration of the physical system and generates the appropriate graphical user interface. Simulation routines have been developed for the chemical recovery cycle of a paper pulp mill. The application, by creation of new classes, of the prototype to the interactive simulation of this physical system is described. Besides providing visual feedback, the resulting graphical user interface greatly simplifies the interaction with this set of simulation modules. This study shows that considerable benefits can be obtained by application of computer science concepts to the engineering domain, by helping domain experts to tailor interactive tools to suit their needs.
Abstract:
This research investigates the general user interface problems in using networked services. Some of the problems are: users have to recall machine names and procedures to invoke networked services; interactions with some of the services are by means of menu-based interfaces which are quite cumbersome to use; inconsistencies exist between the interfaces for different services because they were developed independently. These problems have to be removed so that users can use the services effectively. A prototype system has been developed to help users interact with networked services. This consists of software which gives the user an easy and consistent interface with the various services. The prototype is based on a graphical user interface and it includes the following applications: Bath Information & Data Services; electronic mail; file editor. The prototype incorporates an online help facility to assist users using the system. The prototype can be divided into two parts: the user interface part that manages interaction with the user; the communication part that enables the communication with networked services to take place. The implementation is carried out using an object-oriented approach where both the user interface part and the communication part are objects. The essential characteristics of object-orientation (abstraction, encapsulation, inheritance and polymorphism) can all contribute to the better design and implementation of the prototype. The Smalltalk Model-View-Controller (MVC) methodology has been the framework for the construction of the prototype user interface. The purpose of the development was to study the effectiveness of user interaction with networked services. Having completed the prototype, test users were requested to use the system to evaluate its effectiveness. The evaluation of the prototype is based on observation, i.e. observing the way users use the system, and the opinion ratings given by the users. Recommendations to improve the prototype further are given based on the results of the evaluation.
Abstract:
Using the resistance literature as an underpinning theoretical framework, this chapter analyzes how Web designers, through their daily practices, (i) adopt recursive, adaptive, and resisting behavior regarding the inclusion of social cues online and (ii) shape the socio-technical power relationship between designers and other stakeholders. Five vignettes in the form of case studies with expert individual Web designers are used. Findings point to three types of emerging resistance, namely market-driven resistance, ideological resistance, and functional resistance. In addition, a series of propositions is provided linking the various themes. Furthermore, the authors suggest that stratification in Web designer types is occurring and that resistance offers a novel lens through which to analyze the debate.
Abstract:
Web APIs have gained increasing popularity in recent Web service technology development owing to the simplicity of their technology stack and the proliferation of mashups. However, efficiently discovering Web APIs and the relevant documentation on the Web is still a challenging task, even with the best resources available on the Web. In this paper we cast the problem of detecting Web API documentation as a text classification problem: classifying a given Web page as Web API associated or not. We propose a supervised generative topic model called feature latent Dirichlet allocation (feaLDA), which offers a generic probabilistic framework for automatic detection of Web APIs. feaLDA not only captures the correspondence between data and the associated class labels, but also provides a mechanism for incorporating side information, such as labelled features automatically learned from data, that can effectively help improve classification performance. Extensive experiments on our Web API documentation dataset show that the feaLDA model outperforms three strong supervised baselines, including naive Bayes, support vector machines, and the maximum entropy model, by over 3% in classification accuracy. In addition, feaLDA also gives superior performance when compared against other existing supervised topic models.
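Naive Bayes is named here as one of the baselines feaLDA is compared against. The sketch below is a minimal multinomial naive Bayes page classifier in that spirit; the toy "pages" and the hypothetical labels (1 = API documentation, 0 = not) are invented for illustration and are not the paper's dataset.

```python
import math
from collections import Counter

# Invented toy pages: label 1 = Web API documentation, 0 = other.
train = [
    ("get request endpoint returns json response parameters", 1),
    ("api key authentication token rest endpoint", 1),
    ("news article about local weather and sports", 0),
    ("blog post holiday photos and travel stories", 0),
]

def fit(docs):
    # Count words per class and class frequencies.
    word_counts = {0: Counter(), 1: Counter()}
    class_counts = Counter()
    vocab = set()
    for text, y in docs:
        toks = text.split()
        word_counts[y].update(toks)
        class_counts[y] += 1
        vocab.update(toks)
    return word_counts, class_counts, vocab

def predict(text, word_counts, class_counts, vocab):
    # Pick the class with the highest log posterior.
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for c in class_counts:
        lp = math.log(class_counts[c] / total)
        denom = sum(word_counts[c].values()) + len(vocab)
        for tok in text.split():
            # Laplace (add-one) smoothing for unseen words.
            lp += math.log((word_counts[c][tok] + 1) / denom)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

model = fit(train)
label = predict("rest api endpoint with json parameters", *model)
```

A topic-model classifier such as feaLDA replaces these raw word counts with latent topics and, unlike this baseline, can also absorb labelled features as side information.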
Abstract:
This paper presents an interactive content-based image retrieval framework, uInteract, for delivering a novel four-factor user interaction model visually. The four-factor user interaction model is an interactive relevance feedback mechanism that we propose, aiming to improve the interaction between users and the CBIR system and, in turn, users' overall search experience. In this paper, we present how the framework is developed to deliver the four-factor user interaction model, and how the visual interface is designed to support user interaction activities. From our preliminary user evaluation results on the ease of use and usefulness of the proposed framework, we have learnt what users like about the framework and which aspects we could improve in future studies. Whilst the framework is developed for our research purposes, we believe the functionalities could be adapted to any content-based image search framework.
Abstract:
Technology intermediaries are seen as potent vehicles for addressing perennial problems in transferring technology from university to industry in developed and developing countries. This paper examines what constitutes effective user-end intermediation in a low-technology, developing economy context, which is an under-researched topic. The social learning in technological innovation framework is extended using situated learning theory in a longitudinal instrumental case study of an exemplar technology intermediation programme. The paper documents the role that academic-related research and advisory centres can play as intermediaries in brokering, facilitating and configuring technology, against the backdrop of a group of small-scale pisciculture businesses in a rural area of Colombia. In doing so, it demonstrates how technology intermediation activities can be optimized in the domestication and innofusion of technology amongst end-users. The design components featured in this instrumental case of intermediation can inform policy making and practice relating to technology transfer from university to rural industry. Future research on this subject should consider the intermediation components put forward, as well as the impact of such interventions, in different countries and industrial sectors. Such research would allow for theoretical replication and help improve technology domestication and innofusion in different contexts, especially in less-developed countries.
Abstract:
Purpose: To define a research agenda for creating Resource-Efficient Supply Chains (RESC) by identifying and analysing their key characteristics as well as future research opportunities. Design/methodology/approach: We follow a systematic review method to analyse the literature and to understand RESC, taking a substantive theory approach. Our approach is grounded in a specific domain, the agri-food sector, because it is an intensive user of an extensive range of resources. Findings: The review shows that the literature has looked at the use of resources primarily from the environmental impact perspective. It shows a lack of understanding of the specific RESC characteristics, and concludes that more research is needed on multi-disciplinary methods for resource use and impact analyses as well as assessment methods for resource sensitivity and responsiveness. There is a need to explore whether or not, and how, logistics/supply chain decisions will affect the overall configuration of future food supply chains in an era of resource scarcity and depletion, and what the trade-offs will be. Research limitations/implications: The paper proposes an agenda for future research in the area of resource-efficient supply chains. The framework proposed, along with the key characteristics identified for RESC, can be applied to other sectors. Practical implications: Our research should facilitate further understanding of the implications and trade-offs of supply chain decisions taken on the use of resources by supply chain managers. Originality/value: The paper explores the interaction between supply chains and natural resources and also defines the key characteristics of RESC.
Abstract:
Autonomic systems are required to adapt continually to changing environments and user goals. This process involves the real-time update of the system's knowledge base, which should therefore be stored in a machine-readable format and automatically checked for consistency. OWL ontologies meet both requirements, as they represent collections of knowledge expressed in first-order logic, and feature embedded reasoners. To take advantage of these OWL ontology characteristics, this PhD project will devise a framework comprising a theoretical foundation, tools and methods for developing knowledge-centric autonomic systems. Within this framework, the knowledge storage and maintenance roles will be fulfilled by a specialised class of OWL ontologies. © 2014 ACM.
Abstract:
People manage a spectrum of identities in cyber domains. Profiling individuals and assigning them to distinct groups or classes have potential applications in targeted services, online fraud detection, extensive social sorting, and cyber-security. This paper presents the Uncertainty of Identity Toolset, a framework for the identification and profiling of users from their social media accounts and e-mail addresses. More specifically, in this paper we discuss the design and implementation of two tools of the framework. The Twitter Geographic Profiler tool builds a map of the ethno-cultural communities of a person's friends on Twitter social media service. The E-mail Address Profiler tool identifies the probable identities of individuals from their e-mail addresses and maps their geographical distribution across the UK. To this end, this paper presents a framework for profiling the digital traces of individuals.
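The E-mail Address Profiler described above starts from the structure of an address. The sketch below is an illustrative assumption of one such first step, splitting an address and reading its country-code TLD; the mapping table, function name, and example address are all invented and are not the toolset's actual method or data.

```python
# Invented country-code TLD mapping for illustration only.
CC_TLDS = {"uk": "United Kingdom", "fr": "France", "de": "Germany"}

def profile(address):
    # Split into local part and domain, then read the final TLD label.
    local, _, domain = address.partition("@")
    tld = domain.rsplit(".", 1)[-1].lower()
    return {
        "local": local,
        "domain": domain,
        "country": CC_TLDS.get(tld, "unknown"),
    }

info = profile("j.smith@example.ac.uk")
```

A real profiler would of course go far beyond the TLD, e.g. resolving institutional domains and cross-referencing social media accounts, as the paper describes.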
Abstract:
Purpose – The purpose of this paper is to develop an integrated patient-focused analytical framework to improve quality of care in the accident and emergency (A&E) unit of a Maltese hospital. Design/methodology/approach – The study adopts a case study approach. First, a thorough literature review has been undertaken to study the various methods of healthcare quality management. Second, a healthcare quality management framework is developed using combined quality function deployment (QFD) and the logical framework approach (LFA). Third, the proposed framework is applied to a Maltese hospital to demonstrate its effectiveness. The proposed framework has six steps, commencing with identifying patients’ requirements and concluding with implementing improvement projects. All the steps have been undertaken with the involvement of the concerned stakeholders in the A&E unit of the hospital. Findings – The major and related problems being faced by the hospital under study were overcrowding at A&E and shortage of beds, respectively. The combined framework ensures better A&E services and patient flow. QFD identifies and analyses the issues and challenges of A&E, and LFA helps develop project plans for healthcare quality improvement. The important outcomes of implementing the proposed quality improvement programme are fewer hospital admissions, faster patient flow, expert triage and shorter waiting times at the A&E unit. Increased emergency consultant cover and a faster first significant medical encounter were required to start addressing the problems effectively. Overall, the combined QFD and LFA method is effective in addressing quality of care in the A&E unit. Practical implications – The proposed framework can be easily integrated within any healthcare unit, as well as within entire healthcare systems, due to its flexible and user-friendly approach. It could be part of Six Sigma and other quality initiatives.
Originality/value – Although QFD has been extensively deployed in healthcare settings to improve quality of care, very little has been researched on combining QFD and LFA in order to identify issues, prioritise them, derive improvement measures and implement improvement projects. Additionally, there is no research on QFD application in A&E. This paper bridges these gaps. Moreover, very little has been written on the Maltese healthcare system. Therefore, this study contributes a demonstration of the quality of emergency care in Malta.
Abstract:
Adaptability for distributed object-oriented enterprise frameworks in multimedia technology is a critical mission for system evolution. Today, building adaptive services is a complex task due to the lack of adequate framework support in distributed computing systems. In this paper, we propose a Metalevel Component-Based Framework which uses distributed computing design patterns as components to develop an adaptable pattern-oriented framework for distributed computing applications. We describe our approach of combining a meta-architecture with a pattern-oriented framework, resulting in an adaptable framework which provides a mechanism to facilitate system evolution. This approach resolves the problem of dynamic adaptation in the framework, which is encountered in most distributed multimedia applications. The proposed architecture of the pattern-oriented framework has the ability to dynamically adopt new design patterns addressing issues in the domain of distributed computing, and these patterns can be woven together to shape the framework in the future. © 2011 Springer Science+Business Media B.V.
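The dynamic-adaptation idea in this abstract, swapping pattern components in at runtime without touching client code, can be sketched minimally. Everything below is hypothetical: the class, role names, and lambda components are invented illustrations of the metalevel hook, not the paper's framework.

```python
# Hypothetical sketch of a metalevel registry: pattern components are
# registered under roles and can be replaced while the system runs.
class Framework:
    def __init__(self):
        self._patterns = {}

    def register(self, role, component):
        # Dynamic adaptation point: rebinding a role swaps behaviour.
        self._patterns[role] = component

    def dispatch(self, role, *args):
        return self._patterns[role](*args)

fw = Framework()
fw.register("cache", lambda k: f"miss:{k}")
first = fw.dispatch("cache", "video42")
fw.register("cache", lambda k: f"hit:{k}")   # adapt at runtime
second = fw.dispatch("cache", "video42")
```

Client code calls `dispatch` throughout; only the registration changes, which is the essence of the adaptability the framework aims to provide.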