55 results for Ubiquitous and pervasive computing
in Aston University Research Archive
Abstract:
As mobile technologies continue to penetrate increasingly diverse domains of use, we accordingly need to understand the feasibility of different interaction technologies across such varied domains. This case study describes an investigation into whether speech-based input is a feasible interaction option for use in a complex, and arguably extreme, environment of use – that is, lobster fishing vessels. We reflect on our approaches to bringing the “high seas” into lab environments for this purpose, comparing the results obtained via our lab and our field studies. Our hope is that the work presented here will go some way to enhancing the literature in terms of approaches to bringing complex real-world contexts into lab environments for the purpose of evaluating the feasibility of specific interaction technologies.
Abstract:
Concept evaluation in the early phase of product development plays a crucial role in new product development, as it determines the direction of the subsequent design activities. However, the evaluation information at this stage comes mainly from experts' judgments, which are subjective and imprecise. Managing this subjectivity to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method which combines information entropy theory and rough numbers. Rough numbers are first used to aggregate individual judgments and priorities and to handle the vagueness of a group decision-making environment. A rough-number-based information entropy method is then proposed to determine the relative weights of the evaluation criteria, and composite performance values based on rough numbers are calculated to rank the candidate design concepts. The results from a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen objectivity across the decision-making process.
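The paper's weighting scheme operates on rough numbers, but the underlying entropy idea can be illustrated on crisp scores. The sketch below (scores, function names and the 3×3 example are illustrative, not from the paper) assigns larger weights to criteria whose scores diverge more across alternatives, since those criteria carry more information:

```python
import math

def entropy_weights(matrix):
    """Shannon-entropy weights for evaluation criteria.

    matrix[i][j] is the (crisp) score of alternative i on criterion j.
    A criterion whose scores vary more across alternatives carries more
    information and therefore receives a larger weight.
    """
    m, n = len(matrix), len(matrix[0])
    weights = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        p = [v / total for v in col]
        # Shannon entropy, normalised to [0, 1] by dividing by log(m)
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        weights.append(1 - e)  # degree of divergence
    s = sum(weights)
    return [w / s for w in weights]  # normalise so weights sum to 1

# three candidate concepts scored on three criteria (illustrative values)
scores = [[7, 9, 6],
          [8, 6, 7],
          [6, 8, 9]]
w = entropy_weights(scores)
```

In the full method these weights would then multiply the rough-number performance values to produce the composite ranking.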
Abstract:
Recommender systems (RS) are used by many social networking applications and online e-commerce services. Collaborative filtering (CF) is one of the most popular approaches used for RS. However, the traditional CF approach suffers from sparsity and cold-start problems. In this paper, we propose a hybrid recommendation model to address the cold-start problem, which explores item content features learned by a deep neural network and applies them to the timeSVD++ CF model. Extensive experiments are run on a large Netflix movie-rating dataset. The results show that the proposed hybrid model provides good predictions for cold-start items and outperforms four existing recommendation models on ratings of non-cold-start items.
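The abstract's hybrid model feeds deep-learned item features into timeSVD++; as a much simpler stand-in for how content features help with cold-start items, the sketch below (pure Python, hypothetical names and toy feature vectors) predicts a rating for an unrated item as a content-similarity-weighted average of a user's ratings on known items:

```python
import math

def cosine(a, b):
    """Cosine similarity between two content feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def predict_cold_item(user_ratings, item_features, cold_item):
    """Rating estimate for an item with no rating history, weighted by
    content similarity to the items the user has already rated.

    user_ratings  : {item_id: rating} for one user
    item_features : {item_id: content feature vector}
    """
    num = den = 0.0
    for item, rating in user_ratings.items():
        sim = cosine(item_features[item], item_features[cold_item])
        num += sim * rating
        den += abs(sim)
    return num / den if den else 0.0

# "new" has no ratings yet, but its features resemble "m1", which the user liked
features = {"m1": [1.0, 0.0], "m2": [0.0, 1.0], "new": [0.9, 0.1]}
rating = predict_cold_item({"m1": 5.0, "m2": 1.0}, features, "new")
```

The actual paper replaces this weighted average with latent factors in timeSVD++ initialised from the learned features, which also captures temporal rating dynamics.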
Abstract:
Recent developments in service-oriented and distributed computing have created exciting opportunities for the integration of models in service chains to create the Model Web. This offers the potential for orchestrating web data and processing services in complex chains; a flexible approach which exploits the increased access to products and tools, and the scalability offered by the Web. However, the uncertainty inherent in data and models must be quantified and communicated in an interoperable way in order for its effects to be assessed as errors propagate through complex automated model chains. We describe a proposed set of tools for handling, characterizing and communicating uncertainty in this context, and show how they can be used to 'uncertainty-enable' Web Services in a model chain. An example implementation is presented which combines environmental and publicly contributed data to produce estimates of sea-level air pressure, with uncertainty estimates that incorporate the effects of model approximation as well as the uncertainty inherent in the observational and derived data.
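The abstract leaves the propagation mechanism unspecified; one common way to propagate input uncertainty through a chain of processing services is plain Monte Carlo sampling. The sketch below (the two-step chain and all numbers are illustrative, not from the paper) pushes samples of an uncertain input through chained model steps and summarises the output distribution:

```python
import random
import statistics

random.seed(1)  # fixed seed so the sketch is reproducible

def propagate(chain, mean, sd, n=20000):
    """Monte Carlo propagation of a Gaussian input through a model chain.

    chain : list of callables applied in order, one per service in the chain
    Returns the mean and standard deviation of the chain's output.
    """
    outputs = []
    for _ in range(n):
        x = random.gauss(mean, sd)  # draw one realisation of the input
        for step in chain:
            x = step(x)             # pass it through each service in turn
        outputs.append(x)
    return statistics.mean(outputs), statistics.stdev(outputs)

# toy chain: unit conversion followed by a linear calibration model
chain = [lambda t: t + 273.15, lambda t: 0.98 * t + 1.2]
m, s = propagate(chain, mean=15.0, sd=0.5)
```

For linear chains like this one, analytic error propagation gives the same answer more cheaply; sampling earns its keep when the chained services are nonlinear black boxes, which is the situation the Model Web tools target.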
Abstract:
Purpose: The purpose of this paper is to investigate the use of the 802.11e MAC to resolve transmission control protocol (TCP) unfairness.
Design/methodology/approach: The paper shows how a TCP sender may adapt its transmission rate using the number of hops and the standard deviation of recently measured round-trip times (RTTs) to address TCP unfairness.
Findings: Simulation results show that the proposed techniques provide even throughput, maintaining TCP fairness as the number of hops increases over a wireless mesh network (WMN).
Research limitations/implications: Future work will examine the performance of TCP over routing protocols which use different routing metrics, as well as scalability over WMNs. Since scalability is a problem with multi-hop communication, carrier sense multiple access (CSMA) will be compared with time division multiple access (TDMA), and a hybrid of TDMA and code division multiple access (CDMA) will be designed to work with TCP and other traffic. Finally, to further improve network performance and increase the TCP capacity of WMNs, the use of multiple channels instead of a single fixed channel will be exploited.
Practical implications: By allowing the tuning of 802.11e MAC parameters that were constant in the 802.11 MAC, the paper proposes using the 802.11e MAC on a per-class basis, collecting TCP ACKs into a single class, together with a novel congestion control method for TCP over a WMN. The key feature of the proposed TCP algorithm is the detection of congestion by measuring the fluctuation of the RTTs of TCP ACK samples via their standard deviation; combined with the 802.11e AIFS and CWmin parameters, this allows TCP ACKs to be prioritised so that they keep pace with the volume of TCP data packets. While the 802.11e MAC provides flexibility and flow/congestion control mechanisms, the challenge is to take advantage of these features.
Originality/value: The 802.11 MAC lacks flexibility and flow/congestion control mechanisms when used with TCP, which contributes to TCP unfairness among competing flows. © Emerald Group Publishing Limited.
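The congestion detector described above watches the standard deviation of RTT samples taken from TCP ACKs. The fragment below sketches only that idea (the class name, window size and threshold rule are invented for illustration; the paper's actual detection rule may differ):

```python
import statistics
from collections import deque

class RttFluctuationDetector:
    """Flags congestion when recent TCP ACK RTT samples fluctuate strongly.

    window    : number of recent RTT samples kept
    threshold : relative fluctuation (stdev / mean) treated as congestion;
                both are hypothetical tuning knobs, not values from the paper
    """
    def __init__(self, window=8, threshold=0.2):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def congested(self, rtt_ms):
        self.samples.append(rtt_ms)
        if len(self.samples) < self.samples.maxlen:
            return False  # not enough history to judge fluctuation yet
        mean = statistics.mean(self.samples)
        sd = statistics.stdev(self.samples)
        # strongly fluctuating RTTs suggest queues building and draining
        return sd / mean > self.threshold
```

On detection, the sender would reduce its transmission rate, scaled by the hop count as the Design/methodology section indicates.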
Abstract:
This paper presents a statistical comparison of regional phonetic and lexical variation in American English. Both the phonetic and lexical datasets were first subjected to separate multivariate spatial analyses in order to identify the most common dimensions of spatial clustering in these two datasets. The dimensions of phonetic and lexical variation extracted by these two analyses were then correlated with each other, after being interpolated over a shared set of reference locations, in order to measure the similarity of regional phonetic and lexical variation in American English. This analysis shows that regional phonetic and lexical variation are remarkably similar in Modern American English.
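The correlation step described above is simple once both datasets share reference locations: a phonetic dimension and a lexical dimension become two equal-length vectors whose similarity is a plain correlation. A minimal Python version (the variable names and five-location example are illustrative):

```python
import math

def pearson(x, y):
    """Pearson correlation of two dimensions sampled at shared locations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# one phonetic and one lexical dimension at five shared reference locations
phonetic_dim = [0.9, 0.4, -0.2, -0.7, -1.0]
lexical_dim  = [1.1, 0.3, -0.1, -0.8, -0.9]
r = pearson(phonetic_dim, lexical_dim)
```

A value of r near 1 at this step is what underlies the paper's conclusion that regional phonetic and lexical variation pattern alike.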
Abstract:
In order to address problems of information overload in digital imagery task domains we have developed an interactive approach to the capture and reuse of image context information. Our framework models different aspects of the relationship between images and domain tasks they support by monitoring the interactive manipulation and annotation of task-relevant imagery. The approach allows us to gauge a measure of a user's intentions as they complete goal-directed image tasks. As users analyze retrieved imagery their interactions are captured and an expert task context is dynamically constructed. This human expertise, proficiency, and knowledge can then be leveraged to support other users in carrying out similar domain tasks. We have applied our techniques to two multimedia retrieval applications for two different image domains, namely the geo-spatial and medical imagery domains. © Springer-Verlag Berlin Heidelberg 2007.
Abstract:
Adaptability for distributed object-oriented enterprise frameworks in multimedia technology is critical for system evolution. Today, building adaptive services is a complex task due to the lack of adequate framework support in distributed computing systems. In this paper, we propose a Metalevel Component-Based Framework which uses distributed computing design patterns as components to develop an adaptable pattern-oriented framework for distributed computing applications. We describe our approach of combining a meta-architecture with a pattern-oriented framework, resulting in an adaptable framework which provides a mechanism to facilitate system evolution. This approach resolves the problem of dynamic adaptation in the framework, which is encountered in most distributed multimedia applications. The proposed architecture of the pattern-oriented framework can dynamically adopt new design patterns to address issues in the domain of distributed computing, and these patterns can be woven together to shape the framework in the future. © 2011 Springer Science+Business Media B.V.
Abstract:
Aircraft manufacturing industries are looking for solutions to increase their productivity. One solution is to apply metrology systems during the production and assembly processes. The Metrology Process Model (MPM) (Maropoulos et al., 2007) has been introduced, which links metrology applications with assembly planning, manufacturing processes and product design. Measurability analysis is part of the MPM, and its aim is to check the feasibility of measuring the designed large-scale components. Measurability analysis has been integrated in order to provide an efficient matching system. A metrology database is structured by developing a Metrology Classification Model, and a feature-based selection model is also explained. By combining the two classification models, a novel approach and selection process for an integrated measurability analysis system (MAS) are introduced; such an integrated MAS can provide much more meaningful matching results for operators. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
Metrology processes used in the manufacture of large products include tool setting, product verification and flexible metrology enabled automation. The range of applications and instruments available makes the selection of the appropriate instrument for a given task highly complex. Since metrology is a key manufacturing process it should be considered in the early stages of design. This paper provides an overview of the important selection criteria for typical measurement processes and presents some novel selection strategies. Metrics which can be used to assess measurability are also discussed. A prototype instrument selection and measurability analysis application is presented with discussion of how this can be used as the basis for development of a more sophisticated measurement planning tool. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
Laser trackers have been widely used in many industries to meet increasingly high accuracy requirements. In laser tracker measurement, it is complex and difficult to perform an accurate error analysis and uncertainty evaluation. This paper first reviews the working principle of single-beam laser trackers and the state of the art of the key technologies, drawing on both industrial and academic efforts, followed by a comprehensive analysis of uncertainty sources. A generic laser tracker modelling method is formulated and the framework of the virtual laser tracking system (VLS) is proposed. The VLS can be used for measurement planning, measurement accuracy optimization and uncertainty evaluation. The completed VLS should take all the uncertainty sources affecting coordinate measurement into consideration and establish an uncertainty model which behaves identically to the real system. © Springer-Verlag Berlin Heidelberg 2010.
Abstract:
A novel direct integration technique for the Manakov-PMD equation for the simulation of polarisation mode dispersion (PMD) in optical communication systems is demonstrated and shown to be numerically as efficient as the commonly used coarse-step method. The main advantage of direct integration of the Manakov-PMD equation over the coarse-step method is the higher accuracy of the PMD model. The new algorithm uses precomputed M(ω) matrices to increase the computational speed compared to a full integration without loss of accuracy. The simulation results for the probability density function (PDF) of the differential group delay (DGD) and the autocorrelation function (ACF) of the polarisation dispersion vector for varying numbers of precomputed M(ω) matrices are compared to analytical models and to results from the coarse-step method. It is shown that the coarse-step method reproduces the statistical properties of PMD in optical fibres significantly less accurately than direct integration of the Manakov-PMD equation.