877 results for user data


Relevance:

30.00%

Publisher:

Abstract:

Cell transition data is obtained from a cellular phone that switches its current serving cell tower. The data consists of a sequence of transition events, which are pairs of cell identifiers and transition times. The focus of this thesis is applying data mining methods to such data, developing new algorithms, and extracting knowledge that will be a solid foundation on which to build location-aware applications. In addition to a thorough exploration of the features of the data, the tools and methods developed in this thesis provide solutions to three distinct research problems. First, we develop clustering algorithms that produce a reliable mapping between cell transitions and physical locations observed by users of mobile devices. The main clustering algorithm operates in an online fashion, and we also consider a number of offline clustering methods for comparison. Second, we define the concept of significant locations, known as bases, and give an online algorithm for determining them. Finally, we consider the task of predicting the movement of the user based on historical data. We develop a prediction algorithm that considers paths of movement in their entirety, instead of just the most recent movement history. All of the presented methods are evaluated with a significant body of real cell transition data, collected from about one hundred different individuals. The algorithms developed in this thesis are designed to be implemented on a mobile device, and require no extra hardware sensors or network infrastructure. By not relying on external services and keeping the user information as much as possible on the user's own personal device, we avoid privacy issues and let the users control the disclosure of their location information.
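
As a rough illustration of the kind of data involved, the sketch below represents cell transition events and flags cells the phone camps on for a long time as candidate significant locations. The event type, the dwell threshold, and the single-cell rule are illustrative assumptions, not the thesis's actual clustering or base-detection algorithms, which must handle oscillation between neighbouring towers.

```python
from dataclasses import dataclass

@dataclass
class CellTransition:
    """One transition event: the new serving cell and the time of the switch."""
    cell_id: str
    timestamp: float  # seconds since some epoch

def find_stays(transitions, min_dwell=30 * 60):
    """Return (cell_id, arrival_time, dwell_seconds) for cells the phone camped
    on for at least min_dwell seconds.

    A toy stand-in for the online clustering described above: real cell data
    oscillates between neighbouring towers, so the actual algorithms cluster
    sets of cells rather than single cells. The 30-minute threshold is an
    arbitrary placeholder."""
    stays = []
    for current, nxt in zip(transitions, transitions[1:]):
        dwell = nxt.timestamp - current.timestamp
        if dwell >= min_dwell:
            stays.append((current.cell_id, current.timestamp, dwell))
    return stays
```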

Relevance:

30.00%

Publisher:

Abstract:

Network data packet capture and replay capabilities are basic requirements for forensic analysis of faults and security-related anomalies, as well as for testing and development. Cyber-physical networks, in which data packets are used to monitor and control physical devices, must operate within strict timing constraints in order to match the hardware devices' characteristics. Standard network monitoring tools are unsuitable for such systems because they cannot guarantee to capture all data packets, may introduce their own traffic into the network, and cannot reliably reproduce the original timing of data packets. Here we present a high-speed network forensics tool specifically designed for capturing and replaying data traffic in Supervisory Control and Data Acquisition (SCADA) systems. Unlike general-purpose "packet capture" tools, it does not affect the observed network's data traffic and guarantees that the original packet ordering is preserved. Most importantly, it allows replay of network traffic precisely matching its original timing. The tool was implemented by developing novel user interface and back-end software for a special-purpose network interface card. Experimental results show a clear improvement in data capture and replay capabilities over standard network monitoring methods and general-purpose forensics solutions.
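
The core timing requirement can be illustrated with a short user-space sketch that replays captured (timestamp, payload) pairs while preserving the original inter-packet gaps. It only approximates the idea: the tool described above relies on a special-purpose network interface card to achieve guarantees an ordinary OS timer cannot, and the UDP destination below is a placeholder.

```python
import socket
import time

def replay(packets, dest=("192.0.2.10", 2404)):
    """Replay captured (timestamp_seconds, payload_bytes) pairs over UDP,
    reproducing the original inter-packet spacing as closely as time.sleep allows."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.monotonic()
    first_ts = packets[0][0]
    for ts, payload in packets:
        # Wait until this packet's original offset from the first packet has elapsed.
        delay = (ts - first_ts) - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        sock.sendto(payload, dest)
```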

Relevance:

30.00%

Publisher:

Abstract:

Recommender systems assist users in finding what they want. The challenging issue is how to efficiently acquire user preferences or user information needs for building personalized recommender systems. This research explores the acquisition of user preferences using data taxonomy information to enhance personalized recommendations and alleviate the cold-start problem. A concept hierarchy model is proposed, which provides a two-dimensional hierarchy for acquiring user preferences. The language model is also extended for the proposed hierarchy in order to generate an effective recommender algorithm. Both Amazon.com book and music datasets are used to evaluate the proposed approach, and the experimental results show that the proposed approach is promising.
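
A minimal sketch of the underlying idea, taxonomy-based preference acquisition, follows. It aggregates a user's few known likes into concept-level weights and scores unseen items by shared concepts, which is one way to say something useful about a cold-start user. The weighting rule and function names are assumptions for illustration, not the paper's concept hierarchy model or extended language model.

```python
from collections import defaultdict

def build_profile(liked_items, taxonomy):
    """Aggregate liked items into concept-level weights.

    taxonomy maps an item to its concept path from general to specific,
    e.g. ["Books", "Science", "Physics"]. More specific concepts receive more
    weight; the exact weighting here is an illustrative choice."""
    profile = defaultdict(float)
    for item in liked_items:
        path = taxonomy.get(item, [])
        for depth, concept in enumerate(path, start=1):
            profile[concept] += depth / len(path)
    return profile

def score(candidate, taxonomy, profile):
    """Score an unseen item by the concept weights it shares with the profile."""
    return sum(profile.get(c, 0.0) for c in taxonomy.get(candidate, []))
```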

Relevance:

30.00%

Publisher:

Abstract:

E-government provides a platform for governments to implement web-enabled services that facilitate communication between citizens and the government. However, a technology-driven design approach and a limited understanding of citizens' requirements have led to a number of critical usability problems on government websites. Hitherto, there has been no systematic attempt to analyse the way in which the theory of User-Centred Design (UCD) can contribute to addressing the usability issues of government websites. This research seeks to fill this gap by synthesising perspectives drawn from the study of User-Centred Design and examining them against empirical data derived from a case study of the Scottish Executive website. The research employs a qualitative approach in the collection and analysis of data. The triangulated analysis of the findings reveals that e-government web designers take a commercial development approach and focus only on technical implementation, which leads to websites that do not meet citizens' expectations. The research identifies that e-government practitioners can overcome web usability issues by transferring the theory of UCD into practice.

Relevance:

30.00%

Publisher:

Abstract:

E-government provides a platform for governments to implement web-enabled services that facilitate communication between citizens and the government. However, a technology-driven design approach and a limited understanding of citizens' requirements have led to a number of critical usability problems on government websites. Hitherto, there has been no systematic attempt to analyse the way in which the theory of User-Centred Design (UCD) can contribute to addressing the usability issues of government websites. This research seeks to fill this gap by synthesising perspectives drawn from the study of UCD and examining them against empirical data derived from a case study of the Scottish Executive (SE) website. The research employs a qualitative approach in the collection and analysis of data. The triangulated analysis of the findings reveals that e-government web designers take a commercial development approach and focus only on technical implementation, which leads to websites that do not meet citizens' expectations. The research identifies that e-government practitioners can overcome web usability issues by transferring the theory of UCD into practice. © Copyright 2010 Inderscience Enterprises Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Extensible Markup Language (XML) has emerged as a medium for interoperability over the Internet. As the number of documents published in the form of XML is increasing, there is a need for selective dissemination of XML documents based on user interests. In the proposed technique, a combination of Adaptive Genetic Algorithms and a multi-class Support Vector Machine (SVM) is used to learn a user model. Based on feedback from the users, the system automatically adapts to the user's preferences and interests. The user model and a similarity metric are used for selective dissemination of a continuous stream of XML documents. Experimental evaluations performed over a wide range of XML documents indicate that the proposed approach significantly improves the performance of the selective dissemination task with respect to accuracy and efficiency.
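
A simplified stand-in for the user-model part of such a pipeline is sketched below: XML documents are flattened to text, a multi-class SVM is trained on the user's labelled feedback, and a streamed document is forwarded only if it falls into a category the user cares about. The adaptive genetic algorithm and the similarity metric from the paper are omitted, and the scikit-learn classifier is an assumption of convenience.

```python
from xml.etree import ElementTree
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

def xml_to_text(xml_string):
    """Flatten an XML document into its text content for feature extraction."""
    root = ElementTree.fromstring(xml_string)
    return " ".join(t.strip() for t in root.itertext() if t.strip())

def train_user_model(feedback_docs, labels):
    """Train a simple multi-class user model from past documents and the
    interest category the user assigned to each."""
    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(xml_to_text(d) for d in feedback_docs)
    model = LinearSVC().fit(X, labels)
    return vectorizer, model

def should_disseminate(doc, vectorizer, model, interesting_labels):
    """Forward a streamed document only if it maps to a category of interest."""
    predicted = model.predict(vectorizer.transform([xml_to_text(doc)]))[0]
    return predicted in interesting_labels
```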

Relevance:

30.00%

Publisher:

Abstract:

The Australasian Society for Computers in Learning in Tertiary Education (ascilite) has recently completed research to inform development of the ALTC Exchange, a new online service for learning and teaching in Australia. The research investigated resource identification and contribution, engagement with the repository and user community, and associated peer review and commentary processes. This article focuses on the data obtained and the recommendations developed for engaging potential end users. It reports a literature review and findings, including an international perspective on the ALTC Exchange, with specific focus on prospective user needs, contexts of use, and the policies necessary to facilitate engagement of the higher education sector with the ALTC Exchange.

Relevance:

30.00%

Publisher:

Abstract:

This article reports on a cross-sectional case study of a large construction project in which electronic document management (EDM) was used. Attitudes towards EDM from the perspective of individual end users were investigated. Responses from a survey were combined with data from system usage log files to obtain an overview of the attitudes prevalent in different user segments of the total population of 334 users. The survey was followed by semi-structured interviews with representative users. A strong majority of users from all segments of the project group considered EDM a valuable aid in their work processes, despite certain functional limitations of the system used and the complexity of the information mass. Based on the study, a model describing the key factors affecting end-user EDM adoption is proposed. The model draws on insights from earlier studies of EDM-enabled projects and theoretical frameworks on technology acceptance and the success of information systems, as well as the insights gained from the case study.

Relevance:

30.00%

Publisher:

Abstract:

We introduce a variation density function that profiles the relationship between multiple scalar fields over isosurfaces of a given scalar field. This profile serves as a valuable tool for multifield data exploration because it provides the user with cues to identify interesting isovalues of scalar fields. Existing isosurface-based techniques for scalar data exploration, such as Reeb graphs, contour spectra, and isosurface statistics, study a scalar field in isolation. We argue that the identification of interesting isovalues in a multifield data set should necessarily be based on the interaction between the different fields. We demonstrate the effectiveness of our approach by applying it to explore data from a wide variety of applications.
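
One way to build such a profile for a single isovalue is sketched below: approximate the isosurface of the first field with marching cubes and histogram a second field sampled at the surface vertices. This is a simplified reading of the idea, not the paper's variation density function, and the scikit-image and SciPy calls are assumptions of convenience.

```python
import numpy as np
from scipy.ndimage import map_coordinates
from skimage.measure import marching_cubes

def profile_on_isosurface(f, g, isovalue, bins=64):
    """Density histogram of a second scalar field g over the isosurface f = isovalue.

    f and g are 3D numpy arrays sampled on the same grid. The isosurface is
    approximated by marching-cubes vertices, and g is sampled there by
    trilinear interpolation, so the histogram profiles how g varies across
    that level set of f."""
    verts, faces, normals, values = marching_cubes(f, level=isovalue)
    g_on_surface = map_coordinates(g, verts.T, order=1)  # sample g at vertex positions
    hist, edges = np.histogram(g_on_surface, bins=bins, density=True)
    return hist, edges
```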

Relevance:

30.00%

Publisher:

Abstract:

This paper describes an algorithm for constructing a solid model (boundary representation) from point data measured on the faces of an object. The point data is assumed to be clustered for each face. The algorithm does not require any computer model of the part to exist and does not require any topological information about the part to be input by the user. The property that a convex solid can be constructed uniquely from geometric input alone is utilized in the current work. Any object can be represented as a combination of convex solids. The proposed algorithm attempts to construct convex polyhedra from the given input. The polyhedra so obtained are then checked against the input data for containment, and those polyhedra that satisfy this check are combined (using the Boolean union operation) to realise the solid model. Results of implementation are presented.
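
The per-piece step, recovering a convex polyhedron purely from measured points, can be sketched with a standard convex hull routine. The grouping of face clusters into pieces, the containment check, and the final Boolean union are assumed to happen elsewhere; this is only an illustration of the convexity property the algorithm exploits, not the paper's method.

```python
import numpy as np
from scipy.spatial import ConvexHull

def convex_pieces(face_clusters):
    """Build one convex polyhedron per group of per-face point clusters.

    face_clusters is a list of groups, each group being a list of (N_i, 3)
    arrays of points measured on the faces of one convex piece of the object.
    Each piece is recovered as the convex hull of all its face points, relying
    on the fact that a convex solid is determined by geometric input alone."""
    hulls = []
    for group in face_clusters:
        points = np.vstack(group)
        hulls.append(ConvexHull(points))  # hull.vertices / hull.simplices give the facets
    return hulls
```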

Relevance:

30.00%

Publisher:

Abstract:

We develop an optimal, distributed, and low-feedback timer-based selection scheme to enable next-generation rate-adaptive wireless systems to exploit multi-user diversity. In our scheme, each user sets a timer depending on its signal-to-noise ratio (SNR) and transmits a small packet to identify itself when its timer expires. When the SNR-to-timer mapping is monotone non-increasing, timers of users with better SNRs expire earlier. Thus, the base station (BS) simply selects the first user whose timer expiry it can detect, and transmits data to it at as high a rate as it reliably can. However, timers that expire too close to one another cannot be detected by the BS due to collisions. We characterize in detail the structure of the SNR-to-timer mapping that optimally handles these collisions to maximize the average data rate. We prove that the optimal timer values take only a discrete set of values, and that the rate adaptation policy strongly influences the optimal scheme's structure. The optimal average rate is very close to that of ideal selection, in which the BS always selects the highest-rate user, and is much higher than that of the popular, but ad hoc, timer schemes considered in the literature.
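
The sketch below simulates one round of such a timer scheme: SNRs are mapped to timer values with a monotone non-increasing function, and the base station picks the earliest expiry that does not collide with another within a small vulnerability window. The mapping, the window value, and the collision rule are simplifying assumptions for illustration; the paper derives the optimal discrete timer mapping rather than assuming one.

```python
import numpy as np

def select_user(snrs, snr_to_timer, vulnerability_window=5e-6):
    """Simulate one selection round and return the index of the selected user,
    or None if every expiry the BS could hear collides with another."""
    timers = np.array([snr_to_timer(s) for s in snrs])
    order = np.argsort(timers)  # earliest expiry first
    for k, idx in enumerate(order):
        prev_gap = timers[idx] - timers[order[k - 1]] if k > 0 else np.inf
        next_gap = timers[order[k + 1]] - timers[idx] if k + 1 < len(order) else np.inf
        if prev_gap > vulnerability_window and next_gap > vulnerability_window:
            return idx  # first timer expiry the BS can cleanly detect
    return None

# Example with a simple (suboptimal) monotone non-increasing mapping:
# selected = select_user(np.random.exponential(1.0, 10), lambda snr: 1e-3 * np.exp(-snr))
```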

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a new method of data handling for web servers, which we call Network Aware Buffering and Caching (NABC for short). NABC reduces data copies in the web server's data-sending path by doing three things: (1) laying out the data in main memory so that protocol processing can be done without data copies, (2) keeping a unified cache of the data in the kernel and ensuring safe access to it by the various processes and the kernel, and (3) passing only the necessary metadata between processes so that the bulk data handling time spent during IPC is reduced. We realize NABC by implementing a set of system calls and a user library. The end product of the implementation is a set of APIs specifically designed for use by web servers. We port an in-house web server called SWEET to the NABC APIs and evaluate its performance using a range of workloads, both simulated and real. The results show a gain of 12% to 21% in throughput for static file serving and a 1.6 to 4 times gain in throughput for lightweight dynamic content serving for a server using the NABC APIs over one using the UNIX APIs.
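
The copy-reduction motivation can be illustrated with standard POSIX facilities: os.sendfile moves file data from the kernel page cache to the socket without an intermediate user-space copy. This is only an analogy to one of NABC's goals, not the NABC API; the NABC system calls, unified kernel cache, and metadata-only IPC described above have no counterpart in stock Python or UNIX.

```python
import os

def serve_file(conn, path):
    """Send a static file over an accepted TCP connection using os.sendfile,
    avoiding the extra user-space copy that a read()/send() loop would incur."""
    size = os.path.getsize(path)
    conn.sendall(f"HTTP/1.0 200 OK\r\nContent-Length: {size}\r\n\r\n".encode())
    with open(path, "rb") as f:
        offset = 0
        while offset < size:
            # The kernel copies straight from the page cache to the socket buffer.
            sent = os.sendfile(conn.fileno(), f.fileno(), offset, size - offset)
            if sent == 0:
                break
            offset += sent
```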

Relevance:

30.00%

Publisher:

Abstract:

A mathematical model has been developed for the gas carburising (diffusion) process using the finite volume method. A computer simulation has been carried out for an industrial gas carburising process. The model's predictions are in good agreement with industrial experimental data and with data collected from the literature. A study of various mass transfer and diffusion coefficients has been carried out in order to suggest which correlations should be used for the gas carburising process. The model has been given a graphical user interface in a Windows environment, making it straightforward to use. A sensitivity analysis of parameters such as the initial carbon concentration in the specimen, the carbon potential of the atmosphere, and the temperature of the process has been carried out using the model.
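
The diffusion core of such a model can be sketched as an explicit one-dimensional finite-volume solution of Fick's second law with a mass-transfer boundary condition at the gas/steel interface. All coefficients and dimensions below are illustrative placeholders; the industrial model described above additionally uses the correlations discussed in the text and coefficients that depend on composition and temperature.

```python
import numpy as np

def carburise(c0=0.2, c_potential=1.1, beta=1.25e-7, D=2.0e-11,
              depth=2e-3, n_cells=100, t_total=4 * 3600):
    """Explicit 1D finite-volume carbon diffusion during gas carburising.

    Boundary conditions: surface flux J = beta * (c_potential - c_surface)
    (gas-side mass transfer) and zero flux at the core. Units are wt% C, m, s,
    and all default values are placeholders for illustration only."""
    dx = depth / n_cells
    dt = 0.4 * dx * dx / D                      # explicit stability limit
    c = np.full(n_cells, c0)                    # carbon profile, surface to core
    for _ in range(int(t_total / dt)):
        flux = np.zeros(n_cells + 1)            # fluxes at cell faces
        flux[0] = beta * (c_potential - c[0])   # gas/surface mass transfer
        flux[1:-1] = -D * (c[1:] - c[:-1]) / dx # Fick's first law at interior faces
        flux[-1] = 0.0                          # zero flux at the core
        c += dt / dx * (flux[:-1] - flux[1:])   # finite-volume update
    return c
```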

Relevance:

30.00%

Publisher:

Abstract:

PDB Goodies is a web-based graphical user interface (GUI) for manipulating Protein Data Bank (PDB) files containing the three-dimensional atomic coordinates of protein structures. The program also allows users to save the manipulated three-dimensional atomic coordinate file on their local client system. The resulting fragments are used in various stages of structure elucidation and analysis. The software covers all the three-dimensional protein structures available in the Protein Data Bank, which presently holds approximately 18,000 structures. In addition, the program works on a three-dimensional atomic coordinate file (Protein Data Bank format) uploaded from the client machine. The program is written using CGI/Perl scripts and is platform independent. PDB Goodies can be accessed over the World Wide Web at http://144.16.71.11/pdbgoodies/.
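
A flavour of the kind of manipulation such a tool performs is shown below: extracting one chain from a PDB-format file using its fixed-column layout, where the chain identifier occupies column 22. This is a generic illustration in Python, not code from PDB Goodies itself (which is written in CGI/Perl).

```python
def extract_chain(pdb_lines, chain_id):
    """Keep only the ATOM/HETATM records of one chain from a PDB-format file.

    Relies on the fixed-column PDB format: the record name sits in columns 1-6
    and the chain identifier in column 22 (index 21). A full tool would also
    handle header, SEQRES, TER and other records."""
    keep = []
    for line in pdb_lines:
        record = line[:6].strip()
        if record in ("ATOM", "HETATM") and len(line) > 21 and line[21] == chain_id:
            keep.append(line)
    keep.append("END\n")
    return keep

# Usage sketch (file names are placeholders):
# with open("1abc.pdb") as f:
#     fragment = extract_chain(f.readlines(), "A")
# with open("1abc_chainA.pdb", "w") as out:
#     out.writelines(fragment)
```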