803 results for User Interface (UI) Software-as-a-Service
Abstract:
In today's information-driven society, many studies explore the usefulness and ease of use of technology, and research into personalizing next-generation user interfaces is ever increasing. A better understanding of the factors that influence users' perception of web search engine performance would contribute to achieving this. This study measures and examines how users' perceived level of prior knowledge and experience influences their perceived satisfaction with using web search engines, and how that perceived satisfaction affects their intention to reuse the system. Fifty participants from an Australian university took part in the study, performing three search tasks and completing survey questionnaires. A research model was constructed to test the proposed hypotheses. Correlation and regression analyses indicated a significant relationship between (1) users' prior level of experience and their perceived satisfaction with using the web search engines, and (2) their perceived satisfaction with using the systems and their intention to reuse them. A theoretical model is proposed to illustrate the causal relationships. The implications and limitations of the study are also discussed.
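The two hypothesised links lend themselves to a simple bivariate check. The sketch below is illustrative only: the Likert-style scores and variable names are invented, not the study's data.

```python
# Hypothetical sketch of the correlation/regression analyses described above.
# The 7-point Likert scores below are invented for illustration.
from scipy.stats import pearsonr, linregress

prior_experience = [5, 3, 6, 4, 7, 2, 5]   # perceived prior experience
satisfaction     = [6, 3, 6, 5, 7, 3, 4]   # perceived satisfaction
reuse_intention  = [6, 4, 7, 5, 7, 2, 5]   # intention to reuse

# (1) prior experience vs. satisfaction
r, p = pearsonr(prior_experience, satisfaction)
print(f"r(experience, satisfaction) = {r:.2f} (p = {p:.3f})")

# (2) satisfaction predicting intention to reuse
fit = linregress(satisfaction, reuse_intention)
print(f"reuse = {fit.slope:.2f} * satisfaction + {fit.intercept:.2f}, "
      f"R^2 = {fit.rvalue**2:.2f}")
```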
Abstract:
INEX investigates focused retrieval from structured documents by providing large test collections of structured documents, uniform evaluation measures, and a forum for organizations to compare their results. This paper reports on the INEX 2014 evaluation campaign, which consisted of three tracks. The Interactive Social Book Search Track investigated user information-seeking behavior when interacting with various sources of information for realistic task scenarios, and how the user interface affects search and the search experience. The Social Book Search Track investigated the relative value of authoritative metadata and user-generated content for search and recommendation, using a test collection with data from Amazon and LibraryThing, including user profiles and personal catalogues. The Tweet Contextualization Track investigated helping a user understand a tweet by providing a short background summary generated from relevant Wikipedia passages aggregated into a coherent whole. INEX 2014 was an exciting year for INEX, in which we ran our workshop as part of the CLEF labs for the third time. This paper gives an overview of all the INEX 2014 tracks, their aims and tasks, the test collections built, and the participants, and provides an initial analysis of the results.
Abstract:
This paper uses transaction cost theory to study cloud computing adoption. A model is developed and tested with data from an Australian survey. According to the results, perceived vendor opportunism and perceived legislative uncertainty around cloud computing were significantly associated with perceived cloud computing security risk. There was also a significant negative relationship between perceived cloud computing security risk and the intention to adopt cloud services. This study also reports on adoption rates of cloud computing in terms of applications, as well as the types of services used.
Abstract:
Purpose: To estimate the refractive indices used by the Lenstar biometer. Methods: Axial lengths of model eyes were determined using an IOLMaster biometer and a Lenstar; comparing these lengths gave an overall eye index for the Lenstar. Using the Lenstar Graphical User Interface, we found that boundaries between media could be manipulated to produce opposite changes in optical path length on either side of the boundary; specified changes in distance then determined the ratios of media indices. These ratios were combined with the overall eye index to estimate individual indices. Results: The IOLMaster and Lenstar produced axial length estimates agreeing to within ±0.01 mm. Estimates of group refractive indices were 1.340, 1.341, 1.415 and 1.354 for cornea, aqueous, lens and overall eye, respectively. The aqueous and lens indices, but not that for the cornea, are similar to schematic eye indices and to reasonable lens indices. Conclusion: The Lenstar appears to use different refractive indices for different ocular media.
Abstract:
The aim of this ethnographic study was to understand welding practices in shipyard environments, with the purpose of designing an interactive welding robot that can help workers with their daily work. The robot is meant to be deployed for automatic welding on jack-up rig structures. Designing the robot turns out to be challenging due to several problematic working conditions on the shipyard, such as dust, irregular floors, high temperatures, wind variations, elevated working platforms, narrow spaces, and circular welding paths requiring a robotic arm with more than six degrees of freedom. Additionally, the environment is very noisy and the workers, mostly foreigners, have a very basic level of English. These two issues need to be taken into account when designing the robot's interactive user interface; ideally, the communication flow between the two parties should be as frictionless as possible. The paper presents the results of our field observations and interviews with welders, as well as our robot design recommendations for the next project stage.
Abstract:
For people with intellectual disabilities, there are significant barriers to inclusion in socially cooperative endeavors. This paper investigates the effectiveness of Stomp, a tangible user interface (TUI) designed to provide new participatory experiences for people with intellectual disability. Results from an observational study reveal the extent to which the Stomp system supports social and physical interaction. The tangible, spatial, and embodied qualities of Stomp result in an experience that does not rely on the acquisition of specific competencies before interaction and engagement can occur.
Abstract:
This paper addresses two common problems that users of various products and interfaces encounter: over-featured interfaces and product documentation. Over-featured interfaces are a problem because they can confuse and over-complicate everyday interactions. Researchers also often claim that users do not read product documentation, although users are often exhorted to 'RTFM' (read the field manual). We conducted two sets of studies with users examining the issues of both manuals and excess features in common domestic and personal products. The quantitative set was a series of questionnaires administered to 170 people over 7 years. The qualitative set consisted of two 6-month longitudinal studies based on diaries and interviews with a total of 15 participants. We found that the majority of people do not read manuals, and most do not use all the features of the products that they own and use regularly. Men are more likely to do both than women, and younger people are less likely to use manuals than middle-aged and older ones. More educated people are also less likely to read manuals. Over-featuring and being forced to consult manuals also appear to cause negative emotional experiences. Implications of these findings are discussed.
Abstract:
PURPOSE To estimate the refractive indices used by the Lenstar biometer to translate measured optical path lengths into geometrical path lengths within the eye. METHODS Axial lengths of model eyes were determined using the IOLMaster and Lenstar biometers; comparing those lengths gave an overall eye refractive index estimate for the Lenstar. Using the Lenstar Graphical User Interface, we noticed that boundaries between media could be manipulated so that opposite changes in optical path length were introduced on either side of the boundary; the specified changes in distance gave the ratios of the media refractive indices. Those ratios were combined with the overall eye refractive index to estimate separate refractive indices. Furthermore, Haag-Streit provided us with a template to obtain 'air thicknesses' to compare with geometrical distances. RESULTS The axial length estimates obtained using the IOLMaster and the Lenstar agreed to within 0.01 mm. Estimates of the group refractive indices used in the Lenstar were 1.340, 1.341, 1.415, and 1.354 for cornea, aqueous, lens, and overall eye, respectively. Those refractive indices did not match those of schematic eyes, but were close in the cases of the aqueous and lens. Linear equations relating air thicknesses to geometrical thicknesses were consistent with our findings. CONCLUSION The Lenstar uses different refractive indices for different ocular media. Some of the refractive indices, such as that for the cornea, are not physiological; it is therefore likely that the calibrations in the instrument correspond to instrument-specific corrections rather than to the real optical path lengths.
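The conversion at issue follows from the definition of group refractive index: a measured optical path length OPL in a medium of group index n corresponds to a geometrical thickness d = OPL / n. A minimal sketch using the indices reported above; the sample measurement is invented.

```python
# Sketch of the optical-to-geometrical conversion the abstract describes,
# using the group indices estimated in the study. The input value is invented.
GROUP_INDEX = {"cornea": 1.340, "aqueous": 1.341, "lens": 1.415, "eye": 1.354}

def geometrical_thickness(optical_path_mm, medium):
    """Geometrical thickness d = OPL / n_group for the given ocular medium."""
    return optical_path_mm / GROUP_INDEX[medium]

# Example: an illustrative whole-eye optical path length of 32.50 mm.
print(f"axial length = {geometrical_thickness(32.50, 'eye'):.2f} mm")
```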
Abstract:
Adopting a multi-theoretical approach, I examine external auditors’ perceptions of the reasons why organizations do or do not adopt cloud computing. I interview forensic accountants and IT experts about the adoption, acceptance, institutional motives, and risks of cloud computing. Although the medium to large accounting firms where the external auditors worked almost exclusively used private clouds, both private and public cloud services were gaining a foothold among many of their clients. Despite the advantages of cloud computing, data confidentiality and the involvement of foreign jurisdictions remain a concern, particularly if the data are moved outside Australia. Additionally, some organizations seem to understand neither the technology itself nor their own requirements, which may lead to poorly negotiated contracts and service agreements. To minimize the risks associated with cloud computing, many organizations turn to hybrid solutions or private clouds that include national or dedicated data centers. To the best of my knowledge, this is the first empirical study that reports on cloud computing adoption from the perspectives of external auditors.
Abstract:
Gene expression is arguably the most important indicator of biological function. Identifying differentially expressed genes is thus one of the main aims of high-throughput studies that use microarray and RNA-seq platforms to study deregulated cellular pathways. There are many tools for analysing differential gene expression from transcriptomic datasets. The major challenge is estimating gene expression variance, owing to the large amount of 'background noise' generated by biological equipment and the lack of biological replicates. Bayesian inference has been widely used in the bioinformatics field. In this work, we show that the prior knowledge employed in the Bayesian framework also helps to improve the accuracy of differential gene expression analysis when using a small number of replicates. We have developed a differential analysis tool that uses Bayesian estimation of the variance of gene expression for use with small numbers of biological replicates. Our method is more consistent than the widely used Cyber-T tool, which successfully introduced the Bayesian framework to differential analysis. We also provide a user-friendly web-based Graphical User Interface for biologists to use with microarray and RNA-seq data. Bayesian inference can compensate for the instability of variance estimates caused by small numbers of biological replicates by using pseudo replicates as prior knowledge. We also show that our new strategy for selecting pseudo replicates improves the performance of the analysis.
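The abstract does not spell out the estimator, but the Cyber-T line of work it references regularises each gene's sample variance by shrinking it toward a background variance estimated from genes of similar mean expression, which act as pseudo replicates. The sketch below illustrates that idea; the window size and prior weight nu0 are illustrative choices, not the tool's settings.

```python
# Hedged sketch of a Cyber-T-style regularized variance: shrink each gene's
# sample variance toward a background variance estimated from genes with
# similar mean expression ("pseudo replicates"). Parameters are illustrative.
import numpy as np

def regularized_variance(expr, window=101, nu0=10):
    """expr: genes x replicates matrix. Returns a shrunken per-gene variance."""
    n = expr.shape[1]
    mean = expr.mean(axis=1)
    s2 = expr.var(axis=1, ddof=1)
    order = np.argsort(mean)
    # Background variance: running mean of s2 over genes of similar expression.
    s2_sorted = s2[order]
    bg_sorted = np.convolve(s2_sorted, np.ones(window) / window, mode="same")
    bg = np.empty_like(bg_sorted)
    bg[order] = bg_sorted
    # Posterior (shrunken) variance, Baldi & Long (2001) style.
    return (nu0 * bg + (n - 1) * s2) / (nu0 + n - 2)

rng = np.random.default_rng(0)
expr = rng.normal(8.0, 1.0, size=(5000, 3))  # toy: 5000 genes, 3 replicates
print(regularized_variance(expr)[:5])
```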
Abstract:
Portable music players have made it possible to listen to a personal collection of music in almost every situation, and they are often used during some activity to provide a stimulating audio environment. Studies have demonstrated the effects of music on the human body and mind, indicating that selecting music according to the situation can, besides making the situation more enjoyable, also make humans perform better. For example, music can boost performance during physical exercise, alleviate stress and positively affect learning. We believe that people intuitively select different types of music for different situations. Based on this hypothesis, we propose a portable music player, AndroMedia, designed to provide personalised music recommendations using the user's current context and listening habits together with other users' situational listening patterns. We have developed a prototype that consists of a central server and a PDA client. The client uses Bluetooth sensors to acquire context information and logs user interaction to infer implicit user feedback. The user interface also allows the user to give explicit feedback. Large user interface elements facilitate touch-based usage in busy environments. The prototype provides the necessary framework for using the collected information, together with other users' listening history, in a context-enhanced collaborative filtering algorithm to generate context-sensitive recommendations. The current implementation is limited to traditional collaborative filtering algorithms. We outline the techniques required to create context-aware recommendations and present a survey of mobile context-aware music recommenders found in the literature. In contrast to the surveyed systems, AndroMedia utilises other users' listening habits when suggesting tunes, and does not require any laborious setup process.
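A minimal sketch of the contextual pre-filtering idea: restrict the listening log to the current context, then score tracks by what same-context neighbours played. The data, context labels, and scoring rule below are assumptions for illustration, not AndroMedia's algorithm.

```python
# Context-enhanced collaborative filtering via pre-filtering: only listening
# events from the same context contribute to the recommendation. Toy data.
from collections import defaultdict

# (user, context) -> {track: play_count}
log = {
    ("alice", "exercise"): {"t1": 5, "t2": 2},
    ("bob",   "exercise"): {"t1": 4, "t3": 6},
    ("carol", "commute"):  {"t2": 7, "t4": 1},
}

def recommend(user, context, top_n=3):
    target = log.get((user, context), {})
    scores = defaultdict(float)
    for (other, ctx), plays in log.items():
        if other == user or ctx != context:
            continue  # pre-filtering: only same-context neighbours count
        overlap = sum(min(target.get(t, 0), c) for t, c in plays.items())
        for track, count in plays.items():
            if track not in target:
                scores[track] += (1 + overlap) * count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("alice", "exercise"))  # -> ['t3']
```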
Abstract:
Information retrieval of concise and consistent text passages is called passage retrieval. Passages can be used in an information retrieval system to improve its user interface and performance. In this thesis, passage retrieval is compared to other forms of information retrieval, and the implementation of passage retrieval as a feature of an information retrieval system is discussed. Various existing passage retrieval methods, their implementations and their efficiency are compared. I evaluated two different implementations of passage retrieval: direct passage retrieval and combined passage retrieval. In this comparison, combined passage retrieval turned out to be more efficient.
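A minimal sketch of direct passage retrieval under simple assumptions: documents are split into fixed-size overlapping word windows, each passage is scored against the query, and documents are ranked by their best passage. The term-overlap scoring rule stands in for whatever the thesis implemented.

```python
# Direct passage retrieval sketch: fixed-size word windows, best-passage score.
def passages(text, size=50, step=25):
    words = text.split()
    for i in range(0, max(len(words) - size, 0) + 1, step):
        yield " ".join(words[i:i + size])

def best_passage_score(query, text):
    terms = set(query.lower().split())
    return max(
        (sum(w.lower() in terms for w in p.split()) for p in passages(text)),
        default=0,
    )

docs = {"d1": "passage retrieval splits documents into short passages ...",
        "d2": "an unrelated document about something else entirely ..."}
query = "passage retrieval"
ranking = sorted(docs, key=lambda d: best_passage_score(query, docs[d]),
                 reverse=True)
print(ranking)  # -> ['d1', 'd2']
```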
Abstract:
The conventional Cornell source-based approach to probabilistic seismic-hazard assessment (PSHA) has been employed all around the world, and many studies rely on computer packages such as FRISK (McGuire, FRISK: a computer program for seismic risk analysis. Open-File Report 78-1007, United States Geological Survey, Department of the Interior, Washington, 1978) and SEISRISK III (Bender and Perkins, SEISRISK III: a computer program for seismic hazard estimation, Bulletin 1772. United States Geological Survey, Department of the Interior, Washington, 1987). A 'black-box' syndrome may result if the user of such software has no simple and robust PSHA method with which to make comparisons. An alternative method for PSHA, the direct amplitude-based (DAB) approach, has been developed as a heuristic and efficient method enabling users to undertake their own sanity checks on outputs from computer packages. This paper applies the DAB approach to three cities in China, Iran, and India, respectively, and compares the results with documented results computed by the source-based approach. Several insights regarding the procedure for conducting PSHA have also been obtained, which could be useful for future seismic-hazard studies.
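One such sanity check is simple arithmetic under the Poisson occurrence model that underlies Cornell-type PSHA: if a ground-motion level is exceeded at an annual rate lambda, the probability of at least one exceedance in T years is 1 - exp(-lambda * T). A minimal sketch with illustrative numbers:

```python
# Poisson exceedance-probability check common to Cornell-type PSHA.
import math

def prob_exceedance(annual_rate, years):
    """P(at least one exceedance in `years`) under a Poisson occurrence model."""
    return 1.0 - math.exp(-annual_rate * years)

# The conventional "10% in 50 years" design level corresponds to a return
# period of about 475 years (annual rate ~ 1/475).
rate = 1.0 / 475.0
print(f"P(exceedance in 50 yr) = {prob_exceedance(rate, 50):.3f}")  # ~0.100
```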
Abstract:
The formation of silicon carbide in the Acheson process was studied using a mass transfer model developed in this study. The century-old Acheson process is still used for the mass production of silicon carbide. The process uses a heat resistance furnace with sand and petroleum coke as the major raw materials, and it is highly energy intensive. Because no mass transfer model was previously available for this process, one has been developed here to study the mass transfer aspects of the process along with heat transfer. The reaction kinetics of silicon carbide formation has been taken from the literature, and it is shown that the kinetics has an appreciable influence on process efficiency. The effects of various parameters on the process, such as the total gas pressure and the presence of silicon carbide in the initial charge, have been studied. A graphical user interface has also been developed for the Acheson process to make the computer code user friendly.
Abstract:
In this paper, a new parallel algorithm for nonlinear transient dynamic analysis of large structures is presented. The unconditionally stable Newmark-beta method (constant average acceleration technique) is employed for time integration. The proposed parallel algorithm is devised within the broad framework of domain decomposition techniques; however, unlike most existing parallel algorithms for structural dynamic applications, which are derived using non-overlapped domains, the proposed algorithm uses overlapped domains. The parallel overlapped domain decomposition algorithm is formulated by splitting the mass, damping and stiffness matrices arising from the finite element discretisation of a given structure. A predictor-corrector scheme is formulated to iteratively improve the solution in each step. A computer program based on the proposed algorithm has been developed and implemented using the Message Passing Interface (MPI) as the software development environment. A PARAM-10000 MIMD parallel computer has been used to evaluate performance. Numerical experiments have been conducted to validate the algorithm and to evaluate its performance, and comparisons have been made with conventional non-overlapped domain decomposition algorithms. Numerical studies indicate that the proposed algorithm is superior in performance to the conventional domain decomposition algorithms. © 2003 Elsevier Ltd. All rights reserved.
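The time integrator named here is the standard constant-average-acceleration Newmark scheme (beta = 1/4, gamma = 1/2). Below is a serial, linear-system sketch on a toy 2-DOF problem; the paper's parallel predictor-corrector and nonlinear handling are not reproduced.

```python
# Constant-average-acceleration Newmark time stepping for M a + C v + K u = f.
import numpy as np

def newmark(M, C, K, f, u0, v0, dt, steps, beta=0.25, gamma=0.5):
    u, v = u0.copy(), v0.copy()
    a = np.linalg.solve(M, f(0.0) - C @ v - K @ u)
    # Effective stiffness is constant for a linear system.
    K_eff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
    for k in range(1, steps + 1):
        t = k * dt
        rhs = (f(t)
               + M @ (u / (beta * dt**2) + v / (beta * dt)
                      + (0.5 / beta - 1) * a)
               + C @ (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                      + dt * (gamma / (2 * beta) - 1) * a))
        u_new = np.linalg.solve(K_eff, rhs)
        a_new = ((u_new - u) / (beta * dt**2) - v / (beta * dt)
                 - (0.5 / beta - 1) * a)
        v_new = v + dt * ((1 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
    return u, v, a

# Toy 2-DOF system with a sinusoidal load on the second DOF.
M = np.eye(2)
C = 0.02 * np.eye(2)
K = np.array([[4.0, -2.0], [-2.0, 4.0]])
u, v, a = newmark(M, C, K, lambda t: np.array([0.0, np.sin(t)]),
                  np.zeros(2), np.zeros(2), dt=0.01, steps=1000)
print(u)
```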