982 results for Computer techniques
Abstract:
Myoglobin has been studied in considerable detail using different experimental and computational techniques over the past decades. Recent developments in time-resolved spectroscopy have provided experimental data amenable to detailed atomistic simulations. The main theme of the present review is results on the structures, energetics and dynamics of ligands (CO, NO) interacting with myoglobin from computer simulations. Modern computational methods, including free energy simulations, mixed quantum mechanics/molecular mechanics simulations, and reactive molecular dynamics simulations, provide insight into the dynamics of ligands in confined spaces that is complementary to experiment. Applications of these methods to calculate and understand experimental observations for myoglobin interacting with CO and NO are presented and discussed.
Abstract:
This paper discusses how numerical gradient estimation methods may be used to reduce the computational demands of a class of multidimensional clustering algorithms. The study is motivated by the recognition that several current point-density-based cluster identification algorithms could benefit from a reduction in computational demand if approximate a priori estimates of the cluster centres present in a given data set could be supplied as starting conditions for these algorithms. In this particular presentation, the algorithm shown to benefit from the technique is the Mean-Tracking (M-T) cluster algorithm, but the results obtained from the gradient estimation approach may also be applied to other clustering algorithms and their related disciplines.
Abstract:
This paper deals with the integration of radial basis function (RBF) networks into the industrial software control package Connoisseur. The paper shows the improved modelling capabilities offered by RBF networks within the Connoisseur environment compared to linear modelling techniques such as recursive least squares. The paper also goes on to describe how this improved modelling capability, obtained through the RBF networks, will be utilised within Connoisseur.
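As a rough illustration of the modelling contrast the abstract draws, the sketch below fits a Gaussian RBF network by linear least squares: the nonlinearity lives in the fixed basis functions, so the weights can still be found with the same linear machinery an RLS-style identifier uses. This is a generic RBF fit under assumed names, not Connoisseur's implementation.

```python
import math

def gaussian_rbf(x, c, width):
    """One Gaussian basis function centred at c."""
    return math.exp(-((x - c) / width) ** 2)

def design_matrix(xs, centres, width):
    """One basis column per centre, plus a trailing bias column."""
    return [[gaussian_rbf(x, c, width) for c in centres] + [1.0] for x in xs]

def solve_normal_equations(A, y):
    """Least-squares weights via A^T A w = A^T y, solved with naive
    Gaussian elimination and partial pivoting (fine for tiny systems)."""
    m, n = len(A), len(A[0])
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)] for i in range(n)]
    Aty = [sum(A[k][i] * y[k] for k in range(m)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(AtA[r][col]))
        AtA[col], AtA[piv] = AtA[piv], AtA[col]
        Aty[col], Aty[piv] = Aty[piv], Aty[col]
        for r in range(col + 1, n):
            f = AtA[r][col] / AtA[col][col]
            for c in range(col, n):
                AtA[r][c] -= f * AtA[col][c]
            Aty[r] -= f * Aty[col]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):
        w[i] = (Aty[i] - sum(AtA[i][j] * w[j] for j in range(i + 1, n))) / AtA[i][i]
    return w

def rbf_predict(x, centres, width, w):
    """Network output: weighted sum of basis responses plus bias."""
    phi = [gaussian_rbf(x, c, width) for c in centres] + [1.0]
    return sum(p * wi for p, wi in zip(phi, w))
```

Because the basis is nonlinear in x, the network can represent curved process responses that a purely linear model (e.g. recursive least squares on raw inputs) cannot, while weight estimation remains a linear problem.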
Abstract:
Older adult computer users often lose track of the mouse cursor and so resort to methods such as mouse shaking or searching the screen to find the cursor again. Hence, this paper describes how a standard optical mouse was modified to include a touch sensor, activated by releasing and touching the mouse, which automatically centers the mouse cursor to the screen, potentially making it easier to find a 'lost' cursor. Six older adult computer users and six younger computer users were asked to compare the touch sensitive mouse with cursor centering with two alternative techniques for locating the mouse cursor: manually shaking the mouse and using the Windows sonar facility. The time taken to click on a target following a distractor task was recorded, and results show that centering the mouse was the fastest to use, with a 35% improvement over shaking the mouse. Five out of six older participants ranked the touch sensitive mouse with cursor centering as the easiest to use.
Abstract:
Point and click interactions using a mouse are an integral part of computer use for current desktop systems. Compared with younger users though, older adults experience greater difficulties performing cursor positioning tasks, and this can present limitations to using a computer easily and effectively. Target expansion is a technique for improving pointing performance, where the target dynamically grows as the cursor approaches. This has the advantage that targets conserve screen real estate in their unexpanded state, yet can still provide the benefits of a larger area to click on. This paper presents two studies of target expansion with older and younger participants, involving multidirectional point-select tasks with a computer mouse. Study 1 compares static versus expanding targets, and Study 2 compares static targets with three alternative techniques for expansion. Results show that expansion can improve times by up to 14%, and reduce error rates by up to 50%. Additionally, expanding targets are beneficial even when the expansion happens late in the movement, i.e. after the cursor has reached the expanded target area or even after it has reached the original target area. Participants' subjective feedback on target expansion is generally favorable, lending further support to the technique.
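The expansion logic described above can be illustrated with a toy one-dimensional model: the target presents its base width until the cursor comes within a trigger distance, after which clicks are judged against the expanded width; Fitts' index of difficulty shows why the wider effective target helps. All names and thresholds here are illustrative assumptions, not the study's apparatus.

```python
import math

def current_width(distance_to_centre, base_width, expanded_width, trigger_distance):
    """Width the target presents to the cursor: it expands once the cursor
    comes within trigger_distance, which also covers 'late' expansion."""
    return expanded_width if distance_to_centre <= trigger_distance else base_width

def is_hit(cursor_pos, target_centre, width):
    """A click counts as a hit if it lands within the current half-width."""
    return abs(cursor_pos - target_centre) <= width / 2

def fitts_id(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    a larger effective width lowers the difficulty of the movement."""
    return math.log2(distance / width + 1)
```

For example, a click 15 px from the centre misses a 20 px static target but hits the same target once it has expanded to 40 px, mirroring how expansion converts near-misses into selections.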
Abstract:
This article describes a new application of key psychological concepts from the area of Sociometry to the selection of workers within organizations in which projects are developed. The project manager can use a new procedure to determine which individuals should be chosen from a given pool of resources and how to combine them into one or several simultaneous groups/projects in order to assure the highest possible overall work efficiency from the standpoint of social interaction. The optimization process is carried out by means of matrix calculations, performed by computer or even manually, based on a number of new ratios generated ad hoc and composed from indices frequently used in Sociometry.
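A minimal sketch of this kind of sociometric assignment, assuming a hypothetical matrix choices[i][j] scoring how strongly person i selects person j: score each candidate group by the sum of its mutual choices and pick the group with the highest internal cohesion. The article's ad hoc ratios are not reproduced here; this is a generic cohesion score.

```python
from itertools import combinations

def group_cohesion(choices, members):
    """Sum of mutual sociometric choices inside a candidate group;
    choices[i][j] is how strongly person i selects person j."""
    return sum(choices[i][j] + choices[j][i] for i, j in combinations(members, 2))

def best_group(choices, size):
    """Exhaustively score every possible group of the given size and
    return the one with the highest internal cohesion (matrix calculation
    small enough to do by computer or, as the article notes, by hand)."""
    people = range(len(choices))
    return max(combinations(people, size), key=lambda g: group_cohesion(choices, g))
```

Exhaustive search is only feasible for small pools; the article's ratio-based procedure is a way of ranking candidates without enumerating every grouping.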
Abstract:
Brazil's State of São Paulo Research Foundation
Abstract:
The widespread use of service-oriented architectures (SOAs) and Web services in commercial software requires the adoption of development techniques that ensure the quality of Web services. Testing techniques and tools play a critical role in achieving quality in SOA-based systems. Existing techniques and tools for traditional systems are not appropriate for these new systems, so testing techniques and tools specific to Web services are required. This article presents new testing techniques to automatically generate a set of test cases and data for Web services. The techniques presented here apply data perturbation to Web service messages with respect to data types, integrity and consistency. To support these techniques, a tool (GenAutoWS) was developed and applied to real problems. (C) 2010 Elsevier Inc. All rights reserved.
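A hedged sketch of the data-perturbation idea, not GenAutoWS itself: for each typed field of a service message, substitute boundary or invalid values appropriate to that type, producing one candidate test case per perturbation while leaving the other fields intact. The perturbation table and function names are illustrative assumptions.

```python
# Boundary/invalid values per field type: a common data-perturbation heuristic
# for probing how a service handles type, integrity and consistency violations.
PERTURBATIONS = {
    int: [0, -1, 2**31 - 1, -2**31],
    str: ["", "a" * 1024, "<script>", None],
    bool: [True, False, None],
}

def perturb_message(message, field_types):
    """Yield one test message per (field, perturbed value) pair,
    keeping all other fields at their original values."""
    for field, ftype in field_types.items():
        for bad in PERTURBATIONS.get(ftype, [None]):
            case = dict(message)
            case[field] = bad
            yield case
```

Each generated case isolates a single perturbed field, so a failing response can be traced directly to the value that triggered it.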
Abstract:
In recent years, software clones and plagiarism have become a growing threat to originality. Clones are the result of copying and reusing another's work. According to the Merriam-Webster dictionary, "a clone is one that appears to be a copy of an original form"; it is a synonym for duplicate. Clones lead to redundancy of code, but not all redundant code is a clone. Against this background, and in order to safeguard ideas and discourage intentional code duplication that passes off another's work as one's own, software clone detection should be emphasized more. The objective of this paper is to review methods for clone detection, to apply those methods to find the extent of plagiarism among Master's-level computer science theses at Swedish universities, and to analyze the results. The remainder of the paper discusses software plagiarism detection using a data analysis technique, followed by a statistical analysis of the results. Plagiarism is the act of stealing the ideas and words of another person and passing them off as one's own. Using the data analysis technique, samples (Master's-level computer science thesis reports) were taken from various Swedish universities and processed with the Ephorus anti-plagiarism detection software. Ephorus reports the percentage of plagiarism for each thesis document, and from these results a statistical analysis was carried out using Minitab software. The results show a very low extent of plagiarism among the Swedish universities, which suggests that plagiarism is not a threat to Sweden's standard of education in computer science. This paper is based on data analysis, intelligence techniques, the Ephorus plagiarism detection tool, and Minitab statistical software.
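As a simple illustration of how textual clone or plagiarism scoring can work (Ephorus's actual algorithm is proprietary and not reproduced here), the sketch below computes the Jaccard similarity of word n-gram sets for two documents: 1.0 means identical fingerprints, 0.0 means no shared n-grams.

```python
def ngrams(text, n=3):
    """Set of word n-grams: a crude textual fingerprint of a document."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(doc_a, doc_b, n=3):
    """Jaccard similarity of the two documents' n-gram sets:
    |intersection| / |union|, a common clone/plagiarism indicator."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

A lightly paraphrased copy still shares many n-grams with its source, so it scores well above unrelated text while falling short of an exact match.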
Abstract:
Increasing costs and competitive business strategies are pushing sawmill enterprises to make an effort to optimize their process management. Organizational decisions mainly concentrate on performance and reduction of operational costs in order to maintain profit margins. Although many efforts have been made, effective utilization of resources, optimal planning and maximum productivity in the sawmill remain challenging for sawmill industries. Many researchers have proposed simulation models in combination with optimization techniques to address problems of integrated logistics optimization. The combination of simulation and optimization identifies the optimal strategy by simulating all complex behaviours of the system under consideration, including its objectives and constraints. During the past decade, a large number of studies were conducted to simulate operational inefficiencies in order to find optimal solutions. This paper reviews recent developments and challenges associated with simulation and optimization techniques, and is intended to provide a sound foundation for the authors' further work on optimizing sawmill yard operations.
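The simulation-optimization combination described above can be caricatured in a few lines: a stochastic "simulation" scores each candidate configuration, several replications are averaged to tame the noise, and the optimizer searches the candidates for the best mean. The toy throughput model below is entirely invented for illustration; real sawmill studies use discrete-event simulators in its place.

```python
import random

def simulate_throughput(buffer_size, rng):
    """Toy stochastic sawmill model (purely illustrative): throughput rises
    with buffer size with diminishing returns, minus a holding cost, plus noise."""
    noise = rng.gauss(0, 0.3)
    return 10 * (1 - 2 ** (-buffer_size / 4)) - 0.3 * buffer_size + noise

def optimise(candidates, replications=50, seed=42):
    """Simulation-optimization by exhaustive search: average several simulation
    replications per candidate and return the candidate with the best mean."""
    rng = random.Random(seed)

    def mean_objective(b):
        return sum(simulate_throughput(b, rng) for _ in range(replications)) / replications

    return max(candidates, key=mean_objective)
```

Averaging replications is essential: a single noisy run can easily rank a mediocre configuration above the true optimum, which is why simulation-optimization studies report confidence in their recommended settings.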