983 results for Computer tools


Relevance:

30.00%

Publisher:

Abstract:

Many older adults wish to gain competence in using a computer, but many application interfaces are perceived as complex and difficult to use, deterring potential users from investing the time to learn them. Hence, this study looks at the potential of ‘familiar’ interface design, which builds upon users’ knowledge of real-world interactions and applies existing skills to a new domain. Tools are provided in the form of familiar visual objects and are manipulated like their real-world counterparts, rather than with the buttons, icons and menus found in classic WIMP interfaces. This paper describes the formative evaluation of computer interactions based upon familiar real-world tasks; the design supports multitouch interaction, involves few buttons and icons, and has no menus, no right-clicks or double-clicks, and no dialogs. Using an example of an email client to test the principles of “familiarity”, the initial feedback was very encouraging, with 3 of the 4 participants able to undertake some of the basic email tasks with no prior training and little or no help. The feedback has informed a number of refinements of the design principles, such as providing clearer affordances for visual objects. A full study is currently underway.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to determine the potential of mid-infrared spectroscopy coupled with multidimensional statistical analysis for the prediction of processed cheese instrumental texture and meltability attributes. Processed cheeses (n = 32) of varying composition were manufactured in a pilot plant. Following two and four weeks of storage at 4 °C, samples were analysed using texture profile analysis, two meltability tests (computer vision; Olson and Price) and mid-infrared spectroscopy (4000–640 cm⁻¹). Partial least squares regression was used to develop predictive models for all measured attributes. Five attributes were successfully modelled with varying degrees of accuracy. The computer vision meltability model allowed for discrimination between high and low melt values (R² = 0.64). The hardness and springiness models gave approximate quantitative results (R² = 0.77), while the cohesiveness (R² = 0.81) and Olson and Price meltability (R² = 0.88) models gave good prediction results. (c) 2006 Elsevier Ltd. All rights reserved.
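
The modelling pipeline described here (spectra in, one predicted attribute out, via partial least squares regression) maps directly onto standard chemometrics tooling. A minimal sketch in Python, assuming spectra as rows of X and one texture attribute as y; the synthetic data, component count and variable names are illustrative, not the authors' pipeline:

```python
# Sketch: PLS regression of a texture attribute on mid-IR spectra.
# Synthetic data stands in for the paper's 32 processed cheeses.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 32, 840                    # region size is illustrative
X = rng.normal(size=(n_samples, n_wavenumbers))       # spectra, one row per cheese
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_samples)  # attribute

pls = PLSRegression(n_components=5)                   # number of latent variables
y_cv = cross_val_predict(pls, X, y, cv=8).ravel()     # cross-validated predictions

# R^2 of the cross-validated predictions, one model per measured attribute
ss_res = np.sum((y - y_cv) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(f"cross-validated R^2 = {1 - ss_res / ss_tot:.2f}")
```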

Relevance:

30.00%

Publisher:

Abstract:

The advancement of e-learning technologies has made it viable for developments in education and technology to be combined in order to fulfil educational needs worldwide. E-learning consists of informal learning approaches and emerging technologies to support the delivery of learning skills, materials, collaboration and knowledge sharing. E-learning is a holistic approach that covers a wide range of courses, technologies and infrastructures to provide an effective learning environment. The Learning Management System (LMS) is the core of the entire e-learning process along with technology, content, and services. This paper investigates the role of model-driven personalisation support modalities in providing enhanced levels of learning and trusted assimilation in an e-learning delivery context. We present an analysis of the impact of an integrated learning path that an e-learning system may employ to track activities and evaluate the performance of learners.
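
As a rough illustration of the kind of learning-path tracking the paper analyses, the sketch below records activity scores along an ordered path and derives progress and performance figures. All class and step names are hypothetical; a real LMS would persist these events and apply a far richer evaluation model:

```python
# Sketch: tracking learner activity along a predefined learning path and
# deriving simple progress/performance figures. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class LearningPath:
    steps: list[str]                                             # ordered activities
    completed: dict[str, float] = field(default_factory=dict)   # step -> score

    def record(self, step: str, score: float) -> None:
        if step not in self.steps:
            raise ValueError(f"unknown step: {step}")
        self.completed[step] = score

    def progress(self) -> float:
        return len(self.completed) / len(self.steps)

    def performance(self) -> float:
        return sum(self.completed.values()) / max(len(self.completed), 1)

path = LearningPath(steps=["intro-quiz", "module-1", "module-2", "final-test"])
path.record("intro-quiz", 0.8)
path.record("module-1", 0.9)
print(f"progress {path.progress():.0%}, mean score {path.performance():.2f}")
```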

Relevance:

30.00%

Publisher:

Abstract:

The use of online social networking tools (SNTs) has become commonplace within higher education. In this paper a definition and a typology of the educational affordances of social networking services (SNSs) are presented. The paper also explores these educational affordances whilst examining how university lecturers and students use SNTs to support their educational activities. The data presented here were obtained through a survey in which 38 participants from three universities took part: two universities in Uganda and one in the United Kingdom. The results show that Facebook is the most popular tool, with 75% of participants having profiles. Whilst most participants perceived the educational significance of these tools, social affordances remain more pronounced than pedagogical and technological affordances. The limitations of this study are also discussed.

Relevance:

30.00%

Publisher:

Abstract:

Background: Along the internal carotid artery (ICA), atherosclerotic plaques are often located in its cavernous sinus (parasellar) segments (pICA). Studies indicate that the incidence of pre-atherosclerotic lesions is linked with the complexity of the pICA; however, the pICA shape has never been objectively characterized. Our study aims to provide objective mathematical characterizations of the pICA shape. Methods and results: Three-dimensional (3D) computer models, reconstructed from contrast-enhanced computed tomography (CT) data of 30 randomly selected patients (60 pICAs), were analyzed with modern visualization software and new mathematical algorithms. As objective measures of pICA shape complexity, we provide calculations of the curvature energy, torsion energy, and total complexity of 3D skeletons of the pICA lumen. We further measured the posterior knee of the so-called "carotid siphon" with a virtual goniometer and performed correlations between the objective mathematical calculations and the subjective angle measurements. Conclusions: Firstly, our study provides mathematical characterizations of the pICA shape, which can serve as objective reference data for analyzing connections between pICA shape complexity and vascular diseases. Secondly, we provide an objective method for creating such data. Thirdly, we evaluate the usefulness of subjective goniometric measurements of the angle of the posterior knee of the carotid siphon.
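
The curvature and torsion energies mentioned above are integrals of squared curvature and squared torsion along the vessel skeleton. A minimal numerical sketch, assuming a discretised 3D centreline; the helix stands in for a pICA skeleton, and the exact normalisation used in the paper may differ:

```python
# Sketch: curvature and torsion "energies" of a discretised 3D centreline,
# E_kappa = integral kappa^2 ds and E_tau = integral tau^2 ds, via the
# Frenet formulas kappa = |r' x r''| / |r'|^3 and
# tau = (r' x r'') . r''' / |r' x r''|^2.
import numpy as np

t = np.linspace(0, 4 * np.pi, 400)
r = np.column_stack([np.cos(t), np.sin(t), 0.2 * t])   # helical test curve

d1 = np.gradient(r, t, axis=0)          # r'
d2 = np.gradient(d1, t, axis=0)         # r''
d3 = np.gradient(d2, t, axis=0)         # r'''

cross = np.cross(d1, d2)
speed = np.linalg.norm(d1, axis=1)
kappa = np.linalg.norm(cross, axis=1) / speed**3
tau = np.einsum("ij,ij->i", cross, d3) / np.linalg.norm(cross, axis=1)**2

ds = speed * np.gradient(t)             # arc-length element
print(f"curvature energy {np.sum(kappa**2 * ds):.3f}, "
      f"torsion energy {np.sum(tau**2 * ds):.3f}")
```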

Relevance:

30.00%

Publisher:

Abstract:

Running hydrodynamic models interactively allows both visual exploration and changes to the model state during simulation. One of the main characteristics of an interactive model is that it should provide immediate feedback to the user, for example responding to changes in model state or view settings. For this reason, such features are usually only available for models with a relatively small number of computational cells, which are used mainly for demonstration and educational purposes. It would be useful if interactive modelling also worked for the models typically used in consultancy projects involving large-scale simulations. This poses a number of technical challenges related to the combination of the model itself and the visualisation tools (scalability, and the implementation of an appropriate API for control of, and access to, the internal state). While model parallelisation is increasingly addressed by the environmental modelling community, little effort has been spent on developing a high-performance interactive environment. What can we learn from other high-end visualisation domains, such as 3D animation, gaming and virtual globes (Autodesk 3ds Max, Second Life, Google Earth), that also focus on efficient interaction with 3D environments? In these domains, high efficiency is usually achieved by the use of computer graphics algorithms such as surface simplification depending on the current view and distance to objects, and efficient caching of aggregated representations of object meshes. We investigate how these algorithms can be re-used in the context of interactive hydrodynamic modelling without significant changes to the model code, allowing model operation on both multi-core CPU personal computers and high-performance computer clusters.
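
To make the borrowed graphics technique concrete, the sketch below shows the simplest form of view-dependent simplification: choosing a pre-simplified mesh level by distance from the camera. Thresholds, cell budgets and function names are illustrative only:

```python
# Sketch: distance-based level-of-detail (LOD) selection, the graphics
# technique the abstract refers to, transplanted to model-mesh rendering.
import numpy as np

# Pre-simplified versions of one mesh region: (max view distance, cell budget)
LOD_LEVELS = [(50.0, 100_000), (200.0, 10_000), (np.inf, 1_000)]

def select_lod(camera_pos: np.ndarray, region_centre: np.ndarray) -> int:
    """Return the cell budget for a mesh region given the current viewpoint."""
    distance = np.linalg.norm(camera_pos - region_centre)
    for max_dist, n_cells in LOD_LEVELS:
        if distance <= max_dist:
            return n_cells
    return LOD_LEVELS[-1][1]

print(select_lod(np.array([0.0, 0.0, 10.0]), np.array([30.0, 0.0, 0.0])))
# -> 100000: nearby regions keep the full-resolution mesh
```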

Relevance:

30.00%

Publisher:

Abstract:

As digital systems move away from traditional desktop setups, new interaction paradigms are emerging that better integrate with users’ real-world surroundings and better support users’ individual needs. While promising, these modern interaction paradigms also present new challenges, such as a lack of paradigm-specific tools to systematically evaluate and fully understand their use. This dissertation tackles this issue by framing empirical studies of three novel digital systems in embodied cognition – an exciting new perspective in cognitive science where the body and its interactions with the physical world take a central role in human cognition. This is achieved, first, by focusing the design of all these systems on tangible interaction, a contemporary interaction paradigm that emphasizes physical interaction; and second, by comprehensively studying user performance in these systems through a set of novel performance metrics grounded in epistemic actions, a relatively well-established and well-studied construct in the literature on embodied cognition. The first system presented in this dissertation is an augmented Four-in-a-row board game. Three versions of the game were developed, based on three different interaction paradigms (tangible, touch and mouse), and a repeated-measures study involving 36 participants measured the occurrence of three simple epistemic actions across these three interfaces. The results highlight the relevance of epistemic actions in such a task and suggest that the different interaction paradigms afford the instantiation of these actions in different ways. Additionally, the tangible version of the system supports the most rapid execution of these actions, providing novel quantitative insights into the real benefits of tangible systems. The second system presented in this dissertation is a tangible tabletop scheduling application. Two studies with single and paired users provide several insights into the impact of epistemic actions on the user experience when these are performed outside of a system’s sensing boundaries. These insights concern the form, size and location of ideal interface areas for such offline epistemic actions to occur, as well as how physical tokens can be designed to better support them. Finally, and based on the results obtained to this point, the last study presented in this dissertation directly addresses the lack of empirical tools to formally evaluate tangible interaction. It presents a video-coding framework grounded in a systematic literature review of 78 papers, and evaluates its value as a metric through a 60-participant study performed across three different research laboratories. The results highlight the usefulness and power of epistemic actions as a performance metric for tangible systems. In sum, through the use of such novel metrics in each of the three studies presented, this dissertation provides a better understanding of the real impact and benefits of designing and developing systems that feature tangible interaction.
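
To illustrate how coded epistemic actions can serve as a performance metric, the sketch below computes the rate of epistemic actions per minute for each interaction paradigm from video-coded events. The event records and session lengths are invented for illustration and do not reproduce the dissertation's coding scheme:

```python
# Sketch: epistemic-action rate per interaction paradigm from video-coded
# events of the form (participant, condition, action, time_s).
from collections import defaultdict

events = [
    (1, "tangible", "rearrange", 12.0), (1, "tangible", "rotate", 20.5),
    (1, "touch", "rearrange", 15.0), (2, "mouse", "hover", 30.0),
]
session_length_s = {"tangible": 300.0, "touch": 300.0, "mouse": 300.0}

counts = defaultdict(int)
for _participant, condition, _action, _t in events:
    counts[condition] += 1                       # one tally per coded action

for condition, n in counts.items():
    rate = n / (session_length_s[condition] / 60.0)
    print(f"{condition}: {rate:.2f} epistemic actions/min")
```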

Relevance:

30.00%

Publisher:

Abstract:

We performed a light microscopy and computer three-dimensional reconstruction study of serial sections of the molar enamel organ of 3- and 5-day-old rats perfused with India ink through the arterial system. The tooth germs were fixed in Bouin's solution, embedded in paraffin, sectioned and stained with haematoxylin and eosin. For the three-dimensional reconstruction, light micrographs of the serial sections were digitized and aligned using the serial EM Align software downloaded from http://synapses.bu.edu/tools/. After alignment, the boundaries of the India-ink-filled blood vessels were manually traced with a mouse using the IGL Trace software (version 1.26b), also downloaded from the above website. After tracing, a three-dimensional representation of the blood vessel contours was generated in VRML format and visualized with the Cortona Web3D viewer (version 4.0) downloaded from http://www.parallelgraphics.com/products/cortona. Our results showed that in regions where ameloblasts are polarized, the capillaries are arranged at three distinct levels: (1) penetrating and leaving capillaries in relation to the outer enamel epithelium; (2) capillaries crossing and branching inside the stellate reticulum; and (3) capillaries branching and anastomosing profusely within the stratum intermedium, thereby forming an extensive capillary plexus intimately associated with the cells of the stratum intermedium. The existence of a conspicuous capillary plexus intermingled with cells of the stratum intermedium, as shown in our results, suggests that some molecules produced by cells of the stratum intermedium could be released into the capillary plexus and thereafter carried to the dental follicle.

Relevance:

30.00%

Publisher:

Abstract:

The IEEE 1451 standard is intended to address the problem of interfacing smart transducers in network environments. Proprietary hardware and software are usually a very efficient way to implement the IEEE 1451 standard, although they can be expensive and inflexible. In contrast, the use of open and standardized tools for implementing the IEEE 1451 standard is proposed in this paper. Tools such as the Java and Python programming languages, Linux, programmable logic technology, personal computer resources and Ethernet architecture were integrated in order to construct a network node based on the IEEE 1451 standards. The node can be applied in systems based on the client-server communication model. An evaluation of the employed tools and experimental results are presented. © 2005 IEEE.
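
The client-server communication model mentioned above can be illustrated with a minimal TCP pair in Python. This is only a sketch: a real IEEE 1451 node exchanges transducer readings and TEDS metadata rather than plain strings, and the port number here is arbitrary:

```python
# Sketch: a minimal client-server exchange over TCP, standing in for a
# network node that answers transducer read requests.
import socket
import threading

HOST, PORT = "127.0.0.1", 50451                  # arbitrary example values
ready = threading.Event()

def server() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                              # server is now accepting
        conn, _addr = srv.accept()
        with conn:
            _request = conn.recv(1024)           # e.g. a read request
            conn.sendall(b"channel 1: 21.5 C")   # mocked transducer reply

threading.Thread(target=server, daemon=True).start()
ready.wait()                                     # avoid connecting too early

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"read channel 1")
    print(cli.recv(1024).decode())               # -> channel 1: 21.5 C
```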

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the overall methodology that has been used to encode both the Brazilian Portuguese WordNet (WordNet.Br) standard language-independent conceptual-semantic relations (hyponymy, co-hyponymy, meronymy, cause, and entailment) and the so-called cross-lingual conceptual-semantic relations between different wordnets. Accordingly, after contextualizing the project and outlining the current lexical database structure and statistics, it describes the WordNet.Br editing GUI, which was designed to aid the linguist in carrying out the tasks of building synsets, selecting sample sentences from corpora, writing synset concept glosses, and encoding both language-independent conceptual-semantic relations and cross-lingual conceptual-semantic relations between WordNet.Br and Princeton WordNet. © Springer-Verlag Berlin Heidelberg 2006.
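
As a rough illustration of the relation types being encoded, the sketch below models synsets with language-independent and cross-lingual relations in Python. The structure and relation labels are illustrative, not the actual WordNet.Br database schema:

```python
# Sketch: synsets linked by conceptual-semantic relations, including a
# cross-lingual link to an English (Princeton WordNet) synset.
from dataclasses import dataclass, field

@dataclass
class Synset:
    words: list[str]
    gloss: str = ""
    relations: dict[str, list["Synset"]] = field(default_factory=dict)

    def relate(self, rel: str, target: "Synset") -> None:
        # rel drawn from {"hyponymy", "co-hyponymy", "meronymy", "cause",
        # "entailment"} plus a cross-lingual label such as "eq-synonym".
        self.relations.setdefault(rel, []).append(target)

cachorro = Synset(["cachorro", "cão"], gloss="mamífero doméstico ...")
animal = Synset(["animal"])
dog_en = Synset(["dog", "domestic dog"])     # English-side synset
cachorro.relate("hyponymy", animal)          # cachorro IS-A animal
cachorro.relate("eq-synonym", dog_en)        # cross-lingual equivalence
print(sorted(cachorro.relations))            # -> ['eq-synonym', 'hyponymy']
```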

Relevance:

30.00%

Publisher:

Abstract:

During the cutting of gray cast iron, a large proportion of the mechanical energy from the cutting forces is converted into heat. Considerable heat is generated, principally in three areas: the shear zone, the rake face and the clearance side of the cutting edge. Excessive heat causes undesirably high temperatures in the tool, which lead to softening of the tool and its accelerated wear and breakage. Nowadays, advanced ceramics are widely used in cutting tools. In this paper, a special Si3N4 composition was sintered, characterized, and cut and ground to make SNGN120408 inserts, which were applied to the machining of gray cast iron with a hardness of 205 HB under dry cutting conditions on a digitally controlled computer lathe. Tool performance was analysed as a function of cutting forces, flank wear, temperature and roughness. The metal removal process was carried out at three different cutting speeds (300 m/min, 600 m/min, and 800 m/min), while a cutting depth of 1 mm and a feed rate of 0.33 mm/rev were kept constant. As a result of the experiments, the lowest main cutting force, which depends on cutting speed, was 264 N at 600 m/min, while the highest main cutting force was 294 N at 300 m/min.
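
Two textbook turning quantities can be derived directly from the reported conditions: the material removal rate Q = v_c · a_p · f and the main cutting power P = F_c · v_c. The sketch below applies them to the abstract's figures; the formulas are standard machining relations, not part of the authors' analysis:

```python
# Sketch: material removal rate and cutting power from the reported
# cutting conditions (textbook turning relations).
def removal_rate_mm3_per_min(v_c_m_min: float, a_p_mm: float, f_mm_rev: float) -> float:
    """Q = v_c * a_p * f, with v_c converted from m/min to mm/min."""
    return v_c_m_min * 1000.0 * a_p_mm * f_mm_rev

def cutting_power_w(f_c_newton: float, v_c_m_min: float) -> float:
    """P = F_c * v_c, with v_c converted from m/min to m/s."""
    return f_c_newton * v_c_m_min / 60.0

print(removal_rate_mm3_per_min(300, 1.0, 0.33))  # -> 99000 mm^3/min
print(cutting_power_w(294, 300))                 # -> 1470 W at 300 m/min
print(cutting_power_w(264, 600))                 # -> 2640 W at 600 m/min
```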

Relevance:

30.00%

Publisher:

Abstract:

Increased accessibility to high-performance computing resources has created a demand for user support through performance evaluation tools like the iSPD (iconic Simulator for Parallel and Distributed systems), a simulator based on iconic modelling for distributed environments such as computer grids. It was developed to make it easier for general users to create their grid models, including allocation and scheduling algorithms. This paper describes how schedulers are managed by iSPD and how users can easily adopt the scheduling policy that improves the system being simulated. A thorough description of iSPD is given, detailing its scheduler manager. Some comparisons between iSPD and Simgrid simulations, including runs of the simulated environment in a real cluster, are also presented. © 2012 IEEE.
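
The scheduler management described above presupposes a pluggable policy interface of some kind. The sketch below shows one plausible shape for such an interface in Python; the class names and resource model are hypothetical and do not reproduce iSPD's actual scheduler-manager API:

```python
# Sketch: a pluggable scheduling-policy interface of the kind a grid
# simulator needs so users can swap allocation algorithms.
from abc import ABC, abstractmethod

class SchedulingPolicy(ABC):
    @abstractmethod
    def pick_resource(self, task: str, resources: list[dict]) -> dict:
        """Return the resource the task should run on."""

class RoundRobin(SchedulingPolicy):
    def __init__(self) -> None:
        self._next = 0
    def pick_resource(self, task: str, resources: list[dict]) -> dict:
        resource = resources[self._next % len(resources)]
        self._next += 1
        return resource

class FastestFirst(SchedulingPolicy):
    def pick_resource(self, task: str, resources: list[dict]) -> dict:
        return max(resources, key=lambda r: r["mflops"])

resources = [{"name": "node-a", "mflops": 800}, {"name": "node-b", "mflops": 1200}]
policy = FastestFirst()                                     # user-selected policy
print(policy.pick_resource("task-1", resources)["name"])    # -> node-b
```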

Relevance:

30.00%

Publisher:

Abstract:

In this paper we use Markov chain Monte Carlo (MCMC) methods in order to estimate and compare GARCH models from a Bayesian perspective. We allow for possibly heavy tailed and asymmetric distributions in the error term. We use a general method proposed in the literature to introduce skewness into a continuous unimodal and symmetric distribution. For each model we compute an approximation to the marginal likelihood, based on the MCMC output. From these approximations we compute Bayes factors and posterior model probabilities. (C) 2012 IMACS. Published by Elsevier B.V. All rights reserved.
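
As a simplified stand-in for the samplers compared in the paper, the sketch below runs a random-walk Metropolis chain on a Gaussian GARCH(1,1); the heavy-tailed and skewed error distributions, the priors and the marginal-likelihood approximation are all omitted, and the data and tuning constants are illustrative:

```python
# Sketch: random-walk Metropolis for a Gaussian GARCH(1,1)
# sigma2_t = omega + alpha * y_{t-1}^2 + beta * sigma2_{t-1},
# with a flat prior on the stationarity region.
import numpy as np

rng = np.random.default_rng(1)
y = rng.standard_normal(500) * 0.01               # placeholder return series

def log_like(theta: np.ndarray) -> float:
    omega, alpha, beta = theta
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return -np.inf                            # outside the allowed region
    sig2 = np.empty_like(y)
    sig2[0] = omega / (1 - alpha - beta)          # unconditional variance
    for t in range(1, len(y)):
        sig2[t] = omega + alpha * y[t - 1] ** 2 + beta * sig2[t - 1]
    return -0.5 * np.sum(np.log(2 * np.pi * sig2) + y**2 / sig2)

theta = np.array([1e-5, 0.05, 0.90])              # initial (omega, alpha, beta)
post = log_like(theta)
draws = []
for _ in range(5000):
    prop = theta + rng.normal(scale=[1e-6, 0.02, 0.02])  # random-walk proposal
    p = log_like(prop)
    if np.log(rng.uniform()) < p - post:          # Metropolis accept/reject
        theta, post = prop, p
    draws.append(theta)
print(np.mean(draws, axis=0))                     # posterior means
```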

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate tools for the fusion of images generated by tomography and structural and functional magnetic resonance imaging. METHODS: Magnetic resonance and functional magnetic resonance imaging were performed while a volunteer who had previously undergone cranial tomography performed motor and somatosensory tasks in a 3-Tesla scanner. Image data were analyzed with different programs, and the results were compared. RESULTS: We constructed a flow chart of computational processes that allowed measurement of the spatial congruence between the methods. There was no single computational tool that contained the entire set of functions necessary to achieve the goal. CONCLUSION: The fusion of the images from the three methods proved to be feasible with the use of four free-access software programs (OsiriX, Register, MRIcro and FSL). Our results may serve as a basis for building software that will be useful as a virtual tool prior to neurosurgery.
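
One common way to quantify the spatial congruence between registered volumes is an overlap score such as the Dice coefficient; the paper does not specify its measure, so the sketch below is purely illustrative:

```python
# Sketch: Dice overlap between two registered binary volumes as a
# spatial-congruence score.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice coefficient of two boolean masks: 2|A∩B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

mask_ct = np.zeros((64, 64, 64), dtype=bool)
mask_ct[20:40, 20:40, 20:40] = True              # structure seen on CT
mask_mri = np.zeros_like(mask_ct)
mask_mri[22:42, 20:40, 20:40] = True             # same structure, shifted
print(f"Dice = {dice(mask_ct, mask_mri):.3f}")   # -> 0.900 for this overlap
```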

Relevance:

30.00%

Publisher:

Abstract:

Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. We have then defined a number of activities and associated guidelines that prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
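
As a rough sketch of the connector idea, the snippet below wraps a data source and applies transformation rules to each exchanged record, mapping a tool-specific field onto an ontology term. All names, the rule and the record format are hypothetical:

```python
# Sketch: a software connector mediating access to a data source and
# rewriting records into a shared (ontology-based) vocabulary.
from typing import Callable

TransformRule = Callable[[dict], dict]

class Connector:
    """Wraps a data source and applies transformation rules to records."""
    def __init__(self, fetch: Callable[[], list[dict]], rules: list[TransformRule]):
        self._fetch = fetch
        self._rules = rules

    def records(self) -> list[dict]:
        out = []
        for record in self._fetch():
            for rule in self._rules:
                record = rule(record)            # apply each rule in order
            out.append(record)
        return out

# Hypothetical rule: map a tool-specific field name onto an ontology term.
def map_probe_id(rec: dict) -> dict:
    return {"ontology:GeneProduct": rec.pop("probe_set_id"), **rec}

source = lambda: [{"probe_set_id": "1007_s_at", "value": 7.3}]
print(Connector(source, [map_probe_id]).records())
# -> [{'ontology:GeneProduct': '1007_s_at', 'value': 7.3}]
```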