Touch ‘n’ sketch: pen and fingers on a multi-touch sketch application for tablet PC’s


Author(s): Vieira, Hugo David Jesus
Contributor(s)

Leeuwen, Josef Petrus van

Nóbrega, Leonel Domingos Telo

Date(s)

22/01/2013

2011

Abstract

In many creative and technical areas, professionals use paper sketches to develop and express concepts and models. Paper offers an almost constraint-free environment in which they have as much freedom of expression as they need. However, paper has disadvantages, such as its fixed size and the inability to manipulate content (other than removing or scratching it out), which can be overcome by systems that offer the same freedom as paper without its limitations. Only in recent years has the enabling technology become widely available, with the development of touch-sensitive screens that can also interact with a stylus. In this project, a prototype was created with the objective of finding a set of the most useful and usable interactions composed of combinations of multi-touch and pen input. Computer Aided Software Engineering (CASE) tools were selected as the application domain because they represent a solid, well-defined discipline that still leaves sufficient room for new developments. This choice resulted from the area research conducted to find an application domain, which involved analyzing sketching tools from several possible areas and domains. User studies were conducted using Model Driven Inquiry (MDI) to gain a better understanding of human sketch-creation activities and the concepts devised. The prototype was then implemented, making it possible to run user evaluations of the interaction concepts created. The results validated most of the interactions, although only limited testing was possible at the time. Users had more problems using the pen; however, handwriting and ink recognition were very effective, and users quickly learned the manipulations and gestures of the Natural User Interface (NUI).
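The abstract describes routing stylus strokes to ink and handwriting recognition while fingers drive NUI manipulations such as panning and zooming. The thesis prototype targeted Tablet PCs and its actual implementation is not shown here; the following is only a minimal, hypothetical TypeScript sketch of that pen-versus-finger split using the standard Pointer Events API, where the canvas id and the handler names (startInkStroke, startTouchManipulation) are assumptions for illustration.

```typescript
// Minimal sketch (not the thesis implementation): separate pen and finger
// input on a touch + stylus screen via the standard Pointer Events API.
const canvas = document.getElementById("sketch-canvas") as HTMLCanvasElement;

canvas.addEventListener("pointerdown", (e: PointerEvent) => {
  if (e.pointerType === "pen") {
    // Stylus input: route to inking / handwriting recognition.
    startInkStroke(e.pointerId, e.offsetX, e.offsetY, e.pressure);
  } else if (e.pointerType === "touch") {
    // Finger input: route to NUI manipulations (pan, zoom, select).
    startTouchManipulation(e.pointerId, e.offsetX, e.offsetY);
  }
});

// Hypothetical handlers, shown only to make the routing explicit.
function startInkStroke(id: number, x: number, y: number, pressure: number): void {
  console.log(`pen stroke ${id} at (${x}, ${y}), pressure ${pressure}`);
}

function startTouchManipulation(id: number, x: number, y: number): void {
  console.log(`touch manipulation ${id} at (${x}, ${y})`);
}
```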

Publisher

Universidade da Madeira

Identifier

http://hdl.handle.net/10400.13/322

Language(s)

eng

Rights

openAccess

Keywords #Multi-touch tablet-PC #HCI #NUI #HAM #Sketch CASE tool #Digital ink-recognition #Centro de Ciências Exatas e da Engenharia
Type

masterThesis