992 results for Interactive Techniques
Resumo:
The health benefits provided by probiotic bacteria have led to their increasing use in fermented and other dairy products. However, their viability in these products is low. Encapsulation has been investigated as a way to protect the bacteria in the product's environment and improve their survival. Two encapsulation techniques, extrusion and emulsion, are commonly used to encapsulate probiotics for use in fermented and other dairy products. This review evaluates the merits and limitations of these two techniques, and also discusses the supporting materials and special treatments used in encapsulation processes. (C) 2003 Elsevier Science Ltd. All rights reserved.
Resumo:
Graphical user interfaces (GUIs) are critical components of today's software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing systems. We use static analysis techniques to generate models of the user interface behaviour from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is the state machine. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable analysis of interactive systems through GUI source code inspection.
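One way to picture how graph theory applies to such state-machine models is to treat windows as nodes and user events as labelled transitions, then compute a metric over the resulting graph. The sketch below is purely illustrative (the model, names, and the choice of a McCabe-style metric are assumptions, not the tool's actual API or output):

```python
# Hypothetical GUI behaviour model: states are windows/dialogs,
# transitions are event-labelled edges between them.

def cyclomatic_complexity(states, transitions):
    """McCabe-style metric E - N + 2 for a connected state machine.

    Higher values suggest more independent event paths through the
    interface, i.e. a UI that is harder to inspect and test.
    """
    return len(transitions) - len(states) + 2

# A toy two-window interface with three user-event transitions.
states = ["Main", "Settings"]
transitions = [
    ("Main", "open", "Settings"),
    ("Settings", "close", "Main"),
    ("Main", "quit", "Main"),
]

print(cyclomatic_complexity(states, transitions))  # → 3
```

Metrics like this one give a single comparable number per interface model, which is the kind of quality indicator the analysis described above can report.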
Resumo:
Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Therefore, the correctness of GUI code is essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their more important aspects. This paper presents a generic model for language-independent reverse engineering of graphical user interface based applications, and we explore the integration of model-based testing techniques in our approach, thus allowing us to perform fault detection. A prototype tool has been constructed, which is already capable of deriving and testing a user interface behavioral model of applications written in Java/Swing.
Resumo:
Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Therefore, the correctness of the GUI's code is essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their more important aspects. This paper describes our approach to reverse engineering an abstract model of a user interface directly from the GUI's legacy code. We also present results from a case study. These results are encouraging and give evidence that the goal of reverse engineering user interfaces can be met with further work on this technique.
Resumo:
A common problem among information systems is the storage and maintenance of permanent information identified by a key. Such systems are typically known as database engines or simply as databases. Today the information systems market is full of solutions that provide mass storage capacities, implemented on different operating systems and with a wealth of extra functionality. In this paper we focus on the formal high-level specification of database systems in the Haskell language. We begin by introducing a high-level view of a database system, with a specification of the most common operations from a functional point of view. We then augment this specification by lifting it to the state monad, which is then modified once again to permit input/output operations between computations.
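The specification in the paper is written in Haskell; as a language-neutral illustration of the same lifting idea (pure operations over a key-value store, reshaped so that each operation returns a result together with the new state, and state is then threaded through a sequence of computations), a minimal Python sketch under those assumptions could look like:

```python
# Each operation is a function db -> (result, new_db), mimicking the
# state-monad style of the Haskell specification. Names are illustrative.

def insert(key, value):
    def run(db):
        new_db = dict(db)          # keep the old state immutable
        new_db[key] = value
        return None, new_db
    return run

def lookup(key):
    def run(db):
        return db.get(key), db     # read-only: state passes through
    return run

def run_all(ops, db):
    """Thread the store through a sequence of stateful operations,
    returning the last result and the final state."""
    result = None
    for op in ops:
        result, db = op(db)
    return result, db

result, final_db = run_all([insert(1, "alice"), insert(2, "bob"), lookup(2)], {})
print(result)  # → bob
```

The point of the monadic formulation is exactly this separation: the pure specification of each operation stays simple, while sequencing and state passing are handled uniformly by the combinator (`run_all` here, bind/`>>=` in Haskell).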
Resumo:
Forest cover of the Maringá municipality, located in northern Paraná State, was mapped in this study. Mapping was carried out using high-resolution HRC sensor imagery and medium-resolution CCD sensor imagery from the CBERS satellite. Images were georeferenced, and forest vegetation patches (TOFs, trees outside forests) were classified using two methods of digital classification: one based on the reflectance or digital number of each pixel, and one object-oriented. The area of each polygon was calculated, which allowed the polygons to be segregated into size classes. Thematic maps were built from the resulting polygon size classes, and summary statistics were generated for each size class in each area. It was found that most forest fragments in Maringá were smaller than 500 m². There was also a difference of 58.44% in the amount of vegetation detected between the high-resolution and medium-resolution imagery, due to the distinct spatial resolutions of the sensors. It was concluded that high-resolution geotechnology is essential to provide reliable information on urban green areas and forest cover in highly human-perturbed landscapes.
Resumo:
Image segmentation is a ubiquitous task in medical image analysis, required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice. Indeed, challenging data often require user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to interact efficiently with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application in which user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, thus contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, in which the user input is mapped to a non-Cartesian space and used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term which improves interaction with the segmented surface, thereby making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and in terms of total analysis time. This contributes to a more efficient use of existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire algorithm.
Resumo:
Graphical user interfaces (GUIs) are critical components of today's open source software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive open source computing systems. We use static analysis techniques to generate models of the user interface behavior from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is the state machine. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable analysis of interactive systems through GUI source code inspection.
Resumo:
Background: Several studies link the seamless fit of implant-supported prostheses to the accuracy of the dental impression technique used during acquisition. In addition, factors such as implant angulation and coping shape contribute to implant misfit. Purpose: The aim of this study was to identify the most accurate impression technique and the factors affecting impression accuracy. Material and Methods: A systematic review of the peer-reviewed literature was conducted, analyzing articles published between 2009 and 2013. The following search terms were used: implant impression, impression accuracy, and implant misfit. A total of 417 articles were identified; 32 were selected for review. Results: All 32 selected studies were in vitro studies. Fourteen articles compared the open and closed impression techniques: 8 advocated the open technique, and 6 reported similar results. Another 14 articles evaluated splinted and non-splinted techniques, all advocating the splinted technique. Polyether material usage was reported in nine studies; six studies tested vinyl polysiloxane, and one study used irreversible hydrocolloid. Eight studies evaluated different coping designs. Intraoral optical devices were compared in four studies. Conclusions: The most accurate results were achieved with two configurations: (1) the optical intraoral system with powder and (2) the open technique with splinted squared transfer copings, using polyether as the impression material.
Resumo:
In this article we argue that digital simulations promote and explore complex relations between players and the machine's cybernetic system with which they relate through gameplay, that is, the actual application of tactics and strategies used by participants as they play the game. We plan to show that the realism of simulation, together with the merger of artificial objects with the real world, can generate interactive empathy between players and their avatars. In this text, we intend to explore augmented reality as a means to visualise interactive communication projects. With the ARToolkit, Virtools and 3ds Max applications, we aim to show how to create a portable interactive platform that uses the environment and markers to construct the game's scenario. Many of the conventional functions of the human eye are being replaced by techniques in which images do not position themselves in the traditional manner in which we observe them (Crary, 1998), or in the way we perceive the real world. The digitalization of the real world into a new informational layer over objects, people or environments needs to be processed and mediated by tools that amplify the natural human senses.
Resumo:
Micronuclei (MN) in exfoliated epithelial cells are widely used as biomarkers of cancer risk in humans. MN are classified as biomarkers of chromosome breakage and loss. They are small, extranuclear bodies that arise in dividing cells from acentric chromosome/chromatid fragments, or from whole chromosomes/chromatids that lag behind in anaphase and are not included in the daughter nuclei at telophase. Buccal mucosa cells have been used in biomonitoring exposed populations because these cells are in the direct route of exposure to ingested pollutants, are capable of metabolizing proximate carcinogens to reactive chemicals, and are easily and rapidly collected by brushing the buccal mucosa. The objective of the present study was to further investigate whether, and to what extent, different stains affect the results of micronucleus studies in exfoliated cells. The techniques compared are: Papanicolaou (PAP), Modified Papanicolaou, May-Grünwald Giemsa (MGG), Giemsa, Harris's Hematoxylin, Feulgen with Fast Green counterstain, and Feulgen without counterstain.
Resumo:
Storm and tsunami deposits are generated by similar depositional mechanisms, making them hard to discriminate using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 earthquake in Lisbon, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies, coupled with SEM microscopy, to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. In order to study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of ~7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a better connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario in which the energy released by the tsunami wave was strong enough to overtop and erode an important amount of sand from the littoral dune and mix it with reworked materials from underlying layers at least 1 m in depth. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
Resumo:
The automatic organization of e-mail messages is a current challenge in machine learning. The excessive number of messages affects more and more users, especially those who use e-mail as a communication and work tool. This thesis addresses the problem of automatically organizing e-mail messages, proposing a solution aimed at the automatic tagging of messages. Automatic tagging uses the e-mail folders previously created by users, treating them as tags, and suggests multiple tags for each message (top-N). Several machine learning techniques are studied, and the various fields that make up an e-mail message are analyzed to determine their suitability as classification features. This work focuses on the textual fields (the subject and the body of the messages), studying different forms of representation, feature selection, and classification algorithms. The participant fields are also evaluated, using classification algorithms that represent them with the vector space model or as a graph. The various fields are combined for classification using the Majority Voting classifier-combination technique. Tests are performed on a subset of the Enron e-mail corpus and on a private dataset provided by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). These datasets are analyzed in order to understand the characteristics of the data. The system is evaluated by classifier accuracy. The results show significant improvements over related work.
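The Majority Voting combination of per-field classifiers can be sketched in a few lines. The sketch below is a minimal illustration under stated assumptions: the labels are hypothetical folder names, and the per-field predictions stand in for the outputs of real classifiers trained on the subject, body, and participant fields:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-field classifier outputs by picking the most
    frequent label. Counter.most_common returns labels ordered by
    count, so ties are broken by first-counted order."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical labels suggested for one message by classifiers
# trained on its subject, body, and participant fields.
per_field_labels = ["work", "inbox", "work"]
print(majority_vote(per_field_labels))  # → work
```

With top-N tagging, the same idea extends naturally: instead of keeping only the winning label, the N most frequent labels across the per-field predictions would be suggested to the user.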