919 results for Visualization Using Computer Algebra Tools
Abstract:
The constant increase in the complexity of digital systems demands the automation of the corresponding synthesis process. This paper presents a computational environment designed to produce both software and hardware implementations of a system. The code-generation tool is named ACG8051. For hardware synthesis, a larger environment was produced, consisting of four programs: PIPE2TAB, AGPS, TABELA, and TAB2VHDL. ACG8051 and PIPE2TAB take place/transition net descriptions from PIPE as inputs. ACG8051 generates assembly code for the 8051 micro-controller. PIPE2TAB produces a tabular version of a Mealy-type finite state machine of the system; its output is fed into AGPS, which performs state allocation. The resulting digital system is then input to TABELA, which minimizes the control functions and outputs of the digital system. Finally, the output generated by TABELA is fed to TAB2VHDL, which produces a VHDL description of the system at the register transfer level. Thus, we present a set of tools that takes a high-level description of a digital system, represented by a place/transition net, and produces as output both assembly code that can be run immediately on an 8051 micro-controller and a VHDL description that can be used to implement the hardware parts directly, either on an FPGA or as an ASIC.
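To give a concrete feel for the kind of transformation a tool such as PIPE2TAB performs, the following is a minimal Python sketch that enumerates the reachability graph of a toy place/transition net and prints it as a state-transition table. The net, the function names, and the table layout are illustrative assumptions only; this is not the actual PIPE2TAB implementation.

# Illustrative sketch (assumed toy net, not the actual PIPE2TAB tool):
# enumerate the reachability graph of a small place/transition net and
# print it as a state-transition table, the kind of intermediate form a
# Mealy-machine generator could consume.
from collections import deque

# Each transition: (name, {place: tokens consumed}, {place: tokens produced})
TRANSITIONS = [
    ("t_start", {"idle": 1}, {"busy": 1}),
    ("t_done",  {"busy": 1}, {"idle": 1, "ack": 1}),
    ("t_clear", {"ack": 1},  {}),
]
INITIAL_MARKING = {"idle": 1, "busy": 0, "ack": 0}

def enabled(marking, consume):
    return all(marking.get(p, 0) >= n for p, n in consume.items())

def fire(marking, consume, produce):
    new = dict(marking)
    for p, n in consume.items():
        new[p] -= n
    for p, n in produce.items():
        new[p] = new.get(p, 0) + n
    return new

def reachability_table(initial):
    """Breadth-first enumeration of reachable markings (the FSM states)."""
    key = lambda m: tuple(sorted(m.items()))
    states = {key(initial): "S0"}
    rows, queue = [], deque([initial])
    while queue:
        m = queue.popleft()
        for name, consume, produce in TRANSITIONS:
            if enabled(m, consume):
                nxt = fire(m, consume, produce)
                if key(nxt) not in states:
                    states[key(nxt)] = f"S{len(states)}"
                    queue.append(nxt)
                rows.append((states[key(m)], name, states[key(nxt)]))
    return rows

for src, event, dst in reachability_table(INITIAL_MARKING):
    print(f"{src:>3} --{event}--> {dst}")

The printed table could then be handed to a state-allocation and logic-minimization step, which is the role the abstract assigns to AGPS and TABELA.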
Abstract:
The C2* radical is used as a probe for reactive flow diagnostics; it was chosen because of its widespread occurrence in plasmas and combustion in aeronautics and aerospace applications. The rotational temperatures of C2* species were determined by comparing experimental and theoretical data. The simulation code was developed by the authors in C++ using the object-oriented paradigm, and it includes a set of new tools that increase the efficacy of the C2* probe in determining the rotational temperature of the system. A brute-force approach to the determination of spectral parameters was adopted in this version of the code. The statistical parameter χ² was used as an objective criterion to determine the best match between experimental and synthesized spectra. The results showed that the program works even with low-quality experimental data, typically collected from in situ airborne compact apparatus. The technique was applied to flames of a Bunsen burner, and a rotational temperature of ca. 2100 K was obtained.
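The brute-force χ² search over a temperature grid can be sketched as below. The spectrum model here is a toy placeholder (Gaussian lines with Boltzmann-weighted intensities), not the authors' C++ Swan-band code; all function names, line lists, and parameter values are assumptions for illustration.

# Minimal sketch of a brute-force chi-squared temperature search.
# The spectrum model is a toy placeholder, NOT the authors' C2* code.
import numpy as np

def synthetic_spectrum(wavelengths, temperature, lines, width=0.05):
    """Toy emission spectrum: Boltzmann-weighted Gaussian lines."""
    k_b = 0.695  # Boltzmann constant in cm^-1/K (spectroscopic units)
    spectrum = np.zeros_like(wavelengths)
    for center, energy in lines:  # (line position, upper-state energy in cm^-1)
        intensity = np.exp(-energy / (k_b * temperature))
        spectrum += intensity * np.exp(-((wavelengths - center) / width) ** 2)
    return spectrum / spectrum.max()

def best_rotational_temperature(wl, measured, lines, t_grid):
    """Brute force: pick the grid temperature minimizing chi-squared."""
    chi2 = []
    for t in t_grid:
        model = synthetic_spectrum(wl, t, lines)
        chi2.append(np.sum((measured - model) ** 2 / (model + 1e-12)))
    return t_grid[int(np.argmin(chi2))]

# Usage with synthetic test data: recover the temperature used to build "measured".
lines = [(516.5 + 0.1 * j, 100.0 * j * (j + 1)) for j in range(1, 15)]
wl = np.linspace(516.0, 518.5, 500)
measured = synthetic_spectrum(wl, 2100.0, lines)
t_grid = np.arange(1000.0, 4000.0, 50.0)
print("Estimated T_rot ~", best_rotational_temperature(wl, measured, lines, t_grid), "K")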
Abstract:
This paper presents two tools developed to facilitate and automate the use of virtual worlds for educational purposes. The first tool automatically creates the classroom space, usually called a region in the virtual world, that is, a region used to carry out educational activities involving professors, students, and interactive objects. The second tool assists in the creation of 3D interactive objects in a virtual world. With these tools, educators are able to produce 3D interactive learning objects and use them in virtual classrooms, improving the quality of their classes and their appeal to students. © 2011 IEEE.
Abstract:
This paper presents novel simulation tools to assist lecturers in teaching about renewable energy sources, with a focus on photovoltaic (PV) systems. The behavior and functionality of PV modules and their interaction with power electronic converters are investigated in the simulation tools. The main PV output characteristics, I (current) versus V (voltage) and P (power) versus V (voltage), were implemented in the tools to aid users during the design steps. To verify the effectiveness of the developed tools, the simulation results were compared with Matlab. Finally, a prototype was implemented to compare experimental results with the results from the proposed tools, validating their operational feasibility. © 2011 IEEE.
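The I-V and P-V characteristics mentioned above are commonly drawn from a single-diode PV model. The sketch below uses the ideal (textbook) form of that model; all parameter values are illustrative assumptions, not taken from the paper or its prototype.

# Hedged sketch: ideal single-diode PV model for I-V and P-V curves.
# Parameter values are illustrative assumptions.
import numpy as np

def pv_curves(v, i_ph=8.0, i_0=1e-7, n=1.3, cells=60, t_kelvin=298.15):
    """Return (I, P) for I = Iph - I0*(exp(V/(n*Ns*Vt)) - 1), Ns = cells."""
    k, q = 1.380649e-23, 1.602176634e-19
    v_t = k * t_kelvin / q                      # thermal voltage per cell
    i = i_ph - i_0 * (np.exp(v / (n * cells * v_t)) - 1.0)
    i = np.clip(i, 0.0, None)                   # discard the non-physical branch
    return i, v * i

voltage = np.linspace(0.0, 40.0, 400)
current, power = pv_curves(voltage)
idx = int(np.argmax(power))
print(f"Vmp ~ {voltage[idx]:.1f} V, Imp ~ {current[idx]:.2f} A, Pmp ~ {power[idx]:.1f} W")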
Abstract:
Nowadays, organizations face the problem of keeping their information protected, available, and trustworthy. In this context, machine learning techniques have been extensively applied to intrusion detection. Since manual labeling is very expensive, several works attempt to handle intrusion detection with traditional clustering algorithms. In this paper, we introduce a new pattern recognition technique, Optimum-Path Forest (OPF) clustering, to this task. Experiments on three public datasets have shown that the OPF classifier may be a suitable tool to detect intrusions on computer networks, since it outperformed some state-of-the-art unsupervised techniques. © 2012 IEEE.
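The general workflow of clustering-based intrusion detection can be sketched as follows. Because Optimum-Path Forest clustering is not part of scikit-learn, k-means is used here purely as a generic stand-in to show the evaluation loop (cluster, then map each cluster to the majority ground-truth label); the toy data is likewise an assumption, not one of the paper's public datasets.

# Hedged sketch of an unsupervised intrusion-detection workflow.
# k-means is a generic stand-in, NOT the OPF clustering used in the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Toy "network flow" features: normal traffic vs. a denser attack blob.
normal  = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(500, 3))
attacks = rng.normal(loc=[4.0, 4.0, 4.0], scale=0.5, size=(100, 3))
X = np.vstack([normal, attacks])
y = np.array([0] * 500 + [1] * 100)          # held out, used only to score

X_scaled = StandardScaler().fit_transform(X)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)

# Map each cluster to its majority label, then measure detection accuracy.
pred = np.empty_like(clusters)
for c in np.unique(clusters):
    labels, counts = np.unique(y[clusters == c], return_counts=True)
    pred[clusters == c] = labels[np.argmax(counts)]
print("accuracy:", float(np.mean(pred == y)))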
Abstract:
The control of molecular architectures has been exploited in layer-by-layer (LbL) films deposited on Au interdigitated electrodes, thus forming an electronic tongue (e-tongue) system that reached an unprecedentedly high sensitivity (down to 10⁻¹² M) in detecting catechol. Such high sensitivity was made possible by using units containing the enzyme tyrosinase, which interacted specifically with catechol, and by processing impedance spectroscopy data with information visualization methods. These methods, including the parallel coordinates technique, were also useful for identifying the major contributors to the high distinguishing ability toward catechol. Among the several film architectures tested, the most efficient had a tyrosinase layer deposited atop LbL films of alternating layers of dioctadecyldimethylammonium bromide (DODAB) and 1,2-dipalmitoyl-sn-glycero-3-phospho-rac-(1-glycerol) (DPPG), viz., (DODAB/DPPG)₅/DODAB/Tyr. The latter represents a more suitable medium for immobilizing tyrosinase than conventional polyelectrolytes. Furthermore, the distinction was more effective at low frequencies, where double-layer effects at the film/liquid interface dominate the electrical response. Because the optimization of film architectures based on information visualization is completely generic, the approach presented here may be extended to designing architectures for other types of applications in addition to sensing and biosensing. © 2013 American Chemical Society.
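A parallel coordinates plot of the kind mentioned above can be produced with pandas, as in the sketch below. The column names and random values are hypothetical placeholders for the real capacitance/impedance data of the e-tongue experiments.

# Hedged sketch of the information-visualization step: parallel coordinates
# of impedance features for different analyte concentrations (toy data).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(1)
rows = []
for conc in ["blank", "1e-10 M", "1e-12 M"]:
    offset = {"blank": 0.0, "1e-10 M": 1.0, "1e-12 M": 0.5}[conc]
    for _ in range(10):
        rows.append({
            "sample": conc,
            "Z_1Hz":   10.0 + offset + rng.normal(0, 0.2),
            "Z_10Hz":   8.0 + offset + rng.normal(0, 0.2),
            "Z_100Hz":  5.0 + 0.3 * offset + rng.normal(0, 0.2),
            "Z_1kHz":   2.0 + 0.1 * offset + rng.normal(0, 0.2),
        })
df = pd.DataFrame(rows)

parallel_coordinates(df, class_column="sample", colormap="viridis")
plt.ylabel("normalized impedance (arbitrary units)")
plt.title("Parallel coordinates of e-tongue responses (illustrative data)")
plt.show()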
Abstract:
Graduate Program in Regional Geology - IGCE
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
The concept of Functional Urban Regions (FURs), also called Metropolitan Regions (MRs), is not simple. It is clear, though, that they are not simply a combination of adjacent municipalities or areas. Different methods can be used for their definition. However, especially in developing countries, the application of some methods is not possible owing to the unavailability of detailed data. Alternative approaches have been developed based on spatial analysis methods and on variables extracted from available data. The objective of this study is to compare the results of two spatial analysis methods exploring two variables: population density and an indicator of transport infrastructure supply. The first method relies on Exploratory Spatial Data Analysis tools, which define uniform regions based on specific variables. The second method uses the same variables and the spatial analysis technique available in the computer program SKATER (Spatial 'K'luster Analysis by Tree Edge Removal). Assuming that these classifications of regions with similar characteristics can be used to identify potential FURs, the results of all analyses were compared with one another and with the 'official' MR. A combined approach was also considered for comparison, but none of the results matches the existing MR boundaries, which challenges the official definitions. (C) 2014 Elsevier Ltd. All rights reserved.
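The idea behind SKATER can be illustrated with a small sketch: build a contiguity graph of areal units weighted by attribute dissimilarity, take its minimum spanning tree, and cut edges to obtain contiguous homogeneous regions. The toy grid below and the simple "cut the heaviest edges" rule are illustrative assumptions, not the actual SKATER program.

# Hedged sketch of the SKATER idea (MST on a contiguity graph + edge removal).
import itertools
import networkx as nx
import numpy as np

rng = np.random.default_rng(2)
side = 5
graph = nx.Graph()

# 5x5 grid of "municipalities": western half dense, eastern half sparse.
for r, c in itertools.product(range(side), range(side)):
    density = (100.0 if c < side // 2 else 10.0) + rng.normal(0, 2.0)
    transport = (1.0 if c < side // 2 else 0.2) + rng.normal(0, 0.05)
    graph.add_node((r, c), x=np.array([density, transport]))

# Rook contiguity edges weighted by attribute dissimilarity.
for r, c in itertools.product(range(side), range(side)):
    for nr, nc in [(r + 1, c), (r, c + 1)]:
        if nr < side and nc < side:
            d = np.linalg.norm(graph.nodes[(r, c)]["x"] - graph.nodes[(nr, nc)]["x"])
            graph.add_edge((r, c), (nr, nc), weight=float(d))

mst = nx.minimum_spanning_tree(graph, weight="weight")
# Removing the k-1 heaviest MST edges leaves k contiguous clusters.
k = 2
worst = sorted(mst.edges(data="weight"), key=lambda e: e[2], reverse=True)[: k - 1]
mst.remove_edges_from([(u, v) for u, v, _ in worst])
for i, component in enumerate(nx.connected_components(mst)):
    print(f"region {i}: {sorted(component)}")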
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Studies show the positive effects that video games can have on student performance and attitude towards learning. In the past few years, strategies have been generated to optimize the use of technological resources with the aim of facilitating widespread adoption of technology in the classroom. Given its low acquisition and maintenance costs, the interpersonal computer allows individual interaction and simultaneous learning with large groups of students. The purpose of this work was to compare arithmetical knowledge acquired by third-grade students through the use of game-based activities and non-game-based activities using an interpersonal computer, with knowledge acquired through the use of traditional paper-and-pencil activities, and to analyze their impact in various socio-cultural contexts. To do this, a quasi-experimental study was conducted with 271 students in three different countries (Brazil, Chile, and Costa Rica), in both rural and urban schools. A set of educational games for practising arithmetic was developed and tested in six schools within these three countries. Results show that there were no significant differences (ANCOVA) in the learning acquired from game-based vs. non-game-based activities. However, both showed a significant difference when compared with the traditional method. Additionally, both groups using the interpersonal computer showed higher levels of student interest than the traditional method group, and these technological methods were seen to be especially effective in increasing learning among weaker students.
Abstract:
The representation of real objects in virtual environments has applications in many areas, such as cartography, mixed reality, and reverse engineering. These objects can be generated in two ways: manually, with CAD (Computer Aided Design) tools, or automatically, by means of surface reconstruction techniques. The simpler the 3D model, the easier it is to process and store. However, these methods can generate very detailed virtual elements, which can cause problems when processing the resulting mesh, because it contains many edges and polygons that must be handled during visualization. In this context, simplification algorithms can be applied to eliminate polygons from the resulting mesh, without changing its topology, generating a lighter mesh with fewer irrelevant details. The project comprised the study, implementation, and comparative testing of simplification algorithms applied to meshes generated by a reconstruction pipeline based on point clouds. This work proposes the simplification step as a complement to the pipeline of Ono et al. (2012), which performs reconstruction from point clouds obtained with a Microsoft Kinect using the Poisson algorithm.
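A reconstruction-plus-simplification pipeline of this kind can be sketched with Open3D, used here only as a stand-in: Poisson surface reconstruction from a point cloud followed by quadric-decimation simplification. The input file name and the target triangle count are assumptions; this is not the pipeline of Ono et al. (2012).

# Hedged sketch: Poisson reconstruction + mesh simplification with Open3D.
import open3d as o3d

# Load a point cloud (e.g. captured with a depth sensor such as the Kinect).
pcd = o3d.io.read_point_cloud("scan.ply")          # hypothetical input file
pcd.estimate_normals()                              # Poisson needs oriented normals

# Dense mesh from the Poisson reconstruction step.
mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=8
)
print("reconstructed triangles:", len(mesh.triangles))

# Simplification step: collapse edges until ~10% of the triangles remain,
# producing a lighter mesh while the quadric-error metric limits the
# geometric detail that is lost.
target = max(1, len(mesh.triangles) // 10)
simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target)
print("simplified triangles:", len(simplified.triangles))
o3d.io.write_triangle_mesh("scan_simplified.ply", simplified)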
Resumo:
Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software. We have been working to address this problem by finding ways to provide at least some of the benefits of formal software engineering techniques to end-user programmers. In this talk, focusing on the spreadsheet application paradigm, I present several of our approaches, focusing on methodologies that utilize source-code-analysis techniques to help end-users build more dependable spreadsheets. Behind the scenes, our methodologies use static analyses such as dataflow analysis and slicing, together with dynamic analyses such as execution monitoring, to support user tasks such as validation and fault localization. I show how, to accommodate the user base of spreadsheet languages, an interface to these methodologies can be provided in a manner that does not require an understanding of the theory behind the analyses, yet supports the interactive, incremental process by which spreadsheets are created. Finally, I present empirical results gathered in the use of our methodologies that highlight several costs and benefits trade-offs, and many opportunities for future work.
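One of the static analyses mentioned, dataflow analysis with slicing, can be illustrated on spreadsheet formulas: build a cell-dependency graph and compute a backward slice for a cell, i.e. every cell that can influence its value. The tiny formula "parser" and the spreadsheet below are illustrative assumptions, not the authors' actual methodology or tools.

# Hedged sketch: backward slicing over a toy spreadsheet's dependency graph.
import re

SHEET = {
    "A1": "10",
    "A2": "20",
    "B1": "=A1*2",
    "B2": "=A1+A2",
    "C1": "=B1+B2",
    "D1": "=7",          # independent of C1's slice
}

CELL_REF = re.compile(r"[A-Z]+[0-9]+")

def references(formula):
    """Cells read by a formula; constants reference nothing."""
    return set(CELL_REF.findall(formula)) if formula.startswith("=") else set()

def backward_slice(sheet, target):
    """All cells whose values can flow into `target` (including itself)."""
    seen, stack = set(), [target]
    while stack:
        cell = stack.pop()
        if cell in seen or cell not in sheet:
            continue
        seen.add(cell)
        stack.extend(references(sheet[cell]))
    return seen

print(sorted(backward_slice(SHEET, "C1")))   # ['A1', 'A2', 'B1', 'B2', 'C1']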
Abstract:
The elimination of all external incisions is an important step in reducing the invasiveness of surgical procedures. Natural Orifice Translumenal Endoscopic Surgery (NOTES) is an incision-less surgery and provides explicit benefits such as reducing patient trauma and shortening recovery time. However, technological difficulties impede the widespread utilization of the NOTES method. A novel robotic tool has been developed, which makes NOTES procedures feasible by using multiple interchangeable tool tips. The robotic tool is capable of entering the body cavity through an orifice or a single incision using a flexible articulated positioning mechanism and, once inserted, is not constrained by incisions, allowing for visualization and manipulation throughout the cavity. The interchangeable tool tips of the robotic device initially consist of three end effectors: a grasper, scissors, and an atraumatic Babcock clamp. The tool changer is capable of selecting and switching between the three tools depending on the surgical task, using a miniature mechanism driven by micro-motors. The robotic tool is remotely controlled through a joystick and computer interface. In this thesis, the following aspects of this robotic tool are detailed. The first-generation robot is designed as a conceptual model for implementing a novel mechanism of switching, advancing, and controlling the tool tips using two micro-motors. It is believed that this mechanism reduces cumbersome instrument exchanges and can reduce overall procedure time and the risk of inadvertent tissue trauma during exchanges with a natural orifice approach. Also, placing actuators directly at the surgical site enables the robot to generate sufficient force to operate effectively. Mounting the multifunctional robot on the distal end of an articulating tube removes restrictions on the robot kinematics and helps solve some of the difficulties otherwise faced during surgery using NOTES or related approaches. The second-generation multifunctional robot is then introduced, in which the overall size is reduced and two arms provide two additional degrees of freedom, making insertion through the esophagus feasible and increasing dexterity. Improvements are necessary in future iterations of the multifunctional robot; however, the work presented is a proof of concept for NOTES robots capable of abdominal surgical interventions.