979 results for Software Tools


Relevance:

60.00%

Publisher:

Abstract:

This work aims at developing an intelligent system for detecting workpiece burn in the surface grinding process by means of a multilayer perceptron neural network, trained to generalize the process and thus obtain the burn threshold. In general, the occurrence of burn in the grinding process can be detected through the DPO and FKS parameters; however, these parameters are not effective under the machining conditions used in this work. The acoustic emission signal and the electric power of the wheel drive motor are the input variables, and the output variable is the occurrence of burn. In the experimental work, one type of steel (quenched ABNT 1045) and one type of grinding wheel, the TARGA model ART 3TG80.3 NVHB, were employed.
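
Purely as an illustration of the approach described above (not the authors' implementation), a minimal Python sketch with scikit-learn; the acoustic-emission and motor-power features and the burn labels are synthetic placeholders:

```python
# Sketch of a burn-detection classifier along the lines of the abstract;
# data arrays and the labeling rule are hypothetical placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features per grinding pass: [ae_rms, motor_power]
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # 1 = burn occurred (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
```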

Relevance:

60.00%

Publisher:

Abstract:

A MATHEMATICA notebook that computes the elements of the matrices arising in the solution of the Helmholtz equation by the finite element method (nodal approximation) for tetrahedral elements of any approximation order is presented. The results of the notebook enable fast computational implementation of finite element codes for high-order 3D simplex elements, reducing the overhead of implementing and testing the complex mathematical expressions obtained from the analytical integrations. These matrices can be used in a large number of applications related to physical phenomena described by the Poisson, Laplace and Schrödinger equations with anisotropic physical properties.
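
The closed-form result that makes such analytical integration possible is the classical factorial formula for barycentric-coordinate monomials over a tetrahedron of volume V: ∫ λ1^a λ2^b λ3^c λ4^d dV = a! b! c! d! 3! V / (a+b+c+d+3)!. A short Python sketch of its use (an illustration only, not part of the notebook itself):

```python
# Analytic integration of barycentric monomials over a tetrahedron, the kind
# of closed-form result the notebook automates for element matrices.
from math import factorial

def tet_monomial_integral(a, b, c, d, volume=1.0):
    """Integral of l1^a * l2^b * l3^c * l4^d over a tetrahedron of given volume."""
    num = factorial(a) * factorial(b) * factorial(c) * factorial(d) * factorial(3)
    return num / factorial(a + b + c + d + 3) * volume

# Mass-matrix entries for linear (P1) tetrahedral elements:
print(tet_monomial_integral(2, 0, 0, 0))  # diagonal entry:     V/10
print(tet_monomial_integral(1, 1, 0, 0))  # off-diagonal entry: V/20
```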

Relevance:

60.00%

Publisher:

Abstract:

An extended version of HIER, a query-the-user facility for expert systems, is presented. HIER was developed to run over Prolog programs and has been incorporated into systems that support the design of large and complex applications. The framework of the extended version is described, as well as the major features of the implementation. An example involving the design of a specific database application is included to illustrate the use of the tool.
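
The abstract gives no implementation detail; as a rough illustration of what a query-the-user facility does, a Python sketch that asks the user for facts the system cannot derive on its own and caches each answer so it is asked at most once:

```python
# Minimal illustration of the query-the-user idea: facts that the knowledge
# base cannot derive are asked interactively, once, and then cached.
known = {}  # fact -> bool, filled by prior inference or by asking the user

def holds(fact):
    if fact not in known:
        answer = input(f"Is it true that {fact}? (y/n) ")
        known[fact] = answer.strip().lower().startswith("y")
    return known[fact]

# A rule using the facility: each fact is asked at most once.
if holds("the table has a primary key") and holds("the key is immutable"):
    print("design check passed")
```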

Relevance:

60.00%

Publisher:

Abstract:

Modern agriculture demands investment in technology that allows farmers to improve the productivity and quality of their products, in order to establish themselves in a competitive market. However, the high cost of acquiring and maintaining such technology can inhibit its spread and acceptance, mainly among the large number of small Brazilian grain farmers, who need innovative low-cost technological solutions suited to their financial reality. Starting from this premise, this paper presents the development of a low-cost prototype for monitoring the temperature and humidity of grain stored in silos, and discusses the economic implications, in terms of cost/benefit ratio, of innovative applications of low-cost technology to grain thermometry. The prototype consisted of two electronic units, one for data acquisition and one for data reception, plus software that offered farmers more precise information for the control of aeration. Data communication between the electronic units and the software was reliable, and both were developed using low-cost electronic components and free software tools. The developed system was considered potentially viable for small Brazilian grain farmers and can be used in any type of small silo. It reduced installation and maintenance costs, allowed easy system expansion, and had a low development cost compared to similar products available on the Brazilian market.
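
As an illustration of the reception side of such a system, a Python sketch using pyserial; the port name, baud rate, record format and aeration threshold are all assumptions, since the abstract does not specify the prototype's protocol:

```python
# Hypothetical receiver side: read temperature/humidity records sent by the
# acquisition unit over a serial link. Port and line format are assumptions.
import serial  # pyserial

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2) as port:
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        parts = line.split(",")
        # Assumed record format: "<sensor_id>,<temp_C>,<rel_humidity_%>"
        if len(parts) != 3:
            continue
        sensor_id, temp, humidity = parts
        print(f"sensor {sensor_id}: {float(temp):.1f} C, {float(humidity):.1f} %RH")
        if float(temp) > 30.0:  # threshold chosen only as an example
            print("  -> aeration recommended")
```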

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Information Science - FFC

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Agronomy (Energy in Agriculture) - FCA

Relevance:

60.00%

Publisher:

Abstract:

Data from various studies carried out in Italy over recent years on the problem of school dropout in secondary education show that difficulty in studying mathematics is one of the most frequent sources of discomfort reported by students. Nevertheless, it is definitely unrealistic to think we can do without such knowledge in today's society: mathematics is widely taught in secondary school and is not confined to technical-scientific courses only. It is reasonable to say that, although students may choose academic courses that are apparently far away from mathematics, all of them will sooner or later have to come to terms with this subject. Among the reasons for the discomfort caused by the study of mathematics, some students mention the very nature of the subject, and in particular the complex symbolic language through which it is expressed. Mathematics is in fact a multimodal system composed of oral and written verbal texts, symbolic expressions such as formulae and equations, figures and graphs. For this reason, the study of mathematics represents a real challenge for those who suffer from dyslexia, a constitutional condition that limits a person's performance in reading and writing and, in particular, in the study of mathematical content. Here the difficulties in working with verbal and symbolic codes entail, in turn, difficulties in understanding the texts from which to deduce the operations that, combined together, lead to the final solution of a problem.

Information technology can compensate for this learning disorder effectively. However, current tools have implementation limits that restrict their use in the study of scientific subjects. Word processors with speech synthesis are commonly used to compensate for reading difficulties in the humanities, but not in mathematics, because the speech synthesizer (or rather the screen reader driving it) cannot interpret anything that is not textual, such as symbols, images and graphs. The DISMATH software, which is the subject of this project, allows dyslexic users to read technical-scientific documents with the help of speech synthesis, to understand the spatial structure of formulae and matrices, and to write technical-scientific documents in a format compatible with the main scientific editors. The system uses LaTeX, a text-based markup language for mathematics, as its mediation layer. It is set up as a LaTeX editor whose graphical interface, in line with the main commercial products, offers additional functions specifically designed to support users who cannot manage verbal and symbolic codes on their own. The LaTeX source is translated in real time into standard symbolic notation and read aloud by the speech synthesizer in natural language, so as to increase, through this bimodal representation, the ability to process information. The comprehension of a mathematical formula through listening is made possible by deconstructing the formula into a tree representation, which exposes the logical elements composing it. Users, even without knowing the LaTeX language, can write whatever scientific document they need: the symbolic elements are selected from dedicated menus and automatically translated by the software, which takes care of the correct syntax.

The final aim of the project, therefore, is to implement an editor that enables dyslexic people (but not only them) to manage mathematical formulae effectively, through the integration of different software tools, thus also allowing better teacher/learner interaction.
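
To make the mediation idea concrete, a toy Python sketch that turns a very small LaTeX subset into spoken text; this is only an illustration of the concept, not DISMATH's actual algorithm:

```python
# Toy illustration of turning a small LaTeX subset into spoken text, the kind
# of mediation DISMATH performs (not DISMATH's actual implementation).
import re

def speak(latex: str) -> str:
    s = latex
    # \frac{a}{b} -> "the fraction a over b" (innermost fractions first)
    frac = re.compile(r"\\frac\{([^{}]*)\}\{([^{}]*)\}")
    while frac.search(s):
        s = frac.sub(r"the fraction \1 over \2", s)
    s = re.sub(r"\^\{([^{}]*)\}", r" to the power \1 ", s)  # exponents
    s = re.sub(r"\^(\w)", r" to the power \1 ", s)
    s = s.replace("\\sqrt", " the square root of ")
    return re.sub(r"\s+", " ", s).strip()

print(speak(r"\frac{x+1}{2}^{3}"))  # the fraction x+1 over 2 to the power 3
```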

Relevance:

60.00%

Publisher:

Abstract:

“Cartographic heritage” is different from “cartographic history”. The second term refers to the study of the development of the surveying and drawing techniques behind maps through time, i.e. through the different kinds of cultural environment that formed the background for their creation. The first term refers to the whole body of ancient maps, together with those cultural environments, which history has handed down to us and which we perceive as cultural assets to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer from preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography make it possible to preserve this map heritage. Moreover, modern geomatic techniques offer new ways of using historical information that would be unachievable on analog supports. In this PhD thesis, the whole digital workflow for the recovery and elaboration of ancient cartography is reported, with special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage. The workflow can be divided into three main steps, which mirror the chapter structure of the thesis itself:

• map acquisition: conversion of the ancient map support from analog to digital form, by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning); this process must be performed carefully, with dedicated instruments, in order to reduce deformation as much as possible;

• map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points; this makes it possible to understand the projection features of the historical map, as well as to evaluate and represent the deformation induced by the old type of cartographic transformation (which may be unknown to us), by surveying errors or by support deformation, errors usually far larger than modern standards allow (a minimal numerical sketch of this step follows after the list);

• data elaboration and management in a digital environment, by means of modern software tools: vectorization, giving the map a new and more attractive graphic appearance (for instance by creating a 3D model), superimposing it on current base maps, comparing it with other maps, and finally inserting it into a GIS or WebGIS environment as a dedicated layer.

The study is supported by several case histories, each of them relevant to at least one step of the digital cartographic workflow.
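
As an illustration of the georeferencing step, a minimal Python sketch that fits an affine transform to ground control points by least squares and reports the residuals, which give a first measure of the map's deformation; all coordinates here are made up:

```python
# Sketch of georeferencing: estimate an affine transform from ground control
# points (GCPs) by least squares; residuals hint at the old map's deformation.
import numpy as np

# (pixel_x, pixel_y) on the scanned map  ->  (east, north) in a modern datum
pixels = np.array([[120, 80], [940, 95], [910, 720], [150, 700], [530, 400]])
world  = np.array([[2.10, 9.80], [8.95, 9.70], [8.70, 3.10],
                   [2.30, 3.30], [5.50, 6.50]])

A = np.hstack([pixels, np.ones((len(pixels), 1))])   # [x, y, 1] design matrix
coeffs, *_ = np.linalg.lstsq(A, world, rcond=None)   # 3x2 affine parameters

residuals = world - A @ coeffs
print("RMS residual:", np.sqrt((residuals**2).mean()))
```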
The ancient maps taken into account are the following:

• three maps of the Po river delta, made at the end of the 16th century by a famous land surveyor, Ottavio Fabri (sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook explaining a new topographical instrument, the squadra mobile (mobile square), which he invented and used himself; today all three maps are preserved in the State Archive of Venice;

• the Ichnoscenografia of Bologna by Filippo de’ Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic bird's-eye view of the city, but with an ichnographic value as well, as the author himself declares;

• the map of Bologna by the periti (surveyors) Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years (1711–1712) after the map by de’ Gnudi; here the scenographic view was abandoned in favor of a more correct representation by orthogonal projection; today the map is preserved in the State Archive of Bologna;

• the Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it is composed of 140 maps and 12 brogliardi (register volumes).

In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure; the Po delta maps were also analyzed from the georeferencing point of view, and the Cadastre with respect to a possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling. Last but not least, the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques opens new research opportunities in a rich, modern, multidisciplinary context.

Relevance:

60.00%

Publisher:

Abstract:

This thesis is concerned with the role played by software tools in the analysis and dissemination of linguistic corpora and with their contribution to a more widespread adoption of corpora in different fields. Chapter 1 contains an overview of some of the most relevant corpus analysis tools available today, presenting their most interesting features and some of their drawbacks. Chapter 2 begins by explaining why none of the available tools appears to satisfy the requirements of the user community and then continues with a technical overview of the current status of the new system developed as part of this work. This presentation is followed by highlights of the features that make the system appealing to users and corpus builders (i.e. scholars willing to make their corpora available to the public). The chapter concludes with future directions for the project and information on the current availability of the software. Chapter 3 describes the design of an experiment devised to evaluate the usability of the new system in comparison with another corpus tool. Usage of the tool was tested in the context of a documentation task performed on a real assignment during a translation class in a master's degree course. In chapter 4 the findings of the experiment are presented on two levels of analysis: first, a discussion of how participants interacted with and evaluated the two corpus tools in terms of interface and interaction design, usability and perceived ease of use; then, an analysis of how users interacted with the corpora to complete the task and of the kinds of queries they submitted. Finally, some general conclusions are drawn and areas for future work are outlined.
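
As an illustration of the basic query type such tools serve, a minimal keyword-in-context (KWIC) concordancer in Python (purely illustrative, not the system developed in the thesis):

```python
# A keyword-in-context (KWIC) concordance in a few lines: print each match of
# the keyword with a fixed window of surrounding text on either side.
import re

def kwic(text: str, keyword: str, width: int = 30):
    for m in re.finditer(rf"\b{re.escape(keyword)}\b", text, re.IGNORECASE):
        left = text[max(0, m.start() - width):m.start()]
        right = text[m.end():m.end() + width]
        print(f"{left:>{width}} [{m.group()}] {right:<{width}}")

kwic("Corpora help translators. A corpus query returns concordance lines "
     "from the corpus in context.", "corpus")
```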

Relevance:

60.00%

Publisher:

Abstract:

This PhD thesis discusses the impact of Cloud Computing infrastructures on Digital Forensics, in their twofold role as a target of investigations and as a helping hand to investigators. The Cloud offers cheap and almost limitless computing power and storage for data, which can be leveraged to commit either new or old crimes and to host the related traces. Conversely, the Cloud can help forensic examiners find clues better and earlier than traditional analysis applications, thanks to its dramatically improved evidence-processing capabilities. In both cases, a new arsenal of software tools needs to be made available. The development of this novel weaponry, and its technical and legal implications from the point of view of the repeatability of technical assessments, is discussed throughout the following pages and constitutes the unprecedented contribution of this work.
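
As a toy illustration of the kind of embarrassingly parallel evidence processing that cloud instances can scale out, a Python sketch that hashes the files in a hypothetical evidence folder across several processes:

```python
# Illustration only: parallel SHA-256 hashing of seized files, the sort of
# workload whose throughput grows with the compute a cloud can provision.
import hashlib
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def sha256_of(path: Path) -> tuple[str, str]:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            h.update(chunk)
    return str(path), h.hexdigest()

if __name__ == "__main__":
    files = [p for p in Path("evidence/").glob("**/*") if p.is_file()]
    with ProcessPoolExecutor() as pool:  # hypothetical evidence folder
        for name, digest in pool.map(sha256_of, files):
            print(digest, name)
```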

Relevance:

60.00%

Publisher:

Abstract:

Transmission electron microscopy, combined with bioinformatic methods for digital image processing, is a fast route to three-dimensional reconstructions of large protein complexes. Combining 3D electron microscopy with the X-ray structures of the subunits yields a pseudo-atomic model of the quaternary structure. In this work, the quaternary structures of three different respiratory proteins (one hemoglobin and two hemocyanins) were analyzed in this way. In addition, dedicated software tools were developed to allow existing software packages to be combined more effectively. The ca. 15 Å 3D reconstruction of the hemoglobin of the water flea Daphnia pulex settles the disputed question of how many subunits build up the quaternary structure: there are 16 (each with two heme domains), arranged in two layers as a D4-symmetric sandwich. The ca. 15 Å 3D reconstruction of the 2x6-meric hemocyanin of the crayfish Astacus leptodactylus provides new insights into the interface between the two hexamers, which lies in the region of domain 3. For the hemocyanin of the horseshoe crab Limulus polyphemus, which consists of 48 subunits, a resolution of ca. 7 Å was achieved. The homology models of the subunits were fitted flexibly. At one of the contact sites between the two half-molecules, molecular dynamics (MD) simulations were carried out to learn more about the nature of the chemical bonding at this interface. Specifically for the combination of 3D electron microscopy and MD simulation, various bioinformatic tools and an easily extensible, universal graphical user interface (GUI) were developed.

Relevance:

60.00%

Publisher:

Abstract:

The chemical industry has to face safety problems linked to the hazards of chemicals and the risks posed by the plants where they are handled. However, transporting these chemicals can pose significant risks too: it is not entirely possible to avoid accidents. This work focuses on the emergency response to railway accidents involving hazardous materials, that is, what has to be done once such accidents happen in order to limit their consequences. A first effort was devoted to understanding the role given to this theme in national legislation: it turned out that it is often not even taken into account. Only a few countries adopt guidelines suggesting how to plan the response, who is appointed to intervene and which actions should be taken first. An investigation was then made into the tools available to responders, with attention to the availability of chemical-specific safety distances. It emerged that the ERG guidebook adopted in some American countries provides such suggestions, and that Belgian legislation also establishes criteria for evaluating these distances. An analysis was then conducted of the most recent accidents that occurred worldwide, to understand how the response was performed and which safety distances were adopted. These values were compared with the figures reported in the ERG guidebook and with the results of two dedicated software tools for consequence analysis of accidental spill scenarios. The comparison showed differences between them and indicated that a more standardized approach is necessary. Further developments of the topic should therefore focus on promoting uniform procedures for emergency response planning and on the worldwide adoption of a guidebook with suggestions on actions to reduce consequences and on safety distances determined through finer research. To this end, the development of a detailed database of hazardous materials transportation accidents could be useful.

Relevance:

60.00%

Publisher:

Abstract:

For various reasons, it is important, if not essential, to integrate the computations and code used in data analyses, methodological descriptions, simulations, etc. with the documents that describe and rely on them. This integration allows readers to both verify and adapt the statements in the documents. Authors can easily reproduce them in the future, and they can present the document's contents in a different medium, e.g. with interactive controls. This paper describes a software framework for authoring and distributing these integrated, dynamic documents that contain text, code, data, and any auxiliary content needed to recreate the computations. The documents are dynamic in that the contents, including figures, tables, etc., can be recalculated each time a view of the document is generated. Our model treats a dynamic document as a master or "source" document from which one can generate different views in the form of traditional, derived documents for different audiences. We introduce the concept of a compendium as both a container for the different elements that make up the document and its computations (i.e. text, code, data, ...), and as a means for distributing, managing and updating the collection. The step from disseminating analyses via a compendium to reproducible research is a small one. By reproducible research, we mean research papers with accompanying software tools that allow the reader to directly reproduce the results and employ the methods that are presented in the research paper. Some of the issues involved in paradigms for the production, distribution and use of such reproducible research are discussed.
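
As a toy illustration of the dynamic-document idea (not the authors' framework), a Python sketch in which code chunks marked <<...>> are executed each time the document is rendered and replaced by their output:

```python
# A toy "dynamic document" renderer: code chunks embedded in the source are
# run at view time, so derived contents are recalculated on every rendering.
import contextlib
import io
import re

def render(source: str) -> str:
    def run(match):
        buf = io.StringIO()
        with contextlib.redirect_stdout(buf):
            exec(match.group(1), {})  # execute the chunk, capture its output
        return buf.getvalue().rstrip()
    return re.sub(r"<<(.*?)>>", run, source, flags=re.DOTALL)

doc = "The mean of the data is <<print(sum([1, 2, 3, 4]) / 4)>> units."
print(render(doc))  # The mean of the data is 2.5 units.
```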