967 results for IT tools
Abstract:
Si3N4 tools were coated with a thin diamond film in a Hot-Filament Chemical Vapour Deposition (HFCVD) reactor in order to machine a grey cast iron. The wear behaviour of these tools in high-speed machining was the main subject of this work. Turning tests were performed with a combination of cutting speeds of 500, 700 and 900 m min−1 and feed rates of 0.1, 0.25 and 0.4 mm rot−1, with the depth of cut kept constant at 1 mm. In order to evaluate the tool behaviour during the turning tests, cutting forces were analysed, and a significant increase with feed rate was verified. Diamond film removal occurred for the most severe set of cutting parameters. Adhesion of iron and manganese from the workpiece to the tool was also observed. Tests were performed on a CNC lathe fitted with a 3-axis dynamometer. Results were collected and registered by in-house software. Tool wear analysis was carried out with a Scanning Electron Microscope (SEM) equipped with an X-ray Energy Dispersive Spectroscopy (EDS) system. Surface analysis was performed with a profilometer.
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
Dynamic and distributed environments are hard to model since they suffer from unexpected changes, incomplete knowledge, and conflicting perspectives and, thus, call for appropriate knowledge representation and reasoning (KRR) systems. Such KRR systems must handle sets of dynamic beliefs, be sensitive to communicated and perceived changes in the environment and, consequently, may have to drop current beliefs in the face of new findings or disregard any new data that conflicts with stronger convictions held by the system. Not only do they need to represent and reason with beliefs, but they must also perform belief revision to maintain the overall consistency of the knowledge base. One way of developing such systems is to use reason maintenance systems (RMS). In this paper we provide an overview of the most representative types of RMS, also known as truth maintenance systems (TMS), which are computational instances of the foundations-based theory of belief revision. An RMS module works together with a problem solver. The latter feeds the RMS with assumptions (core beliefs) and conclusions (derived beliefs), which are accompanied by their respective foundations. The role of the RMS module is to store the beliefs, associate with each belief (core or derived) the corresponding set of supporting foundations, and maintain the consistency of the overall reasoning by keeping, for each represented belief, the current supporting justifications. Two major approaches are used in reason maintenance: single- and multiple-context reasoning systems. In single-context systems, each belief is associated with the beliefs that directly generated it (the justification-based TMS (JTMS) or the logic-based TMS (LTMS)), whereas in the multiple-context counterparts each belief is associated with the minimal set of assumptions from which it can be inferred (the assumption-based TMS (ATMS) or the multiple belief reasoner (MBR)).
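The single-context scheme described above can be sketched in a few lines. This is a minimal illustration of the JTMS idea, not the implementation of any particular RMS: class and method names are invented, and status is recomputed on demand rather than propagated incrementally as real systems do.

```python
# Minimal sketch of a justification-based TMS (JTMS): a belief is IN if it
# is a premise (core belief) or has at least one justification whose
# antecedents are all IN. Retracting a premise removes support from every
# belief derived from it.

class JTMS:
    def __init__(self):
        self.premises = set()     # core beliefs fed in by the problem solver
        self.justifications = {}  # belief -> list of antecedent sets

    def add_premise(self, belief):
        self.premises.add(belief)

    def justify(self, belief, antecedents):
        # record a justification: belief follows from all the antecedents
        self.justifications.setdefault(belief, []).append(set(antecedents))

    def retract(self, belief):
        # drop a core belief; derived beliefs lose support on the next query
        self.premises.discard(belief)

    def is_in(self, belief, _seen=None):
        # IN if premise, or some justification has all antecedents IN
        # (cyclic support is treated as unsupported)
        if belief in self.premises:
            return True
        seen = _seen or set()
        if belief in seen:
            return False
        for ants in self.justifications.get(belief, []):
            if all(self.is_in(a, seen | {belief}) for a in ants):
                return True
        return False

tms = JTMS()
tms.add_premise("bird(tweety)")
tms.justify("flies(tweety)", ["bird(tweety)"])
print(tms.is_in("flies(tweety)"))   # True
tms.retract("bird(tweety)")
print(tms.is_in("flies(tweety)"))   # False
```

The on-demand recomputation keeps the sketch short; production JTMS implementations instead label nodes IN/OUT and propagate label changes only along affected justifications.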
Abstract:
Electricity markets are complex environments comprising several negotiation mechanisms. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a simulator developed to allow deep studies of the interactions between the players that take part in electricity market negotiations. ALBidS (Adaptive Learning Strategic Bidding System) is a multiagent system created to provide decision support to market negotiating players. Fully integrated with MASCEM, it considers several different methodologies based on very distinct approaches. The Six Thinking Hats is a powerful technique used to look at decisions from different perspectives. This paper aims to complement the use of ALBidS strategies by MASCEM players, providing, through the Six Thinking Hats group decision technique, a means to combine them and take advantage of their different perspectives. The combination of the different proposals resulting from ALBidS' strategies is performed through the application of a Genetic Algorithm, resulting in an evolutionary learning approach.
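The evolutionary combination step can be illustrated as follows. This is a hedged sketch, not the ALBidS implementation: the three strategy proposals and the target price are invented, and the chromosome is simply a vector of weights over the strategies' bids.

```python
# Sketch: combine several strategies' bid proposals through a genetic
# algorithm. Each chromosome is a weight vector; fitness rewards weighted
# bids close to a (hypothetical) observed market price.
import random

random.seed(42)                   # for reproducibility of the sketch

proposals = [42.0, 50.0, 38.0]    # hypothetical bids from three strategies
target = 45.0                     # hypothetical market outcome to approximate

def combine(weights):
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, proposals)) / total

def fitness(weights):
    return -abs(combine(weights) - target)   # closer to target is fitter

def evolve(pop_size=30, generations=60):
    pop = [[random.random() + 0.01 for _ in proposals] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitist selection: keep top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(proposals))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                   # gaussian mutation
                i = random.randrange(len(child))
                child[i] = max(0.01, child[i] + random.gauss(0, 0.2))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(round(combine(best), 2))    # weighted bid close to the target of 45.0
```

Because the parents survive each generation, the best weight vector never gets worse, which is the simplest way to make such a combiner converge.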
Abstract:
Harnessing the idle CPU cycles, storage space and other resources of networked computers for collaborative work is a main focus of all major grid computing research projects. Most university computer labs are nowadays equipped with powerful desktop PCs, and it is easy to notice that most of the time these machines lie idle, their computing power wasted. However, complex problems and the analysis of very large amounts of data require large computational resources. For such problems, one may run the analysis algorithms on very powerful and expensive computers, which reduces the number of users that can afford such data analysis tasks. Instead of using single expensive machines, distributed computing systems offer the possibility of using a set of much less expensive machines to do the same task. The BOINC and Condor projects have been successfully used to support real scientific research around the world at low cost. The main goal of this work is to explore both distributed computing platforms, Condor and BOINC, and to use them to harness idle PC resources for academic researchers to apply in their research work. In this thesis, data mining tasks were performed by implementing several machine learning algorithms in the distributed computing environment.
Abstract:
Dissertation to obtain the degree of Doctor in Electrical and Computer Engineering, specialization in Collaborative Networks
Abstract:
Software tools in education have been popular since personal computers became widespread. Engineering courses led the way in this development, and these tools became almost a standard. Engineering graduates are familiar not only with numerical analysis tools but also with simulators (e.g. of electronic circuits), computer-assisted design tools and others, depending on the degree. One of the main problems with these tools is when and how to start using them so that they can be beneficial to students and not mere substitutes for potentially difficult calculations or design. In this paper a software tool to be used by first-year students in electronics/electricity courses is presented. The growing acknowledgement and acceptance of open source software led to the choice of an open source tool, Scilab, a numerical analysis environment, as the basis for a toolbox. The toolbox was developed to be used standalone or integrated in an e-learning platform; the platform used was Moodle. The first step was to assess the mathematical skills necessary to solve the problems posed in electronics and electricity courses. Analysing existing circuit simulators, it is clear that even though they are very helpful by showing the end result, they are not so effective in supporting students' study and self-learning, since they show results but not the intermediate steps, which are crucial in problems involving derivatives or integrals. Also, they are not very effective in producing graphical results that could be used to elaborate reports and to improve overall comprehension of the results. The developed toolbox, built on the numerical analysis software Scilab, gives its users not only the end results of a circuit analysis but also the expressions obtained in derivative and integral calculations, and the ability to plot signals, obtain vector diagrams, etc.
The toolbox runs entirely in the Moodle web platform and provides the same results as the standalone application. Students can use the toolbox through the web platform (on computers where they do not have installation privileges) or on their personal computers by installing both the Scilab software and the toolbox. This approach was designed for first-year students of all engineering degrees that have electronics/electricity courses in their curricula.
Abstract:
The aim of this article is to identify patterns in the spatial distribution of the cases of dengue fever that occurred in the municipality of Cruzeiro, State of São Paulo, in 2006. This is an ecological and exploratory study using spatial analysis tools in the preparation of thematic maps with data from Sinan-Net. An analysis was made by area, taking the IBGE census tract as the unit; it covered the four months of 2006 in which the disease occurred in the city. The thematic maps were constructed with the TerraView 3.3.1 software, which also provided the values of the Global Moran indicator (I M) for every month and the Kernel estimation. In 2006, 691 cases of dengue were georeferenced (a rate of 864.2 cases/100,000 inhabitants); the Moran indicators and p-values obtained were I M = 0.080 (March), p = 0.11; I M = 0.285 (April), p = 0.01; I M = 0.201 (May), p = 0.01; and I M = 0.002 (June), p = 0.57. The first cases were identified in the Northeast and Central areas of Cruzeiro and the most recent cases in the North, Northeast and Central areas. It was possible to identify the census tracts where the epidemic began and how the epidemic evolved temporally and spatially in the city.
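The Global Moran's I values reported above are computed by TerraView internally; the statistic itself can be sketched directly. The case counts and neighbourhood weights below are invented for illustration, not data from the Cruzeiro study.

```python
# Sketch of the Global Moran's I statistic for area data: positive values
# indicate that neighbouring areas carry similar values (clustering),
# values near zero indicate spatial randomness.
import numpy as np

def morans_i(values, weights):
    """Global Moran's I from a vector of area values and a spatial weights
    matrix (weights[i][j] > 0 when areas i and j are neighbours)."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = len(x)
    z = x - x.mean()                      # deviations from the mean
    num = (w * np.outer(z, z)).sum()      # cross-products over neighbour pairs
    den = (z ** 2).sum()
    return (n / w.sum()) * (num / den)

# four census tracts in a row, each neighbouring the next (rook contiguity)
cases = [10, 8, 2, 1]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(round(morans_i(cases, w), 3))   # 0.407: high counts cluster together
```

In practice the p-values quoted in the abstract come from a permutation test: the statistic is recomputed many times with the case counts shuffled across tracts, and the observed I is compared against that reference distribution.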
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
As an introduction to a series of articles focused on the exploration of particular tools and/or methods that bring together digital technology and historical research, the aim of this paper is mainly to highlight and discuss to what extent those methodological approaches can improve the analytical and interpretative capabilities available to historians. At a moment when the digital world presents us with an ever-increasing variety of tools to perform extraction, analysis and visualization of large amounts of text, we thought it would be relevant to bring the digital closer to the vast historical academic community. More than repeating the idea of a digital revolution in historical research, recurrent in the literature since the 1980s, the aim was to show the validity and usefulness of digital tools and methods as another set of highly relevant instruments that historians should consider. For this, several case studies were used, combining the exploration of specific themes of historical knowledge with the development or discussion of digital methodologies, in order to highlight some changes and challenges that, in our opinion, are already affecting historians' work, such as a greater focus on interdisciplinarity and collaborative work, and the need for the communication of historical knowledge to become more interactive.
Abstract:
Unilever Food Solutions' new digital CRM platform: what is the combination of tools, processes and content that will help Unilever Food Solutions grow its business? Unilever Food Solutions (UFS) intends to create a new online platform to enable it to communicate with segments of the market which have previously been too difficult to reach. Specifically targeted at chefs and other food professionals, the aim is to create an interactive website which delivers value to its intended users by providing a variety of relevant content and functions, while simultaneously opening up a potential transactional channel to those same users.
Abstract:
For hundreds of years biologists have studied the naturally occurring diversity in plant and animal species. The invention of the electron microscope in the first half of the 1900s revealed that cells can also be incredibly complex (and often stunningly beautiful). However, despite the fact that the field of cell biology has existed for over 100 years, we still lack a formal understanding of how cells evolve: it is unclear what the extremes of cell and organelle morphology are, if and how diversity might be constrained, and how organelles change morphologically over time. (...)
Abstract:
Promoting the use of non-motorized modes of transport, such as cycling, is an important contribution to the improvement of mobility, accessibility and equity in cities. Cycling offers a fast and cheap transportation option for short distances, helping to lower pollutant emissions and contributing to a healthier way of life. In order to make the cycling mode more competitive in relation to motorized traffic, it is necessary to evaluate the potential of alternatives from the perspective of the physical effort required. One way to do so consists of assessing the suitability of locations for implementing cycling infrastructures. In this work, four tools to determine the gradient along potential cycling paths are compared. Furthermore, the reliability of some low-cost tools to measure this parameter was evaluated, by comparison with standard measurements using cartographic plans, in a field case study in the city of Braga, Portugal. These tools revealed a good level of accuracy for the planning stage, but proved to be less reliable for use in design.
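The parameter those tools measure reduces to a simple computation: the slope, in percent, between consecutive points of a candidate path, from elevation change and horizontal distance. The profile below is invented for illustration; it is not measured data from the Braga case study.

```python
# Sketch of the gradient computation along a path: slope (%) between
# consecutive profile points from elevation difference and horizontal
# distance. Positive values are climbs, negative values descents.
def slope_percent(elev_a_m, elev_b_m, distance_m):
    """Gradient in percent between two points a horizontal distance apart."""
    return 100.0 * (elev_b_m - elev_a_m) / distance_m

# hypothetical path segment: (elevation in m, horizontal distance to next point in m)
profile = [(190.0, 200.0), (196.0, 150.0), (199.0, 250.0), (197.0, 0.0)]
grades = [slope_percent(profile[i][0], profile[i + 1][0], profile[i][1])
          for i in range(len(profile) - 1)]
print([round(g, 1) for g in grades])   # [3.0, 2.0, -0.8]
```

The comparison in the paper then comes down to how accurately each tool (barometric logger, GPS track, digital elevation model, cartographic plan) supplies the elevation values fed into this calculation.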
Abstract:
Nowadays the main honey producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynology analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and high-skilled technicians. In this work, the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucaliptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensor sub-sets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10-20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. Also, the proposed technique enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency suggests its applicability for honey labeling and geographical origin identification. Nevertheless, this approach is not a full alternative to the traditional melissopalynologic analysis; it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation.
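The validation scheme described, multiple linear regression assessed by repeated K-fold cross-validation, can be sketched as follows. The data here are synthetic stand-ins for the e-tongue sensor signals and pollen percentages, and the averaged R² is only illustrative of the procedure.

```python
# Sketch: multiple linear regression (ordinary least squares) scored by
# repeated K-fold cross-validation, so every sample serves as internal
# validation data in each repeat.
import numpy as np

def kfold_indices(n, k, rng):
    """Split a random permutation of 0..n-1 into k folds."""
    return np.array_split(rng.permutation(n), k)

def repeated_kfold_r2(X, y, k=5, repeats=10, seed=0):
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(repeats):
        for test_idx in kfold_indices(len(y), k, rng):
            train = np.setdiff1d(np.arange(len(y)), test_idx)
            # fit OLS with an intercept column on the training fold
            A = np.column_stack([np.ones(len(train)), X[train]])
            beta, *_ = np.linalg.lstsq(A, y[train], rcond=None)
            # score R^2 on the held-out fold
            At = np.column_stack([np.ones(len(test_idx)), X[test_idx]])
            pred = At @ beta
            ss_res = ((y[test_idx] - pred) ** 2).sum()
            ss_tot = ((y[test_idx] - y[test_idx].mean()) ** 2).sum()
            scores.append(1 - ss_res / ss_tot)
    return float(np.mean(scores))

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))                 # 60 "honeys", 4 "sensor" signals
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=60)
print(round(repeated_kfold_r2(X, y), 2))     # high R^2: data are nearly linear
```

With k=5, each repeat holds out 20% of the samples per fold, matching the abstract's requirement that 10-20% of the honeys are reserved for internal validation; the sensor sub-set search by simulated annealing would wrap around this scoring function.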
Abstract:
PhD thesis in Bioengineering