Abstract:
In this work, we propose a methodology for teaching robotics in elementary schools, based on Vygotsky's socio-historical theory. This methodology, together with the Lego Mindstorms® kit and an educational software tool (an interface for controlling and programming prototypes), forms an educational robotics system named RoboEduc. For the practical development of this work, we adopted an action-research strategy, carrying out robotics activities with children aged 8 to 10, students at the elementary level of the Municipal School Ascendino de Almeida. This school is located in the Pitimbu area, on the periphery of Natal, in the state of Rio Grande do Norte. The activities focused on understanding the construction of robotic prototypes and their programming and control. In constructing prototypes, children develop zones of proximal development (ZPDs), learning spaces that, when well used, allow the construction not only of scientific concepts but also of abilities and capacities that are important for the social and cultural interaction of each individual and of the group. Through these practical workshops, it was possible to analyse the use of the robot as the mediating element of the teaching-learning process and the contributions that robotics may bring to teaching from the elementary level onwards.
Abstract:
In industrial informatics, several attempts have been made to develop notations and semantics for classifying and describing different kinds of system behavior, particularly in the modeling phase. Such attempts provide the infrastructure to solve real engineering problems and to build practical systems that aim mainly to increase the productivity, quality, and safety of the process. Despite the many studies that have attempted to develop friendly methods for programming industrial controllers, these controllers are still programmed by conventional trial-and-error methods and, in practice, there is little written documentation on these systems. The ideal solution would be a computational environment that allows industrial engineers to implement the system in a high-level language that follows international standards. Accordingly, this work proposes a methodology for plant and control modelling of discrete event systems that include sequential, parallel and timed operations, using a formalism based on Statecharts, named Basic Statechart (BSC). The methodology also allows automatic procedures to validate and implement these systems. To validate the methodology, we present two case studies with typical examples from the manufacturing sector. The first example shows a sequential control for a tagged machine, used to illustrate dependences between the devices of the plant. In the second example, we discuss more than one strategy for controlling a manufacturing cell. The model with no control has 72 states (distinct configurations); the model with sequential control generated 20 different states, acting in only 8 distinct configurations; and the model with parallel control generated 210 different states, acting in only 26 distinct configurations, and is therefore a less restrictive control strategy than the previous one.
Lastly, we present an example that highlights the modular character of our methodology, which is very important for the maintenance of applications. In this example, the sensors that identify pieces in the plant were removed, so changes in the control model were needed to transmit the information from the input buffer sensor to the other positions of the cell.
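The state counts quoted above arise from composing the local state machines of the plant devices in parallel: the global configurations of the uncontrolled plant are the Cartesian product of the local states. A minimal sketch, with made-up device names and states (not taken from the BSC case studies):

```python
from itertools import product

# Hypothetical plant modeled as parallel finite-state machines.
# The uncontrolled global state space is the product of the local state sets,
# which is how counts such as "72 distinct configurations" arise.
machine_states = {
    "conveyor": ["idle", "moving"],
    "drill":    ["up", "descending", "drilling", "ascending"],
    "gripper":  ["open", "closing", "closed"],
}

def global_configurations(machines):
    """Enumerate every global configuration of the uncontrolled plant."""
    names = list(machines)
    return [dict(zip(names, combo))
            for combo in product(*(machines[n] for n in names))]

configs = global_configurations(machine_states)
print(len(configs))  # 2 * 4 * 3 = 24 configurations
```

A controller restricts which of these configurations are reachable, which is why the sequential control above acts in only 8 of the 72 plant configurations while the parallel control acts in 26.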
Abstract:
Furthered mainly by new technologies, the expansion of distance education has created a demand for tools and methodologies that enhance teaching techniques based on proven pedagogical theories. Such methodologies must also be applicable in the so-called Virtual Learning Environments. The aim of this work is to present a planning methodology, based on known pedagogical theories, that contributes to the incorporation of assessment into the process of teaching and learning. With this in mind, the pertinent literature was reviewed to identify the key pedagogical concepts needed for the definition of this methodology, and a descriptive approach was used to establish the relations between this conceptual framework and distance education. As a result, two teaching tools, the Contents Map and the Dependence Map, were specified and implemented; they promote the planning of a course that takes assessment into account from this early stage. Integrated into Moodle, the developed tools were tested in a distance learning course for practical observation of the concepts involved. It was verified that the methodology supported by these tools is indeed helpful in course planning and in strengthening educational assessment, placing the student as the central element of the teaching and learning process.
Abstract:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that such problems cannot be solved in polynomial time. Initially, these solutions were based on heuristics; currently, metaheuristics are more often used for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic called the Operon, for constructing the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses that are adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search in the space of solutions. The methodology is applied to the Traveling Salesman Problem (TSP) through a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is carried out through Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second is carried out through Survival Analysis, based on the probability of the execution time observed until an optimal solution is reached.
The third is carried out by means of a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, it was concluded on the grounds of statistical tests that there is evidence that ProtoG performs better than the three other algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which the performance of the algorithms was evaluated through PES, the average PES obtained with ProtoG was less than 1% in almost half of the instances, reaching its greatest average, 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
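The PES metric used in the third analysis is simply the percentage by which a found tour cost exceeds the best known cost. A minimal sketch (the tour costs below are illustrative, not from the experiments):

```python
def pes(found_cost: float, best_known_cost: float) -> float:
    """Percent Error of the Solution: how far (in %) the found solution
    exceeds the best solution available in the literature."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

# Illustrative values only: a tour 3.52% above the best known one,
# matching the largest average reported above.
print(pes(10352.0, 10000.0))
```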
Abstract:
Ensuring dependability requirements is essential for industrial applications, since faults may cause failures whose consequences include economic losses, environmental damage or harm to people. Given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase; the proposal can also be easily adapted to the maintenance and expansion stages of a network. The proposal uses graph theory and the fault tree formalism to automatically create, from a given industrial wireless network topology, an analytical model on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), importance measures of devices, redundancy aspects and common cause failures. It must be emphasized that the proposal is independent of any particular tool for quantitatively evaluating the target metrics; however, for validation purposes, a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm for generating minimal cut sets, originally applied in graph theory, was adapted to the fault tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the proposed methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster and mesh topologies. Scenarios with common cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To verify the scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results show the applicability of the proposal to networks typically found in industrial environments.
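The minimal cut sets mentioned above are the smallest sets of devices whose joint failure brings the system down; once they are known, system unreliability follows from the device failure probabilities. A minimal sketch assuming independent failures, with a made-up three-hop topology and illustrative probabilities (not from the thesis scenarios):

```python
from itertools import product

# Illustrative device failure probabilities for a tiny hypothetical network:
# a gateway, two redundant routers, and one field device.
failure_prob = {"gw": 0.01, "r1": 0.05, "r2": 0.05, "dev": 0.02}

# A minimal cut set is a smallest set of devices whose joint failure
# disconnects the field device from the gateway. Here the routers are
# redundant, so only their joint failure is a cut.
minimal_cut_sets = [{"gw"}, {"dev"}, {"r1", "r2"}]

def unreliability(cut_sets, probs):
    """Exact P(system failure) by enumerating all device up/down states,
    assuming independent device failures (no common cause failures)."""
    devices = sorted(probs)
    total = 0.0
    for states in product([False, True], repeat=len(devices)):  # True = failed
        failed = {d for d, s in zip(devices, states) if s}
        p = 1.0
        for d, s in zip(devices, states):
            p *= probs[d] if s else 1.0 - probs[d]
        if any(cs <= failed for cs in cut_sets):
            total += p
    return total

print(round(unreliability(minimal_cut_sets, failure_prob), 6))
```

Exhaustive enumeration is exponential in the number of devices, which is exactly why the thesis adapts a dedicated minimal-cut-set algorithm for scalability in networks approaching 100 devices.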
Abstract:
In academia, it is common to create didactic processors for practical courses in the computer hardware area; they can also be used as subjects in courses on software platforms, operating systems and compilers. Often, these processors are described without a standard ISA, which requires the creation of compilers and other basic software to provide the hardware/software interface, and hinders their integration with other processors and devices. Using reconfigurable devices described in an HDL allows the creation or modification of any microarchitecture component, making it possible to alter the functional units of the processor's datapath, as well as the state machine that implements the control unit, as new needs arise. In particular, RISP processors enable the modification of machine instructions, allowing instructions to be added or changed, and may even adapt to a new architecture. Taking educational soft-core processors described in VHDL as its object of study, this work proposes a methodology and applies it to two processors of different complexity levels, showing that it is possible to adapt processors to a standard ISA without increasing hardware complexity, i.e., without a significant increase in chip area, while performance in application execution remains unchanged or is enhanced. The implementations also show that, besides it being possible to replace the architecture of a processor without changing its organization, a RISP processor can switch between different instruction sets, which can be extended to toggling between different ISAs, allowing a single processor to become an adaptive hybrid architecture that can be used in embedded systems and heterogeneous multiprocessor environments.
Abstract:
This work presents a new methodology for analog circuit verification, providing automated tools that help verifiers obtain more trustworthy results. The main goal is a more automated verification process for certifying the functional behavior of analog circuits. The proposed methodology is based on the golden model technique. A verification environment based on this methodology was built, and the results of a case study on the validation of an operational amplifier design are offered as confirmation of its effectiveness. The results show that the verification process became more trustworthy because of the automation provided by the tool developed.
Abstract:
The objective of this research is to discuss the need to implement new alternatives for metrological control: for the findings of initial and subsequent measurements, and for the procedures that control measurement uncertainty in assessing the losses or remainders found in the handling of bulk liquids, when turbine meters are used in fiscal measurement in Petrobras' business, in view of the current national and international environment of legal and scientific metrology. With these alternatives we aim to: standardize the minimization of random and systematic errors and the estimation of residual errors; improve the management of metrological calibration procedures and the control of measurement uncertainty; and contribute to changing how legal and scientific metrology is practised, disseminating new information for the change management of metrological control, with an objective focus on supervising these activities in the control of measurement uncertainty in the Petrobras fiscal measurement system. Results, information and comments are presented on the influence of measurement uncertainty on current fiscal and custody transfer results. These emphasize the need, among other things, to improve and expand the monitored metrological control, so as to better meet the demand for calibrating Petrobras' equipment and measuring instruments. Finally, we establish the need to improve the method of evaluating meter data applied to the current management of measurement uncertainty, proposing a methodology to address the problem and highlighting the expected results.
Abstract:
Activities that use Global Navigation Satellite Systems (GNSS) are countless, and the most widely used system is the Global Positioning System (GPS) developed by the United States. Precision agriculture demands static and kinematic positioning with distinct levels of accuracy for different applications; nevertheless, kinematic performance data are not available, as manufacturers of GPS receivers present only static performance information. For this reason, an instrumented vehicle was developed to test a methodology for evaluating the performance of GPS receivers under kinematic conditions representative of agricultural operations. A set of instruments was assembled and used to collect data under variable speed and rotation direction. The tests conducted showed that the methodology makes it possible to measure accuracy and precision, but improvements to the instrumentation are needed for long-term tests.
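Accuracy and precision, as measured above, are distinct quantities: accuracy captures the systematic offset from a known reference, while precision captures the scatter of the receiver's own readings. A minimal sketch, with illustrative coordinates in metres in a local frame (not real test data):

```python
import math

# Hypothetical reference point and receiver samples (metres, local frame).
reference = (0.0, 0.0)
samples = [(0.4, 0.1), (0.5, -0.2), (0.3, 0.0), (0.6, 0.1)]

def accuracy_and_precision(ref, pts):
    """Accuracy: mean distance to the reference (systematic offset).
    Precision: RMS scatter of the samples about their own mean."""
    errors = [math.dist(ref, p) for p in pts]
    accuracy = sum(errors) / len(errors)
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    precision = math.sqrt(
        sum(math.dist((mx, my), p) ** 2 for p in pts) / len(pts))
    return accuracy, precision

acc, prec = accuracy_and_precision(reference, samples)
print(round(acc, 3), round(prec, 3))
```

In kinematic tests the reference is a time-stamped trajectory rather than a single point, but the accuracy/precision split is the same.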
Abstract:
The present work presents a method for the design and implementation of PID controllers based on industrial instrumentation. An automatic system for the auto-tuning of PID controllers is presented, for first- and second-order systems. The software presented in this work is applied to plants controlled by PID controllers implemented in a PLC, performing the auto-tuning of the PID parameters of plants that need it. The software has two stages: the first is the identification of the system using the recursive least squares algorithm, and the second is the design of the PID controller parameters using the root locus algorithm. An important aspect of this work is the use of industrial instrumentation in the experiments, which were carried out on real plants controlled by PID controllers implemented in the PLC. Thus, the results were obtained not only from theoretical experiments with computer programs, but from real systems. The experiments showed the good results achieved with the developed software.
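The identification stage described above can be sketched with recursive least squares fitting a first-order discrete model y[k] = a·y[k-1] + b·u[k-1]. The plant parameters and step input below are made up to generate test data; this is not the thesis software, only an illustration of the RLS step:

```python
import numpy as np

def rls_first_order(u, y, lam=1.0):
    """Recursive least squares for y[k] = a*y[k-1] + b*u[k-1]."""
    theta = np.zeros(2)         # estimates of [a, b]
    P = np.eye(2) * 1000.0      # large initial covariance (weak prior)
    for k in range(1, len(y)):
        phi = np.array([y[k - 1], u[k - 1]])
        K = P @ phi / (lam + phi @ P @ phi)        # gain vector
        theta = theta + K * (y[k] - phi @ theta)   # update with prediction error
        P = (P - np.outer(K, phi @ P)) / lam       # covariance update
    return theta

# Simulated noiseless plant with a = 0.8, b = 0.5 under a step input.
u = np.ones(200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]

a_hat, b_hat = rls_first_order(u, y)
print(a_hat, b_hat)  # close to (0.8, 0.5)
```

The identified (a, b) then feed the second stage, where the root locus of the identified model is used to place the closed-loop poles and compute the PID gains.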
Abstract:
This paper presents a methodology based on Lev Vygotsky's social interactionist theory and on investigative activities, integrating the teaching of physics with robotics and directed at students of the Physics degree course, seeking to provide further training for future teachers. The method is organized through educational robotics workshops that address concepts of physics using low-cost educational robots and a range of activities. The methodology was presented, discussed and then put into practice in workshops, so that these future teachers may be able to take robotics into their classrooms. Students from the last and penultimate semesters of the Physics degree course of the Federal Institute of Education, Science and Technology of Rio Grande do Norte, Caicó campus, participated in this project.
Abstract:
New materials made from industrial wastes have been studied as an alternative to traditional fabrication processes in building and civil engineering. These materials are produced with concerns such as cost, efficiency and the reduction of environmental damage in mind. Specifically for materials destined for dwellings in low-latitude regions, such as the Brazilian Northeast, efficiency is related to mechanical and thermal resistance. Thus, when thermal insulation and energy efficiency are sought, it is important to increase thermal resistance without degrading mechanical properties. This research examined a construction element made of two plates of cement mortar interleaved with a plate of recycled expanded polystyrene (EPS). This component, widely known as a sandwich panel, is commonly manufactured with commercial EPS, whose substitution was proposed in this study. For this purpose, a detailed methodology was applied that defines parameters for a rational batching of the elements that constitute the core. Samples of recycled EPS were made with two different values of apparent specific mass (ρ = 65 kg/m³ and ρ = 130 kg/m³) and submitted to the Quickline-30™, a thermophysical properties analyzer. Based on the results obtained for thermal conductivity, thermal capacity and thermal diffusivity, it was possible to conclude that recycled EPS has thermal insulation characteristics that qualify it to replace commercial EPS in the building and civil engineering industry.
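The insulating role of the EPS core can be illustrated with the standard series-resistance model for steady-state conduction through the panel, R = Σ Lᵢ/kᵢ per unit area. The thicknesses and conductivities below are illustrative textbook-order values, not the measured results of this research:

```python
# Illustrative mortar/EPS/mortar sandwich panel: (name, thickness m, k W/m.K).
layers = [
    ("mortar",       0.020, 1.15),
    ("recycled EPS", 0.050, 0.040),
    ("mortar",       0.020, 1.15),
]

def panel_resistance(layers):
    """Total conductive thermal resistance per unit area (m2.K/W),
    treating the layers as resistances in series: R = sum(L_i / k_i)."""
    return sum(thickness / k for _, thickness, k in layers)

R = panel_resistance(layers)
print(round(R, 4))  # dominated by the low-conductivity EPS core
```

With these values the EPS core contributes over 95% of the total resistance, which is why preserving its low conductivity after recycling is the key question.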
Abstract:
The competitiveness of trade, generated by the greater availability of products with lower quality and cost, has promoted a new reality of industrial production with tight tolerances. Deviations in production cannot be ruled out, and uncertainties can statistically occur. Consumers worldwide, and Brazilian consumers in particular, are supported by consumer protection codes in lawsuits over poor product quality. An automobile is composed of various systems and thousands of constituent parts, which increases the likelihood of failure; the dynamic and safety systems are critical with respect to the consequences of possible failures. Investigating a failure offers the possibility of learning from it and contributing to various improvements. Our main purpose in this work is to develop a systematic, specific methodology for investigating the root cause of the failure that occurred at an axle end of the front suspension of an automobile, and to perform comparative analyses between data from the fractured part and the design information. Our research was based on a failure of an automotive suspension system involved in a judicial case, which resulted in property damage and personal injury. In investigations involving the analysis of mechanical failures, knowledge of materials engineering plays a crucial role, since it enables the application of materials characterization techniques, relating the technical attributes required of a part to the structure of its manufacturing material, thus providing a greater scientific contribution to the work. The specific methodology developed follows its own flowchart. In the early phase, data from the records and information about those involved were collected.
The following laboratory analyses were performed: macrography of the fracture; micrography of the initial and final fracture with SEM (Scanning Electron Microscopy); phase analysis with optical microscopy; Brinell hardness and Vickers microhardness tests; quantitative and qualitative chemical analysis, using X-ray fluorescence and optical spectroscopy for carbon; and a qualitative study of the state of stress. Field data were also collected. In the data analyses, the values obtained from the fractured and stock parts were compared with the design values. After the investigation, it was concluded that: the developed methodology systematized the investigation and enabled data to be cross-checked, thus minimizing the probability of diagnostic error; the morphology of the fracture indicates failure by the fatigue mechanism at a geometrically propitious location, a stress concentration point; the part was subjected to low stresses, judging by the sectional area of the final fracture; the manufacturing material of the fractured part has low ductility; the component fractured earlier than recommended by the manufacturer; the percentages of C, Si, Mn and Cr in the fractured part differ from the design values; the hardness at the upper limit of the fractured part is higher than the design value; and there is no manufacturing uniformity between the stock and fractured parts. The work will contribute to optimizing the conduct of actions in judicial expert examinations in mechanical engineering.
Abstract:
This research investigated how new technologies can help the process of designing and manufacturing furniture in small manufacturers in the state of Rio Grande do Norte. Google SketchUp, a 3D modeling tool, is built so that its internal structures are open and can be accessed through SketchUp's Ruby API and programs written in the Ruby language (plugins). Using the concepts of Group Technology and the flexibility of adding new functionality to this software, a Methodology for the Modeling of Furniture, a Coding System, and a plugin for Google's tool implementing the methodology were created. As a result, the following facilities became available: users may create models and reuse the library's models repeatedly; reports of material costs in the manufacturing process are provided; and detailed drawings are produced, achieving better integration between furniture design and the manufacturing process.
Abstract:
Improving the adherence between the metallic casing of an oilwell and the cement sheath can potentially decrease the number of corrective actions presently necessary for Northeastern wells submitted to steam injection. In addition to the direct costs of the corrective operations, the economic impact of a failure of the primary cementing also includes the loss of production of the well. The adherence between casing and cement is currently evaluated by simple shear tests not standardized by the American Petroleum Institute (API). Therefore, the objective of the present work is to propose and evaluate a standardized method to assess the adherence of the oilwell metallic casing to the cement sheath. To that end, a section of a cemented oilwell was simulated and used to test the effect of different parameters on the shear stress of the system. Surface roughness and different cement compositions, submitted or not to thermal cycling, were evaluated. The results revealed that the proposed test geometry and parameters yielded different values for the shear stress of the system, corresponding to different adherence conditions between the metallic casing and the cement sheath.
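In shear tests of this kind, the adherence is typically reported as the applied load divided by the bonded cylindrical interface area. A minimal sketch under that assumption; the casing diameter, section length and load below are illustrative, not the experimental values of this work:

```python
import math

def interface_shear_stress(load_N, casing_diameter_m, bonded_length_m):
    """Mean shear stress (Pa) over the casing/cement cylindrical interface,
    taken as load / (pi * diameter * bonded length)."""
    area = math.pi * casing_diameter_m * bonded_length_m
    return load_N / area

# Illustrative: 50 kN load on a 7-inch (0.1778 m) casing, 0.20 m section.
tau = interface_shear_stress(50_000.0, 0.1778, 0.20)
print(round(tau / 1e6, 3), "MPa")
```

Comparing this mean stress across surface roughnesses and cement compositions is what allows the different adherence conditions to be ranked.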