22 results for Lean Manufacturing Tools
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The evolution of manufacturing management practices and strategies over the years has led many companies to realign their production systems in order to raise their competitiveness and operational performance. In most cases, however, these changes are made in a piecemeal manner, leaving the production system without a defined goal and potentially damaging the managerial strategies of the organization as a whole. Some organizations therefore adopt techniques and production practices used successfully by other companies, believing they can reproduce the same results. An efficient production system must be fully planned and aligned with the strategic objectives of the organization. This paper aims to identify the manufacturing management strategies adopted by industries in the state of Paraíba, as well as the lean practices they use. To that end, a qualitative study was conducted, using the multi-case study as its methodological basis; direct observations, semi-structured interviews and questionnaires were applied to those responsible for the production sector of the participating companies. As a result, it was possible to identify the type of manufacturing management system adopted by each company: Company A uses a modern mass production system focused on productivity and low cost, while Company B uses a Lean Manufacturing system focused on quality and diversity. The application of lean practices could be observed in both organizations; notably, the company that does not use Lean Manufacturing nevertheless employed lean practices at a very mature level of utilization.
Abstract:
The manufacturing of above- and below-knee prostheses starts by taking surface measurements of the patient's residual limb. This demands the making of a cartridge with an appropriate fitting, customized to the profile of each patient. The traditional process in public hospitals in Brazil begins with the completion of a record file (according to Law No. 388, of July 28, 1999, of the Ministry of Health) for obtaining the prosthesis, in which the amputation level, equipment type, fitting type, material, measurements, etc. are identified. Nowadays this work is covered by the Brazilian National Health Service (SUS) and is carried out manually, using common measuring tapes, characterizing a quite rudimentary, handmade process without any accuracy. This dissertation presents the development of a computer-integrated tool based on CAD theory for 3D visualization of both above- and below-knee prostheses (OrtoCAD), as well as the design and construction of a low-cost electro-mechanical 3D scanner (EMS). This apparatus automatically obtains geometric information from the stump or the healthy leg while ensuring a smaller degree of uncertainty for all measurements. The methodology is based on reverse engineering concepts, so that the EMS output is fed into the above-mentioned academic CAD software, which is in charge of the 3D computer graphics reconstruction of the residual limb's negative plaster cast or even the healthy leg's mirror image. The results obtained demonstrate that the proposed model is valid, because it allows structural analysis to be performed based on the requested loads, boundary conditions, chosen material and wall thickness. Furthermore, it allows the manufacturing of a prosthesis cartridge meeting high-accuracy engineering standards, with a consequent improvement in the quality of the overall production process.
Abstract:
There is still a high demand for quality control in the manufacturing of mechanical parts. This keeps alive the need for inspection of final products, ranging from dimensional analysis to the chemical composition of products. This task is usually performed through various non-destructive and destructive methods that ensure the integrity of the parts. However, the results generated by these modern inspection tools often cannot geometrically define the real damage and therefore cannot be properly displayed on a computer screen. Virtual 3D visualization may help identify damage that would hardly be detected by other methods. Some commercial software packages address the stages of design and simulation of mechanical parts in order to predict possible damage and diminish potential undesirable events. However, the challenge of developing software capable of integrating the various activities of design, product inspection, non-destructive testing results and damage simulation still needs the attention of researchers. This was the motivation for a methodological study on the implementation of a versatile CAD/CAE computational kernel capable of helping programmers develop software applied to the design and simulation of mechanical parts under stress. This research presents interesting results obtained from the use of the developed kernel, showing that it was successfully applied to design case studies involving parts with specific geometries, namely mechanical prostheses, heat exchangers, and oil and gas piping. Finally, conclusions are presented regarding the experience of merging CAD and CAE theories to develop the kernel, resulting in a tool adaptable to various applications in the metalworking industry.
Abstract:
We live in an era of tight credit caused by the global financial crisis and, as in the past, it falls to the various sectors and segments of society to find ways to reinvent themselves. In this context, Lean Construction presents itself as a strong production management alternative for companies in the construction segment. It arises from lean thinking, which originated in Japan in the postwar period and spread around the world in times of extreme scarcity during the oil crisis. In practice, Lean Construction is a philosophy that seeks to improve production management, maximizing the value of the flow from the customer's perspective through the elimination of losses. It thrives in environments and cultures that treat the scarcity of resources as something natural, applying both in macroeconomic crises and in times of prosperity. Production Planning and Control (PCP) is a fundamental building block for companies to protect themselves against economic fluctuations, seeking their survival and success in a competitive market. Motivated by the lack of discussion of the topic in the local academy, and by the finding that 93.33% of construction companies in the state do not use methodological tools for PCP, this dissertation aims to study and propose the implementation of lean construction in the planning methodology applied on construction sites. It characterizes the production management system of a construction company, pointing out the main causes of ineffectiveness behind the low performance of one of its projects. PCP was then implemented using tools that serve the principles of lean construction, monitored through indicators that gave managers a clear view of the control of actions and the production of protective mechanisms. All guidelines for the implementation and application of this management model were presented in a simplified, practical and efficient way, in order to break the resistance of old paradigms in the industry to new practices.
Abstract:
This thesis describes and demonstrates a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas: doubts about the design that cannot be fully understood or solved without quantitative information. The concept intends to combine the analytical power of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation and CFD. Design dilemmas are formulated and framed according to the architect's process of reflection on performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) review preceding research on the use of simulation tools and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands, all houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved. The practical application, despite its success in the research context, revealed some limitations on the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.
Abstract:
This work demonstrates the importance of geographic information systems (GIS) and spatial data analysis (SDA) tools for the study of infectious diseases. Analysis methods were used to describe the spatial distribution of a particular disease more fully by incorporating the geographical element into the analysis. In Chapter 1, we report the historical evolution of these techniques in the field of human health, using Hansen's disease (leprosy) in Rio Grande do Norte as an example. In Chapter 2, we introduce basic theoretical concepts of the methodology and classify the types of spatial data commonly treated. Chapters 3 and 4 define and demonstrate the two most important techniques for the analysis of health data, namely point-process data and area data, and model the case distribution of Hansen's disease in the city of Mossoró, RN. In the analysis, we used R scripts and made available the routines and analytical procedures developed by the author, an approach that can easily be reused by researchers in several areas. As practical results, the major leprosy risk areas in Mossoró were detected, and their association with the socioeconomic profile of the population at risk was established. Moreover, it is clearly shown that this approach could be of great help if used continuously in health data analysis and processing, allowing the development of new strategies and encouraging wider use of such techniques in health care.
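The point-process analysis mentioned above typically produces a smoothed risk surface from case coordinates. As a hedged illustration of the idea (the author's actual routines are R scripts and are not reproduced here; the data, grid and bandwidth below are invented), a Gaussian kernel density surface can be sketched in a few lines:

```python
import numpy as np

def kde2d(cases, grid_x, grid_y, bandwidth):
    """Gaussian kernel density surface over case point coordinates."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(gx, dtype=float)
    for cx, cy in cases:
        d2 = (gx - cx) ** 2 + (gy - cy) ** 2
        density += np.exp(-d2 / (2 * bandwidth ** 2))
    # divide by the kernel constant and the number of cases
    return density / (2 * np.pi * bandwidth ** 2 * len(cases))

# Synthetic "cases" clustered around the origin (illustrative only)
rng = np.random.default_rng(0)
cases = rng.normal(0.0, 0.5, size=(200, 2))
grid = np.linspace(-2.0, 2.0, 41)
surface = kde2d(cases, grid, grid, bandwidth=0.3)
peak = np.unravel_index(surface.argmax(), surface.shape)
```

In a real study the peak cells of such a surface would be compared against census-tract socioeconomic data, which is essentially what the association analysis in the abstract describes.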
Abstract:
In this work, artificial neural networks (ANN) based on supervised and unsupervised algorithms were investigated for the study of rheological parameters of solid pharmaceutical excipients, in order to develop computational tools for manufacturing solid dosage forms. Among the four supervised neural networks investigated, the best learning performance was achieved by a feedforward multilayer perceptron whose architecture comprised eight neurons in the input layer, sixteen neurons in the hidden layer and one neuron in the output layer. Learning and predictive performance for the angle of repose was poor, whereas the Carr index and Hausner ratio (CI and HR, respectively) showed very good fitting and learning capacity; HR and CI were therefore considered suitable descriptors for the next stage of development of the supervised ANNs. Clustering capacity was evaluated for five unsupervised strategies. Networks based on purely competitive unsupervised strategies, the classic "Winner-Take-All", "Frequency-Sensitive Competitive Learning" and "Rival-Penalized Competitive Learning" (WTA, FSCL and RPCL, respectively), were able to cluster the database, but the classification was very poor, showing severe errors by grouping data with conflicting properties into the same cluster or even the same neuron; moreover, the criteria adopted by these networks for clustering could not be established. Self-Organizing Maps (SOM) and Neural Gas (NG) networks showed better clustering capacity. Both recognized the two major groupings of data, corresponding to lactose (LAC) and cellulose (CEL). However, the SOM made some errors in classifying data from the minority excipients magnesium stearate (EMG), talc (TLC) and attapulgite (ATP). The NG network, in turn, performed a very consistent classification and resolved the SOM misclassifications, being the most appropriate network for classifying the data of this study. The use of the NG network in pharmaceutical technology had not previously been reported. NG therefore has great potential for use in software for the automated classification of pharmaceutical powders, and as a new tool for mining and clustering data in drug development.
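The Carr index and Hausner ratio used as descriptors above are standard powder-flowability measures computed from bulk and tapped density. A minimal sketch of the two formulas (the example densities are illustrative, not values from the study):

```python
def carr_index(bulk_density, tapped_density):
    """Carr compressibility index (%) from bulk and tapped density."""
    return 100.0 * (tapped_density - bulk_density) / tapped_density

def hausner_ratio(bulk_density, tapped_density):
    """Hausner ratio: tapped density over bulk density."""
    return tapped_density / bulk_density

# Illustrative powder: bulk 0.40 g/mL, tapped 0.50 g/mL
ci = carr_index(0.40, 0.50)     # 20.0 %
hr = hausner_ratio(0.40, 0.50)  # 1.25
```

Both descriptors depend only on the two density measurements, which is part of why they make convenient, cheap inputs for a neural network compared with full rheological characterization.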
Abstract:
Furthered mainly by new technologies, the expansion of distance education has created a demand for tools and methodologies that enhance teaching techniques based on proven pedagogical theories, and such methodologies must also apply to so-called Virtual Learning Environments. The aim of this work is to present a planning methodology, based on known pedagogical theories, which contributes to the incorporation of assessment into the process of teaching and learning. With this in mind, the pertinent literature was reviewed to identify the key pedagogical concepts needed to define this methodology, and a descriptive approach was used to establish the relations between this conceptual framework and distance education. As a result, the Contents Map and the Dependence Map were specified and implemented: two teaching tools that promote the planning of a course by taking assessment into account from this early stage. Integrated into Moodle, the developed tools were tested in a distance learning course for practical observation of the concepts involved. It was verified that the methodology supported by these tools is in fact helpful in course planning and in strengthening educational assessment, placing the student as the central element in the process of teaching and learning.
Abstract:
The use of Geographic Information Systems (GIS) has become very important in fields where detailed and precise study of earth surface features is required. Environmental protection is one such example, requiring GIS tools for analysis and decision-making by managers and the communities involved with protected areas. In this specific field, a remaining challenge is to build a GIS that can be fed with data dynamically, allowing researchers and other agents to retrieve current, up-to-date information. In some cases, data is acquired in several ways and comes from different sources. To address this problem, tools were implemented that include a model for spatial data treatment on the Web. The research starts with the feeding and processing of environmental monitoring data collected in loco, such as biotic and geological variables, and finishes with the presentation of all information on the Web. For this dynamic processing, tools were developed that make MapServer more flexible and dynamic, allowing data to be uploaded by the appropriate users. Furthermore, a module was developed that uses interpolation to support spatial data analysis. A complex application that validated this research feeds the system with data from coral reef regions in the northeast of Brazil. The system was implemented using the interactivity provided by the AJAX model, and it resulted in a substantial contribution to efficient access to information, an essential mechanism for tracking events in environmental monitoring.
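The abstract mentions an interpolation module for spatial analysis without naming the method. A common, simple choice for scattered environmental samples is inverse distance weighting (IDW); the sketch below is an assumption about how such a module could work, not the system's actual implementation, and the sample data is invented:

```python
import numpy as np

def idw(known_xy, known_values, query_xy, power=2.0):
    """Inverse-distance-weighted estimates at query points."""
    known_xy = np.asarray(known_xy, dtype=float)
    known_values = np.asarray(known_values, dtype=float)
    estimates = []
    for qx, qy in query_xy:
        d = np.hypot(known_xy[:, 0] - qx, known_xy[:, 1] - qy)
        if np.any(d == 0):  # query coincides with a sample point
            estimates.append(known_values[d == 0][0])
            continue
        w = 1.0 / d ** power
        estimates.append(np.sum(w * known_values) / np.sum(w))
    return np.array(estimates)

# Toy transect: measured value grows with x (illustrative only)
samples = [(0, 0), (1, 0), (2, 0)]
values = [0.0, 1.0, 2.0]
est = idw(samples, values, [(1, 0), (0.5, 0)])
```

IDW always returns values bounded by the sampled minimum and maximum, which makes it a conservative default for web-served environmental layers.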
Abstract:
Recent years have seen an increase in the acceptance and adoption of parallel processing, both for high-performance scientific computing and for general-purpose applications. This acceptance has been favored mainly by the development of massively parallel processing (MPP) environments and of distributed computing. A point in common between distributed systems and MPP architectures is the notion of message passing, which allows communication between processes. A message-passing environment consists basically of a communication library that, acting as an extension of programming languages such as C, C++ and Fortran, allows parallel applications to be written. A fundamental aspect of developing parallel applications is performance analysis. Several metrics can be used in this analysis: execution time, efficiency in the use of the processing elements, and scalability of the application with respect to the increase in the number of processors or in the size of the problem instance. Establishing models or mechanisms for this analysis can be quite complicated, considering the parameters and degrees of freedom involved in the implementation of a parallel application. One alternative is the use of tools for collecting and visualizing performance data, which allow the user to identify bottlenecks and sources of inefficiency in an application. Efficient visualization requires identifying and collecting data related to the execution of the application, a stage called instrumentation. This work first presents a study of the main techniques used to collect performance data, followed by a detailed analysis of the main tools available for Beowulf-type parallel clusters running Linux on the x86 platform, using MPI (Message Passing Interface) communication libraries such as LAM and MPICH. This analysis is validated on parallel applications that deal with the training of perceptron-type neural networks using backpropagation. The conclusions obtained show the potential and ease of use of the analyzed tools.
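The metrics named above, efficiency and scalability with respect to processor count, are usually derived from measured execution times. A minimal sketch of the two standard definitions (the timings are hypothetical, not measurements from the work):

```python
def speedup(t_serial, t_parallel):
    """Classic speedup: serial execution time over parallel execution time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Fraction of ideal linear speedup achieved on n_procs processors."""
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical timings for one training run (illustrative only)
t1, t8 = 120.0, 20.0          # seconds on 1 and on 8 processors
s = speedup(t1, t8)           # 6.0
e = efficiency(t1, t8, 8)     # 0.75
```

Plotting efficiency against processor count for growing problem instances is the usual way such tools expose whether an MPI application scales, and where communication overhead starts to dominate.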
Abstract:
The fields of control, automation and optimization help to improve the processes used by industry, contributing to faster production lines, better product quality and lower manufacturing costs. Didactic plants are good research tools in these areas, providing direct contact with industrial equipment. Given these capabilities, the main goal of this work is to model and control a didactic plant: a level and flow process control system with industrial instrumentation. With a model it is possible to build a simulator for the plant that allows its behaviour to be studied without the operational costs of the real process; controllers, for example, can be tested several times before being applied to the real process. Among the several types of controllers, adaptive controllers were used, mainly the Direct Self-Tuning Regulator (DSTR) with integral action and Gain Scheduling (GS). The DSTR was based on pole-placement design and uses Recursive Least Squares to calculate the controller parameters. The characteristics of an adaptive system proved valuable in guaranteeing good performance when the controller was applied to the plant.
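The Recursive Least Squares estimator at the heart of the DSTR updates its parameter estimate one sample at a time. The following is a generic RLS sketch, not the thesis's controller: the regressors, the "true" parameters and the noise-free setting are all invented to show convergence of the recursion:

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=1.0):
    """One recursive-least-squares update.

    theta: parameter estimate, P: covariance matrix,
    phi: regressor vector, y: measured output, lam: forgetting factor.
    """
    phi = phi.reshape(-1, 1)
    theta = theta.reshape(-1, 1)
    K = P @ phi / (lam + (phi.T @ P @ phi).item())  # gain vector
    err = y - (phi.T @ theta).item()                # prediction error
    theta = theta + K * err
    P = (P - K @ phi.T @ P) / lam
    return theta.ravel(), P

# Identify y = 2*u1 - u2 from noise-free synthetic samples
rng = np.random.default_rng(1)
true_params = np.array([2.0, -1.0])
theta = np.zeros(2)
P = np.eye(2) * 100.0          # large initial covariance: low confidence
for _ in range(50):
    phi = rng.normal(size=2)
    y = true_params @ phi
    theta, P = rls_step(theta, P, phi, y)
```

In a self-tuning regulator the estimated parameters would be mapped, at each step, through the pole-placement design equations into new controller gains; a forgetting factor `lam` slightly below 1 lets the estimator track slowly varying plants.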
Abstract:
This work presents the results, analyses and conclusions of a study aimed at minimizing the formation of thermal cracks on cemented carbide inserts during face milling. The main focus of the investigation was the observation that milling is an interrupted machining process, which imposes cyclic thermal loads on the cutting tool, causing frequent stress changes in its superficial and sub-superficial layers. These characteristics cause the formation of cracks perpendicular to the cutting edge, which accelerate cutting tool wear and reduce tool life. Several works on this subject have been published emphasizing the cyclic thermal behavior imposed by the milling process as the main cause of thermal crack formation. In these cases, the phenomenon appears as a consequence of the difference in temperature experienced by the cutting tool on each rotation of the cutter, usually defined as the difference between the temperatures in the cutting tool wedge at the end of the cutting and idle periods (the T factor). Thus, a technique was proposed to minimize this cyclic behavior, with the objective of transforming milling into an almost-continuous process in terms of temperature: a hot air stream was applied during the idle period of the machining process in order to minimize the T factor. This technique was applied with three hot-air-stream temperatures (100, 350 and 580 °C), without cutting fluid (dry condition) and with cutting fluid mist (wet condition) using the hot air stream at 580 °C. Trials at room temperature were also carried out. Afterwards the inserts were analyzed with a scanning electron microscope, examining the number of thermal cracks generated in each condition, the wear and other damage. In general, heating during the idle period proved positive in reducing the number of thermal cracks during face milling with cemented carbide inserts. Furthermore, the application of cutting fluid mist was effective in reducing the wear of the cutting tools.
Abstract:
This work presents research carried out in the clothing manufacturing industry of Greater Natal, with the objective of profiling its business and technological management as well as its use of simultaneous engineering in product development. The research comprises two studies. The first presents the current picture of the companies, synthesized through twelve variables. The second, through fifteen variables, shows the level of use of simultaneous engineering in product development and its reach in relation to integrated management using CAD, PDM and ERP tools (Computer-Aided Design, Product Data Management, Enterprise Resource Planning). The integration of these systems aims at reducing the cost and the development time of products. The results indicate that simultaneous engineering is a competitive advantage and makes it possible to shorten the product life cycle, rationalize resources, incorporate a high standard of quality into the process and product, and personalize the product to serve the global market. This work is also intended to contribute to a better understanding of the real situation of clothing companies in Greater Natal and their role in the economy of the state of Rio Grande do Norte.
Abstract:
In the machining of internal threads, dedicated tools known as taps are needed for each profile type and diameter, and low cutting speeds are used compared with the main machining processes. This restriction on cutting speed is associated with the difficulty of synchronizing the tool's rotation speed and feed velocity, which restricts flexibility and lengthens machining lead times whenever threaded components must be manufactured. An alternative to the constraints imposed by the tap is thread milling with helical interpolation. The technique is the fusion of two movements: rotation and helical interpolation. The tools may have different configurations: a single edge or multiple edges (axial, radial or both). However, thread milling with helical interpolation is relatively new and there are few studies on the subject, which poses challenges to its wide adoption on the manufacturing shop floor. The objective of this research is to determine the performance of different types of tools in thread milling with helical interpolation on hardened steel workpieces. To this end, four tool configurations were used for thread milling in AISI 4340 quenched and tempered steel (40 HRC). The results showed that climb milling produced a greater number of machined threads, regardless of tool configuration. Up milling causes chipping of the cutting edge, while climb milling promotes abrasive wear. Another important finding is that increasing the ratio of hole diameter to tool diameter extends tool life.
Abstract:
Providing data and information on a watershed is important because knowledge of its physical characteristics, land use and other aspects allows better planning and economically, socially and environmentally sustainable use of the area. The investigation of the physical environment has commonly been carried out with geoprocessing, which has proved a very efficient tool. Within this context, this research analyzes the Punaú river basin (located in the municipalities of Touros, Rio do Fogo and Pureza, state of Rio Grande do Norte) in several aspects, using geoprocessing as a working tool to provide information about the entire watershed. Specifically, this study aimed to update pre-existing maps, such as the geological, geomorphological and land use maps; to generate a map of environmental vulnerability with respect to the erosion susceptibility of the area; to generate a map of legal incompatibility, identifying areas already being used in breach of environmental legislation; and to propose solutions for the occupation of the Punaú river basin focused on environmental planning. The methodology was based on the use of geoprocessing tools for data analysis and for producing the maps of legal incompatibility and environmental vulnerability. The first map took into account the environmental legislation on the protection of watersheds. For the vulnerability analysis, the map was generated by crossing the maps of geology, geomorphology, soils and land use, with weights assigned to the different attributes of the thematic maps, yielding a map of environmental vulnerability in relation to susceptibility to erosion. The results indicate that agriculture is the most significant activity in the basin in total occupied area, which confers a high degree of environmental vulnerability on most of the basin, and some agricultural areas have developed in a manner inconsistent with Brazilian environmental legislation. It is proposed to consider deploying revitalization measures in the more critical areas of the watershed, and conservation through mitigation of the causes of environmental degradation, such as protection of water sources, protection and restoration of riparian vegetation, protection of permanent preservation areas, containment of erosion processes in general, and other measures listed or not in specific laws, including the establishment of a basin committee for the area.
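The map crossing described above, thematic layers combined with attribute weights, is essentially a weighted raster overlay. The sketch below shows the general technique only; the layer values and weights are invented and do not reflect the scores or weights actually used for the Punaú basin:

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Combine equal-shape thematic rasters into a weighted-mean score.

    layers: list of arrays of class scores (e.g. 1 = low to 5 = high
    vulnerability); weights: one weight per layer.
    """
    layers = np.stack([np.asarray(l, dtype=float) for l in layers])
    weights = np.asarray(weights, dtype=float).reshape(-1, 1, 1)
    return (layers * weights).sum(axis=0) / weights.sum()

# Illustrative 2x2 rasters for the four themes (values are made up)
geology  = [[1, 2], [3, 3]]
geomorph = [[2, 2], [2, 4]]
soils    = [[1, 1], [3, 5]]
land_use = [[2, 3], [4, 5]]
# Hypothetical weighting that emphasizes land use
vuln = weighted_overlay([geology, geomorph, soils, land_use],
                        [1.0, 1.0, 1.0, 2.0])
```

Cells with higher combined scores would then be classified as the critical areas targeted by the mitigation measures the abstract proposes.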