965 results for CNPQ::ENGENHARIAS::ENGENHARIA MECANICA::PROCESSOS DE FABRICACAO
Abstract:
Since the advent of information technology and its adoption by companies, the cost-benefit relationship has not always been clear, either to those responsible for the technology area or to top management. Even so, organizations increasingly invest heavily in technology, expecting it to solve a wide range of problems. This question has therefore become crucial to the decision-making process, since investments in this area tend to be expensive and, in the current scenario, such analyses must be extremely rigorous to minimize the chances of project failure, especially in a stabilized economy with fierce competition. One alternative companies have pursued to succeed while reducing risk is the outsourcing of the IT area. From this perspective, this dissertation investigates the outsourcing of IT services in all its aspects: its motivation, the services actually outsourced, advantages, disadvantages and possible obstacles, the view of IT strategic alignment, contract management processes and forms of control, and, finally, future trends. It is a multiple-case study involving franchises of the Coca-Cola System in Brazil. The study presents a literature review on business decision-making, investment analysis, and IT management and outsourcing, which made it possible to define the dimensions of analysis of the research. In the field research, IT managers were interviewed in the cities of Brasília-DF, Goiânia-GO and Ribeirão Preto-SP. The field research made it possible to identify how these organizations evaluate their IT investments, how the IT area is managed, what led them to opt for outsourcing, and how the outsourced processes affect the organization. As this is a qualitative study, the three organizations were analyzed comparatively.
The main results show that the organizations are using IT outsourcing to focus on their core business and, even though they encounter several disadvantages, including with respect to costs, they believe the benefits justify it. Some internal obstacles to outsourcing were also identified, mainly the fear of losing business intelligence. The outsourced activities are monitored by the internal team against structured criteria, through which service levels are verified.
Abstract:
In the current scenario, environmental concerns have been changing the position of companies, which are increasingly practicing, or at least adopting, environmental management. This tool has been used by companies to face the problems caused by solid waste, in particular green coconut waste, which is constantly among the materials discarded by society (companies and consumers). The green coconut is a typical tropical fruit whose fresh water is very beneficial to human health, and its popularization has caused a progressive increase in its consumption. Along these lines, the present work carried out an analysis of strengths, weaknesses, opportunities, and threats (SWOT analysis) of green coconut solid waste management at two agribusiness companies in the state of Rio Grande do Norte (RN), Brazil, aiming to understand the challenges and potential of this kind of waste. Given its approach to the problem, this work is classified as descriptive, exploratory, and qualitative research. Data were collected through a questionnaire and a structured interview, in order to evaluate the strategic posture of the agribusiness companies through SWOT analysis. The SWOT analysis is an effective tool for analyzing the internal and external environment of an organization: it helps position the company within its environment and, when well applied, enables the detection of mistakes, the strengthening of correct procedures, the avoidance of threats, and the pursuit of opportunities. The agribusinesses studied have very similar profiles, such as a long business life span and a strategy that extends the useful life of the fruit by using its waste to manufacture new by-products. In both, the daily quantity of waste resulting from this process reaches approximately 20 thousand units of the fruit in high season, making a dedicated focus on the use and/or treatment of this waste necessary.
The SWOT analysis further showed that agribusiness company A follows a defensive marketing strategy and acts from a vulnerable position, that is, it is unable to act in this market segment, since it has decided to stop using the waste due to a lack of equipment and technology. Agribusiness company B, on the other hand, has adopted an offensive marketing strategy: even without equipment, technology, and appropriate internal installations, it still insists on using and benefiting from green coconut waste in its agribusiness. It is therefore considered that managing green coconut waste to produce various by-products reduces the impacts of inappropriate disposal and generates profits in the short, medium and long term. These profits are both tangible and intangible, since the interest in sustainability actions is not only a matter of return on capital but an important condition for staying in business: having quality products and processes is no longer enough. It is necessary to establish socio-environmental practices, since the image of the company plays a prevailing role in consumers' buying decisions.
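The strategic-posture reading described above can be sketched as a simple tally of the four SWOT quadrants. The factor lists and the posture rule below are illustrative assumptions, not the thesis's actual questionnaire or classification criteria:

```python
# Toy SWOT tally: derive a strategic posture from the dominant quadrants.
# Factor lists and thresholds are hypothetical.

def swot_posture(strengths, weaknesses, opportunities, threats):
    internal = len(strengths) - len(weaknesses)      # internal balance
    external = len(opportunities) - len(threats)     # external balance
    if internal >= 0 and external >= 0:
        return "offensive"      # leverage strengths to seize opportunities
    if internal < 0 and external < 0:
        return "defensive"      # survival: weaknesses meet threats
    if internal >= 0:
        return "maintenance"    # strong inside, hostile outside
    return "growth"             # weak inside, favourable outside

company = swot_posture(
    strengths=["long business life span", "by-product know-how"],
    weaknesses=["aging equipment"],
    opportunities=["by-product market", "sustainability image"],
    threats=["disposal regulation"],
)
print(company)  # -> offensive
```

In a real SWOT study the factors are weighted by the questionnaire answers rather than simply counted.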
Abstract:
Knowledge management has received major attention from product designers because many of the activities within this process have to be creative and, therefore, depend basically on the knowledge of the people involved. Moreover, the Product Development Process (PDP) is one of the activities in which knowledge management manifests itself in its most critical form, given the intense application of knowledge it requires. Accordingly, this thesis analyzes knowledge management with the aim of improving the PDP and proposes a theoretical model of knowledge management. The model comprises five steps (creation, maintenance, dissemination, utilization and discard) and verifies the occurrence of four types of knowledge conversion (socialization, externalization, combination and internalization) in order to improve knowledge management in this process. In Small and Medium Enterprises (SMEs), intellectual capital managed efficiently and with the participation of all employees becomes the mechanism for knowledge creation and transfer, supporting and consequently improving the PDP. The expected result is an effective and efficient application of the proposed model for creating a knowledge base within the organization (organizational memory), leading to better PDP performance. To this end, an extensive analysis of knowledge management (a qualitative and subjective evaluation instrument) was carried out within the Design department of a Brazilian organization (SEBRAE/RN). This analysis aimed to establish the state of the art of the Design department regarding the use of knowledge management. This step was important in order to assess the department's level of evolution in the practical use of knowledge management before implementing the proposed theoretical model and its methodology.
Finally, based on the results of the diagnosis, a knowledge management system is suggested to facilitate knowledge sharing within the organization, in other words, the Design department.
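The five lifecycle steps and the four conversion modes named above can be sketched as a small data model. This is only an illustrative encoding of the well-known SECI conversion types (Nonaka and Takeuchi), not the thesis's evaluation instrument:

```python
# Illustrative data model for the five-step lifecycle and the four SECI
# knowledge-conversion modes mentioned in the abstract.

from enum import Enum

class Step(Enum):
    CREATION = 1
    MAINTENANCE = 2
    DISSEMINATION = 3
    UTILIZATION = 4
    DISCARD = 5

# SECI: which conversion turns one knowledge form into another
CONVERSIONS = {
    ("tacit", "tacit"): "socialization",
    ("tacit", "explicit"): "externalization",
    ("explicit", "explicit"): "combination",
    ("explicit", "tacit"): "internalization",
}

def convert(source_form, target_form):
    return CONVERSIONS[(source_form, target_form)]

# Writing down a designer's experience is externalization:
print(convert("tacit", "explicit"))  # -> externalization
```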
Abstract:
Gaining competitive advantage and satisfying the client are the reasons why companies have been implementing Quality Management Systems (QMS). A QMS brings benefits such as improvements in processes, products and services, an enhanced company image (marketing), and client satisfaction. This paper aims to evaluate the results obtained from implementing a QMS in the companies of the state of Rio Grande do Norte (RN) certified to the ISO 9001 standard and listed in the INMETRO database. To achieve this goal, a literature review on quality management systems was carried out, followed by a survey of the managers of the certified companies in RN using an online questionnaire. Of the 27 certified companies in Rio Grande do Norte, 21 responded to the data collection instrument. The data were analyzed using descriptive statistics and a multivariate technique, cluster analysis. The research instrument contained 20 questions addressing the main theme of this dissertation. The cluster analysis identified four groups with similar survey answers. This analysis allowed us to conclude that a QMS drives significant improvements in organizations, such as a better company reputation and increased sales. On the other hand, it identified as the main difficulties the dissemination of a quality culture, the lack of commitment across the whole organization, and worker resistance.
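The grouping of respondents by answer similarity can be sketched with a minimal agglomerative (single-linkage) clustering. The toy Likert-style answer vectors below are hypothetical, not the survey's data; a real study would use a statistics package:

```python
# Toy single-linkage clustering of questionnaire answer vectors.

import math

def dist(a, b):
    return math.dist(a, b)  # Euclidean distance between answer vectors

def single_linkage(points, k):
    """Merge the two closest clusters until k clusters remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

answers = [(5, 5, 4), (5, 4, 5), (1, 2, 1), (2, 1, 1)]  # two clear groups
print(sorted(single_linkage(answers, 2)))  # -> [[0, 1], [2, 3]]
```

With 21 respondents and 20 questions, the same procedure (or a hierarchical dendrogram cut) yields the four groups reported in the abstract.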
Abstract:
This research aims to contribute to demonstrating the consolidation of Information Systems (IS) as an area of knowledge within Production Engineering. To this end, it presents a picture of IS publication in the field of Production Engineering in Brazil according to the number of articles, authorship profile, methodologies, citations, research themes, and the continuity of those themes. The study is based on the papers published in the Information Systems track of the National Meeting of Production Engineering (ENEGEP) in the years 2000 through 2004. The work is classified as bibliographical research of an applied nature and quantitative approach; from the point of view of its objectives it is descriptive-exploratory, and data were collected through systematic observation with a bibliographical survey. As field research, data collection consisted of elaborating an analysis protocol; to reach the final diagnosis, the data were processed using statistical methods, with descriptive analyses. The study covered IS concepts and related research areas, and examined related work in Production Engineering, Information Systems, Information Science and other areas of knowledge. Regarding the results, it was concluded that national and international contents are compatible and that the IS area is in constant evolution. Regarding the continuity of research lines, it was observed that the majority of the authors remained faithful to the Information Systems area. Among other findings, some institutions should try to increase their volume of publications and research, while others should seek to maintain the level already reached in recent years.
Abstract:
This thesis deals with the performance improvement of hotels that have adopted ISO 9000 Quality Management Systems. Brazilian hotels holding an ISO 9001 registration were surveyed with an assessment form based on the Balanced Scorecard (BSC) approach. The main findings are that ISO 9000 improved the performance of the hotels overall and in all BSC perspectives, and that managers and directors perceive this improvement differently, which suggests the need for a tool like the BSC to record performance improvements on a common basis. The thesis contributes information on performance improvement in hotels, one of the open questions regarding the low ISO 9000 adoption rate in Brazilian hotels.
Abstract:
In order to guarantee database consistency, a database system should synchronize the operations of concurrent transactions. The database component responsible for such synchronization is the scheduler. A scheduler synchronizes operations belonging to different transactions by means of concurrency control protocols. Concurrency control protocols may present different behaviors: in general, a scheduler's behavior can be classified as aggressive or conservative. This paper presents the Intelligent Transaction Scheduler (ITS), which is able to synchronize the execution of concurrent transactions in an adaptive manner. The scheduler adapts its behavior (aggressive or conservative) to the characteristics of the computing environment in which it operates, using an expert system based on fuzzy logic. The ITS can implement different correctness criteria, such as conventional (syntactic) serializability and semantic serializability. To evaluate the performance of the ITS against schedulers with exclusively aggressive or conservative behavior, it was applied in a dynamic environment, a Mobile Database Community (MDBC). An MDBC simulator was developed and many sets of tests were run. The experimental results presented herein demonstrate the efficiency of the ITS in synchronizing transactions in a dynamic environment.
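The adapt-by-environment idea can be sketched with a toy fuzzy rule: when conflicts or aborts are frequent, lean conservative (block early); otherwise stay aggressive. The membership functions and rule below are invented for illustration and are not the ITS rule base:

```python
# Toy fuzzy decision between aggressive and conservative scheduling.
# Membership shapes and the 0.5 cutoff are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def behavior(conflict_rate, abort_rate):
    high_conflict = tri(conflict_rate, 0.3, 1.0, 1.7)  # peaks at 100% conflicts
    high_abort = tri(abort_rate, 0.3, 1.0, 1.7)
    # Rule: IF conflicts high OR aborts high THEN act conservative
    conservativeness = max(high_conflict, high_abort)
    return "conservative" if conservativeness > 0.5 else "aggressive"

print(behavior(conflict_rate=0.9, abort_rate=0.2))  # -> conservative
print(behavior(conflict_rate=0.1, abort_rate=0.1))  # -> aggressive
```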
Abstract:
Concepts from industrial automation are being incorporated into the medical field, in other words, they are also being applied to hospital automation. Along these lines, research has been developed addressing several of the problems inherent to the processes that can be automated in the hospital environment. Since communication is an imperative factor in automation processes, because the systems are usually distributed, the data transfer network becomes a key point: it must be able to provide data exchange and guarantee the demands imposed by the automation process. In this context, this doctoral thesis proposes, specifies, analyzes and validates the Multicycle Protocol for Hospital Automation (MP-HA), which is customized to meet the demands of these automation processes, seeking to guarantee determinism in communications and to optimize the utilization factor of the transmission medium.
Abstract:
This work addresses the analysis and development of multivariable predictive controllers based on bilinear multi-models. Monovariable and multivariable linear Generalized Predictive Control (GPC) is presented, highlighting its properties, key features and industrial applications. Bilinear GPC, the basis for the development of this thesis, is presented using the time-step quasilinearization approach. Results are shown in which this controller outperforms linear GPC, since bilinear models better represent the dynamics of certain processes. Because time-step quasilinearization is an approximation, it introduces a prediction error that limits the controller's performance as the prediction horizon increases. To minimize this error, Bilinear GPC with iterative compensation is presented, seeking better performance than classic Bilinear GPC, and results of the iterative compensation algorithm are shown. The use of multi-models is then discussed, in order to overcome the limitation of controllers based on a single model when applied over large operating ranges. Methods for measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications are carried out on simulated distillation columns that closely reproduce the behavior of real ones, and the results are satisfactory.
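The model-distance idea behind multi-model selection can be sketched by comparing step responses of local models and picking the one nearest the current operating point. The first-order models and the sum-of-squares metric below are illustrative, not the specific metrics proposed in the thesis:

```python
# Toy model-distance metric for multi-model selection: compare discrete
# step responses of first-order models y[k+1] = a*y[k] + b*u[k].

def step_response(a, b, n=50):
    y, out = 0.0, []
    for _ in range(n):
        y = a * y + b * 1.0   # unit step input
        out.append(y)
    return out

def model_distance(m1, m2):
    """Sum of squared differences between step responses."""
    r1, r2 = step_response(*m1), step_response(*m2)
    return sum((x - y) ** 2 for x, y in zip(r1, r2))

# Bank of local models (a, b) identified at different operating points:
bank = {"low": (0.9, 0.1), "mid": (0.8, 0.2), "high": (0.6, 0.4)}
current = (0.78, 0.22)  # model identified at the current operating point

closest = min(bank, key=lambda k: model_distance(bank[k], current))
print(closest)  # -> mid
```

The controller would then weight or switch among the local predictors according to these distances.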
Fault detection and isolation in dynamic systems based on parametric identification
Abstract:
This research aims to contribute to the area of fault detection and diagnosis by proposing a new architecture for a Fault Detection and Isolation (FDI) system. The proposed architecture innovates in the way the monitored physical variables are linked to the FDI system and, as a consequence, in the way faults are detected, isolated and classified. A search for mathematical tools able to satisfy the objectives of the proposed architecture pointed to the Kalman Filter and its derivatives, the EKF (Extended Kalman Filter) and the UKF (Unscented Kalman Filter). The first is effective when the monitored process presents a linear relation between the monitored physical variables and its output, while the other two handle the case where this dynamics is nonlinear. A short comparison of their features and capabilities in the context of fault detection concludes that the UKF is a better alternative than the EKF for composing the proposed FDI architecture when the process dynamics is nonlinear. The results presented at the end of the research cover both linear and nonlinear industrial processes. The efficiency of the proposed architecture can be observed in its application to simulated and real processes. The contributions of this thesis are summarized at the end of the text.
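The core mechanism, flagging a fault when the Kalman filter's innovation (residual) grows beyond its expected variance, can be sketched for a scalar linear process. The toy model, noise values, and 3-sigma threshold below are assumptions, not the thesis's architecture:

```python
# Toy residual-based fault detection with a scalar Kalman filter.
# Process: x[k+1] = a*x[k] + noise; measurement z[k] = x[k] + noise.

def kalman_fdi(measurements, a=1.0, q=0.01, r=0.1, threshold=3.0):
    """Flag samples whose innovation exceeds `threshold` sigma."""
    x, p = 0.0, 1.0                          # state estimate and its variance
    faults = []
    for k, z in enumerate(measurements):
        x, p = a * x, a * a * p + q          # predict
        innov = z - x                        # residual (innovation)
        s = p + r                            # innovation variance
        if innov * innov > threshold ** 2 * s:
            faults.append(k)                 # residual too large: fault
        else:
            g = p / s                        # Kalman gain
            x, p = x + g * innov, (1 - g) * p  # update on healthy samples only
    return faults

# A roughly constant signal with an abrupt sensor fault at sample 6:
z = [0.0, 0.1, -0.1, 0.05, 0.0, -0.05, 5.0, 0.1, 0.0, -0.1]
print(kalman_fdi(z))  # -> [6]
```

The EKF and UKF variants replace the linear predict step with a nonlinear propagation, but the residual test is the same.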
Abstract:
The usual programs for load flow calculation were in general developed to simulate electric energy transmission, subtransmission and distribution systems. However, the mathematical methods and algorithms used in their formulations were mostly based on the characteristics of transmission systems, which were the main concern of engineers and researchers. The physical characteristics of transmission systems, though, are quite different from those of distribution systems. In transmission systems, the voltage levels are high and the lines are generally very long, so the capacitive and inductive effects that appear in the system have a considerable influence on the quantities of interest and must be taken into account. Also, in transmission systems the loads have a macro nature, for example cities, neighborhoods, or big industries. These loads are generally practically balanced, which reduces the need for three-phase load flow methodologies. Distribution systems, on the other hand, have different characteristics: the voltage levels are low in comparison to transmission, which practically annuls the capacitive effects of the lines. The loads in this case are transformers whose secondaries supply small consumers, often single-phase ones, so the probability of finding an unbalanced circuit is high. The use of three-phase methodologies therefore becomes important. In addition, equipment such as voltage regulators, which use both phase and line voltage concepts in their operation, requires a three-phase methodology in order to simulate its real behavior. For these reasons, a method for three-phase load flow calculation was first developed in this work, to simulate the steady-state behavior of distribution systems.
To achieve this goal, the Power Summation Algorithm was used as the basis for the three-phase method. This algorithm has been widely tested and approved by researchers and engineers for the simulation of radial electric energy distribution systems, mainly in single-phase representation. In our formulation, lines are modeled as three-phase circuits, considering the magnetic coupling between phases; the earth effect is accounted for through the Carson reduction. It is important to point out that, although loads are normally connected to the transformer secondaries, the hypothesis of star- or delta-connected loads on the primary circuit was also considered. To simulate voltage regulators, a new model was used that allows various configurations to be simulated according to their real operation. Finally, the representation of switches with current measurement at various points of the feeder was considered: the loads are adjusted during the iterative process so that the current in each switch converges to the measured value specified in the input data. In a second stage of the work, sensitivity parameters were derived from the described load flow, with the objective of supporting subsequent optimization processes. These parameters are found by computing the partial derivatives of one variable with respect to another, in general voltages, losses and reactive powers. After describing the calculation of the sensitivity parameters, the Gradient Method is presented, using these parameters to optimize an objective function defined for each type of study. The first study addresses the reduction of technical losses in a medium voltage feeder through the installation of capacitor banks; the second addresses the correction of the voltage profile through the installation of capacitor banks or voltage regulators.
For loss reduction, the objective function is the sum of the losses in all parts of the system. For voltage profile correction, the objective function is the sum of the squared voltage deviations at each node with respect to the rated voltage. At the end of the work, results of applying the described methods to several feeders are presented, to give insight into their performance and accuracy.
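The backward/forward sweep at the heart of power-summation load flow can be sketched on a tiny single-phase radial feeder. The per-unit data below are invented, and branch losses and reactances are ignored; the thesis's coupled three-phase formulation is far more complete:

```python
# Toy backward/forward sweep on a 3-node radial feeder (per-unit values).
# Feeder: node 0 (source) -- node 1 -- node 2, purely resistive branches.

r = [0.01, 0.02]          # branch resistances (source->1, 1->2), p.u.
p_load = [0.0, 0.3, 0.2]  # active power drawn at each node, p.u.
v_source = 1.0

v = [v_source] * 3
for _ in range(20):  # iterate until voltages settle
    # Backward sweep: sum downstream powers (branch losses ignored here)
    p_branch = [p_load[1] + p_load[2], p_load[2]]
    # Forward sweep: voltage drop along each branch, I ~ P / V at the far end
    for k in range(2):
        i = p_branch[k] / v[k + 1]
        v[k + 1] = v[k] - r[k] * i

print(round(v[2], 4))  # -> 0.9909
```

The full algorithm additionally accumulates branch losses in the backward sweep and uses complex powers and impedances.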
Abstract:
In industrial informatics, several attempts have been made to develop notations and semantics for classifying and describing different kinds of system behavior, particularly in the modeling phase. Such attempts provide the infrastructure to solve real engineering problems and build practical systems that aim, mainly, to increase the productivity, quality, and safety of the process. Despite the many studies that have attempted to develop user-friendly methods for industrial controller programming, controllers are still programmed by conventional trial-and-error methods and, in practice, there is little written documentation on these systems. The ideal solution would be a computational environment that allows industrial engineers to implement the system using a high-level language that follows international standards. Accordingly, this work proposes a methodology for plant and control modeling of discrete event systems that include sequential, parallel and timed operations, using a formalism based on Statecharts called Basic Statechart (BSC). The methodology also provides automatic procedures to validate and implement these systems. To validate the methodology, two case studies with typical examples from the manufacturing sector are presented. The first example shows the sequential control of a tagged machine, used to illustrate dependencies between the devices of the plant. The second example discusses more than one strategy for controlling a manufacturing cell. The uncontrolled model has 72 states (distinct configurations); the model with sequential control generates 20 different states but acts in only 8 distinct configurations, while the model with parallel control generates 210 different states but acts in only 26 distinct configurations, making it a less restrictive control strategy than the previous one.
Lastly, an example is presented to highlight the modular character of the methodology, which is very important for the maintenance of applications. In this example, the sensors that identify pieces in the plant were removed, so changes in the control model are needed to transmit the information from the input buffer sensor to the other positions of the cell.
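The kind of event-driven sequential control such a Statechart-based formalism describes can be sketched as a transition table. The machining-station states and events below are hypothetical, not the BSC notation or the thesis's case studies:

```python
# Toy event-driven state machine for a hypothetical machining station.

TRANSITIONS = {
    ("idle", "piece_arrived"): "clamping",
    ("clamping", "clamped"): "machining",
    ("machining", "done"): "unloading",
    ("unloading", "removed"): "idle",
}

def run(events, state="idle"):
    trace = [state]
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)  # ignore invalid events
        trace.append(state)
    return trace

print(run(["piece_arrived", "clamped", "done", "removed"]))
# -> ['idle', 'clamping', 'machining', 'unloading', 'idle']
```

Statecharts extend this flat picture with hierarchy, parallel (orthogonal) regions, and timed transitions, which is what lets the controlled cell models above be composed and counted configuration by configuration.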
Abstract:
Traditional processes for the treatment of hazardous waste are questionable because they generate other wastes that adversely affect people's health. In an attempt to minimize these problems, a system was developed for the treatment of hazardous waste by thermal plasma, a more appropriate technology since it produces high temperatures, preventing the formation of pollutants toxic to human beings. The present work presents an automation solution for this plant. The system has local and remote monitoring resources to ensure the safety of the operators as well as of the process itself. Special attention was given to the control of the temperature of the plant's main reactor, since it is where the main processing occurs and because it presents a complex mathematical model. For this, cascaded controls based on fuzzy logic were employed. A process computer with a dedicated man-machine interface (MMI) provides the operator with information about and control of the plant, including over the Internet. A compact PLC module acts as the central element of the automation management and plant control, receiving information from the sensors and sending it to the MMI.
Abstract:
In the oil recovery process, rock heterogeneity has a huge impact on how fluids move in the field, defining how much oil can be recovered. To study this variability, percolation theory, which describes phenomena based on geometry and connectivity, is a very useful model. Percolation results are three-dimensional data that have no physical meaning until visualized as images or animations. Although many powerful and sophisticated visualization tools have been developed, they focus on the generation of planar 2D images. To interpret the data as they would appear in the real world, virtual reality techniques using stereo images can be used. In this work we propose an interactive and helpful tool, named ZSweepVR, based on virtual reality techniques, that allows a better comprehension of the volumetric data generated by the simulation of dynamic percolation. The developed system can render images using two different techniques: surface rendering and volume rendering. Surface rendering is accomplished through OpenGL directives and volume rendering through the ZSweep direct volume rendering engine. For volume rendering, we implemented an algorithm to generate stereo images. We also propose enhancements to the original percolation algorithm in order to obtain better performance. We applied the developed tools to a mature field database, obtaining satisfactory results. The use of stereoscopic and volumetric images brought valuable contributions to the interpretation and cluster formation analysis in percolation, which can certainly lead to better decisions about the exploration and recovery process in oil fields.
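The cluster identification at the heart of percolation analysis can be sketched as flood-fill labeling of occupied sites. The small 2D grid below is illustrative; the thesis works with 3D volumes and their stereo rendering:

```python
# Toy percolation cluster labeling: flood-fill connected occupied sites
# (4-neighborhood) on a small 2D grid.

def label_clusters(grid):
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not labels[r][c]:
                current += 1
                stack = [(r, c)]          # flood-fill one connected cluster
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and grid[y][x] and not labels[y][x]:
                        labels[y][x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return current, labels

grid = [[1, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1],
        [1, 0, 0, 0]]
n, _ = label_clusters(grid)
print(n)  # -> 3
```

In 3D the same labeling runs over a 6-neighborhood, and the resulting cluster labels are what the volume renderer colors and the stereo view helps disentangle.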