980 results for "dados técnicos"
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
This study aims to identify the existence of a quality culture in Brazilian automotive dealerships certified to ISO 9001, motivated by the research question: does a quality culture exist in these dealerships that facilitates the adoption of the quality practices supported by ISO 9001? The theoretical review covers five themes: organizational culture, quality culture, total quality management, the ISO 9001 quality management system, and the Brazilian automobile industry. Methodologically, the research is applied in nature, with a quantitative approach; it is exploratory in its objectives and uses bibliographic, documental, and survey procedures. The participating organizations were Brazilian automotive dealerships certified to ISO 9001. The research intended to cover all 80 active dealerships with ISO 9001 certification identified by the Brazilian Committee for Quality (ABNT CB-25); 32 companies participated (a 40% response rate). The questionnaire was sent to sales managers and organized into five sections: 1) introductory message, 2) manager profile, 3) reasons for implementing ISO 9001 and the benefits it generated, 4) adoption levels of quality practices, and 5) diagnosis of organizational culture. The questions in sections 2 and 3 were multiple choice; those in the remaining sections used a five-point Likert scale. Data were analyzed with descriptive statistics, representing the data as frequency percentages (FP) and standard levels (SL). The results showed that the surveyed dealerships have an organizational culture with very high prevalence of the "outcome orientation" and "attention to detail" cultural dimensions. The other two dimensions considered conducive to quality (innovation and teamwork/respect for people) also showed high prevalence.
Based on these results, it is concluded that the organizational culture of Brazilian dealerships certified to ISO 9001 is quality oriented and conducive to the adoption of quality practices supported by TQM systems. However, the quality culture identified is not yet sufficiently developed to support quality practices at optimal levels, which constitutes an unfavorable scenario for dealing with highly demanding customers.
Abstract:
This thesis proposes a method for a mobile robot to build a hybrid map of an indoor, semi-structured environment. The topological part of the map captures the spatial relationships among rooms and corridors: it is a topology-based graph whose nodes are rooms or corridors, and each edge between two distinct nodes represents a door. The metric part of the map consists of a set of parameters describing a geometric figure that adapts to the free space of the local environment. This figure is computed from a set of points that sample the boundaries of the local free space, obtained with range sensors and knowledge of the robot's pose. A method based on the generalized Hough transform is applied to this set of points to obtain the geometric figure. The hybrid map is built incrementally while the robot explores the environment. Each room is associated with a local metric map and, consequently, with a node of the topological map. During mapping, the robot may use recent metric information about the environment to improve its global or relative pose.
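The topology-plus-metric structure described above can be sketched as a small data structure. This is only an illustration of the idea, not the thesis's implementation; all names and the example figure parameters are hypothetical:

```python
class HybridMap:
    """Topological graph of rooms/corridors; each node also stores the
    parameters of the geometric figure fitted to its local free space."""

    def __init__(self):
        self.metric = {}    # node -> parameter tuple of the local figure
        self.doors = set()  # undirected edges: a door links two nodes

    def add_place(self, name, figure_params):
        self.metric[name] = figure_params

    def add_door(self, a, b):
        self.doors.add(frozenset((a, b)))

    def neighbors(self, name):
        return sorted(n for e in self.doors if name in e
                      for n in e if n != name)

m = HybridMap()
m.add_place("room1", (4.0, 3.0))       # e.g. width, depth of a fitted rectangle
m.add_place("corridor", (12.0, 1.5))
m.add_door("room1", "corridor")        # a door connects the two nodes
```

Incremental mapping then amounts to calling `add_place` whenever a new local figure is fitted and `add_door` whenever a doorway is crossed.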
Abstract:
This thesis proposes a model for asynchronously replicating heterogeneous databases. The model combines, in a systematic way and in a single project, different concepts, techniques, and paradigms from the areas of database replication and heterogeneous database management. One of the main advantages of replication is that it allows applications to continue processing information during intervals when they are off the network and to trigger database synchronization as soon as the network connection is reestablished. Accordingly, the model introduces a communication and update protocol that takes into account the asynchronous characteristics of the environment. As part of the work, a tool based on the model's premises was developed in Java to process, test, simulate, and validate the proposed model.
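The reconnect-and-synchronize behavior the model describes can be sketched minimally. This is an illustrative change-log scheme, not the thesis's actual protocol; all names are hypothetical:

```python
class OfflineReplica:
    """Buffers local updates while disconnected; flushes them to the
    master copy when the connection is reestablished."""

    def __init__(self, master):
        self.master = master       # dict standing in for the central database
        self.local = dict(master)  # local working copy
        self.log = []              # change log accumulated while offline
        self.online = False

    def write(self, key, value):
        self.local[key] = value
        if self.online:
            self.master[key] = value
        else:
            self.log.append((key, value))  # defer until reconnection

    def reconnect(self):
        self.online = True
        for key, value in self.log:        # replay deferred updates in order
            self.master[key] = value
        self.log.clear()

master = {"temp": 10}
replica = OfflineReplica(master)
replica.write("temp", 25)   # offline: only the replica and the log change
replica.reconnect()         # synchronization is triggered on reconnect
```

A real protocol would also need conflict resolution between replicas; the sketch only shows the deferred-update idea.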
Abstract:
In recent decades, changes in the telecommunications industry, together with competition driven by privatization and concession policies, have irrefutably stimulated the world market and given rise to a new reality. The effects in Brazil are evident in significant growth rates: in 2012 the sector reached a net operating income of 128 billion dollars, placing the country among the five largest powers in mobile communications worldwide. In this context, an issue of increasing importance to the financial health of companies is their ability to retain customers and turn them into loyal ones. Customer churn has been generating monthly disconnection rates of about two to four percent, making retention one of the biggest challenges for business management, since acquiring a new customer costs more than five times as much as retaining an existing one. To address this, models based on structural equation modeling have been developed to identify the relationships among the various determinants of customer loyalty in service contexts. The original contribution of this thesis is a loyalty model built from the relationships among determinants of satisfaction (latent variables) and from attributes that shape perceptions of service quality in the mobile communications industry, such as quality, satisfaction, value, trust, expectation, and loyalty. It is a qualitative study to be conducted with operators' customers through simple random sampling, using structured questionnaires. As a result, the proposed model and its statistical evaluation should allow operators to conclude that customer loyalty is directly influenced by the technical and operational quality of the services offered, as well as provide a satisfaction index for the mobile communications segment.
Abstract:
This work presents simulation results for an identification platform compatible with the INPE Brazilian Data Collection System, modeled with SystemC-AMS. SystemC-AMS is a library of C++ classes dedicated to the simulation of heterogeneous systems, offering powerful resources to describe models in the digital, analog, and RF domains, as well as mechanical and optical ones. The designed model is divided into four parts. The first block accounts for the satellite's orbit, which is necessary to model the propagation channel correctly, including Doppler effect, attenuation, and thermal noise. The identification block detects the satellite's presence; it is composed of a low-noise amplifier, a band-pass filter, a power detector, and a logic comparator. The controller block is responsible for enabling the RF transmitter when the satellite's presence is detected; it was modeled as a Petri net, given the asynchronous nature of the system. The fourth block is the RF transmitter unit, which modulates the information in BPSK ±60°; it is composed of an oscillator, a mixer, an adder, and an amplifier. The whole system was simulated simultaneously. The results are being used to specify system components and to elaborate testbenches for design verification.
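As a rough illustration (not the SystemC-AMS model itself), the ±60° phase modulation performed by the transmitter block can be sketched in Python. The carrier frequency, sample rate, bit rate, and bit sequence below are arbitrary assumptions:

```python
import numpy as np

def bpsk_60(bits, fc=1000.0, fs=16000.0, bit_rate=250.0):
    """Phase-modulate a bit sequence onto a carrier with +/-60 degree deviation."""
    samples_per_bit = int(fs / bit_rate)
    phase_dev = np.deg2rad(60.0)
    # Map bits {0, 1} to phase offsets {-60, +60} degrees, held for one bit period.
    phases = np.repeat(np.where(np.asarray(bits) > 0, phase_dev, -phase_dev),
                       samples_per_bit)
    t = np.arange(phases.size) / fs
    return np.cos(2 * np.pi * fc * t + phases)

signal = bpsk_60([1, 0, 1, 1])
```

In the actual platform this role is played by the oscillator/mixer/adder chain of the transmitter block.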
Abstract:
We propose a new approach to the reduction and abstraction of visual information for robot vision applications. Essentially, we combine a multi-resolution representation with a moving fovea to reduce the amount of information extracted from an image. We introduce a mathematical formalization of the moving-fovea approach and mapping functions that support the use of this model, and we propose two indexes (resolution and cost) that help choose the model's variables. With this theoretical approach, it is possible to apply several filters, calculate disparity, and perform motion analysis in real time (less than 33 ms to process an image pair on an AMD Turion Dual Core 2 GHz notebook). As the main result, the moving fovea usually allows the robot to keep a region of interest visible in both images without physically moving its devices. We validate the proposed model with experimental results.
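The reduction idea can be sketched as follows: each level covers a progressively smaller region around the fovea, but all levels are stored at the same small size, so total data grows linearly with the number of levels instead of with image area. This is a simplified sketch of the concept, not the formalization from the work; window sizes and level count are assumptions:

```python
import numpy as np

def fovea_levels(image, fovea, m=3, w=16):
    """Extract m+1 levels of w x w pixels centered on the fovea.
    Level 0 samples the whole image coarsely; the last level is the
    full-resolution neighborhood of the fovea."""
    h, wid = image.shape
    fy, fx = fovea
    levels = []
    for k in range(m + 1):
        frac = k / m
        # Region size shrinks geometrically from the full image to w x w.
        sy = int(round(h - frac * (h - w)))
        sx = int(round(wid - frac * (wid - w)))
        y0 = int(np.clip(fy - sy // 2, 0, h - sy))
        x0 = int(np.clip(fx - sx // 2, 0, wid - sx))
        region = image[y0:y0 + sy, x0:x0 + sx]
        step_y, step_x = max(sy // w, 1), max(sx // w, 1)
        levels.append(region[::step_y, ::step_x][:w, :w])
    return levels

img = np.arange(64 * 64, dtype=float).reshape(64, 64)
levels = fovea_levels(img, (32, 32))
```

Moving the fovea only changes the crop offsets, which is why a region of interest can be tracked without moving the camera.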
Abstract:
Digital signal processing (DSP) aims to extract specific information from digital signals. Digital signals are, by definition, physical quantities represented by sequences of discrete values, from which the desired information can be extracted and analyzed. Unevenly sampled data cannot be properly analyzed with standard DSP techniques. This work adapts a DSP technique, multiresolution analysis, to unevenly sampled data, in support of the studies of the CoRoT laboratory at UFRN. The approach is based on re-indexing the wavelet transform so that it handles unevenly sampled data properly. The technique was effective, presenting satisfactory results.
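The thesis's re-indexing scheme is not detailed in the abstract. As a loose illustration of the idea only, one can treat the rank order of the uneven timestamps as a regular index and then apply an ordinary Haar decomposition level; the data below are made up:

```python
import numpy as np

def haar_level(values):
    """One level of a Haar multiresolution analysis: pairwise
    approximation (averages) and detail (differences) coefficients."""
    v = np.asarray(values, dtype=float)
    if v.size % 2:            # drop the last sample if the length is odd
        v = v[:-1]
    approx = (v[0::2] + v[1::2]) / 2.0
    detail = (v[0::2] - v[1::2]) / 2.0
    return approx, detail

# Unevenly sampled series: sort by time and re-index by rank order,
# an illustrative stand-in for the thesis's re-indexing scheme.
t = np.array([0.0, 0.13, 0.9, 1.7, 2.2, 3.9, 4.0, 6.5])
y = np.array([1.0, 3.0, 2.0, 4.0, 6.0, 8.0, 5.0, 7.0])
order = np.argsort(t)
approx, detail = haar_level(y[order])
```

Note that plain rank re-indexing discards the actual gap sizes; a faithful method must account for them.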
Abstract:
Recent years have seen increasing acceptance and adoption of parallel processing, both for high-performance scientific computing and for general-purpose applications. This acceptance has been favored mainly by the development of massively parallel processing (MPP) environments and of distributed computing. A common point between distributed systems and MPP architectures is the notion of message passing, which allows communication between processes. A message-passing environment consists basically of a communication library that acts as an extension of the programming languages used to build parallel applications, such as C, C++, and Fortran. A fundamental aspect of developing parallel applications is analyzing their performance. Several metrics can be used in this analysis: execution time, efficiency in the use of the processing elements, and scalability of the application with respect to the number of processors or to the size of the problem instance. Establishing models or mechanisms for this analysis can be quite complicated, given the parameters and degrees of freedom involved in implementing a parallel application. One alternative is the use of tools for collecting and visualizing performance data, which allow the user to identify bottlenecks and sources of inefficiency in an application. Efficient visualization requires identifying and collecting data about the application's execution, a stage called instrumentation.
This work first presents a study of the main techniques used for collecting performance data, followed by a detailed analysis of the main tools available for Beowulf-type clusters running Linux on the x86 platform with MPI (Message Passing Interface) communication libraries such as LAM and MPICH. This analysis is validated on parallel applications that train perceptron-type neural networks using backpropagation. The conclusions show the potential and ease of use of the analyzed tools.
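The performance metrics mentioned above, speedup and efficiency, have standard definitions; a minimal sketch with made-up timing values:

```python
def speedup(t_serial, t_parallel):
    """Speedup S(p) = T(1) / T(p)."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, p):
    """Efficiency E(p) = S(p) / p: fraction of ideal linear speedup."""
    return speedup(t_serial, t_parallel) / p

# Hypothetical measurements: 120 s serially, 20 s on 8 processors.
s = speedup(120.0, 20.0)        # 6.0
e = efficiency(120.0, 20.0, 8)  # 0.75
```

Instrumentation tools like the ones analyzed in the work supply the per-process timings from which such metrics are computed.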
Abstract:
Self-organizing maps (SOM) are artificial neural networks widely used in data mining, mainly because the fixed grid of neurons associated with the network makes them a dimensionality-reduction technique. To properly partition and visualize a SOM, the various methods available in the literature must be applied in a post-processing stage, which consists of inferring, through the neurons, relevant characteristics of the data set. In general, applying such processing to the network's neurons, rather than to the entire database, reduces computational cost thanks to vector quantization. This work proposes post-processing of the SOM neurons in the input and output spaces, combining visualization techniques with algorithms based on gravitational forces and on the search for the shortest path with the greatest reward. These methods take into account the connection strength between neighboring neurons and characteristics of pattern density and inter-neuron distances, both associated with the positions the neurons occupy in the data space after training. The goal is thus to define more clearly the arrangement of the clusters present in the data. Experiments were carried out to evaluate the proposed methods on various artificially generated data sets as well as real-world data sets, and the results were compared with those of a number of well-known methods from the literature.
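As a simple example of post-processing SOM neurons rather than the raw data, the classic U-matrix (a standard technique from the literature, not the gravitational method proposed in this work) averages the distance from each neuron's weight vector to its grid neighbors:

```python
import numpy as np

def u_matrix(weights):
    """weights: (rows, cols, dim) array of SOM codebook vectors.
    Returns the mean distance of each neuron to its 4-neighbors;
    high values mark boundaries between clusters."""
    rows, cols, _ = weights.shape
    u = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            dists = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    dists.append(np.linalg.norm(weights[i, j] - weights[ni, nj]))
            u[i, j] = np.mean(dists)
    return u

# Tiny 2x2 map with 1-D codebook vectors: one neuron far from the rest.
w = np.zeros((2, 2, 1))
w[1, 1, 0] = 1.0
u = u_matrix(w)
```

Because it works on the grid of neurons, the cost depends only on the map size, not on the size of the original database.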
Abstract:
The control of industrial processes has become increasingly complex due to the variety of factory devices, quality requirements, and market competition. Such complexity requires a large amount of data to be handled by the three levels of process control: field devices, control systems, and management software. Using data effectively at each of these levels is extremely important to industry. Many of today's industrial computer systems consist of distributed software written in a wide variety of programming languages and developed for specific platforms, so many companies invest significantly to maintain or even rewrite their systems for different platforms. Furthermore, a software system rarely works in complete isolation: in industrial automation, software commonly has to interact with other systems on different machines, often written in different languages. Interoperability is thus not just a long-term challenge but a requirement of current industrial software production. This work proposes a middleware solution for communication over web services and presents a use case applying the developed solution to an integrated system for industrial data capture, making such data available in a simplified, platform-independent way across the network.
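The platform-independent exchange that such middleware enables can be illustrated with a minimal sketch; the field names and units below are hypothetical, not the work's actual message format:

```python
import json

def capture_reading(device_id, variable, value, unit):
    """Package an industrial data point in a language-neutral JSON message."""
    return json.dumps({
        "device": device_id,
        "variable": variable,
        "value": value,
        "unit": unit,
    })

msg = capture_reading("plc-07", "boiler_temperature", 182.5, "degC")
decoded = json.loads(msg)  # any platform with a JSON parser can consume it
```

A web-service layer then only needs to transport such text messages, decoupling field devices from the management software that reads them.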
Abstract:
Trade competitiveness, driven by the greater availability of products with lower quality and cost, has created a new reality of industrial production with small tolerances. Deviations in production cannot be ruled out; uncertainties can occur statistically. Consumers worldwide, including in Brazil, are supported by consumer protection codes in lawsuits against poor-quality products. An automobile is composed of various systems and thousands of constituent parts, which increases the likelihood of failure, and its dynamic and safety systems are critical with regard to the consequences of possible failures. Investigating a failure offers the possibility of learning and contributing to various improvements. The main purpose of this work is to develop a systematic, specific methodology for investigating the root cause of a failure that occurred at an axle end of the front suspension of an automobile, and to perform comparative analyses between the fractured part and the design information. The research was based on a failure in an automotive suspension system involved in a judicial case, which resulted in property and personal damages. In investigations involving the analysis of mechanical failures, knowledge of materials engineering plays a crucial role, since it enables applying materials-characterization techniques and relating the technical attributes required of a part to the structure of its manufacturing material, providing greater scientific grounding to the work. The specific methodology developed follows its own flowchart. In the early phase, the case records and information on the parties involved were collected.
The following laboratory analyses were performed: macrography of the fracture; micrography of the initial and final fracture with SEM (scanning electron microscopy); phase analysis with optical microscopy; Brinell hardness and Vickers microhardness tests; quantitative and qualitative chemical analysis using X-ray fluorescence, with optical spectroscopy for carbon; and a qualitative study of the state of stress. Field data were also collected. The values measured on the fractured and stock parts were then compared with the design values. The investigation led to the following conclusions: the developed methodology systematized the investigation and enabled cross-checking data, minimizing the probability of diagnostic error; the morphology of the fracture indicates failure by the fatigue mechanism at a geometrically propitious location, a stress concentrator; the part was subjected to low stresses, judging by the sectional area of the final fracture; the manufacturing material of the fractured part has low ductility; the component fractured earlier than recommended by the manufacturer; the percentages of C, Si, Mn, and Cr in the fractured part differ from the design values; the upper-limit hardness of the fractured part is higher than the design value; and there is no manufacturing uniformity between the stock and fractured parts. This work will contribute to optimizing the conduct of judicial expert examinations in mechanical engineering.
Abstract:
Materials known as technical textiles can be defined as structures designed and developed to fulfill specific functional requirements of various industrial sectors, as in the automotive and aerospace industries. In this respect, technical textiles differ from conventional textile materials, in which aesthetic and comfort needs are of primary importance. On this basis, this dissertation focuses on the development of textile structures from aramid and glass fibers, aiming at composite materials that combine the properties of two different structures manufactured in a single operation, where each structure contributes to improving the properties of the resulting composite material. Textile structures with low weight and different compositions were therefore created at laboratory scale: aramid (100%), glass (100%), and aramid/glass (65/35%), to be used as reinforcing elements in composite materials with a polyester matrix. These composites were tested in tension and their fracture surfaces evaluated by SEM. The analysis of the mechanical properties of the developed composites confirmed the efficiency of the prepared structures as reinforcing elements, since the strength values of the composites are far superior to those of the polyester matrix. Hybridization of the fabric structure also proved efficient, since the best results were obtained for the hybrid composites, whose rupture strength was similar to that of 1020 steel, reaching values on the order of 340 MPa.
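The idea that each fiber contributes to the hybrid's properties can be illustrated with the classic rule of mixtures for longitudinal stiffness. This is a textbook approximation, not a calculation from the dissertation; all volume fractions and moduli below are illustrative assumptions:

```python
def rule_of_mixtures(fiber_fractions, fiber_moduli, matrix_modulus):
    """Longitudinal modulus E = sum(Vf_i * Ef_i) + Vm * Em (GPa)."""
    vf_total = sum(fiber_fractions)
    e_fibers = sum(v * m for v, m in zip(fiber_fractions, fiber_moduli))
    return e_fibers + (1.0 - vf_total) * matrix_modulus

# Illustrative values: 20% aramid (E ~ 70 GPa), 10% E-glass (E ~ 72 GPa)
# in a polyester matrix (E ~ 3.5 GPa).
e_hybrid = rule_of_mixtures([0.20, 0.10], [70.0, 72.0], 3.5)
```

Even at low fiber fractions, the fiber terms dominate the matrix term, which is consistent with the large gains over the neat polyester matrix reported above.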
Abstract:
Materials known as technical textiles can be defined as structures designed and developed to meet specific functional requirements of various industry sectors, as in the automotive and aerospace industries, among other specific applications. This work presents the development and manufacture of a polymer composite with isophthalic polyester resin, whose reinforcement is a technical textile fabric made from high-performance fibers: aramid (Kevlar 49) and E-glass. The fabrics were manufactured by the same method, with the aim of improving the tensile strength of the resulting polymer composite material. Low-grammage technical textile structures of differentiated composition were developed at laboratory scale: aramid (100%), hybrid 1 aramid/glass (65/35%), and hybrid 2 aramid/glass (85/15%), for use as reinforcing elements in composite materials with an unsaturated isophthalic polyester matrix. The polymer composites produced were tested in uniaxial tension and their fracture surfaces evaluated by SEM. The work characterizes the performance of the prepared polymer composites, identifying changes in strength and the corresponding mechanical behavior. The objectives are to verify the feasibility of using this reinforcement structure, together with high-performance fibers and resin, in terms of workability and mechanical strength; to verify the adherence of the fiber to the matrix and examine the fracture surface by scanning electron microscopy; and to determine tensile strength by tensile testing. The results of the uniaxial tensile tests indicate the efficiency of the low percentage of reinforcing element: a technical textile fabric structure that adds lightness and low weight to polymer composites.
Abstract:
Crude oil is a complex liquid mixture of organic and inorganic compounds dominated by hydrocarbons, ranging from the simplest alkanes to complex aromatic compounds, from which derivatives such as gasoline, diesel, alcohol, kerosene, and naphtha are obtained. These derivatives can be extracted from any oil, but production is only feasible when the quality is very high, that is, when the content of low-molecular-weight hydrocarbons is high. The American Petroleum Institute (API) developed a classification system for the various types of oil. In Brazil, the quality of most of the oil extracted from wells is very low, so new technology and better refining practices must be developed to produce petroleum products of higher commercial value. It is therefore necessary to study the thermodynamic equilibrium properties of the derivative compounds of interest. This dissertation aims to determine vapor-liquid equilibrium (VLE) data for the systems phenylcyclohexane-CO2 and cyclohexane-phenylcyclohexane-CO2 at high pressure and temperatures between 30 and 70 °C. Furthermore, the VLE data measured in this work and data from the literature were compared with the Peng-Robinson molecular thermodynamic model, using the simulation program SPECS IVCSEP v5.60 and two adjustable interaction parameters, for modeling and simulation purposes. Finally, the apparatus developed for determining phase-equilibrium data at high pressures is presented.
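For reference, the Peng-Robinson equation of state used in the modeling has the standard pure-component form P = RT/(v - b) - a*alpha(T)/(v^2 + 2bv - b^2). A minimal sketch follows, using CO2 critical constants from standard tables; the molar volume chosen for the example is arbitrary:

```python
import math

R = 8.314  # J/(mol K)

def peng_robinson_pressure(T, v, Tc, Pc, omega):
    """Pure-component Peng-Robinson EoS: pressure (Pa) at temperature T (K)
    and molar volume v (m^3/mol), from critical constants and acentric factor."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / Tc)))**2
    return R * T / (v - b) - a * alpha / (v**2 + 2.0 * b * v - b**2)

# CO2: Tc = 304.13 K, Pc = 7.377 MPa, omega = 0.224; T = 40 degC.
p = peng_robinson_pressure(313.15, 1e-3, 304.13, 7.377e6, 0.224)
```

Mixture VLE calculations, as performed with SPECS, additionally require mixing rules with the binary interaction parameters mentioned above.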