72 results for rotary nozzle deposition pattern
Abstract:
This work sought to identify the different conceptions of family agriculture and the self-consumption practices established within the production unit. The hypothesis is that, given increasingly restricted conditions for producing for self-consumption, rural families currently face food insecurity as severe as that of urban families, contrary to what is usually argued. The research was carried out in three states of the Brazilian Northeast: Paraíba, Rio Grande do Norte and Sergipe. The results revealed that among the main factors exposing these families to constant food vulnerability is the low quality of food consumption with regard to availability, diversification and, above all, accessibility. The analyses can support a reflection on the dietary pattern of rural families in light of the precepts of the Food and Nutrition Security (FNS) policy
Abstract:
This Master's thesis applies theoretical models of balance-of-payments-constrained growth, specifically the Kaldor (1970) and Thirlwall (1979) models, to analyze the behavior and the specialization pattern of Brazilian exports and imports in recent years. It is observed that, in some periods, the specialization pattern has contributed to restricting the long-term growth of the Brazilian economy, and it is hypothesized that this is largely due to the lack of structural transformation policies. To this end, the performance of Brazilian exports and imports is analyzed after disaggregating them according to technological content, taking as the basis for comparison the group of countries to which Brazil belongs, the BRICs. The work is therefore a comparative analysis based on descriptive statistics. It is concluded that the low rate of GDP growth experienced by Brazil since the 1980s can be explained in part by the decoupling between the Brazilian National Innovation System (NIS) and the Brazilian productive structure, which would reduce the income elasticity of exports and raise that of imports, producing a specialization pattern intensive in primary commodities and low-skill labor
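For readers unfamiliar with the Thirlwall (1979) framework cited above, its balance-of-payments-constrained growth rate is usually stated as below; the notation is the standard one from that literature, not taken from the thesis itself:

$$ y_{BP} \;=\; \frac{\varepsilon\, z}{\pi} \;=\; \frac{x}{\pi} $$

where $y_{BP}$ is the growth rate consistent with balance-of-payments equilibrium, $x$ the growth rate of exports, $\varepsilon$ the world income elasticity of demand for exports, $z$ the growth rate of world income and $\pi$ the domestic income elasticity of demand for imports. A specialization pattern biased toward primary commodities tends to lower $\varepsilon$ and raise $\pi$, which is precisely the channel the abstract points to.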
Abstract:
This work presents a study of quality in health care, focusing on outpatient appointment scheduling. The main purpose is to define a statistical model and to propose a quality grade for the appointment waiting time, measured from the day the patient books the appointment to the day the consultation takes place. Reliability techniques and functions are used, whose main characteristic is the analysis of data on the time until the occurrence of a certain event. A random sample of 1743 patients was drawn from the appointment system of a university hospital, the Hospital Universitário Onofre Lopes of the Federal University of Rio Grande do Norte, Brazil, stratified by clinical specialty. The data were analyzed with parametric reliability methods, and the fitted regression model indicated that the Weibull distribution best describes the data. The proposed quality grade is based on the PAHO criteria for appointment waiting time, and the results show that no clinic reached the PAHO quality grade. The proposed grade can be used to set improvement priorities and as a criterion for quality control
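As an illustration of the kind of parametric reliability fit described above, the sketch below fits a Weibull distribution to synthetic waiting-time data with SciPy; the data, the 30-day target and the random seed are invented for the example and are not the study's figures:

```python
import numpy as np
from scipy import stats

# Hypothetical waiting times (days from booking to consultation);
# the real study used 1743 patients stratified by specialty.
rng = np.random.default_rng(42)
waits = rng.weibull(1.4, size=500) * 30.0

# Fit a two-parameter Weibull (location fixed at zero), as is usual
# for non-negative time-to-event data.
shape, loc, scale = stats.weibull_min.fit(waits, floc=0)
print(f"shape k = {shape:.2f}, scale lambda = {scale:.2f} days")

# Illustrative "quality grade": estimated fraction of patients seen
# within a target of, say, 30 days (threshold assumed, not PAHO's).
target = 30.0
p_within = stats.weibull_min.cdf(target, shape, loc=loc, scale=scale)
print(f"Estimated share of appointments within {target:.0f} days: {p_within:.1%}")
```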
Abstract:
Knowledge management has received major attention from product designers because many of the activities within this process must be creative and therefore depend essentially on the knowledge of the people involved. Moreover, the Product Development Process (PDP) is one of the activities in which knowledge management manifests itself in its most critical form, given the intense application of knowledge it demands. This thesis therefore analyzes knowledge management with the aim of improving the PDP and proposes a theoretical model of knowledge management. The model comprises five steps (creation, maintenance, dissemination, utilization and discard) and verifies the occurrence of four types of knowledge conversion (socialization, externalization, combination and internalization) in order to improve knowledge management in this process. In Small and Medium Enterprises (SMEs), intellectual capital managed efficiently and with the participation of all employees becomes the mechanism of the knowledge creation and transfer processes, supporting and consequently improving the PDP. The expected result is an effective and efficient application of the proposed model for building the organization's knowledge base (organizational memory), aiming at a better performance of the PDP. To this end, an extensive analysis of knowledge management (a qualitative and subjective evaluation instrument) was carried out in the Design department of a Brazilian organization (SEBRAE/RN). This analysis aimed to establish the state of the art of the department regarding the use of knowledge management, an important step for assessing its level of evolution in the practical use of knowledge management before implementing the proposed theoretical model and its methodology. Finally, based on the results of this diagnosis, a knowledge management system is suggested to facilitate knowledge sharing within the organization, in other words, the Design department
Abstract:
This thesis proposes the specification and performance analysis of a real-time communication mechanism for the IEEE 802.11/11e standard, called Group Sequential Communication (GSC). The GSC achieves better performance than the HCCA mechanism when dealing with small data packets by adopting decentralized medium access control with a publish/subscribe communication scheme. The main objective is to reduce the HCCA overhead of the Polling, ACK and QoS Null frames exchanged between the Hybrid Coordinator and the polled stations. The GSC eliminates the polling scheme used by the HCCA scheduling algorithm through a Virtual Token Passing procedure among the members of the real-time group, which are granted high-priority, sequential access to the communication medium. To improve the reliability of the proposed mechanism over a noisy channel, an error recovery scheme called the second chance algorithm is presented. This scheme is based on a block acknowledgment strategy that allows missing real-time messages to be retransmitted. Thus, the GSC mechanism sustains real-time traffic across many IEEE 802.11/11e devices with optimized bandwidth usage and minimal delay variation for data packets in the wireless network. To validate the communication scheme, the GSC and HCCA mechanisms were implemented in a network simulator developed in C/C++ and their performance results were compared. The experiments show the efficiency of the GSC mechanism, especially in industrial communication scenarios.
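A rough, simplified sketch of the virtual token idea follows; it is not the thesis's implementation, and the station identifiers, queues and single-round structure are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Station:
    sid: int
    queue: List[str] = field(default_factory=list)

def virtual_token_round(group: List[Station]) -> List[str]:
    """One cycle of virtual token passing: each member of the ordered
    real-time group transmits in sequence, without explicit polling."""
    transmitted = []
    for station in group:          # token order = group membership order
        if station.queue:
            frame = station.queue.pop(0)
            transmitted.append(f"station {station.sid}: {frame}")
        # the token implicitly passes to the next member after the slot
    return transmitted

# Usage: three hypothetical real-time stations sharing the medium.
group = [Station(1, ["sample#1"]), Station(2, []), Station(3, ["sample#2"])]
for line in virtual_token_round(group):
    print(line)
```

In the real mechanism the token order, slot durations and the second chance retransmissions are governed by the GSC rules rather than a plain loop; the point here is only that no Polling, ACK or QoS Null frames are needed to hand over the medium.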
Abstract:
In academia, it is common to create didactic processors for practical courses in the area of computer hardware, which can also serve as subjects in courses on software platforms, operating systems and compilers. Often these processors are described without a standard ISA, which requires the creation of compilers and other basic software to provide the hardware/software interface and hinders their integration with other processors and devices. Using reconfigurable devices described in an HDL allows any microarchitecture component to be created or modified, leading to changes in the functional units of the processor datapath as well as in the state machine that implements the control unit as new needs arise. In particular, RISP processors allow machine instructions to be modified, enabling instructions to be inserted or changed, and may even adapt to a new architecture. Taking didactic soft-core processors described in VHDL as its object of study, this work proposes a methodology and applies it to two processors of different complexity levels, showing that it is possible to tailor processors to a standard ISA without increasing hardware complexity, i.e. without a significant increase in chip area, while the performance in application execution remains unchanged or is even improved. The implementations also show that, besides it being possible to replace the architecture of a processor without changing its organization, a RISP processor can switch between different instruction sets, which can be extended to toggling between different ISAs, allowing a single processor to become an adaptive hybrid architecture suitable for embedded systems and heterogeneous multiprocessor environments
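As a loose software analogy for the ISA-switching capability claimed above (the thesis works in VHDL; this toy sketch only illustrates the idea that switching ISAs changes the decode tables, not the organization, and all opcodes below are invented):

```python
# Two decoders map different opcode encodings onto the same micro-operations,
# standing in for one datapath/organization serving two ISAs.
MICRO_OPS = {"ADD": lambda a, b: a + b, "SUB": lambda a, b: a - b}

ISA_A = {0x1: "ADD", 0x2: "SUB"}        # hypothetical encoding A
ISA_B = {0x9: "ADD", 0x4: "SUB"}        # hypothetical encoding B

def execute(opcode: int, a: int, b: int, isa: dict) -> int:
    """Decode with the selected ISA table, then run the shared datapath op."""
    return MICRO_OPS[isa[opcode]](a, b)

# Switching ISAs changes only the decode table, not the 'datapath'.
print(execute(0x1, 7, 5, ISA_A))   # ADD under ISA A -> 12
print(execute(0x4, 7, 5, ISA_B))   # SUB under ISA B -> 2
```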
Abstract:
The monitoring of patients in hospitals is usually done in a manual or semi-automated way, in which members of the healthcare team must constantly visit patients to ascertain their health condition. This procedure, however, compromises the quality of monitoring, since the shortage of physical and human resources in hospitals tends to overwhelm the healthcare team, preventing them from visiting patients with adequate frequency. Given this, many works in the literature propose alternatives to improve this monitoring through the use of wireless networks. In those works, the network is dedicated to the traffic generated by medical sensors, with no possibility of also carrying data from applications running on the user stations present in the hospital. In hospital automation environments this is a drawback, considering that the data generated by such applications can be directly related to patient monitoring. Thus, this thesis defines Wi-Bio, a communication protocol for establishing IEEE 802.11 networks for patient monitoring that enables the harmonious coexistence of the traffic generated by medical sensors and by user stations. The formal specification and verification of Wi-Bio were carried out through the design and analysis of Petri net models, and its validation was performed through simulations with the Network Simulator 2 (NS2) tool. The simulations were designed to portray a real patient monitoring environment corresponding to a floor of the nursing wards of the University Hospital Onofre Lopes (HUOL), located in Natal, Rio Grande do Norte. Moreover, in order to verify the feasibility of Wi-Bio with respect to the wireless network standards prevailing in the market, the test scenario was also simulated with the network elements using the HCCA access mechanism described in the IEEE 802.11e amendment. The results confirmed the validity of the designed Petri nets and showed that Wi-Bio, besides outperforming HCCA on most of the analyzed metrics, was also able to integrate the data generated by medical sensors and user applications efficiently on the same wireless network
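For readers unfamiliar with Petri net analysis, the minimal "token game" below illustrates how such models are executed; the places, transition and marking are invented and are not taken from the Wi-Bio nets:

```python
# Minimal Petri net 'token game': a transition fires when every input
# place holds at least one token, consuming and producing tokens.
marking = {"sensor_ready": 1, "medium_free": 1, "frame_sent": 0}

transitions = {
    "transmit": {"in": ["sensor_ready", "medium_free"], "out": ["frame_sent"]},
}

def enabled(t: str) -> bool:
    return all(marking[p] > 0 for p in transitions[t]["in"])

def fire(t: str) -> None:
    assert enabled(t), f"transition {t} is not enabled"
    for p in transitions[t]["in"]:
        marking[p] -= 1
    for p in transitions[t]["out"]:
        marking[p] += 1

fire("transmit")
print(marking)   # {'sensor_ready': 0, 'medium_free': 0, 'frame_sent': 1}
```

Formal verification tools explore all reachable markings of nets like this to check properties such as absence of deadlock, which is what the thesis does at much larger scale for the protocol's states.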
Abstract:
This paper presents the performance analysis of traffic retransmission algorithms proposed for the HCCA medium access mechanism of the IEEE 802.11e standard applied to industrial environments. The nature of this kind of environment, subject to electromagnetic interference, combined with the wireless medium of the IEEE 802.11 standard, which is susceptible to such interference, and the lack of retransmission mechanisms, makes it impracticable to guarantee the quality of service for real-time traffic that the IEEE 802.11e standard proposes and that this environment requires. To solve this problem, this paper proposes a new approach involving the creation and evaluation of retransmission algorithms in order to ensure a level of robustness, reliability and quality of service for wireless communication in such environments. According to this approach, if a transmission error occurs, the traffic scheduler is able to manage retransmissions to recover the lost data. The proposed approach is evaluated through simulations in which the retransmission algorithms are applied to different scenarios, abstractions of an industrial environment, and the results, obtained with a purpose-built network simulator, are compared with each other to assess which of the algorithms performs best in a pre-defined application
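A minimal sketch of what such a retransmission policy can look like is given below; the retry budget, queue discipline and channel model are all assumptions for illustration and do not reproduce the algorithms evaluated in the paper:

```python
from collections import deque

# Toy retransmission scheduler: on a failed transmission the frame goes back
# to the head of the queue until a retry budget is exhausted.
MAX_RETRIES = 2   # assumed budget, not a value from the paper

def serve_txop(queue: deque, channel_ok) -> list:
    """Transmit frames in FIFO order; reschedule failed ones within the budget."""
    delivered, retries = [], {}
    while queue:
        frame = queue.popleft()
        if channel_ok(frame):
            delivered.append(frame)
        elif retries.get(frame, 0) < MAX_RETRIES:
            retries[frame] = retries.get(frame, 0) + 1
            queue.appendleft(frame)          # retransmit before newer traffic
        # otherwise the frame is dropped (deadline assumed missed)
    return delivered

# Usage: frame 'rt2' fails once and is then delivered on the retry.
fails_once = {"rt2": 1}
def channel(frame):
    if fails_once.get(frame, 0) > 0:
        fails_once[frame] -= 1
        return False
    return True

print(serve_txop(deque(["rt1", "rt2", "rt3"]), channel))
```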
Abstract:
The advance of computer networks in recent decades is notable, whether in transmission rates, in the number of interconnected devices or in the existing applications. In parallel, this progress is also visible in various sectors of automation, such as the industrial, commercial and residential sectors. In one of its branches we find hospital networks, which can make use of a range of services, from the simple registration of patients to surgery performed by a robot under the supervision of a physician. At the intersection of both worlds appear the applications of Telemedicine and Telehealth, which work with the real-time transfer of high-resolution images, sound, video and patient data. A problem then arises, since computer networks, originally developed for the transfer of less complex data, are now being used by services that involve high transfer rates and require quality of service (QoS) guarantees from the network. Thus, this work analyzes and compares the performance of a network subjected to this type of application in two situations: first without the use of QoS policies, and second with the application of such policies, using the Metropolitan Health Network of the Federal University of Rio Grande do Norte (UFRN) as the test scenario
Abstract:
This work presents a scalable and efficient parallel implementation of the standard Simplex algorithm on a multicore architecture to solve large-scale linear programming problems. We present a general scheme explaining how each step of the standard Simplex algorithm was parallelized, pointing out important aspects of the parallel implementation. The performance analysis compares the sequential time of the Simplex tableau with that of IBM's CPLEX® Simplex. The experiments were executed on a shared-memory machine with 24 cores. The scalability analysis, performed with problems of different dimensions, indicates that our parallel standard Simplex algorithm achieves better parallel efficiency for problems with more variables than constraints. In comparison with CPLEX®, the proposed parallel algorithm achieved an efficiency up to 16 times better
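The sketch below shows the tableau pivot that dominates the cost of the standard Simplex and whose row updates are the natural target for the multicore parallelization described above; the tiny example problem is invented, and NumPy stands in for the thread-level parallelism:

```python
import numpy as np

def pivot(T: np.ndarray, row: int, col: int) -> None:
    """Gauss-Jordan elimination on the tableau around the chosen pivot."""
    T[row, :] /= T[row, col]                     # normalize pivot row
    for i in range(T.shape[0]):                  # each row update is independent,
        if i != row:                             # hence trivially parallelizable
            T[i, :] -= T[i, col] * T[row, :]

# Tiny illustrative tableau: maximize x + y s.t. x + 2y <= 4, 3x + y <= 6
# (last row holds the objective in canonical form).
T = np.array([[1., 2., 1., 0., 4.],
              [3., 1., 0., 1., 6.],
              [-1., -1., 0., 0., 0.]])
pivot(T, row=1, col=0)                           # bring x into the basis
print(T)
```

In a shared-memory implementation the loop over rows (or blocks of columns) is split among the cores, which is why problems with many variables relative to constraints tend to parallelize better.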
Abstract:
Multiphase flows in ducts can adopt several morphologies depending on the mass fluxes and the fluid properties. Annular flow is one of the most frequently encountered flow patterns in industrial applications; in gas-liquid systems, it consists of a liquid film flowing adjacent to the wall and a gas core flowing in the center of the duct. This work presents a numerical study of this flow pattern for gas-liquid systems in vertical ducts. To this end, a solution algorithm was developed and implemented in FORTRAN 90 to solve the governing transport equations numerically. The mass and momentum conservation equations are solved simultaneously from the wall to the center of the duct using the Finite Volume technique. Momentum conservation at the gas-liquid interface is enforced through an equivalent effective viscosity, which also allows both velocity fields to be obtained from a single system of equations. In this way, the velocity distributions across the gas core and the liquid film are obtained iteratively, together with the global pressure gradient and the liquid film thickness. The convergence criteria are based on the satisfaction of the mass balances within the liquid film and the gas core. For system closure, two approaches are presented for calculating the radial turbulent viscosity distribution within the liquid film and the gas core. The first combines a one-equation k-ε model and a low-Reynolds k-ε model; the second uses a low-Reynolds k-ε model to compute the eddy viscosity profile from the center of the duct all the way to the wall. Appropriate interfacial values for k and ε are proposed, based on concepts and ideas previously used with success in stratified gas-liquid flow. The proposed approaches are compared with an algebraic model from the literature, specifically devised for annular gas-liquid flow, using available experimental results, which also serves as a validation of the solution algorithm
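A heavily simplified finite-volume sketch of the single-system idea is given below: one radial momentum equation with a piecewise effective viscosity for film and core, solved from the centre to the wall. All numerical values (duct radius, film thickness, viscosities, pressure gradient) are invented, the flow is treated as laminar, and the real solver is the FORTRAN 90 code with turbulence closure described in the abstract:

```python
import numpy as np

# Solve d/dr( r * mu_eff * du/dr ) = r * dp/dz on 0 <= r <= R with
# symmetry at the centreline and no-slip at the wall, using finite volumes.
R, N = 0.025, 200                       # duct radius [m], number of volumes
dpdz = -2.0                             # imposed pressure gradient [Pa/m]
delta = 0.002                           # assumed liquid film thickness [m]
mu_gas, mu_liq = 1.8e-5, 1.0e-3         # molecular viscosities [Pa.s]

dr = R / N
r_faces = np.linspace(0.0, R, N + 1)
r_cent = 0.5 * (r_faces[:-1] + r_faces[1:])
mu_face = np.where(r_faces > R - delta, mu_liq, mu_gas)   # "effective" viscosity

A = np.zeros((N, N))
b = dpdz * r_cent * dr                   # integrated source over each volume
for i in range(N):
    aw = r_faces[i] * mu_face[i] / dr        # zero at the centreline face
    ae = r_faces[i + 1] * mu_face[i + 1] / dr
    if i > 0:
        A[i, i - 1] = aw
    if i < N - 1:
        A[i, i + 1] = ae
    A[i, i] = -(aw + ae)
    if i == N - 1:                           # no-slip wall, half-cell treatment
        A[i, i] -= ae

u = np.linalg.solve(A, b)
print(f"centreline velocity ~ {u[0]:.2f} m/s, near-wall velocity ~ {u[-1]:.4f} m/s")
```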
Abstract:
To obtain process stability and a quality weld bead, an adequate set of parameters is necessary: base current and base time, pulse current and pulse time, because these influence the mode of metal transfer and the weld quality in pulsed MIG (MIG-P), sometimes requiring special power sources with synergic modes and external control to achieve this stability. This work aims to analyze and compare the effects of the pulse parameters and of the droplet size on arc stability in MIG-P. Four sets of pulse parameters were analyzed: Ip = 160 A, tp = 5.7 ms; Ip = 300 A, tp = 2 ms; Ip = 350 A, tp = 1.2 ms; and Ip = 350 A, tp = 0.8 ms. Each was analyzed with three different droplet diameters: equal to, larger than, and smaller than the diameter of the wire electrode. For comparison purposes, the same ratio between the mean current and the welding speed was kept, generating a constant (Im/Vs = K) for all parameter sets. Bead-on-plate welding by simple deposition was performed with MIG-P at a constant contact tip-to-workpiece distance (DBCP); subsequently, bead-on-plate welding by simple deposition was performed on a plate inclined by 10 degrees to vary the DBCP, which made it possible to assess how MIG-P behaved in this situation and to evaluate MIG-P with adaptive control, aimed at keeping the arc stability constant. High-speed filming synchronized with current and voltage acquisition (oscillograms) was also carried out for a better interpretation of the transfer mechanism and a better evaluation of process stability. It is concluded that parameter sets 3 and 4 exhibited greater versatility, that droplet diameters equal to or slightly smaller than the wire diameter exhibited better stability due to their higher detachment frequency, and that detachment from the droplet base does not harm the maintenance of the arc height
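For reference, the mean current entering the Im/Vs = K relation above follows from the pulse waveform in the usual way (a standard pulsed-arc expression, not a result of this work):

$$ I_m = \frac{I_p\, t_p + I_b\, t_b}{t_p + t_b} $$

where $I_p$ and $t_p$ are the pulse current and pulse time and $I_b$ and $t_b$ are the base current and base time.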
Abstract:
A DC hollow cathode plasma has been used for film deposition by sputtering, with the release of neutral atoms from the cathode. The Ar-H2 DC hollow cathode plasma currently used in industry has proven effective for surface cleaning and thin-film deposition when compared to pure argon plasma. When the effects of ion bombardment on the substrate are to be avoided, the post-discharge region of the discharge is used. Thin titanium films were deposited on glass substrates using an argon-hydrogen hollow cathode discharge. Optical emission spectroscopy was used for the diagnosis of the post-discharge, and the films formed were analyzed by mechanical profilometry. Excitation lines of argon species were observed in the spectrum. The deposition rate of titanium on the glass substrate varies with process parameters such as deposition time, distance from the discharge and working gases. An increase in the intensity of the argon lines compared with the titanium lines was noted. For deposition with argon and hydrogen, a higher titanium deposition rate was observed on the glass sample the closer it was to the discharge
Abstract:
The continuous development of instruments and equipment used for torque measurement in industry demands increasingly accurate techniques in the use of this kind of instrumentation, including the development of the metrological characteristics of torque measurement, and the same applies to calibration services. There is a diversity of hand torque tools on the market with different measuring ranges, but which do not comply with technical standards in terms of quality and reliability requirements. At present, however, there is no torque measuring standard that fulfils, at low cost, the needs for the calibration of hand torque tools over a large number of ranges. The objective of this thesis is to present the development and evaluation of a torque measuring standard device designed to allow the calibration of hand torque tools at three torque levels with a single instrument, reducing calibration cost and time while offering reliability in the evaluation of torque measuring instruments. To meet the demand for the calibration of hand torque tools, calibration laboratories would otherwise need a large collection of torque measuring standards to fulfil their customers' needs, which is very costly. The development of this type of torque measuring standard proved technically viable and economical, making it possible to calibrate hand torque tools in different nominal ranges with a single measurement system that is versatile, efficient and easy to operate
Abstract:
Flow assurance has become one of the topics of greatest interest in the oil industry, mainly due to the production and transportation of oil in regions with extreme temperature and pressure. In these operations, wax deposition is a common problem in the flow of paraffinic oils, raising process costs through increased pumping energy, decreased production, increased line pressure and the risk of pipeline blockage. In order to describe the behavior of the wax deposition phenomenon in turbulent flow of paraffinic oils under different operating conditions, this work develops a simulator with an easy-to-use interface. The work was divided into four steps: (i) estimation of the physical, thermal, transport and thermodynamic properties of n-alkanes and paraffinic mixtures using correlations; (ii) obtaining the solubility curve and determining the wax appearance temperature by calculating the solid-liquid equilibrium of paraffinic systems; (iii) modeling the wax deposition process, comprising momentum, mass and heat transfer; (iv) developing a graphical interface in the MATLAB® environment to allow the simulation to be followed under different flow conditions and to show the influence of the variables (inlet temperature, external temperature, wax appearance temperature, oil composition and time) on the behavior of the deposition process. The results showed that the developed simulator, called DepoSim, is able to calculate the temperature profile, the deposit thickness and the amount of wax deposited in a simple and fast way, with results that are consistent and applicable to the operation
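As a reference for step (ii), the solid-liquid equilibrium of a paraffin component is commonly written in the ideal form below (a textbook relation assuming a pure solid phase and neglecting heat-capacity terms; the thermodynamic model actually used in DepoSim may differ):

$$ \ln\!\left(x_i^{L}\,\gamma_i^{L}\right) = \frac{\Delta H_{f,i}}{R}\left(\frac{1}{T_{f,i}} - \frac{1}{T}\right) $$

where $x_i^{L}$ and $\gamma_i^{L}$ are the liquid-phase mole fraction and activity coefficient of component $i$, $\Delta H_{f,i}$ its enthalpy of fusion and $T_{f,i}$ its melting temperature. The wax appearance temperature is the highest $T$ at which a solid phase first satisfies this equilibrium for the given oil composition.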