937 results for Process control -- Statistical methods
Abstract:
In this thesis, the foundational research of Chase and Simon (1973) is questioned, and new results are sought by analyzing the errors of expert and beginner chess players in experiments on the reproduction of chess positions. Chess players with different levels of expertise participated in the study. The results were analyzed by a Brazilian grandmaster, and quantitative analysis was performed with statistical and data mining methods. The results significantly challenge the current theories of expertise, memory and decision making in this area, because the present theory predicts piece-on-square encoding, whereas players can recognize the strategic situation and reproduce it faithfully while committing several errors that the theory cannot explain. The current theory cannot fully explain the encoding players use to register a board. The errors of intermediate players preserved fragments of the strategic situation, even though they committed a series of errors in the reconstruction of the positions. The encoding of chunks therefore includes more information than predicted by current theories. Research on perception, judgment and decision making is currently heavily concentrated on the idea of 'pattern recognition'. Based on the results of this research, we explore a change of perspective. The idea of 'pattern recognition' presupposes that the relevant information processing operates on 'patterns' (or data) that exist independently of any interpretation. We propose instead a view of decision making as the recognition of experience.
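A minimal sketch (assumed board encoding and helper names, not the thesis code) of the kind of quantitative error analysis described above: comparing an original position with a player's reconstruction square by square, as a piece-on-square encoding would.

```python
# Hypothetical illustration of counting piece-on-square errors in a
# position-reconstruction experiment; the encoding below is an assumption.
from typing import Dict, Tuple

Square = Tuple[int, int]          # (file 0-7, rank 0-7)
Position = Dict[Square, str]      # e.g. {(4, 0): "K", (4, 7): "k"}

def piece_on_square_errors(original: Position, reconstructed: Position) -> dict:
    """Compare two positions square by square."""
    squares = set(original) | set(reconstructed)
    omissions = [s for s in squares if s in original and s not in reconstructed]
    additions = [s for s in squares if s not in original and s in reconstructed]
    substitutions = [s for s in squares
                     if s in original and s in reconstructed
                     and original[s] != reconstructed[s]]
    return {"omissions": len(omissions),
            "additions": len(additions),
            "substitutions": len(substitutions)}

# Example: a reconstruction that drops one pawn and misplaces a knight.
original = {(4, 0): "K", (4, 7): "k", (1, 0): "N", (0, 1): "P"}
reconstructed = {(4, 0): "K", (4, 7): "k", (2, 2): "N"}
print(piece_on_square_errors(original, reconstructed))
```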
Abstract:
The objective of this work is to analyze whether the main changes in the way an operational management process was conducted at a public IT company, upon the implementation of a new model of technical support for the users of its products and services, brought about a change in the existing management paradigm. Accordingly, the guideline adopted for this work was to investigate the changes related to the organization and management of activities, rather than those inherent to the technical infrastructure. To this end, the theoretical framework used was the literature on the evolution of production management, notably that developed from the end of the nineteenth century to the present day, such as Taylorism, Fordism and post-Fordism. To structure the data collection, the following dimensions of analysis were defined: Knowledge of the Process, Production Planning, Production Execution, Process Control, Division of Labor, Teamwork and Technical Resources, which are representative of and applicable to any model, allowing comparison between them. Data were collected from the company's existing documentation, database records, direct observation, participant observation and questionnaires answered by the managers responsible for conducting the process. Once this stage was concluded, and using the dimensions of analysis, the main characteristics of the two paradigms adopted, Fordist and post-Fordist, were identified. These dimensions of analysis were also described on the basis of the data collected about the models under study. Finally, the information was compared to identify which paradigm each model most closely resembled and whether the expected change occurred. The results showed that the previous model was closer to the Fordist paradigm, while the predominant characteristics of the new model were identified with the post-Fordist paradigm. It was thus possible to conclude that the research confirmed what was proposed in this work: that the adopted model promotes a change in the management paradigm that existed under the previous model.
Abstract:
This study addresses the difficulties that mathematics teachers face in their daily classroom practice, difficulties that have always existed and seem to persist despite attempts to solve them. The work was carried out through interviews, classroom observations and subject-area meetings. The main problems identified were: teacher education, syllabus content, learning, assessment and students' difficulties. Each of these items was examined in depth whenever necessary, in an attempt to clarify them all and thereby find ways forward. After characterizing them, the students' difficulties were addressed; only the most significant ones, according to the teachers, are discussed. A test was applied in which many of these difficulties were confirmed. The conclusion is that the practice of mathematics in our schools remains inefficient. Its teaching does not keep up with the needs of society, teachers tend to abandon the profession because of low salaries, students fail in large numbers and drop out of their studies, textbooks merely add or remove content, and schools continue to produce students who are passive and uncritical with respect to mathematics. Many attempts will still be made, but a return cannot be counted on, because teachers are neither valued nor heard when it comes to presenting proposals. The few positive results observed came from the teachers themselves, who always look for practical, inexpensive solutions to their problems. The work concludes with suggestions from the teachers, many of which are believed to produce positive results in a short time.
Abstract:
The purpose of this study is to understand how different members manage risk in a global supply chain. Through a multi-case study of the Brazilian mango export chain to the United States, the supplier, exporter, logistics operator and importer links were investigated. A research protocol was developed following the suggestion of Yin (2010), with an adaptation of the semi-structured data collection instrument of Christopher et al. (2011). Based on interviews with organizations that support the object of the research, a sample of companies was selected in which the risk management phenomenon could best be observed. Building on the risk-type classifications elaborated by Christopher et al. (2011), a new organization of risk categories is suggested (demand, supply, process and control, environmental, and sustainability). This work presents a description of the mango export chain, as well as the main risks and the risk-mitigating strategies employed by the four chain members studied. Following the rule proposed by Sousa (2001), comparisons were made between the types of risk and the mitigating strategies observed in the supply chain. Finally, the study showed that risk management is heterogeneous among the chain members, that the exporter bears the greatest burden of the consequences of risk to the chain, and that collaboration is the main form of mitigation observed.
Abstract:
PEREIRA, J. P.; CASTRO, B. P. S.; VALENTIM, R. A. M. Kit Educacional para Controle e Supervisão Aplicado a Nível. Holos, Natal, v. 2, p. 68-72, 2009.
Abstract:
The financial crisis that occurred between 2007 and 2008, known as the subprime crisis, put the spotlight on corporate governance in Brazil and worldwide. To monitor financial risk, quantitative risk management tools were created in the 1990s, after several financial disasters. Market turmoil has also led companies to invest in the development and use of information, applied as tools to support process control and decision making. Numerous empirical studies of the informational efficiency of the market have been conducted inside and outside Brazil, revealing whether prices instantly reflect the available information. The creation of differentiated corporate governance levels on BOVESPA in 2000 led firms to a greater commitment to their shareholders, with greater transparency in their information. The purpose of this study is to analyze how the subprime financial crisis affected, between January 2007 and December 2009, the volatility of the stock returns of the most liquid companies listed at the different corporate governance levels of the BM&FBOVESPA. Based on time series analysis and event studies, econometric tests were performed in EViews, and the results made it evident that the adoption of good corporate governance practices affects the volatility of these companies' returns.
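A minimal sketch of the event-study idea described above, comparing return volatility before and during the crisis window; the firm labels, event date and synthetic prices are illustrative assumptions, not results from the study.

```python
# Hypothetical illustration: annualized volatility of daily log returns
# before and after an assumed crisis event date, for two synthetic firms.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
dates = pd.bdate_range("2007-01-02", "2009-12-30")
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0, 0.02, size=(len(dates), 2)), axis=0)),
    index=dates, columns=["novo_mercado_firm", "traditional_firm"])

returns = np.log(prices / prices.shift(1)).dropna()      # daily log returns

event_date = pd.Timestamp("2008-09-15")                   # assumed event date
pre = returns.loc[:event_date - pd.Timedelta(days=1)]
post = returns.loc[event_date:]

# Annualized volatility (std of daily returns scaled by sqrt of trading days).
vol = pd.DataFrame({"pre_crisis": pre.std() * np.sqrt(252),
                    "crisis": post.std() * np.sqrt(252)})
vol["ratio"] = vol["crisis"] / vol["pre_crisis"]
print(vol)
```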
Abstract:
Titanium nitride films were grown on glass using the Cathodic Cage Plasma Deposition technique in order to verify the influence of process parameters on the optical and structural properties of the films. The plasma atmosphere was a mixture of Ar, N2 and H2, with the Ar and N2 gas flows set at 4 and 3 sccm, respectively, and the H2 gas flow varied among 0, 1 and 2 sccm. The deposition process was monitored by Optical Emission Spectroscopy (OES) to investigate the influence of the active species in the plasma. Increasing the H2 gas flow into the plasma changed the luminescent intensities associated with the species; in particular, the luminescence of the N2 (391.4 nm) species was not proportional to the increase of H2 gas in the reactor. Other parameters investigated were the diameter and number of holes in the cage. Analysis by Grazing Incidence X-Ray Diffraction (GIXRD) confirmed that the obtained films are composed of TiN and may vary in the amount of nitrogen in the crystal and in the crystallite size. Optical microscopy images provided information about the homogeneity of the films, and atomic force microscopy (AFM) results revealed microstructural characteristics and surface roughness. The thickness was measured by ellipsometry. Optical properties such as transmittance and reflectance (measured by spectrophotometry) are very sensitive to changes in the crystal lattice of the material, the chemical composition and the film thickness; such properties are therefore appropriate tools for controlling this process. In general, films obtained at 0 sccm of H2 gas flow present a higher transmittance, which can be attributed to a smaller crystallite size due to a higher amount of nitrogen in the TiN lattice. The films obtained at 1 and 2 sccm of H2 gas flow have a golden appearance, and the XRD patterns showed TiN characteristic peaks with higher intensity and smaller FWHM (Full Width at Half Maximum), suggesting that the presence of hydrogen in the plasma makes the films more stoichiometric and more crystalline. It was also observed that with a larger number of holes in the lid of the cage, close to the region between the lid and the sample, and a smaller hole diameter, the deposited film is thicker, which is explained by the higher probability of plasma species effectively reaching the sample and promoting film growth.
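The link drawn above between a smaller FWHM and larger, more crystalline TiN grains rests on the standard Scherrer relation; the sketch below illustrates it with assumed values, not measurements from this work.

```python
# Standard Scherrer relation D = K * lambda / (beta * cos(theta));
# the peak position and width below are illustrative assumptions.
import math

def scherrer_size(fwhm_deg: float, two_theta_deg: float,
                  wavelength_nm: float = 0.15406, k: float = 0.9) -> float:
    """Crystallite size in nm from the FWHM (in degrees) of an XRD peak."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return k * wavelength_nm / (beta * math.cos(theta))

# Illustrative TiN (200) reflection with Cu K-alpha radiation.
print(f"D ~ {scherrer_size(fwhm_deg=0.4, two_theta_deg=42.6):.1f} nm")
```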
Abstract:
Produced water has become one of the most important waste streams in the petroleum industry, specifically in the upstream segment. The treatment of this kind of effluent is complex and normally requires high costs. In this context, electrochemical treatment emerges as an alternative methodology for treating these wastewaters: it employs electrochemical reactions to increase the capability and efficiency of the traditional chemical treatments for produced water. The use of electrochemical reactors can be effective with small changes to traditional treatments, generally not requiring significant additional surface area for new equipment (important given the high cost per square meter on offshore platforms), and it can use almost the same equipment, in continuous or batch flow, without other large investments. Electrochemical treatment causes low environmental impact, because the process uses electrons as the reagent and generates a small amount of waste. In this work, two types of electrochemical reactors were studied, electroflocculation and electroflotation, with the aim of removing Cu2+, Zn2+, phenol and a BTEX mixture from produced water. In electroflocculation, an electrical potential was applied to an aqueous solution containing NaCl, using iron electrodes, which promote the dissolution of metal ions, generating Fe2+ and gases that, at the appropriate pH, also promote coagulation-flocculation reactions, removing Cu2+ and Zn2+. In electroflotation, a carbon steel cathode and a DSA-type anode (Ti/TiO2-RuO2-SnO2) were used in a NaCl solution. An electrical current was applied, producing strong oxidizing agents such as Cl2 and HOCl and increasing the degradation rate of BTEX and phenol. Under different flow rates, Zn2+ was removed by electrodeposition or by hydroxide formation, owing to the increase in pH during the reaction. To better understand the electrochemical process, a 2² factorial design with a central point was carried out to analyze the sensitivity of the operating parameters for Zn2+ removal by electroflotation, confirming that the current density affected the process negatively and the flow rate positively. To assess the economic viability of the two electrochemical treatments, the energy consumption was calculated, taking into account the kWh price set by ANEEL. The treatment costs obtained were quite attractive in comparison with the treatments currently used in the state of Rio Grande do Norte. In addition, the cost could still be reduced by using other alternative energy sources such as solar, wind or gas generated directly at the petrochemical plant or on offshore platforms.
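A minimal sketch of the 2² factorial design with a central point mentioned above, fitting a first-order model with interaction to estimate how current density and flow rate affect Zn2+ removal; the run values are illustrative assumptions, not the thesis data.

```python
# Hypothetical 2^2 factorial design with a central point, analysed by OLS.
import pandas as pd
import statsmodels.formula.api as smf

runs = pd.DataFrame({
    "current_density": [-1, 1, -1, 1, 0, 0, 0],   # coded factor levels
    "flow_rate":       [-1, -1, 1, 1, 0, 0, 0],
    "zn_removal":      [82.0, 71.0, 88.0, 79.0, 84.5, 83.8, 84.1],  # % removed
})

# First-order model with interaction; the signs of the coefficients indicate
# whether each factor helps or hurts removal (here current density negative,
# flow rate positive, matching the qualitative conclusion in the abstract).
model = smf.ols("zn_removal ~ current_density * flow_rate", data=runs).fit()
print(model.params)
print(model.pvalues)
```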
Abstract:
The petrochemical industry aims to obtain, from crude oil, products with higher commercial value and greater industrial utility for energy purposes. These industrial processes are complex, commonly operating with large production volumes under restricted operating conditions. Controlling the operation under optimized and stable conditions is important to maintain the quality of the products obtained and the safety of the industrial plant. Currently, industrial networks have gained prominence as process control increasingly needs to be carried out in a distributed way. The Foundation Fieldbus industrial network protocol, owing to its interoperability and its user interface organized in simple configuration blocks, is well known in the industrial automation community. The present work brings the benefits of industrial network technology to bear on the inherent complexity of petrochemical industrial processes. To this end, a dynamic reconfiguration system for intelligent strategies (artificial neural networks, for example), based on the protocol's user application layer, is proposed, allowing different applications to be used in a given process without operator intervention and with the guarantees necessary for proper plant operation.
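A minimal sketch of the dynamic reconfiguration idea described above, using hypothetical Python classes rather than the Foundation Fieldbus API: a user-layer strategy block whose internal logic can be swapped at runtime without touching the block wiring.

```python
# Hypothetical illustration only; class and method names are assumptions.
from typing import Callable, List

class StrategyBlock:
    """Function-block-like wrapper: inputs in, one output out, logic replaceable."""
    def __init__(self, logic: Callable[[List[float]], float]):
        self._logic = logic

    def reconfigure(self, new_logic: Callable[[List[float]], float]) -> None:
        # Dynamic reconfiguration: only the strategy changes, the wiring stays.
        self._logic = new_logic

    def execute(self, inputs: List[float]) -> float:
        return self._logic(inputs)

# Initial strategy: plain proportional action on the first input.
block = StrategyBlock(lambda x: 0.8 * x[0])
print(block.execute([1.2, 0.4]))

# Later, download a trained strategy (here a toy weighted sum) to the block.
weights = [0.6, -0.3]
block.reconfigure(lambda x: sum(w * xi for w, xi in zip(weights, x)))
print(block.execute([1.2, 0.4]))
```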
Abstract:
This paper characterizes humic substances (HS) extracted from soil samples collected in the Rio Negro basin in the state of Amazonas, Brazil, particularly investigating their reduction capabilities towards Hg(II) in order to elucidate potential mercury cycling/volatilization in this environment. For this purpose, a multimethod approach was used, consisting of both instrumental methods (elemental analysis, EPR, solid-state NMR, FIA combined with cold-vapor AAS of Hg(0)) and statistical methods such as principal component analysis (PCA) and central composite factorial planning. The HS under study were divided into two groups, complexing and reducing, owing to the different distributions of their functional groups. The main functionalities correlated with the reduction of Hg(II) were phenolic, carboxylic and amide groups, while the groups related to the complexation of Hg(II) were ethers, hydroxyls, aldehydes and ketones. The HS extracted from floodable regions of the Rio Negro basin presented a greater capacity to retain (complex, physically and/or chemically adsorb) Hg(II), while those from non-floodable regions showed a greater capacity to reduce Hg(II), indicating that HS extracted from different types of regions contribute in different ways to the biogeochemical mercury cycle in the mid-Rio Negro basin, AM, Brazil.
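A minimal sketch of the PCA step mentioned above, applied to an illustrative table of functional-group descriptors; the feature names and values are assumptions, not the measured data.

```python
# Hypothetical illustration of separating HS samples with PCA.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Columns: phenolic, carboxylic, amide, ether, hydroxyl, carbonyl (arbitrary units).
X = np.array([
    [8.1, 6.5, 2.2, 1.0, 3.1, 0.9],   # assumed non-floodable-site samples
    [7.8, 6.9, 2.5, 1.2, 3.0, 1.1],
    [2.3, 3.1, 0.8, 4.8, 6.2, 3.9],   # assumed floodable-site samples
    [2.6, 2.8, 0.9, 5.1, 6.5, 4.2],
])

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("explained variance ratio:", pca.explained_variance_ratio_)
print("scores:\n", scores)
```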
Abstract:
The influence of 2 different levels of the inspired oxygen fraction (FiO2) on blood gas variables was evaluated in dogs with high intracranial pressure (ICP) during propofol anesthesia (induction followed by a continuous rate infusion [CRI] of 0.6 mg/kg/min) and intermittent positive pressure ventilation (IPPV). Eight adult mongrel dogs were anesthetized on 2 occasions, 21 d apart, and received oxygen at an FiO2 of 1.0 (G100) or 0.6 (G60) in a randomized crossover fashion. A fiberoptic catheter was implanted on the surface of the right cerebral cortex for assessment of the ICP. An increase in the ICP was induced by temporary ligation of the jugular vein 50 min after induction of anesthesia and immediately after baseline measurement of the ICP. Blood gas measurements were taken 20 min later and then at 15-min intervals for 1 h. Numerical data were submitted to Morrison's multivariate statistical methods. The ICP, the cerebral perfusion pressure and the mean arterial pressure did not differ significantly between FiO2 levels or measurement times after jugular ligation. The only blood gas values that differed significantly (P < 0.05) were the arterial oxygen partial pressure, which was greater with G100 than with G60 throughout the procedure, and the venous hemoglobin saturation, which was greater with G100 than with G60 at M0. There were no significant differences between FiO2 levels or measurement times in the following blood gas variables: arterial carbon dioxide partial pressure, arterial hemoglobin saturation, base deficit, bicarbonate concentration, pH, venous oxygen partial pressure, venous carbon dioxide partial pressure and the arterial-to-end-tidal carbon dioxide difference.
Abstract:
Nowadays, telecommunications is one of the most dynamic and strategic areas in the world. Organizations are constantly seeking new management practices in an increasingly competitive environment where resources are scarce. In this scenario, data obtained from business and corporate processes take on even greater importance, although these data are not yet adequately explored. Knowledge Discovery in Databases (KDD) thus appears as an option for studying complex problems in different areas of management. This work proposes both a systematization of KDD activities, using concepts from different methodologies such as CRISP-DM, SEMMA and Fayyad's approach, and a study of the viability of multivariate regression analysis models to explain corporate telecommunications sales using performance indicators. Statistical methods were outlined to analyze the effects of such indicators on the behavior of business productivity. Based on business knowledge and standard statistical analysis, equations were defined and fitted, with their respective coefficients of determination. Hypothesis tests were also conducted on the parameters in order to validate the regression models. The results show that there is a relationship between these indicators and the amount of sales.
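A minimal sketch of the multivariate regression and hypothesis-testing workflow described above; the indicator names and data are synthetic assumptions, not the study's indicators.

```python
# Hypothetical illustration: OLS regression of sales on performance indicators,
# with t-tests on the parameters and the coefficient of determination.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60
data = pd.DataFrame({
    "visits":        rng.normal(100, 15, n),   # assumed: commercial visits/month
    "response_time": rng.normal(48, 10, n),    # assumed: hours to answer a proposal
    "coverage":      rng.normal(0.7, 0.1, n),  # assumed: network coverage index
})
data["sales"] = (2.0 * data["visits"] - 1.5 * data["response_time"]
                 + 80 * data["coverage"] + rng.normal(0, 20, n))

X = sm.add_constant(data[["visits", "response_time", "coverage"]])
model = sm.OLS(data["sales"], X).fit()
print(model.summary())   # reports R^2 and t-tests on each parameter
```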
Abstract:
This study investigates the influence of Corporate Social Responsibility (CSR) on customer satisfaction and loyalty through a study with car buyers. In addition, it aims to contribute to conceptual models of satisfaction and loyalty analysis by applying the model of Johnson et al. (2001), adapted to introduce CSR and conscious consumption variables, at a car dealership in Natal/RN. The methodology follows a descriptive quantitative approach, and the results were analyzed with statistical methods of simple and multiple linear regression, descriptive analysis and exploratory analysis. The field research yielded 90 valid forms. The results show that CSR affects the image of the company studied and is also one of the elements of the satisfaction and loyalty compound. The study concludes that CSR should be considered in firms' strategic and marketing actions.
Abstract:
Industrial automation is directly linked to the development of information technology. Better hardware solutions, as well as improvements in software development methodologies, have made possible the rapid growth of production process control. In this thesis, we propose an architecture that joins two technologies, one from the hardware field (industrial networks) and one from the software field (multi-agent systems). The objective of this proposal is to combine these technologies in a multi-agent architecture that allows control strategies to be implemented in field devices. With this, we intend to develop an agent architecture to detect and solve problems that may occur in the industrial network environment. Our work allies machine learning with the industrial context, making the proposed multi-agent architecture adaptable to unfamiliar or unexpected production environments. We used neural networks and presented strategies for allocating these networks to industrial network field devices. With this, we intend to improve decision support at the plant level and to allow operation independent of human intervention.
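A minimal sketch (hypothetical sizes and weights, not the thesis implementation) of a neural network small enough to be allocated to a field-device agent, mapping process measurements to a control suggestion.

```python
# Hypothetical illustration: a tiny feed-forward network for a field-device agent.
import numpy as np

class TinyMLP:
    """One hidden layer; weights would normally come from offline training."""
    def __init__(self, w1: np.ndarray, b1: np.ndarray, w2: np.ndarray, b2: float):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def predict(self, x: np.ndarray) -> float:
        hidden = np.tanh(self.w1 @ x + self.b1)
        return float(self.w2 @ hidden + self.b2)

# Two inputs (e.g. level and flow), three hidden units, one output.
rng = np.random.default_rng(42)
net = TinyMLP(rng.normal(size=(3, 2)), rng.normal(size=3), rng.normal(size=3), 0.0)
print(net.predict(np.array([0.8, 0.2])))   # control suggestion for one sample
```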