909 results for Statistical Control Process
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
In mechanized sugarcane harvesting, the wear of the base cutting knives is directly correlated with the quality of the cut made by the machines, and quality control tools are important for monitoring this process. This study, carried out in the Ribeirão Preto region, aimed to assess the wear of the base cutting knives and the damage that wear causes to the cane stumps in the mechanized harvesting of raw (green) sugarcane, from the viewpoint of statistical quality control (SQC). Knife wear was quantified by mass loss and by the knives' dimensions, while cutting quality was assessed by cutting height and by damage to the stumps, classified visually according to the level of damage caused. The results showed that knife wear was more pronounced in certain periods of use, but remained within control limits. Cutting height was not affected by knife wear, staying within the limits of desirable quality for the operation. Damage to the stumps varied with the cutting face of the knives evaluated, with certain damage classes predominating on each face, but it always remained in statistical control.
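As a hedged illustration of the SQC monitoring described above, the sketch below computes Shewhart X-bar/R control limits for subgrouped cutting-height measurements and flags subgroups outside the limits; the data values and subgroup size are hypothetical, not taken from the study.

```python
import numpy as np

# Illustrative Shewhart X-bar / R chart for subgroups of n = 5
# cutting-height measurements (values are hypothetical, in mm).
A2, D3, D4 = 0.577, 0.0, 2.114   # standard constants for subgroup size 5

subgroups = np.array([
    [152, 148, 150, 151, 149],
    [153, 150, 147, 152, 150],
    [149, 151, 150, 148, 152],
])

xbar = subgroups.mean(axis=1)                          # subgroup means
rng = subgroups.max(axis=1) - subgroups.min(axis=1)    # subgroup ranges

xbar_bar, r_bar = xbar.mean(), rng.mean()
ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

out_of_control = (xbar > ucl_x) | (xbar < lcl_x)
print(f"X-bar limits: [{lcl_x:.2f}, {ucl_x:.2f}]  R limits: [{lcl_r:.2f}, {ucl_r:.2f}]")
print("Subgroups out of control:", np.flatnonzero(out_of_control))
```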
Abstract:
Graduate Program in Production Engineering - FEB
Abstract:
Graduate Program in Agronomy (Plant Production) - FCAV
Abstract:
Graduate Program in Agronomy (Soil Science) - FCAV
Abstract:
The main objective of ventilation systems in case of fire is to reduce the possible consequences by achieving the best possible conditions for the evacuation of the users and the intervention of the emergency services. In recent years, the required quick response of the ventilation system, from normal to emergency mode, has been improved by the use of automatic and semi-automatic control systems, which reduce response times by supporting the operators' decision making and by using pre-defined strategies. A further step is the use of closed-loop algorithms, which take into account not only the initial conditions but also their evolution (air velocity, traffic situation, etc.), optimizing the quality of the smoke control process.
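As a minimal sketch of the closed-loop idea, the example below uses a PI controller to drive jet-fan duty so that the measured longitudinal air velocity tracks a target value; the gains, the target velocity, and the toy plant response are illustrative assumptions, not part of the cited work.

```python
# Closed-loop (PI) smoke-control sketch: jet fans are commanded so that the
# longitudinal air velocity approaches a target "critical velocity".
def pi_fan_controller(target_velocity, kp=0.8, ki=0.1, dt=1.0):
    integral = 0.0
    def step(measured_velocity):
        nonlocal integral
        error = target_velocity - measured_velocity
        integral += error * dt
        command = kp * error + ki * integral        # fan thrust command
        return max(0.0, min(1.0, command))          # clamp to [0, 1] duty
    return step

controller = pi_fan_controller(target_velocity=3.0)   # m/s, hypothetical
velocity = 0.5                                         # measured air velocity
for _ in range(60):                                    # 60 one-second steps
    duty = controller(velocity)
    velocity += 0.2 * duty - 0.05 * velocity           # toy plant response
print(f"velocity after 60 s: {velocity:.2f} m/s")
```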
Abstract:
Several types of parallelism can be exploited in logic programs while preserving correctness and efficiency, i.e. ensuring that the parallel execution obtains the same results as the sequential one and that the amount of work performed is not greater. However, such results do not take into account a number of overheads which appear in practice, such as process creation and scheduling, which can induce a slow-down or, at least, limit speedup if they are not controlled in some way. This paper describes a methodology whereby the granularity of parallel tasks, i.e. the work available under them, is efficiently estimated and used to limit parallelism so that the effect of such overheads is controlled. The run-time overhead associated with the approach is usually quite small, since as much work as possible is done at compile time. In addition, a number of run-time optimizations are proposed. Moreover, a static analysis of the overhead associated with the granularity control process itself is performed in order to decide whether it is worthwhile. The performance improvements resulting from the incorporation of grain-size control are shown to be quite good, especially for systems with medium to large parallel execution overheads.
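The core idea, spawning a task in parallel only when its estimated granularity exceeds the spawning overhead, can be sketched as follows; the cost estimate, threshold, and thread-pool machinery are illustrative assumptions rather than the paper's actual compile-time analysis.

```python
# Granularity-control sketch: a task is only submitted to the pool when its
# estimated cost exceeds a threshold reflecting process-creation overhead.
from concurrent.futures import ThreadPoolExecutor

SPAWN_THRESHOLD = 50_000   # estimated "work units" below which we stay sequential

def estimated_cost(data):
    return len(data) ** 2            # stand-in granularity estimate

def process(data):
    return sum(x * x for x in data)  # stand-in for the actual task

def maybe_parallel(chunks, pool):
    results = []
    for chunk in chunks:
        if estimated_cost(chunk) >= SPAWN_THRESHOLD:
            results.append(pool.submit(process, chunk))      # parallel task
        else:
            results.append(process(chunk))                   # run inline
    return [r.result() if hasattr(r, "result") else r for r in results]

with ThreadPoolExecutor() as pool:
    chunks = [list(range(n)) for n in (10, 100, 1_000)]
    print(maybe_parallel(chunks, pool))
```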
Abstract:
This paper empirically evaluates container terminal service attributes. The proposed methodology focuses on statistical control. Based on the concept of service segmentation, the authors employed control charts to classify container terminal services. The purpose of control charts is to allow simple detection of events that are indicative of actual process change. This decision can be difficult when the process characteristic varies continuously; the control chart provides statistically objective criteria of change. When a change is detected and considered good, its cause should be identified and possibly become the new way of working; when the change is bad, its cause should be identified and eliminated. Both theoretical and practical implications of the research findings are discussed in this paper.
Abstract:
This paper empirically evaluates container terminal service attributes. The proposed methodology focuses on statistical control. Based on the concept of service segmentation, we employed control charts to classify container terminal services. The purpose of control charts is to allow simple detection of events that are indicative of actual process change. This decision can be difficult when the process characteristic varies continuously; the control chart provides statistically objective criteria of change. When a change is detected and considered good, its cause should be identified and possibly become the new way of working; when the change is bad, its cause should be identified and eliminated. This paper is organized as follows: Section 1 is the introduction, Section 2 provides a brief note on other studies that inspired this research, Section 3 focuses on the methodology used and develops the results obtained, and conclusions are presented in Section 4. Theoretical and practical implications of the research findings are discussed.
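As a hedged sketch of how such charts flag an actual process change, the example below builds an individuals/moving-range (I-MR) chart over a service attribute such as container dwell time and reports the points that fall outside the control limits; the data values are hypothetical.

```python
# Illustrative individuals (I-MR) control chart for a service attribute.
# The 2.66 constant is the standard moving-range factor (3 / d2 for n = 2).
import numpy as np

dwell_hours = np.array([31, 29, 34, 30, 28, 33, 52, 30, 29, 31])  # hypothetical

mr = np.abs(np.diff(dwell_hours))            # moving ranges
center, mr_bar = dwell_hours.mean(), mr.mean()
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

signals = np.flatnonzero((dwell_hours > ucl) | (dwell_hours < lcl))
print(f"center={center:.1f}  limits=[{lcl:.1f}, {ucl:.1f}]  signals at index {signals}")
```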
Abstract:
This thesis focuses on the multi-distributed supervision of three agri-food processes, solar drying, refrigerated transport, and coffee fermentation, using information obtained from different sensor-equipped data acquisition devices, together with the development of time-series analysis methodologies, models, and process control tools to support decision making in these environments. Three device types were used: RFID tags (TemTrip®) with a radiofrequency communication system and a temperature sensor; the i-Button® logger, with integrated temperature and relative humidity sensors; and a third commercial prototype, the Nlaza wireless communication module, which integrates a Sensirion® temperature and relative humidity sensor. These devices were deployed as multi-distributed sensor networks to supervise: (A) fruit and vegetable transports carried out under real commercial conditions, namely two road transports of fresh-cut (IV range) produce from Murcia to Madrid, a multimodal (ship-ship) transport of lemons from Montevideo (Uruguay) to Cartagena (Spain), and a multimodal (ship-truck) transport from Montevideo (Uruguay) to Verona (Italy); and (B) two coffee fermentations carried out at a processing station in Popayán (Colombia). These networks recorded the spatio-temporal dynamics of temperature and relative humidity for the processes under study. In the refrigerated transport and fermentation processes, data visualization and cluster analysis tools identified groups of sensors with analogous time-series patterns, characterizing zones with similar dynamics that differ significantly from the rest and allowing lower-density sensor networks to be defined to cover the identified zones. Complex analysis of the spatio-temporal series (psychrometric models, two-dimensional phase spaces, and spatial interpolation) quantified the variability of the supervised process, both dynamically and spatially, and identified events, providing additional decision-support tools for process control. The two-dimensional phase-space representation of spatio-temporal series of environmental variables is a particularly novel contribution in agri-food applications, an approach not previously reported.
The thesis also demonstrates the potential of a control system based on expert knowledge, namely a fuzzy logic system. Moisture-content estimation models and the semantic rules that drive the control process were first developed; the best model was selected through a drying trial on hydrogel beads used as a food model, and the model was then validated in a trial in which carrot slices were dehydrated. The results suggest that the control system developed can cope with difficulties such as day-night temperature variations, yielding a product with quality characteristics comparable to those obtained without applying any control over the operation while reducing energy consumption by 98% relative to the same uncontrolled process. The instrumentation and data analysis methodologies implemented in this thesis have proved versatile and transferable enough to be applied to other agri-food processes in which temperature and relative humidity are the control criteria, with direct applicability in the industrial sector.
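A minimal sketch of the kind of fuzzy-logic drying controller described above is given below, assuming triangular membership functions over estimated moisture content and air temperature and a four-rule semantic base that sets the drying actuation level; the breakpoints, rules, and output levels are illustrative assumptions, not the thesis' actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_drying_control(moisture, temperature):
    # Fuzzify inputs (moisture in % wet basis, temperature in °C); all
    # breakpoints below are hypothetical.
    moist_low, moist_high = tri(moisture, 0, 10, 40), tri(moisture, 20, 60, 100)
    temp_low, temp_high = tri(temperature, 0, 15, 35), tri(temperature, 25, 45, 70)

    # Rule base: each rule fires with strength min(antecedents) and proposes
    # an actuation level for the drying air (0 = off, 1 = full power).
    rules = [
        (min(moist_high, temp_low), 1.0),   # wet product, cool air -> heat hard
        (min(moist_high, temp_high), 0.6),  # wet product, warm air -> moderate
        (min(moist_low, temp_low), 0.3),    # nearly dry, cool air -> gentle
        (min(moist_low, temp_high), 0.0),   # nearly dry, warm air -> stop
    ]

    # Defuzzify with a weighted average of the proposed actuation levels.
    total = sum(strength for strength, _ in rules)
    return sum(strength * level for strength, level in rules) / total if total else 0.0

print(fuzzy_drying_control(moisture=55.0, temperature=20.0))  # 1.0: wet, cool -> heat hard
```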
Abstract:
The success of model predictive control (MPC) strategies in both industrial and academic settings has been remarkable. However, several questions remain open in the area, especially once the simplifying assumption of a perfect model is abandoned. The explicit consideration of uncertainties has led to important progress in robust control, but that field still presents some problems: high computational demand and excessive conservatism may have hindered the practical application of robust control strategies. The stochastic model predictive control (SMPC) approach seeks to reduce conservatism by incorporating statistical information about the disturbances. Since processes in the chemical industry are always subject to disturbances, whether due to plant-model mismatch or to unmeasured disturbances, this technique emerges as an interesting alternative for the future. The main objective of this thesis is the development of SMPC algorithms that take into account some of the specific characteristics of such processes, which had not been adequately addressed in the literature to date. The most important contribution is the inclusion of integral action in the controller through a velocity-form model description. In addition, hard constraints on the inputs, associated with physical or safety limits, and probabilistic constraints on the states, normally arising from product specifications, are also considered in the formulation. Two approaches were followed in this work: the first is more direct, while the second provides closed-loop stability guarantees at the cost of increased conservatism. Another interesting topic developed in this thesis is zone control of systems subject to disturbances. This form of control is common in industry due to the lack of degrees of freedom, and the proposed approach is the first contribution in the literature to combine zone control and SMPC. Numerous simulations of all the proposed controllers, and comparisons with models from the literature, are presented to demonstrate the application potential of the developed techniques.
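One standard SMPC ingredient mentioned above, turning a probabilistic state constraint into a deterministic tightened bound using the disturbance statistics, can be sketched as follows; the scalar system, noise level, and bounds are illustrative assumptions, not the controllers developed in the thesis.

```python
# Chance-constraint tightening sketch: enforce P(x <= x_max) >= 1 - eps on a
# scalar system x+ = a x + b u + w with Gaussian disturbance w.
from statistics import NormalDist
import numpy as np

a, b = 0.9, 0.5            # hypothetical system parameters
sigma_w = 0.1              # std. dev. of the additive disturbance w
x_max, eps = 2.0, 0.05     # keep x <= x_max with probability >= 95%

# One-step back-off derived from the inverse normal CDF.
z = NormalDist().inv_cdf(1 - eps)
backoff = z * sigma_w
tightened_x_max = x_max - backoff

x = 1.5
# Crude one-step "controller": pick u so the nominal prediction satisfies the
# tightened bound, while also enforcing a hard input limit |u| <= 1.
u = np.clip((tightened_x_max - a * x) / b, -1.0, 1.0)
print(f"back-off = {backoff:.3f}, tightened bound = {tightened_x_max:.3f}, u = {u:.3f}")
```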
Abstract:
This paper studies an overlooked but highly important relationship: the relationship between regulatory agencies (e.g., the EPA, OSHA, and the FDA) and the for-profit businesses they attempt to govern. Drawing on business-to-business control and satisfaction research, a framework is developed to understand how regulatory control influences the satisfaction levels of customer firms. Regulatory control is disaggregated into four distinct facets: the controlling agency, the rules and regulations of control, the processes used by the agency to apply the regulations, and sanctions. Each facet is hypothesized to have an effect on satisfaction. A regulator's administration of state food safety regulations provides the empirical context for testing the hypotheses. Results from a survey of 173 restaurants provide empirical support for the conceptual model. Most importantly, the study finds that the informal control process increases customer satisfaction, while the formal control process decreases customer satisfaction. We discuss how these and other findings may contribute to more effective agency-to-business relationships and to ongoing research.