908 results for process control
Abstract:
Designed for undergraduate and postgraduate students, academic researchers and industrial practitioners, this book provides comprehensive case studies on numerical computing of industrial processes and step-by-step procedures for conducting industrial computing. It assumes minimal knowledge of numerical computing and computer programming, making it easy to read, understand and follow. Topics discussed include fundamentals of industrial computing, finite difference methods, the Wavelet-Collocation Method, the Wavelet-Galerkin Method, High Resolution Methods, and comparative studies of various methods. These are discussed using examples of carefully selected models from real processes of industrial significance. The step-by-step procedures in all these case studies can be applied to other industrial processes without major changes and thus provide readers with useful frameworks for applying engineering computing to fundamental research problems and practical development scenarios.
Abstract:
This paper describes recent updates to a milling train extraction model used to assess and predict the performance of a milling train. The milling unit model for the bagasse mills was extended to replace the imbibition coefficient with a crushing factor and mixing efficiency. New empirical relationships for reabsorption factor, imbibition coefficient, crushing factor, mixing efficiency, and purity ratio were developed and tested against factory measurements and previous model predictions. The updated model has been implemented in the SysCAD process modelling software. New additions to the model implementation include a shredder model to assess or predict cane preparation, mill and shredder drives for power consumption, and an updated imbibition control system to allow water to be added to intermediate mills.
Abstract:
Biomethanation of herbaceous biomass feedstock has the potential to provide a clean energy source for cooking and other activities in areas where such biomass predominates. A biomethanation concept is described in which biomass residues are fermented in three steps, occurring in three zones of the fermentor. While attempting to take advantage of multistage reactors, this approach simplifies reactor operation and obviates the need for a high degree of process control or complex reactor design. Typical herbaceous biomass decomposes with a rapid VFA flux initially (with a tendency to float), followed by a slower decomposition showing a balanced process of VFA generation and its utilization by methanogens that colonize the biomass slowly. The tendency to float in the initial stages is suppressed by allowing the previous day's feed to hold the fresh feed below the digester liquid, which permits VFA to disperse into the digester liquid without causing process inhibition. This approach has been used to build and operate simple biomass digesters that provide cooking gas in rural areas from weeds and agro-residues. With appropriate modifications, the same concept has been used for digesting municipal solid waste in small towns where large fermentors are not viable. With further modifications, the concept has been used for solid-liquid feed fermentors. Methanogen-colonized leaf biomass has been used as biofilm support to treat coffee-processing wastewater as well as crop litter, alternating over the year. During summer the fermentor functions as a biomass-based biogas plant operating in the three-zone mode, while in winter biomass feeding is suspended and high-strength coffee-processing wastewater is let into the fermentor, achieving over 90% BOD reduction. Early field experience with these fermentors is presented.
Abstract:
Colour graphics subsystems can be used in a variety of applications, such as high-end business graphics, low-end scientific computation, and real-time display of process control diagrams. The design of such a subsystem is presented. The subsystem can be added to any Multibus-compatible microcomputer system. The use of an NEC 7220 graphics display controller chip has considerably simplified the design. CGRAM (CORE graphics on Multibus), a comprehensive subset of the CORE graphics standard package, is supported on the subsystem.
Abstract:
An onboard spacecraft computing system is a case of a functionally distributed system that requires continuous interaction among the nodes to control the operations at different nodes. A simple and reliable protocol is desired for such an application. This paper discusses a formal approach to specifying the computing system with respect to some important issues encountered in the design and development of a protocol for the onboard distributed system. The issues considered in this paper are concurrency, exclusiveness, and sequencing relationships among the various processes at different nodes. A 6-tuple model is developed for the precise specification of the system. The model also enables us to check the consistency of the specification and to detect deadlocks caused by improper specification. An example is given to illustrate the use of the proposed methodology for a typical spacecraft configuration. Although the theory is motivated by a specific application, it may be applied to other distributed computing systems such as those encountered in the process control industry, power plant control, and similar environments.
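The consistency check mentioned above can be loosely illustrated without the paper's 6-tuple formalism (process names and the graph encoding below are assumptions for illustration only): if each sequencing constraint "A must complete before B" is stored as a directed edge, an inconsistent specification shows up as a cycle in the precedence graph, meaning no schedule can satisfy it.

```python
# Loose illustration (not the paper's 6-tuple model): sequencing
# constraints "A before B" form a directed graph; a cycle means the
# specification is inconsistent (a circular wait, i.e. a deadlock).
def find_cycle(edges):
    """Return True if the precedence graph contains a cycle (DFS colouring)."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, []).append(b)
        graph.setdefault(b, [])
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {v: WHITE for v in graph}

    def dfs(v):
        colour[v] = GREY                      # on the current DFS path
        for w in graph[v]:
            if colour[w] == GREY or (colour[w] == WHITE and dfs(w)):
                return True                   # back edge => cycle
        colour[v] = BLACK                     # fully explored
        return False

    return any(colour[v] == WHITE and dfs(v) for v in graph)

# Consistent ordering: telemetry -> attitude -> thruster (hypothetical names)
print(find_cycle([("telemetry", "attitude"), ("attitude", "thruster")]))  # False
# Inconsistent: circular wait between two nodes
print(find_cycle([("nav", "comms"), ("comms", "nav")]))                   # True
```

A real specification checker would also cover the concurrency and exclusiveness relations; this sketch covers only sequencing.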
Abstract:
The safety of food has become an increasingly interesting issue to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements within the concept of food quality. Safety, in particular, is the attribute that consumers find very difficult to assess. The literature in this study consists of three main themes: traceability; consumer behaviour related to both quality and safety issues and perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying the willingness to pay of consumers makes it possible to determine whether the consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results showed that certain risk factors impact consumer willingness to pay. If the respondents considered genetic modification of food or foodborne zoonotic diseases as harmful or extremely harmful risk factors in food, they were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information. 
The results also showed that safety-related quality cues are significant to the consumers. In the first place, the consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Similarly, other process-control related information ranked high among the top responses. Information on any potential genetic modification was also considered important, even though genetic modification was not regarded as a high risk factor.
Abstract:
A central tenet in the theory of reliability modelling is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, in many organisations, different aspects of these data are recorded in different databases (e.g. work order notifications, event logs, condition monitoring data, and process control data). These recorded data cannot be interpreted individually, since they typically do not contain all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for the extraction of failure and preventive maintenance times from commonly available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of the maintenance event. Using these keywords, a Naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive. The accuracy of the algorithm is assessed and the classified failure time data are presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
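The classification step described above can be sketched roughly as follows; the keywords, labels, and training records are invented for illustration (the paper mines them from real work-order text), and this is a minimal multinomial Naïve Bayes, not the authors' implementation.

```python
import math
from collections import Counter, defaultdict

# Hypothetical training data: keywords mined from work-order text,
# labelled "failure" or "preventive". All examples are illustrative.
TRAIN = [
    (["trip", "fault", "alarm"], "failure"),
    (["bearing", "seized", "fault"], "failure"),
    (["scheduled", "inspection"], "preventive"),
    (["routine", "lubrication", "scheduled"], "preventive"),
]

def train_nb(examples):
    """Count class priors and per-class keyword frequencies."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for words, label in examples:
        class_counts[label] += 1
        for w in words:
            word_counts[label][w] += 1
            vocab.add(w)
    return class_counts, word_counts, vocab

def classify(words, class_counts, word_counts, vocab):
    """Pick the class with the highest log-posterior (Laplace smoothing)."""
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for c, n in class_counts.items():
        lp = math.log(n / total)
        denom = sum(word_counts[c].values()) + len(vocab)
        for w in words:
            lp += math.log((word_counts[c][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

model = train_nb(TRAIN)
print(classify(["bearing", "fault"], *model))         # failure
print(classify(["scheduled", "inspection"], *model))  # preventive
```

Each stoppage record is thus mapped to a failure or preventive-maintenance event, and the resulting event times feed the reliability model.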
Abstract:
Cord blood is a well-established alternative to bone marrow and peripheral blood stem cell transplantation. To this day, over 400 000 unrelated donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have been focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified. In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of cord blood unit hematopoietic progenitors (CD34+ cells and colony-forming units); this is a novel finding. Mean platelet volume can be thought to represent general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system. Thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in the selection of cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as a part of the evaluation of cord blood collections for banking. The quality of cord blood units has been the focus of several recent studies. 
However, hemostasis activation during cord blood collection is scarcely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), indicating platelet activation. Altogether three sample series were collected during the set-up of the cord blood bank as well as after changes in personnel and collection equipment. The activation decreased from the first to the subsequent series, which were collected with the bank fully in operation and following international standards, and was at a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell contents, it should be minimized. The assessment of hemostasis activation could be implemented as a part of process control in cord blood banks. Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and the scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored and thawed, unseparated cord blood units without exogenous erythropoietin. This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.
Abstract:
Among all methods of metal alloy slurry preparation, the cooling slope method is the simplest in terms of design and process control. The method involves pouring the melt from the top down an oblique, channel-shaped plate cooled from the bottom by counter-flowing water. The melt, while flowing down, partially solidifies and forms columnar dendrites on the plate wall. These dendrites are broken into equiaxed grains and washed away with the melt. The melt, together with the equiaxed grains, forms a semisolid slurry that is collected at the slope exit and cast into billets having a non-dendritic microstructure. The final microstructure depends on several process parameters, such as slope angle, slope length, pouring superheat, and cooling rate. The present work involves a scaling analysis of the conservation equations of momentum, energy, and species for the melt flow down a cooling slope. The main purpose of the scaling analysis is to obtain physical insight into the role and relative importance of each parameter in influencing the final microstructure. To assess the scaling analysis, the trends predicted by scaling are compared against corresponding numerical results using an enthalpy-based solidification model incorporating solid-phase movement.
Abstract:
A phase field modelling approach is implemented in the present study to simulate microstructure evolution during the cooling slope semisolid slurry generation process for A380 aluminium alloy. First, experiments are performed to evaluate the number of seeds required within the simulation domain to reproduce the near-spherical microstructure formation that occurs during cooling slope processing of the melt. Subsequently, microstructure evolution is studied employing a phase field method. Simulations are performed to understand the effect of cooling rate on the slurry microstructure. Encouraging results are obtained from the simulation studies and are validated by experimental observations. The results obtained from the mesoscopic phase field simulations are the grain size, grain density, degree of sphericity of the evolving primary Al phase, and the amount of solid fraction present within the slurry at different time frames. The effect of grain refinement has also been studied with the aim of further improving the slurry microstructure. The numerical findings provide insight into the process and are useful for process control.
Abstract:
Earth-abundant alternative chalcopyrite Cu2CoSnS4 (CCTS) thin films were deposited by a facile sol-gel process onto large substrates. The temperature dependence of the deposition process control and of the desired phase formation was studied in detail. The films showed complete transformation from amorphous to polycrystalline, with textured structures of the stannite phase, as reflected in X-ray diffraction, and nearly stoichiometric compositions of Cu:Co:Sn:S = 2.0:1.0:1.0:4.0 from EDAX analysis. Morphological investigations revealed that CCTS films with larger grains, on the order of the film thickness, were synthesized at a higher temperature of 500 °C. The band gap, optimal for application in photovoltaics, was estimated to be 1.4 eV. Devices with SLG/CCTS/Al geometry were fabricated for real-time demonstration of photoconductivity under AM 1.5 G solar and 1064 nm infrared laser illumination. A photodetector showed one order of current amplification, from ~1.9 × 10⁻⁶ A in the dark to 2.2 × 10⁻⁵ A and 9.8 × 10⁻⁶ A under AM 1.5 G illumination and a 50 mW cm⁻² IR laser, respectively. Detector sensitivity, responsivity, external quantum efficiency, and gain were estimated as 4.2, 0.12 A/W, 14.74%, and 14.77%, respectively, at 50 mW cm⁻² laser illumination. An ON/OFF ratio of 2.5 showed that CCTS can be considered a potential absorber in low-cost photovoltaic applications.
Abstract:
The general objective of this thesis is to investigate the association between stress and workplace accidents among permanent technical-administrative staff of a public university in Rio de Janeiro using multilevel models. To this end, the thesis is organized into two articles. The first article investigates the association between stress and workplace accidents taking into account the hierarchical components of the data structure through multilevel models, with employees at the first level grouped into work sectors at the second level. The second article investigates the behaviour of the fixed and random coefficients of cross-classified multilevel models (work sectors crossed with occupational groups) relative to multilevel models that consider only the hierarchical components of the work sectors, ignoring adjustment for occupational groups. Psychosocial stress at work was approached through the relationship between high psychological demand and low control over the work process. These dimensions were captured using the short version of the Karasek scale, which also contains information on social support at work. Isolated dimensions of work stress (demand and control), the ratio between psychological demand and job control (D/C ratio), and social support at work were measured at the individual level and at the work-sector level. Overall, the results highlight psychological demand measured at the individual level as an important factor associated with the occurrence of workplace accidents. Social support at work, measured at the individual and work-sector levels, was inversely associated with the prevalence of workplace accidents; at the sector level, this association was stronger among women.
The results also show that the fixed parameters of the models with and without cross-classification were similar and that, in general, the standard errors (SE) were somewhat larger in the cross-classified models, although this behaviour of the SE was not observed for the fixed coefficients of the variables aggregated at the work-sector level. The greatest distinction between the two approaches was observed in the random coefficients related to the work sectors, which changed substantially after adjusting for the effect of occupation through the cross-classified models. This study reinforces the importance of psychosocial characteristics in the occurrence of workplace accidents and contributes to the understanding of these relationships through analytical approaches that refine the capture of the dependence structure among individuals in their work environment. Further studies with similar methodology are suggested to deepen knowledge about stress and workplace accidents.
Abstract:
Production processes need to be evaluated continuously so that they operate as effectively and efficiently as possible. The set of tools used for this purpose is called statistical process control (SPC). Through SPC tools, monitoring can be carried out periodically. The most important SPC tool is the control chart. This thesis focuses on monitoring a response variable through the parameters, or coefficients, of a simple linear regression model. Adaptive χ2 control charts are proposed for monitoring the coefficients of the simple linear regression model. More specifically, seven adaptive χ2 control charts for monitoring linear profiles are developed: charts with variable sample size; variable sampling interval; variable control and warning limits; variable sample size and sampling interval; variable sample size and limits; variable sampling interval and limits; and, finally, with all design parameters variable. Performance measures of the proposed charts were obtained through Markov chain properties, for both the zero-state and steady-state cases, showing a reduction in the average time to signal for small to moderate shifts in the coefficients of the regression model of the production process. The proposed charts were applied to an example from a semiconductor manufacturing process. In addition, a sensitivity analysis was performed as a function of shifts of different magnitudes in the process parameters, namely the intercept and the slope, comparing the performance of the developed charts with one another and with the fixed-parameter χ2 chart. The charts proposed in this thesis are suitable for many types of applications.
This work also considered quality characteristics represented by a non-linear regression model. For the non-linear regression model considered, the proposal is to use a method that divides the non-linear profile into linear parts; more specifically, an algorithm proposed in the literature for this purpose was used. In this way, it was possible to validate the proposed technique, showing that it is robust in the sense that it accommodates different types of non-linear profiles. A non-linear profile is thus approximated by piecewise linear profiles, which allows each linear profile to be monitored by control charts such as those developed in this thesis. Furthermore, the methodology for decomposing a non-linear profile into linear parts is presented in detail, opening the way for broad use.
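As a minimal sketch of the fixed-parameter χ2 chart that the adaptive charts above extend (the in-control coefficients, design points, noise level, and shift below are assumed for illustration, not taken from the thesis): for each sample, the least-squares estimates (b0, b1) of the profile y = A0 + A1·x + e are folded into a single χ2 statistic with 2 degrees of freedom, which signals when it exceeds the upper control limit.

```python
# Hedged sketch of the fixed-parameter chi-square chart for a linear
# profile y = A0 + A1*x + e, e ~ N(0, SIGMA^2). Values are illustrative.
A0, A1, SIGMA = 3.0, 2.0, 1.0
XS = [2, 4, 6, 8]                      # fixed design points, one profile sample

def lsq(ys):
    """Least-squares intercept/slope, plus quantities reused by the statistic."""
    n = len(XS)
    xbar = sum(XS) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in XS)
    b1 = sum((x - xbar) * (y - ybar) for x, y in zip(XS, ys)) / sxx
    b0 = ybar - b1 * xbar
    return b0, b1, xbar, sxx, n

def chi2_stat(ys):
    """(b - b_in)' (X'X) (b - b_in) / SIGMA^2, since Cov(b) = SIGMA^2 (X'X)^-1."""
    b0, b1, xbar, sxx, n = lsq(ys)
    d0, d1 = b0 - A0, b1 - A1
    xtx00, xtx01, xtx11 = n, n * xbar, sxx + n * xbar ** 2
    q = xtx00 * d0 ** 2 + 2 * xtx01 * d0 * d1 + xtx11 * d1 ** 2
    return q / SIGMA ** 2

UCL = 10.597   # chi-square(2) 0.995 quantile, i.e. in-control ARL of 200

in_control = [A0 + A1 * x for x in XS]       # noise-free in-control profile
print(chi2_stat(in_control))                 # 0.0
shifted = [A0 + 1.0 + A1 * x for x in XS]    # intercept shifted by 1*SIGMA
print(chi2_stat(shifted))                    # 4.0 -> below UCL, no signal yet
```

The adaptive versions developed in the thesis keep this statistic but vary the sample size, sampling interval, and/or the limits according to where the previous statistic fell, which is what shortens the average time to signal for small to moderate shifts.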