958 results for Iron foundries Production control Data processing
Abstract:
Drinking water utilities in urban areas are focused on finding smart solutions to the new challenges in their real-time operation posed by limited water resources, intensive energy requirements, a growing population, a costly and ageing infrastructure, increasingly stringent regulations, and increased attention to the environmental impact of water use. Such challenges force water managers to monitor and control not only water supply and distribution but also consumer demand. This paper presents and discusses novel methodologies and procedures towards an integrated water resource management system, based on advanced ICT for automation and telecommunications, for substantially improving the efficiency of drinking water networks (DWN) in terms of water use, energy consumption, water loss minimization, and water quality guarantees. In particular, the paper addresses the first results of the European project EFFINET (FP7-ICT2011-8-318556), devoted to the monitoring and control of the DWN in Barcelona (Spain). Results are split into two levels according to the management objectives: (i) the monitoring level covers all aspects involved in observing the current state of the system and detecting/diagnosing abnormal situations; it is achieved through sensor and communications technology together with mathematical models; (ii) the control level is concerned with computing the most suitable admissible control strategies for the network actuators so as to optimize a given set of operational goals related to the performance of the overall system. This level covers network control (optimal management of water and energy) and demand management (smart metering, efficient supply). Taking the Barcelona DWN as the case study makes it possible to demonstrate the general applicability of the proposed integrated ICT solutions and their effectiveness in DWN management, with considerable savings in electricity costs and reduced water losses while ensuring the high European standards of water quality for citizens.
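To make the "control level" concrete, the following minimal sketch (not the EFFINET implementation; tank sizes, pump limits, tariff and demand profile are all illustrative assumptions) shows the kind of optimization used for DWN pump scheduling: a one-tank, 24-hour linear program that minimizes pumping cost while meeting demand and keeping the tank within its limits.

```python
# Minimal pump-scheduling sketch: minimize pumping cost over 24 hours
# subject to tank-volume limits. All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

T = 24
tariff = np.where(np.arange(T) < 8, 0.05, 0.12)           # EUR per m^3, cheap at night
demand = 80 + 40 * np.sin(np.linspace(0, 2 * np.pi, T))   # m^3/h, assumed profile
v0, vmin, vmax, umax = 500.0, 200.0, 1000.0, 200.0        # tank and pump limits

L = np.tril(np.ones((T, T)))        # cumulative-sum operator: v[t] = v0 + L @ (u - d)
cum_d = L @ demand
A_ub = np.vstack([L, -L])           # encode vmin <= v[t] <= vmax as A_ub @ u <= b_ub
b_ub = np.concatenate([vmax - v0 + cum_d, v0 - vmin - cum_d])

res = linprog(tariff, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, umax)] * T)
print(res.status, res.x.round(1))   # 0 = optimal; pumping concentrates in cheap hours
```

In a real DWN the same structure scales to many tanks, pumps and valves, with the monitoring level supplying the demand forecasts and current tank states that parameterize each optimization run.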
Abstract:
The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field, large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystal electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudo-rapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12500 t.
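As an aside (a helper of our own, not part of the paper), the pseudo-rapidity quoted above is defined as η = -ln tan(θ/2), so the coverage |η| ≤ 5 corresponds to polar angles down to roughly 0.77° from the beam axis:

```python
# Pseudorapidity from the polar angle theta (standard definition).
import math

def pseudorapidity(theta_rad):
    return -math.log(math.tan(theta_rad / 2.0))

print(pseudorapidity(math.radians(0.77)))  # ~5.0, the forward coverage limit
```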
Abstract:
In the present work, the processing of viscera from chicken (Gallus domesticus) and from the ratites ostrich (Struthio camelus) and rhea (Rhea americana), all processed in the same way, was studied with the aim of producing meals. The quality and production control of these meals were evaluated through the technological parameters required by current legislation. In essence, the raw material was cooked and sterilized, sieved (to separate the oil), ground, oven-dried and analysed. Data were obtained on production yields, nutrient composition, pepsin digestibility and caloric value; calcium and phosphorus contents were determined; and the stability of the meal during storage was studied (Salmonella, pH, acidity index and TBA). The results showed that meal production from the processing of ostrich and rhea viscera is viable and that its quality parameters largely meet the requirements, but immediate (regular) use in feed formulation still requires some adjustments (corrections) of physico-chemical (nutrient) parameters.
Abstract:
This work presents software developed to process solar radiation data. The software can be used at meteorological and climatic stations, and also to support solar radiation measurements in research on solar energy availability, allowing data quality control, statistical calculations and validation of models, as well as easy data interchange.
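A minimal sketch of the kind of quality control such software applies (assumed here; the paper does not list its exact tests) is a physical-limit check that flags global-irradiance samples inconsistent with the current solar position:

```python
# Physical-limit quality control for global horizontal irradiance (GHI).
# The bound below is a loose extraterrestrial limit; thresholds are assumptions.
import numpy as np

SOLAR_CONSTANT = 1367.0  # W/m^2

def qc_flags(ghi, cos_zenith):
    """Return True where a GHI sample passes a simple physical-limit test."""
    ghi = np.asarray(ghi, dtype=float)
    cosz = np.clip(np.asarray(cos_zenith, dtype=float), 0.0, 1.0)
    upper = 1.2 * SOLAR_CONSTANT * cosz + 50.0   # loose upper physical bound
    return (ghi >= -4.0) & (ghi <= upper)        # small negative noise tolerated

# Example: a 1000 W/m^2 reading with the sun at the horizon (cos z = 0) fails.
print(qc_flags([450.0, 1000.0], [0.5, 0.0]))     # -> [ True False]
```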
Abstract:
Until mid-2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent in GDP 2, ground-based validation of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, such as a large cloud-dependent offset occurring at northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before the operational switch-on of SGP 3.0 and the public release of upgraded SCIAMACHY NO2 data, we investigated the accuracy of the algorithm transfer (a) by checking the consistency of SGP 3.0 with the prototype algorithms, and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement over the previous processor, IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although their quantitative impact on SGP 3.0 vertical columns is not significant.
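The core comparison step of such a delta-validation can be summarized as below (a simplified sketch under our own assumptions, not the study's actual pipeline): co-located satellite and ground-based NO2 vertical columns are matched pairwise and their bias, scatter and correlation computed.

```python
# Summary statistics for matched satellite vs. ground-based column pairs.
import numpy as np

def validation_stats(sat_vcd, ground_vcd):
    """Bias, RMS difference and correlation of co-located column pairs."""
    sat, grd = np.asarray(sat_vcd, float), np.asarray(ground_vcd, float)
    diff = sat - grd
    return {
        "bias": diff.mean(),                 # mean satellite-minus-ground offset
        "rms": np.sqrt((diff ** 2).mean()),  # overall scatter
        "r": np.corrcoef(sat, grd)[0, 1],    # linear correlation
    }
```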
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Currently, timed ovulation induction and timed artificial insemination (TAI) can be performed in buffalo using GnRH or estradiol plus progesterone/progestin (P4)-releasing devices and prostaglandin F2α (PGF2α). The control of the emergence of follicular waves and of ovulation at predetermined times, without the need for estrus detection, has facilitated the management and improved the efficiency of AI programs in buffalo during the breeding and nonbreeding seasons. Multiple ovulation, embryo transfer, ovum collection and in vitro embryo production have been shown to be feasible in buffalo, although low efficiency and limited commercial application of these techniques have been documented as well. These results could be associated with small ovarian follicular pools, high levels of follicular atresia and failure of the oocyte to enter the oviduct after superstimulation of follicular growth. This review discusses a number of key points related to the manipulation of ovarian follicular growth to improve pregnancy rates following TAI and the transfer of in vivo- and in vitro-derived embryos in buffalo.
Abstract:
Background: Caspase-1 is a cysteine protease responsible for the processing and secretion of IL-1β and IL-18, which are closely related to the induction of inflammation. However, limited evidence addresses the participation of caspase-1 in inflammatory pain. Here, we investigated the role of caspase-1 in inflammatory hypernociception (a decrease in the nociceptive threshold) using caspase-1-deficient mice (casp1-/-). Results: Mechanical inflammatory hypernociception was evaluated using an electronic version of the von Frey test. The production of cytokines, PGE2 and neutrophil migration were evaluated by ELISA, radioimmunoassay and myeloperoxidase activity, respectively. Interleukin (IL)-1β and cyclooxygenase (COX)-2 protein expression was evaluated by western blotting. The mechanical hypernociception induced by intraplantar injection of carrageenin, tumour necrosis factor alpha (TNF-α) and CXCL1/KC was reduced in casp1-/- mice compared with WT mice. However, the hypernociception induced by IL-1β and PGE2 did not differ between WT and casp1-/- mice. Carrageenin-induced TNF-α and CXCL1/KC production and neutrophil recruitment in the paws did not differ between WT and casp1-/- mice, whereas the maturation of IL-1β was reduced in casp1-/- mice. Furthermore, carrageenin induced an increase in COX-2 expression and PGE2 production in the paws of WT mice that was reduced in casp1-/- mice. Conclusion: These results suggest that caspase-1 plays a critical role in the cascade of events involved in the genesis of inflammatory hypernociception by promoting IL-1β maturation. Because caspase-1 is involved in the induction of COX-2 expression and PGE2 production, our data support the assertion that caspase-1 is a key target for the control of inflammatory pain.
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from areas such as scientific visualization and data segmentation. Three practical tools are presented in this thesis. Two of them are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third is an efficient algorithm for data segmentation implemented as part of Insight. Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining data from different sources at runtime, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support have led to additional applications of the software, of which two examples are presented: the use of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations. The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction in the size of the considered data, which enables practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
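The basic operation behind such a feature detector can be sketched as follows (an illustrative simplification, not the thesis's algorithm, which adds tracking and event detection on top): threshold a 3D field and label connected components, so that each feature, e.g. a jet-stream core, becomes one object.

```python
# Minimal sketch: threshold-and-label segmentation of a 3D field.
import numpy as np
from scipy import ndimage

def detect_features(field3d, threshold):
    """Label 26-connected 3D regions where the field exceeds the threshold."""
    mask = field3d >= threshold
    structure = np.ones((3, 3, 3), dtype=bool)   # 26-connected neighbourhood
    labels, n = ndimage.label(mask, structure=structure)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))  # voxels per feature
    return labels, sizes

# Example: jet-stream-like features as regions with wind speed >= 30 m/s
# (synthetic data for illustration).
wind = np.random.default_rng(1).rayleigh(12.0, size=(20, 60, 120))
labels, sizes = detect_features(wind, 30.0)
print(len(sizes), "features found")
```

The resulting label field is typically far smaller than the raw data once only feature voxels are retained, which is what makes the subsequent visualization and statistics practical.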
Abstract:
This is the second part of a study investigating a model-based transient calibration process for diesel engines. The first part addressed the data requirements and data processing required for empirical transient emission and torque models. The current work focuses on modelling and optimization. The unexpected result of this investigation is that, when trained on transient data, simple regression models perform better than more powerful methods such as neural networks or localized regression. This result has been attributed to extrapolation over data that have estimated rather than measured transient air-handling parameters. The challenges of detecting and preventing extrapolation using statistical methods that work well with steady-state data are explained. The concept of constraining the distribution of statistical leverage relative to the distribution of the starting solution, to prevent extrapolation during the optimization process, is proposed and demonstrated. Separate from the issue of extrapolation is preventing the search from being quasi-static. Second-order linear dynamic constraint models are proposed to prevent the search from returning solutions that would be feasible if each point were run at steady state but are unrealistic in a transient sense. Dynamic constraint models translate commanded parameters into actually achieved parameters, which then feed into the transient emission and torque models. Combined model inaccuracies are used to adjust the optimized solutions. To keep the optimization problem within reasonable dimensionality, the coefficients of commanded surfaces that approximate engine tables are adjusted during search iterations, each of which involves simulating the entire transient cycle. The resulting strategy, which differs from the corresponding manual calibration strategy and yields lower emissions and improved efficiency, is intended to improve rather than replace the manual calibration process.
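A second-order dynamic constraint of the kind described can be sketched as below (our own illustrative discretization; the paper's model structure and coefficients are not given here): commanded parameter trajectories are filtered into achievable ones before they feed the emission and torque models.

```python
# Second-order linear dynamic constraint: commanded -> achieved parameter.
# Natural frequency, damping and time step are illustrative assumptions.
import numpy as np

def second_order_lag(u_cmd, wn=2.0, zeta=0.9, dt=0.1):
    """Explicit-Euler simulation of y'' + 2*zeta*wn*y' + wn^2*y = wn^2*u,
    starting from rest (y = y' = 0)."""
    y = np.zeros_like(u_cmd, dtype=float)
    ydot = 0.0
    for k in range(1, len(u_cmd)):
        yddot = wn ** 2 * (u_cmd[k - 1] - y[k - 1]) - 2 * zeta * wn * ydot
        ydot += dt * yddot
        y[k] = y[k - 1] + dt * ydot
    return y

# A step command rises with a realistic lag instead of jumping instantaneously.
print(second_order_lag(np.ones(50)).round(2))
```

Because the achieved trajectory lags the command, the optimizer cannot exploit instantaneous parameter changes, which is precisely what keeps its solutions from being quasi-static.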
Abstract:
BACKGROUND: Short-acting agents for neuromuscular block (NMB) require frequent dosing adjustments to individual patients' needs. In this study, we verified a new closed-loop controller for mivacurium dosing in clinical trials. METHODS: Fifteen patients were studied. T1% measured with electromyography was used as the input signal for the model-based controller. After induction of propofol/opiate anaesthesia, stabilization of the baseline electromyography signal was awaited, and a bolus of 0.3 mg kg-1 mivacurium was then administered to facilitate endotracheal intubation. Closed-loop infusion was started thereafter, targeting a neuromuscular block of 90%. Setpoint deviation, the number of manual interventions and surgeons' complaints were recorded. Drug use and its variability between and within patients were evaluated. RESULTS: The median duration of closed-loop control for the 11 patients included in the data processing was 135 [89-336] min (median [range]). Four patients had to be excluded because of sensor problems. The mean absolute deviation from the setpoint was 1.8 +/- 0.9 T1%. Neither manual interventions nor complaints from the surgeons were recorded. The mean required mivacurium infusion rate was 7.0 +/- 2.2 microg kg-1 min-1. Intra-patient variability of mean infusion rates over 30-min intervals showed differences of up to a factor of 1.8 between the highest and lowest requirements in the same patient. CONCLUSIONS: Neuromuscular block can be precisely controlled with mivacurium using our model-based controller. The amount of mivacurium needed to maintain T1% at defined constant levels differed largely between and within patients. Closed-loop control therefore seems advantageous for automatically maintaining neuromuscular block at constant levels.
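The study's controller is model-based; as a simplified stand-in, the sketch below shows the shape of one closed-loop dosing step as a plain PI law (gains, limits and the sampling interval are illustrative assumptions), with the setpoint T1% = 10% corresponding to the 90% block targeted above.

```python
# One closed-loop dosing step as a simple PI law (illustrative stand-in for
# the paper's model-based controller; all gains and limits are assumptions).
def pi_infusion_step(t1_measured, integral, kp=0.5, ki=0.02, dt=20.0,
                     setpoint=10.0, u_max=20.0):
    """Return (infusion rate in ug/kg/min, updated integral state).
    T1% above the setpoint means too little block -> increase infusion."""
    error = t1_measured - setpoint
    integral += error * dt
    u = kp * error + ki * integral
    return min(max(u, 0.0), u_max), integral   # clamp to the pump's range

# Example: too little block (T1% = 30) raises the infusion rate.
rate, state = pi_infusion_step(30.0, integral=0.0)
print(round(rate, 1), "ug/kg/min")
```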
Abstract:
Polycarbonate (PC) is an important engineering thermoplastic currently produced on a large industrial scale from bisphenol A and monomers such as phosgene. Since phosgene is highly toxic, a non-phosgene route using diphenyl carbonate (DPC) as an alternative monomer, as developed by the Asahi Corporation of Japan, is a significantly more environmentally friendly alternative. Other advantages include the use of CO2 instead of CO as a raw material and the elimination of major waste-water production. However, for DPC production to be economically viable, reactive-distillation units are needed to obtain the necessary yields by shifting the reaction equilibrium towards the desired products and separating the products at the point where the equilibrium reaction occurs. In chemical reaction engineering, many reactions suffer from a low equilibrium constant. The main goal of this research is to determine the optimal process needed to shift such reactions by using appropriate control strategies for the reactive-distillation system. An extensive dynamic mathematical model has been developed to help us investigate different control and processing strategies for the reactive-distillation units in order to increase DPC production. The high-fidelity dynamic models include extensive thermodynamic and reaction-kinetics models and incorporate the necessary mass and energy balances of the various stages of the reactive-distillation units. The study presented in this document shows the possibility of producing DPC in a single reactive-distillation column instead of the conventional two, with a production rate of 16.75 tons/h from starting materials of 74.69 tons/h of phenol and 35.75 tons/h of dimethyl carbonate. This represents a threefold increase over the projected production rate given in the literature for a two-column configuration. In addition, the purity of the DPC produced can reach levels as high as 99.5% with the effective use of controls. These studies are based on simulations using high-fidelity dynamic models.
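The equilibrium-shifting argument can be illustrated with a toy calculation (our own, using a generic equimolar reaction A + B ⇌ C + D rather than the actual two-step DPC transesterification): removing product as it forms raises the attainable conversion well beyond the equilibrium limit, which is exactly what the reactive-distillation column does by boiling off the light product.

```python
# Toy equilibrium-shift calculation for A + B <-> C + D with small K,
# starting from an equimolar feed; 'removal' is the fraction of product D
# continuously withdrawn. Illustrative only.
from scipy.optimize import brentq

def equilibrium_conversion(K, removal=0.0):
    """Solve K = x * x*(1-removal) / (1-x)^2 for the conversion x in (0,1)."""
    f = lambda x: x * x * (1.0 - removal) - K * (1.0 - x) ** 2
    return brentq(f, 1e-9, 1.0 - 1e-9)

print(equilibrium_conversion(0.01))                # ~0.091 without removal
print(equilibrium_conversion(0.01, removal=0.99))  # ~0.50: conversion jumps
```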
Abstract:
EURATOM/CIEMAT and the Technical University of Madrid (UPM) have been involved in the development of an FPSC [1] (Fast Plant System Control) prototype for ITER, based on PXIe (PCI eXtensions for Instrumentation). One of the main focuses of this project has been data acquisition and all related issues, including scientific data archiving. Additionally, a new data-archiving solution has been developed to demonstrate the achievable performance and possible bottlenecks of scientific data archiving in Fast Plant System Control. The presented system implements a fault-tolerant architecture over a Gigabit Ethernet network in which FPSC data are reliably archived remotely while remaining accessible for redistribution within the duration of a pulse. The storage service is supported by a clustering solution to guarantee scalability, so that FPSC management and configuration can be simplified and a unified view of all archived data provided. All the components involved have been integrated under EPICS [2] (Experimental Physics and Industrial Control System), implementing in each case the necessary extensions, state machines and configuration process variables. The prototyped solution is based on the NetCDF-4 [3,4] (Network Common Data Format) file format in order to incorporate important features such as support for scientific data models, management of very large files, platform-independent encoding, and single-writer/multiple-reader concurrency. In this contribution, a complete description of the above-mentioned solution is presented, together with the most relevant results of the tests performed, focusing on the benefits and limitations of the applied technologies.
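As a minimal sketch of the archiving format (the file layout, variable names and sampling rate here are our own assumptions, not the prototype's actual schema), one pulse's signal can be written to a NetCDF-4 file with an unlimited time dimension so that samples can be appended while the pulse is running:

```python
# Archive one pulse's signal in NetCDF-4 with an unlimited time dimension.
# File name, variable layout and sampling rate are illustrative assumptions.
from netCDF4 import Dataset
import numpy as np

with Dataset("pulse_00042.nc", "w", format="NETCDF4") as ds:
    ds.createDimension("time", None)                     # unlimited: grows on append
    t = ds.createVariable("time", "f8", ("time",))
    sig = ds.createVariable("signal", "f4", ("time",), zlib=True)  # compressed
    t.units, sig.units = "s", "V"
    t[:] = np.arange(0.0, 1.0, 1e-3)                     # 1 kHz, 1 s of data
    sig[:] = np.random.default_rng(0).normal(size=1000).astype("f4")
```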
Abstract:
A piece of research is presented that was conducted on the Guayanes Farmhouse Telita Cheese Producers Network, located in the Piar and Padre Chien rural municipalities of Bolivar state in Venezuela. Guayanes telita cheese is a regional dairy product, and its producers are found in a rural area with high potential for marketing the label in the Southern Common Market (MERCOSUR); this market is the focal point of the strategic importance of this study for the region and the country. The research is descriptive in scope and was conducted in the field. A questionnaire based on good food production practice was used as the data-gathering technique. The final sample comprised 30 production units. Statistical processing was performed with version 15.2 of the STATGRAPHICS Centurion computational tool. The results appear to confirm previous studies pointing to the existence of factors that prevent these micro-SMEs from guaranteeing the food safety of the product. They also indicate that new lines of research need to be opened, oriented towards formulating strategies for the continuous improvement of these micro-SMEs, including quality control indicators.
Abstract:
The aim of this project is to develop a systematic metrological control scheme to monitor the accuracy of the positive-displacement volumetric meters that hydrocarbon logistics companies operate for custody transfer at loading facilities and test in situ to obtain their meter factor. The starting point is the primary calibration sheets, generated against the reference standards and the prover, that the companies provide. Given the high number of tests, and because these meters do not have stable control charts, the data-processing approach was a labelling scheme to control the instability and quality of the tests and thereby identify anomalous meters. To search for atypical meters, a Tukey filter was developed for the descriptive statistics of the meter-factor values. Combining the two methods yields a classification of meters as requiring monitoring, recalibration or replacement, to assist the logistics companies.
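A minimal sketch of the Tukey filter described above (the sample values are illustrative): meter-factor values outside the fences [Q1 - 1.5·IQR, Q3 + 1.5·IQR] are flagged as atypical meters.

```python
# Tukey-fence outlier detection on meter-factor values.
import numpy as np

def tukey_outliers(meter_factors, k=1.5):
    """Return True for values outside [Q1 - k*IQR, Q3 + k*IQR]."""
    mf = np.asarray(meter_factors, dtype=float)
    q1, q3 = np.percentile(mf, [25, 75])
    iqr = q3 - q1
    return (mf < q1 - k * iqr) | (mf > q3 + k * iqr)

factors = [1.0012, 1.0009, 1.0011, 1.0010, 0.9985]  # illustrative meter factors
print(tukey_outliers(factors))  # -> flags the 0.9985 meter as atypical
```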