967 results for Control techniques
Resumo:
Virtual tools are commonly used nowadays to optimize the product design and manufacturing processes of fibre-reinforced composite materials. The present work focuses on two areas of interest: forecasting part performance and the particularities of the production process. The first part proposes a multi-physical optimization tool to support the concept stage of a composite part. The strategy is based on the strategic handling of information and, through a single control parameter, is able to evaluate the effects of design variations across all these steps in parallel. The second part targets the resin infusion process and the impact of thermal effects. The numerical and experimental approach allowed the identification of improvement opportunities regarding the implementation of algorithms in commercially available simulation software.
Resumo:
The presence of pathogenic microorganisms in food is one of the essential problems in public health, and the diseases they cause are among the most important causes of illness. The application of microbiological controls within quality assurance programmes is therefore a prerequisite for minimising the risk of infection to consumers. Classical microbiological methods generally require non-selective pre-enrichment, selective enrichment, isolation on selective media and subsequent confirmation using tests based on the morphology, biochemistry and serology of each microorganism under study. These methods are therefore laborious, require a long process to obtain definitive results and, moreover, cannot always be carried out. To overcome these drawbacks, several alternative methodologies have been developed for the detection, identification and quantification of foodborne pathogenic microorganisms, most notably immunological and molecular methods. In the latter category, the technique based on the polymerase chain reaction (PCR) has become the most popular diagnostic technique in microbiology and, recently, the introduction of an improvement on it, real-time PCR, has produced a second revolution in molecular diagnostic methodology, as can be seen from the growing number of scientific publications and the continual appearance of new commercial kits. Real-time PCR is a highly sensitive technique, capable of detecting as little as a single molecule, that allows the accurate quantification of DNA sequences specific to foodborne pathogenic microorganisms.
In addition, other advantages favouring its potential implementation in food analysis laboratories are its speed, simplicity and closed-tube format, which can prevent post-PCR contamination and favours automation and high throughput. In this work, sensitive and reliable molecular techniques (PCR and NASBA) were developed for the detection, identification and quantification of foodborne pathogenic bacteria (Listeria spp., Mycobacterium avium subsp. paratuberculosis and Salmonella spp.). Specifically, methods based on real-time PCR were designed and optimised for each of these agents: L. monocytogenes, L. innocua, Listeria spp. and M. avium subsp. paratuberculosis; a previously developed method for Salmonella spp. was also optimised and evaluated in different centres. In addition, a method based on the NASBA technique was designed and optimised for the specific detection of M. avium subsp. paratuberculosis, and the potential application of NASBA for the specific detection of viable forms of this microorganism was also evaluated. All the methods showed 100% specificity, with sensitivity adequate for their potential application to real food samples. Sample-preparation procedures were also developed and evaluated for meat products, fishery products, milk and water. In this way, fully specific and highly sensitive real-time PCR methods were developed for the quantitative determination of L. monocytogenes in meat products and in salmon and derived products such as smoked salmon, and of M. avium subsp. paratuberculosis in water and milk samples. The latter method was also applied to assess the presence of this microorganism in the intestine of patients with Crohn's disease, using biopsies obtained by colonoscopy from affected volunteers.
In conclusion, this study presents selective and sensitive molecular assays for the detection of pathogens in food (Listeria spp., Mycobacterium avium subsp. paratuberculosis) and for the rapid and unambiguous identification of Salmonella spp. The relative accuracy of the assays was excellent when compared with the reference microbiological methods, and they can be used to quantify both genomic DNA and cell suspensions. Moreover, combining them with pre-amplification treatments proved highly efficient for the analysis of the bacteria under study. They may therefore constitute a useful strategy for the rapid and sensitive detection of pathogens in food and should be an additional tool in the range of diagnostic tools available for the study of foodborne pathogens.
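The quantification step this abstract relies on can be sketched numerically. The following is a simplified, synthetic illustration of real-time PCR standard-curve quantification (fit Ct against log starting copies, then invert the curve); the numbers are invented and are not the assays developed in the work:

```python
import numpy as np

def fit_standard_curve(log10_copies, ct_values):
    """Fit Ct = slope*log10(N0) + intercept and derive the reaction
    efficiency (1.0 means perfect doubling per cycle)."""
    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate the starting copy number."""
    return 10.0 ** ((ct - intercept) / slope)

# synthetic dilution series at 100% efficiency: Ct rises ~3.32 cycles
# per 10-fold dilution of the template
logs = np.array([7.0, 6.0, 5.0, 4.0, 3.0])
cts = 40.0 - 3.3219 * logs
slope, intercept, eff = fit_standard_curve(logs, cts)
copies = quantify(40.0 - 3.3219 * 2.0, slope, intercept)
```

The single-molecule sensitivity the abstract mentions corresponds, in this picture, to a sample whose Ct falls at the very top of the fitted curve.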
Resumo:
Pseudomonas oryzihabitans, a bacterium associated with the entomopathogenic nematode Steinernema abbasi, was evaluated for its potential to colonise roots and thereby control a field population of root-knot nematodes. Immunological techniques were developed to detect root colonisation by P. oryzihabitans on tomato roots using a specific polyclonal antibody raised against vegetative bacterial cells. In vitro, bacterial cell filtrates were also shown to significantly inhibit juvenile hatching. In a glasshouse pot experiment, there were 22 and 82% fewer females in the roots of plants treated with suspensions containing 10^3 and 10^6 cells ml^-1 of P. oryzihabitans, respectively. In addition, significantly fewer egg masses were produced; however, the number of eggs per egg mass did not differ significantly. The relationship between root colonisation and nematode control is discussed.
Resumo:
Two-component systems capable of self-assembling into soft gel-phase materials are of considerable interest due to their tunability and versatility. This paper investigates two-component gels based on a combination of an L-lysine-based dendron and a rigid diamine spacer (1,4-diaminobenzene or 1,4-diaminocyclohexane). The networked gelator was investigated using thermal measurements, circular dichroism, NMR spectroscopy and small-angle neutron scattering (SANS), giving insight into the macroscopic properties, nanostructure and molecular-scale organisation. Surprisingly, all of these techniques confirmed that, irrespective of the molar ratio of the components employed, the "solid-like" gel network always consisted of a 1:1 mixture of dendron/diamine. Additionally, the gel network was able to tolerate a significant excess of diamine in the "liquid-like" phase before being disrupted. In the light of this observation, we investigated the ability of the gel network structure to evolve from mixtures of different aromatic diamines present in excess. We found that these two-component gels assembled in a component-selective manner, with the dendron preferentially recognising 1,4-diaminobenzene (>70%) when similar competitor diamines (1,2- and 1,3-diaminobenzene) were present. Furthermore, NMR relaxation measurements demonstrated that the gel based on 1,4-diaminobenzene was better able to form a selective ternary complex with pyrene than the gel based on 1,4-diaminocyclohexane, indicative of controlled and selective pi-pi interactions within a three-component assembly. As such, the results in this paper demonstrate how component-selection processes in two-component gel systems can control hierarchical self-assembly.
Resumo:
Purpose – To evaluate the control strategy for a hybrid system of natural-ventilation wind catchers and air conditioning, and to assess the contribution of the wind catchers, if any, to indoor air environments and energy savings. Design/methodology/approach – Most of the modeling techniques for assessing wind catcher performance are theoretical. Post-occupancy evaluation (POE) studies of buildings provide an insight into the operation of these building components and help to inform facilities managers. A case study for POE is presented in this paper. Findings – Monitoring of the summer and winter operation showed that the indoor air quality parameters were kept within the design target range. However, the design control strategy failed to record data on the operation, opening time and position of the wind catcher system. Although the implemented control strategy monitored the operation of the mechanical ventilation systems (i.e. the AHU) effectively, it did not integrate the wind catchers with the mechanical ventilation system. Research limitations/implications – Owing to shortfalls in the control strategy implemented in this project, it was difficult to quantify and verify the contribution of the wind catchers to the internal conditions and, hence, to energy savings. Practical implications – Controlling the operation of the wind catchers via the AHU will lead to isolation of the wind catchers in the event of an AHU malfunction. Wind catchers will contribute to the ventilation of a space, particularly in the summer months. Originality/value – This paper demonstrates the value of POE as an indispensable tool for FM professionals. It further provides insight into the application of natural ventilation systems in buildings for healthier indoor environments at lower energy cost. The control strategy for natural ventilation and air conditioning should be considered at the design stage, involving FM personnel.
Resumo:
The deployment of Quality of Service (QoS) techniques involves careful analysis of areas including business requirements, corporate strategy and the technical implementation process, which can lead to conflict or contradiction between the goals of the various user groups involved in policy definition. In addition, long-term change management presents a challenge, as these implementations typically require a high skill set and experience level, exposing organisations to effects such as "hyperthymestria" [1] and "The Seven Sins of Memory", defined by Schacter and discussed further within this paper. It is proposed that, given the information embedded within the packets of IP traffic, an opportunity exists to augment traffic management with a machine-learning, agent-based mechanism. This paper describes the process by which current policies are defined and the research required to support the development of an application that enables adaptive, intelligent Quality of Service controls to augment or replace the policy-based mechanisms currently in use.
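As a purely hypothetical illustration of the kind of machine-learning mechanism proposed, the sketch below learns one centroid per QoS class from labelled flow features and assigns new flows to the nearest class; the feature names and values are invented, not taken from the paper:

```python
import numpy as np

def train_centroids(features, labels):
    """Nearest-centroid classifier: learn one centroid per QoS class
    from labelled flow features (entirely illustrative)."""
    classes = sorted(set(labels))
    mask = np.array(labels)
    return classes, np.array(
        [features[mask == c].mean(axis=0) for c in classes])

def classify(flow, classes, centroids):
    """Assign a flow to the QoS class of its nearest centroid."""
    return classes[int(np.argmin(np.linalg.norm(centroids - flow, axis=1)))]

# hypothetical flow features: (mean packet size in bytes, mean inter-arrival ms)
feats = np.array([[100.0, 20.0], [120.0, 18.0], [1400.0, 80.0], [1350.0, 90.0]])
labs = ["voice", "voice", "bulk", "bulk"]
classes, cents = train_centroids(feats, labs)
```

A policy engine could then map each predicted class to a queueing priority, adapting as new labelled flows arrive.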
Resumo:
In this work a method for building multiple-model structures is presented. A clustering algorithm that uses data from the system is employed to define the architecture of the multiple-model, including the size of the region covered by each model, and the number of models. A heating ventilation and air conditioning system is used as a testbed of the proposed method.
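The clustering step can be sketched as follows, assuming a simple k-means partition of synthetic operating data: each cluster becomes the region covered by one local model, and the number of centres fixes the number of models (all parameters here are illustrative, not the paper's):

```python
import numpy as np

def build_multiple_model_regions(data, n_models, n_iter=50):
    """k-means partition of operating data: each cluster defines the
    region covered by one local model (illustrative sketch)."""
    idx = np.linspace(0, len(data) - 1, n_models).astype(int)
    centers = data[idx].astype(float)          # deterministic init
    for _ in range(n_iter):
        # assign every sample to its nearest centre
        labels = np.argmin(
            np.linalg.norm(data[:, None] - centers[None], axis=2), axis=1)
        # move each centre to the mean of its region
        for k in range(n_models):
            if np.any(labels == k):
                centers[k] = data[labels == k].mean(axis=0)
    return centers, labels

# synthetic data from two operating regimes (e.g. low and high load)
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
                  rng.normal(5.0, 0.3, (50, 2))])
centers, labels = build_multiple_model_regions(data, n_models=2)
```

One local model would then be identified from the samples in each region, with the centroids used at run time to decide which model is active.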
Resumo:
In this work, a fault-tolerant control scheme is applied to an air handling unit of a heating, ventilation and air-conditioning system. Using the multiple-model approach, it is possible to identify faults and to control the system effectively under both faulty and normal conditions. Using well-known techniques to model and control the process, this work focuses on the importance of the cost function in fault detection and its influence on the reconfigurable controller. Experimental results show how the control of the terminal unit is affected in the presence of a fault, and how recovery and reconfiguration of the control action deal with the effects of faults.
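A minimal sketch of cost-function-based fault detection in a multiple-model scheme, with invented static gain models standing in for the normal and faulty plant, might look like this:

```python
import numpy as np

def detect_mode(y_meas, u, model_bank, horizon=10, lam=0.95):
    """Return the index of the model (normal or faulty) with the lowest
    exponentially weighted squared-error cost over recent data."""
    costs = []
    for predict in model_bank:
        e = y_meas[-horizon:] - predict(u[-horizon:])
        w = lam ** np.arange(horizon)[::-1]    # weight recent errors more
        costs.append(float(np.sum(w * e ** 2)))
    return int(np.argmin(costs)), costs

# invented models: normal gain 2.0, faulty gain 0.5
normal = lambda u: 2.0 * u
faulty = lambda u: 0.5 * u
u = np.linspace(1.0, 2.0, 20)
y = 0.5 * u + 0.01        # the plant currently behaves like the fault
mode, costs = detect_mode(y, u, [normal, faulty])
```

The choice of weighting and horizon in the cost is exactly the kind of design decision the abstract highlights, since it governs how quickly the reconfigurable controller switches model.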
Resumo:
Electrospinning is a route to polymer fibres with diameters considerably smaller than those available from most fibre-producing techniques. We explore the use of a low molecular weight compound as an effective control additive during the electrospinning of poly(epsilon-caprolactone). This approach extends the control variables for the electrospinning of nanoscale fibres beyond the more usual ones such as the polymer molecular weight, solvent and concentration. We show that through the use of dual solvent systems, we can alter the impact of the additive on the electrospinning process so that finer as well as thicker fibres can be prepared under otherwise identical conditions. As well as the size of the fibres and the number of beads, the use of the additive allows us to alter the level of crystallinity as well as the level of preferred orientation of the poly(epsilon-caprolactone) crystals. This approach, involving the use of a dual solvent and a low molar mass compound, offers considerable potential for application to other polymer systems. (C) 2010 Society of Chemical Industry
Resumo:
In the last few years a state-space formulation has been introduced into self-tuning control. This has not only allowed for a wider choice of possible control actions, but has also provided an insight into the theory underlying—and hidden by—that used in the polynomial description. This paper considers many of the self-tuning algorithms, both state-space and polynomial, presently in use, and by starting from first principles develops the observers which are, effectively, used in each case. At any specific time instant the state estimator can be regarded as taking one of two forms. In the first case the most recently available output measurement is excluded, and here an optimal and conditionally stable observer is obtained. In the second case the present output signal is included, and here it is shown that although the observer is once again conditionally stable, it is no longer optimal. This result is of significance, as many of the popular self-tuning controllers lie in the second, rather than first, category.
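The two observer forms the abstract distinguishes can be sketched for a scalar system with illustrative plant values and observer gain: the first corrects using the previous output only, the second predicts and then corrects with the current output.

```python
# scalar plant x_{k+1} = a*x_k + b*u_k, y_k = c*x_k (illustrative values)
a, b, c, L = 0.9, 1.0, 1.0, 0.5

def predictor_observer(xhat, u, y):
    """Form 1: corrects with the previous output y_k, so the estimate
    of x_{k+1} excludes the most recent measurement."""
    return a * xhat + b * u + L * (y - c * xhat)

def current_observer(xhat, u, y_next):
    """Form 2: predicts, then corrects with the current output y_{k+1}."""
    xpred = a * xhat + b * u
    return xpred + L * (y_next - c * xpred)

# both estimates converge on the state of a noise-free simulation
x, xh1, xh2 = 1.0, 0.0, 0.0
for _ in range(50):
    u = 0.1
    x_next = a * x + b * u
    xh1 = predictor_observer(xh1, u, c * x)
    xh2 = current_observer(xh2, u, c * x_next)
    x = x_next
```

In this sketch the estimation errors contract by (a - L*c) and (1 - L*c)*a per step respectively; the paper's point is that only the first form is optimal, even though both are conditionally stable.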
Resumo:
A simple parameter adaptive controller design methodology is introduced in which steady-state servo tracking properties provide the major control objective. This is achieved without cancellation of process zeros and hence the underlying design can be applied to non-minimum phase systems. As with other self-tuning algorithms, the design (user specified) polynomials of the proposed algorithm define the performance capabilities of the resulting controller. However, with the appropriate definition of these polynomials, the synthesis technique can be shown to admit different adaptive control strategies, e.g. self-tuning PID and self-tuning pole-placement controllers. The algorithm can therefore be thought of as an embodiment of other self-tuning design techniques. The performances of some of the resulting controllers are illustrated using simulation examples and the on-line application to an experimental apparatus.
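The parameter-estimation step common to such self-tuning schemes can be sketched with a recursive least-squares update on a noise-free first-order model; the plant values and forgetting factor are illustrative, not the paper's:

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=1.0):
    """One recursive least-squares step: theta holds the estimated
    process parameters, phi the regressor, y the newest output."""
    K = P @ phi / (lam + phi @ P @ phi)      # gain vector
    theta = theta + K * (y - phi @ theta)    # correct by prediction error
    P = (P - np.outer(K, phi @ P)) / lam     # covariance update
    return theta, P

# identify y_k = a*y_{k-1} + b*u_{k-1} with true a = 0.8, b = 0.5
rng = np.random.default_rng(0)
theta, P = np.zeros(2), 1000.0 * np.eye(2)
y_prev = 0.0
for _ in range(200):
    u = rng.standard_normal()
    y = 0.8 * y_prev + 0.5 * u
    theta, P = rls_update(theta, P, np.array([y_prev, u]), y)
    y_prev = y
```

The design polynomials the abstract refers to would then be computed from the current theta at each step, yielding PID or pole-placement gains depending on how they are defined.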
Resumo:
The last decade has seen the re-emergence of artificial neural networks as an alternative to traditional modelling techniques for the control of nonlinear systems. Numerous control schemes have been proposed and have been shown to work in simulations. However, very few analyses have been made of the working of these networks. The authors show that a receding horizon control strategy based on a class of recurrent networks can stabilise nonlinear systems.
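A receding-horizon strategy of the kind analysed can be sketched by brute force on a scalar nonlinear plant; at each step only the first move of the cheapest constant-input plan is applied. The plant, cost and coarse input grid below are invented and stand in for the recurrent-network-based optimisation:

```python
import numpy as np

def receding_horizon_step(x0, f, cost, u_grid, horizon=5):
    """Search constant-input candidates over the horizon and return the
    first move of the cheapest plan (brute-force sketch)."""
    best_u, best_J = u_grid[0], np.inf
    for u in u_grid:
        x, J = x0, 0.0
        for _ in range(horizon):
            x = f(x, u)
            J += cost(x, u)
        if J < best_J:
            best_u, best_J = u, J
    return best_u

# invented stable nonlinear plant; the goal is to drive x to zero
f = lambda x, u: 0.7 * x + 0.2 * np.tanh(x) + u
cost = lambda x, u: x ** 2 + 0.1 * u ** 2
x = 2.0
for _ in range(30):
    x = f(x, receding_horizon_step(x, f, cost, np.linspace(-1, 1, 41)))
```

The stability question the authors address is precisely whether repeating such finite-horizon optimisations, with the model supplied by a recurrent network, stabilises the closed loop.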
Resumo:
In recent years researchers in the Department of Cybernetics have been developing simple mobile robots capable of exploring their environment on the basis of the information obtained from a few simple sensors. These robots are used as the test bed for exploring various behaviours of single and multiple organisms: the work is inspired by considerations of natural systems. In this paper we concentrate on the part of the work which involves neural networks and related techniques. These neural networks are used both to process the sensor information and to develop the strategy used to control the robot. Here the robots, their sensors and the neural networks used are all described.
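A minimal sketch of sensor-to-motor neural control in this spirit, with hand-set weights (not those used or trained in the work) steering a differential-drive robot away from the nearer obstacle:

```python
import numpy as np

def robot_controller(sensors, W, b):
    """Single-layer network mapping proximity readings to wheel speeds."""
    return np.tanh(W @ sensors + b)

# hypothetical hand-set weights: an obstacle sensed on the left speeds
# the left wheel relative to the right, turning the robot away from it
W = np.array([[ 1.0, -0.5],
              [-0.5,  1.0]])
b = np.array([0.5, 0.5])          # bias term gives forward motion
left, right = robot_controller(np.array([1.0, 0.0]), W, b)
```

In the actual robots such weights would be learned or evolved rather than set by hand; the sketch only shows the sensor-in, motor-out structure.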
Resumo:
The use of data reconciliation techniques can considerably reduce the inaccuracy of process data due to measurement errors. This in turn results in improved control system performance and process knowledge. Dynamic data reconciliation techniques are applied to a model-based predictive control scheme. It is shown through simulations on a chemical reactor system that the overall performance of the model-based predictive controller is enhanced considerably when data reconciliation is applied. The dynamic data reconciliation techniques used include a combined strategy for the simultaneous identification of outliers and systematic bias.
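Linear steady-state data reconciliation can be sketched as a weighted least-squares projection of the raw measurements onto the balance constraints; the splitter balance and variances below are illustrative, not the chemical reactor example of the paper:

```python
import numpy as np

def reconcile(x_meas, A, variances):
    """Minimal weighted-least-squares adjustment of measurements so
    that the linear balance A @ x = 0 holds exactly."""
    V = np.diag(variances)
    correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ x_meas)
    return x_meas - correction

# splitter balance: stream 1 = stream 2 + stream 3
A = np.array([[1.0, -1.0, -1.0]])
x_meas = np.array([10.5, 6.1, 4.2])     # raw, inconsistent flow readings
x_rec = reconcile(x_meas, A, np.array([0.1, 0.1, 0.1]))
```

Feeding reconciled rather than raw values to a model-based predictive controller is the mechanism by which, as the abstract reports, overall control performance improves; outlier and bias identification extend the same least-squares machinery.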