961 results for Statistical Process Control (SPC)


Relevance:

30.00%

Publisher:

Abstract:

Context: A nuclear power plant, like any other power station that generates electricity with steam turbines, is based on a thermodynamic process. Turbine efficiency is a function of the enthalpy drop of the steam. To improve it, plants use a compound cycle, consisting of a high-pressure turbine and low-pressure turbines, together with a regenerative cycle in which the feedwater is heated before entering the steam generators. A regenerative cycle relies on stages of heaters or heat exchangers to extract the maximum thermal energy from the steam; this project focuses on improving and optimizing the control of these heaters in order to raise the plant's overall efficiency. Objective: To implement a control system that modernizes the classic architecture based on local controllers and analog communications; to improve the efficiency of the plant's regenerative cycle by exploiting the technological advances available on the market, in both the hardware and the software of instrumentation and control systems; and to optimize the performance of the control loops of each element of the regenerative cycle through control strategies. Procedure: Development of an updated control system with system reliability, failure analysis, and risk ranking as the guiding premises; analysis and tuning of the control loops under those premises; and configuration of the loops with control strategies that minimize the effects of failures. Parameters and data from the Ascó Nuclear Power Plant were used. Conclusions: The control system was modernized and optimized, improving the performance of the regenerative cycle. The result is a more reliable system, with a lower risk of failure and reduced failure consequences.
The cost of a project of this kind is lower than that of a conventional system, and it offers more possibilities. It is an open system that allows equipment from different manufacturers to be used and interconnected, which benefits both maintenance and possible future expansions of the system.
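A control loop of the kind tuned in this project can be illustrated with a minimal discrete PID sketch. The gains, the first-order heater model, and all numbers below are invented for illustration; they are not taken from the plant's actual loop configuration.

```python
# Minimal discrete PID loop driving a toy first-order process.
# Gains and plant model are hypothetical, purely illustrative.

def pid_step(setpoint, measurement, state, kp=2.0, ki=0.5, kd=0.1, dt=0.1):
    """One PID iteration; state carries (integral, previous_error)."""
    integral, prev_error = state
    error = setpoint - measurement
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

def simulate(setpoint=60.0, steps=400, dt=0.1):
    """Drive a first-order lag process (tau = 5 s, ambient 20 deg) to the setpoint."""
    temp, state = 20.0, (0.0, 0.0)
    for _ in range(steps):
        u, state = pid_step(setpoint, temp, state, dt=dt)
        temp += dt * (-(temp - 20.0) / 5.0 + u)  # simple heater model
    return temp
```

The integral term is what removes the steady-state error; after 40 simulated seconds the toy process settles at the setpoint.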

Relevance:

30.00%

Publisher:

Abstract:

The project is based on creating an application for Android mobile devices that uses the microphone to capture the sound produced by the user and can determine whether the user is breathing and at which point of the breathing cycle they are. A user-centered design (UCD) philosophy was followed, so the first step was to produce a prototype and a sketch. Next, ten test applications were built, each extending the functionality of the previous one, until a base application was obtained that approximates the initial design produced through the prototype. The most important aspect of the algorithms designed for the application is the ability to process the signal in real time: even the fast Fourier transform (FFT) could be applied in real time without degrading the application's performance. This was made possible by a double-buffer processing design running on a dedicated thread, independent of the program's main UI thread.
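The double-buffer design with a dedicated analysis thread described above can be sketched roughly as follows. The block size, the synthetic tone generator, and the function names are assumptions, not code from the actual app (which runs on Android; this is a desktop Python analogue using a queue of buffers in place of the audio driver).

```python
# Sketch of a double-buffer scheme: buffers of "microphone" samples are
# handed to a worker thread that runs the FFT, so capture never stalls.
import threading, queue
import numpy as np

BLOCK = 1024  # samples per analysis window (illustrative)

def capture(source, out_q, blocks):
    """Producer: hand filled buffers to the analysis thread."""
    for _ in range(blocks):
        out_q.put(next(source))

def analyze(in_q, results, blocks):
    """Consumer thread: FFT each buffer off the capture thread."""
    for _ in range(blocks):
        buf = in_q.get()
        spectrum = np.abs(np.fft.rfft(buf))
        results.append(int(np.argmax(spectrum)))  # dominant frequency bin

def fake_mic(freq_bin):
    """Synthetic tone generator standing in for the microphone."""
    t = np.arange(BLOCK)
    while True:
        yield np.sin(2 * np.pi * freq_bin * t / BLOCK)

q = queue.Queue(maxsize=2)  # at most two buffers in flight
results = []
worker = threading.Thread(target=analyze, args=(q, results, 4))
worker.start()
capture(fake_mic(50), q, 4)
worker.join()
```

The bounded queue is the double buffer: while one block is being analyzed, the next is being filled, which is what keeps real-time FFT off the UI thread.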

Relevance:

30.00%

Publisher:

Abstract:

In the rat utricle, synaptic contacts between hair cells and the nerve fibers arising from the vestibular primary neurons form during the first week after birth. During that period, the sodium-based excitability that characterizes neonate utricle sensory cells is switched off. To investigate whether the establishment of synaptic contacts was responsible for the modulation of the hair cell excitability, we used an organotypic culture of rat utricle in which the setting of synapses was prevented. Under this condition, the voltage-gated sodium current and the underlying action potentials persisted in a large proportion of nonafferented hair cells. We then studied whether impairment of nerve terminals in the utricle of adult rats may also affect hair cell excitability. We induced selective and transient damage to afferent terminals using glutamate excitotoxicity in vivo. The efficacy of the excitotoxic injury was attested by selective swelling of the terminals and the resulting altered vestibular behavior. Under this condition, the sodium-based excitability transiently recovered in hair cells. These results indicate that the modulation of hair cell excitability depends on the state of the afferent terminals. In adult utricle hair cells, this property may be essential to set the conditions required for restoration of the sensory network after damage. This is achieved via re-expression of a biological process that occurs during synaptogenesis.

Relevance:

30.00%

Publisher:

Abstract:

This thesis was carried out at the Naantali plant of Finnfeeds Finland Oy. Its main focus was on reducing, controlling, measuring, and processing odour emissions by various methods; the legislation, marketing issues, and environmental requirements related to odour reduction are also considered. The literature review introduces odour problems, the relevant legislation, and various odour-removal methods, and reviews methods for detecting and measuring volatile organic compounds. The experimental section consists of TD-GC-MS measurements and extensive measurements with an electronic nose, a new approach to recognizing and measuring industrial odours. In this thesis, the electronic nose was developed into a reliable recognition and measurement method. Electronic-nose measurements were made at the betaine factory, the main targets being the odour-removal process and other odour sources at the factory. The TD-GC-MS measurements identified the main odour compounds as 2- and 3-methylbutanal and dimethyl disulfide, whose odour is sweet and stuffy. The extensive electronic-nose study identified several areas for improvement. Based on odour-balance measurements of the factory and the resulting adjustment of the odour-removal process, the overall odour emission to the environment is expected to fall by 25%.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the different steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. METHODS: The input dataset is composed of manually segmented anonymized patient computed tomography (CT) scans. The different datasets are aligned by Procrustes alignment on surface models, and the registration is then cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which captures the variability of the C2. RESULTS: The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open source software for image analysis and scientific visualization) with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. CONCLUSION: The SSM of the vertebra allows the shape variability of the C2 to be represented. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
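The PCA step of such a statistical shape model can be sketched as follows, assuming alignment and point correspondence are already solved (the paper itself uses Procrustes alignment plus Gaussian-process registration via Statismo). The toy data and function names are invented for illustration.

```python
# PCA shape model over pre-aligned, corresponded shapes.
import numpy as np

def build_ssm(shapes):
    """shapes: (n_samples, n_points * 3) matrix of aligned shapes.
    Returns mean shape, eigenmodes (rows), and per-mode variances."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # SVD of the centered data matrix yields the PCA modes directly
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variances = s**2 / (len(shapes) - 1)
    return mean, vt, variances

def sample_shape(mean, modes, variances, coeffs):
    """New shape = mean + sum_i b_i * sqrt(var_i) * mode_i."""
    b = np.asarray(coeffs)
    return mean + (b * np.sqrt(variances[:len(b)])) @ modes[:len(b)]

rng = np.random.default_rng(0)
toy = rng.normal(size=(92, 30))  # 92 "scans", 10 3-D points each
mean, modes, var = build_ssm(toy)
new = sample_shape(mean, modes, var, [1.0, -0.5])
```

Specificity, compactness, and generalization are then evaluated on how well shapes sampled this way resemble, and reconstruct, held-out real shapes.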

Relevance:

30.00%

Publisher:

Abstract:

Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
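Cronbach's alpha, used above to assess reliability, has a standard closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of summed scores). A minimal sketch on made-up Likert-style scores (not the study's data):

```python
# Cronbach's alpha from a respondents-by-items score matrix.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

As a sanity check, parallel items (identical columns) give alpha exactly 1, while unrelated items push alpha toward 0.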


Relevance:

30.00%

Publisher:

Abstract:

This study analyzed high-density event-related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task-irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory-visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross-modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non-linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top-down attentional control that further modulates cross-modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context-based control over multisensory processing, whose influences multiplex across finer and broader time scales.

Relevance:

30.00%

Publisher:

Abstract:

Throughout the evolution of technology, devices have been interconnected by cables. Cables limit the user's freedom of movement and can pick up interference from one another when the wiring network is large. As wireless technology advanced, it was adapted to electronic equipment, which was at the same time becoming ever smaller. Hence the need to use such devices as remote controls without cables and the drawbacks they entail. This work aims to unify three technologies that may have a strong affinity in the future:
· Devices based on the Android system. Since their beginnings, they have evolved at a meteoric pace, becoming ever faster and better.
· Wireless systems. Wi-Fi and Bluetooth have steadily entered our lives and are now found in virtually every device.
· Robotics. Almost every production process incorporates a robot. Robots are needed for many tasks that humans could perform, but a robot reduces both the time required and the danger of the process.
Although the first two technologies go hand in hand (who does not own a phone with Wi-Fi and Bluetooth connectivity?), few designs combine these fields with robotics. The final objective of this work is to build an Android application for the remote control of a robot using a wireless communication system. The application developed lets the user control the robot at will through a touch-based, remote-controlled interface. Thanks to the use of simulators for both languages (RAPID and Android), the programming could be done without being physically present at the robot targeted by this work. As the work progressed, the amount of data sent to the robot and the complexity of its processing grew, while the aesthetics of the application were also improved.
Finally, the application was used with the actual robot, successfully making it perform the movements sent from the programmed tablet.

Relevance:

30.00%

Publisher:

Abstract:

Behavior-based navigation of autonomous vehicles requires the recognition of the navigable areas and the potential obstacles. In this paper we describe a model-based object recognition system which is part of an image interpretation system intended to assist the navigation of autonomous vehicles that operate in industrial environments. The recognition system integrates color, shape and texture information together with the location of the vanishing point. The recognition process starts from some prior scene knowledge, that is, a generic model of the expected scene and the potential objects. The recognition system constitutes an approach in which different low-level vision techniques extract a multitude of image descriptors which are then analyzed using a rule-based reasoning system to interpret the image content. This system has been implemented using a rule-based cooperative expert system.
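The rule-based interpretation stage can be sketched as a priority-ordered set of if-then rules over low-level image descriptors. The rules, thresholds, and descriptor names below are invented for illustration; they are not the system's actual rule base.

```python
# Toy rule-based labeling of image regions from low-level descriptors.
# Each rule is (condition over descriptors, label); first match wins.
RULES = [
    (lambda d: d["hue"] == "gray" and d["texture"] < 0.2, "road"),
    (lambda d: d["hue"] == "yellow" and d["elongated"], "lane_marking"),
    (lambda d: d["height"] > 0.5, "obstacle"),
]

def interpret(descriptors):
    """Fire rules in priority order and return the first matching label."""
    for condition, label in RULES:
        if condition(descriptors):
            return label
    return "unknown"

region = {"hue": "gray", "texture": 0.1, "elongated": False, "height": 0.0}
label = interpret(region)
```

A real expert system would add certainty factors and let rules cooperate rather than stop at the first match, but the descriptor-to-label flow is the same.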

Relevance:

30.00%

Publisher:

Abstract:

In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact method to obtain surface topography is to apply photometric stereo in the estimation of surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension is evaluated in the estimation of surface roughness from gray-scale images and topographies. The results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations with standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. In this thesis, the reconstruction of planar high-frequency varying surfaces is studied in the presence of imaging noise and blur.
Two Wiener filter-based methods are proposed, one of which is optimal in the sense of surface power spectral density, given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
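The idea behind Wiener-filter reconstruction can be shown in one dimension: given a known blur kernel H and a noise-to-signal power ratio, the filter conj(H) / (|H|^2 + NSR) restores high-frequency variation that plain smoothing would dampen. The profile, kernel, and NSR below are synthetic; this is a sketch of the principle, not the thesis's actual two-dimensional method.

```python
# 1-D Wiener deconvolution of a blurred "roughness" profile.
import numpy as np

def wiener_deconvolve(observed, kernel, nsr):
    """Frequency-domain Wiener filter with noise-to-signal ratio nsr."""
    H = np.fft.fft(kernel, n=len(observed))
    G = np.fft.fft(observed)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + nsr) * G
    return np.real(np.fft.ifft(F_hat))

n = 256
x = np.sin(2 * np.pi * np.arange(n) * 20 / n)  # high-frequency profile
kernel = np.zeros(n)
kernel[:5] = 1 / 5                              # moving-average blur
blurred = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(kernel)))
restored = wiener_deconvolve(blurred, kernel, nsr=1e-3)
```

At frequencies where the blur passes little energy, the NSR term keeps the filter from amplifying noise, which is the trade-off the thesis tunes via the spectral properties of the noise and blur.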

Relevance:

30.00%

Publisher:

Abstract:

In this thesis X-ray tomography is discussed from the Bayesian statistical viewpoint. The unknown parameters are assumed to be random variables and, in contrast to traditional methods, the solution is obtained as a large sample of the distribution of all possible solutions. As an introduction to tomography, an inversion formula for the Radon transform on a plane is presented, and the widely used filtered backprojection algorithm is derived. The traditional regularization methods are presented in sufficient detail to ground the Bayesian approach. The measurements are photon counts at the detector pixels, so the assumption of a Poisson-distributed measurement error is justified. The error is often assumed to be Gaussian, although the electronic noise caused by the measurement device can change the error structure; the assumption of a Gaussian measurement error is discussed. The thesis discusses the use of different prior distributions in X-ray tomography. Especially in severely ill-posed problems, the choice of a suitable prior is the main part of the whole solution process. In the empirical part, the presented prior distributions are tested using simulated measurements, and the effects that different prior distributions produce are shown. The use of a prior is shown to be indispensable in the case of a severely ill-posed problem.
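The "solution as a large sample of the distribution" idea can be sketched in one dimension with a Metropolis sampler for a Poisson photon-count rate under a Gamma prior. This conjugate pair has a known exact posterior, Gamma(a + sum of counts, b + n), against which the chain can be checked; all numbers below are synthetic.

```python
# Random-walk Metropolis sampling of a Poisson rate with a Gamma prior.
import math, random

def log_posterior(lam, counts, a=2.0, b=1.0):
    """Log of Gamma(a, b) prior times Poisson likelihood (constants dropped)."""
    if lam <= 0:
        return float("-inf")
    return (a - 1) * math.log(lam) - b * lam + sum(
        k * math.log(lam) - lam for k in counts)

def metropolis(counts, steps=30000, step=0.5, seed=1):
    """Run the chain and return its second half (burn-in discarded)."""
    rng = random.Random(seed)
    lam = 5.0
    lp = log_posterior(lam, counts)
    chain = []
    for _ in range(steps):
        prop = lam + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, counts)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            lam, lp = prop, lp_prop  # accept
        chain.append(lam)
    return chain[steps // 2:]

counts = [4, 6, 5, 7, 3, 5, 6, 4]  # synthetic detector counts
samples = metropolis(counts)
```

With a = 2 and b = 1, the exact posterior mean here is (2 + 40) / (1 + 8) ≈ 4.67, and the empirical mean of the chain converges to it; in real tomography the same scheme runs over a high-dimensional image with a structured prior.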

Relevance:

30.00%

Publisher:

Abstract:

The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master's thesis using synthetic data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested on a couple of case studies and on a heat exchanger model. The two-step MCMC method worked well and reduced the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not seem to have a notable effect. The use of the posterior distribution of the parameters across different heat exchanger geometries was studied, since it would be computationally most efficient to use the same posterior distribution for different geometries in the optimisation of heat exchanger networks. According to the results, this is possible when the frontal surface areas are the same across the geometries; in the other cases the same posterior distribution can still be used for optimisation, but it yields a wider predictive distribution. For condensing-surface heat exchangers, the numerical stability of the simulation model was studied, and a stable algorithm was developed as a result.

Relevance:

30.00%

Publisher:

Abstract:

In a previous work, a hybrid system consisting of an advanced oxidation process (AOP), Photo-Fenton (Ph-F), and a fixed-bed biological treatment operating as a sequencing batch biofilm reactor (SBBR) was started up and optimized to treat 200 mg·L⁻¹ of 4-chlorophenol (4-CP) as a model compound. In this work, studies of reactor stability and control, as well as microbial population determination by molecular biology techniques, were carried out to further characterize and control the biological reactor. Results revealed that the integrated system was flexible and even able to overcome toxic shock loads. In situ oxygen uptake rate (OUR) was shown to be a valid tool to control the SBBR operation, to detect conditions toxic to the biomass, and to assess the recovery of performance. A microbial characterization by 16S rDNA sequence analysis revealed that the biological population was varied, although about 30% of the bacteria belonged to the genus Wautersia.