936 results for Process control -- Data processing
Abstract:
In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, where a relational database and semantic ontology storage are used at the same time in a data warehouse. An annotation system has been designed to indicate the storage format and location, register new ontology concepts and, most importantly, guarantee data consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to strengthen the data consistency assurance. This mechanism also provides an easy way to develop applications and integrate them with BATMP. Finally, an application named "Parameter Monitoring Service" is presented as an example to assess the feasibility of the system.
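A minimal sketch of the dual-storage DAO idea described above, assuming the middleware mirrors each record in a relational warehouse and an ontology store; the class and method names (DeviceDAO, RelationalStore-style calls, assert_individual) are illustrative assumptions, not the actual BATMP API.

```python
# Sketch: a DAO that keeps a relational store and an ontology store consistent.
# All store interfaces here are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class DeviceState:
    device_id: str
    parameter: str
    value: float


class DeviceDAO:
    """Data Access Object responsible for keeping both stores consistent."""

    def __init__(self, relational_store, ontology_store):
        self.relational = relational_store
        self.ontology = ontology_store

    def save(self, state: DeviceState) -> None:
        # Write to the relational warehouse first, then mirror the record as an
        # ontology individual; undo the first write if the second fails, so the
        # two storage methods never diverge.
        self.relational.insert("device_state", vars(state))
        try:
            self.ontology.assert_individual("DeviceState", vars(state))
        except Exception:
            self.relational.delete("device_state", state.device_id)
            raise
```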
Abstract:
This Final Degree Project is framed within a system for the control and development of intelligent transport systems (ITS). The work comprises several lines of development within that framework, arising from the need to improve road safety, flow, structure, and maintenance by incorporating the latest technologies. First, the project focuses on the development of a new real-time traffic data processing system that takes advantage of the Big Data, Cloud Computing, and Map-Reduce technologies that have emerged in recent years. To that end, a preliminary study is carried out of the road traffic data generated by vehicles travelling on roads, focusing on the system used by the Dirección General de Tráfico of Spain and comparing it with those of companies offering location-based services (LBS). The Hadoop model used and the Map-Reduce process implemented in this analysis system are then described. Finally, the output data are prepared and sent to a basic web module that acts as a Geographic Information System (GIS).
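A toy map/reduce pass over traffic records, emulating the Hadoop flow the abstract describes (map, shuffle/sort, reduce) in plain Python; the record format "road_id,timestamp,speed" and the per-road mean-speed aggregation are assumed examples, not the DGT schema or the project's actual job.

```python
# Sketch of the map -> shuffle/sort -> reduce flow on synthetic traffic records.
from itertools import groupby
from operator import itemgetter


def mapper(line):
    road_id, _timestamp, speed = line.split(",")
    yield road_id, float(speed)


def reducer(road_id, speeds):
    speeds = list(speeds)
    return road_id, sum(speeds) / len(speeds)  # mean speed per road segment


def run(lines):
    mapped = [kv for line in lines for kv in mapper(line)]
    mapped.sort(key=itemgetter(0))                       # shuffle/sort phase
    return [reducer(key, (v for _, v in group))
            for key, group in groupby(mapped, key=itemgetter(0))]


print(run(["A1,2015-06-01T08:00,92.0",
           "A1,2015-06-01T08:01,88.0",
           "M30,2015-06-01T08:00,45.5"]))
```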
Self-organized phase transitions in neural networks as a neural mechanism of information processing.
Abstract:
Transitions between dynamically stable activity patterns imposed on an associative neural network are shown to be induced by self-organized infinitesimal changes in synaptic connection strength and to be a kind of phase transition. A key event in the neural process of information processing under a population coding scheme is the transition between activity patterns encoding usual entities. We propose that infinitesimal, short-term synaptic changes based on the Hebbian learning rule are the driving force for the transition. The phase transition between the following two dynamically stable states is studied in detail: the state in which the firing pattern changes over time so as to itinerate among several patterns, and the state in which the firing pattern is fixed to one of several patterns. The phase transition from the pattern-itinerant state to a pattern-fixed state may be induced by the Hebbian learning process under a weak input relevant to the fixed pattern, while the reverse transition may be induced by a Hebbian unlearning process without input. The former transition is interpreted as recognition of the input stimulus, and the latter as clearing of the used input data to get ready for new input. For information processing based on this phase transition to be achieved by infinitesimal, short-term synaptic changes, the network must always stay near the critical state corresponding to the phase transition point.
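A schematic of the Hebbian learning and unlearning updates the abstract invokes as the driving force for the transitions; the network dynamics themselves (pattern itinerancy, criticality) are not reproduced here, and the learning rate and pattern size are arbitrary illustrative choices.

```python
# Sketch: infinitesimal Hebbian (learning) and anti-Hebbian (unlearning) updates
# to a symmetric weight matrix storing a +/-1 activity pattern.
import numpy as np


def hebbian_step(W, x, eta=1e-3, learn=True):
    """Apply one small Hebbian (learn=True) or unlearning (learn=False) update."""
    sign = 1.0 if learn else -1.0
    dW = sign * eta * np.outer(x, x)
    np.fill_diagonal(dW, 0.0)            # no self-connections
    return W + dW


rng = np.random.default_rng(0)
pattern = rng.choice([-1.0, 1.0], size=64)    # a stored activity pattern
W = np.zeros((64, 64))
W = hebbian_step(W, pattern, learn=True)      # weak relevant input: learning
W = hebbian_step(W, pattern, learn=False)     # no input: unlearning
```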
Abstract:
The science that studies the deformation of a fluid under an applied shear stress is known as rheology, and the equipment used to carry out such tests is called a rheometer. Owing to the impracticality of using commercial rheometers, several researchers have developed rheometers capable of analysing macroparticle suspensions, based on the same operating principles as existing equipment. In some cases, the motor torque is measured by acquiring the voltage, since the voltage is proportional to the torque. However, to understand the result properly and avoid hasty conclusions, the electrical signal must be interpreted correctly, which requires determining which signal frequency is relevant to the test and which sampling rate is most appropriate. Besides acquisition, accurate rheological testing demands very good control of the motor's rate or voltage, and one alternative is to use a servomotor and a servo drive; when a commercial drive is used, knowing how to configure it is essential. To assist non-expert users, some researchers have developed software for controlling the equipment and analysing the data. This work therefore proposes a methodology for understanding the signal acquired from a servo-controlled rheometer and develops analysis software to process the data obtained from rheological tests. The best servo-controller configuration was determined, along with the best sampling rate of at least 20 samples per second, and an FIR low-pass digital filter was developed to remove the undesired frequency. In addition, software was developed using a Matlab routine and a graphical user interface (GUI) for post-processing the data, helping non-expert users to process and interpret the results; it proved effective.
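A sketch of the post-processing step described above, written in Python rather than the Matlab used in the work: low-pass FIR filtering of the acquired torque-proportional voltage signal. The 20 samples/s rate comes from the abstract; the 2 Hz cutoff, filter length, and synthetic signal are assumptions for illustration.

```python
# Sketch: FIR low-pass filtering of a torque-proportional voltage signal.
import numpy as np
from scipy.signal import firwin, filtfilt

fs = 20.0                                        # samples per second (from the text)
t = np.arange(0, 10, 1 / fs)
raw = 1.5 + 0.05 * np.sin(2 * np.pi * 7 * t)     # synthetic signal with unwanted 7 Hz component

taps = firwin(numtaps=51, cutoff=2.0, fs=fs)     # FIR low-pass, assumed 2 Hz cutoff
filtered = filtfilt(taps, 1.0, raw)              # zero-phase filtering of the record
```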
Abstract:
A new methodology is proposed to produce subsidence activity maps based on the geostatistical analysis of persistent scatterer interferometry (PSI) data. PSI displacement measurements are interpolated using conditional Sequential Gaussian Simulation (SGS) to calculate multiple equiprobable realizations of subsidence. The result of this process is a series of interpolated subsidence values, with an estimate of the spatial variability and a confidence level for the interpolation. These maps complement the PSI displacement map, improving the identification of wide subsiding areas at a regional scale. At a local scale, they can be used to identify buildings susceptible to suffering subsidence-related damage. To do so, the maximum differential settlement and the maximum angular distortion must be calculated for each building in the study area. Based on these PSI-derived parameters, the buildings in which the serviceability limit state has been exceeded, and where in situ forensic analysis should be carried out, can be automatically identified. The methodology has been tested in the city of Orihuela (SE Spain) for the study of historical buildings damaged during the last two decades by subsidence due to aquifer overexploitation. The qualitative evaluation of the results in buildings where damage has been reported shows a success rate of 100%.
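An illustrative computation of the two building-damage parameters mentioned above, maximum differential settlement and maximum angular distortion (differential settlement divided by the distance between the two points), from PSI-derived settlements at a building's monitored points; the coordinates and settlement values are made-up example data.

```python
# Sketch: maximum differential settlement and angular distortion for one building.
import itertools
import math

# (x, y) in metres and cumulative settlement in mm at each monitored point
points = [((0.0, 0.0), -12.0), ((18.0, 0.0), -31.0), ((18.0, 10.0), -27.0)]

max_diff, max_distortion = 0.0, 0.0
for (p1, s1), (p2, s2) in itertools.combinations(points, 2):
    delta = abs(s1 - s2) / 1000.0                          # differential settlement, m
    length = math.dist(p1, p2)                             # horizontal distance, m
    max_diff = max(max_diff, delta)
    max_distortion = max(max_distortion, delta / length)   # angular distortion

print(f"max differential settlement = {max_diff:.4f} m, "
      f"max angular distortion = 1/{1 / max_distortion:.0f}")
```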
Abstract:
The Santas Justa and Rufina Gothic church (fourteenth century) has suffered several physical, mechanical, chemical, and biochemical pathologies throughout its history: rock alveolization, efflorescence, biological activity, and capillary rise of groundwater. During the last two decades, however, a new phenomenon has seriously affected the church: ground subsidence caused by aquifer overexploitation. Subsidence is a process that affects the whole Vega Baja of the Segura River basin and consists of a gradual sinking of the ground surface caused by soil consolidation due to a decrease in pore pressure. This phenomenon has been studied by differential synthetic aperture radar interferometry techniques, which show settlements of up to 100 mm for the 1993–2009 period across the city of Orihuela. Although no differential synthetic aperture radar interferometry information is available for the church itself, owing to the loss of interferometric coherence, the spatial analysis of nearby deformation combined with fieldwork has advanced the current understanding of the mechanisms that affect the Santas Justa and Rufina church. These results show both the potential and the limitations of using this remote sensing technique as a complementary tool for the forensic analysis of building structures.
Abstract:
The quality of water level time series data varies strongly between periods of high- and low-quality sensor data. In this paper we present the processing steps used to generate high-quality water level data from water pressure measured at the Time Series Station (TSS) Spiekeroog. The TSS is positioned in a tidal inlet between the islands of Spiekeroog and Langeoog in the East Frisian Wadden Sea (southern North Sea). The processing steps cover sensor drift, outlier identification, interpolation of data gaps, and quality control. A central step is the removal of outliers. For this process an absolute threshold of 0.25 m/10 min was selected, which still preserves the water level rise and fall during extreme events, as confirmed during the quality control process. A second important feature of the data processing is the interpolation of gappy data, which is accomplished with high confidence of generating trustworthy values. Applying these methods, a ten-year dataset (December 2002–December 2012) of water level information at the TSS was processed, resulting in a seven-year time series (2005–2011).
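A minimal sketch of the outlier-removal and gap-filling steps described above, using the 0.25 m per 10 min rate-of-change threshold from the text; the interpolation settings (time-linear, gaps of at most one hour) are assumptions, not the paper's exact procedure.

```python
# Sketch: rate-of-change outlier screening and short-gap interpolation
# for a 10-minute water level series.
import pandas as pd


def clean_water_level(series: pd.Series) -> pd.Series:
    """series: water level [m] on a 10-minute DatetimeIndex."""
    rate = series.diff().abs()
    cleaned = series.mask(rate > 0.25)                    # flag jumps > 0.25 m / 10 min
    return cleaned.interpolate(method="time", limit=6)    # fill gaps up to ~1 h (assumed)
```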
Abstract:
Mode of access: Internet.
Abstract:
"Research was supported by the United States Air Force through the Air Force Office of Scientific Research, Air Research and Development Command."
Abstract:
Includes bibliographical references (p. 48-49).
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Water recovery is one of the key parameters in flotation modelling for the purposes of plant design and process control, as it determines the circulating flow and residence time in the individual process units in the plant and has a significant effect on entrainment and froth recovery. This paper reviews some of the water recovery models available in the literature, including both empirical and fundamental models. The selected models are tested using data obtained from experimental work conducted in an Outokumpu 3 m³ tank cell at the Xstrata Mt Isa copper concentrator. It is found that all the models fit the experimental data reasonably well for a given flotation system. However, the empirical models are either unable to distinguish the effects of different cell operating conditions or require their model parameters to be derived from an existing flotation system. The model developed by [Neethling, S.J., Lee, H.T., Cilliers, J.J., 2003. Simple relationships for predicting the recovery of liquid from flowing foams and froths. Minerals Engineering 16, 1123-1130] is based on a fundamental understanding of the froth structure and the transfer of water in the froth. It describes the water recovery as a function of the cell operating conditions and the froth properties, all of which can be determined on-line. Hence, the fundamental model can be used for process control purposes in practice. By incorporating additional models that relate the air recovery and surface bubble size directly to the cell operating conditions, the fundamental model can also be used for prediction purposes.
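A sketch of fitting an empirical water-recovery relation to cell data, the kind of exercise the review uses to compare models; the power-law/exponential form and all numbers below are placeholders chosen for illustration and are not the empirical models or the Neethling et al. model from the paper.

```python
# Sketch: least-squares fit of an assumed empirical water-recovery relation
# to illustrative cell operating data.
import numpy as np
from scipy.optimize import curve_fit


def empirical_model(X, a, b, c):
    j_g, froth_depth = X                      # superficial gas rate, froth depth
    return a * j_g ** b * np.exp(-c * froth_depth)


j_g = np.array([1.0, 1.2, 1.4, 1.6])              # cm/s, illustrative
froth_depth = np.array([0.20, 0.20, 0.25, 0.25])  # m, illustrative
water_recovery = np.array([0.08, 0.11, 0.12, 0.15])

params, _ = curve_fit(empirical_model, (j_g, froth_depth), water_recovery,
                      p0=(0.1, 1.0, 1.0))
print(dict(zip("abc", params)))
```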
Abstract:
Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions: Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
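An illustrative sensitivity test in the spirit described above: recomputing a summary statistic after excluding low-scoring populations and after weighting by quality score. The scores, values, and exclusion threshold are made-up examples, not MONICA data.

```python
# Sketch: quality-score-based exclusion and weighting of population-level estimates.
import numpy as np

quality_score = np.array([0.9, 0.8, 0.4, 0.95, 0.6])     # per-population quality scores
risk_factor_mean = np.array([5.2, 5.6, 6.1, 5.0, 5.8])   # e.g. mean risk factor level

overall = risk_factor_mean.mean()                          # all populations
keep = quality_score >= 0.5                                 # assumed exclusion threshold
excluding_low = risk_factor_mean[keep].mean()               # low-quality populations dropped
weighted = np.average(risk_factor_mean, weights=quality_score)  # weighted by quality

print(overall, excluding_low, weighted)
```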
Abstract:
Workflow technology has delivered effectively for a large class of business processes, providing the requisite control and monitoring functions. At the same time, this technology has been the target of much criticism due to its limited ability to cope with dynamically changing business conditions, which require business processes to be adapted frequently, and/or its limited ability to model business processes that cannot be entirely predefined. Requirements indicate the need for generic solutions in which a balance between process control and flexibility may be achieved. In this paper we present a framework that allows the workflow to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. This framework is based on the notion of process constraints. Whereas process constraints may be specified for any aspect of the workflow, such as structural or temporal aspects, our focus in this paper is on a constraint that allows dynamic selection of activities for inclusion in a given instance. We call these cardinality constraints, and this paper discusses their specification and validation requirements.
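A minimal sketch of validating a cardinality constraint of the kind the paper introduces: a workflow instance may include between a minimum and a maximum number of activities drawn from a governed pool. The activity names and the class layout are illustrative assumptions, not the paper's formal notation.

```python
# Sketch: checking a cardinality constraint against the activities selected
# for one workflow instance.
from dataclasses import dataclass


@dataclass
class CardinalityConstraint:
    pool: frozenset        # optional activities governed by this constraint
    minimum: int
    maximum: int

    def is_satisfied(self, selected: set) -> bool:
        chosen = selected & self.pool
        return self.minimum <= len(chosen) <= self.maximum


constraint = CardinalityConstraint(
    pool=frozenset({"credit_check", "manual_review", "fraud_screen"}),
    minimum=1, maximum=2)

print(constraint.is_satisfied({"credit_check", "register_claim"}))   # True
print(constraint.is_satisfied({"register_claim"}))                   # False
```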