936 results for Process control -- Data processing


Relevance:

100.00%

Publisher:

Abstract:

Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in the coronary arteries are a direct risk factor. These can be assessed with the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation dose applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used will be shown, and the epidemiology, imaging, risk factors and prognosis of the disease described. The software steps and the values that allow the risk of developing CAD to be predicted will be presented. A 64-row multidetector dual-source CT scanner and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced, and two measurements were obtained in each of the three experimental protocols (64, 128 and 256 mAs). Considerable changes appeared between the CS values as the protocol varied. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and the image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
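The Agatston method mentioned above can be summarised in a short sketch. The threshold (130 HU) and the 1-4 density weights follow the published Agatston convention; the lesion list format and function names below are our own illustration, not the study's software.

```python
# Hedged sketch of Agatston calcium scoring. Thresholds and weights follow
# the published convention; the (area, max_hu) lesion format is assumed.

def agatston_weight(max_hu):
    """Density weighting factor for a lesion's peak attenuation (in HU)."""
    if max_hu < 130:
        return 0          # below the calcium threshold: not scored
    if max_hu < 200:
        return 1
    if max_hu < 300:
        return 2
    if max_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Sum of area (mm^2) x density weight over all lesions in one scan."""
    return sum(area * agatston_weight(max_hu) for area, max_hu in lesions)

# Example: two lesions of 3 mm^2 at 250 HU and 5 mm^2 at 410 HU
print(agatston_score([(3.0, 250), (5.0, 410)]))  # 3*2 + 5*4 = 26.0
```

The resulting score is the value used to stratify the risk of developing CAD.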

Relevance:

100.00%

Publisher:

Abstract:

Field communication systems (fieldbuses) are widely used as the communication support for distributed computer-controlled systems (DCCS) in all sorts of process control and manufacturing applications. There are several advantages in using fieldbuses to replace the traditional point-to-point links between sensors/actuators and computer-based control systems, the most relevant of which is the decentralisation and distribution of processing power over the field. A widely used fieldbus is WorldFIP, which is normalised as the European standard EN 50170. When using WorldFIP to support DCCS, an important issue is how to guarantee the timing requirements of the real-time traffic. WorldFIP has very interesting mechanisms to schedule data transfers, since it explicitly distinguishes between periodic and aperiodic traffic. In this paper, we describe how WorldFIP handles these two types of traffic and, more importantly, provide a comprehensive analysis of how to guarantee the timing requirements of the real-time traffic.
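The periodic/aperiodic distinction above can be illustrated with a toy bus arbitrator table. Building the table as one macrocycle (the least common multiple of the variable periods) and serving aperiodic requests in the empty slots is the standard textbook idea; the function and variable names are ours, not the WorldFIP specification's API.

```python
# Sketch of how a WorldFIP-style bus arbitrator mixes periodic and
# aperiodic traffic. Names are illustrative; slots are abstract time units.

from math import gcd
from functools import reduce

def lcm(values):
    return reduce(lambda a, b: a * b // gcd(a, b), values)

def build_ba_table(periodic):
    """periodic: {variable_id: period_in_slots} -> list of scans per slot."""
    macrocycle = lcm(list(periodic.values()))
    table = []
    for slot in range(macrocycle):
        # a variable is polled in every slot that is a multiple of its period
        table.append([v for v, p in periodic.items() if slot % p == 0])
    return table

table = build_ba_table({"pressure": 2, "temperature": 4})
# Slots with no periodic scan are slack, usable for aperiodic requests
slack = [i for i, scans in enumerate(table) if not scans]
print(len(table), slack)  # 4 [1, 3]
```

Timing guarantees then reduce to showing that every period is respected by the table and that enough slack remains for the worst-case aperiodic load.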

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses sensor network applications that need to obtain an accurate image of physical phenomena at a high sampling rate in both time and space. We present a fast and scalable approach for obtaining an approximate representation of all sensor readings at a high sampling rate, in order to react quickly to critical events in a physical environment. This approach improves on previous work in that, after a startup phase, it can use a very small sampling period.

Relevance:

100.00%

Publisher:

Abstract:

Data analytic applications are characterized by large data sets that are subject to a series of processing phases. Some of these phases are executed sequentially, but others can be executed concurrently or in parallel on clusters, grids or clouds. The MapReduce programming model has been applied to process large data sets in cluster and cloud environments. To develop an application using MapReduce, one needs to install, configure and access specific frameworks such as Apache Hadoop or Elastic MapReduce in the Amazon cloud. It would be desirable to provide more flexibility in adjusting such configurations according to the application characteristics. Furthermore, composing the multiple phases of a data analytic application requires the specification of all the phases and their orchestration. The original MapReduce model and environment lack flexible support for such configuration and composition. Recognizing that scientific workflows have been successfully applied to modeling complex applications, this paper describes our experiments on implementing MapReduce as subworkflows in the AWARD framework (Autonomic Workflow Activities Reconfigurable and Dynamic). A text mining data analytic application is modeled as a complex workflow with multiple phases, where individual workflow nodes support MapReduce computations. As in typical MapReduce environments, the end user only needs to define the application algorithms for input data processing and for the map and reduce functions. In the paper we present experimental results obtained when using the AWARD framework to execute MapReduce workflows deployed over multiple Amazon EC2 (Elastic Compute Cloud) instances.
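The division of labour described above, where the end user supplies only the map and reduce functions, can be shown with a minimal in-process sketch of the MapReduce pattern. The driver function and names are ours; AWARD's actual subworkflow interface is not reproduced here.

```python
# Minimal in-process sketch of the MapReduce pattern: the user writes only
# map_fn and reduce_fn; the framework handles grouping by key.

from collections import defaultdict

def run_mapreduce(records, map_fn, reduce_fn):
    # Map phase: emit (key, value) pairs from each input record
    groups = defaultdict(list)
    for record in records:
        for key, value in map_fn(record):
            groups[key].append(value)
    # Reduce phase: fold each key's values into a single result
    return {key: reduce_fn(key, values) for key, values in groups.items()}

# Word count, the classic text-mining example
def map_words(line):
    return [(word, 1) for word in line.split()]

def reduce_counts(word, counts):
    return sum(counts)

result = run_mapreduce(["a b a", "b c"], map_words, reduce_counts)
print(result)  # {'a': 2, 'b': 2, 'c': 1}
```

In a workflow setting, each such map/reduce pair becomes one node of the larger multi-phase graph.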

Relevance:

100.00%

Publisher:

Abstract:

Master's in Mechanical Engineering – Specialization in Industrial Management

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, data centers are large energy consumers, and this trend is expected to increase further in the coming years given the growth in demand for cloud services. A large portion of this power consumption is due to the control of the physical parameters of the data center (such as temperature and humidity). However, these physical parameters are tightly coupled with computations, and even more so in upcoming data centers, where the location of workloads can vary substantially, for example because workloads are moved within the cloud infrastructure hosted in the data center. Therefore, managing the physical and compute infrastructure of a large data center is an embodiment of a Cyber-Physical System (CPS). In this paper, we describe a data collection and distribution architecture that enables gathering the physical parameters of a large data center at very high temporal and spatial resolution of the sensor measurements. We believe this is an important characteristic for enabling more accurate heat-flow models of the data center and, with them, finding opportunities to optimize energy consumption. Having a high-resolution picture of the data center conditions also enables minimizing local hot spots, performing more accurate predictive maintenance (failures in all infrastructure equipment can be detected more promptly) and producing more accurate billing. We detail this architecture and define the structure of the underlying messaging system used to collect and distribute the data. Finally, we show the results of a preliminary study of a typical data center radio environment.
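To make the messaging idea concrete, here is a sketch of the kind of message such a collection/distribution layer might carry: one reading per sensor, tagged with its spatial position and timestamp, so consumers can build a high-resolution picture and spot hot spots. The field names and the JSON encoding are our assumptions, not the paper's actual message format.

```python
# Sketch of a sensor-reading message and a consumer that flags hot spots.
# Field names ("sensor", "rack", "metric", ...) are illustrative only.

import json
import time

def encode_reading(sensor_id, rack, metric, value, ts=None):
    """Serialize one physical measurement for the messaging system."""
    return json.dumps({
        "sensor": sensor_id,
        "rack": rack,          # spatial position in the data center
        "metric": metric,      # e.g. "temperature_c" or "humidity_pct"
        "value": value,
        "ts": ts if ts is not None else time.time(),
    })

def hot_spots(readings, threshold):
    """Return racks whose latest temperature exceeds the threshold."""
    latest = {}
    for raw in readings:
        msg = json.loads(raw)
        if msg["metric"] == "temperature_c":
            latest[msg["rack"]] = msg["value"]
    return sorted(r for r, v in latest.items() if v > threshold)

msgs = [encode_reading("s1", "rack-03", "temperature_c", 31.5, ts=0),
        encode_reading("s2", "rack-07", "temperature_c", 24.0, ts=0)]
print(hot_spots(msgs, threshold=27.0))  # ['rack-03']
```

The same stream of messages could feed heat-flow models, predictive maintenance and billing consumers in parallel.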

Relevance:

100.00%

Publisher:

Abstract:

In this paper we study several natural and man-made complex phenomena from the perspective of dynamical systems. For each class of phenomena, the system outputs are time-series records obtained under identical conditions. The time series are viewed as manifestations of the system behavior and are processed to analyze the system dynamics. First, we use the Fourier transform to process the data and approximate the amplitude spectra by means of power law functions. We interpret the power law parameters as a phenomenological signature of the system dynamics. Second, we adopt the techniques of non-hierarchical clustering and multidimensional scaling to visualize hidden relationships between the complex phenomena. Third, we propose a vector-field-based analogy to interpret the patterns unveiled by the power law parameters.
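The first step above, fitting a power law |S(f)| ≈ c · f^q to an amplitude spectrum, is ordinarily done by linear least squares in log-log coordinates. The sketch below illustrates that fit on a synthetic spectrum; the function name and the example data are ours.

```python
# Sketch of the power-law fit of an amplitude spectrum: fit
# log(amp) = log(c) + q*log(f) by ordinary least squares.

import math

def fit_power_law(freqs, amps):
    """Returns (c, q) such that amp ~ c * f**q in the least-squares sense."""
    xs = [math.log(f) for f in freqs]
    ys = [math.log(a) for a in amps]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    q = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    c = math.exp(my - q * mx)
    return c, q

# Synthetic spectrum that follows an exact power law: amp = 2 * f**-1.5
freqs = [1.0, 2.0, 4.0, 8.0, 16.0]
amps = [2.0 * f ** -1.5 for f in freqs]
c, q = fit_power_law(freqs, amps)
print(round(c, 3), round(q, 3))  # 2.0 -1.5
```

On real data the pair (c, q) recovered per phenomenon is what clustering and multidimensional scaling then operate on.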

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented to obtain the PhD degree in Electrical and Computer Engineering - Electronics

Relevance:

100.00%

Publisher:

Abstract:

Current data mining engines are difficult to use, requiring optimization by data mining experts in order to provide optimal results. To solve this problem, a new concept was devised that maintains the functionality of current data mining tools while adding pervasive characteristics such as invisibility and ubiquity, which focus on the users and provide better ease of use and usefulness through autonomous and intelligent data mining processes. This article introduces an architecture for a data mining engine composed of four major components: database; middleware (control); middleware (processing); and interface. These components are interlinked but scale independently, allowing for a system that adapts to the user's needs. A prototype has been developed to test the architecture. The results are very promising, demonstrating the architecture's functionality as well as the need for further improvements.
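The four-component split above can be sketched as a pipeline in which the interface hides all tuning from the user, the control middleware decides autonomously what to run, and the processing middleware executes it over the database. All class and method names below are ours; the article does not give an API.

```python
# Illustrative sketch of the four-component architecture:
# database -> middleware (control) -> middleware (processing) -> interface.

class Database:
    def __init__(self, rows):
        self.rows = rows
    def query(self):
        return list(self.rows)

class ControlMiddleware:
    """Autonomously picks the processing step (here: a fixed mapping)."""
    def plan(self, task):
        return "mean" if task == "summarize" else task

class ProcessingMiddleware:
    def run(self, op, data):
        if op == "mean":
            return sum(data) / len(data)
        raise ValueError(f"unknown operation {op!r}")

class Interface:
    """What the end user sees: one call, no tuning parameters."""
    def __init__(self, db, control, processing):
        self.db, self.control, self.processing = db, control, processing
    def ask(self, task):
        op = self.control.plan(task)
        return self.processing.run(op, self.db.query())

engine = Interface(Database([2, 4, 6]), ControlMiddleware(), ProcessingMiddleware())
print(engine.ask("summarize"))  # 4.0
```

Because each component sits behind its own interface, any one of them can be scaled or replaced independently, which is the property the architecture aims for.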

Relevance:

100.00%

Publisher:

Abstract:

Real-time data acquisition is fundamental to providing appropriate services and supporting health professionals' decisions. In this paper, a pervasive, adaptive data acquisition architecture for medical devices (e.g. vital signs monitors, ventilators and sensors) is presented. The architecture was deployed in a real context, an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network. The paper shows the ventilation acquisition process as an example. The clients are installed on a machine near the patient's bed and connected to the ventilators, and the monitored data are sent to a multithreaded server which records them in the database using Health Level Seven (HL7) protocols. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant system that ensures the data are stored in the database even if the agents are disconnected. The gateway is pervasive, universal and interoperable, and it is able to adapt to any service using streaming data.
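The fault-tolerance idea above, that no reading is lost while the database is unreachable, is commonly implemented with a local store-and-forward buffer in each agent. The sketch below shows that pattern; the class names and the in-memory stand-in for the database are our assumptions, not INTCare's implementation.

```python
# Sketch of a store-and-forward gateway agent: readings are buffered
# locally and flushed to the database whenever it is reachable.

class FlakyStore:
    """Stand-in for the database: may be temporarily unavailable."""
    def __init__(self):
        self.online = True
        self.saved = []
    def write(self, record):
        if not self.online:
            raise ConnectionError("database unreachable")
        self.saved.append(record)

class GatewayAgent:
    def __init__(self, store):
        self.store = store
        self.buffer = []          # local queue of unsent readings
    def collect(self, record):
        self.buffer.append(record)
        self.flush()
    def flush(self):
        while self.buffer:
            try:
                self.store.write(self.buffer[0])
            except ConnectionError:
                return            # keep buffering; retry on next collect
            self.buffer.pop(0)

store = FlakyStore()
agent = GatewayAgent(store)
agent.collect({"hr": 72})
store.online = False
agent.collect({"hr": 75})        # buffered, not lost
store.online = True
agent.collect({"hr": 78})        # flushes the backlog too
print(len(store.saved), len(agent.buffer))  # 3 0
```

In the real architecture each buffered record would be an HL7 message rather than a plain dictionary.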

Relevance:

100.00%

Publisher:

Abstract:

In this project, advanced control strategies were developed for urban wastewater treatment plants that jointly remove organic matter, nitrogen and phosphorus. The strategies were based on a multivariable study of the system behaviour, which provided the basis for using feedforward control loops, predictive control and a cost controller that automatically sent the most appropriate setpoints to the process controllers. To develop the strategies, a virtual simulation system (simulator) of treatment plants was created, based on data from the literature. For the case of a real plant, a simulator of the Manresa plant (Catalonia) was developed. However, the Manresa system was used exclusively to help the plant engineers make decisions on configuration changes so that phosphorus removal takes place via the biological route rather than the chemical route. The implementation of the simulators made it possible to run many tests that, in a real plant, would demand a great deal of time and consume substantial energy and financial resources. The most elaborate control strategies were able to save up to 150,000.00 euros per year compared with operating the plant without automatic control. As for the studies of the real plant model, it was concluded that biological phosphorus removal can replace the current chemical phosphorus removal process, lowering operating costs (costs of the precipitating agent).
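The feedforward-plus-feedback idea behind the strategies above can be sketched in a few lines: a feedforward term anticipates a measured disturbance (e.g. the influent load) while a PI term corrects the residual error against the setpoint sent down by the cost controller. The gains, variable names and the dissolved-oxygen example are illustrative only, not the project's tuned controllers.

```python
# Sketch of a feedforward + PI controller. Gains and names are assumptions.

class FeedforwardPI:
    def __init__(self, kp, ki, kff, setpoint):
        self.kp, self.ki, self.kff = kp, ki, kff
        self.setpoint = setpoint
        self.integral = 0.0

    def update(self, measurement, disturbance, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        # control action = feedback on the error + feedforward on the load
        return self.kp * error + self.ki * self.integral + self.kff * disturbance

# E.g. dissolved oxygen regulation with the influent load as disturbance
ctrl = FeedforwardPI(kp=2.0, ki=0.1, kff=0.5, setpoint=2.0)
u = ctrl.update(measurement=1.5, disturbance=4.0, dt=1.0)
print(u)  # 2.0*0.5 + 0.1*0.5 + 0.5*4.0 = 3.05
```

A supervisory cost controller would periodically recompute `setpoint` so that effluent limits are met at minimum operating cost.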

Relevance:

100.00%

Publisher:

Abstract:

Study and design of the implementation of an ERP (Enterprise Resource Planning) system in a dried fruit and nut factory.

Relevance:

100.00%

Publisher:

Abstract:

Design of warehouse management software that records incoming goods, outgoing goods and the other operations typical of warehouses. The software must be scalable and durable over time, and must support update, deletion and data-insertion operations as well as the fundamental query operations.

Relevance:

100.00%

Publisher:

Abstract:

Construction of a web application from the specifications of an imaginary client. Study and use of the Rational Unified Process, currently the most common method in software construction. Design of a database and implementation of the logical model using a leading database management system on the market, such as Oracle.

Relevance:

100.00%

Publisher:

Abstract:

The final objective of this work is to obtain a system that allows access to certain network services to be regulated easily and flexibly for a set of users who have different needs and network-usage attributes depending on their profile and on the moment.