902 results for Distributed computer-controlled systems


Relevance:

50.00%

Publisher:

Abstract:

Many systems and applications continuously produce events. These events record the status of the system and trace its behavior. By examining them, system administrators can check for potential problems. If the temporal dynamics of the systems are further investigated, underlying patterns can be discovered, and the uncovered knowledge can be leveraged to predict future system behavior or to mitigate potential risks. Moreover, system administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more useful. Despite recent advances in data mining techniques, their application to system event mining is still at a rudimentary stage. Most work still focuses on episode mining or frequent pattern discovery. These methods cannot provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective, and they provide little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From a data mining perspective, three correlated directions are considered helpful for system management: (1) provide concise yet comprehensive summaries of the running status of the systems; (2) make the systems more intelligent and autonomous; (3) effectively detect abnormal system behavior. Thanks to the richness of the event logs, all of these directions can be pursued in a data-driven manner; in this way, the robustness of the systems can be enhanced and the goal of autonomous management approached. This dissertation focuses on these directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness and efficiency of the corresponding solutions.
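The dissertation's streaming anomaly detector is not described in the abstract; as a minimal sketch of the general idea (flagging time buckets whose event rate deviates sharply from a sliding-window baseline), one might write the following, where the window size and threshold are illustrative assumptions:

```python
from collections import deque
import math

def stream_anomalies(event_counts, window=60, threshold=3.0):
    """Toy z-score detector over a stream of per-bucket event counts:
    flag buckets deviating from the sliding-window mean by more than
    `threshold` standard deviations. Not the dissertation's algorithm."""
    history = deque(maxlen=window)
    for t, count in enumerate(event_counts):
        if len(history) == window:
            mean = sum(history) / window
            std = math.sqrt(sum((x - mean) ** 2 for x in history) / window) or 1.0
            if abs(count - mean) / std > threshold:
                yield t, count  # anomalous bucket
        history.append(count)
```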

Relevance:

50.00%

Publisher:

Abstract:

The future power grid will effectively utilize renewable energy resources and distributed generation to respond to energy demand while incorporating information technology and communication infrastructure for optimum operation. This dissertation contributes to the development of real-time techniques for wide-area monitoring and secure real-time control and operation of hybrid power systems. To handle the increased level of real-time data exchange, this dissertation develops a supervisory control and data acquisition (SCADA) system equipped with a state estimation scheme driven by the real-time data. The system is verified on a specially developed laboratory-based test bed facility, serving as a hardware and software platform, that emulates the scenarios of a real hybrid power system with the highest practical level of similarity and capability relative to utility systems. It includes phasor measurements at hundreds of measurement points on the system, obtained from a specially developed laboratory-based Phasor Measurement Unit (PMU) used alongside existing commercial PMUs on the interconnected system. The studies conducted included a new technique for detecting partially islanded microgrids, as well as several real-time techniques for synchronization and parameter identification of hybrid systems. Moreover, given the extensive integration of renewable energy resources through DC microgrids, this dissertation examines several practical cases for improving the interoperability of such systems. The increasing number of small, dispersed generating stations and their need to connect quickly and properly to the AC grid also led this work to explore the challenges that arise in synchronizing generators to the grid, and to introduce a Dynamic Brake system to improve the process of connecting distributed generators to the power grid. Real-time operation and control require data communication security. A research effort in this dissertation therefore developed a Trusted Sensing Base (TSB) process for data communication security. The innovative TSB approach improves the security of the power grid as a cyber-physical system; it is based on available GPS synchronization technology and provides protection against confidentiality attacks on critical power system infrastructures.
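The abstract does not detail the state estimation scheme; a minimal sketch of the textbook weighted-least-squares estimator commonly used with SCADA/PMU measurements follows. The measurement matrix, measurements, and noise levels below are illustrative assumptions; a real estimator would derive H from the network model.

```python
import numpy as np

def wls_state_estimate(H, z, sigma):
    """Weighted least squares: x_hat = (H^T W H)^-1 H^T W z,
    where H maps states to measurements, z holds the measurements,
    and sigma their standard deviations (accurate sensors weigh more)."""
    W = np.diag(1.0 / sigma**2)
    G = H.T @ W @ H                      # gain matrix
    return np.linalg.solve(G, H.T @ W @ z)

# Toy system: three measurements of two state variables.
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]])
z = np.array([1.02, 0.98, 0.05])
sigma = np.array([0.01, 0.01, 0.02])
print(wls_state_estimate(H, z, sigma))
```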

Relevance:

50.00%

Publisher:

Abstract:

Distributed Generation (DG) from alternate sources and smart grid technologies represent good solutions for meeting increasing energy demands. Deploying these DG assets requires solutions to the new technical challenges that accompany their integration and interconnection into operational power systems. A DG infrastructure comprising alternate energy sources in addition to conventional sources was developed as a test bed. The test bed is operated by synchronizing wind, photovoltaic, fuel cell, micro-generator, and energy storage assets, in addition to standard AC generators. The connectivity of these DG assets is tested for viability and for their operational characteristics. Control and communication layers for dynamic operation are developed to improve the connectivity of alternate sources to the power system. A real-time application for the operation of alternate sources in microgrids is developed: a multi-agent approach is utilized to improve stability, and sequences of actions for black start are implemented, as sketched below. Experiments on control and stability issues related to dynamic operation under load conditions have been conducted and verified.
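The black-start sequencing itself is not specified in the abstract; as a minimal illustrative sketch (the steps and checks are assumptions, not the test bed's actual procedure), an agent might walk through a fixed restoration sequence, verifying each step before moving on:

```python
# Illustrative black-start sequence: establish a reference source first,
# then bring in the remaining assets, loads last.
BLACK_START_SEQUENCE = [
    "start_energy_storage",        # voltage/frequency reference
    "energize_local_bus",
    "synchronize_micro_generator",
    "connect_pv_and_wind",         # intermittent sources once the bus is stable
    "restore_loads",
]

def run_black_start(execute_step, verify_step):
    """Execute each restoration step; stop and report the step that
    fails verification, otherwise confirm a completed black start."""
    for step in BLACK_START_SEQUENCE:
        execute_step(step)
        if not verify_step(step):
            return False, step
    return True, None
```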

Relevance:

50.00%

Publisher:

Abstract:

Distributed computing frameworks belong to a class of programming models that allow developers to launch workloads on large clusters of machines. Due to the dramatic increase in the volume of data gathered by ubiquitous computing devices, data analytic workloads have become a common case among distributed computing applications, making Data Science an entire field of Computer Science. We argue that a data scientist's concern lies in three main components: a dataset, a sequence of operations they wish to apply on this dataset, and some constraints related to their work (performance, QoS, budget, etc.). However, performing data science without domain expertise is extremely difficult: one needs to select the right amount and type of resources, pick a framework, and configure it. Moreover, users often run their applications in shared environments, ruled by schedulers that expect them to specify their resource needs precisely. Inherent to the distributed and concurrent nature of these frameworks, monitoring and profiling are hard, high-dimensional problems that keep users from making the right configuration choices and determining the right amount of resources they need. Paradoxically, the system gathers a large amount of monitoring data at runtime, which remains unused.

In the ideal abstraction we envision for data scientists, the system is adaptive, able to exploit monitoring data to learn about workloads and to turn user requests into a tailored execution context. In this work, we study different techniques that have been used to take steps toward such system awareness, and we explore a new way to do so by applying machine learning techniques to recommend a specific subset of system configurations for Apache Spark applications. Furthermore, we present an in-depth study of Apache Spark executor configuration, which highlights the complexity of choosing the best one for a given workload.
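The recommendation technique itself is not detailed in the abstract; a minimal sketch of the general idea, reusing the best-known configuration of the most similar past run, might look like the following. The workload features, feature values, and stored configurations are all illustrative assumptions (only the `spark.executor.*` property names are real Spark settings):

```python
import numpy as np

# Assumed history of past Spark runs: workload features
# (input GB, shuffle ratio, task parallelism) and the executor
# configuration that performed best on each.
features = np.array([
    [10.0,  0.1,  64],
    [200.0, 0.8, 512],
    [50.0,  0.4, 128],
])
best_configs = [
    {"spark.executor.memory": "4g",  "spark.executor.cores": 2},
    {"spark.executor.memory": "16g", "spark.executor.cores": 5},
    {"spark.executor.memory": "8g",  "spark.executor.cores": 4},
]

def recommend_config(workload):
    """1-nearest-neighbour recommendation: scale the features,
    then reuse the configuration of the most similar past run."""
    scale = features.max(axis=0)
    dists = np.linalg.norm(features / scale - np.asarray(workload) / scale, axis=1)
    return best_configs[int(np.argmin(dists))]

print(recommend_config([80.0, 0.5, 256]))
```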

Relevance:

50.00%

Publisher:

Abstract:

Over the last decade, there has been a trend where water utility companies aim to make water distribution networks more intelligent in order to improve their quality of service, reduce water waste, minimize maintenance costs etc., by incorporating IoT technologies. Current state of the art solutions use expensive power hungry deployments to monitor and transmit water network states periodically in order to detect anomalous behaviors such as water leakage and bursts. However, more than 97% of water network assets are remote away from power and are often in geographically remote underpopulated areas, facts that make current approaches unsuitable for next generation more dynamic adaptive water networks. Battery-driven wireless sensor/actuator based solutions are theoretically the perfect choice to support next generation water distribution. In this paper, we present an end-to-end water leak localization system, which exploits edge processing and enables the use of battery-driven sensor nodes. Our system combines a lightweight edge anomaly detection algorithm based on compression rates and an efficient localization algorithm based on graph theory. The edge anomaly detection and localization elements of the systems produce a timely and accurate localization result and reduce the communication by 99% compared to the traditional periodic communication. We evaluated our schemes by deploying non-intrusive sensors measuring vibrational data on a real-world water test rig that have had controlled leakage and burst scenarios implemented.
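The paper describes its detector only as "based on compression rates"; the intuition is that a leak changes the structure of the vibration signal, and with it how well a window of samples compresses. A minimal sketch of that idea follows, where the sample encoding, window size, and threshold are illustrative assumptions:

```python
import zlib

def compression_ratio(samples):
    """Compressed-to-raw size ratio for a window of integer vibration
    samples; a cheap proxy for how structured the signal is."""
    raw = b"".join(int(s).to_bytes(2, "big", signed=True) for s in samples)
    return len(zlib.compress(raw)) / len(raw)

def is_anomalous(window, baseline_ratio, tolerance=0.15):
    """Flag a window whose compression ratio drifts from the learned
    baseline by more than an assumed tolerance."""
    return abs(compression_ratio(window) - baseline_ratio) > tolerance
```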

Relevance:

50.00%

Publisher:

Abstract:

Surgical interventions are usually performed in an operating room; however, the medical team's access to information during the intervention is limited. In conversations with medical staff, we observed that they attach significant importance to improving direct access to information and communication via real-time queries during the procedure, because the current process is rather slow and there is a lack of interaction with the systems in the operating room. These systems can be integrated on the Cloud, adding new functionalities to the existing systems in which the medical records are processed. Therefore, such a communication system needs to be built upon information and interaction access specifically designed and developed to aid medical specialists. Copyright 2014 ACM.

Relevance:

50.00%

Publisher:

Abstract:

Sustainability and responsible environmental behaviour constitute a vital premise for the development of humankind. Indeed, during recent decades the global energy scenario has been evolving towards a scheme in which Renewable Energy Sources (RES) such as photovoltaic, wind, biomass, and hydrogen are increasingly relevant. Furthermore, hydrogen is an energy carrier that constitutes a means for long-term energy storage, and the integration of hydrogen with local RES contributes to distributed power generation and the early introduction of a hydrogen economy. The intermittent nature of many RES, for instance solar and wind sources, imposes the development of a management and control strategy to overcome this drawback. This strategy is responsible for providing reliable, stable, and efficient operation of the system, and implementing it requires a monitoring system. The present paper aims to contribute to experimentally validating LabVIEW as a valuable tool for developing monitoring platforms in the field of RES-based facilities. To this end, a set of real systems successfully monitored with it is presented.

Relevance:

40.00%

Publisher:

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, used in the planning, design, operation, and control of buildings, transportation, machinery, businesses, and life-sustaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to decreased reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of their good adaptation capability, these programs work just as vaccines do against diseases and are not able to prevent new infections based on the network state. Here, a trial at modeling the dynamics of computer virus propagation relates it to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the propagation dynamics of a new virus by using data collected from viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
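Of the two identification techniques named, the autoregressive one is the simpler to sketch: fit an AR(p) model to one virus's infection time series by least squares and iterate it forward to forecast a new outbreak. The model order and any input data are illustrative assumptions:

```python
import numpy as np

def fit_ar(series, p=3):
    """Least-squares fit of an AR(p) model
    x[t] = a1*x[t-1] + ... + ap*x[t-p]; returns (a1, ..., ap)."""
    series = np.asarray(series, dtype=float)
    X = np.column_stack([series[p - k - 1:len(series) - k - 1] for k in range(p)])
    coeffs, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
    return coeffs

def forecast(series, coeffs, steps=5):
    """Iterate the fitted model forward `steps` points."""
    hist, p = list(series), len(coeffs)
    for _ in range(steps):
        hist.append(float(np.dot(coeffs, hist[-1:-p - 1:-1])))
    return hist[-steps:]
```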

Relevance:

40.00%

Publisher:

Abstract:

Wireless Sensor Networks (WSNs) have a vast range of applications, including deployment in hostile environments, so the adoption of security mechanisms is fundamental. However, the extremely constrained nature of sensors and the potentially dynamic behavior of WSNs hinder the use of key management mechanisms commonly applied in modern networks. For this reason, many lightweight key management solutions have been proposed to overcome these constraints. In this paper, we review the state of the art of these solutions and evaluate them based on metrics suitable for WSNs. We focus on pre-distribution schemes well adapted to homogeneous networks (since this is the more general network organization), identifying generic features that can improve some of these metrics. We also discuss some challenges in the area and future research directions. (C) 2010 Elsevier B.V. All rights reserved.
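As a minimal sketch of the classic random key pre-distribution idea the surveyed schemes build on (in the spirit of the Eschenauer-Gligor scheme; the pool and ring sizes here are illustrative): each node is pre-loaded with a random subset of a large key pool, and two neighbours can secure a link exactly when their subsets intersect.

```python
import random

POOL_SIZE = 10_000   # global key pool (illustrative)
RING_SIZE = 100      # keys pre-loaded per node (illustrative)

def make_key_ring(rng):
    """Pre-load a node with a random subset of key IDs from the pool."""
    return set(rng.sample(range(POOL_SIZE), RING_SIZE))

def shared_key(ring_a, ring_b):
    """Return a common key ID for the link, or None. Connectivity is
    probabilistic: that trade-off is the heart of such schemes."""
    common = ring_a & ring_b
    return min(common) if common else None

rng = random.Random(42)
print(shared_key(make_key_ring(rng), make_key_ring(rng)))
```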

Relevance:

40.00%

Publisher:

Abstract:

The TCP/IP architecture has been consolidated as the standard for distributed systems. However, there is considerable research and discussion about alternatives for the evolution of this architecture, and in this study area the present work introduces the Title Model, which contributes to supporting application needs through the use of a cross-layer ontology and horizontal addressing in a next-generation Internet. From a practical viewpoint, the network cost reduction is shown for a distributed programming example in networks with layer 2 connectivity. To demonstrate the Title Model's enhancement, a network analysis is presented for a message passing interface application that sends a vector of integers and returns its sum. This analysis confirms that, in this environment, the current proposal allows a reduction of 15.23% in total network traffic, measured in bytes.
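The benchmark workload described (send a vector of integers, return its sum) is a classic MPI reduction; a minimal mpi4py sketch of that workload over standard MPI follows. The Title Model transport itself is not sketched here, and the vector size is an assumption:

```python
# Run with, e.g.: mpiexec -n 4 python sum_vector.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Root scatters equal-sized chunks of the integer vector.
CHUNK = 1_000
data = np.arange(CHUNK * comm.Get_size(), dtype="i") if rank == 0 else None
local = np.empty(CHUNK, dtype="i")
comm.Scatter(data, local, root=0)

# Each rank sums its chunk; the reduction returns the global sum to root.
total = comm.reduce(int(local.sum()), op=MPI.SUM, root=0)
if rank == 0:
    print("vector sum:", total)
```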

Relevance:

40.00%

Publisher:

Abstract:

In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was originally designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numeric integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under both fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
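The Verhulst equation dp/dt = r p (1 - p/P), with the carrying capacity P read as the power that meets the target SINR, discretizes under Euler integration to an update of the form p[k+1] = p[k] (1 + alpha (1 - sinr[k]/target)). A minimal single-user sketch under that assumed form (the channel gain, noise floor, and alpha are illustrative, and this is not necessarily the paper's exact recursion):

```python
def verhulst_power_update(p, sinr, sinr_target, alpha=0.5):
    """One Euler step of a Verhulst-style power update: transmit power
    grows while SINR is below target and settles once sinr == target."""
    return p * (1.0 + alpha * (1.0 - sinr / sinr_target))

# Toy single-user loop with a fixed channel gain and noise floor.
gain, noise, target = 0.01, 1e-3, 5.0
p = 0.1
for _ in range(40):
    sinr = gain * p / noise
    p = verhulst_power_update(p, sinr, target)
print(gain * p / noise)  # approaches the target SINR of 5.0
```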

Relevance:

40.00%

Publisher:

Abstract:

The finite element method is used to simulate coupled problems describing the related physical and chemical processes of ore body formation and mineralization in geological and geochemical systems. The main purpose of this paper is to illustrate simulation results for different types of modelling problems in pore-fluid-saturated rock masses. The simulation results presented here aim to: (1) provide a better understanding of the processes and mechanisms of ore body formation and mineralization in the upper crust of the Earth; (2) demonstrate the usefulness and applicability of the finite element method in dealing with a wide range of coupled problems in geological and geochemical systems; and (3) qualitatively establish a set of showcase problems against which any numerical method and computer package can be reasonably validated. (C) 2002 Published by Elsevier Science B.V.

Relevance:

40.00%

Publisher:

Abstract:

The aim of this work was to exemplify the specific contribution of both two- and three-dimensional (3D) X-ray computed tomography to characterizing earthworm burrow systems. To achieve this purpose we used 3D mathematical morphology operators to characterize burrow systems resulting from the activity of an anecic species (Aporrectodea nocturna) and an endogeic species (Allolobophora chlorotica), when the two species were introduced either separately or together into artificial soil cores. Images of these soil cores were obtained using a medical X-ray tomography scanner, and three-dimensional reconstructions of the burrow systems were produced using a specifically developed segmentation algorithm. To study the differences between burrow systems, a set of classical tools of mathematical morphology (granulometries) was used. Granulometries based on different structuring elements clearly separated the different burrow systems. They enabled us to show that burrows made by the anecic species were fatter, longer, more vertical, and more continuous, but less sinuous, than burrows of the endogeic species. The granulometry transform of the soil matrix showed that burrows made by A. nocturna were more evenly distributed than those of A. chlorotica. Although good discrimination was possible when only one species was introduced into a soil core, it was not possible to separate the burrows of the two species from each other when both were introduced into the same soil core. This limitation, partly due to the insufficient spatial resolution of the medical scanner, precluded the use of the morphological operators to study putative interactions between the two species.
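A granulometry measures how much of a binary structure survives morphological openings of increasing size, which is what separates fat anecic burrows from thin endogeic ones. A minimal sketch with scipy follows; the toy volume is a stand-in, and iterated openings approximate structuring elements of growing radius:

```python
import numpy as np
from scipy import ndimage

def granulometry(binary_volume, max_size=5):
    """Granulometry curve: voxels remaining after opening with
    structuring elements of growing size; fat structures persist."""
    se = ndimage.generate_binary_structure(binary_volume.ndim, 1)
    return [int(ndimage.binary_opening(binary_volume, structure=se,
                                       iterations=r).sum())
            for r in range(1, max_size + 1)]

volume = np.zeros((20, 20, 20), dtype=bool)
volume[5:15, 5:15, 8:12] = True    # a fat slab as a stand-in "burrow"
print(granulometry(volume))        # decays slowly for fat structures
```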

Relevance:

40.00%

Publisher:

Abstract:

Background: Heart failure and diabetes often occur simultaneously in patients, but the prognostic value of glycemia in chronic heart failure is debatable. We evaluated the role of glycemia in the prognosis of heart failure. Methods: Outpatients with chronic heart failure from the Long-term Prospective Randomized Controlled Study Using Repetitive Education at Six-Month Intervals and Monitoring for Adherence in Heart Failure Outpatients (REMADHE) trial were grouped according to the presence of diabetes and level of glycemia. All-cause mortality/heart transplantation and unplanned hospital admission were evaluated. Results: Four hundred fifty-six patients were included (135 [29.5%] female, 124 [27.2%] with diabetes mellitus, age ~50.2 ± 11.4 years, left-ventricle ejection fraction 34.7% ± 10.5%). During follow-up (3.6 ± 2.2 years), 27 (5.9%) patients underwent heart transplantation and 202 (44.2%) died; survival was similar in patients with and without diabetes mellitus. When patients with and without diabetes were categorized according to glucose range (glycemia ≤ 100 mg/dL [5.5 mmol/L]), as well as when distributed into quintiles of glucose, survival was significantly worse among patients with lower levels of glycemia. This finding persisted in a Cox proportional hazards regression model that included gender, etiology, left-ventricle ejection fraction, left-ventricle diastolic diameter, creatinine level, beta-blocker therapy, and functional status (hazard ratio 1.45, 95% CI 1.09-1.69, P = .039). No difference regarding unplanned hospital admission was found. Conclusion: We report an inverse association between glycemia and mortality in outpatients with chronic heart failure. These results point to a new pathophysiologic understanding of the interactions between diabetes mellitus, hyperglycemia, and heart disease. (Am Heart J 2010; 159: 90-7.)