944 results for Monitoring Systems
Abstract:
Heterogeneity must be taken into account when integrating a set of existing information sources, which nowadays are often based on Service-Oriented Architectures (SOA), into a distributed information system. This applies in particular to distributed services such as event monitoring, which are useful in the context of Event-Driven Architectures (EDA) and Complex Event Processing (CEP). Web services deal with this heterogeneity at a technical level but provide little support for event processing. Our central thesis is that such a fully generic solution cannot provide complete support for event monitoring; instead, source-specific semantics, such as particular event types or support for particular event monitoring techniques, have to be taken into account. Our core result is the design of a configurable event monitoring (Web) service that allows us to trade genericity for the exploitation of source-specific characteristics. It thus delivers results for the areas of SOA, Web services, CEP and EDA.
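The idea of a monitoring service configured with source-specific event types can be illustrated with a minimal sketch. This is not the paper's design; the event type name and predicate below are invented for illustration.

```python
# Minimal sketch of a configurable event monitor: the service is configured
# with the source-specific event types it should handle, trading genericity
# for source-specific semantics. Names ("TempReading") are illustrative.
from dataclasses import dataclass, field

@dataclass
class Event:
    type: str                              # source-specific event type
    payload: dict = field(default_factory=dict)

class EventMonitor:
    def __init__(self, subscriptions):
        # subscriptions: {event_type: predicate(payload) -> bool}
        self.subscriptions = subscriptions
        self.matches = []

    def publish(self, event):
        pred = self.subscriptions.get(event.type)
        if pred is not None and pred(event.payload):
            self.matches.append(event)

monitor = EventMonitor({"TempReading": lambda p: p.get("celsius", 0) > 80})
monitor.publish(Event("TempReading", {"celsius": 95}))
monitor.publish(Event("TempReading", {"celsius": 20}))
monitor.publish(Event("Heartbeat"))
print(len(monitor.matches))  # only the 95-degree reading matches
```

Unknown event types ("Heartbeat") are simply ignored, which is where a generic solution would need extra machinery.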
Abstract:
Sustainability and responsible environmental behaviour are a vital premise for the development of humankind. In fact, over recent decades the global energy scenario has been evolving towards a scheme in which Renewable Energy Sources (RES) such as photovoltaics, wind, biomass and hydrogen play an increasingly relevant role. Furthermore, hydrogen is an energy carrier that constitutes a means for long-term energy storage. The integration of hydrogen with local RES contributes to distributed power generation and to the early introduction of a hydrogen economy. The intermittent nature of many RES, for instance solar and wind sources, imposes the development of a management and control strategy to overcome this drawback. This strategy is responsible for providing reliable, stable and efficient operation of the system, and implementing it requires a monitoring system. The present paper aims to contribute to the experimental validation of LabVIEW as a valuable tool for developing monitoring platforms for RES-based facilities. To this aim, a set of real systems that were successfully monitored is presented.
Abstract:
The convergence of recent developments in sensing technologies, data science, signal processing and advanced modelling has fostered a new paradigm for the Structural Health Monitoring (SHM) of engineered structures, one based on intelligent sensors, i.e., embedded devices capable of processing data streams and/or performing structural inference in a self-contained, near-sensor manner. To efficiently exploit these intelligent sensor units for full-scale structural assessment, a joint effort is required to deal with the instrumental aspects related to signal acquisition, conditioning and digitalization, and with those pertaining to data management, data analytics and information sharing. In this framework, the main goal of this Thesis is to tackle the multi-faceted nature of the monitoring process via a full-scale optimization of the hardware and software resources involved in the SHM system. The pursuit of this objective has required the investigation of both: i) transversal aspects common to multiple application domains at different abstraction levels (such as knowledge distillation, networking solutions, and microsystem HW architectures), and ii) the specificities of the monitoring methodologies (vibration, guided-wave, and acoustic emission monitoring). The key tools adopted in the proposed monitoring frameworks belong to the embedded signal processing field: namely, graph signal processing, compressed sensing, ARMA system identification, digital data communication and TinyML.
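One of the tools named above, compressed sensing, can be sketched compactly: a sparse signal is recovered from fewer measurements than samples via iterative soft-thresholding (ISTA). The dimensions, signal, and parameters below are illustrative assumptions, not taken from the thesis.

```python
# Compressed sensing recovery via ISTA (iterative soft-thresholding):
# recover a k-sparse length-n signal from m < n random linear measurements.
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 25                            # signal length, number of measurements
x_true = np.zeros(n)
x_true[[5, 17, 40]] = [1.0, -1.0, 0.5]   # 3-sparse signal (illustrative)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                           # compressed measurements

def ista(A, y, lam=0.01, iters=2000):
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x + (A.T @ (y - A @ x)) / L  # gradient step on 0.5*||y - Ax||^2
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return x

x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true))    # small recovery error
```

In an embedded SHM pipeline, the sensing matrix lives on the node (cheap random projections) while the costlier reconstruction runs downstream.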
Abstract:
Modern scientific discoveries are driven by an insatiable demand for computational resources. High-Performance Computing (HPC) systems aggregate computing power to deliver considerably higher performance than a typical desktop computer can provide, in order to solve large problems in science, engineering, or business. An HPC room in a datacenter is a complex controlled environment that hosts thousands of computing nodes consuming electrical power in the range of megawatts, all of which is ultimately transformed into heat. Although a datacenter contains sophisticated cooling systems, our studies provide quantitative evidence of thermal bottlenecks in real-life production workloads, showing significant spatial and temporal heterogeneity in temperature and power. Minor thermal issues or anomalies can therefore start a chain of events that leads to an imbalance between the heat generated by the computing nodes and the heat removed by the cooling system, giving rise to thermal hazards. Although thermal anomalies are rare events, detecting or predicting them in time is vital to avoid damage to IT and facility equipment and outages of the datacenter, with severe societal and business losses. For this reason, automated approaches to detect thermal anomalies in datacenters have considerable potential. This thesis analyzes and characterizes the power and thermal behaviour of a Tier-0 datacenter (CINECA) during production and under abnormal thermal conditions. A Deep Learning (DL)-powered thermal hazard prediction framework is then proposed. The proposed models are validated against real thermal hazard events reported for the studied HPC cluster while in production. To the best of my knowledge, this thesis is the first empirical study of thermal anomaly detection and prediction techniques on a real large-scale HPC system.
For this thesis, I used a large-scale dataset comprising monitoring data from tens of thousands of sensors, collected over around 24 months at a sampling interval of around 20 seconds.
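The thesis relies on Deep Learning models; purely as an illustrative baseline, thermal anomaly detection on a single sensor stream can be sketched with a rolling mean/standard-deviation ("z-score") test. Window size, threshold, and the data are assumptions, not from the work.

```python
# Baseline thermal anomaly detector: flag a reading that deviates strongly
# from the rolling statistics of its own recent history.
import statistics
from collections import deque

def detect_anomalies(readings, window=20, z_thresh=4.0):
    """Return indices whose reading deviates > z_thresh stddevs from history."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mean = statistics.fmean(history)
            std = statistics.pstdev(history) or 1e-9  # avoid divide-by-zero
            if abs(value - mean) / std > z_thresh:
                anomalies.append(i)
        history.append(value)
    return anomalies

# Stable inlet temperature around 25 C with one injected thermal spike
stream = [25.0 + 0.1 * ((i * 7) % 5) for i in range(60)]
stream[45] = 40.0                      # simulated hazard
print(detect_anomalies(stream))        # -> [45]
```

A learned model earns its keep over such a baseline exactly where the thesis operates: correlated multi-sensor streams and prediction ahead of the event rather than detection at it.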
Abstract:
Protected crop production is a modern and innovative approach to cultivating plants in a controlled environment in order to optimize growth, yield, and quality. This method uses structures such as greenhouses or tunnels to create a sheltered environment. These productive solutions are characterized by careful regulation of variables such as temperature, humidity, light, and ventilation, which collectively create an optimal microclimate for plant growth. Heating, cooling, and ventilation systems maintain optimal conditions for plant growth regardless of external weather fluctuations. Protected crop production plays a crucial role in addressing the challenges posed by climate variability, population growth, and food security. Similarly, animal husbandry involves providing adequate nutrition, housing, medical care and environmental conditions to ensure animal welfare. Sustainability is a critical consideration in all forms of agriculture, including protected crop and animal production. Sustainability in animal production refers to producing animal products in a way that minimizes negative environmental impacts, promotes animal welfare, and ensures the long-term viability of the industry. The research activities performed during the PhD thus fall squarely within the field of Precision Agriculture and Livestock Farming. The focus here is on the computational fluid dynamics (CFD) approach and on environmental assessment, applied to improve yield, resource efficiency, environmental sustainability, and cost savings. This represents a significant shift from traditional farming methods to a more technology-driven, data-driven, and environmentally conscious approach to crop and animal production.
On one side, CFD is a powerful and precise technique for computer modelling and simulation of airflows and thermo-hygrometric parameters, which has been applied to optimize the growth environment of crops and the efficiency of ventilation in pig barns. On the other side, the sustainability aspect has been investigated in terms of Life Cycle Assessment analyses.
Abstract:
The Healthy Cities and Agenda 21 programs improve living and health conditions and affect social and economic determinants of health. The Millennium Development Goals (MDG) indicators can be used to assess the impact of social agendas. A data search was carried out for the period 1997 to 2006 to obtain 48 indicators proposed by the United Nations and a further 74 proposed by the technical group for the MDG in Brazil. There is a scarcity of studies concerned with assessing the MDG at the municipal level. Data from Brazilian health information systems are not always consistent or accurate for municipalities. The lack of available and reliable data led to the substitution of some indicators. The information systems did not always provide annual data; national household surveys could not be disaggregated at the municipal level; and conceptual definitions were modified over time. As a result, the project created an alternative list of 29 indicators. MDG monitoring at the local community level can be important for measuring the performance of actions toward improvements in quality of life and the reduction of social inequities.
Abstract:
Background: The Borg Scale may be a useful tool for heart failure patients to self-monitor and self-regulate exercise on land or in water (hydrotherapy) by maintaining the heart rate (HR) between the anaerobic threshold and the respiratory compensation point. Methods and Results: Patients performed a cardiopulmonary exercise test to determine their anaerobic threshold and respiratory compensation points. The mean HR during the exercise session was expressed as a percentage of the anaerobic threshold HR (%EHR-AT), of the respiratory compensation point HR (%EHR-RCP), of the peak HR from the exercise test (%EHR-Peak) and of the maximum predicted HR (%EHR-Predicted). Next, patients were randomized into the land or water exercise group. One blinded investigator instructed the patients in each group to exercise at a level between "relatively easy and slightly tiring". The mean HR throughout the 30-min exercise session was recorded. The %EHR-AT and %EHR-Predicted did not differ between the land and water exercise groups, but the groups differed in the %EHR-RCP (95 +/- 7 vs. 86 +/- 7, P<0.001) and in the %EHR-Peak (85 +/- 8 vs. 78 +/- 9, P=0.007). Conclusions: Exercise guided by the Borg scale maintains the patient's HR between the anaerobic threshold and the respiratory compensation point (i.e., in the exercise training zone). (Circ J 2009; 73: 1871-1876)
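The percentage indices above are plain ratios of the mean session HR to each reference HR. A worked example with hypothetical numbers (not trial data; the age-predicted maximum uses the common 220 − age convention as an assumption):

```python
# %EHR-X = mean HR during the session as a percentage of reference HR X.
def ehr_percent(mean_exercise_hr, reference_hr):
    return 100.0 * mean_exercise_hr / reference_hr

mean_hr = 110              # mean HR over the 30-min session (bpm, hypothetical)
hr_at, hr_rcp = 100, 125   # anaerobic threshold / respiratory compensation HR
hr_peak = 140              # peak HR from the cardiopulmonary exercise test
hr_predicted = 220 - 55    # age-predicted maximum for an assumed age of 55

print(ehr_percent(mean_hr, hr_at))         # 110.0 -> above the AT
print(ehr_percent(mean_hr, hr_rcp))        # 88.0  -> below the RCP
print(ehr_percent(mean_hr, hr_peak))       # ~78.6
print(ehr_percent(mean_hr, hr_predicted))  # ~66.7
```

With %EHR-AT above 100 and %EHR-RCP below 100, this hypothetical session sits inside the training zone the study targets.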
Abstract:
The power transformer is a piece of electrical equipment that requires continuous monitoring and fast protection, since it is very expensive and essential for a power system to perform effectively. The most common protection technique is percentage differential logic, which discriminates between internal faults and other operating conditions. Unfortunately, some operating conditions of power transformers can affect the protection behavior and power system stability. This paper proposes a new algorithm to improve differential protection performance by using fuzzy logic and Clarke's transform. An electrical power system was modeled using the Alternative Transients Program (ATP) software to obtain the operating conditions and fault situations needed to test the developed algorithm. The results were compared to a commercial relay for validation, showing the advantages of the new method.
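Clarke's transform maps three-phase quantities (a, b, c) into the αβ0 frame; the paper combines it with fuzzy logic, which is not reproduced here. Below is the standard power-invariant form of the transform, evaluated on an illustrative balanced set.

```python
# Power-invariant Clarke transform: (a, b, c) -> (alpha, beta, zero).
import math

def clarke(a, b, c):
    k = math.sqrt(2.0 / 3.0)
    alpha = k * (a - 0.5 * b - 0.5 * c)
    beta = k * (math.sqrt(3.0) / 2.0) * (b - c)
    zero = k * (a + b + c) / math.sqrt(2.0)
    return alpha, beta, zero

# Balanced three-phase currents at one instant (unit amplitude, 120 deg apart)
theta = 0.3
ia = math.cos(theta)
ib = math.cos(theta - 2 * math.pi / 3)
ic = math.cos(theta + 2 * math.pi / 3)
alpha, beta, zero = clarke(ia, ib, ic)
print(round(abs(zero), 9))  # 0.0: a balanced set has no zero-sequence component
```

The zero-sequence component is what makes the transform useful for discriminating fault conditions: it vanishes for balanced operation and appears under many internal faults.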
Abstract:
The main purpose of this paper is to present the architecture of an automated system that allows monitoring and tracking, in real time (online), the possible occurrence of faults and electromagnetic transients observed in primary power distribution networks. Through the interconnection of this automated system to the utility operation center, it will be possible to provide an efficient tool that assists the Operation Center in decision-making. In short, the aim is to have all the tools necessary to identify, almost instantaneously, the occurrence of faults and transient disturbances in the primary power distribution system, as well as to determine their respective origin and probable location. The compiled results from the application of this automated system show that the developed techniques provide accurate results, identifying and locating several occurrences of faults observed in the distribution system.
Abstract:
Nowadays, there is a trend toward reorganizing industry into geographically dispersed systems that carry out their activities with autonomy. These systems must maintain a coordinated relationship among themselves in order to assure the expected performance of the overall system. Thus, a manufacturing system is proposed, based on "web services", to assure an effective orchestration of services in order to produce final products. In addition, it considers special functions such as teleoperation, remote monitoring, and users' online requests, among others. Considering the proposed system as a discrete event system (DES), techniques derived from Petri nets (PN), including the Production Flow Schema (PFS), can be used in a PFS/PN approach for modeling. The system is approached at different levels of abstraction: a conceptual model obtained by applying the PFS technique, and a functional model obtained by applying PN. Finally, a particular example of the proposed system is presented.
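The Petri-net formalism behind the PFS/PN approach can be illustrated with a minimal interpreter: places hold tokens, and a transition fires when every input place is marked. The tiny "process" net below is an invented example, not the paper's model.

```python
# Minimal Petri net: marking is a dict place -> token count; a transition
# consumes one token from each input place and produces one in each output.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)
        self.transitions = {}              # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Raw part + free machine -> finished product; the machine is released again
net = PetriNet({"raw": 2, "machine_free": 1, "done": 0})
net.add_transition("process", ["raw", "machine_free"], ["done", "machine_free"])
net.fire("process")
net.fire("process")
print(net.marking)   # {'raw': 0, 'machine_free': 1, 'done': 2}
```

Putting "machine_free" in both the inputs and outputs models a shared, reusable resource, a standard Petri-net idiom for manufacturing systems.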
Abstract:
Background-The effectiveness of heart failure disease management programs in patients under cardiologists' care over long-term follow-up is not established. Methods and Results-We investigated the effects of a disease management program with repetitive education and telephone monitoring on primary (combined death or unplanned first hospitalization, and quality-of-life changes) and secondary end points (hospitalization, death, and adherence). The REMADHE [Repetitive Education and Monitoring for ADherence for Heart Failure] trial is a long-term randomized, prospective, parallel trial designed to compare intervention with control. One hundred seventeen patients were randomized to usual care, and 233 to the additional intervention. The mean follow-up was 2.47 +/- 1.75 years, with 54% adherence to the program. In the intervention group, the primary end point composite of death or unplanned hospitalization was reduced (hazard ratio, 0.64; confidence interval, 0.43 to 0.88; P=0.008), driven by a reduction in hospitalization. The quality-of-life questionnaire score improved only in the intervention group (P<0.003). Mortality was similar in both groups. The number of hospitalizations (1.3 +/- 1.7 versus 0.8 +/- 1.3, P<0.0001), total hospital days during follow-up (19.9 +/- 51 versus 11.1 +/- 24 days, P<0.0001), and the need for emergency visits (4.5 +/- 10.6 versus 1.6 +/- 2.4, P<0.0001) were lower in the intervention group. Beneficial effects were homogeneous across sex, race, diabetes status, age, functional class, and etiology. Conclusions-Over a longer follow-up period than in previous studies, this heart failure disease management program model, with patients under the supervision of a cardiologist, is associated with a reduction in unplanned hospitalization, a reduction in total hospital days, and a reduced need for emergency care, as well as improved quality of life, despite modest program adherence over time. (Circ Heart Fail. 2008;1:115-124.)
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for vital-sign follow-up and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can cause dangerous user desensitization. The objective of this study was to describe the development of specific software to integrate intraoperative monitoring parameters, generating "smart alerts" that can support decision making as well as suggest possible diagnoses and treatments. Methods: A system was designed that allows flexibility in the definition of alerts, combining individual alarms of the monitored parameters to generate a more elaborate alert system. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system was tested with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of the monitored parameters, verifying the occurrence of the smart alerts defined in the user interface. Results: With this system, a potential 92% reduction in alarms was observed. In most situations that did not generate alerts, the individual alarms did not represent a risk to the patient. Conclusions: Implementing this software allows integration of the monitored data to generate information such as possible diagnoses or interventions. A substantial potential reduction in the number of alarms during surgery was observed. The information displayed by the system can often be more useful than the analysis of isolated parameters.
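The core idea of combining individual alarms into higher-level alerts can be sketched as a small rule function. The thresholds and the clinical rules below are invented for illustration only; they are not the study's alert definitions and are not clinical guidance.

```python
# Illustrative "smart alert" rule engine: instead of raising every individual
# alarm, combine monitored parameters into fewer, higher-level alerts.
def smart_alerts(sample):
    alerts = []
    # Combined rule: low systolic pressure together with high heart rate
    # suggests a clinically relevant event; either alone would only
    # trigger a routine single-parameter alarm.
    if sample["sys_bp"] < 90 and sample["hr"] > 110:
        alerts.append("possible hypotension with tachycardia: assess patient")
    if sample["spo2"] < 90:
        alerts.append("low oxygen saturation: check airway and oxygen supply")
    return alerts

print(smart_alerts({"sys_bp": 85, "hr": 120, "spo2": 97}))  # combined alert
print(smart_alerts({"sys_bp": 85, "hr": 80, "spo2": 97}))   # no smart alert
```

The second sample shows the alarm-reduction mechanism: a single out-of-range parameter no longer produces an alert on its own.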
Abstract:
Aims Trials of disease management programmes (DMP) in heart failure (HF) have shown controversial results regarding quality of life. We hypothesized that a DMP applied over the long-term could produce different effects on each of the quality-of-life components. Methods and results We extended the prospective, randomized REMADHE Trial, which studied a DMP in HF patients. We analysed changes in Minnesota Living with Heart Failure Questionnaire components in 412 patients, 60.5% male, age 50.2 +/- 11.4 years, left ventricular ejection fraction 34.7 +/- 10.5%. During a mean follow-up of 3.6 +/- 2.2 years, 6.3% of patients underwent heart transplantation and 31.8% died. Global quality-of-life scores improved in the DMP intervention group, compared with controls, respectively: 57.5 +/- 3.1 vs. 52.6 +/- 4.3 at baseline, 32.7 +/- 3.9 vs. 40.2 +/- 6.3 at 6 months, 31.9 +/- 4.3 vs. 41.5 +/- 7.4 at 12 months, 26.8 +/- 3.1 vs. 47.0 +/- 5.3 at the final assessment; P<0.01. Similarly, the physical component (23.7 +/- 1.4 vs. 21.1 +/- 2.2 at baseline, 16.2 +/- 2.9 vs. 18.0 +/- 3.3 at 6 months, 17.3 +/- 2.9 vs. 23.1 +/- 5.7 at 12 months, 11.4 +/- 1.6 vs. 19.9 +/- 2.4 final; P<0.01), the emotional component (13.2 +/- 1.0 vs. 12.1 +/- 1.4 at baseline, 11.7 +/- 2.7 vs. 12.3 +/- 3.1 at 6 months, 12.4 +/- 2.9 vs. 16.8 +/- 5.9 at 12 months, 6.7 +/- 1.0 vs. 10.6 +/- 1.4 final; P<0.01) and the additional questions (20.8 +/- 1.2 vs. 19.3 +/- 1.8 at baseline, 14.3 +/- 2.7 vs. 17.3 +/- 3.1 at 6 months, 12.4 +/- 2.9 vs. 21.0 +/- 5.5 at 12 months, 6.7 +/- 1.4 vs. 17.3 +/- 2.2 final; P<0.01) were better (lower) in the intervention group. The emotional component improved earlier than the others. Post-randomization quality of life was not associated with events. Conclusion Components of the quality-of-life assessment responded differently to DMP. These results indicate the need for individualized DMP strategies in patients with HF. 
Trial registration information: www.clinicaltrials.gov, NCT00505050 (REMADHE).
Abstract:
Background Heart failure and diabetes often occur simultaneously in patients, but the prognostic value of glycemia in chronic heart failure is debatable. We evaluated the role of glycemia in the prognosis of heart failure. Methods Outpatients with chronic heart failure from the Long-term Prospective Randomized Controlled Study Using Repetitive Education at Six-Month Intervals and Monitoring for Adherence in Heart Failure Outpatients (REMADHE) trial were grouped according to the presence of diabetes and the level of glycemia. All-cause mortality/heart transplantation and unplanned hospital admission were evaluated. Results Four hundred fifty-six patients were included (135 [29.5%] female, 124 [27.2%] with diabetes mellitus, age 50.2 +/- 11.4 years, and left ventricular ejection fraction of 34.7% +/- 10.5%). During follow-up (3.6 +/- 2.2 years), 27 (5.9%) patients underwent heart transplantation and 202 (44.2%) died; survival was similar in patients with and without diabetes mellitus. When patients with and without diabetes were categorized according to glucose range (glycemia <= 100 mg/dL [5.5 mmol/L]), as well as when distributed into quintiles of glucose, survival was significantly worse among patients with lower levels of glycemia. This finding persisted in a Cox proportional hazards regression model that included gender, etiology, left ventricular ejection fraction, left ventricular diastolic diameter, creatinine level, beta-blocker therapy, and functional status (hazard ratio 1.45, 95% CI 1.09-1.69, P = .039). No difference regarding unplanned hospital admission was found. Conclusion We report an inverse association between glycemia and mortality in outpatients with chronic heart failure. These results point to a new pathophysiologic understanding of the interactions between diabetes mellitus, hyperglycemia, and heart disease. (Am Heart J 2010; 159: 90-7.)
Abstract:
Objectives We studied the relationship between changes in body composition and changes in blood pressure levels. Background The mechanisms underlying the frequently observed progression from pre-hypertension to hypertension are poorly understood. Methods We examined 1,145 subjects from a population-based survey at baseline in 1994/1995 and at follow-up in 2004/2005. First, we studied individuals pre-hypertensive at baseline who, during 10 years of follow-up, either had normalized blood pressure (PreNorm, n = 48), persistently had pre-hypertension (PrePre, n = 134), or showed progression to hypertension (PreHyp, n = 183). In parallel, we studied predictors for changes in blood pressure category in individuals hypertensive at baseline (n = 429). Results After 10 years, the PreHyp group was characterized by a marked increase in body weight (+5.71% [95% confidence interval (CI): 4.60% to 6.83%]) that was largely the result of an increase in fat mass (+17.8% [95% CI: 14.5% to 21.0%]). In the PrePre group, both the increases in body weight (+1.95% [95% CI: 0.68% to 3.22%]) and fat mass (+8.09% [95% CI: 4.42% to 11.7%]) were significantly less pronounced than in the PreHyp group (p < 0.001 for both). The PreNorm group showed no significant change in body weight (-1.55% [95% CI: -3.70% to 0.61%]) and fat mass (+0.20% [95% CI: -6.13% to 6.52%], p < 0.05 for both, vs. the PrePre group). Conclusions After 10 years of follow-up, hypertension developed in 50.1% of individuals with pre-hypertension and only 6.76% went from hypertensive to pre-hypertensive blood pressure levels. An increase in body weight and fat mass was a risk factor for the development of sustained hypertension, whereas a decrease was predictive of a decrease in blood pressure. (J Am Coll Cardiol 2010; 56: 65-76) (C) 2010 by the American College of Cardiology Foundation