866 results for Monitoring Systems
Abstract:
Wind energy has been one of the fastest-growing sectors of the nation's renewable energy portfolio for the past decade, and the same trend is projected for the upcoming years given aggressive governmental policies for reducing fossil fuel dependency. So-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration. Given this broad acceptance, the size of wind turbines has grown exponentially over time. However, safety and economic concerns have emerged as a result of new design tendencies toward massive-scale wind turbine structures with high slenderness ratios and complex shapes, typically located in remote areas (e.g. offshore wind farms). In this regard, safe operation requires not only first-hand information on actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, call for a more sophisticated mathematical framework that can properly handle all these sources of indetermination. The modeling complexity that arises from these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model.
For this aim, System Identification (SI) techniques offer a spectrum of well-established numerical methods appropriate for stationary, deterministic, data-driven numerical schemes, capable of predicting the actual dynamic states (eigenrealizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric is proposed, based on so-called Subspace Realization Theory and adapted for stochastic, non-stationary, time-varying systems, as is the case for the complex aerodynamics of HAWTs. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the structural components the wind turbine is made of. In the long run, both the aerodynamic framework (theoretical model) and the system identification (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, also known as an Adaptive Simulated Annealing (ASA) process. This iterative engine is based on a set of function minimizations computed by a metric called the Modal Assurance Criterion (MAC). In summary, the Thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacting wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
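The Modal Assurance Criterion that drives the ASA model-updating loop has a standard closed form comparing pairs of mode shapes. As a rough illustration (a minimal sketch of the textbook formula, not the thesis code):

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors.

    MAC = |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b));
    1.0 means perfectly correlated shapes, 0.0 means orthogonal ones.
    """
    num = abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return num / den

# Identical shapes (up to arbitrary scaling) give MAC = 1
phi = np.array([1.0, 2.0, 3.0])
print(mac(phi, 2.0 * phi))  # -> 1.0
```

In a model-updating loop, 1 - MAC between measured and predicted mode shapes would serve as (part of) the cost function that the annealing search minimizes.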
Abstract:
Doppler Optical Coherence Tomography (DOCT) is a biomedical imaging technique that allows simultaneous structural imaging and flow monitoring inside biological tissues and materials with spatial resolution on the micrometer scale. It has recently been applied to the characterization of microfluidic systems. Structural and flow images of novel microfluidic platforms for cytotoxicologic applications were obtained with a real-time, near-infrared spectral-domain DOCT system. Characteristics such as flow homogeneity in the chamber, one of the most important parameters for cell culture, are investigated. OCT and DOCT images were used to monitor flow inside a specific platform that is based on microchannel division for better flow homogeneity. In particular, the evolution of the flow profile at the transition between the microchannel structure and the chamber is studied.
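The flow speeds DOCT recovers follow from the standard Doppler relation between the measured frequency shift and the velocity component along the beam. A minimal sketch of that arithmetic (the numbers below are illustrative assumptions, not values from the paper):

```python
import math

def doppler_velocity(f_d_hz, wavelength_m, n_medium, theta_rad):
    """Flow speed from the measured Doppler frequency shift:

        v = f_D * lambda0 / (2 * n * cos(theta))

    the standard DOCT relation, where theta is the angle between the
    probe beam and the flow direction and n the refractive index.
    """
    return f_d_hz * wavelength_m / (2.0 * n_medium * math.cos(theta_rad))

# Example: 1 kHz shift, 1300 nm source, n = 1.35, beam at 80 deg to the flow
v = doppler_velocity(1e3, 1300e-9, 1.35, math.radians(80.0))
print(f"{v * 1e3:.3f} mm/s")
```

The cos(theta) factor is why near-perpendicular beam geometries (theta close to 90 degrees) make the velocity estimate very sensitive to angle errors.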
Abstract:
In this paper, we propose an intelligent method, named the Novelty Detection Power Meter (NodePM), to detect novelties in electronic equipment monitored by a smart grid. Considering the entropy of each monitored device, calculated from a Markov chain model, the proposed method identifies novelties through a machine learning algorithm. To this end, the NodePM is integrated into a platform for the remote monitoring of energy consumption, which consists of a wireless sensor network (WSN). It should be stressed that, unlike many related works that are evaluated in simulated environments, our experiments were conducted in real environments. The results show that the NodePM reduces the power consumption of the monitored equipment by 13.7%. In addition, the NodePM detects novelties more efficiently than an approach from the literature, surpassing it in different scenarios in all evaluations that were carried out.
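The per-device entropy derived from a Markov chain can be illustrated with the textbook entropy-rate formula for a stationary chain. This is our own hedged sketch of the underlying idea, not the NodePM implementation:

```python
import numpy as np

def markov_entropy_rate(P):
    """Entropy rate of a stationary Markov chain:

        H = -sum_i pi_i * sum_j P_ij * log2(P_ij)

    where pi is the stationary distribution of transition matrix P.
    A device whose state transitions become unusually (un)predictable
    shifts this value, which a detector can flag as a novelty.
    """
    P = np.asarray(P, dtype=float)
    # Stationary distribution: left eigenvector of P for eigenvalue 1
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    logs = np.zeros_like(P)
    mask = P > 0
    logs[mask] = np.log2(P[mask])  # convention: 0 * log 0 = 0
    return float(-(pi[:, None] * P * logs).sum())

# A device alternating between "on" and "off" states
P = [[0.9, 0.1], [0.5, 0.5]]
print(markov_entropy_rate(P))  # bits per transition
```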
Abstract:
Recent advancements in cloud computing have enabled the proliferation of distributed applications, which require the management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing environmental conditions and numbers of users, application performance may suffer, leading to Service Level Agreement (SLA) violations and inefficient use of hardware resources. We introduce a system for controlling the complexity of scaling applications composed of multiple services, using mechanisms based on the fulfillment of SLAs. We present how service monitoring information can be used in conjunction with service level objectives, predictions, and correlations between performance indicators to optimize the allocation of services belonging to distributed applications. We validate our models using experiments and simulations involving a distributed enterprise information system. We show how discovering correlations between application performance indicators can serve as a basis for creating refined service level objectives, which can then be used for scaling the application and improving overall application performance under similar conditions.
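The idea of turning correlations between performance indicators into refined service level objectives can be sketched roughly as follows. The indicator names, sample values, and thresholds are hypothetical illustrations, not the paper's system:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation between two performance-indicator series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical monitoring samples: CPU load vs. response time (ms)
cpu = [0.2, 0.4, 0.5, 0.7, 0.9]
rt = [110, 150, 170, 240, 320]

r = pearson(cpu, rt)
if r > 0.8:
    # Strong correlation: derive a refined SLO on the *cause* indicator,
    # i.e. the CPU level at which response time is predicted to breach
    # the 250 ms SLA, via a simple least-squares line.
    slope = r * (statistics.stdev(rt) / statistics.stdev(cpu))
    intercept = statistics.fmean(rt) - slope * statistics.fmean(cpu)
    cpu_limit = (250 - intercept) / slope
    print(f"scale out when CPU load exceeds {cpu_limit:.2f}")
```

Scaling on the derived CPU limit rather than on observed response time lets the controller act before the user-facing SLA is actually violated.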
Abstract:
The identification of plausible causes of water body status deterioration will be much easier if it can build on available, reliable, extensive and comprehensive biogeochemical monitoring data (preferably aggregated in a database). A plausible identification of such causes is a prerequisite for well-informed decisions on which mitigation or remediation measures to take. In this chapter, a rationale for an extended monitoring programme is first provided and then compared to the programme required by the Water Framework Directive (WFD). This proposal includes a list of relevant parameters that are needed for an integrated, a priori status assessment. Second, a few sophisticated statistical tools are described that subsequently allow for the estimation of the magnitude of impairment as well as the likely relative importance of different stressors in a multiply stressed environment. The advantages and restrictions of these rather complicated analytical methods are discussed. Finally, the use of Decision Support Systems (DSS) is advocated with regard to the specific WFD implementation requirements.
Abstract:
Detector uniformity is a fundamental performance characteristic of all modern gamma camera systems, and ensuring a stable, uniform detector response is critical for maintaining clinical images that are free of artifacts. For these reasons, the assessment of detector uniformity is one of the most common activities associated with a successful clinical quality assurance program in gamma camera imaging. The evaluation of this parameter, however, is often unclear because it is highly dependent upon acquisition conditions, reviewer expertise, and the application of somewhat arbitrary limits that do not characterize the spatial location of the non-uniformities. Furthermore, as the goal of any robust quality control program is the determination of significant deviations from standard or baseline conditions, clinicians and vendors often neglect the temporal nature of detector degradation (1). This thesis describes the development and testing of new methods for monitoring detector uniformity. These techniques provide more quantitative, sensitive, and specific feedback to the reviewer, so that he or she may be better equipped to identify performance degradation prior to its manifestation in clinical images. The methods exploit the temporal nature of detector degradation and spatially segment distinct regions-of-non-uniformity using multi-resolution decomposition. These techniques were tested on synthetic phantom data using different degradation functions, as well as on experimentally acquired time-series floods with induced, progressively worsening defects present within the field-of-view. The sensitivity of conventional, global figures-of-merit for detecting changes in uniformity was evaluated and compared to that of these new image-space techniques. The image-space algorithms provide a reproducible means of detecting regions-of-non-uniformity before any single flood image has a NEMA uniformity value in excess of 5%.
The sensitivity of these image-space algorithms was found to depend on the size and magnitude of the non-uniformities, as well as on the nature of their cause. A trend analysis of the conventional figures-of-merit demonstrated their sensitivity to shifts in detector uniformity. Because the image-space algorithms are computationally efficient, they should be used concomitantly with trending of the global figures-of-merit in order to provide the reviewer with a richer assessment of gamma camera detector uniformity characteristics.
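The global NEMA integral-uniformity figure-of-merit referenced above has a simple closed form. A minimal sketch follows; note that the full NEMA procedure also prescribes smoothing and field-of-view conventions that are omitted here:

```python
import numpy as np

def integral_uniformity(flood):
    """NEMA-style integral uniformity of a flood image:

        IU = 100 * (max - min) / (max + min)

    computed over the analyzed field of view. Larger values mean a
    less uniform detector response; 5% is a common action level.
    """
    flood = np.asarray(flood, dtype=float)
    hi, lo = flood.max(), flood.min()
    return 100.0 * (hi - lo) / (hi + lo)

# A flat field with one mildly cold pixel stays well under 5%,
# illustrating why a single global number can hide a localized defect.
flood = np.full((8, 8), 1000.0)
flood[3, 4] = 950.0  # induced local defect
print(f"{integral_uniformity(flood):.2f}%")  # -> 2.56%
```

A region growing slowly worse over a time series of floods can therefore stay below the global threshold for many acquisitions, which is the gap the image-space, trend-based methods are designed to close.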
Abstract:
Increasing commercial pressures on land are provoking fundamental and far-reaching changes in the relationships between people and land. Much knowledge on land-oriented investment projects currently comes from the media. Although this provides a good starting point, lack of transparency and rapidly changing contexts mean that such information is often unreliable. The International Land Coalition, in partnership with Oxfam Novib, the Centre de coopération internationale en recherche agronomique pour le développement (CIRAD), the University of Pretoria, the Centre for Development and Environment of the University of Bern (CDE), and GIZ, started to compile an inventory of land-related investments. This project aims to better understand the extent, trends and impacts of land-related investments by supporting an ongoing and systematic stocktaking exercise of the various investment projects currently taking place worldwide. It involves a large number of organizations and individuals who work in areas where land transactions are being made and are able to provide details of such investments. The project monitors land transactions in rural areas that imply a transformation of land use rights from communities and smallholders to commercial use, and that are made both by domestic and foreign investors (private actors, governments, government-backed private investors). The focus is on investments for food or agrofuel production, timber extraction, carbon trading, mineral extraction, conservation and tourism. A novel way of using ICT to document land acquisitions in a spatially explicit way, through an approach called “crowdsourcing”, is being developed. This approach allows actors to share information and knowledge directly and at any time on a public platform, where it is scrutinized for reliability and cross-checked with other sources. Up to now, over 1200 deals have been recorded across 96 countries.
Details of such transactions have been classified in a matrix and distributed to over 350 contacts worldwide for verification. The verified information has been geo-referenced and represented in two global maps. This is an open database enabling a continued monitoring exercise and the improvement of data accuracy. More information will be released over time. The opportunities arise from overcoming the constraints of incomplete information by proposing a new way of collecting, enhancing and sharing information and knowledge in a more democratic and transparent manner. The intention is to develop an interactive knowledge platform where any interested person can share and access information on land deals, their links to the stakeholders involved, and their embedding in a geographical context. By making use of new ICT technologies that are increasingly within the reach of local stakeholders, as well as open-access and web-based spatial information systems, it will become possible to create a dynamic database containing spatially explicit data. Feeding in data from a large number of stakeholders, increasingly also by means of new mobile ICT technologies, will open up new opportunities to analyse, monitor and assess highly dynamic trends of land acquisition and rural transformation.
Abstract:
In recent years, Geographic Information Systems (GIS) have increasingly been used in a wide array of application contexts for development cooperation in lowlands and mountain areas. When used for planning, implementation, and monitoring, GIS is a versatile and highly efficient tool, particularly in mountain areas characterized by great spatial diversity and inaccessibility. However, the establishment and application of GIS in mountain regions generally presents considerable technical challenges. Moreover, it is necessary to address specific institutional and organizational issues regarding implementation.
Abstract:
It is challenging to measure the impact of releasing data to the public, since the effects may not be directly linked to particular open data activities, or substantial impact may occur only several years after publishing the data. This paper proposes a framework to assess the impact of releasing open data by applying the Social Return on Investment (SROI) approach. SROI was developed for organizations that aim to generate social and environmental benefits, and thus fits the purpose of most open data initiatives. We link the four steps of SROI (input, output, outcome, impact) with the 14 high-value data categories of the G8 Open Data Charter to create a matrix of open data examples, activities, and impacts in each data category. This Impact Monitoring Framework helps data providers navigate the impact space of open data, laying out the conceptual basis for further research.
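The input-to-impact arithmetic behind SROI can be illustrated with the generic present-value ratio commonly used in SROI reporting. This is a hedged sketch with hypothetical figures; the paper proposes the monitoring framework, not this exact formula:

```python
def sroi_ratio(annual_benefits, investment, discount_rate):
    """Social Return on Investment ratio: the present value of the
    projected (monetized) benefits divided by the input investment.

        SROI = sum_t B_t / (1 + r)^t  /  I
    """
    pv = sum(b / (1.0 + discount_rate) ** (t + 1)
             for t, b in enumerate(annual_benefits))
    return pv / investment

# Hypothetical open-data portal: 100k invested, 40k/year of monetized
# social benefit over 4 years, discounted at 5%
ratio = sroi_ratio([40_000] * 4, 100_000, 0.05)
print(f"SROI = {ratio:.2f} : 1")
```

A ratio above 1 indicates that the monetized outcomes exceed the investment; the hard part, which the framework addresses, is mapping open-data outputs to defensible benefit estimates in the first place.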
Abstract:
Due to its extraordinary biodiversity and rapid deforestation, north-eastern Madagascar is a conservation hotspot of global importance. Reducing shifting cultivation is a high priority for policy-makers and conservationists; however, spatially explicit evidence of shifting cultivation is lacking due to the difficulty of mapping it with common remote sensing methods. To overcome this challenge, we adopted a landscape mosaic approach to assess the changes between natural forests, shifting cultivation and permanent cultivation systems at the regional level from 1995 to 2011. Our study confirmed that shifting cultivation is still being used to produce subsistence rice throughout the region, but there is a trend of intensification away from shifting cultivation towards permanent rice production, especially near protected areas. While large continuous forest exists today only in the core zones of protected areas, the agricultural matrix is still dominated by a dense cover of tree crops and smaller forest fragments. We believe that this evidence makes a crucial contribution to the development of interventions to prevent further conversion of forest to agricultural land while improving local land users' well-being.
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be detected more easily post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When the algorithm was parameterized based on retrospective analyses of 6 years of historic data, the probability of detection was satisfactory for large (range 83-445 cases) outbreaks but poor for small (range 20-177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time-series, and the lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for the early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
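The detection principle can be sketched in a deliberately simplified form: estimate the baseline mean and overdispersion of the count series, then flag counts above an upper prediction bound. This is a stripped-down illustration with hypothetical counts, not the improved Farrington algorithm evaluated in the paper (which also handles trend, seasonality and past-outbreak down-weighting):

```python
import statistics
from math import sqrt

def outbreak_threshold(baseline, z=1.96):
    """Simplified quasi-Poisson alarm threshold: estimate the baseline
    mean mu and dispersion phi = var / mu (phi > 1 means the counts are
    overdispersed relative to Poisson), then flag observations above
    mu + z * sqrt(phi * mu).
    """
    mu = statistics.fmean(baseline)
    phi = max(1.0, statistics.variance(baseline) / mu)
    return mu + z * sqrt(phi * mu)

# Hypothetical monthly condemnation counts for one syndrome; the
# second-to-last month contains a simulated outbreak.
history = [12, 15, 9, 14, 11, 13, 16, 10, 12, 14, 30, 13]
threshold = outbreak_threshold(history[:-2])  # fit on undisturbed months
print(threshold, history[-2] > threshold)
```

The monthly granularity visible here is exactly the timeliness problem the abstract notes: by the time a monthly count crosses the threshold, much of the outbreak has already happened.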
Abstract:
The selection of metrics for ecosystem restoration programs is critical for improving the quality of monitoring programs and characterizing project success. Moreover, it is often very difficult to balance the importance of multiple ecological, social, and economic metrics. The metric selection process is complex and must simultaneously take into account monitoring data, environmental models, socio-economic considerations, and stakeholder interests. We propose multicriteria decision analysis (MCDA) methods, broadly defined, for the selection of optimal sets of metrics to enhance the evaluation of ecosystem restoration alternatives. Two MCDA methods, multiattribute utility analysis (MAUT) and probabilistic multicriteria acceptability analysis (ProMAA), are applied and compared for a hypothetical case study of a river restoration involving multiple stakeholders. Overall, MCDA results in a systematic, unbiased, and transparent solution, informing the evaluation of restoration alternatives. The two methods provide comparable results in terms of selected metrics. However, because ProMAA can consider probability distributions for the weights and utility values of the metrics for each criterion, it is suggested as the best option if data uncertainty is high. Despite the added complexity in the metric selection process, MCDA improves upon the current ad hoc decision practice based on consultations with stakeholders and experts, and encourages transparent and quantitative aggregation of data and judgement, increasing the transparency of decision making in restoration projects. We believe that MCDA can enhance the overall sustainability of ecosystems by addressing both ecological and societal needs.
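The additive-utility core of MAUT can be sketched briefly. This is a hedged illustration with hypothetical criteria weights and scores; the paper's actual model, and the probabilistic ProMAA machinery, are more elaborate:

```python
def maut_score(utilities, weights):
    """Additive multiattribute utility: U(a) = sum_i w_i * u_i(a),
    with per-criterion utilities scaled to [0, 1] and weights
    summing to 1. Higher is better.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * u for w, u in zip(weights, utilities))

# Hypothetical metric sets scored on ecological, social, economic criteria
weights = [0.5, 0.3, 0.2]
alternatives = {
    "metric set A": [0.9, 0.4, 0.6],
    "metric set B": [0.6, 0.8, 0.7],
}
ranked = sorted(alternatives,
                key=lambda a: maut_score(alternatives[a], weights),
                reverse=True)
print(ranked[0])
```

ProMAA, by contrast, would replace the fixed weights and utilities above with probability distributions and report how often each alternative ranks first, which is why it is preferred when data uncertainty is high.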
Abstract:
Segmented ionization chambers are a good solution for monitoring the position, intensity, and shape of ion beams in hadrontherapy. Pixel and strip chambers have been developed for both passive-scattering and active-scanning dose delivery systems. In particular, strip chambers are optimal for pencil beam scanning, allowing for spatial and time resolutions below 0.1 mm and 1 ms, respectively. The MATRIX pixel detector and the Strip Accurate Monitor for Beam Applications (SAMBA) detector are described in this paper, together with the results of several beam tests and industrial developments based on these prototypes.