475 results for Condition monitoring
Abstract:
Persistent monitoring of the ocean is not optimally accomplished by repeatedly executing a fixed path in a fixed location. The ocean is dynamic, and so should be the paths executed to monitor and observe it. An open question at the intersection of autonomy and optimal sampling is how and when to alter a path or decision while still achieving the desired science objectives. Additionally, many marine robotic deployments last from weeks to months, making it very difficult for individuals to continuously monitor and retask the vehicles as needed. The problem becomes more complex still when multiple platforms operate simultaneously. There is a need for monitoring and adaptation of the robotic fleet by teams of scientists working in shifts; crowds are ideal for this task. In this paper, we present a novel application of crowd-sourcing that extends the autonomy of persistent-monitoring vehicles to enable non-repetitious sampling over long periods of time. We present a framework that enables the control of a marine robot by anybody with an internet-enabled device. Voters are provided with the current vehicle location, gathered science data, and predicted ocean features through the associated decision support system. Results are included from a simulated implementation of our system on a Wave Glider operating in Monterey Bay, with the science objective of maximizing the sum of observed nitrate values.
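As a rough sketch of how crowd input might be aggregated into a navigation decision, the plurality-vote scheme below is an assumption for illustration; the abstract does not specify the paper's actual aggregation rule, and the coordinates are invented.

```python
from collections import Counter

def select_next_waypoint(votes, candidates):
    """Pick the next waypoint by plurality vote.

    votes      -- list of candidate indices submitted by online voters
    candidates -- list of (lat, lon) waypoints offered to the crowd
    """
    if not votes:                       # no votes received this cycle:
        return candidates[0]            # fall back to a default waypoint
    winner, _ = Counter(votes).most_common(1)[0]
    return candidates[winner]

# Example: three candidate waypoints in Monterey Bay (illustrative coordinates)
candidates = [(36.80, -121.90), (36.75, -121.95), (36.70, -121.85)]
votes = [0, 2, 2, 1, 2, 0]              # indices chosen by six voters
print(select_next_waypoint(votes, candidates))   # -> (36.70, -121.85)
```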
Abstract:
This workshop was supported by the Australian Centre for Ecological Analysis and Synthesis (ACEAS, http://www.aceas.org.au/), a facility of the Australian Government-funded Terrestrial Ecosystem Research Network (http://www.tern.org.au/), a research infrastructure facility established under the National Collaborative Research Infrastructure Strategy and Education Infrastructure Fund - Super Science Initiative, through the Department of Industry, Innovation, Science, Research and Tertiary Education. Hosted by: Queensland University of Technology, Brisbane, Queensland (QUT, http://www.qut.edu.au/). Dates: 8-11 May 2012. Report editors: Prof Stuart Parsons (University of Auckland, NZ) and Dr Michael Towsey (QUT). This report is a compilation of notes and discussion summaries contributed by those attending the workshop, assembled into a logical order by the editors. Another report (with photographs) can be obtained at: http://www.aceas.org.au/index.php?option=com_content&view=article&id=94&Itemid=96
Abstract:
In this paper we present a novel place recognition algorithm inspired by recent discoveries in human visual neuroscience. The algorithm combines intolerant but fast low-resolution whole-image matching with a highly tolerant sub-image patch matching process. The approach requires no prior training and works on single images (although we use a cohort normalization score to exploit temporal frame information), alleviating the need for either a velocity signal or an image sequence and differentiating it from current state-of-the-art methods. We demonstrate the algorithm on the challenging Alderley sunny day – rainy night dataset, which had previously been solved only by integrating over 320-frame-long image sequences. The system achieves 21.24% recall at 100% precision, matching drastically different daytime and night-time images of places while successfully rejecting match hypotheses between highly aliased images of different places. The results provide a new benchmark for single-image, condition-invariant place recognition.
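A minimal sketch of a cohort normalization score, assuming a simple z-score against competing match candidates; the paper's exact formulation may differ.

```python
import numpy as np

def cohort_normalized_score(match_score, cohort_scores):
    """Normalize a candidate match score against a cohort of competing
    candidates, so scores become comparable across query images."""
    cohort = np.asarray(cohort_scores, dtype=float)
    return (match_score - cohort.mean()) / (cohort.std() + 1e-9)

# A raw score of 0.9 is only meaningful relative to its competitors:
print(cohort_normalized_score(0.9, [0.20, 0.30, 0.25, 0.35]))  # strongly positive
print(cohort_normalized_score(0.9, [0.85, 0.88, 0.90, 0.87]))  # near zero
```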
Abstract:
Emerging infectious diseases present a complex challenge to public health officials and governments, compounded by rapidly shifting patterns of human behaviour and globalisation. The increase in emerging infectious diseases has led to calls for new technologies and approaches for detection, tracking, reporting, and response. Internet-based surveillance systems offer a novel and developing means of monitoring conditions of public health concern, including emerging infectious diseases. We review studies that have exploited internet use and search trends to monitor two such diseases: influenza and dengue. Internet-based surveillance systems show good congruence with traditional surveillance approaches, and they are logistically and economically appealing. However, they cannot replace traditional surveillance systems; they should be viewed not as an alternative but as an extension. Future research should focus on using data generated through internet-based surveillance and response systems to bolster the capacity of traditional surveillance systems for emerging infectious diseases.
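As a hedged illustration of how such congruence between search trends and case reports is typically quantified, the sketch below correlates two invented weekly series; the numbers and the choice of Pearson correlation are assumptions for illustration, not the review's data.

```python
import numpy as np

# Weekly illustrative series: search-trend index vs. reported influenza cases
search_index = np.array([12, 18, 25, 40, 55, 48, 30, 20], dtype=float)
reported_cases = np.array([90, 130, 210, 350, 470, 420, 260, 150], dtype=float)

r = np.corrcoef(search_index, reported_cases)[0, 1]
print(f"Pearson r = {r:.2f}")   # a high r is the 'congruence' such studies report
```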
Abstract:
The use of Mahalanobis squared distance-based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance-based method is that it is simple and computationally cheap, enabling the use of a higher-dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance-based damage identification is also considered one of the most suitable methods for modern sensing systems such as wireless sensors. Despite these advantages, the method places strict requirements on its input: it assumes the training data to be multivariate normal, which is not always the case, particularly at an early monitoring stage. As a consequence, it may produce an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based on Monte Carlo simulation, with several controlling and evaluation tools added to assess the condition of the output data. By evaluating the convergence of the data condition indices, the proposed scheme can determine the optimal setup for the data generation process and thereby avoid generating unnecessarily excessive data. The efficacy of the scheme is demonstrated via application to data from a benchmark structure in the field.
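A minimal sketch of Mahalanobis squared distance novelty detection, with the chi-squared threshold that follows from the multivariate normality assumption discussed above; the feature dimension, significance level, and synthetic data are illustrative, not the article's setup.

```python
import numpy as np
from scipy import stats

def msd_novelty(train, test, alpha=0.01):
    """Mahalanobis squared distance (MSD) novelty detection.

    train -- (n, d) matrix of baseline (healthy-state) feature vectors
    test  -- (m, d) matrix of features to screen for novelty
    alpha -- significance level for the chi-squared threshold
    """
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False)
    cov_inv = np.linalg.inv(cov)     # ill-conditioned if training data are poor
    diff = test - mu
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)  # MSD per test vector
    threshold = stats.chi2.ppf(1 - alpha, df=train.shape[1])
    return d2, d2 > threshold        # distances and novelty flags

rng = np.random.default_rng(0)
train = rng.normal(size=(500, 4))                       # healthy baseline
test = np.vstack([rng.normal(size=(3, 4)),              # nominal samples
                  rng.normal(3.0, 1.0, size=(2, 4))])   # shifted (novel) samples
d2, flags = msd_novelty(train, test)
print(flags)   # expect the last two samples to be flagged
```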
Abstract:
The Lady Elliot Island eco-resort, on the Great Barrier Reef, operates with a strong sustainability ethic and has broken away from its reliance on diesel generators, an initiative with ongoing and substantial economic benefits. The first step was an energy audit that led to a 35% reduction in energy usage, to an average of 575 kWh per day. The eco-resort then commissioned a hybrid solar power station in 2008, with energy storage in battery banks. Solar power currently (2013) provides about 160 kWh of energy per day, and the eco-resort's diesel fuel usage has decreased from 550 to 100 litres per day, enabling the power station to pay for itself in 3 years. The eco-resort plans to complete its transition to renewable energy by 2015 by installing additional solar panels and a 10-15 kW wind turbine. This paper starts by discussing why the eco-resort chose a hybrid solar power station to transition to renewable energy, and the barriers to change. It then describes the power station, upgrades through to 2013, the power control system, the problems that were solved to realise the potential of a facility operating in a harsh and remote environment, and its performance. The paper concludes by outlining other eco-resort sustainability practices, including education and knowledge-sharing initiatives, and monitoring of the island's environmental and ecological condition.
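A back-of-envelope check of the reported payback; the diesel price is a labelled assumption (the paper's cost figures are not given here), and the capital cost is inferred from the reported 3-year payback rather than stated.

```python
# Fuel price and capital cost below are illustrative assumptions.
litres_saved_per_day = 550 - 100    # diesel use fell from 550 to 100 L/day
price_per_litre = 2.0               # assumed delivered cost, AUD/L (remote island)
annual_saving = litres_saved_per_day * price_per_litre * 365
capital_cost = 3 * annual_saving    # consistent with the reported 3-year payback
print(f"Annual fuel saving:    ${annual_saving:,.0f}")   # ~$328,500/yr
print(f"Implied capital cost:  ${capital_cost:,.0f}")    # ~$985,500
```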
Abstract:
Considering the wide spectrum of situations it may encounter, a robot navigating autonomously in outdoor environments needs to be endowed with several operating modes, for reasons of robustness and efficiency. The terrain it has to traverse may consist of flat or rough areas, low-cohesion soils such as sand dunes, concrete roads, and so on. Traversing these various kinds of environment calls for different navigation and/or locomotion functionalities, especially if the robot has different locomotion abilities, as do the robots WorkPartner, Hylos [4], Nomad, and the Marsokhod rovers. Numerous rover navigation techniques have been proposed, each suited to a particular environment context (e.g. path following, obstacle avoidance in more or less cluttered environments, rough terrain traverses). However, few contributions in the literature tackle the problem of autonomously selecting the most suitable mode [3]. Most existing work is devoted to the passive analysis of a single navigation mode, as in [2]. Fault detection is of course essential: one can imagine that proper monitoring of the Mars Exploration Rover Opportunity might have prevented the rover from being stuck in a dune for several weeks, by detecting non-nominal behaviour of certain parameters. But the ability to recover from an anticipated problem by switching to a better suited navigation mode would bring greater autonomy and therefore better overall efficiency. We propose here a probabilistic framework that achieves this by fusing environment-related and robot-related information to actively control the rover's operations.
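A minimal sketch of probabilistic mode selection by Bayesian fusion, in the spirit described; the mode names, prior, and likelihood values are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Candidate navigation modes and a prior over them (illustrative)
modes = ["flat_terrain", "rough_terrain", "sandy_soil"]
prior = np.array([0.5, 0.3, 0.2])

# Likelihood of the current observations (e.g. terrain class from perception,
# wheel-slip level from proprioception) under each mode -- invented values
likelihood = np.array([
    0.05,   # flat_terrain poorly explains high wheel slip
    0.30,
    0.90,   # sandy_soil best explains high slip on low-cohesion ground
])

posterior = prior * likelihood          # Bayes' rule, up to normalisation
posterior /= posterior.sum()
best = modes[int(np.argmax(posterior))]
print(dict(zip(modes, posterior.round(3))), "->", best)
```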
Abstract:
The design and synthesis of molecularly or supramolecularly defined interfacial architectures have seen a remarkable growth of interest and scientific research activity in recent years, for various reasons. On the one hand, it is generally believed that constructing an interactive interface between the living world of cells, tissue, or whole organisms and the (inorganic or organic) materials world of technical devices, such as implants or medical parts, requires proper construction and structural (and functional) control of this organism-machine interface. We are still at the very beginning of understanding what is needed to make an organism tolerate implants, to guarantee bidirectional communication between microelectronic devices and living tissue, or simply to construct interactive, biocompatible surfaces in general. This comprehensive book lucidly describes the design, synthesis, assembly, characterization, and bio(medical) applications of interfacial layers on solid substrates with molecularly or supramolecularly controlled architectures, with experts in the field contributing chapters on developments of recent years.
Abstract:
Bridges are important infrastructure for all nations and are required for the transportation of goods as well as people. A catastrophic failure can result in loss of life and enormous financial hardship to a nation. Although various kinds of sensors are now available to monitor the health of structures subject to corrosion, they do not provide permanent, long-term measurements. This paper investigates the fabrication of carbon nanotube (CNT) based composite sensors for corrosion detection in structures. Multi-wall CNT (MWCNT)/Nafion composite sensors were fabricated and their electrical properties evaluated for corrosion detection. The test specimens were subjected to real-life corrosion experiments, and the results confirm that the electrical resistance of the sensor electrode changed dramatically due to corrosion.
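A hedged sketch of how such a resistance reading might be turned into a corrosion flag; the 10% threshold and the readings are illustrative assumptions, not the paper's calibration.

```python
def corrosion_flag(r_baseline, r_measured, threshold=0.10):
    """Flag corrosion when the relative change in sensor electrode
    resistance exceeds a threshold (10% here is an assumed value)."""
    rel_change = abs(r_measured - r_baseline) / r_baseline
    return rel_change, rel_change > threshold

print(corrosion_flag(120.0, 121.5))   # (0.0125, False) -- within tolerance
print(corrosion_flag(120.0, 155.0))   # (0.29, True)    -- likely corrosion
```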
Abstract:
This paper describes the experimental evaluation of a novel Autonomous Surface Vehicle capable of navigating complex inland water reservoirs and measuring a range of water quality properties and greenhouse gas emissions. The 16 ft long solar-powered catamaran can collect water column profiles whilst in motion. It is also directly integrated with a reservoir-scale floating sensor network, allowing remote mission uploads, data download, and adaptive sampling strategies. This paper describes the onboard vehicle navigation and control algorithms as well as the obstacle avoidance strategies. Experimental results demonstrate its ability to maintain track and avoid obstacles on a variety of large-scale missions and under differing weather conditions, as well as its ability to continuously collect various water quality parameters, complementing traditional manual monitoring campaigns.
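The abstract does not specify the vehicle's guidance law; as one common choice for track-keeping on surface vehicles, a line-of-sight guidance sketch is shown below under that assumption, with invented geometry.

```python
import math

def los_heading(pos, wp_prev, wp_next, lookahead=10.0):
    """Line-of-sight guidance: desired heading toward a point on the track
    segment a fixed lookahead distance beyond the vehicle's along-track
    projection, steering it back onto the waypoint line."""
    dx, dy = wp_next[0] - wp_prev[0], wp_next[1] - wp_prev[1]
    track_len = math.hypot(dx, dy)
    ux, uy = dx / track_len, dy / track_len          # unit vector along track
    s = (pos[0] - wp_prev[0]) * ux + (pos[1] - wp_prev[1]) * uy
    target = (wp_prev[0] + (s + lookahead) * ux,
              wp_prev[1] + (s + lookahead) * uy)
    return math.atan2(target[1] - pos[1], target[0] - pos[0])

# Vehicle 5 m east of a 100 m north-going track: heading steers back and forward
print(math.degrees(los_heading((5.0, 20.0), (0.0, 0.0), (0.0, 100.0))))  # ~117 deg
```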
Abstract:
This paper describes a novel Autonomous Surface Vehicle capable of navigating throughout complex inland water storages and measuring a range of water quality properties and greenhouse gas emissions. The 16 ft long solar-powered catamaran can collect this information throughout the water column whilst the vehicle is moving. A unique feature of this ASV is its integration into a storage-scale floating sensor network, allowing remote mission uploads, data download, and adaptive sampling strategies. This paper provides an overview of the vehicle design and operation, including control, laser-based obstacle avoidance, and vision-based inspection capabilities. Experimental results illustrate its ability to continuously collect key water quality parameters and complement intensive manual monitoring campaigns.
Abstract:
Background: Less invasive methods of determining cardiac output are now readily available. Indicator dilution techniques, for example, have made it easier to measure cardiac output continuously because they use the existing intra-arterial line. Gone, therefore, is the need for a pulmonary artery flotation catheter, and with it the ability to measure left atrial and left ventricular work indices, as well as the ability to monitor and measure mixed venous saturation (SvO2). Purpose: The aim of this paper is to put forward the notion that SvO2 provides valuable information about oxygen consumption and venous reserve, important measures in the critically ill for ensuring that oxygen supply meets cellular demand. To portray this, a simplified example of the septic patient is offered, highlighting the changing pathophysiological sequelae of the inflammatory process and their importance for monitoring SvO2. Relevance to clinical practice: SvO2 monitoring, it could be argued, provides the gold standard for assessing arterial and venous oxygen indices in the critically ill. For the bedside ICU nurse, the wealth of information inherent in SvO2 monitoring could provide important data to assist in averting potential problems with oxygen delivery and consumption. It has been suggested, however, that central venous saturation (ScvO2) might be an attractive alternative to SvO2 because it is less invasive and a sample is easier to obtain for analysis. There are problems with this approach, relating to where the catheter tip is sited and the nature of the venous admixture at that site. Studies have shown that ScvO2 is less accurate than SvO2 and should not be used as the sole guiding variable for decision-making. These studies have demonstrated an unacceptably wide variance between ScvO2 and SvO2 that depends on the presenting disease; in some cases SvO2 will be significantly lower than ScvO2. Conclusion: Whilst newer technologies have been developed to continuously measure cardiac output, SvO2 monitoring remains an important adjunct to clinical decision-making in the ICU. Given the information it provides, seeking alternatives such as ScvO2, or blood samples obtained from femorally placed central venous lines, can lead to inappropriate treatment being given or withheld. Instead, when using ScvO2, trending of this variable should provide clinical determinants usable by the bedside ICU nurse, remembering that in most conditions SvO2 will be approximately 16% lower.
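For context, the standard Fick relationship that underlies SvO2 interpretation can be sketched as below; the patient values are invented for illustration and dissolved oxygen is ignored, so this is not the paper's method, just the textbook physiology.

```python
# Illustrative Fick-principle estimate of mixed venous saturation (SvO2).
Hb   = 140.0   # haemoglobin, g/L (assumed)
CO   = 5.0     # cardiac output, L/min (assumed)
SaO2 = 0.98    # arterial oxygen saturation (assumed)
VO2  = 250.0   # oxygen consumption, mL/min (assumed)

o2_per_litre = 1.34 * Hb                 # mL O2 per litre of fully saturated blood
SvO2 = SaO2 - VO2 / (CO * o2_per_litre)  # Fick equation rearranged for SvO2
print(f"SvO2 = {SvO2:.2f}")              # ~0.71, within the normal range
```

Rising VO2 or falling CO in this relationship pushes SvO2 down, which is why the abstract frames SvO2 as a window on the balance between oxygen supply and cellular demand.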
Abstract:
Background: There are few data regarding the effectiveness of remote monitoring for older people with heart failure. We conducted a post-hoc sub-analysis of a previously published large Cochrane systematic review and meta-analysis of relevant randomized controlled trials to determine whether structured telephone support and telemonitoring were effective in this population. Methods: A post-hoc sub-analysis of a systematic review and meta-analysis that applied the Cochrane methodology was conducted. Meta-analyses of all-cause mortality, all-cause hospitalizations, and heart failure-related hospitalizations were performed for studies in which the mean or median age of participants was 70 years or more. Results: The mean or median age of participants was 70 years or more in eight of the 16 structured telephone support studies (n=2,659/5,613; 47%) and four of the 11 telemonitoring studies (n=894/2,710; 33%). Structured telephone support (RR 0.80; 95% CI 0.63-1.00) and telemonitoring (RR 0.56; 95% CI 0.41-0.76) interventions reduced mortality. Structured telephone support interventions also reduced heart failure-related hospitalizations (RR 0.81; 95% CI 0.67-0.99). Conclusion: Despite a systematic bias towards recruiting individuals younger than the epidemiological average into the randomized controlled trials, older people with heart failure did benefit from structured telephone support and telemonitoring. These post-hoc sub-analysis results were similar to the overall effects observed in the main meta-analysis. While further research is required to confirm these observational findings, the evidence at hand indicates that discrimination by age alone may not be appropriate when inviting participation in a remote monitoring service for heart failure.
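A minimal sketch of the fixed-effect inverse-variance pooling that underlies such meta-analytic risk ratios; the per-study numbers below are illustrative, not the review's data, and the review itself may have used a different (e.g. random-effects) model.

```python
import math

def pool_risk_ratios(rrs, cis):
    """Fixed-effect inverse-variance pooling of risk ratios on the log scale.
    Each CI is a (lower, upper) 95% interval; the SE of log(RR) is recovered
    from the interval's width."""
    log_rrs = [math.log(rr) for rr in rrs]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in cis]
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * pooled_se),
            math.exp(pooled + 1.96 * pooled_se))

# Illustrative per-study risk ratios and 95% CIs:
print(pool_risk_ratios([0.75, 0.60, 0.85],
                       [(0.55, 1.02), (0.40, 0.90), (0.65, 1.11)]))
```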
Abstract:
Environmental monitoring is becoming critical as human activity and climate change place greater pressures on biodiversity, leading to an increasing need for data to inform decisions. Acoustic sensors can collect data across large areas for extended periods, making them attractive for environmental monitoring. However, managing and analysing large volumes of environmental acoustic data is a great challenge, and this is hindering effective utilization of the big datasets collected. This paper presents an overview of our current techniques for collecting, storing, and analysing large volumes of acoustic data efficiently, accurately, and cost-effectively.