Abstract:
Background High-risk foot complications such as neuropathy, ischaemia, deformity, infections, ulcers and amputations consume considerable health care resources and typically result from chronic diseases. This study aimed to develop and test the validity and reliability of a Queensland High Risk Foot Form (QHRFF) tool. Methods Phase one involved developing a QHRFF using an existing diabetes high-risk foot tool, a literature search, an expert panel and several state-wide stakeholder groups. Phase two tested the criterion-related validity along with the inter- and intra-rater reliability of the final QHRFF. Three cohorts of patients (n = 94) and four clinicians, representing different levels of expertise, were recruited. Validity was determined by calculating sensitivity, specificity and positive predictive values (PPV). Kappa and intra-class correlation (ICC) statistics were used to establish reliability. Results A QHRFF tool containing 46 items across seven domains was developed and endorsed. The majority of QHRFF items achieved moderate-to-perfect validity (PPV = 0.71–1) and reliability (Kappa/ICC = 0.41–1). Items with weak validity and/or reliability included those identifying health professionals previously attending the patient, other (non-listed) co-morbidities, previous foot ulcer, foot deformity, optimum offloading and optimum footwear. Conclusions The QHRFF had moderate-to-perfect validity and reliability across the majority of items, particularly those identifying individual co-morbidities and foot complications. Items with weak validity or reliability need to be re-defined or removed. Overall, the QHRFF appears to be a valid and reliable tool to assess, collect and measure clinical data pertaining to high-risk foot complications for clinical or research purposes.
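As a hedged illustration of how the validity statistics above might be computed for a single binary QHRFF item, the Python sketch below derives sensitivity, specificity and PPV from 2x2 confusion-matrix counts; the counts are invented for illustration and are not data from the study.

```python
# Illustrative only: sensitivity, specificity and positive predictive value (PPV)
# for one binary QHRFF item rated against a criterion ("gold standard") assessment.
# The counts below are hypothetical, not data from the study.

def binary_validity(tp, fp, fn, tn):
    """Return (sensitivity, specificity, PPV) from 2x2 confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # proportion of true positives detected
    specificity = tn / (tn + fp)   # proportion of true negatives detected
    ppv = tp / (tp + fp)           # probability a positive rating is correct
    return sensitivity, specificity, ppv

# Hypothetical item: "previous foot ulcer" recorded by a clinician vs. the criterion.
sens, spec, ppv = binary_validity(tp=20, fp=4, fn=3, tn=67)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, PPV={ppv:.2f}")
```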
Abstract:
This paper addresses challenges arising from the paradigm shift taking place in the way we produce, transmit and use power, commonly referred to as smart grids. The aim of this paper is to explore present initiatives to establish smart grids as a sustainable and reliable power supply system. We argue that smart grids are not confined to abstract conceptual models alone. We suggest that establishing sustainable and reliable smart grids depends on a series of contributions, including modeling and simulation projects, technological infrastructure pilots, systemic methods and training, and not least on how these and other elements must interact to add reality to the conceptual models. We present and discuss three initiatives that illuminate smart grids from three very different positions. First, the new power grid simulator project in the electrical engineering PhD program at Queensland University of Technology (QUT). Second, the new smart grids infrastructure pilot run by the Norwegian Centers of Expertise Smart Energy Markets (NCE SMART). Third, the new systemic Master's program on next-generation energy technology at Østfold University College (HiØ). These initiatives represent future threads in a mesh embedding smart grids in models, technology, infrastructure, education, skills and people.
Abstract:
For industrial wireless sensor networks, maintaining the routing path for a high packet delivery ratio is one of the key objectives in network operations. It is important both to provide a high data delivery rate and to guarantee timely delivery of data packets at the sink node. Most proactive routing protocols for sensor networks are based on simple periodic updates to distribute the routing information. A faulty link causes packet loss and retransmission at the source until periodic route update packets are issued and the link has been identified as broken. We propose a new proactive route maintenance process in which the periodic update is backed up by a secondary layer of local updates repeated at shorter intervals for timely discovery of broken links. The proposed route maintenance scheme improves network reliability by decreasing the packet loss caused by delayed identification of broken links. We show by simulation that the proposed mechanism outperforms existing popular routing protocols (AODV, AOMDV and DSDV) in terms of end-to-end delay, routing overhead and packet reception ratio.
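The sketch below illustrates, under assumed names, data structures and timer periods, the general idea of backing up a slow periodic route update with a faster layer of local link checks; it is not the authors' protocol, only a minimal Python model of the layered-update behaviour.

```python
# A minimal sketch (not the authors' scheme) of backing up a slow periodic route
# update with a faster layer of local link checks, so that a broken link is acted
# on before the next full update. Names, periods and data structures are assumptions.
import time

FULL_UPDATE_PERIOD = 30.0   # seconds between network-wide route updates
LOCAL_UPDATE_PERIOD = 5.0   # shorter period for local neighbour liveness checks

class RouteMaintainer:
    def __init__(self):
        self.last_heard = {}    # neighbour id -> timestamp of last packet heard
        self.routes = {}        # destination -> next-hop neighbour id

    def on_packet(self, neighbour):
        """Record neighbour activity whenever any packet is overheard from it."""
        self.last_heard[neighbour] = time.time()

    def full_update(self):
        """Slow path: re-advertise complete routing information (placeholder)."""
        pass

    def local_update(self):
        """Fast path: drop routes through neighbours that have gone silent."""
        now = time.time()
        for neighbour, heard in list(self.last_heard.items()):
            if now - heard > 2 * LOCAL_UPDATE_PERIOD:   # link presumed broken
                self.routes = {d: h for d, h in self.routes.items() if h != neighbour}
                del self.last_heard[neighbour]

# A scheduler would call full_update() every FULL_UPDATE_PERIOD seconds and
# local_update() every LOCAL_UPDATE_PERIOD seconds, so a broken link is repaired
# well before the next full update would have revealed it.
```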
Abstract:
Camera-laser calibration is necessary for many robotics and computer vision applications. However, existing calibration toolboxes still require laborious effort from the operator in order to achieve reliable and accurate results. This paper proposes algorithms that augment two existing trusted calibration methods with an automatic extraction of the calibration object from the sensor data. The result is a complete procedure that allows for automatic camera-laser calibration. The first stage of the procedure is automatic camera calibration, which is useful in its own right for many applications. The chessboard extraction algorithm it provides is shown to outperform openly available techniques. The second stage completes the procedure by providing automatic camera-laser calibration. The procedure has been verified by extensive experimental tests, with the proposed algorithms providing a major reduction in the time required from an operator in comparison to manual methods.
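For context, the sketch below shows the standard OpenCV chessboard-based camera calibration recipe that an automatic extraction stage would feed; the board size, square size and image folder are assumptions, and this is the generic OpenCV pipeline rather than the improved extraction algorithm proposed in the paper.

```python
# Generic OpenCV chessboard camera calibration (not the authors' extraction method).
# Board size, square size and the image folder are assumed for illustration.
import glob
import cv2
import numpy as np

board = (9, 6)      # inner corners per chessboard row and column (assumed)
square = 0.025      # square size in metres (assumed)

objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.png"):       # hypothetical image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

if img_points:
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    print("reprojection RMS:", rms, "\nintrinsics:\n", K)
```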
Abstract:
This work aims to promote integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicles equipped with a camera and a 2D laser range finder. A method to check for inconsistencies between the data provided by these two heterogeneous sensors is proposed and discussed. First, uncertainties in the estimated transformation between the laser and camera frames are evaluated and propagated up to the projection of the laser points onto the image. Then, for each acquired pair of laser scan and camera image, the information at corners of the laser scan is compared with the content of the image, resulting in a likelihood of correspondence. The result of this process is then used to validate segments of the laser scan that are found to be consistent with the image, while inconsistent segments are rejected. Experimental results illustrate how this technique can improve the reliability of perception in challenging environmental conditions, such as in the presence of airborne dust.
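A minimal Python sketch of the first step described above, projecting planar laser returns into the camera image, is given below; the intrinsics, laser-to-camera transform and synthetic scan are assumed values, and the uncertainty propagation and image-content comparison are only indicated in comments.

```python
# Projecting 2D laser points into the image plane; calibration values are assumed.
import numpy as np

K = np.array([[700.0, 0.0, 320.0],      # assumed pinhole intrinsics
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.array([[0.0, -1.0, 0.0],         # assumed laser-to-camera rotation:
              [0.0, 0.0, -1.0],         # laser x (forward) maps to camera z
              [1.0, 0.0, 0.0]])
t = np.array([0.1, -0.05, 0.0])         # assumed laser-to-camera translation (metres)

def project_laser_scan(ranges, angles):
    """Project planar laser returns (range, bearing) into image pixel coordinates."""
    pts_laser = np.stack([ranges * np.cos(angles),
                          ranges * np.sin(angles),
                          np.zeros_like(ranges)], axis=1)
    pts_cam = pts_laser @ R.T + t                 # express points in the camera frame
    in_front = pts_cam[:, 2] > 0.1                # keep points ahead of the camera
    uvw = pts_cam[in_front] @ K.T
    return uvw[:, :2] / uvw[:, 2:3]               # (u, v) pixel coordinates

# Synthetic scan of a wall 5 m ahead; the image-content comparison at scan corners,
# weighted by the propagated calibration uncertainty, would follow this step.
angles = np.linspace(-np.pi / 4, np.pi / 4, 181)
pixels = project_laser_scan(np.full(angles.shape, 5.0), angles)
```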
Abstract:
This work aims to promote reliability and integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicle (UGV) autonomy. For this purpose, a comprehensive UGV system, comprising many different exteroceptive and proprioceptive sensors, has been built. The first contribution of this work is a large, accurately calibrated and synchronised, multi-modal data-set, gathered in controlled environmental conditions, including the presence of dust, smoke and rain. The data have then been used to analyse the effects of such challenging conditions on perception and to identify common perceptual failures. The second contribution is a presentation of methods for mitigating these failures to promote perceptual integrity in adverse environmental conditions.
Abstract:
1. The ability of many introduced fish species to thrive in degraded aquatic habitats and their potential to impact on aquatic ecosystem structure and function suggest that introduced fish may represent both a symptom and a cause of decline in river health and the integrity of native aquatic communities. 2. The varying sensitivities of many commonly introduced fish species to degraded stream conditions, the mechanism and reason for their introduction, and the differential susceptibility of local stream habitats to invasion arising from the environmental and biological characteristics of the receiving water body are all confounding factors that may obscure the interpretation of patterns of introduced fish species distribution and abundance, and therefore their reliability as indicators of river health. 3. In the present study, we address the question of whether alien fish (i.e. those species introduced from other countries) are a reliable indicator of the health of streams and rivers in south-eastern Queensland, Australia. We examine the relationships of alien fish species distributions and indices of abundance and biomass with natural environmental features, the biotic characteristics of the local native fish assemblages and indicators of anthropogenic disturbance at a large number of sites subject to varying sources and intensities of human impact. 4. Alien fish species were found to be widespread and often abundant in south-eastern Queensland rivers and streams, and the five species collected were considered to be relatively tolerant of river degradation, making them good candidate indicators of river health. Variation in alien species indices was unrelated to the size of the study sites, the sampling effort expended or natural environmental gradients. The biological resistance of the native fish fauna was not found to be an important factor mediating invasion success by alien species. Variation in alien fish indices was, however, strongly related to indicators of disturbance intensity describing local in-stream habitat and riparian degradation, water quality and surrounding land use, particularly the amount of urban development in the catchment. 5. Potential confounding factors that may influence the likelihood of introduction and successful establishment of an alien species, and the implications of these factors for river bioassessment, are discussed. We conclude that the potentially strong impact that many alien fish species can have on the biological integrity of natural aquatic ecosystems, together with their potential to be used as an initial basis for detecting other forms of human disturbance impact, suggests that some alien species (particularly species from the family Poeciliidae) can represent a reliable 'first cut' indicator of river health.
Abstract:
This paper proposes a highly reliable fault diagnosis approach for low-speed bearings. The proposed approach first extracts wavelet-based fault features that represent diverse symptoms of multiple low-speed bearing defects. The most useful fault features for diagnosis are then selected by utilizing a genetic algorithm (GA)-based kernel discriminative feature analysis cooperating with one-against-all multicategory support vector machines (OAA MCSVMs). Finally, each support vector machine is individually trained with its own feature vector that includes the most discriminative fault features, offering the highest classification performance. In this study, the effectiveness of the proposed GA-based kernel discriminative feature analysis and the classification ability of individually trained OAA MCSVMs are addressed in terms of average classification accuracy. In addition, the proposed GA-based kernel discriminative feature analysis is compared with four other state-of-the-art feature analysis approaches. Experimental results indicate that the proposed approach is superior to other feature analysis methodologies, yielding an average classification accuracy of 98.06% and 94.49% under rotational speeds of 50 revolutions-per-minute (RPM) and 80 RPM, respectively. Furthermore, the individually trained MCSVMs with their own optimal fault features based on the proposed GA-based kernel discriminative feature analysis outperform the standard OAA MCSVMs, showing an average accuracy of 98.66% and 95.01% for bearings under rotational speeds of 50 RPM and 80 RPM, respectively.
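As a hedged sketch of the general technique (not the paper's implementation), the code below wraps a tiny genetic algorithm around one-against-all SVMs so that the fittest binary feature mask maximises cross-validated accuracy; the GA operators, parameters and synthetic data are assumptions.

```python
# Simplified GA feature selection wrapped around one-against-all SVMs.
# Operators, parameters and the synthetic data set are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of one-vs-rest SVMs using only the masked features."""
    if mask.sum() == 0:
        return 0.0
    clf = OneVsRestClassifier(SVC(kernel="rbf", gamma="scale"))
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

# Tiny GA: binary masks as chromosomes, truncation selection, uniform crossover.
pop = rng.integers(0, 2, size=(20, X.shape[1]))
for generation in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                  # keep the best half
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        child = np.where(rng.random(X.shape[1]) < 0.5, a, b)  # uniform crossover
        flip = rng.random(X.shape[1]) < 0.05                  # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best), "accuracy:", round(fitness(best), 3))
```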
Abstract:
In contrast to a single robotic agent, multi-robot systems are highly dependent on reliable communication. Robots have to synchronize tasks or share poses and sensor readings with other agents, especially in cooperative mapping tasks where local sensor readings are incorporated into a global map. The drawback of existing communication frameworks is that most are based on a central component which has to be constantly within reach. Additionally, they do not prevent data loss between robots if a failure occurs in the communication link. During a distributed mapping task, loss of data is critical because it will corrupt the global map. In this work, we propose a cloud-based publish/subscribe mechanism which enables reliable communication between agents during a cooperative mission, using the Data Distribution Service (DDS) as a transport layer. The usability of our approach is verified by several experiments taking into account complete temporary communication loss.
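The toy Python sketch below models only the behaviour that matters here, buffering messages for a subscriber during a temporary communication loss and delivering them on reconnection; it is an in-memory illustration with invented names and does not use DDS or the authors' implementation.

```python
# In-memory publish/subscribe toy model of reliable delivery across a dropout.
# Not DDS and not the authors' system; names and the history depth are assumptions.
from collections import defaultdict, deque

class Broker:
    def __init__(self, history=100):
        self.history = history
        self.subscribers = defaultdict(dict)   # topic -> {name: [callback, online]}
        self.pending = defaultdict(deque)      # (topic, name) -> queued messages

    def subscribe(self, topic, name, callback):
        self.subscribers[topic][name] = [callback, True]

    def set_online(self, topic, name, online):
        self.subscribers[topic][name][1] = online
        if online:                             # flush messages missed while offline
            while self.pending[(topic, name)]:
                self.subscribers[topic][name][0](self.pending[(topic, name)].popleft())

    def publish(self, topic, message):
        for name, (callback, online) in self.subscribers[topic].items():
            if online:
                callback(message)
            else:                              # keep a bounded backlog per subscriber
                queue = self.pending[(topic, name)]
                queue.append(message)
                if len(queue) > self.history:
                    queue.popleft()

# Toy usage: a mapping node misses a scan during a dropout and receives it afterwards.
broker = Broker()
broker.subscribe("scans", "mapper", lambda m: print("mapper got", m))
broker.publish("scans", "scan-1")
broker.set_online("scans", "mapper", False)    # temporary communication loss
broker.publish("scans", "scan-2")
broker.set_online("scans", "mapper", True)     # scan-2 delivered on reconnection
```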
Abstract:
In this paper, we propose a highly reliable fault diagnosis scheme for incipient low-speed rolling element bearing failures. The scheme consists of fault feature calculation, discriminative fault feature analysis, and fault classification. The proposed approach first computes wavelet-based fault features, including the respective relative wavelet packet node energy and entropy, by applying a wavelet packet transform to an incoming acoustic emission signal. The most discriminative fault features are then filtered from the originally produced feature vector by using discriminative fault feature analysis based on a binary bat algorithm (BBA). Finally, the proposed approach employs one-against-all multiclass support vector machines to identify multiple low-speed rolling element bearing defects. This study compares the proposed BBA-based dimensionality reduction scheme with four other dimensionality reduction methodologies in terms of classification performance. Experimental results show that the proposed methodology is superior to the other dimensionality reduction approaches, yielding average classification accuracies of 94.9%, 95.8%, and 98.4% at bearing rotational speeds of 20 revolutions per minute (RPM), 80 RPM, and 140 RPM, respectively.
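A hedged sketch of the wavelet packet feature step is given below using PyWavelets: it computes relative wavelet packet node energies and an entropy value for a synthetic signal; the wavelet, decomposition level and signal are assumptions rather than the paper's settings.

```python
# Relative wavelet packet node energy and entropy features for a 1-D signal.
# Wavelet, level and the synthetic signal are assumptions, not the paper's settings.
import numpy as np
import pywt

def wavelet_packet_features(signal, wavelet="db4", level=3):
    """Return (relative node energies, entropy) of a wavelet packet decomposition."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    energies = np.array([np.sum(np.square(node.data)) for node in nodes])
    rel_energy = energies / energies.sum()                        # relative node energy
    entropy = -np.sum(rel_energy * np.log2(rel_energy + 1e-12))   # distribution entropy
    return rel_energy, entropy

# Synthetic acoustic-emission-like signal: a short burst added to background noise.
t = np.linspace(0, 1, 4096)
signal = np.random.randn(t.size) + 5 * np.exp(-((t - 0.5) ** 2) / 1e-4) * np.sin(2e3 * np.pi * t)
features, entropy = wavelet_packet_features(signal)
print("relative node energies:", np.round(features, 3), "entropy:", round(entropy, 3))
```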
Abstract:
Background Foot disease complications, such as foot ulcers and infection, contribute to considerable morbidity and mortality. These complications are typically precipitated by “high-risk factors”, such as peripheral neuropathy and peripheral arterial disease. High-risk factors are more prevalent in specific “at risk” populations such as diabetes, kidney disease and cardiovascular disease. To the best of the authors’ knowledge, a tool capturing multiple high-risk factors and foot disease complications in multiple at-risk populations has yet to be tested. This study aimed to develop and test the validity and reliability of a Queensland High Risk Foot Form (QHRFF) tool. Methods The study was conducted in two phases. Phase one developed a QHRFF using an existing diabetes foot disease tool, literature searches, stakeholder groups and an expert panel. Phase two tested the QHRFF for validity and reliability. Four clinicians, representing different levels of expertise, were recruited to test validity and reliability. Three cohorts of patients were recruited; one tested criterion measure reliability (n = 32), another tested criterion validity and inter-rater reliability (n = 43), and another tested intra-rater reliability (n = 19). Validity was determined using sensitivity, specificity and positive predictive values (PPV). Reliability was determined using Kappa, weighted Kappa and intra-class correlation (ICC) statistics. Results A QHRFF tool containing 46 items across seven domains was developed. Criterion measure reliability of at least moderate categories of agreement (Kappa > 0.4; ICC > 0.75) was seen in 91% (29 of 32) of tested items. Criterion validity of at least moderate categories (PPV > 0.7) was seen in 83% (60 of 72) of tested items. Inter- and intra-rater reliability of at least moderate categories (Kappa > 0.4; ICC > 0.75) was seen in 88% (84 of 96) and 87% (20 of 23) of tested items, respectively. Conclusions The QHRFF had acceptable validity and reliability across the majority of items, particularly items identifying relevant co-morbidities, high-risk factors and foot disease complications. Recommendations have been made to improve or remove identified weaker items for future QHRFF versions. Overall, the QHRFF possesses suitable practicality, validity and reliability to assess and capture relevant foot disease items across multiple at-risk populations.
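As a hedged illustration of the agreement statistics above, the snippet below computes Cohen's kappa and linearly weighted kappa for one hypothetical ordinal item rated by two clinicians and checks it against the "at least moderate" threshold of 0.4; the ratings are invented, not data from the study.

```python
# Illustrative inter-rater agreement for one hypothetical ordinal QHRFF item.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 0, 1, 1, 0, 2, 2, 1, 0, 1, 2, 0, 1, 1, 0]   # hypothetical ratings
rater_b = [1, 0, 1, 0, 0, 2, 1, 1, 0, 1, 2, 0, 1, 2, 0]

kappa = cohen_kappa_score(rater_a, rater_b)
w_kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"kappa={kappa:.2f}, weighted kappa={w_kappa:.2f}, "
      f"at least moderate agreement: {kappa > 0.4}")
```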
Abstract:
Background: Significant recent attention has focussed on the role of antibiotic prescribing and usage with the aim of combating antibiotic resistance, a growing worldwide health concern. A significant gap in this literature concerns the consumption patterns and beliefs of consumers about antibiotics and their effects. We seek to remedy this gap by exploring a range of questionable antibiotic practices and obtaining reliable estimates of their prevalence as well as their normative status. Methods: We conducted an online survey of over 100 consumers. We used a new incentive-compatible technique, the Bayesian Truth Serum (BTS), to elicit more truthful responding than standard self-report measures. We asked participants to indicate whether they engaged in a number of practices, including whether they had taken antibiotics when they were out of date and stored antibiotics at home for later use. We then sought estimates of the percentage of other patients (like them) who had engaged in each behaviour, as well as, among those patients who had, the percentage who would admit to having done so. We also asked about the social acceptability and responsibility of the practices. Results: These results will show, for each type of questionable practice, how prevalent it is and whether consumers view it as both socially acceptable and socially responsible. We will also obtain the relative prevalence of each of these practices. Conclusion: These findings are of paramount importance in gaining a better understanding of consumers’ antibiotic consumption patterns. They will be vital for better targeting educational campaigns to reduce inappropriate antibiotic consumption.
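For readers unfamiliar with the technique, the sketch below implements one common formulation of the Bayesian Truth Serum score for a single binary question (an information term rewarding "surprisingly common" answers plus a prediction term); the responses are synthetic, the question is invented, and this is offered as an assumed formulation rather than the study's scoring code.

```python
# A hedged sketch of a Bayesian Truth Serum (BTS) score for one binary question.
# Synthetic data; the formulation here is an assumption, not the study's code.
import numpy as np

def bts_scores(answers, predictions, alpha=1.0):
    """answers: 0/1 per respondent; predictions: each respondent's estimate of the
    fraction answering 1. Returns one BTS score per respondent."""
    answers = np.asarray(answers, dtype=float)
    predictions = np.clip(np.asarray(predictions, dtype=float), 1e-6, 1 - 1e-6)
    x_bar = np.clip(np.array([1 - answers.mean(), answers.mean()]), 1e-6, 1 - 1e-6)
    pred_k = np.stack([1 - predictions, predictions], axis=1)   # per-option predictions
    y_bar = np.exp(np.log(pred_k).mean(axis=0))                 # geometric means
    info = np.where(answers == 1,                               # reward "surprisingly
                    np.log(x_bar[1] / y_bar[1]),                # common" answers
                    np.log(x_bar[0] / y_bar[0]))
    prediction = alpha * (x_bar * np.log(pred_k / x_bar)).sum(axis=1)
    return info + prediction

# Synthetic example: did the respondent ever keep leftover antibiotics for later use?
answers = [1, 0, 1, 1, 0, 0, 1, 0]
predictions = [0.4, 0.2, 0.5, 0.6, 0.3, 0.2, 0.5, 0.3]
print(np.round(bts_scores(answers, predictions), 3))
```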
Abstract:
Flood extent mapping is a basic tool for flood damage assessment, which can be done by digital classification techniques using satellite imagery, including the data recorded by radar and optical sensors. However, converting the data into the information we need is not a straightforward task. One of the great challenges involved in the data interpretation is to separate the permanent water bodies from flooding regions, including both the fully inundated areas and the wet areas where trees and houses are partly covered with water. This paper adopts the decision fusion technique to combine the mapping results from radar data and the NDVI data derived from optical data. An improved capacity for distinguishing permanent or semi-permanent water bodies from flood-inundated areas has been achieved. The software tools Multispec and Matlab were used.
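A simplified, assumption-laden sketch of the fusion idea follows: a radar backscatter threshold and an NDVI threshold are combined, and a pre-existing permanent-water layer separates permanent water bodies from newly inundated areas; the thresholds, class rules and toy arrays are illustrative, not those used in the paper.

```python
# Rule-based fusion of a radar water indicator and optical NDVI for flood mapping.
# Thresholds, class rules and the toy arrays are illustrative assumptions only.
import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index from near-infrared and red bands."""
    return (nir - red) / (nir + red + 1e-6)

def fuse_flood_map(radar_db, nir, red, permanent_water_mask,
                   water_db_threshold=-15.0, water_ndvi_threshold=0.1):
    """Return a class map: 0 dry, 1 permanent water, 2 flood-inundated."""
    water_like = (radar_db < water_db_threshold) | (ndvi(nir, red) < water_ndvi_threshold)
    classes = np.zeros(radar_db.shape, dtype=np.uint8)
    classes[water_like & permanent_water_mask] = 1    # rivers, lakes, reservoirs
    classes[water_like & ~permanent_water_mask] = 2   # newly inundated areas
    return classes

# Tiny synthetic 4x4 example standing in for co-registered radar and optical data.
radar_db = np.array([[-20, -20, -5, -5], [-20, -18, -5, -5],
                     [-5, -5, -5, -5], [-16, -16, -5, -5]], dtype=float)
nir = np.full((4, 4), 0.5)
red = np.full((4, 4), 0.2)
permanent = np.zeros((4, 4), dtype=bool)
permanent[:2, :2] = True
print(fuse_flood_map(radar_db, nir, red, permanent))
```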
Abstract:
The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of global warming on the built environment, the use of building simulation techniques together with forecast weather data is often necessary. Since all building simulation programs require hourly meteorological input data for their thermal comfort and energy evaluation, the provision of suitable weather data becomes critical. Based on a review of the existing weather data generation models, this paper presents an effective method to generate approximate future hourly weather data suitable for the study of the impact of global warming. Depending on the level of information available for the prediction of future weather conditions, it is shown that either the method of retaining the current level, the constant offset method or the diurnal modelling method may be used to generate the future hourly variation of an individual weather parameter. An example of the application of this method to different global warming scenarios in Australia is presented. Since there is no reliable projection of possible changes in air humidity, solar radiation or wind characteristics, as a first approximation these parameters have been assumed to remain at current levels. A sensitivity test of their impact on building energy performance shows that there is generally a good linear relationship between building cooling load and changes in the weather variables of solar radiation, relative humidity or wind speed.
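A minimal sketch of the constant offset idea is shown below: hourly dry-bulb temperatures are shifted by a scenario-dependent offset while humidity, solar radiation and wind are retained at current levels, as described above; the offsets, variable names and data slice are assumptions.

```python
# Constant offset generation of future hourly weather data: shift temperature by a
# scenario offset, keep other variables at current levels. Values are assumptions.
import numpy as np

def constant_offset_weather(hourly, scenario_offsets, scenario="2070_high"):
    """Return a copy of an hourly weather record with dry-bulb temperature shifted
    by the offset for the chosen global-warming scenario."""
    future = {key: np.array(values, dtype=float) for key, values in hourly.items()}
    future["dry_bulb_C"] += scenario_offsets[scenario]
    return future   # humidity, solar radiation and wind retained at current levels

# Hypothetical offsets (degrees C) per scenario and a two-hour slice of current data.
offsets = {"2030_low": 0.6, "2070_mid": 2.0, "2070_high": 3.4}
current = {"dry_bulb_C": [24.1, 25.3], "rel_humidity": [62, 58],
           "global_solar_Wm2": [650, 710], "wind_speed_ms": [3.2, 2.8]}
print(constant_offset_weather(current, offsets)["dry_bulb_C"])
```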