973 results for Reliable Computations


Relevance:

20.00%

Publisher:

Abstract:

This paper addresses challenges arising from the paradigm shift taking place in the way we produce, transmit and use power, related to what is known as smart grids. The aim of this paper is to explore present initiatives to establish smart grids as a sustainable and reliable power supply system. We argue that smart grids are not confined to abstract conceptual models alone. We suggest that establishing sustainable and reliable smart grids depends on a series of contributions, including modeling and simulation projects, technological infrastructure pilots, systemic methods and training, and not least on how these and other elements must interact to add reality to the conceptual models. We present and discuss three initiatives that illuminate smart grids from three very different positions: first, the new power grid simulator project in the electrical engineering PhD program at Queensland University of Technology (QUT); second, the new smart grid infrastructure pilot run by the Norwegian Centres of Expertise Smart Energy Markets (NCE SMART); and third, the new systemic Master's program on next-generation energy technology at Østfold University College (HiØ). These initiatives represent future threads in a mesh embedding smart grids in models, technology, infrastructure, education, skills and people.

Relevance:

20.00%

Publisher:

Abstract:

For industrial wireless sensor networks, maintaining the routing path to achieve a high packet delivery ratio is one of the key objectives in network operations. It is important both to provide a high data delivery rate at the sink node and to guarantee timely delivery of data packets. Most proactive routing protocols for sensor networks rely on simple periodic updates to distribute routing information. A faulty link causes packet loss and retransmission at the source until periodic route update packets are issued and the link is identified as broken. We propose a new proactive route maintenance process in which the periodic update is backed up by a secondary layer of local updates repeating with shorter periods, for timely discovery of broken links. The proposed route maintenance scheme improves the reliability of the network by reducing the packet loss caused by delayed identification of broken links. We show by simulation that the proposed mechanism outperforms existing popular routing protocols (AODV, AOMDV and DSDV) in terms of end-to-end delay, routing overhead and packet reception ratio.
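The benefit of layering shorter-period local updates under the global periodic update can be illustrated with a back-of-the-envelope sketch. The periods and the helper function below are illustrative assumptions, not values from the paper:

```python
import math

def detection_delay(fail_time, period):
    """Time from a link failure until the next periodic update notices it,
    assuming updates fire at t = period, 2*period, ..."""
    return math.ceil(fail_time / period) * period - fail_time

# Global updates every 10 s; secondary local updates every 2 s (made-up values).
fail = 3.7
global_only = detection_delay(fail, 10.0)                   # wait for the 10 s update
with_local = min(global_only, detection_delay(fail, 2.0))   # local layer reacts first
```

With only the global update the broken link goes unnoticed for 6.3 s; the 2 s local layer cuts that to 0.3 s, which is the mechanism behind the reduced packet loss.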

Relevance:

20.00%

Publisher:

Abstract:

Camera-laser calibration is necessary for many robotics and computer vision applications. However, existing calibration toolboxes still require laborious effort from the operator to achieve reliable and accurate results. This paper proposes algorithms that augment two existing, trusted calibration methods with automatic extraction of the calibration object from the sensor data, resulting in a complete procedure for automatic camera-laser calibration. The first stage of the procedure is automatic camera calibration, which is useful in its own right for many applications; the chessboard extraction algorithm it provides is shown to outperform openly available techniques. The second stage completes the procedure by providing automatic camera-laser calibration. The procedure has been verified by extensive experimental tests, with the proposed algorithms providing a major reduction in the time required from an operator compared to manual methods.

Relevance:

20.00%

Publisher:

Abstract:

This work aims to promote integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicles equipped with a camera and a 2D laser range finder. A method to check for inconsistencies between the data provided by these two heterogeneous sensors is proposed and discussed. First, uncertainties in the estimated transformation between the laser and camera frames are evaluated and propagated up to the projection of the laser points onto the image. Then, for each acquired laser scan and camera image pair, the information at corners of the laser scan is compared with the content of the image, resulting in a likelihood of correspondence. The result of this process is then used to validate segments of the laser scan that are found to be consistent with the image, while inconsistent segments are rejected. Experimental results illustrate how this technique can improve the reliability of perception in challenging environmental conditions, such as in the presence of airborne dust.

Relevance:

20.00%

Publisher:

Abstract:

This work aims to promote reliability and integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicle (UGV) autonomy. For this purpose, a comprehensive UGV system comprising many different exteroceptive and proprioceptive sensors has been built. The first contribution of this work is a large, accurately calibrated and synchronised multi-modal data set, gathered in controlled environmental conditions, including the presence of dust, smoke and rain. The data have then been used to analyse the effects of such challenging conditions on perception and to identify common perceptual failures. The second contribution is a presentation of methods for mitigating these failures to promote perceptual integrity in adverse environmental conditions.

Relevance:

20.00%

Publisher:

Abstract:

1. The ability of many introduced fish species to thrive in degraded aquatic habitats, and their potential to impact aquatic ecosystem structure and function, suggest that introduced fish may represent both a symptom and a cause of declines in river health and in the integrity of native aquatic communities. 2. The varying sensitivities of many commonly introduced fish species to degraded stream conditions, the mechanism and reason for their introduction, and the differential susceptibility of local stream habitats to invasion owing to the environmental and biological characteristics of the receiving water body are all confounding factors that may obscure the interpretation of patterns of introduced fish species distribution and abundance, and therefore their reliability as indicators of river health. 3. In the present study, we address the question of whether alien fish (i.e. species introduced from other countries) are a reliable indicator of the health of streams and rivers in south-eastern Queensland, Australia. We examine the relationships of alien fish species distributions, and indices of abundance and biomass, with natural environmental features, the biotic characteristics of the local native fish assemblages and indicators of anthropogenic disturbance at a large number of sites subject to varying sources and intensities of human impact. 4. Alien fish species were found to be widespread and often abundant in south-eastern Queensland rivers and streams, and the five species collected were considered relatively tolerant of river degradation, making them good candidate indicators of river health. Variation in alien species indices was unrelated to the size of the study sites, the sampling effort expended or natural environmental gradients. The biological resistance of the native fish fauna was not found to be an important factor mediating invasion success by alien species. Variation in alien fish indices was, however, strongly related to indicators of disturbance intensity describing local in-stream habitat and riparian degradation, water quality and surrounding land use, particularly the amount of urban development in the catchment. 5. Potential confounding factors that may influence the likelihood of introduction and successful establishment of an alien species, and the implications of these factors for river bioassessment, are discussed. We conclude that the potentially strong impact that many alien fish species can have on the biological integrity of natural aquatic ecosystems, together with their potential to serve as an initial indicator of other forms of human disturbance, suggests that some alien species (particularly those of the family Poeciliidae) can represent a reliable 'first cut' indicator of river health.

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a highly reliable fault diagnosis approach for low-speed bearings. The proposed approach first extracts wavelet-based fault features that represent diverse symptoms of multiple low-speed bearing defects. The most useful fault features for diagnosis are then selected using a genetic algorithm (GA)-based kernel discriminative feature analysis cooperating with one-against-all multicategory support vector machines (OAA MCSVMs). Finally, each support vector machine is individually trained with its own feature vector comprising the most discriminative fault features, offering the highest classification performance. In this study, the effectiveness of the proposed GA-based kernel discriminative feature analysis and the classification ability of the individually trained OAA MCSVMs are assessed in terms of average classification accuracy. In addition, the proposed GA-based kernel discriminative feature analysis is compared with four other state-of-the-art feature analysis approaches. Experimental results indicate that the proposed approach is superior to the other feature analysis methodologies, yielding average classification accuracies of 98.06% and 94.49% under rotational speeds of 50 revolutions per minute (RPM) and 80 RPM, respectively. Furthermore, the individually trained MCSVMs, each with its own optimal fault features based on the proposed GA-based kernel discriminative feature analysis, outperform the standard OAA MCSVMs, showing average accuracies of 98.66% and 95.01% for bearings under rotational speeds of 50 RPM and 80 RPM, respectively.
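The one-against-all decision rule at the core of such a classifier is simple: each class gets a binary scorer trained against all other classes, and the class with the highest score wins. A minimal sketch with made-up linear scorers standing in for the trained SVMs over the GA-selected features:

```python
def oaa_predict(x, scorers):
    """One-against-all classification: return the class whose binary
    scorer assigns the highest score to feature vector x."""
    return max(scorers, key=lambda cls: scorers[cls](x))

# Toy per-class scorers over a 2-D feature vector (entirely illustrative
# class names and decision functions, not the paper's trained models).
scorers = {
    "healthy":    lambda x: -x[0] - x[1],
    "inner-race": lambda x: x[0] - x[1],
    "outer-race": lambda x: x[1] - x[0],
}
label = oaa_predict((0.9, 0.1), scorers)
```

Training each binary scorer on its own feature subset, as the paper proposes, only changes how `scorers` is built; the argmax decision rule stays the same.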

Relevance:

20.00%

Publisher:

Abstract:

For the past few years, research on the topic of secure outsourcing of cryptographic computations has drawn significant attention from academics in the security and cryptology disciplines, as well as from information security practitioners. One main reason for this interest is its application to resource-constrained devices such as RFID tags. While there has been significant progress in this domain since Hohenberger and Lysyanskaya provided formal security notions for secure computation delegation, some interesting challenges remain whose solutions could enable a wider deployment of cryptographic protocols for secure outsourcing of cryptographic computations. This position paper brings out these challenging problems, with RFID technology as the use case, together with our ideas, where applicable, that can provide a direction towards solving them.

Relevance:

20.00%

Publisher:

Abstract:

In contrast to a single robotic agent, multi-robot systems are highly dependent on reliable communication. Robots have to synchronise tasks or share poses and sensor readings with other agents, especially in cooperative mapping tasks where local sensor readings are incorporated into a global map. The drawback of existing communication frameworks is that most are based on a central component which has to be constantly within reach. Additionally, they do not prevent data loss between robots if a failure occurs in the communication link. During a distributed mapping task, loss of data is critical because it will corrupt the global map. In this work, we propose a cloud-based publish/subscribe mechanism which enables reliable communication between agents during a cooperative mission, using the Data Distribution Service (DDS) as a transport layer. The usability of our approach is verified by several experiments, taking into account complete temporary communication loss.
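One way to survive temporary communication loss in a publish/subscribe system is to replay a topic's history to subscribers that join or reconnect late (DDS offers this kind of behaviour through its durability QoS). A toy in-process sketch of that idea, not the paper's actual DDS-based implementation:

```python
class DurableBroker:
    """Toy publish/subscribe broker that replays a topic's message history
    to late subscribers, loosely mimicking durable delivery in DDS."""
    def __init__(self):
        self._history = {}
        self._subscribers = {}

    def publish(self, topic, msg):
        self._history.setdefault(topic, []).append(msg)
        for callback in self._subscribers.get(topic, []):
            callback(msg)

    def subscribe(self, topic, callback):
        for msg in self._history.get(topic, []):   # replay missed samples
            callback(msg)
        self._subscribers.setdefault(topic, []).append(callback)

broker = DurableBroker()
broker.publish("map_updates", {"cell": (3, 4), "occupied": True})
received = []
broker.subscribe("map_updates", received.append)   # joins after the publish
broker.publish("map_updates", {"cell": (3, 5), "occupied": False})
```

The late subscriber still ends up with both map updates, which is exactly the property that keeps a cooperatively built global map from being corrupted by an outage.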

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a highly reliable fault diagnosis scheme for incipient low-speed rolling element bearing failures. The scheme consists of fault feature calculation, discriminative fault feature analysis, and fault classification. The proposed approach first computes wavelet-based fault features, including the relative wavelet packet node energies and entropies, by applying a wavelet packet transform to an incoming acoustic emission signal. The most discriminative fault features are then filtered from the originally produced feature vector using discriminative fault feature analysis based on a binary bat algorithm (BBA). Finally, the proposed approach employs one-against-all multiclass support vector machines to identify multiple low-speed rolling element bearing defects. This study compares the proposed BBA-based dimensionality reduction scheme with four other dimensionality reduction methodologies in terms of classification performance. Experimental results show that the proposed methodology is superior to the other dimensionality reduction approaches, yielding average classification accuracies of 94.9%, 95.8%, and 98.4% at bearing rotational speeds of 20 revolutions per minute (RPM), 80 RPM, and 140 RPM, respectively.
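Relative wavelet packet node energy and entropy reduce to simple formulas once subband coefficients are available. A hedged sketch on made-up coefficient lists; a real pipeline would obtain the subbands from a wavelet packet transform of the acoustic emission signal:

```python
import math

def relative_energies(subbands):
    """Relative energy of each wavelet packet node: the node's energy
    (sum of squared coefficients) divided by the total energy."""
    energies = [sum(c * c for c in band) for band in subbands]
    total = sum(energies)
    return [e / total for e in energies]

def energy_entropy(rel_energies):
    """Shannon entropy of the relative energy distribution; low entropy
    means the signal's energy concentrates in a few subbands."""
    return -sum(p * math.log(p) for p in rel_energies if p > 0)

# Four toy subbands of coefficients (illustrative values only).
subbands = [[1.0, 1.0], [2.0, 0.0], [0.0, 0.0], [1.0, 1.0]]
rel = relative_energies(subbands)   # sums to 1.0
h = energy_entropy(rel)
```

Each node contributes one relative-energy feature, and the entropy summarises how the defect spreads energy across the decomposition; these are the raw features the BBA-based selection then filters.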

Relevance:

20.00%

Publisher:

Abstract:

Background: Foot disease complications, such as foot ulcers and infection, contribute to considerable morbidity and mortality. These complications are typically precipitated by “high-risk factors” such as peripheral neuropathy and peripheral arterial disease. High-risk factors are more prevalent in specific “at-risk” populations, such as people with diabetes, kidney disease and cardiovascular disease. To the best of the authors’ knowledge, a tool capturing multiple high-risk factors and foot disease complications in multiple at-risk populations has yet to be tested. This study aimed to develop and test the validity and reliability of a Queensland High Risk Foot Form (QHRFF) tool. Methods: The study was conducted in two phases. Phase one developed the QHRFF using an existing diabetes foot disease tool, literature searches, stakeholder groups and an expert panel. Phase two tested the QHRFF for validity and reliability. Four clinicians, representing different levels of expertise, were recruited to test validity and reliability. Three cohorts of patients were recruited: one tested criterion measure reliability (n = 32), another tested criterion validity and inter-rater reliability (n = 43), and another tested intra-rater reliability (n = 19). Validity was determined using sensitivity, specificity and positive predictive values (PPV). Reliability was determined using Kappa, weighted Kappa and intra-class correlation (ICC) statistics. Results: A QHRFF tool containing 46 items across seven domains was developed. Criterion measure reliability of at least moderate categories of agreement (Kappa > 0.4; ICC > 0.75) was seen in 91% (29 of 32) of tested items. Criterion validity of at least moderate categories (PPV > 0.7) was seen in 83% (60 of 72) of tested items. Inter- and intra-rater reliability of at least moderate categories (Kappa > 0.4; ICC > 0.75) was seen in 88% (84 of 96) and 87% (20 of 23) of tested items, respectively.
Conclusions: The QHRFF had acceptable validity and reliability across the majority of items, particularly items identifying relevant co-morbidities, high-risk factors and foot disease complications. Recommendations have been made to improve or remove the identified weaker items in future QHRFF versions. Overall, the QHRFF possesses suitable practicality, validity and reliability to assess and capture relevant foot disease items across multiple at-risk populations.
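The agreement statistics used here are standard. As an illustration on toy ratings (not the study's data), unweighted Cohen's kappa and PPV can be computed as follows:

```python
def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa: observed agreement between two raters,
    corrected for the agreement expected by chance."""
    n = len(rater_a)
    observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

def ppv(true_pos, false_pos):
    """Positive predictive value: TP / (TP + FP)."""
    return true_pos / (true_pos + false_pos)

# Two clinicians scoring six patients on a binary item (toy data).
kappa = cohens_kappa([1, 0, 1, 1, 0, 1], [1, 0, 0, 1, 0, 1])
```

Here five of six ratings agree (83%), but chance agreement is 50%, so kappa lands at about 0.67, i.e. "substantial" rather than near-perfect agreement; this chance correction is why the study reports kappa alongside raw agreement.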

Relevance:

20.00%

Publisher:

Abstract:

A scheme for integration of stand-alone INS and GPS sensors is presented, with data interchange over an external bus. This ensures modularity and sensor interchangeability. Use of a medium-coupled scheme reduces data flow and computation, facilitating use in surface vehicles. Results show that the hybrid navigation system is capable of delivering high positioning accuracy.
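A single fusion step in such a hybrid INS/GPS navigator can be sketched as a one-dimensional Kalman-style update, weighting the INS prediction and the GPS fix by their variances. The values and the scalar simplification are illustrative assumptions, not the paper's actual filter:

```python
def fuse(ins_pos, ins_var, gps_pos, gps_var):
    """Blend an INS position prediction with a GPS fix: the Kalman gain
    shifts the estimate toward whichever source is less uncertain, and
    the fused variance is smaller than either input variance."""
    gain = ins_var / (ins_var + gps_var)
    fused_pos = ins_pos + gain * (gps_pos - ins_pos)
    fused_var = (1.0 - gain) * ins_var
    return fused_pos, fused_var

pos, var = fuse(ins_pos=100.0, ins_var=4.0, gps_pos=102.0, gps_var=4.0)
```

With equal variances the estimate lands midway between the two sources and the uncertainty halves; in a medium-coupled scheme, updates of this kind are exchanged over the external bus rather than inside one monolithic filter.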

Relevance:

20.00%

Publisher:

Abstract:

Let a and s denote the interarrival times and service times in a GI/GI/1 queue. Let a(n), s(n) be random variables whose distributions are the estimated distributions of a and s from i.i.d. samples of a and s of size n. Let w be a random variable with the stationary distribution π of the waiting times of the queue with input (a, s). We consider the problem of estimating E[w^α], α > 0, and π via simulations when (a(n), s(n)) are used as input. Conditions for the accuracy of the asymptotic estimate, continuity of the asymptotic variance and uniformity in the rate of convergence of the estimate are obtained. We also obtain rates of convergence for sample moments, the empirical process and the quantile process for regenerative processes. Robust estimates are also obtained when an outlier-contaminated sample of a and s is provided. In the process we obtain consistency, continuity and asymptotic normality of M-estimators for stationary sequences. Some robustness results for Markov processes are included.
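The waiting times being estimated satisfy the standard Lindley recursion W_{n+1} = max(0, W_n + S_n - A_{n+1}); a plug-in simulation of the kind studied here simply runs this recursion on samples drawn from the estimated distributions a(n), s(n). A minimal sketch on fixed toy samples:

```python
def waiting_times(interarrivals, services):
    """Lindley recursion for a GI/GI/1 queue:
    W_{n+1} = max(0, W_n + S_n - A_{n+1}), with W_1 = 0.
    interarrivals[0] is the first customer's arrival time (unused)."""
    w = [0.0]
    for a_next, s in zip(interarrivals[1:], services):
        w.append(max(0.0, w[-1] + s - a_next))
    return w

# Toy samples: a long gap empties the queue, short gaps build it up.
w = waiting_times([0.0, 5.0, 1.0, 1.0], [3.0, 3.0, 3.0])
```

Moments such as E[w^α] are then estimated by averaging w[i]**α over a long simulated run, which is exactly where the convergence and robustness questions of the paper arise.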

Relevance:

20.00%

Publisher:

Abstract:

A computational study of convergence acceleration for Euler and Navier-Stokes computations with upwind schemes has been conducted in a unified framework. It involves the flux-vector splitting algorithms due to Steger-Warming and Van Leer, the flux-difference splitting algorithms due to Roe and Osher, and the hybrid algorithms AUSM (Advection Upstream Splitting Method) and HUS (Hybrid Upwind Splitting). Implicit time integration with line Gauss-Seidel relaxation and multigrid are among the procedures that have been systematically investigated, both individually and in combination. The upwind schemes have been tested in various implicit-explicit operator combinations so that the optimal combination can be determined, based on extensive computations of two-dimensional flows in the subsonic, transonic, supersonic and hypersonic regimes. In this study, the performance of these implicit time-integration procedures has been systematically compared with that of a multigrid-accelerated explicit Runge-Kutta method. It has been demonstrated that a multigrid method employed in conjunction with an implicit time-integration scheme yields distinctly superior convergence compared with either acceleration procedure alone, provided that effective smoothers, which have been identified in this investigation, are prescribed in the implicit operator.