964 results for Robust Performance


Relevance: 30.00%

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a one-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. To preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training on both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded, and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
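
As a rough illustration of the generic pipeline described above (feature extraction, keyed randomization, quantization, binary encoding), the following Python sketch computes a toy robust image hash. The block-mean features, Gaussian random projection and median threshold are illustrative assumptions; this is not the dissertation's HOS/Radon method.

```python
# A minimal sketch of the generic robust-hashing pipeline:
# feature extraction -> keyed randomization -> quantization -> binary encoding.
import numpy as np

def robust_hash(image: np.ndarray, key: int, n_bits: int = 64) -> np.ndarray:
    rng = np.random.default_rng(key)              # secret key seeds the randomization
    # Feature extraction: means over an 8x8 grid of blocks (robust to small changes).
    h, w = image.shape
    blocks = image[: h - h % 8, : w - w % 8].reshape(8, h // 8, 8, w // 8)
    features = blocks.mean(axis=(1, 3)).ravel()
    # Randomization: keyed linear projection (the stage the abstract notes is
    # typically linear, at some cost to security).
    proj = rng.standard_normal((n_bits, features.size))
    v = proj @ features
    # Quantization and binary encoding: threshold each value at the median.
    return (v > np.median(v)).astype(np.uint8)

# Usage: hashes of an image and a slightly noisy copy should differ in few bits.
img = np.random.default_rng(0).random((64, 64))
h1 = robust_hash(img, key=42)
h2 = robust_hash(img + 0.01 * np.random.default_rng(1).random((64, 64)), key=42)
print("Hamming distance:", int(np.sum(h1 != h2)))
```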

Relevance: 30.00%

Abstract:

Liuwei Dihuang Wan (LDW), a classic Chinese medicinal formula, has been used to improve or restore declining functions related to aging and geriatric diseases, such as impaired mobility, vision, hearing, cognition and memory. It has attracted increasing attention as one of the most popular and valuable herbal medicines. However, the systematic analysis of the chemical constituents of LDW is difficult and thus has not been well established. In this paper, a rapid, sensitive and reliable ultra-performance liquid chromatography with electrospray ionization quadrupole time-of-flight high-definition mass spectrometry (UPLC-ESI-Q-TOF-MS) method with automated MetaboLynx analysis in positive and negative ion modes was established to characterize the chemical constituents of LDW. The analysis was performed on a Waters UPLC HSS T3 column using a gradient elution system. MS/MS fragmentation behavior was proposed to aid the structural identification of the components. Under the optimized conditions, a total of 50 peaks were tentatively characterized by comparing the retention time and MS data. It is concluded that a rapid and robust platform based on UPLC-ESI-Q-TOF-MS has been successfully developed for globally identifying the multiple constituents of traditional Chinese medicine prescriptions. This is the first report on systematic analysis of the chemical constituents of LDW.
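
The "tentatively characterized by comparing the retention time and MS data" step can be pictured as a tolerance match of observed peaks against reference values. The sketch below is a hedged illustration: the compound names are known LDW constituents, but the reference retention times, masses and tolerances are invented, not values from the study.

```python
# A hedged sketch of peak assignment by retention time and accurate mass.
# Reference library: (name, retention time in min, [M+H]+ m/z) -- illustrative only.
library = [
    ("morroniside", 4.81, 407.1548),
    ("loganin", 5.92, 391.1599),
    ("paeonol", 12.35, 167.0703),
]

def assign(peaks, rt_tol=0.1, mz_tol_ppm=10.0):
    """Tentatively assign each observed (rt, mz) peak to a library entry."""
    hits = []
    for rt, mz in peaks:
        for name, ref_rt, ref_mz in library:
            ppm = abs(mz - ref_mz) / ref_mz * 1e6
            if abs(rt - ref_rt) <= rt_tol and ppm <= mz_tol_ppm:
                hits.append((rt, mz, name, round(ppm, 1)))
    return hits

observed = [(5.90, 391.1612), (12.37, 167.0705), (8.00, 300.1000)]
for hit in assign(observed):          # the unmatched peak stays unassigned
    print(hit)
```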

Relevance: 30.00%

Abstract:

Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories, and ultimately in clinical laboratory settings, require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low-multiplex analyses as well as high-multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics, including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of and need for an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories, and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
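
A minimal sketch of how the quoted pass/fail thresholds might be checked on replicate injections follows; the replicate numbers are invented, and computing drift as the max-minus-min RT is an assumption about how drift is measured.

```python
# Pass/fail checks using the thresholds quoted above: peak-area CV < 0.15,
# peak-width CV < 0.15, RT standard deviation < 0.15 min, RT drift < 0.5 min.
import statistics as st

def cv(xs):                       # coefficient of variation
    return st.stdev(xs) / st.mean(xs)

def system_suitability(peak_areas, peak_widths, rts):
    checks = {
        "peak area CV < 0.15":   cv(peak_areas) < 0.15,
        "peak width CV < 0.15":  cv(peak_widths) < 0.15,
        "RT std dev < 0.15 min": st.stdev(rts) < 0.15,
        "RT drift < 0.5 min":    (max(rts) - min(rts)) < 0.5,
    }
    return checks, all(checks.values())

# Example: replicate injections of one SSP peptide (hypothetical numbers).
checks, passed = system_suitability(
    peak_areas=[1.02e6, 9.8e5, 1.05e6, 1.01e6],
    peak_widths=[0.21, 0.22, 0.20, 0.21],
    rts=[18.42, 18.45, 18.40, 18.44],
)
print(checks, "PASS" if passed else "FAIL")
```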

Relevance: 30.00%

Abstract:

Novel techniques have been developed for the automatic recognition of human behaviour in challenging environments using information from visual and infrared camera feeds. The techniques have been applied to two interesting scenarios: recognising drivers' speech using lip movements, and recognising audience behaviour while watching a movie using facial features and body movements. Outcomes of the research in these two areas will be useful in improving the performance of voice recognition in automobiles for voice-based control, and in obtaining accurate movie interest ratings based on live audience response analysis.

Relevance: 30.00%

Abstract:

This paper proposes an approach to obtaining localisation that is robust to smoke by exploiting multiple sensing modalities: visual and infrared (IR) cameras. The localisation is based on a state-of-the-art visual SLAM algorithm. First, we show that reasonably accurate localisation can be obtained in the presence of smoke by using only an IR camera, a sensor that is hardly affected by smoke, unlike a visual camera operating in the visible spectrum. Second, we demonstrate that improved results can be obtained by combining the information from the two sensor modalities (visual and IR cameras). Third, we show that by detecting the impact of smoke on the visual images using a data quality metric, we can anticipate and mitigate the degradation in localisation performance by discarding the most affected data. The experimental validation presents multiple trajectories estimated by the various methods considered, all thoroughly compared to an accurate dGPS/INS reference.
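
The third contribution can be sketched as a gating step: score each visual frame with a data quality metric and drop heavily degraded frames, falling back on the IR stream. The metric below (global intensity standard deviation as a contrast proxy) and the threshold are assumptions; the paper's actual quality metric may differ.

```python
# A hedged sketch: discard smoke-degraded visual frames before localisation,
# keeping the IR frames, which are hardly affected by smoke.
import numpy as np

def quality(frame: np.ndarray) -> float:
    """Smoke lowers contrast; use global intensity std-dev as a cheap proxy."""
    return float(frame.std())

def select_frames(visual_frames, ir_frames, threshold=0.05):
    """Yield (frame, modality) pairs, dropping degraded visual frames."""
    for vis, ir in zip(visual_frames, ir_frames):
        if quality(vis) >= threshold:
            yield vis, "visual"
        yield ir, "ir"            # IR is always kept

# Usage with synthetic frames: the washed-out (smoky) frame is dropped.
rng = np.random.default_rng(0)
clear = rng.random((120, 160))
smoky = np.full((120, 160), 0.5) + 0.01 * rng.random((120, 160))
for _, modality in select_frames([clear, smoky], [clear, clear]):
    print(modality)
```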

Relevance: 30.00%

Abstract:

As the number of potential applications of Unmanned Aircraft Systems (UAS) grows in civilian operations and national security, National Airworthiness Authorities are under increasing pressure to provide a path for certification and allow UAS integration into the national airspace. The success of this integration depends on developments in improved UAS reliability and safety, regulations for certification, and technologies for operational performance and safety assessment. This paper focusses on the latter and describes the use of a framework for evaluating the robust autonomy of UAS, namely, the autonomous system's ability to either continue operation in the presence of faults or safely shut down. The paper draws parallels between the proposed evaluation framework and the evaluation of pilots during the licensing process. It also discusses how the data from the proposed evaluation can be used as an aid for decision making in certification and UAS design.

Relevance: 30.00%

Abstract:

As the level of autonomy in Unmanned Aircraft Systems (UAS) increases, there is an imperative need to develop methods for assessing robust autonomy. This paper focuses on the computations that lead to a set of measures of robust autonomy. These measures are the probabilities that selected performance indices, related to the mission requirements and airframe capabilities, remain within regions of acceptable performance.
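
A measure of this kind can be estimated by Monte Carlo simulation: run the mission model many times and count the fraction of runs in which the performance index stays inside its acceptable region. The sketch below assumes a hypothetical cross-track-error index with a 5 m bound; the mission-simulation stand-in is invented.

```python
# Monte Carlo estimate of P(performance index stays in the acceptable region).
import random

def simulate_mission(seed: int) -> float:
    """Hypothetical stand-in for a mission simulation; returns the worst
    cross-track error (m) observed over the run."""
    rng = random.Random(seed)
    return max(abs(rng.gauss(0.0, 2.0)) for _ in range(100))

def measure_of_robust_autonomy(n_runs=10_000, bound_m=5.0) -> float:
    """Fraction of runs whose worst cross-track error stays within the bound."""
    ok = sum(simulate_mission(s) <= bound_m for s in range(n_runs))
    return ok / n_runs

print(f"P(acceptable) ~ {measure_of_robust_autonomy():.3f}")
```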

Relevance: 30.00%

Abstract:

As the number of Uninhabited Airborne Systems (UAS) proliferates in civil applications, industry is increasingly putting pressure on regulatory authorities to provide a path for certification and allow UAS integration into regulated airspace. The success of this integration depends on developments in improved UAS reliability and safety, regulations for certification, and technologies for operational performance and safety assessment. This paper focusses on the last topic and describes a framework for quantifying the robust autonomy of UAS, that is, the system's ability to either continue operating in the presence of faults or safely shut down. Two figures of merit are used to evaluate vehicle performance relative to mission requirements and the consequences of autonomous decision making in motion control and guidance systems. These figures of merit are interpreted within a probabilistic framework, which extends previous work in the literature. The figures of merit can be evaluated using stochastic simulation scenarios during both the vehicle development and certification stages, with different degrees of integration of hardware-in-the-loop simulation technology. The objective of the proposed framework is to aid in decision making about the suitability of a vehicle with respect to safety and reliability relative to mission requirements.

Relevance: 30.00%

Abstract:

This paper discusses a method to quantify the robust autonomy of Uninhabited Vehicles and Systems (UVS) in aerospace, marine, or land applications. Based on mission- and vehicle-specific performance criteria, we define a system utility function that can be evaluated using simulation scenarios for an envelope of environmental conditions. The results of these evaluations are used to compute a figure of merit or measure of operational effectiveness (MOE). The procedure is then augmented to consider faults and the performance of the mechanisms that handle these faulty operational modes. This leads to a measure of robust autonomy (MRA). The objective of the proposed figures of merit is to assist in decision making about vehicle performance and reliability at both the vehicle development stage (using simulation models) and the certification stage (using hardware-in-the-loop testing). Performance indices based on dynamic and geometric tasks associated with vehicle manoeuvring problems are proposed, and an example of a two-dimensional scenario is provided to illustrate the use of the proposed figures of merit.

Relevance: 30.00%

Abstract:

The quick detection of an abrupt unknown change in the conditional distribution of a dependent stochastic process has numerous applications. In this paper, we pose a minimax robust quickest change detection problem for cases where there is uncertainty about the post-change conditional distribution. Our minimax robust formulation is based on the popular Lorden criterion of optimal quickest change detection. Under a condition on the set of possible post-change distributions, we show that the widely known cumulative sum (CUSUM) rule is asymptotically minimax robust under our Lorden minimax robust formulation as the false alarm constraint becomes more strict. We also establish general asymptotic bounds on the detection delay of misspecified CUSUM rules (i.e., CUSUM rules that are designed with post-change distributions that differ from those of the observed sequence). We exploit these bounds to compare the delay performance of asymptotically minimax robust, asymptotically optimal, and other misspecified CUSUM rules. In simulation examples, we illustrate that asymptotically minimax robust CUSUM rules can provide better detection delay performance, at greatly reduced computational effort, compared to competing generalised likelihood ratio procedures.
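
For concreteness, here is a minimal sketch of the CUSUM rule for a mean shift in i.i.d. Gaussian observations; the distributions and threshold are illustrative assumptions, and a misspecified rule corresponds to running the same recursion with a wrong post-change mean.

```python
# Minimal CUSUM rule for a change in the mean of i.i.d. Gaussian observations.
import random

def cusum_stopping_time(xs, mu0=0.0, mu1=1.0, sigma=1.0, h=5.0):
    """Declare a change when the CUSUM statistic first exceeds threshold h."""
    s = 0.0
    for n, x in enumerate(xs, start=1):
        # Log-likelihood ratio of the post- vs pre-change density for one sample.
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma**2
        s = max(0.0, s + llr)          # reflected random walk
        if s >= h:
            return n                   # alarm raised at sample n
    return None

# Usage: change from mean 0 to mean 1 at sample 200.
rng = random.Random(1)
xs = [rng.gauss(0, 1) for _ in range(200)] + [rng.gauss(1, 1) for _ in range(200)]
print("alarm at sample:", cusum_stopping_time(xs))
```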

Relevance: 30.00%

Abstract:

Capacitors are widely used for power-factor correction (PFC) in power systems. When a PFC capacitor is installed with a certain load in a microgrid, it may be in parallel with the filter capacitor of the inverter interfacing the utility grid and the local distributed-generation unit and, thus, change the effective filter capacitance. Another complication is the possibility of resonance occurring in the microgrid. This paper conducts an in-depth investigation of the effective shunt-filter-capacitance variation and resonance phenomena in a microgrid due to the connection of a PFC capacitor. To compensate for the capacitance-parameter variation, an H∞ controller is designed for the voltage-source-inverter voltage control. By properly choosing the weighting functions, the synthesized H∞ controller exhibits high gains in the vicinity of the line frequency, similar to a traditional high-performance P+resonant controller, and thus possesses nearly zero steady-state error. However, with the robust H∞ controller, it is possible to explicitly specify the degree of robustness in the face of parameter variations. Furthermore, a thorough investigation is carried out to study the performance of inner current-loop feedback variables under resonance conditions. It reveals that filter-inductor current feedback is more effective in damping the resonance. This resonance can be further attenuated by employing the dual-inverter microgrid conditioner and controlling the series inverter as a virtual resistor affecting only harmonic components, without interfering with the fundamental power flow. Finally, the study in this paper has been tested on an experimental microgrid prototype.
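
The first effect, the shift in filter resonance when a PFC capacitor appears in parallel with the inverter's filter capacitor, can be illustrated with a back-of-envelope calculation; the component values below are assumptions, not values from the paper.

```python
# A parallel PFC capacitor raises the effective filter capacitance and
# lowers the LC filter's resonance frequency: f_res = 1 / (2*pi*sqrt(L*C)).
import math

L_f = 1.5e-3        # filter inductance (H), assumed
C_f = 25e-6         # filter capacitance (F), assumed
C_pfc = 50e-6       # PFC capacitor (F), assumed

def resonance_hz(L, C):
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

print(f"without PFC: {resonance_hz(L_f, C_f):7.1f} Hz")
print(f"with PFC:    {resonance_hz(L_f, C_f + C_pfc):7.1f} Hz")
```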

Relevance: 30.00%

Abstract:

Over the past decade, the mining industry has come to recognise the importance of water both to itself and to others. Water accounting is a formalisation of this importance that quantifies and communicates how water is used by individual sites and the industry as a whole. While there are a number of different accounting frameworks that could be used within the industry, the Minerals Council of Australia's (MCA) Water Accounting Framework (WAF) is an industry-led approach that provides a consistent representation of mine site water interactions regardless of their operational, social or environmental context, allowing valid comparisons between sites and companies. The WAF contains definitions of offsite water sources and destinations and onsite water use, a methodology for applying the definitions, and a set of metrics to measure site performance. The WAF comprises two models: the Input-Output Model, which represents the interactions between sites and their surrounding community, and the Operational Model, which represents onsite water interactions. Members of the MCA have recently adopted the WAF's Input-Output Model to report on their external water interactions in their Australian operations, with some adopting it on a global basis. To support this adoption, there is a need for companies to better understand how to implement the WAF in their own operations. Developing a water account is non-trivial, particularly for sites unfamiliar with the WAF or for sites that need to represent unusual features. This work describes how to build a water account for a given site using the Input-Output Model, with an emphasis on how to represent challenging situations.
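
As a rough illustration of what an Input-Output Model account records, the sketch below tallies offsite inputs and outputs for a hypothetical site and reports the implied change in storage. The categories echo the WAF's input/output framing, but the entries and volumes are invented.

```python
# A minimal sketch of an Input-Output Model account: offsite sources (inputs)
# and offsite destinations (outputs), with a simple balance check.
site_account = {
    "inputs_ML": {            # water received from offsite sources
        "surface water": 1200.0,
        "groundwater": 800.0,
        "third party (town supply)": 150.0,
        "precipitation and runoff": 400.0,
    },
    "outputs_ML": {           # water returned to offsite destinations
        "discharge to surface water": 600.0,
        "evaporation": 1100.0,
        "entrainment in product/waste": 700.0,
    },
}

total_in = sum(site_account["inputs_ML"].values())
total_out = sum(site_account["outputs_ML"].values())
print(f"inputs  {total_in:7.1f} ML")
print(f"outputs {total_out:7.1f} ML")
print(f"change in site storage {total_in - total_out:+.1f} ML")
```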

Relevance: 30.00%

Abstract:

Using a case study approach, this paper presents a robust methodology for assessing the compatibility of stormwater treatment performance data between two geographical regions in relation to a treatment system. The desktop analysis compared data derived from a field study undertaken in Florida, USA, with South East Queensland (SEQ) rainfall and pollutant characteristics. The analysis was based on the hypothesis that, when transposing treatment performance information from one geographical region to another, a detailed assessment of specific rainfall and stormwater quality parameters is required. Accordingly, the characteristics of measured rainfall events and stormwater quality in the Florida study were compared with typical characteristics for SEQ. Rainfall events monitored in the Florida study were found to be similar to events that occur in SEQ in terms of their primary characteristics of depth, duration and intensity. Similarities in total suspended solids (TSS) and total nitrogen (TN) concentration ranges for Florida and SEQ suggest that TSS and TN removal performance would not be very different if the treatment system were installed in SEQ. However, further investigation is needed to evaluate the treatment performance for total phosphorus (TP). The methodology presented also allows comparison of other water quality parameters.

Relevance: 30.00%

Abstract:

Background: Accurate diagnosis is essential for prompt and appropriate treatment of malaria. While rapid diagnostic tests (RDTs) offer great potential to improve malaria diagnosis, the sensitivity of RDTs has been reported to be highly variable. One possible factor contributing to variable test performance is the diversity of parasite antigens. This is of particular concern for Plasmodium falciparum histidine-rich protein 2 (PfHRP2)-detecting RDTs, since PfHRP2 has been reported to be highly variable in isolates of the Asia-Pacific region.

Methods: The pfhrp2 exon 2 fragment from 458 isolates of P. falciparum collected from 38 countries was amplified and sequenced. For a subset of 80 isolates, the exon 2 fragment of histidine-rich protein 3 (pfhrp3) was also amplified and sequenced. Sequence and statistical analyses of the variation observed in these genes were conducted. The potential impact of pfhrp2 variation on RDT detection rates was examined by analysing the relationship between the sequence characteristics of this gene and the results of the WHO product testing of malaria RDTs: Round 1 (2008), for 34 PfHRP2-detecting RDTs.

Results: Sequence analysis revealed extensive variation in the number and arrangement of various repeats encoded by the genes in parasite populations world-wide. However, no statistically robust correlation between gene structure and RDT detection rate for P. falciparum parasites at 200 parasites per microlitre was identified.

Conclusions: The results suggest that, despite extreme sequence variation, diversity of PfHRP2 does not appear to be a major cause of RDT sensitivity variation.
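
The repeat analysis described in the Methods can be pictured as tokenising the translated exon 2 sequence into known repeat motifs and counting each type. In the sketch below, the two motifs follow the repeat-type naming used in the PfHRP2 literature (type 2: AHHAHHAAD; type 7: AHHAAD), while the toy sequence is invented, not an isolate from the study.

```python
# Count amino-acid repeat motifs in a toy PfHRP2 exon 2 translation.
import re
from collections import Counter

# Longer motif listed first so the alternation prefers the full type-2 repeat
# (type 7 is a suffix of type 2 and would otherwise be double-counted).
MOTIFS = {"AHHAHHAAD": "type 2", "AHHAAD": "type 7"}
PATTERN = re.compile("|".join(MOTIFS))   # "AHHAHHAAD|AHHAAD"

def count_repeats(protein: str) -> Counter:
    """Tokenise the sequence into known repeats and count each type."""
    return Counter(MOTIFS[m] for m in PATTERN.findall(protein))

toy_seq = "AHHAHHAAD" * 3 + "AHHAAD" * 2 + "AHHAHHAAD"
print(count_repeats(toy_seq))   # Counter({'type 2': 4, 'type 7': 2})
```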

Relevance: 30.00%

Abstract:

The Minerals Council of Australia's (MCA) Water Accounting Framework (WAF) is an industry-led initiative to enable cross-company communication and comparison of water management performance. The WAF consists of two models: the Input-Output Model, which represents water interactions between an operation and its surrounding environment, and the Operational Model, which represents water interactions within an operation. Recently, MCA member companies have agreed to use the Input-Output Model to report on their external water interactions in Australian operations, with some adopting it globally. The next step will be to adopt the Operational Model. This will expand the functionality of the WAF from corporate reporting to allowing widespread identification of inefficiencies and to connecting internal and external interactions. Implementing the WAF, particularly the Operational Model, is non-trivial. It can be particularly difficult for operations that are unfamiliar with the WAF definitions and methodology, lack information pertaining to flow volumes, or contain unusual configurations. Therefore, there is a need to help industry with its implementation. This work presents a step-by-step guide to producing the Operational Model: identifying pertinent objects (stores, tasks and treatments), quantifying flows, aggregating objects and producing reports. It then discusses how the Operational Model can represent a series of challenging scenarios and how it can be connected with the Input-Output Model to improve water management.
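
As a rough sketch of the Operational Model objects named above, the following code connects stores, tasks and treatments with quantified flows and reports a per-object balance; the network and volumes are invented, and the real methodology covers much more (aggregation rules, reporting metrics).

```python
# A minimal sketch of Operational Model objects (stores, tasks, treatments)
# joined by quantified flows, with a per-object balance report.
# Each flow: (source object, destination object, volume in ML).
flows = [
    ("raw water dam",    "processing plant", 900.0),  # store -> task
    ("processing plant", "tailings dam",     700.0),  # task -> store
    ("tailings dam",     "water treatment",  300.0),  # store -> treatment
    ("water treatment",  "raw water dam",    280.0),  # treatment -> store (reuse)
]

def balance(flows):
    """Net inflow minus outflow for every object in the flow network."""
    net = {}
    for src, dst, vol in flows:
        net[src] = net.get(src, 0.0) - vol
        net[dst] = net.get(dst, 0.0) + vol
    return net

for obj, vol in balance(flows).items():
    print(f"{obj:17s} net {vol:+8.1f} ML")
```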