72 results for semi binary based feature detector/descriptor
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for the classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity to establish complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large data sets.
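As an illustration of the kind of classifier described above, the following is a minimal Python/NumPy sketch of a complex-valued extreme learning machine: random complex input weights are fixed and only the output weights are solved by a least-squares (pseudoinverse) fit. All names and the choice of activation are assumptions made here for illustration; this is not the authors' implementation.

import numpy as np

def train_complex_elm(X, T, n_hidden=200, seed=0):
    """X: complex feature vectors (e.g. amplitude * exp(1j*phase)); T: one-hot targets."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Random, fixed complex input weights and biases (not trained)
    W = rng.standard_normal((n_features, n_hidden)) + 1j * rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)            # complex hidden-layer activations
    beta = np.linalg.pinv(H) @ T      # output weights by Moore-Penrose pseudoinverse
    return W, b, beta

def predict_complex_elm(X, W, b, beta):
    H = np.tanh(X @ W + b)
    return np.argmax(np.real(H @ beta), axis=1)

# Hypothetical usage: combine amplitude and phase spectra into complex features
# X = amplitude * np.exp(1j * phase)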
Abstract:
A method for estimating both the Alfvén speed and the field-aligned flow of the magnetosheath at the magnetopause reconnection site is presented. The method employs low-altitude cusp ion observations and requires the identification of a feature in the cusp ion spectra near the low-energy cutoff, which will often be present for a low-latitude dayside reconnection site. The appearance of these features in data of limited temporal, energy, and pitch angle resolution is illustrated using model calculations of cusp ion distribution functions. These are based on the theory of ion acceleration at the dayside magnetopause and allow for the effects on the spectrum of the flight times of ions precipitating down newly opened field lines. In addition, the variation of the reconnection rate can be evaluated, and comparison with ground-based observations of the corresponding sequence of transient events allows the field-aligned distance from the ionosphere to the reconnection site to be estimated.
Abstract:
Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare because different data sets and methodologies were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus how they would perform in clinical practice, where there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as the reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used, consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
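For readers who wish to reproduce this style of evaluation, the short scikit-learn sketch below shows how multi-class accuracy and a one-vs-rest AUC could be computed for three diagnostic groups. The arrays are hypothetical placeholders and the one-vs-rest averaging is an assumption; this is not the challenge's own evaluation code.

import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

# Hypothetical predictions for a three-class problem (0 = control, 1 = MCI, 2 = AD)
y_true = np.array([0, 1, 2, 2, 1, 0])
y_prob = np.array([[0.7, 0.2, 0.1],
                   [0.2, 0.5, 0.3],
                   [0.1, 0.3, 0.6],
                   [0.2, 0.2, 0.6],
                   [0.3, 0.4, 0.3],
                   [0.6, 0.3, 0.1]])   # per-class probabilities
y_pred = y_prob.argmax(axis=1)

acc = accuracy_score(y_true, y_pred)
auc = roc_auc_score(y_true, y_prob, multi_class="ovr")  # one-vs-rest, macro-averaged
print(f"accuracy = {acc:.3f}, AUC = {auc:.3f}")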
Abstract:
Wireless video sensor networks have been a hot topic in recent years. Monitoring is their central capability, and the services offered by a wireless video sensor network can be classified into three major categories: monitoring, alerting, and information on-demand. These features have been applied to a large number of applications related to the environment (agriculture, water, forest and fire detection), military, buildings, health (elderly people and home monitoring), disaster relief, and area and industrial monitoring. Security applications oriented toward critical infrastructures and disaster relief are very important applications that many countries have identified as critical for the near future. This paper aims to design a cross-layer based protocol to provide the required quality of service for security-related applications using wireless video sensor networks. Energy saving, delay and reliability of the delivered data are crucial in the proposed application. Simulation results show that the proposed cross-layer based protocol offers good performance in terms of providing the required quality of service for the proposed application.
Abstract:
Image registration is a fundamental step that greatly affects later processes in image mosaicking, multi-spectral image fusion, digital surface modelling, etc., where the final solution requires blending pixel information from more than one image. It is highly desirable to find a way to identify registration regions among input stereo image pairs with high accuracy, particularly in remote sensing applications in which ground control points (GCPs) are not always available, such as when selecting a landing zone on an extraterrestrial planet. In this paper, a framework for localization in image registration is developed. It strengthens local registration accuracy in two respects: lower reprojection error and better feature point distribution. The affine scale-invariant feature transform (ASIFT) is used to acquire feature points and correspondences on the input images. A homography matrix is then estimated as the transformation model by an improved random sample consensus (IM-RANSAC) algorithm. In order to identify a registration region with a better spatial distribution of feature points, the Euclidean distance between feature points is applied (named the S criterion). Finally, the parameters of the homography matrix are optimized by the Levenberg–Marquardt (LM) algorithm using selected feature points from the chosen registration region. In the experimental section, Chang’E-2 satellite remote sensing imagery is used to evaluate the performance of the proposed method. The experimental results demonstrate that the proposed method can automatically locate a specific region with high registration accuracy between input images, achieving a lower root mean square error (RMSE) and a better distribution of feature points.
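A minimal OpenCV sketch of this kind of feature-based registration pipeline is given below. It uses plain SIFT and OpenCV's standard RANSAC as stand-ins for ASIFT and IM-RANSAC (neither of which ships with OpenCV), and the file names are hypothetical; it is a simplified illustration rather than the paper's method.

import cv2
import numpy as np

# Load the two images to register (hypothetical file names)
img1 = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("sensed.png", cv2.IMREAD_GRAYSCALE)

# Detect and match features (SIFT as a stand-in for ASIFT)
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# Estimate the homography with RANSAC (standard RANSAC, not IM-RANSAC)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)

# Reprojection RMSE over the inliers as a simple registration-quality measure
inliers = mask.ravel().astype(bool)
proj = cv2.perspectiveTransform(src[inliers], H)
rmse = np.sqrt(np.mean(np.sum((proj - dst[inliers]) ** 2, axis=2)))
print(f"inliers: {inliers.sum()}, RMSE: {rmse:.2f} px")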
Abstract:
SCOPE: A high intake of n-3 PUFA provides health benefits via changes in the n-6/n-3 ratio in blood. In addition to such dietary PUFAs, variants in the fatty acid desaturase 1 (FADS1) gene are also associated with altered PUFA profiles. METHODS AND RESULTS: We used mathematical modelling to predict levels of PUFA in whole blood, based on MHT- and bolasso-selected food items, anthropometric and lifestyle factors, and the rs174546 genotypes in FADS1 from 1,607 participants (Food4Me Study). The models were developed using data from the first reported time point (training set) and their predictive power was evaluated using data from the last reported time point (test set). Among other food items, fish, pizza, chicken and cereals were identified as being associated with the PUFA profiles. Using these food items and the rs174546 genotypes as predictors, the models explained 26% to 43% of the variability in PUFA concentrations in the training set and 22% to 33% in the test set. CONCLUSIONS: Selecting food items using MHT is a valuable contribution to determining predictors, as our models' predictive power is higher than that of analogous studies. As a unique feature, we additionally confirmed our models' predictive power on a test set.
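The train/test modelling workflow described above can be pictured with a short scikit-learn sketch. The file names, column names and the choice of a plain linear model are assumptions for illustration, not the study's actual model.

import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical data frames: one row per participant, columns for selected food
# items, the rs174546 genotype (coded 0/1/2) and the measured PUFA level.
predictors = ["fish", "pizza", "chicken", "cereals", "rs174546"]
train = pd.read_csv("timepoint_first.csv")   # training set (first time point)
test = pd.read_csv("timepoint_last.csv")     # test set (last time point)

model = LinearRegression().fit(train[predictors], train["pufa"])
print("explained variance, training:", r2_score(train["pufa"], model.predict(train[predictors])))
print("explained variance, test:    ", r2_score(test["pufa"], model.predict(test[predictors])))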
Abstract:
This paper describes a new approach to detecting and tracking maritime objects in real time. The approach particularly addresses the highly dynamic maritime environment, panning cameras and target scale changes, and operates on both visible and thermal imagery. Object detection is based on agglomerative clustering of temporally stable features. Object extents are first determined based on the persistence of detected features and their relative separation and motion attributes. An explicit cluster merging and splitting process handles object creation and separation. Stable object clusters are tracked frame-to-frame. The effectiveness of the approach is demonstrated on four challenging real-world public datasets.
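The grouping step can be pictured with the following scikit-learn sketch, which agglomeratively clusters feature points by spatial proximity. The distance threshold, linkage choice and the point array are illustrative assumptions, not the paper's parameters.

import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Hypothetical (x, y) positions of temporally stable feature points in one frame
points = np.array([[12, 30], [14, 32], [15, 29],          # object A
                   [200, 110], [203, 112], [198, 115]])   # object B

# Merge points closer than a pixel-distance threshold into object clusters
clustering = AgglomerativeClustering(n_clusters=None,
                                     distance_threshold=20.0,
                                     linkage="single")
labels = clustering.fit_predict(points)
print(labels)   # e.g. [0 0 0 1 1 1]: two object clusters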
Abstract:
Biaxially oriented films produced from semi-crystalline, semi-aromatic polyesters are utilised extensively as components within various applications, including the specialist packaging, flexible electronic and photovoltaic markets. However, the thermal performance of such polyesters, specifically poly(ethylene terephthalate) (PET) and poly(ethylene-2,6-naphthalate) (PEN), is inadequate for several applications that require greater dimensional stability at higher operating temperatures. The work described in this project is therefore primarily focussed upon the copolymerisation of rigid comonomers with PET and PEN, in order to produce novel polyester-based materials that exhibit superior thermomechanical performance, with retention of crystallinity, to achieve biaxial orientation. Rigid biphenyldiimide comonomers were readily incorporated into PEN and poly(butylene-2,6-naphthalate) (PBN) via a melt-polycondensation route. For each copoly(ester-imide) series, retention of semi-crystalline behaviour is observed throughout entire copolymer composition ratios. This phenomenon may be rationalised by cocrystallisation between isomorphic biphenyldiimide and naphthalenedicarboxylate residues, which enables statistically random copolymers to melt-crystallise despite high proportions of imide sub-units being present. In terms of thermal performance, the glass transition temperature, Tg, linearly increases with imide comonomer content for both series. This facilitated the production of several high performance PEN-based biaxially oriented films, which displayed analogous drawing, barrier and optical properties to PEN. Selected PBN copoly(ester-imide)s also possess the ability to either melt-crystallise, or form a mesophase from the isotropic state depending on the applied cooling rate. An equivalent synthetic approach based upon isomorphic comonomer crystallisation was subsequently applied to PET by copolymerisation with rigid diimide and Kevlar®-type amide comonomers, to afford several novel high performance PET-based copoly(ester-imide)s and copoly(ester-amide)s that all exhibited increased Tgs. Retention of crystallinity was achieved in these copolymers by either melt-crystallisation or thermal annealing. The initial production of a semi-crystalline, PET-based biaxially oriented film with a Tg in excess of 100 °C was successful, and this material has obvious scope for further industrial scale-up and process development.
Abstract:
A Universal Serial Bus (USB) Mass Storage Device (MSD), often termed a USB flash drive, is ubiquitously used to store important information in unencrypted binary format. This low-cost consumer device is incredibly popular due to its size, large storage capacity and relatively high transfer speed. However, if the device is lost or stolen, an unauthorized person can easily retrieve all the information. Therefore, it is advantageous in many applications to provide security protection so that only authorized users can access the stored information. In order to provide security protection for a USB MSD, this paper proposes a session key agreement protocol that follows secure user authentication. The main aim of this protocol is to establish session key negotiation through which all the information retrieved from, stored on and transferred to the USB MSD is encrypted. The proposed protocol is not only efficient but also, unlike other protocols in the literature, resistant to forgery and password-guessing attacks. This paper analyses the security of the proposed protocol through a formal analysis, which proves that the information is stored confidentially and is protected with strong resilience to relevant security attacks. The computational and communication costs of the proposed scheme are analyzed and compared to related work, showing that the proposed scheme offers an improved trade-off between computational cost, communication cost and security.
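The general idea of deriving a per-session key after authentication and using it to encrypt everything written to the drive can be sketched with the Python cryptography package as follows. This is a generic illustration with hypothetical inputs, not the paper's protocol, which additionally specifies the authentication and key-negotiation messages and their attack resistance.

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical shared secret agreed during the (omitted) authentication phase
shared_secret = os.urandom(32)

# Derive a per-session key from the shared secret
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=os.urandom(16), info=b"usb-msd-session").derive(shared_secret)

# Encrypt a block of data before it is written to the USB MSD
aead = AESGCM(session_key)
nonce = os.urandom(12)
ciphertext = aead.encrypt(nonce, b"file contents to store", None)

# The authorized host decrypts with the same session key and nonce
plaintext = aead.decrypt(nonce, ciphertext, None)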
Abstract:
This study evaluated the use of Pluronic F127 and Pluronic F68 as excipients for formulating in situ gelling systems for ocular drug delivery. Thermal transitions have been studied in aqueous solutions of Pluronic F127, Pluronic F68 as well as their binary mixtures using differential scanning calorimetry, rheological measurements, and dynamic light scattering. It was established that the formation of transparent gels at physiologically relevant temperatures is observed only in the case of 20 wt % of Pluronic F127. The addition of Pluronic F68 to Pluronic F127 solutions increases the gelation temperature of binary formulation to above physiological range of temperatures. The biocompatibility evaluation of these formulations using slug mucosa irritation assay and bovine corneal erosion studies revealed that these polymers and their combinations do not cause significant irritation. In vitro drug retention study on glass surfaces and freshly excised bovine cornea showed superior performance of 20 wt % Pluronic F127 compared to other formulations. In addition, in vivo studies in rabbits demonstrated better retention performance of 20 wt % Pluronic F127 compared to Pluronic F68. These results confirmed that 20 wt % Pluronic F127 offers an attractive ocular formulation that can form a transparent gel in situ under physiological conditions with minimal irritation.
Abstract:
Subspace clustering groups a set of samples drawn from a union of several linear subspaces into clusters, so that the samples in the same cluster are drawn from the same linear subspace. In the majority of the existing work on subspace clustering, clusters are built based on feature information, while sample correlations in their original spatial structure are simply ignored. Besides, the original high-dimensional feature vectors contain noisy/redundant information, and the time complexity grows exponentially with the number of dimensions. To address these issues, we propose a tensor low-rank representation (TLRR) and sparse coding-based (TLRRSC) subspace clustering method that simultaneously considers feature information and spatial structure. TLRR seeks the lowest-rank representation over the original spatial structure along all spatial directions. Sparse coding learns a dictionary along the feature space, so that each sample can be represented by a few atoms of the learned dictionary. The affinity matrix used for spectral clustering is built from the joint similarities in both the spatial and feature spaces. TLRRSC can well capture the global structure and inherent feature information of data, and provides a robust subspace segmentation from corrupted data. Experimental results on both synthetic and real-world data sets show that TLRRSC outperforms several established state-of-the-art methods.
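The final step, spectral clustering on an affinity matrix that combines feature and spatial similarities, can be pictured with the sketch below. The way the two similarities are combined here (an element-wise product of Gaussian kernels) and the data are illustrative assumptions, not the TLRRSC construction.

import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel

# Hypothetical data: feature vectors and 2-D spatial coordinates per sample
features = np.random.rand(100, 20)
coords = np.random.rand(100, 2)

# Joint affinity from feature similarity and spatial proximity (illustrative choice)
affinity = rbf_kernel(features, gamma=1.0) * rbf_kernel(coords, gamma=10.0)

labels = SpectralClustering(n_clusters=3,
                            affinity="precomputed").fit_predict(affinity)
print(labels[:10])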
Abstract:
Probabilistic hydro-meteorological forecasts have, over the last decades, been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational use, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydrometeorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants’ willingness-to-pay for a forecast, the results of the game show that the value (or usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
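A standard way to turn an event probability into the binary protect/do-not-protect decision mentioned above is the cost-loss rule: protect when the forecast probability exceeds the ratio of the protection cost C to the avoidable loss L. The sketch below uses hypothetical numbers and the generic textbook rule, not necessarily the payoff scheme used in the game.

# Cost-loss decision rule (illustrative numbers)
cost = 2_000.0      # cost of taking protective action
loss = 10_000.0     # avoidable loss if the flood occurs and no action was taken
threshold = cost / loss   # protect when the forecast probability exceeds 0.2

def decide(prob_of_flood: float) -> str:
    """Binary decision from a probabilistic forecast."""
    return "protect" if prob_of_flood > threshold else "do not protect"

print(decide(0.15))  # 'do not protect'
print(decide(0.35))  # 'protect'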