881 results for Signature Verification, Forgery Detection, Fuzzy Modeling
Abstract:
We propose a novel technique for robust voice activity detection (VAD) in high-noise recordings. We use Gaussian mixture modeling (GMM) to train two generic models: speech and non-speech. We then score smaller segments of a given (unseen) recording against each of these GMMs to obtain two respective likelihood scores for each segment. These scores are used to compute a dissimilarity measure between pairs of segments and to carry out complete-linkage clustering of the segments into speech and non-speech clusters. We compare the accuracy of our method against state-of-the-art and standardised VAD techniques, demonstrating an absolute improvement of 15% in half-total error rate (HTER) over the best-performing baseline system across the QUT-NOISE-TIMIT database. We then apply our approach to the Audio-Visual Database of American English (AVDBAE) to demonstrate the performance of our algorithm using visual, audio-visual, or a proposed fusion of these features.
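A minimal sketch of the two-GMM scoring and complete-linkage clustering pipeline described above, using synthetic stand-in features; the Euclidean distance between per-segment score pairs is an assumed dissimilarity measure, since the abstract does not give the exact one:

```python
# Sketch of the two-GMM voice activity detection idea (synthetic features).
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
speech_train = rng.normal(1.0, 1.0, size=(500, 12))      # stand-in MFCC-like features
nonspeech_train = rng.normal(-1.0, 1.0, size=(500, 12))

gmm_speech = GaussianMixture(n_components=4, random_state=0).fit(speech_train)
gmm_nonspeech = GaussianMixture(n_components=4, random_state=0).fit(nonspeech_train)

# Score each unseen segment against both models.
segments = [rng.normal(m, 1.0, size=(50, 12)) for m in (1.0, 0.9, -1.1, -0.8)]
scores = np.array([[gmm_speech.score(s), gmm_nonspeech.score(s)] for s in segments])

# Dissimilarity between segments = distance between their score pairs (assumed),
# then complete-linkage clustering into two clusters (speech / non-speech).
dist = squareform(np.linalg.norm(scores[:, None, :] - scores[None, :, :], axis=-1),
                  checks=False)
labels = fcluster(linkage(dist, method="complete"), t=2, criterion="maxclust")
print(labels)  # e.g. [1 1 2 2]
```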
Abstract:
The hemodynamic response function (HRF) describes the local response of brain vasculature to functional activation. Accurate HRF modeling enables the investigation of cerebral blood flow regulation and improves our ability to interpret fMRI results. Block designs have been used extensively as fMRI paradigms because they maximize detection power; however, they are not optimal for HRF parameter estimation. Here we assessed the utility of block design fMRI data for HRF modeling. The trueness (relative deviation), precision (relative uncertainty), and identifiability (goodness-of-fit) of different HRF models were examined, and the test-retest reproducibility of HRF parameter estimates was assessed, using computer simulations and fMRI data from 82 healthy young adult twins acquired on two occasions 3 to 4 months apart. The effects of systematically varying attributes of the block design paradigm were also examined. In our comparison of five HRF models, the model comprising the sum of two gamma functions with six free parameters had the greatest parameter accuracy and identifiability. HRF height and time to peak were highly reproducible between studies, width was moderately reproducible, but the reproducibility of onset time was low. This study established the feasibility and test-retest reliability of estimating HRF parameters from block design fMRI data.
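For reference, a sketch of a six-parameter sum-of-two-gammas HRF; the parameterization below (amplitude, two gamma shape and rate parameters, and an undershoot ratio, with SPM-like default values) is an assumption, as the abstract does not specify the exact form used:

```python
# Double-gamma HRF: positive lobe minus a delayed undershoot (assumed form).
import numpy as np
from scipy.stats import gamma

def hrf(t, amp=1.0, a1=6.0, b1=1.0, a2=16.0, b2=1.0, c=1/6):
    """Sum of two gamma densities: main response minus scaled undershoot."""
    return amp * (gamma.pdf(t, a1, scale=1/b1) - c * gamma.pdf(t, a2, scale=1/b2))

t = np.arange(0, 30, 0.1)
h = hrf(t)
print("time to peak:", t[np.argmax(h)])  # ~5 s with these default parameters
```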
Abstract:
Introduction: two symposia on "cardiovascular diseases and vulnerable plaques". Cardiovascular disease (CVD) is the leading cause of death worldwide. Huge effort has been made in many disciplines, including medical imaging, computational modeling, biomechanics, bioengineering, medical devices, animal and clinical studies, population studies, as well as genomic, molecular, cellular and organ-level studies, seeking improved methods for early detection, diagnosis, prevention and treatment of these diseases [1-14]. However, the mechanisms governing the initiation, progression and occurrence of final acute clinical CVD events are still poorly understood. A large number of victims of these diseases, who are apparently healthy, die suddenly without prior symptoms. Available screening and diagnostic methods are insufficient to identify the victims before the event occurs [8,9]. Most cardiovascular diseases are associated with vulnerable plaques. A grand challenge is to develop new imaging techniques, predictive methods and patient screening tools to identify vulnerable plaques and patients who are more vulnerable to plaque rupture and associated clinical events such as stroke and heart attack, and to recommend proper treatment plans to prevent those clinical events from happening. The articles in this special issue came from two symposia held recently on "Cardiovascular Diseases and Vulnerable Plaques: Data, Modeling, Predictions and Clinical Applications." One was held at Worcester Polytechnic Institute (WPI), Worcester, MA, USA, July 13-14, 2014, right after the 7th World Congress of Biomechanics; this symposium was endorsed by the World Council of Biomechanics and partially supported by a grant from the NIH National Institute of Biomedical Imaging and Bioengineering. The other was held at Southeast University (SEU), Nanjing, China, April 18-20, 2014.
Abstract:
QTL mapping methods for complex traits are challenged by new developments in marker technology, phenotyping platforms, and breeding methods. In meeting these challenges, QTL mapping approaches will also need to acknowledge the central roles of QTL-by-environment interactions (QEI) and QTL-by-trait interactions in the expression of complex traits like yield. This paper presents an overview of mixed model QTL methodology that is suitable for many types of populations and that allows predictive modeling of QEI, both for environmental and developmental gradients. Attention is also given to multi-trait QTL models, which are essential for interpreting the genetic basis of trait correlations. Biophysical (crop growth) model simulations are proposed as a complement to statistical QTL mapping for interpreting the nature of QEI and for investigating better methods to dissect complex traits into component traits and their genetic controls.
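As a schematic illustration of what a QEI mixed model can look like, consider a factorial-regression form in which an environmental covariate modulates each QTL effect; the notation below is illustrative, not taken from the paper:

```latex
% Illustrative QTL-by-environment mixed model (notation assumed, not the paper's).
% Phenotype of genotype i in environment j:
\[
  y_{ij} = \mu + E_j + \sum_{q} x_{iq}\,\bigl(\alpha_q + \beta_q z_j\bigr) + G_i + \varepsilon_{ij}
\]
% E_j: environment main effect;  x_{iq}: marker genotype at QTL q;
% \alpha_q: main QTL effect;  \beta_q z_j: QEI along an environmental gradient z_j;
% G_i: residual (polygenic) genotype effect;  \varepsilon_{ij}: error term.
```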
Abstract:
Uncertainties associated with the structural model and measured vibration data may lead to unreliable damage detection. In this paper, we show that geometric and measurement uncertainty cause considerable problems in damage assessment, which can be alleviated by using a fuzzy logic-based approach for damage detection. Curvature damage factors (CDFs) of a tapered cantilever beam are used as damage indicators. Monte Carlo simulation (MCS) is used to study the changes in the damage indicator due to uncertainty in the geometric properties of the beam. Variations in these CDF measures due to randomness in the structural parameters, further contaminated with measurement noise, are used for developing and testing a fuzzy logic system (FLS). Results show that the method correctly identifies both single and multiple damages in the structure. For example, the FLS detects damage with an average accuracy of about 95 percent in a beam having geometric uncertainty of 1 percent COV and measurement noise of 10 percent in the single-damage scenario. For the multiple-damage case, the FLS identifies damages in the beam with an average accuracy of about 94 percent in the presence of the above-mentioned uncertainties. The paper brings together the disparate areas of probabilistic analysis and fuzzy logic to address uncertainty in structural damage detection.
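A toy sketch of the Monte Carlo plus fuzzy-logic idea: perturb a damage-indicator reading with geometric uncertainty and measurement noise, then classify each sample via fuzzy memberships. All membership functions and numbers are invented; the paper's rule base is not given in the abstract:

```python
# Monte-Carlo + fuzzy classification sketch (illustrative numbers only).
import numpy as np

rng = np.random.default_rng(1)

def triangular(x, a, b, c):
    """Standard triangular membership function on [a, c] peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Perturb a nominal curvature-damage-factor reading: 1% COV geometric
# uncertainty plus 10% measurement noise, as in the scenario above.
nominal_cdf = 0.30                       # hypothetical damaged-element reading
samples = (nominal_cdf * (1 + rng.normal(0, 0.01, 10_000))
           + rng.normal(0, 0.10 * nominal_cdf, 10_000))

# Fuzzy sets over the CDF axis: 'undamaged' and 'damaged' (assumed shapes).
mu_undamaged = triangular(samples, -0.05, 0.0, 0.15)
mu_damaged = triangular(samples, 0.10, 0.35, 0.60)

# Classify each sample by the stronger membership; report average accuracy.
accuracy = np.mean(mu_damaged > mu_undamaged)
print(f"damage detected in {accuracy:.1%} of Monte Carlo samples")
```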
Abstract:
In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability; earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented. For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. Regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
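A minimal sketch of the delegation idea at the heart of such tree-like classification: a cheap classifier answers when confident and hands uncertain inputs to a costlier one. The models, data, and threshold below are illustrative assumptions, not the thesis's framework:

```python
# Two-stage delegation: cheap model first, expensive model for hard inputs.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

cheap = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
costly = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

def predict_with_delegation(x, threshold=0.8):
    """Raising `threshold` delegates more inputs: the accuracy/effort knob."""
    p = cheap.predict_proba(x.reshape(1, -1))[0]
    if p.max() >= threshold:             # confident: stop early, save effort
        return int(np.argmax(p)), "cheap"
    return int(costly.predict(x.reshape(1, -1))[0]), "costly"

preds, stages = zip(*(predict_with_delegation(x) for x in X_te))
print("accuracy:", np.mean(np.array(preds) == y_te),
      "| delegated:", stages.count("costly") / len(stages))
```

Because the threshold is applied at prediction time, the trade-off can be retuned after training, which is one of the questions the abstract raises.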
Abstract:
Detect and Avoid (DAA) technology is widely acknowledged as a critical enabler for unsegregated Remotely Piloted Aircraft (RPA) operations, particularly Beyond Visual Line of Sight (BVLOS). Image-based DAA in the visible spectrum is a promising technological option for addressing the challenges DAA presents. Two impediments to progress for this approach are the scarcity of video footage available to train and test algorithms, and the lack of testing regimes and specifications that facilitate repeatable, statistically valid performance assessment. This paper makes three key contributions towards addressing these impediments. First, we detail our progress towards the creation of a large hybrid database of collision and near-collision encounters. Second, we explore the suitability of techniques employed by the biometric research community (Speaker Verification and Language Identification) for DAA performance optimisation and assessment; these techniques include Detection Error Trade-off (DET) curves, Equal Error Rates (EER), and the Detection Cost Function (DCF). Finally, the hybrid database and the speech-based techniques are combined and employed in the assessment of a contemporary image-based DAA system comprising stabilisation, morphological filtering and a Hidden Markov Model (HMM) temporal filter.
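A small sketch of the EER and DCF computations named above, applied to synthetic detection scores; the cost and prior values in the DCF are assumed for illustration:

```python
# EER and a NIST-style detection cost function on synthetic scores.
import numpy as np

rng = np.random.default_rng(2)
target = rng.normal(2.0, 1.0, 1000)      # scores for true aircraft detections
nontarget = rng.normal(0.0, 1.0, 1000)   # scores for clutter / no aircraft

thresholds = np.sort(np.concatenate([target, nontarget]))
p_miss = np.array([(target < t).mean() for t in thresholds])
p_fa = np.array([(nontarget >= t).mean() for t in thresholds])

# EER: operating point where miss and false-alarm rates are equal.
i = np.argmin(np.abs(p_miss - p_fa))
print(f"EER ~ {(p_miss[i] + p_fa[i]) / 2:.3f}")

# DCF with assumed costs and prior (C_miss = C_fa = 1, P_target = 0.5).
c_miss, c_fa, p_tgt = 1.0, 1.0, 0.5
dcf = c_miss * p_miss * p_tgt + c_fa * p_fa * (1 - p_tgt)
print(f"min DCF ~ {dcf.min():.3f}")
```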
Abstract:
The paper presents the results of computational modeling of a damage identification process for an axial rod representing an end-bearing pile foundation with known damage and for a simply supported beam representing a bridge girder. It proposes a methodology for identifying damage from the measured natural frequencies of a contiguously damaged reinforced concrete axial rod and beam, idealized with a distributed damage model. Damage is identified from equal-eigenvalue-change (Iso_Eigen_value_Change) contours plotted between pairs of different frequencies. The performance of the method is checked for a wide variation of damage positions and extents. Experiments conducted on a free-free axially loaded reinforced concrete member and a flexural beam are presented as examples to demonstrate the pros and cons of this method. (C) 2009 Elsevier Ltd. All rights reserved.
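A toy sketch of the iso-eigenvalue-change idea for the axial rod case: compute how each natural frequency changes over a grid of assumed damage locations and severities; contours of equal change for two different frequencies then intersect near the true damage state. The finite-element discretization and the stiffness-reduction damage model below are illustrative assumptions:

```python
# Frequency-change grid for a damaged axial rod (toy FE model).
import numpy as np
from scipy.linalg import eigh

N = 10                                   # elements in the axial rod
E0, A, rho, L = 1.0, 1.0, 1.0, 1.0
le = L / N

def frequencies(damaged_elem, severity):
    """First two natural frequencies with one element's stiffness reduced."""
    K = np.zeros((N + 1, N + 1)); M = np.zeros((N + 1, N + 1))
    for e in range(N):
        Ee = E0 * ((1 - severity) if e == damaged_elem else 1.0)
        ke = (Ee * A / le) * np.array([[1, -1], [-1, 1]])
        me = (rho * A * le / 6) * np.array([[2, 1], [1, 2]])
        K[e:e+2, e:e+2] += ke; M[e:e+2, e:e+2] += me
    K, M = K[1:, 1:], M[1:, 1:]          # fix one end (end-bearing pile)
    return np.sqrt(eigh(K, M, eigvals_only=True)[:2])

f0 = frequencies(0, 0.0)                 # undamaged baseline
# Relative frequency changes over (location, severity): equal-change
# contours from two modes intersect near the true damage state.
for elem in range(N):
    for sev in (0.1, 0.3):
        print(elem, sev, np.round(1 - frequencies(elem, sev) / f0, 4))
```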
Abstract:
The MIT Lincoln Laboratory IDS evaluation methodology is a practical solution for evaluating the performance of Intrusion Detection Systems, and it has contributed tremendously to research progress in that field. The DARPA IDS evaluation dataset has been criticized and is considered by many a very outdated dataset, unable to accommodate the latest trends in attacks. The question then naturally arises as to whether detection systems have improved beyond detecting these older classes of attacks; if not, is it right to consider this dataset obsolete? The paper presented here tries to provide supporting facts for the continued use of the DARPA IDS evaluation dataset. Two commonly used signature-based IDSs, Snort and Cisco IDS, and two anomaly detectors, PHAD and ALAD, are used for this evaluation, and the results support the usefulness of the DARPA dataset for IDS evaluation.
Abstract:
In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, some normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling, and from it derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency behaves differently in bear and bull markets, however: it is strongly positive in rising markets, whereas in bear markets returns are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we conclude that volatility is not easily estimated, even from high-frequency data: it is neither well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe. In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator; this is an interesting area for further research, with important implications for active asset allocation.
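A small sketch of the realized-variance estimator studied in essays two and three, with a simulated microstructure noise term to illustrate the bias the third essay discusses; all parameters are invented for illustration:

```python
# Realized variance from simulated high-frequency returns, with noise bias.
import numpy as np

rng = np.random.default_rng(3)
n = 390                                  # e.g. one-minute returns over a trading day
true_sigma = 0.01                        # daily volatility of the underlying process
true_returns = rng.normal(0, true_sigma / np.sqrt(n), n)
noise = 0.0005 * rng.standard_normal(n + 1)
observed_returns = true_returns + np.diff(noise)   # microstructure contamination

# Realized variance: sum of squared intraday returns.
rv = np.sum(observed_returns ** 2)
print(f"realized vol {np.sqrt(rv):.4f} vs true {true_sigma:.4f}")
# The MA(1)-type noise induces return autocorrelation and inflates RV;
# coarser sampling trades this bias against larger discretization error.
```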
Abstract:
Several orthopoxviruses (OPV) and Borna disease virus (BDV) are enveloped, zoonotic viruses with a wide geographical distribution. OPV antibodies cross-react, and former smallpox vaccination has therefore protected human populations from another OPV infection, rodent-borne cowpox virus (CPXV). Cowpox in humans and cats usually manifests as a mild, self-limiting dermatitis with constitutional symptoms, but it can be severe and even life-threatening in the immunocompromised. Classical Borna disease is a progressive meningoencephalomyelitis in horses and sheep, known in central Europe for centuries. Nowadays the virus, or a close relative, infects humans and several other species in central Europe and elsewhere, but the existence of human Borna disease, with its suspected neuropsychiatric symptoms, is controversial. The epidemiology of BDV is largely unknown, and the present situation is even more intriguing following the recent detection of several-million-year-old, endogenized BDV genes in primate and various other vertebrate genomes. The aims of this study were to elucidate the importance of CPXV and BDV in Finland and in possible host species, and in particular to 1) establish relevant methods for the detection of CPXV and other OPVs as well as BDV in Finland, 2) determine whether CPXV and BDV exist in Finland, 3) discover how common OPV immunity is in different age groups in Finland, 4) characterize possible disease cases and clarify their epidemiological context, 5) establish the hosts and possible reservoir species of these viruses and their geographical distribution in wild rodents, and 6) elucidate the infection kinetics of BDV in the bank vole. An indirect immunofluorescence assay and avidity measurement were established for the detection, timing and verification of OPV or BDV antibodies in thousands of blood samples from humans, horses, ruminants, lynxes, gallinaceous birds, dogs, cats and rodents. The mostly vaccine-derived OPV seroprevalence was found to decrease gradually with the year of birth of the sampled human subjects, from 100% to 10% in those born after 1977. On the other hand, OPV antibodies indicating natural contact with CPXV or other OPVs were commonly found in domestic and wild animals: the horse, cow, lynx, dog, cat and, with a prevalence occasionally as high as 92%, wild rodents, including some previously undetected species and new regions. Antibodies to BDV were detected in humans, horses, a dog, cats and, for the first time, in wild rodents such as bank voles (Myodes glareolus). Because of the controversy within the human Borna disease field, extra verification methods were established for the BDV antibody findings: recombinant nucleocapsid and phosphoproteins were produced in Escherichia coli and in a baculovirus system, and peptide arrays were additionally applied. With these verification assays, Finnish human, equine, feline and rodent BDV infections were confirmed. Taken together, wide host spectra were evident for both OPV and BDV infections based on the antibody findings, and OPV infections were found to be geographically broadly distributed. PCR amplification methods were applied to hundreds of blood and tissue samples. The methods included conventional, nested and real-time PCRs, with or without a reverse transcription step, targeting four OPV genes and two BDV genes, respectively. OPV DNA could be amplified from two human patients and three bank voles, whereas no BDV RNA was detected in naturally infected individuals.
Based on the phylogenetic analyses, the Finnish OPV sequences were closely related, although not identical, to a Russian CPXV isolate, and clearly different from other CPXV strains. Moreover, the Finnish sequences matched only each other, whereas the short amplicons obtained from German rodents were identical to monkeypox virus in addition to German CPXV variants, reflecting the close relationship of all OPVs. In summary, RNA of the Finnish BDV variant could not be detected with the available PCR methods, but OPV DNA occasionally could. The OPV species infecting the patients of this study was proven to be CPXV, which is most probably also responsible for the rodent infections. Multiple cell lines and some newborn rodents were utilised in attempts to isolate CPXV and BDV from patient and wildlife samples. CPXV could be isolated from a child with severe, generalised cowpox. BDV isolation attempts from rodents were unsuccessful in this study; however, in parallel studies, a transient BDV infection of cells inoculated with equine brain material was detected, and BDV antigens were discovered in archival animal brains using the established immunohistology. Thus, based on several independent methods, both CPXV and BDV (or a closely related agent) were shown to be present in Finland. Bank voles could be productively infected with BDV. This experimental infection did not result in notable pathological findings or symptoms, despite the intense spread of the virus in the central and peripheral nervous systems. Infected voles commonly excreted the virus in urine and faeces, which emphasises their possible role as a BDV reservoir. Moreover, BDV RNA was regularly reverse-transcribed into DNA in bank voles, which was detected by amplifying DNA by PCR without a reverse transcription step and verified with nuclease treatments. This finding indicates that BDV genes could become endogenized during an acute infection. Although further transmission studies are needed, this experimental infection demonstrated that the bank vole can function as a potential BDV reservoir. In summary, multiple methods were established and applied to large sample panels to detect two zoonoses novel to Finland: cowpox virus and Borna disease virus. Moreover, new information was obtained on their geographical distribution, host spectrum, epidemiology and infection kinetics.
Abstract:
Damage detection by measuring and analyzing vibration signals in a machine component is an established procedure in mechanical and aerospace engineering. This paper presents vibration signature analysis of steel bridge structures in a nonconventional way using artificial neural networks (ANNs). Multilayer perceptrons trained with the back-propagation algorithm have been adopted. The training patterns, in terms of vibration signatures, are generated analytically for a moving load traveling on a trussed bridge structure at a constant speed, simulating the inspection vehicle. Using the finite-element technique, the moving forces are converted into stationary time-dependent force functions in order to generate vibration signals in the structure, and these are used to train the network. The performance of the trained networks is examined for their capability to detect damage from unknown signatures taken independently at one, three, and five nodes. It has been observed that prediction using the trained network with single-node signature measurement at a suitably chosen location is even better than that with three-node and five-node measurement data.
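A minimal sketch of the multilayer-perceptron setup described above: vibration signatures in, damage class out. The data are synthetic stand-ins for the analytically generated moving-load signatures, and the network size is an assumption:

```python
# MLP damage classification from (synthetic) vibration signatures.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_samples, n_points = 600, 128           # single-node signature, 128 time samples
damage_class = rng.integers(0, 3, n_samples)   # 0: none, 1: member A, 2: member B

# Hypothetical signatures: class-dependent frequency content plus noise.
t = np.linspace(0, 1, n_points)
signatures = np.array([np.sin(2 * np.pi * (5 + 2 * c) * t) for c in damage_class])
signatures += 0.3 * rng.standard_normal(signatures.shape)

X_tr, X_te, y_tr, y_te = train_test_split(signatures, damage_class, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)                       # gradient (back-propagation) training
print("test accuracy:", net.score(X_te, y_te))
```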
Abstract:
We have imaged the H92α and H75α radio recombination line (RRL) emission from the starburst galaxy NGC 253 with a resolution of ~4 pc. The peak of the RRL emission at both frequencies coincides with the unresolved radio nucleus. Both lines observed toward the nucleus are extremely wide, with FWHMs of ~200 km s^-1. Modeling the RRL and radio continuum data for the radio nucleus shows that the lines arise in gas whose density is ~10^4 cm^-3 and whose mass is a few thousand solar masses, which requires an ionizing flux of (6-20) x 10^51 photons s^-1. We consider a supernova remnant (SNR) expanding in a dense medium, a star cluster, and an active galactic nucleus (AGN) as potential ionizing sources. Based on dynamical arguments, we rule out an SNR as a viable ionizing source. A star cluster model is considered, and the dynamics of the ionized gas in a stellar-wind-driven structure are investigated. Such a model is consistent with the properties of the ionized gas only for a cluster younger than ~10^5 yr, and the existence of such a young cluster at the nucleus seems improbable. The third model assumes the ionizing source to be an AGN at the nucleus. In this model, it is shown that the observed X-ray flux is too weak to account for the required ionizing photon flux; however, the ionization requirement can be explained if the accretion disk is assumed to have a big blue bump in its spectrum. Hence, we favor an AGN at the nucleus as the source responsible for ionizing the gas producing the observed RRLs. A hybrid model consisting of an inner advection-dominated accretion flow disk and an outer thin disk is suggested, which could explain the radio, UV, and X-ray luminosities of the nucleus.
Abstract:
In a detailed model for reservoir irrigation that takes into account the soil moisture dynamics in the root zone of the crops, the data sets for reservoir inflow and rainfall in the command area will usually be of sufficient length for their variations to be described by probability distributions. However, the potential evapotranspiration of the crop depends on the characteristics of the crop and the reference evaporation, the quantification of both being associated with a high degree of uncertainty. The main purpose of this paper is to propose a mathematical programming model to determine the annual relative yield of crops, and its reliability, for a single reservoir serving multiple irrigated crops, incorporating variations in inflow, rainfall in the command area, and crop consumptive use. The inflow to the reservoir and the rainfall in the command area are treated as random variables, whereas potential evapotranspiration is modeled as a fuzzy set. The model's application is illustrated with reference to an existing single-reservoir system in southern India.
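A toy sketch of this combination of randomness and fuzziness: potential evapotranspiration as a triangular fuzzy number whose alpha-cuts bound crop water demand, with seasonal water availability drawn randomly, feeding a small linear program for relative yields. All quantities and the two-crop setup are invented for illustration, not the paper's model:

```python
# Fuzzy PET alpha-cuts + random water availability + a small allocation LP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)

def pet_interval(alpha, low=4.0, mode=5.0, high=6.5):
    """Alpha-cut of a triangular fuzzy PET (mm/day): higher alpha, tighter cut."""
    return low + alpha * (mode - low), high - alpha * (high - mode)

for alpha in (0.0, 0.5, 1.0):
    pet_lo, pet_hi = pet_interval(alpha)
    water = rng.normal(900, 100)          # random seasonal inflow + rainfall (mm)
    demand = np.array([pet_hi * 120, pet_hi * 90])   # two crops, worst-case PET
    # Maximize total relative yield y1 + y2 subject to water availability;
    # linprog minimizes, so the objective is negated.
    res = linprog(c=[-1, -1],
                  A_ub=[demand.tolist()], b_ub=[water],
                  bounds=[(0, 1), (0, 1)])
    print(f"alpha={alpha:.1f}: PET in [{pet_lo:.1f}, {pet_hi:.1f}] mm/day, "
          f"relative yields ~ {np.round(res.x, 2)}")
```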
Abstract:
Network Intrusion Detection Systems (NIDS) intercept traffic at an organization's network periphery to thwart intrusion attempts. A signature-based NIDS compares the intercepted packets against its database of known vulnerability and malware signatures to detect such cyber attacks. These signatures are represented using Regular Expressions (REs) and strings; because of their greater expressive power, REs are preferred over simple strings for writing signatures. We present a Cascaded Automata Architecture that performs memory-efficient RE pattern matching using existing string matching solutions. The proposed architecture performs two-stage RE pattern matching: we replace the substring and character-class components of each RE with new symbols, and we address the challenges involved in this approach. We augment the word-based automata obtained from the rewritten REs with counter-based states and length-bounded transitions to perform full RE pattern matching. We evaluated our architecture on REs taken from Snort rulesets and were able to reduce the number of automata states by between 50% and 85%. Additionally, we reduced the number of transitions by a factor of 3, leading to a further reduction in memory requirements.
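A toy sketch of the two-stage idea (not the paper's architecture): stage one rewrites literal substrings into single symbols, which is the job of a string matcher such as Aho-Corasick; stage two runs a compact regex over the rewritten stream. The signature, symbols, and rewrite below are invented for illustration, and the paper's automata additionally use counter states and length-bounded transitions:

```python
# Two-stage matching: literal substrings -> symbols, then a small regex.
import re

# A signature-style regex containing literal substrings (invented example):
#   GET /admin.*passwd=[0-9]{4}

# Stage 1: map the literals to private-use symbols via string matching.
symbols = {"GET /admin": "\ue000", "passwd=": "\ue001"}

def stage1(payload: str) -> str:
    for literal, sym in symbols.items():
        payload = payload.replace(literal, sym)   # stand-in for Aho-Corasick
    return payload

# Stage 2: the rewritten regex operates on symbols, so its automaton is
# far smaller than one spelling out every literal character.
rewritten = re.compile("\ue000.*\ue001[0-9]{4}")

payload = "GET /admin/login?passwd=1234 HTTP/1.1"
print(bool(rewritten.search(stage1(payload))))    # True
```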