868 results for Rule-based techniques
Abstract:
Temperatures have increased and in-crop rainfall has decreased over recent decades in many parts of the Australian wheat cropping region. With these trends set to continue or intensify, improving crop adaptation in the face of climate change is particularly urgent in this already drought-prone cropping region. Importantly, improved performance under water limitation must be achieved while retaining yield potential during more favourable seasons. A multi-trait-based approach to improving wheat yield and yield stability under water limitation and heat has been instigated in northern Australia using novel phenotyping techniques and a nested association mapping (NAM) approach. An innovative laboratory technique allows rapid root trait screening of hundreds of lines; because it uses soil-grown seedlings, the method offers significant advantages over many other lab-based techniques. Another recently developed method allows novel stay-green traits to be quantified objectively for hundreds of genotypes in standard field trial plots. Field trials in multiple locations and seasons allow evaluation of targeted trait values and identification of superior germplasm. Traits, including yield and yield components, are measured for hundreds of NAM lines in rainfed environments under various levels of water limitation. To rapidly generate lines of interest, the University of Queensland "speed breeding" method is being employed, allowing up to seven plant generations per annum; a NAM population of over 1000 wheat recombinant inbred lines has been progressed to the F5 generation within 18 months. Genotyping the NAM lines with the genome-wide DArTseq molecular marker system provides up to 40,000 markers, which are now being used for association mapping to validate QTL previously identified in bi-parental populations and to identify novel QTL for stay-green and root traits. We believe that combining the latest techniques in physiology, phenotyping, genetics and breeding will increase genetic progress toward improved adaptation to water-limited environments.
Abstract:
Orgasm is a subjective experience accompanied by involuntary muscle contractions. We hypothesized that orgasm in women would be distinguishable by frequency analysis of a perineal muscle-derived signal. Rectal pressure, an index of perineal muscle activity, was measured continuously in 23 healthy women during different sexual tasks: receiving clitoral stimulation, imitation of orgasm, and an attempt to reach orgasm, in which case the women were asked to report whether orgasm had been reached ("orgasm") or not ("failed orgasm attempt"). We performed spectral analysis on the rectal pressure data and calculated the spectral power in the frequency bands delta (0.5-4 Hz), theta (4-8 Hz), alpha (8-13 Hz), and beta (13-25 Hz). The largest and most significant difference in spectral power between orgasm and both control motor tasks (imitation of orgasm and failed orgasm attempt) was found in the alpha band. An objective rule based on spectral power in the alpha band recognized 94% (29/31) of orgasms and correctly labeled 69% (44/64) of all orgasm attempts as either successful or failed. Because outbursts of alpha fluctuations in rectal pressure occurred only during orgasm, and not during voluntary imitation of orgasm or failed attempts, we propose that they represent involuntary contractions of muscles in the rectal vicinity. This is the first objective and quantitative measure that corresponds strongly with the subjective experience of orgasm.
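As an illustration of the decision rule described above, here is a minimal Python sketch of band-power computation with an alpha-band classification rule. The sampling rate and threshold value are assumptions for illustration, not the study's fitted parameters.

```python
# Minimal sketch: spectral band power of a pressure trace, with an
# alpha-band decision rule. Sampling rate and threshold are assumed values.
import numpy as np
from scipy.signal import welch

FS = 100  # sampling rate in Hz (assumption)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 25)}

def band_powers(pressure, fs=FS):
    """Return the spectral power of the trace in each frequency band."""
    freqs, psd = welch(pressure, fs=fs, nperseg=4 * fs)
    df = freqs[1] - freqs[0]
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in BANDS.items()}

def classify_attempt(pressure, alpha_threshold):
    """Label an orgasm attempt successful when alpha-band power exceeds
    the threshold (the threshold here is a hypothetical placeholder)."""
    return band_powers(pressure)["alpha"] > alpha_threshold
```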
Abstract:
To obtain data on phytoplankton dynamics with improved spatial and temporal resolution, and at reduced cost, traditional phytoplankton monitoring methods have been supplemented with optical approaches. In this thesis, I explored various fluorescence-based techniques for detecting phytoplankton abundance, taxonomy and physiology in the Baltic Sea. In the algal cultures used in this thesis, nitrogen availability and light conditions caused changes in pigmentation, and consequently in the light absorption and fluorescence properties of cells. In the Baltic Sea, physical environmental factors (e.g. mixing depth, irradiance and temperature) and the related seasonal succession in the phytoplankton community explained a large part of the seasonal variability in the magnitude and shape of chlorophyll a (Chla)-specific absorption. The variability in Chla-specific fluorescence was related to the abundance of cyanobacteria, the size structure of the phytoplankton community, and the absorption characteristics of phytoplankton. Cyanobacteria show very low Chla-specific fluorescence, so in the presence of eukaryotic species, Chla fluorescence describes cyanobacteria poorly. During a cyanobacterial bloom in the Baltic Sea, phycocyanin fluorescence explained a large part of the variability in Chla concentrations; thus, both Chla and phycocyanin fluorescence were required to predict Chla concentration. Phycobilins are the major light-harvesting pigments of cyanobacteria. In the open Baltic Sea, small picoplanktonic cyanobacteria were the main source of the phycoerythrin fluorescence and absorption signal. Large filamentous cyanobacteria, which form harmful blooms, were the main source of the phycocyanin fluorescence signal, and typically their biomass and phycocyanin fluorescence were linearly related. Using phycocyanin fluorescence, the dynamics of cyanobacterial blooms can be detected at a spatial and seasonal resolution not possible with other methods. Various taxonomic phytoplankton pigment groups can be separated by spectral fluorescence. I compared multivariate calibration methods for the retrieval of phytoplankton biomass in different taxonomic groups: the partial least squares regression method gave the closest predictions for all taxonomic groups, and its accuracy was adequate for phytoplankton bloom detection. Variable fluorescence has been proposed as a tool to study the physiological state of phytoplankton. My results from the Baltic Sea emphasize that variable fluorescence alone cannot be used to detect nutrient limitation of phytoplankton. However, when combined with active nutrient manipulation experiments and other nutrient limitation indices, variable fluorescence provided valuable information on the physiological responses of the phytoplankton community. This thesis also found a severe limitation of a commercial fast repetition rate fluorometer, which could not detect the variable fluorescence of phycoerythrin-lacking cyanobacteria: for these species, Photosystem II absorption of blue light is very low, and the fluorometer's excitation light did not saturate Photosystem II during a measurement. This thesis encourages the use of various in vivo fluorescence methods for the detection of bulk phytoplankton biomass, the biomass of cyanobacteria, the chemotaxonomy of the phytoplankton community, and phytoplankton physiology.
Fluorescence methods can support traditional phytoplankton monitoring by providing continuous measurements of phytoplankton, and thereby strengthen the understanding of the links between biological, chemical and physical processes in aquatic ecosystems.
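As a sketch of the multivariate calibration step mentioned above (partial least squares regression from spectral fluorescence to per-group biomass), the following uses scikit-learn with synthetic data; the channel count, component count and the data themselves are assumptions.

```python
# Minimal sketch: PLS calibration from spectral fluorescence to the biomass
# of taxonomic pigment groups. All data below are synthetic stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((120, 16))        # fluorescence spectra: 120 samples, 16 channels
Y = X @ rng.random((16, 3))      # "true" biomass of 3 pigment groups

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
pls = PLSRegression(n_components=5)   # component count is an assumption
pls.fit(X_tr, Y_tr)
Y_hat = pls.predict(X_te)             # predicted biomass per group
```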
Abstract:
Atmospheric aerosol particles have a strong impact on the global climate. A deep understanding of the physical and chemical processes affecting the atmospheric aerosol-climate system is crucial in order to describe those processes properly in global climate models. Besides their climatic effects, aerosol particles can deteriorate, for example, visibility and human health. Nucleation is a fundamental step in atmospheric new particle formation; however, details of the atmospheric nucleation mechanisms have remained unresolved, mainly because no instruments have been capable of measuring neutral newly formed particles below 3 nm in diameter. This thesis aims to extend the detectable particle size range towards the close-to-molecular sizes (~1 nm) of freshly nucleated clusters, and to obtain, by direct measurement, the concentrations of sub-3 nm particles both in the atmosphere and under well-defined laboratory conditions. In the work presented in this thesis, new methods and instruments for sub-3 nm particle detection were developed and tested. The selected approach comprises four different condensation-based techniques and one electrical detection scheme; all are capable of detecting particles with diameters well below 3 nm, some even down to ~1 nm. The developed techniques and instruments were deployed in field measurements as well as in laboratory nucleation experiments. Ambient air studies showed that a persistent population of 1-2 nm particles or clusters exists in a boreal forest environment; the observation was made using four different instruments, demonstrating a consistent capability for direct measurement of atmospheric nucleation. The results from the laboratory experiments showed that sulphuric acid is a key species in atmospheric nucleation. The mismatch between earlier laboratory data and ambient observations on the dependency of nucleation rate on sulphuric acid concentration was explained: it was shown to arise from the inefficient growth of the nucleated clusters and from the insufficient detection efficiency of the particle counters used in the previous experiments. Even though the exact molecular steps of nucleation remain an open question, the instrumental techniques developed in this work, together with their application in laboratory and ambient studies, opened a new view into atmospheric nucleation and prepared the way for investigating nucleation processes with more suitable tools.
Abstract:
The motivation behind the fusion of Intrusion Detection Systems was the realization that, with increasing traffic and increasingly complex attacks, no present-day stand-alone Intrusion Detection System can meet the demand for a very high detection rate together with an extremely low false positive rate. Multi-sensor fusion can meet these requirements by refining the combined response of different Intrusion Detection Systems. In this paper, we show a sensor fusion design technique that best utilizes the useful responses from multiple sensors by an appropriate adjustment of the fusion threshold. The threshold is generally chosen from past experience or by an expert system; here we show that choosing the threshold bounds according to the Chebyshev inequality performs better. This approach also helps to solve the problem of scalability and has the advantage of failsafe capability. The paper theoretically models the fusion of Intrusion Detection Systems in order to prove the improvement in performance, supplemented with an empirical evaluation. The combination of complementary sensors is shown to detect more attacks than the individual components; since the individual sensors chosen detect sufficiently different attacks, their results can be merged for improved performance. The combination is done in different ways: (i) taking all the alarms from each system and avoiding duplications, (ii) taking alarms from each system within fixed threshold bounds, and (iii) rule-based fusion with a priori knowledge of individual sensor performance. A number of evaluation metrics are used, and the results indicate an overall enhancement in the performance of the combined detector using sensor fusion with the threshold bounds, and significantly better performance using simple rule-based fusion.
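A minimal sketch of the threshold-bound idea, assuming the fused score's mean and standard deviation under benign traffic have been estimated; the Chebyshev inequality then bounds the false-positive rate without distributional assumptions. The function names and fusion-by-sum scheme are illustrative, not the paper's.

```python
# Minimal sketch: Chebyshev-based fusion threshold. For any distribution,
# P(X >= mean + k*std) <= 1/k**2, so k = 1/sqrt(alpha) bounds the
# false-positive rate of the fused detector by alpha.
import math

def chebyshev_threshold(mean, std, alpha):
    """Alarm threshold whose benign-traffic exceedance rate is <= alpha."""
    k = 1.0 / math.sqrt(alpha)
    return mean + k * std

def fused_alarm(sensor_scores, mean, std, alpha=0.01):
    """Raise a fused alarm when the summed sensor scores cross the bound.
    Summation as the fusion rule is an illustrative choice."""
    return sum(sensor_scores) >= chebyshev_threshold(mean, std, alpha)
```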
Abstract:
In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling phases and, given the algorithm, derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. Serial dependency, however, behaves differently in bear and bull markets: it is strongly positive in rising markets, whereas bear markets are closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly; the results also suggest that volatility is at times non-stationary. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we conclude that volatility is not easily estimated, even from high-frequency data: it is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we recommend simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe. In analyzing long-term return dependency in the first moment, we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
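For reference, realized variance as studied in essays two and three is simply the sum of squared high-frequency log returns; a minimal sketch with synthetic one-minute prices follows.

```python
# Minimal sketch: realized variance from high-frequency prices. The price
# series below is synthetic; real data would come from tick or bar records.
import numpy as np

def realized_variance(prices):
    """Sum of squared intraday log returns. Autocorrelation in returns
    biases this estimator, as the third essay shows by simulation."""
    r = np.diff(np.log(prices))
    return float(np.sum(r ** 2))

rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0, 0.001, size=390)))  # 1-min bars
rv = realized_variance(prices)  # one day's realized variance estimate
```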
Abstract:
Extraction of text areas from document images with complex content and layout is a challenging task. A few texture-based techniques have been proposed for extracting such text blocks, but most are so computationally expensive that they are far from realizable in real time. In this work, we propose a modification of two existing texture-based techniques to reduce the computation, accomplished with Harris corner detectors. The efficiency of the two texture-based algorithms, one based on Gabor filters and the other on log-polar wavelet signatures, is compared. A combination of Gabor-feature-based texture classification performed on the smaller set of Harris-corner-detected points is observed to deliver both accuracy and efficiency.
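A minimal sketch of the proposed speed-up, restricting a Gabor filter bank to Harris corner points rather than every pixel, using OpenCV; the image path, corner threshold and filter parameters are illustrative assumptions, not the paper's values.

```python
# Minimal sketch: Gabor texture features computed only at Harris corners.
# Parameters and the input path are assumptions, not the paper's values.
import cv2
import numpy as np

img = cv2.imread("document_page.png", cv2.IMREAD_GRAYSCALE)  # assumed input

# Harris corners restrict where texture features are evaluated.
harris = cv2.cornerHarris(np.float32(img), blockSize=2, ksize=3, k=0.04)
points = np.argwhere(harris > 0.01 * harris.max())  # (row, col) pairs

# Small Gabor filter bank over four orientations.
kernels = [cv2.getGaborKernel((21, 21), sigma=4.0, theta=t,
                              lambd=10.0, gamma=0.5)
           for t in np.linspace(0, np.pi, 4, endpoint=False)]
responses = [cv2.filter2D(img, cv2.CV_32F, k) for k in kernels]

# One feature vector per corner point instead of per pixel.
features = np.array([[r[y, x] for r in responses] for y, x in points])
```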
Abstract:
Conventional Random Access Scan (RAS) for testing has lower test application time, power dissipation, and test data volume compared to standard serial scan chain based designs. In this paper, we present two cluster-based techniques, namely Serial Input Random Access Scan and Variable Word Length Random Access Scan, to reduce test application time even further by exploiting the parallelism among the clusters and performing write operations on multiple bits. Experimental results on benchmark circuits show, on average, a 2-3 times speed-up in test write time and a 60% reduction in write test data volume.
Abstract:
The dynamic response of a single-span cable to a travelling seismic excitation is studied in this paper. The influence of the propagation time between the supports is investigated in detail, and the importance of considering both the vertical and the longitudinal equations of motion in the analysis is highlighted. The results indicate the considerable influence of the time-lagged support motions on the cable's dynamic tension. A modal combination rule based on the response spectrum method is developed to arrive at peak estimates of the cable response. Some significant aspects of cable behaviour, especially under horizontal support motion, are discussed.
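The abstract does not give the rule itself; as a point of comparison, the square-root-of-sum-of-squares (SRSS) combination is the standard response-spectrum rule, sketched below with made-up per-mode peaks.

```python
# Minimal sketch: SRSS modal combination, the standard response-spectrum
# rule (the paper develops its own rule for cables; this is illustrative).
import numpy as np

def srss_peak(modal_peaks):
    """Combine per-mode peak responses into one peak estimate."""
    return float(np.sqrt(np.sum(np.asarray(modal_peaks) ** 2)))

peak_tension = srss_peak([12.0, 5.5, 2.1])  # per-mode peak tensions (made up)
```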
Abstract:
We introduce a variation density function that profiles the relationship between multiple scalar fields over the isosurfaces of a given scalar field. This profile serves as a valuable tool for multifield data exploration because it gives the user cues for identifying interesting isovalues of the scalar fields. Existing isosurface-based techniques for scalar data exploration, such as Reeb graphs, contour spectra, and isosurface statistics, study a scalar field in isolation; we argue that the identification of interesting isovalues in a multifield data set should necessarily be based on the interaction between the different fields. We demonstrate the effectiveness of our approach by applying it to data from a wide variety of applications.
Abstract:
The 21st century poses many challenges for global sustainability. Most importantly, the human race will encounter scarcity of raw materials and conventional energy resources, and India may have to bear the brunt of these problems, as it is set to become the most populated region of the world, with a concomitant increase in energy demand and in the requirement of other resources. India will be the testing ground for introducing newer forms of green technology and innovative principles of resource management and utilization. With the vagaries of potential climate change looming in the background, the Earth sciences will have a special and predominant role in guiding society in prioritizing resource discovery, utilization and consumption, and the upkeep of the environment. On the fundamental level, the Earth sciences are going through a most exciting phase of development as a born-again science. Technological breakthroughs, including satellite-based observations, augur well for gaining new insights into Earth processes. A set of exciting fundamental problems that have been identified globally will set the stage for an exhilarating period of new discoveries. Improvements in numerical and computer-based techniques will assist in modelling Earth processes to unprecedented levels. India will have to make a special effort to improve the existing experimentation facilities in the Earth science departments of the country, and also the general level of Earth science education, to meet global standards. This article presents an Earth science vision for the 21st century in the Indian context.
Abstract:
The problem of spurious patterns in neural associative memory models is discussed. Some suggestions from the literature for solving this problem are reviewed and their inadequacies pointed out. For the Hebb learning rule, a solution based on the notion of neural self-interaction with a suitably chosen magnitude is presented. For an optimal learning rule based on linear programming, asymmetric dilution of the synaptic connections is presented as another solution to the problem of spurious patterns. With varying percentages of asymmetric dilution, it is demonstrated numerically that this optimal learning rule leads to near-total suppression of spurious patterns. For practical use of neural associative memory networks, a combination of the two solutions with the optimal learning rule is recommended as the best proposition.
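A minimal sketch of the Hebb rule with an explicit self-interaction (diagonal) term, as in the first solution above; the magnitude is left as a free parameter, since the abstract says only that it must be suitably chosen.

```python
# Minimal sketch: Hebbian storage with a chosen self-interaction term.
import numpy as np

def hebb_weights(patterns, self_interaction):
    """W = (1/N) * sum_p x_p x_p^T, with the diagonal set to the chosen
    self-interaction magnitude instead of being zeroed."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, self_interaction)
    return W

def recall(W, state, steps=10):
    """Synchronous sign updates; spurious fixed points are what the
    self-interaction term is chosen to suppress."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state
```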
Abstract:
A fuzzy logic system is developed for helicopter rotor system fault isolation. Inputs to the fuzzy logic system are the deviations of blade bending and torsion response and of vibration measurements from a "good" undamaged helicopter rotor. The rotor system measurements used are flap and lag bending tip deflections, elastic twist deflection at the tip, and the three forces and three moments at the rotor hub. The fuzzy logic system uses rules developed from an aeroelastic model of the helicopter rotor with implanted faults to isolate the fault while accounting for uncertainty in the measurements. The faults modeled include moisture absorption, loss of trim mass, a damaged lag damper, a damaged pitch control system, a misadjusted pitch link, and a damaged flap. Tests with simulated data show that the fuzzy system isolates rotor system faults with an accuracy of about 90-100%. Furthermore, the fuzzy system is robust and gives excellent results even when some measurements are not available. A rule-based expert system built on similar rules from the aeroelastic model performs much more poorly than the fuzzy system in the presence of high levels of uncertainty.
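A minimal sketch of the fuzzy isolation idea, with triangular memberships and two hand-written rules; the real rule base is derived from the aeroelastic model, so the membership shapes and fault set here are purely illustrative.

```python
# Minimal sketch: fuzzy fault isolation from measurement deviations.
# Membership shapes and rules are illustrative, not the paper's.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fault_degrees(dev):
    """Rule firing strengths, min-combined over measurements (dev maps a
    measurement name to its normalized deviation from the healthy rotor)."""
    return {
        "misadjusted pitch link": min(tri(dev["flap_tip"], 0.2, 0.5, 0.8),
                                      tri(dev["twist_tip"], 0.1, 0.4, 0.7)),
        "damaged lag damper": tri(dev["lag_tip"], 0.3, 0.6, 0.9),
    }

# The isolated fault is the one with the highest firing strength.
diagnosis = max(fault_degrees({"flap_tip": 0.55, "twist_tip": 0.4,
                               "lag_tip": 0.1}).items(), key=lambda kv: kv[1])
```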
Abstract:
This paper presents a prototype fuzzy system for the alleviation of network overloads in the day-to-day operation of power systems. The control used for overload alleviation is real power generation rescheduling. Generation Shift Sensitivity Factors (GSSF) are computed accurately using a more realistic operational load flow model. The overloading of lines and the sensitivity of the controlling variables are translated into fuzzy set notation to formulate the relation between line overloading and the controlling ability of generation rescheduling. A fuzzy rule-based system is formed to select the controllers, their direction of movement, and their step size. The overall sensitivity of line loading to each generator is also considered in selecting the controllers. Results obtained for network overload alleviation of two modified Indian power networks, of 24 and 82 buses, with line outage contingencies are presented for illustration.
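A minimal sketch of controller selection from overload severity and GSSF; the membership shape, step size, and sign convention are assumptions, since the abstract does not specify them.

```python
# Minimal sketch: pick rescheduling controllers from overload severity
# and generation shift sensitivity factors (GSSF). All shapes are assumed.
import numpy as np

def overload_severity(flow, limit):
    """Fuzzy 'overloaded' membership rising linearly above the line limit."""
    return float(np.clip((flow - limit) / (0.5 * limit), 0.0, 1.0))

def select_controllers(flow, limit, gssf, step_mw=10.0):
    """Rank generators by |GSSF| and shift each against the overload,
    scaled by severity; positive GSSF means more generation loads the line."""
    severity = overload_severity(flow, limit)
    order = np.argsort(-np.abs(gssf))
    return [(int(g), -np.sign(gssf[g]) * severity * step_mw) for g in order]

shifts = select_controllers(flow=130.0, limit=100.0,
                            gssf=np.array([0.45, -0.30, 0.05]))
```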
Abstract:
Diatoms have become important organisms for monitoring freshwaters, and their value has been recognised on the European, American and African continents. If India is to include diatoms in its current suite of bioindicators, then thorough testing of diatom-based techniques is required. This paper provides guidance on methods through all stages of diatom work, from collection in different stream and lake habitats, through preparation, to examination for the purposes of water quality assessment, that can be adapted to most aquatic ecosystems in India.