147 results for "temporal visualization techniques"


Relevance: 20.00%

Abstract:

The concept of feature selection in a nonparametric, unsupervised learning environment is practically undeveloped because no true measure of the effectiveness of a feature exists in such an environment. The lack of a feature selection phase preceding the clustering process seriously affects the reliability of such learning. New concepts such as significant features, the level of significance of features, and immediate neighborhood are introduced, which implicitly meet the need for feature selection in the context of clustering techniques.

Relevance: 20.00%

Abstract:

The Tambura is an essential drone accompaniment used in Indian music concerts. It acts as an immediate reference of pitch for both the artists and the listeners. The four strings of the Tambura are tuned to the frequency ratio ¾:1:1:½. Careful listening to the Tambura sound reveals that the tonal spectrum is not stationary but time varying. The object of this study is to make a detailed spectrum analysis to find out the nature of the temporal variation of the tonal spectrum of Tambura sound. Results of the analysis are correlated with a perceptual evaluation conducted in a controlled acoustic environment. A significant result of this study is the demonstration of the presence of several notes which are normally not noticed even by a professional artist. The effect of the bridge of the Tambura in producing the so-called "live tone" is explained through time and frequency parameters of Tambura sounds.
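The kind of time-varying spectrum analysis this abstract describes can be sketched with a short-time Fourier transform. The signal below is a synthetic stand-in (decaying harmonics whose relative strengths change over time), not real Tambura audio; all parameters are illustrative.

```python
import numpy as np

def stft_mag(x, fs, win=1024, hop=512):
    """Magnitude short-time Fourier transform with a Hann window."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)), np.fft.rfftfreq(win, 1 / fs)

fs = 8000
t = np.arange(0, 2.0, 1 / fs)
f0 = 220.0
# Synthetic plucked-string-like tone: harmonics k*f0 whose relative
# strengths evolve over time, i.e. a non-stationary tonal spectrum.
x = sum(k * np.exp(-2 * k * t) * np.sin(2 * np.pi * k * f0 * t)
        for k in range(1, 5))
S, freqs = stft_mag(x, fs)
# The strongest partial drifts from the 4th harmonic down to the
# fundamental, because the upper partials decay faster.
peak_first = freqs[np.argmax(S[0])]
peak_last = freqs[np.argmax(S[-1])]
print(peak_first, peak_last)
```

Comparing the dominant frequency bin of the first and last analysis frames is a crude but direct way of exposing the temporal variation of a tonal spectrum.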

Relevance: 20.00%

Abstract:

When a uniform flow of any nature is interrupted, the readjustment of the flow results in concentrations and rarefactions, so that the peak value of the flow parameter will be higher than that which an elementary computation would suggest. When stress flow in a structure is interrupted, there are stress concentrations. These are generally localized and often large in relation to the values indicated by simple equilibrium calculations. With the advent of the industrial revolution, dynamic and repeated loading of materials became commonplace in engine parts and fast-moving vehicles. This led to serious fatigue failures arising from stress concentrations. Also, many metal-forming processes, fabrication techniques and weak-link-type safety systems benefit substantially from the intelligent use or avoidance, as appropriate, of stress concentrations. As a result, in the last 80 years, the study and evaluation of stress concentrations has been a primary objective in the study of solid mechanics. Exact mathematical analysis of stress concentrations in finite bodies presents considerable difficulty for all but a few problems of infinite fields, concentric annuli and the like, treated under the presumption of small-deformation, linear elasticity. A whole series of techniques has been developed to deal with different classes of shapes and domains, causes and sources of concentration, material behaviour, phenomenological formulation, etc. These include real and complex functions, conformal mapping, transform techniques, integral equations, finite differences and relaxation, and, more recently, finite element methods. With the advent of large high-speed computers, the development of finite element concepts and a good understanding of functional analysis, it is now, in principle, possible to obtain, with economy, satisfactory solutions to a whole range of concentration problems by intelligently combining theory and computer application.
An example is the hybridization of continuum concepts with computer based finite element formulations. This new situation also makes possible a more direct approach to the problem of design which is the primary purpose of most engineering analyses. The trend would appear to be clear: the computer will shape the theory, analysis and design.
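The classical illustration of the exact solutions mentioned above is the Kirsch solution for a circular hole in an infinite plate under remote uniaxial tension, which gives a stress concentration factor of 3 at the hole rim. A minimal numeric check (the load and radius values are arbitrary):

```python
import numpy as np

def kirsch_sigma_theta(sigma, a, r, theta):
    """Tangential stress around a circular hole of radius a in an
    infinite plate under remote uniaxial tension sigma (Kirsch, 1898)."""
    return (sigma / 2) * (1 + a**2 / r**2) \
         - (sigma / 2) * (1 + 3 * a**4 / r**4) * np.cos(2 * theta)

sigma, a = 100.0, 1.0
theta = np.linspace(0, 2 * np.pi, 3601)
rim = kirsch_sigma_theta(sigma, a, a, theta)   # stress on the hole rim
print(rim.max() / sigma)   # Kt = 3 at theta = ±90° (sides of the hole)
print(rim.min() / sigma)   # -1 at theta = 0, 180° (compressive)
```

On the rim the expression reduces to sigma * (1 - 2*cos(2*theta)), so the peak stress is three times the remote stress regardless of the hole size, which is exactly the kind of localized, large concentration the passage describes.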

Relevance: 20.00%

Abstract:

Abstract is not available.

Relevance: 20.00%

Abstract:

Urban growth identification and quantification, and knowledge of the rate and trends of growth, would help in regional planning for better infrastructure provision in an environmentally sound way. This requires analysis of spatial and temporal data, which helps in quantifying the trends of growth on a spatial scale. Emerging technologies such as Remote Sensing and Geographic Information Systems (GIS), along with the Global Positioning System (GPS), help in this regard. Remote sensing aids in the collection of temporal data, and GIS helps in spatial analysis. This paper focuses on the analysis of the urban growth pattern, in the form of either radial or linear sprawl, along the Bangalore - Mysore highway. Various GIS base layers, such as built-up areas along the highway, road network, village boundaries, etc., were generated using collateral data such as the Survey of India toposheets. Further, this analysis was complemented with the computation of Shannon's entropy, which helped in identifying the prevalent sprawl zone and rate of growth and in delineating potential sprawl locations. The computation of Shannon's entropy helped in delineating regions with dispersed and compact growth. This study reveals that the Bangalore North and South taluks contributed mainly to the sprawl, with a 559% increase in built-up area over a period of 28 years and a high degree of dispersion. The Mysore and Srirangapatna region showed a 128% change in built-up area and a high potential for sprawl with slightly high dispersion. The degree of sprawl was found to be directly proportional to the distance from the cities.
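Shannon's entropy as typically used in sprawl studies, H = -Σ p_i log(p_i) over n zones with p_i the share of built-up area in zone i, can be sketched as follows. The zone values below are made-up illustrative numbers, not the paper's data:

```python
import math

def shannon_entropy(builtup):
    """Shannon's entropy H = -sum(p_i * log(p_i)) of built-up shares.
    Values near log(n) indicate dispersed (sprawling) growth; values
    near 0 indicate compact growth."""
    total = sum(builtup)
    ps = [b / total for b in builtup if b > 0]
    return -sum(p * math.log(p) for p in ps)

# Hypothetical built-up areas (ha) in 5 buffer zones along a highway
compact = [95, 2, 1, 1, 1]        # growth concentrated near the city
dispersed = [22, 19, 21, 18, 20]  # growth spread along the corridor

h_max = math.log(5)  # maximum possible entropy for 5 zones
print(shannon_entropy(compact) / h_max)    # near 0 -> compact growth
print(shannon_entropy(dispersed) / h_max)  # near 1 -> sprawl
```

Normalizing by log(n) makes entropies comparable across study areas with different numbers of zones, which is how dispersed and compact regions can be delineated on a common scale.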

Relevance: 20.00%

Abstract:

In this paper, the pattern classification problem in tool wear monitoring is solved using nature-inspired techniques such as Genetic Programming (GP) and Ant-Miner (AM). The main advantage of GP and AM is their ability to learn the underlying data relationships and express them in the form of a mathematical equation or simple rules. The knowledge extracted from the training data set using GP and AM takes the form of a Genetic Programming Classifier Expression (GPCE) and rules, respectively. The GPCE and the AM-extracted rules are then applied to the set of data in the testing/validation set to obtain the classification accuracy. A major attraction of GP-evolved GPCEs and AM-based classification is the possibility of obtaining expert-system-like rules that can be applied directly by the user in his or her application. The performance of data classification using GP and AM is as good as the classification accuracy obtained in the earlier study.
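An expert-system-like rule set of the kind Ant-Miner extracts can be illustrated as below. The feature names (vibration RMS, cutting force), thresholds, and sample data are all hypothetical, not the rules or data from the study:

```python
# Hypothetical IF-THEN rules of the kind Ant-Miner might extract for
# tool wear monitoring; features and thresholds are made up.
def classify_tool_state(vib_rms, force):
    if vib_rms > 0.8 and force > 120.0:
        return "worn"
    if vib_rms > 0.8 or force > 150.0:
        return "partially_worn"
    return "fresh"

# Tiny labelled validation set: (vib_rms, force, true_label), synthetic.
samples = [
    (0.20, 90.0, "fresh"),
    (0.90, 130.0, "worn"),
    (0.85, 100.0, "partially_worn"),
    (0.30, 160.0, "partially_worn"),
]
accuracy = sum(classify_tool_state(v, f) == y
               for v, f, y in samples) / len(samples)
print(accuracy)
```

The attraction noted in the abstract is exactly this: the learned classifier is a few human-readable rules that an operator can apply directly, rather than an opaque model.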

Relevance: 20.00%

Abstract:

Understanding the functioning of a neural system in terms of its underlying circuitry is an important problem in neuroscience. Recent developments in electrophysiology and imaging allow one to simultaneously record the activities of hundreds of neurons. Inferring the underlying neuronal connectivity patterns from such multi-neuronal spike train data streams is a challenging statistical and computational problem. This task involves finding significant temporal patterns in vast amounts of symbolic time series data. In this paper we show that frequent episode mining methods from the field of temporal data mining can be very useful in this context. In the frequent episode discovery framework, the data is viewed as a sequence of events, each characterized by an event type and its time of occurrence, and episodes are certain types of temporal patterns in such data. Here we show that, using the set of frequent episodes discovered from multi-neuronal data, one can infer different types of connectivity patterns in the neural system that generated it. For this purpose, we introduce the notion of mining for frequent episodes under certain temporal constraints, whose structure is motivated by the application. We present algorithms for discovering serial and parallel episodes under these temporal constraints. Through extensive simulation studies we demonstrate that these methods are useful for unearthing patterns of neuronal network connectivity.
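Counting a serial episode under a temporal constraint can be sketched as follows. This is a deliberately simplified, non-overlapped counter with a single maximum inter-event delay, a toy version of the idea rather than the paper's actual algorithms; the spike data are synthetic:

```python
def count_serial_episode(events, episode, max_gap):
    """Count non-overlapped occurrences of a serial episode (e.g.,
    neuron A fires, then B, then C) in a time-stamped event stream,
    requiring consecutive matched events to lie within max_gap time
    units. Simplified sketch of frequent-episode counting under a
    temporal constraint."""
    count, i, last_t = 0, 0, None
    for t, etype in events:
        if etype == episode[i] and (i == 0 or t - last_t <= max_gap):
            last_t = t
            i += 1
            if i == len(episode):      # full occurrence found
                count += 1
                i, last_t = 0, None
        elif etype == episode[0]:
            i, last_t = 1, t           # restart the partial match here
    return count

# Spike-train-like stream: (time_ms, neuron). Synthetic example.
spikes = [(1, "A"), (3, "B"), (5, "C"),     # one occurrence, gaps <= 5
          (10, "A"), (20, "B"), (21, "C"),  # broken: A->B gap of 10
          (30, "A"), (31, "B"), (33, "C")]  # second occurrence
print(count_serial_episode(spikes, ["A", "B", "C"], max_gap=5))
```

In the neuroscience setting, the gap constraint encodes plausible synaptic delays: an episode A→B→C whose count is significant only under a tight gap suggests a direct excitatory pathway rather than coincident firing.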

Relevance: 20.00%

Abstract:

Direct numerical simulations (DNS) of spatially growing turbulent shear layers may be performed as temporal simulations by solving the governing equations with some additional terms while imposing streamwise periodicity. These terms are functions of the means whose spatial growth is calculated easily and accurately from statistics of the temporal DNS. Equations for such simulations are derived.

Relevance: 20.00%

Abstract:

Along with useful microorganisms, there are some that cause potential damage to animals and plants. Detection and identification of these harmful organisms in a cost- and time-effective way is a challenge for researchers. The future of detection methods for microorganisms will be guided by biosensors, which have already contributed enormously to sensing and detection technology. Here, we aim to review the use of various biosensors, developed by integrating biological and physicochemical/mechanical properties (of transducers), which can have enormous implications in healthcare, food, agriculture and biodefence. We have also highlighted ways to improve the functioning of biosensors.

Relevance: 20.00%

Abstract:

In the presence of ATP, recA protein forms a presynaptic complex with single-stranded DNA that is an obligatory intermediate in homologous pairing. Presynaptic complexes of recA protein and circular single strands that are active in forming joint molecules can be isolated by gel filtration. These isolated active complexes are nucleoprotein filaments with the following characteristics: (i) a contour length that is at least 1.5 times that of the corresponding duplex DNA molecule, (ii) an ordered structure visualized by negative staining as a striated filament with a repeat distance of 9.0 nm and a width of 9.3 nm, (iii) approximately 8 molecules of recA protein and 20 nucleotide residues per striation. The widened spacing between bases in the nucleoprotein filament means that the initial matching of complementary sequences must involve intertwining of the filament and duplex DNA, unwinding of the latter, or some combination of both to equalize the spacing between nascent base pairs. These experiments support the concept that recA protein first forms a filament with single-stranded DNA, which in turn binds to duplex DNA to mediate both homologous pairing and subsequent strand exchange.

Relevance: 20.00%

Abstract:

We address the issue of noise robustness of reconstruction techniques for frequency-domain optical-coherence tomography (FDOCT). We consider three reconstruction techniques: Fourier, iterative phase recovery, and cepstral techniques. We characterize the reconstructions in terms of their statistical bias and variance and obtain approximate analytical expressions under the assumption of small noise. We also perform Monte Carlo analyses and show that the experimental results are in agreement with the theoretical predictions. It turns out that the iterative and cepstral techniques yield reconstructions with a smaller bias than the Fourier method. The three techniques, however, have identical variance profiles, and their consistency increases linearly as a function of the signal-to-noise ratio.
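The Monte Carlo characterization of an estimator's bias and variance described above can be sketched generically. The toy estimator below (the FFT magnitude of a noisy tone at a known bin) merely stands in for the FDOCT reconstruction techniques; the signal, noise level, and trial count are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n, f_bin, amp, sigma = 256, 10, 1.0, 0.1
t = np.arange(n)
clean = amp * np.cos(2 * np.pi * f_bin * t / n)
true_peak = np.abs(np.fft.rfft(clean))[f_bin]   # noiseless reference

# Monte Carlo: repeat the "reconstruction" on independent noisy copies
trials = 2000
estimates = np.empty(trials)
for k in range(trials):
    noisy = clean + sigma * rng.standard_normal(n)
    estimates[k] = np.abs(np.fft.rfft(noisy))[f_bin]

bias = estimates.mean() - true_peak   # statistical bias of the estimator
variance = estimates.var()            # statistical variance
print(bias, variance)
```

Sweeping sigma in such a simulation and comparing the empirical bias and variance against small-noise analytical approximations is the same experimental logic the paper applies to the Fourier, iterative phase-recovery, and cepstral reconstructions.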

Relevance: 20.00%

Abstract:

This paper focuses on optimisation algorithms inspired by swarm intelligence for satellite image classification from high-resolution multispectral satellite images. Amongst the multiple benefits and uses of remote sensing, one of the most important has been its use in solving the problem of land cover mapping. As the frontiers of space technology advance, the knowledge derived from satellite data has also grown in sophistication. Image classification forms the core of the solution to the land cover mapping problem. No single classifier can satisfactorily classify all the basic land cover classes of an urban region. In both supervised and unsupervised classification methods, evolutionary algorithms are not exploited to their full potential. This work tackles land cover mapping with Ant Colony Optimisation (ACO) and Particle Swarm Optimisation (PSO), which are arguably the most popular algorithms in this category. We present the results of classification techniques using swarm intelligence for the problem of land cover mapping of an urban region. High-resolution QuickBird data have been used for the experiments.
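One way PSO is applied to classification is to let particles search for good class centers. The sketch below uses synthetic 1-D "pixel" values and a simplified nearest-center objective as an illustration of the mechanics; it is not the paper's formulation, and all parameters are illustrative:

```python
import random

random.seed(42)
# Synthetic 1-D pixel values belonging to two spectral classes
data = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]

def fitness(centers):
    """Sum of distances from each sample to its nearest class center
    (lower is better): the clustering objective the swarm minimizes."""
    return sum(min(abs(x - c) for c in centers) for x in data)

# Standard PSO update over two class centers
n_particles, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = [[random.random(), random.random()] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]                 # personal bests
gbest = min(pbest, key=fitness)             # global best

for _ in range(iters):
    for i in range(n_particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if fitness(pos[i]) < fitness(pbest[i]):
            pbest[i] = pos[i][:]
            if fitness(pbest[i]) < fitness(gbest):
                gbest = pbest[i][:]

print(sorted(gbest))   # two centers, one per cluster of pixel values
```

Pixels are then assigned the land cover class of their nearest center; with multispectral data the centers simply become vectors, one component per band.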

Relevance: 20.00%

Abstract:

Frequency multiplication (FM) can be used to design low-power frequency synthesizers. This is achieved by running the VCO at a much reduced frequency while employing a power-efficient frequency multiplier, thereby also eliminating the first few dividers. Quadrature signals can be generated by frequency-multiplying low-frequency I/Q signals; however, this also multiplies the quadrature error of these signals. Another way is to generate additional edges from the low-frequency oscillator (LFO) and develop a quadrature FM. This makes the I/Q precision heavily dependent on process mismatches in the ring oscillator. In this paper we examine the use of fewer edges from the LFO and a single-stage polyphase filter to generate approximate quadrature signals, followed by an injection-locked quadrature VCO to generate high-precision I/Q signals. Simulation comparisons with the existing approach show that the proposed method offers a very good phase accuracy of 0.5° with only a modest increase in power dissipation for the 2.4 GHz IEEE 802.15.4 standard using UMC 0.13 µm RFCMOS technology.
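The single-stage polyphase (RC-CR) behaviour behind the "approximate quadrature" step can be checked numerically: the high-pass and low-pass branches are 90° apart at every frequency, but their amplitudes match only at ω = 1/RC. A minimal sketch (the component values are illustrative, not from the paper):

```python
import cmath
import math

R, C = 1e3, 66e-12                 # illustrative values, f_c ≈ 2.4 MHz
f_c = 1 / (2 * math.pi * R * C)    # amplitude-match (corner) frequency

def branches(f):
    """Low-pass (RC) and high-pass (CR) branch transfer functions."""
    jwrc = 1j * 2 * math.pi * f * R * C
    return 1 / (1 + jwrc), jwrc / (1 + jwrc)

for f in (0.5 * f_c, f_c, 2 * f_c):
    lp, hp = branches(f)
    phase_deg = math.degrees(cmath.phase(hp / lp))  # always ~90°
    gain_ratio = abs(hp) / abs(lp)                  # 1 only at f = f_c
    print(f / f_c, round(phase_deg, 3), round(gain_ratio, 3))
```

Since hp/lp = jωRC, the phase split is exact but the amplitude imbalance grows away from f_c, which is why such a stage yields only approximate quadrature and benefits from the injection-locked QVCO cleanup stage described above.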

Relevance: 20.00%

Abstract:

PURPOSE. To understand the molecular features underlying autosomal dominant congenital cataracts caused by the deletion mutations W156X in human gamma D-crystallin and W157X in human gamma C-crystallin. METHODS. Normal and mutant cDNAs (with the enhanced green fluorescent protein [EGFP] tag at the front) were cloned into the pEGFP-C1 vector, transfected into various cell lines, and observed under a confocal microscope for EGFP fluorescence. Normal and W156X gamma D cDNAs were also cloned into the pET21a(+) vector, and the recombinant proteins were overexpressed in the BL-21(DE3) pLysS strain of Escherichia coli, purified, and isolated. The conformational features, structural stability, and solubility in aqueous solution of the mutant protein were compared with those of the wild type using spectroscopic methods. Comparative molecular modeling was performed to provide additional structural information. RESULTS. Transfection of the EGFP-tagged mutant cDNAs into several cell lines led to the visualization of aggregates, whereas that of the wild-type cDNAs did not. Turning to the properties of the expressed proteins, the mutant molecules show a remarkable reduction in solubility. They also seem to have a greater degree of surface hydrophobicity than the wild-type molecules, most likely accounting for self-aggregation. Molecular modeling studies support these features. CONCLUSIONS. The deletion of the C-terminal 18 residues of human gamma C- and gamma D-crystallins exposes the side chains of several hydrophobic residues to the solvent, causing the molecule to self-aggregate. This feature appears to be reflected in situ on introduction of the mutants into human lens epithelial cells.