19 results for Informatics

at Indian Institute of Science - Bangalore - India


Relevance: 10.00%

Publisher:

Abstract:

Glycomics is the comprehensive structural elucidation and characterization of all glycoforms found in nature and of their dynamic spatiotemporal changes associated with biological processes. The glycocalyx of mammalian cells actively participates in cell-cell, cell-matrix, and cell-pathogen interactions, which impact embryogenesis, growth and development, homeostasis, infection and immunity, signaling, malignancy, and metabolic disorders. Relative to genomics and proteomics, glycomics is just growing out of infancy, with great potential in biomedicine for biomarker discovery, diagnosis, and treatment. However, the immense diversity and complexity of glycan structures and their multiple modes of interaction with proteins pose great challenges for the development of analytical tools for delineating structure-function relationships and understanding the glycocode. Several tools are being developed for glycan profiling based on chromatography, mass spectrometry, glycan microarrays, and glyco-informatics. Lectins, which have long been used in glyco-immunology, printed on a microarray provide a versatile platform for rapid, high-throughput analysis of the glycoforms of biological samples. Herein, we summarize technological advances in lectin microarrays and critically review their impact on glycomics analysis. Challenges remain in terms of expansion to include non-plant-derived lectins, standardization for routine clinical use, development of recombinant lectins, and exploration of the plant kingdom for the discovery of novel lectins.

Relevance: 10.00%

Publisher:

Abstract:

Mobile ad-hoc networks (MANETs) have recently drawn significant research attention since they offer unique benefits and versatility with respect to bandwidth spatial reuse, intrinsic fault tolerance, and low-cost rapid deployment. This paper addresses the issue of delay-sensitive real-time data transport in this type of network. An effective QoS mechanism is therefore required for the speedy transport of real-time data. The QoS issue in MANETs is an open-ended problem. Various QoS measures are incorporated in the upper layers of the network, but only a few techniques address QoS in the MAC layer. There are quite a few QoS techniques in the MAC layer for infrastructure-based wireless networks. The goal and the challenge is to achieve QoS delivery and priority access for real-time traffic in an ad-hoc wireless environment, while maintaining democracy in resource allocation. We propose a MAC layer protocol called the "FCP based FAMA protocol", which allocates channel resources to the needy in a more democratic way by examining the requirements, malicious behavior, and genuineness of each request. We have simulated both FAMA and FCP based FAMA and tested them under various MANET conditions. The simulation results clearly show an improvement in channel utilization and a decrease in delay parameters in the latter case. Our new protocol outperforms other QoS-aware MAC layer protocols.
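The abstract does not spell out the allocation rule; as an illustration only, the sketch below ranks channel requests by a hypothetical score built from a declared priority, a genuineness (trust) estimate, and a misbehaviour penalty, which is one plausible reading of "examining the requirements, malicious behavior and genuineness of the request". All names and weights are assumptions, not the protocol's definitions.

```python
from dataclasses import dataclass

@dataclass
class ChannelRequest:
    node_id: str
    declared_priority: float   # e.g. 1.0 for real-time traffic, 0.2 for best effort
    genuineness: float         # 0..1, hypothetical trust score from past behaviour
    misbehaviour_count: int    # observed malicious or greedy events

def request_score(req: ChannelRequest, penalty: float = 0.5) -> float:
    """Hypothetical scoring rule: weight the declared need by trust and
    penalise nodes with a history of misbehaviour."""
    return req.declared_priority * req.genuineness - penalty * req.misbehaviour_count

def allocate_slots(requests, n_slots):
    """Grant the n_slots highest-scoring requests (a simple 'democratic' allocator)."""
    ranked = sorted(requests, key=request_score, reverse=True)
    return [r.node_id for r in ranked[:n_slots]]

if __name__ == "__main__":
    reqs = [ChannelRequest("A", 1.0, 0.9, 0),
            ChannelRequest("B", 1.0, 0.4, 3),
            ChannelRequest("C", 0.3, 1.0, 0)]
    print(allocate_slots(reqs, 2))   # node B loses its slot to its misbehaviour penalty
```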

Relevance: 10.00%

Publisher:

Abstract:

Image and video filtering is a key image-processing task in computer vision, especially in noisy environments. In most cases the noise source is unknown, which poses a major difficulty for the filtering operation. In this paper we present an error-correction-based learning approach to iterative filtering. A new FIR filter is designed in which the filter coefficients are updated according to the Widrow-Hoff rule. Unlike a standard filter, the proposed filter is able to remove noise without a priori knowledge of the noise. Experimental results show that the proposed filter efficiently removes noise while preserving the edges in the image. We demonstrate the capability of the proposed algorithm by testing it on standard images corrupted by Gaussian noise and on a real-time video containing inherent noise. Experimental results show that the proposed filter outperforms several existing standard filters.
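The paper's filter design is not reproduced here; the following is a minimal sketch of a Widrow-Hoff (LMS) coefficient update for a sliding-window FIR image filter, assuming, purely for illustration, the local window mean as the desired reference signal and a hand-picked step size.

```python
import numpy as np

def lms_denoise(noisy, win=3, mu=1e-3, passes=1):
    """Sketch of an LMS (Widrow-Hoff) adaptive FIR filter for image denoising.
    The local window mean serves as the desired reference here, which is an
    illustrative assumption, not the reference used in the paper."""
    pad = win // 2
    h, w = noisy.shape
    coeffs = np.full((win, win), 1.0 / (win * win))   # start from a mean filter
    out = noisy.astype(float).copy()
    padded = np.pad(out, pad, mode="reflect")
    for _ in range(passes):
        for i in range(h):
            for j in range(w):
                patch = padded[i:i + win, j:j + win]
                desired = patch.mean()                # assumed reference signal
                y = float(np.sum(coeffs * patch))     # FIR filter output
                err = desired - y                     # Widrow-Hoff error term
                coeffs += mu * err * patch            # LMS coefficient update
                out[i, j] = y
    return np.clip(out, 0, 255)
```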

Relevance: 10.00%

Publisher:

Abstract:

This paper presents image reconstruction using the fan-beam filtered backprojection (FBP) algorithm with no backprojection weight from truncated projection data completed by windowed linear prediction (WLP). Image reconstruction from truncated projections aims to reconstruct the object accurately from the available limited projection data. Because the projection data are incomplete, the reconstructed image contains truncation artifacts that extend into the region of interest (ROI), making it unsuitable for further use. Data completion techniques have been shown to be effective in such situations. We use the windowed linear prediction technique for projection completion and then apply the fan-beam FBP algorithm with no backprojection weight for 2-D image reconstruction. We evaluate the quality of the image reconstructed by the fan-beam FBP algorithm with no backprojection weight after WLP completion.
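As a rough illustration of the completion step only, the sketch below extrapolates a truncated projection row by linear prediction fitted with least squares; the prediction order, fitting window, and any tapering window applied in the paper's WLP scheme are assumptions, not the authors' settings, and the FBP reconstruction itself is not shown.

```python
import numpy as np

def lp_extrapolate(row, n_missing, order=10, window=64):
    """Sketch: extend a truncated projection row by linear prediction.
    AR coefficients are fit by least squares on the trailing `window` samples;
    the order/window values are illustrative, not the paper's settings."""
    seg = np.asarray(row[-window:], dtype=float)
    # Regression model: seg[k] ~ sum_i a[i] * seg[k - 1 - i]
    X = np.array([seg[k - order:k][::-1] for k in range(order, len(seg))])
    y = seg[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    ext = list(seg)
    for _ in range(n_missing):
        # Predict the next sample from the most recent `order` samples
        ext.append(float(np.dot(a, ext[-1:-order - 1:-1])))
    return np.concatenate([row, ext[len(seg):]])
```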

Relevance: 10.00%

Publisher:

Abstract:

Mining association rules from a large collection of databases involves two main tasks: generating large itemsets and finding associations among the discovered large itemsets. Existing formalisms for association rules are based on a single transaction database, which is not sufficient to describe association rules in a multiple-database environment. In this paper, we give a general characterization of association rules and provide a framework for knowledge-based mining of multiple databases for association rules.
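For orientation, a minimal sketch of the two tasks is given below: counting itemset support across several transaction databases and deriving rules from frequent 2-itemsets. It simplifies the multi-database setting to pooled counting and is not the knowledge-based framework proposed in the paper.

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(databases, min_support):
    """Sketch: count 1- and 2-itemset support over the union of several
    transaction databases (a simplification of the multi-database setting)."""
    counts, total = Counter(), 0
    for db in databases:
        for txn in db:
            total += 1
            items = sorted(set(txn))
            for it in items:
                counts[(it,)] += 1
            for pair in combinations(items, 2):
                counts[pair] += 1
    return {iset: c / total for iset, c in counts.items() if c / total >= min_support}

def rules(freq, min_conf):
    """Derive X -> Y rules from frequent 2-itemsets, keeping confident ones."""
    out = []
    for iset, sup in freq.items():
        if len(iset) == 2:
            for x, y in (iset, iset[::-1]):
                conf = sup / freq[(x,)]
                if conf >= min_conf:
                    out.append((x, y, sup, conf))
    return out

db1 = [["bread", "milk"], ["bread", "butter"], ["milk", "butter"]]
db2 = [["bread", "milk", "butter"], ["bread", "milk"]]
print(rules(frequent_itemsets([db1, db2], min_support=0.4), min_conf=0.6))
```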

Relevance: 10.00%

Publisher:

Abstract:

Real-time services are traditionally supported on circuit-switched networks. However, there is a need to port these services to packet-switched networks. An architecture for an audio conferencing application over the Internet, in the light of the ITU-T H.323 recommendations, is considered. In a conference, considering packets only from a set of selected clients can reduce speech quality degradation, because mixing packets from all clients can lead to a lack of speech clarity. A distributed algorithm and architecture for selecting clients for mixing is suggested here, based on a new quantifier of voice activity called the "Loudness Number" (LN). The proposed system distributes the computation load and reduces the load on client terminals. The highlights of this architecture are scalability, bandwidth saving and speech quality enhancement. Client selection for playout tries to mimic a physical conference, where the most vocal participants attract more attention. The contributions of the paper are expected to aid implementations of the H.323 recommendations for Multipoint Processors (MPs). A working prototype based on the proposed architecture is already functional.
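The paper defines its own Loudness Number; in the sketch below the LN is approximated, for illustration only, by an exponentially smoothed frame energy, and the top-N clients by LN are selected for mixing. The smoothing factor, mixing rule, and function names are assumptions.

```python
import numpy as np

def loudness_number(frames, alpha=0.9, prev_ln=0.0):
    """Sketch: treat the 'Loudness Number' as an exponentially smoothed frame
    energy (illustrative; the paper defines its own LN quantifier)."""
    ln = prev_ln
    for frame in frames:
        energy = float(np.mean(np.asarray(frame, dtype=float) ** 2))
        ln = alpha * ln + (1 - alpha) * energy
    return ln

def select_clients(client_frames, n_mix=3):
    """Pick the n_mix most 'vocal' clients for mixing, as in a physical
    conference where the most vocal participants attract attention."""
    scores = {cid: loudness_number(frames) for cid, frames in client_frames.items()}
    return sorted(scores, key=scores.get, reverse=True)[:n_mix]

def mix(client_frames, selected):
    """Average the selected clients' latest frames into one output frame."""
    frames = [np.asarray(client_frames[c][-1], dtype=float) for c in selected]
    return np.mean(frames, axis=0)
```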

Relevance: 10.00%

Publisher:

Abstract:

The quest for new drug targets in Plasmodium sp. has underscored malonyl-CoA:ACP transacylase (PfFabD) of the fatty acid biosynthetic pathway in the apicoplast. In this study, a piggyback approach was employed for receptor deorphanization using inhibitors of bacterial FabD enzymes. In the absence of a crystal structure, a theoretical model was constructed using the structural details of homologous enzymes. Sequence and structure analysis localized two conserved pentapeptide motifs, GQGXG and GXSXG, and five key invariant residues, viz. Gln109, Ser193, Arg218, His305 and Gln354, characteristic of the FabD enzyme. Active-site mapping of PfFabD using substrate molecules disclosed the spatial arrangement of the key residues in the cavity. As structurally similar molecules exhibit similar biological activities, signature pharmacophore fingerprints of FabD antagonists were generated using 0D-3D descriptors for molecular similarity-based cluster analysis and were correlated with their binding profiles. Antagonists showing good geometrical fitness scores were grouped in cluster-1, whereas those exhibiting high binding affinities fell in cluster-2. This study sheds light on the active-site environment, reveals hotspots for high-affinity binding, and helps narrow down the virtual screening process by searching for close neighbours of the active compounds.
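As a rough illustration of similarity-based clustering of fingerprints (the descriptors, fingerprints, and clustering settings of the study are not reproduced), the sketch below groups hypothetical binary fingerprints by Tanimoto distance with average-linkage hierarchical clustering.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def tanimoto_distance_matrix(fps):
    """Pairwise 1 - Tanimoto similarity for binary fingerprint vectors."""
    n = len(fps)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            a, b = fps[i], fps[j]
            union = np.sum(a | b)
            sim = np.sum(a & b) / union if union else 0.0
            d[i, j] = d[j, i] = 1.0 - sim
    return d

# Hypothetical 16-bit pharmacophore fingerprints for six antagonists.
rng = np.random.default_rng(0)
fps = [rng.integers(0, 2, 16) for _ in range(6)]
dist = squareform(tanimoto_distance_matrix(fps))      # condensed distance vector
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(labels)                                          # cluster label per compound
```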

Relevance: 10.00%

Publisher:

Abstract:

Large software systems are developed by composing multiple programs. If the programs manipulate and exchange complex data, such as network packets or files, it is essential to establish that they follow compatible data formats. Most of the complexity of data formats is associated with the headers. In this paper, we address compatibility of programs operating over headers of network packets, files, images, etc. As format specifications are rarely available, we infer the format associated with headers by a program as a set of guarded layouts. In terms of these formats, we define and check compatibility of (a) producer-consumer programs and (b) different versions of producer (or consumer) programs. A compatible producer-consumer pair is free of type mismatches and logical incompatibilities such as the consumer rejecting valid outputs generated by the producer. A backward compatible producer (resp. consumer) is guaranteed to be compatible with consumers (resp. producers) that were compatible with its older version. With our prototype tool, we identified 5 known bugs and 1 potential bug in (a) sender-receiver modules of Linux network drivers of 3 vendors and (b) different versions of a TIFF image library.
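A highly simplified sketch of the compatibility check is given below: formats are modelled as sets of guarded layouts, a producer-consumer pair is compatible if every layout the producer may emit is accepted by the consumer, and backward compatibility is checked against previously compatible producers. The guard representation (concrete tag values) and the matching rule are illustrative simplifications, not the paper's analysis.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Layout:
    guard: frozenset          # tag values under which this layout applies
    fields: tuple             # ordered (name, type) pairs of the header

def compatible(producer, consumer):
    """A producer-consumer pair is compatible if every guarded layout the
    producer can emit is matched, guard and fields included, by some consumer
    layout (a strong simplification of the paper's check)."""
    for p in producer:
        if not any(p.guard <= c.guard and p.fields == c.fields for c in consumer):
            return False, p
    return True, None

def backward_compatible(new_consumer, old_consumer, producers):
    """A new consumer is backward compatible if it stays compatible with every
    producer that was compatible with the old consumer."""
    return all(compatible(p, new_consumer)[0]
               for p in producers if compatible(p, old_consumer)[0])

# Hypothetical example: a v2 consumer that dropped support for type=2 headers.
prod = [Layout(frozenset({1}), (("type", "u8"), ("len", "u16"))),
        Layout(frozenset({2}), (("type", "u8"), ("payload", "u32")))]
cons_v1, cons_v2 = prod, [prod[0]]
print(compatible(prod, cons_v2))                      # reports the rejected layout
print(backward_compatible(cons_v2, cons_v1, [prod]))  # False
```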

Relevance: 10.00%

Publisher:

Abstract:

Electrical Impedance Tomography (EIT) is a computerized medical imaging technique that reconstructs electrical impedance images of a domain under test from the boundary voltage-current data measured by EIT electronic instrumentation, using an image reconstruction algorithm. Being a computed tomography technique, EIT injects a constant current into the patient's body through surface electrodes surrounding the domain to be imaged (Omega) and calculates the spatial distribution of electrical conductivity or resistivity of the closed conducting domain from the potentials developed at the domain boundary (partial derivative Omega). Practical phantoms are essential to study, test and calibrate a medical EIT system and to certify it before applying it to patients for diagnostic imaging. EIT phantoms are therefore required to generate boundary data for studying and assessing the instrumentation and inverse solvers in EIT. For proper assessment of the inverse solver of a 2D EIT system, a perfect 2D practical phantom is required. As practical phantoms are assemblies of objects with 3D geometries, developing a practical 2D phantom is a great challenge, and the boundary data generated from practical phantoms with 3D geometry are found inappropriate for assessing a 2D inverse solver. Furthermore, the boundary data errors contributed by the instrumentation are difficult to separate from the errors introduced by the 3D phantoms. Hence, error-free boundary data are essential to assess an inverse solver in 2D EIT. In this direction, a MATLAB-based Virtual Phantom for 2D EIT (MatVP2DEIT) is developed to generate accurate boundary data for assessing 2D-EIT inverse solvers and image reconstruction accuracy. MatVP2DEIT is a MATLAB-based computer program which simulates a phantom in the computer and generates boundary potential data as outputs, using combinations of different phantom parameters as inputs. Phantom diameter, inhomogeneity geometry (shape, size and position), number of inhomogeneities, applied current magnitude, background resistivity and inhomogeneity resistivity are all set as phantom variables, which are provided as input parameters to MatVP2DEIT for simulating different phantom configurations. A constant current injection is simulated at the phantom boundary with different current injection protocols, and the boundary potential data are calculated. Boundary data sets are generated for different phantom configurations obtained with different combinations of the phantom variables, and the resistivity images are reconstructed using EIDORS. Boundary data of virtual phantoms containing inhomogeneities with complex geometries are also generated for different current injection patterns using MatVP2DEIT, and the resistivity imaging is studied. The effect of the regularization method on image reconstruction is also studied with the data generated by MatVP2DEIT. Resistivity images are evaluated by studying the resistivity parameters and contrast parameters estimated from the elemental resistivity profiles of the reconstructed phantom domain. Results show that MatVP2DEIT generates accurate boundary data for different types of single or multiple objects, which are efficient and accurate enough to reconstruct resistivity images in EIDORS.
The spatial resolution studies show that resistivity imaging conducted with boundary data generated by MatVP2DEIT with 2048 elements can reconstruct two circular inhomogeneities placed with a minimum boundary-to-boundary distance of 2 mm. It is also observed that, in MatVP2DEIT with 2048 elements, the boundary data generated for a phantom with a circular inhomogeneity whose diameter is less than 7% of that of the phantom domain can produce resistivity images in EIDORS with a 1968-element mesh. Results also show that MatVP2DEIT accurately generates boundary data for the neighbouring, opposite reference and trigonometric current patterns, which are very suitable for resistivity reconstruction studies. MatVP2DEIT-generated data are also found suitable for studying the effect of different regularization methods on the reconstruction process. By comparing the reconstructed image with the original geometry defined in MatVP2DEIT, it becomes easier to study the resistivity imaging procedure as well as the inverse solver performance. Using the proposed MatVP2DEIT software with modified domains, the cross-sectional anatomy of a number of body parts can be simulated on a PC and the impedance image reconstruction of human anatomy can be studied.
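For context, the sketch below generates electrode drive patterns for the neighbouring, opposite and trigonometric current injection protocols mentioned above, together with a hypothetical phantom configuration in the spirit of MatVP2DEIT's inputs; the finite-element forward solve that turns such a configuration into boundary potentials (the core of MatVP2DEIT/EIDORS) is not shown.

```python
import numpy as np

def neighbouring_pattern(n_elec=16):
    """Adjacent-pair current injection: drive (k, k+1) for each electrode k."""
    return [(k, (k + 1) % n_elec) for k in range(n_elec)]

def opposite_pattern(n_elec=16):
    """Opposite (polar) injection: drive electrodes half a ring apart."""
    return [(k, (k + n_elec // 2) % n_elec) for k in range(n_elec)]

def trigonometric_pattern(n_elec=16, amplitude_ma=1.0):
    """Trigonometric pattern: all electrodes driven simultaneously with
    sinusoidally weighted currents, one pattern per spatial frequency."""
    k = np.arange(n_elec)
    return [amplitude_ma * np.cos(2 * np.pi * m * k / n_elec)
            for m in range(1, n_elec // 2 + 1)]

# Hypothetical phantom configuration; values are illustrative only.
phantom = {
    "diameter_mm": 150,
    "background_resistivity_ohm_m": 4.0,
    "inhomogeneities": [{"shape": "circle", "diameter_mm": 20,
                         "centre_mm": (30, 0), "resistivity_ohm_m": 40.0}],
    "current_ma": 1.0,
    "drive_pairs": neighbouring_pattern(16),
}
```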

Relevance: 10.00%

Publisher:

Abstract:

Natural multispecies acoustic choruses, such as the dusk chorus of a tropical rain forest, consist of simultaneously signalling individuals of different species whose calls travel through a common shared medium before reaching their 'intended' receivers. This causes masking interference between signals and impedes signal detection, recognition and localization. The levels of acoustic overlap depend on a number of factors, including call structure, intensity, habitat-dependent signal attenuation and receiver tuning. In addition, acoustic overlaps should also depend on caller density and the species composition of choruses, including the relative and absolute abundance of the different calling species. In this study, we used simulations to examine the effects of chorus species relative abundance and caller density on the levels of effective heterospecific acoustic overlap in multispecies choruses composed of the calls of five species of crickets and katydids that share the understorey of a rain forest in southern India. We found that, on average, species-even choruses resulted in higher levels of effective heterospecific acoustic overlap than choruses with strong dominance structures. This effect was found consistently across dominance levels ranging from 0.4 to 0.8 for larger choruses of forty individuals. For smaller choruses of twenty individuals, the effect was seen consistently for dominance levels of 0.6 and 0.8 but not 0.4. Effective acoustic overlap (EAO) increased with caller density, but the manner and extent of the increase depended both on the species' call structure and on the acoustic context provided by the composition scenario. Phaloria sp. experienced very low levels of EAO and was highly buffered against changes in acoustic context, whereas other species experienced high EAO across contexts or were poorly buffered. These differences were not simply predictable from call structures. These simulation-based findings may have important implications for acoustic biodiversity monitoring and for the study of acoustic masking interference in natural environments.
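A toy version of this kind of simulation is sketched below: callers are drawn from assumed species abundances, on/off call trains are generated, and the fraction of a focal caller's signalling time overlapped by heterospecifics is computed. The call parameters and the overlap measure are illustrative and are not the paper's EAO definition.

```python
import numpy as np

rng = np.random.default_rng(42)

def call_train(duration_s, call_s, gap_s, fs=1000):
    """Binary on/off train for one caller: fixed call length, exponential gaps."""
    t = np.zeros(int(duration_s * fs), dtype=bool)
    pos = rng.exponential(gap_s) * fs
    while pos < len(t):
        t[int(pos):int(pos + call_s * fs)] = True
        pos += call_s * fs + rng.exponential(gap_s) * fs
    return t

def hetero_overlap(focal, others):
    """Fraction of the focal caller's 'on' time overlapped by any heterospecific."""
    if not focal.any():
        return 0.0
    masked = np.logical_or.reduce(others) if others else np.zeros_like(focal)
    return float(np.sum(focal & masked) / np.sum(focal))

# Hypothetical chorus: 20 callers drawn from 3 species with uneven abundance.
species_params = {"sp1": (0.5, 0.3), "sp2": (0.2, 0.8), "sp3": (1.0, 1.5)}  # (call_s, gap_s)
abundance = {"sp1": 0.6, "sp2": 0.3, "sp3": 0.1}
callers = rng.choice(list(abundance), size=20, p=list(abundance.values()))
trains = {i: call_train(60, *species_params[sp]) for i, sp in enumerate(callers)}
focal_id = 0
print(hetero_overlap(trains[focal_id],
                     [tr for i, tr in trains.items() if callers[i] != callers[focal_id]]))
```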

Relevance: 10.00%

Publisher:

Abstract:

Rapid and invasive urbanization has been associated with the depletion of natural resources (vegetation and water resources), which in turn deteriorates the landscape structure and conditions of the local environment. A rapid increase in population due to migration from rural areas is one of the critical issues of urban growth. Urbanisation in India is drastically changing land cover and often results in sprawl. Sprawl regions often lack basic amenities such as treated water supply, sanitation, etc. This necessitates regular monitoring and understanding of the rate of urban development in order to ensure the sustenance of natural resources. Urban sprawl is the extent of urbanization which leads to the development of urban forms with the destruction of ecology and natural landforms. The rate of change of land use and the extent of urban sprawl can be efficiently visualized and modelled with the help of geo-informatics. Knowledge of the urban area, especially the growth magnitude, shape geometry and spatial pattern, is essential to understand the growth and characteristics of the urbanization process. Urban pattern, shape and growth can be quantified using spatial metrics. This communication quantifies the urbanisation and associated growth pattern in Delhi. Spatial data spanning four decades were analysed to understand land cover and land use dynamics. Further, the region was divided into 4 zones and into circles of incrementally increasing 1 km radius to understand and quantify the local spatial changes. Results of the landscape metrics indicate that the urban centre was highly aggregated and that the outskirts and the buffer regions were on the verge of aggregating urban patches. Shannon's entropy index clearly depicted the outgrowth of sprawl areas in different zones of Delhi.
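As a small illustration of the sprawl quantification, the sketch below computes Shannon's entropy H = -Σ p_i ln p_i from the share of built-up area found in each concentric 1 km circle; values approaching ln(n) are conventionally read as dispersed (sprawl-like) growth. The input areas are hypothetical.

```python
import numpy as np

def shannon_entropy(builtup_area):
    """Shannon's entropy H = -sum(p_i * ln p_i), where p_i is the share of the
    total built-up area found in the i-th concentric circle (or zone).
    H near ln(n) indicates dispersed growth (sprawl); H near 0, compact growth."""
    x = np.asarray(builtup_area, dtype=float)
    p = x / x.sum()
    p = p[p > 0]                      # ln(0) terms contribute nothing
    return float(-np.sum(p * np.log(p)))

# Hypothetical built-up areas (ha) in 1 km incrementing circles for one zone.
builtup = [520, 480, 410, 300, 150, 60, 20]
print(f"H = {shannon_entropy(builtup):.3f}, H_max = {np.log(len(builtup)):.3f}")
```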

Relevance: 10.00%

Publisher:

Abstract:

This paper deals with processing EEG signals obtained from 16 spatially arranged electrodes to measure coupling, or synchrony, between the frontal, parietal, occipital and temporal lobes of the cerebrum under eyes-open and eyes-closed conditions. This synchrony was measured using magnitude squared coherence and Short-Time Fourier Transform and wavelet-based coherences. We found a pattern in the time-frequency coherence as we moved from the nasion to the inion of the subject's head. The coherence pattern obtained from the wavelet approach was found to be far more capable of picking up peaks in coherence with respect to frequency than the regular Fourier-based coherence. We detected high synchrony between the frontal polar electrodes that is missing in coherence plots between other electrode pairs. The study has potential applications in healthcare.
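For illustration, magnitude squared coherence between two channels can be computed with scipy.signal.coherence as sketched below on surrogate data standing in for two frontal polar channels; the sampling rate, signal model and Welch segment length are assumptions, and the STFT- and wavelet-based coherence measures used in the study are not shown.

```python
import numpy as np
from scipy.signal import coherence

fs = 256                      # assumed EEG sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)

# Surrogate signals for two frontal polar channels (labelled Fp1/Fp2 here only
# for illustration): a shared 10 Hz alpha component plus independent noise.
alpha = np.sin(2 * np.pi * 10 * t)
fp1 = alpha + 0.5 * np.random.randn(len(t))
fp2 = alpha + 0.5 * np.random.randn(len(t))

# Magnitude squared coherence: Cxy(f) = |Pxy|^2 / (Pxx * Pyy), Welch-averaged.
f, cxy = coherence(fp1, fp2, fs=fs, nperseg=512)
peak = f[np.argmax(cxy)]
print(f"Peak coherence {cxy.max():.2f} near {peak:.1f} Hz")
```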

Relevance: 10.00%

Publisher:

Abstract:

An abundance of spectrum access and sensing algorithms is available in the dynamic spectrum access (DSA) and cognitive radio (CR) literature. Often, however, the functionality and performance of such algorithms are validated against theoretical calculations using only simulations. Both the theoretical calculations and the simulations come with their attendant sets of assumptions. For instance, designers of dynamic spectrum access algorithms often take spectrum sensing and rendezvous mechanisms between transmitter-receiver pairs for granted. Test bed designers, on the other hand, either customize so much of their design that it becomes difficult to replicate using commercial off-the-shelf (COTS) components, or restrict themselves to simulation, emulation/hardware-in-the-loop (HIL), or pure hardware, but not all three. Implementation studies on test beds that are sophisticated enough to combine the three aforementioned aspects, yet can be put together using COTS hardware and software packages, are rare. In this paper we describe (i) the implementation of a hybrid test bed using a previously proposed hardware-agnostic system architecture, (ii) the implementation of DSA on this test bed, and (iii) the realistic hardware- and software-constrained performance of DSA. A snapshot energy detector (ED) and Cumulative Summation (CUSUM), a sequential change detection algorithm, are available for spectrum sensing, and a two-way handshake mechanism in a dedicated control channel facilitates transmitter-receiver rendezvous.
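As a rough sketch of the two sensing blocks named above, the code below implements a snapshot energy detector and a CUSUM test for a shift in mean snapshot energy; the thresholds, Gaussian assumptions and pre/post-change means are illustrative, not the test bed's settings.

```python
import numpy as np

def energy_detector(samples, threshold):
    """Snapshot energy detector: declare the channel busy if the average
    energy of the snapshot exceeds a pre-computed threshold."""
    return np.mean(np.abs(samples) ** 2) > threshold

def cusum(energies, mu0, mu1, h):
    """CUSUM sequential change detection on per-snapshot energies, assuming a
    Gaussian shift in mean from mu0 (idle) to mu1 (occupied) with unit variance.
    Returns the index at which the statistic crosses threshold h, else None."""
    g = 0.0
    for n, x in enumerate(energies):
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2.0)   # log-likelihood ratio increment
        g = max(0.0, g + llr)                          # reset-at-zero CUSUM recursion
        if g > h:
            return n
    return None

# Hypothetical run: noise-only energies followed by a primary-user transmission.
rng = np.random.default_rng(1)
idle = rng.normal(1.0, 0.3, 200)
busy = rng.normal(2.0, 0.3, 200)
print(cusum(np.concatenate([idle, busy]), mu0=1.0, mu1=2.0, h=5.0))
```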