31 results for Textual simplification
at Indian Institute of Science - Bangalore - India
Abstract:
The diffusion equation-based modeling of near infrared light propagation in tissue is achieved by using a finite-element mesh for imaging real tissue types, such as breast and brain. The finite-element mesh size (number of nodes) dictates the parameter space in optical tomographic imaging. Most commonly used finite-element meshing algorithms do not provide the flexibility of distinct nodal spacing in different regions of the imaging domain to take the sensitivity of the problem into consideration. This study presents a computationally efficient mesh simplification method that can be used as a preprocessing step to iterative image reconstruction, where the finite-element mesh is simplified by using an edge collapsing algorithm to reduce the parameter space in regions where the sensitivity of the problem is relatively low. It is shown, using simulations and experimental phantom data for simple meshes/domains, that a significant reduction in parameter space can be achieved without compromising the reconstructed image quality. The maximum errors observed when using the simplified meshes were less than 0.27% in the forward problem and 5% in the inverse problem.
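As an illustration of the preprocessing idea (a minimal sketch, not the authors' implementation), the fragment below collapses edges between low-sensitivity nodes of a 2-D triangular mesh; the function name, the sensitivity measure, and the midpoint placement rule are assumptions made for the example.

```python
# Minimal sketch (not the authors' implementation): collapse edges between
# low-sensitivity nodes of a 2-D triangular finite-element mesh to shrink the
# parameter space before iterative reconstruction.
import numpy as np

def collapse_low_sensitivity_edges(nodes, triangles, sensitivity, threshold):
    """nodes       : (N, 2) array of nodal coordinates (modified in place)
       triangles   : list of (i, j, k) node-index triples
       sensitivity : (N,) array, e.g. a column norm of the sensitivity matrix
       threshold   : collapse edges whose endpoints both fall below this value"""
    parent = list(range(len(nodes)))      # node -> surviving representative

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Candidate edges, shortest first.
    edges = {tuple(sorted(e)) for t in triangles
             for e in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0]))}
    for a, b in sorted(edges, key=lambda e: np.linalg.norm(nodes[e[0]] - nodes[e[1]])):
        ra, rb = find(a), find(b)
        if ra != rb and sensitivity[ra] < threshold and sensitivity[rb] < threshold:
            parent[rb] = ra                              # collapse b into a
            nodes[ra] = 0.5 * (nodes[ra] + nodes[rb])    # place at edge midpoint

    # Keep only triangles that remain non-degenerate after the collapses.
    new_triangles = [(find(i), find(j), find(k)) for i, j, k in triangles
                     if len({find(i), find(j), find(k)}) == 3]
    return nodes, new_triangles
```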
Abstract:
We address the task of mapping a given textual domain model (e.g., an industry-standard reference model) for a given domain (e.g., ERP) to the source code of an independently developed application in the same domain. This has applications in improving the understandability of an existing application, migrating it to a more flexible architecture, or integrating it with other related applications. We use the vector-space model to abstractly represent domain model elements as well as source-code artifacts. The key novelty in our approach is to leverage the relationships between source-code artifacts in a principled way to improve the mapping process. We describe experiments wherein we apply our approach to the task of matching two real, open-source applications to corresponding industry-standard domain models. We demonstrate the overall usefulness of our approach, as well as the role of our propagation techniques in improving the precision and recall of the mapping task.
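The sketch below illustrates the general flavour of such a mapping (hypothetical helper names, not the paper's code): TF-IDF vectors and cosine similarity give a base score, and a simple blending step propagates scores along artifact-relationship edges; the blending weight alpha is an assumption.

```python
# Minimal sketch (not the paper's implementation): map domain-model elements to
# source-code artifacts with TF-IDF cosine similarity, then propagate scores
# along relationships between artifacts (e.g. call/reference edges).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
import numpy as np

def map_domain_model(model_texts, artifact_texts, artifact_edges, alpha=0.2):
    """model_texts    : textual descriptions of domain-model elements
       artifact_texts : identifier/comment text extracted per code artifact
       artifact_edges : (i, j) index pairs of related artifacts
       alpha          : weight of the propagated neighbour score (assumed)"""
    vec = TfidfVectorizer(stop_words="english")
    tfidf = vec.fit_transform(model_texts + artifact_texts)
    m, a = len(model_texts), len(artifact_texts)
    scores = cosine_similarity(tfidf[:m], tfidf[m:])      # (m, a) base scores

    # Propagation: blend each artifact's score with the mean of its neighbours'.
    neighbours = [[] for _ in range(a)]
    for i, j in artifact_edges:
        neighbours[i].append(j)
        neighbours[j].append(i)
    propagated = scores.copy()
    for k in range(a):
        if neighbours[k]:
            propagated[:, k] = ((1 - alpha) * scores[:, k]
                                + alpha * scores[:, neighbours[k]].mean(axis=1))

    # Rank artifacts per domain-model element, best match first.
    return np.argsort(-propagated, axis=1)
```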
Abstract:
The Reeb graph tracks topology changes in level sets of a scalar function and finds applications in scientific visualization and geometric modeling. This paper describes a near-optimal two-step algorithm that constructs the Reeb graph of a Morse function defined over manifolds in any dimension. The algorithm first identifies the critical points of the input manifold, and then connects these critical points in the second step to obtain the Reeb graph. A simplification mechanism based on topological persistence aids in the removal of noise and unimportant features. A radial layout scheme results in a feature-directed drawing of the Reeb graph. Experimental results demonstrate the efficiency of the Reeb graph construction in practice and its applications.
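A minimal sketch of the first step, critical-point identification for a piecewise-linear function on a triangulated surface via lower-link connectivity, is given below; it is illustrative only and omits the connection step, the persistence-based simplification, and the radial layout.

```python
# Illustrative sketch of the first step only: classify the vertices of a
# triangulated surface by the connectivity of their lower link.
from collections import defaultdict

def classify_vertices(triangles, f):
    """triangles : (i, j, k) vertex-index triples of a triangulated 2-manifold
       f         : sequence or dict mapping vertex index -> scalar value
       returns   : dict vertex -> 'minimum' | 'maximum' | 'regular' | 'saddle'"""
    link_edges = defaultdict(set)            # vertex -> edges of its link
    for a, b, c in triangles:
        link_edges[a].add((b, c))
        link_edges[b].add((a, c))
        link_edges[c].add((a, b))

    def below(u, v):                         # total order, ties broken by index
        return (f[u], u) < (f[v], v)

    result = {}
    for v, edges in link_edges.items():
        link = {u for e in edges for u in e}
        lower = {u for u in link if below(u, v)}
        if not lower:
            result[v] = "minimum"
        elif lower == link:
            result[v] = "maximum"
        else:
            # Count connected components of the lower link.
            adj = defaultdict(set)
            for a, b in edges:
                if a in lower and b in lower:
                    adj[a].add(b)
                    adj[b].add(a)
            seen, components = set(), 0
            for u in lower:
                if u not in seen:
                    components += 1
                    stack = [u]
                    while stack:
                        x = stack.pop()
                        if x not in seen:
                            seen.add(x)
                            stack.extend(adj[x] - seen)
            result[v] = "regular" if components == 1 else "saddle"
    return result
```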
Abstract:
This article discusses the design and development of GRDB (General Purpose Relational Data Base System), which has been implemented on a DEC-1090 system in Pascal. GRDB is a general purpose database system designed to be completely independent of the nature of the data to be handled, in that it is not tailored to the specific requirements of any particular enterprise. It can handle different types of data, such as variable-length records and textual data. Apart from the usual database facilities such as data definition and data manipulation, GRDB supports a User Definition Language (UDL) and a security definition language. These facilities are provided through a SEQUEL-like General Purpose Query Language (GQL). GRDB provides adequate protection facilities up to the relation level. The concept of a "security matrix" is used to provide database protection, Unique IDentification numbers (UIDs) and passwords ensure user identification and authentication, and static integrity constraints ensure data integrity. Considerable effort has been made to improve the response time through indexing on the data files and query optimisation. GRDB is designed for interactive use, but provision has also been made for use in batch mode. A typical Air Force application (consisting of data about personnel, inventory control, and maintenance planning) has been used to test GRDB, and it has been found to perform satisfactorily.
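The fragment below is a hypothetical illustration of a relation-level security matrix (not GRDB's code): a table keyed by (UID, relation) is consulted after UID/password authentication and before a query is executed; the names and permission set are assumptions.

```python
# Hypothetical sketch (not GRDB's code): a relation-level "security matrix"
# keyed by (user UID, relation), consulted before a GQL statement runs.
PERMISSIONS = {"read", "insert", "update", "delete"}

security_matrix = {
    ("uid_1001", "PERSONNEL"):   {"read"},
    ("uid_1001", "INVENTORY"):   {"read", "update"},
    ("uid_2002", "MAINTENANCE"): PERMISSIONS,          # full access
}

def is_allowed(uid, relation, operation):
    """Check the security matrix entry for this user and relation."""
    return operation in security_matrix.get((uid, relation), set())

assert is_allowed("uid_1001", "INVENTORY", "update")
assert not is_allowed("uid_1001", "PERSONNEL", "delete")
```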
Abstract:
It is well known that the notions of normal forms and acyclicity capture many practical desirable properties for database schemes. The basic schema design problem is to develop design methodologies that strive toward these ideals. The usual approach is to first normalize the database scheme as far as possible. If the resulting scheme is cyclic, then one tries to transform it into an acyclic scheme. In this paper, we argue in favor of carrying out these two phases of design concurrently. In order to do this efficiently, we need to be able to incrementally analyze the acyclicity status of a database scheme as it is being designed. To this end, we propose the formalism of "binary decompositions". Using this, we characterize design sequences that exactly generate θ-acyclic schemes, for θ = α, β. We then show how our results can be put to use in database design. Finally, we also show that our formalism above can be effectively used as a proof tool in dependency theory. We demonstrate its power by showing that it leads to a significant simplification of the proofs of some previous results connecting sets of multivalued dependencies and acyclic join dependencies.
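For context, the sketch below implements a standard test for α-acyclicity (GYO reduction); it is not the binary-decomposition formalism of the paper, only a reminder of the property that the characterized design sequences target.

```python
# Standard GYO reduction test for alpha-acyclicity (context only; this is not
# the paper's binary-decomposition formalism).
def is_alpha_acyclic(hyperedges):
    """hyperedges: one set of attributes per relation scheme.
    Repeatedly (1) delete attributes that occur in only one scheme and
    (2) delete schemes that are empty or contained in another scheme;
    the database scheme is alpha-acyclic iff nothing remains."""
    edges = [set(e) for e in hyperedges]
    changed = True
    while changed and edges:
        changed = False
        # Rule 1: remove attributes unique to a single hyperedge.
        counts = {}
        for e in edges:
            for a in e:
                counts[a] = counts.get(a, 0) + 1
        for e in edges:
            unique = {a for a in e if counts[a] == 1}
            if unique:
                e -= unique
                changed = True
        # Rule 2: remove empty edges and edges covered by another edge
        # (keeping the first copy of any duplicates).
        kept = []
        for i, e in enumerate(edges):
            covered = (not e) or any(
                e < other or (e == other and j < i)
                for j, other in enumerate(edges) if j != i
            )
            if covered:
                changed = True
            else:
                kept.append(e)
        edges = kept
    return not edges

# The classic cyclic scheme {AB, BC, CA} fails; adding ABC makes it acyclic.
assert not is_alpha_acyclic([{"A", "B"}, {"B", "C"}, {"C", "A"}])
assert is_alpha_acyclic([{"A", "B"}, {"B", "C"}, {"C", "A"}, {"A", "B", "C"}])
```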
Abstract:
A period timing device suitable for processing laser Doppler anemometer signals is described here. The important features of this instrument are that it is inexpensive, simple to operate, and easy to fabricate. When the concentration of scattering particles is low, the Doppler signal is in the form of a burst, and the Doppler frequency is measured by timing the zero crossings of the signal. The presence of noise, however, calls for the use of a validation criterion, and a 5–8 cycle comparison is used in this instrument. The validation criterion requires the differential count between the 5 and 8 cycles to be multiplied by predetermined numbers that prescribe the accuracy of measurement. Choosing these numbers to be binary numbers greatly simplifies the circuit design, since it permits the use of shift registers for multiplication. Validation accuracies of 1.6%, 3.2%, 6.3%, and 12.5% are possible with this device. The design presented here is for a 16-bit processor and uses TTL components. By substituting Schottky-barrier TTLs, the clock frequency can be increased from about 10 to 30 MHz, extending the range of the instrument. Review of Scientific Instruments is copyrighted by The American Institute of Physics.
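One plausible form of the 5–8 cycle validation is sketched below for illustration; the instrument's exact comparison may differ, but the sketch shows why binary (power-of-two) tolerance factors reduce the multiplications to shifts, matching the quoted accuracies of 12.5%, 6.3%, 3.2%, and 1.6% (1/8, 1/16, 1/32, 1/64).

```python
# Illustrative sketch only -- one plausible reading of the 5/8-cycle validation,
# not the instrument's exact logic. Power-of-two tolerances can be formed with
# shifts alone, which is why binary multipliers allow shift registers to replace
# a hardware multiplier.
def validate(count_5, count_8, shift):
    """count_5, count_8 : clock counts over 5 and 8 Doppler cycles
       shift            : 3 -> 12.5 %, 4 -> 6.3 %, 5 -> 3.2 %, 6 -> 1.6 %"""
    # Expected 5-cycle count from the 8-cycle count: 5/8 = 1/2 + 1/8 (two shifts).
    expected_5 = (count_8 >> 1) + (count_8 >> 3)
    tolerance = count_5 >> shift          # binary fraction of the measured count
    return abs(count_5 - expected_5) <= tolerance

# A clean burst gives consistent counts and validates at 1.6 %; a noisy one fails.
print(validate(count_5=500, count_8=800, shift=6))   # True
print(validate(count_5=500, count_8=900, shift=6))   # False
```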
Abstract:
The scalar coupled proton NMR spectra of many organic molecules possessing more than one phenyl ring are generally complex, owing to the degeneracy of transitions arising from closely resonating protons in addition to the several short- and long-range couplings experienced by each proton. Analogous situations are generally encountered in derivatives of halogenated benzanilides. Extracting information from such spectra is challenging and demands the differentiation of the spectrum pertaining to each phenyl ring and the simplification of their spectral complexity. The present study employs a blend of independent spin-system filtering and spin-state selective detection of single quantum (SQ) transitions by the two-dimensional multiple quantum (MQ) methodology to achieve this goal. The precise values of scalar couplings of very small magnitude have been derived by double quantum resolved experiments. The experiments also provide the relative signs of the heteronuclear couplings. Studies on four isomers of dihalogenated benzanilides are reported in this work.
Abstract:
The Finite Element Method (FEM) has made a number of otherwise intractable problems solvable. An important aspect of achieving an economical and accurate solution through FEM is matching the formulation and the computational organisation to the problem. This was realised forcefully in the present case of the solution of a class of moving-contact boundary value problems of fastener joints. This paper deals with the problem of changing contact at the pin-hole interface of a fastener joint. Due to the moving contact, the stresses and displacements are nonlinear with load. This would, in general, need an iterative-incremental approach for solution. However, by posing the problem in an inverse way, a solution is sought for obtaining the loads that suit a given contact configuration. Numerical results are given for typical isotropic and composite plates with rigid pins. Two cases of loading are considered: (i) load applied only at the edges of the plate, and (ii) load applied at the pin and reacted at a part of the edge of the plate. Load-contact relationships, compliance and stress patterns are investigated. This paper clearly demonstrates the simplification achieved by a suitable formulation of the problem. The results are of significance to the design and analysis of fastener joints.
Abstract:
The recent trend towards minimizing the interconnections in large scale integration (LSI) circuits has led to intensive investigation into the development of ternary circuits and the improvement of their design. The ternary multiplexer is a convenient and useful logic module which can be used as a basic building block in the design of a ternary system. This paper discusses a systematic procedure for the simplification and realization of ternary functions using ternary multiplexers as building blocks. Both single-level and multilevel multiplexing techniques are considered. The importance of the design procedure is highlighted by considering two specific applications, namely the development of a ternary adder/subtractor and a TCD-to-ternary converter.
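A minimal sketch of the building block (hypothetical names, not the paper's procedure): a 3:1 ternary multiplexer and a single-level realization of a two-variable ternary function from its residues.

```python
# Minimal sketch (hypothetical helper names): a 3:1 ternary multiplexer used as
# the basic building block, and a single-level realization of a ternary function.
def tmux(select, d0, d1, d2):
    """Ternary multiplexer: route data input number `select` (select in {0, 1, 2})."""
    return (d0, d1, d2)[select]

def realize_single_level(func, x, y):
    """Realize f(x, y) by multiplexing on x: the three data inputs are the
    residues f(0, y), f(1, y), f(2, y)."""
    return tmux(x, func(0, y), func(1, y), func(2, y))

# Example: mod-3 addition of two trits, a building block of a ternary adder.
add_mod3 = lambda a, b: (a + b) % 3
for x in range(3):
    for y in range(3):
        assert realize_single_level(add_mod3, x, y) == add_mod3(x, y)
```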
Abstract:
MIPS (metal interactions in protein structures) is a database of metals in the three-dimensional macromolecular structures available in the Protein Data Bank. Bound metal ions in proteins have both catalytic and structural functions. The proposed database serves as an open resource for the analysis and visualization of all metals and their interactions with macromolecular (protein and nucleic acid) structures. MIPS can be searched via a user-friendly interface, and the interactions between metals and protein molecules, along with the geometric parameters, can be viewed in both textual and graphical format using the freely available graphics plug-in Jmol. MIPS is updated regularly by means of programmed scripts that find metal-containing proteins among newly released protein structures. The database is useful for studying the properties of coordination between metals and protein molecules. It also helps to improve understanding of the relationship between macromolecular structure and function. This database is intended to serve the scientific community working in the areas of chemical and structural biology, and is freely available to all users, around the clock, at http://dicsoft2.physics.iisc.ernet.in/mips/.
Abstract:
We study a fixed-point formalization of the well-known analysis of Bianchi. We provide a significant simplification and generalization of the analysis. In this more general framework, the fixed-point solution and the performance measures resulting from it are studied. Uniqueness of the fixed point is established. Simple and general throughput formulas are provided. It is shown that the throughput of any flow is bounded by the throughput of the flow with the smallest transmission rate. The aggregate throughput is bounded by the reciprocal of the harmonic mean of the transmission rates. In an asymptotic regime with a large number of nodes, explicit formulas for the collision probability, the aggregate attempt rate, and the aggregate throughput are provided. The results from the analysis are compared with ns2 simulations and also with an exact Markov model of the backoff process. It is shown how the saturated network analysis can be used to obtain TCP transfer throughputs in some cases.
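A minimal sketch of such a fixed-point computation is shown below, assuming the standard exponential-backoff special case with illustrative parameters b0, p, and K; it is not the paper's code.

```python
# Minimal sketch (not the paper's code): the saturated single-cell fixed point
# beta = G(Gamma(beta)), assuming the exponential-backoff special case.
# b0 (initial mean backoff, in slots), p (multiplier) and K (retry limit) are
# illustrative parameters; beta is the per-slot attempt rate of a node and
# gamma its collision probability.
def attempt_rate(gamma, b0=16.0, p=2.0, K=7):
    """G(gamma): mean number of attempts divided by mean backoff duration."""
    num = sum(gamma ** k for k in range(K + 1))
    den = sum(b0 * p ** k * gamma ** k for k in range(K + 1))
    return num / den

def collision_prob(beta, n):
    """Gamma(beta): chance that at least one of the other n - 1 nodes attempts."""
    return 1.0 - (1.0 - beta) ** (n - 1)

def solve_fixed_point(n, tol=1e-12):
    """Bisection on h(beta) = beta - G(Gamma(beta)), which is increasing in beta,
    so the fixed point (shown to be unique in the paper) is easily bracketed."""
    lo, hi = 1e-9, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid - attempt_rate(collision_prob(mid, n)) > 0.0:
            hi = mid
        else:
            lo = mid
    return lo, collision_prob(lo, n)

beta, gamma = solve_fixed_point(n=10)
print(f"per-slot attempt rate = {beta:.4f}, collision probability = {gamma:.4f}")
```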
Abstract:
The complexity of proton NMR spectra arising from severe overlap of peaks hampers their analysis in diverse situations, even with the application of two-dimensional experiments. The selective or complete removal of the couplings, with retention of only the chemical shift interactions in the indirect dimension, simplifies the spectrum to a large extent with little investment of instrument time. The present study provides precise enantiodiscrimination employing more anisotropic NMR parameters in the chiral liquid crystalline medium and differentiates the overlapped peaks of many organic molecules and peptides dissolved in isotropic solvents.