12 results for Cohesion-based discourse analysis
at Indian Institute of Science - Bangalore - India
Abstract:
Reliability analysis for computing systems in aerospace applications must account for the actual computations the system performs in its use environment. This paper introduces a theoretical nonhomogeneous Markov model for such applications.
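A nonhomogeneous Markov model is one whose transition rates vary with time, for instance with the computational workload. As a minimal sketch of that idea only (the two-state structure, rates, and workload profile below are hypothetical, not the paper's model):

```python
# Minimal sketch: a hypothetical two-state (up/failed) system whose failure
# rate varies with workload over time; not the paper's model.
import numpy as np
from scipy.integrate import solve_ivp

def lam(t):
    # Hypothetical workload-dependent failure rate (per hour), higher
    # during a simulated high-activity phase of the mission.
    return 1e-4 * (1.0 + 0.5 * np.sin(2 * np.pi * t / 24.0))

def kolmogorov_forward(t, p):
    # p = [P(up), P(failed)]; failure is absorbing (no repair).
    return [-lam(t) * p[0], lam(t) * p[0]]

sol = solve_ivp(kolmogorov_forward, (0.0, 100.0), [1.0, 0.0])
print("Reliability at t = 100 h:", sol.y[0, -1])
```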
Abstract:
Ubiquitous Computing is an emerging paradigm that enables users to access their preferred services wherever they are, whenever they want, and in the way they need, with zero administration. While moving from one place to another, users do not need to specify and configure their surrounding environment; the system initiates the necessary adaptation by itself to cope with the changing environment. In this paper we propose a system that provides context-aware ubiquitous multimedia services without user intervention. We analyze the context of the user based on weights, identify the UMMS (Ubiquitous Multimedia Service) from the collected context information and the user profile, search for the optimal server to provide the required service, and then adapt the service according to the user's local environment and preferences. The experiment was conducted several times with different context parameters, weights, and user preferences. The results are quite encouraging.
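The abstract does not give the scoring function, so the following sketch only illustrates one plausible reading of weight-based context analysis and server selection; the context parameters, weights, and server attributes are invented for illustration.

```python
# Hedged sketch of weight-based scoring of candidate servers for a
# ubiquitous multimedia service; all names and numbers are hypothetical.
weights = {"proximity": 0.5, "bandwidth": 0.3, "load": 0.2}

servers = [
    {"name": "edge-1", "proximity": 0.9, "bandwidth": 0.6, "load": 0.4},
    {"name": "cloud-1", "proximity": 0.3, "bandwidth": 0.9, "load": 0.8},
]

def score(server):
    # Higher proximity/bandwidth is better; higher load is worse.
    return (weights["proximity"] * server["proximity"]
            + weights["bandwidth"] * server["bandwidth"]
            - weights["load"] * server["load"])

best = max(servers, key=score)
print("Selected server:", best["name"])
```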
Abstract:
We propose a simple, reliable method, based on the probability of transitions and the distribution of adjacent pixel pairs, for steganalysis of digital images in the spatial domain subjected to Least Significant Bit (LSB) replacement steganography. Our method is sensitive to the statistics of the underlying cover image and is a variant of the Sample Pair Method. We use the new method to reliably estimate the length of the hidden message. The novelty of our method is that it detects, from those statistics of the underlying image that are invariant under embedding, whether the results it calculates are reliable. To our knowledge, no steganalytic method so far predicts from the properties of the stego image whether its own results are accurate.
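As a rough illustration (not the authors' estimator), the sketch below computes the two adjacent-pair categories whose imbalance sample-pair-style analysis exploits; the full Sample Pair Method derives a message-length estimate from such counts. The random "cover" exists only to keep the snippet runnable.

```python
# Simplified illustration of sample-pair-style counting. LSB replacement
# tends to equalize two categories of adjacent pixel pairs that are
# unequal in natural images; the full Sample Pair Method derives a
# message-length estimate from such counts.
import numpy as np

def sample_pair_counts(img):
    u = img[:, :-1].ravel().astype(int)   # left pixel of each horizontal pair
    v = img[:, 1:].ravel().astype(int)    # right pixel of each pair
    x = np.sum(((v % 2 == 0) & (u < v)) | ((v % 2 == 1) & (u > v)))
    y = np.sum(((v % 2 == 0) & (u > v)) | ((v % 2 == 1) & (u < v)))
    return x, y

# Random noise stands in for a real cover image just to make this runnable.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
x, y = sample_pair_counts(cover)
print("pair-category imbalance:", abs(int(x) - int(y)) / (int(x) + int(y)))
```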
Abstract:
This paper presents a simple second-order, curvature-based mobility analysis of planar curves in contact. The underlying theory deals with the penetration and separation of curves with multiple contacts, based on the relative configuration of the osculating circles at the points of contact for a second-order rotation about each point of the plane. Geometric and analytical treatments of the mobility analysis are presented for generic as well as special contact geometries. For objects with a single contact, a partitioning of the plane into four types of mobility regions is shown. Using point-based composition operations built on dual-number matrices, the analysis is extended to computationally handle multiple-contact scenarios. A novel color-coded directed line is proposed to capture the contact scenario. Mobility under multiple contacts is obtained through the intersection of the mobility half-spaces. It is shown that the mobility region comprises either a pair of unbounded convex polygons or a single bounded convex polygon. The theory has been used for the analysis and synthesis of form-closure configurations and of revolute and prismatic kinematic pairs.
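The paper's exact criterion is not reproduced in the abstract; the sketch below only illustrates the general shape of such a single-contact test: a first-order term decided by the normal velocity of the contact point under rotation about a candidate center, falling back to a comparison of osculating-circle curvatures when the center lies on the contact normal. The sign convention in the second-order branch is an assumption, not the paper's result.

```python
# Hedged sketch of a first/second-order mobility test at a single contact.
# p: contact point; n: unit outward normal of body B at p; k_a, k_b: signed
# curvatures of the osculating circles of bodies A and B at p.
import numpy as np

def mobility_at_contact(c, p, n, k_a, k_b, eps=1e-9):
    r = p - np.asarray(c, dtype=float)
    t = np.array([-r[1], r[0]])      # velocity of p for unit CCW spin about c
    v_n = float(t @ n)               # first-order normal velocity
    if abs(v_n) > eps:
        return "separating" if v_n > 0 else "penetrating"
    # Center on the contact normal: outcome decided at second order by the
    # relative curvature of the osculating circles (illustrative criterion).
    return "separating" if (k_b - k_a) * float(r @ n) > 0 else "penetrating"

print(mobility_at_contact(c=(0.0, -2.0), p=np.array([0.0, 0.0]),
                          n=np.array([0.0, 1.0]), k_a=0.5, k_b=1.0))
```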
Abstract:
Lipocalins constitute a superfamily of extracellular proteins found in all three kingdoms of life. Although very divergent in their sequences and functions, they show remarkable similarity in their 3-D structures. Lipocalins bind and transport small hydrophobic molecules. Earlier sequence-based phylogenetic studies of lipocalins highlighted their long evolutionary history. However, the molecular and structural basis of their functional diversity is not completely understood. The main objective of the present study is to understand the functional diversity of the lipocalins using a structure-based phylogenetic approach. The present study of 39 protein domains from the lipocalin superfamily suggests that the clusters of lipocalins obtained by structure-based phylogeny correspond well with their functional diversity. Detailed analysis of each of the clusters and sub-clusters reveals that the 39 lipocalin domains cluster according to their mode of ligand binding, even though the clustering was performed on the basis of gross domain structure. The outliers in the phylogenetic tree are often from single-member families. The structure-based phylogenetic approach has also provided pointers for assigning putative functions to domains of unknown function in the lipocalin family. The approach employed in the present study can be used in the future for the functional identification of new lipocalin proteins and may be extended to other protein families whose members show poor sequence similarity but high structural similarity.
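A minimal sketch of the generic pipeline behind structure-based phylogeny, assuming pairwise structural dissimilarities (for example, 1 minus a TM-score-style similarity) are already available; the domain names and matrix below are invented, and the paper's own 39-domain tree is not reproduced.

```python
# Illustrative sketch: cluster protein domains from pairwise structural
# dissimilarities produced by a structure aligner (values here invented).
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

domains = ["lcn_A", "lcn_B", "lcn_C", "lcn_D"]        # hypothetical names
D = np.array([[0.00, 0.15, 0.40, 0.45],
              [0.15, 0.00, 0.42, 0.43],
              [0.40, 0.42, 0.00, 0.10],
              [0.45, 0.43, 0.10, 0.00]])              # 1 - similarity style

tree = linkage(squareform(D), method="average")       # UPGMA-style tree
print(dendrogram(tree, labels=domains, no_plot=True)["ivl"])
```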
Abstract:
People living on less than $2 of income per day, referred to as the Base of the Pyramid (BoP), face undesirable situations such as lack of nutrition, health, and education. Design, as a process of changing a current undesired situation into a desired one, has failed here. A crucial reason behind these failures is the lack of a normative basis for identifying and understanding absent or unsatisfied stakeholders. Currently, stakeholder analysis in design is heuristic. This paper uses the normative framework of the Capability Approach (CA) for stakeholder analysis. A brief discussion of stakeholder theory and analysis is used to identify gaps in the literature. The constructs of the CA are discussed for their suitability to this purpose. Along with methodological details, data generated from stakeholder interviews and focus groups in a case study of the dissemination of improved cook-stoves is used to link the theory with practice. The scope of this work lies in identifying and investigating the motives of the stakeholders for their involvement with the product. Though many insights for discerning and managing crucial stakeholders are built into the methodology, this work does not claim explicit coverage of these aspects.
Abstract:
The broader goal of the research described here is to automatically acquire diagnostic knowledge from documents in the domain of manual and mechanical assembly of aircraft structures. These documents are treated as a discourse used by experts to communicate with others; it therefore becomes possible to use discourse analysis to enable machine understanding of the text. The research challenge addressed in the paper is to identify documents, or sections of documents, that are potential sources of knowledge. In a subsequent step, domain knowledge will be extracted from these segments. The segmentation task requires partitioning the document into relevant segments and understanding the context of each segment. In discourse analysis, the division of a discourse into segments is achieved through certain indicative clauses, called cue phrases, that signal changes in the discourse context. However, formal documents may not use such language. Hence, the use of a domain-specific ontology and an assembly process model is proposed to segregate chunks of the text based on a local context. Elements of the ontology/model and their related terms serve as indicators of the current context of a segment and of changes in context between segments. Local contexts are aggregated over increasingly larger segments to identify whether the document (or a portion of it) pertains to the topic of interest, namely assembly. Knowledge acquired through such processes enables the acquisition and reuse of knowledge during any part of the lifecycle of a product.
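A toy sketch of this kind of ontology-driven segmentation; the topic term lists, the bag-of-words matching, and the two-sentence "document" are all invented stand-ins for the paper's ontology and process model.

```python
# Hedged sketch: use domain-ontology terms as context indicators and start a
# new segment wherever the dominant local context changes.
ontology = {
    "assembly": {"fastener", "rivet", "jig", "torque", "shim"},
    "inspection": {"gauge", "tolerance", "defect", "crack"},
}

def local_context(sentence):
    words = set(sentence.lower().rstrip(".").split())
    scores = {topic: len(words & terms) for topic, terms in ontology.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

doc = ["Install the rivet with the jig and check torque.",
       "Measure the gauge reading and log any defect."]
segments, current = [], None
for s in doc:
    ctx = local_context(s) or current     # carry context over neutral text
    if ctx != current:
        segments.append((ctx, []))
        current = ctx
    segments[-1][1].append(s)
print([(ctx, len(sents)) for ctx, sents in segments])
```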
Abstract:
A new method based on the analysis of a single diffraction pattern is proposed to measure deflections in micro-cantilever (MC) based sensor probes, achieving typical resolutions of 1 nm in deflection and 50 µN/m in surface stress change. The proposed method employs a double-MC structure in which the deflection of one of the micro-cantilevers relative to the other, due to surface stress changes, results in a linear shift of the intensity maxima of the Fraunhofer diffraction pattern of the transilluminated MC. Such shifts in the intensity maxima of a particular order along the length of the structure can be measured to an accuracy of 0.01 mm, leading to the proposed sensitivity of deflection measurement in a typical micro-cantilever. This method can overcome the fundamental measurement sensitivity limit set by diffraction and by the pointing stability of the laser beam in the widely used Optical Beam Deflection Method (OBDM).
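The following numerical sketch illustrates the underlying optics only in a generic way: a two-aperture screen serves as a crude stand-in for the double-MC structure, with invented dimensions, and the positions of higher-order Fraunhofer maxima shift when one aperture is displaced relative to the other.

```python
# Illustrative sketch (not the paper's optical model): Fraunhofer intensity
# of a two-aperture screen via FFT; displacing one aperture shifts the
# positions of the higher-order interference maxima.
import numpy as np
from scipy.signal import find_peaks

def maxima_positions(offset, n=4096):
    x = np.arange(n)
    aperture = (np.abs(x - 1800) < 5) | (np.abs(x - (2200 + offset)) < 5)
    I = np.abs(np.fft.fftshift(np.fft.fft(aperture.astype(float)))) ** 2
    peaks, _ = find_peaks(I[n // 2:])    # maxima on one side of zero order
    return peaks

print("higher-order maximum, no shift:   ", maxima_positions(0)[20])
print("higher-order maximum, shifted by 8:", maxima_positions(8)[20])
```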
Abstract:
There are many well-known examples of proteins with low sequence similarity adopting the same structural fold. This aspect of the sequence-structure relationship has been studied extensively, both experimentally and theoretically, though with limited success. Most of the studies consider remote homology or "sequence conservation" as the basis for their understanding. Recently, an "interaction energy" based network formalism, Protein Energy Networks (PENs), was developed to understand the determinants of protein structures. In this paper we have used these PENs to investigate the common non-covalent interactions, and their collective features, which stabilize the TIM barrel fold. We have also developed a method for aligning PENs in order to understand the spatial conservation of interactions in the fold. We have identified key common interactions responsible for the conservation of the TIM fold despite high sequence dissimilarity. For instance, the central beta barrel of the TIM fold is stabilized by long-range, high-energy electrostatic interactions and by low-energy contiguous van der Waals (vdW) interactions in certain families. The other interfaces, such as helix-sheet or helix-helix, appear devoid of any high-energy conserved interactions. Conserved interactions in the loop regions around the catalytic site of the TIM fold have also been identified, pointing to their significance in both structural and functional evolution. Based on these investigations, we have developed a novel network-based phylogenetic analysis for remote homologues, which can perform better than sequence-based phylogeny. Such an analysis is more meaningful from both a structural and a functional evolutionary perspective. We believe that the information obtained through the "interaction conservation" viewpoint, and the subsequently developed method of structure network alignment, can shed new light on the fields of fold organization and de novo computational protein design.
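As a minimal sketch of what a PEN-style representation might look like in code, residues become nodes and non-covalent interaction energies become weighted edges kept above a stabilization cutoff; the residue names, energies, and cutoff below are invented, not taken from the paper.

```python
# Hedged sketch of a Protein Energy Network as a weighted graph
# (adjacency dicts); all residues and energies are hypothetical.
edges = {("GLU104", "ARG27"): -9.8,   # long-range electrostatic (kcal/mol)
         ("ILE45", "LEU82"): -1.2,    # weak contiguous vdW-style contact
         ("ASP17", "LYS13"): -7.5}

def energy_network(edges, cutoff=-2.0):
    # Keep only interactions at least as stabilizing as the cutoff.
    net = {}
    for (a, b), e in edges.items():
        if e <= cutoff:
            net.setdefault(a, {})[b] = e
            net.setdefault(b, {})[a] = e
    return net

pen = energy_network(edges)
print(sorted(pen))   # hubs / conserved high-energy cliques start from here
```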
Abstract:
The voltage ripple and power loss in the DC capacitor of a voltage source inverter depend on the harmonic currents flowing through the capacitor. This paper presents a double-Fourier-series-based harmonic analysis of the DC capacitor current in a three-level neutral-point-clamped inverter modulated with sine-triangle PWM. The analytical results are validated experimentally on a 5-kVA three-level inverter prototype, and the results of the analysis are used to predict the power loss in the DC capacitor.
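The closed-form double Fourier series itself is not reproduced here; as a numerical cross-check of the same kind of quantity, the sketch below synthesizes a sine-triangle PWM switching function, forms a deliberately simplified (single-phase, two-level-style) capacitor-side current, and reads off its harmonic content with an FFT. All waveform parameters are invented.

```python
# Simplified single-phase illustration, not the paper's closed-form
# three-level NPC analysis; all parameters are hypothetical.
import numpy as np

f0, fc, fs = 50.0, 1050.0, 1.0e6           # fundamental, carrier, sample rate
t = np.arange(0.0, 0.2, 1.0 / fs)          # whole number of both periods
ref = 0.8 * np.sin(2 * np.pi * f0 * t)     # sine modulating wave
carrier = 4.0 * np.abs((fc * t) % 1.0 - 0.5) - 1.0   # triangle in [-1, 1]
sw = (ref > carrier).astype(float)         # switching function
i_load = 10.0 * np.sin(2 * np.pi * f0 * t - 0.5)     # lagging load current
i_cap = sw * i_load                        # inverter input current...
i_cap -= i_cap.mean()                      # ...minus its DC part = cap ripple

spec = 2.0 * np.abs(np.fft.rfft(i_cap)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
k = np.argmax(spec[1:]) + 1
print(f"dominant ripple component: {freqs[k]:.0f} Hz, {spec[k]:.2f} A peak")
print(f"capacitor RMS current: {i_cap.std():.2f} A")
```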
Abstract:
3-D full-wave method of moments (MoM) based electromagnetic analysis is a popular means of accurately solving Maxwell's equations. The time and memory bottlenecks associated with such a solution have been addressed over the last two decades by linear-complexity fast solver algorithms. However, solving the 3-D full-wave MoM system on an arbitrary mesh of a package-board structure does not guarantee accuracy, since the discretization may not be fine enough to capture spatial changes in the solution variable. At the same time, uniform over-meshing of the entire structure generates a large number of solution variables and therefore requires an unnecessarily large matrix solution. In this paper, different refinement criteria are studied on an adaptive mesh refinement platform. Consequently, the most suitable conductor mesh refinement criterion for MoM-based electromagnetic package-board extraction is identified, and the advantages of this adaptive strategy are demonstrated from both accuracy and speed perspectives. The results are also compared with those of the recently reported integral-equation-based h-refinement strategy. Finally, a new methodology to expedite each adaptive refinement pass is proposed.
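A toy, runnable sketch of the adaptive refinement loop in one dimension, with a rapidly varying function standing in for the MoM solution variable and a jump-based indicator standing in for the conductor-mesh criteria studied in the paper; the 0.5 flagging threshold is an arbitrary choice.

```python
# Toy adaptive mesh refinement: split only the 1-D intervals where the
# stand-in solution variable changes most rapidly, instead of refining
# the whole mesh uniformly.
import numpy as np

def solution(x):                 # stand-in for the MoM solution variable
    return np.tanh(50 * (x - 0.5))

mesh = list(np.linspace(0.0, 1.0, 6))
for _ in range(4):               # a few refinement passes
    jumps = [abs(solution(b) - solution(a)) for a, b in zip(mesh, mesh[1:])]
    worst = max(jumps)
    refined = [mesh[0]]
    for (a, b), j in zip(zip(mesh, mesh[1:]), jumps):
        if j > 0.5 * worst:      # flag elements near the worst indicator
            refined.append((a + b) / 2)
        refined.append(b)
    mesh = refined
print("elements after refinement:", len(mesh) - 1)
```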
Abstract:
Cache analysis plays a very important role in obtaining precise Worst Case Execution Time (WCET) estimates of programs for real-time systems. While Abstract Interpretation (AI) based approaches are almost universally used for cache analysis, they fail to take advantage of a unique property of the problem: it is not necessary to find the guaranteed cache behavior that holds across all executions of a program; we only need the cache behavior along one particular program path, the path with the maximum execution time. In this work, we introduce the concept of cache miss paths, which allows us to use worst-case path information to improve the precision of AI-based cache analysis. We use Abstract Interpretation to determine the cache miss paths and then integrate them into the Implicit Path Enumeration Technique (IPET) formulation. An added advantage is that this further allows us to use infeasible-path information for cache analysis. Experimentally, our approach gives more precise WCET estimates than AI-based cache analysis, and we also provide techniques to trade off analysis time against precision, for scalability.
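In IPET the WCET is normally obtained as an integer linear program over basic-block execution counts with flow constraints; the brute-force sketch below conveys only the objective, on a toy control-flow graph. Block costs, per-block miss counts, the miss penalty, and the two paths are all invented.

```python
# Tiny, self-contained illustration of the IPET objective: among feasible
# paths of a toy CFG, the WCET path maximizes execution time including
# cache-miss penalties. (The real formulation is an ILP, not enumeration.)
blocks = {"A": 10, "B": 20, "C": 15, "D": 10}   # base cycles per block
misses = {"A": 1, "B": 3, "C": 0, "D": 1}       # misses along each block
MISS_PENALTY = 100

paths = [["A", "B", "D"], ["A", "C", "D"]]      # feasible paths through the CFG

def cost(path):
    return sum(blocks[b] + MISS_PENALTY * misses[b] for b in path)

wcet_path = max(paths, key=cost)
print("WCET path:", wcet_path, "-> WCET =", cost(wcet_path), "cycles")
```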