916 results for Computational topology
Abstract:
Quantitative structure-activity/property relationship (QSAR/QSPR) studies have been exploited extensively in the design of drugs and pesticides, but few such studies have been applied to the design of colour reagents. In this work, the topological indices A(x1)-A(x3) suggested in this laboratory were applied to multivariate analysis in structure-property studies. The topological indices of 43 phosphono bisazo derivatives of chromotropic acid were calculated. The structure-property relationships between colour reagents and their colour reactions with cerium were studied using the A(x1)-A(x3) indices with satisfactory results. The purpose of this work was to establish whether QSAR can be used to predict the contrasts of colour reactions and, in the longer term, to be a helpful tool in colour reagent design.
Abstract:
In this paper, the new topological indices A(x1)-A(x3) suggested in our laboratory and molecular connectivity indices have been applied to multivariate analysis in structure-property studies. The topological indices of twenty asymmetrical phosphono bisazo derivatives of chromotropic acid have been calculated. The structure-property relationships between colour reagents and their colour reactions with ytterbium have been studied by A(x1)-A(x3) indices and molecular connectivity indices with satisfactory results. Multiple regression analysis and neural networks were employed simultaneously in this study.
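Both entries above pair topological indices with multivariate fitting (multiple regression and, in the second case, neural networks) to relate reagent structure to colour-reaction behaviour. The snippet below is a minimal illustrative sketch of that kind of multiple linear regression; the index values, property values, and number of reagents are hypothetical placeholders, not data from the cited studies.

```python
import numpy as np

# Hypothetical topological index values A(x1)-A(x3) for a few reagents
# (placeholder numbers, not taken from the cited papers).
X = np.array([
    [2.1, 0.80, 1.5],
    [2.4, 0.90, 1.7],
    [1.9, 0.70, 1.3],
    [2.8, 1.10, 2.0],
    [2.3, 0.85, 1.6],
])
# Hypothetical measured property (e.g. contrast of the colour reaction).
y = np.array([0.42, 0.51, 0.36, 0.63, 0.47])

# Multiple linear regression: y ~ b0 + b1*A(x1) + b2*A(x2) + b3*A(x3)
X_design = np.column_stack([np.ones(len(X)), X])
coef, _, _, _ = np.linalg.lstsq(X_design, y, rcond=None)
y_hat = X_design @ coef

# Coefficient of determination as a rough measure of fit quality.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print("coefficients:", coef)
print("R^2:", round(r2, 3))
```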
STRUCTURE-PROPERTY RELATIONSHIP BETWEEN HALF-WAVE POTENTIALS OF ORGANIC COMPOUNDS AND THEIR TOPOLOGY
Abstract:
A significant correlation was found between the half-wave potentials of organic compounds and their topological indices A(x1), A(x2), and A(x3). The simplicity of calculating these indices from the connectivity of the molecular skeleton, together with the significant correlation, indicates their practical value. Good results have been obtained by using them to predict the half-wave potentials of some organic compounds.
Abstract:
Quantitative structure-toxicity models were developed that directly link the molecular structures of a set of 50 alkylated and/or halogenated phenols with their polar narcosis toxicity, expressed as the negative logarithm of the IGC50 (50% growth inhibitory concentration).
Abstract:
Starting from the nonhydrostatic Boussinesq approximation equations, a general method is introduced to deduce dispersion relationships. A comparative investigation is performed on inertia-gravity waves with horizontal wavelengths of 100, 10 and 1 km. These are examined using the second-order central difference scheme and the fourth-order compact difference scheme on the vertical grids currently in use, from the perspectives of frequency and of the horizontal and vertical components of group velocity, and the results are compared to analytical solutions. The results suggest that, for both the second-order central difference scheme and the fourth-order compact difference scheme, the Charney-Phillips (CP) and Lorenz (L) grids are suitable for studying waves at the above-mentioned horizontal scales; the Lorenz time-staggered and Charney-Phillips time-staggered (CPTS) grids are applicable only to horizontal scales of less than 10 km; and the N grid (unstaggered grid) is unsuitable for simulating waves at any horizontal scale. Furthermore, using the fourth-order compact difference scheme, despite its higher difference precision, does not necessarily reduce the errors in frequency and in the horizontal and vertical components of group velocity produced on the vertical grids when describing waves with horizontal wavelengths of 1, 10 and 100 km. Therefore, in developing a numerical model, a higher-order finite difference scheme such as the fourth-order compact difference scheme should be avoided as much as possible, particularly on the L and CPTS grids, since it not only takes considerable effort to program but can also make the calculated horizontal and vertical group velocities even less accurate.
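As a rough illustration of why a higher-order scheme does not automatically help, the sketch below compares the modified (effective) wavenumbers of a standard second-order central difference and a standard fourth-order compact (Pade) difference on a uniform grid. This is a generic textbook comparison, not the vertical-grid analysis of the paper above, and the grid spacing is an arbitrary placeholder.

```python
import numpy as np

# Modified (effective) wavenumber k* of two first-derivative approximations
# on a uniform grid with spacing dz; k* = k would mean no dispersion error.

def k_star_central2(k, dz):
    # Second-order central difference: f'_i ~ (f_{i+1} - f_{i-1}) / (2 dz)
    return np.sin(k * dz) / dz

def k_star_compact4(k, dz):
    # Classical fourth-order compact (Pade) scheme:
    # (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = 3 (f_{i+1} - f_{i-1}) / (4 dz)
    return 3.0 * np.sin(k * dz) / ((2.0 + np.cos(k * dz)) * dz)

dz = 100.0                                 # placeholder grid spacing [m]
k = np.linspace(0.01, np.pi, 50) / dz      # resolvable wavenumbers up to k*dz = pi

for name, k_star in (("central-2", k_star_central2(k, dz)),
                     ("compact-4", k_star_compact4(k, dz))):
    # Relative dispersion error; both schemes still fail badly near the
    # shortest resolvable scale, regardless of formal order of accuracy.
    err = np.abs(k_star - k) / k
    print(f"{name}: max relative wavenumber error = {err.max():.2f}")
```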
Abstract:
A new lead(II) phosphonate, Pb[(PO3)2C(OH)CH3]·H2O (1), was hydrothermally synthesized and characterized by IR, elemental analysis, UV, TGA, SEM, and single-crystal X-ray diffraction analysis. The X-ray crystallographic study showed that complex 1 has a two-dimensional double-layered hybrid structure containing interconnected 4- and 12-membered rings and exhibits an unusual (5,5)-connected (4^7.6^3)(4^8.6^2) topology.
Abstract:
We review the progress made in computational vision, as represented by Marr's approach, in the last fifteen years. First, we briefly outline computational theories developed for low, middle and high-level vision. We then discuss in more detail solutions proposed to three representative problems in vision, each dealing with a different level of visual processing. Finally, we discuss modifications to the currently established computational paradigm that appear to be dictated by the recent developments in vision.
Abstract:
The computer science technique of computational complexity analysis can provide powerful insights into the algorithm-neutral analysis of information processing tasks. Here we show that a simple, theory-neutral linguistic model of syntactic agreement and ambiguity demonstrates that natural language parsing may be computationally intractable. Significantly, we show that it may be syntactic features rather than rules that cause this difficulty. Informally, human languages and the computationally intractable Satisfiability (SAT) problem share two costly computational mechanisms: both enforce agreement among symbols across unbounded distances (Subject-Verb agreement), and both allow ambiguity (is a word a Noun or a Verb?).
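To make the informal SAT analogy above concrete, the sketch below encodes a toy version of it: each lexically ambiguous word is a boolean variable (e.g. read as noun vs. verb), agreement requirements become clauses over those variables, and a consistent parse exists only if the clauses are simultaneously satisfiable. The mini-lexicon and constraints are invented for illustration and are not taken from the paper.

```python
from itertools import product

# Toy encoding: each ambiguous word is a boolean variable
# (True = read as Noun, False = read as Verb). Placeholder words.
ambiguous_words = ["duck", "fish"]

# Agreement/selection requirements as CNF clauses; each literal is
# (word, required_value). The constraints are invented placeholders.
clauses = [
    [("duck", True), ("fish", False)],      # at least one of these readings
    [("duck", False), ("fish", False)],     # a long-distance requirement
]

def satisfiable(words, cnf):
    """Brute-force SAT check over all readings of the ambiguous words."""
    for values in product([True, False], repeat=len(words)):
        assignment = dict(zip(words, values))
        if all(any(assignment[w] == v for w, v in clause) for clause in cnf):
            return assignment
    return None

print(satisfiable(ambiguous_words, clauses))
# None would mean no consistent reading; the search is exponential in the
# number of ambiguous words, mirroring why agreement plus ambiguity can make
# parsing intractable in the worst case.
```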
Abstract:
This thesis introduces elements of a theory of design activity and a computational framework for developing design systems. The theory stresses the opportunistic nature of designing and the complementary roles of focus and distraction, the interdependence of evaluation and generation, the multiplicity of ways of seeing over the history of a design session versus the exclusivity of a given way of seeing over an arbitrarily short period, and the incommensurability of criteria used to evaluate a design. The thesis argues for a principle based rather than rule based approach to designing documents. The Discursive Generator is presented as a computational framework for implementing specific design systems, and a simple system for arranging blocks according to a set of formal principles is developed by way of illustration. Both shape grammars and constraint based systems are used to contrast current trends in design automation with the discursive approach advocated in the thesis. The Discursive Generator is shown to have some important properties lacking in other types of systems, such as dynamism, robustness and the ability to deal with partial designs. When studied in terms of a search metaphor, the Discursive Generator is shown to exhibit behavior which is radically different from some traditional search techniques, and to avoid some of the well-known difficulties associated with them.
Abstract:
Does knowledge of language consist of symbolic rules? How do children learn and use their linguistic knowledge? To elucidate these questions, we present a computational model that acquires phonological knowledge from a corpus of common English nouns and verbs. In our model the phonological knowledge is encapsulated as boolean constraints operating on classical linguistic representations of speech sounds in terms of distinctive features. The learning algorithm compiles a corpus of words into increasingly sophisticated constraints. The algorithm is incremental, greedy, and fast. It yields one-shot learning of phonological constraints from a few examples. Our system exhibits behavior similar to that of young children learning phonological knowledge. As a bonus, the constraints can be interpreted as classical linguistic rules. The computational model can be implemented by a surprisingly simple hardware mechanism. Our mechanism also sheds light on a fundamental AI question: How are signals related to symbols?
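The idea of compiling a corpus into boolean constraints over distinctive features can be illustrated with a much simpler stand-in: collect the pairwise feature combinations that never occur in the data and treat each as a forbidden combination. The sketch below is only a simplified illustration of that style of constraint learning, not the paper's algorithm; the feature names and toy corpus are invented.

```python
from itertools import combinations

# Toy corpus: each segment is a bundle of binary distinctive features.
# Feature names and values are invented placeholders.
corpus = [
    {"voiced": True,  "nasal": False, "continuant": True},
    {"voiced": True,  "nasal": True,  "continuant": False},
    {"voiced": False, "nasal": False, "continuant": True},
    {"voiced": False, "nasal": False, "continuant": False},
]

def learn_pair_constraints(segments):
    """Collect pairwise feature combinations never observed in the corpus.

    Each returned constraint is a forbidden combination, i.e. the boolean
    constraint "not (f1 = v1 and f2 = v2)".
    """
    features = sorted(segments[0])
    observed = set()
    for seg in segments:
        for f1, f2 in combinations(features, 2):
            observed.add((f1, seg[f1], f2, seg[f2]))
    constraints = []
    for f1, f2 in combinations(features, 2):
        for v1 in (True, False):
            for v2 in (True, False):
                if (f1, v1, f2, v2) not in observed:
                    constraints.append({f1: v1, f2: v2})
    return constraints

def violates(segment, constraints):
    return any(all(segment[f] == v for f, v in c.items()) for c in constraints)

rules = learn_pair_constraints(corpus)
print("learned forbidden combinations:", rules)
print(violates({"voiced": False, "nasal": True, "continuant": True}, rules))
```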
Abstract:
This report describes a computational system with which phonologists may describe a natural language in terms of autosegmental phonology, currently the most advanced theory pertaining to the sound systems of human languages. This system allows linguists to easily test autosegmental hypotheses against a large corpus of data. The system was designed primarily with tonal systems in mind, but also provides support for tree or feature matrix representation of phonemes (as in The Sound Pattern of English), as well as syllable structures and other aspects of phonological theory. Underspecification is allowed, and trees may be specified before, during, and after rule application. The association convention is automatically applied, and other principles such as the conjunctivity condition are supported. The method of representation was designed such that rules are designated in as close a fashion as possible to the existing conventions of autosegmental theory while adhering to a textual constraint for maximum portability.
Abstract:
This thesis describes an investigation of retinal directional selectivity. We show intracellular (whole-cell patch) recordings in turtle retina which indicate that this computation occurs prior to the ganglion cell, and we describe a pre-ganglionic circuit model to account for this and other findings which places the non-linear spatio-temporal filter at individual, oriented amacrine cell dendrites. The key non-linearity is provided by interactions between excitatory and inhibitory synaptic inputs onto the dendrites, and their distal tips provide directionally selective excitatory outputs onto ganglion cells. Detailed simulations of putative cells support this model, given reasonable parameter constraints. The performance of the model also suggests that this computational substructure may be relevant within the dendritic trees of CNS neurons in general.
Abstract:
The primary goal of this report is to demonstrate how considerations from computational complexity theory can inform grammatical theorizing. To this end, generalized phrase structure grammar (GPSG) linguistic theory is revised so that its power more closely matches the limited ability of an ideal speaker-hearer: GPSG Recognition is EXP-POLY time hard, while Revised GPSG Recognition is NP-complete. A second goal is to provide a theoretical framework within which to better understand the wide range of existing GPSG models, embodied in formal definitions as well as in implemented computer programs. A grammar for English and an informal explanation of the GPSG/RGPSG syntactic features are included in appendices.
Abstract:
This report investigates the process of focussing as a description and explanation of the comprehension of certain anaphoric expressions in English discourse. The investigation centers on the interpretation of definite anaphora, that is, on personal pronouns and on noun phrases used with a definite article (the, this, or that). Focussing is formalized as a process in which a speaker centers attention on a particular aspect of the discourse. An algorithmic description specifies what the speaker can focus on and how the speaker may change the focus of the discourse as the discourse unfolds. The algorithm allows for a simple focussing mechanism to be constructed: an element in focus, an ordered collection of alternate foci, and a stack of old foci. The data structure for the element in focus is a representation which encodes a limited set of associations between it and other elements from the discourse as well as from general knowledge.
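The three-part mechanism described above (an element in focus, an ordered collection of alternate foci, and a stack of old foci) maps naturally onto a small data structure. The sketch below is one plausible rendering of that organization; the class name, method names, and example discourse entities are invented for illustration and are not taken from the report.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FocusState:
    """Minimal focusing mechanism: current focus, alternate foci, old-focus stack."""
    focus: Optional[str] = None
    alternates: List[str] = field(default_factory=list)   # ordered by preference
    old_foci: List[str] = field(default_factory=list)     # used as a stack

    def shift_to(self, entity: str) -> None:
        """Move focus to a new discourse entity, remembering the old one."""
        if self.focus is not None:
            self.old_foci.append(self.focus)
        self.focus = entity
        self.alternates = [a for a in self.alternates if a != entity]

    def return_to_previous(self) -> Optional[str]:
        """Pop the most recent old focus, e.g. when a digression ends."""
        if self.old_foci:
            self.focus = self.old_foci.pop()
        return self.focus

# Example: focus moves from "the meeting" to "the agenda" and back.
state = FocusState()
state.shift_to("the meeting")
state.alternates = ["the agenda", "the room"]
state.shift_to("the agenda")
print(state.focus, state.old_foci)      # the agenda ['the meeting']
state.return_to_previous()
print(state.focus)                      # the meeting
```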
Abstract:
This thesis confronts the nature of the process of learning an intellectual skill, the ability to solve problems efficiently in a particular domain of discourse. The investigation is synthetic; a computational performance model, HACKER, is displayed. HACKER is a computer problem-solving system whose performance improves with practice. HACKER maintains performance knowledge as a library of procedures indexed by descriptions of the problem types for which the procedures are appropriate. When applied to a problem, HACKER tries to use a procedure from this "Answer Library". If no procedure is found to be applicable, HACKER writes one using more general knowledge of the problem domain and of programming techniques. This new program may be generalized and added to the Answer Library.
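The Answer Library described above is essentially a procedure store indexed by problem-type descriptions, with a fallback that synthesizes and files a new procedure when no indexed one applies. The sketch below illustrates that control structure in miniature; the predicates, the toy "synthesis" step, and the example block-stacking problems are invented placeholders, not HACKER's actual machinery.

```python
# Minimal sketch of an "Answer Library": procedures indexed by a predicate
# describing the problem types they handle. All entries are toy examples.

answer_library = [
    (lambda p: p["kind"] == "stack" and p["n"] == 2,
     lambda p: f"put {p['blocks'][0]} on {p['blocks'][1]}"),
]

def synthesize_procedure(problem):
    """Stand-in for writing a new procedure from general domain knowledge."""
    def procedure(p):
        steps = [f"put {a} on {b}" for a, b in zip(p["blocks"], p["blocks"][1:])]
        return "; ".join(steps)
    # Generalize: index the new procedure by the problem's kind, not its size.
    applies_to = lambda p, kind=problem["kind"]: p["kind"] == kind
    answer_library.append((applies_to, procedure))
    return procedure

def solve(problem):
    for applies_to, procedure in answer_library:
        if applies_to(problem):
            return procedure(problem)
    # No indexed procedure applies: write one, file it, then use it.
    return synthesize_procedure(problem)(problem)

print(solve({"kind": "stack", "n": 2, "blocks": ["A", "B"]}))
print(solve({"kind": "stack", "n": 3, "blocks": ["A", "B", "C"]}))  # triggers synthesis
print(len(answer_library))  # the library has grown with practice
```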