855 results for complex knowledge structures
Abstract:
Humic substances are complex polymeric structures. No other polymers with such a wide range of properties are so widely distributed in nature, yet their molecular structures remain unknown. Structural knowledge is essential in determining their reactivity with metals. In the present work, the structural elucidation of humic acids from three different mangrove ecosystems of the Cochin area is carried out with the available data from functional group analysis and various spectroscopic methods. 13C NMR spectra of the solid samples with CPMAS, IR and SEM are very promising in revealing the complex structures of these polymeric substances. Sorption studies on the sediment and humic acid of the mangrove ecosystem reveal that the major portion of the organic matter is not extractable with sodium hydroxide and that humic acid constitutes only a small portion of the total organic matter. Humic acid is a good complexing agent and scavenger. Owing to the non-extractable nature of the organic matter it contains, the sediment left after alkali extraction is an even better scavenger.
Abstract:
Formal Concept Analysis makes it possible to derive conceptual hierarchies from data tables. It is applied in various domains, e.g., data analysis, information retrieval, and knowledge discovery in databases. In order to deal with increasing sizes of the data tables (and to allow more complex data structures than just binary attributes), conceptual scales have been developed. They are considered as metadata which structure the data conceptually. But in large applications, the number of conceptual scales increases as well. Techniques are needed which also support the user's navigation on this meta-level of conceptual scales. In this paper, we attack this problem by extending the set of scales with hierarchically ordered higher-level scales and by introducing a visualization technique called nested scaling. We extend the two-level architecture of Formal Concept Analysis (the data table plus one level of conceptual scales) to a many-level architecture with a cascading system of conceptual scales. The approach also makes it possible to use the representation techniques of Formal Concept Analysis for the visualization of thesauri and ontologies.
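As an illustration of the core derivation step, the sketch below enumerates the formal concepts of a tiny binary context. This is a minimal sketch with invented objects and attributes, not taken from the paper; in Formal Concept Analysis a conceptual scale plays the role of such a context for a group of attributes.

```python
from itertools import combinations

# Toy binary context: objects mapped to the attributes they have.
# Object and attribute names are illustrative only.
context = {
    "lion":  {"mammal", "predator"},
    "eagle": {"bird", "predator"},
    "dove":  {"bird"},
}

def intent(objects):
    """Attributes shared by every object in the given set."""
    objs = list(objects)
    if not objs:
        return set.union(*context.values())
    common = set(context[objs[0]])
    for o in objs[1:]:
        common &= context[o]
    return common

def extent(attributes):
    """Objects that have every attribute in the given set."""
    return {o for o, attrs in context.items() if attributes <= attrs}

def formal_concepts():
    """Enumerate all (extent, intent) pairs by closing each object subset."""
    concepts = set()
    all_objects = list(context)
    for r in range(len(all_objects) + 1):
        for combo in combinations(all_objects, r):
            b = intent(combo)   # shared attributes of the subset
            a = extent(b)       # closure: all objects having them
            concepts.add((frozenset(a), frozenset(b)))
    return concepts

concepts = formal_concepts()
```

Ordering these concepts by inclusion of their extents yields exactly the conceptual hierarchy (concept lattice) the abstract refers to.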
Abstract:
This dissertation synthesizes previous research and develops a model for the study of strategic development, strategic congruence and management control. The model is used to analyze a longitudinal case study of the Swedish engineering company Atlas Copco. Employing contingency theory, the study confirms that the long-term survival of a company requires adaptation to contingencies. Three levels of strategy are examined: corporate, business and functional. Previous research suggests that consistency between these levels (strategic congruence) is necessary for a company to be competitive. The dissertation challenges this proposition by using a life-cycle perspective and analyzes strategic congruence in the different phases of a life cycle. It also studies management control from a life-cycle perspective. In this context, two types of management control are examined: formal and informal. From a longitudinal perspective, the study further discusses how these types interact during organizational life cycles. The dissertation shows that strategic development is more complex than previous studies have indicated. It is a long, complex and non-linear process, the results of which cannot always be predicted. Previous models for strategy and management control are based on simple relationships and rarely take into account the fact that companies often go through different phases of strategic development. The case study shows that strategic incongruence may occur at times during organizational life cycles. Furthermore, the use of management control varies over time. In the maturity phase, formal control is in focus, while informal control plays a bigger role in both the introduction and decline phases. Research on strategy and management control has intensified in recent years. Still, there is a gap regarding the coordination of complex corporate structures. The present study contributes further knowledge on how companies manage long-term strategic development.
Few studies deal with more than two levels of strategy. Moreover, the present study addresses the need to understand strategic congruence from a life-cycle perspective. This is particularly relevant in practice, where managers in large companies face difficult issues and expect business research to assist them in the decision-making process.
Abstract:
Knowledge of the normal anatomy and its variations is paramount for the safe surgical management of tumors of the sellar region. The sellar region is located in the center of the middle cranial fossa; it contains complex anatomical structures and is the site of various pathological processes: tumoral, vascular, developmental, and neuroendocrine. We review the microsurgical anatomy (microscopic and endoscopic) of this region and discuss the surgical nuances of this topic, based on anatomical concepts.
Abstract:
The model of development and evolution of complex morphological structures conceived by Atchley and Hall in 1991 (Biol. Rev. 66:101-157), which establishes that changes at the macroscopic, morphogenetic level can be statistically detected as variation in skeletal units at distinct scales, was applied in combination with the formalism of geometric morphometrics to study variation in mandible shape among populations of the rodent species Thrichomys apereoides. The thin-plate spline technique produced geometric descriptors of shape derived from anatomical landmarks in the mandible, which we used with graphical and inferential approaches to partition the contribution of global and localized components to the observed differentiation in mandible shape. A major pattern of morphological differentiation in T. apereoides is attributable to localized components of shape at smaller geometric scales associated with specific morphogenetic units of the mandible. On the other hand, a clinal trend of variation is associated primarily with localized components of shape at larger geometric scales. Morphogenetic mechanisms assumed to be operating to produce the observed differentiation in the specific units of the mandible include mesenchymal condensation differentiation, muscle hypertrophy, and tooth growth. Perspectives for the application of models of morphological evolution and geometric morphometrics to morphologically based systematic biology are considered.
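Geometric descriptors of shape are computed on landmark configurations that have first been freed of location, scale and orientation. The sketch below shows that standard superimposition step (ordinary Procrustes alignment) on invented 2-D landmarks; the thin-plate spline decomposition used in the study operates on landmarks aligned in this way. The landmark coordinates here are toy data, not measurements from T. apereoides.

```python
import numpy as np

def procrustes_align(ref, shape):
    """Ordinary Procrustes superimposition: remove location and scale,
    then find the rotation (via SVD) that best maps `shape` onto `ref`."""
    def normalize(x):
        x = x - x.mean(axis=0)            # remove location (center at origin)
        return x / np.linalg.norm(x)      # remove scale (unit centroid size)
    a, b = normalize(ref), normalize(shape)
    u, _, vt = np.linalg.svd(b.T @ a)     # orthogonal Procrustes solution
    r = u @ vt                            # optimal rotation matrix
    return a, b @ r

# Toy 2-D landmark sets: the second is the first, rotated, scaled, shifted.
ref = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
theta = np.pi / 6
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
shape = 2.5 * ref @ rot.T + 3.0

a, b_aligned = procrustes_align(ref, shape)
residual = np.linalg.norm(a - b_aligned)  # Procrustes distance, ~0 here
```

Because the second configuration differs from the first only by a similarity transform, the residual Procrustes distance is essentially zero; real specimens would leave a nonzero residual that carries the shape variation analyzed in the paper.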
Abstract:
Graduate Program in Production Engineering - FEB
Abstract:
The weakening mechanisms involved in the collapse of complex impact craters are controversial. The Araguainha impact crater, in Brazil, exposes a complex structure 40 km in diameter and is an excellent object with which to address this issue. Its core is dominated by granite. In addition to microstructural observations, magnetic studies reveal the internal fabric acquired during the collapse phase. All granite samples exhibit impact-related planar deformation features (PDFs) and planar fractures (PFs), which were overprinted by cataclasis. Cataclastic deformation evolved from incipient brittle fracturing to the development of discrete shear bands in the center of the structure. Fracture planes are systematically decorated by tiny grains (<10 mu m) of magnetite and hematite, and the orientations of the magnetic lineation and magnetic foliation obtained from the anisotropies of magnetic susceptibility (AMS) and anhysteretic remanence (AAR) are perfectly coaxial at all studied sites. We could therefore track the orientation of the deformation features decorated by iron oxides using the AMS and AAR. The magnetic fabrics show a regular pattern at the borders of the central peak, with orientations consistent with the fabric of sediments at the crater's inner collar, and a complex pattern in the center of the structure. Both the cataclastic flow revealed by microstructural observations and the structural pattern of the magnetic anisotropy match the predictions of numerical models of complex impact structures. The widespread occurrence of cataclasis in the central peak, and its orientations revealed by the magnetic studies, indicate that acoustic fluidization likely operates at all scales, including the mineral scale. The cataclastic flow made possible by acoustic fluidization results in an apparent plastic deformation at the macroscopic scale in the core. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
High-frequency seismograms contain features that reflect the random inhomogeneities of the earth. In this work I use an imaging method to locate high-contrast small-scale heterogeneity with respect to the background earth medium. This method was first introduced by Nishigami (1991) and then applied to different volcanic and tectonically active areas (Nishigami, 1997, 2000, 2006). The scattering imaging method is applied to two volcanic areas: Campi Flegrei and Mt. Vesuvius. Volcanically and seismologically active areas are often characterized by complex velocity structures, due to the presence of rocks with different elastic properties. I introduce some modifications to the original method in order to make it suitable for small and highly complex media. In particular, for very complex media the single-scattering approximation assumed by Nishigami (1991) is not applicable, as the mean free path becomes short; the multiple-scattering, or diffusive, approximation becomes closer to reality. In this thesis, unlike the original Nishigami method (Nishigami, 1991), I use the mean of the recorded coda envelopes as the reference curve and calculate the variations from this average envelope. In this way I implicitly assume no particular scattering regime for the "average" scattered radiation, whereas I consider the variations as due to waves that are singly scattered from the strongest heterogeneities. The imaging method is applied to a relatively small area (20 x 20 km), a choice justified by the short length of the analyzed codas of the low-magnitude earthquakes. I apply the unmodified Nishigami method to the volcanic area of Campi Flegrei and compare the results with the other tomographies carried out in the same area. The scattering images, obtained with waves at frequencies around 18 Hz, show the presence of strong scatterers in correspondence with the submerged caldera rim in the southern part of the Pozzuoli bay.
Strong scattering is also found below the Solfatara crater, characterized by the presence of densely fractured, fluid-filled rocks and by a strong thermal anomaly. The modified Nishigami technique is applied to the Mt. Vesuvius area. Results show a low-scattering area just below the central cone and a high-scattering area around it. The high-scattering zone seems to be due to the contrast between the high-rigidity body located beneath the crater and the low-rigidity materials located around it. The central low-scattering area overlaps the hydrothermal reservoirs located below the central cone. An interpretation of the results in terms of the geological properties of the medium is also supplied, aiming to find a correspondence between the scattering properties and the geological nature of the material. A complementary result reported in this thesis is that the strong heterogeneity of the volcanic medium creates a phenomenon called "coda localization": the shape of the seismograms recorded at stations located at the top of the volcanic edifice of Mt. Vesuvius differs from the shape of those recorded at the bottom. This behavior is explained by the observation that, at large lapse times, the coda energy is not uniformly distributed within a region surrounding the source.
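The modification described above, using the mean recorded coda envelope as the reference curve and attributing per-trace deviations to strong scatterers, can be sketched as follows. The traces are synthetic and the window parameters invented; this is not the thesis's actual processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)

def coda_envelope(trace, win=50):
    """Smoothed envelope: RMS amplitude over a sliding window."""
    power = np.convolve(trace**2, np.ones(win) / win, mode="same")
    return np.sqrt(power)

# Synthetic codas: exponentially decaying noise. One trace carries an
# extra burst mimicking energy singly scattered off a strong heterogeneity.
t = np.arange(2000)
decay = np.exp(-t / 600.0)
traces = [decay * rng.normal(size=t.size) for _ in range(8)]
traces[3][900:1000] += 5 * decay[900:1000] * rng.normal(size=100)

envelopes = np.array([coda_envelope(tr) for tr in traces])
mean_env = envelopes.mean(axis=0)    # reference curve: stacked mean envelope
deviation = envelopes - mean_env     # residuals attributed to scatterers

# The anomalous trace deviates most from the reference in the burst window.
strongest = int(np.argmax(np.abs(deviation[:, 900:1000]).mean(axis=1)))
```

No scattering regime is assumed for the stacked reference itself; only the residuals are interpreted, which mirrors the modification the abstract describes.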
Abstract:
Our research project develops an intranet search engine with concept-browsing functionality, where the user is able to navigate the conceptual level in an interactive, automatically generated knowledge map. This knowledge map visualizes tacit, implicit knowledge, extracted from the intranet, as a network of semantic concepts. Inductive and deductive methods are combined; a text analytics engine extracts knowledge structures from data inductively, and the enterprise ontology provides a backbone structure to the process deductively. In addition to performing conventional keyword search, the user can browse the semantic network of concepts and associations to find documents and data records. Also, the user can expand and edit the knowledge network directly. As a vision, we propose a knowledge-management system that provides concept-browsing, based on a knowledge warehouse layer on top of a heterogeneous knowledge base with various systems interfaces. Such a concept browser will empower knowledge workers to interact with knowledge structures.
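A minimal sketch of the concept-browsing idea: keyword search over concept names alongside navigation of the association network to reach attached documents. The toy knowledge map below (concepts, associations and document names) is entirely invented for illustration.

```python
# Toy knowledge map: concepts linked by associations, with documents
# attached to concepts. All names are illustrative.
associations = {
    "machine learning": {"neural networks", "statistics"},
    "neural networks": {"machine learning", "deep learning"},
    "statistics": {"machine learning"},
    "deep learning": {"neural networks"},
}
documents = {
    "machine learning": ["intro_ml.pdf"],
    "neural networks": ["backprop_notes.txt"],
    "deep learning": ["cnn_survey.pdf"],
    "statistics": [],
}

def browse(concept, hops=1):
    """Concept browsing: collect documents reachable within `hops`
    association steps of the starting concept."""
    frontier, seen = {concept}, {concept}
    for _ in range(hops):
        frontier = {n for c in frontier for n in associations[c]} - seen
        seen |= frontier
    return sorted(d for c in seen for d in documents[c])

def keyword_search(term):
    """Conventional keyword search over concept names."""
    return sorted(c for c in documents if term in c)
```

Expanding or editing the network, as the abstract envisions, would amount to mutating `associations` and `documents` directly.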
Abstract:
Folksonomies emerge as the result of the free tagging activity of a large number of users over a variety of resources. They can be considered as valuable sources from which it is possible to obtain emerging vocabularies that can be leveraged in knowledge extraction tasks. However, when it comes to understanding the meaning of tags in folksonomies, several problems mainly related to the appearance of synonymous and ambiguous tags arise, specifically in the context of multilinguality. The authors aim to turn folksonomies into knowledge structures where tag meanings are identified, and relations between them are asserted. For such purpose, they use DBpedia as a general knowledge base from which they leverage its multilingual capabilities.
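The consolidation step, resolving raw tags to shared senses so that synonymous and multilingual tags group together, might be sketched as follows. The tag-to-sense mapping here is invented illustration data standing in for DBpedia lookups; it is not the authors' actual pipeline.

```python
# Toy sense inventory standing in for DBpedia lookups: each raw tag
# (possibly non-English or a synonym) maps to a canonical resource.
# The mapping itself is hypothetical illustration data.
sense_of = {
    "nyc": "New_York_City",
    "new york": "New_York_City",
    "nueva york": "New_York_City",   # Spanish synonym, same sense
    "apple": "Apple_Inc",            # ambiguous tag, one sense chosen
    "jaguar": "Jaguar_(animal)",
}

def consolidate(tag_assignments):
    """Group tagged resources by resolved sense instead of raw tag,
    so synonymous and multilingual tags fall together."""
    grouped = {}
    for resource, tags in tag_assignments.items():
        for tag in tags:
            sense = sense_of.get(tag.lower())
            if sense is not None:
                grouped.setdefault(sense, set()).add(resource)
    return grouped

tagged = {
    "photo1": ["NYC", "skyline"],
    "photo2": ["Nueva York"],
    "doc1": ["apple", "earnings"],
}
groups = consolidate(tagged)
```

In the real setting the lookup would query a multilingual knowledge base and disambiguate among candidate senses rather than assume a fixed table.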
Abstract:
The definition of an agent architecture at the knowledge level emphasizes the knowledge role played by the data interchanged between the agent components and makes this data interchange explicit, which eases the reuse of these knowledge structures independently of the implementation. This article defines a generic task model of an agent architecture and refines some of these tasks using inference diagrams. Finally, an operationalisation of this conceptual model using the rule-oriented language Jess is shown.
Abstract:
Knowledge modeling tools are software tools that follow a modeling approach to help developers build a knowledge-based system. The purpose of this article is to show the advantages of using this type of tool in the development of complex knowledge-based decision support systems. In order to do so, the article describes the development of a system called SAIDA in the domain of hydrology with the help of the KSM modeling tool. SAIDA operates in real time on data recorded by sensors (rainfall, water levels, flows, etc.). It follows a multi-agent architecture to interpret the data, predict future behavior and recommend control actions. The system includes an advanced knowledge-based architecture with multiple symbolic representations. KSM was especially useful for designing and implementing this complex knowledge-based architecture in an efficient way.
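The interpret-then-recommend pipeline can be sketched as below. The thresholds, rules, and names are invented for illustration; they are not SAIDA's actual knowledge base or agent protocol.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str     # e.g. "rainfall" or "water_level" (illustrative names)
    value: float

def interpret(readings, level_alert=4.0, rain_alert=30.0):
    """Data-interpretation step: flag sensors above alert thresholds.
    Thresholds are invented, not taken from SAIDA."""
    alerts = []
    for r in readings:
        if r.sensor == "water_level" and r.value > level_alert:
            alerts.append(("water_level", r.value))
        if r.sensor == "rainfall" and r.value > rain_alert:
            alerts.append(("rainfall", r.value))
    return alerts

def recommend(alerts):
    """Control-recommendation step: simple symbolic rules."""
    kinds = {k for k, _ in alerts}
    if {"water_level", "rainfall"} <= kinds:
        return "open spillway gates"
    if "water_level" in kinds:
        return "increase reservoir discharge"
    if "rainfall" in kinds:
        return "monitor upstream basins"
    return "no action"

readings = [Reading("rainfall", 45.0), Reading("water_level", 5.2)]
action = recommend(interpret(readings))
```

In a multi-agent arrangement, interpretation and recommendation would run in separate agents exchanging the alert structures rather than calling each other directly.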
Abstract:
This paper describes the adaptation approach for the reusable knowledge representation components used in the KSM environment for the formulation and operationalisation of structured knowledge models. Reusable knowledge representation components in KSM are called primitives of representation. A primitive of representation provides: (1) a knowledge representation formalism; (2) a set of tasks that use this knowledge, together with several problem-solving methods to carry out these tasks; (3) a knowledge acquisition module that provides different services to acquire and validate this knowledge; and (4) an abstract terminology describing the linguistic categories included in the representation language associated with the primitive. Primitives of representation are usually domain-independent. A primitive of representation can be adapted to support knowledge in a given domain by importing concepts from this domain. The paper describes how this activity can be carried out by means of a terminological importation. Informally, a terminological importation partially populates an abstract terminology with concepts taken from a given domain. The information provided by the importation can be used by the acquisition and validation facilities to constrain the classes of knowledge that can be described using the representation formalism according to the domain knowledge. KSM provides the LINK-S language to specify a terminological importation from a domain terminology to an abstract one. These terminologies are described in KSM by means of the CONCEL language. Terminological importation is used to adapt reusable primitives of representation in order to increase the usability of such components in these domains. In addition, two primitives of representation can share a common vocabulary by importing common domain CONCEL terminologies (conceptual vocabularies).
This is a necessary condition for interoperability between different, heterogeneous knowledge representation components in the framework of complex knowledge-based architectures.
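The idea of terminological importation, partially populating an abstract terminology with domain concepts and then using it to validate acquired knowledge, might be sketched as follows. Category and concept names are invented for illustration; this is not CONCEL or LINK-S syntax.

```python
# Abstract terminology of a toy primitive of representation: linguistic
# categories with no domain content yet. Category names are invented.
abstract_terminology = {"observable": set(), "state": set(), "action": set()}

def import_terms(terminology, mapping):
    """Terminological importation: partially populate the abstract
    categories with concepts taken from a domain terminology."""
    for category, concepts in mapping.items():
        terminology[category] |= set(concepts)

def validate(terminology, category, concept):
    """Acquisition-time check: only imported domain concepts may be
    used to fill the given linguistic category."""
    return concept in terminology[category]

# Import hydrology-flavored concepts into the abstract categories.
import_terms(abstract_terminology, {
    "observable": ["rainfall", "water_level"],
    "action": ["open_gate"],
})

ok = validate(abstract_terminology, "observable", "rainfall")
bad = validate(abstract_terminology, "observable", "share_price")
```

Two primitives sharing the same imported vocabulary would, as the abstract notes, accept and reject the same concepts, which is what makes their interoperation possible.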
Abstract:
Concurrent data types are concurrent implementations of classical data abstractions, specifically designed to exploit the great deal of parallelism available in modern multiprocessor and multi-core architectures. The correct manipulation of concurrent data types is essential for the overall correctness of the software systems built using them. A major difficulty in designing and verifying concurrent data types arises from the need to reason about any number of threads invoking the data type simultaneously, which requires considering parametrized systems. In this work we study the formal verification of temporal properties of parametrized concurrent systems, with a special focus on programs that manipulate concurrent data structures. The main difficulty in reasoning about concurrent parametrized systems comes from the combination of their inherently high concurrency and the manipulation of dynamic memory. This parametrized verification problem is very challenging, because it requires reasoning about complex concurrent data structures being accessed and modified by threads which simultaneously manipulate the heap using unstructured synchronization methods. In this work, we present a formal framework based on deductive methods which is capable of dealing with the verification of safety and liveness properties of concurrent parametrized systems that manipulate complex data structures. Our framework includes special proof rules and techniques adapted for parametrized systems, which work in collaboration with specialized decision procedures for complex data structures. A novel aspect of our framework is that it cleanly differentiates the analysis of the program control flow from the analysis of the data being manipulated.
The program control flow is analyzed using deductive proof rules and verification techniques specifically designed to cope with parametrized systems. Starting from a concurrent program and a temporal specification, our techniques generate a finite collection of verification conditions whose validity entails the satisfaction of the temporal specification by any client system, regardless of the number of threads. The verification conditions correspond to the data manipulation. We study the design of specialized decision procedures to discharge these verification conditions fully automatically. We investigate decidable theories capable of describing rich properties of complex pointer-based data types such as stacks, queues, lists and skiplists. For each of these theories we present a decision procedure and a practical implementation on top of existing SMT solvers. These decision procedures are ultimately used to automatically verify the verification conditions generated by our specialized parametrized verification techniques. Finally, we show how, using our framework, it is possible to prove not only safety but also liveness properties of concurrent versions of some mutual exclusion protocols and programs that manipulate concurrent data structures. The approach we present in this work is very general and can be applied to verify a wide range of similar concurrent data types.
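The kind of safety property at stake can be illustrated with a toy concurrent data type: the invariant below must hold for any number of client threads, which is exactly what parametrized verification establishes deductively, rather than by testing finitely many instances as this sketch does. The code is an invented illustration, not the dissertation's formalism.

```python
import threading

class ConcurrentStack:
    """Lock-based stack: a simple concurrent data type whose safety
    property (no lost pushes) should hold for any number of threads."""
    def __init__(self):
        self._items = []
        self._lock = threading.Lock()

    def push(self, x):
        with self._lock:
            self._items.append(x)

    def pop(self):
        with self._lock:
            return self._items.pop() if self._items else None

def run_clients(num_threads, pushes_per_thread):
    """Parametrized harness: the count invariant must hold regardless
    of `num_threads`. A test checks one instance; a deductive proof
    covers them all."""
    stack = ConcurrentStack()
    def client():
        for i in range(pushes_per_thread):
            stack.push(i)
    threads = [threading.Thread(target=client) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return len(stack._items)

total = run_clients(num_threads=8, pushes_per_thread=100)
```

A lock-free variant of the same stack would exhibit the unstructured synchronization the abstract mentions, and proving the same invariant for it is correspondingly harder.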
Abstract:
In view of the need for tools that facilitate the reuse of existing knowledge structures such as ontologies, we present in this paper AKTiveRank, a system for ranking ontologies. AKTiveRank takes as input the search terms provided by a knowledge engineer and, using the output of an ontology search engine, ranks the ontologies. We apply a number of metrics in an attempt to investigate their appropriateness for ranking ontologies, and compare the results with a questionnaire-based human study. Our results show that AKTiveRank has great utility, although there is potential for improvement.
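A ranking of this kind typically aggregates several per-ontology metric scores into a single value. The sketch below shows such a weighted aggregation; the metric names, weights, and ontology files are invented placeholders, not AKTiveRank's actual metrics or data.

```python
# Toy ranking: combine per-ontology metric scores into one rank value
# via a weighted sum. All names and numbers are illustrative.
weights = {"class_match": 0.4, "density": 0.3, "centrality": 0.3}

ontologies = {
    "onto_a.owl": {"class_match": 0.9, "density": 0.5, "centrality": 0.7},
    "onto_b.owl": {"class_match": 0.6, "density": 0.8, "centrality": 0.4},
}

def score(metrics):
    """Weighted aggregate over normalized per-metric scores."""
    return sum(weights[m] * v for m, v in metrics.items())

# Rank candidates returned by an ontology search engine, best first.
ranking = sorted(ontologies, key=lambda o: score(ontologies[o]), reverse=True)
```

Comparing such a computed ranking against human judgments, as the paper does with its questionnaire study, is then a matter of rank-correlation between the two orderings.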