917 results for non-trivial data structures


Relevance: 100.00%

Abstract:

The non-isothermal data given by TG curves for poly(3-hydroxybutyrate) (PHB) were studied in order to obtain a consistent kinetic model that better represents the PHB thermal decomposition. Data obtained from the dynamic TG curves were processed to obtain the Arrhenius activation energy E according to the isoconversional Flynn-Wall-Ozawa (F-W-O) method. Once E is found, a suitable log A and kinetic model f(α) can be calculated. The kinetic triplet (E ± SD, log A ± SD and f(α)) obtained for the thermal decomposition of PHB under non-isothermal conditions was E = 152 ± 4 kJ mol^-1, log A = 14.1 ± 0.2 (A in s^-1), and the autocatalytic model function was f(α) = α^m (1-α)^n = α^0.42 (1-α)^0.56.
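As a minimal sketch of the isoconversional step (assuming Python with numpy, and invented heating rates and temperatures rather than the paper's TG data): at a fixed conversion α, log β is regressed against 1/T, and the slope -0.4567·E/R yields E.

# Hedged sketch of the isoconversional Flynn-Wall-Ozawa (F-W-O) method.
# The heating rates and temperatures below are made-up placeholders,
# not the paper's TG data; they only illustrate the regression step.
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

# Heating rates beta (K/min) and the temperatures T (K) at which a fixed
# conversion alpha (e.g. alpha = 0.5) is reached in each dynamic TG run.
beta = np.array([5.0, 10.0, 20.0, 40.0])             # hypothetical
T_at_alpha = np.array([540.0, 549.0, 558.0, 568.0])  # hypothetical, K

# F-W-O: log10(beta) = const - 0.4567 * E / (R * T)
slope, intercept = np.polyfit(1.0 / T_at_alpha, np.log10(beta), 1)
E = -slope * R / 0.4567  # activation energy in J/mol

print(f"E = {E / 1000:.0f} kJ/mol at this conversion level")

Repeating the fit over a range of conversion values yields E(α); a roughly constant E(α) supports a single-step model, after which log A and f(α) can be fitted as in the abstract.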

Relevance: 100.00%

Abstract:

This paper presents the use of a learning object (CADILAG) developed to facilitate the understanding of data structure operations through visual presentations and animations. CADILAG allows visualizing the behavior of algorithms usually discussed during Computer Science and Information Systems courses. For each data structure it is possible to visualize its content and its operation dynamically. Its use was evaluated and the results are presented.

Relevance: 100.00%

Abstract:

In this paper some aspects of chaotic behavior and minimality in the theory of planar piecewise smooth vector fields are treated. The occurrence of non-deterministic chaos is observed and the concept of orientable minimality is introduced. Some relations between minimality and orientable minimality are investigated, and the existence of new kinds of non-trivial minimal sets in chaotic systems is observed. The approach is geometrical and involves the ordinary techniques of non-smooth systems.

Relevance: 100.00%

Abstract:

We show that it is consistent with ZFC that the free Abelian group of cardinality c admits a topological group topology that makes it countably compact with a non-trivial convergent sequence.

Relevance: 100.00%

Abstract:

In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is checking the safety clearances between individual components, the so-called clearance analysis. For selected components, engineers determine whether they maintain a prescribed safety clearance to the surrounding components, both at rest and during a motion. If components fall below the safety clearance, their shape or position must be changed. For this it is important to know exactly which regions of the components violate the safety clearance.

In this thesis we present a solution for the real-time computation of all regions between two geometric objects that fall below a safety clearance. Each object is given as a set of primitives (e.g. triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety clearance, and call it the set of all tolerance-violating primitives. We present a complete solution, which can be divided into the following three major topics.

In the first part of this thesis we examine algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches to triangle-triangle tolerance tests and show that dedicated tolerance tests perform considerably better than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach proves to be the fastest.

The second part of this thesis deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is crucial to account for the required safety clearance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions can compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before, which we call Shrubs. Previous approaches to memory optimization of uniform grids rely mostly on hashing methods, but these do not reduce the memory consumption of the cell contents. In our application, neighboring cells often have similar contents. Our approach exploits these redundancies to losslessly compress the cell contents of a uniform grid to one fifth of their original size, and decompresses them at runtime.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Besides clearance analysis itself, we show applications to various path planning problems.
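The abstract does not spell out the dual-space test or the Shrubs layout, so the following is only a minimal sketch of the general grid idea it builds on, with invented names and point primitives standing in for triangles: choosing a uniform-grid cell size of at least the clearance d confines candidate pairs to adjacent cells, after which an exact distance test decides the violation.

# Generic sketch of a uniform-grid broad phase for clearance (tolerance)
# queries. Primitives are reduced to points here for brevity; the thesis
# works on triangles and uses its own structures, not shown here.
from collections import defaultdict
from itertools import product
import math

def build_grid(points, cell):
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(points):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(i)
    return grid

def violating_pairs(pts_a, pts_b, d):
    """All index pairs (i, j) with dist(pts_a[i], pts_b[j]) < d."""
    cell = d  # cell size >= d: candidates lie in the 3x3x3 neighborhood
    grid_b = build_grid(pts_b, cell)
    out = []
    for i, (x, y, z) in enumerate(pts_a):
        cx, cy, cz = int(x // cell), int(y // cell), int(z // cell)
        for dx, dy, dz in product((-1, 0, 1), repeat=3):
            for j in grid_b.get((cx + dx, cy + dy, cz + dz), ()):
                if math.dist((x, y, z), pts_b[j]) < d:  # exact narrow phase
                    out.append((i, j))
    return out

print(violating_pairs([(0, 0, 0)], [(0.4, 0, 0), (3, 0, 0)], d=0.5))
# -> [(0, 0)]: only the first point of B is within the clearance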

Relevance: 100.00%

Abstract:

Digital terrain models (DTM) typically contain large numbers of postings, from hundreds of thousands to billions. Many algorithms that run on DTMs require topological knowledge of the postings, such as finding nearest neighbors or finding the posting closest to a chosen location. If the postings are arranged irregularly, topological information is costly to compute and to store. This paper offers a practical approach to organizing and searching irregularly spaced data sets by presenting a collection of efficient algorithms (O(N), O(lg N)) that compute important topological relationships with only a simple supporting data structure. These relationships include finding the postings within a window, locating the posting nearest a point of interest, finding the neighborhood of postings nearest a point of interest, and ordering the neighborhood counter-clockwise. These algorithms depend only on two sorted arrays of two-element tuples, each holding a planimetric coordinate and an integer identification number indicating which posting the coordinate belongs to. There is one array for each planimetric coordinate (eastings and northings). These two arrays cost minimal overhead to create and store but permit the data to remain arranged irregularly.
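A minimal sketch of the supporting structure as the abstract describes it (illustrative names; only the window query is shown, since the nearest-posting and neighborhood-ordering algorithms go beyond the abstract): one sorted (coordinate, id) array per axis, a binary search on each, and an intersection of the resulting id sets.

# Sketch of the two-sorted-arrays index described in the abstract:
# one array of (coordinate, posting id) tuples per planimetric axis.
from bisect import bisect_left, bisect_right

postings = [(2.0, 5.0), (1.0, 1.5), (4.5, 3.0), (0.5, 4.0)]  # (east, north)

east = sorted((e, i) for i, (e, n) in enumerate(postings))
north = sorted((n, i) for i, (e, n) in enumerate(postings))

def ids_in_range(arr, lo, hi):
    """Posting ids whose coordinate lies in [lo, hi]; O(lg N) + output."""
    a = bisect_left(arr, (lo, -1))
    b = bisect_right(arr, (hi, len(postings)))
    return {i for _, i in arr[a:b]}

def window(e0, e1, n0, n1):
    """Postings inside the axis-aligned window [e0, e1] x [n0, n1]."""
    return ids_in_range(east, e0, e1) & ids_in_range(north, n0, n1)

print(window(0.0, 2.5, 1.0, 4.5))  # -> {1, 3}: postings (1.0,1.5), (0.5,4.0)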

Relevance: 100.00%

Abstract:

There are many diseases associated with the expansion of DNA repeats in humans. Myotonic dystrophy type 2 (DM2) is one such disease, characterized by expansions of a (CCTG)•(CAGG) repeat tract in intron 1 of zinc finger protein 9 (ZNF9) on chromosome 3q21.3. The DM2 repeat tract contains a flanking region 5' to the tract that consists of a polymorphic repetitive sequence (TG)14-25(TCTG)4-11(CCTG)n. The (CCTG)•(CAGG) repeat is typically 11-26 repeats in persons without the disease, but can expand up to 11,000 repeats in affected individuals, which is the largest expansion seen in DNA repeat diseases to date. This DNA tract remains one of the least characterized disease-associated DNA repeats, and the mechanisms causing the repeat expansion in humans have yet to be elucidated. Alternative, non-B-DNA structures formed by the expanded repeats are typical in DNA repeat expansion diseases, and these sequences may promote instability of the repeat tracts. I determined that slipped strand structure formation occurs for (CCTG)•(CAGG) repeats at a length of 42 or more. In addition, Z-DNA structure forms in the flanking human sequence adjacent to the (CCTG)•(CAGG) repeat tract. I have also performed genetic assays in E. coli cells, and the results indicate that the (CCTG)•(CAGG) repeats are more similar to the highly unstable (CTG)•(CAG) repeat tracts seen in Huntington's disease and myotonic dystrophy type 1 than to the more stable (ATTCT)•(AGAAT) repeat tracts of spinocerebellar ataxia type 10. This instability, however, is RecA-independent in the (CCTG)•(CAGG) and (ATTCT)•(AGAAT) repeats, whereas it is RecA-dependent in the (CTG)•(CAG) repeats. Structural studies of the (CCTG)•(CAGG) repeat tract and the flanking sequence, as well as genetic selection assays, may reveal the mechanisms responsible for the repeat instability in E. coli, and this may lead to a better understanding of the mechanisms contributing to the human disease state.
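As a purely hypothetical illustration of the tract lengths discussed above (11-26 repeats normally, up to about 11,000 in patients), a small sketch for measuring the longest uninterrupted (CCTG)n run in a sequence string; this is not part of the study's methods:

# Hypothetical helper: length (in repeat units) of the longest
# uninterrupted (CCTG)n tract in a DNA sequence string.
import re

def longest_cctg_run(seq):
    runs = re.findall(r"(?:CCTG)+", seq.upper())
    return max((len(r) // 4 for r in runs), default=0)

normal = "TGTG" + "CCTG" * 14 + "GATTACA"
print(longest_cctg_run(normal))  # -> 14, within the normal 11-26 range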

Relevance: 100.00%

Abstract:

We present the data structures and algorithms used in our approach for building domain ontologies from folksonomies and linked data. In this approach we extract domain terms from folksonomies and enrich them with semantic information from the Linked Open Data cloud. As a result, we obtain a domain ontology that combines the emergent knowledge of social tagging systems with formal knowledge from ontologies.
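The abstract names neither datasets nor APIs, so the following is only a guessed illustration of the enrichment step: looking up a tag in DBpedia, one common Linked Open Data source, via its public SPARQL endpoint. The endpoint usage and query shape are assumptions, not the paper's pipeline.

# Illustrative sketch: enrich a folksonomy tag with a DBpedia comment.
# The endpoint and query are generic examples, not the paper's approach.
import json
import urllib.parse
import urllib.request

def dbpedia_comment(term, lang="en"):
    query = f"""
    SELECT ?comment WHERE {{
      ?s rdfs:label "{term}"@{lang} ;
         rdfs:comment ?comment .
      FILTER (lang(?comment) = "{lang}")
    }} LIMIT 1
    """
    url = "https://dbpedia.org/sparql?" + urllib.parse.urlencode(
        {"query": query, "format": "application/sparql-results+json"})
    with urllib.request.urlopen(url) as resp:
        rows = json.load(resp)["results"]["bindings"]
    return rows[0]["comment"]["value"] if rows else None

print(dbpedia_comment("Data structure"))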

Relevance: 100.00%

Abstract:

This paper presents the innovations in the practical work of the Data Structures subject carried out over the last five years, covering a transition period and the first year of implementation of the European Higher Education Area. The practical coursework is inspired by a project-based methodology, and from 2008/2009 additional laboratory sessions were included in the subject schedule. We present the academic results and ratios for the mentioned time period, which show a significant improvement in students' performance.

Relevance: 100.00%

Abstract:

The main objective of this work is to present a way to emulate some functions of the mammalian visual system and a model to analyze subjective sensations and visual illusions.

Relevance: 100.00%

Abstract:

Concurrent data types are concurrent implementations of classical data abstractions, specifically designed to exploit the great deal of parallelism available in modern multiprocessor and multi-core architectures. The correct manipulation of concurrent data types is essential for the overall correctness of the software systems built using them. A major difficulty in designing and verifying concurrent data types arises from the need to reason about any number of threads invoking the data type simultaneously, which requires considering parametrized systems. In this work we study the formal verification of temporal properties of parametrized concurrent systems, with a special focus on programs that manipulate concurrent data structures. The main difficulty in reasoning about concurrent parametrized systems comes from the combination of their inherently high concurrency and the manipulation of dynamic memory. This parametrized verification problem is very challenging, because it requires reasoning about complex concurrent data structures that are accessed and modified by threads which simultaneously manipulate the heap using unstructured synchronization methods.

We present a formal framework based on deductive methods which is capable of dealing with the verification of safety and liveness properties of concurrent parametrized systems that manipulate complex data structures. Our framework includes special proof rules and techniques adapted for parametrized systems, which work in collaboration with specialized decision procedures for complex data structures. A novel aspect of our framework is that it cleanly separates the analysis of the program control flow from the analysis of the data being manipulated. The program control flow is analyzed using deductive proof rules and verification techniques specifically designed to cope with parametrized systems. Starting from a concurrent program and a temporal specification, our techniques generate a finite collection of verification conditions whose validity entails the satisfaction of the temporal specification by any client system, regardless of the number of threads. The verification conditions correspond to the data manipulation. We study the design of specialized decision procedures that deal with these verification conditions fully automatically. We investigate decidable theories capable of describing rich properties of complex pointer-based data types such as stacks, queues, lists and skiplists. For each of these theories we present a decision procedure and a practical implementation on top of existing SMT solvers. These decision procedures are ultimately used to automatically verify the verification conditions generated by our parametrized verification techniques.

Finally, we show how our framework can prove not only safety but also liveness properties of concurrent versions of some mutual exclusion protocols and of programs that manipulate concurrent data structures. The approach we present is very general and can be applied to verify a wide range of similar concurrent data types.
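The abstract does not fix a particular solver, so as a generic illustration of discharging a verification condition with an SMT solver, here is a tiny sketch using the Z3 Python bindings as a stand-in: a verification condition is valid exactly when its negation is unsatisfiable.

# Generic sketch: discharging a verification condition (VC) with an SMT
# solver via Z3's Python bindings. Valid iff the negation is unsat.
from z3 import Ints, Implies, Not, Solver, unsat

x, x1 = Ints("x x1")

# Toy VC: if x >= 0 and a transition sets x1 = x + 1, then x1 > 0.
vc = Implies(x >= 0, Implies(x1 == x + 1, x1 > 0))

s = Solver()
s.add(Not(vc))
result = s.check()
if result == unsat:
    print("VC valid")
else:
    print("VC refuted, counterexample:", s.model())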