945 results for Data Structure and Algorithms


Relevance:

100.00%

Publisher:

Abstract:

In many industries, for example the automotive industry, digital mock-ups are used to verify the design and function of a product on a virtual prototype. One use case is checking the safety clearances of individual components, the so-called clearance analysis. For specific components, engineers determine whether they maintain a prescribed safety distance to the surrounding components, both at rest and during motion. If components fall below the safety distance, their shape or position must be changed. For this it is important to know exactly which regions of the components violate the safety distance.

In this thesis we present a solution for computing, in real time, all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g. triangles). For every point in time at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety distance and call this the set of all tolerance-violating primitives. We present a holistic solution that divides into the following three major topics.

In the first part of this thesis we investigate algorithms that check, for two triangles, whether they are tolerance-violating. We present various approaches to triangle-triangle tolerance tests and show that specialized tolerance tests are significantly faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach consistently proves to be the fastest.

The second part of this thesis deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to account for the required safety distance in the design of both the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. Furthermore, we develop strategies for identifying primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before, which we call Shrubs. Previous approaches to the memory optimization of uniform grids mostly concern hashing methods, which, however, do not reduce the memory consumption of the cell contents. In our use case, neighboring cells often have similar contents. Our approach exploits this redundancy to losslessly compress the cell contents of a uniform grid to one fifth of their original size and to decompress them at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we show applications to various path-planning problems.
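The abstract describes the tolerance tests only at a high level. As a hedged illustration of the pruning idea (a standard conservative pre-test with bounding spheres, not the dual-space test developed in the thesis), two triangles can be reported as definitely non-violating when the gap between their bounding spheres already exceeds the safety distance:

```python
import math

def bounding_sphere(tri):
    """Centroid-based bounding sphere of a triangle given as three (x, y, z) points."""
    cx = sum(p[0] for p in tri) / 3.0
    cy = sum(p[1] for p in tri) / 3.0
    cz = sum(p[2] for p in tri) / 3.0
    r = max(math.dist((cx, cy, cz), p) for p in tri)
    return (cx, cy, cz), r

def may_violate_tolerance(tri_a, tri_b, safety_distance):
    """Conservative pre-test: False means the pair provably keeps the safety
    distance; True means an exact triangle-triangle tolerance test is needed."""
    ca, ra = bounding_sphere(tri_a)
    cb, rb = bounding_sphere(tri_b)
    return math.dist(ca, cb) - ra - rb <= safety_distance

# Two unit triangles far apart can be pruned without an exact test.
t1 = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
t2 = [(10, 0, 0), (11, 0, 0), (10, 1, 0)]
print(may_violate_tolerance(t1, t2, 1.0))  # False
```

Only pairs that survive such a cheap filter need one of the exact primitive-primitive tolerance tests discussed in the first part of the thesis.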

Abstract:

We present the data structures and algorithms used in our approach for building domain ontologies from folksonomies and linked data. In this approach we extract domain terms from folksonomies and enrich them with semantic information from the Linked Open Data cloud. As a result, we obtain a domain ontology that combines the emergent knowledge of social tagging systems with formal knowledge from ontologies.
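The abstract leaves the extraction step unspecified. Purely as an illustrative sketch (the function name and the frequency threshold are assumptions, not the authors' method), candidate domain terms can be drawn from a folksonomy by keeping tags that were applied often enough to suggest emergent consensus:

```python
from collections import Counter

def candidate_domain_terms(taggings, min_count=2):
    """Toy term extraction: a tag becomes a candidate domain term when it was
    applied at least `min_count` times across the (user, tag) assignments."""
    counts = Counter(tag for user, tag in taggings)
    return {tag for tag, n in counts.items() if n >= min_count}

taggings = [
    ("u1", "ontology"), ("u2", "ontology"), ("u3", "owl"),
    ("u1", "misc"), ("u2", "owl"),
]
print(sorted(candidate_domain_terms(taggings)))  # ['ontology', 'owl']
```

A real pipeline would then look each surviving term up in the Linked Open Data cloud to attach formal semantics, as the approach above describes.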

Abstract:

E-learning represents an innovation in teaching that has arisen from the development of new technologies. It is based on a set of educational resources including, among others, multimedia and interactive contents accessible through Internet or intranet networks. A whole spectrum of tools and services supports e-learning; some of them include auto-evaluation and automated correction of test-like exercises. However, this sort of exercise is strongly constrained by its nature: fixed contents and correct answers limit the ways teachers may evaluate students. In this paper we propose a new engine that allows validating complex exercises in the area of Data Structures and Algorithms. Correct solutions to exercises do not rely only on how well the code executes, or on whether the results match those expected; a set of criteria on algorithmic complexity and on correct use of the data structures is also required. The engine presented in this work covers a wide set of exercises with these characteristics, allowing teachers to establish the set of requirements for a solution, and students to obtain a measure of the quality of their solution in the same terms that are later required in exams.
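The engine itself is not described at code level. As a hedged sketch of one criterion named above (checking algorithmic complexity; the function names and the slack tolerance are assumptions), a grader can estimate a solution's growth rate empirically from operation counts at two input sizes and compare the log-log slope against the required bound:

```python
import math

def growth_exponent(op_counts):
    """Estimate k in cost ~ n^k from (n, operations) measurements, using the
    slope between the smallest and largest input size on a log-log scale."""
    (n1, c1), (n2, c2) = op_counts[0], op_counts[-1]
    return math.log(c2 / c1) / math.log(n2 / n1)

def meets_complexity_requirement(op_counts, required_exponent, slack=0.3):
    """Accept the solution when the measured exponent stays within `slack`
    of the exponent the teacher required (e.g. 1.0 for linear time)."""
    return growth_exponent(op_counts) <= required_exponent + slack

# Operation counts of a linear-time and a quadratic-time solution:
linear = [(1000, 1000), (8000, 8000)]                  # exponent ~ 1.0
quadratic = [(1000, 1_000_000), (8000, 64_000_000)]    # exponent ~ 2.0
print(meets_complexity_requirement(linear, 1.0))       # True
print(meets_complexity_requirement(quadratic, 1.0))    # False
```

Such an empirical check is of course only a heuristic; a production grader would combine it with structural checks on how the data structures are used.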

Abstract:

The design of a network is a solution to several engineering and science problems. Several network design problems are known to be NP-hard, and population-based metaheuristics like evolutionary algorithms (EAs) have been widely investigated for such problems. These optimization methods simultaneously generate a large number of potential solutions to explore the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees or, more generally, spanning forests. To efficiently explore the search space, special data structures have been developed to provide operations that manipulate a set of spanning trees (the population). For a tree with n nodes, the most efficient data structures available in the literature require O(n) time to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called the node-depth-degree representation (NDDR), and we demonstrate that with this encoding, generating a new spanning forest requires O(√n) time on average. Experiments with an EA based on the NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem show that the implementation adds only small constants and lower-order terms to the theoretical bound.
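The NDDR itself is more involved; as a hedged sketch of the underlying idea only (a plain node-depth encoding, omitting the degree information that the full NDDR adds), a spanning tree can be stored as its preorder sequence of (node, depth) pairs, which makes every subtree a contiguous slice:

```python
def subtree_slice(nd, root_index):
    """Return the contiguous slice of a node-depth array `nd` (preorder list
    of (node, depth) pairs) holding the subtree rooted at nd[root_index].
    The subtree ends at the next entry whose depth is not greater than the
    root's depth."""
    root_depth = nd[root_index][1]
    end = root_index + 1
    while end < len(nd) and nd[end][1] > root_depth:
        end += 1
    return nd[root_index:end]

# Tree: a - b - d and a - c, in preorder (a, b, d, c):
nd = [("a", 0), ("b", 1), ("d", 2), ("c", 1)]
print(subtree_slice(nd, 1))  # [('b', 1), ('d', 2)]
```

Mutation operators that prune a subtree and graft it elsewhere then amount to cutting such a slice and splicing it back with its depths shifted, which is what makes sublinear average update times plausible.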

Abstract:

Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model, which consists of eight litho-stratigraphic units, has subsequently been used to synthesise hydrogeological and hydrogeochemical data for different aquifers, in an approach that aims to demonstrate how integrating water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to the groundwater chemistry data to identify hydrochemical facies which are characteristic of distinct evolutionary pathways and a common hydrologic history of the groundwaters. Principal Component Analysis demonstrated that natural water-rock interactions, redox potential and agricultural impact are the key controls on groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct water quality groups within the Wairau Plain groundwater system. Visualising the results of the multivariate statistical analyses and the distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of the host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and present them in a format readily understood by a wide range of stakeholders, enabling more efficient communication of scientific results to the wider community.
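The statistical workflow can be illustrated with a minimal, hedged sketch (synthetic numbers, not the Wairau Plain data): for two hydrochemical variables, the first principal component is the eigenvector of the 2×2 covariance matrix belonging to the larger eigenvalue, which has a closed form.

```python
import math

def first_principal_component(xs, ys):
    """First PC of two variables via the closed-form eigendecomposition of
    their 2x2 sample covariance matrix [[sxx, sxy], [sxy, syy]]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Larger eigenvalue of the covariance matrix:
    lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    # Corresponding eigenvector (lam - syy, sxy), normalised to unit length:
    vx, vy = lam - syy, sxy
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

# Synthetic concentrations of two solutes that co-vary strongly:
nitrate = [1.0, 2.0, 3.0, 4.0]
chloride = [1.1, 1.9, 3.2, 3.8]
print(first_principal_component(nitrate, chloride))
```

With strongly co-varying solutes, PC1 loads both variables almost equally, which is the kind of pattern the study interprets as a shared evolutionary pathway.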

Abstract:

The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become common practice in dairy husbandry, and in 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time the cattle keeper spends monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method, and the results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia.
The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the numbers of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed, and a probabilistic neural network (PNN) classifier was chosen for the task. The data was divided into two parts: 5,074 measurements from 37 cows were used to train the model, and the model was then evaluated on its ability to detect lameness in the validation dataset, which held 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and in a real-time lameness monitoring system.
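The abstract gives no implementation details; as a hedged sketch (toy numbers, with the feature encoding and kernel width chosen arbitrarily for illustration), a probabilistic neural network in its simplest form is a Parzen-window classifier: each class's density is estimated as a sum of Gaussian kernels over its training samples, and a new measurement is assigned to the class with the higher density.

```python
import math

def pnn_classify(sample, training, sigma=1.0):
    """Minimal probabilistic neural network (Parzen-window) classifier.
    `training` maps a class label to its list of feature vectors."""
    def mean_kernel(examples):
        # Average Gaussian kernel response of the sample over one class.
        return sum(
            math.exp(-sum((a - b) ** 2 for a, b in zip(sample, x))
                     / (2 * sigma ** 2))
            for x in examples
        ) / len(examples)
    return max(training, key=lambda label: mean_kernel(training[label]))

# Toy leg-load features (front/rear load share): sound cows stand balanced.
training = {
    "sound": [(0.50, 0.50), (0.52, 0.48), (0.49, 0.51)],
    "lame":  [(0.70, 0.30), (0.75, 0.25), (0.68, 0.32)],
}
print(pnn_classify((0.72, 0.28), training, sigma=0.1))  # lame
```

The real system works on richer features (mean leg loads and kick counts per milking), but the decision rule has this same one-pass, training-free structure, which is why PNNs suit on-farm monitoring.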

Abstract:

Matrix decompositions, where a given matrix is represented as a product of two other matrices, are regularly used in data mining. Most matrix decompositions have their roots in linear algebra, but the needs of data mining are not always those of linear algebra: in data mining one needs results that are interpretable, and what counts as interpretable in data mining can be very different from what counts as interpretable in linear algebra.

The purpose of this thesis is to study matrix decompositions that directly address the issue of interpretability. An example is a decomposition of binary matrices where the factor matrices are assumed to be binary and the matrix multiplication is Boolean. The restriction to binary factor matrices increases interpretability, since the factor matrices are of the same type as the original matrix, and allows the use of Boolean matrix multiplication, which is often more intuitive than normal matrix multiplication with binary matrices. Several other decomposition methods are also described, and the computational complexity of computing them is studied together with the hardness of approximating the related optimization problems. Based on these studies, algorithms for constructing the decompositions are proposed. Constructing the decompositions turns out to be computationally hard, and the proposed algorithms are mostly based on various heuristics. Nevertheless, the algorithms are shown to be capable of finding good results in empirical experiments conducted with both synthetic and real-world data.
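As a hedged sketch of the decomposition model described above (the reconstruction operation only, not the thesis's construction algorithms), the Boolean matrix product replaces addition with logical OR, so a binary matrix is rebuilt from binary factors as follows:

```python
def boolean_product(B, C):
    """Boolean matrix product of binary matrices:
    result[i][j] = OR over k of (B[i][k] AND C[k][j])."""
    return [
        [int(any(B[i][k] and C[k][j] for k in range(len(C))))
         for j in range(len(C[0]))]
        for i in range(len(B))
    ]

# Rank-2 Boolean factors reconstructing a 3x3 binary matrix:
B = [[1, 0],
     [1, 1],
     [0, 1]]
C = [[1, 1, 0],
     [0, 1, 1]]
print(boolean_product(B, C))  # [[1, 1, 0], [1, 1, 1], [0, 1, 1]]
```

Note that under ordinary arithmetic the middle row would be [1, 2, 1]; the OR keeps the product binary, which is what makes the factors interpretable as overlapping patterns of the original matrix's type.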

Abstract:

The past several years have seen significant advances in the development of computational methods for predicting the structure and interactions of coiled-coil peptides. These methods are generally based on pairwise correlations of amino acids, helical propensity, thermal melts and the energetics of sidechain interactions, as well as statistical patterns captured by Hidden Markov Model (HMM) and Support Vector Machine (SVM) techniques. They are complemented by a number of public databases that contain sequences, motifs, domains and other details of coiled-coil structures identified by various algorithms. Some of these computational methods predict coiled-coil structure on the basis of sequence information alone; however, predicting the oligomerisation state of these peptides remains largely an open question due to the dynamic behaviour of these molecules. This review focuses on existing in silico methods for the prediction of coiled-coil peptides of functional importance using sequence and/or three-dimensional structural data.
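The review stays at method level; purely as an illustrative toy (the residue set and the scoring rule are assumptions, far simpler than the HMM/SVM methods discussed), sequence-based coiled-coil scanners exploit the heptad repeat (abcdefg)n, in which positions a and d are usually hydrophobic:

```python
HYDROPHOBIC = set("LIVMFAY")  # toy residue set chosen for illustration

def heptad_score(seq, register_offset=0):
    """Fraction of heptad a/d positions (indices 0 and 3 modulo 7, for a
    register starting at `register_offset`) holding a hydrophobic residue."""
    ad_positions = [i for i in range(len(seq))
                    if (i - register_offset) % 7 in (0, 3)]
    if not ad_positions:
        return 0.0
    hits = sum(seq[i] in HYDROPHOBIC for i in ad_positions)
    return hits / len(ad_positions)

def best_register_score(seq):
    """Scan all seven possible registers and keep the best a/d score."""
    return max(heptad_score(seq, off) for off in range(7))

# An idealised leucine-zipper-like repeat scores perfectly in one register:
print(best_register_score("LKQLEDKLEELESKLQQLEDK"))  # 1.0
```

Real predictors replace this crude count with position-specific statistics learned from known coiled coils, but the register-scanning skeleton is the same.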

Abstract:


The torsional potential functions Vt(φ) and Vt(ψ) around the single bonds N–Cα and Cα–C, which can be used in conformational studies of oligopeptides, polypeptides and proteins, have been derived using crystal structure data of 22 globular proteins, fitting the observed distribution in the (φ, ψ)-plane with the value of Vtot(φ, ψ) via the Boltzmann distribution. The averaged torsional potential functions, obtained from various amino acid residues in the L-configuration, are Vt(φ) = –1.0 cos (φ + 60°); Vt(ψ) = –0.5 cos (ψ + 60°) – 1.0 cos (2ψ + 30°) – 0.5 cos (3ψ + 30°). The dipeptide energy maps Vtot(φ, ψ) obtained using these functions, instead of the normally accepted torsional functions, were found to explain various observations, such as the absence of the left-handed alpha helix and the C7 conformation, and the relatively high density of points near the line ψ = 0°. These functions, derived from observational data on protein structures, will, it is hoped, explain various previously unexplained facts in polypeptide conformation.
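The derived functions can be evaluated directly; the sketch below simply transcribes the averaged potentials quoted above (angles in degrees, energies in the paper's units), with the torsional part of the dipeptide map taken as their sum:

```python
import math

def vt_phi(phi_deg):
    """Averaged torsional potential about N-Cα: Vt(φ) = -1.0 cos(φ + 60°)."""
    return -1.0 * math.cos(math.radians(phi_deg + 60))

def vt_psi(psi_deg):
    """Averaged torsional potential about Cα-C:
    Vt(ψ) = -0.5 cos(ψ + 60°) - 1.0 cos(2ψ + 30°) - 0.5 cos(3ψ + 30°)."""
    p = psi_deg
    return (-0.5 * math.cos(math.radians(p + 60))
            - 1.0 * math.cos(math.radians(2 * p + 30))
            - 0.5 * math.cos(math.radians(3 * p + 30)))

def torsional_energy(phi_deg, psi_deg):
    """Torsional contribution to the dipeptide map at a (φ, ψ) grid point."""
    return vt_phi(phi_deg) + vt_psi(psi_deg)

print(round(vt_phi(120.0), 6))  # cos(180°) = -1, so Vt(120°) = 1.0
```

Scanning torsional_energy over a (φ, ψ) grid reproduces the torsional part of the dipeptide energy maps discussed above; the full Vtot adds the non-bonded terms.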

Abstract:

Efavirenz, (S)-6-chloro-4-(cyclopropylethynyl)-1,4-dihydro-4-(trifluoromethyl)-2H-3,1-benzoxazin-2-one, is an anti-HIV agent belonging to the class of non-nucleoside inhibitors of the HIV-1 reverse transcriptase. A systematic quantum chemical study of the possible conformations of efavirenz, their relative stabilities and their vibrational spectra is reported. Structural and spectral characteristics of efavirenz have been studied by vibrational spectroscopy and quantum chemical methods. Density functional theory (DFT) calculations of the potential energy curve, optimized geometries and vibrational spectra have been carried out using 6-311++G(d,p) basis sets and the B3LYP functional. Based on these results, we discuss the correlation between the vibrational modes and the crystalline structure of the most stable form of efavirenz. A complete analysis of the experimental infrared and Raman spectra is reported on the basis of the wavenumbers of the vibrational bands and the potential energy distribution. The infrared and Raman spectra of the molecule based on the DFT calculations show reasonable agreement with the experimental results. The calculated HOMO and LUMO energies show that charge transfer occurs within the molecule.

Abstract:

The Fourier transform Raman and infrared (IR) spectra of ceramide 3 (CER3) have been recorded in the regions 200-3500 cm(-1) and 680-4000 cm(-1), respectively. We have calculated the equilibrium geometry, harmonic vibrational wavenumbers, electrostatic potential surfaces, absolute Raman scattering activities and IR absorption intensities by density functional theory with the B3LYP functional and the extended 6-311G basis set. This work was undertaken to study the vibrational spectra of CER3 completely and to identify the various normal modes with better wavenumber accuracy. Good consistency is found between the calculated results and the experimental data for the IR and Raman spectra.

Abstract:

In this paper the magnetic and magneto-optical properties of amorphous rare earth-transition metal (RE-TM) alloys, as well as the magnetic coupling in multi-layer thin films for high-density optical data storage, are presented. Using the magnetic effect in scanning tunneling microscopy, the cluster structure of amorphous RE-TM thin films has been observed, and the perpendicular magnetic anisotropy of these films has been interpreted. Experimental results on the rapid phase transformation of amorphous semiconductor and metallic alloy thin films under short-pulse laser irradiation, for phase-change optical recording, are reported. A step-by-step phase transformation process through metastable states has been observed. The waveform of crystallization propagation in a micro-size spot during laser recording in amorphous semiconductor thin films is characterized, and rapid recording and erasing mechanisms for high-performance optical data storage are discussed. The nonlinear optical effects in amorphous alloy thin films have been studied: through the photo-thermal effect or third-order optical nonlinearity, optical self-focusing is observed in amorphous mask thin films. The application of amorphous thin films with a super-resolution near-field structure to high-density optical data storage is demonstrated.

Abstract:

This study examines the ability of the general a(N)-index (GAI) to predict chromatographic behavior. The test is performed on various types of organophosphorus compounds. The results demonstrate that the GAI correlates well with chromatographic properties.