84 results for Algorithmen


Relevance:

10.00%

Publisher:

Abstract:

Time series are ubiquitous. The acquisition and processing of continuously measured data are found throughout the natural sciences, medicine, and finance. The enormous growth of recorded data volumes, whether from automated monitoring systems or embedded sensors, demands exceptionally fast algorithms in both theory and practice. This thesis is therefore concerned with the efficient computation of subsequence alignments. Complex algorithms such as anomaly detection, motif queries, or the unsupervised extraction of prototypical building blocks in time series make heavy use of these alignments, which motivates the need for fast implementations. This thesis comprises three approaches that address this challenge: four alignment algorithms and their parallelization on CUDA-capable hardware, an algorithm for segmenting data streams, and a unified treatment of Lie-group-valued time series.

The first contribution is a complete CUDA port of the UCR-Suite, the world-leading implementation of subsequence alignment. It includes a new computation scheme for determining local alignment scores under the z-normalized Euclidean distance, which can be deployed on any parallel hardware that supports fast Fourier transforms. Furthermore, we present an SIMT-compliant implementation of the UCR-Suite's lower-bound cascade for the efficient computation of local alignment scores under Dynamic Time Warping. Both CUDA implementations compute results one to two orders of magnitude faster than established methods.

Second, we investigate two linear-time approximations for the elastic alignment of subsequences. On the one hand, we treat an SIMT-compliant relaxation scheme for Greedy DTW and its efficient CUDA parallelization. On the other hand, we introduce a new local distance measure, the Gliding Elastic Match (GEM), which can be computed with the same asymptotic time complexity as Greedy DTW but offers a full relaxation of the penalty matrix. Further improvements include invariance to trends on the measurement axis and to uniform scaling on the time axis. In addition, an extension of GEM to multi-shape segmentation is discussed and evaluated on motion data. Both CUDA parallelizations achieve runtime improvements of up to two orders of magnitude.

In the literature, the treatment of time series is usually restricted to real-valued measurements. The third contribution is a unified method for handling Lie-group-valued time series. Building on this, distance measures on the rotation group SO(3) and on the Euclidean group SE(3) are treated, and memory-efficient representations as well as group-compatible extensions of elastic measures are discussed.
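The FFT-based scheme for local alignment scores under the z-normalized Euclidean distance follows the well-known MASS/UCR-Suite identity. Below is a minimal NumPy sketch of that computation, assuming the standard formulation; it is not the thesis's CUDA kernel, and the DTW lower-bound cascade is not reproduced.

```python
import numpy as np

def sliding_znorm_dist(T, Q):
    """Z-normalized Euclidean distance between query Q and every
    length-m subsequence of T, via one FFT-based sliding dot product."""
    T, Q = np.asarray(T, float), np.asarray(Q, float)
    n, m = len(T), len(Q)
    # Sliding dot products QT[i] = Q . T[i:i+m] via zero-padded FFT convolution.
    QT = np.fft.irfft(np.fft.rfft(T, 2 * n) * np.fft.rfft(Q[::-1], 2 * n))[m - 1:n]
    # Rolling mean / std of T over windows of length m from cumulative sums.
    cs = np.cumsum(np.insert(T, 0, 0.0))
    cs2 = np.cumsum(np.insert(T * T, 0, 0.0))
    mu = (cs[m:] - cs[:-m]) / m
    sigma = np.sqrt(np.maximum((cs2[m:] - cs2[:-m]) / m - mu**2, 1e-12))
    # Closed form: dist^2 = 2m * (1 - (QT - m*mu_Q*mu) / (m*sigma_Q*sigma)).
    d2 = 2.0 * m * (1.0 - (QT - m * Q.mean() * mu) / (m * Q.std() * sigma))
    return np.sqrt(np.maximum(d2, 0.0))
```

Because the FFT dominates the cost, one pass over a series of length n takes O(n log n) regardless of the query length, which is what makes the scheme attractive for parallel hardware.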

Relevance:

10.00%

Publisher:

Abstract:

We report the case of a 72-year-old woman with known metastatic breast cancer who presented to the emergency department with progressive dyspnea on exertion and chest pain. The diagnosis of pulmonary embolism could be established by pulmonary scintigraphy after computed tomography and ultrasound of the lower extremities had been negative despite a moderate clinical pretest probability (Wells score). This case shows that even when suspected pulmonary embolism is managed with algorithms combining clinical probability, computed tomography, and ultrasound, we must remain aware that the diagnosis may still be missed, and continue investigating cases with elevated clinical probability.

Relevance:

10.00%

Publisher:

Abstract:

After 20 years of silence, two recent references from the Czech Republic (Bezpečnostní softwarová asociace, Case C-393/09) and from the English High Court (SAS Institute, Case C-406/10) touch upon several questions that are fundamental for the extent of copyright protection for software under the Computer Program Directive 91/250 (now 2009/24) and the Information Society Directive 2001/29. In Case C-393/09, the European Court of Justice held that “the object of the protection conferred by that directive is the expression in any form of a computer program which permits reproduction in different computer languages, such as the source code and the object code.” As “any form of expression of a computer program must be protected from the moment when its reproduction would engender the reproduction of the computer program itself, thus enabling the computer to perform its task,” a graphical user interface (GUI) is not protected under the Computer Program Directive, as it does “not enable the reproduction of that computer program, but merely constitutes one element of that program by means of which users make use of the features of that program.” While the definition of a computer program and the exclusion of GUIs mirror earlier jurisprudence in the Member States and therefore do not come as a surprise, the main significance of Case C-393/09 lies in its interpretation of the Information Society Directive. In confirming that a GUI “can, as a work, be protected by copyright if it is its author’s own intellectual creation,” the ECJ continues the Europeanization of the definition of “work” which began in Infopaq (Case C-5/08). Moreover, the Court elaborated this concept further by excluding from copyright protection expressions that are dictated by their technical function. Even more importantly, the ECJ held that a television broadcast of a GUI does not constitute a communication to the public, as the individuals cannot have access to the “essential element characterising the interface,” i.e., the interaction with the user. The exclusion of elements dictated by technical function from copyright protection and the interpretation of the right of communication to the public with reference to the “essential element characterising” the work may be seen as welcome limitations of copyright protection in the interest of a free public domain which were not yet apparent in Infopaq. While Case C-393/09 has given a first definition of the computer program, the pending reference in Case C-406/10 is likely to clarify the scope of protection against non-literal copying, namely how far the protection extends beyond the text of the source code to the design of a computer program, and where the limits of protection lie as regards the functionality of a program and mere “principles and ideas.” In light of the travaux préparatoires, it is submitted that the ECJ is likely to grant protection for the design of a computer program as well, while excluding both the functionality and the underlying principles and ideas from protection under the European copyright directives.

Relevance:

10.00%

Publisher:

Abstract:

Although distribution centers (DCs) are core elements of supply chains, there is currently no structured methodology for planning them objectively, systematically, and, in particular, holistically across all functional areas, from goods receipt through order picking to goods dispatch. This article addresses this research gap and describes how, using analytically modeled standard modules within the various functional areas of a DC, cross-functional DC variants can be generated by applying a graph-theoretic approach. Modified shortest-path algorithms are then applied to automatically determine the optimal combination of standard modules, i.e., the optimal DC variant.
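To illustrate the graph-theoretic selection step, here is a minimal Python sketch: standard modules become nodes of a layered directed graph and a plain shortest-path search picks the cheapest cross-functional module combination. It uses textbook Dijkstra rather than the article's modified algorithms, and all module names and edge costs are hypothetical.

```python
import heapq

def cheapest_dc_variant(graph, source, target):
    """Dijkstra over a layered module graph: nodes are standard modules
    of the functional areas, edge weights are (hypothetical) costs."""
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # Walk back from dispatch to receipt to recover the module chain.
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[target]

# Hypothetical module graph: goods receipt -> storage -> picking -> dispatch.
graph = {
    "IN": [("pallet_rack", 3.0), ("shuttle_asrs", 5.0)],
    "pallet_rack": [("person_to_goods", 4.0)],
    "shuttle_asrs": [("goods_to_person", 2.0)],
    "person_to_goods": [("OUT", 2.0)],
    "goods_to_person": [("OUT", 1.0)],
}
print(cheapest_dc_variant(graph, "IN", "OUT"))
# -> (['IN', 'shuttle_asrs', 'goods_to_person', 'OUT'], 8.0)
```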

Relevance:

10.00%

Publisher:

Abstract:

Postpartum hemorrhage (PPH) is one of the main causes of maternal death even in industrialized countries. It represents an emergency situation that necessitates a rapid decision and, in particular, an exact diagnosis and root cause analysis in order to initiate the correct therapeutic measures in interdisciplinary cooperation. In addition to established guidelines, the benefits of standardized therapy algorithms have been demonstrated; however, a therapy algorithm for the obstetric emergency of postpartum hemorrhage was not yet available in the German language. The establishment of an international (Germany, Austria, and Switzerland; D-A-CH) "treatment algorithm for postpartum hemorrhage" was an interdisciplinary project based on the guidelines of the corresponding specialist societies (anesthesia and intensive care medicine, and obstetrics) in the three countries, as well as on comparable international algorithms for the therapy of PPH. Obstetrics and anesthesiology personnel must possess sufficient expertise for these emergency situations despite low case numbers. The rarity of such events for the individual practitioner and their life-threatening nature necessitate a structured approach following a predetermined treatment algorithm. Furthermore, such an algorithm provides the opportunity to train for emergency situations in an interdisciplinary team.

Relevance:

10.00%

Publisher:

Abstract:

Stochastic models are of particular importance in the valuation of claim amounts for insurance companies. The book gives an introduction to the models used for small and large claims as well as to the stochastic processes of actuarial risk theory (counting processes and the Poisson process). Its central topic is the analysis of the ruin probability, for which exact computation methods, asymptotic approximations, and numerical algorithms such as Monte Carlo simulation and the fast Fourier transform are presented. An appendix with important results from probability theory facilitates reading the book.
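As a flavor of the numerical algorithms mentioned, the following is a minimal Python sketch of a crude Monte Carlo estimate of the finite-horizon ruin probability in the Cramér-Lundberg model. The exponential claim-size distribution and all parameter values are illustrative assumptions, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(42)

def ruin_probability(u, c, lam, mean_claim, horizon, n_paths=10_000):
    """Monte Carlo estimate of the finite-horizon ruin probability in the
    Cramer-Lundberg model R(t) = u + c*t - S(t): simulate Poisson claim
    arrivals and check the surplus at each claim epoch."""
    ruined = 0
    for _ in range(n_paths):
        surplus, t = u, 0.0
        while True:
            dt = rng.exponential(1.0 / lam)  # inter-arrival time
            t += dt
            if t > horizon:
                break                        # path survived the horizon
            surplus += c * dt - rng.exponential(mean_claim)
            if surplus < 0.0:
                ruined += 1
                break
    return ruined / n_paths

# Initial capital 10, premium rate 1.2, claim rate 1, mean claim size 1
# (i.e. a 20% safety loading), observed over 100 time units.
print(ruin_probability(u=10.0, c=1.2, lam=1.0, mean_claim=1.0, horizon=100.0))
```

Since the surplus can only become negative at a claim arrival, checking ruin at claim epochs suffices; more refined approaches (importance sampling, the fast Fourier transform for the aggregate claim distribution) reduce the variance or the cost of this crude estimator.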

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE In this study, the "Progressive Resolution Optimizer PRO3" (Varian Medical Systems) is compared to the previous version "PRO2" with respect to its potential to improve dose sparing of the organs at risk (OAR) and dose coverage of the PTV for head and neck cancer patients. MATERIALS AND METHODS Volumetric modulated arc therapy (VMAT) treatment plans were generated for eight head and neck cancer patients. All cases have 2-3 phases, and the total prescribed dose (PD) was 60-72 Gy in the PTV. The study focuses mainly on the phase 1 plans, which all have an identical PD of 54 Gy and complex PTV structures overlapping the parotids. Optimization was performed based on planning objectives for the PTV according to ICRU 83, with minimal dose to the spinal cord and to the parotids outside the PTV. In order to assess the quality of the optimization algorithms, an identical set of constraints was used for both PRO2 and PRO3. The resulting treatment plans were investigated with respect to dose distribution based on an analysis of the dose-volume histograms. RESULTS For the phase 1 plans (PD = 54 Gy), the near-maximum dose D2% of the spinal cord could be reduced to 22±5 Gy with PRO3, compared to 32±12 Gy with PRO2, averaged over all patients. The mean dose to the parotids was also lower in PRO3 plans than in PRO2 plans, but the differences were less pronounced. A PTV coverage of V95% = 97±1% could be reached with PRO3, compared to 86±5% with PRO2. In clinical routine, these PRO2 plans would require modifications to obtain better PTV coverage at the cost of higher OAR doses. CONCLUSION A comparison between the PRO3 and PRO2 optimization algorithms was performed for eight head and neck cancer patients. In general, the quality of VMAT plans for head and neck patients is improved with PRO3 as compared to PRO2. The dose to OARs can be reduced significantly, especially for the spinal cord, and these reductions are achieved with better PTV coverage than with PRO2. The improved spinal cord sparing offers new opportunities for all types of paraspinal tumors and for re-irradiation of recurrent tumors or second malignancies.
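For readers unfamiliar with the DVH metrics quoted above, D2% (near-maximum dose) and V95% (coverage) can be read directly off a per-voxel dose array, as in this minimal Python sketch; the dose values below are synthetic, not data from the study.

```python
import numpy as np

def d_percent(dose, p):
    """Dose received by the 'hottest' p% of the volume (e.g. D2%):
    the (100-p)-th percentile of the per-voxel dose distribution."""
    return np.percentile(dose, 100.0 - p)

def v_percent(dose, prescribed, p=95.0):
    """Percentage of the volume receiving at least p% of the
    prescribed dose (e.g. V95% for PTV coverage)."""
    return np.mean(dose >= prescribed * p / 100.0) * 100.0

# Illustrative per-voxel doses (Gy) for a phase 1 plan with PD = 54 Gy.
ptv_dose = np.random.default_rng(0).normal(54.5, 1.2, 10_000)
cord_dose = np.random.default_rng(1).normal(18.0, 3.0, 10_000)
print(f"PTV V95% = {v_percent(ptv_dose, 54.0):.1f}%")
print(f"Spinal cord D2% = {d_percent(cord_dose, 2.0):.1f} Gy")
```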

Relevance:

10.00%

Publisher:

Abstract:

In this work, we present a novel method to compensate for movement in images acquired during free breathing using first-pass gadolinium-enhanced myocardial perfusion magnetic resonance imaging (MRI). First, we use independent component analysis (ICA) to identify the optimal number of independent components (ICs) that separate the breathing motion from the intensity change induced by the contrast agent. Then, synthetic images are created by recombining the ICs, but unlike previously published work (Milles et al. 2008), we omit the component related to motion; the resulting reference image series is therefore free of motion. Motion compensation is then achieved by using a multi-pass non-rigid image registration scheme. We tested our method on 15 distinct image series (5 patients) consisting of 58 images each, and we validated it by comparing manually tracked intensity profiles of the myocardial sections to automatically generated ones before and after registration. The average correlation to the manually obtained curves was increased from 0.89 ± 0.11 before registration to 0.98 ± 0.02 after registration.
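The decompose-and-recombine step can be sketched with scikit-learn's FastICA as below; the frame-matrix layout, the component count, and the index of the motion-related IC are assumptions here, and the paper's automatic identification of that component as well as the subsequent multi-pass non-rigid registration are not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

def motion_free_reference(frames, n_components, motion_ic):
    """Build a synthetic, motion-free reference image series: decompose
    the (n_frames x n_pixels) matrix into independent components, zero
    the component assumed to carry the breathing motion, and mix the
    remaining ICs back into image space."""
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(frames)    # shape: (n_frames, n_components)
    sources[:, motion_ic] = 0.0            # omit the motion-related IC
    return ica.inverse_transform(sources)  # motion-free synthetic frames
```

Each frame is flattened into one row, so the ICs are temporal signals (contrast uptake, breathing) with spatial mixing maps; zeroing the breathing IC before the inverse transform is what yields the motion-free reference series used as the registration target.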

Relevance:

10.00%

Publisher:

Abstract:

Hybrid simulation is a technique that combines experimental and numerical testing and has been used for decades in the fields of aerospace, civil, and mechanical engineering. During this time, most of the research has focused on developing algorithms and the necessary technology, including, but not limited to, error minimisation techniques, phase lag compensation, and faster hydraulic cylinders. However, one of the main shortcomings of hybrid simulation that has prevented its widespread use is the size of the numerical models and the effect that higher frequencies may have on the stability and accuracy of the simulation. The first chapter of this document provides an overview of the hybrid simulation method, the different hybrid simulation schemes, and the corresponding time integration algorithms that are most commonly used in this field. The scope of this thesis is presented in more detail in chapter 2: a substructure algorithm, the Substep Force Feedback (Subfeed), is adapted in order to fulfil the necessary speed requirements. The effects of more complex models on the Subfeed are also studied in detail, and the improvements made are validated experimentally. Chapters 3 and 4 detail the methodologies used to accomplish these objectives, listing the different cases studied and describing the hardware and software used to validate them experimentally. The third chapter also contains a brief introduction to a project, the DFG Subshake, whose data have been used as a starting point for the developments shown later in this thesis. The results are presented in chapters 5 and 6, the first focusing on purely numerical simulations, while the second is oriented towards practical application, including experimental real-time hybrid simulation tests with large numerical models. Chapter 7 lists the hardware and software requirements that must be met in order to apply the methods described in this document. The last chapter, chapter 8, presents the conclusions and achievements drawn from the results, namely: the adaptation of the hybrid simulation algorithm Subfeed for use in conjunction with large numerical models, the study of the effect of high frequencies on the substructure algorithm, and experimental real-time hybrid simulation tests with vibrating subsystems using large numerical models and shake tables. A brief discussion of possible future research activities can also be found in this concluding chapter.