942 results for Time-memory attacks
Abstract:
One-transistor floating-body random access memory retention time distribution is investigated on silicon-on-insulator UTBOX devices. It is shown that the average retention time can be improved by two to three orders of magnitude by reducing the body-junction electric field. However, the retention time distribution, which is mainly caused by the generation-recombination center density variation, remains similar.
Abstract:
The few studies that have investigated judgments of time have suggested that the memory of duration is distorted more for emotional events than for neutral events, while in contrast there is abundant evidence that other aspects of memories of emotional events are more accurate. To reconcile this apparent discrepancy, we used a procedure in which the participants learned a standard duration over several trials under three emotional conditions: a threatening, a nonthreatening, and a neutral control condition. They were then tested either immediately or 24 h after learning. In this test phase, they had to indicate whether presented comparison durations were or were not the same as the previously learned standard duration. We found that durations were recalled better in the emotional than in the neutral condition, and that this occurred to a greater extent in the threatening than in the nonthreatening condition. Arousing emotions thus enhanced temporal memory, just as they enhance memory for other aspects of emotional events.
Abstract:
Studies of subjective time have adopted different methods to understand different processes of time perception. Four sculptures, with implied movement ranked as 1.5-, 3.0-, 4.5-, and 6.0-point stimuli on the Body Movement Ranking Scale, were randomly presented to 42 university students untrained in visual arts and ballet. Participants were allowed to observe the images for any length of time (exploration time) and, immediately after each image was observed, recorded the duration as they perceived it. The results of the temporal ratio (exploration time/time estimation) showed that the exploration time of images also affected the perception of time, i.e., the subjective time for sculptures representing implied movement was overestimated.
Abstract:
The floating-body RAM sense margin and retention-time dependence on the gate length is investigated in UTBOX devices using BJT programming combined with a positive back bias (so-called Vth feedback). It is shown that the sense margin and the retention time can be kept constant versus the gate length by using a positive back bias. Nevertheless, below a critical L, there is no room for optimization, and the memory performance suddenly drops. The mechanism behind this degradation is attributed to GIDL current amplification by the lateral bipolar transistor with a narrow base. The gate length can be further scaled using underlap junctions.
Abstract:
Clinical and experimental evidence suggests that estrogens have a major impact on cognition, exerting neurotrophic and neuroprotective actions in regions involved in this function. In contrast, some studies indicate that certain hormone therapy regimens may have detrimental effects on female cognitive and neurological function. We therefore investigated how estrogen treatment influences cognition and depression at different ages. To that end, this study assessed the effects of chronic 17 beta-estradiol treatment on cognition and depressive-like behaviors of young (3 months old), adult (7 months old) and middle-aged (12 months old) reproductive female Wistar rats. These functions were also correlated with alterations in the serotonergic system, as well as in hippocampal BDNF. 17 beta-Estradiol treatment did not influence the animals' locomotor activity and exploratory behavior, but it improved the performance of adult and middle-aged rats in the Morris water maze, the latter being more responsive to the treatment. Young and adult rats displayed decreased immobility time in the forced swimming test, suggesting an effect of 17 beta-estradiol also on such depressive-like behavior. The same test revealed increased swimming behavior, triggered by the serotonergic pathway, in adult rats. Neurochemical evaluations indicated that 17 beta-estradiol treatment increased the serotonin turnover rate in the hippocampus of adult rats. Interestingly, estrogen treatment increased BDNF levels in animals of all ages. These findings support the notion that the beneficial effects of 17 beta-estradiol on spatial reference memory and depressive-like behavior are evident only when hormone therapy occurs at early ages and early stages of hormonal decline. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Most superdiffusive non-Markovian random walk models assume that correlations are maintained at all time scales, e.g., fractional Brownian motion, Lévy walks, the Elephant walk and Alzheimer walk models. In the latter two models the random walker can always "remember" the initial times near t = 0. Assuming jump size distributions with finite variance, the question naturally arises: is superdiffusion possible if the walker is unable to recall the initial times? We give a conclusive answer to this general question by studying a non-Markovian model in which the walker's memory of the past is weighted by a Gaussian centered at time t/2, at which the walker had one half its present age, and with a standard deviation sigma_t that grows linearly as the walker ages. For large widths we find that the model behaves similarly to the Elephant model, but for small widths this Gaussian memory profile model behaves like the Alzheimer walk model. We also report that the phenomenon of amnestically induced persistence, known to occur in the Alzheimer walk model, arises in the Gaussian memory profile model. We conclude that memory of the initial times is not a necessary condition for generating (log-periodic) superdiffusion.
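The memory rule described above lends itself to a direct simulation. The following is a minimal sketch under stated assumptions: at each step the walker recalls a past time drawn from a Gaussian centered at t/2 with standard deviation proportional to t, then repeats the recalled step with probability f and reverses it otherwise. The function name, the repeat probability f, and the relative width are our own illustrative choices, not the paper's exact notation.

```python
import numpy as np

def gaussian_memory_walk(n_steps, f=0.6, width=0.2, rng=None):
    """Non-Markovian walk with a Gaussian memory profile centered at t/2.

    f     -- probability of repeating the recalled step (else it is reversed)
    width -- sigma_t = width * t, the linearly growing memory width
    """
    rng = rng or np.random.default_rng(0)
    steps = [1 if rng.random() < 0.5 else -1]  # first step is random
    for t in range(1, n_steps):
        # recall a past time drawn from N(t/2, (width*t)^2), clipped to [0, t-1]
        recalled = int(np.clip(rng.normal(t / 2, max(width * t, 1e-9)), 0, t - 1))
        s = steps[recalled]
        steps.append(s if rng.random() < f else -s)
    return np.cumsum(steps)  # walker position over time

x = gaussian_memory_walk(10_000)
```

Varying `width` interpolates between the two regimes mentioned above: large widths reach back toward the initial times (Elephant-like), small widths concentrate the memory around t/2 (Alzheimer-walk-like).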
Abstract:
Glucose metabolism and insulin signaling disruptions in the brain have been proposed as a likely etiology of Alzheimer's disease. The aim of the present study was to investigate the time course of cognitive impairments induced by intracerebroventricular injection of streptozotocin (STZ) in rats and correlate them with the ensuing neurodegenerative process. Early and late effects of STZ were evaluated by using the reference and working memory versions of the Morris' water maze task and the evaluation of neurodegenerative markers by immunoblotting and the Fluoro-jade C histochemistry. The results revealed different types of behavioral and neurodegenerative responses, with distinct time courses. We observed an early disruption on the working memory as early as 3 h after STZ injections, which was followed by degenerative processes in the hippocampus at 1 and 15 days after STZ injections. Memory disruption increases over time and culminates with significant changes in amyloid-beta peptide and hyperphosphorylated Tau protein levels in distinct brain structures. These findings add information on the Alzheimer's disease-like STZ animal model and on the mechanisms underlying neurodegenerative processes. (C) 2012 Elsevier Inc. All rights reserved.
Abstract:
Cognitive dysfunction is found in patients with brain tumors and there is a need to determine whether it can be replicated in an experimental model. In the present study, the object recognition (OR) paradigm was used to investigate cognitive performance in nude mice, which represent one of the most important animal models available to study human tumors in vivo. Mice with orthotopic xenografts of the human U87MG glioblastoma cell line were trained at 9, 14, and 18 days (D9, D14, and D18, respectively) after implantation of 5×10^5 cells. At D9, the mice showed normal behavior when tested 90 min or 24 h after training and compared to control nude mice. Animals at D14 were still able to discriminate between familiar and novel objects, but exhibited a lower performance than animals at D9. Total impairment in OR memory was observed when animals were evaluated on D18. These alterations were detected earlier than any other clinical symptoms, which were observed only 22-24 days after tumor implantation. There was a significant correlation between the discrimination index (d2) and time after tumor implantation, as well as between d2 and tumor volume. These data indicate that the OR task is a robust test to identify early behavioral alterations caused by glioblastoma in nude mice. In addition, these results suggest that the OR task can be a reliable tool to test the efficacy of new therapies against these tumors.
Abstract:
This work provides a forward step in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we bring the reader through the fundamental notions of probability and stochastic processes, stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focus on fractional Brownian motion (fBm) and its discrete-time increment process, the fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modeling and estimation of long memory, or long-range dependence (LRD). Time series manifesting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We study LRD in depth, giving many real-data examples, providing statistical analysis and introducing parametric methods of estimation. Then we introduce the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modeling systems with long-memory properties. After introducing the basic concepts, we provide many examples and applications. For instance, we investigate the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems deviating from the classical exponential Debye pattern. Then we focus on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations are obtained by using fractional integrals and derivatives of distributed orders.
To find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduce and study the generalized grey Brownian motion (ggBm), a parametric class of H-sssi processes whose marginal probability density function evolves in time according to a partial integro-differential equation of fractional type. The ggBm is non-Markovian. Throughout the work, we remark many times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focus on a subclass made up of processes with stationary increments. The ggBm has been defined canonically in the so-called grey noise space; however, we are able to provide a characterization independent of the underlying probability space. We also point out that the generalized grey Brownian motion is a direct generalization of a Gaussian process: in particular, it generalizes both Brownian motion and fractional Brownian motion. Finally, we introduce and analyze a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We start from the forward drift equation, which is made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation is interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then consider the subordinated process Y(t) = X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation involving the same memory kernel K(t).
We developed several applications and derived the exact solutions. Moreover, we considered different stochastic models for the given equations, providing path simulations.
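As a small illustration of the long-range dependence discussed above: the autocovariance of unit-variance fractional Gaussian noise has a closed form, and for Hurst exponent H > 1/2 it is positive and decays so slowly (asymptotically proportional to k^(2H-2)) that it is non-summable. The sketch below is our own illustration of this standard formula, not code from the thesis.

```python
import numpy as np

def fgn_autocovariance(k, H):
    """Autocovariance of unit-variance fractional Gaussian noise at lag k:
    gamma(k) = 0.5 * (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H})."""
    k = np.abs(k)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + np.abs(k - 1) ** (2 * H))

lags = np.arange(0, 100)
gamma = fgn_autocovariance(lags, H=0.8)  # H > 1/2: long-range dependence
# gamma(0) = 1, and the positive covariances decay like k^(2H-2) = k^(-0.4),
# too slowly to be summable -- the hallmark of LRD.
```

For H = 1/2 the formula gives gamma(k) = 0 for all k > 0 (ordinary white noise), and for H < 1/2 the lag-1 covariance turns negative (anti-persistence), which is why H is the single parameter controlling the memory of fGn.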
Abstract:
Introduction and aims of the research Nitric oxide (NO) and endocannabinoids (eCBs) are major retrograde messengers, involved in synaptic plasticity (long-term potentiation, LTP, and long-term depression, LTD) in many brain areas (including hippocampus and neocortex), as well as in learning and memory processes. NO is synthesized by NO synthase (NOS) in response to increased cytosolic Ca2+ and mainly exerts its functions through soluble guanylate cyclase (sGC) and cGMP production. The main target of cGMP is the cGMP-dependent protein kinase (PKG). Activity-dependent release of eCBs in the CNS leads to the activation of the Gαi/o-coupled cannabinoid receptor 1 (CB1) at both glutamatergic and inhibitory synapses. The perirhinal cortex (Prh) is a multimodal associative cortex of the temporal lobe, critically involved in visual recognition memory. LTD is proposed to be the cellular correlate underlying this form of memory. Cholinergic neurotransmission has been shown to play a critical role in both visual recognition memory and LTD in Prh. Moreover, visual recognition memory is one of the main cognitive functions impaired in the early stages of Alzheimer’s disease. The main aim of my research was to investigate the role of NO and eCBs in synaptic plasticity in rat Prh and in visual recognition memory. Part of this research was dedicated to the study of synaptic transmission and plasticity in a murine model (Tg2576) of Alzheimer’s disease. Methods Field potential recordings. Extracellular field potential recordings were carried out in horizontal Prh slices from Sprague-Dawley or Dark Agouti juvenile (p21-35) rats. LTD was induced with a single train of 3000 pulses delivered at 5 Hz (10 min), or via bath application of carbachol (Cch; 50 μM) for 10 min. LTP was induced by theta-burst stimulation (TBS). In addition, input/output curves and 5Hz-LTD were carried out in Prh slices from 3-month-old Tg2576 mice and littermate controls. Behavioural experiments. 
The spontaneous novel object exploration task was performed in intra-Prh bilaterally cannulated adult Dark Agouti rats. Drugs or vehicle (saline) were directly infused into the Prh 15 min before training to verify the role of nNOS and CB1 in visual recognition memory acquisition. Object recognition memory was tested 20 min and 24 h after the end of the training phase. Results Electrophysiological experiments in Prh slices from juvenile rats showed that 5Hz-LTD is due to the activation of the NOS/sGC/PKG pathway, whereas Cch-LTD relies on NOS/sGC but not PKG activation. By contrast, NO does not appear to be involved in LTP in this preparation. Furthermore, I found that eCBs are involved in LTP induction, but not in basal synaptic transmission, 5Hz-LTD or Cch-LTD. Behavioural experiments demonstrated that the blockade of nNOS impairs rat visual recognition memory tested at 24 hours, but not at 20 min; however, the blockade of CB1 did not affect visual recognition memory acquisition tested at either time point. In three-month-old Tg2576 mice, deficits in basal synaptic transmission and 5Hz-LTD were observed compared to littermate controls. Conclusions The results obtained in Prh slices from juvenile rats indicate that NO and CB1 play a role in the induction of LTD and LTP, respectively. These results are confirmed by the observation that nNOS, but not CB1, is involved in visual recognition memory acquisition. The preliminary results obtained in the murine model of Alzheimer’s disease indicate that deficits in synaptic transmission and plasticity occur very early in Prh; further investigations are required to characterize the molecular mechanisms underlying these deficits.
Abstract:
Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, with the potential for shattering the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry. 
The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards - as opposed to becoming so only when the system is final - and more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.
Abstract:
Shape memory materials (SMMs) represent an important class of smart materials that have the ability to return from a deformed state to their original shape. Thanks to this property, SMMs are utilized in a wide range of innovative applications. The increasing number of applications and the consequent involvement of industrial players in the field have motivated researchers to formulate constitutive models able to capture the complex behavior of these materials and to develop robust computational tools for design purposes. This research field is still in progress, especially in the prediction of shape memory polymer (SMP) behavior and of important effects characterizing shape memory alloy (SMA) applications. Moreover, the frequent use of shape memory and metallic materials in biomedical devices, particularly in cardiovascular stents, which are implanted in the human body and experience millions of in-vivo loading cycles driven by blood pressure, clearly indicates the need for a deeper understanding of fatigue/fracture failure in microsize components. The development of reliable stent designs against fatigue is still an open subject in the scientific literature. Motivated by the described framework, the thesis focuses on several research issues involving the advanced constitutive, numerical and fatigue modeling of elastoplastic and shape memory materials. Starting from the constitutive modeling, the thesis proposes refined phenomenological models for reliable descriptions of SMA and SMP behavior. Then, concerning the numerical modeling, the thesis implements the models in numerical software by developing implicit/explicit time-integration algorithms, to guarantee robust computational tools for practical purposes. The described modeling activities are completed by experimental investigations on SMA actuator springs and polyethylene polymers. 
Finally, regarding the fatigue modeling, the thesis proposes the introduction of a general computational approach for the fatigue-life assessment of a classical stent design, in order to exploit computer-based simulations to prevent failures and modify design, without testing numerous devices.
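The distinction between the implicit and explicit time-integration algorithms mentioned above can be illustrated on a toy problem. The sketch below is a generic illustration, not the constitutive models of the thesis: it integrates the stiff linear ODE y' = -ky with both schemes, and for dt*k > 2 the explicit (forward Euler) update diverges while the backward Euler update remains stable, which is the robustness argument for implicit schemes.

```python
def explicit_euler(k, dt, n):
    """Forward Euler for y' = -k*y, y(0) = 1: y_{n+1} = y_n + dt*f(y_n)."""
    y = 1.0
    for _ in range(n):
        y = y + dt * (-k * y)          # amplification factor (1 - dt*k)
    return y

def implicit_euler(k, dt, n):
    """Backward Euler: solve y_{n+1} = y_n + dt*f(y_{n+1}) at each step."""
    y = 1.0
    for _ in range(n):
        y = y / (1.0 + dt * k)         # amplification factor 1/(1 + dt*k)
    return y

k, dt = 50.0, 0.1                      # dt*k = 5 > 2: forward Euler unstable
y_exp = explicit_euler(k, dt, 20)      # magnitude grows without bound
y_imp = implicit_euler(k, dt, 20)      # decays toward 0, as the true solution does
```

The same trade-off, solving a (generally nonlinear) system per step in exchange for unconditional stability, is what motivates implicit return-mapping schemes in elastoplastic and shape memory constitutive models.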
Abstract:
Phenomenology is a critical component of autobiographical memory retrieval. Some memories are vivid and rich in sensory details whereas others are faded; some memories are experienced as emotionally intense whereas others are not. Sutin and Robins (2007) identified 10 dimensions in which a memory may vary—i.e., Vividness, Coherence, Accessibility, Sensory Details, Emotional Intensity, Visual Perspective, Time Perspective, Sharing, Distancing, and Valence—and developed a comprehensive, psychometrically sound measure of memory phenomenology, the Memory Experiences Questionnaire (MEQ). Phenomenology has been linked to underlying stable dispositions—i.e., personality—as well as to a variety of positive/negative psychological outcomes—well-being and life satisfaction, depression and anxiety, among others. Using the MEQ, a cross-sectional and a longitudinal study were conducted on a large sample of American and Italian adults. In both studies, participants retrieved two ‘key’ personal memories, a Turning Point and a Childhood Memory, and rated the affect and phenomenology of each memory. Participants also completed self-report measures of personality (i.e., Neuroticism and Conscientiousness), and measures of depression, well-being and life satisfaction. The present research showed that phenomenological ratings tend (a) to increase cross-sectionally across adulthood (Study 1), and (b) to be moderately stable over time, regardless of the contents of the memories (Study 2). Interrelations among memory phenomenology, personality and psychological outcome variables were also examined (Studies 1 and 2). In particular, autobiographical memory phenomenology was proposed as a dynamic expression of personality functioning that partially explains adaptive/maladaptive psychological outcomes. In fact, the findings partially supported the hypothesized mediating effect of phenomenology on the association of personality with psychological outcomes. 
Implications of the findings are discussed proposing future lines of research. In particular, the need for more longitudinal studies is highlighted, along with the combined application of both self-report questionnaires and narrative measures.
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is the verification of safety clearances between individual components, the so-called clearance analysis. Engineers determine for specific components whether, both at rest and during motion, they maintain a prescribed safety distance to the surrounding components. If components fall below the safety distance, their shape or position must be changed. For this, it is important to know exactly which regions of the components violate the safety distance.

In this work we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g., triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives falling below the safety distance, which we call the set of all tolerance-violating primitives. We present a comprehensive solution that can be divided into the following three major topics.

In the first part of this work we investigate algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that specialized tolerance tests are significantly more performant than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach proves to be the most performant.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to account for the required safety distance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before, which we call Shrubs. Previous approaches to the memory optimization of uniform grids rely mostly on hashing methods, which, however, do not reduce the memory consumption of the cell contents. In our use case, neighboring cells often have similar contents. Our approach is able to losslessly compress the memory footprint of a uniform grid's cell contents, exploiting the redundant cell contents, to one fifth of the original size and to decompress it at runtime.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we show applications for various path-planning problems.
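One family of strategies for recognizing primitives as tolerance-violating without an exact primitive-primitive test can be sketched with bounding spheres: if even the farthest points of two spheres are closer than the safety distance d, the enclosed primitives certainly violate it; if even the closest points respect d, they certainly do not; only the remaining pairs need an exact test. This is our own minimal illustration of such a conservative classifier, not the dual-space tolerance test developed in the thesis.

```python
import math

def sphere_tolerance_classify(c1, r1, c2, r2, d):
    """Classify a pair of primitives by their bounding spheres (center, radius).

    Returns 'violating' or 'ok' when the spheres alone decide the question,
    and 'unknown' when an exact primitive-primitive tolerance test is needed.
    """
    dist = math.dist(c1, c2)
    if dist + r1 + r2 < d:      # even the farthest enclosed points are closer than d
        return "violating"
    if dist - r1 - r2 >= d:     # even the closest enclosed points respect distance d
        return "ok"
    return "unknown"
```

In a uniform-grid broad phase, a filter like this prunes most candidate pairs cheaply, so the expensive exact test runs only on the small 'unknown' remainder.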
Abstract:
Time series are ubiquitous. The acquisition and processing of continuously measured data occurs in all areas of the natural sciences, medicine and finance. The enormous growth of recorded data volumes, whether from automated monitoring systems or integrated sensors, demands exceptionally fast algorithms in theory and practice. Consequently, this work deals with the efficient computation of subsequence alignments. Complex algorithms such as anomaly detection, motif queries or the unsupervised extraction of prototypical building blocks in time series make extensive use of these alignments, which motivates the need for fast implementations. This work is divided into three approaches that address this challenge: four alignment algorithms and their parallelization on CUDA-capable hardware, an algorithm for the segmentation of data streams, and a unified treatment of Lie-group-valued time series.

The first contribution is a complete CUDA port of the UCR-Suite, the world-leading implementation of subsequence alignment. It comprises a new computation scheme for determining local alignment scores under the z-normalized Euclidean distance, deployable on any parallel hardware with support for fast Fourier transforms. Furthermore, we give a SIMT-compatible implementation of the UCR-Suite's lower-bound cascade for the efficient computation of local alignment scores under Dynamic Time Warping. Both CUDA implementations enable computations one to two orders of magnitude faster than established methods.

Second, we investigate two linear-time approximations for the elastic alignment of subsequences. On the one hand, we treat a SIMT-compatible relaxation scheme for greedy DTW and its efficient CUDA parallelization. On the other hand, we introduce a new local distance measure, the Gliding Elastic Match (GEM), which can be computed with the same asymptotic time complexity as greedy DTW but offers a complete relaxation of the penalty matrix. Further improvements include invariance against trends on the measurement axis and uniform scaling on the time axis. In addition, an extension of GEM to multi-shape segmentation is discussed and evaluated on motion data. Both CUDA parallelizations achieve runtime improvements of up to two orders of magnitude.

The treatment of time series in the literature is usually restricted to real-valued measurement data. The third contribution is a unified method for handling Lie-group-valued time series. Building on it, distance measures on the rotation group SO(3) and on the Euclidean group SE(3) are treated. Furthermore, memory-efficient representations and group-compatible extensions of elastic measures are discussed.
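The core quantity that the UCR-Suite port accelerates, the z-normalized Euclidean distance between a query and every same-length subsequence of a time series, can be written down naively in a few lines. The following is our own reference baseline for comparison, not the optimized FFT-based scheme from the work:

```python
import numpy as np

def znorm_subsequence_dists(ts, query):
    """Z-normalized Euclidean distance between `query` and every subsequence
    of `ts` with the same length (naive O(n*m) baseline)."""
    m = len(query)
    q = (query - query.mean()) / query.std()   # z-normalize the query once
    out = np.empty(len(ts) - m + 1)
    for i in range(len(out)):
        w = ts[i:i + m]
        w = (w - w.mean()) / w.std()           # z-normalize each window
        out[i] = np.linalg.norm(w - q)
    return out

ts = np.sin(np.linspace(0, 20, 500))
d = znorm_subsequence_dists(ts, ts[100:150].copy())  # best match is at index 100
```

Z-normalizing every window is what makes the comparison invariant to local offset and scale; the FFT-based scheme and the lower-bound cascade mentioned above exist precisely because doing this naively for long series is too slow.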