830 results for Computation time delay
Abstract:
Time series are ubiquitous. The acquisition and processing of continuously measured data occur in every field of the natural sciences, medicine and finance. The enormous growth of recorded data volumes, whether from automated monitoring systems or integrated sensors, calls for exceptionally fast algorithms in theory and in practice. This thesis is therefore concerned with the efficient computation of subsequence alignments. Complex algorithms such as anomaly detection, motif queries or the unsupervised extraction of prototypical building blocks from time series make heavy use of these alignments, hence the need for fast implementations. The thesis is organized into three approaches to this challenge: four alignment algorithms and their parallelization on CUDA-capable hardware, an algorithm for the segmentation of data streams, and a unified treatment of Lie-group-valued time series.

The first contribution is a complete CUDA port of the UCR suite, the world-leading implementation of subsequence alignment. It includes a new computation scheme for determining local alignment scores under the z-normalized Euclidean distance, which can be deployed on any parallel hardware that supports fast Fourier transforms. We further give a SIMT-compatible implementation of the UCR suite's lower-bound cascade for the efficient computation of local alignment scores under dynamic time warping. Both CUDA implementations compute one to two orders of magnitude faster than established methods.

Second, we investigate two linear-time approximations for the elastic alignment of subsequences. On the one hand, we treat a SIMT-compatible relaxation scheme for greedy DTW and its efficient CUDA parallelization. On the other hand, we introduce a new local distance measure, the Gliding Elastic Match (GEM), which can be computed with the same asymptotic time complexity as greedy DTW while offering a complete relaxation of the penalty matrix. Further improvements include invariance to trends along the measurement axis and to uniform scaling along the time axis. In addition, an extension of GEM to multi-shape segmentation is discussed and evaluated on motion data. Both CUDA parallelizations achieve runtime improvements of up to two orders of magnitude.

The treatment of time series in the literature is usually restricted to real-valued measurements. The third contribution is a unified method for handling Lie-group-valued time series. Building on it, distance measures on the rotation group SO(3) and on the Euclidean group SE(3) are treated, and memory-efficient representations and group-compatible extensions of elastic measures are discussed.
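The basic primitive these contributions accelerate, scoring a query against every equal-length window of a longer series under the z-normalized Euclidean distance, can be sketched naively (a CPU reference illustration, not the FFT-based or CUDA code of the thesis; function names are ours):

```python
import numpy as np

def znorm(x):
    # z-normalize a window: zero mean, unit variance (flat windows map to zero)
    s = x.std()
    return (x - x.mean()) / s if s > 0 else np.zeros_like(x)

def subsequence_distances(series, query):
    # z-normalized Euclidean distance between the query and every
    # equal-length subsequence of the series; the best match is the argmin
    s = np.asarray(series, dtype=float)
    q = znorm(np.asarray(query, dtype=float))
    m = len(q)
    return np.array([np.linalg.norm(znorm(s[i:i + m]) - q)
                     for i in range(len(s) - m + 1)])
```

The UCR suite obtains the same distances via FFT-based convolution and prunes candidates with a lower-bound cascade; the quadratic loop above only fixes the reference semantics.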
Abstract:
A general approach is presented for implementing discrete transforms as a set of first-order or second-order recursive digital filters. Clenshaw's recurrence formulae are used to formulate the second-order filters. The resulting structure is suitable for efficient implementation of discrete transforms in VLSI or FPGA circuits. The general approach is applied to the discrete Legendre transform as an illustration.
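Clenshaw's recurrence evaluates a finite expansion in functions satisfying a three-term recurrence without forming the basis functions explicitly, which is what allows the transform to be cast as second-order recursive filters. A minimal sketch for the Chebyshev case (the Legendre case used in the paper is analogous; this is an illustration, not the paper's filter structure):

```python
import math

def clenshaw_chebyshev(a, x):
    # Evaluate S(x) = sum_k a[k] * T_k(x) by Clenshaw's recurrence,
    # never computing the Chebyshev polynomials T_k themselves:
    #   b_k = 2*x*b_{k+1} - b_{k+2} + a_k,  b_{n+1} = b_{n+2} = 0
    #   S(x) = x*b_1 - b_2 + a_0
    b1 = b2 = 0.0
    for ak in reversed(a[1:]):
        b1, b2 = 2.0 * x * b1 - b2 + ak, b1
    return x * b1 - b2 + a[0]
```

The backward sweep is exactly one second-order section per sample, which is why the same idea maps onto recursive digital filters.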
Abstract:
Clenshaw’s recurrence formula is used to derive recursive algorithms for the discrete cosine transform (DCT) and the inverse discrete cosine transform (IDCT). The recursive DCT algorithm presented here requires one fewer delay element per coefficient and one fewer multiply operation per coefficient compared with two recently proposed methods. Clenshaw’s recurrence formula provides a unified development for the recursive DCT and IDCT algorithms. The recursive algorithms apply to transforms of arbitrary length and are appropriate for VLSI implementation.
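As a plain reference against which any recursive DCT implementation can be checked, the unnormalized DCT-II and its inverse follow directly from their definitions (a naive O(N^2) sketch, not the Clenshaw-based filter structure of the paper):

```python
import math

def dct2(x):
    # unnormalized DCT-II: X[k] = sum_n x[n] * cos(pi*(n+1/2)*k/N)
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N)
                for n in range(N)) for k in range(N)]

def idct2(X):
    # inverse of the unnormalized DCT-II above (a scaled DCT-III)
    N = len(X)
    return [(X[0] + 2.0 * sum(X[k] * math.cos(math.pi * (n + 0.5) * k / N)
                              for k in range(1, N))) / N for n in range(N)]
```

A round trip idct2(dct2(x)) recovers x up to floating-point error, which makes this pair a convenient correctness oracle.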
Abstract:
We studied the time interval between starting tuberculosis treatment and commencing antiretroviral treatment (ART) in HIV-infected patients (n = 1433; median CD4 count 71 cells per microliter, interquartile range: 32-132) attending 3 South African township ART services between 2002 and 2008. The overall median delay was 2.66 months (interquartile range: 1.58-4.17). In adjusted analyses, delays varied between treatment sites but were shorter for patients with lower CD4 counts and those treated in more recent calendar years. During the most recent period (2007-2008), 4.7%, 19.7%, and 51.1% of patients started ART within 2, 4, and 8 weeks of tuberculosis treatment, respectively. Operational barriers must be tackled to permit further acceleration of ART initiation as recommended by 2010 WHO ART guidelines.
Abstract:
Prospective systematic analyses of the clinical presentation of bullous pemphigoid (BP) are lacking. Little is known about the time required for its diagnosis. Knowledge of the disease spectrum is important for diagnosis, management and inclusion of patients in therapeutic trials.
Abstract:
Various inference procedures for linear regression models with censored failure times have been studied extensively. Recent developments on efficient algorithms to implement these procedures enhance the practical usage of such models in survival analysis. In this article, we present robust inferences for certain covariate effects on the failure time in the presence of "nuisance" confounders under a semiparametric, partial linear regression setting. Specifically, the estimation procedures for the regression coefficients of interest are derived from a working linear model and are valid even when the function of the confounders in the model is not correctly specified. The new proposals are illustrated with two examples and their validity for cases with practical sample sizes is demonstrated via a simulation study.
Abstract:
This paper introduces a novel approach to making inference about the regression parameters in the accelerated failure time (AFT) model for current status and interval censored data. The estimator is constructed by inverting a Wald type test for testing a null proportional hazards model. A numerically efficient Markov chain Monte Carlo (MCMC) based resampling method is proposed to simultaneously obtain the point estimator and a consistent estimator of its variance-covariance matrix. We illustrate our approach with interval censored data sets from two clinical studies. Extensive numerical studies are conducted to evaluate the finite sample performance of the new estimators.
Abstract:
Visualization and exploratory analysis is an important part of any data analysis and is made more challenging when the data are voluminous and high-dimensional. One such example is environmental monitoring data, which are often collected over time and at multiple locations, resulting in a geographically indexed multivariate time series. Financial data, although not necessarily containing a geographic component, present another source of high-volume multivariate time series data. We present the mvtsplot function which provides a method for visualizing multivariate time series data. We outline the basic design concepts and provide some examples of its usage by applying it to a database of ambient air pollution measurements in the United States and to a hypothetical portfolio of stocks.
Abstract:
BACKGROUND: Constipation is a significant side effect of opioid therapy. We have previously demonstrated that naloxone-3-glucuronide (NX3G) antagonizes the motility-lowering effect of morphine in the rat colon. AIM: To find out whether oral NX3G is able to reduce the morphine-induced delay in colonic transit time (CTT) without being absorbed and without influencing the analgesic effect. METHODS: Fifteen male volunteers were included. Pharmacokinetics: after oral administration of 0.16 mg/kg NX3G, blood samples were collected over a 6-h period. Pharmacodynamics: NX3G or placebo was then given at the start time and every 4 h thereafter. Morphine (0.05 mg/kg) or placebo was injected s.c. 2 h after starting and thereafter every 6 h for 24 h. CTT was measured over a 48-h period by scintigraphy. Pressure pain threshold tests were performed. RESULTS: Neither NX3G nor naloxone was detected in the venous blood. The slowest transit time was observed during the morphine phase, which was significantly different from morphine with NX3G and from placebo. Pain perception was not significantly influenced by NX3G. CONCLUSIONS: Orally administered NX3G is able to reverse the morphine-induced delay of CTT in humans without being detected in peripheral blood samples. Therefore, NX3G may improve symptoms of constipation in patients using opioid medication without affecting opioid analgesic effects.
Abstract:
The performance of memory-guided saccades with two different delays (3 and 30 s of memorization) was studied in seven healthy subjects. Double-pulse transcranial magnetic stimulation (dTMS) with an interstimulus interval of 100 ms was applied over the right dorsolateral prefrontal cortex (DLPFC) early (1 s after target presentation) and late (28 s after target presentation). In both delay conditions, early stimulation significantly increased the percentage of error in amplitude (PEA) of contralateral memory-guided saccades compared with the control experiment without stimulation. dTMS applied late in the delay had no significant effect on PEA. Furthermore, we found a significantly smaller effect of early stimulation in the long-delay paradigm. These results suggest a time-dependent hierarchical organization of spatial working memory, with a functional dominance of the DLPFC during early memorization, independent of the memorization delay. For a long memorization delay, however, working memory seems to have an additional, DLPFC-independent component.
Abstract:
For broadcasting purposes, mixed reality, the combination of real and virtual scene content, has become ubiquitous. Mixed reality recording still requires expensive studio setups and is often limited to simple color keying. We present a system for mixed reality applications which uses depth keying and provides three-dimensional mixing of real and artificial content. It features enhanced realism through automatic shadow computation, which we consider a core issue in obtaining realism and a convincing visual perception, besides the correct alignment of the two modalities and correct occlusion handling. Furthermore, we present a possibility to support placement of virtual content in the scene. The core feature of our system is the incorporation of a time-of-flight (ToF) camera device. This device delivers real-time depth images of the environment at a reasonable resolution and quality. The camera is used to build a static environment model, and it also allows correct handling of mutual occlusions between real and virtual content, shadow computation and enhanced content planning. The presented system is inexpensive, compact, mobile and flexible, and provides convenient calibration procedures. Chroma keying is replaced by depth keying, which is efficiently performed on the graphics processing unit (GPU) using an environment model and the current ToF camera image. Automatic extraction and tracking of dynamic scene content is thereby performed, and this information is used for planning and alignment of virtual content. An additional sustainable feature is that depth maps of the mixed content are available in real time, which makes the approach suitable for future 3DTV productions. This paper gives an overview of the whole system approach, including camera calibration, environment model generation, real-time keying and mixing of virtual and real content, shadowing for virtual content, and dynamic object tracking for content planning.
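The depth-keying step itself reduces to a per-pixel comparison of the two depth maps; a simplified CPU sketch (the paper performs this on the GPU, and all names here are ours):

```python
import numpy as np

def depth_key(real_rgb, real_depth, virt_rgb, virt_depth):
    # Per-pixel depth keying: at each pixel, keep whichever source is
    # closer to the camera, so mutual occlusions come out correctly.
    mask = virt_depth < real_depth                 # virtual content in front
    out = np.where(mask[..., None], virt_rgb, real_rgb)
    return out, mask
```

In the described system, the real depth comes from the ToF camera (or the static environment model) and the virtual depth from the renderer's z-buffer.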
Abstract:
Recent downward revisions in the climate response to rising CO2 levels, and opportunities for reducing non-CO2 climate warming, have both been cited as evidence that the case for reducing CO2 emissions is less urgent than previously thought. Evaluating the impact of delay is complicated by the fact that CO2 emissions accumulate over time, so what happens after they peak is as relevant for long-term warming as the size and timing of the peak itself. Previous discussions have focused on how the rate of reduction required to meet any given temperature target rises asymptotically the later the emissions peak. Here we focus on a complementary question: how fast is peak CO2-induced warming increasing while mitigation is delayed, assuming no increase in rates of reduction after the emissions peak? We show that this peak-committed warming is increasing at the same rate as cumulative CO2 emissions, about 2% per year, much faster than observed warming, independent of the climate response.
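Because the 2% per year growth compounds, the cost of delay accumulates quickly; an illustrative calculation under the stated growth rate:

```python
def peak_warming_growth(years_of_delay, rate=0.02):
    # Relative increase in peak-committed warming after delaying
    # mitigation, assuming it tracks cumulative CO2 emissions growing
    # at `rate` per year (about 2%/yr in the paper's estimate).
    return (1.0 + rate) ** years_of_delay - 1.0
```

On this compounding, a decade of delay raises peak-committed warming by roughly a fifth, an illustrative figure only, since the actual trajectory depends on how emissions evolve.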
Abstract:
The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase by 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10−15 yr W m−2 per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10−15 yr W m−2 per kg-CO2. Estimates for time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions, compared to present day, and lower for smaller pulses than larger pulses. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices. 
Although choices of pulse size, background concentration and model lead to uncertainties, the most important and subjective choice in determining the AGWP of CO2, and hence the GWP, is the time horizon.
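The CO2 pulse response is commonly summarized by an impulse-response function (IRF), a constant plus a few decaying exponentials, whose time integral multiplied by the radiative efficiency of CO2 yields the AGWP. A sketch using coefficients of the kind fitted to such multi-model means (treat the exact values as an assumption, not the paper's reported fit):

```python
import math

# Illustrative CO2 impulse-response coefficients: (amplitude, timescale in
# years); a timescale of None marks the quasi-permanent airborne fraction.
A = [(0.2173, None), (0.2240, 394.4), (0.2824, 36.54), (0.2763, 4.304)]

def irf(t):
    # fraction of an emission pulse remaining airborne after t years
    return sum(a if tau is None else a * math.exp(-t / tau) for a, tau in A)

def integrated_irf(horizon=100.0, dt=0.01):
    # trapezoidal time integral of the IRF; multiplying by the radiative
    # efficiency of CO2 gives the AGWP at this horizon
    n = int(horizon / dt)
    return dt * (0.5 * irf(0.0)
                 + sum(irf(i * dt) for i in range(1, n))
                 + 0.5 * irf(horizon))
```

With these assumed coefficients, a bit under half the pulse is still airborne after a century, consistent in spirit with the multi-decade tail described above.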
Abstract:
There is great demand for easily accessible, user-friendly dietary self-management applications. Yet accurate, fully automatic estimation of nutritional intake using computer vision methods remains an open research problem. One key element of this problem is volume estimation, which can be computed from 3D models obtained using multi-view geometry. This paper presents a computational system for volume estimation based on the processing of two meal images. A 3D model of the served meal is reconstructed using the acquired images, and the volume is computed from the shape. The algorithm was tested on food models (dummy foods) with known volume and on real served food. Volume accuracy was on the order of 90%, while the total execution time was below 15 seconds per image pair. The proposed system combines simple and computationally affordable methods for 3D reconstruction, remained stable throughout the experiments, operates in near real time, and places minimum constraints on users.
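Once a 3D surface of the served meal is reconstructed, the volume-from-shape step can be as simple as integrating a height map over the plate; a minimal sketch (our simplification, not the system's actual reconstruction pipeline):

```python
import numpy as np

def volume_from_height_map(heights, cell_area_cm2):
    # Approximate food volume by summing prisms over a height map:
    # heights[i, j] is the food surface's height (cm) above the plate at
    # grid cell (i, j); negative values (below the plate) are clipped.
    return float(np.clip(heights, 0.0, None).sum() * cell_area_cm2)
```

In a real pipeline the height map would come from the two-view reconstruction after fitting the plate plane; here both inputs are assumed given.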
Abstract:
Low-grade gliomas (LGGs) are a group of primary brain tumours usually encountered in young patient populations. These tumours represent a difficult challenge because many patients survive a decade or more and may be at a higher risk for treatment-related complications. Specifically, radiation therapy is known to have a relevant effect on survival, but in many cases it can be deferred to avoid side effects while maintaining its beneficial effect. However, a subset of LGGs manifests more aggressive clinical behaviour and requires earlier intervention. Moreover, the effectiveness of radiotherapy depends on the tumour characteristics. Recently, Pallud et al. (2012, Neuro-Oncology, 14: 1-10) studied patients with LGGs treated with radiation therapy as a first-line therapy and obtained the counterintuitive result that tumours with a fast response to the therapy had a worse prognosis than those responding late. In this paper, we construct a mathematical model describing the basic facts of glioma progression and response to radiotherapy. The model also provides an explanation for the observations of Pallud et al. Using the model, we propose radiation fractionation schemes that might be therapeutically useful by helping to evaluate tumour malignancy while at the same time reducing the toxicity associated with the treatment.
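A toy version of such a growth/response model combines exponential tumour growth with instantaneous linear-quadratic (LQ) cell kill at each radiation fraction (illustrative parameter values; this is not the model of the paper):

```python
import math

def lq_survival(dose_gy, alpha=0.03, beta=0.003):
    # Linear-quadratic surviving fraction after one radiation fraction;
    # alpha and beta are illustrative, not fitted to LGG data.
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def tumour_size(n0, growth_rate, t_days, fractions):
    # Exponential growth punctuated by instantaneous LQ cell kill at the
    # given (day, dose_gy) schedule; returns cell count at t_days.
    n, t = n0, 0.0
    for day, dose in sorted(fractions):
        n *= math.exp(growth_rate * (day - t)) * lq_survival(dose)
        t = day
    return n * math.exp(growth_rate * (t_days - t))
```

Varying the fractionation schedule in such a toy model already shows how response speed and regrowth interact, the kind of trade-off the paper's model is built to quantify.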