928 results for Elements, High Throughput Data, electrophysiology, data processing, real-time analysis


Relevance:

100.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

100.00%

Publisher:

Abstract:

Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, applying advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter and at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry.
The integration of these constituents into a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards, as opposed to becoming so only when the system is final, and is more easily amenable to advanced timing analysis by construction, regardless of system scale and complexity.
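The layout optimisation method of point (iii) is not detailed in the abstract. As a hedged illustration only, the sketch below shows one way such a pass could work: a greedy placer that assigns each hot function an address whose direct-mapped cache-set footprint avoids those of functions already placed. The cache parameters, function names and sizes are all hypothetical, not taken from the work described.

```python
# Illustrative sketch, not the authors' method: greedy code-layout pass that
# places "hot" functions so their cache-set footprints do not overlap in a
# direct-mapped cache, removing layout-induced conflict jitter.
LINE = 32        # cache line size in bytes (illustrative)
SETS = 256       # number of cache sets (illustrative)

def footprint(addr, size):
    """Set indices touched by a function of `size` bytes placed at `addr`."""
    first, last = addr // LINE, (addr + size - 1) // LINE
    return {line % SETS for line in range(first, last + 1)}

def place(functions, align=LINE, max_slots=4096):
    """Greedily assign each (name, size) an address minimising set conflicts
    with functions placed so far; stop early at a conflict-free slot."""
    layout, used = {}, set()
    for name, size in functions:
        best = None
        for slot in range(max_slots):
            addr = slot * align
            overlap = len(footprint(addr, size) & used)
            if best is None or overlap < best[1]:
                best = (addr, overlap)
            if overlap == 0:
                break  # conflict-free slot found, no need to search further
        layout[name] = best[0]
        used |= footprint(best[0], size)
    return layout
```

With a cache large enough to hold all hot code, the placer finds pairwise disjoint footprints; otherwise it falls back to the least-conflicting slot seen.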

Relevance:

100.00%

Publisher:

Abstract:

This thesis is a collection of works focused on the topic of Earthquake Early Warning, with special attention to large-magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of aspects that have been analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are first discussed. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. Limitations of the standard approaches for large events emerge in this chapter; the difficulties are related to estimating magnitude in real time from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the Tohoku-Oki earthquake. In the second part of the thesis a larger set of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of the recorded signal is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation are proposed to justify the observations. The third part of the thesis is focused on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damage area are proposed and tested. The first is a threshold-based method that uses traditional seismic data; the second is an innovative approach using continuous GPS data. Both strategies improve the prediction of the large-scale effects of strong earthquakes.
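The evolutionary magnitude-estimation strategy itself is not specified in this abstract. As an illustration of the general idea only, the sketch below uses a peak-displacement (Pd) scaling relation of the kind common in Early Warning practice, re-evaluated as the expanding P-wave window reveals a larger peak; the regression coefficients are illustrative placeholders, not values from the thesis.

```python
import math

# Illustrative sketch of an evolutionary, Pd-based magnitude estimate.
# A, B, C are placeholder regression coefficients; real networks calibrate
# their own values per region and instrument type.
A, B, C = 4.75, 1.07, 1.70

def magnitude_from_pd(pd_cm, hypo_dist_km):
    """Magnitude proxy from peak P-wave displacement (cm) at a given
    hypocentral distance (km), via a log-linear scaling relation."""
    return A + B * math.log10(pd_cm) + C * math.log10(hypo_dist_km)

def evolutionary_estimate(pd_samples, hypo_dist_km):
    """Re-estimate magnitude each time step as the expanding P-wave window
    updates the running peak displacement; estimates can only grow."""
    estimates, running_peak = [], 0.0
    for pd in pd_samples:
        running_peak = max(running_peak, pd)
        estimates.append(magnitude_from_pd(running_peak, hypo_dist_km))
    return estimates
```

The point the thesis makes about very large events is visible in this toy form: because the estimate tracks the running peak of a short window, it saturates until later portions of the signal raise it, which motivates an evolutionary rather than one-shot estimate.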

Relevance:

100.00%

Publisher:

Abstract:

The development of High-Integrity Real-Time Systems has a high footprint in terms of human, material and schedule costs. Factoring functional, reusable logic into the application favors incremental development and contains costs. Yet, achieving incrementality in the timing behavior is a much harder problem. Complex features at all levels of the execution stack, aimed at boosting average-case performance, exhibit timing behavior highly dependent on execution history, which wrecks time composability, and incrementality with it. Our goal here is to restore time composability to the execution stack, working bottom-up across it. We first characterize time composability without making assumptions on the system architecture or on the software deployed to it. Later, we focus on the role played by the real-time operating system in our pursuit. Initially we consider single-core processors and, becoming less permissive on the admissible hardware features, we devise solutions that restore a convincing degree of time composability. To show what can be done in practice, we developed TiCOS, an ARINC-compliant kernel, and re-designed ORK+, a kernel for Ada Ravenscar runtimes. In that work, we added support for limited preemption to ORK+, a first in the landscape of real-world kernels. Our implementation allows resource sharing to coexist with limited-preemptive scheduling, which extends the state of the art. We then turn our attention to multicore architectures, first considering partitioned systems, for which we achieve results close to those obtained for single-core processors. Subsequently, we move away from the over-provisioning of those systems and consider less restrictive uses of homogeneous multiprocessors, where the scheduling algorithm is key to high schedulable utilization. To that end we single out RUN, a promising baseline, and extend it to SPRINT, which supports sporadic task sets and hence better matches real-world industrial needs.
To corroborate our results we present findings from real-world case studies in the avionics industry.
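The ORK+ implementation is not described here. As a hedged sketch of the scheduling idea alone, the toy simulator below shows how limited-preemptive (deferred-preemption) scheduling differs from fully preemptive scheduling: a higher-priority arrival must wait until the running job finishes its current non-preemptive region, which bounds preemption points and so reduces history-dependent jitter. The tick-based task model, field names and parameters are all illustrative.

```python
# Illustrative sketch, not the ORK+ design: tick-based fixed-priority
# scheduling with deferred preemption. Each job declares `npr`, the length
# of its non-preemptive regions; preemption is honored only at region ends.
def schedule(jobs, horizon):
    """jobs: dicts with keys arrival, wcet, prio (lower value = higher
    priority), npr, name. Returns the job name executed at each tick."""
    trace, running, region_left = [], None, 0
    remaining = {id(j): j["wcet"] for j in jobs}
    for t in range(horizon):
        ready = [j for j in jobs if j["arrival"] <= t and remaining[id(j)] > 0]
        if not ready:
            trace.append(None)
            running = None
            continue
        top = min(ready, key=lambda j: j["prio"])
        if running is not None and remaining[id(running)] > 0 and region_left > 0:
            pass  # preemption deferred: current non-preemptive region runs on
        else:
            running = top              # dispatch (or re-dispatch) highest prio
            region_left = running["npr"]
        remaining[id(running)] -= 1
        region_left -= 1
        trace.append(running["name"])
    return trace
```

In the test below, the high-priority job arrives at t=1 but starts only at t=2, after the low-priority job's 2-tick region ends; under fully preemptive scheduling it would start at t=1.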

Relevance:

100.00%

Publisher:

Abstract:

The first chapter of this thesis introduces the biological concepts needed to understand the phenomenon of gene expression. The second chapter describes the laboratory methods and techniques used to obtain cDNA, the genetic material that is amplified in real-time PCR. The third chapter describes the real-time PCR technique, starting from a description of conventional PCR and then outlining the characteristics of its evolution into real-time PCR. It continues with an explanation of the physical principle underlying the technique and of the molecules (fluorophores and probes) needed to carry it out; finally, it describes the instrument's hardware and software. The fourth chapter presents signal-analysis techniques based on absolute or relative quantification methods. Finally, the fifth chapter presents a case study, namely a gene-expression analysis by real-time PCR carried out during an internship at the ICM laboratory.
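As a complement to the chapter on relative quantification, the snippet below sketches the widely used Livak 2^(-ΔΔCt) method for analysing real-time PCR data: the target gene's Ct is normalised against a reference (housekeeping) gene and then compared between sample and control. The Ct values in the usage example are illustrative, not data from the thesis.

```python
# Sketch of the standard Livak 2^(-ΔΔCt) relative-quantification method.
# Ct = threshold cycle reported by the real-time PCR instrument; each unit
# of Ct corresponds (ideally) to a two-fold difference in starting template.
def fold_change(ct_target_sample, ct_ref_sample,
                ct_target_control, ct_ref_control):
    """Expression of the target gene in the sample relative to the control,
    normalised by the reference gene (assumes ~100% amplification efficiency)."""
    delta_sample = ct_target_sample - ct_ref_sample      # ΔCt, sample
    delta_control = ct_target_control - ct_ref_control   # ΔCt, control
    return 2 ** -(delta_sample - delta_control)          # 2^(-ΔΔCt)
```

For example, `fold_change(24.0, 20.0, 26.0, 20.0)` gives a ΔΔCt of -2 and hence a 4-fold up-regulation in the sample relative to the control.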

Relevance:

100.00%

Publisher:

Abstract:

Early and Mid-Pleistocene climate, ocean hydrography and ice sheet dynamics have been reconstructed using a high-resolution data set (planktonic and benthic δ18O time series, faunal-based sea surface temperature (SST) reconstructions, and an ice-rafted debris (IRD) record) from a high-deposition-rate sedimentary succession recovered at the Gardar Drift formation in the subpolar North Atlantic (Integrated Ocean Drilling Program Leg 306, Site U1314). Our sedimentary record spans from late in Marine Isotope Stage (MIS) 31 to MIS 19 (1069–779 ka). The different trends of the benthic and planktonic oxygen isotope, SST and IRD records before and after MIS 25 (∼940 ka) evidence the large increase in Northern Hemisphere ice volume linked to the change in cyclicity from 41 kyr to 100 kyr that occurred during the Mid-Pleistocene Transition (MPT). Besides the longer glacial-interglacial (G-IG) variability, millennial-scale fluctuations were a pervasive feature across our study. Negative excursions in the benthic δ18O time series observed at the times of IRD events may be related to glacio-eustatic changes due to ice-sheet retreats and/or to changes in deep hydrography. Time series analysis of surface water proxies (IRD, SST and planktonic δ18O) over the interval between MIS 31 and MIS 26 shows that the timing of these millennial-scale climate changes is related to half-precessional (10 kyr) components of the insolation forcing, which are interpreted as cross-equatorial heat transport toward high latitudes during both equinox insolation maxima at the equator.
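The spectral methods used in the study are not specified in the abstract. As a minimal illustration of how a ~10-kyr half-precessional component could show up in an evenly sampled proxy record, the sketch below computes a plain FFT periodogram on synthetic data and reports the dominant period; it is not the authors' actual workflow, and real proxy records are typically unevenly sampled and would need, e.g., interpolation or a Lomb-Scargle approach first.

```python
import numpy as np

# Illustrative periodogram sketch: find the strongest periodicity (in kyr)
# in an evenly sampled, detrended proxy time series.
def dominant_period(values, dt_kyr):
    """Return the period (kyr) of the strongest non-zero-frequency peak
    of the FFT power spectrum; dt_kyr is the sampling interval."""
    detrended = values - np.mean(values)            # remove the mean (DC bin)
    power = np.abs(np.fft.rfft(detrended)) ** 2     # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(values), d=dt_kyr)  # cycles per kyr
    peak = 1 + np.argmax(power[1:])                 # skip the zero-frequency bin
    return 1.0 / freqs[peak]
```

On a synthetic record sampled every 1 kyr containing a pure 10-kyr oscillation, the function recovers the 10-kyr period exactly.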

Relevance:

100.00%

Publisher:

Abstract:

Quantitative meta-analyses of randomized clinical trials investigating the specific therapeutic efficacy of homeopathic remedies have yielded statistically significant differences compared to placebo. Since the remedies used contained mostly only very low concentrations of pharmacologically active compounds, these effects cannot be accounted for within the framework of current pharmacology. Theories to explain the clinical effects of homeopathic remedies are partially based upon changes in diluent structure. To investigate the latter, we measured for the first time high-field (600/500 MHz) 1H T1 and T2 nuclear magnetic resonance relaxation times of H2O in homeopathic preparations, with concurrent contamination control by inductively coupled plasma mass spectrometry (ICP-MS). Homeopathic preparations of quartz (10c–30c, n = 21, corresponding to iterative dilutions of 100^-10 to 100^-30), sulfur (13x–30x, n = 18, 10^-13 to 10^-30), and copper sulfate (11c–30c, n = 20, 100^-11 to 100^-30) were compared to n = 10 independent controls each (analogously agitated dilution medium) in randomized and blinded experiments. In none of the samples did the concentration of any element analyzed by ICP-MS exceed 10 ppb. In the first measurement series (600 MHz), there was a significant increase in T1 for all samples as a function of time, and there were no significant differences between homeopathic potencies and controls. In the second measurement series (500 MHz), 1 year after preparation, we observed statistically significantly increased T1 relaxation times for homeopathic sulfur preparations compared to controls. Fifteen out of 18 correlations between sample triplicates were higher for controls than for homeopathic preparations. No conclusive explanation for these phenomena can be given at present. Possible hypotheses involve differential leaching from the measurement vessel walls or a change in water molecule dynamics, i.e., in rotational correlation time and/or diffusion.
Homeopathic preparations thus may exhibit specific physicochemical properties that need to be determined in detail in future investigations.
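The relaxometry protocol is not detailed in the abstract. As a hedged illustration of how a T1 value can be extracted from inversion-recovery data, M(t) = M0(1 - 2e^(-t/T1)), the sketch below linearises the recovery curve and fits the slope; the function names and the synthetic, noiseless data are illustrative and are not the study's actual processing pipeline.

```python
import numpy as np

# Illustrative T1 extraction from inversion-recovery points.
# Rearranging M(t) = M0*(1 - 2*exp(-t/T1)) gives
# ln((M0 - M(t)) / (2*M0)) = -t / T1, a straight line in t.
def fit_t1(times, magnetisation, m0):
    """Recover T1 from low-noise inversion-recovery data by a log-linear
    least-squares fit (assumes m0 is known and magnetisation < m0)."""
    y = np.log((m0 - magnetisation) / (2.0 * m0))   # equals -t / T1
    slope = np.polyfit(times, y, 1)[0]
    return -1.0 / slope
```

On noisy data a nonlinear three-parameter fit of (M0, T1, inversion efficiency) is preferred, since the log transform amplifies noise near full recovery.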