919 results for Signal amplitude
Abstract:
The general model: The aim of this chapter is to give a structured overview of the different possibilities available for displaying and analyzing brain electric scalp potentials. First, a general formal model of time-varying distributed EEG potentials is introduced. Based on this model, the most common analysis strategies used in EEG research are presented and discussed as specific cases of this general model. Both the general model and the particular methods are also expressed in mathematical terms; it is, however, not necessary to understand these terms to follow the chapter. The general model proposed here is based on the statement made in Chapter 3 that the electric field produced by active neurons in the brain propagates through brain tissue without time delay. In contrast to imaging methods based on hemodynamic or metabolic processes, EEG scalp potentials are thus “real-time” measurements, neither delayed nor a priori frequency-filtered. If only a single dipolar source in the brain were active, the temporal dynamics of that source's activity would be exactly reproduced by the temporal dynamics observed in the scalp potentials it produces. This is illustrated in Figure 5.1, where the expected EEG signal of a single source with spindle-like dynamics in time has been computed. The dynamics of the scalp potentials exactly reproduce the dynamics of the source. The amplitude of the measured potentials depends on the location, orientation, and strength of the active source and on the electrode position.
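A minimal numerical sketch of this point (the gains, sampling rate, and source waveform are hypothetical, not taken from the chapter): a single spindle-like source is projected onto a few scalp electrodes through fixed gains, and every channel reproduces the source time course without delay, scaled by an electrode-dependent amplitude.

```python
import numpy as np

# Hypothetical illustration of the instantaneous forward model: one dipolar
# source with spindle-like dynamics, mapped to three electrodes by fixed gains.
fs = 250.0                                   # sampling rate in Hz (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)            # 2 s of data
spindle = np.exp(-((t - 1.0) ** 2) / 0.05) * np.sin(2 * np.pi * 12.0 * t)

lead_field = np.array([0.8, 0.3, -0.5])      # made-up electrode gains
scalp = np.outer(lead_field, spindle)        # channels x time, no temporal delay

# Each channel is a scaled copy of the source (correlation of +/-1).
for ch, gain in enumerate(lead_field):
    r = np.corrcoef(scalp[ch], spindle)[0, 1]
    print(f"electrode {ch}: gain = {gain:+.1f}, correlation with source = {r:+.2f}")
```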
Abstract:
The paper discusses new business models for the transmission of television programs in the context of the definitions of broadcasting and retransmission. Typically, the whole process of supplying content to the end user has two stages: a media service provider supplies a signal assigned to a given TV channel to cable operators and satellite DTH platform operators (dedicated transmission), and the cable operators and satellite DTH platform operators transmit this signal to end users. In each stage the signals are encoded and are not available to the general public without the intervention of the cable/platform operators. The services relating to the supply and transmission of the content are operated by different business entities: each earns money separately and each uses content protected by copyright. The question is how to classify the actions of the entity supplying the signal with the TV program directly to the cable/digital platform operator and the actions of the entity providing the end user with the signal. The author criticizes the approach presented in the Chellomedia and Norma rulings, arguing that it leads to a significant level of legal uncertainty, and poses the basic questions concerning the notion of “public” in copyright.
Abstract:
Accumulation and delta O-18 data from Alpine ice cores provide information on past temperature and precipitation. However, their correlation with seasonal or annual mean temperature and precipitation at nearby sites is often low. This is partly due to the irregular sampling of the atmosphere by the ice core (ice cores record almost exclusively precipitation events, not dry periods) and to possible mismatches between annual layers and calendar years. Using daily meteorological data from a nearby station and reanalyses, we replicate the ice core from the Grenzgletscher (Switzerland, 4200 m a.s.l.) on a sample-by-sample basis by calculating precipitation-weighted temperature (PWT) over short intervals. Over the most recent 15 years of the ice core record, accumulation and delta O-18 variations can be well reproduced on a sub-seasonal scale. This allows a wiggle-matching approach for defining quasi-annual layers, resulting in high correlations between measured quasi-annual delta O-18 and PWT. Further back in time, the agreement deteriorates. Nevertheless, we find significant correlations over the entire length of the record (1938-1993) between ice core delta O-18 and PWT, but not with annual mean temperature. This is due to the low correlation between PWT and annual mean temperature, a characteristic that in the ERA-Interim reanalysis is also found for many other continental mid-to-high-latitude regions. The fact that meteorologically very different years can lead to similar combinations of PWT and accumulation limits the use of delta O-18 from Alpine ice cores for temperature reconstructions. Rather than for reconstructing annual mean temperature, delta O-18 from Alpine ice cores should be used to reconstruct PWT over quasi-annual periods. This variable is reproducible in reanalysis or climate model data and could thus be assimilated into conventional climate models.
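As an illustration of the central quantity, the following sketch computes precipitation-weighted temperature from daily station data; the values and variable names are made up and are not taken from the Grenzgletscher record.

```python
import numpy as np

def pwt(temp_c, precip_mm):
    """Precipitation-weighted temperature: sum(P_i * T_i) / sum(P_i)."""
    temp_c = np.asarray(temp_c, dtype=float)
    precip_mm = np.asarray(precip_mm, dtype=float)
    total = precip_mm.sum()
    if total == 0.0:
        return np.nan          # dry interval: nothing is recorded by the core
    return float((precip_mm * temp_c).sum() / total)

# Made-up daily data: PWT is dominated by the precipitation days and can
# differ markedly from the plain mean temperature of the same interval.
temps  = [-12.0, -8.0, -15.0, -2.0, -10.0]
precip = [  0.0,  6.0,   0.0,  9.0,   0.0]
print(pwt(temps, precip))      # -4.4 degC, weighted toward the two wet days
print(np.mean(temps))          # -9.4 degC, the unweighted mean temperature
```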
Abstract:
OBJECTIVES This study prospectively evaluated the role of a novel three-dimensional, noninvasive, beat-by-beat mapping system, Electrocardiographic Mapping (ECM), in facilitating the diagnosis of atrial tachycardias (AT). BACKGROUND The conventional 12-lead electrocardiogram, a widely used noninvasive tool in clinical arrhythmia practice, has diagnostic limitations. METHODS Various AT (de novo and post-atrial fibrillation ablation) were mapped using ECM, followed by standard-of-care electrophysiological mapping and ablation, in 52 patients. The ECM consisted of recording body surface electrograms from a 252-electrode vest placed on the torso, combined with computed tomography scan-based biatrial anatomy (CardioInsight Inc., Cleveland, Ohio). We evaluated the feasibility of this system in defining the mechanism of AT, macro-re-entrant (perimitral, cavotricuspid isthmus-dependent, and roof-dependent circuits) versus centrifugal (focal-source) activation, and the location of the arrhythmia in centrifugal AT. The accuracy of the noninvasive diagnosis and of the detection of ablation targets was evaluated against subsequent invasive mapping and successful ablation. RESULTS Comparison between ECM and electrophysiological diagnosis could be accomplished in 48 patients (48 AT) but was not possible in 4 patients in whom the AT mechanism changed to another AT (n = 1), atrial fibrillation (n = 1), or sinus rhythm (n = 2) during the electrophysiological procedure. ECM correctly diagnosed the AT mechanism in 44 of 48 (92%) AT: macro-re-entry in 23 of 27, and focal onset with centrifugal activation in 21 of 21. The region of interest for focal AT matched exactly in 21 of 21 (100%) AT. Two-to-one ventricular conduction and low-amplitude P waves challenged the diagnosis in 4 of 27 macro-re-entrant (perimitral) AT; these limitations can be overcome by administering atrioventricular node blockers and by signal averaging, respectively. CONCLUSIONS This prospective multicenter series shows a high success rate of ECM in accurately diagnosing the mechanism of AT and the location of focal arrhythmia. Intraprocedural use of the system and its application to atrial fibrillation mapping are under way.
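Signal averaging, mentioned above as a remedy for low-amplitude P waves, is a standard technique; the sketch below applies it to synthetic data and is not the CardioInsight processing chain.

```python
import numpy as np

def signal_average(recording, trigger_indices, window):
    """Average fixed-length windows aligned on trigger samples; uncorrelated
    noise shrinks roughly as 1/sqrt(number of averaged beats)."""
    segments = [recording[i:i + window] for i in trigger_indices
                if i + window <= len(recording)]
    return np.mean(segments, axis=0)

# Synthetic demonstration: a small repetitive deflection buried in noise.
rng = np.random.default_rng(0)
template = 0.05 * np.hanning(40)             # low-amplitude "P wave" (made up)
signal = rng.normal(0.0, 0.05, 4000)         # baseline noise
triggers = np.arange(100, 3900, 200)         # assumed beat-aligned markers
for i in triggers:
    signal[i:i + 40] += template
averaged = signal_average(signal, triggers, 40)
print(f"peak of averaged beat: {averaged.max():.3f} (template peak is 0.05)")
```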
Abstract:
PURPOSE We tested the hypothesis that whiplash trauma leads to changes in the signal intensity of cervical discs on T2-weighted images. METHODS AND MATERIALS Fifty whiplash patients (18-65 years) were examined within 48 h after a motor vehicle accident, and again after 3 and 6 months, and compared with 50 age- and sex-matched controls. Signal intensities in ROIs of the discs at levels C2/3 to C7/T1 and of the adjacent vertebral bodies were measured on sagittal T2-weighted MR images and normalized using the average of ROIs in fat tissue. The contrast between each disc and both adjacent vertebrae was calculated, and disc degeneration was graded with the Pfirrmann grading system. RESULTS Whiplash trauma did not have a significant effect on the normalized signals from discs and vertebrae, on the contrast between discs and adjacent vertebrae, or on the Pfirrmann grading. However, the contrast between discs and adjacent vertebrae and the Pfirrmann grading were strongly correlated. In healthy volunteers, the contrast between discs and adjacent vertebrae and the Pfirrmann grading increased with age and depended on the disc level. CONCLUSION We could not find any trauma-related changes in cervical disc signal intensities. Normalized disc signals and Pfirrmann grading changed with age and varied between disc levels with the MR sequence used.
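The following sketch illustrates the kind of quantities described in the methods, with made-up ROI values; the normalization to fat follows the text, while the specific contrast formula (a Michelson-type ratio of the disc and the mean of its adjacent vertebrae) is an assumption for illustration only.

```python
# Illustrative only: made-up ROI means and an assumed contrast definition.
def normalized_signal(roi_mean, fat_roi_mean):
    """T2 signal of an ROI expressed relative to the fat reference ROI."""
    return roi_mean / fat_roi_mean

def disc_vertebra_contrast(disc, vert_above, vert_below):
    """Assumed Michelson-type contrast between a disc and the mean of its
    adjacent vertebral bodies; it grows as the disc loses T2 signal."""
    vert = 0.5 * (vert_above + vert_below)
    return (vert - disc) / (vert + disc)

fat = 820.0                                  # mean fat ROI signal (arbitrary units)
disc_c5_c6, vert_c5, vert_c6 = 310.0, 450.0, 470.0

print(normalized_signal(disc_c5_c6, fat))                    # ~0.38
print(disc_vertebra_contrast(disc_c5_c6, vert_c5, vert_c6))  # ~0.19
```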
Abstract:
In the event of a termination of the Gravity Recovery and Climate Experiment (GRACE) mission before the launch of GRACE Follow-On (due for launch in 2017), high-low satellite-to-satellite tracking (hl-SST) will be the only dedicated observing system with global coverage available to measure the time-variable gravity field (TVG) on a monthly or even shorter time scale. Until recently, hl-SST TVG observations were of poor quality and hardly improved on the performance of Satellite Laser Ranging observations; to date, they have been of only very limited use for geophysical or environmental investigations. In this paper, we apply a thorough reprocessing strategy and a dedicated Kalman filter to Challenging Minisatellite Payload (CHAMP) data to demonstrate that it is possible to derive the very long-wavelength TVG features, down to spatial scales of approximately 2000 km, at the annual frequency and for multi-year trends. The results are validated against GRACE data and surface height changes from long-term GPS ground stations in Greenland. We find that the quality of the CHAMP solutions is sufficient to derive long-term trends and annual amplitudes of mass change over Greenland. We conclude that hl-SST is a viable source of information on TVG and can serve, to some extent, to bridge a possible gap between the end of life of GRACE and the availability of GRACE Follow-On.
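To make the estimation idea concrete, here is a minimal Kalman-filter sketch that sequentially fits a bias, a linear trend, and an annual cycle to one noisy monthly series; the single-series framing, state design, and noise levels are assumptions and not the dedicated filter used in the paper.

```python
import numpy as np

def kalman_trend_annual(t_years, y, obs_var=1.0, process_var=1e-6):
    """Sequentially estimate [bias, trend, annual_cos, annual_sin] from a
    noisy monthly series (a toy stand-in for one gravity coefficient)."""
    x = np.zeros(4)                          # state estimate
    P = np.eye(4) * 1e3                      # diffuse initial covariance
    Q = np.eye(4) * process_var              # slow random-walk process noise
    w = 2.0 * np.pi                          # annual angular frequency (1/yr)
    for tk, yk in zip(t_years, y):
        H = np.array([1.0, tk, np.cos(w * tk), np.sin(w * tk)])
        P = P + Q                            # prediction step (static state)
        S = H @ P @ H + obs_var              # innovation variance
        K = P @ H / S                        # Kalman gain
        x = x + K * (yk - H @ x)             # measurement update
        P = P - np.outer(K, H) @ P
    return x

# Synthetic monthly series: bias 0.5, trend -2 per year, annual amplitude 3.
t = np.arange(0.0, 6.0, 1.0 / 12.0)
rng = np.random.default_rng(1)
y = 0.5 - 2.0 * t + 3.0 * np.cos(2.0 * np.pi * t) + rng.normal(0.0, 1.0, t.size)
print(kalman_trend_annual(t, y))             # recovers bias, trend, annual terms
```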
Abstract:
Medical instrumentation used in diagnosis and treatment relies on the accurate detection and processing of various physiological events and signals. While signal detection technology has improved greatly in recent years, there remain inherent delays in signal detection and processing. These delays may have significant negative clinical consequences during various pathophysiological events. Reducing or eliminating such delays would increase the ability to provide successful early intervention in certain disorders, thereby increasing the efficacy of treatment. In recent years, a physical phenomenon referred to as Negative Group Delay (NGD), demonstrated in simple electronic circuits, has been shown to temporally advance the detection of analog waveforms. Specifically, the output is temporally advanced relative to the input, as the time delay through the circuit is negative: the circuit output precedes the complete detection of the input signal. This process is referred to as signal advance (SA) detection. An SA circuit model incorporating NGD was designed, developed, and tested. It imparts a constant temporal signal advance over a pre-specified spectral range in which the output is almost identical to the input signal (i.e., it has minimal distortion). Certain human patho-electrophysiological events are good candidates for the application of temporally advanced waveform detection. SA technology has potential in early arrhythmia and epileptic seizure detection and intervention. Demonstrating reliable and consistent temporally advanced detection of electrophysiological waveforms may enable intervention in a pathological event (much) earlier than previously possible. SA detection could also be used to improve the performance of neural computer interfaces, neurotherapy applications, radiation therapy, and imaging. In this study, the performance of a single-stage SA circuit model on a variety of constructed input signals and human ECGs is investigated. The data obtained are used to quantify and characterize the temporal advances and circuit gain, as well as distortions in the output waveforms relative to their inputs. This project combines elements of physics, engineering, signal processing, statistics, and electrophysiology. Its success has important consequences for the development of novel interventional methodologies in cardiology and neurophysiology, as well as significant potential in a broader range of both biomedical and non-biomedical applications.
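As a rough numerical analogue of the effect described above (a generic first-order approximation, not the single-stage SA circuit model studied in this work), adding a scaled time derivative to a band-limited signal yields an output that leads the input by roughly the chosen advance.

```python
import numpy as np

# For a band-limited input, y(t) = x(t) + tau * dx/dt approximates x(t + tau),
# so the output leads the input by about tau. All parameters are assumptions.
fs = 1000.0                        # sampling rate in Hz (assumed)
tau = 0.010                        # target advance of 10 ms (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.exp(-((t - 0.5) ** 2) / (2 * 0.05 ** 2))   # smooth, band-limited test pulse

y = x + tau * np.gradient(x, 1.0 / fs)            # first-order "signal advance"

peak_lead_ms = (t[np.argmax(x)] - t[np.argmax(y)]) * 1e3
print(f"output peak leads input peak by {peak_lead_ms:.1f} ms")  # close to 10 ms
```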