58 results for "Gaussian and Lorentz spectral fitting"
Abstract:
BACKGROUND: Reports on the effects of focal hemispheric damage on sleep EEG are rare and contradictory. PATIENTS AND METHODS: Twenty patients (mean age ± SD: 53 ± 14 years) with a first acute hemispheric stroke and no sleep apnea were studied. Stroke severity [National Institutes of Health Stroke Scale (NIHSS)], volume (diffusion-weighted brain MRI), and short-term outcome (Rankin score) were assessed. Within the first 8 days after stroke onset, 1-3 sleep EEG recordings per patient were performed. Sleep scoring and spectral analysis were based on the central derivation of the healthy hemisphere. Data were compared with those of 10 age- and gender-matched hospitalized controls with no brain damage and no sleep apnea. RESULTS: Stroke patients had more wakefulness after sleep onset (112 ± 53 min vs. 60 ± 38 min, p < 0.05) and lower sleep efficiency (76 ± 10% vs. 86 ± 8%, p < 0.05) than controls. Time spent in slow-wave sleep (SWS) and rapid eye movement (REM) sleep and total sleep time were lower in stroke patients, but the differences were not significant. A positive correlation was found between the amount of SWS and stroke volume (r = 0.79). The slow-wave activity (SWA) ratio NREM sleep/wakefulness was lower in patients than in controls (p < 0.05) and correlated with the NIHSS (r = -0.47). CONCLUSION: Acute hemispheric stroke is accompanied by alterations of the sleep EEG over the healthy hemisphere that correlate with stroke volume and outcome. The increased SWA during wakefulness and SWS over the healthy hemisphere contralateral to large strokes may reflect neuronal hypometabolism induced transhemispherically (diaschisis).
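The spectral analysis referred to above reduces, in essence, to integrating EEG power over the slow-wave band and comparing sleep states. A minimal sketch of such a computation on synthetic data (the 0.75-4.5 Hz band limits, sampling rate, and signals are illustrative assumptions, not the study's actual pipeline):

```python
import numpy as np
from scipy.signal import welch

def slow_wave_activity(eeg, fs, band=(0.75, 4.5)):
    """Integrate the Welch PSD over the slow-wave band (band limits assumed)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=4 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])  # rectangle-rule integral

rng = np.random.default_rng(0)
fs = 128                              # Hz, hypothetical sampling rate
t = np.arange(0, 60, 1 / fs)          # one minute of synthetic signal
# synthetic "NREM" trace: strong 2 Hz delta rhythm plus noise; "wake": noise only
nrem = np.sin(2 * np.pi * 2.0 * t) + 0.5 * rng.standard_normal(t.size)
wake = 0.5 * rng.standard_normal(t.size)
swa_ratio = slow_wave_activity(nrem, fs) / slow_wave_activity(wake, fs)
print(swa_ratio > 1.0)  # prints True: the delta-dominated trace has higher SWA
```

A lower NREM/wake SWA ratio, as reported for the patients, can arise either from reduced SWA in NREM sleep or from increased SWA during wakefulness.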
Issues of spectral quality in clinical 1H-magnetic resonance spectroscopy and a gallery of artifacts
Abstract:
Although magnetic resonance spectroscopy (MRS) is applied as a clinical tool in non-specialized institutions, and semi-automatic acquisition and processing tools can produce quantitative information from MRS exams without expert input, issues of spectral quality and quality assessment are neglected in the MR spectroscopy literature. Worse, there is no consensus among experts on concepts or detailed criteria for the quality assessment of MR spectra. Furthermore, artifacts are not at all conspicuous in MRS and can easily be mistaken for true, interpretable features. This article aims to increase interest in issues of spectral quality and quality assessment, to start a broader debate on generally accepted criteria that spectra must fulfil to be clinically and scientifically acceptable, and to provide a sample gallery of artifacts that can be used to raise awareness of potential pitfalls in MRS.
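Two metrics that commonly enter such quality discussions are signal-to-noise ratio and linewidth (FWHM). A toy sketch with illustrative definitions (peak height over the standard deviation of a signal-free region, and the outermost half-maximum crossings); these are generic conventions, not the criteria this article proposes:

```python
import numpy as np

def snr_and_fwhm(freq, spectrum, noise_slice):
    """Illustrative quality metrics: SNR = peak height / noise sd;
    FWHM from the outermost half-maximum crossings of the largest peak."""
    noise_sd = np.std(spectrum[noise_slice])
    peak = spectrum.max()
    above = np.where(spectrum >= peak / 2)[0]
    return peak / noise_sd, freq[above[-1]] - freq[above[0]]

# synthetic Lorentzian line (gamma = 0.1, so the true FWHM is 0.2)
freq = np.linspace(-2.0, 2.0, 4001)
gamma = 0.1
spectrum = gamma**2 / (freq**2 + gamma**2)
rng = np.random.default_rng(0)
spectrum += 0.002 * rng.standard_normal(freq.size)
snr, fwhm = snr_and_fwhm(freq, spectrum, slice(0, 500))  # noise taken far from the peak
```

Even these simple numbers depend on arbitrary choices (where the "noise region" lies, how the baseline is handled), which is part of why consensus criteria are hard to pin down.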
Abstract:
Spectral domain optical coherence tomography (SD-OCT) in patients can deliver retinal cross-sectional images with high resolution. This may allow the evaluation of the extent of damage to the retinal pigment epithelium (RPE) and the neurosensory retina after laser treatment. This article aims to investigate the value of SD-OCT in comparing laser lesions produced by conventional laser photocoagulation and selective retina treatment (SRT).
Abstract:
Several methods based on Kriging have recently been proposed for calculating a probability of failure involving costly-to-evaluate functions. A closely related problem is to estimate the set of inputs leading to a response exceeding a given threshold. Now, estimating such a level set—and not solely its volume—and quantifying uncertainties on it are not straightforward. Here we use notions from random set theory to obtain an estimate of the level set, together with a quantification of estimation uncertainty. We give explicit formulae in the Gaussian process set-up and provide a consistency result. We then illustrate how space-filling versus adaptive design strategies may sequentially reduce level set estimation uncertainty.
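The plug-in baseline that such random-set estimates refine can be sketched with an ordinary Gaussian process posterior: classify a point as belonging to the level set when its posterior probability of exceeding the threshold passes 1/2. A minimal numpy sketch (RBF kernel, noise-free toy function; all names and values here are illustrative, not the paper's estimator):

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ls=0.4):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

f = lambda x: np.sin(3 * x)              # stand-in for a costly-to-evaluate function
X = np.array([0.0, 0.4, 0.9, 1.4, 2.0])  # small initial design
y = f(X)

xs = np.linspace(0.0, 2.0, 201)
K = rbf(X, X) + 1e-10 * np.eye(X.size)   # jitter for numerical stability
Ks = rbf(xs, X)
mu = Ks @ np.linalg.solve(K, y)                               # posterior mean
var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
sd = np.sqrt(np.clip(var, 1e-12, None))                       # posterior sd

T = 0.5                                  # threshold defining {x : f(x) > T}
p_exceed = norm.sf((T - mu) / sd)        # pointwise P(f(x) > T | data)
level_set = xs[p_exceed > 0.5]           # plug-in estimate of the level set
```

The random-set machinery in the paper goes beyond this pointwise thresholding by treating the excursion set as a random object and quantifying the uncertainty of the whole set estimate.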
Abstract:
In the context of expensive numerical experiments, a promising solution for alleviating the computational costs consists of using partially converged simulations instead of exact solutions. The gain in computational time is at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge consists of the adequate approximation of the error due to partial convergence, which is correlated in both design variables and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that reflects accurately the actual structure of the error. Practical solutions are proposed for solving parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
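One way to read the construction: make the covariance over (design, time) the converged-response kernel in the design variables plus an error term whose amplitude decays as computation time grows. A hand-rolled sketch of such a nonstationary kernel (the functional form, parameter names, and values are illustrative choices, not the paper's kernel):

```python
import numpy as np

def joint_kernel(x1, t1, x2, t2, ls_x=0.5, ls_t=2.0, s2_err=1.0, rate=1.0):
    """Covariance over (design x, compute time t): a stationary RBF in x
    modulated by a convergence-error term whose amplitude decays like
    exp(-rate*t/2) at each input, so longer runs carry smaller error.
    Illustrative form; it remains positive semi-definite by construction."""
    kx = np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ls_x) ** 2)
    kt = np.exp(-0.5 * ((t1[:, None] - t2[None, :]) / ls_t) ** 2)
    amp = np.exp(-0.5 * rate * (t1[:, None] + t2[None, :]))  # nonstationary amplitude
    return kx * (1.0 + s2_err * amp * kt)

x = np.array([0.1, 0.5, 0.9])
t = np.array([1.0, 2.0, 4.0])   # longer runs => smaller error variance on the diagonal
K = joint_kernel(x, t, x, t)
```

The amplitude term is separable, exp(-rate*t/2) * exp(-rate*t'/2), which keeps the whole expression a valid covariance while making the prior variance shrink toward the converged response as t grows.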
Abstract:
We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression that is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally, we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C.
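For orientation, the two regularizers are usually written, per frequency bin and up to the hyperparameter alpha, as S_SJ = rho - m - rho*log(rho/m) (Shannon-Jaynes) and S_BR = 1 - rho/m + log(rho/m), where m is the default model. Both vanish at rho = m and penalize any deviation, but the second diverges as rho -> 0 while the first stays bounded, which is the flat direction being removed. A small numeric check of those properties (these closed forms are my paraphrase of the standard expressions, not copied from the paper):

```python
import numpy as np

def s_shannon_jaynes(rho, m):
    """Shannon-Jaynes entropy, summed over bins (hyperparameter alpha omitted)."""
    return np.sum(rho - m - rho * np.log(rho / m))

def s_br(rho, m):
    """Bayesian-reconstruction prior term, summed over bins (alpha omitted)."""
    return np.sum(1.0 - rho / m + np.log(rho / m))

m = np.ones(64)                    # flat default model
assert s_shannon_jaynes(m, m) == 0.0 and s_br(m, m) == 0.0
for rho in (0.5 * m, 1.5 * m):     # any deviation from m is penalized by both
    assert s_shannon_jaynes(rho, m) < 0 and s_br(rho, m) < 0
print(s_br(1e-8 * m, m))           # large negative: rho ~ 0 is strongly disfavored
```

At rho = 1e-8 the Shannon-Jaynes term saturates near -1 per bin, whereas the second form grows without bound in magnitude, illustrating the removal of the flat directions.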
Abstract:
Localized short-echo-time 1H-MR spectra of human brain contain contributions from many low-molecular-weight metabolites and baseline contributions from macromolecules. Two approaches to model such spectra are compared, and the data acquisition sequence, optimized for reproducibility, is presented. Modeling relies on prior-knowledge constraints and linear combination of metabolite spectra. We investigated what can be gained by basis parameterization, i.e., description of basis spectra as sums of parametric lineshapes. The effects of basis composition and of adding experimentally measured macromolecular baselines were also investigated. Both fitting methods yielded quantitatively similar values, model deviations, error estimates, and reproducibility in the evaluation of 64 spectra of human gray and white matter from 40 subjects. Major advantages of parameterized basis functions are the possibilities to evaluate fitting parameters separately, to treat subgroup spectra as independent moieties, and to incorporate deviations from straightforward metabolite models. It was found that most of the 22 basis metabolites used may provide meaningful data when comparing patient cohorts. In individual spectra, sums of closely related metabolites are often more meaningful. Inclusion of a macromolecular basis component leads to relatively small but significantly different tissue content for most metabolites. It provides a means to quantitate baseline contributions that may contain crucial clinical information.
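Parametric lineshapes in this setting are typically Gaussian and Lorentzian profiles combined linearly, which is also the theme of this search. A minimal least-squares sketch of that idea on synthetic data (peak positions, widths, amplitudes, and the noise level are invented for the example; real MRS fitting adds phase, baseline, and prior-knowledge constraints):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, x0, w):
    return a * np.exp(-0.5 * ((x - x0) / w) ** 2)

def lorentzian(x, a, x0, g):
    return a * g**2 / ((x - x0) ** 2 + g**2)

def model(x, ag, xg, wg, al, xl, gl):
    # linear combination of one Gaussian and one Lorentzian line
    return gaussian(x, ag, xg, wg) + lorentzian(x, al, xl, gl)

x = np.linspace(0.0, 10.0, 500)
true = (1.0, 3.0, 0.4, 0.8, 6.5, 0.3)           # invented ground truth
rng = np.random.default_rng(1)
y = model(x, *true) + 0.01 * rng.standard_normal(x.size)
popt, _ = curve_fit(model, x, y, p0=(0.9, 3.2, 0.5, 0.7, 6.3, 0.4))
print(np.round(popt, 2))   # recovered parameters close to `true`
```

Describing each basis spectrum as such a sum of parametric lineshapes is what allows fitting parameters to be evaluated separately and metabolite subgroups to be treated independently.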
Abstract:
We obtain eigenvalue enclosures and basisness results for eigen- and associated functions of a non-self-adjoint unbounded linear operator pencil A − λB in which B is uniformly positive and the essential spectrum of the pencil is empty. Both Riesz basisness and Bari basisness results are obtained. The results are applied to a system of singular differential equations arising in the study of Hagen–Poiseuille flow with non-axisymmetric disturbances.
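In symbols, the spectral problem for such a pencil can be stated as follows (standard formulation in my notation, not quoted from the paper):

```latex
(A - \lambda B)\,u = 0, \qquad
\langle Bu, u\rangle \ \ge\ \beta \,\|u\|^{2} \quad \text{for some } \beta > 0
\text{ and all } u \in \mathcal{D}(B),
```

where the uniform positivity of B makes the problem equivalent to an eigenvalue problem for B^{-1}A, and the basisness results concern whether the eigen- and associated functions form a Riesz basis (equivalent to an orthonormal basis under a bounded invertible map) or the stronger Bari basis.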