969 results for "Attenuation correction"
Abstract:
We evaluated the performance of an optical camera-based prospective motion correction (PMC) system in improving the quality of 3D echo-planar imaging functional MRI data. An optical camera and external marker were used to dynamically track the head movement of subjects during fMRI scanning. PMC was performed by using the motion information to dynamically update the sequence's RF excitation and gradient waveforms such that the field of view was realigned to match the subject's head movement. Task-free fMRI experiments on five healthy volunteers followed a 2×2×3 factorial design with the following factors: PMC on or off; 3.0 mm or 1.5 mm isotropic resolution; and no, slow, or fast head movements. Visual and motor fMRI experiments were additionally performed on one of the volunteers at 1.5 mm resolution, comparing PMC on versus PMC off for no and slow head movements. Metrics were developed to quantify the amount of motion as it occurred relative to k-space data acquisition. The motion quantification metric collapsed the very rich camera tracking data into one scalar value per image volume that was strongly predictive of motion-induced artifacts. The PMC system did not introduce extraneous artifacts in the no-motion conditions and improved the time-series temporal signal-to-noise ratio by 30% to 40% for all combinations of low/high resolution and slow/fast head movement relative to the standard acquisition with no prospective correction. The numbers of activated voxels (p < 0.001, uncorrected) in both task-based experiments were comparable in the no-motion cases and increased by 78% and 330%, respectively, for PMC on versus PMC off in the slow-motion cases. The PMC system is a robust solution that decreases the motion sensitivity of multi-shot 3D EPI sequences and thereby overcomes one of the main roadblocks to their widespread use in fMRI studies.
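The temporal signal-to-noise ratio (tSNR) improvement reported above can be illustrated with a minimal sketch. The data below are synthetic and hypothetical, chosen only to show the conventional tSNR computation (temporal mean divided by temporal standard deviation) and how inflated temporal noise lowers it:

```python
import random
import statistics

def tsnr(ts):
    """Temporal SNR of one voxel's time series: mean over time / std over time."""
    return statistics.mean(ts) / statistics.pstdev(ts)

rng = random.Random(0)
# Hypothetical single-voxel time series around a baseline signal of 100
corrected = [100.0 + rng.gauss(0, 1.0) for _ in range(200)]    # low temporal noise
uncorrected = [100.0 + rng.gauss(0, 5.0) for _ in range(200)]  # motion inflates noise
print(tsnr(corrected) > tsnr(uncorrected))  # True
```

The same signal level with larger temporal fluctuations yields a lower tSNR, which is why suppressing motion-induced signal variance raises the metric.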
Surgical treatment of hallux valgus: soft-tissue correction or subcapital osteotomy?
Abstract:
[This corrects the article DOI: 10.1371/journal.pone.0114418.]
Abstract:
OBJECTIVES: To determine the inter-session and intra-/inter-individual variations of the attenuation of aortic blood and myocardium with MDCT in the context of calcium scoring, and to evaluate whether these variations depend on patient characteristics. METHODS: Fifty-four volunteers were evaluated with non-enhanced calcium-scoring CT. We measured the attenuation (inter-individual variation) and standard deviation (SD, intra-individual variation) of the blood in the ascending aorta and of the myocardium of the left ventricle. Every volunteer was examined twice to study the inter-session variation. The fat-pad thickness at the sternum and the noise (SD of air) were also measured. These values were correlated with the measured aortic/ventricular attenuations and their SDs (Pearson). The historically fixed thresholds (90 and 130 HU) were tested against different models based on the attenuation of blood/ventricle. RESULTS: The mean attenuation was 46 HU (range, 17-84 HU) with a mean SD of 23 HU for the blood, and 39 HU (10-82 HU) with a mean SD of 18 HU for the myocardium. The attenuation/SD of the blood were significantly higher than those of the myocardium (p < 0.01). The inter-session variation was not significant. There was a poor correlation between the SDs of aortic blood/ventricle and fat thickness/noise. Based on the existing models, the 90 HU threshold offers a confidence interval of approximately 95% and the 130 HU threshold more than 99%. CONCLUSIONS: The historical thresholds offer high confidence intervals for the exclusion of aortic blood/myocardium and thereby for the detection of calcifications. Nevertheless, considering the large variations of blood/myocardium CT values and the influence of patient characteristics, a better approach might be an adaptive threshold.
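The relation between the reported blood statistics and the two thresholds can be sketched numerically. This is an illustration under a crude normal assumption for the blood attenuation, not the study's actual models:

```python
from statistics import NormalDist

# Reported blood statistics: mean attenuation 46 HU, mean SD 23 HU
blood = NormalDist(mu=46, sigma=23)

# Fraction of blood values expected to fall below each threshold under this model
for threshold_hu in (90, 130):
    print(threshold_hu, round(blood.cdf(threshold_hu), 4))
```

Under this simplified model, roughly 97% of blood values lie below 90 HU and well over 99.9% below 130 HU, consistent with the abstract's point that the historical thresholds exclude blood and myocardium with high confidence.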
Abstract:
For a massless fluid (density = 0), steady flow along a duct is governed exclusively by viscous losses. In this paper, we show that the velocity profile obtained in this limit can be used to calculate the pressure drop up to first order in density. This method has been applied to the particular case of a duct defined by two plane-parallel discs. For this case, the first-order approximation yields a simple analytical solution, which has been favorably checked against numerical simulations. Finally, an experiment has been carried out with water flowing between the discs. The experimental results show good agreement with the approximate solution.
Abstract:
In this work, we propose a method for prospective motion correction in MRI using a novel image-navigator module, which is triggered by a free induction decay (FID) navigator. The image navigator is run only when motion occurs, and new positional information is then obtained through image registration. The image navigator was specifically designed to match the impact on the magnetization and the acoustic noise of the host sequence. This detection-correction scheme was implemented for an MP-RAGE sequence, and 5 healthy volunteers were scanned at 3 T while performing various head movements. The correction performance was demonstrated through automated brain segmentation and an image quality index whose results are sensitive to motion artifacts.
Abstract:
Connectivity analysis on whole-brain diffusion MRI data suffers from distortions caused by standard echo-planar imaging acquisition strategies. These images show characteristic geometric deformations and signal dropout, an important drawback limiting the success of tractography algorithms. Several retrospective correction techniques are readily available. In this work, we use a digital phantom designed for the evaluation of connectivity pipelines. We subject the phantom to a "theoretically correct" and plausible deformation that resembles the artifact under investigation. We then correct the data back with three standard methodologies (namely fieldmap-based, reversed-encoding-based, and registration-based). Finally, we rank the methods based on their geometrical accuracy, their dropout compensation, and their impact on the resulting connectivity matrices.
Abstract:
Using event-related brain potentials, the time course of error detection and correction was studied in healthy human subjects. A feedforward model of error correction was used to predict the timing properties of the error and corrective movements. Analysis of the multichannel recordings focused on (1) the error-related negativity (ERN) seen immediately after errors in response- and stimulus-locked averages and (2) the lateralized readiness potential (LRP) reflecting motor preparation. Comparison of the onset and time course of the ERN and LRP components showed that the signs of corrective activity preceded the ERN. Thus, error correction was implemented before, or at least in parallel with, the appearance of the ERN component. Also, the amplitude of the ERN component was increased for errors followed by fast corrective movements. The results are compatible with recent views that consider the ERN component the output of an evaluative system engaged in monitoring motor conflict.
Abstract:
In this work we propose a new approach for determining the mobility of mercury in sediments based on the spatial distribution of concentrations. We chose the Tainheiros Cove, located in the Todos os Santos Bay, Brazil, as the study area, because it has a history of mercury contamination from a chlor-alkali plant that operated for 12 years. Twenty-six surface sediment samples were collected from the area and mercury concentrations were measured by cold vapour atomic absorption spectrophotometry. A contour map was constructed from the results, indicating that mercury accumulated in a "hot spot" where concentrations reach more than 1 µg g-1. The model estimates the mobility of mercury in the sediments from the distances between iso-concentration contours, which determine a concentration attenuation factor. Attenuation values ranged from 0.0729 (east of the hot spot, indicating higher mobility) to 0.7727 (north of the hot spot, indicating lower mobility).
Abstract:
In this work, a new mathematical correction approach for overcoming spectral and transport interferences is proposed. The approach was applied to eliminate the spectral interference caused by PO molecules at the 217.0005 nm Pb line, and the transport interference caused by variations in phosphoric acid concentration. Correction may be necessary at 217.0005 nm to account for the contribution of PO, since A_total(217.0005 nm) = A_Pb(217.0005 nm) + A_PO(217.0005 nm). This can be done by measuring another PO wavelength (e.g. 217.0458 nm) and calculating the relative contribution of the PO absorbance (A_PO) to the total absorbance (A_total) at 217.0005 nm: A_Pb(217.0005 nm) = A_total(217.0005 nm) - A_PO(217.0005 nm) = A_total(217.0005 nm) - k·A_PO(217.0458 nm). The correction factor k is calculated from the slopes of calibration curves built for phosphorus (P) standard solutions measured at 217.0005 and 217.0458 nm, i.e. k = slope(217.0005 nm)/slope(217.0458 nm). For wavelength-integrated absorbance over 3 pixels and a sample aspiration rate of 5.0 ml min-1, analytical curves in the 0.1-1.0 mg L-1 Pb range with linearity better than 0.9990 were consistently obtained. Calibration curves for P at 217.0005 and 217.0458 nm with linearity better than 0.998 were obtained. Relative standard deviations (RSD) of measurements (n = 12) were in the 1.4-4.3% and 2.0-6.0% ranges without and with the correction approach, respectively. The limit of detection for the 217.0005 nm analytical line was 10 µg L-1 Pb. Recoveries for Pb spikes were in the 97.5-100% and 105-230% intervals with and without the correction approach, respectively.
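The correction arithmetic above can be sketched in a few lines. The slope and absorbance values below are hypothetical, chosen only to illustrate how k is derived and applied:

```python
# Hypothetical slopes of the P calibration curves at the two PO wavelengths
slope_po_main = 0.042       # slope at 217.0005 nm
slope_po_reference = 0.060  # slope at 217.0458 nm
k = slope_po_main / slope_po_reference  # k = 0.7

def corrected_pb_absorbance(a_total_main, a_po_reference):
    """A_Pb(217.0005 nm) = A_total(217.0005 nm) - k * A_PO(217.0458 nm)."""
    return a_total_main - k * a_po_reference

# Hypothetical reading: total absorbance 0.250 at the Pb line,
# PO absorbance 0.100 at the interference-free PO wavelength
print(round(corrected_pb_absorbance(0.250, 0.100), 3))  # 0.18
```

Subtracting the scaled PO contribution leaves only the Pb absorbance, which is what brings spiked recoveries back toward 100%.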
Abstract:
Thermal and air conditions inside animal facilities change during the day under the influence of the external environment. For statistical and geostatistical analyses to be representative, a large number of points spatially distributed over the facility area must be monitored. This work suggests that the temporal variation of environmental variables of interest for animal production, monitored within an animal facility, can be modeled accurately from discrete-time records. The aim of this study was to develop a numerical method to correct for the temporal variation of these environmental variables, transforming the data so that the observations are independent of the time spent during the measurement. The proposed method adjusts values recorded with time delays towards those expected at the exact moment of interest, as if the data had been measured simultaneously at all spatially distributed points. The correction model was validated for the air temperature variable, and the values corrected by the method did not differ significantly (Tukey's test, 5%) from the actual values recorded by data loggers.
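One minimal way to realise such a correction can be sketched as follows. This is a hypothetical illustration, not the study's method: it assumes a stationary reference logger records the facility-wide drift and that each roving measurement point drifts by the same amount over the survey interval:

```python
# A stationary reference logger samples continuously; a value measured at a
# roving point at t_measured is shifted to the common moment t_target by the
# drift the reference logger observed over the same interval.

def correct_to_moment(value, t_measured, t_target, ref_series):
    """ref_series maps time (min) -> reference-logger temperature (deg C)."""
    drift = ref_series[t_target] - ref_series[t_measured]
    return value + drift

ref = {0: 24.0, 5: 24.6, 10: 25.2}  # hypothetical reference temperatures
# A point read 23.5 deg C at t = 0 min; estimate its value at t = 10 min
print(round(correct_to_moment(23.5, 0, 10, ref), 1))  # 24.7
```

After this adjustment, readings taken sequentially across the facility can be compared as if they had all been taken at the same moment.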
Abstract:
Objective: To analyze the performance of two surgical meshes of different compositions during the healing of abdominal wall defects in rats. Methods: Thirty-three adult Wistar rats were anesthetized and subjected to removal of a 1.5 cm x 2 cm area of the anterior abdominal wall, sparing the skin; in 17 animals the defect was corrected by edge-to-edge surgical suture of a mesh made of polypropylene + poliglecaprone (Group U - UltraproTM); in 16 animals the defect was corrected with a surgical mesh made of polypropylene + polydioxanone + cellulose (Group P - ProceedTM). Each group was divided into two subgroups according to the time of euthanasia (seven or 28 days after the operation). The parameters analyzed were macroscopic (adhesions), microscopic (quantification of mature and immature collagen) and tensiometric (maximum tension and maximum rupture strength). Results: There was an increase in type I collagen in the ProceedTM group from seven to 28 days (p = 0.047), and an increase in rupture tension in both groups between the two periods. The ProceedTM mesh showed lower rupture tension and tissue deformity at seven days, becoming equal to the UltraproTM mesh at day 28. Conclusion: The meshes yield similar final results, and further studies with larger numbers of animals must be carried out for better assessment.
Abstract:
We introduce a new tool for correcting OCR errors in materials held in a repository of cultural materials. The poster is aimed at all who are interested in digital humanities and who might find our tool useful; it focuses on the OCR correction tool and on the background processes. We have started a project on materials published in Finno-Ugric languages in the Soviet Union in the 1920s and 1930s. The materials are digitised in Russia, and as they arrive we publish them in DSpace (fennougrica.kansalliskirjasto.fi). For research purposes, the results of the OCR must be corrected manually, and for this we have built a new tool. Although similar tools exist, we found in-house development necessary in order to serve the researchers' needs. The tool enables exporting the corrected text as required by the researchers, and it makes it possible to distribute the correction tasks and their supervision. After a supervisor has approved a text as finalised, the new version of the work replaces the old one in DSpace. The project has benefited the small language communities, opened channels for cooperation in Russia, and increased our capabilities in digital humanities. The OCR correction tool will be made available to others.