3 results for Non-uniform flow

in DigitalCommons@The Texas Medical Center


Relevance:

80.00%

Publisher:

Abstract:

Hypothesis and Objectives: PEGylated liposomal blood pool contrast agents maintain contrast enhancement over several hours. This study aimed to evaluate long-term imaging of the pulmonary arteries, comparing a conventional iodinated contrast agent with a liposomal blood pool contrast agent, and to visualize in real time the therapeutic effect of tissue plasminogen activator (t-PA) on pulmonary embolism (PE).

Materials and Methods: Six rabbits (approximately 4 kg each) had autologous blood clots injected through the superior vena cava. Imaging was performed with a conventional contrast agent (iohexol, 350 mg I/mL; GE HealthCare, Princeton, NJ) at a dose of 1400 mg I per animal; after wash-out, the animals were imaged with an iodinated liposomal blood pool agent (88 mg I/mL, 900 mg I per animal). Five animals were then injected with 2 mg of t-PA and imaging continued for up to 4.5 hours.

Results: Both contrast agents identified PE in the pulmonary trunk and main pulmonary arteries in all rabbits. The liposomal blood pool agent yielded uniform enhancement that remained relatively constant throughout the experiments, whereas the conventional agent showed non-uniform opacification and rapid clearance after injection. Three of the six rabbits had mistimed bolus injections requiring repeat injections. Following t-PA, pulmonary embolus volume (central to segmental) decreased in four of the five treated rabbits (range 10–57%, mean 42%); one animal showed no response to t-PA.

Conclusions: Liposomal blood pool agents effectively identified acute PE without the need for re-injection, and PE resolution following t-PA was quantifiable over several hours. Blood pool agents offer the potential for repeated imaging procedures without repeated injections of nephrotoxic contrast.

Relevance:

80.00%

Publisher:

Abstract:

Detector uniformity is a fundamental performance characteristic of all modern gamma camera systems, and ensuring a stable, uniform detector response is critical for producing clinical images that are free of artifacts. For these reasons, the assessment of detector uniformity is one of the most common activities in a successful clinical quality assurance program for gamma camera imaging. The evaluation of this parameter is often unclear, however, because it depends strongly on acquisition conditions and reviewer expertise and relies on somewhat arbitrary limits that do not characterize the spatial location of the non-uniformities. Furthermore, although the goal of any robust quality control program is to detect significant deviations from standard or baseline conditions, clinicians and vendors often neglect the temporal nature of detector degradation (1).

This thesis describes the development and testing of new methods for monitoring detector uniformity. These techniques provide more quantitative, sensitive, and specific feedback, so that the reviewer is better equipped to identify performance degradation before it manifests in clinical images. The methods exploit the temporal nature of detector degradation and spatially segment distinct regions of non-uniformity using multi-resolution decomposition.

The techniques were tested on synthetic phantom data with different degradation functions, as well as on experimentally acquired time series of flood images with induced, progressively worsening defects in the field of view. The sensitivity of conventional global figures-of-merit for detecting changes in uniformity was evaluated and compared with that of the new image-space techniques. The image-space algorithms provide a reproducible means of detecting regions of non-uniformity before any single flood image exceeds a NEMA uniformity value of 5%. Their sensitivity depends on the size and magnitude of the non-uniformities, as well as on the cause of the non-uniform region. A trend analysis of the conventional figures-of-merit demonstrated their sensitivity to shifts in detector uniformity. Because the image-space algorithms are computationally efficient, they should be used alongside trending of the global figures-of-merit to give the reviewer a richer assessment of gamma camera detector uniformity.
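The 5% action level referenced above is the NEMA integral uniformity figure-of-merit, IU = 100 × (max − min) / (max + min), computed over the analysed field of view of a flood image. The sketch below is a minimal, simplified illustration of that calculation; the 3 × 3 smoothing kernel and the simulated cold defect are assumptions for demonstration, and the full NEMA NU-1 procedure (pixel-size binning, UFOV/CFOV masking) is omitted.

    import numpy as np

    def integral_uniformity(flood, smooth=True):
        """Integral uniformity of a flood image:
        IU = 100 * (max - min) / (max + min), computed after an
        optional 3x3 smoothing pass over the analysed field of view."""
        img = flood.astype(float)
        if smooth:
            # Simple weighted 3x3 kernel used here as a stand-in for the NEMA filter
            kernel = np.array([[1.0, 2.0, 1.0],
                               [2.0, 4.0, 2.0],
                               [1.0, 2.0, 1.0]])
            kernel /= kernel.sum()
            padded = np.pad(img, 1, mode="edge")
            img = sum(kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3))
        hi, lo = img.max(), img.min()
        return 100.0 * (hi - lo) / (hi + lo)

    # Example: a uniform flood with one induced cold defect exceeds the 5% level.
    flood = np.full((64, 64), 1000.0)
    flood[30:34, 30:34] = 850.0  # simulated regional loss of sensitivity
    print(f"IU = {integral_uniformity(flood):.1f}%")  # roughly 8%

A global figure such as this reports only a single worst-case number; the image-space, multi-resolution methods described in the thesis are intended to localize the offending region and flag it before the global value crosses the 5% threshold.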

Relevance:

80.00%

Publisher:

Abstract:

Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis, such as de novo assembly, mapping, and variant detection, is far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM's performance does not depend on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake; for uniform data, MTM performed better than Quake and comparably to Hammer. By improving error correction with MTM, the quality of downstream analyses, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies. However, the existence of sequencing errors complicates this process, especially for the low coverage (
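The abstract does not describe MTM's internal algorithm, so the sketch below illustrates only the general k-mer-spectrum idea that underlies read-error detection in tools of this class (for example Quake and Hammer): k-mers that occur very rarely across a read set are likely to contain sequencing errors. The k-mer length, the min_count threshold, and the toy reads are illustrative assumptions, not MTM's actual parameters.

    from collections import Counter

    def kmer_spectrum(reads, k=15):
        """Count every k-mer occurring in the read set."""
        counts = Counter()
        for read in reads:
            for i in range(len(read) - k + 1):
                counts[read[i:i + k]] += 1
        return counts

    def flag_suspect_reads(reads, k=15, min_count=2):
        """Return indices of reads containing a 'weak' k-mer, i.e. one seen
        fewer than min_count times across the whole read set; such k-mers
        are likely to carry a sequencing error."""
        counts = kmer_spectrum(reads, k)
        return [idx for idx, read in enumerate(reads)
                if any(counts[read[i:i + k]] < min_count
                       for i in range(len(read) - k + 1))]

    # Toy example: the third read carries a single-base substitution error.
    reads = [
        "ACGTACGTACGTACGTACGT",
        "ACGTACGTACGTACGTACGT",
        "ACGTACGTACTTACGTACGT",
    ]
    print(flag_suspect_reads(reads, k=8))  # -> [2]

Frequency thresholds of this kind become unreliable when coverage is highly non-uniform, as in single-cell data, which is the regime in which MTM is reported to outperform both Hammer and Quake.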