95 results for artifacts

Relevance:

10.00%

Publisher:

Abstract:

Emergency CT examination is considered a trade-off between a short scan time and the acceptance of artifacts. This study evaluates the influence of patient repositioning on artifacts and scan time. Eighty-three consecutive multiple-trauma patients were included in this prospective study. Patients were examined either without repositioning (group 1, n=39) or with rotation to a feet-first position with arms raised for scanning the chest and abdomen/pelvis (group 2, n=44). The mean scan time was 21 min in group 1 and 25 min in group 2 (P=0.01). The mean repositioning time in group 2 was 8 min. Significantly more artifacts were observed in group 1 (with a repeated scan in 7%) than in group 2 (P=0.0001). This novel multiple-trauma CT-scanning protocol with patient repositioning achieves higher image quality with significantly fewer artifacts than scanning without repositioning, while increasing scan time only slightly.


Postmortem investigation is increasingly supported by computed tomography (CT) and magnetic resonance imaging, in which postmortem minimally invasive angiography has become important. The newly introduced approach using an aqueous contrast agent solution provided excellent vessel visualization but was suspected of causing tissue edema artifacts in histological investigations. The aim of this study was to investigate, using a porcine heart model, whether the contrast agent distribution within the soft tissue can be influenced by changing its viscosity, i.e., by dissolving the contrast agent in polyethylene glycol (PEG) as a matrix medium. High-resolution CT scans after injection showed that viscosities above approximately 15 mPa·s (65% PEG) prevented contrast agent distribution within the capillary bed of the left ventricular myocardium. Thereby, the preconditions for edema artifacts could be reduced. Its minimally invasive application to human corpses needs to be further adapted, as the flow resistance is expected to differ between tissues.


The use of dental processing software for computed tomography (CT) data (Dentascan) on postmortem (pm) CT data for the purpose of pm identification is described. The software reconstructs reformatted images comparable to conventional panoramic dental radiographs by defining a curved reconstruction line along the teeth on oblique images. Three corpses that had been scanned within the Virtopsy project were used to test the software for dental identification. In every case, dental panoramic images could be reconstructed and compared to antemortem radiographs. The images showed the basic components of teeth (enamel, dentin, and pulp), the anatomic structure of the alveolar bone, missing or unerupted teeth, as well as restorations that could be used for identification. When streak artifacts due to metal-containing dental work reduced image quality, it was still necessary to perform pm conventional radiographs for comparison of the detailed shape of the restoration. Dental identification or dental profiling thus appears to be possible in a noninvasive manner using the Dentascan software.


Features encapsulate the domain knowledge of a software system and thus are valuable sources of information for a reverse engineer. When analyzing the evolution of a system, we need to know how and which features were modified to recover both the change intention and its extent, namely which source artifacts are affected. Typically, the implementation of a feature crosscuts a number of source artifacts. To obtain a mapping from features to source artifacts, we exercise the features and capture their execution traces. However, this results in large traces that are difficult to interpret. To tackle this issue we compact the traces into simple sets of source artifacts that participate in a feature's runtime behavior. We refer to these compacted traces as feature views. Within a feature view, we partition the source artifacts into disjoint sets of characterized software entities. The characterization defines the level of participation of a source entity in the features. We then analyze the features over several versions of a system and plot their evolution to reveal how and which features were affected by changes in the code. We show the usefulness of our approach by applying it to a case study where we address the problem of merging parallel development tracks of the same system.
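The compaction of execution traces into feature views, and the partitioning of entities by their level of participation, can be sketched roughly as follows. This is a minimal illustration only; all entity and function names are invented and do not reflect the paper's actual tooling.

```python
# Sketch: compact execution traces into "feature views" and
# characterize entities by participation level. All names invented.

def feature_view(trace):
    """Compact a raw execution trace (sequence of invoked entities)
    into the set of source artifacts that participated."""
    return set(trace)

def characterize(views):
    """Partition entities into disjoint participation groups:
    entities exercised by exactly one feature vs. shared by several."""
    single, shared = set(), set()
    for entity in set().union(*views.values()):
        count = sum(entity in v for v in views.values())
        (single if count == 1 else shared).add(entity)
    return single, shared

# Toy traces: each feature exercised once, entities recorded in order.
traces = {
    "save": ["ui.Dialog", "io.Writer", "io.Writer", "core.Model"],
    "load": ["ui.Dialog", "io.Reader", "core.Model"],
}
views = {f: feature_view(t) for f, t in traces.items()}
single, shared = characterize(views)
print(sorted(single))  # feature-specific entities
print(sorted(shared))  # crosscutting entities
```

Tracking how these disjoint sets change across versions then reveals which features a code change affected.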


PURPOSE: To prospectively determine the accuracy of 1.5 Tesla (T) and 3 T magnetic resonance angiography (MRA) versus digital subtraction angiography (DSA) in the depiction of infrageniculate arteries in patients with symptomatic peripheral arterial disease. PATIENTS AND METHODS: A prospective comparison of 1.5 T MRA, 3 T MRA, and DSA was used to evaluate 360 vessel segments in 10 patients (15 limbs) with chronic symptomatic peripheral arterial disease. Selective DSA was performed within 30 days before both MRAs. The accuracy of 1.5 T and 3 T MRA was compared with DSA as the standard of reference by consensus agreement of 2 experienced readers. Signal-to-noise ratios (SNRs) and signal-difference-to-noise ratios (SDNRs) were quantified. RESULTS: No significant difference in overall image quality, sufficiency for diagnosis, depiction of arterial anatomy, motion artifacts, or venous overlap was found comparing 1.5 T with 3 T MRA (P > 0.05 by Wilcoxon signed rank and Cohen kappa tests). Overall sensitivity of 1.5 T and 3 T MRA for detection of significant arterial stenosis was 79% and 82%, respectively, and specificity was 87% for both modalities. Interobserver agreement was excellent (kappa > 0.8, P < 0.05) for 1.5 T as well as for 3 T MRA. SNR and SDNR were significantly increased using the 3 T system (average increase: 36.5%, P < 0.032 by t test, and 38.5%, P < 0.037, respectively). CONCLUSIONS: Despite marked improvement of SDNR, 3 T MRA does not yet provide significantly higher accuracy in diagnostic imaging of atherosclerotic lesions below the knee joint as compared with 1.5 T MRA.
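The reported sensitivity and specificity follow the usual per-segment definitions against the DSA reference standard. As a reminder of those definitions, a small sketch; the counts used here are invented for illustration and are not the study's per-segment data:

```python
# Sensitivity/specificity from a 2x2 comparison against a reference
# standard (here DSA). The counts are invented toy values.

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # detected stenoses / all true stenoses
    specificity = tn / (tn + fp)   # correct negatives / all true negatives
    return sensitivity, specificity

se, sp = sens_spec(tp=79, fn=21, tn=87, fp=13)
print(f"sensitivity {se:.0%}, specificity {sp:.0%}")
# prints: sensitivity 79%, specificity 87%
```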


To evaluate a triphasic injection protocol for whole-body multidetector computed tomography (MDCT) in patients with multiple trauma. Fifty consecutive patients (41 men) were examined. Contrast medium (300 mg/mL iodine) was injected starting with 70 mL at 3 mL/s, followed by 0.1 mL/s for 8 s, and by another bolus of 75 mL at 4 mL/s. CT data acquisition started 50 s after the beginning of the first injection. Two experienced, blinded readers independently measured the density in all major arteries, veins, and parenchymatous organs. Image quality was assessed using a five-point ordinal rating scale and compared to standard injection protocols [n = 25 each for late arterial chest, portovenous abdomen, and MDCT angiography (CTA)]. With the exception of the infrarenal inferior vena cava, all blood vessels were depicted with diagnostic image quality using the multiple-trauma protocol. Arterial luminal density was slightly but significantly lower compared to CTA (P < 0.01). Veins and parenchymatous organs were opacified significantly better compared to all other protocols (P < 0.01). Arm artifacts reduced the density of spleen and liver parenchyma significantly (P < 0.01). Similarly high image quality is achieved for arteries using the multiple-trauma protocol compared to CTA, and parenchymatous organs are depicted with better image quality compared to specialized protocols. Arm artifacts should be avoided.


Since its introduction in 1991, MR cholangiopancreatography has become an established diagnostic tool for the evaluation of the pancreaticobiliary ductal system at a field strength of 1.5 T. It remains unclear whether MR cholangiopancreatography performed at 3 T benefits from the higher magnetic field strength or whether a field strength of 1.5 T should continue to be considered the gold standard. This article reviews the current literature on the benefits and drawbacks of MR cholangiopancreatography at 3 T compared with the standard field strength of 1.5 T. Field strength-related artifacts that affect MR cholangiopancreatography at 3 T are also discussed.


The identification of 15N-labeled 3-nitrotyrosine (NTyr) by gas chromatography/mass spectrometry in protein hydrolyzates from activated RAW 264.7 macrophages incubated with 15N-L-arginine confirms that nitric oxide synthase (NOS) is involved in the nitration of protein-bound tyrosine (Tyr). An assay is presented for NTyr that employs HPLC with tandem electrochemical and UV detection. The assay involves enzymatic hydrolysis of protein, acetylation, solvent extraction, O-deacetylation, and dithionite reduction to produce an analyte containing N-acetyl-3-aminotyrosine, an electrochemically active derivative of NTyr. We estimate the level of protein-bound NTyr in normal rat plasma to be approximately 0-1 residues per 10^6 Tyr, with a detection limit of 0.5 per 10^7 Tyr when > 100 nmol of Tyr is analyzed and precautions are taken to limit nitration artifacts. Zymosan-treated RAW 264.7 cells were shown to have an approximately 6-fold higher level of protein-bound NTyr compared with control cells and cells treated with N(G)-monomethyl-L-arginine, an inhibitor of NOS. Intraperitoneal injection of F344 rats with zymosan led to a marked elevation of protein-bound NTyr to approximately 13 residues per 10^6 Tyr, an approximately 40-fold elevation compared with plasma protein of untreated rats; cotreatment with N(G)-monomethyl-L-arginine inhibited the formation of NTyr in plasma protein from blood and peritoneal exudate by 69% and 53%, respectively. This assay offers a highly sensitive and quantitative approach for investigating the role of reactive byproducts of nitric oxide in the many pathological conditions and disease states associated with NOx exposure, such as inflammation and smoking.


To determine whether neutral contrast agents with water-equivalent intraluminal attenuation can improve delineation of the bowel wall and increase overall image quality in a non-selected patient population, a neutral oral contrast agent (3% mannitol) was administered to 100 patients referred for abdominal multidetector row computed tomography (MDCT). Their results were compared with those of 100 patients given a positive oral contrast agent. Qualitative and quantitative measurements were made at different levels of the gastrointestinal tract by three experienced readers. Patients given the neutral oral contrast agent showed significantly better qualitative results for bowel distension (P < 0.001), homogeneity of the luminal content (P < 0.001), delineation of the bowel wall from the lumen (P < 0.001) and from the mesentery (P < 0.001), and artifacts (P < 0.001), leading to significantly better overall image quality (P < 0.001) than in patients receiving positive oral contrast medium. The quantitative measurements revealed significantly better distension (P < 0.001) and wall-to-lumen delineation (P < 0.001) for the patients receiving neutral oral contrast medium. The present results show that the neutral oral contrast agent (mannitol) produced better distension, better homogeneity, and better delineation of the bowel wall, leading to higher overall image quality than the positive oral contrast medium in a non-selected patient population.


Combined EEG/fMRI recordings offer a promising opportunity to detect brain areas with altered BOLD signal during interictal epileptic discharges (IEDs). These areas are likely to represent the irritative zone, which is itself a reflection of the epileptogenic zone. This paper reports on imaging findings using independent component analysis (ICA) to continuously quantify epileptiform activity in simultaneously acquired EEG and fMRI. Using ICA-derived factors coding for epileptic activity takes into account that epileptic activity fluctuates continuously, with each spike differing in amplitude, duration, and possibly topography; it also captures subthreshold epileptic activity besides clear IEDs and may thus increase the sensitivity and statistical power of combined EEG/fMRI in epilepsy. Twenty patients with different types of focal and generalized epilepsy syndromes were investigated. ICA separated epileptiform activity from normal physiological brain activity and artifacts. In 16/20 patients, BOLD correlates of epileptic activity matched the EEG sources, the clinical semiology, and, if present, the structural lesions. In clinically equivocal cases, the BOLD correlates aided in attributing the proper diagnosis of the underlying epilepsy syndrome. Furthermore, in one patient with temporal lobe epilepsy, BOLD correlates of rhythmic delta activity could be employed to delineate the affected hippocampus. Compared to BOLD correlates of manually identified IEDs, the sensitivity was improved from 50% (10/20) to 80%. The ICA EEG/fMRI approach is a safe, non-invasive, and easily applicable technique that can be used to identify regions with altered hemodynamic effects related to IEDs as well as intermittent rhythmic discharges in different types of epilepsy.


The objective of modern transmission electron microscopy (TEM) in life science is to observe biological structures in a state as close as possible to the living organism. TEM samples have to be thin and to be examined in vacuum; therefore only solid samples can be investigated. The most common way to prepare samples for TEM is to subject them to chemical fixation, staining, dehydration, and embedding in a resin before investigation; all of these steps introduce considerable artifacts. An alternative is to immobilize samples by cooling. High pressure freezing is so far the only approach to vitrify (solidify water without ice crystal formation) bulk biological samples of about 200 micrometers in thickness. This method leads to improved ultrastructural preservation. After high pressure freezing, samples have to be subjected to follow-up procedures, such as freeze-substitution and embedding. The samples can also be cut into frozen hydrated sections and analyzed in a cryo-TEM. High pressure freezing is also a good and practicable approach for immunocytochemistry.


Transmission electron microscopy has provided most of what is known about the ultrastructural organization of tissues, cells, and organelles. Due to tremendous advances in crystallography and magnetic resonance imaging, almost any protein can now be modeled at atomic resolution. To fully understand the workings of biological "nanomachines" it is necessary to obtain images of intact macromolecular assemblies in situ. Although the resolution power of electron microscopes is on the atomic scale, in biological samples artifacts introduced by aldehyde fixation, dehydration, and staining, as well as section thickness, reduce it to a few nanometers. Cryofixation by high pressure freezing circumvents many of these artifacts, since it allows vitrifying biological samples of about 200 μm in thickness and immobilizes complex macromolecular assemblies in their native state in situ. To exploit the perfect structural preservation of frozen hydrated sections, sophisticated instruments are needed, e.g., high voltage electron microscopes equipped with precise goniometers that work at low temperature and digital cameras of high sensitivity and pixel number. With them, it is possible to generate high resolution tomograms, i.e., 3D views of subcellular structures. This review describes the theory and applications of the high pressure cryofixation methodology and compares its results with those of conventional procedures. Moreover, recent findings are discussed showing that molecular models of proteins can be fitted into the organellar ultrastructure depicted in images of frozen hydrated sections. High pressure freezing of tissue is the basis that may lead to precise models of macromolecular assemblies in situ, and thus to a better understanding of the function of complex cellular structures.


Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps. First, the discrete CT data have to be continuously distributed by an analytic function respecting the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high order polynomial interpolations, which do not fulfill all of the above mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined, by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10%-20%, while the parameter of the new algorithm can be adjusted to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high order polynomial interpolation functions may produce significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
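The idea of a single-parameter Hermitian interpolation can be illustrated with cubic Hermite basis functions whose node slopes are scaled by a shape parameter: with the slopes scaled toward zero, the interpolant stays between neighbouring samples and so cannot go negative on positive data. This is a simplified sketch of the general technique only; the paper's actual algorithm additionally enforces integral conservation, which is not reproduced here.

```python
import numpy as np

def hermite_interp(x, y, xq, tension=1.0):
    """Cubic Hermite interpolation with node slopes scaled by `tension`
    (tension=1: central-difference slopes; tension=0: zero slopes, so
    the curve is bounded by the neighbouring samples)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    m = tension * np.gradient(y, x)  # scaled slope estimates at nodes
    xq = np.atleast_1d(np.asarray(xq, float))
    out = np.empty_like(xq)
    for k, xv in enumerate(xq):
        i = int(np.clip(np.searchsorted(x, xv) - 1, 0, len(x) - 2))
        h = x[i + 1] - x[i]
        s = (xv - x[i]) / h
        # Standard cubic Hermite basis polynomials on [0, 1]
        h00 = 2*s**3 - 3*s**2 + 1
        h10 = s**3 - 2*s**2 + s
        h01 = -2*s**3 + 3*s**2
        h11 = s**3 - s**2
        out[k] = h00*y[i] + h10*h*m[i] + h01*y[i+1] + h11*h*m[i+1]
    return out

# Positive data with a sharp step: with tension=0 the interpolant
# cannot overshoot, so it remains non-negative everywhere.
x = [0.0, 1.0, 2.0, 3.0]
y = [0.0, 0.0, 10.0, 10.0]
print(hermite_interp(x, y, [0.5, 1.5, 2.5], tension=0.0))
```

With tension=1 the same data would overshoot below zero near the step, which is exactly the kind of artifact the shape parameter is meant to control.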


Intestinal intraepithelial lymphocytes (IEL) are specialized subsets of T cells with distinct functional capacities. While some IEL subsets are circulating, others such as CD8alphaalpha TCRalphabeta IEL are believed to represent non-circulating resident T cell subsets [Sim, G.K., Intraepithelial lymphocytes and the immune system. Adv. Immunol., 1995. 58: 297-343.]. Current methods to obtain enriched preparations of intraepithelial lymphocytes are mostly based on Percoll density gradient or magnetic bead-based technologies [Lundqvist, C., et al., Isolation of functionally active intraepithelial lymphocytes and enterocytes from human small and large intestine. J. Immunol. Methods, 1992. 152(2): 253-263.]. However, these techniques are hampered by a generally low yield of isolated cells and by potential artifacts due to interference of the isolation procedure with subsequent functional assays, in particular when antibodies against cell surface markers are required. Here we describe a new method for obtaining relatively pure populations of intestinal IEL (55-75%) at a high yield (>85%) by elutriation centrifugation. This technique is equally suited for the isolation and enrichment of intraepithelial lymphocytes of both mouse and human origin. Time requirements for fractionating cell suspensions by elutriation centrifugation are comparable to Percoll- or MACS-based isolation procedures. Hence, the substantially higher yield and the consistently robust enrichment for intraepithelial lymphocytes, together with the gentle treatment of the cells during elutriation that does not interfere with subsequent functional assays, favor using this technology to obtain unmanipulated, unbiased populations of intestinal intraepithelial lymphocytes and, if desired, also of pure epithelial cells.


Despite widespread use of species-area relationships (SARs), dispute remains over the most representative SAR model. Using data of small-scale SARs of Estonian dry grassland communities, we address three questions: (1) Which model describes these SARs best when known artifacts are excluded? (2) How do deviating sampling procedures (marginal instead of central position of the smaller plots in relation to the largest plot; single values instead of average values; randomly located subplots instead of nested subplots) influence the properties of the SARs? (3) Are those effects likely to bias the selection of the best model? Our general dataset consisted of 16 series of nested plots (1 cm^2 to 100 m^2, any-part system), each of which comprised five series of subplots located in the four corners and the centre of the 100-m^2 plot. Data for the three pairs of compared sampling designs were generated from this dataset by subsampling. Five function types (power, quadratic power, logarithmic, Michaelis-Menten, Lomolino) were fitted with non-linear regression. In some of the communities, we found extremely high species densities (including bryophytes and lichens), namely up to eight species in 1 cm^2 and up to 140 species in 100 m^2, which appear to be the highest documented values on these scales. For SARs constructed from nested-plot average-value data, the regular power function generally was the best model, closely followed by the quadratic power function, while the logarithmic and Michaelis-Menten functions performed poorly throughout. The relative fit of the latter two models increased significantly relative to the respective best model when the single-value or random-sampling method was applied; however, the power function normally remained far superior. These results confirm the hypothesis that both single-value and random-sampling approaches cause artifacts by increasing stochasticity in the data, which can lead to the selection of inappropriate models.
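The regular power function named above as the best model is S = c · A^z (species count S, area A). The study fitted its models with non-linear regression on untransformed data; as a brief illustration of the model itself, the sketch below uses the classic log-log linear fit instead, and the species counts are invented toy values, not the Estonian grassland data.

```python
import numpy as np

# Illustrative fit of the power-function SAR, S = c * A**z, via a
# log-log linear regression. Toy data, invented for illustration.
area = np.array([1e-4, 1e-2, 1.0, 100.0])    # plot areas in m^2
species = np.array([4.0, 12.0, 40.0, 130.0])  # toy species counts

z, log_c = np.polyfit(np.log(area), np.log(species), 1)
c = np.exp(log_c)
print(f"S ≈ {c:.1f} * A^{z:.3f}")

# Predicted richness at 100 m^2 under the fitted power law:
print(c * 100.0 ** z)
```

Note that log-log fitting minimizes relative rather than absolute error, which is one reason the study's model comparison relied on non-linear regression of the untransformed data instead.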