918 results for STANDARD AUTOMATED PERIMETRY


Relevance:

30.00%

Publisher:

Abstract:

Intramedullary nailing is the standard fixation method for displaced diaphyseal fractures of the tibia. An optimal nail design should both facilitate insertion and anatomically fit the bone geometry at its final position in order to reduce the risk of stress fractures and malalignment. Because no suitable commercial software existed, we developed a software tool for the automated fit assessment of nail designs. Furthermore, we demonstrated that an optimised nail, which fits better at the final position, is also easier to insert. Three-dimensional models of two nail designs and 20 tibiae were used. Fit was quantified in terms of the surface area, maximum distance, sum of surface areas and sum of maximum distances by which the nail protruded into the cortex. The software was programmed to insert the nail into the bone model and to quantify the fit at defined increment levels. On average, the misfit during insertion in terms of the four fitting parameters was smaller for the Expert Tibial Nail Proximal bend (476.3 mm², 1.5 mm, 2029.8 mm², 6.5 mm) than for the Expert Tibial Nail (736.7 mm², 2.2 mm, 2491.4 mm², 8.0 mm). The differences were statistically significant (p ≤ 0.05). The software could be used by nail implant manufacturers for implant design validation.
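The fit metrics above (protruding surface area and maximum protrusion distance) can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: it assumes each nail surface patch carries a precomputed signed distance to the inner cortex wall (positive when protruding) and a patch area.

```python
# Hypothetical sketch: quantify nail-bone misfit at one insertion increment.
# Each surface patch is (signed_distance_mm, patch_area_mm2); positive
# distance means the nail protrudes into the cortex at that patch.

def misfit(points):
    """Return (protruding_area_mm2, max_protrusion_mm) for one increment."""
    area = sum(a for d, a in points if d > 0)         # area inside the cortex
    max_d = max((d for d, a in points), default=0.0)  # deepest protrusion
    return area, max(max_d, 0.0)

# Example: three patches, one protruding 1.5 mm over a 2.0 mm^2 patch
pts = [(-0.3, 1.0), (1.5, 2.0), (0.2, 0.5)]
```

Summing these two quantities over all insertion increments would give the "sum of surface areas" and "sum of maximum distances" parameters described in the abstract.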

Intramedullary nailing is the standard fixation method for displaced diaphyseal fractures of the tibia. Selection of the correct nail insertion point is important for axial alignment of the bone fragments and for avoiding iatrogenic fractures. However, the standard entry point (SEP) may not always optimise the bone-nail fit, due to geometric variations between bones. This study aimed to identify the optimal entry point for a given bone-nail pair using the fit quantification software tool previously developed by the authors. The misfit was quantified for 20 bones with two nail designs (ETN and ETN-Proximal Bend) for the SEP and for five alternative entry points located 5 mm and 10 mm away from it. The SEP was the optimal entry point for 50% of the bones used. For the remaining bones, the optimal entry point was located 5 mm away from the SEP, which improved the overall fit by 40% on average. However, entry points 10 mm away from the SEP doubled the misfit. An optimised bone-nail fit can be achieved through the SEP and within a 5 mm radius of it, except posteriorly. The results suggest that the optimal entry point should be selected by considering the fit during insertion and not only at the final position.
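The entry-point comparison above reduces to evaluating a misfit score per candidate and taking the minimum. A minimal sketch; the labels and numbers are illustrative stand-ins, not study data:

```python
# Hypothetical sketch of the entry-point search: score each candidate entry
# point by total misfit (e.g. summed protruding area over the insertion
# path) and choose the minimum.

def best_entry(scores):
    """scores: dict mapping entry-point label -> total misfit (lower is better)."""
    return min(scores, key=scores.get)

candidates = {
    "SEP": 120.0,
    "5mm_anterior": 72.0,    # ~40% lower misfit than SEP (made-up numbers)
    "10mm_anterior": 240.0,  # misfit roughly doubles at 10 mm
}
```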

We developed and validated a new method to create automated 3D parametric surface models of the lateral ventricles, designed for monitoring degenerative disease effects in clinical neuroscience studies and drug trials. First we used a set of parameterized surfaces to represent the ventricles in a manually labeled set of 9 subjects' MRIs (atlases). We fluidly registered each of these atlases and mesh models to a set of MRIs from 12 Alzheimer's disease (AD) patients and 14 matched healthy elderly subjects, and we averaged the resulting meshes for each of these images. Validation experiments on expert segmentations showed that (1) the Hausdorff labeling error rapidly decreased, and (2) the power to detect disease-related alterations monotonically improved as the number of atlases, N, was increased from 1 to 9. We then combined the segmentations with a radial mapping approach to localize ventricular shape differences in patients. In surface-based statistical maps, we detected more widespread and intense anatomical deficits as we increased the number of atlases, and we formulated a statistical stopping criterion to determine the optimal value of N. Anterior horn anomalies in Alzheimer's patients were only detected with the multi-atlas segmentation, which clearly outperformed the standard single-atlas approach.
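The Hausdorff labeling error used above measures the worst-case disagreement between two segmented surfaces. A minimal pure-Python sketch on small point sets (a real mesh comparison would use spatial indexing for speed):

```python
# Symmetric Hausdorff distance between two point sets: the largest distance
# from any point in one set to its nearest neighbour in the other set.
import math

def hausdorff(A, B):
    def directed(P, Q):
        return max(min(math.dist(p, q) for q in Q) for p in P)
    return max(directed(A, B), directed(B, A))

A = [(0.0, 0.0), (1.0, 0.0)]
B = [(0.0, 0.0), (1.0, 1.0)]
```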

A completely automated temperature-programmed reaction (TPR) system for carrying out gas-solid catalytic reactions under atmospheric flow conditions was fabricated to study CO and hydrocarbon oxidation and NO reduction. The system consists of an all-stainless-steel UHV chamber, a quadrupole mass spectrometer (VG Scientific SX200), a tubular furnace and micro-reactor, a temperature controller, a versatile gas handling system, and a data acquisition and analysis system. The performance of the system was tested under standard experimental conditions for CO oxidation over well-characterized Ce1-x-y(La/Y)yO2-δ catalysts. Three-way catalysis, converting CO, NO and C2H2 to CO2, N2 and H2O, was tested with this catalyst, which shows complete removal of pollutants below 325 °C. Fixed oxide-ion defects in Pt-substituted Ce1-y(La/Y)yO2-y/2 show higher catalytic activity than Pt ion-substituted CeO2.

The aims were to determine whether measures of acceleration of the legs and back of dairy cows while they walk could help detect changes in gait or locomotion associated with lameness and with differences in the walking surface. In 2 experiments, 12 or 24 multiparous dairy cows were fitted with five 3-dimensional accelerometers, 1 attached to each leg and 1 to the back, and acceleration data were collected while cows walked in a straight line on concrete (experiment 1) or on both concrete and rubber (experiment 2). Cows were video-recorded while walking to assess overall gait, asymmetry of the steps, and walking speed. In experiment 1, cows were selected to maximize the range of gait scores, whereas no clinically lame cows were enrolled in experiment 2. For each accelerometer location, overall acceleration was calculated as the magnitude of the 3-dimensional acceleration vector, together with the variance of overall acceleration and the asymmetry of variance of acceleration within the front and rear pairs of legs. In experiment 1, the asymmetry of variance of acceleration in the front and rear legs was positively correlated with overall gait and the visually assessed asymmetry of the steps (r ≥ 0.6). Walking speed was negatively correlated with the asymmetry of variance of the rear legs (r = −0.8) and positively correlated with the acceleration and the variance of acceleration of each leg and the back (r ≥ 0.7). In experiment 2, cows had lower gait scores [2.3 vs. 2.6; standard error of the difference (SED) = 0.1, measured on a 5-point scale] and lower scores for asymmetry of the steps (18.0 vs. 23.1; SED = 2.2, measured on a continuous 100-unit scale) when they walked on rubber compared with concrete, and their walking speed increased (1.28 vs. 1.22 m/s; SED = 0.02). The acceleration of the front (1.67 vs. 1.72 g; SED = 0.02) and rear (1.62 vs. 1.67 g; SED = 0.02) legs and the variance of acceleration of the rear legs (0.88 vs. 0.94 g; SED = 0.03) were lower when cows walked on rubber compared with concrete. Despite the improvement in gait score on rubber, the asymmetry of variance of acceleration of the front legs was higher (15.2 vs. 10.4%; SED = 2.0). The difference in walking speed between concrete and rubber correlated with the differences in the mean acceleration and in the variance of acceleration of the legs and back (r ≥ 0.6). Three-dimensional accelerometers seem to be a promising tool for detecting lameness on farm and for studying walking surfaces, especially when attached to a leg.
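The acceleration measures described above can be sketched as follows. The exact asymmetry formula is an assumption for illustration (percent difference of the pair's variances), not taken from the paper:

```python
# Sketch of the accelerometer-derived gait measures: overall acceleration as
# the magnitude of the 3-D acceleration vector, its variance, and a percent
# asymmetry of variance within a leg pair (assumed definition).
import math

def overall(ax, ay, az):
    """Overall acceleration magnitude per sample from 3 axis traces."""
    return [math.sqrt(x*x + y*y + z*z) for x, y, z in zip(ax, ay, az)]

def variance(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

def asymmetry(var_left, var_right):
    """Percent asymmetry of variance within a front or rear leg pair."""
    return 100.0 * abs(var_left - var_right) / max(var_left, var_right)
```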

Molar heat capacities (Cp,m) of aspirin were precisely measured with a small-sample precision automated adiabatic calorimeter over the temperature range from 78 to 383 K. No phase transition was observed in this temperature region. A polynomial function of Cp,m vs. T was established from the low-temperature heat capacity measurements by least-squares fitting. For 78 K ≤ T ≤ 383 K: Cp,m/(J mol⁻¹ K⁻¹) = 19.086X⁴ + 15.951X³ − 5.2548X² + 90.192X + 176.65, where X = (T − 230.50)/152.5. The thermodynamic functions relative to the reference temperature of 298.15 K, {H(T) − H(298.15 K)} and {S(T) − S(298.15 K)}, were derived.
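The fitted polynomial can be evaluated numerically, and the derived functions {H(T) − H(298.15 K)} and {S(T) − S(298.15 K)} obtained by integrating Cp,m and Cp,m/T. A minimal sketch using trapezoidal integration, with the reduced variable X = (T − 230.50)/152.5 mapping the 78-383 K range onto roughly [−1, 1]:

```python
# Evaluate the fitted heat-capacity polynomial and derive enthalpy and
# entropy increments relative to 298.15 K by trapezoidal integration.

def cp(T):
    """Molar heat capacity of aspirin in J mol^-1 K^-1 (78 K <= T <= 383 K)."""
    X = (T - 230.50) / 152.5
    return 19.086*X**4 + 15.951*X**3 - 5.2548*X**2 + 90.192*X + 176.65

def delta_h(T, T0=298.15, n=1000):
    """H(T) - H(T0) in J mol^-1: integral of Cp dT."""
    h = (T - T0) / n
    s = 0.5 * (cp(T0) + cp(T)) + sum(cp(T0 + i*h) for i in range(1, n))
    return s * h

def delta_s(T, T0=298.15, n=1000):
    """S(T) - S(T0) in J mol^-1 K^-1: integral of (Cp/T) dT."""
    h = (T - T0) / n
    s = 0.5 * (cp(T0)/T0 + cp(T)/T) + sum(cp(T0 + i*h)/(T0 + i*h) for i in range(1, n))
    return s * h
```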

The validation of a fully automated dissolved Ni monitor for in situ estuarine studies is presented, based on adsorptive cathodic stripping voltammetry (AdCSV). Dissolved Ni concentrations were determined following on-line filtration and UV digestion, and addition of an AdCSV ligand (dimethyl glyoxime) and pH buffer (N-2-hydroxyethylpiperazine-N′-2-ethanesulphonic acid). The technique is capable of up to six fully quantified Ni measurements per hour. The automated in situ methodology was applied successfully during two surveys on the Tamar estuary (south west Britain). The strongly varying sample matrix encountered in the estuarine system did not present analytical interferences, and each sample was quantified using internal standard additions. Up to 37 Ni measurements were performed during each survey, which involved 13 h of continuous sampling and analysis. The high resolution data from the winter and summer tidal cycle studies allowed a thorough interpretation of the biogeochemical processes in the studied estuarine system.

An automated and semi-intelligent voltammetric system is described for trace metal analysis. The system consists of a voltammeter interfaced with a personal computer, a sample changer, 2 peristaltic pumps, a motor burette and a hanging mercury drop electrode. The system fully automatically carries out approximately 5 metal determinations per hour (including at least 3 repetitive scans and calibration by standard addition) at the trace levels encountered in clean sea water. The computer program decides what level of standard addition to use and evaluates the data prior to switching to the next sample. Alternatively, the system can be used to carry out complexing ligand titrations with copper whilst recording the labile copper concentration; in this mode up to 8 full titrations are carried out per day. Depth profiles for chromium speciation in the Mediterranean Sea and a profile for copper complexing ligand concentrations in the North Atlantic Ocean, measured on board ship with the system, are presented. The chromium speciation was determined using a new method to differentiate between Cr(III) and Cr(VI), utilizing adsorption of Cr(III) on silica particles.
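Calibration by standard addition, as used by the systems above, extrapolates the sample concentration from the signal before and after a known spike. A sketch of the single-addition case, assuming an idealized linear response and negligible dilution by the spike:

```python
# Single-point standard addition: with a linear response S = k*C, the sample
# signal is S1 = k*C and the spiked signal is S2 = k*(C + c_added), so
# C = S1 * c_added / (S2 - S1).

def standard_addition(signal_sample, signal_spiked, c_added):
    """Concentration of the original sample, in the same units as c_added."""
    return signal_sample * c_added / (signal_spiked - signal_sample)
```

For example, a sample reading 10 units that reads 20 units after a 5 nM spike contains 5 nM of analyte. In practice several addition levels are used and the concentration is taken from the fitted intercept.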

Recognizing standard computational structures (cliches) in a program can help an experienced programmer understand the program. We develop a graph parsing approach to automating program recognition in which programs and cliches are represented in an attributed graph grammar formalism and recognition is achieved by graph parsing. In studying this approach, we evaluate our representation's ability to suppress many common forms of variation which hinder recognition. We investigate the expressiveness of our graph grammar formalism for capturing programming cliches. We empirically and analytically study the computational cost of our recognition approach with respect to two medium-sized, real-world simulator programs.

Histopathology is the clinical standard for tissue diagnosis. However, histopathology has several limitations including that it requires tissue processing, which can take 30 minutes or more, and requires a highly trained pathologist to diagnose the tissue. Additionally, the diagnosis is qualitative, and the lack of quantitation leads to possible observer-specific diagnosis. Taken together, it is difficult to diagnose tissue at the point of care using histopathology.

Several clinical situations could benefit from more rapid and automated histological processing, which could reduce the time and the number of steps required between obtaining a fresh tissue specimen and rendering a diagnosis. For example, there is need for rapid detection of residual cancer on the surface of tumor resection specimens during excisional surgeries, which is known as intraoperative tumor margin assessment. Additionally, rapid assessment of biopsy specimens at the point-of-care could enable clinicians to confirm that a suspicious lesion is successfully sampled, thus preventing an unnecessary repeat biopsy procedure. Rapid and low cost histological processing could also be potentially useful in settings lacking the human resources and equipment necessary to perform standard histologic assessment. Lastly, automated interpretation of tissue samples could potentially reduce inter-observer error, particularly in the diagnosis of borderline lesions.

To address these needs, high quality microscopic images of the tissue must be obtained in rapid timeframes, in order for a pathologic assessment to be useful for guiding the intervention. Optical microscopy is a powerful technique to obtain high-resolution images of tissue morphology in real-time at the point of care, without the need for tissue processing. In particular, a number of groups have combined fluorescence microscopy with vital fluorescent stains to visualize micro-anatomical features of thick (i.e. unsectioned or unprocessed) tissue. However, robust methods for segmentation and quantitative analysis of heterogeneous images are essential to enable automated diagnosis. Thus, the goal of this work was to obtain high resolution imaging of tissue morphology through employing fluorescence microscopy and vital fluorescent stains and to develop a quantitative strategy to segment and quantify tissue features in heterogeneous images, such as nuclei and the surrounding stroma, which will enable automated diagnosis of thick tissues.

To achieve these goals, three specific aims were proposed. The first aim was to develop an image processing method that can differentiate nuclei from background tissue heterogeneity and enable automated diagnosis of thick tissue at the point of care. A computational technique called sparse component analysis (SCA) was adapted to isolate features of interest, such as nuclei, from the background. SCA has been used previously in the image processing community for image compression, enhancement, and restoration, but has never been applied to separate distinct tissue types in a heterogeneous image. In combination with a high resolution fluorescence microendoscope (HRME) and a contrast agent acriflavine, the utility of this technique was demonstrated through imaging preclinical sarcoma tumor margins. Acriflavine localizes to the nuclei of cells where it reversibly associates with RNA and DNA. Additionally, acriflavine shows some affinity for collagen and muscle. SCA was adapted to isolate acriflavine positive features or APFs (which correspond to RNA and DNA) from background tissue heterogeneity. The circle transform (CT) was applied to the SCA output to quantify the size and density of overlapping APFs. The sensitivity of the SCA+CT approach to variations in APF size, density and background heterogeneity was demonstrated through simulations. Specifically, SCA+CT achieved the lowest errors for higher contrast ratios and larger APF sizes. When applied to tissue images of excised sarcoma margins, SCA+CT correctly isolated APFs and showed consistently increased density in tumor and tumor + muscle images compared to images containing muscle. Next, variables were quantified from images of resected primary sarcomas and used to optimize a multivariate model. The sensitivity and specificity for differentiating positive from negative ex vivo resected tumor margins was 82% and 75%. 
The utility of this approach was further tested by imaging the in vivo tumor cavities from 34 mice after resection of a sarcoma with local recurrence as a bench mark. When applied prospectively to images from the tumor cavity, the sensitivity and specificity for differentiating local recurrence was 78% and 82%. The results indicate that SCA+CT can accurately delineate APFs in heterogeneous tissue, which is essential to enable automated and rapid surveillance of tissue pathology.

Two primary challenges were identified in the work in aim 1. First, while SCA can be used to isolate features, such as APFs, from heterogeneous images, its performance is limited by the contrast between APFs and the background. Second, while it is feasible to create mosaics by scanning a sarcoma tumor bed in a mouse, which is on the order of 3-7 mm in any one dimension, it is not feasible to evaluate an entire human surgical margin. Thus, improvements to the microscopic imaging system were made to (1) improve image contrast through rejecting out-of-focus background fluorescence and to (2) increase the field of view (FOV) while maintaining the sub-cellular resolution needed for delineation of nuclei. To address these challenges, a technique called structured illumination microscopy (SIM) was employed in which the entire FOV is illuminated with a defined spatial pattern rather than scanning a focal spot, such as in confocal microscopy.

Thus, the second aim was to improve image contrast and increase the FOV by employing wide-field, non-contact structured illumination microscopy, and to optimize the segmentation algorithm for the new imaging modality. Both image contrast and FOV were increased through the development of a wide-field fluorescence SIM system. Clear improvement in image contrast was seen in structured illumination images compared to uniform illumination images. Additionally, the FOV is over 13X larger than that of the fluorescence microendoscope used in aim 1. Initial segmentation results of SIM images revealed that SCA is unable to segment large numbers of APFs in the tumor images. Because the FOV of the SIM system is over 13X larger than the FOV of the fluorescence microendoscope, dense collections of APFs commonly seen in tumor images could no longer be sparsely represented, and the fundamental sparsity assumption associated with SCA was no longer met. Thus, an algorithm called maximally stable extremal regions (MSER) was investigated as an alternative approach for APF segmentation in SIM images. MSER was able to accurately segment large numbers of APFs in SIM images of tumor tissue. In addition to optimizing MSER for SIM image segmentation, an optimal frequency of the illumination pattern used in SIM was carefully selected, because the image signal-to-noise ratio (SNR) depends on the grid frequency. A grid frequency of 31.7 mm⁻¹ led to the highest SNR and the lowest percent error associated with MSER segmentation.

Once MSER was optimized for SIM image segmentation and the optimal grid frequency was selected, a quantitative model was developed to diagnose mouse sarcoma tumor margins that were imaged ex vivo with SIM. Tumor margins were stained with acridine orange (AO) in aim 2 because AO was found to stain the sarcoma tissue more brightly than acriflavine. Both acriflavine and AO are intravital dyes, which have been shown to stain nuclei, skeletal muscle, and collagenous stroma. A tissue-type classification model was developed to differentiate localized regions (75x75 µm) of tumor from skeletal muscle and adipose tissue based on the MSER segmentation output. Specifically, a logistic regression model was used to classify each localized region. The logistic regression model yielded an output in terms of probability (0-100%) that tumor was located within each 75x75 µm region. The model performance was tested using a receiver operator characteristic (ROC) curve analysis that revealed 77% sensitivity and 81% specificity. For margin classification, the whole margin image was divided into localized regions and this tissue-type classification model was applied. In a subset of 6 margins (3 negative, 3 positive), it was shown that with a tumor probability threshold of 50%, 8% of all regions from negative margins exceeded this threshold, while over 17% of all regions exceeded the threshold in the positive margins. Thus, 8% of regions in negative margins were considered false positives. These false positive regions are likely due to the high density of APFs present in normal tissues, which clearly demonstrates a challenge in implementing this automatic algorithm based on AO staining alone.
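The region-level decision described above can be sketched as follows: a logistic model yields a per-region tumor probability, and a margin is summarized by the fraction of its regions exceeding the 50% threshold. The probabilities below are illustrative, not measured values:

```python
# Sketch of the margin-level decision: per-region tumor probabilities from a
# logistic model, then the fraction of regions above a probability threshold.
import math

def logistic(z):
    """Logistic (sigmoid) function mapping a linear score to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def fraction_positive(probs, threshold=0.5):
    """Fraction of 75x75 um regions whose tumor probability exceeds threshold."""
    return sum(p > threshold for p in probs) / len(probs)
```

With the thresholds reported above, a margin whose positive-region fraction exceeds roughly 8% (the false-positive rate seen in negative margins) would be flagged for closer inspection.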

Thus, the third aim was to improve the specificity of the diagnostic model by leveraging other sources of contrast. Modifications were made to the SIM system to enable fluorescence imaging at a variety of wavelengths. Specifically, the SIM system was modified to enable imaging of red fluorescent protein (RFP)-expressing sarcomas, which were used to delineate the location of tumor cells within each image. Initial analysis of AO-stained panels confirmed that there was room for improvement in tumor detection, particularly with regard to false positive regions that were negative for RFP. One approach to improving the specificity of the diagnostic model was to investigate a fluorophore more specific to tumor. Specifically, tetracycline was selected because it appeared to specifically stain freshly excised tumor tissue in a matter of minutes, and was non-toxic and stable in solution. Results indicated that tetracycline staining holds promise for increasing the specificity of tumor detection in SIM images of a preclinical sarcoma model, and further investigation is warranted.

In conclusion, this work presents the development of a combination of tools that is capable of automated segmentation and quantification of micro-anatomical images of thick tissue. When compared to the fluorescence microendoscope, wide-field multispectral fluorescence SIM imaging provided improved image contrast, a larger FOV with comparable resolution, and the ability to image a variety of fluorophores. MSER was an appropriate and rapid approach to segment dense collections of APFs from wide-field SIM images. Variables that reflect the morphology of the tissue, such as the density, size, and shape of nuclei and nucleoli, can be used to automatically diagnose SIM images. The clinical utility of SIM imaging and MSER segmentation to detect microscopic residual disease has been demonstrated by imaging excised preclinical sarcoma margins. Ultimately, this work demonstrates that fluorescence imaging of tissue micro-anatomy combined with a specialized algorithm for delineation and quantification of features is a means for rapid, non-destructive and automated detection of microscopic disease, which could improve cancer management in a variety of clinical scenarios.

OBJECTIVE: To assess whether the impedance cardiogram recorded by an automated external defibrillator during cardiac arrest can facilitate emergency care by lay persons. Lay persons are poor at emergency pulse checks (sensitivity 84%, specificity 36%); guidelines recommend that they should not be performed. The impedance cardiogram (dZ/dt) is used to indicate stroke volume. Can an impedance cardiogram algorithm in a defibrillator rapidly determine circulatory arrest and facilitate prompt initiation of external cardiac massage?

DESIGN: Clinical study.

SETTING: University hospital.

PATIENTS: Phase 1 patients attended for myocardial perfusion imaging. Phase 2 patients were recruited during cardiac arrest. This group included nonarrest controls.

INTERVENTIONS: The impedance cardiogram was recorded through defibrillator/electrocardiographic pads oriented in the standard cardiac arrest position.

MEASUREMENTS AND MAIN RESULTS: Phase 1: Stroke volumes from gated myocardial perfusion imaging scans were correlated with parameters from the impedance cardiogram system (dZ/dt(max) and the peak amplitude of the Fast Fourier Transform of dZ/dt between 1.5 Hz and 4.5 Hz). Multivariate analysis was performed to fit stroke volumes from gated myocardial perfusion imaging scans with linear and quadratic terms for dZ/dt(max) and the Fast Fourier Transform to identify significant parameters for incorporation into a cardiac arrest diagnostic algorithm. The square of the peak amplitude of the Fast Fourier Transform of dZ/dt was the best predictor of reduction in stroke volumes from gated myocardial perfusion imaging scans (range = 33-85 mL; p = .016). Having established that the two pad impedance cardiogram system could detect differences in stroke volumes from gated myocardial perfusion imaging scans, we assessed its performance in diagnosing cardiac arrest. Phase 2: The impedance cardiogram was recorded in 132 "cardiac arrest" patients (53 training, 79 validation) and 97 controls (47 training, 50 validation): the diagnostic algorithm indicated cardiac arrest with sensitivities and specificities (+/- exact 95% confidence intervals) of 89.1% (85.4-92.1) and 99.6% (99.4-99.7; training) and 81.1% (77.6-84.3) and 97% (96.7-97.4; validation).
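The spectral feature above, the peak amplitude of the Fourier transform of dZ/dt within 1.5-4.5 Hz, can be sketched with a direct DFT. This is an illustration only (an FFT library would be used in practice, and the test signal is synthetic):

```python
# Peak single-sided DFT amplitude of a sampled signal within a frequency
# band, computed by direct summation for clarity.
import math

def band_peak(signal, fs, f_lo=1.5, f_hi=4.5):
    n = len(signal)
    peak = 0.0
    for k in range(n // 2 + 1):
        f = k * fs / n                      # frequency of bin k in Hz
        if f_lo <= f <= f_hi:
            re = sum(signal[t] * math.cos(2*math.pi*k*t/n) for t in range(n))
            im = -sum(signal[t] * math.sin(2*math.pi*k*t/n) for t in range(n))
            amp = 2.0 * math.hypot(re, im) / n  # single-sided amplitude
            peak = max(peak, amp)
    return peak

# Synthetic example: a 3 Hz tone sampled at 50 Hz for 2 s lies in-band
sig = [math.sin(2*math.pi*3.0*t/50.0) for t in range(100)]
```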

CONCLUSIONS: The impedance cardiogram algorithm is a significant marker of circulatory collapse. Automated defibrillators with an integrated impedance cardiogram could improve emergency care by lay persons, enabling rapid and appropriate initiation of external cardiac massage.

This paper presents a robust finite element procedure for modelling the behaviour of postbuckling structures undergoing mode-jumping. Current non-linear implicit finite element solution schemes, found in most finite element codes, are discussed and their shortcomings highlighted. A more effective strategy is presented which combines a quasi-static and a pseudo-transient routine for modelling this behaviour. The switching between these two schemes is fully automated and therefore eliminates the need for user intervention during the solution process. The quasi-static response is modelled using the arc-length constraint, while the pseudo-transient routine uses a modified explicit dynamic routine which is more computationally efficient than standard implicit and explicit dynamic schemes. The strategies for switching between the quasi-static and pseudo-transient routines are presented.

PURPOSE: To evaluate the effect of cataract extraction on Swedish Interactive Thresholding Algorithm (SITA) perimetry in patients with coexisting cataract and glaucoma. PATIENTS AND METHODS: This is a retrospective noncomparative interventional study. Thirty-seven consecutive patients with open-angle glaucoma who had cataract extraction alone or combined with trabeculectomy were included. All patients had SITA-standard 24-2 visual fields before and after the surgery. The main outcome measures were changes in mean deviation (MD) and pattern standard deviation (PSD). Additionally, changes in best-corrected visual acuity, intraocular pressure, and number of glaucoma medications were also studied. RESULTS: Visual field tests were performed 3.9±4.4 months before surgery and 4.1±2.8 months after surgery. Mean visual acuity improved after the surgery, from 0.41±0.21 to 0.88±0.32 (P

PURPOSE: To evaluate the sensitivity and specificity of the screening mode of the Humphrey-Welch Allyn frequency-doubling technology (FDT), Octopus tendency-oriented perimetry (TOP), and the Humphrey Swedish Interactive Threshold Algorithm (SITA)-fast (HSF) in patients with glaucoma. DESIGN: A comparative consecutive case series. METHODS: This was a prospective study conducted in the glaucoma unit of an academic department of ophthalmology. One eye of each of 70 consecutive glaucoma patients and 28 age-matched normal subjects was studied. Eyes were examined with program C-20 of FDT, G1-TOP, and 24-2 HSF in one visit and in random order. The gold standard for glaucoma was the presence of a typical glaucomatous optic disk appearance on stereoscopic examination, as judged by a glaucoma expert. The sensitivity and specificity, positive and negative predictive values, and receiver operating characteristic (ROC) curves of two algorithms for the FDT screening test, two algorithms for TOP, and three algorithms for HSF, as defined before the start of this study, were evaluated. The time required for each test was also analyzed. RESULTS: Values for the area under the ROC curve ranged from 82.5% to 93.9%. The largest area under the ROC curve (93.9%) was obtained with the FDT criterion defining abnormality as the presence of at least one abnormal location. Mean test time was 1.08 ± 0.28 minutes, 2.31 ± 0.28 minutes, and 4.14 ± 0.57 minutes for FDT, TOP, and HSF, respectively. The difference in testing time was statistically significant (P <.0001). CONCLUSIONS: The C-20 FDT, G1-TOP, and 24-2 HSF appear to be useful tools for diagnosing glaucoma. The C-20 FDT and G1-TOP tests take approximately one-quarter and one-half, respectively, of the time taken by the 24-2 HSF. © 2002 by Elsevier Science Inc. All rights reserved.
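The sensitivity, specificity, and ROC analyses reported in several of these studies follow standard definitions, sketched below. The AUC here uses the rank-sum (Mann-Whitney) formulation, and the labels and scores in the examples are illustrative:

```python
# Standard screening statistics: sensitivity and specificity from binary
# predictions, and ROC AUC from continuous scores via the rank-sum identity
# (probability that a random positive outscores a random negative).

def sens_spec(labels, predictions):
    tp = sum(1 for y, p in zip(labels, predictions) if y and p)
    fn = sum(1 for y, p in zip(labels, predictions) if y and not p)
    tn = sum(1 for y, p in zip(labels, predictions) if not y and not p)
    fp = sum(1 for y, p in zip(labels, predictions) if not y and p)
    return tp / (tp + fn), tn / (tn + fp)

def auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y]
    neg = [s for y, s in zip(labels, scores) if not y]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```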