899 results for Error correction methods


Relevance:

30.00%

Publisher:

Abstract:

The concept of measurement-enabled production, which integrates metrology systems into production processes, has generated significant interest in industry due to its potential to increase process capability and accuracy, which in turn reduces production times and eliminates defective parts. One of the most promising methods of integrating metrology into production is the use of external metrology systems to compensate machine tool errors in real time. This paper describes the development and experimental performance evaluation of a low-cost, laser-tracker-assisted prototype three-axis machine tool. Real-time correction of the machine tool's absolute volumetric error has been achieved, demonstrating significant increases in static repeatability and accuracy and allowing the low-cost three-axis machine tool to reliably reach static positioning accuracies below 35 μm throughout its working volume without any prior calibration or error mapping. This is a significant technical development that demonstrates the feasibility of the proposed methods and can have wide-scale industrial application by enabling low-cost, structurally simple machine tools, deployed flexibly as end-effectors of robotic automation, to achieve positional accuracies that were previously the preserve of large, high-precision machine tools.
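The compensation principle described above can be sketched as a simple feedback step: the externally measured position is compared with the commanded position and a scaled correction is applied. This is an illustrative sketch under stated assumptions; the function and the gain value are not the paper's implementation.

```python
def compensate(commanded, measured, gain=0.8):
    """One cycle of real-time volumetric error compensation: the
    correction is the gain-scaled difference between the commanded
    axis positions and the positions reported by an external
    metrology system (e.g. a laser tracker). Illustrative only."""
    errors = [m - c for c, m in zip(commanded, measured)]
    # Subtract the scaled error so the tool moves back toward the
    # commanded position on the next cycle.
    return [c - gain * e for c, e in zip(commanded, errors)]
```

In practice the corrected targets would be streamed to the machine controller every servo cycle, with the gain kept below 1 for loop stability.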

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To ascertain the level of agreement between intra-operative refraction using a prototype surgical Hartmann-Shack aberrometer and subjective refraction one month later. Methods: Fifty-four consecutive patients had their pseudophakic refraction measured with the aberrometer intra-operatively at the end of their cataract surgery. A masked optometrist performed subjective refraction 4 weeks later. The two sets of data were then analysed for correlation. Results: The mean spherical equivalent was −0.14 ± 0.37 D (range: −1.41 to +1.72 D) with the prototype aberrometer and −0.34 ± 0.32 D (range: −1.64 to +1.88 D) with subjective refraction. The measurements were positively correlated to a very high degree (r = +0.81, p < 0.01). In 84.3% of cases the two measurements were within 0.50 D of each other. Conclusion: The aberrometer can verify the intended refractive status of the eye intra-operatively to avoid a refractive surprise, and is a useful tool for real-time assessment of ocular refractive status.
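The agreement analysis reported above (Pearson correlation plus the fraction of eyes whose two readings agree within 0.50 D) can be reproduced with a small helper. This is a hedged sketch; the function name and sample data are illustrative, not the study's code or data.

```python
import statistics

def agreement_stats(intraop, subjective, tol=0.50):
    """Pearson correlation r between two sets of spherical-equivalent
    readings, and the fraction of eyes agreeing within `tol` dioptres.
    Illustrative helper, not the authors' analysis code."""
    n = len(intraop)
    mx, my = statistics.mean(intraop), statistics.mean(subjective)
    sx, sy = statistics.stdev(intraop), statistics.stdev(subjective)
    # Sample Pearson correlation coefficient.
    r = sum((x - mx) * (y - my) for x, y in zip(intraop, subjective)) / ((n - 1) * sx * sy)
    # Fraction of paired readings within the tolerance.
    within = sum(abs(x - y) <= tol for x, y in zip(intraop, subjective)) / n
    return r, within
```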

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To determine the utility of a range of clinical and non-clinical indicators to aid the initial selection of the optimum presbyopic contact lens. In addition, to assess whether lens preference was influenced by visual performance compared to the other designs trialled (intra-subject) or compared to participants who preferred other designs (inter-subject). METHODS: A double-masked randomised crossover trial of Air Optix Aqua multifocal, PureVision 2 for Presbyopia, Acuvue OASYS for Presbyopia, Biofinity multifocal and monovision was conducted on 35 presbyopes (54.3 ± 6.2 years). Participant lifestyle, personality, pupil characteristics and aberrometry were assessed prior to lens fitting. After 4 weeks of wear, high- and low-contrast visual acuity (VA) under photopic and mesopic conditions, reading speed, Near Activity Visual Questionnaire (NAVQ) rating, subjective quality-of-vision scoring, defocus curves, stereopsis, halometry, aberrometry and ocular physiology were quantified. RESULTS: After trialling all the lenses, preference was mixed (n=12 Biofinity, n=10 monovision, n=7 PureVision, n=4 Air Optix Aqua, n=2 OASYS). Lens preference was not dependent on personality (F=1.182, p=0.323) or on the hours spent working at near (p=0.535) or intermediate (p=0.759) distances. No inter-subject or strong intra-subject relationships emerged between lens preference and reading speed, NAVQ rating, halo size, aberrometry or ocular physiology (p>0.05). CONCLUSIONS: Participant lifestyle and personality, ocular optics, contact lens visual performance and ocular physiology were poor indicators of the preferred lens type after 4 weeks of wear. This is confounded by the wide range of visual task demands of presbyopes and the limited optical differences between current multifocal contact lens designs.

Relevance:

30.00%

Publisher:

Abstract:

Corneal surface laser ablation procedures for the correction of refractive error have enjoyed a resurgence of interest, especially in patients with a possible increased risk of complications after lamellar surgery. Improvements in the understanding of corneal biomechanical changes, the modulation of wound healing, laser technology (including ablation profiles) and different methods of epithelial removal have widened the scope for surface ablation. This article discusses photorefractive keratectomy, trans-epithelial photorefractive keratectomy, laser-assisted sub-epithelial keratomileusis and epithelial laser-assisted in situ keratomileusis (epi-LASIK). © 2010 The Authors. Journal compilation © 2010 Royal Australian and New Zealand College of Ophthalmologists.

Relevance:

30.00%

Publisher:

Abstract:

Respiratory gating in lung PET imaging to compensate for respiratory motion artifacts is a current research issue with broad potential impact on the quantitation, diagnosis and clinical management of lung tumors. However, PET images collected in discrete gated bins can be significantly affected by noise, as there are lower activity counts in each bin unless the total PET acquisition time is prolonged; gating methods should therefore be combined with image-based motion correction and registration methods. The aim of this study was to develop and validate a fast and practical solution to the problem of respiratory motion for the detection and accurate quantitation of lung tumors in PET images. This included: (1) developing a computer-assisted algorithm for PET/CT images that automatically segments lung regions in CT images and identifies and localizes lung tumors in PET images; (2) developing and comparing different registration algorithms that process all the information within the entire respiratory cycle and integrate the tumor data from the different gated bins into a single reference bin. Four registration/integration algorithms (Centroid-Based, Intensity-Based, Rigid-Body and Optical-Flow registration) were compared, as well as two registration schemes (the Direct Scheme and the Successive Scheme). Validation was demonstrated by conducting experiments with the computerized 4D NCAT phantom and with a dynamic lung-chest phantom imaged using a GE PET/CT system. Experiments were repeated with simulated tumors of different sizes and at different noise levels. Static tumors without respiratory motion were used as the gold standard; quantitative results were compared with respect to tumor activity concentration, cross-correlation coefficient, relative noise level and computation time. Comparing the tumors before and after correction, the tumor activity values and tumor volumes were closer to those of the static tumors (the gold standard). Higher correlation values and lower noise were also achieved after applying the correction algorithms. With this method, the compromise between short PET scan time and reduced image noise can be achieved, while quantification and clinical analysis become fast and precise.
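The Centroid-Based registration idea can be sketched in a few lines, under the assumption that each gated bin is shifted so its tumour centroid aligns with the reference bin before all bins are summed (a Direct-Scheme-style integration). The names and details are illustrative, not the authors' implementation.

```python
import numpy as np

def centroid_register(bins, ref=0):
    """Centroid-based integration of respiratory-gated PET bins:
    shift every bin so its intensity centroid matches that of the
    reference bin, then sum all bins. Illustrative sketch only."""
    def centroid(img):
        idx = np.indices(img.shape).reshape(img.ndim, -1)
        w = img.reshape(-1)
        return (idx * w).sum(axis=1) / w.sum()

    target = centroid(bins[ref])
    out = np.zeros_like(bins[ref], dtype=float)
    for b in bins:
        # Integer shift aligning this bin's centroid with the reference.
        shift = tuple(np.round(target - centroid(b)).astype(int))
        out += np.roll(b, shift, axis=tuple(range(b.ndim)))
    return out
```

Summing the aligned bins recovers the full activity counts of the acquisition while suppressing the motion blur, which is the compromise the abstract describes.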

Relevance:

30.00%

Publisher:

Abstract:

In topographically flat wetlands, where a shallow water table and conductive soil may develop as a result of wet and dry seasons, the connection between surface water and groundwater is not only present, but is perhaps the key factor governing the magnitude and direction of water flux. Due to their complex characteristics, modeling water flow through wetlands with more realistic process formulations (integrated surface–groundwater flow and vegetative resistance) is a real necessity. This dissertation focused on developing an integrated surface–subsurface hydrologic simulation numerical model by programming and testing the coupling of the USGS MODFLOW-2005 Groundwater Flow Process (GWF) package (USGS, 2005) with the 2D surface water routing model FLO-2D (O'Brien et al., 1993). The coupling included the procedures necessary to numerically integrate and verify both models as a single computational software system, referred to hereafter as WHIMFLO-2D (Wetlands Hydrology Integrated Model). An improved physical formulation of flow resistance through vegetation in shallow waters, based on the concept of drag force, was implemented for floodplain simulations, while the classical methods (e.g., Manning, Chezy, Darcy-Weisbach) for calculating flow resistance were maintained for canals and deeper waters. A preliminary demonstration exercise of WHIMFLO-2D at an existing field site was developed for the Loxahatchee Impoundment Landscape Assessment (LILA), an 80-acre area located at the Arthur R. Marshall Loxahatchee National Wildlife Refuge in Boynton Beach, Florida. After applying a number of simplifying assumptions, the results illustrated the ability of the model to simulate the hydrology of a wetland. In this illustrative case, a comparison between measured and simulated stage levels showed an average error of 0.31% with a maximum error of 2.8%. A comparison of measured and simulated groundwater head levels showed an average error of 0.18% with a maximum of 2.9%. The coupling of the FLO-2D model with the MODFLOW-2005 model and the incorporation of the dynamic effect of vegetative flow resistance in the new modeling tool WHIMFLO-2D are an important contribution to the field of numerical modeling of hydrologic flow in wetlands.
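The two resistance formulations mentioned above can be contrasted in a short sketch. The drag-force form of the friction slope for flow through emergent vegetation, S_f = C_d a v^2 / (2g) with a the frontal plant area per unit volume, is one common formulation; whether WHIMFLO-2D uses exactly this expression is an assumption, and the Manning form is the classical alternative retained for canals and deeper water.

```python
G = 9.81  # gravitational acceleration, m/s^2

def friction_slope_drag(v, cd, a):
    """Friction slope from vegetative drag for emergent vegetation:
    S_f = Cd * a * v^2 / (2 g), with a in m^-1. Illustrative form,
    not WHIMFLO-2D source code."""
    return cd * a * v * v / (2.0 * G)

def friction_slope_manning(v, n, h):
    """Classical Manning friction slope S_f = n^2 v^2 / h^(4/3)
    for a wide channel of depth h (deeper-water case)."""
    return n * n * v * v / h ** (4.0 / 3.0)
```

The drag form makes resistance depend explicitly on vegetation density rather than on a lumped roughness coefficient, which is the physical improvement the abstract highlights.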

Relevance:

30.00%

Publisher:

Abstract:

This dissertation introduces a new system for handwritten text recognition based on an improved neural network design. Most existing neural networks use the mean square error as the standard error function. The system proposed in this dissertation instead utilizes the mean quartic error function, whose third and fourth derivatives are non-zero. Consequently, many improvements to the training methods were achieved. The training results are carefully assessed before and after each update. To evaluate the performance of a training system, three essential factors must be considered, listed here from highest to lowest priority: (1) the error rate on the testing set, (2) the processing time needed to recognize a segmented character and (3) the total training time and, subsequently, the total testing time. It is observed that bounded training methods accelerate the training process, while semi-third-order training methods, next-minimal training methods, and preprocessing operations reduce the error rate on the testing set. Empirical observations suggest that two different combinations of training methods are needed for lower- and upper-case character recognition. Since character segmentation is required for word and sentence recognition, this dissertation also provides an effective rule-based segmentation method, which differs from the conventional adaptive segmentation methods. Dictionary-based correction is utilized to correct mistakes resulting from the recognition and segmentation phases. The integration of the segmentation methods with the handwritten character recognition algorithm yielded an accuracy of 92% for lower-case characters and 97% for upper-case characters. In the testing phase, the database consisted of 20,000 handwritten characters, with 10,000 for each case. Recognizing 10,000 handwritten test characters required 8.5 seconds of processing time.
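The mean quartic error and its first derivative can be written down in a few lines; this is a generic sketch of the error function named above, not the dissertation's training code. Note that for a residual r = y − t the third and fourth derivatives of r^4 are 24r and 24, both non-zero, unlike the mean square error whose third derivative vanishes.

```python
def mean_quartic_error(outputs, targets):
    """Mean quartic error: E = (1/n) * sum (y - t)^4.
    Generic sketch of the error function, not the thesis code."""
    n = len(outputs)
    return sum((y - t) ** 4 for y, t in zip(outputs, targets)) / n

def mqe_gradient(outputs, targets):
    """dE/dy_i = 4 (y_i - t_i)^3 / n for the mean quartic error."""
    n = len(outputs)
    return [4.0 * (y - t) ** 3 / n for y, t in zip(outputs, targets)]
```

Because the gradient grows cubically with the residual, large errors are penalized much more strongly than under the mean square error, which is what higher-order training methods can exploit.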

Relevance:

30.00%

Publisher:

Abstract:

The major objectives of this dissertation were to develop optimal spatial techniques to model the spatial-temporal changes of the lake sediments and their nutrients from 1988 to 2006, and to evaluate the impacts of the hurricanes that occurred during 1998–2006. The mud zone shrank by about 10.5% from 1988 to 1998, and grew by about 6.2% from 1998 to 2006. Mud areas, volumes and weights were calculated using validated kriging models. From 1988 to 1998, mud thicknesses increased by up to 26 cm in the central lake area. The mud area and volume decreased by about 13.78% and 10.26%, respectively. From 1998 to 2006, mud depths declined by up to 41 cm in the central lake area, and mud volume fell by about 27%. Mud weight increased by up to 29.32% from 1988 to 1998, but fell by over 20% from 1998 to 2006. The reduction of mud sediments is likely due to re-suspension and redistribution by waves and currents produced by large storm events, particularly Hurricanes Frances and Jeanne in 2004 and Wilma in 2005. Regression, kriging, geographically weighted regression (GWR) and regression-kriging models were calibrated and validated for the spatial analysis of the lake's sediment TP and TN. The GWR models provided the most accurate predictions for TP and TN based on model performance and error analysis. TP values declined from an average of 651 to 593 mg/kg from 1998 to 2006, especially in the lake's western and southern regions. From 1988 to 1998, TP declined in the northern and southern areas and increased in the central-western part of the lake. TP weights increased by about 37.99%–43.68% from 1988 to 1998 and decreased by about 29.72%–34.42% from 1998 to 2006. From 1988 to 1998, TN decreased in most areas, especially in the northern and southern lake regions; the western littoral zone had the biggest increase, up to 40,000 mg/kg. From 1998 to 2006, TN declined from an average of 9,363 to 8,926 mg/kg, especially in the central and southern regions. The biggest increases occurred in the northern lake and southern edge areas. TN weights increased by about 15%–16.2% from 1988 to 1998, and decreased by about 7%–11% from 1998 to 2006.
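A GWR prediction at a single location reduces to weighted least squares with a distance-decay kernel. The sketch below uses a Gaussian kernel, which is a common but assumed choice; the dissertation's exact kernel and bandwidth-selection method are not stated, and all names here are illustrative.

```python
import numpy as np

def gwr_point(x, y, coords, target, bandwidth):
    """Geographically weighted regression estimate at one location:
    weighted least squares with Gaussian weights
    w_i = exp(-(d_i / bandwidth)^2 / 2). Illustrative sketch only."""
    # Distance of every sample to the prediction location.
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    # Design matrix with intercept; solve the weighted normal equations.
    X = np.column_stack([np.ones(len(x)), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta  # local intercept and slope
```

Because the weights vary with the prediction location, the regression coefficients vary over the lake surface, which is what lets GWR outperform a single global regression for TP and TN.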



Relevance:

30.00%

Publisher:

Abstract:

On the basis of data on benthic foraminifera and sediment biogeochemistry (contents of total organic carbon, calcium carbonate and biogenic opal) in two cores (1265 and 1312 m water depth) from the southeastern Sakhalin slope and one core (839 m water depth) from the southwestern Kamchatka slope, variations of the oxygen minimum zone in the Okhotsk Sea during the last 50 ka are reconstructed. The oxygen minimum zone was less pronounced during the MIS 2 cooling, which is suggested to have been caused by maximal expansion of the sea-ice cover, a decrease in marine productivity and an increase in the production of oxygenated Okhotsk Sea Intermediate Water (OSIW). A two-step strengthening of the oxygen minimum zone during the warmings of Terminations 1a and 1b was associated with (1) enhanced oxygen consumption due to the decomposition of large amounts of organic matter in the water column and bottom sediments, driven by increased marine productivity and the supply of terrigenous material from submerged northern shelves; (2) sea-ice cover retreat and reduced OSIW production; and (3) free inflow of the oxygen-depleted intermediate water mass from the North Pacific.

Relevance:

30.00%

Publisher:

Abstract:

Reliable dating of glaciomarine sediments deposited on the Antarctic shelf since the Last Glacial Maximum (LGM) is very challenging because of the general absence of calcareous (micro-)fossils and the recycling of fossil organic matter. As a consequence, radiocarbon (14C) ages of the acid-insoluble organic fraction (AIO) of the sediments bear uncertainties that are very difficult to quantify. In this paper we present the results of three different chronostratigraphic methods used to date a sedimentary unit consisting of diatomaceous ooze and diatomaceous mud that was deposited following the last deglaciation at five core sites on the inner shelf of the western Amundsen Sea (West Antarctica). In three cores, conventional 14C dating of the AIO in bulk sediment samples yielded down-core age reversals, but at all sites the AIO 14C ages obtained from diatomaceous ooze within the diatom-rich unit yielded similar uncorrected 14C ages, ranging from 13,517±56 to 11,543±47 years before present (yr BP). Correction of these ages by subtracting the core-top ages, which are assumed to reflect present-day deposition (as indicated by 210Pb dating of the sediment surface at one core site), yielded ages between ca. 10,500 and 8,400 calibrated years before present (cal yr BP). Correction of the AIO ages of the diatomaceous ooze by subtracting only the marine reservoir effect (MRE) of 1,300 years indicated deposition of the diatom-rich sediments between 14,100 and 11,900 cal yr BP. Most of these ages are consistent with age constraints of 13.0 to 8.0 ka BP for the diatom-rich unit, which we obtained by correlating the relative palaeomagnetic intensity (RPI) records of three of the sediment cores with global and regional reference curves for palaeomagnetic intensity. As a third dating technique we applied conventional radiocarbon dating to the AIO included in acid-cleaned diatom hard parts extracted from the diatomaceous ooze. This method yielded uncorrected 14C ages of only 5,111±38 and 5,106±38 yr BP, respectively. We reject these young ages, because they are likely to be overprinted by the adsorption of modern atmospheric carbon dioxide onto the surfaces of the extracted diatom hard parts prior to sample graphitisation and combustion for 14C dating. The deposition of the diatom-rich unit in the western Amundsen Sea suggests deglaciation of the inner shelf before ca. 13 ka BP. The deposition of diatomaceous oozes on other parts of the Antarctic shelf around the same time, however, seems to be coincidental rather than directly related.
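The two AIO age-correction strategies described above amount to simple subtractions in conventional 14C years (calibration to cal yr BP is a separate, subsequent step); the helper names below are illustrative, and the example core-top age is hypothetical.

```python
def mre_corrected(raw_age, mre=1300):
    """Subtract a constant marine reservoir effect (here the 1,300-yr
    MRE quoted in the paper) from an uncorrected AIO 14C age."""
    return raw_age - mre

def core_top_corrected(raw_age, core_top_age):
    """Subtract the core-top AIO age, assumed to represent present-day
    deposition, from a down-core AIO 14C age."""
    return raw_age - core_top_age
```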

Relevance:

30.00%

Publisher:

Abstract:

It was recently shown [Phys. Rev. Lett. 110, 227201 (2013)] that the critical behavior of the random-field Ising model in three dimensions is ruled by a single universality class. This conclusion was reached only after properly taming the large scaling corrections of the model by applying a combined approach of various techniques from the zero- and positive-temperature toolboxes of statistical physics. In the present contribution we provide a detailed description of this combined scheme, explaining in detail the zero-temperature numerical scheme and developing the generalized fluctuation-dissipation formula that allowed us to compute the connected and disconnected correlation functions of the model. We discuss the evolution of the error in our method and illustrate the infinite-size limit extrapolation of several observables within phenomenological renormalization. We present an extension of the quotients method that allows us to obtain estimates of the critical exponent α of the specific heat of the model via the scaling of the bond energy, and we discuss the self-averaging properties of the system and the algorithmic aspects of the maximum-flow algorithm used.
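The quotients method mentioned above estimates an exponent ratio from the same observable measured at the critical point on lattice sizes L and 2L. The one-line sketch below shows the standard form O(2L)/O(L) = 2^(x/ν), not the authors' extended version for the specific-heat exponent.

```python
from math import log

def quotient_exponent(o_L, o_2L):
    """Quotients-method estimate of an exponent ratio x/nu from an
    observable measured at criticality on sizes L and 2L, using
    O(2L) / O(L) = 2^(x/nu). Standard form, illustrative only."""
    return log(o_2L / o_L) / log(2.0)
```

Repeating the estimate for successive size pairs (L, 2L) and extrapolating in 1/L is how scaling corrections are removed in practice.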

Relevance:

30.00%

Publisher:

Abstract:

We recently published an article (García-Pérez & Alcalá-Quintana, 2010) reanalyzing data presented by Lapid, Ulrich, and Rammsayer (2008) and discussing a theoretical argument developed by Ulrich and Vorberg (2009). The purpose of this note is to correct an error in our study that has some theoretical importance, although it does not affect the conclusion that was reached. The error is that asymptote parameters reflecting lapses or finger errors should not enter the constraint relating the psychometric functions that describe performance when the comparison stimulus in a two-alternative forced-choice (2AFC) discrimination task is presented in the first or the second interval.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient's medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast-acquisition imaging device with higher spatial resolution and a higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented on a gray scale of independent values in Hounsfield units (HU); higher HU values represent denser material. High-density materials, such as metal, tend to erroneously increase the HU values around them due to reconstruction software limitations. This problem of increased HU values due to the presence of metal is referred to as metal artefacts. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts and improve image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This research study evaluates the dosimetric effect of metal artefact reduction algorithms on severe artefacts in CT images, using the Gemstone Spectral Imaging (GSI)-based MAR algorithm, the projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method.

Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), while the Dual-Energy Imaging Method was developed at Duke University. All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc and single-arc, using the Volumetric Modulated Arc Therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used the projection-based MAR algorithm and the Dual-Energy Method. Calculated doses (mean, minimum, and maximum) to the planning target volume (PTV) were compared, and the homogeneity index (HI) was calculated.

Results: (1) Without the GSI-based MAR application, a percent error between the mean dose and the absolute dose ranging from 3.4% to 5.7% per fraction was observed. In contrast, the error decreased to a range of 0.09% to 2.3% per fraction with the GSI-based MAR algorithm. There was a percent difference of 1.7% to 4.2% per fraction between plans with and without the GSI-based MAR algorithm. (2) A difference of 0.1% to 3.2% was observed for the maximum dose values, 1.5% to 10.4% for the minimum dose, and 1.4% to 1.7% for the mean doses. Homogeneity indexes (HI) ranging from 0.065 to 0.068 for the dual-energy method and 0.063 to 0.141 with the projection-based MAR algorithm were also calculated.
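The two figures of merit used throughout the Results, percent dose error and homogeneity index, are straightforward to compute. The HI formula below, (Dmax - Dmin) / Dmean, is one common definition and is an assumption here, since the thesis text does not spell out which form it uses.

```python
def percent_error(calculated, reference):
    """Percent error between a calculated dose and the reference
    (measured/absolute) dose. Illustrative helper."""
    return abs(calculated - reference) / reference * 100.0

def homogeneity_index(d_max, d_min, d_mean):
    """Homogeneity index HI = (Dmax - Dmin) / Dmean; an HI of zero
    indicates a perfectly uniform dose across the PTV. This is a
    common but assumed definition, not necessarily the thesis's."""
    return (d_max - d_min) / d_mean
```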

Conclusion: (1) The percent error without the GSI-based MAR algorithm may be as high as 5.7%. This error undermines the goal of radiation therapy to deliver precise treatment; the GSI-based MAR algorithm is therefore desirable for its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of the different techniques, but deviation was evident in the maximum and minimum doses. The HI for the dual-energy method almost achieved the desirable null value. In conclusion, the Dual-Energy method gave better dose calculation accuracy to the planning target volume (PTV) for images with metal artefacts than plans either with or without the GE MAR algorithm.