920 results for field method
Abstract:
The study developed statistical techniques to evaluate visual field progression for use with the Humphrey Field Analyzer (HFA). The long-term fluctuation (LF) was evaluated in stable glaucoma. The magnitude of both LF components showed little relationship with mean deviation (MD), corrected pattern standard deviation (CPSD) and short-term fluctuation (SF). An algorithm was proposed for determining the clinical necessity for a confirmatory follow-up examination. The between-examination variability was determined for the HFA Standard and FASTPAC algorithms in glaucoma. FASTPAC exhibited greater between-examination variability than the Standard algorithm across the range of sensitivities and with increasing eccentricity. The difference in variability between the algorithms had minimal clinical significance. The effect of repositioning the baseline in the Glaucoma Change Probability Analysis (GCPA) was evaluated. The global baseline of the GCPA limited the detection of progressive change at a single stimulus location. A new technique, pointwise univariate linear regression (ULR) of absolute sensitivity, and of pattern deviation, against time to follow-up, was developed. In each case, pointwise ULR was more sensitive to localised progressive changes in sensitivity than ULR of MD alone. Small changes in sensitivity were more readily determined by pointwise ULR than by the GCPA. A comparison between the outcome of pointwise ULR for all fields and for the last six fields revealed both linear and curvilinear declines in absolute sensitivity and pattern deviation. A method for delineating progressive loss in glaucoma, based upon the error in the forecasted sensitivity of a multivariate model, was developed. Multivariate forecasting exhibited little agreement with GCPA in glaucoma but showed promise for monitoring visual field progression in ocular hypertensive (OHT) patients. The recovery of sensitivity in optic neuritis over time was modelled with a cumulative Gaussian function. The rate and level of recovery were greater in the peripheral than the central field. Probability models to forecast the field of recovery were proposed.
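The pointwise ULR idea reduces to an ordinary least-squares fit per stimulus location; below is a minimal sketch, assuming a visits-by-locations data layout and illustrative slope and significance criteria (not the thesis's exact implementation):

```python
import numpy as np
from scipy import stats

def pointwise_ulr(times, fields, p_crit=0.05, slope_crit=-1.0):
    """Regress sensitivity (dB) against follow-up time at each stimulus
    location; flag locations with a significant negative slope.

    times  : (n_visits,) follow-up times in years
    fields : (n_visits, n_locations) sensitivities in dB
    """
    n_locations = fields.shape[1]
    progressing = np.zeros(n_locations, dtype=bool)
    for loc in range(n_locations):
        slope, _, _, p_value, _ = stats.linregress(times, fields[:, loc])
        # Flag loss faster than slope_crit dB/year at significance p_crit
        # (both cut-offs are illustrative assumptions)
        if slope < slope_crit and p_value < p_crit:
            progressing[loc] = True
    return progressing
```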
Abstract:
A new instrument and method are described that allow the hydraulic conductivities of highly permeable porous materials, such as gravels in constructed wetlands, to be determined in the field. The instrument consists of a Mariotte siphon and a submersible permeameter cell with manometer take-off tubes, recreating in situ the constant-head permeameter test typically used with excavated samples. It allows permeability to be measured at different depths and positions over the wetland. Repeatability at fixed positions was good (normalised standard deviation of 1–4%), and results obtained for highly homogeneous silica sand compared well when the sand was retested in a laboratory permeameter (0.32 mm/s and 0.31 mm/s respectively). Practical results carry an associated uncertainty of ±30% because of the mixed effect of natural variation in gravel core profiles and disruption of interstitial clogging during insertion of the tube into the gravel. This error is small, however, compared to the orders-of-magnitude spatial variations detected. The technique was used to survey the hydraulic conductivity profiles of two constructed wetlands in the UK, aged 1 and 15 years respectively. Measured values were high (up to 900 mm/s) and varied by three orders of magnitude, reflecting the immaturity of the wetland. Detailed profiling of the younger system suggested the existence of preferential flow paths at a depth of 200 mm, corresponding to the transition between coarser and finer gravel layers (6–12 mm and 3–6 mm respectively), and transverse drift towards the outlet.
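The constant-head computation that the permeameter recreates in situ follows Darcy's law; a minimal sketch with illustrative values (the function and numbers are not taken from the paper):

```python
def hydraulic_conductivity(flow_rate, area, head_drop, spacing):
    """Darcy's law for a constant-head permeameter cell:
    K = (Q / A) / (dh / L), with Q the volumetric flow rate (m^3/s),
    A the cell cross-section (m^2), dh the head difference between
    manometer take-offs (m), and L their spacing (m)."""
    return (flow_rate / area) / (head_drop / spacing)

# Example (hypothetical values): 0.05 L/s through a 0.01 m^2 cell,
# 25 mm head drop over a 0.1 m take-off spacing
K = hydraulic_conductivity(5e-5, 0.01, 0.025, 0.1)  # -> 0.02 m/s = 20 mm/s
```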
Abstract:
We investigate the design of electronic dispersion compensation (EDC) using full optical-field reconstruction in 10 Gbit/s on-off keyed transmission systems limited by optical signal-to-noise ratio (OSNR). By effectively suppressing the impairment due to low-frequency component amplification in phase reconstruction, properly designing the transmission system configuration to combat fiber nonlinearity, and successfully reducing the vulnerability to thermal noise, a 4.8 dB OSNR margin can be achieved for 2160 km single-mode fiber transmission without any optical dispersion compensation. We also investigate the performance sensitivity of the scheme to various system parameters, and propose a novel method to greatly enhance the tolerance to differential phase misalignment of the asymmetric Mach-Zehnder interferometer. This numerical study provides important design guidelines which will enable full optical-field EDC to become a cost-effective dispersion compensation solution for future transparent optical networks.
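The generic post-detection step that full-field reconstruction enables is frequency-domain equalization of chromatic dispersion; below is a minimal sketch of that standard step, with typical single-mode fiber parameters assumed rather than the paper's design:

```python
import numpy as np

def compensate_dispersion(field, sample_rate, beta2, length):
    """Apply the inverse of the fiber dispersion transfer function
    H(w) = exp(1j * beta2/2 * w^2 * L) to a reconstructed complex
    optical field (baseband samples). The sign of the exponent depends
    on the Fourier convention; flip it if dispersion doubles instead
    of cancelling."""
    n = field.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / sample_rate)
    inverse_h = np.exp(1j * (beta2 / 2.0) * w**2 * length)
    return np.fft.ifft(np.fft.fft(field) * inverse_h)

# Example (typical values, not the paper's): 2160 km of standard SMF,
# beta2 ~ -2.17e-26 s^2/m (about -21.7 ps^2/km), sampled at 50 GS/s
# compensated = compensate_dispersion(field, 50e9, -2.17e-26, 2160e3)
```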
Abstract:
We investigate the pattern-dependent decoding failures in full-field electronic dispersion compensation (EDC) by offline processing of experimental signals, and find that the performance of such an EDC receiver may be degraded by an isolated "1" bit surrounded by long strings of consecutive "0s". By reducing the probability of occurrence of this kind of isolated "1" and using a novel adaptive threshold decoding method, we greatly improve the compensation performance to achieve 10-Gb/s on-off keyed signal transmission over 496-km field-installed single-mode fiber without optical dispersion compensation.
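The paper's adaptive threshold decoder is not specified here; the sketch below is only a generic illustration of pattern-dependent thresholding, in which the decision level tracks a local signal average so that an isolated high sample amid long runs of low samples is less likely to be missed (window size and weighting are assumptions):

```python
import numpy as np

def adaptive_threshold_decode(samples, window=5, scale=0.5):
    """Decode received intensity samples with a locally adapted
    threshold: the decision level blends a moving average with the
    global mean, so it drops inside long runs of "0s" and an isolated
    "1" still clears it."""
    kernel = np.ones(window) / window
    local_mean = np.convolve(samples, kernel, mode="same")
    threshold = scale * (local_mean + samples.mean())
    return (samples > threshold).astype(int)
```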
Abstract:
Magnetoencephalography (MEG), a non-invasive technique for characterizing brain electrical activity, is gaining popularity as a tool for assessing group-level differences between experimental conditions. One method for assessing task-condition effects involves beamforming, where a weighted sum of field measurements is used to estimate activity on a voxel-by-voxel basis. However, this method has been shown to produce inhomogeneous smoothness differences as a function of signal-to-noise across a volumetric image, which can then produce false positives at the group level. Here we describe a novel method for group-level analysis with MEG beamformer images that utilizes the peak locations within each participant's volumetric image to assess group-level effects. We compared our peak-clustering algorithm with SnPM using simulated data. We found that our method was immune to artefactual group effects that can arise as a result of inhomogeneous smoothness differences across a volumetric image. We also used our peak-clustering algorithm on experimental data and found that regions were identified that corresponded with task-related regions identified in the literature. These findings suggest that our technique is a robust method for group-level analysis with MEG beamformer images.
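A simplified sketch of the peak-pooling idea follows (not the authors' algorithm: the peak count, neighbourhood size and the absence of a permutation-based significance test are all simplifications):

```python
import numpy as np
from scipy import ndimage

def group_peak_counts(images, n_peaks=10, radius=2):
    """Pool the strongest local maxima from each participant's
    beamformer volume and count, per voxel neighbourhood, how many
    participants contribute a peak there."""
    counts = np.zeros(images[0].shape, dtype=int)
    for img in images:
        # Local maxima: voxels equal to the max of their 3x3x3 patch
        is_peak = img == ndimage.maximum_filter(img, size=3)
        flat = np.where(is_peak.ravel(), img.ravel(), -np.inf)
        top = np.unravel_index(np.argsort(flat)[-n_peaks:], img.shape)
        mask = np.zeros(img.shape, dtype=bool)
        mask[top] = True
        # Dilate each peak to a small neighbourhood before counting
        counts += ndimage.binary_dilation(mask, iterations=radius)
    return counts  # high counts mark candidate group-level effects
```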
Abstract:
The principal theme of this thesis is the advancement and expansion of ophthalmic research via collaboration between professional engineers and professional optometrists. The aim has been to develop new and novel approaches and solutions to contemporary problems in the field. The work is subdivided into three areas of investigation: 1) high-technology systems, 2) modification of current systems to increase functionality, and 3) development of smaller, more portable and cost-effective systems. High-technology systems: A novel high-speed Optical Coherence Tomography (OCT) system with integrated simultaneous high-speed photography was developed, achieving better operational speed than is currently available commercially. The mechanical design of the system featured a novel 8-axis alignment system. A full set of capture, analysis and post-processing software was developed, providing custom analysis systems for ophthalmic OCT imaging and expanding the current capabilities of the technology. A large clinical trial was undertaken to test the dynamics of contact lens edge interaction with the cornea in vivo. The interaction between lens edge design, lens base curvature, post-insertion times and edge positions was investigated. A novel method for correction of optical distortion when assessing lens indentation was also demonstrated. Modification of current systems: A commercial autorefractor, the WAM-5500, was modified with the addition of extra hardware and a custom software and firmware solution to produce a system capable of measuring dynamic accommodative response to various stimuli in real time. A novel software package to control the data capture process was developed, allowing real-time monitoring of data by the practitioner and adding considerable functionality beyond the standard system. The device was used to assess the differences in accommodative response between subjects who had worn UV-blocking contact lenses for 5 years and a control group that had not. While the standard static measurement of accommodation showed no differences between the two groups, the UV-blocking group did show faster accommodative rise and fall times, demonstrating the benefits of modifying this commercially available instrumentation. Portable and cost-effective systems: A new instrument was developed to expand the capability of the now defunct Keeler Tearscope. The device provided a similar capability, allowing observation of the reflected mires from the tear film surface, but with the added advantage of being able to record the observations. The device was tested comparatively against the Tearscope and other tear film break-up techniques, demonstrating its potential. In conclusion: this work has successfully demonstrated the advantages of interdisciplinary research between engineering and ophthalmic science, providing new and novel instrumented solutions as well as adding to the sum of scientific understanding in the ophthalmic field.
Abstract:
An iterative procedure is proposed for the reconstruction of a temperature field from a linear stationary heat equation with stochastic coefficients and stochastic Cauchy data given on a part of the boundary of a bounded domain. In each step, a series of mixed well-posed boundary-value problems is solved for the stochastic heat operator and its adjoint. Well-posedness of these problems is shown to hold, and convergence in the mean of the procedure is proved. A discretized version of this procedure, based on a Monte Carlo Galerkin finite-element method and suitable for numerical implementation, is discussed. It is demonstrated that the solution to the discretized problem converges to the continuous solution as the mesh size tends to zero.
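In the deterministic setting, an iteration of this type can be written schematically as follows (a sketch only; the stochastic version replaces these solves by their stochastic counterparts, with convergence in the mean):

```latex
% Cauchy problem: L u = 0 in D, with both u = f and \partial_\nu u = g known
% on \Gamma_0, and no data on \Gamma_1. Let K map a guess \eta for the missing
% data on \Gamma_1 to the trace on \Gamma_0 of the corresponding mixed
% well-posed solution. One Landweber step then reads
\eta_{k+1} = \eta_k - \gamma\, K^{*}\bigl(K\eta_k - f\bigr),
\qquad 0 < \gamma < \tfrac{2}{\|K\|^{2}},
% where applying K and K^{*} each amounts to one mixed boundary-value solve,
% for the operator and for its adjoint respectively.
```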
Abstract:
In this study, we investigate the problem of reconstructing a stationary temperature field from given temperature and heat flux on a part of the boundary of a semi-infinite region containing an inclusion. This situation can be modelled as a Cauchy problem for the Laplace operator, and it is an ill-posed problem in the sense of Hadamard. We propose and investigate a Landweber-Fridman type iterative method, which preserves the (stationary) heat operator, for the stable reconstruction of the temperature field on the boundary of the inclusion. In each iteration step, mixed boundary value problems for the Laplace operator are solved in the semi-infinite region. Well-posedness of these problems is investigated and convergence of the procedure is discussed. For the numerical implementation of these mixed problems, an efficient boundary integral method is proposed, based on the indirect variant of the boundary integral approach. Using this approach, the mixed problems are reduced to integral equations over the (bounded) boundary of the inclusion. Numerical examples are included showing that stable and accurate reconstructions of the temperature field on the boundary of the inclusion can be obtained even in the case of noisy data. These results are compared with those obtained with the alternating iterative method.
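The indirect boundary integral reduction referred to above typically rests on a single-layer ansatz; schematically (the particular choice of Green's function is an assumption about the details):

```latex
% Single-layer ansatz over the (bounded) inclusion boundary \Gamma:
u(x) = \int_{\Gamma} \psi(y)\, G(x,y)\, \mathrm{d}s(y), \qquad x \in D,
% with G a Green's function for the Laplacian adapted to the semi-infinite
% region (constructible by reflection). Imposing the mixed boundary conditions
% on \Gamma then yields integral equations for the density \psi over the
% bounded curve \Gamma alone.
```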
Abstract:
We consider a Cauchy problem for the heat equation, where the temperature field is to be reconstructed from the temperature and heat flux given on a part of the boundary of the solution domain. We employ a Landweber type method proposed in [2], where a sequence of mixed well-posed problems is solved at each iteration step to obtain a stable approximation to the original Cauchy problem. We develop an efficient boundary integral equation method for the numerical solution of these mixed problems, based on the method of Rothe. Numerical examples are presented with both exact and noisy data, showing the efficiency and stability of the proposed procedure and approximations.
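The method of Rothe discretizes in time first, so each step of the scheme needs only elliptic solves; schematically (uniform time step assumed):

```latex
% Rothe (backward-difference) semi-discretization of u_t = \Delta u
% with time step \tau:
\frac{u^{n} - u^{n-1}}{\tau} = \Delta u^{n}
\quad\Longleftrightarrow\quad
\Delta u^{n} - \tfrac{1}{\tau}\, u^{n} = -\tfrac{1}{\tau}\, u^{n-1},
\qquad n = 1, \dots, N,
% so each time level requires only an elliptic solve, which is what makes a
% boundary integral treatment of the mixed problems possible.
```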
Abstract:
The dynamics of the non-equilibrium Ising model with parallel updates is investigated using a generalized mean field approximation that incorporates multiple two-site correlations at any two time steps, which can be obtained recursively. The proposed method shows significant improvement in predicting local system properties compared to other mean field approximation techniques, particularly in systems with symmetric interactions. Results are also evaluated against those obtained from Monte Carlo simulations. The method is also employed to obtain parameter values for the kinetic inverse Ising modeling problem, where couplings and local field values of a fully connected spin system are inferred from data.
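For orientation, the simplest (naive) dynamical mean-field update for a parallel-update kinetic Ising model is sketched below; the generalized method described above additionally tracks two-site correlations at pairs of time steps, which this sketch omits (couplings and sizes are illustrative):

```python
import numpy as np

def naive_mean_field_trajectory(J, h, m0, steps, beta=1.0):
    """Naive dynamical mean field for a parallel-update kinetic Ising
    model: magnetizations evolve as
        m_i(t+1) = tanh(beta * (sum_j J_ij * m_j(t) + h_i)).
    """
    m = np.asarray(m0, dtype=float)
    trajectory = [m.copy()]
    for _ in range(steps):
        m = np.tanh(beta * (J @ m + h))
        trajectory.append(m.copy())
    return np.array(trajectory)

# Example: fully connected random couplings, as in the inverse-Ising setting
rng = np.random.default_rng(0)
N = 50
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
h = np.zeros(N)
traj = naive_mean_field_trajectory(J, h, rng.uniform(-1, 1, N), steps=20)
```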
Abstract:
Visual field assessment is a core component of glaucoma diagnosis and monitoring, and the Standard Automated Perimetry (SAP) test is still considered the gold standard of visual field assessment. Although SAP is a subjective assessment with many pitfalls, it is used routinely in the diagnosis of visual field loss in glaucoma. The multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP visual field assessment, while others were less informative and needed further adjustment and research. In this study, we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. OBJECTIVES: The purpose of this study is to examine the effectiveness of a new analysis method for the multifocal visual evoked potential (mfVEP) when used for objective assessment of the visual field in glaucoma patients, compared to the gold standard technique. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey Field Analyzer (HFA) 24-2 visual field tests and a single mfVEP test in one session. Analysis of the mfVEP results was done using the new analysis protocol: the Hemifield Sector Analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in the mean signal-to-noise ratio (SNR) (ANOVA, p<0.001 with a 95% CI). The differences between superior and inferior hemifields were statistically significant in 11/11 sectors in the glaucoma patient group (t-test, p<0.001), significant in 5/11 sectors in the glaucoma suspect group (t-test, p<0.01), and not significant for most sectors in the normal group (only 1/11 was significant; t-test, p<0.9). The sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, while for glaucoma suspects they were 89% and 79%. DISCUSSION: The results showed that the new analysis protocol was able to confirm existing field defects detected by standard HFA and to differentiate between the three study groups, with a clear distinction between normal subjects and patients with suspected glaucoma; the distinction between normal subjects and glaucoma patients was especially clear and significant. CONCLUSION: The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. The protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous field loss.
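A minimal sketch of the hemifield comparison underlying the HSA protocol follows (the data layout and sector pairing are assumptions; the protocol's exact sectorisation is defined by the study):

```python
import numpy as np
from scipy import stats

def hemifield_sector_tests(superior_snr, inferior_snr):
    """Paired t-tests of mfVEP signal-to-noise ratio between matched
    superior and inferior hemifield sectors.

    superior_snr, inferior_snr : (n_eyes, n_sectors) arrays of SNR,
    with column k of each array holding the same sector pair.
    """
    results = []
    for k in range(superior_snr.shape[1]):
        t, p = stats.ttest_rel(superior_snr[:, k], inferior_snr[:, k])
        results.append((k, t, p))
    # Small p-values in many sectors suggest asymmetric loss across
    # the horizontal midline, as in the glaucoma group above
    return results
```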
Abstract:
Objective: The purpose of this study was to examine the effectiveness of a new analysis method of mfVEP objective perimetry in the early detection of glaucomatous visual field defects compared to the gold standard technique. Methods and patients: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey Field Analyzer 24-2 visual field tests and a single mfVEP test in one session. Analysis of the mfVEP results was carried out using the new analysis protocol: the hemifield sector analysis protocol. Results: Analysis of the mfVEP showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (analysis of variance, P<0.001, with 95% confidence intervals of 2.82–2.89 for the normal group, 2.25–2.29 for the glaucoma suspect group, and 1.67–1.73 for the glaucoma group). The difference between superior and inferior hemifield sectors and hemi-rings was statistically significant in 11/11 pairs of sectors and hemi-rings in the glaucoma patient group (t-test, P<0.001), statistically significant in 5/11 pairs of sectors and hemi-rings in the glaucoma suspect group (t-test, P<0.01), and only 1/11 pairs was statistically significant in the normal group (t-test, P<0.9). The sensitivity and specificity of the hemifield sector analysis protocol in detecting glaucoma were 97% and 86% respectively, and 89% and 79% in glaucoma suspects. These results showed that the new analysis protocol was able to confirm existing visual field defects detected by standard perimetry, was able to differentiate between the three study groups with a clear distinction between normal patients and those with suspected glaucoma, and was able to detect early visual field changes not detected by standard perimetry. In addition, the distinction between normal and glaucoma patients was especially clear and significant using this analysis. Conclusion: The new hemifield sector analysis protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. This protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous visual field loss. The intersector analysis protocol can detect early field changes not detected by the standard Humphrey Field Analyzer test.
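For completeness, the sensitivity and specificity figures quoted above follow the usual confusion-matrix definitions; a minimal sketch:

```python
def sensitivity_specificity(true_positive, false_negative,
                            true_negative, false_positive):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = true_positive / (true_positive + false_negative)
    specificity = true_negative / (true_negative + false_positive)
    return sensitivity, specificity
```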
Abstract:
An iterative method for the reconstruction of a stationary three-dimensional temperature field from Cauchy data given on a part of the boundary is presented. At each iteration step, a series of mixed well-posed boundary value problems is solved for the heat operator and its adjoint. A convergence proof of this method in a weighted L²-space is included.
Abstract:
In this paper, we propose a new method using long digital straight segments (LDSSs) for fingerprint recognition, based on the observation that LDSSs in fingerprints can accurately characterize the global structure of fingerprints. Unlike orientation estimated from the slope of a straight segment, the length of an LDSS provides a measure of the stability of the estimated orientation. In addition, each digital straight segment can be represented by four parameters: x-coordinate, y-coordinate, slope and length. As a result, only about 600 bytes are needed to store all the parameters of the LDSSs of a fingerprint, which is much less than the storage an orientation field requires. Finally, LDSSs capture the structural information of local regions well. Consequently, LDSSs are more feasible to apply to the matching process than orientation fields. Experiments conducted on the fingerprint databases FVC2002 DB3a and DB4a show that our method is effective.
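A sketch of a compact LDSS encoding consistent with the figures above (the field widths and fixed-point slope are assumptions chosen to match the roughly 600-byte template size, not the paper's exact format):

```python
import struct

# One digital straight segment: x, y (pixel coords), slope, length.
# Packing each as a 16-bit value gives 8 bytes per segment, so roughly
# 75 segments fit the ~600-byte template size quoted above.
LDSS_FORMAT = "<HHhH"  # x, y, slope (signed fixed-point), length

def pack_ldss(segments):
    """Serialize (x, y, slope, length) tuples into a compact template."""
    return b"".join(struct.pack(LDSS_FORMAT, *s) for s in segments)

def unpack_ldss(blob):
    """Recover the list of (x, y, slope, length) tuples from a template."""
    size = struct.calcsize(LDSS_FORMAT)
    return [struct.unpack(LDSS_FORMAT, blob[i:i + size])
            for i in range(0, len(blob), size)]
```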
Abstract:
Reasonable choice is a critical success factor for decision-making in the field of software engineering (SE). A case-driven comparative analysis has been introduced, and a procedure for its systematic application has been suggested. The paper describes how the proposed method can be built into a general framework for SE activities. Some examples of experimental versions of the framework are briefly presented.