23 results for Direct Analysis Method
Abstract:
A new LIBS quantitative analysis method based on analytical line adaptive selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward to overcome the drawback of high dependency on a priori knowledge. Candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines to be used as input variables of the regression model are determined adaptively according to the samples used for both training and testing. Second, a LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval of a probability distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness than methods based on partial least squares regression, artificial neural networks and the standard support vector machine.
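A minimal sketch of this kind of probabilistic calibration, using scikit-learn's ARDRegression as a stand-in for the RVM (scikit-learn ships no RVM; ARD regression is a closely related sparse Bayesian model). The line intensities, weights and sample counts below are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
X_train = rng.random((23, 5))          # intensities of 5 selected analytical lines
# hypothetical Cr wt.% driven by a sparse subset of the lines, plus noise
y_train = X_train @ [3.0, 0.0, 1.5, 0.0, 0.8] + rng.normal(0, 0.05, 23)

model = ARDRegression()                # sparse Bayesian linear regression
model.fit(X_train, y_train)

X_test = rng.random((4, 5))
mean, std = model.predict(X_test, return_std=True)
for m, s in zip(mean, std):
    # 95% interval from the predictive distribution, as the abstract describes
    print(f"predicted Cr: {m:.2f} wt.% (95% CI {m - 1.96*s:.2f}..{m + 1.96*s:.2f})")
```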
Abstract:
Data fluctuation in multiple measurements of Laser Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance analysis accuracy is to improve the quality and consistency of the emission signal, for example by averaging the spectral signals or standardizing the spectra over a number of laser shots. The proposed method focuses instead on enhancing the robustness of the quantitative analysis regression model. The proposed RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation based on the statistical distribution of the measured spectral data. Through the improved segmented weighting function, information on spectral data within the normal distribution is retained in the regression model, while information on outliers is restrained or removed. Copper elemental concentration analysis experiments on 16 certified standard brass samples were carried out. The average relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness than quantitative analysis methods based on Partial Least Squares (PLS) regression, the standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function offered better overall model robustness and convergence speed than the four known weighting functions.
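A minimal sketch of a segmented weighting function of the kind RLS-SVM builds on. The breakpoints and floor weight follow the classic WLS-SVM scheme and are illustrative; this is not the paper's improved function.

```python
import numpy as np

def segmented_weights(residuals, c1=2.5, c2=3.0, floor=1e-4):
    """Weight each sample by how far its residual sits from the bulk."""
    # robust scale estimate (1.483 * MAD approximates the std for normal data)
    s = 1.483 * np.median(np.abs(residuals - np.median(residuals)))
    z = np.abs(residuals) / s
    w = np.ones_like(z)                     # inliers keep full weight
    mid = (z > c1) & (z <= c2)
    w[mid] = (c2 - z[mid]) / (c2 - c1)      # linearly down-weighted band
    w[z > c2] = floor                       # outliers effectively removed
    return w

residuals = np.array([0.1, -0.2, 0.05, 2.8, -0.15, 6.0])
print(segmented_weights(residuals))         # the two outliers get the floor weight
```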
Abstract:
Since 1996, direct femtosecond inscription in transparent dielectrics has been the subject of intensive research. This enabling technology significantly expands the technological boundaries for direct fabrication of 3D structures in a wide variety of materials. It allows modification of non-photosensitive materials, which opens the door to numerous practical applications. In this work we explored the direct femtosecond inscription of waveguides and demonstrated at least an order of magnitude enhancement in the most critical parameter, the induced refractive index contrast, in a standard borosilicate optical glass. A record high induced refractive index contrast of 2.5×10⁻² is demonstrated. The fabricated waveguides possess some of the lowest losses reported, approaching the level of Fresnel reflection losses at the glass-air interface. The high refractive index contrast allows the fabrication of curvilinear waveguides with low bend losses. We also demonstrate the optimisation of the inscription regimes in BK7 glass over a broad range of experimental parameters and observe a counter-intuitive increase of the induced refractive index contrast with increasing sample translation speed. Examples of inscription in a number of transparent dielectric hosts (both glasses and crystals) using a high repetition rate fs laser system are also presented. Sub-wavelength-scale periodic inscription inside a material often demands supercritical propagation regimes, in which the pulse peak power exceeds the critical power for self-focusing, sometimes several times over. For the sub-critical regime, in which the pulse peak power is below the critical power for self-focusing, we derive analytic expressions for Gaussian beam focusing in the presence of Kerr non-linearity, as well as for a number of other beam shapes commonly used in experiments, including astigmatic and ring-shaped ones. In the part devoted to the fabrication of periodic structures, we report on recent development of our point-by-point method, demonstrating the shortest periodic perturbation created in the bulk of a pure fused silica sample, using the third harmonic (λ = 267 nm) of the fundamental laser frequency (λ = 800 nm) and a 1 kHz femtosecond laser system. To overcome the fundamental limitations of the point-by-point method we suggested and experimentally demonstrated the micro-holographic inscription method, based on the combination of a diffractive optical element and standard micro-objectives. Sub-500 nm periodic structures with a much higher aspect ratio were demonstrated. From the applications point of view, we demonstrate examples of photonic devices made by the direct femtosecond fabrication method, including various vectorial bend sensors fabricated in standard optical fibres, as well as highly birefringent long-period gratings made by the direct modulation method. To address the intrinsic limitations of femtosecond inscription at very shallow depths we suggested a hybrid mask-less lithography method, based on precision ablation of a thin metal layer deposited on the surface of the sample to create a mask. An ion-exchange process in a melt of Ag-containing salts then allows quick and low-cost fabrication of shallow waveguides and other components of integrated optics. This approach covers the gap in direct fs inscription of shallow waveguides. Perspectives and future developments of direct femtosecond micro-fabrication are also discussed.
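For orientation, a minimal sketch of the quantity that separates the sub- and supercritical regimes discussed above: the critical power for self-focusing of a Gaussian beam, P_cr ≈ 3.77λ²/(8πn₀n₂). The fused silica constants are typical literature values, not the thesis's parameters.

```python
import math

wavelength = 800e-9   # m, fundamental laser wavelength
n0 = 1.45             # linear refractive index of fused silica
n2 = 2.7e-20          # nonlinear index, m^2/W (typical literature value)

# Marburger-type critical power for a Gaussian beam
P_cr = 3.77 * wavelength**2 / (8 * math.pi * n0 * n2)
print(f"P_cr ~ {P_cr/1e6:.1f} MW")   # a pulse above this peak power self-focuses
```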
Abstract:
This article is aimed primarily at eye care practitioners who are undertaking advanced clinical research, and who wish to apply analysis of variance (ANOVA) to their data. ANOVA is a data analysis method of great utility and flexibility. This article describes why and how ANOVA was developed, the basic logic which underlies the method and the assumptions that the method makes for it to be validly applied to data from clinical experiments in optometry. The application of the method to the analysis of a simple data set is then described. In addition, the methods available for making planned comparisons between treatment means and for making post hoc tests are evaluated. The problem of determining the number of replicates or patients required in a given experimental situation is also discussed. Copyright (C) 2000 The College of Optometrists.
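A minimal sketch of the one-way ANOVA workflow the article describes, using scipy; the three treatment groups are invented, and the Tukey HSD post hoc step assumes scipy ≥ 1.8.

```python
from scipy import stats

# invented visual measurements for three treatment groups
group_a = [12.1, 11.8, 12.6, 12.0, 11.9]
group_b = [13.4, 13.1, 13.8, 13.2, 13.6]
group_c = [12.2, 12.5, 11.9, 12.4, 12.1]

# omnibus test: do the treatment means differ at all?
f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# if significant, follow up with a post hoc test to find which pairs differ
result = stats.tukey_hsd(group_a, group_b, group_c)
print(result)
```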
Abstract:
Recently, we introduced a new 'GLM-beamformer' technique for MEG analysis that enables accurate localisation of both phase-locked and non-phase-locked neuromagnetic effects, and their representation as statistical parametric maps (SPMs). This provides a useful framework for comparison of the full range of MEG responses with fMRI BOLD results. This paper reports a 'proof of principle' study using a simple visual paradigm (static checkerboard). The five subjects each underwent both MEG and fMRI paradigms. We demonstrate, for the first time, the presence of a sustained (DC) field in the visual cortex, and its co-localisation with the visual BOLD response. The GLM-beamformer analysis method is also used to investigate the main non-phase-locked oscillatory effects: an event-related desynchronisation (ERD) in the alpha band (8-13 Hz) and an event-related synchronisation (ERS) in the gamma band (55-70 Hz). We show, using SPMs and virtual electrode traces, the spatio-temporal covariance of these effects with the visual BOLD response. Comparisons between MEG and fMRI data sets generally focus on the relationship between the BOLD response and the transient evoked response. Here, we show that the stationary field and changes in oscillatory power are also important contributors to the BOLD response, and should be included in future studies on the relationship between neuronal activation and the haemodynamic response. © 2005 Elsevier Inc. All rights reserved.
Abstract:
Purpose: To determine the most appropriate analysis technique for the differentiation of multifocal intraocular lens (MIOL) designs using defocus curve assessment of visual capability. Methods: Four groups of fifteen subjects were implanted bilaterally with either monofocal intraocular lenses, refractive MIOLs, diffractive MIOLs, or a combination of refractive and diffractive MIOLs. Defocus curves between -5.0D and +1.5D were evaluated using an absolute and a relative depth-of-focus method, the direct comparison method and a new 'Area-of-focus' metric. The results were correlated with the subjective perception of near and intermediate vision. Results: Neither depth-of-focus method of analysis was sensitive enough to differentiate between MIOL groups (p>0.05). The direct comparison method indicated that the refractive MIOL group performed better at +1.00, -1.00 and -1.50 D and worse at -3.00, -3.50, -4.00 and -5.00D compared to the diffractive MIOL group (p
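A minimal sketch of one plausible reading of an 'Area-of-focus'-style metric: the area under the defocus curve above an acuity criterion, over the tested defocus range. The curve, criterion and units are hypothetical; the paper's exact definition may differ.

```python
import numpy as np

defocus = np.arange(-5.0, 1.75, 0.5)                 # defocus levels (D), as in the study
acuity = 0.25 - 0.02 * (defocus + 1.5) ** 2          # toy acuity score (higher = better)
criterion = 0.0                                      # hypothetical acuity criterion

gain = np.clip(acuity - criterion, 0.0, None)        # credit only vision above the criterion
# trapezoidal area under the clipped curve, in D x acuity-score units
area = float(np.sum((gain[1:] + gain[:-1]) / 2.0 * np.diff(defocus)))
print(f"area of focus: {area:.3f}")
```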
Abstract:
The concept of a task is fundamental to the discipline of ergonomics. Approaches to the analysis of tasks began in the early 1900s. These approaches have evolved and developed to the present day, when there is a vast array of methods available. Some of these methods are specific to particular contexts or applications; others are more general. However, whilst many of these analyses allow tasks to be examined in detail, they do not act as tools to aid the design process or the designer. The present thesis examines the use of task analysis in a process control context, and in particular the use of task analysis to specify operator information and display requirements in such systems. The first part of the thesis examines the theoretical aspects of task analysis and presents a review of the methods, issues and concepts relating to task analysis. A review of over 80 methods of task analysis was carried out to form a basis for the development of a task analysis method to specify operator information requirements in industrial process control contexts. Of the methods reviewed, Hierarchical Task Analysis was selected to provide such a basis and was developed to meet the criteria outlined for such a method of task analysis. The second section outlines the practical application and evolution of the developed task analysis method. Four case studies were used to examine the method in an empirical context. The case studies represent a range of plant contexts and types: complex and simple, batch and continuous, and high-risk and low-risk processes. The theoretical and empirical issues are drawn together and a method developed to provide a task analysis technique to specify operator information requirements and to provide the first stages of a tool to aid the design of VDU displays for process control.
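A minimal sketch of how a Hierarchical Task Analysis might be captured in code: goals decompose into subtasks, each level with a plan stating how its children are sequenced. The process-control tasks named here are invented for illustration.

```python
# a goal node: its plan sequences its subtasks; leaves are elementary operations
hta = {
    "goal": "Maintain reactor temperature within limits",
    "plan": "Do 1, then 2; do 3 whenever an alarm occurs",
    "subtasks": [
        {"goal": "1. Monitor temperature trend display", "subtasks": []},
        {"goal": "2. Adjust coolant flow setpoint", "subtasks": []},
        {"goal": "3. Respond to high-temperature alarm",
         "plan": "Do 3.1, then 3.2",
         "subtasks": [
             {"goal": "3.1 Acknowledge alarm", "subtasks": []},
             {"goal": "3.2 Identify affected loop", "subtasks": []},
         ]},
    ],
}

def walk(node, depth=0):
    """Walk the hierarchy and list each task: a starting point for deriving
    the information the operator display must provide at each step."""
    print("  " * depth + node["goal"])
    for child in node.get("subtasks", []):
        walk(child, depth + 1)

walk(hta)
```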
Abstract:
The thesis presents new methodology and algorithms that can be used to analyse and measure the hand tremor and fatigue of surgeons while performing surgery. This will assist them in deriving useful information about their fatigue levels and make them aware of changes in their tool-point accuracy. This thesis proposes that the muscular changes of surgeons, which occur through a day of operating, can be monitored using Electromyography (EMG) signals. Multi-channel EMG signals were measured at different muscles in the upper arm of surgeons. The dependence of the EMG signals was examined to test the hypothesis that EMG signals are coupled with and dependent on each other. The results demonstrated that EMG signals collected from different channels while mimicking an operating posture are independent. Consequently, single-channel fatigue analysis was performed. In measuring hand tremor, a new method for determining the maximum tremor amplitude using Principal Component Analysis (PCA) and a new technique for detrending acceleration signals using the Empirical Mode Decomposition algorithm were introduced. This tremor determination method is more representative for surgeons, and it is suggested as an alternative fatigue measure. This was combined with the complexity analysis method and applied to surgically captured data to determine whether operating has an effect on a surgeon's fatigue and tremor levels. It was found that surgical tremor and fatigue develop throughout a day of operating and that this could be determined based solely on their initial values. Finally, several Nonlinear AutoRegressive with eXogenous inputs (NARX) neural networks were evaluated. The results suggest that it is possible to monitor surgeon tremor variations during surgery from their EMG fatigue measurements.
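A minimal sketch of the PCA step described above: projecting 3-axis accelerometer data onto its first principal component to estimate a maximum tremor amplitude. The signal is synthetic, and the EMD detrending step is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)                       # 10 s at 200 Hz
tremor = np.sin(2 * np.pi * 9 * t)                 # ~9 Hz physiological tremor
direction = np.array([0.6, 0.7, 0.4])              # hypothetical dominant axis
accel = np.outer(tremor, direction) + rng.normal(0, 0.1, (2000, 3))

# PC1 is the direction of maximum variance, i.e. the dominant tremor axis
pca = PCA(n_components=1)
projected = pca.fit_transform(accel)

max_amplitude = projected.max() - projected.min()  # peak-to-peak along PC1
print(f"max tremor amplitude (peak-to-peak, PC1): {max_amplitude:.2f}")
```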
Abstract:
In this thesis, standard algorithms are used to carry out the optimisation of cold-formed steel purlins such as zed, channel and sigma sections, which are assumed to be simply supported and subjected to a gravity load. For the zed, channel and sigma sections, local buckling, distortional buckling and lateral-torsional buckling are considered, respectively. The local buckling check is based on BS 5950-5:1998 and EN 1993-1-3:2006. The distortional buckling resistance is calculated by the direct strength method, employing the elastic distortional buckling solution obtained by one of three available approaches: Hancock (1995), Schafer and Pekoz (1998) and Yu (2005). In the optimisation program, lateral-torsional buckling models based on BS 5950-5:1998, AISI and the analytical model of Li (2004) are investigated. Programming codes are written for the optimisation of channel, zed and sigma beams, and the full study has been coded into a computer-based analysis program (MATLAB).
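A minimal sketch of the direct strength method check for distortional buckling of a beam, following the AISI DSM formulation the thesis draws on; the yield and elastic distortional buckling moments are hypothetical inputs (the latter would come from the Hancock, Schafer and Pekoz, or Yu approach).

```python
def dsm_distortional_moment(M_y, M_crd):
    """AISI direct strength method: nominal distortional buckling moment."""
    lam_d = (M_y / M_crd) ** 0.5          # distortional slenderness
    if lam_d <= 0.673:
        return M_y                        # stocky: full yield moment
    r = (M_crd / M_y) ** 0.5
    return (1 - 0.22 * r) * r * M_y       # slender: reduced capacity

M_y = 12.5e6    # yield moment, N*mm (hypothetical section)
M_crd = 9.0e6   # elastic distortional buckling moment, N*mm (hypothetical)
print(f"M_nd = {dsm_distortional_moment(M_y, M_crd) / 1e6:.2f} kN*m")
```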
Abstract:
Visual field assessment is a core component of glaucoma diagnosis and monitoring, and the Standard Automated Perimetry (SAP) test is still considered the gold standard of visual field assessment. Although SAP is a subjective assessment with many pitfalls, it is used routinely in the diagnosis of visual field loss in glaucoma. The multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP visual field assessment, while others were less informative and needed further adjustment and research. In this study, we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. OBJECTIVES: The purpose of this study is to examine the effectiveness of a new analysis method for the multifocal visual evoked potential (mfVEP) when used for the objective assessment of the visual field in glaucoma patients, compared to the gold standard technique. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey visual field (HFA) 24-2 tests and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol: the Hemifield Sector Analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the 3 groups in the mean signal-to-noise ratio (SNR) (ANOVA p<0.001 with a 95% CI). The differences between superior and inferior hemifields were statistically significant in the glaucoma patient group for 11/11 sectors (t-test p<0.001), partially significant in the glaucoma suspect group for 5/11 sectors (t-test p<0.01), and not significant for most sectors in the normal group (only 1/11 was significant) (t-test p<0.9). The sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, and 89% and 79% for glaucoma suspects. DISCUSSION: The results showed that the new analysis protocol was able to confirm existing field defects detected by standard HFA and was able to differentiate between the 3 study groups, with a clear distinction between normal subjects and patients with suspected glaucoma; the distinction between normal subjects and glaucoma patients was especially clear and significant. CONCLUSION: The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. This protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous field loss.
Abstract:
Objective: The purpose of this study was to examine the effectiveness of a new analysis method of mfVEP objective perimetry in the early detection of glaucomatous visual field defects compared to the gold standard technique. Methods and patients: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two standard 24-2 visual field tests with the Humphrey Field Analyzer and a single mfVEP test in one session. Analysis of the mfVEP results was carried out using the new analysis protocol: the hemifield sector analysis protocol. Results: Analysis of the mfVEP showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (ANOVA, P<0.001; 95% CI: 2.82-2.89 for the normal group, 2.25-2.29 for the glaucoma suspect group, 1.67-1.73 for the glaucoma group). The difference between superior and inferior hemifield sectors and hemi-rings was statistically significant in 11/11 pairs of sectors and hemi-rings in the glaucoma patient group (t-test P<0.001), statistically significant in 5/11 pairs of sectors and hemi-rings in the glaucoma suspect group (t-test P<0.01), and only 1/11 pair was statistically significant in the normal group (t-test P<0.9). The sensitivity and specificity of the hemifield sector analysis protocol in detecting glaucoma were 97% and 86% respectively, and 89% and 79% in glaucoma suspects. These results showed that the new analysis protocol was able to confirm existing visual field defects detected by standard perimetry, was able to differentiate between the three study groups with a clear distinction between normal patients and those with suspected glaucoma, and was able to detect early visual field changes not detected by standard perimetry. In addition, the distinction between normal and glaucoma patients was especially clear and significant using this analysis. Conclusion: The new hemifield sector analysis protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. This protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucomatous visual field loss. The intersector analysis protocol can detect early field changes not detected by the standard Humphrey Field Analyzer test. © 2013 Mousa et al, publisher and licensee Dove Medical Press Ltd.
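A minimal sketch of the core statistical comparison in the hemifield sector analysis: paired superior/inferior SNR values for one sector compared with a t-test. The SNR values are simulated; the real protocol runs this over 11 sector and hemi-ring pairs in each subject group.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# simulated SNR for one sector across 36 glaucoma eyes
superior_snr = rng.normal(2.8, 0.3, 36)
inferior_snr = superior_snr - rng.normal(0.5, 0.2, 36)   # simulated focal loss

# paired t-test: is the superior/inferior asymmetry significant?
t_stat, p_value = stats.ttest_rel(superior_snr, inferior_snr)
print(f"sector 1: t = {t_stat:.2f}, p = {p_value:.4f}")
```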
Abstract:
In the face of global population growth and the uneven distribution of water supply, a better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method to detect water surfaces automatically and to monitor them in near-real-time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. The validation of the algorithm showed very few omission errors and no commission errors. It demonstrates the ability of the proposed algorithm to perform as effectively as human interpretation of the images. The validation of the permanent water surface product with an independent dataset derived from high resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method have been identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient for a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without major difficulty.
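A minimal sketch of the RGB-to-HSV idea: transform a composite to HSV, where water pixels cluster more compactly, then threshold. The thresholds are placeholders, not the paper's calibrated values.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

# stand-in for a satellite RGB composite, values in [0, 1]
rgb = np.random.default_rng(3).random((256, 256, 3))
hsv = rgb_to_hsv(rgb)
h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]

# water tends to occupy a compact hue/value region of HSV space,
# which is easier to threshold than raw reflectances
water_mask = (h > 0.5) & (h < 0.7) & (v < 0.4)
print(f"flagged as water: {water_mask.mean():.1%} of pixels")
```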
Abstract:
Consideration of the influence of test technique and data analysis method is important for data comparison and design purposes. The paper highlights the effects of replication interval, crack growth rate averaging and curve-fitting procedures on crack growth rate results for a Ni-base alloy. It is shown that an upper bound crack growth rate line is not appropriate for use in fatigue design, and that the derivative of a quadratic fit to the crack length versus cycles (a vs N) data looks promising. However, this type of averaging, or curve fitting, is not useful in developing an understanding of microstructure/crack tip interactions. For this purpose, simple replica-to-replica growth rate calculations are preferable. © 1988.
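A minimal sketch of the quadratic-fit approach favoured above: fit a quadratic to crack length versus cycles and differentiate it to obtain a smoothed da/dN. The a-N data points are invented.

```python
import numpy as np

N = np.array([0, 10_000, 20_000, 30_000, 40_000, 50_000], dtype=float)  # cycles
a = np.array([1.00, 1.08, 1.21, 1.40, 1.66, 2.01])                      # crack length, mm

coeffs = np.polyfit(N, a, 2)                # quadratic fit: a(N) = c2*N^2 + c1*N + c0
dadn = np.polyval(np.polyder(coeffs), N)    # da/dN from the fit's derivative
for n, rate in zip(N, dadn):
    print(f"N = {n:8.0f}: da/dN = {rate:.3e} mm/cycle")
```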
Abstract:
Most of the previous studies on intellectual capital disclosures have been conducted in the context of developed countries. There is very limited empirical evidence in this area from emerging economies in general and Africa in particular. This paper is one of the early attempts in this regard. The main purpose of this study is to examine the extent and nature of intellectual capital disclosures in the 'Top 20' South African companies over a 5-year period (2002-2006). The study uses the content analysis method to scrutinise the patterns of intellectual capital disclosures during the study period. The results show that intellectual capital disclosures in South Africa increased over the 5-year study period, with certain firms reporting considerably more than others. Of the three broad categories of intellectual capital disclosures, human capital appears to be the most popular category. This finding stands in sharp contrast to previous studies in this area, where external capital was found to be the most popular category.
Abstract:
Purpose: Most published surface wettability data are based on hydrated materials and are dominated by the air-water interface. Water-soluble species with hydrophobic domains (such as surfactants) interact directly with the hydrophobic domains in the lens polymer. Characterisation of the relative polar and non-polar fractions of the dehydrated material provides an additional approach to surface analysis. Method: Probe liquids (water and diiodomethane) were used to characterise the polar and dispersive components of the surface energies of dehydrated lenses using the method of Owens and Wendt. A range of conventional and silicone hydrogel soft lenses was studied. The polar fraction (i.e. polar/total) of surface energy was used as a basis for the study of the structural effects that influence surfactant persistence on the lens surface. Results: When plotted against the water content of the hydrated lens, the polar fraction of surface energy (PFSE) values of the dehydrated lenses fell into two rectilinear bands. One of these bands covered PFSE values ranging from 0.4 to 0.8 and contained only conventional hydrogels, with two notable additions: the plasma-coated silicone hydrogels lotrafilcon A and B. The second band covered PFSE values ranging from 0.04 to 0.28 and contained only silicone hydrogels. Significantly, the silicone hydrogel lenses with the lowest PFSE values (PFSE < 0.15) were found to be prone to lipid deposition during wear. Additionally, more hydrophobic surfactants were found to be more persistent on lenses with lower PFSE values. Conclusions: Measurement of the polar fraction of surface energy provides an important mechanistic insight into the surface interactions of silicone hydrogels.
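A minimal sketch of the Owens and Wendt calculation: contact angles of water and diiodomethane give two linear equations for the square roots of the solid's dispersive and polar surface energy components. The contact angles below are hypothetical, and the liquid surface tension components are common literature values that should be checked against the source used.

```python
import numpy as np

liquids = {                    # (total, dispersive, polar) surface tension, mN/m
    "water":         (72.8, 21.8, 51.0),
    "diiodomethane": (50.8, 50.8, 0.0),
}
theta = {"water": 95.0, "diiodomethane": 45.0}   # contact angles, degrees (hypothetical)

# Owens-Wendt: gamma_L(1 + cos(theta))/2 = x*sqrt(gamma_L^d) + y*sqrt(gamma_L^p),
# where x = sqrt(gamma_S^d) and y = sqrt(gamma_S^p); two liquids give a 2x2 system
A, b = [], []
for name, (g, gd, gp) in liquids.items():
    A.append([np.sqrt(gd), np.sqrt(gp)])
    b.append(g * (1 + np.cos(np.radians(theta[name]))) / 2)
x, y = np.linalg.solve(A, b)

gamma_d, gamma_p = x**2, y**2
pfse = gamma_p / (gamma_d + gamma_p)             # polar fraction of surface energy
print(f"dispersive: {gamma_d:.1f}  polar: {gamma_p:.1f}  PFSE: {pfse:.2f}")
```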