3 results for Method validation

at Universidad de Alicante


Relevance:

30.00%

Publisher:

Abstract:

Purpose. To clinically validate a new method for estimating corneal power (Pc) using a variable keratometric index (nkadj) in eyes with previous laser refractive surgery. Setting. University of Alicante and Medimar International Hospital (Oftalmar), Alicante, Spain. Design. Retrospective case series. Methods. This retrospective study comprised 62 eyes of 62 patients who had undergone myopic LASIK surgery. An algorithm for the calculation of nkadj was used for the estimation of the adjusted keratometric corneal power (Pkadj). This value was compared with the classical keratometric corneal power (Pk), the True Net Power (TNP), and the Gaussian corneal power (PcGauss). Likewise, Pkadj was compared with other previously described methods. Results. Differences between PcGauss and the Pc values obtained with all the methods evaluated were statistically significant (p < 0.01). Differences between Pkadj and PcGauss were at the limit of clinical significance (p < 0.01, LoA [-0.33, 0.60] D). Differences between Pkadj and TNP were neither statistically nor clinically significant (p = 0.319, LoA [-0.50, 0.44] D). Differences between Pkadj and the previously described methods were statistically significant (p < 0.01), except with PcHaigisL (p = 0.09, LoA [-0.37, 0.29] D). Conclusion. The use of the adjusted keratometric index (nkadj) is a valid method to estimate central corneal power in corneas with previous myopic laser refractive surgery, providing results comparable to PcHaigisL.
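To make the quantities compared above concrete, here is a minimal Python sketch under stated assumptions: the classical keratometric formula Pk = (nk − 1)/r1c with the fixed index nk = 1.3375 and the Gaussian thick-lens formula for PcGauss are standard optics, while the linear form and coefficients used below for nkadj are illustrative placeholders, not the algorithm fitted in this study.

```python
# Hedged sketch: keratometric vs. Gaussian corneal power.
# Standard optics constants; the nkadj coefficients below are ILLUSTRATIVE ONLY.

N_AIR = 1.000        # refractive index of air
N_CORNEA = 1.376     # refractive index of the corneal stroma
N_AQUEOUS = 1.336    # refractive index of the aqueous humor
NK_CLASSIC = 1.3375  # classical fixed keratometric index

def keratometric_power(r1c_mm: float, nk: float = NK_CLASSIC) -> float:
    """Single-surface keratometric power Pk = (nk - 1) / r1c, in diopters."""
    return (nk - 1.0) / (r1c_mm / 1000.0)

def gaussian_power(r1c_mm: float, r2c_mm: float, cct_mm: float) -> float:
    """Gaussian thick-lens corneal power PcGauss from both corneal surfaces."""
    r1, r2, d = r1c_mm / 1000.0, r2c_mm / 1000.0, cct_mm / 1000.0
    p1 = (N_CORNEA - N_AIR) / r1      # anterior surface power
    p2 = (N_AQUEOUS - N_CORNEA) / r2  # posterior surface power
    return p1 + p2 - (d / N_CORNEA) * p1 * p2

def adjusted_keratometric_power(r1c_mm: float,
                                a: float = -0.006, b: float = 1.376) -> float:
    """Pkadj using a variable index nkadj = a * r1c + b.
    The linear dependence on r1c mirrors the idea of an adjusted index;
    a and b here are hypothetical placeholders, not the study's fitted values."""
    nkadj = a * r1c_mm + b
    return keratometric_power(r1c_mm, nk=nkadj)

if __name__ == "__main__":
    r1c = 8.20  # flat post-LASIK anterior radius, mm
    print(f"Pk (classic): {keratometric_power(r1c):.2f} D")
    print(f"Pkadj:        {adjusted_keratometric_power(r1c):.2f} D")
    print(f"PcGauss:      {gaussian_power(r1c, 6.50, 0.54):.2f} D")
```

The point of the adjusted index is that a single-surface keratometric reading can track the two-surface Gaussian power even after LASIK has changed the ratio between anterior and posterior curvature, which is why the fixed 1.3375 index fails in these eyes.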

Relevance:

30.00%

Publisher:

Abstract:

The Iterative Closest Point (ICP) algorithm is commonly used in engineering applications to solve the rigid registration problem for partially overlapping point sets that are pre-aligned with a coarse estimate of their relative positions. This iterative algorithm is applied in many areas: in medicine for the volumetric reconstruction of tomography data, in robotics to reconstruct surfaces or scenes from range sensor information, in industrial systems for the quality control of manufactured objects, and even in biology to study the structure and folding of proteins. One of the algorithm's main problems is its high computational complexity (quadratic in the number of points in the non-optimized original variant), in a context where high-density point sets, acquired by high-resolution scanners, must be processed. Many variants proposed in the literature aim to improve performance, either by reducing the number of points or the required iterations, or by lowering the complexity of the most expensive phase: the closest-neighbor search. Despite reducing the complexity, some of these variants tend to degrade the final registration precision or shrink the convergence domain, thus limiting the possible application scenarios.

The goal of this work is to improve the algorithm's computational cost so that a wider range of the computationally demanding problems described above can be addressed. For that purpose, an experimental and mathematical convergence analysis and validation of point-to-point distance metrics was performed, considering distances with a lower computational cost than the Euclidean distance, which is the de facto standard in implementations of the algorithm in the literature. In this analysis, the behavior of the algorithm in diverse topological spaces, each characterized by a different metric, was studied to check the convergence, efficacy, and cost of the method and to determine which metric offers the best results. Given that distance computation represents a significant part of the work performed by the algorithm, any reduction in that operation can be expected to affect the overall performance of the method significantly and positively. As a result, a performance improvement was achieved by applying these reduced-cost metrics, whose quality in terms of convergence and error was analyzed and experimentally validated as comparable to the Euclidean distance, using a heterogeneous set of objects, scenarios, and initial situations.
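As a sketch of the idea of swapping the matching metric, the following Python baseline implements textbook point-to-point ICP in which the closest-neighbor phase can use the Manhattan (p = 1), Euclidean (p = 2), or Chebyshev (p = inf) distance via the Minkowski parameter of scipy's cKDTree. It is an illustrative reimplementation of the standard algorithm, not the thesis's optimized variant.

```python
# Minimal point-to-point ICP with a selectable Minkowski metric for matching.
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(source, target, p=2, max_iter=50, tol=1e-6):
    """Align `source` to `target`. `p` selects the matching metric:
    1 = Manhattan, 2 = Euclidean, np.inf = Chebyshev."""
    tree = cKDTree(target)     # built once; queries accept any p
    src = source.copy()
    prev_err = np.inf
    for _ in range(max_iter):
        dists, idx = tree.query(src, k=1, p=p)  # closest-neighbor phase
        R, t = best_fit_transform(src, target[idx])
        src = src @ R.T + t
        err = dists.mean()
        if abs(prev_err - err) < tol:           # converged
            break
        prev_err = err
    return src, err

# Usage: aligned, err = icp(scan_a, scan_b, p=1)  # Manhattan matching
```

Note that only the correspondence search changes with p; the transform estimation itself remains the Euclidean least-squares solution, which matches the abstract's framing of the metric as a drop-in replacement for the most expensive phase.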

Relevance:

30.00%

Publisher:

Abstract:

The study of digital competence remains a topic of interest for both the scientific community and the supranational political agenda. This study uses the Delphi method to validate the design of a questionnaire for determining the perceived importance of digital competence in higher education. The questionnaire was constructed from different framework documents on digital competence standards (NETS, ACRL, UNESCO). The triangulation of non-parametric techniques made it possible to consolidate the results obtained through the Delphi panel, whose suitability was established through the expert competence index (K). The resulting questionnaire emerges as a sound tool for undertaking future national and international studies of digital competence in higher education.
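As a quick illustration of the panel-screening step, the sketch below computes the expert competence coefficient as commonly defined in the Delphi literature, K = (Kc + Ka) / 2, where Kc is the expert's self-assessed knowledge coefficient and Ka the argumentation coefficient. The cut-off bands and sample data are conventional assumptions, not values taken from this study.

```python
# Illustrative sketch of the expert competence coefficient K used to screen
# Delphi panelists. K = (Kc + Ka) / 2 is the common definition in the Delphi
# literature; the cut-offs and panel data below are ASSUMPTIONS, not from
# this study.

def competence_index(kc: float, ka: float) -> float:
    """Competence coefficient K from knowledge (Kc) and argumentation (Ka),
    both expressed on a 0-1 scale."""
    return (kc + ka) / 2.0

def competence_level(k: float) -> str:
    """Conventional banding: K >= 0.8 high, 0.5 <= K < 0.8 medium, else low."""
    if k >= 0.8:
        return "high"
    if k >= 0.5:
        return "medium"
    return "low"

# Hypothetical panel: (expert id, Kc, Ka)
panel = [("E1", 0.9, 0.85), ("E2", 0.6, 0.70), ("E3", 0.4, 0.50)]
for expert_id, kc, ka in panel:
    k = competence_index(kc, ka)
    print(f"{expert_id}: K = {k:.2f} ({competence_level(k)})")
```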