8 results for objective refraction
at Universidad de Alicante
Abstract:
Objective: To evaluate the visual and refractive outcomes after phacoemulsification surgery in eyes with isolated lens coloboma. Design: Prospective, consecutive case series. Participants: Eighteen eyes with isolated lens coloboma of 13 patients were included in the study. Mean patient age was 13.9 ± 6.5 years. Methods: Patients underwent phacoemulsification surgery with combined implantation of a capsular tension ring (CTR) and an intraocular lens. In colobomas of less than 120°, a CTR was used, whereas in colobomas of more than 120°, a Cionni-modified single-eyelet CTR was used to achieve better capsular centration. The main outcome measures were uncorrected distance visual acuity, corrected distance visual acuity, refraction, and keratometry. Results: Mean logMAR uncorrected distance visual acuity and corrected distance visual acuity improved significantly from 1.53 ± 0.35 and 1.02 ± 0.47 before surgery to 0.67 ± 0.51 and 0.52 ± 0.49 at the last follow-up visit (p < 0.001). Mean refractive cylinder and spherical equivalent decreased significantly from –6.73 ± 1.73 and –6.72 ± 4.07 D preoperatively to –1.40 ± 1.39 and –0.83 ± 1.31 D at the end of follow-up (p = 0.001 and p = 0.01, respectively). Mean keratometric astigmatism at the preoperative and postoperative visits was 1.58 ± 0.97 and 1.65 ± 0.94 D, respectively (p = 0.70). Conclusions: Phacoemulsification with CTR and intraocular lens implantation is an effective and safe option for providing refractive correction and significant visual improvement in eyes with isolated lens coloboma.
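For context, the spherical equivalent reported above combines sphere and cylinder through the standard optometric relation (a textbook identity, not a formula specific to this study):

```latex
% Spherical equivalent (SE) from sphere S and cylinder C, in dioptres:
\mathrm{SE} = S + \frac{C}{2}
% e.g. S = -3.00\,\mathrm{D},\; C = -2.00\,\mathrm{D} \;\Rightarrow\; \mathrm{SE} = -4.00\,\mathrm{D}.
```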
Abstract:
Compilation tuning is the process of adjusting the values of compiler options to improve some features of the final application. In this paper, a strategy based on a genetic algorithm and a multi-objective scheme is proposed to deal with this task. Unlike previous works, we try to take advantage of domain-specific knowledge to provide a problem-specific genetic operator that improves both the speed of convergence and the quality of the results. The strategy is evaluated by means of a case study aimed at improving the performance of the well-known Apache web server. Experimental results show that an overall improvement of 7.5% can be achieved. Furthermore, the adaptive approach has shown an ability to markedly speed up the convergence of the original strategy.
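A minimal sketch of the kind of genetic search described above. The flag list and the synthetic benchmark() score are invented for illustration; in practice the fitness would come from building and timing the real application:

```python
import random

# Illustrative GCC flags encoded as a 0/1 genome (on/off per flag).
FLAGS = ["-O2", "-O3", "-funroll-loops", "-fomit-frame-pointer",
         "-finline-functions", "-ftree-vectorize"]

def benchmark(genome):
    """Stand-in for the real measurement: in practice, build the target
    with the selected flags and time a workload. Synthetic score here;
    lower is better."""
    weights = [3, 5, 1, 1, 2, 4]   # made-up per-flag benefit
    return 100 - sum(w * g for w, g in zip(weights, genome))

def mutate(genome, rate=0.1):
    # Flip each flag on/off with probability `rate`.
    return [1 - bit if random.random() < rate else bit for bit in genome]

def crossover(a, b):
    # Single-point crossover between two flag bitmaps.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def evolve(pop_size=20, generations=30):
    population = [[random.randint(0, 1) for _ in FLAGS] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=benchmark)            # lower score = better
        parents = population[: pop_size // 2]     # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=benchmark)

best = evolve()
print([flag for flag, bit in zip(FLAGS, best) if bit])
```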
Abstract:
In this work, we analyze the effect of demand uncertainty on the multi-objective optimization of chemical supply chains (SCs), considering simultaneously their economic and environmental performance. To this end, we present a stochastic multi-scenario mixed-integer linear program (MILP) with the unique feature of explicitly incorporating demand uncertainty through scenarios with given probabilities of occurrence. The environmental performance is quantified following life cycle assessment (LCA) principles, which are represented in the model formulation through standard algebraic equations. The capabilities of our approach are illustrated through a case study. We show that the stochastic solution improves the economic performance of the SC in comparison with the deterministic one at any level of environmental impact.
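A toy two-scenario sketch of such a stochastic MILP, written with the open-source PuLP modeller. All numbers are invented, and the epsilon-constraint on expected impact is one common way to expose the economic/environmental trade-off, assumed here rather than taken from the paper:

```python
# pip install pulp
import pulp

# Two demand scenarios with occurrence probabilities (illustrative values).
scenarios = {"low": {"prob": 0.4, "demand": 80},
             "high": {"prob": 0.6, "demand": 120}}

m = pulp.LpProblem("stochastic_supply_chain", pulp.LpMaximize)

# First-stage (here-and-now) decisions: open the plant and size its capacity.
open_plant = pulp.LpVariable("open_plant", cat="Binary")
cap = pulp.LpVariable("capacity", lowBound=0)
m += cap <= 200 * open_plant                 # no capacity without the plant

# Second-stage (wait-and-see) decisions: production per scenario.
prod = {s: pulp.LpVariable(f"prod_{s}", lowBound=0) for s in scenarios}
for s, data in scenarios.items():
    m += prod[s] <= cap                      # limited by installed capacity
    m += prod[s] <= data["demand"]           # limited by scenario demand

# Objective: expected profit = expected revenue minus investment costs.
m += (pulp.lpSum(data["prob"] * 10 * prod[s] for s, data in scenarios.items())
      - 3 * cap - 50 * open_plant)

# Epsilon-constraint on expected environmental impact (LCA-style aggregate).
m += pulp.lpSum(data["prob"] * 2 * prod[s] for s, data in scenarios.items()) <= 200

m.solve()
print(pulp.LpStatus[m.status], cap.value(),
      {s: v.value() for s, v in prod.items()})
```

Sweeping the right-hand side of the impact constraint traces out the economic/environmental Pareto curve, one solve per epsilon value.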
Abstract:
The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by a genetic algorithm that can simultaneously fulfill many design goals thanks to the use of the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
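At the heart of NSGA-II is Pareto dominance: a candidate survives only if no other candidate is at least as good on every objective and strictly better on at least one. A minimal sketch below, with invented (code-size overhead, runtime overhead, 1 - fault coverage) scores, all to be minimized:

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (all minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Invented candidates: (code-size overhead, runtime overhead, 1 - coverage).
candidates = [(1.10, 1.25, 0.08), (1.40, 1.05, 0.03),
              (1.35, 1.30, 0.02), (1.45, 1.35, 0.09)]
print(pareto_front(candidates))   # the last candidate is dominated by the first
```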
Abstract:
Modern compilers provide a great and ever-increasing number of options that can modify the features and behavior of a compiled program. Many of these options often go unused because exploiting them requires comprehensive knowledge of both the underlying architecture and the internal processes of the compiler. In this context, it is common to have not a single design goal but a more complex set of objectives. In addition, the dependencies between different goals are difficult to infer a priori. This paper proposes a strategy for tuning the compilation of any given application. This is accomplished by automatically varying the compilation options by means of multi-objective optimization and evolutionary computation driven by the NSGA-II algorithm, which allows finding compilation options that simultaneously optimize several objectives. The advantages of our proposal are illustrated by means of a case study based on the well-known Apache web server. Our strategy has demonstrated an ability to find improvements of up to 7.5% in context switches and up to 27% in L2 cache misses, and it also discovers the most important bottlenecks in the application's performance.
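Objectives like context switches and L2 cache misses can be read from hardware/software counters via Linux perf. A sketch of one way to score a candidate flag set, assuming a buildable make target and binary; the target name, binary name, and the L2-miss event name are placeholders (exact event names vary by CPU and perf version):

```python
import subprocess

def measure(cflags, events=("context-switches", "l2_rqsts.miss")):
    """Build with the given flags, run the binary under `perf stat`,
    and return the requested counters (placeholder names throughout)."""
    subprocess.run(["make", "clean"], check=True)
    subprocess.run(["make", "CFLAGS=" + " ".join(cflags)], check=True)
    result = subprocess.run(
        ["perf", "stat", "-x", ",", "-e", ",".join(events), "./app"],
        capture_output=True, text=True)
    counts = {}
    for line in result.stderr.splitlines():   # perf stat reports on stderr
        fields = line.split(",")              # CSV: value, unit, event, ...
        if len(fields) >= 3 and fields[2] in events and fields[0].isdigit():
            counts[fields[2]] = int(fields[0])
    return counts

# Example: two candidate configurations compared on both objectives.
# print(measure(["-O2"]), measure(["-O3", "-funroll-loops"]))
```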
Abstract:
Feature selection is an important and active research topic in clustering and classification problems. Choosing an adequate feature subset reduces the dimensionality of a dataset, which both decreases the computational complexity of classification and improves classifier performance by avoiding redundant or irrelevant features. Although feature selection can be formally defined as an optimisation problem with a single objective, namely the classification accuracy obtained using the selected feature subset, several multi-objective approaches to this problem have been proposed in recent years. These either select features that improve not only the classification accuracy but also the generalisation capability, in the case of supervised classifiers, or counterbalance the bias toward lower or higher numbers of features exhibited by some of the methods used to validate the clustering/classification, in the case of unsupervised classifiers. The main contribution of this paper is a multi-objective approach for feature selection and its application to an unsupervised clustering procedure based on Growing Hierarchical Self-Organising Maps (GHSOMs) that includes a new method for unit labelling and efficient determination of the winning unit. In the network anomaly detection problem considered here, this multi-objective approach makes it possible not only to differentiate between normal and anomalous traffic but also among different anomalies. The efficiency of our proposals has been evaluated using the well-known DARPA/NSL-KDD datasets, which contain extracted features and labelled attacks from around 2 million connections. The feature sets selected in our experiments provide detection rates of up to 99.8% for normal traffic and up to 99.6% for anomalous traffic, as well as accuracy values of up to 99.12%.
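A simple illustration of treating accuracy and subset size as competing objectives. This sketch scores feature subsets exhaustively with a k-NN classifier on a toy dataset (assumes scikit-learn); it does not reproduce the paper's unsupervised GHSOM procedure or the DARPA/NSL-KDD setting:

```python
# pip install scikit-learn
from itertools import combinations
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def bi_objective_scores(X, y, max_size=3):
    """Score each feature subset on two competing objectives:
    cross-validated accuracy (maximize) and subset size (minimize)."""
    scores = []
    for k in range(1, max_size + 1):
        for subset in combinations(range(X.shape[1]), k):
            acc = cross_val_score(KNeighborsClassifier(),
                                  X[:, list(subset)], y, cv=3).mean()
            scores.append((subset, round(acc, 3), k))
    return scores

X, y = load_iris(return_X_y=True)
# Show the five best subsets, preferring high accuracy, then fewer features.
for subset, acc, size in sorted(bi_objective_scores(X, y),
                                key=lambda t: (-t[1], t[2]))[:5]:
    print(subset, acc, size)
```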
Abstract:
In this paper we examine multi-objective linear programming problems in the face of data uncertainty in both the objective function and the constraints. First, we derive a formula for the radius of robust feasibility, which guarantees constraint feasibility for all possible scenarios within a specified uncertainty set under affine data parametrization. We then present numerically tractable optimality conditions for minmax robust weakly efficient solutions, i.e., the weakly efficient solutions of the robust counterpart. We also consider highly robust weakly efficient solutions, i.e., robust feasible solutions that are weakly efficient for any possible instance of the objective matrix within a specified uncertainty set, and provide lower bounds for the radius of highly robust efficiency that guarantee the existence of this type of solution under affine and rank-1 objective data uncertainty. Finally, we provide numerically tractable optimality conditions for highly robust weakly efficient solutions.
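A schematic of the setting the abstract describes, with notation assumed here rather than taken from the paper:

```latex
% Uncertain multi-objective LP: objective matrix C and constraint data
% (a_i, b_i) range over uncertainty sets.
\min_{x \in \mathbb{R}^n} \; Cx
\quad \text{s.t.} \quad
a_i^{\top} x \le b_i , \quad (a_i, b_i) \in \mathcal{U}_i , \; i = 1, \dots, m .
% The robust counterpart enforces each constraint for every realization in
% \mathcal{U}_i; minmax robust weakly efficient solutions are the weakly
% efficient solutions of that counterpart, while highly robust weakly
% efficient solutions remain weakly efficient for every admissible C.
```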
Abstract:
We examined the optical properties of nanolayered metal-dielectric lattices. In subwavelength regimes, the periodic array of metallic nanofilms exhibits nonlocality-induced double refraction, both conventional positive and negative. In particular, we report on energy-flow considerations concerning both refractive behaviors concurrently. Numerical simulations provide the transmittance of the individual beams in Ag-TiO2 metamaterials under different configurations. In regimes where effective-medium theory predicts elliptic dispersion, the negative refraction may be stronger than the expected positive refraction.
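The elliptic dispersion mentioned above follows from the standard effective-medium expressions for a metal-dielectric multilayer; quoted here for context, with the usual textbook symbols rather than values from this work:

```latex
% Effective permittivities for a multilayer with metal filling fraction f:
\varepsilon_{\parallel} = f\,\varepsilon_m + (1 - f)\,\varepsilon_d ,
\qquad
\varepsilon_{\perp} = \left( \frac{f}{\varepsilon_m} + \frac{1 - f}{\varepsilon_d} \right)^{-1}
% TM dispersion is elliptic when both components share the same (positive)
% sign and hyperbolic when their signs differ.
```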