5 results for EFFICIENCY OPTIMIZATION

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

30.00%

Publisher:

Abstract:

Umbilical cord blood (UCB) is a source of hematopoietic stem cells that was initially used exclusively for the hematopoietic reconstitution of pediatric patients. It is now suggested for use in adults as well, which increases the pressure to obtain units with high cellularity. The optimization of UCB processing is therefore a priority.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The rigorous test to which homeopathy was subjected in our recent double-blind clinical trial of homeopathic treatment of attention deficit hyperactivity disorder (ADHD) necessitated optimized treatment meeting the highest standards. METHODS: Optimization was performed in three steps: (1) In successfully treated children, prescriptions leading to an insufficient response were analysed by a general questionnaire to identify unreliable symptoms. (2) Polarity analysis, a further development of Bönninghausen's concept of contraindications, was introduced in response to the frequently one-sided symptoms. This enabled us to use few but specific symptoms to identify the medicine whose genius symptoms exhibit the closest match to the patient's characteristic symptoms. (3) We investigated the influence of the primary perception symptoms on the result of the repertorization. Perception symptoms are not normally recorded during a patient interview even though they are among the most reliable facts related by the patients. At the same time we were able to improve the continuity of improvement of ADHD symptoms using liquid Q-potencies. RESULTS: Introducing the questionnaire and polarity analysis, and including perception symptoms, led to an improvement in the success rate of the first prescription from 21% to 54%, and of the fifth prescription from 68% to 84%.
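The polarity-analysis step described in this abstract amounts to ranking medicines by how well their genius symptoms cover the patient's characteristic symptoms, while excluding any medicine contraindicated by an opposite (polar) symptom. The sketch below is a minimal illustration of that matching idea only; the remedy names, symptom sets, and scoring rule are hypothetical placeholders, not real repertory data or the authors' actual procedure.

```python
# Hypothetical remedy/symptom data, for illustration only.
remedies = {
    "RemedyA": {"covers": {"restlessness", "thirst"},
                "contraindicated_by": {"calm"}},
    "RemedyB": {"covers": {"restlessness", "irritability"},
                "contraindicated_by": set()},
}

def rank_remedies(patient_symptoms, remedies):
    """Rank remedies by how many characteristic symptoms they cover,
    dropping any remedy contraindicated by a polar (opposite) symptom."""
    scores = {}
    for name, remedy in remedies.items():
        # Polarity check: if an opposite symptom is present, exclude the remedy.
        if remedy["contraindicated_by"] & patient_symptoms:
            continue
        scores[name] = len(remedy["covers"] & patient_symptoms)
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(rank_remedies({"restlessness", "irritability"}, remedies))
```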

Relevance:

30.00%

Publisher:

Abstract:

This article addresses the issue of kriging-based optimization of stochastic simulators. Many of these simulators depend on factors that tune the level of precision of the response, the gain in accuracy coming at the price of computational time. The contribution of this work is two-fold: first, we propose a quantile-based criterion for the sequential design of experiments, in the fashion of the classical expected improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for the allocation of the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. This article has supplementary material available online. The proposed criterion is available in the R package DiceOptim.
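To make the idea concrete, the sketch below shows a generic quantile-based improvement criterion in the spirit described above: the Gaussian-process prediction at each candidate is summarized by a conservative beta-quantile that accounts for the tunable observation noise, and an expected-improvement-style quantity is computed against the best quantile obtained so far. This is a simplified illustration under assumed inputs (posterior mean, posterior standard deviation, per-point noise level); it is not the exact criterion of the article or of the DiceOptim implementation.

```python
import numpy as np
from scipy.stats import norm

def quantile_improvement(mu, sigma, noise_sd, q_best, beta=0.9):
    """Generic quantile-based improvement criterion (simplified sketch).

    mu, sigma : GP posterior mean and standard deviation at candidate points
    noise_sd  : per-point observation noise std (tunes simulator precision)
    q_best    : best (lowest) beta-quantile among already evaluated designs
    beta      : risk level; beta > 0.5 penalizes noisy, uncertain candidates
    """
    # Plug-in beta-quantile of the noisy prediction at each candidate.
    q = mu + norm.ppf(beta) * np.sqrt(sigma**2 + noise_sd**2)
    # Closed-form expected improvement of q_best over the quantile,
    # treating sigma as the uncertainty on the quantile estimate.
    s = np.maximum(sigma, 1e-12)
    z = (q_best - q) / s
    return (q_best - q) * norm.cdf(z) + s * norm.pdf(z)

# Example: pick the candidate maximizing the criterion; cheaper (noisier)
# runs carry a larger noise_sd and are penalized through the quantile.
mu = np.array([0.20, 0.10, 0.30])
sigma = np.array([0.05, 0.20, 0.10])
noise_sd = np.array([0.01, 0.10, 0.02])
print(np.argmax(quantile_improvement(mu, sigma, noise_sd, q_best=0.25)))
```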

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers, from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are made tabu for a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
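The core loop described above (non-dominated sorting on the two objectives, selecting P centers, perturbing each center, and using the surrogate to pick one candidate per center for parallel evaluation) can be illustrated roughly as follows. This is a simplified sketch using an RBF surrogate from SciPy; the function names, perturbation scheme, and default parameters are assumptions for illustration, and SOP's tabu bookkeeping and convergence safeguards are omitted.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.spatial.distance import cdist

def select_centers(X, f, P):
    """Pick up to P centers by non-dominated sorting on the two objectives:
    expensive function value and (negated) minimum distance to other points."""
    d = cdist(X, X)
    np.fill_diagonal(d, np.inf)
    obj = np.column_stack([f, -d.min(axis=1)])  # both objectives minimized
    remaining, centers = list(range(len(X))), []
    while remaining and len(centers) < P:
        front = [i for i in remaining
                 if not any((obj[j] <= obj[i]).all() and (obj[j] < obj[i]).any()
                            for j in remaining if j != i)]
        centers.extend(front)
        remaining = [i for i in remaining if i not in front]
    return centers[:P]

def propose_points(X, f, P=4, n_cand=100, step=0.1, bounds=(0.0, 1.0)):
    """For each center, generate candidates by random perturbation and keep
    the one with the best surrogate prediction; the P winners would then be
    evaluated with the expensive function on P processors in parallel."""
    surrogate = RBFInterpolator(X, f)
    proposals = []
    for c in select_centers(X, f, P):
        cand = np.clip(X[c] + step * np.random.randn(n_cand, X.shape[1]),
                       *bounds)
        proposals.append(cand[np.argmin(surrogate(cand))])
    return np.array(proposals)

# Example usage on a toy 2-D problem (assumed setup):
X = np.random.rand(12, 2)               # previously evaluated points
f = ((X - 0.3) ** 2).sum(axis=1)        # their expensive function values
new_points = propose_points(X, f, P=4)  # evaluate these 4 in parallel
```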