14 results for Particle swarm optimization algorithm PSO

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Two new approaches to quantitatively analyze diffuse diffraction intensities from faulted layer stacking are reported. The parameters of a probability-based growth model are determined with two iterative global optimization methods: a genetic algorithm (GA) and particle swarm optimization (PSO). The results are compared with those from a third global optimization method, a differential evolution (DE) algorithm [Storn & Price (1997). J. Global Optim. 11, 341-359]. The algorithm efficiencies in the early and late stages of iteration are compared. The accuracy of the optimized parameters improves with increasing size of the simulated crystal volume. The wall-clock time for computing quite large crystal volumes can be kept within reasonable limits by the parallel calculation of many crystals (clones) generated for each model parameter set on a super- or grid computer. The faulted layer stacking in single crystals of trigonal three-pointed-star-shaped tris(bicyclo[2.1.1]hexeno)benzene molecules serves as an example for the numerical computations. Based on numerical values of seven model parameters (reference parameters), nearly noise-free reference intensities of 14 diffuse streaks were simulated from 1280 clones, each consisting of 96 000 layers (reference crystal). The parameters derived from the reference intensities with GA, PSO and DE were compared with the original reference parameters as a function of the simulated total crystal volume. The statistical distribution of structural motifs in the simulated crystals is in good agreement with that in the reference crystal. The results found with the growth model for layer stacking disorder are applicable to other disorder types and modeling techniques, Monte Carlo in particular.
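
Illustrative sketch (not from the cited work): the abstract does not give implementation details, so the following is a minimal global-best PSO loop over a generic seven-parameter model. The objective `stacking_residual`, the parameter bounds and the swarm settings are assumptions for illustration only; the real objective would compare simulated and reference streak intensities.

```python
import numpy as np

def pso_minimize(objective, bounds, n_particles=40, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with a basic global-best PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)
    x = rng.uniform(lo, hi, size=(n_particles, dim))        # positions
    v = np.zeros_like(x)                                     # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()                     # global best
    g_f = pbest_f.min()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        if f.min() < g_f:
            g_f, g = f.min(), x[f.argmin()].copy()
    return g, g_f

# Hypothetical residual between simulated and reference streak intensities.
def stacking_residual(params):
    return float(np.sum(params ** 2))        # placeholder objective

bounds = np.array([[0.0, 1.0]] * 7)          # seven growth-model parameters (assumed range)
best_params, best_residual = pso_minimize(stacking_residual, bounds)
```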

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE In this study, the "Progressive Resolution Optimizer PRO3" (Varian Medical Systems) is compared to the previous version "PRO2" with respect to its potential to improve dose sparing of the organs at risk (OAR) and dose coverage of the PTV for head and neck cancer patients. MATERIALS AND METHODS For eight head and neck cancer patients, volumetric modulated arc therapy (VMAT) treatment plans were generated in this study. All cases have 2-3 phases and the total prescribed dose (PD) was 60-72 Gy in the PTV. The study is mainly focused on the phase 1 plans, which all have an identical PD of 54 Gy and complex PTV structures with an overlap to the parotids. Optimization was performed based on planning objectives for the PTV according to ICRU 83, and with minimal dose to the spinal cord and to the parotids outside the PTV. In order to assess the quality of the optimization algorithms, an identical set of constraints was used for both PRO2 and PRO3. The resulting treatment plans were investigated with respect to dose distribution based on the analysis of the dose-volume histograms. RESULTS For the phase 1 plans (PD = 54 Gy), the near-maximum dose D2% of the spinal cord could be minimized to 22 ± 5 Gy with PRO3, as compared to 32 ± 12 Gy with PRO2, averaged over all patients. The mean dose to the parotids was also lower in PRO3 plans compared to PRO2, but the differences were less pronounced. A PTV coverage of V95% = 97 ± 1% could be reached with PRO3, as compared to 86 ± 5% with PRO2. In clinical routine, these PRO2 plans would require modifications to obtain better PTV coverage at the cost of higher OAR doses. CONCLUSION A comparison between the PRO3 and PRO2 optimization algorithms was performed for eight head and neck cancer patients. In general, the quality of VMAT plans for head and neck patients is improved with PRO3 as compared to PRO2. The dose to OARs can be reduced significantly, especially for the spinal cord. These reductions are achieved with better PTV coverage as compared to PRO2. The improved spinal cord sparing offers new opportunities for all types of paraspinal tumors and for re-irradiation of recurrent tumors or second malignancies.
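
For orientation (not part of the cited study): D2% and V95% are standard dose-volume-histogram metrics. The sketch below shows one way to compute them from a voxel dose array; the dose values are made up, and only the generic ICRU 83-style definitions are used.

```python
import numpy as np

def d_x_percent(dose, x):
    """Dose received by the hottest x% of the structure volume (e.g. D2%)."""
    return float(np.percentile(dose, 100.0 - x))

def v_x_percent(dose, prescribed, x):
    """Percentage of the structure volume receiving at least x% of the prescription."""
    return float(np.mean(dose >= prescribed * x / 100.0) * 100.0)

ptv_dose = np.random.default_rng(1).normal(54.0, 1.0, size=100_000)  # made-up voxel doses [Gy]
print("D2%  =", round(d_x_percent(ptv_dose, 2), 1), "Gy")
print("V95% =", round(v_x_percent(ptv_dose, 54.0, 95), 1), "%")
```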

Relevance:

40.00%

Publisher:

Abstract:

SOMS is a general surrogate-based multistart algorithm, which is used in combination with any local optimizer to find global optima for computationally expensive functions with multiple local minima. SOMS differs from previous multistart methods in that a surrogate approximation is used by the multistart algorithm to help reduce the number of function evaluations necessary to identify the most promising points from which to start each nonlinear programming local search. SOMS's numerical results are compared with those of four well-known methods, namely Multi-Level Single Linkage (MLSL), MATLAB's MultiStart, MATLAB's GlobalSearch, and GLOBAL. In addition, we propose a class of wavy test functions that mimic the wavy nature of objective functions arising in many black-box simulations. Extensive comparisons of algorithms on the wavy test functions and on earlier standard global-optimization test functions are done for a total of 19 different test problems. The numerical results indicate that SOMS performs favorably in comparison to alternative methods and does especially well on wavy functions when the number of function evaluations allowed is limited.
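
Rough illustration of the idea (not the authors' implementation): the sketch below fits an RBF surrogate to the points evaluated so far, ranks random candidate start points by their surrogate value, and launches a local search only from the most promising ones. The test function, design size and number of starts are assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def expensive_f(x):                      # stand-in for a costly black-box objective
    return float(np.sum(x**2) + 2.0 * np.sin(5.0 * np.sum(x)))

dim, n_init, n_starts = 2, 30, 3
X = rng.uniform(-3, 3, size=(n_init, dim))           # initial design
y = np.array([expensive_f(x) for x in X])

surrogate = RBFInterpolator(X, y)                     # cheap approximation of f

# Rank many random candidates by surrogate value, keep the best as start points.
candidates = rng.uniform(-3, 3, size=(500, dim))
starts = candidates[np.argsort(surrogate(candidates))[:n_starts]]

best = min((minimize(expensive_f, s, method="Nelder-Mead") for s in starts),
           key=lambda r: r.fun)
print(best.x, best.fun)
```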

Relevance:

30.00%

Publisher:

Abstract:

Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure requiring many user interactions. This means that automation is needed for use in clinical routine. In addition, because of the long computing time in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework in which appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. In order to provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers and the patient. The source part includes the phase-space source, source models and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one out of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen. Thereby, different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse providing all necessary information by means of DICOM streams was used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown, for which comparisons are performed between MC-calculated dose distributions and those calculated by a pencil beam or the AAA algorithm. Interfacing this flexible and efficient MC environment with Eclipse allows widespread use for all kinds of investigations, from timing and benchmarking studies to clinical patient studies. Additionally, it is possible to add modules, keeping the system highly flexible and efficient.
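
Illustrative sketch of the described architecture (not the actual framework): source, beam-modifier and patient modules chained so that particles are passed in memory rather than through phase-space files. All class names, the toy particle format and the toy physics are assumptions made for the example.

```python
from dataclasses import dataclass
import random

@dataclass
class Particle:
    energy: float      # MeV
    x: float
    y: float
    weight: float = 1.0

class SourceModule:
    """Stand-in for a phase-space source, a source model or full treatment-head transport."""
    def generate(self, n):
        return [Particle(energy=6.0 * random.random(), x=0.0, y=0.0) for _ in range(n)]

class BeamModifierModule:
    """One module per beam modifier; a 'simple' or 'exact' geometry could be selected here."""
    def __init__(self, attenuation):
        self.attenuation = attenuation
    def transport(self, particles):
        for p in particles:
            p.weight *= self.attenuation          # toy interaction model
        return particles

class PatientModule:
    """Stand-in for the patient dose-calculation MC code."""
    def score_dose(self, particles):
        return sum(p.energy * p.weight for p in particles)

# Modules pass particle lists in memory; no intermediate files are written.
source, mlc, patient = SourceModule(), BeamModifierModule(0.95), PatientModule()
dose = patient.score_dose(mlc.transport(source.generate(10_000)))
print("scored dose (arbitrary units):", dose)
```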

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction module and the classification module. The feature extraction module calculates the average gray level and 48 texture characteristics, which are derived from the spatial gray-level co-occurrence matrices obtained from the ROIs. The classifier module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies into normal or pathological liver regions. The pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" into hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: sequential forward selection, sequential floating forward selection, and a genetic algorithm for feature selection. The comparative study of the above dimensionality reduction methods shows that genetic algorithms result in lower-dimensional feature vectors and improved classification performance.
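
Minimal sketch of the described pipeline (not the authors' system): scikit-image co-occurrence features and scikit-learn feed-forward networks stand in for the paper's feature extraction and NN cascade. The data, labels, feature subset and network sizes are illustrative assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def roi_features(roi):
    """Mean gray level plus a few co-occurrence texture features for one ROI."""
    glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256, symmetric=True, normed=True)
    return [roi.mean(),
            graycoprops(glcm, "contrast")[0, 0],
            graycoprops(glcm, "homogeneity")[0, 0],
            graycoprops(glcm, "energy")[0, 0]]

# Made-up ROIs and labels: 0 normal, 1 cyst, 2 hemangioma, 3 hepatocellular carcinoma.
rng = np.random.default_rng(0)
rois = rng.integers(0, 256, size=(200, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 4, size=200)
X = np.array([roi_features(r) for r in rois])

# Cascade: NN1 normal vs pathological, NN2 cyst vs other, NN3 hemangioma vs HCC.
nn1 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X, labels > 0)
path = labels > 0
nn2 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X[path], labels[path] == 1)
other = labels >= 2
nn3 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X[other], labels[other] == 3)
```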

Relevance:

30.00%

Publisher:

Abstract:

Fluorides are used in dental care due to their beneficial effect in tooth enamel de-/remineralization cycles. To achieve a desired constant supply of soluble fluorides in the oral cavity, different approaches have been followed. Here we present results on the preparation of CaF2 particles and their characterization with respect to a potential application as enamel-associated fluoride-releasing reservoirs. CaF2 particles were synthesized by precipitation from soluble NaF and CaCl2 salt solutions of defined concentrations, and their morphology was analyzed by scanning electron microscopy. CaF2 particles with defined sizes and shapes could be synthesized by adjusting the concentrations of the precursor salt solutions. Such particles interacted with enamel surfaces when applied at fluoride concentrations corresponding to typical dental care products. Fluoride release from the synthesized CaF2 particles was observed to be largely influenced by the concentration of phosphate in the solution. Physiological solutions with a phosphate concentration similar to that of saliva (3.5 mM) reduced the fluoride release from pure CaF2 particles by a factor of 10-20 as compared to phosphate-free buffer solutions. Fluoride release was even lower in human saliva. The fluoride release could be increased by the addition of phosphate in substoichiometric amounts during CaF2 particle synthesis. The presented results demonstrate that the morphology and fluoride release characteristics of CaF2 particles can be tuned and provide evidence of the suitability of synthetic CaF2 particles as enamel-associated fluoride reservoirs.

Relevance:

30.00%

Publisher:

Abstract:

Measurements of charged-particle fragmentation functions of jets produced in ultra-relativistic nuclear collisions can provide insight into the modification of parton showers in the hot, dense medium created in the collisions. ATLAS has measured jets in √s_NN = 2.76 TeV Pb+Pb collisions at the LHC using a data set recorded in 2011 with an integrated luminosity of 0.14 nb⁻¹. Jets were reconstructed using the anti-kt algorithm with distance parameter values R = 0.2, 0.3, and 0.4. Distributions of charged-particle transverse momentum and longitudinal momentum fraction are reported for seven bins in collision centrality for R = 0.4 jets with p_T^jet > 100 GeV. Commensurate minimum p_T values are used for the other radii. Ratios of fragment distributions in each centrality bin to those measured in the most peripheral bin are presented. These ratios show a reduction of fragment yield in central collisions relative to peripheral collisions at intermediate z values, 0.04 ≲ z ≲ 0.2, and an enhancement in fragment yield for z ≲ 0.04. A smaller, less significant enhancement is observed at large z and large p_T in central collisions.
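
For orientation (editorial aside, not part of the abstract): the longitudinal momentum fraction used in such measurements is commonly defined as z = p_T cos(ΔR) / p_T^jet, and the reported ratios compare per-jet fragment distributions between centrality classes. The sketch below only illustrates these two steps with invented fragment kinematics; the actual ATLAS definitions, corrections and unfolding are not reproduced.

```python
import numpy as np

def longitudinal_fraction(pt_track, dr_track_jet, pt_jet):
    """z = p_T * cos(DeltaR) / p_T^jet for a charged particle inside a jet (hedged definition)."""
    return pt_track * np.cos(dr_track_jet) / pt_jet

rng = np.random.default_rng(0)
z_bins = np.logspace(-2, 0, 15)

def per_jet_distribution(n_fragments):
    # Made-up fragment kinematics for jets with p_T^jet = 120 GeV.
    z = longitudinal_fraction(rng.exponential(6.0, n_fragments),
                              rng.uniform(0.0, 0.4, n_fragments), 120.0)
    counts, _ = np.histogram(z, bins=z_bins)
    return counts / counts.sum()

# Ratio of the "central" to the "peripheral" distribution, one value per z bin.
ratio = per_jet_distribution(100_000) / np.maximum(per_jet_distribution(100_000), 1e-12)
```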

Relevance:

30.00%

Publisher:

Abstract:

Attractive business cases in various application fields contribute to the sustained long-term interest in indoor localization and tracking by the research community. Location tracking is generally treated as a dynamic state estimation problem consisting of two steps: (i) location estimation through measurement, and (ii) location prediction. For the estimation step, one of the most efficient and low-cost solutions is Received Signal Strength (RSS)-based ranging. However, various challenges - unrealistic propagation models, non-line-of-sight (NLOS) conditions, and multipath propagation - are yet to be addressed. Particle filters are a popular choice for dealing with the inherent non-linearities in both location measurements and motion dynamics. While such filters have been successfully applied to accurate, time-based ranging measurements, dealing with the more error-prone RSS-based ranging is still challenging. In this work, we address the above issues with a novel weighted-likelihood bootstrap particle filter for tracking via RSS-based ranging. Our filter weights the individual likelihoods from different anchor nodes exponentially, according to the ranging estimation. We also employ an improved propagation model for more accurate RSS-based ranging, which we proposed in recent work. We implemented and tested our algorithm in a passive localization system with IEEE 802.15.4 signals, showing that our proposed solution largely outperforms a traditional bootstrap particle filter.
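
Minimal sketch of the general approach (the paper's exact weighting scheme and propagation model are not reproduced): one predict/update/resample step of a bootstrap particle filter for RSS ranging, in which each anchor's Gaussian likelihood is raised to a range-dependent exponent. The log-distance path-loss parameters, anchor layout and exponent rule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed log-distance path-loss model: RSS = P0 - 10*n*log10(d) + noise.
P0, PLE, SIGMA = -40.0, 2.5, 4.0
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

def expected_rss(positions, anchor):
    d = np.linalg.norm(positions - anchor, axis=1).clip(0.1)
    return P0 - 10.0 * PLE * np.log10(d)

def step(particles, weights, rss_meas):
    # Prediction: random-walk motion model.
    particles = particles + rng.normal(0.0, 0.3, particles.shape)
    # Update: per-anchor Gaussian likelihoods, weighted exponentially so that
    # anchors estimated to be closer (stronger RSS) count more (assumed rule).
    log_like = np.zeros(len(particles))
    for a, z in zip(anchors, rss_meas):
        resid = z - expected_rss(particles, a)
        exponent = 1.0 / (1.0 + max(0.0, P0 - z) / 40.0)
        log_like += exponent * (-0.5 * (resid / SIGMA) ** 2)
    weights = weights * np.exp(log_like - log_like.max())
    weights /= weights.sum()
    # Bootstrap resampling.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

particles = rng.uniform(0, 10, size=(2000, 2))
weights = np.full(len(particles), 1.0 / len(particles))
rss_meas = np.array([-55.0, -62.0, -60.0, -70.0])      # one made-up measurement vector
particles, weights = step(particles, weights, rss_meas)
print("position estimate:", particles.mean(axis=0))
```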

Relevance:

30.00%

Publisher:

Abstract:

Recent studies of Schwinger pair production have demonstrated that the asymptotic particle spectrum is extremely sensitive to the applied field profile. We extend the idea of the dynamically assisted Schwinger effect from single pulse profiles to more realistic field configurations to be generated in an all-optical experiment searching for pair creation. We use the quantum kinetic approach to study the particle production and employ a multi-start method, combined with optimal control theory, to determine a set of parameters for which the particle yield in the forward direction in momentum space is maximized. We argue that this strategy can be used to enhance the signal of pair production on a given detector in an experimental setup.

Relevance:

30.00%

Publisher:

Abstract:

Herein, we report the discovery of the first potent and selective inhibitor of TRPV6, a calcium channel overexpressed in breast and prostate cancer, and its use to test the effect of blocking TRPV6-mediated Ca2+-influx on cell growth. The inhibitor was discovered through a computational method, xLOS, a 3D-shape and pharmacophore similarity algorithm, a type of ligand-based virtual screening (LBVS) method described briefly here. Starting with a single weakly active seed molecule, two successive rounds of LBVS followed by optimization by chemical synthesis led to a selective molecule with 0.3M inhibition of TRPV6. The ability of xLOS to identify different scaffolds early in LBVS was essential to success. The xLOS method may be generally useful to develop tool compounds for poorly characterized targets.
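
Generic stand-in for the screening step (xLOS itself is not sketched in the abstract): rank a compound library by similarity to a seed molecule using precomputed descriptor vectors and a continuous Tanimoto score, which is the general shape of a ligand-based virtual screen. The descriptors and data here are invented placeholders.

```python
import numpy as np

def tanimoto(a, b):
    """Continuous Tanimoto similarity between two non-negative descriptor vectors."""
    num = float(np.dot(a, b))
    return num / (float(np.dot(a, a)) + float(np.dot(b, b)) - num)

rng = np.random.default_rng(0)
library = rng.random((10_000, 128))       # made-up 3D-shape/pharmacophore descriptors
seed = rng.random(128)                    # descriptor of the weakly active seed molecule

scores = np.array([tanimoto(seed, m) for m in library])
top_hits = np.argsort(scores)[::-1][:50]  # candidates to order and test experimentally
```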

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a parallel surrogate-based global optimization method for computationally expensive objective functions that is more effective for larger numbers of processors. To reach this goal, we integrated concepts from multi-objective optimization and tabu search into single-objective surrogate optimization. Our proposed derivative-free algorithm, called SOP, uses non-dominated sorting of points for which the expensive function has been previously evaluated. The two objectives are the expensive function value of the point and the minimum distance of the point to previously evaluated points. Based on the results of non-dominated sorting, P points from the sorted fronts are selected as centers from which many candidate points are generated by random perturbations. Based on surrogate approximation, the best candidate point is subsequently selected for expensive evaluation for each of the P centers, with simultaneous computation on P processors. Centers that previously did not generate good solutions are made tabu for a given tenure. We show almost sure convergence of this algorithm under some conditions. The performance of SOP is compared with two RBF-based methods. The test results show that SOP is an efficient method that can reduce the time required to find a good near-optimal solution. In a number of cases the efficiency of SOP is so good that SOP with 8 processors found an accurate answer in less wall-clock time than the other algorithms did with 32 processors.
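
Compressed sketch of one selection step as described above (a simplification, not the SOP code): evaluated points are sorted on the two objectives (small function value, large minimum distance to other evaluated points), P non-tabu centers are taken from the first front, and for each center the candidate with the best surrogate value is proposed for parallel evaluation. The surrogate, perturbation size and tabu handling are assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

def f(x):                                   # stand-in expensive objective
    return float(np.sum(x**2))

dim, P = 2, 4
X = rng.uniform(-3, 3, (40, dim))           # already evaluated points
y = np.array([f(x) for x in X])
surrogate = RBFInterpolator(X, y)

# Objective 1: small f value. Objective 2: large distance to other evaluated points.
D = cdist(X, X)
np.fill_diagonal(D, np.inf)
min_dist = D.min(axis=1)

def non_dominated(fvals, dists):
    keep = []
    for i in range(len(fvals)):
        dominated = np.any((fvals <= fvals[i]) & (dists >= dists[i]) &
                           ((fvals < fvals[i]) | (dists > dists[i])))
        if not dominated:
            keep.append(i)
    return keep

tabu = set()                                 # centers that recently produced no improvement
front = [i for i in non_dominated(y, min_dist) if i not in tabu]
centers = front[:P]                          # up to P centers, one per processor

# For each center, perturb it into candidates and pick the surrogate-best one.
proposals = []
for c in centers:
    cands = np.clip(X[c] + rng.normal(0.0, 0.2, (100, dim)), -3, 3)
    proposals.append(cands[np.argmin(surrogate(cands))])
# The proposals would now be evaluated with f() in parallel, one per processor.
```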

Relevance:

30.00%

Publisher:

Abstract:

We present a novel surrogate model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as a global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10² to 10⁴. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
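
The multi-scale sparse/local GP construction is specific to the paper; the sketch below only illustrates the standard expected-improvement criterion on a single GP, with scikit-learn as a stand-in, so it is not the SpLEGO algorithm itself. The objective, sample and grid are assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def f(x):                                            # stand-in expensive objective
    return np.sin(3 * x[:, 0]) + 0.1 * x[:, 0] ** 2

X = rng.uniform(-3, 3, (15, 1))                      # points evaluated so far
y = f(X)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

def expected_improvement(x_new, y_best):
    mu, sigma = gp.predict(x_new, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    gamma = (y_best - mu) / sigma                    # improvement in standard units
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))

grid = np.linspace(-3, 3, 500).reshape(-1, 1)
x_next = grid[np.argmax(expected_improvement(grid, y.min()))]  # next point to evaluate
```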