Computational study of the step size parameter of the subgradient optimization method
Abstract
The subgradient optimization method is a simple and flexible iterative algorithm for convex optimization. It is much simpler than Newton's method and applies to a wider variety of problems; in particular, it converges even when the objective function is non-differentiable. Since an efficient algorithm should not only produce a good solution but also require little computing time, a simple algorithm that delivers high-quality solutions is preferable. In this study, a series of step size parameters in the subgradient equation is studied. Performance is compared on a general piecewise function and on a specific p-median problem, and we examine how the quality of the solution changes under five forms of the step size parameter.
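As a minimal sketch of the method the abstract describes, the following applies subgradient iterations to a piecewise-linear convex function f(x) = max_i (a_i·x + b_i). The coefficients and the diminishing 1/k step-size rule are illustrative assumptions, not the paper's specific test functions or its five step-size forms.

```python
import numpy as np

# Illustrative piecewise-linear convex function f(x) = max_i (a_i . x + b_i).
# Coefficients are hypothetical, chosen so that f is bounded below.
A = np.array([[1.0, 2.0], [-1.0, 1.0], [0.5, -1.5]])
b = np.array([0.0, 1.0, -0.5])

def f(x):
    return float(np.max(A @ x + b))

def subgradient(x):
    # For a pointwise max of affine pieces, the gradient of any piece
    # attaining the max at x is a valid subgradient.
    return A[np.argmax(A @ x + b)]

def subgradient_method(x0, n_iter=500):
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, n_iter + 1):
        g = subgradient(x)
        step = 1.0 / k  # diminishing rule; constant or Polyak-type steps are common alternatives
        x = x - step * g
        # Subgradient steps are not monotone in f, so track the best iterate.
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

x_best, f_best = subgradient_method([5.0, 5.0])
```

Because the subgradient method is not a descent method, the best objective value seen so far is reported rather than the final iterate; the choice of step-size rule governs how quickly this best value approaches the minimum, which is the trade-off the study investigates.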
Format
application/pdf |
Identifier
Language(s)
eng |
Publisher
Högskolan Dalarna, Statistik |
Rights
info:eu-repo/semantics/openAccess |
Keywords
subgradient method; optimization; convex function; p-median
Type
Manuscript (preprint) info:eu-repo/semantics/preprint text |