optimization methods mentioned in your article

Vasp transition state theory tools


butterflyyh
Posts: 7
Joined: Sun Aug 28, 2011 3:57 am

optimization methods mentioned in your article

Post by butterflyyh »

Hi graeme,
I read your 2008 paper on optimizers (The Journal of Chemical Physics 128, 134106). In that paper it is mentioned that GL-BFGS(hess) is faster than L-BFGS, but that a qualitative error exists.
(1) Does that mean that GL-BFGS(hess) is not stable, and that for some systems a large error may occur?
(2) Does IOPT=1 stand for the GL-BFGS(hess) optimizer, or for one of the other optimizers mentioned in your paper, L-BFGS(hess) or L-BFGS(line)?
(3) Suppose there are many local minima on the potential energy surface and the two end images are very close, around 1 Å apart. In this situation, do first-order methods perform more efficiently than second-order methods? And because the distance moved by the diffusing atom between reactant and product is short, should I lower TIMESTEP and MAXMOVE at the same time?
Thanks in advance!
graeme
Site Admin
Posts: 2291
Joined: Tue Apr 26, 2005 4:25 am

Re: optimization methods mentioned in your article

Post by graeme »

(1) I don't really understand what you mean by the GL-BFGS(hess) having a qualitative error. The GL-BFGS(hess) works with the NEB; there is no error. Perhaps you are referring to the fact that GL-BFGS(hess) builds up an approximate (inverse) Hessian despite the fact that formally the NEB does not have one? If so, this does not seem to have a detrimental effect on convergence.

(2) The IOPT=1 optimizer is GL-BFGS(hess). To use a line optimizer, set LLINEOPT=.TRUE.
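
As a reference for readers of this thread, a minimal INCAR fragment selecting this optimizer through the VTST tools might look like the sketch below (the IBRION/POTIM pairing is the usual way to hand ionic steps over to the VTST optimizers; the exact values are illustrative):

```
IBRION = 3          ! use the VTST optimizers instead of VASP's built-in ones
POTIM  = 0          ! zero VASP's own step so the IOPT optimizer takes over
IOPT   = 1          ! LBFGS optimizer, i.e. GL-BFGS(hess)
LLINEOPT = .FALSE.  ! set .TRUE. to use the line-minimizer variant instead
```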

(3) The distance between images should not influence your choice of optimizer. It is true that a first-order method will be more stable on a rough landscape. If the forces are high, lowering TIMESTEP is the best way to control convergence. Once you have settled near an optimal path, however, you should be able to switch to a second-order method for faster convergence.
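
As a sketch of the conservative first-order settings described above, one might start a rough landscape with something like the following (IOPT=7 selects the FIRE optimizer in the VTST tools; the specific TIMESTEP and MAXMOVE values are illustrative, not recommendations from this thread):

```
IOPT     = 7     ! FIRE, a robust first-order optimizer for early iterations
TIMESTEP = 0.05  ! reduced dynamical time step for high-force configurations
MAXMOVE  = 0.1   ! cap on the distance any atom may move per step (Angstrom)
```

Once the forces drop and the path is close to optimal, switching back to IOPT=1 should give faster final convergence.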