Abstract. We study the use of the BFGS and DFP algorithms with step-lengths of one for minimizing quadratic functions of only two variables. The updating formulae in this case imply nonlinear three-term recurrence relations between the eigenvalues of consecutive second-derivative approximations, which are analysed in order to explain some gross inefficiencies that can occur.

OPER 627: Nonlinear Optimization, Lecture 8: BFGS and L-BFGS. Department of Statistical Sciences and Operations Research, Virginia Commonwealth University, Sept 23, 2013.

We investigate the BFGS algorithm with an inexact line search when applied to nonsmooth functions, not necessarily convex. We define a suitable line search and show that it generates a sequence of nested intervals containing points satisfying the Armijo and weak Wolfe conditions, assuming only absolute continuity.

You can also ask for the posterior mode, which is found by optimization with the BFGS (or L-BFGS) algorithm. We intend to create more Stata commands: allowing CODA-style diagnostics and plotting after a model has been fitted and chains stored, and fitting specific models given variables, matrices, and globals, in the style of the R package rstanarm. Stan implements reverse-mode automatic differentiation to calculate gradients of the model, which is required by HMC, NUTS, L-BFGS, BFGS, and variational inference. The automatic differentiation within Stan can be used outside of the probabilistic programming language.
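The setting of the first abstract — BFGS with every step length fixed at one, applied to a quadratic — can be sketched in a few lines. This is a minimal illustration, not code from any of the sources: the function name, the starting point, and the particular 2×2 quadratic are all assumed for the example. The update shown is the standard BFGS formula for the inverse-Hessian approximation.

```python
import numpy as np

def bfgs_unit_steps(A, b, x0, iters=20):
    """Run BFGS with step length one on the quadratic
    f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b.

    Returns the final iterate and the final inverse-Hessian
    approximation H (initialized to the identity).
    """
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                 # inverse-Hessian approximation
    g = A @ x - b                 # gradient at the current iterate
    for _ in range(iters):
        p = -H @ g                # quasi-Newton search direction
        x_new = x + p             # step length fixed at one
        g_new = A @ x_new - b
        s = x_new - x             # displacement
        y = g_new - g             # change in gradient (here y = A s)
        sy = s @ y
        if abs(sy) < 1e-12:       # guard against a degenerate update
            break
        rho = 1.0 / sy
        I = np.eye(n)
        # Standard BFGS update of the inverse Hessian approximation:
        # H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x, H
```

On a well-conditioned two-variable quadratic the iterates converge to the minimizer A⁻¹b even with unit steps; the abstract's point is that the eigenvalues of the successive Hessian approximations can nonetheless behave badly, which is what the three-term recurrence analysis explains.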