Let B be a positive definite symmetric approximation to the second derivative matrix of the objective function of a minimization calculation. The paper studies modifications of the BFGS method that apply a scaling technique to the columns of a conjugate direction matrix Z satisfying Z^T B Z = I. For a simple scaling technique similar to the ones considered by Powell, it shows that, due to a two-iteration cycle, linear convergence can occur when the method is applied to a quadratic function and Wolfe's line search is employed, although quadratic termination can be proved for exact line searches. The paper then suggests a different scaling technique that prevents certain columns, thought to contain important curvature information, from being scaled. For this algorithm it proves global and superlinear convergence and demonstrates the superiority of the scaled method over the standard BFGS formula on a range of ill-conditioned optimization problems. Moreover, the paper presents an implementation of the algorithm that requires only 3n^2 + O(n) multiplications per iteration.
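For context, the quadratic-termination property mentioned above can be illustrated with the standard (unscaled) BFGS method. The sketch below is not the paper's column-scaled algorithm: it applies the ordinary BFGS inverse-Hessian update with exact line searches to an assumed ill-conditioned quadratic test problem f(x) = (1/2) x^T A x (the matrix A and starting point are illustrative choices, not taken from the paper), and relies on the classical result that BFGS with exact line searches terminates on an n-dimensional quadratic in at most n iterations.

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse Hessian approximation H,
    given step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative ill-conditioned quadratic f(x) = 0.5 x^T A x
# (hypothetical test problem, not one from the paper).
A = np.diag([1.0, 100.0, 10000.0])
x = np.array([1.0, 1.0, 1.0])
H = np.eye(3)

for _ in range(3):                      # n = 3 iterations
    g = A @ x
    d = -H @ g
    alpha = -(g @ d) / (d @ A @ d)      # exact line search on a quadratic
    s = alpha * d
    x_new = x + s
    y = A @ x_new - g
    H = bfgs_update(H, s, y)
    x = x_new

# Quadratic termination: after n exact-line-search iterations the
# gradient is zero up to roundoff.
print(np.linalg.norm(A @ x))
```

The paper's method would instead maintain a factor Z with Z^T B Z = I (so that the inverse approximation is Z Z^T) and rescale selected columns of Z between iterations; the sketch above only shows the baseline behavior that the scaled variants are measured against.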