This paper proposes an implementable proximal quasi-Newton method for minimizing a non-differentiable convex function f on ℝⁿ. The method is based on Rockafellar's proximal point algorithm and a cutting-plane technique. At each step, we use an approximate proximal point p_a(x_k) of x_k to define a vector v_k ∈ ∂_{ε_k} f(p_a(x_k)) with ε_k ≤ α‖v_k‖, where α is a constant. The method monitors the reduction in the value of ‖v_k‖ to identify when a line search on f should be used, and the quasi-Newton step is used to reduce the value of ‖v_k‖. Without requiring differentiability of f, the method converges globally and the rate of convergence is Q-linear. Superlinear convergence is also discussed, extending the characterization result of Dennis and Moré. Numerical results demonstrate the good performance of the method.
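
For orientation, the following is a minimal sketch of the standard proximal point relations underlying such methods (assuming the usual Moreau–Yosida setup with a prox parameter λ > 0, which the abstract does not specify); the exact construction of (v_k, ε_k) in the paper may differ.

\[
  p(x) \;=\; \operatorname*{arg\,min}_{y \in \mathbb{R}^n}
     \Bigl\{ f(y) + \tfrac{1}{2\lambda}\,\lVert y - x\rVert^2 \Bigr\},
  \qquad
  v \;=\; \tfrac{1}{\lambda}\bigl(x - p(x)\bigr) \;\in\; \partial f\bigl(p(x)\bigr),
\]
so that v = 0 exactly when x minimizes f. When only an approximate proximal point p_a(x) is available, the analogous vector is in general only an ε-subgradient, v ∈ ∂_ε f(p_a(x)), where
\[
  \partial_{\varepsilon} f(z) \;=\;
  \bigl\{\, g \in \mathbb{R}^n : f(y) \,\ge\, f(z) + g^{\mathsf T}(y - z) - \varepsilon
     \ \ \forall\, y \in \mathbb{R}^n \,\bigr\},
\]
which is why a bound such as ε_k ≤ α‖v_k‖ makes ‖v_k‖ a usable stationarity measure.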