Recently, in He et al. [He BS, Tao M, Yuan XM (2012) Alternating direction method with Gaussian back substitution for separable convex programming. SIAM J. Optim. 22(2):313–340], we showed for the first time that the Douglas–Rachford alternating direction method of multipliers (ADMM) can be combined with a Gaussian back substitution procedure to solve a convex minimization model with a general separable structure. This paper studies this theme further. We first derive a general algorithmic framework that combines ADMM with either a forward or a backward substitution procedure. We then show that the convergence of this framework can be proved easily from the contraction perspective, and that its local linear convergence rate is provable if a certain error bound condition is assumed. Without such an error bound assumption, we estimate its worst-case convergence rate measured by iteration complexity.
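For concreteness, a rough sketch of the separable model and one prediction–correction iteration of such a framework might read as follows; the symbols $\theta_i$, $A_i$, $\mathcal{X}_i$, $\beta$, $\alpha$, and the substitution matrix $M$ are illustrative placeholders rather than the paper's exact notation, and the correction step is only indicative of how a forward or backward substitution would re-weight the ADMM output.

% Illustrative sketch (placeholder notation): m-block separable model,
% an ADMM (Gauss-Seidel) prediction sweep, and a substitution correction.
\begin{align*}
  &\min_{x_1,\dots,x_m} \ \sum_{i=1}^{m} \theta_i(x_i)
     \quad \text{s.t.} \quad \sum_{i=1}^{m} A_i x_i = b, \quad x_i \in \mathcal{X}_i, \\[4pt]
  &\text{Prediction (ADMM sweep), for } i = 1,\dots,m: \\
  &\quad \tilde{x}_i^{\,k} \in \arg\min_{x_i \in \mathcal{X}_i}
     \Big\{ \theta_i(x_i) - (\lambda^{k})^{\!\top} A_i x_i
     + \tfrac{\beta}{2} \Big\| \sum_{j<i} A_j \tilde{x}_j^{\,k} + A_i x_i
     + \sum_{j>i} A_j x_j^{k} - b \Big\|^{2} \Big\}, \\
  &\quad \tilde{\lambda}^{k} = \lambda^{k} - \beta \Big( \sum_{i=1}^{m} A_i \tilde{x}_i^{\,k} - b \Big), \\[4pt]
  &\text{Correction (forward or backward substitution): } \quad
     v^{k+1} = v^{k} - \alpha\, M \,\big(v^{k} - \tilde{v}^{k}\big),
\end{align*}

where $v$ collects the variables touched by the substitution step and $M$ encodes the forward or backward (e.g., Gaussian back) substitution applied to the ADMM predictor.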