The authors show that under the sufficient conditions usually given for infinitesimal perturbation analysis (IPA) to apply for derivative estimation, a finite-difference scheme with common random numbers (FDC) has the same order of convergence, namely O(n^{-1/2}), provided that the size of the finite-difference interval converges to zero fast enough. This holds for both one-sided and two-sided FDC. It also holds for variants of IPA, such as some versions of smoothed perturbation analysis (SPA), which is based on conditional expectation, and for the estimation of steady-state performance measures by truncated-horizon estimators, under some ergodicity assumptions. The developments do not involve monotonicity, but are based on continuity and smoothness conditions. The authors give examples and numerical illustrations showing that the actual difference in mean square error (MSE) between IPA and FDC is typically negligible. They also obtain the order of convergence of that difference, which goes to zero faster than the MSE itself.
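As a minimal illustration of the two estimators being compared (the example below is not taken from the paper under review): for alpha(theta) = E[min(theta, U)] with U uniform on (0, 1), the derivative is P(U > theta) = 1 - theta. The IPA estimator differentiates the sample path directly, while FDC takes a finite difference at theta +/- h using the same uniforms, with h shrinking as n grows (the choice h ~ n^{-1/2} here is illustrative, not the paper's prescription).

```python
import numpy as np

# Illustrative sketch (example not from the reviewed paper): estimate
# d/d(theta) of alpha(theta) = E[min(theta, U)], U ~ Uniform(0, 1),
# whose exact derivative is P(U > theta) = 1 - theta.
rng = np.random.default_rng(0)
n = 100_000
theta = 0.5
u = rng.uniform(size=n)  # common random numbers shared by both estimators

# IPA: the pathwise derivative of min(theta, u) w.r.t. theta is 1{u > theta}.
ipa = np.mean(u > theta)

# Two-sided FDC: central difference at theta +/- h on the SAME uniforms,
# with the interval shrinking in n (h ~ n^{-1/2}, an illustrative choice).
h = n ** -0.5
fdc = np.mean((np.minimum(theta + h, u) - np.minimum(theta - h, u)) / (2 * h))

true_deriv = 1 - theta
print(f"IPA: {ipa:.4f}  FDC: {fdc:.4f}  true: {true_deriv}")
```

Because both estimators reuse the same draws of U, their sample paths differ only for the O(h) fraction of draws falling near theta, which is why the MSE difference between IPA and FDC is typically negligible, as the authors observe.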