Consider n data measurements of a univariate process that have been altered by random errors. It is assumed that the underlying model function has substantially fewer turning points than the observed data. Algorithms are proposed that minimize the sum of the moduli of the errors subject to the condition that the sequence of smoothed values consists of k monotonic sections, alternately increasing and decreasing. The main difficulty in this calculation is that the optimal positions of the joins between the monotonic sections have to be determined automatically from so many combinations that it is impossible to test each one separately. Moreover, the calculation seems highly intractable to general optimization techniques, because O(n^k) local minima can occur. It is shown that dynamic programming can be used to separate the data into optimal disjoint sections of adjacent data, where each section requires a single L1 monotonic calculation. This procedure is highly efficient, requiring at most O(kn^2) computer operations and O(n) best L1 monotonic calculations on subranges of the data to reach a global minimum.
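The structure of the calculation can be illustrated by a small sketch (all names below are hypothetical, and the outer recursion is a naive dynamic program over disjoint sections for clarity, not the paper's efficient scheme). Each single-section subproblem is a best L1 monotonic (isotonic) fit, which can be computed by pool-adjacent-violators with block medians, since the median minimizes a sum of absolute deviations; the dynamic program then chooses where one monotonic section ends and the next begins.

```python
import statistics

def l1_isotonic(y, increasing=True):
    """Best monotonic fit to y in the L1 norm, by pool-adjacent-violators
    with block medians (the median minimizes the sum of moduli)."""
    s = list(y) if increasing else [-v for v in y]
    blocks = []  # each block: (member values, fitted value = their median)
    for v in s:
        blocks.append(([v], v))
        # merge adjacent blocks while monotonicity is violated
        while len(blocks) > 1 and blocks[-2][1] > blocks[-1][1]:
            vals = blocks[-2][0] + blocks[-1][0]
            blocks[-2:] = [(vals, statistics.median(vals))]
    fit = []
    for vals, m in blocks:
        fit.extend([m] * len(vals))
    return fit if increasing else [-v for v in fit]

def l1_cost(y, increasing):
    """Sum of moduli of the residuals of the best monotonic fit to y."""
    fit = l1_isotonic(y, increasing)
    return sum(abs(a - b) for a, b in zip(y, fit))

def best_piecewise_l1(y, k):
    """Least sum of moduli over fits with k alternating monotonic
    sections (section j increasing when j is odd), via dynamic
    programming on the position of the last join."""
    n = len(y)
    INF = float("inf")
    # E[j][t]: least L1 error over y[0..t] using j alternating sections
    E = [[INF] * n for _ in range(k + 1)]
    for t in range(n):
        E[1][t] = l1_cost(y[: t + 1], True)
    for j in range(2, k + 1):
        inc = (j % 2 == 1)
        for t in range(j - 1, n):
            for s in range(j - 2, t):
                # section j covers the data y[s+1..t]
                c = E[j - 1][s] + l1_cost(y[s + 1 : t + 1], inc)
                if c < E[j][t]:
                    E[j][t] = c
    return E[k][n - 1]
```

For example, the data [1, 2, 3, 2, 1] with k = 2 admit one increasing and one decreasing section with zero error, so `best_piecewise_l1` returns 0 for them.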