We consider the online smoothing problem, in which a tracker is required to maintain distance no more than Δ ≥ 0 from a time-varying signal f while minimizing its own movement. The problem is determined by a metric space (X, d) with an associated cost function c: ℝ → ℝ. Given a signal f_1, f_2, … ∈ X, the tracker is responsible for producing a sequence a_1, a_2, … of elements of X that meets the proximity constraint d(f_i, a_i) ≤ Δ. To complicate matters, the tracker is online (the value a_i may depend only on f_1, …, f_i) and wishes to minimize the cost of its travels, ∑ c(d(a_i, a_{i+1})). We evaluate such tracking algorithms competitively, comparing their cost with that achieved by an optimal adversary apprised of the entire signal in advance. The problem was originally proposed by Yi and Zhang (2009), who considered the natural circumstance where the metric space is ℤ^k with the ℓ_2 metric and the cost function equals 1 unless the distance is zero (thus the tracker pays a fixed cost for any nonzero motion).
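To make the setup concrete, here is a minimal Python sketch (not from the paper; the function names are our own) that checks the proximity constraint and computes movement cost on ℤ under the fixed-cost model just described:

```python
def is_valid_track(signal, track, delta):
    """Proximity constraint: the tracker stays within delta of the signal."""
    return all(abs(f - a) <= delta for f, a in zip(signal, track))

def fixed_movement_cost(track):
    """'Pay if you move' cost: each nonzero step costs 1, staying put is free."""
    return sum(1 for a, b in zip(track, track[1:]) if a != b)

# An offline tracker can hold a single position whenever the signal
# never strays more than delta from it.
signal = [0, 1, 2, 1, 0]
track = [1, 1, 1, 1, 1]
print(is_valid_track(signal, track, delta=1))  # True: constraint satisfied
print(fixed_movement_cost(track))              # 0: no movement at all
```

The example also hints at the competitive-analysis setup: an online tracker seeing the signal one value at a time may be forced to move, while the offline optimum above pays nothing.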
We begin by studying arbitrary metric spaces with the 'pay if you move' cost function of Yi and Zhang (2009) described above, and describe a natural randomized algorithm that achieves an O(log b_Δ) competitive ratio, where b_Δ = max_{x∈X} |B_Δ(x)| is the maximum number of points appearing in any ball of radius Δ. We show that this bound is tight.
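One natural randomized strategy on a finite metric space is to stay put until the proximity constraint is violated and then jump to a uniformly random point of the ball B_Δ(f_i). Whether this matches the analyzed algorithm in every detail is a question for the full paper, so the sketch below (with invented names) is illustrative only:

```python
import random

def random_ball_tracker(points, dist, signal, delta, rng=None):
    """Illustrative randomized tracker on a finite metric space:
    stay put while within delta of the signal; on a violation,
    resample the position uniformly from the ball B_delta(f_i)."""
    rng = rng or random.Random()
    track = []
    a = None
    for f in signal:
        if a is None or dist(a, f) > delta:
            ball = [x for x in points if dist(x, f) <= delta]
            a = rng.choice(ball)
        track.append(a)
    return track

# Example on the points {-10, ..., 10} of the line with the usual metric.
points = list(range(-10, 11))
dist = lambda x, y: abs(x - y)
track = random_ball_tracker(points, dist, [0, 3, -4, 5], delta=2,
                            rng=random.Random(0))
print(track)
```

Resampling uniformly spreads the tracker's position over the whole ball, which is the kind of unpredictability an adversary must pay to exploit; the size b_Δ of the largest ball is exactly the quantity in the competitive bound above.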
We then focus on the metric space ℤ with natural families of monotone cost functions c(x) = x^p for some p ≥ 0. We consider both the expansive case (p ≥ 1) and the contractive case (p < 1), and show that the natural lazy algorithm performs well in the expansive case. In the contractive case, we introduce and analyze a novel deterministic algorithm that achieves a constant competitive ratio depending only on p. Finally, we observe that by slightly relaxing the guarantee provided by the tracker, one can obtain natural analogues of these algorithms that work in continuous metric spaces.
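For intuition on the expansive case, the lazy algorithm on ℤ admits a very short sketch (our own illustrative code, not taken from the paper): move only when the constraint forces it, and then by the least amount that restores it.

```python
def lazy_track(signal, delta):
    """Lazy tracker on the integers: stay put whenever possible; when
    |f_i - a| > delta, move the minimum distance restoring the constraint."""
    a = signal[0]
    track = [a]
    for f in signal[1:]:
        if f - a > delta:
            a = f - delta
        elif a - f > delta:
            a = f + delta
        track.append(a)
    return track

def movement_cost(track, p):
    """Total movement cost under c(x) = x**p."""
    return sum(abs(b - a) ** p for a, b in zip(track, track[1:]))

track = lazy_track([0, 5, 5, 0], delta=1)
print(track)                      # [0, 4, 4, 1]
print(movement_cost(track, p=2))  # 16 + 0 + 9 = 25
```

The lazy rule never moves farther than the signal itself just moved, a conservatism one would expect to pay off when c grows at least linearly; for contractive costs (p < 1), where one large jump is cheaper than many small ones, a different strategy is needed, as the abstract indicates.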