Given a graph G=(V,E) with strictly positive integer weights ω_i on the vertices i∈V, an interval coloring of G is a function I that assigns an interval I(i) of ω_i consecutive integers (called colors) to each vertex i∈V so that I(i)∩I(j)=∅ for all edges {i,j}∈E. The interval coloring problem is to determine an interval coloring that uses as few colors as possible. Assuming that a strictly positive integer weight δ_ij is associated with each edge {i,j}∈E, a bandwidth coloring of G is a function c that assigns an integer (called a color) to each vertex i∈V so that |c(i)−c(j)|≥δ_ij for all edges {i,j}∈E. The bandwidth coloring problem is to determine a bandwidth coloring with minimum difference between the largest and the smallest colors used. We prove that an optimal solution of the interval coloring problem can be obtained by solving a series of bandwidth coloring problems. Computational experiments demonstrate that such a reduction can help to solve larger instances or to obtain better upper bounds on the optimal solution value of the interval coloring problem.
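To make the two feasibility conditions concrete, the following is a minimal sketch of checkers for both definitions. The graph, the vertex weights ω, the edge weights δ, and the colorings shown in the usage comments are hypothetical illustration data, not instances from the computational experiments.

```python
def is_interval_coloring(edges, omega, start):
    """Check the interval-coloring condition: vertex i receives the
    interval of omega[i] consecutive colors beginning at start[i],
    and the intervals of adjacent vertices must be disjoint."""
    interval = {v: set(range(start[v], start[v] + omega[v])) for v in omega}
    return all(not (interval[i] & interval[j]) for i, j in edges)


def is_bandwidth_coloring(edges, delta, c):
    """Check the bandwidth-coloring condition: the colors of adjacent
    vertices i, j must differ by at least delta[(i, j)]."""
    return all(abs(c[i] - c[j]) >= delta[(i, j)] for i, j in edges)


# Hypothetical example: a path 0-1-2.
edges = [(0, 1), (1, 2)]

# Interval coloring with vertex weights 2, 1, 2: intervals
# {1,2}, {3}, {1,2} are pairwise disjoint along the edges,
# using 3 colors in total.
assert is_interval_coloring(edges, {0: 2, 1: 1, 2: 2}, {0: 1, 1: 3, 2: 1})

# Bandwidth coloring with edge weights 2 and 1: colors 1, 3, 2
# satisfy |1-3| >= 2 and |3-2| >= 1, with span 3 - 1 = 2.
assert is_bandwidth_coloring(edges, {(0, 1): 2, (1, 2): 1}, {0: 1, 1: 3, 2: 2})
```

Minimizing the number of colors used in the first checker, and the span max−min of the colors in the second, gives the two optimization problems compared in the abstract.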