Dynamic programming is one of the fundamental techniques for solving optimization problems. In this paper we propose a general framework that can be used to decrease the time and space complexity of such algorithms by a logarithmic factor. The framework is based on word encoding, i.e., representing subsolutions as bits in an integer, so that word parallelism can be exploited when evaluating the dynamic programming recursion. Using this encoding, the subset-sum problem can be solved in O(nb/log(b)) time and O(b/log(b)) space, where n is the number of integers given and b is the target sum. The knapsack problem can be solved in O(nm/log(m)) time and O(m/log(m)) space, where n is the number of items and m = max{b, z} is the maximum of the capacity b and the optimal solution value z. The problem of finding a path of a given length b in a directed acyclic graph G = (V, E) can be solved in O(|E|b/log(b)) time and O(|V|b/log(b)) space. Several further examples are given to show the generality of the technique. Extensive computational experiments demonstrate that the results are not only of theoretical interest but lead to algorithms that are up to two orders of magnitude faster than their predecessors. This is a surprising observation, as the increase in speed is larger than the word size of the processor.
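
As a rough illustration of the word-encoding idea (a minimal sketch, not the exact algorithm developed in the paper), the following Python fragment decides subset-sum feasibility with a bit-parallel dynamic program. Python's arbitrary-precision integers play the role of a wide machine word, so each shift-and-or step updates many DP cells at once; the function and parameter names are placeholders chosen for this example.

```python
def subset_sum_reachable(values, b):
    """Return True if some subset of `values` sums exactly to `b`.

    Word-parallel DP sketch: bit i of `bits` is 1 iff sum i is
    reachable using some subset of the items processed so far.
    Each item is handled by one shift and one OR over a bit vector
    of length b+1, i.e. O(b / word_size) word operations per item.
    """
    mask = (1 << (b + 1)) - 1   # keep only the sums 0..b
    bits = 1                    # initially only the empty sum 0 is reachable
    for v in values:
        bits = (bits | (bits << v)) & mask
    return (bits >> b) & 1 == 1


# Example: 3 + 7 = 10, so this prints True.
print(subset_sum_reachable([3, 7, 12], 10))
```

Each iteration processes Θ(w) entries of the classical O(nb) table per word operation (w being the word size), which is the mechanism behind the O(nb/log(b)) time and O(b/log(b)) space bounds stated above.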