In their seminal article, Droste, Jansen, and Wegener (2002) consider a basic direct-search heuristic with a global search operator, namely the so-called (1+1) Evolutionary Algorithm ((1+1)EA). They present the first theoretical analysis of the (1+1)EA's expected runtime for the class of linear functions over the search space $\{0,1\}^n$. In a rather long and involved proof they show that, for any linear function, the expected runtime is $O(n \log n)$, i.e., that there are two constants $c$ and $n'$ such that, for $n \ge n'$, the expected number of iterations until a global optimum is generated is bounded above by $c \cdot n \log_2 n$.
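To make the setting concrete, the following is a minimal sketch of the (1+1)EA maximizing a linear function with positive weights; the weight choice, the seeding, and the helper name one_plus_one_ea are illustrative assumptions, not the authors' implementation:

```python
import random

def one_plus_one_ea(weights, max_iters=10**6, seed=0):
    """(1+1) EA maximizing the linear function f(x) = sum_i w_i * x_i.

    Assumes strictly positive weights, so the unique optimum is the
    all-ones string.  Returns the number of iterations until the
    optimum is first generated (or max_iters if it is not reached).
    """
    rng = random.Random(seed)
    n = len(weights)

    def f(y):
        return sum(w * b for w, b in zip(weights, y))

    x = [rng.randint(0, 1) for _ in range(n)]   # initial point, uniform over {0,1}^n
    fx, optimum = f(x), sum(weights)
    if fx == optimum:
        return 0
    for t in range(1, max_iters + 1):
        # standard bit mutation: flip each bit independently with probability 1/n
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        fy = f(y)
        if fy >= fx:                            # elitist selection: keep the offspring if not worse
            x, fx = y, fy
        if fx == optimum:
            return t
    return max_iters

# Example run on BinVal-like weights w_i = i + 1 with n = 64 bits.
print(one_plus_one_ea([i + 1 for i in range(64)]))
```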
However, neither $c$ nor $n'$ is specified; both would be rather large. Here we reconsider this optimization scenario to demonstrate the potential of an analytical method that makes use of the distribution of the evolving candidate solution over the search space $\{0,1\}^n$. In fact, an invariance property of this distribution is proved, which is then used to obtain a significantly improved bound on the drift, namely the expected one-step change of a potential function, here the number of bits set correctly. Finally, this better estimate of the drift enables an upper bound of $3.8\,n\log_2 n + 7.6\,\log_2 n$ on the expected number of iterations for $n \ge 2$.
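As a purely illustrative sanity check, not part of the analysis in the paper, one can compare average observed runtimes of the sketch above against this bound; the weight choice and the number of runs below are arbitrary assumptions:

```python
import math
import statistics

# Uses one_plus_one_ea from the sketch above.
def compare_with_bound(n, runs=20):
    weights = [i + 1 for i in range(n)]   # illustrative positive weights
    observed = [one_plus_one_ea(weights, seed=s) for s in range(runs)]
    bound = 3.8 * n * math.log2(n) + 7.6 * math.log2(n)
    print(f"n={n:4d}: mean runtime over {runs} runs = {statistics.mean(observed):8.0f}, "
          f"bound = {bound:8.0f}")

for n in (16, 32, 64, 128):
    compare_with_bound(n)
```

The empirical means estimate the expected runtime only roughly, but they should stay below the stated bound for these problem sizes.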