Let a(1) = 1.
Define a(n) = a(n-1) + n^k, where k is the smallest positive integer such that a(n-1) + n^k has more decimal digits than a(n-1).
Is it always true that a(n) has n digits?
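For concreteness, here is a short Python sketch (my own illustration, not part of the question) that computes the first few terms by this rule and checks empirically whether a(n) has n digits:

```python
def a_sequence(n_max):
    """a(1) = 1; a(n) = a(n-1) + n**k, where k is the smallest positive
    integer making the decimal digit count strictly increase."""
    terms = [1]
    for n in range(2, n_max + 1):
        prev = terms[-1]
        digits_prev = len(str(prev))
        k = 1
        # raise k until a(n-1) + n**k gains at least one decimal digit
        while len(str(prev + n**k)) <= digits_prev:
            k += 1
        terms.append(prev + n**k)
    return terms

if __name__ == "__main__":
    for n, value in enumerate(a_sequence(12), start=1):
        # does a(n) have exactly n digits?
        print(n, value, len(str(value)) == n)
```

This gives a(2) = 17 and a(3) = 260, for example, since 1 + 2^4 = 17 is the first power of 2 pushing 1 to two digits, and 17 + 3^5 = 260 is the first power of 3 pushing 17 to three digits. The open point is whether the digit count can ever jump by more than one, which is exactly what the check in the loop above probes.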