Mathematics is built on axioms that don't mention numbers like these at all. That means things like decimal numbers need definitions. And the definition of decimals literally includes the rule that if a decimal has nothing but nines from some point on after the dot, it is the same number you get by increasing the digit in front of the first nine by one.
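(For concreteness, here is the standard convention I have in mind; this exact notation isn't from the comment, just the usual way a decimal expansion is defined as an infinite series:)

\[
0.d_1 d_2 d_3 \ldots \;:=\; \sum_{n=1}^{\infty} \frac{d_n}{10^n}, \qquad d_n \in \{0,1,\ldots,9\}.
\]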
There are versions of math where that isn't true, with infinitesimals that are not equal to zero. So I think it is an axiom rather than a provable conclusion.
That's not how it's defined. 0.99... is the limit of a sequence, and it is precisely 1. 0.99... is a sum of infinitely many numbers (0.9 + 0.09 + 0.009 + ...), and we don't know how to do that unless it's defined. It is defined as the limit of the partial sums 0.9, 0.99, 0.999, ..., and the limit of this sequence is 1. Sorry if this came out rude. It is more of a general comment.
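(Written out, using the standard geometric-series formula for the partial sums, the argument is a quick sketch like this:)

\[
0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^n}
\;=\; \lim_{N\to\infty} \sum_{n=1}^{N} \frac{9}{10^n}
\;=\; \lim_{N\to\infty} \left(1 - \frac{1}{10^{N}}\right)
\;=\; 1.
\]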
I study mathematics at university and I remember it being in the definition, but since it follows from the limit of the sum anyway, it was probably just there for clarity's sake. So I guess we're both right...