Taylor Series Intuition for Deep Learning

How are we getting u^T in the equation? Should it not be u^1, u^2, …?

Basically, the 3rd term should have two u's (the 1st term has zero u's, the 2nd term one, the 3rd term two, and so on…).
And the 3rd term does have two u's: one at the beginning and one at the end (after L):
u^T ∇²L(w) u
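
For context, here is a minimal sketch of the second-order expansion being discussed, assuming L is the loss, w the current weight vector, and u a small change in the weights (the exact symbols may differ from the original lecture):

$$
L(w + u) \approx L(w) + u^T \nabla L(w) + \frac{1}{2!}\, u^T \nabla^2 L(w)\, u
$$

Here ∇L(w) is the gradient (a vector), ∇²L(w) is the Hessian (a matrix of second derivatives), and the superscript T denotes transpose, not a power.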

Now, why it is not written as u^2 directly but split into u^T and u involves an understanding of vector notation, which I don't have :).

u^T u = the sum of the squares of the entries of the vector u,
which is the vector analogue of squaring a number.
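
A quick NumPy check of this (a minimal sketch; the vector u and the symmetric matrix H below are made-up illustrative values, with H standing in for a Hessian):

```python
import numpy as np

# Illustrative vector u and a symmetric matrix H standing in for a Hessian.
u = np.array([1.0, 2.0, 3.0])
H = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 4.0]])

# u^T u: a scalar, the sum of the squares of u's entries.
print(u @ u)           # 1^2 + 2^2 + 3^2 = 14.0
print(np.sum(u ** 2))  # same value, 14.0

# u^T H u: also a scalar (a quadratic form), the shape the 3rd Taylor term takes.
print(u @ H @ u)
```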

Thank you, but my comment was more about how to write the expression in the first place. By this I mean: for a Taylor series expansion involving vectors, how do we know when to write u^T and when to write u, and where to place them? For example, if I have to write the 4th and 5th terms as well, what will the expression look like?
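
Not from the original thread, but one way to sketch an answer: writing the expansion in index notation shows where each u goes. The k-th order term sums the k-th partial derivatives against k copies of u's entries, so only the 1st- and 2nd-order terms collapse into the familiar u^T ∇L(w) and u^T ∇²L(w) u matrix forms; the 4th and 5th terms need third- and fourth-order derivative tensors and are usually left in index form:

$$
L(w + u) = L(w)
+ \sum_i \frac{\partial L}{\partial w_i} u_i
+ \frac{1}{2!} \sum_{i,j} \frac{\partial^2 L}{\partial w_i\, \partial w_j} u_i u_j
+ \frac{1}{3!} \sum_{i,j,k} \frac{\partial^3 L}{\partial w_i\, \partial w_j\, \partial w_k} u_i u_j u_k
+ \cdots
$$

The single sum is exactly u^T ∇L(w) and the double sum is u^T ∇²L(w) u; beyond second order there is no simple "u^T … u" way to write it, which is why the transpose only shows up in the low-order terms.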