Error in notes: Dropout at test time

I am unable to understand why we multiply the output by p at test time. He says it's because the output is only p% reliable.

Since the final weights of the full network are learnt from many dropped-out sub-networks, does saying it is not 100% reliable still hold true?
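For what it's worth, here is a small numerical sketch (my own, not from the notes) of the usual expectation argument: if each unit is kept with probability p during training, then its expected training-time output is p times its activation, and multiplying by p at test time matches that expectation. The value of p and the toy activations below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5          # keep probability (assumed value for illustration)
x = np.ones(10)  # toy activations of one layer

# Training: each unit is kept with probability p, zeroed otherwise.
# Average the output over many random dropout masks to estimate
# the expected training-time output of each unit.
masks = rng.random((100_000, x.size)) < p
train_avg = (masks * x).mean(axis=0)

# Test: keep all units, but scale the output by p so it matches
# the expected output the next layer saw during training.
test_out = p * x

print(np.allclose(train_avg, test_out, atol=0.01))
```

So the scaling is not really about the weights being "unreliable"; it keeps the test-time activation magnitudes consistent with what the following layers were trained on.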