How to properly initialize ReLU Neurons to avoid saturation?

I'm a bit confused about whether to use a high positive bias value for ReLU to avoid saturation on negative inputs, or, as explained in the later video, to use He initialisation, which sets the initial weights and biases based on the number of input neurons in the hidden layer.
My doubt is: suppose the number of input neurons in a hidden layer is high (the value of n is large); then the bias is initialised to a very small value, so how do we ensure there is no saturation with such a small positive bias?
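
To make my doubt concrete, here is a tiny NumPy sketch of the initialisation as I understood it from the video (the layer sizes are just placeholder numbers I picked):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 1024, 256            # n_in is the fan-in; values are just examples

# He initialisation: weights drawn from N(0, 2 / n_in),
# so the larger n_in is, the smaller the typical weight becomes.
W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))

# The bias, as I understood it, starts at the same small scale,
# so for a large n_in it begins very close to 0 rather than at a
# clearly positive value.
b = rng.normal(0.0, np.sqrt(2.0 / n_in), size=n_out)

print(W.std(), b.std())            # both around 0.044 for n_in = 1024
```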

You can check this paper to understand how He initialization helps avoid the dying ReLU problem:

You can also check this paper for a general understanding of the problem:
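
If it helps to see the mechanism numerically before reading them, here is a rough NumPy sketch of the variance argument behind He initialisation (the layer width and depth are arbitrary choices of mine, not taken from either paper): with the sqrt(2/n) weight scale and biases initialised to 0, the activation scale stays roughly constant through a stack of ReLU layers, whereas a smaller scale shrinks the activations towards the dead region.

```python
import numpy as np

def activation_std_after(depth, weight_std_fn, n=512, batch=256, seed=0):
    """Push random data through `depth` ReLU layers and return the final
    activation standard deviation. weight_std_fn(n) gives the per-layer
    weight std as a function of the fan-in n."""
    rng = np.random.default_rng(seed)
    a = rng.normal(size=(batch, n))
    for _ in range(depth):
        W = rng.normal(0.0, weight_std_fn(n), size=(n, n))
        a = np.maximum(a @ W, 0.0)        # ReLU, bias initialised to 0
    return a.std()

# He scale: the activation std stays roughly constant with depth...
print(activation_std_after(30, lambda n: np.sqrt(2.0 / n)))
# ...while a smaller scale (e.g. 1/sqrt(n)) lets it decay towards zero,
# which is what pushes ReLU units into the flat, "dead" region.
print(activation_std_after(30, lambda n: np.sqrt(1.0 / n)))
```

The point relative to the question above: the bias does not need to start positive for the units to stay alive at initialisation; keeping the pre-activation variance stable via the weight scale is what does the work.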

Thank you. It was a good read.