Cross entropy = -sum(yi_true * log(yi_hat))
I have two doubts about it:

When using the cross entropy loss, -sum(yi_true * log(yi_hat)), suppose our model trains to the point where it predicts yi_hat = 0 for the true class. Won't the loss function then give an error, since log(0) is undefined?
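A quick sketch of what I mean, using NumPy (the epsilon value here is just an illustrative choice, not any particular library's default):

```python
import numpy as np

y_true = np.array([1.0, 0.0, 0.0])
y_hat = np.array([0.0, 0.7, 0.3])  # model assigns probability 0 to the true class

# Naive cross entropy: log(0) is -inf, so the loss blows up to inf
with np.errstate(divide="ignore"):
    naive_loss = -np.sum(y_true * np.log(y_hat))
print(naive_loss)  # inf

# Common workaround: clip predictions away from 0 before taking the log
eps = 1e-12  # hypothetical epsilon, chosen for illustration
clipped_loss = -np.sum(y_true * np.log(np.clip(y_hat, eps, 1.0)))
print(clipped_loss)  # finite, roughly -log(eps)
```

So in practice the divergence is real, and implementations typically sidestep it by clipping or by computing the log-softmax in a numerically stable way.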

It is further stated that the loss only accounts for the true classes: for yi_true = 0, the corresponding term vanishes. Now suppose the model predicts yi_hat = 0.9999 for a class with yi_true = 0. Technically this seems like an error, but our cross entropy function doesn't account for it. How can we explain this?
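A quick numeric check of the scenario I'm describing, assuming the predictions come from a softmax and therefore sum to 1 (a two-class example):

```python
import numpy as np

# If a wrong class gets 0.9999, the probabilities must still sum to 1,
# so the true class can receive at most 0.0001.
y_true = np.array([1.0, 0.0])
y_hat = np.array([0.0001, 0.9999])

loss = -np.sum(y_true * np.log(y_hat))
print(loss)  # ~9.21: large, even though the 0.9999 term itself contributes nothing
```

So even though the yi_true = 0 term vanishes, the mistake still shows up indirectly: mass placed on the wrong class is mass taken away from the true class, and -log of the true-class probability grows accordingly.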