Why not self.pred(X) directly - Module 3 of Sigmoid Neuron

In Python, while displaying the loss, the mentor makes a change in the Sigmoid class: he calls self.sigmoid(self.perceptron(X)) to compute Y_pred. Why doesn't he call self.pred(X) instead, since pred() is already defined? Or does pred() need to be defined before the fit() function in the class for that to work?

Implementation-wise, it should be fine, since the predict function also just calls self.sigmoid(self.perceptron(X)).

Terminology-wise (which might justify how Sir used the functions):
Loss is calculated while the model is being built, and the loss calculation is used to tune the parameters.
Prediction, on the other hand, is done once the parameters have already been tuned by loss minimisation and the model is ready to make predictions on new data.
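To illustrate the point, here is a minimal sketch of the class in question (the class name, attribute names like w and b, and the training loop details are my assumptions, not the course's exact code). It shows that inside fit(), calling self.predict(X) and calling self.sigmoid(self.perceptron(X)) compute exactly the same values:

```python
import numpy as np

class SigmoidNeuron:
    """Minimal sketch; names and the update rule are assumptions."""

    def perceptron(self, x):
        # weighted sum of inputs plus bias
        return np.dot(x, self.w) + self.b

    def sigmoid(self, z):
        return 1.0 / (1.0 + np.exp(-z))

    def predict(self, X):
        # predict() is just a thin wrapper around the same two calls
        return np.array([self.sigmoid(self.perceptron(x)) for x in X])

    def fit(self, X, Y, epochs=10, lr=0.5):
        self.w = np.zeros(X.shape[1])
        self.b = 0.0
        loss = {}
        for epoch in range(epochs):
            dw, db = 0.0, 0.0
            for x, y in zip(X, Y):
                # same result as self.predict on a single sample
                y_pred = self.sigmoid(self.perceptron(x))
                dw += (y_pred - y) * y_pred * (1 - y_pred) * x
                db += (y_pred - y) * y_pred * (1 - y_pred)
            self.w -= lr * dw
            self.b -= lr * db
            # either expression gives identical loss values
            Y_pred = self.predict(X)
            loss[epoch] = np.mean((Y_pred - Y) ** 2)
        return loss
```

On the ordering question: Python looks up methods at call time, not at class-definition time, so pred() does not need to appear before fit() in the class body for fit() to call it.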

@sanjayk
Also, I am trying to run this piece of code to see the loss against a range of hyper-parameter values, but I am getting this error: TypeError: float() argument must be a string or a number, not 'dict_values'

if display_loss:
    plt.plot(loss.values())
    plt.xlabel('Epochs')
    plt.ylabel('Mean Squared Error')
    plt.show()

@sanjayk
I tried changing the plot line to the following, but the error message still stays the same:

plt.plot(np.array([loss.values()]).astype(float))

Change this to:
list(loss.values())

and re-execute the code block.
Hopefully this will work.
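The reason this works: in Python 3, dict.values() returns a dict_values view object, not a list. Wrapping the view in np.array([...]) produces a one-element array of dtype object, so .astype(float) ends up calling float() on the view itself and raises exactly the TypeError from the question. Converting to a list first hands NumPy (and matplotlib) a plain numeric sequence. A small demo, using a made-up loss dict:

```python
import numpy as np

loss = {0: 0.9, 1: 0.5, 2: 0.2}  # hypothetical epoch -> loss mapping

# dict.values() is a view object, not a list
values_view = loss.values()

# np.array([values_view]) wraps the view in a 1-element object array,
# so .astype(float) tries float(dict_values) and raises the TypeError
try:
    np.array([values_view]).astype(float)
except TypeError as e:
    print(e)

# Converting to a list first gives a plain numeric sequence
arr = np.array(list(loss.values()), dtype=float)
print(arr)  # a clean float array that plt.plot() accepts
```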

@sanjayk

It sure did…thanks!