One possible cause of nan and inf is passing x with a large absolute value into softmax cross entropy: np.exp(x) overflows to inf, and that inf then propagates through the loss as nan.
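For reference, the standard mitigation is to subtract the row-wise maximum before exponentiating, which leaves the softmax unchanged but keeps the exponentials bounded. A minimal NumPy sketch (the function name and shapes are my own illustration, not the original training code):

import numpy as np

def stable_softmax_cross_entropy(x, labels):
    # x: (batch, classes) raw scores; labels: (batch,) integer class ids.
    # Subtracting the per-row max changes nothing mathematically but
    # keeps np.exp from overflowing to inf when |x| is large.
    shifted = x - x.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

x = np.array([[1000.0, -1000.0]])
labels = np.array([0])
print(stable_softmax_cross_entropy(x, labels))  # ~0.0 instead of nan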
They say normalizing the hidden (intermediate) layers is a good idea.
It didn't work.
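For completeness, that normalization usually means something batch-normalization-like. A minimal sketch of what it looks like (the function name and epsilon are my additions):

import numpy as np

def normalize_hidden(h, eps=1e-5):
    # Zero-mean, unit-variance normalization per feature, in the style
    # of batch normalization; eps keeps the division finite when a
    # feature has zero variance within the batch.
    mean = h.mean(axis=0, keepdims=True)
    var = h.var(axis=0, keepdims=True)
    return (h - mean) / np.sqrt(var + eps)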
I've heard that sometimes a lower learning rate can fix it.
Didn't work for me.
If there are zeros in the data, log10(0) evaluates to -inf and produces the following warning:

train_snack.py:77: RuntimeWarning: divide by zero encountered in log10
  P[m, :] = np.log10(np.absolute(X[m, :N/2])) # convert to logarithmic power spectrum (256 points)

Once that -inf enters the computation, it turns into nan downstream. Deleting all the samples containing zeros solved the problem!
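Two ways to guard against this, sketched with toy data (the array and the epsilon value are my own, not from train_snack.py):

import numpy as np

# Hypothetical toy spectra; a single zero is enough to trigger the warning.
X = np.array([[0.0, 1.0, 10.0],
              [2.0, 4.0, 8.0]])

# Option 1: drop every sample that contains a zero, as done above.
X_clean = X[np.all(X != 0, axis=1)]

# Option 2: floor the magnitudes with a tiny epsilon before the log,
# which keeps every sample but clips -inf to log10(eps).
eps = 1e-12
P = np.log10(np.maximum(np.absolute(X), eps))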