Why we should be Deeply Suspicious of BackPropagation

Medium, machine learning, ML, back propagation, neural networks, GAN

That something else, call it imagination or call it dreaming, does not require validation against immediate reality. The closest incarnation we have today is the generative adversarial network (GAN). A GAN consists of two networks, a generator and a discriminator. One can think of the discriminator as a neural network that acts in concert with the objective function; that is, it validates the internal generator network against reality. The generator, in turn, is an automaton that recreates an approximation of reality. A GAN is trained with back-propagation, and it does perform unsupervised learning. So perhaps unsupervised learning doesn't require an objective function; however, it may still need back-propagation.
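To make the two-network setup concrete, here is a minimal sketch of a GAN training loop, assuming PyTorch and a toy 1-D Gaussian standing in for "reality" (none of this code comes from the article). The discriminator validates the generator against real samples, the generator learns to approximate reality, and, as the excerpt notes, both are still trained with back-propagation and no labels are ever used.

```python
# Minimal GAN sketch (assumed example, not from the article): the generator learns
# to mimic samples from N(3, 1); the discriminator acts as a learned check against
# reality. Both networks are updated via back-propagation; the data is unlabelled.
import torch
import torch.nn as nn

torch.manual_seed(0)

def sample_real(batch_size):
    # "Reality": unlabelled samples from a Gaussian centred at 3.0
    return torch.randn(batch_size, 1) + 3.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # Discriminator step: distinguish real samples from generated ones
    real = sample_real(64)
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake), torch.zeros(64, 1))
    d_opt.zero_grad()
    d_loss.backward()   # back-propagation through the discriminator
    d_opt.step()

    # Generator step: produce samples the discriminator accepts as real
    fake = generator(torch.randn(64, 8))
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()   # back-propagation through discriminator into the generator
    g_opt.step()

print(generator(torch.randn(1000, 8)).mean().item())  # should drift toward 3.0
```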

via https://medium.com/intuitionmachine/the-deeply-suspicious-nature-of-backpropagation-9bed5e2b085e