What AI forgets could kill us, but new research is helping it remember
... Something is going on -- or not going on, as the case may be -- in the training of artificial neural networks that is causing huge gaps in cognition. The networks forget everything they previously learned while learning new things, a failure known as catastrophic forgetting -- and then they can freeze, unable to learn anything more. ...
The researchers eschewed a conventional neural network -- one that constantly adjusts its synapses (the links between neurons) until it can find a solution -- for a 'spiking' network, which they believed most closely resembles the human brain.
A 'spiking' network sends an output only after it has accumulated enough incoming signals over time, so it moves far less data and uses much less power and bandwidth, according to the researchers. That same dynamic lets it re-activate the neurons that were involved in learning old tasks. It seemed to work.
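The article does not give the researchers' neuron model, but the behavior described above (accumulating input over time and firing only once a threshold is crossed) is what a classic leaky integrate-and-fire neuron does. Below is a minimal, purely illustrative Python sketch; the function name, constants, and input streams are assumptions, not the researchers' code.

```python
import numpy as np

# Minimal leaky integrate-and-fire neuron (illustrative only). The neuron
# accumulates incoming signals over time and emits a spike only when its
# membrane potential crosses a threshold, so it transmits an occasional
# single event instead of a dense value at every step.

def simulate_lif(inputs, tau=20.0, threshold=1.0, dt=1.0):
    """Return the time steps at which the neuron spikes for a stream of input currents."""
    v = 0.0                # membrane potential
    spikes = []
    for t, i_in in enumerate(inputs):
        v += dt * (-v / tau + i_in)   # leak toward zero, integrate input
        if v >= threshold:            # enough evidence accumulated
            spikes.append(t)
            v = 0.0                   # reset after firing
    return spikes

rng = np.random.default_rng(0)
weak = rng.uniform(0.0, 0.05, 200)    # weak drive: few or no spikes
strong = rng.uniform(0.0, 0.15, 200)  # stronger drive: occasional spikes
print("weak input spikes:  ", simulate_lif(weak))
print("strong input spikes:", simulate_lif(strong))
```

The sparsity is the point: information is carried by when spikes happen rather than by continuous activations, which is why the researchers describe the approach as moving less data.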
The spiking neural network was capable of performing both tasks after undergoing sleeplike phases. ...
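The excerpt does not explain what happens during those sleeplike phases. In work of this kind, a sleep phase is typically implemented by driving the spiking network with unstructured noise instead of task data while a local, Hebbian-style rule nudges the weights, so that neurons tied to older tasks fire again and their connections are reinforced. The following is a heavily simplified, hypothetical sketch of one such noise-driven pass over a single weight matrix; none of it is taken from the published work.

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 30, 10
threshold, lr = 1.0, 0.01
W = rng.normal(0.0, 0.3, size=(n_out, n_in))   # stands in for weights learned on earlier tasks

def sleep_phase(W, steps=500, noise_rate=0.2):
    """One noise-driven 'sleep' pass with a local Hebbian-style update.

    Each step presents random input spikes rather than task data. Output
    units whose drive crosses the threshold are treated as spiking; weights
    between co-active input/output pairs are strengthened, and weights into
    a spiking output from silent inputs are slightly weakened.
    """
    W = W.copy()
    for _ in range(steps):
        pre = (rng.random(n_in) < noise_rate).astype(float)    # random input spikes
        post = (W @ pre >= threshold).astype(float)            # output spikes
        W += lr * (np.outer(post, pre) - 0.5 * np.outer(post, 1.0 - pre))
        np.clip(W, -2.0, 2.0, out=W)                           # keep weights bounded
    return W

W_after = sleep_phase(W)
print("mean weight change after the sleep pass:", float(np.abs(W_after - W).mean()))
```

The intuition is that no new labels are involved: the offline activity alone decides which connections get strengthened, which is what lets older tasks resurface without retraining on their data.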
More recently, researchers from Ohio State University steered clear of sleep while tackling the same problem of catastrophic forgetting in deep-learning neural nets. ...
In what may be one of the more curious ironies of our times, Shroff and his colleagues found that algorithms, much like humans, remember much better when fed a succession of very different tasks rather than a series of similar ones. ...
The Ohio State researchers found that dissimilar tasks should be introduced early in the continual-learning process if the AI is to stay able to learn new things as well as tasks that resemble the old ones. ...
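The excerpt gives the finding but not the experimental setup. A simple way to see what is being measured is a toy continual-learning harness: train one model on a sequence of tasks with no replay, then check how well it still handles every task, comparing an ordering that front-loads near-duplicate tasks against one that introduces a very different task early. The sketch below uses synthetic two-dimensional classification tasks and plain logistic regression on random features; the task construction, orderings, and constants are assumptions for illustration, not the Ohio State experiments, and this toy setting will not necessarily reproduce their result.

```python
import numpy as np

rng = np.random.default_rng(2)
R = rng.normal(size=(2, 64))               # fixed random features give the toy model some capacity

def features(X):
    return np.maximum(X @ R, 0.0)          # random ReLU features

def make_task(angle, n=400):
    """Binary task: two Gaussian blobs separated along a direction rotated by
    `angle` radians. Nearby angles give similar tasks; distant angles, dissimilar ones."""
    d = np.array([np.cos(angle), np.sin(angle)])
    y = rng.integers(0, 2, n)
    X = rng.normal(0.0, 0.6, (n, 2)) + np.outer(2 * y - 1, 1.5 * d)
    return features(X), y

def train_sequentially(angles, epochs=50, lr=0.1):
    """SGD logistic regression trained on one task after another (no replay),
    then evaluated on every task to see how much was retained."""
    tasks = [make_task(a) for a in angles]
    w, b = np.zeros(R.shape[1]), 0.0
    for X, y in tasks:
        for _ in range(epochs):
            p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
            w -= lr * X.T @ (p - y) / len(y)
            b -= lr * np.mean(p - y)
    return np.array([np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
                     for X, y in tasks])

similar_first    = [0.0, 0.15, 0.3, 1.3, 1.45]   # near-duplicate tasks first
dissimilar_first = [0.0, 1.45, 0.15, 1.3, 0.3]   # a very different task introduced early
print("similar-first    per-task accuracy:", train_sequentially(similar_first).round(2))
print("dissimilar-first per-task accuracy:", train_sequentially(dissimilar_first).round(2))
```

Task "similarity" here is just the angle between the class-separating directions; the researchers' notion of similarity, and their models, are far richer.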
See the full story here: https://www.zdnet.com/article/what-ai-forgets-could-kill-us-but-new-research-is-helping-it-remember/