... In other words, if you want to teach an existing deep learning model something new, you'll likely have to retrain it from the ground up — otherwise, according to the research, the artificial neurons in its proverbial mind sink to a value of zero. The result is a loss of "plasticity," or the ability to learn at all. ...
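To make that "neurons sink to zero" idea concrete, here is a minimal NumPy sketch — not code from the paper, just an illustration with made-up numbers — of why a ReLU neuron whose output has collapsed to zero can never recover under plain gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ReLU neuron: out = max(0, w·x + b). If training pushes its
# pre-activation negative for every input (here we force that with a
# very negative bias), the neuron outputs zero everywhere.
X = rng.normal(size=(100, 4))   # 100 illustrative inputs
w = rng.normal(size=4)
b = -50.0                       # drives the unit "dead"

pre = X @ w + b
out = np.maximum(pre, 0.0)
print("neuron outputs zero on all inputs:", bool(np.all(out == 0.0)))

# The ReLU derivative is 0 wherever pre < 0, so whatever gradient
# flows in from the loss, the update to w and b is exactly zero --
# the neuron is stuck and contributes nothing to future learning.
mask = (pre > 0).astype(float)          # ReLU derivative per input
upstream = rng.normal(size=100)         # arbitrary incoming gradient
grad_w = X.T @ (upstream * mask) / len(X)
grad_b = float(np.mean(upstream * mask))
w2, b2 = w - 0.1 * grad_w, b - 0.1 * grad_b
print("weights changed after SGD step:", not (np.allclose(w, w2) and b == b2))
```

One dead neuron is harmless; the paper's observation is that under long-running continual training, more and more units end up in this state, which is what drains the network's plasticity.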
And training advanced AI models, as the researchers point out, is a cumbersome and wildly expensive process — making this a major financial obstacle for AI companies, which burn through a ton of cash as it is. ...
This phenomenon of plasticity loss is also a major barrier between current AI models and the hypothesized "artificial general intelligence," a theoretical AI that would be broadly as intelligent as humans. ...
"A solution to continual learning is literally a billion-dollar question," Dohare told New Scientist. "A real, comprehensive solution that would allow you to continuously update a model would reduce the cost of training these models significantly."
See the full story here: https://futurism.com/the-byte/ai-models-rebuilding-problem