In recent years, AI models have been getting more capable, but also much, much larger. The memory footprint of neural network weights has grown steadily, with some models now reaching 500 billion parameters, or even trillions…
Article source: https://research.ibm.com/blog/how-can-analog-in-memory-computing-power-transformer-models