Behdad Youssefi, co-founder and CEO of Areanna AI Inc. (Santa Clara, Calif.), has designed a specialized SRAM with additional circuits to support matrix multiplication, storage, and quantization. The company claims performance as high as 100 TOPS/W.
Compute-in-memory is becoming a popular approach to building machine-learning accelerators, although it is often implemented in simpler, denser memories such as non-volatile flash. Nonetheless, the use of specialized, customizable SRAM could allow superior scaling down to leading-edge digital processes.
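To make the technique concrete, the sketch below shows the arithmetic a compute-in-memory array of this kind typically performs: weights and activations are quantized to low-precision integers, the array accumulates integer dot products (in hardware, along the bit lines), and the result is rescaled back to floating point. This is a generic illustration of quantized in-memory matrix multiplication, not Areanna's actual circuit; the shapes, bit width, and `quantize` helper are assumptions for the example.

```python
import numpy as np

def quantize(x, n_bits=8):
    """Symmetric linear quantization to signed integers (illustrative helper)."""
    scale = np.max(np.abs(x)) / (2 ** (n_bits - 1) - 1)
    q = np.round(x / scale).astype(np.int32)
    return q, scale

# Hypothetical layer: weights sit in the SRAM array, activations drive the
# word lines, and each bit line accumulates one column's dot product.
rng = np.random.default_rng(0)
weights = rng.standard_normal((64, 32)).astype(np.float32)
acts = rng.standard_normal((1, 64)).astype(np.float32)

qw, sw = quantize(weights)
qa, sa = quantize(acts)

# Integer matrix multiply -- the operation the in-memory array performs --
# followed by a single dequantization step back to floating point.
acc = qa @ qw            # int32 accumulation
out = acc * (sa * sw)    # rescale by the product of the two scales

ref = acts @ weights
err = np.max(np.abs(out - ref))  # small residual quantization error
```

Because multiply-accumulate happens where the weights are stored, no weight traffic crosses a memory bus, which is where claimed efficiency figures such as 100 TOPS/W come from.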
However, it is still early days for the company, which is showing slide decks to venture-capital firms with plans to raise the money to design and tape out a test chip in early 2020. It remains unclear whether the company will pursue an IP-licensing business model or develop its own silicon.
Related links and articles: