Samsung presents AI processing-in-memory options

Technology News
Samsung Electronics presented several processing-in-memory (PIM) options, along with a pledge to help drive industry standardization of PIM, at the virtual Hot Chips 33 conference.
By Peter Clarke


Samsung has reported on PIM in a high-bandwidth memory (HBM) integrated with a Xilinx Alveo AI accelerator. The company said its PIM architecture will also be deployed in conventional DRAM modules and mobile memory.

The HBM-PIM, called Aquabolt-XL, was revealed in February 2021 (see Samsung embeds AI accelerator in memory chip) and incorporates AI processing into Samsung’s HBM2-format Aquabolt stacked memory. At Hot Chips, Samsung presented Aquabolt-XL integrated with Alveo, a Xilinx AI accelerator card based on the Virtex UltraScale+ FPGA. Samsung claimed the combination delivered improvements in both performance and energy consumption.

“Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centers,” said Nam Sung Kim, senior vice president of DRAM Product & Technology at Samsung Electronics.

Samsung has labelled the application of PIM to a dual in-line memory module (DIMM) as AXDIMM. This brings processing closer to the DRAM, minimizing data movement between the CPU and DRAM.

It would appear that in this implementation Samsung is not doing true PIM, as the AI engine is housed in a buffer chip on the module that can perform machine learning operations using multiple DRAM chips.

The AXDIMM maintains the DIMM form factor, allowing drop-in replacement within existing systems. The AXDIMM is being tested on customers’ servers and offers a doubling of performance and a 40 percent decrease in system-wide energy usage, Samsung said.
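The benefit of near-memory processing described above can be illustrated with a toy model. The sketch below is not Samsung's API or architecture; it is a hypothetical comparison showing why accumulating rows inside the memory module (as the AXDIMM's buffer-chip engine does) moves far fewer bytes across the CPU-memory bus than the conventional path, for a simple weighted-sum operation of the kind common in recommendation-model embedding lookups.

```python
# Illustrative sketch only, not Samsung's API: models the bus-traffic
# difference between host-side and near-memory accumulation.

def host_side_sum(dram_rows, indices):
    """Conventional path: every selected row crosses the memory bus
    to the CPU before being accumulated."""
    bytes_moved = 0
    acc = [0.0] * len(dram_rows[0])
    for i in indices:
        row = dram_rows[i]               # full row transferred to CPU
        bytes_moved += len(row) * 4      # assume 4-byte elements
        acc = [a + r for a, r in zip(acc, row)]
    return acc, bytes_moved

def near_memory_sum(dram_rows, indices):
    """Near-memory path: a buffer-chip engine accumulates rows inside
    the module; only the final result crosses the bus."""
    acc = [0.0] * len(dram_rows[0])
    for i in indices:
        acc = [a + r for a, r in zip(acc, dram_rows[i])]  # stays on-module
    bytes_moved = len(acc) * 4           # only the result is transferred
    return acc, bytes_moved

rows = [[float(i)] * 64 for i in range(1024)]
res_a, traffic_a = host_side_sum(rows, range(256))
res_b, traffic_b = near_memory_sum(rows, range(256))
assert res_a == res_b                    # same result either way
print(traffic_a // traffic_b)            # 256x less bus traffic here
```

In this toy model the traffic saving equals the number of rows summed; real-world gains depend on access patterns, which is why Samsung's reported figures (2x performance, 40 percent energy reduction) are workload-level measurements rather than raw bus-traffic ratios.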

Software corporation SAP is collaborating with Samsung in this area.

PIM for mobile devices


Samsung also reported on an LPDDR5-PIM mobile memory that can provide on-device AI capabilities without requiring that data be sent to the data centre. Simulations have predicted that LPDDR5-PIM can more than double performance while reducing energy usage by over 60 percent when used in applications such as voice recognition and language translation, Samsung said.

Samsung said that it plans to expand its AI-PIM portfolio by working with other companies to standardize the PIM platform in the first half of 2022.

Related links and articles:

www.samsung.com

News articles:

Samsung embeds AI accelerator in memory chip

Unsupervised in-memory AI learning goes digital

Processor-in-memory DRAM benchmarked on Xeon server

Processing-in-Memory supports AI acceleration at 8.8 TOPS/W
