Samsung has reported on processing-in-memory (PIM) in a high-bandwidth memory (HBM) integrated with a Xilinx Alveo AI accelerator. The company said that its PIM architecture will be deployed in conventional DRAM modules and mobile memory.
The HBM-PIM, called Aquabolt-XL, was revealed in February 2021 (see Samsung embeds AI accelerator in memory chip) and incorporates AI processing into Samsung's HBM2-format Aquabolt stacked memory. At Hot Chips, Samsung presented Aquabolt-XL integrated with a Xilinx Virtex UltraScale+ AI accelerator card called Alveo. Samsung claimed the combination delivered improvements in both performance and energy consumption.
"Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centers," said Nam Sung Kim, senior vice president of DRAM Product & Technology at Samsung Electronics.
Samsung has labelled the application of PIM to a dual in-line memory module (DIMM) as AXDIMM. This brings processing to the DRAM, minimizing data movement between the CPU and DRAM.
It would appear that in this manifestation Samsung is not doing true in-memory processing: the AI engine is housed in a buffer chip on the module, from where it can perform machine-learning operations across multiple DRAM chips.
The AXDIMM maintains the DIMM form factor, allowing drop-in replacement within existing systems. The AXDIMM is being tested on customers' servers and offers a doubling of performance and a 40 percent decrease in system-wide energy usage, Samsung said.
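The data-movement saving behind this approach can be sketched conceptually. The toy calculation below (not Samsung's implementation, just an illustration of the principle) compares the bytes that must cross the CPU-memory interface when a reduction over values held in DRAM is done on the CPU versus when it is offloaded to a hypothetical near-memory engine that returns only the result.

```python
# Conceptual sketch of why near-memory processing cuts data movement.
# All sizes and counts here are illustrative assumptions, not Samsung figures.

ELEMENT_SIZE = 4   # bytes per 32-bit value
N = 1_000_000      # values resident in DRAM

# CPU-centric: every operand must be read across the memory bus to the CPU.
cpu_bytes_moved = N * ELEMENT_SIZE

# PIM-style: a buffer-chip engine reduces in place; only the result crosses.
pim_bytes_moved = 1 * ELEMENT_SIZE

print(cpu_bytes_moved, pim_bytes_moved)  # prints: 4000000 4
```

The point of the sketch is that the traffic saving scales with the data size: the larger the working set held in memory, the more a near-memory engine saves relative to shipping every operand to the CPU.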
Software corporation SAP is collaborating with Samsung in this area.
Next: PIM for mobile devices