The company will launch a number of chips for distributed machine learning in 2019, either as part of the STM32 microcontroller family or as dedicated SoCs, according to Claude Dardanne, president of the microcontrollers and digital group at ST.
"Our customers will use standard platforms to develop artificial intelligence, Caffe, TensorFlow and so on," said Dardanne. He added that ST has already developed its own software development environment – STM32CubeMX.AI – that allows customers to develop artificial intelligence applications generically and then target them at STM32 MCUs or more specialized ICs.
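The workflow Dardanne describes – train in a standard framework, then deploy a quantized version on a microcontroller – typically boils down to running integer arithmetic kernels on the MCU. The sketch below is purely illustrative, not ST's actual generated code: it shows, in Python, the kind of int8 dense-layer computation a conversion tool would emit as C for an STM32. All weights, scales and shapes are made-up examples.

```python
import numpy as np

def quantize(x, scale):
    """Map float values to int8 with a simple symmetric affine scheme
    (illustrative; real tools pick scales per-tensor or per-channel)."""
    q = np.round(x / scale)
    return np.clip(q, -128, 127).astype(np.int8)

def dense_int8(x_q, w_q, bias_q, out_scale):
    """int8 matmul with int32 accumulation, as an MCU kernel would do,
    then rescale the accumulator back into int8 range."""
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32).T + bias_q
    return np.clip(np.round(acc * out_scale), -128, 127).astype(np.int8)

rng = np.random.default_rng(0)
x = rng.standard_normal(4).astype(np.float32)        # pretend sensor features
w = rng.standard_normal((3, 4)).astype(np.float32)   # hypothetical trained weights

x_q = quantize(x, scale=0.05)
w_q = quantize(w, scale=0.05)
bias_q = np.zeros(3, dtype=np.int32)

# Output scale folds the input/weight scales into the requantization step.
y_q = dense_int8(x_q, w_q, bias_q, out_scale=0.05 * 0.05 / 0.1)
print(y_q.shape)  # one int8 activation per output neuron
```

The point of the int32 accumulator is that an int8-by-int8 product fits comfortably in 32 bits even after summing many terms, which is why Cortex-M class parts can run such layers without floating-point hardware.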
Dardanne was speaking at ST's Capital Markets Day, held in London. There were demonstrations of a 28nm FDSOI deep convolutional neural network (DCNN) SoC that ST is using to engage with customers (see ST preps second neural network IC). The technology is capable of delivering 5 Tflops per watt.
While STM32 MCUs are available now, products with AI acceleration, based on the DCNN technology, are expected to arrive in 2019, Dardanne said.
These products will range from STM32 MCUs with DCNN accelerator IP up to dedicated ASICs. One of the forthcoming AI product families will be a microprocessor with neural network acceleration for industrial applications, Dardanne said. The first chip in this family is due to launch later in 2018.
Design work is under way at 40nm for enhanced MCUs with additional IP, and at 28nm FDSOI, 22nm FDSOI and 16nm FinFET for dedicated chips, according to Giuseppe Desoli, chief architect and research fellow at ST Central Labs.
"Phase one, the software tool, is available now. Phase two, STM32 plus IP, and phase three, AI-specific ICs, will be introduced next year in volume production," said Dardanne.