MIT spin-off raises funds for optical processor

Lightmatter Inc. (Boston, Mass.), a company developing a photonic processor for neural network applications, has received $11 million in Series A funds.
By Peter Clarke


The company was founded in 2017 by MIT researchers Nicholas Harris, Darius Bunandar, and Thomas Graham. The photonic processor technology that underlies Lightmatter was developed over four years at the MIT Research Laboratory of Electronics. The round was co-led by Matrix Partners and Spark Capital. Stan Reiss from Matrix and Santo Politi from Spark have joined Lightmatter’s board of directors.

The Lightmatter group won a $100,000 prize at MIT in 2017 for developing fully optical chips that compute using light, which means they can work faster and use less energy than electronic circuits.

Although simple digital operations using analog optical computing were demonstrated many years ago, the complexity of digital architectures, the incomplete set of photonic equivalents to electronic circuits, and the need to move out of and back into the optical domain have held back the use of photonic computing.

Lightmatter has homed in on the artificial intelligence domain, where digital operations are more uniform than in general-purpose computing, and has produced a silicon chip that uses light signals, rather than electrical signals, for matrix multiplication. The system uses heated silicon channels between Mach-Zehnder interferometers to delay, to varying degrees, the optical signals that represent weights. As the signals pass through a cascade of interferometers, the inputs are matrix-multiplied by the weights to produce the required outputs.
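As a rough numerical sketch of this principle (not Lightmatter's actual design), each Mach-Zehnder interferometer can be modeled as a 2x2 unitary matrix acting on a pair of adjacent waveguides, with its mixing ratio set by thermally tuned phase shifts; a cascade of such elements then multiplies the input amplitudes by an effective matrix. The parameterization below is one common textbook form and is chosen for illustration only:

```python
import numpy as np

def mzi(theta, phi):
    # One Mach-Zehnder interferometer modeled as a 2x2 unitary;
    # theta and phi stand in for the thermally tuned phase shifts.
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.sin(theta)],
        [np.exp(1j * phi) * np.sin(theta),  np.cos(theta)],
    ])

def apply_mzi(state, u2, i):
    # Apply the 2x2 unitary u2 to waveguides i and i+1 of the mesh.
    out = state.copy()
    out[i:i + 2] = u2 @ state[i:i + 2]
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=4) + 0j          # optical input amplitudes on 4 channels

# Arbitrary illustrative settings: (waveguide pair, (theta, phi)) per MZI.
settings = [(0, (0.3, 0.1)), (2, (0.7, 0.4)), (1, (0.5, 0.2))]

y = x
U = np.eye(4, dtype=complex)         # effective matrix realized by the mesh
for i, (theta, phi) in settings:
    u2 = mzi(theta, phi)
    y = apply_mzi(y, u2, i)          # light propagating through the MZI
    M = np.eye(4, dtype=complex)
    M[i:i + 2, i:i + 2] = u2
    U = M @ U

# Passing light through the cascade is equivalent to the matrix product U @ x.
assert np.allclose(y, U @ x)
```

In a photonic implementation the product U @ x is computed as the light propagates, rather than by a sequence of digital multiply-accumulate operations, which is the source of the claimed speed and energy advantage.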

In 2017 the team reported in Nature Photonics a silicon photonic chip comprising 56 Mach-Zehnder interferometers, forming a matrix of controllable waveguides, which was used to implement a neural network that recognizes four basic vowel sounds.

The system achieved 77 percent accuracy, compared with about 90 percent for electronic systems, but with the prospect of scaling up to outperform them.


Such energy-efficient photonic acceleration is particularly beneficial for neural network architectures where training is done using large datasets – and requires a lot of time and energy.

“For decades, electronic computers have been at the foundation of the computational progress that has ultimately enabled the AI revolution, but AI algorithms have a voracious appetite for computational power,” said Harris, CEO of Lightmatter, in a statement. “AI is really in its infancy, and to move forward, new enabling technologies are required. At Lightmatter, we are augmenting electronic computers with photonics to power a fundamentally new kind of computer that is efficient enough to propel the next generation of AI.”

Related links and articles:

Nature Photonics article
