Tapping into a new 'probabilistic computing' paradigm could make AI chips use far less power, scientists say

A new digital system lets operations on a chip run in parallel, so an AI program can arrive at the best possible answer more quickly.