IBM benchmarks the deep-learning capabilities of its TrueNorth brain-like chip and concludes it's faster and more power-efficient than today's GPUs and CPUs
Developing a computer that can be as decisive and intelligent as humans is on IBM's mind, and it's making progress toward achieving that goal.
IBM's computer chip, called TrueNorth, is designed to emulate the functions of a human brain. The company is now running tests and benchmarking TrueNorth to demonstrate how fast and power-efficient the chip can be compared to today's computers.
The results of the head-to-head contest are impressive. IBM says TrueNorth can engage in deep learning and make decisions based on associations and probabilities, much like human brains. It can do so while consuming a fraction of the power used by chips in other computers for the same purpose.
The learning and computing capacity of a TrueNorth chip "will open up the possibilities of embedding intelligence in the entire computing stack from the internet of things, to smartphones, to robotics, to cars, to cloud computing, and even supercomputing," the company said in a blog entry.
IBM earlier this year demonstrated the chip in a new computer called NS16e, which is modeled after the brain. The computer can be used for image, speech, and pattern recognition through a neural network of processing units.
A human brain has 100 billion neurons that intercommunicate via trillions of connections called synapses. One part, the cortex, is responsible for visual recognition, while other parts are responsible for motor function.
Like the brain, the NS16e has "digital neurons," but on a smaller scale, with 16 TrueNorth chips in the system. Each TrueNorth chip has 1 million neurons and 256 million synapses, which are interconnected via circuitry. The NS16e has redesigned memory, computation and communication subsystems to facilitate power-efficient data processing.
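Each of those digital neurons is a spiking unit: it accumulates weighted input spikes over time and fires when a threshold is crossed. As a rough sketch only, here is a leaky integrate-and-fire neuron, the family of models TrueNorth's neurons are based on, in Python; the leak, threshold, and weight values are illustrative, not IBM's actual neuron parameters.

```python
import numpy as np

def lif_step(v, spikes_in, weights, leak=1.0, threshold=64.0):
    """Advance one leaky integrate-and-fire neuron by one tick.

    The potential v accumulates weighted input spikes, decays by a fixed
    leak, and the neuron fires and resets when v crosses the threshold.
    All parameter values here are illustrative, not TrueNorth's.
    """
    v = v + weights @ spikes_in - leak
    if v >= threshold:
        return 0.0, 1          # fire and reset
    return max(v, 0.0), 0      # stay silent

rng = np.random.default_rng(0)
weights = rng.integers(0, 4, size=256).astype(float)  # 256 synapses per neuron, as on TrueNorth
v, fired_count = 0.0, 0
for _ in range(100):          # drive the neuron with random input spike trains
    v, fired = lif_step(v, rng.integers(0, 2, size=256).astype(float), weights)
    fired_count += fired
print(f"fired on {fired_count} of 100 ticks")
```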
IBM said the TrueNorth processor can classify image data at between 1,200 and 2,600 frames per second while consuming only 25 to 275 milliwatts of power. The processor can identify and recognize patterns from images generated by 50 to 100 cameras running at 24 frames per second. At that power draw, it could run in a smartphone for days without a recharge.
That's far more power-efficient than today's servers, which rely on conventional chips like GPUs, CPUs, and FPGAs for image and speech recognition. Facebook, Google, Microsoft, and Baidu use deep learning to approximate answers in image and speech recognition. Those deep-learning systems are mostly driven by GPUs that draw more than 150 watts of power.
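A back-of-the-envelope calculation makes the gap concrete. Taking TrueNorth's worst case from IBM's figures above (275 milliwatts at 1,200 frames per second) against a 150-watt GPU, and assuming, purely for illustration, that the GPU classifies at the same frame rate:

```python
# Energy per classified frame, using the figures quoted above.
truenorth_watts, truenorth_fps = 0.275, 1200   # worst case from IBM's numbers
gpu_watts, gpu_fps = 150.0, 1200               # 150 W is quoted; matching fps is assumed

truenorth_mj = truenorth_watts / truenorth_fps * 1000  # millijoules per frame
gpu_mj = gpu_watts / gpu_fps * 1000

print(f"TrueNorth: {truenorth_mj:.2f} mJ/frame")   # ~0.23 mJ/frame
print(f"GPU:       {gpu_mj:.2f} mJ/frame")         # ~125 mJ/frame
print(f"ratio:     {gpu_mj / truenorth_mj:.0f}x")  # roughly 500x
```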
IBM's TrueNorth uses algorithms and learning models that recognize patterns and associate past and current data. Algorithms are still being created for different deep-learning models, but the chip can be used with existing frameworks like MatConvNet. Essentially, developers create learning models in MatConvNet, and TrueNorth does the background processing; developers never need to program TrueNorth directly.
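MatConvNet itself is a MATLAB toolbox, so the following is only a loose Python illustration of that division of labor: train a model with ordinary floating-point math (the framework's side), then quantize the weights down to the kind of low-precision values neuromorphic hardware can hold (the chip's side). The trinary weight scheme and the 0.1 cutoff are assumptions for illustration, not IBM's actual toolchain.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                   # toy dataset
y = (X @ rng.normal(size=20) > 0).astype(float)  # linearly separable labels

# "Framework" step: train a logistic-regression model by gradient descent.
w = np.zeros(20)
for _ in range(200):
    p = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (p - y) / len(y)

# "Chip" step: snap the trained weights to {-1, 0, +1}, mimicking a
# low-precision hardware deployment (assumed scheme, for illustration).
w_chip = np.sign(np.where(np.abs(w) > 0.1, w, 0.0))

for name, weights in [("float model", w), ("quantized model", w_chip)]:
    acc = (((X @ weights) > 0) == y).mean()
    print(f"{name}: {acc:.1%} accuracy")
```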
That division of labor is similar to the early days of game development, when programmers weren't exposed to GPUs directly and most didn't know how to exploit on-chip features. Vulkan has more recently emerged as a lower-level alternative to OpenGL, exposing GPU features directly to programmers, who are better equipped to exploit them.
The potential of deep learning is illustrated by self-driving cars, which use powerful computers to navigate a vehicle safely by recognizing signals, lanes, and other objects. Like the chips in those cars and servers, TrueNorth does low-level processing in each neuron, and the neurons' outputs are then strung together to identify an object in an image or recognize a sound. Intel and Nvidia use the same technique in their mega-chips, which are more power-hungry than TrueNorth.
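To see how that stringing together works, here is a minimal sketch in which spikes from one layer of integrate-and-fire neurons feed the next, and the most active output neuron names the class. Layer sizes, weights, and thresholds are all illustrative, not TrueNorth's.

```python
import numpy as np

def layer_step(v, spikes_in, weights, threshold=4.0):
    """Advance a whole layer of neurons one tick; returns (potentials, spikes)."""
    v = v + weights @ spikes_in          # each row of weights is one neuron's synapses
    fired = (v >= threshold).astype(float)
    return v * (1 - fired), fired        # reset the neurons that fired

rng = np.random.default_rng(1)
w1 = rng.integers(0, 2, size=(32, 64)).astype(float)  # input layer -> hidden layer
w2 = rng.integers(0, 2, size=(10, 32)).astype(float)  # hidden layer -> 10 output "classes"

v1, v2 = np.zeros(32), np.zeros(10)
counts = np.zeros(10)
for _ in range(50):                                   # present an input as a spike train
    v1, s1 = layer_step(v1, rng.integers(0, 2, size=64).astype(float), w1)
    v2, s2 = layer_step(v2, s1, w2, threshold=8.0)
    counts += s2                                      # tally output spikes per class

print("predicted class:", int(np.argmax(counts)))    # most active output wins
```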
These are still early days for IBM's TrueNorth chip. The company plans to build a computer with these chips at the scale of a human brain, but part of the challenge is developing algorithms and applications for such a huge computer.
IBM started developing brain-like chips in 2004 and simulated a computer model at the scale of a cat's brain in 2009. A prototype chip in 2011 had 256 digital neurons and pattern-recognition capabilities. A full computer with a brain-emulating chip could still be a long way off.
IBM is also building a quantum computer as a possible replacement for today's PCs and servers, which are based on decades-old computer designs. Other brain-emulating chips are being developed by Hewlett Packard Enterprise, Stanford University, the University of Heidelberg in Germany, and the University of Manchester in the U.K.