IBM Still Talking Up SyNAPSE
IBM has unveiled the latest stage in its plan to build a computer system that mimics the human brain, tackling tasks that are relatively easy for humans but difficult for computers.
As part of the firm’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project, IBM researchers have been working with Cornell University and iniLabs to create a new programming language, backed by $53m in funding from the Defense Advanced Research Projects Agency (DARPA).
First unveiled two years ago this month, the technology, which aims to match both the compactness and the power efficiency of humanity’s most complex organ, looks to solve the problems traditional computing models face when handling vast amounts of high-speed data.
IBM explained the new programming language, though perhaps not in layman’s terms, saying it “breaks the mould of sequential operation underlying today’s von Neumann architectures and computers” and instead “is tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures”.
That, in English, basically means that it could be used to create next generation intelligent sensor networks that are capable of perception, action and cognition, the sorts of mental processes that humans take for granted and perform with ease.
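To make that contrast concrete, here is a minimal, purely illustrative Python sketch, not a model of IBM’s actual language, of the difference between stepping through one instruction stream and reacting to events flowing between interconnected units; the network and unit names are invented for the example.

```python
from collections import deque

# Purely illustrative: a von Neumann program executes one instruction
# stream in sequence, whereas the architecture described above reacts
# to events (spikes) propagating between many interconnected units.
# The network and unit names below are invented for this sketch.
network = {
    "retina": ["edge_detector"],
    "edge_detector": ["motion", "contrast"],
    "motion": [],
    "contrast": [],
}

events = deque([("retina", "spike")])
while events:                            # event-driven, not clock-stepped
    unit, signal = events.popleft()
    print(f"{unit} received {signal}")
    for downstream in network[unit]:     # fan out to connected units
        events.append((downstream, signal))
```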
Dr Dharmendra Modha, who heads the programme at IBM Research, expanded on what this might mean for the future, saying that the time has come to move forward into the next stage of information technology.
“Today, we’re at another turning point in the history of information technology. The era that Backus and his contemporaries helped create, the programmable computing era, is being superseded by the era of cognitive computing.
“Increasingly, computers will gather huge quantities of data, reason over the data, and learn from their interactions with information and people. These new capabilities will help us penetrate complexity and make better decisions about everything from how to manage cities to how to solve confounding business problems.”
The hardware for IBM’s cognitive computers mimics the brain: it is built around small “neurosynaptic cores”, each featuring 256 “neurons” (processors), 256 “axons” (memory) and 64,000 “synapses” (communications between neurons and axons).
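For a rough sense of that structure, the following Python sketch models a single core under heavy simplifying assumptions: binary synapses and plain integrate-and-fire neurons, neither of which is claimed to match IBM’s actual neuron circuit, and a firing threshold chosen arbitrarily for the demo.

```python
import numpy as np

AXONS, NEURONS = 256, 256   # per-core counts quoted by IBM

class NeurosynapticCore:
    """Toy model of one core: axons drive neurons through a 256x256
    crossbar of binary synapses (~64,000 connection points)."""

    def __init__(self, threshold=64, seed=0):
        rng = np.random.default_rng(seed)
        # synapses[i, j] == 1 means axon i connects to neuron j
        self.synapses = rng.integers(0, 2, size=(AXONS, NEURONS))
        self.potential = np.zeros(NEURONS)   # membrane potentials
        self.threshold = threshold           # firing threshold (arbitrary)

    def tick(self, axon_spikes):
        """One time step: integrate incoming spikes, fire, reset."""
        self.potential += axon_spikes @ self.synapses
        fired = self.potential >= self.threshold
        self.potential[fired] = 0.0          # reset neurons that fired
        return fired.astype(int)

core = NeurosynapticCore()
spikes_in = np.random.default_rng(1).integers(0, 2, size=AXONS)
print(f"{core.tick(spikes_in).sum()} of {NEURONS} neurons fired")
```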
IBM suggested that potential uses for this technology could include a pair of glasses that assists the visually impaired in navigating potentially hazardous environments. Taking in vast amounts of visual and audio data, the augmented reality glasses would highlight obstacles such as kerbs and cars, and steer the user clear of danger.
Other uses could include intelligent microphones that keep track of who is speaking to create an accurate transcript of any conversation.
In the long term, IBM hopes to build a cognitive computer scaled to 100 trillion synapses, fitting inside a volume of no more than two litres while consuming less than one kilowatt of power.
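A back-of-the-envelope calculation, using only the figures quoted in this article, gives a sense of the scale involved; the derived per-core numbers below are a naive extrapolation, not anything IBM has published.

```python
# Naive scaling from the per-core figures quoted above; these derived
# numbers are illustrative extrapolations, not IBM's own estimates.
synapses_per_core = 64_000
target_synapses = 100 * 10**12                 # 100 trillion

cores_needed = target_synapses // synapses_per_core
print(f"Cores needed: {cores_needed:,}")       # 1,562,500,000

# If the whole machine draws under 1 kW, each core's share is tiny:
watts_per_core = 1_000 / cores_needed
print(f"Per-core power budget: {watts_per_core * 1e6:.2f} microwatts")
```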