IBM’s SyNAPSE Chip Moves Closer to Brain-Like Computing

This week, scientists at IBM Research unveiled a brain-inspired computer chip and its accompanying ecosystem. From their press release on the so-called SyNAPSE chip:

Scientists from IBM unveiled the first neurosynaptic computer chip to achieve an unprecedented scale of one million programmable neurons, 256 million programmable synapses and 46 billion synaptic operations per second per watt. At 5.4 billion transistors, this fully functional and production-scale chip is currently one of the largest CMOS chips ever built, yet, while running at biological real time, it consumes a minuscule 70mW—orders of magnitude less power than a modern microprocessor.
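To put those figures together: 46 billion synaptic operations per second per watt at a 70 mW draw works out to roughly 46 × 10⁹ × 0.07 ≈ 3.2 billion synaptic operations per second, all on a power budget orders of magnitude below the tens of watts a conventional microprocessor burns.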

MIT Technology Review has a good summary as well:

IBM’s SyNAPSE chip processes information using a network of just over one million “neurons,” which communicate with one another using electrical spikes—as actual neurons do. The chip uses the same basic components as today’s commercial chips—silicon transistors. But its transistors are configured to mimic the behavior of both neurons and the connections—synapses—between them.

The SyNAPSE chip breaks with a design known as the von Neumann architecture that has underpinned computer chips for decades. Although researchers have been experimenting with chips modeled on brains—known as neuromorphic chips—since the late 1980s, until now all have been many times less complex, and not powerful enough to be practical (see “Thinking in Silicon”). Details of the chip were published today in the journal Science.

The new chip is not yet a product, but it is powerful enough to work on real-world problems. In a demonstration at IBM’s Almaden research center, MIT Technology Review saw one recognize cars, people, and bicycles in video of a road intersection. A nearby laptop that had been programmed to do the same task processed the footage 100 times slower than real time, and it consumed 100,000 times as much power as the IBM chip. IBM researchers are now experimenting with connecting multiple SyNAPSE chips together, and they hope to build a supercomputer using thousands of them.

I think this kind of experimentation is fascinating. You can read more at Science Magazine (subscription required to view full text).
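For readers wondering what “neurons communicating with electrical spikes” looks like computationally, here is a minimal sketch of a leaky integrate-and-fire network in Python. It is an illustrative toy, not the SyNAPSE chip’s actual design; the neuron count, weights, threshold, and leak values are all invented for the example.

```python
import numpy as np

# Toy leaky integrate-and-fire (LIF) network. This illustrates the
# spiking neuron/synapse model in general, NOT IBM's hardware design.
# All parameters below are made up for the demonstration.

rng = np.random.default_rng(0)

n_neurons = 16
threshold = 1.0   # membrane potential at which a neuron fires
leak = 0.9        # per-step decay of membrane potential
weights = rng.normal(0.0, 0.3, size=(n_neurons, n_neurons))  # synaptic weights

potential = np.zeros(n_neurons)
spikes = np.zeros(n_neurons)

for step in range(20):
    external = 0.3 * rng.random(n_neurons)           # random input current
    # Integrate: decay old potential, add input and incoming spike current.
    potential = leak * potential + external + weights @ spikes
    spikes = (potential >= threshold).astype(float)  # fire where over threshold
    potential[spikes == 1.0] = 0.0                   # reset fired neurons
    print(f"step {step:2d}: {int(spikes.sum())} neurons spiked")
```

The property this toy shares with neuromorphic hardware is that neurons interact only through discrete spike events. Real chips exploit that sparseness by doing work only when a spike arrives, which is a large part of how they achieve such low power consumption.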


Touch: The Future of Computing

Jeff Atwood got his hands on Microsoft’s newly released Surface RT tablet. He reviews his experience with the device in his provocatively titled post “Do You Wanna Touch.” But it is his take on the future of computing that I thought was worth highlighting here:

I love computers, always have, always will. My strategy with new computing devices is simple: I buy ’em all, then try living with them. The devices that fall away from me over time – the ones that gather dust, or that I forget about – are the ones I eventually get rid of. So long, Kindle Fire! I knew that the Nexus 7 was really working for me when I gave mine to my father as a spontaneous gift while he was visiting, then missed it sorely when waiting for the replacement to arrive.

As I use these devices, I’ve grown more and more sold on the idea that touch is going to dominate the next era of computing. This reductionism is inevitable and part of the natural evolution of computers. Remove the mouse. Remove the keyboard. Remove the monitor. Reducing a computer to its absolute minimum leads us inexorably, inevitably to the tablet (or, if a bit smaller, the phone). All you’re left with is a flat, featureless slate that invites you to touch it. Welcome to the future, here’s your … rectangle.

He rationalizes:

I’ve stopped thinking of touch as some exotic, add-in technology contained in specialized devices. I belatedly realized that I love to touch computers. And why not? We constantly point and gesture at everything in our lives, including our screens. It’s completely natural to want to interact with computers by touching them. That’s why the more unfortunate among us have displays covered in filthy fingerprints.

I don’t disagree. I love my iPhone and iPad. But I also love my MacBook Air, on which I am composing this post. Will we see a touchscreen MacBook Air (with an uncompromised keyboard) from Apple in a few years? After reading Jeff’s post, I want to say yes.

Predicting the Future of Computing

The New York Times has a fascinating interactive feature where you, the reader, can submit short predictions about what you expect to happen in the field of computing in the near (and distant) future.

Here are some of the predictions I see as of this writing (readers can vote each prediction up or down according to when they think the event will happen):

2015: The price and availability of computers will be such that more than half of the world’s people will have one.

2017: Mobile web browsing on phones surpasses desktop browsing.

2019: Irrefutable evidence will show that computers consistently make more accurate diagnoses than specialists in all branches of medicine, including psychiatry.

2023: The most common forms of cancer will be treated with a personalized therapy based on genetic sequencing. A patient’s therapy will be retargeted every six months as a result of resequencing the cancer to track its inevitable evolution.

2029: Your entire medical history from birth till death will be collectively combined in one universal system and available to all your different doctors.

2035: Most people will own and use a Personal Life Recorder which will store full video and audio of their daily lives. This will be a fully searchable archive that will radically augment a person’s effective memory.

2041: Cash will become illegal, replaced with electronic currency. [Bizarre prediction!]

2071: Humans will be able to implant their dogs’ brains with a neurological device such that the images of what the dog is thinking appear on a special contact lens the dog owner is wearing.

2106: Medical advances will permit the first human to live for a period of 200 years or more.

2154: Humans will become so integrated with electronics that more people will die from computer viruses in a year than from biological viruses.

2180: Old knowledge will not have to be learned; only new knowledge will need to be created. Learning will become obsolete. All known knowledge will be contained on a supercomputer. Individuals will be able to download all known knowledge pertaining to any subject directly to the brain as desired.

2225: Artificial Intelligence is awarded full citizenship.

2283: Abundance happens. Digital and physical sciences produce abundance so great that wealth becomes meaningless as a difference between people.

2416: Thought-based communication surpasses spoken and typed communication.

What do you think is the coolest prediction from above? Which one is unlikely to happen in the next 400 years?

If you have a New York Times account, you can submit your own predictions. You can vote up/down predictions without an account. It’s a fascinating experiment!