Turbotodd

Ruminations on IT, the digital media, and some golf thrown in for good measure.

Happy Anniversary, Deep Blue


It was fifteen years ago today, on May 11, 1997, that the IBM chess-playing supercomputer Deep Blue beat he-who-shall-remain-nameless, the reigning world chess champion, in a six-game match: two wins for IBM, one for the champion, and three draws. The match lasted several days and received massive media coverage around the world.

It was the classic man-versus-machine plot line. But underlying the John Henry mythology was something far more important: the opportunity to push the frontiers of computer science, to make computers handle the kinds of complex calculations needed to help discover new pharmaceuticals, to conduct the financial modeling needed to identify trends and analyze risk, and to perform the massive calculations required in many fields of science, taking computing to its next stage of evolution.

Solving The Problem That Is Chess

Since artificial intelligence emerged as a concept along with the first real computers in the 1940s, computer scientists have compared the performance of these “giant brains” with human minds, and many gravitated to chess as a way of testing the calculating abilities of computers. Chess presents a collection of challenging problems for minds and machines alike, yet it has simple rules, making it an excellent testbed for laying the groundwork for the “big data” era that was soon to come.
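
To see why chess makes such a demanding testbed, consider the arithmetic of the game tree. Assuming a commonly cited average of about 35 legal moves per position (an estimate, not a Deep Blue figure), the number of positions explodes with search depth:

```python
# Rough arithmetic on the chess game tree, assuming an average
# branching factor of ~35 legal moves per position (a common estimate).
BRANCHING = 35

for depth in range(1, 9):
    print(f"{depth} plies deep: ~{BRANCHING ** depth:,} positions")

# Even at 200 million positions per second, exhaustively searching the
# ~2.25 trillion positions of a full 8-ply tree would take over three
# hours, which is why pruning and clever evaluation matter so much.
```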

There’s no question that Deep Blue was a powerful computer, programmed to solve the complex, strategic game of chess. But IBM’s goal was far deeper: to enable researchers to discover and understand the limits and opportunities presented by massively parallel processing and high-performance computing.

IBM Deep Blue: Analyzing 200 Million Chess Positions Per Second

If Deep Blue could explore up to 200 million possible chess positions per second, could this deep computing capability be used to help society handle the kinds of complex calculations required in those other areas?

Deep Blue did, in fact, prove that industry could tackle these issues with smart algorithms and sufficient computational power.
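
What did those “smart algorithms” look like? At their core, chess engines of that era rested on minimax game-tree search with alpha-beta pruning, which lets the program ignore lines of play a rational opponent would never permit. Here is a minimal, game-agnostic sketch of the idea; the callback names are hypothetical placeholders, and Deep Blue’s real search, much of it on custom chess chips, was vastly more sophisticated:

```python
# A minimal sketch of minimax search with alpha-beta pruning, the
# family of "smart algorithms" behind computer chess. The callbacks
# (moves, apply_move, evaluate) are hypothetical placeholders a real
# engine would supply; this toy only illustrates the principle.

def alphabeta(state, depth, alpha, beta, maximizing,
              moves, apply_move, evaluate):
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)  # static evaluation at the search horizon
    if maximizing:
        best = float("-inf")
        for m in legal:
            child = apply_move(state, m)
            best = max(best, alphabeta(child, depth - 1, alpha, beta,
                                       False, moves, apply_move, evaluate))
            alpha = max(alpha, best)
            if alpha >= beta:  # opponent would never allow this line; prune
                break
        return best
    best = float("inf")
    for m in legal:
        child = apply_move(state, m)
        best = min(best, alphabeta(child, depth - 1, alpha, beta,
                                   True, moves, apply_move, evaluate))
        beta = min(beta, best)
        if alpha >= beta:
            break
    return best
```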

Earlier this year, I recalled my own experience witnessing the Deep Blue chess match in a blog post. It evoked a lot of nostalgia for me and for so many others.

IBM’s Deep Blue supercomputer could explore up to 200 million possible chess positions per second on 510 processors. Juxtapose that with IBM Blue Gene’s ability, just a few short years later, to routinely handle 478 trillion operations every second!
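
To put that juxtaposition in rough numbers (with the caveat that chess positions evaluated and general operations per second are different units, so this is order-of-magnitude only):

```python
# Order-of-magnitude comparison of the two figures cited above.
# Caveat: chess positions per second and operations per second are
# different units, so treat this as a rough scale comparison only.
deep_blue_positions_per_sec = 200e6   # Deep Blue, 1997
blue_gene_ops_per_sec = 478e12        # Blue Gene, 478 teraflops

print(f"~{blue_gene_ops_per_sec / deep_blue_positions_per_sec:,.0f}x")
# -> ~2,390,000x the raw per-second throughput
```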

But Deep Blue also laid a foundation, paving the way for new kinds of advanced computers and breakthroughs, including IBM’s Blue Gene and, later, IBM Watson.

Forever In Blue Genes

Blue Gene, introduced in 2004, took on the next grand challenge in computing: it was both the most powerful supercomputer of its time and the most efficient, and it was built to help biologists observe the invisible processes of protein folding and gene development. Deep Blue was also one of the earliest experiments in the supercomputing work that has propelled IBM to market leadership in this space, a position it holds to this day.

Fifteen years on, we’ve seen epic growth in the volume and variety of data being generated around the planet: by business, by social media, and by new sensor data instrumenting the physical world vis-a-vis IBM’s Smarter Planet initiative. We’ve created so much new data, in fact, that 90 percent of the data in the world today was created in the last two years alone!

Calling Doctor Watson

Most recently, IBM embarked upon the next wave of this computing progress with the development of Watson, which can hold the equivalent of about one million books’ worth of information. But make no mistake: Watson’s significance lies not just in the amount of information it can process, but in a new generation of technology that uses algorithms to find answers in unstructured data more effectively than standard search technology can, while also “understanding” natural language.
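
Watson’s actual DeepQA pipeline combined many algorithms for parsing questions, generating candidate answers, and scoring evidence. As a toy illustration of the underlying idea, finding the passage of unstructured text most likely to answer a question rather than just matching keywords blindly, here is a crude sketch; everything in it is a simplified stand-in, not IBM’s method:

```python
# A toy passage scorer: ranks unstructured text passages against a
# natural-language question, weighting matched words by how strongly
# they appear. A simplified stand-in, not Watson's DeepQA pipeline.
import math
import re
from collections import Counter

def tokens(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def score(question, passage):
    q, p = Counter(tokens(question)), Counter(tokens(passage))
    return sum(q[w] * math.log(1 + p[w]) for w in q if w in p)

passages = [
    "Deep Blue defeated the world chess champion in 1997.",
    "Blue Gene was built to study protein folding.",
]
question = "What did Deep Blue defeat?"
print(max(passages, key=lambda p: score(question, p)))
```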

The promise of IBM Watson is now being put to productive use in industry: as an online tool to assist medical professionals in formulating diagnoses, and as a way to simplify the banking experience by analyzing customer needs in the context of vast amounts of ever-changing financial, economic, product, and client data. And, I’m sure, it will come to other industries near you soon.

Those early chess matches were exciting, nail-biting even (and who’d have thought we’d ever say that about chess?). But they pale in comparison with the productive work and problem-solving that IBM’s Watson, and other IBM technologies, are engaged in now and will continue to take on as the world of big data matures and is adopted by an ever-increasing audience.

You can now visit Deep Blue, which was ultimately retired to the Smithsonian’s National Museum of American History in Washington, D.C.

But its groundbreaking contributions to artificial intelligence, and to computing in general, continue, and they now extend well beyond the confines of the chessboard.
