Turbotodd

Ruminations on tech, the digital media, and some golf thrown in for good measure.

Archive for the ‘supercomputing’ Category

The Supercomputing Summit


Okay, let me be up front: I’m just back from a week’s vacation and my brain is mush.

But the neurons are slowly starting to refire.

NOT as fast, I might add, as IBM’s new supercomputer, Summit, which was built in partnership with the Oak Ridge National Lab and is now the world’s smartest and most powerful AI machine.

WIRED recently wrote up the new machine, and here are some noteworthy bits:

America hasn’t possessed the world’s most powerful supercomputer since June 2013, when a Chinese machine first claimed the title. Summit is expected to end that run when the official ranking of supercomputers, from an organization called Top500, is updated later this month.

Summit, built by IBM, occupies floor space equivalent to two tennis courts, and slurps 4,000 gallons of water a minute around a circulatory system to cool its 37,000 processors. 

Oak Ridge says its new baby can deliver a peak performance of 200 quadrillion calculations per second (that’s 200 followed by 15 zeros) using a standard measure used to rate supercomputers, or 200 petaflops. That’s about a million times faster than a typical laptop, and nearly twice the peak performance of China’s top-ranking Sunway TaihuLight.

Summit has nearly 28,000 graphics processors made by Nvidia, alongside more than 9,000 conventional processors from IBM.

Summit will be used to help analyze a wide array of deep learning challenges ranging from astronomy to chemistry to biology and beyond.
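
A quick back-of-the-envelope check on that "million times faster than a typical laptop" line, assuming (my assumption, not WIRED's) a laptop that sustains roughly 200 gigaflops:

    # Rough scale comparison only -- not a benchmark.
    summit_peak_flops = 200e15   # Summit's quoted peak: 200 petaflops
    laptop_flops = 200e9         # assumed ~200 gigaflops for a typical laptop

    print(summit_peak_flops / laptop_flops)   # 1000000.0 -- about a million times faster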

Written by turbotodd

June 11, 2018 at 3:53 pm

China Tops Top500 Supercomputing List


For the first time ever, China has the most systems on the Top500 supercomputers list with 202, up from 159 six months ago.

By comparison, the U.S. dropped from 169 to 144, according to a report from CNET.

China also sits atop the Top500 list, with its Sunway TaihuLight supercomputer at China’s National Supercomputing Center in Wuxi reaching 93.01 petaflops, or 93 quadrillion calculations per second.

But don’t rule out the U.S….

The United States might reclaim the top spot on the Top500 list, though. An IBM-built machine called Summit at Oak Ridge National Laboratory is designed to reach about 200 petaflops, double the performance of Sunway TaihuLight. It’s in a 10,000-square-foot facility that’s got a 20-megawatt power system for running the machine and keeping it cool. That’s enough electricity to power about 16,300 houses.
– via CNET

Written by turbotodd

November 14, 2017 at 9:34 am

Posted in 2017, china, ibm, supercomputing

Big States (And Countries) Need Big Computers


IBM’s been on a roll with the supercomputer situation of late.

Last week, we announced the installation of a Blue Gene supercomputer at Rutgers, and earlier today, we discovered that an IBM Blue Gene supercomputer is coming to my great home state of Texas.

Specifically, IBM announced a partnership with Houston’s Rice University to build the first award-winning IBM Blue Gene supercomputer in Texas.

Rice also announced a related collaboration agreement with the University of Sao Paulo (USP) in Brazil to initiate the shared administration and use of the Blue Gene supercomputer, which allows both institutions to share the benefits of the new computing resource.


Now, you all play nice as you go about all that protein folding analysis!

Rice faculty indicated they would be using the Blue Gene to further their own research and to collaborate with academic and industry partners on a broad range of science and engineering questions related to energy, geophysics, basic life sciences, cancer research, personalized medicine and more.

“Collaboration and partnership have a unique place in Rice’s history as a pre-eminent research university, and it is fitting that Rice begins its second century with two innovative partnerships that highlight the university’s commitments to expanding our international reach, strengthening our research and building stronger ties with our home city,” said Rice President David Leebron about the deal.

USP is Brazil’s largest institution of higher education and research, and USP President Joao Grandino Rodas said the agreement represents an important bond between Rice and USP. “The joint utilization of the supercomputer by Rice University and USP, much more than a simple sharing of high-tech equipment, means the strength of an effective partnership between both universities,” he explained.

Unlike a typical desktop or laptop computer, which has a single microprocessor, supercomputers typically contain thousands of processors. This makes them ideal for scientists who study complex problems, because jobs can be divided among all the processors and run in a matter of seconds rather than weeks or months.
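
If you want a toy picture of that divide-the-job-up idea, here is a minimal Python sketch; real supercomputer codes use MPI across thousands of nodes, but the principle of splitting one big job into chunks that run at the same time is the same:

    # Toy example: farm one big sum out across all local CPU cores.
    from multiprocessing import Pool, cpu_count

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(x * x for x in range(lo, hi))

    if __name__ == "__main__":
        n = 10_000_000
        workers = cpu_count()
        step = n // workers
        # Split the range [0, n) into one chunk per core.
        chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
                  for i in range(workers)]
        with Pool(workers) as pool:
            total = sum(pool.map(partial_sum, chunks))
        print(total)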

Supercomputers are used to simulate things that cannot be reproduced in a laboratory — like Earth’s climate or the collision of galaxies — and to examine vast databases like those used to map underground oil reservoirs or to develop personalized medical treatments.

USP officials said they expect their faculty to use the supercomputer for research ranging from astronomy and weather prediction to particle physics and biotechnology.

In 2009, President Obama recognized IBM and its Blue Gene family of supercomputers with the National Medal of Technology and Innovation, the most prestigious award in the United States given to leading innovators for technological achievement.

Including the Blue Gene/P, Rice has partnered with IBM to launch three new supercomputers during the past two years that have more than quadrupled Rice’s high-performance computing capabilities.

The addition of the Blue Gene/P doubles the number of supercomputing CPU hours that Rice can offer. The six-rack system contains nearly 25,000 processor cores that are capable of conducting about 84 trillion mathematical computations each second. When fully operational, the system is expected to rank among the world’s 300 fastest supercomputers as measured by the TOP500 supercomputer rankings.
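
For context, dividing those two announcement figures (both approximate) gives the rough per-core throughput:

    # Back-of-envelope per-core figure from the announcement's own numbers.
    total_calcs_per_sec = 84e12   # ~84 trillion calculations per second
    cores = 25_000                # "nearly 25,000 processor cores"

    print(total_calcs_per_sec / cores)   # ~3.4e9 -- a few billion calculations per core per second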

Written by turbotodd

March 30, 2012 at 6:54 pm

Innovate 2011 Conference: Profit From Software


Happy Friday.

I’ve been too busy to keep track of all that’s going on at Big Blue this week, but I did notice some news out of Warsaw that I thought worth sharing.

The Interdisciplinary Center for Mathematical and Computational Modeling at the University of Warsaw announced earlier this week that it will be the first scientific center in Poland to use the IBM Blue Gene/P system.

This supercomputer will be used in scientific research and take on computationally intensive scientific problems described as “major challenges” in areas like meteorology, cosmology, materials science, and neuroinformatics.  You can learn more about this deal here.

I also wanted to plant a reminder before the weekend: Innovate 2011, IBM’s premier event for software and systems innovation, is just around the corner.

To be held June 5-9 in Orlando, Florida, Innovate 2011 is your opportunity to let the good folks from IBM Rational show you how to cut through the complexity of smarter product, systems, and software delivery.

You can visit here to get all the skinny on registration.  If you’re looking for those extra special reasons to convince your boss to let you out of the office for a few days, we’ve provided “Top 5 Reasons to Attend.”

Or, go visit the “Rational Talks to You” podcast series to hear from past participants on the topics you’ve told us matter most.

Even IEEE Fellow and UML co-creator Grady Booch is in on the action, joining this webcast (attendees of which get $300 off their Innovate 2011 registration) to give us a sneak preview of his Innovate 2011 keynote presentation about IBM’s Jeopardy! champion computer, Watson.

Remember, software is everywhere…but it’s especially at Innovate 2011!

Deep Blue Redux


Flashback: May 3, 1997

Where: The Equitable Center, New York City

What: Deep Blue v. Kasparov, The Rematch

[Image caption: Garry Kasparov prepares to make a move against IBM's Deep Blue supercomputer during the May 1997 rematch, in which Deep Blue was ultimately victorious.]

It was classic Man v. Machine.  World chess champion Garry Kasparov had agreed to a rematch against the IBM supercomputer Deep Blue, after taking the machine 4-2 in Philadelphia in their first meeting in February 1996.

This time, Deep Blue was out for…well, if not blood, then certainly revenge.  And Kasparov was out to show he could beat the machine once again.

Game 1 that day went to Kasparov.  In case you were wondering how long things stick around on the Internet, you can go back and read the play-by-play coverage from the IBM Website for the event that day.

I was living up in Mount Kisco, New York, at the time, in Westchester County, and I remember trying to get onto the Website via dial-up modem and use a Java applet IBM had developed in partnership with Poppe Tyson so that people around the globe would be able to follow the action online.

For those of you who were still in diapers, this was at a time when not everyone had a broadband connection into their home.

For the next match, I decided to head into the city and go to the Equitable Center in person to see for myself.

Well, not directly.  The Deep Blue computer, the IBM Research team programming Deep Blue, and Kasparov were all situated some 34 floors above the auditorium, where the “play-by-play” was being called.

Now, I’m no chess grandmaster myself.  Not even close.

But I knew enough watching the play-by-play (with several grandmasters calling the action onstage, including Maurice Ashley) up on the video screen to know this was some serious chess.

You could almost watch the IBM computer “thinking” through the moves, as seconds ticked off between moves — although on most moves, it didn’t take very many seconds.  Not for nothin’ did they classify Deep Blue as a supercomputer.

People in the Equitable Center audience would cheer when certain moves occurred, particularly those by Deep Blue, which often surprised even that chess-savvy crowd with the depth of its acumen.

That was something I thought I’d never see in my lifetime: Spectators cheering on a chess game.  But it was terribly enthralling.

Because there was more to cheer about than the game itself.

One had to step back and remind oneself this wasn’t a Bobby Fischer/Garry Kasparov match.  This was Garry Kasparov playing chess against a computer.  In real-time.

This wasn’t a situation where humans were making the decisions.  This was the computer in the driver’s seat, responsible for its own fate, but also devoid of the trappings of human emotion and frailty (which, by the end of the tournament, Kasparov certainly was not, as he demonstrated in a number of his post-match temper tantrums).

You couldn’t blame the guy.  You wouldn’t like being beaten by a computer, either!

Which is why I had the feeling I was watching history being made.  And apparently I wasn’t the only one.

IBM garnered an estimated $100M worth of free public relations exposure through the course of the rematch, and in so doing captured the imaginations of people from around the world.

And, their attention online.

Up to that point, Kasparov vs. Deep Blue, the Rematch, was one of the most popular live events ever staged on the Internet. The Website, designed in partnership with Web design shop Studio Archetype, received more than 74 million hits during the event, which represented some 4 million users from 106 countries.

All the fanfare, all the publicity, all the hoopla…it was fun.  But you can only stretch the Man v. Machine, John Henry analogies so far.

However, the implications of the technology were…well, endless.

Dr. Mark Bregman, at the time the general manager of IBM’s RS/6000 division, wrote a guest essay for the match Website, and this is what he had to say about the match:

“Think about it. Playing chess requires knowledge of countless possibilities — quickly providing answers to any number of ‘what if’ questions. That’s what business people and members of the scientific community have come to expect from massively parallel computer systems.”

The evolution of those possibilities continues.

In June of this year, The New York Times Magazine ran a cover story featuring the next move in that evolution.

If the answer is an attempt to build a computing system that can understand and answer complex questions with enough precision and speed to compete against some of the best Jeopardy! contestants out there…well, then, the question, of course, is:

What is Watson?

Written by turbotodd

December 9, 2010 at 6:01 pm