Turbotodd

Ruminations on tech, the digital media, and some golf thrown in for good measure.

Archive for October 2012

Watson Goes Back To School


In my capacity as a cheerleader for my virtual big brother, IBM's Watson technology, I've received a lot of questions along the way about how IBM plans to use the technology in industry, and how we can most effectively put Watson to work.

Great questions, and the answer is, it depends.

Yesterday, for example, we announced a new collaboration with the Cleveland Clinic in Cleveland, Ohio, to advance Watson's use in the field of medical training.

The IBM researchers who built Watson are going to work with Cleveland Clinic clinicians, faculty, and medical students to enhance the capabilities of Watson's Deep Question Answering technology in the area of medicine.

Calling Dr. Watson

Watson’s ability to analyze the meaning and context of human language and quickly process information to piece together evidence for answers can help healthcare decision makers, such as clinicians, nurses and medical students, unlock important knowledge and facts buried within huge volumes of information.

Watson has been gaining knowledge in the field of medicine, and Cleveland Clinic with IBM recognized the opportunity for Watson to interact with medical students to help explore a wide variety of learning challenges facing the medical industry today.

Rather than attempting to memorize everything in textbooks and medical journals (now acknowledged as an impossible task), students are learning through doing — taking patient case studies, analyzing them, coming up with hypotheses, and then finding and connecting evidence in reference materials and the latest journals to identify diagnoses and treatment options in the context of medical training.

This process of considering multiple medical factors and discovering and evidencing solution paths in large volumes of data reflects the core capabilities of the Watson technology.

Watson Providing Problem-Based Learning Curriculum

Medical students will interact with Watson on challenging cases as part of a problem-based learning curriculum and in hypothetical clinical simulations.

A collaborative learning and training tool built on the Watson technology will be available to medical students to assist in their education: learning to navigate the latest content, to suggest and consider a variety of hypotheses, and to find key evidence to support potential answers, diagnoses and possible treatment options.

“Every day, physicians and scientists around the world add more and more information to what I think of as an ever-expanding, global medical library,” said C. Martin Harris, M.D., Chief Information Officer of Cleveland Clinic. “Cleveland Clinic’s collaboration with IBM is exciting because it offers us the opportunity to teach Watson to ‘think’ in ways that have the potential to make it a powerful tool in medicine. Technology like this can allow us to leverage that medical library to help train our students and also find new ways to address the public health challenges we face today.”

Watson Will Learn From Medical Students

Students will help improve Watson's language and domain analysis capabilities by judging the evidence it provides and analyzing its answers within the domain of medicine. Through engagement with this education tool, medical students and Watson will benefit from each other's strengths and expertise, both learning and improving their collaborative performance.

The collaboration will also focus on leveraging Watson to process an electronic medical record (EMR) based on a deep semantic understanding of the content within an EMR.

The shift is clearly away from memorization and towards critical thinking, where medical training programs will help students use powerful discovery and language analysis tools like Watson to evaluate medical case scenarios and find evidence to carefully rationalize decisions. The physicians will rely on their own experience and expert critical thinking skills to read the evidence and make the final judgments.

“The practice of medicine is changing and so should the way medical students learn. In the real world, medical case scenarios should rely on people’s ability to quickly find and apply the most relevant knowledge. Finding and evaluating multistep paths through the medical literature is required to identify evidence in support of potential diagnoses and treatment options,” said Dr. David Ferrucci, IBM Fellow and Principal Investigator of the Watson project.

Over time, the expectation is that Watson will get “smarter” about medical language and how to assemble good chains of evidence from available content. Students will learn how to focus on critical thinking skills and how to best leverage informational tools like Watson in helping them learn how to diagnose and treat patients.

IBM and Cleveland Clinic will discuss the role of Watson for the future of healthcare and healthcare education this week at the Cleveland Clinic Medical Innovation Summit being held October 29-31, 2012 in Cleveland, OH.

I sat down recently at the IBM InterConnect event in Singapore to conduct a fascinating mid-year employee performance review for IBM’s Watson technology with Watson GM Manoj Saxena.  You can see the fruits of our discussion in the video below.

Sandy’s Data Center Impact


Well, I sat and watched the coverage of Superstorm Sandy last night, flipping between the major cable news networks and The Weather Channel, and also trying to keep up with my northeast friends via Facebook and Twitter.

You could almost match the power outages, one for one, with the suddenly disappearing Facebook and Twitter streams, as one friend after another dropped from the social radar screen.

Having lived in New York City and its surroundings for the better part of eight years of my life, I was completely sympathetic to their plight, and quite frankly, astonished at some of the images I was witnessing.

I've been out doing some research to try to understand the negative IT impact, and it didn't take long to find examples.

This story indicated that the flooding had hobbled two data center buildings in Lower Manhattan, mainly because it took out diesel pumps (located in basements) that were needed to refuel generators.

Datagram's 33 Whitehall basement was also inundated, taking out some major Web sites, including Gawker, Gizmodo, Buzzfeed, The Huffington Post, and Media. The attached screenshot shows the message I got when I tried going there just this morning.

Ars Technica also had a post detailing some of the outages, in which they suggested that “customers relying on hosting providers and cloud services may want to build systems that can fail over across multiple regions,” but that “even the most extensive preparations may not be enough to stay online in the face of a storm like Hurricane Sandy.”

IBM’s own Business Continuity Services had this message for IBM clients posted on its home page overnight:

The IBM Business Continuity and Resiliency team is monitoring the status of Hurricane Sandy and has activated our Emergency Operations Center to ensure we are prepared to assist our customers throughout the storm. Our delivery teams are assembled in BCRS recovery centers in Sterling Forest, NY, Gaithersburg, MD and Boulder CO and all facilities are secure and ready to support all client declarations. We are proactively assessing the potential impact to our customers who are projected to be in the path of the storm, and our delivery program management team will provide regular updates to our clients as the storm progresses, and will be available to respond to any questions throughout the week. If you need to call IBM to place us on alert, or to declare a disaster, please call 1-877-IBM-REC1 (877-426-7321)

Written by turbotodd

October 30, 2012 at 7:43 pm

Think Small: IBM Researchers Demonstrate Carbon Nanotubes, Potential Silicon Successors


Since I posted about Hurricane Sandy earlier in the day, I’ve seen some pretty stunning pictures and video coming in, and heard more reports from friends in and around the New York City area.

The story of the crane toppling over on a very tall building being built on West 57th Street, between 6th and 7th Avenues (my old IBM office is at Madison and 57th, further east) was most stunning. You can find some of the pics or video on CNN.

While we wait to discover how big a problem Sandy presents to the northeast Atlantic coast, I’ll share with you a diversion focusing on a much smaller topic — but one with potentially huge implications.

IBM scientists recently demonstrated a new approach to carbon technology that opens up the path for commercial fabrication of dramatically smaller, faster and more powerful computer chips.

For the first time, more than ten thousand working transistors made of nano-sized tubes of carbon have been precisely placed and tested in a single chip using standard semiconductor processes.

These carbon devices are poised to replace and outperform silicon technology, allowing further miniaturization of computing components and leading the way for future microelectronics.

Four Decades Of Innovation

Aided by rapid innovation over four decades, silicon microprocessor technology has continually shrunk in size and improved in performance, thereby driving the information technology revolution.

Silicon transistors, tiny switches that carry information on a chip, have been made smaller year after year, but they are approaching a point of physical limitation.

Their increasingly small dimensions, now reaching the nanoscale, will prohibit any gains in performance due to the nature of silicon and the laws of physics. Within a few more generations, classical scaling and shrinkage will no longer yield the sizable benefits of lower power, lower cost and higher speed processors that the industry has become accustomed to.

Carbon nanotubes represent a new class of semiconductor materials whose electrical properties are more attractive than silicon, particularly for building nanoscale transistor devices that are a few tens of atoms across.

Electrons in carbon transistors can move more easily than in silicon-based devices, allowing for quicker transport of data. The nanotubes are also ideally shaped for transistors at the atomic scale, an advantage over silicon.

These qualities are among the reasons carbon is a candidate to replace the traditional silicon transistor and, coupled with new chip design architectures, to enable computing innovation on a miniature scale for the future.

The approach developed at IBM labs paves the way for circuit fabrication with large numbers of carbon nanotube transistors at predetermined substrate positions. The ability to isolate semiconducting nanotubes and place a high density of carbon devices on a wafer is crucial to assess their suitability for a technology — eventually more than one billion transistors will be needed for future integration into commercial chips.

Hardly A Carbon Copy

Until now, scientists have been able to place at most a few hundred carbon nanotube devices at a time, not nearly enough to address key issues for commercial applications.

Originally studied for the physics that arises from their atomic dimensions and shapes, carbon nanotubes are being explored by scientists worldwide in applications that span integrated circuits, energy storage and conversion, biomedical sensing and DNA sequencing.

This achievement was published today in the peer-reviewed journal Nature Nanotechnology.

Carbon, a readily available basic element from which crystals as hard as diamonds and as soft as the “lead” in a pencil are made, has wide-ranging IT applications.

Carbon nanotubes are single atomic sheets of carbon rolled up into a tube. The carbon nanotube forms the core of a transistor device that will work in a fashion similar to the current silicon transistor, but will be better performing. They could be used to replace the transistors in chips that power our data-crunching servers, high performing computers and ultra fast smart phones.

Earlier this year, IBM researchers demonstrated that carbon nanotube transistors can operate as excellent switches at molecular dimensions of less than ten nanometers, roughly 10,000 times thinner than a strand of human hair and less than half the size of the leading silicon technology. Comprehensive modeling of the electronic circuits suggests that about a five to ten times improvement in performance compared to silicon circuits is possible.

There are practical challenges for carbon nanotubes to become a commercial technology, notably, as mentioned earlier, the purity and placement of the devices. Carbon nanotubes naturally come as a mix of metallic and semiconducting species and need to be placed perfectly on the wafer surface to make electronic circuits. For device operation, only the semiconducting kind of tube is useful, which requires essentially complete removal of the metallic ones to prevent errors in circuits.

Also, for large scale integration to happen, it is critical to be able to control the alignment and the location of carbon nanotube devices on a substrate.

To overcome these barriers, IBM researchers developed a novel method based on ion-exchange chemistry that allows precise and controlled placement of aligned carbon nanotubes on a substrate at a high density — two orders of magnitude greater than previous experiments, enabling the controlled placement of individual nanotubes with a density of about a billion per square centimeter.

The process starts with carbon nanotubes mixed with a surfactant, a kind of soap that makes them soluble in water. A substrate is prepared with two oxides: trenches of chemically modified hafnium oxide (HfO2), with the rest of the surface silicon oxide (SiO2). The substrate is immersed in the carbon nanotube solution, and the nanotubes attach via a chemical bond to the HfO2 regions while the rest of the surface remains clean.

By combining chemistry, processing and engineering expertise, IBM researchers are able to fabricate more than ten thousand transistors on a single chip.

Furthermore, rapid testing of thousands of devices is possible using high-volume characterization tools, thanks to compatibility with standard commercial processes.

As this new placement technique can be readily implemented, involving common chemicals and existing semiconductor fabrication, it will allow the industry to work with carbon nanotubes at a greater scale and deliver further innovation for carbon electronics.

You can learn more in the animation below.

Written by turbotodd

October 29, 2012 at 8:02 pm

Sandy


Hurricane Sandy is rapidly approaching the Atlantic coast of the U.S. As of 8 a.m., the huge storm was producing sustained winds of 85 miles per hour after turning north-northwest toward the coastline of New Jersey, according to the National Hurricane Center. The center of the storm is now moving at 20 m.p.h., a significant speedup from earlier in the morning.

If it’s Monday, it must be time for a Hurricane.

And I’m not referring to the cocktail emanating from Pat O’Brien’s in New Orleans.

Hurricane Sandy is drifting up the Atlantic coast and is expected to make landfall later this afternoon, probably somewhere in New Jersey.

But as of 8:52 CST this morning, she's already having an impact well into New York City. I've already seen Twitpics of Battery Park City starting to surrender to the surge, which is truly frightening considering how much of the storm is still yet to come.

As an FYI, both The New York Times and The Wall Street Journal have eliminated their paywalls and are making their content free, if you’re looking for up-to-the-minute updates on the storm.

YouTube is also streaming The Weather Channel (where NBC’s Al Roker was just seen trying to stay vertical at Point Pleasant Beach, New Jersey).

I spoke with a good friend of mine who lives on the edge of Cobble Hill (in Brooklyn), and he indicated the water had not yet lapped over the piers there, but that it was likely only a matter of time. Forecasters are expecting a 6-to-11-foot surge when high tide strikes around 8 tonight.

New York governor Andrew Cuomo just held a press conference and announced the closing of both the Holland and Brooklyn Battery tunnels at 2 P.M. EST.

If you're interested in seeing more details about the storm, Google's offering up its "Crisis Map" here, and a more specific look at NYC here.

On Twitter, the National Hurricane Center is offering facts and tips at @NHC_Atlantic, and the Weather Channel can be followed at @weatherchannel.

I was living in NYC in 1985 during Hurricane Gloria, and that storm paled by comparison.  So, please, be safe out there, stay away from the ocean, stay inside, and ride this sucker out as safely as you can!

UPDATE: I just built this Turbo Sandy Twitter list, which follows a variety of media and government sources, including the Weather Channel, NASA, FEMA, and others.

Written by turbotodd

October 29, 2012 at 2:57 pm

Live @ Information On Demand 2012: Craig Rhinehart On Predictive Healthcare


I made it back to Austin late last night, mostly no worse for the wear.

There were a number of key announcements made at Information On Demand 2012 over the course of the past few days in Las Vegas.

One of those that I mentioned in one of my keynote post summaries was IBM Patient Care and Insights, new analytics software based on innovations from IBM Labs that helps healthcare organizations improve patient care and lower operational costs by considering the specific health history of each individual patient.

This is a fascinating new capability with profound implications for healthcare providers.

The new IBM solution provides the core capabilities for devising predictive models of various health conditions, which can be used to identify early intervention opportunities to improve the patient’s outlook by minimizing or avoiding potential health problems.

It features advanced analytics and care management capabilities to help identify early intervention opportunities and coordinate patient care.

Providing Individualized Care

At the core of IBM Patient Care and Insights, developed by IBM’s software, research and services teams, are similarity analytics that help drive smart, individualized care delivery.

Born in IBM Research, IBM similarity analytics is a set of core capabilities and algorithms that allow healthcare professionals to examine thousands of patient characteristics at once — including demographic, social, clinical and financial factors along with unstructured data such as physicians’ notes — to generate personalized evidence and insights, and then provide care according to personalized treatment plans.

By way of example, physicians can make personalized recommendations to improve a patient’s outcome by finding other patients with similar clinical characteristics to see what treatments were most effective or what complications they may have encountered.

They can also perform patient-physician matching so an individual is paired with a doctor that is optimal for a specific condition. With this solution, caregivers can better tap into the collective memory of the care delivery system to uncover new levels of tailored insight or “early identifiers” from historical/long term patient data that enable doctors and others to help manage a patient’s healthcare needs well into the future.
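
Press materials don't spell out the underlying math, but the general shape of patient-similarity retrieval is easy to illustrate. The short Python sketch below is my own toy rendering of the idea, not IBM's implementation: the patient attributes and feature names are hypothetical, and a real system would normalize and weight clinical factors and fold in unstructured data like physicians' notes.

# Toy sketch of patient-similarity retrieval (illustrative only; not IBM's
# Patient Care and Insights). Feature names and values are hypothetical, and a
# real system would normalize and weight features before comparing them.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Patient:
    patient_id: str
    features: dict  # e.g. {"age": 64, "bmi": 31.2, "a1c": 7.8, "smoker": 1}

def cosine_similarity(a: dict, b: dict) -> float:
    """Cosine similarity over the union of feature keys (missing values count as 0)."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def most_similar(target: Patient, cohort: list, k: int = 5) -> list:
    """Rank the cohort by similarity to the target patient and return the top k."""
    scored = [(p.patient_id, cosine_similarity(target.features, p.features))
              for p in cohort if p.patient_id != target.patient_id]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

if __name__ == "__main__":
    target = Patient("p-001", {"age": 64, "bmi": 31.2, "a1c": 7.8, "smoker": 1})
    cohort = [
        Patient("p-002", {"age": 61, "bmi": 30.0, "a1c": 8.1, "smoker": 1}),
        Patient("p-003", {"age": 35, "bmi": 22.4, "a1c": 5.2, "smoker": 0}),
    ]
    # The closest match suggests which prior treatments and outcomes to review first.
    print(most_similar(target, cohort, k=2))

In the actual product the comparison spans thousands of characteristics, but the retrieval idea is the same: find the patients most like this one, then look at what worked for them.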

Craig Rhinehart, director for IBM's ECM Strategy and Market Development organization, sat down with Scott Laningham and me earlier this week to describe the challenges facing health care, and how IBM Patient Care and Insights can help improve it by delivering dynamic, case-based, patient-centric electronic care plans and population analysis.

Go here for more information on IBM Patient Care and Insights and IBM Intelligent Investigation Manager.

Live @ Information On Demand 2012: Watson’s Next Job


As I mentioned in my last post, yesterday was day 3 of Information On Demand 2012 here in Las Vegas.

There was LOTS going on out here in the West.

We started the day by interviewing keynote speaker Nate Silver (see previous post) just prior to his going on stage for the morning general session. Really fascinating interview, and going into it I learned that his book had reached #8 on The New York Times bestseller list.

In the IOD 2012 day 3 general session, IBM Fellow Rob High explains how IBM's Watson technology may soon help drive down call center costs by 50 percent, using Watson's intelligence engine to help customer service reps respond faster to customer queries.

So congrats, Nate, and thanks again for a scintillating interview.

During the morning session, we also heard from IBM's own Craig Rhinehart about the opportunity for achieving better efficiencies in health care using enterprise content management solutions from IBM.

I nearly choked when Craig explained that thirty cents out of every dollar spent on healthcare in the U.S. is wasted, and that despite spending more than any other country, the U.S. ranks 37th in terms of care.

Craig explained the IBM Patient Care and Insights tool was intended to bring advanced analytics out of the lab and into the hospital to help start driving down some of those costs, and more importantly, to help save lives.

We also heard from IBM Fellow and CTO of IBM Watson Solutions’ organization, Rob High, about some of the recent advancements made on the Watson front.

High explained the distinction between programmatic and cognitive computing, the latter being the direction computing is now taking, and an approach that provides for much more “discoverability” even as it’s more probabilistic in nature.

High walked through a fascinating call center demonstration, whereby Watson helped a call center agent more quickly respond to a customer query by filtering through thousands of possible answers in a few seconds, then honing in on the ones most likely to answer the customer's question.
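
To give a rough feel for what filtering thousands of candidate answers down to a likely few looks like, here is a toy candidate-ranking sketch in Python. It uses simple word overlap as a stand-in for evidence scoring and is in no way Watson's actual DeepQA pipeline; the query and candidate answers are invented.

# Toy sketch of candidate-answer ranking for a call center scenario. Word overlap
# stands in for real evidence scoring; this is not Watson's DeepQA pipeline.
import re

def tokens(text: str) -> set:
    """Lowercase word tokens with punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def score(query: str, candidate: str) -> float:
    """Fraction of query words that also appear in the candidate answer."""
    q = tokens(query)
    return len(q & tokens(candidate)) / len(q) if q else 0.0

def top_answers(query: str, candidates: list, k: int = 3) -> list:
    """Return the k candidates most likely to answer the query, with scores."""
    ranked = sorted(((c, score(query, c)) for c in candidates),
                    key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

if __name__ == "__main__":
    query = "How do I reset my router password?"
    candidates = [
        "To reset your router password, hold the reset button for ten seconds.",
        "Our store hours are 9 to 5 on weekdays.",
        "You can change your account password from the billing portal.",
    ]
    for answer, s in top_answers(query, candidates, k=2):
        print(f"{s:.2f}  {answer}")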

Next, we heard from Jeff Jonas, IBM's entity analytics “Ironman” (Jeff also just completed his 27th Ironman triathlon last weekend), who explained his latest technology, context accumulation.

Jeff observed that context accumulation was the “incremental process of integrating new observations with previous ones.”

Or, in other words, developing a better understanding of something by taking into account more of the things around it.

Too often, Jeff suggested, analytics has been done in isolation, but that “the future of Big Data is the diverse integration of data” where “data finds data.”

His new method allows for self-correction and a high tolerance for disagreement, confusion and uncertainty, and it lets new observations “reverse earlier assertions.”

For now, he’s calling the technology “G2,” and explains it as a “general purpose context accumulating engine.”
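
Jeff's definition lends itself to a small illustration. The sketch below is my own toy rendering of context accumulation, not the G2 engine itself: each new observation about an entity is merged with what is already known, and a later, higher-confidence observation can reverse an earlier assertion.

# Toy illustration of context accumulation ("data finds data"). This is my own
# sketch of the concept, not Jeff Jonas's G2 engine.
from collections import defaultdict

class ContextAccumulator:
    def __init__(self):
        # entity_id -> attribute -> (value, confidence)
        self.entities = defaultdict(dict)

    def observe(self, entity_id: str, attribute: str, value, confidence: float):
        """Integrate a new observation with what is already known.
        A higher-confidence observation can reverse an earlier assertion."""
        current = self.entities[entity_id].get(attribute)
        if current is None or confidence >= current[1]:
            self.entities[entity_id][attribute] = (value, confidence)

    def context(self, entity_id: str) -> dict:
        """Return the accumulated picture of an entity so far."""
        return {k: v for k, (v, _) in self.entities[entity_id].items()}

if __name__ == "__main__":
    acc = ContextAccumulator()
    acc.observe("person-42", "city", "Austin", confidence=0.6)      # early assertion
    acc.observe("person-42", "employer", "IBM", confidence=0.9)
    acc.observe("person-42", "city", "New York", confidence=0.95)   # reverses the earlier one
    print(acc.context("person-42"))  # {'city': 'New York', 'employer': 'IBM'}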

Of course, there was also the Nate Silver keynote, the capstone of yesterday's morning session, for which I'll refer you back to the interview Scott and I conducted to get a summary taste of all the ideas Nate discussed. Your best bet is to buy his book if you really want to understand where he thinks we need to take the promise of prediction.

Written by turbotodd

October 25, 2012 at 5:38 pm

Live @ Information On Demand 2012: A Q&A With Nate Silver On The Promise Of Prediction


Day 3 at Information On Demand 2012.

The suggestion to “Think Big” continued, so Scott Laningham and I sat down very early this morning with Nate Silver, blogger and author of the now New York Times bestseller, “The Signal and the Noise” (You can read the review of the book in the Times here).

Nate, who is a youngish 34, has become our leading statistician through his innovative analyses of political polling, but made his original name by building a widely acclaimed baseball statistical analysis system called “PECOTA.”

Today, Nate runs the award-winning political website FiveThirtyEight.com, which is now published in The New York Times and which has made Nate the public face of statistical analysis and political forecasting.

In his book, the full title of which is “The Signal and The Noise: Why Most Predictions Fail — But Some Don’t,” Silver explores how data-based predictions underpin a growing sector of critical fields, from political polling to weather forecasting to the stock market to chess to the war on terror.

In the book, Nate poses some key questions, including what kind of predictions can we trust, and are the “predicters” using reliable methods? Also, what sorts of things can, and cannot, be predicted?

In our conversation in the greenroom just prior to his keynote at Information On Demand 2012 earlier today, Scott and I probed along a number of these vectors, asking Nate about the importance of prediction in Big Data, statistical influence on sports and player predictions (a la “Moneyball”), how large organizations can improve their predictive capabilities, and much more.

It was a refreshing and eye-opening interview, and I hope you enjoy watching it as much as Scott and I enjoyed conducting it!

Live @ Information On Demand 2012: Big On Business Analytics


Day two of Information On Demand.

Note to self: Bring a hot water boiler next time. Check bathroom for Bengal tiger. Pack a vaporizer. And bring some 5 Hour Energy Drinks.

Oh, and be sure to wear comfortable shoes.

Today, I missed the general session, as I was in my room preparing a presentation and also tuning in to the Apple webcast where CEO Tim Cook announced the new iPad Mini, among other products.

IBM Business Analytics general manager Les Rechan explains to the audience how over 6,000 clients and prospects have now taken the “Analytics Quotient” quiz since it went live last year.

But I did make it down to the Business Analytics keynote, led by IBM Business Analytics general manager Les Rechan, and I was glad I did.

The session started with a motivating video featuring a number of IBM customers on the vanguard of using business analytics to improve their businesses.  When Les came onstage, he first highlighted several of IBM’s BA “Champions,” clients from around the globe who were in the “Advanced” category of business analytics.

Les' bird's-eye view centered on the Analytics Quotient, a self-assessment quiz IBM created and released for customers last year. About 70 percent of the 6,000+ respondents year-to-date indicated they are in the "novice" or "builder" categories, and only 30 percent in the "leader" or "master" categories.

Where IBM can help move the needle, Les pointed out, is through a variety of resources, including the Analytics Zone, as well as enablement services and training.

He also highlighted "5 Keys To Business Analytics Program Success," a recently published book that features a number of IBM business analytics customer success stories (written by the customers themselves!).

Over 70 percent of respondents to the IBM “Analytics Quotient” online exam find themselves in the “novice” or “builder” categories, indicating there’s plenty of upside yet in pursuing basic business analytics capabilities across a great diversity of organizations.

Michelle Mylot, the Business Analytics team's chief marketing officer, then came onstage and pointed out that the organizations that have integrated analytics into the fabric of their businesses are the ones that drive the most impact.

She highlighted a number of key areas around which IBM's business analytics team has been increasingly focused, including social network analysis, entity resolution, decision management, and operational analytics.

Doug Barton, whose interview I’m attaching below at the end of this post, came on stage and gave a brilliant presentation that should provide financial analysts everywhere (including CFOs and all their staffs) incentive to run directly to their nearest reseller and purchase Cognos Disclosure Management.

It’s difficult to describe a demo, but basically, Doug presented a scenario where a company was preparing to announce its earnings and, rather than operating from a plethora of disparate spreadsheets, he demonstrated how Cognos Disclosure Management could create a symphony of collaboration as a CFO prepared for a quarterly earnings call.

Isolated spreadsheets and PowerPoints became integrated narratives of the earnings statement, where an update in one part of the report would magically alter the performance graph in another.
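
The underlying idea, a single source of truth that every narrative and chart draws from, can be sketched in a few lines. The example below is only my illustration of that concept, not Cognos Disclosure Management; the report structure and figures are invented.

# Toy sketch of "update once, every view follows": one revenue figure feeds both
# the narrative and the chart data. Illustrative only; not Cognos Disclosure Management.
class EarningsReport:
    def __init__(self, revenue_by_quarter: dict):
        self.revenue = revenue_by_quarter  # single source of truth

    def update(self, quarter: str, value: float):
        """Restate a figure in one place; every derived view picks it up."""
        self.revenue[quarter] = value

    def narrative(self) -> str:
        latest = sorted(self.revenue)[-1]  # latest quarter (Q1..Q4 sort lexically)
        return f"Revenue for {latest} was ${self.revenue[latest]:,.0f}."

    def chart_series(self) -> list:
        return [self.revenue[q] for q in sorted(self.revenue)]

if __name__ == "__main__":
    report = EarningsReport({"Q1": 1_200_000, "Q2": 1_350_000, "Q3": 1_500_000})
    report.update("Q3", 1_575_000)   # the restated figure...
    print(report.narrative())        # ...shows up in the narrative
    print(report.chart_series())     # ...and in the chart data, with no copy-paste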

Pure financial geek magic. Doug, take it away in our Q&A below.

Think Big, iPad Small


It’s a big day in tech, all the way around.

We’ll continue our mission to “Think Big” here in Las Vegas at the IBM Information On Demand 2012 event.

We’ll also get a glimpse into how big the mobile market is becoming as Facebook announces its earnings after the bell later today.

But of course, one of the biggest stories of the day has to do with the downsizing of one of our favorite tablets, the Apple iPad.

Rumors abound about the new iPad "Mini," which I very much look forward to referring to as my "MiniMePad."

If you’re using an Apple device (including an AppleTV), you should be able to tune in to watch the announcement live starting at 10 AM PST.

If not, there will be no shortage of bloggers out there giving you the blow-by-blow.

Why am I so interested in the Mini iPad?

First, Apple set the bar for tablets with the original iPad, which I still use to this day.

Second, the smaller form factor is raising a lot of questions about price. Can Apple afford to take the price down from $499 to the $200 range, especially when its iPod Touch is still priced at $299? (That was the last time I looked; I can't check this morning, as the Apple store is down getting ready for the Mini introduction.)

I'd say the question is more: can they afford not to? Like the early browser wars, this is a market AND mindshare battle. iOS and Android are lined up for a full cage death match, and if Apple is to maintain its market share lead of 69.6% (as of Q2 2012), they're going to have to compete aggressively on price.

The new Nexus 7 and Kindle Fire HDs are coming in at under $200, and while I doubt that’s a price Apple can match, they’re going to have to strive to stay somewhat price competitive, figuring the Apple premium could be worth $100 per unit or so.

Third, the original iPad was the starting line of the shift away from desktop-centric technology, and as Microsoft attempts to come into this market with its Surface tablet, a key question emerges: Can Apple continue to entice productivity hounds away from the Microsoft ecosystem, despite the advent of the Surface, and stay price competitive in a burgeoning competitive market?

As for me, you might ask, will I buy one?  I’ll never say never. The iPad has become a full-on personal entertainment and productivity workhorse for me, an elegant blended use case of both the personal and the professional.

I watch movies on the thing, I use it for blogging and broadcasting, I play games, I do email, I read books, I hold conference calls.  There’s not a lot I can’t do on it.

So, I can easily justify the upgrade, and I'd love to get a faster iPad, but as with the original, I may wait for an initial software upgrade so Apple has the opportunity to work some of the kinks out.

Then again, I may not.

(Almost) Live @ Information On Demand 2012: A Q&A With IBM’s Jeff Jonas


Jeff Jonas sat down last evening with Scott and me in the Information On Demand 2012 Solutions EXPO to chat about privacy in the Big Data age, and also gave a sneak peek into the new "Context Accumulation" technology he's been working on.

You really ought to get to know IBM’s Jeff Jonas.

As chief scientist of the IBM Entity Analytics group and an IBM Distinguished Engineer, Jeff has been instrumental in driving the development of some ground-breaking technologies, during and prior to IBM’s acquisition of his company, Systems Research & Development (SRD), which Jonas founded in 1984.

SRD's portfolio included technology used by the surveillance intelligence arm of the gaming industry, which leveraged facial recognition to protect casinos from aggressive card-counting teams (never mind the great irony that IBM's Yuchun Lee was once upon a time one of those card counters — I think we need to have an onstage interview between those two someday, and I volunteer to conduct it!).

Today, possibly half the casinos in the world use technology created by Jonas and his SRD team, work frequently featured on the Discovery Channel, Learning Channel, and the Travel Channel.

Following an investment in 2001 by In-Q-Tel, the venture capital arm of the CIA, SRD also played a role in America’s national security and counterterrorism mission. One such contribution includes a unique analysis of the connections between the 9/11 terrorists.

This "link analysis" is so unique that it is taught in universities and has been widely cited by think tanks and the media, including an extensive one-on-one interview with Peter Jennings for ABC PrimeTime.

Following IBM's acquisition of SRD, these Jonas-inspired innovations continue to create big impacts on society, including the arrest of over 150 child pornographers and the prevention of a national security risk posed against a significant American sporting event.

This technology also assisted in the reunification of over 100 loved ones separated by Hurricane Katrina and at the same time was used to prevent known sexual offenders from being co-located with children in emergency relocation facilities.

Jonas is also somewhat unique as a technologist in that he frequently engages with those in the privacy and civil liberties community. The essential question: How can government protect its citizens while preventing the erosion of long-held freedoms like the Fourth Amendment? With privacy in mind, Jonas invented software which enables organizations to discover records of common interest (e.g., identities) without the transfer of any privacy-invading content.
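
The details of Jonas's software are his own, but the general concept of discovering records in common without exchanging the underlying personal data can be sketched with salted one-way hashes. The example below is purely illustrative, with invented records and a hypothetical shared salt; it is not the anonymization technology Jonas actually built.

# Minimal sketch of privacy-preserving record matching: two parties discover
# identities they have in common by exchanging salted one-way hashes instead of
# the underlying personal data. Illustrative only; not Jonas's actual software.
import hashlib

SHARED_SALT = b"agreed-upon-secret-salt"  # hypothetical value both parties know

def anonymize(record: str) -> str:
    """Return a salted SHA-256 digest of a normalized identity record."""
    normalized = record.strip().lower().encode("utf-8")
    return hashlib.sha256(SHARED_SALT + normalized).hexdigest()

def records_in_common(mine: list, theirs_hashed: set) -> list:
    """Find my records whose hashes appear in the other party's hashed set."""
    return [r for r in mine if anonymize(r) in theirs_hashed]

if __name__ == "__main__":
    agency_a = ["jane doe|1970-01-01", "john smith|1985-06-15"]
    agency_b_hashed = {anonymize("john smith|1985-06-15"),
                       anonymize("alice jones|1990-12-02")}
    print(records_in_common(agency_a, agency_b_hashed))  # ['john smith|1985-06-15']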

That’s about where we start this interview with Jeff Jonas, so I’ll let Scott and myself take it from there…
