Archive for August 2011
IBM To Acquire Intelligence Analytics Firm i2
IBM today announced a definitive agreement to acquire i2 to accelerate its business analytics initiatives and help clients in the public and private sectors address crime, fraud and security threats.
Financial terms were not disclosed.
With more than 4,500 customers in 150 countries, i2 is a leading provider of intelligence analytics for crime and fraud prevention, based in Cambridge, UK, with U.S. headquarters in McLean, Va. i2’s clients span multiple sectors globally, including banking, defense, health care, insurance, law enforcement, national security and retail. i2 solutions are currently used by 12 of the top 20 retail banks globally and eight of the 10 largest companies in the world.
Organizations in both the public and private sectors today are facing an exponential increase in “big data” — information and intelligence coming from disparate and unstructured sources including social media, biometrics and criminal databases.
When it is accessible to the people who need it, this information can be used to anticipate potential problems, make better, faster decisions, and coordinate resources to deliver exceptional service to citizens and customers.
This acquisition will extend IBM’s leadership in helping clients harness this data through the addition of i2’s intelligence analysis and tactical lead generation capabilities, which help cities, nations, international bodies and private enterprises combat fraud and security threats.
Fraud and security threat reduction is of strategic importance across all industries today. According to FBI statistics, in 2009 there were 1.3 million violent crimes, 9.3 million property crimes and 6.3 million larceny/thefts in the United States. Consumer-facing fraud for retailers alone costs $100 billion a year in the United States.
In response to these threats, businesses and government agencies are seeking to empower public safety officers, analysts, managers, detectives and investigators with industry-specific intelligence analysis and operational insight solutions. For example, government agencies today are concerned with increased threats to public safety, which are driving the need for secure information sharing.
Using IBM real-time analytical solutions in combination with the technologies of i2, public agencies and private enterprises battling fraud will now have the capability to better collect, analyze and process all the relevant data at their disposal. In the past, data overload often led to critical information or opportunities being missed.
Now local, state, and federal authorities can harness new intelligence to instantly detect and respond to security threats. Investigative leads can be identified quickly, helping government agencies solve crimes faster and keep officers and communities safer.
With IBM and i2, clients will have access to a comprehensive range of visualization and multidimensional analytics for the timely delivery of intelligence, including threat and fraud analytics. These tools will help analysts quickly identify connections, patterns and trends in complex data sets and easily model data the way they think, all in a single environment, yielding faster analysis results, strategic reports and bulletins.
i2 will be integrated into IBM’s Software Group.
Headquartered in the UK, i2 has 350 employees and additional offices in McLean, Va.; Tucson, Ariz.; Ottawa, Ontario; and Canberra, Australia. The acquisition is anticipated to close in the fourth quarter of 2011, subject to the satisfaction of customary closing conditions and applicable regulatory reviews.
Don’t Let Your Business Become A Disaster
This year seems as though it’s been nothing but a series of disasters.
Literally.
The Japan earthquake and tsunami. A horrific season of tornadoes across the South and Midwest. A staggering drought throughout Texas, where agricultural losses are upwards of $3B. And our most recent friend, Hurricane Irene, which visited devastation up the mid-Atlantic and, incredibly, left Vermont and Connecticut more damaged than anyone would have estimated.

IBM recently announced six tips that individuals and businesses can use to help prepare their IT environments for natural disasters and a wide range of other threats.
It just goes to show, you can never be ready enough for acts of God.
That includes individuals and businesses that depend on their IT environments to conduct business and ensure continuity through one of these disasters.
In preparation for Irene, we saw many people in high risk areas rushing around to buy emergency supplies like flashlights, water, and wood to board up their houses. But how many considered the preparedness of businesses and government agencies?
Given these impending natural disasters, and other top causes of disruption like power outages and network failures that cut off the flow of information, businesses and individuals should also assess their business continuity and disaster recovery plans in advance of a disaster scenario, when things are calmer and they can focus on sensible risk mitigation.
In today’s on demand environment, it’s critically important to rapidly adapt and respond to risks, as well as opportunities, to maintain continuous access to data for personal and business reasons.
IBM recently offered up a few tips on disaster preparedness:
- Validate your data backup plan – Verify that your data is out of harm’s way and/or is accessible to your recovery location. Consider using a cloud service to store key data and allow your organization more flexibility to respond to changing conditions with minimal interruption to the business. (A minimal sketch of this kind of automated backup check follows the list.)
- Consider employees and the personal impact of a disaster – A company’s most important asset is its people, but people’s most important asset is their families. Consider how you would move them and their families if required, think about providing financial support to your employees during a crisis event, and consider offering counseling to help them deal with the aftermath of the crisis.
- Develop various ways to communicate with employees and partners – After people, the next most important element is communication. Communications efforts must be timely, clear and honest, as miscommunication can make a disaster even worse. Consider how you would communicate with your employees, partners, clients, media and industry (and how they would reach you), what training you have provided, what tools you are using, and — very important — test the communications plan.
- Think about the “domino effect” when considering business risk – Years of experience monitoring regional disasters have shown that these events often create other events. For example, a hurricane normally brings high winds and heavy rains that can lead to flooding, structural damage, power outages, and telecommunication and/or travel disruptions.
- Plan for catastrophic events that could last a while – Businesses must consider the impact if the disruption to a facility, network, technology, or people lasts longer than three days, a week, or more. Over the past decade, we have seen more devastating disaster events with longer durations and greater financial impact. Companies need to consider their options if their primary environment or key people are not available for more than two weeks.
- Think broadly – Each company is part of a supply chain or network. While you may do everything right, if you have a critical partner, supplier, vendor or provider of service, your preparedness is only as good as those other businesses. As part of your disaster recovery plan, ensure everyone upstream and downstream from your business is also prepared.
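To make that first tip a little more concrete, here is a minimal sketch, in Python, of the kind of automated check a small shop might schedule against an offsite or cloud-mounted backup location. The directory path, age threshold, and the decision to read every file end-to-end are my own illustrative assumptions, not an IBM recommendation; a real setup would use whatever API your backup or cloud provider actually exposes.

```python
import hashlib
import os
import time

# Hypothetical settings -- adjust to your own environment.
BACKUP_DIR = "/mnt/offsite-backups"   # mounted cloud or remote backup location
MAX_AGE_HOURS = 24                    # a backup older than this fails the check

def verify_backups(backup_dir=BACKUP_DIR, max_age_hours=MAX_AGE_HOURS):
    """Return a list of problems found with the backups in backup_dir."""
    if not os.path.isdir(backup_dir):
        return [f"backup location {backup_dir} is not reachable"]
    problems = []
    now = time.time()
    for name in sorted(os.listdir(backup_dir)):
        path = os.path.join(backup_dir, name)
        age_hours = (now - os.path.getmtime(path)) / 3600
        if age_hours > max_age_hours:
            problems.append(f"{name} is {age_hours:.0f} hours old")
        try:
            # Read the file end-to-end; an I/O error here means the copy is unreadable.
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
        except OSError as exc:
            problems.append(f"{name} could not be read: {exc}")
    return problems

if __name__ == "__main__":
    issues = verify_backups()
    print("Backups look OK" if not issues else "\n".join(issues))
```

Scheduled nightly (cron, Task Scheduler, or similar), a check like this is the difference between assuming your backups work and knowing they are recent and readable.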
With more than 40 years of experience keeping businesses up and running, IBM uses its software, hardware and services expertise to help clients and individuals across the globe protect their data.
IBM helps them to manage risks, protect valuable business assets, comply with standards and regulations, and continue business operations.
“People and businesses are relying on technology now more than ever, which creates an urgent need to protect critical data and keep IT systems up and running when a natural disaster or other unexpected outage occurs,” said Rick Ruiz, general manager of IBM’s Business Continuity and Resiliency Services. “In these situations, it’s clear that those who have moved from the old model of ‘experience and react’ to a new one of ‘anticipate and adjust’ will fare much better.”
Visit this site to learn more about IBM’s Disaster Recovery Services.
Watson Redux
If you missed your chance to watch the competition aired nationally in North America this past February between IBM and America’s favorite quiz show Jeopardy!, fear not: IBM announced today that Jeopardy! will broadcast an encore presentation of the first-ever man vs. machine Jeopardy! competition between IBM’s “Watson” computing system and the show’s two greatest contestants – Ken Jennings and Brad Rutter.
Millions of North American viewers will be able to again witness TV history as Watson successfully competes against two human champions in two matches played over three consecutive days, September 12, 13, and 14, 2011.
(Spoiler Alert: developerWorks’ Scott Laningham and I interviewed the principal investigator and project lead of the Watson effort, Dr. David Ferrucci, during this year’s SXSW Interactive conference in March. Do NOT watch the video Q&A below if you haven’t yet seen the broadcast or re-broadcast and don’t want to spoil the ending! In the interview, Ferrucci explains in some detail the AI methods behind Watson’s madness.)
“With the Jeopardy! challenge, we accomplished what was thought to be impossible – building a computer system that operates in the near limitless, ambiguous and highly contextual realm of human language and knowledge,” said Dr. David Ferrucci, IBM Fellow and scientist leading the IBM Research team that created Watson. “Watching the match again reminds us of the great power and potential behind Watson to be able to make sense of the massive amounts of data around us and to solve problems in new ways.”
Six months after the original competition, Watson’s Deep Question Answering (QA) technology has already driven progress in new fields such as the healthcare industry. IBM is working with Nuance Communications, Inc. to explore and develop applications to help critical decision makers, such as physicians and nurses, process large volumes of health information in order to deliver quicker and more accurate patient diagnoses. Working with universities and clients, IBM is identifying many potential uses for Watson’s underlying QA technology.
The technology underlying Watson analyzes the structure and wording of the question or challenge being investigated and formulates the answer in which it has the highest level of ‘confidence.’ Watson answers ‘natural language’ questions, which can contain puns, slang, jargon and acronyms that must all be evaluated as part of Watson’s confidence in returning an answer.
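IBM doesn’t spell out the scoring pipeline here, and Watson’s real DeepQA system combines hundreds of evidence scorers with machine-learned weights, but the basic idea of ranking candidate answers by confidence and committing only when that confidence clears a threshold can be sketched in a few lines of Python. The scorer names, weights, and threshold below are purely illustrative, not Watson’s actual parameters.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    answer: str
    scores: dict   # evidence scores from independent scorers, each in [0, 1]

def confidence(candidate, weights):
    """Combine per-scorer evidence into one confidence value (illustrative weighted average)."""
    total = sum(weights.values())
    return sum(w * candidate.scores.get(name, 0.0) for name, w in weights.items()) / total

def answer_clue(candidates, weights, buzz_threshold=0.5):
    """Rank candidates by confidence and 'buzz in' only if the best one clears the threshold."""
    best = max(candidates, key=lambda c: confidence(c, weights))
    best_conf = confidence(best, weights)
    return (best.answer, best_conf) if best_conf >= buzz_threshold else (None, best_conf)

# Toy clue with two candidate answers scored by hypothetical evidence scorers.
weights = {"passage_support": 0.5, "type_match": 0.3, "popularity": 0.2}
candidates = [
    Candidate("Toronto", {"passage_support": 0.2, "type_match": 0.4, "popularity": 0.6}),
    Candidate("Chicago", {"passage_support": 0.8, "type_match": 0.9, "popularity": 0.7}),
]
print(answer_clue(candidates, weights))   # -> ('Chicago', ~0.81)
```

The low-confidence case matters as much as the high-confidence one: declining to answer is how a system like this avoids buzzing in on clues it doesn’t actually understand.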

The Watson v. Jeopardy! man v. machine contest, featuring Jeopardy! champions Ken Jennings and Brad Rutter, will be re-broadcast in North America in mid-September.
“We recognized the Jeopardy! IBM Challenge was not only a historic moment for television, but also for scientific discovery and innovation,” said Harry Friedman, executive producer of Jeopardy! “We wanted to provide the opportunity for more viewers to once again enjoy this ground-breaking exhibition match.”
IBM and the other contestants gave $1.25 million to charity, with $1 million coming from IBM.
What is Watson?
Watson, named after IBM founder Thomas J. Watson, is a breakthrough human achievement in the scientific field of Question Answering, also known as “QA.” The Watson software is powered by an IBM POWER7 server optimized to handle the massive number of tasks that Watson must perform at rapid speeds to analyze complex language and answer questions posed in natural language with speed, accuracy and confidence.
Beyond providing correct responses, Watson had to analyze Jeopardy! clues that involved subtle meaning, irony, riddles, and other complexities at which humans excel and computers traditionally do not. The system incorporates a number of proprietary technologies for the specialized demands of processing an enormous number of concurrent tasks and data while analyzing information in real time.
You can learn more about the Watson research initiative here.
New developerWorks Podcast: Steve Jobs, HP, Motorola, Turbo’s 20th
This has been a crazy Friday, so I didn’t have much time to blog.
But Scott Laningham and I were able to cut our first developerWorks “videopodcast,” in which we covered some of the major recent IT and tech news, including the announcement of Steve Jobs’ resignation (I apologize in advance for saying his name both ways!), the HP/Autonomy deal, Google/Motorola, and even a few bits on my 20th anniversary with Big Blue.
For those of you in the path of Irene, please be safe and heed all the warnings of your public officials. We’ll be thinking about you all along the East Coast down here in drought-laden Texas. We need some rain, but we prefer it not come in the form of a hurricane (although I’m sure some farmers in South Texas might argue with me about now).
Here in Austin, the forecast has us at around 109 degrees Fahrenheit tomorrow. Yikes!
The Legacy Of Steve Jobs
Minds greater than mine will write the eloquent and fitting tributes to Steve Jobs’ reign as CEO of Apple, both as co-founder and as the Renaissance CEO king who could do no wrong.
Me, I’m simply stunned at the suddenness of the announcement.
We all knew this day would soon arrive, but having watched IBM and Apple be partner, competitor, and “co-opetor” during my own twenty-year tenure at Big Blue, many of us also perhaps came to think of Steve Jobs as invincible.
While it would be easy to sit back and write plaudits and wonderful things about Jobs as a business leader and innovator, it’s easier still to reminisce about the impact his tools and technologies have had on me personally.
I first used a Mac during one of my first real office jobs in college, using Pagemaker on a Mac SE to put together technical journals and even an underground newspaper. Back then, a portable computer meant carrying your heavy SE down to the local watering hole by hand.
Later, of course, came the first Mac I bought, the iMac, after having been enslaved on Wintel machines for much of my work experience, and later a range of Apple products, from iPods to MacBooks to the first iPad….
What always distinguished the Mac for me was that they mostly just worked. If I compare the countless hours I spent tuning Microsoft Windows-based machines, going into control panels and registries and heaven knows where else I didn’t belong, poking around just trying to get the things to run… well, with Apple machines I just did my work.
And that continues to hold true today.
Either I could focus on the work, or I could focus on the technology.
That fact alone may have been a key contributor to Apple’s now prominent economic position in the industry.
With Jobs leaving as CEO, of course, it raises the question as to whether or not that legacy will continue.
The pipeline of Jobs-influenced products can only last so long. Can former COO and now CEO Tim Cook lead Apple to the new promised land?
I guess that depends on how much you think Apple has become a cult of personality (of Steve Jobs), or one more traditional in nature.
I’ve had friends who’ve worked at Apple who were pretty convincing that Jobs personally made a lot of decisions at the company, decisions that in a more traditionally hierarchical organization would have been delegated, with the seemingly less critical ones pushed further down into the organization.
No matter your belief, there’s no arguing about Jobs’ impact on not only the tech industry, but the media and entertainment industries, operating systems development, publishing, and others as well.
I don’t know how ill Jobs is or how much time he has left with us (none of us knows that about ourselves, for that matter), but I can say that, with the time he had, Jobs seems to have found his passion, made the most of it, and changed the world in the process.
And for that, we can all be thankful.
He set a high standard for himself and for everyone around him.
In so doing, he forced the rest of the industries he impacted to up their game, bigtime.
Sometimes they won, sometimes they failed, but they were always better off for having been stretched by Apple to try and do their very best, just as Apple had.
That, in the end, may be the most crucial of Steve Jobs’ legacies: always reaching for that next precipice to bring things to the world that people didn’t even know they needed, and making them better and easier to use all the while.
When It Rains, It Pours
Considering that the city of Austin and much of Texas have not gotten very much rain this year, it’s somewhat ironic that IBM and the University of Texas at Austin are announcing today applied advanced analytics solutions for river systems that can help with flood prevention and preparedness.
Floods are the most common natural disaster in the United States, but traditional flood prediction methods focus only on the main stems of the largest rivers — overlooking the extensive tributary networks where flooding actually starts, and where flash floods threaten lives and property.
IBM researchers and researchers from UT Austin are using these analytics to predict the Guadalupe River’s behavior at more than 100 times normal speed.
IBM’s new flood prediction technology can simulate tens of thousands of river branches at a time, and could scale further to predict the behavior of millions of branches simultaneously.
By coupling analytics software with advanced weather simulation models, such as IBM’s Deep Thunder, municipalities and disaster response teams could make emergency plans and pinpoint potential flood areas on a river.
As a testing ground, the team is presently applying the model to the entire 230-mile-long Guadalupe River and over 9,000 miles of its tributaries in Texas. In a single hour, the system can currently generate up to 100 hours of river behavior!
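The announcement doesn’t describe the model’s equations, and the UT/IBM system solves far richer river physics than this, but the structure that makes branch-by-branch simulation scale is easy to sketch: represent each river reach as a node that drains into one downstream reach, then step every reach forward in time. The toy linear-reservoir routing below, with made-up reach names and coefficients, is my own illustration of that shape, not the actual model.

```python
import collections

# Toy model: each river reach drains into exactly one downstream reach (or none).
# Real hydraulic models solve much richer equations; this sketch only shows how
# thousands of branches can be stepped independently each hour and then combined
# along the network.

Reach = collections.namedtuple("Reach", "name downstream k")   # k: fraction drained per hour

def step_network(reaches, storage, inflow, dt=1.0):
    """Advance every reach one time step of dt hours; return updated storage volumes."""
    new_storage = {}
    routed = collections.defaultdict(float)
    for r in reaches:
        outflow = storage[r.name] * r.k * dt                    # water leaving this reach
        new_storage[r.name] = storage[r.name] + inflow.get(r.name, 0.0) * dt - outflow
        if r.downstream is not None:
            routed[r.downstream] += outflow                     # route it downstream
    for name, volume in routed.items():
        new_storage[name] += volume
    return new_storage

# Two tributaries feeding a main stem (names and coefficients are made up).
reaches = [
    Reach("tributary_a", "main_stem", 0.3),
    Reach("tributary_b", "main_stem", 0.2),
    Reach("main_stem", None, 0.1),
]
storage = {r.name: 0.0 for r in reaches}
runoff = {"tributary_a": 5.0, "tributary_b": 2.0}               # rainfall runoff per hour

for hour in range(24):                                          # simulate one day
    storage = step_network(reaches, storage, runoff)
print({name: round(vol, 1) for name, vol in storage.items()})
```

Because each reach only needs its own state plus what its upstream neighbors send it, this kind of computation spreads naturally across many processors, which is what lets a system race through 100 hours of river behavior in a single wall-clock hour.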
“Combining IBM’s complex system modeling with our research into river physics, we’ve developed new ways to look at an old problem,” said Ben Hodges, Associate Professor at the UT Austin Center for Research in Water Resources. “Unlike previous methods, the IBM approach scales up for massive networks and has the potential to simulate millions of river miles at once. With the use of river sensors integrated into web-based information systems, we can take this model even further.”
Speed on this scale is a significant advantage for smaller scale river problems, such as urban and suburban flash flooding caused by severe thunderstorms.
Within the emergency response network in Austin, Texas, professors from the University of Texas at Austin are linking the river model directly to NEXRAD radar precipitation data to better predict flood risk on a creek-by-creek basis.
In addition to flood prediction, a similar system could be used for irrigation management, helping to create equitable irrigation plans and ensure compliance with habitat conservation efforts.
The models could allow managers to evaluate multiple “what if” scenarios to create better plans for handling both droughts and water surplus.
The project is currently being run on IBM POWER7 systems, which accelerate the simulation and prediction significantly, allowing for additional disaster prevention and emergency response preparation.
Old New Toys
I’ve been following this whole HP TouchPad fire sale with much amusement.
I stopped by my friendly Best Buy late last week to take a stroll and try to avoid temptation (it’s a willpower thing) to buy something, anything.
The HP TouchPads were sitting on their pedestal at the end of the aisle, all lonely, glancing longingly across the aisle at the iPad 2s, which had actual humans picking them up and playing with them.
The whole thing reminded me of “Toy Story 3,” where the old toys never get played with by the kids. Only in this saga, the old toys were the new toys, and the new toys old, and it was the old new toys getting played with and not the new old toys!
Then HP announces its decision to sell off its PC unit (wait a minute, didn’t we do that back in 2005??) and slashes the price on the HP TouchPad, a liquidation event of the HP Way kind, offering up 16GB TouchPads for a bargain-basement $99!
So then the market, with completely rational irrationality, goes nutso, and the next thing you know, HP TouchPads are selling on eBay for upwards of $300!
Mon dieu, I love this industry. And people wonder why I’ve stuck around here for 20 years? There’s never a dull moment!
Although I have to say, I’m not completely in love with the fact that HP’s leaving the PC biz.
I bought one of their computers last year. I’m an equal opportunity PC purchaser. I own an Acer netbook, a MacBook and MacBook Pro, an IBM ThinkPad, the HP Pavilion, and this Dell Latitude that I’ve been using recently and am really digging.
I bought the HP ‘cause I loved the keyboard – it felt just like the MacBook keyboard, only without the MacBook price. Hey, when you write a lot, keyboards matter.
I think I got the HP Pavilion at Office Depot at a good price, too, around $550 (with rebate). Now I’m wondering if I can put it up for sale on eBay for $1,000, and see if I can’t tap into some of that HP sentimentality!
Of course, I paid some beaucoup bucks for my first gen iPad back in April of 2010. And I didn’t even have good reason to buy the thing – I just gave in to temptation. But after over a year’s use now, and having traveled the globe with the thing, I have to say, I’m a pretty happy iPad camper.
I’ve used it for everything from reading books and magazines and newspapers online (my primary use), to playing video games, to watching Netflix, to writing blog posts. Tablet computing’s time has come, although if you forced me to admit it, I’d explain that I really do miss the mouse while using an iPad.
Someone still needs to build the better mouse for tablets!
In the meantime, I’m going to share soon the fruits of another new technology experience I’ve had, that of working with Nuance’s Dragon Dictation 11 software.
People have warned me for years about the day I would start talking to my computer. I’m here to tell you, that day has come — and it’s not pretty.
The moment it starts talking back, I’ll know I’m in trouble.
20 Years @ Big Blue
Happy Anniversary to me!
Today, I celebrate my 20th year with the IBM Company.
When I tell people I’ve been here this long, they just shake their heads. People just don’t do that anymore!
It’s been a wild and amazing ride, and the interesting thing is, it only gets more interesting. I can honestly say that I’ve said, at any number of points in my career, “it only gets more interesting.”
These days, it’s the opportunity to further explore the outer reaches of search marketing, customer response management online, and social media intelligence.
When I started at IBM’s Southlake facility, near the DFW airport, on August 19, 1991, it was desktop publishing.
In between, it was OS/2 v. Windows, the early commercialization of the Internet, the Y2K threat, IBM’s own transformation into an e-business, the rise of Linux and open computing, and so much more.
The day I started at IBM was the same day that Boris Yeltsin stood on the tanks outside the Russian White House, in defiance of the coup plotters. But instead of getting the news from my iPad, I got it from a printed edition of The New York Times.
When I started work here, I was 25 years old and greener than Augusta National golf course. I remember them telling me I had to talk to people on the phone: What was I going to say??
My computer was a PS/2 workstation when I started, running my beloved OS/2, but a lot of our work was done via the mainframe green screen (VM!). I sometimes miss those character-based terminals. They weren’t always pretty, but they were FAST, and they got the job done (which, for me, at the time was as a writer and editor of several IBM magazines).
I still remember putting up my first Website. I was not then, and still am not, a programmer, but I taught myself HTML so I could publish our magazine Software Quarterly on the World Wide Web. Nobody knew what that was at the time, but that didn’t stop me.
During my tenure, I’ve visited cities and countries that I never envisioned I would ever see in person, and in the process I’ve gained a greater understanding of the world and our collective humanity.
I’ve also witnessed dramatic evolutions in the conditions of the IBM business, in the use of our technology to solve real-world business problems, and in our communications and marketing.
When I first joined, IBM was talking to the world about building solutions for a smaller planet.
Now, responding to the challenging business conditions and the unique opportunities a smaller, networked world presents, we’re talking about a smarter planet instead.
That’s a perfect distillation of my past twenty years with Big Blue — my own world has become much smaller and much smarter.
And that, I can assure you, is because of the gift of having had the opportunity to work with some of the most talented people around on this smaller and smarter world.
Because if you think it’s a small world outside Big Blue, you should see what life is like inside IBM after twenty years.
You find yourself working with people for a while, then moving on and working with another group of people, only to find yourself, years later, working with someone you’ve worked with before, and this time it’s like you never missed a beat.
I think maybe we should start referring to the company as “Small Blue” instead.
No matter the moniker, it’s the rare opportunity a human gets to do work that one loves in collaboration with people whom one greatly admires in an effort to literally change the way the world does its business, and all while having the chance to travel to the nether regions of our smaller and smarter planet.
To all of you inside and outside “Small Blue” who have played a part in my 20-year journey thus far, in this, IBM’s own centennial year, I just wanted to take this quick opportunity to say “Thanks!”
Or should I say, T-H-I-N-K. ; )
Brain Man
Exciting news from IBM Research today.
The researchers there unveiled a new generation of experimental computer chips designed to emulate the brain’s abilities for perception, action and cognition.

This image provides a map of the brain's network connections, which the new IBM SyNAPSE processors are looking to emulate.
Such a technology could consume many orders of magnitude less power and space than today’s computers.
Tastes great, less filling!
This breakthrough is a sharp departure from more traditional concepts in designing and building computers. These first neurosynaptic (say that three times quickly) computing chips recreate the phenomena of spiking neurons and synapses in biological systems, such as the brain (well, most brains), through advanced algorithms and silicon circuitry.
The first two prototype chips have already been fabricated and are currently undergoing testing (no Frankenstein jokes, please).
Called “cognitive computers,” systems built with these chips won’t be programmed the same way traditional computers are today.
Rather, cognitive computers are expected to learn through experiences, find correlations, create hypotheses, and remember — and learn from — the outcomes, mimicking the brain’s structural and synaptic plasticity.
To do this, IBM is combining principles from nanoscience, neuroscience and supercomputing as part of a multi-year cognitive computing initiative.
Faster Synapses
The company and its university collaborators also announced they have been awarded approximately $21 million in new funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 2 of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project.
The goal of SyNAPSE is to create a system that not only analyzes complex information from multiple sensory modalities at once, but also dynamically rewires itself as it interacts with its environment, all while rivaling the brain’s compact size and low power usage.
“This is a major initiative to move beyond the von Neumann paradigm that has been ruling computer architecture for more than half a century,” said Dharmendra Modha, project leader for IBM Research.
“Future applications of computing will increasingly demand functionality that is not efficiently delivered by the traditional architecture. These chips are another significant step in the evolution of computers from calculators to learning systems, signaling the beginning of a new generation of computers and their applications in business, science and government.”
Neurosynaptic Chips
While they contain no biological elements, IBM’s first cognitive computing prototype chips use digital silicon circuits inspired by neurobiology to make up what is referred to as a “neurosynaptic core” with integrated memory (replicated synapses), computation (replicated neurons) and communication (replicated axons).
IBM has two working prototype designs. Both cores were fabricated in 45 nm SOI-CMOS and contain 256 neurons. One core contains 262,144 programmable synapses and the other contains 65,536 learning synapses.
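To make “replicated synapses, neurons and axons” a bit more concrete, here is a tiny software caricature of a spiking-neuron core in Python/NumPy: input spikes arrive on axons, a binary crossbar of synapses routes them to neurons, and each neuron integrates, leaks, and fires. The crossbar size, leak factor, and threshold are made up for illustration; the real chips implement this behavior in silicon rather than in matrix math.

```python
import numpy as np

# Toy leaky integrate-and-fire "core": axons drive neurons through a binary
# synapse crossbar. Sizes, leak and threshold are illustrative only.
rng = np.random.default_rng(0)
N_AXONS, N_NEURONS = 64, 256
synapses = (rng.random((N_AXONS, N_NEURONS)) < 0.05).astype(float)   # sparse binary crossbar
potential = np.zeros(N_NEURONS)                                       # membrane potentials
LEAK, THRESHOLD = 0.9, 3.0

def tick(axon_spikes):
    """One time step: integrate crossbar input, apply leak, fire, and reset."""
    global potential
    potential = potential * LEAK + axon_spikes @ synapses   # each spike excites its targets
    fired = potential >= THRESHOLD
    potential[fired] = 0.0                                  # reset the neurons that spiked
    return fired

for t in range(50):
    axon_spikes = (rng.random(N_AXONS) < 0.2).astype(float)  # random input activity
    out = tick(axon_spikes)
    if out.any():
        print(f"t={t}: {int(out.sum())} of {N_NEURONS} neurons fired")
```

The point of the hardware is that the memory (synapses) sits right next to the computation (neurons), and activity is event-driven, so nothing like this dense matrix multiply has to happen on every clock tick.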
The IBM team has successfully demonstrated simple applications like navigation, machine vision, pattern recognition, associative memory and classification.
(There’s no word yet as to whether or not this new technology might could revive Lee Majors’ career as the Six Million Dollar Man.)
IBM’s overarching cognitive computing architecture is an on-chip network of light-weight cores, creating a single integrated system of hardware and software.
This architecture represents a critical shift away from traditional von Neumann computing to a potentially more power-efficient architecture that has no set programming, integrates memory with processor, and mimics the brain’s event-driven, distributed and parallel processing.
IBM’s long-term goal is to build a chip system with ten billion neurons and one hundred trillion synapses, while consuming merely one kilowatt of power and occupying less than two liters of volume.
Smarter Chips For The Real World
Future chips will be able to ingest information from complex, real-world environments through multiple sensory modes and act through multiple motor modes in a coordinated, context-dependent manner.
For example, a cognitive computing system monitoring the world’s water supply could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making.
Similarly, a grocer stocking shelves could use an instrumented glove that monitors sights, smells, texture and temperature to flag bad or contaminated produce.
Making sense of real-time input flowing at an ever-dizzying rate would be a Herculean task for today’s computers, but would be natural for a brain-inspired system.
“Imagine traffic lights that can integrate sights, sounds and smells and flag unsafe intersections before disaster happens or imagine cognitive co-processors that turn servers, laptops, tablets, and phones into machines that can interact better with their environments,” said Dr. Modha.
IBM has a rich history in artificial intelligence research going all the way back to 1956, when IBM performed the world’s first large-scale (512-neuron) cortical simulation. Most recently, IBM Research scientists created Watson, an analytical computing system that specializes in understanding natural human language and provides specific answers to complex questions at rapid speeds.
Watson represents a tremendous breakthrough in computers understanding natural language, “real language” that is not specially designed or encoded just for computers, but language that humans use to naturally capture and communicate knowledge.
IBM’s cognitive computing chips were built at its highly advanced chip-making facility in Fishkill, N.Y. and are currently being tested at its research labs in Yorktown Heights, N.Y. and San Jose, Calif.