Archive for August 2011
IBM today announced a definitive agreement to acquire i2 to accelerate its business analytics initiatives and help clients in the public and private sectors address crime, fraud and security threats.
Financial terms were not disclosed.
With more than 4,500 customers in 150 countries, i2 is a leading provider of intelligence analytics for crime and fraud prevention, based in Cambridge, UK, with U.S. headquarters in McLean, Va. i2’s clients span multiple sectors globally, including banking, defense, health care, insurance, law enforcement, national security and retail. i2 solutions are currently used by 12 of the top 20 retail banks globally and eight of the 10 largest companies in the world.
Organizations in both the public and private sectors today are facing an exponential increase in “big data” — information and intelligence coming from disparate and unstructured sources including social media, biometrics and criminal databases.
When it is accessible to the people who need it, this information can be used to anticipate potential problems, make better, faster decisions, and coordinate resources to deliver exceptional service to citizens and customers.
This acquisition will extend IBM’s leadership in helping clients harness this data through the addition of i2’s intelligence analysis and tactical lead generation capabilities, which help cities, nations, international bodies and private enterprises combat fraud and security threats.
Fraud and security threat reduction is of strategic importance across all industries today. According to FBI statistics, in 2009 there were 1.3 million violent crimes, 9.3 million property crimes and 6.3 million larceny/thefts in the United States. Consumer-facing fraud for retailers alone costs $100 billion a year in the United States.
In response to these threats, businesses and government agencies are seeking to empower public safety officers, analysts, managers, detectives and investigators with industry-specific intelligence analysis and operational insight solutions. For example, government agencies today are concerned with increased threats to public safety, which are driving the need for secure information sharing.
Using IBM real-time analytical solutions in combination with the technologies of i2, public agencies and private enterprises battling fraud will now have the capability to better collect, analyze and process all the relevant data at their disposal. In the past, data overload often led to critical information or opportunities being missed.
Now local, state, and federal authorities can harness new intelligence to instantly detect and respond to security threats. Investigative leads can be identified quickly, helping government agencies solve crimes faster and keep officers and communities safer.
With IBM and i2, clients will have access to a comprehensive range of visualization and multidimensional analytics for the timely delivery of intelligence, including threat and fraud analytics. These tools will help analysts quickly identify connections, patterns and trends in complex data sets and model data the way they think, all in a single environment that yields faster analysis results, strategic reports and bulletins.
i2 will be integrated into IBM’s Software Group.
Headquartered in the UK, i2 has 350 employees and additional offices in McLean, Va.; Tucson, Ariz.; Ottawa, Ontario; and Canberra, Australia. The acquisition is anticipated to close in the fourth quarter of 2011, subject to the satisfaction of customary closing conditions and applicable regulatory reviews.
This year seems as though it’s been nothing but a series of disasters.
The Japan earthquake and tsunami. A horrific season of tornadoes across the South and Midwest. An extraordinary drought throughout Texas, where agricultural losses are upwards of $3 billion. And our most recent visitor, Hurricane Irene, which carried devastation up the mid-Atlantic and, incredibly, left Vermont and Connecticut more damaged than anyone would have predicted.
It just goes to show, you can never be ready enough for acts of God.
That includes individuals and businesses that depend on their IT environments to conduct business and ensure continuity through one of these disasters.
In preparation for Irene, we saw many people in high risk areas rushing around to buy emergency supplies like flashlights, water, and wood to board up their houses. But how many considered the preparedness of businesses and government agencies?
Given these impending natural disasters, and other leading causes of disruption such as power outages and network failures, businesses and individuals should also assess their business continuity and disaster recovery plans in advance of a disaster scenario, when things are calmer and they can focus on sensible risk mitigation.
In today’s on-demand environment, it’s critically important to adapt and respond rapidly to risks as well as opportunities, and to maintain continuous access to data for personal and business needs.
IBM recently offered up a few tips on disaster preparedness:
- Validate your data backup plan – Verify that your data is out of harm’s way and/or is accessible to your recovery location. Consider using a cloud service to store key data and allow your organization more flexibility to respond to changing conditions with minimal interruption to the business.
- Consider employees and the personal impact of a disaster – A company’s most important asset is its people, but the most important asset for people is their families. Consider how you would move employees and their families if required, think about providing financial support to your employees during a crisis, and consider offering counseling to help them deal with the aftermath.
- Develop various ways to communicate with employees and partners – After people, the next most important element is communication. Communications must be timely, clear and honest, as miscommunication can make a disaster even worse. Consider how you would communicate with your employees, partners, clients, media and industry (and how they would reach you), what training you have provided, and what tools you are using. And, very important: test the communications plan.
- Think about the “domino effect” when considering business risk – Years of experience monitoring regional disasters has shown that these events often create other events. For example, a hurricane normally has high winds and heavy rains that can lead to flooding, structural damage, power outage, telecommunication and/or travel disruptions.
- Plan for catastrophic events that could last a while – Consider the impact if the disruption to your facility, network, technology, or people lasts longer than three days, a week, or more. Over the past decade, we have seen disaster events grow more devastating, with longer durations and larger financial impacts. Companies need to consider their options if their primary environment or key people are unavailable for more than two weeks.
- Think broadly – Each company is part of a supply chain or network. While you may do everything right, if you have a critical partner, supplier, vendor or provider of service, your preparedness is only as good as those other businesses. As part of your disaster recovery plan, ensure everyone upstream and downstream from your business is also prepared.
With more than 40 years of experience keeping businesses up and running, IBM uses its software, hardware and services expertise to help clients and individuals across the globe to protect their data.
IBM helps them to manage risks, protect valuable business assets, comply with standards and regulations, and continue business operations.
“People and businesses are relying on technology now more than ever, which creates an urgent need to protect critical data and keep IT systems up and running when a natural disaster or other unexpected outage occurs,” said Rick Ruiz, general manager of IBM’s Business Continuity and Resiliency Services. “In these situations, it’s clear that those who have moved from the old model of ‘experience and react’ to a new one of ‘anticipate and adjust’ will fare much better.”
Visit this site to learn more about IBM’s Disaster Recovery Services.
If you missed your chance to watch the competition aired nationally in North America this past February between IBM and America’s favorite quiz show Jeopardy!, fear not: IBM announced today that Jeopardy! will broadcast an encore presentation of the first-ever man vs. machine Jeopardy! competition between IBM’s “Watson” computing system and the show’s two greatest contestants – Ken Jennings and Brad Rutter.
Millions of North American viewers will be able to again witness TV history as Watson successfully competes against two human champions in two matches played over three consecutive days, September 12, 13, and 14, 2011.
(Spoiler Alert: developerWorks’ Scott Laningham and I interviewed the principal investigator and project lead of the Watson effort, Dr. David Ferrucci, during this year’s SXSW Interactive conference in March of this year. Do NOT watch the video Q&A below if you haven’t yet seen the broadcast/re-broadcast if you don’t want to spoil the ending! In the interview, Ferrucci explains in some detail the AI methods behind Watson’s madness!)
“With the Jeopardy! challenge, we accomplished what was thought to be impossible – building a computer system that operates in the near limitless, ambiguous and highly contextual realm of human language and knowledge,” said Dr. David Ferrucci, IBM Fellow and scientist leading the IBM Research team that created Watson. “Watching the match again reminds us of the great power and potential behind Watson to be able to make sense of the massive amounts of data around us and to solve problems in new ways.”
Six months after the original competition, Watson’s Deep Question Answering (QA) technology has already driven progress in new fields such as the healthcare industry. IBM is working with Nuance Communications, Inc. to explore and develop applications to help critical decision makers, such as physicians and nurses, process large volumes of health information in order to deliver quicker and more accurate patient diagnoses. Working with universities and clients, IBM is identifying many potential uses for Watson’s underlying QA technology.
The technology underlying Watson analyzes the structure and wording of the question or challenge being investigated, and formulates an answer that it has the highest level of ‘confidence’ is correct. Watson answers ‘natural language’ questions, which can contain puns, slang, jargon and acronyms that must all be evaluated as part of Watson’s confidence in returning an answer.
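The candidate-and-confidence idea described above can be illustrated with a toy sketch. To be clear, the evidence signals, weights, and threshold below are hypothetical stand-ins for illustration only; they bear no resemblance to the actual DeepQA components, which combine hundreds of learned scorers:

```python
# Toy sketch of confidence-ranked question answering in the spirit of
# Watson's approach: score each candidate answer against several evidence
# signals, blend the scores, and answer only when confidence is high enough.
# All scorers and weights here are hypothetical, not IBM's DeepQA code.

def keyword_overlap(question, evidence):
    """Fraction of question words that appear in the supporting evidence."""
    q_words = set(question.lower().split())
    e_words = set(evidence.lower().split())
    return len(q_words & e_words) / len(q_words)

def combined_confidence(question, candidate):
    """Weighted blend of evidence signals (a real system learns the weights)."""
    overlap = keyword_overlap(question, candidate["evidence"])
    reliability = candidate["source_reliability"]  # assumed given, 0..1
    return 0.7 * overlap + 0.3 * reliability

def answer(question, candidates, threshold=0.5):
    """Return (best_answer, confidence), or (None, confidence) if unsure."""
    scored = [(combined_confidence(question, c), c["text"]) for c in candidates]
    confidence, best = max(scored)
    return (best if confidence >= threshold else None), confidence
```

The key design point the sketch captures is that the system does not have to answer: when no candidate clears the confidence threshold, it abstains, which is exactly what Watson did on Jeopardy! clues where its confidence was too low to risk buzzing in.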
“We recognized the Jeopardy! IBM Challenge was not only a historic moment for television, but also for scientific discovery and innovation,” said Harry Friedman, executive producer of Jeopardy! “We wanted to provide the opportunity for more viewers to once again enjoy this ground-breaking exhibition match.”
IBM and the other contestants gave $1.25 million to charity, with $1 million coming from IBM.
What is Watson?
Watson, named after IBM founder Thomas J. Watson, is a breakthrough achievement in the scientific field of question answering, also known as “QA.” The Watson software runs on IBM POWER7 servers optimized to handle the massive number of tasks Watson must perform at rapid speed to analyze complex language and answer questions posed in natural language with speed, accuracy and confidence.
Beyond providing correct responses, Watson had to analyze Jeopardy! clues that involved subtle meaning, irony, riddles, and other complexities in which humans excel and computers traditionally do not. The system incorporates a number of proprietary technologies for the specialized demands of processing an enormous number of concurrent tasks and data while analyzing information in real time.
You can learn more about the Watson research initiative here.
This has been a crazy Friday, so I didn’t have much time to blog.
But Scott Laningham and I were able to cut our first developerWorks “videopodcast,” in which we covered some of the major recent IT and tech news, including the announcement of Steve Jobs’ resignation (I apologize in advance for pronouncing his name both ways!), the HP/Autonomy deal, Google/Motorola, and even a few bits on my 20th anniversary with Big Blue.
For those of you in the path of Irene, please be safe and heed all the warnings of your public officials. We’ll be thinking about you all along the East Coast down here in drought-laden Texas. We need some rain, but we prefer it not come in the form of a hurricane (although I’m sure some farmers in South Texas might argue with me about now).
Here in Austin, the forecast has us at around 109 degrees Fahrenheit tomorrow. Yikes!
Minds greater than mine will write the eloquent and fitting tributes to Steve Jobs’ reign at Apple, both as co-founder and as the Renaissance CEO king who could do no wrong.
Me, I’m simply stunned at the suddenness of the announcement.
We all knew this day would soon arrive, but having watched IBM and Apple be partner, competitor, and “co-opetitor” during my own twenty-year tenure at Big Blue, many of us also perhaps came to think of Steve Jobs as invincible.
While it would be easy to sit back and write plaudits about Jobs as a business leader and innovator, it’s much easier to reminisce about the impact his tools and technologies have had on me personally.
I first used a Mac during one of my first real office jobs in college, using Pagemaker on a Mac SE to put together technical journals and even an underground newspaper. Back then, a portable computer meant carrying your heavy SE down to the local watering hole by hand.
Later, of course, came the first Mac I bought, the iMac, after having been enslaved on Wintel machines for much of my work experience, and later a range of Apple products, from iPods to MacBooks to the first iPad….
What always distinguished the Mac for me was that they mostly just worked. Compared to the countless hours I spent tuning Microsoft Windows-based machines, digging into control panels and registries and heaven knows where else I didn’t belong, just trying to get the things to run… well, with Apple machines I just did my work.
And that continues to hold true today.
Either I could focus on the work, or I could focus on the technology.
That fact alone may have been a key contributor to Apple’s now prominent economic position in the industry.
Jobs’ departure as CEO, of course, raises the question of whether that legacy will continue.
The pipeline of Jobs’ influenced products can only last so long. Can former COO and now CEO Tim Cook lead Apple to the new promised land?
I guess that depends on how much you think Apple has become a cult of personality (of Steve Jobs), or one more traditional in nature.
I’ve had friends who worked at Apple who were pretty convincing that Jobs personally made a great many of the company’s decisions, decisions that in a more traditional organization would have been delegated through the hierarchy, with seemingly less critical calls pushed down into the organization.
No matter your belief, there’s no arguing about Jobs’ impact on not only the tech industry, but the media and entertainment industries, operating systems development, publishing, and others as well.
I don’t know how ill Jobs is or how much time he has left with us – none of us knows that about ourselves, for that matter – but I can say with the time he had, Jobs seems to have found his passion and made the most of it, and changed the world in the process.
And for that, we can all be thankful.
He set a high standard for himself and for everyone around him.
In so doing, he forced the rest of the industries he impacted to up their game, bigtime.
Sometimes they won, sometimes they failed, but they were always better off for having been stretched by Apple to try and do their very best, just as Apple had.
That, in the end, may be the most crucial of Steve Jobs’ legacies: always reaching for the next precipice to bring the world things people didn’t even know they needed, and making them better and easier to use all the while.
Considering that the city of Austin and much of Texas have gotten very little rain this year, it’s somewhat ironic that IBM and the University of Texas at Austin are today announcing advanced analytics solutions for river systems that can help with flood prevention and preparedness.
Floods are the most common natural disaster in the United States, but traditionally flood prediction methods are focused only on the main stems of the largest rivers — overlooking extensive tributary networks where flooding actually starts, and where flash floods threaten lives and property.
IBM researchers and researchers from UT Austin are using these analytics to predict the Guadalupe River’s behavior at more than 100 times real-time speed.
IBM’s new flood prediction technology can simulate tens of thousands of river branches at a time, and could scale further to predict the behavior of millions of branches simultaneously.
By coupling analytics software with advanced weather simulation models, such as IBM’s Deep Thunder, municipalities and disaster response teams could make emergency plans and pinpoint potential flood areas on a river.
As a testing ground, the team is currently applying the model to the entire 230-mile-long Guadalupe River and over 9,000 miles of its tributaries in Texas. In a single hour, the system can generate up to 100 hours of river behavior!
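The core structural idea, that a river system is a branching network where each tributary feeds its flow into a downstream reach, can be sketched in a few lines. This is a deliberately simplified toy routing model for illustration, not the actual IBM/UT Austin system, which solves real river physics at scale:

```python
# Toy sketch of stepping water through a branching river network:
# each branch hands its accumulated volume to its downstream parent
# every timestep, while new inflow (e.g. rainfall runoff) enters locally.
# A real hydraulic model solves physical flow equations; this only
# illustrates the tree-structured routing.

def step(network, inflow, state):
    """Advance the network one timestep; return new per-branch volumes.

    network: {branch: downstream_branch, or None for the outlet}
    inflow:  {branch: water entering this branch during the step}
    state:   {branch: current water volume}
    """
    # Start each branch with only its fresh local inflow...
    new_state = {branch: inflow.get(branch, 0.0) for branch in network}
    # ...then route every branch's existing volume downstream.
    for branch, downstream in network.items():
        if downstream is not None:
            new_state[downstream] += state[branch]
        # water at the outlet (downstream is None) exits the system
    return new_state
```

Because each branch only interacts with its immediate downstream neighbor, every branch in a timestep can be updated independently, which is what lets a model like this parallelize across tens of thousands (or millions) of tributaries at once.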
“Combining IBM’s complex system modeling with our research into river physics, we’ve developed new ways to look at an old problem,” said Ben Hodges, Associate Professor at UT Austin Center for Research in Water Resources. “Unlike previous methods, the IBM approach scales-up for massive networks and has the potential to simulate millions of river miles at once. With the use of river sensors integrated into web-based information systems, we can take this model even further.”
Speed on this scale is a significant advantage for smaller scale river problems, such as urban and suburban flash flooding caused by severe thunderstorms.
Within the emergency response network in Austin, Texas, professors from the University of Texas at Austin are linking the river model directly to NEXRAD radar precipitation data to better predict flood risk on a creek-by-creek basis.
In addition to flood prediction, a similar system could be used for irrigation management, helping to create equitable irrigation plans and ensure compliance with habitat conservation efforts.
The models could allow managers to evaluate multiple “what if” scenarios to create better plans for handling both droughts and water surplus.
The project currently runs on IBM POWER7 systems, which accelerate the simulation and prediction significantly, allowing for additional disaster prevention and emergency response preparation.