Archive for the ‘ibm research’ Category
IBM made a significant announcement earlier today concerning new technologies designed to help companies and governments tackle Big Data by making it simpler, faster and more economical to analyze massive amounts of data. The new data acceleration innovation results in as much as 25 times faster reporting and analytics.
Today’s announcement, which represents the work of hundreds of IBM developers and researchers in labs around the world, includes an industry-first innovation called “BLU Acceleration,” which combines a number of techniques to dramatically improve analytical performance and simplify administration.
Also announced was the new IBM PureData System for Hadoop, designed to make it easier and faster to deploy Hadoop in the enterprise. Hadoop is the game-changing open-source software used to organize and analyze vast amounts of structured and unstructured data, such as posts to social media sites, digital pictures and videos, online transaction records, and cell phone location data.
The new system can reduce from weeks to minutes the ramp-up time organizations need to adopt enterprise-class Hadoop technology with powerful, easy-to-use analytic tools and visualization for both business analysts and data scientists.
In addition, it provides enhanced Big Data tools for monitoring, development and integration with many more enterprise systems.
IBM Big Data Innovations: More Accessible, Enterprise-ready
As organizations grapple with a flood of structured and unstructured data generated by computers, mobile devices, sensors and social networks, they’re under unprecedented pressure to analyze much more data at faster speeds and lower costs to help deepen customer relationships, prevent threats and fraud, and identify new revenue opportunities.
BLU Acceleration gives users much faster access to key information, leading to better decision-making. The software extends the capabilities of traditional in-memory systems, which load data into random access memory instead of reading it from hard disks for faster performance, by providing in-memory speed even when data sets exceed the size of available memory.
During testing, some queries in a typical analytics workload were more than 1000 times faster when using the combined innovations of BLU Acceleration.
Innovations in BLU Acceleration include “data skipping,” which skips over data that doesn’t need to be analyzed, such as duplicate information; the ability to analyze data in parallel across different processors; and the ability to analyze data transparently to the application, without the need to develop a separate layer of data modeling.
Another industry-first advance in BLU Acceleration is called “actionable compression,” where data no longer has to be decompressed to be analyzed.
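To make the data-skipping idea concrete, here is a minimal Python sketch. It is purely illustrative and not BLU Acceleration’s actual implementation: each block of column data carries a small min/max synopsis, and a scan consults the synopsis to bypass blocks that cannot possibly contain matching rows.

```python
# Illustrative sketch of "data skipping": per-block min/max synopses let a
# scan bypass blocks that cannot contain matching rows. All names and the
# block layout are invented for this example.

def build_synopses(blocks):
    """Record (min, max) per block so later scans can skip irrelevant blocks."""
    return [(min(b), max(b)) for b in blocks]

def scan_greater_than(blocks, synopses, threshold):
    """Return values above threshold, skipping blocks whose max is too small."""
    hits = []
    for block, (lo, hi) in zip(blocks, synopses):
        if hi <= threshold:        # whole block can be skipped
            continue
        hits.extend(v for v in block if v > threshold)
    return hits

blocks = [[1, 3, 5], [10, 12, 14], [2, 4, 6]]
syn = build_synopses(blocks)
print(scan_greater_than(blocks, syn, 9))  # only the middle block is scanned
```

The payoff is that for selective queries most blocks are never touched at all, which is one reason column-organized, synopsis-aware engines report such large speedups.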
Not IBM’s First Big Data Rodeo
The new offerings expand what is already the industry’s deepest portfolio of Big Data technologies and solutions, spanning software, services, research and hardware. The IBM Big Data platform combines traditional data warehouse technologies with new Big Data techniques, such as Hadoop, stream computing, data exploration, analytics and enterprise integration, to create an integrated solution to address these critical needs.
IBM PureData System for Hadoop is the next step forward in IBM’s overall strategy to deliver a family of systems with built-in expertise that leverages its decades of experience reducing the cost and complexity associated with information technology.
This new system integrates IBM InfoSphere BigInsights, which allows companies of all sizes to cost-effectively manage and analyze data, and adds administrative, workflow, provisioning and security features, along with best-in-class analytical capabilities from IBM Research.
Today’s announcement also includes the following new versions of IBM’s Big Data solutions:
- A new version of InfoSphere BigInsights, IBM’s enterprise-ready Hadoop offering, which makes it simpler to develop applications using existing SQL skills and adds the compliance, security and high-availability features vital for enterprise applications. BigInsights offers three entry points: a free download, enterprise software and now an expert integrated system, the IBM PureData System for Hadoop.
- A new version of InfoSphere Streams, unique “stream computing” software that enables massive amounts of data in motion to be analyzed in real-time, with performance improvements, and simplified application development and deployment.
- A new version of Informix including TimeSeries Acceleration for operational reporting and analytics on smart meter and sensor data.
Pricing and Availability
All offerings are available in Q2, except the PureData System for Hadoop, which will start shipping to customers in the second half of 2013. Credit-qualified clients can take advantage of simple, flexible lease and loan packages with no up-front payments for the software and systems that deliver a new generation of data analytics.
IBM Global Financing offers attractive leasing programs with 90-day payment deferrals for the PureData System for Hadoop, as well as zero percent loans for the broader portfolio of IBM big data solutions.
Back in September of 2011 I mentioned in this blog post that one of Watson’s first jobs outside of playing Jeopardy! was going to be in the healthcare industry.
Well, earlier today WellPoint, Inc. and Memorial Sloan-Kettering Cancer Center unveiled the first commercially developed Watson-based cognitive computing breakthroughs.
These innovations are designed to help transform the quality and speed of care delivered to patients through individualized, evidence-based medicine.
Check out this short video to learn more about how physicians and other medical professionals are able to use IBM’s Watson technology to help them with their medical diagnostic tasks.
The American Cancer Society projects that 1.6 million new cancer cases will be diagnosed in the U.S. this year alone. Studies suggest that the complexities associated with healthcare have caused one in five health care patients to receive a wrong or incomplete diagnosis.
These statistics, coupled with an explosion of medical information that is doubling every five years, represent an unprecedented opportunity for the health care industry and next-generation cognitive computing systems to combine forces in new ways to improve how medicine is taught, practiced and paid for.
For more than a year now, IBM has partnered separately with WellPoint and Memorial Sloan-Kettering to train Watson in the areas of oncology and utilization management.
During this time, clinicians and technology experts spent thousands of hours “teaching” Watson how to process, analyze and interpret the meaning of complex clinical information using natural language processing, all with the goal of helping to improve health care quality and efficiency.
“IBM’s work with WellPoint and Memorial Sloan-Kettering Cancer Center represents a landmark collaboration in how technology and evidence based medicine can transform the way in which health care is practiced,” said Manoj Saxena, IBM General Manager, Watson Solutions (see my interview with Manoj at last fall’s InterConnect event in Singapore further down in the post).
“These breakthrough capabilities bring forward the first in a series of Watson-based technologies, which exemplify the value of applying big data, analytics and cognitive computing to tackle the industry’s most pressing challenges.”
Evidence Based Medicine: Addressing Oncology Issues By Quickly Assimilating Massive Amounts Of Medical Information
To date, Watson has ingested more than 600,000 pieces of medical evidence, two million pages of text from 42 medical journals and clinical trials in the area of oncology research.
Watson can sift through 1.5 million patient records representing decades of cancer treatment history, such as medical records and patient outcomes, and provide physicians with evidence-based treatment options, all in a matter of seconds.
In less than a year, Memorial Sloan-Kettering has immersed Watson in the complexities of cancer and the explosion of genetic research, which has set the stage for changing care practices for many cancer patients through highly specialized treatments based on the genetic makeup of their tumors.
Starting with 1,500 lung cancer cases, Memorial Sloan-Kettering clinicians and analysts are training Watson to extract and interpret physician notes, lab results and clinical research, while sharing its profound expertise and experiences in treating hundreds of thousands of patients with cancer.
“It can take years for the latest developments in oncology to reach all practice settings. The combination of transformational technologies found in Watson with our cancer analytics and decision-making process has the potential to revolutionize the accessibility of information for the treatment of cancer in communities across the country and around the world,” said Craig B. Thompson, M.D., President of Memorial Sloan-Kettering Cancer Center. “Ultimately, we expect this comprehensive, evidence-based approach will profoundly enhance cancer care by accelerating the dissemination of practice-changing research at an unprecedented pace.”
The Maine Center for Cancer Medicine and WESTMED Medical Group are the first two early adopters of the capability. Their oncologists will begin testing the product and providing feedback to WellPoint, IBM and Memorial Sloan-Kettering to improve usability.
Speeding Patient Care Through WellPoint’s Utilization Management Pilot
Throughout WellPoint’s utilization management pilot, Watson absorbed more than 25,000 test case scenarios and 1,500 real-life cases, and gained the ability to interpret the meaning of queries and analyze them in the context of complex medical data and natural human language, including doctors’ notes, patient records, medical annotations and clinical feedback.
In addition, nurses spent more than 14,700 hours of hands-on work meticulously training Watson. Watson continues to learn on the job, much like a medical resident, while working with the WellPoint nurses who originally conducted its training.
Watson started processing common medical procedure requests from providers for members in WellPoint-affiliated health plans in December, and the rollout has been expanded to include five provider offices in the Midwest. Watson will serve as a powerful tool to accelerate the review process between a patient’s physician and their health plan.
“The health care industry must drive transformation through innovation, including harnessing the latest technology that will ultimately benefit the health care consumer,” said Lori Beer, WellPoint’s executive vice president of Specialty Businesses and Information Technology. “We believe that WellPoint’s data, knowledge and extensive provider network, combined with the IBM Watson technology and Memorial Sloan-Kettering’s oncological expertise can drive this transformation.”
Watson-Powered Health Innovations
As a result, IBM, Memorial Sloan-Kettering and WellPoint are introducing the first commercial products based on Watson. These innovations represent a breakthrough in how medical professionals can apply advances in analytics and natural language processing to “big data,” combined with a clinical knowledge base, including genomic data, to create evidence-based decision support systems.
These Watson-based systems are designed to assist doctors, researchers, medical centers and insurance carriers, and ultimately to enhance the quality and speed of care. The new products include Interactive Care Insights for Oncology, powered by Watson and developed in collaboration with IBM, Memorial Sloan-Kettering and WellPoint, and the WellPoint Interactive Care Guide and Interactive Care Reviewer, powered by Watson and designed for utilization management in collaboration with WellPoint and IBM.
New Interactive Care Insights for Oncology
- The cognitive systems use insights gleaned from the deep experience of Memorial Sloan-Kettering clinicians to provide individualized treatment options based on a patient’s medical information and the synthesis of a vast array of updated, vetted treatment guidelines and published research.
- A first-of-its-kind Watson-based advisor, available through the cloud, that is expected to assist medical professionals and researchers by helping to identify individualized treatment options for patients with cancer, starting with lung cancer.
- Provides users with a detailed record of the data and information used to reach the treatment options. Oncologists located anywhere can remotely access detailed treatment options based on updated research that will help them decide how best to care for an individual patient.
New WellPoint Interactive Care Guide and Interactive Care Reviewer
- Delivers the first Watson-based cognitive computing system anticipated to streamline the review processes between a patient’s physician and their health plan, potentially speeding approvals from utilization management professionals, reducing waste and helping ensure evidence-based care is provided.
- Expected to accelerate accepted testing and treatment by shortening pre-authorization approval time, which means that patients are moving forward with the first crucial step toward treatment more quickly.
- Analyzes treatment requests and matches them to WellPoint’s medical policies and clinical guidelines to present consistent, evidence-based responses for clinical staff to review, with the aim of enabling faster, better-informed decisions about a patient’s care.
- WellPoint has deployed Interactive Care Reviewer to a select number of providers in the Midwest, and believes more than 1,600 providers will be using the product by the end of the year.
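To give a flavor of what automated utilization review involves, here is a hypothetical Python sketch. The guideline rules, procedure codes and field names are all invented for illustration; they are not WellPoint’s actual policies or Watson’s actual logic.

```python
# Hypothetical sketch: match a pre-authorization request against coded
# clinical guidelines and return a triage outcome for clinical staff to
# review. Every rule and name below is invented for illustration.

GUIDELINES = {
    "mri_lumbar": {"min_conservative_therapy_weeks": 6},
    "ct_head": {"min_conservative_therapy_weeks": 0},
}

def review_request(procedure, facts):
    """Return a triage outcome for one treatment request."""
    rule = GUIDELINES.get(procedure)
    if rule is None:
        return "manual review"  # no guideline on file for this procedure
    weeks = facts.get("conservative_therapy_weeks", 0)
    if weeks >= rule["min_conservative_therapy_weeks"]:
        return "meets guidelines"
    return "flag for clinician review"

print(review_request("mri_lumbar", {"conservative_therapy_weeks": 8}))
```

Even this toy version shows where the hard part lies: the real system must first extract facts like “six weeks of conservative therapy” from free-text physician notes, which is exactly the natural language problem Watson was trained on.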
Watson: Then and Now
The IBM Watson system gained fame by beating human contestants on the television quiz show Jeopardy! almost two years ago. Since then, Watson has evolved from a first-of-a-kind research system into a commercial cognitive computing system, with a 240 percent improvement in performance and a 75 percent reduction in physical footprint: it can now run on a single Power 750 server.
The transformational technology, named after IBM founder Thomas J. Watson, was developed in IBM’s Research Labs. Using advances in natural language processing and analytics, the Watson technology can process information similar to the way people think, representing a significant shift in the ability for organizations to quickly analyze, understand and respond to vast amounts of Big Data.
The ability to use Watson to answer complex questions posed in natural language with speed, accuracy and confidence has enormous potential to improve decision making across a variety of industries from health care, to retail, telecommunications and financial services.
For more information on IBM Watson, please visit www.ibmwatson.com.
You can also follow Watson on Facebook here, and via Twitter at hashtag #IBMWatson.
And below, you can see the aforementioned video where I interviewed IBM Watson general manager Manoj Saxena about Watson’s future at last year’s IBM InterConnect event.
IBM released its annual “5 in 5” list yesterday, the seventh consecutive year in which IBM scientists identify a list of innovations that have the potential to change the way people work, live and interact during the next five years.
The IBM 5 in 5 is based on market and societal trends, as well as emerging technologies from IBM’s R&D labs around the world. This year, the 5 explores innovations that will be underpinnings of the next era of computing, what IBM has described as “the era of cognitive systems.”
This next generation of machines will learn, adapt, sense, and begin to experience the world as it really is, and this year’s predictions focus on one element of this new era: the ability of computers to mimic the human senses — in their own manner, to see, smell, touch, taste and hear.
But before you try and spoon-feed your iPad some vanilla yogurt, let’s get more practical.
These new sensing capabilities will help us become more aware and more productive, and will help us think — but not do our thinking for us.
Rather, cognitive systems will help us see through and navigate complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, and break down all kinds of barriers — geographical, language, cost, even accessibility.
Now, on to our five senses.
1) Touch: You will be able to touch through your phone. Imagine using your smartphone to shop for your wedding dress and being able to feel the satin or silk of the gown, or the lace on the veil, from the surface on the screen. Or to feel the beading and weave of a blanket made by a local artisan half way around the world. In five years, industries like retail will be transformed by the ability to “touch” a product through your mobile device.
IBM scientists are developing applications for the retail, healthcare and other sectors using haptic, infrared and pressure sensitive technologies to simulate touch, such as the texture and weave of a fabric — as a shopper brushes her finger over the image of the item on a device screen. Utilizing the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material.
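The vibration-pattern idea can be reduced to a toy lookup table. This is purely illustrative: the patterns below (duration in milliseconds, intensity from 0 to 1) are invented for this sketch and are not IBM’s actual haptic encodings.

```python
# Illustrative only: map fabric textures to phone vibration patterns, as the
# article describes. Each pattern is a list of (duration_ms, intensity)
# pulses; all values are invented.

TEXTURE_PATTERNS = {
    "silk":   [(20, 0.2), (20, 0.2)],                # short, gentle pulses
    "linen":  [(60, 0.6), (40, 0.5), (60, 0.6)],     # longer, stronger pulses
    "cotton": [(40, 0.4), (40, 0.4)],
}

def vibration_pattern(texture):
    """Look up the haptic pattern that simulates touching the given fabric."""
    return TEXTURE_PATTERNS.get(texture.lower(), [(30, 0.3)])  # generic default

print(vibration_pattern("silk"))
```

The real research problem, of course, is generating these patterns automatically from images or material properties rather than hand-coding them.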
2) Sight: A pixel will be worth a thousand words. We take some 500 billion photos a year, and 72 hours of video is uploaded to YouTube every minute. But computers today only understand pictures by the text we use to tag or title them; the majority of the information — the actual content of the image — is a mystery.
In the next five years, systems will not only be able to look at and recognize the contents of images and visual data, they will turn pixels into meaning, making sense of them much as a human views and interprets a photograph. In the future, “brain-like” capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media, with a potentially huge impact on industries ranging from healthcare to retail to agriculture.
But please, no Escher drawings, at least for now…that’s just plain mean.
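To make “turning pixels into meaning” concrete, here is a toy Python sketch: even crude statistics like average brightness and edge strength begin to distinguish image content. Real systems learn far richer features; everything below is a simplified illustration on a tiny grayscale grid.

```python
# Toy feature extraction from a grid of grayscale pixels (0-255). Real
# "brain-like" vision systems use learned features; this only illustrates
# the kind of low-level signals (brightness, edges) the article mentions.

def edge_strength(image):
    """Sum of absolute horizontal differences: a crude edge-information measure."""
    return sum(abs(row[i + 1] - row[i]) for row in image for i in range(len(row) - 1))

def mean_brightness(image):
    """Average pixel value across the whole image."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

flat   = [[100, 100], [100, 100]]   # uniform patch: no edges
stripy = [[0, 255], [255, 0]]       # high-contrast patch: strong edges
print(edge_strength(flat), edge_strength(stripy))
```

A uniform patch scores zero edge strength while the high-contrast patch scores high, which is the kind of signal an image classifier builds on.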
3) Hearing: Computers will hear what matters. Ever wish you could make sense of all the sounds around you and be able to understand what’s not being said? Within five years, distributed systems of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies.
These systems will interpret such inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will “listen” to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.
I’m ever hopeful such systems will be able to “listen” to my golf swing and help me course correct so I can play more target golf!
4) Taste: Digital taste buds will help you to eat smarter. What if we could make healthy foods taste delicious using a different kind of computing system built for creativity? IBM researchers are developing a computing system that actually experiences flavor, to be used with chefs to create the most tasty and novel recipes. It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavors and smells humans prefer.
By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham.
“Top Tasting Computer Chefs,” anyone?
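One commonly described heuristic behind computational flavor pairing is that ingredients sharing more aroma compounds may pair well. Here is a toy Python sketch of that idea; the compound lists are invented for illustration and are not real chemistry data or IBM’s actual system.

```python
# Toy flavor pairing: score ingredient pairs by shared aroma compounds.
# The compound sets below are invented placeholders, not real chemistry data.

COMPOUNDS = {
    "roasted chestnut": {"furaneol", "pyrazine", "vanillin"},
    "cooked beetroot":  {"geosmin", "pyrazine", "furaneol"},
    "caviar":           {"trimethylamine", "hexanal"},
}

def pairing_score(a, b):
    """Number of aroma compounds two ingredients share (higher = better pair)."""
    return len(COMPOUNDS[a] & COMPOUNDS[b])

print(pairing_score("roasted chestnut", "cooked beetroot"))  # shares 2 compounds
print(pairing_score("roasted chestnut", "caviar"))           # shares 0 compounds
```

The described IBM system goes much further, also modeling the psychology of which flavors and smells humans prefer, but compound overlap conveys the basic mechanism.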
5) Smell: Computers will have a sense of smell. During the next five years, tiny sensors embedded in your computer or cell phone will detect if you’re coming down with a cold or other illness. By analyzing odors, biomarkers and thousands of molecules in someone’s breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes, and epilepsy by detecting which odors are normal and which are not.
Already, IBM scientists are sensing environmental conditions to preserve works of art, and this innovation is starting to be applied to clinical hygiene, one of the biggest healthcare challenges today. In the next five years, IBM technology will “smell” surfaces for disinfectants to determine whether rooms have been sanitized. Using novel wireless mesh networks, sensors will gather and measure data on various chemicals, and the systems will continuously learn and adapt to new smells over time.
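The biomarker idea can be sketched as a simple range check: compare each measured molecule concentration in a sample against a reference range and flag outliers. The molecule names and ranges below are invented for illustration, not clinical reference values.

```python
# Illustrative sketch: flag molecules in a breath sample whose concentration
# falls outside a per-molecule reference range. All ranges (in hypothetical
# ppm) and values are invented, not clinical data.

REFERENCE_RANGES = {
    "acetone":  (0.0, 1.8),
    "ammonia":  (0.0, 0.8),
    "isoprene": (0.0, 0.6),
}

def abnormal_markers(sample):
    """Return molecules whose concentration is outside its reference range."""
    return [m for m, v in sample.items()
            if m in REFERENCE_RANGES
            and not (REFERENCE_RANGES[m][0] <= v <= REFERENCE_RANGES[m][1])]

print(abnormal_markers({"acetone": 2.5, "ammonia": 0.3, "isoprene": 0.5}))
```

A real diagnostic system would of course learn which odor profiles are normal for each person over time, rather than relying on fixed ranges.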
Watch the video below to listen to IBM scientists describe some of these new innovations and their potential impact on our world.
It’s Monday, and here in Austin, Texas, it officially got cold overnight.
Yesterday, it was partly cloudy and almost steamy warm. And this morning, it’s like I was transplanted back to IBM’s Somers, New York, location, where the wind streams across the Westchester landscape and chills native Texans like me to their core.
But enough talk about the weather. I want to get to the topic of the day: Making little things that move information faster.
Earlier today, IBM announced a major advance in the ability to use light instead of electrical signals to transmit information for future computing.
The breakthrough technology — called “silicon nanophotonics” — allows the integration of different optical components side-by-side with electrical circuits on a single silicon chip using, for the first time, sub-100nm semiconductor technology.
Silicon nanophotonics takes advantage of pulses of light for communication and provides a super highway for large volumes of data to move at rapid speeds between computer chips in servers, large data centers, and supercomputers, thus alleviating the limitations of congested data traffic and high-cost traditional interconnects.
Big Light, Bigger Data
The amount of data being created and transmitted over enterprise networks continues to grow due to an explosion of new applications and services.
Silicon nanophotonics, now primed for commercial development, can enable the industry to keep pace with increasing demands in chip performance and computing power. Businesses are entering a new era of computing that requires systems to process and analyze, in real-time, huge volumes of information known as “big data.”
Silicon nanophotonics technology answers big data challenges by seamlessly connecting various parts of large systems, whether a few centimeters or a few kilometers apart, and moving terabytes of data via pulses of light through optical fibers.
Building Proof Beyond Concept
Building on its initial proof of concept in 2010, IBM has solved the key challenges of transferring the silicon nanophotonics technology into the commercial foundry.
By adding a few processing modules to a high-performance 90nm CMOS fabrication line, a variety of silicon nanophotonics components, such as wavelength division multiplexers (WDM), modulators, and detectors, can be integrated side-by-side with CMOS electrical circuitry.
As a result, single-chip optical communications transceivers can be manufactured in a conventional semiconductor foundry, providing significant cost reduction over traditional approaches.
IBM’s CMOS nanophotonics technology demonstrates transceivers that exceed a data rate of 25 Gbps per channel. In addition, the technology can feed a number of parallel optical data streams into a single fiber using compact on-chip wavelength-division multiplexing devices.
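A quick back-of-the-envelope calculation shows why on-chip WDM matters: aggregate bandwidth scales linearly with the number of wavelength channels carried on one fiber. The channel counts below are illustrative examples, not IBM design figures.

```python
# Back-of-the-envelope sketch: with WDM, several 25 Gbps channels share one
# fiber, so aggregate bandwidth scales with channel count. Channel counts
# here are illustrative, not IBM's design figures.

def aggregate_gbps(channels, per_channel_gbps=25):
    """Total data rate when `channels` wavelengths share a single fiber."""
    return channels * per_channel_gbps

print(aggregate_gbps(8))   # 8 channels on one fiber
print(aggregate_gbps(40))  # 40 channels reach the terabit-per-second range
```

This linear scaling is what makes “terabytes of data between distant parts of computer systems” plausible over a modest number of fibers.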
Learning More About Nanophotonics
The ability to multiplex large data streams at high data rates will allow future scaling of optical communications capable of delivering terabytes of data between distant parts of computer systems.
“This technology breakthrough is a result of more than a decade of pioneering research at IBM,” said Dr. John E. Kelly, Senior Vice President and Director of IBM Research. “This allows us to move silicon nanophotonics technology into a real-world manufacturing environment that will have impact across a range of applications.”
Further details will be presented this week by Dr. Solomon Assefa at the IEEE International Electron Devices Meeting (IEDM) in the talk titled, “A 90nm CMOS Integrated Nano-Photonics Technology for 25Gbps WDM Optical Communications Applications.”
You can learn more about IBM silicon integrated nanophotonics technology here.
Since I posted about Hurricane Sandy earlier in the day, I’ve seen some pretty stunning pictures and video coming in, and heard more reports from friends in and around the New York City area.
The story of the crane toppling over on a very tall building being built on West 57th Street, between 6th and 7th Avenues (my old IBM office is at Madison and 57th, further east) was most stunning. You can find some of the pics or video on CNN.
While we wait to discover how big a problem Sandy presents to the northeast Atlantic coast, I’ll share with you a diversion focusing on a much smaller topic — but one with potentially huge implications.
IBM scientists recently demonstrated a new approach to carbon technology that opens up the path for commercial fabrication of dramatically smaller, faster and more powerful computer chips.
For the first time, more than ten thousand working transistors made of nano-sized tubes of carbon have been precisely placed and tested in a single chip using standard semiconductor processes.
These carbon devices are poised to replace and outperform silicon technology, allowing further miniaturization of computing components and leading the way for future microelectronics.
Four Decades Of Innovation
Aided by rapid innovation over four decades, silicon microprocessor technology has continually shrunk in size and improved in performance, thereby driving the information technology revolution.
Silicon transistors, tiny switches that carry information on a chip, have been made smaller year after year, but they are approaching a point of physical limitation.
Their increasingly small dimensions, now reaching the nanoscale, will prohibit any gains in performance due to the nature of silicon and the laws of physics. Within a few more generations, classical scaling and shrinkage will no longer yield the sizable benefits of lower power, lower cost and higher speed processors that the industry has become accustomed to.
Carbon nanotubes represent a new class of semiconductor materials whose electrical properties are more attractive than silicon, particularly for building nanoscale transistor devices that are a few tens of atoms across.
Electrons in carbon transistors can move more easily than in silicon-based devices, allowing for quicker transport of data. The nanotubes are also ideally shaped for transistors at the atomic scale, an advantage over silicon.
These qualities are among the reasons to replace the traditional silicon transistor with carbon transistors, which, coupled with new chip design architectures, will allow computing innovation on a miniature scale for the future.
The approach developed at IBM labs paves the way for circuit fabrication with large numbers of carbon nanotube transistors at predetermined substrate positions. The ability to isolate semiconducting nanotubes and place a high density of carbon devices on a wafer is crucial to assess their suitability for a technology — eventually more than one billion transistors will be needed for future integration into commercial chips.
Hardly A Carbon Copy
Until now, scientists have been able to place at most a few hundred carbon nanotube devices at a time, not nearly enough to address key issues for commercial applications.
Originally studied for the physics that arises from their atomic dimensions and shapes, carbon nanotubes are being explored by scientists worldwide in applications that span integrated circuits, energy storage and conversion, biomedical sensing and DNA sequencing.
This achievement was published today in the peer-reviewed journal Nature Nanotechnology.
Carbon, a readily available basic element from which crystals as hard as diamonds and as soft as the “lead” in a pencil are made, has wide-ranging IT applications.
Carbon nanotubes are single atomic sheets of carbon rolled up into a tube. The carbon nanotube forms the core of a transistor device that will work in a fashion similar to the current silicon transistor, but will be better performing. They could be used to replace the transistors in chips that power our data-crunching servers, high performing computers and ultra fast smart phones.
Earlier this year, IBM researchers demonstrated that carbon nanotube transistors can operate as excellent switches at molecular dimensions of less than ten nanometers – 10,000 times thinner than a strand of human hair and less than half the size of the leading silicon technology. Comprehensive modeling of the electronic circuits suggests that a five- to ten-fold improvement in performance over silicon circuits is possible.
There are practical challenges for carbon nanotubes to become a commercial technology, notably the purity and placement of the devices. Carbon nanotubes naturally come as a mix of metallic and semiconducting species, and they need to be placed perfectly on the wafer surface to make electronic circuits. For device operation, only the semiconducting tubes are useful, which requires essentially complete removal of the metallic ones to prevent errors in circuits.
Also, for large scale integration to happen, it is critical to be able to control the alignment and the location of carbon nanotube devices on a substrate.
To overcome these barriers, IBM researchers developed a novel method based on ion-exchange chemistry that allows precise and controlled placement of aligned carbon nanotubes on a substrate at a density two orders of magnitude greater than in previous experiments: about a billion nanotubes per square centimeter.
The process starts with carbon nanotubes mixed with a surfactant, a kind of soap that makes them soluble in water. The substrate consists of two oxides: trenches of chemically modified hafnium oxide (HfO2), with the rest of the surface in silicon oxide (SiO2). The substrate is immersed in the carbon nanotube solution, and the nanotubes attach via a chemical bond to the HfO2 regions while the rest of the surface remains clean.
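The selective-attachment step can be caricatured in a few lines of Python: nanotubes bind only where the surface is HfO2, leaving the SiO2 regions clean. This is a toy model of the outcome, not the actual ion-exchange chemistry.

```python
# Toy model of selective placement: nanotubes in solution attach only to
# HfO2 trench sites; SiO2 stays bare. The grid is invented for illustration.

def place_nanotubes(surface):
    """Mark nanotube attachment ('N') on HfO2 sites; SiO2 stays bare ('.')."""
    return [["N" if site == "HfO2" else "." for site in row] for row in surface]

surface = [["SiO2", "HfO2", "SiO2"],
           ["HfO2", "SiO2", "HfO2"]]
for row in place_nanotubes(surface):
    print("".join(row))
```

The point of the real technique is exactly this determinism: by patterning where HfO2 appears, researchers choose in advance where every nanotube device will sit.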
By combining chemistry, processing and engineering expertise, IBM researchers are able to fabricate more than ten thousand transistors on a single chip.
Furthermore, rapid testing of thousands of devices is possible using high-volume characterization tools, thanks to compatibility with standard commercial processes.
As this new placement technique can be readily implemented, involving common chemicals and existing semiconductor fabrication, it will allow the industry to work with carbon nanotubes at a greater scale and deliver further innovation for carbon electronics.
You can learn more in the animation below.