IBM unveiled today the annual “IBM 5 in 5” (#ibm5in5) – a list of ground-breaking scientific innovations with the potential to change the way people work, live, and interact during the next five years.
Drum roll, please…
In 1609, Galileo turned his newly built telescope on the heavens and saw our cosmos in an entirely new way. His observations provided evidence that Earth and the other planets in our solar system revolve around the Sun, a motion that until then was impossible to observe.
IBM Research continues this work through the pursuit of new scientific instruments – whether physical devices or advanced software tools – designed to make what’s invisible in our world visible, from the macroscopic level down to the nanoscale.
“The scientific community has a wonderful tradition of creating instruments to help us see the world in entirely new ways. For example, the microscope helped us see objects too small for the naked eye and the thermometer helped us understand temperature of the Earth and human body,” said Dario Gil, vice president of science & solutions at IBM Research. “With advances in artificial intelligence and nanotechnology, we aim to invent a new generation of scientific instruments that will make the complex invisible systems in our world today visible over the next five years.”
Innovation in this area could enable us to dramatically improve farming, enhance energy efficiency, spot harmful pollution before it’s too late, and prevent premature physical and mental health decline, to name a few examples. IBM’s global team of scientists and researchers is steadily bringing these inventions from the lab to the real world.
The IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM’s Research labs around the world that can make these transformations possible. Here are the five scientific instruments that will make the invisible visible in the next 5 years:
Brain disorders, including developmental, psychiatric and neurodegenerative diseases, represent an enormous disease burden, in terms of human suffering and economic cost.
For example, today, one in five adults in the U.S. experiences a mental health condition such as depression, bipolar disorder or schizophrenia, and roughly half of individuals with severe psychiatric disorders receive no treatment. The global cost of mental health conditions is projected to surge to US$6 trillion by 2030.
If the brain is a black box that we don’t fully understand, then speech is a key to unlock it. In five years, what we say and write will be used as indicators of our mental health and physical wellbeing.
Patterns in our speech and writing analyzed by new cognitive systems will provide tell-tale signs of early-stage developmental disorders, mental illness and degenerative neurological diseases that can help doctors and patients better predict, monitor and track these conditions.
At IBM, scientists are using transcripts and audio from psychiatric interviews, coupled with machine learning techniques, to find patterns in speech that help clinicians accurately predict and monitor psychosis, schizophrenia, mania and depression. Today, it takes only about 300 words to help clinicians predict the probability of psychosis in a patient.
In the future, similar techniques could be used to help patients with Parkinson’s, Alzheimer’s, Huntington’s disease, PTSD and even neurodevelopmental conditions such as autism and ADHD. Cognitive computers can analyze a patient’s speech or written words to look for tell-tale indicators found in language, including meaning, syntax and intonation.
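As a rough illustration of the kind of signals involved (a toy sketch of my own, not IBM’s actual model), such a system might start from simple linguistic measurements like vocabulary richness and average sentence length before applying anything more sophisticated:

```python
import re

def speech_features(transcript: str) -> dict:
    """Toy linguistic features of the sort a speech-analysis system
    might track. Illustrative only -- not IBM's actual method."""
    words = re.findall(r"[a-z']+", transcript.lower())
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    unique_words = set(words)
    return {
        "word_count": len(words),
        # Type-token ratio: a crude proxy for vocabulary richness
        "type_token_ratio": len(unique_words) / len(words) if words else 0.0,
        "avg_sentence_length": len(words) / len(sentences) if sentences else 0.0,
    }

feats = speech_features("I went to the store. The store was closed. I went home.")
print(feats)
```

Real systems look at far richer cues (syntax, semantic coherence, intonation), but even features this simple hint at how a few hundred words can carry measurable structure.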
Combining the results of these measurements with those from wearable devices and imaging systems and collected in a secure network can paint a more complete picture of the individual for health professionals to better identify, understand and treat the underlying disease.
What were once invisible signs will become clear signals of patients’ likelihood of entering a certain mental state or how well their treatment plan is working, complementing regular clinical visits with daily assessments from the comfort of their homes.
More than 99.9 percent of the electromagnetic spectrum cannot be observed by the naked eye. Over the last 100 years, scientists have built instruments that can emit and sense energy at different wavelengths.
Today, we rely on some of these to take medical images of our body, see the cavity inside our tooth, check our bags at the airport, or land a plane in fog. However, these instruments are incredibly specialized and expensive and only see across specific portions of the electromagnetic spectrum.
In five years, new imaging devices using hyperimaging technology and AI will help us see broadly beyond the domain of visible light by combining multiple bands of the electromagnetic spectrum to reveal valuable insights or potential dangers that would otherwise be unknown or hidden from view.
Most importantly, these devices will be portable, affordable and accessible, so superhero vision can be part of our everyday experiences.
A view of the invisible or vaguely visible physical phenomena all around us could help make road and traffic conditions clearer for drivers and self-driving cars. For example, using millimeter wave imaging, a camera and other sensors, hyperimaging technology could help a car see through fog or rain, detect hazardous and hard-to-see road conditions such as black ice, or tell us if there is some object up ahead and its distance and size.
Cognitive computing technologies will reason about this data and recognize the difference between a tipped-over garbage can and a deer crossing the road, or spot a pothole that could result in a flat tire.
Embedded in our phones, these same technologies could take images of our food to show its nutritional value or whether it’s safe to eat. A hyperimage of a pharmaceutical drug or a bank check could tell us what’s fraudulent and what’s not. What was once beyond human perception will come into view.
IBM scientists are today building a compact hyperimaging platform that “sees” across separate portions of the electromagnetic spectrum in one platform to potentially enable a host of practical and affordable devices and applications.
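To make the cross-band idea concrete, here is a deliberately simplified sketch (the band names and thresholds are my own assumptions, not IBM’s platform) of how readings from two bands could be combined to flag possible black ice, which looks dark in visible light yet reflects strongly at millimeter wavelengths:

```python
def looks_like_black_ice(visible_brightness: float,
                         mmwave_reflectivity: float,
                         vis_max: float = 0.3,
                         mm_min: float = 0.7) -> bool:
    """Toy cross-band rule: flag a road patch as possible ice when it is
    dark in the visible band but highly reflective in the mm-wave band.
    All thresholds are illustrative assumptions, not IBM's algorithm."""
    return visible_brightness < vis_max and mmwave_reflectivity > mm_min
```

The point of hyperimaging is exactly this kind of fusion: no single band is conclusive, but a contradiction between bands is informative.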
Today, the physical world only gives us a glimpse into our interconnected and complex ecosystem. We collect exabytes of data – but most of it is unorganized. In fact, an estimated 80 percent of a data scientist’s time is spent scrubbing data instead of analyzing and understanding what that data is trying to tell us.
Thanks to the Internet of Things, new sources of data are pouring in from millions of connected objects – from refrigerators, light bulbs and your heart rate monitor to remote sensors such as drones, cameras, weather stations, satellites and telescope arrays.
There are already more than six billion connected devices generating tens of exabytes of data per month, with a growth rate of more than 30 percent per year. After successfully digitizing information, business transactions and social interactions, we are now in the process of digitizing the physical world.
In five years, we will use machine learning algorithms and software to organize information about the physical world, bringing the vast and complex data gathered by billions of devices within the range of our vision and understanding. We call this a “macroscope” – but unlike the microscope, which sees the very small, or the telescope, which sees the very far away, it is a system of software and algorithms that brings all of Earth’s complex data together and analyzes it for meaning.
By aggregating, organizing and analyzing data on climate, soil conditions, water levels and their relationship to irrigation practices, for example, a new generation of farmers will have insights that help them determine the right crop choices, where to plant them and how to produce optimal yields while conserving precious water supplies.
In 2012, IBM Research began investigating this concept at Gallo Winery, integrating irrigation, soil and weather data with satellite images and other sensor data to predict the specific irrigation needed to produce an optimal grape yield and quality. In the future, macroscope technologies will help us scale this concept to anywhere in the world.
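As a toy illustration of the kind of decision such a system supports (the moisture target and conversion factor below are invented for the example, not Gallo’s or IBM’s figures), an irrigation recommendation might credit forecast rain against the soil-moisture deficit:

```python
def irrigation_mm(soil_moisture_pct: float,
                  forecast_rain_mm: float,
                  target_pct: float = 35.0,
                  mm_per_pct: float = 2.0) -> float:
    """Recommend irrigation (in mm of water) to bring soil moisture up
    to a target level, crediting forecast rainfall. The target and the
    mm-per-percentage-point conversion are illustrative assumptions."""
    deficit_pct = max(0.0, target_pct - soil_moisture_pct)
    needed_mm = deficit_pct * mm_per_pct
    return max(0.0, needed_mm - forecast_rain_mm)
```

A real macroscope would learn these relationships from satellite, sensor and weather data rather than hard-coding them, but the shape of the decision is the same.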
Beyond our own planet, macroscope technologies could handle, for example, the complicated indexing and correlation of various layers and volumes of data collected by telescopes to predict asteroid collisions with one another and learn more about their composition.
Early detection of disease is crucial. In most cases, the earlier the disease is diagnosed, the more likely it is to be cured or well controlled. However, diseases like cancer can be hard to detect – hiding in our bodies before symptoms appear.
Information about the state of our health can be extracted from tiny bioparticles in bodily fluids such as saliva, tears, blood, urine and sweat. Existing scientific techniques face challenges for capturing and analyzing these bioparticles, which are thousands of times smaller than the diameter of a strand of human hair.
In the next five years, new medical labs “on a chip” will serve as nanotechnology health detectives – tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor. The goal is to shrink down to a single silicon chip all of the processes necessary to analyze a disease that would normally be carried out in a full-scale biochemistry lab.
The lab-on-a-chip technology could ultimately be packaged in a convenient handheld device to allow people to quickly and regularly measure the presence of biomarkers found in small amounts of bodily fluids, sending this information securely streaming into the cloud from the convenience of their home.
There it could be combined with real-time health data from other IoT-enabled devices, like sleep monitors and smart watches, and analyzed by AI systems for insights. Taken together, this data set will give us an in-depth view of our health and alert us to the first signs of trouble, helping to stop disease before it progresses.
At IBM Research, scientists are developing lab-on-a-chip nanotechnology that can separate and isolate bioparticles down to 20 nanometers in diameter, a scale that gives access to DNA, viruses, and exosomes. These particles could be analyzed to potentially reveal the presence of disease even before we have symptoms.
Most pollutants are invisible to the human eye, until their effects make them impossible to ignore. Methane, for example, is the primary component of natural gas, commonly considered a clean energy source. But if methane leaks into the air before being used, it can warm the Earth’s atmosphere. Methane is estimated to be the second largest contributor to global warming after carbon dioxide (CO2).
In the United States, emissions from oil and gas systems are the largest industrial source of methane gas in the atmosphere. The U.S. Environmental Protection Agency (EPA) estimates that more than nine million metric tons of methane leaked from natural gas systems in 2014.
Measured as CO2-equivalent over 100 years, that’s more greenhouse gases than were emitted by all U.S. iron and steel, cement and aluminum manufacturing facilities combined.
In five years, new, affordable sensing technologies deployed near natural gas extraction wells, around storage facilities, and along distribution pipelines will enable the industry to pinpoint invisible leaks in real-time.
Networks of IoT sensors wirelessly connected to the cloud will provide continuous monitoring of the vast natural gas infrastructure, allowing leaks to be found in a matter of minutes instead of weeks, reducing pollution and waste and the likelihood of catastrophic events.
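A minimal sketch of what “finding leaks in minutes” could look like in software, assuming a stream of methane readings in parts per million from one sensor (the window size and z-score threshold are illustrative choices of mine, not the MONITOR program’s design):

```python
from statistics import mean, stdev

def flag_leaks(ppm_readings, window=12, z_thresh=3.0):
    """Flag indices where a methane reading jumps well above the recent
    baseline. A simple stand-in for continuous leak monitoring; the
    window and threshold are assumptions, not a deployed system's."""
    flagged = []
    for i in range(window, len(ppm_readings)):
        baseline = ppm_readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # A reading many standard deviations above baseline suggests a leak
        if sigma > 0 and (ppm_readings[i] - mu) / sigma > z_thresh:
            flagged.append(i)
    return flagged
```

In practice a network of sensors plus wind models would also localize the leak, not just detect it, but even this single-sensor rule turns weeks of undetected venting into minutes.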
Scientists at IBM are tackling this vision, working with natural gas producers such as Southwestern Energy to explore the development of an intelligent methane monitoring system and as part of the ARPA-E Methane Observation Networks with Innovative Technology to Obtain Reductions (MONITOR) program.
At the heart of IBM’s research is silicon photonics, an evolving technology that transfers data with light rather than electrical signals, enabling fast, efficient sensing and communication.
These chips could be embedded in a network of sensors on the ground or within infrastructure, or even fly aboard autonomous drones, generating insights that, when combined with real-time wind data, satellite data, and other historical sources, can be used to build complex environmental models that detect the origin and quantity of pollutants as they occur.
For more information about the IBM 5 in 5, please visit: http://ibm.biz/five-in-five.
I wasn’t in Boston over the weekend, so I wasn’t there to see Neil Diamond sing “Sweet Caroline” live and in person at Fenway Park.
But I was introduced to the tradition during my own first visit to Fenway a year ago this May, and I can’t think of a more fitting way to kick away the dust of last week’s fear and horror than something as American as having Neil Diamond show up at the ballpark to sing “Sweet Caroline”!
If you’ve never experienced it firsthand, here’s basically how it goes: since 2002, in the middle of the eighth inning, “Sweet Caroline” is played over the loudspeakers at Fenway, and the great citizens of Boston (and Red Sox fans everywhere) do a little audience participation. It’s not quite a “Rocky Horror Picture Show” level of audience participation, but then again, this is baseball and we’re between innings, people!
Go find some of the video recaps to see for yourself, but if you did see Diamond out there on the diamond doing it live this weekend, and if that didn’t send a couple of tears to your eyes, you’d better check to make sure the drones from Tom Cruise’s new movie “Oblivion” (which I saw this weekend…two thumbs up!) haven’t taken over.
Of course, I guess if you didn’t want anyone to see you cry you could invest in some of these new techno glasses, Google’s or otherwise.
According to The New York Times, Oakley is also getting into the act, working to introduce goggles that can display incoming text messages and that include embedded GPS, Bluetooth, and video cameras.
Skiers, please, keep your eyes on the slopes at all times!
That goes for you cyclists looking to check your heartbeat in your newfangled high tech cycling glasses every five seconds.
Don’t get me wrong, I’m all for having performance biometrics, even in real-time, but I think we have to think very carefully about how that information is presented back to athletes, especially those mid-mountain or mid-peloton.
If you’ve ever nearly been run over by someone who was texting while driving, you know exactly what I’m talking about.
I texted while driving for a time. But about the fourth time I nearly rear-ended someone, it dawned on me that texting while driving was a bad idea. Very bad. And this was well before any of those anti-texting public ad campaigns had emerged.
These days, I find myself constantly scanning my rear-view mirror in fear of some other idiot not having come around to a similar conclusion, which is its own kind of dangerous distraction.
So what’s going to happen on the ski mountains across the globe when folks are too busy checking their optimum heart rate to see those trees racing up towards their performance glasses?
There will be an inordinate demand for well trained ski patrol professionals, that’s what!
I was sitting here at JFK waitin’ on a plane and IBM’s 1Q 2013 earnings came across the wire, so here goes:
- Diluted EPS: GAAP: $2.70, up 3 percent; Operating (non-GAAP): $3.00, up 8 percent
- Net income: GAAP: $3.0 billion, down 1 percent; Operating (non-GAAP): $3.4 billion, up 3 percent
- Gross profit margin: GAAP: 45.6 percent, up 0.6 points; Operating (non-GAAP): 46.7 percent, up 1.0 points
- Revenue: $23.4 billion, down 5 percent, down 3 percent adjusting for currency
- Free cash flow of $1.7 billion, down $0.2 billion
- Software revenue flat, up 1 percent adjusting for currency; Pre-tax income up 4 percent; margin up 1.2 points
- Services revenue down 4 percent, down 1 percent adjusting for currency; Pre-tax income up 10 percent; margin up 2.0 points
- Services backlog of $141 billion, up 1 percent, up 5 percent adjusting for currency; Closed 22 deals of more than $100 million in the quarter
- Systems and Technology revenue down 17 percent, down 16 percent adjusting for currency
- Growth markets revenue down 1 percent, up 1 percent adjusting for currency
- Business analytics revenue up 7 percent; Smarter Planet revenue up more than 25 percent; Cloud revenue up more than 70 percent
- Reiterating full-year 2013 operating (non-GAAP) EPS expectation of at least $16.70.
IBM announced first-quarter 2013 diluted earnings of $2.70 per share, a year-to-year increase of 3 percent. Operating (non-GAAP) diluted earnings were $3.00 per share, compared with operating diluted earnings of $2.78 per share in the first quarter of 2012, an increase of 8 percent.
First-quarter net income was $3.0 billion, down 1 percent year-to-year. Operating (non-GAAP) net income was $3.4 billion compared with $3.3 billion in the first quarter of 2012, an increase of 3 percent. Total revenues for the first quarter of 2013 of $23.4 billion were down 5 percent (down 3 percent, adjusting for currency) from the first quarter of 2012.
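For readers who like to check the math, the year-over-year percentages above follow directly from the dollar figures in the release:

```python
def yoy_pct(current: float, prior: float) -> int:
    """Year-over-year percent change, rounded to the whole percent
    as reported in the earnings release."""
    return round((current - prior) / prior * 100)

# Operating (non-GAAP) EPS: $3.00 this quarter vs. $2.78 a year earlier
print(yoy_pct(3.00, 2.78))  # 8 percent, matching the release
```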
“In the first quarter, we grew operating net income, earnings per share and expanded operating margins but we did not achieve all of our goals in the period. Despite a solid start and good client demand we did not close a number of software and mainframe transactions that have moved into the second quarter. The services business performed as expected with strong profit growth and significant new business in the quarter,” said Ginni Rometty, IBM chairman, president and chief executive officer.
“Looking ahead, in addition to closing those transactions, we expect to benefit from investments we are making in our growth initiatives and from the actions we are taking to improve under-performing parts of the business. We remain confident in this model of continuous transformation and in our ability to deliver our full-year 2013 operating earnings per share expectation of at least $16.70.”
Pre-tax income decreased 6 percent to $3.6 billion. Pre-tax margin decreased 0.1 points to 15.4 percent. Operating (non-GAAP) pre-tax income decreased 1 percent to $4.1 billion and pre-tax margin was 17.4 percent, up 0.8 points.
IBM’s tax rate was 15.9 percent, down 4.1 points year over year; operating (non-GAAP) tax rate was 17.3 percent, down 3.2 points compared to the year-ago period. The lower tax rate is primarily due to benefits recorded to reflect changes in tax laws enacted during the quarter, including the reinstatement of the U.S. Research and Development Tax Credit.
Net income margin increased 0.5 points to 13.0 percent. Total operating (non-GAAP) net income margin increased 1.2 points to 14.4 percent.
The weighted-average number of diluted common shares outstanding in the first-quarter 2013 was 1.12 billion compared with 1.17 billion shares in the same period of 2012. As of March 31, 2013, there were 1.11 billion basic common shares outstanding.
Debt, including Global Financing, totaled $33.4 billion, compared with $33.3 billion at year-end 2012. From a management segment view, Global Financing debt totaled $25.2 billion versus $24.5 billion at year-end 2012, resulting in a debt-to-equity ratio of 7.2 to 1. Non-global financing debt totaled $8.2 billion, a decrease of $0.6 billion since year-end 2012, resulting in a debt-to-capitalization ratio of 34.3 percent from 36.1 percent.
IBM ended the first-quarter 2013 with $12.0 billion of cash on hand and generated free cash flow of $1.7 billion, excluding Global Financing receivables, down approximately $0.2 billion year over year. The company returned $3.5 billion to shareholders through $0.9 billion in dividends and $2.6 billion of gross share repurchases. The balance sheet remains strong, and the company is well positioned to support the business over the long term.
Anybody watch that little ol’ college basketball game last night between Louisville and Michigan?
Whoa. Talk about saving the best for last. “The end of the road,” indeed.
Hats off to Louisville for reaching and staying number one, especially after the first half of the final, when I thought Michigan might run away with the show!
Now that the Final Four is over, I can give my undivided attention to my favorite sport, the game of golf.
For the longest time, golf has been a sport that has exulted in its traditions and basked in its conservatism, technological and otherwise.
But in order to keep the sport vibrant, everyone from golf’s governing bodies to entrepreneurs is finding new ways of introducing, bolstering, and sharing information about the sport.
Yesterday in Augusta, chairman Billy Payne inaugurated a new “Drive, Chip and Putt Championship” for youngsters ages 7-15, which will hold its first finals at Augusta National just prior to next year’s Masters.
And though we’ve seen remarkable technology evolution with regards to playing equipment on the golf course, I think we’re just getting going in terms of using data and analytics to help the amateur golfer.
I’ve been using a product called “GolfshotGPS” for some time now to help me conduct some basic analysis of my golf game, but let’s face it, having to do data entry on the golf course takes time away from playing and enjoying the scenery.
Enter “GAME GOLF,” an outfit that originated in Galway, Ireland, and that is working to bring more sophisticated analytics more easily to the game of golf, doing so in a way that will let us mere amateurs “compete” with PGA veterans like Graeme McDowell and Lee Westwood (two pros who have done early prototype development of GAME GOLF’s technology).
The idea is simple: Using GAME GOLF’s small wireless hub and a set of golf club “tags,” one for each club in a golfer’s bag, GAME will analyze all the critical elements of one’s golf game. Think of it as having RFID tags on every one of your golf clubs.
GAME records every club used, every swing made, every yard covered in each round, WITHOUT pausing play to enter info into your Android device.
Then, GAME calculates key statistics: Scoring, number of putts, greens in regulation, driving accuracy, and so forth.
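Using the standard definitions of these statistics, the calculation from per-hole data is straightforward. Here’s a minimal sketch (the record format is my own invention, not GAME GOLF’s): each hole is a tuple of par, total strokes, and putts, and a green is hit “in regulation” when you reach it with at least two strokes to spare against par.

```python
def round_stats(holes):
    """Compute basic golf statistics from per-hole records of the kind
    a club-tag system could log. Each hole: (par, strokes, putts).
    The record format is hypothetical; the stat definitions are standard."""
    score = sum(strokes for _, strokes, _ in holes)
    putts = sum(p for _, _, p in holes)
    # Green in regulation: strokes to reach the green (strokes minus putts)
    # is at most par minus 2
    gir = sum(1 for par, strokes, p in holes if strokes - p <= par - 2)
    return {"score": score, "putts": putts, "greens_in_regulation": gir}
```

The interesting part is that none of this requires the golfer to type anything in mid-round; the tags supply the raw shot log and the stats fall out for free.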
But GAME doesn’t just give YOU, the golfer, data. Golf is at its essence a competitive sport you play against yourself and others, so GAME will also share your performance with friends on social networks, and also help connect you with other golfers on GAME’s network.
I can tell you from having analyzed my own game with the limited data I’ve had access to, I’ve been able to improve my game (although improving my “mechanics” was where I saw my biggest improvement).
Golf is an iterative game when it comes to improving, but the smallest of tweaks can have relatively big payoffs (Steve Stricker’s recent putting advice for Tiger Woods, by way of example).
If you know you’re three-putting 60 percent of the holes you play…well, I hate to tell you, but you probably ought to head out right now and spend some significant time on the putting green.
But it’s the “fantasy” aspect of GAME that serves up the most intrigue for me. What if Tiger and Brandt and Bubba and others also started using GAME during their Tour events, and suddenly I and my fantasy golf friends could start competing directly with the pros in “virtual” matches?
First, yes, me and my amateur friends, we’d lose, and big time.
But, with the proper handicap adjustments, suddenly we find ourselves on the first tee at Augusta the first Thursday of April with Tiger and Phil, shaking in our boots and hoping we don’t kill someone in the fairway with our first drive.
You can learn more about GAME GOLF in the video below. There’s currently a crowdfunding campaign that has been extended to April 15th.
It’s too early to tell if this will be a golf GAME changer or not, but I think with golf, more information is always better than less.
GAME GOLF seems to provide just enough new information (without hassle in acquiring it) that has the potential to make me a better golfer, and to make the game that much more fun.
Who can argue with that?
Hello Monday, in my favorite week of the year.
Yes, it’s that time again, Masters Week, when the best golfers come together on a classic golf course down in Augusta to test themselves against the best.
The history, the traditions, and such behind The Masters are all well and good, but for true and rabid golf fans like myself, it’s the actual competitive golf from Thursday through Sunday afternoon that we live and breathe for.
Though all eyes this week are on Tiger Woods, Rory McIlroy’s new Nike clubs seemed to be warming up to him down here in the heart of Texas over the weekend.
Rory shot a 66 yesterday at the Valero Texas Open, catching some much needed momentum heading into Augusta and finishing second at -12, just two strokes back from winner Martin Laird.
If I were a handicapping man, I’d also be on the lookout this week for the likes of Brandt Snedeker, whose hot putter will likely find lots of love at Augusta National; Freddie Jacobson, whose painter’s cap could very well point him in the right directions on Augusta’s undulating nightmare greens; Nicolas Colsaerts, the “Belgian Bomber” playing his first Masters, still one of the longest hitters in the world; and Matt Kuchar, whose victory at the WGC-Accenture Match Play showed he can hold up under the pressure and take it into the homestretch.
And let’s not forget last year’s winner, Bubba Watson, who might just be up for a repeat.
It’s a difficult tournament to handicap, which is what makes it so interesting to watch.
Speaking of golf, it’s a crazy game. I went out and played twice this weekend…on Saturday, I made three birdies and still shot a 90 (but I also had a 10 on one hole, where I had a Kevin Na-like moment as I tried to hack my way out of some woods).
On Sunday, I rediscovered my swing (especially for my driver), hitting a 350-yard drive on one par 4, and overshooting a 290-yard par 4 with a 3-wood (the wind was VERY much at my back on both holes). Both just gorgeous shots that I couldn’t believe came off of MY club.
I shot an 81, my low for the calendar year, and it was night and day, like I’d been playing two different games from Saturday to Sunday.
Then again, that’s golf!
The last time Tiger Woods was the number one ranked golfer in the world was October 2010. That’s a grand total of 29 months ago.
That all changed this week at Arnold Palmer’s Bay Hill Invitational, which Tiger Woods won running away at -13. That marks the eighth time Woods has won that same PGA Tour event.
Justin Rose gave Woods his best but faltered on Saturday before attempting a comeback in Monday’s round (after torrential storms in and around Orlando postponed play on Sunday), and Rickie Fowler tried to match Woods’ performance in the final grouping, but Woods’ irons were too much for Fowler and all the “chasers.”
And then there was Woods’ putting, which was nothing short of masterful. For the week, he made 19 of 28 putts between 7 and 20 feet. It was like the Tiger of old — the golf ball seemed to just follow a line from Woods’ putter to the middle of the hole, over and over and over again.
You could hear professional golfers around the globe simply deflate with each stroke of Tiger’s Nike Method putter.
So, Tiger now has 77 PGA Tour wins, only five away from legend Sam Snead’s 82.
And then there’s The Masters coming up in Augusta in mid-April, the golfing equivalent of the Super Bowl.
You think a few odds makers in Vegas now have Tiger to win this year’s Masters?
Not that I would ever gamble on such a thing, but money does talk, and in this case, online casino Bovada already has Tiger at 11/4 odds to take this year’s green jacket.
But since this is a data driven, technology-oriented blog, let’s look at a few more numbers.
Bleacher Report’s Ryan Rudnansky observes that in 2010, Tiger ranked 109th in putting (strokes gained). 45th in 2011. 36th last year. And this year?
You got it: numero uno.
At Doral, he recorded just 100 putts for the 72 holes, the lowest putting mark in his career.
Oh, yes, and he’s won three times this year in four stroke-play tournaments (we’ll disregard his nasty bit of business at the Accenture Match Play, where Charles Howell III ousted him in the first match).
Is Tiger’s taking the Masters in two weeks a done deal?
Of course not.
Would I pick him over all the other players in the field?
What do you think?