Turbotodd

Ruminations on tech, the digital media, and some golf thrown in for good measure.

Archive for the ‘Uncategorized’ Category

IBM Expands Watson Data Platform to Help Unleash AI


IBM today announced new offerings for its Watson Data Platform, including data cataloging and data refining, which are designed to make it easier for developers and data scientists to analyze and prepare enterprise data for AI applications, regardless of its structure or where it resides. By improving data visibility and helping to better enforce data security policies, the platform now lets users connect and share data across public and private cloud environments.

By 2018, nearly 75 percent of developers will build AI functionality into their apps, according to IDC. However, they also face the obstacle of making sense of increasingly complex data that lives in different places, and that must be securely and continually ingested to power these apps.

Addressing these challenges, IBM has expanded the functionality of its Watson Data Platform, an integrated set of tools, services and data on the IBM Cloud designed to enable data scientists, developers and business teams to gain intelligence from the data most important to their roles, as well as easily access services like machine learning, AI and analytics.

“We are always looking for new ways to gain a more holistic view of our clients’ campaign data, and design tailored approaches for each ad and marketing tactic,” said Michael Kaushansky, Chief Data Officer at Havas, a global advertising and marketing consultancy. “The Watson Data Platform is helping us do just that by quickly connecting offline and online marketing data. For example, we recently kicked off a test for one of our automotive clients, aiming to connect customer data, advertising information in existing systems, and online engagement metrics to better target the right audiences at the right time.”

Specifically, this expansion includes:

  • New Data Catalog and Data Refinery offerings, which bring together datasets that live in different formats on the cloud, in existing systems and in third-party sources, and apply machine learning to process and cleanse this data so it can be ingested for AI applications (a minimal sketch of this kind of data refining follows this list);
  • The ability to use metadata, pulled from Data Catalog and Data Refinery, to tag data and help enforce a client’s data governance policies, giving teams a foundation to more easily identify risks when sharing sensitive data;
  • The general availability of Analytics Engine to separate the storage of data from the information it holds, allowing it to be analyzed and fed into apps at much greater speeds. As a result, developers and data scientists can more easily share and build with large datasets.
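
To make the "refining" step a bit more concrete, here is a minimal, hypothetical sketch in Python using pandas. It is not the Data Refinery API; the column names and cleansing rules are invented purely for illustration of the kind of work involved in preparing data for AI applications.

```python
import pandas as pd

# Hypothetical raw customer records pulled from several systems (illustration only)
raw = pd.DataFrame({
    "customer_id": ["001", "002", "002", "003", None],
    "signup_date": ["2017-01-15", "2017/02/20", "2017/02/20", "bad-date", "2017-03-01"],
    "monthly_spend": ["120.50", "95", "95", "NaN", "210.75"],
})

refined = (
    raw
    .dropna(subset=["customer_id"])    # drop records missing a key
    .drop_duplicates()                 # remove exact duplicates
    .assign(
        signup_date=lambda d: pd.to_datetime(d["signup_date"], errors="coerce"),
        monthly_spend=lambda d: pd.to_numeric(d["monthly_spend"], errors="coerce"),
    )
    .dropna()                          # discard rows that failed type conversion
)

print(refined.dtypes)
print(refined)
```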

More details on the new offerings of the IBM Watson Data Platform may be found here.

To further help companies take control of all of their data no matter where it resides, IBM is also announcing a series of new features for its Unified Governance Platform. These bring greater visibility and management of clients’ global data, including new capabilities that help clients better prepare for impending data protection regulations such as GDPR.

Built on open source technologies and fueled by IBM Cloud, the Watson Data Platform brings together IBM’s cloud infrastructure, powerful data services and decades of experience helping clients across industries solve their data challenges. Linked closely with the most popular communities among data scientists and developers, including Python and Spark, the Watson Data Platform continues to evolve to build the most open and complete data operating system on the cloud.

For more information on the Watson Data Platform, visit: https://www.ibm.com/analytics/us/en/watson-data-platform/.

To try and explore the Watson Data Platform, visit the tutorial: www.ibm.biz/wdptutorial.

Written by turbotodd

November 2, 2017 at 9:52 am

The iPhone X Files


Some notable funding rounds…TechCrunch reports that Seattle-based international remittance provider Remitly is raising a Series D round of $115 million led by PayU. The company lets people in the U.S., U.K., and Canada send money electronically to friends and family in developing countries across Africa, South America, and Asia.

And Berlin-based Ada Health, an AI-driven app that “works a little like an ‘Alexa for health,’” has raised a $47M round led by Access Industries. Also according to TechCrunch, the funds will be used to improve the product, hire staff and open a new office in the U.S.

Ada has become one of the world’s fastest-growing medical apps in 2017. In a chat interface, it helps people decipher their ailments, but then also connects them with real doctors.

In the meantime, reviews of the iPhone X are starting to roll off the presses…

Mashable: “Holding the iPhone X in my hand triggers a distinct memory of the moment I first cradled the original iPhone…There is no home button, just a stunning 5.8-inch screen that hugs the edge, reaching for the silver band that’s just millimeters away. All of it is so beautiful, save for that peninsula of darkness at the top.”

BuzzFeed: “The iPhone X might be a huge step forward in terms of hardware, but it runs iOS 11 just the same as other recent iPhones, and you won’t really be missing out on anything except Animoji. Face ID seems like it’s off to a good start, but it’s definitely inconsistent in certain lighting conditions. And until your favorite apps are updated, you won’t be able to make use of that entire beautiful display.”

Engadget: “Once my face was enrolled in Face ID — a process that took less than 20 seconds — unlocking the phone worked damn near perfectly every time. You still have to swipe up to view your home screen after you’ve unlocked the X with your mug, but the whole process is very nearly as fast as using a Touch ID sensor in a recent iPhone.”

Last I checked, my iPhone SE — despite being oh-so-2016 — still works fine and does pretty much everything the X does (except for Face ID), so I’m good. But nice to hear the (mostly) positive X reviews after all that buildup!

Written by turbotodd

October 31, 2017 at 2:07 pm

Posted in 2017, apple, iphone x, Uncategorized


Better AI


Happy Monday.

That was some baseball game last night between the Houston Astros and the LA Dodgers at Minute Maid Park in downtown Houston.

The Astros go down 4-0 early, then come back and tie the game, and then the score seesaws back and forth all the way through the bottom of the tenth, when Houston’s Bregman hits a walk-off single to drive in the winning run.

No AI-driven computer simulation could likely have foreseen such an insane game with that outcome.

Then again, we might not want to imagine what might become of us if it could.

Which is why the Information Technology Industry Council, whose members include IBM, Amazon, Facebook, and Oracle, has listed five areas for improving the development of artificial intelligence.

According to a report in the Wall Street Journal, suggestions include designing autonomous systems that are consistent with international conventions and preserve human rights; prioritizing user safety; using large, representative data sets while identifying potential bias; and creating accountability frameworks to assuage concerns over liability when AI takes action after making a decision.

You can read the full ITI AI Policy Principles here.

 

Written by turbotodd

October 30, 2017 at 9:47 am

U.S. Economy Sees 3 Percent GDP Growth in Q3 2017


TGIF.

And thank heavens for another good quarter of U.S. GDP growth.

The U.S. economy grew at a 3 percent annual rate from July to September, propelled by steady spending from American businesses and households, according to the U.S. Commerce Department.

Consumer spending increased at a 2.4 percent rate in the quarter (with spending likely suppressed by storms Harvey and Irma), while businesses continued to step up investment spending (with non-residential fixed investment growing at 3.9 percent in the quarter).

The increase in real GDP in the third quarter  reflected positive contributions from personal consumption expenditures, private inventory investment, nonresidential fixed investment, exports, and federal government spending.

These increases were partly offset by negative contributions from residential fixed investment and state and local government spending.

You can see the full announcement from the Bureau of Economic Analysis here.

Written by turbotodd

October 27, 2017 at 8:34 am

Posted in 2017, economy, GDP, Uncategorized

IBM Transforms FlashSystem, Drives Down Cost of Data


IBM today announced sweeping advances in its all-flash storage solutions and software to drive down the costs of data and extend its solutions for hybrid and private cloud environments.

Some of the changes and additions include:

  • A new ultra-dense FlashSystem array capable of storing more data in the same footprint, helping lower data capacity costs by nearly 60 percent;
  • New Spectrum Virtualize software that allows simplified migration and disaster recovery of data to and from the IBM Public Cloud;
  • New software that enables IBM and non-IBM storage to be used with popular Docker and Kubernetes container environments;
  • A cloud-based software beta program that integrates storage with artificial intelligence and machine learning through new software to collect inventory and diagnostic information, helping optimize the performance, capacity and health of clients’ storage infrastructure.

“Companies are seeking guidance in modernizing their data from being a passive cost center to being the central hub for their business. IBM understands that only those that extensively analyze and exploit their data will benefit from it,” said Ed Walsh, GM, IBM Storage and SDI. “To help clients make this transformation, we are introducing new all-flash solutions that will dramatically lower the cost of storage systems while making data availability – whether on-site or in the cloud – a central part of their business strategy.”

In addition to the aforementioned features, updates to IBM Storage systems and software include:

  • New Platform Speeds Private Cloud Deployments – IBM Spectrum Access solutions offer what storage administrators and users need to deploy a private cloud quickly and efficiently, delivering the economics and simplicity of the cloud with the accessibility, virtualization and performance of an on-premises implementation;
  • Consumption-Based Pricing – a new utility offering enables a consumption-based buying model for hybrid cloud environments, leveraging most of the IBM storage and VersaStack portfolios, for users who prefer to buy storage as an operating expense;
  • Consolidated User Interface – a new interface for FlashSystem 900 consolidates activity and performance information in a single dashboard. Consistent with the user interfaces used in other IBM storage systems and IBM Spectrum Storage software, the UI simplifies operations and helps improve productivity;
  • VersaStack with FlashSystem – an extensive refresh of the IBM/Cisco VersaStack converged infrastructure offerings incorporates the newest FlashSystem announced today;
  • Investment Protection – several of the new all-flash storage and VersaStack solutions announced today are NVMe-ready, enabling them to take advantage of the NVMe offerings coming in 2018.

“With this announcement, IBM is demonstrating, among other things, how highly leveraged their FlashCore strategy is,” said Eric Burgener, Research Director for Storage at IDC.  “Next generation FlashCore enhancements, including higher density 3D TLC NAND-based media and hardware-assisted in-line compression and encryption, immediately improve the capabilities of multiple IBM All Flash Arrays by providing features that drive higher infrastructure density and improved security more cost-effectively.”

IBM’s leadership in storage systems and software is based on more than 380 system patents, including IBM FlashCore technology, and more than 700 patents for IBM Spectrum Storage software. As a result, IBM’s flash arrays have been ranked as a Leader in the Gartner Magic Quadrant for Solid State Arrays four years in a row, and IBM has been named the #1 Software-Defined Storage vendor by IDC for the third year in a row.

The new features to IBM’s all-flash systems and IBM Spectrum Storage software will be available in Q4. Clients interested in participating in the IBM beta program for cognitive support can inquire by visiting ibm.biz/FoundationPilot.

For more information about IBM Flash Storage please visit: https://www.ibm.com/storage/flash.

Written by turbotodd

October 26, 2017 at 9:12 am

You Don’t Know Me


Bloomberg is reporting that Apple has had production problems with the new iPhone X due to the sophisticated requirements of its new facial recognition technology, and is said to have told suppliers they could reduce the accuracy of Face ID to make the phone easier to manufacture.

The iPhone X is set to debut on November 3.

Also on the Apple Beat…Apple has acquired ten-year-old Auckland-based wireless charging company, PowerbyProxi, for an undisclosed sum, and according to Stuff Technology, plans on keeping the company powered up in New Zealand.

Meanwhile, if you’re not confused yet about the continuing evolution of bitcoin, there’s a new fork of forks. Blockchain firm Bloq (see what they did there with the name?) indicated it has created a rival to bitcoin called “Metronome” that will go on sale in December, according to Fortune.

“Today, bitcoin faces existential threats from forks, developer drama and so on. Knowing what we know and having a clean sheet of paper, we asked what we would build, and the answer is this,” said Jeff Garzik, CEO of Bloq and a longtime bitcoin developer.

Written by turbotodd

October 25, 2017 at 10:47 am

IBM: Five Innovations That Will Help Change Our Lives Within Five Years


IBM unveiled today the annual “IBM 5 in 5” (#ibm5in5) – a list of ground-breaking scientific innovations with the potential to change the way people work, live, and interact during the next five years.

Drum roll, please…:

In 1609, Galileo turned his newly built telescope to the sky and saw our cosmos in an entirely new way. His observations supported the theory that Earth and the other planets in our solar system revolve around the Sun, something that until then had been impossible to observe.

IBM Research continues this work through the pursuit of new scientific instruments – whether physical devices or advanced software tools – designed to make what’s invisible in our world visible, from the macroscopic level down to the nanoscale.

“The scientific community has a wonderful tradition of creating instruments to help us see the world in entirely new ways. For example, the microscope helped us see objects too small for the naked eye and the thermometer helped us understand temperature of the Earth and human body,” said Dario Gil, vice president of science & solutions at IBM Research. “With advances in artificial intelligence and nanotechnology, we aim to invent a new generation of scientific instruments that will make the complex invisible systems in our world today visible over the next five years.”

Innovation in this area could enable us to dramatically improve farming, enhance energy efficiency, spot harmful pollution before it’s too late, and prevent premature physical and mental health decline, to name a few examples. IBM’s global team of scientists and researchers is steadily bringing these inventions from the realm of our labs into the real world.

The IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM’s Research labs around the world that can make these transformations possible. Here are the five scientific instruments that will make the invisible visible in the next 5 years:

With AI, our words will open a window into our mental health

Brain disorders, including developmental, psychiatric and neurodegenerative diseases, represent an enormous disease burden, in terms of human suffering and economic cost.

For example, today, one in five adults in the U.S. experiences a mental health condition such as depression, bipolar disorder or schizophrenia, and roughly half of individuals with severe psychiatric disorders receive no treatment. The global cost of mental health conditions is projected to surge to US$6 trillion by 2030.

If the brain is a black box that we don’t fully understand, then speech is a key to unlock it. In five years, what we say and write will be used as indicators of our mental health and physical wellbeing.

Patterns in our speech and writing analyzed by new cognitive systems will provide tell-tale signs of early-stage developmental disorders, mental illness and degenerative neurological diseases that can help doctors and patients better predict, monitor and track these conditions.

At IBM, scientists are using transcripts and audio inputs from psychiatric interviews, coupled with machine learning techniques, to find patterns in speech to help clinicians accurately predict and monitor psychosis, schizophrenia, mania and depression. Today, it only takes about 300 words to help clinicians predict the probability of psychosis in a user.

In the future, similar techniques could be used to help patients with Parkinson’s, Alzheimer’s, Huntington’s disease, PTSD and even neurodevelopmental conditions such as autism and ADHD. Cognitive computers can analyze a patient’s speech or written words to look for tell-tale indicators found in language, including meaning, syntax and intonation.
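
To make the idea of "patterns in speech" a little more concrete, here is a minimal sketch in Python. It is not IBM's model; the features (sentence length, vocabulary diversity) and the sample snippets are invented purely to illustrate the kind of signal a cognitive system might start from before any clinical modeling.

```python
import re

def linguistic_features(transcript: str) -> dict:
    """Very rough text features of the kind a clinical NLP pipeline might start from."""
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", transcript.lower())
    return {
        "num_words": len(words),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(set(words)) / max(len(words), 1),  # vocabulary diversity
    }

# Hypothetical snippets (invented, not clinical data)
samples = [
    "I went to the store. Then I came home and rested.",
    "The store the store I went then then the voices said go home go home.",
]

for text in samples:
    print(linguistic_features(text))
```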

Combining the results of these measurements with those from wearable devices and imaging systems, collected in a secure network, can paint a more complete picture of the individual, helping health professionals better identify, understand and treat the underlying disease.

What were once invisible signs will become clear signals of patients’ likelihood of entering a certain mental state or how well their treatment plan is working, complementing regular clinical visits with daily assessments from the comfort of their homes.

Hyperimaging and AI will give us superhero vision

More than 99.9 percent of the electromagnetic spectrum cannot be observed by the naked eye. Over the last 100 years, scientists have built instruments that can emit and sense energy at different wavelengths.

Today, we rely on some of these to take medical images of our body, see the cavity inside our tooth, check our bags at the airport, or land a plane in fog. However, these instruments are incredibly specialized and expensive and only see across specific portions of the electromagnetic spectrum.

In five years, new imaging devices using hyperimaging technology and AI will help us see broadly beyond the domain of visible light by combining multiple bands of the electromagnetic spectrum to reveal valuable insights or potential dangers that would otherwise be unknown or hidden from view.

Most importantly, these devices will be portable, affordable and accessible, so superhero vision can be part of our everyday experiences.

A view of the invisible or vaguely visible physical phenomena all around us could help make road and traffic conditions clearer for drivers and self-driving cars. For example, using millimeter wave imaging, a camera and other sensors, hyperimaging technology could help a car see through fog or rain, detect hazardous and hard-to-see road conditions such as black ice, or tell us if there is some object up ahead and its distance and size.

Cognitive computing technologies will reason about this data and recognize what might be a tipped-over garbage can versus a deer crossing the road, or a pothole that could result in a flat tire.

Embedded in our phones, these same technologies could take images of our food to show its nutritional value or whether it’s safe to eat. A hyperimage of a pharmaceutical drug or a bank check could tell us what’s fraudulent and what’s not. What was once beyond human perception will come into view.

IBM scientists are today building a compact hyperimaging platform that “sees” across separate portions of the electromagnetic spectrum in one platform to potentially enable a host of practical and affordable devices and applications.
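
As a toy illustration of the fusion idea (not IBM's platform), the sketch below combines three hypothetical bands into a single per-pixel estimate and flags "suspicious" pixels; the band values, weights and threshold are all invented.

```python
import numpy as np

# Toy 4x4 scene: per-pixel intensity in three hypothetical bands
visible = np.array([[0.9, 0.9, 0.2, 0.2],
                    [0.9, 0.8, 0.2, 0.1],
                    [0.8, 0.8, 0.3, 0.2],
                    [0.9, 0.9, 0.2, 0.2]])
infrared = np.random.default_rng(0).uniform(0.0, 1.0, size=(4, 4))
millimeter_wave = np.random.default_rng(1).uniform(0.0, 1.0, size=(4, 4))

# Naive fusion: where the visible band is weak (fog, darkness), lean on the other bands
fused = np.where(visible < 0.3,
                 0.5 * infrared + 0.5 * millimeter_wave,
                 visible)

# Flag pixels that still look "suspicious" after fusion (e.g., a possible obstacle)
hazard_mask = fused > 0.7
print(fused.round(2))
print(hazard_mask)
```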

Macroscopes will help us understand Earth’s complexity in infinite detail

Today, the physical world only gives us a glimpse into our interconnected and complex ecosystem. We collect exabytes of data – but most of it is unorganized. In fact, an estimated 80 percent of a data scientist’s time is spent scrubbing data instead of analyzing and understanding what that data is trying to tell us.

Thanks to the Internet of Things, new sources of data are pouring in from millions of connected objects – from refrigerators, light bulbs and your heart rate monitor to remote sensors such as drones, cameras, weather stations, satellites and telescope arrays.

There are already more than six billion connected devices generating tens of exabytes of data per month, with a growth rate of more than 30 percent per year. After successfully digitizing information, business transactions and social interactions, we are now in the process of digitizing the physical world.

In five years, we will use machine learning algorithms and software to help us organize the information about the physical world to help bring the vast and complex data gathered by billions of devices within the range of our vision and understanding. We call this a “macroscope” – but unlike the microscope to see the very small, or the telescope that can see far away, it is a system of software and algorithms to bring all of Earth’s complex data together to analyze it for meaning.

By aggregating, organizing and analyzing data on climate, soil conditions, water levels and their relationship to irrigation practices, for example, a new generation of farmers will have insights that help them determine the right crop choices, where to plant them and how to produce optimal yields while conserving precious water supplies.

In 2012, IBM Research began investigating this concept at Gallo Winery, integrating irrigation, soil and weather data with satellite images and other sensor data to predict the specific irrigation needed to produce an optimal grape yield and quality. In the future, macroscope technologies will help us scale this concept to anywhere in the world.
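
At its core, a macroscope joins many data sources into one table and lets a model learn from it. The sketch below, with entirely made-up soil, weather and irrigation numbers, fits a small regression to suggest irrigation; it illustrates the approach, not the actual Gallo Winery system.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented historical records: [soil_moisture_pct, forecast_rain_mm, temperature_c]
features = np.array([
    [22.0,  0.0, 34.0],
    [35.0,  5.0, 30.0],
    [18.0,  0.0, 37.0],
    [40.0, 12.0, 26.0],
    [28.0,  2.0, 32.0],
])
# Irrigation actually applied (liters per vine) in those conditions -- also invented
irrigation_l = np.array([9.5, 4.0, 11.0, 1.5, 6.5])

model = LinearRegression().fit(features, irrigation_l)

# Suggest irrigation for today's sensor readings
today = np.array([[25.0, 1.0, 33.0]])
print(f"Suggested irrigation: {model.predict(today)[0]:.1f} liters per vine")
```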

Beyond our own planet, macroscope technologies could handle, for example, the complicated indexing and correlation of various layers and volumes of data collected by telescopes to predict asteroid collisions with one another and learn more about their composition.

Medical labs “on a chip” will serve as health detectives for tracing disease at the nanoscale

Early detection of disease is crucial. In most cases, the earlier the disease is diagnosed, the more likely it is to be cured or well controlled. However, diseases like cancer can be hard to detect – hiding in our bodies before symptoms appear.

Information about the state of our health can be extracted from tiny bioparticles in bodily fluids such as saliva, tears, blood, urine and sweat. Existing scientific techniques face challenges for capturing and analyzing these bioparticles, which are thousands of times smaller than the diameter of a strand of human hair.

In the next five years, new medical labs “on a chip” will serve as nanotechnology health detectives – tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor. The goal is to shrink down to a single silicon chip all of the processes necessary to analyze a disease that would normally be carried out in a full-scale biochemistry lab.

The lab-on-a-chip technology could ultimately be packaged in a convenient handheld device to allow people to quickly and regularly measure the presence of biomarkers found in small amounts of bodily fluids, sending this information securely streaming into the cloud from the convenience of their home.

There it could be combined with real-time health data from other IoT-enabled devices, like sleep monitors and smart watches, and analyzed by AI systems for insights. Taken together, this data set will give us an in-depth view of our health and alert us to the first signs of trouble, helping to stop disease before it progresses.
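
As a toy illustration of that alerting idea (the readings, units and threshold below are invented), a home device could flag a biomarker reading that drifts well away from a person's own baseline:

```python
from statistics import mean, stdev

def flag_biomarker(history: list[float], new_reading: float, z_threshold: float = 3.0) -> bool:
    """Flag a reading that deviates strongly from this person's own baseline."""
    baseline, spread = mean(history), stdev(history)
    z = abs(new_reading - baseline) / spread if spread else 0.0
    return z > z_threshold

# Hypothetical daily readings of some biomarker concentration (arbitrary units)
history = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1]
print(flag_biomarker(history, 4.2))   # False: within normal variation
print(flag_biomarker(history, 7.5))   # True: worth a call to the doctor
```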

At IBM Research, scientists are developing lab-on-a-chip nanotechnology that can separate and isolate bioparticles down to 20 nanometers in diameter, a scale that gives access to DNA, viruses, and exosomes. These particles could be analyzed to potentially reveal the presence of disease even before we have symptoms.

Smart sensors will detect environmental pollution at the speed of light

Most pollutants are invisible to the human eye, until their effects make them impossible to ignore. Methane, for example, is the primary component of natural gas, commonly considered a clean energy source. But if methane leaks into the air before being used, it can warm the Earth’s atmosphere. Methane is estimated to be the second largest contributor to global warming after carbon dioxide (CO2).

In the United States, emissions from oil and gas systems are the largest industrial source of methane gas in the atmosphere. The U.S. Environmental Protection Agency (EPA) estimates that more than nine million metric tons of methane leaked from natural gas systems in 2014.

Measured as CO2-equivalent over 100 years, that’s more greenhouse gases than were emitted by all U.S. iron and steel, cement and aluminum manufacturing facilities combined.

In five years, new, affordable sensing technologies deployed near natural gas extraction wells, around storage facilities, and along distribution pipelines will enable the industry to pinpoint invisible leaks in real-time.

Networks of IoT sensors wirelessly connected to the cloud will provide continuous monitoring of the vast natural gas infrastructure, allowing leaks to be found in a matter of minutes instead of weeks, reducing pollution and waste and the likelihood of catastrophic events.

Scientists at IBM are tackling this vision, working with natural gas producers such as Southwestern Energy to explore the development of an intelligent methane monitoring system, as part of the ARPA-E Methane Observation Networks with Innovative Technology to Obtain Reductions (MONITOR) program.

At the heart of IBM’s research is silicon photonics, an evolving technology that transfers data by light, allowing computing literally at the speed of light.

These chips could be embedded in a network of sensors on the ground or within infrastructure, or even fly on autonomous drones, generating insights that, when combined with real-time wind data, satellite data, and other historical sources, can be used to build complex environmental models to detect the origin and quantity of pollutants as they occur.
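
Here is a minimal sketch of how continuously streamed readings could be turned into a leak alert; the concentrations and threshold are invented, and a real system would combine many sensors with wind and dispersion models.

```python
from collections import deque

def leak_detector(readings_ppm, window: int = 10, rise_threshold: float = 5.0):
    """Yield an alert when a reading jumps well above the recent rolling baseline."""
    recent = deque(maxlen=window)
    for t, value in enumerate(readings_ppm):
        if len(recent) == window:
            baseline = sum(recent) / window
            if value - baseline > rise_threshold:
                yield t, value, baseline
        recent.append(value)

# Invented methane concentrations (ppm) from a single sensor, with a leak starting at t=15
stream = [2.0, 2.1, 1.9, 2.0, 2.2, 2.1, 2.0, 1.9, 2.1, 2.0,
          2.1, 2.0, 2.2, 2.1, 2.0, 9.5, 12.3, 15.0, 14.8, 15.2]

for t, value, baseline in leak_detector(stream):
    print(f"t={t}: {value:.1f} ppm vs baseline {baseline:.2f} ppm -> possible leak")
```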

For more information about the IBM 5 in 5, please visit: http://ibm.biz/five-in-five.

Written by turbotodd

January 5, 2017 at 8:53 am

Posted in Uncategorized
