Turbotodd

Ruminations on tech, the digital media, and some golf thrown in for good measure.

Smarter Chips


Couldn’t help but notice these two in-the-same-orbit headlines from Amazon and Google re: their own AI chips.

First, The Information is reporting that Amazon is developing an AI chip to work on the Echo and other hardware powered by Alexa.

They report that the chip should let Alexa-powered devices respond to commands more quickly by handling more of the data processing on the device itself rather than in the cloud.

It seems the cloud’s edge is moving back towards the center.

And at Google, according to a post on the Google Cloud Platform blog, the company's Cloud Tensor Processing Units (TPUs) are now available in beta to help machine learning experts train and run their ML models more quickly.

Some speeds and feeds deets:

Cloud TPUs are a family of Google-designed hardware accelerators that are optimized to speed up and scale up specific ML workloads programmed with TensorFlow. Built with four custom ASICs, each Cloud TPU packs up to 180 teraflops of floating-point performance and 64 GB of high-bandwidth memory onto a single board. These boards can be used alone or connected together via an ultra-fast, dedicated network to form multi-petaflop ML supercomputers that we call “TPU pods.” We will offer these larger supercomputers on GCP later this year.
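For a sense of what that looks like from the TensorFlow side, here's a minimal, illustrative sketch of pointing a Keras model at a Cloud TPU. It assumes a current TensorFlow release and a hypothetical TPU node named "demo-tpu" in your GCP project; the beta Google announced used earlier TF 1.x constructs, so treat this as a rough illustration rather than the exact workflow they document.

import tensorflow as tf

# Locate the Cloud TPU; "demo-tpu" is a hypothetical node name in your GCP project.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="demo-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates the model across the TPU board's cores.
strategy = tf.distribute.TPUStrategy(resolver)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# Training then runs on the TPU like any other Keras call:
# model.fit(train_dataset, epochs=5)

The point being: the programming model stays plain TensorFlow, and the TPU hardware does the heavy lifting underneath.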

Written by turbotodd

February 12, 2018 at 4:28 pm
