Google Makes A Really Fast Microprocessor, RIP Core Competency Concept

So much for the concept of core competency. Anyone can do anything now, because the cost of doing everything has fallen. That is one of the problems facing companies without platforms. It is why IBM is struggling: it has no platform, and everything it does can be replicated by its clients with marginal pain. As technology costs drop, there is less need to outsource. You can simply build in-house.

Google, Facebook, and a host of other companies are doing just that. Facebook builds some of its own servers, largely from open-source designs. Google has entered the chip business.

Google has designed and deployed a second generation of its Tensor Processing Unit (TPU) and is giving access to the machine-learning ASIC as a cloud service for commercial customers and researchers. A server with four of the so-called Cloud TPUs delivers 180 TFlops and will be used for both training and inference tasks.

The effort aims to harness rising interest in machine learning to drive use of Google’s cloud services. It also aims to rally more users around its open-source TensorFlow framework, the only software interface that the new chip supports.

The Cloud TPU supports floating-point math, which Google encourages for both training and inference jobs to simplify deployment. The first-generation ASIC used quantized integer math and was focused solely on inference.
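To see why quantized integer math complicates deployment, here is a minimal sketch in plain Python of the kind of 8-bit quantization scheme a first-gen-style inference chip relies on. The function names, scale, and zero-point values are illustrative assumptions, not Google's actual implementation:

```python
def quantize(x, scale, zero_point):
    """Map a float to an 8-bit integer: q = round(x / scale) + zero_point."""
    q = round(x / scale) + zero_point
    return max(0, min(255, q))  # clamp to the uint8 range

def dequantize(q, scale, zero_point):
    """Recover an approximate float from the quantized integer."""
    return (q - zero_point) * scale

# Hypothetical activation value and quantization parameters.
x = 0.7371
scale, zero_point = 1.0 / 127, 128

q = quantize(x, scale, zero_point)
x_approx = dequantize(q, scale, zero_point)

# Quantization is lossy: x_approx only approximates x, and every model
# must be converted (and re-validated) before it can run on the chip.
print(q, round(x_approx, 4))  # → 222 0.7402
```

A chip that natively handles floating point skips this conversion step entirely, which is the deployment simplification the paragraph above refers to.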

What is it that Google cannot do?
