Inference

Today Qualcomm is revealing more information on its "Cloud AI 100" inference chip and platform, announced last year. The new inference platform is said to have already entered production, with first silicon successfully back and first customer sampling underway. The Cloud AI 100 is Qualcomm's first foray into the datacentre AI inference accelerator business, representing the company's investment in machine learning and leveraging its expertise from the consumer mobile SoC world to bring it to the enterprise market. Qualcomm first revealed the Cloud AI 100 early last year, although admittedly that was more of a paper launch than a disclosure of what the hardware actually brought to the table. Today, with actual silicon in the...

Cerebras Wafer Scale Engine News: DoE Supercomputer Gets 400,000 AI Cores

One of the more interesting AI silicon projects over the last couple of years has been the Cerebras Wafer Scale Engine, most notably for the fact that a single...

by Dr. Ian Cutress on 8/21/2020

Hot Chips 31 Live Blogs: Intel 10nm Spring Hill NNP-I Inference Chip

One of Intel's future 10nm products is the Spring Hill NNP-I 1000 Inference Engine. Today the company is lifting the lid on some of the architecture behind the chip.

by Dr. Ian Cutress on 8/20/2019

Intel Acquires Omnitek: FPGA Video Acceleration and Inferencing

One of the characteristics of Intel is its investment in new IP. This usually takes several forms, such as internal R&D, investing in other companies through Intel Capital, or...

by Ian Cutress on 4/16/2019

AI On The Edge: New Flex Logix X1 Inference AI Chip For Fanless Designs

A large number of inference demonstrations published by the big chip manufacturers revolve around processing large batch sizes of images on trained networks. In reality, when video is being...

by Ian Cutress on 4/10/2019

Scaling Inference with NVIDIA’s T4: A Supermicro Solution with 320 PCIe Lanes

When visiting the Supercomputing conference this year, there were plenty of big GPU systems on display for machine learning. A large number were geared towards the heavy-duty cards...

by Ian Cutress on 11/19/2018
