Mythic AI gets funding to mass-produce edge chips

News Analysis
May 19, 2021 | 3 mins
Data Center | Edge Computing

The company specializes in inferencing with its analog, in-memory processor.

Credit: Dell Technologies

Just six months after unveiling its first AI inferencing processor, Mythic AI has announced a $70 million Series C funding round to begin mass production of its chips and to develop its next generation of hardware and software products.

In November, the company announced the M1108 Analog Matrix Processor (AMP) aimed at edge AI deployments across a wide range of applications, including manufacturing, video surveillance, smart cities, smart homes, AR/VR, and drones.

For a company that is nine years old and has zero sales, it’s got some heavy hitters behind it. The new investment round was led by venture fund giant BlackRock and Hewlett Packard Enterprise (HPE). Other investors include Alumni Ventures Group and UDC Ventures.

Mythic M1108

Mythic AI said it will use the latest funding to accelerate its plans to begin mass production of the M1108 while expanding its support for customers globally, building up its software offerings, and developing the next generation of its hardware platform.

Inference is the second step in machine learning, following training, and has far lower compute requirements. Training demands the massive horsepower of GPUs, FPGAs, and high-end CPUs, but inference simply runs new data through an already-trained model, so a full server processor is overkill.
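To make the distinction concrete, here is a minimal, generic PyTorch sketch, not tied to Mythic's hardware: training loops over forward and backward passes to update weights, while inference is a single forward pass through a frozen model.

import torch
import torch.nn as nn

# A toy model standing in for any trained network; sizes are arbitrary.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training: repeated forward and backward passes -- the compute-heavy phase.
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()      # gradient computation is what demands big hardware
    optimizer.step()

# Inference: one forward pass with gradients disabled -- the workload
# low-power edge chips like the M1108 are aimed at.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 128)).argmax(dim=1)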

By way of comparison, Intel’s early stab at an inference processor, the since-discontinued Nervana, consumed as little as 10 watts. A server CPU can draw around 200 watts and a GPU up to 500 watts, while the M1108 uses as little as 4 watts, so you can see why it might be ideal for a low-power edge deployment.

The M1108 chips are analog processors designed to deliver high performance with low power requirements. At the heart of the M1108 is the Mythic Analog Compute Engine (ACE), which performs analog compute-in-memory, executing deep neural-network models and storing their weight parameters on-chip with no external DRAM.
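As a rough mental model of what that means (plain NumPy here, not Mythic's toolchain), the operation a compute-in-memory engine accelerates is the matrix-vector multiply at the core of every neural-network layer, with the weight matrix held stationary:

import numpy as np

# Layer weights stay resident -- on the M1108 they live in the analog memory
# array itself rather than in external DRAM. Sizes here are arbitrary.
weights = np.random.randn(256, 128)

def layer(activations):
    # One neural-network layer: multiply by the stored weights, apply a ReLU.
    # A digital chip shuttles weights to its ALUs for this step; analog
    # compute-in-memory performs the multiply where the weights are stored.
    return np.maximum(weights @ activations, 0.0)

out = layer(np.random.randn(128))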

Each Mythic ACE is complemented by a digital subsystem that includes a 32-bit RISC-V nano processor, SIMD vector engine, 64KB of SRAM, and a high-throughput network-on-chip router. The chip comes in an M.2 form factor, which is increasingly common among SSDs.

An M.2 card is about the size of a stick of gum and lies flat against the motherboard it plugs into. Depending on the motherboard, the slot uses PCI Express Gen3 or Gen4. The Mythic processor board has a four-lane PCIe interface with up to 2GB/s of bandwidth.

Device makers and original-equipment manufacturers (OEMs) can choose from the single-chip M1108 Mythic AMP or a range of PCIe card configurations, including the M.2 M Key and M.2 A+E Key form factors, to fit a variety of needs.

On the software side, the M1108 supports standard machine-learning frameworks like PyTorch, TensorFlow 2.0, and Caffe.
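The article doesn't describe Mythic's deployment toolchain, but a typical first step when targeting an edge accelerator from one of those frameworks is exporting the trained model to a portable format. A hypothetical sketch using PyTorch's ONNX exporter follows; the model and file name are placeholders, and whether Mythic's flow ingests ONNX is not stated here.

import torch
import torch.nn as nn

# Placeholder standing in for a trained network destined for an edge device.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
model.eval()

# Export to ONNX, a common interchange format many accelerator toolchains ingest.
torch.onnx.export(model, torch.randn(1, 128), "edge_model.onnx", opset_version=13)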

Mythic has not said when it plans to bring its products to market.

Andy Patrizio is a freelance journalist based in Southern California who has covered the computer industry for 20 years and has built every x86 PC he’s ever owned, laptops not included.

The opinions expressed in this blog are those of the author and do not necessarily represent those of ITworld, Network World, its parent, subsidiary or affiliated companies.