The race to build the next great artificial intelligence startup has shone a spotlight on both the demand for and the necessity of GPUs as a source of computing power.
Every AI startup from Silicon Valley to Singapore has been searching for GPUs to power its generative AI projects or large language models. The situation has become so extreme that it has led to some bold behaviour, including venture funds buying up their own clusters of GPUs simply to be able to offer them to portfolio companies.
However, with all the euphoria around GPUs, some have been asking: what about NPUs?
What is it:
- NPUs, or Neural Processing Units, are microprocessors that accelerate machine learning algorithms
- NPUs can handle the massively parallel matrix computations required for machine learning and artificial intelligence more efficiently than GPUs
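To make the workload concrete: the bulk of what NPUs accelerate is multiply-accumulate (MAC) arithmetic, the repeated "multiply a weight by an input and add to a running total" at the heart of every neural-network layer. The sketch below is purely illustrative, in plain Python with toy numbers; it shows the math an NPU's hardware MAC arrays perform, not how any particular chip executes it.

```python
# Illustrative sketch (not any vendor's API): the multiply-accumulate
# work inside one small dense neural-network layer, which NPUs are
# built to execute in massively parallel hardware.

def dense_layer(x, weights, bias):
    """Compute y = ReLU(W @ x + b) with plain Python loops."""
    out = []
    for row, b in zip(weights, bias):
        acc = b
        for w, xi in zip(row, x):
            acc += w * xi          # one multiply-accumulate (MAC)
        out.append(max(acc, 0.0))  # ReLU activation
    return out

# A 2-neuron layer over a 3-element input (toy values for illustration).
x = [1.0, 2.0, 3.0]
W = [[0.5, -1.0, 0.25],
     [1.0,  0.0, -0.5]]
b = [0.1, 2.0]
print(dense_layer(x, W, b))  # → [0.0, 1.5]
```

A modern model runs billions of these MACs per inference, which is why dedicated silicon for this one operation matters.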
Why it matters:
- The rapid increase in demand for GPUs has sent chip makers such as Nvidia soaring to new heights and stirred up serious debate around the future of AI computing
- Many in the startup and venture capital community believe that NPUs could be the next most important iteration in computing power for artificial intelligence
- Venture funds and startups alike are trying to position themselves for what is to come next in AI
Who is making moves:
- Hyundai Motors recently invested $50M in Canadian AI company Tenstorrent to power its push into autonomous vehicles, which will require NPUs for deep learning
- Wireless connectivity company CEVA Inc (NASDAQ: CEVA) recently rolled out next-generation NPU IP designed to handle generative AI effectively and at reduced cost, announcing that its AI engines are compatible with both classic AI models and deep learning models
- South Korea-based startup AiM recently closed a Series A round for its leading-edge NPU IP, with notable investment from L&S Venture Capital, Hi Investment Partners, WE Ventures and others