Groq

Startup

AI inference chip company delivering some of the fastest LLM inference speeds available.

Founded: 2016
HQ: Mountain View, CA
CEO: Jonathan Ross
groq.com

Employees

200-500

Funding

$640M+

Valuation

$2.8B

Status

Private

About Groq

Groq is an AI semiconductor company that has built the Language Processing Unit (LPU), a custom chip architecture designed specifically for fast AI inference. Groq's LPU delivers dramatically faster inference speeds than traditional GPU-based solutions, generating hundreds of tokens per second for large language models. Founded by Jonathan Ross, who previously worked on Google's TPU project, Groq has demonstrated record-breaking LLM inference performance. The company offers a free API playground and cloud inference service, making ultra-fast AI accessible to developers. Groq's approach represents a fundamental rethinking of the hardware needed for AI inference at scale.
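Groq's cloud inference service exposes an OpenAI-compatible chat-completions API, which is why existing OpenAI client code typically works against it with only a base-URL and key change. As a minimal sketch, the snippet below builds (without sending) such a request against Groq's endpoint; the model name used here is an assumption and may change as Groq rotates its hosted models.

```python
# Sketch: construct an OpenAI-compatible chat-completions request for
# Groq's cloud API. The endpoint path follows Groq's published
# OpenAI-compatible scheme; the model name is an assumed example.
import json
import urllib.request

GROQ_ENDPOINT = "https://api.groq.com/openai/v1/chat/completions"

def build_chat_request(api_key: str, prompt: str,
                       model: str = "llama-3.1-8b-instant") -> urllib.request.Request:
    """Build (but do not send) an HTTP POST request for Groq's chat API."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GROQ_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example: inspect the request without making a network call.
req = build_chat_request("YOUR_API_KEY", "Why does low-latency inference matter?")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (or swapping in Groq's official SDK) returns a standard chat-completion JSON response, so tooling written for the OpenAI API shape can consume it unchanged.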

Key People

Jonathan Ross

CEO & Founder
