Sunday, June 01, 2025

AI inference: Groq vs NVIDIA? "LPU" vs "GPU"

Groq’s founder on why AI’s next big shift isn’t about Nvidia - YouTube

Groq is Fast AI Inference

Delivering Fast AI Inference with the LPU

The Groq Language Processing Unit, the LPU, is the technology that meets this moment. The LPU delivers instant speed, unparalleled affordability, and energy efficiency at scale. Fundamentally different from the GPU – originally designed for graphics processing – the LPU was designed for AI inference and language.



A free developer tier is available.

Groq, Inc. is an American artificial intelligence (AI) company that builds an AI accelerator application-specific integrated circuit (ASIC) that they call the Language Processing Unit (LPU) and related hardware to accelerate the inference performance of AI workloads.

Examples of the types of AI workloads that run on Groq's LPU include large language models (LLMs),[2][3] image classification,[4] anomaly detection,[5][6] and predictive analysis.
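Groq serves these models through an OpenAI-compatible HTTP API. A minimal sketch of a chat-completion call, using only the standard library; the endpoint path follows Groq's public docs, but the model ID here is an illustrative assumption (check the Groq console for current model names):

```python
# Sketch: call Groq's OpenAI-compatible chat completions API.
# The model ID below is an assumption; consult Groq's docs for current IDs.
import json
import os
import urllib.request

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(prompt, model="llama-3.1-8b-instant"):
    """Build the JSON payload for a single-turn chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask_groq(prompt):
    """Send the payload with a Bearer token and return the reply text."""
    req = urllib.request.Request(
        GROQ_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['GROQ_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Only hit the network if a key is configured (free dev keys exist).
if os.environ.get("GROQ_API_KEY"):
    print(ask_groq("In one sentence, what is an LPU?"))
```

Because the API is OpenAI-compatible, existing OpenAI client libraries can also be pointed at the Groq base URL instead of hand-rolling requests.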


LiveScript: functional programming language, compiles to JavaScript

LiveScript (programming language) - Wikipedia

LiveScript is a functional programming language that transpiles to JavaScript. It was created by Jeremy Ashkenas, the creator of CoffeeScript, and others.

# Define a function; `!` invokes it with no arguments
hello = ->
  console.log 'hello, world!'
hello!
# > hello, world!

# Pipe operator; `capitalize` comes from the prelude.ls standard library
"hello!" |> capitalize |> console.log
# > Hello!



LiveScript is a language which compiles to JavaScript. It has a straightforward mapping to JavaScript and allows you to write expressive code devoid of repetitive boilerplate. While LiveScript adds many features to assist in functional-style programming, it also has many improvements for object-oriented and imperative programming.

LiveScript is a fork of Coco and an indirect descendant of CoffeeScript, with which it has much compatibility.