Saturday, May 09, 2026

Max Market Cap, May 2026

AI-maxing... Google is the long-term "winner" from "first principles"... 

Until SpaceX and Tesla are merged, Terafab is complete, and if growth continues :)

 Largest Companies by Market Cap in 2026 | The Motley Fool


Company | Market Cap | Share Price | Sector
Nvidia (NASDAQ:NVDA) | $5.2 trillion | $215.16 | Information Technology
Alphabet (NASDAQ:GOOG) | $4.8 trillion | $396.94 | Communication Services
Apple (NASDAQ:AAPL) | $4.2 trillion | $293.41 | Information Technology
Microsoft (NASDAQ:MSFT) | $3.1 trillion | $415.17 | Information Technology
Amazon (NASDAQ:AMZN) | $2.9 trillion | $272.66 | Consumer Discretionary
Taiwan Semiconductor Manufacturing (NYSE:TSM) | $2.1 trillion | $410.67 | Information Technology
Broadcom (NASDAQ:AVGO) | $2.0 trillion | $430.18 | Information Technology
Tesla (NASDAQ:TSLA) | $1.6 trillion | $427.99 | Consumer Discretionary
Meta Platforms (NASDAQ:META) | $1.5 trillion | $609.53 | Communication Services
Walmart (NASDAQ:WMT) | $1.0 trillion | $130.40 | Consumer Staples
Berkshire Hathaway (NYSE:BRKA) | $1.0 trillion | $716,020.00 | Financials


How did Broadcom (Wikipedia) become so highly valued?
Similar to Nvidia, with relatively few employees (~33K).


Anthropic AI: Bun.js: Zig fork; Rust port?

Zig does not allow LLM contributions... so it needs to be forked to leverage AI tools?

Now, as part of Anthropic, likely with unlimited AI tokens available, Bun could just as well be ported to any language of choice, like Rust... both Zig and Rust are compiled via LLVM... 

Zig is a "human friendly" language; Rust is much more complicated... 

And Anthropic is "playing to win," not playing to "be nice"... 

oven-sh/bun: Incredibly fast JavaScript runtime, bundler, test runner, and package manager – all in one GitHub


Contributor Poker and Zig's AI Ban | Loris Cro's Blog

Bun on X: "We do not currently plan to upstream this, as Zig has a strict ban on LLM-authored contributions." / X

Bun’s Zig fork got 4x faster compilation times - Showcase - Ziggit

Bun's Rewrite It In Rust branch : r/rust

Jarred Sumner, the creator of Bun (a JS runtime), has created a Rust port branch in Bun's repository using Claude AI; the branch currently has 760k LoC.

I work on Bun and this is my branch This whole thread is an overreaction. 302 co... | Hacker News

"I’m curious to see what a working version of this looks [like], what it feels like, how it performs and if/how hard it’d be to get it to pass Bun’s test suite and be maintainable. I’d like to be able to compare a viable Rust version and a Zig version side by side."


Zig is at a crossroads - YouTube by PrimeTime

This video explores a controversy involving Bun (a JavaScript runtime), now owned by Anthropic, and its decision to fork the Zig programming language. Here are the key takeaways:

  • The Fork Controversy: Bun created a fork of the Zig language to implement a performance enhancement for parallel debug builds. However, they opted not to upstream these changes due to Zig’s strict policy prohibiting LLM-authored contributions.

  • The Zig Philosophy: Zig is known for its careful, deliberate approach to development, often prioritizing long-term stability and correct design over rapid feature velocity. The language maintainers have expressed that they have their own, more robust solutions in development for the performance issues Bun was trying to solve.

  • AI and Open Source: The video highlights a growing friction point where organizations using advanced AI models attempt to force rapid changes onto open-source projects. This creates a cultural clash between "move fast" engineering practices and the methodical, community-driven development style of long-standing language projects.

  • Engineering Reality Check: Zig maintainers clarified that the parallelization strategy Bun implemented could introduce non-deterministic bugs (causing random compilation failures). They emphasized that true performance gains come from addressing architectural bottlenecks, such as LLVM overhead, rather than simply hacking in parallelization.

  • The Bigger Picture: The incident serves as a case study for the risks of forking software based on AI-generated spikes. It reinforces the importance of deep, domain-specific engineering over superficial speed-ups that may compromise code reliability.
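To make the non-determinism concern concrete, here is a toy sketch (invented example, nothing to do with Zig's actual compiler internals) of one flavor of the problem: parallel workers finish in an arbitrary order, so anything emitted in completion order can differ run to run, while collecting results and emitting in a fixed index order stays deterministic.

```typescript
// Toy illustration: why naive parallel emission is non-deterministic.
// Invented example; not Zig's or Bun's actual build pipeline.

// Simulated compile results arriving in some completion order
// (in a real parallel build this order varies from run to run).
const completionOrder = [2, 0, 1];
const units = ["a.o", "b.o", "c.o"];

// Naive: emit in completion order -> output differs between runs.
const naive = completionOrder.map((i) => units[i]);

// Deterministic: collect all results, then emit in fixed index order.
const results = new Map<number, string>();
for (const i of completionOrder) results.set(i, units[i]);
const deterministic = [0, 1, 2].map((i) => results.get(i)!);
// deterministic is ["a.o", "b.o", "c.o"] regardless of completionOrder
```

The point the Zig maintainers were making is that getting this kind of ordering (and the subtler shared-state cases) right everywhere is exactly the hard part of parallelizing a compiler.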

Bun Zig-to-Rust Rewrite: Anthropic Bets on AI Code | byteiota

Anthropic is rewriting Bun—the JavaScript runtime with 89,000 GitHub stars and 7 million monthly downloads—from Zig to Rust using Claude AI to auto-generate the port. A GitHub branch named claude/phase-a-port shows 1,799 files changed across 43 commits, and the comparison is so large GitHub can’t even render it. This isn’t a refactor. It’s a complete language migration driven by AI, and it’s happening right now. (as a test)


Home ⚡ Zig Programming Language

A Simple Language
Focus on debugging your application rather than debugging your programming language knowledge.
  • No hidden control flow.
  • No hidden memory allocations.
  • No preprocessor, no macros.

 ⚡ Comptime

A fresh approach to metaprogramming based on compile-time code execution and lazy evaluation.
  • Call any function at compile-time.
  • Manipulate types as values without runtime overhead.
  • Comptime emulates the target architecture.

 ⚡ Maintain it with Zig

Incrementally improve your C/C++/Zig codebase.
  • Use Zig as a zero-dependency, drop-in C/C++ compiler that supports cross-compilation out-of-the-box.
  • Leverage zig build to create a consistent development environment across all platforms.
  • Add a Zig compilation unit to C/C++ projects, exposing the rich standard library to your C/C++ code.

3D web: Babylon.js, Three.js and related tools

Three.js – JavaScript 3D Library

AxiomeCG/awesome-threejs: 3️⃣ A curated list of awesome ThreeJS resources


Announcing Babylon.js 9.0: Microsoft's popular rendering engine for building interactive, 3D web experiences now has a node-based particle editor, volumetric lighting, advanced Gaussian splatting, and more.


The choice between Three.js and Babylon.js depends on whether you need a flexible rendering toolkit or a comprehensive game engine.

Core Comparison
  • Nature: Three.js is a lightweight 3D library primarily used for rendering and creative visuals. Babylon.js is a full-featured 3D engine designed for games and complex interactive applications.
  • Philosophy: Three.js focuses on minimalism and "unopinionated" flexibility, requiring you to manually assemble features like physics or UI. Babylon.js is "batteries-included," providing built-in systems for physics, animation, and GUIs.
Key Strengths & Differences
  • Ecosystem & Community: Three.js has a massive community (over 5 million weekly downloads) and is the industry standard for artistic web experiences. Its React Three Fiber wrapper is widely considered the best way to use 3D in React.
  • Feature Set: Babylon.js includes high-level tools like a Node Material Editor and a robust Inspector for real-time debugging.
  • Stability: Babylon.js prioritizes backward compatibility, making it ideal for long-term enterprise projects. Three.js updates more rapidly, which can occasionally lead to breaking changes.
  • Performance: Both are highly performant. Three.js is lighter (approx. 168kB) for simple scenes. Babylon.js (approx. 1.4MB) offers better stability in large, complex scenes with thousands of objects.
Best Use Cases
  • Choose Three.js for:
    • Immersive marketing landing pages and brand experiences.
    • Artistic visualizations and creative coding projects.
    • Lightweight applications where minimal bundle size is critical.
  • Choose Babylon.js for:
    • Web-based 3D games (supports Havok Physics).
    • Complex enterprise applications like product configurators.
    • Advanced VR/AR (WebXR) experiences with "out-of-the-box" controller support.

dgreenheck/ez-tree: Procedural tree generator written with JavaScript and Three.js

EZ-Tree is a procedural tree generator with dozens of tunable parameters. The standalone tree generation code is published as a library and can be imported into your own application for dynamically generating trees on demand. Additionally, there is a standalone web app which allows you to create trees within the browser and export as .PNG or .GLB files.

Friday, May 08, 2026

SpaceX, xAI, $55B => Terafab

Interesting that this is separate from Tesla, while Tesla may be a major customer for those chips.

And a big puzzle: how can a huge, very high-tech hardware factory cost less than a software startup?

 Apple’s AI Strategy, SpaceX Chips, & AI Drugmaking

SpaceX Plans $55 Billion Investment to Make A.I. Chips (Paywall)

Rather than building at typical industry scale, SpaceX's proposed Terafab complex in Grimes County, Texas would start at $55B and potentially reach $119B — multiples of the $10–30B price tag for a typical modern fab — while SpaceX seeks tax breaks at a county hearing next month. The filing lands weeks before SpaceX's expected June IPO and slots into a wider Musk AI build-out: SpaceX absorbed xAI earlier this year at a combined $1.25T valuation, announced a $60B deal for AI coding startup Cursor last month, and just routed Colossus 1's full compute stack to Anthropic.



web dev: TanStack Start + AI = replace React :)

TanStack Start: A Client First Web Framework - Tanner Linsley - YouTube

Showcasing TanStack Start, its architecture decisions and tradeoffs, 
and what it can offer developers who want to build full-stack web applications in 2026



Issue #473: TanStack's experimental React clone — React Status

Projecting React — Tanner Linsley (he of TanStack) spent a day prompting an AI agent to regenerate React’s public API as a ~9KB runtime scoped to TanStack Start, and quietly shipped it on his blog and tanstack.com, where it runs at 2–3× the speed of stock React. But the most interesting part is why he isn’t properly releasing it (though it is on npm if you're curious).

Tanner Linsley

💡 It's also noteworthy that Tanner didn't find Preact to be a good, lighter 'drop-in' replacement for React, due to its slow drift away from React's behavior over the years.
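For intuition about what "regenerating React's public API as a small runtime" means, here is a toy sketch (entirely invented, far simpler than Tanner's actual ~9KB runtime): the familiar createElement signature plus a minimal string renderer that understands function components.

```typescript
// Toy sketch of a minimal React-like API: createElement + string renderer.
// Hypothetical illustration only; not TanStack's actual runtime.

type Props = Record<string, unknown> & { children?: VNode[] };
type VNode = string | { tag: string | Component; props: Props };
type Component = (props: Props) => VNode;

function createElement(
  tag: string | Component,
  props: Record<string, unknown> | null,
  ...children: VNode[]
): VNode {
  return { tag, props: { ...(props ?? {}), children } };
}

// Render a VNode tree to an HTML string (attributes elided for brevity).
function renderToString(node: VNode): string {
  if (typeof node === "string") return node;
  if (typeof node.tag === "function") {
    // Function component: call it and render its output.
    return renderToString(node.tag(node.props));
  }
  const inner = (node.props.children ?? []).map(renderToString).join("");
  return `<${node.tag}>${inner}</${node.tag}>`;
}

// Usage: a function component, same shape as in React.
const Greeting: Component = (props) =>
  createElement("h1", null, `Hello, ${String(props.name)}`);

const html = renderToString(
  createElement("div", null, createElement(Greeting, { name: "Bun" }))
);
// html === "<div><h1>Hello, Bun</h1></div>"
```

The real interest in Tanner's experiment is everything this sketch omits: hooks, reconciliation, and scheduling, which is where the 2–3× speedup claim would have to be earned.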

Thursday, May 07, 2026

AnthropicAI @ xAI @ SpaceX

 xAI on X: "SpaceXAI will provide @AnthropicAI with access to Colossus 1, one of the world’s largest and fastest-deployed AI supercomputers, to provide additional capacity for Claude → https://t.co/nfDR9S822L https://t.co/EQAz0S84m2" / X

SpaceXAI signed a deal Wednesday giving Anthropic access to Colossus 1 — its Memphis supercomputer with over 220,000 NVIDIA GPUs and more than 300 megawatts of capacity.


Elson on X: "@xai @AnthropicAI Super excited about this, Claude has been my favorite model and now with Space X just makes it so much sweeter! https://t.co/VIT84pmQmW" / X



Reading between the lines - YouTube by MaxS




IR for JavaScript - MLIR

 [RFC] JSIR: A High-Level IR for JavaScript - MLIR - LLVM Discussion Forums

This RFC introduces JSIR, a high-level IR for JavaScript:

  • JSIR preserves all information from the AST and supports high-fidelity round-trip between source ↔ AST ↔ JSIR;
  • JSIR uses MLIR regions to represent control flow structures;
  • JSIR supports dataflow analysis.

JSIR is developed and deployed in production at Google for code analysis and transform use cases.

JSIR is open source here: GitHub - google/jsir: Next-generation JavaScript analysis tooling · GitHub

Industry trend of building high-level language-specific IRs

The compiler industry is moving towards building high-level language-specific IRs. For example, the Rust and Swift compilers perform certain analyses on their high-level IRs before lowering down to LLVM. There are also a number of ongoing projects in this direction, such as ClangIR, Mojo, and Carbon.

The need for a high-level JavaScript IR

Why do we need a high-level IR for JavaScript specifically? While much of JavaScript tooling relies on ASTs (like ESTree), complex analyses require a control flow graph (CFG) and dataflow analysis capabilities, which JSIR provides by using the MLIR framework.


[2024 LLVM DevMtg] JSIR - Adversarial JavaScript Analysis with MLIR PDF

This RFC (Request for Comments) introduces JSIR, a high-level Intermediate Representation (IR) for JavaScript developed by Google and built on the MLIR framework.

What is JSIR?

JSIR is designed to bridge the gap between abstract syntax trees (ASTs) and low-level IRs. Unlike typical IRs that lose source-level information during lowering, JSIR is "reversible," meaning it supports a lossless round-trip between:

Source ↔ AST ↔ JSIR
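The "reversible" idea can be illustrated with a toy model (invented types, not Google's actual JSIR): a structured op like an if keeps its branches as nested regions rather than flat basic blocks, so a printer can reconstruct the original source shape losslessly.

```typescript
// Toy model of a region-based IR. Invented structures for illustration;
// the real JSIR is built on MLIR and is far richer.

type Op =
  | { kind: "assign"; target: string; value: string }
  | { kind: "if"; cond: string; then: Region; else: Region };
type Region = Op[]; // a region is a nested list of ops

// Print the IR back to source: structured ops reprint as structured code,
// which is what a flat CFG cannot do without reconstruction heuristics.
function toSource(region: Region, indent = ""): string {
  return region
    .map((op) => {
      if (op.kind === "assign") return `${indent}${op.target} = ${op.value};`;
      return (
        `${indent}if (${op.cond}) {\n` +
        toSource(op.then, indent + "  ") +
        `\n${indent}} else {\n` +
        toSource(op.else, indent + "  ") +
        `\n${indent}}`
      );
    })
    .join("\n");
}

// "Lifted" IR for: if (x > 0) { y = 1; } else { y = 2; }
const ir: Region = [
  {
    kind: "if",
    cond: "x > 0",
    then: [{ kind: "assign", target: "y", value: "1" }],
    else: [{ kind: "assign", target: "y", value: "2" }],
  },
];

const src = toSource(ir);
```

Here src reprints as the original if/else statement, which is the round-trip property in miniature.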

Key Features

  • High-Fidelity Round-tripping: It preserves enough information to lift the IR back into valid JavaScript source code, achieving a 99.9%+ success rate in internal Google evaluations.

  • MLIR-Powered Analysis: It uses MLIR regions to represent JavaScript control flow structures (like if, while, and logical expressions) as nested blocks rather than flat graphs.

  • Enhanced Dataflow API: It provides a simplified wrapper over MLIR’s dataflow analysis, making it easier for developers to define lattices and transfer functions without manually managing worklists.
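To show what "define lattices and transfer functions" means in practice, here is a toy constant-propagation pass over straight-line code (invented API, far simpler than MLIR's framework, which also handles joins over a CFG and worklists):

```typescript
// Toy constant propagation. A variable's lattice value is either a known
// constant or "top" (unknown); variables absent from the map are simply
// unanalyzed. Invented structures for illustration; not MLIR's actual API.

type Lattice = { kind: "top" } | { kind: "const"; value: number };
type Env = Map<string, Lattice>;

type Op =
  | { kind: "const"; dest: string; value: number }
  | { kind: "add"; dest: string; lhs: string; rhs: string }
  | { kind: "read"; dest: string }; // external input -> top

// Transfer function: given the state before an op, compute the state after.
function transfer(env: Env, op: Op): Env {
  const out = new Map(env);
  if (op.kind === "const") {
    out.set(op.dest, { kind: "const", value: op.value });
  } else if (op.kind === "add") {
    const a = env.get(op.lhs), b = env.get(op.rhs);
    out.set(
      op.dest,
      a?.kind === "const" && b?.kind === "const"
        ? { kind: "const", value: a.value + b.value }
        : { kind: "top" }
    );
  } else {
    out.set(op.dest, { kind: "top" });
  }
  return out;
}

// Straight-line program: x = 2; y = 3; z = x + y; w = read(); v = z + w
const program: Op[] = [
  { kind: "const", dest: "x", value: 2 },
  { kind: "const", dest: "y", value: 3 },
  { kind: "add", dest: "z", lhs: "x", rhs: "y" },
  { kind: "read", dest: "w" },
  { kind: "add", dest: "v", lhs: "z", rhs: "w" },
];

// Fold the transfer function over the program.
const finalEnv = program.reduce(transfer, new Map<string, Lattice>());
// z is the known constant 5; v is top (depends on unknown input)
```

A deobfuscator built on this idea would then rewrite uses of z to the literal 5, which is one of the transformations the JSIR authors describe combining with LLM-based cleanup.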

Primary Use Cases

Google currently uses JSIR in production for:

  • Decompilation: Lifting Hermes bytecode back into readable JavaScript.

  • Deobfuscation: Transforming obfuscated code into a clearer format, sometimes in combination with LLMs like Gemini.

  • Code Transformation: Performing complex refactoring or optimizations that require both dataflow insights and the ability to output source code.

Current Status & Future

  • Open Source: The project is hosted on GitHub.

  • Community Integration: The authors are exploring upstreaming JSIR to the LLVM/MLIR project, though they note practical challenges regarding dependencies like QuickJS (for constant folding) and Babel/SWC (for parsing).