Wednesday, April 22, 2026

AI Literate Programming => Markdown?!

Literate Programming was a good and influential idea: capture the "intent" of code as prose,
and then manage code and text together;

but it never got much traction, largely because the tooling and management were difficult

Now that much new code development is being rapidly automated with AI agents,
markdown is becoming a primary "source artifact", and much of the code is "downstream" from it.

Trouble is that markdown and code files are often not connected, or even cross-referenced.

And that can be changed with big benefit.

If we start using markdown for code comments,
and adjust tools to support it (which should be an "easy" change),
we could actually achieve the key objectives of...

Literate programming - Wikipedia

Literate programming (LP) is a programming paradigm introduced in 1984 by Donald Knuth in which a computer program is given as an explanation of how it works in a natural language, such as English, interspersed (embedded) with snippets of macros and traditional source code, from which compilable source code can be generated.[1] The approach is used in scientific computing and in data science routinely for reproducible research and open access purposes.[2] Literate programming tools are used by millions of programmers today.[3]
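Knuth's WEB split the work into "tangle" (extract compilable code) and "weave" (typeset the document). With markdown as the source artifact, the tangle step becomes almost trivial. A minimal sketch in Python, with made-up names, just to show the idea:

```python
import re

FENCE = "`" * 3  # a markdown code fence, built up to avoid nesting fences here

def tangle(markdown_text, lang="python"):
    """Extract fenced code blocks of the given language from markdown.

    A toy version of literate programming's "tangle" step: the markdown
    document is the source of truth, and the code is generated from it.
    """
    pattern = re.compile(FENCE + lang + r"\n(.*?)" + FENCE, re.DOTALL)
    return "\n".join(m.group(1) for m in pattern.finditer(markdown_text))

# a tiny literate "document": prose explaining intent, plus the code itself
doc = "\n".join([
    "# Greeting module",
    "",
    "The function below greets a user by name.",
    "",
    FENCE + "python",
    "def greet(name):",
    "    return 'Hello, ' + name + '!'",
    FENCE,
])

namespace = {}
exec(tangle(doc), namespace)      # run the extracted code
print(namespace["greet"]("Ada"))  # -> Hello, Ada!
```

Real tooling would also need the "weave" direction (keeping the rendered document and the deployed code in sync), but the point is that markdown already carries both prose and code in one artifact.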

AI Overview


Donald Knuth is most famous for creating the TeX typesetting system and the METAFONT font design system, but he also developed several specialized programming languages and paradigms:
  • WEB and CWEB: These are systems for Literate Programming, a methodology Knuth invented to allow programmers to write code interspersed with natural language explanations.
    • WEB was originally designed to work with Pascal.
    • CWEB is a later version designed for C, C++, and Java.
  • MIX and MMIX: These are low-level instruction set architectures (ISAs) and their associated assembly languages.
    • MIX was used in early volumes of his landmark series, The Art of Computer Programming.
    • MMIX is a more modern, 64-bit RISC architecture that replaced MIX in newer editions.
  • SOL: A symbolic language for general-purpose systems simulation.
  • Literate Programming: More than a single language, this is a paradigm where the source code is treated as a work of literature, meant to be read by humans as much as executed by machines.
Knuth also has a deep history with existing languages; he was an early expert in ALGOL 60 and FORTRAN, and his early work on the TeX system was originally written in SAIL (Stanford Artificial Intelligence Language) before being converted to Pascal using his literate programming tools.

Tuesday, April 21, 2026

Anthropic prep 4 IPO...

circular or not, deals are deals, and the demand and performance are real... 

 Amazon and Anthropic expand strategic collaboration

  • Anthropic to secure up to 5 gigawatts (GW) of current and future generations of Amazon’s Trainium chips to train and power their advanced AI models.
  • Anthropic’s Claude Platform available on AWS, providing their full AI developer experience in one place.
  • Amazon to invest $5 billion in Anthropic today and up to an additional $20 billion in the future.



Google DeepMind has assembled a “strike team” of engineers and researchers to improve its AI coding models, prompted in part by advances from Anthropic. Internal comparisons suggest Anthropic’s tools outperform Google’s Gemini in code generation, driving urgency among senior leaders including co-founder Sergey Brin.
Currently, AI generates about 50% of Google's code, versus Anthropic's reported near-total automation.


Nvidia to max AI usage

 Jensen Huang says Nvidia engineers should use AI tokens worth half their annual salary every year to be fully productive — compares not using AI to using paper and pencil for designing chips | Tom's Hardware

Nvidia CEO Jensen Huang said that he’ll be deeply alarmed if an engineer getting paid $500,000 a year does not consume at least $250,000 worth of AI tokens to get their job done. The leather-clad chief of the world’s most valuable AI company said this during an episode of the All-In Podcast shot on the last day of Nvidia GTC 2026.

Huang went on to compare an employee not using AI tokens to a chip designer saying that they will use paper and pencil and eschew CAD tools to get their work done.


Jensen Huang: Nvidia's Future, Physical AI, Rise of the Agent, Inference Explosion, AI PR Crisis - YouTube


Build to last: the 10,000 Year Clock

a good philosophy: think and act long term

also interesting design challenge, both for nature and engineering

fascinating

How do things last? Part 2: Millennia with Alexander Rose – David Eagleman

What is a 10,000 year clock? What is the Y10k bug? What allows some organizations to last a millennium? What do ancient ceramics have to do with ball bearings in satellites? What does any of this have to do with bristlecone pine trees, cymbals, or an extant hotel that launched in the sixth century? Join today for thinking about ourselves on a 10,000 year timescale with guest Alexander Rose.

The Long Now Foundation

The 10000 Year Clock 
(also known as the Clock of the Long Now)

The Rosetta Project

Alexander (Zander) Rose

Key Takeaways:
  • The 10,000-Year Clock: A project designed to endure for 10 millennia, highlighting the need to look beyond short-term election cycles (5:55-6:02). It is powered by temperature changes and requires human interaction to wind (13:12).
  • Digital Dark Age: Modern digital data is surprisingly fragile (4:40). Hard drives fail and formats become obsolete faster than stone or paper (42:30).
  • Long-Lasting Organizations: Successful long-term institutions share traits like flexibility, storytelling, and being right-sized rather than focused on exponential growth (30:35-39:10).
  • Examples: The oldest hotel in Japan (started in 718) and the ghats in Varanasi (managed for over 3,000 years) (23:42-24:42).
  • Intentional Stewardship: We have a duty to consciously decide what knowledge, myths, and technologies to pass on to future generations (53:40).

How do things last? Part 1: Neurons to Civilizations – David Eagleman

What makes things last, and what do very different lasting things have in common? Why might a space alien not be able to understand music? Why do windows in medieval cathedrals look thicker at the bottom, and what does this reveal about the world’s religions? What was the most important weapon in ancient history, and how did it disappear? Join today for the story of persistence, from sharks to schizophrenia to Roman concrete to DNA.

Seymour LM, Maragh J, Sabatini P, Di Tommaso M, Weaver JC, Masic A. Hot mixing: Mechanistic insights into the durability of ancient Roman concrete. Science Advances. 2023 Jan 6;9(1):eadd1602.

Monday, April 20, 2026

Nvidia AI 3DGS: Lyra 2.0: Explorable Generative 3D Worlds

3D Gaussian Splatting (3DGS) is a cutting-edge 3D reconstruction and rendering technique that converts 2D images or video into highly detailed, photorealistic 3D scenes. Unlike traditional mesh-based methods, 3DGS uses millions of tiny 3D Gaussians (spheres/points) optimized via machine learning to represent scenes, enabling real-time, high-fidelity rendering for VR/AR and 3D modeling.

https://en.wikipedia.org/wiki/Gaussian_splatting
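The core rendering idea, alpha-compositing each Gaussian's falloff-weighted color from front to back, can be sketched for a single pixel. This is a toy, isotropic 2D version with made-up values; real 3DGS projects anisotropic 3D covariances to screen space and optimizes millions of Gaussians on the GPU.

```python
import math

def splat_pixel(gaussians, px, py):
    """Toy front-to-back alpha compositing of isotropic 2D Gaussians.

    Each gaussian is (cx, cy, sigma, opacity, color), assumed already
    projected to screen space and sorted by depth (nearest first);
    color is a single grayscale channel to keep the sketch short.
    """
    color, transmittance = 0.0, 1.0
    for cx, cy, sigma, opacity, g_color in gaussians:
        d2 = (px - cx) ** 2 + (py - cy) ** 2
        alpha = opacity * math.exp(-d2 / (2 * sigma ** 2))  # Gaussian falloff
        color += transmittance * alpha * g_color            # composite front to back
        transmittance *= 1.0 - alpha                        # light left for those behind
    return color

# one bright gaussian centered on the pixel, one dimmer, wider one behind it
scene = [(0.0, 0.0, 1.0, 0.8, 1.0),
         (0.0, 0.0, 2.0, 0.5, 0.5)]
print(splat_pixel(scene, 0.0, 0.0))  # about 0.85
```

The differentiability of this weighted sum is what lets the Gaussians' positions, sizes, and colors be optimized by gradient descent against input photos.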


Lyra 2.0: Explorable Generative 3D Worlds

Method overview. (Left) Given an input image, Lyra 2.0 iteratively generates video segments guided by a user‑defined camera trajectory from an interactive 3D explorer and an optional text prompt, lifting each segment into 3D point clouds fed back for continued navigation. Generated video frames are finally reconstructed and exported as 3D Gaussians or meshes. (Right) At each step, history frames with maximal visibility of the target views are retrieved from the spatial memory. Their canonical coordinates are warped to establish dense 3D correspondences and injected into DiT via attention, together with compressed temporal history.

AI + Semantic Web in 2026? SHACL vs OWL

While initiated by the same person, WWW creator Tim Berners-Lee,
the Semantic Web has never gotten the same attention and following as the original "web".

Maybe that is because it never had a Mosaic or Netscape, the browsers that made the WWW what it is: extremely popular.

That is not to say that the Semantic Web is not good or useful.
It is just not for the "mainstream", since it is complicated.

But wait, isn't an AI LLM also super complicated, yet still extremely popular?

Yes, and that is exactly the answer:
LLMs got their "Netscape" moment with OpenAI's ChatGPT:
an easy-to-use interface, and a related API.

Anyway, the Semantic Web still exists and keeps getting better, while not popular.

A not-so-new addition (2017) to the stack, called SHACL, has an additional handicap: an awkward name.
The previous one at least tried to be clever/cute: "OWL".

This is a good place to read about this "semantic" progress.

Kurt Cagle | Substack

How SHACL Makes Your LLMs Hum - by Kurt Cagle

Shapes Constraint Language (SHACL) @w3.org

SHACL - Wikipedia

Shapes Constraint Language[1] (SHACL) is a World Wide Web Consortium (W3C) standard language for describing Resource Description Framework (RDF) graphs. SHACL has been designed to enhance the semantic and technical interoperability layers of ontologies expressed as RDF graphs.


AI Overview

SHACL (Shapes Constraint Language) is a W3C standard language used to validate and describe RDF graphs by enforcing structural rules (shapes) on data. It ensures RDF data conforms to required formats (e.g., specific datatypes, cardinalities), acting as a validation schema. SHACL is designed to work directly with RDF and uses SPARQL for complex validations.
What is SHACL?
  • Shapes Graph: Defines constraints using node shapes (about the node) and property shapes (about values connected to the node).
  • Data Validation: It checks a "data graph" against a "shapes graph" to ensure compliance.
  • Capabilities: It ensures data quality, validates RDF against structural requirements, and can define constraints such as mandatory fields, data types (e.g., xsd:string), or valid value ranges.
Relationship to RDF
  • Native RDF Integration: SHACL shapes themselves are expressed in RDF, usually via the Turtle format.
  • Validates Data Graphs: SHACL operates directly on RDF triples (graphs), validating subjects, predicates, and objects.
  • Class/Instance Validation: It often targets RDF instances of specific classes within a dataset.
Relationship to SPARQL
  • Backend Engine: SHACL-SPARQL is an extension mechanism where validation constraints are defined as SPARQL queries.
  • Complex Rules: While core SHACL handles basic validation, SPARQL is used for complex cross-property or complex structural validation rules.
  • Query Transformation: A SHACL processor can transform shape definitions into SPARQL queries to validate data.
SHACL vs. RDF Schema/OWL
  • RDF Schema (RDFS) and OWL are used for inferencing (deriving new knowledge), while SHACL is used for validation (checking if data is right).
SHACL provides a standard way to validate that RDF data matches the intended structure and content constraints.
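To make the data-graph vs. shapes-graph split concrete, here is a toy validator in plain Python over triples. This is not real SHACL (real shapes are themselves RDF, typically written in Turtle, with terms like sh:NodeShape and sh:minCount, and validated by engines such as pySHACL); the names below are made up for illustration.

```python
# Toy illustration of SHACL's idea: check a "data graph" (triples)
# against a "shapes graph" (constraints on nodes of a target class).

data_graph = [
    ("ex:alice", "rdf:type", "ex:Person"),
    ("ex:alice", "ex:name", "Alice"),
    ("ex:bob",   "rdf:type", "ex:Person"),  # missing ex:name -> violation
]

# shape: every ex:Person must have at least one ex:name (like sh:minCount 1)
person_shape = {"targetClass": "ex:Person", "path": "ex:name", "minCount": 1}

def validate(data, shape):
    """Return a list of (node, message) violations, like a SHACL validation report."""
    targets = [s for s, p, o in data
               if p == "rdf:type" and o == shape["targetClass"]]
    violations = []
    for node in targets:
        values = [o for s, p, o in data if s == node and p == shape["path"]]
        if len(values) < shape["minCount"]:
            violations.append((node, f"needs >= {shape['minCount']} {shape['path']}"))
    return violations

print(validate(data_graph, person_shape))  # [('ex:bob', 'needs >= 1 ex:name')]
```

The OWL/RDFS contrast shows up here too: an OWL reasoner would happily *infer* around missing data, while a validator like this simply reports that ex:bob does not conform.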

Sunday, April 19, 2026

AI dev tool: Mastering Claude Code

 Mastering Claude Code by Andreas Horn @ LinkedIn

Icon: concrete house for $99K?

 3D-Printed Homes for $99K: ICON’s Jason Ballard on the future of housing | E2277 - YouTube
This Week in Startups - YouTube

Key details regarding the ICON $99k home initiative:
  • Goal: To produce 3D-printed homes with construction costs, including printing and finishing, of less than $99,000.
  • Technology: Utilizing advanced, faster 3D printing systems such as the "Phoenix" printer, which has a 70-foot reach, and CarbonX7/CarbonX99 concrete.
  • Affordability: The initiative aims to reduce the cost of building, with some 3D wall systems starting around $25 per square foot, and fully printed/finished wall systems at approximately $80 per square foot.
  • Purpose: These homes are designed to meet International Building Code (IBC) standards to provide accessible housing for individuals, couples, and small families.
  • Location: The project is part of a broader push to make 3D-printed homes a mainstream, affordable option, with developments already underway in areas like Community First! Village in Texas.



600sqft?


ICON is now taking orders for projects using Phoenix starting at $25/square foot for wall systems or $80/square foot including foundation and roof. This cost to build is lower than the most recent publicly available data for conventional construction of wall systems*. This wall system cost would represent a savings of up to $25,000 for the average American home versus conventional construction.

ICON: Architecture for Humanity and Building Technology

ICON (@ICON3DTech) / X

Jason Ballard (@JasonDBallard) / X

SEBY (@Sebyverse) / X

RESI The Real Estate Oracle
Institutional Grade Property Pricing API