Saturday, May 02, 2026

Local AI PCs

 The Cheapest 4TB DGX Spark Alternative… ASUS GX10 - YouTube


The Most Powerful APU on Earth - YouTube

The $2,000-cheaper Beelink GTR9 Pro is put through LLM benchmarks against the Mac Studio M3 Ultra and other AMD Ryzen AI Max+ 395 (Strix Halo) boxes.

Amazon.com: Beelink Mini PC, GTR9 Pro AMD Ryzen AI Max+ 395 CPU (126 Tops), 128GB RAM 2TB Crucial SSD, Mini Computer 10GbE Dual LAN/WiFi 7+BT5.4/8K Quad Display/USB4.0 * 2/SD Card Slot/DeepSeek 70B : Electronics
$2999


Cute, but powerful: meet NanoCluster, a tiny supercomputer - YouTube

AI Agent Skills

An AI agent skill is a portable, reusable bundle of instructions, scripts, and context (a skill.md file) that extends an agent's capabilities to perform specific tasks.

These skills act as "modular tools" that provide domain expertise, ensure consistent workflows, and reduce token usage by being loaded only when needed.


 Agent Skills Overview - Agent Skills.io

A standardized way to give AI agents new capabilities and expertise.

Agent Skills are a lightweight, open format for extending AI agent capabilities with specialized knowledge and workflows. At its core, a skill is a folder containing a SKILL.md file. This file includes metadata (name and description, at minimum) and instructions that tell an agent how to perform a specific task. Skills can also bundle scripts, reference materials, templates, and other resources.

my-skill/
├── SKILL.md          # Required: metadata + instructions
├── scripts/          # Optional: executable code
├── references/       # Optional: documentation
├── assets/           # Optional: templates, resources
└── ...               # Any additional files or directories
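
For illustration, a minimal SKILL.md might look like the sketch below. The skill name, the frontmatter style, and the instructions are all hypothetical; the format only requires name and description metadata plus the instructions themselves.

---
name: commit-helper
description: Writes conventional commit messages from a staged git diff.
---

# Commit Helper

When asked to write a commit message:
1. Run `git diff --staged` and read the changes.
2. Summarize them as a conventional commit: type(scope): short summary.
3. Keep the subject line under 72 characters; put details in the body.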

Agent Skills – Codex | OpenAI Developers



code.claude.com/docs/en/skills.md


You're likely missing out on agent skills' true potential! - YouTube
by Maximilian Schwarzmüller


Agent Skills 101: Never Explain Twice Again... - YouTube
by NeuralNine


Agent Skills: Standard for Smarter AI | by Plaban Nayak | Medium



Friday, May 01, 2026

AI model distillation: cross-learning

AI models can be trained from other AI models.
That makes learning much more efficient (cheaper), and can yield more compact models.
It is a legal, moral, and business "gray zone".
Initial AI training is done on public data, protected or not, free or not.
Without open-source projects, AI would not have been able to learn to code.
Why not "give back" in one way or another?
China has different objectives, and can subsidize AI...
It is a messy and complicated period, unlikely to get clearer any time soon...

Knowledge distillation - Wikipedia

In machine learning, knowledge distillation or model distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have more knowledge capacity than small models, this capacity might not be fully utilized. It can be just as computationally expensive to evaluate a model even if it utilizes little of its knowledge capacity. Knowledge distillation transfers knowledge from a large model to a smaller one without loss of validity. As smaller models are less expensive to evaluate, they can be deployed on less powerful hardware (such as a mobile device).

Model distillation is not to be confused with model compression, which describes methods to decrease the size of a large model itself, without training a new model. Model compression generally preserves the architecture and the nominal parameter count of the model, while decreasing the bits-per-parameter.
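
As a concrete sketch of the idea: in the classic recipe (Hinton et al., 2015), the student is trained on a mix of the true labels and the teacher's temperature-softened output distribution. Below is a minimal, illustrative TypeScript version on plain arrays; the temperature T and the mixing weight alpha are arbitrary example values, not prescribed ones.

// Softmax with temperature T; higher T spreads probability mass more evenly.
function softmax(logits: number[], T: number): number[] {
  const max = Math.max(...logits); // subtract max for numerical stability
  const exps = logits.map(z => Math.exp((z - max) / T));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}

// Cross-entropy of the student's (unsoftened) prediction vs. the true label.
function crossEntropy(studentLogits: number[], label: number): number {
  return -Math.log(softmax(studentLogits, 1)[label]);
}

// KL(teacher || student) on temperature-softened distributions.
function softKL(teacherLogits: number[], studentLogits: number[], T: number): number {
  const p = softmax(teacherLogits, T);
  const q = softmax(studentLogits, T);
  return p.reduce((acc, pi, i) => acc + pi * Math.log(pi / q[i]), 0);
}

// Combined objective: alpha * T^2 * KL(soft targets) + (1 - alpha) * CE(hard labels).
// The T^2 factor keeps soft-target gradients on the same scale as the hard-label term.
function distillationLoss(teacherLogits: number[], studentLogits: number[],
                          label: number, T = 2.0, alpha = 0.5): number {
  return alpha * T * T * softKL(teacherLogits, studentLogits, T)
       + (1 - alpha) * crossEntropy(studentLogits, label);
}

The student minimizes this loss over the training set; the teacher's soft probabilities carry the relative similarities between wrong classes ("dark knowledge") that hard labels throw away.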

 

 Elon Musk testifies that xAI trained Grok on OpenAI models | TechCrunch

OpenAI and Anthropic have been on the warpath lately against third-party efforts to train new AI models by prompting their publicly accessible chatbots and APIs, a process known as “distillation.”

That conversation has focused on Chinese firms using distillation to create open-weight models that are nearly as capable as U.S. offerings, but available at a much lower cost. However, tech workers have widely assumed that American labs use these techniques on each other to avoid falling behind competitors.

Now we know it’s true in at least one case: On the stand in a California federal court on Thursday, Elon Musk was asked if xAI has used distillation techniques on OpenAI models to train Grok, and he asserted it was a general practice among AI companies. Asked if that meant “yes,” he said, “Partly.”


OpenAI Misses Targets, Codex vs Claude, Elon vs Sam Trial, Big Hyperscaler Beats, Peptide Craze - YouTube

 

LaTeX.js

TeX output looks beautiful...

and can even be rendered directly on the web

Pandoc - Demos

Pandoc - index
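
For converting a LaTeX source file to a standalone web page, a typical Pandoc invocation looks like this (file names are placeholders; --mathjax makes the math render in the browser):

pandoc article.tex -f latex -t html5 --standalone --mathjax -o article.html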


LaTeX.js

JavaScript LaTeX to HTML5 translator

 LaTeX.js Live Playground | LaTeX.js
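
A minimal in-browser sketch, following the usage pattern shown in the LaTeX.js docs; the latexjs global comes from the CDN bundle, and the exact method names may vary between versions:

// Assumes the page loaded the bundle first, e.g.
// <script src="https://cdn.jsdelivr.net/npm/latex.js/dist/latex.js"></script>
declare const latexjs: any; // provided by the CDN bundle

const source = "Hello \\LaTeX! And some math: $e^{i\\pi} + 1 = 0$.";

let generator = new latexjs.HtmlGenerator({ hyphenate: false });
generator = latexjs.parse(source, { generator });

// stylesAndScripts() returns the CSS/JS the output needs; domFragment() is the rendered HTML.
document.head.appendChild(generator.stylesAndScripts("https://cdn.jsdelivr.net/npm/latex.js/dist/"));
document.body.appendChild(generator.domFragment());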

quicklatex.com

Learn LaTeX in 30 minutes - Overleaf, Online LaTeX Editor