Stop Overhyping TOON — JSON is still good
(also, stop with the copy-paste!)
Francesco Martino
I’ve seen a few posts claiming 60%+ LLM cost savings by switching from JSON to formats like TOON. The reality: this is not where your AI costs are coming from.
If you minify your JSON or use simple list/CSV-style structures, you get nearly identical compression. There is no need for a new serialization language; my impression is that this is just a new trend.
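As a quick sanity check, here is a minimal Python sketch of the minification point (the payload is invented for illustration): the same data, serialized pretty-printed versus with all whitespace stripped.

```python
import json

# Hypothetical payload, pretty-printed the way many teams paste it into prompts.
payload = {
    "users": [
        {"id": 1, "name": "Ada", "role": "admin"},
        {"id": 2, "name": "Linus", "role": "dev"},
    ]
}

# Pretty-printed: every nesting level adds indentation and newlines.
pretty = json.dumps(payload, indent=2)

# Minified: separators=(",", ":") removes the default spaces entirely.
minified = json.dumps(payload, separators=(",", ":"))

print(len(pretty), len(minified))  # minified is substantially shorter
```

Fewer characters generally means fewer tokens, and the minified string parses back to the identical object, so nothing downstream has to change.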
Teams overspend because they send huge prompts, repeat instructions, include irrelevant data, and default to large models when smaller ones work fine. Fix these first — the savings are far bigger than switching formats.
JSON is the standard: you can reuse your existing data directly, and no extra conversion work is necessary.
To be clear, I don't want to be extreme: TOON isn't bad, and in some cases it may even be better than JSON, but it is not what people claim it to be. Maybe model results are better with it, who knows.
In the image I show the difference between JSON and TOON, and indeed it seems to show a 50% reduction in tokens. Stringify (minify) the same JSON, however, and TOON's reduction over it drops to 0%.
If we increase the size of our JSON, we get different results:
A standard JSON payload of 871 characters can come out at 259 tokens. TOON? 154 tokens, about a 40% reduction. Stringified JSON? 136 tokens, an even bigger reduction!
- Token calculation with OpenAI Tokenizer
- JSON to TOON: toonifyit
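The CSV-style alternative mentioned above can be sketched with nothing but the standard library (the records here are hypothetical): state the keys once in a header row, then emit only values, which is a similar idea to what compact formats like TOON exploit for uniform arrays.

```python
import csv
import io
import json

# Hypothetical records; any uniform array of flat objects works the same way.
records = [
    {"id": 1, "name": "Ada", "role": "admin"},
    {"id": 2, "name": "Linus", "role": "dev"},
]

# Minified JSON repeats every key in every object.
as_json = json.dumps(records, separators=(",", ":"))

# CSV-style: keys appear once in the header, then only values follow.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=records[0].keys())
writer.writeheader()
writer.writerows(records)
as_csv = buf.getvalue()

print(len(as_json), len(as_csv))  # CSV form is noticeably shorter
```

The savings grow with the number of rows, since the per-row key repetition is what dominates minified JSON — no new serialization language required.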
Want to reduce LLM cost? Send less data. Cache smartly. Use smaller models.

