> then pointed AI at it and had it implement code until every test passed.
You used to have two problems. Now you have three.
As long as you are using JSON, there will always be room to optimize.
Did you know that you can encode any number up to about 2 billion in a fixed 4 bytes instead of a variable-length decimal string of up to 10 digits? Also, fun fact, you can roughly halve your packets by not repeating field names in every packet: use a positional scheme instead, where a field's position determines its meaning and type.
And you can do all of this with pre AI technology!
Neat trick huh?
These "solutions" place a lot of faith in a "complete" set of test cases. I'm not saying don't do this, but I'd feel more comfortable doing this plus hand-writing a bunch of property tests, and only then generating code until everything passes. Even better, maybe Claude can generate some or most of the property tests by reading the standard test suite.
These articles remind me so much of those old internet debates about "teleportation" and consciousness.
Your physical form is destructively read into data, sent via radio signal, and reconstructed on the other end. Is it still you? Did you teleport, or did you die in the fancy paper shredder/fax machine?
If vibe code is never fully reviewed and edited, then it's not "alive" and effectively zombie code?
I mean, great, but which CTO gave the green light to such a weird architectural choice? Sorry for the rant!
Congrats to the team. Unfortunately many comments here are missing the big picture by attacking the previous architectural decisions with no context about why they were taken. It's always easy to say so in retrospect.
Also, I have to comment on the many commenters who spent time researching existing Go implementations just to question everything, because "AI bad". I don't know how much enterprise experience the average HN commenter has these days, but it's not usually easy to simply swap a library in a production system like that, especially when the replacement lib is outdated and unmaintained (which is the case here). I remember a couple of times I was tasked with migrating a core library in a production system only to see everything fall apart in unexpected ways the moment it touched real data. Anyway, the case here seems to be even simpler: the existing Go libs, apart from being unmaintained and obscure, don't support current features of JSONata 2.x, which gnata does. Period.
The article should have anticipated such criticism and explained this in more detail, so that's my feedback to the authors. But congrats anyway, this is one of the best use cases for current AI coding agents.
The real lesson is that JSONata should have been written in C so anyone could link to it and keep the parser resident in memory, avoiding $300k in vCPU costs spent on marshalling & RPC.
Think of the gigawatts wasted on this nonsense.
$500k for JSON files LOL OK
The most baffling thing here is that they allowed a very, very simple JSON expression language to become a $500k/year cost burden on their business.
My god. But I am happy that they finally realised their error and put it right.