Hacker News

cogman10 · yesterday at 3:54 PM · 4 replies

Interesting but, IMO, probably one of the worst uses of JSON. The data you would want to consume is already not "human readable", so the format introduces a lot of bloat for essentially no benefit.

If you have a significant number of data points to track, this is going to eat a ton of memory while also being pretty slow to encode/decode.

Imagine, for example, if we encoded this as binary: the first 2 bytes for the feature type, the next 2 bytes for the geometry type, 3 bytes for a fixed-point x, 3 bytes for a fixed-point y, and optionally the properties as a JSON blob in a trailing string. That's 10 bytes for all the coordinate stuff: fewer bytes than the `"type": "Feature"` string alone currently takes.
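A minimal sketch of that layout in Python, with assumed details the comment doesn't pin down (big-endian uint16 fields, and a scale factor chosen so 360 degrees of range fits in 24 bits of fixed point):

```python
import json
import struct

def encode_point(feature_type: int, geometry_type: int,
                 lon: float, lat: float, properties: dict) -> bytes:
    """Pack one point feature into the proposed 10-byte layout,
    with an optional trailing JSON properties blob."""

    def fixed3(deg: float) -> bytes:
        # 3-byte fixed point: shift degrees into a non-negative range and
        # scale so 360 degrees fits in 24 bits (46603 * 360 < 2**24).
        # The scale factor is an assumption, not part of the proposal.
        return int((deg + 180.0) * 46603).to_bytes(3, "big")

    header = struct.pack(">HH", feature_type, geometry_type)  # 2 + 2 bytes
    coords = fixed3(lon) + fixed3(lat)                        # 3 + 3 bytes
    blob = json.dumps(properties).encode() if properties else b""
    return header + coords + blob

encoded = encode_point(1, 1, -122.4194, 37.7749, {"name": "SF"})
```

The fixed header is exactly 10 bytes; everything past offset 10 is the optional properties blob.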


Replies

doginasuit · yesterday at 4:06 PM

Do you mean geocoordinates when you say not human readable? Those are obviously at the heart of geospatial information, but there is quite a bit more to the spec that does benefit from being human readable, and I'd include longitude/latitude among them. There are also solutions like CBOR, which allow the same data to be encoded, transferred, and decoded in binary. For performance-critical data you can also use something like Protobuf, but it would be a huge pain to handle everything that way. JSON is a great choice as a general spec.

morganherlocker · yesterday at 8:07 PM

> If you have a non-insignificant amount of data points to track this is going to eat just a ton of memory while also being pretty slow to encode/decode.

This is a fair critique; however, for any large GeoJSON, the coordinate arrays will dominate the size. I think it's also safe to assume this data will be gzipped at rest and over the wire, which will eliminate most of the "header" metadata size you mention. As you point out, it would be much more efficient to have a binary format, and there are good examples like these, which are ~2-3x smaller in benchmarks:
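The gzip point is easy to check with the standard library. A sketch on synthetic data (the feature count and random coordinates are assumptions for illustration): every feature repeats the same `"type": "Feature"` / `"type": "Point"` scaffolding, which compresses almost entirely away, leaving the coordinate digits as the bulk of the compressed size.

```python
import gzip
import json
import random

random.seed(0)

# A FeatureCollection of point features with heavily repeated metadata.
features = [
    {
        "type": "Feature",
        "geometry": {
            "type": "Point",
            "coordinates": [
                round(random.uniform(-180, 180), 6),
                round(random.uniform(-90, 90), 6),
            ],
        },
        "properties": {},
    }
    for _ in range(10_000)
]

raw = json.dumps({"type": "FeatureCollection", "features": features}).encode()
packed = gzip.compress(raw)
print(f"raw: {len(raw)} bytes, gzipped: {len(packed)} bytes")
```

The compressed output is a small fraction of the raw text, consistent with the claim that compression absorbs most of the per-feature boilerplate.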

https://flatgeobuf.org/
https://github.com/mapbox/geobuf

That said, I think GeoJSON should be compared against other human readable formats like KML, which has a lot of wasted space as well, while being more difficult to read/write.

dinkumthinkum · today at 5:20 AM

This is just pretty wrong. Sure, GeoJSON can be bloated, but it is not for "no benefit." It is a very popular format and it is easy to encode and decode, even if it is slow for large data. It is more for sharing than long-term storage. Take a site like the one below; it is very convenient to render JSON this way.

https://geojson.io