Jsoncrush #MatchURL : No. 1 Premium URL Shortener

04.Nov.2021

The JSONCrush project has been created to provide a simple and powerful tool that can be used to compress your

JSON documents before you store them in a repository, thus reducing the size of the stored data. This is especially important when dealing with mobile applications where every byte counts!

The need for this tool arose from an internal sanitization process that loads large amounts of JSON from various sources. We found out that our mobile apps were using a lot more bandwidth than expected because of all the round trips they were making between the backend and their respective repositories while parsing those JSON documents.

The solution was to create a simple command line utility that crunches down whatever data it is given, providing us with much leaner datasets.

One of the main goals of the project was to reduce the footprint as much as possible, yet we were reluctant to switch to any format other than our beloved JSON. We wanted to retain all the information that was there initially, so sticking to JSON was a must!

Even though you can often compress your data down to around 20% of its original size, that does not always translate into a smaller physical footprint; documents tend to grow due to overhead (headers added by the compression format, plus escaping when the result is embedded back into text), and most JSON parsers are not optimized for this use case.

JSONCrush works by parsing your document, building an internal representation of its data structure, and then serializing it back to a compact string; at this point you can compress the resulting string if you wish. Once everything is completed, you receive a new JSON document with all the information still there, just packed in a more efficient manner.

One thing we had to take into consideration was that even though our documents were quite clean upon arrival (we sanitize them before storage), their sheer size made compression errors more likely, with stray bytes ending up where they should not. As such, we decided to use the deflate algorithm when compressing our documents in order to minimize these problems.

Recommendations

Ensure that your JSON files are not too large before using this tool, otherwise it might be quite slow. If your average document is around 500KB you should be fine; however, a very large file might cause memory issues that the application was not designed to handle.

You can use the --help flag to get more information about how to use this application, but I think it's quite self-explanatory! Have fun crunching some JSONs!

Some notes on the use of this tool: the current implementation only works with UTF-8 encoded documents. If you feed it garbage it might blow up, so use it at your own risk ;P

If you want to check out our commercial offerings, please visit https://www.matchurl.com/ — or if you wish to post a job opening for an experienced NodeJS developer, type 'webapp' in our search bar (no quotes) and we will contact you ASAP! We are always looking for new talent!
