The performance of compression programs is extremely data-dependent, so any general claim will be false for some test cases.
I do a lot of data compression and decompression, and I would have liked very much to find a magic algorithm that beats all others, as it would simplify my work.
After extensive testing I have found no such algorithm. Depending on the input files and on the desired trade-off between compression ratio and execution time, I must use various algorithms, including zstd and xz, but also bzip2, bzip3 and even gzip.
I quite frequently use gzip (run after lrzip preprocessing) for some very large files, where it provides better compression at a given execution time, or faster execution at a given compression ratio, than zstd with any options.
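For illustration, a minimal sketch of that kind of pipeline: run lrzip's long-range rzip stage without its backend compressor, then entropy-code the result with gzip. The file name and the fabricated input are placeholders, not my actual data, and the script falls back to plain gzip if lrzip is not installed.

```shell
#!/bin/sh
# Hypothetical sketch of an lrzip + gzip pipeline; file names are made up.
set -e
f=sample.dat
# Fabricate a highly redundant input so long-range matching pays off.
head -c 65536 /dev/zero > chunk
cat chunk chunk chunk chunk > "$f"
rm chunk
if command -v lrzip >/dev/null 2>&1; then
    lrzip -n -f -o "$f.lrz" "$f"   # -n: rzip preprocessing only, no backend compressor
    gzip -9 -f -k "$f.lrz"         # entropy-code the rzip output
    out="$f.lrz.gz"
else
    gzip -9 -f -k "$f"             # fallback when lrzip is absent
    out="$f.gz"
fi
ls -l "$out"
```

On highly redundant data like this, the rzip stage collapses the long-range repeats before gzip ever sees them, which is exactly where gzip's small window would otherwise hurt.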
Of course, for other kinds of files zstd wins, but the claims that zstd should be used for ALL applications are simply wrong.
Whenever you must compress or decompress frequently, test all available algorithms with various parameters to determine what works best for you. For big files, something like lrzip preprocessing should also be tested, as it can greatly change the performance of a compression algorithm.
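That testing can be as simple as a loop over whichever compressors are installed. This is only a sketch: the file name is a placeholder (substitute one of your own representative files), the random stand-in input is nothing like real workload data, and in practice you would also measure compression and decompression times, not just output size.

```shell
#!/bin/sh
# Illustrative benchmark loop over commonly installed compressors.
# "corpus.bin" is a placeholder name; replace it with real workload data.
set -e
f=corpus.bin
[ -s "$f" ] || head -c 262144 /dev/urandom > "$f"   # stand-in input only
for c in "gzip -9" "bzip2 -9" "xz -6" "zstd -19"; do
    cmd=${c%% *}
    command -v "$cmd" >/dev/null 2>&1 || continue    # skip missing tools
    sz=$($c -c "$f" | wc -c)                         # compressed size to stdout
    printf '%-10s %10d bytes\n' "$c" "$sz"
done > results.txt
cat results.txt
```

Wrapping each invocation in `time` (and adding a decompression pass) would complete the picture, since ratio alone rarely decides which tool to use.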