Are there any things in Linux that need to be started over from scratch?
I'm curious how software is created and evolves over time. I'm afraid that at some point we'll realize there are issues with the software we're using that can only be remedied by massive changes or a complete rewrite.
Are there any instances of this happening? Where something is designed with a flaw that doesn't get realized until much later, necessitating scrapping the whole thing and starting from scratch?
I admit I haven't done a great deal of research, so there may be problems I'm not aware of, but I've found that lzip tends to compress better than xz/lzma and, to paraphrase its manual, is designed as a drop-in replacement for gzip and bzip2. It's been around since at least 2009, judging by its copyright messages.
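If you want to check the compression claim on your own data rather than take my word for it, here's a minimal sketch of a comparison script. It assumes both the `xz` and `lzip` command-line tools are installed and on your PATH, and that the file being tested fits on disk twice; it's a quick ratio check, not a proper benchmark.

```python
#!/usr/bin/env python3
"""Rough comparison of xz vs lzip compression ratios on one file."""
import os
import subprocess
import sys

def compressed_size(tool: str, suffix: str, path: str) -> int:
    # -9 = maximum compression, -k = keep the original, -f = overwrite old output
    subprocess.run([tool, "-9", "-k", "-f", path], check=True)
    out = path + suffix
    size = os.path.getsize(out)
    os.remove(out)  # clean up the compressed copy
    return size

if __name__ == "__main__":
    path = sys.argv[1]
    original = os.path.getsize(path)
    for tool, suffix in (("xz", ".xz"), ("lzip", ".lz")):
        size = compressed_size(tool, suffix, path)
        print(f"{tool:5s}: {size} bytes ({size / original:.1%} of original)")
```

Run it as `./compare.py somefile` and compare the two percentages; results will vary a lot with the kind of data you feed it.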
That said, xz is going to receive a lot of scrutiny from now on, so maybe it doesn't need replacing. Likewise, anything else that allows random binary blobs into its source repository will come under the same sort of scrutiny: is that data really random? Could it be generated from non-obfuscated plain-text source code instead? And so on.