I think the graph title explains it pretty well, but here's an example from it: if you have a task you do 5 times a day, and you can optimize it to save 30 seconds per run, then you can spend up to 3 days on that optimization before you've spent more time than you'll save (over the course of 5 years).
This is pretty helpful in the context of something like automation: you can figure out how long you can afford to spend automating a task before it costs more time than just doing the task manually every time.
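Here's a quick sketch of the arithmetic, just to sanity-check the example (the function and variable names are mine, and it assumes a flat 365-day year):

```python
def optimization_budget(times_per_day, seconds_saved, years=5):
    """Total seconds saved over the horizon, i.e. the most time
    you can spend optimizing before it's a net loss."""
    days = years * 365
    return times_per_day * seconds_saved * days

# The example from above: 5 runs/day, 30 seconds saved per run
saved = optimization_budget(times_per_day=5, seconds_saved=30)
print(saved / 3600)         # ~76 hours
print(saved / (3600 * 24))  # ~3.2 days, roughly the "3 days" above
```

Note that the ~3 days here are 24-hour days; if you count only working hours, the budget in calendar days is quite a bit bigger.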