Ask HN: How many CPU / GPU cycles are wasted unnecessarily?

For years I have been asking myself: how much code out there actually wastes CPU / GPU cycles, assuming optimization flags are set to their highest level? Does that kind of waste still exist once the flags are on, or not? I'm really curious.
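
To make the question concrete, here is a minimal C sketch of the kind of waste I have in mind (the program, names, and sizes are just illustrative, not a benchmark): even built with gcc -O3, the quadratic bubble sort below stays quadratic, so the extra cycles it burns compared to a decent O(n log n) sort are waste that no flag can recover.

    /* waste.c -- illustrative only: -O3 keeps the O(n^2) bubble sort
       an O(n^2) bubble sort; the wasted cycles are algorithmic,
       not something a compiler flag can remove.
       Build: gcc -O3 waste.c -o waste */
    #include <stdio.h>
    #include <stdlib.h>

    static void bubble_sort(int *v, size_t n) {
        for (size_t i = 0; i + 1 < n; i++)
            for (size_t j = 0; j + 1 < n - i; j++)
                if (v[j] > v[j + 1]) {
                    int tmp = v[j];
                    v[j] = v[j + 1];
                    v[j + 1] = tmp;
                }
    }

    int main(void) {
        enum { N = 50000 };               /* arbitrary size for the sketch */
        int *v = malloc(N * sizeof *v);
        if (!v) return 1;
        for (size_t i = 0; i < N; i++)
            v[i] = rand();
        bubble_sort(v, N);                /* ~N^2 comparisons vs ~N log N for qsort */
        printf("%d %d\n", v[0], v[N - 1]);
        free(v);
        return 0;
    }

That is the sort of thing I wonder about at scale: how much running code is spending cycles like this, regardless of how it was compiled?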