I agree with this statement. The fastest system that doesn't do its job is worse than a slow system that does. Landon Noll gave a keynote at XP2019 and talked about his multiple prime number accomplishments. One of the things he pointed out is that the results had to be unquestionable. Enough error checking and validation had to be in place during processing to catch (and, if possible, correct) any small mistake, or else the result wasn't going to be accepted, even if it was correct.
A system that produces /a/ result fast is a terrible system if we cannot trust that result.
A system that is not efficient is a system that can be improved. A system that is not reliable is one that needs to be replaced.
An aspect of the largest prime searches: how do you know it's given back the correct answer? How much work needs to go into validating it? And what if it's invalid, and you run it again and get a different number... That system can't be trusted; just stop trying to use it.
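To make the validation question concrete: for the Mersenne primes that dominate the largest-prime records, there is an independent, deterministic check anyone can rerun, the Lucas-Lehmer test. A minimal sketch (this is the standard published algorithm, not anything specific to Noll's setup):

```python
def lucas_lehmer(p: int) -> bool:
    """Return True iff the Mersenne number 2**p - 1 is prime.

    Assumes p itself is prime (a composite exponent always yields
    a composite Mersenne number).
    """
    if p == 2:
        return True  # 2**2 - 1 = 3 is prime
    m = (1 << p) - 1          # the candidate 2**p - 1
    s = 4                     # Lucas-Lehmer seed
    for _ in range(p - 2):    # iterate s -> s^2 - 2 (mod m)
        s = (s * s - 2) % m
    return s == 0             # prime exactly when the residue is 0

# A claimed discovery can be double-checked independently:
print(lucas_lehmer(7))   # 2**7  - 1 = 127 is prime
print(lucas_lehmer(11))  # 2**11 - 1 = 2047 = 23 * 89, composite
```

This is exactly the kind of revalidation the keynote point implies: the search system's answer only counts once an independent rerun of a test like this agrees.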
We're back to 50% of the principles I agree with.
Poor efficiency can be isolated and improved; poor reliability is hard to isolate and improve, and can even be hard to discover in the first place.