Originally posted by joneschr
But I disagree that it's the common case. I personally find the more common case is that programs perform like dogs because they lack simplicity, not complexity.
In my experience, optimizing normally consists of the following steps:
1. Implement caching where possible - extra complexity (a quick sketch of this follows the list).
2. Implement compression where possible - extra complexity.
3. Use various coding methods that reduce the number of instructions executed, such as inlining methods or hand-tuning the source code - extra complexity.
4. Customize the code for the specific hardware - extra complexity.
5. Implement improved algorithms, such as better sorting methods - extra complexity.
6. Instead of using the code/HTML produced by the various tools, edit it directly to remove all the bloat the tools produce - extra complexity.
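To make point 1 concrete, here's a minimal sketch in Python (the names and the sleep-based stand-in for a slow call are mine, purely illustrative) of how even the simplest cache drags in new concerns - cache size, eviction policy, staleness - that the plain version never had:

```python
import functools
import time

# The simple version: one obvious call, trivially correct.
def expensive_lookup(key):
    time.sleep(0.01)          # stand-in for a slow database/network call
    return key.upper()

# The "optimized" version: faster on repeat lookups, but now we own a
# maxsize choice, an eviction policy, and the question of stale entries.
@functools.lru_cache(maxsize=1024)
def cached_lookup(key):
    time.sleep(0.01)
    return key.upper()

if __name__ == "__main__":
    start = time.perf_counter()
    for _ in range(100):
        expensive_lookup("linux")
    print("uncached:", round(time.perf_counter() - start, 3), "s")

    start = time.perf_counter()
    for _ in range(100):
        cached_lookup("linux")
    print("cached:  ", round(time.perf_counter() - start, 3), "s")
```

The cached version wins on speed, but the extra complexity lives on in every future change that has to ask whether the cache is still valid.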
Remember that the extra complexity I am referring to may not always be in the actual code but in the maintainability of the code.
Originally posted by joneschr
Behold the lowly bubble sort which outperforms on small data sets.
And that's simpler than what? I have never implemented bubble sort in preference to another method. If I ever did, it would probably be adding complexity because I would be coding it myself instead of using a built-in utility method.
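For illustration, here's roughly what that trade-off looks like (a Python sketch, not anyone's actual code): a hand-rolled bubble sort next to the built-in sorted(). Even if the bubble sort is competitive on a handful of elements, it's code I now have to test and maintain, while sorted() costs me nothing:

```python
def bubble_sort(items):
    """Hand-rolled bubble sort: extra code to own, test, and maintain."""
    items = list(items)                      # don't mutate the caller's list
    n = len(items)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:                      # already sorted, stop early
            break
    return items

if __name__ == "__main__":
    data = [5, 1, 4, 2, 8]
    print(bubble_sort(data))   # [1, 2, 4, 5, 8]
    print(sorted(data))        # same result, zero code of mine to maintain
```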
Originally posted by joneschr
Or consider also the case of the Linux kernel, which Linus himself acknowledges suffers from performance problems due to bloat:
http://www.theregister.co.uk/2009/09/22/linus_torvalds_linux_bloated_huge/
That article doesn't indicate that simplifying the kernel would be the solution; it's things like assembly-level optimization that would have the most impact. There may be code duplication that could be cleaned up, but that is likely to affect size more than performance.