Yesterday, a good friend of mine and former coworker contacted me to share his frustration.
(he hates to be called “Polino”, so I won’t.. doh!)
He finished a software solution for a customer, and now an expert is reviewing his Java code.
The expert code reviewer insists on small performance optimizations, but he is way off target: he wants to micro-optimize, and to do it blindly.
For example, he reported that the following code was doing “inefficient String concatenations”:
String myString = "Some text here "+ "Some text there "+ "Some more... ";
And that this was an “inefficient way of creating Strings”.
Examples like this are optimized away by modern Java compilers, which concatenate string literals at compile time. But even if they weren’t, they would hardly affect the performance of the system as a whole.
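You can verify the compile-time folding yourself: constant string expressions are collapsed into a single interned literal, so the concatenated value is the *same object* as the equivalent literal. A minimal sketch (class name is mine):

```java
public class ConstantFolding {
    public static void main(String[] args) {
        // javac folds this constant expression into one interned literal;
        // no concatenation happens at runtime at all.
        String myString = "Some text here " + "Some text there " + "Some more... ";
        String literal = "Some text here Some text there Some more... ";

        // Identity comparison, not just equality: both names point to the
        // same interned constant in the string pool.
        System.out.println(myString == literal); // prints "true"
    }
}
```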
This expert may not know that the Pareto principle often applies to software optimization: roughly 80% of the resources (including execution time) are consumed by about 20% of the code.
You need to find the sections of code that have the most impact on the overall performance of the system.
One could argue that the examples above are not on that list (unless the code is wrapped in several nested loops). But in any case, you cannot base your optimization efforts on arguments and intuition alone: you need real data.
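Loops are, in fact, where string concatenation can genuinely become a hot spot: repeated `+=` on a `String` copies the whole accumulated text on every iteration, which is quadratic, while a `StringBuilder` appends in amortized constant time. A minimal sketch of the contrast (class and method names are mine):

```java
public class LoopConcat {
    // O(n^2): each iteration allocates a new String and copies
    // everything accumulated so far. This is the kind of code a
    // profiler would actually flag on large inputs.
    static String withPlus(String[] parts) {
        String result = "";
        for (String p : parts) {
            result += p;
        }
        return result;
    }

    // Amortized O(n): appends into one growable buffer,
    // with a single final copy in toString().
    static String withBuilder(String[] parts) {
        StringBuilder sb = new StringBuilder();
        for (String p : parts) {
            sb.append(p);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String[] parts = {"Some text here ", "Some text there ", "Some more... "};
        // Both produce the same result; only the cost differs.
        System.out.println(withPlus(parts).equals(withBuilder(parts))); // prints "true"
    }
}
```

Whether the difference matters still depends on how hot that loop is, which is exactly why you measure first.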
Tools exist to find that 20% of the code: Profilers. A good profiler can present very useful information about memory consumption, execution times of each function/method, blocking threads, deadlocks, and more.
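Real profilers (VisualVM and Java Flight Recorder are two commonly used ones) give far richer data, but even a crude timing harness illustrates the principle of measuring before optimizing. A minimal sketch (names are mine):

```java
public class CrudeTimer {
    // Runs a task and returns elapsed wall-clock time in milliseconds.
    // A profiler gives per-method times, allocation data, and thread
    // states; this only shows the basic "measure, don't guess" idea.
    static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 100_000; i++) {
                sb.append(i);
            }
        });
        System.out.println("took " + elapsed + " ms");
    }
}
```

Note that one-off timings like this are noisy (JIT warm-up, GC pauses); for serious measurements a harness such as JMH, or a real profiler, is the right tool.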
So, don’t do blind optimizations, and leave micro-optimizations for last. Collect real data and then concentrate your efforts where it really matters.
And, even more important, define upfront what metrics you will use to measure performance, and agree on what results will be acceptable for the system. Otherwise, you will never know when to stop optimizing.
“Premature optimization is the root of all evil” – C.A.R. Hoare (inventor of the Quicksort algorithm)