Micro and Blind Optimizations

Posted by F 03/11/2005 at 22h16

Yesterday, a good friend of mine and ex-coworker contacted me to share his frustration.

(he hates to be called “Polino”, so I won’t.. doh!)

He finished a software solution for a customer, and now an expert is reviewing his Java code.

The expert code reviewer insists on small performance optimizations, but he is way off target. He wants to micro-optimize, and to do it blindly.

For example, he reported that the following code was doing “inefficient String concatenations”:

String myString = "Some text here "+
                  "Some text there "+
                  "Some more... ";

And that this was an “inefficient way of creating Longs”:

myList.add(new Long(1));

These examples are probably optimized away by modern Java compilers. But even if they weren't, they would hardly affect the performance of the system as a whole.
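Both claims are easy to check from within Java itself. A minimal sketch (the class name is mine; the behavior of constant folding and the boxing cache follows the Java Language Specification):

```java
public class MicroOptDemo {
    public static void main(String[] args) {
        // The concatenation of string literals is a compile-time constant
        // expression: the compiler folds it into a single interned String,
        // so it is the very same object as the equivalent literal.
        String concatenated = "Some text here " + "Some text there ";
        String literal = "Some text here Some text there ";
        System.out.println(concatenated == literal);  // true

        // Long.valueOf caches instances for values in -128..127, so it
        // avoids the allocation that `new Long(1)` always performs.
        Long a = Long.valueOf(1L);
        Long b = Long.valueOf(1L);
        System.out.println(a == b);                      // true: cached instance
        System.out.println(new Long(1) == new Long(1));  // false: two allocations
    }
}
```

So the "inefficient concatenation" costs nothing at runtime, and the boxing fix saves at most one small allocation.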

This expert may not know that the Pareto principle often applies to software optimization problems: about 80% of the resources (including execution time) are consumed by about 20% of the code.

You need to find those sections of code that have the most impact on the overall performance of the system.

One could argue that the two examples above are not in that top list (unless the code is wrapped in several nested loops). But, in any case, you cannot base your optimization efforts on arguments and intuition alone: you need real data.
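The parenthetical about nested loops is precisely where concatenation genuinely can matter: repeated `+=` on a String inside a loop copies the whole accumulated string each iteration, which is quadratic overall. A minimal sketch contrasting it with a single StringBuilder (the class name and iteration count are mine):

```java
public class LoopConcat {
    public static void main(String[] args) {
        final int n = 20_000;

        // Repeated '+=' creates a fresh String (and a full copy of the
        // accumulated text) on every iteration -- O(n^2) work overall.
        long t0 = System.nanoTime();
        String slow = "";
        for (int i = 0; i < n; i++) {
            slow += "x";
        }
        long slowNanos = System.nanoTime() - t0;

        // One StringBuilder grows its buffer amortized -- O(n) overall.
        t0 = System.nanoTime();
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append("x");
        }
        String fast = sb.toString();
        long fastNanos = System.nanoTime() - t0;

        System.out.println(slow.equals(fast));  // true: identical result
        System.out.printf("+= : %.1f ms, StringBuilder: %.1f ms%n",
                          slowNanos / 1e6, fastNanos / 1e6);
    }
}
```

This is the kind of finding a measurement would surface on its own, without a reviewer guessing at it.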

Tools exist to find that 20% of the code: Profilers. A good profiler can present very useful information about memory consumption, execution times of each function/method, blocking threads, deadlocks, and more.
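Even before reaching for a full profiler, a crude timing harness can give a first sanity check on a suspect code path. A sketch, assuming you only want rough numbers (the class name is mine; note that this times one hand-picked block, so unlike a profiler it cannot find the hot 20% for you, and JIT warm-up makes early iterations unrepresentative):

```java
public class CrudeTimer {
    public static void main(String[] args) {
        final int iterations = 1_000_000;
        StringBuilder sb = new StringBuilder();

        // Warm up the JIT before measuring.
        for (int i = 0; i < iterations; i++) {
            sb.setLength(0);
            sb.append("Some text here ").append(i);
        }

        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            sb.setLength(0);
            sb.append("Some text here ").append(i);
        }
        long elapsed = System.nanoTime() - start;

        System.out.printf("%d iterations in %.2f ms%n",
                          iterations, elapsed / 1_000_000.0);
    }
}
```

A real profiler attributes time and memory across the whole program, which is what you need to locate the hot spots in the first place.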

There are tons of profilers for the Java platform. And many are open-source, so you have no excuse: see here and there.

So, don’t do blind optimizations, and leave micro-optimizations for last. Collect real data and then concentrate your efforts where they really matter.

And, even more importantly, define upfront what metrics you will use to measure performance, and agree on which results will be acceptable for the system. Otherwise, you will never know when to stop optimizing.

Related:

“Premature optimization is the root of all evil” – C.A.R. Hoare (inventor of the QuickSort algorithm)


  1. juancn 20/03/2006 at 14h41
    The first example of string concatenation, according to the Java Language Specification, must be optimized. That is, there isn't any string concatenation going on there, since the expression is constant. Even if it weren't, Java uses a StringBuffer to implement the '+' operator. For example, doing the concatenation in two different statements would be less optimal than a single large statement. Anyway, I wholeheartedly agree with you that code optimization without quantitative measurements is a complete waste of resources.