; Date: Mon Aug 08 2005
Bill Pugh, the leader of the FindBugs project, visited Sun this week. He gave talks to various teams about the tool, showing its usefulness. One of those talks was to the management of my team, the Java Quality Engineering organization.
It's a really interesting tool, doing what's called static analysis. If you haven't read about it, the basic technique is to scan through the classfiles (not source code) looking for patterns that indicate bugs. For example, the tool identifies somewhere around 10 places in the JDK source where, if that code section is executed, the JDK will enter an infinite loop. Hurm.
Dr. Pugh's talk showed a great number of real bugs the tool catches. What he showed certainly caught the attention of my management team, as we are very interested in how we can apply it in testing Java.
Unfortunately the tool doesn't generate test cases; it simply shows potential problems. In my career I worked as a developer for 8+ years before joining the Java Quality team, and I know very well that developers' time is valuable. For many of the reports this tool produces, the developer is going to ask "is it worth my time to fix this?" and try to brush it off as something that only happens in rare situations.
I wanted to see for myself how useful the tool is, so I tried it on a project I'd written over the last year. The project has 157 classes, so it's not terribly big, and it's based on the Jemmy source code. FindBugs found around 120 "bugs" and gave a bug percentage of 75% (or so).
However, I have a hard time classifying most of the reported issues as bugs. For example, in several cases I called a method that returned an Object, which I stored in a local variable. I have some commented-out code which printed the Object, and otherwise the Object goes unused. FindBugs reported that as a potential problem, which it is, since sometimes you do mean to do something with such returned objects. But then what? Either I get the warning for not doing anything with the Object, or I don't store the returned value at all, in which case FindBugs is going to warn me about an ignored return value.
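To make that concrete, here's a made-up sketch of the sort of code that triggers the ignored-return-value warning (the names are mine, not from my actual project):

```java
public class IgnoredReturnDemo {
    public static void main(String[] args) {
        String s = "hello";

        // FindBugs-style warning: the return value is ignored.
        // Strings are immutable, so this call accomplishes nothing.
        s.toUpperCase();
        System.out.println(s);       // still prints "hello"

        // The version that actually keeps the returned object.
        String upper = s.toUpperCase();
        System.out.println(upper);   // prints "HELLO"
    }
}
```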
Since this is Java 5, there ought to be an annotation I can apply to make FindBugs stop warning about this.
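Something along these lines, say: a hypothetical annotation a classfile-scanning tool could look for. The annotation name here is invented, and FindBugs would have to be taught to honor it; this is just a sketch of the shape such a thing could take.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Invented annotation: CLASS retention keeps it in the classfile
// (where a classfile-scanning tool looks) with no runtime overhead.
@Retention(RetentionPolicy.CLASS)
@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD})
@interface SuppressBugWarning {
    String[] value() default {};   // e.g. codes of the warnings to silence
}

public class SuppressDemo {
    @SuppressBugWarning({"IGNORED_RETURN_VALUE"})
    static void probe(Object o) {
        String result = o.toString();   // kept around only for debugging
    }
}
```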
Another report flagged some instances of inner class usage where FindBugs recommended I use a static inner class. Hurm, I'm not clear on when it's good to use a static inner class. This points to a major feature request I'd like to see in FindBugs: for each warning type, there ought to be a bit of explanation about the warning, some examples that exemplify what's being warned about, and some examples of how to fix the problem. Maybe this warning about making inner classes static is important, but I don't know why it's important, and I'd love to learn more.
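From what I can gather, the usual argument goes like this: a non-static inner class keeps a hidden reference to its enclosing instance, even if it never uses it. A small sketch:

```java
public class Outer {
    // Non-static inner class: every instance carries a hidden
    // reference to an enclosing Outer, which can keep the Outer
    // alive longer than intended and costs an extra field.
    class Inner { }

    // Static nested class: no hidden reference, and it can be
    // created without any Outer instance at all.
    static class Nested { }

    public static void main(String[] args) {
        Outer outer = new Outer();
        Inner inner = outer.new Inner();   // needs an enclosing instance
        Nested nested = new Nested();      // does not
        System.out.println(inner.getClass() + " / " + nested.getClass());
    }
}
```

The hidden reference shows up in the compiled class: the compiler adds the enclosing instance as an extra constructor parameter to `Inner`, but not to `Nested`.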
Of the 120 or so problems, there were four instances where I felt there was a real problem that justified fixing. Three of them were failures to close a stream, and the fourth was a mistake in initializing a local variable in an inner class. That fourth issue would have led to some confusing behavior.
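For the stream issue, the classic failure mode is an exception thrown between open and close. A sketch of the buggy shape and its fix (the method and path names are illustrative, not from my project):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamDemo {
    // Leaky version: if read() throws, close() is never reached, and
    // the file descriptor is held until garbage collection gets to it.
    static int firstByteLeaky(String path) throws IOException {
        InputStream in = new FileInputStream(path);
        int b = in.read();
        in.close();
        return b;
    }

    // Fixed version: close() runs whether or not read() throws.
    static int firstByteSafe(String path) throws IOException {
        InputStream in = new FileInputStream(path);
        try {
            return in.read();
        } finally {
            in.close();
        }
    }
}
```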
I think FindBugs is pretty useful, despite the high ratio of nattering issues it reported. It's easy to filter out reported issues so you can focus on what you really want to look at. Plus, as you use it over time, FindBugs remembers the things it has warned about in the past and won't warn you about them again.
There is no need for test cases. Normally tools like FindBugs, PMD, Checkstyle, etc. are run as part of the continuous builds, e.g. after the unit tests, and their results are published within the build report.
BTW, Eclipse has a nice plugin for FindBugs that takes an existing project and, without further configuration, produces a report which you can sort, filter, prioritize, categorize, etc. Certain bug detectors can be switched off, and the plugin also has a nice view that provides some explanation about the reported issues.
Posted by: euxx on August 06, 2005 at 12:09 PM
You might want to look at other inspection tools as well. IntelliJ IDEA 5.0 supports over 500 inspections for common problems, performance issues, and style issues, and it provides most of the same inspections FindBugs and PMD perform. It supports the @SuppressWarnings annotation for preventing certain warnings for a particular method or class.
Posted by: keithkml on August 06, 2005 at 08:08 PM
From euxx: "There is no need for testcases."
Are you trying to say, "There is no need for JUnit, HTTPUnit, etc. unit test cases, as this product and products similar to it do such a great job that they render unit testing obsolete"? Clarify.
Posted by: phlogistic on August 07, 2005 at 08:13 AM
phlogistic, I think euxx was saying that you don't need unit tests in order to run FindBugs, because it can be run as part of the build process.
Posted by: keithkml on August 07, 2005 at 10:24 AM
Yes, that's what I understood euxx to be saying: that these static analysis tools make their own report and are independent of test cases.
However... let me explain where I was coming from with what I said in the posting.
I'm evaluating its use in the Java quality engineering team, not the Java development team. When we find a quality issue, we don't directly change the implementation to fix it; instead we report it as a bug. Hence, our job is to get the attention of the developers and get them to fix things.
Typically we do this with test cases and test suites. With a test case in hand we can say "this fails".
The output of FindBugs and the similar tools mentioned is a report of likely problems. We expect that, to get the developers' attention, we will need a test case that demonstrates each one is a real problem. Like I said in the posting, I know the developer's standpoint: developers want to work on significant problems rather than fix nattering issues that don't make any difference.
But this raises a question... suppose we find something in Java via the FindBugs report, but the issue is buried deep in sun.some.thing.SomeClass and it's not clear how to reach that line of code with the arguments that will cause failure? One way to demonstrate it's a real and serious problem is a test case that causes the offending line of code to be executed, and which shows a range of values over which there is failure.
What I'm thinking about is a call graph analysis that shows the linkages from a public API method to this internal line of code. If we then analyze the linkages we should be able to find the right values to pass that will tickle the particular line of code. Hence, a test case.
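As a toy sketch of the idea (the call graph here is hand-built and the method names are made up; a real tool would extract the edges by scanning the invoke instructions in the classfiles):

```java
import java.util.*;

public class CallGraphDemo {
    // Toy call graph: caller -> callees. A real tool would populate
    // this from the bytecode rather than by hand.
    static Map<String, List<String>> graph = new HashMap<>();

    // Breadth-first search: returns the call chain from 'from' down
    // to 'to', or null if no path exists.
    static List<String> path(String from, String to) {
        Map<String, String> parent = new HashMap<>();
        Queue<String> queue = new LinkedList<>();
        queue.add(from);
        parent.put(from, null);
        while (!queue.isEmpty()) {
            String m = queue.remove();
            if (m.equals(to)) {
                // Walk the parent links back up to reconstruct the chain.
                LinkedList<String> chain = new LinkedList<>();
                for (String cur = m; cur != null; cur = parent.get(cur))
                    chain.addFirst(cur);
                return chain;
            }
            for (String callee : graph.getOrDefault(m, Collections.emptyList()))
                if (!parent.containsKey(callee)) {
                    parent.put(callee, m);
                    queue.add(callee);
                }
        }
        return null;
    }

    public static void main(String[] args) {
        graph.put("api.PublicClass.doWork",
                  Arrays.asList("sun.some.thing.Helper.dispatch"));
        graph.put("sun.some.thing.Helper.dispatch",
                  Arrays.asList("sun.some.thing.SomeClass.buggyMethod"));
        System.out.println(
            path("api.PublicClass.doWork", "sun.some.thing.SomeClass.buggyMethod"));
    }
}
```

The chain alone isn't the test case, of course; the second half of the job, analyzing the linkages to find argument values that reach the flagged line, is the hard part.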
Posted by: robogeek on August 07, 2005 at 12:00 PM
"What I'm thinking about is a call graph analysis that shows the linkages from a public API method to this internal line of code. If we then analyze the linkages we should be able to find the right values to pass that will tickle the particular line of code. Hence, a test case."
This would be extremely helpful for maintaining applications. Not just "what was called"... we also need to know "how it was called".
Posted by: johnreynolds on August 08, 2005 at 06:12 AM
David, if you want to be that ridiculous, you can actually wrap FindBugs into JUnit test cases. That, plus disabling unneeded bug detectors, would give reproducible "failures" with reasonable severity, which you can use against those developers who don't believe in reports.
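For example, FindBugs can emit its report as XML; a sketch of a gate that counts high-priority BugInstance entries, which a JUnit test could assert is zero. This assumes a simplified report shape with BugInstance elements carrying a numeric priority attribute (1 being the most severe), not the exact FindBugs schema:

```java
import java.io.InputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class ReportGate {
    // Counts BugInstance elements at or above the given severity
    // (priority 1 is most severe, so "at or above" means <= threshold).
    static int highPriorityBugs(InputStream report, int threshold) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(report);
        NodeList bugs = doc.getElementsByTagName("BugInstance");
        int count = 0;
        for (int i = 0; i < bugs.getLength(); i++) {
            Element bug = (Element) bugs.item(i);
            if (Integer.parseInt(bug.getAttribute("priority")) <= threshold)
                count++;
        }
        return count;
    }
}
```

A JUnit test method would just run this against the report the build produced and fail if the count is nonzero.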
FindBugs is designed to search for common issues, ones so obviously wrong that they must be fixed (well, especially those with high severity, such as potential NPEs). FindBugs' data-flow analysis virtually executes the bytecode at a much lower level, so it is easier and more realistic than backtracking to public API calls (which is also doable, but will definitely require more effort).
Posted by: euxx on August 09, 2005 at 07:05 PM
"What I'm thinking about is a call graph analysis that shows the linkages from a public API method to this internal line of code..."
It's called IntelliJ IDEA. : ) Cursor on the method; Ctrl-Alt-H
Posted by: grlea on August 10, 2005 at 12:14 AM
FWIW, PMD also supports the @SuppressWarnings annotation. Good times...
Posted by: tcopeland on October 27, 2005 at 03:39 PM