
Eric Bruno

Dr. Dobb's Bloggers

Java Performance Tuning at the NYJavaSIG

November 02, 2009

In October, I attended one of the regular meetings of the New York Java Users Group (nyjavasig.com/). I've written about past meetings, and the great job Frank Greco from Kaazing (kaazing.com) does with the group. This time, Kirk Pepperdine was there to speak about Java performance tuning. Although he gave only a brief overview of what is otherwise an in-depth topic, the information he shared was quite valuable.

 

Kirk Pepperdine provides performance tuning services as part of his business activities at Kodewerk (kodewerk.com), so he has a lot of experience on the subject. The first point Kirk conveyed was how important it is to NOT dive into the code right away when tuning. Instead, he advocates using the right tools (mostly very simple ones, actually) to observe and measure the application's behavior while executing. 

 

For instance, the first step was to determine the dominating activities in a sample application he ran on his MacBook Pro. The top level activities he typically measures are:

 

  • Time spent in the OS
  • Time spent in the Java VM
  • Time spent executing application code

 

1) Determine the dominating consumer:

 

As an example, he noted that the application appeared to be CPU bound while running. (Note that if performance is poor and your application is not CPU bound, that can indicate contention for other resources, such as the network or the filesystem.) Looking deeper, he noticed that much of the time was spent executing system code (as shown by the Unix reporting of system versus user execution time), which indicates OS-level activity. The usual culprit here is application threads contending for shared locks and resources. 

 

Key learning: determine the dominating consumer by measuring user and system code execution times and percentages.
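Kirk relied on the OS's own reporting for this split; if you also want a rough per-thread user-versus-system breakdown from inside the JVM, the standard ThreadMXBean API provides it. Here is a minimal sketch (not something shown at the talk; the output format is mine):

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuTimeSnapshot {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        if (!threads.isThreadCpuTimeSupported()) {
            System.out.println("Per-thread CPU timing is not supported on this JVM.");
            return;
        }
        threads.setThreadCpuTimeEnabled(true);

        for (long id : threads.getAllThreadIds()) {
            long cpuNanos  = threads.getThreadCpuTime(id);   // user + system time, in nanoseconds
            long userNanos = threads.getThreadUserTime(id);  // user time only
            if (cpuNanos < 0 || userNanos < 0) {
                continue;  // thread has died, or the measurement is unavailable
            }
            long sysNanos = cpuNanos - userNanos;            // approximate time spent in the kernel
            System.out.printf("thread %d: user=%.1f ms, system=%.1f ms%n",
                    id, userNanos / 1e6, sysNanos / 1e6);
        }
    }
}

A consistently high system share for the worker threads points back at exactly the kind of OS-level activity (lock contention, I/O) Kirk described.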

 

2) Perform application and JVM monitoring:

 

Next, to determine where the contention lies, he performed a simple thread dump to see which threads were blocked versus waiting, and what they were blocked on. Going further, he recommends performing some light application monitoring. For instance, it's a good idea to turn on Java's garbage collection logging to measure that activity and its impact on your application. To do this, add the following parameters to your JVM command-line:

 

-Xloggc:gc.log -XX:+PrintGCDetails

 

The results from -XX:+PrintGCDetails (written to the gc.log file) look like this:

[GC [ParNew: 31936K->1964K(37248K), 0.0185084 secs] 31936K->1964K(518976K), 0.0187525 secs] [Times: user=0.04 sys=0.00, real=0.02 secs] 
[GC [ParNew: 33900K->5312K(37248K), 0.0380581 secs] 33900K->7249K(518976K), 0.0381451 secs] [Times: user=0.05 sys=0.01, real=0.04 secs] 
[GC [ParNew: 37248K->5312K(37248K), 0.0611352 secs] 39185K->13075K(518976K), 0.0612368 secs] [Times: user=0.09 sys=0.01, real=0.06 secs] 
[GC [ParNew: 37248K->5312K(37248K), 0.0815016 secs] 45011K->18915K(518976K), 0.0815791 secs] [Times: user=0.09 sys=0.01, real=0.08 secs] 
[GC [ParNew: 37248K->5312K(37248K), 0.0474382 secs] 50851K->24077K(518976K), 0.0475378 secs] [Times: user=0.08 sys=0.00, real=0.05 secs] 
[GC [ParNew: 37248K->5312K(37248K), 0.0522814 secs] 56013K->30506K(518976K), 0.0524781 secs] [Times: user=0.08 sys=0.01, real=0.05 secs] 
[GC [ParNew: 37248K->3821K(37248K), 0.0288528 secs] 62442K->32433K(518976K), 0.0289258 secs] [Times: user=0.05 sys=0.01, real=0.03 secs] 
[GC [ParNew: 35757K->3932K(37248K), 0.0114677 secs] 64369K->32544K(518976K), 0.0116085 secs] [Times: user=0.02 sys=0.01, real=0.01 secs] 
... 

 

In each entry, the number to the left of the "->" is the memory usage before the GC event, the number that follows is the usage after it, and the figure in parentheses is the capacity of that space. The first set of numbers covers the young generation (the eden and survivor spaces collected here by ParNew), while the second set covers the entire heap. Note that heap usage that keeps climbing after each collection, with no drop over time (as in the sample above, where surviving objects are steadily promoted to the old generation), could indicate a memory leak. This is typically caused by objects being added to collections, or held by other references, and never removed.
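Beyond reading gc.log by hand, the java.lang.management API exposes garbage collection and JIT compilation counters at runtime. A minimal sketch, again not something from the talk (the beans you get back depend on the collector and JVM in use):

import java.lang.management.CompilationMXBean;
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;

public class JvmActivitySnapshot {
    public static void main(String[] args) {
        // One bean per collector, e.g. the young- and old-generation collectors
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.printf("GC %s: %d collections, %d ms total%n",
                    gc.getName(), gc.getCollectionCount(), gc.getCollectionTime());
        }
        // Total time the JIT compiler has spent compiling, if the JVM reports it
        CompilationMXBean jit = ManagementFactory.getCompilationMXBean();
        if (jit != null && jit.isCompilationTimeMonitoringSupported()) {
            System.out.printf("JIT (%s): %d ms total compilation time%n",
                    jit.getName(), jit.getTotalCompilationTime());
        }
    }
}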

 

Key learning: know how to measure JVM activity, such as garbage collection and JIT compilation, to understand when the JVM is interfering with application execution.

 

3) Use stack traces and thread dumps to have the system and JVM tell you where to begin performance tuning.

 

Use tools such as

 

  • HPjmeter (http://www.javaperformancetuning.com/tools/hpjmeter/index.shtml)
  • VisualVM (http://java.sun.com/javase/6/docs/technotes/guides/visualvm/index.html)
  • VisualGC (http://java.sun.com/performance/jvmstat/visualgc.html)

 

to dive deep into the JVM's processing of your application. Better yet, Kirk showed how NetBeans has these tools built in, integrated with your project's code, to identify performance issues and the code contributing to them. 
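If you only need the raw thread dump that steps 2 and 3 rely on, jstack <pid> (or kill -3 on Unix, which makes the JVM print a dump to stdout) will produce one, and the same information is available programmatically through ThreadMXBean. A minimal sketch, not tied to any of the tools above:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class ProgrammaticThreadDump {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // true, true => also report locked monitors and ownable synchronizers
        for (ThreadInfo info : threads.dumpAllThreads(true, true)) {
            System.out.printf("\"%s\" state=%s%n",
                    info.getThreadName(), info.getThreadState());
            if (info.getLockName() != null) {
                // A BLOCKED thread shows the lock it wants and the thread holding it
                System.out.printf("    waiting on %s held by %s%n",
                        info.getLockName(), info.getLockOwnerName());
            }
        }
    }
}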

 

Key learning: perform code profiling at the end of the performance tuning process. Much can be learned from the OS reporting and other tools.

 

In the end, the sample application Kirk used to illustrate the process had two main issues: 

 

1) Excessive locking, which was identified by high system code execution and thread dumps, and was remedied by switching from a HashMap to a ConcurrentHashMap collection class (and removing the use of the "synchronized" keyword). 
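Kirk didn't publish the application's code, so the following is only a hypothetical before-and-after showing the shape of that change (the QuoteCache name and the String/Double types are mine):

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class QuoteCache {
    // Before: one lock serializes every reader and writer
    private final Map<String, Double> lockedCache = new HashMap<String, Double>();

    public synchronized void putLocked(String symbol, double price) {
        lockedCache.put(symbol, price);
    }

    public synchronized Double getLocked(String symbol) {
        return lockedCache.get(symbol);
    }

    // After: ConcurrentHashMap does its own fine-grained locking,
    // so the synchronized keyword (and the contention it caused) disappears
    private final ConcurrentMap<String, Double> concurrentCache =
            new ConcurrentHashMap<String, Double>();

    public void putConcurrent(String symbol, double price) {
        concurrentCache.put(symbol, price);
    }

    public Double getConcurrent(String symbol) {
        return concurrentCache.get(symbol);
    }
}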

 

2) Excessive garbage generation and collection (from a high object allocation rate) due to the use of String objects as keys in the collections. The way objects were inserted and located within the collections (via the key comparison) created a lot of temporary String objects. Switching to a composite key class, for which Kirk wrote his own comparison method without creating any String objects, resolved the issue.
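The real key class wasn't shown either; assuming the original key was built by concatenating something like a symbol and a timestamp into a String, a composite key along these lines avoids creating any temporary Strings during insertion, lookup, and comparison (the field names are hypothetical):

// Hypothetical composite key: two primitive fields instead of a concatenated String,
// so lookups and comparisons allocate no temporary objects at all
public final class TradeKey implements Comparable<TradeKey> {
    private final int symbolId;
    private final long timestamp;

    public TradeKey(int symbolId, long timestamp) {
        this.symbolId = symbolId;
        this.timestamp = timestamp;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof TradeKey)) return false;
        TradeKey other = (TradeKey) o;
        return symbolId == other.symbolId && timestamp == other.timestamp;
    }

    @Override
    public int hashCode() {
        return 31 * symbolId + (int) (timestamp ^ (timestamp >>> 32));
    }

    // Comparison on the primitive fields only, no String creation
    @Override
    public int compareTo(TradeKey other) {
        if (symbolId != other.symbolId) {
            return symbolId < other.symbolId ? -1 : 1;
        }
        if (timestamp != other.timestamp) {
            return timestamp < other.timestamp ? -1 : 1;
        }
        return 0;
    }
}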

 

For more information, see Kirk's kodewerk.com site, along with other resources such as:

 

  • http://java.sun.com/javase/technologies/hotspot/index.jsp
  • http://java.sun.com/javase/technologies/hotspot/gc/index.jsp
  • http://www.javaperformancetuning.com/

 

Happy coding (and tuning)!

-Eric

 

 
