Before the final integration of the JFR backport into JDK8u, some concerns were voiced about a performance degradation in JFR-enabled builds - even when no JFR recording was started.
This really didn't match my experience, so I decided to give it a quick spin with a standard benchmark suite. Thanks to its licensing terms and how easy it is to obtain, I picked SPECjvm2008 - even though it is rather outdated, it can still provide meaningful numbers.
Setup
- dedicated c5.2xlarge AWS instance running Ubuntu 18.04
- jdk8 builds (with and without JFR) downloaded from https://builds.shipilev.net/
- SPECjvm2008 downloaded from https://www.spec.org/jvm2008/
Run
The benchmarks were run with the '-i 7' argument to force 7 iterations of each benchmark. This should reduce run-to-run jitter; unfortunately, it also seems to make the results 'non-compliant', but that is fine for this quick check.
The runs (with and without JFR) were done in sequence to avoid mutual interference, and the host was fully dedicated to the benchmarks.
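For illustration, such a run could be driven from a small Java harness along the following lines. This is just a sketch - the class name, the JDK path, and the SPECjvm2008 install directory are all hypothetical placeholders; only the '-i 7' argument reflects the actual runs.

```java
import java.io.File;

// Minimal sketch of driving one SPECjvm2008 run under a chosen JDK build.
// All paths here are illustrative placeholders.
public class SpecRunner {
    public static void main(String[] args) throws Exception {
        // Pick the JDK build under test (e.g. the build with or without JFR)
        String jdkHome = args.length > 0 ? args[0] : "/opt/jdk8-with-jfr";
        ProcessBuilder pb = new ProcessBuilder(
                jdkHome + "/bin/java", "-jar", "SPECjvm2008.jar",
                "-i", "7")                               // 7 iterations per benchmark
                .directory(new File("/opt/SPECjvm2008")) // illustrative install dir
                .inheritIO();                            // show benchmark output
        int exitCode = pb.start().waitFor();
        System.out.println("SPECjvm2008 finished with exit code " + exitCode);
    }
}
```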
Results
The results are, well, unsurprising. There is no statistically significant difference between the same JDK8 build with and without JFR when no JFR recording is started. The overall composite scores are, for all practical purposes, equal.
The following table gives a more detailed breakdown of the benchmark runs - the 'diff' column shows the relative performance difference between the runs with and without JFR, where a negative number indicates a regression.
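Assuming the usual convention for such comparisons, the diff value would be derived from the per-benchmark scores (ops/m in SPECjvm2008) as diff = (score with JFR - score without JFR) / (score without JFR) × 100, so a diff of, say, -1.5 would mean the JFR-enabled build scored 1.5% lower on that benchmark.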
Addendum
In parallel to this quick SPECjvm2008 run, we at DataDog also ran a number of more exhaustive benchmarks which happen to be internal and therefore not publicly reproducible. They all confirm the initial hunch: there is no performance regression whatsoever for JFR-enabled JDK8u - meaning you can go and enjoy JFR on JDK8u!
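If you want to give it a try: with the backport in place, a recording can be started programmatically through the jdk.jfr API (the same public API as in JDK 9+). A minimal sketch - the output file name is just an example:

```java
import java.nio.file.Paths;
import jdk.jfr.Recording;

public class JfrHello {
    public static void main(String[] args) throws Exception {
        // Start an in-process flight recording
        try (Recording recording = new Recording()) {
            recording.start();
            // ... run the code you want to observe ...
            recording.stop();
            // Persist the captured events to disk (file name is illustrative)
            recording.dump(Paths.get("recording.jfr"));
        }
    }
}
```

Alternatively, a recording can be started right at JVM startup with the -XX:StartFlightRecording option.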