How to Create and Destroy Java Memory Leaks
June 24, 2013

Filed under: Performance Monitoring


Thanks to the memory management built into the JVM, Java developers are shielded from the memory leak problems common in lower-level languages by automatic memory allocation and garbage collection. However, Java is not immune to memory problems caused by higher-level design defects.

Java Memory Leak? How is that possible?

The first memory pitfall that comes to many Java developers’ minds is probably mismanaged Collections (and Maps).

Let’s walk through a really simple example! We’ll use a static Map as a cache for data loaded for a certain User:

public class UserDataLoader {
    private static Map<User, UserData> cache = new HashMap<User, UserData>();
    // ...code to load and cache the data
}

Now there are 2 concerns to address:

1. The User object itself will not be garbage collected while this map still strongly references it, even if the User object is no longer referenced anywhere else in the JVM

2. The Map can easily grow out of control if not managed properly

An obvious solution is to remove the User keys that are no longer needed. This could be done by exposing a cleanup method to whoever manages the User objects, or by using a listener pattern.
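A minimal sketch of that explicit-cleanup approach (the `evict` method, and the `User`/`UserData` placeholder types, are hypothetical names, not from the original code):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the explicit-cleanup approach. User and UserData are
// hypothetical stand-ins for the application's own types.
public class UserDataLoader {
    static class User { }
    static class UserData { }

    private static final Map<User, UserData> cache = new HashMap<>();

    public static void cacheData(User user, UserData data) {
        cache.put(user, data);
    }

    // Exposed so whoever manages User lifecycles (or a listener on
    // user-removal events) can evict the entry once a User is discarded.
    public static void evict(User user) {
        cache.remove(user);
    }

    public static int cacheSize() {
        return cache.size();
    }
}
```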

However, this might not be suitable in certain scenarios, such as an interaction with an external API. Adding that method also forces more complexity on users of the API and increases the coupling between classes. Not good.

A rather naive but feasible solution is to instead use the ID of the User object (for example, an int) as the Map key to address concern 1 – the Map will no longer prevent garbage collection of the User object. In this case, there would have to be logic that periodically purges the cache based on timestamps (or a smarter algorithm) in order to keep the total size in check.
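Such an ID-keyed cache with timestamp-based purging might look like the sketch below (class and method names are hypothetical; the purge method would be invoked periodically, e.g. from a scheduled task):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of an ID-keyed cache purged by entry age. Because the key is an
// int ID rather than the User itself, the map no longer pins User objects
// in memory - but stale entries must be removed explicitly.
public class IdKeyedCache {
    private static class Entry {
        final Object data;
        final long loadedAt = System.currentTimeMillis();
        Entry(Object data) { this.data = data; }
    }

    private final Map<Integer, Entry> cache = new HashMap<>();
    private final long maxAgeMillis;

    public IdKeyedCache(long maxAgeMillis) { this.maxAgeMillis = maxAgeMillis; }

    public void put(int userId, Object data) { cache.put(userId, new Entry(data)); }

    public Object get(int userId) {
        Entry e = cache.get(userId);
        return e == null ? null : e.data;
    }

    // Drops every entry older than maxAgeMillis to keep the total size in check.
    public void purgeExpired() {
        long cutoff = System.currentTimeMillis() - maxAgeMillis;
        cache.entrySet().removeIf(en -> en.getValue().loadedAt < cutoff);
    }

    public int size() { return cache.size(); }
}
```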

java.util.WeakHashMap to the rescue!

WeakHashMap provides a much more elegant solution to this problem – simply instantiate the cache Map as a java.util.WeakHashMap! Entries in the WeakHashMap will be removed automatically when the User (as map key) is no longer referenced elsewhere.

As described in the Java WeakHashMap API – “the presence of a mapping for a given key will not prevent the key from being discarded by the garbage collector”.

What does that really mean? Let’s inspect the snippet below:

Map<Object, Object> objectMap = new WeakHashMap<Object, Object>();
for (int i = 0; i < 1000; i++) {
    objectMap.put(String.valueOf(i), new Object());
    System.gc(); // request collection of the now-unreferenced String key
    System.out.println("Map size :" + objectMap.size());
}

The output will show a lot of 1s and 0s (instead of an ever-increasing map size). Since String.valueOf(i) creates a temporary String instance that nothing references except objectMap, the System.gc() calls garbage collect the String instance, essentially removing the entry from objectMap.

Take note that the behavior is non-deterministic (1s and 0s, or 2s if you are lucky!), as a gc() call does not guarantee that everything is cleared up right away.

[Figure: comparison of memory usage for WeakHashMap vs. HashMap (same code logic, no explicit cleanups). Top: WeakHashMap – garbage collection correctly purged entries periodically; memory peaked at around 100 MB. Bottom: HashMap – without cleanup code, the map grew steadily, took up 600+ MB, and eventually triggered an OutOfMemoryError.]

WeakHashMap is Awesome! But use it with caution!

The first thing to watch out for is the choice of keys! Let’s examine 2 interesting cases below:

1. Integer keys

Map<Object, Object> objectMap = new WeakHashMap<Object, Object>();
for (int i = 0; i < 1000; i++) {
    objectMap.put(i, new Object());
    System.gc();
    System.out.println("Map size :" + objectMap.size());
}

We would probably expect 1s and 0s, but instead we see a lot of 128s! Interestingly, not everything gets garbage collected as expected. In the scenario above, the int i is implicitly boxed to an Integer, and the Integer class actually keeps a cache of values in the range of -128 to 127. Therefore, the entries with keys <= 127 will never be purged automatically!
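The boxing behavior behind those 128s can be seen directly – autoboxing goes through Integer.valueOf, which is specified to reuse cached instances for values in [-128, 127]:

```java
// Demonstrates the Integer cache described above: boxed values in
// [-128, 127] resolve to shared, permanently-reachable instances.
public class IntegerCacheDemo {
    public static void main(String[] args) {
        Integer a = 127, b = 127; // both boxed via Integer.valueOf -> same cached instance
        Integer c = 128, d = 128; // outside the cache -> distinct objects (by default)
        System.out.println(a == b); // true
        System.out.println(c == d); // typically false (the cache upper bound is tunable)
    }
}
```

Because those cached instances are strongly reachable for the lifetime of the JVM, WeakHashMap entries keyed on them can never be purged.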

[Figure: WeakHashMap with Integer keys – the first 128 entries are never purged.]

[Figure: WeakHashMap with String keys – entries are purged constantly, as the String objects are short-lived.]

On the other hand, any Integer key with a value >= 128 might get accidentally purged even if it is the valid ID of a referenced object. Imagine a getId() method that returns a primitive int: when such an ID is inserted into the WeakHashMap, it is implicitly boxed into an Integer object. A boxed Integer with a value >= 128 has no other external reference after the insertion and is eligible for garbage collection.
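A sketch of that pitfall (the User class and its getId() method are hypothetical stand-ins):

```java
import java.util.Map;
import java.util.WeakHashMap;

// Sketch of the boxed-int-key pitfall: the User stays strongly referenced,
// but the boxed Integer key has no external reference, so the entry can vanish.
public class BoxedKeyDemo {
    static class User {
        private final int id;
        User(int id) { this.id = id; }
        int getId() { return id; } // returns a primitive; the caller boxes it
    }

    public static void main(String[] args) {
        Map<Integer, User> byId = new WeakHashMap<>();
        User user = new User(500); // strongly referenced throughout

        byId.put(user.getId(), user); // 500 > 127: boxed to a fresh Integer
        // Nothing else references that boxed Integer, so even though `user`
        // itself is alive, a later GC cycle may purge this entry at any time.
        System.out.println(byId.containsKey(500));
    }
}
```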

Therefore, do not use primitive wrappers or Strings as WeakHashMap keys, as those objects are usually short-lived and do not share the lifespan of the objects you actually want to track.

2. Cloned keys

Certain designs promote methods that return cloned instances to prevent callers from unintentionally modifying the original object (though the same goal can be achieved with immutable objects). This becomes a problem for WeakHashMap: even if the original instance still exists in the JVM, the cloned instance used as a key might not! Some entries can therefore “magically” disappear even though the original object instances are still valid.
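A sketch of the cloned-key scenario (KeyHolder is a hypothetical class whose accessor returns a defensive copy):

```java
import java.util.Map;
import java.util.WeakHashMap;

// Sketch of the cloned-key pitfall: lookup works via equals()/hashCode(),
// but the weak reference tracks the specific clone instance.
public class ClonedKeyDemo {
    static class KeyHolder {
        final int id;
        KeyHolder(int id) { this.id = id; }
        @Override public int hashCode() { return id; }
        @Override public boolean equals(Object o) {
            return o instanceof KeyHolder && ((KeyHolder) o).id == id;
        }
        KeyHolder copy() { return new KeyHolder(id); } // what a defensive getter returns
    }

    public static void main(String[] args) {
        Map<KeyHolder, String> map = new WeakHashMap<>();
        KeyHolder original = new KeyHolder(42);
        KeyHolder clone = original.copy();

        map.put(clone, "cached value");
        // Lookup by the original succeeds, because WeakHashMap matches keys
        // with equals()/hashCode()...
        System.out.println(map.get(original)); // prints "cached value" (for now)

        // ...but reachability is tracked per instance: once `clone` becomes
        // unreachable, a GC cycle may purge the entry even though `original`
        // is still strongly referenced.
        clone = null;
        System.gc();
    }
}
```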

Now, the second thing to watch out for is the value: if the value somehow references back to the key, the entry will never be purged automatically either:

Map<Object, MyContainer> objectMap = new WeakHashMap<Object, MyContainer>();
for (int i = 0; i < 1000; i++) {
    Object tempObj = new Object();
    objectMap.put(tempObj, new MyContainer(tempObj)); // MyContainer stores a reference to tempObj in its field
    System.gc();
    System.out.println("Map size :" + objectMap.size());
}

The output shows that the Map just keeps growing!

This looks like a weird design, and you will probably think it can NEVER happen. But remember, “never say never”! Especially in more complicated setups, indirect references are always possible.

So the Conclusion is…

WeakHashMap is still very useful in a lot of different scenarios, as it provides an elegant way to manage Maps with loose coupling. However, extra care should be taken when selecting the keys, especially when the keys are provided by external code modules.

Personally, I prefer a regular HashMap whenever the code has the ability to perform proper key cleanup with ease. The reasons are:

1. Relying on WeakHashMap to clear itself on the assumption that no other external references hold the keys is not bulletproof. In some cases, you have no control over how others use or reference the same key instances. If others hold on to those instances, your WeakHashMap suffers too!

2. Keeping an entry in the map for the full lifespan of the key might not be optimal. Key instances might be long-lived (due to other usages), but their corresponding entries in the Map do not have to be!

3. If a field is declared as a Map, the only guaranteed behaviors are those defined in the Map interface. Even if the field currently holds a WeakHashMap, the assumption that it will always self-purge does not always hold. The field can be replaced by another, non-weak-reference Map implementation without violating its declared type.
