Consider this piece of code, which allocates a boolean array of 100,000 elements:
boolean[] array = new boolean[100000];
What should the size of this array be?
Considering that a boolean can only be either true or false, each element in the array needs just a single bit of space. Thus, the size of our boolean array in bytes should be:
100,000/8 + (overhead of array object) = 12,500 bytes + (overhead of array object)
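The arithmetic above can be checked with a quick sketch (the class name is just for illustration):

```java
// Ideal packed size: one bit per boolean, so eight booleans per byte.
public class IdealSize {
    public static void main(String[] args) {
        int elements = 100_000;
        int packedBytes = elements / 8;
        System.out.println(packedBytes + " bytes"); // prints "12500 bytes"
    }
}
```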
But there lies the hidden evil….
The Evil Inside
As it turns out, when you allocate a boolean array, Java uses an entire byte for each element in the boolean array!
Thus the size of the boolean array is in fact:
100,000 bytes + (overhead of array object)
So is there any way to avoid wasting the 7 extra bits when you only need 1? It's here that the java.util.BitSet class comes to the rescue.
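As a quick sketch of the API, a BitSet can stand in for a boolean array like this (the capacity hint and indices are illustrative):

```java
import java.util.BitSet;

public class BitSetDemo {
    public static void main(String[] args) {
        BitSet flags = new BitSet(100_000); // capacity hint, like new boolean[100000]
        flags.set(42);                      // equivalent to array[42] = true
        flags.clear(42);                    // equivalent to array[42] = false
        flags.set(99_999);
        System.out.println(flags.get(99_999)); // prints "true"
        System.out.println(flags.get(0));      // prints "false" (bits default to false)
    }
}
```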
The BitSet class does indeed use a single bit to represent a true/false boolean value. Its implementation uses an array of ‘long’ values, where each bit of the long value can be individually manipulated to set any position in the entire BitSet to true or false.
The BitSet implementation does add a bit of CPU overhead, since it has to shift and OR bits together to set the bit at the correct position. Thus you need to weigh the memory savings against the extra CPU overhead when using this class.
As an example, here is a screenshot from the YourKit profiler, showing the memory used by a boolean array and a BitSet object, both 100,000 elements long:
As can be seen:
boolean array takes:
100,000 + 16 (array overhead) = 100,016 bytes
BitSet object only takes:
100,000 / 64 (bits per long) = 1,562.5, which rounds up to 1,563 long values
1,563 * 8 + 16 (array overhead) = 12,520 bytes
A few extra bytes are added for the BitSet object itself and the two extra fields it contains.
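The long-array math can be checked directly, since BitSet.size() reports the number of bits of space the internal long[] actually provides:

```java
import java.util.BitSet;

public class BitSetSize {
    public static void main(String[] args) {
        BitSet bits = new BitSet(100_000);
        // 100,000 bits need ceil(100,000 / 64) = 1,563 long words
        System.out.println(bits.size());          // prints "100032" (1,563 * 64 bits)
        System.out.println(bits.size() / 64 * 8); // prints "12504" (bytes in the long[])
    }
}
```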
To put things in perspective, here is a graph showing the space occupied by both for 100,000 elements:
As can be seen from the graph above, using a BitSet instead of a boolean array can save you tons of memory at the cost of just a few extra CPU cycles.