• Type: Bug
  • Status: Closed
  • Priority: 2 Major
  • Resolution: Fixed
  • Component: ehcache-core
  • Assignee: hsingh
  • Reporter: amiller
  • Created: October 27, 2009
  • Votes: 0
  • Watchers: 0
  • Updated: January 17, 2013
  • Resolved: November 04, 2009

Description

MemoryStore does this:

    map = new ConcurrentHashMap(maximumSize, DEFAULT_LOAD_FACTOR, CONCURRENCY_LEVEL);

expecting the map to be sized sufficiently to hold maximumSize entries (according to the Javadoc). However, the map will resize once it reaches maximumSize * DEFAULT_LOAD_FACTOR entries (assuming uniform hashing). It would be better to size it initially to maximumSize / DEFAULT_LOAD_FACTOR. CHM will round that up to the next power of two internally anyway. An unlucky hash distribution can still force a resize on its own, but I can't imagine that's worth worrying about.

Anyhow, I would recommend changing it to (note the cast: DEFAULT_LOAD_FACTOR is a float, so the division must be cast back to int for the constructor):

    map = new ConcurrentHashMap((int) (maximumSize / DEFAULT_LOAD_FACTOR), DEFAULT_LOAD_FACTOR, CONCURRENCY_LEVEL);

and add a comment next to this formula explaining the rationale.
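The resize arithmetic above can be sketched numerically. This is a hedged illustration, not Ehcache's actual code: the power-of-two rounding mirrors ConcurrentHashMap's table sizing as described in this report, and the class, method, and constant names below are hypothetical.

```java
// Illustrates why initialCapacity = maximumSize can trigger a resize,
// while maximumSize / DEFAULT_LOAD_FACTOR does not. The rounding below
// mirrors ConcurrentHashMap's power-of-two table sizing; the constant
// names follow the bug report, not Ehcache's source.
public class ChmSizingSketch {
    static final float DEFAULT_LOAD_FACTOR = 0.75f;

    // Smallest power of two >= n, as CHM uses for its table length.
    static int tableSizeFor(int n) {
        int size = 1;
        while (size < n) {
            size <<= 1;
        }
        return size;
    }

    // Resize threshold for a map built with the given initial capacity:
    // the map resizes once its size exceeds tableLength * loadFactor.
    static int thresholdFor(int initialCapacity) {
        return (int) (tableSizeFor(initialCapacity) * DEFAULT_LOAD_FACTOR);
    }

    public static void main(String[] args) {
        int maximumSize = 1000;

        // Naive sizing: table of 1024, threshold 768 < 1000,
        // so the map resizes before reaching maximumSize.
        System.out.println("naive threshold: " + thresholdFor(maximumSize));

        // Suggested sizing: 1000 / 0.75 = 1333, rounded up to a table of
        // 2048, threshold 1536 >= 1000, so no resize occurs.
        int adjusted = (int) (maximumSize / DEFAULT_LOAD_FACTOR);
        System.out.println("adjusted threshold: " + thresholdFor(adjusted));
    }
}
```

Running this prints `naive threshold: 768` and `adjusted threshold: 1536`, showing that only the adjusted sizing keeps the threshold at or above maximumSize.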

Comments

Abhishek Singh 2009-11-04

Fixed, also added a unit test.

Himadri Singh 2009-11-19

InitialCapacityTest covers it.