Improve Performance with Object Pool

In a former article I’ve shown the consequences of memory allocation and garbage collection. The object pool is a simple design pattern that can be used to solve both problems.

The Problem

Memory allocation is a slow process. The less memory allocations our code requires, the better.

If we look at the example from the memory allocation post, we can see that memory allocation makes our code run much slower – sometimes, 3 or 4 times slower.

But memory allocation is just the first part of the problem. The problem described in the former post mostly considers the creation of new objects. What happens when you need to delete objects as well?

When you delete objects in JavaScript, they remain in memory until they are released. The mechanism responsible for this memory release is called Garbage Collection (GC from here on). The JS engine tracks down unreachable objects and removes them from memory.

This is a very short introduction to Garbage Collection. If you want to read more about it, I suggest reading this V8 blog article.

Ok, so the JS engine does it for me. Who cares?

To illustrate the problem of GC, let’s look at some code and profile it:

```javascript
class Demo {
  constructor(counter) {
    this.counter = counter;
  }

  setMe() {
    this.counter.count += 1;
  }
}

function createAndCount() {
  const objs = [];
  const counter = {
    count: 0
  };
  for (let i = 0; i < 1000; i++) {
    for (let demo = 0; demo < 1000; demo++) {
      const currBoom = objs[objs.push(new Demo(counter)) - 1];
      currBoom.setMe();
    }
    objs.length = 0;
  }
  console.log(`Counter is: ${counter.count}`);
}

createAndCount();
```

Code Snippet 1: Runtime allocation and GC

The code is pretty simple. We have a class Demo that is instantiated 1,000 times in each of 1,000 outer iterations. We then do something with each instance (the setMe method).

The profiler shows us something interesting:

Figure 1: Bottom up view of the profiler results. Almost 50% of the runtime is spent on Minor GC

Figure 1 is the bottom-up view of the runtime. We can see that GC took almost 50% of our runtime. That’s almost 50% of our performance budget we can save!

What’s causing all of this GC?

When we create more and more objects inside the for loop, the JS engine stores them in memory. Then, we clear the array (objs.length = 0).

The JS engine treats the discarded data as a target for garbage collection. Then we create new objects in the array, for which the JS engine allocates new memory, and so on.

While we repeat the process, the JS engine repeats the GC process.

This is, in a nutshell, memory allocation and GC.

How can your app suffer from GC?

Well, in JS we create and delete objects with abandon (almost everything is an object). We sometimes keep them in arrays, and sometimes they are created in different functions.

If you have a function that creates an auxiliary array – the array is most probably Garbage Collected after the function finishes (unless you do something about it). See Figure 2 for example.

Figure 2: A function that allocates an array and once finished – the array is garbage collected

Another use case might be quick or bulk updates from a server that adds or removes new entities in your data. If you hold 5000 entities, and remove/create 1000 entities in every cycle – this might cause a GC issue in your application.

There is a solution – Object Pool

The idea in object pools is to preallocate the memory you need and reuse it.

Let’s understand this using some code:

```javascript
function createAndCount() {
  const counter = {
    count: 0
  };
  const objs = new Array(1000).fill(null).map(() => new Demo(counter));
  for (let i = 0; i < 1000; i++) {
    for (let demo = 0; demo < 1000; demo++) {
      const currBoom = objs[demo];
      currBoom.setMe();
    }
  }
  console.log(`Counter is: ${counter.count}`);
}
```

Code Snippet 2: Pre-allocate an array and reuse the objects

The code snippet above pre-allocates the demos and, instead of creating new ones on every iteration, reuses them.

When looking at the bottom-up view again, we see that the GC is completely gone!

Figure 3: Filtering the Bottom-Up view shows zero GC!

I hope now you have some motivation to learn the algorithm itself, because this is the next step.

The Algorithm

The example above, while showing the motivation to use object pool, is not very useful in real life. We would like the objects we get from the pool to be exactly like objects we’ve just instantiated with new Demo().

The algorithm below is how we can implement an object pool:

  1. Create an array of the types of entities you intend to create and delete a lot
  2. Tag and save “free” entities in this array
  3. When you need a new entity, request a “free” entity
    1. If no free entity exists:
      1. Create a new one and add it to the array
      2. Return the new entity
  4. Release the object:
    1. Reset the object
    2. Return it to the array

In Javascript:

```javascript
class ObjectPool {
  constructor(objectConstructor, objectReseter, initialSize = 5000) {
    this.objectConstructor = objectConstructor;
    this.objectReseter = objectReseter;
    this._pool = [];
    for (let i = 0; i < initialSize; i++) {
      this._addObjectToPool();
    }
  }

  _addObjectToPool() {
    const newObj = {
      alive: false,
      data: this.objectConstructor()
    };
    this._pool.push(newObj);
    return newObj;
  }

  _allocate(object) {
    object.alive = true;
    return object;
  }

  getNew() {
    for (let i = 0; i < this._pool.length; i++) {
      if (this._pool[i].alive === false) {
        return this._allocate(this._pool[i]);
      }
    }
    return this._allocate(this._addObjectToPool());
  }

  release(object) {
    object.alive = false;
    this.objectReseter(;
  }
}
```

Code Snippet 3: Basic Object Pool

The constructor initiates an array of objects of a certain size (defaults to 5000). 

When we need a new object, we request it via the getNew method. This method searches for the first “dead” object. It then returns it for usage as a “live” object.

When one finishes using the object, it can be released via the release method.

The tables in Figure 4 compare the total GC of the same task with and without a pool. It is based on the example in the following link:

Figure 4: Left table – total GC of the code running with Object Pool. Right table – total GC of the code running without an Object Pool.

You can play with it and profile it to see the effects of the pool.


The algorithm described above is deliberately simple for teaching purposes, but it has some issues.

Reduce free object lookup time

Looking for a free entity in our current algorithm is costly – O(n). 

One solution is to keep an array of “free objects”. Every time we request a new one, we remove it from that array; when we release an object, we push it back.

Removing and adding the objects to the “free” array can be a costly allocation process in itself – which is exactly what we set out to avoid in the first place.

We can further optimize by keeping an array of indices of free objects instead of removing and adding the objects themselves. This keeps retrieval of a free object O(1) while avoiding reallocation of the big data array.
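As a sketch of this index-based variant (hypothetical code, not from the article): keep the data in one fixed array and treat a second array of integer indices as a stack of free slots:

```javascript
// Hypothetical index-based free list: the data array is never
// resized; only small integers move in and out of `_freeIndices`.
class IndexPool {
  constructor(factory, size = 5000) {
    this._data = new Array(size);
    this._freeIndices = new Array(size);
    for (let i = 0; i < size; i++) {
      this._data[i] = factory();
      this._freeIndices[i] = i; // every slot starts out free
    }
  }
  // O(1): pop the last free index off the stack.
  acquire() {
    const index = this._freeIndices.pop();
    if (index === undefined) throw new Error('pool exhausted');
    return index;
  }
  // O(1): push the index back for reuse.
  release(index) {
    this._freeIndices.push(index);
  }
  get(index) {
    return this._data[index];
  }
}

const pool = new IndexPool(() => ({ x: 0 }), 3);
const i = pool.acquire(); // index 2, the top of the free stack
pool.get(i).x = 7;
pool.release(i);
console.log(pool.acquire() === i); // true: the same slot comes back
```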

The solution above mostly replaces object allocation with integer allocation. While in many cases this is a huge improvement, we can do more. In addition, if we talk about “big data”, creating another array in the size of the original array might not be such a good idea…

If you look closely, we only use the “free” array to push and pop, always taking the last or the first item. This is either a FIFO (First In First Out) or a LIFO (Last In First Out) strategy. It yells “a stack or a queue”!

In this case, you might want to create a queue of free entities in which every entity refers to the next. This way, just keeping a reference to the next free entity lets us retrieve a free object in constant time while keeping all the objects in their own array.

Here’s the queue solution implemented:

```javascript
class PoolObject {
  constructor(data) { = data; = false;
    this.nextFree = null;
    this.previousFree = null;
  }
}

class Pool {
  constructor(objCreator, objReseter, initialSize = 5000) {
    this._pool = [];
    this.objCreator = objCreator;
    this.objReseter = objReseter;
    this.nextFree = null;
    this.lastFree = null;
    for (let i = 0; i < initialSize; i++) {
      this.addNewObject(this.newPoolObject());
    }
  }

  addNewObject(obj) {
    this._pool.push(obj);
    this.release(obj);
    return obj;
  }

  release(poolObject) {
    // flag as free = true;
    // set in the queue
    poolObject.nextFree = null;
    poolObject.previousFree = this.lastFree;
    // if we had a last free, set the last free's next as the new poolObject
    // otherwise, this is the first free!
    if (poolObject.previousFree) {
      this.lastFree.nextFree = poolObject;
    } else {
      this.nextFree = poolObject;
    }
    // set the new object as the last in the queue
    this.lastFree = poolObject;
    // reset the object if needed
    this.objReseter(;
  }

  getFree() {
    // if we have a free one, get it - otherwise create it
    const freeObject = this.nextFree ? this.nextFree : this.addNewObject(this.newPoolObject());
    // flag as used = false;
    // the next free is the object's next free
    this.nextFree = freeObject.nextFree;
    // if there's nothing afterwards, the lastFree is null as well
    if (!this.nextFree) this.lastFree = null;
    // return the now not free object
    return freeObject;
  }

  newPoolObject() {
    const data = this.objCreator();
    return new PoolObject(data);
  }

  releaseAll() {
    this._pool.forEach(item => this.release(item));
  }
}
```

Code Snippet 4: Implement the Object Pool solution with a queue for O(1) look up time

In the solution above ( getFree method), we always keep the pool’s nextFree as a reference to the next available free object. Once we fetch it, we set the nextFree to be the next of the current nextFree and this is how the queue goes on.

The solution is simple and straightforward.

Optimizing the number of elements in the pool

If you’ve noticed, the default number of pre-allocated entities was 5000. Two questions arise:

  1. How many entities should we pre allocate?
  2. What happens if we reach this number?

The first question has a short answer – you should know your application and estimate the amount of data you are going to hold.

The second question has a few approaches. In the code above, the moment we had no free objects, we just created a new one and it stayed in the pool forever.

You could, on the other hand, decide on a threshold (like always having at least 2% free entities). Every time you cross that threshold, you double the number of entities you have, or add 100, or anything you deem fit.
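Such a growth policy could be sketched like this (the class, names, and numbers are all illustrative):

```javascript
// Illustrative growth policy for a simple array-backed pool:
// when fewer than `threshold` of the slots are free, grow the
// pool by a factor in one batch instead of one object at a time.
class GrowingPool {
  constructor(factory, initialSize = 100, threshold = 0.02, growthFactor = 2) {
    this.factory = factory;
    this.threshold = threshold;
    this.growthFactor = growthFactor;
    this._free = [];
    this._capacity = 0;
    this._grow(initialSize);
  }
  _grow(count) {
    for (let i = 0; i < count; i++) {
      this._free.push(this.factory());
    }
    this._capacity += count;
  }
  acquire() {
    // grow *before* we run dry, so the hot path rarely allocates
    if (this._free.length <= this._capacity * this.threshold) {
      this._grow(this._capacity * (this.growthFactor - 1));
    }
    return this._free.pop();
  }
  release(obj) {
    this._free.push(obj);
  }
}

const pool = new GrowingPool(() => ({ x: 0 }), 100);
for (let i = 0; i < 99; i++) pool.acquire();
console.log(pool._capacity); // grew from 100 to 200 along the way
```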

Eventually, optimizing for the number of entities requires some knowledge about your application.


In this article we dived a bit deeper into memory allocation and GC.

We saw an example of memory allocation in which we allocated memory for objects in an array. This example is closer to what will happen in real life, as arrays of integers are rarely used in real life applications.

The trick used in the memory allocation article, in which we pre-allocated the array beforehand, worked and removed GC from our runtime (Figure 3).

The problem with this simplified solution is that it is not robust enough. All we did was use the same objects over and over, but just in a loop. Most of the time, we’d like to instantiate a new object with some meta data and use it OOP style (oh no – I said the O word!!!).

This is where the Object Pool design pattern comes in handy. Actually, you can take the code in the gist and use it in your app right now. Just follow the API.

An important thing to note is this: you should always be aware that these kinds of solutions have a price. The price is usually an overhead in code complexity.

In this case, we’ve created a whole API just to create and destroy entities. If your app does not have a lot of create/destroy actions – you are better off using a “regular” new. Thanks to MichalKutz for bringing that up.

I hope you now better understand memory allocation and GC and their implications on your app’s performance. If you have any questions or comments – feel free to use the comments area below.
