Constant-Time Comparison Sorting Algorithm

That’s right! A comparison sorting algorithm that doesn’t scale with input size. No matter how large an input you give it, it always takes the same amount of time to finish.

When determining the time complexity of an algorithm, the largest term wins. A quadratic O(n^2 + n) is just O(n^2) because the n^2 term dwarfs the n term as input sizes grow. Similarly, O(n log n) dwarfs a constant O(1).
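To see the dominance in action, here’s a quick sketch (my own arithmetic, not load-bearing):

// Quick sketch: the n term's share of n^2 + n vanishes as n grows.
for (const n of [10, 1000, 1000000]) {
  console.log(`n = ${n}: (n^2 + n) / n^2 = ${(n ** 2 + n) / n ** 2}`);
}
// n = 10: 1.1
// n = 1000: 1.001
// n = 1000000: 1.000001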

That logic is fine for small constants. But when your constant is so large that it dwarfs any possible input you could give it, it effectively cancels the other terms.
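For a taste of the scale involved, here’s a rough sketch in log10 (numbers this big overflow JavaScript doubles, so we compare exponents instead; the 10^1000-year figure is a spoiler for where this post is headed):

// Rough sketch: compare orders of magnitude in log10, since a
// 10^1000-year wait overflows a double (10 ** 1000 === Infinity).
const log10WaitMs =
  1000 + Math.log10(365.25 * 24 * 60 * 60 * 1000); // ≈ 1010.5
// n log n comparisons for the largest n a double can even represent:
const n = Number.MAX_VALUE; // ≈ 1.8e308
const log10Comparisons = Math.log10(n) + Math.log10(Math.log2(n)); // ≈ 311.3
console.log(log10WaitMs - log10Comparisons); // ≈ 699 orders of magnitude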

What constants can we use? A few contenders come to mind:

  • Planck lengths in the observable universe: Assuming you don’t need any space or matter for the computer, if you could store one bit of data in each Planck length along the universe’s diameter, your input size would be approximately 5.45e61.
  • Universe simulation: Let’s say you are simulating the entire universe to the Planck length resolution and want to sort the Planck voxels. You’d need approximately 8.4e184 bits.
  • Time till the heat death of the universe: Let’s say you choose not to store all the input data at once. Rather, you generate it continuously until the heat death of the universe. Assuming you don’t need time at the end for your computer to run the calculation, you’d have approximately 10^1000 years. (Back-of-envelope arithmetic for these figures is sketched below.)
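For the skeptical, the first two figures check out on the back of an envelope (rough constants, order of magnitude only; the third is too big for a double to even hold):

// Back-of-envelope check (rough constants, order of magnitude only).
const planckLength = 1.616e-35; // meters
const universeDiameter = 8.8e26; // meters, observable universe

// 1. Planck lengths across the observable universe's diameter:
console.log(universeDiameter / planckLength); // ≈ 5.45e61

// 2. Planck volumes ("voxels") in the observable universe:
const universeVolume = (4 / 3) * Math.PI * (universeDiameter / 2) ** 3;
console.log(universeVolume / planckLength ** 3); // ≈ 8.4e184

// 3. 10^1000 years can't be checked this way: 10 ** 1000 === Infinity.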

Without further ado, I present Heat Death Sort:

function heatDeathSort(array) {
  // Do the real work up front; everything after this is waiting.
  const sorted = [...array].sort((a, b) => a - b);

  // Note: 10 ** 1000 overflows IEEE 754 doubles to Infinity, which is
  // arguably even truer to the spirit of the algorithm.
  const yearsUntilHeatDeath = 10 ** 1000;
  const msUntilHeatDeath = yearsUntilHeatDeath * 365.25 * 24 * 60 * 60 * 1000;

  console.log(
    `Sorting complete. Waiting 10^1000 years (at ~3.16e10 ms per year) until the heat death of the universe...`
  );

  return new Promise((resolve) => {
    // Most runtimes can't represent delays above 2^31 - 1 ms (~24.8 days),
    // so in practice this resolves a few eons ahead of schedule.
    setTimeout(() => {
      console.log("The last black hole has evaporated. Returning your sorted array.");
      resolve(sorted);
    }, msUntilHeatDeath);
  });
}
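
Usage is simple enough, assuming you have the time:

// Usage sketch: the sorted array comes back wrapped in a Promise.
heatDeathSort([3, 1, 4, 1, 5]).then((sorted) => {
  console.log(sorted); // [1, 1, 3, 4, 5]
});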