The Magical Relationship between Reciprocal Numbers and the Logarithm

Tejas Hiremani
4 min read · Dec 22, 2020

The numbers 1/n for any natural number n are called reciprocal numbers. What’s special about them? Let’s find out!

The sum of reciprocal numbers is very counterintuitive. If we consider

1 + 1/2 + 1/3 + 1/4 + …

we see that the numbers we add get smaller and smaller. One might ask: does the series converge to a finite number? The surprising answer is that it does not! The partial sums grow past every number we can conceive of, all the way to infinity. This series, called the Harmonic Series, is shown in the chart below for the first 50 terms.

How do we prove this? Here is the classic grouping argument:

H = 1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + 1/8 + 1/9 + 1/10 …

H > 1 + 1/2 + (1/4 + 1/4) + (1/8 + 1/8 + 1/8 + 1/8) + (1/16 + …) + …

(here 1/3 is replaced by the smaller 1/4, each of 1/5 through 1/7 by the smaller 1/8, and so on, so every bracketed group adds up to exactly 1/2)

H > 1 + 1/2 + 1/2 + 1/2 + 1/2 + …

Thus H goes off to ∞.
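Just to get a feel for how slowly this divergence happens, here is a minimal Python sketch (purely an illustration, not part of the proof) that finds where the partial sums first cross each whole number:

```python
# Partial sums of the harmonic series pass every whole number,
# but it takes roughly e times as many terms to reach each new one.
H = 0.0
n = 0
for target in range(1, 15):
    while H <= target:
        n += 1
        H += 1.0 / n
    print(f"H_n first exceeds {target} at n = {n}")
```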

These are the basics of the harmonic series, but can we do more? It turns out we certainly can. In fact, we can estimate the rate at which the series diverges, which is surprisingly given by:

1 + 1/2 + 1/3 + 1/4 + … + 1/n ≈ ln(n) + 𝛾

where the difference between the two sides tends to 0 as n → ∞.

The proof of the above is beyond the scope of the current post. The constant 𝛾 in the equation is the famous Euler–Mascheroni constant. Its value is given by:

𝛾 = 0.5772156649015328606065120900824024310421…

And we can graphically visualise it as the blue area in the graph below:

The purple area, under the curve y = 1/x, is given by ln(x).
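As a quick sanity check of this relationship, here is a small Python sketch that compares the partial sums with ln(n); the difference settles down to 𝛾 as n grows:

```python
import math

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant (double precision)

H = 0.0
n = 0
for exponent in range(1, 7):  # n = 10, 100, ..., 1_000_000
    while n < 10 ** exponent:
        n += 1
        H += 1.0 / n
    print(f"n = {n:>8}: H_n - ln(n) = {H - math.log(n):.9f}  (gamma = {GAMMA:.9f})")
```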

And we see that there is some deep underlying relationship between the sum of the reciprocals and the logarithm. How deep is it really? It turns out the rabbit hole goes far deeper than one might initially presume! For instance, let’s flip the sign of every other term. This gives the alternating Harmonic Series

S = 1 – 1/2 + 1/3 – 1/4 …

Does this series converge? It does, and its value is given by …

ln(2) = 1 – 1/2 + 1/3 – 1/4 …

And once again the logarithm turns up for this series!
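A minimal numerical check of this claim (the partial sums close in on ln(2), alternating around it):

```python
import math

S = 0.0
for i in range(1, 1_000_001):
    S += (-1) ** (i + 1) / i  # +1, -1/2, +1/3, -1/4, ...
print(f"partial sum after 10^6 terms: {S:.7f}, ln(2) = {math.log(2):.7f}")
```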

Can we prove this sum? We can, and in fact we can prove something even more profound: the logarithm of any natural number can be represented as an infinite sum built out of reciprocal numbers.

Consider the sum

(1 – 1/k) + (1/2 – 1/k) + (1/3 – 1/k) … + (1/(k – 1) – 1/k) +

(1/(k + 1) – 1/(2*k)) + (1/(k + 2) – 1/(2*k)) … + (1/(2 * k – 1) – 1/(2*k)) + …

The general term of this series is (1/i – 1/(a*k)), where i runs over the natural numbers that are not multiples of k, and a is chosen such that a*k is the smallest multiple of k greater than i.

If we add (1/(a*k) – 1/(a*k)) = 0 to each block, every i from 1 to N appears, and each block of k consecutive terms subtracts k * 1/(a*k) = 1/a in total. The partial sum up to some large N (which we pick to be a multiple of k for simplicity) then simplifies to

S = (1 + 1/2 + 1/3 + … + 1/N) – (1 + 1/2 + 1/3 + … + 1/(N/k))

S = 1/(N/k + 1) + 1/(N/k + 2) + … 1/N

S = 1/N * [1/(1/k + 1/N) + 1/(1/k + 2/N) … 1]

The bracketed expression is a Riemann sum with step size 1/N, so for very large N the sum converges to an integral:

S = ∫(1/x)dx from 1/k to 1

Thus S = ln(k)

We have found a formula to get the logarithm of any natural number!
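Here is a short Python sketch of the step above (the helper name tail_sum is just for illustration): the partial sum collapses to the tail 1/(N/k + 1) + … + 1/N, which does indeed approach ln(k):

```python
import math

def tail_sum(k, N):
    """Sum of 1/(N/k + 1) + ... + 1/N, assuming N is a multiple of k."""
    return sum(1.0 / i for i in range(N // k + 1, N + 1))

for k in (2, 3, 10):
    N = k * 100_000
    print(f"k = {k:2d}: tail sum = {tail_sum(k, N):.4f}, ln(k) = {math.log(k):.4f}")
```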

Substituting k = 2, we get

ln(2) = 1 – 1/2 + 1/3 – 1/4 …

And substituting k = 3 we get the expansion

ln(3) = 1 + 1/2 – 2/3 + 1/4 + 1/5 – 2/6 + 1/7 + 1/8 – 2/9 …

In general

ln(k) = 1 + 1/2 + … + 1/(k – 1) – (k – 1)/k + 1/(k + 1) + 1/(k + 2) … + 1/(2 * k – 1) – (k – 1) / (2 * k) + …
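And a minimal Python sketch of the general expansion itself (log_series is an illustrative helper, not a standard function): it adds the series block by block, block a contributing 1/((a – 1)*k + 1) + … + 1/(a*k – 1) – (k – 1)/(a*k):

```python
import math

def log_series(k, blocks):
    """Partial sum of the expansion above over the first `blocks` blocks."""
    total = 0.0
    for a in range(1, blocks + 1):
        # reciprocals 1/((a-1)k + 1) ... 1/(a*k - 1), then the -(k-1)/(a*k) correction
        total += sum(1.0 / i for i in range((a - 1) * k + 1, a * k))
        total -= (k - 1) / (a * k)
    return total

for k in (2, 3, 5):
    print(f"k = {k}: partial sum = {log_series(k, 200_000):.5f}, ln(k) = {math.log(k):.5f}")
```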

Is it possible to generalize this even further? It certainly is. We can replace the function 1/x with a general power x⁻ˢ and run the same construction, giving (for N a large multiple of k):

N¹⁻ˢ * (kˢ⁻¹ – 1) / (s – 1) ≈ 1⁻ˢ + 2⁻ˢ + 3⁻ˢ + 4⁻ˢ + … + k⁻ˢ – kˢ * k⁻ˢ + (k+1)⁻ˢ + (k+2)⁻ˢ + … + (2*k)⁻ˢ – kˢ * (2*k)⁻ˢ + … + N⁻ˢ – kˢ * N⁻ˢ

And if s > 1, the left-hand side tends to 0 as N → ∞. Moving the subtracted multiples of k to the other side, this simplifies to

1⁻ˢ + 2⁻ˢ + 3⁻ˢ + 4⁻ˢ + … + (k-1)⁻ˢ + (k+1)⁻ˢ + (k+2)⁻ˢ + … = (kˢ - 1) * (k⁻ˢ + (2k)⁻ˢ + …)

For instance, putting s = 2 and k = 2

1/1² + 1/3² + 1/5² … = 3 * (1/2² + 1/4² + 1/6² …)

And for k = 3

1/1² + 1/2² + 1/4² + 1/5² + 1/7² + 1/8² … = 8 * (1/3² + 1/6² + 1/9² …)
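Finally, here is a small Python sketch that checks the identity and the two examples above numerically (split_sums is an illustrative helper; both sums are truncated at a cutoff N, so the agreement is only approximate):

```python
def split_sums(s, k, N=500_000):
    """Truncated sums of i**(-s): over the non-multiples of k and over the multiples of k."""
    non_multiples = sum(i ** (-s) for i in range(1, N + 1) if i % k != 0)
    multiples = sum(i ** (-s) for i in range(k, N + 1, k))
    return non_multiples, multiples

for s, k in [(2, 2), (2, 3), (3, 2)]:
    non_mult, mult = split_sums(s, k)
    print(f"s = {s}, k = {k}: non-multiples = {non_mult:.5f}, (k^s - 1) * multiples = {(k ** s - 1) * mult:.5f}")
```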
