In his equation (4), Servois derived the inverse \(\sum\) of the finite difference operator \(\Delta\).

To find an expression for \(\sum y = \Delta^{-1} y = (E - 1)^{-1}y\), we observe that, by the definition of inverse, we must have \((E-1)(E-1)^{-1}y = y\). We then check that

\[\begin{align*}(E-1) & [E^{-1}y + E^{-2}y + E^{-3}y + \cdots] \\& = (y - E^{-1}y) + (E^{-1}y - E^{-2}y) + (E^{-2}y - E^{-3}y) + \cdots,\end{align*}\]

which is a telescoping sum, formally equal to \(y\); this gives us a candidate for \(\sum y\).
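As a quick numerical illustration (ours, not Servois'), we can truncate this formal sum for a function whose backward shifts decay, say \(f(x) = 2^x\) with \(\omega = 1\), so that \(f(x-k) \to 0\) as \(k \to \infty\), and check that applying \(\Delta\) recovers \(y\):

```python
# Numerical check (illustrative only) that Delta applied to a truncation of
# the formal sum  E^{-1}y + E^{-2}y + ...  recovers y, for f(x) = 2**x and
# omega = 1.  The backward shifts f(x - k) = 2**(x - k) decay, so the
# truncated series is a good stand-in for the formal one.

def f(x):
    return 2.0 ** x

def partial_sum(x, terms=60):
    """Truncation of the formal sum  sum_{k>=1} f(x - k*omega), omega = 1."""
    return sum(f(x - k) for k in range(1, terms + 1))

x = 3.0
# Delta S(x) = S(x + 1) - S(x) should telescope back to f(x) = 8.
delta_S = partial_sum(x + 1) - partial_sum(x)
print(delta_S)   # approximately 8.0
print(f(x))      # 8.0
```

The telescoping is exact term by term; only the discarded tail of the truncation (and floating-point rounding) keeps the two printed values from agreeing perfectly.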

Note also that if \(K\) is any quantity satisfying \(\Delta K=0\), that is, \((E-1)K=0\), then adding \(K\) to our candidate changes nothing: we still have \((E-1)\left[\sum y + K\right] = y\).

Such a quantity \(K\) plays the role for \(\sum\) that is analogous to the arbitrary constant \(C\) in an indefinite integral. Therefore,

\[\sum y = E^{-1} y + E^{-2} y + E^{-3} y + \cdots + K,\]

a formal sum that we might write as

\[\begin{align*}\sum^{\infty}_{k=1} f(x - k\omega) + K&= \sum^{\infty}_{k=1} f(x_0 + n\omega - k\omega) + K \\&= \sum^{\infty}_{k=n+1} f(x_0 + (n-k)\omega) + \sum^{n}_{k=1} f(x_0 + (n-k)\omega) + K \\ &= \sum^{\infty}_{k=1} f(x_0 - k\omega) + \sum^{n-1}_{k=0} f(x_k) + K.\end{align*}\]

When a modern reader encounters a series like this, the question of convergence naturally arises; to us, the series makes sense only if the terms \(f(x_0 - k\omega)\) go to zero fast enough for convergence. To Arbogast, Servois, and their contemporaries, such a series was generally viewed as a formal expression, to be manipulated by the rules of algebra without concern for convergence. Servois was, however, quite concerned with questions of convergence and divergence when they arose in relation to numerical aspects of series. In [Servois 1817, pp. 103–106], he even criticized the practice, employed by some mathematicians of the period, of using divergent series for numerical approximation.
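With a convergent choice of \(f\), the reindexing in the display above can in fact be spot-checked numerically. Here is a small sketch (our illustration, not Servois'), taking \(f(x) = 2^x\), \(\omega = 1\), and \(x_0 = 0\), so that \(f(x_0 - k\omega) = 2^{-k} \to 0\):

```python
# Check the index shift:  sum_{k=1}^{N} f(x - k)  splits into
#   sum_{k=1}^{N-n} f(x0 - k)   plus   sum_{k=0}^{n-1} f(x0 + k),
# for x = x0 + n*omega with omega = 1.  (Illustrative choice f(x) = 2**x.)

def f(x):
    return 2.0 ** x

x0, n, N = 0.0, 5, 60
x = x0 + n                       # omega = 1

lhs = sum(f(x - k) for k in range(1, N + 1))
tail = sum(f(x0 - k) for k in range(1, N - n + 1))
finite = sum(f(x0 + k) for k in range(n))

print(abs(lhs - (tail + finite)))   # tiny (rounding only)
```

Truncating both sides at matching points leaves only rounding error, since the two sides contain exactly the same terms, merely reindexed.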

Where did Servois actually use an expression like \(\sum y\)? It turns up in the second part of his equation (9), where we have the expression \(\sum y - \sum v\). But because \(v=f(x_0)\), we have

\[\sum y - \sum v = \left[\sum^{\infty}_{k=1} f(x_0 - k\omega) + \sum^{n-1}_{k=0} f(x_k) + K\right] - \left[\sum^{\infty}_{k=1} f(x_0 - k\omega) + K\right].\]

Formally subtracting the infinite sum from itself, we are left with the finite sum

\[\sum^{n-1}_{k=0} f(x_k).\]
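To make the cancellation concrete (again with the illustrative choices \(f(x) = 2^x\), \(\omega = 1\), \(x_0 = 0\), \(n = 5\)), we can truncate the two formal series at the same point and subtract:

```python
# Truncated versions of  Sigma y  and  Sigma v  differ by the finite sum
#   sum_{k=0}^{n-1} f(x_k),   x_k = x0 + k*omega.
# Illustrative choices: f(x) = 2**x, omega = 1, x0 = 0, n = 5.

def f(x):
    return 2.0 ** x

x0, n, N = 0.0, 5, 60

# Sigma y truncated: tail series at x0 plus the finite sum (K cancels, so it
# is omitted on both sides).
sigma_y = sum(f(x0 - k) for k in range(1, N + 1)) + sum(f(x0 + k) for k in range(n))
# Sigma v truncated: the same tail series at x0.
sigma_v = sum(f(x0 - k) for k in range(1, N + 1))

finite = sum(f(x0 + k) for k in range(n))   # 1 + 2 + 4 + 8 + 16 = 31
print(sigma_y - sigma_v)                    # 31.0
```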

In an analogous fashion, we must understand Servois' use of the notation \(\int y \,dx\) and \(\int v \, da\) to mean the improper integrals

\[\int_{-\infty}^x F(t)\,dt \quad \mbox{and} \quad \int_{-\infty}^a F(t)\,dt,\]

again ignoring questions of convergence. Formal subtraction in the expression for \(Z\) in Servois' equation (9) gives

\[Z = \int_{-\infty}^x F(t)\,dt - \int_{-\infty}^a F(t)\,dt = \int^x_a F(t) \,dt,\]
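As a sanity check with a convergent integrand (our choice of \(F\), purely illustrative), take \(F(t) = e^t\), so that \(\int_{-\infty}^{x} e^t\,dt = e^x\); the formal subtraction then really does yield the definite integral:

```python
import math

# For F(t) = exp(t) the improper integrals converge:
#   int_{-inf}^{x} exp(t) dt = exp(x).
# Subtracting the two improper integrals should give the definite integral
#   int_a^x exp(t) dt = exp(x) - exp(a),
# which we also estimate directly by a midpoint Riemann sum.

def improper(x):
    return math.exp(x)          # exact value of int_{-inf}^{x} exp(t) dt

a, x = 0.0, 2.0
Z = improper(x) - improper(a)

# Midpoint-rule estimate of int_a^x exp(t) dt for comparison.
m = 100_000
h = (x - a) / m
riemann = sum(math.exp(a + (i + 0.5) * h) for i in range(m)) * h

print(Z)         # e^2 - 1, approximately 6.389056
print(riemann)   # agrees to many decimal places
```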

an ordinary definite integral. We note that Servois did not use the modern notation for definite integrals. In a different place in the paper, on page [103], he used the notation

\[\int Fx\, dx \left\{\frac{x}{a}\right\}\]

to stand for the expression that we would write today as

\[\int^x_a F(t)\,dt.\]