Asymptotics and generating functions

(Warning, equation-heavy post!)

A while ago I stumbled upon the following problem (while analyzing a certain distributed exploration algorithm):

We have n groups of agents, each of equal size (let's say 1). The positions of the groups are indexed: 1, 2,\ldots,n. Groups "diffuse", and they diffuse at different speeds depending on the position of the group:
\lambda_i = \frac{1}{i}
(where \lambda_i is the number of agents disappearing per unit of time from the group at position i)

Whenever a group's size is reduced to 0 (it will always be the first group), every other group is shifted down:
2 \to 1, 3 \to 2, \ldots, n \to (n-1)
[0,a_2,a_3,\ldots,a_n] \to [a_2,a_3,\ldots,a_n]

Question: how long does it take to dissipate all the groups? (We are not interested in an exact formula, just an asymptotic one.)

For example, if n=3:
[1,1,1] \to [0,\frac{1}{2},\frac{2}{3}]
(1 unit of time)
[\frac{1}{2},\frac{2}{3}] \to [0,\frac{5}{12}]
(\frac{1}{2} units of time)
[\frac{5}{12}] \to [0]
(\frac{5}{12} units of time)
and the answer is 1 + \frac{1}{2} + \frac{5}{12} = \frac{23}{12}
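
Before going further, a quick sanity check. Here is a minimal Python sketch (the function name total_time is mine, not part of the original problem) that simulates the process exactly, using the fact that between two consecutive dissipations every group size decreases linearly:

from fractions import Fraction

def total_time(n):
    # group sizes; the group at list index i sits at position i + 1, so it loses mass at rate 1/(i + 1)
    a = [Fraction(1)] * n
    total = Fraction(0)
    while a:
        dt = a[0]  # the front group has rate 1, so it vanishes after exactly a[0] units of time
        total += dt
        # the remaining groups lose dt/(i + 1) agents each, then everything shifts one position down
        a = [a[i] - dt / (i + 1) for i in range(1, len(a))]
    return total

print(total_time(3))  # prints 23/12, matching the computation above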

A good observation is that instead of operating on finite size lists, we can start with infinite one:
[1,1,1,1,\ldots] \to [0,\frac{1}{2},\frac{2}{3},\frac{3}{4},\ldots]
[\frac{1}{2},\frac{2}{3},\frac{3}{4},\ldots] \to [0,\frac{5}{12},\frac{7}{12},\ldots]
[\frac{5}{12},\frac{7}{12},\ldots] \to \ldots

If we name the leading terms of the consecutive steps:
t_0 = 1,\ t_1 = \frac{1}{2},\ t_2 = \frac{5}{12},\ldots
(so t_i is the time between the total dissipation of the i-th and the (i+1)-th group, with t_0 the time until the first group dissipates), then, since the group dissipating at the end of t_k starts with size 1 and spends the interval t_i at position k+1-i, losing mass at rate \frac{1}{k+1-i}:
t_0 = 1
t_1 = 1 - \frac{1}{2}\cdot t_0
t_2 = 1 - \frac{1}{3}\cdot t_0 - \frac{1}{2}\cdot t_1
t_3 = 1 - \frac{1}{4}\cdot t_0 - \frac{1}{3}\cdot t_1 - \frac{1}{2}\cdot t_2
\ldots

The above can be rewritten as:
t_0 = 1
\frac{1}{2}\cdot t_0 + t_1 = 1
\frac{1}{3}\cdot t_0 + \frac{1}{2}\cdot t_1 + t_2 = 1
\frac{1}{4}\cdot t_0 + \frac{1}{3}\cdot t_1 + \frac{1}{2}\cdot t_2 + t_3 = 1
and, in general,
\sum_{i=1}^{n} \frac{1}{i}\cdot t_{n-i} = 1
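
Solving each equation for its leading term gives a direct way to compute the t_n; a minimal sketch (the helper name t_sequence is my own):

from fractions import Fraction

def t_sequence(N):
    # t[m] = 1 - sum_{j=1}^{m} t[m-j]/(j+1): the general equation above, solved for the leading term
    t = [Fraction(1)]
    for m in range(1, N):
        t.append(1 - sum(t[m - j] / (j + 1) for j in range(1, m + 1)))
    return t

for x in t_sequence(4):
    print(x)  # 1, 1/2, 5/12, 3/8: the first three match the values computed by hand above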

With a little bit of guessing and numerology, I found the following solution: t_m = \Theta\!\left(\frac{1}{\log m}\right). To check it, substitute this guess into the left-hand side of the recurrence (the first two terms, involving t_0 and t_1, contribute only O(\frac{1}{n}) and are dropped) and split the sum at i = n/2:
\sum_{i=1}^{n-2} \frac{1}{i}\cdot \frac{1}{\log (n-i)} = \left(\frac{1}{1} \cdot \frac{1}{\log (n-1)} + \frac{1}{2} \cdot \frac{1}{\log (n-2)} + \ldots + \frac{1}{n/2}\cdot \frac{1}{\log (n/2)}\right) + \left(\frac{1}{n/2+1} \cdot \frac{1}{\log (n/2-1)} + \ldots + \frac{1}{n-2} \cdot \frac{1}{\log 2}\right)
= \left(\frac{1}{1} + \ldots + \frac{1}{n/2}\right) \cdot \Theta\!\left(\frac{1}{\log n}\right) + n \cdot \Theta\!\left(\frac{1}{n}\right)\cdot O(1)
= \Theta(\log n) \cdot \Theta\!\left(\frac{1}{\log n}\right) + O(1) = \Theta(1)

So, for n > 1, t_n = \Theta\!\left(\frac{1}{\log n}\right),
and our answer is:
\sum_{i=0}^{n-1} t_i = \Theta\!\left(\frac{n}{\log n}\right)
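
A quick numerical check of both claims (a sketch only; the printed ratios are expected to stay bounded, not to converge to a particular constant):

import math

N = 4000
t = [1.0]
for m in range(1, N):
    # same recurrence as before, in floating point: t_m = 1 - sum_{j>=1} t_{m-j}/(j+1)
    t.append(1.0 - sum(t[m - j] / (j + 1) for j in range(1, m + 1)))

print(t[N - 1] * math.log(N - 1))  # bounded, consistent with t_n = Theta(1/log n)
print(sum(t) * math.log(N) / N)    # bounded, consistent with sum t_i = Theta(n/log n)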

But I dislike being forced to guess…

The operation we performed above closely resembles multiplication of polynomials.
If we write the generating functions:
T(X) = t_0 + t_1 X + t_2 X^2 + t_3 X^3 + \ldots
\Lambda(X) = \lambda_1 + \lambda_2 X + \lambda_3 X^2 + \ldots = 1 + \frac{1}{2} X + \frac{1}{3} X^2 + \ldots
then, not surprisingly:
T(X)\cdot \Lambda(X) = t_0 + (\frac{1}{2}\cdot t_0 + t_1)X + (\frac{1}{3}\cdot t_0 + \frac{1}{2}\cdot t_1 + t_2)X^2 + \ldots
And bam! We got our terms here.
So our original constraints can be written as:
T(X) \cdot (1 + \frac{1}{2} X + \frac{1}{3} X^2 + \ldots) = 1 + X + X^2 + X^3 + \ldots

But:
1+X+X^2+X^3+\ldots = \frac{1}{1-X}
and by integrating both sides:
X + \frac{1}{2}X^2 + \frac{1}{3}X^3 + \ldots = - \ln (1-X)

so our constraint is:
T(X) \cdot \frac{- \ln(1-X)}{X} = \frac{1}{1-X}

so our original problem reduces to finding the asymptotics of the n-th term of:
T(X) = \frac{X}{(1-X)(- \ln(1-X))}
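
As a cross-check (again just a sketch, with my own helper name), one can extract the Taylor coefficients of this function directly: invert the power series -\ln(1-X)/X = \sum_k \frac{X^k}{k+1} term by term, then multiply by \frac{1}{1-X}, which amounts to taking prefix sums. The result reproduces the same t_n as the recurrence:

def series_coeffs(N):
    # lam[k] = 1/(k+1): coefficients of -ln(1-X)/X
    lam = [1.0 / (k + 1) for k in range(N)]
    # invert the series: inv satisfies sum_j lam[j] * inv[k-j] = (1 if k == 0 else 0); lam[0] = 1, so no division needed
    inv = [1.0]
    for k in range(1, N):
        inv.append(-sum(lam[j] * inv[k - j] for j in range(1, k + 1)))
    # multiplying by 1/(1-X) turns the coefficients into prefix sums
    t, acc = [], 0.0
    for c in inv:
        acc += c
        t.append(acc)
    return t

print(series_coeffs(4))  # [1.0, 0.5, 0.41666..., 0.375], the same values as before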

Is there a theory connecting the asymptotics of a sequence's terms with its generating function?

What we found is that:
\sum_i X^i\cdot \Theta(\frac{1}{\log i}) \sim \frac{1}{(1-X)(- \ln(1-X))}
(we can drop the X from the numerator, as it is just a shift by one)

What if I want a closed formula for, let's say:
\sum_i X^i \cdot \Theta\!\left(\frac{1}{\log^2 i}\right)
(by writing \Theta(\cdot) I mean that the exact sum probably has no finite representation
as an elementary function, so we allow the terms to change by a bounded factor to make the resulting function "nice"),
or the asymptotic behavior of the terms of
\frac{1}{\sqrt{1-X}} ?
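
For this last example, at least, the answer is classical: by the generalized binomial theorem and Stirling's formula,
\frac{1}{\sqrt{1-X}} = \sum_{n \ge 0} \binom{2n}{n}\frac{X^n}{4^n}, \qquad \binom{2n}{n}\frac{1}{4^n} \sim \frac{1}{\sqrt{\pi n}}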

So, I would appreciate any help with this problem.


One response to "Asymptotics and generating functions"

  1. “Is there a theory connecting the asymptotics of a sequence's terms with its generating function?”
    You might be interested in this: http://en.wikipedia.org/wiki/Analytic_combinatorics
    The theory has to do with using generating functions and complex analysis to prove asymptotic statements in combinatorics.
    And there is also a two-part class on Coursera on this.
