I never spent much time thinking about the quadratic variation of a random process when I was first studying stochastic calculus. It turns out that it sits right at the heart of the definition of the Ito integral, the workhorse of stochastic calculus. A consequence of my ignorance was that when it came time to use a quadratic variation argument (say, for instance, in a final exam) I was never able to make my solution very convincing.

So recently I decided I would sit down and try to understand exactly what quadratic variation is, why it is so important, and when is the correct time to use it.

In order to talk about the quadratic variation of a random process we need a probability space (a sample space $\Omega$, a set of possible events $\mathcal{F}$, and a method $\mathbb{P}$ for assigning probabilities to the events) and an index $t \ge 0$ that keeps track of time. Let $X_t$ be a real-valued stochastic process. Now, every stochastic process has a property called its **quadratic variation**, defined by the following quantity:

$$[X]_t = \lim_{\|P\| \to 0} \sum_{k=1}^{n} \left(X_{t_k} - X_{t_{k-1}}\right)^2.$$

There’s quite a bit going on in this definition. There is a partition of time, a mesh going to zero, and a sum of squared differences. Another interesting remark that is often overlooked is that the quadratic variation is actually itself a stochastic process: it has a value for each point in time (note the time subscript $t$).

In order to make sense of this definition, and to see that the quadratic variation is a stochastic process, the time index must first be **partitioned** via $P = \{t_0, t_1, \ldots, t_n\}$, where $t_0 = 0$ and $t_n = t$, such that time is partitioned like this:

$$0 = t_0 < t_1 < t_2 < \cdots < t_{n-1} < t_n = t;$$

in other words, we chop up the time interval $[0, t]$ into little pieces. The size of the largest little piece is called the **mesh**, $\|P\|$, and it is defined by

$$\|P\| = \max_{1 \le k \le n} \left(t_k - t_{k-1}\right).$$

If we specify, say, $n = 2$ and calculate the quadratic variation sum at time $t$ we get

$$\left(X_{t_1} - X_{t_0}\right)^2 + \left(X_{t_2} - X_{t_1}\right)^2,$$

and as we increase the number of partition points to $n = 3, 4, 5, \ldots$ we get more and more terms in the sum.
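To make the partition, mesh, and sum of squared differences concrete, here is a minimal numerical sketch. Everything in it is an arbitrary choice for illustration: the seed, the fine grid of 4096 steps standing in for a "true" path of $X_t$ (simulated as a Gaussian random walk), and the coarse partition sizes.

```python
import numpy as np

rng = np.random.default_rng(42)

# A simulated sample path of a random process on a fine grid over [0, t].
t, fine_n = 1.0, 2**12
fine_times = np.linspace(0.0, t, fine_n + 1)
steps = rng.normal(0.0, np.sqrt(t / fine_n), fine_n)
path = np.concatenate([[0.0], np.cumsum(steps)])

# Refine the partition: n = 2, 4, 8, 16 subintervals, each a sub-grid of the fine grid.
for n in [2, 4, 8, 16]:
    idx = np.linspace(0, fine_n, n + 1, dtype=int)     # coarse partition indices
    mesh = float(np.max(np.diff(fine_times[idx])))     # largest subinterval length
    qv = float(np.sum(np.diff(path[idx]) ** 2))        # sum of squared differences
    print(f"n={n:3d}  mesh={mesh:.4f}  sum of squared increments={qv:.4f}")
```

As `n` grows the mesh shrinks toward zero, and the printed sums are the finite approximations whose limit defines the quadratic variation.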

### Wiener Process

Things get interesting when we make the assumption that the random process is a Wiener process $W_t$. Wiener processes have a number of special characteristics, namely that the value at time zero is always zero, $W_0 = 0$; the paths are continuous (almost surely); and the increments are independent and normally distributed with mean zero and variance equal to the elapsed time,

$$W_t - W_s \sim \mathcal{N}(0,\, t - s), \qquad 0 \le s \le t.$$

Since, by definition of a Wiener process, the variance of the increment $W_{t_k} - W_{t_{k-1}}$ is just $t_k - t_{k-1}$, we have

$$\operatorname{Var}\left(W_{t_k} - W_{t_{k-1}}\right) = t_k - t_{k-1}.$$

But there is a well-known relationship between variances and expectations, $\operatorname{Var}(X) = \mathbb{E}[X^2] - \left(\mathbb{E}[X]\right)^2$, and since the increments have mean zero, we can write this equation like this:

$$\mathbb{E}\left[\left(W_{t_k} - W_{t_{k-1}}\right)^2\right] = t_k - t_{k-1}.$$

With this fact in mind let’s replace the squared difference in the expectation with the *sum* of squared differences and introduce $\Delta W_k = W_{t_k} - W_{t_{k-1}}$ and $\Delta t_k = t_k - t_{k-1}$ to make a notational simplification:

$$\mathbb{E}\left[\sum_{k=1}^{n} \left(\Delta W_k\right)^2\right],$$

and the summation just flows through the expectation (by linearity) and pops out at the end:

$$\mathbb{E}\left[\sum_{k=1}^{n} \left(\Delta W_k\right)^2\right] = \sum_{k=1}^{n} \mathbb{E}\left[\left(\Delta W_k\right)^2\right] = \sum_{k=1}^{n} \Delta t_k.$$

Now wait a minute: if you sum up all these time differences you just get the final value of time, since the first time value is $t_0 = 0$ and all the in-between values cancel out, telescope style. Therefore we have the result:

$$\sum_{k=1}^{n} \Delta t_k = \left(t_1 - t_0\right) + \left(t_2 - t_1\right) + \cdots + \left(t_n - t_{n-1}\right) = t_n - t_0 = t,$$

and so the expected value of the sum of squared increments equals $t$ for *every* partition. Passing to the limit of vanishing mesh (the convergence actually holds in the $L^2$ sense, which we won’t prove here) gives

$$[W]_t = t.$$

We therefore say that Brownian motion accumulates quadratic variation at a rate of 1 per unit of time.
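A quick simulation makes this tangible. This is a minimal sketch with NumPy; the horizon $t = 2$, the step count, and the seed are all arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the increments of a Wiener process on a fine partition of [0, t].
t, n = 2.0, 1_000_000
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), n)   # independent N(0, dt) increments

qv = float(np.sum(dW**2))              # sum of squared increments
print(qv)                              # close to t = 2.0 for this fine partition
```

The sum of a million squared increments lands essentially on $t$, illustrating that Brownian motion accumulates quadratic variation at rate 1 per unit of time.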

### When do we use it?

The quadratic variation of a Wiener process, $[W]_t = t$, is used extensively throughout stochastic calculus. Of course, as my own experience shows, it is possible to use an Ito integral without ever remembering that you are in fact taking advantage of the quadratic variation property of the underlying Wiener process.

The **Ito integral** of a (left-continuous) stochastic process $X_t$ is defined as

$$\int_0^t X_s \, dW_s = \lim_{n \to \infty} \sum_{k=1}^{n} X_{t_{k-1}} \left(W_{t_k} - W_{t_{k-1}}\right),$$

where the integrand is always evaluated at the *left* endpoint of each subinterval.
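As a sanity check on this definition, one can approximate $\int_0^t W_s \, dW_s$ (taking $X_t = W_t$) with a left-endpoint sum and compare it against the known closed form $(W_t^2 - t)/2$. This is just a sketch; the seed and grid size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

t, n = 1.0, 500_000
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), n)         # increments of the Wiener process
W = np.concatenate([[0.0], np.cumsum(dW)])   # W at the partition points, W[0] = 0

# Ito sum: the integrand is evaluated at the *left* endpoint of each subinterval.
ito_sum = float(np.sum(W[:-1] * dW))

exact = (W[-1] ** 2 - t) / 2                 # known value of the Ito integral
print(ito_sum, exact)                        # the two agree closely
```

The extra $-t/2$ in the closed form, compared with ordinary calculus, is exactly the quadratic variation of $W$ making itself felt.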

We won’t go into details in this article about how to use this integral, but rather point out how much it resembles the definition of quadratic variation of a Wiener process. If we replace the random process $X_t$ with the constant unit process $1$ and, purely formally, square the increments on both sides, then we get

$$\int_0^t \left(dW_s\right)^2 = \lim_{n \to \infty} \sum_{k=1}^{n} \left(W_{t_k} - W_{t_{k-1}}\right)^2.$$

But we’ve already figured out that the right-hand side of this equation is just equal to $[W]_t = t$, so we have this identity:

$$\int_0^t \left(dW_s\right)^2 = t.$$

Differentiating both sides with respect to time gives another important stochastic identity:

$$\left(dW_t\right)^2 = dt.$$

### The Quadratic Variation of Time

Time too has a quadratic variation. Imagine the time interval $[0, t]$ and its partition into $n$ equal subintervals as described above. Then, by definition of quadratic variation, we must examine

$$\lim_{n \to \infty} \sum_{k=1}^{n} \left(t_k - t_{k-1}\right)^2.$$

Since time is determined by the start and end points, and further by the fact that we’ve partitioned it into $n$ equal subintervals, each time increment must have length equal to $t/n$. For example, if the interval were $[0, 1]$ and we set $n = 4$, then each increment would have length $1/4$. Therefore, in the above summation we simply replace the time increment by this determined value:

$$\sum_{k=1}^{n} \left(\frac{t}{n}\right)^2.$$

Now, since the summand no longer depends on $k$, the sum simply multiplies it by $n$ and we are left with

$$\sum_{k=1}^{n} \left(\frac{t}{n}\right)^2 = n \cdot \frac{t^2}{n^2} = \frac{t^2}{n},$$

which, as $n$ approaches infinity, becomes zero:

$$\lim_{n \to \infty} \frac{t^2}{n} = 0.$$
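The shrinking sum is easy to verify directly; this trivial check uses arbitrary choices of $t = 1$ and a few values of $n$.

```python
# Sum of n identical squared time increments (t/n)**2 collapses to t**2 / n.
t = 1.0
for n in [10, 100, 1_000, 10_000]:
    qv_time = sum((t / n) ** 2 for _ in range(n))
    print(n, qv_time)   # equals t**2 / n, heading to 0 as n grows
```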

We can now use the notation of the Ito integral to write

$$\int_0^t \left(ds\right)^2 = 0,$$

and by differentiating both sides with respect to time, obtain the result of this section:

$$\left(dt\right)^2 = 0.$$
