I left out one line at the beginning, which was pointed out in class.

We start with some function f(t) which is continuous on [a,b]. We define:

F(x)=\int_a^x f(t)dt

By the fundamental theorem of calculus, this is continuous on [a,b] and differentiable on (a,b).

Now we use the mean value theorem for derivatives (somewhere in an interval, a function has a gradient equal to its average gradient over that interval), which says that if g is continuous on [a,b] and differentiable on (a,b), then there exists some c between a and b such that g'(c)=\frac{g(b)-g(a)}{b-a}.
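To make this concrete, here is a small numeric sketch of the mean value theorem for derivatives. The function g(x)=x^3 on [1,3] is my own illustrative choice, not from the post: the average gradient is (27-1)/2=13, and solving g'(c)=3c^2=13 gives a c that does indeed lie in (1,3).

```python
import math

# Illustrative example (not from the post): g(x) = x**3 on [a, b] = [1, 3].
def g(x):
    return x**3

def g_prime(x):
    return 3 * x**2

a, b = 1.0, 3.0

# Average gradient of g over [a, b]: (g(b) - g(a)) / (b - a) = (27 - 1) / 2 = 13.
avg_gradient = (g(b) - g(a)) / (b - a)

# Solve g'(c) = 3c^2 = avg_gradient for c > 0.
c = math.sqrt(avg_gradient / 3)

print(avg_gradient)                            # 13.0
print(a < c < b)                               # True: c is inside (1, 3)
print(math.isclose(g_prime(c), avg_gradient))  # True: gradient at c matches
```

The theorem only promises that such a c exists; here we can solve for it explicitly because g' is simple.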

Now, simply applying the mean value theorem for derivatives to our function F(x), we have that:

F'(c)=\frac{F(b)-F(a)}{b-a}

But we know from the fundamental theorem of calculus that F'(c)=f(c). We have also defined F(x) above, so we can plug in a and b to get:

f(c)=\frac{\int_a^b f(t) dt-\int_a^a f(t) dt}{b-a}

The second term in the numerator is zero, as its two limits of integration are the same, so we have:

f(c)=\frac{\int_a^b f(t) dt}{b-a}

But we know that \int_a^b f(t) dt=\int_a^b f(x) dx as t and x are just dummy variables, and so we have proved that:

f(c)=\frac{\int_a^b f(x) dx}{b-a}

This finishes the proof.
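The result just proved (the mean value theorem for integrals) can be checked numerically. The choice f(x)=x^2 on [0,1] is my own illustrative example: the average value of f is \int_0^1 x^2 dx = 1/3, and the c satisfying f(c)=1/3 is 1/\sqrt{3}, which lies in (0,1).

```python
import math

# Illustrative example (not from the post): f(x) = x**2 on [a, b] = [0, 1].
def f(x):
    return x**2

a, b = 0.0, 1.0
n = 100_000
h = (b - a) / n

# Midpoint Riemann sum approximating the integral of f over [a, b].
integral = sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Average value of f on [a, b]; exactly 1/3 for this f.
average_value = integral / (b - a)

# Solve f(c) = c^2 = average_value for c > 0.
c = math.sqrt(average_value)

print(average_value)  # ~0.3333
print(c)              # ~0.5774, i.e. about 1/sqrt(3), inside (0, 1)
```

As with the derivative version, the theorem guarantees such a c exists; here f is monotone on [0,1], so we can solve for it directly.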
