Title: Mean Value Theorem
Post by KeyBlader01 on Oct 24th, 2008, 7:50pm

Assume that f is a differentiable function such that f(0) = f'(0) = 0. Show by example that it is not necessarily true that f(x) = 0 for all x. Then find the flaw in the following bogus "proof": using the Mean Value Theorem with a = x and b = 0, we have f'(c) = (f(x) - f(0)) / (x - 0). Since f(0) = 0 and f'(c) = 0, we have 0 = f(x)/x, so that f(x) = 0.

For the first part, we did that in class: we picked f(x) = x^2, for which f'(x) = 2x, so f(0) = f'(0) = 0 but f(x) =/= 0 for x =/= 0, which disproves the claim.

For the second part, our professor said there is something wrong with the "proof". Which piece of information does the proof use incorrectly? I'm confused about what the problem is asking. Can someone explain the problem from the beginning and go through it in steps? Thanks a lot!
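In case it helps to see the first part written out, here is a short worked check in LaTeX of the counterexample f(x) = x^2 quoted above (the test point x = 1 is an arbitrary nonzero choice, used only to exhibit one value where f is not zero):

\begin{align*}
f(x) &= x^2, \qquad f'(x) = 2x, \\
f(0) &= 0^2 = 0, \qquad f'(0) = 2 \cdot 0 = 0, \\
f(1) &= 1^2 = 1 \neq 0.
\end{align*}

So the hypotheses f(0) = f'(0) = 0 hold, yet f is not identically zero.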
Title: Re: Mean Value Theorem
Post by towr on Oct 25th, 2008, 5:22am

Where does f'(c) = 0 come from? It's plainly not true for f(x) = x^2 and c in the open interval (0, x).
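To make towr's point concrete (a sketch only, reusing f(x) = x^2 and taking x > 0 for definiteness): the Mean Value Theorem on [0, x] produces some c strictly between 0 and x, and for this particular f that c can be computed explicitly:

\begin{align*}
f'(c) &= \frac{f(x) - f(0)}{x - 0} = \frac{x^2 - 0}{x} = x, \\
2c &= x \quad\Longrightarrow\quad c = \frac{x}{2} \in (0, x), \\
f'(c) &= 2 \cdot \frac{x}{2} = x \neq 0.
\end{align*}

The theorem only guarantees that f'(c) equals the difference quotient for some interior point c; the hypothesis f'(0) = 0 says nothing about f'(c) at that point, which is where the bogus proof goes wrong.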