Find δ(a; ε) for the given f and J.
f(x) = lnx, J = (0,∞)
Let ε > 0; I want to find δ such that if x, y belong to J and |x - y| < δ, then |f(x) - f(y)| < ε.
So |f(x) - f(y)| = |ln(x) - ln(y)|, and we want this to be less than ε. From here I'm not sure the next step is allowed, but I did it anyway:
e^|ln(x) - ln(y)| = |x - y| < e^(ε). (Not sure if I can do this step!)
So we want δ = e^(ε).
Then I just restate this formally:
Let ε > 0 and δ = e^(ε), so that if x, y belong to J and |x - y| < δ, then |f(x) - f(y)| < ε.
Does this seem right?
Answers & Comments
Verified answer
A couple things:
(1) f(x) = ln(x)
is _not_ uniformly continuous on J = (0, ∞).
(2) What is the 'a' in δ(a; ε)?
(3) This step:
e^| ln(x) - ln(y) | = | x - y |
is false. It should be
e^| ln(x) - ln(y) | = max{ x / y , y / x }.
Although I don't know how this would help you...
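Since the problem writes δ(a; ε) with the point a as an argument, it is presumably asking for continuity at a fixed a ∈ J rather than uniform continuity on J. Here is a sketch of how that computation might go (my reading of the exercise, not the thread's own solution):

```latex
|\ln x - \ln a| < \varepsilon
  \iff -\varepsilon < \ln\tfrac{x}{a} < \varepsilon
  \iff a e^{-\varepsilon} < x < a e^{\varepsilon},
```

so it suffices to take δ(a; ε) = min{ a - a e^(-ε), a e^(ε) - a } = a(1 - e^(-ε)), using the fact that 1 - e^(-ε) < e^(ε) - 1 for ε > 0. Note that this δ shrinks to 0 as a → 0+, which is consistent with point (1): no single δ works on all of (0, ∞).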
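A quick numerical sanity check of the corrected identity in point (3), plus a spot-check of one candidate pointwise δ. The formula delta(a, eps) = a(1 - e^(-ε)) is my own assumption for what the exercise wants, not something stated in the thread:

```python
import math

# Point (3): the correct identity is
#   e^{|ln x - ln y|} = max(x/y, y/x),  NOT  |x - y|.
def check_identity(x, y):
    lhs = math.exp(abs(math.log(x) - math.log(y)))
    rhs = max(x / y, y / x)
    return math.isclose(lhs, rhs)

# Candidate delta for continuity of ln at a point a > 0
# (an assumption, not from the thread).
def delta(a, eps):
    return a * (1 - math.exp(-eps))

# Sample points x in (a - delta, a + delta) and confirm
# |ln x - ln a| < eps at each of them.
def delta_works(a, eps, samples=1000):
    d = delta(a, eps)
    for i in range(1, samples):
        x = a - d + (2 * d) * i / samples
        if abs(math.log(x) - math.log(a)) >= eps:
            return False
    return True

print(check_identity(2.0, 5.0))  # True: both sides equal 2.5
print(delta_works(0.01, 0.5))    # True even very close to 0
print(delta_works(10.0, 1e-3))   # True for tiny eps
```

Note that for x = 2, y = 5 the poster's version would claim e^|ln 2 - ln 5| = 3, but the left side is actually 5/2, so the step marked "Not sure if I can do this step!" indeed fails.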