Statement of problem № m53451

Construct a sequence of interpolating values $y_n$ to $f(1 + \sqrt{10})$, where $f(x) = (1 + x^2)^{-1}$ for $-5 \le x \le 5$, as follows: for each $n = 1, 2, \ldots, 10$, let $h = 10/n$ and $y_n = P_n(1 + \sqrt{10})$, where $P_n(x)$ is the interpolating polynomial for $f(x)$ at the nodes $x_0^{(n)}, x_1^{(n)}, \ldots, x_n^{(n)}$ with $x_j^{(n)} = -5 + jh$ for each $j = 0, 1, \ldots, n$. Does the sequence $\{y_n\}$ appear to converge to $f(1 + \sqrt{10})$?
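A minimal sketch of the computation in Python, not part of the original problem (the names `f`, `lagrange_eval`, and `x_star` are illustrative): for each $n$ it builds the equally spaced nodes $x_j^{(n)} = -5 + jh$, evaluates the degree-$n$ interpolating polynomial in Lagrange form at $1 + \sqrt{10}$, and prints the resulting sequence $y_n$ alongside its deviation from $f(1 + \sqrt{10})$.

```python
import math

def f(x):
    """Runge's function f(x) = 1 / (1 + x^2)."""
    return 1.0 / (1.0 + x * x)

def lagrange_eval(nodes, values, x):
    """Evaluate the interpolating polynomial through (nodes[j], values[j])
    at the point x, using the Lagrange form."""
    total = 0.0
    for j, xj in enumerate(nodes):
        # Basis polynomial L_j(x) = prod_{k != j} (x - x_k) / (x_j - x_k)
        Lj = 1.0
        for k, xk in enumerate(nodes):
            if k != j:
                Lj *= (x - xk) / (xj - xk)
        total += values[j] * Lj
    return total

x_star = 1.0 + math.sqrt(10.0)   # evaluation point, about 4.1623
exact = f(x_star)                # f(1 + sqrt(10)), about 0.0546

for n in range(1, 11):
    h = 10.0 / n
    nodes = [-5.0 + j * h for j in range(n + 1)]   # x_j^(n) = -5 + j*h
    y_n = lagrange_eval(nodes, [f(x) for x in nodes], x_star)
    print(f"n = {n:2d}:  y_n = {y_n: .6f}   y_n - exact = {y_n - exact: .6f}")
```

Since $f$ is Runge's function and the nodes are equally spaced, the interpolation error near the endpoints of $[-5, 5]$ grows with $n$, and $1 + \sqrt{10} \approx 4.16$ lies in that endpoint region; one should therefore expect the printed $y_n$ values to oscillate rather than converge to $f(1 + \sqrt{10})$.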



