
Landau kernel #

Background #

The family of Landau kernels \(\{\ell_n(x)\}_{n \in \mathbb{N} } \) is defined by:

\[ \ell_n(x) = \begin{cases} c_n (1-x^2)^n & \text{if } |x| \leq 1 \\ 0 & \text{otherwise} \end{cases} \] where the constants \(c_n\) are chosen so that \(\int_{-1}^{1} \ell_n(x)\,dx = 1\). We have shown in class that if \(f\) is a continuous function, then \((f * \ell_n)_{n \in \mathbb{N}}\) is a sequence of polynomials that converges uniformly to \(f\).
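As a sanity check for the first step of the exercise below: a standard computation (for instance, via the substitution \(x = \sin\theta\) and the Wallis integrals) gives a closed form for the normalizing constants,

\[ \int_{-1}^{1} (1-x^2)^n \, dx = \frac{2^{2n+1}\,(n!)^2}{(2n+1)!}, \qquad\text{so}\qquad c_n = \frac{(2n+1)!}{2^{2n+1}\,(n!)^2}, \]

giving, for example, \(c_1 = \tfrac{3}{4}\) and \(c_2 = \tfrac{15}{16}\).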

Question: These polynomial approximations work well in theory. But how well do they work in practice?

Computing Landau estimates #

We will investigate whether using the approximations given by convolution with the Landau kernels is an efficient way of approximating continuous functions. Follow the given outline:

Homework exercise:

Choose a function \(f\) you would like to approximate. To make things easy for yourself, choose \(f\) to be a polynomial of small degree, say three, but make it mildly interesting. We will approximate \(f\) over the interval \([0,1]\).

  • First, compute the constants \(c_n\) for a few values of \(n\). Feel free to use your favorite app to do so; for instance, Mathematica or SageMath.
  • Now compute \(f*\ell_n\) for a few values of \(n\). Plot \(f*\ell_n\) alongside \(f\) and assess the accuracy of the estimates you obtain. Do the estimates converge quickly to \(f\)? Again, you may use an app of your choosing, such as Mathematica; a sketch of code covering both steps follows this list.
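The original Mathematica sample isn't reproduced here, so below is a minimal stand-in in Python using SymPy and Matplotlib. The particular cubic `f`, the values of `n`, and the plotting choices are illustrative assumptions, not part of the assignment. Note that for \(x, t \in [0,1]\) we have \(|x - t| \leq 1\), so the kernel's cutoff never activates and each \(f * \ell_n\) can be computed symbolically as an exact polynomial in \(x\).

```python
import matplotlib.pyplot as plt
import numpy as np
import sympy as sp

x, t = sp.symbols("x t")

# Illustrative choice of target (an assumption): a cubic vanishing at 0 and 1,
# so its extension by zero to the whole line stays continuous.
f = t * (1 - t) * (t - sp.Rational(1, 2))

def landau_constant(n):
    """c_n = 1 / integral of (1 - x^2)^n over [-1, 1]."""
    return 1 / sp.integrate((1 - x**2) ** n, (x, -1, 1))

def landau_approx(f_expr, n):
    """(f * ell_n)(x) on [0, 1], returned as an exact polynomial in x.

    For x and t in [0, 1] we have |x - t| <= 1, so the kernel's cutoff never
    activates and we may integrate c_n * (1 - (x - t)^2)^n directly.
    """
    kernel = landau_constant(n) * (1 - (x - t) ** 2) ** n
    return sp.expand(sp.integrate(f_expr * kernel, (t, 0, 1)))

# Sanity check against the closed form above: c_1 = 3/4, c_2 = 15/16.
for n in (1, 2, 3):
    print(f"c_{n} =", landau_constant(n))

# Compare f with f * ell_n for a few values of n.
xs = np.linspace(0, 1, 200)
plt.plot(xs, sp.lambdify(t, f)(xs), "k", label="f")
for n in (2, 8, 32):
    p = sp.lambdify(x, landau_approx(f, n))
    plt.plot(xs, p(xs), "--", label=f"n = {n}")
plt.legend()
plt.xlabel("x")
plt.show()
```

SymPy is used deliberately here: it makes each approximant an exact polynomial, which is the whole point of the theorem; numerical quadrature would produce the same plots but would obscure that fact.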

Briefly write up your observations and conclusions, supported by the plots you produced above.