# Week 3 - Instance Based Learning

Last edited: 2025-12-05

Instance-based learning stores the training instances themselves and defers all generalisation until a query arrives, rather than building an explicit model up front.

# K-nearest neighbour

k-nearest neighbour answers a query by finding the $k$ training instances closest to it under some distance metric, then combining their labels — e.g. by majority vote for classification or averaging for regression.
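A minimal sketch of k-NN classification, assuming Euclidean distance and a majority vote (the function and variable names are illustrative, not from the notes):

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs; points are tuples of floats.
    """
    # Lazy learner: no work at "training" time, all work happens at query time.
    by_distance = sorted(train, key=lambda pl: math.dist(pl[0], query))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

points = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
          ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(points, (0.2, 0.1), k=3))  # → a
```

Note the naive sort makes each query $O(n \log n)$; the table in the next section shows how sorted 1-D data improves on this.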

# Run times

Instance-based learning is often called “lazy” learning because computation is deferred until query time. The table below assumes $A = B = \mathbb{R}$ (one-dimensional inputs and outputs) and that the training data is already sorted.

| Algorithm | Running time | Space |
| --- | --- | --- |
| 1-NN learning | $1$ | $n$ |
| 1-NN query | $\log(n)$ | $1$ |
| $k$-NN learning | $1$ | $n$ |
| $k$-NN query | $\log(n) + k$ | $1$ |
| Linear regression learning | $n$ | $1$ |
| Linear regression query | $1$ | $1$ |

Since the training data is sorted, a query is a binary search, which costs $\log(n)$ rather than $n\log(n)$ (the latter would be the cost of sorting from scratch).
Why is linear regression $n$ time to learn? Because fitting the line requires a pass over all $n$ training points to compute the coefficients; after that, a query is just evaluating the learned equation, which is constant time.
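A sketch of the $\log(n) + k$ query on sorted 1-D data: binary search locates the query, then two pointers expand outward to collect the $k$ nearest values (names are illustrative):

```python
import bisect

def knn_query_sorted(xs, q, k):
    """Return the k values nearest to q from an already-sorted list xs.

    Binary search costs O(log n); expanding outward costs O(k),
    giving the O(log n + k) query time from the table.
    """
    i = bisect.bisect_left(xs, q)        # O(log n): locate insertion point
    lo, hi = i - 1, i
    out = []
    while len(out) < k:                  # O(k): two-pointer expansion
        if lo < 0:
            out.append(xs[hi]); hi += 1
        elif hi >= len(xs):
            out.append(xs[lo]); lo -= 1
        elif q - xs[lo] <= xs[hi] - q:
            out.append(xs[lo]); lo -= 1
        else:
            out.append(xs[hi]); hi += 1
    return out

print(knn_query_sorted([1, 3, 4, 8, 15], q=5, k=3))  # → [4, 3, 8]
```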

# The curse of dimensionality

As the number of features (dimensions) grows, the amount of data you need to generalise accurately grows exponentially. Points become sparse in high-dimensional space, so every query's “nearest” neighbours get farther away and less representative.
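The exponential blow-up can be made concrete with a back-of-envelope coverage count (a hypothetical grid argument, not from the notes): to keep every query in $[0,1]^d$ within $0.1$ per axis of some stored point, a uniform grid needs $10^d$ points.

```python
def points_needed(spacing, d):
    """Grid points needed so every query in [0,1]^d is within `spacing`
    (per axis) of a stored point: coverage grows exponentially in d."""
    return int(1 / spacing) ** d

for d in (1, 2, 3, 10):
    print(f"d={d}: {points_needed(0.1, d):,} points")  # 10, 100, 1000, 10 billion
```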

# Locally weighted regression

You can use k-nearest neighbour for regression by replacing the voting function $V$ with a local regression algorithm, such as polynomial regression. At each query point $a \in A$ you fit a polynomial to the nearby training points, and these local fits piece together into a larger curve.

This has the nice property that, while the modelling paradigm of each local regression limits it to a certain class of functions, the overall output can be a far more complex function.
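A minimal sketch of the idea for 1-D data, using a Gaussian kernel to weight nearby points and a weighted least-squares line fitted fresh at each query (the bandwidth `tau` and all names are illustrative assumptions, not from the notes):

```python
import math

def lwr_predict(xs, ys, q, tau=0.5):
    """Locally weighted linear regression at query point q (1-D).

    Each training point gets a Gaussian weight by its distance to q;
    a weighted least-squares line is then fitted and evaluated at q.
    `tau` controls how local the fit is (smaller = more local).
    """
    w = [math.exp(-((x - q) ** 2) / (2 * tau ** 2)) for x in xs]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, xs)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    num = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, xs, ys))
    den = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, xs))
    slope = num / den if den else 0.0
    return ybar + slope * (q - xbar)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.1, 1.9, 3.2, 4.1]   # roughly y = x, with noise
print(lwr_predict(xs, ys, q=2.5))
```

Even though each local fit is just a straight line, evaluating a new fit at every query point lets the pieced-together prediction curve bend in ways a single global line cannot.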