Week 3 - Fast Integer Multiplication

OMSCS
Multiplying $n$-bit integers problem

Given $n$-bit integers $x$ and $y$, what is $z = xy$?

Naïve approaches do this in $O(n^2)$ time, but divide and conquer can do better.

Gauss’s trick (aside)

Suppose we have two complex numbers $a + bi$ and $c + di$, and we want to compute the product $(a + bi)(c + di)$ as quickly as possible.

Multiplying two $n$-digit numbers takes $O(n^2)$ time, whereas addition and subtraction take only $O(n)$. So to go faster, it is best to minimise the number of multiplications.

Looking at the expansion

$$(a + bi)(c + di) = ac - bd + (ad + bc)i$$

it may look like we need to calculate four multiplications ($ac$, $bd$, $ad$ and $bc$). However, note the following:

$$(a + b)(c + d) = ac + bd + (ad + bc).$$

This tells us that by computing just three products, $ac$, $bd$ and $(a+b)(c+d)$, we can recover every component we need for the complex product:

$$(a + bi)(c + di) = ac - bd + ((a + b)(c+d) - bd - ac)i.$$
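As a quick sanity check, the trick can be sketched in a few lines of Python (the function name here is my own); it returns the real and imaginary parts of the product using three multiplications instead of four.

```python
def complex_mult(a, b, c, d):
    """Compute (a + bi)(c + di) with three real multiplications
    instead of four, via Gauss's trick."""
    ac = a * c
    bd = b * d
    cross = (a + b) * (c + d)  # = ac + ad + bc + bd
    # Real part: ac - bd; imaginary part: ad + bc = cross - ac - bd.
    return (ac - bd, cross - ac - bd)
```

For example, `complex_mult(1, 2, 3, 4)` gives `(-5, 10)`, matching $(1 + 2i)(3 + 4i) = -5 + 10i$.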

Naïve divide and conquer approach

Suppose we have $x$ and $y$, which are $n$-bit numbers with $n = 2^k$. We cut $x$ and $y$ in half, so

$$x = X_L 2^{2^{k-1}} + X_R \mbox{ and } y = Y_L 2^{2^{k-1}} + Y_R.$$

Then their product is

$$xy = X_LY_L2^{2^k} + (X_LY_R + X_RY_L)2^{2^{k-1}} + X_RY_R.$$

At this point we have broken the problem into smaller sub-problems, so we can turn it into a recursive algorithm.

EasyMultiply(x, y):
Input: n = 2^k-bit integers x & y
Output: xy
if n = 1:
	return xy
else:
	X_L = first n/2 bits of x, X_R = last n/2 bits of x
	Y_L = first n/2 bits of y, Y_R = last n/2 bits of y
	A = EasyMultiply(X_L, Y_L), B = EasyMultiply(X_R, Y_L)
	C = EasyMultiply(X_L, Y_R), D = EasyMultiply(X_R, Y_R)
	return 2^n A + 2^(n/2) (B + C) + D

To analyse the run time of this algorithm, note that the split and shift operations take $O(n)$ time, the same as addition. So dividing $x$ and $y$, and adding up the multiplied components, are both $O(n)$ operations. However, if $T(n)$ is the run time of the algorithm, the four sub-multiplications cost $4T(n/2)$. This gives

$$T(n) = 4 T(n/2) + O(n) = O(n^2).$$
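The scheme above can be sketched directly in Python, with arbitrary-precision integers standing in for the bit arithmetic (the function name and bit-masking details are my own; $n$ is assumed to be a power of two):

```python
def easy_multiply(x, y, n):
    """Multiply n-bit integers x and y with four recursive
    sub-multiplications (the O(n^2) divide-and-conquer scheme).
    Assumes n is a power of two."""
    if n == 1:
        return x * y
    half = n // 2
    # Split each number into its high (left) and low (right) halves.
    x_l, x_r = x >> half, x & ((1 << half) - 1)
    y_l, y_r = y >> half, y & ((1 << half) - 1)
    a = easy_multiply(x_l, y_l, half)  # X_L * Y_L
    b = easy_multiply(x_r, y_l, half)  # X_R * Y_L
    c = easy_multiply(x_l, y_r, half)  # X_L * Y_R
    d = easy_multiply(x_r, y_r, half)  # X_R * Y_R
    # xy = 2^n A + 2^(n/2) (B + C) + D
    return (a << n) + ((b + c) << half) + d
```

For example, `easy_multiply(13, 11, 4)` returns `143`, matching $13 \times 11$.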

Better approach

This time we combine Gauss’s trick with the divide and conquer approach to speed up this algorithm.

Again, suppose we have $x$ and $y$, which are $n$-bit numbers with $n = 2^k$, and cut them in half, so

$$x = X_L 2^{2^{k-1}} + X_R \mbox{ and } y = Y_L 2^{2^{k-1}} + Y_R.$$

Then their product is

$$xy = X_LY_L2^{2^k} + (X_LY_R + X_RY_L)2^{2^{k-1}} + X_RY_R.$$

We now compute this using only the three products

$$X_LY_L, \ X_RY_R, \mbox{ and } (X_L + X_R)(Y_L + Y_R).$$

together with the substitution

$$(X_LY_R + X_RY_L) = (X_L + X_R)(Y_L + Y_R) - X_LY_L - X_RY_R.$$

At this point we have broken the problem into only three sub-problems, and can again turn it into a recursive algorithm.

FastMultiply(x, y):
Input: n = 2^k-bit integers x & y
Output: xy
if n = 1:
	return xy
else:
	X_L = first n/2 bits of x, X_R = last n/2 bits of x
	Y_L = first n/2 bits of y, Y_R = last n/2 bits of y
	A = FastMultiply(X_L, Y_L), B = FastMultiply(X_R, Y_R)
	C = FastMultiply(X_L + X_R, Y_L + Y_R)
	return 2^n A + 2^(n/2) (C - A - B) + B

With only three sub-multiplications, the recurrence becomes

$$T(n) = 3T(n/2) + O(n) = O(n^{\log_2 3}) \approx O(n^{1.585}),$$

a polynomial improvement over the naive $O(n^2)$.
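A Python sketch of this three-multiplication scheme follows (again the names are my own, and $n$ is assumed to be a power of two). One subtlety: $X_L + X_R$ can be one bit wider than $n/2$; Python's arbitrary-precision integers absorb this silently, so the base case stays exact.

```python
def fast_multiply(x, y, n):
    """Multiply n-bit integers x and y with three recursive
    sub-multiplications (Karatsuba). Assumes n is a power of two."""
    if n == 1:
        return x * y  # exact even if carries made the operands wider
    half = n // 2
    # Split each number into its high (left) and low (right) halves.
    x_l, x_r = x >> half, x & ((1 << half) - 1)
    y_l, y_r = y >> half, y & ((1 << half) - 1)
    a = fast_multiply(x_l, y_l, half)              # X_L * Y_L
    b = fast_multiply(x_r, y_r, half)              # X_R * Y_R
    c = fast_multiply(x_l + x_r, y_l + y_r, half)  # (X_L + X_R)(Y_L + Y_R)
    # xy = 2^n A + 2^(n/2) (C - A - B) + B
    return (a << n) + ((c - a - b) << half) + b
```

For example, `fast_multiply(200, 155, 8)` returns `31000`, matching $200 \times 155$.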