What if we just compute a · b by letting the computer multiply the two numbers together?
This takes approximately constant time (about 45 milliseconds per million multiplications, independent of the size of the numbers being multiplied).
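As a rough illustration, here is a minimal timing sketch (a hypothetical harness in Python, not from the notes; exact numbers depend on the language and hardware, and an interpreted language will typically be slower than the figure quoted above):

```python
import random
import time

# Generate a million random pairs to multiply.
pairs = [(random.randrange(10**9), random.randrange(10**9)) for _ in range(1_000_000)]

# Time a million single multiplications.
start = time.perf_counter()
for a, b in pairs:
    _ = a * b
elapsed = time.perf_counter() - start

print(f"{elapsed * 1000:.1f} ms for 1,000,000 multiplications")
```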
Rule #1 of Good Algorithm Design: If you already have a good subroutine to solve the problem, use it!