The 1620's instruction cycle time was on the order of 20 µs. With an O(N log N) multiplication algorithm, one could multiply two N-digit numbers with N = 10,000 in about one second, barely. (BCD arithmetic, remember.) The question is: how good was the 1620's algorithm?
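
For what it's worth, here is the back-of-envelope arithmetic behind that "one second, barely" claim, as a small Python sketch. The assumption that one digit operation costs one instruction cycle, and the choice of base-10 logarithm (since we are counting decimal digits), are mine, not documented 1620 behavior:

    import math

    cycle_s = 20e-6          # assumed cost per digit operation: one 20 us cycle
    N = 10_000               # decimal (BCD) digits per operand

    ops = N * math.log10(N)  # N log N digit operations, log taken base 10
    print(f"{ops:.0f} ops -> {ops * cycle_s:.2f} s")
    # 40000 ops -> 0.80 s

Under those assumptions the estimate comes out to roughly 0.8 seconds, which is where the "barely" comes from; a larger constant factor, or a log base closer to 2, pushes it past a second.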