Richard Russell has posted something interesting on the Raspberry Pi forum. I thought it might be of interest to people here.
>I mentioned in the other thread that I was thinking of writing a native code BigInt library for BBC BASIC; it would be an interesting exercise and hopefully provide a route to an efficient solution of the Fibonacci challenge. I've started to look at this, but I'm facing a dilemma: should I use a 10^n radix for the limbs (as the classic BASIC Fibo program does) or a more conventional 2^n radix? The advantage of the former is that the conversion back from the BigInt to decimal becomes trivial (and fast) which is of particular value when outputting a million-digit result! The disadvantage - and it's a big one - is that the BigInt computations themselves become significantly less efficient.
>For a general-purpose BigInt library I'm in little doubt that the 2^n limb radix is better. In a typical application the formatting of the output as a decimal number is likely to be a small proportion of the total work (it might not even always be needed, if the output value isn't intended for 'human consumption') and efficiency of the BigInt arithmetic will be the most important factor (GMP uses a 2^n radix incidentally). But I suspect that in the specific case of the million-digit Fibonacci challenge it would give poor results because of the time taken to convert the result to decimal (which I assume is supposed to be part of the challenge).
>What to do?! Does anybody have some BASIC code for multiple-precision binary-to-decimal conversion that I could benchmark?
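For anyone unfamiliar with the trade-off Russell describes, here is a minimal sketch (in Python rather than BBC BASIC, and not Russell's code; all names are mine) of the 10^n-limb approach. With limbs in radix 10^9, each limb operation needs a division by 10^9 to extract the carry, but converting the result to a decimal string is just zero-padded concatenation of the limbs:

```python
# Illustrative sketch, NOT Russell's library: big integers stored as
# little-endian lists of "limbs" in radix 10^9, so that decimal output
# is trivial (the property Russell wants for the Fibonacci challenge).

BASE = 10**9  # each limb holds 9 decimal digits

def add(a, b):
    """Add two limb lists (little-endian, radix 10^9)."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = carry
        if i < len(a):
            s += a[i]
        if i < len(b):
            s += b[i]
        result.append(s % BASE)   # low 9 digits stay in this limb
        carry = s // BASE         # carry-out needs a division by 10^9
    if carry:
        result.append(carry)
    return result

def to_decimal(limbs):
    """Decimal output is just zero-padded concatenation of limbs."""
    head = str(limbs[-1])  # most significant limb, no padding
    rest = "".join(str(limb).zfill(9) for limb in reversed(limbs[:-1]))
    return head + rest

def fib(n):
    """n-th Fibonacci number using the limb representation."""
    a, b = [0], [1]
    for _ in range(n):
        a, b = b, add(a, b)
    return a
```

Note the cost Russell mentions: the carry step is `s // BASE`, an integer division, where a 2^n radix would need only a shift and a mask.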
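As for the conversion Russell asks about, the schoolbook method is to repeatedly divide the 2^n-limb number by 10^9, collecting nine decimal digits per pass. A hedged sketch follows (my own code and naming, for illustration only). It is quadratic in the limb count, which is exactly why the conversion can dominate for a million-digit result; serious libraries such as GMP use subquadratic divide-and-conquer conversion instead:

```python
# Schoolbook multiple-precision binary-to-decimal conversion: O(n^2)
# in the number of limbs. Limbs are big-endian in radix 2^32 here.

BITS = 32
BASE10 = 10**9  # peel off 9 decimal digits per division pass

def div_small(limbs, d):
    """Divide a big-endian 2^32-limb number by a small integer d.
    Returns (quotient limbs, remainder)."""
    quot, rem = [], 0
    for limb in limbs:
        cur = (rem << BITS) | limb
        quot.append(cur // d)
        rem = cur % d
    while len(quot) > 1 and quot[0] == 0:  # strip leading zero limbs
        quot.pop(0)
    return quot, rem

def binary_to_decimal(limbs):
    """Convert big-endian 2^32 limbs to a decimal string."""
    chunks = []
    while limbs != [0]:
        limbs, r = div_small(limbs, BASE10)
        chunks.append(r)  # least significant 9-digit chunk first
    if not chunks:
        return "0"
    head = str(chunks[-1])
    rest = "".join(str(c).zfill(9) for c in reversed(chunks[:-1]))
    return head + rest
```

Each pass over the whole number yields only nine digits, so an n-limb number needs on the order of n full-length divisions; that is the cost a 10^n internal radix avoids entirely.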