Talk:Legendre prime counting function

From Rosetta Code

Several questions

Several questions: In principle this function has infinite recursion, so it should have a cut-off for n smaller than some number (2?). It is also not clear how primes are counted; I presume 2 is the first prime, 3 the second prime, etc. Perhaps this could be made clear in the intro. The case of '1' is also a bit strange (I see both 0 and 1 in the results, probably because of some ambiguity in how the task is posed/explained).

I checked and indeed pi(1) should be 0 (fixed that). Yes, the first prime would be 2.--Wherrera (talk) 17:27, 5 August 2021 (UTC)

There are a few ways to exit the recursion:

One is to use the base case that pi(n) = 0 for n < 2.

Another method would be to keep a table of small primes (say primes <= 127) and count from the array when n is small.

The third method is to count from the sieved primes that you need to generate anyway, using a binary search; the sieved primes may be numerous enough for binary search to be the preferred approach.
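For anyone trying the third method, here is a minimal sketch in Go (the helper names `sievePrimes` and `countPrimes` are mine, not taken from any of the entries): after sieving once, counting the primes up to n is a single binary search with `sort.SearchInts`.

```go
package main

import (
	"fmt"
	"sort"
)

// sievePrimes returns all primes <= limit using a simple sieve of Eratosthenes.
func sievePrimes(limit int) []int {
	composite := make([]bool, limit+1)
	var primes []int
	for p := 2; p <= limit; p++ {
		if !composite[p] {
			primes = append(primes, p)
			for m := p * p; m <= limit; m += p {
				composite[m] = true
			}
		}
	}
	return primes
}

// countPrimes counts the primes <= n by binary-searching the sorted
// slice of sieved primes: sort.SearchInts returns the index of the
// first prime greater than n, which is exactly the count.
func countPrimes(primes []int, n int) int {
	return sort.SearchInts(primes, n+1)
}

func main() {
	primes := sievePrimes(1000)
	fmt.Println(countPrimes(primes, 100)) // 25 primes up to 100
	fmt.Println(countPrimes(primes, 1))   // 0, matching pi(1) = 0
}
```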

In the examples, the CoffeeScript entry uses the identity pi(n) = 0 when n < 2. The Erlang version performs a binary search on the sieved primes (method 3). --Davgot (talk) 19:07, 5 August 2021 (UTC)
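The first method can be sketched as below in Go. This is a direct, unoptimized rendering of Legendre's formula pi(n) = phi(n, a) + a - 1 with a = pi(floor(sqrt(n))), not one of the task entries; the base case pi(n) = 0 for n < 2 is what terminates the recursion.

```go
package main

import (
	"fmt"
	"math"
)

// pi computes the prime-counting function via Legendre's formula.
func pi(n int64) int64 {
	if n < 2 {
		return 0 // base case: no primes below 2, ends the recursion
	}
	a := pi(int64(math.Sqrt(float64(n)))) // a = pi(sqrt(n))
	return phi(n, a) + a - 1
}

// phi(x, a) counts the integers in 1..x not divisible by any of the
// first a primes, using Legendre's recurrence
// phi(x, a) = phi(x, a-1) - phi(x / p_a, a-1).
func phi(x, a int64) int64 {
	if a == 0 {
		return x
	}
	return phi(x, a-1) - phi(x/nthPrime(a), a-1)
}

// nthPrime returns the a-th prime (1-based, so nthPrime(1) == 2) by
// trial division -- slow, but fine for this illustrative sketch.
func nthPrime(a int64) int64 {
	count, cand := int64(0), int64(1)
	for count < a {
		cand++
		isPrime := true
		for d := int64(2); d*d <= cand; d++ {
			if cand%d == 0 {
				isPrime = false
				break
			}
		}
		if isPrime {
			count++
		}
	}
	return cand
}

func main() {
	for _, n := range []int64{1, 2, 10, 100, 1000} {
		fmt.Printf("pi(%d) = %d\n", n, pi(n))
	}
}
```

Note pi(1) correctly returns 0, matching the fix discussed above.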

Thanks for clarifying, and thanks for the update to the description, which makes it much clearer. I've updated my Mathematica solution so that pi(1) = 0.

V and C entries giving the wrong answer for a trillion numbers

The V entry gives a count of 35,460,428,370 primes as does the C entry (which I've translated from V).

However, according to this independent source, the correct answer is 37,607,912,018.

Curiously, the Go and Wren entries, which I've also translated from V, give the correct answer for a trillion numbers, and the Go entry continues to give correct answers up to 10^15 (Wren up to 10^13; I didn't bother after that). However, both V and C produce a segmentation fault when I try 10^13.

It eventually dawned on me that the reason for this is that the 'int' type in V, and also in C (at least in GCC), is 4 bytes, whereas in Go (and, I think, Nim) it's 8 bytes on a 64-bit system. Wren effectively has 53-bit integers.

Consequently, this line in the V example is producing an overflow:

mut ans := larges[0] + i64(((rilmt + 2 * (nbps - 1)) * (rilmt - 1) / 2))

If we change it to:

mut ans := larges[0] + i64(rilmt + 2 * (nbps - 1)) * i64(rilmt - 1) / 2

then the correct number of primes is returned for a trillion numbers.
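The overflow is easy to reproduce outside V. In the sketch below, Go's `int32` stands in for V's 4-byte `int`; the values of `rilmt` and `nbps` are made-up stand-ins of roughly the right magnitude, not the actual values from the program.

```go
package main

import "fmt"

// sumBad mirrors the original V line: the whole expression is
// evaluated in 32-bit arithmetic and only widened afterwards, so the
// intermediate product wraps around.
func sumBad(rilmt, nbps int32) int64 {
	return int64((rilmt + 2*(nbps-1)) * (rilmt - 1) / 2)
}

// sumGood mirrors the corrected line: each factor is widened to
// 64 bits before the multiplication, so nothing overflows.
func sumGood(rilmt, nbps int32) int64 {
	return int64(rilmt+2*(nbps-1)) * int64(rilmt-1) / 2
}

func main() {
	// Illustrative stand-in values: the product is ~10^12, well
	// beyond the int32 maximum of 2147483647.
	rilmt, nbps := int32(1000000), int32(100)
	fmt.Println(sumBad(rilmt, nbps))  // wrapped 32-bit result (wrong)
	fmt.Println(sumGood(rilmt, nbps)) // 500098499901 (correct)
}
```

The same trap exists in C, where the unwidened version is undefined behaviour rather than wrapping.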

There's still a segmentation fault when I try to go above a trillion, which I haven't been able to track down. As the Go example is working, probably the easiest way to fix this is to change 'int' to 'i64' throughout. --PureFox (talk) 22:21, 10 November 2022 (UTC)