Talk:Almost prime: Difference between revisions

== K-almost prime entries that are doubled from previous K entries ==
 
 
The question then becomes, how to <strike>know</strike> determine how many previous entries
can be just doubled? &nbsp; I don't know if there is a
formulaic way to compute that number.
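One way to explore that number empirically is a brute-force sketch (illustrative only, not taken from any entry on the task page): generate the first few k-almost primes directly and measure the leading run of entries that are exactly double the corresponding (k-1)-almost primes.

```python
# Brute-force sketch: how many leading k-almost primes are exactly
# double the corresponding (k-1)-almost primes?

def bigomega(n):
    """Number of prime factors of n, counted with multiplicity."""
    count, p = 0, 2
    while p * p <= n:
        while n % p == 0:
            n //= p
            count += 1
        p += 1
    return count + (1 if n > 1 else 0)

def almost_primes(k, how_many):
    """The first `how_many` k-almost primes, found by trial division."""
    out, n = [], 2
    while len(out) < how_many:
        if bigomega(n) == k:
            out.append(n)
        n += 1
    return out

def doubled_prefix(k, how_many):
    """Length of the leading run where the i-th k-almost prime
    equals 2 times the i-th (k-1)-almost prime."""
    lo = almost_primes(k - 1, how_many)
    hi = almost_primes(k, how_many)
    r = 0
    while r < how_many and hi[r] == 2 * lo[r]:
        r += 1
    return r

print(doubled_prefix(2, 10))  # 2: the 2-almost primes 4, 6 are doubled primes
print(doubled_prefix(3, 20))  # 4: the 3-almost primes 8, 12, 18, 20
```

For k = 3 the run ends just before 27 = 3<sup>3</sup>, the smallest odd 3-almost prime, which suggests why the usable count grows with k.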
 
I have &nbsp; (since the above posting) &nbsp; added optimization into the 2<sup>nd</sup> REXX entry, making it about a hundred times faster. &nbsp; -- [[User:Gerard Schildberger|Gerard Schildberger]] ([[User talk:Gerard Schildberger|talk]]) 02:22, 12 September 2017 (UTC)
 
: In general, you can reduce k when n < [http://oeis.org/A078843 OEIS A078843](k). You can repeat to find some <code>r</code> where <code>nth(k,n) = nth(k-r,n) << r</code>. Finding the reduction amount <code>r</code> can be handy when looking at bounds or approximations where you apply the shift at the end.
: A similar process can be used with counts, where we find an <code>r</code> s.t. <code>count(k,n) = count(k-r, n>>r)</code>, where <code>n < 3^k</code> is used as the test.
: One method that works fairly well for large n is to use fast counting methods, then for the nth, use interpolation / binary search. E.g. nth(k=31,n=3e6) in about a second, nth(k=5,n=1e9) in half a second. It seems like there should be a much faster way, similar to quickly generating the nth Hamming number for large n. [[User:Danaj|Danaj]] ([[User talk:Danaj|talk]]) 10:21, 5 July 2020 (UTC)
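: The shift identity described above can be verified by brute force for small n (a sketch only; it does not reproduce the fast counting method, and <code>reduction</code> is a name invented here):

```python
# Brute-force check of the reduction nth(k, n) == nth(k - r, n) << r
# for small n.  Real implementations would use fast counting instead
# of trial division; this only demonstrates the identity.

def bigomega(n):
    """Number of prime factors of n, counted with multiplicity."""
    count, p = 0, 2
    while p * p <= n:
        while n % p == 0:
            n //= p
            count += 1
        p += 1
    return count + (1 if n > 1 else 0)

def nth_almost_prime(k, n):
    """The n-th (1-based) k-almost prime, by trial division."""
    m, seen = 1, 0
    while seen < n:
        m += 1
        if bigomega(m) == k:
            seen += 1
    return m

def reduction(k, n):
    """Largest r with nth(k, n) == nth(k - r, n) << r."""
    target = nth_almost_prime(k, n)
    r = 0
    while r + 1 < k and (nth_almost_prime(k - r - 1, n) << (r + 1)) == target:
        r += 1
    return r

print(nth_almost_prime(5, 3))  # 72
print(reduction(5, 3))         # 3: 72 == nth(2, 3) << 3 == 9 << 3
print(reduction(5, 1))         # 4: 32 == nth(1, 1) << 4 == 2 << 4
```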
 