Talk:Word wheel: Difference between revisions

::Hi Gerald, there is no waste of memory in the Julia case, as nested loops are used to generate word candidates one at a time, each of which is then quickly checked for membership in the set of dictionary words. That is probably hundreds of thousands of lookups, which is fine for today's laptops. As for dictionary size, the task ''specifies'' a ''particular'' dictionary to use; going so far outside of that may be of interest, but it is outside the task's boundary.
:: Cheers, --[[User:Paddy3118|Paddy3118]] ([[User talk:Paddy3118|talk]]) 03:32, 5 July 2020 (UTC)
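The nested-loop candidate generation described above can be sketched in Python roughly as below. The wheel letters, the convention that the mandatory centre letter is the middle character of the string, and the tiny inline word set are illustrative assumptions for this sketch, not the task's actual unixdict.txt:

```python
from itertools import permutations

def wheel_words(wheel, words):
    """Generate candidate strings from the wheel letters one at a time
    (nested-loop / permutation enumeration) and keep those that are in
    the dictionary set. Assumes wheel[len(wheel) // 2] is the mandatory
    centre letter and valid words are 3..len(wheel) letters long."""
    centre = wheel[len(wheel) // 2]
    found = set()
    for n in range(3, len(wheel) + 1):
        for perm in permutations(wheel, n):   # candidate generation
            if centre in perm:                # mandatory letter check
                candidate = ''.join(perm)
                if candidate in words:        # average O(1) set lookup
                    found.add(candidate)
    return sorted(found)

# Illustrative mini-dictionary, not the task's specified word list.
print(wheel_words("ndeokgelw", {"keg", "eke", "knee", "wok", "dog", "kelp"}))
```

For a 9-letter wheel this enumerates just under a million ordered candidates (the sum of P(9, n) for n from 3 to 9), which matches the "hundreds of thousands of lookups" estimate above; the cost is CPU time per run rather than memory, since candidates are never stored.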
 
:: Just ran the larger [https://raw.githubusercontent.com/dwyl/english-words/master/words.txt dictionary]: it's 12x the size of the standard dictionary and runs in 15x the time using the Python code. (There is a lot of "cruft" padding out that larger dictionary, judging from the first 100 words). --[[User:Paddy3118|Paddy3118]] ([[User talk:Paddy3118|talk]]) 10:18, 5 July 2020 (UTC)