Talk:Rosetta Code/Find unimplemented tasks

About: "Since most of these other implementations seem to be fetching tasks already implemented, I did the same..."
Please stop flagging Python as incorrect.
E works the same way.


All examples are working, except AutoHotkey, which is getting all implemented tasks for a language. --[[User:Guga360|Guga360]] 01:37, 28 May 2009 (UTC)
And normal users can't fetch more than 500 results; only bots can.
We have only 253 tasks. --[[User:Guga360|Guga360]] 15:05, 9 February 2009 (UTC)


Oops, I didn't see the part about getting all tasks in my translation... <br>
Please look at the MediaWiki API documentation on [http://www.mediawiki.org/wiki/API:Query#Continuing_queries continuing queries]. The E program implements this continuation - I tested it by setting the limit to 50. --[[User:Kevin Reid|Kevin Reid]] 23:42, 9 February 2009 (UTC)
It's fixed now. Thanks. [[User:tinku99|tinku99]]
:I think Kevin is right. New tasks are popping up all over the place, and once we have more than 500 the Perl and Python code won't work. If you can ensure that you go through all of the category's members, you should. I don't understand how this MW API stuff works, but it looks like the documentation he linked to tells you what you need to do to go to the next page. --[[User:Mwn3d|Mwn3d]] 03:26, 23 February 2009 (UTC)
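
For reference, here is a minimal sketch of such a continuing query in modern Python 3. The endpoint URL and the category names are assumptions; also note that current MediaWiki versions signal continuation with a <code>continue</code> block, whereas the API of that era used <code>query-continue</code> as described in the documentation linked above.

<lang python>import json
import urllib.parse
import urllib.request

API = "http://rosettacode.org/w/api.php"  # assumed endpoint

def category_members(category):
    """Yield every page title in a category, following API continuation."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": "Category:" + category,
        "cmlimit": "500",  # the per-request maximum for normal users
        "format": "json",
    }
    while True:
        url = API + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as response:
            data = json.load(response)
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        if "continue" not in data:
            break  # no further pages of results
        params.update(data["continue"])  # carry cmcontinue into the next request

# Unimplemented tasks = all tasks minus the tasks in the language's category.
tasks = set(category_members("Programming_Tasks"))
done = set(category_members("Python"))
for title in sorted(tasks - done):
    print(title)</lang>

Each request returns at most 500 titles, so the loop keeps merging the returned continuation parameters into the next request until none are left, which is why the 500-result cap for normal users is not a problem.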

== Server load ==

<s>Unfortunately, a large number of server requests lately appear to be from this task. I'd like this task to be suspended until I provide an alternate resource for the category data. (ImplSearchBot already generates JSON files for each category, but I don't have them in a location the httpd can see them from.) --[[User:Short Circuit|Michael Mol]] 10:02, 3 September 2009 (UTC)</s>
: Task rewritten to take advantage of "static" files. --[[User:Short Circuit|Michael Mol]] 04:35, 14 September 2009 (UTC)
:: The "static" (JSON) files mentioned in the task description are no longer available. The README file in the linked folder suggests that "calling MediaWiki's API for JSON data shouldn't be as painful as it once was". Should the task description be reverted to use the MediaWiki API again? --[[User:Tikkanz|Tikkanz]] 00:35, 29 January 2010 (UTC)
::: That's what I would recommend. Sorry about not leaving a note here before. (The script that was updating the JSON files was consuming 18M of RAM, and I needed that to deal with StumbleUpon while trying to get other parts of the server configuration tuned. One consequence of the tuning is that dynamic JSON generation should be much faster, if MediaWiki takes advantage of caching there.) --[[User:Short Circuit|Michael Mol]] 04:54, 29 January 2010 (UTC)
:::: OK, have restored task description to pre-JSON-source-file state. --[[User:Tikkanz|Tikkanz]] 22:28, 29 January 2010 (UTC)
