Wikipedia talk:Bots/Requests for approval
Archives: Archive 1 | Archive 2 (Election) | Archive 3 | Archive 4 | Archive 5
Multiple Requests
I have around three or four requests to make for extra tasks on MartinBotII, all relating to WikiProjects. To avoid spamming the BRFA page, should I put the three really simple tasks (which don't edit the mainspace) in one request, and the other (which, at most, involves putting article talk pages in categories, and which runs similarly to MathBot) in another? Or all four together? I'm not sure what the norm is for this - any input appreciated. Martinp23 14:29, 4 November 2006 (UTC)
- What are the tasks? --kingboyk 14:34, 4 November 2006 (UTC)
- I'll outline them below:
- Do the same as task 2 on MartinBotII (approved), but for other Wikiprojects (per requests), using exactly the same system, just different message, recipient and opt-out pages
- Update the Wikipedia:WikiProject Trains/Recent changes list of articles by editing sub-pages of that, based on which article talk pages contain the project banner (diff)
- Produce a list of all wikiproject pages for the WikiProject Directory, displaying them in a table diff and making a separate list of new projects diff.
- These are the non-mainspace (talk) editing tasks, which are fairly simple. All have been fully tested, as shown by the diffs. The other task is for WP V1.0 and does the same sort of thing as MathBot, except that not all features are yet implemented. The main difference is that, in its tables, it only lists articles which have been rated by a WikiProject and achieve a score, calculated by the bot from the rating and importance of the article. For the tests thus far, all articles which achieved the minimum score on the WikiProjects tested were put into sub-pages of this page. This was only a very limited trial - when approved, the bot will add the talk pages of all those articles (and images) which meet the requirements to a category. This is the reason I've isolated this task, along with the potential for a huge number of edits (depending on overall quality, of course, but it could go into tens of thousands, spread out over weeks). Martinp23 14:49, 4 November 2006 (UTC)
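For illustration, the kind of threshold scoring described above - combining a project's quality and importance ratings into a single number and listing only articles that reach a minimum score - could be sketched as follows. The actual formula, weights, and minimum score the bot uses aren't given in this discussion, so every value below is hypothetical, and the sketch is in Python purely for readability:

```python
# Hypothetical rating weights - the real bot's formula is not specified
# in this discussion, so these numbers are illustrative only.
QUALITY_SCORES = {"FA": 500, "A": 400, "GA": 300, "B": 200, "Start": 100, "Stub": 50}
IMPORTANCE_SCORES = {"Top": 400, "High": 300, "Mid": 200, "Low": 100}

MINIMUM_SCORE = 500  # only articles at or above this threshold would be listed

def article_score(quality, importance):
    """Combine a WikiProject's quality and importance ratings into one score."""
    return QUALITY_SCORES.get(quality, 0) + IMPORTANCE_SCORES.get(importance, 0)

def meets_threshold(quality, importance):
    """Decide whether an article would make it into the bot's tables."""
    return article_score(quality, importance) >= MINIMUM_SCORE
```

Under these made-up weights, a B-class/High-importance article would just reach the cutoff, while a Stub/Low article would not.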
Request for proposed bot review
I have not used or written a WP bot before (except simple read-only for analysis) and I have a couple of questions:
- Is there a centralized place for someone to propose (not to request) a bot that doesn't yet exist, to get the equivalent of a pre-implementation design review?
- On the assumption that this is the place (for now), my idea is to write a bot (in perl) that can revert in two modes:
- Vandal mode: by looking for specific strings and regular expressions; and
- Linkspam mode: by looking for links to web sites that are on a blacklist
The sites that are 'guarded' would normally be a very small number (like someone's small watch list), and each could have its own custom set of 'typical' vandal strings/regexps, and/or its own set of blacklisted external link sites. All reversions would include a Talk page message to the user, would follow 3RR (or less, for safety), would leave clear edit summaries, etc. Of course, once a prototype version is available, I plan to bring it here for approval (it would be run in manual mode until bulletproof), but I would appreciate getting preliminary comments prior to investing any serious effort. Thanks, Crum375 17:11, 9 November 2006 (UTC)
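As a rough sketch of the per-article configuration described above - each guarded page carrying its own vandal regexps and its own blacklist of external-link domains - something like the following could work. This is written in Python rather than the planned perl, and every page name, pattern, and domain here is a made-up example, not anything from an actual configuration:

```python
import re

# Hypothetical per-article configuration: each guarded page gets its own
# vandal regexes and its own blacklist of external-link domains.
GUARDED_PAGES = {
    "Example article": {
        "vandal_patterns": [
            re.compile(r"(?i)\bbuy cheap\b"),
            re.compile(r"(?i)\blol+\b"),
        ],
        "blacklisted_domains": {"spam.example.com", "badlinks.example.org"},
    },
}

# Extract the host part of any external link in the added text.
LINK_RE = re.compile(r"https?://([^/\s\]]+)")

def classify_edit(page_title, added_text):
    """Return 'vandalism', 'linkspam', or None for an edit's added text."""
    config = GUARDED_PAGES.get(page_title)
    if config is None:
        return None  # page not guarded - do nothing
    for pattern in config["vandal_patterns"]:
        if pattern.search(added_text):
            return "vandalism"
    for match in LINK_RE.finditer(added_text):
        if match.group(1).lower() in config["blacklisted_domains"]:
            return "linkspam"
    return None
```

A real bot would of course fetch the added text from diffs via the API and then handle the revert, 3RR accounting, and Talk page notification; this only shows the per-article matching step.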
- This seems similar to my second bot, which goes after shared IPs and suspicious new users (likely socks) on a watchlist, along with link spam revert ability. Voice-of-All 17:14, 9 November 2006 (UTC)
- Do you have a link to its specs that I can read? Is it Open Source? Crum375 17:49, 9 November 2006 (UTC)
- See User:VoABot II. Voice-of-All 18:43, 9 November 2006 (UTC)
- I read the external 'specs' available at your bot page. If it supports specific tools for EL enforcement (by specifying blacklisted linked sites per article), then I can't find it. If it supports custom regexps for vandal edit detection per article, then I can't find that either. Those are the 2 main missions I see for my proposed bot - customization per article for vandal and linkspam protection. My plan is to implement it in perl. Do you, or does anyone else, see a problem with that concept? I know I myself need it as a user, but I don't want to invest the effort in it if it's already done somewhere, or if it violates some fundamental bot rule and has no chance for approval here. Any comments would be appreciated. Crum375 21:37, 9 November 2006 (UTC)
- My bot usually only gets 20-80 edits/day (it's at a low point now). If it only watched single pages it would probably be of very little use. Voice-of-All 23:21, 9 November 2006 (UTC)
- As I noted above, the issue for me is not need: to me personally it would be very useful to have that functionality - it would save me lots of menial work, which is what bots are for. I suspect that even if no one else except me used the bot, it would still 'pay for itself' in time saved after several months. If anyone else used it, that would be a bonus. But the real reason for my questions here is whether there is some procedural flaw that I am missing that would preclude this proposed bot from ever being approved here, or whether someone has already built it and I missed it while going over the existing bot list. I am still waiting for an answer to either question. Crum375 23:49, 9 November 2006 (UTC)
- Per-article customisation is an interesting idea, and could in theory capture persistent vandalism on a particular article that it's not practical or "safe" to revert in general. I think VoA may well be correct, though, that doing it this way could involve a lot of work for relatively little benefit, but I don't think that's a basis on which the bot would fail to be approved (the principle there being, basically, prevention of bots doing harm). Also bear in mind that for linkspam that should be blacklisted in the general case, there's m:spam blacklist, which solves the problem "at source". If you want a more formal answer from the BAG (as against from here in the peanut gallery), you might consider filing an approval request, but specifying that the trial wouldn't begin at once, assuming it wouldn't be an excessively long period to hold the request 'open' for. Alai 01:53, 11 November 2006 (UTC)
- Thanks for the detailed response. Regarding the effort/benefit ratio, I think I can live with that, and the effort is (mostly) one-time, vs. hopefully a long period of benefits. Regarding the linkspam list, I think the issue there is that some linkspam is only posted (persistently) into one article, and I am not sure if it would instantly qualify for the global blacklist, although it may eventually. Regarding the application in a 'hold mode', I think I'll take a chance on the 'peanut gallery', enough to whip up a basic prototype, which I can then submit for approval. Thanks again, Crum375 02:21, 11 November 2006 (UTC)
Scepbot
Back in June, I requested a flag for my bot (here). A day later, I added a section for possible tasks to do (e.g. template substitution) if there were no redirects to fix, but never got a response either way. I'm still a bit confused about whether it should run these tasks or not. Will (message ♪) 00:30, 13 November 2006 (UTC)
InterWikiLink for Arabic page
Could someone who knows how please add [[ar:ويكيبيديا:الميدان/تقنية]] to the list - thanks --Mandavi 16:11, 13 November 2006 (UTC)
- Done; the correct page to add the link to is Wikipedia:Bots/Requests for approval/Header. --ais523 16:24, 13 November 2006 (UTC)
Expiring requests?
TaeBot, TheJoshBot and Huzzlet the bot have all had no discussion for weeks - are they going to expire, or stay on the list longer? --WinHunter (talk) 16:45, 13 November 2006 (UTC)
- We will move them soon. Betacommand (talk • contribs • Bot) 17:10, 13 November 2006 (UTC)
- What is the approval for TheJoshBot stalled on? At Template:Infobox Australian Place (under WP:AUSTPLACES), we have been waiting patiently for approval so the conversion can be made. SauliH 15:39, 16 November 2006 (UTC)
Trial results
I have posted the trial results of my bot here. May I request somebody in BAG to have a look. Thanks -- Lost(talk) 14:53, 17 November 2006 (UTC)
Betacommand Deletion Bot
I have blocked Betacommand for 1 week for operating an unapproved deletion bot. If I am somehow wildly mistaken and this behavior was properly authorized, please correct me.
See: WP:AN#Massive Image Deletion.
Dragons flight 08:49, 28 November 2006 (UTC)
- Just an update for everyone's information: there was no bot involved in this incident. -- RM 14:58, 28 November 2006 (UTC)
Process streamlining
Sorry, but I still can't figure out when to grant access to a bot approved by the committee. I suggest the following setup:
- User places a request on Page 1. Discussion takes place here.
- Committee approves the request and moves the approved transcluded subpage to Page 2
- Bureaucrat who has page 2 on his watchlist then comes around granting the flag. Archives the template to page 3.
How does this setup sound? =Nichalp «Talk»= 05:23, 3 December 2006 (UTC)
- The approval process has been handled readily by Redux and Taxman on numerous occasions without any problems. Perhaps I should explain how a bureaucrat should go about approving bots. When a bot is approved, an approvals member places the new entry at Wikipedia:Bots/Requests for approval#Approved Requests. When you want to know if any bots need work, look there. The page itself is not edited that often except to add and remove (approved or otherwise) requests, so it has been sufficient to merely watch the approvals page. I see no compelling reason to change how this works, although if there were enough people who wanted to, we could. When a bot requires a flag, it will say so. So all you have to do is look for something like "bot flag required" and take appropriate action. At that point it is the job of the bureaucrat to ensure that the approval is legitimate and properly sanctioned. The following diffs will show exactly how this is done: here and here. The last link to the log shows how the bureaucrat has to manually check the approval to ensure that the approval is valid. -- RM 18:14, 6 December 2006 (UTC)
- With regards to your suggestion, transcluded subpages are generally only used for the initial approval. Since they are large and take up a lot of space, we only use them for new approvals. New tasks and approved tasks are just linked from the main page. In your example there is no reason to move a transcluded subpage, since once a bot is initially approved or denied, the subpage is no longer transcluded. I've found that keeping everything on one page streamlines the process. When I want to get an overview of what is going on, I have one central page to deal with, rather than a 3 page system. I think you'd find that others would agree. -- RM
PockBot - request to un-archive approval request
PockBot's Request for Approval was archived after I requested a suspension to allow me to complete the bot's development. I've now finished coding the bot, have run some test edits, and it seems to be working. Can I get the bot's RFA un-archived and the bot approved, please? - PocklingtonDan 17:56, 6 December 2006 (UTC)
- Done. -- RM 18:07, 6 December 2006 (UTC)
- Thanks —The preceding unsigned comment was added by PocklingtonDan (talk • contribs).
Interesting discussion
Something relevant that the Bot Approvals group may want to have a look at is being discussed at the Main Page's talk. It may involve the approval of an adminbot (basically, since it would be used exclusively to edit a protected page). Titoxd(?!?) 04:05, 9 December 2006 (UTC)