Wikipedia:Bots/Requests for approval/VixDaemon 4
- The following discussion is an archived debate. Please do not modify it. Subsequent comments should be made in a new section. The result of the discussion was Approved.
VixDaemon
Operator: KyraVixen
Automatic or Manually Assisted: Automatic
Programming Language(s): Python
Function Summary: Clone of BetacommandBot, task 3. Wikipedia:Bots/Requests for approval/BetacommandBot 3
Edit period(s) (e.g. Continuous, daily, one time run):
Edit rate requested: Unknown. Edits as needed.
Already has a bot flag (Y/N): Yes
Function Details: Performs linksearches (i.e., checks whether a specified domain is linked from within Wikipedia) using Special:Linksearch, acting on requests received via IRC, and posts the results of the search to subpages of WP:WPSPAM. This is a clone of Wikipedia:Bots/Requests for approval/BetacommandBot 3; additional information (questions, comments, etc.) is there, but all pertinent information is here.
The following was added 01:13, 23 March 2007 (UTC): When a user initiates a linksearch from IRC, the bot queries Wikipedia using Special:Linksearch and compiles a list of pages containing the link. If the command passed to the bot is linksearch, the bot creates a page containing the links, such as this one, or updates it with the most current list if the page already exists.
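For illustration, here is a minimal sketch of what the linksearch step might look like in Python. The original bot likely scraped Special:Linksearch directly; this sketch instead uses the equivalent MediaWiki API module (list=exturlusage) as a stand-in, and the helper name, the requests library, and the duplicate filtering are assumptions rather than the bot's actual code.

# Illustrative sketch only: the real bot likely scraped Special:Linksearch;
# this stand-in uses the equivalent MediaWiki API module (list=exturlusage).
import requests

API = "https://en.wikipedia.org/w/api.php"

def linksearch(domain):
    """Hypothetical helper: return titles of pages that link to the given domain."""
    titles = []
    params = {
        "action": "query",
        "list": "exturlusage",
        "euquery": domain,        # the domain requested over IRC
        "eulimit": "500",
        "format": "json",
    }
    while True:
        data = requests.get(API, params=params, timeout=30).json()
        titles.extend(hit["title"] for hit in data["query"]["exturlusage"])
        if "continue" not in data:
            break
        params.update(data["continue"])    # follow the continuation token
    seen = set()                           # drop duplicates while keeping order
    return [t for t in titles if not (t in seen or seen.add(t))]

The resulting list would then be rendered as wikitext and saved to the relevant Wikipedia:WikiProject Spam/LinkSearch subpage, or used to overwrite an existing one.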
If the request is a linksearch2, the bot does the same search, but instead of creating a separate page it only logs the number of links here. It logs the count on a 'linksearch' as well.
I don't plan on using the next function a lot, if at all, but a linkgenupdate uses the entries on this page to update the count of pages containing each domain at Wikipedia:WikiProject Spam/Report. This function also updates the corresponding /Linksearch/<Site> page if the number of links within Wikipedia has changed for that domain.
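As a rough illustration of the count-refresh step, here is a sketch that rewrites a per-domain count line only when the count has changed. The '* domain - N links' line format and the function name are assumptions; the real report format may differ.

import re

def refresh_report_line(report_wikitext, domain, new_count):
    """Sketch: rewrite the '* <domain> - N links' entry if the count changed.
    The exact line format on the report page is an assumption."""
    pattern = re.compile(
        r"^(\*\s*" + re.escape(domain) + r"\s*-\s*)(\d+)(\s*links?)$",
        re.MULTILINE,
    )
    def repl(match):
        if int(match.group(2)) == new_count:
            return match.group(0)              # count unchanged: leave the line alone
        return f"{match.group(1)}{new_count}{match.group(3)}"
    return pattern.sub(repl, report_wikitext)

# Example: refresh_report_line("* example.com - 12 links", "example.com", 7)
# returns "* example.com - 7 links".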
The crosswiki command searches for a supplied domain link throughout the following wikis: en, de, ja, fr, pl, it, nl, es, pt, zh, ru, fi, no, he, and sco. The pages it finds are compiled into a list, and the result overwrites the list at this page, along with the username of whoever initiated the cross-wiki search.
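A sketch of how the crosswiki loop might look, using the wiki list given above. The API usage and report formatting are assumptions; the real bot may well have scraped Special:Linksearch on each wiki instead, and this sketch only fetches the first batch of results per wiki.

import requests

# Language editions named in the description above
WIKIS = ["en", "de", "ja", "fr", "pl", "it", "nl", "es", "pt",
         "zh", "ru", "fi", "no", "he", "sco"]

def crosswiki(domain, requested_by):
    """Hypothetical helper: search each listed wiki and build a wikitext report."""
    lines = [f"Cross-wiki search for {domain}, requested by {requested_by}:"]
    for lang in WIKIS:
        api = f"https://{lang}.wikipedia.org/w/api.php"
        params = {"action": "query", "list": "exturlusage",
                  "euquery": domain, "eulimit": "500", "format": "json"}
        data = requests.get(api, params=params, timeout=30).json()
        for hit in data["query"]["exturlusage"]:
            lines.append(f"* [[:{lang}:{hit['title']}]]")   # interwiki link to the page
    return "\n".join(lines)

The returned wikitext would then overwrite the existing cross-wiki results page.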
The blacklistupdate command is a "privileged" command (meaning only trusted users who have been added to the bot's list can execute it); it simply performs a standard 'linksearch' for each of the domain names stored in an array maintained by privileged users.
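For clarity, a small sketch of how the privileged-command gating might work. The nickname, command syntax, and the stub standing in for the linksearch routine are all placeholders, not the bot's actual configuration.

TRUSTED_USERS = {"ExampleTrustedUser"}   # placeholder nicks allowed to run privileged commands
WATCHLIST = []                           # domains queued by privileged users

def run_linksearch(domain):
    """Stub standing in for the linksearch routine sketched earlier."""
    print(f"linksearch for {domain}")

def handle_command(nick, line):
    """Dispatch an IRC command; only trusted users may touch the watchlist."""
    parts = line.split()
    command, args = parts[0], parts[1:]
    if command in ("watchlist", "blacklistupdate") and nick not in TRUSTED_USERS:
        return f"{nick}: this command is restricted to trusted users."
    if command == "watchlist":
        WATCHLIST.extend(args)           # trusted users add domains to the array
    elif command == "blacklistupdate":
        for domain in WATCHLIST:
            run_linksearch(domain)       # a standard linksearch for each stored domain
    return None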
The bot does not remove any links, nor does it have the capability to do so. It merely reports the links gathered by Special:Linksearch.
The following was added 01:57, 23 March 2007 (UTC): The command linkfilter will move entries with zero links on Wikipedia from Wikipedia:WikiProject Spam/LinkSearch/List to Wikipedia:WikiProject Spam/LinkSearch/Holding 1, the contents of Holding 1 will be moved to Wikipedia:WikiProject Spam/LinkSearch/Holding 2, and the contents of Holding 2 will be moved to Wikipedia:WikiProject Spam/LinkSearch/Old. No further moves occur. Every 24 hours the bot checks /List, /Holding 1, and /Holding 2. If a link has no instances on Wikipedia, it is shuffled along to the next page, with /Old as the dumping ground. However, if a link in Holding 1 or Holding 2 is found to have one or more instances on Wikipedia, it is returned to /List.
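To make the rotation described above concrete, here is a sketch in which each page is modelled as a plain list of domains and count_links stands in for a live linksearch; page parsing and saving are omitted, and all names here are illustrative.

def rotate(list_page, holding1, holding2, old_page, count_links):
    """Sketch of the daily shuffle: zero-link entries move one page along
    (/List -> /Holding 1 -> /Holding 2 -> /Old); entries in either holding
    page that have links again are returned to /List."""
    def split(entries):
        dead, live = [], []
        for domain in entries:
            (dead if count_links(domain) == 0 else live).append(domain)
        return dead, live

    dead_list, live_list = split(list_page)
    dead_h1, live_h1 = split(holding1)
    dead_h2, live_h2 = split(holding2)

    new_old = old_page + dead_h2                 # /Old is the final dumping ground
    new_h2 = dead_h1
    new_h1 = dead_list
    new_list = live_list + live_h1 + live_h2     # anything with links goes back to /List
    return new_list, new_h1, new_h2, new_old

# Example: rotate(["spam.example"], [], [], [], lambda d: 0)
# returns ([], ["spam.example"], [], []), i.e. after one pass the dead
# entry has moved from /List to /Holding 1.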
Discussion
Speedy approved Betacommand (talk • contribs • Bot) 02:50, 21 March 2007 (UTC)
Hold up. Please wait to run this bot until the other request is approved. Also, Betacommand, if you could use the correct templates listed at {{BAG Admin Tools}} it would help the BAGBot keep track of bot requests. And don't approve your own bots or clones of them before the original one is approved. —METS501 (talk) 03:11, 21 March 2007 (UTC)
- Then can someone approve /BetacommandBot 3, as it's been operational for a while, has had zero complaints, and there are no issues with how it operates? Betacommand (talk • contribs • Bot) 03:33, 21 March 2007 (UTC)
{{BotDenied}} See Wikipedia:Bots/Requests for approval/BetacommandBot 3. —METS501 (talk) 00:29, 22 March 2007 (UTC)
Application is being reopened. Mets, who closed the application, has agreed to this action by email. --kingboyk 20:23, 22 March 2007 (UTC)
I have added a more detailed description of the commands that directly interact with Wikipedia to try to further clarify what this code does; there are a few others in the code (such as adding to a temporary list of 'watchlisted' sites used for a blacklistupdate, or adding users to the list of trusted users), but they seem rather trivial at the moment. If it is desired, I will go back through the code and hash out what they do. Kyra~(talk) 01:13, 23 March 2007 (UTC)
- One more function added. Kyra~(talk) 01:57, 23 March 2007 (UTC)
- As a non-BAG member, I don't see any reason to deny this request; all Wikipedia edits are kept within WP:WPSPAM space. —— Eagle101 Need help? 02:53, 23 March 2007 (UTC)
- Is it possible to specify a maximum edit rate? While the edit rate can be unknown, the maximum edit rate should be known and controllable. I understood that the bot only uses Special:Linksearch when this is requested by a human user. Is that right?
- If so I don't have any objections. Sounds quite useful. Can you please list the bot under Wikipedia:WikiProject External links#Bots so the WikiProject External links can keep track of all bots working on external links? — Ocolon 08:01, 23 March 2007 (UTC)
- Is there any reason why we have both a Wikipedia:WikiProject Spam and a Wikipedia:WikiProject External links? --kingboyk 13:12, 23 March 2007 (UTC)
- I don't really think this belongs here and I am not familiar with the historical reasons of that. It seems they were created at the same time. However, there is (a) non-link spam and there are (b) link issues not connected to spam (e.g. dead links). — Ocolon 13:27, 23 March 2007 (UTC)
- Well, the only reason I ask is that we seem to have commentators here from 2 different WikiProjects. Perhaps you guys could work together, that's all. Any further discussion can indeed go elsewhere. --kingboyk 13:30, 23 March 2007 (UTC)
- Ocolon, at the moment there is no maximum edit throttle specified, but the code seems to have one built in. In the worst-case scenario (four users querying four different domains), I would guess that the maximum would be twelve edits per minute, although under normal operating conditions it is much less. An example of what I mean by the built-in limiter (the length it sleeps varies) is the following snippet from the console window after a page was saved:
Getting page Wikipedia:WikiProject Spam/LinkSearch/irishabroad.com
Sleeping for 8.5 seconds, 2007-03-23 17:27:31
Changing page en:Wikipedia:WikiProject Spam/LinkSearch/irishabroad.com
- And yes, the bot only queries Special:Linksearch on request from a human in IRC. I have also added the bot to WP:WPEL, per your request. Kyra~(talk) 03:00, 24 March 2007 (UTC)
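As a rough illustration of the kind of built-in limiter described above, here is a minimal sketch of a put throttle that sleeps for a randomised interval before each save. The interval bounds and class name are assumptions for illustration, not the bot's actual code.

import random
import time

class EditThrottle:
    """Minimal sketch: wait a randomised interval before each save so the bot
    stays within a few edits per minute (interval bounds are assumptions)."""
    def __init__(self, min_delay=5.0, max_delay=15.0):
        self.min_delay = min_delay
        self.max_delay = max_delay
        self._last_edit = 0.0

    def wait(self):
        delay = random.uniform(self.min_delay, self.max_delay)
        elapsed = time.time() - self._last_edit
        if elapsed < delay:
            remaining = delay - elapsed
            print(f"Sleeping for {remaining:.1f} seconds")
            time.sleep(remaining)
        self._last_edit = time.time()

# throttle = EditThrottle()
# throttle.wait()     # call before every page save
# save_page(...)      # hypothetical save call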
- Sounds good, thanks. — Ocolon 07:44, 25 March 2007 (UTC)
Approved for trial. You may run this task for up to 3 days while discussion continues. At the request of any admin or BAG member you must stop immediately. Data collected is not to be used at this stage by any automated or semi-automated process for link removal. --kingboyk 13:12, 23 March 2007 (UTC)
- The first edit in this trial was March 23, 17:46 UTC, so I will terminate the bot on March 26 and report back with the results. So far, it is running without incident. Kyra~(talk) 03:00, 24 March 2007 (UTC)
- Great. Good luck. --kingboyk 15:52, 25 March 2007 (UTC)
Approved. While the results have not been posted here, I've looked over them and checked a sample of them for accuracy, with no problems detected. It even filters out duplicates nicely. There have been no comments posted to this user's talk page about this bot being problematic either. This is a clone of Betacommand's bot, which was just speedily approved on the basis of this test and its own previous work. -- RM 12:44, 26 March 2007 (UTC)
- Yes, I'd been keeping an eye on contribs and talk too. Seems fine. --kingboyk 12:56, 26 March 2007 (UTC)
- The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made in a new section.