Wikipedia:Bot owners' noticeboard/Archive 3

From Wikipedia, the free encyclopedia

Archive This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.

HBC Archive Indexerbot bot flag

HBC Archive Indexerbot (HBCAI) (talk · contribs) automatically generates indexes of archived discussions. For an example, see Wikipedia talk:What Wikipedia is not/Archive index. The bot's user account is presently not flagged as a bot account. In the approval decision, Mets501 remarked that the bot did not need a flag. However, I'm thinking that should be reexamined. The bot is now generating quite a few indexes (over 100), and that is likely to grow. I first mentioned this to the bot's operator, Krellis, and s/he suggested I raise the question here. So here I am.  :) Thanks for listening! —DragonHawk (talk|hist) 21:29, 31 December 2007 (UTC)

For what it's worth, I don't mind either way, as the current operator/maintainer of the bot - given the number of indexes it's generating, and the fact that it is increasing (slowly but surely) all the time, I'd say it probably does make sense to have a flag, to avoid cluttering recent changes. —Krellis (Talk) 23:47, 31 December 2007 (UTC)
I'm guessing the only reason it was not flagged was that it had a low number of edits, so a flag wasn't really deemed necessary. That said, at the moment we seem to flag anything that is a bot, no matter how low the edit count, so yes, it should be flagged. -- maelgwn - talk 01:27, 1 January 2008 (UTC)
Flagged. WjBscribe 17:49, 10 January 2008 (UTC)
Thanks, WJBscribe! —DragonHawk (talk|hist) 19:31, 10 January 2008 (UTC)
Thanks! —Krellis (Talk) 21:05, 10 January 2008 (UTC)

BetacommandBot using wrong tags

What can be done about BetacommandBot using the wrong tag when tagging image files that do not have the required FUR? He is continuing to use the following tag on pages that have no rationale at all. It is confusing to other users.

{{di-disputed fair use rationale|concern=invalid rationale per [[WP:NFCC#10c]] The name of each article in which fair use is claimed for the item, and a separate fair-use rationale for each use of the item, as explained at [[Wikipedia:Non-free use rationale guideline]]. The rationale is presented in clear, plain language, and is relevant to each use.|date=January 2 2008}}

And using the following edit summary: tagging as invalid rationale per WP:NONFREE

See: http://en.wikipedia.org/w/index.php?title=Image:Parque.JPG&diff=prev&oldid=181652274

This is just not the best way to deal with the problem of missing FURs.

Is an edit rate of over 700/minute considered acceptable for a bot? Most of his requests do not indicate any requested edit rate, and the one statement I did find, when someone asked about it, was:

"as all my previous tasks and for the edit limit it will be 10-15, per minute same as with all my tasks. I thought this was known all ready as this is only my 8th bot request that you guys have handled." Betacommand (talk • contribs • Bot) 03:24, 5 May 2007 (UTC)

Though I did find some that were marked Max delay=5, Maxlag=5.

And of course, it is nearly impossible to tell anything from his bot user page, as it is only a link to a bunch of requests-for-approval pages that do not clearly match up with the current edits he has been doing. He seems to believe that if another bot has been approved to do something, he can also do it without requesting permission or even bothering to note it on his user page.

But then again, I guess BetacommandBot is considered a super bot that does not have to follow the rules that apply to more lowly bots. Dbiel (Talk) 03:02, 10 January 2008 (UTC)

700 edits per minute? That is a super bot! BJTalk 03:05, 10 January 2008 (UTC)
Dbiel, I follow maxlag, which is not max delay; there is a major difference. I know our bot policy better than most. I also know what the servers can take, how to operate with them, and how to prevent my bot from causing problems. I use a setting of maxlag=5, which is better on the servers than any hard-coded edits-per-minute rate: when the servers have a high load, BCBot does not edit. So unless you know what you're talking about, please stop trolling and shut up. βcommand 04:20, 10 January 2008 (UTC)
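For readers unfamiliar with how maxlag differs from a fixed delay, here is a minimal sketch of a maxlag-aware client. It is illustrative only, not BetacommandBot's actual code: `do_request` and `get_lag` are hypothetical stand-ins for an edit call and for the replication-lag value the MediaWiki API reports when a request is refused for exceeding maxlag.

```python
import time

MAXLAG = 5  # seconds of replication lag the client will tolerate

def request_with_maxlag(do_request, get_lag, maxlag=MAXLAG, retry_wait=5, max_retries=3):
    """Issue a MediaWiki-style request, backing off while server lag exceeds maxlag.

    do_request performs the actual edit; get_lag reports current server
    replication lag. Both are injected here so the sketch can run without
    a live wiki. Unlike a fixed edits-per-minute throttle, this edits at
    full speed when the servers are healthy and stops entirely when they
    are overloaded.
    """
    for _ in range(max_retries):
        if get_lag() <= maxlag:
            return do_request()        # servers healthy: go ahead
        time.sleep(retry_wait)         # servers lagged: wait, do not edit
    return None                        # give up; the servers stayed too loaded
```

The point of the design is that the server, not the bot operator, decides when editing is too expensive.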
Sorry about the typo, and I have no doubt that you fully understand the policy better than most, but my point is that you push the policy to extremes rather than setting the best example of following it. Granted, maxlag is better, but that is not what you requested in the majority of your requests, and that is not what you have stated on your user page. You get an approval for one task and then apply that approval to other tasks, and fail even to note that fact on your user page or request that the change apply to your other tasks as well. You call me a troll, fine, but you also thanked me for pointing out a bug in your bot. Then, as you have previously stated, you follow WP:IAR, yet at the same time enjoy enforcing the letter of the rules on images as related to FURs while tagging them with misleading tags. Instead of tagging images with no FUR with {{nrd}} as stated in your request, you use the misleading tag indicated above. Just another case of saying one thing and doing something else. Dbiel (Talk) 15:35, 10 January 2008 (UTC)
When I filed the previous requests, maxlag did not exist; instead of filing a new request, I just enabled a better system of doing the same thing. IAR has uses and limitations. I do not state any kind of edit rate on the bot user page. I was approved for tagging images with no valid rationale, and that is all the bot does. βcommand 16:39, 10 January 2008 (UTC)
Interesting. You say that you do not state any kind of edit rate on the bot user page, yet you claim you state what the bot is doing on your user page as required by the guidelines; however, the only reference to the bot's functions is a link labeled "tasks", which is only a list of bot approval requests that do state specific edit rates. So if one is to say that the link meets the requirement of identifying what the bot is doing on the bot user page (which I do not believe it does), then it must be said that the edit rates are also identified on the bot user page. You can't have it both ways. Dbiel (Talk) 19:56, 10 January 2008 (UTC)
Those BRFA's give EPM rates and maxlag=5 as edit rates. Bot policy does not require you to give the rate that you plan on editing. βcommand 22:25, 10 January 2008 (UTC)
My bot also runs at a different edit rate than I requested in the BRfA and I don't list it on my bot's talk page. Are you honestly suggesting that every change made to approved bots needs a new BRfA? BJTalk 23:11, 10 January 2008 (UTC)
No, I am not suggesting that every change needs a new BRfA. But I am suggesting that the basic policies found at Wikipedia:Bot policy be followed. And yes, changes to the bot should be reflected on the bot's user page. Why link to a bot approval page as the only record of what the bot is doing if it is no longer accurate? If you want an example of what I would expect to see, take a look at User:Polbot

<blockquote>

The bot account's user page should identify the bot as such using the {{bot}} tag. The following information should be provided on, or linked from, both the bot account's userpage and the approval request:

  • Details of the bot's task, or tasks
  • Whether the bot is manually assisted, or runs automatically
  • When it operates (continuously, intermittently, or at specified intervals), and at what rate
  • The language and/or program that it is running

</blockquote>

If not, why not simply change the policy to say that it only applies to new bots, and that previously approved bots can make changes at will without any notice or record of what changes have been made? Dbiel (Talk) 23:48, 10 January 2008 (UTC)
I don't see any rate on that page. BJTalk 00:27, 11 January 2008 (UTC)
I am sorry that I did not make it clear enough; see User:Polbot/older tasks. The single task on the main page is missing the edit rate, but it is also hung up in BAG. It does include the other key elements. Also please note that the other tasks were on the main page until a short time ago. Dbiel (Talk) 03:11, 11 January 2008 (UTC)
Dbiel, like I said, stop trolling. "When it operates" is not its edit rate. I cover that on the user page: BetacommandBot will be an ongoing bot, run whenever I can run it, or feel like running it. Like I said before, you don't understand the bot policy, so don't attempt to force it on others unless you know it. βcommand 02:06, 11 January 2008 (UTC)
Apparently I continue to fail to communicate in a way that lets you understand what I am trying to say. The edit rate was not the issue. The last point I was making was simply in reply to your statement that "it is not on my user page", while also saying that the link to the approval requests is all that is needed for a user to understand what your bot is doing, and that it meets the requirement that the information be linked from the user page; in which case the edit rate IS on your user page, via the link. But again, this is not the main point. The main point is that no one can look at your user page and figure out what the bot is doing. But it seems that no one else cares, so I will drop the issue.

On a separate point, I still think it is wrong for you to put a 7-day delete tag on images that are used properly under fair use rules but lack the technically required FUR, which is then followed up by an administrator who blindly deletes the images based solely on your tag. For an example, see Image:Parque.JPG, where an admin made deletions on 70+ pages in less than one minute using TW, or Image:NforDisco.jpg, where the deletion record reads 17:48, January 10, 2008 East718 (Talk | contribs) deleted "Image:NforDisco.jpg" ‎ (CSD I7: Bad justification given for fair use and the uploader was notified more than 48 hours ago), where the image was uploaded over a year ago and the last edit to the page using it was on Nov 11, 2007. A 7-day notice posted on the article page just is not enough. Since nobody seems to care, I guess I might as well give up. The result is going to be that a lot of pages that are using fair use images correctly will find their images deleted, which is a shame, as it is not that hard to generate the required FUR, but it is time-consuming. I am slowly reverting all of your Polbot reverts that you marked as vandalism, as so far I have not found one reverted on any valid grounds other than that it was made by an unauthorized bot. Dbiel (Talk) 03:11, 11 January 2008 (UTC)

Semi-automated tagging of Shared IP Addresses

Betacommand has requested that I post for community discussion about a bot that I just proposed, called IPTaggerBot. If you are interested in commenting on the subject, please review the bot approval request at Wikipedia:Bots/Requests for approval/IPTaggerBot as well as the bot's userpage at User:IPTaggerBot. Thank you. Ioeth (talk contribs friendly) 17:02, 24 January 2008 (UTC)

Bot tag

Perusing the rollback logs, I see User:VoABot II has rollback but not a Bot membership and User:VoABot III has nothing. Shouldn't they both have the bot flag to keep their edits from cluttering recent changes? MBisanz talk 04:11, 27 January 2008 (UTC)

Anti-vandalism bots are not normally flagged. βcommand 05:14, 27 January 2008 (UTC)

Please block PipepBot as out-of-control bot

PipepBot (talkcontribs) is broken and is removing lots of valid interlanguage links, e.g. [1] [2][3][4][5][6] [7][8][9][10] (there are many more examples). It is also moving existing interlanguage links around (out of alphabetic order) for no good reason, e.g. [11]. This is causing disruption. The bot owner has been notified of these concerns [2], but I am suggesting a temporary block to prevent the bot causing further unnecessary disruption. - Neparis (talk) 19:09, 27 January 2008 (UTC)

I think that you should ask at WP:AN/I for a temporary block, not here. NicDumZ ~ 19:14, 27 January 2008 (UTC)
Actually, I copied your message over there. NicDumZ ~ 19:21, 27 January 2008 (UTC)
Thanks, hope it gets blocked soon. - Neparis (talk) 19:24, 27 January 2008 (UTC)
(cross-posted from WP:ANI [12]) The bot is still operating across other wikis, e.g. fr.wiki, de.wiki, it.wiki (probably more wikis too). It is removing valid interlanguage links there too. I presume it cannot be blocked by admins on en-wiki. Is there a central cross-wiki noticeboard for reporting a bot that is misbehaving across multiple wikis? (rather than making multiple reports to different wikis) - Neparis (talk) 19:40, 27 January 2008 (UTC)
The bot is very active and very out of control on multiple wikis. Is anybody around to give advice on how to handle this? - Neparis (talk) 20:02, 27 January 2008 (UTC)
Ok, this bot is not out of control. The user is fixing interwiki conflicts. Please unblock this bot. Nothing wrong with these edits:
Looks like Neparis owes someone an apology - multichill (talk) 23:11, 27 January 2008 (UTC)
Well, as far as I know, fr:Ville is the translation of City, even if it is also the meaning of Town. interwiki.py usually doesn't remove "controversial" interwikis like these unless the -force option is activated. It should not. NicDumZ ~ 23:16, 27 January 2008 (UTC)
City/town is a mess, probably unfixable in the current interwiki system. But you're wrong about the -force option. I happen to run an interwiki bot myself and I never use the -force option. I do, however, fix interwiki conflicts every once in a while. This means I pick a page and run the bot without the -autonomous option (and without the -force option). The bot asks me a lot of questions and in the end adds and removes a lot of links. Looks like Pipep did the same. multichill (talk) 23:23, 27 January 2008 (UTC)
I also run a bot by myself... I know how the script works. And I don't think he was online, because he hasn't answered on his talkpage... NicDumZ ~ 23:34, 27 January 2008 (UTC)
(this hopping back and forth between WP:ANI and WP:BON is confusing :-); I just replied on ANI — can we stay over there for followups please?):
Well, the ones that really caught my eye were the interlanguage link removals for dioxin.[23] I just reviewed them again and at least some of them still look like they might be considered at least somewhat controversial link removals. I could be wrong about it, but some wikis (e.g. Danish) seem to me to have an article on dioxin, but not yet an article on polychlorinated dioxins, which is a specific type of dioxin. In such a case, having interlanguage links to dioxin, as the general term, seems quite useful to me. User:Blech from de-wiki has told the bot owner that most of the interwiki links that the bot removed were correct and that he has reverted the bot.[2] I have not checked any of the other examples in detail, but I had a quick look at one of them — the aerosol link removals.[24] Particulates are a cause of aerosols, and, though I may well be wrong about it, some wikis (e.g. French) seem to have an article on the latter but not the former, so, in such a case, having the interlanguage links, e.g. to fr:Aérosol, seems quite useful to me. I am acting in good faith here, and if I have made a mistake I will certainly say sorry to the bot owner. Please let me know your thoughts — I can take a wikitrout or two. - Neparis (talk) 00:26, 28 January 2008 (UTC)
Hello Neparis, I am the operator of PipepBot. In my edits I follow the policy in Help:Interlanguage links#Bots and links to and from a section, which says "The activity of the bots also requires that interlanguage links are only put from an article to an article covering the same subject, not more and not less." It is not always easy to find the right interlinks, but until now PipepBot has done thousands of edits on en:wikipedia and many thousands on other wikipedias, most of them in manual mode, and I have never had problems. Of course it is possible that some edits are not optimal, but I am open to discussion. --Pipep (talk) 20:44, 28 January 2008 (UTC)
NicDumZ, with bot I meant interwiki.py. You don't seem to understand that. How can a bot fix an interwiki conflict without human intervention? It can't; a user has to be present. The -force option only removes links to non-existent pages, -autonomous just skips them, and in normal mode the bot asks you if you're sure you want to remove a link. So please stop about the -force option.
Neparis, you can't have interwikis to multiple articles in the same language; that's an interwiki conflict. You have to keep the sets separated. So en:polychlorinated dioxins -> <some language>:dioxin -> en:dioxin is an interwiki conflict. You have to create two sets: one set of articles linking to dioxin and one set of articles linking to polychlorinated dioxins. One link between these two sets and you get a conflict. multichill (talk) 21:11, 28 January 2008 (UTC)
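The set rule described above (a linked set of pages may contain at most one article per language) can be sketched as a small check. The function name and the (language, title) pair representation are illustrative, not pywikipedia's real API:

```python
def find_interwiki_conflicts(links):
    """Return the languages that appear more than once in a set of
    interlanguage links. Each duplicate language is an interwiki
    conflict that a human has to resolve by splitting the set.

    links: iterable of (language, title) pairs gathered by following
    interlanguage links from some starting article.
    """
    by_lang = {}
    for lang, title in links:
        by_lang.setdefault(lang, set()).add(title)
    # a conflict is any language with two or more distinct titles
    return {lang: titles for lang, titles in by_lang.items() if len(titles) > 1}
```

Run on the dioxin example, the English set {Dioxin, Polychlorinated dioxins} would be reported as a conflict, which is why a bot in autonomous mode is supposed to skip such pages rather than pick one.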
Multichill, please, stop this. I know how pywikipedia works. I submit patches over there, and I'm also a regular user of interwiki.py, whatever you may think.
My point was: a bot owner should check his talk page, whatever happens. He obviously was not doing so, and his bot made controversial changes. I moved the request to AN/I to request a protective temporary block. That's it. Period.
NicDumZ ~ 22:28, 28 January 2008 (UTC)
I was obviously present while solving interwiki conflicts. It is impossible to solve interwiki conflicts without being present, even if I had enabled the -force option. At the time the bot was blocked, the bot was offline, and so was I. --Pipep (talk) 19:17, 29 January 2008 (UTC)

Cleaning up the bot status page.

Would anyone mind if I cleaned up Wikipedia:Bots/Status? Three things I'd like to do, for easier readability:

  • Archive the bots on that list which are currently discontinued
  • Add the BRFA link, since several of them don't have it
  • Expand the "purpose" section, since some of them are empty or have ambiguous descriptions like "Various tasks"

  Zenwhat (talk) 06:13, 29 January 2008 (UTC)

As an example of why this needs to be done: User:Polbot was approved to do minor tasks, like wikifying data for U.S. politicians [25] and removing piped links on disambiguation pages [26]. In practice, the bot had been automatically generating fair-use rationales for tons and tons of images, something the operator hadn't gotten approval for. When this was discovered, the bot was temporarily blocked. [27][28]

Now, let's say hypothetically this user decided to act in bad-faith. They aren't, I don't think -- they seem like a good user, but it is still possible.

A newbie happens to see this bot making changes to fair-use rationales, wondering, "How strange! I wonder if this bot is approved!" They go to the bot status page, and it says it's active and approved for "various -- see bot's user page." But then, if you look at the bot's user page, the bot is no longer active at the moment, and it hasn't been approved for all the stuff it's been doing. Records like this should be kept in good order.

One might also check to see if the list matches up with the list of users flagged as bots. [29] Also, Category:Wikipedia bots and Wikipedia:Registered bots should have a link to Wikipedia:Bots/Status to avoid confusion like this. [30] Almost a year ago, Betacommand said he'd merge the two pages. [31]

  Zenwhat (talk) 06:31, 29 January 2008 (UTC)

I totally support record cleanup :) -- SatyrTN (talk / contribs) 05:29, 31 January 2008 (UTC)

Free bandwidth.

I have a pretty decent connection on broadband and I have two computers (a little wannabe\mini-workstation). If any bot-owners here need somebody to run a bot for them, I'd be willing to do it either my main PC (Windows Vista) or my secondary PC (Ubuntu Linux, but I can wipe it clean and install any version of Linux you like, so long as you can walk me through setting up the KDE/GNOME GUI). Preferably, I'd want it to be on my secondary PC, the Linux box, since I rarely use it.   Zenwhat (talk) 23:26, 30 January 2008 (UTC)

what are the system stats? βcommand 23:36, 30 January 2008 (UTC)

Main PC (32-bit Windows Vista)

  • CPU: AMD Athlon 64, 3400+, 2.20 GHz
  • RAM: 3 GB
  • HDs: Dual HDs (Master 150 GB, Slave 75 GB), each currently with about 4 GB of free space (but I could easily free up a lot more if you need it)

Secondary PC (Ubuntu Linux)

  • RAM: 775 MB
  • CPU: Intel Celeron, 1.70 GHz
  • HDs: Single HD, 30 GB, 22 GB of free space.

Speedtest from Speedtest.net:

  • DL: 14076 kb/s
  • UL: 1537 kb/s

Also, I'm mostly pretty good about keeping my computers clean of spyware and viruses, not the typical end-user "omfg teh bad man haxed me, help me AOL tech supp!!"   Zenwhat (talk) 01:52, 31 January 2008 (UTC)

I'm sorry, but that is not enough RAM for my bots. βcommand 01:56, 31 January 2008 (UTC)
Beta needs more pickles. BJTalk 02:45, 31 January 2008 (UTC)
(EC) Betacommand, would your bot benefit from a clustered environment? I'm still waiting on a few parts to come in to complete it, and I need to get off my butt and finish setting it up, but I should have my OpenSSI cluster up and running soon. I would suppose that the biggest bottleneck will be my uplink (I've only got 1 Mbps up x 10 Mbps down or so), but if you've got a task that'll need more CPU than bandwidth, I might be able to help with that soon. It'd have to be a threaded app, however, to take advantage of it (depending on how the hardware works, it should be between 12 and 48 machines, all P3-600s, w/256 MB RAM; due to power costs, however, I don't plan to run it 24x7). SQLQuery me! 02:54, 31 January 2008 (UTC)

3 GB is not enough RAM? How much do you generally use?   Zenwhat (talk) 02:54, 31 January 2008 (UTC)

On a Vista box, 3 GB RAM usually means about ~1536 MB of usable RAM; toss in the user doing anything else and that cuts it down even more, and then there's only a 2.2 GHz processor.... I'll take my 8 GB RAM Linux toolserver any day. βcommand 03:43, 31 January 2008 (UTC)
If I had to guess, I'd assume the vista box would be less than usable for BCBot (is py available on win32/64? I haven't used windows in a long long time) SQLQuery me! 02:56, 31 January 2008 (UTC)
Yes, Python is available for everything, even toasters! BJTalk 02:59, 31 January 2008 (UTC)
Heh, learn something new every day :) My experience with *nix tools in a windows environment has always been "It works... Technically" :P But, my last real windows encounter was with a Windows 2000 machine :) SQLQuery me! 03:02, 31 January 2008 (UTC)
Python is supported well, unlike most other unix tools. BJTalk 03:08, 31 January 2008 (UTC)

BAG confirmation running

Just a quick note that there's a WP:BAG confirmation (from the trial membership) of myself at Wikipedia talk:Bots/Approvals group#Confirmation under the old system (Snowolf). As has been required in the past, I'm posting this notice on WP:AN, WP:BOWN, WP:BRFA & WP:VP. Snowolf How can I help? 15:54, 1 February 2008 (UTC) Changed link based on change to that page Martinp23

Similarly there is a confirmation running for Cobi (talk · contribs) at the same location. Martinp23 18:28, 1 February 2008 (UTC)
As well as Dreamafter (talk · contribs). ~ Dreamy § 21:23, 1 February 2008 (UTC)

Bot to track bots?

I came across another unapproved bot, which has been uploading articles on obscure European athletes to Wikipedia. [32] This is the problem with having bad records of bots: folks don't seem to be keeping very good track of them, which is a very bad thing considering how much damage bots can do and how difficult it is to undo.

So, here's an idea for a bot:

  • Check a random user's contribs.
    • If they are uploading stuff at the rate that bots do (1 contribution a minute or more, for several hours at a time), check their name against the bot status page.
    • If it isn't there, send them a warning.
  • Repeat the same process all over again

Anybody willing and able to do this would be appreciated and it seems to be of critical importance.   Zenwhat (talk) 00:39, 2 February 2008 (UTC)
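The proposal above never settled on exact numbers, so as a hypothetical sketch only (the function name, thresholds, and whitelist handling are all invented for illustration), the heuristic might look like:

```python
def flag_suspected_bot(user, timestamps, whitelist=frozenset(),
                       rate_per_min=1.0, sustained_hours=3):
    """Flag a user who sustains at least rate_per_min edits per minute
    for sustained_hours straight, unless they are whitelisted.

    timestamps: epoch seconds of the user's contributions, sorted
    ascending. This only flags; a human should review before warning,
    given the risk of false positives (AWB users, fast RC patrollers).
    """
    if user in whitelist or len(timestamps) < 2:
        return False
    window = sustained_hours * 3600                  # window length in seconds
    needed = int(rate_per_min * sustained_hours * 60)  # edits to fill the window
    start = 0
    for end in range(len(timestamps)):
        # shrink the window from the left until it spans <= window seconds
        while timestamps[end] - timestamps[start] > window:
            start += 1
        if end - start + 1 >= needed:
            return True
    return False
```

As MBisanz notes below, a fixed threshold like this is easy for a smart vandal to game, which is one argument for running several different filters over the database dumps instead.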

Do you mean more than 1 edit per minute, or more than 1 new article per minute? I can envision many users in the sciences who upload an article a minute for long periods of time (Blofeld is another example), and there are probably many more users with AWB or other scripts adding tags, cats, and project boxes to articles at a similar rate. How would we screen them out? A whitelist, maybe? MBisanz talk 01:54, 2 February 2008 (UTC)
If an editor is a bot, how would sending them an automated talk page message do anything? High-speed editing isn't always a bad thing, and it doesn't imply that the user is a bot. Some semi-automated scripts can make a couple of edits a minute. GracenotesT § 01:57, 2 February 2008 (UTC)

Yeah, MBisanz, a whitelist. Or you could tighten up the algorithm even further, to require multiple edits a minute.

Keep in mind, I'm talking about 1 edit a minute for several hours straight. We're human beings, so that shouldn't be possible unless Blofeld is being fed by intravenous fluid and using a bedpan or empty bottle to pee. Gracenotes: in the case above, the user seemed to be using his regular account as a bot, and that's what a fair number of users do. Even if an editor is using a secondary account as a bot, he should see the talk-page message after the bot is blocked.

Also, the diff above was incorrect!

I accidentally posted the message on the wrong talkpage, rofl. Not even the right user's talkpage, but Talk:Bitburg.

I changed the diff above to this one. [33] I also removed my comments from Talk:Bitburg.

It's User:Markussep. Look at his contribs.   Zenwhat (talk) 05:40, 2 February 2008 (UTC)

Did WP:BOT change recently? I seem to remember at some point that any user could run a bot on their own account, so long as it wasn't disruptive (flooding, etc) and that they took personal responsibility for its edits and couldn't have them excluded from RC via the BOT flag. MBisanz talk 05:49, 2 February 2008 (UTC)
Also on the issue at hand, I'd say it should track and output the names to some sort of noticeboard/editabuse page, but not warn the user. Given the possibility of false positives, a human editor should review/investigate. MBisanz talk 05:55, 2 February 2008 (UTC)

MBisanz, WP:BOT says all bots must be approved. Also, about your suggestion: Your idea isn't exclusive to mine. The algorithm could do both. Obviously, if somebody is editing at a rate of 1 edit a minute for weeks on end, they're using a bot. The specific issues of when and how to warn the user or post it to a noticeboard would probably best be worked out when the bot is tested.   Zenwhat (talk) 08:49, 2 February 2008 (UTC)

WP:IAR has been used, with community approval, for behavior that's against the word of WP:BOT. The only reason the rate limitation is in place is to make sure someone doesn't cause too much damage – and if they do, it's their responsibility to fix it. So long as a task is desired and implemented correctly, it should be fine for someone to run it as an automated task at a reasonable rate (although semi-automated is better, imho). A bot to track who reverted more than three times on one page in an interval of 24 hours was recently on BRFA, but did not pass; there were concerns admins reviewing a situation and jumping in the middle of it might make an incorrect assumption about the reverter and make an inappropriate block. A noticeboard for semi-automated tasks might be useful for preventing abuse, since someone wishing to run such a task can simply say "I plan on doing so-and-so at so-and-so a time", and if no one objects, he/she could run it.
To make these edits (warning: long page), I did not use a bot; in fact, I can show you screenshots of the semi-automated software I used (the screenshots were created when I was outright accused of using a bot, several weeks after the incident). The only interaction I received from the incident was a minor barnstar, initially given to someone else who was using the same script. If this "bot"-tracking bot existed, would I instead be warned? I dunno... GracenotesT § 17:44, 2 February 2008 (UTC)
When I revert vandalism, I'm a lot faster. I dislike this whole idea. If you want, you can take the dumps and analyse them, as has been done from time to time (I remember Dragons flight's report on adminbots some time ago). I don't see any use in it; unauthorized bots are usually blocked only if they are disruptive or controversial. Usually, when somebody is running a bot on his main account, a polite suggestion may be used. Snowolf How can I help? 18:32, 2 February 2008 (UTC)
I think a non-bot analyzer script might be more effective than a bot. A smart vandal will learn what timings this bot operates on (say, an edit a minute for more than 6 hours) and just change their program to edit only for 5 hours straight and then take an hour off. On the other hand, an analyzer could be run with various filters that would pick up a larger variety of forbidden bots. MBisanz talk 18:52, 2 February 2008 (UTC)
This is the sort of example of an unapproved bot that has community backing and may or may not be picked up by an automated checker: User_talk:Misza13/Archives/2007/02#admin_actions_bot (and the weird WP:BOT reference I had stuck in my head). I'd say a human user would need to review all warnings, since there is so much variety in scripts, unapproved bots with community consensus, fast editors, smart vandals, etc. MBisanz talk 18:59, 2 February 2008 (UTC)

Who's been doing this?

Resolved.

See this. [34] Did anyone approve this? And if they didn't, who is doing this?

Now, again, it's clear why it's so critical to have very neat records of bots that are updated and regularly checked for inaccuracies, and that such inaccuracies are investigated.   Zenwhat (talk) 11:06, 4 February 2008 (UTC)

That has nothing to do with bot rights; anyone could run a script under their main account to make 2000 edits. The bot flag is irrelevant. — Carl (CBM · talk) 14:46, 4 February 2008 (UTC)
FYI, this is being discussed on Wikipedia:Administrators' noticeboard#Adding useless revisions to pages to make them undeletable.--Dycedarg ж 20:39, 4 February 2008 (UTC)
Tim has already acted by blocking BCBot. Snowolf How can I help? 21:08, 4 February 2008 (UTC)

BAG or the bureaucrats should probably issue a formal warning to Betacommand - it seems the ArbCom case hasn't taught him not to mess with unapproved bots. 1200 edits to even a userspace page is a big enough deal that it should be approved beforehand. There is probably a topic on this at WT:BAG, so this can be a cross-posting. Go read, I suppose. --uǝʌǝsʎʇɹnoɟʇs(st47) 20:59, 6 February 2008 (UTC)

User:BoxCrawler

Resolved.

I've blocked it indefinitely for the time being, pending discussion and resolution.

It should not be making edits such as this [35], which is nothing more than removing whitespace. [36] is bad enough; however, if it is going to do that, it shouldn't be doing just that as a standalone edit (it makes no difference to the template/categorisation).

Reedy Boy 18:00, 14 February 2008 (UTC)

Fully agree. βcommand 18:05, 14 February 2008 (UTC)
I'm sure you've checked with the owner, right? My bot has often done that when I'm in the middle of testing. Usually to no more than 10 articles, but it has happened :) -- SatyrTN (talk / contribs) 18:31, 14 February 2008 (UTC)
Per User_talk:Alanbly#User:BoxCrawler, it appears to be known and have been an issue for at least 3 days. MBisanz talk 18:42, 14 February 2008 (UTC)
Geez - when my bot's making stupid mistakes like that, I don't let it run three hours, much less three days! Thanks for checking! :) -- SatyrTN (talk / contribs) 18:47, 14 February 2008 (UTC)
Yeah, the owner's page was a good confirmation of my action. Reedy Boy 19:18, 14 February 2008 (UTC)
OK, I'm fine with toning the bot down. What is enough for an edit? How incorrect does the format have to be? The bot works the way I designed it; I just didn't think it was this much of an issue. I don't know of any policy against bots cleaning up whitespace and reformatting a template, so I went for perfect formatting. I'm perfectly fine with changing the bot; I just need some guidance. Adam McCormick (talk) 00:57, 15 February 2008 (UTC)
From WP:AWB's instructions: "Avoid making insignificant or inconsequential edits such as only adding or removing some white space, moving a stub tag, converting some HTML to Unicode, removing underscores from links (unless they are bad links), or something equally trivial." Basically, in my hand editing, there needs to be at least a spelling correction or major reformat of text to make it worth it to save. Things like a reflist substitution, spacing, or heading character spacing aren't significant enough to warrant an edit. MBisanz talk 01:09, 15 February 2008 (UTC)
Ok, I'm not running AWB, so that's not where I was looking. My bot is completely automated once I start it. These are the kind of edits I would make myself if I ran across them, so I didn't see a problem. I still need to know how much of an edit. Does fixing internal spacing count? Capitalization of inputs? Changing to avoid redirects? Removing duplicate parameters? I guess I'm just asking if there is a line in the sand I can draw so I can fix this. Adam McCormick (talk) 01:16, 15 February 2008 (UTC)
Duplicate parameters, yes. Changing to avoid redirects I would say is ok. Where it is just capitalising parameters, or removing/adding a line of whitespace, that is where the problem lies. Reedy Boy 18:31, 15 February 2008 (UTC)
I think a bot shouldn't edit to avoid redirects at all, per Wikipedia:Redirect#Do not change links to redirects that are not broken. -- Jitse Niesen (talk) 19:53, 15 February 2008 (UTC)
And respond to inquiries about the bot. I asked about this on 12 February and got no reply. Gimmetrow 20:38, 15 February 2008 (UTC)
I apologize that I missed your comment. I didn't notice your signature and lumped it with the comment below it. Adam McCormick (talk) 23:38, 15 February 2008 (UTC)

(unindent) Ok, I have made edits to the bot so that it disregards spaces and caps. Is it OK if it changes these things any time it does make an edit (for other reasons)? Is there anything else, or can my bot start running? Adam McCormick (talk) 23:52, 15 February 2008 (UTC)

Yes, if the bot identifies a genuinely useful edit, it's OK to combine other edits. Gimmetrow 23:57, 15 February 2008 (UTC)
Alright, I've changed the bot to correct all this, can it be unblocked now? Adam McCormick (talk) 00:22, 17 February 2008 (UTC)
You still need to have a WP:BRFA. As I understand this, it is an unapproved fully automated bot; it must be approved before it is unblocked. --Chris 00:36, 17 February 2008 (UTC)
Ok I've seen the original brfa now, but you will still need approval if the edits aren't combined with adding the infobox or whatever it does --Chris 00:41, 17 February 2008 (UTC)
The bot was approved to edit the {{WPSchools}} template. That's all I'm doing. The other edits support this activity either by cleaning up after former bot errors or by making the template easier to read (the bot only edits templates with certain factors). They are not drastically different from its initial request. Adam McCormick (talk) 01:02, 17 February 2008 (UTC)
Please unblock my bot or give me reason why not. I have complied with this discussion and would like to complete the run (with changes in place of course). Adam McCormick (talk) 00:03, 19 February 2008 (UTC)
It looks like they want you to do another BRFA. It's important to do so because it details EXACTLY what tasks the bot will perform. (Just my 2 cents) Compwhiz II(Talk)(Contribs) 02:17, 19 February 2008 (UTC)
I could do that, but it would consist mostly of reposting the original. I haven't changed what this bot does (or have reverted all changes); it shouldn't be this big a deal to get the bot unblocked. Adam McCormick (talk) 02:21, 19 February 2008 (UTC)
Please, unblock my bot. Adam McCormick (talk) 03:09, 22 February 2008 (UTC)
Unblocked, seeing as no one has brought this to BAG and the issues were minor. -- SatyrTN (talk / contribs) 05:44, 22 February 2008 (UTC)

Betacommandbot

Regarding the Feb 13-14 spree of around 20,000 articles tagged for speedy deletion, I have attempted to initiate conversation with betacommand over possible glitches in his bot. It seems that it often fails to notify either the image uploader or the article talkpage. All those images will be up for deletion tomorrow and I'm certain that some "helpful" admins will burn through the lot in no time. I think that it's out of order that images can be deleted in this manner and I attempted to discuss the issue with Betacommand here, but with this edit he moved the conversation to an obscure location on his talkpage where his responses were "that page is full of lies and bullshit and is a complete farce and it will not affect how I operate BCBot" and "as for the other images uploaded by English peasant, that was caused by a user re-name while the bot was running." I don't feel that this adequately addresses my concerns.

Also, the state he leaves inactive users' talkpages in is disgraceful [37] (532,500k and growing), and the vast majority of the messages concern images that are only lacking a backlink to the article.

The vindictive attitude towards a vocal critic here and here is also way out of line. English peasant 11:51, 18 February 2008 (UTC)

Deletion bots and transparency

Please read Wikipedia:Bot_requests#Image_deletion_bot. BetaCommand is claiming that deletion bots are being run de facto covertly. If this is truly the case, I encourage anyone doing this to come clean. It's entirely unacceptable, not to mention entirely unnecessary. The unease of the community with admin bots is in large part due to the abuse of bot operators who run unauthorized bots because they believe the rest of the community shouldn't have jack to say about their fantastic bot. Pascal.Tesson (talk) 17:18, 2 May 2008 (UTC)

Misza and cyde both admit clearly that they do this. βcommand 2 18:05, 2 May 2008 (UTC)
Well we shouldn't let them. If they believe it's the right thing to do, they should be able to make their case clearly to the community. The community is too dumb to realize that admin bots have some merit? Come on, that argument is beyond condescending. The RfA for ProtectionBot would have been successful if it hadn't been withdrawn and would have faced only marginal opposition if it hadn't been closed-source. Wikipedia:Requests for adminship/RedirectCleanupBot faced a total of 15 opposes. Pascal.Tesson (talk) 19:23, 2 May 2008 (UTC)
I've notified Misza and Cyde of this thread. I'll be interested to see the responses. Franamax (talk) 19:32, 2 May 2008 (UTC)
And just in case people haven't read the other thread, I'm all in favor of image deleting bots and am ready to help in convincing the community to accept these. I'm just uncomfortable with (and actually pretty pissed at) bots doing this through a non-bot admin account. Pascal.Tesson (talk) 19:55, 2 May 2008 (UTC)
Yes deletion bots have been run, and I assume they continue to be run. Anyone looking at Misza's deletion log would have a big clue. This has also been discussed at some length at AN / ANI in the past, though good luck finding the right log for that discussion. Dragons flight (talk) 19:59, 2 May 2008 (UTC)
Naahh, Misza just had a browser with 100 tabs open. Isn't that the standard argument brought up when edits happen really fast? More seriously, there seems to be a gap in definition of what exactly "bot edits" are. Franamax (talk) 20:16, 2 May 2008 (UTC)
Not to mention his rigid compulsion to start his deletions at precise times each day. Dragons flight (talk) 20:20, 2 May 2008 (UTC)
As far as what constitutes bot edits goes: any process which is fully automated. The policy claims that some semi-automated scripts might qualify as bots, but I've never put one through RFA and rarely see others do so. Generally, it's a bot if the person is willing to admit it's one. Otherwise, it's assumed that it's 100 tabs open in a browser and/or a semi-automated script. In any case, as I said on the other page, an image deletion bot would never make it through RFA in a million years. Period. If you want this stuff done, it will have to be under the table, or you'll have to try to come up with an admin-bot approval process that circumvents RFA, which is just as unlikely as getting one approved. Hence, IAR for the greater good, or volunteer to be the one who slogs through thousands of crap images manually.--Dycedarg ж 20:50, 2 May 2008 (UTC)
Gah... parallel conversations are taking place on two different pages... (my bad!) The thing is that while ImageDeletionBot will probably be shot down at RfA, I am optimistic that BotDeletingImagesInTwoVerySpecificEasyUncontroversialCases would be fine. You just have to do the marketing right. The sucker volunteering to do the Commons dupes clean up by hand is me. I'm pretty fed up with it and seeing as there are huge backlogs in both cats, there aren't many suckers willing to take my place. Pascal.Tesson (talk) 21:14, 2 May 2008 (UTC)
(to Pascal) This is a meta-discussion though, with wider implications.
(to Dycedarg) I won't argue with what you say, but I don't particularly buy into "won't pass so do it in secret and never tell". That's indicative of a problem, we should solve that problem, n'est-ce pas? Look at User:BHGbot#Proposed, there is a good example of stating openly what will be done. Why can't an admin openly state intentions to run an automated series of edits on their own account?
And to the definition of bot edits, I would prefer to adopt "anything where you are blindly changing stuff quickly". For example, why was my own comment on my own talk page changed arbitrarily with no edit summary? That smells to me of a just-do-it-because-I-can mentality, which doesn't fit well into the wiki model, and rapid edits in large quantities leave the rest of us gasping for breath trying to figure out what just happened. The potential for damage is pretty large here. Franamax (talk) 21:35, 2 May 2008 (UTC)
Activities covered in the Signpost are not particularly secret (see the last paragraph). Dragons flight (talk) 21:42, 2 May 2008 (UTC)
Ouch, that was the month I joined up. I'll have to read those threads, I hadn't seen any indication of their existence 'til now. Maybe I wasn't looking hard enough. Franamax (talk) 21:57, 2 May 2008 (UTC)
OK, I've been through those threads, I note that the actual owners of the adminbots contributed a total of once (oops maybe twice) and the discussions seemed to peter out and die. What was your point? Franamax (talk) 00:50, 3 May 2008 (UTC)
I do generally agree that these things should be better documented than they are. Dragons flight (talk) 01:39, 4 May 2008 (UTC)

BotDeletingImagesInTwoVerySpecificEasyUncontroversialCases already exists. Misza13 runs it, as does east718 (and others, I'm sure), with error rates so low they're negative numbers (not really, but yeah). dihydrogen monoxide (H2O) 01:34, 4 May 2008 (UTC)

Tasks

If my bot was approved to subst Welcome templates, do I need to submit another BRFA to subst templates in the Category:User block templates? MBisanz talk 05:49, 3 May 2008 (UTC)

IMHO, this particular BRFA was regarding substing every talk page message, go for it. MaxSem(Han shot first!) 06:18, 3 May 2008 (UTC)

Ongoing BAG membership request

I've put myself up for BAG membership (God that sounds lame) on WT:BAG -- here. Any comments / questions / votes (zomg votes!) are welcome. Cheers. --MZMcBride (talk) 05:08, 4 May 2008 (UTC)

Bot RFC

An RFC on a bot has been opened at Wikipedia:Requests for comment/VoABot II, if you are interested in the proceedings, please stop by and have a look and/or make a comment. Thanks, — xaosflux Talk 03:50, 5 May 2008 (UTC)


Addbot

User:Addbot: - Seems to have been running without a flag, and performing non-approved tasks. Rich Farmbrough, 12:11 5 May 2008 (GMT).

Addshore has pointed me to some approvals. Rich Farmbrough, 12:14 5 May 2008 (GMT).
Also flag has now been added. ·Add§hore· Talk/Cont 15:22, 5 May 2008 (UTC)

Adminbot operator needed

Is there any user who is an experienced bot operator and a Wikipedia admin, who is prepared to endure a bot RfA to manage a bot like this? From the discussion at AN, it seems that the idea has support from the community, especially if the bot were copied directly from the one at nl.wiki, which has been operating flawlessly for over a year. I'm sure the bot-operating community would support to the hilt any brave soul willing to endure the Skynet comments at RfA for a week. Volunteers, anyone? Happymelon 15:38, 4 May 2008 (UTC)

What are the technical requirements? If it's a "run once a day" bot, I would be interested; if it's a "run 24/7" bot, I'd have to think it over. MBisanz talk 15:41, 4 May 2008 (UTC)
It seems to be a batch process, currently run once or twice a week at nl.wiki and he.wiki by RonaldB-nl, based on a database of TOR nodes identified through statistical traffic analysis. The BRFA linked above was for RonaldB-nl to perform the same task (using the same script, presumably) for en.wiki, but was denied because he was not himself an admin here. So the task would probably consist of liaising with RonaldB-nl, getting hold of the blocking script, configuring it to work on en.wiki, then regularly processing the data file and doing a blocking run about once every 3 days or so. You can see more details here. Happymelon 15:50, 4 May 2008 (UTC)
Ok, I've read it over, seems simple enough, click a button loading a list every couple of days, bot's code takes care of the rest, check the bot's contrib list for accidental false positives. Yes, I know an RFA will be painful, but where do I sign up. MBisanz talk 15:55, 4 May 2008 (UTC)
I guess file a BRFA and wait for BAG to mark it approved for trial. Then armour up and get ready to face the music! Let me know when the RfA goes live so I can lend some support. Happymelon 17:34, 4 May 2008 (UTC)
Sounds to me as if the GlobalBlocking extension that is currently being developed may save a whole lot of log entries in the near future if this bot would ever be used on meta:Global_blocking. Siebrand (talk) 13:07, 14 May 2008 (UTC)
I think global blocking is going to be controversial enough without the additional complications of potential GB users knowing that there will be a bot operating on a distant site largely beyond their control, applying blocks to IPs that a distant user, who might not even speak their language, has determined are open proxies. At least we at en.wiki have the advantage that RonaldB-nl is somewhat active here, and can respond to queries and comments. If we can gain the necessary interwiki consensus, then yes, it would be an infinitely better idea to apply these blocks at meta and propagate them through GlobalBlocking, but for the meantime (and given that we'd have to wait until the extension was deployed before even beginning that discussion), local implementations are the way forward. Happymelon 10:47, 15 May 2008 (UTC)
Sure, just painting the big picture :) Siebrand (talk) 13:46, 15 May 2008 (UTC)

New BAG members

Kingturtle volunteered some crat time to close the pending BAG nominations. Krimpet, Maxim, MBisanz, and Mr.Z-man are now BAG members. MZMcBride's nomination did not reach consensus. — Carl (CBM · talk) 19:56, 15 May 2008 (UTC)

Request for BAG membership

I've posted a request for WP:BAG membership here, comments are appreciated. Mr.Z-man 05:55, 6 May 2008 (UTC)

I've also nominated Krimpet here. Any input would be greatly appreciated. SQLQuery me! 06:34, 6 May 2008 (UTC)
I have accepted a nomination to be considered for membership in the Bot Approvals Group. Please express comments and views here. MBisanz talk 08:35, 6 May 2008 (UTC)

I have also nominated myself, here. dihydrogen monoxide (H2O) 02:09, 9 May 2008 (UTC)

I also reopened mine --Chris 11:43, 17 May 2008 (UTC)

WP:IPEXEMPT

The IP block exemption mechanism has now been enabled for non-admin accounts.

Broadly, an account can be tagged to be unaffected by any IP block (including autoblocks), meaning only a direct block on the actual account name will block it.

This might be useful for "bots that meet some suitable standard of approval and acceptance" (BAG approval, bot flag, don't know, not in BAG)... it will mean such a bot can't be blocked as a result of a toolserver block, or fallout via autoblock from another bot being blocked.

Users at BAG, and crats, might want to consider whether IP block exemption should be given as standard to any bots, such as those "officially approved" or the like, or not, or whether this would help ensure bots run more reliably (i.e. can't be inadvertently blocked due to some other incident).

Just a heads up to start a discussion :)


FT2 (Talk | email) 20:21, 10 May 2008 (UTC)

My thought is that bots should only get it if there is a demonstrated need (the owner is an admin passing through IP-exempt, but the bot is hitting a range block). MBisanz talk 20:27, 10 May 2008 (UTC)
I would suggest it be given to all toolserver bots as a precaution - these should only really be blocked individually. dihydrogen monoxide (H2O) 01:20, 11 May 2008 (UTC)
If we decide that all bots should be made ipblockexempt, then the permission should just be bundled with the 'bot' usergroup, rather than manually given to all bots by a crat. Happymelon 17:40, 11 May 2008 (UTC)
I could see the argument for giving it to toolserver-based bots, but, honestly, the last time I even saw a TS bot blocked was a few months ago, when BHG blocked BetacommandBot (and also forgot to disable the autoblocker). The situation was resolved within a couple of moments. It does not seem worth the trouble to assign, really. SQLQuery me! 18:12, 11 May 2008 (UTC)
There are good reasons for not giving to bots as well (mainly non-toolserver bots) - if a bot operator is blocked, the bot should not be able to operate through the autoblock. Mr.Z-man 17:43, 11 May 2008 (UTC)
If the bot actually does operate from the toolserver, though, a block would only catch the bot if that account were used manually, from the same IP as the bot owner. And in that rare case, I'm not sure it's that big a concern. Ral315 (talk) 16:51, 15 May 2008 (UTC)
Since an autoblock would effectively shut down all of the toolserver bots that edit en.wiki, I think it would be a very good idea to grant this to bots that reside solely on the toolserver. Nakon 18:50, 16 May 2008 (UTC)

Problems with an archiving bot

I was recently browsing Talk:List of commonly misused English language phrases and I thought to myself, "there used to be more discussions here, but there aren't any links to archived discussions." So I looked into the page's edit history, and I found several cases where User:MiszaBot I had removed content from the discussion page to archive pages, but hadn't created links to the archive pages [38], [39], [40], [41], [42], [43]. This struck me as very bad. It is important that people who are browsing talk pages be able to see if there is any previous discussion they are missing. Often there have been discussions, years in the past, with important conclusions about things that new contributors to a page often bring up. Not giving diligent editors a clear way to find such discussions is intolerable. I have brought this up with the bot's operator (User talk:Misza13#Talk:List_of_commonly_misused_English_language_phrases) but he does not seem to think there is any problem, and also appears to be laying the responsibility for fixing it at my feet. Neither response is acceptable to me. Furthermore, I am concerned that there are talk pages all over Wikipedia which have had content removed by MiszaBot without links to the archive pages, and I think there needs to be an effort to find those pages and fix the problem by creating links to the archive pages. Nohat (talk) 18:20, 18 May 2008 (UTC)

The bot is not doing this on its own; it was directed to do it by this edit [44]. We ordinarily expect users to handle archive links on their own, since lots of different systems are used. I don't think there are any archiving bots that automatically add links. I do agree that someone who frequents the talk page there should add links to help new participants. — Carl (CBM · talk) 18:28, 18 May 2008 (UTC)
As a temporary fix I've changed the link to point to the Archives. Hopefully that's a bit less confusing. The better solution would be to create a page that would list everything and link there, but this works for now. Adam McCormick (talk) 18:42, 18 May 2008 (UTC)
I think there needs to be a lot more oversight in ensuring pages that archiving bots are working on have maintained links to archive pages. What happened to Talk:List of commonly misused English language phrases is not acceptable. As can be seen in the history of the page, User:Rm w a vu did not attempt to get any consensus for the use of an archiving bot, nor did s/he add any archive links. I don't think Talk:List of commonly misused English language phrases was long enough to need archiving, and it certainly wasn't high-traffic enough to warrant monthly archives. If bot operators run bots that are capable of mangling pages if they are not used correctly/responsibly, and they make no attempt to do due diligence to ensure the bot is being used correctly/responsibly, then they at least have a responsibility to clean up the messes their bots have made. How do we make sure it doesn't happen again? Nohat (talk) 18:47, 18 May 2008 (UTC)
Then change it. If you don't like the archiving, then put the content back and remove the archiving template. It's not the bot owner's responsibility to police the use of the archiving template. Adam McCormick (talk) 18:52, 18 May 2008 (UTC)
I plan to, but the content is scattered across 19 tiny archive pages, and fixing it is nontrivial. I'm bringing this up here because I am concerned that there are pages all over Wikipedia that have been irresponsibly archived and it's going to take more than just me to deal with the problem. Furthermore, I disagree regarding responsibility: whoever makes edits to a page is responsible for those edits. Being a bot does not exempt one from being responsible for one's edits. Nohat (talk) 19:00, 18 May 2008 (UTC)
(outdent) No, the responsibility lies with the person who set up the archiving. The bot operates, properly, in accordance with the configuration settings given on the page. It's not the bot's responsibility to enforce a specific archiving method. It is up to the page users to decide how they want to archive. As for having links to the archive pages on the main talk page, simply add the {{archive box}} template and it will automatically keep track of any new archive pages (assuming the naming has been done correctly). -- JLaTondre (talk) 19:16, 18 May 2008 (UTC)
The problem is that the bot-created archives are too granular, and need to be merged into larger pages. It seems that it is very easy, indeed too easy, to add archiving to a low-traffic talk page, when it's very hard to fix problems that are discovered months later. If the person who requested the archiving did it maliciously or incompetently, and is unwilling or unable to fix the problem, or has left the project, then when a problem is discovered months later, like no links to archive pages, or too much granularity in the archives, it's a big burden to be placed at the feet of the person who discovers the problem when the bot owner refuses to accept any responsibility for the problem. I don't think being a bot exempts its owner from being responsible for the edits it makes, regardless of who requested those edits. Nohat (talk) 19:21, 18 May 2008 (UTC)
Misza has fixed the problem and collapsed the archives. In the future, such archives can be listed automatically using {{MonthlyArchive}}. Adam McCormick (talk) 19:58, 18 May 2008 (UTC)
ClueBot III has the ability to generate links to content that it archives, and place them in an archive box. If you find it necessary for the bot to maintain the archive lists, try using ClueBot III. -- Cobi(t|c|b) 04:09, 21 May 2008 (UTC)

Bot Assisted Assessment

(It was suggested that I post my request here.) Bot Assisted Assessment is where "a bot looks at all unassessed pages and adds the highest assessment parameter from other project templates on the page to the {{WP India}} template. E.g. if a page that has the WP India template has 'Start' and 'B' classes from other templates, '|class=B' will be added to the WP India one."[45] I was very surprised to learn that this was even possible. I've been working on moving WikiProject Ecuador along and something like Bot Assisted Assessment would be great for WikiProject Ecuador as well. In view of this, is it possible that you guys can create a list of things you have done with the bots as they relate to regional WikiProjects, so that non-bot people (e.g. me) are aware of what can be done and can pick and choose among bot options to improve articles within the scope of a regional WikiProject. Thanks. GregManninLB (talk) 00:17, 25 May 2008 (UTC)

if you can think it, we can probably write it. βcommand 2 01:12, 25 May 2008 (UTC)
To an extent, BC. :) But yeah Greg, what you're talking about is a fairly simple bot task. Feel free to ask here about any others. dihydrogen monoxide (H2O) 01:42, 25 May 2008 (UTC)
User:Chris G Bot 2 already does article assessment. It would be a fairly simple modification to the code to create a bot with that functionality --Chris 02:28, 25 May 2008 (UTC)
Every bot which can do article assessment has slightly different heuristics for doing so: MelonBot looks for stub templates on the article, other project assessments, and also looks on the FA, GA and FL lists. Other bots do things slightly differently, but the majority of assessment bots will do the majority of the checks that are possible to best analyse an article. It's mainly a case of who will get to your request first :D. Happymelon 09:59, 25 May 2008 (UTC)
My bot was also written to do this task, assessing by looking at the highest rating the article has received from other projects and adding this to the template it is analysing. RichardΩ612 Ɣ ɸ 14:48, May 25, 2008 (UTC)
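None of the assessment bots discussed above publish their code in this thread, so as an illustration only, here is a hypothetical Python sketch of the heuristic they describe: scan the talk-page wikitext for |class= parameters in other project banners, take the highest-ranked class, and copy it into a named banner that has no class yet. The regexes, function names, and the simplified banner matching are all assumptions for this sketch, not any bot's actual implementation.

```python
# Hypothetical sketch of "bot assisted assessment": copy the highest
# class rating found in other WikiProject banners into an unassessed one.
import re

# Standard assessment scale, lowest to highest (class names as used above).
SCALE = ["Stub", "Start", "C", "B", "GA", "A", "FL", "FA"]
RANK = {name: i for i, name in enumerate(SCALE)}

# Matches |class=Foo inside any template on the talk page.
CLASS_RE = re.compile(r"\|\s*class\s*=\s*([A-Za-z]+)")

def highest_class(talk_wikitext):
    """Return the highest assessment class found in any banner, or None."""
    found = [m.group(1) for m in CLASS_RE.finditer(talk_wikitext)]
    rated = [c for c in found if c in RANK]
    if not rated:
        return None
    return max(rated, key=lambda c: RANK[c])

def fill_missing_class(talk_wikitext, template):
    """Add |class=<highest> to {{template}} if it has no class parameter.

    Simplified: assumes the banner contains no nested templates.
    """
    best = highest_class(talk_wikitext)
    if best is None:
        return talk_wikitext
    pattern = re.compile(r"\{\{\s*" + re.escape(template) + r"([^}]*)\}\}")

    def repl(match):
        body = match.group(1)
        if "class" in body:
            return match.group(0)  # already assessed; leave untouched
        return "{{" + template + body + "|class=" + best + "}}"

    return pattern.sub(repl, talk_wikitext)
```

For example, on a talk page containing `{{WP India}}` alongside `{{WikiProject X|class=Start}}` and `{{WikiProject Y|class=B}}`, this sketch would rewrite the first banner as `{{WP India|class=B}}`, since B outranks Start. A real bot would of course also need page fetching, edit saving, and rate limiting, which the discussion above takes as given.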