Wikipedia:Requests for adminship/TempDeletionBot
- The following discussion is preserved as an archive of a request for adminship that did not succeed. Please do not modify it.
TempDeletionBot
Final (talk page) (Withdrawn) (38/34/2); Ended Mon, 05 Nov 2007 22:38:21 (UTC)
TempDeletionBot (talk · contribs) - Good afternoon!
I present for your consideration a robot intended to help clear a boring administrative backlog - the category of temporary Wikipedian userpages which are created when a user is blocked indef and should be deleted after a month. Going through this backlog is very time-consuming - a human first checks for sockpuppet templates on the user page and talk page, because pages with sockpuppet templates should not be deleted, then needs to check the history for the latest edit, before clicking delete and pasting a deletion rationale. Despite this lengthy process, human judgment is not needed to determine whether a page is fit for deletion. All that must be done is a check for sockpuppet templates and a check of the last edit date. Therefore, I propose a robot be used to perform these tasks.
This bot will require the sysop bit. This bot gathers items in Category:Temporary Wikipedian userpages, and first performs some very liberal tests to check whether each is deletable: If the word 'sock' appears anywhere on the page whatsoever, it will be skipped. (This is because pages with sockpuppet templates should not be deleted; however, we want to catch even substed templates or discussion.) Then, it will check for some content - this is simply to make sure it doesn't crash, and prevents encoding issues that I've seen with my other bots. It will then acquire the latest edit from the history. It will parse that to find the date. If the last edit was within a month prior to the bot running, the page will be skipped. (The exact limit is 30 days.) If none of these checks results in a skip, the associated page will be checked (User talk <-> User) for edit time. If that page either does not exist, or was not edited recently, then the tagged page - only the tagged page - will be deleted. The associated page will not be deleted unless it is also tagged for deletion, and the bot checks it later in the run.
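For illustration only, here is a minimal sketch of the eligibility test described above. It is not the bot's actual code (which is written in Perl); the function and its inputs are hypothetical, and fetching the page text and revision timestamps is assumed to happen elsewhere.

from datetime import datetime, timedelta

GRACE_PERIOD = timedelta(days=30)  # "the exact limit is 30 days"

def eligible_for_deletion(page_text, last_edit, associated_last_edit, now=None):
    """Return True if a tagged page may be deleted under the rules described above.

    page_text            -- wikitext of the tagged page
    last_edit            -- datetime of the tagged page's latest revision
    associated_last_edit -- datetime of the paired User/User talk page's latest
                            revision, or None if that page does not exist
    """
    now = now or datetime.utcnow()

    # Skip anything mentioning 'sock' at all, to catch even substed templates
    # or discussion of sockpuppetry.
    if "sock" in page_text.lower():
        return False

    # Skip empty pages; this guards against crashes and encoding issues.
    if not page_text.strip():
        return False

    # Skip pages edited within the last 30 days.
    if now - last_edit < GRACE_PERIOD:
        return False

    # The associated page must either not exist or also be stale.
    if associated_last_edit is not None and now - associated_last_edit < GRACE_PERIOD:
        return False

    return True

Anything that trips one of these checks is simply skipped and logged (per the answer to question 1 below); only a page passing all of them would be deleted with the standard rationale.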
A very common and very necessary concern about admin bots is the ability for them to be abused by vandals. This bot has no potential for abuse, as any page it performs actions on must have remained unchanged for a full month.
Another common concern is coding errors. I've already asked others to review this code, and after some minor changes, they all find it satisfactory. Additional review is welcome; the code is available here.
And if the bot is blocked, it will be unable to continue deleting. --uǝʌǝsʎʇɹnoɟʇs 19:43, 3 November 2007 (UTC)
- Confirming that this account is owned by ST47. TempDeletionBot 19:45, 3 November 2007 (UTC)
- Additionally, a dry run has been performed, the results are at User:ST47/DTUPL. (Can you tell that I love acronyms?) --uǝʌǝsʎʇɹnoɟʇs 19:47, 3 November 2007 (UTC)
Questions
- 1. If it skips a page due to containing the string "sock" or other issues not related to the date of last edit, will it list these somewhere or will they just be skipped every time the bot runs? Mr.Z-man 19:58, 3 November 2007 (UTC)
- A. They'll be in the log, however the bot won't do anything with them. I'll manually review those that the bot cannot handle, and I'll publish the list of pages with sock on them so they can be checked.
- Questions from Maxim
- 2. Will the bot be open to recall? (Sounds silly, but still pertinent, IMHO.)
- A. Absolutely. I've shut down one of my bots before due to community opposition, and I believe that adminbots should be especially accountable.
- 3. Do you pledge that this task will be the only admin actions made by this account?
- A. Yes.
- 4. Does that account have a strong password? Is abuse VERY unlikely to happen?
- A. This account has a password over (much over) the standard 8 characters, which contains both capital and lowercase letters, several numbers, and several other forms of punctuation. The password is comparable in strength to the password on my account and the password I use for the root account on my computer.
- Optional from Spebi
- 5. Will this bot be the last of the adminbots, or do you think there will be more to come in the future?
- A. Personally, I believe that as long as there are tasks to be found, people will make more.
- Question from Picaroon
- 6. What if the user hasn't been blocked, but is merely inactive, and a vandal adds {{indef}} or {{subst:indef}} to their userpage? How will you prevent the bot from deleting this sort of page? Picaroon (t) 21:22, 3 November 2007 (UTC)
- A. I can check that the last editor is not the user, and I will look into adding that check right now.
- Could you also check the block log for the letters "indef" or "infin"? Picaroon (t) 21:36, 3 November 2007 (UTC)
- (e/c) Why would the user be the last editor? That would likely never be the case if a user is indef blocked. Checking the block log and/or the user's last contribs would be more effective. Mr.Z-man 21:42, 3 November 2007 (UTC)
- Picaroon: I can, though the extra page load may not be worth the benefit. I tried a dry run with my above change and it saw a few false positives, so I'll see about checking the block log. --uǝʌǝsʎʇɹnoɟʇs 21:43, 3 November 2007 (UTC)
- Mr.Z-man: The user would be the last editor if they fraudulently added the templates to their userpage - it seems I misread picaroon's request, please disregard my previous post. --uǝʌǝsʎʇɹnoɟʇs 21:43, 3 November 2007 (UTC)
- Questions from Cryptic
- 7. I've reviewed your code, and there's nothing that limits the bot's deletions to the User: and User talk: namespaces (except that it won't swap namespaces before the second history check). You plan on fixing that, right? —Cryptic 21:47, 3 November 2007 (UTC)
- A. Apologies for the delay in answering. Yeah - that's actually in place in the current version, I just haven't uploaded it yet - once I get a chance, I will put it up and post the new link.
- 8. It's an extremely common error to write [[Category:Temporary Wikipedian userpages]] instead of [[:Category:Temporary Wikipedian userpages]], and given that user talk pages don't generally get much outside scrutiny, this could well survive long enough to get archived and thus be considered inactive (especially if the user in question reads edits mostly through diffs). Any ideas how to minimize this? —Cryptic 21:47, 3 November 2007 (UTC)
- A. You are very right, however if this happens, the bot would look at the block log for, say, ST47/Archive 1, find no blocks, and skip that page due to the changes I added just a few moments ago for Picaroon.
- 9. What happens when some clever vandal drops <includeonly>[[Category:Temporary Wikipedian userpages]]</includeonly> on some semipopular but unprotected template that's transcluded on a couple hundred inactive userpages just before a run? —Cryptic 21:56, 3 November 2007 (UTC)
- A. Then the bot would waste an insane amount of time checking every one of those user's block logs, skip every one of them, and continue on its merry way.
- This might be a problem in terms of server load. Isn't there a more elegant way to do this; e.g. using generator=categories for prop=revisions and checking if a block template is directly transcluded on the page? GracenotesT § 22:53, 3 November 2007 (UTC)
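Purely as an illustration of the approach Gracenotes suggests - trusting a directly transcluded block template rather than bare category membership - a check along the following lines could be written against the present-day MediaWiki API (using the third-party requests library). The template whitelist is only an example, and none of this is part of the proposed bot's Perl code.

import requests

API = "https://en.wikipedia.org/w/api.php"
# Hypothetical whitelist of block templates that legitimately place a page
# in the temporary-userpages category.
BLOCK_TEMPLATES = {"Template:Indefblockeduser", "Template:UsernameBlocked"}

def has_block_template(title):
    """Return True if `title` directly transcludes a known block template."""
    resp = requests.get(API, params={
        "action": "query",
        "prop": "templates",
        "titles": title,
        "tllimit": "max",
        "format": "json",
    }).json()
    for page in resp.get("query", {}).get("pages", {}).values():
        for template in page.get("templates", []):
            if template["title"] in BLOCK_TEMPLATES:
                return True
    return False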
- Questions from Carcharoth
- 10. What happens if an inactive user is blocked indefinitely after they have gone inactive (it has been known to happen when something is noticed later)? Will their talk page and user page be deleted by the bot before they get a chance to return and defend themselves? And what about inactive users who have blocks in their logs, whose pages then get vandalised with the "temporary wikipedian userpages" category? Is the bot checking the actual length of the block as recorded on the server, or is it only looking for clues in the wording of the block logs? If the former, that would be OK, but the latter just won't work reliably enough. Carcharoth 22:13, 3 November 2007 (UTC)
- For your first question, adding the category to the page is an edit and would "reset" the 30 day timer, as it were, so the page would be left alone regardless of previous activity. — Coren (talk) 22:17, 3 November 2007 (UTC)
- And after another 30 days it would delete it, right? This doesn't sound as clear-cut as people are making it sound. Can someone clearly state in English what the code is telling the bot to do? Carcharoth 22:20, 3 November 2007 (UTC)
- Correct. If the category is on a user's page and left there for 30 days, and the user is indef blocked, the page will be deleted.
- As for the block log parsing, as that's what I'm currently working on, it checks that the most recent block is indef, and that the user is still blocked.
- I'm working to address the rest of your questions right now :) --uǝʌǝsʎʇɹnoɟʇs 22:22, 3 November 2007 (UTC)
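As a rough sketch (not the bot's actual Perl) of the block-log test being described - the latest block entry must be indefinite and the user must still be blocked - something like the following could be written against the present-day MediaWiki API; the params/duration field names come from the current API output, and the crude "indef"/"infin" string test mirrors Picaroon's suggestion above.

import requests

API = "https://en.wikipedia.org/w/api.php"

def indef_blocked(username):
    """True if the user's most recent block was indefinite and is still in force."""
    # Most recent entry in the user's block log.
    log = requests.get(API, params={
        "action": "query", "list": "logevents", "letype": "block",
        "letitle": "User:" + username, "lelimit": 1, "format": "json",
    }).json().get("query", {}).get("logevents", [])
    if not log or log[0].get("action") not in ("block", "reblock"):
        return False
    # Crude check on the recorded duration, per the "indef"/"infin" idea.
    duration = str(log[0].get("params", {}).get("duration", "")).lower()
    if "indef" not in duration and "infin" not in duration:
        return False
    # The user must still be blocked right now.
    blocks = requests.get(API, params={
        "action": "query", "list": "blocks", "bkusers": username, "format": "json",
    }).json().get("query", {}).get("blocks", [])
    return bool(blocks)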
- "And what about inactive users who have blocks in their logs, whose pages then get vandalised with the "temporary wikipedian userpages" category?"
- And after another 30 days it would delete it, right? This doesn't sound as clear-cut as people are maknig it sound. Can someone clearly state in English what the code is telling the bot to do? Carcharoth 22:20, 3 November 2007 (UTC)
- For your first question, adding the category to the page is an edit and would "reset" the 30 day timer, as it were, so the page would be left alone regardless of previous activity. — Coren (talk) 22:17, 3 November 2007 (UTC)
- 10a. The one big problem I see is that it gets its initial list from a category. You are assuming that the category Category:Temporary Wikipedian userpages will only ever be used for the indefinitely blocked users. Once the bot is running, you need to find some way of making sure that the use of Category:Temporary Wikipedian userpages does not change so as to give this bot the wrong data. Say, for example, in two years' time, someone unaware of this bot, and unaware of the real purpose of Category:Temporary Wikipedian userpages, adds Category:Temporary Wikipedian userpages to a new template they have created, and this template starts getting added to lots of user pages, including some inactive ones? By relying on the category and the block log, you are assuming that use of the category and the block log will not change in the future. I would be much happier if there was some sort of limit on the number of page revisions it deleted. Most of these inactive accounts will have only a small number of page revisions, right? Can you not put some reasonable limit on the number of page revisions to stop it deleting old inactive accounts that were once very active and somehow ended up with an indefinite block? Pages with a large number of revisions usually require human judgment to decide whether to keep or not. Usually either a right to vanish, or preserving of talk page discussion, is what the human will consider. I'd like the bot to go "oohh, 2,000 page revisions! Better not touch that one!" Is that possible? Carcharoth 22:36, 3 November 2007 (UTC)
- Ah right ;) Yep, that is possible, and I can set that up (50 revisions?) in a bit. --uǝʌǝsʎʇɹnoɟʇs 12:25, 4 November 2007 (UTC)
- Is 50 revisions a good number to pick, based on some average, or is that just a number plucked from thin air? :-) Carcharoth 13:06, 4 November 2007 (UTC)
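For what it's worth, a revision-count guard like the one being discussed could be as simple as asking the API for up to 50 revisions and seeing whether any remain; the sketch below is illustrative only (present-day MediaWiki API via the requests library), and the 50-revision cap is just the number floated above, not a settled value.

import requests

API = "https://en.wikipedia.org/w/api.php"

def too_many_revisions(title, cap=50):
    """True if `title` has more than `cap` revisions and should be skipped."""
    resp = requests.get(API, params={
        "action": "query", "prop": "revisions", "titles": title,
        "rvprop": "ids", "rvlimit": cap, "format": "json",
    }).json()
    # The API returns a 'continue' block when the history is longer than the cap.
    return "continue" in resp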
- Questions from AS 001
- 11. What guarantee is there that the bot will not commit vandalism, such as against a user who is wrongly indef. blocked and then appeals it after a month or two of thinking and preparing (especially if they have a job and can't devote the workday to Wikipedia)? AS 001 22:47, 3 November 2007 (UTC)
- A. Blocked users can still recreate their talk page if it is deleted (I checked) and it never takes more than a day or 2 for appeals to be responded to so there is no risk of the bot deleting the page before the appeal is answered. They can also email the unblock mailing list and appeal in #wikipedia-en-unblock. Mr.Z-man 00:15, 4 November 2007 (UTC)
- 12. Can you truthfully state that there has never been, and guarantee that there never will be, a wrong indef. block? AS 001 22:47, 3 November 2007 (UTC)
- A. Nobody can, see the answer to question 11 as well about appeals. I think you may misunderstand what this bot does. It does not block people or delete accounts, it just deletes the talk pages of indef blocked users if they are not blocked for sockpuppetry, the page has not been edited in 30 days and the page is in the temporary userpages category. Mr.Z-man 00:15, 4 November 2007 (UTC)
- 13. What harm is it to keep some of these pages a little longer or even indefinitely? AS 001 22:47, 3 November 2007 (UTC)
- A. Some? No harm at all. All of them? Well, insulting words show up in google searches, they muck up categories, and templates are transcluded - that is, every month, 5000 or so of these pages are created, and many transclude a template. If, after 6 months, that template is changed, MediaWiki has to re-parse almost 30000 pages.
- 14. The criteria for recall are often 5 editors with objections. There are already 4 oppose votes. Will you withdraw the RFA if there are 5 oppose votes or will you ignore the usual recall criteria (even though you state that you will be subject to recall)? Sorry if this sounds harsh. If your criterion is 100 editors to force a recall, just say so.
- A. My criteria for recall are 5 editors in good standing forcing an RfA. If the RfA fails, the bot loses the bit. (I believe that's more or less the standard criteria)
- Question from Миша13
- 15. (Inspired by JzG's support !vote.) You have provided elaborate technical explanations about the bot's operations, but can you actually explain, in detail, what is the added value of deleting these pages? WP:DENY doesn't seem to be the case as the pages are often replaced with an {{indef}}/{{UsernameBlocked}}/{{whatever}} thus any trolling is removed. Only reason I could imagine (and a reason I ran a bot clearing the very same category, except bloating my deletion count ) is that we don't want certain pages (insulting titles) to show up in Google searches. To me now, CAT:TEMP seems more an imaginary backlog (most of the pages can just stay there) than a real one ("a solution looking for a problem"). Could you comment on this? Миша13 23:32, 3 November 2007 (UTC)
- A. As you said, google searches. Also, however:
- To make categories of indefblocked users somewhat useful as to recent blocks
- To decrease the amount of pages with templates on them - remember that every time we edit a template, the pages that template is on need to be reparsed. After years of blocking, I'd imagine that number is pretty high. --uǝʌǝsʎʇɹnoɟʇs 23:36, 3 November 2007 (UTC)
- Surely they only need to be reparsed when the page is actually loaded? David Mestel(Talk) 17:55, 5 November 2007 (UTC)
- Question from LessHeard vanU
- 16. Have you run this bot by WP:BAG? LessHeard vanU 00:28, 4 November 2007 (UTC)
- A. Yep.
- Wunnerful... It may have been an idea to have commented on this, with a link like above, previously just so the less tech minded (that would be me, that it would) are re-assured that it has passed its first hurdle. Will support. LessHeard vanU 00:48, 4 November 2007 (UTC)
- Another question from Миша13
- 17. After reading LessHeard vanU's question above, I have actually turned my eyes to the said approval - well, excuse me sir, but "run by BAG" is exactly what I would call that. A total of one member commented on it (other than the nom), with no outside opinions (yup, just two users as of writing this). What I can conclude from this is that the bot has had almost no technical oversight before being brought before the community. Assuming this RfA succeeds and BAG denies later (as Coren said, additional proceedings are to take place afterwards), doesn't all this look silly? BTW, I assume you would request the bit to be retracted immediately? Миша13 11:07, 4 November 2007 (UTC)
- A. First, I've asked several other users to look at the code - User:Eagle 101, User:Shadow1 - both of whom use the same module I'm using very often, and the latter of whom actually wrote it. Additionally, the BAG member who commented there, Coren, reviewed the code, as did Carl-m (no idea what his wikipedia name is) and some of the reviewers below - Majorly, Picaroon, PxMa.
- Second, if the bot passes this request and fails BRFA, then I'd request that a steward remove the sysop bit - there'd be no need to have it. —Preceding unsigned comment added by ST47 (talk • contribs) 11:28, 4 November 2007 (UTC)
- Well, if they all have, and yet you have decided to follow an on-wiki process, I would expect them to certify somewhere on-wiki too (on the BRFA, or RfA like the last time, where at least an extensive technical discussion took place and several devs have certified to have checked the code). Otherwise, to the average reader this bot makes an appearance of trying to slip through unchecked and the RfA a potential waste of time. Миша13 12:02, 4 November 2007 (UTC)
- Another question from Carcharoth
- 18. The bot operator has failed to respond to Dragons flight's concerns below: "he added code to User:STBotI to tag images as {{di-no source}} despite the fact that its approval doesn't cover that, and continued to operate it despite many complaints that it makes too many errors". Will the bot operator please address this concern, and pledge that if this bot is given admin status, you will not use it to do tasks outside of the approved tasks, and will not add tasks later? Carcharoth 12:31, 4 November 2007 (UTC)
- A. I don't believe that this change to STBotI is major enough to merit a second request - if BAG asks me to stop, I will, and I've been adding detection rules to the bot whenever there's a complaint. As for this bot, I've already stated (Q3) that the bot will not perform any other tasks. —Preceding unsigned comment added by ST47 (talk • contribs) 12:48, 4 November 2007 (UTC)
- If BAG asks you to stop, sure. What if the community, or a sizeable number of people, ask you to stop, what then? Your responses to the previous requests to stop don't inspire me with confidence. Also, how will you handle changes that may be needed for this bot? You intend to add detection rules as and when needed. Will you also pledge to undelete any mistakes made by the bot and stop operating it until such mistakes are corrected? Thanks for your answer to 10a. Hmm. I might even switch to support if satisfactory answers are given for questions 13 (could be difficult) and 14. :-) For question 13, I'm wondering whether a backlog of 6 months could be managed by a bot, instead of a backlog of 1 month? i.e. delete when they get to 6 months old. How large is the problem? Could you indicate in Q13 roughly how many pages pile up in that category per month? Carcharoth 13:06, 4 November 2007 (UTC)
- Question from Tra
- 19. I'm worried that this bot could be compromised by vandals who would be able to take advantage of the fact that any page in Category:Temporary Wikipedian userpages and with an indefinite block in the log will be deleted. For example, they could take an obscure encyclopedia article, move it onto the userpage of a banned user and apply the category. As long as it's not spotted by RC patrollers, an encyclopedia page will be deleted and will be very difficult to revert. Another problem is that a vandal could allow sockpuppet userpages to be deleted simply by editing them in order to maliciously bypass the check for sockpuppets. Also, perhaps a safer way to deal with the category being applied incorrectly through userboxes would be to check the page source of each page to make sure that it contains at least one of a pre-approved list of templates.
- Therefore, what I would suggest is that the bot should skip all pages that have ever been moved, and only delete pages where the last edit is by an admin, as well as checking the templates. Tra (Talk) 20:18, 4 November 2007 (UTC)
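For illustration only, here is a sketch of one of the safeguards Tra suggests - requiring the last edit to be by an admin - written against the present-day MediaWiki API with the requests library. The helper names are invented for the example and none of this is part of the proposed bot; the other suggested filters (skipping pages that have ever been moved, and checking against a whitelist of approved templates) would be further tests layered on top.

import requests

API = "https://en.wikipedia.org/w/api.php"

def query(**params):
    """Tiny wrapper around the MediaWiki query API (illustrative only)."""
    params.update(action="query", format="json")
    return requests.get(API, params=params).json().get("query", {})

def last_editor_is_admin(title):
    """True if the most recent revision of `title` was made by a sysop."""
    pages = query(prop="revisions", titles=title, rvprop="user", rvlimit=1).get("pages", {})
    revisions = next(iter(pages.values()), {}).get("revisions", [])
    if not revisions:
        return False
    editor = revisions[0].get("user", "")
    users = query(list="users", ususers=editor, usprop="groups").get("users", [{}])
    return "sysop" in users[0].get("groups", [])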
General comments
- For the edit count, see the talk page.
- Links for TempDeletionBot: TempDeletionBot (talk · contribs · deleted · count · logs · block log · lu · rfar · rfc · rfcu · ssp · search an, ani, cn, an3)
- Link to the Bot Request: Wikipedia:Bots/Requests for approval/TempDeletionBot
Please keep discussion constructive and civil. If you are unfamiliar with the nominee, please thoroughly review Special:Contributions/TempDeletionBot before commenting.
Discussion
- Note that I have no ill will against the creator of the bot. I just don't think giving the bot powers that 99% of editors don't have is a good idea. AS 001 22:48, 3 November 2007 (UTC)
- Note that MediaWiki is just a computer program with thousands of lines of code. MediaWiki can do much, much worse than any administrator can. Also, we trust MySQL, another computer program with thousands of lines of code, to keep track of all the data on Wikipedia. Furthermore, we trust Linux, an operating system with hundreds of thousands of lines of code, to store all this data. Now, should we trust a program with only 66 lines of code — which has a much lower chance of having a bug — to do completely reversible actions? -- Cobi(t|c|b|cn) 01:34, 4 November 2007 (UTC)
- It depends on how often the actions need to be undone, and whether the need to undo them is spotted. We won't really, truly, know the former until it starts operating. The latter is difficult to know for sure, as it depends how visible the operations are. There is also the question of reliability. The computer programs you mention - are there many complaints about them getting things wrong (have a look at bugzilla). It is also a question of visibility. Look at the complaints some poorly-programmed bots have received - those were spotted quickly because they edited high-visibility pages. If this bot deletes pages no-one is looking at, who is going to complain if it gets some things wrong? Carcharoth 13:14, 4 November 2007 (UTC)
- My running of the bot with manual review over several months only required one restoration. That case was a user who was once indefblocked and was later unblocked, the current bot won't even make that mistake. (Wow, a bot that's smarter than a human! They really ARE taking over!!!) --uǝʌǝsʎʇɹnoɟʇs 17:27, 4 November 2007 (UTC)
- Some of those in the "oppose" section are making comments along the lines of "this task needs judgment because such pages should not always be deleted". Out of (genuine) interest, in what circumstances might it be advisable to keep such a page for longer than 30 days, and would those pages be better put in a different category (presumably by a slightly altered block notice template)? BencherliteTalk 17:20, 4 November 2007 (UTC)
- Have you considered using the bot to produce a list matching those criteria that you can handle manually (ie, with double-checking)? --bainer (talk) 01:14, 5 November 2007 (UTC)
Support
- Support; As a member of the BAG, I have reviewed the code and found no flaws making me think that this bot could be abused or cause damage. The checks it utilizes are very liberal and err on the side of leaving pages untouched, and the backlog of that task is large enough to warrant the use of an automated process. I will be approving the bot for trial if it gets the sysop flag. — Coren (talk) 19:50, 3 November 2007 (UTC)
- Support; Full support, with the usual prerequisites, the source code is open to scrutiny, passwords are kept secure etc. Nick 19:55, 3 November 2007 (UTC)
- Support I trust ST47 and this is a pretty mundane, uncontroversial, boring, and repetitive task - perfect for a bot. Mr.Z-man 19:58, 3 November 2007 (UTC)
- Support. A good use of a bot that will free up admin time. Sam Blacketer 20:31, 3 November 2007 (UTC)
- User:Veesicle 20:35, 3 November 2007 (UTC)
- Very much so. Grandmasterka 20:54, 3 November 2007 (UTC)
- Support I had an idea for a bot like this once. Kwsn (Ni!) 20:57, 3 November 2007 (UTC)
- + Good work. GDonato (talk) 21:26, 3 November 2007 (UTC)
- I would like to see this bot do admin work. NHRHS2010 talk 21:33, 3 November 2007 (UTC)
- Support - Good bot, good operator. -- Cobi(t|c|b|cn) 21:41, 3 November 2007 (UTC)
- Support. I've slogged through that backlog a fair few times, and I can assure you that this would be one useful Bot :) Anthøny 21:48, 3 November 2007 (UTC)
- Support. This is an ideal task for a bot, because a human is much more likely to miss something important (like recent editing). I do feel, like some others, that RfAs are not necessary for adminbots, but I think not supporting such a bot for that reason is counter-productive. Chick Bowen 21:50, 3 November 2007 (UTC)
- Support Code looks rock-solid. east.718 at 21:57, 11/3/2007
- Support Reedy Boy 22:16, 3 November 2007 (UTC)
- Support on a trial basis. Addhoc 23:10, 3 November 2007 (UTC)
- Support Would help a lot if it works well. If not then it can always be desysopped. Wikidudeman (talk) 23:13, 3 November 2007 (UTC)
- Support I trust ST47 to make this perform to the best of its ability. He knows perl like it is the alphabet and I'm sure he will be able to fix the problem if it errors. — E talkBAG 23:15, 3 November 2007 (UTC)
- Support. Risk low, benefit tangible. Guy (Help!) 23:17, 3 November 2007 (UTC)
- Support As long as it has the backing of other reputable bot operators, it is a huge time saver. Admins are already busy enough, it looks like this bot would save admins a lot of time.--Snakese 23:28, 3 November 2007 (UTC)
- Great idea ViridaeTalk 23:51, 3 November 2007 (UTC)
- Support as an ideal task for a bot, which is extremely unlikely to do any harm in the hands of this trustworthy operator.--BencherliteTalk 00:07, 4 November 2007 (UTC)
- I've looked at the source. I'm not great at perl, but it looks fine to me and it errs on the side of caution. :) Ρх₥α 00:37, 4 November 2007 (UTC)
- Support a great idea and a simple task. AntiVMan 00:45, 4 November 2007 (UTC)
- Support It's in the bag... LessHeard vanU 00:50, 4 November 2007 (UTC)
- Opposes lack validity, this is a good idea. -- John Reaves 01:38, 4 November 2007 (UTC)
- I think you mean "opposes at the time of writing lack validity", as later opposes might be valid, no? Carcharoth 12:26, 4 November 2007 (UTC)
- Support mindless discretionless tasks are best for bots. Carlossuarez46 02:02, 4 November 2007 (UTC)
- Support. –Crazytales talk/desk 02:11, 4 November 2007 (UTC)
- Support I think that it is good for bots to slowly start replacing humans, since they can really speed up things and avoid human mistakes (though they lack judgement). I imagine a day in which Wikipedia will be entirely run by super sophisticated bots with super complex source code (so perfect that humans will no longer be needed), creating an error-free, vandalism-free environment and an encyclopedia that grows at extremely high speeds. Also, if this bot starts failing, then we can just desysop it and reconsider this option; let's give it a try. ♠TomasBat 02:20, 4 November 2007 (UTC)
- Support. This is a needed task to be done. --Rschen7754 (T C) 02:35, 4 November 2007 (UTC)
- Support I would like to see this bot having the admin tools as this is a very time consuming task. --Siva1979Talk to me 03:00, 4 November 2007 (UTC)
- An excellent idea, and I strongly support this nomination. I've said it a few times before that we need more admins working in that category. I hope that, at the very least, if this nomination doesn't pass, more administrators decide to help delete pages in that category. Acalamari 03:35, 4 November 2007 (UTC)
- Support This is not so completely mindless as RedirectCleanupBot, but it's close, and after reviewing the first ten questions or so, I don't see any critical weaknesses in the bot's concept. Shalom (Hello • Peace) 04:05, 4 November 2007 (UTC)
- Support - per no weaknesses apparent. Rudget Contributions 17:02, 4 November 2007 (UTC)
- Support - A useful bot performing an entirely mindless administrative task in a highly conservative manner. Tim Vickers
- Support - this is a perfectly mindless task which can be performed by a bot. If the technical bits are in question, that's for the BAG to deal with (though I think they've all been taken care of). —Preceding unsigned comment added by Mike.lifeguard (talk • contribs) 22:01, 4 November 2007 (UTC)
- Very weak support yes it's good but the opposers are right. How's anybody gonna research how the more prolific ones got banned if they want to? Still, don't feed the trolls is a good reason to support. Maser (Talk!) 03:41, 5 November 2007 (UTC)
- Support I always wondered who was purging the category (since there is no monthly breakdown) and assumed there was already some sort of rogue bot around. I pondered it a lot, and I think that if clear logs are maintained (on the deleted AND skipped pages, for human review of the pages where the template should be removed), this is a good idea. -- lucasbfr talk 14:57, 5 November 2007 (UTC)
- Support, of course. Any backlog that doesn't require human judgment can be automated, in my opinion. ^demon[omg plz] 17:31, 5 November 2007 (UTC)
Oppose
- Oppose I used to clear this backlog. There are plenty of reasons why an old user/user talk page should be kept that aren't defined by a category. It needs human judgement. Majorly (talk) 21:52, 3 November 2007 (UTC)
- Oppose - only one edit. :-) Seriously, I'm probably going to oppose per Majorly, as I would prefer human judgment to be used here, particularly as the boundaries of the other adminbot still need testing (it seems that the redirect deletion bot can accidentally remove old evidence trails and so impede the work of Wikipedians investigating what used to link to a deleted article via a redirect). I also thought a different oppose reason might be nice. I'm also confused as to why the one edit the account has made should be proof that the bot is "owned by ST47". That is not normally required of bot operators, so why has that been done in this case? Carcharoth 22:01, 3 November 2007 (UTC)
- As for the first edit, I just want to be sure of security, confirming that I own the account in lieu of a formal acceptance. --uǝʌǝsʎʇɹnoɟʇs 22:03, 3 November 2007 (UTC)
- (e/c) I don't believe it was required in this case either, but having the user account that is meant to have the sysop flag touch the RfA as confirmation that all is kosher is not a bad idea either. — Coren (talk) 22:04, 3 November 2007 (UTC)
- Oppose. I do not trust this bot operator. Dragons flight 22:32, 3 November 2007 (UTC)
- ST47 neglects to mention that he has been sporadically clearing this category with bot-like speed already (e.g. [1][2]). If he now wants to seek approval and forgiveness, the first step is being open about his past behavior.
- He also has an undocumented bot-like process clearing through Category:Rescaled fairuse images (e.g. [3]). Again, my problem here is honesty/openness. Even if the function is a good idea, he ought to come clean before asking the community for trust.
- More seriously, he added code to User:STBotI to tag images as {{di-no source}} despite the fact that its approval doesn't cover that, and continued to operate it despite many complaints that it makes too many errors. All of STBotI's functions seem to have been disabled since the 28th, but here are some source detection errors from its last ~24 hours of operation: [4][5][6][7][8]. This continued despite the feelings of myself and Betacommand that it is essentially impossible to detect "no source" except in the trivial cases (e.g. blank description).
- Another example of poor judgment with bots: Initially STBotI had unicode handling errors. Not in itself a big deal, but ST47's response was really questionable. His bot created image pages that don't exist like Image:RenéCaillié.jpg. OrphanBot sent ST47 warnings that he had created pages where no image existed (e.g. [9]). Rather than delete the bad pages, ST47 reverted the warnings. Then each day, OrphanBot sent new warnings about the bad pages and those warnings were also reverted, until eventually ST47 trained a bot to automatically revert OrphanBot as "pattern vandalism" (i.e. [10]). Eventually, he did delete some of the bad pages he had created, but not before a majority were deleted by others (e.g. Misza13, ElinorD, Wizardman). I don't consider this to be a responsible response to the mistakes created by ST47's code.
- Point one: Yes, I've run this bot in the past without approval - deletions were manually reviewed - page text and last edit of potential pages were printed out. As with most of my bots, it started as a tool to break through the backlog (See my BRFA for ImageBacklogBot - that was initially an interface that gives me data on a certain page, and in its supervised mode, it still is) and now I feel that it's accurate enough to seek approval.
- Point two: Again, manually reviewed. The bot looks through the category, strikes from the list anything with only one revision, and presents the rest for review. I have the option to delete certain revisions or remove the tag.
- And yes, I have run unapproved bots under my main account. I ran a bot once to help delete something like 250 categories that were CfDed. I've used a bot to make modifications in a user's archives, at the user's request. I've written and run a bot to correct errors in signpost distribution. --uǝʌǝsʎʇɹnoɟʇs 22:49, 3 November 2007 (UTC)
- Unicode point: First, the bot was posting the same list to my userpage day after day, even after I'd fixed the code. There was later (apparently) another issue, which I fixed again. (You just recently posted info about a username with unicode issues, I'm looking at a way to stop that from happening.) --uǝʌǝsʎʇɹnoɟʇs 22:49, 3 November 2007 (UTC)
- To the last item, OrphanBot was designed to keep sending messages until the corrupt pages were deleted. Had you deleted your mistakes in the first place, as you should have, the problem would have been resolved. Instead you went to extraordinary lengths to ignore the problem. Dragons flight 22:55, 3 November 2007 (UTC)
- Oppose I oppose with the great risk that I may be blocked in retaliation for expressing my opinion. I was blocked indefinitely for the same reason (expressing a valid opinion in an RFA against a bot. Block was later reversed without an apology). Problems of this bot are many:
- Only a few people will have the ability to stop the bot. Only admins, less than 1% of editors, can stop it. If a non-admin raises a legitimate objection, he/she risks being ignored or blocked on the excuse of "trolling".
- Retaining pages is important for investigations of high complexity and to keep a record of wikipedia for online cyber-archeologists.
- If the owner of the bot is ever blocked, they will continue to have admin rights with the bot, which they could have renamed.
- There is no mechanism for review of the bot. It could evolve and delete more, yet we have no recourse short of a big effort. There are a few admins who are abusive but even they are never recalled.
- The rules for adminship allow only people. Supporters of the bot may cry "wikilawyering" but they are guilty of breaking the rules for their own convenience.
- The creator of this bot has been an admin for only a short time. People have to be around for 6-12 months before RFA, so ST47 should be an admin for 6-12 months before being allowed a 2nd admin power.
- The bot has a flaw because it cannot judge whether the indef block was justified. If it wasn't justified then the bot is committing vandalism.
- I do not hate ST47 but I don't like the idea of a computer program having more power than 99% of editors and that it is under less control once given admin powers.
- Also oppose under the reason that the bot can't exercise human judgement like a person. AS 001 22:35, 3 November 2007 (UTC)
- I'll try to address some of your points:
- First, this is true of all bots. Do you suggest that we shut down every bot on Wikipedia?
- If I start acting like a human with the second account after I'm blocked, someone would be suspicious.
- This bot would never gain more tasks without an RfA, and any admin can block it if it attempts to do so.
- --uǝʌǝsʎʇɹnoɟʇs 22:54, 3 November 2007 (UTC)
- Thanks for answering some points! You haven't answered most of the concerns so I believe that they are still valid points. AS 001 2:58, 3 November 2007 (UTC)
- Ok, I have a little more time now. Per points 2, 4, and 7: These concerns would be true even if a human was doing the tagging. As for point 6, a second admin account does not give me any more power - as I said above, if I am blocked and suddenly start acting as my bot, someone would notice. --uǝʌǝsʎʇɹnoɟʇs 23:31, 3 November 2007 (UTC)
- (e/c) I can answer some more points:
- Point 1: When has a user ever been blocked for trolling when complaining a bot is broken? From my experience, such requests made to WP:ANI are always taken seriously and responded to quickly.
- Point 2: Most of these are just pages of vandal only accounts, the contributions and logs of the account will still exist, that's usually what matters for investigation; pages tagged with sockpuppetry related tags will be ignored. Wikipedia is an encyclopedia, not a tool for cyber archaeologists, and why would they care about users blocked for vandalism in the past, we have plenty in the present.
- Point 3: Why would a bureaucrat rename the account of a bot operated by a blocked user? If ST47 is blocked and the bot starts doing anything, it would be blocked.
- Point 4: Yes there is, you can see its deletion log. There is as much "mechanism for review" as there is with any bot or human. Has your computer ever evolved? Mine hasn't, this bot is just a few dozen lines of code.
- Point 5: "Breaking the rules for their own convenience" - you mean they are ignoring them? Ignoring rules can be within the rules. We already have 1 admin bot User:RedirectCleanupBot and another would have passed RFA had it not been superseded by a software change. There is also User:MediaWiki default, run by the developers that can delete pages.
- Point 6: 6 months is fine but 5 isn't? I was made an admin with 5 months of experience, other users have needed far less. Has ST47 shown himself to be an abusive admin? It's not an extra power, it's just a second account doing something automatically that ST47 can already do by hand.
- Point 7: Even if they were blocked unfairly and did not complain for a month, how is deleting their talk page considered vandalism? They can still recreate it to contest their block. This is where we assume good faith on the part of the blocking admin.
- Point 8: It will be under as much control as any other bot. Admin actions are not irreversible.
- Point 9: No bot can exercise human judgment; by that logic, we should not use any bots here. We trust robots and computers to build our cars and keep track of trillions of dollars. Why should we not trust one to delete user talk pages of blocked users on Wikipedia that have not been edited in a month? Mr.Z-man 23:56, 3 November 2007 (UTC)
- Note: the user who gave this oppose has been blocked indefinitely as a sockpuppet. Oppose indented accordingly. Acalamari 21:37, 5 November 2007 (UTC)
- Oppose I won't support any bot as admin. My own personal pref is that only carbon-based life forms with the ability to rationalize should be granted the tools. the_undertow talk 00:30, 4 November 2007 (UTC)
- No. This is nothing like RedirectCleanupBot - that was coded in a way that there was NO way the bot could make mistakes. This bot can make mistakes. This task requires judgement (see Majorly's comment). This should not be done by any bot, and especially not by a bot with as many flaws as have been noted by botops more experienced than myself. — H2O — 00:33, 4 November 2007 (UTC)
- Strong Oppose per all opposers above. There's a reason why only 1,374 of 5,733,215 users are administrators: tasks such as these need human judgment. I doubt any bot besides RedirectCleanupBot will be granted sysop buttons for some time. --Pumpmeup 00:49, 4 November 2007 (UTC)
- Oppose This is not a task that needs to be done blindly. We've had previous administrators attack this category with blind bot-like disregard and it did nothing more than create additional work and drama. Userpages are frequently placed in the temporary pages category incorrectly, and it takes human judgment to determine whether or not these pages should be deleted. Absolutely not.- auburnpilot talk 02:00, 4 November 2007 (UTC)
- Oppose per Majorly and AP. Please note that I am in no way opposed to adminbots whose scopes are clear and benefits tangible (see comments at ProtectionBot and RedirectCleanupBot). However, contrary to the claims of the applicant, this is a category that absolutely does require human judgment and manual review of history prior to deletion. Additionally, the existence of the pages poses little to no threat to the encyclopedia. This is not a good target for a non-Turing-complete being. —bbatsell ¿? ✍ 02:23, 4 November 2007 (UTC)
- Strong oppose, there are a variety of factors which would stop me deleting a page in these categories, and a number of them can't be scripted. Clearing this category needs judgement. Daniel 02:37, 4 November 2007 (UTC)
- Oppose I think there are far too many problems that people have listed for this bot to work as it should. Captain panda 03:47, 4 November 2007 (UTC)
- No Potentially problematic. -- Anonymous DissidentTalk 03:50, 4 November 2007 (UTC)
- Oppose I'm sorry, but I'm not too keen on having bots as admins. We give the powers of adminship to people because of their subjectiveness. I feel that we need humans to deal with all the "what if"s that can arise in Wikipedia, not bots that run according to a typed out code that only works for textbook examples. Icestorm815 05:16, 4 November 2007 (UTC)
- Oppose sorry but we should worry about making people admins before bots. SashaCall (Sign!)/(Talk!) 06:17, 4 November 2007 (UTC)
- That's illogical. -- John Reaves 06:30, 4 November 2007 (UTC)
- This job requires human judgement, and the bot can't replicate this. ~ Sebi 09:13, 4 November 2007 (UTC)
- Oppose, not a suitable task for a bot - judgement is needed. Plus I have the impression from the above that ST47 already runs bots on his admin account and gets away with it, so this isn't needed. Neil ☎ 09:41, 4 November 2007 (UTC)
- Oppose I feel like there is too much room for error. Great idea at the basis though! Jmlk17 10:20, 4 November 2007 (UTC)
- Just because there was recently one admin bot, does not mean to say that we need to make it a regular thing. Qst 11:51, 4 November 2007 (UTC)
- Oppose Function proposed for the bot is inappropriate for a bot, as it requires human discernment. Xoloz 13:35, 4 November 2007 (UTC)
- Oppose Per my comments on Wikipedia:Bots/Requests for approval/TempDeletionBot, I believe that more technical issues and controls need to be discussed first. — xaosflux Talk 15:46, 4 November 2007 (UTC) (BAG member)
- Oppose - I'm not convinced that this task is appropriate for a bot due to judgment required. --After Midnight 0001 15:57, 4 November 2007 (UTC)
- Oppose. I'm extremely wary of giving admin powers to any bot, and this appears a situation in which judgement is at least occasionally required. Espresso Addict 16:27, 4 November 2007 (UTC)
- Reluctant Oppose. While this sounds like a great idea in theory, I agree that this is a place where judgment is really necessary. If it can't pass a Turing test, it shouldn't be deleting pages. Sorry. - Revolving Bugbear (formerly Che Nuevara) 16:43, 4 November 2007 (UTC)
- I do not believe we need to be granting adminship to this bot. The task requires more judgment than any algorithm could produce. Mercury 17:04, 4 November 2007 (UTC)
- Oppose, per Majorly. @pple complain 17:47, 4 November 2007 (UTC)
- Oppose, rather strongly. I'm not convinced that this task is suitable for automation, despite the comments of the nominator and others. Somebody could vandalize a remotely hidden userbox, include hundreds of "regular" userpages in the category, and have the bot nuke them. While yes, that is reversible, no thanks. Titoxd(?!? - cool stuff) 18:51, 4 November 2007 (UTC)
- The bot checks the user's block log to make sure they are actually indef blocked. Mr.Z-man 19:00, 4 November 2007 (UTC)
- Still. Some "temporary" userpages include information about a user who is either a sockpuppeteer, or has information about why the user is banned. My point here is that this task requires human intervention. Titoxd(?!? - cool stuff) 20:00, 4 November 2007 (UTC)
- Oppose. This is really a task that requires human eyes. The concerns about previous bot operation are also worrying, I'm not particularly satisfied with ST47's answers to those. --bainer (talk) 01:13, 5 November 2007 (UTC)
- Oppose. As much as I think this area could use clean-up, my admittedly casual experience with it indicated that human judgment was necessary. Even a cursory look showed all sorts of variables within just a few different pages that I suspect are outside the scope of this bot. Sorry. Pigmanwhat?/trail 01:20, 5 November 2007 (UTC)
- Oppose Indef. blocked members can always come back to the encyclopedia with permission from the community, Jimbo, or Arbcom. I also agree with the point raised before that some sock pages may be deleted if we enable this bot. Miranda 04:27, 5 November 2007 (UTC)
- Oppose. The targeted task needs human discretion, and should not be assigned an automaton. There is no way the bot is going to know about suspected sockpuppets, meatpuppets, e-mails, and other matters which might affect the decision of whether to delete the userpage. Sjakkalle (Check!) 09:20, 5 November 2007 (UTC)
- Oppose per Majorly's succinct explanation of the problems with the candidacy of this bot. K. Scott Bailey 15:11, 5 November 2007 (UTC)
- Oppose per Majorly and WP:SENSE. Think about it. Firstly, Majorly is quite right that this task requires human judgment. Secondly, is it really worth the risk? Unlike broken redirects, temporary userpages are not part of the encyclopedia. As per Wikipedia:Don't worry about performance, and the fact that deleted material is kept in the archives, there is no urgent need for them to be deleted quickly. I'm not arguing that we shouldn't bother deleting them, but I do think that it's worth taking the extra time to have a human being doing this task. As to the question of ST47's judgment, I offer no opinion either way; I object to the idea of this bot, not to its operator. WaltonOne 15:18, 5 November 2007 (UTC)
- No need for a second account, not for every imaginable automated task that requires admin access, not for every single admin who wants to perform each task, holy accountcountitis, batman. If it needs doing, just do it (taking, of course, direct responsibility for any and all malfunctions, which I'm sure will be few and far between, and remarkably easy to fix, and won't happen again as long as you're programmed to keep a full list and never delete the same page twice, or whatever other reasonable precaution might be taken). As per RedirectCleanupBot, waste of process. —freak(talk) 17:44, 5 November 2007 (UTC)
- Oppose: Firstly, I can't see why all these pages should be deleted, and secondly, there appear to be technical issues with the bot's code, and with the operator of the bot. David Mestel(Talk) 18:29, 5 November 2007 (UTC)
- Oppose It would be nice to automate this kind of process, but I can't see this happening, because with all the concerns raised above, and possibly more in the future, the bot is bound to make a mistake at some point.
If this was simply the wrong talk page thread being archived, that can be reverted easily, but deleting a page would be a lot harder to revert because a) non-admins can't revert it and wouldn't know where to look since the content is hidden and b) deleting the page takes it off the category and what-links-here lists etc, making the page harder to track down by admins.
If this RfA fails, perhaps another option to try would be to have a bot generate a list of pages in Category:Temporary Wikipedian userpages that are over a month old to allow admins to more easily manually review them. Tra (Talk) 22:36, 5 November 2007 (UTC)
Neutral
- Protest neutral - wrong venue. (Compare with my comments on Wikipedia:Requests for adminship/RedirectCleanupBot.) This should be done via a proper WP:BRFA followed by an automatic +sysop, even more so that this not a joint-venture of two sysops (like the previous bot), so no "shared accounts" etc. are involved. Миша13 21:15, 3 November 2007 (UTC)
- Let me chime in as (one of) the voice of the bot approval group. The reason I have explicitly requested that this RfA takes place before proceeding with the bot request is that sysop bots require much wider consensus and approval than the simple technical overview that the BRFA requires. In addition, the character and trust of the operator is very much at issue, and only an RfA is proper venue to evaluate that.
As for why this does not take place after approval by the BAG, it's a simple matter: it is not possible to evaluate the proper functioning of the bot during a trial run if the bot is unable to perform its function because it does not yet have the bit, and there is no point in approving a bot that could never perform its function if this RfA should fail. — Coren (talk) 21:30, 3 November 2007 (UTC)
- BRFA is the proper venue for technical overview - operational discussions should be held there. RfA is not - with all due respect for participants, many have no clue about running bots - RfA is for establishing trust towards the candidate. As far as I know, ST47 has been granted that trust already. If that trust is in doubt, RfC or RfAr are the places to go.
- As for your latter comment, it's a classic chicken or the egg dilemma which proves why RfA is a b0rken way to solve this issue. Миша13 21:48, 3 November 2007 (UTC)
- I don't believe the community is comfortable with that yet, and if I did simply request +sysop from a crat, they would not be willing to do it. --uǝʌǝsʎʇɹnoɟʇs 21:33, 3 November 2007 (UTC)
- Which in the least doesn't change the fact that it's not the way it should be done. Does this mean btw that the 'crats don't have the balls for the job? Миша13 21:48, 3 November 2007 (UTC)
- No, but it's arguable that this, and RedirectCleanupBot's RfA are establishing exactly what is the proper procedure. Allowing sysop bots at all is a recent development and it would be premature to shortcut the RfA at this stage— while the community seems to be fairly satisfied that the BAG does a good job at selecting bots, it was not given the mandate to authorize sysop flags and we'd be overstepping if we did ('crats would be well justified to ignore us on that matter).
This does not mean that this procedure is set in stone; when and if the community feels comfortable with the BAG allowing automated sysops we can revisit the issue. For that matter, this could be a good topic for a RfC now— but until then this is pretty much the most reasonable course of action. — Coren (talk) 21:54, 3 November 2007 (UTC)
- Misza, respectfully, I believe you've misunderstood the purpose of this RFA. It's not to make a technical evaluation of the bot, but rather to decide whether or not this task is appropriate for any bot. --JayHenry 22:14, 3 November 2007 (UTC)
- Respectfully, "Requests for adminship (RfA) is the process by which the Wikipedia community decides who will become administrators". ST47 has already become an administrator. Deciding whether any task is appropriate for a bot is the job of BAG+community at BRFA. Note that should the bot be approved on BRFA, it would be almost within policy for ST47 to run it on his main account. Why almost? Because WP:BOT requires that bots run on separate accounts. Ergo: the purpose of this RfA is to transfer the sysop bit to another account owned by the same person (therefore bypassing the limitation of "one +sysop per person"). Миша13 22:43, 3 November 2007 (UTC)
- All accounts that get sysopped go through RFA (well, almost all). It is not the job of the bot approval group to circumvent that. Neil ☎ 09:41, 4 November 2007 (UTC)
- People get sysopped, not accounts - obvious evidence being that former admins returning under a different username can have their new accounts sysopped after privately contacting ...
- ... a bureaucrat, who does the sysopping, not RfA - the latter is only meant to establish trust towards a person. Once. (Unless that trust has been seriously questioned.)
- Миша13 11:11, 4 November 2007 (UTC)
- Neutral - I'm sure unwanted deletions would be rare, but I think the benefits of the task are rather negligible as well. — xDanielx T/C 20:46, 4 November 2007 (UTC)
- The above adminship discussion is preserved as an archive of the discussion. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as the talk page of either this nomination or the nominated user). No further edits should be made to this page.