Wikipedia:Bot owners' noticeboard

From Wikipedia, the free encyclopedia


This is a message board for coordinating and discussing bot-related issues on Wikipedia (including other programs that interact with the MediaWiki software). Although its target audience is bot owners, any user is welcome to leave a message or join the discussion here.

This is not the place for requests for bot approval or for requesting that tasks be done by a bot. It is also not the place for general questions about the MediaWiki software (such as the use of templates, etc.), which generally have a better chance of being answered at WP:VPT.




RFC-BOT Process


So from time to time we have an issue with a bot running out of control, unapproved bots being run, etc. In a recent matter (actually it's still going on, but that's beside the point), members of the community seemed to discuss a proposal that would have (IMHO) compelled a bot owner to change the operation of his bot. But the bot owner was already on record as saying he would not pay attention to that proposal. I asked the crats what sort of consensus they would look for, and WJBscribe indicated they'd look towards the BAG [1] and that it might be nice if the BAG had some formal process "where someone can raise problems with bots and BAG can evaluate whether to require changes to the bot's operation be made in order for approval not to be withdrawn." I'm thinking a possible extension might be an RFC-bot, modeled on the RFC-user conduct and RFC-policy systems. Or something akin to Admin Recall, if it could be applied to all bots equally (not 500 different processes). Other ideas? MBisanz talk 07:56, 21 February 2008 (UTC)

Well, we already had a sort of rfc/bot attempt as a subpage of WP:Bots. (I'm not linking it, and if it doesn't get deleted it can be found with a prefix search.) There is a more general problem of lack of oversight of bot tasks. Basically, anyone can do anything. Bot policy allows "assisted scripts" to work without approval with very little real restriction, so long as it doesn't edit so fast that it cannot reasonably be an assisted script. If we AGF, it's difficult to justify under policy blocking most bots that do not display straightforward bugs.
One idea I've had is to say that any bot needs approval which 1) edits at a clip too fast to reasonably check a good proportion of its edits (maybe 5 per minute?), or 2) is doing a single job with more than some number of edits (maybe 1000), or 3) the operator does not respond to inquiries within 15-20 minutes (an operational definition of an unassisted script). And likewise, any *task* over 1000 edits needs to be posted somewhere (WP:BOTREQ, for instance) with time allowed for objections, unless an exemption is part of the bot approval for a specific type of task. That way admins will have something specific to point to when dealing with editors who start up AWB and make hundreds of edits removing spaces, as noted above. It would address the issue with certain javascript tools which tie up a browser for an hour. And BetacommandBot would be given a considerably larger per-task/per-day edit allowance for image tagging, but whether that's 2000 or 5000 per day would have some community input. Gimmetrow 08:26, 21 February 2008 (UTC)
Those are good ideas, but I think they tackle the bigger problem of not being able to keep track of all the bots and all their approved tasks (look here [2] at that prefix you gave me). Given the recent, shall I use the word, forum shopping with BCB (which would be severely frowned upon if it happened to a human user or a policy), I'm wondering if we couldn't codify the WP:Bots subspace system. Like with a standard page naming format, rules of what IS a complaint, endorsing users, consensus closing, etc. MBisanz talk 08:42, 21 February 2008 (UTC)
I've been looking at Wikipedia:Requests for comment/User conduct and didn't realize there was a separate section for Admins and non-Admins. Maybe a third Bot section that creates a new page in the Bot subspace? MBisanz talk 23:32, 21 February 2008 (UTC)
Seeing as anything done can be undone, I've created the following process, Wikipedia:Requests_for_comment/User_conduct#Use_of_bot_privileges, as a proposal to show what I'm thinking of. MBisanz talk 02:18, 22 February 2008 (UTC)
Thanks for that, MBisanz. Hopefully this will work. I would like to see WP:BAG make some official, or semi-official, pronouncement about this, as it will need their support to work. Some acknowledgment that they will act constructively on the results of bot requests for comments (i.e. explaining things to people) rather than just dismissing "attack pages". How does one go about getting an "official" response from WP:BAG? Carcharoth (talk) 11:37, 23 February 2008 (UTC)
I'm counting 1413 active BAGers. Maybe some survey of them of how they'd respond to a Bot-RfC? My inspiration for this was WJB's suggestion at Wikipedia:BN#Bot_change, so maybe it would be better to wait till we have a Bot-RfC that comes to a consensus to do something, the operator refuses, and then see if the BAG responds. MBisanz talk 04:29, 24 February 2008 (UTC)


(copied from BN) IMHO WP:BRFA isn't enough in this respect. Consensus (and the bots themselves) can and do change. There needs to be a process for governing bot (and bot owner) activity, including withdrawing approval if necessary. Sure, bots can be blocked, but that tends to be reactionary and only takes one admin. I had a bot blocked a few days ago (see "bot out of control" above) too, and it just seems that, for lack of sufficient process, the block (which was not given a time limit) was just forgotten. We, as a community, need the ability to govern bots because, when it comes down to it, they are just too efficient. This bitterness and resentment seems to stem mostly from the lack of binding recourse either for the sake of justifying a bot, or for governing one. But as I said, it's just my opinion. Adam McCormick (talk) 07:46, 24 February 2008 (UTC)

If there is a serious issue with a bot, leave a note on WT:BRFA and BAG will review the situation. βcommand 14:39, 12 March 2008 (UTC)

{{t1|nobots}} proposal.


Just wanted to drop a note here, there is presently a proposal underway at WT:BOTS, to require that all bots be {{nobots}} compliant. SQLQuery me! 03:28, 9 March 2008 (UTC)

nobots needs to be redesigned


I just got around to looking at the nobots system, and realized how far from best practices it is. The system is premised on the historical practice of downloading the entire content of a page before making any edit, even if the edit is only to append a new section to the bottom. Once the API editing is implemented, we probably won't need to download any page text at all to get an edit token and commit the new section. At that point, the nobots system will be completely broken.

It seems to me that we should discuss a nobots system that doesn't require bots to perform lots of needless downloads. Perhaps a database of per-bot exclusion lists, like Wikipedia:Nobots/BOTNAME or something like that, which would only require one fetch to get the full list. — Carl (CBM · talk) 12:59, 9 March 2008 (UTC)
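Carl's per-bot exclusion page could be as simple as a bulleted list of page links, fetched once per run. The page format sketched here ('* [[Page]]' bullets) and the parser are assumptions for illustration, since only the page name Wikipedia:Nobots/BOTNAME is suggested above:

```python
import re

def parse_exclusion_list(wikitext):
    """Parse a hypothetical per-bot exclusion page (one '* [[Page name]]'
    bullet per excluded title) into a set of titles, so the bot needs a
    single fetch up front instead of downloading every page it might edit."""
    return set(re.findall(r"^\*\s*\[\[([^\]|]+)", wikitext, re.M))
```

The bot would then test membership locally before each edit, with no extra HTTP traffic.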

Appending sections doesn't make any sense in mainspace (and other content namespaces) because it would break layout by adding information after stubs, categories and interwikis. If it's not for mainspace, most bot developers will ignore this possibility. MaxSem(Han shot first!) 13:07, 9 March 2008 (UTC)
It does make sense, however, for talk pages, which are the main source of interest for the nobots system. I don't advocate forcing bot developers to follow nobots or forcing them to use the new section editing method. But the current nobots system is (mis)designed assuming that all bots download the old page text before all edits, which is an actively bad assumption because it discourages bots from using more efficient editing methods. — Carl (CBM · talk) 13:20, 9 March 2008 (UTC)
There's whatlinkshere: section-editing bots could load all transclusions and then ignore everything that uses {{nobots}} and load whole pages for those who use {{bots}} with selective exclusion. I don't like the centralised system because it's prone to vandalism and eventually we'll have to fully protect the exclusion lists and build unnecessary bureaucracy around addition/removal from them. MaxSem(Han shot first!) 13:32, 9 March 2008 (UTC)
I agree it's a pain to have the exclusion lists on the wiki. By the way, because of bugzilla:12971, a bot would need to use the API list=embeddedin rather than backlinks to get a list of pages.
But the current system is worse. It is inherently flawed and riddled with technical problems because bots are not as smart as the wiki preprocessor. If the template is on user pages then bots need to be able to parse it as robustly as the wiki parser does. So this code should work:
{{bots|allow={{MyBotAllowList}}}}.
The only implementation I have seen of nobots is the pywikipedia one, and it does not support this because it does not resolve transclusions in template parameters. Also, if {{bots}} was transcluded from a user box, pywikipedia would not notice it even though it would be listed as a transclusion by the API. Which is reasonable enough - nobody should expect bots to parse pages in this way. — Carl (CBM · talk) 13:49, 9 March 2008 (UTC)
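For reference, the plain-text semantics of {{bots}}/{{nobots}} being debated here can be sketched in a few lines of Python. This covers only the simple, non-transcluded cases ({{nobots}} denies everyone; {{bots|allow=...}} and {{bots|deny=...}} take comma-separated bot names or "all"); it deliberately does not resolve transclusions, which is exactly the robustness gap Carl describes:

```python
import re

def edit_allowed(page_text, bot_name):
    """Return True if bot_name may edit a page with this wikitext, per the
    plain {{bots}}/{{nobots}} semantics. No transclusion resolution -- a
    {{bots}} inside a userbox or a templated parameter is invisible here."""
    if re.search(r"\{\{\s*nobots\s*\}\}", page_text, re.I):
        return False
    m = re.search(r"\{\{\s*bots\s*\|\s*allow\s*=\s*([^}]*)\}\}", page_text, re.I)
    if m:
        allowed = [n.strip() for n in m.group(1).split(",")]
        return "all" in allowed or bot_name in allowed
    m = re.search(r"\{\{\s*bots\s*\|\s*deny\s*=\s*([^}]*)\}\}", page_text, re.I)
    if m:
        denied = [n.strip() for n in m.group(1).split(",")]
        return "all" not in denied and bot_name not in denied
    return True
```

Note that a page using {{bots|allow={{MyBotAllowList}}}} defeats this parser, just as it defeats the pywikipedia one.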
Is there a [[Category:]]-like system that we could use? Just throwing out ideas - I doubt there is, but am trying to think "outside the hammer". -- SatyrTN (talk / contribs) 16:03, 9 March 2008 (UTC)
Here's one possibility along those lines. We could set up categories like Category:Pages not to be edited by bots, Category:Pages not to be edited by BOTNAME, Category:Pages that may be edited by BOTNAME and overload the bots and nobots template so that code like {{nobots|BOT1|BOT2|BOT3}} puts the page into the appropriate categories. — Carl (CBM · talk) 18:39, 9 March 2008 (UTC)
A category system just creates massive overhead: the bot would have to load the entire category tree and hold it in memory, whether or not the page was ever actually called on. Clever programming could minimise the overhead, but it's still quite substantial. For userpages, I would advocate using something like User:Example/bots.css. How efficient is a check for page existence? I don't know off the top of my head, but I expect it's pretty low overhead. Whenever a bot wants to edit a page in userspace, it checks for the existence of a bots.css page for that user. If it doesn't exist, it knows it has free rein in the userspace. If it does exist, it loads the page and parses it - we can work out the most versatile and efficient coding - and from that learns which bots can edit which pages in the user's userspace. This provides the additional advantage of being able to easily apply nobots to all your subpages if you so wish. I'm thinking something along the lines of:
exclude [[User:SineBot]] from [[User talk:Happy-melon]]
exclude [[User:MelonBot]] from [[User:Happy-melon]] [[User:Happy-melon/About]] [[User:Happy-melon/Boxes]]
exclude [[User:ClueBot]] from all
exclude all from [[User:Happy-melon/Articles]]
Is pretty easy to read by humans, easy to parse by bots, and easy to debug (redlinks = bad). Not sure how this would extend outside userspace, but how often is nobots used in other namespaces? Comments? Happymelon 20:40, 9 March 2008 (UTC)
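Happy-melon's proposed syntax is indeed simple enough that a parser fits in a few lines. This sketch assumes the "exclude BOT from PAGES" grammar shown above, with "all" allowed on either side; the function names and rule representation are hypothetical:

```python
import re

def parse_bots_css(text):
    """Parse the proposed 'exclude BOT from PAGES' lines into (bot, pages)
    rules, where either side may be the literal 'all'. Lines that don't
    match the grammar are ignored."""
    rules = []
    for line in text.splitlines():
        m = re.match(r"\s*exclude\s+(.+?)\s+from\s+(.+)", line)
        if not m:
            continue
        bot_part, page_part = m.groups()
        bot = "all" if bot_part.strip() == "all" else re.findall(r"\[\[(.+?)\]\]", bot_part)[0]
        pages = "all" if page_part.strip() == "all" else re.findall(r"\[\[(.+?)\]\]", page_part)
        rules.append((bot, pages))
    return rules

def is_excluded(rules, bot, page):
    """True if any rule forbids this bot from editing this page."""
    for rule_bot, rule_pages in rules:
        if rule_bot in ("all", bot) and (rule_pages == "all" or page in rule_pages):
            return True
    return False
```

Redlink debugging works as described: any [[...]] target that fails to parse or doesn't exist is immediately visible on the rendered page.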
When should {{nobots}} ever apply outside the user talk? BJTalk 20:50, 9 March 2008 (UTC)
I've had various instances of editors telling me to keep SatyrBot from adding WikiProject banners to the talk page of articles. I don't know if that's valid, but that's one instance where nobots might apply. And don't we tell certain bots to archive/not archive various talk pages? Or tell sinebot to watch / not watch certain pages? -- SatyrTN (talk / contribs) 21:11, 9 March 2008 (UTC)
Archiving is opt in and Sinebot had a cat last time I checked. BJTalk 21:17, 9 March 2008 (UTC)
Indeed, those were just the first three bots I could think of - don't think of them as anything more than examples. What do you think of the actual system? Happymelon 21:29, 9 March 2008 (UTC)
It's basically robots.txt, which has worked for years. But I still see no use for it. BJTalk 21:33, 9 March 2008 (UTC)
That was my inspiration, yes. I can't fully see the use of it myself, but someone said {{nobots}} was becoming obsolete, and proposed a (to my mind) impractical solution, so I came up with my own (hopefully less impractical) idea. Happymelon 21:37, 9 March 2008 (UTC)
Banners on talk pages. -- SatyrTN (talk / contribs) 21:39, 9 March 2008 (UTC)
I have no idea how your bot works but any nobots system doesn't seem like the best way to deal with that. BJTalk 21:45, 9 March 2008 (UTC)

Happy-melon: what do you mean the bot would hold the entire category tree in memory? There would be at most three categories to read: the list of pages forbidding all bots, the list permitting that particular bot, and the list forbidding that particular bot. This would mean (unless any of the lists is over 5000 entries long) only three HTTP queries, one time, to load the exclusions list. That's reasonable.

On the other hand, any system that requires an extra HTTP query for every edit that must be made is unreasonable because it is vastly inefficient. It would be possible to reduce the number of extra queries if you were just looking for page existence, but still every single bot.css file or whatever would have to be loaded, every time the bot wants to edit the corresponding page. That's far from ideal design. — Carl (CBM · talk) 22:59, 9 March 2008 (UTC)
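The three-lists model Carl describes reduces each editability decision to local set lookups once the category contents have been fetched (three queries, one time). A sketch, where the precedence of the per-bot allow list over the global deny list is an assumption, since the proposal doesn't spell it out:

```python
def may_edit(page, deny_all, deny_bot, allow_bot):
    """Decide editability from the three category listings discussed:
    deny_all  -- members of 'Pages not to be edited by bots'
    deny_bot  -- members of 'Pages not to be edited by BOTNAME'
    allow_bot -- members of 'Pages that may be edited by BOTNAME'
    Each set is fetched once per run, not once per edit."""
    if page in allow_bot:  # assumed: explicit per-bot permission wins
        return True
    if page in deny_bot or page in deny_all:
        return False
    return True
```

The point of the design is visible in the signature: nothing here costs an HTTP query at edit time.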

That is true, however, consider the ramifications of actually maintaining such a system, not simply using it. There are four hundred and eight accounts with the bot flag on the English Wikipedia, meaning that a complete system would require at least 800 categories to be created. We can't justify not creating the categories until they are needed, otherwise we will have slews of problems like "OI, my userpage was in Category:Pages not to be edited by SignBot, why did I still get notified??"
The problem we are essentially dealing with is that we need, in some manner, to compile a database table in an environment which doesn't really support multidimensional structures. We have a large number of bots which edit userpages; we have a larger number of userpages which might be edited by bots. We have to cross-reference those data sets in the most efficient manner possible. The real question is: do we divide the table up by bot, or by userpage? The categories system is an attempt to break the table up by bot, which makes it easy for the table to be parsed by the bots which need to use it. The bots.css system breaks the table up by user, which makes it easier for individual users to manage it. I am of the opinion that, since the bots exist to serve the users, not the other way around, our priority should be to create a system that editors can use easily, even those with no programming experience. Asking them to add each individual page to Category:Pages not to be edited by bots is much more time-consuming and error-prone for them than just adding "exclude all from all" to one file. There's no reason why we can't use a bot to generate the alternative version of the table - even just a regularly-updated list of existing bots.css pages would reduce overhead. If the updating bot checked newpage-tagged RecentChanges, and the deletion logs, for pages with "/bots.css" in the title, the list would be completely current. I doubt that's a particularly onerous task, but it's not one that is even necessary for the system. Essentially what I'm saying is, Wikipedia's back streets are created for its editors, not for its bots - any system should put user-interface first, and bot-interface second. Of course we should endeavour to optimise both interfaces, but if that's not possible, the humans should win. Happymelon 19:35, 10 March 2008 (UTC)
It doesn't seem difficult to me to maintain two overall categories plus two categories per bot, given that the number of exceptions is always going to be very low for properly designed bots. Categories are, in a way, easier for individual users than writing a file using some new syntax that they don't already know. Adding a category is a task everyone is familiar with. (As an aside, naming it shouldn't be .css since it isn't a style sheet.)
But I would prefer to see a per-bot blacklist in any case; the idea of categories is only one proposal. — Carl (CBM · talk) 20:00, 10 March 2008 (UTC)
I suggested .css subpages because they can only be edited by the user, thus preventing vandalism. In the same way, it's just an idea. My main problem with categories is the necessity of maintaining a large tree of mostly-empty categories, since to avoid errors each bot should have existing categories, whether or not they are populated. Happymelon 13:09, 12 March 2008 (UTC)
It makes no difference from the point of view of the bot whether the category page exists or not - the contents of the category can still be queried either way. So I don't see the need to create all the categories at once. But I also am not a strong proponent of the category system - a simple blacklist maintained for each bot would be fine. — Carl (CBM · talk) 13:49, 12 March 2008 (UTC)

BJBot


I would like to urge those who approve bots, that bots like BJBot — which left an unwanted long notice on my talk page because I made a single edit to Adam Powell, telling me that it was listed on AfD — should honour {{nobots}}.

As a side note, this response is rather uncalled-for behaviour for a bot operator. I'm glad he struck that later, but it's still disappointing. Requests by users not to notify them should only be ignored if there is a good reason to do so. --Ligulem (talk) 19:00, 9 March 2008 (UTC)

How do I roll my eyes over the internet? If you would like something changed ask, don't tell me to stop running my bot. BJTalk 19:18, 9 March 2008 (UTC)
Well, this rant did actually include a hidden gem of a bug report. Thanks. BJTalk 20:08, 9 March 2008 (UTC)
Consider this bot not having my approval. --Ligulem (talk) 20:32, 9 March 2008 (UTC)
k? BJTalk 20:35, 9 March 2008 (UTC)
I personally wouldn't require anyone to implement the current nobots system. BJ, was the bug you mentioned that this editor shouldn't have gotten a notice? It does seem odd if everyone who edited the article even once gets notified. — Carl (CBM · talk) 14:54, 10 March 2008 (UTC)
You might want to read Wikipedia:Bots/Requests for approval/BJBot 4, where Bjweeks said he had implemented {{nobots}}. When I asked him to stop his bot until that actually works, he first denied my request (later striking his denial) and labelled my comment here as a "rant". Besides, that bot task is entirely unneeded and unwanted anyway, so it doesn't have my approval (even if it would work as advertised). We simply don't need nor want this hard-core talk page spamming. After all, there is a watchlist feature for a purpose. --Ligulem (talk) 16:40, 10 March 2008 (UTC)
I don't think that your individual approval (or mine, since I'm not a BAG member) is the deciding factor. But BJ did say in the bot request that the bot would honor nobots, and I think it is a reasonable thing for this bot to do, if its purpose is mainly to notify users on their talk pages. — Carl (CBM · talk) 17:01, 10 March 2008 (UTC)
There is no consensus for running this bot task. That's the deciding factor. BAG implements consensus. And as an admin, I may block a bot that doesn't follow its approval if its owner is unwilling to stop and fix it after I have asked him to do so. --Ligulem (talk) 17:48, 10 March 2008 (UTC)
The bot seems to be notifying a hell of a lot of people for a single AfD. Can we stop the bot, reopen the BRFA and seek wide community input please (as this task probably affects most of the community and could do with broader input than that provided in the previous one-day BRFA)? Martinp23 18:06, 10 March 2008 (UTC)

I would like to see the approval for this task looked into further by BAG. The notifying of people with very few edits to articles seems rather an annoyance and the bot seems to be notifying a lot of people (IPs included) - I count about 50 notifications about the proposed deletion of Prussian Blue (duo) alone. This was probably a request that should have been scrutinised a little longer... WjBscribe 18:22, 10 March 2008 (UTC)

This request should not have been granted. But since there seems to be no procedure for withdrawing erroneous approvals, chances are small that anything will happen here. In case BAG or whoever actually does review this bot's task, I suggest at least rethinking whether it really makes sense to post lengthy notices about article deletions when that editor's last edit to the article at hand dates back more than a year. Furthermore, notifying admins about page deletions is particularly pointless, since we can still see "deleted" pages anyway. Also, wiki-gnomes like myself who currently don't edit and who have many thousands of small edits in their contribs are particularly annoyed by having their talk pages plastered with these pointless, wordy "notifications", which don't serve much more than making inactive editors' talk pages look like they pertain to some stupid newbie who needs a pile of corrective warnings about his misplaced steps on this wiki.
This project has really gone mad. Some bot operators with approvals seem to think they are on a heroic mission here and are prepared to knee-jerk reject requests to stop and fix their bots. This attitude is harmful to this project. But that seems to be the norm nowadays on Wikipedia. --Ligulem (talk) 01:07, 12 March 2008 (UTC)
What part of it was a bug do you not get? BJTalk 02:57, 12 March 2008 (UTC)
We could just, you know, ask him to do something about it... --uǝʌǝsʎʇɹnoɟʇs(st47) 19:53, 10 March 2008 (UTC)
I'm confused, isn't that what's been going on so far? —Locke Coletc 20:06, 10 March 2008 (UTC)
I've made some small changes which halved the number of notices to that article. I'm also working on adding a check for when the person last edited the article. That should be done by tomorrow. BJTalk 03:50, 12 March 2008 (UTC)

There were in fact two different bugs that allowed Ligulem to get a notice. The first was me playing around with nobots early in the morning, and had been fixed for hours (what he requested fixed on my talk); the second I didn't notice until he posted his rant here ("only one edit" got my interest), and I also fixed that. If anybody sees unwarranted notices, leave a message on the bot's talk with a diff. I also plan to disable IP notices per a message on my talk, which should further reduce notices. BJTalk 01:48, 11 March 2008 (UTC)

Thanks for responding to, and fixing, the bugs mentioned somewhere in this complaint. Also, thanks for staying cool on this one. SQLQuery me! 04:16, 12 March 2008 (UTC)
So you do think that this response by BJ was fine? --Ligulem (talk) 09:02, 12 March 2008 (UTC)
This was clearly a misunderstanding by the operator, which he has since corrected. It's not a big deal. -- maelgwn - talk 09:30, 12 March 2008 (UTC)
Yes it's not a big deal, but it would have been nice to admit that in the first place instead of labelling my post here as a "rant". Second, it seems somewhat of an irony, that it was SQL who fully protected his talk page recently [3]. Of course, I do understand that he was under very tense stress in real life and with some recent on-wiki issues. --Ligulem (talk) 09:54, 12 March 2008 (UTC)

Opt-in instead of opt-out

I've added a new section on the approval discussion page at Wikipedia:Bots/Requests for approval/BJBot 4, proposing to use an opt-in procedure for task 4 (delete notifications). I suggest to follow-up at Wikipedia:Bots/Requests for approval/BJBot 4#Opt-in instead of opt-out. --Ligulem (talk) 10:40, 12 March 2008 (UTC)

Bot owner's essay


Is there an essay or guideline for how to deal with bot owners? I have, in the course of the past year, gotten comments and requests about my bot's behavior that range from polite through negative to downright abusive. I'm sure I've read something somewhere, but can someone point me to it? -- SatyrTN (talk / contribs) 21:14, 9 March 2008 (UTC)

There probably should be. Coming in screaming, or even threatening to block / have blocked, is rarely productive. I don't think, however, that a simple essay somewhere would do much to solve the problem. It's just something you kinda have to deal with, in my opinion. SQLQuery me! 04:18, 12 March 2008 (UTC)

Pywikipedia getVersionHistory


Something changed in the format of history pages which broke pywikipedia's getVersionHistory. I've fixed it for my own needs, but heads up in case any other bots use this function. Gimmetrow 23:03, 10 March 2008 (UTC)

  • It would be appreciated if you could post a bug report and/or patch on the Pywikipediabot tracker. --Russ (talk) 01:13, 11 March 2008 (UTC)
    • Well the format of history pages changed again. Looks like it might be back to the old form. Gimmetrow 21:46, 11 March 2008 (UTC)

Bot roles for nobots


I propose to extend the {{bots}} specification to allow easier restriction of particular bot types. This involves creating pseudo-usernames to be used in allow and deny parameters, for example, username "AWB" relates to all AWB-based bots (already supported), other bot framework names could include "pywikipedia", "perlwikipedia", "WikiAccess", etc. Additionally, we could classify bots by roles they perform: "interwiki", "recat", "fairuse", "antivandal", "notifier", "RETF", "AWB general fixes" and so on. For convenience, these roles should be case-insensitive. MaxSem(Han shot first!) 10:23, 12 March 2008 (UTC)
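The role-matching MaxSem proposes could look something like the following sketch. The case-insensitive role comparison follows the proposal; the function names, and the idea of passing the bot's framework alongside its roles, are assumptions for illustration:

```python
def entry_matches(entry, bot_name, framework, roles):
    """True if an allow/deny entry names this bot directly, or names its
    framework (e.g. 'pywikipedia', 'AWB') or one of its roles (e.g.
    'interwiki', 'fairuse'); role/framework matching is case-insensitive."""
    if entry == bot_name:
        return True
    return entry.lower() in {framework.lower()} | {r.lower() for r in roles}

def denied(deny_entries, bot_name, framework, roles):
    """True if any deny entry applies to this bot."""
    return any(entry_matches(e, bot_name, framework, roles) for e in deny_entries)
```

As Ral315 notes below the proposal, the scheme only works to the extent that every bot self-reports roles honestly and consistently.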

Actually, if supported, I'd implement this (fairuse only). I dislike {{nobots}} as it disables messages without the user knowing what they are disabling. BJTalk 12:31, 12 March 2008 (UTC)
I think we need to consider a more fine-tuned system. But the big problem with this is that it implies that every single type of such a bot will follow this system, and clearly not every bot will. For example, my Signpost delivery bot wouldn't follow it, because it's opt-in anyway, and most people who use nobots wouldn't understand that they'd have to give an exception for me. Ral315 (talk) 19:36, 12 March 2008 (UTC)
A more fine-tuned system would be to blacklist each bot separately - not very convenient. MaxSem(Han shot first!) 19:49, 12 March 2008 (UTC)
I've got an idea for a more fine-tuned bots system, based somewhat on robots.txt -- I'll post a mockup tomorrow. Ral315 (talk) 19:54, 12 March 2008 (UTC)

I need some one who operates bots on OS X


I use OS X Tiger and I'm trying to run bots for the Telugu Wikipedia. I downloaded the Python framework from this page, and I created the user-config.py file, which reads:

mylang='te'
family='wikipedia'
usernames['wikipedia']['te']=u'Sai2020'

Sai2020 is my username. I open Terminal and type python login.py, but I get the error: python: can't open file 'login.py'

Can someone help me please Σαι ( Talk) 12:19, 12 March 2008 (UTC)

If you are on Tiger I'd recommend installing Python 2.5. BJTalk 12:28, 12 March 2008 (UTC)
Is Python installed on Tiger? βcommand 14:33, 12 March 2008 (UTC)
It is but it is 2.3. BJTalk 14:53, 12 March 2008 (UTC)
Sai2020, make sure you cd to the correct directory. For me, before I enter the code, I type cd ~/Bots/pywikipedia because my pywikipedia folder is in Users/soxred93/Bots/pywikipedia. Hope this helps! Soxred93 | talk bot 01:02, 13 March 2008 (UTC)


That was the problem. Once I cd'd, it worked, but I get a different error this time:

Sais-MacBook:~/Desktop/pywikipedia Sai$ python login.py
Traceback (most recent call last):
 File "login.py", line 49, in <module>
   import wikipedia, config
 File "/Users/Sai/Desktop/pywikipedia/wikipedia.py", line 127, in <module>
   import config, login
 File "/Users/Sai/Desktop/pywikipedia/config.py", line 364, in <module>
   execfile(_filename)
 File "./user-config.py", line 1
   {\rtf1\mac\ansicpg10000\cocoartf824\cocoasubrtf440
                                                    ^
SyntaxError: unexpected character after line continuation character

What's going on? I'm not very good at this kind of stuff... Σαι ( Talk) 01:27, 13 March 2008 (UTC)

Perhaps you created user-config.py with Apple's TextEdit? Maybe it created it as a rtf. Choose "Make Plain Text" from the "Format" menu and re-save the file. Staecker (talk) 01:44, 13 March 2008 (UTC)
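The {\rtf1\mac... line in the traceback is the giveaway: RTF files begin with the {\rtf1 control word, which Python then tries (and fails) to execute as source. A purely illustrative sanity check for this failure mode:

```python
def looks_like_rtf(first_line):
    """user-config.py must be plain text; TextEdit's default RTF format
    starts with the '{\\rtf1' control word, which is exactly what the
    SyntaxError above choked on."""
    return first_line.lstrip().startswith("{\\rtf1")
```

If this returns True for the first line of a config file, re-save it via TextEdit's "Make Plain Text" as Staecker suggests.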
Great, that was the problem. Once I run login.py, Terminal asks me for my password, but I can't enter it; whatever I type, nothing appears there... Σαι ( Talk) 05:08, 13 March 2008 (UTC)
Of course it does, it just doesn't echo it to the terminal. You just type and press enter. Snowolf How can I help? 06:34, 13 March 2008 (UTC)

Thank you very much people. I can now login :) Σαι ( Talk) 08:57, 13 March 2008 (UTC)

New Bot?


Any chance of a bot that automatically reverts any blanked page? One may already exist, but if so I'm not familiar with it. I've been chasing a lot of blankings lately in my anti-vandalism crusade. Thanks either way. Jasynnash2 (talk) 17:09, 14 March 2008 (UTC)

ClueBot seems to detect it. multichill (talk) 00:01, 15 March 2008 (UTC)

Proposal on WT:BRFA


Please offer input there if you have any :). Martinp23 19:33, 17 March 2008 (UTC)

Extended help wanted


I'm interested in developing my bot skills, particularly to running bots which operate on a continuous basis, rather than the more script-oriented bots I'm already operating. I'm looking for a more experienced bot coder/operator who can help me get to grips with the extra knowledge and tools required to operate continuously-running bots. Kind of an adopt-a-bot-owner system :D. I can work in C++ and VB, but all of my previous bot-coding experience has been in python. Anyone interested and willing to give me a hand? Happymelon 10:40, 18 March 2008 (UTC)

Problem with dotnetwikibot


Is anyone having a problem with dotnetwikibot today? As of this morning, any attempt to FillAllFromCategory is not working. I changed nothing in my code, which was working fine yesterday.

I placed a query about this at the sourceforge.net dotnetwikibot framework forum, but it doesn't appear to get a lot of traffic.

Any help would be appreciated. --Kbdank71 15:17, 20 March 2008 (UTC)

What error messages are you getting? Does the code use the API to get a category list? There has been a recent interface change: [4]. If this is the problem, it's easy to fix. — Carl (CBM · talk) 15:46, 20 March 2008 (UTC)
That's the problem, I'm not getting an error. I know it's hitting the FillAllFromCategory routine, as it returns "Getting category 'Category:Foo' contents...", but then it just acts as if there were no articles in the category. I added pl3.ShowTitles(); as a sanity check, and it shows no pages in the pagelist. I've tried using the api via FillallFromCategoryEx as well, with the same results. --Kbdank71 16:08, 20 March 2008 (UTC)
All that has changed is a parameter name in the api query - can you edit the source and recompile the library? If not, you'll have to find someone who maintains the code and get them to do it. — Carl (CBM · talk) 16:12, 20 March 2008 (UTC)
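For anyone hitting the same breakage, my reading of the change Carl links is that the category-listing parameter was renamed (cmcategory to cmtitle, the latter taking the full "Category:"-prefixed title). The exact parameter names here are an assumption, not documentation, so verify against the live API before relying on them:

```python
def categorymembers_params(category, new_api=True):
    """Build query-string parameters for a list=categorymembers request.
    The old/new names sketched here (cmcategory -> cmtitle) reflect my
    reading of the early-2008 API change under discussion."""
    params = {"action": "query", "list": "categorymembers", "format": "xml"}
    if new_api:
        params["cmtitle"] = "Category:" + category  # new style: full title
    else:
        params["cmcategory"] = category  # old style: bare category name
    return params
```

A library that silently gets zero results (rather than an error) is consistent with sending the old, now-ignored parameter name.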
Thanks for the help. I've tried to contact the developer to see if this can be fixed. Hopefully it can. --Kbdank71 20:09, 20 March 2008 (UTC)
Thanks again. The developer gave me a fix and will be updating the framework soon. --Kbdank71 12:58, 21 March 2008 (UTC)
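For reference, the kind of change Carl describes — a renamed parameter in the category-members API query — can be worked around in any framework by issuing the query yourself. A minimal Python sketch; the cmtitle/cmcategory rename is my reading of the 2008 interface change, and the helper names are illustrative, not DotNetWikiBot's internals:

```python
import json

def build_categorymembers_query(category):
    """Build the parameters for a list=categorymembers API query.
    Newer API versions expect cmtitle with the 'Category:' prefix
    rather than the older cmcategory parameter."""
    title = category if category.startswith("Category:") else "Category:" + category
    return {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": title,
        "cmlimit": "500",
        "format": "json",
    }

def parse_categorymembers(raw_json):
    """Extract page titles from the API's JSON response.
    (The HTTP GET against api.php is left to whatever library you use.)"""
    data = json.loads(raw_json)
    return [m["title"] for m in data["query"]["categorymembers"]]
```

If the framework silently returns an empty page list, as above, comparing the parameters it sends against this query is a quick way to spot the stale name.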

This page now under bot care


As a trial for the CorenANIBot, this page is now automatically archived into subpages when new sections are created. There is an automatically generated link to the right of the titles to edit or watch the subpages, allowing you to watch the individual threads.

Watching this page itself will allow you to see new threads.

Warn me if it breaks! — Coren (talk) 20:31, 21 March 2008 (UTC)

That's certainly interesting. I'm not sure whether I like it or not, but this is a good noticeboard to try it on. What are the perceived benefits? I can see 1) being able to watch individual threads and 2) a sort of 'instant archive', since they're already sorted by date. But I'm not sure how it would fit in with the archiving schemes currently in place at WP:AN, WP:ANI, etc, or what a newbie making their first post to WP:AN would make of a long list of page transclusions. Perhaps this system is best placed at boards which are frequented by regulars, like WP:ANI or WP:AN3RR. Happymelon 10:46, 22 March 2008 (UTC)

Part of the reason the bot acts like it does is to make it easy for newbies: just add a section. The bot takes care of the rest. — Coren (talk) 15:41, 22 March 2008 (UTC)

I think this is a great idea for AN and ANI. Trying to watch for changes to any given thread there at the moment is rather impossible, especially on ANI. To address HappyMelon's concern, we'd just have to make it clear via some notices at the top not to try to edit the page itself, and to use the edit and add-section links. This could also be extremely useful for addressing vandalism attempts on those pages: the main pages themselves could be semiprotected or protected if necessary without shutting down discussions, and the same could be done to the transcluded pages without disrupting other discussions.--Dycedarg ж 22:24, 22 March 2008 (UTC)

Adding a thread


...just to see what happens to it under bot care... Franamax (talk) 12:56, 22 March 2008 (UTC)

That was a little weird. First it didn't show up at all, then it showed as a redlink. I did a server purge and it showed up fine. Seems a little confusing, maybe I missed something? Franamax (talk) 13:01, 22 March 2008 (UTC)

Clarify: first it was on the page normally (I could see it in edit page), then it vanished and redlinked. Perhaps the bot could leave a "reformatting" message? Also, is this a failsafe method? What if there's an edit conflict along the way? Keeping in mind that the only thing important to me is my post and I want to make sure it's there because to me it's the most important thing in the world. :) Franamax (talk) 13:13, 22 March 2008 (UTC)

The bot watches the page, waits for a few seconds, then moves the thread to a subpage (by first removing it from the noticeboard so as to avoid the possibility of someone trying to put a reply and it getting lost). In order to see the redlink, you had to hit refresh at the exact time this was going on.  :-) You can get the complete details on how it works and what safeguards are in place on the bot request page discussion. The short of it: it's paranoid enough to make sure comments don't vanish. — Coren (talk) 14:58, 22 March 2008 (UTC)
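For the curious, the thread-splitting step Coren describes (remove the section from the board, recreate it on a subpage, transclude it back) can be sketched roughly as follows. This is an illustration of the idea only, not CorenANIBot's actual code; all names are hypothetical:

```python
import re

def archive_new_sections(page_text, board_title):
    """Split each level-2 section out of the noticeboard text, replacing
    it with a transclusion of a per-thread subpage.  Returns the new
    board text plus a list of (subpage_title, subpage_text) to create.
    The real bot also waits, rechecks, and guards against edit conflicts."""
    pieces = re.split(r"(?m)^==\s*(.+?)\s*==\s*$", page_text)
    board = pieces[0]
    subpages = []
    for i in range(1, len(pieces), 2):
        title, body = pieces[i], pieces[i + 1]
        subpage = f"{board_title}/{title}"
        subpages.append((subpage, f"== {title} ==\n{body.strip()}\n"))
        board += "{{" + subpage + "}}\n"
    return board, subpages
```

The brief redlink Franamax saw corresponds to the window between removing the section and saving the subpage.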

Resolving conflicts with article maintainers


Do you have any good ideas on how to build consensus in discussions like this? Whenever an autonomous interwiki bot links the article Monoicous, the bot owner gets angry comments from the article maintainers. I have tried to explain how interwiki bots work and how we can solve the problem by correcting all the links manually, but the discussion always seems to drift toward “just fix the bots”... --Silvonen (talk) 04:19, 11 April 2008 (UTC)

Well, it's true. Some problems can't be solved by bots. Maintaining the links manually (which the editors of that page are willing to do) strikes me as preferable to a multilingual, multi-project edit war. The best solution might be {{nobots}} -- do interwiki bots generally follow it? rspeer / ɹəədsɹ 05:28, 11 April 2008 (UTC)
I'm pretty sure interwiki bots normally follow {{nobots}} because they use interwiki.py and pywikipedia supports it. -- maelgwn - talk 06:28, 11 April 2008 (UTC)
Why not just do the smart thing and fix the issue with the interwiki links across all projects that are affected? Instead of complaining about the bot, why not FIX THE ISSUE? βcommand 2 21:06, 12 April 2008 (UTC)
Calm down. If you look at the page, you will see that they tried your "smart thing" first, and the result is what I was referring to as a "multilingual edit war". Sometimes things aren't that simple.
There are a couple of issues here. One of them is that there are disagreements among different Wikipedias about what a certain word refers to in different languages. This is not a problem that can ever be completely resolved, because languages are different and don't always have a one-to-one mapping in their vocabularies, and also because people's use of languages differs so it may not always be possible to tell if there's supposed to be a one-to-one mapping or not. The purpose of bots is not to enforce perfect correctness, it's to do repetitive tasks that humans are unwilling to do. The humans on the page in question are willing to maintain their links manually. In that case, the fix is for them to apply {{nobots}}. That resolves the problem where they complain about interwiki bots for doing what they are programmed to do, as long as they are willing to take responsibility to do what they consider right for the page. rspeer / ɹəədsɹ 17:19, 18 April 2008 (UTC)
Didn't we at some point get the even easier solution (after the Ingria and "ru-sib" issue) that if an interwiki link is commented out on a page, the bots will leave it alone and not readd it? I remember we were discussing this some time ago, and it seemed to be working on another page where the problem came up recently. If the bots honor that, it's simple, effective and intuitive to use. Fut.Perf. 21:09, 2 May 2008 (UTC)

WP:BOT has been completely rewritten


...just in case anybody didn't notice. It would be much easier to pretend that the rewrite has consensus, and attempt to gain consensus for more radical kinds of change, if we could get more people commenting there.--Dycedarg ж 20:31, 12 April 2008 (UTC)

{{tl|bots}} change


Oh noes, the drama llama

Are we going with this? This is directed at ST47 and Carnildo mainly, as I don't think Beta would follow it without force. I'm sure this has already been talked about, but I stopped reading that debate a while ago. BJTalk 15:25, 16 April 2008 (UTC)

RBAG Spam

Chris is currently being considered for BAG membership. To view the discussion and voice your opinion, please click here.

Single sign-on breaks old bot code

Unifying my bot account has completely broken my bot's login code. I can see how to fix it, but the fix is rather involved, since I would have to completely emulate the cascade of image-load and cookie operations associated with a new-style SSO login. Would it be possible for the SSO mechanism to recognize the traditional login cookies in the (for example) en.wikipedia.org domain, even when an account is unified for single sign-on? Otherwise, I'm going to either have to write a load of SSO bot-login code, or have to port it to a framework which already has this implemented. Either of these would be painful, and I'd prefer not to have to do either. -- The Anome (talk) 20:34, 29 May 2008 (UTC)

I know pywikipedia works with SUL login; you might want to look at that code. βcommand 20:39, 29 May 2008 (UTC)
I've now made the same request at Wikipedia:Village pump (technical), which is probably a better place for it. -- The Anome (talk) 21:19, 29 May 2008 (UTC)
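For anyone facing the same rewrite: the login handshake that SUL-aware frameworks use is a two-step action=login API call (a NeedToken response, then a confirmation with the token), with the HTTP library's cookie jar handling the rest. A sketch with the HTTP layer injected as a callable so it stays framework-neutral; everything except the standard API field names is hypothetical:

```python
def api_login(post, username, password):
    """Perform the two-step action=login handshake.  `post` is any
    callable mapping a dict of API parameters to the decoded JSON
    response; it must reuse one session so cookies persist between
    the two calls.  Returns True on a successful login."""
    first = post({"action": "login", "lgname": username, "lgpassword": password})
    if first["login"]["result"] == "NeedToken":
        second = post({
            "action": "login",
            "lgname": username,
            "lgpassword": password,
            "lgtoken": first["login"]["token"],
        })
        return second["login"]["result"] == "Success"
    return first["login"]["result"] == "Success"
```

This avoids emulating the image-load cascade entirely, since the API sets the login cookies itself.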

Minor bot edits

I asked a question at the village pump here, but decided this is probably the better place to ask. When I have ShepBot set to make a minor edit, I am told it should show "mb" next to the edit, yet I only see an "m"; see [5] for an example. Is there some special "b" signifying a minor bot edit that I need to turn on? Thanks for your help! §hep¡Talk to me! 22:07, 31 May 2008 (UTC)

If I remember rightly, the "b" only shows up in Special:RecentChanges and the watchlist - try clicking "show bot edits" in your watchlist and see how many "b"s appear - but note that no bot edits have the "b" next to them in the history. IIRC, it's because the bot/not bot status is stored in the recentchanges table, which is purged on a 30-day rolling cycle, so it's only used for watchlists, RC and other things that can't be viewed months after the fact (the IP addresses used for CheckUser are also stored there, and a whole host of other data to make those feeds more comprehensive). All edits by a bot account are unavoidably recorded as bot edits for as long as the bot remains flagged. Happymelon 22:49, 31 May 2008 (UTC)
Okay. An upset user recently contacted me and I thought I may have done something wrong. Thanks for the help! §hep¡Talk to me! 22:51, 31 May 2008 (UTC)
Actually, the "unavoidably" part isn't true anymore; bots may include the parameter "bot=0" when saving an edit to avoid it being flagged as a bot edit. —Ilmari Karonen (talk) 01:02, 1 June 2008 (UTC)
  • Just a note, edits with +m and +b in the User_talk: namespace will use a bot's "nominornewtalk" user access and NOT trigger the 'new messages' flag for the user who owns the page. — xaosflux Talk 01:20, 1 June 2008 (UTC)
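Putting the thread's points together, a bot's save call might assemble its parameters like this. A sketch only — the parameter names follow the MediaWiki edit API, but `build_edit_params` is a hypothetical helper, and the exact effect of `bot=0` on your save path is an assumption to verify against your framework:

```python
def build_edit_params(title, text, token, minor=False, suppress_bot_flag=False):
    """Assemble illustrative parameters for an action=edit save.
    minor=True marks the edit minor ('m'); a flagged bot's edits get
    the 'b' in recent changes automatically, and per the note above
    passing bot=0 is meant to suppress that marking."""
    params = {
        "action": "edit",
        "title": title,
        "text": text,
        "token": token,
        "format": "json",
    }
    if minor:
        params["minor"] = "1"
    if suppress_bot_flag:
        params["bot"] = "0"  # suppress the 'b' flag; semantics per the discussion above
    return params
```

Note that, as Happy-melon explains, the "b" only ever appears in recent changes and watchlists, never in page histories, so its absence there is not a bug in the bot.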

BAG request: Bjweeks (BJ)

My request to join the BAG is here. BJTalk 06:58, 6 June 2008 (UTC)

Deflag bots that haven't edited since 2006

The topic of better enforcement came up on IRC, and one of the issues, I think, is the lack of knowledge of which bots are doing which tasks. Having bots that haven't edited since 2004 (!) doesn't help the issue. As a starting point, I propose that all bots that haven't made an edit since 2006 should be deflagged. BJTalk 09:35, 6 June 2008 (UTC)

They did this same thing a few months ago. The first step is to contact the owner to see if there are any plans. giggy (:O) 09:37, 6 June 2008 (UTC)
I just checked the first few and the owner already had a message dated March. Seems they were never deflagged. BJTalk 09:40, 6 June 2008 (UTC)
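The audit itself is mostly a filtering job once the flagged-bot list and last-edit timestamps are in hand (e.g. gathered via list=allusers and usercontribs API queries). A hypothetical sketch of the cutoff logic:

```python
from datetime import datetime

def stale_bots(bots, cutoff_year=2007):
    """Given (name, last_edit_timestamp) pairs in MediaWiki's ISO format,
    return the names whose last edit predates cutoff_year — i.e. bots
    that haven't edited since 2006 with the default cutoff.  Bots with
    no recorded edits (None) are also flagged as stale."""
    stale = []
    for name, last_edit in bots:
        if last_edit is None:
            stale.append(name)
        elif datetime.strptime(last_edit, "%Y-%m-%dT%H:%M:%SZ").year < cutoff_year:
            stale.append(name)
    return stale
```

The output would then feed the owner-notification step giggy mentions, rather than going straight to a bureaucrat for deflagging.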

Wikipedia:Bots/Status Updates

Some of you may know that I have spent the last few months working on the bot status page. Well, now I think my work is done. I have managed to:

  • Get rid of bad links
  • Fix links
  • Organize into categories
  • Create templates
  • Take the page from 75,846 bytes to 59,943 bytes (so far)
  • etc.

Now I need a hand from you, the bot owners. This page has always needed updating, and now it is a lot easier (see the header at the top of the page explaining how to use the template). This should cut the page size down even more, making it easier to use and easier to load. It will also be a lot tidier and up to date.

All I am asking you to do is to go there and update your bot's entry. If you have any questions about the template, please ask me on my talk page or here. If you need to add a link to a request for approval that predates the BRFA system, just add it under "Other Links" in the template. Again, if you have any questions about that, please ask me. I hope I have made myself clear. ·Add§hore· Talk/Cont 13:50, 8 June 2008 (UTC)

Done, thanks for your help also :) — E TCB 22:45, 8 June 2008 (UTC)
Another point I would like to put in bold is, "don't just update them but put them in the template as well" :> ·Add§hore· Talk/Cont 06:46, 9 June 2008 (UTC)