User talk:Harnad
From Wikipedia, the free encyclopedia
Hello there. About transdisciplinary vigilantism: a couple of points. First, while this is potentially of interest regarding the workings of academia and the proprieties of non-peer review, it is not really adequate to leave the reader in suspense about the cases you cite. Second, if this is a neologism, you should know that Wikipedia is not a neologism-friendly place. A more laborious article title may be needed. Charles Matthews 19:49, 31 December 2005 (UTC)
Categorical perception
Hi —
I've just listed the article Categorical perception as a possible copyright violation. Wikipedia is licensed under the GFDL, and cannot include any copyrighted works, including article abstracts taken from the web. If you disagree with this, feel free to let me know on my talk page. If you actually want to work on the article, I've been slowly preparing the article myself in my sandbox (I wrote my bachelor's thesis on CP). You can feel free to collaborate there, or re-write your own article without taking from the web.
Thanks, — Asbestos | Talk (RFC) 15:12, 22 February 2006 (UTC)
Urgent: If you are, in fact, the author of the article in question, and solely hold the copyright to it, I strongly advise you to read the GNU Free Documentation License article and the text of the GNU Free Documentation License before you commit yourself. You are, in effect, donating your writing and your copyright to the project, and would no longer have control over what is done with it. Please read the article and the licensing text for more precise detail, and please be certain it is what you want to do. --Calton | Talk 01:55, 6 April 2006 (UTC)
OA section
The OA articles have been moved back to the right pages and mostly restored. I have a little more to do on Monday and, probably, Tuesday. DGG 08:45, 8 January 2007 (UTC)
Categorical Perception - internal links
As part of Project Wikify, I am taking a look at the Categorical Perception article you started. This article is well-formatted, and someone cleaned up the references, but it still needs internal links pointing to other Wikipedia articles where appropriate. In my opinion, this is all that is needed to remove the Wikify tag, and I figure you're going to do a better job of it than I would since you wrote the article. Adam 16:08, 6 March 2007 (UTC)
Our recent changes to "Symbol Grounding"
Hi. I tried to stay in agreement with your papers. I'm just wondering whether you read what I wrote in the "Symbol Grounding" discussion area, and whether you think it should make a difference to what you find incoherent where I believed I was attaining better coherence. Just for my information, so that I can gain a better understanding of your edits. Valeria B. Rayne 03:23, 10 April 2007 (UTC)
Hi: It's a tricky and subtle problem, and you really have to think about it carefully, and a lot. A sure clue that you are getting lost or fooling yourself is if the language gets complicated or ritualistic. Symbols are like words, to a first approximation. Words have referents, objects they stand for. And combined in sentences, they form descriptions that are either true or false (or ill-formed or undecidable). The symbols inside a robot that can do with words (and the things they refer to) just as we do are grounded. Whether they are also meaningful depends on whether the robot also feels. We can objectively test grounding, but we cannot objectively test feeling. That's all there is to it. Harnad
- "Lost" meaning a degree of interest, of course. Cognitive science is such a large domain, with many variously interested subdomains and their development programs, and "meaning" itself is a nontrivial notion by accounts prevalent in the literature. It only makes sense, in the interests of empiricism, logical validity, and the acknowledgment of the multiplicity of programs, that a grounded symbol system has meaning autonomy, with conscious autonomy a separate if unrelated problem. Your papers explicitly indicate an understandably and logically modest pardon of consciousness, so denying mindspace a larger meaningspace, one not presupposed (for how else do we determine grounding if not by meaning autonomy, unless grounding is related neither to meaning nor to consciousness?), just wasn't expected. Valeria B. Rayne 04:19, 10 April 2007 (UTC)
- For instance, you say, "If the meaning depended on an external interpreter, then the system would not be autonomous." Yet you insist on being the very external interpreter who identifies an autonomous grounded symbol system that imputes meaning to its symbols. Hence, for meaning – which, precisely according to you, is what makes the system autonomous – to depend on you is to beg the question. I hope you might read what had been my final revision charitably. Valeria B. Rayne 04:57, 10 April 2007 (UTC)