Talk:Signal (electrical engineering)
From Wikipedia, the free encyclopedia
- Please add new talk topics as new sections at the bottom.
[edit] Old talk
As I understand it, a signal needs a temporal component (modulation in time). But the Fourier transform of that signal loses the temporal component. Thus I do not see how that FT(signal) can also be called a signal, which is why I renamed it signature, which describes a characteristic of said signal. Ancheta Wis 02:01, 27 Feb 2004 (UTC)
- An FT of a signal is much more than a signature: It contains all of the same information as the original signal---very different from the idea of, for example, a Digital Signature. 71.253.0.236 03:45, 25 July 2007 (UTC)
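The invertibility point can be illustrated with a minimal discrete Fourier transform in pure Python. This is a naive O(N²) sketch for illustration only, not a production FFT:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform of a finite sequence x."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT: recovers the original sequence (up to rounding)."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

signal = [0.0, 1.0, 0.0, -1.0]   # one period of a sampled sine
spectrum = dft(signal)           # frequency-domain view
recovered = idft(spectrum)       # time-domain view restored

# No information was lost: the inverse transform returns the signal.
assert all(abs(r.real - s) < 1e-9 for r, s in zip(recovered, signal))
```

Because the transform is invertible, the frequency-domain representation carries exactly the same information as the time-domain one, which is the point being made above.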
On Nov. 30, 2005, I edited the article extensively to correct the following problems: a) misstatement of the notion of a signal in information theory: the notion is not that it is a flow of information but rather that it is a sequence of states of a communications channel at the transmitter. b) confusion of "data" with "signal". Some of the problems are: 1) analog and digital are properties of data, NOT signals; 2) it is possible to perform frequency analysis on data but it is not possible to perform it on a signal; 3) a signal is not a sequence of numbers, but a sequence of numbers can properly be referenced as "data." As the article is about "signal" and not about "data," the best way to handle this situation seems to be to edit out the references to "signals" that are really "data." c) The claim that entropy is a property of a signal or set of signals is incorrect. Entropy is a property of a communications channel. As the article is about the concept of a "signal," it seems best to leave out discussion of entropy.
One of the properties of the "signal" that is defined in information theory is that the process that generates it is a stochastic process. As this is sometimes a source of confusion, I added a discussion of this topic. Terry Oldberg http://www.oldberg.biz terry@oldberg.biz
- Hours after I contributed an edit of the article plus the above justification for it, "Shanel" wiped it out without supplying a justification. Shanel: please either attempt a justification of your action or restore the text as edited by me.
- Terry Oldberg Dec. 1, 2005
Terry, what other characteristics of signals are there besides the stochastic attribute of the signal-generating process? And why is that one so important? What about the channel's characteristics and influence? Your first paragraph is useful because it helps people distinguish a signal from a waveform and a message. Many people have the concept of a signal as something radiating from the tip of an antenna, i.e. radar and radio, and many people would probably relate to that. Many people discover signals through their interest in digital audio/video. People coming from different backgrounds should find something they can relate to in the article. Rtdrury 20:56, 3 December 2005 (UTC)
- RTdrury: The stochastic attribute is important because it implies a constraint on the signal-generating process, for not all processes are stochastic. I'm sensitive to this problem as a result of having once attempted statistical research in the field of nondestructive testing. Through misuse of the word "signal," workers in the field implied that a process was stochastic when it was not. In doing so, they implied that this process obeyed conventional statistics when this was not true. One of the results of the misuse was (and still is) to expose the people of the world to unnecessary hazards from such events as explosions of nuclear reactors and downings of aircraft. You are quite right in implying that the channel characteristics and influence are important. However, it seems to me that it would help untangle Wikipedia if they were to be discussed in a separate article on the notion of a communications channel. After all, a signal and a communications channel are two, different entities.
- Terry Oldberg 07:25, 15 December 2005 (UTC)
I came onto this article looking for a place to Wiki-link the term signal in the article Optical communications. The old article seems to have roughly the definition of signal that is used in that field, that is, a time-varying quantity of interest, regardless of how it was produced. But, as Terry points out, that's not the correct definition for the field of information theory. I'd like to move the old article to a new title like "signal (circuit theory)" or "signal (circuits and systems)" or "signal (communications)", but I'm not sure of the best title --- anyone feeling a little more bold want to just pick? -- The Photon 01:54, 4 December 2005 (UTC)
- I'm not clear on the distinction you draw between signals and data. For instance, I gather that you're trying to distinguish a digital data stream from the continuous electrical quantities that represent it, but it would be good if you could explain what exactly you mean. Furthermore, as the author of this article, I have to say that everything I've been taught in two semesters of signals theory supports the definitions I used.
- Finally, as far as wiki etiquette is concerned, if you ever see the need to delete a large amount of text from an article, please replace it somewhere else. For now, I'm going to put the text that you've removed back into the article, with a note saying that you contest its accuracy. --Smack (talk) 03:40, 14 December 2005 (UTC)
- Smack, are you certain that the definition of "signal" is the same in information theory and in signals theory? I don't know if signals theory is the same as what we called "signal analysis" in my university, but if it is, that field uses very different tools from information theory, and I'm not surprised if information theory has a very specific definition of "signal", and it's not what you'd expect if you aren't an information theorist (since very little in information theory is what you'd expect from lay knowledge of the terms it uses). Again let me suggest moving the current article to Signal analysis or Signal (circuits and systems), and let the information theory people have an article that correctly relates to information theory.
- The Photon 05:11, 14 December 2005 (UTC)
- Smack: That there is controversy over the wording of the article raises the question of whether there is an important distinction between a number and the representation of a number in telecommunications hardware; the latter is a sequence of states of the communications channel. If this distinction is preserved, there is, for example, a difference between 01100011 and the sequence of states of a communications channel that represents this number during transmission. The former is a sequence of digits. The latter may be a sequence of voltages across conductors. Should we blur the distinction between the former and the latter?
- In the defining paper of information theory, "A Mathematical Theory of Communications," Shannon distinguishes between the two. Numbers are a subset of the "characters" which, in sequence, make up what Shannon calls a "message." The sequence of states of the communications channel, at the transmitter end of it, make up what Shannon calls the "signal." Shannon's "message" is what one now calls the "data."
- Shannon's "received signal" differs from his "signal" through the entry into the communications channel of noise but the received message may, through the use of an error correcting code, be identical to the transmitted message. His "signal" may be continuous in time when his "message" is discrete in time, or vice versa. These are some of the differences between Shannon's "signal" and his "message."
- In view of the above facts, I submit that it is essential for Wikipedia's article on "signal (information theory)" to preserve the distinction between a number and the representation of a number as a sequence of states of a physical system. When this distinction is preserved, a "signal" is a sequence of states, at the transmitter. A number or sequence of numbers is a subset of what Shannon calls a "message" and one now calls "data." The "signal" encodes the "message."
- Terry Oldberg 07:25, 15 December 2005 (UTC)
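The message/signal distinction above can be sketched in a few lines of Python. The mapping of bit '1' to +5 V and '0' to 0 V is a made-up convention for this illustration, not anything from Shannon's paper:

```python
MESSAGE = "01100011"   # a sequence of characters: the "message" (data)

def transmit(message):
    """Encode the message as a sequence of channel states (voltages): the "signal"."""
    return [5.0 if bit == "1" else 0.0 for bit in message]

def receive(signal):
    """Decode the channel states back into a message."""
    return "".join("1" if v > 2.5 else "0" for v in signal)

signal = transmit(MESSAGE)   # a sequence of physical states, not digits
assert signal == [0.0, 5.0, 5.0, 0.0, 0.0, 0.0, 5.0, 5.0]
assert receive(signal) == MESSAGE
```

The digits and the voltages carry the same message, but they are different entities: one is data, the other is a sequence of states of a physical system.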
- Terry, You inspired me to look up Shannon's paper (here). Shannon does mention both messages and signals which occur in discrete time, and he mentions both continuously valued and quantized signals as well (his example is a telegram, where the signal is composed of "dots, dashes, and spaces"). This seems to make much of the discussion about digital and analog signals as well as discrete-time and continuous-time signals relevant to information theory. Is there a different preferred terminology in information theory that you could put in place of "analog", "discrete", etc., instead of removing these sections entirely?
- Part I of Shannon's paper is titled "the discrete noiseless system", in which the transmitter, channel, and receiver are all noise-free. The bulk of Part I is a discussion of the statistical nature of the message: "the messages to be transmitted consist of sequences of letters...they form sentences and have the statistical structure of, say, English." It looks as if Shannon's formulation calls for the message to be generated by a stochastic process (or a process which, since we can't predict it, we must model as a stochastic process). For example, he generates a number of artificial sentences, with increasingly sophisticated models of English. Sections 2, 3, 4, 5, 6, and 7 of Part I discuss the stochastic nature of the source of the message. The transmitted signal has a stochastic nature only because it is generated (noiselessly) from the stochastic message.
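Shannon's artificial-sentence construction can be sketched with a toy first-order Markov source. The three-symbol transition table below is invented purely for illustration; Shannon's own examples used letter and word statistics of English:

```python
import random

random.seed(0)

# A made-up first-order Markov transition table over three symbols.
TRANSITIONS = {
    "A": {"A": 0.1, "B": 0.6, "C": 0.3},
    "B": {"A": 0.5, "B": 0.2, "C": 0.3},
    "C": {"A": 0.4, "B": 0.4, "C": 0.2},
}

def generate(length, state="A"):
    """Emit a message from the stochastic source, one symbol at a time."""
    out = []
    for _ in range(length):
        symbols, probs = zip(*TRANSITIONS[state].items())
        state = random.choices(symbols, weights=probs)[0]
        out.append(state)
    return "".join(out)

message = generate(20)
assert len(message) == 20 and set(message) <= {"A", "B", "C"}
```

Each run produces a different message; it is exactly this unpredictability of the source that makes the message, and hence the noiselessly derived signal, stochastic.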
- Shannon's Theorem 7 relates the entropy of the transmitted signal to the entropy of the source, or of the message. That Shannon would calculate the entropy of a signal seems to conflict with the statement that "The claim that entropy is a property of a signal or set of signals is incorrect." Shannon's paper refers to the entropy of both message sources and signals, and the capacity of the channel.
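For concreteness, the per-symbol entropy Shannon assigns to a source (or signal) can be estimated from relative symbol frequencies; a minimal sketch:

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy in bits per symbol, estimated from relative frequencies."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A balanced binary sequence carries 1 bit/symbol; a biased one carries less.
assert abs(entropy("01010101") - 1.0) < 1e-9
assert entropy("00000001") < 1.0
```

This is the quantity Theorem 7 relates between source and transmitted signal, and it is distinct from the channel's capacity.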
- In Part II Shannon uses the term received signal, indicating that the term signal is not only associated with the output of the transmitter and the input to the channel. This seems to conflict with the definition of a signal as "the sequence of states of a communications channel that encodes a message, at the transmitter end of the channel" [emphasis added].
- I'm not yet clear on whether frequency analysis has a place here. I hadn't yet dug it out of Shannon's paper, but I've just noticed in Part III there is some discussion of channels characterized by their frequency and impulse response --- I'll withhold any opinion until I've dug in further to that part of the paper.
- To wrap up, going by Shannon's paper, most of the current version of the article does seem to be relevant to information theory. Signals may be either discrete-time or continuous, and they may be either continuously valued or quantized, but perhaps information theory uses different words for these concepts. Signals may occur at either end of a channel, and they are characterized by an associated entropy. There is still some room for improvement in the article to clarify the difference between sources, messages, signals, and channels; and the terminology might not be exactly right for information theory. Nonetheless the bulk of the material in the article should be cleaned up, not eliminated.
- The Photon 07:03, 16 December 2005 (UTC)
- It looks like this is the time for me to step aside from this issue. What's the difference between information theory and signals theory? I wanted to name this article "Signal (signals theory)", but that would have been a circular definition. I also don't like the proposed qualifier "circuits and systems", because signal theory transcends electrical engineering. --Smack (talk) 19:45, 21 December 2005 (UTC)
- Have a look at these questions in their most primitive form: a single bit of information is represented by a change in state, from one thing to another thing. So to measure anything, you need always to compare two things (before+after, upper/lower bound, hotter/colder etc.). This change may be discrete or continuous, and is called a signal. -- Waveguy 22:37, 21 December 2005 (UTC)
[edit] Need a broader definition of Signal
The current message-oriented definition is way too narrow. Articles such as Spectral density need to reference signal, but there's no appropriate definition. Should we make yet another signal page for the broader definition? Or just broaden this one?
At the risk of opening up a can of worms, I'm going to attempt a new broad intro. Dicklyon 01:58, 3 June 2006 (UTC)
- I agree with you. Also, I think the title should be "Signal (electrical engineering)". I think there is no need to have a separate article for "Signal (information theory)" (we can have as a small section of this article). BorzouBarzegar 17:59, 7 June 2006 (UTC)
- I agree it would be nice to have something to link to the word signal, but on the other hand Wikipedia is not a dictionary. Despite all the fuss I went to above over this article, I now think the real best answer is to define the term signal in the appropriate articles, such as Information theory, Telecommunications, Signal processing, etc. This article is either just a dictionary definition, or a second-rate rehash of what ought to be in those other articles.
- That said, if you're going to keep this article, but broaden its sense to cover meanings of the word signal outside of information theory, then the name should definitely be changed. -- The Photon 04:40, 8 June 2006 (UTC)
- The concept of Signal in Electrical Engineering definitely needs an article. BorzouBarzegar 15:56, 8 June 2006 (UTC)
- If you mean we need distinct pages, what definitions would you use that would be different between information theory and electrical engineering. In all my training, I never found such a distinction. Or are you just supporting renaming this one? Dicklyon 17:44, 8 June 2006 (UTC)
- I'm just supporting renaming this one to "Signal (electrical engineering)". BorzouBarzegar 19:57, 8 June 2006 (UTC)
- It seems that there isn't any objection to change the title. So, I move the page. BorzouBarzegar 20:32, 8 June 2006 (UTC)
- I think it should be moved back to Signal (information theory), or maybe something else entirely. As the "examples" section of the page points out, "signal" is something that is used by biology, physics, etc. A continuous signal can be sent from point A to point B without an electrical circuit ever being involved. Neurons are the clearest example, but I'm sure there are (and will be) other examples. --Interiot 20:09, 21 May 2007 (UTC)
- That's true, but electrical engineering is the field in which signals are mostly studied, so it works as it is. Information theory talks about signals, but has a rather narrower view of them than is used in EE and other fields, where they are quite often treated outside of an information framework. Dicklyon 03:02, 22 May 2007 (UTC)
[edit] Proposed merger
I propose merging Signal processing into this article because
- Signal processing is effectively a stub, and shows no sign of growing into a substantial article.
- If both Signal (electrical engineering) and Signal processing were developed into PERFECT articles, they'd overlap each other substantially (90%?).
- Digital signal processing and Analog signal processing articles also exist (and DSP is a substantial article), to carry on in greater depth about the techniques of signal processing.
-- The Photon 04:24, 10 June 2006 (UTC)
- I think they're distinct enough content areas to keep separated. The signals article talks about different types and classifications of signals, how the term is used in different fields (some of which, like information theory, have somewhat different uses than the broad signal processing field), and stuff like that. The signal processing article is more about techniques and application areas. I think merging them would be messy. Dicklyon 17:57, 10 June 2006 (UTC)
- The Signal processing article doesn't talk about any of that. Everything that's currently in that article would fit comfortably in this one.
- Digital signal processing does discuss the techniques and applications, and Analog signal processing could but it's also just a stub. So my proposal is to give the top level overview in Signal (electrical engineering), and put applications and techniques information into the more specific "Digital ..." and "Analog ..." articles. From your input, I might change my above reasoning (2nd bullet) to say, "if Signal processing were developed into a PERFECT article, it would overlap almost entirely with either Signal (electrical engineering) or Digital signal processing or Analog signal processing."
- There's a sideline or tangential issue that Analog signal processing is not a common term (at least in my experience), and Filter theory or Filter or Analog electronics would be better titles for that article.
- -- The Photon 05:21, 11 June 2006 (UTC)
- I agree that there's a lot of overlap in the topics covered. However, signal processing is such a basic subject... I just don't think it would look right if someone were to search for signal processing and be redirected to some other article. Wouldn't it be better if the final article were called Signal processing, rather than Signal (electrical engineering)? Don't you think the former is a more common search term? --Zvika 17:01, 12 June 2006 (UTC)
- If I was sole editor of a traditional encyclopedia, I'd probably rather do it that way. There's so many different ways to define the word signal that each subject area should just treat the topic within its own article. But for Wikipedia, there's a couple of reasons to do it the other way around:
- Editors of other articles will want to be able to make a link to Signal more often than to Signal processing. If that article doesn't exist, someone will re-create it.
- It's clear (maybe just to me) that signal processing is a subtopic of signals, but not obvious that signals are a subtopic of signal processing. It's possible to have signals without signal processing, but not the other way around.
- But, I could go either way on this. -- The Photon 02:40, 13 June 2006 (UTC)
I prefer having separate articles. This article should be about the basic concept of the signal and its types. "Signal processing" should be about the field and its branches and basic techniques. I don't think that there will be much overlap. Even if we merge them now, we will eventually need to split them. BorzouBarzegar 13:12, 13 June 2006 (UTC)
[edit] Correction
Under Analog and Digital Signals -> Discretization, it says,
"DT signals often arise via of CT signals."
Should this just say, "via"?
--208.188.2.93 18:20, 18 July 2006 (UTC)
- Yes, fix it. Dicklyon 18:34, 18 July 2006 (UTC)
[edit] Recommend the Usage of Standard Signal and System Text
Regarding the comment from 2004 at the top of this page, in ABET accredited ELE/ECE junior-level signal and systems courses, a periodic time-domain signal is defined as a type of vector. Its vector components can be determined by way of the Fourier series. The notion of a vector carries over into non-periodic signals, where the Fourier series is generalized to the Fourier transform. A highly respectable reference to this train of thought can be found in Signal Processing and Linear Systems by B.P. Lathi. I suggest the addition of these and similar structural ideas from B.P. Lathi's text or another standard ELE/ECE text such as Continuous and Discrete Signal and System Analysis by George R. Cooper and Clare D. McGillem. --Firefly322 03:33, 22 June 2007 (UTC)
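The vector view can be illustrated numerically: the sine coefficients b_n of a unit square wave, estimated by a crude Riemann sum. The period and step count here are arbitrary choices for the sketch, not anything prescribed by Lathi's text:

```python
import math

def fourier_coeff_b(f, n, T=2 * math.pi, steps=10000):
    """Numerically estimate the sine coefficient b_n of a T-periodic function f."""
    dt = T / steps
    return (2 / T) * sum(f(k * dt) * math.sin(2 * math.pi * n * k * dt / T) * dt
                         for k in range(steps))

def square(t):
    """Unit square wave with period 2*pi."""
    return 1.0 if t % (2 * math.pi) < math.pi else -1.0

# Known closed form: b_n = 4/(n*pi) for odd n, 0 for even n.
assert abs(fourier_coeff_b(square, 1) - 4 / math.pi) < 1e-2
assert abs(fourier_coeff_b(square, 2)) < 1e-2
```

Each coefficient is one "component" of the signal along a sinusoidal basis function, which is the sense in which a periodic signal is a vector.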
- Certainly describing signals as vectors can be worthwhile. Limiting to signals that are periodic, or that have Fourier transforms, however, is pretty limiting, as it leaves out for example the signals that are stationary random processes. And lots of decompositions besides Fourier ones are important. So if you go there, try not to make it too narrowing. Dicklyon 03:41, 22 June 2007 (UTC)
- I do also agree that relying on a good solid source like a text book would lead to improvements. And if I don't like what your book says, I'll be motivated to go find alternatives in other books. Too often we argue here just because we're too lazy to find sources. Dicklyon 03:45, 22 June 2007 (UTC)
- Okay, thanks for the fast feedback... Now I realize it's not often taught in this way, but a process, random or otherwise, is in fact just a sequence of events. A good example is a Markov Chain where each probability matrix in the chain can be viewed as a single hyper-number conceptually similar to other hyper-numbers (e.g., a quaternion or an octonion). Anyway, my ultimate perspective is that random processes have components that map to the elements of vectors/matrices just as any signal will--periodic or otherwise. And I'm just wondering if you are suggesting something else? --Firefly322 03:59, 22 June 2007 (UTC)
- Just pointing out that if you define signals in Fourier space, i.e. as Fourier series or Fourier transforms, then you leave out the ability to represent all the non-periodic non-square-integrable signals such as noise processes, etc. And I was thinking of continuous-time processes, finite-order or otherwise, but you're right that discrete-time processes are also in need of good representations.
- Certainly, transformations and decompositions can and should be extended to Laplace and beyond. Keep in mind that all real-world signals can be transformed by either Fourier Transform or the Fourier series. Even PIE-IN-THE-SKY ideal-signals such as those of non-periodic infinite energy (i.e., your non-square integrable example) can be transformed with a little care (e.g. take into account the range of interest, etc). --Firefly322 05:17, 22 June 2007 (UTC)
- Generally, that's where I disagree. "A little care" means you need to change to some other technique, such as spectral density via Wiener–Khinchin theorem, in which case you don't have a basis but only a second-order statistic, or short-time techniques; that's why you need to be careful about using Fourier techniques in a definition of signals. Dicklyon 17:44, 1 July 2007 (UTC)
- Moreover, that's where I disagree. Signals are a part of nature. And nature can never be designed out. Hence I've seen neither a Signals and Systems nor a Controls problem where certain didactic ideas such as you now mention have truly proven themselves. My belief is the Wiener–Khinchin theorem is in fact neither necessary nor useful. And if truth be told it has proven itself a weak didactic idea that should no longer be given much notice even inside a graduate school curriculum. Firefly322 14:52, 18 July 2007 (UTC)
- There are many, many important results stemming from random signal theory which cannot be derived in any other way. Some examples off the top of my head are Wiener filters, Kalman filters (many applications in both signal processing and control), the ARMA model and its applications (e.g., linear predictive coding, which is used in voice compression). These are all based on the idea of a spectral density and of modeling a signal as a random process. How can you say that these ideas haven't "proven themselves"? --Zvika 14:45, 18 July 2007 (UTC)
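As a small concrete instance of the random-signal machinery mentioned here, a scalar Kalman filter estimating a constant from noisy measurements can be written in a few lines. The process-noise and measurement-noise variances q and r below are assumed values for the sketch, not from any particular text:

```python
import random

random.seed(1)

def kalman_1d(measurements, q=1e-4, r=0.04):
    """Scalar Kalman filter tracking a (nearly) constant value.
    q: assumed process-noise variance, r: assumed measurement-noise variance."""
    x, p = 0.0, 1.0          # initial state estimate and its variance
    estimates = []
    for z in measurements:
        p += q               # predict: variance grows by process noise
        k = p / (p + r)      # Kalman gain: how much to trust the measurement
        x += k * (z - x)     # update estimate toward the measurement
        p *= (1 - k)         # reduce variance after incorporating it
        estimates.append(x)
    return estimates

true_value = 1.25
zs = [true_value + random.gauss(0, 0.2) for _ in range(200)]
est = kalman_1d(zs)
assert abs(est[-1] - true_value) < 0.15   # converges near the true value
```

The estimate converges because the filter models the measurements as a random process; without that statistical model there is no principled way to choose the gain.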
- At best these ideas are intellectual puffery, at worst they represent a case of the emperor having no clothes. I see it as specious to hold random signal theory so important. These applications can be and were derived from basic signal and systems and control theories. Firefly322 14:56, 18 July 2007 (UTC)
- I suggest you refrain from name-calling, as it will not advance the discussion. Are you saying that the Kalman filter, for example, can be derived without using random signal theory? Can you cite a verifiable source to that effect? If so, that would certainly be an interesting addition to the Kalman filter article (and to my own knowledge of the subject). --Zvika 15:28, 18 July 2007 (UTC)
- It sounds as if I am being warned not to commit a thoughtcrime. So exactly what does Zvika mean by name-calling? Just to be pre-emptive, are you anything at all like Bertrand Russell? He lied to the Western World about what he had seen firsthand in the Soviet Union. That's a historical fact. And this is relevant. Wiener studied with Russell for a time. Both Wiener's and Russell's work were, if not designed (though that is highly debatable), at least used by their authors in furthering their socialistic/communistic political agendas. [Well documented historical facts that have numerous sources] So anyway forgive me if I'm wrong, but more often than not, when professors or whoever bring up Wiener they are consciously or subconsciously bringing up a hidden agenda. Firefly322 03:40, 22 July 2007 (UTC)
- History records that a tracking/prediction system of Black et al from Bell Telephone Laboratories went head to head with one designed to Wiener's theories. The BTL system performed far better than the one designed according to Wiener's theories. [A history of control engineering, 1930-1955 / S. Bennett, 1993; IEE control engineering series; v. 47] Firefly322 17:57, 21 July 2007 (UTC)
- If history records that, find the source and write it up in the wikipedia. I'd be very interested. And it would give you a more constructive way to contribute than just knocking what others have contributed. Dicklyon 20:22, 21 July 2007 (UTC)
- "knocking what others have contributed..." Thanks if you're implying that I have made no contributions. My first comment on this page was due to some foolish errors about the meaning of signals that were easily dispelled through standard references. Now that I have in fact contributed a source as requested by Zvika, the next step would be for him to read it and share thoughts. Firefly322 03:46, 22 July 2007 (UTC)
- Right, as far as I can see, you've edited nothing but this talk page, and have said nothing constructive yet. Also, please learn to use the preview feature so that you don't have to make dozens of edits to add one comment. Dicklyon 05:57, 22 July 2007 (UTC)
- I would also recommend reading "A New Approach to Linear Filtering and Prediction Problems" by R.E. Kalman, Transactions of the ASME, Journal of Basic Engineering, March 1960, pp. 35-44, and "New Results in Linear Filtering and Prediction Theory" by R.E. Kalman and R.S. Bucy, Transactions of the ASME, Journal of Basic Engineering, March 1961, pp. 95-104. There is very limited evidence as to whether these papers have made actual contributions to engineering itself as represented in actual designs. References by other papers and professors I find very weak, especially in light of the public political agendas of such "contributors" as Wiener and Russell. Firefly322 04:09, 22 July 2007 (UTC)
- Indeed, it has been a long time since I've read those; good idea to read again. But what is the political agenda that you're referring to? And where can I read about that? Dicklyon 05:57, 22 July 2007 (UTC)
- 1950. The Human Use of Human Beings by Norbert Wiener.
- 1966. God & Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion by Norbert Wiener.
- Firefly322 06:46, 22 July 2007 (UTC)
- OK, I got the books, and read parts of them. Seems like he was trying to be thoughtful and philosophical about how his contributions to "cybernetics" would interact with society. If it's political, it's not very good politics, because I pretty much don't get what his agenda is. But in any case, I don't see how these writings dilute in any way the validity of his mathematical contributions to random signal theory and such. Dicklyon 07:02, 5 August 2007 (UTC)
- Done. Just added a moderately reasonable introduction. Firefly322 17:07, 17 December 2007 (UTC)