Talk:Control theory


Cruise control

I fail to see the correctness of

"(Does not apply to manual transmission vehicles.)"

What specifically does not apply? Holding the throttle fixed works the same for an automatic and a manual transmission. Removing the sentence and calling for further clarification. Cburnett 00:07, 6 Jan 2005 (UTC)

Locking the throttle on an automatic transmission does not lock speed, due to losses in the torque converter. --Sponge! 04:10, 12 Jan 2005 (UTC)

Locking the throttle on an engine does not lock in speed; it doesn't matter what the transmission is. A change in road angle changes the load torque on the engine, which changes the speed. So I still don't see how the quote is relevant... Cburnett 05:15, 12 Jan 2005 (UTC)

It is good to put an edit summary when you write something. When I saw that this page was modified, and no summary was put, I thought it was vandalism (it happens all too often unfortunately). Besides, putting an edit summary helps people who have this article on their watchlist understand what you are up to. This is just a thought. Oleg Alexandrov 05:25, 12 Jan 2005 (UTC)

By the way, do you know anything about optimal control? That page needs some work. See that page and its history. Oleg Alexandrov 05:25, 12 Jan 2005 (UTC)

Ugh, of all the control classes I've taken....optimal control was my least favorite. I'll add it to my list. Cburnett 05:42, 12 Jan 2005 (UTC)

Clarification

The stability section needs to be explained better. 128.112.86.171 ---- (sig added by Cburnett; please use ~~~~ to sign your posts)

What exactly do you find could be improved? For someone like me, who knows this stuff, it's a bit harder to know exactly what you think needs to be better explained. I'll try, though. Cburnett 03:46, Jun 9, 2005 (UTC)
Stability is a really general topic (BIBO stability, Lyapunov stability, total internal stability, exponential stability, asymptotic stability, global stability, etc.). There's no clarification of what "bounded" means, or really even what "input" and "output" mean - really you're talking about the norms of the input and output signals, and then you have to talk about what a "signal" is. You're kind of assuming the reader is familiar with a lot of the terminology and the mathematics, but I don't think that's really your target audience. --M0nstr42 21:45, 4 November 2005 (UTC)
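To illustrate concretely what "bounded input, bounded output" means, here is a minimal sketch (not from the article; the impulse response h[n] = 0.5^n is an assumed example system): a discrete-time LTI system is BIBO stable iff its impulse response is absolutely summable, and then any input bounded by 1 yields an output bounded by that sum.

```python
import numpy as np

# BIBO stability for a discrete-time LTI system: the output stays bounded
# for every bounded input iff the impulse response is absolutely summable.
# Hypothetical example system: h[n] = 0.5**n for n >= 0.
n = np.arange(200)
h = 0.5 ** n

# sum |h[n]| converges to 2, so this system is BIBO stable
abs_sum = np.sum(np.abs(h))

# A bounded input (|u| <= 1) then gives |y[n]| <= sum|h| = 2 for all n.
u = np.sign(np.sin(0.3 * n))          # a bounded test input
y = np.convolve(h, u)[: len(n)]
print(abs_sum, np.max(np.abs(y)))
```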

Control in General

Various things need fixing, I think. There are some misleading statements (the Laplace and Z-transforms aren't interchangeable, since they apply to continuous- and discrete-time systems respectively, for example). Several things could be explained more clearly, and the odd extra block diagram wouldn't go amiss.
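The Laplace/Z distinction above can be checked numerically: sampling a continuous-time system with pole s = -2 at a period T gives the discrete-time pole z = e^(-2T). A small sketch (the pole location and sample period are hypothetical choices):

```python
import numpy as np

# The Laplace transform describes continuous-time systems, the Z-transform
# discrete-time ones; sampling links them via z = exp(s*T).
# Hypothetical example: continuous pole at s = -2, i.e. x'(t) = -2 x(t).
T = 0.1                        # sample period (assumed)
t = np.arange(50) * T
x_cont = np.exp(-2 * t)        # continuous solution sampled at t = kT

# Equivalent discrete-time system: x[k+1] = a x[k] with a = exp(-2*T)
a = np.exp(-2 * T)
x_disc = a ** np.arange(50)

print(np.max(np.abs(x_cont - x_disc)))   # essentially zero
```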

Also, the history section surprises me - it talks about aeroplanes, but says nothing of Bode, Nyquist, and the like.

There's nothing in the entire Control section (at least that I've seen so far) about sensitivity or complementary sensitivity, which could be added to the classical control section.


If no-one has any objections, I intend to do a major reworking of a lot of this stuff, and not just the Control Theory article. The Nyquist stability criterion could be better stated, and 'encirclements' really ought to be changed to 'anticlockwise encirclements'. If they're clockwise, you've got poles crossing the imaginary axis into the right half of the s-plane as you close the feedback loop and apply gain, and you're making things worse. There are many other things, but it's probably not worth writing them all down here, since they pertain to other articles.

Jenesis 23:10, 20 October 2005 (UTC)
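The clockwise-encirclement point can be checked numerically. The sketch below (a rough illustration with a hypothetical loop L(s) = k/(s+1)^3, not from the article) counts clockwise encirclements of -1 by the Nyquist plot; since the open loop here is stable (P = 0), that count equals the number of unstable closed-loop poles.

```python
import numpy as np

def clockwise_encirclements(k, W=200.0, N=400_001):
    """Count clockwise encirclements of -1 by L(jw), for L(s) = k/(s+1)^3.

    With a stable open loop (P = 0), the Nyquist criterion says this count
    equals the number of right-half-plane closed-loop poles.
    """
    w = np.linspace(-W, W, N)
    L = k / (1j * w + 1) ** 3
    f = 1 + L                  # encirclements of -1 by L = of 0 by 1 + L
    # accumulate the (small) angle increments along the curve
    winding = np.sum(np.angle(f[1:] / f[:-1])) / (2 * np.pi)
    return int(round(-winding))   # counterclockwise winding = P - Z = -Z

print(clockwise_encirclements(1))    # gain 1: closed loop stable
print(clockwise_encirclements(10))   # gain 10: two RHP closed-loop poles
```

(For k = 10 the closed-loop poles solve (s+1)^3 = -10, putting two roots at about 0.077 ± 1.87j, in the right half plane; hence the two clockwise encirclements.)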

Jenesis, please do. It sounds like you know what you're talking about. I have a lot of background in control theory, so if I can be of any assistance, leave me a message. --M0nstr42 16:07, 1 November 2005 (UTC)
I am not sure that the Wright Brothers did anything for the _theory_ of controls. Perhaps this should be taken out.
Seconded. --SirTwitch (talk) 16:29, 15 May 2008 (UTC)
Did you mean to say "more so than the ability to produce lift from an airfoil" (not life)? --M0nstr42 21:45, 4 November 2005 (UTC)


Well, I've had a go at a few things such as stability (asymptotic and marginal), and corrected some errors (which is generally quicker and easier than expanding the content). Unfortunately, the things I'd like to do here and the things I have time to do aren't really in one-to-one correspondence. A few MATLAB plots would be fairly quick and easy, though, and diagrams are generally pretty helpful in Control. Anybody with access to MATLAB and the Control Toolbox could be of help there. I've checked with The MathWorks, and plots generated with an Academic copy can be freely distributed. Jenesis 22:59, 7 November 2005 (UTC)
The equation shown for PID is incorrect, I believe; control doesn't work off the output but off the error. (I did not edit it.) I would disagree that PID is the simplest feedback control; it is likely the most common, however. I can elaborate if need be. Billymac00 09:43, 2 January 2006 (UTC)
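To illustrate the point that PID acts on the error e = setpoint - measurement rather than on the output directly, here is a minimal discrete-time sketch (the first-order plant and the gains are hypothetical choices, not from the article):

```python
# Minimal discrete PID sketch: the control signal is computed from the
# error e = setpoint - measurement, not from the output itself.
def simulate_pid(kp=2.0, ki=1.0, kd=0.1, setpoint=1.0, dt=0.01, steps=2000):
    y = 0.0                     # plant output (measurement)
    integral = 0.0
    prev_error = setpoint - y
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        prev_error = error
        # hypothetical first-order plant: dy/dt = -y + u (forward Euler)
        y += (-y + u) * dt
    return y

print(simulate_pid())           # settles near the setpoint
```

The integral term is what removes the steady-state offset; with proportional action alone the output would settle below the setpoint.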

Many problems

In "controllability and observability", "states" and "unobservable poles" are discussed but not defined, so the section is incomprehensible for its intended audience. It seems that a state is not defined anywhere in the article. Should we add a simple example, such as $\dot x=Ax+Bu$ and $y=Cx+Du$, where A, B, C, D are constants (or constant matrices)? Or perhaps we should use a discrete-time system ($x_{n+1}=Ax_n+Bu_n$) for simplicity. Also, in "stability" various other forms of stability should at least be mentioned, although they could be discussed more fully in a separate article. However, the basic structure of this article might be a good idea: the reader can print it and easily study some basics at one glance. Tilin 14:37, 29 December 2005 (UTC)
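A sketch of the suggested discrete-time example, with hypothetical matrices A, B, C, D chosen purely for illustration, might look like this:

```python
import numpy as np

# Discrete-time state-space system, as suggested above:
#   x[n+1] = A x[n] + B u[n],   y[n] = C x[n] + D u[n]
# The matrices below are hypothetical illustrative values.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

x = np.zeros((2, 1))            # the state: the system's internal memory
ys = []
for n in range(200):
    u = np.array([[1.0]])       # unit step input
    y = C @ x + D @ u
    ys.append(float(y[0, 0]))
    x = A @ x + B @ u

# Both eigenvalues of A (0.9 and 0.8) lie inside the unit circle, so the
# step response settles to the DC gain C (I - A)^{-1} B = 5.
print(ys[-1])
```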

Moved comment in Stability section to here

Someone posted this in the actual article under "Stability". In the future, please post such comments to the talk page instead.

(It seems that here the author assumes that the transfer function of the system is rational. Can someone confirm if this is true also for nonrational transfer functions? References?)

68.40.50.73 06:24, 28 January 2006 (UTC)

History

I am not sure about that, but I think Kalman played a major role in control and filtering theory. Didn't the NASA engineers say they needed two things for the trip to the Moon: Newton's laws and Kalman's filtering? I thought he was a milestone in control theory, but I am not sure. I have just seen many theorems named after him. Does anybody know more? Gala.martin 20:37, 6 February 2006 (UTC)

Kalman filtering is a topic in control theory that has to do with estimation of states that cannot be directly measured. It is actually an iterative (discrete or continuous) algorithm that uses information from measured states and a mathematical model of the actual system. The term "filtering" comes from the early days when operational amplifiers were used, if my memory serves me right. Fannemel 20:50, 20 February 2006 (UTC)
Yes: estimation of a quantity conditional on randomly perturbed observations. From a mathematical point of view, that's quite close to control theory. I just want to remark that Kalman played a major role in the history of control theory. He also understood the situation with attainable states, and proved several basic theorems in nonlinear control theory. Gala.martin 21:02, 21 February 2006 (UTC)
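For readers wondering what the filter actually does, here is a minimal scalar sketch (all numbers are hypothetical) that estimates a constant quantity from noisy measurements; each step blends the previous estimate with the new measurement using the Kalman gain:

```python
import random

# Minimal scalar Kalman filter sketch: estimate a constant x from noisy
# measurements z = x + v. All values below are hypothetical.
random.seed(0)
x_true = 3.0
R = 0.5 ** 2                  # measurement noise variance

x_hat, P = 0.0, 100.0         # initial estimate and its variance
for _ in range(500):
    z = x_true + random.gauss(0.0, 0.5)
    K = P / (P + R)           # Kalman gain: weight given to the measurement
    x_hat = x_hat + K * (z - x_hat)
    P = (1 - K) * P           # updated estimate variance shrinks each step

print(x_hat)                  # close to 3.0
```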

Appendix A

Won't this appendix be better placed in transfer_function? Fannemel 21:02, 20 February 2006 (UTC)

Possibly, but I don't think it is needed at all. An encyclopedia usually provides only the relevant equations (the ones the reader might want to use), and leaves all derivations to references. In this case we could refer to any elementary textbook on control theory. --PeR 11:13, 14 June 2006 (UTC)

Feedback control loop example

I think the control loop example should have a feedback other than unity (1). A non-unity example would provide a more general explanation.

I'd add a multiplier of k, but I don't readily have access to MATLAB. Cburnett 13:49, 23 March 2006 (UTC)
Why do you need MATLAB? Knotgoblin 02:54, 24 March 2006 (UTC)
a) It definitely looks like a MATLAB Simulink schematic; b) got an alternative? Cburnett 03:01, 24 March 2006 (UTC)
I have access to MATLAB and plenty of other similar apps. I can try to make one. Knotgoblin 18:47, 27 March 2006 (UTC)
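In the meantime, the effect of a non-unity feedback gain k can be shown without Simulink. A minimal simulation sketch (the first-order plant and proportional gain Kp are hypothetical choices): the steady-state output becomes Kp·r/(1 + k·Kp), so k scales the feedback path.

```python
# Closed loop with non-unity feedback gain k, as suggested above:
#   e = r - k*y,  u = Kp*e,  plant dy/dt = -y + u   (hypothetical choices)
def closed_loop_final(k, Kp=4.0, r=1.0, dt=0.001, steps=20000):
    y = 0.0
    for _ in range(steps):
        e = r - k * y          # feedback multiplied by k, not unity
        u = Kp * e
        y += (-y + u) * dt     # first-order plant, forward Euler
    return y

print(closed_loop_final(k=1.0))   # unity feedback: 4/5 = 0.8
print(closed_loop_final(k=2.0))   # k = 2: 4/9, about 0.444
```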

Mathematical Control Theory

Mathematical control theory is quite general. It is possible to state control problems in a general setting that contains, for instance, continuous and discrete controls as particular cases. What about adding a section about mathematical control theory? I am a mathematician, but I have a physics and electronics background. I should be able to work it out. Let me know. gala.martin (what?) 05:53, 20 April 2006 (UTC)

Explanation

I find many of the explanations here extremely deficient. For example, the link between impulse response and control theory is entirely unexplained: the meaning of the equation x[n] = 0.5^n u[n] simply cannot be deduced. What is x here? Or n? Have we implicitly moved to an iterative/difference-equation scheme rather than an ODE model?
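One plausible reading (a guess at the article's intent, not a statement of it) is that x[n] = 0.5^n u[n], with u[n] the unit step, is the impulse response of the first-order difference equation x[n] = 0.5 x[n-1] + input[n]. A quick check:

```python
# Reading x[n] = 0.5**n * u[n] (u = unit step) as the impulse response of
# the discrete-time first-order system x[n] = 0.5*x[n-1] + input[n].
# This is a guess at the article's intent, verified below.
N = 20
impulse = [1.0] + [0.0] * (N - 1)

x = []
prev = 0.0
for n in range(N):
    cur = 0.5 * prev + impulse[n]   # the difference equation
    x.append(cur)
    prev = cur

closed_form = [0.5 ** n for n in range(N)]
print(max(abs(a - b) for a, b in zip(x, closed_form)))   # 0.0
```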

What is (s), used as an argument to P(s), etc.? A Laplace transform variable? With respect to which variable is the Laplace transform carried out? As a PhD mathematician with only a little control theory, I found myself no wiser after reading the middle few sections. Having said that, the control strategies section and the introduction are much clearer. -- GWO

Controllability vs Reachability

What has historically been named 'controllability' is ambiguous. Let x0 be the equilibrium state (usually taken to be 0) of a system in the absence of an input. Then:

  1. Reachability of an arbitrary state, xf, from an arbitrary state xi is the ability to transfer from the initial state xi to the final state xf in some time by applying a suitable input.
  2. Controllability of an arbitrary state, x, is the ability to transfer from this state x to the equilibrium state x0 in some time by applying a suitable input.

Obviously, reachability implies controllability. For linear systems in continuous time, the two concepts are equivalent. However, a discrete-time system may be controllable without being reachable. A trivial example is:

$x_{k+1} = A x_k + 0 \, u_k$

where A is nilpotent.
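This example is easy to verify numerically (sketch below; the 2x2 nilpotent A is an illustrative choice): with B = 0 the reachability matrix has rank 0, so no nonzero state is reachable, yet because A is nilpotent every initial state reaches the origin in at most two steps.

```python
import numpy as np

# Discrete-time system x[k+1] = A x[k] + B u[k] with B = 0 and A nilpotent:
# nothing is reachable, yet every state is driven to 0 in finitely many steps.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])      # nilpotent: A @ A = 0
B = np.zeros((2, 1))

# Reachability matrix [B, A@B] has rank 0: only the zero state is reachable.
R = np.hstack([B, A @ B])
rank = np.linalg.matrix_rank(R)

# Controllability to the origin: any x0 reaches 0 after 2 steps.
x = np.array([[3.0], [7.0]])
for _ in range(2):
    x = A @ x                   # u has no effect since B = 0
print(rank, x.ravel())
```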

Mastlab 21:21, 10 September 2006 (UTC)

Further Reading section

I just added a Further Reading section, since I figure this article would benefit from having some external resources to point to. I populated it with my undergrad controls text - probably not the best resource to add, but it's a necessary section and needed to have something inside - feel free to remove it as others are added. 17:52, 22 November 2006 (UTC)

Cruise control, revisited

I changed the way the cruise control was described. The desired reference is of course the speed of the vehicle. The output, the variable which effects change, is of course the engine. The end result of the control is an effect on the vehicle speed, an 'output variable'. To simplify that concept from a general engine or vehicle speed to a specific output, some author chose the throttle position. The desired result of the control is to regulate the vehicle speed, while the practical result, the actual output itself, is simply control of the engine. I do not see the logic of calling the physically controlled output the 'input variable'.

In order to illustrate this difference further, I added a couple of clarifications in the following paragraph or two.

To illustrate this difference further here, let us consider a single closed-loop controller managing the temperature of a closed room. The desired reference is, let us say, 72 °F. The actual sensor input is telling the controller the temperature is 70 °F. The controller sees the error and operates the output to increase the temperature. In this situation, let us assume the output is a proportional steam valve feeding a bank of registers. You would not call the proportional steam valve the input variable any more than you would call the throttle position the input variable.

-Garrett 68.63.108.124 15:22, 1 October 2007 (UTC)
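The room-temperature loop described above can be sketched with a purely proportional controller (all dynamics and gains below are hypothetical); note the characteristic steady-state offset just below the 72 °F reference, which is what integral action would remove:

```python
# Proportional control of the room-temperature loop described above: the
# controller compares the measured temperature to the 72 °F reference and
# commands a steam valve opening in [0, 1]. Everything here is hypothetical.
def simulate_room(setpoint=72.0, t_outside=50.0, kp=0.5, dt=0.1, steps=5000):
    temp = 70.0
    for _ in range(steps):
        error = setpoint - temp
        valve = min(1.0, max(0.0, kp * error))   # saturated valve command
        # assumed heat balance: loss to outside, heat input from the valve
        dtemp = -0.05 * (temp - t_outside) + 2.0 * valve
        temp += dtemp * dt
    return temp

print(simulate_room())    # settles a bit below 72 °F (proportional droop)
```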

The control variable u is very often referred to as the "input variable" (because it is an input to the physical system). Using the term "control variable" or "control input" is perhaps better, as it is less ambiguous. However, referring to the reference signal as "input", while technically not incorrect, is not the way the language is used. The change you made to the section on open-loop and closed-loop control uses the opposite terminology of any control theory textbook I've ever read. Input/output is determined as seen from the physical system, not the controller. (However, the system input is referred to as an "output" when discussing the actual controller hardware, where, of course, it is an output port.) --PeR 18:19, 1 October 2007 (UTC)

Reverted anon

A recent anon tried to state that in this case the control signal was a voltage - this is most likely the situation, but may not be the case in general. Consider the situation where you have a fluctuating supply voltage (for some reason - bad power, batteries, who knows!) and you are measuring it to try to avoid burning out your motor, so you might have the ability to vary the load on the motor to keep it running at a constant velocity or torque. For example, you might have a hydraulic brake (why you would do this is beyond me), or you might be stirring a fluid which must be stirred at a constant velocity, where you can raise or lower the fluid level. The control signal won't always be a voltage; my examples are contrived, but I believe possible. User A1 (talk) 13:17, 27 March 2008 (UTC)