Talk:Type system
From Wikipedia, the free encyclopedia
Talk archives
Java Contradiction (fixed)
This article claims Java as both a dynamically typed and statically typed language in short succession. I know you could call Java dynamic through casting, but this subtlety is not explained in the immediate section that claims Java as a dynamic language. Someone has now fixed this - thanks, this item may be removed.
The human compiler
This has led some writers, such as Paul Graham, to speculate that many design patterns observed in statically-typed languages are simply evidence of "the human compiler" repeatedly writing out metaprograms.[3]
The quotation is related, but Graham talks about "OO languages" here:
This practice is not only common, but institutionalized. For example, in the OO world you hear a good deal about "patterns". I wonder if these patterns are not sometimes evidence of case (c), the human compiler, at work.
He also uses Python as an example, which is not statically typed. We should either remove the link or put it in a better fitting context. --88.73.13.40 15:46, 24 September 2006 (UTC)
- I'm pretty sure Paul Graham is using "dynamic" in a more expansive sense than merely "dynamic typing". He compares everything to the standard of Lisp, which is just about the most dynamic language that exists. --FOo 03:24, 19 January 2007 (UTC)
- Agreed. I added that sentence, but I misunderstood what he was saying, and I've now edited the article to remove the statement. - Connelly 12:26, 21 February 2007 (UTC)
Duck typing dynamic??
I disagree that duck typing is dynamic. If anything, it is "type ignorance": moving the whole concept of "what to do with data" outside of the scope of language-defined typing. Instead, interfaces to objects are (more or less) directly called. Directly saying "give me the integer value of x" doesn't make x any less static; x is simply asked to provide the only acceptable value. Sometimes that is itself, sometimes it is interpreted dynamically by x, but that doesn't make the return type of int(x) dynamic, nor does it make x itself dynamic. On a language level, x and int(x) are static (though not necessarily safe).
To me, "dynamic typing" is something like: "if I say x='A' it's a string, but if I say x='3' it's an integer". "Duck typing" is more like: "if it walks like a duck and talks like a duck, it's the other guy's fault if it's not delicious".
Final argument: the ability to call atoi(ptr); doesn't make ptr a dynamic type. You've just explicitly called an operation on it. --vstarre 17:22, 18 January 2007 (UTC)
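vstarre's distinction can be sketched in Python (a hypothetical illustration; the class and function names here are invented, not from the article):

```python
class Duck:
    def quack(self):
        return "quack"

class Robot:
    pass  # no quack method

def make_it_quack(thing):
    # Duck typing: nothing about thing's type is declared or checked up
    # front; the attribute lookup either succeeds or raises
    # AttributeError at the moment of use.
    return thing.quack()

print(make_it_quack(Duck()))  # quack
```

Calling make_it_quack(Robot()) fails only when the quack lookup is actually attempted, which is the "checked at the point of use" behaviour discussed above.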
ALGOL 68: Strong dynamic typing?
ALGOL 68 supports Union types (modes).
mode node = union (real, int, compl, string);
Usage example for union case of node:
node n := "1234";
case n in
    (real r):   print(("real:", r)),
    (int i):    print(("int:", i)),
    (compl c):  print(("compl:", c)),
    (string s): print(("string:", s))
    out print(("?:", n))
esac
The mode (or "type" in "C" speak) of n would normally be determined at run time from a hidden type field (unless optimised out). So could I call ALGOL 68 "strong dynamic" or "weak static" typing? Or is there another typing name?
NevilleDNZ 11:03, 1 February 2007 (UTC)
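The hidden-tag dispatch described above can be mimicked in Python, where the "tag" is the object's run-time class (a loose analogue, not ALGOL 68 semantics; the function name is invented):

```python
def describe(n):
    # n plays the role of the ALGOL 68 union "node"; the hidden type
    # field corresponds to the object's run-time class.
    if isinstance(n, float):
        return ("real:", n)
    if isinstance(n, int):
        return ("int:", n)
    if isinstance(n, complex):
        return ("compl:", n)
    if isinstance(n, str):
        return ("string:", n)
    return ("?:", n)

print(describe("1234"))  # ('string:', '1234')
```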
What is static typing?
The article states "A programming language is statically typed if type checking may be performed without testing equivalence of run-time expressions." Now this might be a great definition for Robert Harper's book, or a great definition in the ML Universe, but what does it mean here? How does it relate to the conventional definition of "Statically typed: all type checking of variables is done at the compilation stage?" I don't find it particularly well defined (and thus not particularly meaningful) as written. Also is dynamic typing the complement of static typing? Are the two sets even disjoint? The article uses "A programming language is dynamically typed if the language supports run-time (dynamic) dispatch on tagged data." It seems like you could do all type checking of variables at the compilation stage and also (inefficiently) use dynamic dispatch at run-time, therefore it is possible to make a language that is both statically and dynamically typed. This makes no sense; clear definitions are needed. You can say my objections are pathological, yet Common Lisp is said to be "dynamically typed with optional static type declarations" so one can ask how this was arrived at. Can every statically typed language be made into a dynamically typed dual with the addition of an "Object" or "Variant" and dynamic dispatch? Is then Java not the result of taking a statically typed language and applying the operation which results in a dynamic language? Is then Java not dynamically typed? Is this all just social science B.S. or are there any clear or meaningful definitions to be had? - Connelly 18:33, 20 February 2007 (UTC)
- Good questions. The first is somewhat moot since the "without testing equivalence" definition has been removed from the article, but I'd like to try to explain it. The only place I've run into that definition (the following comments have much less weight if I'm wrong) is in Harper's chapter of Pierce's Advanced Topics in Types and Programming Languages (MIT Press, 2005, p. 305). There are a few things to say about this:
- That definition was given in that book chapter in order to make a certain subtle but important point about module systems, which is what the chapter is about. In particular, he is contrasting static typing not with dynamic typing but with "dependent" typing.
- Harper, unlike Wikipedia, has the luxury of defining his terms however he wants -- especially in contexts like that book chapter, where the only person he has to convince that his definition makes sense is Pierce!
- In my experience, Harper (who was on my thesis committee) likes to give unexpected definitions for terms to make his readers think. His goal is the "aha!" experience that occurs when the reader realizes his wacky definition and the usual one are related, gaining some new understanding in the process.
- IMO, the key to understanding this one is that in the next sentence he says that testing equivalence of run-time expressions is "sometimes called 'symbolic execution'". So try rephrasing the definition to "...without evaluating any expressions in the code being checked." If type-checking requires evaluating expressions, then (in general) it cannot be done before run time because the expressions might perform I/O or have other effects in the course of producing their values, or might not terminate. Thus, a type system with this property cannot be called "static", but (for Harper's purposes) there's not much harm in calling any other type system static.
- As for the second question: "Is dynamic typing the complement of static typing? Are the two sets even disjoint?" I'm glad you asked! No and no! See Type safety#Type safety and strong typing. Under definitions like Harper's, static typing just means that all the checking of types that must be done at compile time (i.e., because the language definition says programs that fail those checks aren't supposed to be compiled) can be done at compile time. So languages with no compile-time type checking are statically typed (yes, you read that right!) -- and any language where the necessary static checking (if any) leaves behind a need to check some things (or everything) at run time is dynamically typed. Thus Common Lisp and Java are both statically typed, and are both dynamically typed.
- So to your third question: "Is this all just social science B.S. or are there any clear or meaningful definitions to be had?" Yes, there are clear and meaningful definitions, but they aren't what most people think they are. Creating a satisfactory Wikipedia article out of this situation involves a lot of B.S. :-) Cjoev (talk) 21:03, 17 December 2007 (UTC)
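For what it's worth, the "dynamically typed with optional static type declarations" combination can be sketched in Python, whose annotations (added to the language well after this discussion) can be checked before run time by an external tool such as mypy but are ignored by the interpreter itself:

```python
def double(x: int) -> int:
    # A static checker such as mypy would flag double("ab") before run
    # time, but the interpreter never consults the annotation:
    return x * 2

print(double(21))    # 42
print(double("ab"))  # abab -- accepted at run time despite the annotation
```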
This is another item within the static typing topic: I have to disagree with the statement about C# in the article, "However, a language can be statically typed without requiring type declarations (examples include Scala and C#3.0)". C# executes in the .NET runtime, which allows us to use concepts beyond static compilation. It is true that C# is strongly typed, so I propose striking the reference to C#3.0 in that sentence of the article. --Ancheta Wis (talk) 13:16, 25 December 2007 (UTC)
Dynamic data typing
Dynamic data typing is at present very short and lacking context. I propose that it be merged into this article. If at some future time there is a lot more written about dynamic data typing, a demerge can be considered, but at present a redirect here is more appropriate as I see it.--NHSavage 09:18, 31 March 2007 (UTC)
- 69.140.164.142 23:26, 5 April 2007 (UTC)
- Compared to the complex, specialized language used in this article, Dynamic data typing is an easy read and sure helped me understand the whole concept of a type system. Let's not lose that aspect if we do a merge into this article's section on static and dynamic typing, which explains the concept but fails to elucidate it for non-compsci majors. --Martinship 07:02, 3 April 2007 (UTC)
- I'll start on the merge on my sandbox page. Comment and edits welcome.--NHSavage 18:11, 3 April 2007 (UTC)
- Merge complete. I hope I have made the article easier for a non-advanced audience without losing any information.--NHSavage 19:11, 12 April 2007 (UTC)
JavaScript Dynamically Typed?
Are you sure JavaScript is dynamically typed, given that you can declare variables without first initializing them?
i.e.:
var x;
x = "Wiki";
Paulchwd
Dynamic typing: a naive view
Allow me to present my own view of dynamic typing, which might be fairly naive since I do not know the literature on the subject or type theory.
I always thought that dynamic typing was simply the ability to contain, within a single variable, an object of different possible types, hence making the real type of the variable "dynamic".
For example, with nominal subtyping, a variable of type Base can actually contain (or refer to, depending on the semantics of the language) an object of any type derived from Base.
Same with structural subtyping, except derivation is implicit.
That also happens with discriminated unions: if the variable either contains an object of type T1, of type T2, ..., or of type Tn, the real type of what it contains is then dynamic.
None of these, however, are considered dynamic typing with the provided definition, since they're completely type-safe constructs. Does that really make those static typing though? According to the Java description, it seems they are considered as such. When I discovered that I realized my definition was in contradiction with the one other people seem to use.
Of course, all of these can be used in the static domain instead of the dynamic domain, just replace variables by types and types by meta-types. Static subtyping is provided by C++0x Concepts or Haskell Classes, for example. Static discriminated unions would be the same as overloading.
Then there is duck-typing. PHP, Python, ECMAscript and all provide dynamic duck-typing. C++ templates provide static duck-typing.
Duck typing is extremely unsafe, since the validity of an operation is only checked when it is performed, so it can fail at any time. With static duck typing, which works in the meta world of compile time, the only problem it generates is hard-to-understand compiler error messages.
According to the dynamic typing definition, it seems the only valid form of dynamic typing is duck-typing, which funnily enough is not even tied to dynamics.
It seems fairly restrictive to have a term "dynamic typing" if it only exists to describe dynamic duck typing.
By the way, dynamic duck-typing is usually equivalent to having all variables of the following 'variable' type (pseudo ML-like code):
type variable = Object of object
              | Function of (variable list -> variable)
              | Procedure of (variable list -> unit)
              | Int of int
              | Float of float
              | Array of (variable, variable) map
              | Resource of int
              (* ... other native types *)
and object = (string, variable) map
One can also remark that's fairly similar to prototype-oriented languages.
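The `object = (string, variable) map` part of the sketch corresponds quite directly to how Python stores instance attributes (a minimal illustration; the Point class is invented):

```python
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(1, 2.5)
# Instance attributes really are a string-keyed map, matching the
# (string, variable) map in the sketch above:
print(p.__dict__)  # {'x': 1, 'y': 2.5}
```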
Unless it is converted to some other kind of typing (which would require whole-program analysis), that explains the inefficiency of such dynamically typed languages. A solution seems to be to JIT-compile them, in order to have types evolving within the compiler at run time.
--140.77.129.133 (talk) 04:38, 30 March 2008 (UTC)
- I was going to comment, then noticed the section title which is apt ;-) Try looking at the Duck Typing page for more info on that subject for example. --Paddy (talk) 07:08, 30 March 2008 (UTC)
I don't see how that brings any more useful info or anything new to the discussion.--140.77.129.158 (talk) 08:54, 3 April 2008 (UTC)
- I thought it would help point out the differences between C++ templates and duck typing. It doesn't help, in general, to talk of C++ templates as being static duck typing; the duck typing entry, as well as its talk page, would help you find out why. --Paddy (talk) 13:36, 3 April 2008 (UTC)
Dynamic typing != no type checking
A large part of the section Type_system#Static and dynamic type checking in practice is void as the writer assumes, falsely, that Dynamic languages do not do type checking when many type check at run time. For example, Python is a strongly typed language where type compatibility is checked between objects themselves rather than any variable referring to such objects and this is enforced at run time. --Paddy (talk) 04:43, 16 April 2008 (UTC)
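The run-time checking Paddy describes is easy to demonstrate (a minimal sketch):

```python
# Python checks compatibility between the objects themselves, at run
# time, rather than at the level of variable declarations:
try:
    result = "1" + 2  # str + int is rejected when the operation runs
except TypeError:
    result = "TypeError"
print(result)  # TypeError
```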
loose typing
I've seen the term "loose typing" several times. Could we add a section on some of the things this term might be used to refer to, similar to the Strongly-typed programming language article? Herorev (talk) 18:04, 30 April 2008 (UTC)
repeatedly incorrect paragraph about type inference
under "Explicit or implicit declaration and inference", there is a paragraph with some problems.
"Type inference is only possible if it is decidable in the type theory in question."
That's not true. GHC's "AllowUndecidableInstances" extension does some type inference; the only catch is that when you enable that flag, GHC might terminate successfully, but it might also not terminate at all if you've done the wrong thing.
"Haskell's type system, a version of Hindley-Milner, is a restriction of System Fω to so-called rank-1 polymorphic types"
Nay, that's Haskell98. Everyone (well, Hugs and GHC anyway) supports Rank2Types.
", in which type inference is decidable."
Except for polymorphic recursion, which Haskell98 allows but requires explicit type-signatures for. Including those, I'm not sure if Haskell98 is theoretically described by "a restriction of System Fω to so-called rank-1 polymorphic types" (though that's mostly an issue of overly theoretical terminology, to me-on-Wikipedia)
"Most Haskell compilers allow arbitrary-rank polymorphism as an extension, but this makes type inference undecidable. (Type checking is decidable, however, and rank-1 programs still have type inference; higher rank polymorphic programs are rejected unless given explicit type annotations.)"
It makes *complete* type inference undecidable. (But it already was, due to polymorphic recursion; and many other Haskell type-system extensions require type signatures in particular places too.) The presence of one undecidable function can be fixed by a type annotation, and the rest of the module then doesn't require type signatures.
In fact, some Lisp compilers are known to attempt some type inference for optimization purposes.
How should we fix this paragraph? We don't want to just delete it and give the reader the impression that type inference is a magical thing that will fix all their type errors while keeping all their working untyped programs valid. I looked at the main article Type inference for inspiration and wasn't very impressed with it...
—Isaac Dupree(talk) 14:10, 29 May 2008 (UTC)