Wikipedia:Reference desk/Archives/Computing/2007 October 2
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
October 2
Pseudocode operator
What does the operator := do? Example at Euclidean_algorithm#Using_iteration Steeltoe 00:49, 2 October 2007 (UTC)
- It's the assignment operator in some languages. --tcsetattr (talk / contribs) 00:56, 2 October 2007 (UTC)
- Early programming language designers were concerned that the '=' symbol had two different meanings: one was assignment (A=6 meaning that whatever value A had before, it is now set equal to 6), the other an equality test (A=6 being 'true' if A really is equal to 6, 'false' otherwise). There was a concern that people would be confused by this, and that mathematicians would start to tremble when they saw 'A=A+1', because it looks like an unsatisfiable equation.
- Some languages chose to replace the assignment function with an alternative symbol. Algol used a left-facing arrow - but when the ASCII character set became common, there was no arrow symbol - and Pascal adopted the ':=' to mean 'becomes equal to'...an assignment.
- Other languages (FORTRAN for example) decided to stick with a simple '=' sign and used '.EQ.' for a comparison. Yet other languages (the original BASIC dialect for example) opted to use a key-word ("LET A=6" is an assignment in BASIC) - later versions of BASIC dropped the 'LET' and deduced what was intended from the context. That's dangerous because A=B=6 could mean 'assign 6 to B then assign B to A' or it could mean A=(B=6) meaning that A is true if B equals 6 or false otherwise.
- The C language went the other way: it kept '=' for assignment and added '==' to represent a comparison. Somehow, the C way of doing things has won out, and this convention is used in pretty much all modern languages (C++, Java, PHP, Python, etc.). Some formal or 'pseudo'-languages stick to ':=', and I guess that's what the article you found did. SteveBaker 13:47, 3 October 2007 (UTC)
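To make the summary above concrete, here is a sketch (mine, not from the thread) of the iterative Euclidean algorithm from the linked article, written in Python, where the pseudocode's ':=' corresponds to plain '=' and comparison is '==', as in C:

```python
# The iterative Euclidean algorithm from the linked article.
# Pseudocode "t := b; b := a mod b; a := t" becomes one tuple assignment.
def gcd(a, b):
    while b != 0:        # pseudocode: while b ≠ 0
        a, b = b, a % b  # ':=' in the pseudocode, '=' in Python
    return a

print(gcd(1071, 462))  # 21, the example pair used in the article
```

(Incidentally, since Python 3.8 the ':=' token also exists in Python itself, the "walrus" assignment expression, though with different semantics from Pascal's assignment statement.)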
- The idea behind its use is so that a person, regardless of what programming language they understand, will be able to read the pseudocode. It's sorta like an "international" programming language that focuses on program structure and not language syntax.--Mostargue 14:14, 4 October 2007 (UTC)
I only read this RD occasionally, but am dropping in to note what I believe is an error in the above. As I understand it, the left-arrow character was in the original form of ASCII, around 1962, but the character set was revised around 1968 and the arrow was dropped because it wasn't much used and it was felt that the underscore (_) would be more useful. There was one other change made at the same time: the up-arrow was replaced by the caret/circumflex (^).
Note that the A in ASCII stands for American. Perhaps if ALGOL had been more popular in the US, there would have been a demand to keep the left-arrow character in ASCII. In any case ASCII wasn't really the only character set to consider: when it was new, there were several other character sets competing with it, and I don't think many of them, if any, had the character.
--Anonymous, 03:40 UTC, October 9, 2007.
processor
my machine just got bamboozled by a serious case of trojans, viruses and worms. I need help. I can't use System Restore and I can't access Task Manager; they have gained access to my registry and changed some things there. It's pretty terrible, and antivirus programs aren't working. I use Kaspersky 7, and it kills some of them, but for others it says the "voterai" (or something like that) trojan was not found. What is the guaranteed way of getting rid of these wretched things? ...don't tell me to run XP again, please. 2. In school our teacher refers to the CPU as the processor and also as the microprocessor. Is the microprocessor the same as the CPU? I thought a microprocessor was a small processor. —Preceding unsigned comment added by 212.49.81.186 (talk) 10:09, 2 October 2007 (UTC)
I am sorry to hear that your computer was infected (212.49.81.186), but your situation seems serious enough to warrant a full format of the hard disk. However, there may be better solutions, such as running the latest LiveCD version of Knoppix and fighting the malware from there.
I hope to hear from other Wikipedians about the effectiveness of using a LiveCD. In any case, I would suggest you go ahead and read the wikilinked articles, if you have some time.
Please refer to Microprocessor for a full treatment of your second question.
Regards, Kushal --KushalClick me! write to me 12:13, 2 October 2007 (UTC)
- The processor, microprocessor, and CPU (central processing unit) are the same. It is small, and is central, and is a unit, so you can add any of those terms, or not, if you want, or don't. The only possible confusion would be with the graphics processor or math coprocessor. The graphics processor is normally called a graphics card, though. StuRat 13:29, 2 October 2007 (UTC)
- A processor is anything that does calculations.
- A microprocessor is a processor that fits onto a single chip (which they all do these days). There are likely to be MANY microprocessors in your PC - there is probably one in the keyboard, another on the graphics card, probably one on each disk drive. Years ago, we had minicomputers and mainframes whose processors were too big to fit on a single chip - hence the distinction between processor and microprocessor is becoming outdated.
- The CPU is the central processor. In a PC, it's the processor that runs your programs and the operating system. You can sometimes have more than one CPU in a computer - in which case, they all run your programs for you.
- The other processors are simply there to make their specific part of the system work, and perhaps to communicate that data to the CPU. Notably, many modern PCs include a 'GPU' which handles all of the graphics calculations for you. One company has even produced an ultra-specialised 'PPU' solely to perform physics calculations in computer games. There are all sorts of specialised processors out there. SteveBaker 13:36, 3 October 2007 (UTC)
xp
I installed Windows XP twice on my machine and I want to delete one installation. I tried deleting the Windows folder of one of the installations, and now that installation is corrupt. How do I delete the installation so that when I boot I don't get the option to select which installation I want to log into? 2. Can I install 3 operating systems on my machine? I want Linux, XP and Vista. What are the demerits of doing so, and what are the minimal specs my machine needs to maintain such a tall order? —Preceding unsigned comment added by 212.49.81.186 (talk) 10:15, 2 October 2007 (UTC)
AFAIK, yes, you can install three operating systems on your computer. I would suggest you go in this order: first install Windows XP, then Windows Vista, then <s>Kubuntu</s> your distribution of GNU/Linux.
Regards, Kushal --KushalClick me! write to me 12:17, 2 October 2007 (UTC)
- Since you only run one O/S at a time, you can have three available so long as the computer meets the requirements for each. The only thing cumulative about the requirements would be hard disk space. StuRat 13:23, 2 October 2007 (UTC)
- I once tried to triple-boot XP, Vista, and Linux. IIRC you have to hide the XP partition from the Vista installer so it thinks it's installing to the primary partition, then either do something really weird with GRUB or settle for separate Windows/Linux (GRUB) and Vista/XP (winload, which has to be manually configured with msconfig from Vista) boot menus. I never could get it to work --frotht 17:43, 2 October 2007 (UTC)
- Possible workaround: You could install two separate hard drives in your computer, and just alternate between them using the BIOS at startup. This would solve the problem of having XP and Vista on the same "machine". You could run linux on both of them very easily with a LiveDistro. This approach offers the lowest interoperability, with the highest simplicity of installation and maintenance, because none of the components will produce unexpected interactions. (See also, Category:Virtualization software) dr.ef.tymac 00:21, 3 October 2007 (UTC)
How much would the US government pay for this algorithm
How much would the U.S. Department of Defense pay to get its hands on an O(log n) algorithm that solves the factorization problem or the discrete logarithm problem? I was just wondering because the RSA factorization challenges were recently withdrawn, and now there is no incentive to devise new fast factorization algorithms. Since most asymmetric algorithms depend on these problems, and many of them protect financial assets, would such an algorithm actually be worth billions of dollars? —Preceding unsigned comment added by 128.227.158.141 (talk) 14:19, 2 October 2007 (UTC)
- If you actually discovered such an algorithm, you'd probably wake up blindfolded and bleeding on the floor with a knee on your back, and the next thing you know you're a terrorist without a trial, serving a life sentence in Guantanamo Bay solitary under the Patriot Act. Publish far and wide anonymously. --frotht 17:36, 2 October 2007 (UTC)
- Well, you might alert world governments a year beforehand so as not to cause quite so much chaos, but anonymously publish far and wide eventually. If you're trying to sell the algorithm under the threat of releasing it, you'd undoubtedly be considered a terrorist and taken down in secret.. they wouldn't just hand you a billion dollars and make you promise not to tell --frotht 17:40, 2 October 2007 (UTC)
- First of all, there is almost certainly no O(log n) algorithm; maybe there exist polynomial algorithms, but certainly not logarithmic ones. As for how much money a fast way to crack asymmetric algorithms would be worth, billions of dollars is a conservative estimate. If you had an algorithm like that, virtually all traffic on the internet would be public: terrorist communications, credit card transactions, bank sessions and everything else you could imagine. Believe me, the RSA factoring challenges were not, by any stretch of the imagination, even close to being a big incentive (the prize money was what, a few tens of thousands of dollars?). Mathematicians all over the world are working their asses off on this problem, and it's not because they hope to win some prize. Almost certainly there are also thousands of people employed by various governments of the world working on this same problem. This would be the holy grail of modern cryptography. Personally, I'm fairly certain it's not possible; to get further, you need quantum computers. --Oskar 17:36, 2 October 2007 (UTC)
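The point above can be made concrete with a toy sketch (mine, with textbook numbers far too small to mean anything in practice) of why a fast factoring algorithm breaks RSA: anyone who can factor the public modulus can reconstruct the private key.

```python
# Toy RSA break: the modulus here is tiny, so trial division suffices.
n, e = 3233, 17          # public key: n = 53 * 61, textbook exponent
c = pow(123, e, n)       # ciphertext of the message 123

# An attacker who can factor n recovers the private exponent:
p = next(k for k in range(2, n) if n % k == 0)
q = n // p
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # modular inverse (Python 3.8+)
print(pow(c, d, n))      # recovers the plaintext 123
```

With a real 1024-bit modulus the trial-division step is hopeless, which is exactly what the security of RSA rests on.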
- The government would not pay you big money for your algorithm. They would declare it classified and arrest you for distributing it, if you distributed it (it would no doubt fall into one of the prohibited export restriction categories). And then probably use it without your permission. See Invention Secrecy Act, for example. Now maybe you could find someone else to pay you a lot of money for your algorithm, but rest assured finding a buyer without ending up in jail would be tricky, and you wouldn't necessarily be out of the legal thicket at all if it got traced back to you. --65.112.10.56 20:46, 2 October 2007 (UTC)
All this is so interesting to read. Adds a 'humane' touch to the impersonal academia questions at hand. LOL Anyways, before I say anything more and find remarks like rtfm on my talk page, I should leave it here. --KushalClick me! write to me 02:52, 3 October 2007 (UTC)
There is no "factorization problem". Any number can be factored in O(1) -- namely, in about 786,000 operations, which takes just milliseconds on a modern processor. 84.0.127.58 07:21, 7 October 2007 (UTC).
- So if I give you the product of two very large prime numbers, you can factor the result into the two large primes in a reasonable amount of time? Epylar 23:28, 7 October 2007 (UTC)
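A quick sketch (mine, not from the thread) shows why trial division is not constant-time: splitting a semiprime n = p·q takes on the order of p trial divisions, i.e. up to √n steps, and that count grows with the size of the input.

```python
# Trial division with a step counter: the work grows with the smaller
# factor p, up to sqrt(n) steps, so the cost is not a fixed constant.
def factor(n):
    steps = 0
    k = 2
    while k * k <= n:
        steps += 1
        if n % k == 0:
            return (k, n // k), steps
        k += 1
    return (n, 1), steps  # no divisor found: n is prime

print(factor(101 * 103))       # ((101, 103), 100): about p steps
print(factor(9973 * 9967)[1])  # thousands of steps for a larger semiprime
```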
Antitrust laws: support for .odt
Isn't refusing to support OpenOffice files when OpenOffice supports .doc files, considered an anticompetitive strategy? --137.120.3.217 16:25, 2 October 2007 (UTC)
- No. It would be anti-competitive to do otherwise. For example, you create your own word processor. Microsoft supports your format, forcing you to support theirs. Now, you have to set aside assets to develop support for MS Word or get sued for anti-competitive practices. -- kainaw™ 16:28, 2 October 2007 (UTC)
- With all due respect, I find your logic somewhat specious and contradictory. Grandparent, I would warrant this question is more suited to the Humanities reference desk. wilymage 00:25, 3 October 2007 (UTC)
- I gotta say, Kainaw, that "explanation" makes no sense to me whatsoever. I think what you mean is, "It would be anti-competitive to force Microsoft to have to devote resources to ODT files" but boy you've picked a not very clear way to express it (and I'm not sure you're right, but I don't know a thing about anti-trust law). --24.147.86.187 00:23, 3 October 2007 (UTC)
- The OP stated the situation: OpenOffice supports .doc files, but Word does not support .odt files. The OP then asked if that was an anticompetitive strategy. I have to assume Microsoft was chosen in this example because they have been sued repeatedly for possibly illegal anti-competitive practices (and many times, they have to pay out large fines). So, I put it back on the questioner. What if he develops a word processor that does .wrd files. Then, what if Microsoft decides to support .wrd in Word. Now, does that mean that the questioner is using anti-competitive practices? Should Microsoft sue him? If that makes absolutely no sense, then I'm obviously not reading the question correctly. -- kainaw™ 01:23, 3 October 2007 (UTC)
- I mean that it would be anti-competitive to make rules such that "if company A supports company B's format, company B is required to support company A's format." The question appears to me to imply that if someone supports your format, you must support theirs - and considers it anti-competitive to fail to support their format. I wanted to point out an example of how requiring everyone to support everyone else's format allows for anti-competitive practices. If Microsoft doesn't want to compete with you, they just support your format and then run you out of business by forcing you to go back and support all of Microsoft's formats. -- kainaw™ 18:11, 3 October 2007 (UTC)
- I don't think he was asking about your hypothetical rule. Tempshill 19:09, 3 October 2007 (UTC)
- To 137.120, volumes have been written on this issue, and quite frankly, you are not likely to get a conclusive answer regardless of where you ask this question because there are multiple ways to evaluate the difference between "anticompetitive practices" and "legitimate trade secrets". Also, the answer you get will change depending on whether you ask an antitrust lawyer, an economist, or a VP of business development for a technology firm.
- Unless you clarify which "angle" of this subject most interests you, the best starting point for you is probably Vendor lock-in, followed by Competition law. Also, as a side note "anticompetitive" does not necessarily equal "violation of anti-trust laws" ... just in case that wasn't obvious. dr.ef.tymac 01:11, 3 October 2007 (UTC)
- I would say it's anticompetitive, yes, but not illegal. Is it a good idea from the Microsoft POV ? Perhaps in the short run, since ODT files are only a small part of the total market so not supporting them isn't much of a handicap now. However, as that format grows in popularity, MS will either need to offer support or this will be seen as a serious limitation in their software, causing people to go elsewhere for their word processing needs (say Linux running Open Desktop). In short, such practices only work if you have a stranglehold on a market, and MS is rapidly losing their stranglehold. Thus, they will need to change strategies or go under. StuRat 14:13, 3 October 2007 (UTC)
- Wilymage is correct; this isn't really a technology question. That said, you should start with your definition of "anticompetitive". Broadly speaking, "anticompetitive" might mean "doing things that are mean to your competitors", which is, broadly speaking, quite legal. I think what you are really asking is whether this is illegal unfair competition. In the US, at least, under the Sherman Act, it is illegal to restrain trade with your monopoly. So, giving away Office for free would destroy the market for OpenOffice, and therefore would probably be judged illegal under the same rationale used in United States v. Microsoft; but I think it's unlikely that a judge or jury would decide that Microsoft was restraining trade by not taking an affirmative step to make Office read and write other file formats. If such a decision were reached for some reason, it would introduce a weird slippery slope: is Microsoft therefore illegally abusing its market position 1000 times over, by suppressing the 1000 other commercial and shareware word processors and spreadsheets out there, and must Microsoft keep Office compatible with all of them? Tempshill 19:09, 3 October 2007 (UTC)
How is schema definition referenced in an XML document?
In an XML document whose validity constraints are specified using a schema definition, how is the schema definition specified/referenced in the XML document? (I'm assuming that XML does provide a method for such specification/reference.) Is it required that a valid XML document specifies/references the DTD/schema with which its validity can be checked? --64.236.170.228 20:39, 2 October 2007 (UTC)
- First, have you had a look at these? XML Validation, Document Type Definition, XML Schema (W3C), XML Schema Language Comparison. If yes, please be a bit more specific, since the answer is not the same for all of them. If not, please have a look at the articles first for a good overview. HTH. dr.ef.tymac 20:56, 2 October 2007 (UTC)
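To sketch an answer for the record (the element and file names below are made up for illustration): a DTD is referenced from inside the document via a DOCTYPE declaration, while with W3C XML Schema the instance document may carry a location hint but is not required to; a validator may equally well be handed the schema out of band.

```xml
<?xml version="1.0"?>
<!-- DTD: the document itself names its DTD in the DOCTYPE declaration -->
<!DOCTYPE invoice SYSTEM "invoice.dtd">
<!-- W3C XML Schema: xsi:schemaLocation / xsi:noNamespaceSchemaLocation
     are only hints; a processor is free to ignore them, and a schema-valid
     document need not carry them at all -->
<invoice xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:noNamespaceSchemaLocation="invoice.xsd">
  ...
</invoice>
```

So, to the second question: no, a document need not reference its schema at all to be validated against one; the association can be supplied externally to the validator.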