Talk:PhysX
From Wikipedia, the free encyclopedia
Nvidia bought AGEIA on 5 February (not on the 14th)
Not on 14 February, as the article currently says.
Link: http://www.xataka.com/2008/02/05-nvidia-compra-ageia-technologies
(Please change this; I don't know enough wiki markup to do it myself. ;))
Crysis?
Crysis doesn't support PhysX in any way. Why is it included in the supported games list? —Preceding unsigned comment added by 71.126.13.221 (talk) 20:40, 28 February 2008 (UTC)
This article is currently just a paste of [1]. I don't know if such a short text is a copyright violation, but in any case this kind of sensationalistic writing is not appropriate. For example, there is no proof that this will have as big an impact as the GPU did. Thue | talk 21:31, 18 Mar 2005 (UTC)
I didn't copy+paste anything
Maybe the first guy did, but not I.
Is there any indication of how the card will be used by software? Will physics acceleration become standardised (within DirectX?), or will each application require direct access to the card? Tzarius 05:44, 14 July 2005 (UTC)
I remember reading somewhere on microsoft.com that they are hiring people for the DirectX team to work on physics, so DirectPhysics probably isn't too far-fetched.
- AGEIA's physics library, NovodeX, handles all the physics applications added to games. It's already widely supported in the video game industry and is pretty much the only middleware that contains a wide selection of physics functions. It's optimized for PhysX use, of course. There's a Wikipedia article on it.
Revamp-age
I revamped the article to be simpler to those who might not be familiar with the terms used, while still keeping all the techno mumbo-jumbo. I removed the "context" note, too, but if anyone feels that it's still needed, by all means put it back. --Zeromaru 00:14, 10 August 2005 (UTC)
Pardon my being blunt but this page is nonsense.
Extremely light on technical detail and it reads like an advertisement. --Andyluciano 00:03, 11 September 2005 (UTC)
- I'm inclined to agree. Even though the page consists of my own writing (itself a heavy rewrite of what was before), my edit was only to make it suitable to remove the "insufficient context" notice, which it became. --Zeromaru 02:53:40, 2005-09-11 (UTC)
Merge with PPU article
Do not merge - Current articles on computer peripherals, such as the graphics card and sound card, link to specific commercial applications of that technology. AGEIA's PhysX is to the PPU as Nvidia's GeForce is to the GPU. The Nvidia articles even go so far as to include a separate article for each and every series of Nvidia's GeForce line. A further issue with merging the two articles is that an emerging technology which is not brand- or commercially specific, such as the CPU, would end up carrying AGEIA's name, which should not be the case. --Kizzatp
In conclusion, Ageia's PhysX is a PPU. But a PPU is not a PhysX. --Kizzatp
- I agree. PhysX is important enough to warrant its own article. I've removed the merge template since no-one has written their support here. ··gracefool |☺ 11:32, 20 February 2006 (UTC)
Progression
Is the PPU supposed to replace the GPU in its entirety, or are they supposed to work in tandem? teh TK 12:30, 31 May 2006 (UTC)
- They do different things. A graphics card is only responsible for figuring out what things should look like; when it comes to actually calculating how objects interact with each other and the environment, there are many more calculations, which normally have to be performed by the CPU. The idea with a PPU is that you take a specialized processor with its own resources and let it handle that, leaving the CPU free for other things. This is analogous to offloading graphics to a GPU or sound processing to a sound card. (To take that further, if you've ever used 3DMark06, you know how badly your CPU struggles when it tries to do what your GPU does. It's a similar thing with the PPU.)
- As mentioned in the article, though, this does increase the work for the graphics card(s); it is possible for the PPU to produce so many objects that even an amazing SLI/Crossfire setup won't be able to handle them without scaling some things down. If PPUs succeed, maybe they will accelerate the development of more powerful graphics setups. More expensive ones, too. --Ted 16:57, 13 June 2006 (UTC)
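The offloading argument above can be sketched in a few lines of code. This is an editorial illustration, not AGEIA's actual pipeline, and `step_body` is a hypothetical name: each body's update depends only on its own previous state, so thousands of bodies can be stepped simultaneously, which is exactly the property a dedicated parallel processor (PPU or GPU) exploits while the CPU stays free for serial work.

```python
# Editorial sketch (not AGEIA's real code): one explicit-Euler update per
# rigid body. No body's update reads any other body's state, so the loop
# over bodies is "embarrassingly parallel".

GRAVITY = (0.0, -9.81, 0.0)  # metres per second squared

def step_body(pos, vel, dt):
    """Advance one body's (position, velocity) tuples by a timestep dt."""
    vel = tuple(v + g * dt for v, g in zip(vel, GRAVITY))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

# On a plain CPU this list comprehension runs serially; a PPU would, in
# effect, run every iteration at the same time.
bodies = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)) for _ in range(10_000)]
bodies = [step_body(p, v, dt=0.01) for p, v in bodies]
```

The per-body independence is the whole point: a serial CPU gains nothing from it, while wide parallel hardware turns the body count into nearly free throughput.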
PCIe
I read up on the BFG and ASUS offerings when they came out, and I'm 95% sure that they were both PCI cards. Is this just something that's planned (surely not x16), or what? I'm removing it for the time being. --Ted 16:59, 13 June 2006 (UTC)
- No PCIe announcement has been made - although, while in development some versions of the PhysX included a PCI Express connector as well as a PCI connector --PdDemeter 00:08, 23 September 2006 (UTC)
- 1x, I assume? --Ted 02:52, 23 September 2006 (UTC)
- It was indeed as demonstrated here. --PdDemeter 03:30, 23 September 2006 (UTC)
PS3
I have read in this article that the PlayStation 3 may not be compatible with the PhysX PPU. Is this true?
- Well, video game consoles aren't generally designed for user modification, so if they don't throw in a PPU, you can't have one. However, I do believe Ageia licensed their physics API to Sony for this, so the Cell may use a core or two for physics in the same way your computer would use a PhysX. --Ted 03:14, 23 June 2006 (UTC)
Useless?
Does the final paragraph of 'Criticism and doubts' counter claims that PhysX is useless? This looks to be what the author wants it to say, but there are other conclusions that could be drawn from the drop in frame rate. Poor optimisation for a general purpose CPU? Intentional crippling? --BAGale 23:16, 4 July 2006 (UTC)
'Proves' is now 'suggests', though I think this is still too strong an assertion, particularly without any figures for the fps of the CellFactor R36 demo using PhysX hardware. --BAGale 00:31, 5 July 2006 (UTC)
"This suggests that the PhysX hardware helps significantly when properly implemented". Fairly sure that's not what you meant. This implies there was some question of ASUSTek or BFG being at fault. --BAGale 08:15, 8 July 2006 (UTC)
- Good point. When I added those words, I was more trying to point out that there has to be proper software-level support for PhysX to do anything, as there is in the Cellfactor demo. Can you think of a less ambiguous way to write that in? --Ted 21:49, 8 July 2006 (UTC)
For me, PhysX will be useless once MS deploys its team to enhance DirectX for physics calculation, so better to keep away from it, as 300 bucks is a whole lot of money. - chandrakant
- User:PdDemeter recently removed this section:
-
- A Reuters news article dated April 28th, 2006 stated that the PhysX processor would go on sale in the U.S. in May for $300.00, a price that had people raising eyebrows and asking whether it was worth spending $300.00 on an "unproven technology". Reuters stated that the processor could be well beyond its time because they mentioned a demonstration of the PhysX chip using the game Cellfactor and they said before the demonstration, the graphics level actually needed to be lowered in the game itself because Reuters stated that the PhysX processor "can generate so many objects that even the twin graphics processors in Hegde [AGEIA's CEO]'s top-end PC have trouble tracking them at the highest image quality." Reuters stated that "Hegde is betting that gamers will happily sacrifice some graphical fidelity in exchange for greater interactivity." Reuters also reported that the PhysX chip first debuted in March in high-end gaming PCs from Dell, Dell's Alienware unit, and Falcon Northwest. [1]
-
- Despite AGEIA claiming that a PhysX PPU (Physics Processing Unit) was required by the game, it was discovered that by adding "EnablePhysX=false" to the end of the CellFactor Demo launch shortcut, it was possible to run the demo without the aid of the PPU. Independent benchmarks had suggested that the PPU helped very little when faced with extreme uses of physics, such as launching a grenade from the assault rifle at a large pile of physics-enabled objects. This led many people to believe that AGEIA's PhysX technology is 'useless' and that the demo was rushed without correct testing.
-
- The CellFactor "R36" demo, released 2006-06-08, however, allows software cloth simulation without the appropriate PhysX hardware (with "EnablePhysX=false" appended to the shortcut), whereas the earlier demo only simulated rigid bodies in software (not the cloth or fluid effects that could be done in hardware). With cloth simulated in software the frame rate would drop as low as 2fps, down from an average of 30-40fps when only rigid bodies were simulated in software. This suggests that the PhysX hardware can indeed help significantly.
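For reference, the workaround described in the deleted text amounts to appending the flag to the end of the shortcut's target line. Only the `EnablePhysX=false` flag comes from the source; the install path below is purely illustrative:

```
"C:\Games\CellFactor\CellFactor.exe" EnablePhysX=false
```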
- I'm very nervous about simply removing this section. The criticism it contains is valid - the PhysX hardware does not provide much acceleration for some significant classes of operation - and without this, we have something that reads too much like a press release. I don't think the heavy reliance on quotes from the Reuters article is what we need - but we do need some words here. Do we have a reference to the 'independent benchmarks' mentioned in the second paragraph? SteveBaker 17:51, 16 October 2006 (UTC)
-
- I certainly think there should be a criticism section, however I'm not sure it should look anything like the previous one: as you pointed out, the first paragraph is very heavy on reuters quotes. I'm not sure the cellfactor example is a very good criticism (if anything, it simply shows there are no benefits for rigid body physics, but great benefits over software-mode with cloth/fluids). What do you think about more abstract criticism:
- no benefits for rigid body simulations
- there are significant challenges to designing a game where fluid physics are more than just graphical effects
- indeed, no current physx games (possibly save cellfactor) seem to get anything but a minor graphical boost
- initial cost of card & its power consumption, especially given the small number of titles sporting it currently
-
- Or we could just revert the edit and modify the section instead. --PdDemeter 20:21, 16 October 2006 (UTC)
-
- I think it's important to report facts rather than expressing opinions. Certainly my findings are that the PhysX chip is pretty much worthless for rigid body dynamics - but that would be 'original research' - which is frowned upon on Wikipedia. Instead we need to report published data. The second two paragraphs of the deleted material are just fine IMHO - except that they need to have references so that other people can fact-check. Without those references, we're on pretty shaky ground.
-
- As for the viability of Physics hardware in general - there is no doubt that a dedicated physics engine isn't going to make it into widespread usage amongst game players when there is a perfectly good solution sitting there in the form of the GPU. I've played around a bit with GPU-based physics and the hardware is actually quite well suited to the task of doing rigid body dynamics. SteveBaker 21:27, 16 October 2006 (UTC)
Opening+competition rewrite
I've just rewritten the opening and competition sections; I think the result is more readable, and more focused (especially in the competition); the competition section does get a little technical - hopefully not too technical! I'm also changing the NovodeX article to point to PhysX (since they rebranded the NovodeX SDK to PhysX SDK)
In addition, I've removed "AGEIA claims that their PhysX chip can perform physics calculations at one-hundredfold that of current CPUs and physics software." because after looking for quite some time, I can't find any such claim on their site anymore (and the reviews with the actual gains available don't support this anyway, so I'm not sure what this claim would add to the article). --PdDemeter 03:13, 23 September 2006 (UTC)
Long list of supported games moved
The L-O-N-G list of supported games was getting longer than the actual article - and kinda pointless. I moved the content over to List of games using physics engines - but to be honest, that list is also kinda pointless. If you want to continue adding to it - feel free. SteveBaker 14:34, 16 October 2006 (UTC)
Problems with using GPUs for physics
The original statement that GPUs might be unsuitable for physics is (IMHO) flat-out wrong. I know this for an absolute fact because I happen to be working on an open-source GPU physics package for the Bullet library. I'm not finished yet - but I'm a 3D graphics expert (I've been doing it for 25 years), I have first-hand experience of using GPUs for physics, and I can clearly see where the limitations lie. However, I can't say this in the article because that would be original research, which WP does not allow. What I can say is that before I can accept what I believe is a completely false premise, I need to see references...bloody good ones. I offer as counter-evidence the fact that both ATI and nVidia have actually demonstrated game physics running on their respective GPUs - and in both cases doing so a lot faster than the PhysX hardware manages.
I'm marginally OK with leaving the original claim there with just a citation needed tag - but I'm not going to stand by and have someone else's original research being put there...which is why I reverted it.
Wikipedia is not the place for a detailed debate about why GPUs are - or are not - suitable for doing physics. I'd be happy to debate that with anyone who has a clue what they are talking about - but this is hardly the right forum.
So - put up some evidence from an acceptable outside resource (ie nothing written by AGEIA) - or leave these false claims out of the article please. SteveBaker 20:36, 1 December 2006 (UTC)
- I'm not an expert, but shouldn't the article refer to the PhysX software as an API and not an SDK? According to my understanding, an API is what allows different pieces of software to interact (for example, a computer game talking via Direct3D to the driver software of a graphics card). An end user with an AGEIA physics card will normally have only the API component. A software writer seeking to write an application that uses PhysX would use the SDK, which includes the API as well, for obvious reasons. Or am I wrong here?
- More generally, perhaps it would be appropriate to improve this article by drawing parallels with the graphics accelerator revolution of the late 90s. It looks as if a 'physics accelerator' could become an essential part of computers used for certain purposes such as games. Of course, this is pure speculation on my part, but perhaps this is what the experts are predicting and it can be referenced...--ChrisJMoor 02:14, 16 January 2007 (UTC)
-
- Probably, though I'll leave it to someone who knows more to make a definite call. Definite no on the parallels to the graphics accelerator thing, though. The most enthusiasm about dedicated physics hardware comes from Ageia itself; every remotely independent opinion I've read has been no better than lukewarm about it. There's not nearly as much agreement about this being a necessary step in the sense that discrete GPUs were. --Ted 05:31, 17 January 2007 (UTC)
-
- The trouble here is that originally, the term "PhysX" was the name of the hardware which was a silicon implementation (or so they claim) of their older NovodeX API. They have subsequently decided to rename the API so it is also called "PhysX" - which leads to precisely this kind of confusion. There is a bit of a grey area between an SDK and an API. The term "API" generally refers specifically to the list of interface functions, structures, classes, etc. An "SDK" is a more general bundle containing the API and associated documentation - but also including some demo/example programs, maybe some tools, some sample data maybe. So the PhysX software can be both an API and an SDK. Someone purchasing the PhysX hardware would probably only get the driver software to install - you'd need to be a developer to get your hands on the API and/or SDK.
-
- You asked whether we should draw parallels with the graphics accelerator revolution. Yes, I think we should do that - but I don't think this is going to turn out that way (although Ageia would like us to think so). The fact that GPUs seem to be able to do at least as good a job (and I would argue: better) at physics - and yet be much more generally useful devices - means that the PPU concept is dead in the water unless/until there are very pressing reasons to get one as well as a GPU. I don't see that happening.
-
- In fact, I've been playing around with doing some of my AI calculations on the GPU too (stuff like route planning and running neural networks) - would Ageia see the world heading towards having a custom 'AIPU' (APU?) too? Their view would ultimately be that our computers would consist of a wild profusion of different special-purpose chips - each one providing a different kind of specialised service to the CPU. The truth is that what we're really heading towards is having one or more gargantuan, highly parallel compute engines that are descendants of the present GPU but which have no particularly specialised role - offloading whatever highly parallelisable tasks can be removed from the CPU - leaving the CPU to perform the necessarily serial parts of the algorithm as best it can. This is a more streamlined view of the world.
-
- Think about almost any parallelizable activity and you can generally map it onto the GPU architecture quite easily. Think about playing audio, for example: think of your 'sound font' as a set of one-dimensional texture maps. Think of frequency shifting and Doppler effects as rendering those textures with a scale factor. Volume control is brightness or 'lighting'. Envelope shaping is like Gouraud shading. Mixing is like translucent blending. So - stuff all of your source audio into textures - render them all onto a 1D frame-buffer object and you've got a sound sample, ready to be dumped out to a digital-to-analog converter. If you want stereo, render one channel of the audio into the red plane and the other into the blue. You can trivially see how to implement reverb and echo and all of those things using shaders. So - the sound card can easily be implemented entirely on the GPU.
-
- I think the PhysX hardware concept will die with this first chipset. In a couple of years the 'PPU' will be an interesting curiosity whilst the GPU will have gradually taken teeny-tiny steps towards becoming a general purpose parallel computing engine that does Physics, some AI, audio and graphics. We'll find other uses for it as it becomes more generalised - but already it's clear that anyone with enough imagination could already use it for all of those things.
-
- SteveBaker 06:13, 17 January 2007 (UTC)
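The audio-as-textures analogy above can be sketched as ordinary code. This is an editorial illustration, not real GPU shader code, and `sample_texture`, `render_voice`, and `mix` are hypothetical names standing in for texture lookup, scaled rendering, and blending:

```python
import math

# Editorial sketch of the analogy: a sound is a 1-D "texture", playback is
# sampling it with a scale factor (pitch shift), volume is a multiplicative
# "brightness", and mixing is additive "blending" of rendered buffers.

def sample_texture(tex, u):
    """Nearest-neighbour lookup into a 1-D texture, wrapping like GL_REPEAT."""
    return tex[int(u * len(tex)) % len(tex)]

def render_voice(tex, n_out, pitch=1.0, gain=1.0):
    """'Draw' one sound into an n_out-sample buffer, scaled and attenuated."""
    return [gain * sample_texture(tex, (i / n_out) * pitch) for i in range(n_out)]

def mix(*buffers):
    """Additive blending: sum the voices sample by sample."""
    return [sum(samples) for samples in zip(*buffers)]

# One "sound font" entry (a single sine cycle), rendered at two pitches:
# pitch=2.0 repeats the cycle twice in the buffer, i.e. an octave up.
tone = [math.sin(2 * math.pi * i / 64) for i in range(64)]
out = mix(render_voice(tone, 128, pitch=1.0, gain=0.5),
          render_voice(tone, 128, pitch=2.0, gain=0.25))
```

Every sample in `out` is computed independently of every other sample, which is why the same work maps naturally onto per-fragment shader execution.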
- As it's a claim they're making, we could cite their FAQ or one of their whitepapers (p4 - "Putting GPUs to the test"). As far as I can see, nobody outside of their marketing department is making this claim. Removing the claim is probably a good idea, since it doesn't seem to be based in the real world. Havok FX and nVidia's Quantum Effects Technology only seem interested in offloading effects physics anyway, and Ageia are saying that GPUs aren't great for gameplay physics. --PdDemeter 16:28, 17 January 2007 (UTC)
-
- Yep - but that's no use - we need an independent reference. But for all of their claims that GPU's are only useful for 'effects' physics, the few games where their hardware actually does anything all seem to be in the area of special effects. SteveBaker 21:36, 17 January 2007 (UTC)
-
- OK - I've read the section in the Ageia white paper. It doesn't say that GPUs can't do the job. It asserts (perhaps rightly) that the GPU is optimised for graphics and therefore not necessarily optimal for physics. That may be true - but despite that, we are seeing better results from using nVidia GPUs than from the PhysX system. They also make some odd claims about the nature of parallelism in the GPU - which are not a problem if you have a large number of physics objects being processed in parallel. Their other main claim is that regardless of all that, the graphics chip is already pretty busy and doesn't have time to spare for doing physics - so adding a physics engine must be a good thing. What they are missing is that we aren't necessarily talking about using a PPU plus a GPU versus a single GPU. I'm thinking more in terms of something like the nVidia dual-GPU setups. With two GPUs, games programmers can choose between using one for physics and one for graphics, using both for graphics, or some other split. With a PPU and a GPU, all of that flexibility is gone. Furthermore, a dual-GPU setup is vastly cheaper than a PPU plus a GPU - and will likely stay that way because of economies of scale. So I don't find Ageia's arguments particularly persuasive. However, this is (again) original research - but it shows that we can't use the Ageia document to back up this claim. SteveBaker 21:58, 17 January 2007 (UTC)
- I'd be inclined to remove their claim, then. --PdDemeter 11:35, 18 January 2007 (UTC)
Unfair Article?
Seeing as half the text in this article is about Havok as a PhysX competitor, I followed the link to the Havok article. There, absolutely no mention is made of PhysX. I'm not certain if this discussion is the best place for this comment or the one for the other page, so I'll add to both.
GPUs accelerate Vector Graphics?
This is way less important than the above efforts going into making this article neutral in standpoint. If you have some free time, take a look at this line: ...the GPU is used to accelerate vector graphics, and, by extension, 3D graphics.
Either I'm an idiot or this line is false. Since when have GPUs ever accelerated any vector images? Flash cartoons are just as slow with my new video card. Is this some weird way of saying that GPUs draw "vectors" between vertices? If so, it should be changed. It's confusing geektards like me, and some people might actually think that flash animations are affected by your graphics card. 70.56.212.176 06:42, 14 April 2007 (UTC), "Anonymous Coward"
- A 3d scene is a vector graphic - have a look at Vector graphics and compare it with Raster graphics. We could probably disambiguate it to something along the lines of "a gpu is used to accelerate the rasterisation of 3d vector graphics" if we want to keep the original meaning. Alternatively, I've changed the sentence to "accelerates the rendering of 2D and 3D graphics", since this isn't an article on graphics, so we can probably use a high level imprecise description. Thoughts? --PdDemeter 00:36, 15 April 2007 (UTC)
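To make the vector-versus-raster distinction above concrete, here is an editorial toy example (not how a GPU actually works internally): a line stored as two endpoints is a vector description, and rasterisation converts it into the discrete pixels that approximate it. Bresenham's classic algorithm does this for a 2-D line; GPUs accelerate the same kind of conversion for millions of 3-D triangles.

```python
def rasterize_line(x0, y0, x1, y1):
    """Bresenham's line algorithm: two endpoints in, pixel coordinates out."""
    pixels = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        pixels.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:  # step horizontally
            err += dy
            x0 += sx
        if e2 <= dx:  # step vertically
            err += dx
            y0 += sy
    return pixels

print(rasterize_line(0, 0, 4, 2))  # [(0, 0), (1, 1), (2, 1), (3, 2), (4, 2)]
```

The input (two coordinate pairs) is resolution-independent; the output is tied to a pixel grid, which is exactly the vector-to-raster step the article's sentence is gesturing at.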
Edited by Havok?
This page seems to be heavily oriented towards the Havok Physics Engine. Compare the two articles, you will see that there is a very heavy bias. Given that the page is about a specific product, not physics engines in general, it is very inappropriate. —Preceding unsigned comment added by 193.128.118.250 (talk) 10:35, 19 October 2007 (UTC)
I agree. I'll try to balance it out. --68.57.177.113 01:48, 7 November 2007 (UTC)
Havok FX is now effectively dead and I've removed references to it 121.72.131.61 (talk) 07:35, 15 December 2007 (UTC)
Supported titles - where are the names?
There's a section here called "Supported titles", but it doesn't actually mention any commercial games that use this product at all - the only two titles it names, Warmonger and CellFactor, are both promotional games that are being given away for free to promote the product! Meanwhile, the "Competition" section names two award-winning commercial games as examples of the 150+ titles it says use Havok physics.
This is a bit weird. Surely an article on the PhysX product should say more about games that use PhysX than games that use Havok? (If there aren't any major commercial games that use PhysX yet, then perhaps the "competition" section should be cut down instead, to reduce the imbalance.) 81.86.133.45 20:53, 25 October 2007 (UTC)
Added some stuff, and now I feel guilty
I've added an accurate UK price based on Google Product Search, changing "£50-£100" to "£90 to £145", added a list of games that support the hardware, and added the number of PhysX games compared to the number of Havok games available. Now I just feel like a kid picking the limbs off a spider. Sorry, Ageia. Unreadablecharacters (talk) 16:31, 3 January 2008 (UTC)
You can get them for £75 from overclockers.co.uk, and the list of games on the Ageia website is way out of date 121.72.129.13 (talk) 10:24, 4 January 2008 (UTC)
Multiple problems
"Games using the PhysX SDK can be accelerated by either a PhysX PPU or a CUDA enabled GeForce GPU."
This cannot currently be done. Better to say: "Games using the PhysX SDK can be accelerated by a PhysX PPU or, in the near future, via the CUDA port of the PhysX drivers, by a CUDA-enabled GPU (Graphics Processing Unit)" - spelling out GPU if the abbreviation hasn't been introduced earlier in the article.
"Stats and specifications (1st Generation)"
It can only be the "1st Generation" if there is, or is going to be, a second generation. Better: "Stats and specifications"
"With Intel's cancellation of Havok FX, PhysX on CUDA is currently the only available solution for effect physics processing on a GPU."
Again, PhysX on the GPU is not available yet. Better:
"With Intel's cancellation of Havok FX, the CUDA PhysX port will in the near future be the only available technology to do physics processing on a GPU for games" —Preceding unsigned comment added by 196.209.73.175 (talk) 21:39, 17 May 2008 (UTC)