Cell Factor!!

I haven't found many games that have gripped me in a fair while; in fact, apart from hard drives I haven't bothered to upgrade my computer for around 18 months. But this... this changes everything.

amazing_phy_4.jpg
amazing_phy_2.jpg

It's been known about for a while, yet as skullknight's game board has managed to miss it, I have taken it upon myself to spread the word of the latest revolutionary game due. Get your physics cards ready boys and girls, you're going to need them for this one as I don't think it's possible to run without the new hardware.


With many games the pictures make the game look better than it is, but CellFactor is the complete opposite. Find video trailers here:

http://gaming.hexus.net/content/item.php?item=5198

http://www.gamespot.com/pc/action/cellfactor/index.html

They aren't small in size but it's best to see the game in all its glory.
Due late 2006 I think, but with a game like this I wouldn't be surprised if they take longer, despite it looking fairly finished.

Also, info on the new Ageia Physics Processing Unit for those new to the concept; sounds like most games in the future will be using physics cards like this:
http://www.hothardware.com/viewarticle.aspx?articleid=816&cid=2


God it looks like fun. Can't fucking wait...
 

Walter

Administrator
Staff member
FINALLY!! BOXES AND PIPES EXPLODING WITH ACCURATE PHYSICS!!!

This looks more gimmicky than "POrtal," and thanks to the physics card, will cost more too. Then again, flying upside-down through the air while shooting a mounted gun on a jeep does look kinda fun :guts:
 
You can play the game without a physics card, it just won't run as well and some extra features won't be available. I don't think the environments are destructible or something; the developers weren't very clear about what wouldn't work without the physics card.

Honestly, Crysis is looking more interesting to me in general, and looks like it would be worth upgrading to Vista and buying a DX10 card.
 

Aazealh

Administrator
Staff member
Sparnage said:
It's been known about for a while, yet as skullknight's game board has managed to miss it, I have taken it upon myself to spread the word of the latest revolutionary game due. Get your physics cards ready boys and girls, you're going to need them for this one as I don't think it's possible to run without the new hardware.

This game doesn't look all that revolutionary to me, and as HawaiianStallion said you don't actually need a PPU to get it to run. Actually having a PPU doesn't change all that much from what I've heard, despite this game being a demo for the PhysX card.

Sparnage said:
Also info on the new Ageia Physics Processing Unit for those new to the concept, sounds like most games in the future will be using physics cards like this

I don't think most games will require that kind of card for a while, if they ever do. I know I'm not buying one.
 
if it doesnt have a 3rd person perspective option, im not interested... its really not that hard to do (from what i understand)... battlefront really nailed it... have 1st and 3rd felt real nice... and i dont understand why more games arent implementing a "dive and roll" or "sprint"... i was just playing quake 4 and as far as game play control goes, i felt like i was playing wolfenstein on the jaguar64 (slight exaggeration)... i just beat prey... it was a bit gimmicky but fun... i liked the whole anti gravity and portal stuff... in fact, without it it would have sucked (veil the suckiness in a shroud of gimmickiness (?))... even that game felt stiff and so did black and almost ALL these 1st person shooters that get pumped out for PC... blahckh
 
I'd say it was a pretty good FPS. It wasn't that much out of the ordinary, but I loved the enemies; even with the lack of variety I didn't notice, as they were some of the best around, and probably still are. Playing the game on its toughest setting without using slow-mo blows away a lot of other FPS games' toughest settings.
 
Sparnage, the screenshot on the left is from a very early PhysX techdemo, not CellFactor.

Yes, I've had CellFactor: Combat Training running without the hardware, it is possible. On my admittedly ageing machine (P4 2.4C overclocked to 3.2GHz) it ran okay - until I started interacting with a large number of objects at once. This triggered memories of being forced to sit and feign interest as my Grandma showed me slides of her annual migration to warmer climates. The thing is, when you force it to run in software, it only does rigid body physics, which are nowhere near as computationally taxing as the sexy cloth and liquid stuff (which are purely aesthetic effects but meh). Another thing to consider is that Combat Training is pretty much a tech demo itself, and Ageia's API isn't even complete yet, let alone optimised.
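For anyone wondering why software mode drops the cloth and fluids first: a rigid body is basically one position/velocity pair to integrate per step, while cloth is typically simulated as a grid of particles joined by springs, so every frame has to evaluate hundreds of constraints (several times over, for stiffness). A back-of-the-envelope sketch in Python - the function names and grid size are mine, not anything from Ageia's actual API:

```python
# Rough cost comparison: one rigid body vs. a small cloth patch.
# Purely illustrative -- real engines use far more sophisticated solvers.

def step_rigid_body(pos, vel, dt=1.0 / 60.0, g=-9.81):
    """Semi-implicit Euler for a single rigid body's centre of mass."""
    vel = (vel[0], vel[1] + g * dt, vel[2])
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

def cloth_constraint_count(width, height):
    """Structural springs in a width x height mass-spring cloth grid
    (horizontal + vertical links; real cloth adds shear/bend springs)."""
    horizontal = (width - 1) * height
    vertical = width * (height - 1)
    return horizontal + vertical

# One crate: a single cheap integration per step.
pos, vel = step_rigid_body((0.0, 10.0, 0.0), (0.0, 0.0, 0.0))

# A modest 30x30 cloth patch: ~1700 spring evaluations per step,
# usually iterated several times for stiffness -- per frame, per cloth.
print(cloth_constraint_count(30, 30))  # 29*30 + 30*29 = 1740
```

Scale that 30x30 patch up, add shear/bend springs and collision tests, and it's easy to see why a 2.4GHz P4 starts imitating a slideshow.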

Now, you guys need to check out the vid for CellFactor: Revolution. It's already making Combat Training look a little dated. The vid only shows what seems to be multiplayer in another big arena, but they're saying the single player is going to be a lot more interactive, and will definitely necessitate physics acceleration. Seeing as they have financial backing from Ageia, I really don't doubt they can design a game around the hardware that would grind to a halt without it.

To all the PPU naysayers, this is only the beginning. I'm positive that physics acceleration in one form or another (let's see if ATi's solution is everything they're touting it to be) is here to stay. Back when I first bought my Voodoo there were many lols to be had about how I had wasted my money when one could simply render graphics with one's CPU.
 
Walter said:
This looks more gimmicky than "POrtal," and thanks to the physics card, will cost more too. Then again, flying upside-down through the air while shooting a mounted gun on a jeep does look kinda fun :guts:

Am I to read that as you don't think "POrtal" will be one of the most amazingly awesome FPS games ever released? Because I'm really psyched about it. I've always been a fan of PvE logical puzzle games (stuff like Sokoban and the puzzle mode of Chu Chu Rocket, for example), and POrtal looks to be an excellent blending of that genre with the FPS. Really innovative, and looks to be fun and challenging as hell!
 

Aazealh

Administrator
Staff member
"Griffith No More!" said:
Realistic pipe and box physics; isn't this just Half-Life 2? =)

Nah, this is for people not skilled enough to use/create a purely software physics engine, so they need a $300 card to do it for them. :SK:
 
I don't see anything in Cellfactor that couldn't be done without another card. The only difference is that there'd only be 10 boxes flying around instead of 1000. Who cares?

It's obvious from Cellfactor that they haven't figured out just why you'll need this card -- The main appeal, presumably, is you can destroy environments realistically, but if you can destroy environments arbitrarily then it becomes extremely difficult to design levels without people being able to bypass obstacles. So then it becomes, "Well, you can destroy certain designated parts of levels... Plus lots of boxes and other non-gameplay critical junk." Aside from lots of extra junk, this is no different from any other game that has cinematic scripted sequences. The only difference with an Ageia card is now the work of creating that scripted sequence is offloaded from the game development team onto your wallet.
 
What I find odd is that games like Crysis have fully destructible foliage, buildings and vehicles (with things like plants reacting to you as you move past, not just clipping through) and it doesn't even support the physics card.
 

Aazealh

Administrator
Staff member
HawaiianStallion said:
What I find odd is that games like Crysis have fully destructible foliage, buildings and vehicles (with things like plants reacting to you as you move past, not just clipping through) and it doesn't even support the physics card.

Well, there are people in this world who are able to properly code a graphics/physics engine on their own, without needing specific libraries tied to specific hardware. Since their engine wasn't specifically designed to use the Ageia PhysX technology, it's not compatible with it. That's why I was saying that it'll take quite a while before "most" games use cards like these, assuming it's successful.
 
Aazealh said:
This game doesn't look all that revolutionary to me, and as HawaiianStallion said you don't actually need a PPU to get it to run. Actually having a PPU doesn't change all that much from what I've heard, despite this game being a demo for the PhysX card.

I don't think most games will require that kind of card for a while, if they ever do. I know I'm not buying one.

I find it hard to believe a PPU would not change much. Games using the physics card will run far from their potential without it. It's not really debatable; with specialised hardware for the software, it's capable of doing more. I guess it comes down to whether you can justify it or not at this point in time. Like Tiddlywinks said, it's like the Voodoo running graphics with separate hardware from the CPU. It's going to become mainstream one way or another eventually.

The PS3 is investing in PPU for their system now. And even if... IF most mainstream games don't require such a card in, oh say 3 to 5 odd years, it'd likely be heavily recommended for maximum results. Crysis probably would've done well with using more than a software PPU, but perhaps started development before the concept was more recognised.

The only thing that would stop me buying this PPU is how much compatibility it will have with future games; ATi claim that you can use graphics cards to perform physics calculations just as well as, if not better than, Ageia. I have heard Nvidia also want to start putting PPU capabilities onto their cards in the future. So it's acknowledged that this has much more potential than physics software by itself.

Even if PPUs were put on graphics cards, they'd still be using specialised hardware to support physics. It could also mean many games end up supporting the future graphics cards with a PPU attached because the market responded better to something more familiar and convenient, therefore having many games support non-Ageia cards for physics. It's likely games wouldn't run under both PPUs.


Denial said:
I don't see anything in Cellfactor that couldn't be done without another card. The only difference is that there'd only be 10 boxes flying around instead of 1000. Who cares?

It's obvious from Cellfactor that they haven't figured out just why you'll need this card -- The main appeal, presumably, is you can destroy environments realistically, but if you can destroy environments arbitrarily then it becomes extremely difficult to design levels without people being able to bypass obstacles. So then it becomes, "Well, you can destroy certain designated parts of levels... Plus lots of boxes and other non-gameplay critical junk." Aside from lots of extra junk, this is no different from any other game that has cinematic scripted sequences. The only difference with an Ageia card is now the work of creating that scripted sequence is offloaded from the game development team onto your wallet.


Because it wouldn't run as well without one? Why not rave about not bothering to upgrade at all, then, since new PC releases keep raising system requirements for reasons that have nothing to do with physics? Is it necessary for games to update their graphics at all in that case? Well, they keep doing it, and the principle's exactly the same whether it's physics, graphics, AI, map size or whatever.

Games like this push technology's potential because it often creates a better experience. The market wants games that have more interaction, options and realism, physics being no exception.
 
Aazealh, please. The decision to use Ageia's solution is because of developer ineptitude? I'm sure it has nothing to do with wanting to utilise the hardware, but I can't for the life of me figure out why these slack developers didn't just go with Havok. I mean, that way you don't alienate any potential customers when there's clearly no benefit to be had from hardware acceleration.


HawaiianStallion said:
What I find odd is that games like Crysis have fully destructible foliage, buildings and vehicles (with things like plants reacting to you as you move past, not just clipping through) and it doesn't even support the physics card.

Crysis does indeed look fun, and is one of the reasons for the new rig I'll be upgrading to in the next few months. That said, people are thinking the physics are a tad more special than they are. They look nice, but they're really not doing anything Havok couldn't if you threw enough CPU at it. The trees are split up into largeish segments so they don't break exactly where the bullets penetrate (and only shake the leaves if that's where they hit), and you can think of the buildings as merely delicately balanced boxes. We're going to be seeing similar jungle scenes where the trunk splinters exactly where you shoot, not necessarily taking down the tree, and large leaves tearing realistically as you empty a clip through them. Besides, the first time I recall seeing foliage that could be distorted by the character's body was in the original Hitman (the best one of the series in my opinion), which is a good five or six years ago now, ages in technology terms.

Physics calculations benefit greatly from processors that are inherently parallel, so with the way CPUs are at the moment they're just not going to be able to perform anywhere near as well even before you chuck other stuff like AI, sound, map and animation data at them. Yes, no matter what engine you use, even if you program it yourself. :troll:
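The parallelism point is easy to see in code: in a particle step, each particle reads and writes only its own state, so the updates are all independent and can run side by side on wide hardware like a PPU or GPU, whereas a CPU grinds through them one at a time. A toy Python sketch (my own names, not any real engine's API):

```python
# Each particle's update touches only its own state -- no particle
# depends on another within the step -- so the work is "embarrassingly
# parallel" and maps naturally onto wide hardware like a PPU or GPU.

DT, GRAVITY = 1.0 / 60.0, -9.81

def step_particle(state):
    """Advance one (pos_y, vel_y) pair by one timestep."""
    pos, vel = state
    vel += GRAVITY * DT
    return (pos + vel * DT, vel)

particles = [(10.0, 0.0)] * 10_000

# On a CPU this is one long serial loop; dedicated hardware runs the
# same independent updates side by side instead.
particles = list(map(step_particle, particles))
print(particles[0])
```

The moment you also want AI, sound and animation on that same CPU, the serial loop is exactly where the frame time goes.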


Denial said:
I don't see anything in Cellfactor that couldn't be done without another card. The only difference is that there'd only be 10 boxes flying around instead of 1000. Who cares?

It's obvious from Cellfactor that they haven't figured out just why you'll need this card -- The main appeal, presumably, is you can destroy environments realistically, but if you can destroy environments arbitrarily then it becomes extremely difficult to design levels without people being able to bypass obstacles. So then it becomes, "Well, you can destroy certain designated parts of levels... Plus lots of boxes and other non-gameplay critical junk." Aside from lots of extra junk, this is no different from any other game that has cinematic scripted sequences. The only difference with an Ageia card is now the work of creating that scripted sequence is offloaded from the game development team onto your wallet.

Then maybe this is a good thing, and we'll finally move away from standard linear yawn fests in which we must find the key/push the button/decapitate baby/etc. to pass said obstacles. Anyway, I don't think that's the main appeal - there are gonna be sweet things people will do over the coming years that we wouldn't ever have thought of.

I'm a racing sim buff (no, Gran Turismo is not a fucking sim), and I can't wait to see what happens with this genre. I not only expect my car to crumple in a lifelike manner when I inevitably crash, but for the damage to directly affect the driving model. I want the same calculations that determined how crumpled my car is to accurately decide how bent my axle or how rooted my suspension is, and then I want to feel it. Even further, if a piece of an opponent's car manages to hit my arm with enough force, I want it to break and inhibit my steering ability. Or if my torso is unfortunate enough to make acquaintance with the engine, it would be nice to know whether I died or not, factoring in how resilient my ribcage and internal organs are, of course.
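For what it's worth, the damage-feeds-back-into-handling idea doesn't need anything exotic: the crash solver already computes deformation numbers, and they just have to be fed into the driving model instead of being thrown away. A hypothetical Python sketch; every name, unit and threshold here is invented:

```python
# Hypothetical sketch: feed crash deformation back into the handling
# model instead of treating damage as purely cosmetic.

def apply_crash(impulse, side):
    """Map a collision impulse (kN*s, made-up units) on one side of the
    car to persistent handling penalties. Thresholds are invented."""
    axle_bend = min(impulse / 50.0, 1.0)        # 0 = straight, 1 = wrecked
    steering_bias = axle_bend * (0.2 if side == "left" else -0.2)
    top_speed_factor = 1.0 - 0.5 * axle_bend    # a bent axle drags you down
    return {"steering_bias": steering_bias,
            "top_speed_factor": top_speed_factor}

damage = apply_crash(impulse=25.0, side="left")
print(damage)  # a left-side hit: half-bent axle, car now pulls left
```

The interesting part is that none of this costs extra at runtime - the numbers fall out of the same simulation that crumpled the bodywork.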
 

Aazealh

Administrator
Staff member
Sparnage said:
I find it hard to believe a PPU would not change much. Games using the physics card will run far from their potential without it. It's not really debatable; with specialised hardware for the software, it's capable of doing more. I guess it comes down to whether you can justify it or not at this point in time.

Hahah, just the reaction I expected. :troll: No offense Sparny, but you're as fervent as ever in your efforts to defend stupid causes, and you miss the point too. You may find it hard to believe, but in the first demo issued it didn't change much, if anything at all. You can go ahead, download it and test it yourself. There are articles about it all over the web. Now since that game is specifically created to make use of that PhysX card, you can bet they'll try all they can as time goes by to make it unplayable without it, even aside from the fact that using its capacities should help performance (otherwise it'd be completely useless). Now, did you know that their PhysX-based engine is too intense for current graphics cards? And that the developers of Cell Factor had to lower the graphics quality of the game in order to fully implement the physics engine in there? Sounds like a good case of "you can't justify it at this point in time" to me.

Which is what I said in my post, did you read it?

"I don't think most games will require that kind of card for a while, if they ever do."

I stand by what I said, and I don't think your reply to it is very pertinent. And yeah, if they ever do, because ATI/AMD and NVIDIA aren't going to sit there idly while it happens.

Sparnage said:
Like Tiddlywinks said, it's like the Voodoo running graphics with separate hardware from the CPU. It's going to become mainstream one way or another eventually.

It's like that in a way. There are differences, even though you may not see them. I don't feel like going into details over this right now, but the context isn't quite the same.

Sparnage said:
And even if... IF most mainstream games don't require such a card in, oh say 3 to 5 odd years

Hahaha, great prediction there, buddy. How about 10 years? When you talk about "a while" in this domain, it's a year, and even that is long; 3 to 5 years is enough time for anything to happen. It just moves so fast.

Sparnage said:
Crysis probably would've done well with using more than a software PPU, but perhaps started development before the concept was more recognised.

They could have done well using hardware physics acceleration? No way!? :isidro: No shit they could have, who couldn't? Hardware acceleration is always a plus for everything by default, it's pretty sad that you felt this was an argument. Nobody is contesting it. The point was that they didn't need to. Neither did the guys at Valve when they made HL2. Just nitpicking, but a processing unit can't be software by the way.

Sparnage said:
The PS3 is investing in PPU for their system now.

No, they bought the rights to include the PhysX development kit, which is basically a big middleware library, into their own PS3 SDK. No "PPU" here. Since the PS3's supposed to come out in 3 months it'd be pretty sad if they were still "investing" in new hardware for it.

Sparnage said:
The only thing that would stop me buying this PPU is how much compatibility it will have with future games; ATi claim that you can use graphics cards to perform physics calculations just as well as, if not better than, Ageia. I have heard Nvidia also want to start putting PPU capabilities onto their cards in the future. So it's acknowledged that this has much more potential than physics software by itself.

Shouldn't it rather be that the only thing enticing you to buy this card is its compatibility with games and the gain in performance (as well as a hypothetical and unlikely lack of competition)? That's what I was getting at from the beginning, and it's the reason I'm not buying one of these PhysX cards. You seem to agree though, so I guess we're cool.

Sparnage said:
It could also mean many games end up supporting the future graphics cards with a PPU attached because the market responded better to something more familiar and convenient [...] It's likely games wouldn't run under both PPUs.

ATI was already developing hardware physics acceleration before Ageia came out of the woodwork. And it's Half Life 2 that started the trend and made physics engines mainstream with Havok. Also, Ageia requires the use of specific, proprietary technology and libraries, so I doubt the "games would run on both" part in the current context.

Have fun blowing your 10,000 crates anyway; and sorry if I ruined your mood, it wasn't my intent. :guts:

Tiddlywinks said:
Aazealh, please. The decision to use Ageia's solution is because of developer ineptitude?

Well, developer ineptitude and budget constraints, yes. They choose not to develop their own engine and use one pre-made for them (well, for the major part at least), and in the case of Cell Factor they get paid for it. It also gives a lot of publicity to their game, which otherwise would just be FPS_0021544.

It's really not all that different from people buying graphics engines... Remember Unreal? These guys had the balls and the skills to create their own engine and give id Software the finger. Same with Valve when they made Half-Life, and with the CryEngine, etc.

Tiddlywinks said:
I'm sure it has nothing to do with wanting to utilise the hardware, but I can't for the life of me figure out why these slack developers didn't just go with Havok. I mean, that way you don't alienate any potential customers when there's clearly no benefit to be had from hardware acceleration.

Actually, it's a valid point. There's a reason Ageia is paying for Cell Factor's development, and it's because otherwise no game would make use of their product in such a way that playing without it would kill the gaming experience. What they're trying to do basically is to take over the market before the big companies (ATI/AMD & NVIDIA) create their own solutions. I'm not sure they'll succeed honestly, but I can't blame them for trying. They've also managed to sell their technology to Sony so in that respect they've already done well. I don't really see them sticking around as king of PPUs, they'll probably be bought or other technologies will eclipse theirs. If ATI/AMD and NVIDIA develop their own thing fast enough and work with Microsoft to optimize DirectX then it'll be the end of Ageia's little enterprise.

Concerning Havok though I can understand the reticence to use it, simply because you have to pay for it (and I'm sure Valve is selling it at a ridiculous price). That doesn't change much as long as you have to pay for something else, but excluding the case of Cell Factor, the advantage of working with Ageia's solution is that it's currently less costly (and easier, more complete or less restrictive? I don't know). That certainly doesn't mean it's the end of software physics engines though, but they'll be cheaper to make with hardware physics acceleration in the future, and cheaper to buy off others too.
 
Crysis does indeed look fun, and is one of the reasons for the new rig I'll be upgrading to in the next few months. That said, people are thinking the physics are a tad more special than they are. They look nice, but they're really not doing anything Havok couldn't if you threw enough CPU at it. The trees are split up into largeish segments so they don't break exactly where the bullets penetrate (and only shake the leaves if that's where they hit), and you can think of the buildings as merely delicately balanced boxes. We're going to be seeing similar jungle scenes where the trunk splinters exactly where you shoot, not necessarily taking down the tree, and large leaves tearing realistically as you empty a clip through them. Besides, the first time I recall seeing foliage that could be distorted by the character's body was in the original Hitman (the best one of the series in my opinion), which is a good five or six years ago now, ages in technology terms.

- I don't think you quite get the extent of Crysis's deformable environments. It's realtime destruction of the trees, plants, vines, etc. Not just shooting a tree and it breaking at the same point no matter what. Supposedly that's only DX9 stuff, with DX10 being a gigantic step up with everything reacting to everything else in real time.

As for the foliage reacting to your body, it's an entire forest; it bends around your form and snaps back into place like a real plant does, it doesn't just clip through you like Far Cry did.
 

Walter

Administrator
Staff member
HawaiianStallion said:
- I don't think you quite get the extent of Crysis's deformable environments. It's realtime destruction of the trees, plants, vines, etc. Not just shooting a tree and it breaking at the same point no matter what. Supposedly that's only DX9 stuff, with DX10 being a gigantic step up with everything reacting to everything else in real time.

As for the foliage reacting to your body, it's an entire forest; it bends around your form and snaps back into place like a real plant does, it doesn't just clip through you like Far Cry did.
Sounds like a lot of work for no foreseeable impact to the gamer. Seriously, you're going to have to sell me on this whole interactivity thing. I'm really skeptical.

I don't give a shit about "realistic foliage" (like a real plant!). And most of the shots I've seen of Crysis look so good, I may as well just go out to my back porch and kick a hole in it, rather than pay $49.95 + ~$1000 (upgrades) to do it VIRTUALLY.

Of course, many will inevitably read this as a bitter old man post, but honestly I've always been more of a design/gameplay over graphics kinda guy, even in the NES days.
 

Griffith

With the streak of a tear, Like morning dew
This from the man that loved it in MGS3 when Snake's eyes adjust to the darkness in the cave area (BTW, I noticed recently that Kojima pulled that trick in MGS2 as well =). Anyway, you are being a bitter old man, because you're exactly who's going to love it one day. I know because you love this kind of stuff in existing games (when utilized for enhanced gameplay and design, as you said), and more importantly, it's just like all the things I'm ironically bitter about. =)

Anyway, to take this argument to its obvious conclusion, these features will open up a multiplicity of new game design possibilities. Just wait until it's the core of the MGS6 engine or Super Mario 1024. :carcus:

But yeah, right now it looks like a tech demo.
 
Walter, ultimately it will come down to immersion and fun, but Griffith hit the nail squarely on the head already. By all means though, go kick a hole in your back porch.

I'd like it to be known that I'm not the Ageia fanboy I may seem to be, but I am a physics acceleration fanboy. Proprietary gaming technologies don't end up being a good thing for the consumers, just look at 3DFX, and don't even get me started on Creative. Somebody has to get the ball rolling though, so I applaud them. Microsoft has finally woken up and is implementing DirectPhysics into an upcoming DirectX, so that's good too. It'll be interesting to see how it plays out, I'll hazard a guess and say nVidia will buy them.


HawaiianStallion said:
- I don't think you quite get the extent of Crysis's deformable environments. It's realtime destruction of the trees, plants, vines, etc. Not just shooting a tree and it breaking at the same point no matter what. Supposedly that's only DX9 stuff, with DX10 being a gigantic step up with everything reacting to everything else in real time.

As for the foliage reacting to your body, it's an entire forest; it bends around your form and snaps back into place like a real plant does, it doesn't just clip through you like Far Cry did.

I really do get it, and I think you need to read my posts a little more carefully. I'm aware the trees don't break at the same point no matter what. Think of them as planks of wood that break in two in any generic title, but with multiple breaking points up the trunk (and the leaves don't even break at all). DX10 has nothing to do with it. The main difference it will bring is the geometry shader, and more efficiency due to having the guts of all the previous DX's ripped out. Incidentally, after all the DX10 hype Crytek spouted, their focus has largely been DX9, and they plan to add more DX10 features after the game ships via patches, not unlike what they did with Far Cry and DX 9.0C. It's likely every screenshot and movie of Crysis you've seen thus far has been running on DX9 hardware.

I'm also aware the foliage didn't react to your body in Far Cry. The point I made is that it did in the original Hitman, just on a smaller scale.


Aazealh, we seem to be in semi-agreement, but I still don't really understand all the Ageia bashing. You've acknowledged that their solution is fine on regular hardware, as already seen on some PC and 360 titles, and what we will probably see on PS3 (not that you could call that hardware "regular" by any stretch of the imagination, but the point being hardware not designed specifically for PhysX). I'm sorry, but your reason for developer adoption simply isn't true for the most part. I think you're right about it being less costly than Havok, but for big players like Epic Games (you know, one of your examples of those development houses with the skills to code things for themselves), this really isn't a factor. That's right, Unreal Engine 3.0 uses the PhysX SDK. These guys have proven they know what they're on about, and I highly doubt they chose it just to save a few bucks.


Aazealh said:
Now, did you know that their PhysX-based engine is too intense for current graphics cards? And that the developers of Cell Factor had to lower the graphics quality of the game in order to fully implement the physics engine in there? Sounds like a good case of "you can't justify it at this point in time" to me.

Actually, to me it sounds like a good case of "our game/tech is terribly unoptimised and not running very well at the moment, quick, let's shift the blame". I mean, it was pretty, but the rigs running it had some monstrous cards running in SLI; I don't think they're gonna be bottlenecking it. Graphically it's inferior to Crysis, and that seems to run okay on similar hardware.


Aazealh said:
It's like that in a way. There are differences, even though you may not see them. I don't feel like going into details over this right now, but the context isn't quite the same.

It seems like a fitting analogy to me. A few years ago the thought of per-pixel lighting would've seemed exorbitant. In a few more years, every object in a game being destructible to different degrees, ranging from extremely flexible to brittle, having properties such as mass, buoyancy and maybe even thermal properties, fluids having differing viscosity, etc. will be the norm. We're gonna need some kind of processor for this, maybe not dedicated, but certainly not the CPU.


Aazealh said:
ATI was already developing hardware physics acceleration before Ageia came out of the woodwork. And it's Half Life 2 that started the trend and made physics engines mainstream with Havok.

It's really not all that different from people buying graphics engines... Remember Unreal? These guys had the balls and the skills to create their own engine and give id Software the finger. Same with Valve when they made Half-Life, and with the CryEngine, etc.

Concerning Havok though I can understand the reticence to use it, simply because you have to pay for it (and I'm sure Valve is selling it at a ridiculous price).

Now for some nitpicking. :troll:
Half-Life's engine was actually a heavily modified Quake 1. Havok Physics isn't by Valve (it's by Havok) and was around long before HL2, and even then people played around with inverse kinematics and physics before it. I'm not so sure ATi was doing hardware-based physics before Ageia popped up; I know they've been toying with GPGPU for a while, but yeah.
 

Aazealh

Administrator
Staff member
Tiddlywinks said:
Aazealh, we seem to be in semi-agreement, but I still don't really understand all the Ageia bashing. You've acknowledged that their solution is fine on regular hardware, as already seen on some PC and 360 titles, and what we will probably see on PS3 (not that you could call that hardware "regular" by any stretch of the imagination, but the point being hardware not designed specifically for PhysX).

Am I bashing them? I said Cell Factor isn't revolutionary, and it isn't, and that Ageia's card won't be a must-have for most games for a while (assuming it ever takes over the physics acceleration market), which is what Sparnage was saying. So far I have yet to see anybody come up with a serious argument against this. Oh yeah, and I said I believe ATI/AMD and NVIDIA will most likely come forth with their own integrated solutions in the future, buying them out if necessary. As far as I know you agree with this, don't you?

Tiddlywinks said:
I'm sorry but your reason for developer adoption simply isn't true for the most part. I think you're right about it being less costly than Havok, but for big players like Epic Games (you know, one of your examples of those development houses with the skills to code things for themselves), this really isn't a factor. That's right, Unreal Engine 3.0 uses the PhysX SDK. These guys have proven they know what they're on about, and I highly doubt they chose it just to save a few bucks.

I don't think that proves what I said wrong at all. I was a bit facetious about developer ineptitude originally (kinda trolling the thread, you know), but they're still motivated by savings in development time and cost here, and they don't necessarily have enough people/resources to start coding their own separate physics engine (more specifically, to create libraries for it). I'm not privy to all these companies' internal situations, but it's basically a question of constraints, and I'm pretty sure that if they had the choice they'd be more than happy to develop their own solution (just like they do for graphics engines). Because really, what's the reason otherwise? You're saying they're skilled enough to do it, but they're still buying an SDK from another company. If that's not to save development time and cost, then what?

Tiddlywinks said:
Actually, to me it sounds like a good case of "our game/tech is terribly unoptimised and not running very well at the moment, quick, let's shift the blame". I mean it was pretty, but the rigs running it had some monstrous cards running in SLI, don't think they're gonna be bottlenecking it. Graphically it's inferior to Crysis, and that seems to run okay on similar hardware.

Well, they were saying it didn't run well because everything had complex physics applied to it (so the graphics engine couldn't keep up, or something like that), which isn't the case for Crysis. I'm not really all that interested in the matter, as you might have guessed, so I can't say I've looked into it in detail, but if the developers say so, and they work together with the company producing PhysX, I'm going to give them some credibility on the subject. And so should you, I guess?

Tiddlywinks said:
It seems like a fitting analogy to me.[...] maybe not dedicated

Like I said, the context isn't quite the same. And you're killing your analogy yourself here: "maybe not dedicated." We already have cards dedicated to graphics. Now people are even using 2 of them simultaneously. And another, $300 card would be needed just for physics? All of this for games only? I don't think the 2 main manufacturers will let this happen, and I don't think people are going to make hardware physics acceleration a requirement to play most games before it's widely spread.

If you read my posts carefully, you'll find that I don't care so much about Ageia but about the fact I'm being told I'll definitely have to buy this $300 physics card to play any game in the future. I don't think that'd be great, and I wish/think it won't happen. I'd rather see specific chipsets integrated to future graphics cards, personally.

Tiddlywinks said:
Half-Life's engine was actually a heavily modified Quake 1 engine. Havok Physics isn't by Valve (it's by Havok) and was around long before HL2, and even before that people were playing around with inverse kinematics and physics. I'm not so sure ATi was doing hardware-based physics before Ageia popped up; I know they've been toying with GPGPU for a while, but yeah.

Yeah, that's true. Not sure how I managed to forget about HL's engine (duh...), doesn't really matter though. And I was somehow convinced that Valve had bought the Havok company, but I guess not. I don't think you'll disagree that Havok is what made physics engines popular/mainstream and brought them a massive amount of public attention though, in large part thanks to HL2 as far as end consumers are concerned. Also, I distinctly remember reading that ATI said they'd been working on physics for a while themselves after Ageia revealed its plans (besides, they experiment with so much stuff in general...). I don't think it's very important in the end, since they're working on it now.
 
Aazealh said:
Hahah, just the reaction I expected. :troll: No offense Sparny, but you're as fervent as ever in your efforts to defend stupid causes, and you miss the point too. You may find it hard to believe, but in the first demo issued it didn't change much, if anything at all. You can go ahead, download it and test it yourself. There are articles about it all over the web. Now, since that game is specifically created to make use of the PhysX card, you can bet they'll do all they can as time goes by to make it unplayable without one, even aside from the fact that using its capabilities should help performance (otherwise it'd be completely useless). Now, did you know that their PhysX-based engine is too intense for current graphics cards? And that the developers of Cell Factor had to lower the game's graphics quality in order to fully implement the physics engine? Sounds like a good case of "you can't justify it at this point in time" to me.

Which is what I said in my post, did you read it?

"I don't think most games will require that kind of card before a while, if they ever do."

I stand by what I said, and I don't think your reply to it is very pertinent. And yeah, if they ever do, because ATI/AMD and NVIDIA aren't going to sit there idly while it happens.

So they're more intense than they need to be; at least you know that when you buy one it won't go out of date as quickly as some hardware, so the only issue as far as I'm concerned is compatibility. I don't care if they eventually make it so you can't play many games without them; sooner or later that happens to everything, and by then the results will be worth it.

It's like that in a way. There are differences, even though you may not see them. I don't feel like going into details over this right now, but the context isn't quite the same.

Yeah, goes without saying.

They could have done well using hardware physics acceleration? No way!? :isidro: No shit they could have, who couldn't? Hardware acceleration is always a plus for everything by default; it's pretty sad that you felt this was an argument. Nobody is contesting it. The point was that they didn't need to. Neither did the guys at Valve when they made HL2. Just nitpicking, but a processing unit can't be software, by the way.

So you're aware of this, but you're still against it. Even if purchasing one can't be justified right now, especially given the limited game compatibility, that doesn't mean it's a bad piece of hardware. If they're out there, it may speed up game development.

There is so much potential. If games continue to use the card, then it would be very interesting to see what comes out of it. Cell Factor is just one step in the right direction. If not, then there's always Nvidia and ATI.

No, they bought the rights to include the PhysX development kit, which is basically a big middleware library, into their own PS3 SDK. No "PPU" here. Since the PS3's supposed to come out in 3 months it'd be pretty sad if they were still "investing" in new hardware for it.

OK, I'll look more into it.

Shouldn't it rather be that the only thing enticing you to buy this card is its compatibility with games and the gain in performance (as well as a hypothetical and unlikely lack of competition)? That's what I was getting at from the beginning, and it's the reason I'm not buying one of these PhysX cards. You seem to agree though, so I guess we're cool.

ATI was already developing hardware physics acceleration before Ageia came out of the woodwork. And it's Half Life 2 that started the trend and made physics engines mainstream with Havok. Also, Ageia requires the use of specific, proprietary technology and libraries, so I doubt the "games would run on both" part in the current context.

Yes, compatibility with games is the main factor that makes me skeptical. All I need is several titles I really want to play using Ageia's card and it's as good as bought. Prices going down as these games come closer is a definite bonus. If games flow towards ATI's and Nvidia's solutions instead, then I'll probably choose their card.

Have fun blowing your 10,000 crates anyway; and sorry if I ruined your mood, it wasn't my intent. :guts:

Hahah, are you kidding? All this talk about it has made me more excited about it and upcoming titles than ever. It's not that I'm an Ageia fanboy either; I like new developments for better games and technology. If a dedicated processor ever comes out for something like AI, I'll be in favor of that too. Virtual reality is just not coming quickly enough, but I'm patient.

And while I'm having fun with Cell Factor, Stalker and Crysis, you enjoy Castlevania, Mario and the like.
 

Aazealh

Administrator
Staff member
Sparnage said:
So they're more intense than they need to be; at least you know that when you buy one it won't go out of date as quickly as some hardware, so the only issue as far as I'm concerned is compatibility.

No, you don't really know anything when you buy one. This card requires the PhysX SDK to be efficient, and it's a proprietary technology. If Microsoft implements its own physics libraries in DirectX, and AMD & NVIDIA develop their own hardware solutions based on it, all the developers are going to go for it, and your proprietary card using a different technology will be obsolete if Ageia doesn't make it retroactively compatible (which I doubt they could). So right now, few good games are planning to use it (UT2K7, Vanguard, Warhammer Online... Any other worthwhile title?), and virtually none will "require" it. It's a case of "not being justified at this time" like I said earlier. I don't think you disagree with this.

Sparnage said:
So you're aware of this, but you're still against it. Even if purchasing one can't be justified right now, especially for limited game compatibility it doesn't mean it's a bad piece. If they are out it may speed up games development.

Aware of what, and against what? See the previous paragraph concerning the good deal/bad deal. Keep in mind that people have tried to launch specifically dedicated cards before, be it sound cards or others. Lots have failed. Investing in a technology too early, when it hasn't been standardized yet and isn't widely accepted, often proves to be a mistake for the end consumer.

Sparnage said:
There is so much potential.

Physics engines in general are what has a lot of potential. Again, it's not a question of whether having a physics engine is good or bad, it's about this particular piece of proprietary, dedicated hardware. You're saying you're fine with paying $300 for a card whose evolution seems pretty limited all things considered, and that not a lot of upcoming games are actively supporting at the moment, while you know there will be (cheaper) alternatives in the near future? I don't know what to say man, that sounds more like donating to a research center than anything else to me.

Sparnage said:
Yes, compatibility with games is the main factor that makes me skeptical. All I need is several titles I really want to play using Ageia's card and it's as good as bought. Prices going down as these games come closer is a definite bonus. If games flow towards ATI's and Nvidia's solutions instead, then I'll probably choose their card.

Then I guess we're in agreement, only you're a lot more optimistic about Ageia's hardware venture than I am (and less picky about what's worth your money, I guess). If they sold their card for $50 I'd applaud them, and if they were negotiating with Microsoft and/or other hardware manufacturers to include their chipsets as part of a bigger offer, I'd think they could stay around. Positioning themselves as pioneers in a new (and costly) market running parallel to the graphics card one doesn't convince me.

Sparnage said:
And while I'm having fun with Cell Factor, Stalker and Crysis, you enjoy Castlevania, Mario and the like.

I don't think you know the extent of the games I play. I'll be enjoying Crysis and Stalker myself, but without a PhysX card since these games don't support it (Crysis runs on the CryEngine2 and Stalker on Havok). Like I said, enjoy Cell Factor's revolutionary crates as much as you want, I just hope it'll be worth the price. :troll:
 

Walter

Administrator
Staff member
Throughout all these intense arguments, let's please not forget that:

VIDEO GAMES ARE SERIOUS BUSINESS
 

Griffith

With the streak of a tear, Like morning dew
amazing_phy_4.jpg
amazing_phy_2.jpg
Coming soon to N64, GoldenEye 2. And Half-Life 2, in case you didn't see it.

Wally and Sparnage are right, games are serious, and this... this... this changes EVERYTHING (well, some things... in video games)! It's so great, it's not like it will be refined, improved, and made standard. I need to play it in its roughest, most un-utilized form in a generic FPS for the highest price possible. I'm married to these run-of-the-mill titles already; it's my new religion. While you losers enjoy those timeless games with the all-day gameplay, staying power, refined execution, and thought put into them, I'll be feeling advanced with another in the slew of state-of-the-art brand-X FPS tech demos released every few years, which even I won't remember a few months later. Fuck Mario, I'll be playing Rise of the Triad: Ageia, blowing boxes up and blasting pipe like a real man, and it'll be the best game since DooM 3, so there! :badbone:


P.S. Can I buy this for PS3 at launch? :???:
 