vid card - 5900?
43 posts
• Page 3 of 3 • 1, 2, 3
- JimmyTango
-
- Posts: 1774
- Joined: Tue Nov 05, 2002 5:17 pm
- Location: Land of the Shemales.
Edogg,
Stop being an 'nvidiot.'
It really is not smart to stick with one brand and become a 'fanboy' for them. It is better to go by whoever is making the best product.
All being a fanboy does is limit what you can get.
The way you are sticking up for Nvidia is not smart. Nvidia has screwed up big time and lied to its customers, yet you still defend them and even argue that their faulty hardware is better.
DX9 is supposed to be the standard set of instructions for building games with a given set of features. When a game developer like id Software or GSC has to write its own 'instructions' to get those features, or to hide the lack of support for them, then the hardware is faulty and was not made correctly.
This forces developers to take MORE TIME to develop a game, just so it works on a generation of faulty cards.
Well, let's just hope that ATI doesn't go back to the days of crap drivers.
-
"Now, if things look bad, and it looks like your not going to make it, then you've got to get mean, I mean plum mad dog mean, 'cause if you lose your head and give up then you neither live nor win, and that's just the way it is."
- The Outlaw Josey Wales -
put me on the team that Harry aint on....I sure miss shooting him and if im on the same team as HaVoC...OMFG we will stomp a mudhole in you and walk it dry.
- YaDad -

"Now, if things look bad, and it looks like your not going to make it, then you've got to get mean, I mean plum mad dog mean, 'cause if you lose your head and give up then you neither live nor win, and that's just the way it is."
- The Outlaw Josey Wales -
put me on the team that Harry aint on....I sure miss shooting him and if im on the same team as HaVoC...OMFG we will stomp a mudhole in you and walk it dry.
- YaDad -

- Edogg
Originally posted by JimmyTango
Edogg,
Stop being an 'nvidiot.'
It really is not smart to stick with one brand and become a 'fanboy' for them. It is better to go by whoever is making the best product.
All being a fanboy does is limit what you can get.
The way you are sticking up for Nvidia is not smart. Nvidia has screwed up big time and lied to its customers, yet you still defend them and even argue that their faulty hardware is better.
DX9 is supposed to be the standard set of instructions for building games with a given set of features. When a game developer like id Software or GSC has to write its own 'instructions' to get those features, or to hide the lack of support for them, then the hardware is faulty and was not made correctly.
This forces developers to take MORE TIME to develop a game, just so it works on a generation of faulty cards.
Call me what you want, I don't really care. My point is that I haven't had any performance issues with my 5900. For the games out right now, it absolutely flies right through them.
That's why it pushes my buttons when you guys say it sucks.
And don't patronize me; I know what Direct3D and OpenGL consist of. I'm a programmer, for God's sake.
- shockwave203
-
- Posts: 1440
- Joined: Mon Jan 20, 2003 2:40 pm
- Location: SK Canada
Originally posted by Edogg
Actually, I'm referring to 3D2k1. I'm getting over 19k.
I haven't really benched in 3D2k3 at all. I don't like 3D2k3 because it's not as fun to tweak for.
I'll beat your 2k3 score sometime this weekend and post the link.
Of course the 5900 is going to beat a Radeon 9700; the 9700 is a lot older. It's old technology compared to the 5900.
If you want something to beat, go up against something in the same class: the 9800.
It's funny, though, because even though the 5900 is in Nvidia's top card group, it is still only comparable to a 9700.
I await your benchies.

- JimmyTango
-
- Posts: 1774
- Joined: Tue Nov 05, 2002 5:17 pm
- Location: Land of the Shemales.
Edogg,
Take 10 seconds and breathe slowly. No one is 'patronizing' you. It is OK.
You are getting very defensive over a stupid video card company. It is not that big of a deal.
If you are truly a programmer, then I do not see how you could not question Nvidia and how they have lied about their DX9 support. They have lied to you and gotten you to purchase something that does not live up to what they claim.
I do not see how this information is new to you. You call people here naive for not believing that Valve has optimized HL2 to run better on ATI hardware. That could not be further from the truth. Valve has stated that HL2 is strict DX9 code, and the ATI cards run better because they run DX9 properly while the Nvidia cards do not. Because of this, Valve has had to do special coding, not for ATI cards, but for Nvidia cards. This special coding basically turns off the parts of DX9 that the Nvidia cards cannot handle properly. All of this should ring a bell that there is a serious flaw in the current Nvidia designs.
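To make this concrete, the kind of thing a developer ends up having to write looks roughly like the sketch below. This is only a hypothetical C++ illustration (every name in it is made up; it is not Valve's actual code): one clean DX9 path for hardware that runs the spec properly, plus an extra special-case path bolted on just for the NV3X cards.

#include <cstdio>
#include <string>

// Hypothetical illustration only: the shape of the problem, not real engine code.
enum class RenderPath { DX9_Full, DX8_Fallback, NV3X_Mixed };

struct GpuInfo {
    std::string vendor;   // e.g. "ATI", "NVIDIA"
    bool fullSpeedPS20;   // runs full-precision 2.0 pixel shaders at playable speed?
};

// The whole complaint in one function: the standard DX9 path should be enough,
// but one vendor's hardware forces an extra, hand-tuned special case.
RenderPath choosePath(const GpuInfo& gpu) {
    if (gpu.fullSpeedPS20)
        return RenderPath::DX9_Full;     // plain DX9, the way the spec intends
    if (gpu.vendor == "NVIDIA")
        return RenderPath::NV3X_Mixed;   // lower precision, hand-replaced shaders,
                                         // some DX9 features switched off
    return RenderPath::DX8_Fallback;     // everything else drops to a DX8-class path
}

int main() {
    GpuInfo radeon9700 { "ATI", true };
    GpuInfo geforceFX  { "NVIDIA", false };
    std::printf("9700 gets path %d\n", static_cast<int>(choosePath(radeon9700)));
    std::printf("FX gets path %d\n",   static_cast<int>(choosePath(geforceFX)));
    return 0;
}

Every extra code path like that is more development and testing time, which is exactly the cost I am talking about.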
DX9 benchmarks showed this to be true WAY before the initial HL2 benchmarks came out. I was surprised that so many people were surprised by Nvidia's performance in HL2, when those DX9 benches had already shown the problem was going to surface once DX9 games arrived.
It is common knowledge that Nvidia f'ed up big time with the NV3X line. ATI, however, struck gold. Nvidia's blunder this time is as bad as ATI's initial 8500 launch was: they hyped the 8500 up, only to have huge problems with drivers, 'cheating' in Quake benches, incompatibility with software, etc.
The next generation of cards? Who knows. Maybe Matrox comes out of nowhere with some 900-trillion-mega-poly-per-one-tenth-of-a-second GPU with dual inverted girders directly welded to a Chevy small-block V8 and takes the performance lead. We will not know until the next gen of cards is out.
I have owned Voodoo cards, Nvidia cards and ATI cards. Each time, I have bought what was the best at that time. I never looked at brand, just at who was best. I have been nothing but pleased with my 9700 Pro purchase, and I have had no more problems with ATI drivers than I have had with Nvidia drivers.
Each time I have come out on top, all because I did not stick to a brand or take what the manufacturer told me for granted. I looked at real-world benches (not that crappy 3DMark, of which 2k1 is VERY CPU dependent).
- (>Tool<)
Originally posted by JimmyTango
The next generation of cards? Who knows. Maybe Matrox comes out of nowhere with some 900-trillion-mega-poly-per-one-tenth-of-a-second GPU with dual inverted girders directly welded to a Chevy small-block V8 and takes the performance lead. We will not know until the next gen of cards is out.
I don't want any part in this argument, but that paragraph is hilarious.

- shockwave203
-
- Posts: 1440
- Joined: Mon Jan 20, 2003 2:40 pm
- Location: SK Canada
I am anxious to see what XGI can do later on. I saw some benchmarks of one of their dual-GPU cards, and they weren't impressive, but that's probably due to bad drivers.
- Edogg
Originally posted by FI2ick
Well Edogg, just read this.
http://www.anandtech.com/news/shownews.html?i=20532
Go back and read my posts. Where did I ever say that Nvidia had better overall DX9 performance? I didn't. I said their DX9 performance has gotten better. I also said that Nvidia cards would perform better in certain games because of Nvidia working with the game developers. I never said in all games.
Does that link disprove anything I have said? No.
Does it prove that Valve hasn't done any extra special coding to make ATI perform that much better? No.
Does it prove that Nvidia cards will not perform better than ATI cards in games like Doom III and Stalker? No.
Well, I have read your link. So what am I supposed to gain from it? I haven't learned anything new from it, and it hasn't disproved anything I have said.
- shockwave203
-
- Posts: 1440
- Joined: Mon Jan 20, 2003 2:40 pm
- Location: SK Canada
http://service.futuremark.com/compare?2k3=1654520
Not bad for a non-Pro card that's over a year old.

Actually, it does prove that Valve hasn't done any special coding for ATI.
"No doubt you heard about the GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative of future DX9 games (including Doom III), or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?"
"Unfortunately, it will probably be representative of most DX9 games."
So you're saying all DX9 games will be optimized for ATI just to screw nVidia? And how has their DX9 performance gotten better when it's a hardware issue?
- JimmyTango
-
- Posts: 1774
- Joined: Tue Nov 05, 2002 5:17 pm
- Location: Land of the Shemales.
Originally posted by Edogg
Does that link disprove anything I have said? No.
No, but this does. And where it does not 'disprove' you, it at least wakes you up to what the Nvidia 'optimizers' actually do and how they hurt the visual quality of the game, including one instance of fog being removed completely from a level while HL2 is still in development.
The design of the NV3X line simply does not fully support DX9. They have to bend DX9 to make it work, with shortcuts that include editing the actual game. It is not running DX9 better; it is running DX Nvidia.
http://techreport.com/etc/2003q3/valve/index.x?pg=1
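To spell out what I mean by 'DX Nvidia' (this is purely an invented sketch of the idea, not actual driver code, and every shader name in it is made up): instead of executing the shader the game submits, the driver recognizes it and quietly swaps in a cheaper substitute, which is how something like fog can vanish from a level without the developer changing a thing.

#include <cstdio>
#include <string>
#include <unordered_map>

// Invented pseudo-driver logic; every name here is hypothetical.
// The point: the game asks for one thing, the driver runs another.
static const std::unordered_map<std::string, std::string> kReplacements = {
    //  shader the game submits       cheaper substitute baked into the driver
    { "hl2_water_reflect_ps20",       "hl2_water_cheap_ps14" },
    { "hl2_fog_volumetric_ps20",      "hl2_fog_disabled"     },  // fog quietly removed
};

std::string resolveShader(const std::string& requested) {
    auto it = kReplacements.find(requested);
    return (it != kReplacements.end()) ? it->second : requested;
}

int main() {
    std::printf("%s -> %s\n", "hl2_fog_volumetric_ps20",
                resolveShader("hl2_fog_volumetric_ps20").c_str());
    return 0;
}

Benchmark numbers go up while the image on screen gets worse, and that is the whole problem with calling these 'optimizations.'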