NVIDIA or ATI?

Really? I’m running Silo 2 on a crappy integrated Intel X3100 graphics chipset in my laptop and I haven’t had any stability issues at all.

Apparently it might have been a general instability running under Vista. I’ll have to wait and see if the original poster tries the updated version to see if there’s an improvement.

I’ve been a lifelong nVidia supporter. I like to think it’s because their drivers are a bit more robust and less troublesome, but I think I may just be falling for their clean and uncluttered web design. For some reason I always fumble around on the ATI site when looking for drivers, while nVidia’s are simple and easy to find. It’s all in the presentation, I guess.

No one has mentioned Larrabee yet. I personally think Intel will dominate the market shortly after its release… it just seems like such a great “well, duh” idea: the prospect of a virtually unlimited number of processors. Don’t need 6 of your 12 cores to render a frame? Fine, use the other 6 for Havok. I for one cannot wait. Anyone else intrigued?
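Just to make that core-splitting idea concrete, here’s a toy sketch in plain C++ threads. This is entirely hypothetical: render_slice and physics_step are made-up stand-ins, and real Larrabee or Havok APIs would look nothing like this.

```cpp
#include <thread>
#include <vector>

// Hypothetical work functions -- stand-ins for a real renderer and a
// real physics step (e.g. Havok); not actual Larrabee or Havok APIs.
void render_slice(int slice)  { /* rasterize one slice of the screen */ }
void physics_step(int island) { /* simulate one group of rigid bodies */ }

int main() {
    const unsigned total        = std::thread::hardware_concurrency(); // e.g. 12
    const unsigned render_cores = total / 2;            // 6 for rendering
    const unsigned physics_cores = total - render_cores; // the other 6 for physics

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < render_cores; ++i)
        workers.emplace_back(render_slice, static_cast<int>(i));
    for (unsigned i = 0; i < physics_cores; ++i)
        workers.emplace_back(physics_step, static_cast<int>(i));

    for (auto& t : workers)
        t.join(); // one frame's worth of mixed rendering/physics work done
}
```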

We use ATI here; Nvidia is really scarce around these parts.
I would rather have Nvidia, though… it seems like they have much better developer tools and better drivers.

Like slappy is saying, part of it is just their website… a lot of the time when I need to download drivers it’s because something’s not rendering right, which pisses me off. Then having to navigate through their crappy site just makes things worse. Very uncool.

Is anyone here actually able to use the “Professional” cards like ATI’s FireGL or Nvidia’s Quadro series? We always get bogged down by the fact that our internal tools rely on fairly high-end hardware features and speed, and the FireGL and Quadro seem to lag several years behind the performance of their “retail” counterparts when it comes to “game engine”-like performance.

Which is a damn shame, since I could use a bit more speed in 3dsmax etc sometimes.

SamiV.

[QUOTE=samivRMD;1389]Is anyone here actually able to use the “Professional” cards like ATI’s FireGL or Nvidia’s Quadro series? We always get bogged down by the fact that our internal tools rely on fairly high-end hardware features and speed, and the FireGL and Quadro seem to lag several years behind the performance of their “retail” counterparts when it comes to “game engine”-like performance.

Which is a damn shame, since I could use a bit more speed in 3dsmax etc sometimes.

SamiV.[/QUOTE]

I am always blown away when I read the performance tests of workstation GL cards compared to the consumer-level cards. However, they really are specialized cards, and you need to use apps that take advantage of them. TBH, I am no expert on this, but I do know that having a workstation card in the right app can result in speeds many times faster.

But what we are doing in game development, especially as technical artists, is so varied and flexible that we’d probably end up having more issues with workstation cards (especially showing real-time graphics and the like). Any programmer that’d benefit from a workstation card would probably want to be using a beefy consumer card anyway, to avoid potential driver/compatibility issues.

For regular artists, I don’t know. I have limited experience with workstation cards (college), and have only read a few comparisons/reviews in 3D magazines, which have been very enlightening about their use. My uninformed rule of thumb would probably be: if what you are doing could be done on a Mac, you could probably benefit from a workstation card? I wonder if anyone here has more experience.

Tom’s just did a comparison of 7 workstation cards; you can check it out here: http://www.tomshardware.com/reviews/FireGL-Quadro-Workstation,1995.html

From the Tom’s Hardware article:

In terms of performance, the differences are clear: if you compare a gaming graphics card with its (almost) identical workstation brother, the drivers ensure that the workstation model runs the professional applications much faster. Naturally, the gaming card thus runs those high-end apps significantly slower. On the other hand, you could compare these graphics cards in gaming, though that doesn’t make much sense since nobody buys a pricey workstation card for entertainment.

I just want the best of both worlds, right? :slight_smile:

But the performance differences in certain aspects are very real. A friend of mine has been working on the OpenGL renderer of a certain engineering application for years, and the reason their company always recommended the Quadro cards to their customers was simple: “It’s much more important to be able to draw millions of antialiased lines than to texture a single polygon.” He had some pretty hair-raising statistics on the performance numbers, but they really don’t make sense to me, since the HW in the consumer and the pro cards is pretty much identical. I would really like to be able to get some of the pro HW for just modeling reasons and then “with a flip of a bit” switch them to a consumer mode (and a different driver model) for some internal tools using DirectX.
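To make that concrete, here’s a toy sketch of the kind of workload he meant (my own illustration, assuming freeglut; nothing from his actual code). Smooth lines via GL_LINE_SMOOTH are exactly the legacy path that workstation drivers tend to accelerate and consumer drivers tend to neglect:

```cpp
// Minimal antialiased-line throughput sketch (assumes freeglut).
#include <GL/glut.h>
#include <cstdlib>

void display() {
    glClear(GL_COLOR_BUFFER_BIT);

    // Smooth (antialiased) lines: on consumer drivers this path is
    // often slow or barely optimized; pro drivers tend to accelerate it.
    glEnable(GL_LINE_SMOOTH);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glBegin(GL_LINES);
    for (int i = 0; i < 100000; ++i) {   // crank this up toward millions
        glVertex2f(rand() / (float)RAND_MAX * 2 - 1,
                   rand() / (float)RAND_MAX * 2 - 1);
        glVertex2f(rand() / (float)RAND_MAX * 2 - 1,
                   rand() / (float)RAND_MAX * 2 - 1);
    }
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("AA line throughput sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```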

SamiV.

Nvidia Quadro cards, mainly to leverage XSI’s preference for OpenGL over DirectX.

Nvidia all the way…

Like everyone else, I don’t know why. Maybe it’s because every time I hear someone complain about their system, it usually has an ATI card in it.

However, I have never used a high-end card. I’m running an 8800GTS right now and it does fine. I’m sure if I used a high-end card I would see a difference, but I don’t want to see it until I can actually buy one :slight_smile:

Funny you say that, as I tend to find the ‘expensive’ gaming-level cards much speedier at best, and virtually the same as the ‘mid-range professional’ cards at worst. (As a caveat, I only have experience up to the 1700/3500 range of Quadros, however.)

Lean towards the Quadro/FireGL if your pipeline is heavy on OpenGL applications, as they ARE a bit smoother there than the gaming-level cards.

For my money right now, I’m still in the 8800->9800 Nvidia phase as the best speed-for-cash pick.

But I also haven’t had to purchase any workstations since last December, so I haven’t read up on the latest.

ATI for engine development. It’s the lowest common denominator, as I find it less forgiving with different texture types. I personally use Nvidia, though, because I find it better driver- and quality-wise.
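To illustrate the kind of thing “less forgiving” catches: checking texture format support up front instead of assuming it. A minimal Direct3D 9 sketch (era-appropriate for this thread; the helper name is mine, while CheckDeviceFormat and the format constants are real D3D9):

```cpp
#include <cstdio>
#include <d3d9.h>

// Hypothetical helper: ask the runtime whether the HAL device can use
// a given texture format, rather than assuming it works everywhere.
bool texture_format_supported(IDirect3D9* d3d, D3DFORMAT fmt) {
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,    // assumed desktop format; query it in real code
        0,                  // no special usage flags
        D3DRTYPE_TEXTURE,
        fmt));
}

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    // Floating-point textures were a classic "works on one vendor,
    // breaks on the other" case in this era.
    std::printf("A16B16G16R16F textures: %s\n",
        texture_format_supported(d3d, D3DFMT_A16B16G16R16F) ? "yes" : "no");
    d3d->Release();
    return 0;
}
```

Developing against the stricter driver just surfaces the unsupported-format path sooner, which is the point of treating it as the lowest common denominator.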