Last nVidia card I used was the glorious 8800GT.
I'm kicking about in triple monitor gaming-land after I upgraded to a Sapphire Radeon HD 5870 Eyefinity 6 Edition.
It's the most money I've ever spent on a single piece of computer hardware (was $699 when I bought it) and I'm still, 2-3 years on, happily using it.
That being said, if you're willing/happy to stick with the ATi side of the fence, the 7870 represents some awesome bang for your buck with low voltage and temperature overheads; it's sure to be a f*****g reliable card.
I solidly vouch for Sapphire too; this card runs at 90+ degrees under load, and whilst it's only really been 'hot' [Read: mid to high 30s with some decent levels of humidity] in the ACT in the past fortnight, it's eating up shit like Far Cry 3 at 5760x1200.
... Far Cry 3 at 1920x1280 ...
Err, I'mma assume that's a typo, that or one WEIRD monitor.
That's an interesting take on 'why to upgrade' simul.
I certainly don't agree with the logic.
In the past 3 or so years, DX11 has settled in and become the norm, and until DX12 arrives I see no reason to bust a nut upgrading. There are some strongly performing video cards at a very not-strong price point, and until there's another 'class' [read: new video cards being made for new feature-sets] of video card I see no reason to throw big dollars at it.
(Especially if you're saving a few pennies to buy into the next gen of consoles.... assuming there's a good reason to buy into them...)