33 Comments
blanarahul - Wednesday, March 12, 2014 - link
Introducing TechReport subscription!
edzieba - Wednesday, March 12, 2014 - link
From the out of support list: "NVS 295, NVS 300"
Ding dong the witch is dead! Now there's half a chance I can persuade Upstairs to have the damn passively-cooled fail-o-matics yanked from all our chassis and replaced with actively-cooled cards (or at least kick them out for the next OOW & upgrade cycle). Sure, they're fine if you have the airflow I guess, but whoever specced our workstations for the last rollout didn't check that first.
Kevin G - Wednesday, March 12, 2014 - link
Indeed. I see a couple of these units at work and am hoping to see some nice low-power, passive Maxwell cards as a replacement. Don't need much power, but I'd love the nice multimonitor support. Hopefully the maximum number of displays has increased and there is support for MST hubs. I have a reason to test a setup with 8 displays, and running them all off of one video card/system would greatly simplify things.
Ryan Smith - Wednesday, March 12, 2014 - link
Maxwell is still limited to 4 displays per GPU.
Mr Perfect - Thursday, March 13, 2014 - link
Matrox has M-series cards capable of 8 displays each, if you don't want to have two normal cards in a system for some reason.
Kevin G - Friday, March 14, 2014 - link
Mirroring across two cards, which is necessary in this setup, has been flaky in my experience. While it's true that I could just have all the mirrored displays connected to a single card, it quickly becomes a pain to support if you have to troubleshoot remotely and have someone else be your hands if necessary.
I was hoping that with both nVidia and AMD pushing multimonitor setups as well as 4K, the number of logical displays their cards support would increase. I'd be content if this required MST hubs, as large multi-display setups are rather niche.
jasonelmore - Wednesday, March 12, 2014 - link
This will cause minimum system requirements to go up in some games since drivers won't be updated with new game profiles. Nearly every AAA game has the 8800GT w/ 512MB as the minimum GPU.
inighthawki - Wednesday, March 12, 2014 - link
That depends on how you define "up." There are plenty of newer cards on the market with the same performance. The 8800 line just happens to be incredibly popular, so it's a good "target."
DanNeely - Wednesday, March 12, 2014 - link
What's actually likely to start pushing min specs up is cross-platform titles dropping the PS3 and XB360; supporting really old/slow PC cards was doable using the quality level that was already created for the last-generation consoles.
tipoo - Wednesday, March 12, 2014 - link
I have to hand it to them, this is only happening now, while AMD already isn't supporting the HD4000 DX10.1 series in Windows 8.1.
kyuu - Wednesday, March 12, 2014 - link
Uh, why would AMD be supporting an Intel iGPU...?
Novaguy - Wednesday, March 12, 2014 - link
He's clearly talking about the AMD HD 4000 series. I had the HD 4650, for example.
tipoo - Wednesday, March 12, 2014 - link
ಠ_ಠ
The HD 4000 series by AMD... Come on now, I was talking about it with AMD as the context. And the HD Graphics 4000 by Intel is a single GPU, not a series.
tipoo - Wednesday, March 12, 2014 - link
Actually I guess that's a series too, but HD 4000 = Radeon, HD Graphics 4000 = Intel.
Kevin G - Wednesday, March 12, 2014 - link
This is kinda weird on the AMD side as not much changed between the HD4000 and HD5000 series in terms of architecture. The main thing that the HD5000 series added was Eyefinity.
blzd - Wednesday, March 12, 2014 - link
More changed from the 4000 to the 5000 series than from the 5000 to the 6000 series. Maybe you have those mixed up.
The 5000 series brought us twice the performance at the same cost almost across the board.
Kevin G - Wednesday, March 12, 2014 - link
The transition from the Radeon 4000 to the 5000 series coincided with a die shrink from 55 nm to 40 nm. This allowed AMD to increase the number of functional units on a die, but the actual design of those units was the same. Case in point: the Radeon 4870 and 5770 differ in core specs mainly due to memory bus width (256-bit vs 128-bit) and clock speeds. Sure, due to the shrink AMD was able to offer more performance for the same price, but the underlying architecture was the same. This is what makes AMD dropping Radeon 4000 support in their drivers odd.
Things were different in the 6000 series. AMD couldn't do another shrink, so they had to increase die size as well as the efficiency of their designs. The Radeon 6900 line was indeed a big change, as those cards used a VLIW4 design instead of VLIW5. The Radeon 6800 and 6900 lines also got enhanced TMUs and ROPs for more throughput. The lower end of the 6000 series had some rebrands, though: the Radeon 6770 was the same as the 5770, for example.
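To put rough numbers on the 4870 vs 5770 bus-width point above, here is a back-of-the-envelope peak memory bandwidth comparison in Python. The memory data rates are assumptions based on commonly quoted reference specs, so treat this as an illustrative sketch rather than authoritative figures:

    # Peak memory bandwidth = (bus width in bits / 8) * effective data rate in Gbps
    # Assumed reference specs: HD 4870 with ~3.6 Gbps effective GDDR5 on a 256-bit bus,
    # HD 5770 with ~4.8 Gbps effective GDDR5 on a 128-bit bus.
    def peak_bandwidth_gbps(bus_width_bits, effective_rate_gbps):
        return bus_width_bits / 8 * effective_rate_gbps  # GB/s

    print("HD 4870:", peak_bandwidth_gbps(256, 3.6), "GB/s")  # ~115.2 GB/s
    print("HD 5770:", peak_bandwidth_gbps(128, 4.8), "GB/s")  # ~76.8 GB/s

Even with faster memory, the 5770's narrower bus leaves it with noticeably less bandwidth than the 4870, which fits the point that bus width and clocks, not the unit design, were the main differences.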
lmcd - Wednesday, March 12, 2014 - link
I think there was a UVD generation jump, and obviously the tessellation engine changed. Wasn't Eyefinity introduced with the AMD HD 5k series?
I think the combination of these changes affects the front-end a lot more than the VLIW5->4 shift. So yes, I agree in premise but not in driver maintenance.
Kevin G - Thursday, March 13, 2014 - link
I think UVD may have changed, and I did mention Eyefinity above. The changes to UVD would take some rewriting, but that's ultimately a minor part of the drivers. The tessellation engine was upgraded but wasn't entirely new either (DX10.1 had it as an optional feature).
The VLIW5 to VLIW4 change did require a fair amount of compiler tuning. There were a few edge cases where the Radeon 6970 at launch would lose in compute to a Radeon 5870 due to unoptimized compilers.
MrSpadge - Wednesday, March 12, 2014 - link
"This leaves Intel as the only vendor with D3D10 parts still receiving mainstream support"It may be Intel's mainstream suppoort treatment, but it's not really any better than legacy support from the green and red team.
tipoo - Wednesday, March 12, 2014 - link
Yeah, really. Intel was doing about four driver releases a year last I checked, or has that changed?
powerarmour - Thursday, March 13, 2014 - link
The last driver for the G45 series (DX10) came out 11/2012, so no, Intel's legacy support is pretty poor. Don't even ask about CedarView/Trail drivers...
The Von Matrices - Thursday, March 13, 2014 - link
It's been a lot better since the Sandy Bridge generation; driver releases are quarterly or sometimes more frequent.
tviceman - Wednesday, March 12, 2014 - link
Pour one out for the G92.
dgingeri - Wednesday, March 12, 2014 - link
I'd like to see how the performance of my former high-end video card, the 8800GTX, compares to my current GTX 680. It would be interesting.
piroroadkill - Thursday, March 13, 2014 - link
Not quite, but this very site has 8800GT vs GTX 680.
http://www.anandtech.com/bench/product/521?vs=555
RussianSensation - Thursday, March 13, 2014 - link
8800GTX = 100%
GTX580 = 282% (2.82x the performance)
http://www.computerbase.de/artikel/grafikkarten/20...
GTX580 = 100%
GTX680 = 137% (37% faster)
http://www.computerbase.de/artikel/grafikkarten/20...
This implies the GTX680 is 282% x 1.37 ≈ 386% of the 8800GTX, i.e. roughly 3.86x its performance.
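As a quick sanity check of that chained arithmetic, here is a minimal Python sketch using only the ComputerBase percentages quoted above (no new measurements):

    # Chained relative performance: 8800GTX -> GTX 580 -> GTX 680.
    # Each factor is "performance relative to the previous baseline".
    gtx580_vs_8800gtx = 2.82  # GTX 580 = 282% of the 8800GTX
    gtx680_vs_gtx580 = 1.37   # GTX 680 = 137% of the GTX 580

    gtx680_vs_8800gtx = gtx580_vs_8800gtx * gtx680_vs_gtx580
    print(f"GTX 680 is roughly {gtx680_vs_8800gtx:.2f}x the 8800GTX")  # ~3.86x

Multiplying the two ratios is valid only because both reviews report performance as a percentage of a common baseline card in each comparison, so the intermediate GTX 580 figure cancels out.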
tipoo - Thursday, March 13, 2014 - link
That's actually not that crazy, and it even seems small, considering how old a card the 8800GTX is now.
powerarmour - Thursday, March 13, 2014 - link
I assume this also means EOS for Nvidia's ION too? (although it's not specifically mentioned)
vailr - Thursday, March 13, 2014 - link
What about Microsoft's pending DirectX 12 announcement? I'm assuming support for DX12 would require some kind of GPU hardware upgrade, or not?
Kevin G - Thursday, March 13, 2014 - link
To really merit a major hardware change, DX12 would have to introduce a change outside of the shader pipelines, like introducing more programmability into the TMU, ROP or tessellation units.
The shader cores on modern chips can pretty much emulate any new raw function if necessary. DX12 hardware may have additional instructions and/or hardware to execute it faster.
MS's goal seems to be reducing overhead, which may require alterations to how DX works on the software side, not necessarily new hardware. Ultimately I see DX12 being a cleanup of the DX10 and DX11 specs along with a new driver model.
Wolfpup - Friday, March 14, 2014 - link
Wow, I can't believe the 8800GTX is EIGHT years old! It still blows away the last gen consoles, so it's sort of hard to believe it.
This sort of seems fast, but I guess it's really not, and it's still much much much better support than AMD or Intel have. AMD doesn't even properly support their GPUs when they're new...
Johnmcl7 - Friday, March 14, 2014 - link
I don't see what's hard to believe about an eight-year-old top-end graphics card being faster than an eight-year-old console (the Xbox 360 was released back in 2005), given that the graphics card cost more than the entire console.