54 Comments

  • VulkanMan - Tuesday, November 7, 2017 - link

    The writing was on the wall.

    He seems to be more into the movie business now.
  • beginner99 - Wednesday, November 8, 2017 - link

    Yeah. We called this her on AT when the sabbatical was announced. Was obvious this was a soft-layoff.
  • damianrobertjones - Wednesday, November 8, 2017 - link

    Good lord man... Don't call him a her!
  • Qwertilot - Wednesday, November 8, 2017 - link

    The typo is (presumably!) for here :)
  • vladx - Tuesday, November 7, 2017 - link

    Surprise, surprise, just like I said in the previous article AMD is in full damage control mode.
  • ddriver - Wednesday, November 8, 2017 - link

    History repeating itself.

    After HecKtor RuiNz messed up AMD and caused it to lose its fabs, he got rewarded by becoming an executive of GF. Too bad he didn't get to enjoy it for long.

    And now Raja botched a critical product, and as a reward he gets a cushy and likely much better paid job at AMD's main competitor.

    The only mystery is why he was given that sabbatical and a chance to leave on his own terms rather than being fired for cause. I guess too big for jail, much like HecKtor, who was found guilty of multi-billion-dollar insider trading and given nothing but a slap on the wrist.
  • ddrіver - Sunday, November 12, 2017 - link

    Actually, maybe they gave him to Intel as a Trojan horse. Might have been a package deal: GPUs + Raja.
  • jjj - Tuesday, November 7, 2017 - link

    Vega 10 is terrible (in gaming) in perf per W, perf per mm2, and much worse in perf per cost (HBM), and the worst thing was that it was 1 year late. AMD needed to accelerate, and it slowed down.
    Still, it would have been nice to see what Navi delivers before judging Raja Koduri.

    My bet is that his next job is AR/VR related, although the gold rush in auto could be appealing too.
  • vladx - Tuesday, November 7, 2017 - link

    Probably joining Tesla; after all, they just partnered with AMD to build a next-gen AI processor.
  • ddriver - Tuesday, November 7, 2017 - link

    It wasn't all that bad, really. It was just designed to hit a much lower target; pushing it to the limit is what made it terrible at power efficiency, which happens with literally every chip in existence.

    Having a poor start with the drivers and little effort from engine developers to optimize didn't help either.

    Vega's biggest flaw was how much it was delayed. It would have been a great product a year ago, with lower clocks and TDP.
  • Azix - Tuesday, November 7, 2017 - link

    Vega ain't that bad at efficiency. 200-250W at 1600MHz on a Vega 56 running at 1080 perf is not horrible. But they can't afford to cherry-pick, so they pump the volts.
  • jjj - Tuesday, November 7, 2017 - link

    It's more likely that they missed targets. If I recall correctly, leaked slides were showing the server SKUs at 25 TFLOPS and 225W TDP.
    In any case, if you look at gaming vs the 1080, you've got, let's say, 280W vs 180W, and AMD saves a lot of power by using HBM, so being so far behind is terrible.
    Die size is 55% larger than the GTX 1080's, and from a cost perspective HBM makes it much worse.
    We'll see how Vega does in other products, but its first showing is terrible from a competitive standpoint.
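
    For a rough sketch of those ratios (the die sizes here are assumed from commonly cited figures, roughly 486mm2 for Vega 10 and 314mm2 for GP104, not official specs):

    ```python
    # Back-of-the-envelope perf/W and die-size comparison (rough sketch).
    # Power figures are the gaming-load estimates quoted above; die sizes
    # are assumed from commonly cited public figures, not official specs.
    vega64_power_w = 280   # approx. gaming board power, Vega 64
    gtx1080_power_w = 180  # approx. gaming board power, GTX 1080

    vega10_die_mm2 = 486   # assumed: commonly cited Vega 10 die size
    gp104_die_mm2 = 314    # assumed: commonly cited GP104 die size

    # At roughly equal gaming performance, relative perf/W is just the
    # inverse of the power ratio.
    print(f"Power ratio: {vega64_power_w / gtx1080_power_w:.2f}x")   # ~1.56x
    print(f"Die size ratio: {vega10_die_mm2 / gp104_die_mm2:.2f}x")  # ~1.55x
    ```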
  • JasonMZW20 - Tuesday, November 7, 2017 - link

    Vega64 hits an efficiency stride in Wolfenstein 2. I don’t OC my Vega64 much because I’m still on air, but I have slightly increased clocks while drastically lowering voltages: 1557MHz/1.032v and 1652MHz/1.078v. It runs at about 1600MHz/1.050v.

    I’ve logged 182W-193W (total range is 174-206W) in Wolf 2 with all settings maxed and using 7.1GB of VRAM at 1080p. Clocks are 1600-1605MHz/1.050v.

    Conversely, Superposition 1080p Extreme pushes Vega to 243W, with spikes of 252W at times, at only 1575MHz/1.025-1.031v. A lot of fixed-function geometry, I think; it uses less than 4GB of VRAM. Not sure if it’s culling overdrawn primitives either.

    Clearly Vega needs its new hardware features used (RPM, triangle culling/DSBR, deferred rendering/tiling, and an ability to convert fixed-function geometry datasets to NGG/Primitive Shaders on-the-fly - I wonder if that’s even possible), otherwise it’s an upclocked Fiji that drains power.
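
    As a rough sanity check on the undervolt numbers above, here's a minimal sketch assuming dynamic power scales with V^2 at a fixed clock, with an assumed ~1.20v stock voltage (a simplification that ignores static/leakage power):

    ```python
    # Rough V^2 scaling sketch: estimated dynamic-power savings from
    # undervolting at a fixed clock. The stock voltage is an assumption
    # (~1.20v); real cards also have leakage power that this ignores.
    stock_v = 1.20      # assumed stock voltage
    undervolt_v = 1.05  # logged undervolt at ~1600MHz

    scale = (undervolt_v / stock_v) ** 2
    print(f"Dynamic power scale: {scale:.2f}")             # ~0.77
    print(f"Estimated savings: {(1 - scale) * 100:.0f}%")  # ~23%
    ```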
  • Santoval - Wednesday, November 8, 2017 - link

    Are these new Vega hardware features *still* not exposed by the drivers, even after so many driver versions? I am unpleasantly dumbfounded.
  • mdriftmeyer - Wednesday, November 8, 2017 - link

    It's not a matter of them being exposed in the driver. They are exposed. It's a matter of engines implementing them.
  • Samus - Wednesday, November 8, 2017 - link

    I think AMD's RTG went into damage control mode when Maxwell launched. That was a real wakeup call, and they never really responded technologically, instead depending on lower price points to compete on performance (with no real competitive product in the mid-range... until Fury and Vega, both of which couldn't compete at the high end). The only real market for Radeon products is mining.

    That said, it's hard to ignore the design wins AMD has had with gaming consoles. They have effectively dominated this market segment since the Wii (which, unlike the GameCube, unfortunately lacked an AMD logo). The first mainstream console since the Wii to lack AMD graphics is the Switch, which has pretty underwhelming graphics, especially when not docked, and that's going to really hurt it as time goes on.
  • vladx - Wednesday, November 8, 2017 - link

    "The first mainstream console since the Wii to lack AMD graphics is the Switch, which has pretty underwhelming graphics, especially when not docked, and that's going to really hurt it as time goes on."

    Not really; Nintendo hasn't banked on high-grade graphics on its platform for a really long time. I'd say they've given up for good on bringing mainstream games to it.
  • Cooe - Wednesday, November 8, 2017 - link

    Fiji (Fury, Fury X, Nano) competed with Maxwell 2 just fine. At stock clocks on both cards, the Fury X was generally slightly faster at 4K, a wash at 1440p, and slower at 1080p. This is still true today, though Fiji's not quite as far behind at the lower resolutions as it used to be. It just wasn't the HBM-riding death-blow that AMD (and the fans) had been hoping for. Nvidia launching Big Maxwell 2 into the consumer space with the equally fast 980 Ti right before the Fiji launch really took the wind out of AMD's sails, but that still didn't make Fiji any less competitive with the best Nvidia had to offer. People (and AMD) had just been expecting it to go up against the 980 at first, where it would have been an utter beatdown until Nvidia could respond.

    Vega, OTOH, is an ENTIRELY different bag of chips. Without its specific hardware features coded for in software, it could only match Nvidia's x80-series card built on regular consumer Pascal silicon that was already a year old (not weeks old, as in the 980 Ti vs Fury X case), let alone go toe-to-toe with Big Pascal (1080 Ti) the way Fiji did with Maxwell 2.
  • smilingcrow - Wednesday, November 8, 2017 - link

    "It wasn't all that bad really. It was just designed to hit a much lower target, pushing it to the limit is what made it terrible at power efficiency, which happens with literally every chip in existence."

    It hasn't happened with Nvidia for quite a while now, so that's no excuse.
    If they only designed it to compete in the mainstream, then fair enough, and it shows their lack of ambition. But even at that level it's not power efficient.
  • Stuka87 - Wednesday, November 8, 2017 - link

    If you downclock Vega to ~1200MHz it is very efficient. However, it is then slower than Pascal, so they upclocked it, which moved it out of its efficiency range.
  • trane - Wednesday, November 8, 2017 - link

    Actually, Vega is a more advanced architecture than Pascal. Its downfall is that it suffers from clock speed limitations. At ~1200 MHz, Vega is literally as efficient as Pascal at ~1700 MHz. In compute, and in games that utilize some of Vega's advanced features, like Wolfenstein II (where it matches the 1080 Ti), it's well ahead.

    Sadly, that's a massive clock speed deficit, and to bridge some of it, they have had to push Vega 10 to the ragged edge of perf/W. At ~1500 MHz, Vega's perf/W is about half what it is at ~1200 MHz (a rough sketch of that scaling follows below). Some of it is due to going with GlobalFoundries - we have seen how the Xbox Scorpio, made at TSMC, is significantly more efficient than the RX 580 or 470, and how GP107 at Samsung 14nm clocks much lower than GP106 at TSMC. But some of it is also due to Nvidia's superior scaling. For this gen, Nvidia made the right call by focusing on boosting clock speeds, while AMD was trying to innovate.

    AMD should learn from this mistake - the market is decided in the here and now. While it's no secret that AMD cards age better, I think they have gone too far with Vega. There are so many advanced features here, and most of them are just wasted die space and power. Literally, releasing a Fury X die shrink with GDDR5X would be as fast in DX11 games, and also something they could have released in 2016.

    However, the pains of Vega do mean they now have an advanced architecture to build upon. Let's hope they let go of that innovation bug and start focusing on actual products that beat Nvidia in the here and now.
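
    Here's the rough sketch of that clock/voltage scaling, assuming dynamic power goes as f * V^2 and performance scales with f; the voltages at each operating point are assumptions for illustration, not measured values:

    ```python
    # Rough sketch of why perf/W collapses at high clocks, assuming
    # dynamic power ~ f * V^2 and performance ~ f. The voltages at each
    # clock are assumptions for illustration, not measured values.
    low_f, low_v = 1200, 0.90    # assumed efficient operating point (MHz, V)
    high_f, high_v = 1500, 1.20  # assumed shipping operating point (MHz, V)

    power_ratio = (high_f / low_f) * (high_v / low_v) ** 2  # ~2.22x the power
    perf_ratio = high_f / low_f                             # only 1.25x the perf

    perf_per_w = perf_ratio / power_ratio
    print(f"Perf/W at {high_f}MHz vs {low_f}MHz: {perf_per_w:.2f}x")  # ~0.56x
    ```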
  • The_Assimilator - Wednesday, November 8, 2017 - link

    > releasing a Fury X die shrink with GDDR5X would be as fast in DX11 games, and also something they could have released in 2016

    Fury X, or even a die-shrunk version, could never have been coupled with GDDR5/X for one simple reason: power consumption. 8GB of GDDR5/X would've added an extra ~30W on top of Fury's already ridiculous TDP, which would have pushed the card's perf/watt into no-buy territory.

    Of course, AMD was never going to admit their chips are power hogs and that they literally had no choice but to find something more efficient than GDDR5/X to be even remotely competitive in perf/watt, so instead they got their marketing department to spin HBM as "the future", and of course the fanboys ate it up hook, line and sinker.

    NVIDIA, in contrast, learned well from the disaster that was Fermi and haven't made the same mistake again, and as such have had no problem using GDDR5/X on even their most power-hungry parts. That's also why they've already committed to the known quantity that is GDDR6, while AMD continues to muddle along with HBM and defective silicon interposers driving their chip yields through the floor.
  • pepone1234 - Wednesday, November 8, 2017 - link

    > Vega is a more advanced architecture than Pascal. Its downfall is that it suffers from clock speed limitations. At ~1200 MHz, Vega is literally as efficient as Pascal at ~1700 MHz.

    I thought those extra 4 billion transistors of Vega were there to provide better clocks without sacrificing efficiency. What's the point, then, in comparing Vega at 1200MHz and Pascal at 1700MHz?

    Then stay with Fury; you at least save 4 billion transistors. Something has to be very, very wrong with Vega.
  • Bateluer - Wednesday, November 8, 2017 - link

    Come on, man, you know none of that is true. Vega's perf/watt is excellent, with most of its high TDP coming solely from the higher clock speeds RTG was forced to ship it with. And I don't know why this 'Vega is a year late' myth keeps coming up. Vega was, at most, a month late, well within predictions for a major product launch. It was always scheduled on AMD's roadmaps for Q2 2017, going back to published documents in mid 2015. This myth gets reposted at Reddit frequently too, and it gets put down there as well.
  • Stuka87 - Wednesday, November 8, 2017 - link

    Vega was already well into development before Raja took over RTG. So Vega being good or bad was not Raja's doing.
  • Dr. Swag - Tuesday, November 7, 2017 - link

    I must say I'm surprised, but at the same time I'm not really that surprised.

    RTG seems to have had issues from the start. From bad marketing, to bad choices (a single 6-pin connector on the 480, for example), to bad launches, they just haven't seemed to be doing well.

    When Raja announced he would be taking a sabbatical, I thought it was a step toward resignation, and that it might be good for RTG to have Lisa Su as the head instead, seeing as she's done well for AMD.

    Maybe this'll be good for AMD, but only time will tell.
  • ash9 - Tuesday, November 7, 2017 - link

    Raja's gone to work for Intel, like engineers before him who went to Apple and Tesla. Seems to work out for AMD.
  • webdoctors - Tuesday, November 7, 2017 - link

    Forty is a significant number in history. It is a number representing transition, testing and change

    ???

    9/11 is a significant # in history, so is 420.

    Every # has some significance at some place or time.

    Engineering is a tough game; you can't sugarcoat numbers, unlike in other fields.
  • bennyg - Tuesday, November 7, 2017 - link

    Start of next investor call TL;DR: Hello, Item 1, regarding our missed projections for RTG, please accept this freshly scapegoated head on a plate.
  • Flunk - Tuesday, November 7, 2017 - link

    Seeing as Vega was an entire year late, this has been coming for a while.
  • karthik.hegde - Tuesday, November 7, 2017 - link

    This is what Raja said the last time he joined AMD: "I always had the dream of building a Pixar-like company in India and I got an opportunity to engage with a group of people who have the same mission."

    Maybe that gives a picture of where he is headed.
  • vladx - Wednesday, November 8, 2017 - link

    But is there a market for high-grade visual effects in Bollywood? Judging by how things went at RTG during his reign, Raja is not really good at analysing market trends and needs.
  • Hurr Durr - Wednesday, November 8, 2017 - link

    There actually is. If you go by their recent films like "Robot", they can already give Hollywood a run for its money on the imagination of the people who design effects, but the technical side is lacking.
  • vladx - Wednesday, November 8, 2017 - link

    The point was, do Indian films usually get high enough budgets for such effects? It's all about money in the end.
  • Topweasel - Wednesday, November 8, 2017 - link

    Is the budget really needed? Outside of hardware costs, which can be really expensive, everything should be cheaper, especially the manpower. A studio with a large setup budget to start with could do effects for dozens of movies and spread that cost across all of them.
  • Dribble - Wednesday, November 8, 2017 - link

    TBH, I suspect a clash of personalities and company direction. It seems likely that Ryzen and the CPU group get most of the investment, and the GPU group had to suffer because of that. It wouldn't be much fun to be expected to build class-leading GPUs in that environment.
  • Sauls - Wednesday, November 8, 2017 - link

    I have faith in AMD again; it was lost with Polaris and with Vega. And now, seeing good news brings faith back.
  • willis936 - Wednesday, November 8, 2017 - link

    Dispatching one scapegoat does not change the behavior of the herd.
  • peevee - Wednesday, November 8, 2017 - link

    Let me guess - he does not have to work anymore after kickbacks from Intel for giving them AMD GPU...
  • mdriftmeyer - Wednesday, November 8, 2017 - link

    Intel has a sublicense for Polaris tech.
  • HStewart - Wednesday, November 8, 2017 - link

    Does anybody else find this interesting, coming right after the Intel/AMD GPU news? Maybe there is something more to it than what is being mentioned in the news.

    He's obviously smart; he probably understands AMD's financials better than the general public does and can see the writing on the wall. If you are a high-up person in technology and you don't like the news of what's coming, you make preparations to protect your well-being.

    I am thinking this is not going to be the only departure from AMD in the next couple of months. It sounds like he was a major visionary leader, especially with Apple using the GPUs, and Apple is likely upset with the Intel/GPU deal.

    Of course, who really knows - he may have health issues, or he may just want to relax and retire.
  • zodiacfml - Wednesday, November 8, 2017 - link

    He is going to Intel.
  • HStewart - Wednesday, November 8, 2017 - link

    Interesting. I am actually in a similar situation with my work - I was part of a deal in which my current company purchased rights from my previous company. I got the benefit of working at home, and my current company got my 14 years of knowledge.

    In my situation, the code is vastly different from and much enhanced over the previous code. If you are correct, it might mean that Intel is using the logic provided by AMD - I expect the next generations of GPUs will be significantly improved. I was one of the original designers of my work's product, and my current company was amazed they got me - but I did not feel growth at my previous company, and they were also hurting financially. Interestingly, my previous company has now been purchased and things are vastly different - but I am very happy.

    This could possibly be a win-win situation: Intel gets a very knowledgeable person in GPUs, which everyone knows they need, and AMD possibly gets money to reduce their horrible debt.

    But if I were Intel, I would just purchase AMD and get it over with. Antitrust laws may not allow it, but things have changed since the earlier days of the PC - Intel's true competition now is ARM, not AMD.
  • ianmills - Wednesday, November 8, 2017 - link

    I guess he was let go after the Intel/AMD announcement to save face. AMD lets Raja go in a positive way, so Raja will reciprocate and say positive things about AMD.

    If AMD tried to humiliate Raja, he would try to humiliate AMD in return.
  • HStewart - Wednesday, November 8, 2017 - link

    I am thinking, with yesterday's announcement, that Raja was part of the deal with Intel. It happened to me, and it would be logical for both Intel and AMD to do it.
  • rocketbuddha - Wednesday, November 8, 2017 - link

    > Anand once called Raja "the king"
    No pun intended. But "Raja" in most Indian languages means "The King". :-)
