200 Comments

  • alfreddelacruz - Wednesday, November 8, 2017 - link

    No loyalty.
  • sharath.naik - Wednesday, November 8, 2017 - link

    Loyalty to whom?
  • MonkeyPaw - Wednesday, November 8, 2017 - link

    Or, perhaps he likes the challenge of starting a project like this and running with it.
  • wolfemane - Thursday, November 9, 2017 - link

    ...straight into the ground.

    Yup
  • nathanddrews - Thursday, November 9, 2017 - link

    Exactly. Loyalty is a fine thing, but ambition and risk are what drive innovation.
  • ddriver - Thursday, November 9, 2017 - link

    Unfortunately the industry has a single priority, and that is to drive profit by any means, quite often by hindering innovation or holding it hostage.

    Such a field has room for only one loyalty - the loyalty to profit.
  • wumpus - Thursday, November 9, 2017 - link

    Things are never as clear as that for companies that can't fit their employees into a single room. Raja is much more beholden to "Brand Raja" and needs to keep the shine on that. In any event, relentlessly maximizing quarterly profits is typically a recipe for disaster, not modern innovation.

    Bill Gates relentlessly pursued market share. Obtaining nearly 99% market share wherever MS went, profits eventually followed. Jobs pursued "Apple branding" and profits followed. Intel managed to pursue "x86 domination" and did well until they pulled the Itanic, in which case "x86 domination" did wonders for AMD until Intel could regain x86 supremacy.

    Profits will matter, if only because Intel can actually begin to calculate them with discrete components (it isn't quite possible with the current GPUs). But while revenues will be *somewhat* possible to account for (not really, most will be bundled with CPUs), costs won't, and there will be plenty of politics driving the accounting. The "profits" that make "Brand Raja" look good will be made and lost depending on how the costs are accounted for (and what "profits" are made in CPU+GPU bundles), not on any "externally objective accounting".
  • HStewart - Thursday, November 9, 2017 - link

    One must think positively about the industry - are you really sure it only drives profits? A while back I spent $3,500 on a Dell Inspiron 7000 - a 300 MHz Pentium II that, in today's world, even a $100 Atom computer could outperform. Today $3,500 can buy you 4 or more laptops, each many times more powerful than that original Dell.

    And one would say that this is profit holding innovation hostage - ok, if you believe that, then go ahead and use a 300 MHz processor with non-intelligent graphics.

    Raja is also in a wonderful position for the exciting future that EMIB will likely bring - one that will radically change computers.
  • SleepyFE - Friday, November 10, 2017 - link

    As far as I can tell, EMIB doesn't radically change computers. It's just cheaper than an interposer (did I use the right word?).
  • mdriftmeyer - Saturday, November 11, 2017 - link

    The interposer is patented technology AMD won't license; and like all marketing by Intel, they proclaim their approach a leap forward. It's not.
  • Threska - Monday, November 20, 2017 - link

    The Interposer may be patented but stacking is still very much a thing.

    https://archive.fo/Af3EZ
  • Alistair - Wednesday, November 8, 2017 - link

    This actually makes a lot of sense.

    Brand  | CPU        | GPU
    Intel  | Core       | Raja 2
    nVidia | ARM/Denver | GeForce
    Apple  | Swift etc. | Non-Imagination
    AMD    | Ryzen      | Radeon

    No more just CPU or just GPU companies. nVidia will be running Windows on ARM soon probably.
  • 06GTOSC - Thursday, November 9, 2017 - link

    "nVidia will be running Windows on ARM soon probably."

    Yeah that'd be a successful strategy....Windows ARM is doing great...
  • nathanddrews - Thursday, November 9, 2017 - link

    Your prediction has already been tested by the past. NVIDIA had Tegra devices running Windows 8 on ARM several years ago and it didn't work out very well.
  • UltraWide - Thursday, November 9, 2017 - link

    I think he/she means along the same lines as the Qualcomm Snapdragon SoCs that will run full Windows 10, not the RT variant.
  • HStewart - Thursday, November 9, 2017 - link

    Same result - emulation will fail, and it's also not licensed.
  • HStewart - Thursday, November 9, 2017 - link

    Good sense - Windows RT is dead.
  • HStewart - Thursday, November 9, 2017 - link

    But what if NVIDIA also gets a license to use EMIB technology for their GPUs? Do you really believe that AMD will be the sole supplier of graphics chips? Intel got Raja on board to compete against NVIDIA and, of course, little AMD.
  • mjeffer - Sunday, November 12, 2017 - link

    Nvidia isn't really targeting the desktop with their CPUs. They're targeting things like self-driving and sensor-heavy applications. I'd be VERY surprised if they ever ended up on desktops.
  • Byte - Wednesday, November 8, 2017 - link

    I guess the rumors were true about the huge internal war at AMD against the former ATI team. Now that Raja has jumped ship, he can bring his old team over slowly.
  • Alexvrb - Wednesday, November 8, 2017 - link

    They gave Raja RTG like he wanted... and overall RTG has been lackluster in the discrete market under his leadership. Vega 56 is the only interesting release in recent history, mainly because of the disruptive pricing (currently starts around $400). Shrug. Here's hoping Navi is a bit more impressive.
  • Stuka87 - Thursday, November 9, 2017 - link

    Vega was started before Raja took over RTG though.
  • Alexvrb - Thursday, November 9, 2017 - link

    Raja Koduri has headed up graphics since they rehired him (yes, rehired) from Apple in April of 2013. 4.5 years isn't enough for you, hmm? RTG was a restructuring, yes, and they did that to please him a little over 2 years ago. But even prior to RTG, he was the graphics boss.

    I still might buy a Vega 56 if the price is right, but they sure as heck aren't stealing Nvidia's performance or efficiency crowns, nor are they making much of a dent in marketshare.
  • FreckledTrout - Thursday, November 9, 2017 - link

    I would be curious how much of the cash RTG brought in, while AMD's CPUs were getting trashed over the last 7 years, went to R&D for Vega. I get the feeling AMD used ATI/RTG's earnings to build Zen.
  • Alexvrb - Thursday, November 9, 2017 - link

    Have you SEEN their financials? Do you have any idea how much they paid for ATi? A lot of people still feel it was a mistake. If anything it diverted tons of money from their CPU division for years. Look at their situation now. Lots of debt, had to sell off assets - they sold their fabs. They have been scratching and clawing for every bit of cash they can get, just to stay alive. They are even selling graphics to Intel for a MCM setup, thus shoring up Intel's only major weakness.

    Will it all pay off with their Zen-based efforts? Maybe, but you give FAR too much credit to their graphics division. Nvidia has been edging them out in almost every measure, including marketshare. The main reason I often use AMD graphics is pricing, and that's not always a given. I certainly wouldn't put Raja on a pedestal.
  • HStewart - Thursday, November 9, 2017 - link

    Of course Raja had inside information about AMD, including financials, and this could be the primary reason he left.

    But I think people give way too much credit to Zen - in lots of ways it's a last-ditch effort to glue dies together just to increase core count.

    But I think what most people don't realize is that even Intel's integrated GPUs are good enough for the average customer's needs - i.e. for word processing and spreadsheets.
  • Alexvrb - Saturday, November 11, 2017 - link

    Zen is the architecture, not the die setup, and it's a LOT more impressive as a feat of engineering than Vega. Also, "glued-together" dies are nothing new. Intel has done it and, as a matter of fact, does it today. It's not mandatory for the architecture; however, it has some advantages in terms of yield and flexibility. Both Intel and AMD have also ramped up the scalability and power of their "glue", by the way.

    As far as Intel's existing iGPUs being good enough... you're wrong. If you were right, Intel still wouldn't be taking GPUs seriously. This is 2017. GPU-accelerated workloads are growing, not shrinking. MR is the future. Intel is buying graphics chips from AMD, and working on their own to eventually replace those. So their iGPUs, MCM GPUs, and discrete GPUs will all be upgraded (and NEED to be upgraded) in the coming years.
  • Alexvrb - Saturday, November 11, 2017 - link

    Also I don't really mean to trash Vega, it's just very iterative vs last gen GCN. It also runs into similar problems trying to scale up as the last big die GCN chips... it becomes a power hog. Vega should shine a lot brighter in a power-constrained environment.
  • Samus - Thursday, November 9, 2017 - link

    A lot of people called this yesterday, I just wouldn't believe it because...I mean don't we have non-compete agreements? Perhaps Intel's deal with AMD on Monday was part of the negotiations to get Raja?

    Either way, this is awesome. nVidia needs real competition. Someone needs to put the brakes on their gravy train.
  • 06GTOSC - Thursday, November 9, 2017 - link

    Non-compete agreements are practically impossible to enforce.
  • Byte - Friday, November 10, 2017 - link

    In California, non-compete is practically illegal. That's why there is so much poaching in silicon valley.
  • HStewart - Thursday, November 9, 2017 - link

    "I guess the rumors were true about the huge internal war at AMD against the former ATI team. Now that Raja has jumped ship, he can bring his old team over slowly."

    This often happens in development teams when a takeover happens. And the ones who stayed on and supported Raja will likely be jumping ship too.
  • III-V - Wednesday, November 8, 2017 - link

    Corporations don't have any loyalty to their employees -- why should they have any loyalty to their employers?
  • Samus - Thursday, November 9, 2017 - link

    It's true. It's why he left Apple. AMD was actually quite loyal to him (on the surface) but I have a feeling them giving him up, especially without violating a non-compete, had a lot to do with that deal Intel announced Monday to use AMD GPUs on their packages. Something tells me this is a mutually beneficial move, ESPECIALLY for Raja. I mean, he's now effectively running an entire division of Intel. He could potentially become CEO in 10 years. He had no room to move up at Apple, especially since at the time they were undeniably confused as to what to do with PowerVR.
  • Icehawk - Wednesday, November 8, 2017 - link

    Loyalty to what? A corporation? The only loyalty I have is to my paycheck.
  • peevee - Thursday, November 9, 2017 - link

    How about to his team?
  • lordken - Monday, November 20, 2017 - link

    Oh, how pathetic. How's pimping your girls going? Paycheck good enough?
  • beginner99 - Thursday, November 9, 2017 - link

    I find the lack of a non-compete clause more interesting. I mean, yes, he was off for 2 months, but really that is way too little. He knows everything about AMD's upcoming products like Navi. He will know exactly what kind of performance level to target.
  • III-V - Thursday, November 9, 2017 - link

    Non-compete clauses aren't valid in certain states. California's a big one. Raja apparently lives in San Francisco, and I'd imagine it's no big deal for someone of his pay grade to work remotely and travel when necessary, so AMD can't touch him. Well, they could try, but they'd be fighting a severe uphill battle.

    Intel's also HQ'd in California (as well as AMD), so he's pretty much invincible.
  • jab701 - Thursday, November 9, 2017 - link

    Non-compete applies to poaching employees. If the employee approaches a company on their own looking for a job then there is no problem, and I believe that's what the recent court case in California was about (employees looking for work were turned down based on CEO agreements).

    However, someone at as high a level as him wouldn't just go and apply for a job, so it is possible that Intel heard he was fed up and wanted to leave AMD, and offered him a job.
  • xype - Thursday, November 9, 2017 - link

    Loyalty to a company is awesome until the point where you get fired or your department disbanded and the company goes "We’re reaaaallly sorry to let you go, here’s a box with the stuff from your desk."
  • bill.rookard - Thursday, November 9, 2017 - link

    The kicker to this whole deal is that if/when they release their GPU, I'm going to bet that AMD will be looking VERY closely to see if it contains any unlicensed AMD IP or designs.
  • peevee - Thursday, November 9, 2017 - link

    Raja is all about advancing his career and nothing more.
  • HStewart - Thursday, November 9, 2017 - link

    I think you're forgetting that Raja's addition to Intel makes a major change in Intel's GPU involvement. Intel is going to be much more serious competition - which has been needed since AMD purchased ATI.
  • peevee - Thursday, November 9, 2017 - link

    I don't think Raja brings any significant expertise Intel did not have without him.
  • HStewart - Thursday, November 9, 2017 - link

    Loyalty? Congratulations to Raja for finally maturing in the industry, with a high position at the original developer of x86 - this is a win-win situation for both Raja and Intel. Intel gets Raja's knowledge and experience with GPUs, and Raja gets a high position in a major company that is not billions in debt.

    What is really interesting is the Intel/AMD GPU deal - it sounds to me like it's connected and part of the same deal. AMD's debt is serious and Intel's money could help keep AMD afloat. But it also protects Intel from any legal action over Raja.

    I don't expect Intel's future GPUs will be similar to Radeon, but something very new - highly connected to the CPU.

    Yesterday I only guessed about this and it came true - but that was based on my past experience in software development.

    This is an extremely smart move for both Intel and Raja - Intel has a major need to improve their GPUs to compete with AMD/RTG and especially NVidia. Everyone knows that Intel's integrated GPUs lag behind AMD and NVidia.

    As for Raja, this was a big move to become a VP at Intel - congratulations.
  • Wwhat - Sunday, November 12, 2017 - link

    Seems AMD needs both a new chief and a team to check all Intel output for stolen stuff.

    It's funny - I know some silly retail stores make sales personnel sign an agreement not to work in the same branch for X months after leaving, yet this guy, who's known to hop around and has intimate knowledge of roadmaps and proprietary tech, can just hop from tech giant to tech giant at a moment's notice.
    And of course that makes it interesting to scout him out, even if you don't need him and it's just to offer him more money so the firm he's working for is damaged while they look for a new guy.

    In short: AMD should redo their contracts.
  • drexnx - Wednesday, November 8, 2017 - link

    what is this, their third serious crack at graphics?

    i740, Larrabee, ??????

    3rd time's the charm?
  • AS118 - Wednesday, November 8, 2017 - link

    Yeah, that's my thought. Is this effort going to have any legs, or will Raja and his discrete team be gone after a year, with Intel giving up and just agreeing to pay AMD for old designs and licensing?

    It really seems like a lot of trouble to go through when it's probably easier and cheaper to just buy from AMD and/or Nvidia.
  • The Hardcard - Wednesday, November 8, 2017 - link

    Easier and cheaper, but far less lucrative. GPUs and neural networks are going to be by far the most profitable semiconductors going forward. Every company that has the resources to compete would be wise to do so.
  • lmcd - Wednesday, November 8, 2017 - link

    GPU technology is where Intel's fabrication lead (which is still massive, if that isn't clear) can be abused the most dramatically.
  • Spunjji - Thursday, November 9, 2017 - link

    This. I'm quite interested to see how it pans out. My understanding is that GPUs benefit more from a process tuned for density than one tuned for outright speed / minimal leakage, so it will be at odds with Intel's most recent directions in terms of fabrication... but then who knows, maybe they'll bring something amazing to the mobile marketplace and be content with being merely competitive (and "synergistic") at the top end.
  • amrs - Thursday, November 9, 2017 - link

    The manufacturing lead for future products would be easier to believe in if Intel were able to actually manufacture their FPGAs at 10 nm but it doesn't look like they can. Stratix 10 is not shipping, only a couple of Arria 10s.
  • name99 - Friday, November 10, 2017 - link

    (a) Uh, Intel's "massive" fabrication lead is actually no longer clear. Sure, sure, their 10nm claims are better than TSMC/Samsung's 10nm claims. But TSMC are SHIPPING 10nm today, and may well be shipping 7nm before Intel ships anything in volume at 10nm.

    (b) There is a political angle to this that no one seems to want to consider. How do the Phi group and the Nervana group react? Jensen Huang (having exactly the same thought as I had when I heard this!) seems to be the only person who has noticed this, and Tom's Hardware the only journalists who understood the point of what he said:
    "If you have four or five different architectures to support, that you offer to your customers, and they have to pick the one that works the best, you are essentially are saying that you don't know which one is the best [.....] If there's five architectures, surely over time, 80% of them will be wrong. I think that our advantage is that we are singularly focused."

    There are multiple issues to his point.
    - In the past Intel has repeatedly crippled devices (and thus its entry into new markets) because of an obsession with x86. Atom/Quark are obvious, but I'd argue the same for Larrabee [it would have been a much more successful, much better throughput processor with a leaner ISA], and for IA64 [a substantial reason for its failure was dicking around with stupid attempts to make it x86 compatible].

    + Raja seems interested in something more than just a GPU, but a generic throughput processor. So what ISA will it execute? And how does it not compete with Phi?

    + How many HW architectures (and SW architectures) is Intel willing to support? Plenty of people complain that their existing Gen drivers are lousy. They're going to create a whole new ISA (and compiler? and drivers? and optimized libraries?) for a new architecture? Meanwhile doing the same thing also for Movidius? And Nervana?

    + What is Intel eventually going to ship? Five very different SKUs all trying to sell into the just-not-that-large HPC/data warehouse market? For a company that complains that mask sets cost so much that it's sensible to reuse them for the same die selling different numbers of cores, this seems a very strange direction to go.

    Everyone is excited right now because Intel is covering every buzzword (except AR...)
    But no company is large enough to do everything, and this looks to me like Intel being stretched way too thin, floundering born of desperation, not a coherent plan.
  • tipoo - Wednesday, November 8, 2017 - link

    Larrabee was a product of their post-Itanium, "x86 or bust" PTSD. Hopefully this is more sane. The Gen graphics already aren't bad per watt.
  • Pinn - Wednesday, November 8, 2017 - link

    Can confirm that ptsd.
  • CharonPDX - Wednesday, November 8, 2017 - link

    Well, third crack - I wouldn't call it third SERIOUS crack...

    i740 was more "proof of concept to show AGP works."

    Larrabee was canceled as a GPU before it even got as far as prototype graphics card stage.
  • extide - Thursday, November 9, 2017 - link

    Oh no, there were prototypes made, and they even worked. The Knights Corner product that was released to retail actually still had the fixed-function texture units in the silicon; you can see them in a die shot. The performance just was never competitive.

    Check out this blog post for a great read: https://tomforsyth1000.github.io/blog.wiki.html#%5...
  • CheapSushi - Thursday, November 9, 2017 - link

    And what's wrong with that? I don't understand your point.
  • tipoo - Wednesday, November 8, 2017 - link

    Reading between the lines a bit, I think Raja was trying to get RTG spun out or bought out by a wealthier party, but that didn't work out. So Intel sniped him after a ho-hum Vega performance instead.
  • ToTTenTranz - Wednesday, November 8, 2017 - link

    Intel has been downsizing their iGPU division and now they want to make high-end discrete GPUs out of one high-profile hire?

    Unless old rumors are true and Intel is buying RTG from AMD in the end...
  • lilo777 - Wednesday, November 8, 2017 - link

    iGPUs and discrete GPUs are two very different things. Intel is probably trying to advance their GPU/AI offerings.
  • Pinn - Wednesday, November 8, 2017 - link

    They do have some true neural circuits—only IBM is competing there. Google is yet more SIMD.
  • beginner99 - Thursday, November 9, 2017 - link

    No need to buy - it's much easier to just poach the good employees from AMD. And they hired the guy who should know who they are.
  • Filiprino - Wednesday, November 8, 2017 - link

    Maybe this is part of the deal giving Intel access to AMD's GPU tech. Now he will be working for Intel with AMD technology too.
    But this shows that Intel can't make a good enough x86 vector architecture to rival NVIDIA. Does it?
  • Filiprino - Wednesday, November 8, 2017 - link

    “I am incredibly excited to join the Intel team and have the opportunity to drive a unified architecture vision across its world-leading IP portfolio that helps accelerate the data revolution.”

    What happens to HSA then?
  • HStewart - Thursday, November 9, 2017 - link

    "But this shows that Intel can't make a good enough x86 vector architecture to rival NVIDIA. Does it?"

    First of all, Intel has decades of experience with x86 and they are the ones that created the original design, with AMD as the clone (partly because of evil IBM's desire for 2nd sourcing).

    As for GPUs, there are new GPU designs, and NVidia and ATI/AMD have been doing it for decades - this news shows that Intel is tired of fooling around in the GPU segment.
  • Lord of the Bored - Saturday, November 11, 2017 - link

    And fast-forward a bit and Intel is making clones of AMD's x86-64 processors. The 64-bit "long mode" originated at AMD, and Intel publicly mocked it right up until they cloned it.
  • jjj - Wednesday, November 8, 2017 - link

    After Bulldozer 2.0 for AMD, he's gonna build another one?
    As for the ethics of this move, hard to go any lower. Not that working for Intel is gonna be any fun.
    I'd say he deserves to be elected POTUS.
  • milkywayer - Wednesday, November 8, 2017 - link

    People switch companies fairly often. At the end of the day, your loyalty is just to yourself and immediate family.
  • IKeelU - Thursday, November 9, 2017 - link

    You sound like a terrible person.
  • milkywayer - Thursday, November 9, 2017 - link

    Ok. Now go save Sony or Microsoft on one of the Xbox One X posts. They need a savior like you.
  • Hurr Durr - Thursday, November 9, 2017 - link

    Leftist pain is always delicious. Give more!
  • 0ldman79 - Wednesday, November 8, 2017 - link

    Hmm...

    I've been kind of interested in getting an Intel card *just* for Quicksync. I will admit this has kind of dropped off since I got my 970. The hardware acceleration on it is a smidge better than it was on my 750 ti.

    Both encode H.264 quite well with a few tweaks. I get excellent quality and file size, though the tweaks do slow down encoding considerably and shift a fair portion of the load to the CPU.

    It is still three to four times faster than CPU encoding though.
  • sonny73n - Thursday, November 9, 2017 - link

    I've transcoded thousands of videos. Sorry to tell you that QuickSync and NVENC can't hold a candle to x264, quality wise. Of course, software encoding is much slower but that's where you get the quality output thru thorough analysis of the source.
  • AS118 - Wednesday, November 8, 2017 - link

    Well this is interesting, but given Nvidia and AMD's patents, I can't imagine this'll be easy or cheap for them. I wonder what the graphics will be for. My guess is AI and Deep Learning, possibly automotive.

    No matter what they make, they'll probably have to go to either AMD or Nvidia (if not both) for licensing though. The Raja hire might be to help them either design around patents, or just to get insights on how to license those cheaply, maybe in cooperation with AMD.
  • The Hardcard - Wednesday, November 8, 2017 - link

    Not only does Intel have a lot of their own patents, but they also have truckloads of semiconductor patents that AMD, Nvidia, Imagination, and other companies rely on. It won't be overly difficult to weave more back and forth cross licensing into the web.
  • Drumsticks - Wednesday, November 8, 2017 - link

    What an interesting week in our world.
  • HStewart - Thursday, November 9, 2017 - link

    Yes it is - it started out with something that looked disastrous for Intel - giving up to AMD on GPUs - and turned into an opportunity for Intel to lead in the GPU world.

    I believe it's not over yet; the Apple connection to this news is very interesting. Maybe even a connection with next-generation consoles - I may be wrong, but who knows.
  • rocky12345 - Wednesday, November 8, 2017 - link

    What a joke. The guy pretty much screwed up the Vega series, but hey, he hyped it all up so much that loyal AMD fans were so worked up for the release that when it got here and did not live up to the hype, everyone started spitting on the Vega chips. Then he says, oh, I am going on a 40-day vacation. I am sure Lisa Su was very happy to get him out of there and was hoping he would quit, because if he quits then it is him breaking the contract and AMD no longer owes him anything on it. Now here's hoping there was a non-compete clause in his contract.

    Good for Intel - they have lots of money to throw at a project, so Raja should be very happy there, at least until all of the promises he makes start to come back on him and Intel lets him go as well.

    I hope Lisa Su or whoever takes over his position can fix all the mess he has created over at AMD. I am sure glad he was not part of the CPU team; at least the Ryzens pretty much work as expected, perform fairly well, and do not drain the power grid like the Vegas do.
  • nevcairiel - Wednesday, November 8, 2017 - link

    I don't believe a single high-level manager can be held accountable for all the technical failures of a product. It's easy to pick them as scapegoats, but in reality it's never that easy.
  • beginner99 - Thursday, November 9, 2017 - link

    But it is. That's why a manager has such a high salary: the responsibility. A manager's main skills need to be social skills, to get people in line and working together. With AMD the main issue was probably money, but one could also argue that if he had been more convincing to the board he would have gotten more money to work with.
  • HStewart - Thursday, November 9, 2017 - link

    But who is really responsible? Raja was the manager, yes, but don't blame the guy for reading the writing on the wall. It's just funny how AMD fans loved it when Intel announced an AMD GPU was being added to Intel chips, but turn on the same guy who helped that happen when he goes to Intel.

    Congratulations to Raja for waking up and maturing in this world of CPUs/GPUs.
  • risa2000 - Thursday, November 9, 2017 - link

    Actually, the only "job" of any high-level manager is to be a point of responsibility. Of course he is not responsible for the particular implementation or technical design. But he is responsible for making sure the people reporting to him get all the resources and guidance needed to meet their goals, or for changing them (either the goals or the people) if they cannot. They are not paid for nothing.
  • DigitalFreak - Wednesday, November 8, 2017 - link

    Vega was too far along in the design phase when Raja joined AMD for him to have done anything with it. Don't be so butt hurt.
  • HStewart - Thursday, November 9, 2017 - link

    Raja saw the writing on the wall at AMD, with their over a billion dollars in debt. Vega can't be blamed on Raja, but on AMD as a whole.

    Lisa Su should wake up and face reality - the sleeping giant (Intel) has woken up.
  • pSupaNova - Thursday, November 9, 2017 - link

    How many times have we heard that?

    Intel is too late to this party, just like they were too late to the smartphone market.
    Nvidia has the all-important software stack sewn up in compute.
  • tipoo - Wednesday, November 8, 2017 - link

    I really do want to see the Intel dedicated GPU though.
  • euskalzabe - Wednesday, November 8, 2017 - link

    I don't like the idea of giving Intel any more power, but yeah, I also want to see that GPU.
  • Notmyusualid - Thursday, November 9, 2017 - link

    and me.

    Nvidia have fleeced me long enough.
  • atirado - Wednesday, November 8, 2017 - link

    I'm guessing Intel knows AMD will sue.

    I'm guessing he's received advice that he is in a defensible position.

    But yes, one hire does not make a GPU...
  • HStewart - Thursday, November 9, 2017 - link

    But this guy was the leader of a team - which also means a product's failure is not limited to the leader.

    Raja was smart, had inside information about AMD, and wanted out.
  • IKeelU - Thursday, November 9, 2017 - link

    Not smart enough to make a good GPU for AMD, it seems.
  • HStewart - Thursday, November 9, 2017 - link

    I believe Raja came back to AMD after the initial design of Vega - basically he didn't like what he saw happening at AMD and left for a better and more trustworthy opportunity.

    With his connections to Apple, my guess is that in the future we will see Apple using GPU products from Intel as well.
  • Vatharian - Wednesday, November 8, 2017 - link

    They get AMD to drop in silicon, cough up HBM wafers, and now suck in their chief engineer... that's seriously sketchy. Either Lisa Su is spewing blood through her ears, or she knew beforehand. Knowing the secrecy going on between those two, I doubt the press will really learn.
  • euskalzabe - Wednesday, November 8, 2017 - link

    I would highly doubt Su had no idea this was coming... they must have known at least partially.
  • Vatharian - Wednesday, November 8, 2017 - link

    It's not a given - if SHE sent him on the... "sabbatical", then he came back to bite her ass.

    But it's sketchy for a chief engineer not to have any grace period before working for the competition.
  • RaistlinZ - Wednesday, November 8, 2017 - link

    Truly. You'd think something like that would have been written into his contract, but I guess it wasn't.
  • Friendly0Fire - Thursday, November 9, 2017 - link

    They do have non-compete clauses in the contracts, but those clauses are null and void in California, which is both where Intel is headquartered and where Raja lives.
  • Bob Todd - Thursday, November 9, 2017 - link

    Non-competes can also be hard to enforce in very specialized fields that only have a handful of players.
  • HStewart - Thursday, November 9, 2017 - link

    Basically I think Intel and AMD made a deal and the chief engineer was part of the deal. This happened to me in a different way.

    This story sounds more real now than the original story about Intel using an AMD GPU - why would they do that when they have the ability to do it on their own? They only needed help from a good engineer.

    Now one thing that is still interesting is the possible Apple connection. Possibly Apple has found it better to work with Intel than AMD and is part of the push for this to happen - but who knows.
  • euskalzabe - Wednesday, November 8, 2017 - link

    WHOAAA... THIS. IS. INSANE. I didn't see this coming.
  • Pinn - Wednesday, November 8, 2017 - link

    I may have bought the Andy Grove Port Intel card and returned it. The only thing good about current Intel graphics is decent Linux support. Intel is desperate. I may still get the XPoint card, as my Intel 750 1.2TB card is being a champ and doesn't throttle for thermals.
  • Frenetic Pony - Wednesday, November 8, 2017 - link

    Hires the guy that just failed to do exactly that at AMD to do it for them. Now there's 2 possibilities:

    A. He was drastically underfunded as he implied and AMD's financial situation is at fault.
    B. The failure of Vega to upend, or even totally compete in, the market is his fault and Intel just made a bad hiring decision.

    It'll be interesting to see which, but a win for us consumers either way. 3 way GPU fight, 3 way GPU fight! Go go competition
  • Yojimbo - Wednesday, November 8, 2017 - link

    The answer is pretty obviously A.

    Koduri was AMD's CTO of graphics before 2009, though, so he may be responsible for some of the problems that beset AMD.
  • Friendly0Fire - Thursday, November 9, 2017 - link

    AMD made some good moves in the graphics space during that era though. GCN was a good arch early on, sometimes more power efficient than Nvidia's, and they managed to get their proprietary API basically enshrined into the standards as D3D12/Vulkan (which share a lot in common with Mantle).

    It's only the last few years that have been quite rough, and there's no telling if that's just underfunding. GCN's long in the tooth and should probably have been replaced by now, so that's my guess.
  • Yojimbo - Thursday, November 9, 2017 - link

    When GCN first came out, it was more power efficient than the Fermi architecture because it was newer than the Fermi architecture and because NVIDIA was still getting the ball rolling with the building of their architecture. Fermi was very inefficient, but Kepler showed that that inefficiency wasn't something that was inherent in NVIDIA's overarching design. GCN GPUs, however, have shown over the years that they consistently lag behind NVIDIA's GPUs in throughput efficiency, i.e. the percentage of execution units they can keep busy, as well as showing that they can't maintain as high of clock rates under load as NVIDIA's GPUs. I think the consistency of these situations and the fact that AMD benefits from DX12 as much as it does perhaps shows some issue related to the basic architecture of GCN. They fell behind pretty early, when Kepler was released, before the effects of AMD's reduced R&D spending should have caught up with them. I don't know if the memory bandwidth issues GCN GPUs have had are a result of separate failures or if they are also related to basic GCN architectural decisions.

    GCN definitely should have been replaced by now. Some of the issues have been slowly improved over the years, perhaps evidenced by the smaller performance advantage DX12 implementations have over DX11 implementations with Polaris and Vegas compared to AMD's earlier architectures. But I think that instead of allocating resources towards doing the work necessary to resolve the issues, AMD have put their resources towards their CPUs and have let their GPUs rot a bit.
  • BOBOSTRUMF - Wednesday, November 8, 2017 - link

    Timeline, my interpretation:

    Apple to Intel : " We want better Graphics in smaller package or we cancel all our deals and use ARM Cpu's in our MacBooks"
    Intel to Apple: "We can't do it, sorry, not smart enough in Graphics "
    Apple to Intel: "We have someone inside AMD, worked for us, we can trust him, he will deliver you the graphics, but you lower the prices !"
    Intel to Apple "Interesting, let us think a strategy"
    Intel to AMD "We want to licence your Graphics technology, we pay good money !"
    AMD's Lisa Su "No !"
    Intel executives "Only for Apple products, already our market, we will not compete on PC market, we PROMISE "
    AMD's Raja Koduri to Lisa Su " We must accept, Intel will give up making graphics, and we win money from licensing. WIN-WIN for us Lisa "
    Lisa Su "Alright Raja, I trust your judgement "
    Intel's Shareholders to Intel's executives" How much we pay for licence !"
    Intel executives "80-100$, a good deal"
    Intel's Shareholders "Nooo!, Apple already gives us too little for our CPU's, greedy bastards. And we want graphic cards of our own, with Intel's logo not AMD's. Can you hire the mole?
    Intel's executives "We have 10 lawyers to look over his contract, yes he can join another tech company if he resigns "
    Intel's Shareholders "Good, licence the technology but don't mention anything about any Apple's exclusivity. Use 50 lawyers if needed. Then hire the mole !
  • webdoctors - Wednesday, November 8, 2017 - link

    For some reason, every time you mentioned the mole I kept thinking of that scene from Austin Powers: https://www.youtube.com/watch?v=QEExYuRelbg
  • euskalzabe - Wednesday, November 8, 2017 - link

    LOL that was great :D
  • SquarePeg - Wednesday, November 8, 2017 - link

    That's worth a thumbs up.
  • xype - Thursday, November 9, 2017 - link

    You forgot the


    5 years later, Apple: "Lol we’re making our own CPUs and GPUs anyway, bozos!"

    :P
  • colemar - Thursday, November 9, 2017 - link

    BOBOSTRUMF,

    But there is no IP licensing in the Intel/AMD deal just announced. It's an Intel project where Intel funded AMD for a custom GPU design.
  • Inteli - Wednesday, November 8, 2017 - link

    If Intel ends up developing enthusiast/gaming GPUs instead of focusing on just neural network/compute GPUs, I would be very excited. We haven't had 3 players in the enthusiast GPU space for a very long time, and that third option could help balance out the market, provided Intel is competitive and RTG becomes competitive.
  • realistz - Wednesday, November 8, 2017 - link

    AMD was holding Radeon team hostage. This is a promotion for Raja.
  • tipoo - Wednesday, November 8, 2017 - link

    I appreciate filing this under "Woah".
  • Ryan Smith - Wednesday, November 8, 2017 - link

    It really is that kind of a week.

    Even if you bought the rumors of Raja being hired by Intel, announcing the hire and the initiative to produce dGPUs the very next day is quite a turnabout.
  • Yojimbo - Wednesday, November 8, 2017 - link

    Yeah, Qualcomm, Broadcom, Intel, NVIDIA... There's a big melee brewing. Most likely with IBM sitting on the sidelines. NVIDIA just wants to sell their GPU accelerators on every platform, but this move by Intel could force their hand to try to develop a platform themselves. Even if NVIDIA reacts just by staying their course, Broadcom seems to be gunning for Intel's data center dominance. Maybe NVIDIA buys IBM's Power unit. I don't think it's likely, but maybe NVIDIA can buy AMD and sell the graphics portion to Qualcomm :D Could Intel revoke the x86 license without risking anti-trust investigation? What would happen to Intel's access to AMD's x86 patents if they revoke the license?

    I was just trying to imagine yesterday what the market would be like if, when AMD approached NVIDIA to buy them before AMD bought ATI, AMD's board had accepted Jensen Huang's rumored demand to lead the combined company. Would they have made the same Fusion mistake under Huang? I have no idea. Fusion must have been the strategy that led AMD to pursue the merger, but NVIDIA seemed convinced that memory incompatibility would sink it, which turned out to be correct.
  • lmotaku - Wednesday, November 8, 2017 - link

    Because of how old x86 is now, it's partially open. Intel can only require a license for specific features, which AMD already pays for. AMD made the x86-64 instructions, for which Intel pays a license fee.

    Why exactly would Intel say "Stop adding new features to our old x86 instruction set!" "We never wanted to be able to use 64bit registers!" ?

    Would be kinda stupid to say "don't make improvements for us!" Since AMD made AMD64, Intel hasn't really changed x86 all that much. It's different than back in the day where you could put an AMD CPU on an Intel-based mobo and it used all the same designs. That would be a problem and Intel wouldn't let that fly today without royalties.

    Other than that, your other speculations are anyone's guess because I didn't pay close enough attention when that stuff was going on.

    I think ATi got a lot of flak back in the day, but they always did what they do now: provide better performance for less money. I don't care about wattage, never did. It's definitely a plus, but ppw has never been on my radar, only ppd, and AMD has continued to deliver that. They have a big enough spot on the playing field to keep going—especially now that they have dominated in consoles and iGPUs.

    This era of GPUs (R9 to RX to Vega), as far as ppw goes, has been underwhelming each time. R9 did fairly well, but RX was supposed to push them further. We were disappointed a little because they continued to just be "okay" or "good enough". Problem is, we expected more yet again from Vega. It didn't come, and Raja will get a lot of flak for this folly while Lisa Su will be praised for how well she did with the CPUs. It could have been a bad strategy, lack of money, or anything else that caused it, but I think AMD would have been better off keeping the ATi branding. Loyal fans have been waiting for ARG, or something, to come from AMD (ATi Radeon Group, for example). ATi stood upon "Technology you can trust" - and we still trust ATi - while RTG was flashy and modern for a bit, but we need a from-the-ground-up revival like Ryzen.
  • Yojimbo - Thursday, November 9, 2017 - link

    I think you misinterpreted the x86 part of my post. My point was Intel's reaction to NVIDIA buying AMD. Intel does not want NVIDIA to have an x86 license. If NVIDIA were to buy AMD and Intel could find a way to deny NVIDIA an x86 license, they would.

    Also, x86 is partially open, but that doesn't help a newcomer who wants to compete with Intel with a modern x86 CPU. A lot of software takes advantage of the newer instructions. Not being able to handle those instructions would be a no go.

    As far as ATI, look at the historical market share. NVIDIA and ATI were neck and neck back then, with NVIDIA usually a little out in front. AMD bought ATI when ATI's market share was near a high water mark (they had just passed NVIDIA for a short time, if I remember), and AMD's CPU business was just going sour. GCN was introduced in 2011, and I think it was the first architecture that was built from the ground up to enact AMD's Fusion strategy. It was designed to be an architecture for heterogeneous processors in APUs. Over the years it has showed its problems trying to be a big, discrete GPU. NVIDIA first took over the high end of the market and then with Maxwell took over most of the whole PC GPU market. AMD didn't have the money to fix the problems. I think Mantle was a desperate attempt to try to relieve some of the deficiencies of the architecture as a high-power discrete gaming GPU by exposing some of the features that were put into the architecture for AMD's heterogeneous computing plans. But, much like GPUOpen, developers don't like when a hardware company asks them to do the platform work for them, not when they have other options; it adds a lot of complexity and cost... for not much gain.

    As far as AMD's recent GPUs, they were "okay" and just "good enough" (I would argue they were not nearly good enough. From a consumer's perspective, once AMD lowered their operating margins to razor thin levels they may have been "good enough", but from a business perspective the fact that they had to operate with such low margins meant the products were not good enough) not because that's what they wanted them to be, but because that's what their architecture and their financial resources allowed them to be.

    You shouldn't be creating a dichotomy between Lisa Su succeeding with CPUs and Raja Koduri failing with GPUs. That is myopic. Lisa Su is CEO of all of AMD, and is responsible for the allocation of resources to various parts of the company. Firstly, before Lisa Su arrived, AMD made the strategic decision to pursue Fusion and design their GPU architecture around it, as discussed above. Later, after AMD's CPU business turned sour, AMD made a strategic decision to put resources into reviving their CPU business at the expense of their GPU business, which is what was bringing in money to them at the time. I think this decision must have been made when Lisa Su was at the company, but before she was CEO. However, Lisa Su continued with the strategy when she took over. In fact, I think she probably intensified it, because she enacted various cost cutting measures (as the company was facing a financial crisis). Look at AMD's R&D budget and compare it to NVIDIA's. Then consider that AMD is simultaneously trying to compete with Intel in CPUs. Finally, consider two more things: 1) The re-hiring of Jim Keller by Lisa Su (not yet CEO) in 2012, and 2) the relative success of Zen to AMD's GPUs. Just follow the probabilities to figure out where the money most likely was going. Raja Koduri and his RTG was making a Thermopylae-like stand. Only he doesn't get the adulations as Leonidas does, so he decided to go somewhere else. Could he have done better than he did? Maybe, I have no idea. Could he have taken the fight to the enemy? Certainly not.
  • Kvaern1 - Thursday, November 9, 2017 - link

    "If NVIDIA were to buy AMD and Intel could find a way to deny NVIDIA an x86 license, they would."

    The cross license agreement between AMD and Intel is not transferable:

    "(c) Termination Upon Change of Control. Subject to the terms of, and as further set forth in, Sections 5.2(d) and 5.2(e), this Agreement shall automatically terminate as a whole upon the consummation of a Change of Control of either Party."

    See https://www.sec.gov/Archives/edgar/data/2488/00011...
  • Yojimbo - Thursday, November 9, 2017 - link

    Yes, that was clearly implied by my post. If it were transferable, Intel would have no option to revoke it, right?
  • Kvaern1 - Thursday, November 9, 2017 - link

    If it was, I didn't get it, but that might be an issue with my non-native English.
  • Yojimbo - Thursday, November 9, 2017 - link

    No problem. By the way, it looks like the capture period of that agreement ended November 2014. I suppose they signed another one since then. Also, there's a whole lot of information redacted in that publication of the agreement.
  • Yojimbo - Thursday, November 9, 2017 - link

    I didn't find any newer one, so perhaps that's the last agreement they signed. If so, they aren't sharing any patents after November 2014.
  • Yojimbo - Thursday, November 9, 2017 - link

    Also, maybe one confusing thing is that I used the word "revoke". Technically, Intel doesn't revoke it, they just choose to not sign another one with the new entity. But in doing so they effectively revoke the grant of the license to the ongoing business that has changed hands.
  • HStewart - Thursday, November 9, 2017 - link

    So does this also mean that Qualcomm's x86 emulation is invalid, since they don't have a license? I personally think AMD has voided the original license since they made changes to the design.

    I also wonder, with the Intel/AMD GPU deal, does that give Intel a license to the technology? Or does Raja have to build a completely new technology, which I believe would be better, especially with Intel's vast CPU knowledge, production experience, and acquisitions.

    This week could be another thing: Intel's first step toward a purchase of AMD, which would really change things.
  • Yojimbo - Friday, November 10, 2017 - link

    "So does this also mean that Qualcomm's x86 emulation is invalid, since they don't have a license?"

    Qualcomm does x86 emulation? I didn't know about that. Can you be more specific?

    "I personally think AMD has voided the original license since they made changes to the design."

    That's not how a patent cross-license agreement works. It is an agreement that the two participating companies won't sue each other for doing something in their designs that the other company might think falls under a patent it holds. AMD isn't expected to copy Intel's design; they are simply allowed to implement features in their own designs which are mentioned in Intel's patents (and vice versa).

    "I also wonder, with the Intel/AMD GPU deal, does that give Intel a license to the technology?"

    No. Intel has access to patents to avoid legal troubles by designing a GPU through an agreement with NVIDIA signed in 2011 that expired in March 2017, if I recall correctly. So Intel is covered by all patents that NVIDIA filed before March 2017. The AMD / Intel chip deal does not involve any patent licensing. AMD is building a semi-custom GPU for Intel for Intel to put in their product. It's much like AMD building the SoC for the PlayStation 4 for Sony.

    "This week could be another thing: Intel's first step toward a purchase of AMD, which would really change things."

    Intel would have to spin off AMD's CPU business in order for the purchase to pass regulators. They'd probably rather let that sleeping dog lie than catalyze the passing of AMD's CPU business into the hands of a more dangerous rival (like Qualcomm, Broadcom, or NVIDIA). Additionally, shareholders might be angry and think it a poor use of capital to pay $5+ billion for AMD's graphics unit. It's not like the unit is in immediately usable shape for them to compete with NVIDIA in compute acceleration. Intel would still have to spend years developing a software ecosystem. Probably Intel thinks that during the years it will take them to develop the software, it can develop its own GPU for far less than $5 billion. Just my guess.
  • MCX151 - Wednesday, November 8, 2017 - link

    I wonder what kind of IP docs he signed? I suppose if AMD finds his fingers in the pot of Intel tech, Intel should be ready to get their checkbook out...

    As for this guy, he delivered a crap product... Maybe he will get his act together and make something truly competitive with Nvidia. Doubt it tho, if he couldn't figure it out with ATI, then he's not gonna get much done with Intel. My opinion of course! :D

    Now maybe ATI can move on and design better products without this loser.
  • ssnitrousoxide - Wednesday, November 8, 2017 - link

    It doesn't matter if Raja really is responsible for the Vega flop. Intel needs a force to contain nVidia, and so does AMD. So any move that limits nVidia's expansion is beneficial to all parties.
  • Yojimbo - Wednesday, November 8, 2017 - link

    I don't think Koduri is responsible for the Vega flop. AMD's strategic decisions are responsible for their GPU woes. Firstly, they oriented their GCN architecture for a gamble on their Fusion line instead of for what was best for a discrete GPU. Then, when their CPUs started to be far outdistanced by Intel's and they got into financial difficulty, they both cut their overall R&D budget and shifted their resources to resolve the CPU problems. That left little money to work out their long-term GPU problems while simultaneously trying to build competitive GPU products in the short term. They did not have the money to flesh out their compute-oriented software ecosystem, either. So when Koduri came back to AMD he was given an overwhelming challenge. If Intel thought he was a failure they probably would have hired someone else.
  • Yojimbo - Wednesday, November 8, 2017 - link

    Unless Intel's plan is to play dirty and try to cut into NVIDIA's profits to reduce their financial might (which seems a bit dangerous, like trying to corner a tiger), the reason Intel want to develop a high-end discrete GPU is for the compute space, not for graphics. They just don't want to publicly undermine the rest of their scattershot methods of trying to combat NVIDIA's GPUs in the data center (Xeon Phi, AVX-512 on Skylake, Nervana, Altera) while their GPU is under development. They also don't want to make any promises before they have any idea what they can come up with.

    I think it's probably too late. The time to make a GPU was back in the 2000s, instead of working on the Larabee project. But I wonder how NVIDIA will respond. Do they continue to focus solely on GPUs? Or do they worry that Intel's market prowess can convince system builders to use Intel's GPUs instead of NVIDIA's? If the latter, they might try to develop their own CPU, look to make an acquisition, or pursue a merger.
  • edzieba - Wednesday, November 8, 2017 - link

    I'd love to see a really radical architecture from Intel, either building on the base of Larrabee (according to alums like Abrash and Forsyth, Larrabee for graphics was functional but never shipped: https://tomforsyth1000.github.io/blog.wiki.html) or something new entirely. Current GPU archs from both AMD and Nvidia are based on throughput optimisation: shove as much work in at once as you can, keep individual cores idle for as little time as possible, nobody really cares as long as a whole screen gets drawn at the end. For VR, you want latency optimisation: getting jobs done as fast as possible, and ideally doing them in the order your display will read them out ('race the beam'). Whether that can mesh well with current/future GPGPU demands remains to be seen.
  • BreakArms - Wednesday, November 8, 2017 - link

    Larrabee likely sucked very badly. It was a self-contained system running a FreeBSD kernel and intercepting DX calls to x86 cores; there's just no way it was going to be competitive at any price they could sell it for.
  • wr3zzz - Wednesday, November 8, 2017 - link

    What is Raja Koduri's track record? From what I've read it seems he is more a self-promoting managerial type than true architect star like Jim Keller.
  • peevee - Thursday, November 9, 2017 - link

    careerist
  • Freakie - Wednesday, November 8, 2017 - link

    "I would happily argue that outside of Apple, most other PC OEMs don’t “get it” with respect to graphics, but at this juncture that’s beside the point."

    I think you don't really "get" laptop graphics, Ryan. The reason why GT4 isn't in other laptops, is because the price Intel charges for it. When laptop OEMs can put a more powerful CPU and GPU in a laptop with longer battery life, in the same size as the MBP, at a cheaper price than the MBP, then of course they aren't going to do GT4. To charge what it costs for GT4 to consumers is something only Apple can do because Apple consumers, in the end, don't care what the hardware is inside so long as it's not utter crap. This isn't a matter of other OEM's not "getting" laptop graphics, its a matter of OEMs actually making what their customers want, which are laptops that beat Apple's in every way except for the Apple logo, and the OEMs succeed in that.

    Sure, GT4 might be a bit more power efficient, but the price difference from a discrete GPU just as (or more) powerful than GT4 can be spent on a slightly larger battery, which would add about 2-4 ounces to the end weight, at most, to make up for the power efficiency difference. And of course with a discrete solution they can more evenly spread out the thermal load, and add an extra fan on top of that, further decreasing the amount of heat that reaches the user. It makes for a much cooler product than even Apple can make using GT4.

    But since you said you would happily argue the point, I'm open to the fact that I might be missing something! Is there a reason why an OEM should go with GT4 over a discrete graphics solution?
  • vladx - Thursday, November 9, 2017 - link

    Well said, the only reason Apple buys GT4 SKUs is because they know they can fleece iSheeps later on with their products.
  • lilmoe - Thursday, November 9, 2017 - link

    You're not missing any point. Lots of Apple fans don't get it; some _don't want_ to get it.

    Luckily, this Intel/Raja move won't be panning out anytime soon, and Ryzen Mobile is almost in products.

    It's a time race; Zen 2 and 7nm aren't too far away either. Hopefully along with Navi.

    Now, what EVERYONE reeaaly doesn't get is that Zen and Vega WILL shine in low power applications. Lots of server, and ALL of mobile.
  • TEAMSWITCHER - Thursday, November 9, 2017 - link

    I must have two computers at my disposal in case of a failure. No customer support plan is good enough to guarantee zero down-time, and delivering for my clients is critical. I have found that the Desktop + Laptop combination is a good solution without complete redundancy. The Desktop provides performance, the laptop provides mobility. Each machine backs up the other.

    I want my desktop to have powerful GPU(s), where I can enjoy a large 4K display whether I'm working or gaming. Laptops are not good for gaming; the screens are small, the speakers are weak, the fans are loud - it's a terrible experience even with discrete graphics. Intel's integrated solutions provide enough 3D support for my work-related tasks, either on the road or when my desktop computer is down. I can sacrifice ultimate performance for a thinner and lighter device.

    Best of all, I don't have to call my clients and tell them a sob story about my dead computer.
  • at80eighty - Wednesday, November 8, 2017 - link

    GOOD

    anything that brings strength to more players is welcomed.
  • lilmoe - Thursday, November 9, 2017 - link

    Good? We're talking Intel here. This isn't about balancing the market...
  • IKeelU - Thursday, November 9, 2017 - link

    Seriously. This move is (potentially) like giving the 800-lb gorilla another weapon. Nvidia's stock might have gone through the roof in the last 3 years, but they have nowhere near the clout that Intel has.
  • dothackjhe - Wednesday, November 8, 2017 - link

    I do not know his full story with AMD. But, damn, just when AMD is finally ahead of the competition, that's the time he chose to leave. Worse, he jumped ship straight to a direct competitor.
  • Icehawk - Wednesday, November 8, 2017 - link

    Since when can any of Intel's graphics models keep up with a $150 card? That's GTX 960/1050/RX 560 territory, and those will smoke anything Intel has ever put out.
  • Cliff34 - Wednesday, November 8, 2017 - link

    I love the creative headline..keep it up :)...

    All jokes aside, I thought he signed some sort of clause with AMD that he wouldn't jump ship to a competitor. There are so many trade secrets from AMD he could take and use at Intel.
  • IKeelU - Thursday, November 9, 2017 - link

    That contract would be unenforceable in California.
  • serendip - Wednesday, November 8, 2017 - link

    One word: antitrust. If Intel grabs the CPU and GPU markets for computing devices larger than a smartphone, then expect antitrust proceedings if Intel's bundling of CPUs and GPUs is seen as anticompetitive against AMD and Nvidia. Also, at this point, AMD is in a weird spot with the new Zen lineup and also EMIB GPU collaboration with Intel.
  • Kvaern1 - Thursday, November 9, 2017 - link

    I think you might be living in the past. It's been a while since X86 was the only choice for consumer computing needs.
  • Alexvrb - Wednesday, November 8, 2017 - link

    "their existing GT4-class iGPUs, which are, roughly speaking, on par with $150 or so discrete GPUs."

    Maybe in the mobile market. I don't know prices for MXM cards. But for desktop discrete graphics not even close. In actual gaming performance they're probably closer to an RX 550, even with that pile of eDRAM. Unless you were talking about when GT4e first was released.
  • tyger11 - Wednesday, November 8, 2017 - link

    In the immortal words of noted industry expert Bugs Bunny, “You realize, of course, this means war.”
  • Zingam - Thursday, November 9, 2017 - link

    Let's count how many times somebody writes: Vega is a flop!
    Let's count how many times somebody explains why Vega is a flop!
  • sseemaku - Thursday, November 9, 2017 - link

    Very interesting times; I wonder how an Intel discrete GPU will perform!
    And a minute of silence for the people who said Intel was giving up on GPUs after they signed a contract with AMD. Looks like AMD made a mistake!
  • systemBuilder - Thursday, November 9, 2017 - link

    They had better do something; they haven't made a faster chip since 2013. The Haswell Iris Pro 5200 was their fastest mobile graphics then, AND NOW! Their marketing bimbos have been hard at work inventing meaningless names for their stagnant, rotting product on be ...
  • systemBuilder - Thursday, November 9, 2017 - link

    The article and recent vapor launches (i7-8xxx shortages) suggest that Intel is in an unprecedented position of weakness ...
  • Alexvrb - Thursday, November 9, 2017 - link

    I do think they launched prematurely (from an inventory standpoint) to get something out to deal with Ryzen's marketshare gains. They realized that if they "launched" the product, they could get people who might otherwise have jumped ship to stop and wait. But Coffee Lake is a good lineup (especially the cheaper non-K variants, annoyingly), and eventually supply will catch up.

    I hope AMD keeps iterating on Ryzen at a regular cadence, and expands to cover even lower TDP (tablets). I don't think they'll get another chance to really make a comeback.
  • Notmyusualid - Thursday, November 9, 2017 - link

    I suspect ddriver will read this story, take it like a death in the family, and with any luck, self implode.
  • azrael- - Thursday, November 9, 2017 - link

    So, Raja leaving AMD is a pipeline story, but Raja joining Intel is breaking news?
  • Ryan Smith - Thursday, November 9, 2017 - link

    When Intel is announcing that they're going to be producing discrete GPUs, yes. Raja isn't even the most important part here; it's Intel's titanic shift in their GPU strategy that is.
  • TEAMSWITCHER - Thursday, November 9, 2017 - link

    Does this mean that the recently disclosed Kaby Lake-G processors (with an AMD GPU incorporated into a multi-chip package) are only an interim solution? Soon to be replaced with Intel's own GPUs on a multi-chip package? Couldn't they have done that themselves now? Why did they need AMD?
  • Yojimbo - Thursday, November 9, 2017 - link

    I'd assume that this news means Intel will eventually replace AMD's GPU with their own if there is a follow-up product that comes out in the appropriate time frame. But I don't think anyone outside Intel can guess when Intel thinks it will have such a GPU. It depends on how long they've already been working on a discrete GPU and what strategy they take to achieve it.

    I'd also assume that they need AMD now because they don't currently have an appropriate GPU themselves. That seems to suggest either they wanted this product out in a particular proximate time frame or they won't have an appropriate GPU in the near future. Those two alternatives don't tell us very much, so everything is pretty much clear as mud.
  • Iamzedanger - Thursday, November 9, 2017 - link

    People have been focusing on what this means in relation to Nvidia, but I think we should also be thinking about AMD. I think Kaby Lake-G exists not just because of Apple, but also to thwart 45W Ryzen mobile offerings. Odd, yes, that Intel had to get a GPU from AMD itself to combat AMD, but that's exactly it: they couldn't do anything else. I think 45W Ryzen will come, why not? AMD finally has a core that can compete with Intel, and they have the far better GPU; they can absolutely blow Intel away at 35/45W. Intel can't compete if AMD launches a 45W Ryzen with a really good iGPU. So they actually just got some Radeon for themselves.

    Same thing in the desktop space: Intel can see the damage AMD will do with their APUs once they are released. Fast Ryzen cores with a Vega iGPU? It will be really tough for Intel, especially if there's asymmetric CrossFire with discrete Vega GPUs. And it won't stop. Intel is basically forced to do something about their graphics, or be left behind. They can't partner up with Nvidia. They can't partner up with AMD on desktop. So they have to create their own GPUs. Of course they also had to address the HPC and other markets, so it's a win-win.
  • iTon - Thursday, November 9, 2017 - link

    I'm not sure Intel will succeed in graphics even though they hired Raja Koduri, loyalty or not.
  • hescominsoon - Thursday, November 9, 2017 - link

    Intel's problem has always been their absolutely crap drivers. If this hire resolves that then it is a good move.
  • willis936 - Thursday, November 9, 2017 - link

    Yeah because when I think "high quality graphics drivers" I think "Radeon".
  • willis936 - Thursday, November 9, 2017 - link

    "aja’s move to Intel has definitely made AMD’s job harder"

    typo: you mispelled easier
  • cap87 - Thursday, November 9, 2017 - link

    You're in denial, bud. If anyone has the resources to take HBM2 to full-scale deployment and adoption, it's Intel. AMD could only dream of having the budget Intel devotes to R&D, and now that they've snatched Raja Koduri away, it's only a matter of time before he starts courting other GPU developers and gets the ball rolling real quick. Unless Intel drops the ball, AMD is in for a world of pain.
  • willis936 - Thursday, November 9, 2017 - link

    RTG has been bad since long before ATI’s acquisition. Seeing Raja through the door can only help Lisa build a proper GPU architecture.
  • Dragonstongue - Thursday, November 9, 2017 - link

    Could be Raja had already worked his arse off to ensure Navi (and maybe even further out) was already designed by "his team" while he was lead, and he "jumped" to Intel so that both AMD and Intel are able to directly address Nvidia GPU concerns.

    I'm saying Navi, like much of what came before it including Polaris, was more or less already "done" by the time Raja became lead of the graphics division. Vega might have been him adding muscle where muscle was needed, much like Keller did for Zen: "optimize", whereas Zen+ and Zen++ are very much driven by Keller's team. Raja was likely similar, brought onboard to smooth the transition and give the group a fighting future, because Polaris/Vega/GCN, no matter who looks at them, are NOT terrible designs, far from it; they just need more time to really optimize them. Maybe gaining more experience from Intel in regards to lowering actual power use will help Radeons going forward just as much as Radeons will help Intel's own graphics compete.

    AMD already has potential direct competition to Nvidia graphics. Whether folks like to believe it or not, Radeons have more or less always been able to take on Nvidia from the lowest end to the highest end of the GPU space (not to mention they seem to be made "fatter", whereas Nvidia designs generally only seem to be "fat", that is, with extra functions for hashing, advanced DX features, etc., at their highest end).

    Maybe the story is more complex: AMD knew Raja was going to do this all along once RTG was "functional" again, so he went to Intel to act as a man in the middle, an RTG for Intel as well, so they can directly help each other forward on graphics licensing deals while maintaining their mostly-duopoly x86 CPU status quo. Maybe that was AMD's deal with Intel: if you want our help for your graphics, Raja MUST be the middle man who formed both divisions (or at least was/is leader of both).

    The Radeon group was "disbanded" before Raja came back to AMD, and he "made" RTG happen. Maybe the long-term goal for AMD and Intel was exactly as we now see it (Raja to ensure a smooth transition for both companies in graphics??). For Intel and AMD to be able to compete on many fronts, not only x86, they needed folks like Jim Keller and Raja to "lead them forward".

    Interesting to say the least, probably a well engineered plan overall, not as "cloak and dagger" as it appears?
  • agilesmile - Thursday, November 9, 2017 - link

    Dragonstongue, I'm also thinking AMD and Intel negotiated this move and the GPU integration was part of it. Probably Raja wanted to move on to a new project, but he likely keeps good relationships with his AMD fellows.
    I also agree that this move may help break Nvidia's monopoly in the compute space. I think the high-margin use cases are Linux-based and the Nvidia stack is all closed source. Now with Intel in, it's likely an OSS stack will appear/mature for Intel and AMD GPUs so they get leverage over their competition (because of higher dev interest).
  • SaturnusDK - Thursday, November 9, 2017 - link

    So the question is whether we'll see more sacred cows slaughtered in the coming months and have Nvidia announce a collaboration with AMD to deliver custom Ryzen-based CPUs paired with their discrete graphics on a single package to fight Intel?
  • Dragonstongue - Thursday, November 9, 2017 - link

    The GCN uarch is NOT at fault. AMD cannot seem to "trim it back" at the lowest end to keep power in check, and they do not have the financial grunt to ensure devs are able to code for it properly. That being said, all the advanced "grunt" for workstation and the like pays in spades where that extra "stuff" can be, and is, more effectively leveraged. Nvidia is a lean, mean, "basic needs" fighting machine at the lower end, where their parts are not capable of the most advanced things in DX10/11/12 like Radeons are, because they "trimmed the fat"; that is fine for a purpose-built "race car", but not everything wants a purpose-built "race car" design. Pascal (except at the highest end) is far more a tweaked/custom "tuner", where GCN has always been an all-out, top-tier, all-bells-and-whistles F1/Le Mans type racer.

    The GCN uarch has always been VERY solid, except for some memory-feeding issues in some ways, but it really depends on the section of the market the intended card is competing in. For example, my good old 7870 is holding up VERY well and to this day has not been more or less completely "replaced" at similar grunt, similar power levels, a specific bus size, shaders, transistor density, etc. One thing GCN is for sure GREAT at is transistor density; it is likely just not easy to figure out how to turn things off and on at will to keep performance/power/temperature in check, because on an "area density" basis GCN/Polaris/Vega is still technically more "efficient" than Pascal. If you took all the fancy features Radeons have and planted them on Pascal, or conversely took all the fancy power-saving, "lean mean" stuff from Pascal and transplanted it to GCN/Polaris/Vega, power would likely go up noticeably for Nvidia and massively down for Radeons.

    That is all I am saying: it takes a lot of power to have all those fancy bells and whistles. That is why Fermi, Kepler, and Maxwell used LOTS of power but were more "fleshed out" than Pascal technically is. Trim a chunk of that away, use fancy clock gating, ramp the clocks up, and essentially (bad analogy) they took a run-of-the-mill, full-weight Porsche, stripped out all the seats, used extensive carbon fibre, ramped up the turbos, etc.: a full-out custom "tuner". The Radeons going back to the 2000 series (3870, 4870, 5870, 6870, 7870, 280, 380, 460, 580) for the most part did not truly "trim the fat": more or less the same raw given performance for the same raw given power required, whereas the similar competing Nvidia designs went from "hugely powerful, requiring lots of power" to "less power, technically less powerful, but higher clocked".

    Many different ways to look at it I suppose, hard to directly compare what used to be a "behemoth" that was "put on a diet" but learned to run very fast vs a behemoth that is ok being a behemoth and sometimes has issues curbing its appetite.

    I think (IMO) if AMD/Radeon could figure out how to turn all that extra grunt off and on at will while retaining the "full fat" design, it will end up being killer grunt; whereas for Nvidia to put that grunt back after they did everything to "strip the fat" probably wouldn't work as well. It's easier to "optimize" power down than to "power up".

    Interesting, very very interesting.
  • FreckledTrout - Thursday, November 9, 2017 - link

    Honestly, AMD's biggest issue is their process. It's nailing them on both Zen and Vega right now. That 14nm process just scales up in frequency like crap. I mean, look at an underclocked Vega 56 or Ryzen at lower clocks: very efficient. More than anything, getting on that new 7nm process is the biggest thing they need to do right now.
  • Yojimbo - Friday, November 10, 2017 - link

    Eh, NVIDIA uses Samsung's 14 nm process for the 1050 and 1050 Ti. It looks slightly less optimal than TSMC's 16 FF+, but it's not that big of a difference. The GlobalFoundries 14 nm process is supposed to be very close to Samsung's; it just took GlobalFoundries longer to get it up and running.
  • peevee - Thursday, November 9, 2017 - link

    "Intel has the financial and fabrication resources to fight NVIDIA, something AMD has always lacked"

    Not always. There was a time when what is now GlobalFoundries was part of AMD, while NVIDIA has never had its own production.
  • peevee - Thursday, November 9, 2017 - link

    What would be helpful in the article is a performance comparison graph (in FLOPS) between Intel's solutions and the ranges from AMD and NVIDIA, not just abstract EU counts compared only to each other.
  • Stuka87 - Thursday, November 9, 2017 - link

    Soon as I saw Raja was leaving AMD I said to those around me "He is going to Intel". And sure enough, here he is.
  • peevee - Thursday, November 9, 2017 - link

    At this point x86 is so inappropriate for the modern chip technology and computing requirements, it holds the whole industry down. ARMv8 is not much better.
  • PeachNCream - Thursday, November 9, 2017 - link

    That's like pretty much every modern CPU though. o.O What other option is there right now that could replace one or both of them?
  • peevee - Thursday, November 9, 2017 - link

    There is no option. Only somebody with resources of Intel or AMD or Qualcomm can fully develop one.

    But ideologically, it should be realized that the current limits are power efficiency and the physical length of the interconnections transmitting most signals (which is again a primary determinant of power), and the only performance that matters anymore is processing of large datasets in a massively parallel manner. And nobody cares about programming in machine code or assembly anymore, only stack-based high-level programming languages. It is not 1945 anymore; von Neumann should be abandoned and the whole architecture rethought from scratch.

    And that logically leads to:
    1) as little of unnecessary processing as possible (so less caching and other data copying including to-from registers, no speculative execution etc)
    2) Processing should happen as close to memory holding the data to be processed as possible, preferably within 1mm.
    3) temporary unused transistors are OK as long as they do not make signal paths longer and can be used when the data in the memory nearby needs to be processed in massively parallel manner.

    It could be solved by an architecture which does not have any general-purpose registers and as few specialized as possible (IP and SP) to be able to redistribute state to many cores as cheaply (in terms of power) as possible. Directly-cached (non associatively) top of the stack is the register file, full-stop.
    Bit-based addressing and processing (because you don't need to unpack and process 16 bits when you need 10 bits for video etc).
    Hardware map-reduce to distribute the work close to the memory where data is, not a piecemeal and fixed-width vector operations.
    Compressed (not reduced) instruction set.
    Native, transparent virtualization.
  • karthik.hegde - Thursday, November 9, 2017 - link

    I really like how you describe it, and I believe the way to do it is to build a true heterogeneous processor, where there are special-purpose blocks for every possible workload it would run, and the blocks are power-gated when not in use. Along with this, there is a normal CPU as a control block to churn out any workloads that are not part of the special-purpose modules.
  • peevee - Thursday, November 9, 2017 - link

    "where there are special purpose blocks for every possible workload it would run "

    I think it should be the other way around. All these costly and 99.99% useless fixed-function blocks are there just because the outdated, 1940s basic architecture cannot handle any of the loads.
  • jwbarker - Thursday, November 9, 2017 - link

    This certainly puts his farewell letter in a new light: "I will be following with great interest the progress you will make over the next several years." Yes, I bet you will. I wonder if he giggled when he wrote that?

    ---

    On the GPU front, I hope it is, as many suspect, that Intel is targeting the deep learning market. That segment is in desperate need of competition, even if it's Intel.

    Though they will need to learn a lesson that AMD so far hasn't: the hardware is incidental compared to the software. By which I mean that they're not competing with NVidia GPUs, but rather with CUDA. Look at all the major deep learning toolkits and they offer two options: CPU mode or CUDA. And the reason is that NVidia invests a lot of money both in developing libraries (cuDNN) and in supporting the integration of their hardware into those toolkits.

    My lab is looking into purchasing a small GPU accelerated setup for experiments and we didn't even consider AMD GPUs simply because CUDA is so entrenched.

    If Intel wants to get one over on NVidia, this is what they'll have to change. Luckily, unlike AMD, Intel has the money for a massive effort to integrate support for their hardware into popular toolkits.
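    To make the point concrete, here is what device selection looks like in a typical toolkit today (a minimal PyTorch sketch with a made-up toy model; the only accelerator switch the framework ships out of the box is the CUDA one):

    import torch
    import torch.nn as nn

    # Hypothetical toy model, just to show the device-selection pattern.
    model = nn.Sequential(nn.Linear(1024, 256), nn.ReLU(), nn.Linear(256, 10))
    x = torch.randn(64, 1024)

    if torch.cuda.is_available():          # NVIDIA-only check; there is no
        model, x = model.cuda(), x.cuda()  # built-in switch for AMD or Intel GPUs
    out = model(x)                         # GPU if CUDA is present, otherwise CPU
    print(out.shape)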
  • peevee - Thursday, November 9, 2017 - link

    "My lab is looking into purchasing a small GPU accelerated setup for experiments and we didn't even consider AMD GPUs simply because CUDA is so entrenched."

    Why not OpenCL and not be bound to a single provider forever?
  • jwbarker - Thursday, November 9, 2017 - link

    As I said, CUDA is entrenched. The tools don't support OpenCL. Again as I said, NVidia spends a lot of effort/money to get CUDA support into the tools. Nobody does that for OpenCL.
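    For what it's worth, the vendor-neutral path does exist at the API level. A minimal sketch, assuming the pyopencl bindings are installed, that just enumerates every OpenCL device regardless of vendor; the problem is exactly the one above, that the deep learning toolkits don't build on it:

    import pyopencl as cl

    # List every OpenCL platform/device visible on the system, from any vendor.
    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print(platform.name, "->", device.name)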
  • Hxx - Thursday, November 9, 2017 - link

    Unlike AMD, which was barely scraping by even when it acquired ATI, Intel actually has the money to start a discrete GPU division. I am super excited. If anyone can threaten Nvidia, it will likely be Intel.
  • zodiacfml - Friday, November 10, 2017 - link

    Finally. I thought they would just ignore this market. Whenever I read an article touting the benefits of GPUs for supercomputing, I have to scratch my head over why Intel never did something about it.

    They have to conquer the gaming market for better scale. I just hope Intel goes all out in this war, e.g. using the latest manufacturing node for the GPUs. Anyway, once they release 10nm products, GloFo and TSMC are just months behind with 7nm.

    I hope to see a product early next year on 14nm to address the shortages and high prices of gaming GPUs these days.
  • zodiacfml - Friday, November 10, 2017 - link

    Exciting times. This reminds me of when AMD acquired ATI. This is a high-tech war between semiconductor giants, including ARM and its licensees. There seems to be a desire from everyone to go into AI/deep learning.
  • mdriftmeyer - Saturday, November 11, 2017 - link

    Try two to three years.
  • versesuvius - Friday, November 10, 2017 - link

    Is this stupid or what? Are we actually led to believe that there are only 2 people in the world who are capable of designing a high end discrete GPU and that one of them is permanently working at NVIDIA and the other is jumping around from AMD to Apple to Intel dispensing miracles and magic?
  • peevee - Friday, November 10, 2017 - link

    It is. But the executives who hired him needed to protect their behinds in case of a failure. "But we hired the best, with proven track record, blablabla".
  • Santoval - Saturday, November 11, 2017 - link

    I thought Intel's GPU issues (and those of the GPU business by and large) were not a matter of capable lead architects, resources, and manufacturing prowess, but one of patents, since almost all GPU patents are owned by companies other than Intel. I also thought that Intel's iGPUs since at least 2011 were based on an Nvidia patent cross-licensing deal, which lapsed in April 2017 and was not renewed. So on what patents will their discrete GPUs and new iGPUs be based? Will they license patents from AMD, ARM, Imagination (PowerVR), or renew their Nvidia deal? I even wonder how they are apparently OK with their *current* iGPUs being sold, since their Nvidia deal lapsed. As far as I know their deal with AMD did not involve any patents; they simply bought certain GPUs from them for their mid-power CPUs (largely intended for Macs).
  • Vlad_Da_Great - Saturday, November 11, 2017 - link

    @Santoval. Both AMD and NVDA. The problem is NVDA CANNOT sue for the next 10-15 years. Intel has not renewed and will never renew their licensing agreement with NVDA. Time for them to go into the $2-per-share zone. Just SELL.
  • tn_techie - Sunday, November 12, 2017 - link

    Intel had a $1.5B, 6-year litigation-settlement cross-licensing agreement with Nvidia that expired on March 31st, 2017. It was widely assumed that the expiration of this agreement implied that Intel would no longer have access to Nvidia's licensed IP portfolio. In other words, Intel would no longer be allowed to use Nvidia's patents in its own home-made iGPUs, starting from April 2017.
    In reality, that deal granted Intel the right to use Nvidia's IP portfolio until the patents' expiration, not the licensing deal's expiration.

    "The term of the patent cross-license agreement continues until the expiration of the last to expire of the licensed patents."

    So basically, although the licensing agreement did expire, Intel has the right to continue to use Nvidia's patents that were filed on or before March 31st 2017, indefinitely and in perpetuity, since any patent is up-for-grabs once it expires.

    However, what the licensing agreement expiration means is that Intel won't have access to Nvidia's new patents that were filed on or after April 1st 2017.
  • tamalero - Monday, November 13, 2017 - link

    I guess Raja couldn't pull a Jensen Huang by demanding complete control of the graphics division, and Lisa told him to GTFO after he failed to deliver.
  • corinthos - Friday, November 17, 2017 - link

    At long last, the Larrabee dream will come TRUE!
  • finefunny - Wednesday, November 22, 2017 - link

    Are AMD or Intel Processors a Better Choice for My Needs?

    In any processor comparison, it is important to determine what you're looking for out of a processor. Are you seeking speed? Graphics performance? Affordability? Flexibility? Do you play games, primarily, or are you seeking a simple processor to allow for multi-tasking while browsing the Internet? Knowing the answers to these questions can help you decide which processor is right for you.

    If you are looking for a more cost-effective processor, AMD processors may be the best bet. For students, budget gamers, and individuals with straight-forward computing needs, AMD processors are a great choice – they are powerful and fast enough for most while undercutting competitor pricing. And yet, you give up little when going with AMD chips. They can be "overclocked" for increased speed, provide robust and vibrant graphics, and are available in multi-core setups for multithreaded performance.

    On the higher end of the spectrum, Intel processors shine. If you are seeking performance that is at the upper range of the market, there is sure to be an Intel chip that suits your needs. You may pay more for it, and you may give up flexibility for this performance, but the processor itself will provide you with lightning speed, incredible capability, and beautiful graphics. In terms of outright power, it's hard to beat Intel processors.
  • laptopcharger - Monday, November 27, 2017 - link

    We can see that by hiring an experienced hand like Raja Koduri and by announcing that they are getting into high-end discrete GPUs, they are working towards a great lead.
