Seeing Is Believing: Intel Teases DG1 Discrete Xe GPU With Laptop & Desktop Dev Cards At CES 2020
by Ryan Smith on January 9, 2020 12:00 PM EST

While CES 2020 technically doesn’t wrap up for another couple of days, I’ve already been asked a good dozen or so times what the neatest or most surprising product at the show has been for me. And this year, it looks like that honor is going to Intel. Much to my own surprise, the company has their first Xe discrete client GPU, DG1, at the show. And while the company is being incredibly coy on any actual details for the hardware, it is nonetheless up and running, in both laptop and desktop form factors.
Intel first showed off DG1 as part of its hour-long keynote on Monday, when the company all too briefly showed a live demo of it running a game. More surprising still, the behind-the-stage demo unit wasn’t even a desktop PC (as is almost always the case), but rather a notebook, with Intel making this choice to underscore both the low power consumption of DG1 and just how far along the GPU is since the first silicon came back to the labs in October.
To be sure, the company still has a long way to go between here and when DG1 will actually be ready to ship in retail products; along with the inevitable hardware bugs, Intel has a pretty intensive driver and software stack bring-up to undertake. So DG1’s presence at CES is very much a teaser of things to come rather than a formal hardware announcement, for all the pros and the cons that entails.
At any rate, Intel’s CES DG1 activities extend beyond their keynote laptop demo. The company also has traditional desktop cards up and running, and it gave the press a chance to see them yesterday afternoon.
Dubbed the “DG1 Software Development Vehicle”, the desktop DG1 card is being produced by Intel in small quantities so that they can sample their first Xe discrete GPU to software vendors well ahead of any retail launch. ISVs regularly get cards in advance so that they can begin development and testing against a new GPU architecture, so Intel doing the same here isn’t too unusual. However, it’s almost unheard of for these early sample cards to be revealed to the public, which goes to show just how different a tack Intel is taking in the launch of its first modern dGPU.
Unfortunately there’s not much I can tell you about the DG1 GPU or the card itself. The purpose of this unveil is to show that DG1 is here and that it works – and little else. So Intel isn’t disclosing anything about the architecture, the chip, power requirements, performance, launch date, etc. All I know for sure on the GPU side of matters is that DG1 is based on what Intel is calling their Xe-LP (low power) microarchitecture, which is the same microarchitecture that is being used for Tiger Lake’s Xe-based iGPU. This is distinct from Xe-HP and Xe-HPC, which according to Intel are separate (but apparently similar) microarchitectures that are more optimized for workstations, servers, and the like.
As for the dev card, I can tell you that it is entirely PCIe slot powered – so it draws under 75W, the most a slot can deliver – and that it uses an open-air cooler design with a modest heatsink and a slightly off-center axial fan. The card also features 4 DisplayPort outputs, though as this is a development card, this doesn’t mean anything in particular for retail cards. Otherwise, Intel has gone with a rather flashy design for a dev card – since Intel is getting in front of any dev leaks by announcing it to the public now, I suppose there’s no reason to try to conceal the card with an unflattering design; instead, Intel can use it as another marketing opportunity.
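Since these cards exist specifically to seed ISVs, it's worth noting what the very first step of bringing one up typically looks like: confirming that the GPU actually enumerates on the PCIe bus before any driver or engine work begins. As a purely hypothetical illustration – this is not an Intel tool, just a Linux sysfs sketch – the following lists display-class PCI devices and flags Intel ones; PCI class 0x03 denotes a display controller, and 0x8086 is Intel's assigned vendor ID.

```python
# list_gpus.py - hypothetical Linux-only sketch: enumerate display-class
# PCI devices via sysfs and flag Intel ones. Nothing here is specific to
# DG1; it only uses standard PCI attributes exposed by the kernel.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = int((dev / "class").read_text(), 16)
    if (pci_class >> 16) != 0x03:   # keep display controllers only
        continue
    vendor = int((dev / "vendor").read_text(), 16)
    device = int((dev / "device").read_text(), 16)
    tag = "Intel" if vendor == 0x8086 else f"vendor {vendor:#06x}"
    print(f"{dev.name}: {tag}, device ID {device:#06x}")
```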
Intel has been shipping the DG1 dev cards to ISVs as part of a larger Coffee Lake-S based kit. The kit is unremarkable – or at least, Intel wouldn’t remark on it – but like the DG1 dev card, it’s definitely designed to show off the video card inside, with Intel using a case that mounts the card parallel to the motherboard so that it is readily visible.
It was also this system that Intel used to show off that, yes, the DG1 dev card works as well. In this case it was running Warframe at 1080p, though Intel was not disclosing FPS numbers or what settings were used. Going by the naked eye alone, it’s clear that the card was struggling at times to maintain a smooth framerate, but as this is early hardware on early software, it’s clearly not in a state where Intel is even focusing on performance.
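Incidentally, “smooth by eye” judgments like that are exactly what frame-time logging is meant to quantify: an average FPS number can look healthy while a handful of slow frames produce visible stutter. As a minimal sketch of the usual summary math – not anything Intel showed, and with invented sample numbers since no figures were disclosed – one would compute average FPS alongside a “1% low” figure derived from the slowest frames:

```python
# frame_pacing.py - hypothetical sketch of summarizing per-frame render
# times; the sample data is invented purely for illustration.
from statistics import mean

def summarize(frame_times_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    avg_fps = 1000.0 / mean(frame_times_ms)
    # "1% low": the framerate implied by the slowest 1% of frames,
    # which exposes the stutter that an average hides.
    slowest = sorted(frame_times_ms, reverse=True)
    one_pct = slowest[: max(1, len(slowest) // 100)]
    return avg_fps, 1000.0 / mean(one_pct)

# Mostly 60 FPS frames (16.7 ms) with a few 50 ms hitches mixed in:
sample = [16.7] * 97 + [50.0] * 3
avg, low = summarize(sample)
print(f"average: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```

A run like this would report roughly 56 FPS average but only 20 FPS for the 1% lows – the kind of gap that reads as “struggling at times” to the naked eye.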
Overall, Intel has been embarking on an extended, trickle-feed media campaign for the Xe GPU family, and this week’s DG1 public showcase is the latest step in that Odyssey campaign. So expect to see Intel continuing these efforts over the coming months, as the company gradually prepares Tiger Lake and DG1 for their respective launches.
83 Comments
FANBOI2000 - Friday, January 10, 2020 - link
We might see RT 2 years early, but that is a moot point if you can't afford it; developers will write software for the average PC, and the high end will be less of a thought. Certainly those gamers who can justify over $1000 for a graphics card will be hardcore gamers, but I don't see them being as profitable to developers as the middle ground. So they write for the middle ground, because you get diminishing returns spending time and money wowing people with graphics few will be able to get, simply because they don't have the budget for the high-end cards required for access.

Kangal - Friday, January 10, 2020 - link
I don't know if this qualifies as "competition". It's struggling to keep up with an RX 550, yet it's likely drawing more power (60W?), generating more heat, and most definitely has fewer features (codecs, etc.). And based on what we know, it will likely cost more too.
And that's the RX 550, the joke of modern GPUs. There's practically zero reason to get an RX 550 these days when you can instead get the (slightly) superior GT 1030 for cheaper. Heck, I think Intel's "free" Iris UHD 630 is more impressive than this. And instead of all of those, I'd much prefer to have a "free" Navi-8 APU when they eventually launch in 12 months' time... or step up to a proper dGPU like a GTX 1650 or better.
I don't think this was a good idea for Intel to showcase this. It's like inviting friends over to your house, and presenting a pigsty!
Alistair - Friday, January 10, 2020 - link
i bought a gt 1030 for a customer, never seen such a bad and slow card before

you're looking at the 1650 super minimum for a real video card, there isn't anything worth buying under $150
Alistair - Friday, January 10, 2020 - link
1650 super is 4x faster? for a few extra bucks

FANBOI2000 - Friday, January 10, 2020 - link
The proper 1030 with GDDR5 is OK for 720p gaming. That is damning with faint praise, as it is Xbox 360 levels, but it is OK for some games in a general-purpose PC. I agree the 1650 Super is where you want to be if gaming is going to feature at 1080p, or if you are going for esports competitively or FPS. But the 1030 still has a place, especially second hand, although sometimes the 9 series can offer better value there. I've seen a 980 and a 1030 both go for £100 on eBay, and I know which one I'd rather have.

FANBOI2000 - Friday, January 10, 2020 - link
Will it? NVIDIA are all about machine learning now, and I think that is where Intel will really be working. How much cash is there really in gaming cards, especially when you consider there are two well-known brands in that sector and you aren't known for such abilities? It doesn't even seem like AMD is going that heavily for a price war based on their recent releases.

I'm not a hardcore gamer, 1080p on high settings will placate me, so I am still sticking with my 1060 6GB as it will be good enough at least until, say, 6 months into the new consoles. I don't think the consoles will drive things much at first, as developers will still be writing games for the Xbox One S/X just as they did with the 360. It may well mean we have the status quo maintained for a while yet; maybe it is the new normal given AMD's recent pricing.
thestryker - Thursday, January 9, 2020 - link
If the leaks regarding the desktop part are accurate, it is just a desktop version of the exact same thing in Tiger Lake, EUs and all. I suspect this is just an easy way for Intel to design a working video card and get something out for vendors to work with. I wouldn't be particularly surprised if whatever this is never even makes it to consumers.

I really do hope they're able to deliver, though, as with the price increases in the video card industry there's plenty of room for Intel to come in and undercut the others without taking a loss doing so. If they're able to do that, it may put some pressure on AMD and Nvidia to return to reality with pricing.
tipoo - Thursday, January 9, 2020 - link
Yeah, sounds like a way to work on the Xe architecture without having to switch out to a Tiger Lake system. If it's not largely for developers, maybe some bundles in OEM parts.

Uroshima - Friday, January 10, 2020 - link
How many others got the weird idea that the demoed DG1 is nothing more than a Tiger Lake with the CPU fused off... after all, it is the LP version, it runs below 75W, and they ran some feeble demos... And let's not forget that LP was added later in the game next to the "imba" versions.

My really nasty thought is that LP is Gen12 graphics which uses some Xe technology, and it turned out better than the beefier Xe versions (like they overcomplicated things, and the GPUs are running hot with a bunch of power-hungry features that do not help in mundane GPU tasks).

That would lead this year to Tiger Lake APUs, some heavy-duty GPU for professionals, and the real "gamer grade" discrete graphics in like... 2-3 years?
bill44 - Thursday, January 9, 2020 - link
4x DP? Not 1x HDMI + 3x DP?