System Performance

Acer offers just a single processor across the Predator Triton 500 lineup. Intel’s Core i7-8750H is a six-core Coffee Lake processor with a 2.2 GHz base frequency and a 4.1 GHz boost, and it is the lowest tier of the hex-core i7 models available. But with six cores and twelve threads, it still offers a significant amount of performance in a 45-Watt envelope.

The base tier of this laptop ships with 16 GB of DDR4, and the review unit comes with the full 32 GB allotment. There are two SODIMM slots if RAM upgrades are something you are into. For storage, Acer offers either a single 512 GB NVMe SSD, or two 512 GB NVMe drives in RAID 0. I’m personally not a big fan of the RAID 0 approach, since a single larger drive would likely offer better real-world performance at a lower cost, but unfortunately it tends to be a fixture in gaming laptops.

To test system performance, the Acer Predator Triton 500 was run through our laptop workloads. Graph comparisons are against other GTX 1070 and GTX 1080 laptops we’ve seen over the last couple of years, but if you’d like to compare the Triton 500 to any other laptop we’ve reviewed, please check out our online bench database.

PCMark

[Charts: PCMark 10 Essentials, Productivity, Digital Content Creation, and Overall; PCMark 8 Home and Work]

UL’s PCMark is a comprehensive system test, offering multiple workloads to stress various components. Since we’ve not had a lot of gaming laptops to test since PCMark 10 was released, PCMark 8 is also included in these results. PCMark 8 Creative wasn’t included due to an error on one of the tests. The hex-core CPU doesn’t do a lot for PCMark, which focuses more on office tasks and the like, but the Predator Triton 500 still performs well.

Cinebench

[Charts: Cinebench R15 Single-Threaded and Multi-Threaded benchmarks]

Cinebench R20 was recently released, and we’ll be transitioning to it once we gather some more data, but for this review R15 was used. The Core i7-8750H does well in the single-threaded test, and the extra cores provide a nice boost in the multi-threaded results. It can’t hang with the Core i9-8950HK in the GT75 Titan, but that device does have an 800 MHz frequency advantage.

x264

[Charts: x264 HD 5.x, Pass 1 and Pass 2]

The x264 test converts a video using the CPU, and it likes both more cores and higher frequencies. The extra cores give the Triton 500 a speed boost over the quad-core models that used to ship in the 45-Watt range, but once again the Core i9 really stretches its legs here.
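
As a rough illustration of why this kind of test scales with cores, the sketch below times a software x264 encode by shelling out to ffmpeg's libx264 encoder and reports an average frames-per-second figure. This is not the actual x264 HD 5.x benchmark script, and the input file and frame count are hypothetical placeholders.

```python
# Minimal sketch of a CPU-bound x264 encode timing test (not the x264 HD 5.x script).
# Assumes ffmpeg with libx264 is installed; "input.mp4" and FRAMES are hypothetical.
import subprocess
import time

INPUT = "input.mp4"   # hypothetical test clip
FRAMES = 1500         # hypothetical number of frames in the clip

start = time.time()
subprocess.run(
    ["ffmpeg", "-y", "-i", INPUT,
     "-c:v", "libx264", "-preset", "slow",   # software encode keeps all cores busy
     "-f", "null", "-"],                     # discard output; only encode speed matters
    check=True,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)
elapsed = time.time() - start
print(f"Encoded {FRAMES} frames in {elapsed:.1f} s -> {FRAMES / elapsed:.1f} fps")
```

x264's encoding work parallelizes well across threads, so the fps figure tends to rise with both core count and sustained clock speed, which is the behavior the results above reflect.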

Web Tests

Unlike most benchmarks, web benchmarks are heavily influenced by the underlying browser, and since browsers are updated all of the time, performance can shift over time as well. Normally scores go up, but we’ve standardized on Microsoft Edge since Windows 10 launched, and Edge performance has taken a step backwards over the last couple of updates.

[Charts: Mozilla Kraken 1.1, Google Octane 2.0, WebXPRT 2015, and WebXPRT 3]

Performance is still good, but there does seem to be a regression in Edge on some of these tests. When we move to the Chromium-based Edge, we’ll likely take that opportunity to move to some newer, more modern web tests.

CPU Conclusion

Acer’s choice to go with the Core i7-8750H is a good one. It lets them compete on price, and the hex-core CPU offers great performance. It can’t quite keep up with the Core i9-8950HK, but it still offers stout performance in the 45-Watt class.

Storage Performance

Acer couples two NVMe PCIe 3.0 x4 SSDs together in the highest model in the Triton 500 range, which is what we have for review. RAID 0 doesn’t really offer much of a benefit for most people on most tasks, although there’s little doubt it boosts storage benchmark results, which is likely why so many gaming laptops ship this way.

In the sequential tests, the RAID 0 array pretty much maxes out the PCIe link for reads, although for writes there’s no benefit from the RAID. It likely doesn’t help much with the random results either, which is why the benefit doesn’t really outweigh the extra risk of running RAID 0 or the added cost of purchasing two drives. A single quality 1 TB drive would almost certainly outperform the 2 x 512 GB setup we have here.
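
As a quick sanity check on "maxing out the PCIe link", the back-of-the-envelope numbers below work out the theoretical ceiling of a PCIe 3.0 x4 connection and what an idealized two-drive stripe would need to hit it. The per-drive sequential figure is an assumption for illustration, not a measurement from this laptop.

```python
# Back-of-the-envelope: PCIe 3.0 x4 ceiling vs. an idealized two-drive RAID 0 stripe.
# The per-drive throughput number is an illustrative assumption, not a measurement.
GT_PER_S_PER_LANE = 8.0          # PCIe 3.0 raw signalling rate per lane (GT/s)
ENCODING_EFFICIENCY = 128 / 130  # 128b/130b line encoding overhead
LANES = 4

link_gbps = GT_PER_S_PER_LANE * ENCODING_EFFICIENCY * LANES  # gigabits per second
link_gbs = link_gbps / 8                                      # gigabytes per second
print(f"PCIe 3.0 x4 ceiling: ~{link_gbs:.2f} GB/s")           # ~3.94 GB/s

single_drive_read = 2.8                      # GB/s, assumed mid-range NVMe drive
ideal_raid0_read = 2 * single_drive_read     # perfect striping, zero overhead
print(f"Ideal RAID 0 sequential read: {ideal_raid0_read:.1f} GB/s, "
      f"capped at ~{min(ideal_raid0_read, link_gbs):.2f} GB/s by the link")
```

Sequential reads are the one case where striping can add up past what a single drive delivers before hitting the link ceiling, while random 4K transfers are bound by per-command latency rather than link bandwidth, so the array gains little there.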


46 Comments


  • MrRuckus - Thursday, April 25, 2019 - link

    This... Do yourself a favor and compare: 2080 Max-Q to standard 2080, you're looking at an average 30% drop in performance. No thanks. To me it's a niche market that doesn't make a lot of sense. You want portability, but these are not meant to game on battery. They're meant to be plugged in all the time. Thinner and thinner while trying to keep the performance (or the naming scheme of performance-oriented parts) seems like an ever-increasing losing battle. I personally wouldn't pay the premium 2080 price for a 30% hit in performance.
  • DanNeely - Thursday, April 25, 2019 - link

    Thin and light enough to pass as a normal laptop, while still able to game significantly better than on an IGP, is a valid market segment; and at the x60 level that's probably what I'll be getting for my next laptop in a year or two. OTOH, for people who want something that's a normal laptop first and a gaming system second, I suspect Optimus (for battery life) over GSync is probably the better choice until/unless NVidia can integrate the two.
  • Rookierookie - Thursday, April 25, 2019 - link

    Yeah, I wish they had a version that offered a FHD panel with Optimus. No 4K, no GSync, and lasts at least 6 hours on battery surfing the web. The Gigabyte Aero is still more or less the only viable option for thin, light, powerful, and decent battery life.

    The ConceptD 7 that comes out later this year looks really attractive, but it has a 4K panel so again I'm not optimistic about battery time.
  • Opencg - Thursday, April 25, 2019 - link

    Optimus is great, with the exception of when they wire the iGPU to the external ports. IMO you should be able to plug into a VR headset or external GSync monitor. You might give up the ability to run a presentation on a projector in low-power Optimus mode, but IMO when you are plugging into external displays you are probably plugging into the wall anyway.
  • Brett Howse - Thursday, April 25, 2019 - link

    Apparently this laptop has a MUX to allow you to choose between G-SYNC and Optimus. I've updated the article and am re-running the battery life tests right now and will add them in when done. So really this is the best of both worlds.
  • jordanclock - Thursday, April 25, 2019 - link

    The biggest problem is that Max-Q can mean everything from specs that match the desktop down to half the TDP rating. So a 2080 Max-Q can be below a 2070 Max-Q depending on how each OEM configures the TDPs and clocks.
  • Tams80 - Saturday, April 27, 2019 - link

    You're clearly not the market for it.
    There is a significant market for one machine that is as good as possible for gaming when plugged in, but decent as an everyday machine when not. I'd say that market also values the machine being light for transporting between places.

    You may pooh-pooh such a use case, but it can be very tiresome maintaining more than one machine. People are willing to pay a premium for lower specs for that. Then there are those who just want the 'best' numbers in everything, but they'll buy anything.
  • Fallen Kell - Monday, April 29, 2019 - link

    You're clearly not the market for it.
    There is a significant market for one machine that is as good as possible for gaming when plugged in, but decent as an everyday machine when not. I'd say that market also values the machine being light for transporting between places.


    Except that he is the market. The complaint is that these companies are marketing the device on features it has no chance of actually delivering. This device isn't as good as possible for gaming when plugged in. Even when plugged in, it can't cool itself well enough to run the 2080 GPU, or even the CPU, for more than a few seconds before thermal throttling. These shenanigans should ALWAYS be called out. If they want to make a thin laptop, then they should put in hardware that it can actually use at full capability without thermal or power throttling when plugged in. In this case, that would probably mean stepping down the CPU and GPU.

    The marketing guys know that would affect sales, because they couldn't claim to have the top-of-the-line components. But the whole Max-Q program Nvidia released, along with relying on Intel's thermal throttling, lets companies say they have all this hardware in a thin laptop, and people buy them thinking they are getting that hardware, not knowing they are buying something that is possibly running 30-40% slower than it should be. They could have saved hundreds of dollars and had the exact same performance if the laptop had simply used the "slower" parts that fit within its thermal and power limits in the first place.
  • plewis00 - Saturday, April 27, 2019 - link

    Totally agree. If you want performance, then expect a big machine to cart around. Making things thinner doesn't always work; Apple's MacBook range is a great example of how to ruin machines by making them obsessively thin.
  • not_anton - Friday, April 26, 2019 - link

    Apple did the same with the MacBook Pro 15; my Radeon 460 works at a mere 960 MHz with decent performance and ridiculously low power for a 1000-core GPU chip.
    The downside is the value for money that you'll get - about the same as Apple's :D
