ASUS ROG Swift PG278Q Monitor Released in APAC/EU, North America Coming September
by Ian Cutress on July 24, 2014 5:49 AM EST

One of ASUS’ many releases during Computex was their new ROG Swift PG278Q monitor, which boasts a number of impressive specifications all at once. The PG278Q combines a 2560x1440 panel capable of 120/144 Hz operation with support for NVIDIA G-Sync and 3D Vision, putting it firmly in gaming territory and hence the ROG moniker.
Aside from NVIDIA G-Sync, the PG278Q comes with a Turbo Key on the rear for quick switching between 60 Hz, 120 Hz and 144 Hz depending on user preference. The GamePlus hotkey adds a crosshair overlay to enhance the gaming environment (useful in games that do not offer a steady central crosshair), as well as timer functions. The OSD is navigated with a joystick-like nub on the rear of the monitor's right side.
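The practical difference between the Turbo Key's three refresh settings is the frame-to-frame interval. This is simple arithmetic (1000 ms divided by the refresh rate), sketched here for the three modes quoted above:

```python
# Frame interval in milliseconds for each Turbo Key refresh rate
intervals = {hz: 1000 / hz for hz in (60, 120, 144)}

for hz, ms in intervals.items():
    print(f"{hz} Hz -> {ms:.2f} ms per refresh")
```

Stepping from 60 Hz to 144 Hz cuts the refresh interval from roughly 16.7 ms to just under 7 ms, which is the headroom G-Sync works within when frame rates fluctuate.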
Response time is listed as 1 ms GTG, with 16.7M colors and 160/170º viewing angles. Connectivity is via DisplayPort 1.2 only, with a built-in USB 3.0 pass-through hub. VESA 100x100mm mounting is supported, and the monitor is listed at 7.0 kg (15.4 lbs). The press release gives a bezel width of 6 mm.
Due to the high refresh rate and inclusion of G-Sync, the Swift comes in as one of the most expensive TN panels on the market. Pricing will start at $799, varying by region, and the monitor should be available in Taiwan, APAC and EU today, with China in mid-August and North America by the end of August.
ASUS ROG Swift PG278Q
Display | 27-inch (68.5cm) widescreen with 16:9 aspect ratio
Resolution | 2D mode: 2560 x 1440 (up to 144 Hz); 3D mode: 2560 x 1440 (up to 120 Hz); 2D/3D Surround: 7680 x 1440 (2D up to 144 Hz / 3D up to 120 Hz)
Pixel pitch | 0.233mm / 109 PPI
Colors (max) | 16.7M
Viewing angles | 170-degree (H) / 160-degree (V)
Brightness (max) | 350 cd/m²
Response time | 1ms (GTG)
ASUS-exclusive technologies | GamePlus (crosshair / timer); Refresh Rate Turbo Key (60 Hz / 120 Hz / 144 Hz); 5-way OSD navigation joystick
NVIDIA® technologies | G-SYNC™; 3D Vision™ Ready; Ultra Low Motion Blur
Input/output | 1 x DisplayPort 1.2; 2 x USB 3.0 (upstream x 1, downstream x 2)
Stand | Tilt: +20° to -5°; Swivel: ±60°; Pivot: 90° clockwise; Height adjustment: 0~120mm; VESA wall mount: 100 x 100mm
Size | 619.7 x 362.96 x 65.98mm
Weight (est.) | 7.0kg
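The pixel pitch and PPI figures in the table follow directly from the resolution and the diagonal size. A quick sanity check, assuming the panel diagonal is exactly 27 inches:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diagonal_px / diagonal_in

density = ppi(2560, 1440, 27)   # ~108.8, rounding to the quoted 109 PPI
pitch_mm = 25.4 / density       # ~0.233 mm, matching the spec table
print(round(density, 1), round(pitch_mm, 3))
```

Both computed values agree with the quoted 109 PPI and 0.233 mm pitch.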
Source: ASUS
74 Comments
ninjaquick - Thursday, July 24, 2014 - link
Funny, both Samsung and LG, makers of all the best panels on earth, including the panels in NEC/Dell UltraSharps, all mobile devices, etc., are Korean. The ASUS panel is probably Korean as well.
chizow - Thursday, July 24, 2014 - link
I was referring to the entire monitor, mainly the drive electronics. Yes, the panel substrate does come out of the same 2-3 factories, but it's the drive electronics that differentiate them. When I'm referring to "Korean" panels I'm talking about the numerous blackbox makes and models from off-brand Korean labels that are unofficially overclocked to 120+ Hz. Sure, they may technically hit 120+ Hz at 2560x1440, but the end result is going to be far inferior to this panel.

londedoganet - Friday, July 25, 2014 - link
According to TFT-Reviews, the panel in the PG278Q is made by AU Optronics, which is Taiwanese and is also known to make excellent panels.

DanNeely - Thursday, July 24, 2014 - link
IIRC the cheapest 60 Hz non-TN 4K displays are still around the $2k mark.

As for G-Sync being doomed: only if nVidia decides to implement FreeSync. If they insist on only supporting their proprietary version, we'll be stuck with both for the foreseeable future and will either be locked into the same GPU vendor for multiple years or have to pay extra for displays that support both.
otherwise - Thursday, July 24, 2014 - link
Agreed about FreeSync; I don't understand why people think that it will kill G-Sync. If any company knows how important first-mover advantage is, it's probably nVidia, considering that CUDA still dominates OpenCL in terms of market share.

RussianSensation - Thursday, July 24, 2014 - link
The problem with G-Sync is that the minute you buy a G-Sync monitor, you are stuck with NV GPUs for the 5-10 years that you keep it. The problem with that is NV GPUs are often far too expensive for the performance. For example, R9 290s can be had for $700 but a single 780 Ti, which is far slower, is $650. In other words, there is an expensive price to pay to get the G-Sync feature -- it forces you to pay much more for NV GPUs with similar or worse performance. For that reason many people are waiting for the FreeSync alternative that will work on any GPU. Also, $800 for a TN panel is a no-go for many. I would much prefer an IPS 4K 30-32" monitor, because washed-out greys and TN's poor colour quality are a huge compromise for watching movies and 2D work.

inighthawki - Thursday, July 24, 2014 - link
I'm not sure where you're pulling your numbers from, but last I checked, R9 290Xs were not $700 (maybe back during the bitcoin mining phase when there was low supply?), and a 780 Ti has superior performance in 95% of scenarios.

chizow - Thursday, July 24, 2014 - link
"For that reason many people are waiting for FreeSync alternative that will work on any GPU."

Yes, that is the AMD mantra: free, wait, someday. Maybe?
There's no guarantee Nvidia supports FreeSync, btw, so you'd be in the same vendor lock-in situation, just with an inferior solution that hasn't yet been officially announced as being supported by any hardware vendor.
Nvidia products are more expensive, sure, but G-Sync is part of the reason why. You get premium features and support while the products are still relevant during their lifecycle, rather than the endless waiting game that comes with AMD solutions and support.
Also, the price disparity between R9 290/x and 780/Ti is only a recent development once AMD got their supply issues sorted. We all know that for most of their first 5-6 months on the market, the roles were reversed and AMD was the one charging a huge premium for their products ($550 290 and $750 290Xs).
Death666Angel - Thursday, July 24, 2014 - link
"AMD was the one charging a huge premium for their products"

You are confusing retailers with AMD. I have seen no mention of AMD charging more for their chips during the coin craze. There's not a lot AMD could do about retailers charging as much as customers were willing to pay.
No, I'm not confusing anything. AMD wanted you to believe they were being held captive by the invisible hand just like everyone else, but in reality they were selling their parts at what the market would bear, and that price increase was reflected in the prices charged by retailers. There were plenty of indicators, however, including from AMD's own board partners, that it was an unexpected supply constraint coupled with increased demand from bitcoin mining that led to the huge increases in price.

But this is all readily obvious if you look at AMD's Q2 financials, which captured all of this action, just as I predicted when this topic was raised here months ago: a decrease in total GPU revenue and units shipped, with an increase in margins, profit and ASP. Basically, that means AMD was selling fewer cards at a higher price, margin, and profit. Conclusion: the price hikes were being driven by AMD. By the time they got their supply in order and cryptocoin demand dried up, there was a glut of inventory and prices bottomed out.