Although the Xperia Z4 Tablet was announced as far back as MWC, it wasn’t until recently that the enhanced noise rejection of its touchscreen was really shown off. At first it wasn’t exactly clear how this was accomplished, but it turns out that this is a Qualcomm technology at heart, namely improveTouch. For those who are unfamiliar with this technology, I've attached a video from Qualcomm demonstrating it below.

At a high level, improveTouch uses a Snapdragon SoC to effectively replace aspects of external touchscreen controllers. This is done by moving touch processing from the touchscreen controller to the SoC’s application processor. Doing so makes it possible to enable faster sampling rates and lower touch latency for the digitizer, along with more advanced noise rejection algorithms like water droplet rejection. Although improveTouch currently doesn’t support underwater touch, according to Qualcomm it’s technically possible to implement such features in the future. It’s also possible to do additional advanced context awareness/sensor fusion with this system, such as improved proximity sensing using a traditional IR sensor together with the touchscreen.
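As a toy illustration of the kind of processing that becomes practical once touch data reaches the application processor, a water-droplet rejection pass might classify contact "blobs" by size and signal shape, since droplets tend to be larger and more diffuse than fingertips. The `Contact` fields and thresholds below are invented for illustration and have nothing to do with Qualcomm's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    area_mm2: float      # estimated contact area on the sensor
    peak_signal: float   # strongest capacitance delta in the blob
    mean_signal: float   # average capacitance delta across the blob

def is_water_droplet(c: Contact,
                     max_finger_area: float = 120.0,
                     min_peak_to_mean: float = 2.0) -> bool:
    """Heuristic: droplets tend to be large, diffuse blobs with a flat
    signal profile, while fingers produce compact, sharply peaked ones."""
    too_large = c.area_mm2 > max_finger_area
    too_flat = (c.peak_signal / c.mean_signal) < min_peak_to_mean
    return too_large or too_flat

def filter_contacts(contacts: list[Contact]) -> list[Contact]:
    # Forward only the contacts that look like real touches.
    return [c for c in contacts if not is_water_droplet(c)]
```

A real implementation would work on raw capacitance frames rather than pre-segmented blobs, but the appeal of doing it on the AP is the same: heuristics like these are cheap to iterate on in software.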

One obvious potential pitfall of such a move is that power consumption could increase significantly over traditional systems, but according to Qualcomm power consumption is competitive with other solutions at the same sampling rate and processing complexity, partially because touch controller solutions tend to lag on process node. For features like double-tap to wake, where the display is off and the AP is usually asleep, improveTouch has something called the Autonomous Touch Engine, which includes an MCU and other components to avoid waking up the application processor unnecessarily. In addition, the cost of such a solution is competitive, as some components of a traditional touchscreen solution can be eliminated. It's also worth noting that the CPU load of this system is said to be minimal and won't affect AP performance, although Qualcomm didn't disclose a specific figure, and CPU load likely depends on a large number of factors.
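To sketch the kind of logic such an always-on engine might run, here is a minimal double-tap-to-wake detector: wake only on the second of two quick, nearby taps. Qualcomm hasn't published the Autonomous Touch Engine's internals, and the timing and distance thresholds below are invented for illustration:

```python
import math

class DoubleTapDetector:
    """Toy double-tap-to-wake logic of the sort a small always-on MCU
    could run so the application processor stays asleep. Thresholds are
    made-up illustrative values, not Qualcomm's."""

    def __init__(self, max_gap_ms: float = 400, max_dist_px: float = 100):
        self.max_gap_ms = max_gap_ms
        self.max_dist_px = max_dist_px
        self.last_tap = None  # (t_ms, x, y) of the previous tap, if any

    def on_tap(self, t_ms: float, x: float, y: float) -> bool:
        """Return True (i.e. wake the AP) when this tap completes a pair."""
        if self.last_tap is not None:
            t0, x0, y0 = self.last_tap
            close_in_time = (t_ms - t0) <= self.max_gap_ms
            close_in_space = math.hypot(x - x0, y - y0) <= self.max_dist_px
            if close_in_time and close_in_space:
                self.last_tap = None  # consume the pair
                return True
        self.last_tap = (t_ms, x, y)
        return False
```

The point of running this on a dedicated MCU rather than the AP is that the AP only pays a wakeup cost when the gesture actually completes.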

Although improveTouch can improve the user experience by reducing touch latency and enabling more sophisticated digitizer functions like water droplet rejection, according to Qualcomm there are also significant benefits on the OEM side, as implementation is simpler with this architecture. By using a narrow-band filter on the analog front end, which is said to deliver 60 dB SNR (it isn't clear whether this refers to dynamic range, average SNR, peak SNR, or some other measure), getting the touch sensor to work well with the touch controller is easier than with traditional solutions. However, it remains to be seen whether adoption of improveTouch will become significant among OEMs that use Qualcomm SoCs.
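For context on the 60 dB figure: for amplitude (RMS) quantities, SNR in decibels is 20·log₁₀(signal/noise), so 60 dB corresponds to a signal roughly 1000 times the noise floor in amplitude, or a factor of one million in power. A quick check, with made-up RMS values:

```python
import math

def snr_db(signal_rms: float, noise_rms: float) -> float:
    # For amplitude (RMS) quantities, SNR in dB = 20 * log10(S / N).
    return 20 * math.log10(signal_rms / noise_rms)

# A 1000:1 amplitude ratio works out to 60 dB.
print(snr_db(1000.0, 1.0))  # 60.0
```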

Comments Locked



  • Murloc - Tuesday, August 18, 2015 - link

  • r3loaded - Tuesday, August 18, 2015 - link

    I don't think so. Nvidia has a similar system called DirectTouch in its Tegra SoCs, so this isn't the first example of such technology.
  • boyang1724 - Tuesday, August 18, 2015 - link

    DirectTouch from the Tegra 3 offloaded touch input to the CPU, mainly to increase the sampling rate and reduce processing time. It didn't really have anything to do with water. Also, pretty much the entire Tegra line had atrocious SoCs, and I don't think DirectTouch even shipped on any devices; Nvidia later opted for DirectStylus on its Tegra Note tablets.
  • girishp - Tuesday, August 18, 2015 - link

    Murloc, like many others, loves to be the first commenter on any article. The "First" had nothing to do with whether Qualcomm did it first or not.
  • nathanddrews - Tuesday, August 18, 2015 - link

    I'd love to see this combined with Microsoft's 1ms high-performance touch. Touchscreens might actually be enjoyable to use!
  • name99 - Tuesday, August 18, 2015 - link

    How is this (MS' 1ms system) useful in actually existing devices?
    Apple gave a talk about touch latency at WWDC. There are two essential points that gate what you can do:
    - under normal conditions, the reason you care about this latency is to control what you see. So you are gated by how fast things are displayed. Among other things, for now that means you are gated by the 60Hz frame rate
    - for current programming paradigms, you're also gated by the event loop (which in turn is gated by the frame rate)

    What this means is that essentially, unless you plan to boil the ocean of display and event loop technology, the best you can do is operate more efficiently within these constraints.
    Apple, in iOS8 and earlier, had 4 frames of latency between touch and updated display.
    In iOS9 through a variety of techniques, they pull this down to 2 frames on older iOS devices and on iPad Air2 (which has 120Hz touch sampling) down to 1.5 frames.

    They didn't discuss hardware smarts in doing this (eg whether they have some aspects of the touch controller embedded on either the M8 or the A8) but I found their discussion of the SW/architecture aspects compelling, and I suspect these alternative systems for super low latency touch are basically useless on actually existing platforms, in terms of fitting into the rest of the current paradigm.
  • girishp - Monday, August 24, 2015 - link

    I have iOS 9 on an iPhone 6 Plus. The touch performance is nowhere near 2 frames, except maybe when scrolling extremely slowly. In other words, this also depends on how fast you're scrolling (and I'm not talking about unrealistic speeds, just normal scrolling for browsing web pages).
  • jjj - Tuesday, August 18, 2015 - link

    So they got inspired by Nvidia and are trying this too. The problem is that at the high end a discrete controller can do better, while in the budget segment integration with the display driver is more efficient.
    Sure, the touch controller is just another sensor, so you could connect it to a sensor hub/DSP and manage it like all the other sensors to create new things.
  • vortmax2 - Tuesday, August 18, 2015 - link

    I like the idea...there's plenty of room for improvement in the touch arena. The adoption question, at least for the droplet rejection scenario, will ultimately depend on how many phones/tablets become waterproof. I'm hoping all of them end up being IP67 or better at some point.
  • sheh - Tuesday, August 18, 2015 - link

    Finally, droplet rejection! I always drip on my phone and tablet while drinking, speaking excitedly, or just going for a relaxing stroll in the rain. Sometimes it can be a real problem, as I use the devices to remote control the emergency red button of a nuclear facility.
