
OptoFidelity Blog


Force Sensing Testing as a Demo

12 October 2016

The ongoing evolution of smart devices means endless, and sometimes sleepless, R&D hours for our test automation developers. We need a good radar to sense what the new trends and features on phones and other multimedia devices will be, and we have to steer our expertise and solutions to meet those expectations. Ideally, we do this before the customer can even define what a new feature means from a testing perspective.

One of our methods for staying up to date in this eternal development cycle is to generate discussion by carrying out demo tests on real devices and publishing the results. As true tech geeks, we are interested in the performance of the devices themselves. But we are also eager to show what kind of data our solutions can provide from a user experience perspective. You can read our older test and comparison results here. This time our passionate test technology expert Tommi explored the role of force sensing testing for smartphones.

Force sensing testing – still a lot of business for developers

Force sensing is a relatively new input method for smartphone displays. It gives application and OS developers an opportunity to use force data to create new applications and gestures. There are already devices available to the public, such as the Apple iPhone 6s & 7s, Apple iPhone 6s & 7s Plus, Huawei Mate S Premium, ZTE Axon Mini, Huawei P9 Plus, and Meizu Pro 6, and we believe the rumors that many more will be available in the near future.

So far, force data has mainly been used for gestures such as Peek and Pop by Apple, Magnifier by Huawei, and application preview with Pressure by ZTE. These gestures do not place strict requirements on the reported weight or on the latency between pressing the screen and reporting the press. There are, however, many use cases that would benefit from very exact force measurement with rapid reporting tied to the touch screen's reporting interval:


  • Drawing applications could modify line width based on the applied pressure.
  • Games could offer improved driving controls (turning, throttle, brake) without lifting the finger from the screen, e.g. more pressure, more throttle.
  • Keyboards could improve accuracy and reliability.
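As a sketch of the kind of mapping these use cases rely on, the snippet below converts a reported press force into a continuous 0–1 control value (line width, throttle, and so on). The function name and the gram thresholds are illustrative assumptions, not part of any device API; a real application would calibrate them per device.

```python
def force_to_throttle(reported_grams, min_g=50.0, max_g=400.0):
    """Map a reported touch force (in grams) to a 0..1 control value.

    min_g/max_g are illustrative cut-offs, not device constants:
    forces at or below min_g give 0.0, at or above max_g give 1.0,
    and values in between scale linearly.
    """
    clamped = max(min_g, min(reported_grams, max_g))
    return (clamped - min_g) / (max_g - min_g)
```

Because the reported weights vary by device and screen position (see the test results below), the calibration range matters more than the exact shape of the mapping.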

Force sensing is not a new thing in the industry. It has been used in so-called active styluses for many years by Wacom, Microsoft, and Apple, and it has also been trying to gain traction in the touch pads of PC and Mac computers.

For testing we used OptoFidelity™ GoldFinger, a versatile, non-intrusive test system designed to support production line testing and calibration as well as R&D testing. The system is perfectly suited to production line force calibration and testing thanks to its fast, software-controllable weight control, which makes it possible to apply several weights to the same position simply by setting the desired weights in software.
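The shape of such a software-controlled sweep can be sketched as below. Note that `apply_weight` and `read_reported` are hypothetical stand-ins for the tester's weight control and the device's public force API; they are not actual GoldFinger calls.

```python
def sweep_position(apply_weight, read_reported, position, weights_g):
    """Apply each target weight at one screen position and record
    the force value the device reports back.

    apply_weight(position, grams) and read_reported() are assumed
    callables standing in for the test system and device APIs.
    Returns a list of (applied_grams, reported_grams) pairs.
    """
    results = []
    for w in weights_g:
        apply_weight(position, w)
        results.append((w, read_reported()))
    return results
```

Repeating the sweep over a grid of positions yields the position-dependent data discussed in the results below.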

Test results and what we learned

Testing clearly showed that devices differ in how they behave when different weights are applied to different parts of the screen. According to our testing, the force sensing solutions used in these devices are not yet usable for measuring exact weights across the whole measurement range. Furthermore, there can be a lot of variation in the reported weights depending on the tested position and the applied weight. Performance seems to be optimized for the middle of the screen and for weights between 200 and 400 grams.
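One simple way to summarize such sweeps is the relative error between applied and reported weight, as sketched below. The numbers in the example are made up for illustration, not measured results.

```python
def relative_errors(samples):
    """Relative error |reported - applied| / applied for each sample.

    samples: iterable of (applied_g, reported_g) pairs, e.g. collected
    by sweeping several weights over several screen positions.
    """
    return [abs(rep - app) / app for app, rep in samples]

# Illustrative numbers only (not our measurement data).
data = [(200, 195), (300, 310), (400, 360)]
errs = relative_errors(data)
worst = max(errs)  # the worst-case spread an app must tolerate
```

The worst-case figure is the one that matters for application design, as discussed next.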


For an application developer, data quality and reproducibility are extremely important when creating applications. If there is large variability in the input data, application and gesture controls need to be tuned to the biggest variation. This often means slower and simpler applications with sacrificed performance; applications tend to become gesture-control based instead of exact-information based. Among the most challenging applications are a scale application, or very detailed controls such as force-enabled throttle and brake in car games, as these require force data to be exact and stable over the whole operating range.
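Tuning to the biggest variation in practice means collapsing the noisy force reading into a few discrete gesture levels whose thresholds are spaced wider than the worst-case spread. The sketch below illustrates this; the 100 g and 300 g cut points are illustrative assumptions, not values from our tests.

```python
def quantize_force(reported_g, levels=(100.0, 300.0)):
    """Collapse a noisy force reading into a discrete gesture level.

    levels holds ascending gram thresholds; they must be spaced
    wider than the worst-case reporting spread of the device.
    Returns 0 (light), 1 (medium), 2 (deep press) for the defaults.
    """
    level = 0
    for threshold in levels:
        if reported_g >= threshold:
            level += 1
    return level
```

With only two or three levels, even a large reporting error leaves the gesture unambiguous, which is exactly the trade-off described above.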

With OptoFidelity™ GoldFinger, developers can characterize force-supporting devices by applying multiple force levels with a software-controlled system, and can also read back the force values the devices report. Testing is non-intrusive and uses the devices' publicly available APIs, so the results are comparable to what developers would see with their own applications.

This was the first round of testing with force-sensing-enabled devices. Our aim is to continue this testing as new devices become available.

Interested? Read the whole test results here and the report here, or contact us directly with further questions.

Written by Tommi Niemi

Technology Expert, Touch & Display