UltraSense Brings Next-Gen Multi-Mode Touch Sensing Technology to Smart Surfaces


UltraSense Systems, a provider of multi-mode touch sensing solutions, has released TouchPoint Edge, its next-generation multi-mode touch sensing technology. The device is intended to cost-effectively replace a cluster of mechanical buttons under virtually any surface material, furthering the UI/UX paradigm shift from mechanical to digital interfaces for smart surfaces.

 

TouchPoint Edge is a fully integrated SoC that replicates the touch input of mechanical buttons by directly sensing up to eight standalone UltraSense ultrasound-plus-force-sensing TouchPoint P transducers. It also uses an embedded, always-on Neural Touch Engine (NTE) to discern intended touches from unintended false touches, eliminating corner cases and providing the input accuracy of a mechanical button.

 

The coming years will see a slew of changes in the way smart surfaces are used to interact with products. The first UI/UX paradigm shift started with the smartphone over a decade ago, when the mechanical keyboard was replaced by tapping a virtual keyboard on a capacitive screen. UltraSense Systems' first-generation sensor, TouchPoint Z, continues that paradigm shift by cost-effectively removing mechanical buttons and improving the user experience (UX) in smartphones, electric toothbrushes, home appliances and automotive interior overhead lights.

 

TouchPoint Edge takes the experience to the next level in applications that use many mechanical buttons. The automobile cockpit, for instance, offers many use cases: removing mechanical buttons from the steering wheel, from center- and overhead-console controls for HVAC and lighting, and from door panels for seat and window controls. The sensors can even be embedded into soft surfaces such as leather or foam seating to create user interfaces where mechanical buttons could not be implemented before. Other applications include appliance touch panels, smart locks, security access control panels and elevator button panels, among many others.

 

“In just three years from first funding, we were able to develop, qualify and ship to OEMs and ODMs a fully integrated virtual button solution for smart surfaces,” said Mo Maghsoudnia, CEO of UltraSense Systems. “We are the only multi-mode sensor solution for smart surfaces, designed from the ground up to put neural touch processing into everything from battery-powered devices to consumer/industrial IoT products and now automotive in a big way.”

 

Human-machine interfaces are highly subjective, and replicating a mechanical button press under a solid surface is extremely complex. It takes more than an applied force exceeding a threshold to register a press on a surface. When a user presses a mechanical button, the applied force follows a time-varying curve, and the button responds with considerable non-linearity due to friction, hysteresis, air gaps and spring properties, to name a few. As a result, a simple piezoresistive or MEMS force-touch strain sensor with some algorithms and one or two trigger thresholds cannot accurately recreate the user experience of a mechanical button while also eliminating false triggers.
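
To illustrate why a single threshold falls short, here is a minimal, hypothetical Python sketch (not UltraSense code; the signal shapes, sample rate and threshold are invented for illustration) contrasting a naive threshold trigger with a crude check on the shape of the force-versus-time curve:

```python
# Illustrative sketch only (not UltraSense code): why a single force
# threshold is a poor proxy for user intent. Signal shapes, the sample
# rate and the threshold below are invented for illustration.
import numpy as np

t = np.linspace(0.0, 0.5, 250)        # 0.5 s window at ~500 Hz (assumed)

# Intended press: force ramps up, holds briefly, then releases.
press = 3.0 * np.clip(np.sin(np.pi * t / 0.5), 0.0, None)

# Unintended contact: a short, sharp bump (e.g. the surface being knocked).
bump = 4.0 * np.exp(-((t - 0.1) ** 2) / (2 * 0.01 ** 2))

THRESHOLD_N = 2.5                     # single trigger threshold in newtons (assumed)

def naive_trigger(force):
    """Fires whenever any sample exceeds the threshold."""
    return bool(np.any(force > THRESHOLD_N))

def dwell_trigger(force, min_samples=50):
    """Requires the force to stay above threshold for ~100 ms, a crude
    stand-in for matching the whole force-versus-time curve."""
    return int(np.sum(force > THRESHOLD_N)) >= min_samples

print(naive_trigger(press), naive_trigger(bump))  # True True  -> the bump false-triggers
print(dwell_trigger(press), dwell_trigger(bump))  # True False -> the curve's shape matters
```

Even this dwell check is only a caricature of real press dynamics; capturing the full non-linear force curve is what the neural-network approach described below is aimed at.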

 

TouchPoint Edge, with multi-mode sensing and an embedded Neural Touch Engine, runs machine-learning and neural-network algorithms on-chip so that user intent can be learned. As with TouchPoint Z, TouchPoint Edge captures the unique pattern of the user's press with respect to the surface material. That data set is then used to train the neural network to learn and discern the user's press pattern, unlike traditional algorithms that apply a single force threshold. Once TouchPoint Edge is trained and optimized to a user's press pattern, the most natural response to a button press can be recognized. Additionally, the unique sensor-array design of the TouchPoint P transducer captures unique, multi-channel data sets within a small, localized area, where a mechanical button would be located, which greatly improves the neural network's ability to replicate a button press. The Neural Touch Engine improves the user experience and is further enhanced by being tightly coupled with the proprietary TouchPoint P sensor design for optimal performance. Finally, integrating the Neural Touch Engine into TouchPoint Edge is a game changer in system efficiency: neural processing can be performed 27x faster with 80% less power compared with offloading the same workload to an external ultra-low-power microcontroller.
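
As a rough illustration of the general approach described above, learning a press pattern from multi-channel data rather than applying one threshold, the following PyTorch sketch trains a tiny 1-D convolutional classifier on synthetic windows. The architecture, channel count, window length and data are all assumptions and do not represent the Neural Touch Engine:

```python
# Illustrative sketch only: a tiny 1-D convolutional classifier that learns to
# separate intended presses from incidental contact using multi-channel sensor
# windows. The architecture, channel count, window length and synthetic data
# are all assumptions and do not represent the Neural Touch Engine.
import torch
import torch.nn as nn

N_CHANNELS, WINDOW = 8, 64            # e.g. eight transducer channels, 64 samples per window (assumed)

class TinyTouchNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(16, 2)   # classes: intended press vs. not a press

    def forward(self, x):                    # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

# Synthetic stand-ins for recorded press / non-press windows: presses carry a
# slow ramp across the window, incidental contact does not.
presses = torch.randn(256, N_CHANNELS, WINDOW) + torch.linspace(0, 2, WINDOW)
noise = torch.randn(256, N_CHANNELS, WINDOW)
x = torch.cat([presses, noise])
y = torch.cat([torch.ones(256, dtype=torch.long), torch.zeros(256, dtype=torch.long)])

model = TinyTouchNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(50):                          # short training loop, illustration only
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print("final training loss:", float(loss))  # should drop well below chance (~0.69)
```

On the actual part, this kind of inference runs on the embedded Neural Touch Engine rather than on a host processor, which is where the claimed 27x speed and 80% power advantages over an external microcontroller come from.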

 

“The challenges of replacing traditional mechanical buttons with sensor-based solutions requires technologies such as illumination of the solid surface, ultrasound or capacitive sensing, and force sensing,” said Nina Turner, research manager of IDC. “But those sensors alone can lead to false positives. The integration of machine learning integrated with these touch sensors brings a new level of intelligence to the touch sensor market and would be beneficial in a wide array of devices and markets.”

 

Key Features of TouchPoint Edge

 

  • Neural Touch Engine for machine learning and convolutional neural network processing
  • Open interface allows non-proprietary and even non-touch sensor inputs (e.g., inertial, piezo, position, force, etc.) to be processed by the Neural Touch Engine
  • Direct drive and sense of eight multi-mode TouchPoint P standalone transducers
  • Embedded MCU and ALU for algorithm processing and sensor post processing
  • Integrated analog front end (AFE)
  • Configurable power management and frame rate
  • I2C and UART serial interfaces
  • Two GPIOs for direct connection to haptics, LEDs, PMICs, etc.
  • -40°C to +105°C operating range
  • 5mm x 3.5mm x 0.49mm WLCSP package size

 

Key Features of TouchPoint P

 

  • Multi-mode standalone piezo transducer for ultrasound + strain sensing
  • -40°C to +105°C operating range
  • 6mm x 1.4mm x 0.49mm QFN package

 

TouchPoint Edge evaluation kits using TouchPoint P transducers will be sampling to select customers next month with production samples available in Q1 2022.
