UltraSense Systems Brings Neural Processing to Smart Surfaces with a New Generation of Multi-mode Sensing Solutions

Distributed touch sensing architecture allows for up to 8 buttons to be easily and cost-effectively integrated under any surface for automotive, consumer/industrial IoT and home appliances


San Jose, Calif., December 14, 2021 – UltraSense Systems today announced its next generation of multi-mode touch sensing technology with the release of TouchPoint Edge, designed to cost-effectively replace a cluster of mechanical buttons under any surface material (e.g., metal, glass, plastic) and further the UI/UX paradigm shift from mechanical to digital interfaces for smart surfaces.

TouchPoint Edge is a fully integrated system-on-a-chip (SoC) that replicates the touch input of mechanical buttons by directly sensing up to eight standalone UltraSense ultrasound + force sensing TouchPoint P transducers. TouchPoint Edge also uses an embedded, always-on Neural Touch Engine (NTE) to discern intended touches from unintended false touches, eliminating corner cases and providing the input accuracy of a mechanical button.

Smart surfaces will further change the way we interact with products in the coming years. A smart surface is a solid surface with underside illumination that shows the user where to touch. The UI/UX paradigm shift began with the smartphone over a decade ago, when the mechanical keyboard was replaced by simply tapping a keyboard on a capacitive screen. Not all smart surfaces will be capacitive displays, though. UltraSense Systems' first-generation sensor, TouchPoint Z, continues the paradigm shift by cost-effectively removing mechanical buttons and improving the user experience (UX) in smartphones, electric toothbrushes, home appliances and automotive interior overhead lights.

TouchPoint Edge enables smart surfaces in everyday products, especially automotive interiors. Soon, mechanical buttons will be removed from steering wheels, center and overhead console controls and door panels. Additionally, TouchPoint Edge can be integrated under soft surfaces, such as leather and foam seating, under which non-mechanical user interfaces could not previously be easily implemented. Other applications include appliance touch panels, smart locks, security access control panels, elevator button panels and a multitude of other devices.

“In just three years from first funding, we were able to develop, qualify and ship to OEMs and ODMs a fully integrated virtual button solution for smart surfaces,” said Mo Maghsoudnia, CEO of UltraSense Systems. “We are the only multi-mode sensor solution for smart surfaces, designed from the ground up to put neural touch processing into everything from battery-powered devices to consumer/industrial IoT products and now automotive in a big way.”

Human-machine interfaces are highly subjective, and replicating a mechanical button press under a solid surface is extremely complex. Triggering a press involves more than an applied force exceeding a threshold. When a user presses a mechanical button, the applied force varies over time, and the button responds with significant non-linearity due to friction, hysteresis, air gaps and spring properties, to name a few. As a result, a simple piezoresistive or MEMS force-touch strain sensor with some algorithms and one or two trigger thresholds cannot accurately recreate the user experience of a mechanical button or eliminate false triggers.
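
To make the argument concrete, the sketch below (not UltraSense's implementation; the threshold value and shape features are illustrative assumptions) shows how a quick, deliberate tap and a slow lean with the same peak force both fire a single-threshold detector, while simple descriptors of the time-varying force curve still tell them apart:

# Minimal sketch: why a single force threshold is a poor proxy for a real
# button press. A deliberate tap and a slow lean on the surface can share
# the same peak force but have very different force-vs-time curves.
import numpy as np

FORCE_THRESHOLD = 2.0  # hypothetical trigger level in newtons

def naive_threshold_press(force_curve: np.ndarray) -> bool:
    """Fires on anything that ever crosses the threshold,
    including slow pressure drift or an object resting on the surface."""
    return bool(np.max(force_curve) > FORCE_THRESHOLD)

def curve_features(force_curve: np.ndarray, dt: float) -> dict:
    """Simple shape descriptors of the time-varying press curve that a
    learned classifier could use instead of a single peak value."""
    rise = np.max(np.diff(force_curve)) / dt        # attack rate (N/s)
    peak = np.max(force_curve)
    dwell = np.sum(force_curve > 0.5 * peak) * dt   # time spent near the peak
    return {"peak_N": peak, "rise_N_per_s": rise, "dwell_s": dwell}

# A deliberate tap and a slow lean with the same peak force:
t = np.linspace(0, 0.5, 250)
tap = 2.5 * np.exp(-((t - 0.1) / 0.03) ** 2)   # fast, narrow pulse
lean = 2.5 * np.clip(t / 0.5, 0, 1)            # slow ramp, never released

print(naive_threshold_press(tap), naive_threshold_press(lean))  # True True
print(curve_features(tap, t[1] - t[0]))
print(curve_features(lean, t[1] - t[0]))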

TouchPoint Edge, with multi-mode sensing and an embedded Neural Touch Engine, runs machine learning and neural network algorithms on-chip so that user intention can be learned. As with TouchPoint Z, TouchPoint Edge captures the unique pattern of the user's press with respect to the surface material. That data set is then used to train the neural network to learn and discern the user's press pattern, unlike traditional algorithms that rely on a single force threshold. Once TouchPoint Edge is trained and optimized to a user's press pattern, it can recognize the most natural button press.

Additionally, the sensor array design of the TouchPoint P transducer captures multi-channel data sets within a small, localized area, where a mechanical button would be located, which greatly improves the neural network's ability to replicate a button press. The Neural Touch Engine improves the user experience and is further enhanced by tight coupling with the proprietary sensor design of TouchPoint P. Finally, integrating the Neural Touch Engine into TouchPoint Edge is a step change in system efficiency: neural processing runs 27X faster with 80% less power compared with offloading the same workload to an external ultra-low-power microcontroller.
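
As a rough illustration of the kind of model described above, the following PyTorch sketch defines a small 1D convolutional network over a multi-channel press window and classifies it as an intended press or a false touch. The channel count, window length and layer sizes are assumptions for illustration only, not the embedded Neural Touch Engine implementation:

# Illustrative sketch: a small 1D CNN over multi-channel press data
# (e.g. ultrasound + strain channels from a multi-transducer array).
import torch
import torch.nn as nn

N_CHANNELS = 8   # assumption: one channel per TouchPoint P transducer
WINDOW = 64      # assumption: samples captured per press window

class TouchClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, 2)  # two classes: intended press / false touch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, time)
        return self.head(self.features(x).squeeze(-1))

model = TouchClassifier()
frame = torch.randn(1, N_CHANNELS, WINDOW)  # one multi-channel press window
logits = model(frame)
print(logits.shape)  # torch.Size([1, 2])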

“The challenge of replacing traditional mechanical buttons with sensor-based solutions requires technologies such as illumination of the solid surface, ultrasound or capacitive sensing, and force sensing,” said Nina Turner, research manager at IDC. “But those sensors alone can lead to false positives. Integrating machine learning with these touch sensors brings a new level of intelligence to the touch sensor market and would be beneficial in a wide array of devices and markets.”

Key Features of TouchPoint Edge
• Neural Touch Engine for processing Machine Learning and Convolutional Neural Networks
• Open interface allows non-proprietary and even non-touch sensor inputs (e.g., inertial, piezo, position, force) to be processed by the Neural Touch Engine
• Direct drive and sense of eight multi-mode TouchPoint P standalone transducers
• Embedded MCU and ALU for algorithm processing and sensor post processing
• Integrated analog front end (AFE)
• Configurable power management and frame rate
• I2C and UART serial interfaces for host communication (see the host-side read sketch after this list)
• Two GPIO for direct connect to haptic, LED, PMIC, etc.
• -40°C to +105°C operating range
• 3.5mm x 3.5mm x 0.49mm WLCSP package size
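
For context on the host interface, the sketch below polls TouchPoint Edge for a touch event over I2C using the smbus2 Python library. The device address, register map and event layout are hypothetical placeholders; the announcement only states that the part exposes I2C and UART interfaces, not the actual protocol.

# Host-side sketch only: device address and register map are assumed.
from smbus2 import SMBus

I2C_BUS = 1
DEVICE_ADDR = 0x48        # hypothetical 7-bit I2C address
REG_TOUCH_STATUS = 0x00   # hypothetical status register
REG_TOUCH_DATA = 0x01     # hypothetical event register (button ID + force code)

def read_touch_event(bus: SMBus):
    """Poll the (assumed) status register and, if a press is pending,
    read a two-byte event: button index and a force code."""
    status = bus.read_byte_data(DEVICE_ADDR, REG_TOUCH_STATUS)
    if status & 0x01:  # assumed "event ready" bit
        button_id, force_code = bus.read_i2c_block_data(
            DEVICE_ADDR, REG_TOUCH_DATA, 2)
        return {"button": button_id, "force": force_code}
    return None

if __name__ == "__main__":
    with SMBus(I2C_BUS) as bus:
        event = read_touch_event(bus)
        print(event or "no press pending")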

Key Features of TouchPoint P
• Multi-mode standalone piezo transducer for ultrasound + strain sensing
• -40°C to +105°C operating range
• 2.6mm x 1.4mm x 0.49mm QFN package

TouchPoint Edge evaluation kits using TouchPoint P transducers will be sampling to select customers next month with production samples available in Q1 2022.

About UltraSense Systems
Founded in 2018 and headquartered in San Jose, Calif., UltraSense Systems is changing the UI/UX paradigm in smartphones, consumer/industrial IoT, home appliances and automotive by creating multi-mode touch sensing solutions that enable smart surfaces with precise, highly localized, buttonless interfaces. Its TouchPoint product line enables customers to deliver seamless touch HMIs on hard and soft surfaces, including metals, glass, plastics, wood and leather. The company has raised over $24M to date from investors including Robert Bosch Ventures, Artiman Ventures, Abies Ventures, Sony Innovation Fund and Asahi Kasei Corporation.

