STMicroelectronics participated in Sensors Converge 2024, the international fair held in Santa Clara, California, from June 24 to 26, focused on the latest innovations in sensing, processing, and connectivity. Attendees at our booth experienced ST’s technology first-hand through 18 interactive demos showcasing a wide range of applications, from engaging games to development kits and reference designs. They could also interact with 30 ST experts ready to discuss design challenges, offer troubleshooting advice, and help spark new ideas.
Beyond presenting our demos, this event was an occasion to meet partners and customers and strengthen our industry standing. ST’s experts actively participated in the Sensors Converge conferences, sharing their knowledge on various topics, such as Piezo MEMS and our Lab-in-Fab facility, Machine Learning on Sensors, edge AI Design Tools, Market Trends, and the impact of AI-enabled smart sensor technology across industries.
ST products were also among the finalists for the Best of Sensors Awards 2024: we won the award in the Optical & Imaging category for the VD55G1, the smallest VGA global shutter imaging sensor. This blog post highlights some of our demos on display at the event.
ST Trivia Game and ST Drop the Chip
The STEVAL-PDETECT1 evaluation kit offered visitors a fun and engaging experience by combining a TMOS infrared presence sensor with a Time-of-Flight (ToF) ranging sensor. The two sensors work in synergy to achieve low-power continuous detection and accurate localization.
The first game uses the player’s position to select an answer to a trivia question. Once you step into the dark circle monitored by the TMOS infrared sensor, your presence is detected: the game begins and you are given a random question to answer. The ToF sensor then takes over: to answer true or false, you simply step into the corresponding zone and stand there for three seconds. In the second game, the presence detected by the TMOS sensor starts play. A hand gesture, detected by the ToF sensor, triggers the release of the “Chip,” which bounces left and right unpredictably until it lands in one of six prize buckets.
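The three-second “stand in a zone to answer” mechanic can be sketched as a small dwell-time selector. This is a hypothetical illustration of the logic, not ST’s firmware; the zone names and dwell threshold are assumptions.

```python
DWELL_SECONDS = 3.0  # illustrative hold time required to confirm an answer

class ZoneSelector:
    """Confirms a True/False answer when the player stands in a
    ToF-monitored zone for a continuous dwell period."""

    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.current_zone = None   # zone the player is currently in
        self.entered_at = None     # timestamp when that zone was entered

    def update(self, zone, now):
        """zone: 'true', 'false', or None (nobody in a zone).
        Returns the confirmed answer, or None while still waiting."""
        if zone != self.current_zone:
            # Player changed zone (or left): restart the dwell timer.
            self.current_zone = zone
            self.entered_at = now if zone is not None else None
            return None
        if zone is not None and now - self.entered_at >= self.dwell:
            return zone
        return None

selector = ZoneSelector()
selector.update("true", 0.0)            # just entered: no answer yet
selector.update("true", 1.5)            # still waiting
answer = selector.update("true", 3.0)   # dwell reached: answer locked in
```

Feeding the selector periodically with the zone reported by the ToF sensor yields an answer only after an uninterrupted three-second stay.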
RGB-Z fusion camera with VD55H1 iToF sensor
The VD55H1 is the latest addition to the FlightSense family. In this demo, it enables 3D color imaging through an RGB-Z fusion camera that pairs the VD55H1 iToF sensor with an RGB camera. This low-noise, low-power sensor features a 672 x 804 pixel (0.54 Mpx) resolution and is built on backside-illuminated stacked-wafer technology.
With a 940 nm illumination system, it can produce high-definition depth maps with a typical ranging distance of up to 5 meters, extendable beyond 5 meters with patterned illumination. The sensor operates at a 200 MHz modulation frequency with over 85% demodulation contrast, offering depth precision twice as good as typical 100 MHz modulated sensors. It supports multifrequency operation for long-distance ranging and boasts a low-power 4.6 µm pixel, achieving average sensor power consumption down to 80 mW.
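The relationship between modulation frequency, precision, and range can be made concrete with the basic iToF equations: distance is proportional to the measured phase shift, and a single modulation frequency only resolves distances up to c / (2f) before the phase wraps, which is why multifrequency operation is needed for long-distance ranging. A minimal sketch of this physics (not ST’s pipeline):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def unambiguous_range(f_mod_hz):
    """Maximum distance measurable without phase wrap-around
    at a single modulation frequency: c / (2 f)."""
    return C / (2.0 * f_mod_hz)

def depth_from_phase(phase_rad, f_mod_hz):
    """Convert a measured phase shift (0..2*pi) into distance:
    d = c * phi / (4 * pi * f)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

# At 200 MHz, a single frequency wraps roughly every 0.75 m;
# combining frequencies disambiguates out to several meters.
print(round(unambiguous_range(200e6), 2))  # prints 0.75
```

Because the phase-to-distance slope at 200 MHz is half that at 100 MHz, the same phase noise translates into half the distance error, matching the “twice as good” precision claim above.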
The VD55H1 outputs 12-bit RAW digital video data over a MIPI CSI-2 interface and supports frame rates up to 60 fps at full resolution and 120 fps with 2×2 analog binning. It is fully configurable through the I2C serial interface and features a 200 MHz LVDS interface and a 10 MHz 3-wire SPI interface for flexible laser-driver control. Optimized for low EMI/EMC and easy calibration, the VD55H1 is ideal for integration into compact 3D cameras.
ST continuous asset tracking
Smart Asset Tracking, a crucial aspect of contemporary business operations, involves monitoring and managing valuable assets throughout their lifecycle. The STEVAL-SMARTAG2, a flexible NFC Tracker evaluation board equipped with sensors, includes an extensive software library and a sample application to monitor and record sensor data via NFC from Android or iOS devices. It also features a range of environmental and motion MEMS sensors, including two specialized 3-axis linear accelerometers: a high-G unit for impact detection and an ultra-low power LIS2DUXS12 accelerometer with machine learning capabilities.
The LIS2DUXS12 incorporates a decision-tree classifier that enables asset-status recognition with minimal power consumption and without any intervention from the main microcontroller. In the demo shown, scooter status monitoring is possible locally via the ST Asset Tracking app on a mobile phone or remotely through the Asset Tracking web dashboard.
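The kind of decision tree that runs in the sensor can be pictured as a few threshold comparisons on simple accelerometer features. The features, thresholds, and class names below are purely illustrative, not the actual machine-learning-core configuration of the LIS2DUXS12:

```python
def asset_status(acc_norm_mg, acc_var_mg2):
    """Toy decision tree classifying an asset's motion state from
    the accelerometer norm (mg) and its short-window variance.
    Thresholds are invented for illustration only."""
    if acc_var_mg2 < 5.0:
        return "stationary"       # barely any vibration
    if acc_var_mg2 < 200.0:
        return "in_motion"        # moderate, sustained movement
    return "shaken_or_impact"     # large, abrupt accelerations

print(asset_status(1000.0, 1.0))    # prints stationary
print(asset_status(1020.0, 50.0))   # prints in_motion
print(asset_status(1500.0, 900.0))  # prints shaken_or_impact
```

Because such trees compile down to a handful of comparisons, the sensor can evaluate them continuously at microamp-level power while the host MCU sleeps.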
Elevation geo-location sensor with NextNav solution
This demo shows how a barometric pressure sensor can provide altitude information, playing a pivotal role in improving vertical geolocation across diverse applications.
In the SensorTile.box PRO, the LPS22DF sensor measures atmospheric pressure, which fluctuates with altitude due to changes in air density. The pressure data can be used to calculate altitude relative to ground and sea level. This is achieved by integrating it with the user’s location in the ST BLE Sensors Smartphone app and combining it with the NextNav Pinnacle service.
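The pressure-to-altitude step above can be sketched with the standard barometric formula, which maps a pressure reading to altitude relative to a reference (sea-level) pressure. This is the generic textbook conversion, assuming standard atmosphere conditions, not the exact algorithm used in the app:

```python
def pressure_to_altitude_m(p_hpa, p0_hpa=1013.25):
    """Standard barometric formula: altitude (m) from measured
    pressure p_hpa, relative to reference pressure p0_hpa
    (default: mean sea-level pressure, 1013.25 hPa)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

print(round(pressure_to_altitude_m(1013.25)))  # prints 0 (sea level)
# ~898.75 hPa corresponds to roughly 1000 m above sea level
print(round(pressure_to_altitude_m(898.75)))
```

Differencing two such readings (e.g. current pressure vs. the pressure logged at ground level) gives height above ground, independent of the sea-level reference.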
MEMS Sensors: enhancing human-machine interaction
At Sensors Converge, we showcased the latest motion MEMS in our IMU portfolio across various application scenarios. For instance, the LSM6DSV16BX IMU performs head-position estimation through its Sensor Fusion Low-Power feature. Additionally, vibration from the wearer’s jawbone can trigger a keyword-recognition algorithm running on the MCU, using only the vibration data.
Since April 2024, our portfolio has also included the LSM6DSV32X, another finalist in the Best of Sensors Awards. This 6-axis inertial measurement unit, with a 32 g extended full-scale accelerometer and a 4000 degrees-per-second (dps) gyroscope, features edge AI capabilities enabling in-sensor computing. This latest-generation IMU leverages advanced user-programmable embedded features, such as a finite state machine (FSM) for configurable motion tracking and a machine learning core (MLC) for context awareness. An analog front end lets the system connect an external input to be digitized and filtered by the sensor ASIC, enabling user-interface functions such as tap, double tap, triple tap, long press, and left-to-right and right-to-left swipes.
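The tap-counting side of those gestures boils down to timing logic: events arriving within a short window are grouped into single, double, or triple taps. The sketch below illustrates that idea with an invented inter-tap window; it is not the actual FSM configuration of the LSM6DSV32X:

```python
def classify_tap_burst(tap_times, gap=0.4):
    """Group tap timestamps (seconds, sorted) into a gesture:
    consecutive taps closer than `gap` belong to the same burst.
    The 0.4 s window is illustrative, not an ST parameter."""
    if not tap_times:
        return None
    count = 1
    for prev, cur in zip(tap_times, tap_times[1:]):
        if cur - prev <= gap:
            count += 1
        else:
            break  # burst ended; ignore later taps in this sketch
    return {1: "tap", 2: "double_tap", 3: "triple_tap"}.get(count, "long_sequence")

print(classify_tap_burst([0.0]))              # prints tap
print(classify_tap_burst([0.0, 0.25]))        # prints double_tap
print(classify_tap_burst([0.0, 0.25, 0.5]))   # prints triple_tap
```

In the real device this logic runs inside the sensor’s FSM, so the host MCU only wakes on a fully classified gesture.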
Furthermore, thanks to our X-CUBE-MEMS1 libraries, it can measure intense movements and impacts, including freefall height estimation. It enables long battery life in consumer wearables, asset trackers, and impact and fall alarms for workers; for instance, it reduces the power budget for functions such as gym-activity recognition to under 6 µA.
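The physics behind freefall height estimation is simple: while an object falls, its accelerometer reads near 0 g, so timing that interval and applying h = ½gt² gives the drop height. A minimal sketch of that idea (the X-CUBE-MEMS1 implementation may differ in its detection thresholds and filtering):

```python
G = 9.81  # gravitational acceleration, m/s^2

def freefall_height_m(freefall_duration_s):
    """Estimate drop height from the duration the accelerometer
    reads near 0 g: h = 0.5 * g * t**2 (air resistance neglected)."""
    return 0.5 * G * freefall_duration_s ** 2

# A ~0.45 s freefall corresponds to a drop of about 1 m.
print(round(freefall_height_m(0.452), 2))  # prints 1.0
```

In practice the sensor’s embedded logic timestamps the start and end of the near-0 g interval, so the height can be computed without streaming raw data to the host.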
Last but not least, visitors could discover the LSM6DSO16IS, our IMU with an intelligent sensor processing unit (ISPU). Its applications include self-learning in-sensor activity recognition, with both training and inference processed in the ISPU, and an air mouse implemented by our partner Neuton.AI with their AutoML platform.
- Discover more on our Sensing solutions