A low-cost, lightweight, and compact wearable platform that can monitor human emotions would benefit a wide range of research areas and applications, such as continuous health monitoring, elderly care, depression treatment, and entertainment. To this end, we present SPIDERS (System for Processing In-situ Bio-signal Data for Emotion Recognition and Sensing), a low-cost, wireless, glasses-based platform for continuous in-situ monitoring of a user’s facial expressions and real emotions. SPIDERS costs less than $20 to assemble and can run continuously for up to 9 hours before recharging.
![](https://i1.wp.com/icsl.ee.columbia.edu/wp-content/uploads/2020/05/Picture1.png?fit=1024%2C419)
Overview:
SPIDERS has a four-layer system architecture; each layer directly enables the applications or functionalities in the layer above it.
![](https://i0.wp.com/icsl.ee.columbia.edu/wp-content/uploads/2020/05/Architecture.png?fit=1024%2C381)
We present algorithms that provide four core functions: eye shape and eyebrow movement tracking, pupillometry, zygomaticus muscle movement detection, and head movement tracking. The algorithms use bio-signals acquired from three non-contact sensors (an infrared camera, a proximity sensor, and an inertial measurement unit). SPIDERS can also be extended with functionalities provided by contact-based sensors, such as heart rate sensors and 8-channel EEG sensors.
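To make the fused sensor data concrete, here is a minimal sketch of what one combined sample from the three non-contact sensors might look like. The schema and field names are illustrative assumptions, not taken from the SPIDERS codebase:

```python
from dataclasses import dataclass

@dataclass
class BioSignalSample:
    """One fused bio-signal sample (hypothetical schema for illustration)."""
    timestamp_ms: int             # capture time in milliseconds
    pupil_diameter_px: float      # pupillometry, from the IR camera
    eyebrow_landmarks: tuple      # eye/eyebrow shape points, from the IR camera
    zygomaticus_proximity: float  # smile-muscle distance, from the proximity sensor
    head_orientation: tuple       # (roll, pitch, yaw), from the IMU

# Example sample as it might arrive from the sensing layer.
sample = BioSignalSample(
    timestamp_ms=1620000000,
    pupil_diameter_px=4.2,
    eyebrow_landmarks=((10, 5), (20, 4)),
    zygomaticus_proximity=0.3,
    head_orientation=(0.0, -2.5, 1.0),
)
```

A record like this is what the core-function layer would hand upward to the facial expression and emotion classification layers.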
Core Function Library:
We present novel and robust vision-based techniques to perform pupillometry as well as eye and eyebrow shape detection using IR-band grayscale images from an IR camera positioned at a low angle below the eye, so as not to block the user’s field of view. A proximity sensor detects movements of the zygomaticus (smile) muscle, adding another dimension of information about the user’s facial expression.
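The basic idea behind IR-based pupillometry is that the pupil appears as the darkest region in an IR-illuminated eye image, so its size can be recovered by thresholding dark pixels. The sketch below is a simplified illustration of this principle, not the paper’s actual algorithm; the threshold value and equivalent-circle estimate are assumptions:

```python
import numpy as np

def estimate_pupil_diameter(frame, threshold=40):
    """Estimate pupil diameter (in pixels) from an IR grayscale eye frame.

    Thresholds the darkest pixels as pupil candidates and reports the
    diameter of a circle with the same area as the thresholded blob.
    """
    mask = frame < threshold      # dark pixels -> pupil candidates
    area = int(mask.sum())
    if area == 0:
        return 0.0
    return float(2.0 * np.sqrt(area / np.pi))

# Synthetic 64x64 frame: bright background with a dark pupil of radius 10 px.
frame = np.full((64, 64), 200, dtype=np.uint8)
yy, xx = np.mgrid[:64, :64]
frame[(yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2] = 10

d = estimate_pupil_diameter(frame)  # close to the true diameter of 20 px
```

A production pipeline would additionally handle blinks, eyelash occlusion, and the oblique camera angle, which this toy version ignores.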
Advanced Functionalities:
![](https://i0.wp.com/icsl.ee.columbia.edu/wp-content/uploads/2020/05/Picture3.png?resize=653%2C338)
SPIDERS distinguishes between different classes of facial expression and real emotion states based on these four bio-signals. We prototype advanced functionalities, including facial expression detection and real emotion classification. Specifically, we deploy a novel facial expression detector, based on landmarks and optical flow, that leverages changes in a user’s eyebrow and eye shapes to achieve accuracy that outperforms the state-of-the-art approach. We also implement a pupillometry-based real emotion classifier with higher accuracy than other low-cost wearable platforms that use sensors requiring skin contact.
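One well-established signal behind pupillometry-based emotion classification is that emotional arousal produces sustained pupil dilation relative to a pre-stimulus baseline. The toy detector below illustrates that baseline-comparison idea only; it is not the classifier from the paper, and the window size and threshold are assumptions:

```python
import numpy as np

def classify_arousal(pupil_series, baseline_window=30, rel_threshold=0.05):
    """Toy arousal detector from a pupil-diameter time series.

    Compares mean diameter after a stimulus against the pre-stimulus
    baseline; sustained relative dilation above the threshold is taken
    as a sign of emotional arousal.
    """
    baseline = np.mean(pupil_series[:baseline_window])
    response = np.mean(pupil_series[baseline_window:])
    change = (response - baseline) / baseline
    return "aroused" if change > rel_threshold else "neutral"

# Synthetic series: 30 baseline samples at 4.0 mm, then dilation to 4.6 mm.
dilated = np.concatenate([np.full(30, 4.0), np.full(30, 4.6)])
label = classify_arousal(dilated)  # 15% dilation exceeds the 5% threshold
```

The actual classifier would combine such pupillometry features with the other three bio-signals rather than a single threshold.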
![](https://i0.wp.com/icsl.ee.columbia.edu/wp-content/uploads/2020/05/FlowChartLogic.png?fit=1024%2C722)
Contributors
Jingping Nie, Yigong Hu, Yuanyuting Wang, Stephen Xia, and Xiaofan (Fred) Jiang
![](https://i0.wp.com/icsl.ee.columbia.edu/wp-content/uploads/2020/05/Picture6.png?fit=1024%2C132)
Code Repository
Code: https://github.com/Columbia-ICSL/SPIDERS
Publications
2020 | IEEE/ACM IoTDI ’20
Jingping Nie, Yigong Hu, Yuanyuting Wang, Stephen Xia, and Xiaofan (Fred) Jiang
2020 | ACM/IEEE IPSN ’20
Demo Abstract: Wireless Glasses for Non-contact Facial Expression Monitoring
Yigong Hu, Jingping Nie, Yuanyuting Wang, Stephen Xia, and Xiaofan (Fred) Jiang
[Best Demo Award]