Social-distancing Monitor

A device to monitor social distancing in crowded areas


The world has been hit hard by COVID :cry: . Until vaccines arrive, scientists have advised the use of masks and social distancing as preventive measures to control the spread of the disease.

But having lived all our lives as social beings, it is difficult for us to stay apart, often without even realising it.

We have made this AI-based social-distancing monitoring tool to remind and nudge people to keep a safe distance from others.

The following short clip displays the tool in action, where the line between two people turns red if the distance between them is unsafe:

The green line joining two people turns red if they are too close! The text on the top left denotes whether the entire scene is safe or not.

This tool works in real time, processing about one frame per second, which is sufficient for monitoring the distance between people.

How it works

For detecting people, we use a custom-trained Mobilenet SSD. This neural network is deployed on a Raspberry Pi with a rotating camera to widen the effective field of view. Behind the scenes, a top view of the scene is computed using the camera's homography matrix, and inter-person distances are then measured in this top view.
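The top-view step above can be sketched as follows. This is a minimal illustration, not the project's actual code: it assumes a precomputed 3x3 homography matrix `H` mapping image pixels to ground-plane coordinates, and takes the detected people's foot positions as input points.

```python
import numpy as np

def to_top_view(points, H):
    """Project image-plane points (x, y) to top-view ground-plane
    coordinates using a 3x3 homography matrix H."""
    pts = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]  # divide by the w component

def pairwise_distances(top_pts):
    """Euclidean distance between every pair of projected points."""
    diff = top_pts[:, None, :] - top_pts[None, :, :]
    return np.linalg.norm(diff, axis=-1)
```

With a homography calibrated in metres, `pairwise_distances` directly yields real-world separations between people.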

The Raspberry Pi was attached to a panel of LED lights and a speaker. Whenever people stood within 1.3 metres of each other, the LEDs flashed and an audio message played through the speaker, requesting them to move apart.
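The alert decision described above can be sketched as a simple threshold check on the pairwise-distance matrix. This is an illustrative sketch, not the deployed code; the actual LED and speaker calls (e.g. via a GPIO library and an audio player) are represented here only by comments.

```python
SAFE_DISTANCE_M = 1.3  # threshold used in the deployment described above

def unsafe_pairs(distances):
    """Return index pairs of people closer than the safe threshold.
    `distances` is a symmetric matrix of inter-person distances in metres."""
    n = len(distances)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if distances[i][j] < SAFE_DISTANCE_M]

def scene_is_unsafe(distances):
    """True if any pair of people is too close; on the device this is
    where the LED panel would flash and the audio clip would play."""
    return len(unsafe_pairs(distances)) > 0
```

In a deployment loop, `scene_is_unsafe` would run once per processed frame, gating the hardware alerts.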

The complete setup is enclosed in a box for weatherproofing.

Thorough testing

This tool was thoroughly tested over several days in multiple shops in the local marketplace of our university, IIT Kharagpur.

Me (in a green mask) demonstrating the tool to IIT Kharagpur's director Prof. Virendra Tiwari (in a white hat).
Complete testing of the tool outdoors. Green lines denote safe distances, red lines denote unsafe distances.

Acknowledgments - This work was carried out as part of the Autonomous Ground Vehicle Research Group (AGV) by Indu Kant Deo, Yash Khandelwal and me, under the guidance of Prof. Debashish Chakravarty and Prof. Aditya Bandopadhyay.