Geophone Footstep Detector

University of Cambridge 2019

Brief Description

The project uses an SM-24 Geophone to detect footsteps and other low-frequency actions in its surrounding environment.

The casing allows the sensor to move freely so that it always stays in contact with the floor or other surface being measured.


More Information

The Device

The Geophone sensor was created as a way of non-invasively tracking people's presence in a building. The idea came from researchers using Geophone-based sensors to track animal presence in the wilderness. The device consists of a Raspberry Pi and a Geophone sensor, as well as a side-mounted microphone to verify the Geophone's precision.

The Geophone sensor itself is mounted on an enlarged plastic pin that rests on the floor rather than being hard-mounted to the case, which improves measurement precision. Additionally, an LCD screen on the case cover shows the Geophone readings as they occur.

The Raspberry Pi is connected to a real-time platform enabling ultra-low latency data visualisation to show events as they occur.

The case was designed using Rhino 6 and then 3D printed using the Ultimaker 2.

Event Capturing

The sensor triggers after the Geophone readings keep increasing for a specified amount of time. First, the median of the sensor readings over a two-second window is calculated to check whether it keeps increasing. Second, the event buffer (a list of previous readings) is checked to estimate the likelihood that an event is starting.
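The trigger logic above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the sample rate, window sizes, and function names are all assumptions.

```python
from collections import deque
from statistics import median

SAMPLE_RATE = 10          # readings per second (assumed)
WINDOW = 2 * SAMPLE_RATE  # two-second window over which the median is taken

buffer = deque(maxlen=WINDOW)  # event buffer of recent readings
medians = deque(maxlen=3)      # most recent two-second medians

def push_reading(value):
    """Add a reading; return True when an event should be triggered."""
    buffer.append(value)
    if len(buffer) < WINDOW:
        return False  # not enough history yet
    medians.append(median(buffer))
    # Trigger once successive medians keep strictly increasing.
    return (len(medians) == medians.maxlen
            and all(a < b for a, b in zip(medians, list(medians)[1:])))
```

With steadily rising readings the median climbs window after window and the trigger fires; flat or noisy-but-level readings leave the medians roughly constant, so no event is raised.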

If these conditions are met, the device sends a message to the real-time platform that a person is walking past. While the event is in progress, the sensor keeps measuring to estimate how large the event is; this allows the visualisation to distinguish a few footsteps from an entire group of people walking by. Finally, when the readings start to decrease and both the median and the standard deviation of the readings drop below predefined thresholds, the event is considered finished and the sensor uploads the data online immediately.
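The event-finished check described above can be sketched as below; the threshold values and function names are illustrative assumptions, and the magnitude follows the intensity-times-duration definition used in the visualisation section.

```python
from statistics import median, stdev

# Illustrative thresholds; the real device's values are not documented here.
MEDIAN_THRESHOLD = 0.05
STDEV_THRESHOLD = 0.02

def event_finished(recent_readings):
    """An event ends once both the median and the spread of the
    recent readings drop back under their thresholds."""
    if len(recent_readings) < 2:
        return False
    return (median(recent_readings) < MEDIAN_THRESHOLD
            and stdev(recent_readings) < STDEV_THRESHOLD)

def event_magnitude(intensity, duration_s):
    """Magnitude as defined in the text: event intensity * duration."""
    return intensity * duration_s
```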

The event-first approach to sensor readings achieves much lower latency between the device and the visualisation than simple polling, comparable only to that of interrupt-based sensors such as reed switches.

More on event capturing can be found in my research paper titled “Data Management for Building Information Modelling on the Real-time Adaptive City Platform”; please refer to the /research section of the portfolio.

Data Visualisation

The browser-based visualisation shows the events as they occur. The black dots represent the event start, whereas the blue dots show the magnitude (event intensity * duration) of the recorded event. A small blue dot may indicate a single person passing by, whereas larger ones show a group of people. After an event-start message has been received, the visualisation starts shaking, in the manner of an earthquake, to indicate that an event is in progress; it stops only when the event-finished message arrives carrying the required visualisation readings: the detected intensity, duration, and magnitude.
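The two messages described above might look roughly like the following. The real platform's field names and transport are not documented here, so this payload shape is purely an assumption.

```python
import json
import time

def start_message(sensor_id):
    """Hypothetical event-start payload sent when the trigger fires."""
    return json.dumps({"sensor": sensor_id,
                       "type": "event-start",
                       "timestamp": time.time()})

def finish_message(sensor_id, intensity, duration_s):
    """Hypothetical event-finished payload carrying the readings the
    visualisation needs: intensity, duration, and magnitude."""
    return json.dumps({"sensor": sensor_id,
                       "type": "event-finished",
                       "timestamp": time.time(),
                       "intensity": intensity,
                       "duration": duration_s,
                       "magnitude": intensity * duration_s})
```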

The visualisation can also be transformed into a time series plot, showing how the events unfolded. The y-axis represents the Geophone readings, i.e. the intensity, whereas the area under the curve represents the event magnitude.
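The area-under-the-curve view of magnitude can be sketched with a simple trapezoidal integration of the intensity series; the sample spacing `dt` and the function name are assumptions for illustration.

```python
def magnitude_from_series(readings, dt):
    """Approximate event magnitude as the area under the intensity
    curve, using the trapezoidal rule with sample spacing dt seconds."""
    area = 0.0
    for a, b in zip(readings, readings[1:]):
        area += (a + b) / 2 * dt
    return area
```

For a constant intensity this reduces to intensity * duration, matching the point-based magnitude used in the live view.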


Justas Brazauskas