The project uses an SM-24 geophone to detect footsteps and other low-frequency activity in its surrounding environment.
The casing lets the sensor move freely so that it always remains in contact with the floor or other surface being measured.
The geophone sensor was created as a way of non-invasively tracking people's presence in a building. The idea came from researchers using geophone-based sensors to track animal presence in the wilderness. The device consists of a Raspberry Pi and a geophone sensor, as well as a side-mounted microphone used to verify the geophone's precision.
The geophone itself is mounted on an enlarged plastic pin that rests on the floor; it is not hard-mounted to the sensor case, which improves measurement precision. Additionally, an LCD screen on the cover of the case shows the geophone readings as they occur.
The Raspberry Pi is connected to a real-time platform that enables ultra-low-latency data visualisation, showing events as they occur.
The case was designed using Rhino 6 and then 3D printed using the Ultimaker 2.
The sensor is triggered when the geophone readings keep increasing for a specified amount of time. First, we calculate the median of the sensor readings over a two-second window and check whether it keeps increasing. Second, we check the event buffer — a list of previous readings — to estimate the likelihood that an event is starting.
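The trigger logic described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the window length, sample rate, and number of rising steps are assumed values, and `EventTrigger` is a hypothetical name.

```python
from collections import deque
from statistics import median

class EventTrigger:
    """Detects an event start when the rolling two-second median keeps rising.

    window, sample_rate and rising_steps are illustrative defaults,
    not the project's actual parameters.
    """

    def __init__(self, window=2.0, sample_rate=50, rising_steps=3):
        # Event buffer: the most recent two seconds of raw readings.
        self.buffer = deque(maxlen=int(window * sample_rate))
        # The last few two-second medians, to check for a sustained rise.
        self.medians = deque(maxlen=rising_steps + 1)

    def feed(self, reading):
        """Add one reading; return True when an event start is detected."""
        self.buffer.append(reading)
        if len(self.buffer) < self.buffer.maxlen:
            return False  # not enough history yet
        self.medians.append(median(self.buffer))
        if len(self.medians) < self.medians.maxlen:
            return False
        # Trigger only if each successive median exceeded the previous one.
        m = list(self.medians)
        return all(a < b for a, b in zip(m, m[1:]))
```

Using a median rather than the raw signal makes the trigger robust to single-sample spikes, which matters for a geophone picking up incidental vibration.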
If the right conditions are met, the device sends a message to the real-time platform that a person is walking past. Meanwhile, the sensor keeps track of what it is measuring to estimate how large an event has occurred — this allows the data to be visualised as anything from a few footsteps to an entire group of people walking by. Finally, when the readings start to decrease and both the median and the standard deviation of the readings drop below a predefined threshold, the event is considered finished and the sensor uploads the data online immediately.
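The tracking and finish logic might look something like the sketch below. The thresholds, window size, and the `EventTracker` name are all assumptions for illustration; the magnitude here follows the intensity-times-duration definition used by the visualisation.

```python
from statistics import median, pstdev

class EventTracker:
    """Accumulates readings during an event and decides when it has finished.

    The thresholds and window size are illustrative, not the project's
    actual values.
    """

    MEDIAN_THRESHOLD = 0.5
    STDEV_THRESHOLD = 0.2

    def __init__(self, window=100):
        self.window = window
        self.readings = []

    def feed(self, reading):
        """Record one reading; return an event summary once it finishes."""
        self.readings.append(reading)
        recent = self.readings[-self.window:]
        if len(recent) < self.window:
            return None
        # Event is over when recent readings are both low and steady.
        if (median(recent) < self.MEDIAN_THRESHOLD
                and pstdev(recent) < self.STDEV_THRESHOLD):
            intensity = max(self.readings)   # peak reading during the event
            duration = len(self.readings)    # length of the event, in samples
            return {"intensity": intensity,
                    "duration": duration,
                    "magnitude": intensity * duration}
        return None
```

Requiring both a low median and a low standard deviation prevents the event from being cut short by a brief lull between footsteps.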
This event-first approach to sensor readings is far superior to simple polling, as it allows for the lowest-latency data transfer between the device and the visualisation — comparable only to interrupt-based sensors such as reed switches.
More on event capturing can be found in my research paper, “Data Management for Building Information Modelling on the Real-time Adaptive City Platform”; please refer to the /research section of the portfolio.
The browser-based visualisation shows events as they occur. Black dots mark the start of an event, while blue dots show the magnitude (event intensity × duration) of the event recorded. A small blue dot may be a single person passing by, while larger ones indicate a group of people. After an event-start message has been received, the visualisation starts shaking (à la earthquake) to indicate that an event is in progress, and it does not stop until the event-finished message arrives carrying the readings the visualisation needs: the detected intensity, the duration, and the magnitude.
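The two message types above might be serialised along these lines. The field names and sensor identifier are hypothetical — the actual platform schema is not shown in this write-up — but the event-finished payload carries the three values the visualisation consumes.

```python
import json
import time

# Hypothetical message shapes; field names are illustrative,
# not the real-time platform's actual schema.
event_start = {
    "type": "event-start",
    "sensor": "geophone-1",   # assumed sensor identifier
    "ts": time.time(),
}

event_finished = {
    "type": "event-finished",
    "sensor": "geophone-1",
    "ts": time.time(),
    "intensity": 4.2,         # peak geophone reading during the event
    "duration": 3.5,          # event length in seconds
    "magnitude": 4.2 * 3.5,   # intensity * duration, sets the blue dot size
}

payload = json.dumps(event_finished)
```

Sending the start and finish as separate messages is what lets the visualisation begin shaking immediately, before the event's final intensity and duration are known.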
The visualisation can also be transformed into a time-series plot showing how the events unfolded. The y-axis represents the geophone readings (i.e. the intensity), while the area under the curve represents the event magnitude.
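Reading the magnitude as the area under the intensity curve can be illustrated with a simple trapezoidal integration; the readings and sample spacing below are made up for the example.

```python
def magnitude(readings, dt):
    """Approximate the area under the intensity curve (trapezoidal rule).

    readings: intensity samples; dt: seconds between samples.
    """
    return sum((a + b) / 2 * dt for a, b in zip(readings, readings[1:]))

# A toy event: intensity rises to 2.0 and falls back, sampled every 0.5 s.
area = magnitude([0.0, 1.0, 2.0, 1.0, 0.0], dt=0.5)
```

For a roughly rectangular event this reduces to the intensity-times-duration figure shown by the dot-based view, so the two readings of "magnitude" agree.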