While working on the William Gates Building (WGB) smart building test-bed, I experimented with a range of data visualisations to display sensor and BIM data.
This page contains two talks covering the creation of smart building visualisations and the broader sensor deployment.
The video materials show the WGB recreated solely from the official floorplans, which are accessible to all Computer Lab researchers.
This page also shows two visualisation demos for the WGB: one written in D3.js, the other in Three.js.
These are incomplete examples and do not show the full scope of the project, such as the APIs and the data collection and integration architectures created during this initial deployment. Please refer to the published and pre-print papers for more information.
While developing front-end data visualisations for the Adaptive City Platform we experimented with a range of tools to display the incoming sensor data from dozens of deployed sensors.
For the most part, these experiments were done using the D3.js library.
Real-time sensor heatmaps
Instead of relying on the click-on-the-sensor-and-inspect-the-data paradigm, our approach was to construct individual room-bound heatmaps for the WGB.
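As a rough illustration of the idea, the sketch below colours room polygons by their latest readings using D3.js. It assumes D3 v7 is loaded on the page; the room ids, outline paths and temperature values are hypothetical placeholders standing in for the geometry traced from the WGB floorplans and the live sensor feed.

```js
// Minimal sketch: colour each room polygon by its latest sensor reading,
// instead of requiring a click on each individual sensor.
// Data shape is an assumption, not the platform's actual format.
const rooms = [
  { id: "FN07", path: "M10,10 L110,10 L110,80 L10,80 Z", temperature: 22.4 },
  { id: "FN09", path: "M120,10 L220,10 L220,80 L120,80 Z", temperature: 25.1 },
];

const svg = d3.select("body").append("svg")
  .attr("width", 400)
  .attr("height", 200);

// Map the observed temperature range onto a perceptually uniform colour scale.
const colour = d3.scaleSequential(d3.interpolateInferno)
  .domain(d3.extent(rooms, d => d.temperature));

svg.selectAll("path")
  .data(rooms, d => d.id)
  .join("path")
  .attr("d", d => d.path)
  .attr("fill", d => colour(d.temperature))
  .attr("stroke", "#333");
```

Re-running the data join whenever fresh readings arrive updates the fills in place, which is what makes the heatmap real-time rather than a static snapshot.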
8x8 IR sensor integration
Some of the deployed sensors came with an 8x8 IR camera, allowing us to view real-time thermal snapshots of each sensor's field of view.
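A minimal sketch of how such a frame can be displayed with D3.js is shown below. The flat 64-value frame format, the assumed 18-32 °C range and the synthetic test frame are illustrative assumptions, not the actual wire format used in the deployment.

```js
// Minimal sketch: render one 8x8 IR frame as a grid of coloured cells,
// re-running render() on every incoming frame.
const CELL = 24; // pixel size of each of the 8x8 cells

const svg = d3.select("body").append("svg")
  .attr("width", 8 * CELL)
  .attr("height", 8 * CELL);

// Assumed temperature range in °C for indoor thermal imagery.
const colour = d3.scaleSequential(d3.interpolateTurbo).domain([18, 32]);

function render(frame) { // frame: flat array of 64 temperature readings
  svg.selectAll("rect")
    .data(frame)
    .join("rect")
    .attr("x", (d, i) => (i % 8) * CELL)          // column from index
    .attr("y", (d, i) => Math.floor(i / 8) * CELL) // row from index
    .attr("width", CELL)
    .attr("height", CELL)
    .attr("fill", d => colour(d));
}

// One synthetic frame for demonstration; in practice frames would arrive
// from the deployed sensor over a live data feed.
render(Array.from({ length: 64 }, () => 18 + Math.random() * 14));
```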
More examples
On-demand latest sensor readings (pre-alpha) - discontinued.
Room selection from a list (pre-alpha) - discontinued.
Experimental Three.js visualisation
A demo of how an interactive 3D model can be generated from simple 2D floorplan data (pre-alpha) - discontinued.
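The core Three.js technique behind that demo, extruding a 2D outline into a 3D volume, can be sketched as follows. The room coordinates, extrusion height and camera placement here are illustrative assumptions rather than the demo's actual values.

```js
import * as THREE from "three";

// Minimal sketch: trace a 2D room outline (as from a floorplan) and
// extrude it upwards into a simple 3D volume.
const outline = new THREE.Shape([
  new THREE.Vector2(0, 0),
  new THREE.Vector2(10, 0),
  new THREE.Vector2(10, 6),
  new THREE.Vector2(0, 6),
]);

const geometry = new THREE.ExtrudeGeometry(outline, {
  depth: 3,            // assumed floor-to-ceiling height
  bevelEnabled: false,
});
const room = new THREE.Mesh(
  geometry,
  new THREE.MeshStandardMaterial({ color: 0x8899aa })
);
room.rotation.x = -Math.PI / 2; // lay the floorplan flat, walls pointing up

const scene = new THREE.Scene();
scene.add(room, new THREE.AmbientLight(0xffffff, 0.8));

const camera = new THREE.PerspectiveCamera(50, innerWidth / innerHeight, 0.1, 100);
camera.position.set(15, 12, 15);
camera.lookAt(5, 0, -3); // roughly the centre of the extruded room

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
renderer.render(scene, camera);
```

Repeating this extrusion for every room polygon in the floorplan is what turns flat 2D building data into a navigable 3D model.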
These experiments culminated in the creation of the Adaptive City Platform; links can be found at the top of the page or by clicking on /research.