After Dark at Tate Britain

The live broadcast event, held over five consecutive nights, enabled over 100,000 people to control one of four custom-built, video-streaming robots from a web browser and “see” through its eyes.

While the robots allowed visitors to “see” and explore, four art experts provided live commentary over the video streams, creating a further layer of engagement and an experience that would be impossible during daylight hours.

Role

Concept, Visual Design, Code

More info

www.afterdark.io

Tech Used

  • WebGL, GLSL Shaders + Three.js — realtime colour grading, using a custom 3D LUT exported from a DaVinci Resolve workflow (shader sketch after this list)
  • GStreamer — custom pipelines sending low-latency H.264 video from an HD camera attached to a Raspberry Pi 2, over Wi-Fi, to a V4L2 virtual device on a receiving Linux PC (example pipelines below)
  • WebRTC, Peer.js — low-latency video streaming from robot to robot controller (Peer.js sketch below)
  • WebSockets — realtime messaging (relay sketch below, paired with Redis)
  • Redis — realtime state
  • PostgreSQL — queue state
  • LiveStream + Blackmagic Video Capture — robot splitscreen + commentator audio, live video broadcasting
  • Node.js — robot control, serial comms, server applications and management (serial sketch below)
  • NW.js — various applications, from queue management to video composition
  • OpenFrameworks/C++ — early prototypes (including embedding Gray codes into the video stream; Gray-code sketch below)
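
Sketches

The colour grading is easiest to illustrate with a minimal sketch, not the project’s actual shader: a LUT exported from a DaVinci Resolve grade is assumed to be baked into a 512×512 2D atlas (64 slices of a 64³ cube, a common WebGL 1 workaround for missing 3D textures) and applied per-pixel in a Three.js ShaderMaterial. videoTexture and lutTexture are placeholder names.

    // Sketch: per-pixel 3D-LUT colour grading of a video texture in Three.js.
    // Assumes a 64^3 LUT baked into a 512x512 atlas (8x8 grid of 64px slices).
    const lutMaterial = new THREE.ShaderMaterial({
      uniforms: {
        tDiffuse: { value: videoTexture }, // live robot video frame
        tLut:     { value: lutTexture },   // LUT atlas exported from the grade
      },
      vertexShader: `
        varying vec2 vUv;
        void main() {
          vUv = uv;
          gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }`,
      fragmentShader: `
        uniform sampler2D tDiffuse;
        uniform sampler2D tLut;
        varying vec2 vUv;
        // Look a colour up in the atlas: blue picks the tile, red/green pick
        // within it; blend the two nearest blue slices for a smooth grade.
        vec3 applyLut(vec3 c) {
          float slice = c.b * 63.0;
          vec2 q1, q2;
          q1.y = floor(floor(slice) / 8.0);
          q1.x = floor(slice) - q1.y * 8.0;
          q2.y = floor(ceil(slice) / 8.0);
          q2.x = ceil(slice) - q2.y * 8.0;
          vec2 offs  = vec2(0.5 / 512.0);           // half-texel margin
          vec2 scale = vec2(0.125 - 1.0 / 512.0);   // tile span minus margins
          vec3 a = texture2D(tLut, q1 * 0.125 + offs + scale * c.rg).rgb;
          vec3 b = texture2D(tLut, q2 * 0.125 + offs + scale * c.rg).rgb;
          return mix(a, b, fract(slice));
        }
        void main() {
          vec4 c = texture2D(tDiffuse, vUv);
          gl_FragColor = vec4(applyLut(c.rgb), c.a);
        }`,
    });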
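
The actual GStreamer pipelines were not published, so the pair below is a plausible reconstruction in gst-launch-1.0 notation: hardware-encoded H.264 from the Pi camera, RTP over UDP across the Wi-Fi link, and a v4l2loopback virtual device (/dev/video10 here, an assumption) on the receiving PC so downstream software sees an ordinary camera. Element choices, addresses, ports and bitrates are all illustrative.

    # Robot (Raspberry Pi 2): hardware-encoded H.264 from the camera,
    # packetised as RTP and sent over Wi-Fi to the receiving PC.
    gst-launch-1.0 rpicamsrc bitrate=2000000 \
        ! video/x-h264,width=1280,height=720,framerate=30/1 \
        ! h264parse \
        ! rtph264pay config-interval=1 pt=96 \
        ! udpsink host=192.168.0.50 port=5000

    # Receiving Linux PC: depacketise, decode, and write frames into a
    # v4l2loopback virtual device so other software can read it as a camera.
    gst-launch-1.0 udpsrc port=5000 \
          caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
        ! rtpjitterbuffer latency=50 \
        ! rtph264depay ! avdec_h264 ! videoconvert \
        ! v4l2sink device=/dev/video10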
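
A minimal Peer.js sketch of the robot-to-controller video path: the robot pushes its camera stream to the controller, which answers receive-only. The peer IDs and signalling host are placeholders, not the project’s values.

    // Robot side: capture the camera and call the controller with the stream.
    const robot = new Peer('robot-1', { host: 'signal.example.com', port: 9000 });
    navigator.mediaDevices.getUserMedia({ video: true, audio: false })
      .then((stream) => robot.call('controller-1', stream));

    // Controller side: answer receive-only and show the incoming stream.
    const controller = new Peer('controller-1', { host: 'signal.example.com', port: 9000 });
    controller.on('call', (call) => {
      call.answer(); // no stream sent back: one-way, low-latency video
      call.on('stream', (remote) => {
        document.querySelector('video').srcObject = remote;
      });
    });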
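
One way the WebSocket and Redis pieces plausibly fit together (a sketch using today’s ws and node-redis APIs, not the 2014 code): robot state published on a Redis channel is fanned out to every connected browser. The channel name and port are assumptions.

    const WebSocket = require('ws');
    const { createClient } = require('redis');

    async function main() {
      const wss = new WebSocket.Server({ port: 8080 });
      const sub = createClient();
      await sub.connect();

      // Relay every state update published on Redis to all open sockets.
      await sub.subscribe('robot:state', (message) => {
        for (const client of wss.clients) {
          if (client.readyState === WebSocket.OPEN) client.send(message);
        }
      });
    }

    main();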
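
Robot control over serial from Node.js, sketched with the serialport package; the port path, baud rate and the wire format of the drive command are all hypothetical.

    const { SerialPort } = require('serialport');

    const port = new SerialPort({ path: '/dev/ttyUSB0', baudRate: 115200 });

    // Hypothetical wire format: newline-terminated "L<speed> R<speed>",
    // one signed speed per track.
    function drive(left, right) {
      port.write(`L${left} R${right}\n`);
    }

    drive(80, 80);   // forward
    drive(-60, 60);  // spin on the spot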
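
On the Gray-code prototype: embedding binary-reflected Gray codes rather than plain binary means consecutive frame IDs differ in exactly one bit, so a noisy read of the stream is at worst off by one frame. The encode/decode pair below is the standard construction (shown in JavaScript for consistency with the other sketches; the prototype itself was OpenFrameworks/C++).

    // Binary-reflected Gray code: encode and decode a frame ID.
    function toGray(n) { return n ^ (n >> 1); }

    function fromGray(g) {
      let n = 0;
      for (; g; g >>= 1) n ^= g;
      return n;
    }

    for (let i = 0; i < 4; i++) console.log(i, toGray(i).toString(2));
    // 0 '0', 1 '1', 2 '11', 3 '10' — neighbours differ in exactly one bit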

Thanks

Tate’s IT Team for being on board, championing the project, and making it possible to leverage their infrastructure.

RAL Space for specifying the right hardware for the robots, providing consultation, and hosting visits to their facilities.