Lighting, projection, and theme park ride development.
At Bot & Dolly, we took the principles used to coordinate sensors and actuators in robotics and applied them to live entertainment and immersive design.
In some cases—helping a major theme park operator develop a new ride—the actuators were brushless DC motors.
In others, the “actuators” were stage elements: projectors, moving head stage lights, or multimedia cues.
Core to this work was building and maintaining a model of the physical world, and that effort usually began with calibration: of the static environment; of camera and projector intrinsics and extrinsics; and of the kinematics (and occasionally dynamics) of any mechanical assemblies. My toolkit in this domain includes algorithms and software libraries such as OpenCV, Ceres, ROS, and MATLAB solvers.
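As a concrete, simplified illustration, the intrinsics portion of that calibration work often looked something like the OpenCV sketch below. The image directory, board dimensions, and square size are placeholders, not values from any particular project.

```python
# Minimal chessboard-based intrinsics calibration with OpenCV.
# Paths and board geometry are illustrative.
import glob
import cv2
import numpy as np

BOARD = (9, 6)          # inner corners per chessboard row and column
SQUARE_SIZE = 0.025     # square edge length in meters

# 3D coordinates of the board corners in the board's own frame
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
image_size = None
for path in glob.glob("calib_images/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]

# Recover the camera matrix and lens distortion; rvecs/tvecs are the
# per-view extrinsics of the board relative to the camera.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print(f"reprojection RMS: {rms:.3f} px")
```

Projector calibration followed the same pattern, with the projector treated as an inverse camera; the harder part was usually bundling all of these estimates together, which is where a nonlinear solver like Ceres earned its keep.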
For accurate positioning of elements in a scene, or for performance capture, we relied on motion capture technology such as PhaseSpace's active LED system.
Another element common to nearly all live entertainment applications was careful, high-resolution time synchronization. In a typical scenario we might have DMX lighting controlled at 10 Hz; GPUs rendering, camera shutters rolling, and projectors scanning at 60 Hz; a positioning system updating at 250 Hz; motor controllers running at 1 kHz; and so on. Solving problems in this domain usually involved some combination of film technology like timecode and genlock; stage automation protocols like DMX, OSC, and MIDI; fieldbus protocols like EtherCAT and CANbus; TCP/IP-based protocols; real-time operating systems; real-time-safe software; and, occasionally, an oscilloscope.
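One recurring sub-problem, shown in toy form below, is resampling a high-rate stream onto another device's tick times, here a 250 Hz positioning feed onto 60 Hz render frames. The rates, latency value, and signal are made up for illustration, and the sketch assumes all timestamps already share a common clock, for example one disciplined via timecode and genlock.

```python
# Toy example: align a 250 Hz positioning stream to 60 Hz render ticks
# by interpolating on a shared timebase. All values are illustrative.
import numpy as np

MOCAP_HZ = 250.0
RENDER_HZ = 60.0
DURATION = 2.0  # seconds

# Timestamps on the shared timebase
mocap_t = np.arange(0.0, DURATION, 1.0 / MOCAP_HZ)
render_t = np.arange(0.0, DURATION, 1.0 / RENDER_HZ)

# Fake 1D signal standing in for one mocap channel
mocap_x = np.sin(2.0 * np.pi * 0.5 * mocap_t)

# Interpolate the mocap samples at each render tick, compensating for a
# fixed, measured latency in the positioning pipeline (hypothetical value).
MOCAP_LATENCY = 0.004  # seconds
render_x = np.interp(render_t, mocap_t - MOCAP_LATENCY, mocap_x)

for t, x in zip(render_t[:5], render_x[:5]):
    print(f"frame at t={t:.4f}s uses position {x:+.4f}")
```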
Finally, there was the work of building the right interfaces for the artists who were creating the content, and for the technical crews who were running nightly performances. I learned how to quickly extend tools like Maya, MotionBuilder, and TouchDesigner, along with web technologies, to build interfaces in our users' "native language."
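A hypothetical example of what "native language" meant in practice: a few lines of Python using Maya's built-in cmds module can put a one-button window in front of an animator. The window, the fire_cue function, and its behavior here are illustrative only, not a real production tool.

```python
# Minimal Maya tool sketch: a window with one button that stamps the
# current frame as a cue. Runs inside Maya's Python interpreter.
from maya import cmds

def fire_cue(*_):
    frame = cmds.currentTime(query=True)
    # In production this would publish the cue to the show-control system;
    # here it just drops a locator named after the frame.
    cmds.spaceLocator(name=f"cue_frame_{int(frame)}")
    print(f"cue set at frame {int(frame)}")

def show_cue_window():
    if cmds.window("cueWindow", exists=True):
        cmds.deleteUI("cueWindow")
    cmds.window("cueWindow", title="Cue Marker", widthHeight=(220, 60))
    cmds.columnLayout(adjustableColumn=True)
    cmds.button(label="Set cue at current frame", command=fire_cue)
    cmds.showWindow("cueWindow")

show_cue_window()
```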