Some Like It Bot is a short film that depicts the clumsy first date of a human and his cyborg crush.
It was made by students at Chapman University's Dodge College of Film and Media Arts and directed by recent graduate Alex U. Griffin.
The centerpiece of the film is a robot-only bar that the lead human has to sneak into in order to go on his date. This set piece let the students from the Californian university design an accompanying cast of interesting background characters. In doing so, however, they quickly realised they would need a unique approach to capturing those characters on screen.
"We had some really fun ideas like bartenders that served from the ceiling and intimidating 12-foot-tall robot bouncers," said Griffin. "A cast like that could have cost us quite a bit of time and money."
Instead of using the college's own optical mocap stage to create robots hanging from the ceiling of their live-action short, Griffin and co. used the Xsens MVN full-body, camera-less mocap suit.
"With the suit on hand, our team could act out our moves anywhere and then feed that data into MotionBuilder," said Griffin. "The characters that we visualised in our heads began coming to life before our eyes."
"Our actors, myself included, knew from the rehearsals and shot plans where the cameras were going to be, and more importantly, where their sightlines were," said Griffin. "This way, we could use motion capture for the actors to play off their digital characters on set. After all, it's one thing to animate a whole world. A live-action blend with digital counterparts is a whole different beast!"
When the students were ready to start filming, the Xsens team paid a visit to help them kick it off. After a morning walkthrough that included a quick system set-up and information on how to raise the output quality, students started to experiment with the system's capabilities. Three days later, the entire film had moved on to post-production.
"We tried some real-time filming with the Xsens MVN suit, often doing a shot with an actor as the robot and another with our live human interacting with them from a distance, in order to keep our frame clean," said Griffin. "Our crew really liked using this technique with over-the-shoulder shots or close ups. Everyone had a basic idea of where they were in the shot so we were able to match eyelines on set instead of spending a ton of time re-shooting or editing in post."
The weekend of production was followed by a quick post workflow, using the clean data from Xsens' inertial system. This came as a welcome change for the students who were used to keyframing for hours or even days after filming.