WRITTEN BY MediaMonks
If you have any interest in tech, chances are you’ve watched an episode or two of Black Mirror, Netflix’s massively popular series that dives into uneasy relationships between people and tech (and if you haven’t, we suggest you start). While the drama sets its sights on the near future, its power (and its knack for inducing anxiety in viewers) stems from the fact that the questions it explores are just as applicable to the present day.
If you take solace in the fact that Black Mirror’s terrifying tech hasn’t yet (or never will) come to fruition, don’t get too comfortable: the future is now, they say, and the mad minds at MediaMonks Labs are releasing a web app that brings the show one step closer to reality.
In the Black Mirror episode “White Christmas,” a neural implant called Z-Eye provides an augmented reality interface within a user’s field of vision. Among its features is the ability to block other people in real life, which makes them appear as gray silhouettes with muffled speech. Inspired by the feature, MediaMonks Labs built an AR filter that likewise renders a user’s body as a pixelated, black-and-white shape (try it here; best enjoyed on a tablet or laptop).
While it’s not quite as advanced as what you see on Black Mirror (it doesn’t require an implant, thankfully, and works with only one person in view), it demonstrates how technology featured in speculative fiction is closer to reality than audiences might anticipate. In this case, the technology is so accessible that it requires little setup for users to play with. “What makes this so cool is that it’s running in your browser and any out-of-the-box webcam,” says Joe Mango, Creative Technologist at MediaMonks Labs, who built the tool. “It’s clickable and usable at any time.”
The filter is powered by BodyPix, an open-source machine learning model that enables body-part segmentation. “Segmentation” is a process in which a machine takes an image or video and separates the pixels that belong to a person from those that do not. It’s the same idea behind a green screen, no screen necessary.
What makes BodyPix unique is that it not only separates a body from its surroundings, but can also segment that body into 24 specific parts; for example, the left side of the face versus the backside of a subject’s right arm. Such technology could have several applications, like creating precise, body-tracking augmented reality filters or image editing.
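In practice, BodyPix’s person-segmentation call (from the `@tensorflow-models/body-pix` package, which needs TensorFlow.js and a camera feed in the browser) returns an object whose `data` field is a flat array with one entry per pixel: 1 for person, 0 for background. The mapping from that array to an image mask can be sketched in plain JavaScript; `maskFromSegmentation` below is a hypothetical helper, not part of the BodyPix API:

```javascript
// Turn a BodyPix-style segmentation array (one 0/1 entry per pixel)
// into an RGBA mask: person pixels opaque white, background transparent.
function maskFromSegmentation(segData, width, height) {
  const mask = new Uint8ClampedArray(width * height * 4);
  for (let i = 0; i < segData.length; i++) {
    const isPerson = segData[i] === 1;
    const o = i * 4;
    mask[o] = mask[o + 1] = mask[o + 2] = isPerson ? 255 : 0; // RGB
    mask[o + 3] = isPerson ? 255 : 0;                         // alpha
  }
  return mask;
}

// A 2x2 "frame" where only the top-left pixel belongs to a person.
const demoMask = maskFromSegmentation([1, 0, 0, 0], 2, 2);
console.log(Array.from(demoMask.slice(0, 8))); // [255, 255, 255, 255, 0, 0, 0, 0]
```

In a real page, an array like this would be painted onto a canvas via `ImageData` and composited over the video element each frame.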
Rendering formless, pixelated bodies in real time, the Labs-developed body-blocking tool isn’t movie magic. The tech powering it is quite complex, but the concept behind body blocking is rather simple and relies on compositing two identical video feeds. On one end, the tool takes the camera feed and applies a shader that distorts and pixelates it. On the other, a separate copy of the video is segmented to cut out the user’s body, as if you had used the magic wand tool in an image-editing program to select an object and delete it. This version is then layered on top of the distorted one, resulting in a pixelated, black-and-white body surrounded by an otherwise normal environment.
While the tool isn’t totally accurate to the technology as seen in the show, its underlying mechanics can help viewers understand and envision a technology that previously may have seemed possible only in the realm of science fiction. It also demonstrates the value of trying to make the impossible possible through cutting-edge and emerging technology.
“It may seem unusual that a creative and production company like MediaMonks is doing such in-depth technological research,” says Mango, though that openness to experimentation can help brands find new opportunities to take experiences available in fictional worlds and replicate them for fans in ours. When Mango stumbled upon BodyPix on GitHub, for example, “my mind immediately went to the Black Mirror episode.”
“We peer into Pandora’s box to see what’s possible—and the questions raised.”
The pitch to build upon the software to reproduce the startling Z-Eyes feature was surprisingly easy: “When I approached [Head of Labs Sander van der Vegte] with the idea, he said, ‘That sounds cool, let’s do it!’” recounts Mango. “In a nutshell, it’s that simple to get sign-off on a project thanks to the open, creative environment.” But considering Black Mirror is often a case study in troubling uses of technology, this zeal to experiment shouldn’t be viewed as flippancy. “We’re able to peer into Pandora’s box to see the interesting possibilities that are available—and the questions they raise,” says Mango.
And that ties back to how such projects can help end users relate the fictitious, futuristic tech they see in the media to their own lives, and anticipate or wrangle with some of the ethical dilemmas that inspire speculative fiction. And of course, such projects are just plain cool, too. In essence, it all boils down to making a connection that breaks down barriers between the real and the imagined.