HUMAN: subtitles enhancing access and empathy

I came across this video on a friend’s Facebook feed. I’m a chronic multitasker, but by half a minute in I stopped doing whatever else I was doing and just watched and listened. This is the part that grabbed my heart:

This is my star. I had to wear it on my chest, of course, like all the Jews. It’s big, isn’t it? Especially for a child. That was when I was 8 years old.

Francine Christophe's voice is also very powerful, and it moved me. She enunciates each word so clearly. My French isn't great, but she speaks slowly and clearly enough that I can understand her, and the subtitles confirm that I'm understanding correctly and reinforce what she's saying.

I noticed that there was something different about the subtitles. The font is clear and elegant, and the words are positioned in the blank space beside her face. I can watch her face and her eyes while I read the subtitles. My girlfriend reminded me of something I had said when I was reviewing my Queer ASL lesson at home. In ASL I learned that when fingerspelling you position your hand up by your face, since your face (especially your eyebrows) is part of the language. Even when we speak English, our faces communicate so much.

I’ve seen a bunch of these short videos from this film. They are everyday people telling amazing stories about the huge range of experiences people have on this planet. The people who are filmed come from all over the world and speak in various languages. The design decision to shoot people with enough space to put the subtitles beside them is really powerful. For me, the way the subtitles are done enhances the feeling of empathy.
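As an aside for anyone captioning their own web video: this side-positioned style isn't only possible with subtitles burned into the picture. The WebVTT caption format that browsers use supports cue settings that move text out of the default bottom-centre position. Here is a minimal sketch; the timestamp and wording are illustrative, not taken from the film's actual caption file, and the percentages would need tuning to each shot:

```
WEBVTT

00:00:28.000 --> 00:00:34.000 line:40% position:80% size:20% align:start
This is my star. I had to wear it
on my chest, of course, like all the Jews.
```

Roughly: `line` sets the vertical placement, `position` shifts the cue horizontally (here toward the empty right side of the frame), `size` narrows the cue box so lines stay short, and `align:start` keeps the ragged edge away from the speaker's face.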

A couple of weeks ago I was at a screening event of Microsoft’s Inclusive video at OCAD in Toronto. In the audience were many students of the Inclusive Design program who were in the video. One of the students asked if the video included description of the visuals for blind and visually impaired viewers. The Microsoft team replied that it didn’t and that often audio descriptions were distracting for viewers who didn’t need them. The student asked if there could’ve been a way to weave the audio description into the interviews, perhaps by asking the people who were speaking to describe where they were and what was going on, instead of tacking on the audio description afterwards. I love this idea.

HUMAN is very successful in skillfully including captions that are beautiful, that enhance the storytelling, that provide access for Deaf and Hard of Hearing people, that give people who know a bit of the language a way to follow the story as told in the storyteller’s mother tongue, and that make it easy to translate the film into other languages. I’m going to include this example in the work we’re doing around universal design for learning with the BC Open Textbook project.

I can’t wait to see the whole film of HUMAN. I love the stories that they are telling and the way that they are doing it.

3 thoughts on “HUMAN: subtitles enhancing access and empathy”

  1. As someone who relies on lip reading, I didn’t find the positioning of the subtitles really any more useful than they would have been if they were at the bottom of the screen as usual. They were still much the same distance from Francine’s mouth. But the shorter line length was helpful in keeping her face and emotions in focus. There was plenty of room to move the subtitles closer to her face and mouth, as it is the mouth I always focus on when lip reading, even when subtitles are present. And it was great that the subtitles had been cleaned up, not just left as is, as most automatically generated subtitles are.

  2. Thanks for your comment. I hadn’t thought about the perspective of someone who lip reads. I completely agree that the automatically generated subtitles are often garbage.

  3. Not that I can lipread French, by the way! It is just a habit for me to focus on the mouth whenever I’m ‘listening’ to someone, by whatever means.
