The debut of Sunspring at last year’s Sci-Fi London film festival, with a screenplay written entirely by an AI bot, showed that the technology is not yet at the point where it can replace human storytellers, but it hints at how collaboration between humans and AI could supercharge the way we tell video stories.

The MIT Media Lab recently investigated this potential for collaboration in film storytelling, asking whether AI could identify common emotional arcs in stories, and whether humans could use those arcs to predict audience response. The team built machine-learning models based on deep neural networks that “watch” small slices of video and estimate, second by second, how positive or negative their emotional content is. This paves the way for machines to view an untagged video and construct an emotional arc for the story from all of its audio and visual elements.

The team ran thousands of videos through the models to construct their emotional arcs; human annotators then labeled the same clips with emotional tags to measure the models’ accuracy. These insights allowed the researchers to identify videos that share the same emotional trajectory and group them into “families” that produce similar graphs of visual valence over time. Audience engagement, measured by the number of comments a video receives on social media, could then be predicted from a video’s family.

Realistically, AI will not be replacing human storytellers any time soon, but collaborating with it could inspire video storytellers to look at their content objectively and make edits that sharpen stories and amplify engagement.
Source: McKinsey & Company
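To make the “families of emotional arcs” idea concrete, here is a minimal, hypothetical sketch: each video is represented as a time series of per-second valence scores, resampled to a common length and assigned to the nearest prototype arc. The function names, prototype arcs, and data are illustrative assumptions, not the MIT team’s actual method or code.

```python
# Hypothetical sketch: grouping videos into emotional-arc "families".
# Each arc is a list of per-second valence scores in [-1, 1] (synthetic
# here; in the study these would come from a neural network).
import math

def resample(arc, n=20):
    """Linearly resample a valence time series to n points."""
    if len(arc) == 1:
        return [arc[0]] * n
    out = []
    for i in range(n):
        pos = i * (len(arc) - 1) / (n - 1)   # fractional index into arc
        lo = int(pos)
        hi = min(lo + 1, len(arc) - 1)
        frac = pos - lo
        out.append(arc[lo] * (1 - frac) + arc[hi] * frac)
    return out

def distance(a, b):
    """Euclidean distance between two equal-length arcs."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def assign_family(arc, prototypes):
    """Index of the prototype arc ("family") closest to this video's arc."""
    r = resample(arc)
    return min(range(len(prototypes)), key=lambda i: distance(r, prototypes[i]))

# Two toy prototype families: steadily rising valence and steadily falling.
rise = resample([-1 + 2 * t / 9 for t in range(10)])
fall = resample([1 - 2 * t / 9 for t in range(10)])
prototypes = [rise, fall]

video_arc = [-0.8, -0.5, -0.1, 0.2, 0.6, 0.9]  # mostly increasing valence
print(assign_family(video_arc, prototypes))     # prints 0 (the "rise" family)
```

Once every video in a corpus has a family label, predicting engagement reduces to comparing, per family, the average number of social-media comments, which is the kind of downstream analysis the paragraph above describes.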