Davis is head of production for support video content at Reality Labs, the Meta vertical focused on VR, AR and AI. She also participated in a roundtable session at the Media Insights conference, presenting “Using Data to Create Meaningful Content.”
“For VR, we have hardware such as the Quest VR headset,” she says. “I produce the support content for the Quest headset. We look at customer support data to determine what types of support content topics to create. And I partner with our Meta video production team to create those videos.”
Once you have parsed through this type of customer support data and dealt with the feedback, what factors do you consider when it comes to content?
Davis relates, “When we consider content to create, we also must consider the impact of the content and how it drives positive change in our customer support data. Customer support data can range from contact drivers, the top issues that customers face with their headsets, to ticket volume, NPS scores and CSAT scores. When we are creating the content, we consider those data points and how we can positively impact them. We also partner with user research. They have good data to help us understand how our customers consume the content. Where did they go? What’s the discoverability? Then we also consider outside trends. We are seeing a lot of short-form content. We do have to factor in the data that we’re seeing externally on how to create content.”
With all of this customer and user data, what kind of metrics are you leveraging? How do you find the positive changes in the data, and then what do you do?
“We had to evolve the way we measure our video content,” says Davis. “It used to be what I’ve heard called vanity metrics: likes, views, impressions. That didn’t translate for customer support, and that didn’t translate to revenue for our vertical. I had to take a step back and think, how can we get measurable insights that align with our support data? One way I did that was at the end of our videos, which are hosted on our store help center. If you go to our help center and look up an article, you’ll find our videos. At the end of the videos, we ask a survey. Was this video helpful? Yes or no? Did this video resolve your issue? That’s a way for us to determine NPS around video. But we do have future plans to enhance that survey. Maybe we can surface related content. We are evolving ways to get insights: is this really helping, and if not, why? Then we can pinpoint ways to improve our video content.”
Would you be able to share some of those insights, or how insights in this medium have evolved?
Davis observes, “When I first started creating videos in-house for tech companies about ten years ago, views were really the main focus for metrics. That works depending on what vertical you’re in. I’ve worked on sales teams, marketing teams, product marketing, and sometimes views are a metric that matters. Over time, what I’ve seen is engagement rates: audience retention, how long customers or audiences are watching your content, and where they are watching it as the available platforms evolve. It used to be just desktop or web. Now we’re looking at mobile. Now we are looking at VR immersive content. Where are they watching the content? Maybe there will be future metrics, because it has evolved over time, and it’s going to continue to evolve, especially as we move into this world of short-form content. Maybe the content types evolve based on data.”
Content types evolving based on data would absolutely apply to VR and AR content. What are you thinking these days about Metaverse content, VR content and AR content? What have we learned, essentially?
“I think it’s still early. We’re still learning,” she says. “We’re learning VR is very immersive. You have very different frame rates in VR than you would if you were to watch something on TV or on your phone. There are different dynamics that play into the experience for audience members in an immersive environment versus not. We’re still learning how to refine things like frame rate. We’re still seeing how things like VR or AR products are going to become more accessible. We are making it more accessible by making it more affordable, but it’s still early. I know Apple is going to have their headset coming out, so it does justify the work that we’ve been doing.”
She adds, “It creates a lot of opportunity. This year, we introduced reels into VR and so short-form content is now in an immersive environment. We’ll see how that goes. We do have Ray-Ban Stories, which are the smart glasses. There’s just a lot of opportunity for content right now in those environments.”
Watch the complete video from the Media Insights & Engagement Conference as Seth Adler and Shenaika Davis discuss VR, short-form and video content, and more.
Contributors
Seth Adler heads up All Things Insights & All Things Innovation. He has spent his career bringing people together around content. He has a dynamic background producing events, podcasts, video, and the written word.
Matthew Kramer is the Digital Editor for All Things Insights & All Things Innovation. He has over 20 years of experience working in publishing and media companies, on a variety of business-to-business publications, websites and trade shows.