
Fandom Improves Community Safety and Reduces Cost by 50% with Coactive AI

Learn how Coactive AI helped Fandom categorize and filter images, ultimately reducing hours spent moderating content by 74%.

500 hrs
Weekly manual labeling hours, reduced by 74%
2.2M
Image uploads per month, with 90% reviewed automatically
6 weeks
To a production-ready solution
50%
Cost savings
Founded: 2004
What they deliver: Community Experiences
Website: http://www.fandom.com
Started using Coactive: 2023
Use case: Content Moderation and Metadata Generation

Outstanding customer support! They went above and beyond to help me resolve my issue. I felt valued as a customer, and their commitment to ensuring my satisfaction left a lasting impression.

Florent Blachot, VP of Data Science & Engineering, Fandom

About Fandom

Fandom is the world’s largest fan platform where fans immerse themselves in imagined worlds across entertainment and gaming. Reaching more than 350 million unique visitors per month and hosting more than 300,000 wikis, Fandom is the #1 source for in-depth information on pop culture, gaming, TV and film, where fans learn about and celebrate their favorite fandoms. Fandom’s Gaming division manages the online video game retailer Fanatical.

Fandom Productions, the content arm of Fandom, enhances the fan experience through curated editorial coverage and branded content from trusted and established publishing brands Gamespot, TV Guide and Metacritic, along with its Emmy-nominated Honest Trailers.

Opportunity:

From Westeros to Ponyville: Accepting the Quest to Improve the Fan Experience

Fandom is the world's largest fan platform, reaching 350 million fans worldwide each month. From Game of Thrones to My Little Pony, Fandom’s mission is to power fan experiences by providing the largest online library of information on anything and everything in entertainment, gaming, and pop culture. Fandom serves three distinct types of clients, each with a different relationship to the platform:

  • Users seeking information
  • Superfan editors looking to create content
  • Advertisers trying to appeal to fans

Hundreds of millions of visitors engage with Fandom every month—which means hundreds of millions of opportunities for users to upload new content to the pages, or wikis, that interest them. While most images are uploaded with good intentions, about 0.5 percent of the 2.2 million images uploaded monthly are malicious and violate Fandom’s terms of service. At that scale, even this small percentage makes manually reviewing every image expensive.

With tens of millions of images uploaded per year, it became challenging for the Fandom Trust and Safety team to personally moderate each image. Fandom hired contractors to assist with the manual image moderation process, which took about 500 hours per week—a costly addition to an already taxing process.

The team needed a solution that would automate visual content moderation, preserving not only Fandom’s user trust but also community safety and advertising opportunities.

Solution:

Managing Mischief and Combatting Trolls with Coactive AI

The Fandom team had heard about Coactive AI and did a deep dive to learn more about its capabilities. Coactive is a platform that unlocks the value of video and image data for critical use cases in search, tagging, and analytics through multimodal artificial intelligence (AI). Coactive worked with Fandom on a proof of concept (POC) to explore how the platform could help with user safety.

“We started the POC process in early 2023 and after only a couple of months of testing, we saw the value,” said Florent Blachot, VP of Data Science & Engineering at Fandom. “The Coactive team delivered us a solution in just six weeks. We recognized that the return on investment for this platform was very good, and that the potential for the future is not just focusing on user safety, but far beyond that.”

As a first step, the Fandom team curated about 20 million problematic images and categorized them under 25 nuanced labels, such as “gore,” “nudity,” and “weapons.” From there, Coactive built lightweight machine learning models on top of a fine-tuned foundation model that recognizes offensive imagery. Each image receives a score between 1 and 100 reflecting how well it fits a label. Images scoring 90 and above are deemed inappropriate and removed from Fandom automatically; images scoring below 10 are allowed on the site. Images in the middle of the spectrum, between 10 and 90, are manually reviewed by the content moderation team and either deleted or accepted. Those manual decisions further train the models to learn nuances—for example, distinguishing real gore from blue alien blood or fake body parts.
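The threshold-based triage described above can be sketched in a few lines. This is an illustrative example only—the function name, thresholds as constants, and routing labels are assumptions for clarity, not Coactive’s actual API:

```python
# Illustrative sketch of the score-based triage described in the case study.
# The thresholds (90 and 10) come from the text; everything else is hypothetical.

AUTO_REMOVE = 90  # score of 90 and above: removed automatically
AUTO_ALLOW = 10   # score below 10: published without review

def triage(score: float) -> str:
    """Route an image based on its 1-100 label-fit score."""
    if score >= AUTO_REMOVE:
        return "remove"         # deemed inappropriate, pulled from the wiki
    if score < AUTO_ALLOW:
        return "allow"          # low risk, allowed on the site
    return "manual_review"      # 10-90 band goes to the moderation queue

print(triage(95))  # remove
print(triage(5))   # allow
print(triage(50))  # manual_review
```

Per the case study, roughly 90 percent of uploads fall outside the manual band, which is what drives the reduction in review hours.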

After a successful POC, Fandom integrated the Coactive API into its stack to operationalize the process for end-to-end image moderation.

Outcome:

Empowering Fan Communities and Refining the Process Along the Way

With the new image scoring process in place, bad images are pulled from the wikis in a matter of seconds, not hours. Blachot added, “On average, this process takes less than half a second. It’s significantly faster than the previous system, where some images could remain on the page for 24 to 36 hours before removal. And it elastically scales with traffic and seasonality.”

Automating image moderation has not only saved the Fandom team time, but also improved morale—as they no longer spend hours sifting through disturbing imagery.

“Coactive enables us to make automatic judgments for about 90 percent of the images uploaded to Fandom. That alone gives us a 50 percent cost reduction and 74 percent reduction in manual hours spent reviewing images,” said Blachot. “Before, our team and our contractors were spending about 500 hours per week reviewing the queue. Now it's down to 130 hours maximum. This solution not only reduces cost, but it also has the added mental health benefit—which is priceless.”

Working with Coactive to refine the process is an ongoing effort. Fandom is unique in that much of its content is based in fictional realms, so its visual content doesn’t always fit the status quo data solution. Coactive works closely with Fandom to continuously improve the image sorting process.

Timothy Quievryn, Director of Community Safety at Fandom, added:

“Coactive met our wildest expectations, and year one has been such a success. With year two, we plan to work on our metadata and SEO processes, and we’re looking forward to making it even better,” he said. “A good platform is nothing if you don’t have good people representing it. And Coactive has some wonderful people we’ve genuinely enjoyed working with.”



Coactive AI helps data teams extract insights from unstructured image and video data. It integrates visual data with familiar SQL and big data tools, using pre-trained models for trend analysis, content moderation, search, and mapping.
