Our competition took place in January 2026 and we invited students to design and develop small, browser-based educational games that help users better understand and recognise online manipulation, specifically related to fake activity markets. The challenge focused on creating experiences that are simple, engaging, and accessible, while delivering meaningful learning outcomes.
Each submitted game demonstrates how playful design can support serious learning objectives, transforming complex concepts about online influence and fake activity markets into interactive challenges that encourage reflection and awareness.
Fact or Fake is a fast-paced browser game where players act as moderators for a simulated social media feed, dealing with bot attacks and misinformation. The game focuses on text-based posts styled like Twitter/X. Unlike standard quizzes, this game simulates the chaotic speed of a real feed and the realism of misinformation spreading online.
By Nathan Jones & Amber Fleming - Durham University
Real or Bot? is a lightweight, browser-based mini-game that teaches players to spot suspicious social media accounts using common credibility signals, with tiered questions, instant feedback, and short explanations.
By Chengkang Li - Newcastle University
"Sold Out" is a simulation game where players take on the role of an online seller trying to survive in the harsh world of e-commerce. Players start by trying to promote and sell everyday items, soon realising that the only way to turn a profit is to "please" the algorithm. The game illustrates how e-commerce algorithms prioritise engagement over truth, giving sellers a choice between staying honest and going bankrupt, or cheating to increase profits.
By Sami Ustuner - Newcastle University
The player compares two profiles and attempts to identify the impersonator. The game includes 10 levels of difficulty, each highlighting a common tactic used in online manipulation. By recognising subtle warning signs, players can strengthen their digital awareness and reduce the chance of engaging with fake accounts.
By Nguyen Thien An - Durham University
The game is designed to help players understand the security risks of modern AI assistants. Based on the industry-standard OWASP Top 10 for LLMs, the game places players in a simulated "Red Team vs. Blue Team" environment. First, players step into the shoes of an attacker, using simple conversational prompts to trick the AI into leaking secrets or generating fake news. Then, they switch to defence mode to understand how these vulnerabilities occur and apply countermeasures to secure the system.
By Suraj Maurya & Ojasvi - University of Luxembourg
"Can You Spot?" helps users recognise online misinformation cues. Players read short articles and determine whether each statement is true. End-of-round summaries and final end-of-game feedback are used to strengthen critical thinking and improve awareness of misleading content.
By Christian Ombo - Durham University