Data commercialisation: value and risk
Player data is valuable. Apply data analytics to the digital footprints left by gamers, and you’ve got everything you need to know about their preferences, interactions and in-game activity. Developers use player data to optimise game design and craft tailored experiences that keep gamers coming back for more.
Predictive data analytics go further still, helping developers to anticipate the next big thing in gaming, shape future releases with user-relevant features and, of course, boost revenues.
Elia Kim co-leads the Gaming and Immersive Technology group at Simmons & Simmons in Tokyo. He points to the value of data commercialisation in the gaming industry. “User data enhances the likelihood of producing games that resonate. Take Rollic Games, which, since 2018, has developed more than 200 mobile games, clocking up more than two billion downloads.”
But if data is commercially valuable to developers, it’s worth even more to hackers. So, as the gaming industry becomes more data-centric, the risk of data breaches increases. Developers are caught between gamers’ rising demands for data privacy and government clampdowns on cross-border data-sharing in the name of national security.
Europe’s General Data Protection Regulation (GDPR) sets the global benchmark for data protection. Gaming companies use it to guide their compliance on cross-border data flows and to stay on the right side of local data protection laws. Elia advises gaming companies to be alert to some notable GDPR obligations:
- Valid consent for minors: Parents or legal guardians must provide their consent for online gaming by minors. Developers must make reasonable efforts to verify that the consent is legitimate.
- Automated banning systems for player misconduct: Platforms that use automated systems to enforce game rules and player conduct might be deemed to make decisions without human oversight. This could breach Article 22 of GDPR and could significantly impact an individual’s legal rights.
- In-game marketing: Player data used to tailor in-game advertisements is personal data. Game developers must have a valid legal basis for this kind of data processing, which the GDPR classifies as ‘profiling’.
User-generated content: Who really owns it?
User-generated content (UGC) transforms players from spectators into content creators. Their spin-off characters and stories can contribute to a game’s ongoing success. Some developers provide editing tools to encourage gamers’ creative contributions.
Player modifications (mods, as they are known) can be highly lucrative. Among them is Dota, which started out as a mod for Blizzard Entertainment’s Warcraft III; its successor, Valve’s Dota 2, brings in almost US$300 million each year through its Battle Pass.
But, if content is created by modders, who really owns the intellectual property (IP)? It’s a complex issue and, as Elia admits, there is no straightforward answer.
Game developers own exclusive rights to their creations. But when modders create something that is recognisably new, yet a derivative of the original content, they also gain protections. Inevitably, disputes arise. Blizzard, for instance, had to settle for non-commercial use of the Dota trademark, while Valve, owner of Dota 2, retained commercial rights.
“This case underscores the complexities of copyright law,” says Elia. “And it highlights the need for clear guidelines and agreements, between developers and modders, to navigate ownership of UGC derivatives and protections for creative contributions.”
Ways to address UGC
- Accept inherent risk: Roblox operates as a development sandbox, allowing players to monetise their creations through one-time fees, special privileges and private-server subscriptions. Hobbyist developers are said to earn between US$250 and US$100,000 per month.
- Notifications: In-game messages alert users that, by creating content, they may breach the game’s end-user licence agreement or terms of service. Such notices are largely ineffective and could create liability issues for platforms that facilitate UGC.
Online safety and protection of minors
Online gaming can be addictive. It appeals to younger audiences, exposing minors to risks such as cyberbullying, exploitation and inappropriate content. Globally, regulators are stepping up, helping to create a safer gaming environment for everyone.
- The UK introduced its Online Safety Act in 2023 to regulate harmful and illegal content and require age-verification measures.
- South Korea’s 2011 ‘Shutdown’ Law, which restricted under-16s from gaming after midnight, was replaced in 2021 with a Choice Permit system, allowing parents and guardians to set the curfew for gaming hours.
- New York legislation, in 2024, mandates that social media companies verify that minors have parental consent before providing access to addictive content.
- Australia’s Online Safety Act, 2021, is designed to manage online safety risks and ensure providers’ systems prevent distribution of harmful content, including communication and play between users.
- Singapore’s Online Criminal Harms Act addresses the risks of online gaming, including cyber-bullying and online grooming. Authorities can direct game developers and internet service providers to restrict or block offenders' access to games and communication platforms.
- China’s strict regulations curb screen time for minors. In 2021, minors were restricted to one hour of gaming a day, at weekends and during public holidays. In 2022, China mandated a youth mode, restricting usage time and payments and filtering content based on age appropriateness.
Priorities for responsible developers
Gaming is a commercial success story, driven by technological advances and an expanding global audience eager for the next new release. However, this rapid boom brings complex challenges that must be addressed if the industry is to grow responsibly.
Respecting data privacy, supporting mod creators’ endeavours and IP, and safeguarding online play are not just ethical choices but strategic priorities for the future of gaming.

















