October 12, 2020
Jürgen Bänsch is Director for Policy and Public Affairs at the Interactive Software Federation of Europe (ISFE), and Director of Policy and Government Relations at the Pan European Game Information (PEGI). Ahead of PrivSec Global 2020, he takes PrivSec through the work being done in the gaming sector to protect the data and privacy of minors.
Protection of children’s data online has hit the headlines lately, with the UK Information Commissioner’s Office (ICO) announcing a new children’s code that enhances the GDPR’s provisions to safeguard the online activity of children.
But, in the European gaming industry, work has been ongoing for some time to protect minors, and Jürgen Bänsch, Director for Policy and Public Affairs at Interactive Software Federation of Europe (ISFE), is well placed to shed some light on the topic.
“Minor protection is one of the key areas of our lobbying in Brussels; it was also the reason why the ISFE association was set up,” he explains.
The ISFE is the European association that represents the video game industry in Brussels, and its membership comprises large video game publishers in Europe (the likes of Microsoft, Sony and Nintendo), mobile game publishers and national trade associations.
Bänsch’s role is that of lobbyist, particularly on consumer protection issues – including data protection – and his remit spans protection of minors, developing projects such as using commercial video games in children’s education, e-commerce, and more. Crucially, he is also Director of Policy and Government Relations at the Pan European Game Information (PEGI), a Europe-wide self-regulatory system for the gaming sector.
“At the beginning of the century, a lot of concern was voiced in the public debate about the impact of violence on children. That debate’s now a bit more mature, but it’s still the key focus of the PEGI organisation,” says Bänsch.
A ratings battle
At its core, PEGI is a rating system that assigns a minimum age to each game, below which gameplay is not advised. The ratings advise parents rather than mandate, and are accompanied by basic information explaining why the rating has been ascribed – for example, the presence of violence, drugs, or an atmosphere of fear.
During development, publishers complete a questionnaire providing detail of the game’s content and engage in a dialogue to ensure they understand the impact that content can have on their future rating. The finished game is then played by an independent administrator, who licenses the developer to use a specific rating and content descriptors for the relevant platform. Ratings will apply on a pan-European basis.
“The idea of the ratings really provides information for the parents to decide whether or not game content is suitable, but it also includes a number of requirements to keep the online environment safe, to maintain a responsible advertising policy, to adhere to all of the necessary privacy laws, and so forth. So, really, it forces companies to behave responsibly in terms of minors,” he says.
Crucially, signing up to PEGI is voluntary. But more than 2,300 companies have done so, committing to a code of conduct and principles. The rating (and complaints process for unhappy consumers or publishers) is outsourced to an independent administration – the Netherlands Institute for Classification of Audiovisual Media (NICAM). There is an appeal board composed of independent experts, empowered to decide on additional measures – for example imposing fines, sending members of the offending publisher for training, or even removing a game from the market.
In addition, there is a PEGI Council, which convenes annually and consists of officials from European member states who work in ministries involved in the protection of minors.
“I like to see it more as developing into co-regulation,” says Bänsch.
“Minor protection is mainly regulated at national level. We try to involve the national member states to a very high extent, because that is where the minor protection competencies and expertise lie, so we need to involve that level of government in the system.”
Risk and reward
So what about privacy and security risks for children while they play video games? According to Bänsch, there aren’t as many as you might expect.
“I think that is because we are not really a heavy marketing industry as such, which is maybe a strange thing to say, but our focus really is on the provision of gameplay and selling gameplay or gameplay elements to people,” he explains.
He adds: “It is really not on trying to acquire as much personal data as possible and then using that personal data for marketing purposes. That is really not who we are, and that is really the core of the confusion – that any type of digital industry would focus their business model on trying to get as much personal data from a user and then using that to get income from marketing. Absolutely not. The focus is not: let’s try to develop a little game so we can extract as much personal data as possible and make big bucks with that.”
Instead, according to Bänsch, most data collected by games publishers is non-personal data that provides insight on gameplay habits.
“There is so much competition in terms of game content that the whole focus is on: let’s try to monitor as much as possible if the consumer likes this game and what can we do to make the game better – to make it a bit more difficult, or a bit more easy, and so forth. There is a whole monitoring process of gameplay data that is essential for the video game industry, and a lot of data processing is really focused on that,” he explains.
The recent introduction of the Age Appropriate Design Code in the UK (enforceable after a 12-month transition period) obliges organisations whose online services are likely to be accessed by UK-based children – including games publishers – to follow 15 standards that build on existing data protection law under the GDPR.
Bänsch takes a positive view of the new code.
“Children in the GDPR are very briefly mentioned, and it’s very unclear sometimes what this actually means in real life. Article 8 is very short, and it’s not harmonised. The cut-off age [for consent] is not harmonised in Europe, which is a big problem as well. There hasn’t been a lot of focus from data protection authorities on the issue of children and minors and the ICO really was the first to give attention to it, which is extremely welcome,” he says.
“Obviously we would like, even more, to have Europe-wide guidance on this. But the ICO has proven itself something of an expert on the issue of children and data, and I think we have welcomed the principled approach that they take: that children deserve specific protection, and that the interests of the child are essential. I think that is something we have been doing as an industry for a long time as well.”
Bänsch emphasises the safeguards already in place, including parental control systems on devices which allow a parent to set up a specific account for their child. The child can then access services only with parental oversight – for example of data sharing, online interaction, exposure to marketing content and user-generated content.
“All of these things are safeguards that we have imposed, and they are completely aligned with the ICO code. Locally, our industry is in contact with the ICO to understand the Code a bit better, but I do not imagine that it’s going to be a totally different world when this code is enforced,” he says.
A major issue on Bänsch’s radar in the near future is the planned – but still-to-be-agreed – upgrade of the 2002 European ePrivacy Directive. The existing and proposed regulation of cookies is something he finds particularly contentious.
“So, this new ePrivacy Regulation is still in the pipeline. It’s been discussed for years now. As always, our concern is: what will be the impact on our ability to monitor gameplay data? If we cannot monitor gameplay data – and to be clear, it’s not monitoring individual people, it’s monitoring what’s happened within the game – we cannot build new games. Or you cannot develop the game, and you need to constantly develop the game if you want to have people continue to play the game.
“Gameplay data is kind of non-personal data, but obviously from a certain perspective you can say there is maybe a grey zone, because it will always have some level of device ID attached to it. Gameplay data is essential for us, so the impact on gameplay data is really important. That is why we have reservations about the ePrivacy Regulation.”
That same concern governs the ISFE’s response to any possible future regulation of non-personal data:
“The European Commission is also talking about regulation of non-personal data. In the next year, we will see a Data Act,” Bänsch says.
“The question there, of course, is what would it mean? Again, what we are interested in is looking at gameplay data in an aggregated form, so not individual gameplay data but millions of people playing at the same time. We need to be able to process that type of data. It has no implications for privacy or for protection of personal data as such, but obviously anything that will alter our ability to do that will alter our ability to build new games and will affect the business model.”
Hear more from Jürgen Bänsch at PrivSec Global, 3 December, on Technological Development and the Impact on Children’s Data Privacy. Click here for more information.