How the tech industry will have to step up to fight online toxicity and child abuse

When it comes to fighting online toxicity and the sexual abuse of children, most companies say they are supportive. But complying with the law can get tricky.

The proposed federal legislation, dubbed the EARN IT Act (short for Eliminating Abusive and Rampant Neglect of Interactive Technologies), creates incentives for companies to “earn” their liability protection for illegal activity that takes place on their platforms, particularly online child sexual abuse material. Civil libertarians have condemned it as a way to circumvent encryption and an attempt to scan all messages.

If passed, the bipartisan legislation could force companies to react, said Carlos Figueiredo, director of community trust and safety at Two Hat Security, in an interview with VentureBeat. The legislation would take the extraordinary step of removing legal protections from tech companies that fail to police illegal content. That would lower the bar for suing tech companies.

Companies may be required to find illegal material on their platforms, categorize it, and verify the ages of users. Their practices would be subject to approval by the Justice Department and other agencies, as well as Congress and the president.

Two Hat Security runs an AI-powered content moderation platform that classifies or filters human interactions in real time, flagging cyberbullying and other problems. That includes the in-game chat used by most online games. Some 57% of young people say they have experienced bullying online while playing games, and 22% say they have stopped playing as a result.

Two Hat will be speaking about online toxicity at our GamesBeat Summit Digital event on April 28-29. Here’s an edited transcript of our interview with Figueiredo.

Above: Carlos Figueiredo is director of community trust and safety at Two Hat.

GamesBeat: The EARN IT Act wasn’t really on my radar. Is it significant legislation? What’s some of the history behind it?

Carlos Figueiredo: It has bipartisan support. There’s pushback already from some companies, though. There’s quite a lot of pushback from big tech, for sure.

There are two aspects to it right now. One is the EARN IT Act, and the other is coming up with a voluntary set of standards that companies could adopt. The voluntary standards are a productive aspect. It’s awesome to see companies like Roblox in that conversation. Facebook, Google, Microsoft, Roblox, Thorn–it’s great to see that in that particular conversation, that separate international initiative, there’s representation from gaming companies directly. The fact that Roblox also worked with Microsoft and Thorn on Project Artemis is awesome. That’s directly related to this topic. There’s now a free tool that allows companies to look for grooming in chat. Gaming companies can proactively use it in addition to technologies like Photo DNA from Microsoft. On an international level, there is a willingness to have all those companies, governments, and industry collaborate together to do this.

On the EARN IT Act, one of the biggest pieces is that–there’s a law from the ‘90s, a provision. It says that companies have a certain exception. They don’t need to necessarily deal with user-generated content. They’re not liable for what’s on their platform–there’s a pass, let’s say, in that sense. The EARN IT Act, the legislation, calls for industry standards, including incentives for companies who abide by them, but it also carves out an exception to this law from the ‘90s. Companies would have to meet minimum standards and be responsible. You can imagine that there’s pushback to that.

GamesBeat: It reminds me of the COPPA (Children’s Online Privacy Protection Act) law. Are we talking about something similar here, or is it very different?

Figueiredo: COPPA is a perfect example to discuss. It directly affected games. Anybody who wants to have a game catering to under-13 players in the U.S., they must protect personally identifying information of those players. Of course it has implications when it comes to chat. I worked for Club Penguin for six years. Club Penguin was COPPA-compliant, of course. It had a very young user base. When you’re COPPA-compliant at that level, you need to filter. You need to have proactive approaches.

There’s a similarity. Because of COPPA, companies had to take care of private information from children, and they also had to make sure that children were not, through their own innocence, inadvertently sharing information. Talking about child protection, that’s pertinent. What the Act could bring is the need for companies to have proactive filtering for images. That’s one potential implication. If I know there is child exploitation on my platform, I must do something. But that’s not enough. I think we have to go beyond the knowledge of it. We need to be proactive to make sure this is not happening on our platforms. We could be looking at a landscape, in the next year or so, where scrutiny on gaming companies means proactive filters for grooming and for images become a reality.

Above: Panel on Safety by Design. Carlos Figueiredo is second from right.

GamesBeat: How does this become important for Two Hat’s business?

Figueiredo: Because of the very DNA of the company–a lot of us came from the children’s space, games catering to children. We have long been working in this area, and we have deep concern for child safety online. We’ve gone beyond the scope of children, protecting teenagers, protecting adults. Making sure people are free from abuse online is a key component of our company.

We have our main tool, which is used by a lot of leading game companies around the world for proactive filters on hate speech, harassment, and other types of behavior. Some of them also work for grooming detection, to make sure you’re aware if someone is trying to groom a child. Directly related to that, there’s an increased awareness of the importance of people knowing that there is technology available to deal with this challenge. There are best practices already available. There’s no need to reinvent the wheel. There’s a lot of great process and technology already available. Another side of the company has been the partnership that we forged with the RCMP here in Canada. We work together to produce proactive filtering for child abuse imagery. We can find imagery that hasn’t been catalogued yet, that hasn’t become a hash in Photo DNA.

The implication for us, then, is that it helps us fulfill our true vision. Our vision is to ensure that companies have the technologies and approaches to reach an internet where people are free to express themselves without abuse and harassment. It’s a key goal that we have. It seems like the idea of shared responsibility is getting stronger. It’s a shared responsibility within the industry. I’m all about industry collaboration, of course. I firmly believe in approaches like the Fair Play Alliance, where game companies get together and put aside any sense of competition because they’re thinking about facilitating awesome play interactions without harassment and hate speech. I believe in that shared responsibility within the industry.

Even beyond shared responsibility is the collaboration between government and industry and players and academia. To your question about the implications for Two Hat and our business, it’s really this cultural change. It’s bigger than Two Hat alone. We happen to be in a central position because we have amazing clients and partners globally. We have a privileged position working with great people. But it’s bigger than us, bigger than one gaming community or platform.

GamesBeat: Is there something in place industry-wide to handle the EARN IT Act? Something like the Fair Play Alliance? Or would it be some other body?

Figueiredo: I know that there are already working groups globally. Governments have been taking initiatives. To give a couple of examples, I know that in the U.K., because of the team responsible for their upcoming online harms legislation, the government has led a lot of conversations and gotten industry together to discuss topics. There are active groups that gather every so often to talk about child protection. Those are more closed working groups right now, but the game industry is involved in the conversation.

Another example is the e-safety team in Australia. Australia is the only country that has an e-safety commissioner. It’s a whole commission inside the government that takes care of online safety. I had the privilege of speaking there last year at their e-safety conference. They’re pushing for a project called Safety By Design. They’ve consulted with gaming companies, social apps, and all sorts of companies globally to come up with a baseline of best practices. The minimum standards–we think of Safety By Design as this idea of having proactive filters, good reporting systems in place, all these practices as a baseline.

The Fair Play Alliance, of course, is a great example in the game industry of companies working together on multiple topics. We’re interested in enabling positive player interactions and reducing, mitigating negative behavior, disruptive behavior. There are all sorts of disruptive behavior, and we have all sorts of members in the Fair Play Alliance. A lot of those members are games that cater to children. It’s a lot of people with lots of experience in this area who can share best practices related to child protection.

Above: Carlos Figueiredo speaks at Rovio Con.

GamesBeat: How much of this is a technology problem? How do you try to frame it for people in that context?

Figueiredo: In terms of technology, if we’re talking about images–for a lot of gaming companies it could be images on their forums, for example, or perhaps they have image sharing even in the game, if they have avatar pictures or things like that. The challenge of images is critical, because the volume of child abuse imagery online is unbelievable.

The biggest challenge is how to identify new images as they’re being created. There’s already Photo DNA from Microsoft, which creates those digital IDs, hashes for images that are known images of child abuse. Let’s say we have a game and we’re using Photo DNA. As soon as somebody starts to upload a known image as their avatar or to share in a forum, we’re able to identify that it’s a known hash. We can block the image and report to law enforcement. But the challenge is how to identify new images that haven’t been catalogued yet. You can imagine the burden on a gaming company. The team is exposed to this sort of material, so there’s the point of wellness and resilience for the team.
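To make the flow Figueiredo describes concrete, here is a minimal sketch of matching uploads against a list of known image hashes. PhotoDNA itself computes a proprietary perceptual hash that survives resizing and re-encoding; the plain SHA-256 digest, the KNOWN_HASHES set, and the report_to_law_enforcement hook below are illustrative stand-ins, not the real API.

```python
import hashlib

# Hypothetical set of digests for known illegal images, as supplied by a
# clearinghouse. A cryptographic hash stands in here for PhotoDNA's
# perceptual hash purely for illustration.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def report_to_law_enforcement(digest: str) -> None:
    # Placeholder: a real integration would file a formal report with the
    # relevant authority rather than just logging a message.
    print(f"Match on known hash {digest[:12]}..., filing report")

def check_upload(image_bytes: bytes) -> str:
    """Return 'blocked' for a known image, 'allowed' otherwise."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        report_to_law_enforcement(digest)
        return "blocked"
    return "allowed"
```

The limitation he points to is exactly what this sketch cannot do: an image that has never been catalogued produces no matching hash, which is where detection of new material comes in.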

That’s a technology problem, because to identify those images at scale is very difficult. You can’t rely on humans alone, because that’s not scalable. The well-being of humans is just shattered when you have to review those images day in and day out. That’s when you need technology like what Two Hat has with our product called Cease, which is machine learning for identifying new child abuse imagery. That’s the technology challenge.
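The scale argument suggests a triage pattern: let a classifier score every upload and route only the high-risk or uncertain band to human reviewers. The model interface and thresholds in this sketch are assumptions for illustration, not Two Hat's Cease implementation.

```python
from dataclasses import dataclass

@dataclass
class Verdict:
    action: str   # "block", "human_review", or "allow"
    score: float

# Illustrative thresholds; a real system would tune these against
# precision/recall requirements and legal obligations.
BLOCK_THRESHOLD = 0.98
REVIEW_THRESHOLD = 0.60

def triage(image_bytes: bytes, model) -> Verdict:
    """Route an image by classifier score so moderators only see the
    uncertain middle band instead of every upload."""
    score = model.predict(image_bytes)  # hypothetical model interface
    if score >= BLOCK_THRESHOLD:
        return Verdict("block", score)
    if score >= REVIEW_THRESHOLD:
        return Verdict("human_review", score)
    return Verdict("allow", score)
```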

If we go on to live streaming, which is obviously huge in the game industry, it’s another problem in terms of technological limitations. It’s difficult to detect child abuse material on a live stream. There’s work being done already in this area. Two Hat has a partner that we’re working with to detect this type of content in videos and live streams. But this is on the cutting edge. It’s being developed right now. It’s difficult to tackle this problem. It’s one of the hardest problems when you put it side by side with audio detection of abuse.

The third area I want to point out is grooming in text. This is challenging because it’s not about a behavior that you can simply capture in one day. It’s not like somebody harassing someone in a game. You can usually pinpoint that to one occasion, one game session, or a few occasions. Grooming happens over the course of weeks, or sometimes months. It’s the perpetrator building trust with a child, normalizing the adult-child relationship, offering gifts, understanding the psychology of a child. That’s a huge challenge technologically.
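One way to capture behavior that unfolds over weeks rather than in a single message is to accumulate a per-user risk score across sessions and only escalate when a pattern emerges. The scoring function and threshold below are hypothetical; the sketch only illustrates the aggregation idea, not any particular grooming-detection product.

```python
from collections import defaultdict

# Running risk score per sender, persisted across chat sessions.
risk_by_user: dict[str, float] = defaultdict(float)

# Illustrative escalation threshold.
GROOMING_REVIEW_THRESHOLD = 10.0

def record_message(sender_id: str, text: str, score_message) -> bool:
    """Accumulate risk signals over time instead of judging one message in
    isolation. Returns True when a human review is warranted."""
    risk_by_user[sender_id] += score_message(text)  # hypothetical scorer
    return risk_by_user[sender_id] >= GROOMING_REVIEW_THRESHOLD
```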

There are great tools already available. We’ve referenced a couple here, including Project Artemis, which is a new avenue. Of course you have Community Sift, our product from Two Hat. There are folks doing awesome work in this area. Thorn and Microsoft and Roblox have worked on this. There are new, exciting initiatives on the cutting edge. But there’s a lot of challenge. From our experience working with global clients–we’re processing more than a billion pieces of content every day here at Two Hat, and a lot of our clients are in the game industry. The challenge of scale and complexity of behavior is always pushing our technology.

We believe that it can’t be technology alone, though. It has to be a combination of the right tools for the right problems and human moderators who are well-trained, who have considerations for their wellness and resilience in place, and who know how to do purposeful moderation and have good community guidelines to follow.

Above: Two Hat’s content moderation symposium

GamesBeat: Is anybody asking you about the EARN IT Act? What sort of conversations are you having with clients in the game industry?

Figueiredo: We have lots of conversations related to this. We have conversations where clients are coming to us because they need to be COPPA compliant, to your previous point, and then they also need to be sure of a baseline level of safety for their users. It’s usually under-13 games. Those companies want to make sure they have grooming topics being filtered, as well as personally identifying information. They want to make sure that information isn’t being shared by children with other players. They need proactive filtering for images and text, primarily for live chat in games. That’s where we see the biggest need.
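As a rough sketch of the proactive text filtering he mentions for under-13 chat, the snippet below masks anything that looks like personal information before it reaches other players. The patterns are deliberately simplistic placeholders; production filters cover far more formats and handle evasions such as spacing tricks or information spread across several messages.

```python
import re

# Simplistic, illustrative patterns for personal information a child might
# share in live chat.
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),           # phone number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),                 # email address
    re.compile(r"\b\d{1,5}\s+\w+\s+(street|st|ave|avenue|road|rd)\b", re.I),  # street address
]

def filter_chat_line(text: str) -> str:
    """Mask anything that looks like personal information before display."""
    for pattern in PII_PATTERNS:
        text = pattern.sub("***", text)
    return text
```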

Another case we see as well: we have clients who have largely successful gaming platforms. They have very large audiences, in the millions of players. They want to make a transition, for example, to a COPPA-compliant scenario. They want to do age gating, maybe. They want to address the fact that they have young users. The reality is that we know there are games out there that don’t deliberately target players who are under 13, but children will try to play everything they can get their hands on. We also seem to be coming to a time, and I’ve had many conversations about this in the last year, where companies are more aware that they have to do something about age gating. They need to define the age of their users and design products that cater to a young audience.

That design needs to have a consideration for the privacy and safety of younger users. There are brilliant companies out there that do segmentation of their audiences. They’re able to understand that a user is under 13, and they’re talking to a user who is over 13. They’re able to apply different settings based on the situation so they can still comply with COPPA. The under-13 user isn’t able to share certain types of information. Their information is protected.
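The segmentation he describes can be as simple as keying chat and sharing permissions off the user's age band. The settings and field names below are hypothetical, a sketch of how stricter defaults might be applied to under-13 accounts while the same platform serves older users.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ChatSettings:
    free_text_chat: bool         # unrestricted text chat
    can_share_images: bool
    preset_phrases_only: bool    # e.g. a whitelist / preset-phrase chat mode

# Conservative defaults for under-13 accounts, looser ones for older users.
UNDER_13 = ChatSettings(free_text_chat=False, can_share_images=False,
                        preset_phrases_only=True)
OVER_13 = ChatSettings(free_text_chat=True, can_share_images=True,
                       preset_phrases_only=False)

def settings_for(age: int) -> ChatSettings:
    """Pick the settings profile for a user based on their verified age."""
    return UNDER_13 if age < 13 else OVER_13
```

The point is that one codebase can serve both audiences while the under-13 defaults stay conservative enough to keep the platform COPPA-compliant.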

I have a lot of those conversations on a daily basis, consulting with gaming companies, both as part of Two Hat and within the Fair Play Alliance. From the Two Hat perspective, I do community audits. This involves all sorts of clients — social platforms, travel apps, gaming companies. One thing I believe, and I don’t think we talk about this enough in the game industry, is that we’ve gotten a lot of scrutiny as game companies about negative behavior in our platforms, but we’ve pioneered a lot in online safety as well.

If you go back to Club Penguin in 2008, there were MMOs at the time of course, lots of MMOs, all the way back to Ultima Online in the late ‘90s. Those companies were already doing some levels of proactive filtering and moderation before social media was what it is nowadays, before we had these giant companies. That’s one element that I try to bring forward in my community audits. I see that game companies usually have a baseline of safety practices. We have a lot of examples of game companies leading the way when it comes to online safety, player behavior, and player dynamics. You recently had an interview with Riot Games around the whole discipline of player dynamics. They’re coining a whole new terminology and area of design. They’ve put so much investment into it.

I firmly believe that game companies have something to share with other types of online communities. A lot of us have done this well. I’m very proud of that. I always talk about it. But on the flip side, I have to say that some people come to me asking for a community audit, and when I do that audit, we’re still far away from some best practices. There are games out there where, if you’re going to report another player, you have to take a screenshot and send an email. It’s a lot of friction for the player. Are you really going to go to the trouble? How many players are actually going to do that? And after you do that, what happens? Do you receive an email acknowledging that action was taken, that what you did was helpful? What closes the loop? Not a lot of game companies are doing this.

We’re pushing forward as an industry and trying to get folks aligned, but even just having a solid reporting system in your game, so you can select a reason–I’m reporting this player for hate speech, or for unsolicited sexual advances. Really specific reasons. One would hope that we’d have solid community guidelines at this point as well. That’s another thing I talk about in my consultations. I’ve consulted with gaming companies on community guidelines, on how to align the company around a set of strong community guidelines. Not only pinpointing the behaviors you want to discourage, but also the behaviors you want to promote.
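As a sketch of the in-game reporting flow described above, here is a hypothetical report structure with specific reasons and an acknowledgment step that closes the loop with the reporter. The reason list, fields, and notify callback are illustrative assumptions, not any particular platform's API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ReportReason(Enum):
    HATE_SPEECH = "hate_speech"
    HARASSMENT = "harassment"
    UNSOLICITED_SEXUAL_ADVANCES = "unsolicited_sexual_advances"
    CHEATING = "cheating"
    OTHER = "other"

@dataclass
class PlayerReport:
    reporter_id: str
    reported_id: str
    reason: ReportReason
    chat_excerpt: str = ""       # captured automatically, no screenshots or emails
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    acknowledged: bool = False   # flipped once the reporter hears back

def acknowledge(report: PlayerReport, notify) -> None:
    """Close the loop: tell the reporter their report led to action."""
    notify(report.reporter_id,
           "Thanks for your report. We reviewed it and took action.")
    report.acknowledged = True
```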

Xbox has done this. Microsoft has done very well. I can think of many other companies who have amazing community guidelines. Twitch, Mixer, Roblox. Also, in the more kid-oriented spaces, games like Animal Jam. They do a good job with their community guidelines. Those companies are already very mature. They’ve been doing online safety for many years, to my previous points. They have dedicated teams. Usually they have tools and human teams that are fantastic. They have the trust and safety discipline in house, which is also important.

Clients come to us sometimes with no best practices. They’re about to launch a game and they’re unfortunately at that stage where they need to do something about it now. And then of course we help them. That’s very important to us. But it’s awesome to see when companies come to us because they’re already doing things, but they want to do better. They want to use better tools. They want to be more proactive. That’s also a case where, to your original question, clients come to us and they want to make sure they’re deploying all the best practices when it comes to protecting an under-13 community.

Above: Melonie Mac is using Facebook’s creator tools to manage followers.

GamesBeat: Is there any hope people have that the law could change again? Or do you think that’s not realistic?

Figueiredo: It’s just a hunch on my part, but looking at the global landscape right now, looking into COPPA 2.0, looking at the EARN IT Act of course, I think it’s going to be pushed fairly quickly by the normal standards of legislation. Just because of how big the problem is in society. I think it’s going to move fast.

However, here’s my bit of hope. I hope that the industry, the game industry, can collaborate. We can work together to push best practices. Then we’re being proactive. Then we’re coming to government and saying, “We hear you. We understand this is important. Here’s the industry perspective. We’ve been doing this for years. We care about the safety of our players. We have the approaches, the tools, the best practices, the discipline of doing this for a long time. We want to be part of the conversation.” The game industry needs to be part of the conversation in a proactive way, showing that we’re invested in this, that we’re walking the walk. Then we have better hope of positively influencing legislation.

Of course we want to, again, in the model of shared responsibility–I know the government has interests there. I love the fact that they’re involving industry. With the EARN IT Act, they’re going to have–the bill would create a 19-member commission. The commission would include law enforcement, the tech industry, and child advocates. It’s important that we have the industry representation. The fact that Roblox was in the conversation there with the international initiative that’s looking toward a voluntary approach, to me that’s brilliant. They’re clearly leading the way.

I think the game industry will do well by being part of that conversation. It’s probably going to become legislation one way or the other. That’s the reality. When it comes to creating better legislation to protect children, Two Hat is fully supportive of that. We support initiatives that will better protect children. But we also want to take the perspective of the industry. We’re part of the industry. Our clients and partners are in the industry. We want to make sure that legislation accounts for what’s technically possible in practical applications of the legislation, so we can protect children online and also protect the business, ensuring the business can continue to run while having a baseline of safety by design.
