Online gaming is a major part of today’s gaming landscape, and with it almost always comes data collection, as well as interaction with other gamers. If you’re curious how Microsoft’s Xbox division handles both, Xbox boss Phil Spencer has shed a little light.
In an interview, Spencer was asked about data collection on Xbox Live, given how prominent online privacy concerns are nowadays.
Yeah. When you sign up and log in to Xbox for the first time, you go through some questions about the data use that we have. And we’re very transparent about the use of that data, assuming people opt in to the different kind of rings of data sharing that we have. One of them that we use is with our third-party partners that build content on our platform, that they want to know, what experiences are people having in their games? How far did they get in the games? Did they own the previous version of those games? So this is more of a creative outlet and business outlet to allow our partners to be smarter about the people who play. But that’s an opt-in on our network. And the same thing with any — we have child accounts on our platform. So if a child is on our platform, your children, I would encourage anybody out there to create a child account on Xbox Live. You can manage that with your mobile phone and get real detailed data on what your kids are doing, who the friends are on the network that they play, block people that they play, block spending. All of that is critically, critically important. It’s one of the reasons I think networks should have child accounts. This idea that certain networks out there just assume that everybody on their network is over the age of 13 I think is not a responsible way to run a network. I’m not saying we figured it all out, but I love that we’re public about what we stand for and what our goals are.
Alongside data collection, another thing usually associated with online gaming is toxicity, harassment and the like. Spencer says that AI (artificial intelligence) helps with this.
So we have a couple of things. In the background, we can monitor the sentiment of a conversation. And the A.I. does a good job of kind of highlighting when a conversation is getting to a destructive point. We have automated tools that will actually flag a message thread. And we will give the people in the thread a note that says, hey, this is getting to a point where we see it’s becoming destructive. So either calm it down or we’re going to shut down the conversation. There’s a Report A User button that’s built right into the user interface. So if our behind-the-scenes tools aren’t catching it, or if somebody says something that we don’t catch and you want to report, you report it. That comes into our systems. We have a full team of policy and enforcement that follows up on those. But I think that Report A Message, Report A User, Report A Conversation, just really, really critical to help the community — I won’t say police itself, but at least report on itself.
In related news, Spencer also mentioned that the Xbox Series X|S consoles have sold more than any previous Xbox.
Source: New York Times (login required)