FutureShock_
Expert Protege
Status: New Idea

So there's a third-party app called ToxMod (which is passive and respects privacy) that uses machine learning to automatically clip and record offending content in public multiplayer VR lobbies and send it to developers. For example, developers can configure ToxMod to activate when a user utters a certain word or phrase (like a slur), clip it, and send an audio recording to the developers, who can then take action. As I understand it, ToxMod accomplishes this by adding an agent to each lobby.

I know Meta won't use third-party apps like this, but I think Meta should develop a similar system specifically for flagging children who use their parents' accounts and passing something like isChild = true to developers via the API, so developers can add the appropriate filters to their games and improve their communities (a rough sketch of the developer side follows the list below). It wouldn't even need to be ubiquitous: it could passively audit users in multiplayer games over a period of time and flag accounts as it goes, with an appeals process where adults could unflag their own account after creating a separate junior account for their kid.

What this could mean for content:

- Developers could add a checkbox that lets adults, or young people who aren't flagged as children, opt out of playing with people under the age of 13
- Developers could bar children from certain modes or limit children to only playing with other children
- Developers could bar children from multiplayer altogether
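
To make that concrete, here is a rough sketch of what the developer side might look like, assuming the API exposed a hypothetical is_child flag on each account. The field name, the shape of each candidate record, and the filter itself are all invented for illustration; nothing here is part of the real Meta Platform SDK.

```python
# Hypothetical developer-side matchmaking filter. The "is_child" field and the
# shape of each candidate record are assumptions, not real Meta Platform SDK data.
def build_lobby(candidates: list[dict], adults_only_lobby: bool) -> list[dict]:
    """Drop flagged accounts from lobbies whose players opted out of playing with children."""
    lobby = []
    for user in candidates:
        if adults_only_lobby and user.get("is_child", False):
            continue  # flagged account: keep it out of this adults-only lobby
        lobby.append(user)
    return lobby

# Example: u2 has been flagged, so an adults-only lobby keeps only u1 and u3.
players = [
    {"id": "u1", "is_child": False},
    {"id": "u2", "is_child": True},
    {"id": "u3", "is_child": False},
]
print(build_lobby(players, adults_only_lobby=True))
```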

On a technical level, I don't think it should be that difficult to tell when a kid is using an adult's Meta account. Factors like height, voice pitch, the type of language used, and so on should be pretty easy for an algorithm to pick up on. Perhaps human moderators could be part of the decision to flag the account as well, with the algorithm just sending anonymized data to the moderation team, much like ToxMod does.
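
Purely as a hypothetical sketch (this is not how Meta actually measures or scores anything, and every field name and threshold below is made up), a detector could combine a few weak per-session signals and only escalate to human review once the average over many sessions crosses a threshold:

```python
from dataclasses import dataclass

@dataclass
class SessionSignals:
    headset_height_m: float       # average tracked headset height during a session
    median_pitch_hz: float        # median fundamental frequency of the user's voice
    child_language_score: float   # 0..1 output of a hypothetical language model

def child_likelihood(s: SessionSignals) -> float:
    """Combine a few weak signals into a rough 0..1 likelihood for one session."""
    score = 0.0
    if s.headset_height_m < 1.35:   # well below typical adult standing height
        score += 0.4
    if s.median_pitch_hz > 250:     # children's voices tend to sit above ~250 Hz
        score += 0.3
    score += 0.3 * s.child_language_score
    return min(score, 1.0)

def review_decision(session_scores: list[float], threshold: float = 0.7) -> str:
    """Average across many sessions; only escalate high scores to human moderators."""
    avg = sum(session_scores) / len(session_scores)
    return "send_anonymized_data_to_moderators" if avg >= threshold else "no_action"
```

The point is that no single signal decides anything: the algorithm just accumulates evidence over time and hands the final call to human moderators, much like ToxMod hands clips to developers.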

Why, you ask? 

Because children ruin many social VR and multiplayer VR experiences. See this post. It's gotten so bad that many mature users avoid crossplay games or give up on certain communities due to the influx of children. There are certain games, like Onward, that most children simply aren't mature enough to play (steep learning curve, slow paced, punishing), so instead of playing the game as intended, children make their own fun, which usually means disrupting anyone who does want to play it properly. That's without getting into how loud children are, how they dominate voice comms, etc.

The fact is that the majority of adults would rather not hang out with children in their free time, and unless something is done about this, I think VR is going to get a reputation as something that's for kids, like Fortnite or Roblox. If VR is ever to go mainstream, it needs to give people the ability to interact with others who are similar in age and maturity. There's also the danger this poses to children themselves, which comes from interacting with adult strangers in games like VRChat.

I think the proposed system, or a more complex one that adds more flags to an account, would do wonders for the quality of VR communities.

1 Comment
MawMawTx16
Honored Guest

Completely agree with this idea! First, having children using adult accounts on Meta Quest puts them in danger from pedophiles lurking around to seek them out. Unfortunately, parents aren't paying attention.

Secondly, I've completely given up on any of the community-type games due to the language and behavior of the children. They think they're cool going around cussing and using racial slurs and other insults. Going into VRChat is a nightmare!

There must be a way to program AI or something to monitor and block accounts for this kind of behavior.