Valve has 1,700 CPUs working non-stop to bust CS:GO cheaters


All popular multiplayer games fight never-ending battles against cheaters. But as Counter-Strike: Global Offensive rose in 2014 to become the most-played FPS in the world, a few things made it particularly susceptible to hacking. 

As the 10th game released on Source (and the third mainline CS), there were already piles of knowledge on how to tamper with Valve's engine. Hacks built for ancient stuff like Half-Life 2: Deathmatch could, with a few minutes' tweaking, perhaps function in CS:GO (although Valve says they'd be trivial to detect). Design-wise, the traits that make CS:GO a skillful game of angles and accuracy also make cheats more effective. Weapons are highly lethal, so putting those guns in the hands of aimbots makes them even more devastating. And CS:GO's focus on information and stealth means that knowing the location of your opponent is invaluable—fertile ground for wallhacks.

CS:GO's fight against hackers is "important, valuable work" according to Valve, but if you've played the FPS, you may have noticed that things began to get dramatically better a couple of years ago. Not only did Reddit complaints and frustrated replay clips of cheaters seem to circulate less frequently, but the perception of cheating—as hazardous as anything to a competitive game's health—seemed to dissipate. We published stories of high-profile bans, along with news of thousands of cheaters getting banned in single waves. How was Valve purging most of these jerks?

"Cheaters didn't get the memo we were doing it, and players were super happy and we were just busting cheaters left and right. It felt so good."

—John McDonald, Valve

In a rare moment of in-depth transparency on this topic, Valve programmer John McDonald spoke at the Game Developers Conference last week in San Francisco about how he and Valve used deep learning techniques to address CS:GO's cheating problem. This approach has been so effective that Valve is now using deep learning on "a bunch of problems," from anti-fraud to aspects of Dota 2, and Valve is actively looking for other studios to work with on implementing its deep learning anti-cheat solution in other games on Steam.

Solving CS:GO's cheating problem

While between projects sometime in 2016, McDonald noticed that "The only thing the community was talking about was cheating," based on online discussion and a private email address that received mail from CS:GO pros. "It was this, just, deafening conversation," he says. The uptick in VAC bans around this period, McDonald says, supported what Valve was hearing.

To combat the issue, Valve and McDonald looked to deep learning, a solution that had the potential to operate and adapt over time to new cheating techniques—attractive traits to Valve, which has historically elected to automate aspects of Steam rather than hire hundreds of new employees to tackle issues like curation. What Valve created is known as VACnet, a project that represents about a year of work. 

VACnet works alongside Overwatch, CS:GO's player-operated replay tool for evaluating players who have been reported for bad behavior. VACnet isn't a new form of VAC, the client- and server-side tech that Valve has used for years to identify, say, when someone's running a malicious program alongside a game. VACnet is a new, additional system that uses deep learning to analyze players' in-game behavior, learn what cheats look like, and then spot and ban hackers based on dynamic criteria.

"Our customers are seeing fewer cheaters today than they have been, and the conversation around cheating has died down tremendously."

—John McDonald, Valve

McDonald says that "subtle" cheats remain difficult to solve, but in building VACnet, Valve decided to target aimbots first because they present themselves at specific, easily definable points during rounds of CS:GO: when you're shooting. This allowed Valve to build a system that captured the changes in pitch (Y-axis) and yaw (X-axis)—degree measurements of a player's view angle—half a second before a shot and a quarter second after. These measurements, along with other pieces of information like what weapon the player is using, their distance, and the result of the shot (hit, miss, headshot?), are the individual 'data particles' that together form what Valve calls "atoms," essentially a data package that describes each shot.
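
To make the idea concrete, here's a minimal sketch of what one of those atoms might look like as a plain record. The field names, types, and sampling-window comments are assumptions for illustration; Valve hasn't published VACnet's actual schema.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ShotAtom:
    """Hypothetical record describing a single shot (illustrative only)."""
    pitch_deltas: List[float]  # changes in vertical aim (degrees), ~0.5 s before the shot to ~0.25 s after
    yaw_deltas: List[float]    # changes in horizontal aim (degrees) over the same window
    weapon: str                # e.g. "ak47"
    distance: float            # distance to the target, in game units
    result: str                # "miss", "hit", or "headshot"
```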

VACnet can't necessarily spot a cheater based on one atom, though. "We need a sequence of them, what we actually want is 140 of them, or at least that's what the model uses right now … We just take the 140 out of an eight-round window and we stuff those into the model, and we're like, 'Hey, if you were to present this sequence of 140 shots to a [human] juror, what is the likelihood you would get a conviction?'"
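
As a rough illustration of that case-building step, the sketch below stacks the most recent 140 atoms into a single model input and asks a classifier for a conviction probability. The feature encoding, the `should_submit_to_overwatch` helper, the generic `model.predict` interface, and the 0.9 threshold are all assumptions, not Valve's pipeline.

```python
import numpy as np

SEQUENCE_LENGTH = 140  # shots per case, per McDonald's description


def build_case(atoms):
    """Stack the most recent 140 shot-atoms from an eight-round window
    into one fixed-size model input (the encoding is a placeholder)."""
    if len(atoms) < SEQUENCE_LENGTH:
        return None  # not enough shots yet to build a case
    window = np.asarray(atoms[-SEQUENCE_LENGTH:], dtype=np.float32)
    return window[np.newaxis, ...]  # shape: (1, 140, features_per_shot)


def should_submit_to_overwatch(model, atoms, threshold=0.9):
    """Ask the classifier: if a human juror saw these 140 shots, how likely
    is a conviction? Submit the case only above a chosen threshold."""
    case = build_case(atoms)
    if case is None:
        return False
    conviction_probability = float(np.ravel(model.predict(case))[0])
    return conviction_probability >= threshold
```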

Pretty good, as it turns out. Both players and VACnet report players for judgment in Overwatch. But when VACnet reports a suspected cheater, that player almost always turns out to be one.

"When a human submits a case to Overwatch, the likelihood that they get a conviction is only 15-30 percent, and that varies on a bunch of factors, like the time of the year, is the game on sale, is it spring break. There's a bunch of things but the point is human convictions are very low," says McDonald. "VACnet convictions are very high, when VACnet submits a case it convicts 80 to 95 percent of the time." 

A slide from McDonald's talk: a model of the relationship between Overwatch and VACnet.

That doesn't mean Valve plans to phase out its cheater theater, Overwatch. Both systems work together: VACnet learns detection techniques from Overwatch, McDonald says. "Because we're using Overwatch and we didn't actually replace all player reports, we just supplemented them, that means that the learner [VACnet] is getting the opportunity to evolve along with human jurors. So as human jurors identify new cheating behaviour, the learner has the opportunity to do the same thing."
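
If that feedback loop were written down, it might amount to little more than periodically refitting the model on the jurors' decisions. The sketch below assumes Overwatch verdicts are used directly as training labels; the `retrain_on_overwatch_verdicts` helper and `model.fit` interface are illustrative, not Valve's code.

```python
def retrain_on_overwatch_verdicts(model, cases, verdicts):
    """Hypothetical feedback loop: Overwatch verdicts become training labels.

    `cases` are the 140-shot sequences submitted for judgment (by players or
    by VACnet); `verdicts` are the jurors' decisions (1 = convicted,
    0 = not convicted). Refitting on fresh verdicts is one way a learner
    could pick up new cheating behaviour as human jurors flag it.
    """
    model.fit(cases, verdicts)
    return model
```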

McDonald adds that when VACnet has been recently retrained with player data to spot a new cheat, the conviction rate might be nearly 100 percent for a short period before cheaters adapt to it. When Valve quietly rolled out VACnet to CS:GO's 2v2 competitive mode earlier this month, McDonald says "the conviction rate for that mode was 99 percent for a while, it was great. Cheaters didn't get the memo we were doing it, and players were super happy and we were just busting cheaters left and right. It felt so good."

Large Hacker Collider

To bring VACnet to life, Valve had to build a server farm that could handle CS:GO's millions of players and loads of data, and grow as CS:GO grew. Right now there are about 600,000 5v5 CS:GO matches per day, and evaluating all the players in one of those matches takes about four minutes of computation, amounting to 2.4 million minutes of CPU effort per day. You need about 1,700 CPUs to do that daily work.
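
That CPU count falls straight out of the arithmetic; here's a quick back-of-the-envelope check of the figures quoted in the talk:

```python
matches_per_day = 600_000          # 5v5 CS:GO matches per day
cpu_minutes_per_match = 4          # computation to evaluate one match
minutes_per_cpu_per_day = 24 * 60  # one CPU delivers 1,440 CPU-minutes a day

total_cpu_minutes = matches_per_day * cpu_minutes_per_match  # 2,400,000
cpus_needed = total_cpu_minutes / minutes_per_cpu_per_day    # ~1,667, i.e. about 1,700
print(total_cpu_minutes, round(cpus_needed))
```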

So Valve bought 1,700 CPUs. And 1,700 more, "so we'll have room to expand," McDonald says, hinting at Valve's intention to bring VACnet to other games. Conservatively, Valve had to have spent at least a few million dollars on that hardware: 64 server blades with 54 CPU cores each and 128GB of RAM per blade. That's a drop in the bucket compared to the estimated $120M CS:GO brought in from game copy sales alone in 2017, but it probably represents one of the beefiest anti-cheating farms built for a single game.

The work continues, but from McDonald's perspective, VACnet is kicking ass, and has potential application not only in non-Valve games, but in other stuff on Steam. "Deep learning is this sea-change technology for evolutionary behaviour," says McDonald. "We think that it is really helping us get developers off of the treadmill without impacting our customers in any way. Our customers are seeing fewer cheaters today than they have been, and the conversation around cheating has died down tremendously compared to where it was before we started this work."

Early December 2017 brought a new milestone for the system: VACnet started producing more convictions than non-convictions in Overwatch. "The system works great," says McDonald.

Following publication, GDC uploaded the full video of McDonald's hour-long talk to the GDC Vault.

Evan Lahti
Global Editor-in-Chief

Evan's a hardcore FPS enthusiast who joined PC Gamer in 2008. After an era spent publishing reviews, news, and cover features, he now oversees editorial operations for PC Gamer worldwide, including setting policy, training, and editing stories written by the wider team. His most-played FPSes are CS:GO, Team Fortress 2, Team Fortress Classic, Rainbow Six Siege, and Arma 2. His first multiplayer FPS was Quake 2, played on serial LAN in his uncle's basement, the ideal conditions for instilling a lifelong fondness for fragging. Evan also leads production of the PC Gaming Show, the annual E3 showcase event dedicated to PC gaming.