A developer in Portland opened GitHub on Tuesday morning and found a new toggle in their Copilot settings. It was already switched on. The label read: "Allow GitHub to use my data for AI model training."

They didn’t turn it on. Nobody did. GitHub did it for them.

Starting April 24, 2026, every Copilot Free, Pro, and Pro+ user will have their interaction data — code snippets, prompts, file names, repository structure, navigation patterns, even thumbs-up feedback on suggestions — fed directly into GitHub’s AI training pipeline. The setting is opt-out, not opt-in. If you don’t find the buried toggle at /settings/copilot/features and manually disable it before that date, your code becomes training material.

GitHub’s community discussion post announcing the change has racked up 117 thumbs-down reactions. The comments are not kind.

Here’s what makes this particularly sharp: Copilot Business and Enterprise customers are exempt. Their contracts protect them. Students and teachers get a pass too. But if you’re an individual developer paying $10 or $20 a month for Copilot Pro? Your code is fair game unless you opt out. Organizations get privacy by default. Individuals get privacy by discovery.

GitHub’s justification is straightforward — Microsoft employees’ interaction data already improved Copilot’s suggestion acceptance rates, and broader data will make the models better. That’s probably true. But the mechanism matters more than the outcome. Every major tech company that has pulled this move — defaulting users into data collection and burying the opt-out — has done so because it knows most people won’t change the setting. That’s the whole point of making it opt-out.

The timing is worth noting too. This lands less than a month after GitHub’s parent company Microsoft reported $24.1 billion in cloud revenue for Q2 2026, with AI services cited as the primary growth driver. The pressure to feed these models isn’t academic. It’s quarterly earnings.

My Opinion

I’ll be blunt: this is a trust violation dressed up as a product improvement.

Here’s what bugs me. Developers chose GitHub. Many of them pay for it. They write code on the platform, push commits, leave feedback on AI suggestions — all under a reasonable expectation that their workflow data belongs to them. GitHub just unilaterally changed that expectation and made the developer responsible for discovering the change and reversing it. That’s not consent. That’s a dark pattern with a 30-day grace period.

The two-tier system is the tell. If this data collection were truly benign, enterprise customers wouldn’t need contractual protection from it. The fact that organizations get an automatic shield while individuals get an opt-out checkbox buried three clicks deep tells you everything about who GitHub considers a customer and who it considers a resource.

If you’re on an individual Copilot plan, go to /settings/copilot/features right now and turn the toggle off. Then ask yourself: if they’ll change the default once, what stops them from changing it again after the next privacy policy update you didn’t read?


Author: Yahor Kamarou (Mark) / www.humai.blog / 27 Mar 2026