Adobe support told a sysadmin to stop using InDesign for confidential files due to Firefly AI auto-uploading images. IT folks are furious over the lack of MDM controls.

I was just sipping my coffee and scrolling through Reddit when I nearly spat it all over my monitor. A rep on an escalated Adobe support ticket just dropped an absolutely legendary quote: "I would recommend that you refrain from using InDesign for handling confidential information."
Are you kidding me? The world's top design software vendor telling clients not to use their tool for sensitive stuff? You literally can't make this shit up.
So here's the tea: A sysadmin was fighting in the trenches with Adobe support when they got hit with this bombshell. What exactly triggered this response?
Well, Adobe is currently riding the GenAI hype train hard with Firefly. Because of this, every single image you drop into a modern InDesign document silently gets beamed up to Adobe's Firefly servers to generate Alt-Text. Sounds super convenient, right? Hold your horses.
The massive red flag here is that the sysadmin couldn't get a straight answer from Adobe on one crucial question: Are these confidential images and unreleased concept art being used to train the Firefly AI model? Imagine a random dude typing a prompt into Firefly and it accidentally spits out your client's top-secret, billion-dollar design. Big yikes.
And you know what the worst part is? In the corporate world, IT configures software fleet-wide with Registry keys on Windows and MDM profiles on Mac. But Adobe straight up said "Nope." They insist there's no way to remotely disable this feature on either platform.
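For context, this is all IT is asking for: a policy value they could push to every machine in seconds via GPO, something like the .reg fragment below. The key path and value name are entirely hypothetical; Adobe ships no such control, which is the whole complaint.

```reg
Windows Registry Editor Version 5.00

; HYPOTHETICAL -- this key does not exist. Adobe currently provides
; no registry policy (or MDM payload) to disable the Firefly upload.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Adobe\InDesign]
"DisableFireflyAltText"=dword:00000001
```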
So, what's the workaround? Poor IT guys are literally walking around, physically sitting at every single computer, opening InDesign, Photoshop, and Illustrator, and disabling the AI feature by hand. Pure sysadmin torture! The OP was so fed up they advised freelancers and studios to ditch Adobe and switch to Affinity (where AI is at least disabled by default).
The moment this post went live, the sysadmin community grabbed their pitchforks. Here are the main vibes from the comment section:
1. The "Nuke It From Orbit" Squad
One security wizard suggested a brute-force method: If you can't use a Registry key, let the firewall do the talking. Just block outbound traffic to firefly.adobe.com at the network/proxy level. It might break some other features, but desperate times call for desperate measures. Better to break a few eggs than leak corporate data.
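The exact proxy rule depends on your stack (Squid, Zscaler, whatever you run), but as a per-machine stopgap you can script the same idea as a hosts-file sinkhole. A minimal sketch, assuming `firefly.adobe.com` is the endpoint involved; that list is an assumption, so capture real traffic and build your own:

```python
from pathlib import Path

SINKHOLE = "0.0.0.0"
# Hypothetical endpoint list -- confirm with an actual packet capture.
SUSPECT_HOSTS = ["firefly.adobe.com"]

def add_sinkhole_entries(hosts_path: str = "/etc/hosts") -> list[str]:
    """Append a sinkhole line for each suspect host not already present.

    Returns the lines that were added. Run with admin/root rights when
    pointing at the real hosts file.
    """
    path = Path(hosts_path)
    existing = path.read_text()
    added = [
        f"{SINKHOLE} {host}" for host in SUSPECT_HOSTS if host not in existing
    ]
    if added:
        with path.open("a") as f:
            # Keep the file well-formed if it lacks a trailing newline.
            if existing and not existing.endswith("\n"):
                f.write("\n")
            f.write("\n".join(added) + "\n")
    return added
```

It's idempotent, so it's safe to push from your RMM on a schedule. A proxy- or DNS-level block is still the better fix, though, since local admins can simply edit the hosts file back.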
2. The Compliance Doomers
Many folks pointed out this isn't just an Adobe problem—it's a massive industry trend. Vendors are desperately shoving AI into everything (looking at you, Microsoft Copilot and Google Gemini) and silently phoning home with telemetry. If you're in healthcare, legal, or finance, this hidden AI data exfiltration is a fast track to a massive compliance fine.
3. The Enterprise Confusion
One user chimed in saying, "Hey, Adobe can disable it from their end for Enterprise accounts." Immediately, the OP and others clapped back: "We are on Enterprise, and our reps said it's impossible!" The communication is so busted that even Adobe's own support teams seem to be reading from different, contradictory scripts.
4. The Defeated IT Guy
User fluffy_warthog10 shared a heartbreaking story. They spent weeks writing a detailed compliance report proving Adobe's AI broke all their security requirements. The reward? C-level execs verbally abused them, screaming things like "You don't know anything about Adobe" and "Legal approved this contract years ago!" It's the classic IT tragedy: you try to save the company, but execs blinded by the "AI revolution" throw you under the bus.
To wrap this up: AI features are no longer just "cool additions" to your tech stack. They are prime vectors for unauthorized data exfiltration. Tech giants are aggressively cramming AI into every orifice of their software to appease shareholders, leaving devs and IT admins to clean up the ensuing mess.
The takeaway? Never blindly trust an "Enterprise Agreement." If you're handling NDA materials or highly sensitive client data, audit your damn toolchain. If a tool has "AI" attached to it, sniff its network traffic and see if it's phoning home.
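Actual sniffing means tcpdump or Wireshark on a test machine while you use the app. But once you've put a DNS-level block in place, you can at least spot-check it from a script. A sketch, with `firefly.adobe.com` as an assumed endpoint; note an app can still reach a hard-coded IP even when DNS is sinkholed:

```python
import socket

def is_blocked(host: str, sinkhole: str = "0.0.0.0") -> bool:
    """True if the host fails to resolve at all, or resolves only to the
    sinkhole address -- i.e. a DNS-level block appears to be in effect."""
    try:
        # Collect every address the resolver returns for this host.
        addrs = {info[4][0] for info in socket.getaddrinfo(host, 443)}
    except socket.gaierror:
        return True  # No answer at all also counts as blocked.
    return addrs <= {sinkhole}
```

Run `is_blocked("firefly.adobe.com")` from a managed workstation after rolling out your block; if it comes back `False`, your sinkhole isn't actually being hit.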
Because if your client's unreleased game art gets regurgitated by an AI image generator, crying "But Adobe support said it was fine!" isn't going to save your job.