Slack users across the web – on Mastodon, on Threads, and later on Hacker News – reacted with alarm to an unclear privacy page describing how their Slack conversations, including direct messages, are used to train what the Salesforce-owned company calls “machine learning” (ML) and “artificial intelligence” (AI) systems. The only way to opt out of these features is for your company’s Slack administrator to send Slack an email asking to turn it off.
A policy that applies to all Slack instances – not just those that have opted into its premium AI add-on – states that Slack’s systems “analyze Customer Data (e.g., messages, content, and files) submitted to Slack, as well as Other Information (including usage information) in accordance with our privacy policy and customer agreement.”
Basically, everything you type in Slack is used to train these systems. Slack says data “will not leak between workspaces” and that there are “technical controls to prevent access.” Still, we all know that conversations with AI chatbots are not exactly private, and it’s not hard to imagine this going wrong in some way. Given the risks, the company must be offering something unusually compelling in return… right?
How will Slack benefit from using your data to train AI?
The section describing the potential benefits of Slack feeding all of your conversations into its large language models states that doing so will let the company deliver better search results, better autocomplete suggestions, better channel recommendations, and (I’m not kidding) improved emoji suggestions. If all of that sounds useful to you, great! Personally, I don’t think any of these things – with the possible exception of better search – will do much to make Slack more useful for getting work done.
The emoji thing in particular is absurd. Slack literally says it needs to feed your conversations into its AI systems in order to provide better emoji recommendations. Consider this real quote, which I promise is from Slack’s website and not The Onion:
Slack can suggest emoji reactions to messages by using the content and sentiment of the message, historical emoji usage, and the frequency with which emoji are used on your team in different contexts. For example, if 🎉 is a common reaction to celebratory messages in a given channel, we’ll suggest that users respond to recent, similarly positive messages with 🎉.
I am in awe of this amazing technology and no longer worried about any privacy implications. Artificial intelligence truly is the future of communication.
How to opt out of AI training in Slack
The bad news is that, as an individual user, you can’t stop Slack from using your conversation history to train its large language models. Opting out can only be done by a Slack administrator, which in most cases means someone in your company’s IT department. There is no opt-out button in the settings – admins must send Slack an email requesting the opt-out.
Here’s Slack’s exact language on the matter:
If you wish to exclude your Customer Data from Slack’s global models, you can opt-out. To opt out, please ask your organization, workspace owners or primary owner to contact our customer support team at [email protected] with the URL of the workspace/organization and the subject “Slack global model opt-out request”. We will review your request and respond once your opt-out is complete.
This smacks of a dark pattern – making a process annoying in order to discourage people from going through with it. Here’s hoping the company makes opting out easier in response to the current backlash from customers.
A reminder that Slack chats are not private
Honestly, I’m a little amused at the prospect of using my Slack data to improve search and emoji suggestions for my former employers. In previous jobs, I often sent private messages to work friends full of negative opinions about my manager and the company’s management. I can imagine Slack recommending specific emojis whenever a specific CEO is mentioned.
While this idea is entertaining, the whole situation is a good reminder for workers everywhere: nothing you say on Slack – even in a direct message – is really private. Yes, Slack uses this information to train AI tools, but the company you work for can also easily access those private messages. If you want to talk about your company, I highly recommend using something your company doesn’t control. May I suggest Signal?
Update: After this article was published, a Slack spokesperson told Lifehacker that the company does not develop its own LLMs or other generative models using customer data, and pointed to Slack’s recently updated privacy policy (quoted extensively above) as a better explanation of how it uses customer data and generative AI.