Slack tweaks its principles in response to user outrage at AI slurping

Salesforce division Slack has responded to criticism from users outraged that its privacy principles allowed the messaging service to slurp customer data for AI training unless specifically told not to, claiming the data never leaves the platform and isn't used to train third-party models.

The app maker said its ML models were "platform level" for things like channel and emoji recommendations and search results, and it has now updated the principles "to better explain the relationship between customer data and generative AI in Slack."

In short, it wanted to clarify that customer data feeds only those platform-level models, not generative AI.

The privacy principles were overhauled in 2023 and contained the text: "To develop AI/ML models, our systems analyze Customer Data (e.g. messages, content and files) submitted to Slack."

Yes, that's correct - the principles that got customers so upset allowed Slack to analyze messages in order to train its models. In fact, pretty much anything dropped into Slack channels could be used. The implications of this are far-reaching, and users for whom the penny dropped were vocal in their criticism.

For its part, Slack has insisted that data will not leak across workspaces, although it did admit that its global models used customer data. Messages within the workspace, however, were presumably fair game.

The principles have since been tweaked slightly, and now read: "To develop non-generative AI/ML models for features such as emoji and channel recommendations, our systems analyze Customer Data."

A Slack spokesperson told The Register: "Note that we have not changed our policies or practices - this is simply an update to the language to make it more clear."

The slurping is also on by default, something that could raise an eyebrow with regulators. To turn it off, Slack demands the workspace owner email its customer experience team requesting an opt-out. It does not give any indication of how long that opt-out will take to be processed for customers who do not want to have their data used in the training of Slack's global models.

Opting out means a customer will still enjoy the benefits of the globally trained models, just without its own data contributing to them.

The Register asked Slack why it did not choose an opt-in model and will update this piece should we receive an explanation.

Slack says it uses the data to better parse queries, help with autocomplete and come up with emoji suggestions.

According to the company's privacy principles, "These types of thoughtful personalizations and improvements are only possible if we study and understand how our users interact with Slack."

Over on Threads, someone claiming to be an engineer at Slack says the company doesn't train LLMs on customer data. Aaron Maurer, who according to their LinkedIn profile works on ML and AI at Slack, says the org's "policy, as we have published numerous places, is we do not train LLMs on customer data. For instance, to add another spot: https://slack.com/help/articles/28310650165907-Security-for-Slack-AI." However, in principle, and according to Slack's Ts&Cs, it can if it wants to.

Matthew Hodgson, CEO at Element, told The Reg that he found it "utterly mind blowing" that Slack was "proposing training AI on private customer data."

"It's bad enough that cloud vendors like Slack and Teams have access to your unencrypted data in the first place, but to then feed it into an opaque and unpredictable LLM model is terrifying."

For context, Slack is not the only service to use customer data for model training. Reddit getting friendly with OpenAI and adding its forum posts to ChatGPT is another example, although customers paying a subscription to use Slack would be forgiven for being a little surprised to find their data being used as global training fodder unless they opt out.

Slack's change happened in 2023, and the furor highlights the need for users to check what their data is being used for as AI hype continues to surge through the tech industry. ®
