
Slack faced backlash for using customer data to train AI models without clear user consent

In 2024, Slack faced significant backlash after users discovered the company's policy of using customer messages and interactions to train its AI/ML models. Critics argued Slack had not adequately informed users of this data usage. To opt out, workspace administrators had to contact Slack directly; there was no simple user-facing toggle. The episode drew comparisons to other companies' controversial AI training practices.

Scoring Impact

Topic         | Direction | Relevance | Contribution
User Autonomy | against   | secondary | -0.50
User Privacy  | against   | primary   | -1.00

Overall incident score = -0.429

Score = avg(topic contributions) × significance (medium ×1) × confidence (0.57)
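
For readers who want to reproduce the arithmetic, the sketch below recomputes the overall score from the topic contributions listed above. It assumes the displayed confidence (0.57) is a rounded value of roughly 0.572, which is what reproduces the published -0.429; the variable names are illustrative and not part of any published scoring code.

```python
# Minimal sketch of the incident-score arithmetic, not the tracker's actual code.
contributions = {
    "User Autonomy": -0.50,  # against, secondary
    "User Privacy": -1.00,   # against, primary
}
significance = 1.0   # "medium" significance multiplier
confidence = 0.572   # shown rounded to 0.57; ~0.572 matches the published score

avg_contribution = sum(contributions.values()) / len(contributions)  # -0.75
overall_score = avg_contribution * significance * confidence

print(round(overall_score, 3))  # -0.429
```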

Evidence (1 signal)

Confirms | Policy Change | May 1, 2024 | documented

Users discovered Slack's AI training policy used customer data with opt-out only available to admins

In 2024, users discovered Slack's policy of using customer messages to train its AI/ML models. Opting out required workspace administrators to contact Slack directly; no user-facing toggle was provided. The revelation drew significant backlash on social media and comparisons to other companies' controversial AI training data practices.
