A New FTC Enforcement Tactic Against AI Companies: Algorithmic Disgorgement

“Algorithmic disgorgement” is a relatively new and powerful enforcement tool that the Federal Trade Commission (FTC) has deployed in recent settlements with artificial intelligence (AI) companies. In light of the recent settlement with Rite Aid Corporation, this article explores what algorithmic disgorgement is, how it has developed, and its growing role in the FTC’s remedial toolkit.

What Algorithmic Disgorgement Means

Algorithmic disgorgement requires a company to delete or destroy algorithms built with unlawfully obtained data. Commissioner Rebecca Kelly Slaughter has emphasized that the remedy rests on the premise that businesses should not be permitted to profit from illegally collected data or from algorithms developed using such data.

The FTC’s interest in algorithmic disgorgement stems from concerns about the enormous appetite for data that AI model development creates. The worry is that, in the rush to build or improve AI models, businesses may shortcut their data privacy obligations. In response, the FTC treats algorithmic disgorgement as a way to prevent companies from benefiting from algorithms trained on unlawfully collected customer data.

The Evolution of Algorithmic Disgorgement

Algorithmic disgorgement is a recent development: the FTC imposed it for the first time in its 2019 settlement with Cambridge Analytica, which set the precedent for requiring deletion of algorithms built from illegally obtained data. Subsequent actions, including the 2021 settlement with Everalbum Inc. and the December 2023 settlement with Rite Aid, have reinforced algorithmic disgorgement as a potent enforcement tool.

A Landmark in Algorithmic Disgorgement: The Rite Aid Case

The December 2023 settlement with Rite Aid is especially notable as the FTC’s first use of its Section 5 unfairness authority against an allegedly discriminatory use of AI. According to the FTC, Rite Aid deployed facial recognition technology without reasonable safeguards, resulting in erroneous and discriminatory misidentifications.

As a result, the FTC requires Rite Aid to delete specific algorithms associated with its facial recognition technology. The order underscores the commitment the FTC expressed in its April 2023 “Joint Statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems” to tackle discriminatory applications of AI.

Implications for AI Companies

In July 2023, the assistant director of the FTC’s Division of Privacy and Identity Protection described algorithmic disgorgement as a “significant part” of the FTC’s AI enforcement strategy. Companies in the AI sector should therefore scrutinize the provenance of their training data to avoid FTC enforcement action and the forced destruction of the algorithms they have built.

Conclusion

Algorithmic disgorgement is becoming a key remedy as the FTC adapts its enforcement strategy to the AI era, and the Rite Aid settlement demonstrates the agency’s commitment to combating discriminatory AI practices. AI businesses must be diligent in complying with data privacy standards or risk losing their algorithms and the data behind them in an FTC proceeding.

We will continue to monitor, analyze, and report on the FTC’s enforcement tactics as they develop. Please contact us if you have questions about current proceedings or need assistance with compliance, and stay tuned as you navigate the ever-changing AI regulatory landscape.
