Google Hit With $270 Million Fine, Partly Due to AI Training Methods

– Google was fined roughly $270 million, in part for training its AI chatbot Bard on news outlets' content without informing them
– French regulators accused Google of breaching Commitment 1 by not negotiating in good faith and not cooperating with a monitoring trustee
– Google stated the fine was “not proportionate” but agreed to pay, and emphasized the need for clarity on how to pay publishers for content used to train AI models

French regulators fined Google approximately $270 million for breaching commitments related to negotiating deals with news outlets over the use of their content. The watchdog alleged that Google used journalists' content to train its AI chatbot Bard, now known as Gemini, without informing them. Google had previously promised to negotiate in good faith based on transparent criteria, a pledge the regulators referred to as "Commitment 1."

Regulators found that Google failed to inform publishers about the use of their content to train Bard, to cooperate with the monitoring trustee, to negotiate in good faith, and to provide complete revenue information. Google was fined €250 million for these violations and did not dispute the facts. Google stated that the fine was not proportionate to the allegations but agreed to pay in order to move forward.

The issue of how tech companies train their chatbots has been a contentious topic. In the past, a UK regulator fined facial recognition company Clearview AI for scraping biometric data, a penalty that was later overturned on appeal. The New York Times sued OpenAI for allegedly using its content to train the large language models behind ChatGPT. Some publishers have reached deals with companies like OpenAI to license their content.

Google stated that it is focused on sustainable approaches to connecting people with quality content and working constructively with French publishers. The company expressed willingness to address concerns from publishers and regulators and emphasized the need for clarity on payment processes to create a more sustainable business environment.
