A courtroom drama of sorts, complete with a judge wielding a gavel and an attentive jury, played out in London's King's Cross on Monday, just steps from the headquarters of Google DeepMind.
Google was on trial, accused of breaking its promises on AI safety.
The participants in this mock production were demonstrators from PauseAI, an activist group worried that tech companies are racing ahead on AI with little regard for safety. On Monday, the group gathered near King's Cross station to demand that Google be more transparent about the safety checks it runs on its most advanced AI models.
Demonstrators protest outside Google DeepMind's office. Hugh Langley/Business Insider
PauseAI argues that Google broke a promise it made at the 2024 AI safety summit in Seoul, South Korea, where the company agreed to consider external evaluations of its models and to publish details of how external parties, including governments, were involved in assessing the risks.
When Google launched Gemini 2.5 Pro, its latest frontier model, in April, it did neither of those things. The company said that was because the model was still "experimental." A few weeks later, it released a "model card" containing some safety details, which some experts criticized as too thin, TechCrunch previously reported. While the safety report refers to third-party testers, it doesn't specify who they were.
For PauseAI, that's not good enough. More important, the organization said, is not letting this slide and allowing Google to set a precedent.
"If we let Google get away with breaking its word, it sends a signal to all other labs that safety promises aren't important and commitments made to the public don't need to be kept," PauseAI organizer Ella Hughes said, addressing a crowd that had gradually grown to around 60 people.
"Right now, AI companies are less regulated than sandwich shops."
Demonstrators rally at King's Cross in London. Hugh Langley/Business Insider
There's a lot one could fear about AI. Economic disruption. Job displacement. Misinformation. Deepfakes. The annihilation of humanity as we know it.
Focusing on the specific issue of Google's safety report is a way for PauseAI to push for a concrete, achievable change in the short term.
About 30 minutes into the demonstration, several intrigued passersby had joined the cause. After a rousing speech from Hughes, the group moved to Google DeepMind's offices, where the mock courtroom production took place. Some Google employees leaving for the day looked on, perplexed, amid chants of "Stop the race, it's dangerous" and "Test, don't guess."
"International regulation of AI is in a very bad place," PauseAI's founder, Joep Meindertsma, told Business Insider, pointing to how US Vice President JD Vance warned against over-regulating AI at the AI Action Summit.
A mock courtroom trial takes place in front of Google DeepMind's office. Hugh Langley/Business Insider
Monday was the first time PauseAI had rallied around this specific issue, and it's not clear what comes next. The group is engaging with members of the UK Parliament who will raise these concerns, but Meindertsma was reluctant to say much about how Google has engaged with the group and its demands. (A Google spokesperson did not respond to a request for comment for this story.)
Meindertsma hopes support will grow, pointing to polls suggesting that the public at large is worried about AI moving too fast. Monday's group was made up of people from a range of backgrounds, some of whom work in tech. Meindertsma himself runs a software development company and regularly uses AI tools from Google, OpenAI, and others.
"Their tools are incredibly impressive," he said. "That's what worries me so much."
Have something to share? Contact this reporter via email at [email protected] or Signal at 628-228-1836. Use a personal email address and a nonwork device; here's our guide to sharing information securely.