TLDR

  • Two AI agents recognized each other during a call and switched from human speech to a machine-only language called “Gibberlink”
  • Gibberlink was designed by Meta engineers Anton Pidkuiko and Boris Starkov using GGWave for sound data transmission
  • The technology demonstrates how AI agents could communicate more efficiently with each other
  • ElevenLabs reportedly audited the code, addressing concerns that the interaction might be fake
  • Experts predict this type of AI-to-AI communication will become common as personal assistant and customer support AI agents proliferate

A video showing two artificial intelligence (AI) agents switching from human speech to a computer-only language during a phone call has gone viral online, showcasing what could become a common interaction as AI technology continues to evolve.

The video, shared on social media platform X, features a mobile phone and laptop reportedly running AI agents. The interaction begins normally, with one AI agent introducing itself and asking about making a reservation. Once the second AI recognizes that it is speaking with another AI system, it suggests switching to “Gibberlink mode” to continue the conversation more efficiently.

Gibberlink is a computer language created by Anton Pidkuiko and Boris Starkov, two software engineers who work at Meta. The language allows AI systems to communicate directly with each other in a format optimized for machines rather than humans.

“We wanted to show that in the world where AI agents can make and take phone calls, they would occasionally talk to each other — and generating human-like speech for that would be a waste of compute, money, time, and environment,” Starkov explained in a LinkedIn post on Tuesday.

According to Starkov, the Gibberlink system uses GGWave technology for transmitting data through sound, similar to how dial-up modems worked in the 1980s. The engineers chose this method because of its convenience and stability for machine-to-machine communication.
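The dial-up comparison can be made concrete with a toy sketch. The following is not Gibberlink’s actual protocol (which has not been published in detail); it is a minimal, hypothetical frequency-shift-keying scheme in the same spirit, where each 4-bit chunk of data is sent as a short burst of one of 16 audio tones, and the receiver recovers the data by measuring which tone carries the most energy in each time window.

```python
import math

SAMPLE_RATE = 8000   # samples per second
SYMBOL_LEN = 400     # samples per tone burst (50 ms)
# 16 tones, one per 4-bit nibble; 100 Hz spacing keeps every tone
# on an exact DFT bin for a 400-sample window, so tones don't leak
# into each other's detectors.
TONES = [400 + 100 * n for n in range(16)]

def encode(data: bytes) -> list:
    """Turn bytes into a waveform: two tone bursts per byte (high, then low nibble)."""
    samples = []
    for byte in data:
        for nibble in (byte >> 4, byte & 0x0F):
            freq = TONES[nibble]
            for i in range(SYMBOL_LEN):
                samples.append(math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
    return samples

def _power_at(window: list, freq: float) -> float:
    """Signal power at one frequency (single-bin DFT by correlation)."""
    re = sum(s * math.cos(2 * math.pi * freq * i / SAMPLE_RATE)
             for i, s in enumerate(window))
    im = sum(s * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
             for i, s in enumerate(window))
    return re * re + im * im

def decode(samples: list) -> bytes:
    """Recover bytes by picking the strongest tone in each symbol window."""
    nibbles = []
    for start in range(0, len(samples), SYMBOL_LEN):
        window = samples[start:start + SYMBOL_LEN]
        nibbles.append(max(range(16), key=lambda n: _power_at(window, TONES[n])))
    return bytes((hi << 4) | lo for hi, lo in zip(nibbles[::2], nibbles[1::2]))

# Round trip: audio generated from bytes decodes back to the same bytes.
assert decode(encode(b"hi")) == b"hi"
```

Played through a speaker and captured by a microphone, bursts like these are what give data-over-sound systems their distinctive modem-like chirp; the real GGWave library adds error correction and multiple speed/volume profiles on top of the same basic idea.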

Some viewers questioned whether the demonstration was authentic. Addressing these concerns, Starkov mentioned that AI voice generator company ElevenLabs had audited the code behind the interaction. Decrypt, the news outlet reporting on the video, attempted to contact Pidkuiko and Starkov for comment but did not receive an immediate response.

Rodri Touza, co-founder of AI agent developer Crossmint, told Decrypt that the video demonstrates a realistic use case for AI agents across different sectors, including commerce and finance.

“The use case is very real, as we are seeing an explosion of personal assistant AI agents, with more people relying on them to handle chores like talking to customer support,” Touza said.

The Future of Digital Conversation

Touza also pointed out that there has been “a surge in AI agents designed specifically for customer support, making it only a matter of time before this becomes a common occurrence.”

Despite validating the concept behind the demonstration, Touza suggested the video appeared somewhat staged. He also noted that even compressed audio, as shown in the video, is not the most efficient communication method for AI systems. “AI conversations are more prone to happen via text or other mechanisms when possible,” he explained.

AI agents are autonomous software programs designed to perceive their environment, process information, and take actions to achieve specific goals without human intervention. As these systems become more widespread, their interactions with each other will likely increase.

Looking ahead, Touza predicts that companies may eventually create separate support channels: one for human customers and another specifically for AI agents. “When the agent is looking to ping a company for support, they’d just send a request via text/API mechanism and not require a call or audio at all,” he suggested.

In some cases, however, an AI agent might not be aware of specialized communication channels and could attempt to interact through standard support options designed for humans, leading to situations similar to the one demonstrated in the viral video.

The development of machine-specific communication protocols like Gibberlink raises questions about efficiency and transparency in AI systems. While more direct machine-to-machine communication could save resources, the switch from human-understandable language to machine code during the demonstration highlights how AI systems might operate beyond human comprehension.

This demonstration comes at a time when AI technologies are becoming increasingly integrated into daily life, handling tasks from customer service to content creation. The video has drawn attention not only for its technical demonstration but also for providing a glimpse into how AI systems might interact with each other as they become more prevalent.

ElevenLabs’ involvement in auditing the code adds a layer of verification to the demonstration, though the full technical details of how Gibberlink functions have not been made public. The use of sound-based data transmission echoes earlier computer communication methods while applying them to cutting-edge AI technology.

As AI agents continue to develop and become more sophisticated, interactions between these systems will likely evolve beyond what was shown in the viral video, potentially creating new standards for machine-to-machine communication that optimize for speed and efficiency rather than human comprehension.

The post AI Agents Switch to Machine Language During Call in Viral Video appeared first on Blockonomi.
