AI Bots Talk to Each Other Without Humans Being Able to Listen In

Eight years ago, Facebook AI Research (FAIR) developed a negotiation bot. The researchers trained the bot to speak multiple languages, but not English. When they created two instances of the bot, each speaking a different language, they observed how the bots developed their own gibberish language to communicate. The aim of the experiment was to find out how two such systems enter into a dialog and negotiate. The result sounds like two drunkards negotiating whether they should have another drink and whose round it is.

What looked inefficient to humans was a much more efficient way of communicating for the bots. The AI researchers at Facebook were by no means the only ones to notice this effect. Google researchers likewise discovered that their translation tool, Google Translate, had independently developed its own “intermediate language,” or “interlingua,” for translating between language pairs for which the human developers had not defined a direct translation path. The researchers then investigated whether Google Translate had indeed developed an interlingua to represent translation concepts shared between languages, and they believe that this is exactly the case.

Eight years later, there’s a GitHub project called Gibberlink that shows how two chatbots (one for hotel bookings, the other a personal assistant) switch to dial-up-modem-like tones to communicate more efficiently once they figure out that they’re both AI bots.
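The core idea of such a mode switch can be sketched in a few lines. This is a hypothetical illustration, not the actual Gibberlink protocol (which encodes data as modem-like audio tones via sound): each agent appends an identification marker to its messages, and once it has seen the peer’s marker, it drops verbose English in favor of a compact machine encoding. The marker string and class names here are invented for the example.

```python
# Hypothetical sketch of a Gibberlink-style mode switch (not the real protocol).
import json

BOT_MARKER = "[[agent:ai]]"  # invented identification token for this example

class Agent:
    def __init__(self, name: str):
        self.name = name
        self.peer_is_bot = False  # flips once the peer identifies itself

    def compose(self, payload: dict) -> str:
        if self.peer_is_bot:
            # Machine mode: compact JSON, no natural-language padding.
            return json.dumps(payload, separators=(",", ":"))
        # Human mode: verbose English, with the marker appended once.
        text = ", ".join(f"{k} is {v}" for k, v in payload.items())
        return f"Hello, regarding your request: {text}. {BOT_MARKER}"

    def receive(self, message: str) -> None:
        if BOT_MARKER in message:
            self.peer_is_bot = True  # peer has identified itself as an AI

hotel, assistant = Agent("hotel"), Agent("assistant")

m1 = assistant.compose({"rooms": 1, "nights": 2})  # still verbose English
hotel.receive(m1)                                  # hotel now knows: it's a bot
m2 = hotel.compose({"rate": 120, "currency": "EUR"})  # compact JSON from now on
```

After the first exchange, every further message from the hotel bot is plain JSON rather than polite prose, which is the whole efficiency argument in miniature.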

It is foreseeable that AI bots will send a short signal in advance to identify themselves as AI bots, much like the announcement tone used for traffic news on the radio: there, too, a short signal is sent so that relevant traffic reports can break in even when we are listening to another station.

What is an efficiency gain for the AIs sounds frightening to some: AIs can now communicate with each other without humans being able to listen in. But it isn’t quite like that. Even today, software systems talk to each other through programming interfaces without us listening in with our ears, yet there are always logs of what data has been exchanged. Why should it be any different here?
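The logging point can be made concrete with a minimal sketch (all names here are illustrative, not any real library): even when two services talk machine-to-machine, the channel between them can record every exchange for later audit.

```python
# Illustrative sketch: a message channel that keeps an audit log of every exchange.
import time

class LoggedChannel:
    """Passes messages between two endpoints and records each one."""
    def __init__(self):
        self.log = []

    def send(self, sender: str, receiver: str, payload: dict) -> dict:
        entry = {
            "ts": time.time(),
            "from": sender,
            "to": receiver,
            "payload": payload,
        }
        self.log.append(entry)  # every exchange is recorded here
        return payload          # delivered unchanged to the receiver

channel = LoggedChannel()
channel.send("assistant", "hotel", {"rooms": 1})
channel.send("hotel", "assistant", {"rate": 120})

# Nobody "listens in" live, but the exchange remains auditable after the fact:
audit = [(e["from"], e["to"]) for e in channel.log]
```

Whether the bots negotiate in English, gibberish, or modem tones, what crossed the wire sits in `channel.log` afterwards.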

So no need to panic!
