Every large and small company in Silicon Valley seems to be busy doing nothing but driving the AI revolution forward. New large language models with ever more impressive capabilities are launched almost daily, only to seem like old hat a short time later.
Only one really big player doesn’t seem to be involved. Because they don’t want to be? Or because they can’t? I’m talking about Apple, currently the most valuable company in the world, with a market valuation of 3.5 trillion dollars. So far, Apple has not managed to come up with a language model of its own, and Apple Intelligence has been plagued by problems. Apple also has little to show in terms of GPUs, and with the discontinuation of its autonomous car project last year, the company’s biggest AI-related initiative came to a screeching halt.
Then again, Apple has never shone by being the first mover, the first to market with a product or product category. But as soon as Apple does introduce a product, it usually changes the way we have looked at that category until then.
A first look at the expressive and cute lamp robot that Apple machine learning researchers presented in a paper entitled ELEGNT: Expressive and Functional Movement Design for Non-Anthropomorphic Robot gives exactly this feeling. The accompanying four-minute video of a lamp that interacts with users, reacts to gestures, and moves accordingly is strongly reminiscent of the famous lamp from the animation studio Pixar. Not only that: thanks to its expressive reactions, the lamp comes across as remarkably human-like and cute.
The lamp is equipped with several actuators so that it can rotate and move its arm to wherever light is needed. A built-in camera lets it recognize objects, and a microphone and loudspeaker, coupled with a chatbot, let it talk to people. The LED head can even serve as a projector, but more on that in a moment.
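To make the "expressive and functional" idea in the paper's title concrete, here is a minimal sketch of how a lamp controller might score candidate arm movements on both task success and liveliness. Everything in it — the types, the utility formulas, and the weighting — is my own illustrative assumption, not Apple's implementation.

```swift
// Sketch: score candidate trajectories on functional utility (does the
// final pose point the light at the target?) and expressive utility
// (does the motion read as "alive"?), then pick the best weighted sum.

struct Trajectory {
    // A sequence of (pan, tilt) joint angles in radians.
    let waypoints: [(pan: Double, tilt: Double)]
}

// Higher when the final pose points the lamp head at the target.
func functionalUtility(_ t: Trajectory, target: (pan: Double, tilt: Double)) -> Double {
    guard let end = t.waypoints.last else { return 0 }
    let error = abs(end.pan - target.pan) + abs(end.tilt - target.tilt)
    return 1.0 / (1.0 + error)
}

// Toy proxy for expressiveness: a bit of extra, curved motion reads as
// lively; a real system would rate gestures such as nodding or hesitating.
func expressiveUtility(_ t: Trajectory) -> Double {
    let pathLength = zip(t.waypoints, t.waypoints.dropFirst())
        .reduce(0.0) { $0 + abs($1.1.pan - $1.0.pan) + abs($1.1.tilt - $1.0.tilt) }
    return min(pathLength / 2.0, 1.0)
}

// `expressiveness` in [0, 1] trades task efficiency against character.
func pickTrajectory(_ candidates: [Trajectory],
                    target: (pan: Double, tilt: Double),
                    expressiveness: Double) -> Trajectory? {
    func score(_ t: Trajectory) -> Double {
        let f = functionalUtility(t, target: target)
        let e = expressiveUtility(t)
        return (1 - expressiveness) * f + expressiveness * e
    }
    return candidates.max { score($0) < score($1) }
}
```

The single `expressiveness` weight is the interesting design knob here: at 0 the lamp moves like a tool, at 1 it moves like a character.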
The video shows several scenarios in which the lamp directs its beam to wherever light is needed, based on the task it recognizes the user performing. In one scenario, it notices that an object is about to be photographed with an iPhone and illuminates it. The lamp also recognizes when it cannot perform a task because doing so would require moving beyond its reach. It can remind the user to drink a glass of water if she has been reading for too long; once she has put the glass down again, the lamp directs its light back towards the book. In another example, the lamp helps the user with a task by projecting a tutorial with instructions onto the wall. In a final “social” example, the lamp plays music and moves rhythmically to it.
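The scenarios above boil down to a perceive-and-decide loop: figure out what the user is doing, check whether the light can reach the target, and respond either functionally or expressively. Here is a hedged sketch of such a loop; the types, action cases, and reach limit are assumptions for illustration, not the paper's actual architecture.

```swift
// Sketch: map a perceived scene to a lamp action, falling back to an
// expressive gesture when the functional response is out of reach.

enum LampAction {
    case illuminate(pan: Double, tilt: Double)
    case shakeHead                    // expressive "I can't reach that"
    case projectTutorial(name: String)
    case danceToMusic
}

struct PerceivedScene {
    let targetPan: Double
    let targetTilt: Double
    let tutorialRequested: String?    // e.g. a how-to the user asked about
    let musicPlaying: Bool
}

func decide(_ scene: PerceivedScene, maxPan: Double = 1.5) -> LampAction {
    if scene.musicPlaying { return .danceToMusic }
    if let tutorial = scene.tutorialRequested {
        return .projectTutorial(name: tutorial)
    }
    // An out-of-range target triggers an expressive refusal instead of a
    // futile movement, mirroring the scenario described above.
    if abs(scene.targetPan) > maxPan { return .shakeHead }
    return .illuminate(pan: scene.targetPan, tilt: scene.targetTilt)
}
```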
The lamp is reminiscent of another video, which shows a photographer interacting with her camera. Mounted on a gimbal, the camera moves freely, follows her gestures, nods in agreement or turns around, and likewise appears very human and expressive.
These examples show that embodied AI does not necessarily have to take on a humanoid robot form; almost any object in our household or workplace could be a candidate for AI-controlled animation. The iron and the washing machine are just two examples I have already mentioned in this article.
Not only does Apple use AI to add genuinely useful functions to an object as commonplace as a lamp, it also gives it expressive and cute traits. These make such everyday objects easier for people to engage with and, at the same time, more delightful to use.
We can look forward to seeing what other everyday objects Apple and others will augment with AI in similarly useful and expressive ways.
