The ‘Attachment Economy’ is now coming to your desk

What do tech companies have to do to get your attention? Have you heard about the Attachment Economy? It's the next evolution of the Attention Economy.

The Attention Economy concept was first articulated by economist Herbert A. Simon in 1971, who wrote that "a wealth of information creates a poverty of attention." The idea was first applied to television and advertising, but it became central in this century with the arrival of social networking. Facebook, Instagram, YouTube, TikTok and other social networks are built entirely on the Attention Economy idea. They offer infinite scrolling and push notifications, driven by algorithms optimized to grab and hold attention, to maximize the percentage of our lives we spend glued to a screen.

We're on the brink of a new phase in the Attention Economy, called the Attachment Economy. Merely grabbing attention is no longer enough to win the global competition for users' time. Companies now see an opportunity to use AI to give their chatbots and robots personalities designed to capture our emotional attachment.

(Note: If you're interested in the Attachment Economy concept, you should know I'm writing a book about it, with draft chapters published on Substack. You can subscribe here: https://www.theattachmenteconomy.com/.)

We've already seen a new generation of online Attachment Economy software products. All of them, including Replika, Character.AI, Talkie AI, Candy AI, Nomi AI, Kindroid, Chai AI, and Romantic AI, rely on faking human characteristics to hijack human emotions for the benefit of the companies selling subscriptions.

Hardware-based Attachment Economy products now exist both in the market and in concept. Over the past year, AI hardware products explicitly designed to foster emotional attachment have emerged; they include Casio's Moflin, Mission AI's Unee, Euvola, Tuya Smart and Robopoet's Fuzozo, and Ludens AI's Cocomo and INU devices.
So far, the Attachment Economy hardware market has failed to go mainstream. Most of the products are gimmicks or novelty items, designed exclusively as "companions" or "pets." They're not useful.

One of the more interesting Attachment Economy hardware products to emerge was the Honor Robot Phone, which was unveiled March 1. It marks the first attempt to bring smartphones into the Attachment Economy.

The Honor Robot Phone is an Android smartphone with a fold-out camera that works like a DJI Osmo Pocket 4. When untucked from its compartment, it's a 200-megapixel camera on a gimbal that uses AI to simulate reactions to what it pretends to see through the camera. The phone simulates self-expression by rotating, tilting, nodding, shaking and making other "gestures." For example, it can nod in agreement when you're talking to it or say "no" by shaking its "head." It can appear to "dance" to music and simulate other kinds of body language people recognize as such. Honor advertises the product as a sentient being that looks around at the world in delight and wonder, and it plans to start selling the phone in the second half of 2026.

The next phase of Attachment Economy hardware will be desktop robots. Like the "companion" devices, they'll fake emotional intelligence and exhibit programmed personality traits via AI. But they'll also be useful. Major companies are doing research on and building prototypes of AI desktop robots.

Apple's ELEGNT

You may remember that last year I wrote in this space about Apple's ELEGNT research project. (That piece was based on a research paper published by Apple in January 2025, called "ELEGNT: Expressive and Functional Movement Design for Non-Anthropomorphic Robots.")

Apple's experimental prototypes resemble the iconic Pixar desk lamp, Luxo Jr. The prototypes are basically a lamp on an articulated arm connected to a base; the lamp moves, nods, shakes, bows and otherwise conveys recognizable body-language gestures.
Apple's J595

Apple has also been working on a desktop robot product internally called Project J595. The device is described as an iPad mounted on an articulating mechanical arm on a base, running a future AI-laden version of Siri. The screen faces you during FaceTime calls, expresses emotions on a screen-based "face" and exhibits limited "body language" with its arm. The device will not only provide a voice interface to AI and perform various iPad-like functions, but can also serve as a hub for home automation.

The robotic desktop display will run a new operating system codenamed "Charismatic," which we can assume might well power a whole range of Attachment Economy devices. The J595 project is headed by former Apple Watch head Kevin Lynch, and Apple is shooting for a 2027 launch.

Lenovo's AI Workmate Concept

Lenovo earlier this month unveiled a proof-of-concept device called the AI Workmate Concept. It's basically a black, softball-sized round object attached to a robot arm. The robot has a face on a screen (including a mustache, for some reason) that makes facial expressions.

Based on the demos, the device has a projector that can display information on the desktop or a nearby wall. It's connected to the company network and appears agentic, meaning it can go fetch information based on the conversations it overhears or on direct voice interaction from the user. It can also accept input in the form of hand gestures or typed prompts. The Workmate concept also has a scanning feature, so it can scan documents or signatures you write on paper and integrate them as digital data to be processed by the AI, printed, or added to presentations.

Lenovo's AI Workmate is a proof-of-concept device designed to be an office assistant of sorts. (Image: Lenovo)

The device doesn't do any novel work beyond regular agentic AI stuff. The difference is that it has a "personality" designed to make you feel like it's a sentient being that's helping and befriending you.
OLED AI Mini PetBot

The OLED AI Mini PetBot is another concept companion desktop robot, this one created by Samsung Display (the display-manufacturing arm of Samsung). The robot is built around a 1.34-in. circular OLED screen that serves as its "face." The Mini PetBot responds to both voice and touch input, and its circular OLED face displays animated expressions that shift in real time based on user interaction. Although it has "Pet" in its name, the Mini PetBot is also a voice interface to an AI chatbot, and so it's conceived as something useful.

All these projects have one thing in common: They don't in any way resemble a person in appearance, but they do mimic human or animal body language and gestures. They do this so that you will like them, love them, and be deluded into believing that they have sentience, feelings, thoughts and experiences, which they do not and cannot have.

Welcome to the next wave of the Attachment Economy: useful desktop devices that ape body language to make you fall in love with them.

AI disclosure: I don't use AI to do my writing. The words you see here are mine. I do use Claude 4.6 Opus via Kagi Assistant (disclosure: my son works at Kagi), backed up by Kagi Search, Google Search and phone calls, to research and fact-check. I wrote in a word processing application called Lex, which has AI tools; after writing the column, I used Lex's grammar-checking tools to hunt for typos and errors and suggest word changes.