The AI conversation is still dominated by chatbots and productivity copilots — tools that live in tabs, apps and office workflows. But the consumer tech pipeline points in another direction as well: AI that’s physically there, sharing a space with you, responding with movement, sound, eye contact, and the small rituals of a pet or a helpful housemate.
That theme has been visible in early CES 2026 coverage from outlets running live round-ups, which — even when robots aren’t the headline act — keep slotting in home devices that behave less like appliances and more like companions (see TechRadar’s CES 2026 live hub of announcements and Engadget’s round-up of the coolest CES 2026 gadgets). The point isn’t that everyone is buying a robot dog tomorrow; it’s that AI is increasingly being packaged as a relationship, not just a service.
This “embodied” shift matters because the interface changes expectations. A chatbot can be closed. A device with eyes that track you across the lounge room can invite something closer to social behaviour — and with it, social attachment, social risk, and a different set of questions about trust.
Why robot pets keep returning — and why they’re changing
Robot pets have been “coming soon” for decades. Yet the category keeps resurfacing because it fits a gap that phones never quite fill: ambient companionship that doesn’t demand constant typing, scrolling or decision-making. It’s not a coincidence that many consumer examples look like animals — creatures we already understand as emotionally expressive without language.
Sony’s Aibo remains a widely cited case study: a dog-shaped robot designed to learn routines and react to people in a home. Sony positions Aibo as an interactive companion with behaviours that evolve over time (Aibo’s current positioning is outlined on the official Sony Aibo product page). Casio’s Moflin goes further into “emotional creature” territory, presenting itself as a fuzzy, non-verbal pet whose responses are meant to feel mood-like (Casio describes the concept on its Moflin product page). Other consumer robot pets, such as Keyi Tech’s Loona, lean into animation-style expressiveness and playful interaction (see the Loona robot pet overview).
What’s changing now is less about novelty and more about the software supply chain. Advances in on-device chips, better microphones and cameras, and more capable speech and vision models (even when pared back for cost) mean these products can do more than pre-programmed tricks. They may be able to maintain context, recognise household members (sometimes in limited ways), and respond with a more coherent “personality”. That’s part of what makes them more compelling — and what can make their presence more sensitive.
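To make “recognise household members (sometimes in limited ways)” concrete, here is a minimal, hypothetical sketch of how on-device recognition is often structured: embeddings for enrolled members are stored locally and matched by cosine similarity against a new observation. The `HouseholdRecogniser` class, the embedding format and the 0.8 threshold are all assumptions for illustration, not any vendor’s actual implementation.

```python
import math

SIMILARITY_THRESHOLD = 0.8  # assumed value; real devices tune this per sensor


def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


class HouseholdRecogniser:
    """Match an observed embedding against locally enrolled members.

    Everything stays on-device: enrolment is a dict in memory,
    and unknown faces/voices simply return None.
    """

    def __init__(self):
        self.enrolled = {}  # name -> reference embedding

    def enrol(self, name, embedding):
        self.enrolled[name] = embedding

    def identify(self, embedding):
        best_name, best_score = None, 0.0
        for name, reference in self.enrolled.items():
            score = cosine(embedding, reference)
            if score > best_score:
                best_name, best_score = name, score
        # Below the threshold, treat the person as unknown rather than guess.
        return best_name if best_score >= SIMILARITY_THRESHOLD else None
```

The threshold is the interesting design lever: set it high and the robot “forgets” people in bad lighting; set it low and it risks misidentifying visitors, which is exactly the kind of limited, failure-prone recognition the paragraph above describes.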
Companionship has evidence — but it’s not settled
The public pitch for companion robots is often simple: they can reduce loneliness. The evidence is more complicated. Reporting in Nature notes that social robots show promise in aged care and other settings, but also highlights mixed results and the difficulty of running long-term, real-world studies that separate novelty effects from lasting benefits (see Nature’s feature on the future of social robots). A separate analysis makes a similar point: researchers disagree about how effective home social robots will be, especially outside controlled environments (see ScienceDirect on why not everyone is convinced social robots are good for us).
That ambiguity matters when marketing meets vulnerability. Products aimed at older adults, people living with dementia, or children may be beneficial, but they can also risk becoming a substitute for human contact if deployed primarily as a cost-saving measure rather than an additional layer of care. Tombot, for instance, frames its robotic puppy as an emotional support companion specifically designed with seniors and dementia care in mind (outlined on Tombot’s official site). Therapeutic robots such as PARO — the robotic seal used in care settings — have amassed a large body of published studies, though the quality and consistency of outcomes vary and remain debated in clinical communities (PARO collates research links on its research overview page).
A more defensible conclusion, based on current reporting and the state of research, is narrower than the hype: companion robots may improve mood, engagement and social interaction for some people in some contexts, particularly when thoughtfully integrated into care — not when dropped in as a replacement for relationships.
Embodied AI: hard problems, new shortcuts
Moving AI into a body exposes it to the messiness of homes: cluttered floors, mixed lighting, accents, background TV noise, toddlers, pets, visitors, and the sheer unpredictability of daily life. This is one reason “embodied AI” is often described as a particularly challenging frontier in the field. It’s not enough to produce fluent text; the system has to perceive, decide, and act safely in real time.
Built In’s explainer on what embodied AI is and why it matters captures the central challenge: intelligence in the abstract is different from intelligence that must navigate physics and social norms at once. A lounge room is not a benchmark dataset.
Yet consumer companion robots can sometimes sidestep the hardest parts of robotics. Many robo-pets don’t need to manipulate objects, climb stairs, or make their own way through the house. Their job is to be emotionally legible and responsive in a small radius: look at you when you speak, come when called, “react” to touch, and express simulated excitement or calm. In other words, they can deliver a convincing experience of presence without attempting general-purpose autonomy.
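Being “emotionally legible in a small radius” can be pictured as a small state machine rather than a general-purpose planner. The sketch below is hypothetical: the states, events and transitions are invented for illustration and do not describe any shipping product, but they show how a predictable, constrained behaviour repertoire can still feel responsive.

```python
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    ATTENDING = auto()    # orienting toward a speaker
    APPROACHING = auto()  # "come when called"
    EXCITED = auto()      # reacting to touch or play


class PetBehaviour:
    """Tiny finite-state machine: emotional legibility without autonomy."""

    TRANSITIONS = {
        (State.IDLE, "voice_detected"): State.ATTENDING,
        (State.ATTENDING, "name_called"): State.APPROACHING,
        (State.APPROACHING, "touched"): State.EXCITED,
        (State.IDLE, "touched"): State.EXCITED,
        (State.ATTENDING, "quiet"): State.IDLE,
        (State.APPROACHING, "quiet"): State.IDLE,
        (State.EXCITED, "quiet"): State.IDLE,
    }

    def __init__(self):
        self.state = State.IDLE

    def on_event(self, event: str) -> State:
        # Unknown (state, event) pairs leave the state unchanged, which
        # keeps behaviour predictable -- a design goal for this category.
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

The deliberate limitation is the point: nothing here manipulates objects or plans routes, yet a device cycling through these few states, with matching sounds and eye animations, can read as “alive” to a household.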
This is also why the category may grow even if “robot butlers” remain elusive. A pet-like device can be valuable even when its capabilities are constrained, provided the interaction feels reliable and safe.
The privacy trade: microphones, cameras, and the cloud
A companion robot’s superpower — noticing you — is also its biggest privacy liability. Many devices rely on microphones for wake words and speech interaction; some use cameras for navigation, gestures or recognition; many connect to cloud services for updates and AI features. That stack creates a persistent question: what data is captured, where is it processed, and who can access it?
Sony’s Aibo documentation makes clear that the product is part of a connected service ecosystem, with specific terms and privacy materials explaining data handling. The details matter: always-on sensors, even if technically “listening for a wake word”, can feel different in a home than on a phone you can put face-down on a table.
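A common privacy-preserving pattern behind “listening for a wake word” is local gating: audio sits in a short, in-memory ring buffer and nothing leaves the device until an on-device detector fires. The sketch below is an illustration under stated assumptions; `detector` and `uploader` are hypothetical stand-ins for a local wake-word model and a cloud client, not a real device’s API.

```python
from collections import deque

RING_SECONDS = 2        # assumed rolling buffer length, kept only in RAM
CHUNKS_PER_SECOND = 10  # assumed audio chunk rate


class WakeWordGate:
    """On-device gating sketch: audio stays local until the wake word
    fires; only then is anything handed to the cloud uploader."""

    def __init__(self, detector, uploader):
        self.detector = detector  # stand-in for a local wake-word model
        self.uploader = uploader  # stand-in for a cloud streaming client
        self.ring = deque(maxlen=RING_SECONDS * CHUNKS_PER_SECOND)
        self.streaming = False

    def on_audio_chunk(self, chunk: bytes) -> None:
        if self.streaming:
            self.uploader(chunk)  # post-wake-word: audio leaves the device
            return
        # Pre-wake-word: chunk lives only in RAM and is overwritten as
        # the fixed-size ring buffer fills -- nothing is uploaded.
        self.ring.append(chunk)
        if self.detector(chunk):
            self.streaming = True
            for buffered in self.ring:  # include the wake phrase itself
                self.uploader(buffered)
            self.ring.clear()
```

Whether a given product actually works this way is exactly what a buyer cannot see from the outside, which is why the local-versus-cloud question keeps surfacing in privacy policies and teardowns.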
More broadly, consumer advocates have warned that in-home robots and smart devices can expose intimate household data — including imagery of living spaces and behavioural patterns — if security is weak or data sharing is expansive. Consumer Reports has covered these risks in adjacent categories such as camera-equipped robot vacuums, coverage that illustrates how quickly “helpful navigation” can become “sensitive footage” (see Consumer Reports on privacy issues with robot vacuums).
For buyers, the practical advice is unglamorous but effective: read the privacy policy, check whether features work locally or require cloud processing, use strong account security, and understand what happens if the company folds or sunsets servers. A robot pet that becomes a brick without cloud access is not just inconvenient; it can be an emotional disruption if a household has bonded with it.
Regulation is catching up — unevenly
As companion robots become more socially convincing, regulators are increasingly interested in how AI systems interact with people, especially children and other vulnerable groups. The European Union’s AI governance framework is often referenced as a bellwether. The European Parliament’s overview of the Artificial Intelligence Act explained outlines a risk-based approach, with stricter obligations for “high-risk” systems and attention to sensitive applications.
How these rules apply in practice to consumer companion robots will depend on specifics: whether a device is marketed for education or care, whether it performs biometric identification, and whether it makes claims about emotional inference. Privacy and AI governance experts have also debated the appropriateness of “emotion recognition” claims in consumer products, given contested scientific foundations and the risk of manipulative design; analysis outlets such as the IAPP have tracked how the EU framework treats these categories (see the IAPP recap of the EU AI Act final compromise).
In Australia, the policy environment is still evolving, with government materials emphasising “safe and responsible AI” and potential guardrails rather than a single, comprehensive AI statute. The Department of Industry, Science and Resources has published consultations and proposals under the banner of Safe and responsible AI in Australia. For companion robots, that translates to an open question: will rules focus on data protection, marketing claims, safety standards, or the psychological impacts of human-like design?
The gap between capability and governance is where much consumer risk can sit — particularly when products are imported, updated remotely, and driven by third-party model providers that can change behaviour after purchase.
Where this is heading: less “robot”, more relationship design
The most telling shift isn’t that robots are suddenly everywhere. It’s that AI product design is increasingly about shaping a relationship over time: routines, habits, “personality”, and the sense that a device remembers you. The body is the delivery mechanism — a way to make software feel like a presence.
That brings potential upside. For people living alone, for families wanting a low-stakes introduction to robotics, or for care settings seeking additional tools for engagement, companion robots can offer moments of comfort and interaction that a screen can’t replicate. It also brings real trade-offs: privacy, security, long-term support, and the ethics of devices engineered to be emotionally “sticky”.
Robot pets may not have been the headline-grabbing stars of CES 2026. But their steady evolution is a signal worth taking seriously: AI is leaving the browser window and arriving in our homes as something you can pat, talk to, and miss when it’s switched off.
