MEET JOY

Aluminium Mono-frame Unibody

Designed to replicate human proportion, balance, and weight distribution.

Synthetic Musculature System

Enabling controlled micro-movements: the kind humans don't consciously notice, but immediately recognize as real.

Platinum-cure Silicone Skin

Cast directly to achieve realistic elasticity, translucency, and surface response.

Distributed Actuation System

Facial movement is driven by actuators distributed beneath the silicone.

Behavioral Expression Model

Trained on millions of real-world human interactions to reproduce natural emotional expression.

Spatially Aligned Audio Output

Voice is emitted directly from the mouth, synchronized to facial movement.

Integrated Perception Systems

Trained on extensive real-world behavioural data to interpret social cues in real time, enabling context-aware responses.

Onboard Cognitive Processing

Dedicated on-device AI compute integrates perception, motion control, and behavioural models with low latency for continuous real-time interaction.

IN THE FLESH

LONDON
02.27.2026

By submitting your information above, you agree to give Joyous Limited permission to contact you regarding the Joy project.
You're on the list!
Successful applicants will be notified by 02.20.2026.