MEET JOY


Aluminium Mono-frame Unibody

Designed to replicate human proportion, balance, and weight distribution.

Synthetic Musculature System

Enabling controlled micro-movements: the kind of movements humans don't consciously notice, but immediately recognize as real.

Platinum-cure Silicone Skin

Cast directly to achieve realistic elasticity, translucency, and surface response.

Distributed Actuation System

Facial movement is driven by a distributed actuation system beneath the silicone.

Behavioural Expression Model

Trained on millions of real-world human interactions to reproduce natural emotional expression.

Spatially Aligned Audio Output

Voice is emitted directly from the mouth, synchronized to facial movement.

Integrated Perception Systems

Trained on extensive real-world behavioural data to interpret social cues in real time, enabling context-aware responses.

Onboard Cognitive Processing

Dedicated on-device AI compute integrates perception, motion control, and behavioural models with low latency for continuous real-time interaction.

IN THE FLESH

LONDON - 02.APR.2026
New date due to high demand!

By submitting your information above, you give Joyous Limited permission to contact you regarding the Joy project.
Successful applicants will be notified by 31.MAR.2026.