There was a period in the mid-2010s when one of the most viral things you could do with a smartphone was stand next to a life-size virtual character and take a photo. A US president appearing in your living room. A tiger walking through a shopping mall. A movie character standing beside you on the street. These experiences were not VR — you were fully in the real world, looking through your phone camera, and something that was not there appeared to be standing right next to you. I saw these experiences, built a proof-of-concept version myself, and watched this format capture public imagination in a way that few AR experiences before or since have matched.
What Life-Size AR Characters Actually Were
The "celebrity beside you" format was a specific type of augmented reality experience built around a simple but powerful concept: place a life-size, realistically scaled 3D character into the user's real environment through the phone camera, anchored to the floor, so it appears to be physically standing in the same space as the user.
Unlike face filters — which overlaid effects on the user's own face — or image target AR — which required a printed trigger — life-size character AR used plane detection to find the floor, placed a full human-scale 3D model on it, and let the user walk around it, stand next to it, and photograph themselves beside it. The character stayed fixed in real space as the camera moved. Done well, the illusion was genuinely convincing.
What made it viral was the photograph. People wanted to share the image of themselves standing beside something impossible — a world leader, a celebrity, an animal that would never be in their living room. The social sharing mechanic was built into the experience by design.
What Inspired Me — A US President in AR
The concept first caught my attention when a video circulated showing a life-size AR version of a sitting US president appearing in someone's room — standing there at full human scale, looking entirely out of place and yet entirely convincing through the phone screen. It was one of those AR moments where the technology stopped feeling like a demo and started feeling like something genuinely new.
🧪 Developer POC — Life-Size AR Figure
Building the Proof of Concept
Seeing that video, I immediately recognised what was happening technically and wanted to build my own version. The concept I had in mind was a life-size standing figure — a public figure relevant to our region — that users could place in their environment and photograph beside. I built a working proof of concept. The commercial project that would have used it did not move forward, but the POC worked: plane detection, a rigged humanoid 3D model at correct human scale, a basic idle animation, and shadow casting on the detected floor plane. Standing a virtual figure beside a real person through the phone camera and having it look convincing was one of the most satisfying technical moments of that period of my career.
How It Worked — The Technical Flow
1. Floor Plane Detection
The AR framework — typically ARCore, ARKit, or an earlier equivalent like Wikitude's ground detection — scanned the camera feed to identify horizontal surfaces. The user would slowly move their phone across the floor until the system confirmed it had detected a stable plane. A visual indicator — dots, a grid, a highlight — showed where the plane was detected.
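The SDKs do this fusion internally over many frames, combining feature points with IMU data, but the core idea can be sketched outside any AR framework. A deliberately simplified, illustrative Python sketch: given tracked 3D feature points, find the floor height supported by the most points.

```python
def detect_floor_plane(points, tolerance=0.02, min_inliers=30):
    """Toy horizontal-plane finder. For each candidate height, count the
    feature points whose y lies within `tolerance` metres of it, and keep
    the height supported by the most points. Returns the refined floor
    height, or None if too few points agree (plane not yet stable).
    Real SDKs (ARKit/ARCore) are far more sophisticated than this."""
    best_height, best_inliers = None, []
    for _, candidate_y, _ in points:
        inliers = [p for p in points if abs(p[1] - candidate_y) < tolerance]
        if len(inliers) > len(best_inliers):
            best_height, best_inliers = candidate_y, inliers
    if len(best_inliers) < min_inliers:
        return None  # keep scanning -- show the "move your phone" hint
    # Refine the estimate by averaging the inlier heights.
    return sum(p[1] for p in best_inliers) / len(best_inliers)
```

The `min_inliers` gate is why these apps asked the user to sweep the phone slowly: the plane was only confirmed once enough observations agreed.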
2. Character Placement
Once the plane was confirmed, the user tapped the screen to place the character. The 3D model appeared at that position, scaled to real human height — roughly 1.7 to 1.8 metres — standing on the detected floor. The placement anchored the character to that position in real-world coordinates.
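The tap-to-place step reduces to a ray–plane intersection. A minimal Python sketch, assuming the detected floor is a horizontal plane at height `floor_y` (in Unity, `ARRaycastManager` performs the equivalent raycast against tracked planes):

```python
def place_on_floor(cam_origin, ray_dir, floor_y):
    """Intersect the tap ray with the plane y = floor_y and return the
    world-space anchor position, or None if the ray never reaches the
    floor in front of the camera."""
    ox, oy, oz = cam_origin
    dx, dy, dz = ray_dir
    if abs(dy) < 1e-6:
        return None  # ray parallel to the floor
    t = (floor_y - oy) / dy
    if t <= 0:
        return None  # intersection behind the camera
    return (ox + t * dx, oy + t * dy, oz + t * dz)
```

The returned point becomes the anchor: the character's transform is parented to it, so the character stays put in world space while the camera moves.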
3. Idle Animation Loop
A good life-size AR character was never completely static. A subtle idle animation — slight breathing movement, small weight shifts — made the character feel present rather than frozen. This was the difference between a convincing AR figure and an obvious 3D model dropped into a scene.
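In practice the idle loop was an animation clip authored by an animator, not code, but the breathing component reduces to a slow sine applied to the torso. An illustrative Python sketch, with assumed amplitude and period values:

```python
import math

def breathing_offset(t, amplitude=0.01, period=4.0):
    """Vertical torso offset in metres at time t (seconds). A ~1 cm rise
    and fall over a ~4-second cycle is enough to keep an idle character
    from looking frozen; both values here are illustrative."""
    return amplitude * math.sin(2 * math.pi * t / period)
```

Layering a second, slower sine for weight shifts on top of this gives the subtle "standing person" motion described above.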
4. Shadow and Occlusion
The most convincing implementations cast a shadow on the floor plane that matched the real-world lighting direction, and handled occlusion — if the user walked in front of the character, the character's body was hidden by the user rather than appearing to float in front of everything. These details were technically demanding but essential for realism.
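One cheap way to get a floor shadow on early hardware was planar projection: squash the character's vertices onto the floor plane along the estimated light direction. A hedged Python sketch of that projection (real implementations did this in a shader and rendered the result as a dark, blurred silhouette):

```python
def project_shadow(vertex, light_dir, floor_y=0.0):
    """Project one character vertex onto the plane y = floor_y along the
    light direction. light_dir must point downward (negative y), e.g. as
    estimated from the real environment."""
    px, py, pz = vertex
    lx, ly, lz = light_dir
    t = (py - floor_y) / -ly  # distance along the light ray to the floor
    return (px + t * lx, floor_y, pz + t * lz)
```

Tilting `light_dir` away from straight down stretches the shadow across the floor, which is exactly how the virtual shadow was made to agree with the real lighting direction.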
5. Photo or Video Capture
The experience was designed to be photographed or filmed. The app provided a capture button that saved the AR camera view — user, real environment, and virtual character together — as a shareable image or video. This was the moment the experience was built for.
The Mall Tiger — AR Animals in Public Spaces
One of the most memorable manifestations of this format was life-size AR animals appearing in public spaces — shopping malls, public plazas, event venues. I saw installations where a full-size tiger appeared to be walking through a mall corridor, visible through phones and tablets positioned at specific viewing points. People queued to stand next to it and take photographs.
This format worked particularly well for a few reasons. Animals at real scale — a tiger, an elephant, a dinosaur — created a sense of awe that a human-scale character did not always achieve. The impossibility was more obvious and more exciting. A tiger in a shopping mall was inherently more remarkable than a person in a shopping mall, even a famous one. The experience required no explanation — you understood instantly what you were supposed to do.
🐯 Why Animals Worked Especially Well
Life-size AR animals hit a different emotional register than human figures. A tiger at real scale — roughly 2.5 metres long, 1 metre tall at the shoulder — created a visceral sense of presence that was impossible to ignore. The combination of a creature that people had only ever seen behind glass at a zoo, now apparently standing in a familiar everyday space, produced a reaction that was genuinely difficult to fake with any other medium.
Examples From This Era
🦁 2019 — Google
Google AR Animals
Google launched AR animals directly in Search results — search for any animal, tap "View in 3D", and place a life-size lion, tiger, shark, or dinosaur in your room. Reached millions instantly through the search engine. One of the most widely distributed life-size AR experiences ever.
🎬 2016–2019 — Movie Studios
Movie Character Promotions
Marvel, Disney, and other studios used life-size AR characters as movie promotion tools — stand next to Iron Man, photograph yourself with a Star Wars character. Distributed through dedicated apps and later through social media AR platforms.
🏟️ 2017–2020 — Sports
Sports Athlete AR
Sports teams and sponsors created life-size AR versions of athletes — stand next to your favourite footballer or cricket player at real scale. Used for fan engagement at stadiums, retail activations, and branded app experiences.
🛍️ 2016–2020 — Retail & Malls
Mall AR Installations
Shopping centres used life-size AR characters as footfall drivers — a life-size tiger, a movie character, a brand mascot appearing in a specific location in the mall, visible through dedicated tablets or phones placed at viewing points.
🎪 2015–2019 — Events
Event & Exhibition AR
Trade shows, exhibitions, and brand activations used life-size AR figures to draw crowds and generate social sharing. A striking AR character in an exhibition booth attracted more attention than any banner or poster.
📱 2019+ — Social Media
Snapchat & Instagram AR
Snapchat and Instagram brought life-size world AR to their platforms — brands could publish AR lenses that placed characters in the user's environment. The distribution reached billions, though the character quality was constrained by mobile processing limits.
The Technical Challenges of Getting It Right
📐 Scale Accuracy
A character that is slightly the wrong height — too tall or too short — immediately breaks the illusion. Getting human scale exactly right required careful calibration of the 3D model against real-world measurements.
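The calibration itself is a one-line calculation once you know the model's authored height (in Unity, readable from the mesh bounds and applied via the transform's local scale). A small Python sketch, with 1.75 m as an assumed target:

```python
def uniform_scale_for_height(model_height_units, target_height_m=1.75):
    """Uniform scale factor that brings a model of arbitrary authoring
    units to a target real-world standing height. E.g. a model authored
    175 units tall in centimetres needs a scale of 0.01."""
    if model_height_units <= 0:
        raise ValueError("model height must be positive")
    return target_height_m / model_height_units
```

The hard part was not the arithmetic but knowing the true authored height: exported models frequently arrived in centimetres, inches, or arbitrary units, which is where the wrong-height characters came from.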
💡 Lighting Match
A 3D character lit from a different direction than the real environment looks obviously fake. Estimating real-world lighting direction and matching it on the virtual character was one of the hardest problems in convincing AR.
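Frameworks later exposed per-frame light estimates (ARKit's ambient intensity, ARCore's Environmental HDR main light), but applying a raw per-frame estimate makes the character's lighting flicker with camera noise. A common remedy, sketched in Python, is to smooth the estimate before applying it to the virtual light; the blend weight here is an assumed value:

```python
def smooth_light_intensity(previous, estimated, alpha=0.1):
    """Exponentially smooth the per-frame light-intensity estimate so the
    virtual character's lighting drifts toward the real value instead of
    flickering. alpha is the per-frame blend weight (illustrative)."""
    return previous + alpha * (estimated - previous)
```

The same smoothing applies to the estimated light direction and colour; only the data type changes.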
🌑 Shadow Casting
A character with no shadow floats. A shadow anchors the character to the floor and makes it feel physically present. Casting a convincing shadow on an irregular floor surface required careful shader work.
🚶 Occlusion
If a real person walks in front of the AR character, the character should be hidden behind them. Without occlusion, the character appears to float in front of everything — immediately breaking immersion.
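Where a depth source exists, occlusion is a per-pixel depth comparison between the real scene and the virtual character. An illustrative Python sketch over flat pixel lists (on device this happens in the compositor or a shader; early AR phones had no depth API at all, which is why their characters floated in front of everything):

```python
def composite_with_occlusion(camera_px, virtual_px, real_depth, virtual_depth):
    """Per-pixel occlusion: keep the virtual character's pixel only where
    it is closer to the camera than the real scene (depths in metres).
    virtual_px entries are None where the character covers no pixel."""
    out = []
    for cam, virt, rd, vd in zip(camera_px, virtual_px, real_depth, virtual_depth):
        if virt is not None and vd < rd:
            out.append(virt)  # virtual fragment is nearer: character visible
        else:
            out.append(cam)   # real world occludes, or no character here
    return out
```

With this test in place, a person walking between the camera and the placed character correctly hides the character's body.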
📍 Tracking Stability
The character had to stay fixed in real-world space even as the camera moved. Any drift — the character sliding or drifting from its placed position — destroyed the illusion that it was actually there.
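One way to hide small tracking corrections is to let the rendered character chase its anchor by a bounded amount each frame, so a re-localisation jump reads as stability rather than a visible slide. This is an assumed smoothing strategy for illustration, not a documented SDK feature; a Python sketch:

```python
def corrected_position(rendered, anchor, max_snap=0.005):
    """Move the rendered position toward the tracked anchor position by at
    most max_snap metres per axis per frame (max_snap is illustrative).
    Large anchor corrections are thus spread over several frames."""
    return tuple(
        r + max(-max_snap, min(max_snap, a - r))
        for r, a in zip(rendered, anchor)
    )
```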
⚡ Performance
A high-polygon, fully animated, shadowed 3D character running at 30+ fps on a 2016 smartphone was a significant performance challenge. Every polygon and texture had to earn its place.
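A standard budgeting tool was distance-based level of detail: swap in cheaper meshes as the viewer steps back, since a distant character cannot show the detail anyway. The thresholds and mesh names below are illustrative, not from any real project; a Python sketch of the selection logic:

```python
def pick_lod(distance_m,
             lods=((1.5, "high_20k"), (4.0, "mid_8k"), (float("inf"), "low_2k"))):
    """Distance-based level-of-detail selection: return the mesh variant
    for the first distance band the viewer falls into. Nearby viewers get
    the high-poly mesh; distant ones a cheaper one."""
    for max_dist, mesh in lods:
        if distance_m <= max_dist:
            return mesh
```

Unity ships this as the LODGroup component; the point of the sketch is only that the polygon budget was spent where the camera could actually see it.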
Why This Format Faded — and What Replaced It
The life-size character format did not disappear — it evolved. The novelty of seeing a virtual character at human scale wore off as ARKit and ARCore made the capability common and as every major platform offered some version of it. What replaced pure novelty was purpose: AR characters that did something useful rather than just existing to be photographed.
Today the same core technology — plane detection, life-size 3D model placement, real-world anchoring — powers retail AR try-on (place a sofa in your room), educational AR (place a 3D model of a human heart at desk scale), training simulations, and AI avatar applications. The format matured from a novelty into an interaction paradigm.
ARCore / ARKit
Plane Detection
Unity 3D
Humanoid Rigged Models
Idle Animation
Shadow Shaders
Occlusion Handling
AR Foundation
💬 Developer Reflection — Prabhu Kumar Dasari, 13+ Years in XR
Building the proof of concept for a life-size standing AR figure was one of those development moments where you finish testing and just sit back for a moment. Looking at your phone screen and seeing a full-size figure standing in your room — correctly scaled, animated, casting a shadow — has an impact that is difficult to describe to someone who has not experienced it. The illusion works at a level that surprises even the developer who built it. The commercial project that would have used this did not proceed, but the POC taught me a great deal about what makes AR experiences feel convincing versus what makes them feel like obvious overlaid graphics. Scale is everything. Lighting is everything. The shadow is everything. Get those three things right and the brain accepts the presence of something that is not there.
Frequently Asked Questions
How did life-size AR characters work technically?
Life-size AR characters used mobile plane detection — available through ARKit, ARCore, and earlier through SDKs like Wikitude and Metaio — to identify horizontal floor surfaces. A 3D humanoid or animal model, rigged for animation and scaled to real-world dimensions, was then placed on the detected plane at the user's tap position. The model was anchored to real-world coordinates, so it stayed fixed as the camera moved. Idle animations, shadow casting on the floor plane, and lighting estimation made the character feel physically present.
What were Google AR Animals?
Google AR Animals was a feature launched in Google Search in 2019 that allowed users to place life-size 3D animals in their real environment directly through a Google Search result. Searching for almost any animal — lion, tiger, shark, alligator, eagle, horse — and tapping "View in 3D" and then "View in your space" placed a realistically scaled and animated 3D model of that animal in the user's camera view. The feature required an ARCore-compatible Android device or an iPhone with iOS 12 or later. It became one of the most widely used consumer AR experiences ever deployed.
Why did life-size AR go viral on social media?
Life-size AR experiences produced shareable photographs and videos that were visually striking and immediately understandable. A person standing next to a life-size tiger in their living room communicated its own impossibility without explanation. The combination of a familiar real environment, a recognisable character at a surprising scale, and a real person for scale reference created images that attracted attention when shared. The social sharing mechanic was intrinsic to the experience design — the photograph was the point.