โ† Back to AR History Series

Face Tracking Before Snapchat — Visage Technologies & XZImg

Prabhu Kumar Dasari — Senior Unity XR / VR / AR Developer · 13+ Years
Tested Visage Technologies SDK in Unity 2014 · Face game & overlay concepts · GITEX Dubai 2024
When Snapchat launched its first Lenses in 2015, followed by face swap and the dog ear filters in 2016, hundreds of millions of people discovered face tracking for the first time. What most of them did not know was that developers had been building face tracking applications for years before that — using professional SDKs like Visage Technologies and XZImg that required direct communication with vendors, manual Unity integration, and significantly more technical effort than opening Lens Studio. I was one of those developers, testing face tracking in Unity around 2014 — exploring face-controlled games and virtual overlays on the face before any of this was mainstream.

What Face Tracking Actually Is

Face tracking in the context of AR means detecting a human face in a camera frame and precisely locating specific points on it — the corners of the eyes, the tip of the nose, the edges of the lips, the outline of the jaw — in real time, fast enough to keep up with natural head movement and facial expression changes.

Once those points are located, two things become possible. First, the face's position and orientation in 3D space can be determined — so the application knows exactly where the face is, which way it is pointing, and how it is tilted. Second, the relative positions of the tracked points can be analysed to detect expressions — whether the mouth is open or closed, whether the eyebrows are raised, whether the eyes are looking left or right.

These two capabilities — pose estimation and expression detection — are the foundation of everything from Snapchat dog ears to medical facial analysis software to face-controlled game interfaces.
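To make the expression-detection half concrete, here is a minimal sketch — in plain Python rather than any particular SDK's API, with illustrative landmark names and thresholds — of how an open or closed mouth can be inferred from tracked points. The lip gap is normalised by the eye span so the test does not depend on how far the face is from the camera.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mouth_open_ratio(landmarks):
    """Vertical lip gap divided by the outer-eye span.

    Normalising by eye span makes the measure roughly invariant to the
    face's distance from the camera: both distances shrink together.
    """
    lip_gap = dist(landmarks["upper_lip"], landmarks["lower_lip"])
    eye_span = dist(landmarks["left_eye_outer"], landmarks["right_eye_outer"])
    return lip_gap / eye_span

def is_mouth_open(landmarks, threshold=0.25):
    """Threshold classifier; 0.25 is an illustrative guess, not an SDK value."""
    return mouth_open_ratio(landmarks) > threshold
```

The same ratio-of-distances pattern extends to eyebrow raises (brow-to-eye gap over face height) and similar expression tests.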

The Two SDKs — Visage Technologies and XZImg

Visage Technologies
🇭🇷 Croatia · Founded 1995 · Still Active
Origin: Academic computer vision research
Strength: High-precision facial landmark tracking — up to 84 points
Target: Enterprise, medical, biometric applications
Unity: Available via plugin — required manual integration
Licensing: Commercial — required direct vendor contact
Still active: Yes — now focused on identity verification and biometrics
XZImg
🌍 Independent · ~2013–2016 · Discontinued
Origin: Mobile-focused AR and vision SDK
Strength: Lightweight face detection for mobile
Target: Mobile AR developers, game developers
Unity: Mobile-friendly integration
Licensing: Developer-focused, more accessible
Still active: No — discontinued, almost no documentation remains

What I Was Building — Face Games and Virtual Overlays

My exploration of face tracking SDKs around 2014 was driven by two specific ideas that felt genuinely exciting at the time — and that Snapchat would later validate for hundreds of millions of users.

Idea 1 — A Face-Controlled Game

The concept was simple: use facial expressions or head movement to control a game instead of touch controls. Open your mouth — character jumps. Turn your head left — character moves left. Raise your eyebrows — special action triggers. This kind of face-as-controller interaction felt novel and playful, and face tracking SDKs like Visage Technologies made it technically possible.

The challenge was latency and reliability. A game control system needs to respond consistently and immediately — any lag between the player's expression and the game's response breaks the experience. Getting face tracking fast enough and stable enough on the mobile hardware of 2014 to feel like a real game control mechanism was genuinely difficult. The SDKs were capable, but the gap between "technically works in a demo" and "feels good to play" was significant.
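One standard way to close part of that "feels good to play" gap is hysteresis: trigger the action when the expression signal crosses a high threshold, and only re-arm once it falls back below a lower one, so frame-to-frame tracking jitter cannot fire the control repeatedly. A minimal sketch in plain Python (thresholds are illustrative, not from any SDK):

```python
class ExpressionTrigger:
    """Hysteresis debouncer for a noisy per-frame expression signal.

    Fires once when the signal rises above `on_threshold`, then stays
    silent until the signal drops below `off_threshold` and re-arms.
    A single shared threshold would spam events as the value jitters
    around it; the gap between the two thresholds absorbs that noise.
    """
    def __init__(self, on_threshold=0.3, off_threshold=0.2):
        self.on_threshold = on_threshold
        self.off_threshold = off_threshold
        self.active = False

    def update(self, value):
        """Feed one measurement per frame; returns True on the rising edge."""
        if not self.active and value > self.on_threshold:
            self.active = True
            return True          # fire the jump exactly once
        if self.active and value < self.off_threshold:
            self.active = False  # re-arm for the next mouth-open
        return False
```

Feeding it a mouth-open ratio each frame yields one clean "jump" event per deliberate mouth opening, at the cost of a slightly higher activation threshold.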

🎮 Developer Concept — 2014

Using facial expressions to control a mobile game — open mouth to jump, head tilt to steer, eyebrow raise for special moves. The same interaction concept that casual mobile face games explored years later. Getting the latency low enough to feel responsive was the core technical challenge.

Idea 2 — Virtual Overlays on the Face

The second concept was placing virtual objects on the face — glasses, masks, hats, effects — that stayed precisely anchored to facial features as the person moved their head, changed their expression, and shifted their distance from the camera. This is exactly what Snapchat filters do. In 2014, doing this required integrating a professional face tracking SDK, writing custom rendering code, and solving the alignment and scaling problems manually. Two years later, Snapchat made it a one-tap experience for anyone.

The technical requirements for a convincing face overlay are more demanding than they might appear. The virtual object must stay precisely anchored to the correct facial landmarks — glasses must sit on the nose bridge and over the ears, not drift as the person moves. The scale must update as the person moves closer or further from the camera. The object must respond to expression changes — glasses should not clip through the face when the person smiles. Getting all of this right with the tools available in 2014 required careful implementation.
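The anchoring and scaling problems reduce to a small amount of geometry. A hedged sketch in plain Python (a Unity project would do this in C# against the SDK's landmark API; `ref_eye_span` is an illustrative authoring constant, not an SDK value): derive the overlay's anchor point, scale, and roll from the two outer eye landmarks.

```python
import math

def overlay_transform(left_eye, right_eye, ref_eye_span=0.063):
    """2D anchor, scale, and roll for a glasses overlay from eye landmarks.

    `ref_eye_span` is the eye distance (in the tracker's units) at which
    the glasses model was authored at scale 1.0. Returns a tuple
    (anchor_x, anchor_y, scale, roll_degrees).
    """
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    eye_span = math.hypot(dx, dy)
    # Anchor at the midpoint between the eyes (roughly the nose bridge).
    anchor_x = (left_eye[0] + right_eye[0]) / 2
    anchor_y = (left_eye[1] + right_eye[1]) / 2
    # Scale proportionally: a closer face means a larger measured eye span.
    scale = eye_span / ref_eye_span
    # Roll follows the line between the eyes, so the glasses tilt with the head.
    roll = math.degrees(math.atan2(dy, dx))
    return anchor_x, anchor_y, scale, roll
```

Running this every frame keeps the overlay sized and tilted with the head; expression-aware behaviour (not clipping during a smile) additionally needs per-frame landmark deformation, which is where most of the 2014-era hand-tuning went.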

How Visage Technologies Face Tracking Worked

Visage Technologies' SDK tracked the face by locating a set of facial landmarks — specific anatomical points — in each camera frame. The full SDK could track up to 84 facial feature points, though real-time mobile applications typically used a reduced set for performance.

Facial Landmark Groups — What Visage Technologies Tracked
Eyes: Inner/outer corners, upper/lower lids, pupil centre
Eyebrows: Inner, centre, outer points — tracks raises and furrows
Nose: Tip, nostrils, bridge — key for 3D pose estimation
Mouth: Corners, upper/lower lip edges — open/close detection
Jaw: Outline points — face shape tracking
Cheeks: Reference points for expression detection
Forehead: Upper face anchoring for full head pose
3D Pose: Pitch, yaw, roll — full head orientation in space

In Unity integration, these landmark positions were exposed as 3D coordinates that updated every frame. A virtual object could be positioned relative to any landmark — anchor glasses to the nose tip and eye corner points, anchor a hat to the forehead landmarks, anchor a beard to the chin and jaw points. The SDK handled the computer vision; the developer handled the rendering and anchoring logic.
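That anchoring logic can be sketched as a tiny per-frame update: place the object at a chosen landmark plus a fixed offset, and exponentially smooth the result so per-frame tracking noise does not make the overlay shiver. This is plain Python for clarity — in a Unity project the same logic would live in a component's per-frame update — and both the offset and `alpha` values are illustrative.

```python
class LandmarkAnchor:
    """Anchors a virtual object to a tracked landmark with smoothing.

    `offset` is a fixed displacement from the landmark (e.g. a hat sits
    a little above the forehead point). `alpha` controls exponential
    smoothing: higher values track faster but pass through more jitter,
    lower values are steadier but add perceptible latency.
    """
    def __init__(self, offset=(0.0, 0.0, 0.0), alpha=0.5):
        self.offset = offset
        self.alpha = alpha
        self.position = None

    def update(self, landmark):
        """Feed this frame's 3D landmark position; returns the smoothed anchor."""
        target = tuple(l + o for l, o in zip(landmark, self.offset))
        if self.position is None:
            self.position = target  # first frame: snap straight to the target
        else:
            a = self.alpha
            self.position = tuple(a * t + (1 - a) * p
                                  for t, p in zip(target, self.position))
        return self.position
```

The smoothing trade-off here is exactly the latency problem described in the face-game section: the same filter that hides jitter also delays response.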

Getting Visage Technologies Running in Unity — The Reality

In 2014, integrating Visage Technologies into a Unity project was not a simple asset store import. It required downloading the SDK directly from Visage Technologies — which meant contacting their sales team — integrating the native libraries into the Unity project, writing bridge code between the SDK's output and Unity's component system, and debugging build pipeline issues that the documentation did not always cover. I have email threads from April 2014 with Visage Technologies' team working through exactly these integration issues on Android builds.

The APK built from Android's build system worked correctly. The same project built through Unity's build pipeline initially produced incorrect output — the camera feed appeared as a texture rather than a proper AR view. Resolving this required direct support from Visage Technologies' engineering team. This was the reality of working with professional AR SDKs in 2014: the capability was there, but reaching it required persistence and direct vendor relationships.

What Developers Were Building With Face Tracking in 2013–2015

🎭
Virtual Try-On
Eyewear brands were among the first commercial adopters — letting customers virtually try on glasses frames through their phone camera. Precise nose bridge and eye tracking made accurate placement possible. This predated modern retail AR by several years.
🎮
Face-Controlled Games
Using facial expressions as game controls — mouth open to jump, head tilt to steer, eye blink for actions. A novel interaction paradigm that required low-latency face tracking to feel responsive enough for gaming.
😷
Face Mask & Overlay Effects
Virtual masks, makeup, and accessories overlaid precisely on facial features. The direct predecessor of Snapchat filters — same concept, dramatically more technical implementation required.
🎬
Entertainment & Marketing
Branded face experiences for marketing campaigns — transform your face into a character, apply brand-themed effects, generate shareable photos. Early versions of what Instagram AR became.
🏥
Medical & Biometric
Visage Technologies' precision tracking made it suitable for medical facial analysis, pain assessment, drowsiness detection for drivers, and identity verification. The enterprise focus that kept the company alive after consumer face AR was commoditised.
🎓
Education & Research
Academic research into facial expression recognition, emotion detection, and human-computer interaction. The SDKs provided the tracking infrastructure; researchers built the analysis on top.

Then vs Now — How Face Tracking Changed

😓 2014 — How It Was
  • Commercial SDK licence required
  • Direct vendor contact to download
  • Manual Unity plugin integration
  • Build pipeline debugging required
  • Custom rendering code needed
  • Limited performance on mobile hardware
  • Developer community was tiny
  • No tutorials, sparse documentation
  • Email vendor for support
✨ 2026 — How It Is
  • Built into iOS and Android OS
  • Free via ARKit / ARCore
  • One-line Unity AR Foundation setup
  • Works out of the box
  • High-level face mesh API
  • 52 expression blend shapes plus a dense face mesh on modern hardware
  • Millions of developers
  • Thousands of tutorials available
  • Stack Overflow, Discord, forums

When Snapchat Changed Everything — 2015 to 2016

Snapchat launched its first Lenses feature in September 2015 — initially offering a small set of face effects — and added the now-famous face swap in 2016. By that year, the dog ear filter had become one of the most recognised digital experiences in the world. Within two years, face tracking went from a niche capability that required specialist SDKs and significant development effort to something a 12-year-old could apply with a single tap.

What Snapchat did was not invent face tracking — that technology had existed in professional SDKs for years. What Snapchat did was abstract all the complexity away entirely, build a creator platform around it, and distribute it to hundreds of millions of users simultaneously. The technology was the same. The accessibility was completely different.

For developers who had been working with face tracking SDKs before Snapchat, watching this happen was a familiar experience. The technology we had been using in small-scale professional projects was suddenly reaching global scale because a consumer platform had made it frictionless.

💬 Developer Reflection — Prabhu Kumar Dasari, 13+ Years in XR

Testing face tracking in 2014 — building a game concept controlled by facial expressions, overlaying virtual objects on the face — gave me a very early sense of how compelling this interaction paradigm was. The experience of getting it to work, including the email threads with Visage Technologies support to resolve Unity build issues, taught me something important about early-stage technology: the capability always exists before the tooling catches up. The ideas behind Snapchat filters were not new when Snapchat launched them. What was new was making those ideas accessible to everyone. That gap between technical capability and mass accessibility is where the real innovation happens — and it is the gap that platforms like Snapchat, Instagram, and eventually Apple and Google closed for face tracking.

Frequently Asked Questions

Did Snapchat invent AR face filters?

No — face tracking technology and virtual face overlays existed years before Snapchat launched Lenses in 2015. Professional SDKs like Visage Technologies had been offering face tracking capabilities since the early 2010s, and developers were building face-controlled games and virtual overlay applications before Snapchat made the concept mainstream. What Snapchat invented was the consumer-accessible platform and creator ecosystem that brought these capabilities to hundreds of millions of users with zero technical friction.

Is Visage Technologies still active in 2026?

Yes — Visage Technologies is still an active company, though its focus has shifted significantly from mobile AR to enterprise applications. Today it primarily serves the identity verification, biometrics, and driver monitoring markets — areas where its high-precision facial analysis gives it an advantage over consumer-focused alternatives. The consumer face AR market it helped pioneer is now dominated by platform-native implementations in ARKit, ARCore, and social media platforms.

What happened to XZImg?

XZImg was a smaller, mobile-focused AR and vision SDK that was active in the early-to-mid 2010s. It has since been discontinued and very little documentation about it remains publicly available. This is common for the generation of AR SDKs that existed before ARKit and ARCore โ€” many of them were absorbed by larger companies, pivoted to different markets, or simply shut down as the platform-native frameworks made their core offerings redundant.

How many facial points did early face tracking SDKs track?

Early professional face tracking SDKs like Visage Technologies could track up to 84 facial landmark points in their full implementations, though real-time mobile applications typically used reduced sets for performance reasons. Modern ARKit Face Tracking on supported iPhone models provides 52 blend shape coefficients (expression parameters) and a full 3D face mesh — significantly more expressive data than was practically achievable on 2014 mobile hardware.