โ† Back to Blog

My XR Journey: From Arduino POCs in 2013 to AI-Powered VR at ADIPEC 2025

Somewhere around 2013, in a room in Mumbai, I got a fighting game to work in augmented reality. Characters appearing in the actual physical space of the room, fighting each other through the phone camera. No one had asked me to build it. There was no client brief, no deadline, no commercial purpose. I just wanted to know if it could be done.

It could. And that curiosity – the need to know if the physical and digital worlds could be made to talk to each other – has been the thread running through everything I have built in the twelve years since.

This is that story.

Mumbai – The First Commercial AR (2012–2013)

My career in immersive technology started not with VR headsets or enterprise clients but with a chocolate wrapper and a currency note. At Blink Digital in Mumbai, I was the App Developer on two of India's earliest consumer AR campaigns – the Cadbury Dairy Milk "Dosti Ka Shubh Aarambh" friendship band app in 2012, and the KFC WOW@25 currency-scanning AR campaign in 2013.

The KFC campaign was technically the more interesting problem. We used Vuforia image tracking to recognise Indian currency notes – ₹10 to ₹1000 – and render the corresponding KFC menu items as animated 3D models sitting on the surface of the note. Getting that to work reliably on notes that were crumpled, folded, worn, under variable lighting conditions, across a country with massive environmental diversity – that was a real engineering problem. The campaign hit 35,000 downloads and reached #1 on the iTunes What's Hot chart. It later won a Webby Award. For 2013 India, that was significant.

But the commercial work was only part of what I was doing. At the same time, in my own time, I was pulling apart every spatial computing SDK I could find.

The Exploration Years – Every SDK I Could Find (2013)

This is the part of my career that rarely appears on a CV but shaped everything that came after.

I connected Arduino to Unity3D – physical hardware talking to a game engine, controlling objects in a 3D environment through real-world input. That early demo is still on YouTube – a car POC from May 2013, rough around the edges, but proof of the experiment. It sounds straightforward now. In 2013 it required digging through documentation, stitching together libraries that were not designed to talk to each other, and a lot of debugging sessions that went nowhere before something finally worked.
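The Unity side of that bridge was C#, but the kernel of the idea – turning a raw sensor reading arriving over serial into a game-engine input axis – fits in a few lines. Here is a minimal Python sketch of just the parsing step, with a hypothetical `STEER:<value>` message format standing in for whatever the Arduino actually sent:

```python
# Sketch: mapping a serial-style message from a microcontroller to a
# steering axis for a game engine. The "STEER:<0-1023>" format and the
# 10-bit ADC range are assumptions for illustration, not the protocol
# the original 2013 demo used.

def parse_control_line(line: str) -> float:
    """Map a raw 'STEER:<0..1023>' line to a -1.0 .. 1.0 steering axis."""
    label, _, raw = line.strip().partition(":")
    if label != "STEER":
        raise ValueError(f"unexpected message: {line!r}")
    value = int(raw)              # 10-bit ADC reading, 0..1023
    return (value - 512) / 512.0  # centre the axis at zero

# In the real setup the line would come from a serial-port read loop;
# here we feed it a sample reading by hand.
centre = parse_control_line("STEER:512")   # mid-travel -> 0.0
```

In the actual bridge, a read loop polls the serial port each frame and the resulting axis value drives the car's steering transform.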

I experimented with Zigfu and the Microsoft Kinect – full-body gesture control, getting Unity to respond to how a person moved in physical space. I worked with face recognition SDKs – Visage Technologies, XZIMG – tracking facial features and using them as input for interactive experiences. I tried Metaio, one of the most technically advanced AR platforms of that era; Apple acquired them in 2015, a measure of just how good their technology was.

And then there was the AR fighting game. Two characters, fighting each other in the actual physical room around you, visible through the phone camera, responding to the geometry of the real space. I built that as a personal POC. Nobody commissioned it. It was just the question I wanted to answer: can I make a game that exists in the real world?

Looking back, what I was doing in that period was building a personal mental model of what was possible at the intersection of physical and digital worlds – before that intersection had a widely understood name, before AR and VR were mainstream categories, before Unity had half the XR tooling it has today. I was mapping the territory by walking it.

The Game Studio Years – Scale, Grind, and Cost (2013–2018)

The next phase of my career took me deep into mobile game development – building games, leading a branch, watching download counts climb past numbers I had not previously imagined.

The first of my games to cross one million downloads was Bike Racing 2014. It did not stop there. Over the years that followed, leading the development team and mentoring junior developers, the cumulative downloads across games I led crossed one crore – ten million. Then two crore. Somewhere in that range – two to five crore downloads, twenty to fifty million – is where I spent my late twenties.

Those were years of extraordinary output and extraordinary cost. Day and night work. The kind of pressure that produces results in the short term and extracts payment in the long term. I was building things that millions of people were playing. I was also running on empty in ways I did not fully recognise at the time.

The body keeps score. The mind keeps score. The understanding that health is not something you can defer indefinitely – that you cannot endlessly trade wellbeing for output and expect to break even – did not arrive as a single dramatic event. It accumulated. And eventually I made a deliberate choice to step back, reduce the pressure, and find a different kind of work.

The Reset – Cognizant and Enterprise XR (2019–2023)

Joining Cognizant in 2019 was a conscious reset. Enterprise development at a global technology consultancy operates at a different pace and with different demands from a game studio pushing to ship. I had time to go deeper, to think more carefully about what I was building and why.

The most significant project from this period was an internal VR Learning Studio – an enterprise platform for training and speech analysis. The system used AI-driven speech metrics to analyse how professionals communicate during training scenarios, providing feedback on clarity, pacing, and comprehension. It was the first time I integrated AI meaningfully into an XR project, not as a feature but as the core mechanism of the experience.
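To make "speech metrics" concrete: pacing is the simplest of these signals, and it falls out of any timestamped transcript. A minimal Python sketch – the segment dictionary shape here is an assumption for illustration, not the platform's internal data format:

```python
# Sketch: a words-per-minute pacing metric over a timestamped transcript.
# The segment structure ({'text', 'start', 'end'} with times in seconds)
# is hypothetical; the VR Learning Studio's real pipeline and metrics
# are internal.

def words_per_minute(segments: list[dict]) -> float:
    """Compute speaking pace across a list of transcript segments."""
    total_words = sum(len(s["text"].split()) for s in segments)
    duration = segments[-1]["end"] - segments[0]["start"]  # seconds
    return total_words / (duration / 60.0)

demo = [
    {"text": "Welcome to the safety briefing", "start": 0.0, "end": 2.5},
    {"text": "Please keep your harness clipped at all times", "start": 2.5, "end": 6.0},
]

pace = words_per_minute(demo)  # 13 words over 6 seconds -> 130 wpm
```

Clarity and comprehension scoring are far harder problems, but they sit on the same foundation: a transcript aligned to time.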

And then came the afternoon at a major real estate technology client that I have written about elsewhere on this site. I was deep in a HoloLens Mixed Reality proof of concept, stuck on a piece of spatial logic. A colleague suggested I ask ChatGPT. I laughed it off. He ran the prompt anyway. The answer appeared in seconds. It addressed exactly what I had been grinding on for an hour.

That was the afternoon I lost sleep. Not from worry. From the realisation that my decade of experience had just been given a superpower – and that the developers who understood how to use it would operate at a scale that those who ignored it could not match.

Abu Dhabi – The Full Integration (November 2023 – November 2025)

Two years in Abu Dhabi, from November 2023 to November 2025, were where everything came together.

The VR gas safety training simulator shown at GITEX Dubai 2024 used OpenAI Whisper running on-device via Unity Sentis – voice-controlled training with no internet dependency, no API latency, working in environments where connectivity cannot be assumed. The VR water tanker inspection simulator for ADIPEC Abu Dhabi 2025 integrated Convai for AI conversational guidance – a virtual instructor that responded to trainee voice input in real time, adapting to questions and adapting to mistakes.

These are not demonstrations. They are deployed training systems for industrial clients in oil and gas, built for environments where the stakes of inadequate training are real. The AI integration is not a feature – it is the mechanism that makes the training meaningfully better than what existed before.

Standing at ADIPEC 2025, watching a trainee put on a headset and interact with an AI guide in a photorealistic virtual tanker environment, I thought about the fighting game I had built in a Mumbai room twelve years earlier. The same question, answered at a different scale: can I make the digital world respond to the physical world in a way that feels real?

Back in Hyderabad – What I Am Building Now

I returned to Hyderabad in November 2025. AllInOneAICenter.com is the public-facing result of the shift that started that afternoon at the real estate client – a curated AI tools directory built by someone who uses these tools in actual production work, not someone who reads about them.

MagicBrush Stories is the unexpected thread. A kids' YouTube channel built with MagicLight AI and PixVerse. Two children's books illustrated with AI tools, written and published independently. These are not separate from the XR and AI work – they are the same curiosity applied to a different domain. What can I make if I use every tool available to me?

The Thread

Looking back across thirteen years – from the Cadbury wrapper in 2012 to the ADIPEC tanker in 2025 – the through-line is not a technology or a platform or a company. It is the question.

Can the physical and digital worlds be made to talk to each other in a way that feels real? Can that connection be used to teach something, or sell something, or help someone? Can I push it further than I pushed it last time?

Arduino to Unity. Kinect to gesture control. Metaio AR. Currency notes becoming menus. Chocolate wrappers becoming friendship tokens. Bike Racing 2014 to two crore downloads. HoloLens spatial logic. OpenAI Whisper in a VR training room. Convai in a virtual tanker. Children's books illustrated by AI.

The tools change. The question stays the same.

I am still trying to answer it.

From the Portfolio

AR Case Study · 2013
KFC WOW@25 – Webby Award Winner
AR Case Study · 2012
Cadbury Dosti AR App
VR Case Study · 2024
VR Safety Training – GITEX 2024
VR Case Study · 2025
VR Inspection – ADIPEC 2025