In 2016, Pokémon Go was a global phenomenon, making press headlines almost every day and bringing augmented reality to the masses for the first time.
Niantic Labs' mobile game not only captured the attention of nostalgic millennials looking to "catch 'em all", but also of generations born after the original Pokémon frenzy.
Something you probably don't know is that the very first augmented reality video game was actually created some 16 years earlier, in 2000, at the University of South Australia and had very little to do with Pikachu and his friends.
Similar to the Pokémon Go game of 2016, virtual creatures would appear in a real-world environment, but instead of catching them, users would shoot at them 🔫.
Indeed, the very first prototype of an augmented reality video game was ARQuake and, as you can easily guess, it was built from the popular first-person shooter (FPS) released by id Software in 1996: Quake.
In the game, players must find their way out of various maze-like, medieval environments while battling a variety of monsters using a wide array of guns, collecting objects, and completing objectives.
Quake, which will turn 25 years old on June 22nd this year, is widely regarded as one of the most influential video games of all time. The successor to the hit series Doom pioneered a number of conventions that all gamers now take for granted, such as multiplayer-specific maps, social play built around clans, and, above all, a fully 3D world.
The game was built upon the technology and gameplay of Doom, offering full real-time 3D rendering and early support for 3D acceleration through OpenGL.
But the crucial factor that made the team behind ARQuake choose this game for its augmented reality version was that the source code was freely available. They simply didn't have to write their own game from scratch 😏. Good news, considering all the technological challenges they faced while developing ARQuake.
In 1997, Steven Feiner presented the Touring Machine, the first mobile augmented reality system. Inspired by Ivan Sutherland's head-mounted display, it used a see-through head-worn display with an integral orientation tracker, a backpack holding a computer, differential GPS, a digital radio for wireless web access, and a hand-held computer with a stylus and touchpad interface.
Thanks to a head-mounted display (HMD) combined with a device measuring the position and orientation of the user's head, 3D models were overlaid on the user's view. As the user moved through the physical world, the computer updated the display in real time.
With technology small enough to be carried, a whole new field of mobile augmented reality research, both indoor and outdoor, opened up, including projects like ARQuake at the Wearable Computer Lab at the University of South Australia (UniSA).
The research group behind the video game consisted of Bruce Thomas, an associate professor; Ben Close, John Donoghue, John Squires, Phillip De Bondi, and Michael Morris, all honors students; and Wayne Piekarski, a Ph.D. student, all of them in the School of Computer and Information Science.
Bruce's team started by mapping their university campus and building a "Quake environment" out of it. They added their own textures to create a grid pattern, which was easier to see when the image was composited with the real world.
Then, they added all of the objects and monsters to keep the spirit and playability of the original game. They displayed this image inside the HMD, which was semi-transparent so the user wearing it could see both the digital creation and the real world at the same time.
With this combined image, the software developed by the team had to track every move the user made in the real world and move his digital counterpart accordingly, keeping the two perfectly synchronized.
Finally, the last step was to remove the texturing of the buildings, ground, and sky, making those surfaces transparent and offering the user a better experience while playing.
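The synchronization step described above boils down to converting GPS fixes into game-world coordinates relative to a calibrated origin. Here is a minimal sketch of that conversion in Python (our own illustration, not the original system's code; the unit scale and function names are assumptions):

```python
import math

# Assumed scale for illustration: roughly 32 Quake map units per metre
# (the real ARQuake mapping may have differed).
UNITS_PER_METRE = 32.0
EARTH_RADIUS_M = 6_371_000.0

def gps_to_game(lat, lon, origin_lat, origin_lon):
    """Convert a GPS fix to 2D game coordinates relative to a
    calibrated origin, using a local equirectangular approximation
    (accurate enough over a campus-sized play area)."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    north_m = d_lat * EARTH_RADIUS_M
    east_m = d_lon * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    return east_m * UNITS_PER_METRE, north_m * UNITS_PER_METRE

# Each new GPS fix moves the player's avatar in the virtual world,
# keeping the two worlds aligned.
x, y = gps_to_game(-34.9519, 138.5047, -34.9520, 138.5045)
```

With an update like this running on every GPS fix, the player's real walk directly drives the avatar's movement through the virtual campus.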
First, the user had to put the wearable computer on his back, place the head-mounted display on his head, and hold a simple two-button toy gun input device (preferred to a more realistic plastic gun, to avoid drawing the police's attention on the streets 👮).
This entire 16 kg rig, together with all the position and orientation information collected from the user's movements, replaced the keyboard and mouse controls of the desktop game.
After a quick calibration exercise to align the HMD with his eyes, the user was ready to start playing. He had to walk to move through the level (his own movement determining the rate and direction of game movement) and look around to change the view, both the game and the physical world remaining visible through the HMD.
Even though that sounds pretty basic, letting the digital world know in real time where the user was turned out to be really complex.
Bruce Thomas and his team had to use a combination of digital compasses, inclinometers, GPS tracking, and pattern recognition just to work out exactly where the user was in the real world before feeding that information into the game.
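In practice, that sensor combination splits the problem in two: GPS answers "where is the player?", while the compass and inclinometer answer "which way is he looking?". A minimal sketch of turning a compass heading and an inclinometer pitch into a view direction (again our own illustration, not the original ARQuake code):

```python
import math

def view_vector(compass_deg, pitch_deg):
    """Turn a magnetic-compass heading (0 deg = north, clockwise) and
    an inclinometer pitch (positive = looking up) into a unit forward
    vector in a north/east/up coordinate frame."""
    yaw = math.radians(compass_deg)
    pitch = math.radians(pitch_deg)
    north = math.cos(pitch) * math.cos(yaw)
    east = math.cos(pitch) * math.sin(yaw)
    up = math.sin(pitch)
    return north, east, up

# Looking due east, level with the horizon:
n, e, u = view_vector(90.0, 0.0)
```

Fed into the renderer every frame, a vector like this orients the virtual camera so the digital monsters stay pinned to their real-world positions as the player turns his head.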
They faced many challenges with the alignment of the two worlds, the accuracy of outdoor tracking, and even with lighting.
The bottom portion of the screen was a status bar containing information about armor, health, ammunition, and weapon type while the rest was reserved for the augmented reality images of monsters and game objects. In order to make them easier to see and distinguish from the physical world, the monsters' skin color and texture were changed from the original game.
Except for the removal of actions that could not easily be reflected in the physical world, like opening locked doors, the user could do everything as if he were playing the desktop version of Quake: walk through virtual doors to open them, walk over objects to pick them up, or move through predetermined locations to trigger traps.
The tracking of the user's position and head orientation handled the majority of the interaction. The only other actions to perform were shooting and changing the current weapon, thanks to the toy gun used as a physical input device.
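Mapping a two-button device onto the game is straightforward, since Quake already exposes console commands for firing (`+attack`/`-attack`) and cycling weapons (`impulse 10`). A toy sketch of such an input handler (the handler itself is our own invention; only the command strings come from Quake's real console vocabulary):

```python
# Hypothetical handler mapping the toy gun's two buttons onto
# Quake console commands: trigger -> fire, second button -> next weapon.
FIRE_BUTTON, WEAPON_BUTTON = 0, 1

def gun_event(button, pressed):
    """Translate a button event into the Quake console command
    the engine would receive from a keyboard binding."""
    if button == FIRE_BUTTON:
        # +attack on press, -attack on release, like holding a key
        return "+attack" if pressed else "-attack"
    if button == WEAPON_BUTTON and pressed:
        return "impulse 10"   # cycle to the next weapon
    return None

commands = [gun_event(FIRE_BUTTON, True), gun_event(FIRE_BUTTON, False)]
```

Because Quake treats all input as bindable commands, the toy gun could slot in exactly where a mouse button would normally be bound.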
At the time, it was a great technological achievement, despite the size of the equipment, the fairly pixelated graphics, and the relatively basic gameplay.
We don't know about you, but we are pretty curious to imagine what the team would have been able to build with today's technology...
Oh, and we almost forgot to tell you: The Eye of Judgment was the first indoor/home augmented reality game, published 7 years after ARQuake, in 2007. We'll probably talk more about it in a future blog post. Stay tuned 😉