This Camera Distance Tracking AR project is an augmented reality experience created in Unity, exported as an Xcode project, and deployed to an iOS device. A user can instantiate metallic spheres in their surroundings and walk away from them while the build records the distance from the camera to each placed sphere. This project was a great stepping stone for building AR for iOS.
The users and audience for this project are people who enjoy augmented reality. Since this project is very experimental, it could also suit beginners who are just learning augmented reality, or extended reality in general. This experience is exciting because it can be used by many people; there is no specific age limit, so anyone can have fun with this augmented reality prototype.
The idea I first had was not scalable. I initially wanted to create an interior design AR experience where users would choose furniture pieces from a list, instantiate the 3D furniture models onto the horizontal plane of a room, and track their distance from the camera. Since I was not able to do that given my limited AR knowledge and limited time, I simplified the project.
I initially created wireframes for my first project idea before I even started thinking about how I would build it in Unity. The first wireframes included all of the screens and the flow I wanted the experience to follow. There would be a button in the left corner of the screen where users could choose 3D furniture items from a pre-made list. The user would then tap the screen to place that 3D model in their surroundings, and a script would record the 3D furniture's position and distance from the user's camera.
Once I made those wireframes, I realized that would be too much for me to create in Unity because I do not have that much experience scripting for AR. I decided to focus on recording the camera distance and using a simple sphere instead of multiple complex 3D furniture pieces. I also eliminated the button UI and the list UI because I thought they would be out of the scope of my abilities. So I created a wireframe flow for the simplified project idea.
I started with Unity's AR scripts. The AR Session Origin, AR Reference Point Manager, and AR Reference Point Manager with Camera scripts were all that this experience needed. The AR Session Origin sets the scene up as an AR scene, the AR Reference Point Manager tracks the objects' reference points in the AR scene, and the AR Reference Point Manager with Camera tracks the reference points and their distance from the camera.
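To make the distance tracking concrete, here is a minimal sketch of what that last script boils down to. The class name `CameraDistanceTracker` and the `Register` method are my own illustrative names, not Unity's; it assumes the AR camera is assigned in the Inspector and that the tap-to-place script registers each sphere it spawns.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch: logs the distance from the AR camera to every
// placed sphere once per frame. Attach to the AR Session Origin and
// assign the AR camera in the Inspector.
public class CameraDistanceTracker : MonoBehaviour
{
    [SerializeField] Camera arCamera;   // the AR camera under the session origin
    readonly List<Transform> placedSpheres = new List<Transform>();

    // Called by the tap-to-place script whenever a new sphere is spawned.
    public void Register(Transform sphere) => placedSpheres.Add(sphere);

    void Update()
    {
        foreach (var sphere in placedSpheres)
        {
            // Straight-line distance between the device camera and the sphere.
            float distance = Vector3.Distance(arCamera.transform.position, sphere.position);
            Debug.Log($"{sphere.name}: {distance:F2} m from camera");
        }
    }
}
```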
Once the AR scripts were in place, I created the metallic ball that would be generated by a user's tap. I first wanted it to look like metallic orbs that float around, similar to Summit One Vanderbilt in New York City.
I wanted the lights of the user's room to reflect off the spheres for an even more exciting experience, similar to the image above, but that was beyond my current knowledge, so I decided to create a purple sphere and add a texture from an image I found online.
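For what it's worth, AR Foundation does expose the room's estimated lighting, so a future version of this project could try something like the sketch below. The `frameReceived` event and the light estimation data are AR Foundation's; the class and field names are my own, and this only matches brightness, not true reflections.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch of the room-lighting idea I scoped out:
// drive a scene light from AR Foundation's light estimation so the
// spheres roughly match the user's real lighting.
public class RoomLightMatcher : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // the AR camera's camera manager
    [SerializeField] Light sceneLight;              // directional light over the spheres

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Match the light's intensity to the estimated room brightness.
        if (args.lightEstimation.averageBrightness.HasValue)
            sceneLight.intensity = args.lightEstimation.averageBrightness.Value;
    }
}
```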
After creating the sphere, it was time to link it to the AR Session Origin so that when a user taps the screen, the linked sphere appears in that spot. After linking the sphere, it was time to test on my iOS device, since I couldn't test it in Play mode on my laptop.
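Here is a hedged sketch of that tap-to-place step using AR Foundation's ARRaycastManager. The class name `TapToPlaceSphere` and the `spherePrefab` field are my own, and it assumes plane detection is running on the AR Session Origin.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: raycast from a screen tap into the AR scene
// and instantiate the sphere prefab where the ray hits a plane.
[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlaceSphere : MonoBehaviour
{
    [SerializeField] GameObject spherePrefab;   // the purple metallic sphere
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Place the sphere on whatever detected plane the tap hits first.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(spherePrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```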
To test the Unity prototype, I had to export it to my iOS device. To do this, I switched my Build Settings platform to iOS and set it to run on the latest version of Xcode.
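As a side note, the same export step can also be scripted from an editor menu instead of clicked through Build Settings. This is a sketch using Unity's BuildPipeline API; the scene path, output folder, and menu label are placeholder choices of mine, not part of the project.

```csharp
using UnityEditor;
using UnityEngine;

// Optional editor sketch: export the Xcode project for iOS in one click.
public static class IOSBuilder
{
    [MenuItem("Build/Build for iOS")]
    public static void Build()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/ARScene.unity" }, // hypothetical scene path
            locationPathName = "Builds/iOS",                  // Xcode project output folder
            target = BuildTarget.iOS,
            options = BuildOptions.None
        };
        BuildPipeline.BuildPlayer(options);
        Debug.Log("iOS Xcode project exported to Builds/iOS");
    }
}
```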
Assembling my Unity file took some time; I saw this screen for about 10 minutes. I'm sure it was just my laptop, because I was running such big programs at the same time.
Once everything was assembled, Xcode was prompted to open with the Unity project. Since I had worked in Xcode before, I already had a developer account and was already signed in. All I had to do was go back and set up the signing capabilities so that the prototype could run on my iOS device.
I started the build, and in a few seconds it popped up on my iOS device. I had to give the build permission to run: I went into Settings and tapped the "Trust" button so that I could test my AR experience.
Through Unity I produced an iOS build with AR functionality that allows limited interaction with a tap of the screen. Since I am fairly new to AR, it was a great project to start off with, and I am glad I was able to export it to my iOS device because that adds to the experience. Being able to move around freely with a device and place spheres in your surroundings makes the experience more interactive. If I had more time on this project, I would want to track the lights of the user's surroundings and have the spheres reflect them. I would also like to go back and build a series of projects that would eventually lead up to the initial idea I described earlier.