Individual reflections - Ionut Grosu 297111
My contributions throughout the semester
As requested, I will start my personal reflections by stating what I think my most important contributions to the group work throughout the semester were:
Keeping track of the dev blogs
Recording demos
Booking a place in the XR Lab ahead of time
Marker-based AR assignment
The first assignment started with me being extremely excited for this course and eager to see what I could achieve throughout the assignments. With all of this excitement came the idea of creating a Chicken Invaders type of game for the first assignment. Everything worked great in the beginning as I started to mess around with Vuforia and figure out which objects around me would make good markers. I remember trying out a small fridge magnet that looked cool, but Vuforia’s algorithm found fewer than 5 feature points on it. This led me to using plain old QR codes, which did not look cool but at least got the job done. The idea was to use one marker to set up the game arena, a couple of walls between which enemies would spawn, while a second marker was used to move the player around, shooting bullets at a set interval.
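Looking back, the interval shooting attached to the player marker could be structured roughly like the sketch below. This is only an illustration, assuming a bullet prefab spawned in front of the player object; the class and field names are placeholders, not the assignment's actual code.

```csharp
using UnityEngine;

// Rough sketch: spawn a bullet at a fixed interval from the player object
// that is parented under the second (player) marker.
public class PlayerShooter : MonoBehaviour
{
    [SerializeField] private GameObject bulletPrefab;   // placeholder prefab reference
    [SerializeField] private float fireInterval = 0.5f; // seconds between shots

    private float nextShotTime;

    private void Update()
    {
        if (Time.time >= nextShotTime)
        {
            nextShotTime = Time.time + fireInterval;
            // Spawn the bullet with the player's position and facing,
            // so it travels along the plane defined by the player marker.
            Instantiate(bulletPrefab, transform.position, transform.rotation);
        }
    }
}
```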
After implementing these things, having enemies spawn and the player shooting bullets, I had a hard time making the bullets register any collisions with the enemies. After a couple of days of experimenting and debugging, I decided to put this implementation aside, hoping to reuse it for the next assignment, and to implement something simpler for this first one. To this day my best guess for what went wrong is that, because the enemies and the bullets lived on different planes (different markers), they ended up at different heights on the Y axis and their colliders never overlapped, so no collisions were ever registered.
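For reference, the bullet side of what I was trying to do could look like the minimal sketch below. It assumes the bullet has a trigger collider and a kinematic Rigidbody, and that enemies carry an "Enemy" tag; those assumptions, the class name and the serialized values are mine for illustration, not the exact code from the assignment.

```csharp
using UnityEngine;

// Minimal bullet behaviour: move forward in local space and destroy
// whatever tagged enemy it touches.
public class Bullet : MonoBehaviour
{
    [SerializeField] private float speed = 2f;    // metres per second
    [SerializeField] private float lifetime = 5f; // seconds before despawn

    private void Start()
    {
        // Despawn after a while so stray bullets do not pile up.
        Destroy(gameObject, lifetime);
    }

    private void Update()
    {
        // Local-space movement: the parent marker defines the plane the
        // bullet travels on.
        transform.Translate(Vector3.forward * speed * Time.deltaTime, Space.Self);
    }

    private void OnTriggerEnter(Collider other)
    {
        // This only fires if the colliders actually overlap in world space.
        // With bullets and enemies parented under different markers, a height
        // offset between the two planes is enough to keep them apart.
        if (other.CompareTag("Enemy"))
        {
            Destroy(other.gameObject);
            Destroy(gameObject);
        }
    }
}
```

With both prefabs parented under the same marker, or with the height difference between the two markers compensated for, a check like this would have had a chance to fire.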
Markerless AR assignment
When it comes to the Markerless AR assignment, my contribution was implementing the model of the solar system, making sure the sizes and rotation speeds of the planets were somewhat close to their real-world counterparts, and making the model spawn on a ground plane detected through ARCore.
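The planet behaviour itself boiled down to something like the sketch below: each planet spins around its own axis and orbits a shared sun transform. The speed values here are placeholders, not the scaled real-world numbers we used.

```csharp
using UnityEngine;

// Each planet orbits a shared sun transform and spins around its own axis.
public class Planet : MonoBehaviour
{
    [SerializeField] private Transform sun;          // centre of the orbit
    [SerializeField] private float orbitSpeed = 10f; // degrees per second
    [SerializeField] private float spinSpeed = 50f;  // degrees per second

    private void Update()
    {
        // Revolve around the sun's position on the plane's up axis.
        transform.RotateAround(sun.position, Vector3.up, orbitSpeed * Time.deltaTime);
        // Spin around the planet's own local Y axis.
        transform.Rotate(Vector3.up, spinSpeed * Time.deltaTime, Space.Self);
    }
}
```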
I feel that ARCore provided an easy-to-grasp way of developing markerless AR solutions, and I do not remember any issues caused by the SDK itself during the work on this assignment. Of course, I did run into some challenges, but those came from my lack of experience with Unity, since I had not taken the GMD course before XRD, and had nothing to do with the AR SDK or the way it works.
This assignment helped me get a better understanding of how markerless AR manages tracking and why some devices do a better job at it. Tracking is done through Simultaneous Localization and Mapping (SLAM), combining your device’s Inertial Measurement Unit (IMU) with the video feed from whatever camera your device has. It makes sense that some devices track better than others, as the processing power available for handling the SLAM, IMU and camera information varies from device to device, and the accuracy of the sensors can vary as well.
VR assignment
As we decided on the idea and split the tasks, my first contribution for this assignment was to 3D scan the XR Lab and produce a 3D model that we could use for the assignment. This meant that I got to practice my Blender skills after a period of not using it. The other tasks I had during this assignment were setting up a VR project in Unity, adding the previously mentioned 3D model of the XR Lab to that project, implementing a wrist menu that could spawn Christmas decorations inside that virtual copy of the XR Lab, and implementing the functionality for drawing on a whiteboard in VR.
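As an illustration of the whiteboard idea, here is a stripped-down sketch. It assumes the board has a MeshCollider (needed for texture coordinates on raycast hits) and that a "marker tip" transform sits at the end of the controller-held pen; the field names and sizes are placeholders, not the actual assignment code.

```csharp
using UnityEngine;

// Paint into the board's texture wherever the marker tip touches it.
public class Whiteboard : MonoBehaviour
{
    [SerializeField] private Transform markerTip;            // tip of the pen
    [SerializeField] private Color brushColor = Color.black;
    [SerializeField] private int brushSize = 4;              // half-width in pixels

    private Texture2D canvasTexture;

    private void Start()
    {
        // Give the board its own writable texture.
        canvasTexture = new Texture2D(1024, 1024);
        GetComponent<Renderer>().material.mainTexture = canvasTexture;
    }

    private void Update()
    {
        // Short ray from the pen tip towards the board surface.
        if (Physics.Raycast(markerTip.position, markerTip.forward, out RaycastHit hit, 0.05f)
            && hit.collider.gameObject == gameObject)
        {
            // textureCoord is only valid when hitting a MeshCollider.
            var x = Mathf.Clamp((int)(hit.textureCoord.x * canvasTexture.width),
                                brushSize, canvasTexture.width - brushSize - 1);
            var y = Mathf.Clamp((int)(hit.textureCoord.y * canvasTexture.height),
                                brushSize, canvasTexture.height - brushSize - 1);

            // Paint a small square of pixels around the hit point.
            for (int i = -brushSize; i <= brushSize; i++)
                for (int j = -brushSize; j <= brushSize; j++)
                    canvasTexture.SetPixel(x + i, y + j, brushColor);

            canvasTexture.Apply();
        }
    }
}
```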
This assignment was for sure more challenging than the previous two, but it was also more rewarding once we had everything working. Most of the challenges I encountered were related to the actual hardware: not being able to properly connect the headset to my laptop and have the PLAY button in Unity start the game on the headset. This got very annoying very fast, as I had to build the game every time I wanted to see if my code changes were working. This problem led me to create a separate 3D project in Unity that I used for implementing the whiteboard functionality, simply because it was easier to test that way. When everything was done and tested, I copied the implementation into the actual assignment project.
Looking back on this, I think that the workflow for developing applications for the Quest is more complex than regular game development for the PC and this complexity presented me with the challenges discussed above.
One important thing that we had to keep in mind while developing for the Quest headset was application performance, so that we could hit a high and stable frame rate. This was not a problem in the beginning, but when we added the RGB lights around the ceiling and tried making them look better with post-processing, performance took a real hit: we dropped below 10 fps and visual bugs started to appear.
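A simple frame-rate readout would have helped us spot that drop earlier. A minimal sketch of one, using nothing but Unity's frame timing, could look like this (the smoothing factor and logging interval are arbitrary choices):

```csharp
using UnityEngine;

// Smooths the frame time and logs the resulting FPS once per second.
public class FpsLogger : MonoBehaviour
{
    private float smoothedDelta;
    private float timer;

    private void Update()
    {
        // Exponentially smooth the unscaled frame time to avoid jittery numbers.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);

        timer += Time.unscaledDeltaTime;
        if (timer >= 1f)
        {
            timer = 0f;
            Debug.Log($"FPS: {1f / smoothedDelta:F1}");
        }
    }
}
```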
XR assignment
Like the previous assignment, this one started with me creating a 3D model in Blender, this time of the first floor of VIA’s Horsens campus. The model was then used for calculating the navigation path needed to reach certain points of interest on campus, such as the café and the Makerspace. Sadly, the implementation was not working properly at the end of the assignment. I know the implementation itself was alright, as Karlo tried it out at a smaller scale, in his apartment, so my best guess is that I messed up some calculation and the navigation path broke when we used it at the bigger campus scale.
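Since the path calculation was never shown working at full scale, the sketch below only illustrates the kind of setup I had in mind, assuming the floor of the campus model was baked into a Unity NavMesh and the points of interest were placed as transforms on it; the class and field names are illustrative.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Recompute a path over the baked NavMesh and draw it with a LineRenderer.
public class CampusNavigation : MonoBehaviour
{
    [SerializeField] private Transform user;        // current user position
    [SerializeField] private Transform destination; // e.g. the café or the Makerspace
    [SerializeField] private LineRenderer pathLine; // visualises the route

    private void Update()
    {
        var path = new NavMeshPath();

        // Calculate a walkable path between the two points on the baked NavMesh.
        if (NavMesh.CalculatePath(user.position, destination.position,
                                  NavMesh.AllAreas, path))
        {
            // Feed the path corners to the line renderer so the user can follow it.
            pathLine.positionCount = path.corners.Length;
            pathLine.SetPositions(path.corners);
        }
    }
}
```

This depends on the NavMesh being baked from the Blender model's floor geometry, so any mismatch between the model's units or origin and the scale used at runtime could be the kind of calculation mistake that broke the path on campus.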
Overall challenges
I would say that the biggest challenge I faced throughout this course had nothing to do with XR development: it was using GitHub repositories for these kinds of projects, which contain big files such as 3D models and the Vuforia Engine SDK. I figured out quite early that I needed to use Git LFS (Large File Storage), but I never managed to push a commit on the first try while using it.
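For context, the basic LFS setup I was fighting with boils down to a few commands; the tracked file patterns below are just examples and depend on what actually lives in the repository:

```shell
# Enable LFS for the repository and tell it which file types to track.
git lfs install
git lfs track "*.fbx"
git lfs track "*.blend"

# The tracking rules live in .gitattributes and must be committed too.
git add .gitattributes
git commit -m "Track large binary assets with Git LFS"
git push
```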
Having experienced all of these assignments and the technologies used for both AR and VR, I think that getting into XR development is not very hard as long as you have some knowledge of game development and Unity. Of course, having an understanding of how marker-based or markerless tracking works, or of how the VR headset you are developing for tracks motion and what its performance limitations are, will streamline development.