Individual reflections – Karlo Plepelic 300545

To open my personal reflections, I will describe my biggest contributions to each of the projects in this course:

·         Marker-based AR – For this project we decided to work together through the whole process, so we met in the XR workshop, worked side by side, and researched the set task. Our first idea for the marker-based AR experience was a space-shooter type game: the level design would be tied to one of the markers, and the player would control a "Spaceship" fighting against "bots". At the start of development I was a little confused about the options available in Unity and wasn't sure where to begin, but after a few hours of research I had a good sense of how to orient myself in the editor. As the project went on, my biggest contributions were the shooting and the animations of the player "Spaceship", along with figuring out how to create collisions between the objects in the Unity scene. The main problem my group and I encountered was that the collisions were unreliable and we couldn't work out why. First, I made sure that all objects were instantiated at the same height above the marker; second, I made sure the Spaceship projectiles had their colliders enabled, and by tuning those parameters I got collisions to work reasonably well. Even so, the "working" collisions were not perfect, and I wondered how much more time I would need to invest to make them work every time. I also suspected the problem might come from Vuforia's marker recognition, because the markers were sometimes not recognized; the lighting in the room where we were testing was poor, which could explain the false recognitions. In the end we decided to abandon the game, because we realized it would take far longer than expected to implement, especially in our first experience with Unity. We scrapped the idea and instead presented different small 3D car models on top of the markers, which was relatively easy, since that takes very little setup with Vuforia. To conclude the first project: it was very interesting to try to build a full experience, and with the presentation our teacher gave us and the steps it outlined, it was relatively easy to get started with marker-based AR.
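Looking back, the collision fix came down to two things: spawning every object at the same height relative to the marker, and making sure the projectiles carried an active collider and a Rigidbody so the physics engine would register the hits. Below is a minimal sketch of how such a script could look in Unity C#; the field and prefab names are placeholders for illustration, not the exact code from our project.

```csharp
using UnityEngine;

// Illustrative sketch: spawn projectiles at a consistent height above the marker
// and give them a velocity. Assumes the projectile prefab has a Collider and a
// Rigidbody so collision events actually fire.
public class SpaceshipShooter : MonoBehaviour
{
    public Transform markerRoot;        // the Vuforia image target the scene is anchored to
    public GameObject projectilePrefab; // placeholder prefab with Collider + Rigidbody
    public float spawnHeight = 0.05f;   // same height above the marker for every spawned object
    public float projectileSpeed = 2f;

    public void Fire()
    {
        // Convert the spaceship position into marker-local space and force a fixed height,
        // so every projectile shares the same plane as the other objects on the marker.
        Vector3 localPos = markerRoot.InverseTransformPoint(transform.position);
        localPos.y = spawnHeight;
        Vector3 spawnPosition = markerRoot.TransformPoint(localPos);

        GameObject projectile = Instantiate(projectilePrefab, spawnPosition, transform.rotation, markerRoot);

        // The Rigidbody is what makes the collider generate collision events reliably.
        Rigidbody body = projectile.GetComponent<Rigidbody>();
        if (body != null)
        {
            body.useGravity = false;                       // keep the projectile on the marker plane
            body.velocity = transform.forward * projectileSpeed;
        }
    }
}
```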

 

·         Markerless AR – This is where the course got even more interesting, because we learned about the markerless AR technologies that actually enable immersive AR experiences. First, we discussed a few different platforms that support AR development in Unity, and I did some research at home to compare the major differences between them. From there we moved on to more advanced AR platforms such as AR glasses and discussed their uses, and finally to how reading images from the camera together with depth sensing creates AR environments. The project for markerless AR was to recreate the solar system in Unity and instantiate it on a ground plane using ARCore. With the solar system placed, the user could walk amongst the planets and tap a planet to get a UI popup with information about it. My contribution was the cursor on the ground plane, implemented with raycasting in Unity. Raycasting the cursor onto the ground plane was about as simple as it gets, and there are plenty of tutorials online in case of problems. The essence of the task was to create an empty Unity project, import the necessary libraries and frameworks, and then write a script that takes the centre of the camera view, casts a ray from the middle of the screen, and enables a cursor on the ground plane detected by the chosen framework. I didn't encounter any major problems, and I finished my part by making a screen tap instantiate a new object at the cursor's location.
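A rough sketch of that screen-centre cursor is shown below. It assumes AR Foundation's ARRaycastManager is used as the Unity layer on top of ARCore, which may differ from the exact setup we had; the prefab and field names are placeholders.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of the ground-plane cursor: raycast from the centre of the screen onto
// a detected plane, move a cursor indicator there, and spawn an object on tap.
public class GroundCursor : MonoBehaviour
{
    public ARRaycastManager raycastManager;
    public GameObject cursorIndicator; // small marker shown on the detected plane
    public GameObject planetPrefab;    // object instantiated where the user taps

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        // Cast from the middle of the screen onto any detected plane.
        Vector2 screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);
        bool hitPlane = raycastManager.Raycast(screenCenter, hits, TrackableType.PlaneWithinPolygon);

        cursorIndicator.SetActive(hitPlane);
        if (!hitPlane)
            return;

        // Keep the cursor on the hit pose of the ground plane.
        Pose hitPose = hits[0].pose;
        cursorIndicator.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);

        // On a screen tap, instantiate a new object at the cursor's location.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Instantiate(planetPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```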

 

·         VR Project – The goal was to create a Christmas-decorated XR Lab in VR. We split up the tasks and got to work. My task was to spawn Christmas ornaments and decorations around the VR environment, so I started by creating a VR scene for the Oculus headset and importing the necessary libraries and frameworks. I added models of the VR controllers, having found the option for it in the components, so the experience would feel a little more immersive. From there I enabled the logic for teleporting around the VR world by pressing specific buttons on the controller. The teleportation gave me problems because it was bound to a button I wanted to use for another feature, and I couldn't figure out a way to remap it. At the time I was using the library our teacher had supplied for mapping button presses; I found the specific button in its mappings, but I couldn't change it without remapping the selection button, which would then become unusable. A workaround I found was to write my own teleportation script, but that would have been too much hassle and would have cost too much time. So I moved on to instantiating objects on a button press by raycasting to the floor of the VR world and spawning an object at that point. I used a package from the Unity Asset Store to bring the Christmas ornament assets into the project, and then started developing the logic for grabbing an object and moving it. My development steps for that logic were the following (a rough sketch of the result follows the list):

o   Create a point at which the object will be grabbed, so that you are not, for example, holding the Christmas tree by its base but rather by its middle

o   Create a Raycast that would check if the controller was pointing at the movable object

o   If the ray hits the object, grab it on a specific button press and move it along with the movement of the VR controller
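A minimal sketch of that grab logic is shown below. The input check is a placeholder (the real project used the button-mapping library our teacher supplied), and the tag and field names are my own illustrations, so treat it as a sketch of the raycast-and-parent approach rather than the exact project code.

```csharp
using UnityEngine;

// Sketch of the controller grab logic: raycast from the controller, and while a
// (placeholder) grab input is held, parent the hit object to the controller so it
// follows the controller's movement; release detaches it again.
public class ControllerGrabber : MonoBehaviour
{
    public Transform controllerTransform;    // the tracked VR controller
    public float grabRange = 5f;
    public KeyCode grabKey = KeyCode.Space;  // placeholder input; the project used the supplied button mapping

    private Transform grabbedObject;

    void Update()
    {
        bool grabHeld = Input.GetKey(grabKey);

        if (grabHeld && grabbedObject == null)
        {
            // Check whether the controller is pointing at a movable object.
            Ray ray = new Ray(controllerTransform.position, controllerTransform.forward);
            if (Physics.Raycast(ray, out RaycastHit hit, grabRange) && hit.transform.CompareTag("Grabbable"))
            {
                grabbedObject = hit.transform;
                // Parent to the controller so the object moves with it; keeping the world
                // position preserves the grab point (e.g. the middle of the tree).
                grabbedObject.SetParent(controllerTransform, worldPositionStays: true);
            }
        }
        else if (!grabHeld && grabbedObject != null)
        {
            // Release: detach the object and leave it where it was placed.
            grabbedObject.SetParent(null, worldPositionStays: true);
            grabbedObject = null;
        }
    }
}
```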

 

·         AR Assignment – The project we decided to take on was a navigation AR application for the school's rooms. We took a 3D model of the ground floor of our school and built a navigation system for the rooms on that floor. My task was the first part of the project: drawing a line from the player's position to the designated location. I started by scanning my apartment with a 3D scanner app on my phone, from which I extracted exact measurements of the floor plan and used them in Unity. I then wrote a script that takes the player's location at app launch (at this stage a hardcoded location) and moves the player marker through the AR space as the player walks. Next, I set up a "Navigation Agent" using Unity's built-in navigation (NavMesh) system, which maps the specified floor and the player onto the space and determines the path the player needs to take to reach the desired location. Finally, I drew a line showing the player which path to take, which completed the part of the project I was responsible for. While developing this last feature I had problems with the hardcoded starting location being misaligned, which made the path slightly off; we planned to solve this later by scanning a marker that would re-anchor the player's exact location within the navigable space.
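Below is a small sketch of how that path line can be drawn with Unity's NavMesh and a LineRenderer. It assumes the floor model already has a baked NavMesh; the field names are placeholders and the player position is the same simplification (hardcoded start) described above.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch of the navigation line: compute a NavMesh path from the player marker
// to the selected room and render it as a polyline the player can follow.
[RequireComponent(typeof(LineRenderer))]
public class PathLine : MonoBehaviour
{
    public Transform player;       // player marker in the AR space
    public Transform destination;  // the selected room

    private LineRenderer line;
    private readonly NavMeshPath path = new NavMeshPath();

    void Start()
    {
        line = GetComponent<LineRenderer>();
    }

    void Update()
    {
        // Recompute the route each frame as the player marker moves.
        if (NavMesh.CalculatePath(player.position, destination.position, NavMesh.AllAreas, path))
        {
            line.positionCount = path.corners.Length;
            line.SetPositions(path.corners); // draw the path the player should take
        }
    }
}
```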

  
