Posts


Week 47 - XR wrap-up & Expo

Contributor: Ionut This was the last week of our XR course. Before the Expo, we tried to get our assignment working within the VIA campus. We knew the implementation should be fine, because Karlo had done the same thing for his apartment and the navigation worked well there. I think the reason it did not work on campus was the much larger scale: every small inaccuracy in the 3D model we created translated into a much bigger error here on campus. With the Expo about to start, we decided to drop the goal of having the navigation work within VIA and instead showcase the demo of it working in Karlo’s apartment. Luckily Karlo was at home and could record a demo for us here at the Expo. To wrap up, this was an amazing course, even though we had our challenges along the way. We are looking forward to the Game Development course next semester, so we can come back to the XR world and do even more amazing things. XR assig...

Week 46 - Second-to-last week of XR Project Lab Hours

Contributor: Karlo This week we connected what Ionut had been working on with what I did last week, where each of us completed the line navigation to a specific place with a camera view. To continue development, we decided to create a UI where the user can select the specific place they want to be directed to. Furthermore, we added scanning of QR codes, which resets the player position to where the person is actually standing in the real space, so the app can navigate the user to their chosen destination. At this point the AR application is done and needs to be properly tested. The full application lets the user open the app and select the place they want to be navigated to; users can then enable the navigation line to get live guidance to the previously selected place. Users can also see a small map of their location and the...
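As a minimal sketch of how the QR-based reset could look in Unity (the class, field and method names below are illustrative, not our exact code), the scanned QR payload is assumed to be an ID that maps to a known anchor transform in the campus model:

using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch: snaps the navigation origin to the anchor matching the
// scanned QR payload, so the navigation line starts where the user is standing.
public class QrRepositioner : MonoBehaviour
{
    [SerializeField] private Transform navigationOrigin;        // object representing the user inside the model
    [SerializeField] private List<string> anchorIds;            // payloads printed on the QR codes
    [SerializeField] private List<Transform> anchorTransforms;  // matching positions in the 3D campus model

    // Called by whatever QR scanning plugin is used, with the decoded text.
    public void OnQrScanned(string payload)
    {
        int index = anchorIds.IndexOf(payload);
        if (index < 0) return;  // unknown code, ignore

        // Reset the virtual position and heading to the real-world spot of the code.
        navigationOrigin.SetPositionAndRotation(
            anchorTransforms[index].position,
            anchorTransforms[index].rotation);
    }
}

After each reset, the navigation line to the destination selected in the UI would simply be recalculated from navigationOrigin.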

Week 45 - XR assignment part II

Contributor: Ionut Since we are not doing a VR project and don’t need the headsets at the XR Lab, we met online again this week. During this meeting we split up the work that needed to be done: I looked into creating a 3D model of the campus while the other guys looked into the actual Unity implementation. Earlier in the week I had received some building plans from Kasper. There was a lot of information, but I managed to extract some usable plans for the first floor of the campus. Those were then used as a reference in Blender to model a quick sketch of the walls. Once done, I imported this model into Unity and followed a guide to put together a quick POC to test on campus. Contributor: Karlo As Ionut mentioned, we all found a guide and decided to follow it, working through it separately. I decided to scan my apartment and map it so that I have a real-life overlay of my apa...
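The core of such a POC is anchoring the imported model to the real floor. A rough sketch of that step, assuming AR Foundation plane detection is used, with illustrative names (the guide we followed may do this differently):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch: tap the screen to drop the imported floor-plan model
// onto a detected horizontal plane, so the virtual walls overlay the real ones.
public class FloorModelPlacer : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject floorModelPrefab;   // the model exported from Blender

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    private GameObject placedModel;

    private void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            if (placedModel == null)
                placedModel = Instantiate(floorModelPrefab, pose.position, pose.rotation);
            else
                placedModel.transform.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}

Once the model sits on a detected plane, it can be nudged and rotated until the virtual walls line up with the real ones.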

Week 44 - XR assignment part I

Contributor: Ionut This was the first week of working on the 4th assignment. This one was pretty much a free-for-all, meaning we could use any technology we had learned about throughout this course. On Thursday morning our team met online to decide whether we wanted to do an AR or a VR project and to discuss ideas. During that meeting we decided to create an AR app for indoor navigation on the new (and way more convoluted) VIA campus. Once we knew what we wanted to do, we began looking into how to implement it. We discovered two SDKs that could be useful and make the implementation much easier, ArWay and Stardust. In the end it felt like using these SDKs would be a bit like cheating, so we decided to do things the harder way and not use them. That afternoon I met Kasper on campus and talked about our idea. He gave us the green light and agreed to help us get our hands on some campus floor plans we could use.

Week 41 - VR project

Contributor: Vlad For the VR project, we agreed on the idea of turning our XR lab into a virtual-reality immersion. For that, Ionut scanned the entire room with his iPhone, which has a LiDAR sensor and a depth-scanning camera. The result was pretty close to reality; it only needed a few adjustments to get even closer. Meanwhile, Karlo and I started to work on the room decoration. I implemented the RGB light strip with a switch button for room illumination. Karlo and Ionut worked on spawning Christmas decorations in the room, where you choose what you want to spawn from a remote attached to your hand. A whiteboard where you can draw with a marker was also implemented. Everything went fine until we tested the app with the VR headsets, where we ran into performance problems: the motion was not smooth, and it felt like a very low frame rate. We figured out the problem: it was the post-process profile, the bloom eff...
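For the decoration spawning, a simplified sketch of how the hand-mounted remote could be wired up (the prefab list, spawn point and method names are illustrative, not our exact implementation):

using UnityEngine;

// Illustrative sketch: the hand-mounted "remote" picks a decoration from a list
// and spawns it at a spawn point when the spawn action fires.
public class DecorationSpawner : MonoBehaviour
{
    [SerializeField] private GameObject[] decorationPrefabs;   // tree, lights, ornaments, ...
    [SerializeField] private Transform spawnPoint;             // e.g. a point in front of the headset
    private int selectedIndex;

    // Hook these up to the remote's buttons (UI Button events or controller bindings).
    public void SelectNext() =>
        selectedIndex = (selectedIndex + 1) % decorationPrefabs.Length;

    public void SpawnSelected() =>
        Instantiate(decorationPrefabs[selectedIndex], spawnPoint.position, spawnPoint.rotation);
}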

Week 39 - Markerless AR assignment wrap up

Contributor: Ionut Our group met at the XR lab this week to finish the Markerless AR assignment. As discussed the previous week, we had split up the work, so now we had to merge our implemented features and, once everything was working, record a demo for Thursday’s presentation. While implementing the Solar System, I spent time making sure that the ratio between the scale of the planets and their rotation/movement speed made sense and was as close to reality as possible. For the scale of the planets I used a guide you can see here, whilst their rotation speeds were taken from here. The main challenges we encountered while developing this assignment were GitHub LFS and the Unity render pipeline rendering the Solar System materials pink. The pink materials were fixed by trial and error. Assets used: https://assetstore.unity.com/packages/3d/environments/planets-of-the-solar-system-3d-90219 YouTube demo: https://youtube.com/s...
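Per planet, the motion boils down to two rotations each frame: spinning around its own axis and orbiting the sun. A minimal sketch of that idea, with the real periods scaled down to something watchable (the field names and time scale are illustrative, not our exact values):

using UnityEngine;

// Illustrative sketch: spins a planet around its own axis and orbits it around
// the sun, with speeds derived from scaled-down real rotation/orbit periods.
public class PlanetMotion : MonoBehaviour
{
    [SerializeField] private Transform sun;
    [SerializeField] private float rotationPeriodHours = 24f;   // real axial rotation period
    [SerializeField] private float orbitPeriodDays = 365f;      // real orbital period
    [SerializeField] private float timeScale = 100000f;         // how much faster than real time

    private void Update()
    {
        float simulatedHours = Time.deltaTime / 3600f * timeScale;

        // Self-rotation: 360 degrees per rotation period.
        transform.Rotate(Vector3.up, 360f * simulatedHours / rotationPeriodHours, Space.Self);

        // Orbit: 360 degrees around the sun per orbital period.
        transform.RotateAround(sun.position, Vector3.up,
            360f * simulatedHours / (orbitPeriodDays * 24f));
    }
}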

Week 38 - Markerless Lab hours

Contributor: Ionut During week 38 we started working on the Markerless AR assignment, ‘LeARning through AR’. Our idea was to recreate the solar system in Unity and instantiate it on a ground plane using ARCore. With the solar system instantiated, we would be able to walk among the planets and click on a planet to get a UI popup with information about it. We started by setting up the project and repository together. This took a while, as we had some setbacks with Git LFS and its account quota. After setting up the project we split the tasks as follows: Ionut will create the Solar System in Unity, Karlo will create an AR cursor/pointer using raycasts, and Vlad will take care of the UI pop-ups for each planet.
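For the click-a-planet popup, a straightforward approach is a physics raycast from the screen tap into the instantiated solar system. A rough sketch (the class names and the PlanetInfo data holder are illustrative, and in Unity each MonoBehaviour would live in its own file):

using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: raycast from a screen tap, and if a planet (with a
// collider and a PlanetInfo component) is hit, show its description in a panel.
public class PlanetSelector : MonoBehaviour
{
    [SerializeField] private Camera arCamera;
    [SerializeField] private GameObject infoPanel;
    [SerializeField] private Text infoText;

    private void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began) return;

        Ray ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out RaycastHit hit) &&
            hit.collider.TryGetComponent(out PlanetInfo planet))
        {
            infoText.text = planet.description;
            infoPanel.SetActive(true);
        }
    }
}

// Minimal data holder attached to each planet prefab.
public class PlanetInfo : MonoBehaviour
{
    [TextArea] public string description;
}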

Week 37 - Markerless AR class

Contributor: Karlo In this class we explored how markerless AR works. In short, we learned about some of the systems in our everyday phones that give markerless AR its full functionality. Furthermore, we explored what we would need to make markerless AR work with Unity: ARCore, Vuforia, ARKit or AR Foundation. Each of these alternatives offers pretty much the same functionality, such as motion tracking, environmental understanding and light estimation. One of the most important of these features is environmental understanding, which provides plane finding and feature points in the real world through the device’s camera. It is also important to note that feature points and plane detection are affected by a variety of real-world conditions, such as the lighting and the textures of the surfaces, as well as the number of possible feature points. Additionally, light estimation ca...
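To make light estimation concrete, here is a small sketch based on the common AR Foundation pattern (not something we built in class): the estimated brightness and colour correction from each camera frame are copied onto a directional light so that virtual objects match the real lighting.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch of the usual AR Foundation light-estimation pattern: apply the camera
// frame's estimates to a directional light in the scene.
[RequireComponent(typeof(Light))]
public class LightEstimationApplier : MonoBehaviour
{
    [SerializeField] private ARCameraManager cameraManager;
    private Light directionalLight;

    private void Awake() => directionalLight = GetComponent<Light>();
    private void OnEnable() => cameraManager.frameReceived += OnFrameReceived;
    private void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    private void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        if (args.lightEstimation.averageBrightness.HasValue)
            directionalLight.intensity = args.lightEstimation.averageBrightness.Value;

        if (args.lightEstimation.colorCorrection.HasValue)
            directionalLight.color = args.lightEstimation.colorCorrection.Value;
    }
}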

Week 36 - Marker-based AR Lab Hours

Contributor: Karlo Continuing last week’s work, our primary goal was to implement collision between the player’s projectiles and the enemies being hit by them. This was a very tedious process, as we struggled with how exactly to implement such physics. After a lot of trial and error, we tried to isolate the problems, so we wanted to run Unity in debug mode, which we also had problems with, until we discovered Logcat, which can print logs from an Android device into Unity. Realising that it would take us a lot more time to develop the project we had envisioned, we resorted to a much simpler idea: multiple image targets and car models that can be scaled up and down by pinching your fingers on the screen. As before, the assets used can be found in the GitHub Readme file. Another important note is that we worked together throughout the whole process of the last two weeks. This as...
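The pinch-to-scale behaviour is fairly standard touch handling; a minimal sketch of the approach (the sensitivity and clamping values are illustrative):

using UnityEngine;

// Illustrative sketch: scales the car model up or down based on how the
// distance between two touches changes from frame to frame.
public class PinchScaler : MonoBehaviour
{
    [SerializeField] private float sensitivity = 0.005f;
    [SerializeField] private float minScale = 0.1f;
    [SerializeField] private float maxScale = 5f;

    private void Update()
    {
        if (Input.touchCount != 2) return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);

        // Distance between the touches now and one frame ago.
        float currentDistance = (t0.position - t1.position).magnitude;
        float previousDistance = ((t0.position - t0.deltaPosition) -
                                  (t1.position - t1.deltaPosition)).magnitude;

        float scale = transform.localScale.x +
                      (currentDistance - previousDistance) * sensitivity;
        scale = Mathf.Clamp(scale, minScale, maxScale);
        transform.localScale = Vector3.one * scale;
    }
}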

Week 35 - Introduction to AR

Contributor: Karlo This week we explored how image targets work with the Vuforia engine. From that we realised that multiple image targets can be recognised at the same time, which gave us the idea of creating a space-shooter-type game. Our approach was to have two image targets, one acting as the player and the other acting as an arena with enemies. An important note is that no one in our group had taken a game development course prior to this one, so we were still trying to catch up with Unity development itself. Furthermore, we met up and tried to code and debug, and that was going great until we ran into some small problems that kept us from continuing. Those problems mostly came from our lack of knowledge of Unity. Another thing is that we used some code from the internet, and we also used Unity assets that we found for free on the Unity marketplace. More about...
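To give an idea of the player/arena split, here is a rough sketch of the kind of setup we were aiming for (not the code we actually pieced together from the internet): the player image target fires projectile prefabs, and enemies on the arena target remove themselves on contact. In Unity each MonoBehaviour would live in its own file.

using UnityEngine;

// Illustrative sketch: the player image target fires a projectile forward on tap.
// The projectile prefab needs a Rigidbody, a trigger collider and the tag "Projectile".
public class ProjectileShooter : MonoBehaviour
{
    [SerializeField] private Rigidbody projectilePrefab;
    [SerializeField] private Transform muzzle;          // child of the player image target
    [SerializeField] private float speed = 5f;

    private void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Rigidbody projectile = Instantiate(projectilePrefab, muzzle.position, muzzle.rotation);
            projectile.velocity = muzzle.forward * speed;
            Destroy(projectile.gameObject, 5f);          // clean up missed shots
        }
    }
}

// Attached to the enemies on the arena target; each enemy needs a collider.
public class Enemy : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Projectile"))
        {
            Destroy(other.gameObject);
            Destroy(gameObject);
        }
    }
}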