Experiential Design


8th April - 14th June 2019 (W2-W11)
Jaslyn Tjhai (0333039)
Experiential Design
RMIT Collaboration

Ideation

◔ Week 2

8th of April 2019
Fig. 1.a. Brainstorming
The first day of the collaboration was all about ice breaking and getting into groups, where we started brainstorming and conceptualizing ideas. We came up with a lot of vague ideas since the theme was Cities. Our decision was finalized when we thought of the concept of showing the good and bad sides of the city: the idea is to show a glimpse of the bad side of the city to make users realize that positivity is always the main focus while negativity is neglected.

Our initial execution for the first presentation was to use projection mapping, where the user would be in a room-like installation: the good visuals, always being the focus, would appear in front as well as on the sides, while the bad visuals would appear behind the user. The user would be given a mirror to hold, because that way they would only be able to catch a glimpse of the bad visuals.

However, based on the feedback, we had to ditch the mirror as it made the whole experience feel unprofessional. We ended up planning to play with timing itself, with the sounds acting as cues: as the user turns around to look at what's behind them, they'll be able to see the bad visual for around 2-3 seconds before it transforms into a good visual again.

Fig. 1.b. First presentation on rough ideas and concept

9th April 2019
On the second day, our team discussed deeper to further finalize our ideas, and upon discussion we came to consider the idea suggested by Dr. Li Ping. The execution would be in the form of a curved projection: as users move around, 80% of what's in front of them would show the good visuals, while a small section at the side would display the bad visuals. This way, users would only be able to see the bad visuals through their peripheral vision.

We also had industry people from Studio Behind 90 give us a short demonstration on 360° illustration for VR.
Fig. 2.a. Studio Behind 90 VR Demo

After consulting Mr. Razif, however, we were asked to venture into VR to make the whole experience more immersive, with sound also playing a crucial role. All of us agreed and started discussing further.
Fig. 2.b. More discussion

10th April 2019
The second presentation was held today, where we were to finalize our ideas and start executing afterwards. Our visualization was this: the user starts off either standing or sitting down, looking at 5 screens, where the 3 screens in front of them show the good visuals while the other 2 at the sides show the bad visuals.

Presentation Slide


Fig. 3.a. A mock up for our presentation

For the interaction itself, we were thinking of having the room deteriorate and crumble down at the end to get the user into the mood. However, upon presentation and feedback, we were asked to actually let the user interact with the things in the room: a way to give the user some sort of control instead of just watching, so that everything is not so linear.

11th April 2019 - A Trip to Ipoh
For day four, a field trip to Ipoh was arranged for us in the hope that the trip would also provide us with some inspiration for our project.


Fig. 4.a. Reaching the first peak

Fig. 4.b. Reaching the second peak
12th April 2019
Our group discussed what to film on Monday, as well as research to be done over the weekend, so that we would all be prepared once the hardware arrived on Monday. Mr. Razif lent us his Oculus Go so we could get an idea of what VR actually looks and feels like, since some of us had never tried it before. With that, I was in charge of figuring out how to connect it to our laptop through Unity. However, since Mr. Razif forgot to unlink it beforehand, I wasn't able to do much over the weekend.

Fig. 5.a. Umar testing out the Oculus Go

However, I did find a lot of tutorials on connecting the Oculus Go to Unity, which I will be using a lot as references.



First Step to Exploring

◔ Week 3

15th April 2019
Our team went out to film footage on Monday while I stayed back to get the Oculus Go working after Mr. Razif unlinked it. After a tedious process, I finally managed to do so just in time for the team to come back and test it out. That was also when the hardware arrived, so we had to test everything again on the HTC Vive.

Our team stayed back to try setting up the HTC Vive so that we would know how to do it the next day. Once we successfully set it up and were able to connect Unity with the Vive, we called it a day and went back to rest.

Fig. 6.a. Excited to try the VR

Fig. 6.b. Setting up and testing the hardware


The Beginning of Experiments

16th April 2019
The day started with us trying to set up the hardware once again, as there were multiple bugs. I was trying to figure out a few scripts for interaction but was confused about which to prioritize. So throughout the day, we were only able to project it in Unity while reviewing the model here and there.

By 2.30 PM, we had a small presentation in our own working area to explain our progress so far: what we've done, the challenges we faced, as well as our next action steps. Before the presentation, I was able to try out the raycast script on the Oculus Go using SteamVR, to ensure that the script wouldn't be that much different once I moved it to the HTC Vive. We ended up being able to have the videos appear only after the user gazes at the windows for a few seconds.
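The gaze trigger worked roughly like this. Below is a simplified sketch rather than our exact script (names like gazeSeconds are my own for illustration): a ray is cast out of the headset every frame, and a timer accumulates while it stays on the window.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hedged sketch of the gaze-to-play behaviour, attached to a window object.
public class GazeVideoTrigger : MonoBehaviour
{
    public Transform head;          // the VR camera (headset) transform
    public VideoPlayer video;       // the video revealed on this window
    public float gazeSeconds = 2f;  // how long the user must keep looking

    float gazedFor;

    void Update()
    {
        // Cast a ray straight out of the headset and see if it hits this window.
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit)
            && hit.collider.gameObject == gameObject)
        {
            gazedFor += Time.deltaTime;
            if (gazedFor >= gazeSeconds && !video.isPlaying)
                video.Play();       // reveal the video after sustained gaze
        }
        else
        {
            gazedFor = 0f;          // reset the timer when the user looks away
        }
    }
}
```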

After the presentation and feedback from the lecturers, we continued working on our project: some were editing videos, sounds, and models, while Jesslyn and I watched tutorials and tried them out on the hardware, going through a lot of trial and error. By night time, I finally found a decent tutorial on picking up and dropping objects. Once I was able to apply it to the HTC Vive, all of us were so excited that we just played around with the interactable cubes.

Fig. 7.a. Playing around with the newly implemented pick-up-script


The Start of Bugs and Errors
...and a slight change

17th April 2019
Wednesday was a day of even more coding, fixing errors, and trying again. The focus today was to trigger different videos to play. It took all of us a while to finally sit down and really focus on the script. Umar understood the coding language much better than I did, so he was assisting me, together with the help of Mr. Razif. Once we finally got the scripts to work, we took a short break.

Fig. 8.a. Workplace for the week

Once we got that working, we started on another trigger event for the bad videos to play. So throughout the day, we were watching tutorials, testing scripts, and improvising on them. The sounds were also being worked on by Louis. I had to watch a lot of tutorials to figure out how to actually implement it in the VR script itself, so there were a lot of errors I was trying to solve.

Fig. 8.b. Everyone trying to solve a problem

By the time the trigger worked, we thought about the point and message of our whole project, but we couldn't really see it, so we started discussing. The conclusion of the discussion was to change the flow of the whole project while keeping our concept in check. The new flow: the user enters the room (start) and triggers the good videos to play, when suddenly a trashcan appears as well, tempting them to interact with it and topple it over. As the trashcan topples, the bad videos are triggered and trash appears. Users then have to place the trash inside the trashcan to trigger the good videos to play again. The message of the story, therefore, is to be considerate and not ignorant of our surroundings. After confirming that, we listed down the things that needed to be programmed and looked for tutorials for them. With that, we called it a day.
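One of the things on that list was detecting the toppled trashcan. A rough sketch of how that step could be programmed (the tilt threshold and event wiring here are illustrative, not our exact script): check how far the can has tilted from upright, and fire an event once.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hedged sketch of the "trashcan topples -> bad videos play" step.
public class TrashcanTopple : MonoBehaviour
{
    public float toppleAngle = 60f;  // degrees of tilt that count as toppled
    public UnityEvent onToppled;     // hook up: swap to bad videos, spawn trash

    bool toppled;

    void Update()
    {
        // Compare the can's up axis against world up to measure its tilt.
        if (!toppled && Vector3.Angle(transform.up, Vector3.up) > toppleAngle)
        {
            toppled = true;          // fire only once
            onToppled.Invoke();
        }
    }
}
```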

18th April 2019

Thursday was pretty much similar to Wednesday, with all of us focused on doing our own work. We managed to program the trash to appear together with the good video; we wanted to work on the timing for that, but we didn't really have the time for now.

Fig. 9.a. Default workspace

At this point we were trying to figure out how to do that, as the trashcan has a collider of its own but we also wanted it to have a trigger. We were also trying to figure out how to shape the collider like the trashcan. After much thinking and trying, I ended up placing multiple box colliders on the trashcan and editing their shapes to form a bin, with the final collider inside the trashcan acting as a trigger.
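The inner trigger then only has to count what lands inside. A minimal sketch of that trigger volume (the "Trash" tag and the count are assumptions for illustration): box colliders form the solid walls, and this script sits on the trigger collider inside the can.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hedged sketch: counts trash dropped into the bin's inner trigger volume.
public class BinTrigger : MonoBehaviour
{
    public int trashToCollect = 5;    // total pieces of trash spawned
    public UnityEvent onAllCollected; // hook up: switch back to good videos

    int collected;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Trash"))
        {
            collected++;
            other.gameObject.SetActive(false); // swallow the trash piece
            if (collected >= trashToCollect)
                onAllCollected.Invoke();
        }
    }
}
```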

Fig. 9.b. A stressed-out look while working on the scripts

After a while, though, we noticed that whenever the controller picked up an object, the object kept turning on its own and flew elsewhere when we dropped it, so we were wondering what was wrong. Looking back at the tutorials, I found that it may be because we added a collision function when a trigger function was already there: when the controller picks up the object, it enters trigger mode, but the collision still happens, so the picked-up object isn't still and keeps moving, and the same thing happens when we drop it. With that, we tried disabling the collision function whenever the trigger function is called, and vice versa. Thankfully it worked, but we had another problem with the trashcan. There were just so many problems, and we tried and faced errors countless times. But it was honestly fun trying to figure the whole thing out and understand the programming. This experience definitely allowed me to understand the coding language better and to think more critically and logically.
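The fix boils down to toggling the object's physics state while it is held. A hedged sketch of the idea (the method names OnPickedUp/OnDropped are placeholders for whatever the grab script actually calls):

```csharp
using UnityEngine;

// Hedged sketch of the held-object fix: while grabbed, the collider is
// switched to trigger mode so solid collisions can't fight the grab.
public class HeldObject : MonoBehaviour
{
    Collider col;
    Rigidbody body;

    void Awake()
    {
        col = GetComponent<Collider>();
        body = GetComponent<Rigidbody>();
    }

    public void OnPickedUp()
    {
        col.isTrigger = true;     // disable solid collisions while held
        body.isKinematic = true;  // let the controller drive the motion
    }

    public void OnDropped()
    {
        col.isTrigger = false;    // solid again
        body.isKinematic = false; // hand the object back to physics
    }
}
```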

We ended up staying up late to completely finalize our prototype before the presentation the next day.
Fig. 9.c. Working on the coding

19th April 2019 - Presentation Day
On the day of the presentation to panelists from the industry, we were able to add in a little bit of sound at the very last minute.

Fig. 10.a. Setting up and testing
Fig. 10.b. Faces before presentation

Fig. 10.c. Faces during presentation 1
The feedback we received from the industry panelists was straightforward: they questioned our reason for using VR, as the immersive element couldn't be seen in the prototype. We were also asked to think about the concept more, because based on the new flow, there seemed to be two different concepts going on.

Fig. 10.d. Faces during presentation 2
Fig. 10.e. A group picture of Mirrorless
Fig. 10.f. A group picture of everyone


This is where things went...

◔ Week 4

We were given a break.

◔ Week 5

30th April 2019
After a week of break, we finally got back on track to continue working on the project. We were introduced to Rumii, a VR platform for collaboration. For the day itself, we were asked to test it out and to contact our RMIT team members about the next action steps. Mr. Razif also gave us a small task: to analyze and identify the things we can do to improve the user experience of our project.
Exercise from Mr. Razif

1st May 2019
We had a meeting on Google Hangouts with our team today, where we finalized the things we wanted to change for the continuation of the project. Our RMIT team members were in class at that moment, so we also finalized our idea by getting approval from Dr. Li Ping. We pitched two ideas. The first, based on the industry panelists' feedback, was to create a virtual reality environment for users to venture around and see; however, this idea lacks interactivity. The second was to allow users to build cities, where as they build, the bad side of the city is built as well, to show users that there are always consequences to what humans do. We proposed having the bad side built behind the user so that it's in the user's blind spot and they'll only notice it once they look around.

Fig. 11.a. A video call with the team


2nd May 2019
Our team consulted both Mr. Kannan and Mr. Razif on our updated idea in class today. The feedback we got was that turning our idea into a sandbox kind of interaction would only over-complicate things. Mr. Kannan and Mr. Razif explained their concerns about the technical issues as well as the repetitiveness of the whole idea, where the message would not be clear to users either. They suggested we think about the initial idea of the built environment and discuss it again with our RMIT team members. If we were to proceed with the sandbox idea, while it may be achievable, it would be very challenging.

Meanwhile, I tried to find tutorials on building games using grids in Unity to do some testing.
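The core of those grid-building tutorials is snapping a position to the nearest cell so buildings line up. A minimal sketch of that idea (cellSize is an assumed parameter, not from our project files):

```csharp
using UnityEngine;

// Hedged sketch: rounds a world position to the nearest grid cell
// so placed buildings align, as in the grid-building tutorials.
public static class GridSnap
{
    public static Vector3 Snap(Vector3 position, float cellSize = 1f)
    {
        return new Vector3(
            Mathf.Round(position.x / cellSize) * cellSize,
            position.y,  // keep the placement height unchanged
            Mathf.Round(position.z / cellSize) * cellSize);
    }
}
```

For example, with a 1-unit cell, a pointer position of (1.3, 0, 2.7) would snap to (1, 0, 3) before the building is placed.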

5th May 2019
We tried to talk to our RMIT team members about the suggestions and feedback provided by Mr. Kannan and Mr. Razif. They were against the idea of adjusting again, so we decided to have another video meeting on Monday during their class, as none of us were really free on this day.

◔ Week 6

6th May 2019
We had a discussion with the team today regarding the ideation. They were still against changing it, so we decided to go with the building idea, the second one. However, we proposed that we might have to scale it down for optimization, to a tabletop kind of experience, which they weren't really satisfied with. We ended up with two options: having the buildings slightly bigger but only up to knee height, or having them on a tabletop. After our meeting, we received news that Dr. Li Ping had disagreed with and rejected the tabletop option, as it would be the same idea as the other team's. So in the end, we went with knee height for our upscaled version.

◔ Week 7

13th May 2019
With all the feedback from our lecturers, we were kind of pessimistic about this upscaled version of the project, as everything had gone in a different direction, so we were trying to see if we could just stick with the upscaled version we had planned initially, before the RMIT team went back to Australia. However, due to disagreements, we decided to still proceed with the new idea. It was pretty difficult, as we weren't able to test it on the real hardware due to limitations.

Based on the meeting, I tried to create a rough prototype in Unity to see what the whole project would look like.
Fig. 12.a. A sample building script based on tutorials

15th May 2019
Umar was done modelling the houses, so I was in charge of placing them in Unity. We were also advised to insert a plane showing the building area, so that users won't mistakenly think they can build the houses as far out as they want.

Fig. 13.a. Progression

◔ Week 8

20th May 2019
I tried to convert the script to be compatible with VR again, but apparently it still didn't work on their side, and I had no idea what the error was.

Fig. 13.b. Compilation of errors on Unity
Fig. 13.c. Errors

21st May 2019
We tried to figure out how we could test the file to see if it's compatible with the hardware, so we tried building the file in Unity first. However, that method did not work either.

◔ Week 9

24th May 2019
I passed the unconverted Unity file to our RMIT teammates last week, so they tried to convert it to VR on their side, which they got working. However, the building script was tracking the headset instead of the controller.

◔ Week 10

14th June 2019
Throughout the weeks, I've been trying to see if I could help them figure out how to solve the conversion issue, but we were still unable to solve it. In the end, they decided to use the prototype created during the two weeks they were here for their exhibition instead.

15th June 2019
Pictures from exhibition.

Fig. 14.a. RMIT Exhibition
Fig. 14.b. RMIT Exhibition


Expansion

I was able to explore 3D animation further due to the expansion we were required to do. Below are a few progress shots of my execution of the expansion project. Since I was in charge of animation, that was what I explored.

Fig. 15.a. Placing the environment and character together

Fig. 15.b. Close up on the food

Fig. 15.c. Close up on the models

Fig. 15.d. Close up on the flies

Fig. 15.e. Animated bad scene

Fig. 15.f. Close up on the fly animation

Fig. 15.g. Animated good scene
Design Report


Reflection

This module and collaboration allowed me to explore and really experiment with Virtual Reality for the first time. I managed to experiment with both the Oculus Go and the HTC Vive, where a lot of new software was utilized, such as RiftCat and SteamVR. The learning acquired this semester, in my opinion, was not really at its maximum due to the way everything was planned out. However, I believe that if we had carried out the semester normally, it would have provided each of us with better project planning as well as more knowledge and exploration. Nevertheless, it was a different kind of experience for me this semester, and it was great that I still got to explore VR and get an idea of how it actually works in the first place. I also gained a slightly deeper understanding of programming in Unity.
