AR Using 3D Objects

Technology

Unity

Objective

The objective of this experiment is to create an augmented reality experience that uses a scanned 3D object as the target.

Experiment

Create a Unity project, switch the platform to Android, and make sure to enable the Vuforia Augmented Reality engine in the XR Settings. Once everything is set up, add the AR Camera and an Object Target (for the 3D scan) to the scene.

The Object Scanning Target

Scan your object using the Vuforia Object Scanner app on Android. Users will have to scan the object against the Vuforia object scanning target sheet (refer to the image above).

Place your object on the bounding box (refer to the image above). Once scanning begins, the surface regions will appear. Press record and start scanning the surface regions; each region turns green once it has been captured. Keep recording until the majority of the regions are recorded (refer to the image below).

Once the object is fully scanned, stop recording and export the file (refer to the image below).

Upload the scan data to the Vuforia developer portal and export the database as a Unity package. Once it finishes downloading, import the package into Unity and assign the database to the Object Target (refer to the image below).

After entering the license key in the Vuforia Engine Configuration, users can begin adding AR content to the target. In this case, I created a cube inside the surface area.

Now users can test it out (refer to the image below).

I also incorporated a depth mask shader in this experiment. The depth mask shader draws faces that are invisible but still appear in the depth buffer. This allows you to prevent objects from being drawn where they are occluded by the mask. Objects using this shader are drawn just after regular opaque objects; the shader then prevents subsequent objects from being drawn behind it.
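This kind of shader follows a well-known Unity community pattern; a minimal sketch of it looks like this (the shader name is my own choice, not necessarily the one used in the experiment):

```shaderlab
Shader "Custom/DepthMask" {
    SubShader {
        // Render just after regular opaque geometry (Geometry = 2000).
        Tags { "Queue" = "Geometry+10" }

        // Write only to the depth buffer; draw no color at all.
        ColorMask 0
        ZWrite On

        Pass {}
    }
}
```

A material using this shader goes on an invisible stand-in mesh; anything rendered afterwards is hidden wherever the mask sits closer to the camera.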

I created a cube resembling the physical box, applied the depth mask shader to it, and moved the visible cube on top of the masked cube.

The final experiment:

Challenge

Tracking was quite a challenge, as a clear, uncluttered space is needed to achieve more accurate tracking.

Conclusion

Although this experiment is quite basic, I hope to incorporate the 3D scan aspect of this experiment into my final project.

AR Animation Trigger Experiment

Technology

Unity

Objective

The objective of this experiment is to create an animation in AR that can be triggered by certain events.

Experiment

Before starting the experiment, users have to make sure that they have switched the platform to Android, enabled Vuforia Augmented Reality support in the XR Settings, and set the SDK to the built-in Vuforia SDK.

After configuring the image target in the Vuforia Developer portal, download the Unity package and import it into the project. Then, add the AR Camera and Image Target to the scene.

Create a cube on top of the Image Target. Apply a material to the cube if necessary. Then create an animation for the cube. In this experiment, I animated the cube to appear and disappear by keyframing its scale attribute.

Create an empty state as the default state, then add a script to the cube. Below is the code.

The script functions as a trigger for the cube. If the user presses the 1 key while in the default state, the cube appears, and so on. After adding the script to the cube, the experiment is complete.
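A script along these lines can be sketched as follows (the class name, keys, and animation-state names are my own assumptions, not the original code):

```csharp
using UnityEngine;

// Hypothetical sketch of the trigger script; names are assumptions.
public class CubeAnimationTrigger : MonoBehaviour
{
    private Animator animator; // Animator holding the cube's appear/disappear clips

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Pressing 1 plays the appear animation; pressing 2 plays disappear.
        if (Input.GetKeyDown(KeyCode.Alpha1))
            animator.Play("CubeAppear");
        if (Input.GetKeyDown(KeyCode.Alpha2))
            animator.Play("CubeDisappear");
    }
}
```

The empty default state keeps the Animator idle until one of these states is played.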

Challenge

No challenges were faced in this experiment.

Conclusion

It is hoped that this experiment will contribute to making the whole campaign more intriguing and interactive.

360 Camera Experiment

Technology

The Insta360 One X is a 360 camera created by Insta360. It is the second-generation device, succeeding the original Insta360 One camera. It is able to shoot in 4K resolution and offers slow-motion shooting, FlowState stabilization, and more.

The Insta360 One X supports add-ons like an invisible selfie stick, a bullet time stick, and more. Just as the name suggests, the selfie stick does not appear in the rendered video when used. This experiment will be utilizing that feature to shoot videos.

Objective

The objective of this experiment is to try out the video features of the Insta360 One X and figure out ways to incorporate them into the final outcome.

Experiment

The Insta360 One X comes with a mobile app for previewing shots and managing video files. Upon connecting to the camera, users are greeted with this interface.

(image)

Prior to shooting, users may try out built-in features such as LOG mode and shooting at different resolutions. In this experiment I tried out the LOG mode feature. This mode records video in a way that maximizes the dynamic range and latitude for color grading (i.e., adjusting the colors to achieve a particular visual effect). Below is a comparison of footage in LOG mode and normal mode.

(image comparisons)

I color corrected two clips shot in LOG mode. Here is a comparison of both:

LOG Mode
Edited

Challenges

Although experimenting with new technology is challenging in general, no particular challenge was found during the experiment.

However, I learned that the protective casing of the camera can interfere with the invisible selfie stick feature, as shown in the footage above.

Conclusion

Through this experiment, I learned that incorporating these features and add-ons could benefit the end product and create a truly immersive first-person experience in VR.

NMS Nightfall – 26/10/19

On the 26th of October 2019, the National Museum of Singapore held a one-day-only event called Nightfall. Nightfall is "an interactive experience that will reshape the way you experience history at the National Museum" (as cited on the National Museum of Singapore's Instagram).

entrance

The interactivity in the exhibition was quite simple yet fun and memorable. I would say that, as a visitor, the experience was really easy to understand and didn't require much digital knowledge to interact with.

Upon entering the exhibition, visitors are greeted by a 180-degree curved screen displaying an animated story of Singapore's history.

180-degree curved screen

Then, visitors enter the exhibit itself. The exhibition includes many artifacts and small areas that display digital information. One of them features two spinnable globes connected to a projector. Visitors stand on a specific spot facing a globe; the globe displays titles, and whichever title the viewer faces is shown on the projector.

the globe area

There was also a segment displaying the plants and animals of Singapore. Visitors can learn which plants were used for medicine by matching the provided knobs to the pictures on the wall; if they guess correctly, the image lights up brighter. Visitors can also smell local spices such as cinnamon, pepper, nutmeg, mace, and so on.

Interactive Plant Area

Furthermore, two digital kiosks were provided for visitors to read further about Singapore's history.

digital kiosk

Beyond the interactive exhibits, they also held a scavenger hunt to make the event more entertaining and fun. Sadly, I did not get the chance to try it. From what I remember and heard, the scavenger hunt was application-based, so participants had to download an app before playing. As for the technicalities and how it works, I can only assume the application provides clues for the hunt. Overall, it was a fun experience, and I hope to draw inspiration from it and incorporate some of its aspects into my final project.

Integrating Unity VR in Android Devices

Objective

The objective of this experiment is to combine the ChangeScene function with VR and build the result as an Android application.

Experiment

By combining all the previous experiments, I managed to switch between VR scenes.

After integrating it, I wanted to build and run the project on the Android platform. After setting the XR Settings to Google Cardboard, the project is ready to be built and run. Below is the end result:

Challenge

The Android back button does not work within the application by default, so a script needs to be added in order to enable the function. Below is the script.
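On Android, Unity reports the hardware back button as the Escape key, so a minimal version of such a script could look like this (the class name is my own assumption):

```csharp
using UnityEngine;

// Sketch: quit the app when the Android back button is pressed.
public class BackButtonQuit : MonoBehaviour
{
    void Update()
    {
        // Unity maps the Android back button to KeyCode.Escape.
        if (Input.GetKeyDown(KeyCode.Escape))
            Application.Quit();
    }
}
```

Attach it to any object that persists in the scene, such as the camera.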

Conclusion

The experiment is quite simple as of now; however, it will be beneficial to the project in the future.

Changing Scenes in Unity

Objective

The objective of this experiment is to invoke the Change Scene function in Unity by pressing a button.

Experiment

In this experiment, I created two simple scenes, each containing one sphere. In the scene “Blue,” the sphere is blue, and in the scene “Red,” the sphere is red. Below are images of the scenes.

Blue Scene
Red Scene

I also created a button scene for the UI.

After creating all the scenes, we have to invoke the ChangeScene function using the OnClick event in Unity.

Firstly, a script has to be created. In this case, I named the file “MenuScript.” Through this script, we are able to load the scenes assigned to the OnClick event.
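A minimal sketch of such a MenuScript, using Unity's SceneManager (the method name is my own assumption):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of "MenuScript": exposes a public method for a button's OnClick event.
public class MenuScript : MonoBehaviour
{
    // Pass the scene name ("Blue" or "Red") in the OnClick inspector field.
    public void ChangeScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```

Note that both scenes must be added to the Build Settings, or LoadScene will fail to find them.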

Then, attach the script to the main camera. The script may be attached to any object, as long as that object is available in every scene. After that, link each button's OnClick event to the main camera (refer to the image below).

Once everything is set up, run the project. Below is the final result of the experiment.

Challenge

No challenges were found in this experiment.

Conclusion

Although this experiment is extremely simple and still needs further development, it will be beneficial to the end product.

Project Ideas

This project involves a rather sensitive topic. It is best to create something in which participants can experience certain situations hands-on and in first person; thus, VR comes to mind. However, this project will not only focus on VR; it will be a campaign involving other technologies such as AR.

For the VR idea, so far these are my options:

  1. A 360 experience of the participant in the eyes of a victim of gender stereotyping which leads into career choices.
  2. A VR experience where the participant plays as a woman applying for a job and being interviewed. Participants will be able to decide how the story progresses based on the choices provided.
  3. A VR/360 experience where the participant will play as a woman in a simulation in which the world hasn’t progressed much and people are still enforcing certain stereotypes to women just like decades ago.

From all the ideas presented above, in my opinion, number 2 is the best option, as it gives a first-person experience of how gender stereotyping affects women's work life. As for the part about how it affects one's career choices, I think it would be best to present it through small illustrations involving AR, or perhaps a handbook containing all the information on the topic.

Implementing VR footage in Unity

Objective

In order to further develop my research on VR, I will be experimenting with how to implement 360 footage in Unity.

Experiment

The experiment is to figure out how to place the video on a sphere and move the main camera inside the sphere in order to view the 360 video. It should be noted that the 360 video should be in an equirectangular format.

Once the video is placed on the sphere, we have to flip the sphere's normals so that the camera, placed inside, can view the video in 360. This is achievable using a custom shader and material. Below is the code used for the shader.
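The exact shader used is not reproduced here; a commonly shared community version that flips the sphere's normals looks like this (the shader and property names are my own assumptions):

```shaderlab
Shader "Custom/InsideSphere" {
    Properties {
        _MainTex ("Video Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        // Draw the inside faces of the sphere instead of the outside.
        Cull Front

        CGPROGRAM
        #pragma surface surf Lambert vertex:vert
        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        // Flip the normals so the inner surface is lit and textured.
        void vert (inout appdata_full v) {
            v.normal.xyz = -v.normal.xyz;
        }

        void surf (Input IN, inout SurfaceOutput o) {
            // Mirror the texture horizontally so the video is not reversed.
            IN.uv_MainTex.x = 1 - IN.uv_MainTex.x;
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```

A material using this shader goes on the sphere, with the Video Player set to render onto the material's texture.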

Once the shader and material are applied to the sphere, you will be able to view the video in 360. Below is the final result of the experiment.

Challenges

The challenge of this experiment was figuring out how to display the video inside the sphere and finding the correct shader code. Other than that, no challenges were found.

Conclusion

This experiment is very basic but necessary to the prototype. I will be further experimenting with Unity’s VR engine in order to develop the project.

Virtual Reality Makes Us More Human

As previously stated, in VR, users will be transported into this simulated environment, which they can immerse themselves in and experience the story first hand. This benefit is crucial in creating an impactful result to the user. An interesting statement that further supports this would be a statement conveyed by Chris Milk on his TED Talk in 2015, “How virtual reality can create the ultimate empathy machine.”

“It (VR) is not just a video game peripheral. It connects humans to other humans in a profound way that I’ve never seen before in any other form of media. And it can change people’s perception of each other. And that’s how I think Virtual Reality has the potential to actually change the world.”

Chris Milk, 2015

Furthermore, in The Machine to be Another, created in 2012 by Be Another Lab (an international art collective), empathy is challenged. This project is an embodiment virtual reality system that allows individuals to experience the world through the eyes and body of another (Be Another Lab, n.d.). This art performance aims to stimulate pro-social behavior and overcome intergroup social barriers (Be Another Lab, n.d.). In The Machine to be Another, participants can sit in a chair and swap perception with a performer through VR; they may see another person’s face in the mirror and hear the spoken “thoughts” of a performer through the headphones (Wired, n.d.). According to Wired (n.d.), through this performance art, psychologists, neuroscientists, and researchers in six countries can explore issues like mutual respect, gender identity, physical limitations, and immigration.

Other than The Machine to be Another, researchers from Stanford University’s Virtual Human Interaction Lab created a project called “Empathy at Scale.” This project aims to explore ways to design, test, and distribute virtual reality projects that teach empathy (Wired, n.d.). Participants of this project experience a scenario from the perspective of others through VR (Virtual Human Interaction Lab, 2015).

Furthermore, in Clouds Over Sidra, a 360 film created by Chris Milk, participants are placed in an environment they can experience as themselves. It tells the story of Sidra, a 12-year-old Syrian girl who has lived in the Za’atari Refugee Camp in Jordan since the summer of 2013. Through this project, Milk aims to utilize VR to generate greater empathy and new perspectives on people living in conditions of great vulnerability (United Nations Virtual Reality, n.d.). This works best in VR because, as Milk explains, experiencing something first-hand creates a bigger impact: participants feel Sidra’s humanity, and thus empathize with her, in a deeper way (Milk, 2015).





References

https://www.vrs.org.uk/virtual-reality/what-is-virtual-reality.html

https://www.marxentlabs.com/what-is-virtual-reality/

https://electronics.howstuffworks.com/gadgets/other-gadgets/virtual-reality.htm

https://www.pebblestudios.co.uk/2017/08/17/when-was-virtual-reality-invented/#targetText=The%20first%20virtual%20reality%20headset,and%20his%20student%2C%20Bob%20Sproull.

https://www.wired.com/brandlab/2015/11/is-virtual-reality-the-ultimate-empathy-machine/

https://www.ted.com/talks/chris_milk_how_virtual_reality_can_create_the_ultimate_empathy_machine?language=en#t-607945

http://beanotherlab.org/home/work/tmtba/

https://vhil.stanford.edu/projects/2015/empathy-at-scale/

http://unvr.sdgactioncampaign.org/cloudsoversidra/#.XaVkakYzal5

What is Virtual Reality?

VR User

Nowadays, Virtual Reality (VR) and Augmented Reality (AR) have taken over the newest technology market, despite their first invention tracing back to 1968. This technology explores one’s sense of reality by using computer technology to create a simulated environment (Marxent, 2019). Even though the concept may have seemed a bit foreign years ago, it has become mainstream and has since been implemented in various industries such as entertainment, sports, fashion, and many more.

AR vs. VR

Although the two are quite similar in concept, their user experiences are entirely different. To put it simply, VR users immerse themselves in a new environment, while AR users experience an enhancement of the world around them. Furthermore, VR users can only interact with the simulated environment, whereas AR users can interact with both the real world and the simulated world simultaneously. Other than that, the devices used to display the two technologies are completely different: VR users typically use a headset such as the Oculus or Gear VR, while AR users may be able to use their phones to display the environment. This crucial difference might be an advantage or a disadvantage depending on the implementation.

This research blog will further discuss the benefits of implementing VR in the topic “How Gender Stereotypes Affect One’s Career Choices and Work-life.” This topic, in my opinion, is quite sensitive. Empathy is crucial to be able to feel and understand the victim’s struggles. Thus, it is essential to choose the right medium to convey this story to create an impact on the audience. VR would be a suitable medium to use for this topic as users will be able to experience the problem first-hand.





References

https://www.vrs.org.uk/virtual-reality/what-is-virtual-reality.html

https://www.marxentlabs.com/what-is-virtual-reality/

https://electronics.howstuffworks.com/gadgets/other-gadgets/virtual-reality.htm

https://www.pebblestudios.co.uk/2017/08/17/when-was-virtual-reality-invented/#targetText=The%20first%20virtual%20reality%20headset,and%20his%20student%2C%20Bob%20Sproull.

https://learningenglish.voanews.com/a/augmented-reality-versus-virtual-reality/3844772.html