VR – Button Appears After Delay

Technology

Unity 3D

Objective

To make a button appear after a set delay. This technique will be utilized in the VR project when presenting choices to the user.

Experiment

Before starting the experiment, create a button, then add an animation to it through the Animation window. The animation needs a property to animate; in this case, the Scale property will be used. Set the scale to zero at the start of the timeline and to the desired size at the end.

Then, open the button’s Animator window. Create an empty state, rename it to “none”, and set it as the default state by right-clicking it. This prevents the button from triggering the animation as soon as the scene starts.

Before continuing to the next step, make sure to set the button’s scale to 0 so that the button is invisible. Next, create a C# script to delay the animation.
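
Below is a minimal sketch of such a script for reference, following the approach from the Unity forum thread in the references; the class name, field names, and the “ButtonAppear” animation state are illustrative:

```csharp
using System.Collections;
using UnityEngine;

public class DelayedButtonAppear : MonoBehaviour
{
    // The button's Animator, assigned in the Inspector
    public Animator animator;

    // How long to wait before the button appears, in seconds
    public float delay = 2f;

    void Start()
    {
        StartCoroutine(ShowAfterDelay());
    }

    IEnumerator ShowAfterDelay()
    {
        // Wait for the delay, then play the scale-up animation state
        yield return new WaitForSeconds(delay);
        animator.Play("ButtonAppear");
    }
}
```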

Based on the script above, the button’s appearance is delayed by 2 seconds from the start of the scene. Attach the script to the button and assign the button’s Animator to the script’s Animator field.

Final Outcome

Challenge

The biggest challenge of this experiment is the fact that C# is not taught to students from Raffles Jakarta during their diploma due to a difference in curriculum. This is a major setback, as most students are now encouraged to use Unity to complete their final project.

Conclusion

Overall, the experiment was a success, although it took quite some time to figure out.

References

https://forum.unity.com/threads/delay-an-animation.434645/

AR Using 3D Objects

Technology

Unity

Objective

The objective of this experiment is to create an augmented reality experience with a 3D object as the target.

Experiment

Create a Unity project, switch the platform to Android, and make sure to enable the Vuforia Augmented Reality engine in the XR Settings. Once everything is set up, insert the AR Camera and prepare a 3D scan to use as the target.

The Object Scanning Target

Scan your object using the Vuforia object scanner app on Android. Users will have to scan the object on top of the Vuforia object scanning target (refer to the image above).

Place your object inside the bounding box (refer to the image above). Once scanning begins, the surface regions will appear. Press record and start scanning the surface regions; each region turns green once it has been captured. Keep recording until the majority of the regions are covered (refer to the image below).

Once the object is fully scanned, stop recording and export the file (refer to the image below).

Upload the scan to the Vuforia developer portal and export the database as a Unity package. Once it finishes downloading, import the package into Unity and assign the imported database to the target (refer to the image below).

After inputting the license key in the Vuforia Engine Configuration, users can begin adding AR content to the scanned surface. In this case, I created a cube inside the surface area.

Now users can test it out (refer to the image below).

I also incorporated the DepthMask shader in this experiment. The DepthMask shader draws faces that are invisible but still appear in the depth buffer. This allows you to prevent objects from being drawn where they are occluded by the mask. Objects using this shader are drawn just after regular opaque objects, and the shader prevents subsequent objects from being drawn behind it.
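
For reference, a commonly used version of the DepthMask shader looks like this (a sketch; the shader path is arbitrary):

```shaderlab
Shader "Custom/DepthMask"
{
    SubShader
    {
        // Render after regular opaque geometry, but before the objects to be masked
        Tags { "Queue" = "Geometry+10" }

        // Write only to the depth buffer, not to the color channels
        ColorMask 0
        ZWrite On

        // An empty pass: nothing is drawn, but depth is still written
        Pass {}
    }
}
```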

I created a cube to match the shape of the box, applied the shader’s material to it, and moved the AR cube on top of the masked cube.

The final experiment:

Challenge

Tracking was quite a challenge, as a clear space is needed to achieve more accurate tracking.

Conclusion

Although this experiment is quite basic, I hope to incorporate its 3D scanning aspect into my final project.

360 Camera Experiment

Technology

Insta360 One X is a 360° camera created by Insta360. It is the second-generation device, succeeding the original Insta360 One camera. It offers 4K-resolution shooting, slow-motion shooting, FlowState stabilization, and more.

The Insta360 One X also supports add-ons like the invisible selfie stick, the bullet time stick, and more. Just as the name suggests, the selfie stick does not appear in the rendered video when used. This experiment utilizes that feature to shoot videos.

Objective

The objective of this experiment is to try out the video features of the Insta360 One X and figure out ways to incorporate them into the final outcome.

Experiment

The Insta360 One X comes with a mobile app for previewing shots while shooting and for managing video files. Upon connecting to the camera, users are greeted with this interface.

(image)

Prior to shooting, users may try out built-in features such as LOG mode and different shooting resolutions. In this experiment, I tried out the LOG mode feature, which records video in a way that maximizes the dynamic range and latitude for color grading (i.e., adjusting the colors to achieve a particular visual effect). Below is a comparison of footage in LOG mode and normal mode.

(image comparisons)

I color corrected two clips shot in LOG mode. Here is a comparison of the two:

LOG Mode
Edited

Challenges

Aside from the general challenge of working with new technology, no particular difficulties were encountered during the experiment.

However, I learned that the camera’s protective casing can interfere with the invisible selfie stick feature, as shown in the footage above.

Conclusion

Through this experiment, I learned that incorporating the camera’s features and add-ons could benefit the end product and create a truly immersive first-person experience in VR.

Integrating Unity VR on Android Devices

Objective

The objective of this experiment is to experiment with the ChangeScene function in VR and to build the project as an Android bundle.

Experiment

By combining all the previous experiments, I managed to transition between VR scenes.
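
A minimal sketch of what such a ChangeScene helper might look like, assuming the target scene has been added to the Build Settings (the method and parameter names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class ChangeScene : MonoBehaviour
{
    // Call this from a button or gaze event to load another scene by name;
    // the scene must be listed in File > Build Settings
    public void LoadSceneByName(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```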

After integrating it, I wanted to build and run the project on the Android platform. After setting the XR Settings with Google Cardboard, the project is ready to be built and run. Below is the end result:

Challenge

The Android back button does not work within the application, so a script needs to be added in order to enable the function.
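
A minimal version of such a script is shown below; Unity reports the Android hardware back button as the Escape key:

```csharp
using UnityEngine;

public class AndroidBackButton : MonoBehaviour
{
    void Update()
    {
        // On Android, the hardware back button is reported as the Escape key
        if (Input.GetKeyDown(KeyCode.Escape))
        {
            // Quit the application when back is pressed
            Application.Quit();
        }
    }
}
```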

Conclusion

The experiment is quite simple as of now; however, it will be beneficial to the project in the future.

Implementing VR footage in Unity

Objective

In order to further develop my research on VR, I will be experimenting with how to implement 360° footage in Unity.

Experiment

The experiment is to figure out how to place the video on a sphere and move the main camera inside the sphere in order to view the 360° video. It should be noted that the video should be in an equirectangular format.
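
As a sketch of this step, the clip can be played on the sphere with Unity’s built-in VideoPlayer component; when added to a GameObject that has a Renderer, the video is rendered onto that Renderer’s material (the class name and field are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attach this to the sphere that will display the 360 footage
public class Play360Video : MonoBehaviour
{
    // The equirectangular video clip, assigned in the Inspector
    public VideoClip clip;

    void Start()
    {
        // A VideoPlayer on an object with a Renderer plays onto its material
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.isLooping = true;
        player.Play();
    }
}
```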

Once the video is placed on the sphere, we will have to flip the sphere’s surface so that the camera, positioned inside, can view the video. This is achievable by using a custom shader and material.
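
A typical shader for this purpose looks like the following sketch: it culls the sphere’s front faces so only the inside is rendered, and mirrors the horizontal UV coordinate so the footage is not left-right flipped (the shader path is arbitrary):

```shaderlab
Shader "Custom/InsideOutSphere"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }

        // Draw the inside faces of the sphere instead of the outside
        Cull Front

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // Mirror the horizontal UV so the video is not flipped
                o.uv = float2(1.0 - v.texcoord.x, v.texcoord.y);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```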

Once the shader and the material are applied to the sphere, you will be able to view the video in 360°. Below is the final result of the experiment.

Challenges

The challenge of this experiment was figuring out how to display the video inside the sphere and finding the correct shader code for it. Other than that, no challenges were encountered.

Conclusion

This experiment is very basic but necessary for the prototype. I will be experimenting further with Unity’s VR engine in order to develop the project.

Experimenting with AR in Unity

Technology

Unity is a cross-platform game engine released in 2005. Since its initial release, Unity has continued to evolve, with the latest version, 2019.2.3, released in August 2019.

Unity technology is the basis for most virtual reality and augmented reality experiences. It powers approximately half of the mobile games on the market and 60 percent of augmented reality and virtual reality content, including approximately 90 percent of content on emerging augmented reality platforms and 90 percent of Samsung Gear VR content.

Objective

This experiment was conducted to create a simple AR in Unity using the built-in Vuforia Engine.

Experiment – Final Result

Here, I created a simple sphere object on top of the postcard to be displayed in AR.

Challenges

No major challenges were faced during the experimentation, although it was somewhat confusing to utilize and explore a new application.

Conclusion

Although this experiment is very basic, I hope to learn more about it in the future and utilize it in my final project.

Omnivirt

Technology

Omnivirt is a Virtual Reality (VR) and 360° video distribution and advertising platform for brands and publishers, built on an HTML5 video player. It powers 360° VR video playback across its partners’ core media properties and provides 360° VR ad products that can be offered to their advertising clients. So far, it has partnered with many well-known brands, such as The New York Times, AOL, Vice, WSJ, Twitter, Google, and more.

Omnivirt provides a customizable hotspot tool. Users can utilize hotspots to link to other content uploaded to the VR player, execute JavaScript commands, or redirect to an existing URL.

Objective

The objective of this experiment is to integrate interactivity into a virtual reality video using Omnivirt.

Experiment

You can view the experiment here

Challenges

No challenges were faced during the experimentation as the instructions were quite straightforward.

Conclusion

With Omnivirt, interaction features are much easier to integrate into a virtual reality space. If necessary, I hope to utilize this technology in my final project.

Experimenting with Leap Motion in Unity

Technology

For this experiment, I used Leap Motion, a sensor device that takes hand and finger motions as input and supports hand tracking in virtual reality, and Unity, a real-time development platform that can build both 2D and 3D spaces.

Objective

To experiment with interacting with an object in a virtual reality space using hand motions.

Challenges

  • Finding the right tutorial, as the available tutorials are dated and support different versions of the program.
  • Not being able to utilize the new version of the Leap Motion software, as the samples provided only support the previous version.
  • Experimenting with something new will always be challenging and confusing at first.

Conclusion

I learned that I could utilize this sensor device in a virtual space to further enhance the digital interactive experience for viewers through simple hand motions.