Our dependence on technology makes us forget how susceptible we are to its dangers, and texting while driving has become one of the clearest examples. In 2017, 3,166 people were killed in motor vehicle crashes involving distracted drivers in the United States alone. Even more alarming, drivers aged 15 to 19 made up only about 6% of all drivers as of 2016, yet over 13% of texting-and-driving crashes involved drivers in this age group. Now more than ever, we need to help the general population understand that one brief moment of distraction is all it takes to cause a crash. We are confident that the Crash Course VR experience can help address this issue.
Crash Course creates an environment that is both interactive and impactful, illustrating just how quickly a car accident can occur. When a player grabs the phone, they trigger the car crash scene. Through an intuitive UI and strong imagery within the VR space, we hope players experience and learn about the danger of texting and driving in a powerful, memorable way.
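Below is a minimal sketch of how such a grab trigger could be wired up in Unity; the component name, tag, and animator trigger used here are illustrative assumptions rather than our exact scripts.

```csharp
using UnityEngine;

// Illustrative grab trigger: attach to the in-scene phone object with a
// trigger collider. When the player's hand collider enters, the crash
// sequence starts. The names ("PlayerHand", "Crash") are placeholders.
public class PhoneGrabTrigger : MonoBehaviour
{
    [SerializeField] private Animator crashAnimator;        // drives the crash animation
    [SerializeField] private string handTag = "PlayerHand"; // tag on the player's hand collider

    private bool triggered;

    private void OnTriggerEnter(Collider other)
    {
        // Fire only once, and only for the player's hand collider.
        if (triggered || !other.CompareTag(handTag)) return;
        triggered = true;

        // Kick off the crash scene.
        crashAnimator.SetTrigger("Crash");
    }
}
```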
We hope to bring the Crash Course VR simulation module to public institutions, such as high schools and driving schools, to raise the alarm on the dangers of distracted driving. We aim to offer this learning opportunity to the general population, and particularly to the following communities:
- new drivers, often unprepared to understand how quickly distraction can lead to a crash
- young adults, digital natives, often distracted by evolving technology
- overconfident drivers prone to bad driving habits
Motor-vehicle injuries are among the most common causes of trauma. Medical professionals often view trauma as a preventable disease, yet unlike cancer, trauma injuries are frequently stigmatized as "deserved," and people rarely raise funds to aid hospitals' trauma divisions. We therefore want not only to raise awareness of the risks of texting while driving, but also to heighten people's perception of their own susceptibility to such traumatic events.
We built the experience in Unity and integrated it with the Oculus Rift. We divided the work into the following tasks:
- Setting up the Oculus Rift and its accessory devices
- Designing Unity scenes using 3D modeling and material properties
- Animating objects by learning C# (see the sketch after this list)
- Converting between scenes
- Carefully tuning collision and physics settings to avoid unrealistic bugs
- Integrating Unity with the Oculus Rift
- Creating data visualizations in MATLAB
- Asking other attendees to serve as test subjects
- Building a live website with Google Sites and registering the domain through Domain.com
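As one example of the animation and collision work above, here is a hedged C# sketch of the kind of script we mean: an oncoming car is moved forward each physics step and hands control back to the physics engine on impact. The class name, speed value, and settings are assumptions for illustration, not our exact code.

```csharp
using UnityEngine;

// Illustrative animation + collision script for an oncoming car.
[RequireComponent(typeof(Rigidbody))]
public class OncomingCar : MonoBehaviour
{
    [SerializeField] private float speed = 15f;   // metres per second (placeholder value)

    private Rigidbody body;
    private bool crashed;

    private void Awake()
    {
        body = GetComponent<Rigidbody>();
        // Continuous collision detection helps fast objects avoid tunnelling
        // through thin colliders, one source of the unrealistic bugs mentioned above.
        body.collisionDetectionMode = CollisionDetectionMode.Continuous;
    }

    private void FixedUpdate()
    {
        if (crashed) return;
        // Move the car forward along its local axis during the physics step.
        body.MovePosition(body.position + transform.forward * speed * Time.fixedDeltaTime);
    }

    private void OnCollisionEnter(Collision collision)
    {
        // Stop the scripted motion and let the physics engine handle the impact.
        crashed = true;
    }
}
```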
- Perhaps the most difficult step was integrating our software with the hardware. Despite being the first issue we tried to tackle, we could not get Unity and the Oculus to communicate until the end of Saturday night. Connecting to the Oculus Rift was one of the most important aspects of our project, so we were extremely relieved to have made it happen.
- At one point, the laptop storing most of our code was completely erased because of a mysterious power issue. Despite the major setback, we were able to recreate most of the code and animations. Phew!
- Lastly, we struggled with the Oculus Remote from the very beginning. Based on the short descriptions on the single web page and video available online, we could not confirm its capabilities. Although an article mentioned that the remote could control three degrees of rotation, we found that it did not have that functionality. It is a simple remote with only one button, so sadly we could not integrate it in the end.
- Most of us had never worked with Unity or coded in C# before, let alone set up a connection from our laptops to a VR headset. The procedure was surprisingly time-consuming and full of mind-boggling obstacles, yet we overcame the problems one at a time.
- Because of the variety of issues we had to deal with, at one point we had to scale down the scope of our project and set aside our plans to add a time component and fading effects between scenes. Luckily, we were able to solve most of our issues and integrate our research on these more complex variables in the end instead of throwing them away.
https://www.crashcoursevr.tech
It only takes a moment's loss of focus to lose your life. Please click the link above for detailed information that accompanies our VR project.
We were able to illustrate just how quickly the scenario can change with a tap of the space key. We would like to give users a more immersive experience by letting them engage with the simulation through a Touch controller, if possible. Touch controllers would make the simulation more realistic and reinforce our overarching message: control of that environment is in the users' hands, and it is up to them to choose safety over a moment of distraction. We believe the simulation will raise awareness of the dangers of texting while driving and lead users to consciously avoid such scenarios in the future.
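As a rough sketch of what that change might look like, the snippet below shows the current space-key trigger alongside a possible Touch controller mapping. The scene name is a placeholder, and the commented OVRInput call assumes the Oculus Utilities package, which is not part of our current build.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative input handler: a space-key tap switches to the crash scene.
// A Touch controller button could replace it (see the commented line).
public class CrashSceneTrigger : MonoBehaviour
{
    [SerializeField] private string crashSceneName = "CrashScene";  // placeholder; must be added to Build Settings

    private void Update()
    {
        // Current behaviour: the space key changes the scenario.
        bool pressed = Input.GetKeyDown(KeyCode.Space);

        // Possible Touch controller mapping (requires Oculus Utilities):
        // pressed |= OVRInput.GetDown(OVRInput.Button.One);

        if (pressed)
        {
            SceneManager.LoadScene(crashSceneName);
        }
    }
}
```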
## Video Demo
Please DO NOT watch if you intend to experience the full VR experience at our demo.
https://youtu.be/-OJ00xldtrE

## DevPost Link
https://devpost.com/software/crash-course-vqwlim
Selina Huang, Tiffeny Chen, Evelyn Liu, My Nguyen, Agnes Sharan