Virtual Reality: Transformation Tool for Learning and Equity in Middle Schools

To gauge the current state of Virtual Reality (VR) in middle schools in the Bay Area, I talked with Azine Davoudzadeh, Video Educator/Filmmaker, of the San Ramon Valley Unified School District. Azine is actively building curricula to get more young women involved in growing tech fields. She also teaches English/History classes to middle school students and explores how VR can impact different subjects. Azine is developing Virtual Reality 101 weekend courses for teenagers. Currently, she is designing curriculum for a summer enrichment course she will teach July 24–27, titled “Intro to Virtual Reality,” which will give students experience with 360 filming, Unity development, and testing existing apps in VR.

Azine uses RICOH THETA in her classrooms.

Azine will be running an Ed Tech Connect meetup on June 30 in San Francisco, open to educators, administrators and technologists.

Azine Davoudzadeh, Video Educator/Filmmaker, San Ramon Valley Unified School District

Key Quotes

"In the next 3–5 years, I see VR/AR/XR being as common as getting a laptop out to write up a paper.”

“The results of the [VR intervention study] did in fact show a statistically significant increase in interest from females towards careers in tech and VR.”

“Another conclusion from the [VR intervention study] was that VR in the classroom led to higher uses of adjectives for all students!”

1. How did you get involved in VR?
About a year ago, a friend introduced me to VR in a coffee shop. I started attending meetups and testing as many apps and platforms as I could. I was blown away! I started thinking of ways that I could use VR in my classroom. I thought that this could impact student learning significantly, but I needed to find out how.
My trial by fire was building a curriculum for Virtual Reality 101, a weekend course for teen girls. I realized that VR is the perfect tool for teenagers to express themselves in new ways, while at the same time providing the chance to learn critical new tech skills.

2. Why did you decide to research Virtual Reality in Education for your Master’s Thesis?
About a year ago, I started planning the VR Teen Girls workshop. However, it was not initially successful, since it was still early for VR. I noticed that at my school, only 26% of the enrollment in the tech classes we offered was girls. I realized that VR has a lot of artistic, visual, and design components that girls are drawn to, so I started researching how a VR “intervention” could affect female attitudes toward pursuing tech careers in their future. The results did in fact show a statistically significant increase in interest from females towards careers in tech and VR.

To test the VR intervention, I used descriptive writing samples in order to analyze learning outcomes before and after the use of VR. Another conclusion from the study was that VR in the classroom led to higher uses of adjectives for all students!

3. What technologies resonate most with middle and high school students in 2017?
Obviously, mobile phones are the most used technology among middle and high school students. However, I believe that in a few years VR/AR/XR could be the next big computing platform. As soon as the technology becomes lightweight, similar to wearing a pair of glasses, I know that everyone will have a pair. Perhaps further down the line it can be reduced to a pair of contact lenses.

The types of use cases for teens and students right now are social media, gaming, and learning websites such as Khan Academy. Teens are very tech savvy, and if they are introduced to XR early, in a few years it will seem normal to put on a head-mounted device to learn anatomy in high school.

4. How can educators and administrators better evaluate and implement new VR technologies in the classroom?
I believe that more research needs to be done in this area in order to get buy-in from schools. However, classroom teachers can easily obtain fairly inexpensive Google Cardboard headsets that students can use with their phones. This would be a good start for teachers, but more relevant content also needs to be available. At the same time, applying for grants and simply experimenting with different apps can be a good way to find out what works and what doesn’t. Administrators can take baby steps by creating maker spaces at their schools where all teachers can visit and use the technology. This can be more cost-effective and equitable while VR is still growing its content.

5. How can technologists get involved helping school districts implement new technology?
Forming this bond between technologists and schools is key to the success of XR in the future. In order to have useful content that is specifically made for curriculum, technology professionals should engage in a discussion with educators about their needs in the classroom. Events like Ed Tech Connect are also a good place to start these conversations.

For example, one subject I teach is US History. I know that if I were to work with a developer on creating meaningful content, it would completely disrupt the teaching and learning model. When I teach about the Civil War, what if instead of reading about it or seeing a video, we could go there or experience what it was like to gain voting rights during that era?

The other vital tool is having a user-friendly platform to create this content on. I have worked with Unity in my VR Club, and students seem to catch on quickly. This game engine could ideally be used by teachers and students to create content for the Common Core curriculum that is used in classes.

6. Where do you see VR in the classroom in 3 years?
In the next 3–5 years, I see VR/AR/XR being as common as getting a laptop out to write up a paper. It will help teachers be more effective at their jobs and help students learn exponentially. It will effectively improve learning outcomes, save money for schools, help with student retention rates, and build skills and empathy, all while being engaging.

Join Azine at the Ed Tech Connect meetup on June 30 in San Francisco, open to educators, administrators and technologists.

Onsite describing the new THETA

Completely packed meetup

Gallery of pictures from last night, viewable in a VR headset or on a mobile phone.

https://edtech-theta-june-2017.glitch.me/

My daughter extended Kieran’s demo from last night with particle effects. She’s the one who picked the children’s graphics.

https://theta-meetup-june-2017.glitch.me/

Static screenshots from galleries above.

I changed the particle effect from rain to stars, as that is more fitting for this spectacular event.

Wow, that A-Frame-based Glitch VR gallery is just fantastic! When I saw this post, I jumped up, put the URL in my iPhone browser, switched to goggle mode, and slid my smartphone into RICOH’s version of Google Cardboard (in other words, cheap, cardboard VR goggles) and, BAM! I’m standing in the middle of a circle of friends and colleagues from the meetup last night! Really crazy, really cool. I mean, I know all of this is possible. But the ease and the quickness and the immediacy… it’s really fantastic. Feels like I’m #livinginthefuture.

You asked about rotating specific images on the Facebook Workshop Group. As Facebook does not have a good way to post code samples and HowTo write-ups, I will respond here.

Default orientation

Rotated orientation

('rotation', "-20, -10, 0")

This is the same picture, just rotated 190 degrees.

('rotation', "20, 190, 0")

Orientation Per Picture

Explanation

When the image-change event fires, check the src of the image that has come up and set the orientation for that image. There are three images in this example: #theta1, #theta2, and #theta3. If the src for the image is #theta1, then modify the orientation for #theta1.

Do It

remix

In Glitch, remix the project.

edit

Go to set-image.js

Look for the code below.

    if (data.src === "#theta2") {
      console.log("#theta2 has come up: ", data.src);  
      // data.target.setAttribute('rotation', "20, 190, 0");
      data.target.setAttribute('rotation', "-20, -10, 0");
    } 

Change the rotation values until the image is positioned the way you want.
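
If each of the three images needs its own orientation, the same branch can be repeated per id. A sketch, assuming the handler in set-image.js receives the same data object with src and target as in the snippet above (the rotation values here are examples only):

    // Hypothetical per-image orientation sketch for the image-change handler.
    // Assumes data.src names the image that came up and data.target is the
    // entity to rotate, as in the snippet above.
    if (data.src === "#theta1") {
      data.target.setAttribute('rotation', "0, 0, 0");      // keep the default
    } else if (data.src === "#theta2") {
      data.target.setAttribute('rotation', "-20, -10, 0");  // example rotation
    } else if (data.src === "#theta3") {
      data.target.setAttribute('rotation', "20, 190, 0");   // example rotation
    }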

As TK mentioned that we should look into a sphere thumbnail, I experimented more with mapping images onto different objects and with different effects. I also added animation.

https://snow-sagittarius.glitch.me/

I’m hoping that a lot of people will start playing with this type of “art” as a toy and eventually come up with a better way to show 360 images and video to people.

My daughter built another A-Frame world, this one based on pandas. A-Frame technology is really effective in getting pre-teen (middle school) kids motivated. https://panda.glitch.me/

She seems to be the right age for this. Based on my testing with a middle school kid, I would say that the workshop was spot on! I’ll do more curriculum testing to see how she reacts.

She’s also using A-Frame in Google Cardboard, which is a nice experience.

I converted the panda project into a VR mobile app so the kids can carry around their VR world on their phones and show their friends more easily. Just tap the panda icon and the VR world comes up.

Advantages

  • eliminates browser URL bar
  • faster loading if assets are stored locally on the phone (see the sketch after this list)
  • one-touch launching
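
A sketch of what the local-asset part can look like in the packaged app's page. The file name and path are hypothetical; the relative path resolves to a file bundled with the app, so nothing has to come over the network, and a-assets preloads it before the scene renders.

<a-scene>
  <a-assets>
    <!-- Hypothetical local file bundled with the app -->
    <img id="panda-360" src="img/panda-360.jpg">
  </a-assets>
  <a-sky src="#panda-360"></a-sky>
</a-scene>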

Now with menus as spheres.

https://menu-spheres.glitch.me/

A-Frame Embedded Test of RICOH THETA Images

A-Frame usually runs in VR headsets or as a fullscreen display on a mobile phone. If you want to have an embedded window, you need to add the embedded attribute to the <a-scene> tag. The size of the embedded window is controlled with CSS.

Example:

a-scene {
    height: 300px;
    width: 600px;
    margin: auto;
}
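
On the markup side, it is just the embedded attribute on the scene; the CSS above then sizes the window. A minimal sketch, assuming the equirectangular THETA image is available at the hypothetical path img/theta.jpg:

<a-scene embedded>
  <a-sky src="img/theta.jpg"></a-sky>
</a-scene>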

Rotation of the preview sphere is handled with a-animation.

Example:

<a-animation attribute="rotation"
    dur="9000"
    from="0 0 0"
    to="0 360 0"
    repeat="5">
</a-animation>
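
The a-animation element is placed inside the entity it rotates. A sketch of that nesting, assuming the preview is an <a-sky> showing the hypothetical img/theta.jpg (in the actual project it may be a different entity):

<a-sky src="img/theta.jpg">
  <a-animation attribute="rotation"
      dur="9000"
      from="0 0 0"
      to="0 360 0"
      repeat="5">
  </a-animation>
</a-sky>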

If you run this locally, you’ll need to use something like httpster or fenix web server to get the images to display. Any web server will work. The image won’t be displayed if you open index.html as a file in your browser, because most browsers block texture loads from file:// URLs.

GitHub

So you’ve made the embedded spherical image the thumbnail for the image that you link to?

The embedded attribute on <a-scene> allows you to run A-Frame in a browser window that is not fullscreen. The default A-Frame view is fullscreen.

If you want to use spheres as thumbnails, this is an example:

https://menu-spheres.glitch.me/
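
For reference, the basic pattern behind sphere thumbnails is mapping each 360 image onto a small sphere. A rough sketch under that assumption (the positions, radii, and image paths are hypothetical, and the demo's click handling is not shown):

<a-scene>
  <a-assets>
    <img id="thumb1" src="img/theta1.jpg">
    <img id="thumb2" src="img/theta2.jpg">
  </a-assets>
  <!-- Each sphere acts as a thumbnail of one 360 image -->
  <a-sphere src="#thumb1" radius="0.5" position="-1 1.5 -3"></a-sphere>
  <a-sphere src="#thumb2" radius="0.5" position="1 1.5 -3"></a-sphere>
</a-scene>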

I wanted to embed the window so I can put the camera control buttons on the same screen like this:

The Electron example is built with GoogleVR. I wanted to see if I could use A-Frame in a Cordova app. It looks possible.

Used the A-Frame embedded scene to create a viewport and menu on the same web page.

In the example above, the menu is a mockup. The main purpose of the example is to show that two different scenes can be independently navigated.

Code example