Mapping THETA S dual-fisheye live streaming to a spherical surface in the browser



This blog was originally written by Atsushi Izumihara in Japanese. He uses the ID AMANE.

This is a community translation into English.


When the THETA S came out, I tried using it as a USB camera and streaming it over WebRTC. However, I had trouble mapping the dual-fisheye stream onto a spherical surface.

Editor's note: for most people, this was solved with THETA UVC Blender and UVC FullHD Blender. This is the problem the author is trying to solve:

An alternate solution is here.

After a while, I came across this article. "Thank you!" I cried out.

Editor's note: an English translation of the article above is here.

This was exactly what I was looking for.

However, when I accessed the page, the three.js library was not loading correctly, some of the code had been reverted on git, and the demo did not work properly. So I decided to fix it so that it would run on its own.

Today's article is a small project standing on the shoulders of a giant. Thank you [to the original author mechamogera]!

Finished project

Here's the code on GitHub. The original was hard to manage as a Gist, so I moved things into a git project.

Recently, getUserMedia no longer works on insecure (non-https) origins; http://localhost still counts as secure. Get the code from GitHub and serve it locally with something like npm's http-server.
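For example (assuming Node.js is already installed; the port number is arbitrary):

```shell
# getUserMedia requires a secure origin; http://localhost counts as one
npm install -g http-server
http-server -p 8080
# then open http://localhost:8080 in Chrome
```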

I’ll show you what I recorded



  • three.js r73
  • Google Chrome

Texture mapping to sphere

The basics are the same as in the original article, but in the latest revision the main code in theta-view.js has been deleted. (Was it a commit mistake?) Since the code exists in a previous revision, I will use that.

Adjustment of Stitching Seam

With the original code as-is, the image was slightly cut off at the seam, so I adjusted the numbers until it looked right. Please see the code for details. Smoothing of the seam is not done yet. I'd like to do it!
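To make the mapping and the seam tuning concrete, here is a minimal sketch of the kind of UV remapping involved. The function name, the equidistant-projection assumption, the circle centers, and the 0.88 radius factor are all my own placeholders, not the exact values from the original theta-view.js; the real numbers have to be tuned per camera, which is exactly what the seam adjustment is about.

```javascript
// Map a direction on the unit sphere (phi: longitude 0..2π, theta: latitude 0..π)
// to (u, v) in a side-by-side dual-fisheye frame. Assumes an equidistant fisheye
// projection and a 2:1 frame made of two squares; radiusFactor trims the lens
// edge and is the kind of number you tune to hide the stitching seam.
function dualFisheyeUV(phi, theta, radiusFactor = 0.88) {
  const x = Math.sin(theta) * Math.cos(phi);
  const y = Math.cos(theta);
  const z = Math.sin(theta) * Math.sin(phi);
  const front = z >= 0;                        // which lens sees this direction
  const alpha = Math.acos(front ? z : -z);     // angle off the lens axis
  const r = radiusFactor * (alpha / Math.PI);  // equidistant: radius grows linearly
  const ang = Math.atan2(y, x);                // rotation around the lens axis
  const cx = front ? 0.25 : 0.75;              // circle centers in the 2:1 frame
  const mirror = front ? 1 : -1;               // the back image is mirrored
  return {
    u: cx + mirror * r * Math.cos(ang) * 0.5,  // u spans two squares, so halve r
    v: 0.5 + r * Math.sin(ang),
  };
}
```

In the real code, something like this is applied per-vertex to the sphere geometry's UVs before handing it to Three.js.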

Get THETA S’s USB live streaming

For this test, I display video from a THETA S connected locally over USB. The stream is acquired with WebRTC's getUserMedia and attached to a <video> tag. The <video> tag itself is not displayed directly; instead, the image is mapped with Three.js and displayed in a Canvas created in JavaScript.
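As a sketch, picking the camera and wiring it into the <video> tag might look like the following. The function names and the label match are my own assumptions; note that Chrome only exposes device labels after camera permission has been granted at least once.

```javascript
// Pick the THETA from the list returned by enumerateDevices()
// (pure function, so it is easy to test without a browser).
function pickThetaDevice(devices) {
  return devices.find(
    (d) => d.kind === 'videoinput' && /THETA/i.test(d.label || '')
  ) || null;
}

// Browser-only wiring: stream the chosen camera into a <video> element,
// which Three.js can then sample as a texture each frame.
async function startThetaVideo(videoEl) {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const theta = pickThetaDevice(devices);
  videoEl.srcObject = await navigator.mediaDevices.getUserMedia({
    video: theta ? { deviceId: { exact: theta.deviceId } } : true,
  });
  await videoEl.play();
}
```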

Bonus: FOV adjustment

Make it possible to widen or narrow the viewing angle. Since the mouse wheel in OrbitControls.js is devoted to another event, the FOV is controlled with the up and down arrow keys instead:

  function keydown(e) {
    switch (e.keyCode) {
      case 40: // down arrow: zoom out
        camera.fov += 1;
        if (camera.fov >= 130) camera.fov = 130;
        break;
      case 38: // up arrow: zoom in
        camera.fov -= 1;
        if (camera.fov <= 30) camera.fov = 30;
        break;
    }
    camera.updateProjectionMatrix(); // required for the new fov to take effect
  }
  document.addEventListener("keydown", keydown);



This time it was awkward to set up an https server, so I stopped at displaying the local camera; but just by using video sent remotely over WebRTC as the texture, you could build an interactive 360° web chat.

Continuous operation and usage

I have tested the THETA S in USB live streaming mode continuously for more than 8 hours. It seems to be OK so far, but there were a lot of twists and turns in my earlier tests. I will try running it continuously a bit longer.

I thought it could be used as an always-on surveillance camera, but it seems a little inconvenient, because you have to press a hardware button in a particular way to put the camera into live streaming mode.


When mapping to the spherical surface, I feel that the USB resolution of 1280x720 is not nearly enough. (Editor's note: the current resolution is now 1920x1080, and 4K was recently announced for a future THETA model.) Even FullHD over HDMI live streaming with an HDMI capture device probably will not be enough. I think there is still room for further development in this area. I am looking forward to the future! (Editor's note: see the announcement of 4K live streaming.)

Displaying THETA's Dual fisheye video with Three.js

Hi, I have an issue with resolution when playing video via WebRTC.


I don't think anyone has tested WebRTC with 4K yet. Just so you know, there's an existing solution for the THETA S with OpenTok. It's possible that it will simply work with the THETA V.

Do you have a THETA S or a THETA V?


Many Thanks for info :slight_smile:


Hi, I tried to connect my THETA S to a Pi 3 B in live streaming mode. Then I followed the Ricoh Two-way Video Streaming Sample step by step, but I cannot log in in the web browser (using Chromium on the Pi). The procedure is as follows:

  1. npm start

  2. the web browser pops up

    I don't know what the problem is. Can you give me some advice?


The Ricoh Cloud API server is now down. You can give OpenTok a go.

For the WebRTC demo you are using, you would need to set up a separate Auth/BOSH server that is not the Ricoh Cloud API server. You'd need to modify the code. I have not actually tried it with a separate Auth/BOSH server. The Ricoh demo was temporarily set up for a contest a year ago.

Note, I’m assuming that you were originally trying to use this code.


Thank you for your quick reply. My application connects the THETA S to a Pi in live streaming mode and views it on another PC (Windows 7). I set up a motion streaming server on the Pi and got the dual-fisheye image on the other PC. I used OpenCV to convert the dual-fisheye image to an equirectangular view, and it looks good. But I want to turn this dual-fisheye image into a navigable 360-degree view. Do you know of any solution other than Three.js?


Oh nice. You have a cool application. I've only seen solutions with three.js, I think primarily because it's the most common. I assume there are other solutions out there.

If you already have the dual-fisheye to equirectangular working with opencv, you should be able to get the 360 navigation working with other solutions. Unfortunately, I don’t have a specific recommendation for you.

Would love to see your application when you get it running. :slight_smile: :theta:


Hi Codetricity,
I tried use npm http-server with example work well in localhost. I can not watch image(win 7 chrome and firefox) when run npm http-server in raspberry pi. I had google search it is because the webrtc getUserMedia no longer works on insecure origins. After that i tried to change node.js http to https but still not success. Could you teach me how to run this code in raspberry and watch in the other PC? I am newer JS or web application development.

PC chrome : 63
raspberry pi 3B chromium : 60
Best Regards,


Great to see that you’re making so much progress. Unfortunately, I’m not an expert in this myself. I’m primarily using the code from other people.

There are some examples of WebRTC here:

Though, the examples appear to use RTCPeerConnection rather than getUserMedia.
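For the secure-origin problem specifically, one thing that might work (untested here; assumes openssl is available on the Pi and uses http-server's -S/-C/-K flags for TLS):

```shell
# Generate a self-signed certificate; the browser will warn, accept it manually
openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out cert.pem \
  -days 365 -subj "/CN=raspberrypi.local"
# Serve the demo over https so getUserMedia works from another PC
http-server -S -C cert.pem -K key.pem -p 8080
```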

You have an interesting project.

Another strategy could be to use the Ricoh example and attempt to set up your own BOSH server. The Ricoh Cloud API was only used for BOSH authentication.

You could also try to get something working with OpenTok first, and then, after you have a demo working, move to rolling your own platform.

Also, the Ricoh Cloud API seems to be back online, offering free trials, though I have not tried it recently.

It might be useful to go through the example here: