External Intervalometer

I’m wondering, if I create a remote trigger for the Theta, could I decrease the time between shots from 8 seconds to, say, 4 seconds?

I have seen a bunch of Rasp Pi controllers but would love to know how well they work for timelapse.

Either USB or wifi would work for my application.

Cheers

Phil

I initially wrote this as an attempt to get my Theta S camera to shoot stills faster than using the built-in intervalometer. In testing I was able to shave about 2 seconds off the shooting time; however, this only seems to work over WiFi and not USB.

I think Koen Hufkens has had it taking intervals for a month non-stop.

Both of the examples use Raspberry Pi.

It seems to be faster with WiFi, but the RPi-to-camera connection is less stable.

Thanks, looks like he may have got it down from 8 to 6 seconds…

It also looks like there is some Python code I can try on my laptop to see.

Thanks

Phil

I just did a test using the standard Theta S app and was able to shoot with something like a 4-second interval (by simply tapping the shoot button). So it seems like I should be able to create an intervalometer that would do the same.

I have seen code for the Raspberry Pi but would rather just use my OSX laptop to control the camera as I am using it in a studio environment.

I am not a coder but have done some Python, JavaScript and Node projects in the past.

What do you guys think would be the easiest way to build something simple to shoot (but not transfer) images?

I was thinking maybe I could use the example Python script but wasn’t sure if that was meant only for the Pi and if I would need external libraries etc.
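If the Python script turns out to be Pi-only, my fallback idea is to just loop curl calls from the Terminal. Something like this is what I have in mind, though it’s completely untested; I’m assuming the camera answers at 192.168.1.1 once my laptop joins its WiFi and that takePicture needs a sessionId:

# Open a session, then fire the shutter every 4 seconds (example value).
SID=$(curl -s -X POST http://192.168.1.1/osc/commands/execute \
  -H 'Content-Type: application/json' \
  -d '{"name":"camera.startSession","parameters":{}}' \
  | sed 's/.*"sessionId":"\([^"]*\)".*/\1/')

while true; do
  curl -s -X POST http://192.168.1.1/osc/commands/execute \
    -H 'Content-Type: application/json' \
    -d "{\"name\":\"camera.takePicture\",\"parameters\":{\"sessionId\":\"$SID\"}}"
  sleep 4
done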

Any advice would be great.

As a side note… before the Theta supported interval shooting, I built a robot that would tap the screen of the iPhone every 10 seconds… this was used by Ricoh when we did the original launch film for the Theta…

Thanks

Phil

The easiest thing will be Python or the Bash shell.

I think you can just use Jason’s script with no modification if you use the WiFi method of connection:

I think it will run on OSX without any changes, but I have not tried it. I suspect that you’ll need to modify things in order to get it to work with the USB cable.

The WiFi method is the most generic and easiest way to connect a computer to the camera. It’s the most tested. It’s also possible that Jason may even be able to give you advice if you run into a problem with the script on Mac OSX. He’s active in the forum.

Make sure you connect your Mac laptop to the Theta WiFi first; the Theta is the hotspot. If this is new to you, look at this doc:

http://codetricity.github.io/theta-s/index.html

You can test the connection and take a picture with Chrome and something like DHC (a free tool).
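If you’d rather stay in the terminal than install DHC, a quick check like this should show whether the connection is up. It assumes the camera’s default address of 192.168.1.1; I have not run it on a Mac myself:

# Ask the camera for its info block and current state.
curl http://192.168.1.1/osc/info
curl -X POST http://192.168.1.1/osc/state

If both calls come back with JSON, the WiFi side is working and the script should be able to talk to the camera.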

Do you have a video of the robot in motion? I’d love to see it.

Oh, I just remembered that I cornered my son one weekend, about 4 months ago, to help me with a timelapse test using USB connected to a desktop computer running Linux. He wrote this script for a single test. It’s not clear if anyone has used ptpcam with a Mac, so it may not work.
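The basic idea behind his USB approach is just a shell loop around ptpcam captures. Here is a rough sketch of that pattern, not his exact script, and untested on a Mac (it assumes ptpcam from libptp2 is installed and the camera is the only PTP device attached):

# Trigger one still over USB/PTP every 8 seconds, 100 shots total (example values).
for i in $(seq 1 100); do
  ptpcam --capture
  sleep 8
done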

There’s a test clip of the images as well as a test edit with Premiere Pro CC VR editing with Mettle SkyBox. The Premiere feature was new at the time he did the test. As he’s in high school and he wrote the script for one test only, I suggest you try Jason’s script or Koen’s script on your laptop first.

https://www.youtube.com/watch?v=hVGMjBnbyWI

My recommendation is to use Jason’s script with WiFi connection.

The script from Jason looks like it needs something called ptpcam.

When I try and run it I get errors.

./tlapser360.sh: line 243: ptpcam: command not found

I’ll post some video of the robot; I called it the RoboTap 300 🙂

I’ll try the script your son made and see if that has any clues.

Cheers.

Phil

Are you running it with -W?

That’s a capital W.

See this:

-W (y/n) Default y. WiFi mode unless USB mode is enabled. Control the camera using the direct WiFi connection and the Open Spherical Camera API.

ptpcam is only needed to control the camera over a USB cable. If you’re getting this error with WiFi, then something is not working as expected and might be easy to fix. Can you try the -W option and let us know if you still have the problem? If you do, it’s possible that @squizard360 (Jason) might want to take a look at it, or someone else might. He just wrote the program in the last few weeks, so it’s a good opportunity to debug it on different platforms.

His example command in the README is for a USB connection.

Using USB, take a large-resolution photo every 5 seconds for a total of 5 images with the camera set to manual mode, ISO 100, white balance auto, and a shutter speed of 0.5 seconds, while writing the images to a directory and leaving them on the camera…

./tlapser360.sh -U y -I 5 -C 5 -m 1 -r h -i 100 -s 0.5 -w 2 -O /mnt/tmp/tlapser_test/

To try WiFi, try this (NOTE: the capital U is replaced with capital W):

./tlapser360.sh -W y -I 5 -C 5 -m 1 -r h -i 100 -s 0.5 -w 2 -O /mnt/tmp/tlapser_test/

-W worked great…

I did receive some errors which I pasted below.

I think this should be perfect for what I need.

I noticed the shutter sound is totally different when using this script. Are there options to change it?

Thanks for your help so far.

Here is a fun behind-the-scenes film that shows my “robot intervalometer”…

Enjoy.

Phil

Phils-MacBook-Pro-2:tlapser360-master philspitler$ ./tlapser360.sh -W y -I 5 -C 5 -m 1 -r h -i 100 -s 0.5 -w 2 
Connecting to camera with WIFI
0001
Setting mode via WIFI
{"name":"camera.setOptions","state":"error","error":{"code":"invalidParameterValue","message":"Any input parameter or option name is recognized, but its value is invalid."}}{
  "name": "camera.setOptions",
  "parameters": {
    "sessionId": "SID_0001",
    "options": {
	  "fileFormat": {
          "type": "jpeg",
		  "width": 5376,
		  "height": 2688
	  },  
	"exposureProgram": 1,
	"iso": 100,
	"whiteBalance":"2",
	"shutterSpeed": 0.5
	}
  }
}

Wow, the video is awesome!

There are options to change the shutter sound volume. You can turn it completely off or set it to a value between 0 and 100.

I have not looked at the code to see if it changes it.
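If the script doesn’t expose it, you could set the volume yourself with a camera.setOptions call. Something like this should do it, assuming a session is already open as SID_0001 and that _shutterVolume is the option name on your firmware (0 is silent, 100 is loudest):

curl -X POST http://192.168.1.1/osc/commands/execute \
  -H 'Content-Type: application/json' \
  -d '{"name":"camera.setOptions","parameters":{"sessionId":"SID_0001","options":{"_shutterVolume":0}}}'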

From the error message, I think that maybe the whiteBalance option needs to be adjusted.

It seems like the program is passing a value of “2”, but the documentation indicates different parameter values for the WiFi API.

The parameters for USB and WiFi are different, so you’ll need to do something like this:

-w auto

That’s a lowercase w.

Assuming no other differences between the WiFi and USB parameters, it would be this. I have not tested it. If you have another error message, please post it. I suspect it may be a simple fix of changing the parameter values. You’re close.

./tlapser360.sh -W y -I 5 -C 5 -m 1 -r h -i 100 -s 0.5 -w auto -O /mnt/tmp/tlapser_test/

Yup, you nailed it thanks.

I think I have everything I need to get going now.

Glad you liked the video. It’s pretty dated now, as that was for the first Theta launch…

Here’s the actual video we did for them.

http://www.bonfirelabs.com/work/theta-360-camera-launch

Thanks for all your help, this is great. Now that I can see the syntax for sending commands, I should be able to get exactly what I need.

Cheers.

Phil

Were you able to get the script working?

One thing to watch is that the camera has a small buffer, so you can shoot photos fairly quickly, but once the buffer is full you will have to wait until it’s done processing before it allows you to shoot again. While it’s processing, the camera will just ignore my script; however, if you’re downloading the images it could get funky, since I don’t check for that. I suggest you do some testing to find the shortest interval that works for you. You’ll know the buffer is full because the camera icon on the camera will blink.
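If you do push the interval down, one way to guard against the buffer (not something my script does today) would be to poll the camera state before each shot and wait while it’s busy. A rough sketch, assuming the _captureStatus field in /osc/state behaves the way I think it does:

# Block until the camera reports it is idle before triggering the next shot.
until curl -s -X POST http://192.168.1.1/osc/state | grep -q '"_captureStatus": *"idle"'; do
  sleep 1
done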

Yes, the script works great but I don’t think I captured more than 5 images so I never hit the buffer issue.

I’ll try tomorrow and see what speed I can get. I’m not transferring the images, just keeping them on the Theta.

Thanks.

Phil

I updated the README with some of this great info.

The image count is -C (capital C), so -C 1000 will get you 1,000 images.

./tlapser360.sh -W y -I 5 -C 1000 -m 1 -r h -i 100 -s 0.5 -w auto -O /mnt/tmp/tlapser_test/

Here are some tips:

  • WiFi API access will shut off if you plug the camera into the USB port of your computer. To solve this, plug the camera into a wall socket or a battery (not your computer).
  • You may need to disable sleepDelay by setting the value to 65535 (see the sketch after this list).
  • You may need to disable offDelay by setting the value to 65535.
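For the last two, a single camera.setOptions call should cover both. This is untested on my side and assumes a session is already open as SID_0001:

curl -X POST http://192.168.1.1/osc/commands/execute \
  -H 'Content-Type: application/json' \
  -d '{"name":"camera.setOptions","parameters":{"sessionId":"SID_0001","options":{"sleepDelay":65535,"offDelay":65535}}}'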

I have heard of someone running it for a month over the WiFi API. I don’t have the details on that project, so it’s not 100% confirmed, but it seems likely the WiFi API will work for at least a week.

I’m curious to learn more about your experience, as I think the script could be used as the foundation for other projects that need this kind of capture loop.

The project also takes input from sensors such as GPS and lux, so that may be of interest for either triggers or additional data.

For example, I’m involved with a community learning about IoT, and I have this sensor on my desk.

The TMD3782 provides the lux and color measurement that I think Jason’s sensor uses.

I also have this vibration and motion sensor on my desk.

I’m thinking of triggering images based on motion or vibration detected by an external sensor.

Even if you only use the timelapse feature of the script, it is still very valuable to get your usage example, as you’re the only one on this discussion thread who is a professional artist.

These are all great notes, I’ll be working on my setup today.

I won’t be using external sensors to trigger, as I’m using OSX and not a Pi, but I think once this project is over I’ll be using the Theta on my Pi for something.

Cheers.

Phil

I would like to write a follow-up story to this leaf story:

Are you okay if I use a few screenshots in the story from your videos? I can insert whatever credit and copyright notice you would like.

Here are the two that I am thinking of:

I am thinking of this progression for the story:

Leaf → Lake → Robot → Artist.

These are the lake pictures I’m thinking of using.

Looking upward at the sky in the 360 video.

Of course… I would appreciate it if you run the copy past me before it gets sent out.

PM me and I’ll send you my email address.

Cheers.

Phil

Hi, I am setting the exposure and shutter speed manually but it seems like it might be in shutter priority mode.

The switches I am using are:

-W y -I 4 -C 250 -m 1 -r h -i 200 -s 0.03 -w auto

I would expect the ISO to be 200 and the shutter speed to be 1/30, but when I look at the data on the file I am getting an ISO of 320 (the shutter speed is 1/30 as expected).

Any ideas?
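In case it helps with the debugging, I was planning to read the options back from the camera to see what it thinks it is set to. This is what I’d try; I’m guessing at the exact call, and a session has to be open first:

SID=$(curl -s -X POST http://192.168.1.1/osc/commands/execute \
  -H 'Content-Type: application/json' \
  -d '{"name":"camera.startSession","parameters":{}}' \
  | sed 's/.*"sessionId":"\([^"]*\)".*/\1/')

curl -X POST http://192.168.1.1/osc/commands/execute \
  -H 'Content-Type: application/json' \
  -d "{\"name\":\"camera.getOptions\",\"parameters\":{\"sessionId\":\"$SID\",\"optionNames\":[\"exposureProgram\",\"iso\",\"shutterSpeed\"]}}"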

Thanks

Also, I tried going faster than 1/30 by using

-W y -I 4 -C 250 -m 1 -r h -i 200 -s 0.015625 -w auto -v 50

and the shutter speed is still showing as 1/30

Phil