VR180 6K (6340 x 4574 4:3 O/U) using two Panasonic GX-85s w. Meike fisheye lenses
All Things 3D

Published on Feb 1, 2022

Here are the test results of using two Panasonic GX-85s with Meike fisheye lenses to capture almost a full 180 (180 x 170 degrees) of my backyard. What makes this important is that I was able to shoot in 4K 4:3 image mode with the LUT hack, producing a much richer image than the standard 3840 x 2160 4K, along with more vertical FOV thanks to the 4:3 ratio. Plus, the original spherical resolution is 3328 x 2496 over a 220 x 170 degree FOV, letting you build an M4/3 VR180 camera system for less than $1700, even less if you can pick up the GX-85s used like I did. The Meike 3.5 mm fisheye lenses are $159.95 on Amazon and come close to, but not quite, a full spherical 220-degree horizontal and 170-degree vertical FOV, which is not bad considering the only other M4/3 fisheye that can cover it is 3.25 mm and costs $600. Even then, I am not sure it achieves complete encapsulation. Of course, at this price the lens is fully manual, with focus and aperture rings covering 1.7 m to infinity and f/2.4 to f/16 respectively. Sadly, there is also extensive chromatic aberration at the outer edge, visible in the tree branches at the edge of the video, but I should be able to remove it with a mask-based color-defringing tool.

How did I do it? First, you can't just put the cameras side by side, even though they are pocket sized, but you can invert one camera so their shortest sides sit next to each other. That creates another challenge: keeping them in place. I managed it with a 'Gearbox' universal camera cage and some extension tubes. From there I had to drill a few new holes so I could align and secure the inverted cameras through their tripod mounting holes. I also had to use rubber shims to bring the sensors parallel to each other, and double-sided tacky tape along the contacting edges to hold them in place. Even with the short sides against each other, I still had to contend with an 85 mm distance between the lens centers versus the ideal 65 mm. This will cause eye strain on objects close to the camera, but objects further away will have a more effective 3D effect. I can manipulate this in software to bring them closer together, but that has other consequences, so I am left with 75 mm apart. Knowing now that I am only 10 degrees off a full 180 vertically, I can probably just butt the camera bottoms up against each other with a stabilizing bar between them and a custom threaded bolt, with enough thread to allow for a silicone washer/bushing, plus tacky tape along the entire rail to give even more grip across the bottom of the plate. The plate will have to extend past both cameras so another T-plate can be attached for a tripod mount.
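To see why the lens spacing matters, here is a quick bit of awk arithmetic showing the angle the eyes must converge by at various subject distances, comparing the typical 65 mm human interpupillary distance against this rig's wider baseline (the formula 2*atan(b/2d) and the distances chosen are just an illustration, not from the build itself):

```shell
# Convergence angle 2*atan(baseline / (2 * distance)), in degrees,
# for a 65 mm eye spacing vs an 85 mm lens spacing.
# The gap between the two columns is what the viewer's eyes feel.
awk 'BEGIN {
  pi = atan2(0, -1)
  for (d = 0.5; d <= 8; d *= 2)
    printf "d=%.1f m   65mm: %.2f deg   85mm: %.2f deg\n", d,
      2 * atan2(0.065 / 2, d) * 180 / pi,
      2 * atan2(0.085 / 2, d) * 180 / pi
}'
```

The mismatch between the two baselines shrinks quickly with distance, which is why close objects strain the eyes while distant ones still read as comfortable 3D.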

I also use dummy-battery systems to power each camera from a large 5 VDC lithium-ion battery, and USB cables to export files, since once mounted it is a pain to get at the batteries. Also, thanks to Panasonic for the custom memory settings and the ability to trigger the shutter from a phone or the touch screen; without these features, adjusting the upside-down camera would be painful.

In any case, I have found viewing the footage in my Quest 2 before upload pleasant, and well within its 17.2 pixels per degree.
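As a quick sanity check, straight arithmetic on the resolutions quoted above shows where the detail actually comes from (and that the delivered frame oversamples the headset while the fisheye capture is the limiting factor):

```shell
# Angular resolution implied by the numbers above (plain arithmetic):
# capture: 3328 px across the 220-degree fisheye circle per camera
# delivery: 6340 px across the 180-degree equirectangular frame
awk 'BEGIN {
  printf "capture:  %.1f px/deg\n", 3328 / 220
  printf "delivery: %.1f px/deg\n", 6340 / 180
}'
```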

How did I create the VR180?

First I took the two separate video files and used FFmpeg to convert from spherical/fisheye to equirectangular based on the FOVs mentioned above. I used HEVC NVENC to render the video, with roll:180 added to the left camera's conversion to invert it vertically to match the other camera.
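A sketch of what that conversion step can look like using FFmpeg's v360 filter, with the FOV numbers from the lens test above; the file names and the exact NVENC rate-control settings here are placeholders, not the ones used for this video:

```shell
# Left (inverted) camera: fisheye -> equirectangular, rolled 180 degrees
# so it matches the right-side-up camera. ih_fov/iv_fov describe the
# lens's horizontal/vertical field of view.
ffmpeg -i left_cam.mp4 \
  -vf "v360=input=fisheye:ih_fov=220:iv_fov=170:output=equirect:roll=180" \
  -c:v hevc_nvenc -preset slow -rc vbr -cq 19 \
  -c:a copy left_equirect.mp4
```

The second camera gets the same filter without the roll option.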

I then imported these two equirectangular videos into Adobe Premiere and did a tiny bit of grading to bring the whites down and the blacks up slightly, though it really wasn't needed. Then I added the VR Projection filters, again not sure I needed them since I had already matched the cameras via the horizontal/vertical adjustments. Then I scaled the image horizontally to clip the 220-degree capture down to 180 on both sides, and rendered it out again with HEVC NVENC at a slightly smaller scale. Since Premiere adds the proper metadata tag for 180 O/U, I just tested in my Quest 2 using the Skybox VR player before uploading it here. For my next set of videos I will post a download link to the versions I created before upload, and also upload them to my Oculus creator page.

The final goal is to create an FFmpeg script to do the full encoding, skipping Premiere entirely. I have started doing this for my Unreal VR360 & VR180 videos as well to improve the workflow.
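One way such a script could be sketched: v360's hequirect output (a 180 x 180 half-equirectangular projection) folds the Premiere crop step into FFmpeg directly, vstack builds the over/under frame, and Google's spatial-media tool injects the stereo metadata that Premiere would otherwise add. All file names are placeholders, and which input gets the roll:180 depends on which camera is mounted inverted:

```shell
# Both fisheye inputs -> 180x180 half-equirect, stacked top/bottom.
# hequirect clips the 220-degree circle to 180 in one step instead of
# scaling horizontally in Premiere afterwards.
ffmpeg -i top_cam.mp4 -i bottom_cam.mp4 -filter_complex \
  "[0:v]v360=input=fisheye:ih_fov=220:iv_fov=170:output=hequirect:roll=180[t]; \
   [1:v]v360=input=fisheye:ih_fov=220:iv_fov=170:output=hequirect[b]; \
   [t][b]vstack[v]" \
  -map "[v]" -c:v hevc_nvenc -preset slow -rc vbr -cq 19 vr180_ou.mp4

# Inject the top/bottom stereo metadata, e.g. with Google's
# spatial-media tool (github.com/google/spatial-media):
python spatialmedia -i --stereo=top-bottom vr180_ou.mp4 vr180_ou_meta.mp4
```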

