Up, Up, Up Ingenuity (In "3D" & 5.1 Audio)

Premiered May 15, 2021

In this "Ingenuity in VR" teaser video we test the limits of Ingenuity's (in VR) thrust-vector algorithm by climbing straight up at 1 meter per second. No external forces (wind) are affecting it at this time, but even if there were, the "synthetic" IMU would minimize them to let Ingenuity fly smoothly. The key is synthesizing the feedback loop so it runs at 500 cycles per second like the "real" Ingenuity, which is not possible using the standard tick-based timer system in Unreal, which at most reaches 90 frames per second, so we will have to do some predictive averaging to provide a similar experience. Or... just fake it and build a function table from the data, which sadly has not been made available beyond a one-page article with some simple charts. I had hoped that by making requests to the Ingenuity team more information would become available. Sadly, I have received no response from the team's engineering leader Bob Balaram, or from NASA's public relations representatives.
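One common way to approximate a 500 Hz control loop inside a ~90 fps render tick is a fixed-timestep accumulator: each frame, run as many 2 ms control steps as fit into the elapsed frame time. This is a minimal sketch of that generic technique, not the project's actual code; the names (ControlLoop, tick, step) are my own.

```python
CONTROL_HZ = 500
CONTROL_DT = 1.0 / CONTROL_HZ  # 2 ms per control step

class ControlLoop:
    """Runs a fixed-rate control step inside a variable-rate render tick."""

    def __init__(self):
        self.accumulator = 0.0
        self.steps_run = 0

    def step(self):
        # One 2 ms iteration of a (stand-in) attitude controller.
        self.steps_run += 1

    def tick(self, frame_dt):
        # frame_dt is the variable render-frame delta (e.g. ~1/90 s in VR).
        self.accumulator += frame_dt
        while self.accumulator >= CONTROL_DT:
            self.step()
            self.accumulator -= CONTROL_DT

loop = ControlLoop()
for _ in range(90):        # simulate one second of 90 fps ticks
    loop.tick(1.0 / 90.0)
print(loop.steps_run)      # roughly 500 control steps over that second
```

The controller still only sees fresh sensor input once per frame, so this evens out the step rate rather than truly raising the feedback bandwidth, which is why predictive averaging (or a precomputed function table) would still be needed.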

Mars Tech Updates:
As mentioned above, the main goal is to emulate the flight characteristics of Ingenuity so that a VR user can fly it without worrying about pitching or rolling the helicopter so much that flight becomes unstable, while also providing external forces so the craft doesn't feel like a "point and direct" game-style vehicle. In "Ascent: Eagle Has Left the Moon," part of the "Apollo 11: 'One Small Step For...' VR Experiences" series, I created a complex flight model to emulate the forces at play in lifting off and returning to orbit around the Moon. This allowed me to synthesize the vectored thrust necessary to recreate the actual flight path Eagle took, as described by Neil Armstrong and Buzz Aldrin, as it reached orbital velocity and engine shutdown. Of course, over 50 years later there are volumes of data and books available describing the navigation computer algorithms and the exact thrust (in newtons) and fuel amounts for Eagle's ascent module and RCS thrusters, enough to create a simplified model for Unreal Engine's Blueprint visual scripting system. Sadly, the Ingenuity team has published only one paper, in 2018, and it is more of a summary and description of the engineering goals, with limited specifications; in trying to emulate Ingenuity's RTE camera, the perspective seen in this video, I found that paper's specifications to be wrong. The other article does provide some updates and a few illustrations, but the charts are not very detailed. Again, NASA, please release the data.
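To give a sense of the force balance such a flight model starts from: at hover, rotor thrust must exactly cancel weight, and any climb adds mass times vertical acceleration on top. A quick sketch using publicly quoted figures; Ingenuity's mass of roughly 1.8 kg and Mars surface gravity of roughly 3.71 m/s² are my assumptions here, not values taken from the paper discussed above.

```python
MARS_GRAVITY = 3.71      # m/s^2, approximate Mars surface gravity (assumed)
INGENUITY_MASS = 1.8     # kg, approximate mass widely quoted for Ingenuity

def hover_thrust(mass_kg, gravity):
    """Thrust in newtons needed to exactly balance weight at hover."""
    return mass_kg * gravity

def climb_thrust(mass_kg, gravity, accel):
    """Thrust needed to accelerate straight up at accel m/s^2."""
    return mass_kg * (gravity + accel)

print(round(hover_thrust(INGENUITY_MASS, MARS_GRAVITY), 2))  # 6.68 (N)
```

A steady 1 m/s vertical climb, as in this video, is zero acceleration, so once up to speed the required thrust is the same ~6.7 N as at hover; only the spin-up and braking phases demand more or less.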

Unreal Engine Tech Updates:
One problem area in "Ascent: Eagle Has Left the Moon" was smoothly transitioning from detailed terrain to actual DEM-based meshes and orthographic images at different levels of detail. The LROC camera system has provided detailed images of the entire Moon at resolutions from 25 cm to 50 m. Blending these together was a problem, and frankly one of the reasons it has not been released yet. However, taking that development experience and starting from scratch, I was able to come up with a much better system of blending UE terrain with HiRISE DEM and orthographic images so that they are seamless. (I offer anyone a free copy of "Ingenuity in VR" if they can find the seam, though it will already be free.) This method also allows me to add more detailed terrain dynamically as more information comes in from Perseverance's cameras. What is still needed before release is the recreation of a number of unique rock formations and more precise geo data. Eyeballing locations based on geographic features is not the most exact way to position the rover, the helicopter, or many of the rocks, especially if the images presented by NASA also change over time. Since "Ingenuity in VR" is supposed to be a drone simulator as well as a discovery tool, again, I wish NASA would provide more data to the public.
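A seam-free transition between two height sources can be achieved by mixing their heights with a smooth weight across a transition band, so neither source ends abruptly. This is a minimal sketch of that generic smoothstep-blend idea, not the project's actual Unreal code; all names here are hypothetical.

```python
def smoothstep(edge0, edge1, x):
    """Hermite interpolation: 0 at/below edge0, 1 at/above edge1, smooth between."""
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def blended_height(h_terrain, h_dem, dist_to_dem, band_start, band_end):
    """Blend engine-terrain height into DEM height across a transition band.

    dist_to_dem: distance from the sample point to the DEM region.
    At band_start the DEM height wins outright; at band_end the engine
    terrain wins; in between the two are mixed with no visible step.
    """
    w = smoothstep(band_start, band_end, dist_to_dem)
    return (1.0 - w) * h_dem + w * h_terrain
```

Because smoothstep has zero slope at both edges, the blended surface meets each source tangentially, which is what hides the seam even under raking light.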

YouTube Tech Update:
NASA JPL just released their own "3D" video of Ingenuity's third flight from Perseverance's Mastcam-Z camera, but it should be noted it was only 480p and a red/cyan anaglyph pair. In creating this video, I took the time to figure out how to use YouTube's "3D" feature, which has sadly been downgraded over the past two years in favor of their 180 panoramic video method. This broke most of the 3D (stereo-paired) videos on YouTube, and on top of that it is not even a "colored" anaglyph method. Even more frustrating is how you get YouTube to recognize that your video is in 3D, which is to either encode frame-packing metadata (mp4) or a stereo-mode meta tag (mkv), since they did away with the simple toggle. Luckily this FFmpeg command line statement does work: "ffmpeg -i vidfilein.mp4 -vcodec libx264 -crf 18 -x264opts frame-packing=3 vidfileout.mp4". Just replace vidfilein.mp4 with your input file and vidfileout.mp4 with your output file name. The -crf 18 (compression quality) flag is optional and a little better than the x264 default of 23.
