Photorealism in space mission design
Space Travel Blog / UT Tartu Observatory (Slavinskis et al.)
20 September 2021
The original technological ideas used by SIA Nanocraft were developed within the scope of the Space Imaging Simulator for Proximity Operations (SISPO) by Mihkel Pajusalu et al. The Space Travel Blog post published by our co-founder Andris Slavinskis introduces the capabilities of SISPO in terms of photorealistic scene modelling.
While SISPO itself is now outdated, the blog post and our science article explain what space photorealism requires and how it is achieved in Blender, an approach later reused in FlyByGen and the Asteroid Image Simulator.
Seeing is believing
Planetary missions advance the science of planets, asteroids, comets and other Small Solar System Bodies (SSSBs). Each scientific spacecraft is designed to explore unseen worlds, while its instruments are required to perform well in the new environment. Every space instrument developer bridges the gap between the known performance of a device in a lab and its projected performance in simulations. While the goal of scientific missions is to go beyond our current knowledge and to collect completely new data about mission targets, prior data, models and assumptions are still crucial for designing the mission and for optimizing the instruments. For example, ground-based remote observations of asteroids tell us the average brightness of an object, giving camera designers a general estimate of the sensitivity a camera would need. Later on, when the spacecraft reaches the asteroid, the camera can map it in detail and assess how the brightness varies locally with surface material, roughness and terrain, among other parameters.
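As a rough illustration of how such ground-based figures feed a first sensitivity estimate, the sketch below evaluates the standard H-G phase model for the apparent magnitude of an asteroid. The input values are hypothetical and the snippet is not part of SISPO; it only shows the kind of back-of-the-envelope calculation a camera designer might start from.

```python
import math

def asteroid_apparent_magnitude(H, G, r_au, delta_au, phase_deg):
    """Apparent V magnitude from the H-G phase model (Bowell et al.).

    H         -- absolute magnitude from ground-based observations
    G         -- slope parameter (0.15 is a common default)
    r_au      -- heliocentric distance in AU
    delta_au  -- asteroid-to-observer distance in AU
    phase_deg -- Sun-asteroid-observer phase angle in degrees
    """
    a = math.radians(phase_deg)
    phi1 = math.exp(-3.33 * math.tan(a / 2.0) ** 0.63)
    phi2 = math.exp(-1.87 * math.tan(a / 2.0) ** 1.22)
    return (H + 5.0 * math.log10(r_au * delta_au)
            - 2.5 * math.log10((1.0 - G) * phi1 + G * phi2))

# Hypothetical small near-Earth asteroid seen during a close approach
print(asteroid_apparent_magnitude(H=20.5, G=0.15,
                                  r_au=1.1, delta_au=0.15, phase_deg=30.0))
```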
Knowing average figures allows an instrument designer to imagine what processes might be ongoing on the target object. For example, in the case of an asteroid, we might assume that it is covered with regolith (dust and other material found on terrestrial planets and SSSBs) and, likely, with boulders and craters. While the detailed properties of the target remain unknown, we can explore a range of options by making a model of the object and tuning its properties within a computer simulation. For instance, we can choose the shape and size of the target object and the number and sizes of its rocks, as well as create cliffs and slopes. This article is about the tools and methods used to turn such an ‘imagined’ scene into rendered images.
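As a minimal sketch of what such procedural tuning can look like in Blender (not the actual SISPO or FlyByGen setup), the snippet below roughens a sphere into an asteroid-like body with two noise-driven displacement layers; all object names and parameter values are illustrative.

```python
import bpy

# Start from a subdivided ico sphere as the overall shape of the body
bpy.ops.mesh.primitive_ico_sphere_add(subdivisions=5, radius=1.0)
asteroid = bpy.context.active_object
asteroid.name = "Asteroid"

# Large-scale irregularity: displace the surface with a coarse noise texture
coarse = bpy.data.textures.new("CoarseNoise", type='CLOUDS')
coarse.noise_scale = 1.5
large = asteroid.modifiers.new("LargeScale", type='DISPLACE')
large.texture = coarse
large.strength = 0.4

# Small-scale roughness standing in for regolith and boulders
fine = bpy.data.textures.new("FineNoise", type='VORONOI')
fine.noise_scale = 0.15
small = asteroid.modifiers.new("SmallScale", type='DISPLACE')
small.texture = fine
small.strength = 0.05
```

Changing the noise scales and displacement strengths is one simple way to sweep through plausible terrains, from smooth, dust-covered surfaces to heavily cratered and bouldered ones.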
Our goal is to create virtual scenes that are visually indistinguishable from the real world. In doing so, we can simulate missions and their operations, characterize an instrument’s performance, develop and test on-board algorithms and, in essence, see the imagined mission ourselves.
The header image shows different shading techniques: image textures on Jupiter and its moon Io, light emission in the fireball and reflections on the spacecraft, as well as procedural bumps on the surface of Io. 3D graphics by our team member Mario F. Palos (full video on YouTube: Intergalactic trip to the Solar System).
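For readers curious how such shading is wired up in Blender, here is a hedged Blender Python sketch of an image-textured material with a procedural bump layer. The node names follow Blender's standard shader API, while the material name, texture path and values are placeholders, not the setup used for the header image.

```python
import bpy

mat = bpy.data.materials.new("MoonSurface")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]

# Image texture drives the base colour (e.g. an albedo map of the surface)
tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("//textures/io_albedo.png")  # placeholder path
links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])

# Procedural noise feeds a bump node to add small-scale surface relief
noise = nodes.new("ShaderNodeTexNoise")
noise.inputs["Scale"].default_value = 40.0
bump = nodes.new("ShaderNodeBump")
bump.inputs["Strength"].default_value = 0.3
links.new(noise.outputs["Fac"], bump.inputs["Height"])
links.new(bump.outputs["Normal"], bsdf.inputs["Normal"])
```

Emissive elements (such as a fireball) and reflective spacecraft surfaces are built in the same node-based way, by adjusting emission and roughness/metallic inputs of the shader.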
Read the full article written by our co-founder Andris Slavinskis on Space Travel Blog.