Real-time quality exploration of all-scales, all-wavelengths galaxies
Supervisor:
Fabrice.Neyret@imag.fr
Team:
Maverick / INRIA-LJK
One of the Grails of Computer Graphics is the realistic rendering of ultra-large and detailed scenes, ideally with real-time walk-through. The virtual exploration of galaxies is a challenging example: ultra-complex, large and detailed, with features at all scales, yet with incomplete 3D data and physical knowledge, while the general public is already accustomed to detailed Hubble photos at various scales, from whole galaxies to nebulae. The purpose of our long-term project is to achieve this quality, anywhere, in real-time walk-through.
Of course we cannot store the whole galaxy explicitly as an ultra-high-resolution voxel volume, nor can we simulate the all-frequencies whole-scene light transport in real time. Worse: we often only have partial information, at varying resolution or purely as statistical knowledge. Conversely, our target is far more specific than general CG ( large fields of stars, dark dust, illuminated dust, in massive voids ), thus allowing strong hypotheses and a lot of a-priori knowledge.
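To give an idea of why explicit storage is out of reach, here is a back-of-envelope sketch. The numbers (disk size, thickness, voxel resolution) are our own illustrative assumptions, not figures from the project:

```cpp
// Hypothetical back-of-envelope estimate: an explicit voxel grid over
// a ~30 kpc galactic disk, ~1 kpc thick, sampled at 0.01 pc resolution.
// All figures are illustrative assumptions, not project data.
double explicitVoxelCount(double diameter_pc, double height_pc, double voxel_pc) {
    double nXY = diameter_pc / voxel_pc;  // voxels across the diameter
    double nZ  = height_pc / voxel_pc;    // voxels across the thickness
    return nXY * nXY * nZ;                // bounding-box voxel count
}
```

With these assumptions, explicitVoxelCount(30e3, 1e3, 0.01) gives about 9e17 voxels, i.e. roughly 900 petabytes even at a single byte per voxel — hence the need for sparse, procedural, on-the-fly representations.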
Procedural noise such as Perlin noise is a classical CG tool for cheaply producing very large, detailed, natural-looking stochastic patterns. But many real-world patterns do not directly fit this look, in particular the dark fractal dust clouds showing in galactic spirals. Worse: nebulae burst holes in them, deforming the fabric and adding their own local dynamics, resulting in the typical gorgeous nebula images. Conversely, the cloudy areas, as well as the orientation of their features, are organized along macroscopic shapes and rules at several scales ( which the recently released Gaia catalogs allow us to better understand ). So there is some order in this chaos, which can be used to organize the on-the-fly data generation, to guide efficient rendering toward populated areas while skipping voids, and to develop new stochastic pattern methods ( which may find use in wider CG applications ).
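As a minimal sketch of the starting point (classic fractal noise, not the project's actual method), the cloud-like patterns mentioned above are typically obtained by summing a lattice noise over octaves of doubling frequency and halving amplitude. Here a simple hash-based value noise stands in for Perlin's gradient noise:

```cpp
#include <cmath>
#include <cstdint>

// Hash-based lattice noise: a cheap deterministic pseudo-random value
// in [0,1) per integer lattice point. Constants are arbitrary primes.
static double hash2(int x, int y) {
    uint32_t h = uint32_t(x) * 374761393u + uint32_t(y) * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return double(h & 0xffffff) / double(0x1000000);
}

static double smooth(double t) { return t * t * (3.0 - 2.0 * t); }  // smoothstep

// Bilinearly interpolated value noise, one octave, output in [0,1).
double valueNoise(double x, double y) {
    int xi = int(std::floor(x)), yi = int(std::floor(y));
    double fx = smooth(x - std::floor(x)), fy = smooth(y - std::floor(y));
    double a = hash2(xi, yi),     b = hash2(xi + 1, yi);
    double c = hash2(xi, yi + 1), d = hash2(xi + 1, yi + 1);
    return (a * (1 - fx) + b * fx) * (1 - fy)
         + (c * (1 - fx) + d * fx) * fy;
}

// Fractal sum ("fBm"): each octave doubles frequency and halves
// amplitude, giving the classic cloudy, self-similar look.
double fbm(double x, double y, int octaves) {
    double sum = 0.0, amp = 0.5, freq = 1.0;
    for (int o = 0; o < octaves; ++o) {
        sum += amp * valueNoise(x * freq, y * freq);
        freq *= 2.0;
        amp *= 0.5;
    }
    return sum;  // roughly in [0,1)
}
```

Real galactic dust would additionally require the distortions, holes around nebulae, and macroscopic organization discussed above — which is precisely where new stochastic pattern methods are needed.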
Light-wise, the playground covers the full spectrum from IR to UV (and beyond). Stars illuminate dust, but ionized HII regions also self-emit at known frequency peaks, causing the beauty of the (false-color) Hubble and James Webb images of nebulae. Worse: the opacity of dust depends strongly on the wavelength we look at, and light interacts with the dust shape, since strong blue stars photo-dissociate and push the ISM matter away, which then reacts in ionized bands as well as through thermal IR. Here, the hope lies in the fact that many specific physical laws and data describe these phenomena; moreover, stars are points at known locations with limited range, so that many computations can be analytical or baked. Also, astrophysical images are composed from a small, fixed set of filters at a time, so that we can precondition the data relative to these.
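The wavelength dependence of dust opacity can be illustrated with a deliberately simplified model (our assumption, not the project's actual physics): a power-law opacity in wavelength combined with Beer-Lambert transmission, with a slope value commonly quoted for the ISM:

```cpp
#include <cmath>

// Hedged illustration, not the project's model: dust opacity rises
// steeply toward short wavelengths. We assume a power-law opacity
// kappa(lambda) ~ lambda^-beta (beta ~ 1.7, a commonly cited ISM
// slope) and Beer-Lambert attenuation along a column of dust.
double dustTransmission(double lambda_um, double columnDensity,
                        double beta = 1.7) {
    double kappa = std::pow(lambda_um, -beta);  // relative opacity
    return std::exp(-kappa * columnDensity);    // Beer-Lambert law
}
```

Under this toy model, the same dust column that is nearly opaque in the blue ( dustTransmission(0.44, 1.0) ≈ 0.02 ) is largely transparent in the near-IR ( dustTransmission(2.2, 1.0) ≈ 0.77 ) — which is why per-filter preconditioning of the data is attractive.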
We have already explored early steps of this long-term project ( including various student projects and a collaboration in astrophysics: cf https://www.inria.fr/en/vertige-virtual-tour-through-our-galaxy ). More generally, our team has developed various models for huge volumetric real-time rendering ( http://gigavoxels.inria.fr/ ), dedicated models for all-scales planet rendering ( http://proland.inria.fr/ ), procedural textures, quality minimalist rendering ( capturing small-scale appearance as emerging phenomena ), etc. More related publications at http://evasion.imag.fr/Membres/Fabrice.Neyret/publis/publisHal.fr.html .
Prerequisites
- Some experience and general culture in Computer Graphics ( realistic rendering, real-time rendering, proceduralism... ), maths, and the physics of light,
- C/C++, and the GLSL shading language or equivalent ( programming involved ).