What is the workflow for super high-res scans?

Comments

6 comments

  • ssh4
    Hello!

    It's hard to tell without seeing what kind of object you have.

    But in most situations a 200-500 million poly mesh from a High reconstruction can be decimated (simplified) 5-20 times down to something more usable (10-50 million polys) without losing detail.
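    As a rough illustration of that decimate-first step (not RC's own tooling), here is a minimal sketch using Open3D's quadric decimation; the file names and the 10x target are placeholder assumptions:

    ```python
    # Sketch: simplify a very dense photogrammetry mesh to a workable poly count.
    # Open3D's quadric decimation stands in for whatever tool you prefer
    # (RC's Simplify tool, ZBrush, MeshLab, ...). Paths and targets are examples.
    import open3d as o3d

    mesh = o3d.io.read_triangle_mesh("scan_high_detail.ply")
    print(f"input: {len(mesh.triangles):,} triangles")

    # Decimate roughly 10x, e.g. ~300M -> ~30M triangles.
    target = len(mesh.triangles) // 10
    simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target)

    print(f"output: {len(simplified.triangles):,} triangles")
    o3d.io.write_triangle_mesh("scan_decimated.ply", simplified)
    ```

    Something like this would run before unwrapping and texturing the decimated mesh.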
  • kir238
    Are you sure? I actually think the crispness of small surface details (pores, cracks, paint flakes, etc.) gets degraded after simplification by more than 20%, not times. They rather dissolve into random surface noise. Am I wrong?

    Much the same as in ZBrush, where decimation works well only on hard-surface, man-made objects with a lot of flat, simple surfaces.
  • herveMaxwell
    We are supposed to get displacement baking directly in RC... 8-)

    At least I recall it was a plan... no idea if it still is, nor when to expect it.
  • kir238
    That would be a revolutionary thing, the greatest ever :) I spend more time baking textures than on the photogrammetry itself.
  • ssh4
    kir238 wrote:
    Are you sure? I actually think the crispness of small surface details (pores, cracks, paint flakes, etc.) gets degraded after simplification by more than 20%, not times. They rather dissolve into random surface noise. Am I wrong?

    Much the same as in ZBrush, where decimation works well only on hard-surface, man-made objects with a lot of flat, simple surfaces.


    A 16K texture has about 268 million pixels (16384 × 16384). In principle that means a 16K map could store tangent-space normals for roughly 268 million polys. But it would just look like noise if every pixel held a different value, so in reality a good normal map has to carry data from fewer polys than it has pixels to look nice in 3D renders, probably 3-9 times fewer. So a 16K normal map can really only store information for about 30 million polys.

    Also... a raw multi-billion-poly mesh has too much noise; that gets reduced first, and the real details remain after decimation.
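    For reference, a quick sketch of that texel-budget arithmetic (the 3-9 pixels-per-poly ratio is ssh4's estimate, not an exact rule):

    ```python
    # Rough texel-budget estimate for a normal map, following the reasoning above.
    # The pixels-per-polygon ratio is a heuristic, not an exact rule.

    TEXTURE_SIZE = 16384                      # 16K map, pixels per side
    total_pixels = TEXTURE_SIZE ** 2          # 268,435,456 pixels

    for pixels_per_poly in (3, 9):
        usable_polys = total_pixels // pixels_per_poly
        print(f"{pixels_per_poly} px/poly -> ~{usable_polys / 1e6:.0f} million polys")

    # 3 px/poly -> ~89 million polys
    # 9 px/poly -> ~30 million polys
    ```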
  • kir238
    Did you do comparative tests with RC? I did some years ago with ZBrush decimation and had the impression that decimation tends to mess up those tiny details unless you find a very precise limit.

    I bet it all depends on the exact settings and on how decimation filters those small imperfections, but I'm not sure I understand your pixel-math logic. IMO, for a nice-looking normal map each pixel's normal should rather be interpolated from about 4 polygons per pixel.
    At least in my experience, more mesh resolution lets me distinguish depth differences as subtle as paint flakes on wooden boards. Otherwise I just see random noise. And that noise is much easier to deal with in Photoshop on a displacement texture, since you can adjust it interactively.

    Maybe I'm just deceiving myself, I'm not 100% sure.
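    A minimal sketch of the kind of displacement-map noise cleanup described above, done non-interactively with NumPy/SciPy instead of Photoshop; the file names, sigma, and blend factor are illustrative assumptions:

    ```python
    # Sketch: suppress high-frequency noise in a baked displacement map,
    # similar to what one might do interactively in Photoshop.
    # Assumes a single-channel (grayscale) displacement image.
    import numpy as np
    from scipy.ndimage import gaussian_filter
    import imageio.v3 as iio

    disp = iio.imread("displacement_16k.tif").astype(np.float32)

    # Split into low-frequency shape and high-frequency detail/noise,
    # then dial the high frequencies back instead of removing them.
    low = gaussian_filter(disp, sigma=2.0)
    high = disp - low
    cleaned = low + 0.5 * high        # 0.5 = how much fine detail to keep

    iio.imwrite("displacement_16k_cleaned.tif", cleaned)
    ```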
