
20180821-074841 (2018/08/21)

Finished the GPU rendering part; it now renders exactly the same output as the software renderer.

The general idea is that most of the effects (lights, ambient occlusion, some material properties) will be baked into static textures unique to the surfaces. The map file will contain all the texture layers used (including both generated projected textures and direct per-surface textures where needed) and the engine will be able to regenerate them as needed.
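To sketch the idea (the names and fields here are just illustrative, not the actual map format), the per-surface texture layers could be described roughly like this, so the engine can re-bake any of them on demand:

    #include <string>
    #include <vector>

    // Illustrative sketch only: one surface with its baked texture layers.
    enum class LayerKind {
        BakedLighting,      // static lights baked into the surface texture
        AmbientOcclusion,   // baked ambient occlusion term
        Material,           // baked material properties
        Projected,          // generated projected texture
        Direct              // direct per-surface texture
    };

    struct TextureLayer {
        LayerKind kind = LayerKind::Direct;
        std::string source;   // reference to the generator or source image
        int width  = 0;       // baked resolution for this surface
        int height = 0;
    };

    struct SurfaceTextures {
        int surfaceId = 0;
        std::vector<TextureLayer> layers;   // the engine can regenerate any layer as needed
    };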

Same with the geometry: the input will be various surface patches and the result will be a nicely blended single mesh, divided into a grid that allows processing and updating parts of the map as needed. Each cell in the grid will be generated at multiple levels of detail to aid rendering of more distant cells.
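Roughly speaking (again with made-up names, just to illustrate the structure), each grid cell would carry its own LOD chain and a dirty flag so only the affected cells get regenerated:

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    struct Vertex {
        float position[3];
        float normal[3];
        float uv[2];
    };

    struct CellLod {
        std::vector<Vertex>   vertices;   // geometry simplified for this level
        std::vector<uint32_t> indices;
    };

    struct GridCell {
        int x = 0, z = 0;              // cell coordinates in the map grid
        std::vector<CellLod> lods;     // lods[0] = full detail, higher = coarser
        bool dirty = false;            // set when patches in this cell change,
                                       // so only this cell is regenerated
    };

    // Pick a LOD for rendering based on distance from the camera to the cell.
    // Assumes the cell has at least one LOD.
    inline const CellLod& selectLod(const GridCell& cell, float distance, float lodStep)
    {
        std::size_t level = static_cast<std::size_t>(distance / lodStep);
        if (level >= cell.lods.size())
            level = cell.lods.size() - 1;
        return cell.lods[level];
    }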

I've been inspired a bit by the various voxel engines, both at the coarse level (the grid the geometry will be split into) and at the micro level, where the geometry processing allows blending between shapes and averages out geometric details that are too small and would otherwise produce aliasing (putting that detail into a normal map instead). This is similar to mipmapping for textures, but applied to geometry, and it will effectively tie the level of detail of the geometry and the textures together. The algorithm will not use actual voxels, just processing that produces a similar result.
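To make the mipmapping analogy concrete, here is a minimal sketch on a plain heightfield rather than the engine's blended patch geometry: each coarser level averages 2x2 blocks of the finer one, and the detail lost in the averaging is what would get baked into the normal map instead.

    #include <vector>

    struct HeightLevel {
        int size = 0;                  // size x size grid of heights
        std::vector<float> height;
    };

    // Build the next coarser level by averaging 2x2 blocks (box filter),
    // the geometric equivalent of generating the next texture mip level.
    HeightLevel downsample(const HeightLevel& fine)
    {
        HeightLevel coarse;
        coarse.size = fine.size / 2;
        coarse.height.resize(static_cast<size_t>(coarse.size) * coarse.size);
        for (int y = 0; y < coarse.size; ++y)
            for (int x = 0; x < coarse.size; ++x) {
                float sum =
                    fine.height[(2 * y)     * fine.size + 2 * x] +
                    fine.height[(2 * y)     * fine.size + 2 * x + 1] +
                    fine.height[(2 * y + 1) * fine.size + 2 * x] +
                    fine.height[(2 * y + 1) * fine.size + 2 * x + 1];
                coarse.height[y * coarse.size + x] = sum * 0.25f;
            }
        return coarse;
    }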

Now that all the needed pieces are done, I can work on the actual modeller based on Bézier surface patches.
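For reference, evaluating a bicubic Bézier patch (the basic primitive the modeller will be built around) looks roughly like this; the types and names are again just for illustration:

    #include <array>

    struct Vec3 {
        float x = 0, y = 0, z = 0;
    };

    inline Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    inline Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }

    // Cubic Bernstein basis weights for parameter t in [0, 1].
    inline std::array<float, 4> bernstein3(float t)
    {
        float u = 1.0f - t;
        return {u * u * u, 3 * u * u * t, 3 * u * t * t, t * t * t};
    }

    // Evaluate the patch at (u, v) from its 4x4 grid of control points.
    Vec3 evaluatePatch(const std::array<std::array<Vec3, 4>, 4>& control, float u, float v)
    {
        auto bu = bernstein3(u);
        auto bv = bernstein3(v);
        Vec3 p;
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                p = p + (bu[i] * bv[j]) * control[i][j];
        return p;
    }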
