FORUMS › NOTCH › General › Adding custom shaders

  • Richard
    Participant
    Post count: 2

    Hi,

    Just wondering which shader language to write in for adding custom shaders: GLSL or HLSL? Any notes on how to format additional frag/vert shaders? (Does the document “See writing custom shaders” exist yet?) For instance, I would quite like to be able to import a simple 2D shader I’ve made to use as a texture/material, the same sort of idea as the generator section; it just adds more flexibility if I can write/add my own.

    The other part of my question is about raymarching and shapes/geometry. I’d really like to be able to write/add my own distance functions for raymarching; the built-in shapes are fine but slightly limiting. Is there currently a way to add custom distance functions, for example via “Editable Code”?

    Cheers

  • Matt Swoboda
    Keymaster
    Post count: 52

    Hi Richard,
    Shaders are written in HLSL .fx format. Much of Notch’s functionality is based on shaders, and in many cases the shader is tightly bound to the engine code driving it – in general the “Shader” property is there for developers to be able to hot-edit shaders during development. So in many cases, without having the engine code the shader is running with, it wouldn’t be of much use. Our main 3D object shader is an ubershader with generated permutations and is tightly bound to the deferred renderer.

    There are a few areas where it would be pretty easy to drop in a shader though – 2D generators and post fx for example, where you’re just generating or modifying a 2D image. We are planning to add a couple of nodes (Post Effect and Generator) that allow you to drop in a shader and have a few parameters available to work with, so you wouldn’t need to repurpose another existing node for it, and some example shaders to go with it as templates.
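    As a rough sketch of what dropping a shader into such a 2D post fx slot might look like (the node isn’t released yet, so the entry point, resource names and parameter binding below are purely illustrative, not the actual Notch API):

    ```hlsl
    // Hypothetical drop-in 2D post shader sketch. Names like inputTexture,
    // linearSampler and brightness are illustrative assumptions only.
    Texture2D inputTexture;          // the incoming 2D image to modify
    SamplerState linearSampler;

    float brightness;                // imagined exposed node parameter

    float4 PS_Main(float4 pos : SV_Position,
                   float2 uv : TEXCOORD0) : SV_Target
    {
        // Read the source pixel and apply a trivial per-pixel operation.
        float4 col = inputTexture.Sample(linearSampler, uv);
        return float4(col.rgb * brightness, col.a);
    }
    ```

    The point being that a 2D generator or post fx only needs an image in and an image out, which is why these are the easiest places to open up to user shaders.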

    That said, particularly in the case of post fx there’s a lot in the box already, so it’s worth going over the existing nodes first before reaching for shaders. If you start writing your own code, performance and stability are going to be out of our hands… 🙂

    The distance field shaders are different, though. Distance field / procedural nodes in Notch work by generating a shader from fragments of code contributed by the nodes in the subtree; this yields a distance function that the procedural root node can use to generate geometry or voxels, or to raymarch. You will have seen there is a procedural node called “Editable Code” – you can type shader code directly in there.
    The format is as follows:
    – The resulting distance value is a float called sdfValue. It arrives containing the value from the previous nodes in the tree, so you can overwrite it or blend with it.
    – pos is the world-space position being evaluated.
    – localPos is the position in the node’s local space.
    – param0, param1 and param2 are float values that map to the 3 parameters on the node.

    So an example piece of code in that text box could be:

    sdfValue = min(sdfValue, localPos.y + sin(localPos.z * param0 + param1) * param2);

    I’ve attached a small example too. Obviously it’s not always easy – there’s no feedback about errors etc., you’re on your own there – so you may want to build the shader code in another environment that gives you errors, and copy it in. You can also split the code into multiple small chunks across separate Editable Code nodes; it may be easier to work that way.
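    For instance, using only the variables listed above, one Editable Code node could add a sphere and blend it smoothly into the incoming field with a polynomial smooth minimum (a standard SDF blending trick; param0 as radius and param1 as blend width are just one possible mapping):

    ```hlsl
    // Sphere of radius param0 around the node's local origin.
    float sphere = length(localPos) - param0;

    // Polynomial smooth minimum: blends the sphere into the incoming
    // field over a region of width param1 instead of a hard min().
    float h = saturate(0.5 + 0.5 * (sdfValue - sphere) / param1);
    sdfValue = lerp(sdfValue, sphere, h) - param1 * h * (1.0 - h);
    ```

    With param1 driven toward 0 this degenerates to a plain min(), i.e. a hard union with whatever the earlier nodes produced.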

    -Matt

    Attachments: procedural_editablecode
  • Richard
    Participant
    Post count: 2

    Hi Matt,

    Thanks for the swift and clear response. That’s good news, as I’m quite familiar with HLSL; it’s the same shader language Unity uses.

    Making simple 2D pixel shaders is exactly what I’m after, really; being able to extend a blank post fx or generator node would be the perfect solution. You’re right that there’s a bunch of cool stuff in the post section, like the film grading. It would be great if you could really smash the chromatic aberration in it, or the radial blur with RGB distortion, whatever it’s called – the smash factor needs to go up to 11!

    Excellent, that’s perfect. Is there anything you’d say about staying Lipschitz continuous? Something about making sure you return an SDF value lower than 1?

    The other little detail is about using custom code in CSG mode: if I double-click a 3D primitive I don’t see an input for the CSG mode. Is it possible to override the blend mode with the “Editable Code” node?

    The main reason for all this is that I’ve got a set of ideas about moving from 2D to 3D using 2D tiling functions. I was thinking of making animated textures/materials with pixel shaders and then morphing into SDFs with the same patterns etched into, or protruding out of, the surface – kind of like this thing I made in Unity a few months ago: https://youtu.be/RKhpKRibxro

    Very last throwaway thing: is it possible to use an emissive texture to light its surroundings? I’ve been trying to fake it with a glow node and a light in the same position, like here: https://youtu.be/9BSLCmjuxQ8

  • Patrik
    Participant
    Post count: 12

    A live-code 2D generator would be awesome :) – something like KodeLife: https://hexler.net/software/kodelife

  • Matt Swoboda
    Keymaster
    Post count: 52

    Hi Richard,

    We have a Custom Shader Post Effect node coming in the 0.9.18 release. You can attach post fx to generators anyway, so you can use it to generate too.

    > Excellent, that’s perfect. Is there anything you’d say about staying Lipschitz continuous? Something about making sure you return an SDF value lower than 1?

    Staying Lipschitz continuous is an endless problem for distance field shaders… 🙂 It means the distances you produce are real distances – i.e. if I sample a distance, then step to the left and sample again, the new value is consistent with the original distance plus the amount I stepped. It’s very easy to achieve for primitives and harder for anything interesting. We don’t enforce it – not all of our own generators are fully Lipschitz either – and the nodes we use to render (meshing, volumes) can be quite forgiving, as they mainly look for the zero-crossing point rather than raymarching and needing accurate distances along the way. That said, we have a lot of optimisations that rely on the field producing an accurate result wherever it’s sampled, so you could get some interesting-looking errors if you get it wrong.
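    A common way this breaks (a general SDF pitfall, not anything Notch-specific) is scaling space without compensating: evaluating a primitive at localPos * s multiplies the field’s gradient by s, so the result must be divided back down to remain a true distance bound. A minimal sketch in the Editable Code variables, assuming param0 is the scale factor:

    ```hlsl
    // Scaling space by 's' makes the raw result overestimate gradients
    // by a factor of s; dividing by s restores |grad| <= 1 (Lipschitz).
    float s = param0;                      // uniform scale factor
    float d = length(localPos * s) - 1.0;  // unit sphere in scaled space
    sdfValue = min(sdfValue, d / s);       // compensate to a true distance
    ```

    The same idea applies to displacements like the sin() example earlier: dividing the combined result by (1 + the displacement’s maximum slope) keeps the field conservative.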
    The CSG mode is only a parameter, not an input, so it can’t be overridden like that.
