Crosshatching with Eevee

Blender 2.8 isn’t even in Beta yet and it already has me hooked! Eevee will (IMHO) open the floodgates for NPR and all kinds of cool styles. This time I worked on a crosshatching material.

The technique is based on this paper from Princeton/Microsoft Research. The gist of it is making several textures to represent the different levels of shading and mapping them to the lighting on the object. Well, guess who has a shader-to-rgb node now? We can grab the illumination data from a diffuse or glossy BSDF, plug it into several colorramp nodes and get masks for each step. I changed one thing from the paper though. In the paper they use textures that have the crosshatching already combined and then mix between them. I think keeping them separated and multiplying them in with mix nodes is more efficient/flexible for Eevee.
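
In case you want to poke at it from Python, here’s a rough bpy sketch of that node chain. It is not the exact node tree from the blend file: the texture file names, the number of levels and the ramp positions are all placeholders.

```python
import bpy

# Sketch: Diffuse BSDF -> Shader to RGB -> one ColorRamp mask per shading level,
# each mask multiplying its own hatch texture into the result.
mat = bpy.data.materials.new("Crosshatch")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

diffuse = nodes.new('ShaderNodeBsdfDiffuse')
to_rgb = nodes.new('ShaderNodeShaderToRGB')          # grabs the lighting as a color
links.new(diffuse.outputs['BSDF'], to_rgb.inputs['Shader'])

base = nodes.new('ShaderNodeRGB')                    # the "paper" color the hatches darken
base.outputs['Color'].default_value = (1, 1, 1, 1)
current = base.outputs['Color']

# (threshold, texture) pairs are placeholders; add more pairs for more shading levels
for threshold, path in [(0.6, "hatch_light.png"), (0.3, "hatch_dark.png")]:
    ramp = nodes.new('ShaderNodeValToRGB')
    ramp.color_ramp.interpolation = 'CONSTANT'        # hard-edged mask
    ramp.color_ramp.elements[0].color = (1, 1, 1, 1)  # white below the threshold...
    ramp.color_ramp.elements[1].color = (0, 0, 0, 1)  # ...black above it
    ramp.color_ramp.elements[1].position = threshold
    links.new(to_rgb.outputs['Color'], ramp.inputs['Fac'])

    tex = nodes.new('ShaderNodeTexImage')
    tex.image = bpy.data.images.load(path)            # placeholder file name

    mix = nodes.new('ShaderNodeMixRGB')               # hatch only multiplies in where the mask is white
    mix.blend_type = 'MULTIPLY'
    links.new(ramp.outputs['Color'], mix.inputs['Fac'])
    links.new(current, mix.inputs['Color1'])
    links.new(tex.outputs['Color'], mix.inputs['Color2'])
    current = mix.outputs['Color']

emit = nodes.new('ShaderNodeEmission')                # flat, unlit output for the NPR look
output = nodes.new('ShaderNodeOutputMaterial')
links.new(current, emit.inputs['Color'])
links.new(emit.outputs['Emission'], output.inputs['Surface'])
```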

I hope you like UV unwrapping :). You will need it to make the lines flow in a uniform way; other texture coordinates can cause strange or unrealistic results. There is one weak spot though: large flat areas at grazing angles to the camera look weird, like a blurred mid-90s repeating texture. I haven’t found a good solution yet. One workaround is to use the Window texture coordinates, but these bring a different problem: they look awful the moment you move the camera, and they make the objects look flatter.

Don’t forget to scale the UVs properly so the lines have roughly the same size everywhere. Also, since this effect depends on colorramp-ing the diffuse shader, it’s a little illumination-specific, particularly in high-contrast spots. You can see this for yourself by moving the sun in the blend file and watching what happens to the shadows on the walls.
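
If you’d rather script that than resize islands in the UV editor, a quick way is to scale every UV of the object uniformly. A minimal sketch; the object name and the 2.0 factor are placeholders, and going outside the 0–1 range is fine because the hatch textures repeat:

```python
import bpy

# Sketch: uniformly scale an object's UVs so the hatch lines tile more densely.
# "Cube" and the 2.0 factor are placeholders; run it in Object Mode.
obj = bpy.data.objects["Cube"]
uv_layer = obj.data.uv_layers.active

for loop_uv in uv_layer.data:
    loop_uv.uv *= 2.0   # bigger factor = more repetitions = finer lines
```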

Subtle paper textures really help sell the effect. You can grab some at Pixabay or Lost&Taken. Since the hatch textures are black and white, you can take the render and multiply it on top of the paper. Another important part of the effect is using masks and leaving empty spaces to suggest details, even more so if you are repeating the textures a lot, since the repetition will look artificial.
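
You can do that multiply in an image editor, or straight in Blender’s compositor. A minimal sketch of the compositor route (paper.jpg is a stand-in for whatever scan you downloaded):

```python
import bpy

# Sketch: multiply the render over a paper texture in the compositor.
scene = bpy.context.scene
scene.use_nodes = True
nodes, links = scene.node_tree.nodes, scene.node_tree.links
nodes.clear()

render = nodes.new('CompositorNodeRLayers')
paper = nodes.new('CompositorNodeImage')
paper.image = bpy.data.images.load("paper.jpg")    # placeholder path
# If the scan doesn't match the render resolution, add a Scale node
# (CompositorNodeScale set to 'RENDER_SIZE') between the image and the mix.

mix = nodes.new('CompositorNodeMixRGB')
mix.blend_type = 'MULTIPLY'
links.new(render.outputs['Image'], mix.inputs[1])  # both color inputs are named "Image",
links.new(paper.outputs['Image'], mix.inputs[2])   # so index them instead

composite = nodes.new('CompositorNodeComposite')
links.new(mix.outputs['Image'], composite.inputs['Image'])
```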

Speaking of textures: they are very easy to make using Krita’s wrap mode (press W). Just grab a good brush and start making horizontal lines, then make another layer and add some more lines in other places. Repeat to get three horizontal layers. Then do the same for three vertical ones and export each layer as a separate PNG file. For these textures I used the Ink-7 Brush Rough brush and drew the lines freehand. I also have another set of textures I made with the Basic-5 brush and the line tool. Those look cleaner and more organized, but I kinda like the rough look better.

When making your own textures, keep in mind it’s important to keep the thickness and spacing of the strokes even. Otherwise you can create local details that will show up when repeating the texture and ruin the effect. You can zoom out while in wrap mode to check this.

You can download the blend, the textures and the source Krita file. The whole thing is CC0, so feel free to use it for anything without attribution (I wouldn’t mind a link back though!).

12 Comments

  1. Iraito(5 years ago)

    Really interesting, I will mess with it in the future. Would you say that this technique is good for getting results similar to this? https://www.youtube.com/watch?v=-cw2gXq83n8

    1. januz(5 years ago)

      Hi! Yeah, you could mix those crosshatch textures into others to create a similar effect to what they show at the 4:38 mark. For the outlines you would have to use Freestyle.

  2. Gareth(5 years ago)

    That sounds very interesting. Will look at the blend file. I wonder if we can replace the manual textures with algorithmic textures? This might make things a lot simpler, especially when it comes to UV unwrapping and scaling.

    1. januz(5 years ago)

      Hi, I think the only way to do something like this with procedurals would be using OSL or some custom shader to generate the whole texture. You could pick random points and paint lines from them. The main problem would be how to distribute those points in a random-but-still-regular sort of way; otherwise you would have lines clumping together.
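
      Something like a jittered grid usually does the trick for that: one point per cell, each nudged by a bounded random offset, so the strokes stay evenly spaced without lining up in rows. A rough sketch of the idea (plain Python rather than OSL, and the grid size is arbitrary):

      ```python
      import random

      def jittered_grid(cols, rows, jitter=0.4, seed=0):
          """One point per grid cell in 0..1 UV space, randomly nudged inside it."""
          rng = random.Random(seed)
          points = []
          for i in range(cols):
              for j in range(rows):
                  x = (i + 0.5 + rng.uniform(-jitter, jitter)) / cols
                  y = (j + 0.5 + rng.uniform(-jitter, jitter)) / rows
                  points.append((x, y))
          return points

      # e.g. 20x20 stroke seeds for one tileable hatch texture
      seeds = jittered_grid(20, 20)
      ```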

  3. Iraito(4 years ago)

    I had time to test the shader and see if I could mix it with the Principled one, and it works BUT only in Eevee. Can you tell me what I should do to use the effect in Cycles?

    https://imgur.com/a/DOTpcWC

    I was also interested in achieving the same result with shadows. Any idea how to achieve it in Eevee/Cycles?

    1. Diego Gangl(4 years ago)

      Hi, it should mostly work. However, since it’s using shader-to-rgb, the results of the diffuse shader from Cycles might be different from Eevee. You might have to tweak the shader, the lights or the result of shader-to-rgb. For shadows the process is similar: check out the “Floor” material in the blend file and play with the colorramp coming out of the shader-to-rgb node. You can use it as a mask for shadows (using a thick crosshatch).

  4. Iraito(4 years ago)

    Thank you a lot, I’m not an expert in shading so I need as much help as I can get. I’m trying to develop a personal visual style and your approach seems perfect for me.

    I will try when I have time. Sorry for being annoying, I really want to get this to work; thank you again.

    1. Diego Gangl(4 years ago)

      Hey, no problem! Looking forward to checking out your style in the future.

  5. Iraito(4 years ago)

    Damn, I asked on the Blender Artists forum if the Shader to RGB node was in any way usable in Cycles but it seems to be Eevee-only. At this point I either use Eevee for my projects with an NPR style or wait for the development team to make the node available for Cycles.

    1. Diego Gangl(4 years ago)

      Ah sorry, I gave it a quick look and thought it was somehow working. Turns out it’s outputting pure grey :/ It’s probably not possible to capture the output of BSDFs in Cycles, since it’s a raytracer. You could still use both engines and composite the result, but it’s probably not worth the extra work.

  6. me(4 years ago)

    maybe use some noise to blend it all randomly?

    1. Diego Gangl(4 years ago)

      Do you mean having noise added to the factor of the mix nodes?
