How To Demoscene
Tracie, by TBC, 2007
Muon Baryon, by YUP+UD+Outracks, 2009
Receptor, by TBC, 2008
• Only 2/24 intros had textures... (a design decision? in all of them? really?)
Parsec, 2005
motivation :: technical improvement (2)
• Because everybody was saying “very nice this Ixaleno, yes, but when realtime?”
• But I knew it could be done realtime
• even though I couldn’t openly say so
• since I often say “exe or it didn’t happen” myself
• So, I had to try the “2+1.000.000” thing to prove it
• et voilà, it just worked!
motivation :: realtime ixaleno (3)
• Changed to a camera-aligned regular grid (no popping, still infinite terrain support); see the sketch below
Screenshot taken with the final technique, during the experimentation week. Basically realtime Ixaleno on my mobility 8600 GS.
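Not the intro's actual code, but a minimal C sketch of one way to read “camera-aligned regular grid”: a regular heightfield grid that follows the camera, with its origin snapped to whole cells so vertices never swim or pop and the terrain is effectively infinite. GRID_N, CELL and terrain_height() are placeholders, not names from Elevated.

/* Sketch: a regular grid that follows the camera. Snapping the origin to the
   lattice means every vertex always lands on the same world-space positions,
   so nothing pops as the camera moves. terrain_height() stands in for the
   procedural fbm heightfield. */
#include <math.h>

#define GRID_N 256            /* vertices per side (placeholder value)     */
#define CELL   1.0f           /* world-space cell size (placeholder value) */

extern float terrain_height(float x, float z);   /* procedural heightfield */

typedef struct { float x, y, z; } vertex;

/* Fill GRID_N*GRID_N vertices centred under the camera. */
void build_grid(vertex *v, float cam_x, float cam_z)
{
    /* snap the grid origin to the lattice, not to the raw camera position */
    float ox = floorf(cam_x / CELL) * CELL - 0.5f * GRID_N * CELL;
    float oz = floorf(cam_z / CELL) * CELL - 0.5f * GRID_N * CELL;

    for (int j = 0; j < GRID_N; j++)
    for (int i = 0; i < GRID_N; i++)
    {
        float x = ox + i * CELL;
        float z = oz + j * CELL;
        v[j * GRID_N + i] = (vertex){ x, terrain_height(x, z), z };
    }
}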
motivation :: realtime ixaleno (3)
• and lakes
Screenshot taken with the final technique, during the experimentation week
motivation :: realtime ixaleno (3)
• So I knew the “2+1.000.000” thing was working, that Ixaleno was doable in realtime
• But it was just sitting on my disk
• “too big for 4k, not good enough for a 64k”
• Until
• [Mentor] what's up? something new?
• [iq] no, not really. ah, well, yes, i have been trying something
• [iq] but it's 3k5 already without mzk or script
• iq sends 'realtimeIxalenoScreenshot05.jpg'
• [Mentor] hm, looks nice
• [Mentor] how about making a 4k together
• [iq] don't think it's possible, it's 3k5, unoptimized, but still 3k5
• [Mentor] we make it 4k
• [Mentor] i'm telling puryx
• [iq] ... ok. wow!
behind elevated
• motivation
• the approach
• techniques
the approach
Early experiments with film-look postprocessing shaders; completely discarded after a short discussion
the approach :: the elements
• In the end we went for a more 70s camera style (result of the postprocess shader)
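The slides don't show the shader itself; purely as an illustration of the kind of pass this means, here is a generic “old camera” post-process on one pixel in C, with the usual ingredients (vignette, warm tint, grain). Everything here, including frame_noise(), is an assumption, not code from Elevated.

/* Generic film-look post-process on one pixel; an illustration of the kind
   of pass discussed, not the shader used in Elevated. */
extern float frame_noise(float u, float v, float t);   /* placeholder grain source */

/* col: linear RGB in/out, (u,v): pixel coords in [0,1], t: time */
void film_look(float col[3], float u, float v, float t)
{
    /* vignette: darken towards the corners */
    float du = u - 0.5f, dv = v - 0.5f;
    float vig = 1.0f - 1.2f * (du * du + dv * dv);

    /* warm, slightly desaturated tint for a 70s feel */
    float lum = 0.299f * col[0] + 0.587f * col[1] + 0.114f * col[2];
    float tint[3] = { 1.05f, 1.00f, 0.90f };

    /* a touch of animated grain */
    float grain = 1.0f + 0.04f * (frame_noise(u, v, t) - 0.5f);

    for (int i = 0; i < 3; i++)
        col[i] = (0.8f * col[i] + 0.2f * lum) * tint[i] * vig * grain;
}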
the approach :: the elements
• Before postprocessing, for comparison
the approach :: the elements
• X and Z are simply two octaves of cosine functions. Frequencies and phases define different cameras. These were chosen randomly from a “random” texture. This texture was in fact the same one used for noise() ;)
• Therefore, only 256x256 = 65536 cameras possible
• We only explored two rows (512 cameras)
f1 = randomtexture[ texel+=k ]
f2 = randomtexture[ texel+=k ]
f3 = randomtexture[ texel+=k ]
f4 = randomtexture[ texel+=k ]
x(t) = 16*cos(f1*t+f2) + 8*cos(2*f3*t+f4)
y(t) = 16*cos(f4*t+f3) + 8*cos(2*f2*t+f1)
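The same path, restated as runnable C. The layout of the 256x256 texture, the wrap-around indexing and the names rnd_tex/k are assumptions; the second horizontal coordinate is written as z here to match the “X and Z” bullet above.

/* Camera path sketch following the pseudocode above. rnd_tex is assumed to
   be the shared 256x256 random/noise texture stored as floats in [0,1];
   the starting texel picks one of the 65536 possible cameras. */
#include <math.h>

extern float rnd_tex[256 * 256];   /* the shared noise texture (assumed layout) */

typedef struct { float x, z; } cam_pos;

cam_pos camera_path(float t, int texel, int k)
{
    /* four consecutive lookups give the frequencies and phases */
    float f1 = rnd_tex[(texel += k) & 0xffff];
    float f2 = rnd_tex[(texel += k) & 0xffff];
    float f3 = rnd_tex[(texel += k) & 0xffff];
    float f4 = rnd_tex[(texel += k) & 0xffff];

    cam_pos p;
    /* two octaves of cosines, as in the pseudocode */
    p.x = 16.0f * cosf(f1 * t + f2) + 8.0f * cosf(2.0f * f3 * t + f4);
    p.z = 16.0f * cosf(f4 * t + f3) + 8.0f * cosf(2.0f * f2 * t + f1);
    return p;
}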
• N is the normal
• N’ is the “smooth normal”
• Simple to combine with regular lighting:
• Regular diffuse is max(0, N·L)
• The modified version also brings in N’ and h (one plausible form is sketched below)
• h controls the softness of the shadows
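The slide equations did not survive extraction. The regular diffuse term is the standard max(0, N·L); for the modified one, here is a hedged C sketch of one plausible way N’ and h could enter (a smoothstep on dot(N’,L) used as a soft self-shadow factor), not necessarily the intro's exact formula.

/* Hedged sketch: combining the detailed normal N with the smooth normal Ns
   for soft self-shadowing. The smoothstep on dot(Ns,L) is an assumption
   about the "modified" term; h widens the shadow transition. */
#include <math.h>

static float dot3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

static float clamp01(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

/* N: detailed normal, Ns: smooth normal, L: light direction, h: shadow softness */
float modified_diffuse(const float N[3], const float Ns[3], const float L[3], float h)
{
    float dif = fmaxf(dot3(N, L), 0.0f);          /* regular Lambert diffuse  */
    float t   = clamp01((dot3(Ns, L) + h) / (2.0f * h));
    float sha = t * t * (3.0f - 2.0f * t);        /* smoothstep(-h, h, Ns·L)  */
    return dif * sha;                             /* shadowed diffuse         */
}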
the techniques :: texturing