Glitch Art and Shaders

posted on 2017.02.11, under Processing

It’s been a while since the last post. I have been busy with (finally!) starting to set up a website to collect some of my works, and with more or less finishing a couple of interactive installations. For this reason, interactivity and real-time processing have captured my attention recently. It turns out that when you want to interact with a piece of code that produces graphics, as soon as what you are doing involves more than a couple of colored circles, you quickly run into performance issues. So, unless you are one of those digital artists who draw a blinking white circle in the middle of the screen and call it art (it’s fine, don’t worry, go on with it), you need to find your way around these issues. In practice, this amounts to getting comfortable with words like Vertex Buffer Object, C++, and shaders, which is what this post is dedicated to.
The story goes like this: modern graphics cards (GPUs) have a language of their own, called GLSL. For instance, when in Processing you draw a line or a circle, what actually happens behind the curtains is a communication between the CPU and the graphics card: Processing informs the GPU about the vertices of the line, the fact that it has to be a line, the color of the vertices, etc. There are several stages between the moment the vertices are communicated and the final result that you see on your screen. Some of these stages are user programmable, and the little programs that take care of each of these stages are called “shaders”. Shaders are notoriously difficult to work with: you have to program them in a C-like language (GLSL), basically, and they are quite unforgiving with respect to errors in the code. On the other hand, they are really, really fast. If you want to know why that is, and how a (fragment) shader operates, have a look here.
So, why the hell would you want to learn such a tool? Well, if you, like me, are fond of glitch art, you must have realized that interactive real-time glitch art is almost impossible if you try to work pixel by pixel: even at a resolution of 800×600, the amount of computation the CPU has to do to maintain a framerate of 30fps is impractical. Enter fragment shaders! If you delegate the work to the GPU, it becomes more than doable.
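To make the comparison concrete, here is a minimal sketch of the pixel-by-pixel CPU approach (the image path and the particular per-pixel tweak are just placeholders for illustration): every frame it has to visit all 480,000 pixels of an 800×600 image on the CPU, and the framerate collapses as soon as the per-pixel math gets a little more interesting.

PImage img;

void setup(){
  size(800, 600);
  img = loadImage("myImage.jpg");   // placeholder: any image file
  img.resize(800, 600);
}

void draw(){
  // work on a fresh copy every frame, so the glitch does not accumulate
  PImage frame = img.copy();
  frame.loadPixels();
  // CPU-side "glitch": for every pixel, steal the red channel from a
  // randomly offset neighbour -- roughly 480,000 iterations per frame
  for (int i = 0; i < frame.pixels.length; i++){
    int j = (i + int(random(50))) % frame.pixels.length;
    color c  = frame.pixels[i];
    color c2 = frame.pixels[j];
    frame.pixels[i] = color(red(c2), green(c), blue(c));
  }
  frame.updatePixels();
  image(frame, 0, 0);
}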
I can’t go into the details of the code I present in the following, but there are very good tutorials on the web that slowly teach you how to tame shaders. In particular, have a look here. Be warned, though: you really need to be comfortable with programming, and to have a lot of patience, to work with shaders!

PImage img;
PShader glitch;


void setup(){
  size(800, 600, P2D);    // the P2D (or P3D) renderer is needed to use shaders
  background(0);

  img = loadImage("insert_link_to_image");   // placeholder: path or URL of your image
  img.resize(800, 600);

  // load the fragment shader and tell it the resolution via a uniform
  glitch = loadShader("glitchFrag.glsl");
  glitch.set("iResolution", new PVector(800.0, 600.0, 0.0));
}

void draw(){

  // feed the shader a pseudo-random "time" every frame, so the glitch jumps around
  glitch.set("iGlobalTime", random(0, 60.0));

  // apply the glitch shader only on roughly 40% of the frames
  if (random(0.0, 1.0) < 0.4){
    shader(glitch);
  }

  image(img, 0, 0);

  resetShader();

}

---------------

// glitchFrag.glsl

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif


#define PROCESSING_TEXTURE_SHADER

varying vec4 vertTexCoord;          // texture coordinates passed in by Processing
uniform sampler2D texture;          // the image being drawn with image()
uniform vec3      iResolution;      // set from the sketch: (width, height, 0)
uniform float     iGlobalTime;      // set from the sketch: a pseudo-random "time"



// a cheap pseudo-random number generator: the classic fract(cos(dot(...))) one-liner
float rand(vec2 co){
    return fract(cos(dot(co.xy, vec2(12.9898, 78.233))) * 43758.5453);
}


void main(){
    vec3 uv = vec3(0.0);                            // per-channel color offset
    vec2 uv2 = vec2(0.0);                           // texture coordinate offset
    vec2 nuv = gl_FragCoord.xy / iResolution.xy;    // normalized pixel coordinates
    vec3 texColor = vec3(0.0);

    // most of the time sample the texture normally, otherwise sample it at
    // randomly squashed coordinates (this gives the "jumping" effect)
    if (rand(vec2(iGlobalTime)) < 0.7){
        texColor = texture2D(texture, vertTexCoord.st).rgb;
    }
    else{
        texColor = texture2D(texture, nuv * vec2(rand(vec2(iGlobalTime)), rand(vec2(iGlobalTime * 0.99)))).rgb;
    }

    // pick a random horizontal band and, with some probability, distort the pixels
    // inside it that are darker than a random threshold, by offsetting the color
    // channels and the texture coordinates
    float r = rand(vec2(iGlobalTime * 0.001));
    float r2 = rand(vec2(iGlobalTime * 0.1));
    if (nuv.y > rand(vec2(r2)) && nuv.y < r2 + rand(vec2(0.05 * iGlobalTime))){
        if (r < rand(vec2(iGlobalTime * 0.01))){
            if ((texColor.r + texColor.g + texColor.b) / 3.0 < r * rand(vec2(0.4, 0.5)) * 2.0){
                uv.r -= sin(nuv.x * r * 0.1 * iGlobalTime) * r * 7000.;
                uv.g += sin(vertTexCoord.y * vertTexCoord.x / 2.0 * 0.006 * iGlobalTime) * r * 10. * rand(vec2(iGlobalTime * 0.1));
                uv.b -= sin(nuv.y * nuv.x * 0.5 * iGlobalTime) * sin(nuv.y * nuv.x * 0.1) * r * 20.;
                uv2 += vec2(sin(nuv.x * r * 0.1 * iGlobalTime) * r);
            }
        }
    }

    // resample the texture at the displaced coordinates and add the channel offsets
    texColor = texture2D(texture, vertTexCoord.st + uv2).rgb;
    texColor += uv;
    gl_FragColor = vec4(texColor, 1.0);
}

In the following, you can see the result applied to a famous painting by Caravaggio (yes, I love Caravaggio): it runs comfortably at a real-time framerate.
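If you want to check the framerate on your own machine, a quick (hypothetical) addition is to display Processing’s built-in frameRate variable at the end of the draw() function above:

// append to the end of draw(), after resetShader(), to monitor performance
fill(255);
text(int(frameRate) + " fps", 10, 20);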
If you want to apply the shader to the webcam, you just need to set up a Capture object, called, say, cam, and substitute img with cam in the Processing code. Enjoy some glitching! :)
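For completeness, here is a rough sketch of that webcam variant, assuming the Video library is installed; the capture resolution is just a guess at what your camera provides, so treat the Capture setup as a placeholder.

import processing.video.*;

Capture cam;
PShader glitch;

void setup(){
  size(800, 600, P2D);
  background(0);

  // placeholder settings: use whatever resolution your camera supports
  cam = new Capture(this, 800, 600);
  cam.start();

  glitch = loadShader("glitchFrag.glsl");
  glitch.set("iResolution", new PVector(800.0, 600.0, 0.0));
}

void draw(){
  // grab a new frame from the camera when one is available
  if (cam.available()){
    cam.read();
  }

  glitch.set("iGlobalTime", random(0, 60.0));

  if (random(0.0, 1.0) < 0.4){
    shader(glitch);
  }

  // scale the camera frame to fill the window
  image(cam, 0, 0, width, height);

  resetShader();
}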

Glitch Shader from kimri on Vimeo.

Comments

Hello,

I love what you did with this effect. I’m looking to do the following thing:

I would like the image to be the camera and the glitch to be linked to the sound.

That is to say that when there is no noise, the image is clear and the more noise there is, the more the image is distorted.

Could you help me? I can possibly pay you if it is necessary :)

I thank you in advance

A student in panic

Jack ( 23/03/2018 at 9:48 am )
