I want to pass a shader a texture of encoded locations, and have it draw points at the decoded locations.
I have a particle system stored in a texture that's 3000x1, with the x and y locations encoded into RGBA.
Currently I have to use the CPU to loop through the particles and draw them to a new texture with point(). I know this is done properly in a shader in the PixelFlow library, but I can't figure it out from studying particleRender.glsl.
How can I get a shader to replicate what's going on in draw()? It feels like it should be easy, but from what I've read on the PShader page and in the Book of Shaders, I can't piece it together.
Edit: I've updated with my attempt at hacking away at PixelFlow's particleRender.glsl; it isn't throwing errors, but it isn't drawing anything either. I'm not used to the way version 150 works, so maybe it's something simple. I've tried lots of troubleshooting and I can't get that shader to draw anything at all.
edit2: I've worked at it for hours with only a tiny bit of progress: a little sign of life from the shader, but I have no idea what it's doing. It's drawing what seems to be a random quad.
PGraphics pgLocs;
PShader psRender;
int totalPoints = 3000;

void setup() {
  size(800, 800, P3D);
  pgLocs = createGraphics(totalPoints, 1, P2D);
  psRender = loadShader("pointFrag.frag", "pointVert.glsl");
  psRender.set("tex_position", pgLocs);
  // vec2/float uniforms have to be set with floats, not ints,
  // or the types won't match the shader's declarations
  psRender.set("wh_viewport", (float) width, (float) height);
  psRender.set("wh_position", (float) totalPoints, 1.0);
  psRender.set("point_size", 4.0);
  psRender.set("totalPoints", totalPoints); // int uniform, so an int is fine here
  randomFill();
}

void randomFill() {
  pgLocs.beginDraw(); // fill pgLocs with random normalized locations
  for (int i = 0; i < pgLocs.width; i++) {
    PVector loc = new PVector(random(width)/width, random(height)/height);
    pgLocs.set(i, 0, xyToRGBA(loc));
  }
  pgLocs.endDraw();
  psRender.set("tex_position", pgLocs);
}
void keyPressed() {
  randomFill();
}
void draw() {
  stroke(0);
  strokeWeight(5);
  background(55);
  // What I wish would work:
  shader(psRender);
  //fill(255, 100);
  //rect(0, 0, width, height);
  pgLocs.loadPixels();
  for (int i = 0; i < pgLocs.width; i++) { // what I'd like to do in a shader instead
    color c = pgLocs.pixels[i];           // get the encoded pixel
    PVector loc = RGBAtoXY(c);            // decode location
    stroke(c);                            // set color just for fun
    point(loc.x*width, loc.y*height);     // shows the location was stored in the texture properly
  }
  filter(psRender); // note: filter() runs the shader over one fullscreen quad, not over the 3000 points
}
color xyToRGBA(PVector loc) { // pack x into R,G and y into B,A (integer byte + fractional byte)
  PVector l = loc.copy().mult(255);
  int xi = floor(l.x);            // integer part of x*255 -> R
  int yi = floor(l.y);            // integer part of y*255 -> B
  int xr = floor((l.x-xi)*255);   // fractional part of x*255 -> G
  int yr = floor((l.y-yi)*255);   // fractional part of y*255 -> A
  return (yr << 24) | (xi << 16) | (xr << 8) | yi;
}

PVector RGBAtoXY(color c) { // inverse of xyToRGBA
  int a = (c >> 24) & 0xff;
  int r = (c >> 16) & 0xff;
  int g = (c >> 8) & 0xff;
  int b = c & 0xff;
  return new PVector((r + g/255.0)/255.0, (b + a/255.0)/255.0);
}
pointFrag.frag
#version 150

uniform vec2 wh_viewport;
uniform sampler2D tex_position;
uniform sampler2D tex_sprite;
uniform vec4 col_A = vec4(1.0, 1.0, 1.0, 1.0);
uniform vec4 col_B = vec4(0.0, 0.0, 0.0, 0.0);

in vec2 location;
in float my_PointSize;

out vec4 out_frag;

void main() {
  vec2 my_PointCoord = ((location * wh_viewport) - gl_FragCoord.xy) / my_PointSize + 0.5;
  // alpha was 0, which made every point fully transparent
  out_frag = vec4(1.0, 0.0, 0.0, 1.0);
}
pointVert.glsl
#version 150

uniform float point_size;
uniform sampler2D tex_position;
uniform vec4 col_A = vec4(1.0, 1.0, 1.0, 1.0);
uniform vec4 col_B = vec4(0.0, 0.0, 0.0, 0.0);
uniform int totalPoints;

out vec2 location;
out float my_PointSize;

vec2 posDecode(vec4 c) { // inverse of xyToRGBA: integer byte + fractional byte
  return vec2(c.r + c.g/255.0, c.b + c.a/255.0);
}

void main() {
  // sample texel centers: +0.5 on x, and v = 0.5 for a 1-pixel-high texture
  float x = (float(gl_VertexID) + 0.5) / float(totalPoints);
  vec4 color = texture(tex_position, vec2(x, 0.5)); // texture() replaces texture2D in 150
  location = posDecode(color);
  location.y = 1.0 - location.y; // flip to Processing's y-down convention
  gl_Position = vec4(location * 2.0 - 1.0, 0.0, 1.0);
  gl_PointSize = point_size;
  my_PointSize = point_size;
}
edit3: Still no luck figuring it out. I found more of the puzzle in PixelFlow; these calls seem important: gl.glEnable(GL3.GL_PROGRAM_POINT_SIZE); gl.glDrawArrays(GL2.GL_POINTS, 0, num_points);
This is what I'm trying to implement it into. I think I could do a million pathfinders instead of tens of thousands.
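For reference, here is my current understanding of how those two calls could be wired up through PGL/JOGL. This is an untested sketch, not a confirmed solution: it borrows the ((PJOGL) pgl).gl cast from the Advanced-OpenGL wiki page, and psRender and totalPoints are the fields from the sketch above.
import com.jogamp.opengl.GL2;
import com.jogamp.opengl.GL3;

void drawParticles() {
  PGL pgl = beginPGL();
  GL3 gl = ((PJOGL) pgl).gl.getGL3();
  gl.glEnable(GL3.GL_PROGRAM_POINT_SIZE); // honor gl_PointSize from the vertex stage
  psRender.bind();
  // one vertex per particle, no vertex buffer: gl_VertexID drives the texture lookup
  gl.glDrawArrays(GL2.GL_POINTS, 0, totalPoints);
  psRender.unbind();
  endPGL();
}
This would replace the filter(psRender) call in draw() above.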
Shader in question: https://shaderfrog.com/app/view/1078?view=shader
I'm a complete GLSL noob and was wondering how to go about applying this shader to an imported .obj in Processing.
My sketch is currently as follows:
PShape obj;
PShader shader;

void setup() {
  size(360, 720, P3D);
  obj = loadShape("mesh.obj");
  shader = loadShader("shaderFrogFrag.glsl", "shaderFrogVert.glsl");
}

void draw() {
  shader.set("color", 0.3, 0.8, 0.8);
  shader.set("secondaryColor", 0.2, 0.4, 0.7);
  shader.set("lightPosition", 0.6, 0.0, 2.0);
  shader(shader);
  translate(width/2, height/2);
  shape(obj);
}
The shader code is directly from the site.
Vert:
/**
* Example Vertex Shader
* Sets the position of the vertex by setting gl_Position
*/
// Set the precision for data types used in this shader
precision highp float;
precision highp int;
// Default THREE.js uniforms available to both fragment and vertex shader
uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat3 normalMatrix;
// Default uniforms provided by ShaderFrog.
uniform vec3 cameraPosition;
uniform float time;
// Default attributes provided by THREE.js. Attributes are only available in the
// vertex shader. You can pass them to the fragment shader using varyings
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;
attribute vec2 uv2;
// Examples of variables passed from vertex to fragment shader
varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUv;
varying vec2 vUv2;
void main() {
  // To pass variables to the fragment shader, you assign them here in the
  // main function. Traditionally you name the varying with vAttributeName
  vNormal = normal;
  vUv = uv;
  vUv2 = uv2;
  vPosition = position;
  // This sets the position of the vertex in 3d space. The correct math is
  // provided below to take into account camera and object data.
  gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
Frag:
/**
* Example Fragment Shader
* Sets the color and alpha of the pixel by setting gl_FragColor
*/
// Set the precision for data types used in this shader
precision highp float;
precision highp int;
// Default THREE.js uniforms available to both fragment and vertex shader
uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat3 normalMatrix;
// Default uniforms provided by ShaderFrog.
uniform vec3 cameraPosition;
uniform float time;
// A uniform unique to this shader. You can modify it to the using the form
// below the shader preview. Any uniform you add is automatically given a form
uniform vec3 color;
uniform vec3 secondaryColor;
uniform vec3 lightPosition;
// Example varyings passed from the vertex shader
varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUv;
varying vec2 vUv2;
void main() {
  // Calculate the real position of this pixel in 3d space, taking into account
  // the rotation and scale of the model. It's a useful formula for some effects.
  // This could also be done in the vertex shader
  vec3 worldPosition = ( modelMatrix * vec4( vPosition, 1.0 )).xyz;
  // Calculate the normal including the model rotation and scale
  vec3 worldNormal = normalize( vec3( modelMatrix * vec4( vNormal, 0.0 ) ) );
  vec3 lightVector = normalize( lightPosition - worldPosition );
  // An example simple lighting effect, taking the dot product of the normal
  // (which way this pixel is pointing) and a user generated light position
  float brightness = dot( worldNormal, lightVector );
  // Fragment shaders set the gl_FragColor, which is a vector4 of
  // ( red, green, blue, alpha ).
  gl_FragColor = vec4( mix(secondaryColor,color,brightness), 1.0 );
}
The code compiles, but the screen is blank, and I'm not sure why. Does the OpenGL implementation used in Processing have different naming? For example, I noticed that the PShader tutorial (https://processing.org/tutorials/pshader/) uses a different capitalization scheme than the site: 'modelviewMatrix' vs. 'modelViewMatrix'.
So what exactly would I need to change to get the code working? Alternatively, if anybody has some links that could point me in the right direction, it'd be greatly appreciated.
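In case it helps someone landing on the same question, here is a sketch of how the two stages might look adapted to Processing's naming conventions. This is untested and assumes the Processing 3 names transform, modelview, and normalMatrix from the PShader tutorial; the lighting is done in eye space because Processing exposes no separate model matrix.
Vert:
uniform mat4 transform;   // projection * modelview
uniform mat4 modelview;
uniform mat3 normalMatrix;

attribute vec4 position;
attribute vec3 normal;

varying vec3 vPosition;
varying vec3 vNormal;

void main() {
  vPosition = vec3(modelview * position);     // eye-space position
  vNormal = normalize(normalMatrix * normal); // eye-space normal
  gl_Position = transform * position;
}
Frag:
#ifdef GL_ES
precision highp float;
#endif

uniform vec3 color;
uniform vec3 secondaryColor;
uniform vec3 lightPosition; // now interpreted in eye space

varying vec3 vPosition;
varying vec3 vNormal;

void main() {
  vec3 lightVector = normalize(lightPosition - vPosition);
  float brightness = max(dot(normalize(vNormal), lightVector), 0.0);
  gl_FragColor = vec4(mix(secondaryColor, color, brightness), 1.0);
}
The three shader.set() calls in the sketch would stay the same.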
@hamoid Thanks for the shorter version. I'm not the Java guy here. The output on my system looks like this:
[0] "name=DroidCam Source 3,size=640x480,fps=0"
[1] "name=DroidCam Source 3,size=640x480,fps=30"
[2] "name=EasyCamera,size=1280x720,fps=30"
[3] "name=EasyCamera,size=160x120,fps=30"
So DroidCam is a virtual camera app I need; I forgot about it.
I can run your sketch with
Capture.list()[2] // EasyCamera,size=1280x720,fps=30
Yes, it successfully shows a static image from the webcam.
I don't know, e.g. you check
if (cam.available() && frame < 30) {
frame++;
which means a cam with only 22 fps would take longer than 1 sec (at 30 frames per second) to fill the shader on startup. The shader is running at 60 fps, which means a single shader cycle is two times "and a fraction" shorter than your frame++ loop.
Ergo: it could be that you are constantly feeding the stream, or you just don't recognize the effect, because many of those 22 fps frames look the same; if you break them down, e.g. via ffmpeg (I don't want to go deeply into this), motion starts at about 16 fps.
Ergo: there could be inconsistent behavior in general.
http://nuigroup.com/forums/viewthread/12405/
Sorry man, I can't help you any further; I'm only here in passing.
EDIT: I also put weird numbers in the if statement, like 23, and it works.
Hm, maybe a shader bug. Try upgrading your shader version:
Processing (in settings()):
PJOGL.profile = 3;
and in the shader:
#version 150
EDIT 2: Also, I'm on Windows, which means every loop gets unrolled (tex[0], tex[1], etc.) before being transpiled from GLSL to HLSL via ANGLE. If you are on Linux, look up, via "your favorite search engine", something like: texture array loop linux glsl bug.
Did some webcam tests; my webcam is working correctly :) Very happy about it.
// fragment shader: see above
// Processing
import processing.video.*;

PShader fx;
Capture cam;
int frame = 0;

void setup() {
  size(640, 480, P2D);
  fx = loadShader("shaders/frag.glsl");
  cam = new Capture(this, 640, 480, Capture.list()[2], 30);
  cam.start();
  frameRate(30);
}

void draw() {
  if (cam.available()) {
    if (frame % 30 == 0) frame = 0; // wrap back to slot 0
    cam.read();
    fx.set("tex[" + frame + "]", cam.copy());
    frame++;
  }
  //
  fx.set("show", 1);
  shader(fx);
  rect(0, 0, width, height);
}
// gif at 33 fps: you can see the millis jumping around 60 ms.
I leave here a full minimal program you can try:
// fragment shader
uniform sampler2D tex[30];
uniform int show;
varying vec4 vertTexCoord;

void main() {
  gl_FragColor = texture2D(tex[show], vertTexCoord.st);
}
// Processing
import processing.video.*;

PShader fx;
Capture cam;
//PImage tex[] = new PImage[30];
int frame = 0;

void setup() {
  size(640, 480, P2D);
  //for (int i = 0; i < tex.length; i++) {
  //  tex[i] = createImage(640, 480, ARGB);
  //}
  fx = loadShader("shaders/frag.glsl");
  cam = new Capture(this, 640, 480, Capture.list()[0], 30);
  cam.start();
}

void draw() {
  if (cam.available() && frame < 30) {
    cam.read();
    //A
    //tex[frame] = cam.copy();
    //fx.set("tex[" + frame + "]", tex[frame]);
    //B
    fx.set("tex[" + frame + "]", cam.copy());
    frame++;
  }
  fx.set("show", 1); // it should be stuck on frame 1
  shader(fx);
  rect(0, 0, width, height);
}
The program sends the first 30 frames to the shader. With the uniform show we tell the shader to display only frame 1. Instead, it somehow shows the live webcam, ignoring that value.
I tried keeping a copy of the images on the Processing side (commented out above), but the result was the same.
/**
* Write Shaders From Inline Strings
*
* ^;.;^
* nabr (2018/Feb/12)
*
* forum.processing.org/two/discussion/comment/117009/#Comment_117009
*/
// Note: best when the sketch is saved to your local HD, otherwise look in the tmp folder
String[] vertSource = {"#version 150"
  , "in vec4 position;"
  , "uniform mat4 transform;"
  , "void main() {"
  , "  gl_Position = transform * position;"
  , "}"
};
String[] fragSource = {"#version 150"
  , "out vec4 fragColor;"
  , "void main() {"
  , "  fragColor = vec4(1., 0., 0., 1.);"
  , "}"
};
PShader shdr;

void setup() {
  size(800, 600, P3D);
  noStroke();
  // run once at launch; wrap in /* */ afterwards to skip the file writing
  /**/
  PrintWriter O = createWriter("data/vertSource.txt");
  for (String s : vertSource) O.println(s);
  O.flush();
  O.close();
  O = createWriter("data/fragSource.txt");
  for (String s : fragSource) O.println(s);
  O.flush();
  O.close();
  exit();
  /**/
  shdr = loadShader("fragSource.txt", "vertSource.txt");
  shader(shdr);
}
void draw() {
  background(0);
  translate(width/2, height/2);
  rotateY(frameCount * .1f);
  box(100);
}
(I did some research on this topic in Processing; there is no answer to date, so I'm posting some of my discoveries for the benefit of the community.)
@jeffthompson Hello,
glShadeModel(GL_FLAT) has been deprecated for about 10 years; this is how things worked before shaders were introduced.
You can also compute the whole thing on the CPU.
It should (untested) work with Processing version <= 2.2.1.
Pseudocode: https://github.com/processing/processing/wiki/Advanced-OpenGL-in-Processing-2.x
PGraphicsOpenGL pg = (PGraphicsOpenGL) g;
PGL pgl = beginPGL();
GL2 gl2 = ((PJOGL) pgl).gl.getGL2(); // needs import com.jogamp.opengl.GL2;
gl2.glShadeModel(GL2.GL_FLAT); // enable flat shading (compatibility profile only)
Second:
"in a really complex project."
Pseudocode
//
void flatShading(float red, float green, float blue) {
  if (enableFlatShading) {
    // note: loadShader() takes the fragment file first;
    // better to load once in setup() than on every call
    PShader flatshader = loadShader("flat.frag", "flat.vert");
    flatshader.set("color", red, green, blue);
    shader(flatshader); // actually apply it
  } else {
    resetShader(); // reset to default Processing shaders
  }
}
hahah too complex. :)
And as I already wrote: it's a modern shader approach, the transpose inverse of the camera/object view matrix (the matrix math is done on the CPU beforehand). For backward compatibility, use an older-version shader; you can find tons of them on the web.
See here (works in browsers, should work everywhere):
https://github.com/glslify/glsl-face-normal
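That repo boils down to reconstructing the face normal per fragment from screen-space derivatives, which gives flat shading without touching the geometry. A minimal untested fragment-stage sketch of the same idea (vPosition is an assumed varying carrying the eye-space position from your vertex shader; on GLES you would also need the standard-derivatives extension):
varying vec3 vPosition; // eye-space position from the vertex stage

void main() {
  // dFdx/dFdy are constant across a triangle, so their cross product
  // is the face normal: every fragment of a face gets the same shade
  vec3 N = normalize(cross(dFdx(vPosition), dFdy(vPosition)));
  float brightness = max(dot(N, vec3(0.0, 0.0, 1.0)), 0.0); // simple headlight
  gl_FragColor = vec4(vec3(brightness), 1.0);
}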
If you think about the graphics pipeline: a cube, or whatever shape, creates some buffers, and then everything is sent to the shaders; in order to change the color, you have to tweak the shader. https://processing.org/tutorials/pshader/
There is probably also a way to implement flat shading directly in the Shapes3D library.
@cansik nice!
But it won't run on my machine :/
// won't run on Windows 10, Processing 3.3.6
buffer = createGraphics(mov.width, mov.height, P2D);
// possible solution
mov = new Movie(this, "city.mov");
// github.com/processing/processing-video/blob/master/examples/Movie/Frames/Frames.pde
mov.play();
mov.jump(0);
mov.loop();
print(mov.width); // 960
sobelShader = loadShader("sobelFrag.glsl");
buffer = createGraphics(mov.width, mov.height, P2D);
// etc.
mov.width is zero at init time, I guess.
(sobelFrag.glsl has to be in the data folder, just saying.)
@cansik NP.
update:
mov = new Movie(this, "city.mov");
mov.read(); // grab the first frame
mov.loop();
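An alternative (an untested sketch of the same idea): create the buffer lazily in draw(), once the movie reports its real size; buffer, mov, and sobelShader are the variables from the example below.
// mov.width is 0 until the first frame has arrived
void draw() {
  if (buffer == null && mov.width > 0) {
    buffer = createGraphics(mov.width, mov.height, P2D);
    buffer.shader(sobelShader);
  }
  if (buffer == null) return; // still waiting for the first frame
  // ... rest of draw() as in the example below
}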
If you create a P3D scene (defined in the size() method), then all of your PGraphics objects created by createGraphics() are of that kind. But if you are working with video files, you have to use a P2D PGraphics object, because video is usually 2-dimensional.
I think you have to read more about 2D and 3D offscreen canvases and when to use which one.
Here is an example of how to take an incoming movie image, render it to an offscreen canvas with shading, and then use the result as a texture for a shape in a 3D onscreen canvas. You can also download the whole sketch (with example files here).
import processing.video.*;
import peasy.*;

PShader sobelShader;
PGraphics buffer;
Movie mov;
PeasyCam cam;

float canvasWidth = 480;
float canvasHeight = 360;

void setup()
{
  size(500, 500, P3D);
  cam = new PeasyCam(this, 500);

  // load movie
  mov = new Movie(this, "city.mov");
  mov.loop();

  sobelShader = loadShader("sobelFrag.glsl");

  buffer = createGraphics(mov.width, mov.height, P2D);
  buffer.shader(sobelShader);
}

void draw()
{
  // shade the incoming movie image
  buffer.beginDraw();
  buffer.background(0, 0);
  buffer.image(mov, 0, 0);
  buffer.endDraw();

  // create 3d scene
  background(0);
  pushMatrix();

  // rotate for more interesting 3d magic
  rotateX(radians(frameCount % 360));
  rotateZ(radians(frameCount % 360));

  // use shaded video as texture in the 3d scene
  rectMode(CENTER);
  beginShape();
  textureMode(IMAGE);
  texture(buffer);
  vertex(canvasWidth / -2f, canvasHeight / -2f, 0, 0);
  vertex(canvasWidth / 2f, canvasHeight / -2f, buffer.width, 0);
  vertex(canvasWidth / 2f, canvasHeight / 2f, buffer.width, buffer.height);
  vertex(canvasWidth / -2f, canvasHeight / 2f, 0, buffer.height);
  endShape();
  popMatrix();
}

void movieEvent(Movie m) {
  m.read();
}
hello! I have the same problem here, but I can't resolve it with cam.getState().apply():
import com.hamoid.*;
import themidibus.*;
import peasy.*;
import peasy.org.apache.commons.math.*;

PShader sh;
PShader sha;
PGraphics buffer;
PGraphics buffer2;
PeasyCam cam;
float count;

void settings() {
  fullScreen(P3D);
}

void setup() {
  background(0);
  cam = new PeasyCam(this, 500);
  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
  sha = loadShader("frag2.glsl", "basicShader.glsl");
  buffer = createGraphics(width, height, P3D);
  buffer2 = createGraphics(width, height, P3D);
}

void draw() {
  background(0);
  // println(frameRate);
  count += 0.005;
  sh.set("u_time", count);

  buffer.beginDraw();
  render(buffer);
  buffer.endDraw();

  buffer2.beginDraw();
  buffer2.shader(sh);
  buffer2.image(buffer, 0, 0);
  buffer2.endDraw();

  cam.beginHUD();
  image(buffer2, 0, 0);
  cam.endHUD();
}

void render(PGraphics a) {
  cam.getState().apply(a); // change here
  a.background(0, 50);
  a.noStroke();
  // s.run(); // (s is from the larger sketch, not defined in this excerpt)
  a.sphere(200);
}
edit 1: new results, but not resolved yet. The code is unchanged from the above,
and in both shaders I added #define PROCESSING_TEXTURE_SHADER. The transformations happen in buffer2, but the geometry isn't drawn (the shader is applied to the full screen). If I use #define PROCESSING_COLOR_SHADER instead, the geometry is drawn, but it's only white.
hi! I'm trying to do a multipass shader effect. The problem is that PeasyCam doesn't apply the transformations in the buffer.
import com.hamoid.*;
import themidibus.*;
import peasy.*;
import peasy.org.apache.commons.math.*;

PShader sh;
PShader sha;
PGraphics buffer;
PGraphics buffer2;
PeasyCam cam;
float count;

void settings() {
  fullScreen(P3D);
}

void setup() {
  background(0);
  cam = new PeasyCam(this, 500);
  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
  sha = loadShader("frag2.glsl", "basicShader.glsl");
  buffer = createGraphics(width, height, P3D);
  buffer2 = createGraphics(width, height, P3D);
}

void draw() {
  background(0);
  // println(frameRate);
  count += 0.005;
  sh.set("u_time", count);

  buffer.beginDraw();
  render(buffer);
  buffer.endDraw();

  buffer2.beginDraw();
  buffer2.background(0);
  buffer2.shader(sh);
  buffer2.image(buffer, 0, 0);
  buffer2.endDraw();

  cam.getState().apply(buffer2); // here is the question: the image shows the shader, but as if it were a 2D screen

  cam.beginHUD();
  image(buffer2, 0, 0);
  cam.endHUD();
}

void render(PGraphics a) {
  a.background(0, 50);
  a.noStroke();
  // s.run(); // (s is from the larger sketch, not defined in this excerpt)
  a.sphere(200);
}
If I skip buffer2 and apply the shader directly to buffer, it works fine.
I was following this post https://github.com/jdf/peasycam/issues/25 but it doesn't work for me.
What am I doing wrong?
This was tricky indeed!
I just gave a workshop on shaders and left scratching my head over why shader patterns that depend only on vertex positions are not anchored to the object, but rotate in strange ways when the object rotates.
It is solved with hint(DISABLE_OPTIMIZED_STROKE);
Here you can see a working example online.
Here is the program that fails to work properly without the hint:
import peasy.PeasyCam;

PeasyCam cam;
PShader fx;

void setup() {
  size(600, 600, P3D);
  cam = new PeasyCam(this, 400);
  fx = loadShader("frag1.glsl", "vert1.glsl");
  hint(DISABLE_OPTIMIZED_STROKE);
}

void draw() {
  background(0);
  shader(fx);
  box(300);
}
Vertex
uniform mat4 transformMatrix;
attribute vec4 position;
varying vec3 pos;

void main() {
  gl_Position = transformMatrix * position;
  pos = vec3(position) * 0.01;
}
Fragment
#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

varying vec3 pos;

void main() {
  float c = .5 + .5*sin(17.*(pos.x + pos.y + pos.z));
  c = step(.5, c);
  gl_FragColor = vec4(c, c, c, 1.);
}
Hi. I'm trying to draw a texture with a shader, but it doesn't work. What am I doing wrong? bg is drawn correctly.
PShader mShader;
PImage bg;
PImage tex;

void setup() {
  size(640, 360, P2D);
  noStroke();
  textureMode(NORMAL);
  bg = loadImage("bg.jpg");
  tex = loadImage("smoke.png");
  mShader = loadShader("texfrag.glsl", "texvert.glsl");
  mShader.set("texture", tex);
  background(255);
}

void draw() {
  image(bg, 0, 0);
  shader(mShader);
}
texfrag.glsl:
#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

uniform sampler2D texture;

varying vec4 vertColor;
varying vec4 vertTexCoord;

void main() {
  gl_FragColor = texture2D(texture, vertTexCoord.st) * vertColor;
}
texvert.glsl:
uniform mat4 transform;
uniform mat4 texMatrix;
uniform sampler2D texture;

attribute vec4 position;
attribute vec4 color;
attribute vec2 texCoord;

varying vec4 vertColor;
varying vec4 vertTexCoord;

void main() {
  gl_Position = transform * position;
  vertColor = color;
  vertTexCoord = texMatrix * vec4(texCoord, 1.0, 1.0);
}
Or I need sample code for drawing a texture with a shader.
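One untested guess at a fix: shader(mShader) only selects the shader, it never draws anything, so the fragment stage has nothing to run on. Drawing textured geometry while the shader is active could look like this:
void draw() {
  image(bg, 0, 0);   // background, drawn with the default shader
  shader(mShader);   // now draw something the custom shader can shade
  noStroke();
  beginShape(QUADS);
  texture(tex);               // feeds the "texture" sampler
  vertex(0, 0, 0, 0);         // textureMode(NORMAL): UVs in 0..1
  vertex(width, 0, 1, 0);
  vertex(width, height, 1, 1);
  vertex(0, height, 0, 1);
  endShape();
  resetShader();     // back to the default for the next frame
}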
hello everyone. I'm trying to pass a video as a texture into a fragment shader, but the sketch crashes when I run it.
Here is the code:
import processing.video.*;
import peasy.*;
import peasy.org.apache.commons.math.*;

PeasyCam cam;
PShader sh;
float count;
Movie mov;
PGraphics p;

void setup() {
  size(1440, 900, P3D);
  mov = new Movie(this, "osc_noc.mov");
  mov.play();
  p = createGraphics(width, height);
  cam = new PeasyCam(this, 500);
  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  background(0);
  shader(sh);
  count += 0.09;
  sh.set("u_time", count);
  sphere(100);

  p.beginDraw();
  p.background(0);
  p.image(mov, 0, 0, 200, 200);
  p.endDraw();
  sh.set("tex", p);
  // image(p, 5, 260, 200, 200);
}
#version 150

uniform mat4 transform;
uniform sampler2D tex;

in vec4 position;
in vec2 texCoord;
in vec3 normal;

out vec2 TexCoord;

void main() {
  TexCoord = texCoord;
  gl_Position = transform * position;
}
#ifdef GL_ES
precision mediump float;
#endif

#define PI 3.14

in vec2 TexCoord;

uniform float u_time;
uniform sampler2D tex;

void main() {
  vec2 uv = TexCoord;
  gl_FragColor = vec4(texture(tex, TexCoord));
}
A white screen appears, and then it closes; the console just says "Finished". Could it be a bug? I can pass a PImage as a texture, but when I put this fragment and vertex program into the sketch folder, it crashes...
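A hedged guess at the crash: the vertex stage declares #version 150, but the fragment stage has no #version line, so it compiles as GLSL 1.10, where the in qualifier and texture() don't exist, and the program fails before the first frame. A matching #version 150 fragment stage would be:
#version 150

uniform float u_time;
uniform sampler2D tex;

in vec2 TexCoord;
out vec4 fragColor; // replaces gl_FragColor in GLSL 1.50

void main() {
  fragColor = texture(tex, TexCoord);
}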
update on my progress, maybe it helps somebody:
(thanks @nabr, I'm going to drive you insane)
Ok, so after hours and hours (I think I'm a little bit stupid) I made a little progress with shaders and Processing. I finally managed to use texCoord from the lighthouse tutorial:
Processing:
import peasy.*;
import peasy.org.apache.commons.math.*;

PShader sh;
PShape t;
PeasyCam cam;
float count;
float c[] = {0.0, 0.1, 1.0, 1.0};
float b[] = {1.0, 0.1, 1.0, 1.0};

void setup() {
  cam = new PeasyCam(this, 500);
  // size(500, 500, P3D);
  fullScreen(P3D);
  t = trian();
  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
}

void draw() {
  background(0);
  // cam.rotateY(0.05);
  shader(sh);
  count += 0.09;
  sh.set("u_time", count);
  // sh.set("RadiusOuter", 10.0);
  // sh.set("RadiusInner", 20.0);
  // sh.set("InnerColor", c);
  // sh.set("OuterColor", b);

  pushMatrix();
  scale(100);
  shape(t);
  noStroke();
  popMatrix();

  pushMatrix();
  translate(-width/4, 0, 0);
  sphere(200);
  popMatrix();
}

PShape trian() {
  textureMode(NORMAL);
  PShape t = createShape();
  t.beginShape(QUAD_STRIP);
  t.vertex(-1, -1, 0, 0, 1);
  t.vertex(-1, 1, 0, 1, 1);
  t.vertex(1, -1, 0, 1, 0);
  t.vertex(1, 1, 0, 0, 0);
  t.endShape();
  return t;
}
vertexShader:
uniform mat4 transform;

in vec4 position;
in vec2 texCoord;

out vec2 TexCoord;
out vec3 vertColor; // this declaration was missing; vertColor is written below

void main() {
  TexCoord = texCoord;
  vertColor = position.xyz;
  gl_Position = transform * position;
}
fragmentShader:
#ifdef GL_ES
precision mediump float;
#endif

in vec2 TexCoord;

void main() {
  gl_FragColor = vec4(TexCoord, 0.0, 1.0);
}
hello! it's me again. I'm deep in the hard process of learning GLSL. As far as I can see, there are many syntaxes and versions for writing a shader program, and it's a little bit confusing to write a shader for Processing. This time I have two questions.
1) I'm reading the OpenGL 4.0 Shading Language Cookbook. One of the first examples uses a block of uniforms. How do I set a uniform block in Processing? Because if I set the members one by one, it doesn't work.
In the fragment shader:
uniform BlobSettings {
  vec4 InnerColor;
  vec4 OuterColor;
  float RadiusInner;
  float RadiusOuter;
};
In Processing:
sh.set("RadiusOuter", 10.0 );
sh.set("RadiusInner", 20.0 );
sh.set("InnerColor", c);
sh.set("OuterColor", b);
2) Following this lighthouse tutorial http://www.lighthouse3d.com/tutorials/glsl-tutorial/texture-coordinates/ I'm trying to do the texCoord example, and it doesn't work... here is the code:
import peasy.*;
import peasy.org.apache.commons.math.*;

PShader sh;
PShape t;
PeasyCam cam;

void setup() {
  cam = new PeasyCam(this, 800);
  size(500, 500, P3D);
  t = trian();
  sh = loadShader("basicShaderFrag.glsl", "basicShader.glsl");
}

void draw() {
  background(0);
  shader(sh);
  shape(t);
}

PShape trian() {
  PShape t = createShape();
  t.beginShape(QUAD_STRIP);
  // t.fill(random(255), 0, 0);
  t.vertex(-200, -200);
  // t.fill(0, 255, 0);
  t.vertex(-200, 200);
  // t.fill(0, 0, 255);
  t.vertex(200, -200);
  t.vertex(200, 200);
  t.endShape();
  return t;
}
vertexShader:
#version 330

uniform mat4 transform;

in vec4 position;
in vec3 color;
in vec2 texCoord;

out vec3 vertColor;
out vec2 TexCoord;

void main() {
  TexCoord = texCoord;
  vertColor = color;
  gl_Position = transform * position;
}
fragmentShader:
#ifdef GL_ES
precision mediump float;
#endif

in vec3 vertColor;
in vec2 TexCoord;

void main() {
  gl_FragColor = vec4(TexCoord, 1.0, 1.0);
}
It was supposed to produce this result:
http://www.lighthouse3d.com/wp-content/uploads/2013/02/texturecoordinates.jpg
but it's only a blue rect, because the TexCoord value is 0, and I'm lost as to why... how does fragCoord work?
thanks.
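In hindsight, and as the progress post above shows, the likely cause is that the quad-strip vertices carry no UVs at all, so the texCoord attribute stays at zero, and vec4(TexCoord, 1.0, 1.0) = vec4(0, 0, 1, 1) is exactly the blue you see. Passing explicit UVs per vertex fixes it:
PShape trian() {
  textureMode(NORMAL); // UVs in the 0..1 range
  PShape t = createShape();
  t.beginShape(QUAD_STRIP);
  t.vertex(-200, -200, 0, 0, 0); // x, y, z, u, v
  t.vertex(-200,  200, 0, 0, 1);
  t.vertex( 200, -200, 0, 1, 0);
  t.vertex( 200,  200, 0, 1, 1);
  t.endShape();
  return t;
}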
@nabr That example on the Processing website... ehhh, I don't think it is the best. Is calling filter(blur) supposed to be a demo of shaders? I was puzzled at first about where to get the glsl files. Are you familiar with how filter() relates to shaders, by any chance?
@mnoble From the following link: https://processing.org/tutorials/pshader/
A PShader object is created with the loadShader() function which takes the filenames of the vertex and fragment files as the arguments. If only one filename is specified, then Processing will assume that the filename corresponds to the fragment shader, and will use a default vertex shader.
Later in the same document, it says:
You will notice that this time the loadShader() function only receives the filename of the fragment shader. How does Processing complete the entire shader program? The answer is that it uses the default vertex stage for texture shaders. As a consequence of this, and since the varying variables are first declared in the vertex stage, the fragment shader has to follow the varying names adopted in the default shader. In this case, the varying variables for the fragment color and texture coordinate must be named vertColor and vertTexCoord, respectively.
This last part is important. If you get a shader from another source, you need to adapt it to Processing's guidelines. Processing allows you to manipulate and run shaders without going through the whole exercise of setting up all the bits and pieces yourself. More info next:
If we work with a low-level toolkit in C or C++ with direct access to the OpenGL API, we are free to name the uniforms and attributes of a shader in any way we like, since we have absolute control on the way the geometry is stored in our application, and how and when it is passed over to the GPU pipeline using the OpenGL functions. This is different when working in Processing, since the shaders are handled automatically by the renderers and should be able to handle the geometry that is described with the drawing API of Processing. This doesn't imply that custom shaders must render things in the same way as Processing does by default, quite in the contrary, the use of custom shaders opens up the possibility of greatly altering the rendering pipeline in Processing. However, custom shaders meant to be used in conjunction with the standard drawing API have to follow certain naming conventions, and are bound by some limitations.
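As a concrete illustration of that naming rule, a minimal fragment-only shader for loadShader() might look like this (a sketch of my own, not from the tutorial; it relies on the default texture vertex stage and its fixed varying names):
#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D texture;

varying vec4 vertColor;     // names fixed by Processing's default vertex stage
varying vec4 vertTexCoord;

void main() {
  vec4 src = texture2D(texture, vertTexCoord.st);
  gl_FragColor = vec4(1.0 - src.rgb, src.a) * vertColor; // simple color invert
}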
I suggest you have a look at the 6.1 and 6.2 examples presented in the link above. I am not that familiar with shaders; I can only help where I can (and know)...
Kf
Hm. I cannot reproduce it.
- Processing 3.x
- Examples --> Shaders --> BlurFilter
- shader source: https://pastebin.com/raw/ny5JNCJH
//
PShader blur;

void setup() {
  size(640, 360, P2D);
  // note: I have to split the url, "https"+string, only for this forum
  blur = loadShader("https://"+"pastebin.com/raw/ny5JNCJH");
  stroke(255, 0, 0);
  rectMode(CENTER);
}

void draw() {
  filter(blur);
  rect(mouseX, mouseY, 150, 150);
  ellipse(mouseX, mouseY, 100, 100);
}
Some examples in https://processing.org/tutorials/pshader/
or
https://forum.processing.org/two/search?Search=loadshader
I don't know much about this example or about shaders. If you decide to try any of those in the above links, make sure you have the fragment (and vertex, if any) shader at the sketch directory level (which is accessed via ctrl+k in your sketch).
Kf
Hello Team,
I feel foolish that I can't figure this out. I am literally copying the code example from the Processing reference for blur:
https://processing.org/reference/PShader.html
And I get this error (in Processing 3 and 2):
"Cannot compile fragment shader: ERROR 0:9 '<' : syntax error"
I must be missing something simple, yes? :)
PShader blur;

void setup() {
  size(640, 360, P2D);
  // Shader files must be in the "data" folder to load correctly
  blur = loadShader("blur.glsl");
  stroke(0, 102, 153);
  rectMode(CENTER);
}

void draw() {
  filter(blur);
  rect(mouseX-75, mouseY, 150, 150);
  ellipse(mouseX+75, mouseY, 150, 150);
}