With the release of Processing 2.0, the integration of GLSL shaders through the PShader class has sort of forced me into properly learning GLSL (not a bad thing, just a steep learning curve), but it has thrown up a few questions; if anybody has a pointer it would be much appreciated. A brief outline of what I'm trying to achieve: I took one of the examples from the http://thndl.com GLSL tutorials and started hacking away to try and make an audio visualization rendered entirely in a fragment shader.
To do this I pass the left and right audio buffer data as float arrays into a GLSL shader and then use them to draw the waveforms. The first question: when trying to index into the array containing the audio data I keep getting runtime errors. I think this is to do with dynamic access, and that the index needs to be a constant-index expression. I tried to get around this by copying the data into a new array of floats, but I kept getting a redefinition error saying that I cannot convert a uniform float[] to a float[].
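For context, GLSL ES 1.00 (the #ifdef GL_ES path in the shader below) only allows uniform arrays to be indexed by constant-index expressions, which is why a computed index like audioLeft[int(x)] can fail. The usual workaround, sketched below (the helper name sampleLeft is my own), is to loop over the array with a constant bound and compare the loop counter against the wanted index, since a restricted for-loop counter counts as a constant-index expression:

```glsl
// Hypothetical helper: fetch audioLeft[index] without dynamic indexing.
// The loop bound is a compile-time constant, so each audioLeft[i] access
// uses the loop counter, which GLSL ES 1.00 permits as a constant index.
float sampleLeft(int index) {
    for (int i = 0; i < 1024; i++) {
        if (i == index) {
            return audioLeft[i];
        }
    }
    return 0.0;
}
```

This is slow (the loop runs per fragment), so a texture lookup via texture2D on a 1024x1 audio texture is the more common approach, but the loop is the smallest change to the code as it stands.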
I've posted the code below; if anybody could point me in the right direction, the last 5 hours of slowly removing my hair will not have been in vain.

Rich
GLSL Fragment shader
#ifdef GL_ES
precision highp float;
precision highp int;
#endif
#define PROCESSING_COLOR_SHADER
uniform float time;
uniform vec2 resolution;
uniform vec2 mouse;
uniform float audioLeft[1024];
uniform float audioRight[1024];
uniform float bufferSize;
// Required by Processing's default vertex shader
varying vec4 vertColor;
// f will be used to store the color of the current fragment
vec4 f = vec4(1.,1.,1.,1.);
void main(void)
{
// c will contain the position information for the current fragment (from -1,-1 to 1,1)
vec2 c = (gl_FragCoord.xy / resolution) * 2.0 - 1.0;
gl_FragColor = f;
}
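For completeness, the sketch side looks roughly like this (simplified: the shader filename is a placeholder, the audio-buffer copy is elided, and the exact PShader.set() overload for float arrays should be double-checked against the PShader reference; I believe the three-argument form takes the number of coordinates per element):

```java
PShader waveShader;
float[] audioLeft = new float[1024];
float[] audioRight = new float[1024];

void setup() {
  size(640, 360, P2D);
  waveShader = loadShader("waveform.glsl");  // placeholder filename
}

void draw() {
  // ... copy the current left/right audio buffers into the arrays here ...
  waveShader.set("time", millis() / 1000.0f);
  waveShader.set("resolution", float(width), float(height));
  waveShader.set("audioLeft", audioLeft, 1);    // 1 coord per element -> float[]
  waveShader.set("audioRight", audioRight, 1);
  waveShader.set("bufferSize", 1024.0f);
  shader(waveShader);
  rect(0, 0, width, height);  // full-screen quad for the fragment shader
}
```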