Continuing my previous experiments with random pattern composition, I made a web app with Processing.js.
It lets you choose one or several patterns among 65 through a simple GUI. The patterns are then randomly rotated and/or mirrored to create a new composition.
You can also export your visual as a PNG, or add your own patterns or images to play with: just drag and drop them onto the canvas, and they are added to the GUI.
I have a little math problem with a GLSL scaled translation.
I have a GSMovie playing into a GLTexture tex, and I apply a GLTextureFilter ClipFilter that translates the movie following the mouse coordinates, then applies Opacity and Scale.
The problem is: I want Scale to be applied from the center of the translation.
So, here is my program:
GLSLClip.pde:
import processing.opengl.*;
import codeanticode.glgraphics.*;
import codeanticode.gsvideo.*;

GSMovie movie;
GLTexture tex, texFiltered;
GLTextureFilter ClipFilter;
float Opacity = .5, Scale = 1.0;

void setup() {
  size(640, 480, GLConstants.GLGRAPHICS);
  tex = new GLTexture(this);
  texFiltered = new GLTexture(this);
  movie = new GSMovie(this, "station.mov");
  movie.setPixelDest(tex);
  movie.loop();
  ClipFilter = new GLTextureFilter(this, "ClipFilter.xml");
}

void movieEvent(GSMovie movie) {
  movie.read();
}

void draw() {
  if (movie.ready() && movie.width > 1) {
    if (tex.putPixelsIntoTexture()) {
      background(255, 0, 0);
      float x = map(mouseX, 0, width, -.5, .5);
      float y = map(mouseY, 0, height, -.5, .5);
      ClipFilter.setParameterValue("posXY", new float[] { x, y });
      ClipFilter.setParameterValue("Scale", Scale);
      ClipFilter.setParameterValue("Opacity", Opacity);
      tex.filter(ClipFilter, texFiltered);
      image(texFiltered, 0, 0, width, height);
    }
  }
}

void keyPressed() {
  if (key == CODED) {
    if (keyCode == UP && Opacity < 1) {
      Opacity += .1;
      println("Opacity: " + Opacity);
    }
    else if (keyCode == DOWN && Opacity > 0) {
      Opacity -= .1;
      println("Opacity: " + Opacity);
    }
    else if (keyCode == LEFT && Scale > .1) {
      Scale -= .1;
      println("Scale: " + Scale);
    }
    else if (keyCode == RIGHT && Scale < 3) {
      Scale += .1;
      println("Scale: " + Scale);
    }
  }
}
ClipFilter.xml:
<filter name="Clip Filter">
<description>Clip Filter with posXY,Scale,Opacity</description>
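For scaling about a point rather than about the texture origin, the usual trick is to remap each coordinate with p' = c + (p - c) / s, where c is the center you want to scale around (here, the translated center, i.e. vec2(0.5) + posXY in the shader) and s is Scale. I can't see the body of ClipFilter.glsl, so this is a sketch of the math in plain Processing, not your exact shader code:

```processing
// Sketch of the scale-about-a-point math (assumed shader setup, not the
// actual ClipFilter code). In the fragment shader the equivalent would be:
//   st = center + (st - center) / Scale;  // before sampling the texture
float scaleAbout(float p, float c, float s) {
  return c + (p - c) / s;
}

void setup() {
  float posX = 0.1;           // mouse-mapped offset, as in the sketch above
  float center = 0.5 + posX;  // center of the translated movie
  // a texture coordinate at the center stays put...
  println(scaleAbout(0.6, center, 2.0)); // 0.6
  // ...while other coordinates move toward it as Scale grows
  println(scaleAbout(0.8, center, 2.0)); // 0.7
}
```

The key point is the order of operations: subtract the center, divide by Scale, then add the center back, so the translation itself is unaffected by the zoom.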
I've uploaded a growing series of Processing.js sketches, including HTML5 canvas drop shadows, camera input (getUserMedia API), transparent canvas, and the Web Audio API. More will follow, such as using JavaScript libraries for GIF and PDF export from the canvas.
I'd like to use the Json4Processing library for a project, but I am stuck with an "Unhandled exception type JSONException" error when using the JSONObject.put() method.
import org.json.*;

JSONObject obj;

void setup() {
  obj = new JSONObject();
  obj.put("key", "value"); // Unhandled exception type JSONException
  println(obj);
}

void draw() {
}
The Json4Processing v0.1.3 library is correctly installed (I can import it from the Import menu and view the examples in the Examples menu), and the error appears in the examples too.
The library seems to work well with JSONArray, but the problem is with JSONObject: I can create instances, but I can't use the JSONObject.put() method.
I also tried adding "import org.json.JSONException;", with the same result.
I don't know why it doesn't work for me when it seems no one has had this problem before, so any hint is welcome.
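I can't test against Json4Processing 0.1.3 specifically, but "Unhandled exception type JSONException" is Java's standard checked-exception compile error: in org.json, JSONObject.put() declares that it throws JSONException, so the call must be wrapped in try/catch (just importing JSONException isn't enough). A minimal sketch of the fix, assuming that is indeed the cause:

```processing
// Assumption: JSONObject.put() declares the checked JSONException,
// so the call has to be wrapped in try/catch to compile.
import org.json.*;

JSONObject obj;

void setup() {
  obj = new JSONObject();
  try {
    obj.put("key", "value");
  }
  catch (JSONException e) {
    println("JSON error: " + e.getMessage());
  }
  println(obj);
}
```

If the shipped examples show the same error, they were probably written against a version of the library where JSONException was unchecked.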
I'm currently working on a night-time installation where people interact with a video projection via a camera.
Here is a quick plan of the installation:
I'll need crowd movement/blob detection (no need for Kinect-type skeleton detection). The position of the camera cannot be changed, and it has to work with Processing 1.5.1 on a Mac mini.
So, do you have any suggestions on the type (or model) of camera you would use for this, considering the low light?
I've read about using an IR illuminator plus an IR-sensitive camera, but I don't know how that integrates with Processing.
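On the integration side, an IR-sensitive camera typically shows up as an ordinary capture device, so the Processing code doesn't change; the IR illuminator only lights the scene. A minimal frame-differencing sketch with the Processing 1.5.1 video library, just to illustrate the movement-detection part (the camera name and resolution are assumptions):

```processing
// Minimal movement detection by frame differencing (Processing 1.5.1
// video library). An IR camera appears as a normal capture device.
import processing.video.*;

Capture cam;
PImage prev;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  prev = createImage(width, height, RGB);
}

void draw() {
  if (cam.available()) {
    cam.read();
    cam.loadPixels();
    prev.loadPixels();
    float movement = 0;
    for (int i = 0; i < cam.pixels.length; i++) {
      movement += abs(brightness(cam.pixels[i]) - brightness(prev.pixels[i]));
    }
    println("movement: " + movement / cam.pixels.length);
    prev.copy(cam, 0, 0, width, height, 0, 0, width, height);
    image(cam, 0, 0);
  }
}
```

For blob detection proper, the same Capture feed can be handed to a blob library; the IR setup only changes what the camera sees, not the API.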
I'd like to use a PGraphics as a mask on a PImage via PImage.mask(PGraphics).
It works fine in Processing (except for PImages loaded from PNGs with alpha, a known bug), but there seems to be a problem in Processing.js, as you can see HERE: no mask is applied to the PImage at all.
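One workaround worth trying, since it is specifically the mask(PGraphics) overload that seems to fail in the browser: convert the PGraphics to a PImage with get() before masking. A hypothetical sketch (the image file name is an assumption):

```processing
/* @pjs preload="photo.jpg"; */
// Hypothetical workaround for Processing.js: turn the PGraphics into a
// PImage via get(), then call mask(PImage) instead of mask(PGraphics).
PImage img;
PGraphics pg;

void setup() {
  size(200, 200);
  img = loadImage("photo.jpg"); // assumed asset
  pg = createGraphics(img.width, img.height, JAVA2D);
  pg.beginDraw();
  pg.background(0);
  pg.fill(255);
  pg.ellipse(pg.width / 2, pg.height / 2, 150, 150);
  pg.endDraw();
  img.mask(pg.get()); // PGraphics -> PImage before masking
}

void draw() {
  background(128);
  image(img, 0, 0);
}
```

If mask(PImage) works in Processing.js while mask(PGraphics) doesn't, this keeps the sketch compatible with both environments.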
Hi all, I've developed a worldwide Battleship game using Twitter as the game platform: https://twitter.com/#!/WorldBattleShip. Since everybody can shoot via a tweet, the map is bigger: 26x26. You just have to send a tweet to shoot, and the app will respond to you ;) Tweet to shoot: @WorldBattleShip#shoot#LetterFromAToZ-NumberFrom1To26 (ex: #D-14)
It is based on twitter4j and twitpic4p. It's still in early development, but you can try it ;) (if the app does not answer, it means my computer is not online, but the game will answer you as soon as I turn my computer back on).
The game itself isn't an end in itself, but more a creative way of playing with Processing and Twitter, different from the common Twitter data viz, which can be really great (but that's not the subject here).
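For the curious, the shot format above can be parsed with a one-line regex. This is a hypothetical helper for illustration, not the actual @WorldBattleShip code:

```processing
// Hypothetical parser for a shot tweet like "#D-14": returns {col, row}
// as 0-based indices on the 26x26 map, or null if the tweet doesn't match.
int[] parseShot(String tweet) {
  String[] m = match(tweet, "#([A-Z])-([0-9]+)");
  if (m == null) return null;
  int col = m[1].charAt(0) - 'A'; // A..Z -> 0..25
  int row = int(m[2]) - 1;        // 1..26 -> 0..25
  if (row < 0 || row > 25) return null;
  return new int[] { col, row };
}

void setup() {
  int[] shot = parseShot("@WorldBattleShip#shoot#D-14");
  println(shot[0] + ", " + shot[1]); // 3, 13
}
```

Processing's built-in match() returns the full match at index 0 and the capture groups after it, which keeps the parsing down to a few lines.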
Hello,
I'm having some trouble with the PSurf library, which seems to be exactly what I need (fast image recognition). The examples that come with the library are only about finding interest points in an image, not about comparing them with another image, nor about finding one image inside another or in a camera stream.
So, if someone could give me a hint, or anyone who has succeeded with PSurf could share a short example, that would be great.
Hi,
I've been working on a 3D spinning-top generator using OpenGL, Shapes3D, and Unlekker.
It works great when running from Processing (except for some mesh problems I have to fix in Rhino3D before printing the .stl; any advice is welcome).
The problem is with the exported applet:
http://makio.free.fr/wp-content/uploads/ProcessingApplet/SpinningTopGen/ (sources available, exported with Processing 1.0.9, signed jars). It works on some machines and not on others, and I can't figure out why.
Could you tell me if it works for you, and/or what the problem could be?
Thanks.
nb: commands:
space: new random shape
up/down: new random shape with one more/fewer control point
left/right: change mode
s: export the .stl (only when not running online)
d: straight-lines mode