Hi! I want to convert this output to a Unix timestamp,
but I can't figure out how... It's the current system date with zero hours/minutes/seconds.
Can someone help me with an example? It would be much appreciated! Best regards!
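Since the sketch runs on Java, here is a minimal sketch of one way to do it with `java.util.Calendar`: zero out the time fields of the current date, then divide the epoch milliseconds by 1000. The class and method names below are my own illustration, not from the original post.

```java
import java.util.Calendar;

public class MidnightTimestamp {
    // Returns today's date at 00:00:00 (local time) as a Unix
    // timestamp in seconds.
    static long midnightTimestamp() {
        Calendar cal = Calendar.getInstance();   // current system date/time
        cal.set(Calendar.HOUR_OF_DAY, 0);        // zero the time fields
        cal.set(Calendar.MINUTE, 0);
        cal.set(Calendar.SECOND, 0);
        cal.set(Calendar.MILLISECOND, 0);
        return cal.getTimeInMillis() / 1000L;    // millis -> seconds
    }

    public static void main(String[] args) {
        System.out.println(midnightTimestamp());
    }
}
```

The same lines work unchanged inside a Processing sketch, since Processing sketches compile to Java.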
Hi! I'm a student working on my thesis project, using OpenPaths, and I'm having a problem when trying to use the data returned from the API. In the "blprintOpenPathsExample.pde" example there is an attempt to convert response.getBody() into a JSONArray, which gives me the error "JSONArray is ambiguous". After looking at the javadocs, I found that response.getBody() returns a String. So, as a workaround, I'm saving that string dynamically as a local "data.json" file and then loading it back as a JSONArray. But I would like to convert it directly, without having to save a local file. I'm sure it's possible... Am I doing something wrong? Can someone help me? It would be much appreciated!
Best regards!
Here is my code:
final String ACCESS = "";
final String SECRET = "";
final String URL = "https://openpaths.cc/api/1";
void openPaths()
{
  OAuthService service = new ServiceBuilder()
    .provider(OpenPathsApi.class)
    .apiKey(ACCESS)
    .apiSecret(SECRET)
    .build();
  OAuthRequest request = new OAuthRequest(Verb.GET, URL);
  // OpenPaths uses two-legged OAuth, so signing with an empty token works:
  service.signRequest(new Token("", ""), request);
  Response response = request.send();
  // response.getBody() is a String; parseJSONArray() (built into recent
  // Processing 2 releases) parses it directly, so no temporary file is
  // needed. The fully qualified type sidesteps the "JSONArray is ambiguous"
  // clash between Processing's JSONArray and the one the OAuth library
  // pulls in.
  processing.data.JSONArray data = parseJSONArray(response.getBody());
}
My problem is that I want to use a PS3 Eye, as I did in the past, on Mac OS X 10.6.8 with Processing 2.0b8.
Before, with Processing 1.5, I used Macam (macam.component). Now, as I understand it, Processing stopped using QuickTime as of version 2.0, but if I'm not wrong, macam.component relies on QuickTime... So now I don't know how I am going to use my PS3 Eye in my latest project with Processing 2.0b8.
Does anybody know how I can use PS3 Eye Camera with Processing 2.0b8?
Which is better for Android development? Processing or OF?
Which is lighter weight on the system, and which consumes less battery...
It is going to be an app that has to be "on" 24/7, running in the system background...
Between these two frameworks, what are the pros and cons for Android development in particular?
I'm working on a sketch where I'm using an Arduino as an interface and want to trigger some sounds with it.
My problem is that when I trigger the sound before I start the Arduino port, in setup(), it works fine, but when I try to trigger the sound after I start the Arduino port, as in draw(), it won't make any sound.
Is there any incompatibility when using these two libraries together?
When I flip the video capture from my webcam to get the mirror effect, the overlaid objects don't "flip": they move the opposite way from where they originally went before flipping. I know this happens because I am feeding the library function the instance from the webcam, and not the flipped frame. The problem is that when I try to feed the function the flipped frame, it doesn't recognize the marker.
Please help, it's for a school project.
Thanks for your time.
Here is my code:
import processing.video.*;
import jp.nyatla.nyar4psg.*;
import processing.opengl.*;
Capture webcam;
SingleARTKMarker realidade_aumentada;
Cubo cubo1;
void setup() {
  try {
    quicktime.QTSession.open();
  }
  catch (quicktime.QTException qte) {
    qte.printStackTrace();
  }
  size(640, 480, OPENGL);
  String[] devices = Capture.list();
  //println(devices);
  webcam = new Capture(this, width, height, devices[4], 30);
  //webcam.settings();
  realidade_aumentada = new SingleARTKMarker(this, width, height, "camera_para.dat", SingleARTKMarker.CS_LEFT);
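One way around the flip problem described above is to keep handing the tracker the original, unflipped frame and only mirror a *copy* of the pixel buffer for display. Below is a minimal plain-Java sketch of the mirroring step itself, operating on an int[] of packed pixels in the same row-major layout Processing's Capture.pixels uses; the class and method names are my own illustration, not part of any library.

```java
public class MirrorFrame {
    // Returns a horizontally flipped copy of a w-by-h pixel array,
    // leaving the original untouched so it can still be handed to the
    // marker tracker unflipped.
    static int[] mirror(int[] pixels, int w, int h) {
        int[] flipped = new int[pixels.length];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                // pixel at column x comes from column (w - 1 - x)
                flipped[y * w + x] = pixels[y * w + (w - 1 - x)];
            }
        }
        return flipped;
    }

    public static void main(String[] args) {
        int[] frame = {1, 2, 3, 4, 5, 6};  // a tiny 3-wide, 2-high "frame"
        int[] m = mirror(frame, 3, 2);
        System.out.println(java.util.Arrays.toString(m)); // [3, 2, 1, 6, 5, 4]
    }
}
```

An alternative, if only the on-screen look matters, is to draw the scene through a scale(-1, 1) transform while the detection still runs on the raw frame; in that case the detected marker transform has to be mirrored as well.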