How can I load these Twitter URLs that open normally in my browser, but not when I try to access them with Processing?
Hi!
I'm writing a little program to fetch images from Twitter, which should be very simple, I think. But I'm not very experienced with programming, so it's giving me a headache, and I'm getting tired now.
I have a few problems here and there, but the first thing I want to ask about is loading URLs for Twitter images, like this one: t.co/mAtQ0Q8cot. It works if you open it in the browser, but if I try to fetch its contents using loadStrings() or Client.write() I get "The file "t.co/mAtQ0Q8cot" is missing or inaccessible, make sure the URL is valid or that the file has been added to your sketch and is readable." or "java.net.UnknownHostException: t.co/mAtQ0Q8cot ...."
I want to read the content of the URL as text, and I also want to be able to handle the error if the content disappears or if the program gets a badly formatted web address.
Here is my code for this part (three different versions I have tried for extracting the HTML text):
import processing.net.*;

void setup() {
  loadstrings3(); // <- Here I change the number accordingly.
}

void loadstrings1() {
  String[] html = new String[0];
  html = loadStrings("t.co/mAtQ0Q8cot");
  if (html == null) {
    println("Null value");
    html = new String[2];
    html[0] = "an";
    html[1] = "error";
  } else {
    println("html: " + html);
  }
}

void loadstrings2() {
  try {
    String[] html = loadStrings("t.co/mAtQ0Q8cot");
    println("html: " + html);
  }
  catch (Exception e) { // <- This doesn't work at all...
    println("Exception e");
    String[] html = new String[2];
    html[0] = "an";
    html[1] = "error";
  }
}

void loadstrings3() {
  String html;
  Client c = new Client(this, "t.co/mAtQ0Q8cot", 80);
  c.write("GET / HTTP/1.0\r\n"); // Use the HTTP "GET" command to ask for a web page
  c.write("\r\n");
  if (c.available() > 0) { // If there's incoming data from the client...
    html = c.readString(); // ...then grab it and print it
    println(html);
  }
}
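My best guess so far is that loadStrings() needs the full address with the protocol in front, so it treats the string as a web address instead of a local file. Here is a minimal sketch of what I'm imagining (untested, and I'm not sure whether the redirect from t.co will be followed):

// Guess at a fourth version: same idea as loadstrings1(), but with the
// http:// prefix added so Processing fetches it over the network.
void loadstrings4() {
  String[] html = loadStrings("http://t.co/mAtQ0Q8cot"); // note the http:// prefix
  if (html == null) {
    // loadStrings() returns null instead of throwing when the page can't be read
    println("Could not load the page");
  } else {
    println(join(html, "\n")); // print the whole response as one block of text
  }
}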
Also, how long should I wait before asking for another URL? If I hit the server too often (I intend to download about 1000 of these URLs and pics XD), it won't work right?
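In case it matters, this is roughly how I was planning to loop over my list, with a pause between requests. The list contents and the 2-second wait are just placeholders I made up, since I don't know what a polite interval actually is:

// Rough sketch of fetching a whole list with a pause between requests.
String[] shortUrls = { "http://t.co/mAtQ0Q8cot" /* ...the rest of my list... */ };

void fetchAll() {
  for (String url : shortUrls) {
    String[] html = loadStrings(url);
    if (html == null) {
      println("Failed to load: " + url);
    } else {
      println("Loaded " + url + " (" + html.length + " lines)");
    }
    delay(2000); // wait 2 seconds before the next request (a number I picked arbitrarily)
  }
}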
Thanks!
---
For if reason has once undoubted right on its side, it will not allow itself to be confined to set limits, by vague recommendations of moderation. -Immanuel Kant