I'm trying to write a simple program which draws a line from the centre of the screen to a random point within the window. Then, for now, I want to draw an ellipse somewhere along that line, using the slope-intercept formula y = mx + b to track the line. So far, so good.

However, now I want to ensure that this ellipse isn't drawn too close to the centre. I used a while loop to check the distance between the ellipse's centre coordinates and the centre of the screen, but it quite regularly (though not always) gets stuck in an infinite loop. It seems x and y fairly often come out as 100.0, i.e. the centre of the screen. Code below (beware of the infinite loop if you decide to run it):


```java
size(200, 200);
background(#13212C);
smooth();
stroke(255);

float x1 = width / 2;
float y1 = height / 2;
float x2 = random(0, width);
float y2 = random(0, height);

// re-roll the endpoint until it's outside the middle of the window
while (x2 > (width / 4) && x2 < ((width / 4) * 3) && x2 > (height / 4) && x2 < ((height / 4) * 3)) {
  x2 = random(0, width);
  y2 = random(0, height);
}

line(x1, y1, x2, y2);

float m = (y2 - y1) / (x2 - x1); // slope
float b = y1 - (m * x1);         // y-intercept

float x = random(x1, x2); // pick an x along the line...
float y = (m * x) + b;    // ...and put y on the line
float nodeProx = dist(x, y, x1, y1);
println("nodeProx" + nodeProx);
println("x" + x + ", y" + y);

// re-roll x until the node is at least 20px from the centre
while (nodeProx <= 20.0) {
  x = random(x1, x2);
  y = (m * x) + b;
  nodeProx = dist(x, y, x1, y1);
  println("x" + x + ", y" + y);
  println(nodeProx);
}

println("m = " + m);
println("b = " + b);

noStroke();
fill(255);
ellipse((width / 2), (height / 2), 20, 20); // central ellipse
fill(255, 0, 0);
ellipse(x, y, 5, 5); // 'node' ellipse
```
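To double-check my maths in isolation, I also rewrote just the point-picking step as plain Java (the concrete numbers here are my own stand-ins, not from the sketch: 100 is width/2 and height/2 in a 200×200 window, and Math.hypot stands in for Processing's dist()):

```java
public class NodeMaths {
    public static void main(String[] args) {
        // centre of a 200x200 window, as in my sketch
        float x1 = 100f, y1 = 100f;
        // an example endpoint (in the sketch this is random)
        float x2 = 180f, y2 = 40f;

        // slope and intercept of the line through (x1, y1) and (x2, y2)
        float m = (y2 - y1) / (x2 - x1);
        float b = y1 - (m * x1);

        // pick an x between the endpoints and put y on the line
        float x = 140f; // in the sketch: random(x1, x2)
        float y = (m * x) + b;

        // distance from the node back to the centre (Processing's dist())
        float nodeProx = (float) Math.hypot(x - x1, y - y1);

        System.out.println("y on line: " + y);       // 70.0
        System.out.println("nodeProx: " + nodeProx); // 50.0
    }
}
```

With those numbers the maths checks out, so the line-following part of the formula seems fine on its own.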

I thought I'd guarded against the infinite loop by re-randomising x inside the while loop, recalculating y, and reassigning nodeProx to the new distance. However, nodeProx keeps printing 0.0 and never changes, despite being reassigned on every pass through the loop.
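Writing out the algebra, I think I can at least see why 0.0 is what gets printed whenever x lands on 100.0: substituting b = y1 − m·x1 back into y = mx + b gives y = m·x1 + (y1 − m·x1) = y1, so the node sits exactly on the centre and the distance is 0. A quick sanity check of that identity in plain Java (the slope value is arbitrary, my own number):

```java
public class CentreIdentity {
    public static void main(String[] args) {
        float x1 = 100f, y1 = 100f;
        float m = -0.75f;        // any slope works; this one is arbitrary
        float b = y1 - (m * x1); // intercept as in my sketch

        // if x ever comes out equal to x1...
        float x = x1;
        float y = (m * x) + b;   // ...then y collapses back to y1

        float nodeProx = (float) Math.hypot(x - x1, y - y1);
        System.out.println("y = " + y + ", nodeProx = " + nodeProx); // y = 100.0, nodeProx = 0.0
    }
}
```

So the reassignment isn't the problem as such; once x equals x1, y is forced back to y1 and the distance stays 0 no matter how often it's recalculated. What I don't understand is why x keeps coming out at exactly 100.0 in the first place.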

I'm relatively new to Processing and my maths is extremely rusty. I suspect this is more a maths problem than a programming problem, but having managed to get this far, I'm a little stuck.

Can anyone see the error of my ways? Is there a simpler solution for drawing a random point along a randomly-generated line?

Many thanks in advance!
