Author |
Topic: system constraint (Read 1061 times)
benelek
|
system constraint
« on: Jan 20th, 2003, 10:51am »
|
A lot of programs that accumulate complexity over the course of a calculation tend to increase their processor (proce55or?) usage until suddenly your whole operating system has slowed to a crawl, and you struggle just to shut the program down. Fry's Valence and Reas's Tissue reach this point after a few minutes on most computers, although by then they've become every bit as beautiful as they are slow, and I have to talk myself into closing them. It would be even better if we could design in such a way that the program keeps running while its complexity increases; perhaps things like the framerate could be tailored to stay under a predetermined ceiling on processor usage.

So here's my question, cast out to the visually and systemically literate people out there: what are some things you wouldn't mind compromising in an increasingly complex program, in order for it to keep running in the long term? I've suggested framerate, but there must be other alternatives out there in the research houses of cyber culture. Would you, for instance, mind programming in a more modular structure, to allow quantized processor usage?
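One way to picture the framerate compromise is a per-frame time budget: rather than updating every element every frame, update only as many as fit in a fixed slice of CPU time, and let the rest wait. A minimal sketch in plain Java (not the actual Processing API — the `FrameBudget` class, the toy `update` function, and the 8 ms budget are all my own illustrative choices):

```java
import java.util.ArrayList;
import java.util.List;

public class FrameBudget {
    // Fixed slice of update work allowed per frame (an arbitrary example value).
    static final long BUDGET_NANOS = 8_000_000L; // ~8 ms

    // Toy "particle": just burns a little CPU when updated.
    static double update(double seed) {
        double x = seed;
        for (int i = 0; i < 10_000; i++) x = Math.sin(x) + 1.0001;
        return x;
    }

    // Update particles round-robin, starting at `start`, until the budget is
    // spent or every particle has been touched once; returns how many were updated.
    static int runFrame(List<Double> particles, int start) {
        long deadline = System.nanoTime() + BUDGET_NANOS;
        int updated = 0;
        while (updated < particles.size() && System.nanoTime() < deadline) {
            int i = (start + updated) % particles.size();
            particles.set(i, update(particles.get(i)));
            updated++;
        }
        return updated;
    }

    public static void main(String[] args) {
        List<Double> particles = new ArrayList<>();
        for (int i = 0; i < 5_000; i++) particles.add((double) i);

        int cursor = 0;
        for (int frame = 0; frame < 5; frame++) {
            int n = runFrame(particles, cursor);
            cursor = (cursor + n) % particles.size();
            System.out.println("frame " + frame + ": updated " + n
                    + " of " + particles.size());
        }
        // However large the system grows, each frame costs at most roughly
        // BUDGET_NANOS, so the OS stays responsive; the compromise is how
        // fresh each individual element is, not whether the program runs.
    }
}
```

The trade here is exactly the "quantized processor usage" idea: as the system grows, individual elements get updated less often, but total CPU per frame stays bounded.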
« Last Edit: Jan 20th, 2003, 10:54am by benelek »
fry
|
Re: system constraint
« Reply #1 on: Jan 20th, 2003, 2:48pm »
|
oh my, that slowdown would be a bug.