Dear friends, I'll be in Taiwan next week, talking about my VJ software works (made in Processing and openFrameworks) in art and educational contexts. You're all invited!
Audiovisual Art Movement - Quase-Cinema Art Education Workshop
Lecture: November 1, 2012, 10:00 (free seating)
Workshop: November 1, 2012, from 13:30 to November 3, 2012, 18:00
Location: Taipei National University of the Arts, Arts Biodome, Room K501
Workshop Agenda
Day One: 2012/11/01 (Thu), 13:30 to 18:00
VJing and videoart practices overview
Software as a work of art / art tool
Experimental film, live cinema
Quase-Cinema VJ software usage
Day Two: 2012/11/02 (Fri), 09:30 to 18:00
Audio and video improvisation as a tool for arts education
Nonlinear narrative structure
Movie library preparation
Performance planning
Equipment showcase
Day Three: 2012/11/03 (Sat), 09:30 to 18:00
Quase-Cinema VJ software development
In-depth look at Quase-Cinema / Java source code
Supervisors: National Science Council; Ministry of Education Teaching Excellence Project
Organizers: World Culture Portal Asia Pacific Secretariat; Taipei National University of the Arts, Art and Technology Center; Taipei Contemporary Art Center
Co-organizers: Digital Archives and Digital Learning National Science and Technology Program; Digital Archives and Learning Promotion and International Cooperation Program
I'm making a dictionary-like applet for ancient Greek poetry.
I need to check which word the user clicked (or is mousing over) on each line. Each line has around five words, and the font is not monospaced.
I'm considering hard-coding the pixel size and position of every word. Any better ideas?
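One alternative to hard-coding: measure the word boundaries from the font itself at runtime. Inside a Processing sketch, `textWidth()` gives per-string advances; the same idea in plain Java uses `java.awt.FontMetrics`. A minimal sketch, assuming left-aligned text and single-space separators (the sample line and coordinates below are made up for illustration):

```java
import java.awt.Font;
import java.awt.FontMetrics;
import java.awt.image.BufferedImage;

public class WordHit {
    // Return the index of the word under mouseX on a line drawn
    // left-aligned starting at lineX, or -1 if no word is hit.
    // (Which line the cursor is on can be found separately from
    // mouseY and the line height.)
    static int wordAt(String line, int lineX, int mouseX, FontMetrics fm) {
        String[] words = line.split(" ");
        int x = lineX;
        int spaceW = fm.stringWidth(" ");
        for (int i = 0; i < words.length; i++) {
            int w = fm.stringWidth(words[i]);
            if (mouseX >= x && mouseX < x + w) return i;
            x += w + spaceW; // advance past the word and its trailing space
        }
        return -1;
    }

    public static void main(String[] args) {
        // A throwaway image just to obtain FontMetrics headlessly.
        BufferedImage img = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
        FontMetrics fm = img.getGraphics()
                           .getFontMetrics(new Font("Serif", Font.PLAIN, 24));
        System.out.println(wordAt("menin aeide thea", 10, 12, fm));
    }
}
```

Since the widths come from the metrics of the actual font, this stays correct if the typeface or point size changes, which hard-coded pixel positions would not.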
I've released the code for my new live video performance application "Quase-Cinema Feijoada Remix".
The project is at its very beginning, but the source is open and it already runs fine. The software is meant to replace my previous performance app, Quase-Cinema 2, made in openFrameworks (also open source).
It allows mixing four video layers, with pixel effects, video mapping (quads and Bézier surfaces), BPM-based rhythm creation and MIDI control.
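As a rough illustration of the layer-mixing idea (my own sketch of simple "over" compositing with per-layer opacity, not Quase-Cinema's actual code, which does this on the GPU via OpenGL), each output pixel is a weighted blend of the stacked layers' pixels:

```java
public class LayerMix {
    // Blend layers bottom-to-top over packed 0xAARRGGBB pixels.
    // opacity[l] in 0..1 is the mix amount of layer l.
    static int[] mix(int[][] layers, float[] opacity, int numPixels) {
        int[] out = new int[numPixels];
        for (int p = 0; p < numPixels; p++) {
            float r = 0, g = 0, b = 0;
            for (int l = 0; l < layers.length; l++) {
                int c = layers[l][p];
                float a = opacity[l];
                // new = old * (1 - a) + layer * a, per channel
                r = r * (1 - a) + ((c >> 16) & 0xFF) * a;
                g = g * (1 - a) + ((c >> 8) & 0xFF) * a;
                b = b * (1 - a) + (c & 0xFF) * a;
            }
            out[p] = 0xFF000000 | ((int) r << 16) | ((int) g << 8) | (int) b;
        }
        return out;
    }

    public static void main(String[] args) {
        int[] red  = { 0xFFFF0000 };
        int[] blue = { 0xFF0000FF };
        // Blue at 50% over opaque red: half red, half blue.
        int[] out = mix(new int[][] { red, blue }, new float[] { 1f, 0.5f }, 1);
        System.out.printf("%08X%n", out[0]); // FF7F007F
    }
}
```

In a real mixer the inner loop runs per frame over every pixel of every layer, which is why doing it in OpenGL shaders (as the new Processing 2.0 renderer allows) matters for performance.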
It's built on Processing 2.0 (alpha 4), and performance is smoking, thanks to the new 2.0 architecture, GSvideo and OpenGL. I'm really enjoying making this in Processing, as the whole environment and community are very mature.
Many thanks to all the people who have made this possible:
- Ben Fry, Casey Reas and the Processing development team
- Andreas Schlegel (controlP5 and sDrop)
- Andres Colubri (GSvideo)
- Damien Di Fede (Minim)
- Marcin Ignac (Projected Quads)
- Patrick Saint-Denis (mappingtools)
- Severin Smith (the MidiBus)