This paper announces an ongoing research project on instant composition for computer-controlled acoustic instruments performed by humans. More specifically, it concerns the composition of original works for two computer-controlled instruments,
a Yamaha Disklavier and a MIDI-controlled organ, employing LiveLily -- a system for live scoring/sequencing through live coding in a Lilypond-like language -- as well as a Recurrent Neural Network generating patterns on the fly during performance.
Here, we review the state of the art, underscoring the urgency of research in this subarea, outline a method, and present preliminary findings as a proof of concept. Finally, we sketch a research plan to be pursued over the coming months.