[flocking] Decoupling Signal Generation from Audio Domain

Colin Clark colin at colinclark.org
Fri May 5 18:07:52 UTC 2017


Hi Markus,

I'm glad to hear it was so helpful. I'm looking forward to seeing what you're developing!

Can you elaborate a bit more on the problem you're currently facing? The scope unit generator is only intended for visualizing the realtime signal flow. Do you want to draw an arbitrary buffer into a canvas element, or something like that? If so, you can use the flock.view.scope component or the flock.view.drawBuffer() utility function. flock.view.drawBuffer() takes a buffer, creates a new canvas element, draws the buffer into it, and returns the canvas. flock.view.scope requires you to create your own canvas element and a fixed "values" array; whenever you call its refreshView() method, it redraws the output. It's used internally by the scope unit generator, and might be closer to what you want.
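
Here's a rough, untested sketch of how you might use them. The argument and option names below are my best guess, so please double-check them against the source in gfx.js (linked below):

// Draw an arbitrary buffer into a brand new canvas element and add it to the page.
// ("myBuffer" stands in for whatever Float32Array of samples you want to display.)
var canvas = flock.view.drawBuffer(myBuffer);
document.body.appendChild(canvas);

// Or, using the scope component with your own canvas element and a fixed
// "values" array that you fill yourself. The option names here ("canvas",
// "values") may not be exactly right, so compare with how the scope unit
// generator sets this up.
var values = new Float32Array(1024);
var scopeView = flock.view.scope({
    canvas: document.querySelector("#signal-canvas"),
    values: values
});

// ...copy the samples you want to display into "values", then redraw:
scopeView.refreshView();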

Unfortunately there's no documentation for these parts of Flocking, so you'll have to (I'm sorry to say) work your way through the source to understand how to use them:

https://github.com/colinbdclark/Flocking/blob/master/src/gfx.js

Let me know if you have any questions and I'll try to help!

Colin

> On May 1, 2017, at 11:07 AM, Markus Tretzmüller <a1528519 at unet.univie.ac.at> wrote:
> 
> Hi Colin,
> 
> Wow, this was quite helpful. We merged your code (slightly adapted) into a live playground, and we are now able to control our light setup with the synthesized signals. This application is intended to be a tool for light/visual jockeys. I’m currently stuck on visualizing the signal on a canvas each time the synthDef changes or is loaded. The ugen.scope (as intended) only renders the currently processed value, but we need to prerender the signal for a given timespan. Can this be accomplished? Thanks for the feedback and the great support.
> 
> Best regards! 
> 
>> On Apr 16, 2017, at 6:18 PM, Colin Clark <colin at colinclark.org> wrote:
>> 
>> Hi Markus,
>> 
>> I'm glad to hear you're interested in Flocking. It's not quite clear to me exactly how you will translate audio signals into something suitable for lights and stroboscopes, so I can't really make any recommendations for you. But I think there are a few different ways you could do this with Flocking. Are you planning to use a web browser or Node.js?
>> 
>> One option, if you need to extract signals at audio rate, would be to write a "sink" unit generator that takes incoming audio signals and, rather than outputting them to the speakers, operates on them however you need in order to activate your lights. This will be synchronous, so you'll want to make sure whatever operations you're performing return quickly enough to avoid dropouts.
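>> 
>> Here's a very rough, untested sketch of what such a sink could look like. It follows the general shape of Flocking's built-in unit generators from memory, so please compare it against the real ones in the Flocking source before relying on any of the names or registration details:
>> 
>> flock.ugen.lightSink = function (inputs, output, options) {
>>     var that = flock.ugen(inputs, output, options);
>> 
>>     that.gen = function (numSamps) {
>>         var source = that.inputs.source.output,
>>             out = that.output,
>>             i;
>> 
>>         for (i = 0; i < numSamps; i++) {
>>             // Do whatever you need with each sample here (e.g. buffer it up
>>             // for your light controller). Keep this fast to avoid dropouts.
>>             out[i] = source[i];
>>         }
>>     };
>> 
>>     that.onInputChanged();
>>     return that;
>> };
>> 
>> // The exact way defaults are registered may differ in your version of
>> // Flocking; check how the built-in unit generators do it.
>> flock.ugenDefaults("flock.ugen.lightSink", {
>>     rate: "audio",
>>     inputs: {
>>         source: null
>>     }
>> });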
>> 
>> But perhaps more appropriately, I can imagine that you don't really need to control your lights at audio rate at all, but at some much slower frequency, perhaps 60 Hz or so. In that case, you could create a flock.synth.frameRate synth, which is intended to be evaluated manually at non-audio rates (you can configure the frequency with the fps option). You won't typically add a frame rate synth to the Flocking environment; instead, you'll manually invoke its value() method at whatever clock rate you need. You may well want to use a separate scheduler of some kind, such as Bergson (https://github.com/colinbdclark/bergson). This is how I use Flocking to produce signals for my videos (e.g. https://vimeo.com/166157781).
>> 
>> Here's a basic sketch of roughly what you might do:
>> 
>> var frameRate = 60;
>> 
>> // Create a scheduler powered by the browser's requestAnimationFrame callback.
>> var scheduler = berg.scheduler({
>>     components: {
>>         clock: {
>>             type: "berg.clock.requestAnimationFrame",
>>             options: {
>>                 freq: frameRate
>>             }
>>         }
>>     }
>> });
>> 
>> scheduler.start();
>> 
>> // Create a frame rate synthesizer. This one produces one full period of a sine wave every two seconds.
>> var synth = flock.synth.frameRate({
>>     fps: frameRate,
>>     addToEnvironment: false,
>>     synthDef: {
>>         ugen: "flock.ugen.sinOsc",
>>         freq: 0.5 // 0.5 Hz: one full cycle every two seconds.
>>     }
>> });
>> 
>> // Schedule an action to extract the value of your synth and do something with it every frame.
>> scheduler.schedule({
>>     type: "repeat",
>>     freq: frameRate,
>>     callback: function (now) {
>>         var currentValue = synth.value();
>>         // Do something with your current value, like send it to a light.
>>     }
>> });
>> 
>> Good luck with your project, and let me know how it goes!
>> 
>> Colin
>> 
>>> On Apr 14, 2017, at 4:25 AM, Markus Tretzmüller <a1528519 at unet.univie.ac.at> wrote:
>>> 
>>> Hello,
>>> 
>>> I’m fascinated by Flocking’s easy and flexible style of signal generation. Some colleagues at the university and I are thinking about adopting it for light signal generation.
>>> Can you recommend Flocking for such a use case? We would basically replace the audio speakers with lights and stroboscopes. How can Flocking be adapted to serve this purpose? We need to hook in and gather the signals right before they are forwarded to the audio APIs. Any help would be appreciated.
>>> 
>>> Best regards
>>> Markus Tretzmüller
>>> _______________________________________________
>>> flocking mailing list
>>> flocking at lists.idrc.ocad.ca
>>> http://lists.idrc.ocad.ca/mailman/listinfo/flocking
>> 
> 
