[flocking] Questions

Steven Dale lifeinchords at gmail.com
Fri May 1 17:55:34 EDT 2015


Thank you for the super-detailed response. Very helpful; we have a great
amount of info to move forward.

- I'm quite sure the worker issue isn't a bug, as I was calling init on
every SFX play. Gonna update our instance and have another go.

- The popping noise happens on the first call, but I noticed it happens on the
examples page too, so I think it's just a property of the Impulse sound
demo, that it continues after the ramp? That was a stupid mistake on my part.

- The info regarding events, reverb, and leads for books is great - I'll
get in contact after I dig in some more and have a chance to explore.

I'll also send over a link to our sandbox instance as soon as we get
something in there. Talk soon

=: s

On Thu, Apr 30, 2015 at 11:41 AM, Colin Clark <colin at colinclark.org> wrote:

> Hi Steven,
> Thanks for all your great questions. Just some background information
> before I answer your specific questions below:
> You should have one Flocking Environment per application. So only invoke
> flock.init() once, and retain an instance of the environment somewhere
> you can access it if needed. Start it playing at the beginning, rather than
> every time you want to trigger a sound.
> You'll typically want one Synth instance per "voice" or thing that needs
> to trigger sound. You won't typically reinstantiate a synth every time you
> want to trigger a sound. Instead, create the synth up front, start it
> playing, and then open and close the gate on an envelope unit generator to
> trigger sounds in response to user actions. If you know you won't be using
> a synth for a while, you can call pause() on it to minimize resources, and
> then start it play()ing again later.

Re: context: that helps a lot. Does resource drain build up over time from
just being idle?
 - I didn't realize it, and this sounds like the culprit: I was calling
init() on every SFX play. The strategy you mention makes sense, kinda like
turning down the volume knob, I guess; the radio is still on, but it still
serves the purpose of keeping sound out.
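The lifecycle described above can be sketched roughly like this. flock.init(), flock.synth(), flock.ugen.sinOsc, and flock.ugen.asr are taken from the Flocking documentation; all ids, frequencies, and envelope times below are illustrative assumptions, not from this thread:

```javascript
// Sketch: initialize the environment once, keep one long-lived synth,
// and trigger sounds by opening/closing an envelope gate.
// (Ids and parameter values are hypothetical.)

// 1. At application startup, in the browser:
//    var enviro = flock.init();
//    enviro.start();

// 2. One synthDef per long-lived "voice". The asr envelope's gate
//    starts closed (0.0), so the synth is silent until triggered.
var sfxDef = {
    id: "carrier",
    ugen: "flock.ugen.sinOsc",
    freq: 440,
    mul: {
        id: "env",
        ugen: "flock.ugen.asr",
        attack: 0.01,
        sustain: 0.5,
        release: 0.1,
        gate: 0.0
    }
};

// 3. In the browser, create the synth once and reuse it:
//    var sfx = flock.synth({ synthDef: sfxDef });
//    sfx.set("env.gate", 1.0);  // open the gate: sound starts
//    sfx.set("env.gate", 0.0);  // close the gate: sound releases
//    sfx.pause();               // idle for a while? free up cycles
//    sfx.play();                // resume later
```

This is the "radio stays on" model: the environment and synth persist, and the gate, not re-initialization, controls when sound is heard.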

> On Apr 29, 2015, at 10:10 PM, Steven Dale <lifeinchords at gmail.com> wrote:
> - When sound is triggered/played in our app, afterwards there's an endless
> repeating click/pop sound. Both Chrome + Firefox. Listen here:
> https://drive.google.com/file/d/0B4zhzWwgaF63Z3h4WDN2VldzbE0/view?usp=sharing
> Can you provide more details or a running instance of your application? I
> don't know exactly what you're doing, nor what your audio file sounds like.
> More detail makes it easier to answer these kinds of questions.
> If I had to guess, it may have to do with the fact that you're
> repeatedly calling this._enviro.play() every time you trigger a
> sound, but I don't know.
> Is there a stop audio function? Do we need to run it every time we're done
> playing something to stop audio output, then re-init it right before the
> next trigger of sound? Or is this audio connection supposed to stay on
> throughout the life of the person's experience in a given session?
> Synths are intended to be relatively long-lived. They can be triggered
> repeatedly and have their parameters changed on the fly. Again, more detail
> would be helpful. But in general you'll probably want to keep your synth
> instance around persistently, and then use some kind of an envelope,
> opening and closing its gate in response to user actions.
> - We're triggering sounds on drag-and-dropping divs. They spawn web
> workers, seen in the console, one for each sound. I racked up 20-30 in
> seconds, and they persist. Is this normal + expected? Feels like at some
> point the browser is gonna choke.
> That sounds like a bug. Can you:
> * Make sure you're running the latest release of Flocking (version 0.1.1)
> https://github.com/colinbdclark/Flocking/releases/tag/0.1.1
> * Send me a link to a running instance of your app or instructions on how
> to build/run so I can take a look?
> Web Workers are spawned in Flocking for two reasons:
> 1) To run the clocks on an asynchronous Scheduler instance (one of which
> is created when you instantiate the Flocking Environment by calling
> flock.init())
> 2) If you're using the legacy pure JavaScript audio file decoders, which
> aren't shipped by default with recent versions of Flocking (so very
> unlikely)
> Is it possible that you're initializing Flocking over and over again
> somehow? Or creating a large number of Scheduler instances? If not, I'll
> take a look and see if there's a bug I need to fix.
> - Does the lib play mp3 files? I saw a reference to a WAV file in the
> examples - is there a preferred file format to use to trigger samples?
> uncompressed feels quite large for transferring back n forth over the wire
> Yes. Flocking supports all of the codecs that can be decoded by the Web
> Audio API. Here's MDN's compatibility table:
> https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats#Browser_compatibility
> In short, most browsers will support MP3 out of the box with Flocking.
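For sample playback, a synthDef along these lines should work. flock.ugen.playBuffer is the buffer-playing unit generator used in the Flocking examples; the file URL and ids here are placeholders, and the exact trigger semantics are an assumption worth checking against the docs:

```javascript
// Sketch: trigger an MP3 sample (decoded by the Web Audio API).
// The URL and all ids are placeholders.
var sampleDef = {
    id: "player",
    ugen: "flock.ugen.playBuffer",
    buffer: {
        id: "slingshot",
        url: "audio/slingshot.mp3"
    },
    trigger: 0.0  // set to 1.0 to (re)start playback
};

// In the browser:
//    var sampler = flock.synth({ synthDef: sampleDef });
//    sampler.set("player.trigger", 1.0);
```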
> - From here:
> https://github.com/colinbdclark/Flocking/blob/master/docs/responding-to-user-input.md:
> Can we bind the events to a class that returns a set of DIVs, rather than
> a single DIV tied to an ID? Is it a linear thing, where if we bind to 5
> divs, say, it will be 5x more drain on browser resources because 5 voices
> are playing at once?
> Currently, the flock.ugen.mouse.click unit generator only supports being
> bound to one element at a time. That's something that can be fixed, and
> I've filed a bug about it:
> https://github.com/colinbdclark/Flocking/issues/107
> However, it's trivial to create your own custom handlers and bind them to
> your synth using the set() method. Something like this should
> work just fine:
> var divs = $(".lotsOfElements");
> divs.mousedown(function () {
>     mySynth.set("myEnv.gate", 1.0);
> });
> divs.mouseup(function () {
>     mySynth.set("myEnv.gate", 0.0);
> });
> The browser ugens are just there to provide quick solutions for testing a
> synth. If you're building more complex UIs, you'll probably want to roll
> your own event logic.
> - Is there a way to apply a reverb effect onto the end of the signal
> chain? I saw something about several channel audio, and the delay
> definition. Is this possible with the lib and needs to be modeled? or is
> just not possible with this kind of synthesis? I'm new to doing this stuff
> with code.
> There's the Freeverb unit generator. It takes four inputs:
> * source: the signal you want to apply the reverb to
> * mix: the wet/dry mix for the reverb, between 0-1
> * room: the room size, between 0-1
> * damp: the reverb's HF damp, between 0-1
> If you want to add reverb to a whole collection of different synths,
> you'll want to write your synths' output to an interconnect bus (using
> flock.ugen.out) and then create a dedicated "effects synth" that reads from
> the interconnect bus and applies the reverb. I can whip you up an example
> if you end up going that way.
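The interconnect-bus pattern might be sketched like this, using flock.ugen.out, flock.ugen.in, and flock.ugen.freeverb with the mix/room/damp inputs described above. The bus number and all ids are assumptions, not from this thread:

```javascript
// Sketch: voices write to an interconnect bus; one dedicated "effects
// synth" reads the bus and applies reverb to everything on it.
// Bus number 4 and all ids are hypothetical.

var voiceDef = {
    ugen: "flock.ugen.out",
    bus: 4,                  // an interconnect bus, not hardware output
    sources: {
        id: "carrier",
        ugen: "flock.ugen.sinOsc",
        freq: 220
    }
};

var reverbDef = {
    ugen: "flock.ugen.freeverb",
    source: {
        ugen: "flock.ugen.in",
        bus: 4               // read what the voices wrote
    },
    mix: 0.5,                // wet/dry mix, 0-1
    room: 0.8,               // room size, 0-1
    damp: 0.3                // high-frequency damping, 0-1
};

// In the browser:
//    var voice = flock.synth({ synthDef: voiceDef });
//    var fx = flock.synth({ synthDef: reverbDef });
```

The effects synth should be evaluated after the voices that write to the bus, which is what the ordering of synths in the environment's node list controls.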
> - Any ETA on the docs for flock.synth?
> https://github.com/colinbdclark/Flocking/blob/master/docs/synths/overview.md
> Between the documentation in the main README and this page, is there any
> other documentation you're specifically interested in? What information do
> you think we're missing?
> https://github.com/colinbdclark/Flocking/blob/master/docs/synths/creating-synths.md
> The API for Synths, fortunately, is fairly simple. You can create them,
> add them to and remove them from the environment's list of evaluated nodes
> at specified locations, and get/set values on them. That's pretty much the
> extent of their functionality.
> - Any existing docs on how I might take some sound synthesis examples from
> classic textbooks, papers, etc, and apply them to create synth definitions
> [in Flocking]? For example, we want to synthesize/model the elastic,
> stretchy sound of a slingshot, think Angry Birds SFX -- is this possible
> with Flocking?
> There are some pretty good books about audio synthesis. My favourite is
> this one, which is unfortunately out of print:
> http://books.google.ca/books/about/Computer_Music.html?id=eY_BQgAACAAJ
> Curtis Roads' Computer Music Tutorial goes into detail about many
> synthesis techniques. Also Nick Collins' Introduction to Computer Music is
> quite good. There are also great language-specific books for
> SuperCollider, ChucK, and CSound that provide primers on signal processing,
> and their source code could likely be ported to Flocking.
> I hope this helps,
> Colin

More information about the flocking mailing list