Experiments with audio, part III

I’m working on a project to expose audio spectrum data from Firefox’s audio element. Today I add a new event to the audio element.

Last time I wrote that "I’m going to have to get this wrong before I get it right." So here is some "wrong that feels right." Having located the proper audio data, I'm now up against the problem of exposing it to content (i.e., script running in a web page). We've started talking more about how to structure the API, but I'm a ways off from needing those decisions made. Since my last experiment, I've realized that we're going to need a notification to let scripts know when audio data is available.

Today's experiment was to add a new DOM event to the audio element so that web developers can be notified whenever raw audio data is available. The decoder produces audio a "frame" at a time, so we get the song (or "sound") in chunks as it plays. Here's how you use what I've done today:

<audio src="somesong.ogg" onaudiowritten="doSomething();"></audio>

At the moment, there isn't much you can do in doSomething(), since I haven't exposed the data with the event. Instead, I'm planning to add a method named mozGetAudioData(), which you'll call whenever onaudiowritten fires.
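To make the intended flow concrete, here is a sketch of how a page might consume the event once mozGetAudioData() exists. The event name "audiowritten" and mozGetAudioData() are taken from this post but are not a shipped API, and the tiny mock element standing in for <audio> is my own assumption so the flow can be shown outside the browser:

```javascript
// Sketch only: "audiowritten" and mozGetAudioData() are the names
// proposed in this post, not a shipped API. A minimal mock element
// stands in for <audio> so the event flow can run anywhere.
function makeMockAudioElement() {
  const listeners = {};
  return {
    addEventListener(type, fn) {
      (listeners[type] = listeners[type] || []).push(fn);
    },
    dispatchEvent(type) {
      (listeners[type] || []).forEach((fn) => fn());
    },
    // Hypothetical method returning one decoded frame of samples.
    mozGetAudioData() {
      return [0.0, 0.25, -0.25, 0.5]; // stand-in sample values
    },
  };
}

const audio = makeMockAudioElement();
const frames = [];

// Script equivalent of onaudiowritten="doSomething();" in the markup:
// each time a frame is written, pull the data and stash it.
audio.addEventListener("audiowritten", () => {
  frames.push(audio.mozGetAudioData());
});

// In the real element, the decoder would fire this once per frame.
audio.dispatchEvent("audiowritten");
audio.dispatchEvent("audiowritten");

console.log(frames.length); // prints 2
```

The key design point is that the event only signals availability; the script decides when (and whether) to pull the samples.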

Adding a new DOM event is accomplished by carefully studying how other events are implemented. The audio element already has a bunch of events, for example onloadedmetadata. I had to make a dozen or so changes to the following files (look for the string loadedmetadata as an example in these files):

In part IV I hope to get the decoded frame data exposed via a new method on the audio element.
