Experiments with audio, part IV

I’m working on a project to expose audio spectrum data from Firefox’s audio element.  Today I get raw data into the browser.

This blog post was going to be very different from the one you see now.  That's because our DOM code kicked my ass all day yesterday and into the night.  I stopped coding when my eyes stopped working, and the result was very much, "...so close, what is causing this error!!!!!????"  But then this morning, through the power of IRC debugging, the answer came!  Let me show you a picture before I go on:

First successful test of raw audio in browser

Previously I wrote about my idea to add a new method to the audio element, which you'd call whenever the onaudiowritten event occurred.  I thought a lot about this and couldn't figure out how I was going to deal with the synchronization issues: the decoding happens (and the raw data is produced) on a separate thread, and I need to expose that data without blocking the main thread.  I decided to get some advice and went to speak with Chris Jones, one of the people working on Electrolysis, Mozilla's project to rewrite Firefox with a multi-process architecture.  Chris knows a lot about threading and synchronization, and he confirmed what I suspected: I should make this data available as part of the event itself rather than adding another method call and trying to deal with the locking semantics.
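To make that tradeoff concrete, here's a rough sketch of the two shapes from a script's point of view.  This is illustrative only; the mozGetAudioData method below is hypothetical and is exactly the thing I'm not building:

<script>
// Option 1 (rejected): script pulls the data by calling a method on the element.
// Because the decoder runs on its own thread, this getter would need locking
// underneath so it doesn't race with the decoder.
function pullAudioData(audio) {
  var data = audio.mozGetAudioData();  // hypothetical method, never implemented
  // ... use data ...
}

// Option 2 (what I'm doing): the data is copied and delivered as part of the
// event, so script never touches anything the decoder thread is still writing.
function onAudioWritten(event) {
  var data = event.mozAudioData;       // snapshot that rides along with the event
  // ... use data ...
}
</script>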

I was hoping to avoid this.  The very first thing I ever did with Firefox, back in 2005, was to work with some students to add a new event with data to the DOM.  And it was hard.  Very hard.  But dealing with threading issues is hard too, so I dug in and went to work.  Luckily for me, all these years later there are a lot more examples in the code I can follow.  The MozAfterPaint event is pretty close to what I need: an event that carries a list of data.  So I spent much of the day studying the code that implements it, and it's a wild ride.
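For comparison, here's roughly what MozAfterPaint looks like from script: the event carries a list of repainted rectangles that you walk with .length and .item(), which is the same shape I want for the audio data.  This is a sketch from memory, so the details may differ between Firefox versions:

<script>
// MozAfterPaint delivers a list of repainted rectangles with the event.
window.addEventListener("MozAfterPaint", function (event) {
  var rects = event.clientRects;
  for (var i = 0; i < rects.length; i++) {
    var r = rects.item(i);
    console.log("painted: " + r.left + "," + r.top + " to " + r.right + "," + r.bottom);
  }
}, false);
</script>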

Eventually I was able to add nsIDOMNotifyAudioWrittenEvent and nsIDOMAudioData, along with their implementations.  Here's how the code works now:

<script>  
function audioWritten(event) {  
  for (var i = 0; i < event.mozAudioData.length; i++)  
    console.log(event.mozAudioData.item(i));  
}  
</script>  
<audio src="song.ogg" onaudiowritten="audioWritten(event);"></audio>

As the audio is decoded, this event fires for every "frame", and the event carries an object named mozAudioData.  This contains all the raw audio data, basically a list of numbers like I discussed here.  You can iterate through them using .item(n) and .length.  Simple, right?  Well, I can tell you it wasn't simple to implement!
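As a slightly bigger example, here's a handler that uses only those two members to copy a frame's samples into a regular array and report the loudest one (assuming, as in the earlier posts, that the samples are floats):

<script>
// Copy each decoded frame into an array and find its peak sample.
function audioWritten(event) {
  var data = event.mozAudioData;
  var samples = [];
  var peak = 0;
  for (var i = 0; i < data.length; i++) {
    var s = data.item(i);
    samples.push(s);
    if (Math.abs(s) > peak)
      peak = Math.abs(s);
  }
  console.log("frame of " + samples.length + " samples, peak " + peak);
}
</script>
<audio src="song.ogg" onaudiowritten="audioWritten(event);"></audio>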

Last night I had all the code in place, but when I quit I was staring at an assertion caused by my code:

!!! ASSERTION: hmm? CanCallNow failed in XPCWrappedNative::CallMethod. We are finding out about this late!: 'rv == NS_ERROR_XPC_SECURITY_MANAGER_VETO', file /Users/dave/moz/mozilla-central/src/js/src/xpconnect/src/xpcwrappednative.cpp, line 2206

I knew I must be doing something wrong with my nsIDOMAudioData class, since the event itself was working and the two are very similar in terms of the plumbing needed to get dispatched to the DOM.  This morning, feeling thoroughly stuck, I went in search of Olli Pettay (smaug on IRC), who knows this code really well:

09:28 < smaug> AudioData isn't an event
09:28  * smaug is surprised that even compiles

Me too!  In my sleep-deprived state last night I'd copy-pasted my nsDOMClassInfo code for the event and reused it for nsIDOMAudioData without removing the line that makes it an event.  After deleting that line, it works!

My patch is getting a lot bigger, and it needs more work yet: I don't even want to think about what they'll ask me to change once I move to getting it reviewed.  But that's trouble for another day.  Today I'm thrilled to have gotten this data exposed to script and lived to tell the tale.  In part V I'll try to get this into a state others can use, and then set my audiophile partners loose on doing something cool with the data.
