Turn Your Browser up to 11! The Digital Surgeons Web Audio Meetup


The #WebAudio Mad Scientists at Digital Surgeons were proud to host a newhaven.io meetup this past week that featured three of our own showing why a web browser can be as powerful as a roaring Marshall stack.

A Wednesday night at our headquarters on State Street, complete with pizza from BAR and Stony Creek beer on tap - there’s no better time to talk about why the future of the web is loud... and we like it.

newhaven.io

First, let’s talk about the talented and passionate community of tech junkies we were privileged to welcome into our offices. newhaven.io is a non-profit developer group here in New Haven that welcomes all interested, as they put it, “in programming, the intertubes, and whatnot.”

Hell-bent on fostering a culture of experimentation and building the local tech community, the group has a vision that aligns well with what we try to accomplish every day at Digital Surgeons.

The Meetup

Tech Lead Adam Chambers

@Chambaz

After stuffing our faces with mashed potato bacon pie, it was time to get down to business. First to present was our Technology Lead, Adam Chambers. A musician with serious front-end development skills, Chambers is, unsurprisingly, obsessed with web audio.

Chambers created and runs AudioCrawl, a site dedicated to showcasing the best in web audio: a community-curated, upvoted feed of the latest and greatest interactive audio-visual experiences, compositions, and visualizations built with technologies like HTML5, the Web Audio API, WebGL, and WebRTC.

Chambers’ presentation focused on interactive music videos. Video may have killed the radio star, but reality shows have put a bullet in the heart of the MTV music video programming we all grew up on.

Interactive music videos will become more and more common as artists look to technology to revolutionize how audiences not only consume a song but co-create the experience of listening to it. Listening to your favorite song no longer has to be passive; technology is enabling today’s content consumer to push back and participate.

Chambers shared a few of his favorite examples of interactive music videos that inspire him, including the video for Arcade Fire’s “We Used to Wait”, built with the Google team, which incorporates flybys and Street View mashups of your own address. Another, from George and Jonathon, showcased an entire album released via an interactive website whose 3D experience lets you navigate around every note in the song. Last, he showed us Jazz Computer from industry legend Yotam Mann, which lets users manipulate the structure, tempo, and graphics of a song simply by scrolling forward and backward.

Chambers is one half of the hip-hop duo Cam and Sound (he’s Sound). Cam Nichols is a West Haven rapper; Chambers creates beats that he reluctantly labels a combination of hip hop, trip hop, and ambient electronica.

For the release of their song “Infomercials”, Chambers built an accompanying interactive music video. The music video allows the viewer to use their mouse to “peek inside Cam’s mind.”

To put the technology behind it in simple terms: the site samples a video of Cam into tiles across the page, copying those tiles to an HTML5 canvas. The Web Audio API analyzes the track in real time, and the viewer’s mouse position drives how the tiles explode and contract in time with the music. The site also hits the Giphy and Instagram APIs to populate the space behind the exploding tiles with images from either Cam’s Instagram or random infomercials. The viewer moves their mouse to explode the tiles in time with the beat and reveal the images, peering into Cam’s mind as he lies in bed rapping about sleepless nights in the blue din of a sea of infomercials.
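For the curious, a minimal sketch of that analysis loop might look like the following. The Web Audio API calls are real, but explodeTiles(), the beat threshold, and the file path are illustrative placeholders, not code from the actual video.

```typescript
// Hypothetical sketch: drive canvas tile "explosions" from audio energy.
// explodeTiles(), mouseX/mouseY, and the threshold stand in for the
// video's actual tile logic.
declare function explodeTiles(x: number, y: number, strength: number): void;
declare const mouseX: number, mouseY: number;

const audioCtx = new AudioContext(); // in a real page, create after a user gesture
const audio = new Audio('infomercials.mp3'); // assumed asset path
const source = audioCtx.createMediaElementSource(audio);
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;

source.connect(analyser);
analyser.connect(audioCtx.destination);

const bins = new Uint8Array(analyser.frequencyBinCount);

function tick(): void {
  analyser.getByteFrequencyData(bins);
  // Average the low-frequency bins as a crude beat detector.
  const bass = bins.slice(0, 8).reduce((sum, v) => sum + v, 0) / 8;
  if (bass > 200) {
    explodeTiles(mouseX, mouseY, bass / 255); // scale the blast by energy
  }
  requestAnimationFrame(tick);
}

audio.play();
tick();
```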

Besides being an impressive technical implementation, the video lets the viewer feel the dreamy ambience Cam is describing and conveys the story of the song that much better.

Follow @camandsound and be on the lookout for the public release of the interactive video!

@Aaron_M_Shea

Next up: the boy wonder Aaron Shea. Shea first came to Digital Surgeons as a fresh-faced intern. He remains fresh-faced, but after making us all look bad, he’s now a part-time developer who works for us when he isn’t busy being a college sophomore.

Developer Aaron Shea

Shea’s true passion lies in coding games. As a musician, when he heard about the Web Audio Meetup, he immediately wanted to create a browser-based music game for the occasion.

Shea finds himself frustrated that most music games follow what he calls “the Dance Dance Revolution model.” Someone playing Guitar Hero isn’t really interacting with the music, just following on-screen prompts that, if completed on time, trigger the playback of a song. Even without music, someone could play along simply by watching the screen and pressing the right colors. Shea wanted to raise the bar and create a game that forces the player to pay close attention to both the music and the visuals, a game that has someone playing along with the music instead of simply over it.

His creation, “Binary Beat Down”, challenges a player to release a rocket in time with the beat of a song. If timed correctly, the rocket launches; if not, it explodes. Rockets correspond to different rhythmic divisions: some require a whole beat, others a half. The different rockets are not only different colors and shapes, but are queued and launched with different accompanying sounds. The player is forced to listen closely to anticipate the next rocket, because visually they can be distracted by flying asteroids, spinning objects, and a background that constantly moves at a different pace. Even so, only the most rhythmic player can get by on the auditory inputs alone; paying close attention to visual cues is a must.

When the game is played well, the sounds that accompany the queuing and launching of the rockets begin to form an instrumental track of their own, one that complements the textures of the song you’re playing along with.

Shea used a massive JSON file, generated by a custom beat-mapping tool, to map the entire song down to the millisecond so that inputs matched the song as accurately as possible. If a player clicks within 0.06 seconds of a beat starting, the game considers the click right on the beat. This leeway gives non-musicians just enough slack to enjoy the game. Shea played with different degrees of variance, but 0.06 seconds felt just right.
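A stripped-down version of that timing check might look like this; the beat-map shape, the symmetric tolerance window, and the function names are our assumptions, not Shea’s actual code.

```typescript
// Hypothetical beat map: timestamps in seconds, produced by a
// mapping tool like the one Shea describes.
interface BeatMap {
  beats: number[]; // e.g. [0.5, 1.0, 1.5, 2.0, ...]
}

const TOLERANCE = 0.06; // seconds of leeway around each beat

// Returns true if a click at `clickTime` (seconds into the song)
// lands within the tolerance window of any beat.
function isOnBeat(clickTime: number, map: BeatMap): boolean {
  return map.beats.some((beat) => Math.abs(clickTime - beat) <= TOLERANCE);
}

// Usage against the audio clock, e.g. in a click handler:
// if (isOnBeat(audioCtx.currentTime - songStartTime, beatMap)) launchRocket();
```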

Shea hopes future iterations will take musical immersion to the next level by altering the song based on how well the player is finding the beat: the better you perform, the faster it goes.

So how long did it take for Shea to pull this together?

A weekend.

Mr. Shea, we salute you.  

@codecommando

Last, but certainly not least, was our Director of Interactive, Aaron Sherrill. Sherrill is the team’s go-to guy for off-the-wall builds; when you wonder whether a crazy animation is possible, he’s the one who makes it happen.

Any front-end developer can tell you that CSS animations are a great way to take a site to the next level, but CSS lacks a standardized timeline animation IDE like Flash’s.

For those unversed in the ways of code: CSS treats an animation as a series of states, with properties changing at each state. Those states are defined as percentages of the total animation length.

In practice, if you want something to happen halfway through a two-second animation, you define the state at 50%, which triggers it one second in. That sounds easy enough in a simple example, but thinking about time as a percentage of overall animation length becomes extremely difficult as the build grows more complicated. Once you factor in things like how long the user needs to read text before the next animation fires, animating in CSS can be a developer’s nightmare.
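Concretely, the arithmetic looks like this (a throwaway illustration, not code from the talk). Notice that retiming the animation changes every percentage, which is exactly what makes hand-authoring painful.

```typescript
// A state change at 1 second maps to a different keyframe percentage
// depending on the total animation length.
const eventTime = 1; // seconds into the animation

console.log((eventTime / 2) * 100);              // 2s animation -> 50
console.log(((eventTime / 3) * 100).toFixed(1)); // stretched to 3s -> "33.3"
```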

Sherrill presented a technique he developed for sequencing animations on a Flash timeline, then exporting the data needed to run that sequence as a CSS3 keyframe-rule animation. You get all the timeline benefits of animating in Flash, but the result is pure CSS.
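The heart of such an exporter can be as simple as converting absolute times into percentages. Here is a sketch under assumed data shapes; this is our illustration, not Sherrill’s actual tool.

```typescript
// Hypothetical timeline event exported from an animation tool: a time in
// seconds plus the CSS declarations to apply at that moment.
interface TimelineEvent {
  time: number;                  // seconds from the start of the sequence
  props: Record<string, string>; // e.g. { opacity: '1' }
}

// Convert timed events into a CSS @keyframes rule, turning absolute
// times into percentages of the total duration.
function toKeyframes(name: string, duration: number, events: TimelineEvent[]): string {
  const stops = events.map((e) => {
    const pct = ((e.time / duration) * 100).toFixed(1);
    const decls = Object.entries(e.props)
      .map(([prop, value]) => `${prop}: ${value};`)
      .join(' ');
    return `  ${pct}% { ${decls} }`;
  });
  return `@keyframes ${name} {\n${stops.join('\n')}\n}`;
}

// A two-second fade whose midpoint state lands at the 50% keyframe:
console.log(toKeyframes('fadeIn', 2, [
  { time: 0, props: { opacity: '0' } },
  { time: 1, props: { opacity: '0.5' } },
  { time: 2, props: { opacity: '1' } },
]));
```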

To put his technique to the test, Sherrill did several real-time demos and was able to quickly code complex CSS animations on the fly in front of a packed room.    

The technique has been invaluable in animations for our clients Lego and Demandware, and for a just-launched internal sports marketing page that is chock-full of awesome user interactions.

Now what?

The development team’s newest hire, Cory Zibell, may have summed it up best: “Now we have to figure out how to combine Aaron Shea’s beat-map tool, plug that JSON into Aaron Sherrill’s animation keyframe finder, and then use those timings with Adam Chambers’ interactive music video so that it can be automated for any audio source! Boom, automated interactive music videos.”

Welcome aboard, Cory. Now make that happen.

Lessons Learned

More than anything else, Chambers wants people to understand the value behind creative technology. Developers aren’t cave dwellers banished to code silently in basements with their hoods up while they execute the vision of creative designers.

“There's a big creative element in what we do. It's not just x equals y, it's tweaking values to make them feel right. It's all based on feeling and experience, there's a lot of UX in code,” said Chambers.

Missed the Meetup? See it through the eyes of our Creative Technologist Craig Keller.

 

New Haven IO WebAudio from Digital Surgeons on Vimeo.

Discuss on Twitter