Last holiday season I built a little playlist site rocking the Web Audio API to create a gooey equalizer animation. This is me finally getting around to writing something about it. 😜
Anywho, like many an agency, GreenRubino (a Seattle-based full-service shop) sends gifts off to their clients for the holidays. For 2018, they wanted to go beyond the standard bottle of wine & box o’ chocolates.
Our concept was to spread that holiday cheer with some holiday jams. 🤗
Recipients were mailed a package of branded jam jars, with a card inviting them to visit HolidayJammin.com – a custom playlist experience of holiday jams curated by employee ‘taste experts’. This jammy theme tied into the site UI/UX with a variety of jam-inspired Gooey animation effects.
(And dontcha worry – this being Seattle and all, the jam was organic and local, made by a mother/daughter/grandmother team of jam makers on Whidbey Island.)
The whole thing came together rather nicely and snagged a little feature on Adweek.
You can go check a version of the site here.
The Site Experience
The site begins with an intro scene of animated snowflake-ish icons and an invitation to “Click and spread cheer.” Doing so triggers a gooey-flavored transition revealing the playlist scene, which features a series of circular pics of the employees back when they were pups.
When clicked, each pic converts to a spinning record and begins playing that employee’s selected track. The player UI itself morphs into a gooey-style equalizer when active.
Gooey Player
To accomplish the Gooey Equalizer, I leveraged the Web Audio API and SVG filters.
Essentially, the Web Audio API lets you extract frequency data from an audio source and copy it into an array of data points. Each point can then drive the y-axis animation of a div element, which is in turn modified with an SVG filter.
Here’s a simple breakdown of how that works:
AudioContext
In order to extract data from an audio source, you first create an AnalyserNode with the Web Audio API’s AudioContext.createAnalyser() method:
Create audioContext
// Audio element
const audio = document.querySelector('#js-audio')
let audioContext = new (window.AudioContext || window.webkitAudioContext)()
let analyser = audioContext.createAnalyser()
You can then connect that node to an audio source created via createMediaElementSource. The analyser sits in the middle of the chain, before the final destination, which is where we tap the signal for visualization.
Connect audio source to Analyser node
let source = audioContext.createMediaElementSource(audio)
source.connect(analyser)
analyser.connect(audioContext.destination)
Now we need to capture our frequency data and copy it into an array we can use for animation. We do that by first defining the buffer size with the frequencyBinCount property of our analyser, which determines the number of data points we get from the frequency data.
By default, that’s 1024, which is half the AnalyserNode’s fftSize (the size of its Fast Fourier Transform). The Fast Fourier Transform algorithm is beyond the scope of this write-up, but you can read more about it here.
Just know that you can alter it via the analyser.fftSize property, and frequencyBinCount will always be half that value.
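For example, here’s a quick illustrative sketch (not from the site’s code) of how the two relate:
Adjusting fftSize
// Illustrative only: a smaller FFT means fewer, chunkier frequency buckets
analyser.fftSize = 256 // must be a power of 2, between 32 and 32768
console.log(analyser.frequencyBinCount) // 128 (always half of fftSize)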
We can then use our frequencyBinCount to create a Uint8Array() that can be passed to the analyser’s getByteFrequencyData method.
Create frequency data array
let bufferLength = analyser.frequencyBinCount
let frequencyData = new Uint8Array(bufferLength)
analyser.getByteFrequencyData(frequencyData)
Gotcha
Right before launch, Google pushed an update to Chrome’s autoplay policy, requiring a user interaction before audio will play, or before an analyser can even be connected to its source. So, to comply with that, I had to move the source.connect(analyser) stuff into a separate method that’s called when a user clicks to play a track:
connectAnalyser() {
source.connect(analyser)
analyser.connect(audioContext.destination)
}
I also found that simply resuming the audioContext on play, via audioContext.resume(), did the trick as well.
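Here’s a minimal sketch of that alternative, assuming a click-to-play handler (the playBtn element below is hypothetical, not the site’s actual wiring):
Resuming the audioContext on user interaction
playBtn.addEventListener('click', () => {
  // A suspended context can only be resumed from a user gesture.
  // resume() returns a Promise, so start playback once it resolves.
  if (audioContext.state === 'suspended') {
    audioContext.resume().then(() => audio.play())
  } else {
    audio.play()
  }
})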
See the full script below for the final working structure.
Animate It
Now, to actually animate our audio’s frequency data in real time, we update frequencyData inside a requestAnimationFrame callback (a recursive loop). The script contains a method that kicks off the y-axis animation of the series of divs that make up the equalizer, and an additional method that stops/clears the raf loop.
// Equalizer bar elements and the raf handle used below
let bars = []
let raf

/**
* Setup Ref to equalizer bars/divs
*/
function setupEqualizer() {
for (let i = 1; i <= 100; i++) {
bars.push(document.getElementById('bar-' + i))
}
}
/**
* Start Equalizer Method
*/
function startEqualizer() {
analyser.getByteFrequencyData(frequencyData)
let barsCount = 0
let numberOfBars = 100
for (let i = 1; i < numberOfBars * 2; i += 1) {
let y = frequencyData[i]
barsCount++
if (barsCount > numberOfBars) {
barsCount = 0
}
let bar = bars[barsCount]
if (bar) {
bar.style.transform = 'translateY(-' + y + 'px)'
}
}
// Recursive raf loop call
raf = requestAnimationFrame(startEqualizer)
}
/**
* Stop Equalizer
*/
function stopEqualizer() {
cancelAnimationFrame(raf)
}
The Gooey Effect
The actual Gooey Animation is achieved via SVG filters.
If you’re new to SVG filters, they allow you to modify a source element with a graphical operation. For example, you can modify an HTML element by applying an SVG filter to it with CSS.
Those graphical operations are handled by a set of filter primitives. There’s a whole host of available filter primitives, but our gooey effect boils down to feGaussianBlur, feColorMatrix and feComposite: the blur melts the bars together, the color matrix cranks up the alpha contrast so the blurred edges snap back into a single goopy shape, and the composite layers the original graphic back on top.
Note the #goo reference in the following HTML/SVG/CSS for our equalizer:
Equalizer Markup and SVG Filter
<section class="equalizer">
<div class="bar" id="bar-1"></div>
<div class="bar" id="bar-2"></div>
...
</section>
<!-- #goo reference -->
<svg xmlns="http://www.w3.org/2000/svg" version="1.1" width="800">
<defs>
<filter id="goo">
<feGaussianBlur in="SourceGraphic" stdDeviation="10" result="blur"/>
<feColorMatrix in="blur" type="matrix" values="1 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 19 -9" result="goo"/>
<feComposite in="SourceGraphic" in2="goo" operator="atop"/>
</filter>
</defs>
</svg>
CSS (SCSS) applying filter #goo
.equalizer {
z-index: 4;
position: fixed;
left: 0;
bottom: -16em; // el height - bar height + height of player bar
height: 30em;
width: 100%;
flex: 1;
display: flex;
flex-direction: row;
align-items: flex-end;
filter: url("#goo"); // SVG filter ref
transform: translateZ(0);
will-change: transform;
.bar {
background-color: $color-alpha; // blue
flex: 1 1 auto;
height: 20em;
min-width: 1%;
transform: translateZ(0);
will-change: transform;
}
}
And there you have it. We’ve run our audio source through the Web Audio API to animate its frequency data as an equalizer, using SVG filters to create a gooey effect.
All Together Now
Of course, that was just one part of the player experience. I also needed to create the actual playlist JavaScript and the audio player UI. A quick rundown of that:
- For the player UI, I leveraged Plyr.io, as it’s a super solid lib that I also use for custom video players.
- The employee info and audio file data was stored as a JSON file.
- Handlebars templates handled the markup components and data interpolation.
- CSS keyframes handled the spinning record animation when an is-playing class was added to a track, with the record grooves created with just a linear-gradient.
- The playlist and equalizer JS was housed in separate modules, with the playlist module calling public methods from the equalizer module as part of its play/pause logic.
GooPlayer
Here’s the meat and potatoes of the playlist and equalizer:
GooEqualizer.js
/**
* GooEqualizer
* Connects an audio source to an analyser node, so we can read
* frequency data out as an array (sized by fftSize), which we use
* to drive the equalizer animation of y-axis transforms.
* @author stephenscaff
*/
const GooEqualizer = (() => {
// bail if browser lacks audioContext
if (!window.AudioContext && !window.webkitAudioContext) return;
const audio = document.querySelector('#js-audio')
let audioContext = new (window.AudioContext || window.webkitAudioContext)()
let analyser = audioContext.createAnalyser()
let source = audioContext.createMediaElementSource(audio)
let bars = []
let raf
return {
setup() {
this.connectAnalyser()
this.setupEqualizer()
},
/**
* Connect Audio source to analyser
* Connects analyser node to audioContext
* @see https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API/Visualizations_with_Web_Audio_API
*/
connectAnalyser() {
source.connect(analyser)
analyser.connect(audioContext.destination)
analyser.smoothingTimeConstant = 0.8
},
/**
* Get Frequency Array Data
*/
getFreqArray() {
let bufferLength = analyser.frequencyBinCount
let frequencyData = new Uint8Array(bufferLength)
analyser.getByteFrequencyData(frequencyData)
return frequencyData
},
/**
* Setup Equalizer Bars
* Equalizer is a series of divs used to visualize the audio source
*/
setupEqualizer() {
for (let i = 1; i <= 100; i++) {
bars.push(document.getElementById('bar-' + i))
}
},
/**
* Start Equalizer
* Animation loop (raf) converting frequencyData
* to y axis transforms of our bars.
*/
startEqualizer() {
let frequencyData = GooEqualizer.getFreqArray()
let barsCount = 0
let numberOfBars = 100
for (let i = 1; i < numberOfBars * 2; i += 1) {
let y = frequencyData[i]
barsCount++
if (barsCount > numberOfBars) {
barsCount = 0
}
let bar = bars[barsCount]
if (bar) {
bar.style.transform = 'translateY(-' + y + 'px)'
}
}
// Recursive raf loop call
raf = requestAnimationFrame(GooEqualizer.startEqualizer)
},
/**
* Stop Equalizer
*/
stopEqualizer() {
cancelAnimationFrame(raf)
}
}
})()
export default GooEqualizer
GooPlayer.js
import GooEqualizer from './GooEqualizer.js'
/**
* GooPlayer
* Handles the audio player and playlist logic
* Calls GooEqualizer for audio vis animation
* @author Stephen Scaff
*/
function GooPlayer () {
this.audio = document.querySelector('#js-audio')
this.playBtn = document.querySelector('.js-play')
this.peeps = document.querySelectorAll('.js-peep')
this.peepImg = document.querySelectorAll('.playlist-peep__img')
this.songTitle = document.querySelector('.js-song-title')
this.songArtist = document.querySelector('.js-song-artist')
this.currentSong = 0
this.isPlaying = false
}
GooPlayer.prototype = {
constructor : GooPlayer,
/**
* Init
*/
init(){
this.loadFirstSong()
this.audioListeners()
this.handleClick()
return true
},
/**
* Handle Peeps/Tracks Click Events
* Primary click handler that satisfies the audio user-interaction requirement.
*/
handleClick() {
this.peeps.forEach((peep, i) => {
peep.addEventListener('click', (e) => {
e.preventDefault()
let song = this.getSong(i)
this.audio.src = song.src
this.setSongInfo(song.title, song.artist)
this.playPause(e.currentTarget, this.peeps)
GooEqualizer.setup()
})
})
},
/**
* Load First Song
* Kick off playlist by loading up first track in peeps array.
*/
loadFirstSong() {
let song = this.getSong(0)
this.audio.src = song.src
this.audio.load()
this.setSongInfo(song.title, song.artist)
},
/**
* Get Song
* Helper to set current song index and return song info
* from the peep's data attributes. Sets currentSong Index and returns
* an object with song url, title, artist
* @param {number} i Index in the peeps array.
* @return {object}
*/
getSong(i) {
let song = this.peeps[i].dataset
this.currentSong = i
return {
src: song.songSrc,
title: song.songTitle,
artist: song.songArtist
}
},
/**
* Set Song Info
* Sets song title/artist info in player, via object values returned with getSong() helper.
* @param {string} Title
* @param {string} Artist
*/
setSongInfo(title, artist) {
this.songTitle.innerHTML = title
this.songArtist.innerHTML = artist
},
/**
* Play Pause
* Play clicked track if not already playing.
* If already playing, pause. Play next track on click.
*/
playPause(el, group) {
this.audio.load()
if (el.classList.contains('is-playing')){
this.isPlaying = true
} else {
this.isPlaying = false
}
if (this.isPlaying){
this.pause()
} else {
this.play()
}
},
/**
* Audio Event Listeners
* Using audio event listeners to trigger our
* play/pause/next methods. Make sure to watch these
* in relation to new audio api updates in chrome 72
*/
audioListeners() {
this.audio.onplay = () => {
this.play()
}
this.audio.onpause = () => {
this.pause()
}
this.audio.onended = () => {
this.next()
}
},
/**
* Play
* Adds 'is-playing' class, starts GooEqualizer
*/
play() {
let active = document.querySelector('.is-playing')
if (active) {
active.classList.remove('is-playing')
}
this.peeps[this.currentSong].classList.add('is-playing')
this.audio.play()
this.isPlaying = true
GooEqualizer.startEqualizer()
},
/**
* Pause
* Removes 'is-playing' class, pauses audio.
*/
pause() {
this.peeps[this.currentSong].classList.remove('is-playing')
this.audio.pause()
this.isPlaying = false
// Give vis a sec to drop off
setTimeout(function(){
GooEqualizer.stopEqualizer()
}, 400)
},
/**
* Next Song
*/
next() {
let nextSong = this.currentSong + 1
let song = this.getSong(nextSong)
let nextPeep = this.peeps[nextSong]
this.audio.src = song.src
this.audio.load()
this.setSongInfo(song.title, song.artist)
this.playPause(nextPeep, this.peeps)
},
/**
* Previous song
*/
prev() {
let nextSong = this.currentSong - 1
let song = this.getSong(nextSong)
let nextPeep = this.peeps[nextSong]
this.audio.src = song.src
this.audio.load()
this.setSongInfo(song.title, song.artist)
this.playPause(nextPeep, this.peeps)
},
};
export default GooPlayer
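For completeness, here’s roughly how the two modules get wired up from an entry file (a minimal sketch; the actual bootstrapping in the repo may differ slightly):
main.js (sketch)
import GooPlayer from './GooPlayer.js'

// Instantiate the player and kick things off once the DOM is ready
document.addEventListener('DOMContentLoaded', () => {
  const player = new GooPlayer()
  player.init()
})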
HBS Partial for Each Peep/Employee
{{#this}}
<article class="playlist-peep">
<a class="playlist-peep__link js-peep" data-song-src="../assets/audio/{{song_url}}" data-song-title="{{song_title}}" data-song-artist="{{song_artist}}">
<figure class="playlist-peep__figure">
<div class="playlist-peep__scaler">
<img id="{{peepNameID first_name last_name}}" class="playlist-peep__img" src="assets/images/peeps/{{avatar}}" alt="{{first_name}} {{last_name}}"/>
<div class="playlist-peep__spindle"></div>
</div>
</figure>
<span class="playlist-peep__icon"><i class="icon-play"></i></span>
<header class="playlist-peep__header">
<h3 class="playlist-peep__name">{{first_name}}</h3>
<span class="playlist-peep__role">{{role}}</span>
</header>
</a>
</article>
{{/this}}
Playlist Data Json
let peeps_data = [
{
"first_name": "Abdul ",
"last_name": "Sharif",
"role": "Campaign Manager",
"song_title": "What Christmas Means to Me",
"song_artist": "CeeLo Green",
"song_url": "Abdul-S-What-Christmas-Means-To-Me.mp3",
"avatar": "abdul-sharif.jpg"
},
{
"first_name": "Althea ",
"last_name": "Conyers Achem",
"role": "PR Account Manager",
"song_title": "Dance of the Sugar Plum Fairy",
"song_artist": "Pyotr Ilyich Tchaikovsky",
"song_url": "Althea-C-A-Dance-Of-The-Sugar-Plum-Fairy.mp3",
"avatar": "althea-conyers-achem.jpg"
},
...
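To tie the partial and the data together, here’s a rough sketch of how the peeps could be rendered with Handlebars in the browser. The #js-peep-template and .js-playlist selectors, the peepNameID helper body, and the data import path are illustrative assumptions, not the project’s exact build setup:
Rendering the peeps with Handlebars (sketch)
import Handlebars from 'handlebars'
// Assumes the data file above exports the peeps_data array
import peeps_data from './peeps_data.js'

// Helper used by the partial to build a unique img id (illustrative)
Handlebars.registerHelper('peepNameID', (first, last) =>
  [first, last].join(' ').trim().toLowerCase().replace(/\s+/g, '-')
)

// Compile the partial's source (assumed to live in a <script> template tag)
const template = Handlebars.compile(
  document.querySelector('#js-peep-template').innerHTML
)

// {{#this}} iterates the root array, so we pass peeps_data directly
document.querySelector('.js-playlist').innerHTML = template(peeps_data)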
Snag the Code
I put the project up on GitHub, so you can go snag the full code there.
And, you can check the live project here.
Credz
The print and packaging stuff was handled by my homie Chaun Osburn.
Words and idears were handled by the super talented Hank Zakroff.
All the digital/web/interaction stuff was handled by me.
Thanks for reading.