Now Playing with Spotify and iTunes

When I started to finalize the features for the WhoNote agent app, I knew I wanted to give users the ability to choose multiple music sources. Out of the gate, the first two I wanted to support were iTunes and Spotify, arguably the most common sources.

Coming from iOS development, I had the naive assumption that there would be a Cocoa equivalent of the MediaPlayer framework. There isn’t. I became one sad panda. I saw quite a number of people “get around the problem” by writing AppleScripts and running them from the command line, or by bundling the AppleScript(s) in their apps and calling them directly. That would have worked, but there had to be a native way of interacting with iTunes. My use case is quite simple: listen for track change notifications and upload the metadata.

RTFM

When in doubt, one should read the manual. Apple has provided an excellent example of how to natively interact with iTunes using Scripting Bridge. Though the subject of Scripting Bridge is much broader than just iTunes, the example couldn’t be better.

Needless to say incorporating iTunes support was a breeze.
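
For reference, the recipe from Apple's Scripting Bridge documentation boils down to generating a header from the app's scripting definition (sdef /Applications/iTunes.app | sdp -fh --basename iTunes) and then talking to the app through SBApplication. Here is a minimal sketch; the iTunesApplication and iTunesTrack class and property names come from the generated header, so adjust them if your generated file differs:

#import <ScriptingBridge/ScriptingBridge.h>
#import "iTunes.h" // header generated by sdef | sdp

- (void)logCurrentITunesTrack {
    // Attach to iTunes via Scripting Bridge and read the current track.
    iTunesApplication *iTunes = [SBApplication applicationWithBundleIdentifier:@"com.apple.iTunes"];
    if (iTunes.isRunning) {
        iTunesTrack *track = iTunes.currentTrack;
        NSLog(@"Now playing: %@ - %@", track.artist, track.name);
    }
}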

Playtime with Spotify

Now that I had a native methodology to go by, adding in Spotify support was just as easy. First I had to generate the header file:

sdef /Applications/Spotify.app | sdp -fh --basename SpotifyClient

Note

I did get the following error message when I ran the above command:

sdp: warning: class “SpotifyClientApplication” redeclared; use instead.

I’m not sure yet if this is a bug in sdp or if I didn’t pass in a correct param, but basically your generated header file will contain the same class in the forward declaration twice.

@class SpotifyClientApplication, SpotifyClientTrack, SpotifyClientApplication;

Just open the file, remove one and you are good to go.
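
With the cleaned-up header in place, connecting to Spotify works exactly the same way as iTunes. Another rough sketch, assuming the generated SpotifyClient.h exposes currentTrack, artist, and name properties (check your generated header, as these may vary by Spotify version):

#import <ScriptingBridge/ScriptingBridge.h>
#import "SpotifyClient.h" // header generated above

- (void)logCurrentSpotifyTrack {
    // Attach to the Spotify client via Scripting Bridge and read the current track.
    SpotifyClientApplication *spotify = [SBApplication applicationWithBundleIdentifier:@"com.spotify.client"];
    if (spotify.isRunning) {
        SpotifyClientTrack *track = spotify.currentTrack;
        NSLog(@"Now playing: %@ - %@", track.artist, track.name);
    }
}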

Being a Good Listener

Though I could interact with both services, the functionality that I needed was to listen for state and song changes. Both services “broadcast” this information through NSDistributedNotificationCenter.
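
Registering for both broadcasts is a one-liner each. A rough sketch, assuming a controller that implements a -playerInfoDidChange: handler (a name I'm making up here):

// Subscribe to the now-playing broadcasts from both players.
NSDistributedNotificationCenter *center = [NSDistributedNotificationCenter defaultCenter];

[center addObserver:self
           selector:@selector(playerInfoDidChange:)
               name:@"com.apple.iTunes.playerInfo"
             object:nil];

[center addObserver:self
           selector:@selector(playerInfoDidChange:)
               name:@"com.spotify.client.PlaybackStateChanged"
             object:nil];

The userInfo dictionaries that arrive look like the payloads below.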

iTunes Notification

com.apple.iTunes.playerInfo

2013-04-08 11:12:13.652 WhoNoteMac[7939:303] notification payload: {
Album = "Collide With the Sky";
"Album Artist" = "Pierce the Veil";
"Album Rating" = 0;
"Album Rating Computed" = 1;
Artist = "Pierce the Veil";
"Artwork Count" = 1;
"Disc Count" = 1;
"Disc Number" = 1;
Genre = Alternative;
"Library PersistentID" = 162261630986489880;
Location = "file://localhost/Users/cwiles/Music/iTunes/iTunes%20Media/Music/Pierce%20the%20Veil/Collide%20With%20the%20Sky/07%20Tangled%20In%20the%20Great%20Escape%20(feat.%20Jason%20Butler).m4a";
Name = "Tangled In the Great Escape (feat. Jason Butler)";
PersistentID = "-7678479651277138133";
"Play Count" = 1;
"Play Date" = "2013-04-08 14:24:50 +0000";
"Player State" = Playing;
"Playlist PersistentID" = 1335587615931680054;
"Rating Computed" = 1;
"Skip Count" = 0;
"Store URL" = "itms://itunes.com/album?p=536038964&i=536039373";
"Total Time" = 356893;
"Track Count" = 13;
"Track Number" = 7;
Year = 2012;
}

Spotify Notification

com.spotify.client.PlaybackStateChanged

2013-04-08 11:18:52.679 WhoNoteMac[8053:303] notification payload: {
Album = "In Search Of Solid Ground";
"Album Artist" = Saosin;
Artist = Saosin;
"Disc Number" = 1;
Duration = 215;
"Has Artwork" = 1;
Location = "/Users/cwiles/Music/iTunes/iTunes Media/Music/Saosin/In Search Of Solid Ground/08 The Worst Of Me.m4a";
Name = "The Worst Of Me";
"Play Count" = 0;
"Playback Position" = 0;
"Player State" = Playing;
Popularity = 38;
Starred = 0;
"Track ID" = "spotify:track:1wiIhmcIU1hJWqEKhcaXcp";
"Track Number" = 8;

}

Finally, with all the pieces in place, I was able to get all the meta information I needed to send to the WhoNote backend. While I do wish there were an official Cocoa framework for interacting with iTunes, utilizing Scripting Bridge does provide a blessed mechanism for interacting with other installed applications.

Example Implementation

Gist of how it looks
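
A rough sketch of what the handler side might look like, assuming the hypothetical -playerInfoDidChange: selector registered earlier; -uploadTrackMetadata: is likewise a made-up placeholder for the call to the WhoNote backend:

// Called for both com.apple.iTunes.playerInfo and
// com.spotify.client.PlaybackStateChanged notifications.
- (void)playerInfoDidChange:(NSNotification *)notification {
    NSDictionary *info = notification.userInfo;

    // Both players report the state under the same key.
    NSString *state = info[@"Player State"];
    if (![state isEqualToString:@"Playing"]) {
        return;
    }

    // Keys common to both the iTunes and Spotify payloads above.
    NSDictionary *track = @{ @"name"   : info[@"Name"]   ?: @"",
                             @"artist" : info[@"Artist"] ?: @"",
                             @"album"  : info[@"Album"]  ?: @"" };

    // Hand the metadata off to be uploaded to the WhoNote backend.
    [self uploadTrackMetadata:track];
}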

Let's Talk iPhone Event

I always love the last few months and weeks that build up to an iPhone/iPad announcement. Though some of the rumors have turned into fact, most are just random, fabricated factoids that many MSM outlets and bloggers push out there for the masses. At the end of the day I can only presume that Apple doesn't pay those any attention except in the case of lost hardware.

However, I am not immune to reading a lot of these misleading articles. Granted, I don't treat them as gospel, but there are a few things that I am very excited about.

First is the official rollout of iCloud. Having used the beta version the past few months, I have been pleasantly surprised by its convenience and ease of use. It is a huge win for the average consumer who has, up to this point, been reliant on iTunes and various other backup strategies to manage their content and media.

Second is iTunes Match. It has always been somewhat of a letdown that users couldn't access all of their music from any of their devices. The process is extremely easy, and now you don't have to carry around a separate iDevice to hold your music catalog.

Third is wireless syncing. I have a strong feeling that most IT administrators and children/grandchildren who have to provide tech support to their parents/grandparents will jump up and down over this.

Fourth, the API features in the new iOS have once again raised the bar for app developers. Apple has listened to feedback from developers to make it easier for us to provide better functionality in our apps as well as more detailed customization options.

*Bye-bye drawRect: category*

Fifth (possibly), Apple Assistant. Rumor alert! Could the headline "Let's Talk iPhone" really have a hidden meaning?
Apple gave the masses the first decent implementation of multi-touch, the communication mechanism that is inherent to us all since birth: pointing, dragging, panning, zooming, swiping. With the acquisition of Siri, there is no doubt that Apple has been putting a lot of effort into creating the near-perfect implementation of voice recognition for the masses: on-device voice processing via natural speech that is a native part of iOS. If this is the case, then you will see a whole new generation of apps and a paradigm shift in how users, impaired or not, interface with these applications.

Apple...let's talk.