Moving to Medium

Over the next few months I'll be migrating the blog over to Medium. I was an early Posterous adopter and really enjoyed the service. I was happy to move over to Posthaven as a simple, easy-to-use alternative.

However, my needs have grown while the platform's features and updates have not kept pace. This is not a dig at the owners and developers of Posthaven. I think what they have committed to is admirable, and we need more founders like them.

If you arrive here and are looking for my latest contributions please head over to Medium.

Objective-C Developer in a Swift World / Map and Filter

Old Dog Swift Tricks

Over the past year and a half I’ve spent as much time as possible learning Swift. I don’t care how senior an iOS developer you are, Swift is a shift in language and paradigms. I hit the reset button on my development career and wanted to get up to speed with the new language in a short amount of time.

I’m Loving It

I’ve been reading, coding, cursing and embracing Swift since 1.0 (including Swift 3) and I love the language. Before I taught myself Objective-C I programmed in Java, PHP and occasionally Python. Those languages were extremely fun, but didn’t hold a candle to Objective-C or Swift.

Below are some examples of the new language features.

New Paradigms

  • Map
  • Reduce
  • Filter
  • Generics
  • ErrorType
  • Functional
  • Protocols
  • Value Types

Adjustments

Adjusting to these paradigms hasn’t been easy, but it has all been worth it. Swift is the future and the latest Xcode is my DeLorean.

Full Steam Ahead

Apple and the community have spoken…Swift is the future. With that said, I’m not letting my Objective-C (fu) go to waste. I, like many others, have a TON of code written (and needing to be maintained) in Objective-C. The language and features are amazing, and if you are starting out in iOS development then I would strongly suggest boning up on your Objective-C (fu).

Lessons Learned

Embrace the Swift change swiftly…and by change I mean the higher-order functions map and filter.

Map

Definition

The map method solves the problem of transforming the elements of an array using a function. Let’s look at some examples.

Example

Let’s say you want to display a list of names from Person objects in a UITableView. The “old” way, I would have a list of Person objects, and in my UITableViewCell configuration I would grab the Person instance from the list and set textLabel.text to its name property value.

In the Swift “way”, map simplifies this:

struct Person {
    let name: String
    let age: Int
}

// A sample list so the snippet runs on its own.
let people = [Person(name: "Katie", age: 23), Person(name: "Bob", age: 21)]

let names: [String] = people.map { return $0.name }
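
For completeness, here is how those mapped names might drive the cells; a minimal sketch assuming a standard UITableViewDataSource and a "Cell" reuse identifier (names I chose for illustration):

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
    // names comes from the map above; no Person lookup needed in the cell.
    cell.textLabel?.text = names[indexPath.row]
    return cell
}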

You can also apply the same principles to dictionaries.

Uppercase the values

let payload: Dictionary<String, String> = ["hello": "world", "steve": "jobs"]
let result = payload.map { (key, value) in (key, value.uppercased()) }

/// Output

[("hello", "WORLD"), ("steve", "JOBS")]

Filter

Definition

The filter method solves the problem of selecting the elements of an array that pass a certain condition.

Example (name filter)

Filter names from a pre-defined list

Extending our map example from above, let’s say you have a list of Person objects whose names you want to filter against another list. Somewhat like a search filter.

struct Person {
    let name: String
    let age: Int
}

let people = [
    Person(name: "Katie",  age: 23),
    Person(name: "Bob",    age: 21),
    Person(name: "Rachel", age: 33),
    Person(name: "John",   age: 27),
    Person(name: "Megan",  age: 15)
]

let matchingNames = ["Katie", "Rachel", "Megan"]
let names: [String] = people.map { return $0.name }
let filteredNames = names.filter { matchingNames.contains($0) }

/// Output

["Katie", "Rachel", "Megan"]

Example (character filter)

let badCharacters: [Character] = [".", "$", "#", "[", "]"]
let badChannelName: String = "thisis my string[.$#"

let cleanName = String(badChannelName.characters.filter { !badCharacters.contains($0) })

/// Output

"thisis my string"

Example (filter dictionary)

let data: [String: String] = ["hello": "world", "steve": "jobs"]
let filtered = data.filter { $0.1 == "world" }   // [(key: "hello", value: "world")]
// let filtered = data.filter { $1 == "jobs" }

Real World

I have a list of objects with three properties: name, imageName and selected. You can toggle the selection from the list display. If the user selected one, I toggled the “selected” flag and added the item to a local array of toggled items, not caring about duplicates. When the user finished the workflow, I needed to pass only the names of the selected options to another view controller.

The obvious question is “How do you filter an array of duplicates?”. With a combination of map, filter and a computed property, the answer is out there.

Let’s assume I have two lists of sports that can be selected or unselected, plus the data sources for those lists. What is the optimal solution for returning a final list of objects whose selected value is true?

Let’s start with the lists:

let SportPickerTypesRight: Array<SportPickerType> = [SportPickerType.create("NFL", imageName: "nfl_background"),
                                                     SportPickerType.create("NCAA Football", imageName: "football_background"),
                                                     SportPickerType.create("NCAA Basketball", imageName: "basketball_background"),
                                                     SportPickerType.create("NBA", imageName: "nfl_background")]

let SportPickerTypesLeft: Array<SportPickerType> = [SportPickerType.create("NHL", imageName: "nhl_background"),
                                                    SportPickerType.create("Soccer", imageName: "soccer_background"),
                                                    SportPickerType.create("MLB", imageName: "mlb_background"),
                                                    SportPickerType.create("Other", imageName: "other_background")]
var SportPickerTypes: Array<SportPickerType> {

    let combined = [SportPickerTypesRight, SportPickerTypesLeft]
    
    return combined.flatMap { $0 }
}

struct SportPickerType {

    let name: String
    
    let imageName: String
    
    var selected: Bool
    
    static func create(_ name: String, imageName: String) -> SportPickerType {
        return SportPickerType(name: name, imageName: imageName, selected: false)
    }
}

extension SportPickerType: Equatable {}

extension SportPickerType: Hashable {
    
    /// Equal values must hash equally, so hash only the name that == compares.
    var hashValue: Int {
        return name.hashValue
    }
}

func ==(lhs: SportPickerType, rhs: SportPickerType) -> Bool {
    return lhs.name == rhs.name
}

/// When you want to return a list of unique selected sports
///
/// What the property does is return a set (unique values),
/// then filters that set for only values with selected == true
/// and returns the array of names

var selectedSportTypes: Array<String> {

    let uniqueSports: Set<SportPickerType> = Set(self.unfilteredSportTypes)
    let mappedNames: Array<String> = uniqueSports.filter({ $0.selected == true }).map { return $0.name }

    return mappedNames
}

Understanding these higher-order functions brings clarity and elegance to your code. Swift is the future, and learning them is well worth the effort.

Handling Min Character Requirements in iOS with Swift

Bytes Not Length

While working on a little side project I was testing some form validation on a text view. The business rule was that the user has to enter at least three characters before they can save the form. However, a single emoji should pass as well, and it wasn’t because I was checking character length only.

One ASCII string character == 1 byte, but one emoji == 4 bytes (in UTF-8).

I just had to change my validator to look for string bytes greater than or equal to 3. Not a sexy solution, but it reminded me to a) not make assumptions and b) really think through use cases when dealing with form inputs.
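
A minimal sketch of that check (isValidEntry is a name I made up for illustration):

func isValidEntry(_ text: String) -> Bool {
    // "abc" is 3 UTF-8 bytes; a single emoji such as "😀" is 4,
    // so either one satisfies a 3-byte minimum.
    return text.utf8.count >= 3
}

isValidEntry("abc") // true
isValidEntry("😀")  // true
isValidEntry("ab")  // false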

Check out the playground.

2.0.1 More Than Meets the Eye

More Than Meets the Eye

2.0 was released back in mid December. The release was a 6 week effort to completely overhaul the UI. Today’s release was focused more on UI tweaks, bug fixes and universal app linking.

This Feedback is BAD

Seemingly overnight, analytics showed that two-thirds of iOS users are now on iPhone 6(s) phones. Two bug reports that came in on the same day made me panic. Users who were trying to do mobile broadcasts on a 6 or 6s were getting horrible feedback from the phone. I’m not sure if this is hardware specific or an iOS update that introduced a regression.

In previous versions of the app, and on lesser hardware, when a user wanted to start a broadcast I used the following configuration for the audio session:

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
[[AVAudioSession sharedInstance] setActive:YES error:nil];
[[AVAudioSession sharedInstance] setMode:AVAudioSessionModeDefault error:nil];

In order to fix the broadcasting feedback issue I had to override the audio output port like so:

    NSError *error;
    
    if (![[AVAudioSession sharedInstance] overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&error]) {
        RTVLog(@"AVAudioSession error overrideOutputAudioPort to Reciever:%@",error);
    }

Realtime Play Counts

Stats are not really very useful in a “snapshot” context. This release added in functionality that will update a broadcast’s play count in real time.

Up and Down

It seemed awkward for many users in the early days of iOS to “swipe” for actions, but over the past 3+ years Apple has been including these mechanisms for interacting with the OS and native apps (pull to refresh and mail controls). I added swipe gestures on the broadcast header for up / down voting, as sketched below. The gestures free up real estate and feel more natural.
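
For illustration, a minimal sketch of wiring up those gestures, in Swift for brevity (BroadcastHeaderViewController, headerView and the vote actions are my assumptions, not the app's real API):

import UIKit

final class BroadcastHeaderViewController: UIViewController {

    let headerView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Swipe up on the header to up vote, down to down vote.
        let upSwipe = UISwipeGestureRecognizer(target: self, action: #selector(didSwipeUp))
        upSwipe.direction = .up
        headerView.addGestureRecognizer(upSwipe)

        let downSwipe = UISwipeGestureRecognizer(target: self, action: #selector(didSwipeDown))
        downSwipe.direction = .down
        headerView.addGestureRecognizer(downSwipe)
    }

    @objc func didSwipeUp() { /* cast an up vote */ }
    @objc func didSwipeDown() { /* cast a down vote */ }
}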

Custom Filter Control

I’m a huge advocate for using 3rd party libraries as little as possible in my apps, unless absolutely necessary. There are a tremendous number of truly amazing controls and libraries written by people smarter than myself, but I learn more by doing than by copying. We wanted to move the “Live” and “Upcoming” filter from the drop down to above the lists and let the user toggle there.

There are a million ways to skin that cat. I’m not the biggest fan of the UIButton class due to all the setup, but using an interactive UIView didn’t seem the right approach either. I settled on creating my own custom UIControl, which was a fun little task that only took a few hours. Though it is simple to look at and interact with (which was the goal), the subtle animations and control states are what make it shine, IMHO.
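
The post doesn't include the control's source, but here is a minimal Swift sketch of the idea (FilterControl and selectedIndex are names I made up; the real control's animations and states are more involved):

import UIKit

final class FilterControl: UIControl {

    private let liveLabel = UILabel()
    private let upcomingLabel = UILabel()
    private(set) var selectedIndex = 0

    override init(frame: CGRect) {
        super.init(frame: frame)
        liveLabel.text = "Live"
        upcomingLabel.text = "Upcoming"
        [liveLabel, upcomingLabel].forEach { addSubview($0) }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Two equal halves: "Live" on the left, "Upcoming" on the right.
        let half = bounds.width / 2
        liveLabel.frame = CGRect(x: 0, y: 0, width: half, height: bounds.height)
        upcomingLabel.frame = CGRect(x: half, y: 0, width: half, height: bounds.height)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        guard let point = touches.first?.location(in: self) else { return }
        selectedIndex = point.x < bounds.midX ? 0 : 1
        // A subtle fade on the unselected option stands in for control states.
        UIView.animate(withDuration: 0.2) {
            self.liveLabel.alpha = self.selectedIndex == 0 ? 1.0 : 0.5
            self.upcomingLabel.alpha = self.selectedIndex == 1 ? 1.0 : 0.5
        }
        sendActions(for: .valueChanged)
    }
}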

There is always room for improvement. If you have any comments or questions about features or bugs please email me at cory {@} rabble (.) tv

Rabble 1.3 Released - It's a BFD

In the Beginning

The two main functions of RabbleTV are listening and broadcasting. Since I first met with RabbleTV to talk about what they wanted, one of the most talked about features has been broadcasting from the app. We decided it was best to leave it out of the first few versions of the app because of the huge development effort involved in this one feature.

Once version 1.2 was safe and sound in the App Store I dove into the deep end of live broadcasting from the app. Ben (CTO) and I spent a few days just going through what was on the web and what needed to be re-thought to provide the best experience on the phone.

A BFD

On the desktop version you can create a broadcast for Sports, TV / Movies or a RabbleCast. The first two have quite a bit of workflow that is doable on the phone, but would require an enormous effort on the client side to match the ease of use found on the web. We wireframed the solution, but decided for this release we would stick with RabbleCasts. With the workflow set out it was time to create the 1.3 feature branch and get to work.

Build, Buy or Borrow

AVPlayer provides a wonderful API for streaming audio and video. Unfortunately, there is no equivalent for sending audio and video; you really need to drop down to lower level APIs. When I sat down to start writing the broadcasting functionality I had 3 choices:

  1. Write my own
  2. Use an open source solution
  3. Purchase an off-the-shelf solution

I’m a huge proponent of writing my own libraries (if possible)…especially if it centers around a core piece of functionality in the app. In this case, due to the feature timeline, it made sense to go with something stable that someone else had written. I just didn’t have the time to write the library. YET!

There aren’t many open source solutions, but the one I decided to use is MediaLib. After a few configuration changes I was up and running fairly quickly. I felt a little uneasy about using the library because most of the source code is compiled into static libraries, and since this was such an important piece of the app going forward I didn’t want to be caught in a situation where an iOS upgrade broke the library.

Since I still wasn’t in a place to write my own library, we decided it would be worth the money to purchase a license (compiled version) from RealTimeLibs. Being the cautious person that I am, I got in contact with their product support to ask specifically about their iOS 9 support and development, upgrades, licensing and some custom development that we needed added in. They were very quick in responding and gave me all the information that I needed, and a license was purchased.

I was assured that the custom development was only going to take 4 hours and would cost X amount extra. No problem. Weeks went by with no word on when the custom development was going to be done. This was very annoying, but still not a deal breaker since I had a stop gap in place. What was a deal breaker was the lack of background support.

When a user was broadcasting and put the app in the background, the stream was disconnected.

RTMP sends / receives audio and video. Even though I was doing audio-only broadcasts, there was still a video capture session initialized, and you can’t capture video with the app in the background. I contacted their support about this and was told…well, not what I wanted to hear, and I knew better.

It is a good thing I didn’t delete my old manager. It was easy to revert the project back to using my open source based solution.

Having been reaffirmed in my belief of being in control of my app’s libraries I’ll be writing my own over the next few months when time allows.

Testing is Hard

Unlike Bill O’Reilly, I don’t just assume that something is going to work live. Any sort of live media publishing on a mobile device is going to get interrupted by FaceTime, phone calls, phone restarts, etc. A lot of time was spent testing various application states and system interruptions to make sure that listeners were notified, as well as giving the broadcaster the ability to restart a broadcast if necessary.

I even found a bug with iOS 9 regarding FaceTime.

Additional Features

In addition to broadcasting, these additional features and enhancements were added:

  • iOS 9 support
  • Spotlight search
  • Background fetch
  • Better list caching
  • New default broadcasting image
  • Comment links now have better touch handling

Don’t wait, download or update now!!!!

Broken FaceTime Audio Interruptions in iOS 9

Constant Interruptions

I’ve been working on a new feature for RabbleTV for the past 3 months. Now that the functionality is pretty close to shipping, I’m going through all the various testing scenarios to make sure that if/when the user gets an audio interruption (phone call, FaceTime, etc.) I handle the app state appropriately.

Apple has pretty straightforward guidelines for how to handle these types of interruptions. When one occurs the AVAudioSessionInterruptionNotification is sent out, and the developer can inspect whether the AVAudioSessionInterruptionTypeKey is equal to AVAudioSessionInterruptionTypeBegan or AVAudioSessionInterruptionTypeEnded and handle their app state and UI appropriately.
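
In Swift (the app itself is Objective-C), the standard handling looks roughly like this sketch:

import AVFoundation

// Keep the token around so the observer can be removed later.
let token = NotificationCenter.default.addObserver(
    forName: .AVAudioSessionInterruption,
    object: AVAudioSession.sharedInstance(),
    queue: .main) { note in

    guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSessionInterruptionType(rawValue: raw) else { return }

    switch type {
    case .began:
        // Pause recording/playback and update the UI.
        break
    case .ended:
        // Reactivate the audio session if appropriate.
        break
    }
}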

HOWEVER, looking closer at the documentation there is one very important sentence:

There is no guarantee that a begin interruption will have an end interruption. Your app needs to be aware of switching to a foreground running state or the user pressing a play button. In either case, determine whether your app should reactivate its audio session.

Cause for Panic?

Receiving FaceTime calls on iOS 8 works “as expected”. The appropriate notifications are fired off by iOS and I’m able to pause and continue my use of the microphone. The development was moving right along until I started testing under iOS 9.

Going through the same scenario on iOS 9, the AVAudioSessionInterruptionTypeEnded type is never delivered. The AVAudioSessionInterruptionNotification fires when the interruption begins and ends, but the only type that is set is AVAudioSessionInterruptionTypeBegan.

iOS 9 Log

2015-09-17 09:40:32.098 RabbleTV[6541:2258301] _66-[RTVBroadcastingManager addAudioInterruptionNotificationListener]block_invoke(271): notification for interruption NSConcreteNotification 0x14e8a6ea0 {name = AVAudioSessionInterruptionNotification; object = <AVAudioSession: 0x14e813220>; userInfo = { AVAudioSessionInterruptionTypeKey = 1;}}

iOS 8 Log

2015-09-17 09:34:35.405 RabbleTV[471:106341] _66-[RTVBroadcastingManager addAudioInterruptionNotificationListener]block_invoke(271): notification for interruption NSConcreteNotification 0x17f31ab0 {name = AVAudioSessionInterruptionNotification; object = <AVAudioSession: 0x17ee4860>; userInfo = { AVAudioSessionInterruptionTypeKey = 1;}}

2015-09-17 09:34:52.715 RabbleTV[471:106341] _66-[RTVBroadcastingManager addAudioInterruptionNotificationListener]block_invoke(271): notification for interruption NSConcreteNotification 0x17dae300 {name = AVAudioSessionInterruptionNotification; object = <AVAudioSession: 0x17ee4860>; userInfo = { AVAudioSessionInterruptionOptionKey = 1; AVAudioSessionInterruptionTypeKey = 0; }}

I have some business rules that add a bit more complexity to working around this issue, but if your only concern is knowing when the interruption began and ended, then you can set a property to toggle between the states, as sketched below.
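
A minimal sketch of that toggle workaround, extending the observer sketch above (isInterrupted is a name I made up):

var isInterrupted = false

func handleInterruption(_ note: Notification) {
    guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSessionInterruptionType(rawValue: raw) else { return }

    if type == .began && !isInterrupted {
        isInterrupted = true
        // Interruption started: pause the broadcast and update the UI.
    } else {
        // Either a real .ended or (on iOS 9) a second .began: treat it as the end.
        isInterrupted = false
        // Let the user restart the broadcast if necessary.
    }
}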

I’ve filed a radar with Apple.

Detecting When UIGravityBehavior is “Off Screen”

A Simple Implementation

Yesterday I was working on some UI enhancements for a new feature that is coming to RabbleTV. One of the pieces of the UI involved using UIDynamics…specifically UIGravityBehavior. This was going to be a pretty straightforward implementation considering I didn’t need to use the UIGravityBehavior with any other types as I had done in previous apps.

Assumptions Are Bad

During some testing I noticed that the CPU would spike during the animation, but never go back down after the animation was, I assumed, complete…in my case, once the “fall” passed the reference view’s bounds. I didn’t think too much of it at the time because I still needed to add in my completionHandler. I kicked the can down the road for a few hours until I could profile it, assuming it must be a coincidence since I’m also making network calls during this animation.

Upon the first run of my now completed UI animation the completionHandler wasn’t called. I checked and double checked my code, and all the appropriate delegates and properties were set. The next part of my debugging strategy was to see when exactly the behavior stopped. Perhaps I was trying to perform an action before everything had completed. This is where my assumption bit me.

I had assumed that UIGravityBehavior was completing, but in reality it wasn’t. I was able to verify this by logging the falling item’s motion in the reference view using linearVelocityForItem.

The fall was infinite. After I stopped and thought about it, it made sense. If UIGravityBehavior is supposed to represent gravity on an object and space is infinite, why would it ever stop? I had never run into this before because in all my other experiences with UIDynamics I used UIGravityBehavior in conjunction with other behaviors.

Choose Your Solution

As I saw it, I had two possible solutions to fix my issue.

First

Use UICollisionBehavior. There really isn’t much more to say there. You can use setTranslatesReferenceBoundsIntoBoundaryWithInsets: to set up the area where you want the items to “stop”, as sketched below.
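
A minimal Swift sketch of that option, assuming an existing UIDynamicAnimator (animator) and the falling views (items):

let collision = UICollisionBehavior(items: items)
// Turn the reference view's bounds into a boundary so items stop at its edges.
collision.setTranslatesReferenceBoundsIntoBoundary(with: .zero)
animator.addBehavior(collision)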

Second

Add a UIDynamicItemBehavior that checks the Y coordinate as the items are falling (specifically the last item). Once it is past the height of the reference view, remove the behaviors.

And the winner is…

I opted for the second approach because it gave me more control over when to stop the animation. Once I updated my animation controller all of the delegate and completionHandlers were properly called.

Code Snippet

// MARK: Public

- (void)animateItemsWithCompletionBlock:(RTVStandardCompletionBlock)block {

    if (block) {
        self.animationCompletionBlock = [block copy];
    }

    self.animator.delegate = self;

    NSArray *referenceItems = self.itemReferences.allObjects;
    
    /**
     * Gravity Behavior
     */

    UIGravityBehavior *gravityBehavior = [[UIGravityBehavior alloc] initWithItems:referenceItems];

    [self.animator addBehavior:gravityBehavior];
    
    /**
     * Dynamic Behavior
     *
     * @note
     * I'm adding the dynamic behavior so that I can tell when the last item
     * has fallen past the bottom of the screen. Once it has then I remove all
     * the behaviors. This will trigger the animator delegate method, which will
     * call the completionBlock.
     *
     * Without doing this the view continues to "fall" and eats up the CPU.
     * Another possible solution is to setup a collision barrier which should
     * trigger it as well.
     */
    
    UIDynamicItemBehavior *dynamicItemBehavior = [[UIDynamicItemBehavior alloc] initWithItems:referenceItems];
    
    __weak UIDynamicItemBehavior *weakBehavior = dynamicItemBehavior;
    __weak typeof(self) weakSelf = self;

    dynamicItemBehavior.action = ^{

        /**
         * @note
         * You only need to wait for the last object to finish (drop below) as 
         * opposed to iterating over all the items.
         */

        CGFloat currentY = [weakBehavior linearVelocityForItem:referenceItems.lastObject].y;
        
        if (currentY > CGRectGetMaxY(weakSelf.animator.referenceView.frame)) {
            [weakSelf.animator removeAllBehaviors];
        }
    };

    [self.animator addBehavior:dynamicItemBehavior];
    
    /**
     * Implicit Animation of Alpha
     */
    
    [referenceItems enumerateObjectsUsingBlock:^(UIView *item, NSUInteger idx, BOOL *stop){

        [UIView animateWithDuration:0.65
                              delay:0
                            options:kNilOptions
                         animations:^{
                             item.alpha = 0.0f;
                         }
                         completion:nil];
    }];
}

Lessons (Re)Learned

  1. Space is infinite
  2. Never assume anything

Rabble 1.2 Released

Today version 1.2 launched. Launching 1.1 was pretty hectic and I missed detailing some of the more technical changes from 1.0. Each iteration brings important lessons learned.

Version 1.1

Your version 1.0 is always the hardest. This is because you want to get to market as quickly as you can, but not at the expense of quality and user experience. Before 1.0 was started, 1.1 was already roadmapped, and I began basic development on it a few weeks before 1.0 was released.

Features, Fixes and Improvements

  • Added ability to open a broadcast detail link from the web directly in the app. I wrote my own router classes for this, which I can reuse for more deep linking functionality; this also simplified the code for the OAuth dance that must be performed for social media authentication.
  • Fixed bug where an archived broadcast would start playing if it was paused, the app was put into the background, and then brought back to the foreground. This was caused by the state of the AVPlayer and KVO calls automatically calling play on the instance.
  • Fixed display issues with Rabblecasts.
  • Fixed crash that would occur when trying to load more “featured” broadcasts.
  • Fixed issue where sometimes broadcasts for PLU would be co-mingled.
  • Added a loading indicator for play/pause. AVPlayer has to do some setup work before the HLS stream will start playing, and depending on the network connection this can take a bit of time, so I overlay a UIActivityIndicatorView.
  • Wrote my own custom label with link tap support so comments are much better. This was one of my favorite improvements. I ended up migrating most of the other labels to this new subclass.
  • Caching lists. I consider caching to be both a shield and a sword. On one hand it allows for a much better user experience and the conservation of resources; on the other it introduces lots of potential bugs and, in a multithreaded environment, race conditions. Further down the roadmap I will be integrating Core Data, but as “step 1” I’m using NSCache. I created a fairly nice cache manager and protocol that sits on top of it so that I can swap out NSCache or bolt Core Data on top when the time comes (see the sketch after this list).
  • Reminders. This was the biggest feature of the release. When a user schedules a broadcast there isn’t a 100% guarantee that it will start, or start on time. This is not a fault of Rabble’s service, but rather that the broadcaster has failed to start it and/or is running late. On each upcoming broadcast detail screen you can set a reminder. If/when the broadcaster starts you’ll be notified via push notification. This is a big improvement for user engagement and retention.
  • Fixed issue where certain sports broadcasts were showing (null).
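
A minimal Swift sketch of that cache-manager idea; CacheStore and MemoryCacheStore are names I made up, and the real manager does more:

import Foundation

protocol CacheStore {
    func object(forKey key: String) -> AnyObject?
    func set(_ object: AnyObject, forKey key: String)
}

/// NSCache-backed store today; a Core Data-backed store can conform later.
final class MemoryCacheStore: CacheStore {
    private let cache = NSCache<NSString, AnyObject>()

    func object(forKey key: String) -> AnyObject? {
        return cache.object(forKey: key as NSString)
    }

    func set(_ object: AnyObject, forKey key: String) {
        cache.setObject(object, forKey: key as NSString)
    }
}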

Version 1.2

The goal of 1.2 was more polish, speed enhancements, and pointing the app’s UI/UX in the direction of some big features coming over the next two to three months.

  • Simplified the app navigation which will allow for adding in additional features (hint hint).
  • Custom transitions.
  • Fixed nasty keychain bug. Fortunately this affected only one user. The issue came about during some UAT. The bug occurred if you deleted the app without logging out first, reinstalled the app and tried to log in. The old keychain value wasn’t being deleted, so you would find yourself in an “authentication” loop.
  • Custom font.
  • Moved to swipe actions for comments and role actions. The comments’ cell looked pretty crowded and the touch areas for up and down voting weren’t optimal. I moved everything over to a swipe-to-reveal control similar to the native mail application. This also allowed me to add in more role based actions for deleting and muting comments.
  • Fixed overlay bug that occurred on the main broadcast list screens.
  • Tweaked caching.
  • Migrated my RabbleKit static library over to a framework.

Intro to iOS Class Teaching Experience

File -> New -> Teaching Class

It is no secret that there has been a growing, and necessary, push to bring more women into tech. I was asked a few months ago if I would be interested in teaching an intro to iOS class for women at Lamp Post Group. It would be a six week course that met two hours every Wednesday. I jumped at this opportunity. I’ve given lectures and pitches before, but never taught a class, so I figured this would be an amazing opportunity to:

a) contribute something positive to the local female tech community

b) help myself grow in an area that I’m interested in

c) promote iOS development.

When I talked with the program director at LPG I learned of my most exciting challenge yet…the diverse backgrounds of my students. This included not only age, but also the level of prior development experience. I couldn’t wait to get started.

Develop and Teach with a Purpose

The most memorable classes I had in college were always the ones where my professors had passion for the subject(s) they were teaching. It shouldn’t be a surprise to anyone that those were the classes I excelled in. I was average in the computer science and M.I.S. classes I took because of my teachers’ lack of enthusiasm.

One of my primary goals was to make sure that my students knew that I was passionate about iOS development and that they had my full attention. I was there for them.

The next step was to outline a curriculum that I could present to the class. This was tougher than I had anticipated. Most of the intro classes that I have seen or been in always position themselves as “by the end of the class or workshop you will have built an app”. I personally think this does a disservice to the attendees. My reasoning for feeling this way is that if you break down the various components it takes to actually create an app, the coding portion is but a fraction. What about the actual submission process, Xcode, design patterns, language features, etc? These are all very important concepts that are too often skipped.

This would not be the case in my class. At the end of the class I wanted the women to have the necessary foundation to carry on with their future careers.

Pencil vs Pen

At the beginning of the first class I told everyone that our curriculum wasn’t going to be written in stone. Granted, I had a curriculum outlined, but it was important that we move at the pace of the class, so the topics covered could and would be adapted.

In retrospect this was the best decision I made.

Classroom Supplies

PowerPoint was always the bane of my existence in college and early in my career. Keynote was a much welcome alternative, but I probably use less than 1% of its power. I don’t use animated transitions or fancy slide layouts. I keep each slide to a topic or sub topic and engage my students. There is nothing worse than having a presenter read verbatim from their presentation.

Deckset was the perfect solution. Creating presentations using Markdown was right up my alley. I can’t endorse this product enough.

if let humbled

I’ve been writing software for the past 18 years and iOS apps since the initial SDK. Teaching this class has been without a doubt my greatest challenge and reward. Truth be told, I was nervous each passing week about whether people would return or whether I was scaring away future developers, but each week the classroom was full and the questions were more intriguing.

Over the six weeks I think I learned just as much as my students did.

During our last class I asked if anyone had any final thoughts or questions before we signed off. The question shared amongst the majority of the class was, “Where do we go if we want to learn more?”. A huge smile came over my face and I felt that I was successful in teaching the class. I knew that I wasn’t going to be able to cover every topic that I wanted to nor would everyone “get it” at the end, but my goal was to inspire them to keep learning.

Code of Wisdom

The advice I gave them on becoming successful developers…

Presentations

All of my decks and samples are on GitHub.

Mute.Mic.Rabble 1.0

Recipe for Success

I’ve never quite understood why television broadcasting, especially sports, has continued to decline in quality over the past 20 years. I speculate that the networks would rather spend money on “personalities” that actually distract from the event than enhance it. In addition, there are so many rules and regulations imposed by the FCC, networks and sponsors that I always felt watered down the experience.

In other mediums, such as podcasts, forums, neighborhood bars or a friend’s basement, you have individuals, fans and super fans who provide much better commentary, stats and experiences than you would ever find on the networks. These individuals aren’t shackled by the doldrums plaguing corporate broadcasting and are free to really “call it like they see it”. Unfortunately, there hasn’t been a platform for these loquacious verbalizers to sound off…one that allows for not just one-way communication, but also the ability to interact with their listeners in real time.

Introducing Rabble.tv

Serendipity

I knew my contract with Life360 was ending last December, so around November I started to look around for the “next thing”. I got a call from Ben (CTO of Rabble and a best friend) asking if I would come to Nashville and chat with the other two founders about what it would take, or if it was even possible, to build their iOS app. I don’t need much of an excuse to go hang out with Ben and other cool people in Nashville. After spending about half a day hearing what they needed and when they needed it, I headed back to Chattanooga with the green light to build the first RabbleTV app.

File -> New Workspace/Project/Design/EcoSystem, etc.

~35 days to develop…that’s what I had.

When I sat down to start cranking on the app, it really began with a todo list of tasks that needed to be completed before I could actually start writing code. This included setting up the developer account, certs, GitHub repos, workflows for testing and general design/wireframe elements. Since there was no time to waste I jumped right on my list and started knocking things out.

Because fate loves to provide us with challenges, a few were thrown my way. These came in the form of getting the flu for almost 5 days, as did my wife and kids. Nothing brings development to a halt like a house full of sick people. Despite this setback I was able to make up the time. Ironically, what caught me up was going on vacation with my wife to Tulum. We had our trip planned for almost a year, and right before we left I was definitely feeling the pressure. I had pulled many late nights, so completely disconnecting for 4 days was exactly what I needed. When I got back the saw was sharpened and was cutting faster and better than ever.

Development

Writing Swiftly with Objective-C

The entire app was written in Objective-C. Yep, that’s right…not one line of Swift. My reasoning: very limited development time, and right now I’m much faster with Objective-C. Future releases will incorporate Swift. I did set up the project to support Swift from the beginning, knowing that I wanted to use it in the near future.

Hard Feature Decisions

The features on the web far outnumber the features in version 1 of the app. This was both a necessity and a choice. The end goal of my development time was to go from version none to version one. Trying to fit in all that Ben had developed for the web would have easily been 6 to 8 months of development. In reality, usage on the phone is going to be much different than on the web. Shocker, I know!!!!

The Rabble guys were great to work with in regards to these decisions. As such, I believe and stand by the features that are supported by the iOS app. Additional features will be coming shortly.

TestFlight and Crashes

For beta testing I opted for TestFlight for beta distributions and Flurry for reporting and crash reporting. This was both a blessing and a curse. By curse I mean all the cursing I did when iTunes Connect had its meltdowns, and a blessing when it worked. My experience was 50/50.

What’s Next

Discussions about what was going to be included in 1.0 also covered what would be in 1.0.1: secondary, but still important, features, as well as bug fixes. I’ve also laid out the roadmap for 1.0.2, 1.1 and 1.2.

Zen and the Art of iOS Development

This has been the BEST app I’ve ever worked on. The app is far from perfect and will continue to improve and evolve (like software is supposed to), but the experience couldn’t have been better. The work/life balance has been phenomenal and working 1:1 with Rabble couldn’t be better.

My good friend, Jason Jardim, tweeted an article called Looking back at where it all began, the basement startup. As I look back on building the Rabble iOS app, I completely understand what the author is describing. Though I actually work in my home office, I have lots of fresh air, open light, etc…it is pretty humble and I wouldn’t trade it for anything.

Mute.Mic.Rabble

Promo Video

Download Now

Sign up now