Handling Min Character Requirements in iOS with Swift

Bytes Not Length

While working on a little side project I was testing out some form validation on a text view. The business rule was that the user has to enter at least three characters before they can save the form. However, an emoji should pass as well, and it wasn’t, because I was only checking character length.

One ASCII character == 1 byte in UTF-8, but one emoji == 4 bytes.

I just had to change my validator to look for a UTF-8 byte count greater than or equal to 3. Not a sexy solution, but it reminded me to a) not make assumptions and b) really think through use cases when dealing with form inputs.
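
A minimal sketch of that check in current Swift, assuming the rule stays “at least three bytes”; the function name is mine, and the byte count comes from the string’s utf8 view:

import Foundation

// Hypothetical validator; three UTF-8 bytes is the business rule above.
func isValidEntry(_ text: String) -> Bool {
    // "abc" -> 3 UTF-8 bytes, passes
    // "🎉"  -> 4 UTF-8 bytes, passes even though it is a single Character
    // "ab"  -> 2 UTF-8 bytes, fails
    return text.utf8.count >= 3
}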

Check out the playground.

2.0.1 More Than Meets the Eye

2.0 was released back in mid-December. That release was a six-week effort to completely overhaul the UI. Today’s release focused more on UI tweaks, bug fixes and universal app linking.

This Feedback is BAD

Seemingly overnight, analytics showed that two-thirds of iOS users were on iPhone 6(s) phones. Two bug reports that came in on the same day made me panic: users who were trying to do mobile broadcasts on a 6 or 6s were getting horrible audio feedback from the phone. I’m not sure if this is hardware specific or an iOS update that introduced a regression.

In previous versions of the app, on lesser hardware, when a user wanted to start a broadcast I used the following configuration for the audio session:

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
[[AVAudioSession sharedInstance] setActive:YES error:nil];
[[AVAudioSession sharedInstance] setMode:AVAudioSessionModeDefault error:nil];

In order to fix the broadcasting feedback issue I had to override the audio output port like so:

    NSError *error;
    
    if (![[AVAudioSession sharedInstance] overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&error]) {
        RTVLog(@"AVAudioSession error overrideOutputAudioPort to Reciever:%@",error);
    }

Realtime Play Counts

Stats are not very useful in a “snapshot” context. This release added functionality that updates a broadcast’s play count in real time.

Up and Down

It seemed awkward for many users in the early days of iOS to “swipe” for actions, but over the past 3+ years Apple has been building these mechanisms into the OS and native apps (pull to refresh and the Mail controls). I added swipe gestures on the broadcast header for up / down voting. The gestures free up screen real estate and feel more natural.
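
The wiring for something like this is minimal. A sketch in Swift, assuming a header view and up/down vote selectors of your own (all names here are illustrative):

import UIKit

// Attach up / down swipe recognizers to a header view.
func addVoteGestures(to header: UIView, target: Any, upAction: Selector, downAction: Selector) {
    let up = UISwipeGestureRecognizer(target: target, action: upAction)
    up.direction = .up

    let down = UISwipeGestureRecognizer(target: target, action: downAction)
    down.direction = .down

    header.addGestureRecognizer(up)
    header.addGestureRecognizer(down)
}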

Custom Filter Control

I’m a huge advocate for using 3rd party libraries as little as possible in my apps, unless absolutely necessary. There are a tremendous number of truly amazing controls and libraries written by people smarter than myself, but I learn more by doing than copying. We wanted to move the “Live” and “Upcoming” filter from the drop down to above the lists and let the user toggle that way.

There are a million ways to skin that cat. I’m not the biggest fan of the UIButton class due to all the setup, but using an interactive UIView didn’t seem like the right approach either. I settled on creating my own custom UIControl, which was a fun little task that only took a few hours. Though it is simple to look at and interact with, which was the goal, IMHO the subtle animations and control states are what make it shine.
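
For flavor, here’s a minimal sketch of that kind of control in Swift; the class name, layout and animation are illustrative, not the shipping control:

import UIKit

// A two-option filter control (e.g. "Live" / "Upcoming") built on UIControl.
final class FilterControl: UIControl {

    private let labels: [UILabel]
    private(set) var selectedIndex = 0

    init(titles: [String]) {
        labels = titles.map { title in
            let label = UILabel()
            label.text = title
            label.textAlignment = .center
            return label
        }
        super.init(frame: .zero)
        for label in labels { addSubview(label) }
        updateSelection(animated: false)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Lay the titles out side by side.
        let width = bounds.width / CGFloat(labels.count)
        for (index, label) in labels.enumerated() {
            label.frame = CGRect(x: CGFloat(index) * width, y: 0, width: width, height: bounds.height)
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        guard let point = touches.first?.location(in: self), bounds.width > 0 else { return }

        let tapped = min(max(0, Int(point.x / (bounds.width / CGFloat(labels.count)))), labels.count - 1)
        guard tapped != selectedIndex else { return }

        selectedIndex = tapped
        updateSelection(animated: true)
        // Fire the standard target/action machinery, just like a stock control.
        sendActions(for: .valueChanged)
    }

    private func updateSelection(animated: Bool) {
        // The subtle state change: fade the unselected title.
        let updates = {
            for (index, label) in self.labels.enumerated() {
                label.alpha = (index == self.selectedIndex) ? 1.0 : 0.4
            }
        }
        animated ? UIView.animate(withDuration: 0.2, animations: updates) : updates()
    }
}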

There is always room for improvement. If you have any comments or questions about features or bugs please email me at cory {@} rabble (.) tv

Rabble 1.3 Released - It's a BFD

In the Beginning

The two main functions of RabbleTV are listening and broadcasting. Since I first met with RabbleTV to talk about what they wanted, one of the most talked about features has always been broadcasting from the app. We decided it was best to leave it out of the first few versions because of the huge development effort involved in this one feature.

Once version 1.2 was safe and sound in the App Store I dove into the deep end of live broadcasting from the app. Ben (CTO) and I spent a few days going through what was on the web and what needed to be rethought to provide the best experience on the phone.

A BFD

On the desktop version you can create a broadcast for Sports, TV / Movies or a RabbleCast. The first two have quite a bit of workflow that is doable on the phone, but would require an enormous effort on the client side to match the ease of use found on the web. We wireframed the solution, but decided for this release we would stick with RabbleCasts. With the workflow set out it was time to create the 1.3 feature branch and get to work.

Build, Buy or Borrow

AVPlayer provides a wonderful API for streaming audio and video. Unfortunately, there isn’t an equivalent for sending audio and video out; you really need to drop down to lower-level APIs. When I sat down to start writing the broadcasting functionality I had three choices:

  1. Write my own
  2. Use an open source solution
  3. Purchase an off-the-shelf solution

I’m a huge proponent of writing my own libraries (if possible)…especially when it centers around a core piece of functionality in the app. In this case, due to the feature timeline, it made sense to go with something stable that someone else had written. I just didn’t have the time to write the library. YET!

There aren’t many open source solutions, but the one I decided to use is MediaLib. After a few configuration changes I was up and running fairly quickly. I felt a little uneasy about using the library because most of the source code is compiled into static libraries, and since this was such an important piece of the app going forward I didn’t want to be caught in a situation where an iOS upgrade broke the library.

Since I still wasn’t in a place to write my own library, we decided it would be worth the money to purchase a license (compiled version) from RealTimeLibs. Being the cautious person that I am, I got in contact with their product support to ask specifically about their iOS 9 support and development, upgrades, licensing and some custom development that we needed to add. They were very quick in responding and gave me all the information that I needed, and a license was purchased.

I was assured that the custom development was only going to take 4 hours and would cost X amount extra. No problem. Weeks went by with no word on when the custom development was going to be done. This was very annoying, but still not a deal breaker since I had a stop gap in place. What was a deal breaker was the lack of background support.

When a user was broadcasting and put the app in the background, the stream was disconnected.

RTMP sends / receives audio and video. Even though I was doing audio-only broadcasts, there was still a video capture session initialized, and you can’t broadcast video with the app in the background. I contacted their support about this and was told…well, not what I wanted to hear, and I knew better.

It is a good thing I didn’t delete my old manager. It was easy to revert the project back to using my open source based solution.

Having been reaffirmed in my belief in being in control of my app’s libraries, I’ll be writing my own over the next few months as time allows.

Testing is Hard

Unlike Bill O’Reilly, I don’t just assume that something is going to work live. Any sort of live media publishing on a mobile device is going to get interrupted by FaceTime, phone calls, phone restarts, etc. A lot of time was spent testing various application states and system interruptions to make sure that listeners were notified, as well as giving the broadcaster the ability to restart a broadcast if necessary.

I even found a bug with iOS 9 regarding FaceTime.

Additional Features

In addition to broadcasting, these additional features and enhancements were added:

  • iOS 9 support
  • Spotlight search
  • Background fetch
  • Better list caching
  • New default broadcasting image
  • Comment links now have better touch handling

Don’t wait, download or update now!

Broken FaceTime Audio Interruptions in iOS 9

Constant Interruptions

I’ve been working on a new feature for RabbleTV for the past 3 months. Now that the functionality is pretty close to shipping, I’m going through all the various testing scenarios to make sure that if/when the user gets an audio interruption (phone call, FaceTime, etc.) I handle the app state appropriately.

Apple has pretty straightforward guidelines for how to handle these types of interruptions. When one occurs, the AVAudioSessionInterruptionNotification is sent out and the developer can inspect whether the AVAudioSessionInterruptionTypeKey is equal to AVAudioSessionInterruptionTypeBegan or AVAudioSessionInterruptionTypeEnded and handle their app state and UI appropriately.

HOWEVER, looking closer at the documentation there is one very important sentence:

There is no guarantee that a begin interruption will have an end interruption. Your app needs to be aware of switching to a foreground running state or the user pressing a play button. In either case, determine whether your app should reactivate its audio session.

Cause for Panic?

Receiving FaceTime calls on iOS 8 works “as expected”. The appropriate notifications are fired off by iOS and I’m able to pause and resume my use of the microphone. Development was moving right along until I started testing under iOS 9.

Going through the same scenario on iOS 9, the AVAudioSessionInterruptionTypeEnded type is never delivered. The AVAudioSessionInterruptionNotification is posted when the interruption begins and ends, but the only type that is ever set is AVAudioSessionInterruptionTypeBegan.

iOS 9 Log

2015-09-17 09:40:32.098 RabbleTV[6541:2258301] _66-[RTVBroadcastingManager addAudioInterruptionNotificationListener]block_invoke(271): notification for interruption NSConcreteNotification 0x14e8a6ea0 {name = AVAudioSessionInterruptionNotification; object = <AVAudioSession: 0x14e813220>; userInfo = { AVAudioSessionInterruptionTypeKey = 1;}}

iOS 8 Log

2015-09-17 09:34:35.405 RabbleTV[471:106341] _66-[RTVBroadcastingManager addAudioInterruptionNotificationListener]block_invoke(271): notification for interruption NSConcreteNotification 0x17f31ab0 {name = AVAudioSessionInterruptionNotification; object = <AVAudioSession: 0x17ee4860>; userInfo = { AVAudioSessionInterruptionTypeKey = 1;}}

2015-09-17 09:34:52.715 RabbleTV[471:106341] _66-[RTVBroadcastingManager addAudioInterruptionNotificationListener]block_invoke(271): notification for interruption NSConcreteNotification 0x17dae300 {name = AVAudioSessionInterruptionNotification; object = <AVAudioSession: 0x17ee4860>; userInfo = { AVAudioSessionInterruptionOptionKey = 1; AVAudioSessionInterruptionTypeKey = 0; }}

I have some business rules that add a bit more complexity to my workaround, but if your only concern is knowing when the interruption began and ended, you can set a property to toggle between the states.
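
A sketch of that workaround in current Swift, assuming a simple flag is enough: treat a second “began” (with no “ended” in between) as the missing end. The class and property names are mine:

import AVFoundation

final class InterruptionObserver {

    private var isInterrupted = false

    init() {
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(handleInterruption(_:)),
                                               name: AVAudioSession.interruptionNotification,
                                               object: nil)
    }

    @objc private func handleInterruption(_ notification: Notification) {
        guard let rawType = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }

        switch type {
        case .began where !isInterrupted:
            isInterrupted = true
            // Pause capture / playback and update the UI.
        default:
            // Either .ended, or a second .began standing in for the
            // .ended that never arrives.
            isInterrupted = false
            // Reactivate the audio session if appropriate.
        }
    }
}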

I’ve filed a radar with Apple.

Detecting When UIGravityBehavior is “Off Screen”

A Simple Implementation

Yesterday I was working on some UI enhancements for a new feature that is coming to RabbleTV. One piece of the UI involved UIDynamics…specifically UIGravityBehavior. This was going to be a pretty straightforward implementation considering I didn’t need to use the UIGravityBehavior with any other behavior types, as I had done in previous apps.

Assumptions Are Bad

During some testing I noticed that the CPU would spike during the animation, but never go back down after I assumed the animation was complete…in my case, once the “fall” passed the reference view’s bounds. I didn’t think too much of it at the time because I still needed to add in my completionHandler. I kicked the can down the road for a few hours until I could profile it, assuming it must be a coincidence since I’m also making network calls during this animation.

Upon the first run of my now completed UI animation, the completionHandler wasn’t called. I checked and double-checked my code and all the appropriate delegates and properties were set. The next part of my debugging strategy was to see when exactly the behavior was stopped. Perhaps I was trying to perform an action before everything had completed. This is where my assumption bit me.

I had assumed that UIGravityBehavior was completing, but in reality it wasn’t. I was able to verify this by logging the item’s velocity with linearVelocityForItem; it never stopped increasing, meaning the item was still falling.

The fall was infinite. After I stopped and thought about it, it made sense. If UIGravityBehavior is supposed to represent gravity on an object and space is infinite, why would it ever stop? I had never run into this before because in all my other experiences with UIDynamics I had used UIGravityBehavior in conjunction with other behaviors.

Choose Your Solution

As I saw it, I had two possible solutions to fix my issue.

First

Use UICollisionBehavior. There really isn’t much more to say there. You can use setTranslatesReferenceBoundsIntoBoundaryWithInsets to set up the area where you want the items to “stop”.
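
A sketch of that option in Swift, assuming the animator and items already exist; the function name is mine:

import UIKit

// Give falling items a floor by turning the reference view's bounds
// into a collision boundary.
func addFloor(to animator: UIDynamicAnimator, items: [UIDynamicItem]) {
    let collision = UICollisionBehavior(items: items)
    collision.setTranslatesReferenceBoundsIntoBoundary(with: .zero)
    animator.addBehavior(collision)
}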

Second

Add a UIDynamicItemBehavior whose action checks the Y coordinate of the items as they fall (specifically the last item). Once it is past the height of the reference view, remove the behaviors.

And the winner is…

I opted for the second approach because it gave me more control over when to stop the animation. Once I updated my animation controller, all of the delegate methods and completion handlers were properly called.

Code Snippet

// MARK: Public

- (void)animateItemsWithCompletionBlock:(RTVStandardCompletionBlock)block {

    if (block) {
        self.animationCompletionBlock = [block copy];
    }

    self.animator.delegate = self;

    NSArray *referenceItems = self.itemReferences.allObjects;
    
    /**
     * Gravity Behavior
     */

    UIGravityBehavior *gravityBehavior = [[UIGravityBehavior alloc] initWithItems:referenceItems];

    [self.animator addBehavior:gravityBehavior];
    
    /**
     * Dynamic Behavior
     *
     * @note
     * I'm adding the dynamic behavior so that I can tell when the last item
     * has fallen past the bottom of the screen. Once it has then I remove all
     * the behaviors. This will trigger the animator delegate method, which will
     * call the completionBlock.
     *
     * Without doing this the view continues to "fall" and eats up the CPU.
     * Another possible solution is to set up a collision barrier, which should
     * trigger it as well.
     */
    
    UIDynamicItemBehavior *dynamicItemBehavior = [[UIDynamicItemBehavior alloc] initWithItems:referenceItems];
    
    __weak typeof(self) weakSelf = self;

    dynamicItemBehavior.action = ^{

        /**
         * @note
         * You only need to wait for the last item to drop below the bottom of
         * the reference view, as opposed to iterating over all the items.
         */

        UIView *lastItem = referenceItems.lastObject;

        // Use weakSelf inside the block; referencing self here would create a
        // retain cycle (self -> animator -> behavior -> block -> self).
        if (CGRectGetMinY(lastItem.frame) > CGRectGetMaxY(weakSelf.animator.referenceView.bounds)) {
            [weakSelf.animator removeAllBehaviors];
        }
    };

    [self.animator addBehavior:dynamicItemBehavior];
    
    /**
     * Implicit Animation of Alpha
     */
    
    [referenceItems enumerateObjectsUsingBlock:^(UIView *item, NSUInteger idx, BOOL *stop){

        [UIView animateWithDuration:0.65
                              delay:0
                            options:kNilOptions
                         animations:^{
                             item.alpha = 0.0f;
                         }
                         completion:nil];
    }];
}

Lessons (Re)Learned

  1. Space is infinite
  2. Never assume anything

Rabble 1.2 Released

Today version 1.2 launched. Launching 1.1 was pretty hectic and I missed detailing some of the more technical changes from 1.0. Each iteration brings important lessons learned.

Version 1.1

Your version 1.0 is always the hardest, because you want to get to market as quickly as you can, but not at the expense of quality and user experience. Before 1.0 was started, 1.1 was already roadmapped, and I began basic development on it a few weeks before 1.0 was released.

Features, Fixes and Improvements

  • Added the ability to open a broadcast detail link from the web directly in the app. I wrote my own router classes for this, which I can reuse for more deep linking functionality, and simplified the code for the OAuth dance that must be performed for social media authentication.
  • Fixed bug where an archived broadcast would start playing if it was paused, the app was put into the background and then brought back to the foreground. This was caused by the state of the AVPlayer and KVO calls automatically calling play on the instance.
  • Fixed display issues with Rabblecasts.
  • Fixed crash that would occur when trying to load more “featured” broadcasts.
  • Fixed issue where sometimes broadcasts for PLU would be co-mingled.
  • Added loading indicator for play/pause. AVPlayer has to do some setup work before the HLS stream will start playing, and depending on the network connection this could take a bit of time, so I overlay a UIActivityIndicatorView.
  • Wrote my own custom label with link tap support so comments are much better. This was one of my favorite improvements. I ended up migrating most of the other labels to this new subclass.
  • Caching lists. I consider caching to be both a sword and a shield. On one hand it allows for a much better user experience and the conservation of resources; on the other, it introduces lots of potential bugs and, in a multithreaded environment, race conditions. Further down the roadmap I will be integrating Core Data, but as “step 1” I’m using NSCache. I created a fairly nice cache manager and protocol that sits on top of it so that I can swap out NSCache or bolt Core Data on top when the time comes (a sketch of the idea follows this list).
  • Reminders. This was the biggest feature of the release. When a user schedules a broadcast there isn’t a 100% guarantee that it will start, or start on time. This is not a fault of Rabble’s service, but rather that the broadcaster has failed to start it and/or is running late. On each upcoming broadcast detail screen you can set a reminder. If/when the broadcaster starts you’ll be notified via push notification. This is a big improvement for user engagement and retention.
  • Fixed issue where certain sports broadcasts were showing (null).
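
A rough sketch of that cache-manager idea in Swift, assuming a protocol that hides the storage; the names are illustrative, not the real classes:

import Foundation

protocol ListCache {
    func object(forKey key: String) -> AnyObject?
    func setObject(_ object: AnyObject, forKey key: String)
}

// “Step 1”: an NSCache-backed implementation.
final class MemoryListCache: ListCache {

    private let cache = NSCache<NSString, AnyObject>()

    func object(forKey key: String) -> AnyObject? {
        return cache.object(forKey: key as NSString)
    }

    func setObject(_ object: AnyObject, forKey key: String) {
        cache.setObject(object, forKey: key as NSString)
    }
}

// Callers depend only on the protocol, so a Core Data backed
// implementation can be bolted on later without touching call sites.
let listCache: ListCache = MemoryListCache()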

Version 1.2

The goal of 1.2 was more polish, speed enhancements and pointing the app’s UI/UX in the direction of some big features coming over the next two to three months.

  • Simplified the app navigation which will allow for adding in additional features (hint hint).
  • Custom transitions.
  • Fixed nasty keychain bug. Fortunately this affected only one user; the issue came about during some UAT. The bug occurred if you deleted the app without logging out first, reinstalled the app and tried to log in. The old keychain value wasn’t being deleted, so you would find yourself in an “authentication” loop.
  • Custom font.
  • Moved to swipe actions for comments and role actions. The comments’ cell looked pretty crowded and the touch areas for up and down voting weren’t optimal. I moved everything over to a swipe-to-reveal control similar to the native Mail application. This also allowed me to add in more role-based actions for deleting and muting comments.
  • Fixed overlay bug that occurred on the main broadcast list screens.
  • Tweaked caching.
  • Migrated my RabbleKit static library over to a framework.

Intro to iOS Class Teaching Experience

File -> New -> Teaching Class

It is no secret that there has been a growing, and necessary, push to bring more women into tech. I was asked a few months ago if I would be interested in teaching an intro to iOS class for women at Lamp Post Group. It would be a six-week course that met two hours every Wednesday. I jumped at the opportunity. I’ve given lectures and pitches before, but never taught a class, so I figured this would be an amazing opportunity to:

a) contribute something positive to the local female tech community

b) help myself grow in an area that I’m interested in

c) promote iOS development.

When I talked with the program director at LPG I learned of my most exciting challenge yet…the diverse backgrounds of my students. This included not only age, but also the level of prior experience with any kind of development. I couldn’t wait to get started.

Develop and Teach with a Purpose

The most memorable classes I had in college were always the ones where my professors had passion for the subjects they were teaching. It shouldn’t be a surprise to anyone that those were the classes I excelled in. I was average in the computer science and M.I.S. classes I took because of my teachers’ lack of enthusiasm.

One of my primary goals was to make sure that my students knew that I was passionate about iOS development and that they had my full attention. I was there for them.

The next step was to outline a curriculum that I could present to the class. This was tougher than I had anticipated. Most of the intro classes that I have seen or been in position themselves as “by the end of this class or workshop you will have built an app”. I personally think this does a disservice to the attendees. If you break down the various components it takes to actually create an app, the coding portion is but a fraction. What about the actual submission process, Xcode, design patterns, language features, etc.? These are all very important concepts that are too often skipped.

This would not be the case in my class. At the end of the class I wanted the women to have the necessary foundation to carry on with their future careers.

Pencil vs Pen

At the beginning of the first class I told everyone that our curriculum wasn’t going to be written in stone. Granted, I had a curriculum outlined, but it was important that we move at the pace of the class, so the topics covered could / would be adapted.

In retrospect this was the best decision I made.

Classroom Supplies

PowerPoint was always the bane of my existence in college and early in my career. Keynote was a much-welcomed alternative, but I probably use less than 1% of its power. I don’t use animated transitions or fancy slide layouts. I keep each slide to a topic or subtopic and engage my students. There is nothing worse than a presenter reading verbatim from their presentation.

Deckset was the perfect solution. Creating presentations using Markdown was right up my alley. I can’t endorse this product enough.

if let humbled

I’ve been writing software for the past 18 years and iOS apps since the initial SDK. Teaching this class has been without a doubt my greatest challenge and reward. Truth be told, I was nervous each passing week about whether people would return or whether I was scaring away future developers, but each week the classroom was full and the questions were more intriguing.

Over the six weeks I think I learned just as much as my students did.

During our last class I asked if anyone had any final thoughts or questions before we signed off. The question shared amongst the majority of the class was, “Where do we go if we want to learn more?”. A huge smile came over my face and I felt that I had been successful in teaching the class. I knew that I wasn’t going to be able to cover every topic that I wanted to, nor would everyone “get it” by the end, but my goal was to inspire them to keep learning.

Code of Wisdom

The advice I gave them on becoming successful developers…

Presentations

All of my decks and samples are on GitHub.

Mute.Mic.Rabble 1.0

Recipe for Success

I’ve never quite understood why television broadcasting, especially sports, has continued to decline in quality over the past 20 years. I speculate that the networks would rather spend money on “personalities” who actually distract from the event than enhance it. In addition, there are so many rules and regulations imposed by the FCC, networks and sponsors that I’ve always felt they water down the experience.

In other mediums, such as podcasts, forums, neighborhood bars or a friend’s basement, you have individuals, fans and super fans who provide much better commentary, stats and experiences than you would ever find on the networks. These individuals aren’t shackled by the doldrums plaguing corporate broadcasting and are free to really “call it like they see it”. Unfortunately, there hasn’t been a platform for these loquacious verbalizers to sound off on…one that allows not just one-way communication, but also the ability to interact with their listeners in real time.

Introducing Rabble.tv

Serendipity

I knew my contract with Life360 was ending last December, so around November I started to look around for the “next thing”. I got a call from Ben (CTO of Rabble and best friend) asking if I would come to Nashville and chat with the other two founders about what it would take, and whether it was even possible, to build their iOS app. I don’t need much of an excuse to go hang out with Ben and other cool people in Nashville. After spending about half a day hearing what they needed and when they needed it, I headed back to Chattanooga with the green light to build the first RabbleTV app.

File -> New Workspace/Project/Design/EcoSystem, etc.

~35 days to develop…that’s what I had.

When I sat down to start cranking on the app, it really began with a todo list of tasks that needed to be completed before I could actually start writing code. This included setting up the developer account, certs, GitHub repos, workflows for testing and general design/wireframe elements. Since there was no time to waste I jumped right on my list and started knocking things out.

Because fate loves to provide us with challenges, a few were thrown my way. These came in the form of getting the flu for almost 5 days, as did my wife and kids. Nothing brings development to a halt like a house full of sick people. Despite this setback I was able to make up the time. Ironically, what caught me up was going on vacation with my wife to Tulum. We had our trip planned for almost a year, and right before we left I was definitely feeling the pressure. I had had many late nights, so completely disconnecting for 4 days was exactly what I needed. When I got back the saw was sharpened and was cutting faster and better than ever.

Development

Writing Swiftly with Objective-C

The entire app was written in Objective-C. Yep, that’s right…not the first line of Swift. My reasoning: the very limited development time, and right now I’m much faster with Objective-C. Future releases will incorporate Swift. I did set up the project to support Swift from the beginning, knowing that I wanted to use it in the near future.

Hard Feature Decisions

The features on the web far outnumber the features in version 1 of the app. This was both a necessity and a choice. The end goal of my development time was to go from version none to version one. Trying to fit in all that Ben had developed for the web would easily have been 6 to 8 months of development. In reality, usage on the phone is going to be much different than on the web. Shocker, I know!

The Rabble guys were great to work with regarding these decisions. As such, I believe in and stand by the features that are supported by the iOS app. Additional features will be coming shortly.

TestFlight and Crashes

For beta testing I opted for TestFlight for beta distributions and Flurry for reporting and crash reporting. This was both a blessing and a curse. By curse I mean all the cursing I did when iTunes Connect had its meltdowns, and a blessing when it worked. My experience was 50/50.

What’s Next

Discussions about what was going to be included in 1.0 also covered what was going to be in 1.0.1: secondary, but still important, features as well as bug fixes. I’ve also laid out the roadmap for 1.0.2, 1.1 and 1.2.

Zen and the Art of iOS Development

This has been the BEST app I’ve ever worked on. The app is far from perfect and will continue to improve and evolve (like software is supposed to), but the experience couldn’t have been better. The work/life balance has been phenomenal and working 1:1 with Rabble couldn’t be better.

My good friend, Jason Jardim, tweeted an article called Looking back at where it all began, the basement startup. As I look back on building the Rabble iOS app, I completely understand what the author is describing. Though I actually work in my home office, with lots of fresh air, open light, etc…it is pretty humble and I wouldn’t trade it for anything.

Mute.Mic.Rabble

Promo Video

Download Now

Sign up now

Objective-C Runtime with Swift

One thing that I really miss when using Swift is Objective-C’s runtime. I don’t go crazy with it, but its power, when used wisely, is highly effective.

I had an idea a few weeks ago for a fun little app and thought this would be a great time to build an app from the ground up using Swift. I have a lot of convenience classes and logic written in Objective-C that needed to be ported over. One of the first, and most important, categories is on UIView: I extend the class to include the ability to add a tap gesture.

Over the years I’ve moved over to use Oliver Drobnik’s technique. It is cleaner than my original approach.

When Swift was announced, Apple assured developers that you could mix and match Objective-C and Swift, and for most things you can. One of the skepticisms I originally had concerned the runtime. Mattt Thompson posted a great piece on NSHipster that gives some great examples of how to do it. I used that article as the main reference when starting to port my category over. When I went to compile the project I got the following error:

error: type ‘() -> Void’ does not conform to protocol ‘AnyObject’

var tapAction: (() -> Void)? {

    get {
        return objc_getAssociatedObject(self, &AssociatedKeys.SNGLSActionHandlerTapBlockKey) as? () -> Void
    }

    set {

        objc_setAssociatedObject(
            self,
            &AssociatedKeys.SNGLSActionHandlerTapBlockKey,
            newValue,
            UInt(OBJC_ASSOCIATION_COPY_NONATOMIC)
        )
    }
}

I wrestled with this for quite some time. I understood what it was saying, but didn’t have an elegant solution for it, or hell, even an inelegant one. Part of my problem was that I was thinking in terms of Objective-C and not Swift. After posting to StackOverflow, Greg Heo pointed me in the right direction. How do you pass a Void type into something that must conform to AnyObject? You have to wrap it in a class.

Well that is simple enough:

class ClosureWrapper {
  var closure: (() -> Void)?

  init(closure: (() -> Void)?) {
    self.closure = closure
  }
}

var tapAction: (() -> Void)? {
  get {
    if let cl = objc_getAssociatedObject(self, "key") as? ClosureWrapper {
      return cl.closure
    }
    return nil
  }

  set {
    objc_setAssociatedObject(
      self,
      "key",
      ClosureWrapper(closure: newValue),
      UInt(OBJC_ASSOCIATION_COPY_NONATOMIC)
    )
  }
}

Not so fast. That is the right concept, but there is some additional work that you have to do. The ClosureWrapper needs to inherit from NSObject and conform to NSCopying.

Now I’m able to add a tap gesture to any UIView subclass.
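
For completeness, here’s a sketch of the finished extension in current Swift syntax; the key, wrapper initializer and gesture plumbing are mine, not the original category’s API:

import UIKit
import ObjectiveC

final class ClosureWrapper: NSObject, NSCopying {

    let closure: (() -> Void)?

    init(_ closure: (() -> Void)?) {
        self.closure = closure
        super.init()
    }

    func copy(with zone: NSZone? = nil) -> Any {
        return ClosureWrapper(closure)
    }
}

private var tapActionKey: UInt8 = 0

extension UIView {

    var tapAction: (() -> Void)? {
        get {
            return (objc_getAssociatedObject(self, &tapActionKey) as? ClosureWrapper)?.closure
        }
        set {
            // COPY_NONATOMIC invokes NSCopying, hence the conformance above.
            objc_setAssociatedObject(self,
                                     &tapActionKey,
                                     ClosureWrapper(newValue),
                                     .OBJC_ASSOCIATION_COPY_NONATOMIC)
        }
    }

    func addTapAction(_ action: @escaping () -> Void) {
        tapAction = action
        addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap)))
    }

    @objc private func handleTap() {
        tapAction?()
    }
}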

Personal Project Preferences for iOS

It’s Personal

How developers set up their projects is a very personal subject and can provoke extreme emotion if ever criticized. While I’m pretty set in my ways as to how I like to set up my iOS projects, I know that I’ve arrived at my preferred setup because of others, so I would like to share mine.

Workspaces and Projects

When starting a new iOS project I always create a new workspace. Most of my projects are not just a single project, and workspaces provide the best relationship management, whether it’s with a static library, framework or related project.

Filesystem

Since I edit classes in Xcode and do not access them via the file system, I don’t create any additional folders or hierarchy inside the project(s). It doesn’t make sense to me. Any class organization I handle with groups.

With the addition of asset folders in Xcode, image assets are now kept within the project as well. The file system project folders are split between the various “source” projects, configurations, external libraries and misc dependencies.

Configurations

It is a huge pet peeve when I see code like this:

NSString *serverURL = nil;
#if DEBUG
serverURL = @"https://path.to.dev/";
#else
serverURL = @"https://path.to.prod/";
#endif

These types of environment key/values shouldn’t be determined by preprocessor macros inside of your app. Even logging can/should be handled by a dedicated class. Since my projects usually have three build configurations, Debug, Adhoc and Release, I’ll create three plists: ConfigDevelopment, ConfigAdhoc and ConfigRelease.

You can create as many or as few as you want depending on your setup.

I don’t need all three of these plists in the app’s bundle, only the one that corresponds to my current build configuration. This is handled by a build script:

#!/bin/sh

PATH_TO_CONFIGURATIONS="$SRCROOT/../../Configurations"
ConfigDevelopment="$PATH_TO_CONFIGURATIONS/ConfigDevelopment.plist"
ConfigRelease="$PATH_TO_CONFIGURATIONS/ConfigRelease.plist"
ConfigAdhoc="$PATH_TO_CONFIGURATIONS/ConfigAdhoc.plist"

echo "Checking for file $ConfigDevelopment, $ConfigRelease, $ConfigAdhoc..."

if [ $CONFIGURATION == "Debug" ]; then
echo "Processing $CONFIGURATION"
if [ -f "$ConfigDevelopment" ]; then
cp "$ConfigDevelopment" "$PATH_TO_CONFIGURATIONS/Config.plist"
else
echo "$ConfigDevelopment not found"
exit 1
fi
fi

if [ $CONFIGURATION == "Adhoc" ]; then
echo "Processing $CONFIGURATION"
if [ -f "$ConfigAdhoc" ]; then
cp "$ConfigAdhoc" "$PATH_TO_CONFIGURATIONS/Config.plist"
else
echo "$ConfigAdhoc not found"
exit 1
fi
fi

if [ $CONFIGURATION == "Release" ]; then
echo "Processing $CONFIGURATION"
if [ -f "$ConfigRelease" ]; then
cp "$ConfigRelease" "$PATH_TO_CONFIGURATIONS/Config.plist"
else
echo "$ConfigRelease not found"
exit 1
fi
fi

if [ -f "$PATH_TO_CONFIGURATIONS/Config.plist" ]; then
echo "Processing $CONFIGURATION"
echo "Found Config.plist, and copying to $TARGET_BUILD_DIR/$EXECUTABLE_FOLDER_PATH"
cp "$PATH_TO_CONFIGURATIONS/Config.plist" "$TARGET_BUILD_DIR/$EXECUTABLE_FOLDER_PATH/"
else
echo "Did not find Config.plist"
exit 1
fi

In my projects I create a simple NSObject subclass, AppConfig, which loads the Config.plist from the bundle and sets its values from the plist’s dictionary. Though I haven’t done it yet, you could “automate” this process further by using some Objective-C runtime to dynamically create and set the properties.

With this setup, whatever setting, URL, path, etc. I need, I can grab by calling [AppConfig defaultConfig] without having to maintain or deal with unnecessary runtime settings.
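
A Swift sketch of the idea (my original is Objective-C), assuming a Config.plist containing a ServerURL key; the key and property are illustrative:

import Foundation

final class AppConfig {

    static let defaultConfig = AppConfig()

    private let values: [String: Any]

    private init() {
        // The build phase script above guarantees exactly one Config.plist
        // ends up in the bundle.
        guard let url = Bundle.main.url(forResource: "Config", withExtension: "plist"),
              let data = try? Data(contentsOf: url),
              let plist = try? PropertyListSerialization.propertyList(from: data, options: [], format: nil),
              let dictionary = plist as? [String: Any] else {
            fatalError("Config.plist missing from the bundle; check the build phase script")
        }
        values = dictionary
    }

    var serverURL: URL {
        guard let string = values["ServerURL"] as? String, let url = URL(string: string) else {
            fatalError("ServerURL missing or malformed in Config.plist")
        }
        return url
    }
}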

Subclassing

I spend a lot of time and energy evaluating DRY in my code. This doesn’t only apply to abstraction, but to inheritance. View controllers and views (standard UIView subclasses, UILabels, UIButtons, etc.) share some basic default attributes that I only want to set once and, by design, modify once to apply to all. You can accomplish a lot on the UI side of things by using UIAppearance, which I do utilize, but for anything outside of what the protocol supports I have a base class.

For example, this is what my base UIViewController would look like.

- (instancetype)init {
    
    self = [super init];
    
    if (self) {

        self.edgesForExtendedLayout           = UIRectEdgeNone;
        self.extendedLayoutIncludesOpaqueBars = YES;
    }
    
    return self;
}

// MARK: View Life Cycle

- (void)loadView {
    
    self.view = [[UIView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    self.view.backgroundColor = [UIColor whiteColor];
}

There isn’t much to look at, but in a fairly large application having to type/copy/paste that becomes tedious and unnecessary. Now I just subclass this base controller.

For UIView and UIControl subclasses, I almost exclusively use AutoLayout and I don’t use Storyboards or XIBs. I create a base AutoLayout view and button class and set the AutoLayout “flag” (translatesAutoresizingMaskIntoConstraints = NO). Labels get some extra love with an additional property so that I can handle multiline text and edge insets.

// MARK: Initializers

- (instancetype)initWithFrame:(CGRect)frame {
    
    self = [super initWithFrame:frame];
    
    if (self) {
        
        _edgeInsets = UIEdgeInsetsZero;
        
        self.translatesAutoresizingMaskIntoConstraints = NO;
        self.textColor       = [UIColor whiteColor];
        self.backgroundColor = [UIColor clearColor];
        self.font            = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
    }
    
    return self;
}

- (instancetype)init {
    return [self initWithFrame:CGRectZero];
}

/**
 * @discussion
 * In order for autolayout (or sizeThatFits) to correctly calculate the size of
 * the label when using edgeInsets, we must adjust the rect accordingly.
 *
 * If we don't, multiline text can be cut off. This happens when calculating
 * dynamic height cells in UICollectionViews.
 */

- (CGRect)textRectForBounds:(CGRect)bounds limitedToNumberOfLines:(NSInteger)numberOfLines {

    UIEdgeInsets insets = self.edgeInsets;
    CGRect rect = [super textRectForBounds:UIEdgeInsetsInsetRect(bounds, insets)
                    limitedToNumberOfLines:numberOfLines];
    
    rect.origin.x    -= insets.left;
    rect.origin.y    -= insets.top;
    rect.size.width  += (insets.left + insets.right);
    rect.size.height += (insets.top + insets.bottom);
    
    return rect;
}

- (void)drawTextInRect:(CGRect)rect
{
    [super drawTextInRect:UIEdgeInsetsInsetRect(rect, self.edgeInsets)];
}

MVCUCe (Models, Views, Controllers, Utilities, Categories, etc.)

I like to take the abstraction of responsibility a bit further by using my own foundation framework project, added to the workspace, that holds all the models, UIView parent classes, utilities and categories. The app project itself is mainly view controllers, static assets and config files. This type of setup wasn’t as necessary early on in iOS development, but as devices and functionality (Apple Watch, separate iPad and/or Mac apps, and extensions) are added it has been extremely beneficial to keep these separate.

Daniel Kennett’s article talks about the benefits of taking a similar approach.

Basically, if it doesn’t involve UI or application state, it goes in a separate framework. My iPod-scanning app Music Rescue contained a complete framework for reading an iPod’s database. My pet record-keeping app Clarus contained one to work with the application’s document format.

Dependencies

This is probably my most debatable preference: I do not like CocoaPods. I prefer to use git submodules. I admire and respect the work that has gone into CocoaPods and there are certainly benefits to using it, but it has caused more headaches and is one more external dependency that I have to work with: Ruby (I know it is installed by default), messing with Podfiles, and the Xcode project modifications, to name a few. Git submodules aren’t without their issues, but you have complete visibility into what is going on. I don’t feel that way about CocoaPods. I’ve used it on a few projects at different points over the past year and it just annoys me.

Networking

My network stack is a homegrown solution, which consists of a Client, a Request and a Session class.


+------------------------------+     +-----------+
|            Client            | --> | paginator |
+------------------------------+     +-----------+
+------------------------------+
|           Request            |
+------------------------------+
+------------------------------+
|           Session            |
+------------------------------+

The only class accessed directly by the app is the Client instance, which calls down to the Request class (handling the GET, PUT, POST and DELETE verbs), which in turn calls down to the Session (a wrapper around NSURLSession).

The client has a property called paginator which allows the Client instance to call nextPage or previousPage for request pagination.

@property (nonatomic, weak) RTVPaginator *paginator;
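
A rough sketch of that layering in Swift; the type and method names are illustrative, not the real classes:

import Foundation

// Lowest layer: a thin wrapper around URLSession.
final class Session {

    private let urlSession = URLSession(configuration: .default)

    func send(_ request: URLRequest, completion: @escaping (Data?, Error?) -> Void) {
        urlSession.dataTask(with: request) { data, _, error in
            completion(data, error)
        }.resume()
    }
}

// Middle layer: owns the HTTP verbs.
final class Request {

    private let session = Session()

    func get(_ url: URL, completion: @escaping (Data?, Error?) -> Void) {
        var request = URLRequest(url: url)
        request.httpMethod = "GET"
        session.send(request, completion: completion)
    }
    // put / post / delete follow the same shape.
}

final class Paginator {
    var nextPageURL: URL?
    var previousPageURL: URL?
}

// Top layer: the only type the app talks to.
final class Client {

    // Weak, mirroring the Objective-C property above.
    weak var paginator: Paginator?

    private let request = Request()

    func fetch(_ url: URL, completion: @escaping (Data?, Error?) -> Void) {
        request.get(url, completion: completion)
    }
}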

XIBs, Storyboards and Views

Unless it is a requirement, I DO NOT use XIBs or Storyboards. Over the course of my iOS career, when I did use them, they became yet another asset to manage, and I found myself, more times than not, doing all of my customizations programmatically and never really touching the XIBs after their initial creation, so I just quit using them altogether. With the addition of AutoLayout in iOS 6, all of the initial tutorials and examples used XIBs, so I gave them another try. To me it was MORE frustrating dealing with AutoLayout in IB, so I went back to doing all of the setup programmatically and haven’t had any issues.
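
For anyone curious what “all programmatic” looks like, a minimal Swift sketch (the view and label names are illustrative):

import UIKit

final class HeaderView: UIView {

    private let titleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)

        titleLabel.translatesAutoresizingMaskIntoConstraints = false
        addSubview(titleLabel)

        // All constraints in code; no XIBs or Storyboards involved.
        NSLayoutConstraint.activate([
            titleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            titleLabel.trailingAnchor.constraint(equalTo: trailingAnchor, constant: -16),
            titleLabel.centerYAnchor.constraint(equalTo: centerYAnchor)
        ])
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}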

Xcode Build Settings

Before I add/modify anything in a project I’ll create target-specific xcconfig files: one shared, plus specific ones for the various build settings. Tweaking specific settings becomes much more manageable that way, and change sets are visible via diff tools. Viewing the project file in any diff viewer is a horrible experience.

The Way

The Right Way, The Wrong Way and Your Way

There is really no wrong way to set up or maintain your project. If you are developing and shipping apps then your setup is correct. Over the past 7 years certain trends have emerged as common expectations, but I have yet to see official recommendations or guidelines from Apple, so I’ll keep doing it my way, and you should keep doing it yours.

I would love to hear more of how you manage your project setup.