Fast String Concatenation

Besides iterating over collections, string concatenation is probably the most common task developers perform, no matter the language. Unfortunately, it is often done inefficiently, especially with large strings. I myself am guilty of taking for granted what goes on under the hood when performing this operation. After reading this article I found myself wondering about the mutability of strings in Ruby. Ruby strings are natively mutable, so basic concatenation doesn't carry much of a performance hit. Python, by contrast, treats strings as immutable, so "+=" allocates a brand-new string and copies both operands into it. Not efficient at all. For small concatenations the standard "+=" or "+" is still the shortest distance between two points. When building a string of any significant size, however, the most efficient approach across both Python and Ruby seems to be appending each piece to an array and then joining.

Examples:
Ruby: ** Thanks to Travis Dunn for the code.
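The embedded snippets didn't survive the move, so here is a minimal sketch of the approach (iteration counts and strings are illustrative):

  # Naive: += builds a brand-new string on every pass.
  result = ""
  10_000.times { |i| result += "line #{i} " }

  # Better for large strings: collect the pieces, then join once.
  parts = []
  10_000.times { |i| parts << "line #{i} " }
  result = parts.join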

Python
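Again a sketch of the same comparison:

  # Naive: each += copies the entire accumulated string (O(n^2) overall).
  result = ""
  for i in range(10000):
      result += "line %d " % i

  # Better: accumulate the pieces in a list and join once (O(n)).
  parts = []
  for i in range(10000):
      parts.append("line %d " % i)
  result = "".join(parts)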

Due to the fact that I am doing more and more large-scale projects, performance is more important than ever, and paying close attention to these little details can yield performance boosts very quickly.

UITabBarController Subview Gotcha

By far the most viewed blog article that I have, and subsequent Github project, is my Universal iOS App Template.  Unfortunately, it has been a tad bit neglected. It had some memory leaks, still supported iOS 3.2 and wasn't iOS5 compatible.  Now that I am working from home I have dedicated part of my day to personal projects, and the first order of business was to get the app template up to date.  Most of the work was just cleanup and minor tweaks.  One of the branches, however, required a little more work: the template variant that comes with TabBarController support. A few months ago I added the feature of having a "Tweetie" style indicator slider.  In iOS 3.2 through 4.x everything worked fine.  In iOS5 the indicator was initially centered not in relation to the first TabBarItem, but to the entire TabBar.  When you selected another TabBarItem the indicator would move, but not to the correct position either.

The number of subviews is different between iOS4 and iOS5.

--- iOS5
2011-11-17 09:14:54.290 UniversalExample[8989:f803] view frame: {{0, 0}, {768, 1024}}
2011-11-17 09:14:54.293 UniversalExample[8989:f803] view frame: {{181, 1}, {76, 48}}
2011-11-17 09:14:54.294 UniversalExample[8989:f803] view frame: {{291, 1}, {76, 48}}
2011-11-17 09:14:54.296 UniversalExample[8989:f803] view frame: {{401, 1}, {76, 48}}
2011-11-17 09:14:54.297 UniversalExample[8989:f803] view frame: {{511, 1}, {76, 48}}

--- iOS4
2011-11-17 09:16:06.751 UniversalExample[9033:b303] view frame: {{181, 1}, {76, 48}}
2011-11-17 09:16:06.754 UniversalExample[9033:b303] view frame: {{291, 1}, {76, 48}}
2011-11-17 09:16:06.754 UniversalExample[9033:b303] view frame: {{401, 1}, {76, 48}}
2011-11-17 09:16:06.755 UniversalExample[9033:b303] view frame: {{511, 1}, {76, 48}}

This confirmed that there was indeed a difference in how the center position was being calculated. In iOS5 the first subview spans the entire frame of the view.  However, I didn't know which view (sub or super) it might be referring to.

The best UIView debugging tool for situations like this is the recursiveDescription method.  If you haven't heard of or used it before, I suggest/encourage you read Oliver Drobnik's blog post on it. In summary, it is a private method that iterates over a view's hierarchy.
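Because it is private API you have to invoke it indirectly; a sketch, for debug builds only:

  // recursiveDescription is private API; any UIView responds to it.
  // Call it via performSelector: and keep it out of release builds.
  #ifdef DEBUG
  NSLog(@"%@", [self.view performSelector:@selector(recursiveDescription)]);
  #endif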

After calling this method on the subviews in both iOS versions I saw the culprit. The first subview of a UITabBarController in iOS5 is a UITabBarBackgroundView, which obviously is going to extend the entire length of the tab bar.

iOS4
[recursiveDescription output; image not preserved]

iOS5
[recursiveDescription output; image not preserved]

Why did Apple add this subview, you might ask? They added it because of welcome changes to UIKit.  Since iOS 2.0 developers have needed the ability to customize many of the UI elements, but unfortunately there wasn't a straightforward way to accomplish this. The categories and hacks that people came up with are pretty cool, but not maintainable. With iOS5, and its updates to UIKit, you can customize background images, tint color, selected images, etc. to your heart's desire.
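For example, the new UIAppearance proxy replaces all of those hacks (a sketch; the image names are placeholders):

  // Style every tab bar in the app from one place (new in iOS5).
  [[UITabBar appearance] setBackgroundImage:[UIImage imageNamed:@"tabbar-bg.png"]];
  [[UITabBar appearance] setSelectionIndicatorImage:[UIImage imageNamed:@"tabbar-indicator.png"]];
  [[UITabBar appearance] setTintColor:[UIColor darkGrayColor]];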


In order to fix the iOS5 bug in my template I just had to add a quick runtime check on the OS version and increment the index by 1.
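Something along these lines (a sketch from inside a UITabBarController subclass; selectedIndex stands in for however you track the current tab):

  // iOS5 inserts UITabBarBackgroundView at index 0, so shift by one.
  NSUInteger subviewIndex = selectedIndex;
  NSString *version = [[UIDevice currentDevice] systemVersion];
  if ([version compare:@"5.0" options:NSNumericSearch] != NSOrderedAscending) {
      subviewIndex += 1;
  }
  UIView *tabItemView = [self.tabBar.subviews objectAtIndex:subviewIndex];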

All is right with the world.

UIWebViews Are Horrible

After spending the last few days trying to improve performance with a local HTML document, I am convinced that UIWebView is a horrible component that should be limited to showing remote web pages only.  What brought on this rant, and my proposed solution, was the half-second flicker between tapping a TabBarItem in a TabBarController and the rendering of the local HTML file itself. If a user navigates away from the TabBarItem and then back again it loads instantly, since the contents are now cached in memory.  However, I just couldn't get past that initial flicker.

There are three different methods of loading an HTML document into a UIWebView:
  1. - (void)loadRequest:(NSURLRequest *)request
  2. - (void)loadData:(NSData *)data MIMEType:(NSString *)MIMEType textEncodingName:(NSString *)encodingName baseURL:(NSURL *)baseURL
  3. - (void)loadHTMLString:(NSString *)string baseURL:(NSURL *)baseURL
Source: http://cwil.es/uJznzA

The first thing I tried was timing each method to see which was faster.
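The harness looked roughly like this (a sketch; "index.html" is a placeholder, and this only times the call itself, not the render):

  NSString *path = [[NSBundle mainBundle] pathForResource:@"index" ofType:@"html"];
  NSString *html = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:NULL];

  CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
  [self.webView loadHTMLString:html baseURL:[[NSBundle mainBundle] bundleURL]];
  NSLog(@"loadHTMLString: %f seconds", CFAbsoluteTimeGetCurrent() - start);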



The difference is negligible.  The second route I attempted was to use Grand Central Dispatch (GCD).  Using GCD to manage process-intensive logic that would normally block the main thread usually makes EVERYTHING so much better.
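The GCD variant looked roughly like this (a sketch; note that UIWebView itself must only be touched on the main thread):

  // Read the file off the main thread, then hand the string back to the
  // web view on the main queue.
  dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
      NSString *path = [[NSBundle mainBundle] pathForResource:@"index" ofType:@"html"];
      NSString *html = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:NULL];
      dispatch_async(dispatch_get_main_queue(), ^{
          [self.webView loadHTMLString:html baseURL:[[NSBundle mainBundle] bundleURL]];
      });
  });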

Unfortunately, that was not the case with loading the local HTML file.  In fact, the speeds were essentially the same as with the single-threaded approach.  I assume this is because the slight overhead of managing the threads cancels out any gain. At the end of the day I wasn't gaining any performance.

Due to this frustration I am going to look into using @Cocoanetics' NSAttributedString-Additions-for-HTML and save UIWebViews for remote display only.

Test Results (iOS5 Simulator. Ran code 4 separate times in - (void)viewDidLoad)

Main Thread Only

loadRequest
  1. 0.024776 seconds
  2. 0.025888 seconds
  3. 0.023575 seconds
  4. 0.026948 seconds
loadHTMLString
  1. 0.020844 seconds
  2. 0.063485 seconds
  3. 0.026386 seconds
  4. 0.057899 seconds
loadHTMLData
  1. 0.027755 seconds
  2. 0.036826 seconds
  3. 0.026125 seconds
  4. 0.025203 seconds
Grand Central Dispatch

loadRequest
  1. 0.029463 seconds
  2. 0.025321 seconds
  3. 0.026808 seconds
  4. 0.033857 seconds
loadHTMLString
  1. 0.025549 seconds
  2. 0.030053 seconds
  3. 0.027117 seconds
  4. 0.022959 seconds
loadHTMLData
  1. 0.035207 seconds
  2. 0.034666 seconds
  3. 0.029324 seconds
  4. 0.023069 seconds

Dynamic Model Update and Using Google App Engine

A few weeks ago I presented at the Mobile Technology for Teaching & Learning conference.  My talk was about how to leverage service-oriented architecture and Objective-C runtime functionality in your enterprise applications.  In my demo app I used two JSON files as my datasource so that I wouldn't have to rely on an internet connection.

*I was glad that I did because I had lots of problems with it.

After the talk was over I pushed all the code to Github with the goal of modifying the project to use an actual web service, to show a real-world example of how it works. In order to do that I first needed to create the web service.  Up to this point in my development career I have been using PHP for the service layer with CouchDB as the datastore.  With this particular project I wanted to take things in a different direction.  A few years ago I started tinkering with Python and have really grown to love the language. Unfortunately, I hadn't been in a situation where I could use it in any project beyond a few helper scripts (mainly because I was a novice with tight time constraints). I wasn't bound by those restrictions for this project.  What I wasn't looking forward to was the server-side "infrastructure" setup to get things going.  I wanted to focus on the development.

Enter Google App Engine to the rescue!  Setting up an account and application with GAE is out of the scope of this post, but Google makes it very easy to do…especially if you are focusing on Python.  I knew that scaling wouldn't be an issue, nor would data integrity. Google is somewhat good at both of those.

I was able to hack out a basic web service within a day.
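The whole service boils down to a few handlers along these lines (a sketch against the 2011-era webapp framework; the Employee model and /employees route are illustrative, not the project's exact code):

  from google.appengine.ext import webapp, db
  from google.appengine.ext.webapp.util import run_wsgi_app
  from django.utils import simplejson as json  # bundled with the 2.5 runtime

  class Employee(db.Model):
      name = db.StringProperty()
      title = db.StringProperty()

  class EmployeesHandler(webapp.RequestHandler):
      def get(self):
          rows = Employee.all().fetch(100)
          payload = [{"name": r.name, "title": r.title} for r in rows]
          self.response.headers["Content-Type"] = "application/json"
          self.response.out.write(json.dumps(payload))

  application = webapp.WSGIApplication([("/employees", EmployeesHandler)])

  def main():
      run_wsgi_app(application)

  if __name__ == "__main__":
      main()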

My biggest roadblock was serializing class instances into JSON.  The awesome community at stackoverflow.com had it covered.
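The fix is the usual custom-encoder pattern (a sketch of the idea, not the exact answer I used; instances is a placeholder):

  import json

  class ModelEncoder(json.JSONEncoder):
      def default(self, obj):
          # Fall back to the instance's attribute dictionary.
          if hasattr(obj, "__dict__"):
              return obj.__dict__
          return json.JSONEncoder.default(self, obj)

  json.dumps(instances, cls=ModelEncoder)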

The demo app is now modified to use these endpoints for data via GCD.  I also added a category on NSDictionary that simplifies the JSON parsing logic, thus removing a lot of the boilerplate code I had in there originally.
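The category amounts to a handful of type-safe accessors like this (a sketch; the names are illustrative):

  @interface NSDictionary (ModelAdditions)
  - (NSString *)stringForKey:(NSString *)key;
  @end

  @implementation NSDictionary (ModelAdditions)
  - (NSString *)stringForKey:(NSString *)key {
      id value = [self objectForKey:key];
      // Guards against NSNull and non-string values from the parser.
      return [value isKindOfClass:[NSString class]] ? value : nil;
  }
  @end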

I will be adding more enhancements and abstracting the code over the next few weeks.  As always, if you have any comments or suggestions please don't hesitate to contact me.

The Sign Language of the Times

Apple has been leading the way for mobile devices since the first iPhone was announced.  Accept it! However, this doesn't mean that other competitors can't be innovative in this area.  Unfortunately, up to this point Google, Microsoft, Blackberry, etc. haven't been thinking outside of the box. In essence they are just copying what is on iOS devices, and doing a poor job of it.

Over the past few years, and most recently with iOS5, there has been a lot of discussion about cloud technologies and having seamless access to one's content. In addition, there is always talk about form factor, physical storage capacity, the number of megapixels this camera has versus that one, and the potential of voice interaction with these devices.

All of these features are great, but aside from maybe voice interaction, they are all nice-to-haves. They are a convenience.  What is missing is the answer to: "Where is the next evolution in device interaction?"  Even voice at this juncture is in its infancy.  Siri is great and I use it all the time, but there are still too many dependencies that any company has to deal with: the processing has to be done in the cloud, and there are all the different languages, accents, dialects, connotations, etc.

If I were Apple, Google, Microsoft, Blackberry, Nokia, etc. I would be focusing on multi-touch gestures.  After only three short years, engineers have started to rest on their laurels instead of pushing this technology and its possibilities forward. The first place I would start my research and work is by becoming an expert in sign language. I would hire a team of the best historians and contemporary experts in the world on the subject and have them work side-by-side with my UX and engineering teams.

When you look at sign language as a communication tool, it holds such beauty and elegance.  It is the calligraphy of gesture communication.  The potential here, as an example, is to develop a new gesture "language", much like our spoken and computer languages but with consistency. The power of this would be unlimited, and it would be easy to use and interpret.  Current multi-touch usage is always a singular expression, but in sign language a single gesture has the ability to express an emotion or an entire concept.  It is multi-dimensional. Unlike the computational and processing dependencies listed above that come with voice interaction, all legacy and future multi-touch hardware has the ability to implement a new era of gestures.

Gestures. They are a sign of the times to come.

The Case of Lost Text Messages and AT&T

On Oct 14th, like millions of other people, I got my iPhone 4S. Because I develop iOS apps I had been using iOS5 for a few months, so having both the new hardware and software together was nothing less than spectacular.  Many unfortunate users that day experienced failed activations because of Apple's server overload. Since I didn't get mine until later that night, I was able to shut off my iPhone4, activate the new phone and restore from iCloud without incident.  My new phone was BLAZING fast.  Siri lived up to the hype.  Cutting the cord with iTunes has made life so much more enjoyable.  Finally, I was able to communicate with the other iDevice people en masse over iMessage. It was a good day.

The next night at dinner a friend commented that she had texted me but I never responded. I didn't think too much of it, because these things happen on iPhone launch/activation days due to network congestion.  Over the next few days I noticed that I wasn't receiving text messages from anyone outside of iMessage, and people weren't receiving mine.  It didn't matter which carrier.

After consulting the Google, I learned that many other people were experiencing similar issues, regardless of carrier or device version (4 and 4S).  It turns out that with iOS5 there is a cap on the number of text messages the Messages app will support. After many frustrating hours with both Apple and AT&T, their "fix" was to wait for an update from Apple, start deleting messages from your phone to free up space, or wipe your phone and set it up as a new phone, in which case all your text messages would be gone.  I wasn't about to start all over by wiping my phone.  I deleted all the old text messages that I could, but still no luck.  I waited a few days to see if either the carriers or Apple would break their radio silence on this issue and offer some support. Well….NOTHING.  While reading the AT&T community support forum I noticed that a moderator had reminded anyone having issues to check their account to make sure they hadn't gone over their allotted number of texts.

When I checked my account I found something that will make me LOATHE AT&T for quite some time to come.

During the checkout process when I bought my phone I had to choose my data plan.  I still don't know why this was part of the process.  Maybe because I am one of the remaining few who have unlimited data and AT&T hates me. Regardless, I had to verify that I wanted to keep that plan. Apparently, AT&T also found it fairly funny to remove the text messaging plan that I originally had. I mean, why would someone assume that their current service features would still be there just because they are ordering a new phone?

Needless to say as soon as I added the text messaging back everything started working.

Carriers wonder why consumers, and companies like Apple (iMessage), Google (Google Voice) and even Blackberry, are doing whatever they can to route around their unreliable networks, constantly changing terms/conditions and overall uncoolness towards consumers (hidden fees, data throttling, "upgrade fees", etc).

If you are having text message issues, check your account before you waste days and hours wiping and restoring your phone.

Fun Times with Objective-C Runtime

Good reusable solutions to problems usually come out of necessity rather than luck or leisure.  Over the past 18 months I was constantly faced with the problem of how to simplify the data integrity, security, and aggregation of multiple datasources, usually in big organizations, that would then be fed into an iOS app.

The first solution to this issue is to create the "man in the middle": essentially a proxy/web-services layer that acts as the gateway to all the necessary model objects.  Utilizing tools such as Gearman, CouchDB, MySQL, PHP, Python, Memcache and a little bit of Apache to make everything complete, I was able to deploy a lightweight standard response to the iOS app.  As requests from various business groups came in, I found myself taking more and more out of the app and putting that logic into the services layer. Instead of having to rely on the app to handle security, data integrity, computational analysis, image manipulation, etc., I pushed that onto a lightweight server-side stack that could be scaled easily and maintained without having to revision the app for basic model object changes.  The added benefit I saw later on was that I could adapt the response based upon the request…(Android, iOS, mobile web, desktop browser, etc.)

Now that I was concentrating more on feature sets, bug fixes and UI enhancements, I was still stuck having to manage various model classes within the app.  This was a serious pain point for me. I wanted to consolidate the various list and detail controllers I had down to one instance of each.

Objective-C runtime to the rescue.

Let me be very clear that the Objective-C runtime is VERY powerful and not for the faint of heart, but when used effectively it can be quite useful.  The approach that I took was as follows.

Use the existing "standard" JSON format from the web services layer and restructure the "responseObject" so that it gives meta information about the model name, the properties for the class, and which properties should be displayed on the list page and the detail page. By structuring the JSON (XML, plist, etc.) payload this way, I can now push changes to the model object and its display anytime I want.
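A payload in this spirit might look like the following (field names are hypothetical, not the project's exact format):

  {
    "responseObject": {
      "modelName": "Employee",
      "properties": ["firstName", "lastName", "title", "email"],
      "listProperties": {"title": "lastName", "subtitle": "title"},
      "detailProperties": ["firstName", "title", "email"],
      "data": [
        {"firstName": "Jane", "lastName": "Doe", "title": "Engineer", "email": "jdoe@example.com"}
      ]
    }
  }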

In the github project I use an "Employee" object as the example. The app should be able to show a list of employees (title and subtitle) and then a detail page for each employee that would display up to 3 properties.
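Under the hood the runtime mechanics look something like this (a bare sketch assuming NSString-only properties; the project wraps this in a parser, and memory management is omitted):

  #import <objc/runtime.h>

  // Build the class named in the payload's meta information at runtime.
  Class cls = objc_allocateClassPair([NSObject class], "Employee", 0);
  class_addIvar(cls, "firstName", sizeof(id), log2(sizeof(id)), @encode(id));
  class_addIvar(cls, "title", sizeof(id), log2(sizeof(id)), @encode(id));
  objc_registerClassPair(cls);

  // KVC falls back to direct ivar access, so no accessors are needed yet.
  id employee = [[cls alloc] init];
  [employee setValue:@"Jane" forKey:@"firstName"];
  NSLog(@"%@", [employee valueForKey:@"firstName"]);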

I have included another JSON file for "Events" just to show the flexibility. The same list/detail view controllers are now reused for two different objects.

The (current) limitations:
  1. There is an assumption that your list view will use the subtitle UITableViewCell style.
  2. Parsing and creation of classes doesn't take into account types other than NSStrings. I have an idea for how to remedy this, I just haven't implemented it yet.
  3. What about images?  Most list views have an image associated with them.  What I am thinking about is having a flag set in the properties specifically for images so that they can be loaded asynchronously.
I will be adding these features in.  I am still getting my head wrapped around some of the more advanced features of the runtime. Below are links to the resources I used in my research.

Posted on Slideshare is the presentation that I gave.  It details the philosophical and practical importance of having an SOA similar to what I described.  It was heavily inspired by the presentation written on the subject of NPR's SOA.

Presentation References:

What Should I Do Next Steve?

The past two weeks have been quite interesting. Three of the most notable events for me were:

1) My contract with VW is ending so I will be without a job in about a month
2) iPhone 4s was announced (and ordered @ 3:30am)

In regards to the first big event, working with VW has been a blast and I have really enjoyed my time here. The project I worked on was very challenging, and the end result was all the more rewarding.  The people I got to work with are truly exceptional.  I would hang out with them even if I wasn't being paid.

I am not one to wait until the last minute for anything, so I have been thinking about what I want to do next in my career.  I could go look for another corporate full-time job doing iOS, I could try to find another contract gig, or I could really be adventurous and work for myself.  To weigh all of my options equally I have spent many hours these past few weeks talking to other developers and close friends about their experiences.

However, at the end of the day I have to wake up living with my decision.

For those of you who don't know the east Tennessee area very well, it is absolutely beautiful, especially in the fall.  The part of town I live in is its own little pocket with steep winding roads and TONS of trees. There is lots of solitude. I took a very long walk through these roads and up the little mountain to think about "what's next".  While doing so I listened to Steve Jobs' 2005 Stanford commencement address, paying very close attention to the parts about listening to your inner voice, pushing obstacles aside to achieve your goals, and connecting the dots.  I kept thinking about the three different Evernote notebooks I have filled with various harebrained ideas.  Two Github projects that have been neglected that I really want to put more work into.  How excited I am to speak at the Memphis Mobile Conference next week. How I admire the following iOS developers and their impact on the iOS community, and how I want to be included in that list.
On the last half mile home the decision became clear.  Not exactly how I was going to get there, but what I wanted to do. As I walked up my stairs I noticed I had clenched my fist.  It was not in anger; I wasn't mad. In fact I was quite happy and smiling.  I went inside and took my daughter to the park.

I was ready to put my ding in the universe.


Let's Talk iPhone Event

I always love the last few months and weeks that build up to an iPhone/iPad announcement.  Though some of the rumors have turned into fact, most are just random, fabricated factoids that the MSM and bloggers push out there for the masses.  At the end of the day I can only presume that Apple doesn't pay them any attention except in the case of lost hardware.

However, I am not immune to reading a lot of these misleading articles. Granted, I don't treat them as gospel, but there are a few things that I am very excited about.

First is the official rollout of iCloud.  Having used the beta version over the past few months, I have been pleasantly surprised at its convenience and ease of use.  It is a huge win for the average consumer who has, up to this point, been reliant on iTunes and various other backup strategies to manage their content and media.

Second is iTunes Match.  It has always been somewhat of a letdown that users couldn't access all of their music from any of their devices.  The process is extremely easy, and now you don't have to carry around a separate iDevice to hold your music catalog.

Third, wireless syncing.  I have a strong feeling that most IT administrators, and the children/grandchildren who provide tech support to their parents/grandparents, will jump up and down over this.

Fourth, the API features in the new iOS have once again raised the bar for app developers.  Apple has listened to feedback from developers to make it easier for us to provide better functionality in our apps, as well as more detailed customization options.

*Bye-bye drawRect: category*

Fifth (possibly), Apple Assistant.  Rumor alert! Could the headline "Let's Talk iPhone" really have a hidden meaning?
Apple gave the masses the first decent implementation of multi-touch, the communication mechanism that is inherent to all of us since birth: pointing, dragging, panning, zooming, swiping.  With the acquisition of Siri, there is no doubt that Apple has been putting a lot of effort into creating the near-perfect implementation of voice recognition for the masses: on-device voice processing via natural speech that is a native part of iOS.  If this is the case, then you will see a whole new generation of apps and a paradigm shift in how users, impaired or not, interface with these applications.

Apple...let's talk.