Well Tempered Chronicles - Implementation


Implementation

This is the fourth installment of the behind-the-app series, showing the making of Well Tempered version 2.0.

In the Well Tempered 1.0 installment I explained how Well Tempered got to be an iPhone app. For version 2.0 I wanted to make sure it integrated with all relevant parts of the iOS ecosystem. That meant making it work well on all iPhone sizes and on iPad, and, where it made sense, also integrating with the Apple Watch.

Version 1.0 had been compatible with the original iPhone, but keeping compatibility is expensive. So for version 2.0 I gave myself a clean slate and set the current platform, iOS 8, as a baseline.

I chose to make the app universal, meaning that it would support any iOS form factor. While working with Auto Layout, I did not use size classes, preferring to let the UI shrink and grow, changing only slightly to accommodate the different sizes. I think this worked well across the iPhone sizes, though there is perhaps still a little too much empty space on the iPad.

An important part of designing an application is identifying its logical units and how they compose. Up front it was important to me that defining your temperament should all be done on the main screen. So the topmost part of the screen was designed for that, leaving the bottom part to either the pitch pipe or the tuner. To make this modular, I implemented it as two view controllers hosted within one main view controller. I must admit I was surprised at how non-trivial this was, and at how hard Auto Layout made it for me, especially when I wanted to recompose during rotation. I trust, though, that stack views in iOS 9 will do much to relieve the pain experienced there.
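The containment pattern described above can be sketched roughly as follows, in current UIKit syntax. The class and property names here are illustrative, not the app's actual ones; the app's real layout constraints and rotation handling are omitted.

```swift
import UIKit

// Hypothetical placeholders for the two composable halves of the screen.
class TemperamentViewController: UIViewController {}
class PitchPipeViewController: UIViewController {}
class TunerViewController: UIViewController {}

class MainViewController: UIViewController {
    let temperamentController = TemperamentViewController()
    var bottomController: UIViewController = PitchPipeViewController()

    override func viewDidLoad() {
        super.viewDidLoad()
        embed(temperamentController)
        embed(bottomController)
        // Auto Layout constraints pinning each child's view into its
        // half of the screen would be added here.
    }

    // Standard view controller containment: add the child, add its view,
    // then tell the child the move is complete.
    private func embed(_ child: UIViewController) {
        addChild(child)
        view.addSubview(child.view)
        child.didMove(toParent: self)
    }

    // Swap the bottom half, e.g. from the pitch pipe to the tuner.
    func showBottom(_ newChild: UIViewController) {
        bottomController.willMove(toParent: nil)
        bottomController.view.removeFromSuperview()
        bottomController.removeFromParent()
        embed(newChild)
        bottomController = newChild
    }
}
```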

I chose to write my entire application in Swift. This was at a point in time when Swift 1.2 had just come into beta, and CocoaPods support for Swift was not entirely complete. My first patches to AudioKit thus added CocoaPods support, alongside bug reports to Apple about the upcoming Xcode 6.2.

Using Swift is of course not just about swapping the syntax of one language for writing Cocoa for another. With Swift came new concepts, or rather, concepts that I got to use more than I was used to. Three in particular stand out: value types, protocol-oriented programming, and functional programming.

My interest in value types came out of Andy Matuschak's presentation at Realm. In short, it is about using structs instead of classes to store model data, letting data be just data. I thought this would be especially beneficial when implementing state restoration, something I view as an important part of being a good iOS citizen, yet something many app developers find too complex in their apps and choose to neglect. I discovered, though, that I had to serialize the structs myself, having no good JSON or plist serializer at hand. Still, I think the work paid off when it came time to pass the application state to my Apple Watch companion app, and I learned a lot. I expect I'll be encapsulating all my model data in structs rather than classes for the next few years.
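A minimal sketch of the idea: model data as a value type with hand-rolled, plist-friendly dictionary serialization, the kind of round-trip that state restoration needs. The type name and keys here are my own illustration, not the app's actual model.

```swift
// A value type holding model data: just data, no identity.
struct Temperament {
    var name: String
    var centOffsets: [Double]  // deviation from equal temperament per note

    init(name: String, centOffsets: [Double]) {
        self.name = name
        self.centOffsets = centOffsets
    }

    // Serialize to a property-list-friendly dictionary.
    func toDictionary() -> [String: Any] {
        return ["name": name, "centOffsets": centOffsets]
    }

    // Rebuild the value from a dictionary; fails if keys are missing.
    init?(dictionary: [String: Any]) {
        guard let name = dictionary["name"] as? String,
              let offsets = dictionary["centOffsets"] as? [Double] else {
            return nil
        }
        self.name = name
        self.centOffsets = offsets
    }
}
```

Because the struct is a value type, handing it to another component (or serializing it for a Watch companion) copies the data rather than sharing mutable state.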

I also found it really nice to encapsulate my model functions together with my structs, and to have my structs conform to different protocols. Apple later described this approach, at WWDC 2015, as protocol-oriented programming. That, by the way, is a really good talk, and you should watch it. Afterwards, of course, you should read Crusty's reply.
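In that style, shared behaviour lives in a protocol extension rather than a base class, and plain structs pick it up by conforming. A small sketch, with illustrative names of my own choosing:

```swift
import Foundation

// A protocol describing anything that has a pitch.
protocol Tunable {
    var frequency: Double { get }
}

// A default implementation shared by every conforming type:
// deviation from a reference frequency, measured in cents.
extension Tunable {
    func cents(from reference: Double) -> Double {
        return 1200 * log2(frequency / reference)
    }
}

// A value type gets the behaviour just by conforming.
struct Pitch: Tunable {
    let frequency: Double
}
```

Any other type, a tuner reading or a pipe tone, say, could conform to `Tunable` and inherit `cents(from:)` for free, without a class hierarchy.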

Functional programming, while not new to me, is much more natural to embed in my code when writing Swift than when writing Objective-C.
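As an example of what that looks like in practice, here is how one might derive an equal-tempered scale from a reference pitch with `map` and `filter` instead of an accumulating loop. This is a generic illustration, not code from the app:

```swift
import Foundation

let a4 = 440.0                // reference pitch in Hz
let semitones = Array(0..<12) // semitone offsets above A4

// Each semitone multiplies the frequency by the twelfth root of two.
let frequencies = semitones.map { a4 * pow(2.0, Double($0) / 12.0) }

// The same pipeline composes: keep only pitches below 600 Hz.
let lower = frequencies.filter { $0 < 600.0 }
```

The transformations read as a declarative description of the scale, and composing further steps is just chaining another call.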

Finally, I thought I should write an Apple Watch extension to be able to remote-control the application. I did this while reviewing Pearson's Learning WatchKit Programming book by Wei-Meng Lee; his book on WatchKit 1.0 is a brilliant starter's guide. When I finally got an Apple Watch to try it out on, I was surprised by the speed of it, or rather the lack thereof. The simulator had been intentionally slow, and I was expecting Apple to have overdone the slowness. I did not expect the real-life experience of the app to be even slower, or rather, that the communication between the app UI and the app extension would be as slow as it was. I am definitely looking forward to reworking the Apple Watch companion as a watchOS 2.0 app.

In the next installment, I'll talk more about implementation, and in particular the implementation of Localization and Accessibility.





© 2020 Niklas Saers Contact Me