Parsing Hex Strings in Swift

August 12th, 2017

I want to turn the string "0x02 0x03 0x04" into an array of bytes ([UInt8]). Seems simple enough, right?

let hex = "0x02 0x03 0x04"
let components = hex.components(separatedBy: " ")
let array = components.map { UInt8($0) }

But the UInt8 constructor that takes a string doesn't understand the "0x" prefix that indicates hex strings. There's a handy UInt8("3F", radix: 16) constructor that does understand hex, but we have to remove the prefix first.

Swift doesn't have an obvious way to remove the first 2 characters from a string.

This works:

let hexStr = "0x3F"
let hexValue = UInt8(hexStr.replacingOccurrences(of: "0x", with: ""), radix: 16)

But this will scan the entire string (admittedly only two more bytes in this case). Is there an easy way to just skip the first two bytes?

String has substring(from:String.Index) which seems like what we want, but "0x3F".substring(from:2) doesn't compile. The String.Index is a type that's managed by the String, to account for Unicode glyphs that are longer than one byte.

hexStr.substring(from: hexStr.index(hexStr.startIndex, offsetBy: 2)) will give us the correct substring but this is a mouthful - it's hard to tell from scanning that line that all we want to do is remove the first two characters.

So we have these two options to do this with Swift string methods:

let hex = "0x3F"
let hex1 = hex.replacingOccurrences(of: "0x", with: "")
let hex2 = hex.substring(from: hex.index(hex.startIndex, offsetBy: 2))

As expected, a quick benchmark shows the second version is about 4 times faster than the first version.
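For the curious, here's a rough version of that benchmark (a sketch — absolute timings vary by machine and optimization level, and modern Swift slicing stands in for the now-deprecated substring(from:)):

```swift
import Foundation

// Time each approach over many iterations and print the elapsed seconds.
func time(_ label: String, _ block: () -> Void) {
    let start = Date()
    for _ in 0..<100_000 { block() }
    print(label, Date().timeIntervalSince(start))
}

let sample = "0x3F"
time("replacingOccurrences") {
    _ = UInt8(sample.replacingOccurrences(of: "0x", with: ""), radix: 16)
}
time("index(_:offsetBy:)") {
    _ = UInt8(sample[sample.index(sample.startIndex, offsetBy: 2)...], radix: 16)
}
```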

There's also Scanner, which can take the hex string with the 0x prefix and return a hex value, but it works with UInt32 and UInt64 types so there's some extra gymnastics to get the result into a UInt8. Scanner benchmarks to be almost as fast as using substring, but using a separate class for this operation feels like more overhead than I want.
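The Scanner route looks something like this (a sketch — scanHexInt64 accepts the "0x" prefix, but hands back a UInt64 that then has to be narrowed to UInt8):

```swift
import Foundation

// Scanner understands the "0x" prefix when scanning hex...
let scanner = Scanner(string: "0x3F")
var wide: UInt64 = 0
let scanned = scanner.scanHexInt64(&wide)

// ...but the result is a UInt64, so it needs narrowing to UInt8.
let byte = UInt8(truncatingIfNeeded: wide)
print(scanned, byte) // true 63
```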

One of the great things about Swift is we can drop down to C code, and for this particular operation, there's a C function that fits the bill exactly.

let hex3 = UInt8(strtoul(hex, nil, 16))

The C strtoul function converts a string given a radix, and for radix 16, ignores the prefix. And it's 3x faster than the Swift substring version above.
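Putting it together, the original three-byte string parses cleanly with strtoul (a quick sketch; the conditional import is only there so the C library is visible on both Apple platforms and Linux):

```swift
#if canImport(Darwin)
import Darwin
#else
import Glibc
#endif
import Foundation

// strtoul ignores the "0x" prefix when given radix 16,
// so each component converts directly.
let hexInput = "0x02 0x03 0x04"
let bytes = hexInput.components(separatedBy: " ").map { UInt8(strtoul($0, nil, 16)) }
print(bytes) // [2, 3, 4]
```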

For readability let's wrap this up into a UInt8 extension:

extension UInt8 {
    static func from(hexString: String) -> UInt8 {
        return UInt8(strtoul(hexString, nil, 16))
    }
}

With that in place we finally have:

let hex4 = UInt8.from(hexString: "0x3F")

Since making it easy to apply to other int sizes veers off into generics, I'll leave that as an exercise for the reader.
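For what it's worth, here's one possible generic shape — just a sketch, not the only way to slice it. It constrains to unsigned fixed-width integers and funnels everything through strtoul (types wider than UInt's range would need strtoull instead):

```swift
#if canImport(Darwin)
import Darwin
#else
import Glibc
#endif
import Foundation

// A sketch of one generic approach.
extension FixedWidthInteger where Self: UnsignedInteger {
    static func from(hexString: String) -> Self {
        // strtoul returns UInt; truncatingIfNeeded narrows to the target type.
        return Self(truncatingIfNeeded: strtoul(hexString, nil, 16))
    }
}

let a = UInt8.from(hexString: "0x3F")    // 63
let b = UInt16.from(hexString: "0xBEEF") // 48879
print(a, b)
```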

Always-On iOS

June 23rd, 2017

The iPad is a perfectly capable computer, but one thing you can't use the iPad for is jobs that take a long time.

If you're on a desktop or laptop PC and you need to kick off a job that's going to take a half hour, such as rendering a 3D video, you just do it. The machine sits there and does the job until it's done. The computer may go to sleep, which will pause the job, but you can control if and when this happens. The laptop may run out of power, but plug it in and the job continues.

On iOS, you can kick off a long-running job, but you have to keep that app in the foreground until it completes. You can't hit the Home button and go surf the web until it's done, because iOS will kill the backgrounded app after 10 minutes, whether it's done or not. Need to upload a large file to an FTP server? Better hope you can get it done in 10 minutes, or make sure you switch back to the FTP tool before the 10 minutes is up, or the app is killed.

This also means you can't use an iOS device as a server. The hardware is more than capable of acting in a server role, listening for connections and responding to them, but because hitting the Home button would wind up eventually killing your server, you can't reliably use an iOS device this way.

iOS has the ability to run background jobs. Photos, for example, downloads and processes your photos in the background. iCloud does this for documents. But no third party applications are allowed to do this.

Suppose I wanted to build an app that would monitor photos from a webcam over wifi, do image detection with the vision API, and take some action depending on what objects are detected. Totally possible and totally reasonable app to build, but the app would have to be launched interactively by the user and kept in the foreground. If the device were to lose power, a user would have to tap on the app to re-launch it.

iOS does seem like an odd platform for this sort of server application, but it's the platform where all the innovation is happening. Some Apple technologies, like CoreML and HomeKit, are only supported on iOS.

I don't mind that iOS is so aggressive with background processes when the device is running on battery power. That seems like the right tradeoff. But when plugged in, apps should be able to request, and be granted, the ability to run in the background as long as they need to.

OperationQueue and dynamic priorities

May 7th, 2017

I had a question about OperationQueue that I couldn't find an easy answer to: Can you modify the priority of an operation once it's been added to the queue?

My situation is that I have some background operations that are executing, but the user may choose to view one of these items interactively, and at that point, I'd like to bump that item's priority so the user doesn't have to wait for it to get its turn in the queue. I want that item to jump to the head of the line.

Many of the StackOverflow responses I looked at said that the queuePriority had to be set before the operation was added to the queue, but I couldn't find a definitive answer in Apple's documentation, so I put together a short playground to test it. Here's the code:

import UIKit

let queue = OperationQueue()

// Serial queue, start suspended
queue.isSuspended = true
queue.maxConcurrentOperationCount = 1

// Create three operations
let operations = [
    BlockOperation { print("operation 1") },
    BlockOperation { print("operation 2") },
    BlockOperation { print("operation 3") },
]

// Add all the operations
operations.forEach { queue.addOperation($0) }

// Resume (one of the items will start executing immediately)
queue.isSuspended = false

// Now dynamically change the priority of one of the items
operations[2].queuePriority = .high

// Must wait for completion or you won't see the result
queue.waitUntilAllOperationsAreFinished()

Play with which operation you're adjusting the priority on and you can see that the priority changes are respected. You can adjust the priorities of operations on a queue.

BufferBloat, or Gaming on DSL while Uploading

February 27th, 2017

If you find that when some device on your network is uploading data, the entire network becomes unusable due to excess latency, this post is for you.

There's a term for this, BufferBloat, and it refers to a phenomenon where your ping times increase due to excessive buffering at a choke point on your network.

In my case, I have a DSL network connection with 800kbps upload. When the network is idle, I have pretty good ping times to the outside world, with typical values being around 20ms.

But if some device on my network is uploading data, this can increase to 2000ms or more.

There's an easy test for this now. Go to DSLReports Speed Test and watch the Buffer Bloat score. This test measures ping times during the download and upload portion of the bandwidth metering tests, and if it goes up significantly while testing vs idle, that's buffer bloat.

In my case, quite often some device on the network would be uploading something, and this was killing my ability to play Overwatch on the PS4. It's hard to track down what's uploading. It could be that my wife has recorded a video clip on her phone and it's syncing to iCloud. Maybe I dropped a big file into a Dropbox folder. All I'd know is I couldn't play right now because "something" was killing the network.

I tested my network connected directly to the SpeedStream 6250 DSL modem, and connected through an AirPort Extreme Base Station. Neither made a big difference. But after doing some reading, I discovered there's a QoS algorithm for prioritizing packets that helps with this problem, available for integration into networking equipment firmware. The "codel" algorithm is implemented into newer versions of the DD-WRT custom router firmware project.

My Solution

I'm not using the WiFi on the router, so the wireless stats didn't matter to me. The cheapest DD-WRT compatible router that supports codel is the ASUS RT-AC56U. The ASUS RT-N66U also runs DD-WRT, but the codel algorithm isn't enabled in firmware for that router, so it won't help.

I set up that router, installed a beta DD-WRT (the one I used is here), and configured it as my PPPoE gateway. The SpeedStream DSL modem is still there, but it's just acting as a dumb modem, and the RT-AC56U is taking care of PPPoE, and local network services like DHCP.

The last step was enabling QoS on the router. This is in the DD-WRT web interface. Choose the fq_codel queueing discipline, and enter a value in the uplink field that's slightly lower than your connection's actual bandwidth speed. For my 800kbps-rated upload, which typically sees about 80k/second upload speeds, I entered 650 here. You can experiment by changing the numbers and repeatedly running the speed tests.

Once all this was done, the buffer bloat problem was solved. It worked better than I expected.

Ping times are great in theory, so here's some concrete evidence: I've been uploading some large video files for the last few days. Meanwhile, I can go play Overwatch and not see the Ping or Latency indicators at all. It really works!

Thanks to the bufferbloat project for relentlessly pursuing this and developing the algorithms to combat bufferbloat. As of February 2017 none of the mainstream routers have these algorithms built in, but hopefully in the future this will become a standard feature.

Occasionally Connected

February 16th, 2017

I think it's easy for people who have a reliable always-on internet connection to forget that much of the world does not. Even those of us who do may prefer not to use cellular data as much as possible.

What I'd like is for the phone to sync as much data as it can while it has a Wi-Fi connection, on the assumption that I'm going to need that data when I'm offline.

iOS is pretty good at this, but as I recently discovered on a trip where the hotel Wi-Fi only worked in the common areas and I had no cellular connection at all, both iOS itself and the apps I use had some big problems working offline.

Ideally, Background App Refresh would launch every app that wants background refreshes whenever I'd encounter a WiFi network after having been offline for a while, but that's not how it seems to work. I couldn't predict which apps would have fresh data and which would not.

Here's a quick rundown of the apps I was using and my experiences during an occasionally-connected week.

Mail is the gold standard for apps that work either online or offline. When I was within WiFi range, mail would download, and I could then read it and reply whether I was online or offline. Any mail I wrote while offline would send the next time I was online. This is exactly how it should work.

TweetBot would pull down the timeline if I launched it while in WiFi range, but it wouldn't fetch it on its own; I had to launch the app manually.

Safari Reading List seems like an ideal offline feature, but in practice, worked terribly.

When you add a page to the Reading List, if you're online when you do it, Safari will go and cache the page so you can read it offline. But if you're adding pages while you're offline, Safari doesn't proactively go and download the content of those pages when you do come back online. It also doesn't download pages that you added from a different device. This renders Reading List useless as you just can't predict what pages will be available.

Music lets you sync playlists for offline playback, and this works well.

The Washington Post app would download stories if I launched it while online, and keep them cached so I could read them when offline.

Amazon Prime let me download shows for offline playback, but when I tried to play them in the hotel room, it had actually downloaded a French language version of The Grand Tour even though none of my preferences were set that way. Fail.

Plex is awesome. I downloaded some shows to play in the hotel room when offline, and they played perfectly.

Ulysses (and iCloud Drive sync in general) would sync if I was online, but didn't sync unless I was running the app. This is different from iCloud Documents and Data, which I believe would sync in the background even when you weren't running the app.

The Apple Watch Weather complication would just show stale data while offline, with no indication the data was stale. Not good.

I mostly place the blame on Apple here, for not aggressively triggering Background App Refresh for all enabled apps when the device found itself online.

Emotions by Hodelpa

February 6th, 2017

We went to the Dominican Republic for a week, and stayed at Emotions by Hodelpa (TripAdvisor). Here's some information about the hotel that you might find useful if you're considering this resort for your trip.

The place is always referred to as "Emotions by Hodelpa". Hodelpa is a chain, and they recently bought this resort and have put a lot of resources into renovating it.

"Essentia by Emotions" is part of the hotel, but shows up as a separate hotel in the booking sites. It's actually just one of the Emotions buildings, the closest one to the beach. Essentia guests get some nice perks, including being steps from the beach, the 24 hour beach bar and one of the buffet restaurants.

The beach is beautiful, and there's a 24-hour all-inclusive bar right on the beach. Most of the guest rooms are across the street from the beach, but it's a short walk and there are guards at the crossing 24 hours a day so it's never a problem getting to the beach.


The primary language spoken at the resort is Spanish. They will try to accommodate English speakers, but it would be worth brushing up on some basic Spanish phrases. All the signage is in English, and menus and printed materials are available in English, but the staff all speak Spanish.

The common areas, restaurants and some of the buildings with the guest rooms are new (in January 2017). The renovations are ongoing, and not everything is perfect yet but the design is nice (if stark) and I expect the rough edges will be smoothed out. The rooms are large, but could use more storage.

A lot of reviews mention a lack of hot water. The buildings have solar water heaters on the roof, and I suspect they just aren't up to the needs of everyone showering at the same time. Try again later, I guess.

Emotions is a small resort. It has one main pool, the one you see in the photos on their website. There's an adults-only pool, a small Essentia-only pool, and of course the beach.

There's a nice pond with two large pink flamingos that are always there. I don't know why they don't fly away, but they seem perfectly happy there and it was cool seeing them every time we walked past.

It's in a small town (Juan Dolio) with nothing to do. If you're looking for adventures or excursions, expect a long drive to one of the larger nearby towns.

WiFi is not available in the rooms, at least, not in our room. There are WiFi repeaters at each building, but the signal doesn't make it through the concrete walls into the rooms. You can get online on the balcony, but it's frustratingly slow.

The coffee shop is great. Hand-made espresso drinks including alcoholic coffee drinks, and pastry snacks, no extra charge. The coffee shop doesn't open until 9am, but there are in-room coffee makers.

There are some specialty restaurants that are included in the all-inclusive package, but you have to make a reservation one day in advance. There's a lineup in the lobby right at 9am. The restaurants were a nice change from the buffet, but didn't have a kids menu.

This isn't a full review, just some notes from our trip. We enjoyed it, but did find it a bit lacking in entertainment and things to do. If you're looking for a quiet week, this is your place.

Controllers vs Mouse and Keyboard

November 20th, 2016

This was (and is, until tomorrow, November 21st, 2016) a free weekend in Overwatch, so you can download and play the game for free on PC, PS4 and Xbox One. I play a lot of Overwatch on the PS4, so I thought I'd take the opportunity to try it on the PC.

I'm not very good at aiming using a game controller. I find it constantly frustrating trying to line up a shot and overshooting the target. I've been trying to get better, by using techniques like roughly aiming with the aiming controls and then moving my player for the fine aiming. This works, but it never feels precise.

I installed Overwatch on the PC and spent a few hours playing it, and first impression is, wow. I can aim! What a difference.

Second impression is: Uh oh, so can everyone else. Snipers seem a lot more dangerous on the PC than on PS4.

When you get killed in Overwatch, the game shows you a replay from the perspective of the person who got you, and it makes me feel better to see, on the PS4, other players having the same trouble. Lining up a shot, overshooting, slowly repositioning ... sometimes people make a lucky shot but often it shows other people struggling with the same imprecision I do. 

That levels the playing field, and probably explains why they'll never give us cross-platform play (PC and console gamers in the same game). The PC gamers would run away with it.

I'd switch back to the PC, but the main reasons for going console in the first place still apply: that's where my friends are, the cheaper long-term hardware cost (no buying new hardware as minimum requirements rise), and RSI.

I use a trackpad on my Mac, because when I used to use a mouse, I'd start to experience carpal tunnel syndrome symptoms. Wrist pain. Switched to the trackpad and the symptoms vanished. Game controllers are fine, trackpads are fine, but for some reason, I just can't spend a lot of time gaming with a keyboard and mouse anymore.

But it was nice to jump back into that world for a few hours. Yes, console gamers are at a disadvantage, but we're all at the same disadvantage, which evens it out. Makes me feel a little less bad about my crummy aim.

Xcode Errors in Source Editor

November 19th, 2016

Here's a problem I was having recently. My project would build and run fine, but the source editor was showing errors.

Sometimes the errors would be there, sometimes they wouldn't. Sometimes they'd interfere with things like autocomplete, and it made working with the affected source files rather frustrating.

The problem was that the source file was used in more than one target, and one of the targets in a scheme other than the one I was using to develop the app had build settings that were causing the file to not build successfully.

In my case, I'd added some Swift source to the project, and configured the bridging header in my main target, but not in the UI Tests target. This is hard to discover, because there's no way that I can see to build that target directly from Xcode.

But there is an indirect way to get it to build, and it's probably a good idea to enable this anyway:


Pick "Manage Schemes..." in the dropdown that appears when you click on your project name in the picker on the Xcode toolbar, then edit your scheme and, under the Build action, check the Analyze box for each additional target you want built.


Now when you Analyze your project, you're analyzing not just the main target, but any other targets you select as well. This will show the build errors in Xcode and make it easy to go fix them.

More RAM Please

November 1st, 2016

Here's my two cents on the new MacBook Pros, not that anyone is asking.

This tweet sums it up:

"I’m struck by the cultural divide between dismissive Apple defenders and people who buy an expensive mac for real work every four-five years" - @pinboard

The main reason I'm uncomfortable dropping that much money on a new laptop right now is RAM. My current Mac Pro, a late 2010, has 20GB of RAM, and my current laptop, from 2012, has 16GB. I use a lot of RAM, because I use big, RAM-hungry tools, and I'm frequently close to using all of it. I just checked, and right now I'm using 15.59GB.

When I bought my laptop, four years ago, 16GB was way more than I needed. But usage increases over time. I'm uncomfortable buying a new computer, expecting it to last another four years, with the same amount of memory as my current hardware.

I know it's probably Intel's fault, and that's fine. What matters to me is that these new laptops aren't the computer I want to upgrade to, so I will wait.



“Free” Photo Storage

October 25th, 2016

Google is giving free, unlimited, full-resolution photo storage to all customers of their Google Pixel phone, and giving unlimited storage of "optimized" photos to everyone. This is an attractive deal, and Google using the storage warnings on iOS in their advertising will resonate with a lot of people. I think this will be a real, long-term threat.

How much storage is a Pixel user going to use, over the life of the phone? It really depends on the user, of course, but since the Pixel is a high-end, expensive phone, I'd expect the people who buy them are going to be heavy photo and video users. I could easily see using a few hundred gigabytes of photos and video over a couple of years. Video is large.

Apple added 4K video and Live Photos to the iPhone, features which more than double the per-item storage requirement. In a world where Apple makes a profit from either users buying higher-capacity devices, or paying for iCloud storage, that's great for Apple. But it makes it more difficult to compete with Google's free storage option, since it would simply cost Apple more to store the data generated from the same amount of customer usage.

But why is Google doing this? I don't believe it's simply to sell more phones.

Google is making hardware as a way to protect their ad business. Everything Google does can be viewed through this lens. Chrome was a way to keep the browser vendors from using an alternate default search engine. Android was a way to keep Apple from cutting Google out of mobile. Google needs you to be using Google services, and is systematically removing anything that gets in the way of that. You can have Google fiber to your Google Wi-Fi to your Google phones, and Chromebooks. It's Google all the way down.

But, specifically, why photos? I don't have any inside info here, but from looking at a few obvious trends, I have a hunch.

Recognizing things in images is becoming easier. The search capability that Google has introduced for Google Photos lets you search for photos that contain whatever terms you want to type in. Your phone knows where you are when you take these photos, so Google can tell a lot about the places you take pictures.

For example, it's obvious from my photos that I have a dog. Why wouldn't Google use that signal in their ad-selection algorithm? It makes sense, and it's feasible, so they will.

There's so much Google could learn about your home, your style, colour preferences, clothing, furniture, and so much more just by analyzing your photos.

That's the price for free photo storage.