UITableView, Multi Select and Swipe To Delete

November 29th, 2011

I was having a problem recently with a UITableView, where the swipe-to-delete gesture wasn't working.

The documentation is pretty clear that your table will get the swipe-to-delete behaviour if you implement the [cci]tableView:commitEditingStyle:forRowAtIndexPath:[/cci] method in your [cci]UITableViewDataSource[/cci].

So, in other words, implementing this one method in your data source should be enough to enable swiping to delete a row:

[cc lang="objc"]
- (void)tableView:(UITableView *)tableView commitEditingStyle:(UITableViewCellEditingStyle)editingStyle forRowAtIndexPath:(NSIndexPath *)indexPath
{
if (editingStyle == UITableViewCellEditingStyleDelete)
{
// Delete the row from the data source.
[self.model deleteItemAtIndex:indexPath.row];

// And from the table itself
[tableView deleteRowsAtIndexPaths:[NSArray arrayWithObject:indexPath] withRowAnimation:UITableViewRowAnimationFade];
}
}
[/cc]

But this wasn't working for me.

After a bit of digging, I found the problem. My table supports an edit mode that the user flips into by pressing an Edit button in the toolbar. When in edit mode, the user can tap on multiple cells and then tap a trashcan at the bottom to delete all the selected cells. This works nicely, but it interferes with the swipe-to-delete gesture: swipe-to-delete isn't supported while multiple selection during editing is enabled.
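For reference, the trashcan handler looks something like this. The action name is made up, but it reuses the same [cci]deleteItemAtIndex:[/cci] model call as the swipe handler above, and assumes a single-section table:

[cc lang="objc"]
- (IBAction)trashTapped:(id)sender
{
    NSArray *selectedPaths = [self.tableView indexPathsForSelectedRows];
    if ([selectedPaths count] == 0)
        return;

    // Delete from the model in descending row order, so that removing
    // one item doesn't shift the indexes of the ones still to be removed.
    NSArray *sorted = [selectedPaths sortedArrayUsingSelector:@selector(compare:)];
    for (NSIndexPath *path in [sorted reverseObjectEnumerator])
    {
        [self.model deleteItemAtIndex:path.row];
    }

    // Then remove all the rows from the table in one animated batch.
    [self.tableView deleteRowsAtIndexPaths:selectedPaths
                          withRowAnimation:UITableViewRowAnimationFade];
}
[/cc]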

What I wanted was single selection when not in edit mode, and multiple selection when in edit mode. This was easy enough to achieve:

[cc lang="objc"]
-(void)setEditing:(BOOL)editing animated:(BOOL)animated
{
self.tableView.allowsMultipleSelectionDuringEditing = editing;
[super setEditing:editing animated:animated];
}
[/cc]

Make sure the table is set for Single Selection During Editing in Interface Builder, and that's it.

Objective-C Categories

November 27th, 2011

Categories take a little getting used to.

Objective-C categories let you add your own functionality to existing classes. This can take classes in directions that seem decidedly strange.

For example, if you want to draw a string in UIKit, you ask the string to draw itself. Coming from a Windows / Java background, this seems weird. Why would a string know how to draw itself into a graphics context? But, on the other hand, why wouldn't it?
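Concretely, inside a view's [cci]drawRect:[/cci], it looks like this, using the string-drawing category that UIKit adds to NSString:

[cc lang="objc"]
- (void)drawRect:(CGRect)rect
{
    // drawAtPoint:withFont: isn't defined by Foundation's NSString;
    // it comes from a category that UIKit adds to it.
    [@"Hello" drawAtPoint:CGPointMake(10.0f, 10.0f)
                 withFont:[UIFont systemFontOfSize:14.0f]];
}
[/cc]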

With Objective-C, any object can be taught how to draw itself. Or how to serialise itself. Or any other bit of functionality that a particular library wants to add.

It's a bit of a disconnect at first, but once you know it's happening it makes perfect sense, and the idea of extending basic types with your own categories seems perfectly reasonable. Need a way to hex-encode the binary data in an NSData object? Add it right to NSData using a category. That makes it easier to find than the equivalent in other languages, which is typically a utility function stashed away somewhere.
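Here's a minimal sketch of what that might look like (the [cci]hexString[/cci] method name is mine; it's not something NSData provides):

[cc lang="objc"]
@interface NSData (HexEncoding)
- (NSString *)hexString;
@end

@implementation NSData (HexEncoding)

// Returns the receiver's bytes as a lowercase hex string.
- (NSString *)hexString
{
    const unsigned char *bytes = [self bytes];
    NSMutableString *hex = [NSMutableString stringWithCapacity:[self length] * 2];
    for (NSUInteger i = 0; i < [self length]; i++)
    {
        [hex appendFormat:@"%02x", bytes[i]];
    }
    return hex;
}

@end
[/cc]

Compile that in, and every NSData in the app responds to [cci]hexString[/cci].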

One tip: if you're adding categories in a static library, you'll need to pass the [cci]-all_load[/cci] flag to the linker. If you don't, the categories won't be visible to the calling application and things won't go well for you. Normally the linker discards object files that it thinks aren't being used, and category methods don't create the symbol references it's looking for, so it gets this wrong for categories; [cci]-all_load[/cci] forces the linker to load everything, which keeps the category methods.
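In the app's Other Linker Flags build setting (or an xcconfig file), that looks something like:

[cc]
OTHER_LDFLAGS = -all_load
[/cc]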

Siri on iPad and iPhone 4

November 23rd, 2011

There's a question that keeps being asked: why isn't Siri available on other iOS 5 devices?

Answers have varied, but a common one seems to be that it's a staged rollout: turning Siri on for everyone with an iOS 5 compatible device would have put much greater load on the servers, so, the theory goes, Apple used the hardware as a gatekeeper. Not a bad theory.

Another theory is simply that they wanted to give people a reason to upgrade. I don't give this theory as much credit, since I don't think that's how Apple works. Apple has been providing new features to existing phones since the first iPhone updates. Apple wants to keep you happy with your iPhone, so you keep buying apps, and when the time comes, make your next phone an iPhone as well.

My theory is a bit different.

I think Apple may not be making Siri available to customers of older phones because they didn't pay for it.

Revenue recognition rules are a funny thing. They can prevent companies from giving new features away on existing products, because the users of those products didn't pay for them. Remember when Apple had to charge a small fee for a WiFi driver update that added 802.11n to MacBook Pros that had the hardware but didn't ship with drivers for it? They didn't do that because they wanted to; they did it because the accounting rules forced them to.

Siri is "in beta" now, and I think the reason it's labelled as such is because it sets the expectation that the feature isn't done; that there will be updates to it that are included in the price you're paying for the hardware, even though those features aren't done yet. It makes it easier for Apple to add features without getting into accounting trouble.

So, could Apple make Siri available for existing devices, the way it did with that network driver? Yes, it could. And I'm guessing that once the initial rush on the servers is done, and once the software is a little farther along, they may do just that.

Would you pay $4.99 to add Siri to your existing phone?

Combining Videos in QuickTime on the Mac

November 22nd, 2011

A quick QuickTime tip I just discovered.

Blogging from Siri

November 19th, 2011

I'm sitting in the car at the Canada/US border, blogging into my phone, using Siri.

There's a 30 to 60 minute delay in crossing the border, so I have some time to play.

So far I've dictated this entire post to Siri, without having to correct anything.

Cool.

MealPlan for the PlayBook

November 9th, 2011

I like writing software to scratch a personal itch, and answering the "what's for dinner" question has always been something I figured computers could help with.

I had a PlayBook, development tools, and a desire for an app that would make it easy to quickly put together a plan for the meals for the week. Not the sort of in-depth grocery list tracking app that most of these turn into, but something that makes it easy to say "we're having burgers on Tuesday".

What I created is an app called MealPlan:

Screenshot

The UI is simple and easy to understand. The ratings have generally been pretty good, with most of the negative commenters asking for more features (which IMHO is a pretty good place to be). Now that I have time available, I'm going to work on a bug fix release.

I also recently set a price on the app: $1.99. It had been free because I didn't want any conflict with my work at Adobe, but now that I'm no longer with Adobe, I can charge for it. The free ride is over. :)

If you've got a PlayBook, and ever have a tough time with the question "What's for dinner?", have a look at MealPlan.

Plantronics Voyager Pro+ vs Pro HD

November 4th, 2011

I had a Plantronics Voyager Pro+ headset that I rather liked. But I lost it. :(

So when I went shopping for a replacement, I decided to see what else was out there. I tried out two other headsets, the Jawbone ERA and Jawbone ICON, but in the end, returned them.

I mostly use a bluetooth headset for listening to podcasts and audiobooks.

The Jawbone products were nice enough: sound quality was okay and phone calls were good. But while all of these headsets support Bluetooth A2DP audio streaming, the Jawbones don't support the AVRCP profile needed to play and pause media. This means that if you're listening to streaming audio and someone comes up and wants to talk to you, you have to fumble your phone out of your pocket and pause it. That's no good.

Newer Jawbone products, like the Jawbone ICON HD, do support AVRCP, but I didn't have one of those available to test, and I knew I'd liked my Voyager Pro+.

So I went with the Plantronics Voyager Pro HD.

It's a bigger headset, with a weighty ear hook.

There are lots of reviews of the Voyager Pro+ out there, and the Pro+ and Pro HD are identical in many respects, so I'll just talk about the differences here.

The main difference is the sensor that tells the headset when it's touching your face. This is used to prevent it from making calls while it's in your pocket, and to let you switch calls between the phone and the headset just by putting the headset on or taking it off, without pressing any buttons.

That's a nice feature, but even more important for me is that they made the volume buttons easier to press! The previous design put the volume buttons at a bit of an angle, with no surface directly across from them where you could put your thumb to get leverage. Since the way you pause and resume media is by holding down both volume buttons for a second, this was something I did fairly often, and it was just awkward. With the Pro HD, it's easy.

One thing I've noticed, with both the Pro+ and the Pro HD, is that the signal sometimes cuts out when the phone is in my opposite pocket and I'm outside. This doesn't seem to happen in the house, and I think I've narrowed it down to interference from the iPhone's WiFi signal: turn off WiFi and the Bluetooth seems to be better. This may be an iPhone-specific issue; I'm not sure.

Speaking of the iPhone, how do the Pro HD and Siri get along? Reasonably well. I find voice recognition works a bit better when I hold the phone itself up near my mouth while I'm speaking, compared to speaking into the headset, but in a quiet area both work well enough. You hold down the button on the side of the headset for a second to initiate voice dialling, and the headset beeps once. Then it connects to the phone, which beeps again with a different kind of beep, and then you have to wait for the Siri beep. Fortunately, Siri's beep is recognisable, but you do need to learn to wait for the third beep before speaking.

Thoughts on Steve Jobs

November 4th, 2011

I finished reading Jobs’s bio this morning. It’s a fascinating read, and I recommend it to anyone who knows who he is.

This post will be mostly spoiler-free; I’m not going to talk about specific things from the bio, but rather about his management style and about Apple. But it will probably make more sense if you’ve read the book.

I’m not sure it’s a great biography. I don’t feel like I really got to know Steve Jobs the way you typically do in a good biography. Some of that is, I’m sure, because the book is big enough already and there’s not much in there that you could leave out. But it’s also probably because it was hard to really get to know him.

Jobs was strong-willed, had a serious perfectionist streak, and could be very charismatic and influential. He had a strong desire for things to be well-designed, and as simple as they could possibly be. This showed in every product he was involved in.

But, and here we go into my thoughts, he was only able to achieve these things because of the opportunity that Steve Wozniak gave him.

Jobs was, as the book (and even Jobs himself) repeatedly asserts, an asshole. I doubt he’d do well in a job interview, or with people who didn’t already respect him for his past accomplishments. But he proved, with the Apple I and II, that he could build and ship products.

Shipping great products is different from designing great products. Woz came up with the hardware for the early Apple computers, but he wanted to give the designs away. It was Jobs who valued the work they’d done and had the vision for building a company around it. It was Jobs who obsessed over the marketing and presentation, but it was also Jobs who set requirements on what the hardware had to do, what it should look like, and other facets that fed back into engineering. He wasn’t just a sales guy.

Jobs needed a good team behind him to ship the products that he did.

He had to build the team himself, and to do that, he either needed to already be rich and able to buy a great team (Pixar), or he needed someone like Woz to build the product and then partner with him. Their early meeting and collaboration is what facilitated the creation of Apple and gave Jobs the team he needed to build the products he did.

It comes up frequently in the book how Jobs had to wrestle with teams to get them to do what he wanted. He needed something to be a certain way, for reasons that mattered deeply to him but that others viewed as unimportant. Through sheer force of will he got things done the way they needed to get done.

Tim Cook is a brilliant operations guy. He knows how to make things that work, ship them on time, and ship millions of them. But, as even Jobs admits, he’s not a product guy.

Apple is a machine that can build almost anything. Ideas either come from the top, or bubble up to the top, where the vision is defined. Tim Cook and the rest of the team can execute that vision, but now they don’t (seem to) have anyone to champion and nurture and refine that vision.

Jony Ive would probably jump up and down here, and rightly so. But I don’t think you want an artist driving the vision either. That’s when you get products like the iPod Hi-Fi: beautiful, but not what people want.

Jobs and Wozniak were the perfect starting combination. Jobs with Tim Cook running the show was a scaled-up version of that: an unstoppable innovation machine.

I wanted to speculate here about what the prospects for Apple without Jobs are, but that's hard to do. I'm sure there are a few years' worth of grand ideas in the pipeline, but as Jobs showed, it's about the details. Jobs built a company that can make great products, but for as long as he could, he was deeply involved in the tiniest details.

But here's an encouraging thought. Siri, the voice assistant in the iPhone 4S, was done, from the sounds of it, mostly without Jobs's involvement. Siri represents some of the most interesting UI innovation on the iPhone in the last few years. Completely new stuff, and in my opinion beautifully designed, with excellent attention to detail.

The fact that this was done without Jobs gives me hope that Jobs really has created a company that can build the products he might have built, without him. Let's hope so.

Clearing Out AIR’s Encrypted Local Store

November 2nd, 2011

When developing an AIR app, it's common to want to start it clean, as if there were a new user launching the app for the first time. Flash Builder even supports this, with a checkbox in the Debug Configuration dialog for "Clear application data on each launch".

This clears out the local storage for the application, but doesn't clear out the encrypted local storage, or ELS. ELS is stored separately, and there's no way to clear this from within Flash Builder.

To clear this out manually, you'll need to remove the folder that contains the encrypted store. On the Mac, this is:

[cc]
~/Library/Application Support/Adobe/AIR/ELS/my.application.id
[/cc]

And on Windows, it's:

[cc]
C:\Users\username\AppData\Roaming\Adobe\AIR\ELS\my.application.id
[/cc]

Remove this folder, check the "Clear application data on each launch" checkbox, and the only data remaining will be data that your application explicitly writes elsewhere on the file system.
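On the Mac, for example, deleting it from Terminal looks something like this (using the same placeholder application ID as above):

[cc]
rm -rf ~/Library/Application\ Support/Adobe/AIR/ELS/my.application.id
[/cc]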

Insteon Power Monitoring Source Release

November 1st, 2011

This is a fairly simple program that was intended to be the start of something bigger.

In its current form, it's an Insteon PLM (Power Line Modem) protocol client. It uses a TCP connection to an Insteon SmartLinc (2412N) to send commands over the power line / wireless mesh to an Insteon iMeter Solo (2423A1), and retrieves the current power consumption, in watts, of whatever device is connected to the iMeter. The app then displays that wattage as a huge number on the screen of whatever is running it.

It's a Flex Mobile app, and the intention was to build this for Android and iOS as an app that would let you monitor power consumption from anywhere in your house (or beyond, depending on your network configuration).

But, as I discovered once I acquired a few other Insteon devices, my house is not Insteon compatible. Insteon requires specific wiring, with line, load, neutral and ground all available at the switch, and my house isn't wired that way. (Our last house, which was built new, was wired for Insteon because I had that written into the plans).

Since I'm not building out an Insteon network at home, I'm not going to pursue the Insteon software I was building. So I'm releasing the code, as-is, for others to hopefully use as a starting point.

I've posted the app on GitHub. You can get it here:

https://github.com/eyepaq/insteon-as3

Here's a screenshot of what it would look like if you managed to download, build and run it, and you had the required hardware on your network:

Screenshot

The device I have plugged in (my Roomba charger) is currently using 7 watts.