Armen Chakmakjian


iOS 9 beta (well by this point 9.1)

In Random on September 12, 2015 at 3:25 am

So I’ve been playing with the iOS 9 beta on my iPhone 5s and my iPad Air (gen 1). Today I got the update for the 9.1 beta and I am relatively impressed. I’ve also updated my MacBook Air (gen 2) and my Mac Mini (last year’s model) to El Capitan (first the beta and then the GM version). Initially I’m going to stick with the iOS updates and only mention the OS X updates where they intersect. The enhancements listed on the Apple site are pretty accurate, and I’d like to go through some of the major ones that I’ve experimented with and give my opinions.

I’d like to start with some of the apps. First, the Notes app: I think Apple nailed it. Apple did what Evernote failed to do with Penultimate and integrated sketching, drawing, pictures and typed notes all in one. It is funny that on almost the same day I got a note from Paper by 53 telling me about all the cool things I can do with my Pencil by 53 or my fingers. Notes allows me to create simple yet multimedia notes (sketches, handwritten notes, text and pictures) all at once. It even has a handy-dandy on-screen ruler to use. It doesn’t yet have all the features Paper by 53 has for auto-shaping your sketches into squares, circles and flow charts, but it’s better integrated with typed notes than Penultimate and Evernote are with each other.

I’ve also been a Livescribe fan until recently, and for a short period of time the clunky integration of Livescribe Sky into Evernote was marginally more useful than going to the Echo desktop app, since it allowed me to reach my handwritten notes on Evernote on my iPad. But in that Rube Goldberg-esque way, none of these cool things (Livescribe, Penultimate, Paper, Pencil) was integrated with the others except to be stored as records in Evernote. I think that with a few more features, Notes will be a category killer for those who live in the Apple universe, especially considering the El Capitan integration for Notes. The showstopper should have been that if you have a non-Apple machine in front of you, you’re out of luck getting to Notes…but iCloud.com now has a beta Notes app showing you all your notes, so you are never far from your content.

The new News app is quite good and is another category killer. On my iPad I can get to a lot of interesting news, free for the moment and somewhat curated, so that I can scan several viewpoints on the same story. The only missing item is Wall Street Journal articles. For those I still have my couple-hundred-dollars-a-year subscription (I’ve been subscribing since the Palm Pilot III days…so it’s been a while). In fact, for a while I thought that having the WSJ rendered like a magazine in its own app was pretty nice…but lately I find the website is just better for overall accessibility. News is what the WSJ app should be. The WSJ will be assimilated…resistance is futile.

And now, having iCloud Drive on my iPad and finally having Preview to render PDFs there, I have a complete solution. OK, now we have a Dropbox killer. The Dropbox app on the iPad was always weird and clunky anyway. Unfortunately, once you embed things in Dropbox, Evernote and the like, you can’t move everything all at once into the Apple cloud, because not everyone can get to it in a clean way, which is what Dropbox provides. Evernote is somewhere in between Dropbox and iCloud in its shareability.

Anyway, now let’s get to some of the capabilities of iOS itself. Since I have an iPad Air, I can’t do split screen (and I’m not sure it would be very useful at that form factor anyway). However, the Slide Over feature that brings up a second app on the right margin is very cool. Even more cool is the picture-in-picture capability: shrinking a video (from the Videos app or iTunes at the moment) and working in another app while the video continues to play in the lower right corner of the screen. Both of these are very useful for a multitasker like myself.
Lately, I’ve noticed Siri giving me more in-the-moment hints about things without my prompting, sort of like Google Now. This is very cool and, of course, a little creepy. Yesterday I went out to lunch at a local Chipotle. When I got back in my car, my iPhone had an alert on it saying it would take me 6 minutes to get back to work. Yikes!

Until I replace my iPhone 5s next month when my contract is up, I rely on my Apple Watch for Apple Pay. I’ll probably pull my new phone out more often than try to use the watch, because the watch is proving to be awkward (double-tapping it to initiate the payment, instead of doing what my wife’s iPhone 6 does, which is wake up payments when you’re near an NFC terminal). I hope that is fixed in watchOS 2. Also, if it is true that Dunkin’ Donuts and MyPanera rewards will be accessible in Wallet, then Apple will have fixed a glaring hole in the payments process.

Having the ability to drag the cursor to any place on the screen, like a trackpad, instead of hunting and pecking to select something is a feature that has been a long time coming. This is especially true when combined with slide-and-select, instead of first selecting and then trying to drag the line-and-ball anchors around the phrase to be copied or cut.

So there it is. It’s a pretty major update in functionality at the app, interaction and content-accessibility levels. It will be interesting to see what it looks like on the big phones and the iPad Pro.