Devices, devices, devices

Before I start on my own thing, let me first point you to what Brad Frost and Stephanie Rieger have already written on the subject of building a device library:

Without wanting to get into the debate over “sites” and “apps”, I think the distinction has some bearing on how you choose to spend your money in building a device library. Here’s how:

  • Site-like, or document-oriented web projects generally align with a 100% compatibility strategy: you want your content to render acceptably and be at least minimally usable on any web browser.
  • App-like web projects generally align with a partial compatibility strategy: your focus is on providing a set of features to browsers that meet a set of minimum requirements. (Although if your requirement is simply “webkit”, I may have to come and hurt you.)

Other markets are different, but StatCounter’s mobile figures for Europe in February 2012 show that 73% of mobile browsing comes from iOS and Android devices. (Some Android and iOS users may be using Opera Mobile or Mini, but that percentage is small.) Throw in a smattering of BlackBerry 6 and 7 devices, some high-end Nokias, and a few Windows Phones, and you could take this to mean that about 80% of mobile web users are working with a touch-screen device running a relatively modern browser.

If you go with the partial compatibility strategy, your device library should cover this 80%. Testing a feature-heavy site on BlackBerry 5 or Nokia S60 browser is going to drive you mad, and isn’t going to serve your core customer base. Because you’re concentrating on the medium-to-high end of the market you’ll spend more money per device, but you can get away with relatively few of them. I’d suggest a handful of Android and iOS devices, a BlackBerry 7, and a Windows Phone Mango. Make sure that the devices you get cover a variety of screen resolutions (small smartphone, large smartphone, tablet).

If you’re aiming for 100% compatibility, you need to focus your attention on the other 20%, because that’s where you’re going to be spending the vast majority of your time debugging and fixing crazy bugs. Buy as many cheap, old, low-spec, oddball devices as you can lay your hands on. Any time someone says “WTF – you’re browsing the web on that thing?” award yourself special bonus points. Low-spec is good because if your site works well on a slow device, it’s only going to do better on a faster one. Because the 20% represents the long tail, you will need more devices, but they will be cheaper ones, so you’ll probably end up spending just the same amount of money.

Building a device library is not a one-off thing. It’s not like a developer workstation that you can get away with upgrading every 2-3 years. New devices and new OS upgrades are being delivered at a rapid pace. If you can afford it, you should be looking to add new devices to your library at least every six months. Again, if you can afford it, don’t upgrade the OS on old devices. Over time, they will become your legacy devices.

Finally, I consider it important to actually use the devices. Having a spectrum of devices sitting on a shelf in the office while you walk around with an iPhone 4S in your pocket will not get you exposed to the strengths and weaknesses of the different platforms and form factors. For example, if all you use is a touchscreen, you won’t understand what it’s like to browse with just a keyboard. (Try unplugging the mouse from your desktop machine, and see how that works out.)

For reference, here’s the stack of devices I have on hand for testing:

  • Nokia N95. 240×320 screen, S60 browser. The first phone I used for doing mobile web work. Released in 2007; an impressive piece of kit at the time. Unfortunately it has lost the ability to connect to my wifi, so testing with it is a bit of a pain now.
  • Nokia C3-00. 320×240, keyboard, no touch screen. Uses the Nokia Ovi browser by default, but also has Opera Mini built in. Cute little phone.
  • Nokia C5-03. 360×640 resistive touch screen with haptic feedback – it buzzes when you press the screen hard enough for it to register a touch/tap. The screen is small, but high resolution, and looks great. I used to think that it was a bit crap for web use because it only had a numeric keypad, but that was before I figured out that the qwerty keypad only shows up when you hold the phone in landscape mode on its left side, not on its right. Way to go, Nokia. Also, the “Create WLAN connection in offline mode” prompt should have died 5 years ago.
  • Motorola FlipOut. Android 2.1, 320×240 touch screen, and a keyboard. The keyboard doesn’t slide out, it rotates out from behind a corner. When it’s closed, this looks more like a pager than a phone. It’s cute and small, and I really like it.
  • Samsung Galaxy Ace. Android 2.2, 320×480 touch screen. Relatively cheap and versatile, because it can be upgraded to run 2.3 as well. (Buy two while they’re still selling them with 2.2!)
  • Samsung Galaxy Y. Android 2.3, 240×320 touch screen that pretends it’s 320×480. Cheap. Nice-looking little phone, but the touch screen is dreadfully inaccurate.
  • Acer Iconia Tab A100. Android 3.2, 1024×600. Piece. Of. Crap. Poor screen, slow, a capacitive “home” button that makes an accidental touch target no matter which way you hold it, and the battery won’t hold a charge in standby mode for more than 48 hours. The only thing it has going for it is its price. (That, and the fact that it fills the Android 3.x hole in my line-up.)
  • iPhone 3G. iOS 4.2, 320×480. The first iPhone in our household; I inherited it from Abi when she upgraded to a 4 in 2010. Useful as a legacy iOS4 device.
  • iPod touch 4th generation. iOS 4.3, 320×480 (retina). Gorgeous little device. This is my stand-in for an iPhone 4, because it has the same A4 chip and retina display resolution (although the screen is slightly lower-quality than the actual iPhone 4). I’m currently trying to figure out if I want to upgrade it to iOS5 to match the distribution of hardware in the wild, or keep it on iOS4 as a legacy device. It’s a cost consideration.
  • iPad (original). iOS 5, 1024×768. Still great.
  • BlackBerry Curve 8900. BlackBerry OS 5, 480×360 screen, keyboard. BlackBerry 5 is the OS That Will Not Die. BB5 devices are still being sold in volume because they’re cheap, have great keyboards, and people like using them for texting and messaging.
  • LG Optimus 7. Windows Phone 7 Mango, 480×800 touch screen. This was my first Windows Phone, but I don’t think it’ll be my last. I like WP7 as an OS. I like the sleek design, and the live tiles on the home screen. As a phone, the Optimus 7 is just okay. It’s big, heavy, the screen brightness is harsh, and the camera is only so-so. But I still found myself carrying it around a lot as my day-to-day phone.
  • Nintendo 3DS. This is what I mean by “oddball”. It has two screens, one on top of the other. The top screen has an effective resolution of 400×240 (landscape) and the bottom one is 320×240 (landscape). When you load up a page, the page loads into the bottom screen, but you can scroll it upwards. Only half the page is touch-capable (the bottom screen). The browser is a custom webkit build, made by NetFront. It scores 125 on html5test.com. It’s cheaper than most of the other devices in my library, and it plays Zelda. Come on, how can you not love this? The big disappointment is that you can’t embed 3D images in a web page. The top screen has to switch into a separate 3D mode to show 3D images and video.

And here’s what’s on my shopping list for the next round of upgrades:

  • An Android 2.3 device with a larger screen. Android 2.3 currently holds the largest installed base, and a lone 320×240 device just isn’t sufficiently representative. 13 March 2012: target acquired. Motorola Defy+.
  • The new iPad with retina display. These are going to sell by the ton, and people will be using them to browse the web all the time. Apart from wanting one for myself, I can easily see the new iPad on its own accounting for more than 10% of all mobile browsing by the end of the year. 23 March 2012: target acquired. This thing is awesome.
  • A BlackBerry 7. A touchscreen-only or keyboard-only device would be cheaper; a touch+keyboard device would cover both categories, but is much more expensive. Also, not a huge market share. Not sure about this.
  • An Android 4. There aren’t enough Android 4 devices on the market yet for the platform to have significant market share, so I’m not yet worried about it as a testing target. I’m trusting that it’s going to be mostly like Android 3.x, but faster. I might check out the Samsung Galaxy Tab 2 7″ when it appears. Otherwise, I’ll just wait for cheaper devices.

Further reading

Website traffic patterns in the presence of native apps

A serious question for anyone running a website that has a matching native app to go with it: looking at the platform on which the app is available, is the share of website traffic from this platform going up or down (or staying static)?

My suspicion is that the share of mobile web traffic for most platforms is still going up, even in cases where a native app alternative is available.

Please comment, or contact me if you have any data to share!

A rich client tipping point

Francis Hwang talks about reaching a tipping point in web application development:

If you are writing a new web application, you should make it a rich-client application from the start. Your servers should not generate any HTML. You should do all that work in the browser with a JavaScript framework such as Backbone.js or Ember.js, and the server should only talk to the browser via a REST API.

I’m very close to agreeing with this, but I have my own set of caveats.

First of all: mobile. If you’re using an iPhone 4(S), you might not realize that a lot of web browsers on mobile devices are abominably slow. In terms of getting the first page of your app/site up and running on a mobile browser, an HTML page rendered on the server is going to beat a client-side JS application hands down in at least 90% of cases.

After that first page has rendered, the comparison is different. A server-rendered site/app with full-page reloads is (probably) going to have to do more round-tripping, re-parsing and rendering, whereas a client-side app that has bootstrapped itself might only need to load up small chunks of data and refresh parts of the page. But remember that mobile devices are often tight on working memory, and even switching between tabs (let alone switching between apps) can be enough to kick your app out of memory, so that it has to bootstrap again when you switch back.
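One way to soften that re-bootstrap cost is to cache the app’s serialized state when it changes, and restore from the cache instead of re-fetching everything when the app is reloaded. Here’s a minimal sketch; the key name and state shape are illustrative, and `storage` stands in for any object with `getItem`/`setItem` (such as the browser’s `sessionStorage`):

```javascript
// Persist client-side app state so that a re-bootstrap after the OS has
// kicked the tab out of memory can restore from cache instead of hitting
// the network for everything again.
function saveAppState(storage, state) {
  storage.setItem('app-state', JSON.stringify({
    savedAt: Date.now(), // recorded so stale snapshots can be discarded
    state: state
  }));
}

function restoreAppState(storage, maxAgeMs) {
  var raw = storage.getItem('app-state');
  if (!raw) return null; // nothing cached: do a full bootstrap
  var entry = JSON.parse(raw);
  if (Date.now() - entry.savedAt > maxAgeMs) return null; // too stale
  return entry.state;
}
```

On restore, the app should still re-fetch fresh data in the background; the cached snapshot just gets pixels on the screen quickly.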

With IE6 and 7 effectively dead, the desktop doesn’t really have this problem any more. Desktop browsers are now sufficiently fast and capable that the bootstrapping cost is tiny, and from an architectural point of view separating the front-end from the back-end API is the service-oriented way of doing things, which is to say the right way.

(Then we still have to make our client-side apps internally resilient and service-oriented, so that one poorly performing component on the page doesn’t crash the whole app. But. Baby steps.)

It irks me that mobile web development, which in many ways was running ahead of classic desktop web development (advanced CSS3 capabilities, lack of IE6, et al.) is now going to fall behind in this architectural shift. Instead of incapable legacy browsers (IE6,7), we’re stuck with at least a couple of generations of poorly performing devices. The platform that is most suited to “app-like” web content is the one least capable of running the damn things.

Having said that, it’s important to know your audience. In Western Europe, at least, turnover of mobile devices is high, and the shift away from Blackberries and Nokias towards iPhones and Androids is massive. As usual, “it depends.”

Secondly, there’s the small matter of linkability and history management. If there is any part of your application that you want people to jump to directly, either as a bookmark for their own benefit, or as a link to hand out to others, it has to have a URL. Using hash fragments for navigation may be a well-established pattern, but it’s still a hack. So long as you’re using hash fragments, that URL can only be resolved on the client. pushState() and replaceState() can fix this, but we’re still a little while away from these methods being universally available (IE10).
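One way to hedge in the meantime is to keep your router ignorant of which navigation mechanism is in use: normalize hash-fragment URLs and real-path URLs into the same route string, use pushState() where it exists, and fall back to hash fragments elsewhere. A sketch of the normalization step (the `#!` prefix convention and the route shapes are assumptions for illustration, not anyone’s official scheme):

```javascript
// Map both styles of URL onto one canonical route string, so the same
// routing table serves hash-based navigation and pushState navigation.
//   routeFromLocation('/', '#!/items/42')  -> '/items/42'
//   routeFromLocation('/items/42', '')     -> '/items/42'
function routeFromLocation(pathname, hash) {
  if (hash && hash.indexOf('#!') === 0) {
    return hash.slice(2); // '#!/items/42' -> '/items/42'
  }
  return pathname; // pushState-style URL: the path itself is the route
}
```

In a browser you would feed this `location.pathname` and `location.hash`; keeping it a pure function also means the server can resolve the same routes once pushState-style URLs are in play.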

And then, if I’ve been clicking around in the application, what happens when I press the back button? Does it take me to a previous state within the application, or does it take me out of the application altogether? I don’t think that the field of front-end development has answered this question with a consistent set of patterns to meet (and reset) user expectations yet. Far too often I still hear, “er…don’t use the back button because that’ll break everything”.

The back button doesn’t reign as supreme as it used to, though; multi-tab browsers have seen to that. Running an app in a dedicated browser tab, and closing it when you’re finished is a common option now. Adding a web app to your phone’s home screen so that it can run without browser chrome is less common, but on the rise. Still, the back button is likely to still be in play in the vast majority of cases. If you’re going to build an app, make sure that you have a plan to deal with it.

Of course, none of this applies to document-oriented web content. But the distinction between apps and documents is an entirely different discussion.

Apple TV, Airplay, and the future of video

I didn’t get the Apple TV for a long time; neither in the sense of “buy” nor “understand”. Certainly here in the Netherlands, where Apple didn’t start selling video content through the iTunes Store until last year, it seemed entirely pointless. Even now in 2012 the selection of films on offer in the Dutch iTunes store is still anemic at best, and the Apple TV is a dubious proposition — if you think of it as the equivalent of the DVD player that sits below your TV.

But that’s not what it’s for.

My first glimpse of understanding came in November last year, when the keynote at Minecon was being streamed live across the internets. Minecraft 1.0 was finally going to be released, and Alex and Fiona were super excited about it. We plugged my old laptop into our big-screen living room TV, and let the kids stay up late so we could all watch the big moment together. But while we were waiting, we watched a couple of videos on YouTube. And afterwards, we left the laptop plugged in for a few days, and the kids watched some more videos.

Turns out there is a place in our life for watching YouTube on the living room TV instead of just on our computer screens. But also: it reminded me of a long-dormant desire to rip all of my DVDs to hard disk and have them available through some kind of media centre interface. I had played around with using a Mac Mini as a media centre before, but there’s something a bit naff about having a computer with a big whopping external hard disk plugged into your TV. Sure, it works, and it gives you all of the power! of a general purpose computer (web browser! games!); but it also gives you all of the downsides (software updates! manual configuration!). A home-built HTPC may be a source of massive geek cred, but in the end it’s just a hack.

That’s where the Apple TV came in. It’s a perfectly silent, tiny device that plugs into an HDMI socket, and provides a lovely interface for streaming video. It has a built-in YouTube “app”; no web browser required. I can rent and watch movies directly from the iTunes store. And best of all, I can use it to stream videos from a computer acting as a media centre — which means that the bulky NAS can be conveniently tucked away out of sight in my office. So I bought an Apple TV in December, and it’s great. Mark Boulton gives a good overview of a similar setup (including his backup options) in his post Backups, Networks and a Digital Home. I haven’t installed aTV Flash on ours, but I might in the future.

But although this setup is groovy, it’s still not what the Apple TV is for.

All of these features are about pulling content to your TV. It assumes that the TV is the important thing, and everything else is a peripheral designed to serve it. It assumes that the living room is where you have your TV, and that’s where you settle in for the evening to watch your shows. This is the paradigm that other consumer-level devices operate in: DVD players, cable TV set-top boxes, game consoles. They are all static boxes designed to complement your TV, and to maintain its supremacy as the core of your home entertainment setup.

The Apple TV looks like just another content box, but that’s camouflage. It’s actually a secret weapon in disguise. Its purpose is to stand that home entertainment model on its head: to place content at the heart instead of the screen. This is where Airplay comes in. Airplay is all about pushing your content (music, video) to a device (speakers, screen) from wherever you are and whatever you are using, rather than assuming that you are tied to a delivery mechanism (your hifi speakers, your living room TV) and want to pull content towards you.

The “aha!” moment for me came when my parents were visiting us a few weeks ago. We were talking about our trip to see the magician Hans Klok in Carré last summer, and about magic in general. My dad pointed out that Hans Klok’s wine bottles trick is a variation of Tommy Cooper’s classic “bottle, glass” routine. Of course, we fired up the Apple TV and went straight to YouTube to find a clip of it. Cool. My iPad was around as well, and while we were watching clips on the TV, we were also browsing YouTube on the iPad and looking up our other favourite acts. And then, rather than pulling up the next clip on the Apple TV, I used Airplay to push the video from my iPad up onto the TV.

This was the lightbulb moment.

The iPad is the device I’m using. It could just as easily be an iPhone, or my laptop. The TV happens to be nearby. The TV becomes a temporary extension of the device.

Another example: I have a friend round to visit. They want to show me their holiday photos, or a funny video of their cat. They could show me on the screen of their iPhone, or they could just push it up onto the big screen.

Or: I go round to a friend’s house. I have a bunch of music on my iPhone. They have an Airport Express plugged into their hifi, and — boom — I can push my favourite song through their speakers rather than having to plug my phone in to the stereo. If they have an Apple TV box attached to their TV, we can watch any movie I have with me, without futzing about with HDMI cables and adapters, or cursing myself for forgetting to bring the DVD.

As more music and video gets stored in the cloud, this becomes an even more low-friction scenario. I won’t even have to worry about putting music or video onto my iPad to take with me wherever I go, because it’ll be accessible from anywhere with a wireless signal. As flexible screen technology develops, more and more ordinary surfaces will be transformed into displays. Maybe I’ll be able to walk into a café with my iPod, and play a movie onto my table. Maybe I’ll be able to play funny pictures onto your T-shirt.

You know, in the future, when Captain Jean-Luc Picard says “On screen!”, and some random ensign pushes the video from their console up onto the main viewing screen? Airplay.

This is also why I think Apple is not about to produce an actual consumer TV. (Although I may be proved wrong on this very soon.) With Airplay, they have relegated the TV to the status of a peripheral. A very expensive peripheral to be sure, but as Fred Wilson has pointed out, cheap things will be smart, while expensive things will be dumb. Apple likes smart things that you will upgrade every couple of years. A small, cheap Apple TV device that makes any big screen TV smart fits in that category; a 42-inch “smart” TV that will be obsolete in two years, but is too expensive to replace in that timeframe, doesn’t. And any kind of Apple-branded TV definitely would be a high-end, premium, and expensive device.

Apple used to make speakers: remember the iPod Hi-Fi? But they got out of that business. TVs and speakers don’t matter any more. They’re just surfaces through which we push our media. Smart, highly personal devices that control the TVs and speakers — that’s where the real value lies.

How to turn off keyboard sounds on Android 3.2

It’s not under Settings → Sound. Oh no. That would be far too easy.

Go to Settings → Language & input → Configure input methods → Settings, and uncheck the Sound on keypress checkbox.

Someone was thinking: “You paid for these sounds. Why on earth would you want to turn them off?”

Whoops!

I read John Lanchester’s book Whoops! Why everyone owes everyone and no one can pay the other week. It’s another look at the financial crisis: what exactly the mess is all about, how we got there, and what is being done about it. It’s clear, insightful, and (in places) very funny.

Right in the very first chapter, he makes a point that is going to stick with me: with the fall of the Berlin Wall and the collapse of Soviet communism, we lost a global counterweight to (certain forms of) greed, corruption, and injustice. The current absurd levels of social inequality in the Anglo-American world can be seen as a result of this. The camp of “greed is good” had won by default, and took a sociopathic twenty-year victory lap:

    […] the population of the west benefited from the existence, the policies, and the example of the socialist bloc. For decades there was the equivalent of an ideological beauty contest between the capitalist west and the communist east, both of them vying to look as if they offered their citizens the better, fairer way of life. The result in the east was oppression; the result in the west was free schooling, universal healthcare, weeks of paid holiday and a consistent, across-the-board rise in opportunities and rights. […]

    And then the good guys won, the beauty contest came to an end and so did the decades of western progress in relation to equality and individual rights. In the USA, the median income — the number bang in the middle of the earnings curve — has for workers stayed effectively unchanged since the 1970s, while inequality of income between the top and the bottom has risen sharply. […]

    Here’s a way of thinking about the change since the fall of the Wall. One of the most vivid consequences was the abolition of the ban on torture which had previously been a central characteristic of the democratic world’s self-definition. Previously, when the west did bad things, it chose to deny having done them or to do them under the cover of darkness, or to have proxies do them on its behalf. Corrupt regimes linked to the west might commit crimes such as torture and imprisonment without due process, but when the crimes came to light, the relevant governments did everything they could to deny and cover up the charges — the crimes were considered to be shameful things. With the end of the ideological beauty contest, that changed. Consider the issue of waterboarding. At the Nuremberg tribunals it was an indictable offence: a Japanese officer, Yukio Osano, was sentenced to fifteen years’ hard labour for waterboarding a US civilian. During the Vietnam war, US forces would occasionally use waterboarding — but when they were found out, there was a scandal. In January 1968 the Washington Post ran a photograph of an American soldier waterboarding a North Vietnamese captive: there was uproar and he was court-martialled. With the end of the Cold War and the beginning of the ‘war on terror’, waterboarding became an explicitly endorsed tool of US security. (And of British security too, by extension.) At a time when the democratic world was preoccupied by demonstrating its moral superiority to the communist bloc, that would never have happened.

    The same goes for the way in which the financial sector was allowed to run out of control. […] [I]t was the first moment when capitalism was unthreatened as the world’s dominant political-economic system. Under those circumstances, it could have been predicted that the financial sector, which presides over the operation of capitalism, should be in a position to begin rewarding itself with a disproportionate piece of the economic pie. There was no global antagonist to point at and jeer at the rise in the number and size of the fat cats; there was no embarrassment about allowing the rich to get so much richer so very quickly. […]

I hope that movements like Occupy can bring about change from within.