The textile industry is squandering an opportunity. Despite accounting for 8% of manufactured goods sales around the world, it’s managed to stay on the sidelines of our mind share ever since ire over sweatshops boiled over in the 1990s. Nowadays it’s software designers undertaking the bulk of the PR work for textiles, as skeuomorphism finally impresses upon an otherwise fabric-oblivious generation the nuances of linen, felt, faux leather, and whatever other basic textiles make up your shirt’s blend. Blame my cynicism, but I’m shocked Cotton or DuPont hasn’t seized the moment and begun demanding their logos mar every wallpaper or user interface element on which digitized versions of their products appear. Unfortunately for them, it looks as though the public’s honeymoon with skeuomorphism is already coming to a brandless end.

“The Trend Against Skeuomorphic Textures and Effects in User Interface Design”, the latest in a long list of attempts at explaining this particular eventide, stands out thanks to John Gruber’s uncanny ability to summon a history of events wholly disconnected from reality. His essay, like most magic, begins with a benign observation: there’s a trend forming among top tier1 iOS developers steering away from the skeuomorphic design language of the platform. Trying to figure out why, Gruber cites Letterpress, Instapaper, and Twitterrific 5 as case studies (other good examples: Realmac Software’s Clear, Flipboard, and Simplebot’s Rise), endorsing Dave Wiskus’s false rationale that the examples supra cement iOS’s legacy as the birthplace of leading-edge, non-skeuomorphic design. Things immediately start to fall apart.

*Proper usage of the word skeuomorphism is contentious enough to warrant its own article, so I’ll address it here to avoid issues later on. Most of the ire is concentrated around its misapplication to designs which aren’t by definition skeuomorphic at all. I prefer deferring to the experts: Christopher Downer provides a good introductory overview that delineates the apples and oranges. In contrast, Chris Baraniuk’s position is polemic, calling into question the entire use of the word in relation to UI design and—not content to stop there—wondering whether or not the Wikipedia definition is more or less entirely rubbish. Louie Mantia also provides some needed Mythbusting on the issue. While I tend to agree with each of their arguments, I still can’t get on board with their prescriptivist position. Doing so would be ignoring how the word has transcended the boundaries of its old meaning and become a catchall term for a larger body of people using and adapting a definition that’s more popular in everyday use. Much the same way minimalism is flung around with little regard for definitions, we can use skeuomorphism as a genre word that, though perhaps frequently misapplied, is apt enough in practice for everyone to distinguish between a skeuomorphic-ish design and one that isn’t. And it’ll be used as such here.

From the start, both men’s design myopia refuses to acknowledge that non-skeuomorphic design has existed elsewhere prior to 2012, whether as the preeminent aesthetic of Windows Phone 7, Microsoft’s mobile operating system2, or through the clean lines and sci-fi sterility of Android’s not-completely-flat-yet-not-stuffed-with-chrome UI. The sidestepping of any outside influence is meant as misdirection, a reshaping of events that encourages the idea that iOS designers live in a vacuum controlled by the whims of Apple. My guess is that Gruber thinks he can get away with this fallacy since Windows Phone sales have been tepid at best and that the stock Android UI is almost always redecorated by whoever’s supplying the hardware. Except popularity isn’t a necessary condition of influence. Any competent accounting of flat UI design shouldn’t, and wouldn’t, ignore the contributions of Microsoft, Google, or even Palm, no matter how disappointing their sales records.3 Having declared iOS as the epicenter of this new trend, an iota of sleight is all that’s needed for Gruber to switch Apple’s position from beneficiary to benefactor.

Gruber has chosen Apple’s Retina display to be the hero of his story, declaring it a singular breakthrough absolving designers from employing “the textures, the shadows, the subtle (and sometimes unsubtle) 3D effects” of skeuomorphs that were “accommodat[ing] for [the] crude pixels” of non-Retina quality displays. His thought process involves comparing the influence of high resolution displays on UI design to the influence—in this case real and documented—they’ve had on digital type design. Quick recap: Retina-caliber displays are behind the viability of print-hinted fonts rendered digitally, which had hitherto looked insulting on the sub-par resolution of non-Retina displays. They’ve also had the reverse effect on screen-optimized fonts by suddenly making them appear vulgar, ridding them of their purpose. Gruber equates the trimmings of skeuomorphic design to stopgap fonts like Georgia and Verdana4: poor solutions used for a lack of better options, given that the “hallmarks of modern UI graphic design style are (almost) never used in good print graphic design”. Therefore, we ought to be thanking Apple for granting designers the opportunity to produce “graphic design that truly looks good” on our devices.

There’s no evidence I can find—and suspect will ever find—to defend the claim that skeuomorphic textures and effects are scapegoats for the inefficacies of lower quality displays. Gruber so heavily leans on his comparison to screen fonts he starts to redefine the term, implicitly suggesting that skeuomorphism is equivalent to poor design taste. If you’ve made it this far then you know how spurious the whole idea is. Even Dave Wiskus’s 100-level explanation is enough for anyone to articulate the relationship between a skeuomorph’s purpose and a heavily textured material surface. Neither is there any reason to believe that skeuomorphic design is now defunct thanks to Retina displays, given that (a) we know a skeuomorph’s primary function isn’t to cover for crude pixels; (b) contrary to Gruber’s subjective analysis that all drop shadows and glassy surfaces look worse on them, Retina-caliber displays allow for even more detailed and striking effects, making already beautiful apps using skeuomorphic elements all the more stunning; and (c) even if we cede the last two points, questions abound on why, since the release of the Retina-bearing iPhone 4 in June 2010, Apple has all but ignored the apparent Retina-resolution design era and pushed towards heavier and heavier use of so-called parlor tricks on both iOS and Mac OS, or why so few third party developers have moved away from the skeuomorphic model. His entire essay is being driven in a car without a rear-view mirror, aces rushing out of its driver’s sleeves.

Most of the sensible explanations put forth in “The Trend Against Skeuomorphic Textures and Effects in User Interface Design”—that skeuomorphic elements are overused, how Retina-caliber displays can influence UI design—are perverted by the misconception that print design and UI design are one and the same.5 They’re not. Where print design is concerned with aesthetic cues and organization of information that’s conveyed subconsciously to the reader (e.g., the way the eye moves between two paragraphs and understands new ideas are being introduced, or how text size imparts hierarchy), UI design’s cues are dynamic and explicit. They must convey function, respond to input, morph, adapt, and tangibly interact with the user. The set of skills required for one doesn’t come close to the set needed for the other. When Gruber tells us that “[the] hallmarks of modern UI graphic design style are (almost) never used in good print graphic design”, he’s right for all the wrong reasons. The differences don’t even matter. What he’s trying to demonstrate is how UI design is undergoing the same crippling transitional phase print design—specifically as it concerned fonts—had to endure with the introduction of digital displays. His account of digital type’s hobbled history, right down to its rescue by high-resolution displays, is spot on. Yet the paths between the two arts don’t run parallel; software’s only ever been digital. Where’s the analog6 (or digital) counterpart we compare it to and say “We could do so much more if only we weren’t stuck designing this software on a screen”? As displays march on towards human-grade optics, of course designers’ options have improved, but there isn’t some past UI standard they’re trying to return to. Progress here is strictly forward. Nothing forced skeuomorphism on us.

The upshot to this mess is that Gruber’s initial question is actually worth considering. It never once occurs to him, however, that the answer needn’t be as convoluted as he makes it.

In his own words: “There is a shift going on, fashion-wise”.

Designers. Users. No one is immune to the fatigue brought on by overexposure. The numbers themselves are staggering. 700,000 apps downloaded 35,000,000,000 times. Even accounting for the large number of games making up that total, the prominence of skeuomorphic design is inescapable. We’ve refined, developed, added to, twisted, and debased the style down to a chintzy polish.7 Why doesn’t Gruber wonder whether we’ve simply tired of seeing yet another faux-textile background mimic a pair of pants no one would dare buy in the real world?

The analogies to fashion are easy to latch onto because they help make the distinction between aesthetics and function, something Gruber understands and has leaned on previously when describing user interfaces as “clothing for the mind”8. The premise is simple: No matter the amount of “stylistic tweaking”, UIs—or clothes—remain true to their form. So long as it remains able to divide the bill at the end of lunch (form), your calculator app can resemble whatever model Braun calculator it wants (stylistic tweak). The couture comparisons might be heavy-handed, but they’re a good starting point from which to find better reasons why we’re moving towards flat user interfaces. For example, it could be that designers are realizing there’s a whole new generation of people for whom the cues of skeuomorphic design aren’t referential, but merely aesthetic.9 What’s the point of mimicking a Braun TG 60 reel-to-reel deck to millions of kids and young adults who will never lay eyes on—never mind use—an actual physical tape recorder in their lives?10 Why stick by a design that’s losing its raison d’être? (Ed. note: an update to the Podcasts app on 21-03-2013 got rid of the tape deck simulacrum.) We might also consider whether skeuomorphic design is even fit for the UIs of modern computing anymore. As we increasingly interface by way of gestures, voice commands, and inputs disconnected from physical analogs, are digital knobs and textures the most efficient or practical solution? Asking these sorts of questions—not wondering what’s changed since Apple released a new iPhone—is how we begin noticing the influence of an entire mobile industry on itself: We can trace the career of Matias Duarte from Palm to Google and see WebOS’s legacy of physicality continuing on Android. It’s why designers at Microsoft can find solace in the fact that designers are apparently taking inspiration from Windows Phone 8’s text-centric, chrome-less aesthetic and adapting it to their software.
Point being, it’s pure fantasy to imagine third party iOS developers leading the charge against embossed text on the basis of a single and insularly engineered cataclysm.11

Skeuomorphism isn’t bad design. Nor is it a fad. A pragmatist might complain it’s no longer ideal in 2013. A pessimist would say we’ve made it kitsch. I suspect John Gruber knows and believes these things. Otherwise his essay is a change of opinion that throws away years of Daring Fireball posts. Then why go to such lengths to find a solution so stretched and un-obvious? My suspicion is that any scenario wherein we acknowledge that fashion-wise something has fallen out of favour inevitably leads to questions about exactly what’s causing the falling out. Fingers want to be pointed and the inconvenient truth here is that skeuomorphism has no bigger an evangelist than Apple.

What goes unmentioned in Gruber’s essay is that most of the gaudy elements he’s reproaching were introduced, if not heavily endorsed and popularized, by Apple.12 iOS’s contribution was to dial the exposure knob to 11 by attracting thousands of eager developers to its ready-made developer tools favouring conformity and uniformity across the entire platform. The formula’s proved so successful that the entire UI language of specific classes of apps has been codified, standardized, and left customizable only at the level of “Which texture or drop shadow angle should we use here?”. Hence the excess.

There’s little satisfaction in getting this far only to have me pin this on one writer blindly marching his party line. While there’s no doubt Gruber’s overthought the situation so Apple can walk away unscathed, what I want to try and coax into sight are the actual consequences at play in this debate. Blaming Apple for abrading our tolerance of skeuomorphism isn’t as worrisome as the idea that it might have no intention of stopping. Hardware aside, there’s enough evidence to suggest that Apple’s institutionalized its taste for the playful, safe, non-threatening, and innocent genre of software espoused by iOS. You’ll notice small doses of it in places like the App Store, where categories and catalogs are given their own tacky icons filled with garish fonts and unimaginative emblems: a golden plaque background for its hall of fame category, an assortment of balls to decorate its sports section. Where it’s most apparent is in their now celebrity-laden, heartstring-tugging commercials, the charms of which have less to do with Apple’s clever wit and genuine passion than applying its fastidious work ethic to clichés we’ve seen elsewhere in advertising. There’s a shift occurring at Apple about who it considers its core audience to be, a shift that consequently reverberates across its product design, i.e., why it continues to be attached to skeuomorphism.

* Marketing is often the simplest way to see who a company cares about, how it perceives its audience, and how it cares to be portrayed. The best way to illustrate this particular shift—without rewinding too far—is by drawing a line somewhere around the launch of the iPhone 4 and comparing Apple’s advertising efforts before and after. The biggest visible change is the introduction of the decidedly cinematic and ostentatious suburban lifestyle vignettes exemplified by the Sam Mendes-directed FaceTime videos, as well as almost the entire run of Siri spots, and the short-lived Apple Genius series. They’re evidence of a company shedding its aura of pretentious coolness in favour of innocuous inclusiveness. Even going as far back as the Jeff Goldblum narrated iMac G3 commercials, Apple’s marketing pre-iPhone 4 was often about differentiating its values: Apple’s, and everyone else’s. The Manchurian-like effect on consumers meant—besides exemplifying TBWA\Chiat\Day’s own genius—that owning something California designed was a token of membership. While nothing prevented anyone from enjoying those iPod Silhouette dance videos or the charms of the Get a Mac series, those ads nonetheless introduced dividing lines. If you didn’t own an iPod, didn’t recognize the catchy music (remember when Apple abandoned the opaque dancers and up-and-coming hipster bands in favour of unmistakable U2 and Coldplay mini-music videos?), owned a PC because you honestly couldn’t tell the difference, or weren’t savvy enough to make out all the references in the classic “Hello” iPhone Oscars spot, you couldn’t help but notice how different you were from those people who did own Apple products, a realization laced with all the consumerist impulses we like to pretend we’re immune to. Today, with so many iPhones and iPads in the hands of people who decidedly don’t care to fit that particular brand image, the old approach becomes alienating.
Thus the current marketing—because Apple’s demographics run such a broad spectrum—goes out of its way to avoid any delineation, aiming to associate the brand with a wholesome, family values, American Dream lifestyle that almost anyone can relate or aspire to in some way.

Apple’s cutting-edge innovations are both blessing and curse. As responsible as they are for the massive success and ubiquity of Apple within the pockets of a large portion of the developed world, they’re also responsible for populating its base with customers for whom cutting-edge technologies have little appeal, traction, or even desirability. Today’s average Apple enthusiast is less likely to care about trends in UI design than they are about whether their current iPhone’s case will fit the next one. The kicker is that it’s proof of Apple’s shrewd business acumen: the skeuomorphic designs introduced in iOS back in 2007 were central to overthrowing the crude and unapproachable UIs powering devices preceding the iPhone and transforming the smartphone into something desirable to people outside office buildings. In hindsight it’s easy to explain why Apple had a hit on its hands. Today however, the huge heterogeneous market Apple managed to attract to iOS is also the huge, heterogeneous, and sensitive-to-change market which expects its median to be catered to. Dealing with expectations of this magnitude is a new world for the company, one which it may not be comfortable operating in.13 Even assuming it remains a best of breed consumer electronics company well into the future, the attrition caused by the demands of a ubiquitous user base means it’ll be increasingly hard for Apple to remain at the leading edge of the industry, at least UI-wise, without running the risk of estranging that base. While it won’t prevent them from innovating on hardware and technologies, it could force them into tempering their software breakthroughs in aspects they otherwise wouldn’t have if the target market still resembled what it was in 2007. Multi-touch gestures are a good example. Despite possessing the most advanced touch display technology in the industry, gestures remain woefully underplayed in the core iOS interface.
Four- and five-fingered iOS navigation only became available to the public with iOS 5, and its use—turned off by default—is limited to the iPad. There’s also no reason why some of those same gestures couldn’t work on smaller iPhone-sized devices with one- or two-fingered substitutes. Yet their absence is conspicuous. Six years in, the gist of working one’s way through iOS remains tapping buttons over and over again. Even prominent third-party innovations like “pull to refresh”, which, thanks to its popularity in third-party apps, could routinely be mistaken as a core element of iOS’s interface, have only been timidly adopted by Apple, if at all. This underlines why the charge away from skeuomorphism is being led by third party developers, and not Apple as Gruber suggests. Third party developers aren’t beholden to the median of iOS users. They can find success in narrow audiences. They can take more risks UI-wise, acting as outliers with aspirations of becoming the trendsetters for next year’s UI fashion trends. It’s a can’t-lose scenario for Apple: at a minimum there are enough apps to please anyone’s tastes, and if any of these Flat UI projects happen to take off at scale, e.g., Google Maps, certain elements of the native Facebook app, or pull to refresh, Apple benefits by osmosis.

There’s a hitch of course. Nothing explained, debated, or corrected supra applies to any industrial design related activity Apple’s been involved with over the last 13 years. No one would contest that every desktop, notebook, or mobile device bearing its logo has at one time represented the absolute bleeding edge of its field, achievements superseded only by their successors. There’s no denying how relentless Jony Ive14 and his team have been at pushing the boundaries of what a computer device ought to be, what it ought to look like, and what it ought to be made of. Theirs is a unique focus that, mixed with a healthy disregard for whatever customers might want or expect (floppy disks, DVD drives, removable batteries, whatever I/O ports the iPad doesn’t have, and bigger or smaller iPhones depending on the rumours circulating the day you’re reading this), is almost enough to vindicate Apple’s overabundant affection for superlatives when describing its products. But hardware designers enjoy some privileges the software guys don’t. The big one concerns how being at the leading edge of electronic industrial design—as it seems only Apple has realized—actually aligns with the goals of the art. However striking its design, hardware’s ultimate goal is to disappear into the user’s unconscious: Lighter so as to not fatigue the hand, smaller so it can fit into any bag. Faster, longer lasting, higher resolution-ed. Whatever means necessary to prevent it from impeding the user’s experience.15 So long as the result doesn’t wildly diverge from the norm (say, twenty-seven inch convertible desktop tablets or buttonless iPods), there’s otherwise little consumer attrition constraining the imaginations of industrial designers. Once in use, most of the physical aspects of our computers fade into the unconscious, outshone by the attention its software commands. The burden for the software guys lives in that differing proportion of attention.
Our relationship with software is so immediate that any atomic change to our literacy of a given UI elicits a larger and longer sustained reaction than any material changes made to our favourite products.16 We’re prone to blame, justly or not, the successes and failures of our computers on software. The feel of brushed aluminum matters more on our screens than in our hands.

Whether tangible or pixelated, fashion remains capitalism’s favourite child. Being able to tap into—or manufacture—the desires of an enormous aggregation of people is SOP for any company hoping to reach the rarefied company of the Apples, Coca-Colas, and McDonald’s of the world, even if the usefulness of their brand images doesn’t make significant contributions past enlarging the guts of the many and the wallets of the few. Yet for UI design, fashion is more than an agent for consumerism: it can solve crucial problems that define how meaningful technologies can be. It’s especially important in mobile computing, where rejection of a long history of desktop UI paradigms has renewed exploration of the ways in which we use computers and what we can accomplish with them.17 What worries me is the possibility that stagnation is penetrating a field that’s still trying to define itself. Even scarier is the possibility that this stagnation germinates from iOS, for the simple reason (personal allegiances aside) that Apple has up to now been the only major tech company with any proven track record of saving us from stagnant trends, e.g., command line UIs, floppy drives, physical music, and desktop computing. The dilemma with skeuomorphism is that as a major driving force for iOS’s success, it’s a design strategy that’s hard to argue against, let alone abandon. Therefore whatever new possibilities leading edge UI design is pointing towards, Apple’s role risks becoming reactive instead of proactive. My question then is whether or not—no matter how best of breed their products remain—having Apple so consummately dominate the mobile computing space is what’s best for the industry. I know the question seems rhetorical given the idiom that competition breeds innovation, but try and name any leading edge mobile platforms that have enjoyed success in any way similar to Apple’s: WebOS not just ruined but killed Palm. Windows Phone 8 is eroding what’s left of Nokia.
Windows 8 in general has Microsoft and its OEM partners in a frenzy that proves not all ideas are created equal (again, like twenty-seven inch convertible tablet desktops marketed to moms and kids). Android as a commodity OS for hardware manufacturers has been a bestseller, but it has left the platform disjointed and lacking cohesiveness from one device to another. Android the stock, presented-by-Google, operating system is almost a misnomer given its relative obscurity to the public. The only thing standing between us and the troves of innovations the aforementioned have created is the painful truth that only Apple has a proven track record of being able to popularize them.

If John Gruber can be fooled into thinking Apple remains at the leading edges of UI design, it’s thanks to its third-party developers who’ve inadvertently earned the majority stake in maintaining iOS’s innovative and dazzling pedigree, making themselves iOS’s greatest asset in the process. While Apple is happy to oblige with statistics about the ever enlarging successes of the App Store, little is mentioned about how the ever enlarging clout of the store is shifting the power dynamics of the developer/platform provider relationship. You might describe equilibrium like this: Apple provides a product and platform customers want to buy into, e.g., the iPhone, thereby attracting developers with the promise of an untapped audience. In return developers provide the platform with (sometimes) exclusive software that distinguishes Apple’s platform from others, keeping current customers in the fold and also attracting outsiders that want a seat at the table, e.g., anyone who wanted to use Instagram prior to April 2012.18 This feedback loop is self renewing as long as each player maintains their stride: a new desirable iPhone every year, followed by new apps that take advantage of its new features. Things challenging this balance: On one front, the other platforms are rapidly catching up to, and in some cases surpassing, iOS both software- and hardware-wise, strengthening their own feedback loops. On another, there’s the aforementioned trend away from skeuomorphism that, at least UI-wise, is dulling the appeal of a sticking-to-its-guns iOS and denying developers19 the guidance needed to meet the needs of this new vogue. The latter puts in play a few consequences. If Apple isn’t at least mildly proactive about updating its UI and championing it through its Human Interface Guidelines, then developers are left to act upon their own whims.
This lack of uniformity and convention means that a Retina-resolution era of UI becomes defined as one thing by The Iconfactory and as another by Path, by Simplebot, Marco Arment, Realmac Software, Flipboard, and every other designer attempting to navigate iOS’s future without Apple’s guidance. I’m already frustrated by the number of Twitter clients disagreeing on what actions left-to-right and right-to-left swipes are supposed to invoke. But here’s the bigger worry: Apple’s hardware edge notwithstanding, what if the only incentive to develop for iOS—or to own an iOS device—is the promise of an ecosystem controlled, determined, and made enticing primarily by developers outside Cupertino? How does Apple prevent a mass migration if (when) another platform comes around proving they can foster developers the same way iOS did back in 2008?20 It’s no small feat for the challengers, but we’re fast approaching this reality.21 Developers aren’t just Apple’s biggest asset then, they’re also its biggest liability. For almost six years to pass with Apple demonstrating little interest in updating its UI beyond restrained refinement—beyond what’s necessary to show up with at a yearly keynote event—is either brazen confidence bordering on negligence or a lack of tactical manoeuvrability.

This for me is the real intrigue—the delicate balance between reassuring users and guiding developers—that’s simmering beneath the Skeuo v. Flat debate. Because in 2013 it’s winning the software battles that matters. The challenge for Apple then is whether they can settle on a UI design that’s simple and familiar enough to assuage the large swath of its users who seek nothing else, yet also avant-garde enough to secure its role as the pace-setter of an industry fuelled by innovation. Such a balancing act requires a flummoxing understanding of the power of design and UI’s undisputed role as the nexus of computing today. A particular design decision can not only solve a particular user experience problem, it can also make or break entire corporations while spontaneously introducing new user experience problems we’re not even sure exist yet, begetting new decision solutions, which themselves may or may not solve other unknown user experience problems, introducing who knows what kinds of make or break challenges that will be the death of some companies and the birth of others. On most minds—to say nothing of mine—the entanglement of implications is like boiling water to oatmeal. Imagine if we were talking about anything more than a trend.

1: I’m tempted to substitute “top-tier” for a one-time non-pejorative use of highbrow. The distinction is important because we aren’t dealing with a “this is what all the cool kids are doing” type of trend but a “we’re the kids that were doing this before all the cool kids were” kind of trend, one that isn’t responsible for making something mainstream but rather for influencing other designers whose apps will eventually take it into the mainstream. See: The Devil Wears Prada

2: That Gruber relegates any mention of the Metro aesthetic to 10pt footnotes is pre-emptive of reader riposte at best and negligent at worst.

3: And I’d argue that outside influences of flat design on iOS are too obvious to ignore. Not only thanks to the prevalence of Google’s own apps on iOS, but through the growing popularity of horizontally swiped views that owe a lot to Android and WebOS.

4: No word yet on when Daring Fireball plans to join the retina-resolution era.

5: A mistake on the scale of “print magazines are just as easy on tablets!”

6: Although in a primitive sense we can work our way backwards from our digital user interfaces to the very analog control panels, knobs, levers, keypads, and switches we use to interface with a variety of tools and appliances, which we’ve ⌘c ⌘v into our software. Ergo.

7: Explaining why Gruber’s complaints are often directed at the misapplication—whether by design or laziness—of skeuomorphic elements to UI designs which aren’t skeuomorphic at all, e.g., Find My Friends.

8: Quoted from this Webstock’11 talk. Given Gruber’s apparent knowledge of the subject, it’s all the more suspect that as basic an argument as “style changes” doesn’t warrant the briefest mention in his essay.

9: See: The brooch, overalls, fanny packs. The monocle.

10: Nostalgia perhaps, the kind that lets me defend my love of You’ve Got Mail on its historical merits and memories of my own childhood ePenpals. But let’s be honest about the Apple Podcasts app, and about You’ve Got Mail.

11: And the emergence of flat UI design on iOS proper is still so negligible that it’s hard to go along with a premise that casts Retina displays as the catalyst for designer agency in all this. When Gruber—unblinkingly I imagine—informs us that the Windows 8 interface is “meant to look best on retina-caliber displays”, you have to ask yourself whether you believe in the sort of conspiracies that say either Microsoft is so forward thinking it’s willing to push out a suboptimal product for 2 years waiting for Apple to rescue them, or is just carving another notch in the bedpost of their own folly by being cravenly inept.

12: The representation of physical elements through digital form has been around since the release of the original Macintosh, but it’s really in the last 12 years, since the release of OS X, that Apple has pushed this design philosophy into every corner, background, and window pane of its operating systems. The greater the technology, the greater the amount of physical mimicry Apple has added to its software.

13: Apple’s motto until now has been It Isn’t The Consumer’s Job to Know What They Want. Even when the iPod was at its peak, Apple showed a surprising disregard for maintaining continuity in the line, often radically redesigning a product within a single generation, and sometimes backtracking the following year when those new designs failed to catch on. Underscored here is the relative insignificance of the iPod software in relation to the physicality of the device. This proportion is reversed with iOS.

14: Hence the excitement over Ive’s recent promotion to director of human interface at Apple, given the decidedly leading-edge and un-skeuomorphic style of the industrial design team Ive leads (manifested in their distaste for the philistine, superficial, and heavy-handed traps the accoutrements of skeuomorphic design often fall prey to). I liken the situation to MJ’s decision to try baseball. Here’s a guy who possessed unique natural talents that would make him gifted in any sport he decided to get up for in the morning, yet which weren’t sufficient to find immediate success against the experience of his competitors. At the highest levels, all else being equal, experience trumps skill.

15: A topic Ive has broached in the iPhone 5’s introductory video, demonstrating the power of familiarity in user experience.

16: The Microsoft Surface is a perfect case study: incredible, innovative industrial design buried and ignored in the face of the radical changes introduced by Windows 8.

17: A small list of things we either don’t have at all, would have on a smaller scale, or probably would have waited longer to see introduced were it not for smartphones: Siri & Google Now, social networking on a global scale, the explosive ubiquity of digital photography, a gaming industry divorced from its tenured oligopoly, wearable computing, ubiquitous connectivity, geo-location based services, and Angry Birds.

18: An exact description of the video game market from the mid-80s up until 2005/2006, when the economics of making a first-rate video game on the current generation of consoles made it virtually impossible to succeed unless a title sold on every available platform, putting the kibosh on decades of schoolyard turf wars over which console systems were best. But it’s only made exclusivity that much more valuable. Nintendo’s IP is the only reason the company has any relevance today, if you need just one example.

19: You need only make your own list of restricted, convoluted, clamoured for but denied, or impoverished APIs that could otherwise enable developers to create apps even greater than they already are.

20: Continuing with the video game theme from 18, we’re now describing what Steam could do with the Steam Box, its bid for our living rooms. Valve not only has a Nintendo-like following around its game titles, it’s also got the best disc-less distribution system out there in Steam. There’s likely no better candidate to endorse in the “most likely to replicate for gaming what the iPhone did for mobile computing” race.

21: Observable (a Google search will emphasize my point better than a link) from the variety of essays and switcher articles on Android finally reaching parity with iOS. From a developer/platform feedback loop perspective, we’re not quite there yet. While most of the major players (Facebook, Flipboard, Twitter, Instagram, and Angry Birds) have Android versions of their apps, what’s still lacking are desirable exclusives that attract large swaths of users and make those on competing platforms jealous. Yet this kind of slow leakage threatens to turn into a flood; the greater the number of major developers on the platform, the greater the level of confidence developers have in it, the greater the odds of Android getting those exclusives. Combined with its superior web services and ever improving hardware, Android is slowly changing the conversation from “Why wouldn’t I get an iPhone?” to “Why should I get an Android device over an iPhone?” to “Why should I get an iPhone over an Android device?”.

A Logo Too Many

Fragmentation within Microsoft design

I’m exhausting myself trying to use the “Metro” version of Windows. There’s something fresh and exciting about its vision of UI interaction that’s more appealing to me than Apple’s adherence to skeuomorphic dogma. It’s a testament to Metro’s appeal that it has me bending over backward to find excuses to leave my iPhone behind and buy a Windows Phone, even if I know in the back of my head I’ll probably regret it. And that’s exactly how I felt last night staying up until 3am attempting to install the Windows 8 Developer Preview on my MacBook.

Even if I littered my opinions with the big grain of salt that should accompany any criticism of incomplete software, I could still predict that Windows 8, on its current course of development, is a clusterfuck. I could waste a few thousand words describing every bewildering detail of this clusterfuck but I can talk about one thing that will sum it all up for you: The Windows logo.


In the image above, you should be able to identify 3 different depictions of the Windows logo. Visit the Windows Phone website and you’ll find another. Microsoft decided Windows 8 shall have its own logo as well. That’s 5 different logos, all depicting a single ecosystem of software. Variations of a logo aren’t necessarily bad. As Juan Perez demonstrates in his own Windows rebranding experiment, there are ways of branding individual services that let them stand out while reminding the user “they belong to a common umbrella”. But there’s no single unifying design umbrella to Windows branding as it stands today. Or rather, there is an umbrella (the window graphic), but it’s one made up from the parts of various other umbrellas lying around the offices in Redmond. How effective could such a patchwork logo ever be? Like parents with their children, Microsoft seems to have a difficult time saying no to its designers, allowing any and all logo designs to appear on its products. You could become furious thinking about all the issues inherent with such an approach to design, even for something as minimally interactive as a logo. Now imagine the frustration you might feel when the actual product is treated the same way Microsoft treats its own logo.

There’s no use exhausting myself for an umbrella with a golden handle when the rest of it is made of paper and barbed wire.

(Parts of) Women in Design

Katie Gillum:

There are, just scratching the surface, three main issues with this title: standardization of vaginal softness, the cutting out of the whole female body to focus only on their vaginas, and the awkward half-sexualization of the product.

Important confrontation of gender issues in design? Check.

Unbelievable yet real product you can’t believe anyone ever pitched straight faced? Check.

Scathing wit to tie the whole thing together into an engrossing read? Check.

Brooks Review Goes No Logo

Ben Brooks launched a redesign of his website today, with a noticeably unorthodox omission: his logo.

His explanation:

The truth is that I dropped the logo on accident when I was designing the new site (forgot to paste back in the relevant code) and I kept designing without it. Then I realized it was missing and added it back in, hated it, and I removed it again — this time on purpose.

Unlike the Verge design, I’ve got nothing but praise - save maybe for the strange indentation of his Fusion ad - for the new Brooks Review. Logos on text-driven websites serve almost no purpose and the new Brooks Review is significantly improved without one. The focus is now squarely on Brooks’s writing, where it should be. That is precisely why the recent redesign of Smarterbits did away with its entire header beginning last week.

Brooks may have stumbled onto this idea by happenstance, but removing the header and logo from my site was entirely intentional. As someone fancying themselves a writer, the Smarterbits brand should be the voice and quality of my writing, not my fantastic vector imaging skills. Besides, as many other tech writers can probably attest, a large portion of my audience reaches my content not by navigating to my site, but through an intermediary like Reeder or Instapaper. Having proper branding on those intermediaries is essential because that’s the easiest way for users to identify my content from among the many other authors in their RSS feeds or reading lists. If you follow a link to a new article on my site from my Twitter feed, its own branding is already alerting you to precisely whose site you are being taken to. What’s the use of a giant banner at the top of my website to remind you of what you already know?

Paying attention to reading habits and web design reveals exactly how wasteful a header can actually be. After adapting Frédéric Filloux’s study on advertising’s effect on desktop web design, I realized that my own header was taking up half the screen real estate on my iPhone, plausibly making it unclear to visitors exactly what they had landed on. Never mind advertising, my own branding was negatively affecting the functionality and purpose of my site. Considering that mobile web browsing is - I’ve read - rather popular these days, it felt rather uncouth that my site should render so poorly in that particular area. Rather than reducing its size, I simply decided to be done with it entirely. In doing so, it became impossible to justify its presence on any other version of my site. Now, anyone visiting Smarterbits, from any device, is greeted first and foremost with legible content and clear links to additional information from an unobtrusive navigation bar.

Having people know that I’m the author behind the content is a secondary objective; it’s something that occurs organically over time. Someone who finds my site but doesn’t enjoy the content or must waste time finding it isn’t likely to bother caring who’s behind it, other than to complain perhaps, no matter how beautiful the branding may be. And they certainly won’t come back or follow a link back to my site. On the other hand, someone who does enjoy reading what I have to say and has a pleasurable experience doing so is more likely to come back regularly, read more content, and share it with others - regardless of whether there is a logo to greet them.

I’d rather spend my time ensuring the latter scenario.

Designing for Mobile

Dave Olsen:

This focus, by removing unnecessary fluff and cruft to fit in the constraints of both the device real estate as well as network limitations, helps craft a better and more useful user experience. I think it’s a really interesting way to approach design and maybe we need more of that in higher ed.

Need not apply solely to designing for mobile. You can always benefit from the removal of unnecessary fluff and cruft.

(via Luke Wroblewski’s Mobile First)

Great Expectations

John Gruber:

It may or may not have ideally launched a few months sooner, but the plan was always for an iPhone 4 successor that looked like the 4 but had improved internal components. I wouldn’t be surprised if the next iPhone doesn’t change, or doesn’t change much, either.

Apple isn’t going to make a new form factor just for the sake of newness itself — they make changes only if the changes make things decidedly better. Thinner, stronger, smaller, more efficient. If they don’t have a new design that brings about such adjectives, they’re going to stick with what they have.

This seems to be Apple’s overarching philosophy when it comes to the design of all their products: regular iteration of the components and major aesthetic redesigns only when necessary and/or beneficial.

People’s expectations are a whole n’other thing of course. Watching all the complaints this week about the iPhone 4S retaining the same case design as the iPhone 4 is somewhat fascinating. There’s an expectation of the iPhone that none of Apple’s other products must endure.

Certainly, people don’t want a new case because the current one is lacking. If the iPhone 4’s design debuted this year instead of last, people would undoubtedly be praising its merits now as they were then. Arguably, it is still the most iconic smartphone design you can buy today.

So then why the fuss over being stuck with it another year?

My guess? People have grown to expect “new” phones every year because for the longest time, aesthetics was the only factor cellphone OEMs could - or cared to - significantly change and control. Different colored candy wrappers on top of identical mounds of empty calories.

Apple has never been about flash over substance. And despite all the protests to the contrary, we all know what wins out in the end.

Huffington Post Doesn’t Value Design, Itself.

If ever you’re looking for a sign that your company, institution or client doesn’t value good design and creative work, there’s no better way than to find out if they’re running a contest inviting anyone to do free work for them.

Not only is the Post’s stupid gimmick contemptuous towards whatever in-house creative professionals they might employ, but it shows how little they care for their own brand. The winner of this contest might submit a nice logo, but I can almost guarantee it won’t be the right one for the Huffington Post’s needs.

Mike Monteiro said it best. If you have training, experience, passion, expertise, self-respect and respect for your clients, no matter what your field, get paid. Refuse to undervalue your time and skill. Refuse to work with people who don’t appreciate the importance and value of creative work. Saying no is just saying yes to something else. Something better in every way, shape or form.

As a photographer, I’ve experienced how tough it can be saying no to cheap or free work. We rationalize it by saying it’s going to pay off in experience or open doors to other opportunities. You are wasting your time. The best opportunities I’ve gotten from pro bono work are more offers to spend my time and energy not getting paid and being frustrated.

Don’t waste your talent giving it to a company that doesn’t even care about its own image.

The iMac G4's Beautiful Guts

Fantastic look from a knowledgeable source at the attention to small details that makes Apple such a great design and engineering company.

Wish I was brave enough to take apart my PowerBook to marvel at its insides.

Thirteen Thousand

Pat Dryburgh:

The biggest lesson I’ve learned designing these ads is that constraint is your friend. There is absolutely no extra room in 13,000 pixels for unnecessary copy or graphics, so you are forced to pare down your message to the essentials.

Impressive what can be accomplished on such a small canvas.

More Reasons the Web is so Useful.

So I’ve been working for the last few weeks on building a website for my photography business. Having no web design experience, I’ve been looking for some reference material to help me get up to speed with HTML/XHTML/CSS. A few searches later and I stumbled onto this Introduction to Web Standards course on the Opera Developer site. Essentially, it’s the perfect primer to creating your own website if you’re into DIY.

It’s been around since 2008, but I feel it’s worth circulating anyway for anyone who might not have heard of it. It’s a great initiative on the part of the Opera team. It still astounds me how much easier it is to gain access to information and education in the digital age.