nickchaves.com

New Year's Resolution: Be Stupider

Thursday, January 2, 2014

""You miss 100% of the shots you don't take." - Wayne Gretzky" - Michael Scott

Before you read anything I have to say, go read "Can-Do vs. Can’t-Do Culture" (5 min read) by Ben Horowitz (co-founder, Andreessen Horowitz). I owe it complete credit for inspiring my one and only New Year's Resolution:

2014:
Be Stupider

The trouble with innovation is that truly innovative ideas often look like bad ideas at the time. That’s why they are innovative — until now, nobody ever figured out that they were good ideas.
...
They focused on what the technology could not do at the time rather than what it could do and might be able to do in the future.
...
Don’t hate, create.
- Ben Horowitz

Being smart sounds right. So why should I be more stupid? Because the world is full of smart reasons why stuff can't be done.

Stupid, on the other hand, can only try.

I could sit here literally forever and wait for the perfect pitch, the one that is the easiest to knock out of the park, or I could just start swinging.

This concept reminds me of several quotes from the fantastic "Win Like Stupid" article I read a few months ago (hat tip Jason Waters). Some of my favorites:

TOO STUPID FOR CAN’T

STUPID PEOPLE NOT THINK ABOUT CAN’T WIN AT ALL. THEM JUST DO IDEAS UNTIL ONE CAN.

STUPID PEOPLE TOO DUMB FOR ODDS. THEM JUST ASSUME NEXT THING WILL WORK.

CHANCE IF NOT TRY EXACTLY NOTHING.

EVERY SMART PERSON TERRIFIED EVERYONE THINK THEM IDIOT.
STUPID PERSON ALREADY IS ONE, NOT MIND IF PEOPLE KNOW.

WORLD SMART. IT HARD TO OUTSMART WORLD. BE IDIOT. OUTSTUPID WORLD INSTEAD.

Venn Diagram of Real Life

Image from http://readwrite.com/2013/03/19/fake-grimlock-win-like-stupid

Overthinking is the enemy of doing. That perfect time to leave the gate will never come. Most of us have spent plenty of time learning how to not fall down. Just get on the track and run.

So, Happy New Year. The newer, stupider me is going to have a great 2014!


"Texting and watching a video at the same time"

Thursday, September 6, 2012

Tonight I was interrupted by this ad, introducing the new Samsung Galaxy S III:

First guy: "Is he texting and watching a video at the same time?"
Second guy: "Hey, what are you doing?"
Third guy: "I'm texting and watching a video at the same time."
First guy: "You can't do that."
Second guy: "You can't do that."
Third guy: "And yet I'm doing it."
First and second guys: "Yeah - nice!"

That's right, this ad touts one of Samsung's newest features: texting and watching a video at the same time.

Screenshot of Samsung commercial

That should say "screen images simulated. Final version may actually line up correctly".

For an OS that seems to be marketed at robots (cue "Droid" voice), maybe this has been a long time coming. But for humans like me, for whom input and output rarely mix, and who are already working on a screen the size of their palm, I have my doubts about the real-life usefulness of this feature.

My phone can already play audio while I text or use other apps. In contrast to Samsung's latest, one feature I would love is an easier way to pause that audio. When I'm listening to a podcast, I frequently take notes — but I can't do both at the same time, or I end up rewinding anyway.

But it's a feature, and features sell, right?

Is playing video on the same screen as a text composer difficult? Technically speaking, no. It might be a design challenge to make the experience seamless and handle the different modes. But even then, it's achievable.

So why not do it?

What better time to dig up my friend Steve Ballard's article "Designing Digital Solutions" (PDF).

It's a great read about avoiding featuritis. Ballard's premise:

Users are not really compelled by features; they care only about achieving their goals. The real cost of adding excess features to digital products is unhappy customers.

I understand that Samsung has to innovate (especially given recent events). But let's remember that innovation doesn't always have to be additive.

The Sad User Slope

Hitching your horse to the "features-sell" wagon is actually a very dangerous game.

This is aptly illustrated in Ballard's article with the Happy User Peak, originally published by respected author and creator of the Head First book series, Kathy Sierra.

Unfortunately, so many of these features actually fall on the Sad User Slope. Where do you think texting while watching a video falls?

The Sad User Slope

What puts a product onto the happy user peak, or the sad user slope? Ballard's definition is simple:

What turns a pleasurable solution into a cumbersome tool is the number of visible features that are irrelevant to the user’s goal.

Why this happens

These concepts aren't revolutionary. Sierra's post about featuritis is from 2005. Don't make me dig up a post about what phones were like then. (Hint: one term for them is feature phones.)

So, let's assume that Samsung isn't filled with idiots. Why is it so hard to resist adding features where the human value is so low — and then create commercials about them?

Ballard:

Knowing exactly what features will make an elegant solution depends on establishing a deep understanding of the users’ goals. When we fail to know what the users’ behaviors, motivations and goals are, we create complex and burdensome tools. This is because instead of first designing digital products with a real understanding of the user, we cast a wide net of features with the hope that we will capture a wide audience. We wrongly feel that the safe way to develop products is to provide as many features as possible. This is self-defeating to our goal of providing real solutions. The secret to success is to get to know your user and be specific with your solution. In short, focus on your customers’ goals and not on features. (emphasis added)

And why wouldn't we think that our users will want these features? After all, they tell us they do:

Users will almost always say yes to more features. “Would you like?” questions will generally be received with a positive response even if the feature will not help users be successful. The idea that “more is more” is so compelling, and users are unable to visualize the experience that will result from applying that approach.

Just say no.

Apple is famous for their focus. Recently, John Gruber linked to a fantastic article titled "The Disciplined Pursuit of Less", written by Greg McKeown for Harvard Business Review. In it, McKeown puts forward what he calls the Clarity Paradox:

Why don’t successful people and organizations automatically become very successful? One important explanation is due to what I call “the clarity paradox,” which can be summed up in four predictable phases:

  • Phase 1: When we really have clarity of purpose, it leads to success.
  • Phase 2: When we have success, it leads to more options and opportunities.
  • Phase 3: When we have increased options and opportunities, it leads to diffused efforts.
  • Phase 4: Diffused efforts undermine the very clarity that led to our success in the first place.

Curiously, and overstating the point in order to make it, success is a catalyst for failure.

In his own post, Gruber suggested that Apple's uncommon focus has allowed it to keep clarity and overcome this pattern.

With the resulting clarity, Apple keeps its products focused on the goal of providing exactly what users need.

Steve Jobs once said:

I'm as proud of the products we have not done as I am of the products we have done.

Replace products with features and you get a similar point. That sentiment is echoed by Jobs himself, as recounted by Ballard:

In a meeting between Steve Jobs and some record label representatives concerning Apple’s iTunes, Jobs kept fielding questions like "Does it do [x]?" or "Do you plan to add [y]?"

Finally Jobs said, "Wait, wait — put your hands down. Listen: I know you have a thousand ideas for all the cool features iTunes could have. So do we. But we don’t want a thousand features. That would be ugly. Innovation is not about saying yes to everything. It’s about saying NO to all but the most crucial features."

We don't yet know if this feature makes an ugly phone; Samsung's screens are simulated, but even that version doesn't give a lot of hope. Neither does the fact that users are having trouble figuring out how to use it.

Features don't need to be fancy. They don't need to be gimmicky or do new tricks. They just need to get done what the user wants or needs.

Just say no, and give the features that are done right the spotlight they deserve.


No, I don't want to download a new browser

Wednesday, July 25, 2012

Or, "Why you should not excuse a bad website with a list of 'approved' browsers"

This is what I got when I clicked an "Order Online" button on byutickets.com:

Screenshot

For reference, here are the current popular browsers and their latest versions (since version numbers are mostly arbitrary in browsers, I've also included how long ago the versions shown in the screenshot were surpassed), in rough popularity order:

  • Chrome (Version 20). Not Listed. Passed Firefox around version 15, 8 months ago.
  • Internet Explorer (Version 9). 17 months old.
  • Firefox (Version 14.0.1). Version 4 was released 17 months ago.
  • Safari (5.1.7/6.0 as of today). Safari 5 has been out for two years.

Do your users (and your reputation) a favor, and stop relying on "recommended browsers". Those should have died 10 years ago:

The old "Free Microsoft Internet Explorer 3.0" and "Netscape Now! 3.0" badges

It's not hard. Just build your website:

  • A) to web standards that have staying power, and have already been stable for years
  • B) with progressive enhancement and graceful degradation in mind (see the sketch below).
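
To make point B concrete, here's a minimal sketch in TypeScript (my illustration, not anything from byutickets.com; the form id, action URL, and messages are made up): the plain HTML form is the baseline that works in any browser, and the script only layers an in-page submit on top when the browser supports the features it needs.

```typescript
// A sketch of progressive enhancement, assuming a hypothetical ticket-order form:
//
//   <form id="order-form" action="/order" method="post"> ... </form>
//
// The form above is the baseline: it posts and reloads the page in any browser.
// The script below only layers on an in-page submit when the browser supports it.
const form = document.querySelector<HTMLFormElement>("#order-form");

if (form && "fetch" in window && "FormData" in window) {
  form.addEventListener("submit", async (event) => {
    event.preventDefault(); // enhanced path: submit without leaving the page
    const response = await fetch(form.action, {
      method: "POST",
      body: new FormData(form),
    });
    form.insertAdjacentText(
      "afterend",
      response.ok ? "Order received." : "Something went wrong. Please try again."
    );
  });
}
// Browsers without fetch/FormData never enter the if block and keep the
// default full-page form submission: graceful degradation, not a browser wall.
```

Browsers that fail the feature check never run the enhanced path and simply keep the default full-page submit, so nobody ever needs to be told to go download an "approved" browser.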


Hackers gonna hate

Sunday, June 17, 2012

Kyle Wiens (of iFixit) doesn't like the new MacBook Pro:

The Retina MacBook is the least repairable laptop we’ve ever taken apart: unlike the previous model, the display is fused to the glass—meaning replacing the LCD requires buying an expensive display assembly. The RAM is now soldered to the logic board—making future memory upgrades impossible. And the battery is glued to the case—requiring customers to mail their laptop to Apple every so often for a $200 replacement. ... The design pattern has serious consequences not only for consumers and the environment, but also for the tech industry as a whole.

And he blames us:

We have consistently voted for hardware that’s thinner rather than upgradeable. But we have to draw a line in the sand somewhere. Our purchasing decisions are telling Apple that we’re happy to buy computers and watch them die on schedule. When we choose a short-lived laptop over a more robust model that’s a quarter of an inch thicker, what does that say about our values?

Actually, what does that say about our values?

First of all, "short-lived" is arguable, and I'd argue for "flat-out wrong". I don't take enough laptops through to the end of their life to be a representative sample, but I've purchased two PC laptops and two MacBook Pros. After two years of use, both PCs were essentially falling apart (hinges, power cords, and basically dead batteries) while the MacBooks were running strong.

My 2008 MacBook Pro did get a not-totally-necessary battery replacement after a year, but my 2010 has run strong for two years. I'd expect nothing less from the Airs or new MacBook Pro. So "short-lived" is a relative characterization at best, and only applies if you consider needing to pay Apple to replace your battery, instead of doing it yourself, a "death".

Second, and more important, this thought occurred to me: when we look at futuristic computing devices in movies such as Minority Report or Avatar, do we think, "Neat, but those definitely don't look upgradeable. No thanks"?

Do we imagine that such progress is achieved through the kind of Luddite thinking that leads people to value "hackability" over never-before-achieved levels of precision and portability?

The quote above is the summation of Wiens' argument that "consumer voting" has pushed Apple down this road, and that we need to draw a line in the name of repairability, upgradability and hackability.

I'd argue that Apple's push toward devices that are more about the human interface and less about the components is a form of a categorical imperative, a rule for acting that has no conditions or qualifications — that there is no line, there is only an endless drive towards progress: more portable devices that get the job done with less thinking about the hardware.

That is what drives the descriptions Apple uses in its product announcements: magical, revolutionary — not hackable and upgradeable.

(Link via Daring Fireball)


The Android version problem

Tuesday, June 5, 2012

I'm writing this post from Chrome 19.

Yes, nineteen. Zero is another significant number: the number of times I've downloaded the new version of Chrome.

Jeff Atwood wrote about this phenomenon, and the eventual paradise of the infinite version, a year ago, stating that the infinite version is a "not if — but when" eventuality.

Chrome is at one (awesome) end of the software upgrade continuum. I've rarely noticed an update to Chrome, and I've rarely checked the version. A couple of months ago I did, when a Facebook game I was working on suddenly started crashing consistently. The problem was tracked to a specific version in Chrome, and within hours, I heard that an update was released. I checked my version again, and Chrome had already been updated with the fix.

This nearly version-free agility has allowed Chrome to innovate at a pace not matched by IE, Firefox, or even Safari. I swear the preferences pane has changed every time I open it (not necessarily a good thing).

At the other end you have the enterprise dinosaurs: the inventory, procurement, and accounting systems that are just impossible to get rid of — I'm thinking of one local furniture retailer that still uses point-of-sale and inventory systems with black/green DOS-looking screens on the sales floor.

Most other software industries fall somewhere in between, trying to innovate, update, or even just survive while still paying the backward-compatibility price for technical decisions made in years past.

Check out this gem alive and well (erm, alive, at least) in Windows Vista:

Add Fonts in Windows Vista

Look familiar? That file/directory widget has seen better days; it made its debut in 1992 with Windows 3.1:

Open File in Windows 3.1

Supporting legacy versions is not just a technical problem, though. For some companies and products it's just in their nature to be backward compatible. And sometimes to great success: take Windows in the PC growth generation, or standards like USB, for example. For an opinionated few, backward incompatibility is downright opposed to success.

And then there's Apple.

Apple may not be a shining example of smooth upgrades, but they aren't shy about doing it.

Anyone who knows Apple knows they are just about the least sentimental bastards in the world. They'll throw away all their work on a platform that many were still excited about or discontinue an enormously popular product that was only 18 months old. They have switched underlying operating system kernels, chip architectures, their core operating system API, and notoriously and unceremoniously broken third-party applications on a wide scale with every new OS (both Mac and iOS).

OK, so they aren't entirely unsentimental — Steve Jobs did hold a funeral for Mac OS 9, about a year after OS X replaced it on desktops.

All of this allows Apple to put progress and innovation above all else.

Steve Jobs, in his famous "Thoughts on Flash", made clear his view that Apple doesn't let attachment to anything stand in the way of platform progress:

We know from painful experience that letting a third party layer of software come between the platform and the developer ultimately results in sub-standard apps and hinders the enhancement and progress of the platform. If developers grow dependent on third party development libraries and tools, they can only take advantage of platform enhancements if and when the third party chooses to adopt the new features. We cannot be at the mercy of a third party deciding if and when they will make our enhancements available to our developers.

Jobs also made it clear that this applies to Apple itself. "If you don't cannibalize yourself, someone else will," said Jobs in a quote pulled from the Walter Isaacson biography.

This attitude has paid off for iOS. Analysis of app downloads shows that uptake for new major versions is rapid. iOS 4.0 just barely got going before it was traded in for iOS 5.0.

The story is not the same for Android. MG Siegler reveals (via DF) that Android 4.0.x "Ice Cream Sandwich" has only seen 7.1% uptake in 7 months. As Siegler notes, Google is expected to announce the next version this month, which would likely occur before ICS even sees 10% adoption.

Two Googles

So at one end of the upgrade-nirvana spectrum, we have: Google. And at the other end: Google.

How could they be so different?

A lot of people want to draw tight comparisons between the mobile OS wars going on now and the desktop OS wars that Apple lost in the 80s and 90s.

If we do that, for a minute, we might actually see some similarities. Apple currently supports six iOS devices, with a total history of twelve. By some accounts, there may be as many as 3997 distinct Android devices.

Android is also "winning" in market share, with 51 percent of the US smartphone market vs. iOS's almost 31 percent, according to March 2012 Comscore data. Gartner numbers put Android ahead even further, with 56 and 23 percent, respectively (and this has been the story for some time).

Or is it?

Isn't all smartphone market share equal? What's interesting about these numbers is just how different they are from web usage numbers. Shockingly different.

This report by Forbes puts iOS web market share at 62%, with Android at 20%.

Similar data comes in when analyzing enterprise usage as well.

Smartdumbphones

"Smartphone" market share numbers suggest Android is winning, yet certain usage statistics still put Apple in a comfortable lead. Meanwhile, Apple seems quite comfortable blowing away growth estimates again and again while nearly every other device maker is struggling to turn a profit. What does Apple know that the industry doesn't?

Has anyone else noticed a lack of "dumbphones" (non-smartphones) around them lately? Android phones have largely replaced these non-smartphones or feature phones in the "free" section of major cell phone providers' online stores and store shelves alike. This means customers who walk into a store looking for a new phone or a replacement but not knowing (or really caring) what they are shopping for are often ending up with an Android phone — just like they ended up with feature phones in years past.

That's great for market share. But how is it great for business? Generally speaking, these people don't buy apps. They don't promote or evangelize. They aren't repeat customers. Their primary business value is to the service provider; not to the device maker, not to Google.

And they are extremely unlikely to update their operating system.

The good news for Google is that these users who don't upgrade don't represent a large cost to abandon, should Google decide to make innovative, backward-compatibility-breaking changes in future versions.

The bad news is that third party developers will be afraid to support new (and possibly risky) versions, or break compatibility with old versions in order to adopt new features. All reasonable version distribution statistics, such as the new Ice Cream Sandwich numbers, show basically no adoption at all. So why take the risk?

I get the feeling that Apple will be happy to continue to innovate and bring new and better features to its customers; and Android will continue to match feature-for-feature on paper, but not where it counts: in the App Store.


Will LinkedIn outlive Facebook?

Monday, April 23, 2012

Looks like it's downfall-prediction time in the popularity lifecycle for Facebook.

Geoffrey James of Inc. Magazine published an editorial which argues that Facebook, unlike LinkedIn, is vulnerable to being ditched for "something 'cooler'":

LinkedIn is all about business and people's resumes. Because its scope is limited to fundamentally dull information, LinkedIn is simply not vulnerable to something "cooler."

Sure, somebody could launch a site similar to LinkedIn. (And I'm sure plenty of people have.) But why would the customer base bother to change? Nobody on LinkedIn cares about being cool. LinkedIn's beauty is that it's dull but functional–like email and the telephone.

Fair point. Being popular because you're cool does make you vulnerable to the next cool thing. But he doesn't make good arguments — or any at all, really — for his case that people have been using Facebook because it's cool.

Specifically, this argument falls flat for me:

Consumer-oriented social networking sites are like television networks: People will switch when there's something better on another channel.

Actually, consumer-oriented social networking sites are nothing like television networks. Exclusive content provides customers a reason to use more than one network. And, most importantly, there's no cost to switch to something better on another TV channel — in fact, it's common to switch back and forth.

Facebook, on the other hand, with years of posts, photos, and other social interactions (yes, many of them useless), as well as a large current audience, has a huge cost for a user that wants to "switch".

James does address that:

Frankly, I think it's just one online conversion program away from losing its customer base and becoming the next MySpace.

An "online conversion program" might provide a way to minimize the data loss, but Facebook has a much larger asset: market momentum.

ComScore analysis shows that 69% of North American internet users have Facebook accounts, according to a CNET article. People use Facebook because other people use Facebook — their friends, specifically. That's been one of the primary drivers of its virality, and it is now the reason for its ubiquity. In that sense, Facebook is more like email or the telephone than LinkedIn.

No online conversion program is going to move their friends over to the "next" Facebook.


Not so fast, 2012-deniers

Tuesday, March 6, 2012

Don't believe the don't-believe-the-hype hype.

According to a meme being passed around Facebook, Pinterest and other sites in various forms, the historians who calculated the end of the Mayan Long Count calendar as December 21, 2012 forgot to account for leap days. Oops?

This is even being picked up by some news outlets. The International Business Times reported it with the attention-grabbing headline, "Why the end of the world will not happen on December 21, 2012":

There have been about 514 Leap Years since Caesar created it in 45BC. Without the extra day every 4 years, today would be July 28, 2013. Also, the Mayan calendar did not account for leap year...so technically the world should have ended 7 months ago.

Unfortunately for those who want to disprove the doomsayers a few months early, this is completely wrong.

The Mayan calendar makes use of five different periods of time, starting with a single day (k'in). The next period, winal, is 20 k'in. Then, tun is 18 winal (360 days), k'atun is 20 tun, and finally, a b'ak'tun is 20 k'atun, which makes it 144,000 days.

As the meme suggests, this calendar does not include leap days. In fact, it doesn't have anything to do with solar years at all. At 360 days, a tun is a rough approximation of a solar year, but it's already 5 days off.

The "end of the world", according to doomsayers (and, in Mayan tradition, the end of the current age, or what they considered to be the fourth world), was after the current long-count calendar reached 13.0.0.0.0, or in other words, completed the thirteenth b'ak'tun (the digits in the calendar start with zero).

That means that the thirteenth b'ak'tun ends on the 1,872,000th day since the beginning of the calendar. Leap years or not, exactly 1,872,000 days.

So how did we get December 21, 2012?

Since we know that 13.0.0.0.0 is equivalent to 1,872,000 days, we need to know exactly what date the Mayan calendar starts on.

According to astrologer John Major Jenkins, an archeologist named J. Eric S. Thompson determined that 0.0.0.0.0 corresponded to the Julian date 584283.

Plug this into a handy calculator and you can see that December 21, 2012 is, in fact, 1,872,000 days from Julian date 584283.

Screenshot from Julian Date calculator

Leap days included.
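
For anyone who wants to check the arithmetic, here's a rough sketch in TypeScript (my own, not from the post or the calculator above): build up the Long Count units, add the 1,872,000 days to Thompson's correlation, and convert the resulting Julian Day Number to a Gregorian date with the standard Fliegel-Van Flandern integer algorithm.

```typescript
// Long Count units, built up from a single day (k'in).
const KIN = 1;               // 1 day
const WINAL = 20 * KIN;      // 20 days
const TUN = 18 * WINAL;      // 360 days
const KATUN = 20 * TUN;      // 7,200 days
const BAKTUN = 20 * KATUN;   // 144,000 days

const longCountDays = 13 * BAKTUN; // 13.0.0.0.0 -> exactly 1,872,000 days
const creationJdn = 584283;        // Thompson's correlation for 0.0.0.0.0
const endJdn = creationJdn + longCountDays; // 2,456,283

// Fliegel-Van Flandern integer algorithm: Julian Day Number -> Gregorian date.
// Leap days are inherent in the conversion, so nothing gets "forgotten".
function jdnToGregorian(jdn: number): { year: number; month: number; day: number } {
  let l = jdn + 68569;
  const n = Math.floor((4 * l) / 146097);
  l -= Math.floor((146097 * n + 3) / 4);
  const i = Math.floor((4000 * (l + 1)) / 1461001);
  l -= Math.floor((1461 * i) / 4) - 31;
  const j = Math.floor((80 * l) / 2447);
  const day = l - Math.floor((2447 * j) / 80);
  l = Math.floor(j / 11);
  const month = j + 2 - 12 * l;
  const year = 100 * (n - 49) + i + l;
  return { year, month, day };
}

console.log(jdnToGregorian(endJdn)); // { year: 2012, month: 12, day: 21 }
```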

While it would be convenient and quite funny if some anonymous Facebook skeptic realized that historians and chronologists had "forgotten" to include leap days in the conversion from the Mayan calendar to our modern calendar, we can't rule out the end of the world quite yet.


What hold music sounds like to Google Voice

Sunday, January 29, 2012

Google Voice made a valiant effort to transcribe a voicemail I received that consisted entirely of hold music:

Hey. Hmm, hey. Thank you. Look forward to talking with you. Fine, hey right back with you. Hey, ciao they Yeah, hey Yeah, hey. Hello. Thank you. We're hoping we sincerely appreciate your patience. Please stay on the line and we'll be back in December.

Hey, yeah. Hello, hey bye. Hey, they. Bye. Hey, We appreciate your time and patience you stay on the line will be back hey hey. Bye, bye bye bye I did. Thanks for holding We appreciate your time and pasted. Please stay on the line and we'll be back in just a moment, bye hello. Bye bye But thank you for a likes.

Don't know why back. I was put Hello, Thank you for alone We look forward to talking with you soon. Please hold the line. I will be right back with you.
