The trouble with innovation is that truly innovative ideas often look like bad ideas at the time. That’s why they are innovative — until now, nobody ever figured out that they were good ideas.
They focused on what the technology could not do at the time rather than what it could do and might be able to do in the future.
Don’t hate, create.
- Ben Horowitz
Being smart sounds right. So why should I be more stupid? Because the world is full of smart reasons why stuff can't be done.
Stupid, on the other hand, can only try.
I could sit here literally forever and wait for the perfect pitch, the one that is the easiest to knock out of the park, or I could just start swinging.
This concept reminds me of several quotes from the fantastic "Win Like Stupid" article I read a few months ago (hat tip Jason Waters). Some of my favorites:
TOO STUPID FOR CAN’T
STUPID PEOPLE NOT THINK ABOUT CAN’T WIN AT ALL. THEM JUST DO IDEAS UNTIL ONE CAN.
STUPID PEOPLE TOO DUMB FOR ODDS. THEM JUST ASSUME NEXT THING WILL WORK.
CHANCE IF NOT TRY EXACTLY NOTHING.
EVERY SMART PERSON TERRIFIED EVERYONE THINK THEM IDIOT.
STUPID PERSON ALREADY IS ONE, NOT MIND IF PEOPLE KNOW.
WORLD SMART. IT HARD TO OUTSMART WORLD. BE IDIOT. OUTSTUPID WORLD INSTEAD.
First guy: "Is he texting and watching a video at the same time?"
Second guy: "Hey, what are you doing?"
Third guy: "I'm texting and watching a video at the same time."
First guy: "You can't do that."
Second guy: "You can't do that."
Third guy: "And yet I'm doing it."
First and second guys: "Yeah - nice!"
That's right, this ad touts one of Samsung's newest features: texting and watching a video at the same time.
That should say "screen images simulated. Final version may actually line up correctly".
For an OS that seems to be marketed at robots (cue the "Droid" voice), maybe this has been a long time coming. But as a human whose input and output rarely mix, and who is already working on a screen the size of my palm, I have my doubts about the real-life usefulness of this feature.
My phone can already play audio while I text or use other apps. In contrast to Samsung's latest, one feature I would love is an easier way to pause the audio. When I'm listening to a podcast, I frequently take notes — but I can't do both at once, so I end up rewinding anyway.
But it's a feature, and features sell, right?
Is playing video on the same screen as a text composer difficult? Technically speaking, no. It might be a design challenge to make the experience seamless and handle the different modes. But even then, it's achievable.
Ballard's article is a great read about avoiding featuritis. The premise:
Users are not really compelled by features; they care only about achieving their goals. The real cost of adding excess features to digital products is unhappy customers.
I understand that Samsung has to innovate (especially given recent events). But let's remember that innovation doesn't always have to be additive.
The Sad User Slope
Hitching your horse to the "features-sell" wagon is actually a very dangerous game.
This is aptly illustrated in Ballard's article with the Happy User Peak, originally published by respected author and creator of the Head First book series, Kathy Sierra.
Unfortunately, so many of these features actually fall on the Sad User Slope. Where do you think texting while watching a video falls?
What puts a product onto the happy user peak, or the sad user slope? Ballard's definition is simple:
What turns a pleasurable solution into a cumbersome tool is the number of visible features that are irrelevant to the user’s goal.
Why this happens
These concepts aren't revolutionary. Sierra's post about featuritis is from 2005. Don't make me dig up a post about what phones were like then. (Hint: one term for them is feature phones.)
So, let's assume that Samsung isn't filled with idiots. Why is it so hard to resist adding features where the human value is so low — and then create commercials about them?
Knowing exactly what features will make an elegant solution depends on establishing a deep understanding of the users’ goals. When we fail to know what the users’ behaviors, motivations and goals are, we create complex and burdensome tools. This is because instead of first designing digital products with a real understanding of the user, we cast a wide net of features with the hope that we will capture a wide audience. We wrongly feel that the safe way to develop products is to provide as many features as possible. This is self-defeating to our goal of providing real solutions. The secret to success is to get to know your user and be specific with your solution. In short, focus on your customers’ goals and not on features. (emphasis added)
And why wouldn't we think that our users will want these features? After all, they tell us they do:
Users will almost always say yes to more features. “Would you like?” questions will generally be received with a positive response even if the feature will not help users be successful. The idea that “more is more” is so compelling, and users are unable to visualize the experience that will result from applying ...
Just say no.
Apple is famous for their focus. Recently, John Gruber linked to a fantastic article titled "The Disciplined Pursuit of Less", written by Greg McKeown for Harvard Business Review. In it, McKeown puts forward what he calls the Clarity Paradox:
Why don’t successful people and organizations automatically become very successful? One important explanation is due to what I call “the clarity paradox,” which can be summed up in four predictable phases:
Phase 1: When we really have clarity of purpose, it leads to success.
Phase 2: When we have success, it leads to more options and opportunities.
Phase 3: When we have increased options and opportunities, it leads to diffused efforts.
Phase 4: Diffused efforts undermine the very clarity that led to our success in the first place.
Curiously, and overstating the point in order to make it, success is a catalyst for failure.
In his own post, Gruber suggested that Apple's uncommon focus has allowed it to keep clarity and overcome this pattern.
With the resulting clarity, Apple keeps its products focused on the goal of providing exactly what users need.
Steve Jobs once said:
I'm as proud of the products we have not done as I am of the products we have done.
Replace products with features and you get a similar point. That sentiment is echoed by Jobs himself, as recounted by Ballard:
In a meeting between Steve Jobs and some record label representatives concerning Apple’s iTunes, Jobs kept fielding questions like "Does it do [x]?" or "Do you plan to add [y]?"
Finally Jobs said, "Wait, wait — put your hands down. Listen: I know you have a thousand ideas for all the cool features iTunes could have. So do we. But we don’t want a thousand features. That would be ugly. Innovation is not about saying yes to everything. It’s about saying NO to all but the most crucial features."
We don't yet know if this feature makes an ugly phone; Samsung's screens are simulated, but even that version doesn't give a lot of hope. Neither does the fact that users are having trouble finding how to use it.
Features don't need to be fancy. They don't need to be gimmicky or do new tricks. They just need to get done what the user wants or needs.
Just say no, and give the features that are done right the spotlight they deserve.
Stripe is a web payments company whose engineering team gets web security. They launched a hacking contest. Joseph Tartaro of IOActive has kindly compiled this writeup of the solutions.
It is a must-read for anyone interested in web security. Wait, scratch that — for anyone who even touches web application code.
In February, the engineering team at Stripe (easy, secure web payments) created the first Stripe Capture the Flag, a "security wargame" intended to test your ability to find exploits in vulnerable code. That event was based largely on understanding Unix systems and C exploits, with one PHP exploit thrown in.
The original event was a huge success, with attention from Hacker News as well as Reddit (of course).
A few days ago the team released Stripe CTF 2.0 which they are calling "Web Edition". They stepped up the support systems for this one, with logins, a leaderboard, and public code on GitHub. But what's even better is the type of exploits that are covered:
The Retina MacBook is the least repairable laptop we’ve ever taken apart: unlike the previous model, the display is fused to the glass—meaning replacing the LCD requires buying an expensive display assembly. The RAM is now soldered to the logic board—making future memory upgrades impossible. And the battery is glued to the case—requiring customers to mail their laptop to Apple every so often for a $200 replacement. ... The design pattern has serious consequences not only for consumers and the environment, but also for the tech industry as a whole.
And he blames us:
We have consistently voted for hardware that’s thinner rather than upgradeable. But we have to draw a line in the sand somewhere. Our purchasing decisions are telling Apple that we’re happy to buy computers and watch them die on schedule. When we choose a short-lived laptop over a more robust model that’s a quarter of an inch thicker, what does that say about our values?
Actually, what does that say about our values?
First of all, "short-lived" is arguable, and I'd argue for "flat-out wrong". I don't take enough laptops through to the end of their life to be a representative sample, but I've purchased two PC laptops and two MacBook Pros. After two years of use, both PCs were essentially falling apart (hinges, power cords, and basically dead batteries) while the MacBooks were running strong.
My 2008 MacBook Pro did get a not-totally-necessary battery replacement after a year, but my 2010 has run strong for two years. I'd expect nothing less from the Airs or new MacBook Pro. So short-lived might be a relative characterization if anything, and only if you consider the need to pay Apple to replace your battery instead of doing it yourself a "death".
Second, and more important, this thought occurred to me: when we look at futuristic computing devices in movies such as Minority Report or Avatar, do we think, "Neat, but those definitely don't look upgradeable. No thanks"?
Do we imagine that such progress is achieved through the kind of Luddite thinking that leads people to value "hackability" over never-before-achieved levels of precision and portability?
The quote above is the summation of Wiens' argument that "consumer voting" has pushed Apple down this road, and that we need to draw a line in the name of repairability, upgradability and hackability.
I'd argue that Apple's push toward devices that are more about the human interface and less about the components is a form of a categorical imperative, a rule for acting that has no conditions or qualifications — that there is no line, there is only an endless drive towards progress: more portable devices that get the job done with less thinking about the hardware.
That is what drives descriptions like Apple uses in its product announcements: magical, revolutionary — not hacking and upgrading.
Approved this morning, Mac App Store tool Folderwatch — an app that monitors, syncs and mirrors important files automatically — included a small detail in its update notes, stating “Retina graphics” were a new addition to the app.
Rumor is that more than just the Airs will be getting Retina displays:
We reported earlier in the week that Apple may have plans to announce updates to many of its Mac computers at WWDC. The models to be updated include the 15″ MacBook Pro, the 11″ and 13″ MacBook Air models and the iMac.
All of the various portable models being updated are rumored to be getting a Retina display boost.
I've been thinking for a while that the next round of MacBook Airs is perfectly positioned for Retina displays, so I think that's pretty much a given at this point. A colleague of mine thinks that touchscreen displays might also make an appearance, which would justify the scrolling direction switch in Mac OS X Lion last year.
Yes, nineteen. Zero is another significant number: the number of times I've downloaded the new version of Chrome.
Jeff Atwood wrote about this phenomenon, and the eventual paradise of the infinite version a year ago, stating that the infinite version is a "not if — but when" eventuality.
Chrome is at one (awesome) end of the software upgrade continuum. I've rarely noticed an update to Chrome, and I've rarely checked the version. A couple of months ago I did, when a Facebook game I was working on suddenly started crashing consistently. The problem was tracked to a specific version in Chrome, and within hours, I heard that an update was released. I checked my version again, and Chrome had already been updated with the fix.
This nearly version-free agility has allowed Chrome to innovate at a pace not matched by IE, Firefox, or even Safari. I swear the preferences pane has changed every time I open it (not necessarily a good thing).
At the other end you have the enterprise dinosaurs: the inventory, procurement, and accounting systems that are just impossible to get rid of — I'm thinking of one local furniture retailer which still uses point-of-sale and inventory systems with black/green DOS-looking screens on the sales floor.
Most other software industries fall somewhere in between, trying to innovate, update, or even just survive while still paying the backward-compatibility price for technical decisions made in years past.
Check out this gem alive and well (erm, alive, at least) in Windows Vista:
Look familiar? That file/directory widget has seen better days; it made its debut in 1992 with Windows 3.1:
Supporting legacy versions is not just a technical problem, though. For some companies and products it's just in their nature to be backward compatible. And sometimes to great success; take Windows in the PC growth generation, or standards like USB, for example. For some opinionated few, backward incompatibility is downright opposed to success.
And then there's Apple.
Apple may not be a shining example of smooth upgrades, but they aren't shy about doing it.
OK, so they aren't entirely unsentimental — Steve Jobs did hold a funeral for Mac OS 9, about a year after OS X replaced it on desktops.
All of this allows Apple to put progress and innovation above all else.
Steve Jobs, in his famous "Thoughts on Flash", made clear his view that Apple doesn't let attachment to anything stand in the way of platform progress:
We know from painful experience that letting a third party layer of software come between the platform and the developer ultimately results in sub-standard apps and hinders the enhancement and progress of the platform. If developers grow dependent on third party development libraries and tools, they can only take advantage of platform enhancements if and when the third party chooses to adopt the new features. We cannot be at the mercy of a third party deciding if and when they will make our enhancements available to our developers.
This attitude has paid off for iOS. Analysis of app downloads shows that uptake for new major versions is rapid. iOS 4.0 just barely got going before it was traded in for iOS 5.0.
The story is not the same for Android. MG Siegler reveals (via DF) that Android 4.0.x "Ice Cream Sandwich" has only seen 7.1% uptake in 7 months. As Siegler notes, Google is expected to announce the next version this month, which would likely occur before ICS even sees 10% adoption.
So at one end of the upgrade-nirvana spectrum, we have: Google. And at the other end: Google.
How could they be so different?
A lot of people want to draw tight comparisons between the mobile OS wars going on now and the desktop OS wars that Apple lost in the 80s and 90s.
Android is also "winning" in market share, with 51 percent of the US smartphone market vs. iOS's almost 31 percent, according to March 2012 comScore data. Gartner numbers put Android ahead even further, with 56 and 23 percent, respectively (and this has been the story for some time).
Or is it?
Isn't all smartphone market share equal? What's interesting about these numbers is just how different they are from web usage numbers. Shockingly different.
"Smartphone" market share numbers suggest Android is winning, yet certain usage statistics still put Apple in a comfortable lead. Meanwhile, Apple seems quite comfortable blowing away growth estimates again and again while nearly every other device maker is struggling to turn a profit. What does Apple know that the industry doesn't?
Has anyone else noticed a lack of "dumbphones" (non-smartphones) around them lately? Android phones have largely replaced these non-smartphones or feature phones in the "free" section of major cell phone providers' online stores and store shelves alike. This means customers who walk into a store looking for a new phone or a replacement but not knowing (or really caring) what they are shopping for are often ending up with an Android phone — just like they ended up with feature phones in years past.
That's great for market share. But how is it great for business? Generally speaking, these people don't buy apps. They don't promote or evangelize. They aren't repeat customers. Their primary business value is to the service provider; not to the device maker, not to Google.
And they are extremely unlikely to update their operating system.
The good news for Google is that these users who don't upgrade don't represent a large cost to abandon, should it decide to make innovative, backward-compatibility-breaking changes in future versions.
The bad news is that third party developers will be afraid to support new (and possibly risky) versions, or break compatibility with old versions in order to adopt new features. All reasonable version distribution statistics, such as the new Ice Cream Sandwich numbers, show basically no adoption at all. So why take the risk?
I get the feeling that Apple will be happy to continue to innovate and bring new and better features to its customers; and Android will continue to match feature-for-feature on paper, but not where it counts: in the App Store.
The articles on Rands keep getting longer and longer, and as I’m finishing a piece, I worry, “Is it too long?” I worry about this because we live in a lovely world of 140-character quips and status updates, and I fret about whether I’ll be able to hold your attention, which is precisely the wrong thing to worry about. What I should be worried about is, “Have I written something worthy of your attention?”
Houzz.com on "magic mirrors" — computerized touch surfaces on the mirrors and windows in your home:
Magic mirrors and magic windows — in fact, magic glass surfaces all over the house — will soon become commonplace, thanks to breathtaking advancements in computers, computer interfaces and, of all things, glass.
Count me as a skeptic on the word "soon". This technology barely exists, let alone has a good reason to exist (yet).
Devices should be getting more mobile, not less. To be successful, innovations should also solve a problem. I don't remember having the urge to check my email while leaning over the bathroom sink. It's also fairly counterproductive to OCD-types like me: "Now introducing smudgy screens all over your house, not just in your pocket."
... there are a lot of folks who think gamification means pulling the worst aspects out of games and shoving them into an application. It’s not. Don’t think of gamification as anything other than clever strategies to motivate someone to learn so they can have fun being productive.
The ever-insightful Lukas Mathis took the thoughtful middle ground. After I read his piece, I started a draft of my own. Now that draft will probably never see the light because, although the whole article is worth reading, Lopp has in three sentences summed up my feelings.
Of course, if your purpose in using gamification is anything other than helping the user enjoy learning and to be productive, then you'd do well to hear from the critics how you might be making your users feel about your software.
Creating or expanding business relationships is not about selling – it’s about establishing trust, rapport, and value creation without selling. ...
Engage me, communicate with me, add value to my business, solve my problems, create opportunity for me, educate me, inform me, but don’t try and sell me – it won’t work. An attempt to sell me insults my intelligence and wastes my time. Think about it; do you like to be sold? News flash – nobody does. Now ask yourself this question, do you like to be helped? Most reasonable people do. The difference between the two positions, while subtle, is very meaningful.
Great article, and a lot to think about. A corporate goal like "increase sales by 50%" can be taken two ways.
One way would be to imagine that more people need to be convinced to buy your product.
The other way is to consider how you can add value and find the customers who most need what you are offering.
Create goals that communicate your actual intended action and aren't open to interpretation. Measurements should be customer happiness levels, not dollars of income (that may or may not have been pried from your customers' unwilling fingers).
Focus on the reason that your product exists and help it help people.
Looks like it's downfall-prediction time in the popularity lifecycle for Facebook.
Geoffrey James of Inc. Magazine published an editorial which argues that Facebook, unlike LinkedIn, is vulnerable to being ditched for "something 'cooler'":
LinkedIn is all about business and people's resumes. Because its scope is limited to fundamentally dull information, LinkedIn is simply not vulnerable to something "cooler."
Sure, somebody could launch a site similar to LinkedIn. (And I'm sure plenty of people have.) But why would the customer base bother to change? Nobody on LinkedIn cares about being cool. LinkedIn's beauty is that it's dull but functional–like email and the telephone.
Fair point. Being popular because you're cool does make you vulnerable to the next cool thing. But he doesn't make good arguments — or any at all, really — for his case that people have been using Facebook because it's cool.
Specifically, this argument falls flat for me:
Consumer-oriented social networking sites are like television networks: People will switch when there's something better on another channel.
Actually, consumer-oriented social networking sites are nothing like television networks. Exclusive content provides customers a reason to use more than one network. And, most importantly, there's no cost to switch to something better on another TV channel — in fact, it's common to switch back and forth.
Facebook, on the other hand, with years of posts, photos, and other social interactions (yes, many of them useless), as well as a large current audience, has a huge cost for a user that wants to "switch".
James does address that:
Frankly, I think it's just one online conversion program away from losing its customer base and becoming the next MySpace.
An "online conversion program" might provide a way to minimize the data loss, but Facebook has a much larger asset: market momentum.
comScore analysis shows that 69% of North American internet users have Facebook accounts, according to a CNET article. People use Facebook because other people use Facebook — their friends, specifically. That's been one of the primary drivers of its virality, and it is now the reason for its ubiquity. In that sense, Facebook is more like email or the telephone than LinkedIn.
No online conversion program is going to move their friends over to the "next" Facebook.
A dead-simple, free, JSON API for retrieving the city and state from a zip code. And, a sample/beta jQuery plugin to show how easy it is to auto-populate those fields when the zip code is typed. Could this be an end to the years of extra-typing madness?
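If you're curious what the lookup side of that looks like, here's a minimal sketch in Python. The endpoint and the response field names below are placeholders I've made up for illustration, not the actual service's API, so treat it as the shape of the idea rather than working integration code:

```python
import json
from urllib.request import urlopen


def city_state_for_zip(zip_code):
    """Look up the city and state for a US zip code via a JSON API.

    NOTE: api.example.com and the "city"/"state" field names are
    placeholders; the real service's endpoint and response may differ.
    """
    with urlopen(f"https://api.example.com/v1/zip/{zip_code}") as resp:
        data = json.load(resp)
    return data["city"], data["state"]


print(city_state_for_zip("90210"))  # e.g. ('Beverly Hills', 'CA')
```

Wire that up to fire when the zip field loses focus and the city and state boxes fill themselves in, which is exactly what the sample jQuery plugin demonstrates on the browser side.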
Great article in Forbes by Warren Buffett for their "When I Was 25" series.
Warren Buffett, age 25:
Although I had no idea, age 25 was a turning point. I was changing my life, setting up something that would turn into a fairly good-size partnership called Berkshire Hathaway. I wasn’t scared. I was doing something I liked, and I’m still doing it.
According to a meme being passed around Facebook, Pinterest and other sites in various forms, the historians who calculated the end of the Mayan Long Count calendar as December 21, 2012 forgot to account for leap days. Oops?
There have been about 514 Leap Years since Caesar created it in 45BC. Without the extra day every 4 years, today would be July 28, 2013. Also, the Mayan calendar did not account for leap year...so technically the world should have ended 7 months ago.
Unfortunately for those who want to disprove the doomsayers a few months early, this is completely wrong.
The Mayan calendar makes use of five different periods of time, starting with a single day (k'in). The next period, winal, is 20 k'in. Then, a tun is 18 winal (360 days), a k'atun is 20 tun (7,200 days), and finally, a b'ak'tun is 20 k'atun, which makes it 144,000 days.
As the meme suggests, this calendar does not include leap days. In fact, it doesn't have anything to do with solar years at all. At 360 days, a tun is a rough approximation of a solar year, but it's already 5 days off.
The "end of the world", according to doomsayers (and, in Mayan tradition, the end of the current age, or what they considered to be the fourth world), was after the current long-count calendar reached 22.214.171.124.0, or in other words, completed the thirteenth b'ak'tun (the digits in the calendar start with zero).
That means that the thirteenth b'ak'tun ends on the 1,872,000th day since the beginning of the calendar. Leap years or not, exactly 1,872,000 days.
So how did we get December 21, 2012?
Since we know that 13.0.0.0.0 is equivalent to 1,872,000 days, we need to know exactly what date the Mayan calendar starts on. The most widely accepted answer is the GMT (Goodman-Martinez-Thompson) correlation, which places the start of the long count at Julian Day Number 584,283, or August 11, 3114 BC in the proleptic Gregorian calendar.
Plug this into a handy calculator and you can see that December 21, 2012 is, in fact, 1,872,000 days after Julian Day Number 584,283.
Leap days included.
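If you'd rather not take the handy calculator's word for it, here's a quick sketch in Python that redoes the arithmetic. The Julian-Day-Number-to-Gregorian conversion is the standard Fliegel and Van Flandern integer algorithm; 584,283 is the correlation constant quoted above.

```python
def jdn_to_gregorian(jdn):
    """Convert a Julian Day Number to a (year, month, day) Gregorian date
    using the standard Fliegel & Van Flandern integer algorithm."""
    l = jdn + 68569
    n = (4 * l) // 146097
    l = l - (146097 * n + 3) // 4
    i = (4000 * (l + 1)) // 1461001
    l = l - (1461 * i) // 4 + 31
    j = (80 * l) // 2447
    day = l - (2447 * j) // 80
    l = j // 11
    month = j + 2 - 12 * l
    year = 100 * (n - 49) + i + l
    return year, month, day


# Long-count units: 20 k'in = 1 winal, 18 winal = 1 tun,
# 20 tun = 1 k'atun, 20 k'atun = 1 b'ak'tun.
BAKTUN = 20 * 18 * 20 * 20      # 144,000 days
START_JDN = 584283              # GMT correlation: the day the long count begins

days = 13 * BAKTUN              # thirteen b'ak'tun; no solar years involved
print(days)                                 # 1872000
print(jdn_to_gregorian(START_JDN + days))   # (2012, 12, 21)
```

The leap days only enter the picture on the Gregorian side of the conversion, which is exactly where the chronologists put them.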
While it would be convenient and quite funny if some anonymous Facebook skeptic realized that historians and chronologists had "forgotten" to include leap days in the conversion from the Mayan calendar to our modern calendar, we can't rule out the end of the world quite yet.
Macs were (and are) just better. Not just because they were better built or put together, but because Apple was a better company. A braver company. A company that stood for higher ideals. When compared to the empire of Microsoft and the Dells, Sonys of the time, it simply felt like they were more right.
When I looked at that, it seemed like an injustice that Macs and Apple were the odd ones out. Like quality was being held back and barred a chance to shine just because the dominant gorillas in the room had so much power and inertia going for them.
You may or may not agree with this. You may even think this statement is ironic; that Apple is now the evil empire.
But think about this: If Apple is the gorilla now, who are they keeping down? Samsung? Sony? Microsoft? RIM? Google?
Don't get me wrong; Apple has made its fair share of blunders. However, the list of players in the personal computing industry that are actively proving their general ineptitude, indifference and/or outright malice toward their customers in one way or another is long and growing.
While Apple has certainly shown that at times they’ve let their power corrupt, they’re still guided by the fundamental principle we fell in love with: Superior products through superior design.
If Apple isn't that, the company with the higher ideal, to actually create a superior product, we are in a sad state of affairs.
Google Voice made a valiant effort to transcribe a voicemail I received that consisted entirely of hold music:
Hey. Hmm, hey. Thank you. Look forward to talking with you. Fine, hey right back with you. Hey, ciao they Yeah, hey Yeah, hey. Hello. Thank you. We're hoping we sincerely appreciate your patience. Please stay on the line and we'll be back in December.
Hey, yeah. Hello, hey bye. Hey, they. Bye. Hey, We appreciate your time and patience you stay on the line will be back hey hey. Bye, bye bye bye I did. Thanks for holding We appreciate your time and pasted. Please stay on the line and we'll be back in just a moment, bye hello. Bye bye But thank you for a likes.
Don't know why back. I was put Hello, Thank you for alone We look forward to talking with you soon. Please hold the line. I will be right back with you.
You remember those Magic Eye books from the 1990s? The ones where you'd look at them, relax your eyes, and a 3D picture would pop out? Saying that 3D movies are the future of cinema is like saying that Magic Eye books were the future of literature.