“The way to have good ideas is to have lots of ideas.” That’s one of my favorite axioms and, in my experience, it is universally true. I have many ideas, every day, and some of them are very good. Mostly, though, they are bad.
A small fraction of my good ideas made it to market, but time spent on good ideas is never wasted. There’s always abundant insight to be gleaned from working on a promising thought, and sometimes working on an auspicious idea can lead to other, even better ones.
I’ve wasted plenty of time, though, pursuing my bad ideas. The time and attention I’ve invested in bad ideas in the quixotic hope that they will somehow morph into good ones has been, by far, my biggest waste. Not only did it cost me time and effort, but I could have been working on something much better instead.
Pursuing bad ideas instead of good ones is a significant and largely hidden problem of innovation. Economists call the waste “opportunity cost.” It’s the cost of what you didn’t do while you were busy doing something else: what good idea did you ignore while you were busy working on a bad one? I would argue that opportunity cost is the most expensive cost in all of business.
The obvious solution is to only invest time on good ideas, but that isn’t a realistic solution because of the conundrum of innovation:
Bad ideas often look really good in the beginning, and that’s when good ideas almost always look bad.
For example, the people who worked hard on the Microsoft Zune really thought at the time it was the best music player ever, and many observers of Google a decade ago thought it was just another silly Web startup with an equally silly name.
Frankly, it’s really difficult to find good examples of this phenomenon because of some very powerful cognitive illusions. In hindsight, all good ideas look good and all bad ideas look bad, even though this is not at all the case in the heat of the moment.

Generally, ideas are cheap and plentiful, and therefore so are good ideas. Unfortunately, bad ideas are the most plentiful of all. While success depends on developing good ideas, consistently successful companies and individuals are skilled at both identifying and discarding bad ideas to devote resources to the good ones.
Unfortunately, the only way to determine whether an idea is any good is to spend time and attention developing it. The conundrum comes from having the clarity to finally recognize an idea as bad and yet also having the strength to abandon it immediately. Tragically, it’s human nature to grow attached to the things we spend time and attention on, and to not want to discard the things we have grown attached to. It’s even harder for an organization to make a tough decision like that. (This is yet another of those cognitive illusions: we think, “I’m working on it, therefore it must be good.”)
You have to be tough enough to nurture your idea to the point where its fatal flaws become visible, then ruthlessly put it out of your misery. The author William Faulkner said, “Kill your darlings.” Any writer (and coder) knows that you have to ruthlessly cut some of your most cherished paragraphs if they don’t move the reader directly to the heart of your story.
At bottom, investing our time and attention to develop new ideas is what all post-industrial workers do. As self-motivated workers, what keeps us going is the opportunity to devote ourselves to something we truly care about. And when what we care about turns out to be one of the plentiful bad ideas, we have to man up and euthanize it. That is pretty damn hard to do.
Thus I arrive at the inevitable conclusion that the really tough problem of innovation is waste disposal.
Despite the desire for efficiency, there is simply no shortcut to determining if an idea is any good. After some effort, it will reveal its true nature, but you can’t rush the process. Our intuition and common sense simply cannot be depended upon to discriminate between raw ideas in advance of empirical evidence.
Most organizations, recognizing the magnitude of opportunity cost, try to avoid it. They select promising ideas to pursue and discard the apparently bad ones before investing in them. This looks good on the surface: intelligent people making sensible decisions based on reliable thinking. The only problem is that it doesn’t work. Those decisions are about as accurate as flipping a coin, and their arbitrariness tramples morale.
Discarding presumptively bad ideas before they have been tested protects knowledge workers from the pain of discarding their work. Sadly, it also prevents those workers from achieving any significant level of creativity. In other words, in a creative organization, any effort toward efficiency generally dooms innovation.
In a culture that celebrates only winners, it’s difficult to lead a team of knowledge workers into a dead-end project and then extricate them without blunting their morale. We must instead create a culture that celebrates waste disposal with the same enthusiasm it celebrates winning. Rather than rewarding the pursuit of efficiency, we must reward effectiveness, which includes discarding our favorites even more than it does pursuing them. We must recognize the value of thinning the herd.
At its heart, this is a problem of cultural change, and the biggest change is redirecting our attention from the bottom line to the top line. It is better to give our creative employees the opportunity to achieve stellar success than it is to protect them from a brush with disaster. In other words, it is better to waste resources and win than to conserve resources and lose.
Alan Cooper is the co-founder of Cooper and a pioneer of the modern computing era. He created the programming language Visual Basic and wrote industry-standard books on design practice like “About Face.”