page updated on April 02, 2014
I have seen something else under the sun: The race is not to the swift or the battle to the strong, nor does food come to the wise or wealth to the brilliant or favor to the learned; but time and chance happen to them all.
Imagine two products. One is beautifully designed. Its features are elegant. It's consistent through and through. If you looked at every piece down to its most irreducible core, you'd conclude that it is of the highest possible quality. The other product works, and that's the best you can say about it. It does the job.
Which product will win in the market?
Why, obviously the beautiful product should win. We want to believe that users reward quality—even the qualities they don't see, such as the underlying architecture of the source code of a software product or the micron-perfect layout of a circuit board in a piece of consumer electronics. We want to believe that, and yet we're proven wrong time and time again.
In 1973, Robert Metcalfe co-invented a short-distance networking standard called Ethernet. One of his observations, later termed Metcalfe's Law, is that the value of a network increases dramatically, more than linearly, with the number of participants in the network. Every new person in your network increases the total number of possible connections in your network polynomially (n * (n - 1) / 2). A network of 2 users has 1 connection, a network of 3 users has 3 connections, a network of 4 has 6, and so on. (In practice, the upper bound of a network may be described better by Zipf's Law, but that's a technical refinement which does not affect this argument.)
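That pairwise-connection formula is simple enough to sketch; a minimal illustration (the method name is mine, not from any particular library):

```ruby
# Number of possible pairwise connections among n participants:
# each of n people can connect to n - 1 others, and dividing by 2
# avoids counting each connection twice.
def connections(n)
  n * (n - 1) / 2
end

(2..5).map { |n| [n, connections(n)] }
# => [[2, 1], [3, 3], [4, 6], [5, 10]]
```

The growth is quadratic: doubling the participants roughly quadruples the connections, which is why a modest lead in users can translate into a large lead in value.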
With respect to products which succeed or fail on the strength of their markets, this law appears to be instructive. Imagine, if you will, the early days of the public Internet, when AOL was a walled garden with millions of subscribers and no way to access the hundreds of pages on the nascent World Wide Web. If Metcalfe's Law held in this situation, every new web site available outside of AOL made the public Internet slightly more valuable, until the open web eventually became more valuable to AOL's subscribers than AOL's own private network was to those very same users.
Even if AOL's private offerings were better in terms of quality or features or ease of use or predictability of access or user interface, the value of the larger network dwarfed the value of the technical quality of the smaller network.
You can make the same argument for VHS over Betamax videocassettes in the 1980s, or CDs over MiniDiscs, or Blu-ray over HD DVD: the smaller network loses, even if its technical qualities are better.
(Price is a factor, as is the desire for some customers to avoid relying on a single vendor, especially one like Sony. Then again, this argument applies even in the 2010s: a new video game console with no third-party games will struggle against a competitor which has successfully courted outside studios.)
Put another way, as much as people want to believe that they buy products of the highest technical quality, their actions demonstrate that it's not the primary concern.
You can see this effect in programming language adoption. Even though programmers tend to pride themselves on choosing "the best tool for the job", that's often a thought-killing cliché used as a post-hoc justification for personal prejudices.
When the Ruby programming language first attracted English-speaking users, they struggled until Programming Ruby came out, and even then they had a lot of work ahead of them. Ruby had very little English documentation. It had few libraries. Its tools were immature. It had few users.
When Ruby on Rails became popular in January 2005, Ruby was still relatively immature. Its main implementation was slow and used a lot of memory. It was undertested. It didn't handle Unicode well. It had a larger—but not large—community. If you compared it to its competitors such as Java or Perl or PHP, it didn't have many advantages.
It did, however, have momentum. Unlike PHP, it had a design aesthetic. Unlike Perl, it seemed unified and straightforward to get started. Unlike Java, it seemed like easygoing fun. Rails didn't offer much that web programmers couldn't get elsewhere. (Rails didn't offer much that good web programmers hadn't already been getting elsewhere.) It moved fast. It broke things. It made mistakes and it eventually gained documentation.
Rails did one thing right for Ruby, however. Instead of trying to compete on terms of technical quality where it couldn't compete (lack of performance, lack of libraries, lack of documentation, lack of mature implementation), it competed on one thing: it had a consistent message. That message was "We're having fun. We're getting things done. It's easy to join us."
Whether this was a deliberate attempt to build a community quickly or just time and chance happening under the sun, it worked. Rails built a community that web developers couldn't ignore and it built up a Ruby community such that people began experimenting with Ruby solely because it was getting buzz.
Ruby's time to address technical deficiencies could come later. (It's still coming, in fact.) It grew by attracting a large community faster, however.
When Lisp was still a viable programming language (sorry, Lisp fans: not even ITA and Clojure prove that it is), Richard Gabriel wrote Worse is Better. Years later, your author met Mr. Gabriel. One gem from that conversation was Mr. Gabriel admitting that the first time he looked at the Java programming language, he was surprised to discover that it didn't include continuations.
To a highly capable programmer with tremendous practical experience using powerful languages such as Common Lisp and Scheme, the idea of programming without the ability to define his own control flow constructs was akin to typing while wearing oven mitts.
Yet Java has more users than CL and Scheme combined.
There's a curious exchange in the Chuck Moore interview of Masterminds of Programming, where the interviewee says "Operating systems are dauntingly complex and totally unnecessary." In context, Mr. Moore is suggesting that a capable programmer should be able to write his or her own software from the bootloader through device drivers to editor and daily use applications.
Chuck Moore was an amazing programmer with a gift for making brilliant decisions and building high-quality software from small pieces with an economy of code. Neither you nor the author is Chuck Moore. We're far better off relying on experts in OpenGL to write drivers for our video cards, or experts in file systems to write wear-leveling code for our SSDs, or experts in compilers to write more efficient and safe programming languages.
Goodness knows the days of any one programmer writing an entire operating system from scratch are over; would you trust your business's stack to someone who can hold all of the knowledge to implement a networking protocol, a relational database, a reporting system, and a user interface? Could you afford this person? Could you recognize this person? Could you wait until he or she finished all of that software?
Even if the world's best programming language were released tomorrow, how long before it supported the libraries you needed? How long before it had enough available developers to hire? (The size of the developer pool isn't always a good indication of developer quality, but a larger pool is generally better than a smaller pool.) Even a popular and high-quality language such as Python struggles with a multi-year process of switching between two major versions.
By all means pursue technical quality. With everything else equal, a language or community of higher quality is better (by some measurement) than a language or community which does not take quality seriously. (PHP earned its reputation as an insecure language by failing to improve security for a long time. Perhaps it's unfair that PHP has retained that reputation, but strong impressions can stick around.)
If you want to build something that lasts and succeeds, however, you need to consider how you're going to attract, retain, and curate a community. Perhaps that's creating a powerful language extension system such as Perl's CPAN or being the default environment for a new platform (HTML5 for Firefox OS or VBA for Windows machines in office settings).
Often that means offering a good foreign function interface to C libraries.
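A foreign function interface lets a young language borrow mature C libraries instead of reimplementing everything itself. As a minimal sketch using Ruby's standard Fiddle library, calling libc's strlen (this assumes a typical system where libc is already loaded into the Ruby process):

```ruby
require 'fiddle'

# Look up the C function strlen among the symbols already
# loaded into this process, and describe its signature.
strlen = Fiddle::Function.new(
  Fiddle::Handle::DEFAULT['strlen'],
  [Fiddle::TYPE_VOIDP],   # argument: const char *
  Fiddle::TYPE_SIZE_T     # return value: size_t
)

strlen.call('a network effect')  # => 16
```

The point is not this particular function, of course, but that a few lines of glue give a small community access to decades of existing C code.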
Focusing on the soft and humane skills of attracting good contributors, mentoring novices, and helping turn new programmers into experts who can themselves train others is difficult, and it may seem foreign to programmers who would rather pride themselves on the measurable technical work of building up an imposing meritocracy. (That word's not used positively here.) While you can produce a benchmark to show that one patch is faster or more efficient or uses less memory than another, you can't reproduce community experiments. Once you've messed up your language by splitting the community, you can't undo the damage.
Managing a community is more art than craft, but it's vital to the success of any product. Especially when that product depends on a small army of free and open source contributors to develop, document, test, refine, deploy, and promote it, you're dooming your project to irrelevance without careful consideration of the support network you're curating or ignoring.