page updated on December 31, 2020
The early days of the (public) Internet felt like living in the science fiction world of a Bruce Sterling or William Gibson, whether finally being able to download a web browser with image support (Slipknot?) or realizing that you were logged into a machine in Australia from the middle of farm country USA and it was responding to your FTP commands.
To everyone who'd learned how to type on an old PS/1 computer running WordPerfect for DOS—with the little plastic overlay giving you the keyboard shortcuts for formatting hints—and for whom Alt-F3 "Reveal Codes" made word processing make sense, HTML was the easiest thing in the world to learn (even if the best design you could do was center things with tables).
Your author remembers the first time he came across a web page which changed when you reloaded it. Some university student had set up the official homepage for a band and every new page view displayed a different snippet of lyrics. These web pages weren't simple documents. They were programs. This was science fiction. This was living in the future.
Then came forms and server-side processing and your author graduated from college and learned Java and scrounged together spare parts at work to build his own Linux machine and learned that the Internet wasn't magic science fiction. It was all programs and protocols, and the greatest magic protocol of them all at that time was HTTP, especially when combined with the new HTML extensions for form submissions and the Common Gateway Interface that let a web server launch programs to process form parameters and produce a fresh batch of HTML.
CGI programs could be anything. You could write them as shell scripts on an HP-UX or IRIX box. You could write them in C. Netscape tried to make money selling server software where you could write them in JavaScript (and oh how everyone laughed).
Like almost everyone in the late '90s during the Internet boom, your author learned Perl.
Perl was already everywhere, at least on the serious machines you wanted connected to the Internet. (Microsoft was still struggling to understand why you'd want to connect to the Internet. Then again, in the 2010s, Microsoft started to wonder why you would want a computer which didn't behave exactly like a tablet. Legions of gamers cringe as they consider a Halo 5 where you must tap your TV screen to fight off the Covenant and Prometheans.) Perl was everywhere because it was a good language for system administration, bridging the gap between C and shell. Shell was a terrible language, mostly because it didn't run the same way everywhere and it didn't handle things like variables or functions very well.
Perl was everywhere, at least in the Unix world. It was good at manipulating strings. Back in those days, HTTP and HTML were all stringy protocols. You get a string, you parse it, and you emit a string. It was a mess, but it was a simple mess and it was a small mess, so you could get away with it—more importantly, you could get your work done before you figured out where you'd gone wrong with string handling in C.
Perl was on version 5 of the language, as exemplified by the second edition of a book called Programming Perl. You could walk into a room full of techie cubicles and find that book on at least one desk, all dog-eared and full of Post-its, because in any group of a couple of dozen engineers/programmers/techies in those days, one of them was running a little webserver on a Linux box under his or her desk and odds are he or she was using Perl.
The code wasn't great. A lot of it was copied and pasted and modified and downloaded and modified again from the kind of technical support forums that StackOverflow has mostly thankfully (and almost entirely humorlessly) supplanted.
But it worked.
Perl did have the advantages of momentum and ubiquity and efficacy in those days. Then it lost: competing languages ate its lunch. Why?
You can make the argument (and you're correct to do so) that people didn't learn Perl for the sake of learning Perl. You can also make the argument that people didn't really learn Perl at all—they learned enough to modify programs that almost worked right to work a little bit righter. You can argue that, but you're missing the important point: that's what people always do. It looks different on the web these days with bigger frameworks and more choices and Google and StackOverflow replacing internet relay chat and random web forums, but it's not materially different now than it was then.
Goodness knows, Perl's core documentation is an updated version of the venerable Camel book: thousands of pages of instruction on how the language works and how to use it, at the fingertips of everyone who used it. If you want a book, you can buy one, but the equivalent already ships with the language.
Some people argue that learning Perl is easy, with the right training, such as O'Reilly's Learning Perl or Modern Perl 4e, but that argument relies on improvements made in the 2010s, after Perl had started to stop winning.
Perl as a language had its quirks and flaws. It still has some of those flaws and it definitely has those quirks. Perl's ease of calling external binaries from within programs made it great for gluing things together, but there were multiple documented cases of security holes and unintentional bugs from people using backticks in scalar or list context without knowing the difference, exposing sensitive information along the way.
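A sketch of the kind of code that caused trouble (the quota command and its arguments are illustrative, but the pattern was everywhere):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # The classic mistake: interpolating unvalidated input into backticks.
    # If $username arrives as "alice; cat /etc/passwd", the shell happily
    # runs both commands.
    my $username = $ARGV[0];
    my $report = `quota -v $username`;

    # Context changes what backticks return, too:
    my $as_string = `ls -l`;    # scalar context: one string with newlines
    my @as_lines  = `ls -l`;    # list context: one element per line

    # The safer idiom bypasses the shell entirely by passing a list:
    open my $fh, '-|', 'quota', '-v', $username
        or die "Cannot run quota: $!";
    my @safe_report = <$fh>;
    close $fh;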
People writing code by copy and paste and modify didn't learn the nuance of the Perl philosophy which makes context important, in part because people writing code in that fashion don't learn the philosophy of any language and in part because no Perl tutorial really explained the philosophy of the language in an accessible fashion until 2010's Modern Perl.
Is learning Perl hard? No. The hard way to learn Perl is to avoid learning it at all; learning just enough to be useful is dead easy.
The execution model of CGI programs was simple and awful in its simplicity: the web server would launch a new, separate program and provide it with all of the data received from the web browser. It would listen for data produced by that program and send that data back to the web browser. That doesn't sound awful, but if you're reading this, you've probably already replaced a couple of smartphones, each with many times more memory and processor power than the web servers of those days had.
Launching a new program for every page requested was a little bit expensive once your site became a little bit popular.
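If you've never written one, a minimal CGI program is just this (a sketch; the output is illustrative):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # The web server launches this program anew for every request,
    # passing request metadata in environment variables...
    my $method = $ENV{REQUEST_METHOD} || 'GET';
    my $query  = $ENV{QUERY_STRING}  || '';

    # ...and sends whatever appears on standard output back to the
    # browser: headers first, then a blank line, then fresh HTML.
    print "Content-Type: text/html\r\n\r\n";
    print "<html><body><p>You made a $method request";
    print " with query string '$query'" if length $query;
    print ".</p></body></html>\n";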
Even though that model of operation is conceptually straightforward to understand (just launch a program), configuring your web server to execute these programs (written in any language by any user) wasn't as easy as it sounds. If you had a good system administrator, he or she would have set up the web server to give every user a specific directory in which to put these CGI programs where they'd run but not expose the world to security holes. If you had a Linux box under your desk and you were your own system administrator, you had to learn about directory structures and permissions and paths and the details of how your web server was compiled and configured. Again, it was relatively straightforward, but there was a lot to learn and there were a lot of ways things could go wrong.
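For Apache httpd, the relevant configuration looked something like this sketch (paths and file extensions are illustrative):

    # Map /cgi-bin/ URLs to a directory of executable programs.
    ScriptAlias /cgi-bin/ "/var/www/cgi-bin/"

    # Or let each user run CGI programs from a directory of their own.
    <Directory "/home/*/public_html/cgi-bin">
        Options +ExecCGI
        AddHandler cgi-script .cgi .pl
    </Directory>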
None of this was Perl's fault. It was a consequence of the Unix model meeting the Internet and a lot of intelligent people figuring out the simplest way to get something working quickly without necessarily thinking about the best way to build something which would have to last for 20 years and counting. (See also DNS, SMTP, FTP, NNTP, IMAP, IMAPS, SSL, TLS, et cetera.)
Web programs were easier to write in Perl than they were in C because Perl, as a language, made them easier, but the dominant execution model of Perl programs was slow and expensive and difficult to configure.
A web server is just a program, however. If you modified the program, you could make it do anything—including connecting it with a different programming language. In fact, you could put Perl in the web server so that you could simplify the configuration of these programs and not pay the price of loading every program anew for every request.
This is what mod_perl did. Of course, mod_perl was part of making the Apache httpd web server more configurable by giving it a plugin architecture, where you could write extensions to customize every part of the web server's request and response cycle. As a result, mod_perl became less a way to make writing and deploying Perl programs simpler and faster and cheaper, and more a way to write your own extensions to Apache httpd in Perl instead of C (because, again, Perl was a better language for web programming than C).
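A handler written against the (later) mod_perl 2 API gives the flavor; the module name and greeting here are made up:

    package My::Greeting;
    use strict;
    use warnings;

    use Apache2::RequestRec ();    # the request object
    use Apache2::RequestIO  ();    # lets us print to the client
    use Apache2::Const -compile => qw(OK);

    # This code runs inside the web server process itself; nothing is
    # launched per request.
    sub handler {
        my $r = shift;
        $r->content_type('text/html');
        $r->print('<p>Hello from inside Apache httpd!</p>');
        return Apache2::Const::OK;
    }

    1;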
This was great, until it wasn't. It addressed part of the problem but it didn't address other parts of the problem. Installing and configuring mod_perl wasn't trivially easy. It was downright difficult in some cases. You could get a speed boost from using it, by trading some memory up front for less memory used later, but every program attached to mod_perl could potentially access the memory space of every other program attached to mod_perl, even if both programs belonged to different users.
mod_php solved this problem. PHP was a terrible language back then, good mostly for making very simple templates which represented web pages. PHP's memory model fit mod_php better, though. PHP's deployment model was a lot simpler, too. No one wanted to write Apache httpd extensions in PHP because it was a template language, so there was no pressure to make mod_php anything other than a very simple template processor which happened to be easy to deploy.
In fact, it was so easy to deploy that people used it. System administrators set it up because it was so easy to do they couldn't not do it. People began to answer questions about "How do I make a dynamic web page?" with PHP answers because they mostly didn't have to answer the question "How do I configure my web server to deploy this CGI program I wrote in PHP?" and because they never had to answer the question "How do I bribe my system administrator to enable CGI execution on our web server?"
mod_php may not be the best way to deploy programs in 2020, but it was the easiest way for non-programmers to deploy programs in 1999 or 2000 and it's still difficult to beat. (Some people may argue that deploying with Docker or Kubernetes or Heroku or AWS Lambda is even easier in 2020, but people in 1999 or 2000 didn't have to learn CVS, let alone API keys, software as a service, dependency manifests, or branch management in distributed version control systems, to deploy simple applications. They used FTP to drop a .php file in a shared folder.)
PHP didn't get much respect because it wasn't a real programming language—and why would you want to use software kept deliberately simple, capable only of implementing a templating language, when you could use software that let you write your own plugins to customize the entirety of Apache httpd's request and response cycle? Not that you would, but you could?
The other question people never asked was "how long does it take to learn Perl"? For those who needed it, just long enough to do a job. For those who didn't want to put in the time, it didn't matter.
Perl 5 also had a distinct advantage in the CPAN, a massive library of free software you can download, use, modify, and deploy in your own programs. Perl held this advantage for a long time. Perl advocates long argued (and some presumably still do) that the large library of the CPAN is the reason to use Perl.
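The whole pitch fit in a few lines of code. LWP::Simple is a real, long-lived CPAN module; the URL is illustrative:

    # Install once from the CPAN, then use it everywhere:
    #   $ cpan LWP::Simple
    use strict;
    use warnings;
    use LWP::Simple qw(get);

    my $html = get('http://example.com/')
        or die "Could not fetch the page\n";
    print length($html), " bytes of HTML, no socket code in sight\n";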
This may have been true at one point, but it's an argument that gets weaker over time. Competitive languages compete in many sectors, and dominance in one doesn't mean dominance in all.
A module ecosystem is an advantage when it contains what you need to get your job done. If you need a library and it doesn't exist, you have a few choices. You can do without. You can switch to another language or toolkit which has that library. You can build your own. If a language community has enough skilled people with the time and desire and inclination and permission to write their own libraries, the module ecosystem will grow. When there aren't enough people (the ecosystem is too new or the ecosystem is in decline), those libraries won't get written.
In the olden days, you could expect to find a Perl module for most anything you wanted to do: scrape Google, buy or sell things on eBay, check your Amazon rank, whatever. You could also find Perl modules to format text, work with ancient mainframe data specifications, manipulate images in weird formats, whatever. You could find these, because there were enough people using Perl in absolute numbers that the small percentage of users capable of and interested in producing these modules would do so.
You need large absolute numbers of users to grow a library ecosystem. That's why library ecosystems for languages like Python, Ruby, Node.js, and Go have grown large in recent years. They may have started with fewer modules than existed on the CPAN at the time, but given enough users in absolute numbers, you'll eventually get the libraries you need.
Keep in mind one other piece of nuance: the modules written in 2020 aren't necessarily the modules written in 1999. It's less useful to have a module which understands the Gopher or Archie protocols in 2020 than it was in 1999, whereas it's unlikely that anyone in 1999 would have been prescient enough to produce a Perl module which understands Stripe's JSON over REST API. In other words, in any given year you tend to get the modules that people need right then. Just because CPAN was dominant in 1999 or 2004 doesn't mean that it has what people want in 2020.
Ted Neward makes the point that libraries have to get written, and it's a good point, but writing a language-specific wrapper around a C library (as one did in the 20th century) is different from writing an API client that understands REST, HTTP, GraphQL, XML, and/or JSON (as one does in the 21st century). Perl's ability to manipulate text is a lot less of an advantage when data interchange takes place in well-understood and highly structured ways, and Perl's ability to speak Unix and C at a low level is a lot less of an advantage when RPC is more important than in-process linking.
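The 21st-century version of "write a library" looks more like this sketch: an HTTPS-and-JSON API client built from modules that ship with Perl itself (HTTP::Tiny and JSON::PP have been core since 5.14). The endpoint and response fields are illustrative:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTTP::Tiny;
    use JSON::PP qw(decode_json);

    # HTTPS support requires IO::Socket::SSL to be installed.
    my $response = HTTP::Tiny->new->get('https://api.example.com/v1/status');
    die "Request failed: $response->{status} $response->{reason}\n"
        unless $response->{success};

    # No string-parsing heroics: the structure is in the JSON.
    my $data = decode_json($response->{content});
    print "Service status: $data->{status}\n";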
The concern gets more subtle as you consider the lifecycle of products begun at different times. A new project started in 2020 is likely to use different components (and protocols and design patterns) than a project begun in 1999. For example, the ORMs available in 1999 were... rare. The web frameworks available in 1999 were... rare. The user management or credit card payment systems available in 1999 were... well, they weren't Mozilla's Persona or Stripe.
The modules used by a project first begun in 1999 and maintained until this day will probably not reflect the modules used by a project first begun tomorrow. (It's possible to continue to refactor the design and implementation of an existing project such that it evolves through the years to take advantage of new techniques and new opportunities, but ask yourself how often that really happens and how much work you're willing to put in to convince business interests and developers to do so instead of eventually scrapping things in favor of a fresh rewrite in a new language or at least a new architecture.)
The implications for a programming language are difficult to prove conclusively, but simple to explain: an ecosystem focused more on maintaining existing projects than creating new projects will be less attractive for new projects. This is even more true socially than it is technically: an ecosystem with buzz ("Should you write your next project in Node.js? How about Swift? Go?" on the cover of your favorite IT rag) is more attractive than one you first heard about 15 years ago ("Yep, still using C++ at Google.")
There's a corollary that seems right, but it's a little lazy. What's the right language to use? Why it's one that's already used in the majority of businesses!
What language is that? Just about every language, depending on how you define "use" and "majority" and "businesses". Reject that framing: "commonly used" conflates how many places a language shows up with how much of the work it actually does, and those aren't the same thing.
New developers—dabblers and dilettantes, even—don't necessarily care about the same things experienced developers do. An experienced developer ought to think about safety, about simplicity, and about the long term maintenance costs of design decisions. Dabblers want to get things done now, as easily as possible, without having to learn more than they need to. A week spent figuring out how to configure a web server is a week wasted, when they could be on to the next thing without their boss breathing down their necks.
Ease of getting started is big. Ease of finding modules or plugins or whatever kind of code you can download and modify is really important. The Perl community did a good job of this early on (and CPAN still has advantages in testing and documentation standards that aren't easily replicated), but Perl lost when it switched to maintenance mode.
Laziness isn't a virtue when trying to curate a community. It's a vice. Laziness is something a community has to cultivate on behalf of community members. Releasing a module for a community as free software is a lot of work, but it enables a greater laziness among other people. So does reporting or fixing a bug or writing documentation. Enabling the laziness of others is an act of love.
Open source proponents want to believe that enlightened self interest will eventually somehow fill in the gaps such that your laziness can be enabled by my hard work and vice versa. (One can point to marquee projects such as the Linux kernel as an example, but one can just as easily point to tens of thousands of less successful projects as examples that making this work is very, very difficult. Yes, Usain Bolt is fast, but 100% of people in the world aren't as fast as Usain Bolt, to a great degree of accuracy in decimal placement.)
Enlightened self interest—especially if that self interest depends on mundane concerns such as "I have a day job" and "I need this for my day job" and "I need my day job to pay my mortgage and buy food for my children"—focuses on specific concerns related to what you're doing right now. If you're creating new things, your concerns might be modules for new technologies and techniques that other people want right now. If you're maintaining existing code, then your concerns might be refining existing things that other people might have wanted when you started your project those many years ago.
In other words, an ecosystem will slowly become less attractive as a base of new projects unless it has a continual focus on developing new projects. (This doesn't even take into account the difficulty of hiring good developers—without an attractive ecosystem for a language, how can you hire junior developers?)
Don't neglect the ease of getting started or the ready availability of support for IDEs and development tools. You may or may not like how VSCode took over the world, but if your preferred language doesn't have LSP plugins, it's behind.
Some people make the argument that Perl was always a stopgap technology because of its syntax and semantics. Ignore that argument. Consider the dominance of PHP and especially JavaScript. Neither language is an example of careful language design, nor does either language demonstrate recent research into programming languages. (One might joke that both languages are stuck in the computer science of 1972 at best. One would be wrong to do so, but one could make that joke.) Yet in both cases, technical quality is a secondary consideration to product market fit. Both languages succeeded because they're everywhere and both languages continue to be everywhere because they are everywhere.
(PHP? Everywhere? Yep. WordPress is still a big deal and the ease of deployment of PHP is still a big deal.)
Perl dominated Unix by 1998 because it had a better product market fit than anything else, the same way that the first thing you did when you booted your shiny new Sun workstation was to install the GNU tools, because they were just better. (Sorry, BSD fans. Even something as simple as sort -h is the kind of usability improvement that made it worth avoiding the builtin SunOS stuff, even if Sun Studio weren't both more expensive and less useful than GCC.)
All that changed when Perl forked itself in July 2000.
It's easy to criticize Perl 6 20 years later. It's more important to discuss why the failures of Perl 6 doomed Perl, at least so that other languages can attempt to make new and interesting failures.
The problem wasn't apparent in 2000. The solution seemed obvious: rather than let Perl stagnate, start a new project with a lot of excitement and harness the community's energy to make improvements which would allow Perl to be a better platform for further new development. A new VM! Better defaults! Performance improvements! Simplicities! Better library support! These are exciting things.
Put yourself in the shoes of someone trying to choose a language for a new project.
It's August 2000. You've heard that Perl 6 will have a beta out in mid-2001. Cool! Maybe you'll start in Perl 5 and port to Perl 6 when the benefits of the latter are obvious. After all, they plan to have a compatibility layer, so you won't have to throw your code away.
It's mid-2001. There's no beta, but that's fine. You've heard that they plan to have something by the end of the year. It's disappointing, but it's okay. They're going to merge Perl 5.12 and Perl 6, so you're okay sticking with Perl 5 for now.
It's 2002. There's no beta. There's no delivery date for a beta. In fact, there's no delivery date for the specification. There's a Perl 5.8 on the way. Maybe that'll be worth playing with. Everyone knows that Perl 6 is the future of Perl, but no one can tell you what the migration strategy is from new code you write today in Perl 5 to use Perl 6.
It's 2006. Six months after the project fizzled out, the Perl 5 to Perl 6 bridge is finally announced as abandoned.
It's 2007. Maybe there'll be a 5.10 this year. Will Perl 6 ever replace Perl 5? No one can tell you, and you're getting a little suspicious that it's been 18 months away since July 2000.
It's 2009. Maybe they'll fix some of the bugs in 5.10 this year.
It's 2014. Are you still paying attention?
It's Christmas 2015, and there's a release called "stable". The world does not rush to unwrap its present.
It's October 2019, and the thing formerly known as Perl 6 is now Raku, which is a good thing for Perl and probably Raku, but the world is still in no rush to unwrap its present.
At least Arc had Hacker News.
With every year that passed, as Perl 6 produced more press releases than actual code, the attractiveness of Perl as a platform declined. Sure, it still had users. Sure, it still had people starting new projects. (The Modern Perl movement was a decent attempt to bring wider enthusiasm back into the ecosystem by dispelling some of the worst myths of the language. It modeled itself after JavaScript: The Good Parts without realizing that Perl lacked JavaScript's insurmountable advantage of ubiquity. Who could have predicted that Objective-C would be interesting again a year before the iPhone came out?)
What it didn't have was a clearly defined future, let alone an articulated one.
(The argument that Raku has no practical guiding theme is subtler, but not difficult to make. Specifically, the current design principle of Raku is to explore ways to modify the grammar of a programming language such that the language designers can make backwards-incompatible changes in subsequent versions without making programs written in those languages incompatible, while also allowing users to make local slangs, and all of this without adopting S-expressions.)
Defining the future of a language across the chasm of a version split is difficult and dangerous. Python has done an adequate job, and it will have taken 20 years from the initial announcements of Python 3000 in 2000 to the point at which the maintainers stopped maintaining Python 2.x at all in 2020. Python 3 came out and won. It finally replaced Python 2.
"But wait," some Perl advocates protest. "The marketing message of Perl 6 was, for a long time, that Perl 5 and Perl 6 were sister languages, where Perl 6 had no intent to supplant Perl 5 in toto and Perl 5 had no plan to cease to exist!"
The problem with that line of reasoning—besides the facts that it started ten years too late to matter and lasted far too long until the Raku renaming—is how it split the potential adopter base. The stated technical reasons for the existence of Perl 6 have not been addressed fully in Perl 5 and may never be addressed, so even if those flaws were not material to the concerns of potential adopters, they have for decades been direct and obvious concerns that take up mental space and present themselves as risks.
Worse yet, the question still arises as to what the Perl community as a whole would look like if Raku were to fulfill its promises. Would there be a compatibility layer, such as a revived Ponie project? When will Raku have the library support people have come to expect from Perl 5? Will there be the documentation, example code, tools, and knowledgeable support channels that a modern language needs in 2020? Where will new users of Raku come from and where will existing Perl developers go?
No one can answer these questions. There's just not enough information.
None of this is to say that it's impossible for Perl to reinvent itself again and start to gain users. It's possible that Raku or some variant will reach a point of usability and gain stability and maturity as a useful product, with documentation, libraries, support, and uptake.
Yet with every passing year, it's less likely, and the alternatives grow in number and desirability.
If you count its years as Perl 6, Raku is older than Perl was when Perl 6 was announced. Granted, fortunes can change quickly. Ruby had a brief run outside of Japan in 2000 with the publication of the first Programming Ruby book, then disappeared for a while until Rails brought it into the limelight.
Then again, at the original publication date of this article, the Swift programming language had been public for less than a week and already had more users than Perl 6, and Swift was entirely built on technologies which had been invented since the Perl 6 announcement.
How does a language win? By being compelling enough to be used for new things. It's not solely a technical concern; it's a concern of the language community and ecosystem.