Interview with Anil Dash of Six Apart

Your post ‘Web Development Trends for 2006’ clearly outlines the likely evolution of web development technologies. How will this make end users’ experience better?

I think the common thread that binds all of the technologies that are gaining ground is a respect for users’ time and the pleasantness of their experience. Even back-end technologies or things that seem very esoteric eventually translate into a better end-user experience. The way this is happening is by letting developers shift from features in web apps merely being *possible* to those features being straightforward enough that developers can spend time making them *fun*.

This is just like the shift to GUIs instead of command-line computer interaction. When computers had 4k of RAM, a graphical interface was a wasteful luxury. These days, we can visit sites where the favorites icon for bookmarking a site uses that much storage. And it’s all part of spending development effort on things that make people smile.

I’ve written a bit more about this in my post about Making Something Meaningful.

O’Reilly’s market trend data for computer books shows that Ruby is ahead of Perl, and that PHP’s book market is losing sales to Ruby. Do you think Ruby is becoming a replacement for, or a threat to, PHP and Perl?

I don’t think Ruby is becoming a replacement for PHP or Perl, or even Java, which it often gets pitted against in people’s comparisons. I work at a company that does most of its development in Perl, and we’re hiring and doing more coding than ever. I’m a (pretty shoddy) PHP coder myself, and I think PHP is so common as a part of infrastructure that it’s just taken for granted, which is a great sign.

The rapid upswing in purchase of Ruby books is a measure of the *interest in learning Ruby*, which is a great thing. Knowing more than one language makes you more fluent in the languages that you already know — I’ve seen that as someone who grew up in a bilingual household, and I’ve seen that as someone who’s transitioned from Basic on my Commodore 64 as a kid to throwing together quick-and-dirty Ruby scripts today.

Even if all those book buyers don’t end up coding Ruby applications full-time, the Rails ethos of convention over configuration, or the little language niceties that make Lisp fans such zealots all add up to making us all better coders, in whatever language we choose.
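To make “language niceties” concrete, here’s a small, hypothetical Ruby example of the kind of expressiveness that tends to rub off on how you write other languages:

```ruby
# Blocks and chained enumerators: a loop, a filter, and a transform
# collapse into one readable expression.
words = %w[perl php ruby java]
long_names = words.select { |w| w.length > 3 }.map(&:upcase)
puts long_names.inspect   # => ["PERL", "RUBY", "JAVA"]
```

Even if you never ship a line of production Ruby, writing code like this makes the equivalent map/grep idioms in Perl, or the array functions in PHP, feel more natural.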

The idea of one language making another obsolete is just ridiculous. If all you care about is getting attention, it might be fun to pit one language against the other as if it’s a zero-sum game. But aside from hype or sensationalism, I don’t know of anybody in the real world who only uses a single programming language all the time.

There’s a simpler point here, too. A lot of geeks use books to learn a new language, but look up reference materials on the web. So, I’ll crack open a Ruby book on a plane when I really want to learn something I’m unfamiliar with, but just go to $foo when I want to remember the syntax for a particular command. So book sales are a useful measure of which new languages people want to add to their repertoire, not of which languages are actually in use and deployed today.

With the widespread use of mobile devices, how do you see web development practices changing to enable the reach of the Web to be easily extended onto mobile devices?

I *hope* the change is that mobile is no longer an afterthought. Especially here in the States, developers often see a good mobile experience as a nice-to-have feature addition, for a 1.1 release or at some indefinite point in the future.

As the tools for creating these experiences get better, I hope toolkits are addressing mobile deployment natively. I see this already on platforms like Java and Flash, but it’s still not necessarily built in to the Ajax libraries, and mobile browsers often just plain don’t support contemporary web development techniques. In the long run, the only platforms that will succeed are those that accommodate _all_ users.

How has the use of frameworks, APIs and classes changed the way we develop applications? Are we going to see wider adoption in the coming years?

Again, I think this is part of a continuum of more and more efficient development that stretches back to the beginning of software development. Higher levels of abstraction have continuously been adopted by developers, and I’d guess that the overwhelming majority of developers are already using some kind of framework any time they’re creating an application that’s substantial.

The biggest positive change is that it’s easier to connect heterogeneous systems. I can simply connect almost any two applications across processor architectures, operating systems, database platforms, programming languages, and geographical distances, each of which used to be a big barrier in the past.
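A rough sketch of why that glue got cheap: a neutral text format such as JSON (one of several interchange options; the specific record below is invented for illustration) lets two applications that share nothing else agree on data:

```ruby
require 'json'

# Sender side: serialize a native Ruby hash into a neutral wire format.
order = { "id" => 42, "items" => ["widget", "gadget"], "total" => 19.95 }
payload = JSON.generate(order)   # this string can cross any OS/language boundary

# Receiver side: any JSON-aware platform reconstructs the same structure.
round_trip = JSON.parse(payload)
puts round_trip["items"].first   # => widget
```

The same hash could just as easily land in a Java or PHP application; neither end needs to know what the other is written in.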

The biggest negative change is that there are more developers than ever who don’t necessarily understand the assumptions and inner workings of their frameworks or abstraction layers. Whether they’re using code-building tools in Visual Studio or Eclipse, or designing Rails applications, some problems will arise whenever you can’t (or don’t want to) see “under the hood”. As long as developers keep a fundamental understanding of CS theory and solid programming habits as part of their culture, this problem won’t be insurmountable.

How has the developer’s toolbox changed throughout the years? Should we still rely only on a text-editor?

I think this varies greatly depending on the kind of work you want to do. I think people in the HotScripts community mostly come from the LAMP world, where everyone’s in vi or emacs, or on their desktop in UltraEdit, TextPad, or TextMate. Most hackers I know have customized these environments to the degree where they’ve effectively snapped together their own IDE anyway.

But there are just as many developers living in Visual Studio or Eclipse, and their experience of development is completely different from the LAMP text-editor world. That seems like an evolutionary step, at least for some kinds of developers.

For most people who are considered “developers” today, one of these options is fine. Interestingly, I think the biggest audience of people who could be considered developers are actually working in much more visually rich environments, with very little actual text editing as part of their workflow. Whether it’s recording macros in Excel or even something as simple as setting up complicated rules in an email client or configuring a programmable remote, the line between “real” programming and what end users do every day at home is a lot fuzzier than people think.

I’ve talked to people who give their Tivos and Roombas extensive, detailed sets of instructions, but think of themselves as completely non-technical. Fifty years ago, they would have been recognized as robot programmers.

With the rise of code generators and powerful application developer tools, it seems we are all developers now. What do you think of that approach?

I think the democratization of tools is almost always a good thing.

The best, least predictable innovations often come from people outside a discipline. And people who have a really unique contribution to make will never be threatened by the masses of amateurs who are just trying to scratch an itch.

If you take a look at the scripts on HotScripts, you’ll notice that every developer has his own way of developing applications. Don’t you think it’s time to adopt application-to-application communication standards?

I think it’s a great ideal, but aside from very broad standards like XML, or a few specialized but broadly applicable formats like RSS/Atom, I think it’s not particularly likely. For one, a lot of developers are guys with decent-sized egos, who always think they can make something “better” than what’s out there. We’ve all been guilty of it, but every time something is “better”, even if justifiably so, you’re abandoning whatever standards came before.

In addition, there’s a lot of necessary customization of formats because each application is unique. Tim O’Reilly has said that in the current generation of web applications, open formats and protocols are just as important as open source, and I think he’s right.

Which web development technologies currently promise the biggest opportunities for a career?

I think the most valuable skill isn’t any particular language or platform, but more of a focus on solving user problems or business problems instead of obsessing over technical details. I think all of us who are geeks love finding debates to have, like PHP vs. Perl or Ruby vs. Java or Windows vs. Linux, when the thing that really makes someone a valuable coder is knowing how to hear a request and translate it into code.

For specifics I’d pick these areas:

  • Scaling new technologies (call them Web 2.0 or Ajax or whatever) to true internet scale, for tens or even hundreds of millions of users. Right now, many of the applications deploying this technology are limited to only a few thousand users, or maybe a million users on the high end. It takes a change in discipline and architecture to support the more popular uses, and that’s going to be important and valuable.
  • Making sure mobile users have a first-class experience. Again, this is often an afterthought, especially in the U.S. Anything that can be done to make development for mobile devices simpler or even automatic for web developers is going to be in tremendous demand.
  • Integrating new applications with social media like blogs or wikis. I work for Six Apart, the blogging company, so I get a first-hand look at a lot of these things. What I see is that companies want to get their existing business applications to be able to publish information or output reports to blogs. At the same time, people want to share their information using blogs in their private lives, as well. So connecting those technologies together using feeds and blogging APIs is only going to increase in popularity.
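The feed plumbing in that last point is simpler than it sounds. As an illustrative sketch (the feed content below is invented), pulling item titles out of an RSS 2.0 document takes only a few lines of Ruby:

```ruby
require 'rexml/document'

# A toy RSS 2.0 feed standing in for a business app's report output.
rss = <<~XML
  <rss version="2.0"><channel>
    <title>Example Reports</title>
    <item><title>Q1 Sales Summary</title><link>http://example.com/q1</link></item>
    <item><title>Q2 Forecast</title><link>http://example.com/q2</link></item>
  </channel></rss>
XML

doc = REXML::Document.new(rss)
titles = REXML::XPath.match(doc, "//item/title").map(&:text)
puts titles.inspect   # => ["Q1 Sales Summary", "Q2 Forecast"]
```

The same feed could come from a blog, a wiki, or a report generator; once it’s RSS, the consuming side doesn’t care which.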

Finally, looking back on your blog post ‘Web Development Trends for 2006’ now, what do you think you would have said differently if you were to predict web development trends for the next 5 years?

That’s a tough question! I recently revisited some of the predictions I made last year.

There were some wrong calls, like my enthusiasm for E4X, but the things I think I’d change if I were writing for a five-year time frame have to do with that idea of focusing on understanding business/end-user needs, instead of merely being competent with new technologies.

I’d also place a lot more emphasis on the need for geeks to market themselves well. We all tend to have this belief that just being very talented or doing good work is enough to get your name out there, or to make people understand the value of your code. And that’s just plain not true — think of all the situations where an inferior technology wins. Usually when that happens, it’s because it gets promoted more effectively or distributed more broadly, or because people are enthusiastic about recommending it.

In all of those cases, what you’re seeing is the impact of understanding marketing. If you’re a true geek and the idea of “marketing” makes you itch, just think of it as “communicating better”. Most of the time, the words that programmers use are interpreted by non-technical people as if they’d received a proprietary binary format. Marketing your expertise or technology effectively means communicating about your code in a way that’s open, standardized, well-documented, and easily discoverable. It’s just like working with data that’s clean and normalized — it lets you focus on the more interesting stuff, the actual applications.