When Java first arrived, the initial hype was all about “write once, run anywhere”. Its language is a mess of compromises born of its heritage as a language for embedded machines and a desire to keep C++ programmers happy. Once people got over the novelty of inherently portable code, the attention fell (initially favourably) on Applets, followed by protests against the non-portable nature of AWT. Penetration outside of the web was minor.
Swing addressed some of the flaws of AWT, and the introduction of the Servlet API led to Java being used in more enterprise environments – the Servlet API was by no means perfect, but coupled with Java’s threading model and its portability, it made Java an attractive choice for web-based applications. With the various J2EE technologies, Java really took off in the enterprise space, and the tools started to follow – no longer did we have to make do with inferior tools such as Visual Cafe or Parts, but could now use free IDEs such as NetBeans and Eclipse, and commercial tools like IntelliJ, JProbe and the like.
Now, Java is vying for control of the enterprise space with Microsoft’s .NET, with many companies being a ‘mixed shop’ – a host of interoperability technologies making the use of both feasible. Java started off as a language which took a few good ideas, a few old ideas, and a few bad ideas. It took APIs and tools to push it into the enterprise space, and this didn’t happen overnight, but each advance made the next advance easier.
Ruby is a language which has been around for about ten years now, but outside of perhaps Japan has made few inroads into traditional computing environments. Many put this down to its dynamically typed nature (and use this as a reason for Python’s similarly slow adoption compared to that of Java) – and perhaps, to a point, that is true. In a less mature environment, where testing is not a key concern, compile-time safety might be an important safeguard – however, people are frequently finding static typing more of a restriction than a benefit.
So what else might be to blame for the lack of success of languages such as Python and Ruby? Unlike Java, neither has had the killer APIs and the killer tools to differentiate themselves enough from the competition. The Java APIs weren’t perfect, but as a package (taken together with the backing of commercial interests) they differentiated themselves enough from the alternatives to make them a success.
In many ways there are a series of vicious circles at work here that are slowing the adoption of alternative languages such as Ruby and Python. Without knowledge in the workforce, companies won’t adopt a new language, therefore they won’t use the new language, therefore there won’t be a workforce who know the new language. Without the tools, the cost of using the new language is greater, therefore fewer people will use it, therefore there is less of a market for tools to support the new language. And without a clear differentiator as a reason to adopt the language, the language doesn’t get adopted as widely, therefore the possibility of a clear differentiator arriving is reduced.
As any fan of systems thinking will tell you, one of the ways to turn a vicious circle into a virtuous one is to inject some new variable into the equation. Ruby and Python need to start seeing some decent tools and APIs that mark them apart from the others. And it’s possible, _just_ possible, that Ruby might have found its first killer app in Rails. But for Ruby to make more inroads, it has to be the start of a process that will see further tools, libraries and a knowledgeable workforce follow.