Java as a Web Technology is Living on Borrowed Time

08.04.2015

Java’s fall in market status from viable web technology to what it is today, undesirable legacy code, is astonishing. Oracle’s recent patch for Java’s zero-day flaw puts one more nail in the coffin. Zero-day status is assigned to flaws that are actively being exploited by hackers and for which no fix is yet available.

A little more than a decade ago, Java was the leading technology for delivering web applications. Tremendously popular, it was used for everything, including access to key enterprise applications, which happens to be the part of the industry I work in. But today Java is often categorized as legacy code that should be purged from the enterprise, at least as a web technology. So what happened?

Through my position as a product manager, I’ve gained some interesting insight into this topic. The two products I manage provide essentially the same type of access to mainframe users. One product, introduced in the early 2000s, supplies its services as a Java browser applet. The other, released last year, uses HTML5 and JavaScript.

I spend a good part of my work week talking with users and enterprise IT managers about what motivates them and why they like, or don’t like, my products and their related technologies. What has become abundantly clear is this: IT no longer believes that Java can solve the problems it was built to solve.

In the beginning, IT saw Java as a way to build enterprise applications that could be run without installation, updates, or device-specific requirements. Unfortunately, to realize this benefit, you have to install and maintain some pretty problematic infrastructure—the Java Runtime Environment (JRE)—on all participating devices. Therein lies the problem.

The JRE is one big maintenance and security headache for IT. The seemingly simple requirement to install and maintain the JRE on every end-user device reintroduces the problem that Java was originally supposed to solve.
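
For illustration, here is a minimal sketch of how a Java applet of that era was typically embedded in a page (the class and archive names are hypothetical). The tag renders nothing useful unless a compatible JRE and browser plugin are already installed, and kept patched, on the device:

    <!-- Deprecated applet embed: depends on a locally installed JRE plugin -->
    <applet code="TerminalApplet.class" archive="terminal.jar"
            width="800" height="600">
        Your browser does not have a working Java plugin installed.
    </applet>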

Of course, this isn’t new information. What’s new is IT’s view of web applications: not Java-based apps, but applications built with nothing but standard browser-native technologies, namely HTML5 and JavaScript. Together, these two technologies are now viewed as the preferred approach for delivering critical applications across and beyond the enterprise.

There’s a simple reason for the change in perspective: The HTML5/JavaScript approach requires no device-specific components beyond a modern browser. In other words, IT staff can serve up web applications to hundreds of thousands of users without having to touch any user devices. They need only maintain a dozen or so application servers. The browser-native approach reduces IT’s endpoint-management concerns by a factor of thousands.
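
By way of contrast, here is a minimal sketch of the browser-native approach, assuming a hypothetical WebSocket endpoint on the application server. The page runs in any modern browser with nothing installed on the device, and updates ship by redeploying the server:

    <!DOCTYPE html>
    <html>
    <head><meta charset="utf-8"><title>Terminal</title></head>
    <body>
        <pre id="screen"></pre>
        <script>
            // Hypothetical endpoint; the only client-side requirement
            // is a browser that implements the WebSocket API.
            var socket = new WebSocket("wss://apps.example.com/terminal");
            socket.onmessage = function (event) {
                document.getElementById("screen").textContent = event.data;
            };
        </script>
    </body>
    </html>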

In short, HTML5 now supplies what Java was supposed to: a simple, reliable way to provide enterprise-grade applications to end users without any desktop installation and management dependencies. Another bonus: HTML5 is device-agnostic, so IT staff can easily expand application support to the new class of mobile devices pushing into the enterprise. Business needs become the focus, rather than device management.

The shift from Java web applications to native HTML web applications is already happening. While not all the shops I talk to have a transition plan in place, they all agree they need one.

What is your plan?

Learn more about what Micro Focus has to offer for an HTML5 solution.
