Java is Bad for You
by Ian Mallett

====INTRODUCTION====

A distressing trend in Computer Science pedagogy is to increasingly concentrate on Java to the exclusion of (almost) all other languages.

This reflects an underlying shift, and a self-reinforcing cycle, in the software industry: software engineering is slowly switching over to Java and other "newer", "non-legacy" languages.

There is some merit to this. Compared to C++, for example, Java development can be faster, especially with low-talent programmers. Additionally, as a prototyping framework (e.g., a rough GUI or a quick proof-of-concept), Java may be useful. Java is also widely (although not universally) supported.

But there is a darker side.

Most obviously, Java is not the ideal language for everything, despite what its creators and less-informed neophytes may contend. But I'm not here to argue that triviality. I will claim that Java is not merely a suboptimal language in many cases, but an actively poor one in most. This is a stronger result.

The following discourse is primarily factual rather than anecdotal. I try to adopt a neutral tone, since this is intended to be as informational an opinion piece as possible. There is harsh criticism within, but it is intended to be factually based.

====LANGUAGE RESTRICTIONS====

Java restricts language features unnecessarily to make things easier for beginning programmers while not offering any viable replacements. This hurts expressiveness, performance, and clarity. Java is also inconsistent in doing so.

For example, Java does not support destructors. This makes good resource management (especially the hallowed RAII pattern) difficult if not impossible. In advanced Java programs, you'll see programmers calling "init" and "deinit" methods explicitly to work around the issue (although this becomes very ugly when combined with exceptions—one of the major applications and advantages of automatic destruction in other languages, notably C++). More commonly, programs instead count on resources being freed by the garbage collector and finalizers (which doesn't always happen in a timely or correct manner). Worst of all, many Java programmers advocate using try/finally to (kinda) force cleanup. Using exception machinery for non-exceptional control flow? Makes me want to cry.
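
For concreteness, here is a minimal sketch of that explicit init/deinit pattern; the NativeBuffer class is hypothetical, standing in for any resource that must be released by hand:

    // Sketch of the manual init/deinit pattern described above. "NativeBuffer"
    // is a stand-in for any resource (file handle, socket, native memory)
    // that the garbage collector cannot be trusted to release promptly.
    class NativeBuffer {
        private long handle;
        NativeBuffer(int size) { handle = acquire(size); }   // explicit "init"
        void release()         { free(handle); handle = 0; } // explicit "deinit"
        private static long acquire(int size) { return size; } // stand-in
        private static void free(long h)      {}               // stand-in
    }

    class BufferDemo {
        public static void main(String[] args) {
            NativeBuffer buf = new NativeBuffer(1024);
            try {
                // ...use buf; any exception thrown here must not leak it...
            } finally {
                buf.release(); // runs whether or not an exception was thrown;
                               // forget this at any call site and the resource
                               // leaks silently
            }
        }
    }

In C++, a destructor would do this automatically at scope exit; here, every single call site has to remember.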

Nor does Java support copy constructors as a language mechanism. This causes endless grief when trying to duplicate objects consistently. In Google's GWT, which I once had the misfortune to be required to use, I encountered a situation where I needed to make a temporary copy of some object. Since most of its instance variables were private, I had to spend weeks crufting a module that would execute a combination of method calls to coax the object's internal state into the right form. The result was buggy, brittle, and a massive kludge no one could be proud of. It worked, but the point is that a copy constructor would have solved the problem naturally, with one line. It was a massive waste of my time. Inconsistently, some built-in Java classes (String and most of the collections, for example) do ship with copy constructors, but the language provides no general copying mechanism, nothing obliges a third-party class to offer one, and the official alternative, clone(), is notoriously broken.
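
To illustrate, here is what a hand-written, convention-only copy constructor looks like in Java (the Particle class is hypothetical); the catch is that only the class's author can write it, since only the author can reach the private fields:

    // A hand-rolled copy constructor for a hypothetical class. This works,
    // but only because we control the class; nothing in the language
    // obliges a third-party class (like the GWT object above) to offer one.
    class Particle {
        private double x, y;
        private double mass;

        Particle(double x, double y, double mass) {
            this.x = x; this.y = y; this.mass = mass;
        }
        Particle(Particle other) { // the "copy constructor", by convention only
            this.x    = other.x;
            this.y    = other.y;
            this.mass = other.mass;
        }
        public static void main(String[] args) {
            Particle p = new Particle(0.0, 1.0, 2.0);
            Particle q = new Particle(p); // a faithful copy: all fields copied
        }
    }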

As another example, Java does not support multiple inheritance. It has a cheap cop-out called the "interface", which is essentially a C++ abstract class with every method left pure-virtual. Since interfaces cannot carry method implementations, they have little use. I commonly see them used for no gain whatsoever in Java programs by those who don't know what they're doing; in the work of the best, I seldom see them used at all. The applications of real multiple inheritance are admittedly few, but when you don't have it, it hurts.
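
A sketch of the boilerplate in question, with hypothetical types (note this reflects the language at the time of writing; Java 8 later relaxed it somewhat with "default" methods):

    // Interfaces declare, but cannot implement. Every implementor
    // re-writes the shared logic from scratch; nothing is inherited.
    interface Drawable {
        void draw(); // declaration only; a body here is a compile error
    }
    class Circle implements Drawable {
        public void draw() { /* full implementation here */ }
    }
    class Square implements Drawable {
        public void draw() { /* ...and an unshared one again here */ }
    }
    class DrawDemo {
        public static void main(String[] args) {
            Drawable d = new Circle();
            d.draw(); // dispatches to Circle's implementation
        }
    }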

As another example, Java does not support goto (if you think goto is always bad, read this). It does have named break statements, which cover some use cases of goto, but the lack of an explicit statement is still a shortcoming (although admittedly among higher-level languages, a common one).
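
For reference, the named-break construct looks like this (a minimal, self-contained example):

    // Java's labeled break: a structured stand-in for some uses of goto.
    class Search {
        public static void main(String[] args) {
            int[][] grid = { {1, 2}, {3, 4} };
            outer: // the label names the loop to break out of
            for (int[] row : grid) {
                for (int cell : row) {
                    if (cell == 3) {
                        break outer; // exits BOTH loops at once
                    }
                }
            }
            // control resumes here; jumps not structured around a loop
            // (e.g., a forward jump into shared cleanup code) remain
            // inexpressible
        }
    }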

As another example, Java does not support operator overloading. Say I wanted to write my own BigNum class. I would be stupid not to use operator overloading. Instead, Java forces one to write pathological expressions like "new BigNum(4).exponentiate(51).mod(6).subtract(1)". You laugh, but I have often seen method chaining of such bletcherosity in production code. Java is inconsistent on this too! The operator "+" is overloaded for the "String" class. There is no way for user code to do that.
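
The standard library's own java.math.BigInteger exhibits exactly this chained style (my BigNum above is a stand-in for it); the commented line shows roughly what operator overloading would permit instead:

    import java.math.BigInteger;

    // The arithmetic above, spelled out with the real java.math.BigInteger.
    class Chained {
        public static void main(String[] args) {
            // computes (4^51 mod 6) - 1:
            BigInteger r = BigInteger.valueOf(4)
                                     .pow(51)
                                     .mod(BigInteger.valueOf(6))
                                     .subtract(BigInteger.ONE);
            // with operator overloading, this could read:
            //     r = pow(4, 51) % 6 - 1
            System.out.println(r); // prints 3
        }
    }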

As another example, Java does not support pointer arithmetic. I understand why, of course, but the inability to control memory directly hurts both performance-wise and philosophically. Java's choice to represent pointers as references is perhaps commendable as a simplification, although the semantics are slightly baroque. Among other issues, it precludes aliasing guarantees like C99's "restrict" keyword, for which Java has no equivalent.

As another example, Java does not support submethods (functions defined within functions). Many expressive languages support these, including even C (albeit through compiler extensions). As Java is much higher-level, I was surprised that it does not. It has been a feature request for a while, but nothing has come of it.
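
The usual workaround is to wrap the would-be submethod in a local class, as in this hypothetical example:

    // Sketch of the local-class workaround for Java's missing submethods.
    class Outer {
        static int sumOfSquares(int[] xs) {
            // "int square(int x) {...}" cannot be declared here directly;
            // a local class supplies the scaffolding instead:
            class Helper {
                int square(int x) { return x * x; }
            }
            Helper h = new Helper();
            int total = 0;
            for (int x : xs) {
                total += h.square(x);
            }
            return total;
        }
        public static void main(String[] args) {
            System.out.println(sumOfSquares(new int[]{1, 2, 3})); // prints 14
        }
    }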

As another example, Java does not support templates. Java does support "generics", which are a weak and broken version with which metaprogramming is impossible. Generics play especially poorly with inheritance, but the point is that they are implemented by type erasure, not actual template instantiation. There's a difference.
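
Type erasure is easy to observe directly:

    import java.util.ArrayList;
    import java.util.List;

    // Demonstration of type erasure: the type parameter does not survive
    // compilation, so both lists share a single runtime class.
    class Erasure {
        public static void main(String[] args) {
            List<String>  a = new ArrayList<String>();
            List<Integer> b = new ArrayList<Integer>();
            System.out.println(a.getClass() == b.getClass()); // prints: true
            // Consequently "new T()", "new T[n]", and
            // "x instanceof List<String>" are all compile errors;
            // a C++ template instantiation would allow analogues of each.
        }
    }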

The point is that most of these language features were deliberately omitted for no other purpose than to simplify coding for the lay-person. That's nice for the lay-person, but as a real software developer, the lack of these generally very basic features constantly gets in one's way.

I actually do frequently receive that argument—that somehow this deliberate crippling is actually a Good Thing because it's impossible to abuse a language feature that isn't there. No. If a language feature is abused, that doesn't make the language feature bad. It makes the programmers who abuse it stupid. Any language feature might be abused. Inheritance is a language feature. The fact that Java programmers in particular do abuse inheritance more often than C++ programmers isn't any more reason to remove inheritance from Java than the "+" operator being overloaded for long division would be an argument to remove addition from Python. Beginning programmers don't know about, and get along happily without, advanced language features. They don't need to know everything at the outset. Removing perfectly good features means that everyone will be limited later.

====PERFORMANCE====

Java is also slower than C/C++. If you don't believe me, read this. The performance argument isn't as relevant on today's machines, but it's still a consideration—and often a deciding factor for high-performance computing.

====COMPUTATIONAL CULTURE OF JAVA PROGRAMMERS====

In my experience, pure Java programmers are often far too ignorant of how the machine actually works to get a deep appreciation of or genuine competence with advanced algorithms. Since Java manages almost everything through classes, Java programmers frequently have minimal knowledge of how resources are allocated and deallocated. The lack of a clear distinction between Java references (pseudo-pointers) and the objects they reference in Java pedagogy too often leads to a conflation of these ideas in the minds of Java coders.

For example, Java's interfaces exist for no real purpose, but Java programmers use them constantly. Java's lack of operator overloading encourages long, one-line method chains. Java's heavy use of exceptions for semi-regular control flow contributes to code bloat and messy jumps through multiple levels of exception handlers. To its credit, Java does support the "final" keyword (with semantics similar to C's "const" on a variable), but the internal culture of Java programmers makes its use almost nonexistent.
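
For reference, here is the "final" keyword in question, and the limit of its const-ness:

    // Java's "final", loosely analogous to a C "const" variable.
    class Finals {
        public static void main(String[] args) {
            final int answer = 42;
            // answer = 43;          // compile error: cannot reassign a final
            final int[] xs = {1, 2, 3};
            xs[0] = 9;               // legal! only the reference is final,
                                     // not the array it points to
            System.out.println(xs[0]); // prints 9
        }
    }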

As for resource management, many students I teach who come from a Java background are baffled when I tell them that file descriptors should be closed, or that dynamic memory needs to be freed. "Why do I have to free memory? I never free memory in Java and it works just fine."

In short, the closed culture Java propagates leads to and exacerbates these worst practices. As I mentioned, the lack of destructors in Java's classes all but precludes the RAII design pattern. Most Java programmers, learning primarily from each other, simply don't realize at an intuitive level that the resources their classes consume are finite, because the Java language provides no consistent mechanism to represent that. Ignorance of memory appears to be an inherited trait, and cache-naïve algorithms are becoming de rigueur.

Teachers imparting this kind of willful ignorance to their students are not merely disrespecting the grand computational history that precedes them, but are quite frankly being irresponsible. Despite the shockingly broad mass delusion to the contrary, resource-conscious programming is more important than ever. Cores are shrinking and multiplying. There are more levels of cache, and new massively-parallel computational and storage architectures require ever-tighter coordination between applications and the resources they consume. Chipsets are being invented that allow new kinds of communication between cores, beyond merely generalizing a shared cache, while at the same time the line between a single processor and a single core blurs under ever-finer fabrication. Single dies already incorporate multiple CPUs and GPUs alike. Threads are being dissected into vectorized operations, and new instructions rearrange the execution pipeline more fundamentally than instruction reordering in an optimizing compiler. The buses are shortening, but RAM is still becoming ever more distant. The implications of these trends for concurrent applications are staggering, and the problems inherent to threading and memory access will redouble in complexity. The right approach cannot be to perpetuate ignorance about them.

The point is that (in my experience) Java programmers are usually less adept at computational science and less inclined to learn than their counterparts from other languages. The culture is nearly completely self-contained, so best practices are learned slowly, if at all. Finally, Java's distressingly scant emphasis on resource management and its limited expressiveness make these problems worse. These are harsh accusations, and generalizations, but I have found them to be largely true in the real world.

====COMMUNITY AND FALLACY====

Probably the most rebarbative aspect of the Java language isn't actually the language itself, but the community that has grown up around it. Not all of the community is this way, but I have noticed a distressingly large population of Java programmers who fancy themselves elite in some way because they use only the "one true language". Maybe I'm just paranoid, but every now and again I get a clue-repellent youngling on my doorstep, as it were, asking how he can grok the mysteries of the "greatest language on Earth".

I honestly don't mind Java itself—in the same way I don't mind COBOL or (Visual) Basic. The people who use these languages keep to themselves. They know that their language is passé. While I personally despise both languages, I do respect and appreciate that COBOL and Basic programmers at least don't try to foist their methodologies on others.

Not so, Java.

Everywhere I look, I see Java promoted. It's "new" and hip, and every manager everywhere thinks forcing their development team to use it is the next big thing. It's said to be faster than C; it's said to be used in all industries as the "language of choice" and as the catch-all, general-purpose replacement for the "old" "legacy" languages that C++ has failed to be.

The magnitude and tenacity of such falsehoods are staggering.

I dissected at length the allegation that Java is somehow faster than a language like C, C++, or FORTRAN.

"Language of Choice"? Sorry, you can't, by necessity, write an operating system in Java. It's impossible. You don't use it in parallel or high-performance computing, since you need low-level pointer access. General-purpose? Sure, but Python, C++, Lisp, Pascal, and Perl, among others, are too—it's hardly a defining characteristic.

Other languages are "legacy"? Widely-used, highly-successful languages that are actively being used today are not "legacy". And Java is almost twenty years old; it's hardly new. I have seen Ruby compared to Java as a "legacy language", yet both languages were invented in the same year! The C programming language is not "legacy". I wrote a C program this morning; a lot of other people did too. "Legacy" ought to refer to a language's actual usage, and by that measure C handily beats Java; Java has no place calling it a "legacy language"! C++, Python, and Ruby, in particular, are also up there. These are not legacy languages either.

A few other falsehoods bear mention since they are not self-evidently false.

One is the notion that Java is "Object-Oriented". Granted, Java is closer to an object-oriented language than to anything else, but it oughtn't be considered the canonical one. Java lacks many core OO features—among them destructors, copy constructors, assignment overloads, operator overloads, and multiple inheritance. Most are essential to many design patterns, and some (such as the first two) are essential to OO programming in general. Other useful semi-OO features Java lacks include a real pass-by-value/pass-by-reference distinction, function pointers/objects, and subfunctions. Furthermore, the computational culture discussed earlier does not promote object-oriented programming, despite its very loud claims otherwise. Putting code inside a class doesn't make it object-oriented any more than putting a rock inside a microwave makes it food.

Java proponents often claim that their system is "write once, run anywhere". By this they mean that the VM's bytecode, once compiled, will run anywhere. I personally have not found this to be entirely true in practice. But suppose it were: targeting one portable bytecode makes CPU-specific optimizations (almost) impossible. The issue is discussed further in my performance comparison. The practical upshot is that recompilation for each architecture isn't actually a Bad Thing. So "write once, run anywhere" really isn't substantively different from "write once, compile anywhere, run anywhere", which describes languages like C. Viewed in this light, Java's portability suddenly isn't a unique accomplishment. Writing in C++, you can easily write a program once that will run anywhere. Windowing toolkits, such as wx and Qt, already abstract platform-specific junk in the same way that Java's AWT and Swing do atop the JVM. My C++ codebase was written once. It runs on Win32/64, Ubuntu32/64, Fedora32, MacOS 9, MacOS X, Solaris, Free/Net/OpenBSD, and my own personal pet operating system, MOSS (where applicable), and on x86, x86-64, and (with a few reductions) MIPS and ARM.

Carrying that further, Java is in fact less portable than C! Java can't run in a number of places that C actually can—for example, embedded devices and microcontrollers in applications like satellites, remote sensors, robots, industrial process control, appliances, and vehicles. Memory and computational power are at a premium in these applications. No one is going to waste resources running a virtual machine when a piece of nice, vanilla C does the job better. Then there are the usual software applications (drivers, OSes, and so on) that can't be written in Java for technical reasons. So no, Java is not "write once, run anywhere".

Java proponents occasionally claim that Java compilers can do global optimization while C compilers cannot. In fact, almost all the research in the field of modern compiler optimization happened for C (and FORTRAN) compilers and is only now spilling over into Java compilers. The fact that a Java compiler can do global optimization is not an advantage over C. C compilers can do it too—they invented it. Indeed, the important algorithms for optimizing compilers were discovered before Java even existed. For more discussion, see the compilers section of my article on Java's performance.

The point of all this is that Java's proponents regularly fail to recognize hard facts about the computing industry. Java is not new; other languages are not "legacy"; Java is not a perfect exemplar of an OO language; and "write once, run anywhere" is neither special nor even necessarily desirable. While these facts do not ipso facto denigrate the Java language itself, they are indicative of a widespread culture of misinformation that is at once malign and repulsive.

====MISC.====

Java has, as one of its core features, a massive standard library. It is bloated, even by the standard of C++'s all-encompassing standard library and STL. "There's a class for that" is an emerging meme among Java programmers. For APIs, completeness isn't usually bad, but redundancy always is. Java, for example, ships two windowing toolkits: Swing and AWT. They accomplish basically the same thing, modulo a few differences. This kind of repetition is everywhere, and learning how to do anything in a robust, standards-compliant way is almost impossible—you can't see the forest for the trees!
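
The Swing/AWT overlap in miniature (this sketch needs a graphical environment to run; constructing AWT widgets on a headless machine throws an exception):

    import java.awt.Button;       // AWT: the original toolkit
    import javax.swing.JButton;   // Swing: its largely parallel successor

    // Two standard-library ways to make the same button.
    class TwoToolkits {
        public static void main(String[] args) {
            Button  awtButton   = new Button("Click me");
            JButton swingButton = new JButton("Click me");
            System.out.println(awtButton.getLabel());  // "Click me"
            System.out.println(swingButton.getText()); // "Click me"
        }
    }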

Oracle (Java's producer, as it were) has recently come under scrutiny for truly objectionable legal maneuvering related to Java—including attempting to sue over the reuse of its APIs' method names. While that particular claim appears to have been quelled, related and similarly shady legal proceedings are still underway. Consult a technology news source for further information as this develops.

====CONCLUSION====

Java does have its applications, I'll admit, but recently a nasty strain of what is, quite frankly, propaganda has been spreading throughout the entry-level programming community. For the record: Java is not faster than C, it is not widely used in high-performance computing, and C, C++, Python, and the other languages in widespread use today are not "legacy" or somehow inferior to Java.

Low-level languages like C are more powerful. General purpose languages like C++ have more features and are more expressive. Scripting languages like Perl and Lua have intuitive syntax and convenient language features. Interpreted languages like Python are a mixture. Java is just one of the crowd—there's nothing that makes it fundamentally better, and there are a whole lot of things that make it fundamentally worse.

Java is at least somewhat slower than C, and it restricts language features for no good reason—consequently getting in programmers' way whenever they try to do anything advanced. It has a bloated standard library and an obscene memory overhead, it perpetuates a single design philosophy dogmatically, and it has cultivated a programmer subculture that ignores computational realities and best practices and shows a general lack of concern for system resources and efficient algorithms. Finally, this subculture has produced a litany of false aggrandizement for its "one true language". Ultimately, many Java programmers' arrogant, naïve, obstinate refusal to learn other tried-and-true languages—low-level and high-level alike—cultivates intolerance, inefficiency, bureaucracy, and massive ignorance.

I can't reasonably argue that Java shouldn't be used at all. Java has some advantages. It is still quite fast, it is easy to use for simple programs, and it occupies a happy medium between strictly-typed, compiled languages and flexible, interpreted ones—but these advantages do not make Java the end-all solution that fledgling programmers, Oracle personnel, and inexperienced teachers seem to think it is. Its use may be justified in some cases, but it is fundamentally irresponsible for programmers to know Java to the exclusion of all else.

