One of the main selling points of Java is that it can run on almost any computer. All the computer needs is an interpreter for Java bytecode. Such an interpreter simulates the Java virtual machine, much as Virtual PC simulates a PC. In theory, a well-written Java program should work on any computer, and for many years to come.
In practice, there have been too many cases where a Java program cannot be used on the computer you want to run it on, even though that computer has Java support. I have seen too many device configuration tools and user interfaces written in Java become unusable within a few years, to the point that you simply cannot run them on a modern computer. A typical case: the application ran fine on some old Java version but refuses to run on a newer one, and on a newer PC there is no practical way to install the very old Java version it needs. The manufacturer does not provide updated software either, so a hardware or software system that otherwise still works well becomes unusable far too soon, simply because its Java-based user interface no longer runs.
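One small defensive measure an application can take is to check the runtime version at startup and fail with a clear message instead of crashing obscurely later. The sketch below is illustrative, not a standard pattern: the class name, the required-version constant, and the parsing logic are all assumptions. It reads the real `java.version` system property, which changed format at Java 9 (from `1.8.0_202`-style strings to `17.0.1`-style).

```java
// Minimal sketch: detect the major Java version from the "java.version"
// system property, handling both the legacy "1.x" scheme (Java 8 and
// earlier) and the modern scheme (Java 9+). The required version below
// is an illustrative assumption. On Java 9+ the real API
// Runtime.version() can be used instead of string parsing.
public class VersionCheck {
    // "1.8.0_202" -> 8, "17.0.1" -> 17, "9" -> 9
    static int majorVersion(String version) {
        String[] parts = version.split("\\.");
        if (parts[0].equals("1")) {
            return Integer.parseInt(parts[1]); // legacy "1.x" numbering
        }
        return Integer.parseInt(parts[0]);
    }

    public static void main(String[] args) {
        int required = 8; // hypothetical minimum for this application
        int actual = majorVersion(System.getProperty("java.version"));
        if (actual < required) {
            System.err.println("This application needs Java " + required
                    + " or newer; found " + actual);
            System.exit(1);
        }
        System.out.println("Running on Java major version " + actual);
    }
}
```

A check like this does not solve the underlying incompatibility, but it at least tells the user *why* the program will not run, instead of failing with a cryptic bytecode or linkage error.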
Java is supposed to run on any PC, but in practice many Java applications just don't work. Why do many Windows programs seem to fare better over time? Even a very old version often runs nicely on a modern Windows release, or under a Windows emulator on a Linux system. Is the problem that Java itself changes too much and too often, breaking compatibility, or are the Java applications simply written badly, in incompatible ways?