This topic was brought to mind by the title of a recentish publication[1], where "a tiny virtual machine" was mentioned. I have told you[2] about our virtual machine for Basic, which ran in 16 kilobytes; the smallest of the processors used for the "tiny virtual machine" provided 32 kilobytes of non-volatile storage, with an additional 0.5 or 1 kilobyte of RAM. Feeling comfortably superior, I read the article -- which I found very interesting -- but then continued musing on the pressures of memory limitations.
They are still with us, but the scale seems to have changed. In another article[9], the notion of "virtual infinite memory" was extolled; closer to home, one of our technical staff was recently explaining that I would probably have some trouble with my computer unless I acquired another 64 megabytes or so of RAM. ( I already had 256 megabytes. ) As I am pretty sure that all the interesting things I want to do with computers can be done in, say, 64 kilobytes, I find these numbers fantastic.
( Yes, I think there are still interesting things that will fit into 64K. I suspect that all the really interesting computing questions will fit, though I can't prove it so I'll settle for "most of". The things that don't fit are irrelevancies such as speed, colour, graphics, etc., which are really rather boring. Indeed, 64K might be an overestimate; I choose it only because I got used to it, and found it worked pretty well. I accept that it's handy to have a disc as well, but a megabyte or so is essentially infinity. )
( -- and before you get really cross, I emphasise that I specified "really interesting computing questions"; that doesn't include getting productive work done. And before you get really cross about that, I point out that the traditional scientific method, which does seem to work pretty well in some fields, is to begin by isolating the phenomenon you're trying to study as well as you can. That's the bit I think you can do in 64K. )
So I take refuge in the old days, when we wanted to do different things, and therefore did different things, such as seeking extreme means to economise on memory use. For present purposes, "the old days" is defined roughly as 1975 to 1980, when one of my preoccupations was experimenting with languages to do interesting things on computers. To experiment on languages, you need the requisite apparatus -- in particular, you need some sort of processor for your experimental animals. It doesn't much matter whether it's a compiler or an interpreter, or a trained performing seal for that matter, but you must have something. If you want to do it a lot, you must have a lot of processors; if you can't buy them, you have to make them; so I wanted a way to make a variety of processors for experimental languages in artificial intelligence research.
I had heard of things called compiler-compilers. They were programmes which would write you a compiler for your language, provided that you gave them a description of the language in something like BNF. I also found that respectable people called them compiler generators. How could I get one ? I expected that I'd have to write one -- that was the usual answer if you wanted anything slightly out of the way in days when there was more than one machine architecture and therefore more than one operating system. That might not have been so good for Intel, which was perhaps why Intel didn't exist, but it was far more interesting for human beings.
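To give the flavour, here is the sort of description I mean -- an illustrative BNF fragment of my own for simple arithmetic expressions, not taken from any particular generator:

    <expression> ::= <term>   | <expression> "+" <term>
    <term>       ::= <factor> | <term> "*" <factor>
    <factor>     ::= <number> | "(" <expression> ")"

Hand something of that shape to a compiler generator and, at least in principle, it hands you back a parser for the language; the rest of the compiler remains your problem.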
( It is true that there was a lot of IBM around ( remember IBM ? ), but there was a vigorous undergrowth in the mainframe world ( remember mainframes ? ), notably the BUNCH[10] -- Burroughs, Univac, NCR, CDC, Honeywell -- and we haven't even started on the minis, which is where DEC came from ( remember DEC ? ). ( Doubtless language differences account for the omission of a vigorous growth in Europe, such as the British ICL, which produced some worthy machines. ) So I digress, but in the spirit of "Programming Lessons from Days Gone By" I add that the variety kept the programmers -- except perhaps for those locked into the IBM universe -- up to scratch. )
We had a Burroughs machine. I made enquiries about the availability of compiler generators for Burroughs machines. It was harder than it is now; the world was slower then. The fast ways -- telephone, telegraph, Telex -- were expensive, if you started from New Zealand, and there was nothing like today's easy electronic access to far too much information. In the event, such enquiries as were possible didn't come up with a compiler generator from anywhere; I would have been quite disappointed if they had, and happily set out to enjoy myself by writing my own.
At this point, I pause to note an interesting phenomenon, which seems to have turned up quite often in my life. I was aiming at artificial intelligence; this pushed me into languages; that led to compilers; here I am, setting out to write a compiler-compiler, and -- as will become clear -- ending up totally out of my depth in something like number theory. Of course, I never got back to the artificial intelligence which started it off. This is exceedingly good fun, but not a strategy for career building. Strongly recommended.