Doubly minimal CPU

OISC with a minimal threaded interpreted language (probably FORTH)
  (+6, -1)

I apologise in advance for telling anyone stuff you already know.

In case you don't know, FORTH is a fairly low-level programming language used for control applications such as radio telescopes, pinball machines and apparently also device drivers. One of the fun things about it is that rather than writing conventional programs, you define words in terms of more primitive words, and it relies on a stack where the data are placed, which it then manipulates using words. Hence 2 2 + would leave 4 on top of the stack, and . would print the integer value at the top of the stack, removing it as it does so.
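To make the stack behaviour concrete, here's a toy sketch in Python (my own illustration, not real FORTH; the function names are mine) of how 2 2 + . is evaluated:

```python
# A FORTH-style data stack: operands are pushed, words pop their
# arguments and push or print results.
stack = []

def push(n):
    stack.append(n)

def add():            # the word + : pop two values, push their sum
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def dot():            # the word . : print the top of stack and drop it
    print(stack.pop())

push(2); push(2)
add()                 # stack is now [4]
dot()                 # prints 4; the stack is empty again
```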

As a teen, I used to play a game where I'd try to pare away the FORTH vocabulary to a bare minimum, so that the other words usually provided could be defined in terms of what remained. So, for a trivial example, subtraction could be defined by negating the value on the top of the stack and adding it to the value beneath it. I got quite a long way, but the further down you go, the more you have to understand about how the interpreter or compiler works in order to manage it.
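That subtraction example looks something like this, sketched in Python (the names are mine; in FORTH itself the definition would be roughly : - NEGATE + ; ):

```python
# With NEGATE and + as the only arithmetic primitives, subtraction
# can be defined rather than provided.
stack = []

def negate():
    stack.append(-stack.pop())

def plus():
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def minus():          # defined purely in terms of the two primitives
    negate()
    plus()

stack.extend([7, 3])  # compute 7 - 3
minus()               # top of stack is now 4
```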

Now out there somewhere are these things called Peano's axioms for the natural numbers, the set ℕ, and somewhere else there is expressive adequacy, the finding that all logical operators can be defined in terms of a single operator, the choice being between NAND and NOR. Natural numbers can be defined thus: 0 is a constant; every number has a successor; no natural number has a successor which is zero. You can then fiddle about and define negative integers, fixed point numbers and complex numbers in various ways. Do this with FORTH and you've dispensed with many other words.

You can even get rid of the parameter stack, because it's just a series of consecutive locations in memory which you can deal with by incrementing, decrementing and loading and storing from the current location. You in fact end up with a very small vocabulary, thus: 0 SUCCESSOR ¦ C@ C! : ; IF EXIT and that's it. The less than obvious ones are C@ , which is "fetch a byte from the location at the top of the stack", C! , which is "place the value two places down on the stack at the location named at the top of the stack", : , which introduces a definition, ; , which closes a definition, IF , which is a conditional, and EXIT , which returns the flow of execution to the next level up.
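Both building blocks can be sketched in a few lines of Python (the representations and names below are mine, purely for illustration): first Peano naturals, where a number is either zero or the successor of another and addition is defined recursively; then NOT, AND and OR built from NAND alone.

```python
# Peano naturals: ZERO, or SUCC of another natural.
ZERO = None

def succ(n):
    return (n,)           # one wrapper per application of SUCCESSOR

def add(m, n):            # m + 0 = m ;  m + S(n) = S(m + n)
    return m if n is ZERO else succ(add(m, n[0]))

def to_int(n):            # count the wrappers, for inspection only
    return 0 if n is ZERO else 1 + to_int(n[0])

two = succ(succ(ZERO))
three = succ(two)
# to_int(add(two, three)) gives 5

# Expressive adequacy: everything from NAND.
def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))
```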

So that's the outer circle, available to the user, although in practice someone will have probably defined a load of higher level words to make it feasible to use it. These primitive words are realised as op codes in a virtual CPU.

Yes, a virtual CPU, because there's a lower level. This is all running on a sixteen-bit single-instruction CPU, and a single-instruction CPU is in fact a zero-instruction CPU, because it can assume that everything it encounters is to be processed using that one instruction, so the op code, whatever it might be, is completely redundant.

That single instruction is "subtract one and branch if negative". This is a very common design for a single instruction CPU and isn't my invention, but what this processor is doing is running an emulation of that minimal FORTH virtual machine I mentioned above, in its firmware.
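A machine of this family fits in a few lines. The sketch below uses the common subleq variant, "subtract and branch if less than or equal to zero"; the interpreter, the memory layout and the little demo program are all my own toy example, not anything specified above.

```python
def subleq(mem, pc=0):
    """One-instruction machine. Each instruction is three cells a, b, c:
    do mem[b] -= mem[a]; if the result is <= 0 jump to c, otherwise
    fall through to the next instruction. A negative target halts."""
    while pc >= 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Toy program: add cell A (index 9) into cell B (index 10) via a
# zeroed scratch cell Z (index 11), then halt.
mem = [9, 11, 3,     # Z -= A   (Z becomes -A, which is <= 0)
       11, 10, 6,    # B -= Z   (i.e. B += A)
       11, 11, -1,   # Z -= Z, leaving 0, and branch to -1: halt
       7, 5, 0]      # A = 7, B = 5, Z = 0
subleq(mem)          # mem[10] is now 12
```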

It's unwise to trust AI chatbots, of course, but on asking for an estimate of the number of transistors needed to realise this processor, I got around a thousand. That's less than half as many as the Intel 4004, and it could be reduced further by replacing register flip-flops with capacitors, and possibly by having it operate serially rather than in parallel. Such a CPU could have been put on a single silicon chip by about 1970.

Or, it could've been realised as thermionic valves, with some of the valves integrated as, say, a couple of triodes in the same glass envelope, then put on ten circuit boards and occupying a stack of them about a metre high and only 10 cm square. This could've been done in the '50s.

So in the end what I have here, conceptually, is a microcomputer which would've been slow but serviceable using a single-chip CPU which could've been built using the electronic technology of the end of the '60s, or going back further a computer small enough to build into a desk made out of valves.

Ergo, we could've had home micros before the Beatles split up.

nineteenthly, Nov 04 2025





       A doubly minimal explanation as well
pocmloc, Nov 04 2025
  

       I didn't know how much detail to put in.
nineteenthly, Nov 04 2025
  

       //"subtract one and branch if negative"   

       You probably mean "subtract and branch if negative".   

       It seems to me that you're forgetting about memory - and all the other things a system needs. And also how long computations take. This probably isn't an efficient architecture with either transistors or computation time.
Loris, Nov 04 2025
  

       I was thinking about memory. I'm assuming the stuff which consists of rings on intersecting wires. And yes, I may be thinking of that but I suspect subtracting one would also work. It would mean repeated operations but it'd still amount to the same thing.   

       In detail, I was thinking a fuse-based ROM and either a capacitor-based dynamic RAM or the ring and wire things, or possibly something like the CRTs plus plates they used to use. There are many options.   

       I/O I was assuming was memory-mapped but it could either be a TTY-like system or a character-mapped display.
nineteenthly, Nov 04 2025
  

       Forth rocks. (+)
bhumphrys, Nov 04 2025
  

       Thanks. It does, and this has made me think of it differently, i.e. more as able to approach a minimal Turing machine or lambda calculus. I suppose it's obvious really but I never considered it that way before.
nineteenthly, Nov 05 2025
  

       Trying to work out if my brain has deteriorated too much to interact at this level, or if I was never capable in the first place.   

       Er, bun for the image of a 50s valve-based microcomputer, glowing slightly in a dark room somewhere. Probably because I'm a Fallout fan; definitely not because I'm making a valid contribution to the discussion.
david_scothern, Nov 05 2025
  

       :-) . It's easily possible that this idea is a manifestation of a deranged mind, so it may well not be you.   

There is a bit of an issue here which I'm myself not sure about. FORTH generally stacks signed 16-bit integers, i.e. stores and fetches them at memory locations pointed to by a value which is incremented and decremented according to whether a value is being pushed or popped. To do this, so far as I can tell, it would have to use literal values expressed as binary numbers, not numbers expressed as zero, or the successor of that constant, or that successor's successor, and so forth. That kind of number doesn't sound like something which can be pushed onto or popped off a stack. That may be one problem with this idea.

       The other issue, which isn't exactly a problem but is interesting, is that the way I've written this, I've assumed that the parameter and return stacks are locations in main RAM. If they are, the same issue arises as before in that addresses are being used which are binary numbers, referring to locations. There is another way to do this though, which is to implement the stacks more explicitly, thus: there is a kind of peripheral, as it were, which can be presented with a value which will then be stored inside it, and can do two things. It can take an input or it can output a value. This would get round the problem of using binary sequences to refer to locations as numbers, but would still mean that general memory would need them.   
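That stack-as-peripheral could look like this; a minimal sketch, with a class name and methods of my own choosing:

```python
# A stack as a sealed device: you can present it with a value, or ask
# it for one back, and no addresses are ever visible from outside.
class StackPeripheral:
    def __init__(self):
        self._cells = []          # internal; never addressed externally

    def input(self, value):       # present a value; it is stored inside
        self._cells.append(value)

    def output(self):             # the most recently stored value returns
        return self._cells.pop()
```

The point is that the interface carries no location numbers at all, only values, which is what gets round the binary-address problem for the stacks themselves.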

       However, the very nature of a stack is similar to the tape in a Turing machine, so maybe it's just about moving around in memory one step at a time and not using straightforwardly numbered locations. Or maybe the stack is the tape and that's all there is, and it's still Turing-complete, but it's a weird kind of stack if so because the values on the top are still there after being pushed and popped, just in a different place from where the pointer currently indicates.   

       But yes, it sits there glowing, perhaps in the basement warming a boiler and providing central heating and a hypocaust while simultaneously controlling the timing and thermostat on said boiler.
nineteenthly, Nov 05 2025
  

       You can't trust compilers.
Baker Gordon Halveseys, Nov 10 2025
  

       Real Programmers use binary opcodes.
nineteenthly, Nov 11 2025
  
         


 
