They didn’t start with rocks. The first calculators used gears. Those were hard to reprogram, so they started using relays. That worked but was very slow. Then they found out that lamps (vacuum tubes) could take the place of relays, but those wore out too fast. Then someone figured out that rock stuff (silicon) could do the same job as a vacuum tube. After that it became a race to make them as small as possible and cram more of them together.
I took a course in computing systems engineering which was basically going all the way from semiconductors up to operating systems and it was incredibly interesting.
One of the things that surprised me was how easy it was to abstract away the lower-level complexity as soon as you got one step up. It’s kind of like recursive Lego pieces: you only have to design one piece, then you can use a bunch of those to design another piece, then use a bunch of those to design another, and so on. By the end your design contains orders of magnitude more of the fundamental pieces, but you don’t really think about them anymore.
The thing about real world processor design though is that all those abstractions are leaky.
At higher levels of design you end up having to consider things like the electrical behavior of transistors, thermal density, the molecular dynamics of strained silicon crystals (and how they behave under thermal cycling), antenna theory, and the limits and quirks of the photolithography process you’re using (which is a whole other can of worms with a million things to consider).
Not everyone needs to know everything about every part of the process (that’s impossible), but when you’re pushing the limits of high performance chips each layer of the design is entangled enough with the others to make everyone’s job really complicated.
EDIT: Some interesting links:
https://www.youtube.com/watch?v=U885cIhOXBM
Dude doesn’t even know about the magic smoke.
https://www.catb.org/jargon/html/M/magic-smoke.html
I wonder whether ESR will one day up his SSL game.
Anon also forgot to infuse the lightning pixies after inscribing the runes, tsk tsk.
They’re the ones who let the smoke out if you anger them!
Generally it’s only the wizards that deal with the physical side - such as rock shaping and rock combining - that get magic smoke, though if they did their part wrong, the wizards that make rocks think might get it too, as can the people playing Skyrim on the thinking rocks.
How do they get the smoke to go in? Once I let the magic smoke out I am never quite able to put it back in properly.
You have to melt the whole thing back to raw rock and remake it from scratch, which is usually more work than just grabbing some fresh raw rock to build a new one so it’s seldom done.
How do you get the yolk back in the egg? Well, just ask the chicken of course (or the Wizard in this case).
When I learned about how they make the new CPUs it blew my mind. Dropping a microscopic droplet of metal and blasting it with lasers to shine light through a stencil-like thingy to create the nanometer circuitry. I was like, how did you even think of doing that?.. Technologies like these are really marvelous.
You start with macroscopic photolithography, add material science of semiconductors and then iterate a million times. It didn’t start at nanoscale.
Give me a break… I’m still trying to wrap my head around how transistors work. For a layman this is like magic.
Photolithography started as a printing technique and is pretty basic.
It’s essentially the same as somebody with a couple of cans of spray paint and a handful of cardboard sheets with cutouts spray-painting a multi-color tag on a wall.
As the logo kept getting smaller and smaller, and just holding the cardboard in front of the wall and spraying introduced too much imperfection for tiny logos, they had to come up with more and more tricks to keep making tiny logos without them ending up too distorted.
Like gluing a cardboard sheet to the wall and then dissolving it after spraying :)
Exactly, and “we need this as small and precise as possible” means “can lasers do it?” As an engineer, I default to: fast and precise means computer-guided laser if possible.
They use electron beams and extreme UV light nowadays. Lasers are not necessarily the best light source, even at other wavelengths.
I’m still stuck on how you get from “switch go on, switch go off” to, well, anything
One switch can have two states. Switch on is a 1 and switch off is a 0. Group 8 switches together and you get a byte. Miniaturize the switches and put 8 trillion of them into the size of a fingernail, and ta-da, you have a 1TB micro SD card.
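The byte idea above can be sketched in a few lines, assuming we read the switches most-significant bit first (the switch pattern here is just an arbitrary example):

```python
# Eight switches, each either off (0) or on (1), read left to right
# as the bits of one byte, most significant bit first.
switches = [1, 0, 1, 1, 0, 0, 1, 0]

value = 0
for bit in switches:
    value = value * 2 + bit  # shift everything left, then add the new bit

print(value)  # 0b10110010 = 178
```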
Wire up two switches so that a light bulb will only go on when both switches are on (1). This wiring creates an AND gate. Adjust the wiring so that the light turns on if either of the switches is on, and you have an OR gate.
Chaining the output of the lightbulb and treating it like a new switch allows you to combine gates into other logic blocks: NOT, NAND, XOR, etc.
Combine enough logic blocks and you can wire up a circuit that adds the values of two switches together, and now you can start to perform addition.
This all naturally evolves to the point where you can play Skyrim with the most degenerate porn mods.
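The wiring the comment above describes can be sketched as code, with gates as tiny functions and full adders chained into a ripple-carry adder. This is just an illustrative model of the logic, not any real chip’s layout:

```python
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Other logic blocks built by combining the basic gates.
def NAND(a, b): return NOT(AND(a, b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))

# A full adder: two input bits plus a carry-in, producing a sum bit
# and a carry-out -- just AND/OR/XOR wired together.
def full_adder(a, b, carry_in):
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

# Chain full adders to add two 4-bit numbers (a ripple-carry adder).
def add4(a_bits, b_bits):  # bits given least significant first
    result, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 6 + 3 = 9, i.e. 0110 + 0011 = 1001 (shown least significant bit first)
print(add4([0, 1, 1, 0], [1, 1, 0, 0]))  # [1, 0, 0, 1, 0]
```

The `&` and `|` operators stand in for the physical wiring; everything above the basic gates is built purely by composing them, which mirrors how gates are composed on a chip.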
I like your explanation, but I don’t understand it. Keep up the good work.
This game may help, but I haven’t played it myself (yet)
In addition to Turing Complete, which is really good, Code: The Hidden Language of Computer Hardware and Software is a fantastic book that literally goes from two kids trying to talk to each other at night with flashlights, to a fully working Z80 clone, while not being hard to understand and using a really good conversational teaching method. It’s how I figured out a lot about CPU design, microarchitectures, assembly and machine language and a lot of other things.
Simply put, the switching doesn’t do anything by itself. It’s the meaning we assign to the arrangement of on-off switches. Much like flag signals: the flags don’t do anything besides be visible and locatable, yet we can establish a communication protocol with flags, lights, fingers on a hand, etc. This signaling is done electronically with many layers of meaning and complexity, and nowadays at unfathomable scale and speed.
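As a tiny illustration of “the meaning we assign”: the same eight-bit pattern below is read three different ways depending on the protocol we agree on (the pattern itself is arbitrary):

```python
# The same eight switches mean different things depending on the
# protocol everyone agrees to use.
bits = "01000001"
value = int(bits, 2)

print(value)        # 65, if we agree it's an unsigned number
print(chr(value))   # 'A', if we agree it's ASCII text
print(bits.replace("1", "#").replace("0", "."))  # '.#.....#' as a row of pixels
```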
Watch this. It’s a guy who shows how computers work using dominoes. It really helps explain how calculating something works at its most fundamental level
Transistors (electrically controlled switches) allow for logic gates. Logic gates allow for wild bullshit
Well… since you put it that way, it is quite staggeringly improbable, isn’t it?
“Through these terse, inter-connected runes, an invisible magic flows. You cannot change the rune, as then the spell will be broken.”
“Where does the magic come from, mommy?”
“From the highest point in the invisible topology of this magic, Billy: the Hoover Dam/Niagara Falls.”

Blessed be the white magic that relies not on corruption of the elements.
sighs Okay, to start, you’re going to need some amber and sheep’s wool…
It was later called the stone age.
I’m a little offended that this utterly skips over software, as if a CPU would do anything without the component that was invented before any CPU.
Software without a CPU is still useful. The reverse is not true.
Isn’t software the “trick rock into thinking” part?
Software is a list of instructions, you can execute it with pen and paper or just using your brain if you want.
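A sketch of that idea: a made-up three-instruction list and a loop that executes it exactly as mechanically as you could with pen and paper. The instruction names are invented for this example, not taken from any real machine:

```python
# A made-up instruction list: (operation, argument) pairs.
# You could follow these steps with pen and paper just as easily.
program = [
    ("SET", 5),    # put 5 in the accumulator
    ("ADD", 3),    # add 3 to it
    ("MUL", 2),    # double it
]

accumulator = 0
for op, arg in program:
    if op == "SET":
        accumulator = arg
    elif op == "ADD":
        accumulator += arg
    elif op == "MUL":
        accumulator *= arg

print(accumulator)  # (5 + 3) * 2 = 16
```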
Try doing that with my code.
Or with a Skyrim porn mod
Insert Matrix quote: “All I see is blonde, brunette, redhead…”
Try doing it with IOCCC code.
Yuck
Software is a necessary component, just like screws are a necessary component in an engine. Screws don’t exist only in engines, have existed since long before engines, and can be used in other ways. Just like software.
How is software without a CPU useful? It’s literally a list of instructions for a CPU.
Also a CPU can still calculate stuff if you just send electrical signals to the right connections. Software is just a way for the CPU to keep going and do more calculations with the results.
Software is algorithmic instructions. We wrote and executed algorithms by hand long before we had calculating machines; and when we did get computers that could run more complex algorithms, they didn’t have CPUs. They had vacuum tubes (and there were even simpler, purely mechanical programmable computers before vacuum tubes). CPUs didn’t come along until much later; we’d been writing software and programming computers for decades before the first CPU.
And even if you try to argue that vacuum tubes computers had some collection of tubes that you could call a “CPU” - which would be a stretch - then it still wouldn’t have been made from silicon (rocks) as in the OP post.
But before the first calculating machine, people were writing algorithms - which is what software literally is - and executing them by hand. Look up how we calculated the ranging tables for artillery in WWII. Algorithms. Computed by hand.
The word “computer” literally comes from the word for the people (often women) who would execute algorithms using their brains to compute results.
I think you’re conflating “algorithm” with “software”. You’re right in saying that algorithms can be computed by hand, but I don’t think anyone would refer to that as “running software”. The word “software” implies that it’s run on “hardware”, and hardware usually implies some sort of electronic (or even mechanical*) circuit, not pen and paper and a human brain.
Software runs on processing power. Doesn’t matter if it’s mechanical, electrical or biological computing power.
The important part is, that something is processing it.
And although by now software development through abstraction feels disconnected from just specialised algorithms: everything still breaks down into numbers and some form of algorithm to process the information.

Say I agree with your distinction - or restriction. There was still software written for, and programmed into, general-purpose, Turing-complete calculating machines long before there were CPUs.
So let’s look at the technical details of the word. The term “software” was coined in 1958 by John Tukey. The computers in use at that time were machines like the IBM 704, the IBM 709, and the UNIVAC I; these are all vacuum-tube computers that contained no silicon microchips and no CPUs. Even technically, the term “software” predates silicon and CPUs.
Non-technically, I disagree with your premise on the basis that it’s often been argued - and I agree with the argument - that humans are just computers with software personalities programmed by social conditioning, running on wetware and a fair bit of firmware. And there’s increasing evidence that there’s no real CPU, just a bunch of cooperating microorganisms and an id that retroactively convinces itself that it’s making the decisions. Even if the term “software” wasn’t coined until 1958, software has been a thing since complex organisms capable of learning from experience arose.
Unless we’re all living in a simulation, in which case, who knows if software or hardware really exist up there, or whether there’s even a distinction.
They called the box with all the tubes in it that executed instructions a “CPU”; memory, CPU, and IO subsystems were distinct and well-defined.
I feel like you mean “microprocessor”
We also had machines and computers based on relays and other electro mechanical devices earlier than even vacuum tubes. If you follow Technology Connections he breaks down the inner workings of a pinball machine using that technology, but programmable machines have also been made with it.
The Antikythera mechanism from Greek times.
my guy, megabytes of executable binary are just about as usable as a CPU. Try reverse engineering more 1s and 0s than you’ve ever read into something usable when you don’t even know how they convert into assembly and logical operations, because you lack the architecture knowledge of the CPU
No it’s even worse. We taught the rock how to think, and now force it to think what we want it to think. Millions of thoughts that we want, every second.
It’s only a matter of time until the rock fights back.
People who code Skyrim porn mods are smarter than the ones who invented the CPU /s
And the “inscribe ancient runes” step takes up to 4 months.
The classic response is that you have to capture lightning first to apply to the rock that you want to do the thinking.
What’s going on when it gets turned back into a rock?
Probably the silicon being made into a wafer.
I was going to send this to my colleagues, but then I read the last line.
I fail to see the issue.
Crop it? 🤷🏻
Related to the development of semiconductor devices, especially transistors. The original computers used punched cards. In a way it’s an extension of that same idea
Edit: for the curious, some accessible intros
https://www.livescience.com/20718-computer-history.html
https://news.stanford.edu/stories/2023/09/stanford-explainer-semiconductors