Altair 8800, brought in MS-DOS (a CP/M clone) 3. Gordon Moore, Moore's law: "the number of transistors per square inch on integrated circuits doubles approx. every year" (later revised to every two years) 4. Steve Jobs, Apple – all things iPhone/Mac (macOS/iOS are based on BSD, a cousin of Linux, etc.)
the first ever compiler, for Sperry, called A-0; then came A-1 and A-2. She went on to develop FLOW-MATIC and COBOL, amongst many other things. She died in 1992.
hardly accepted, but Hopper followed her philosophy of "Go ahead and do it. You can apologize later." She was disappointed: "I had a running compiler, and nobody would touch it because, they carefully told me, computers could only do arithmetic; they could not do programs. It was a selling job to get people to try it. I think with any new idea, because people are allergic to change, you have to get out and sell the idea."
which manages the interaction between applications (computer programs), the hardware and the wider world. Short version: it handles all of the difficult bits that we shouldn't have to deal with ourselves.
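As a small illustration of that idea (a sketch added here, assuming a POSIX/Linux system; it is not part of the original notes), the application below never touches the terminal hardware itself. It hands the "difficult bit" of getting bytes to the screen over to the kernel through the write() system call:

/* Minimal sketch: the program asks the kernel to move the bytes. */
#include <string.h>
#include <unistd.h>

int main(void)
{
    const char *msg = "hello via the kernel\n";
    /* File descriptor 1 (standard output): the OS decides whether the bytes
     * end up on a terminal, in a file, down a pipe or across the network. */
    write(STDOUT_FILENO, msg, strlen(msg));
    return 0;
}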
glorified calculators, with little, if any, ability to program. Sometimes numeric values were internally represented as decimal, but that was costly and very complex. The solution is to manage values internally as binary and then convert to decimal for humans to read.
values are 1 or 0, but what about large numbers? Early devices managed chunks of bits at a time, e.g. 18- and 36-bit processors such as the PDP-15 were once common, before word sizes settled on multiples of 8 bits (the IBM System/370, for instance, used 32-bit words). Issues remained over how to do calculations and how to handle characters (letters).
or no power numbers): 2 to the power of 0 is 1, 2 to the power of 1 is 2, 2 to the power of 2 is 4, 2 to the power of 3 is 8, etc. Take the numbers 7 and 2, convert them to binary, 111 and 010, and add them together: 1001 = 9 (decimal). Much of early computer development was spent on how best to handle and manipulate binary.
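A tiny C sketch (added here for illustration, not from the original notes) of the same 7 + 2 example: the machine only ever holds bit patterns, and the decimal form is produced for humans at the end:

#include <stdio.h>

/* Print the low 'bits' bits of value, most significant bit first. */
static void print_binary(unsigned value, int bits)
{
    for (int i = bits - 1; i >= 0; i--)
        putchar(((value >> i) & 1u) ? '1' : '0');
}

int main(void)
{
    unsigned a = 7, b = 2;
    unsigned sum = a + b;               /* the hardware adds the bit patterns */

    print_binary(a, 4);   printf(" = %u\n", a);    /* 0111 = 7 */
    print_binary(b, 4);   printf(" = %u\n", b);    /* 0010 = 2 */
    print_binary(sum, 4); printf(" = %u\n", sum);  /* 1001 = 9 */
    return 0;
}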
humans (as decimal numbers or as characters)? Émile Baudot developed a 5-bit solution which pre-dates ASCII (7-bit), Unicode (21-bit code space, but normally handled in chunks of 8 or 16 bits) and even IBM's EBCDIC (8-bit).
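To make the encoding idea concrete, here is a rough C sketch (an illustration only, not from the original notes; the byte values shown are standard ASCII and UTF-8): the same human-readable character is just a number, or a few bytes, inside the machine:

#include <stdio.h>

int main(void)
{
    char c = 'A';
    /* In ASCII (and in Unicode, which keeps ASCII as its first 128 codes)
     * the letter 'A' is simply the number 65. */
    printf("'%c' is stored as %d (0x%02X)\n", c, c, c);

    /* The character "e with an acute accent" takes two bytes in UTF-8. */
    const unsigned char e_acute[] = { 0xC3, 0xA9, 0x00 };
    printf("the accented e is the UTF-8 byte sequence 0x%02X 0x%02X (\"%s\")\n",
           e_acute[0], e_acute[1], (const char *)e_acute);
    return 0;
}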
something to oversee: data manipulation, character representation and programmability. That is why we need operating systems, in part, but that is only one element of the story.
executes a single piece of code without any process protection or control. An operating system manages multiple processes and applications, enforcing a degree of separation, control and security.
single jobs, batched up, one after another. As CPUs became faster, operating systems took on more: managing the process life cycle, processor resources, disk I/O, networking and interaction with the user.
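As a sketch of what "managing the process life cycle" looks like from a program's point of view (POSIX assumed; this example is not part of the original notes), the parent below asks the kernel to create a child process, the kernel schedules and runs it, and the parent waits for it to finish:

#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();                 /* ask the kernel for a new process */

    if (pid < 0) {
        perror("fork");
        return EXIT_FAILURE;
    } else if (pid == 0) {
        /* Child: ask the OS to replace this process image with 'echo'. */
        execlp("echo", "echo", "hello from the child", (char *)NULL);
        perror("execlp");               /* only reached if exec fails */
        _exit(EXIT_FAILURE);
    } else {
        int status;
        waitpid(pid, &status, 0);       /* parent sleeps until the child ends */
        printf("child %d exited with status %d\n",
               (int)pid, WEXITSTATUS(status));
    }
    return 0;
}

Everything interesting here (creating the child, scheduling it, loading the new program, cleaning up when it exits) is done by the operating system on the program's behalf.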
were very primitive; then came the advent of batch processing and beyond. 1960s - time-sharing, Multics. 1973 - Unix rewritten in C. 1991 - Linux starts.
Twitter, Docker, etc. As of November 2017, all 500 (100%) of the world's top 500 supercomputers run Linux. Android phones, tablets, Internet of Things (IoT) devices, servers, Raspberry Pis and more. Linux is everywhere.
60+ years, with periods of openness (MVS is one example), a closed period (late 1980s to late 1990s), and then the new age of post-Git sharing of ideas and inspiration. Open source, and the people who supported it, made that possible.