This article presents a detailed timeline of events in the history of computing from 1950 to 1979. For narratives explaining the overall developments, see the history of computing.
The Z4 was replaced by ERMETH, a computer developed at ETH Zurich in Switzerland from 1953 to 1956 and one of the first electronic computers on the European continent.
This computer, the Whirlwind, was the first to allow interactive computing, with users interacting with it through a keyboard and a cathode-ray tube. The Whirlwind design was later developed into SAGE, a comprehensive system of real-time computers used for early warning of air attacks.
The Ferranti Mark 1 was a commercial version of the Manchester Mark 1 machine from the University of Manchester. The music program was written by Christopher Strachey.
EDVAC could have new programs loaded from the tape. The IAS machine, proposed by John von Neumann, was installed at the Institute for Advanced Study in Princeton, New Jersey, US.
Development continued until 1957. FORTRAN is still in use for scientific programming. Before being run, a FORTRAN program must be converted into a machine-language program by a compiler, which is itself a program.
I have travelled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year.
— Editor in charge of business books for Prentice Hall
Robert Noyce, who later set up Intel, also worked separately on the invention, and Intel later went on to perfect the microprocessor. The patent was applied for in 1959 and granted in 1964. Japan did not accept the patent, so Japanese businesses could avoid paying any fees, but in 1989 – after a 30-year legal battle – Japan granted it, and Japanese companies paid fees until 2001, long after the patent had become obsolete in the rest of the world.
In contrast to Kilby's germanium integrated circuit, Noyce developed a silicon integrated circuit, using Jean Hoerni's planar process.[11]
This machine introduced many modern architectural concepts: spooling, interrupts, pipelining, interleaved memory, virtual memory and paging. It was the most powerful machine in the world at the time of release.
The game ran on a DEC PDP-1. Competing players fired at each other's space ships using an early version of a joystick.
The mouse did not become popular until 1983 with Apple Computer's Lisa and Macintosh, and it was not adopted by IBM until 1987, although compatible computers such as the Amstrad PC1512 were fitted with mice before this date.
It was published in the 35th Anniversary edition of Electronics magazine. The law was revised in 1975 to suggest a doubling in complexity every two years.
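As a rough worked example of what the revised law implies (illustrative figures only, not data from the article): if complexity doubles every two years, then N(t) ≈ N(t0) × 2^((t − t0)/2), so a chip with about 10,000 transistors in 1975 would be expected to hold roughly 10,000 × 2^5 ≈ 320,000 transistors by 1985.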
BASIC was not implemented on microcomputers until 1975. It was the first language designed for use in a time-sharing environment such as DTSS (the Dartmouth Time-Sharing System) or GCOS.
But what ... is it good for?
— Engineer at the Advanced Computing Systems Division of IBM, commenting on the microchip
Originally developed by Ken Thompson and Dennis Ritchie, Unix was later released as C source code to aid portability, and versions subsequently became available for many different computers, including the IBM PC. It and its clones (such as Linux) are still widely used on network servers and scientific workstations.
Ken Thompson, one of the inventors of the Unix operating system, simplified BCPL into a language he called B; Dennis Ritchie, the other inventor of Unix, then developed B into C. C is a very popular language, especially for systems programming, as it is flexible and fast. C was considered a refreshing change in the computing field because it helped introduce structured programming. Inspired by C, C++ was introduced in the 1980s and in turn helped usher in the era of object-oriented programming.
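To illustrate the structured style that C encourages, the following minimal sketch (a hypothetical example, not drawn from this article or from Ritchie's work) expresses control flow with small functions and loops rather than jumps:

    #include <stdio.h>

    /* Structured programming: the work is broken into small functions,
       and control flow uses loops and conditionals rather than goto. */

    /* Sum the squares of the integers 1..n. */
    static long sum_of_squares(int n)
    {
        long total = 0;
        for (int i = 1; i <= n; i++)
            total += (long)i * i;
        return total;
    }

    int main(void)
    {
        for (int n = 1; n <= 5; n++)
            printf("sum of squares up to %d is %ld\n", n, sum_of_squares(n));
        return 0;
    }

Each function has a single, well-defined task, which is the essence of the structured approach that C helped popularise.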
There is no reason anyone would want a computer in their home.
— Ken Olsen, founder, president, and chairman of Digital Equipment Corporation
Historians credit seminal insights to Welsh scientist Donald W. Davies and American engineer Paul Baran.
Essentially all the work was defined by 1961, and fleshed out and put into formal written form in 1962. The idea of hot potato routing dates from late 1960.
Almost immediately after the 1965 meeting, Donald Davies conceived of the details of a store-and-forward packet switching system.
Then, in June 1966, Davies wrote a second internal paper, "Proposal for a Digital Communication Network", in which he coined the word packet – a small sub-part of the message the user wants to send – and also introduced the concept of an "interface computer" to sit between the user equipment and the packet network.
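As a rough illustration of the packet idea – a minimal C sketch, not Davies' actual design, with an arbitrarily chosen 16-byte payload and a made-up header – the following splits a message into numbered sub-parts that a receiver could reassemble:

    #include <stdio.h>
    #include <string.h>

    /* Illustrative only: a tiny fixed packet size and an invented header. */
    #define PAYLOAD_SIZE 16

    struct packet {
        int  seq;                    /* position within the message        */
        int  len;                    /* bytes of payload actually used     */
        char payload[PAYLOAD_SIZE];  /* a small sub-part of the message    */
    };

    int main(void)
    {
        const char *message = "This message is split into small packets.";
        size_t remaining = strlen(message);
        int seq = 0;

        while (remaining > 0) {
            struct packet p;
            p.seq = seq++;
            p.len = remaining > PAYLOAD_SIZE ? PAYLOAD_SIZE : (int)remaining;
            memcpy(p.payload, message, p.len);
            message   += p.len;
            remaining -= p.len;
            printf("packet %d: %.*s\n", p.seq, p.len, p.payload);
        }
        return 0;
    }

The sequence number stands in for the addressing and ordering information that real packet headers carry.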
In nearly all respects, Davies' original proposal, developed in late 1965, was similar to the actual networks being built today.
The first packet-switching network was implemented at the National Physical Laboratory in the United Kingdom. It was quickly followed by the ARPANET in 1969.
The system first went 'live' early in 1969.
The set of algorithms, equations and arcane mathematics that makes up public key cryptography is a crucial technology for preserving privacy on the Internet and making commerce on it possible. Some hail its discovery as one of the most important accomplishments of 20th-century mathematics, because it allows two people to set up a secure phone call without meeting beforehand. Without it, there would be no privacy in cyberspace.
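As a toy illustration of how two parties can agree on a shared secret without meeting beforehand, here is a minimal Diffie-Hellman-style exchange in C; the tiny values (p = 23, g = 5, and the private keys 6 and 15) are assumptions chosen for readability and are far too small to be secure:

    #include <stdio.h>

    /* Toy key exchange with deliberately tiny, insecure numbers.
       Real systems use values hundreds of digits long. */

    /* Compute (base^exp) mod m by repeated squaring. */
    static unsigned long pow_mod(unsigned long base, unsigned long exp, unsigned long m)
    {
        unsigned long result = 1;
        base %= m;
        while (exp > 0) {
            if (exp & 1)
                result = (result * base) % m;
            base = (base * base) % m;
            exp >>= 1;
        }
        return result;
    }

    int main(void)
    {
        const unsigned long p = 23, g = 5;   /* public prime and generator  */
        const unsigned long a = 6, b = 15;   /* private keys, never shared  */

        unsigned long A = pow_mod(g, a, p);  /* Alice publishes A */
        unsigned long B = pow_mod(g, b, p);  /* Bob publishes B   */

        /* Each side combines the other's public value with its own secret. */
        printf("Alice's shared secret: %lu\n", pow_mod(B, a, p));
        printf("Bob's   shared secret: %lu\n", pow_mod(A, b, p));
        return 0;
    }

Both sides compute the same value (2 with these numbers) even though only the public values A and B are ever transmitted.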