May 17, 2017 | Last updated on April 23, 2024

The History of Software: 6 Revolutionary Innovations

Written by Pete Nystrom

The combination of intuitive software and powerful hardware working together has unlocked a myriad of possibilities. It is the behind-the-scenes miracle that allows our smartphones, our computers, and even our watches to function.

The rise of software has also given birth to an entire industry, giving people like us at Seamgen an opportunity to specialize in its craft. It may be hard to believe, but we were not always surrounded by this much software. Have you ever wondered how the history of software started?

Even before counting off the greatest innovations that helped build today’s software industry, the word “software” should be defined. The term comes from American statistician John W. Tukey. In “The Teaching of Concrete Mathematics,” published in the January 1958 issue of The American Mathematical Monthly, Tukey described software as follows:

Today the ‘software’ comprising the carefully planned interpretive routines, compilers, and other aspects of automative programming are at least as important to the modern electronic calculator as its ‘hardware’ of tubes, transistors, wires, tapes and the like

The important distinction here is how Tukey separates hardware from software. Before Tukey’s definition, software was rarely discussed as an entity in its own right, largely because software and hardware were not clearly distinguished during this period.

It was not until after the invention of programming languages and operating systems that the software industry took off. Without them, everything was up to the individual programmer.

It’s hard to imagine computers today without programming languages or operating systems, but in the pre-software-industry era, “computer” did not mean the fancy machines we have today. Rather, the word described people whose occupation was to compute.

1. Alan Turing and the Turing Machine

Known as the father of modern computing, Alan Turing is perhaps best remembered for breaking the codes of the German Enigma machine in World War II.

The Turing machine was a theoretical construct on which mathematical calculations could be carried out step by step.

According to Stephen Wolfram’s A New Kind of Science, the machine works similarly to mobile automata in that it consists of a line of cells, known as the “tape”, together with a single active cell, known as the “head”. But unlike a mobile automaton, the head in a Turing machine can have several possible states.


What this meant was that complex calculations could be broken down into simple, mechanical operations on symbols, which is the backbone of computer programming.
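
To make that concrete, here is a minimal, hypothetical sketch of a Turing machine in Python. The program table, tape alphabet, and binary-increment example are our own illustration, not drawn from Turing’s work: each rule maps the head’s current state and the symbol under it to a new state, a symbol to write, and a direction to move.

```python
# Minimal Turing machine simulator (illustrative sketch only).
# The example program increments a binary number written on the tape.
from collections import defaultdict

def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
    """program maps (state, symbol) -> (new_state, write_symbol, move)."""
    cells = defaultdict(lambda: blank, enumerate(tape))  # the "tape": a line of cells
    head = 0                                             # the single active cell ("head")
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells[head]
        state, cells[head], move = program[(state, symbol)]
        head += 1 if move == "R" else -1
    out = "".join(cells[i] for i in range(min(cells), max(cells) + 1))
    return out.strip(blank)

# Rules: scan right to the end of the number, then add 1 with carry while moving left.
increment = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt",  "1", "L"),
    ("carry", "_"): ("halt",  "1", "L"),
}

print(run_turing_machine(increment, "1011"))  # binary 11 + 1 -> "1100"
```

A handful of states and symbols like this is enough to express any computation a modern programming language can, which is exactly why the construct matters.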

2. SAGE

The 1950s saw the rise of electronic computing, and arguably the most significant innovation of the era was the construction of SAGE (Semi-Automatic Ground Environment). Created as a real-time air defense system for the United States, it became the first computer network.
Characterized by its enormous hardware and large displays, its operational program ran to close to a quarter of a million instructions, with over a million lines of code written for the project overall. The contract to build it went to a company then still relatively new to computing, IBM.

The same IBM that is now revolutionizing medicine with AI.

Here is an original advertisement for the SAGE system.


3. OS/360

Thanks to SAGE, IBM got a head start in the computer industry and went on to revolutionize computing with a family of computers that shared a consistent architecture. This mainframe family, the System/360 (S/360), became a huge success as government organizations and businesses adopted it.

On the software side, the OS/360 that ran on it was nothing spectacular, but it was a huge advancement, not only because of the resources poured into developing it but also because it allowed the computer to run multiple jobs at a time, an early form of multitasking.
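
The idea is easier to picture with a modern analogy. The sketch below is a loose, hypothetical illustration in Python, not a model of how OS/360 actually scheduled work: several “jobs” are in the system at once, and while one waits on slow I/O the others keep making progress.

```python
# Loose modern analogy for running multiple jobs at once (not how OS/360 worked):
# while one job waits on slow I/O, the others keep making progress.
import threading
import time

def job(name, io_seconds):
    print(f"{name}: started")
    time.sleep(io_seconds)  # stand-in for waiting on a tape drive or card reader
    print(f"{name}: finished after {io_seconds}s of 'I/O'")

jobs = [threading.Thread(target=job, args=(f"JOB{i}", i)) for i in (3, 2, 1)]
for t in jobs:
    t.start()   # all three jobs are now "in the system" at the same time
for t in jobs:
    t.join()    # JOB3 entered first but finishes last; total time is ~3s, not 6s
```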

4. Database Management Systems

By the 1970s, IBM had opened up the software market, and the industry was edging closer to its modern-day form. There was just one thing holding it back: data management was still an issue.

Progress was being made in leaps and bounds, but it was the concept of the relational database that revolutionized our approach to data. The relational model called for data structures to be standardized, meaning different programs could use the same data, and it laid the foundation for companies such as Oracle to rise.

IBM’s work on the relational database management system (RDBMS) led to System R and to SQL (Structured Query Language), which became the standard language for querying relational databases.
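
To make the idea concrete, here is a small hypothetical sketch using Python’s built-in sqlite3 module; the table and rows are invented for illustration. The point is that the data lives in a standardized table, and any program that speaks SQL can run the same declarative query against it.

```python
# Hypothetical sketch: a standardized relational table that any SQL-speaking
# program can query, regardless of how the data is physically stored.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
conn.executemany(
    "INSERT INTO employees (name, dept) VALUES (?, ?)",
    [("Ada", "Engineering"), ("Grace", "Engineering"), ("Edgar", "Research")],
)

# The same declarative query works no matter which program issues it.
query = "SELECT name FROM employees WHERE dept = ? ORDER BY name"
for (name,) in conn.execute(query, ("Engineering",)):
    print(name)  # Ada, Grace
conn.close()
```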

5. The Minicomputer and the Microcomputer

Before the microcomputer came the minicomputer. In 1965, Digital Equipment Corporation released the PDP-8 (Programmed Data Processor-8), which made computing far more affordable.

Although nothing like the powerhouse specs of today’s hardware, the PDP-8 shipped with 4K (4,096) 12-bit words of core memory and cost around $18,000. That price tag did not, of course, make it affordable for the everyday person, but back then computers were for science and engineering; ordinary citizens had no need for a machine like this.

The affordability and size of the PDP-8 created hopes that one day computers would be even smaller, and could be bought and used by individuals. The era of the personal computer was fast approaching.

(Image: an early minicomputer. Source: rs-online.com)

The birth and popularization of the microcomputer can be traced to the commercialization of the transistor and the integrated circuit, which allowed computer memory and logic circuits to sit on a single silicon chip.

The transistor was developed with the help of William Shockley, who would later found Shockley Semiconductor, the company from which the “Traitorous Eight” emerged.

The Traitorous Eight left Shockley Semiconductor to found their own company, Fairchild Semiconductor. From Fairchild came powerhouses such as Intel and AMD, and with them the beginnings of Silicon Valley.

(Image: Robert Noyce. Source: britannica.com)

Thanks to companies such as Fairchild and Intel, a microcomputer revolution took place in the 1970s and 1980s. It was during this technological revolution that Microsoft and Apple appeared.

6. The Internet

It is hard to imagine the Internet not being easy to navigate. Yet in the era before the graphical browser, moving between locations on the Internet meant dealing with HTML and HTTP directly through text-based tools.

The browser was a truly revolutionary piece of software: its original purpose was to let users follow hypertext links with a click while displaying graphics and text on the same page.

This was realized with Mosaic.
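
Under the hood, the job Mosaic automated can be roughly sketched in a few lines: fetch a page over HTTP, parse the HTML, and collect the hypertext links a user could then follow with a click. The example below is a hypothetical illustration using only Python’s standard library (the URL is a placeholder), not how Mosaic itself was implemented.

```python
# Rough sketch of a browser's core loop: fetch HTML over HTTP(S),
# then pull out the hyperlinks a user could follow.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":                      # hypertext anchors become clickable links
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

url = "https://example.com"                 # placeholder URL for illustration
html = urlopen(url).read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(html)
print(f"Found {len(collector.links)} link(s) on {url}:")
for link in collector.links:
    print(" -", link)
```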


Thanks for reading!

History runs deep and rich, but we hope you enjoyed this brief look at the history of software. Without it, you wouldn’t be reading this blog. Want to read more about technology? We’ve got more where that came from, only on the Seamgen blog.

The State of Internet Encryption Technologies 

Neuralink: Elon Musk Wants Humans to be Smarter

Value Propositions: Decoding That Powerful Sentence 

Written by Pete Nystrom
VP of Engineering, Seamgen
Software architect, full-stack web/mobile engineer, and cloud transformation expert
