
Key Moments in the History of Computer Science Coding

There’s no doubt that computers have changed both the world and the way we live our lives drastically. While the rise of the personal computer may have only happened in the past few decades, computer coding as we know it started much, much earlier. Here are some of the key moments in computer science coding history that helped to shape the digital world into what it is today.

Binary number system (1703)

Binary code is the primary language that computers use to communicate. Whether it’s a video of a cat playing a piano or the complicated algorithms of the most powerful supercomputer, everything you see on your screens and all the internal workings of your devices use binary.

While it might be one of the most important languages in computer science coding, the binary system was invented long before the first computer was even dreamed of. German philosopher and mathematician Gottfried Wilhelm Leibniz published his description of the modern binary number system in 1703.

Binary code uses only two symbols: 0 and 1, otherwise known as “off” and “on.” Each digit represents a power of two, and arranging groups of 0s and 1s lets a computer represent numbers, letters, and instructions. In standard ASCII encoding, for example, the letter “H” is 01001000, which is 64 + 8 = 72.

If you’d like to write “Hello, how are you?” in binary using 8-bit ASCII codes, it would look like this:

01001000 01100101 01101100 01101100 01101111 00101100 00100000 01101000 01101111 01110111 00100000 01100001 01110010 01100101 00100000 01111001 01101111 01110101 00111111
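To see how this works in practice, here is a minimal Python sketch that converts text to 8-bit binary codes and back (the text_to_binary and binary_to_text helpers are just illustrative names, not part of any standard library):

```python
def text_to_binary(text):
    # ord() gives each character's ASCII/Unicode code point;
    # format(..., "08b") writes that number as an 8-digit binary string.
    return " ".join(format(ord(ch), "08b") for ch in text)

def binary_to_text(bits):
    # int(byte, 2) reads a binary string back into a number,
    # and chr() turns that number back into a character.
    return "".join(chr(int(byte, 2)) for byte in bits.split())

message = "Hello, how are you?"
encoded = text_to_binary(message)
print(encoded)                  # starts with 01001000 (72, the letter "H")
print(binary_to_text(encoded))  # Hello, how are you?
```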

Punch cards (1801)

A punch card machine at work
Credit: PixHouse/ iStock

Long before the invention of the first computer, in 1801, a man named Joseph Marie Jacquard was looking for a way to allow an unskilled worker to weave complex patterns on a loom. He devised a way to use cards with holes punched in them to determine which threads should be raised each time the shuttle passed through the loom. That way, a worker only had to make sure the loom was functioning properly, while the punch cards handled all the complex patterning.

Of course, Jacquard’s punch cards had applications far beyond the textile industry. Pretty soon, everyone from railroads to the federal government was using punch cards for their information storage needs. In fact, much of the information from the U.S. Census of 1890 was stored on punch cards.

Punch cards and punched tape were adopted for computer use, particularly by IBM, to store, sort, and report information. This method was used in computer systems up until the 1970s.
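As a rough illustration of how data storage on cards worked (a simplified sketch, not an exact historical card format), each column of a classic 80-column card encoded one character as a pattern of punched rows; the Python snippet below models only the simplest case, where a single punch in rows 0 through 9 stands for that digit:

```python
# Simplified model of one punch card column: a set of punched row numbers.
# On classic 80-column cards a single punch in rows 0-9 encoded that digit;
# letters combined a "zone" punch with a digit punch (not modeled here).

def decode_digit_column(punched_rows):
    if len(punched_rows) == 1:
        row = next(iter(punched_rows))
        if 0 <= row <= 9:
            return str(row)
    return "?"  # anything else would need the full encoding table

# A card fragment storing the year 1890, one digit per column:
card = [{1}, {8}, {9}, {0}]
print("".join(decode_digit_column(col) for col in card))  # prints 1890
```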

Integrated circuit (1949)

In 1949, a German engineer named Werner Jacobi filed a patent for a semiconductor device that placed several small transistors on a single substrate, replacing the big, bulky vacuum tubes used in electronics at the time. Even though the new approach could shrink devices down and, unlike vacuum tubes, didn't need to warm up, there seemed to be little interest in it.

Jumping ahead to 1957, an engineer named Jack Kilby proposed to the U.S. Army an idea for creating small ceramic wafers that packed miniaturized components together, a step toward the integrated circuit and similar in spirit to Jacobi's original idea. He built his first prototype integrated circuit in 1958 at Texas Instruments and applied for a patent in 1959. The new circuits were smaller, faster, and more reliable than vacuum-tube designs. Computers that took up an entire room could now be shrunk down to fit inside of a small(ish) box. The idea took off, and soon everyone from the Air Force to NASA was using integrated circuits; they even flew on the Apollo 11 mission to the moon.

COBOL (1959)

Zoomed in computer screen showing coding work
Credit: markusspiske/ Unsplash

Before 1959, every computer manufacturer used its own programming language, which meant that nothing was compatible. Frustrated with the lack of cooperation between systems, a group of programmers got together to design the first programming language intended to run on computers made by different manufacturers. Their COmmon Business-Oriented Language was called COBOL.

Over time, dozens of new programming languages were created that followed COBOL's example of code that isn't tied to a single manufacturer's hardware. Today, some of the most popular programming languages are:

  • JavaScript
  • Python
  • Java
  • PHP
  • C#
  • C++

The Space Race (1961)

The Cold War was a time of tension and fear between the United States and the Soviet Union. But out of that fear came progress. The competition between the two countries led to a whole host of new technologies.

Prior to the Cold War, computer advancements had largely been made by private companies and researchers. Once President John F. Kennedy vowed that the U.S. would beat the Soviets to the moon, government funding poured into computer research.

To get Neil Armstrong to the moon, NASA enlisted the help of IBM to build the computers and systems required to navigate the spacecraft, monitor safety procedures, and perform millions of calculations. While the development of a 36-kilobyte memory system might not seem impressive by modern standards, in the 1960s it was a big deal. The new computers developed for the Space Race surpassed everything that had come before and pushed the world firmly into the technological age.

World Wide Web (1989)

Man sits at a laptop and types, with coffee and books on the desk
Credit: G-Stock Studio/ Shutterstock

In the late 1980s, Tim Berners-Lee, a computer scientist working at the European Organization for Nuclear Research (CERN), developed a program that could store information in files that also contained links to and from other files, a technique known as "hypertext." He eventually used hypertext to link documents across networked computers.
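As a loose illustration of the linking idea (a minimal Python sketch, not Berners-Lee's actual software, with document names made up for the example), the snippet below models documents that store both their text and named links to other documents, so a reader can hop from file to file:

```python
# Minimal model of hypertext: each document holds text plus links to
# other documents, and following a link just looks up another document.

documents = {
    "cern-intro": {"text": "CERN hosts many research groups.",
                   "links": ["people", "projects"]},
    "people":     {"text": "Directory of researchers.",
                   "links": ["cern-intro"]},
    "projects":   {"text": "List of active experiments.",
                   "links": ["cern-intro"]},
}

def show(doc_id):
    doc = documents[doc_id]
    print(f"{doc_id}: {doc['text']}")
    print("  links ->", ", ".join(doc["links"]))

show("cern-intro")   # lists its links: people, projects
show("people")       # following a link leads back to cern-intro
```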

Not one to think small, Berners-Lee proposed a grand idea: what if there were a global hypertext document system that could link computers all over the world? He drew up a proposal in 1989 and got to work. Soon after, the World Wide Web was introduced, changing both the world and computer science forever.