History of computers

When were computers invented? Malwarebytes covers the history of computers and computer precursors.

Apollo 11, the American spaceflight that carried the first humans to land on the moon in 1969, couldn't have done so without its state-of-the-art computer. The system had roughly 4KB of RAM and 72KB of ROM, with a processor that ran at about 0.043 MHz, and no disk storage at all.

Computers have come a long way since then.

A modern mobile phone with 512GB of storage and a 2500 MHz processor has over seven million times more memory and orders of magnitude more processing power than the Apollo Guidance Computer (AGC). Moreover, a mobile phone can fit in your pocket. Carrying the roughly 32 kg (70 lb) AGC around in a pocket would have been an almost impossible task.

The evolution of computers didn't happen overnight. It's taken years for computers to become smaller, more powerful, and multifunctional. We can actually trace the lineage of computers to the early 1800s.

What is the history of computers?

The 1800s

  • 1801: Joseph Marie Jacquard, a French weaver and merchant, develops a loom that employs wooden punch cards to weave fabric designs automatically, planting the seeds for early computers.
  • 1821: Charles Babbage, an English mathematician, comes up with the Babbage Engines. These machines use technologies that will inspire the development of modern computers.
  • 1843: English mathematician Ada Lovelace adds extensive notes while translating a French paper on Babbage's Analytical Engine. These notes include, in essence, an algorithm, widely regarded as the world's first computer program.
  • 1890: Herman Hollerith, an inventor, creates a punch card system to help the U.S. government tabulate the 1890 census. Hollerith later goes on to create a little company eventually known as International Business Machines Corporation (IBM).

The 1900s

  • 1936: The father of modern computers, Alan Turing, comes up with the idea for a machine called the Turing machine that can compute anything computable. The British scientist later helps develop the Turing-Welchman Bombe, a device that deciphers Nazi codes and plays a critical role for the Allies in World War II.
  • 1937: American physicist, inventor, and professor, John Vincent Atanasoff, lays the groundwork for the first electronic digital computer in a grant proposal. His concept famously lacks the cams, belts, gears, or shafts found in typical machines of the mid-1900s.
  • 1939: Two men named Bill Hewlett and David Packard toss a coin in a garage in Palo Alto, California, to decide the name of their technology company. The coin lands, and they agree to call their company Hewlett-Packard. Their first product is the HP 200A Audio Oscillator. It helps Walt Disney Pictures test sound for theaters that go on to show Fantasia a year later.
  • 1941: German inventor Konrad Zuse creates the Z3, the world's first working programmable, fully automatic digital computer. Unfortunately, the machine is destroyed in a 1943 Allied bombing raid. Zuse flees Nazi Germany and creates the Z4 nearly a decade later.
  • 1941: Atanasoff and his student Clifford Berry create the first computer that can store data within its memory.
  • 1943: University of Pennsylvania professors John Mauchly and J. Presper Eckert begin building a large digital computer powered by some 18,000 vacuum tubes. Called ENIAC (Electronic Numerical Integrator and Computer), it becomes the world's first programmable, general-purpose electronic digital computer when completed in 1945.
  • 1946: After leaving the University of Pennsylvania, Mauchly and Eckert secure funding from the Census Bureau to build the UNIVAC, the world's first commercial computer for businesses and the government.
  • 1951: Christopher Strachey writes the first successful artificial intelligence (AI) program.
  • 1952: IBM releases its first commercial scientific computer called the IBM 701 Electronic Data Processing Machine.
  • 1953: Grace Hopper, a computer scientist, develops the ideas behind FLOW-MATIC, a programming language that uses English-like statements. It is later expanded to help create the high-level programming language COBOL.
  • 1953: The University of Cambridge Computer Laboratory starts the world's first computer science degree program.
  • 1958: Jack Kilby invents the integrated circuit, laying the groundwork for the modern computer chip.
  • 1962: IBM debuts the 1311 Disk Storage Drive, leading to a new phase of removable disk storage and marking the end of the punched card era.
  • 1964: Douglas Engelbart, with assistance from Bill English, creates a prototype for the first computer mouse.
  • 1968: Bernard Lechner of RCA Laboratories conceives the idea of a TFT-based liquid-crystal display (LCD).
  • 1969: Computer scientists Ken Thompson and Dennis Ritchie create the first version of the highly influential operating system, UNIX.  
  • 1969: Advanced Micro Devices, Inc. (AMD) is founded by Walter Jeremiah (“Jerry”) Sanders and seven others.
  • 1970: Intel introduces the first commercially available dynamic random-access memory (DRAM) integrated circuit (IC), called the Intel 1103.
  • 1971: IBM introduces the world to an 8-inch flexible plastic memory disk that everyone will eventually call the "floppy disk."
  • 1972: Sherwin Gooch invents the first sound card, the Gooch Synthetic Woodwind.
  • 1972: Atari releases Pong, the first commercially successful video game.
  • 1973: A team at Xerox Corporation's Palo Alto Research Center (Xerox PARC) in California creates Ethernet.
  • 1975: Two boyhood friends from Seattle convert the programming language BASIC for use on a personal computer. Not long afterwards, the duo, Bill Gates and Paul G. Allen, name their new company Micro-Soft. Eventually, the name is changed to the Microsoft Corporation.
  • 1976: Dataram shows off the world’s first solid-state drive.
  • 1976: Gary Starkweather invents the first laser printer at a Xerox research lab.  
  • 1976: Two college dropouts introduce the world to one of the first computers built on a single circuit board. The men, Steve Jobs and Steve Wozniak, name their company Apple Computer, Inc.
  • 1977: The Apple II launches with color graphics and an audio cassette storage drive.
  • 1978: The White House installs its first computer, a Hewlett-Packard HP 3000.
  • 1979: Toshitada Doi and Kees Schouhamer Immink work on a task force for Sony and Philips to develop the Compact Disc Digital Audio (CD-DA) standard. A few years later, the CD-DA evolves into the CD-ROM.
  • 1980: Atari Inc. releases Battlezone, the first popular game to use 3D graphics.
  • 1981: Hot on the heels of Apple, IBM releases its Personal Computer (codenamed "Acorn"), with an Intel chip, two floppy drives, and an optional color monitor.
  • 1982: Time Magazine recognizes the fast-changing landscape and names the computer as the “Machine of the Year” in place of the “Man of the Year.”
  • 1982: Japanese company Denon develops the CD-ROM and introduces it to Sony two years later.
  • 1983: Home productivity hits a milestone as Microsoft releases Word 1.0 for Xenix and MS-DOS.
  • 1984: Apple unveils the Macintosh in a now-famous ad during the Super Bowl. The original price is around $2,500.
  • 1984: CompuServe launches Islands of Kesmai, the first commercial multiplayer online role-playing game.
  • 1985: Microsoft launches a Graphical User Interface (GUI) as an extension of MS-DOS. It calls it Windows.
  • 1985: Array Technologies Inc, later known as ATI Technologies, is founded.
  • 1985: The first edition of the C++ programming language debuts.
  • 1986: Two brothers in Pakistan running a computer store create the first PC virus called Brain.
  • 1989: British scientist Tim Berners-Lee invents the World Wide Web (WWW) to help share information between scientists worldwide.
  • 1989: Nintendo releases an 8-bit handheld gaming console called the Game Boy.
  • 1989: Creative Labs releases the Sound Blaster 1.0 sound card, starting a revolution in PC audio.
  • 1991: The world’s first website goes live.
  • 1991: Finnish student Linus Torvalds creates a new operating system kernel. It’s later known as the Linux kernel.
  • 1992: The evolution of Windows continues with the launch of Windows 3.1.
  • 1992: Mark Segal and Kurt Akeley of Silicon Graphics release an application programming interface (API) called OpenGL (Open Graphics Library).
  • 1993: Apple enters the handheld computer market with the Newton.
  • 1993: Jensen Huang, Chris Malachowsky and Curtis Priem create graphics chips company NVIDIA.
  • 1993: Intel introduces its successor to the 80486 microprocessor. Called the Pentium, it features a superscalar design that can execute two instructions per clock cycle.
  • 1994: Sony Computer Entertainment achieves a milestone in console gaming with the release of the PlayStation.
  • 1995: Microsoft debuts Windows 95, a significant step up from the Windows 3.x series, introducing many features still around in the latest iteration of the operating system.
  • 1995: Microsoft launches an API called DirectX. The DirectX vs OpenGL debate will continue for years.  
  • 1995: 3dfx releases a 3D accelerator add-in card called Voodoo Graphics. The Voodoo series helps start a 3D gaming revolution.
  • 1996: Uh-oh, ICQ, the first instant messaging service to gain worldwide popularity, hits the Internet. 
  • 1996: Two Ph.D. students at Stanford University, Larry Page and Sergey Brin, start a research project. Eventually, they come up with a new name for their project: Google.
  • 1997: The IEEE ratifies the 802.11 wireless networking standard, leading everyone to ask a few years later: “What does Wi-Fi stand for?”
  • 1997: NVIDIA launches the RIVA series of graphics processors. The RIVA 128 is one of the first consumer GPUs to integrate 3D acceleration and conventional 2D.
  • 1998: Apple launches a consumer desktop for $1,299 called the iMac.
  • 1999: Kyocera Visual Phone VP-210, the first commercial camera phone, debuts in Japan.
  • 1999: NVIDIA launches the landmark GeForce 256 GPU, offering top 3D graphics quality.
  • 1999: AMD gives Intel something to think about with the release of the Athlon processor.

The 2000s

  • 2000: ATI launches the Radeon, heating up the competition with NVIDIA.
  • 2000: NVIDIA buys 3dfx, marking the end of a pioneer of 3D graphics and video cards.
  • 2001: IBM introduces the world's first multicore processor.
  • 2001: Apple launches the first version of Mac OS X. It’s based on the UNIX architecture.
  • 2001: Microsoft launches the critically acclaimed Windows XP.
  • 2001: Microsoft releases the Xbox to compete with Sony’s PlayStation 2 and Nintendo’s GameCube.
  • 2003: Android Inc. starts work on an operating system for digital cameras, which pivots to an operating system for smartphones a year later.
  • 2003: AMD ships the first x86-based 64-bit processor.
  • 2004: After feeling frustrated with the performance of antivirus software on his mother’s computer, Marcin Kleczynski informally establishes Malwarebytes Inc.
  • 2004: Firefox 1.0 launches and earns 100 million downloads within a year.
  • 2005: The first dual-core chips for x86-based computers hit the market, including Intel’s Pentium D.
  • 2005: Three former PayPal employees, Steve Chen, Chad Hurley, and Jawed Karim, create a platform for sharing videos called YouTube.
  • 2005: Google buys Android for $50 million as part of its expansion plans.
  • 2006: Malwarebytes releases anti-malware software to defend computers from different types of potentially unwanted programs.
  • 2006: Google makes its second excellent acquisition when it buys YouTube for $1.65 billion.
  • 2006: AMD acquires ATI for $5.4 billion to expand its line of products.
  • 2006: Intel introduces its first quad-core CPU.
  • 2006: Apple launches a new Macintosh notebook called the MacBook.
  • 2007: HP is one of the first companies to release a smart TV for the masses. Samsung follows up with its own smart TV a year later.
  • 2007: Apple introduces the world to the groundbreaking iPhone. The mobile phone features an intuitive touch screen design and is feature-rich. It’s essentially a small pocket-sized computer that can make phone calls.
  • 2008: Google launches a cross-platform web browser called Google Chrome.
  • 2009: Microsoft launches the well-received Windows 7 operating system.
  • 2009: The first Samsung Galaxy goes on sale, starting a mobile phone rivalry with the iPhone.
  • 2010: Apple acquires the voice assistant program Siri for its products.
  • 2010: Apple introduces its tablet computer, the iPad.
  • 2011: Google launches the Chromebook, a low-cost, lightweight laptop that relies on an Internet connection and cloud services.
  • 2014: Amazon introduces the world to the AI-powered voice assistant, Alexa.
  • 2015: Apple releases a small and powerful wearable device called the Apple Watch.
  • 2016: Google releases its own AI-powered voice assistant, the Google Assistant.
  • 2016: Oculus VR releases the Oculus Rift lineup, a landmark for consumer-focused virtual reality (VR) technology.
  • 2016: Niantic launches Pokémon GO. The game goes on to earn over 500 million downloads worldwide and helps promote augmented reality (AR) technology.
  • 2016: Scientists create the first reprogrammable quantum computer.
  • 2020: AMD releases the Threadripper 3990X, the world’s first 64-core desktop processor.
  • 2020: Sony and Microsoft launch the latest in video game consoles with the PlayStation 5 and the Xbox Series X.
  • 2021: Facebook CEO Mark Zuckerberg rebrands the company as Meta and introduces the world to the metaverse, a virtual universe with 3D spaces and environments for people and businesses.


In a few short decades, computers have evolved in multiple fascinating ways. Many intelligent and creative men and women have had a hand in this amazing journey. Just one small step forward from one innovator can propel a giant leap for the entire industry.
