The computer industry consists of different computer-related businesses that offer various kinds of services for consumers, including developing software or programs, designing hardware, creating infrastructures on the web, and manufacturing important components like chips, processors, and transistors.
Computers are widely influential in other industries because their attributes, functions, and features help companies gain easier access to business-related data and information, all of which can be stored on a computer or through the internet. Beyond industry, computers are also a part of every person's daily life, as they come in the form of PCs (personal computers), laptops, tablets, and even smartphones that most people use every day.
History of the Early Computers
The concept of “computing” predates the beginning of the computer industry itself, as ancient peoples had already experimented with different methods of counting days, items, and numbers in general.
One of the first computing devices, the abacus, was invented around 300 BC in Ancient Babylonia, although this early iteration of the abacus had a different name and mechanics. The first abacus, often referred to as the oldest surviving counting board, is the Salamis Tablet, which was rediscovered in modern times in 1846 on Salamis, an island in Greece. The Salamis Tablet is a slab of white marble 149 centimeters long, 75 centimeters wide, and 4.5 centimeters thick, engraved with two sets of five horizontal lines and one vertical line at the center. The tablet is believed to have been a precursor of the modern abacus and was used to perform mathematical calculations.
About 200 years after the invention of the Salamis Tablet, its calculating capabilities were improved upon through the invention of the Roman calculi and hand abacus. Most of the calculating devices that survived into the modern era were made of sheets of metal or solid slabs of stone.
Around 1000 AD, the hand abacus made its way to Asia, specifically to China and Japan, where it was further improved and became known as the “suan-pan” and the “soroban,” respectively. The abacus we use today is similar to the one developed in China.
Creation of Mechanical Calculators
The first mechanical computing device was invented in 1623 by Wilhelm Schickard, a German polymath who designed a calculator based on “Napier’s bones,” a calculating aid created by Scottish mathematician John Napier to improve upon the computing abilities of the abacus. Unfortunately, one of Schickard’s machines was destroyed in a fire in 1624, and he was so disheartened by the incident that he never built another calculator.
Meanwhile, a young French mathematician named Blaise Pascal began working on designs and mechanisms for a mechanical calculator in 1642. After roughly 50 prototypes built and scrapped over a period of three years, Pascal finally produced a working mechanical calculator. He then wrote a pamphlet describing the calculator’s features, along with a dedication letter to Pierre Séguier, the then-Chancellor of France, who had encouraged Pascal to continue developing the machine. The calculator would eventually be named the “Pascaline” and would become one of the most important machines in the history of computers. Pascal built twenty Pascalines between 1645 and 1655, and only nine of them survive today, most on display in various museums in Europe.
Electric and Electronic Calculators
In the 20th century, the mechanical calculators of the past evolved into calculating machines driven by electric motors. During this time, “computing” also became a profession, one often held by women, who were themselves called “computers.” From the 1930s to the 1950s, the calculators that companies used were desktop mechanical calculators manufactured by Monroe, Friden, and Marchant.
A portable calculator called the “Curta” was invented by Austrian engineer Curt Herzstark in 1948 as a way to minimize the space taken up by desktop calculators in offices. The Curta was cylindrical in shape and had several sliders used to enter numbers. It would eventually inspire the creation of other portable calculators that are still used today in schools, offices, and stores.
The first all-electronic desktop calculator, the ANITA Mark VII, was introduced in 1961 by Bell Punch Co., a British company that focused on ticket machines and desktop calculators. It ran on vacuum tubes, the same technology used in first-generation computers, which accounted for its weight and bulk given the many components attached to the tubes.
The First Computer
Although “computing” was already done through various devices and machines invented from 300 BC up to the 1600s, the term “computer” referring to a machine that can compute numerical data was only applied once the Babbage Difference Engine was designed in the 1820s. Many history books credit the Babbage Difference Engine, developed by the prolific English mathematician Charles Babbage, as the very first computer ever designed. Despite that credit, however, Babbage actually failed to build a working difference engine.
There are many theories as to why Babbage was unable to build a working difference engine: some suggest that his personal wealth couldn’t cover the building expenses, while others believe it was due to a lack of government funding. Fortunately, Babbage left behind very detailed drawings and notes so that others could build the machine in the future. Many years later, in 2002, a working difference engine called the “Difference Engine No. 2” was completed, faithful to Babbage’s notes and blueprints.
The Creation of ENIAC
The computer that is widely regarded as the grandfather of all modern computers is the ENIAC, known by its full name Electronic Numerical Integrator and Computer. The ENIAC was the first electronic, programmable, general-purpose digital computer, completed in 1945 to calculate artillery firing tables for the United States Army’s Ballistic Research Laboratory (later absorbed by the Army Research Laboratory).
The ENIAC was financed by the United States Army, with the project being led by Major General Gladeon M. Barnes. The contract to construct the ENIAC was signed on June 5, 1943, and the development of the computer was done secretly under the code name “Project PX” at the Moore School of Electrical Engineering, an establishment located within the University of Pennsylvania. The computer was primarily designed by J. Presper Eckert and John W. Mauchly. Eckert was a grad student at the Moore School of Electrical Engineering, while Mauchly was a professor at the same school. 
In addition to the two lead designers, the ENIAC’s appearance and features were completed by a team of design engineers, which included Jeffrey Chuan Chu, Robert F. Shaw, Thomas Kite Sharpless, Harry Huskey, Arthur Burks, Jack Davis, and Frank Mural. The programming of the ENIAC was handled by female mathematicians who would later be known as the “Top Secret Rosies”: Kay McNulty, Jean Jennings, Ruth Lichterman, Marlyn Wescoff, Frances Bilas, and Betty Snyder.
While it was intended for use during World War II, the ENIAC was only booted up for the first time in November 1945, more than two months after the end of the war. The ENIAC was then announced to the public on February 14, 1946. A day after the announcement, the computer was dedicated to the University of Pennsylvania. In 1947, the ENIAC was transferred to the Aberdeen Proving Ground in Maryland, where it operated from July 1947 to October 2, 1955.
IBM and Its Mainframes
Right around the time when computing machines were becoming more prominent in different industries, IBM was founded in 1911 as the Computing-Tabulating-Recording Company or CTR. In 1924, CTR changed its name to “International Business Machines” or IBM.
Although the company had been a leading provider of computing systems from the 1920s through the 1940s, it became a much bigger business when it released its first series of mainframes, or large computer systems, called the 700/7000 series in the 1950s. From there, IBM continued to release different series of mainframes, most notably the System/360 series, which eliminated most of the company's competitors because of how advanced and popular the architecture was. In fact, competitors either merged or went bankrupt because they could not keep up with IBM's success.
IBM 700/7000 Series
The IBM mainframes were introduced in 1952 with the launch of the IBM 700/7000 series. The 700-class machines ran on vacuum tubes, the defining technology of first-generation computers of that era, while their successors, the 7000-class machines, used transistors, the hallmark of second-generation computers.
This series is divided into two categories: computers made for scientific or engineering use, and computers made for data processing and commercial use.
IBM 1400 Series
The IBM 1400 series of decimal computers served as the successor to IBM’s tabulating machines, most notably the IBM 407. The first computer in the series, the 1401, was launched on October 5, 1959, and became IBM’s first computer to reach 10,000 units deployed. The 1401 was followed by the 1410 in 1960, the 1420 in 1962, the 1460 in 1963, and the 1450 in 1968.
IBM System/360 Series
The series of mainframe computer systems that truly put IBM in the spotlight as one of the biggest computer providers in the world is the IBM System/360, which was officially announced on April 7, 1964.
What’s interesting about the System/360 series is that it deviated from the norm: it was developed as a completely new line of computers ranging from small to large models with low- to high-performance specs, unlike most computer lines of the time, which consisted only of large machines without cheaper models. In addition, all computers in the System/360 series, whether small or large, shared the same instruction set. With this feature, users could easily upgrade to a bigger System/360 system without having to learn a new instruction set or rewrite their software, which could take a lot of time.
Because the computers were easy to use and upgrade, the System/360 series became very popular across different industries and businesses, and its design of keeping the same instruction set across various models inspired other manufacturers to adopt the same approach for their future computers.
IBM System/370 Series
The IBM System/370 served as the successor to the S/360 series and was launched on June 30, 1970. The S/370 series featured backward compatibility with the S/360, so customers could carry data from the older series over to S/370 machines. The first computers released in the series were the Models 155 and 165, which shipped in February and April 1971, respectively.
IBM 3000 Series
The IBM 3000 series was intended as the high-end version of the System/370 computers and was introduced in 1977 with the release of the IBM 3031. The most notable computer in the series, however, is the IBM 3090, which used memory chips holding more than one million bits each. In addition, the 3090 featured Thermal Conduction Modules that allowed its chips to run faster without the risk of overheating.
IBM System/390 Series
The next series released by IBM was the System/390, which carried forward the instruction set architecture of the S/360 line. The first computers in the series, the Enterprise System/9000 models, were introduced in 1990 with 18 initial models.
IBM Z Series
As of 2021, the most current series that IBM has is the Z series, a family of large mainframe computers. The line was initially named the eServer zSeries when it launched in 2000 with the z900 computer; in July 2017, IBM announced that the series would be renamed “IBM Z.”
Growth of IBM from 1999 to 2020
Because of the prominence of small, portable personal computers whose power and speed can rival the large computers of the past, IBM’s revenue has fluctuated over the years. Despite the massive popularity of the personal computer market, IBM is still one of the top computer-focused companies in the world.
In a revenue report compiled by Statista, seen in the graph above, IBM’s annual revenues in the late 2010s are at their lowest when compared to the company’s revenues from 1999 to 2014. As the industry shifts its focus toward digital technology like cloud services, IBM is trying to keep up with the trend by creating its own cloud platforms and digital services.
The Rise of Microsoft
The computer industry wouldn’t be as successful today without Microsoft, one of the companies that led innovation in the industry's early years and remains highly relevant to what is happening in the industry today.
Microsoft was founded by two childhood friends, Paul Allen and Bill Gates, on April 4, 1975. Gates and Allen founded the company in Albuquerque, New Mexico, but the office eventually relocated to Washington State in 1979 to make it easier to recruit more employees. According to Allen, he and Gates were inspired to form the company after reading the January 1975 issue of Popular Electronics magazine, which featured the Altair 8800, often credited as the first commercially successful microcomputer.
After learning about the Altair and its manufacturer, Micro Instrumentation and Telemetry Systems (MITS), Gates and Allen began working on an implementation of BASIC for the Altair. In the span of two months, Gates and Allen, along with another friend named Monte Davidoff, completed the work and struck a deal with MITS to distribute the Altair BASIC interpreter.
The big break gave Allen and Gates enough funds to form Microsoft. It was Allen who came up with the name “Micro-Soft,” combining the company’s two main focuses: “microcomputer” and “software.” By 1978, Microsoft was already successful for a startup, generating more than $1 million in revenue, and that success led the company to move from Albuquerque to a bigger office in Washington.
On June 11, 1980, Steve Ballmer was hired as the company’s first business manager. In 2000, Ballmer would replace Gates as CEO, a position he would in turn hand over to Satya Nadella in 2014. Ballmer’s hiring proved successful, as he was instrumental in Microsoft’s second big break: a contract to provide the MS-DOS operating system for IBM’s first personal computer. MS-DOS was released in 1981, and by 1983, Microsoft had become one of the most influential companies in America, generating $55 million in sales. That same year, Paul Allen left Microsoft after being diagnosed with Hodgkin’s lymphoma.
Two years after Paul Allen resigned from Microsoft, the company launched a new operating system called “Windows.” Windows featured scroll bars, drop-down menus, and other quality-of-life improvements that made personal computers easier and more convenient to use. On the strength of Windows sales, the company became the world’s biggest computer software company, and most personal computers manufactured from the mid-1980s to the early 90s shipped with Microsoft software.
Microsoft further improved upon its creation by releasing Windows 95, a new MS-DOS-based system that supported plug-and-play hardware, 32-bit applications, and a more stable interface. In place of the old Program Manager that users relied on to launch programs, Windows 95 introduced the Windows Explorer shell, the taskbar, and the Start menu, which streamlined the system instead of making it look complex.
After Windows 95 and its follow-ups (Windows 98 and Windows Me), Microsoft released Windows 2000, a successor in the NT line of operating systems first introduced in 1993. One of the company's most successful operating systems was Windows XP, released in 2001, which combined the user-friendliness of Windows 95 with the robust architecture of the Windows NT line. At launch, Microsoft offered two editions of the operating system: the Home Edition, geared toward people using computers at home, and the Professional Edition, which included exclusive networking and security features aimed at companies and employees using computers for work.
Windows XP was succeeded by Windows Vista, an operating system released in 2006 that was widely criticized as a downgrade from XP in terms of performance and user interface. From there, Microsoft did its best to follow customer feedback and returned to numbered releases, starting with Windows 7 in 2009. After the success of Windows 7, the company followed up with Windows 8 in 2012 and Windows 10 in 2015. As of 2021, Microsoft’s latest operating system is Windows 11, offered on October 5, 2021, as a free upgrade for computers running Windows 10.
Growth of Microsoft from 1990 to 2021
According to Dazeinfo, Microsoft has posted year-over-year revenue growth of 13.54%. This consistent growth in Microsoft’s annual revenue is mainly due to the company’s ability to keep up with industry trends, especially those involving cloud systems and improvements to the user interface of personal computers.
As shown in the graph above, Microsoft’s total annual revenue rose steadily, except in 2009, when it fell to $58.44 billion from $60.42 billion in 2008, and in 2016, when it fell to $91.15 billion from $93.58 billion in 2015. As Microsoft continues to innovate and keep up with the trends, its revenue can be expected to keep rising steadily within the industry.
Emergence of Apple
Besides IBM and Microsoft, another prominent company in the computer industry is Apple Inc., which was founded as the Apple Computer Company in 1976 by Steve Wozniak, Ronald Wayne, and Steve Jobs.
Steve Wozniak and Steve Jobs, often referred to as “the two Steves,” met through an acquaintance in 1971. The acquaintance was user-interface architect Bill Fernandez, who introduced Wozniak, then 21 years old, to a 16-year-old Jobs. Their business partnership began that same year, when Jobs sold two hundred units of the “blue boxes” (devices that allowed long-distance calls without any fees) that Wozniak had designed and built.
Apple’s First Product – The Apple I
By 1975, both Wozniak and Jobs had dropped out of college and begun attending meetings of the Homebrew Computer Club, a group of computer hobbyists with a knack for modifying computers. With what he learned at the club, Wozniak created his own computer on March 1, 1976, combining the best elements of the personal computers popular at the time. Wozniak initially wanted to share the schematics for the computer for free, but the business-minded Jobs urged him to build and sell bare printed circuit boards instead. A buyer could then install the board in a case and, with a monitor and a keyboard, turn it into a fully functioning computing machine.
Wozniak first offered the circuit board design to Hewlett-Packard (HP), where he was working during that time, but it was rejected five times at different meetings. Jobs then encouraged Wozniak to form a business together so that they could sell the circuit board on their own. For them to produce the first batch of printed circuit boards to sell, Wozniak sold his valuable HP-65 programmable calculator for $500, while Jobs sold his Volkswagen Type 2 minibus for $1500. 
The Apple Computer Company was then founded on April 1, 1976, with Steve Jobs, Steve Wozniak, and Ronald Wayne being registered as the co-founders of the California business partnership. Wayne was convinced by Jobs and Wozniak to become one of the co-founders of the company because of his expertise in managing a business. In exchange for agreeing to be a co-founder, Wayne would have a 10% stake in the company.
Wayne worked at the video game company Atari as chief draftsman, and he had even run a business venture of his own four years before the founding of Apple. Because that venture had not been successful, Wayne worried that Apple might fail as a business, too. So, just two weeks after the creation of Apple, Wayne left the company and sold his 10% share to Jobs for only $800.
After the company was founded, Wozniak and Jobs visited the Homebrew Computer Club one last time to demonstrate the computer, called the “Apple I,” to the club members. During the demonstration, Paul Terrell, the owner of a computer store called the Byte Shop, became interested in the machine and gave Wozniak and Jobs his business card. The next day, Jobs visited Terrell at his shop and offered to sell the printed circuit boards for the Apple I. However, Terrell would only agree to sell Apple’s product if it came already assembled as a functioning computer.
Jobs then took Terrell’s purchase order for 50 assembled computers to Cramer Electronics, an electronic parts distributor, to buy the parts needed to assemble 50 Apple Is. However, Jobs did not have the money for the parts, so he assured the store that he could pay once the Byte Shop’s 50 computers were sold.
Surprised by the large purchase order, the credit manager of Cramer Electronics first contacted Terrell to confirm that the order was real. Terrell, impressed by Jobs’s persistence, told the credit manager that Jobs would have more than enough money to pay for the parts once he sold the 50 computers. Jobs fulfilled the purchase order, and Terrell is widely credited as the first person to popularize the Apple I, which became a massive success for the company.
Apple’s Success, Decline, and Reemergence
Although the Apple I was relatively successful for a startup, Apple’s biggest break before the 2000s came in the form of the Apple II, which helped revolutionize the computer industry as one of the first personal computers with color graphics, a feature that would later appear in personal computers from other manufacturers. With the release of the Apple II, the company’s sales jumped within just a few years: according to one report, Apple’s sales went from $7.8 million in 1978 to $117 million in 1980.
Unfortunately, Wozniak left the company in 1985, having lost interest and come to believe that Apple no longer needed him. After he left, he sold most of his Apple stock. In the same year, Steve Jobs also left Apple to pursue other business ventures. Running Apple was left to John Sculley, who had been brought in to lead the company in 1983. One of the most notable moments of Sculley’s tenure was his rejection of an offer from Bill Gates (co-founder of Microsoft) to license parts of the Apple GUI (graphical user interface) for use in Windows 2.0. The rejection of the deal eventually helped turn Microsoft into one of Apple’s fiercest competitors from the late 80s through the 90s.
Because of the popularity of Microsoft and its Windows software, Apple struggled with sales as people lost interest in the company’s products. By 1996, it was being predicted that Apple would soon go bankrupt. In a move of desperation, Apple bought Steve Jobs’ company, NeXT Software, so that Jobs could effectively return to assist Apple. Following the buyout, Jobs served as interim CEO (iCEO) from 1997 until 2000, when he officially returned as the company’s CEO. Through Jobs, the relationship between Apple and Microsoft began to heal, and the two companies even collaborated on a Macintosh version of Microsoft Office.
Soon after, Jobs revamped the ideals and goals of the company, making it innovative once again with the release of the iBook, a stylish personal laptop that served as the predecessor of the popular MacBook family of notebook computers. Along with the iBook, Jobs also oversaw the release of other great products, like the iPod mp3 player, the iPhone, a smartphone regarded as the most popular phone in the world, and the iPad tablet. With Jobs at the helm once again, Apple became one of the most profitable tech-focused companies of the 2000s. Jobs passed away in October 2011 after an arduous battle with neuroendocrine cancer, but Apple retains the vision Jobs instilled in the company under the leadership of CEO Tim Cook.
Growth of Apple from 2005 to 2021
With Steve Jobs returning to lead Apple, the company experienced a resurgence in sales following the release of the iBook, iPod, iPhone, and iPad. It could be argued that without Jobs, Apple would not have survived past the late 90s. With Jobs’s vision and goals still intact even after his death, Apple is likely to continue to prosper as more people are enticed to buy the company’s products.
As seen in the graph shown above, Apple’s annual revenue increased steadily despite Steve Jobs’s death in 2011. Around the mid-2010s, Apple faced fierce competition from smartphone manufacturers in China, especially Xiaomi, which briefly overtook Apple and Samsung in 2021 to become the world’s largest smartphone manufacturer. Despite the emergence of Xiaomi and other Chinese manufacturers, however, Apple’s revenue keeps increasing each year, a sign that its products remain popular around the world and that the company will stay profitable through the 2020s.
Moore’s Law and the Increasing Power of Computers
It is evident from current trends in the computer industry that computing machines of every kind, down to the smallest smartphones, are getting more powerful each year. This increase in computing power was predicted by an American businessman and engineer named Gordon Moore, who proposed in 1965 that the number of transistors (the building blocks of computer circuitry) on an integrated circuit would double each year. In 1975, after observing a compound annual growth rate (CAGR) of about 41% in the number of components per integrated circuit, he revised his prediction: transistors would double every two years instead of every year.
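The two formulations are consistent: doubling every two years implies an annual growth rate of 2^(1/2) − 1 ≈ 41.4%, the CAGR Moore observed. A minimal sketch of that arithmetic (the function names and starting count are illustrative, not historical transistor data):

```python
def cagr_from_doubling(years_to_double: float) -> float:
    """Annual growth rate implied by doubling every `years_to_double` years."""
    return 2 ** (1 / years_to_double) - 1

def project(count: float, years: float, years_to_double: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return count * 2 ** (years / years_to_double)

rate = cagr_from_doubling(2.0)
print(f"Implied CAGR: {rate:.1%}")                      # about 41.4%
print(f"1,000 transistors after 10 years: {project(1000, 10):,.0f}")  # 32,000
```

Over a decade, five doublings multiply the count by 32, which is why the growth looks so dramatic on the charts that follow.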
Although Moore did not provide extensive evidence to support his claim at the time, his prediction held up as analysts watched transistor counts roughly double every two years. Because of its accuracy, Moore’s prediction eventually became known as “Moore’s law.”
While transistor counts have indeed kept doubling roughly every two years, some researchers predict that the law will eventually stop holding as computing power approaches its physical limits. As of 2021, however, the number of transistors in computer chips is still increasing, and with various semiconductor fabrication processes introduced in 2018, transistor production should keep pace with Moore’s law for the near future.
Transistor Count Per Year (1970 – 2021)
Based on the graph and data shown above, the effect of Moore’s Law is most prominent from the 2010s through the first two years of the 2020s, as manufacturers have developed effective fabrication methods that allow them to produce more transistors in a short period of time.
Availability of Computers Throughout the Years
Despite the number of transistors in computers doubling roughly every two years, the price of personal computers has gone down, mainly because computer parts have become less expensive. While demand for personal computers has also declined due to the increasing power and performance of smartphones, many households still have at least one PC.
Number of US Adults that Have Desktops or Laptops from 2008 to 2019
The Statista data above show that the share of US adults who own a desktop or laptop fluctuates over the years but stayed above 70% from 2008 to 2019, which means the computer penetration rate in US households is relatively stable.
The next graph shows that the penetration rate of computers around the world is steadily increasing, which seems peculiar at first given the stable ownership rate in the US. However, it is important to note that the companies that popularized personal computers were founded in the United States, making their computers more available, accessible, and affordable there than in other parts of the world, which are only now gaining much wider access to desktops and laptops.
Worldwide Computer Penetration Rate Among Households from 2005 to 2019
As evident in the graph above, the computer penetration rate among households worldwide has been steadily rising as PCs become more available and accessible. However, as smartphones became powerful enough to be considered portable computers, personal computer sales declined throughout the 2010s.
But when a global pandemic hit various parts of the world in 2020, PC sales suddenly boomed, as employees of different companies began working from home under mandates that prohibited people from gathering in enclosed spaces like offices. In addition, schools implemented study-from-home programs that allowed classes to continue despite the closure or limited capacity of educational establishments.
Besides households and offices, other industries, departments, and businesses around the world also use computers to enhance the features of certain devices or machines. One of the best examples of machines that utilize modern computers are cars, which are sometimes equipped with GPS devices and even steering and parking assistance programs.
The cars that Tesla, Inc. manufactures, like the Model S and the Model X, feature Autopilot, which enables the cars to drive themselves using advanced systems made possible by computers. Because of the modernization of cars and other devices in the early 2020s, the industry has experienced chip shortages, not simply because of a low supply of chips but because not enough of the complementary parts the chips require are being made. In that sense, the chip shortage is also a shortage of the other parts needed to build computers. According to Pat Gelsinger, the CEO of Intel, the global chip shortage may last until 2023. However, major computer manufacturers are working to solve the issue before it can affect the entire computer industry through the 2020s and beyond.
20 Interesting Facts About the Computer Industry
- The very first computer mouse was invented by Doug Engelbart in 1964. Engelbart’s mouse was primarily made of wood.
- In honor of ENIAC’s 50th anniversary in 1996, the University of Pennsylvania sponsored a project called “ENIAC-on-a-Chip,” wherein a group of engineers produced a small silicon computer chip that contained all of the power that the ENIAC had when it was invented. Because of Moore’s Law, a small computer chip in the modern era can have the same functionality as the ENIAC.
- It was Steve Jobs who proposed “Apple Computer” as the name of his and Wozniak’s company. According to Jobs, he got the name after visiting Robert Friedland’s All-One Farm in Oregon. Jobs thought that the name “Apple” sounded fun and not intimidating within the computer industry.
- More than 6,000 new viruses are written each month, which is why antivirus programs work around the clock to keep computers protected against new threats.
- The very first disk drive with 1 gigabyte (GB) of storage was the IBM 3380, introduced in 1980. The drive weighed approximately 1,000 lbs (455 kg) and was priced between $81,000 and $142,000.
- Intel’s first microprocessor, the 4004, was originally intended for use in a Busicom calculator. Its creators, Federico Faggin, Marcian Hoff, and Stan Mazor, were awarded the prestigious National Medal of Technology and Innovation (NMTI) in 2010 for pioneering the commercially produced microprocessor.
- John Lasseter, the former CCO of Pixar Animation Studios, was fired from Disney after pitching computer animation to the company’s executives. Lasseter then worked at Lucasfilm, where he gained recognition for popularizing CGI animation. Things came full circle when he became the CCO of Pixar Animation Studios, which is now owned by Disney. Lasseter directed Toy Story in 1995, Cars in 2006, and many more iconic Pixar films.
- The first woman to get a Ph.D. in computer science in the United States was Mary Kenneth Keller. Keller was an American Roman Catholic religious sister who was very knowledgeable about mathematics and physics.
- Apple and Microsoft both started in a garage. Microsoft began as a company inside the garage of Bill Gates, while Apple was founded in the garage of Steve Jobs’s parents.
- As of 2021, Amazon sells more eBooks, or digital books, than paperbacks and hardcovers, as eBooks have grown more popular among people who prefer reading on their smartphones or tablets.
- No matter how fast modern-day computers are, they are still not as powerful as the human brain, which is able to process 38 thousand trillion operations per second.
- When using a computer, a human being only blinks seven times a minute instead of the usual 20 times per minute.
- The first computer programmer was Lady Ada Lovelace, who worked on several aspects of the Analytical Engine, the proposed computer intended to succeed Charles Babbage’s Difference Engine.
- The full name of CAPTCHA is “Completely Automated Public Turing test to tell Computers and Humans Apart.”
- The first computer virus, called “Creeper,” was created by Bob Thomas, an employee at BBN Technologies who wanted to experiment with how quickly a program could spread from one computer to another.
- The first webcam was invented by researchers in the Cambridge University Computer Science Department. The webcam was pointed at a coffee machine so that people could check whether the pot was empty without wasting a trip to the Trojan Room, where the machine was located.
- The original name of Windows was Interface Manager. To give the operating system a more distinctive name, Microsoft changed it to “Windows” when announcing it in November 1983.
- The two most used account passwords are “123456” and “12345678.” Both are an easy gateway for hackers, so many experts recommend using strong passwords to protect personal information on the web.
- The first Macintosh computer case carried 47 signatures of Apple’s employees and executives, inscribed inside the case in 1982.
- About 90% of the world’s currency exists only in computers, meaning only around 10% of it exists as physical cash.
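The ENIAC-on-a-Chip fact above rests on simple Moore’s Law arithmetic: if transistor density doubles roughly every two years, the 50 years between ENIAC (1946) and the anniversary project (1996) amount to 25 doublings. A minimal sketch of that back-of-envelope calculation (the two-year doubling period is the usual rule of thumb, not an exact figure):

```python
# Back-of-envelope Moore's Law arithmetic behind the ENIAC-on-a-Chip comparison.
# Assumption: transistor density doubles roughly every two years (a rule of
# thumb, not an exact law).

def doublings(start_year: int, end_year: int, period: float = 2.0) -> float:
    """How many doubling periods elapse between two years."""
    return (end_year - start_year) / period

def growth_factor(start_year: int, end_year: int, period: float = 2.0) -> float:
    """Overall density growth implied by Moore's Law over that span."""
    return 2.0 ** doublings(start_year, end_year, period)

# ENIAC (1946) to the ENIAC-on-a-Chip project (1996): 25 doublings,
# i.e. a density improvement of roughly 33 million times.
print(f"{growth_factor(1946, 1996):,.0f}x")  # prints 33,554,432x
```

Even with a generous margin of error on the doubling period, the factor is in the tens of millions, which is why the whole room-sized ENIAC fit on a single small chip by 1996.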
References
Collazo, F.J., 2005. A Brief History of the Abacus. FJCollazo.com. Available at: http://www.fjcollazo.com/documents/AbacusHist.htm [Accessed October 20, 2021]
History Computer. Pascaline – The Complete History of the Pascaline Calculator. History-Computer.com. Available at: https://history-computer.com/inventions/pascaline-complete-history-of-the-pascaline-calculator/ [Accessed October 20, 2021]
History Computer. Curta Calculator – History of the Curta Mechanical Computer Calculator. History-Computer.com. Available at: https://history-computer.com/inventions/curta-calculator-history-of-the-curta-mechanical-computer-calculator/ [Accessed October 20, 2021]
Vintage Calculators Web Museum. Bell Punch/Sumlock/ANITA. VintageCalculators.com. Available at: http://www.vintagecalculators.com/html/bell_punch_-_anita.html [Accessed October 20, 2021]
Computer History Museum. The Babbage Engine. ComputerHistory.org. Available at: https://www.computerhistory.org/babbage/ [Accessed October 20, 2021]
Levy, S., 2013. The Brief History of the ENIAC Computer. SmithsonianMag.com. Available at: https://www.smithsonianmag.com/history/the-brief-history-of-the-eniac-computer-3889120/ [Accessed October 20, 2021]
Computer History Museum. International Business Machines Corporation (IBM). ComputerHistory.org. Available at: https://www.computerhistory.org/brochures/g-i/international-business-machines-corporation-ibm/ [Accessed October 21, 2021]
Cooney, M. In pictures: The (mostly) cool history of the IBM mainframe. ARNNet.com.au. Available at: https://www.arnnet.com.au/slideshow/541873/pictures-mostly-cool-history-ibm-mainframe/ [Accessed October 21, 2021]
Alsop, T., 2021. IBM Global Revenue 1999-2020. Statista.com. Available at: https://www.statista.com/statistics/265003/ibms-revenue-since-1999/ [Accessed October 21, 2021]
Cowling, J., 2016. A Brief History of Microsoft – The World’s Biggest Software Company. Content.DSP.co.uk. Available at: https://content.dsp.co.uk/a-brief-history-of-microsoft-the-worlds-biggest-software-company [Accessed October 20, 2021]
The History Channel. This Day In History | April 04, 1975 – Microsoft Founded. History.com. Available at: https://www.history.com/this-day-in-history/microsoft-founded [Accessed October 20, 2021]
Dazeinfo, 2021. Microsoft Revenue by Year: FY 1990 – 2021. Dazeinfo.com. Available at: https://dazeinfo.com/2019/11/11/microsoft-revenue-worldwide-by-year-graphfarm/ [Accessed October 20, 2021]
Verma, A., 2018. Steve Jobs sold his Volkswagen bus to start Apple. Inshorts.com. Available at: https://inshorts.com/en/news/steve-jobs-sold-his-volkswagen-bus-to-start-apple-1519482021508 [Accessed October 21, 2021]
Dormehl, L., 2020. Today in Apple history: The Byte Shop, Apple’s first retailer, opens. CultofMac.com. Available at: https://www.cultofmac.com/457420/byte-shop-opens-tiah/ [Accessed October 21, 2021]
Richardson, A., 2008. The Founding of Apple Computers, Inc. Guides.LOC.gov. Available at: https://guides.loc.gov/this-month-in-business-history/april/apple-computers-founded [Accessed October 21, 2021]
Dazeinfo, 2021. Apple Revenue by Year: FY 1990 – 2020. Dazeinfo.com. Available at: https://dazeinfo.com/2019/08/01/apple-revenue-by-year-worldwide-graphfarm/ [Accessed October 21, 2021]
Martin, S., 2021. Xiaomi is now the world’s largest smartphone manufacturer, here’s how it got there. KrASIA.com. Available at: https://kr-asia.com/xiaomi-is-now-the-worlds-largest-smartphone-manufacturer-heres-how-it-got-there [Accessed October 21, 2021]
Corporate Finance Institute. What is Moore’s Law? CorporateFinanceInstitute.com. Available at: https://corporatefinanceinstitute.com/resources/knowledge/other/moores-law/ [Accessed October 22, 2021]
Roser, M. and Ritchie, H., 2013. Technological Progress. OurWorldinData.org. Available at: https://ourworldindata.org/technological-progress [Accessed October 22, 2021]
Alsop, T., 2020. Desktop/laptop ownership among US adults 2008 to 2019. Statista.com. Available at: https://www.statista.com/statistics/756054/united-states-adults-desktop-laptop-ownership/ [Accessed November 9, 2021]
Alsop, T., 2021. Computer penetration rate among households worldwide 2005-2019. Statista.com. Available at: https://www.statista.com/statistics/748551/worldwide-households-with-computer/ [Accessed October 22, 2021]
Poletti, T., 2020. The pandemic has brought the personal computer back to life, with help from Zoom. MarketWatch.com. Available at: https://www.marketwatch.com/story/the-pandemic-has-brought-the-personal-computer-back-to-life-with-help-from-zoom-11599316207 [Accessed October 22, 2021]
Gartenberg, C., 2021. Intel CEO warns chip shortage won’t end until at least 2023, as laptop sales get hit by supply issues. TheVerge.com. Available at: https://www.theverge.com/2021/10/21/22739192/intel-chip-shortage-q3-2021-earning-laptop-revenue [Accessed October 22, 2021]