Who is the Father of AI, ML, and Cybersecurity?
AI, machine learning, and cybersecurity lead innovation in today's fast-changing tech world. These fields power virtual assistants, recommendation engines, online payments, and security systems. So, who laid the foundation for these groundbreaking technologies?
There is still debate over who deserves credit, since few inventions come from just one person. Still, some people are known for shaping and advancing these fields.
This article looks at the three figures most often called the father of AI, machine learning, and cybersecurity: their stories, major work, and lasting impact. Drawing on history and expert views, it shows how their mid-1900s work built today's technology. Although the titles are debatable, the evidence points toward John McCarthy for AI, Arthur Samuel for machine learning, and Bob Thomas for cybersecurity.
Let’s look at how their ideas changed technology.
The Father of Artificial Intelligence: John McCarthy
AI aims to reproduce human thinking in machines, and its history reaches back to philosophical debates about thinking machines. Most experts credit John McCarthy with establishing AI as a formal science. McCarthy, an American computer scientist, was born on September 4, 1927, in Boston, Massachusetts. He showed early talent, earning a mathematics degree from Caltech in 1948 and a PhD in mathematics from Princeton in 1951. His career took him to several prestigious institutions, but he spent most of it at Stanford University, where he became a full professor of computer science in 1962 and remained until his retirement in 2000.
McCarthy’s biggest breakthrough came in 1956, when he coined the term “artificial intelligence” in a proposal with Marvin Minsky, Nathaniel Rochester, and Claude Shannon for the Dartmouth Conference. The workshop launched AI research, gathering experts to discuss how machines could learn, reason, and understand like humans. McCarthy predicted machines would become intelligent within ten years, a forecast that proved too optimistic, but the conference established AI as a field in its own right and inspired generations of researchers.
McCarthy created Lisp in the late 1950s and published it in 1960. It became the main AI programming language because it handled symbols well and supported recursion and dynamic typing. He also invented garbage collection, introduced around 1958, which let computers manage memory automatically and changed how programs were written.
Lisp’s influence still appears in languages like Python and JavaScript, proving McCarthy’s lasting impact on modern computing.
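Lisp itself is not shown in this article, but the recursive, symbol-oriented style it popularized can be hinted at in Python. The `flatten` function below is an invented illustration of Lisp-style list recursion, not McCarthy's own code.

```python
def flatten(expr):
    """Recursively flatten a nested list, in the car/cdr style Lisp made natural."""
    if not isinstance(expr, list):
        return [expr]              # an atom flattens to itself
    result = []
    for item in expr:              # walk the list, recursing into sublists
        result.extend(flatten(item))
    return result

print(flatten(["a", ["b", ["c", "d"]], "e"]))  # → ['a', 'b', 'c', 'd', 'e']
```

The same shape of solution (test for an atom, otherwise recurse) underlies much of the symbolic processing early AI programs performed.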
McCarthy’s other contributions were numerous. In 1958, he proposed the “advice taker”, a program that could reason and learn from input, laying the groundwork for logic-based AI. In the 1960s, he worked on time-sharing systems that let many users share one computer, an early step toward cloud computing. In 1961, he suggested utility computing, where people could buy computing power like electricity. From 1978 to 1986, he developed circumscription, a form of non-monotonic reasoning for handling incomplete information in AI systems.
Alan Turing is often called a founding father because of his 1950 paper Computing Machinery and Intelligence, which introduced the Turing Test; McCarthy, however, is usually given the title for institutionalising AI. He received the Turing Award in 1971, the Kyoto Prize in 1988, and the National Medal of Science in 1990. He died on October 24, 2011, but his vision lives on in modern AI, from chatbots to self-driving vehicles. Without McCarthy, AI might have remained a philosophical fantasy rather than a working reality.
Read Also: Top 50 Artificial Intelligence AI Project Ideas for Final-Year Students
The Father of Machine Learning: Arthur Samuel
Machine learning is a branch of AI that uses data to help computers learn without being explicitly programmed. Arthur Lee Samuel, an American computer scientist born on December 5, 1901, in Emporia, Kansas, pioneered its conceptual framework. Samuel earned a degree from the College of Emporia in 1923 and a master's in electrical engineering from MIT in 1926. He began his career at Bell Laboratories in 1928, where he worked on vacuum tubes and radar during World War II. He later worked at the University of Illinois and IBM, and finally at Stanford, where he remained until retirement.
Samuel coined the term “machine learning” in his 1959 paper on teaching computers to play checkers. The paper drew on experiments he had begun in 1949, when he developed the Samuel Checkers-playing Program, one of the world's first self-learning programs. Running on an IBM 701, the program used a scoring function based on board positions and a minimax search with alpha-beta pruning to choose its moves. It learned by memorizing board positions and outcomes, and improved by studying professional games and playing against itself. By the mid-1970s it played at a strong amateur level, demonstrating that machines could improve with experience.
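Samuel's actual program is not reproduced here, but the minimax-with-alpha-beta idea it relied on can be sketched in Python. The tiny `tree` below is an invented game tree, not a checkers position; leaves hold static evaluation scores.

```python
def alphabeta(node, depth, alpha, beta, maximizing):
    """Minimax search with alpha-beta pruning over a nested-tuple game tree.

    A leaf is a number (its score); an internal node is a tuple of children.
    """
    if depth == 0 or not isinstance(node, tuple):
        return node                       # leaf: return its static evaluation
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:             # prune: the minimizer would avoid this branch
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, depth - 1, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:             # prune: the maximizer would avoid this branch
                break
        return value

# Hypothetical two-ply game tree, maximizer to move at the root.
tree = ((3, 5), (6, 9), (1, 2))
print(alphabeta(tree, 2, float("-inf"), float("inf"), True))  # → 6
```

Pruning lets the search skip branches that cannot affect the final choice, which is what made deep game-tree search feasible on 1950s hardware.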
The checkers program showed key ML ideas: learning from data, adjusting rules, and applying them to new cases. Samuel's definition of ML as the study that allows computers to learn without being directly programmed is still basic. His work influenced modern AI methods, like reinforcement learning, used in systems such as AlphaGo.
Beyond ML, Samuel worked on hardware and software at IBM, including hash tables and early transistor research. He helped design processors for text processing and wrote a TeX manual in 1983. In 1953, he published a readable introduction to computing, making the subject more accessible.
Neural networks were advanced by others, such as Frank Rosenblatt, who invented the perceptron in 1957, and Geoffrey Hinton, who pioneered backpropagation in the 1980s and is known as the father of deep learning. Samuel's contribution was the first practical demonstration that machines could learn. He received the IEEE Computer Pioneer Award in 1987 and died on July 29, 1990. Today’s predictive analytics and image-recognition systems trace back to his early ideas.
The Father of Cyber Security: Bob Thomas
Cybersecurity, which protects computers from attack, grew out of early experiments. Bob Thomas, born around 1943, created Creeper in 1971, often described as the first computer virus, earning him the title “father of computer viruses.” While working at BBN Technologies, which helped build ARPANET, Thomas wrote Creeper as an experimental program that could move itself between networked machines.
Creeper targeted DEC PDP-10 mainframes running TENEX over ARPANET, moving between machines and displaying the message "I'M THE CREEPER: CATCH ME IF YOU CAN."
It was non-malicious, caused no data damage, and affected only about 28 systems in a controlled environment. A later version, modified by colleague Ray Tomlinson, could copy itself rather than merely move, highlighting the potential for autonomous propagation.
This experiment accidentally started the field of cybersecurity. In response, Tomlinson created Reaper in 1972, the first antivirus program that traversed the network to delete Creeper instances. Thomas's work demonstrated vulnerabilities in networked systems, prompting the need for defences against self-replicating code. Before Creeper, threats were theoretical; after, the field evolved to address viruses, worms, and beyond.
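The defining trait of the original Creeper was that it moved rather than multiplied: at any instant only one copy existed on the network. A harmless Python simulation can illustrate the idea; the host names and topology below are invented for the example, and no real code is executed on any machine.

```python
def creeper_hop(network, start):
    """Simulate the original Creeper: it moves host-to-host, never duplicating.

    `network` maps each host to its neighbors (a made-up topology).
    Returns the hosts visited, in order.
    """
    visited, current = [], start
    while current is not None and current not in visited:
        print(f"{current}: I'M THE CREEPER: CATCH ME IF YOU CAN")
        visited.append(current)
        # "Move" to the first not-yet-visited neighbor, erasing itself here.
        unvisited = [h for h in network[current] if h not in visited]
        current = unvisited[0] if unvisited else None
    return visited

network = {"pdp-a": ["pdp-b"], "pdp-b": ["pdp-a", "pdp-c"], "pdp-c": ["pdp-b"]}
print(creeper_hop(network, "pdp-a"))  # → ['pdp-a', 'pdp-b', 'pdp-c']
```

Tomlinson's Reaper can be pictured as a second traversal of the same network that deletes any Creeper instance it finds, the same pattern modern antivirus scanners follow at far greater scale.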
Read Also: Top 30 Best-Known Cybersecurity Case Studies
Less is known about Thomas’s life, but at BBN he helped advance computer networking. His work made people aware of computer risks and led to early cybersecurity methods. Later figures such as Dorothy Denning (cryptography and information security) and Rebecca "Becky" Bace (intrusion detection) built on this foundation, but Thomas's experiment is credited as the spark.
Thomas’s work shows that new technology often brings new problems, such as cyberattacks. His work laid the groundwork for antivirus software, firewalls, and ethical hacking, protecting trillions in digital assets today.
Read Also: Top 50 Cyber Security Projects for Final Year Students
Legacies That Endure
John McCarthy, Arthur Samuel, and Bob Thomas turned big ideas into real technology. McCarthy formalised AI, Samuel operationalised learning in machines, and Thomas exposed the need for digital defences. Their work from the 1950s to 1970s grew alongside major computing changes like ARPANET and early programming languages.
These pioneers remind us that technological progress is collaborative and iterative. Their legacies connect today, as AI and ML help detect online threats through intelligent systems.
Read Also: Top 50 Computer Science Project Ideas and Topics for Final Year Students
In 2025, amid rising cyber threats and ethics debates, these figures inspire responsible progress. Who knows what future "fathers" (or mothers) will emerge? The technological age they began continues.