AI agents are the result of decades of work by some of the most brilliant researchers alive. Here is the story of the people who built what you are using right now — and why their work changed everything.
Every tool you use has a story, and AI agents have one of the most remarkable in the history of technology. It involves a collection of people who were right about something important, often long before anyone else believed them.
Alan Turing never built an AI agent. But in 1950, he asked the question that made everything else possible: can machines think? His Turing Test — the idea that a machine indistinguishable from a human in conversation could be considered intelligent — set the research agenda for a field that would take 70 years to fully emerge.
Turing was a mathematician, codebreaker, and visionary who was decades ahead of his time. His foundational work on computation created the conceptual framework that every AI system runs on.
In 1956, John McCarthy and Marvin Minsky, together with Nathaniel Rochester and Claude Shannon, organized the Dartmouth workshop — the event generally considered the founding moment of AI as an academic field. McCarthy coined the term "artificial intelligence" in the proposal for the event. Minsky, who had already built SNARC, one of the first neural network learning machines, went on to write foundational work on knowledge representation.
McCarthy later developed LISP, the programming language that became the foundation of AI research for decades. Minsky's work on frames and knowledge representation laid groundwork for how AI systems model the world.
Geoffrey Hinton, Yann LeCun, and Yoshua Bengio — known as the "Godfathers of Deep Learning" — spent decades working on neural networks when the mainstream AI community had largely abandoned the approach. They shared the 2018 Turing Award (the Nobel Prize of computing) for their persistence and contributions.
Hinton's work popularizing backpropagation — the algorithm that trains neural networks — is embedded in virtually every modern AI system. LeCun's convolutional neural networks power most computer vision AI. Bengio's group pioneered attention mechanisms for neural machine translation, work that led directly to the transformer architecture underlying GPT, Claude, and every modern language model.
In 2017, Ashish Vaswani and seven colleagues at Google published "Attention Is All You Need" — the paper introducing the transformer architecture. This single paper is arguably the most consequential publication in AI history. The transformer made modern language models possible, and modern language models made modern AI agents possible.
The researchers built the foundation; the founders built the applications. OpenAI, Anthropic, and the wave of AI companies they inspired are translating decades of research into tools that real businesses can use.
The agents your business runs today exist because of a continuous chain of insight, persistence, and collaboration stretching back 75 years. That is worth knowing.
Explore More
If you are looking to implement AI skills in your business, these are the platforms our team uses and recommends:
*Some links above may be affiliate links. We only recommend tools we actually use.*
Tell us what is costing you the most time. We will map out exactly what your business needs. Free, no obligation.
Get Started Free