For as long as software has existed, there have been promises, often grand, sometimes naive, that the need to “know how to code” would soon vanish. The vision: ordinary people, business analysts, or even executives designing powerful applications without writing a single line of code. From the earliest days of computing to today’s AI revolution, this dream has been revived again and again. Yet despite billions in investment and waves of hype, the core of software development, its logic, structure, and abstraction, remains stubbornly human.
The 1960s: COBOL and the Business User
In the 1960s, COBOL (Common Business Oriented Language) was created to make programming accessible to business people. With its English-like syntax, COBOL was supposed to bridge the gap between domain experts and machine code. The dream was clear: managers and analysts would write software themselves.
But COBOL, while more readable than assembly, still required training, structure, and logical thinking. The dream didn’t materialize. COBOL coders, still in demand decades later, became their own specialized workforce. Instead of removing the need for programmers, COBOL expanded the profession.
The 1980s-90s: 4GLs and Visual Tools
Fourth-Generation Languages (4GLs) promised another leap. Tools like FoxPro, PowerBuilder, and Oracle Forms let users “draw” applications. Visual Basic allowed developers to build GUIs with drag-and-drop components. At the time, these were seen as the end of traditional coding.
But while these tools simplified UI creation and database binding, complex business logic still required real coding. The abstraction broke down quickly as projects grew. Power users emerged, but professional developers remained essential.
The UML Era: Modeling as Programming?
In the late 1990s and early 2000s, the Unified Modeling Language (UML) was heralded as the new foundation for software development. Why write code, the thinking went, when you could diagram it? With Model-Driven Architecture (MDA), one could draw class and activity diagrams and automatically generate applications from them.
Despite heavy support from enterprise vendors, this approach never took off at scale. Software is not just structure; it’s behavior, and behavior is messy. Diagrams became too complex, brittle, and incomplete to replace real code. UML found a niche in documentation and architecture, but the coder was not dethroned.
The No-Code/Low-Code Renaissance
In the 2010s, a new generation of no-code and low-code platforms emerged: Bubble, OutSystems, Mendix, and others. These platforms boasted intuitive interfaces for building web apps, workflows, and integrations. This time, the audience expanded to entrepreneurs and startups.
While successful for prototyping, internal tools, or constrained domains, these platforms hit a wall when it came to scalability, customization, and maintainability. Developers were still needed to extend functionality, ensure security, and keep performance in check. Once again, the promise remained only partially fulfilled.
Now: AI Will Replace Coders?
The latest iteration of the promise centers around artificial intelligence. Tools like GitHub Copilot, ChatGPT, and Claude can write code, refactor it, explain it, and even suggest solutions. Surely now, many claim, AI will finally eliminate the need to know how to code.
But even AI doesn’t remove the core challenge of software development: understanding what needs to be built, translating that into logical structure, and debugging edge cases. AI is a powerful tool—perhaps the most powerful yet—but it is a copilot, not a captain. It accelerates developers, it doesn’t replace them. Just as calculators didn’t eliminate the need to understand math, AI won’t eliminate the need to understand code.
Why the Dream Won’t Die—and Why It Won’t Come True
The repeated promises share a common mistake: underestimating what software development actually is. Coding is not just syntax; it’s problem-solving, system design, abstraction, trade-offs, and communication. Each time we try to automate or abstract it away, we rediscover how central human reasoning is to the process.
Software is not a commodity product. It’s a living, changing expression of intent. Until we can automate intent, and all the ambiguity, creativity, and complexity it entails, there will always be a place for coders.