The Unseen Architect: Why Staying Hands-On Is the Only Way to Battle Software Complexity
In a world increasingly reliant on abstract architectural diagrams and dogmatic adherence to patterns, Dennis Doomen, a coding architect with 30 years of experience, offers a blunt counter-narrative: if you’re not in the code, you’re losing your edge. This conversation reveals the hidden consequences of architectural detachment: the failure to stay hands-on breeds complexity, erodes maintainability, and ultimately undermines the very systems architects are tasked with building. For any software engineer or architect who wants to build enduring systems, the lesson is to prize practical experience over theoretical purity.
The Hands-On Imperative: Why Architects Must Code
The landscape of software architecture is often portrayed as a realm of high-level diagrams and strategic decision-making, removed from the messy reality of day-to-day coding. Dennis Doomen, however, argues forcefully against this separation. For him, effective software architecture is inextricably linked to continuous, hands-on coding. This isn't merely a preference; it's presented as the only way to truly understand the practical implications of architectural choices and to effectively combat the inevitable rise of complexity.
Doomen’s core argument is that without direct involvement in writing and maintaining production code, architects lose their ability to judge the real-world viability and long-term consequences of their decisions. This detachment leads to a disconnect, where theoretical ideals clash with practical execution, often resulting in codebases that become increasingly difficult to manage. He observes this phenomenon frequently: teams inheriting architectures they don’t fully grasp, blindly applying patterns without understanding their underlying purpose. This dogmatic adherence, Doomen notes, is a direct consequence of architects stepping away from the code, leaving teams to interpret and implement abstract principles in isolation.
"If you stop coding, if you actually stop building production systems and put them on the internet and everything, you lose that experience."
This loss of experience has tangible downstream effects. Architects who are no longer coding may propose solutions that sound elegant on paper but prove cumbersome or even detrimental in practice. They might miss crucial nuances of framework behavior, overlook new paradigms, or fail to notice the subtle ways in which code complexity accumulates. Doomen uses the analogy of building an onion architecture or clean architecture: a vision might be established, but without ongoing coding involvement, the development team can lose sight of the original intent, leading to a superficial application of the pattern. This lack of deep understanding means teams struggle to optimize or simplify the codebase, instead perpetuating complexity through rote imitation. The consequence? Systems that are brittle, hard to debug, and resistant to change.
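The onion/clean-architecture intent Doomen refers to is easier to see in code than in a diagram. The sketch below is purely illustrative (the ordering domain, `Order`, `OrderRepository`, and `OrderService` are hypothetical names, not from the conversation): the inner domain layer defines the interfaces it needs, outer layers implement them, and so all dependencies point inward.

```python
from abc import ABC, abstractmethod

# Domain layer: pure business logic, no knowledge of databases or frameworks.
class Order:
    def __init__(self, order_id: str, total: float):
        self.order_id = order_id
        self.total = total

# The domain defines the port (interface) it needs from the outside world...
class OrderRepository(ABC):
    @abstractmethod
    def save(self, order: Order) -> None: ...

class OrderService:
    def __init__(self, repository: OrderRepository):
        self._repository = repository

    def place_order(self, order_id: str, total: float) -> Order:
        order = Order(order_id, total)
        self._repository.save(order)  # depends only on the abstraction
        return order

# ...and the outer infrastructure layer implements it. Swapping this adapter
# for a real database never touches the domain code.
class InMemoryOrderRepository(OrderRepository):
    def __init__(self):
        self.orders: dict[str, Order] = {}

    def save(self, order: Order) -> None:
        self.orders[order.order_id] = order
```

Applying the pattern superficially, by contrast, means reproducing the interfaces and layers without preserving this direction of dependency, which is exactly the rote imitation Doomen warns about.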
The Compounding Cost of Abstracted Complexity
Software engineering, at its heart, is a battle against complexity. Doomen emphasizes that this is not just about managing the inherent complexity of a problem, but also the "accidental complexity" introduced through design choices and evolving understanding. The danger lies in the gradual, often unnoticed, accumulation of this complexity over time, a process that is accelerated when architectural decisions are made in a vacuum.
Consider the common debate around consistency versus simplicity. A team might encounter a module that has become overly complex but adheres to established patterns. The pragmatic approach, Doomen argues, is to simplify that specific module, even if it introduces a localized inconsistency. The alternative, prioritizing rigid consistency at the expense of clarity, leads to a codebase where understanding and maintenance become increasingly burdensome. This is particularly evident when developers, often those with 10-15 years of experience, become overly dogmatic about principles like SOLID. They might insist on abstractions for future-proofing or maintainability, even when the immediate need is absent.
"I'm more the person that will say, 'No, no, we treat this module as a boundary, and the consistency that we want to keep applies only to this boundary, this module. So we make this module simpler because I know that if you make it simpler, it will also be much easier to maintain.'"
The downstream effect of such rigid adherence is a system that becomes a monument to theoretical purity rather than a practical tool. Developers struggle to understand the abstractions, the purpose of the extensive unit tests, or why code was deliberately duplicated to reduce coupling. This creates a subtle but significant drag on development velocity and a hidden cost that compounds over time. The apparent "simplicity" of a consistent, albeit complex, codebase is a mirage; the reality is a growing burden that requires constant, often painful, effort to manage.
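The "module as a boundary" trade-off can be sketched concretely. Both versions below are hypothetical (nothing here comes from Doomen's codebase): the first follows a strategy-plus-factory convention for a rule that never varies, the second collapses it to a plain function inside the module boundary.

```python
from abc import ABC, abstractmethod

# Dogmatic version: indirection added "for future flexibility"
# that this module never actually needs.
class DiscountStrategy(ABC):
    @abstractmethod
    def apply(self, amount: float) -> float: ...

class TenPercentDiscount(DiscountStrategy):
    def apply(self, amount: float) -> float:
        return amount * 0.9

class DiscountStrategyFactory:
    def create(self) -> DiscountStrategy:
        return TenPercentDiscount()

def checkout_dogmatic(amount: float) -> float:
    # Three moving parts to express one fixed rule.
    return DiscountStrategyFactory().create().apply(amount)

# Pragmatic version: within this module's boundary there is exactly one
# discount rule, so a plain function carries the same meaning directly.
def checkout(amount: float) -> float:
    return amount * 0.9  # the single 10% discount rule
```

The second version is "inconsistent" with a codebase that uses strategies everywhere else, but within this boundary it is simpler to read, test, and maintain, which is precisely the trade Doomen advocates.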
AI as a Catalyst: Accelerating Build, Elevating Planning
The advent of AI-assisted development tools presents a profound shift, forcing a re-evaluation of how software is built. Doomen’s experience with tools like GitHub Copilot is illustrative. He describes how AI can rapidly generate significant portions of an open-source project, including features and tests, often adhering to existing conventions. This acceleration of the "build" phase is undeniable.
However, this rapid acceleration also highlights a critical consequence: the increased importance of the "planning" and "research" phases. If AI can handle much of the implementation, the ability to accurately define the problem, understand the requirements, and make sound architectural decisions becomes paramount. Doomen’s own process of using AI involves treating the generated code as a starting point, refining it, and crucially, updating documentation and instructions to guide future AI interactions. This active engagement prevents the code from becoming a black box, even when AI-generated.
"My assumption is that research and planning, figuring out what problems you need to solve, is going to become more and more important. Build is going to accelerate, so we can build really fast."
The implication here is a potential competitive advantage for those who master this new workflow. Teams that can effectively leverage AI for speed while maintaining rigorous oversight and strategic planning will outpace those who simply accept AI-generated code without critical evaluation. The danger lies in inexperienced developers using AI as a crutch, producing code they don't understand, leading to a different, perhaps more insidious, form of complexity and maintainability issues. The true value emerges when AI is used as a sophisticated tool to augment human expertise, not replace it. This requires a conscious effort to maintain critical thinking and a deep understanding of underlying principles, even as the tools evolve.
The Enduring Value of Context and Decision Records
In the face of rapidly evolving tools and AI-driven code generation, the importance of historical context and documented decisions becomes even more pronounced. Doomen champions the practice of maintaining detailed Git history, commit messages, and, most critically, Architectural Decision Records (ADRs). These artifacts serve as the crucial "why" behind technical choices, providing a narrative that AI tools can leverage and humans can rely on for understanding complex systems.
The downstream effect of neglecting this documentation is significant. As codebases age and team members change, the rationale behind architectural decisions can be lost. This creates an environment where existing systems are challenged without context, potentially leading to costly and unnecessary re-architecting. Conversely, teams that invest in capturing this context (whether through ADRs, detailed pull request descriptions, or even AI-assisted documentation) build a robust foundation for future development and maintenance.
"The companies that have invested in that, they will be leaps and bounds ahead compared to the companies that don't invest in that."
This investment in context is not merely about historical record-keeping; it’s a strategic advantage. It allows teams to navigate complexity, understand trade-offs, and make informed decisions, even when faced with the rapid advancements of AI. The ability to query this historical context, either manually or with AI assistance, ensures that future decisions are built upon a solid understanding of past reasoning, preventing the repetition of past mistakes and fostering a more sustainable development process.
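The decision records Doomen champions need not be elaborate. A common convention (not prescribed in the conversation) is a short markdown file per decision, kept in the repository next to the code; the content below is a hypothetical example:

```markdown
# ADR-007: Treat the billing module as its own consistency boundary

## Status
Accepted, 2024-05-14

## Context
The billing module had accumulated layers of abstraction applied purely
for consistency with the rest of the codebase, making it hard to maintain.

## Decision
Simplify the module internally, even though this deviates from the
patterns used elsewhere. Consistency rules apply only within this boundary.

## Consequences
Easier maintenance and onboarding inside the module. Reviewers must
accept a deliberate, documented inconsistency with neighboring modules.
```

Because the "why" lives in the repository, both future team members and AI tools can query it when a decision is challenged, instead of reconstructing the rationale from memory.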
Key Action Items
- Prioritize Hands-On Coding: Dedicate a portion of your week (e.g., 10-20%) to coding within production systems. This is an ongoing investment, not a one-time task.
- Embrace Pragmatism Over Dogma: When faced with complexity, actively seek opportunities to simplify specific modules, even if it means localized inconsistency. Challenge rigid adherence to patterns when they impede maintainability.
- Develop AI Literacy: Experiment with AI coding assistants (GitHub Copilot, Claude, Junie) to understand their capabilities and limitations. Focus on using them to augment, not replace, your understanding.
- Document Your Decisions: Implement a system for capturing Architectural Decision Records (ADRs) for non-trivial choices. This is a long-term investment that pays dividends in maintainability and knowledge transfer.
- Master Prompt Engineering for AI: Learn to craft detailed prompts that guide AI tools, providing context and specifying desired outcomes. This is crucial for generating useful and maintainable code.
- Focus on Test Quality: Ensure your tests are robust, well-written, and serve as a reliable safety net. This is essential for trusting any code, whether human or AI-generated.
- Invest in Codebase History: Maintain detailed commit messages and pull request descriptions that explain the "why" behind changes, not just the "what." This historical context is invaluable for future understanding and AI utilization.