Codebase as AI-Driven Customer Experience Engine

Original Title: I gave Claude Code our entire codebase. Our customers noticed. | Al Chen (Galileo)

The most valuable asset in your tech stack isn't your code; it's the ability to ask it the right questions. In this conversation with Al Chen of Galileo, we uncover a critical shift in how customer-facing teams can leverage their codebase not just for internal understanding, but as a direct engine for superior customer experience. The hidden consequence of traditional documentation is its inherent lag and incompleteness, forcing valuable customer interactions through inefficient bottlenecks. Chen reveals how by bringing multiple code repositories directly into an AI-powered IDE, customer-facing roles can bypass these limitations, delivering hyper-personalized, technically accurate solutions that build trust and accelerate customer success. This approach is essential for anyone in customer engineering, technical support, or product management who seeks to transform reactive problem-solving into proactive, differentiated customer engagement.

The Codebase as a Customer-Facing Moat

The traditional wisdom in software development often dictates that documentation is king. It’s the repository of truth, the guide for users, and the single source of clarity. But Al Chen, field engineer at Galileo, challenges this notion head-on. For customer-facing teams, especially those dealing with complex, multi-service architectures like Galileo’s AI observability platform, public documentation often falls short. Customers don't just want to know what a feature does; they need to know how it integrates, how it deploys, and how it specifically addresses their unique environment. This gap between documentation and real-world application is where Chen found his job becoming untenable.

"The minute I realized I couldn't really do my job was when I was trying to reference our public documentation and trying to provide an answer to my customers, even by using Cloud Code or ChatGPT or whatever, and trying to take all these different help docs and trying to come up with the answer. It just still wasn't coming up with the answer that my customers were looking for."

The core problem, as Chen articulates, is that documentation is static and generalized, while customer needs are dynamic and specific. Galileo’s platform, for instance, comprises fifteen distinct repositories, each representing a critical service. Asking a customer to navigate this complexity through generic docs is akin to asking them to build a house with a single blueprint and a hammer. The immediate consequence is frustration, delayed deployments, and a lack of customer confidence. The downstream effect? A competitive disadvantage rooted in a subpar customer experience.

Chen’s insight is to treat the codebase itself as the ultimate, most up-to-date source of truth. By pulling all fifteen repositories into his VS Code environment, he can then leverage AI tools like Claude Code to query the entire codebase. This isn't just about finding a specific function; it's about understanding how services interact, how data flows, and how a particular feature is implemented across multiple modules. This capability fundamentally shifts the dynamic from reactive information retrieval to proactive, precise problem-solving. When a customer asks a nuanced question about deployment or integration, Chen can now query the live code, referencing specific repositories and even cross-referencing them to construct a step-by-step answer that is not only technically accurate but also deeply contextualized. This immediate, code-grounded response bypasses the slow, iterative process of updating documentation, offering a significant advantage in customer engagement.
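As a rough sketch of that setup (repository and organization names below are illustrative placeholders, not Galileo's actual repos), the workflow amounts to cloning every service into one workspace and launching the assistant from its root:

```shell
# Hypothetical sketch: gather all service repos into a single workspace
# so an AI coding assistant can search across every service at once.
# Repo and org names are placeholders, not Galileo's real ones.
WORKSPACE="$HOME/galileo-workspace"
mkdir -p "$WORKSPACE"
for repo in api-server ingestion-service ui-console metrics-store; do
  # In real use this loop would cover all fifteen repositories;
  # printed here as a dry run rather than actually cloning.
  echo "git clone git@github.com:example-org/$repo.git $WORKSPACE/$repo"
done
# Open the whole workspace, then start the assistant from its root:
#   code "$WORKSPACE" && cd "$WORKSPACE" && claude
```

From there, a cross-service question ("how does ingestion hand off to the metrics store?") can be answered against the live code rather than lagging documentation.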

The "Customer Quirks" System: Personalization at Scale

The true power of this approach emerges when Chen layers additional context onto the codebase query. He developed a "customer quirks" system, documented in Confluence, which captures specific details about how individual enterprise customers deploy and configure Galileo within their unique, often highly secure, environments. These "quirks" might involve specific secret management strategies, namespace configurations, or sidecar implementations.

When a customer asks a deployment question, Chen’s custom Claude Code command first consults the relevant Confluence pages for that customer’s specific quirks. Then, it dives into the codebase to find the relevant implementation details. The resulting answer is not a generic deployment guide but a hyper-personalized, step-by-step process tailored to that customer's exact environment and requirements. This level of customization is incredibly difficult to achieve through traditional documentation. The immediate benefit is a dramatically improved customer experience, fostering trust and accelerating adoption. The delayed payoff is a powerful competitive moat: customers receive support that feels bespoke and deeply informed, a level of service that is hard for competitors to replicate without similar AI-driven systems.
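Claude Code does support user-defined slash commands as Markdown files under `.claude/commands/`, with `$ARGUMENTS` standing in for whatever follows the command. A minimal sketch of a quirks-aware command follows; the command name and prompt wording are illustrative, not Chen's actual command:

```shell
# Sketch of a custom Claude Code slash command. The mechanism (Markdown
# files in .claude/commands/, $ARGUMENTS substitution) is real; the
# command name and prompt text here are invented for illustration.
mkdir -p .claude/commands
cat > .claude/commands/deploy-answer.md <<'EOF'
Answer the deployment question below for the named customer.
1. Check the Confluence "customer quirks" page; if the customer appears
   there, apply every quirk (secret management, namespaces, sidecars).
2. Search the service repositories in this workspace for the relevant
   implementation details.
3. Produce a step-by-step answer tailored to that customer's environment.
$ARGUMENTS
EOF
# Invoked inside Claude Code as, e.g.:
#   /deploy-answer AcmeCorp "How do we rotate API keys in production?"
```

The single-quoted `EOF` keeps `$ARGUMENTS` literal in the file, so Claude Code performs the substitution at invocation time rather than the shell at creation time.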

"This is actually one of the core pages that goes into this DPL custom command, which is, 'Look at the customer quirks page. If I'm mentioning a customer that's on that page, look at all their quirks.' And then in the response from Cloud, it's highly customized, highly tailored to their environment because I've seen from working with our DevOps team that we can provide a generic answer about Kubernetes or about ClickHouse or about whatever for the customer, but it's like something you can just find online by Googling or using AI. But when it's tailored to specific security requirements and deployment requirements, it's way more effective and just gives the customer more trust that we know what we're doing, basically."

This approach highlights a key systems-thinking insight: organization matters less when AI can navigate complexity. Chen argues that instead of obsessing over perfectly structured knowledge bases, teams can embrace a more chaotic information environment. By feeding relevant context--whether it's code, Confluence pages, or even Slack threads--into an AI tool, the system can find and synthesize the necessary information. This collapses the cost of information retrieval, allowing customer-facing teams to focus on delivering value rather than managing knowledge silos.

The Virtuous Cycle of Reactive Support

Beyond proactive deployment assistance, Chen also applies AI to reactive support scenarios, particularly within Slack. Using a tool like Pylon, he can ingest long Slack threads where customers ask complex questions. The AI then helps generate draft help articles directly from these conversations. This transforms a single, isolated customer interaction into a scalable knowledge asset.

The "and then" workflow discovery, as described by Clara Val, is crucial here. A customer asks a question in Slack. And then, the AI helps answer it. And then, that answer is used to draft a knowledge base article. And then, that article is added to the public knowledge base, benefiting future customers. And then, it can inform the product roadmap by highlighting frequently asked questions or areas of confusion. This creates a virtuous cycle where each customer interaction fuels continuous improvement and knowledge dissemination. The immediate benefit is faster resolution of individual issues. The long-term advantage is a constantly growing, highly relevant knowledge base that reduces support load and enhances customer self-service, differentiating the company through superior customer experience.

"The reality is we can now all live in a little bit more chaos because the AI navigates all that information for us across systems, right? So you can be in your code querying Confluence. It will find, you can kind of point it in the right direction. It will find the information. You have to be less precious about where and how you store the information."

This strategy directly counters the conventional wisdom that requires meticulous organization and polished documentation processes. By embracing AI’s ability to traverse disparate information sources, companies can accelerate knowledge creation and dissemination, turning customer interactions into a strategic asset rather than a support burden.

The Human Element in an AI-Driven World

Despite the power of AI, Chen emphasizes that humans remain critical. He doesn't blindly copy-paste AI-generated responses. Instead, he proofreads, humanizes the language, and applies his understanding of the customer's context to distill complex answers into actionable insights. Furthermore, he uses AI as a learning tool, prompting it to explain its reasoning and cite sources, thereby deepening his own understanding of the codebase and underlying technologies. This continuous learning, driven by curiosity, is where human value truly lies.

The implication for teams is clear: while AI can automate information retrieval and synthesis, the ability to interpret, contextualize, and communicate effectively remains a human differentiator. The competitive advantage isn't just in using AI, but in how humans leverage AI to build deeper relationships and deliver more personalized, effective solutions. This requires a willingness to embrace new tools and, crucially, to develop the "hard skills" of interacting with them--including a basic understanding of code and AI prompting.

Key Action Items

  • Immediate Action (Next Quarter):

    • Environment Setup: For customer-facing roles (field engineering, technical support, customer success), dedicate time to setting up a local development environment capable of pulling multiple repositories. This is the foundational step.
    • IDE Integration: Integrate an AI coding assistant (like Claude Code, Cursor, or GitHub Copilot) into your IDE. Familiarize yourself with its basic code querying capabilities.
    • Contextualize Slack Threads: For critical customer Slack conversations, experiment with AI tools to summarize threads and draft initial knowledge base articles.
    • "Customer Quirks" Lite: Begin documenting specific customer environment details or deployment challenges in a centralized, accessible location (e.g., a shared document or Confluence page).
  • Longer-Term Investments (6-18 Months):

    • Develop Custom Commands: Investigate building custom AI commands or scripts to automate common customer support workflows, such as generating deployment guides based on codebase and customer-specific context.
    • Cross-Repository Querying: Train customer-facing teams on how to effectively query across multiple repositories within their IDE to answer complex technical questions.
    • Knowledge Dissemination System: Establish a formal process for converting AI-generated insights from customer interactions into public-facing knowledge base articles or documentation, creating a virtuous cycle of learning.
    • Upskill Customer-Facing Teams: Implement training programs to equip customer-facing teams with the necessary technical skills (e.g., basic Git, IDE usage, AI prompting) to leverage codebases effectively. This requires a commitment to hiring or developing talent comfortable with technical tools.
    • Embrace Information Chaos: Shift organizational mindset to be less precious about perfect information organization and more focused on AI's ability to navigate and synthesize diverse data sources.
  • Items Requiring Discomfort for Future Advantage:

    • Democratizing Code Access: Overcome engineering team hesitancy by demonstrating the time savings and efficiency gains from allowing customer-facing teams to query the codebase directly, thus reducing engineering bottlenecks. This requires building trust and providing appropriate guardrails.
    • Continuous Learning: Foster a culture of curiosity where all team members, regardless of role, are encouraged to learn basic coding concepts and develop hard skills related to interacting with code and AI. This investment in technical literacy will pay dividends as AI becomes more deeply integrated into workflows.
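For the "Customer Quirks Lite" starting point above, even a flat template is enough; the fields below are one possible shape, not a prescribed schema:

```shell
# One possible starter template for a per-customer "quirks" page.
# Field names are suggestions only, to be adapted per team.
cat > customer-quirks-template.md <<'EOF'
# Customer Quirks: <customer name>
- Deployment target: (e.g., managed Kubernetes, on-prem)
- Secret management: (e.g., Vault, sealed secrets)
- Namespace conventions:
- Sidecars / service mesh:
- Known gotchas:
EOF
```

Kept in Confluence or a shared repo, pages in this shape become exactly the context an AI assistant can pull in when answering for a named customer.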

---
Handpicked links, AI-assisted summaries. Human judgment, machine efficiency.
This content is a personally curated review and synopsis derived from the original podcast episode.