AI Freed Up Hours of Your Week. Here's What to Do With Them.
AI tools have eliminated hours of weekly coordination work for CSMs — status updates, meeting prep, account summaries, CRM hygiene. Most teams are using that recovered time to do more of the same. The CSMs who are pulling ahead are investing it in three skills that make them structurally harder to replace: data analysis and SQL fluency, technical product knowledge, and prompt engineering for workflow automation. The common thread is a shift from operating inside systems other people built to building the systems yourself.
How Skrift helps: Skrift automates the signal detection, account summarization, and risk identification work that used to consume hours of CSM time each week — freeing capacity that teams can redirect toward the higher-leverage skills this article describes.
One of my CSMs came to me in January with a problem I wasn’t expecting. She said she’d been using AI tools for about four months to automate her meeting prep, account summaries, and CRM updates. She estimated it was saving her six or seven hours a week. Good news.
The problem: she’d been filling those hours with more check-in calls and more follow-up emails. She was busier than before. Her accounts weren’t doing any better. “I’m doing more of the same thing, just faster,” she said. “I don’t think that’s the point.”
She was right, and it took her saying it out loud for me to realize we had the same problem across the whole team.
The Capacity Dividend Nobody’s Investing
Here’s what’s happened across most CS orgs in the past year. AI tools have quietly eaten the administrative work that used to consume 30-40% of a CSM’s week. Meeting prep that took 45 minutes now takes five. Account summaries that required pulling data from four systems happen automatically. CRM updates, status reports, internal briefing docs. All of it is either faster or fully automated.
The capacity dividend — the recovered time that AI automation creates by eliminating routine CSM tasks — is real. It’s sitting there on every CSM’s calendar as newly open blocks. And most teams are wasting it.
I don’t mean they’re literally doing nothing. They’re doing more: more calls, more emails, more touchpoints. It feels productive. But volume was never the constraint. A CSM who makes 12 check-in calls a week instead of 8 isn’t meaningfully moving the needle on retention or expansion. They’re running the same playbook at higher RPM.
A VP of CX at a mid-market analytics company we interviewed said something that stuck with me:
“We automated all the busywork and then watched our team fill the gap with more busywork. Nobody had told them what to do instead. Honestly, I hadn’t figured that out myself yet.”
I think this is where most teams are right now. The time has been freed. The question of what to do with it is mostly unanswered. And the answer matters because it’s going to determine which CSMs are still in high-demand roles in two years and which ones are managing smaller books at lower comp because the easy work they used to do doesn’t require a human anymore.
The Three Skills That Change the Shape of the Role
I’ve been watching this play out on my team and across a handful of post-sales orgs I’m close to. The CSMs who are pulling ahead aren’t the ones doing more of the old work. They’re the ones investing their recovered hours into skills that make them structurally harder to replace. Three skills keep coming up.
They all share a common thread: they move you from operating inside systems other people built to building the systems yourself.
Data analysis and SQL
This is the one that makes the biggest immediate difference, and it’s the one most CSMs resist the hardest.
The default CSM workflow for data is: ask the data team for a report, wait three days, get a dashboard that almost answers your question, ask for a modification, wait two more days. By the time you have what you need, the moment has passed. You’re reacting to data that’s a week old about a situation that needed attention on Tuesday.
A CSM who can write a basic SQL query against your product analytics database shortcuts this entire loop. They can check feature adoption trends for a specific account in five minutes. They can pull login frequency by user over the last 90 days. They can segment their book by usage patterns and spot the account that’s quietly declining before it shows up in anyone’s health score.
I pushed two of my CSMs to learn SQL basics last year. One used Mode Analytics tutorials, the other did a DataCamp course. The initial investment was about 15 hours each over six weeks. Within three months, one of them had built a personal dashboard that tracked engagement velocity across her accounts using a query she wrote herself. She caught a $165K account’s usage decline 23 days before our standard health score flagged it. That’s not a marginal improvement. That’s a different category of CSM.
The practical path: learn the basic SELECT, WHERE, JOIN, and GROUP BY clauses. That covers 80% of what you’ll need. Then ask your data team for read-only access to your product analytics database and start poking around. You don’t need to become a data engineer. You need to reach the point where you can answer your own questions without filing a ticket.
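To make that concrete, here’s a minimal sketch of the SELECT / WHERE / GROUP BY pattern described above, run through Python’s built-in sqlite3 so it’s self-contained. The `events` table and its columns are hypothetical stand-ins; your real product analytics schema will differ, but the shape of the question (“logins per account, filtered by event type”) is the same.

```python
import sqlite3

# Hypothetical product-analytics table; real schemas will differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (account TEXT, user TEXT, event TEXT, day TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?, ?)",
    [
        ("acme",   "ana", "login",  "2024-05-01"),
        ("acme",   "ana", "login",  "2024-05-02"),
        ("acme",   "ben", "login",  "2024-05-02"),
        ("acme",   "ben", "export", "2024-05-02"),
        ("globex", "cat", "login",  "2024-05-03"),
    ],
)

# The SELECT / WHERE / GROUP BY pattern from the text:
# login counts per account, sorted so the quiet accounts surface last.
rows = conn.execute(
    """
    SELECT account, COUNT(*) AS logins
    FROM events
    WHERE event = 'login'
    GROUP BY account
    ORDER BY logins DESC
    """
).fetchall()

for account, logins in rows:
    print(account, logins)
# acme has 3 logins, globex has 1
```

Swap the hard-coded rows for a read-only connection to your own analytics database and the query itself barely changes; that’s the point of learning the four clauses first.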
I’ll be honest about the failure case too. My third CSM who tried this gave up after a week. She found the syntax frustrating and couldn’t connect the abstract exercises to her actual work. The tutorials that worked were the ones where the CSM was querying data that looked like their real data. Generic SQL courses with sample e-commerce databases didn’t stick. If you’re going to invest in this for your team, use your own product data for practice exercises.
Technical product fluency
This one is subtler but compounds over time in ways that are hard to overstate.
Most CSMs know their product at the feature level: what buttons do what, how to configure settings, what the standard workflow looks like. What they don’t know is how the product actually works underneath. How data flows between systems. What the API does. Why certain integrations behave the way they do. What the implementation team is actually configuring during onboarding.
This matters because the conversations that determine renewal and expansion outcomes increasingly involve technical stakeholders. The director of engineering who needs to approve the API integration. The data architect who wants to understand how your product handles PII. The VP of IT who needs to know what your security posture looks like before they’ll sign off on an enterprise renewal.
When those conversations happen, most CSMs pull in an SE. And that’s fine for the complex stuff. But I’ve watched opportunities stall for two weeks because the CSM needed to schedule an SE call for a question they could have answered themselves with 30 minutes of product knowledge. A Head of CS at a developer tools company put it bluntly:
“The CSMs on my team who understand our API and data model close expansion deals faster, get pulled into strategic conversations earlier, and almost never lose a renewal because of a technical objection they couldn’t address.”
The starting point is lower-effort than most CSMs assume. Sit in on three or four implementation calls and take notes on everything you don’t understand. Then take that list to your SE team and ask for a 30-minute walkthrough of how your product’s data actually flows. Read your own product’s API docs. You don’t need to understand every endpoint. You need to understand the architecture well enough that when a customer asks “how does your product get data from Salesforce,” you can answer without saying “let me loop in my SE.”
I started requiring my CSMs to shadow at least two implementation calls per quarter about eight months ago. The ones who took it seriously are having fundamentally different customer conversations now. They’re catching technical issues during QBRs that they would have missed before. They’re suggesting integration patterns that lead to expansion. One of them identified a data pipeline misconfiguration during a routine check-in that would have caused a major issue for the customer. That’s the kind of moment that turns a vendor relationship into a partnership.
Prompt engineering and AI workflow design
This is the newest of the three and the one where I’m least confident about the long-term trajectory. But the early results from CSMs who’ve invested here are compelling enough that I think it’s worth the bet.
Prompt engineering for CS workflows is the practice of designing reusable AI prompts that automate or semi-automate repetitive processes. Not the one-off “paste a transcript and get a summary” workflow, which, as I’ve written before, is a trap. I’m talking about building durable micro-automations: a prompt template that generates a QBR prep doc from structured account data. A workflow that produces a first-draft renewal business case. A system that takes raw usage metrics and outputs a stakeholder-ready narrative.
The skill isn’t writing one good prompt. It’s iterating on a prompt until the output is consistently usable without manual editing. That iteration process is where the real capability develops.
One of my CSMs spent about three weeks building a prompt chain that takes our standard account data export and produces a renewal risk assessment. The first version was mediocre. By the fourth iteration, it was producing assessments that matched our manual risk reviews about 80% of the time. She’s now spending 15 minutes per account on renewal prep instead of 90. And she built it herself, which means she can modify it when the inputs change.
The practical starting point: pick one repetitive workflow you do every week. Not something complex. Something boring and predictable. Try to automate or semi-automate it with AI. The first attempt will probably produce garbage. Iterate on the prompt. Adjust the input format. Add constraints and examples to the prompt. Keep going until the output is actually usable. That iteration cycle, frustrating as it is, builds the muscle.
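The “prompt template fed by structured data” idea can be sketched in a few lines. This example only builds the prompt string; it deliberately makes no call to any AI service, since which model and API you use is your own choice. The field names (`account`, `arr`, `usage_trend`) and the constraints are hypothetical placeholders; the iteration the text describes happens by tightening these constraints and adding examples until the outputs stop needing manual edits.

```python
# A minimal sketch of a reusable prompt template for QBR prep.
# Field names below are hypothetical; adapt them to whatever
# structured export your own tooling produces.

QBR_PROMPT = """You are drafting a QBR prep doc for a CSM.

Account: {account}
ARR: ${arr:,}
90-day usage trend: {usage_trend}

Constraints:
- Three bullet points max per section.
- Flag any metric that declined quarter over quarter.
- End with one recommended talking point.
"""

def build_qbr_prompt(record: dict) -> str:
    """Fill the template from one row of an account data export."""
    return QBR_PROMPT.format(**record)

prompt = build_qbr_prompt(
    {"account": "Acme Corp", "arr": 165000, "usage_trend": "declining"}
)
print(prompt)
```

Because the template lives in code rather than in someone’s chat history, each iteration is a visible diff: you adjust a constraint line, rerun the batch, and compare outputs. That is also what makes the resulting prompt shareable in a team prompt library.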
I want to be careful not to oversell this. Most prompt engineering content online is breathless about productivity gains and light on honest assessment. In my experience, about half the workflows my team has tried to automate with prompts have produced reliably good results. The other half either required so much prompt tuning that the time investment exceeded the time saved, or produced output that was consistently 70% right, which is worse than useless because you still have to check everything. The skill is partly knowing which workflows are good candidates and which ones aren’t.
The Career Bifurcation
I want to zoom out for a moment because I think the stakes here are higher than “learn some new skills.”
The CSM role as it existed in 2020 was heavily weighted toward coordination, information routing, and manual data management. A good CSM was organized, responsive, and persistent. Those are still valuable traits. They’re just not sufficient anymore.
What’s happening is a career bifurcation — a widening split between CSMs who are using AI-freed capacity to develop analytical, technical, and automation skills, and those who are doing the same coordination work faster. Some people in the industry are calling the first group “full-stack CSMs” or “T-shaped CSMs,” and while I find the labels a bit reductive, the underlying pattern is real. On one side, you have CSMs who are using the capacity dividend to move upmarket in their skill set. They’re querying data, having technical conversations, and building automated workflows. They’re becoming strategic advisors who happen to manage accounts. On the other side, you have CSMs who are doing the same job with better tools. More efficient, yes. But not fundamentally more valuable.
I don’t love the framing of “CSMs will be replaced by AI” because I think it’s wrong. The relationship-building, the judgment calls, the ability to read a room on a customer call, the political navigation of a complex renewal. None of that is going away. But the CSM who can do all of that and also pull their own usage data, hold their own in a technical architecture conversation, and automate their own workflows? That person is worth dramatically more than someone who can only do the first set of things.
The gap is already visible in hiring. I’ve reviewed probably 50 CS job postings in the last three months, and the number of them that mention SQL, data fluency, or technical aptitude has roughly doubled compared to a year ago. That’s not a coincidence.
What CS Leaders Should Do About This
If you manage a CS team, the worst thing you can do is let the capacity dividend dissipate into more of the same activity. The second worst thing is to mandate upskilling without creating the conditions for it. I’ve done both, and the second one is sneakier because it feels like leadership.
Block 2-3 hours per week on your team’s calendar explicitly for skill development. Not “when you have time.” Scheduled, protected time. Make it a team OKR, not an individual initiative. When I first tried the “encouraged” approach, one person out of eight invested the time. The rest let it get eaten by customer calls and internal meetings. When I made it a scheduled block and asked people to share what they’d worked on in our Friday standup, six out of eight engaged consistently.
Specificity matters more than motivation. When I gave my team a generic “learn SQL” goal, one person did it. When I gave them a specific Mode Analytics tutorial, a database login, and three practice queries related to their actual accounts, four people did it. The difference wasn’t enthusiasm. It was activation energy.
Create a prompt library. When one CSM builds a prompt that works well, make it available to the whole team. We started a shared Notion page where CSMs post their working prompts with input templates and example outputs. It’s the most-used internal resource my team has created in the past year. I didn’t expect this to work as well as it did. The competitive dynamic of “someone else built something cool and I want to contribute too” turned out to be a stronger motivator than anything I could have mandated.
And be honest that this is uncomfortable. I asked a 10-year CS veteran on my team to learn SQL last year. She looked at me like I’d asked her to learn to fly a helicopter. Six months later she said it was the most useful skill she’d picked up in years. But that first month was genuinely miserable for her, and I almost pulled back on the ask twice because watching someone struggle with something new when they’re excellent at their current job feels wrong. It’s not wrong. It’s necessary. But acknowledge the learning curve. Make it safe to be bad at something new.
The Version of the Role That’s Already Gone
The CSM role isn’t going away. I’m more convinced of that than I was a year ago. The strategic, human, judgment-intensive parts of the work are becoming more valuable, not less, precisely because AI is making the informational parts trivial.
But the version of the role that was mostly coordination and data entry? That’s already gone. The CSMs who were spending 70% of their time on meeting prep, CRM updates, status emails, and account summaries are finding that AI does all of that faster and at least as well. The question is what they do with the 70%.
The answer, I think, is to become the kind of professional who builds systems rather than operating inside them. The CSM who can query their own data, hold a technical conversation, and automate their own workflows isn’t just a better CSM. They’re a different kind of professional entirely. The capacity dividend is the opportunity to make that transition. But it’s a limited window. The teams that invest it wisely are going to be the ones running post-sales orgs in five years. The ones that absorb it into more check-in calls are going to wonder why the role feels smaller than it used to.
Frequently Asked Questions
What skills should CSMs learn now that AI handles routine tasks?
The three highest-leverage skills for CSMs in an AI-augmented environment are data analysis and SQL fluency (the ability to query product analytics and build your own dashboards rather than waiting on data teams), technical product fluency (understanding how your product's data flows, API integrations work, and implementations are architected), and prompt engineering or AI workflow design (the ability to automate repetitive workflows by building reusable AI prompts and processes). These skills share a common thread: they move CSMs from operating inside systems to building and shaping them.
What is the capacity dividend in customer success?
The capacity dividend is the recovered time that AI automation creates by eliminating routine CSM tasks like meeting prep, status updates, CRM data entry, and account summarization. Most teams unconsciously absorb this dividend by doing more of the same work — more check-in calls, more emails, more reactive firefighting. Teams that deliberately invest the capacity dividend in skill development and strategic work see compounding returns in retention, expansion, and CSM career progression.
Why should CSMs learn SQL?
SQL fluency gives CSMs direct access to product usage data, engagement analytics, and customer behavior patterns without waiting for a data team to pull reports. A CSM who can write a basic query to check feature adoption trends, usage frequency by account segment, or login patterns over the last 90 days can identify risks and opportunities days or weeks before they surface in a dashboard someone else built. The skill gap between CSMs who can query data and those who cannot is becoming one of the strongest predictors of career advancement in post-sales.
How do CSMs develop technical product fluency?
Technical product fluency is built through three progressive steps: sitting in on implementation and solutions engineering calls to absorb how your product is actually deployed (not how marketing describes it), reading your product's API documentation to understand what data flows where, and asking your SE team for a 30-minute walkthrough of the technical architecture. The goal is not to become an engineer — it's to reach the point where you can have a credible technical conversation with a customer's technical stakeholders without needing to pull in an SE for every question.
What is prompt engineering for customer success workflows?
Prompt engineering for CS workflows is the practice of designing reusable AI prompts that automate or semi-automate repetitive processes — QBR preparation, account risk summaries, renewal business cases, stakeholder briefing docs. The skill is not about writing one good prompt; it's about iterating on prompts until the output is consistently usable without manual editing. CSMs who develop this skill effectively build their own micro-automations, compounding the capacity dividend by turning one-time AI experiments into durable workflow improvements.
Will AI replace customer success managers?
AI is not replacing CSMs — it's replacing the version of the CSM role that was primarily coordination, data entry, and information routing. The strategic, relationship-driven, judgment-intensive parts of the role are becoming more important, not less. What is changing is the minimum viable skill set: CSMs who only do the work that AI can now automate are at risk, while CSMs who use the recovered capacity to develop analytical, technical, and automation skills are becoming more valuable. The role is evolving, not disappearing.
How should CS leaders help their teams upskill for AI?
The most effective approach is structured but lightweight: allocate 2-3 hours per week explicitly for skill development, provide access to learning resources (Mode Analytics tutorials for SQL, product API docs for technical fluency, internal prompt libraries for AI workflows), and create safe spaces for experimentation where CSMs can try building queries or automations without pressure to deliver polished outputs immediately. The leaders who are getting this right treat upskilling as a team OKR, not an individual side project.
What is the CSM career bifurcation?
The CSM career bifurcation is the widening split between two trajectories in customer success. On one side are CSMs who invest AI-freed capacity into developing analytical, technical, and automation skills — sometimes called full-stack CSMs or T-shaped CSMs — who are becoming strategic advisors with the ability to query data, hold technical conversations, and build their own workflows. On the other side are CSMs who continue doing coordination and relationship management work faster with AI tools but without expanding their skill set. The bifurcation is visible in hiring trends, compensation gaps, and the types of accounts each group is trusted to manage.
How do CSMs learn prompt engineering for their workflows?
The most effective approach is project-based: pick one repetitive weekly workflow (QBR prep, account summaries, renewal business cases), attempt to automate it with an AI prompt, and iterate until the output is consistently usable. The first attempt will typically be poor. The skill develops through the iteration cycle — adjusting inputs, adding constraints, providing examples, and refining the prompt over 3-5 versions. CSMs who build working prompt templates should share them in a team prompt library so the entire team benefits. The goal is not to write one clever prompt but to develop the judgment for which workflows are good automation candidates and which aren't.
See how Skrift surfaces these signals automatically.
Learn more about Skrift