Portfolio Project — SaaS Revenue Analysis
End-to-end revenue analysis across 10,000 customers and 36 months.
MRR modeling, cohort retention, and channel segmentation to surface what blended reporting misses.
Leadership needs a clearer view of three things:
Whether the revenue base is healthy, which customer cohorts are retaining well, and whether each channel is acquiring the right customers.
Since top-level metrics don't answer these questions reliably, this analysis builds the segmented view that does.
The dataset is synthetic but realistic: logistic growth curve peaking at month 11, plan migrations (Starter → Growth → Enterprise), failed payments, reactivations, and one deliberately seeded anomaly designed to be invisible at the top level.
Each month's MRR movement is classified into five buckets: New, Expansion, Contraction, Churn, or Reactivation. The waterfall shows the composition of growth, not just the net number.
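The five-bucket classification can be sketched as a simple function. This is a minimal illustration, assuming a per-customer view of consecutive monthly MRR values plus a flag for whether the customer has ever paid before (needed to separate New from Reactivation); the function name and signature are illustrative, not from the project code.

```python
def classify_movement(prev_mrr: float, curr_mrr: float, ever_paid_before: bool):
    """Classify a customer's month-over-month MRR change into one of the
    five waterfall buckets, or None if MRR is unchanged."""
    if prev_mrr == 0 and curr_mrr > 0:
        # First revenue ever is New; revenue returning after churn is Reactivation.
        return "Reactivation" if ever_paid_before else "New"
    if prev_mrr > 0 and curr_mrr == 0:
        return "Churn"
    if curr_mrr > prev_mrr:
        return "Expansion"    # e.g. a Starter -> Growth plan migration
    if curr_mrr < prev_mrr:
        return "Contraction"  # e.g. a downgrade or partial seat reduction
    return None               # flat MRR contributes nothing to the waterfall
```

Summing each bucket across all customers for a month gives the bars of the waterfall; the net of the five buckets reconciles to the month's MRR change.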
The S-curve is clear in the data: rapid growth through Year 1, expansion MRR compounding in Year 2, then net new MRR compressing in Year 3 as churn scales with the base and acquisition slows. By late 2025, the business has shifted from a growth story to a retention story.
Every percentage point of churn costs more in Year 3 than it did in Year 1. The math changes when the base is large.
Cohorts are defined by month of first subscription. For each cohort, we track what percentage of original MRR is still active at each subsequent month.
Month 1 retention averages 101.5% because expansion revenue from early upgrades outpaces logo churn. After that, the curve decays consistently with no cohort significantly outperforming the average. By M24, average retention sits at 71.6%. The narrow P25-P75 band means cohort behavior is predictable, which makes LTV modelable. It also means the average seems like a reasonable estimate for most segments.
Channel retention at M12 ranges from 66% to 74% across all six acquisition channels. partner_referral sits at 66.3%, within the normal range. Nothing stands out here.
Filtering to Enterprise only changes the picture: partner_referral's M12 retention falls 17–24 percentage points below the other channels.
Why is it invisible at the top level? Enterprise is only 13% of signups, but it's 48% of MRR. The Starter and Growth tier customers from partner_referral churn at normal rates and outnumber Enterprise customers by volume, so the top-level view averages the problem away.
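The averaging-away effect is easy to reproduce with arithmetic. The numbers below are illustrative (only the 13% Enterprise signup share comes from the analysis; the tier retention figures are made up to show the mechanism): a badly underperforming minority tier still produces a blended number inside the "normal" 66–74% band.

```python
# Hypothetical per-tier M12 logo retention for one channel.
# tier: (share of channel signups, M12 retention)
segments = {
    "Starter":    (0.55, 0.70),
    "Growth":     (0.32, 0.72),
    "Enterprise": (0.13, 0.50),  # badly underperforming, but only 13% of logos
}

# Logo-weighted blend, which is what a top-level channel chart shows.
blended = sum(share * retention for share, retention in segments.values())
print(round(blended, 2))  # 0.68 -- comfortably inside the 66-74% "normal" band
```

Because the blend is weighted by logo count rather than by MRR, the tier that carries nearly half the revenue contributes only an eighth of the weight.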
LTV is modeled as average initial MRR × (1 / monthly churn rate), segmented by channel and plan tier. CAC is assumed from industry-representative benchmarks by channel.
| Metric | partner_referral Enterprise | Other channels Enterprise avg |
|---|---|---|
| Monthly churn | 6.05% | 1.87% |
| Avg customer lifetime | 16.5 months | 52.0 months |
| LTV | $4,654 | $14,567 |
| CAC | $95 | $60–$340 |
| LTV:CAC ratio | 49x (blended; looks healthy) | varies |
partner_referral has the lowest CAC at $95 and a blended LTV:CAC of 49x, which looks strong. At the Enterprise tier specifically, monthly churn of 6.05% compresses average customer lifetime to 16.5 months versus 52 months for other channels. LTV comes out to $4,654 compared to $14,567 for other Enterprise segments.
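The LTV arithmetic above can be checked with a short sketch. The churn rates are from the analysis; the average-MRR inputs are backed out from the reported LTVs for illustration, since the table does not state them directly:

```python
def ltv(avg_initial_mrr: float, monthly_churn: float) -> float:
    """LTV = average initial MRR x expected lifetime (1 / monthly churn),
    the simple model used in this analysis."""
    lifetime_months = 1 / monthly_churn
    return avg_initial_mrr * lifetime_months

# partner_referral Enterprise: 6.05% churn -> ~16.5-month lifetime.
partner_enterprise = ltv(281.6, 0.0605)   # roughly $4,650

# Other Enterprise channels: 1.87% churn -> ~53-month lifetime.
other_enterprise = ltv(280.0, 0.0187)
```

The point of the sketch: with nearly identical average MRR, a 3x difference in churn alone drives a ~3x difference in LTV.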
The 49x ratio is misleading: it is propped up by the unusually low CAC, while the Enterprise LTV behind it is roughly a third of what other channels deliver.
The churn spike is channel- and tier-specific. That pattern points to a fit or expectation problem, not a product problem. The fix is qualitative: interview churned Enterprise customers from this channel before spending another dollar scaling it.
Blended channel retention is structurally misleading when plan mix varies by channel. The default board view should always show Enterprise retention separately. One chart change, significant difference in what gets caught early.
LTV:CAC ratio alone is insufficient when churn varies 3× by segment. Require a minimum expected customer lifetime (e.g. 24 months) before scaling any channel at the Enterprise tier. CAC efficiency means nothing if the customer doesn't stay.
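The proposed gate translates directly into a check. A minimal sketch of the rule as stated (the 24-month threshold is from the recommendation above; the function name is illustrative):

```python
MIN_LIFETIME_MONTHS = 24  # minimum expected lifetime before scaling a channel

def passes_scaling_gate(monthly_churn: float) -> bool:
    """A channel/tier must clear the minimum expected customer lifetime
    (1 / monthly churn) regardless of how good its LTV:CAC ratio looks."""
    expected_lifetime = 1 / monthly_churn
    return expected_lifetime >= MIN_LIFETIME_MONTHS

passes_scaling_gate(0.0605)  # partner_referral Enterprise: ~16.5 months -> fails
passes_scaling_gate(0.0187)  # other Enterprise channels: ~53 months -> passes
```

Under this rule, partner_referral Enterprise would have been flagged before scale-up despite its 49x blended ratio.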
A note on methodology: this anomaly was seeded deliberately to demonstrate the analytical approach. The same methodology applied to real data would surface equivalent patterns that aggregated reporting typically misses.