Semantic Layer vs Knowledge Graph: What Data Leaders Need to Know
Two Solutions to Different Problems
If you lead a data team, you've almost certainly dealt with the "metric definition problem." Revenue means one thing to Finance, another to Sales, and something else entirely to Product. The semantic layer emerged as the solution: a single, governed place where metrics are defined once and used everywhere.
It's a good solution to a real problem. But it solves only one problem, and the bigger problem remains untouched.
The semantic layer tells you what a metric means. The knowledge graph tells you how metrics relate to each other. These are fundamentally different capabilities, and understanding the distinction is critical for data leaders evaluating their analytics architecture.
What a Semantic Layer Does Well
A semantic layer (tools like dbt Metrics, Looker's LookML, AtScale, or Cube) provides a governed abstraction between your data warehouse and your analytics consumers. It defines:
- Metric definitions: Revenue = SUM(amount) WHERE status = 'closed_won' AND date >= current_period_start
- Dimensions: How metrics can be sliced (by region, by product, by customer segment)
- Aggregation rules: How metrics roll up across time and hierarchy
- Access controls: Who can see what
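In code, a semantic-layer entry is essentially a named, governed calculation. Here is a minimal sketch of that idea — the `MetricDefinition` class and its fields are a hypothetical schema for illustration, not the actual API of dbt, LookML, AtScale, or Cube:

```python
from dataclasses import dataclass, field

@dataclass
class MetricDefinition:
    """A governed metric: defined once, used everywhere."""
    name: str
    expression: str                                      # how the metric is calculated
    filters: list[str] = field(default_factory=list)     # conditions baked into the definition
    dimensions: list[str] = field(default_factory=list)  # how the metric can be sliced

# The revenue definition from the example above, expressed in this sketch schema.
revenue = MetricDefinition(
    name="revenue",
    expression="SUM(amount)",
    filters=["status = 'closed_won'", "date >= current_period_start"],
    dimensions=["region", "product", "customer_segment"],
)
```

The point of the structure is that there is exactly one `revenue` object, and every consumer reads from it.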
This is genuinely valuable. When your CEO asks "What was revenue last quarter?" and your CFO, VP of Sales, and Head of Analytics all give the same answer, that's the semantic layer working.
The semantic layer ensures consistency of definition. Everyone is looking at the same number, calculated the same way.
What a Semantic Layer Cannot Do
Here's the question a semantic layer can't answer: "Why did revenue drop?"
The semantic layer knows that Revenue = SUM(closed_won_amount). It knows you can slice Revenue by region, segment, and product. But it has no concept that Revenue is driven by Deal Count and Average Deal Size. It doesn't know that Deal Count depends on Opportunity Count and Win Rate. It doesn't know that a change in MQL-to-SQL conversion rate three weeks ago should be expected to affect revenue now.
A semantic layer is a catalog of definitions. A knowledge graph is a model of causal relationships.
Let's make this concrete with three scenarios.
Scenario 1: Diagnosing a Revenue Drop
With a semantic layer only: Your analyst knows exactly how revenue is defined. They can slice it by any dimension. They see it dropped 12% in the Enterprise segment. Now what? They open another dashboard to check pipeline metrics. Then another for marketing metrics. They manually investigate each possible driver, joining insights across tools and queries.
With a knowledge graph: The system already knows that Enterprise Revenue is driven by Enterprise Deal Count × Enterprise Average Deal Size, which are driven by Enterprise Opportunity Count × Enterprise Win Rate, which are driven by Enterprise SQL Count × Enterprise SQL-to-Opportunity Conversion Rate. When revenue drops, Root Cause Analysis traverses this graph automatically and identifies which upstream metric deviated, by how much, and when the deviation started.
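The traversal described here can be sketched as a walk up a driver tree: at each metric, find which upstream driver deviated, and follow it. Everything below is illustrative — the driver map, the toy baseline/current numbers, and the 5% deviation threshold are assumptions for the sketch, not Fig's actual implementation:

```python
# Hypothetical driver tree: each metric maps to the upstream metrics that drive it.
drivers = {
    "revenue": ["deal_count", "avg_deal_size"],
    "deal_count": ["opportunity_count", "win_rate"],
    "opportunity_count": ["sql_count", "sql_to_opp_rate"],
}

# (baseline, current) observations — toy numbers where only SQL volume moved.
observed = {
    "revenue": (10.0, 8.8),           # -12%
    "deal_count": (100, 88),          # -12%
    "avg_deal_size": (0.10, 0.10),    # flat
    "opportunity_count": (400, 352),  # -12%
    "win_rate": (0.25, 0.25),         # flat
    "sql_count": (800, 704),          # -12%
    "sql_to_opp_rate": (0.50, 0.50),  # flat
}

def pct_change(metric):
    base, cur = observed[metric]
    return (cur - base) / base

def root_cause(metric, threshold=0.05):
    """Walk upstream, following the driver whose own deviation explains the drop."""
    deviated = [m for m in drivers.get(metric, []) if abs(pct_change(m)) >= threshold]
    if not deviated:
        return metric  # no upstream driver deviated: this is the root
    worst = max(deviated, key=lambda m: abs(pct_change(m)))
    return root_cause(worst, threshold)

print(root_cause("revenue"))  # in this toy data: sql_count
```

With the graph in place, the investigation is a recursive lookup rather than a tour of dashboards.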
The semantic layer made sure "revenue" means the same thing everywhere. The knowledge graph turned a 12-hour investigation into a 30-second one.
Scenario 2: Forecasting the Impact of a Marketing Change
With a semantic layer only: Marketing wants to shift 30% of budget from paid search to content marketing. They can report on historical spend and historical leads from each channel. But they have no quantified model of how a spending change in one channel will affect lead volume, pipeline, and ultimately revenue — or how long that effect will take to materialize.
With a knowledge graph: The relationships between marketing spend, lead volume, pipeline, and revenue are explicitly mapped with measured elasticities and lag times. Fig's Scenario Modeling can project: "Reducing paid search by 30% is expected to decrease MQLs by 18% within 2 weeks, impacting pipeline by approximately 12% in 6 weeks, with a revenue effect of 8% in 10-14 weeks. The content marketing increase is expected to offset 40% of this decline, but with a longer lag of 8-12 weeks for pipeline impact."
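The arithmetic behind a projection like this is a chain of elasticities and lags: each edge scales the percentage effect and adds delay. A minimal sketch — the elasticity and lag values below are assumptions chosen to reproduce the illustrative numbers in the scenario (-30% spend → -18% MQLs → -12% pipeline → -8% revenue), not measured values:

```python
# Causal chain as (upstream, downstream, elasticity, lag_weeks).
# elasticity: % change downstream per 1% change upstream — assumed values.
chain = [
    ("paid_search_spend", "mqls", 0.6, 2),
    ("mqls", "pipeline", 2 / 3, 4),
    ("pipeline", "revenue", 2 / 3, 4),
]

def propagate(shock_pct):
    """Propagate a % shock down the chain, compounding effect and accumulating lag."""
    effect, lag, results = shock_pct, 0, []
    for upstream, downstream, elasticity, edge_lag in chain:
        effect *= elasticity
        lag += edge_lag
        results.append((downstream, round(effect, 1), lag))
    return results

for metric, pct, weeks in propagate(-30.0):
    print(f"{metric}: {pct:+.1f}% expected after ~{weeks} weeks")
```

A real model would carry uncertainty ranges and offsetting effects (like the content-marketing increase), but the core mechanism is this multiply-and-delay walk along the edges.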
The semantic layer ensures "MQL" is defined consistently. The knowledge graph quantifies the causal chain from spend to revenue.
Scenario 3: Setting Consistent Targets
With a semantic layer only: Leadership sets a revenue target of $50M for Q3. The Sales team sets pipeline targets. Marketing sets lead targets. Product sets activation targets. Each team builds their targets independently, often in spreadsheets, with different assumptions about conversion rates and growth trajectories. By mid-quarter, the targets are inconsistent — if marketing hits their lead target but conversion rates don't match assumptions, the pipeline target becomes unreachable.
With a knowledge graph: Targets propagate through the causal graph. A $50M revenue target, combined with historical win rates and deal sizes, implies a specific opportunity count. That opportunity count, given conversion rates, implies a specific SQL count. That SQL count implies a specific MQL count. The knowledge graph ensures mathematical consistency across the entire target hierarchy, and flags when assumptions at one level are inconsistent with assumptions at another.
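Target propagation is this same arithmetic run backwards through the graph: divide the downstream target by each conversion rate to get the upstream target it implies. A sketch using the $50M target from the scenario — every rate below is an assumed, illustrative value:

```python
# Back out upstream targets from a revenue target using historical rates.
revenue_target = 50_000_000
avg_deal_size = 100_000   # assumed historical average
win_rate = 0.25           # opportunities -> closed-won deals (assumed)
sql_to_opp = 0.50         # SQLs -> opportunities (assumed)
mql_to_sql = 0.40         # MQLs -> SQLs (assumed)

deal_target = revenue_target / avg_deal_size  # deals needed
opp_target = deal_target / win_rate           # opportunities needed
sql_target = opp_target / sql_to_opp          # SQLs needed
mql_target = sql_target / mql_to_sql          # MQLs needed

print(f"{deal_target:.0f} deals -> {opp_target:.0f} opps -> "
      f"{sql_target:.0f} SQLs -> {mql_target:.0f} MQLs")
```

Because every team's target is derived from the same chain, a change to any assumption (say, a lower win rate) immediately surfaces as a changed requirement everywhere upstream — the inconsistency can't hide in separate spreadsheets.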
The Architecture Distinction
Think of it this way:
A semantic layer is like a dictionary. It defines terms precisely so everyone speaks the same language. "Revenue" means this. "Active User" means that. Essential for communication, but a dictionary doesn't tell you how words relate in a sentence.
A knowledge graph is like a map. It shows you the roads between cities — which metrics influence which other metrics, how strong the connection is, and how long it takes for an effect to travel from one to another. A map is useless if you don't agree on the names of the cities (that's the semantic layer's job). But knowing the names of the cities is useless if you don't know the roads between them (that's the knowledge graph's job).
In technical terms:
- The semantic layer defines nodes (metrics) with consistent calculations
- The knowledge graph defines edges (relationships) with measured properties: direction, strength (elasticity), and timing (lag)
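In code, an edge carries exactly those three properties. A hypothetical sketch, not any particular tool's schema — the metric names and numbers are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Edge:
    """A causal relationship between two governed metrics."""
    source: str        # upstream metric (a node defined in the semantic layer)
    target: str        # downstream metric
    elasticity: float  # % change in target per 1% change in source
    lag_weeks: float   # time for the effect to materialize

# Two illustrative edges; a real graph would hold one per measured relationship.
graph = [
    Edge("mql_to_sql_rate", "pipeline", elasticity=0.8, lag_weeks=3),
    Edge("pipeline", "revenue", elasticity=0.65, lag_weeks=6),
]
```

The semantic layer owns the node definitions; the knowledge graph owns this edge list.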
How They Work Together
The ideal architecture uses both, and they complement each other naturally.
The semantic layer feeds the knowledge graph. Metric definitions from your semantic layer become the nodes in the knowledge graph. This ensures that when the knowledge graph says "MQL-to-SQL Conversion Rate dropped 8%," it's using the governed definition of MQL and SQL that your entire organization has agreed on.
The knowledge graph extends the semantic layer. Once metrics are defined, the knowledge graph adds the relational intelligence that enables automated analysis. The semantic layer says "here are your metrics." The knowledge graph says "here's how they drive each other, and here's what happens when one of them changes."
In practice, this means:
- Your semantic layer (dbt, LookML, Cube, etc.) defines and governs metric calculations
- Fig's knowledge graph ingests those definitions and maps the causal relationships between them
- When an anomaly is detected, the knowledge graph powers Root Cause Analysis using consistently defined metrics
- When a scenario is modeled, the results are expressed in the same metric definitions your team already uses
What Data Leaders Should Ask
If you're evaluating your analytics architecture, here are the questions that separate semantic-layer-only thinking from knowledge-graph thinking:
- Can your system explain WHY a metric changed, or only show you THAT it changed? If the answer is only "that," you have a reporting gap that no amount of better dashboards will close.
- Can your system quantify the downstream impact of a proposed change? Not qualitatively ("this will probably affect pipeline") but quantitatively ("this is expected to reduce pipeline by 12% with a 6-week lag").
- Are your targets mathematically consistent across teams? If each team sets targets independently, they're almost certainly inconsistent. A knowledge graph makes inconsistencies visible before you're halfway through the quarter.
- How long does it take your team to diagnose the cause of a metric change? If the answer is measured in days, you're paying for knowledge work that should be automated.
The semantic layer was a major step forward for the data industry. It solved the "what does this metric mean?" problem decisively. The knowledge graph is the next step: solving the "how do these metrics relate?" problem with the same rigor and consistency.
You don't choose between them. You build one on top of the other.
Ready to see Fig in action?
Start with free credits. Connect your data warehouse. See your first causal analysis in minutes.