
How We Report on SEO Progress: The Metrics We Use and Why

Oladoyin Falana

May 5, 2026

Reviewed by Semola Digital Content Team

How an SEO agency reports on its work is one of the most revealing things about how it thinks. A report is not just a document that summarizes the month’s activity. It is a communication of values.

The Report is the Relationship

We have seen the reports that other agencies send. The ones that lead with a keyword ranking table — fifty rows of positions, half highlighted green, a quarter highlighted red, with no context for what any of it means for the business. The ones where “total organic traffic” is the headline metric, and the fact that none of that traffic converted is buried in a footnote or simply absent.

The ones where the narrative section says “results are progressing in line with expectations” without ever stating what those expectations were.

Those reports are not designed to inform you. They are designed to reassure you long enough to keep the retainer going. They work by keeping the signal-to-noise ratio low enough that a critical question never crystallises: is this investment actually generating a return?

Our reports are designed around the opposite philosophy. Every metric we include is there because it either tells you whether SEO is generating business value, or tells us where the work needs to change. Every metric that does not pass that test is excluded, regardless of how impressive it looks in a graph.

This article is a complete transparency document: the metrics, the rationale, the format, and the conversations we commit to having when the numbers do not go the way we planned.

The Report Nobody Should Be Sending

Two real monthly SEO reports. The left measures activity. The right measures value.

Before explaining our approach, it is worth being direct about what the industry standard looks like and why it fails.

The typical agency SEO report contains three sections. First, a keyword rankings table showing position movement across a set of tracked keywords. Second, a traffic overview showing total organic sessions compared to last month and last year. Third, a summary paragraph that describes the month in positive language and ends with a forward-looking statement about next steps.

There is nothing wrong with any of this data individually. Rankings and traffic are real signals. The problem is what is missing: any connection between those signals and the business outcomes that justify the investment.

The three questions a good report must answer

1. Is SEO generating leads or revenue? Not traffic. Not rankings. Not authority scores. Actual business outcomes attributable to organic search. If this question is not answered in your monthly report, the report is incomplete.

2. Is the site attracting the right traffic? A site can grow organic traffic 40% month-on-month while simultaneously attracting an entirely different audience from the one the business serves. Traffic growth without engagement quality context is a vanity number.

3. What did we do this month and what are we doing next? Accountability requires a log of completed work and a plan for what follows. A report without these sections is a measurement of outcomes without attribution to actions, which makes it impossible to know what is working and what should change.

What Our Report Looks Like

The four sections of every Semola Digital monthly report, mapped to the audience each section serves and the core question each one answers.

Section 1: The Executive Summary

The executive summary is the first page of every report. It is written to be read in under two minutes by someone who does not want to read the rest of the report. It contains exactly three headline KPIs, a one-paragraph narrative, and a RAG (Red, Amber, Green) status indicator for the engagement overall.

The three headline KPIs are always the same:

  • Organic sessions this month vs. last month and vs. three months ago
  • Organic goal completions (leads, contact form submissions, or purchases, depending on the client’s conversion model) this month vs. last month
  • Estimated organic ROI for the month, calculated from the conversion volume and the client’s reported average lead or transaction value
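As a rough illustration of how the third headline KPI is assembled, the estimated organic ROI is just conversion volume multiplied by the client's reported average value, set against the month's spend. The function name and all figures below are illustrative, not real client data:

```python
# Sketch of the estimated-organic-ROI headline KPI described above.
# All names and numbers are illustrative assumptions.

def estimated_organic_roi(conversions: int, avg_value: float, monthly_spend: float) -> float:
    """Estimated ROI (%) for the month: (value generated - spend) / spend x 100."""
    value_generated = conversions * avg_value
    return (value_generated - monthly_spend) / monthly_spend * 100

# Example: 24 organic leads, a client-reported average lead value of 400,
# and a monthly SEO investment of 2,000.
roi = estimated_organic_roi(conversions=24, avg_value=400.0, monthly_spend=2000.0)
print(f"Estimated organic ROI: {roi:.0f}%")  # 24 x 400 = 9,600 generated -> 380%
```

The point of expressing it this way is that every input is either measured (conversions) or supplied by the client (average lead value, spend), so the number can be interrogated rather than taken on faith.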

The narrative paragraph answers the question: what happened this month, and what caused it? It is written in plain English. It names the specific content, technical changes, or external factors that drove the numbers. If something went down, the narrative explains why without softening it. If something went unexpectedly well, the narrative explains the mechanism so it can be replicated.

The RAG status is our honest assessment of whether the engagement is on track:

  • Green — all core KPIs are moving in the right direction and the business is within the expected trajectory for the engagement’s stage
  • Amber — one or more KPIs are flat or declining, a diagnosis has been made, and a corrective action is in progress
  • Red — a significant problem has been identified that requires a strategy conversation and possibly a direction change
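A minimal sketch of how that RAG logic could be expressed as a rule. The three-state mapping below is an illustrative assumption, not the agency's actual internal criteria:

```python
# Illustrative sketch of the Green / Amber / Red assignment described above.
# The logic and thresholds are assumptions for demonstration only.

def rag_status(kpi_deltas: dict[str, float], significant_problem: bool) -> str:
    """Map month-on-month KPI deltas (fractions, e.g. 0.05 = +5%) to a RAG state."""
    if significant_problem:           # a diagnosed issue needing a strategy conversation
        return "Red"
    if all(delta > 0 for delta in kpi_deltas.values()):
        return "Green"                # every core KPI moving in the right direction
    return "Amber"                    # one or more KPIs flat or declining

status = rag_status({"sessions": 0.12, "conversions": -0.03, "roi": 0.02},
                    significant_problem=False)
print(status)  # Amber: conversions declined month-on-month
```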

We have sent Red reports. They are uncomfortable. They are also the reports that, in our experience, most clearly demonstrate the value of working with an agency that tells the truth, because the clients who saw a Red report and worked through the diagnosis with us came out on the other side with a stronger strategy than the one that had stalled.

Section 2: The Performance Deep-Dive

The deep-dive section is the diagnostic layer. It is written primarily for the marketing lead or SEO owner rather than the executive team, and it goes one level deeper on each of the KPIs from the summary. Its purpose is not to celebrate or worry about numbers in isolation, but to understand the mechanism behind them: why one page converts and another does not, which queries drive engaged traffic versus bounce traffic, and where in the conversion funnel the biggest drop-off occurs.

This section takes longer to produce and longer to read than the summary, but it is where the strategic decisions that shape next month’s work actually get made.

Section 3: Work Completed

This section is the accountability layer. It is a factual log of every deliverable produced and every optimisation made during the month: content published with target keywords and word counts, technical issues resolved with a before-and-after description, structured data added or updated, backlinks acquired with source domain and anchor text, and any changes made to GA4 or GSC configuration.

We include this section for a specific reason: every recommendation we make in an SEO engagement has an implementation cost. If a recommendation was made but not implemented, the report should say so, because the absence of an action can explain the absence of a result. Logging the work creates the paper trail that makes it possible to distinguish between ‘we did the work and it did not perform as expected’ and ‘the work was not done.’

Section 4: Next 30 Days

The forward-looking section connects the month’s performance to the month’s plan. It includes the content pieces being worked on with their target keywords and estimated publication dates, any technical work prioritised for the upcoming sprint, notes on algorithm updates or competitive shifts that are influencing strategy, and any items that require client decision or input before we can proceed.

This section matters because it creates a cycle of accountability that runs from plan to execution to measurement and back to plan. When we arrive at next month’s report, the first thing we check is whether last month’s ‘next 30 days’ section was completed. If it was not, we explain why. If it was, we show what it produced.

The Metrics, One by One

What follows is the complete metric set we use, with the rationale for each, the source where it is found, what we consider a healthy signal, and the condition that triggers a diagnostic conversation. We are publishing this because we believe that a client who understands what they are looking at is a better partner, makes better decisions, and gets better results than one who is handed a report they cannot interrogate.

Business Outcome Metrics

Organic Goal Completions
Source: GA4 → Conversions
What it measures: The number of conversion events (contact form submissions, phone call clicks, email link clicks, purchase completions, or resource downloads) that originated from an organic search session in the reporting period.
✓ Good: Growing month-over-month from month 4 onward; all conversion events correctly configured before launch.
⚠ Watch: Zero conversions despite growing traffic after month 4 — conversion configuration error or critical intent mismatch.
Why we use it: This is the primary measure of whether SEO is producing business value. Everything else is context for understanding this number.

Organic Conversion Rate
Source: GA4 → Conversions ÷ Sessions
What it measures: The percentage of organic sessions that result in a goal completion. Calculated as: organic goal completions ÷ organic sessions × 100.
✓ Good: 1.5–4.0% for B2B service sites; 2.0–5.0% for e-commerce; above 5% is exceptional for service pages.
⚠ Watch: Below 1% after month 6, or a significant month-over-month decline despite stable traffic — page or offer issue.
Why we use it: Separates traffic quality from traffic volume. A site can grow sessions while its conversion rate collapses if the new traffic is the wrong audience.

Cost Per Organic Lead
Source: (Monthly spend) ÷ (Conversions)
What it measures: Monthly SEO investment (agency fee + content + tools) divided by the number of organic goal completions in the same period.
✓ Good: Below cost-per-lead from paid search for the same query types; typically 50–80% lower by month 9–12.
⚠ Watch: Higher than equivalent paid CPA after month 9 — conversion funnel or audience targeting review required.
Why we use it: The most direct comparison to paid acquisition cost. Makes the compounding ROI argument concrete and financial-team-legible.
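The two formulas above are deliberately simple. As a sketch, with illustrative figures only:

```python
# Sketch of the conversion-rate and cost-per-lead calculations described above.
# Input numbers are illustrative, not real client data.

def organic_conversion_rate(goal_completions: int, sessions: int) -> float:
    """CVR (%) = organic goal completions / organic sessions x 100."""
    return goal_completions / sessions * 100 if sessions else 0.0

def cost_per_organic_lead(monthly_spend: float, goal_completions: int) -> float:
    """Monthly SEO investment (fee + content + tools) / organic goal completions."""
    return monthly_spend / goal_completions if goal_completions else float("inf")

cvr = organic_conversion_rate(goal_completions=30, sessions=1200)        # 2.5%
cpl = cost_per_organic_lead(monthly_spend=2400.0, goal_completions=30)  # 80 per lead
print(f"CVR {cvr:.1f}%, CPL {cpl:.0f}")
```

Note that cost per lead is undefined when conversions are zero, which is exactly the "watch" condition in the table: zero conversions despite spend is a configuration or intent problem, not a divide-by-zero curiosity.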

Engagement Quality Metrics

Engaged Organic Sessions
Source: GA4 → Acquisition → Traffic acquisition
What it measures: Organic sessions lasting more than 10 seconds, viewing more than one page, or triggering a conversion event. GA4’s default engagement definition. More meaningful than raw session count.
✓ Good: Engagement rate above 50% for content pages; above 55% for service pages.
⚠ Watch: Engagement rate below 35% on pages receiving significant traffic — strong intent mismatch signal.
Why we use it: Filters out the bounce traffic that inflates session counts without contributing to business outcomes. The quality denominator for conversion rate.

Average Engagement Time (Organic)
Source: GA4 → Engagement overview
What it measures: The mean time that organic users spend actively engaged with the site per session — not the same as average session duration. GA4 only counts time when the browser tab is active and in focus.
✓ Good: 60–120 seconds for service pages; 90–180 seconds for long-form content articles.
⚠ Watch: Below 30 seconds at scale — content is not holding attention; above-fold content may be misaligned with the incoming query.
Why we use it: The most direct proxy for content relevance to the incoming audience. Users read what they came to read. They leave immediately what they did not.

Organic Landing Page CVR by Page
Source: GA4 Explore → Landing page × Conversions
What it measures: Conversion rate broken down per landing page, filtered to organic sessions only. Shows which specific pages are converting traffic into business outcomes and which are not.
✓ Good: Top-performing pages converting at 3%+ from organic; visible improvement month-on-month on optimised pages.
⚠ Watch: High-traffic pages converting below 0.5% — signals the page is serving the wrong intent for its ranking queries.
Why we use it: The diagnostic tool for conversion problems. Identifies which pages are working as intended and which need content, design, or offer intervention.
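GA4's default engaged-session definition, as described above, reduces to a three-part rule. A minimal sketch with illustrative session data:

```python
# Sketch of GA4's default engaged-session rule: a session counts as engaged
# if it lasts more than 10 seconds, views more than one page, or converts.
# The sample sessions are illustrative, not real analytics data.

def is_engaged(duration_seconds: float, pageviews: int, converted: bool) -> bool:
    """Apply the three-part engagement test to a single session."""
    return duration_seconds > 10 or pageviews > 1 or converted

def engagement_rate(sessions: list[tuple[float, int, bool]]) -> float:
    """Engaged sessions / total sessions x 100."""
    engaged = sum(is_engaged(*s) for s in sessions)
    return engaged / len(sessions) * 100 if sessions else 0.0

sample = [(4.0, 1, False),   # 4s bounce: not engaged
          (45.0, 1, False),  # 45s read: engaged
          (8.0, 3, False),   # short but multi-page: engaged
          (2.0, 1, True)]    # converted immediately: engaged
print(f"Engagement rate: {engagement_rate(sample):.0f}%")  # 3 of 4 -> 75%
```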

Search Performance Metrics

Organic Click-Through Rate
Source: GSC → Performance → Pages or Queries
What it measures: The percentage of Google Search impressions that result in a click to the site. Calculated per query group, page, or overall in Google Search Console.
✓ Good: Above 4% overall; above 8% for branded queries; above 3% for competitive informational queries.
⚠ Watch: Below 2% at positions 3–5 — title tag or meta description is not matching the searcher’s expectation for that query.
Why we use it: The bridge between rankings and traffic. A page that ranks well but has a low CTR is leaving traffic on the table that belongs to it.

Keyword Position Movement (in context)
Source: GSC → Performance (+ SEO tool for volume)
What it measures: Position changes for the tracked keyword set, reported alongside the search intent of each keyword, the monthly search volume, and the trend over three and six months rather than just month-on-month.
✓ Good: Priority keywords moving into top 10 within 6 months; top 3 within 12 months for competitive terms.
⚠ Watch: Tracked keywords improving that carry no search volume or commercial intent — classic vanity metric manipulation.
Why we use it: Rankings matter, but only in context: what is the intent, what is the volume, and is the direction of movement consistent over a meaningful period? One-month position snapshots are noise.

Top Queries Driving Conversions
Source: GA4 → Acquisition → Search Console
What it measures: The specific search queries that both generated organic sessions and led to a goal completion, pulled from the GSC–GA4 linked report. Shows which queries have both traffic value and commercial alignment.
✓ Good: Increasing number of commercial-intent queries appearing in this list month-on-month.
⚠ Watch: Top conversion-driving queries are all branded — non-branded organic content is not converting; content cluster strategy needs review.
Why we use it: The most precise signal of SEO’s commercial contribution. It names the exact searches that became business outcomes.
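The CTR "watch" condition above is mechanical enough to automate: flag any query ranking at positions 3 to 5 whose CTR falls below 2%, since those are the pages most likely to be leaving traffic on the table. A sketch, with illustrative query data:

```python
# Sketch of the CTR watch rule above: flag queries at average position 3-5
# whose CTR is below 2% (title tag / meta description mismatch candidates).
# The rows mimic GSC Performance export fields; the data is illustrative.

def flag_low_ctr(rows: list[dict]) -> list[str]:
    """Return queries where average position is 3-5 but CTR < 2%."""
    return [r["query"] for r in rows
            if 3 <= r["position"] <= 5
            and (r["clicks"] / r["impressions"]) < 0.02]

gsc_rows = [
    {"query": "seo reporting template", "position": 4.2, "clicks": 12, "impressions": 900},
    {"query": "seo agency lagos",       "position": 3.1, "clicks": 60, "impressions": 800},
]
print(flag_low_ctr(gsc_rows))  # first query: 12/900 = 1.3% CTR at position 4.2
```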

Technical Health Indicators

GSC Coverage Status
Source: GSC → Indexing → Pages
What it measures: The count of indexed pages, pages with errors, pages with warnings, and excluded pages across the site as reported by Google Search Console’s Coverage or Indexing report.
✓ Good: Error count at zero or trending toward zero; indexed count growing as new content is published.
⚠ Watch: Rising error count month-on-month — indicates a crawlability or indexation problem that will suppress rankings if uncorrected.
Why we use it: The early warning system for technical issues. A page with errors cannot rank regardless of content quality.

Core Web Vitals Field Data (CrUX)
Source: GSC → Experience → Core Web Vitals
What it measures: Real-user performance data collected by Chrome browsers across actual visits to the site, reported by Google Search Console. The field data (not the lab data) is what Google uses in rankings.
✓ Good: All three metrics (LCP, CLS, INP) in ‘Good’ range for both mobile and desktop. 100% of URLs assessed as Good.
⚠ Watch: Any URL category in ‘Poor’ range, especially on mobile; or INP failing (over 500ms) — immediate technical priority.
Why we use it: CWV is a confirmed ranking factor. Field data failures affect rankings directly and are invisible in standard SEO reports that only show Lighthouse lab scores.

The Metrics We Do Not Report On

This section is as important as the previous one. Every metric we include in a report was chosen deliberately. Every metric below was excluded deliberately.

Domain Authority / Domain Rating

Domain Authority (Moz) and Domain Rating (Ahrefs) are third-party approximations of a site’s link-based authority. They are useful for competitive benchmarking and for getting a rough sense of backlink profile strength. They are not Google ranking factors. Google has explicitly confirmed it does not use either metric. We monitor them internally, but they do not appear in client reports as a performance indicator because they cannot be directly connected to business outcomes and are subject to calculation changes by the third-party platforms that compute them.

Total Organic Sessions (as a headline metric)

We report organic sessions, but never as the headline KPI. Sessions without engagement and conversion context can increase while the business outcomes from organic search decline. We have seen this happen: an algorithm update brings a new audience to a site from queries the business cannot serve, sessions rise 30%, conversions drop 20%, and the agency reports a great month. Total sessions as a headline metric is a trap.

Total Backlink Count

We report on backlinks acquired in the work log — specifically which links, from which domains, with what anchor text. We do not report total backlink count as a performance metric because volume without quality is meaningless and, in some cases, actively harmful. A site that has acquired 500 links from low-quality directories is worse off than one with 50 links from authoritative, topically relevant sources. We measure the quality and strategic value of individual links acquired, not the count.

Social Shares and Social Traffic

Social signals do not directly influence Google rankings, and social traffic has different intent characteristics from search traffic. We do not include social metrics in SEO reports. If a client wants social media performance reporting, that requires a separate measurement framework designed around social’s specific conversion paths and audience dynamics.

“Every metric in our report earns its place by answering a question about the business. Metrics that do not earn that place do not appear, regardless of how good they make the numbers look.”

When the Numbers Go the Wrong Way

The most important test of a reporting relationship is not what happens when SEO is working. It is what happens when it is not. And at some point in every engagement, something will not go as planned. An algorithm update will affect a content cluster. A competitor will publish aggressive content on a target keyword. A technical migration will briefly disrupt indexation. A new content piece will fail to rank despite being well-constructed.

These are not catastrophes. They are the normal texture of a long-term SEO engagement in a competitive environment. The question is how they are communicated.

Our diagnostic protocol when a metric declines

When a Tier 2 or Tier 3 metric declines month-on-month, we run a structured diagnosis before the report is written. The diagnosis follows a fixed sequence:

  • First: check the data source. Is this a real decline or a tracking configuration issue? GA4 session data can be affected by tag manager changes, consent mode configuration updates, or platform sampling. We verify the data before acting on it.
  • Second: check the external environment. Did Google release an algorithm update in the reporting period? Is there a seasonal pattern that explains the movement? Has a major competitor published new content on the target queries?
  • Third: check the internal factors. Did the site undergo any changes — redesign, URL restructure, new hosting — that could explain the movement? Was there any change to the conversion tracking setup?
  • Fourth: if none of the above explains it, we look at the query-level and page-level data. Which specific pages lost traffic? From which queries? What changed about the intent or competition landscape for those queries?
  • Fifth: a proposed corrective action is included in the report with a specific hypothesis about what is causing the decline and what we expect the fix to do. We do not say ‘we are monitoring the situation.’ We say what we are going to do and what we expect it to produce.

When we recommend a strategy change

Some declines are symptoms of a strategy issue rather than an execution issue. If the content cluster approach is not generating qualified traffic after eight months, adding more content to the same cluster is not the answer. If service pages are ranking but not converting, the problem may be in the web design or the offer rather than the SEO. If a competitor has fundamentally outpaced a target content cluster, the strategic decision may be to shift resources to a different cluster rather than to intensify the fight for the same ground.

We make these recommendations explicitly and early. An agency that continues executing a strategy it privately believes is not working, because recommending a change risks the retainer conversation, is not serving its client. We would rather have that conversation in month six than in month twelve.

How to Evaluate Any Agency’s Reporting

Whether you are considering working with Semola Digital or evaluating another agency’s proposal, the following questions will reveal more about how an agency thinks about reporting than any pitch deck.

Question: Can I see a sample report from a current client?
Why it reveals the truth: A real sample (anonymised) shows exactly what you will receive. An agency that cannot or will not share one is protecting something.
Red flag answer: “We don’t share client reports” without a counterproposal. Or a sample that consists entirely of keyword ranking tables.

Question: What is the headline metric on page one of your report?
Why it reveals the truth: Reveals what the agency considers most important. The answer should be a business outcome, not a vanity metric.
Red flag answer: Domain Authority, total backlink count, or keyword rankings with no conversion context as the primary KPI.

Question: What happens when a metric declines?
Why it reveals the truth: Reveals whether the agency has a diagnostic protocol or just hopes things improve.
Red flag answer: “We monitor the situation” or “Sometimes rankings fluctuate.” No specific diagnosis, no specific proposed action.

Question: Do I own the reporting dashboard and the data?
Why it reveals the truth: Your GA4 account and GSC property should be yours. If an agency builds your reporting in their own account, the data leaves when they do.
Red flag answer: Reporting in an agency-owned platform. “We’ll set up tracking in our system.”

Question: How do you report on organic leads specifically?
Why it reveals the truth: If the agency cannot explain their conversion tracking configuration, they are not measuring organic revenue contribution.
Red flag answer: Confusion about the difference between traffic and conversions, or “we can add that later.”

Question: What metric would tell you the engagement is not working?
Why it reveals the truth: Reveals whether the agency has defined failure. An agency that cannot describe failure cannot recognise it when it arrives.
Red flag answer: No clear answer, or “SEO takes time — we always see results eventually.”

Conclusion: Transparency is the Product

The reporting framework we have described in this article is not a differentiator in the sense of being a unique competitive advantage. Measuring organic conversions, diagnosing declines, and writing plain-English narratives are table stakes for any agency that takes its work seriously.

The reason we are publishing this in detail is not to claim credit for doing something exceptional. It is because the industry norm is significantly lower than this standard, and most business owners signing SEO contracts do not know what they should be asking for until they have already signed a contract that delivers something much less.

If you are about to hire an SEO agency and the reporting structure has not been discussed in detail, bring this article into the conversation. Ask what the headline metric on page one of the monthly report will be. Ask to see a sample report from a client at a similar stage. Ask what happens when something declines.

The answers will tell you a great deal about what the next twelve months will look like.

At Semola Digital, every engagement begins with a reporting alignment session: we agree the conversion events to track, the KPIs to headline, and the format of the monthly report before any SEO work begins. The report is not an afterthought. It is the frame that makes the work legible. And legibility, in an investment relationship, is not a courtesy. It is an obligation.

See a real report sample: semoladigita@gmail.com

We share anonymised report samples on request with prospective clients during the evaluation stage. No commitment required to see one.


Oladoyin Falana

Founder, Technical Analyst

Oladoyin Falana is a certified digital growth strategist and full-stack web professional with over four years of hands-on experience at the intersection of SEO, web design, and development. His journey into the digital world began as a content writer — a foundation that gave him a deep, instinctive understanding of how keywords, content, and intent drive organic visibility. While honing his craft in content, he simultaneously taught himself the building blocks of the modern web: HTML, CSS, and React.js — a pursuit that eventually evolved into full-stack web development and a role as a Technical SEO Analyst.

