
The Analytics dashboard

5 min read

What the dashboard shows

Eight panels arranged in a single scrollable view:

  1. KPI cards. Total queries, conversations, satisfaction, average response time.
  2. Query volume over time. Area chart of daily query counts.
  3. Top questions. Most-asked questions with answer count.
  4. Unanswered queries. Questions where retrieval came back weak.
  5. Channel breakdown. Pie chart of conversation volume per channel.
  6. Satisfaction trend. Thumbs up/down ratio over time.
  7. Peak hours heatmap. Day-of-week + hour-of-day grid.
  8. Sources usage. Which documents drove the most retrievals.

Each panel can be exported as CSV (Business+) or PNG.

KPI cards

Four cards at the top show this period's headline metrics:

  • Total queries. Count of AI-generated responses across all channels.
  • Total conversations. Unique conversation threads. Multiple queries per conversation count once.
  • Satisfaction. Percentage of thumbs-up votes among responses that received feedback.
  • Average response time. Median latency from customer message to bot reply.

Each card shows the value plus a percentage change vs the previous period (e.g., last 7 days vs the 7 days before). Green for improvement, red for regression.
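The period-over-period comparison on each card can be sketched like this. This is an illustrative calculation, not the product's internal formula; the example numbers are made up.

```python
from typing import Optional

def pct_change(current: float, previous: float) -> Optional[float]:
    """Percentage change vs the previous period; None when there is no baseline."""
    if previous == 0:
        return None  # avoid dividing by zero when the prior period had no data
    return round((current - previous) / previous * 100, 1)

# e.g. 840 queries in the last 7 days vs 700 in the 7 days before -> +20.0%
print(pct_change(840, 700))
```

A positive result renders green (improvement), a negative one red, matching the card styling described above.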

Date range

The top-right date-range picker controls all panels:

  • Last 24 hours. Useful for spotting today's issues.
  • Last 7 days (default). Most common view.
  • Last 30 days. Trend visibility.
  • Last 90 days. Longer-trend visibility.
  • Custom range. Pick any start and end date.

Up to 365 days of history available on standard plans, 6 years on Enterprise.

Top questions

Lists the 25 most-asked questions in the date range. Each row:

  • Question text (de-duplicated across paraphrases).
  • Count. How many times asked.
  • Average satisfaction for responses to this question.
  • Top source the bot cited.

Useful for spotting:

  • High-volume questions to add to your FAQ. If "what's your refund policy?" comes up 200 times a month, it deserves a dedicated FAQ entry.
  • Underperforming responses. A high-volume question with low satisfaction signals the bot's answer isn't good. Investigate.

Click any row to see the conversations behind it.

Unanswered queries

Lists the 25 most common questions where the bot's confidence was low or it refused to answer. Each row:

  • Question text.
  • Count. How many times asked.
  • Top retrieval source the bot found (even though it wasn't confident enough).

This is the most actionable panel. Unanswered queries are knowledge gaps. Plug them by:

  1. Adding a Q&A pair with the canonical answer.
  2. Adding a snippet with the relevant info.
  3. Re-crawling content if your documentation has the answer but indexing missed it.

Review weekly. Reduce the unanswered list by 5 to 10 entries per week and the bot's coverage compounds.
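The weekly review can be partly scripted against a panel CSV export. The column names below ("question", "count") are assumptions about the export format, and the sample rows are invented.

```python
import csv
import io

# Stand-in for an exported unanswered-queries CSV; column names are assumed.
sample = """question,count
how do i reset my password,41
do you support sso,18
what is your uptime sla,9
"""

rows = list(csv.DictReader(io.StringIO(sample)))
rows.sort(key=lambda r: int(r["count"]), reverse=True)

# Plug the highest-volume gaps first; take the top 5-10 each week.
this_week = [r["question"] for r in rows[:2]]
print(this_week)
```

Sorting by count ensures each week's effort goes to the gaps customers hit most often.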

Channel breakdown

A pie chart of conversation volume per channel. Helps you see which channels are getting traction.

For most B2B SaaS:

  • Widget dominates (60 to 80% of conversations).
  • WhatsApp for India and SEA-focused customers (10 to 30%).
  • Email Assistant for B2B support escalations (5 to 20%).
  • Slack, Telegram, Discord as supporting channels.

Track the breakdown over time to spot channel shifts (e.g., WhatsApp adoption growing) and budget your channel investments.

Satisfaction trend

A line chart of thumbs-up percentage over time. Daily granularity by default.

Three things to watch:

  • Baseline. Most teams settle at 75 to 90% satisfaction. Below 70% suggests systemic issues.
  • Spikes down. A bad day correlates with a content update gone wrong, a model provider issue, or a sudden volume spike.
  • Long-term drift. Slowly declining satisfaction signals content rot. Schedule re-crawls.

The chart excludes responses without feedback (the majority). Only thumb-voted responses count toward satisfaction.
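The daily ratio behind the chart can be sketched as follows: no-feedback responses are dropped, and each day's value is thumbs-up votes over total votes. The data here is made up for illustration.

```python
from collections import defaultdict

# (date, vote) pairs; None means the customer gave no feedback and is excluded.
votes = [
    ("2024-05-01", "up"), ("2024-05-01", "up"), ("2024-05-01", "down"),
    ("2024-05-02", "up"), ("2024-05-02", None),
]

by_day = defaultdict(lambda: [0, 0])  # day -> [thumbs up, total voted]
for day, vote in votes:
    if vote is None:
        continue  # responses without feedback don't count
    by_day[day][1] += 1
    if vote == "up":
        by_day[day][0] += 1

trend = {day: round(ups / total * 100, 1) for day, (ups, total) in by_day.items()}
print(trend)
```

Because the denominator is voted responses only, a low-volume day with one angry vote can swing hard; read single-day dips with that in mind.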

Peak hours heatmap

A 7x24 grid showing conversation volume per day-of-week and hour-of-day. Useful for:

  • Staffing decisions. When is your team busiest? Match coverage.
  • SLA configuration. Set tighter SLAs during peak hours.
  • Campaign timing. Launch outbound at low-volume hours to avoid drowning the inbox.

Heatmap cells deepen in color with higher volume. Hover for exact count.
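The 7x24 grid is a straightforward bucketing of conversation timestamps by weekday and hour. A minimal sketch, with invented timestamps:

```python
from datetime import datetime

# Conversation start times (ISO 8601); these examples are made up.
timestamps = ["2024-05-06T09:15:00", "2024-05-06T09:40:00", "2024-05-07T18:05:00"]

# rows = day-of-week (Monday = 0), columns = hour-of-day (0-23)
grid = [[0] * 24 for _ in range(7)]
for ts in timestamps:
    dt = datetime.fromisoformat(ts)
    grid[dt.weekday()][dt.hour] += 1

print(grid[0][9])  # Monday 09:00 bucket
```

Note that weekday/hour bucketing is timezone-sensitive; normalize timestamps to one timezone before aggregating, or the peaks will smear.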

Sources usage

Lists the 25 most-cited documents in the date range. Each row:

  • Document name.
  • Citation count.
  • Average satisfaction for responses citing this document.

Useful for content investment decisions. The top-10 sources usually carry 50 to 70% of citations. Focus content updates there.
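The "top sources carry most citations" check is easy to reproduce from an export. The (document, citation count) pairs below are invented, and the export's real column layout may differ.

```python
# (document, citation count) pairs, assumed from a sources-usage export.
citations = [("pricing.md", 120), ("faq.md", 90), ("setup.md", 40), ("changelog.md", 10)]

total = sum(count for _, count in citations)
top = sorted(citations, key=lambda x: x[1], reverse=True)[:2]  # top-10 in practice
share = round(sum(count for _, count in top) / total * 100, 1)
print(f"top sources carry {share}% of citations")
```

If the share is very high, updates to a handful of documents move most of the bot's answers.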

CSV export

For deeper analysis (BI tools, custom reports):

  • Per-panel export. Click the download icon on any panel.
  • All-data export. Analytics > Export All generates a zip with every panel as CSV.

Export is available on Business plans and above.
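For BI pipelines, the all-data zip can be unpacked programmatically: one CSV per panel. The file and column names inside the archive below are assumptions for illustration; the snippet builds a stand-in archive in memory so it runs without a real download.

```python
import csv
import io
import zipfile

# Build a stand-in for a downloaded "Export All" zip (names are assumed).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("top_questions.csv", "question,count\nrefund policy,200\n")

# Read every panel CSV out of the archive.
with zipfile.ZipFile(buf) as z:
    for name in z.namelist():
        rows = list(csv.DictReader(io.TextIOWrapper(z.open(name), encoding="utf-8")))
        print(name, rows)
```

From here each panel's rows can be loaded into a warehouse or BI tool as-is.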

Real-time vs daily aggregates

Two data freshness modes:

  • Real-time. KPI cards and unanswered queries update every 30 seconds.
  • Daily aggregates. Trend charts (volume, satisfaction over time) compute overnight. Today's data appears the next morning.

For real-time inspection, use the Live Chat inbox directly. Analytics is for trend analysis.

Limits

  • Data retention. 365 days standard, 6 years Enterprise.
  • CSV export size. Up to 100 MB per export. Larger ranges split into multiple files.
  • Plan availability. Basic analytics on every paid plan. CSV export and custom date ranges on Business+.

Common pitfalls

Satisfaction looks low. Customers only vote on responses they care strongly about (positive or negative). 60 to 70% is fine; 30% is concerning.

Unanswered list grows over time. New questions arrive faster than you backfill content. Schedule weekly review and aim to plug 10 to 20 per week.

Channel breakdown skewed by one customer. A high-traffic enterprise customer dominates the chart. Filter by audience or workspace tag to see segment-level data.

Peak hours don't match your assumptions. Often the inbox is busier in evenings than during business hours, depending on your customer mix. Trust the data, not your assumptions.

FAQ

Can I see analytics per channel?

Yes. Apply a channel filter at the top. All panels filter consistently.

Can I compare two date ranges?

Yes on Business+. Analytics > Compare Ranges shows side-by-side panels for two custom date ranges.

Are agent metrics included?

Yes. Analytics > Agents shows per-agent response times, claim rates, resolution counts, and customer satisfaction.

Can I get alerts when metrics dip?

Yes. Configure thresholds under Settings > Alerts. Slack or email notification when satisfaction drops below 70% or response time exceeds 2 minutes.

How accurate is the data?

Counts are exact. Latency is measured in real time. Satisfaction is opt-in, based on customer feedback. Trend charts use 24-hour overnight aggregates.
