Overview

The dashboard surfaces a concise set of metrics focused on recent generation activity. Use these to keep tabs on volume and model mix without digging through raw history.

Generation metrics

Generations (selected range)

What it is: Count of generation requests in the selected range (last 7 or 30 days). Includes:
  • Completed generations
  • Failed generations
Use it to: Track overall activity and watch for spikes or drops.

Recent generations

What it is: A grid of your latest completed generations. Includes:
  • Prompt snippet
  • Model used
  • Timestamp (relative)
  • Cost per generation
  • Download links
Use it to: Quickly review recent outputs and grab assets without opening a session.

Model metrics

Daily model activity (stacked bars)

What it is: Daily counts for your top three models in the selected range. Use it to:
  • See which models dominated specific days
  • Identify whether one model is shouldering most of the work
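To make the stacked-bar logic concrete, here is a minimal sketch of how daily counts for the top three models could be aggregated from raw generation records. The record fields and model names are illustrative assumptions, not the dashboard's actual schema.

```python
from collections import Counter, defaultdict

# Hypothetical generation records; field names and model names
# are placeholders, not the dashboard's real schema.
records = [
    {"date": "2024-05-01", "model": "alpha"},
    {"date": "2024-05-01", "model": "beta"},
    {"date": "2024-05-02", "model": "alpha"},
    {"date": "2024-05-02", "model": "gamma"},
    {"date": "2024-05-02", "model": "alpha"},
]

# Rank models by total volume in the range and keep the top three.
totals = Counter(r["model"] for r in records)
top3 = [model for model, _ in totals.most_common(3)]

# Per-day counts for those models -- one stacked-bar segment each.
daily = defaultdict(Counter)
for r in records:
    if r["model"] in top3:
        daily[r["date"]][r["model"]] += 1

for day in sorted(daily):
    print(day, dict(daily[day]))
```

A day where one model's segment towers over the others is exactly the "one model shouldering most of the work" pattern described above.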

Model breakdown (pie)

What it is: Share of total generations per model for the selected range. Use it to:
  • Confirm your go-to models
  • Spot underused models you may want to experiment with
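The pie chart's slices are just each model's percentage of total generations. A quick sketch, using assumed per-model counts:

```python
from collections import Counter

# Hypothetical per-model generation counts for the selected range;
# model names are placeholders.
counts = Counter({"alpha": 60, "beta": 30, "gamma": 10})
total = sum(counts.values())

# Percentage share per model -- one pie slice each.
shares = {model: round(100 * n / total, 1) for model, n in counts.items()}
print(shares)  # {'alpha': 60.0, 'beta': 30.0, 'gamma': 10.0}
```

A slice far larger than the rest marks a go-to model; thin slices flag models you may want to experiment with.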

Cost visibility

  • Costs are shown per generation in the Recent generations grid.
  • Aggregated cost totals are not displayed on the dashboard; refer to session history or billing views for detailed spend.

Interpreting metrics

Healthy patterns

  • Stable generation count: Consistent volume within your expected range
  • Balanced model usage: Multiple models represented across days
  • Recent completions: A steady stream of completed generations without prolonged gaps

Warning signs

  • Sudden drop in generations: Verify sessions are active and prompts are valid
  • Overreliance on one model: Consider trying alternatives for comparison
  • Few recent completions: Check for failed runs in session history
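A "sudden drop" check like the one above can be approximated by comparing a recent window against the preceding baseline. This is a sketch with made-up numbers and an arbitrary threshold, not the dashboard's detection logic:

```python
# Hypothetical daily generation counts, most recent day last.
daily_counts = [42, 39, 45, 41, 40, 12, 9]

# Flag a sudden drop when the last-2-day average falls below half
# the average of the preceding days (the 0.5 threshold is arbitrary).
recent = daily_counts[-2:]
baseline = daily_counts[:-2]
recent_avg = sum(recent) / len(recent)
baseline_avg = sum(baseline) / len(baseline)

drop_detected = recent_avg < 0.5 * baseline_avg
print(drop_detected)  # True -- volume fell off sharply
```

When a check like this fires, the guidance above applies: verify sessions are active and prompts are valid, then dig into session history for failed runs.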

Using metrics effectively

Check ranges: Flip between 7-day and 30-day views to spot short-term versus longer trends.
Pair with history: Use the dashboard to spot patterns, then open session history for detailed troubleshooting.
Compare models: If one model dominates the pie chart, try multi-model runs to validate whether it’s the best fit.