
Analytics (Non-Technical Overview)

This page explains how DreamStream analytics work in plain language so marketing and product teams can confidently read dashboards, define experiments, and tell the story of user behavior.


Why we track

We use analytics to answer three simple questions:

  1. Are people completing the core journey? (log dreams → reflect → get insights)
  2. Which features create the most value? (Deep Insights, Dream Guide Chat, learning guides)
  3. Where are users getting stuck? (onboarding, generation, exports, settings)

How it works (high level)

  • Every meaningful action in the app sends a single, consistent event to PostHog.
  • Events are enriched with context (platform, app version, user tier, locale, etc.) so we can slice results by segment.
  • A small number of events drive core dashboards and funnels.
  • We intentionally avoid duplicating events (e.g., chat open) so results stay accurate.
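For readers who want a concrete picture, the "single, consistent event" convention above can be sketched roughly as below. The helper, event names, and context values here are illustrative assumptions, not the app's actual implementation; in practice the payload would be sent through PostHog's capture API.

```typescript
// Illustrative sketch only: buildEvent, the event name, and the context
// values are hypothetical. The point is that one shared helper attaches
// the same context to every event, so dashboards can segment "for free".

type EventProps = Record<string, string | number | boolean>;

interface AnalyticsEvent {
  name: string;
  properties: EventProps;
}

function buildEvent(name: string, properties: EventProps = {}): AnalyticsEvent {
  return {
    name,
    properties: {
      ...properties,
      // Context automatically merged into every event (assumed values).
      platform: "ios",
      app_version: "1.4.2",
      user_tier: "free",
    },
  };
}

const event = buildEvent("dream_logged", { connectivity: "online" });
```

Because every feature goes through the same helper, two teams can never emit the same action with different shapes, which is what keeps funnels trustworthy.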

What we track (by user journey)

Below is a simple map of what we measure, grouped by the way a person moves through the product.

1) Onboarding & access

  • App open, sign‑up, login, magic‑link requests
  • Onboarding start/complete/skip

Why: measures acquisition quality and onboarding drop‑off.

2) Dream capture & generation

  • Dream logger opened / cancelled
  • Dream logged (online/offline)
  • Dream generation requested / retried
  • Dream refined, insight generated
  • Dream shared, downloaded, exported

Why: confirms core habit formation and content value.

3) Deep Insights

  • Deep Insights opened, generated, shared

Why: measures the premium "aha moment" and its impact.

4) Dream Guide Chat

  • Chat opened
  • Message sent
  • Session started / loaded / deleted
  • Dictation started / stopped / transcribed (if enabled)

Why: validates ongoing support usage and provides a retention signal.

5) Learn & Pathways

  • Learn section opened
  • Guide opened
  • Plan started
  • Day completed
  • Audio played

Why: tracks behavior change, engagement depth, and content effectiveness.

6) Profile & settings

  • Profile opened / saved
  • Avatar or Digital Twin updated
  • Notification preferences saved
  • Language changed

Why: shows personalization depth and long‑term commitment.

7) Ratings & feedback

  • Rating prompt viewed / requested (iOS system prompt attempt)
  • Feedback pulse shown / feedback opened

Why: measures sentiment loops and protects App Store rating quality.

8) Search & filters

  • Search performed
  • Calendar filter applied

Why: signals intent, exploration, and historical reflection.

9) Reliability & monetization

  • Offline sync started / completed / failed
  • Paywall viewed
  • Subscription started / cancelled

Why: captures conversion behavior and reliability risks.

Reliability event properties (important)

  • dream_logged_offline now includes fallback_reason so we can separate true offline saves from cloud-save connectivity failures.
  • offline_sync_completed includes draft_success_count to track draft sync health separately from full dreams.
  • dream_draft_saved may include save_mode='offline_fallback' and cached_audio_blob to monitor when quick-capture audio backup succeeds vs text-only fallback.
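To make those properties concrete, here is a hedged sketch of what such payloads might look like. The property names come from this page; the event values are invented for illustration and may differ from what the app actually sends.

```typescript
// Hypothetical example payloads for the reliability events above.
// Values like "cloud_save_failed" and the counts are assumptions.

// fallback_reason separates true offline saves from cloud-save failures.
const offlineDreamLogged = {
  event: "dream_logged_offline",
  properties: {
    fallback_reason: "cloud_save_failed", // or e.g. a no-connectivity value
  },
};

// draft_success_count tracks draft sync health separately from full dreams.
const syncCompleted = {
  event: "offline_sync_completed",
  properties: {
    draft_success_count: 3,
  },
};

// save_mode / cached_audio_blob show whether quick-capture audio backup
// succeeded or the app fell back to text only.
const draftSaved = {
  event: "dream_draft_saved",
  properties: {
    save_mode: "offline_fallback",
    cached_audio_blob: true,
  },
};
```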

What context is attached to events

Each event automatically carries helpful context so we can segment without extra work:

  • Platform & app version (iOS, Android, Web)
  • User tier (free/pro/premium)
  • Locale / language
  • Connectivity status (online/offline)
  • Onboarding completion

Many events also include lightweight metadata like dream type, selected format, or counts. We do not send dream text or private content as analytics properties.
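Because this context travels with every event, segmentation reduces to a simple group-by. A tiny illustrative sketch (the event shapes and names are assumed, not the real schema):

```typescript
// Sketch of "slice results by segment": with user_tier attached to every
// event, counting usage per tier is a one-pass group-by.

interface TrackedEvent {
  name: string;
  user_tier: string;
}

function countByTier(events: TrackedEvent[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const e of events) {
    counts.set(e.user_tier, (counts.get(e.user_tier) ?? 0) + 1);
  }
  return counts;
}

// Hypothetical sample data.
const sample: TrackedEvent[] = [
  { name: "deep_insights_opened", user_tier: "pro" },
  { name: "deep_insights_opened", user_tier: "free" },
  { name: "deep_insights_opened", user_tier: "pro" },
];

const byTier = countByTier(sample);
```

In practice PostHog does this grouping for you in insights and funnels; the sketch only shows why no extra instrumentation work is needed.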


How to use this in PostHog

Dashboards we maintain

  • DreamStream Analytics (Core Events): the primary product analytics dashboard covering the full journey
  • Product Health: core journey completion + retention
  • CEO Insights: executive pulse metrics
  • AI Analytics: high‑value actions and AI usage

What each dashboard shows

DreamStream Analytics (Core Events)

  • Core Journey Trend (12mo): log -> view -> refine -> insight -> Deep Insights -> chat -> export trendline
  • Core Journey Funnel (12mo): ordered funnel: log -> view -> Deep Insights -> Dream Guide message
  • Deep Insights Engagement (12mo): opened / generated / shared activity
  • Dream Guide Chat Engagement (12mo): chat open, sessions, messages, dictation
  • Dream Export Formats (12mo): export volume split by PDF/CSV/TXT
  • Learn & Pathways Engagement (12mo): learn opens, guides, plans, day completions, audio
  • Profile & Preferences (12mo): profile opens/saves, avatar/digital twin updates, notifications, language
  • Search & Filters (12mo): search usage and calendar filter usage
  • Reliability & Sync (12mo): offline sync start/complete/fail + error events

Product Health

  • Daily Active Users (DAU): unique active users per day
  • New Signups: daily signup volume
  • Dreams Logged: daily dream capture volume

CEO Insights

  • Daily Active Users: daily unique users (30d)
  • Weekly Active Users: weekly active trend (90d)
  • Dreams Logged per Day: dream logging cadence (30d)
  • Feature Adoption: Deep Insights, Dream Guide Chat, Pathways, Dream Views (30d)
  • AI Generations: generation requests + profiles generated (30d)

AI Analytics

  • AI Cost Trend: total AI cost per day
  • Generation Success Rate: success vs failure breakdown
  • Latency by Model: average latency by model
  • Token Usage by Job Type: total tokens by job type

Per‑user analytics (yes, it’s possible)

PostHog automatically keeps a timeline for each user. You can:

  • Open a person’s profile to view every event in order
  • Segment funnels by user tier, platform, locale, or onboarding status
  • Compare cohorts (e.g., "Users who opened Deep Insights vs. those who didn't")

Example questions we can now answer

  • What percentage of new users reach Deep Insights within 7 days?
  • Do Dream Guide Chat users retain better than non‑chat users?
  • Which export formats are most used (PDF vs CSV vs TXT)?
  • Where do people drop off after dream generation?
  • Which learning pathways drive the highest completion rate?

If you need something new tracked

Add a request with:

  1. What decision it should inform
  2. The user action that represents it
  3. Who needs to see the insight

We can then add a new event or property in a clean, consistent way.

