What You'll Build

A fully automated workflow that fetches PostHog session data, uses Continue CLI to identify UX problems, and creates GitHub issues with specific technical recommendations.

Prerequisites

Before starting, ensure you have:
1. Install Continue CLI

npm i -g @continuedev/cli

2. Install & Set Up GitHub CLI

# macOS
brew install gh
# Ubuntu/Debian
sudo apt install gh
# Windows
winget install --id GitHub.cli

Then authenticate: gh auth login

3. Set Up PostHog API Access

Get your Personal API Key from PostHog settings

4. Test the Workflow

Run your first automated analysis

Step 1: Environment Setup

Get Your PostHog API Credentials

You need a Personal API Key (not a Project API key) to access session recordings:
  1. Go to Personal API Keys in PostHog
  2. Click + Create a personal API Key
  3. Name it “Continue CLI Session Analysis”
  4. Select these scopes:
    • session_recording:read - Required for accessing session data
    • insight:read - Optional, for additional analytics
  5. Copy the key immediately (you won’t see it again!)

Set Up Your Environment Variables

# Create working directory
mkdir posthog-continuous-ai
cd posthog-continuous-ai

# Create environment file
cat > .env << EOF
# PostHog Configuration
POSTHOG_API_KEY=your_personal_api_key_here
POSTHOG_PROJECT_ID=your_project_id_here
POSTHOG_HOST=https://us.posthog.com

# GitHub Configuration
GITHUB_OWNER=YourUsername
GITHUB_REPO=your-repository-name
EOF

# Load environment
source .env

Find Your Project Details: Your Project ID is in your PostHog URL: https://app.posthog.com/project/YOUR_PROJECT_ID
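Before moving on, it can help to confirm the variables actually loaded. The `check_env` helper below is a hypothetical convenience (not part of the official workflow); it reports any required variable that is empty:

```shell
# check_env: report any required variables that are empty.
# (Hypothetical helper; not part of the official workflow.)
check_env() {
  local missing=0
  for var in POSTHOG_API_KEY POSTHOG_PROJECT_ID POSTHOG_HOST GITHUB_OWNER GITHUB_REPO; do
    if [ -z "${!var}" ]; then   # ${!var} is bash indirect expansion
      echo "Missing: $var"
      missing=1
    fi
  done
  return $missing
}

check_env || echo "Fix the variables above before continuing"
```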

GitHub CLI Setup

The workflow uses GitHub CLI instead of API tokens for better security and easier authentication:

Why GitHub CLI?

GitHub CLI provides secure authentication without needing to manage Personal Access Tokens in your environment files.
Verify GitHub CLI is ready:
# Check if GitHub CLI is installed
gh --version

# Check authentication status
gh auth status

# If not authenticated, login
gh auth login
Authentication: GitHub CLI handles authentication automatically once you run gh auth login. No need to manage tokens in .env files!

Step 2: Session Analysis Script

Create an intelligent analysis script that fetches and analyzes session recordings:
#!/bin/bash
# analyze-sessions.sh
set -e
source .env

echo "🎬 Fetching session recordings from PostHog..."

# Fetch recent session recordings with potential issues
curl -s -H "Authorization: Bearer $POSTHOG_API_KEY" \
  "$POSTHOG_HOST/api/projects/$POSTHOG_PROJECT_ID/session_recordings/?limit=20" \
  | jq '.results[] | {
      id: .id,
      duration: .recording_duration,
      start_url: .start_url,
      click_count: .click_count,
      console_error_count: .console_error_count,
      person_distinct_id: .person.distinct_ids[0],
      start_time: .start_time
    }' > sessions.json

echo "📊 Found $(cat sessions.json | jq -s length) sessions"

# Filter for problematic sessions (errors or long durations)
cat sessions.json | jq -s 'map(select(.console_error_count > 0 or .duration > 300))' > problem-sessions.json

echo "🚨 Found $(cat problem-sessions.json | jq length) problematic sessions"

# Analyze with Continue CLI
cat problem-sessions.json | cn -p "
Analyze these PostHog session recordings to identify user experience issues.

Each session contains:
- duration: how long the session lasted (in seconds)
- start_url: the page where the user started
- click_count: total number of clicks
- console_error_count: JavaScript errors encountered

Look for patterns that suggest code issues:

1. **High Error Sessions**: Sessions with console_error_count > 0
   - What pages/URLs are generating errors?
   - Are users abandoning after errors occur?

2. **Long Duration Sessions**: Sessions over 300 seconds (5+ minutes)
   - Are users struggling to complete tasks?
   - Low click count + high duration = user confusion

3. **Abandonment Patterns**:
   - Users starting on key pages but not progressing
   - Short sessions on important conversion pages

For each issue pattern you identify:
- Describe the user behavior problem
- Suggest likely technical causes (JS errors, slow loading, UI confusion)
- Recommend specific code areas to investigate
- Provide example fixes or improvements
- Priority: High (blocks user goals), Medium (hurts UX), Low (minor issue)

Format each finding as its own section that starts with a line beginning '## Issue ' followed by a short title, and state its priority (High Priority, Medium Priority, or Low Priority) in the section body.

Focus on actionable technical improvements that will measurably improve user experience.
"

echo "✅ Analysis complete! Check the output above for optimization opportunities."
What This Script Does:
  1. Fetches recent session recordings from PostHog API
  2. Filters for problematic sessions (errors or long durations)
  3. Analyzes with Continue CLI to identify specific UX issues
  4. Provides actionable technical recommendations
Make the script executable:
chmod +x analyze-sessions.sh
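If you want to sanity-check the filtering logic before touching the API, you can feed the jq expression some synthetic sessions. Note that the jq mapping step renames recording_duration to duration, so the problem filter must select on .duration (the session values below are made up):

```shell
# Two fake sessions: only the second should survive the problem filter
printf '%s\n' \
  '{"id":"a","duration":120,"click_count":14,"console_error_count":0}' \
  '{"id":"b","duration":420,"click_count":3,"console_error_count":2}' |
  jq -s 'map(select(.console_error_count > 0 or .duration > 300))'
```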

Step 3: Automated GitHub Issue Creation

Create a smart script that parses the AI analysis and automatically creates GitHub issues:

Intelligent Issue Creation

This script automatically extracts issues from Continue CLI analysis, determines priority levels, assigns appropriate labels, and creates well-formatted GitHub issues.
#!/bin/bash
# create-github-issues.sh - Creates GitHub issues from Continue CLI analysis output
set -e
source .env

# Input file containing Continue CLI analysis output
ANALYSIS_FILE="analysis-results.txt"

# File to track created issues to prevent duplicates
ISSUE_TRACKING_FILE=".created_issues.json"

# Check if analysis file exists
if [ ! -f "$ANALYSIS_FILE" ]; then
  echo "❌ Analysis file not found: $ANALYSIS_FILE"
  exit 1
fi

# Check if GitHub CLI is installed
if ! command -v gh &> /dev/null; then
  echo "❌ GitHub CLI (gh) not found. Please install it first:"
  echo "  https://cli.github.com/manual/installation"
  exit 1
fi

# Make sure gh is authenticated
if ! gh auth status &> /dev/null; then
  echo "❌ GitHub CLI not authenticated. Please run 'gh auth login' first."
  exit 1
fi

# Create issue tracking file if it doesn't exist
if [ ! -f "$ISSUE_TRACKING_FILE" ]; then
  echo "[]" > "$ISSUE_TRACKING_FILE"
fi

# Function to check if an issue with this title already exists
issue_exists() {
  local title="$1"
  # On macOS, use `md5 -q` instead of `md5sum` (here and in record_issue below)
  local issue_hash=$(echo "$title" | md5sum | cut -d' ' -f1)

  # Check in our tracking file
  if grep -q "\"$issue_hash\"" "$ISSUE_TRACKING_FILE"; then
    return 0  # Issue exists
  fi

  # Also check actual GitHub issues to be doubly sure
  if gh issue list --repo "$GITHUB_OWNER/$GITHUB_REPO" --search "$title in:title" --json title | grep -q "$title"; then
    # Add to tracking file if found on GitHub but not in our tracking
    local tracking_data=$(cat "$ISSUE_TRACKING_FILE")
    echo "$tracking_data" | jq --arg hash "$issue_hash" '. += [$hash]' > "$ISSUE_TRACKING_FILE"
    return 0  # Issue exists
  fi

  return 1  # Issue doesn't exist
}

# Function to record that we created an issue
record_issue() {
  local title="$1"
  local issue_hash=$(echo "$title" | md5sum | cut -d' ' -f1)

  local tracking_data=$(cat "$ISSUE_TRACKING_FILE")
  echo "$tracking_data" | jq --arg hash "$issue_hash" '. += [$hash]' > "$ISSUE_TRACKING_FILE"
}

# Function to create a GitHub issue using GitHub CLI
create_github_issue() {
  local title="$1"
  local body="$2"
  local labels="$3"

  # Check if this issue already exists
  if issue_exists "$title"; then
    echo "ℹ️ Issue already exists: $title"
    return 0
  fi

  echo "Creating issue: $title"

  # Convert labels from JSON array format to a comma-separated list
  # by stripping brackets, quotes, and spaces
  labels_list=$(echo "$labels" | tr -d '[]" ')

  # Create the issue non-interactively using GitHub CLI
  # (don't add --web here: it opens a browser instead of creating the issue directly)
  if issue_url=$(gh issue create \
    --repo "$GITHUB_OWNER/$GITHUB_REPO" \
    --title "$title" \
    --body "$body" \
    --label "$labels_list"); then
    echo "✅ Created issue: $issue_url"
    # Record that we created this issue
    record_issue "$title"
  else
    echo "❌ Failed to create issue"
  fi
}

# Extract and parse issues from analysis output
echo "🔍 Parsing analysis results for issues..."

# Count the issues in the file by looking for ## Issue pattern
issue_count=$(grep -c "^## Issue" "$ANALYSIS_FILE" || true)  # grep -c prints 0 on no match but exits 1; '|| true' keeps set -e happy

if [ "$issue_count" -eq 0 ]; then
  echo "ℹ️ No issues found in analysis output."
  exit 0
fi

echo "📊 Found $issue_count potential issues to process"

# Process each issue section
while IFS= read -r line || [[ -n "$line" ]]; do
  if [[ $line == "## Issue"* ]]; then
    # If we have a previous issue, create it
    if [ -n "${issue_title:-}" ] && [ -n "${issue_body:-}" ]; then
      # Determine appropriate labels based on priority in the issue body
      if [[ "$issue_body" =~ "High Priority" ]]; then
        labels="[\"bug\", \"high-priority\", \"user-experience\"]"
      elif [[ "$issue_body" =~ "Medium Priority" ]]; then
        labels="[\"enhancement\", \"medium-priority\", \"user-experience\"]"
      else
        labels="[\"low-priority\", \"user-experience\"]"
      fi

      # Create the issue
      create_github_issue "$issue_title" "$issue_body" "$labels"
    fi

    # Start a new issue
    issue_title="UX Issue: ${line#\#\# Issue }"
    issue_body=""
  elif [ -n "${issue_title:-}" ]; then
    # Add line to current issue body
    if [ -n "$issue_body" ]; then
      issue_body="$issue_body"$'\n'"$line"
    else
      issue_body="$line"
    fi
  fi
done < "$ANALYSIS_FILE"

# Create the last issue if there is one
if [ -n "${issue_title:-}" ] && [ -n "${issue_body:-}" ]; then
  # Determine appropriate labels based on priority in the issue body
  if [[ "$issue_body" =~ "High Priority" ]]; then
    labels="[\"bug\", \"high-priority\", \"user-experience\"]"
  elif [[ "$issue_body" =~ "Medium Priority" ]]; then
    labels="[\"enhancement\", \"medium-priority\", \"user-experience\"]"
  else
    labels="[\"low-priority\", \"user-experience\"]"
  fi

  # Create the issue
  create_github_issue "$issue_title" "$issue_body" "$labels"
fi

echo "✅ GitHub issues creation process completed!"
Make it executable:
chmod +x create-github-issues.sh
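The parser keys off lines beginning with `## Issue` and off the priority strings, so a minimal analysis-results.txt it can consume looks like this (the contents are purely illustrative):

```markdown
## Issue Checkout page throws JavaScript errors

High Priority

Sessions starting on /checkout show console_error_count > 0 and users abandon
within a minute. Investigate the payment widget's initialization code.
```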
Smart Label Assignment: The script automatically assigns labels based on issue priority:
  • High Priority: bug, high-priority, user-experience
  • Medium Priority: enhancement, medium-priority, user-experience
  • Low Priority: low-priority, user-experience
Be sure to customize the label logic in the script to match your repository’s labeling conventions!
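Also note that gh issue create can fail if a label doesn't yet exist in the repository. One hedged way to handle that is a one-time setup step; the sketch below (the `create_labels` name is mine) only echoes the gh commands so you can review them, and `--force` updates a label if it already exists. Drop the `echo` to run them for real:

```shell
# Print (or, with the `echo` removed, run) the commands that create the labels used above
create_labels() {
  for label in bug enhancement high-priority medium-priority low-priority user-experience; do
    echo gh label create "$label" --repo "$GITHUB_OWNER/$GITHUB_REPO" --force
  done
}

create_labels
```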

Step 4: Complete Automated Workflow

Create the main orchestration script that runs the entire process:
#!/bin/bash
# daily-analysis.sh
set -e

echo "🔍 Starting daily PostHog session analysis..."

# Run the session analysis
./analyze-sessions.sh > analysis-results.txt

# Create GitHub issues based on the analysis
./create-github-issues.sh

echo "✅ Daily analysis complete!"
Make it executable and test:
chmod +x daily-analysis.sh
./daily-analysis.sh

Step 5: Automation Options
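The simplest way to run daily-analysis.sh on a schedule is cron. The entry below is a sketch (the path is a placeholder you'll need to adjust); it runs the workflow every weekday at 9:00 and appends output to a log:

```text
# m h dom mon dow  command  (add via `crontab -e`)
0 9 * * 1-5  cd /path/to/posthog-continuous-ai && ./daily-analysis.sh >> analysis.log 2>&1
```

For team setups, the same script can be wired into a scheduled CI job (for example a GitHub Actions workflow on a cron trigger) with the PostHog key stored as a secret.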

Step 6: Testing Your Workflow

1. Test API Connections

# Test PostHog API
curl -s -H "Authorization: Bearer $POSTHOG_API_KEY" \
  "$POSTHOG_HOST/api/projects/$POSTHOG_PROJECT_ID/session_recordings/?limit=1"

# Test GitHub access (the workflow uses GitHub CLI, so no token is needed)
gh repo view "$GITHUB_OWNER/$GITHUB_REPO"

2. Run Session Analysis

./analyze-sessions.sh

Expected: JSON files with session data and AI analysis output

3. Test Issue Creation

./create-github-issues.sh

Expected: New issues created in your GitHub repository

4. Full Workflow Test

./daily-analysis.sh

Expected: Complete end-to-end execution with GitHub issues created
Success Indicators:
  • PostHog API returns session data (not empty)
  • Continue CLI generates analysis with identified issues
  • GitHub issues are created with proper labels and formatting
  • No error messages in the console output

What You’ve Built

After completing this guide, you have a complete Continuous AI system that:
  • Monitors user experience - Automatically fetches and analyzes PostHog session data
  • Identifies problems intelligently - Uses AI to spot patterns and technical issues
  • Creates actionable tasks - Generates GitHub issues with specific recommendations
  • Runs autonomously - Operates daily without manual intervention
  • Scales with your team - Handles growing amounts of session data automatically

Continuous AI

Your system now operates at Level 2 Continuous AI - AI handles routine analysis tasks with human oversight through GitHub issue review and prioritization.

Security Best Practices

Protect Your API Keys:
  • Never commit .env files or secrets to version control
  • Use GitHub Secrets for automation workflows
  • Limit token scopes to minimum required permissions
  • Rotate API keys regularly (every 90 days recommended)
  • Monitor token usage for unusual activity
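The first rule can be enforced mechanically. A minimal sketch (run once in the project directory) that ignores both the secrets file and the local issue-tracking state:

```shell
# Keep the secrets file and local issue-tracking state out of version control
touch .gitignore
grep -qxF '.env' .gitignore || echo '.env' >> .gitignore
grep -qxF '.created_issues.json' .gitignore || echo '.created_issues.json' >> .gitignore
```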

Continue Your Workflow

Consider these additions:

Advanced Features

Enhance this workflow with additional PostHog data sources (funnel analysis, feature flags, etc.)

Custom Analysis Rules

Train the AI on your specific codebase patterns and conventions

Slack Integration

Get notifications when critical issues are detected

Performance Monitoring

Track how fixes improve user experience metrics over time

Next Steps

  1. Monitor your first issues - Review the GitHub issues created and implement fixes
  2. Measure impact - Track how resolved issues improve PostHog metrics
  3. Refine analysis - Adjust the Continue CLI prompts based on issue quality
  4. Scale the system - Add more data sources or create specialized analysis workflows