| Field | Value |
|---|---|
| name | perplexity-migration-deep-dive |
| description | Execute Perplexity major re-architecture and migration strategies with strangler fig pattern. Use when migrating to or from Perplexity, performing major version upgrades, or re-platforming existing integrations to Perplexity. Trigger with phrases like "migrate perplexity", "perplexity migration", "switch to perplexity", "perplexity replatform", "perplexity upgrade major". |
| allowed-tools | Read, Write, Edit, Bash(npm:*), Bash(node:*), Bash(kubectl:*) |
| version | 1.0.0 |
| license | MIT |
| author | Jeremy Longshore <jeremy@intentsolutions.io> |
# Perplexity Migration Deep Dive

## Overview

Comprehensive guide for migrating to or from Perplexity, or for performing major version upgrades.

## Prerequisites
- Current system documentation
- Perplexity SDK installed
- Feature flag infrastructure
- Rollback strategy tested
## Migration Types
| Type | Complexity | Duration | Risk |
|---|---|---|---|
| Fresh install | Low | Days | Low |
| From competitor | Medium | Weeks | Medium |
| Major version | Medium | Weeks | Medium |
| Full replatform | High | Months | High |
## Pre-Migration Assessment

### Step 1: Current State Analysis
```bash
# Document current implementation
find . -name "*.ts" -o -name "*.py" | xargs grep -l "perplexity" > perplexity-files.txt

# Count integration points
wc -l perplexity-files.txt

# Identify dependencies
npm list | grep perplexity
pip freeze | grep perplexity
```
### Step 2: Data Inventory
```typescript
interface MigrationInventory {
  dataTypes: string[];
  recordCounts: Record<string, number>;
  dependencies: string[];
  integrationPoints: string[];
  customizations: string[];
}

async function assessPerplexityMigration(): Promise<MigrationInventory> {
  return {
    dataTypes: await getDataTypes(),
    recordCounts: await getRecordCounts(),
    dependencies: await analyzeDependencies(),
    integrationPoints: await findIntegrationPoints(),
    customizations: await documentCustomizations(),
  };
}
```
## Migration Strategy: Strangler Fig Pattern

### Phase 1: Parallel Run

```
┌─────────────┐      ┌─────────────┐
│     Old     │      │     New     │
│   System    │ ──▶  │ Perplexity  │
│   (100%)    │      │    (0%)     │
└─────────────┘      └─────────────┘
```

### Phase 2: Gradual Shift

```
┌─────────────┐      ┌─────────────┐
│     Old     │      │     New     │
│    (50%)    │ ──▶  │    (50%)    │
└─────────────┘      └─────────────┘
```

### Phase 3: Complete

```
┌─────────────┐      ┌─────────────┐
│     Old     │      │     New     │
│    (0%)     │ ──▶  │   (100%)    │
└─────────────┘      └─────────────┘
```
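The phase progression above can be sketched as a percentage-based router. This is a minimal sketch, not the skill's actual API: the handler signatures and the flag source are illustrative assumptions.

```typescript
// Minimal strangler-fig router: a configurable share of calls goes to
// the replacement system, the rest to the legacy one.
type Handler = (request: string) => string;

function makeStranglerRouter(
  legacy: Handler,
  replacement: Handler,
  getPercentage: () => number, // 0 = Phase 1, 50 = Phase 2, 100 = Phase 3
): Handler {
  return (request) => {
    const roll = Math.random() * 100;
    return roll < getPercentage() ? replacement(request) : legacy(request);
  };
}

// Phase 3: 100% of traffic goes to the new system.
const route = makeStranglerRouter(
  (r) => `legacy:${r}`,
  (r) => `perplexity:${r}`,
  () => 100,
);
console.log(route("search")); // "perplexity:search"
```

Reading the percentage through a callback means the split can be moved phase by phase without redeploying the router itself.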
## Implementation Plan

### Phase 1: Setup (Weeks 1-2)

```bash
# Install Perplexity SDK
npm install @perplexity/sdk

# Configure credentials
cp .env.example .env.perplexity
# Edit with new credentials

# Verify connectivity
node -e "require('@perplexity/sdk').ping()"
```
### Phase 2: Adapter Layer (Weeks 3-4)

```typescript
// src/adapters/perplexity.ts
interface ServiceAdapter {
  create(data: CreateInput): Promise<Resource>;
  read(id: string): Promise<Resource>;
  update(id: string, data: UpdateInput): Promise<Resource>;
  delete(id: string): Promise<void>;
}

class PerplexityAdapter implements ServiceAdapter {
  async create(data: CreateInput): Promise<Resource> {
    const perplexityData = this.transform(data);
    return perplexityClient.create(perplexityData);
  }

  // read, update, delete omitted for brevity

  private transform(data: CreateInput): PerplexityInput {
    // Map from the old format to the Perplexity format
    return { /* field mapping goes here */ } as PerplexityInput;
  }
}
```
### Phase 3: Data Migration (Weeks 5-6)

```typescript
async function migratePerplexityData(): Promise<MigrationResult> {
  const batchSize = 100;
  let processed = 0;
  const errors: MigrationError[] = [];

  for await (const batch of oldSystem.iterateBatches(batchSize)) {
    try {
      const transformed = batch.map(transform);
      await perplexityClient.batchCreate(transformed);
      processed += batch.length;
    } catch (error) {
      errors.push({ batch, error });
    }

    // Progress update
    console.log(`Migrated ${processed} records`);
  }

  return { processed, errors };
}
```
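Before the live run, the transform can be exercised in a dry run so bad records are caught before anything is written to Perplexity. The record shapes and the validation rule below are illustrative assumptions, not the skill's actual schema.

```typescript
// Dry-run check for the batch transform: every record must survive
// transformation and pass a basic field check before the real migration.
interface OldRecord { id: string; body: string }
interface NewRecord { externalId: string; content: string }

function transform(record: OldRecord): NewRecord {
  return { externalId: record.id, content: record.body };
}

function dryRunTransform(batch: OldRecord[]): { ok: NewRecord[]; failed: string[] } {
  const ok: NewRecord[] = [];
  const failed: string[] = [];
  for (const record of batch) {
    try {
      const out = transform(record);
      if (!out.externalId || !out.content) throw new Error("empty field");
      ok.push(out);
    } catch {
      failed.push(record.id);
    }
  }
  return { ok, failed };
}

const result = dryRunTransform([
  { id: "a1", body: "hello" },
  { id: "a2", body: "" }, // fails the empty-field check
]);
// result.ok holds the one valid record; result.failed is ["a2"]
```

Running this against a sample of production data gives an error-rate estimate before any write traffic hits the new system.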
### Phase 4: Traffic Shift (Weeks 7-8)

```typescript
// Feature-flag-controlled traffic split
function getServiceAdapter(): ServiceAdapter {
  const perplexityPercentage = getFeatureFlag('perplexity_migration_percentage');

  if (Math.random() * 100 < perplexityPercentage) {
    return new PerplexityAdapter();
  }
  return new LegacyAdapter();
}
```
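One caveat with the random split: each request rolls independently, so a single user can bounce between the old and new systems mid-session. A common refinement is to bucket a stable user ID instead. The FNV-1a hash below is an illustrative sketch, not a prescribed choice.

```typescript
// Deterministic bucketing: the same user always lands in the same
// bucket, so a session never flips between old and new systems.
function bucketFor(userId: string): number {
  // FNV-1a over the user ID, reduced to a bucket in 0..99.
  let hash = 2166136261;
  for (let i = 0; i < userId.length; i++) {
    hash ^= userId.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return (hash >>> 0) % 100;
}

function usesPerplexity(userId: string, percentage: number): boolean {
  return bucketFor(userId) < percentage;
}

// The decision is stable across calls for a given user.
console.log(usesPerplexity("user-42", 50) === usesPerplexity("user-42", 50)); // true
```

Raising the flag from 50 to 60 then moves exactly the users in buckets 50-59 over, without disturbing anyone already migrated.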
## Rollback Plan

```bash
# Immediate rollback
kubectl set env deployment/app PERPLEXITY_ENABLED=false
kubectl rollout restart deployment/app

# Data rollback (if needed)
./scripts/restore-from-backup.sh --date YYYY-MM-DD

# Verify rollback
curl https://app.yourcompany.com/health | jq '.services.perplexity'
```
## Post-Migration Validation

```typescript
async function validatePerplexityMigration(): Promise<ValidationReport> {
  const checks = [
    { name: 'Data count match', fn: checkDataCounts },
    { name: 'API functionality', fn: checkApiFunctionality },
    { name: 'Performance baseline', fn: checkPerformance },
    { name: 'Error rates', fn: checkErrorRates },
  ];

  const results = await Promise.all(
    checks.map(async (c) => ({ name: c.name, result: await c.fn() }))
  );

  return {
    checks: results,
    passed: results.every((r) => r.result.success),
  };
}
```
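One possible shape for the "Data count match" check referenced above; the count callbacks are stand-ins for real queries against each system, and the `CheckResult` shape is an assumption.

```typescript
// Compares record counts between the old system and Perplexity.
interface CheckResult { success: boolean; detail: string }

async function checkDataCounts(
  countOld: () => Promise<number>,
  countNew: () => Promise<number>,
): Promise<CheckResult> {
  const [oldCount, newCount] = await Promise.all([countOld(), countNew()]);
  return {
    success: oldCount === newCount,
    detail: `old=${oldCount} new=${newCount}`,
  };
}

// With matching stub counts the check passes.
checkDataCounts(async () => 1000, async () => 1000).then((r) =>
  console.log(r.success), // true
);
```

Keeping the `detail` string in the result makes a failed validation report actionable without re-running the queries.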
## Instructions

### Step 1: Assess Current State

Document the existing implementation and data inventory.

### Step 2: Build Adapter Layer

Create an abstraction layer for gradual migration.

### Step 3: Migrate Data

Run the batch data migration with error handling.

### Step 4: Shift Traffic

Gradually route traffic to the new Perplexity integration.
## Output
- Migration assessment complete
- Adapter layer implemented
- Data migrated successfully
- Traffic fully shifted to Perplexity
## Error Handling
| Issue | Cause | Solution |
|---|---|---|
| Data mismatch | Transform errors | Validate transform logic |
| Performance drop | No caching | Add caching layer |
| Rollback triggered | Errors spiked | Reduce traffic percentage |
| Validation failed | Missing data | Check batch processing |
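For the "performance drop" row above, a minimal read-through TTL cache is one way to add the caching layer. This is a sketch under assumptions: the key shape, TTL value, and backing fetch function are all illustrative.

```typescript
// Minimal TTL read-through cache wrapped around an async fetch function.
function withTtlCache<T>(
  fn: (key: string) => Promise<T>,
  ttlMs: number,
): (key: string) => Promise<T> {
  const cache = new Map<string, { value: T; expires: number }>();
  return async (key) => {
    const hit = cache.get(key);
    if (hit && hit.expires > Date.now()) return hit.value; // fresh hit
    const value = await fn(key); // miss or expired: fetch and store
    cache.set(key, { value, expires: Date.now() + ttlMs });
    return value;
  };
}

// Two reads of the same id within the TTL hit the backend only once.
(async () => {
  let backendCalls = 0;
  const cachedRead = withTtlCache(async (id: string) => {
    backendCalls++;
    return `resource:${id}`;
  }, 60_000);
  await cachedRead("a");
  await cachedRead("a");
  console.log(backendCalls); // 1
})();
```

Note this sketch never evicts expired entries until they are re-read; a production cache would also need size bounds and invalidation on writes.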
## Examples

### Quick Migration Status

```typescript
const status = await validatePerplexityMigration();
console.log(`Migration ${status.passed ? 'PASSED' : 'FAILED'}`);
status.checks.forEach((c) => console.log(`  ${c.name}: ${c.result.success}`));
```
## Resources

### Flagship+ Skills

For advanced troubleshooting, see perplexity-advanced-troubleshooting.
## Similar Claude Skills & Agent Workflows

- **git-commit**: Generate well-formatted git commit messages following conventional commit standards
- **code-review**: Comprehensive code review assistant that analyzes code quality, security, and best practices
- **dsql**: Build with Aurora DSQL; manage schemas, execute queries, and handle migrations with DSQL-specific requirements
- **backend-dev-guidelines**: Comprehensive backend development guide for Langfuse's Next.js 14/tRPC/Express/TypeScript monorepo
- **Material Component Dev**: FlowGram material component development guide, for creating new material components in the form-materials package
- **Create Node**: Create new custom nodes in the FlowGram demo-free-layout, supporting simple nodes (auto form) and complex nodes (custom UI)