
Redis OM Python 0.x to 1.0 Migration Guide

This guide covers the breaking changes and migration steps required when upgrading from Redis OM Python 0.x to 1.0.

Overview of Breaking Changes

Redis OM Python 1.0 introduces several breaking changes that improve performance and provide better query capabilities:

  1. Python 3.10+ required - Dropped support for Python 3.8 and 3.9
  2. Pydantic v2 required - Dropped support for Pydantic v1
  3. Model-level indexing - Models are now indexed at the class level instead of field-by-field
  4. Datetime field indexing - Datetime fields are now indexed as NUMERIC instead of TAG for better range queries
  5. Enhanced migration system - New data migration capabilities with rollback support
  6. CLI changes - The migrate command is removed; use om migrate instead
  7. Migrator class renamed - Migrator is now SchemaDetector (backward compat alias available)

Breaking Change 1: Python and Pydantic Requirements

Python Version

Redis OM Python 1.0 requires Python 3.10 or higher. Python 3.8 and 3.9 are no longer supported.

Pydantic Version

Redis OM Python 1.0 requires Pydantic v2. Pydantic v1 is no longer supported.

If you're still on Pydantic v1, you'll need to migrate to Pydantic v2 first. See the Pydantic v2 Migration Guide for details.

Migration Steps

  1. Check your Python version:

    python --version  # Must be 3.10+
    

  2. Update Pydantic:

    pip install "pydantic>=2.0"
    

  3. Update your Pydantic v1 code to v2 syntax (a sketch follows this list):

     • @validator → @field_validator
     • Config class → model_config = ConfigDict(...)
     • parse_obj() → model_validate()
     • dict() → model_dump()
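
To make these mappings concrete, here is a minimal before/after sketch of a plain Pydantic model; the Customer model and its fields are hypothetical and exist only to illustrate the v2 syntax:

from pydantic import BaseModel, ConfigDict, field_validator

class Customer(BaseModel):
    # v1: class Config: anystr_strip_whitespace = True
    model_config = ConfigDict(str_strip_whitespace=True)

    name: str
    email: str

    # v1: @validator("email")
    @field_validator("email")
    @classmethod
    def email_must_contain_at(cls, v: str) -> str:
        if "@" not in v:
            raise ValueError("invalid email address")
        return v

# v1: Customer.parse_obj(data) / customer.dict()
customer = Customer.model_validate({"name": " Alice ", "email": "alice@example.com"})
print(customer.model_dump())  # {'name': 'Alice', 'email': 'alice@example.com'}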

Breaking Change 2: Model-Level Indexing

What Changed

In 0.x, you marked individual fields as indexed. In 1.0, you mark the entire model as indexed and then specify field-level indexing options.

Before (0.x)

import datetime

from aredis_om import Field, HashModel

class Member(HashModel):
    id: int = Field(index=True, primary_key=True)
    first_name: str = Field(index=True, case_sensitive=True)
    last_name: str = Field(index=True)
    email: str = Field(index=True)
    join_date: datetime.date
    age: int = Field(index=True, sortable=True)
    bio: str = Field(index=True, full_text_search=True)

After (1.0)

class Member(HashModel, index=True):  # ← Model-level indexing
    id: int = Field(index=True, primary_key=True)
    first_name: str = Field(index=True, case_sensitive=True)
    last_name: str = Field(index=True)
    email: str = Field(index=True)
    join_date: datetime.date
    age: int = Field(sortable=True)  # ← No need for index=True if model is indexed
    bio: str = Field(full_text_search=True)  # ← No need for index=True if model is indexed

Migration Steps

  1. Add index=True to your model class:

    # Change this:
    class MyModel(HashModel):
    
    # To this:
    class MyModel(HashModel, index=True):
    

  2. Remove redundant index=True from fields (optional but recommended):

     • Keep index=True on fields that need special indexing behavior
     • Remove index=True from fields that only need basic indexing
     • Keep field-specific options like sortable=True, full_text_search=True, and case_sensitive=True

  3. Update both HashModel and JsonModel classes:

    class User(JsonModel, index=True):  # ← Add index=True here too
        name: str = Field(index=True)
        age: int = Field(sortable=True)
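
With the class-level index in place, queries themselves are unchanged. A quick check against the User model above (assuming its index has already been created with om migrate):

# Equality query on an explicitly indexed field,
# sorted by the sortable age field
alices = await User.find(User.name == "Alice").sort_by('age').all()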
    

Breaking Change 3: Datetime Field Indexing

What Changed

Datetime fields are now indexed as NUMERIC fields (Unix timestamps) instead of TAG fields (ISO strings). This enables:

  • Range queries on datetime fields
  • Sorting by datetime fields
  • Better query performance

Impact on Your Code

Queries that now work (previously failed):

from datetime import datetime, timedelta

# Range queries
users = await User.find(User.created_at > datetime.now() - timedelta(days=7)).all()

# Sorting by datetime
users = await User.find().sort_by('created_at').all()

# Between queries
start = datetime(2023, 1, 1)
end = datetime(2023, 12, 31)
users = await User.find(
    (User.created_at >= start) & (User.created_at <= end)
).all()

Data storage format change:

  • Before: "2023-12-01T14:30:22.123456" (ISO string)
  • After: 1701435022 (Unix timestamp)
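
The conversion itself is simple; here is a minimal sketch of what the data migration does to each stored value (the exact internals and timezone handling of om migrate-data may differ):

from datetime import datetime, timezone

iso_value = "2023-12-01T14:30:22.123456"

# Parse the ISO string; treat naive datetimes as UTC for this sketch
dt = datetime.fromisoformat(iso_value)
if dt.tzinfo is None:
    dt = dt.replace(tzinfo=timezone.utc)

timestamp = int(dt.timestamp())
print(timestamp)  # Unix seconds since the epoch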

Migration Steps

  1. Run schema migration to update indexes:

    om migrate
    

  2. Run data migration to convert datetime values:

    om migrate-data run
    

  3. Verify migration completed successfully:

    om migrate-data verify
    

For detailed datetime migration instructions, see the Datetime Migration Details section below.

Migration Process

Step 1: Backup Your Data

Critical: Always back up your Redis data before migrating:

# Create Redis backup
redis-cli BGSAVE

# Or run a blocking save (blocks the server; small datasets only)
redis-cli SAVE
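
If you prefer to script this step, here is a minimal sketch using redis-py, assuming a local Redis on the default port:

import time

import redis

r = redis.Redis(host="localhost", port=6379)

# Record the last completed save, trigger a background save,
# then poll until a newer save has finished
# (coarse check; LASTSAVE has one-second resolution)
before = r.lastsave()
r.bgsave()
while r.lastsave() <= before:
    time.sleep(0.5)
print("Backup complete:", r.lastsave())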

Step 2: Update Your Models

Update all your model classes to use the new indexing syntax:

# Before
class Product(HashModel):
    name: str = Field(index=True)
    price: float = Field(index=True, sortable=True)
    category: str = Field(index=True)

# After  
class Product(HashModel, index=True):
    name: str = Field(index=True)
    price: float = Field(sortable=True)
    category: str = Field(index=True)

Step 3: Install Redis OM 1.0

pip install "redis-om>=1.0.0"

Step 4: Run Schema Migration

Update your RediSearch indexes to match the new model definitions:

om migrate

Step 5: Run Data Migration

Convert datetime fields from ISO strings to Unix timestamps:

# Check what will be migrated
om migrate-data status

# Run the migration
om migrate-data run

# Verify completion
om migrate-data verify

Step 6: Test Your Application

  • Test datetime queries and sorting (a smoke-test sketch follows this list)
  • Verify all indexed fields work correctly
  • Check application functionality
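
A minimal smoke-test sketch covering the first two checks; the Event model is hypothetical, and the script assumes om migrate has already created its index:

import asyncio
from datetime import datetime, timedelta

from aredis_om import Field, HashModel

class Event(HashModel, index=True):
    name: str = Field(index=True)
    created_at: datetime

async def smoke_test():
    await Event(name="deploy", created_at=datetime.now()).save()

    # Datetime range query plus sorting, both new in 1.0
    recent = await Event.find(
        Event.created_at > datetime.now() - timedelta(days=1)
    ).sort_by('created_at').all()
    assert any(e.name == "deploy" for e in recent)

asyncio.run(smoke_test())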

Datetime Migration Details

Prerequisites

  • Redis with RediSearch module
  • Backup of your Redis data
  • Redis OM Python 1.0+

Migration Commands

# Check migration status
om migrate-data status

# Run migration with progress monitoring
om migrate-data run --verbose

# Verify data integrity
om migrate-data verify --check-data

# Check for schema mismatches
om migrate-data check-schema

Migration Options

For large datasets or specific requirements:

# Custom batch size for large datasets
om migrate-data run --batch-size 500

# Handle errors gracefully
om migrate-data run --failure-mode log_and_skip --max-errors 100

# Dry run to preview changes
om migrate-data run --dry-run

Rollback

If you need to rollback the datetime migration:

# Rollback to previous format
om migrate-data rollback 001_datetime_fields_to_timestamps

# Or restore from backup
redis-cli FLUSHALL
# Restore your backup file

Troubleshooting

Common Issues

  1. Schema mismatch errors:

    om migrate-data check-schema
    

  2. Migration fails with high error rate:

    om migrate-data run --failure-mode log_and_skip
    

  3. Out of memory during migration:

    om migrate-data run --batch-size 100
    

Getting Help

For detailed troubleshooting, see:

  • Migration Documentation
  • Error Handling Guide

Compatibility Notes

What Still Works

  • All existing query syntax
  • Model field definitions (with updated indexing)
  • Redis connection configuration
  • Async/sync dual API (see the sketch below)
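
The same model definitions work with both variants; a minimal sketch of the two import paths (package names as shipped by redis-om-python):

# Async variant: methods are awaitable
from aredis_om import HashModel as AsyncHashModel

# Sync variant: same syntax, blocking methods
from redis_om import HashModel as SyncHashModel

class AsyncNote(AsyncHashModel, index=True):
    text: str

class SyncNote(SyncHashModel, index=True):
    text: str

# await AsyncNote(text="hi").save()   # async usage
# SyncNote(text="hi").save()          # sync usage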

What's Deprecated

  • Field-by-field indexing without model-level index=True
  • Old migration CLI (migrate command - use om migrate instead)
  • Migrator class name (use SchemaDetector or SchemaMigrator instead)

Breaking Change 4: CLI and Migration System Changes

CLI Changes

The standalone migrate command has been removed. Use the unified om CLI instead:

# Old (removed)
migrate

# New
om migrate run

The om CLI now provides a complete set of commands:

om migrate create     # Create migration file from schema diff
om migrate run        # Run pending migrations
om migrate status     # Show migration status
om migrate rollback   # Rollback specific migration by ID
om migrate downgrade  # Rollback last N migrations
om migrate reset      # Clear migration history

om index create       # Create indexes (dev workflow)
om index drop         # Drop all indexes
om index rebuild      # Drop and recreate indexes

om migrate-data run   # Run data migrations
om migrate-data check-schema  # Check for schema mismatches

Migrator Class Renamed

The Migrator class has been renamed to SchemaDetector to better reflect its purpose (detecting schema differences). A backward compatibility alias is provided but deprecated:

# Old (deprecated but still works)
from aredis_om import Migrator
await Migrator().run()

# New (recommended for production)
from aredis_om import SchemaMigrator
migrator = SchemaMigrator(migrations_dir="./migrations")
await migrator.run()

# New (for development/testing)
from aredis_om import SchemaDetector
await SchemaDetector().run()

Recommended approach for production: Use file-based migrations with SchemaMigrator for tracked, reversible schema changes.

For development: Use om index rebuild for quick iteration, or SchemaDetector programmatically.

Next Steps

After successful migration:

  1. Update your code to take advantage of datetime range queries
  2. Remove redundant index=True from fields where not needed
  3. Test performance with the new NUMERIC datetime indexing
  4. Update documentation to reflect new model syntax

Example: Complete Migration

Here's a complete before/after example:

Before (0.x)

class User(HashModel):
    name: str = Field(index=True)
    email: str = Field(index=True)
    created_at: datetime.datetime = Field(index=True)
    age: int = Field(index=True, sortable=True)
    bio: str = Field(index=True, full_text_search=True)

After (1.0)

import datetime
from datetime import timedelta

from aredis_om import Field, HashModel

class User(HashModel, index=True):
    name: str = Field(index=True)
    email: str = Field(index=True)
    created_at: datetime.datetime  # Now supports range queries!
    age: int = Field(sortable=True)
    bio: str = Field(full_text_search=True)

# New capabilities:
recent_users = await User.find(
    User.created_at > datetime.datetime.now() - timedelta(days=30)
).sort_by('created_at').all()

This migration unlocks powerful new datetime query capabilities while maintaining backward compatibility for most use cases.