Types of Environments: What's The Right Setup for Your Product?
Hovhannes Babayan • April 9, 2024
Production Environment
Purpose: Running the live production workload.
Data Source: Live, operational data.
User Roles and Permissions: Restricted to essential personnel.
Data Refresh Rate: Real-time, continuous updates.
Security Level: Highest, with strict access controls and encryption.
Monitoring and Alerts: Comprehensive monitoring with immediate alerts for any issues.
Backup and Recovery: Frequent backups, with rapid and reliable recovery processes.
Change Management: Strict, with formal review and approval processes.
Performance Metrics: Critical; includes uptime, response time, and transaction volume.
Tools and Services Used: Enterprise-grade solutions for monitoring, security, and management.
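Production monitoring with immediate alerts can be boiled down to two questions: is the service up, and is it fast enough? A minimal sketch in Python, where the URL and thresholds are illustrative placeholders, not a real service:

```python
import time
import urllib.request

# Illustrative threshold -- tune it to your own SLOs.
MAX_RESPONSE_SECONDS = 2.0

def should_alert(up: bool, response_seconds: float,
                 max_seconds: float = MAX_RESPONSE_SECONDS) -> bool:
    """Fire an alert when the service is down or slower than the threshold."""
    return (not up) or response_seconds > max_seconds

def probe(url: str, timeout: float = 5.0):
    """Hit a health endpoint and measure how long the response took."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            up = resp.status == 200
    except OSError:  # covers URLError, timeouts, connection refusals
        up = False
    return up, time.monotonic() - start
```

In a real setup the probe would run on a schedule from a separate host, and `should_alert` would page an on-call rotation rather than just return a boolean.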
Demo Environment
Purpose: A dedicated environment (not production, but running the same versions) where potential customers can run tests. Trials can also happen in production, but some companies prefer keeping a separate environment for this.
Data Source: Anonymized or synthetic data.
User Roles and Permissions: Access typically granted to sales and marketing teams and potential clients.
Data Refresh Rate: Periodic or on demand.
Security Level: Moderate, with access controls to prevent data breaches.
Monitoring and Alerts: Basic monitoring for system availability and performance.
Backup and Recovery: Less critical, with infrequent backups and standard recovery processes.
Change Management: Changes can be more frequent to update demo features or data.
Performance Metrics: Focus on user experience and demo flow.
Tools and Services Used: Tools that showcase the product's capabilities, often with simplified management.
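"Anonymized data" for a demo environment usually means replacing sensitive fields with stable, irreversible tokens so the data still looks realistic but can't be traced back. A minimal sketch, where the field names are illustrative:

```python
import hashlib

# Hypothetical list of sensitive fields -- adapt to your schema.
SENSITIVE = frozenset({"email", "name", "phone"})

def anonymize(record: dict, sensitive: frozenset = SENSITIVE) -> dict:
    """Replace sensitive fields with a stable, irreversible token.

    Hashing (rather than random values) keeps the same input mapping to
    the same token, so relationships between records are preserved.
    """
    out = {}
    for key, value in record.items():
        if key in sensitive:
            digest = hashlib.sha256(str(value).encode()).hexdigest()[:12]
            out[key] = f"{key}-{digest}"
        else:
            out[key] = value
    return out
```

For example, `anonymize({"email": "a@b.c", "plan": "pro"})` keeps `plan` intact while replacing `email` with a token; running it twice yields the same token, which matters when anonymized records must still join across tables.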
Pre-production Environment
Purpose: To let customers test upcoming releases.
Data Source: Mirror of production data or anonymized production data.
User Roles and Permissions: Limited to QA and specific development personnel.
Data Refresh Rate: Regular updates, often synchronized with production environments.
Security Level: High, similar to production to ensure a secure testing environment.
Monitoring and Alerts: Extensive monitoring to catch any pre-release issues.
Backup and Recovery: Regular backups to ensure testing continuity.
Change Management: Rigorous, as this is the final step before production.
Performance Metrics: Close monitoring of performance against production standards.
Tools and Services Used: Similar to production but with additional testing and staging tools.
Integration Environment
Purpose: For external parties (clients of clients) to implement integrations against unreleased versions.
Data Source: Synthetic or isolated subsets of production data.
User Roles and Permissions: Primarily developers and integration testers.
Data Refresh Rate: As needed, based on testing requirements.
Security Level: Moderate, with focus on internal access control.
Monitoring and Alerts: Focused on system integration points and data flow.
Backup and Recovery: As necessary for the testing process, not as critical as production.
Change Management: Continuous, as new components are integrated and tested regularly.
Performance Metrics: Emphasis on integration points, data processing, and system interactions.
Tools and Services Used: Integration testing tools, middleware, and APIs.
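Because this environment is all about integration points and data flow, its tests typically exercise the seam between your code and the partner API. A sketch using Python's `unittest.mock` to stand in for the external client; `sync_orders`, `list_orders`, and `acknowledge` are hypothetical names, not a real API:

```python
from unittest.mock import Mock

def sync_orders(api_client, since: str) -> int:
    """Pull orders from a partner API and return how many were processed."""
    orders = api_client.list_orders(since=since)
    processed = 0
    for order in orders:
        if order.get("status") == "new":
            api_client.acknowledge(order["id"])
            processed += 1
    return processed

# Stub the integration point so the data flow can be verified in isolation;
# in the integration environment the stub is swapped for the real client
# pointed at the unreleased API version.
client = Mock()
client.list_orders.return_value = [
    {"id": 1, "status": "new"},
    {"id": 2, "status": "shipped"},
]
assert sync_orders(client, since="2024-01-01") == 1
client.acknowledge.assert_called_once_with(1)
```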
Test Environment
Purpose: Can serve the same role as pre-production or staging.
Data Source: Synthetic or anonymized data for safe testing.
User Roles and Permissions: Access mainly for developers and testers.
Data Refresh Rate: Updated as needed for specific tests.
Security Level: Moderate, to protect against unauthorized access.
Monitoring and Alerts: Limited, focused on immediate testing needs.
Backup and Recovery: Less critical, with basic backup for ongoing work.
Change Management: Flexible, allowing quick changes for experimentation.
Performance Metrics: Secondary, unless tied to performance testing.
Tools and Services Used: Varied, based on the experimental or development needs.
Staging Environment
Purpose: Before features go to pre-production or production, they must be integrated into the production version and tested here. Staging is sometimes used instead of a QA environment. Only tags should be deployed here, never feature branches.
Data Source: Production-like data, often anonymized.
User Roles and Permissions: Restricted to development and QA teams.
Data Refresh Rate: Regularly updated to reflect the production environment.
Security Level: High, to protect the integrity of the staging data.
Monitoring and Alerts: Similar to production to ensure staging accurately reflects production performance.
Backup and Recovery: Important for maintaining a consistent test environment, though less frequent than production.
Change Management: Structured, as staging is a step before production release.
Performance Metrics: Performance, load, and stress testing metrics are key.
Tools and Services Used: Testing and deployment tools that mimic production.
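The "tags only, no feature branches" rule is easy to enforce with a small gate in the deployment pipeline. A minimal sketch, assuming the common `vMAJOR.MINOR.PATCH` tag convention (adjust the pattern to whatever your team actually tags releases with):

```python
import re

# Assumed convention: release tags look like v1.2.3.
# Anything else (branch names, commit refs) is rejected.
TAG_PATTERN = re.compile(r"^v\d+\.\d+\.\d+$")

def allowed_in_staging(ref: str) -> bool:
    """Only version tags may be deployed to staging -- never feature branches."""
    return bool(TAG_PATTERN.match(ref))
```

A CI job would call this with the ref being deployed and fail the pipeline when it returns `False`, so `feature/login-fix` or `main` can never land in staging by accident.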
E2E Testing Environment
Purpose: Running automated end-to-end tests.
Data Source: Comprehensive test data covering all operational scenarios.
User Roles and Permissions: Primarily testers, with some developer access for debugging.
Data Refresh Rate: As needed for test scenarios.
Security Level: Moderate, focused on test data integrity.
Monitoring and Alerts: Targeted on the testing process and outcome validation.
Backup and Recovery: Not typically a priority, as environments can be reset or recreated.
Change Management: Adaptive to allow for frequent testing of different scenarios.
Performance Metrics: Focused on process flows and user experience.
Tools and Services Used: E2E testing frameworks and automation tools.
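An end-to-end test walks a complete user flow rather than a single unit. A real suite would drive the deployed app over HTTP or through a browser framework; in this sketch a hypothetical in-memory app stands in so the shape of the flow, and the "reset per scenario" property noted above, stays visible:

```python
class FakeApp:
    """Hypothetical stand-in for the system under test."""
    def __init__(self):
        self.users, self.orders = {}, []

    def signup(self, email, password):
        self.users[email] = password

    def login(self, email, password):
        return self.users.get(email) == password

    def order(self, email, item):
        self.orders.append((email, item))
        return len(self.orders)

def test_signup_to_order():
    app = FakeApp()  # fresh environment per scenario -- reset, not restored
    app.signup("demo@example.com", "s3cret")
    assert app.login("demo@example.com", "s3cret")
    assert app.order("demo@example.com", "widget") == 1

test_signup_to_order()
```

Note how the environment is created from scratch for each scenario, which is exactly why backup and recovery matter little here.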
QA Environment
Purpose: Enables QA testing of nearly final results on feature branches.
Data Source: Mix of synthetic and production-like data.
User Roles and Permissions: Access mainly for QA engineers and testers.
Data Refresh Rate: Regularly updated for test accuracy.
Security Level: Moderate to high, protecting sensitive data.
Monitoring and Alerts: Focus on application behavior and errors.
Backup and Recovery: Regular backups for data integrity.
Change Management: Strict testing before production rollout.
Performance Metrics: Emphasizes load and stress testing.
Tools and Services Used: Test automation and monitoring tools.
Development Environment
Purpose: Allows developers to test nearly final outcomes on specific branches, akin to QA testing.
Data Source: Synthetic or sample data for feature testing.
User Roles and Permissions: Access restricted to developers and technical leads.
Data Refresh Rate: Updated as needed for development cycles.
Security Level: Lower, prioritizing functionality over data security.
Monitoring and Alerts: Targets development metrics and error logs.
Backup and Recovery: Code is version-controlled; environment backup less critical.
Change Management: Flexible for iterative code changes.
Performance Metrics: Monitored for potential performance impacts.
Tools and Services Used: IDEs, local test servers, and CI tools.
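In practice, all of these environments usually run the same codebase with different configuration, selected by an environment variable. A minimal sketch, where the settings values and the `APP_ENV` variable name are illustrative assumptions:

```python
import os

# Illustrative per-environment settings -- values are placeholders.
SETTINGS = {
    "development": {"debug": True,  "db_url": "sqlite:///dev.db"},
    "qa":          {"debug": True,  "db_url": "postgres://qa-host/app"},
    "demo":        {"debug": False, "db_url": "postgres://demo-host/app"},
    "staging":     {"debug": False, "db_url": "postgres://staging-host/app"},
    "production":  {"debug": False, "db_url": "postgres://prod-host/app"},
}

def load_settings(env=None) -> dict:
    """Pick settings from APP_ENV, defaulting to development."""
    name = env or os.environ.get("APP_ENV", "development")
    if name not in SETTINGS:
        raise ValueError(f"unknown environment: {name}")
    return SETTINGS[name]
```

Failing loudly on an unknown name is deliberate: silently falling back to development settings in production is a classic misconfiguration.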
Example setups, from minimal to extended:

Development → Testing/QA → Production
Development → Testing/QA → Staging → Production
Development → Testing/QA → Demo → Staging → Pre-production → Production
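The setups above are really promotion pipelines: code moves left to right, and each stage must be passed before the next. A small sketch encoding them, with the tier names ("minimal", "standard", "extended") as illustrative labels:

```python
# Promotion chains for the three example setups.
SETUPS = {
    "minimal":  ["development", "testing/qa", "production"],
    "standard": ["development", "testing/qa", "staging", "production"],
    "extended": ["development", "testing/qa", "demo", "staging",
                 "pre-production", "production"],
}

def next_stage(setup: str, current: str):
    """Return the next environment a release is promoted to, or None at the end."""
    chain = SETUPS[setup]
    i = chain.index(current)
    return chain[i + 1] if i + 1 < len(chain) else None
```

For example, `next_stage("extended", "demo")` yields `"staging"`, while `next_stage("standard", "production")` yields `None`, since production is the final stage.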