Hybrid QA Automation Engineer - Dallas, TX at Photon Group
Photon Group · Dallas, United States of America · Hybrid
- Professional
- Optional office in Dallas
This job description is for a QA Automation Engineer to work on a cloud-native platform modernization project. The successful candidate will be responsible for ensuring the quality, reliability, and data integrity of a new AWS-based system that replaces a legacy platform.
Responsibilities:
Test Planning & Strategy: Develop a comprehensive delivery plan for onboarding data and decommissioning legacy files.
Automation Framework Development: Design, build, and maintain automated testing frameworks for both real-time and batch data processing capabilities. Implement automation to validate the provisioning of AWS infrastructure and its adherence to security and compliance standards.
Testing Execution: Conduct a variety of tests, including unit, integration, performance, and security testing. Perform DR drills to validate RTO/RPO objectives. Support user acceptance testing (UAT) and resolve any defects.
Data Validation & Reconciliation: Develop and validate a reconciliation system to ensure data integrity and parity between the legacy and new platforms. Verify that the new process runs in parallel with legacy files for 60 days as part of the migration strategy.
Quality Assurance: Verify that access controls, encryption, and audit logs are correctly implemented and reviewed with stakeholders. Ensure that logging, monitoring, and alerting are functional and thoroughly tested. Validate that CI/CD pipelines support automated deployment and rollback.
Documentation & Reporting: Deliver operational documentation, including test plans and test reports. Ensure all documentation is accessible and up to date, and that the handover checklist is signed off by engineering and support leads.
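To make the reconciliation responsibility concrete, a record-level parity check between the legacy and new platforms might look like the following minimal sketch. The record layout, field names, and the `account_id` join key are illustrative assumptions, not part of this posting:

```python
# Minimal sketch of a record-level reconciliation check between a legacy
# extract and the new platform's output. Field names and the join key
# ("account_id") are illustrative assumptions.

def reconcile(legacy_records, new_records, key="account_id"):
    """Return keys missing from either side, plus records whose
    field values disagree, indexed by the given identifier."""
    legacy = {r[key]: r for r in legacy_records}
    new = {r[key]: r for r in new_records}

    missing_in_new = sorted(set(legacy) - set(new))
    missing_in_legacy = sorted(set(new) - set(legacy))
    mismatched = {
        k: (legacy[k], new[k])
        for k in set(legacy) & set(new)
        if legacy[k] != new[k]
    }
    return missing_in_new, missing_in_legacy, mismatched


legacy_rows = [
    {"account_id": 1, "balance": "100.00"},
    {"account_id": 2, "balance": "250.50"},
]
new_rows = [
    {"account_id": 1, "balance": "100.00"},
    {"account_id": 2, "balance": "250.55"},  # drifted value
    {"account_id": 3, "balance": "0.00"},    # unexpected extra record
]

only_legacy, only_new, diffs = reconcile(legacy_rows, new_rows)
assert only_legacy == []   # nothing dropped during migration
assert only_new == [3]     # extra record flagged
assert 2 in diffs          # value mismatch flagged for review
```

During the 60-day parallel run mentioned above, a check like this would typically be executed per batch cycle, with any non-empty result failing the run.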
Key points:
- Experience in data ingestion testing is a must. The candidate should understand the data flow and be able to test transformations, schema mapping, and data integrity.
- Should have experience with streaming and batch processing tools such as Apache Kafka, Spark, Flink, or equivalents
- Should know how to run tests for latency, throughput, and fault tolerance
- Should be familiar with data profiling and anomaly detection tools
- Good at writing test cases for completeness, accuracy, and consistency
- Should know how to test IAM roles, policies, and encryption at rest/in transit
- Should know how to validate secure API endpoints and token-based authentication
- Should know how to verify data residency and retention policies
- Ability to simulate data ingestion scenarios
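As a rough illustration of the ingestion-testing skills listed above, simulating an ingestion scenario and asserting on schema mapping, completeness, and value integrity could be sketched as follows. The source and target schemas and the transform itself are hypothetical, chosen only to show the shape of such a test:

```python
# Minimal sketch of simulating a data ingestion scenario and validating
# schema mapping plus data integrity. The source/target schemas and the
# transform are illustrative assumptions, not the employer's actual design.

TARGET_SCHEMA = {"customer_id": int, "full_name": str, "amount_cents": int}

def transform(raw):
    """Map a raw source record onto the target schema."""
    return {
        "customer_id": int(raw["id"]),
        "full_name": f'{raw["first"]} {raw["last"]}'.strip(),
        "amount_cents": round(float(raw["amount"]) * 100),
    }

def validate(record, schema=TARGET_SCHEMA):
    """Check completeness (all fields present) and type consistency."""
    missing = set(schema) - set(record)
    bad_types = {k for k, t in schema.items()
                 if k in record and not isinstance(record[k], t)}
    return missing, bad_types

# Simulated raw batch arriving from the source system.
raw_batch = [
    {"id": "42", "first": "Ada", "last": "Lovelace", "amount": "19.99"},
]
out = [transform(r) for r in raw_batch]

for rec in out:
    missing, bad_types = validate(rec)
    assert not missing and not bad_types  # schema mapping is complete and typed

assert out[0]["amount_cents"] == 1999    # value integrity preserved
```

In practice the same assertions would run inside a test framework against records pulled from the real pipeline (e.g. a Kafka topic or a batch output), rather than an in-memory list.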
Required Skills & Qualifications:
Automation Expertise: Proven experience in designing and implementing automated testing frameworks from scratch.
Cloud & DevOps: Hands-on experience with AWS and testing cloud-native applications. Familiarity with CI/CD pipelines and Infrastructure as Code (IaC) is crucial.
Data Testing: Strong experience with data validation, integrity checks, and building reconciliation systems to compare datasets from different platforms.
Technical Knowledge: Knowledge of real-time and batch data processing, and experience in testing secure and scalable data ingestion platforms.
Problem-Solving: Ability to analyze requirements and mappings for data ingestion and translate them into effective test cases. Experience with Agile methodologies and managing agile backlogs is a plus.
Compensation, Benefits and Duration
Minimum Compensation: USD 35,000
Maximum Compensation: USD 124,000
Compensation is based on the actual experience and qualifications of the candidate. The above is a reasonable, good-faith estimate for the role.
Medical, vision, and dental benefits, 401k retirement plan, variable pay/incentives, paid time off, and paid holidays are available for full time employees.
This position is not available to independent contractors.
No applications will be considered if received more than 120 days after the date of this post