Design and maintain enterprise-scale data pipelines using AWS cloud services, handling schema evolution in data feeds and delivering analytics-ready datasets to BI platforms. This role requires hands-on expertise with the full AWS data stack and proven ability to build enterprise-grade data solutions that scale.
Requirements
Essential Functions
Build and orchestrate ETL/ELT workflows using Apache Airflow for complex data pipeline management
Develop serverless data processing with AWS Lambda and EventBridge for real-time transformations
Create scalable ETL jobs using AWS Glue with automated schema discovery and catalog management
Execute database migrations and continuous replication using AWS DMS
Design and optimize Amazon Redshift data warehouses and Amazon Athena federated queries
Implement streaming data pipelines with Apache Kafka for real-time ingestion
Manage schema changes in data feeds with automated detection and pipeline adaptation
Create data feeds for Tableau and BusinessObjects reporting platforms
Supervisory Responsibilities
No supervisory responsibilities
Required Skills/Abilities
Airflow: DAG development, custom operators, workflow orchestration, production deployment
Tableau: Data source optimization, extract management, connection configuration, dashboard design and rework
BusinessObjects: Universe design, report development, data feed creation
Education and Experience
5+ years AWS data platform development
3+ years production Airflow experience with complex workflow orchestration
Proven experience managing high-volume data feeds (TB+ daily) with schema evolution
Database migration expertise using DMS for enterprise-scale projects
BI integration experience with Tableau and BusinessObjects platforms
2+ years Tableau experience
Key Competencies
Design fault-tolerant data pipelines with automated error handling and recovery
Handle schema changes in real-time and batch data feeds without pipeline disruption
Optimize performance across streaming and batch processing architectures
Implement data quality validation and monitoring frameworks
Coordinate cross-platform data synchronization and lineage tracking
Preferred Qualifications
AWS Data Analytics Specialty or Solutions Architect Professional certification
Experience with Infrastructure as Code (Terraform, CloudFormation)
Knowledge of DataOps practices and CI/CD for data pipelines
Containerization experience (Docker, ECS, EKS) for data workloads
Working Conditions
Work is generally performed within an indoor office environment utilizing standard office equipment.
General office environment requires frequent sitting; dexterity of hands and fingers to operate a computer keyboard and mouse; walking and standing for extended periods; and lifting less than 20 pounds.