The Data Engineering Manager will lead distributed teams responsible for designing, modernizing, and operating large-scale data pipelines on the NexusOne platform — an open-source-based data and AI control plane for regulated enterprise environments.
This leader combines hands-on technical depth with strategic delivery management, ensuring that complex data workloads are refactored, migrated, and optimized using modern, composable frameworks.
The role requires strong architectural intuition, proven leadership, and fluency in modern data-engineering technologies.
Requirements
Must-Have Core Skills
Distributed Data Systems: Deep expertise in Apache Spark (core, SQL, streaming) and Airflow orchestration.
Open-Source Data Stack: Hands-on experience with Iceberg, Trino/Presto, Hive, NiFi, and Kafka.
Data Modeling & Transformation: Strong command of SQL, dbt, and schema-on-read design principles.
Programming & Scripting: Proficiency in Python, Scala, and Java for data pipeline development.
Infrastructure & Automation: Practical experience with Kubernetes, Terraform, Helm, GitLab CI/Jenkins, and Linux-based environments.
Governance & Security: Familiarity with Ranger, DataHub, RBAC, and enterprise IAM/SSO integration.
Data Quality & Testing: Working knowledge of Great Expectations or similar validation frameworks.
Delivery Leadership: Proven ability to manage distributed engineering teams in Agile/DevOps settings.
Ideal Background
8–12 years of experience in data engineering or platform modernization, including 3+ years leading technical teams.
Strong foundation in open-source and hybrid-cloud data ecosystems.
Demonstrated success leading migration or modernization programs at enterprise scale.
Excellent communication and stakeholder-management skills; capable of bridging executive vision and technical execution.
Success Criteria
Delivery of high-performing, production-ready data pipelines on the NexusOne platform.
Consistent adherence to governance, quality, and performance SLAs.
Adoption of standardized frameworks and automation across engineering teams.
Measurable improvement in pipeline reliability, cost efficiency, and developer productivity.
High engagement, retention, and growth within the data-engineering team.