Manager-Data Engineering-Big Data Engineering at exl Service.com (India) Private Limited

exl Service.com (India) Private Limited · Bengaluru, India · Hybrid

About the Role

We are seeking a highly skilled Lead Data Engineer to drive end-to-end data engineering initiatives, lead cross-functional teams, and deliver scalable, cloud-based data solutions. The ideal candidate will bring deep technical expertise, strong leadership, and the ability to collaborate effectively with global stakeholders in a fast-paced environment.

 

Key Responsibilities

Leadership & Team Management

  • Lead and mentor a team of onshore and offshore data engineers to ensure high-quality deliverables.
  • Provide technical direction, coaching, and knowledge sharing to foster team growth and capability building.
  • Establish and enforce engineering best practices, coding standards, and reusable frameworks.
  • Champion innovation, continuous learning, and thought leadership within the data engineering function.

Project Delivery & Execution

  • Oversee end-to-end project delivery, ensuring timely, high-quality execution aligned with project objectives.
  • Define high-level solution designs, data architectures, and ETL/ELT frameworks for cloud-based data platforms.
  • Drive development, code reviews, unit testing, and deployment to production environments.
  • Ensure optimal performance, scalability, and reliability across data pipelines and systems.

Stakeholder Communication & Collaboration

  • Collaborate with clients, product owners, business leaders, and offshore teams to gather requirements and define technical solutions.
  • Communicate project updates, timelines, risks, and technical concepts effectively to both technical and non-technical stakeholders.
  • Act as the primary point of contact for client technical discussions, solution design workshops, and progress reviews.

Risk & Issue Management

  • Proactively identify project risks, dependencies, and issues; develop and execute mitigation plans.
  • Ensure governance, compliance, and alignment with organizational standards and methodologies.
 

Must-Have Skills

  • 12+ years of experience in Data Engineering, with a proven record of delivering enterprise-scale projects.
  • Strong expertise in Big Data concepts, distributed systems, and cloud-native architectures.
  • Proficiency in Snowflake, SQL, and a wide range of AWS services (Glue, EMR, S3, Aurora, RDS, Lambda, Step Functions).
  • Hands-on experience with Python, PySpark, and building cloud-based microservices.
  • Strong problem-solving, analytical skills, and end-to-end ownership mindset.
  • Proven ability to work in Agile/Scrum environments with iterative delivery cycles.
  • Exceptional communication, leadership, and stakeholder management skills.
  • Demonstrated capability in leading onshore–offshore teams and coordinating multi-region delivery efforts.
 

Good-to-Have Skills

  • Experience with DevOps tools (Jenkins, Git, GitHub/GitLab) and CI/CD pipeline implementation.
  • Experience with cloud migration and large-scale modernization projects.
  • Familiarity with the US insurance/reinsurance domain, including P&C (Property & Casualty) insurance.
  • Knowledge of Data Vault 2.0 and modern data modeling techniques.

Qualifications

Educational Qualifications

Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field.
