
Senior Data Engineer at Liven

Liven · Chennai, India · Hybrid

Apply now

About the role

As a Senior Data Engineer at Liven, you'll be a critical part of our Data & Analytics team—building and maintaining scalable ELT pipelines, ensuring data reliability, and enabling impactful decision-making across the company. This role blends engineering excellence with business insight, giving you the opportunity to influence how data is used company-wide.

You’ll be working with modern open-source tools and cloud-based data platforms, with the autonomy to shape the future of Liven’s data architecture.

What you'll do

  • Design, develop, and maintain scalable ELT pipelines using Python and SQL.
  • Own and manage data workflows using tools like Airbyte or Meltano.
  • Build and maintain robust dbt models and support analytics across the company.
  • Operate and improve data infrastructure using DevOps practices and CI/CD pipelines.
  • Collaborate with cross-functional teams to define metrics, model data, and deliver high-impact analytical outputs.
  • Manage and monitor DAGs in Prefect or Apache Airflow, ensuring data freshness and reliability.
  • Use Terraform and infrastructure-as-code principles to provision and manage cloud-based resources.
  • Support testing and monitoring using frameworks like Great Expectations (nice to have).
  • Contribute to data quality, governance, and documentation best practices.
  • Participate in a strong feedback culture, contributing to continuous improvement of data workflows and team practices.

Qualifications

Must-Have Experience

  • 4–8 years in Data or Analytics Engineering, ideally in a startup or scale-up environment.
  • Strong hands-on skills with Python and SQL.
  • Deep experience with DevOps practices and operational ownership of tools like Airbyte or Meltano.
  • Solid understanding and experience with dbt.
  • Proven track record of building reliable, scalable pipelines in a modern cloud data stack.

Should-Have Experience

  • Experience managing large DAGs using Prefect or Airflow.
  • Familiarity with Kafka and real-time pipeline architecture.
  • Experience using Terraform or other infrastructure-as-code tools.

Good to Have

  • Familiarity with ELT testing frameworks like Great Expectations.
  • Contributions to open-source data tools or frameworks.
  • Experience with proprietary ingestion tools like Fivetran or Segment.