
Remote Senior Data Platform Engineer at Liven

Liven · Chennai, India · Remote

Apply now

About the role

As a Senior Data Platform Engineer, you will be the technical owner of Liven’s data platform infrastructure—driving automation, scalability, and resilience across our data workflows. Sitting at the intersection of DevOps and Data Engineering, this role is critical in enabling our teams to ship faster, fail less, and derive more value from our data ecosystem.

You’ll work closely with data engineers, analysts, product managers, and software engineers to build and operate a modern, secure, and scalable data platform that supports everything from machine learning pipelines to real-time analytics.

What you'll do

  • Own and operate the end-to-end data infrastructure, ensuring performance, reliability, and scalability.
  • Design and implement CI/CD pipelines specifically for data workflows and tooling.
  • Deploy and manage tools like Airbyte, Prefect, and Superset using Docker and Kubernetes.
  • Set up and maintain monitoring, secrets management, and alerting systems to ensure platform health and security.
  • Apply GitOps practices or tools like Argo Workflows for streamlined infrastructure deployments.
  • Manage and scale Kafka, Spark, or DuckDB clusters to support real-time and batch data workloads.
  • Explore and maintain self-hosted data tooling such as dbt (dbt Core), ensuring smooth integration and performance.
  • Use Infrastructure-as-Code tools like Terraform or Helm to automate provisioning and configuration.
  • Administer observability stacks such as Grafana and Prometheus for infrastructure visibility.
  • Implement secure access control, role-based permissions, and ensure compliance with GDPR, HIPAA, and internal data governance standards.
  • Collaborate across teams to support data engineers, analysts, and developers with reliable infrastructure and workflow tooling.
  • Steer clear of proprietary infrastructure platforms like AWS Glue or Azure Synapse (we’re staying open-source/cloud-native for now).

Qualifications

  • 5–8 years of experience in DataOps, DevOps, or Platform Engineering roles.
  • Proficiency with modern data stack components (e.g., Airflow, dbt, Kafka, Databricks, Redshift).
  • Solid understanding of cloud platforms (AWS or GCP).
  • Strong communication skills to collaborate across product, data science, and engineering teams.
  • A bias for ownership, automation, and proactive problem-solving.

Good to have

  • Experience with Infrastructure-as-Code tools like Terraform or Helm for managing Kubernetes and cloud resources.
  • Familiarity with administering Grafana, Prometheus, or similar observability stacks.
  • Exposure to GitOps methodologies and tools like Argo CD or Flux.
  • Hands-on experience with self-hosted or hybrid setups of tools like dbt.
  • Understanding of auto-scaling strategies for distributed systems (Kafka, Spark, DuckDB).
  • Experience contributing to platform or DevOps initiatives in a data-heavy environment.


