Apply now
Overview:

As a Senior Technical Lead, you will design, build, and maintain scalable backend and distributed systems that power real-time and data-intensive applications. You will have the autonomy to apply your expertise in Java, Scala, or Python, crafting robust microservices and APIs and integrating event-driven and streaming solutions. This role blends hands-on engineering with architectural responsibility and offers opportunities to shape our evolving tech stack across batch and streaming environments.

Responsibilities:
  • Design, develop, and maintain scalable, production-quality backend microservices and REST/gRPC APIs using your primary language and framework (e.g., Java/Spring Boot, Scala/Akka, Python/FastAPI/Django).
  • Architect and support event-driven and real-time data solutions using messaging or streaming platforms such as Apache Kafka, Apache Flink, Apache Spark Structured Streaming, Pulsar, Pub/Sub, or similar.
  • Collaborate in building and optimizing both streaming and batch data pipelines for reliable, timely data delivery.
  • Integrate with and optimize relational (PostgreSQL, MySQL) or NoSQL databases, designing schemas and high-performance queries.
  • Leverage containerization (Docker) and orchestration (Kubernetes) to build and deploy cloud-native, resilient applications.
  • Contribute to CI/CD pipelines, infrastructure as code, and cloud-native operational practices.
  • Champion secure coding, observability, monitoring, and performance optimization across all services.
  • Collaborate closely with product, data, DevOps, and engineering peers in Agile/Scrum cycles.
  • Mentor team members, participate in code/design reviews, and foster knowledge sharing.
Requirements:
  • Bachelor’s degree or higher in Computer Science, Engineering, or a related technical discipline.
  • 11+ years of hands-on software or data engineering experience, including designing and maintaining streaming or real-time data pipelines.
  • Strong expertise with distributed streaming platforms such as Kafka, Flink, Spark Structured Streaming, or Pulsar, and their associated tooling.
  • Strong programming experience in Java, Scala, or Python for backend and distributed systems; proficiency in one major framework (Spring Boot, Django, FastAPI, Akka, etc.).
  • In-depth understanding of event-driven architectures, stateful stream processing, windowing, and exactly-once delivery semantics.
  • Experience architecting robust, high-performance systems optimized for latency, throughput, and resilience.
  • Familiarity with cloud-based streaming services (AWS Kinesis, Google Pub/Sub, Azure Event Hubs) is a plus.
  • Experience working with Docker and Kubernetes for containerization and orchestration is beneficial.
  • Excellent collaboration, problem-solving, and communication skills.