- Professional
- Optional office in Madrid
Are you excited about building modern data infrastructure that empowers self-service analytics at scale? At Superbet, data is a strategic enabler of decision-making, experimentation, and performance measurement. Our Analytics Platform team builds the foundations that make data reliable, accessible, and actionable—partnering with product and business teams to measure what matters.
You’ll be joining a mature and growing community of 40+ data engineers who are shaping one of the most advanced data ecosystems in the industry.
What you'll be doing:
- design, build, and optimize scalable ETL/ELT pipelines for high-volume, fast-changing datasets
- apply proven data warehouse methodologies (e.g. Kimball, Inmon) to design and deliver trustworthy, performant data models
- develop and maintain semantic layers and curated data marts to enable analytical self-service across the company
- orchestrate reliable workflows with modern tooling such as Apache Airflow
- enforce data quality, lineage, and observability practices to ensure confidence in our data products
- partner with product, analytics, and engineering teams to co-create scalable data solutions that deliver measurable business value
- raise the engineering bar through code reviews, testing, automation, and architectural improvements
We're looking for:
- strong SQL and Python skills, with solid understanding of distributed data systems and modern data architectures
- deep knowledge of data warehouse modeling methodologies (Kimball dimensional modeling, Inmon) and hands-on experience building analytical layers in DWHs such as Snowflake or BigQuery
- hands-on experience with cloud platforms (AWS preferred, Azure/GCP also valuable)
- workflow orchestration experience with Airflow or similar tools
- strong ownership mindset—able to work independently, structure work, and partner effectively with technical and non-technical stakeholders
- excellent communication skills with a collaborative, problem-solving approach
- based in Madrid or the surrounding area, as this role requires hybrid work
Bonus points for:
- experience with real-time streaming technologies (Kafka or similar)
- experience with cloud-native analytical databases such as Snowflake
- multi-cloud exposure and understanding of platform trade-offs
- background in product-led or experimentation-driven environments
- experience with data governance, cataloging, and documentation practices (e.g. DataHub)
Apply now