
Sr Data Engineer (Snowflake, SnowPark, SnowPipe) : W2 role at HARAMAIN SYSTEMS INC.

HARAMAIN SYSTEMS INC. · St. Louis, United States of America · Hybrid


Sr Data Engineer (Snowflake, SnowPark, SnowPipe) :: W2 role

Location: St. Louis, MO (Hybrid)

This job will be mostly remote, but the team needs to meet onsite in St. Louis each week. Notice may come as late as the day before an onsite meeting, and the team may then work offsite for the rest of the week or for the next two weeks.

CANDIDATE MUST RELOCATE TO ST. LOUIS; no relocation expenses are covered.

Requirements:

  • Executive-level communication skills required:
    • Work with executive leadership to communicate ideas.
    • Build trust through technical depth and collaborate.
  • Technical - Most Senior Data Engineer / Principal Architect level role
  • 3+ years Snowflake experience
  • Expert: Snowflake, SnowPark (data science), data ingestion (SnowPipe, Azure Data Factory); see the ingestion sketch after this list
  • Programming: Scala (building custom data pipelines), Python (data modeling), PySpark
  • Azure preferred; AWS or GCP is acceptable.
  • Expert building Data Pipelines
  • Must Start October 1st
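
For the data-ingestion requirement above, a typical Snowpipe setup wraps a COPY INTO statement over an external stage in a pipe. The snippet below is a minimal sketch issued through a Snowpark session and is not part of the posting; the pipe, stage, table, and notification-integration names are hypothetical placeholders, and a real deployment would also configure cloud event notifications (for example, from Azure).

```python
# Minimal sketch: creating a Snowpipe pipe over an external stage via Snowpark.
# All object names below (pipe, table, stage, integration) are hypothetical.
from snowflake.snowpark import Session


def create_claims_pipe(session: Session) -> None:
    # Pipe that auto-ingests new files landing in an external (e.g. Azure) stage.
    session.sql("""
        CREATE PIPE IF NOT EXISTS RAW_TPA_CLAIMS_PIPE
          AUTO_INGEST = TRUE
          INTEGRATION = 'TPA_EVENTS_INTEGRATION'
        AS
          COPY INTO RAW_TPA_CLAIMS
          FROM @TPA_LANDING_STAGE
          FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """).collect()


if __name__ == "__main__":
    # Connection parameters would normally come from a secrets store, not literals.
    session = Session.builder.configs({
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }).create()
    try:
        create_claims_pipe(session)
    finally:
        session.close()
```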

About the Job

Safety National is owned by Tokyo Marine (a large equity holding fund).

The project involves extracting value from third-party administrator data using the Snowflake platform.

High priority: a customer-facing role, handling architecture and challenges, and leading the conversation.

The ideal candidate is ready to step toward a principal role soon and has the visible leadership to run this large project and interact with the client's executive team. Carry that leadership torch for Safety National and be in our leadership pipeline.


JOB DETAIL

We are seeking a highly skilled and experienced Senior Data Engineer to design and build a robust set of data ingestion and processing pipelines, maintaining best practices for a data lakehouse architecture. This is a highly visible, client-facing role that requires a blend of technical expertise, architectural leadership, and strong communication skills. You'll be the technical lead, working closely with team members to deliver innovative data solutions that support analytics, machine learning, and real-time decision-making.

Key Responsibilities

  • Lead the design, implementation, and maintenance of scalable data lakehouse platforms using modern tools like Databricks, Snowflake, and Apache Iceberg.
  • Develop and optimize high-performance batch and streaming ETL/ELT pipelines, with a strong focus on Snowflake, Snowpipe, and Snowpark (see the transformation sketch after this list).
  • Act as a technical leader, managing architecture discussions and leading conversations with both internal teams and external clients.
  • Implement and enforce data quality, governance, and security best practices to ensure data integrity and compliance.
  • Identify opportunities to integrate platform-level AI tools (like those in Snowflake, Databricks, and Fabric) to outpace traditional data science efforts and deliver faster, more impactful insights.
  • Collaborate with cross-functional teams, including data scientists and business stakeholders, to deliver high-quality, business-critical datasets.
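
To make the Snowpark responsibility above concrete, here is a minimal Snowpark-for-Python sketch of a batch transformation step, not part of the posting; the table names, column names, and connection parameters are assumptions for illustration only.

```python
# Minimal Snowpark (Python) sketch of a batch transformation step.
# Table and column names below are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_


def build_claims_summary(session: Session) -> None:
    # Read a hypothetical raw third-party-administrator claims table loaded by Snowpipe.
    claims = session.table("RAW_TPA_CLAIMS")

    # Keep only closed claims and aggregate paid amounts per administrator and policy year.
    summary = (
        claims.filter(col("CLAIM_STATUS") == "CLOSED")
        .group_by(col("TPA_NAME"), col("POLICY_YEAR"))
        .agg(sum_(col("PAID_AMOUNT")).alias("TOTAL_PAID"))
    )

    # Persist the curated result for downstream analytics.
    summary.write.mode("overwrite").save_as_table("CURATED_CLAIMS_SUMMARY")


if __name__ == "__main__":
    # Connection parameters would normally come from a secrets store, not literals.
    session = Session.builder.configs({
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }).create()
    try:
        build_claims_summary(session)
    finally:
        session.close()
```

In practice, a step like this could be scheduled as a Snowflake task or triggered from an orchestration tool such as Azure Data Factory, with the raw table fed continuously by the Snowpipe pipe sketched under Requirements.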

Qualifications

  • Snowflake experience/proficiency is critical.
  • Azure experience is preferred, but Google Cloud and AWS are acceptable.
  • 5+ years of professional experience in data engineering.
  • Strong technical leadership and excellent communication skills, with proven experience in a client-facing role.
  • Deep expertise in cloud data platforms, with significant hands-on experience in Snowflake.
  • Demonstrated experience with data lakehouse design patterns and modern data architectures.
