
Senior Data Engineer (Remote) at Advance Local Media LLC

Advance Local Media LLC · Remote, United States

$120,000.00  -  $140,000.00


Advance Local is looking for a Senior Data Engineer to design, build, and maintain the enterprise data infrastructure that powers our cloud data platform. This position combines deep technical expertise in data engineering with team leadership responsibilities, overseeing the ingestion, integration, and reliability of data systems across Snowflake, AWS, Google Cloud, and legacy platforms. You'll partner with the data product team and business units to translate requirements into technical solutions, integrate data from numerous third-party platforms (CDPs, DMPs, analytics platforms, marketing tech) into the central data platform, collaborate closely with the Data Architect on platform strategy, and ensure scalable, well-engineered solutions for modern data infrastructure using infrastructure as code and API-driven integrations.

 

The base salary range is $120,000 - $140,000 per year.

 

What you’ll be doing:

  • Lead the design and implementation of scalable data ingestion pipelines from diverse sources into Snowflake.
  • Partner with platform owners across business units to establish and maintain data integrations from third-party systems into the central data platform.
  • Architect and maintain data infrastructure using infrastructure as code (IaC), ensuring reproducibility, version control, and disaster recovery capabilities.
  • Design and implement API integrations and event-driven data flows to support real-time and batch data requirements.
  • Evaluate technical capabilities and integration patterns of existing and potential third-party platforms, advising on platform consolidation and optimization opportunities.
  • Partner with the Data Architect and data product to define the overall data platform strategy, ensuring alignment between raw data ingestion and analytics-ready data products that serve business unit needs.
  • Develop and enforce data engineering best practices including testing frameworks, deployment automation, and observability.
  • Support rapid prototyping of new data products in collaboration with data product by building flexible, reusable data infrastructure components.
  • Design, develop, and maintain scalable data pipelines and ETL processes; optimize and improve existing data systems for performance, cost efficiency, and scalability.
  • Collaborate with data product, third-party platform owners, Data Architects, Analytics Engineers, Data Scientists, and business stakeholders to understand data requirements and deliver technical solutions that enable business outcomes across the organization.
  • Implement data quality validation, monitoring, and alerting systems to ensure reliability of data pipelines from all sources.
  • Develop and maintain comprehensive documentation for data engineering processes and systems, architecture, integration patterns, and runbooks.
  • Lead incident response and troubleshooting efforts for data pipeline issues, ensuring minimal business impact.
  • Stay current with emerging data engineering technologies, cloud services, SaaS platform capabilities, and industry best practices.
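The quality-validation and alerting duties above can be sketched as a minimal batch-check harness in Python. This is an illustrative sketch only, not Advance Local's actual tooling; the check names and row shape are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str


def run_checks(rows, checks):
    """Run each named check against a loaded batch and collect results.

    A failing or raising check is recorded rather than aborting the run,
    so monitoring can report every problem in one pass.
    """
    results = []
    for name, predicate in checks:
        try:
            ok = predicate(rows)
            results.append(CheckResult(name, ok, "" if ok else "check returned False"))
        except Exception as exc:
            results.append(CheckResult(name, False, f"check raised: {exc}"))
    return results


# Hypothetical checks for an ingested batch: non-empty, no null ids, unique ids.
checks = [
    ("non_empty", lambda rows: len(rows) > 0),
    ("no_null_ids", lambda rows: all(r.get("id") is not None for r in rows)),
    ("unique_ids", lambda rows: len({r["id"] for r in rows}) == len(rows)),
]

batch = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
results = run_checks(batch, checks)
failures = [r for r in results if not r.passed]
```

In a production pipeline, `failures` would feed an alerting channel and gate downstream loads; here it simply collects any failed `CheckResult`.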

 

Our ideal candidate will have the following:

  • Bachelor's degree in computer science, engineering, or a related field
  • Minimum of seven years of experience in data engineering with at least two years in a lead or senior technical role
  • Expert proficiency in Snowflake data engineering patterns
  • Strong experience with AWS services (S3, Lambda, Glue, Step Functions) and Google Cloud Platform
  • Experience integrating data from SaaS platforms and marketing technology stacks (CDPs, DMPs, analytics platforms, CRMs)
  • Proven ability to work with third party APIs, webhooks, and data exports
  • Experience with infrastructure as code (Terraform, CloudFormation) and CI/CD pipelines for data infrastructure
  • Proven ability to design and implement API integrations and event-driven architecture
  • Experience with data modeling, data warehousing, and ETL processes at scale
  • Advanced proficiency in Python and SQL for data pipeline development
  • Experience with data orchestration tools (Airflow, dbt, Snowflake Tasks)
  • Strong understanding of data security, access controls, and compliance requirements
  • Ability to navigate vendor relationships and evaluate technical capabilities of third-party platforms
  • Excellent problem-solving skills and attention to detail
  • Strong communication and collaboration skills
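The third-party API and data-export experience listed above often reduces to cursor-based pagination. A minimal sketch, assuming a hypothetical `fetch_page(cursor)` export endpoint; the stub below stands in for a real platform API:

```python
def fetch_all_pages(fetch_page):
    """Pull every record from a cursor-paginated endpoint.

    `fetch_page(cursor)` is an injected callable returning
    (records, next_cursor); a next_cursor of None means done.
    Injecting the fetcher keeps the loop testable without a network.
    """
    records, cursor = [], None
    while True:
        page, cursor = fetch_page(cursor)
        records.extend(page)
        if cursor is None:
            return records


# Stub standing in for a third-party platform's export API,
# serving seven records in pages of three.
_DATA = list(range(7))


def _stub_fetch(cursor):
    start = cursor or 0
    page = _DATA[start:start + 3]
    nxt = start + 3 if start + 3 < len(_DATA) else None
    return page, nxt


all_records = fetch_all_pages(_stub_fetch)
```

Against a real SaaS export, `_stub_fetch` would be replaced by an HTTP call that passes the cursor as a query parameter and handles retries and rate limits.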
