- Senior
- Office in Mexico City
- Design, develop, and maintain scalable data pipelines using Python, Snowflake, AWS Glue, and Airflow (a minimal DAG sketch follows this list).
- Build and optimize ETL/ELT workflows to ingest, transform, and deliver data from various sources.
- Implement data quality checks, monitoring, and alerting to ensure pipeline reliability.
- Collaborate with analysts, data scientists, and business stakeholders to understand requirements and deliver solutions.
- Establish and enforce data engineering best practices, coding standards, and design patterns.
- Optimize query performance and data storage strategies in Snowflake and cloud data warehouses.
- Implement CI/CD pipelines for automated testing and deployment of data workflows.
- Monitor pipeline performance, troubleshoot issues, and implement improvements proactively.
- Create and maintain technical documentation for data architectures, pipelines, and processes.
- Participate in code reviews and provide constructive feedback to team members.
- Mentor junior data engineers on modern tools, techniques, and best practices.
- Stay current with emerging data technologies and evaluate their applicability to business needs.
- Collaborate with DevOps and platform teams to ensure infrastructure scalability and reliability.
- Support data governance initiatives and ensure compliance with security policies.
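As a concrete point of reference for the orchestration work above, here is a minimal sketch of the kind of Airflow DAG this role involves, written in Python and assuming Airflow 2.4+. The DAG id, task names, and schedule are hypothetical, and the extract/load bodies are stubbed; this is an illustration, not a description of our actual pipelines.

```python
# Minimal Airflow DAG sketch (hypothetical names; assumes Airflow 2.4+).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull the latest records from the source system (stubbed)."""
    ...


def load_to_snowflake(**context):
    """Copy the staged data into a Snowflake table (stubbed)."""
    ...


default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="orders_elt",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load  # run extract before load
```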
- 5+ years of experience in data engineering or related field.
- Bachelor's Degree in Computer Science, Engineering, or equivalent practical experience.
- Expert proficiency in Python for data engineering, ELT development, and automation.
- Strong SQL skills with experience in query optimization and performance tuning.
- Hands-on experience with Snowflake (SnowSQL, stored procedures, streams/tasks).
- Proficiency with AWS data services: Glue, S3, Lambda, Redshift.
- Experience with Apache Airflow or similar orchestration tools (Luigi, Prefect, Dagster).
- Knowledge of data modeling principles (dimensional modeling, data vault, normalization).
- Experience with streaming technologies (Kafka, Kinesis) and dbt for analytics engineering.
- Understanding of DataOps, CI/CD for data pipelines, and infrastructure as code (Terraform).
- Familiarity with version control (Git), code reviews, and collaborative development.
- Knowledge of data quality frameworks (Great Expectations) and containerization (Docker/Kubernetes) is a plus (a minimal quality-check sketch follows this list).
- Understanding of data governance, security best practices, and compliance requirements.
- Strong problem-solving skills and ability to research new technologies independently.
- Excellent communication skills for technical and non-technical audiences.
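To make the data quality expectation concrete, here is a minimal hand-rolled sketch of a batch check in Python; in practice a framework such as Great Expectations would take the place of code like this, and the column names (`order_id`, `created_at`) are hypothetical.

```python
# Minimal batch quality check (hypothetical columns; a stand-in for a
# framework such as Great Expectations).
from typing import Any


def check_batch(rows: list[dict[str, Any]]) -> list[str]:
    """Return human-readable failures for a batch of records."""
    failures: list[str] = []
    if not rows:
        return ["batch is empty"]
    # Required columns must be non-null in every row.
    for col in ("order_id", "created_at"):
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            failures.append(f"{col}: {nulls} null value(s)")
    # The primary key must be unique within the batch.
    ids = [r.get("order_id") for r in rows]
    if len(ids) != len(set(ids)):
        failures.append("order_id: duplicate values")
    return failures


if __name__ == "__main__":
    sample = [
        {"order_id": 1, "created_at": "2024-01-01"},
        {"order_id": 1, "created_at": None},
    ]
    print(check_batch(sample))
    # -> ['created_at: 1 null value(s)', 'order_id: duplicate values']
```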
- Private Medical/Dental Plan
- Savings Fund
- Life Insurance
- Meal/Grocery Voucher
We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form or by calling 1-855-833-5120.
Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams here.
Please read our Candidate Privacy Policy.
We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law.