- Professional
- Optional office in Bengaluru
Job Description:
Responsibilities:
- Design, develop, and maintain robust data pipelines and workflows using Snowflake features, including:
  - Stored Procedures
  - Tasks and Streams
  - Snowpark for Python
- Write optimized and maintainable SQL and Python code for data processing and transformation.
- Collaborate with cross-functional teams to build and support marketing data products.
- Ensure data quality, integrity, and governance across all data products.
- Participate in Agile ceremonies (stand-ups, sprint planning, retrospectives) and contribute to sprint planning and delivery.
- Understand data requirements and deliver scalable solutions.
- Monitor and optimize data pipeline performance and cost-efficiency in Snowflake.
- Create technical documentation and maintain Confluence pages for knowledge sharing.
Required Skills & Qualifications:
- 3-5 years of experience in Data Engineering or a related field.
- Strong expertise in Snowflake development:
  - Stored Procedures
  - Tasks/Streams
  - SQL
  - Snowpark for Python
- Proficient in Python programming, especially for data manipulation and automation.
- Hands-on experience with marketing data, attribution models, campaign data, and customer segmentation.
- Familiarity with Agile development processes and tools.
- Excellent written and verbal communication skills.
Good to Have:
- Experience with JIRA and Confluence.
- Exposure to CI/CD practices using GitHub and/or Bitbucket.
- Understanding of orchestration and automation tools.
- Knowledge of modern data architectures and cloud-native solutions.
Location:
DGS India - Bengaluru - Manyata H2 Block
Brand:
Merkle
Time Type:
Full time
Contract Type:
Permanent