- Professional
- Optional office in Indianapolis
Job Details
Description
Walker Overview
Walker is a full-service Experience Management (XM) firm. We believe everyone deserves an amazing experience. This is our purpose, and we fulfill it by providing the world’s leading brands with the services, guidance, and best practices needed to maximize the value of their XM programs.
Walker is located in Indianapolis, IN. We are open to on-site, hybrid, or remote work locations to meet the varying needs of our team members. Remote options are available from any state in which we have operations in the continental U.S. Walker is intentional and mindful about creating a workforce of diverse people who are compensated fairly and are free to be their authentic selves. We know doing this will further enhance the experience our associates and customers have with our company.
Summary
You will help design and implement data engineering solutions leveraging Snowflake, primarily in AWS or Azure environments. You’ll build and support pipelines and ETL/ELT processes, and collaborate closely with both internal teams and clients to deliver impactful, scalable solutions.
Responsibilities
• Develop, optimize, and maintain data pipelines using Snowflake Streams and Tasks (a minimal sketch follows this list).
• Work with cloud data platforms (AWS or Azure) to ingest and transform data using Snowflake best practices.
• Engineer and support projects on the Snowflake platform with internal and external stakeholders, including larger-scope engagements committed fully to Snowflake.
• Collaborate with business and technical stakeholders to translate requirements into actionable solutions, ensuring strong communication throughout.
• Implement monitoring, logging, and performance tuning for Snowflake-based pipelines.
• Ensure data quality, integrity, and security using Snowflake’s governance and access control features.
• Quickly develop products or solutions that can be offered through the Snowflake Marketplace.
• Document workflows and maintain version control using tools such as dbt, Airflow or Jenkins, and Git.
• Consistently deliver robust, end-to-end Snowflake pipelines supporting business reporting and analytics.
• Proactively optimize workflows and maintain cost-effective compute and storage configurations.
• Collaborate smoothly with internal teams and external stakeholders, contributing effectively to agile sprint cycles.
• Demonstrate continuous learning through available upskilling or external training.
• Other duties as assigned.
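To illustrate the Streams-and-Tasks pipeline pattern referenced in the first responsibility above, here is a minimal Snowflake SQL sketch. It is not Walker’s actual pipeline; the table, column, and warehouse names (raw_events, curated_events, transform_wh) are hypothetical.

```sql
-- Hypothetical source and target tables for illustration.
CREATE OR REPLACE TABLE raw_events (id INT, payload VARIANT, loaded_at TIMESTAMP_NTZ);
CREATE OR REPLACE TABLE curated_events (id INT, event_type STRING, loaded_at TIMESTAMP_NTZ);

-- A stream records row-level changes (CDC) on the source table.
CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events;

-- A task polls the stream on a schedule and loads new rows downstream.
CREATE OR REPLACE TASK load_curated_events
  WAREHOUSE = transform_wh              -- assumed warehouse name
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_events_stream')
AS
  INSERT INTO curated_events
  SELECT id, payload:event_type::STRING, loaded_at
  FROM raw_events_stream
  WHERE METADATA$ACTION = 'INSERT';     -- append-only source assumed

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_curated_events RESUME;
```

The WHEN clause skips runs while the stream is empty, so the task consumes no warehouse credits between loads, which ties directly into the cost-optimization responsibility above.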
Education & Experience
• 1+ years of experience working in Data Engineering or Python-based application development on Snowflake or cloud-based data platforms.
• Bachelor’s degree in Computer Science, Data Science, Information Technology, or a related field; or equivalent professional experience.
• Hands-on experience with at least one major cloud platform (AWS or Azure).
• Experience working on projects dedicated to Snowflake, ideally in a client-facing or consulting environment (preferred).
Knowledge, Skills & Abilities
• Proficiency in Python and SQL.
• Familiarity with Snowflake features such as Time Travel, Zero-Copy Cloning, and Data Sharing (see the sketch after this list).
• Strong understanding of ETL/ELT processes and data warehousing principles.
• Excellent communication skills for working directly with internal and external stakeholders.
• Ability to proactively translate requirements into effective solutions.
• Self-starter mindset with a passion for emerging data technologies.
• Strong problem-solving and analytical skills.
• Client-facing skills preferred; ability to build trust and communicate value clearly.
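For context on the Snowflake features named in this list, here is a short sketch of Time Travel and Zero-Copy Cloning. The table names reuse the hypothetical examples above, and Time Travel assumes the table’s data retention window covers the queried point in time.

```sql
-- Time Travel: query a table as it existed one hour ago (offset is in seconds).
SELECT COUNT(*) FROM curated_events AT (OFFSET => -60*60);

-- Time Travel also allows restoring a dropped table within the retention window.
UNDROP TABLE curated_events;

-- Zero-Copy Cloning: create an instant copy that shares the original's storage.
CREATE TABLE curated_events_dev CLONE curated_events;

-- Clones can be taken as of a point in time, combining both features.
CREATE TABLE curated_events_snapshot CLONE curated_events
  AT (TIMESTAMP => '2024-01-01 00:00:00'::TIMESTAMP_LTZ);
```

Because a clone shares storage with its source until either side changes, cloning is a common low-cost way to stand up development and test environments.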
Preferred Qualifications
• Snowflake SnowPro Core Certification or similar.
• Experience with dbt or similar data transformation tools (a model sketch follows this list).
• Exposure to Snowpark.
• Experience with data workflow orchestration tools (Airflow, Jenkins).
• Prior consulting or client-facing project experience.
• Experience with machine learning concepts.
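As a small sketch of the dbt experience mentioned above, here is a minimal incremental model in Snowflake SQL with Jinja. The model, source, and column names are hypothetical, and the source() call assumes a corresponding sources.yml entry.

```sql
-- models/curated_events.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='id') }}

select
    id,
    payload:event_type::string as event_type,
    loaded_at
from {{ source('raw', 'raw_events') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than the target's high-water mark
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```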
Walker is an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.
Why work at Walker? https://walkerinfo.wistia.com/medias/wwmqw7ssw2
Learn more about the Walker DEI efforts: https://walkerinfo.com/diversity-equity-and-inclusion/
Perks and Benefits: https://walkerinfo.com/careers/
Travel: This role requires the ability to travel to client sites, expected approximately once per month.