
Senior Data Engineer at Integra Partners

Integra Partners · Troy, United States of America · Onsite

US$124,000.00

Integra Partners is seeking an enthusiastic and motivated individual to join our data engineering team. As a Senior Data Engineer, you will play a key role in building and maintaining robust data pipelines and enabling advanced analytics and AI solutions across the organization. This position requires hands-on experience with modern data engineering tools and cloud platforms, including Snowflake, DBT, and Azure. You will work closely with the Data Analytics team to ensure data availability, reliability, and performance to support business insights and decision-making. We are looking for someone with strong problem-solving skills and an eagerness to learn.

JOB RESPONSIBILITIES/SKILLS:
  • Design, develop, and maintain scalable data pipelines utilizing Snowflake and DBT. 
  • Create and optimize efficient data models in Snowflake to support business intelligence and analytics requirements.
  • Implement ETL workflows to transform raw data into analytics-ready datasets using DBT. 
  • Tune Snowflake queries and DBT models for maximum performance and scalability.
  • Seamlessly integrate Snowflake with diverse data sources (Salesforce, MS SQL, PostgreSQL, CSV files, etc.).
  • Orchestrate data workflows using tools such as Apache Airflow or Azure Data Factory.
  • Collaborate with DevOps to support automated CI/CD pipelines for data infrastructure code.
  • Partner with the Data Analytics team and other stakeholders to gather requirements and deliver data solutions.
  • Develop and enforce data quality checks to ensure accuracy, consistency, and reliability across pipelines.
  • Maintain detailed documentation of data models, transformation processes, and pipelines.
  • Support AI and machine learning initiatives by preparing, curating, and managing high-quality feature datasets.
  • Ensure all data engineering solutions adhere to healthcare industry standards such as HIPAA and HITRUST.
  • Implement monitoring, alerting, and logging for data pipelines to proactively detect and resolve issues.
  • Collaborate with data analysts to operationalize AI/ML models within the Snowflake environment.
REQUIREMENTS: 
  • 5+ years of experience in data engineering or a related field, with a strong understanding of modern data architectures.
  • Proficiency in Snowflake, including performance tuning, data modeling, and integration with downstream analytics tools.
  • Familiarity with Azure services, including storage (e.g., Blob Storage) and orchestration tools (e.g., Azure Data Factory).
  • Hands-on experience developing and maintaining data transformation workflows using DBT.
  • Hands-on experience with workflow orchestration tools (e.g., Apache Airflow or Azure Data Factory).
  • Strong programming skills in Python, particularly for data pipeline development, automation, and integration.
  • Experience with Medallion Architecture and Data Vault 2.0 principles for structuring and managing enterprise data.
  • Experience with Snowflake’s AI/ML features (Cortex Analyst, Cortex Agents, Cortex AISQL, etc.).
  • Proficiency with CI/CD pipelines and modern DevOps practices, including automated testing, deployment, and monitoring.
  • Demonstrated ability to implement robust data quality checks, validation frameworks, and observability practices.
  • Knowledge of data governance, lineage tracking, and integration with metadata management tools.
  • Excellent collaboration and communication skills, with experience working in Agile, cross-functional teams.
  • Working knowledge of data modeling within the healthcare industry is a plus.
SALARY: $124,000.00 annually